CN110166766B - Multi-line array CCD camera coplanar collinear imaging combined debugging method - Google Patents
Multi-line array CCD camera coplanar collinear imaging combined debugging method
- Publication number
- CN110166766B (application CN201910480255.5A)
- Authority
- CN
- China
- Prior art keywords
- camera
- cameras
- imaging
- adjusting
- width
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention discloses a coplanar, collinear imaging combined debugging method for multiple linear-array CCD cameras, which accurately adjusts, camera by camera, the position, height, front-back position, parallelism and pitch angle. The method solves the problem that traditional combined debugging methods for coplanar, collinear imaging with multiple linear-array CCD cameras struggle to meet both calibration-efficiency and precision requirements, achieving high calibration efficiency and high precision.
Description
Technical Field
The invention relates to the technical field of machine vision, in particular to a co-planar collinear imaging combined debugging method for a multi-line array CCD camera.
Background
With the rapid development of the machine-vision industry, acquiring images of large objects requires several linear-array CCD cameras working simultaneously, so accurate stitching of the images they acquire is important. One frame acquired by a linear-array CCD camera is a single line of pixels. When several linear-array CCD cameras are used, their heights, positions and angles must be adjusted so that the images acquired simultaneously by all cameras correspond to the same line on the object. Many experts have therefore proposed methods for the joint debugging of multiple linear-array CCD cameras. Existing linear-array camera pose-calibration methods are based mainly either on optical calibration or on calibration objects plus image processing. The optical-calibration approach divides the measured object into several areas, each equipped with an independently adjustable group of linear-array cameras and an optical system, and adjusts the cameras one by one to meet the measurement requirements. Aimed chiefly at optical measurement, it offers high precision, but provides only a single adjustment function, depends excessively on optical-system performance and hardware precision, and is costly. The approach based on calibration objects and image processing designs specific calibration plates and patterns and combines image-processing algorithms with a motion-control mechanism for adjustment. It has some flexibility, but demands high motion-control precision, complex algorithms such as centroid extraction and fitting, and accurate dynamic imaging, so its reliability in industrial applications is insufficient.
Because existing methods depend excessively on hardware or on complex image-processing algorithms, the invention, drawing on practical industrial application, provides a simple and efficient calibration method that satisfies both calibration-efficiency and precision requirements.
Disclosure of Invention
Based on the technical problems in the background art, the invention provides a co-planar collinear imaging combined debugging method for a multi-line array CCD camera.
The technical scheme adopted by the invention is as follows:
a co-planar collinear imaging combined debugging method for a multi-line array CCD camera is characterized by comprising the following steps:
(I) Camera height and Camera position
(1) Establishing an imaging system space coordinate system for describing the spatial position information of the imaging system, and setting Lx, Ly and Lz as the three-dimensional reference lines of the imaging system space coordinate system; Lx is the left side edge of the TFT-LCD glass substrate of the detection object, Ly is a horizontal reference line drawn during light-source adjustment, and Lz is a vertical reference line which passes through the intersection point of the straight lines Lx and Ly and is perpendicular to the reference plane where the detection object is located;
(2) when the standard magnification of the lens is n, the working distance of the lens is h millimeters, i.e. the distance from the detection reference surface T to the lower end of the lens is h millimeters, so that the position of the height reference line Lh can be determined; adjust the whole camera support so that, with the support horizontal, camera 1 reaches the height reference position; record the height difference parameter h1 between camera 1 and the support at a certain characteristic position on camera 1; taking h1 as the reference, adjust the camera bases so that the height difference parameters of the other 3 cameras equal h1; when h1 = h2 = h3 = h4, the cameras are considered equal in height and to meet the lens imaging working-distance requirement, the shooting areas of the cameras are separated from each other as a whole, and adjacent edge areas are joined in pairs;
(3) mark the equal-height detection width range OE on the camera bracket by the plumb method, taking the boundaries on both sides of the detection surface as reference; the field-of-view width yv imaged by a single camera is w1 mm, and the width lOE of the detection area is w2 mm; the fields of view of the 4 cameras of the imaging system completely cover OE; the mutual position interval Δd of the cameras is set to the same value, so that on the glass-substrate detection plane T the widths v of the three overlap areas of the 4 camera shooting areas are equal, and the relation among the camera field-of-view width yv, the overlap width v and the camera installation interval Δd is:
yv=Δd+v (1)
here O is the coordinate origin and denotes the bracket start point, E denotes the bracket end point; di denotes the distance of camera i from point O, de the distance from camera 4 to point E, and Δd the interval between camera i and camera i+1; points P1–P4 denote the positions of cameras 1–4 projected onto OE, and the Ly-axis coordinate of the position Pi of camera i can be expressed as di = d1 + (i−1)Δd, or di = lOE − de − (4−i)Δd, where i = 1, 2, 3, 4; the equal-height detection width range OE and the camera interval Δd then satisfy:
lOE = d1 + 3Δd + de    (2)
to facilitate subsequent image processing, the effective field width of each camera should be equal; when the effective field widths of cameras 1 and 4 are the same, symmetry gives d1 = de = Δd/2, and substituting into formula (2) yields Δd = lOE/4; the specific position coordinates of cameras 1–4 on the camera support can then be determined, and the cameras are translated to the marked positions;
the width of the field-of-view overlap region within the detection range is v = w1 − Δd; because of the symmetry of the camera imaging system, a field width of v/2 should be discarded in the left and right regions of each camera's field of view;
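The layout relations (1)–(2), together with the symmetric-margin condition d1 = de = Δd/2, can be sketched in a few lines; this is a minimal illustration, and the concrete widths in the usage example are assumptions of mine, not values from the patent.

```python
def camera_layout(w1_mm, w2_mm, n_cameras=4):
    """Sketch of formulas (1)-(2) with equal effective field widths.

    w1_mm: field-of-view width yv of a single camera, in mm.
    w2_mm: detection width lOE (length of segment OE), in mm.
    Returns (d1, dd, v, positions): edge margin d1 (= de), camera
    interval dd (i.e. Δd), overlap width v, and distances d_i from O.
    """
    dd = w2_mm / n_cameras            # Δd = lOE / 4 for 4 cameras
    v = w1_mm - dd                    # from (1): yv = Δd + v
    if v < 0:
        raise ValueError("single-camera field too narrow to cover OE")
    d1 = dd / 2                       # symmetric margins: d1 = de = Δd/2
    # d_i = d1 + (i-1)Δd for i = 1..n_cameras
    positions = [d1 + i * dd for i in range(n_cameras)]
    return d1, dd, v, positions

# Illustrative numbers only: a 100 mm field per camera over a 320 mm panel.
d1, dd, v, pos = camera_layout(100.0, 320.0)
```

With these assumed numbers each neighbouring pair of fields overlaps by v, and d1 + 3Δd + de reproduces lOE as formula (2) requires.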
(II) problem of front and back position and parallelism of camera
(1) Each camera is respectively arranged on a camera base for adjusting the inclination angle and the pitch angle of the camera, whether an imaging area is parallel to an illumination reference line is used as a basis for adjusting the parallelism of the camera, and the front and back positions of the camera are adjusted to enable a shooting area of the camera to be in an illumination range;
(2) when the detected object passes the photoelectric sensor trigger position S, the camera starts capturing images; if the tilt angle of the camera is θ, the angle between the straight line l containing the shooting area and the illumination datum line Ly is also θ; when the glass substrate moves a distance Δs, the camera scanning speed being constant, the captured image region I is obtained by translating the straight line l a distance Δs in the imaging-system coordinate system; clearly, in the imaging-system coordinate system, the angle between the boundary of the detected object and the initial boundary of the image is θ; by the camera imaging principle, in image I the angle between the initial boundary of the detected object and the image boundary is also θ; the left-right angle of the camera is adjusted continually until:
θ=0 (3)
the current camera is then parallel to the camera support. Applying the above steps to each camera in turn makes every camera parallel to the camera support.
(3) Arbitrarily take a characteristic point on camera 1 as the reference point, and take its distance s1 to the camera support as the reference; the distances of the other cameras to the camera support are adjusted so that:
s1 = s2 = s3 = s4    (4)
the lines captured per frame by the cameras are now mutually parallel.
(III) Camera Pitch Angle
(1) Take the imaging brightness of each camera as the reference: when a camera's optical axis approaches the vertical direction, its shooting area approaches the illumination reference line and the image brightness increases. This criterion allows only a coarse adjustment of the camera pitch angle, because the brightness of the strip light source is uniform within a rectangular range; while the pitch angle is adjusted and the shooting position moves within that rectangle, the images are all bright and the brightness change is not obvious;
(2) the pitch angle is then judged accurately by motion imaging: adjusting the pitch angle of camera 1, record the pitch-angle range θ1–θn over which the image is bright with no obvious brightness change, and take the midpoint as the accurate angle of camera 1, namely:
θm=(θ1+θn)/2
the shooting position of camera 1 at this angle is regarded as the position of the illumination datum line; the 4 cameras then image the detected object simultaneously via the motion platform; when the detected object passes the photoelectric sensor trigger position S, the cameras are triggered to start acquisition; after the detected object moves Δs, the blank-region width x1 in the image captured by camera 1 is taken as the reference: when the acquisition position of camera i is lower than that of camera 1, the captured blank-region width satisfies xi > x1, and conversely, when the acquisition position of camera i is higher than that of camera 1, xi < x1; therefore, when:
x1=x2=x3=x4
the imaging positions of the cameras then lie on one straight line that is very close to, and parallel to, the illumination datum line.
The invention has the advantages that:
the method solves the problem that the traditional co-planar and co-linear shooting combined debugging method of the multi-linear array CCD camera is difficult to meet the requirements of calibration efficiency and precision, and realizes high calibration efficiency and high precision.
Drawings
FIG. 1 is a diagram illustrating the height and position of a camera.
Fig. 2 is a schematic diagram of the problem of parallel adjustment and front-back position of the camera.
Fig. 3 shows the positional relationship between the camera shooting area and the illumination area.
Fig. 4 illustrates the principle of tilt camera imaging.
Fig. 5 is a schematic diagram of the camera pitch angle problem.
Fig. 6 is a schematic diagram of motion imaging in camera tilt adjustment.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments.
Examples are given.
A coplanar, collinear imaging combined debugging method for multiple linear-array CCD cameras. The shooting area of a linear-array CCD camera can be regarded as a fixed-length line segment on the detection reference surface. When a camera is installed, its position can only be set roughly, and the camera's exact position, height, front-back position, parallelism and pitch angle all need further precise adjustment. Camera adjustment is therefore required after the lighting system has been adjusted.
(I) Camera height and Camera position
During installation the camera heights can only be set roughly, and aligning them by eye is error-prone. Because an industrial camera lens has a fixed working-distance range, a camera whose overall height falls outside that range cannot image clearly; and if the camera heights are uneven and imaging is instead achieved by adjusting focal length, the lens focal lengths are likely to differ greatly. The camera heights therefore need to be adjusted and calibrated in advance.
As shown in fig. 1, in order to describe the spatial position information of the imaging system, a spatial coordinate system of the imaging system needs to be established. Wherein Lx, Ly and Lz are three-dimensional reference lines of a space coordinate system of the imaging system respectively. Lx is the left side edge of the TFT-LCD glass substrate of the detection object, Ly is a horizontal reference line drawn when the light source is adjusted, and Lz is a vertical reference line which passes through the intersection point of the straight lines Lx and Ly and is vertical to the reference plane of the detection object. The following description of the imaging system will also be extended to this three-dimensional coordinate system.
According to the lens documentation, when the standard lens magnification is n, the lens working distance can be calculated as h mm, i.e. the distance from the detection reference surface T to the lower end of the lens is h mm. From this, the position of the height reference line Lh can be determined. Adjust the whole camera support so that, with the support horizontal, camera 1 reaches the height reference position. At this point, record the height difference parameter h1 between camera 1 and the support at a certain characteristic position on camera 1. Taking h1 as the reference, adjust the camera bases so that the height difference parameters of the other 3 cameras equal h1. When h1 = h2 = h3 = h4, the cameras may be considered equal in height and to meet the lens imaging working-distance requirement.
The 4 cameras of the imaging system must cooperate to image the whole target surface, so the shooting areas of the cameras are separated from each other as a whole while adjacent edge areas join in pairs. If the cameras are placed unreasonably, gaps in coverage easily appear; adjusting the camera positions in advance also benefits subsequent field-of-view stitching.
Mark the equal-height detection width range OE on the camera bracket by the plumb method, taking the boundaries on both sides of the detection surface as reference. From the previous section, the field-of-view width yv of a single camera is w1 mm and the width lOE of the detection area is w2 mm. Clearly, for detection purposes the fields of view of the 4 cameras should cover OE completely. To facilitate stitching of the camera fields of view, the camera interval Δd is preferably set to the same value. With equal mounting intervals Δd it is obvious that, on the glass-substrate detection plane T, the widths v of the three overlap areas of the 4 camera shooting areas are equal, and the relation among the camera field-of-view width yv, the overlap width v and the mounting interval Δd is:
yv=Δd+v (1)
here O is the coordinate origin and denotes the bracket start point, E denotes the bracket end point; di denotes the distance of camera i from point O, de the distance from camera 4 to point E, and Δd the interval between camera i and camera i+1; points P1–P4 denote the positions of cameras 1–4 projected onto OE, and the Ly-axis coordinate of the position Pi of camera i can be expressed as di = d1 + (i−1)Δd, or di = lOE − de − (4−i)Δd, where i = 1, 2, 3, 4; the equal-height detection width range OE and the camera interval Δd then satisfy:
lOE = d1 + 3Δd + de    (2)
To facilitate subsequent image processing, the effective field width of each camera should be equal; when the effective field widths of cameras 1 and 4 are the same, symmetry gives d1 = de = Δd/2, and substituting into formula (2) yields Δd = lOE/4. The specific position coordinates of cameras 1–4 on the camera support can then be determined, and the cameras are translated to the marked positions.
The width of the field-of-view overlap region within the detection range is v = w1 − Δd; because of the symmetry of the camera imaging system, a field width of v/2 should be discarded in the left and right regions of each camera's field of view.
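Discarding v/2 of the field on each side translates directly into a pixel crop when stitching the line images; the helper below is a sketch under assumptions of mine (a line sensor of `sensor_px` pixels spanning the full field, square sampling), not part of the patent.

```python
def crop_px_per_side(v_mm, w1_mm, sensor_px):
    """Pixels to discard at each side of one camera's line image.

    v_mm: overlap width v on the detection plane.
    w1_mm: full field-of-view width yv of one camera.
    sensor_px: number of pixels across that field.
    Each camera keeps its central portion; v/2 is cropped per side.
    """
    mm_per_px = w1_mm / sensor_px
    return round((v_mm / 2) / mm_per_px)

# Illustrative numbers: 20 mm overlap, 100 mm field, 2048-pixel line sensor.
crop = crop_px_per_side(20.0, 100.0, 2048)
```

After cropping, neighbouring cameras' retained regions butt against each other, which is what makes the stitched line seamless.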
(II) problem of front and back position and parallelism of camera
When an industrial camera is used in a machine-vision inspection system, it is generally mounted on a camera base that allows adjustment of several angles, and the base is then mounted on the camera support. The camera base used here for adjusting the camera tilt and pitch angles is a compound base formed by stacking two precision angular-displacement stages.
During installation, the front-back alignment of the cameras cannot be fully guaranteed, nor can it be judged whether each camera is parallel to the illumination area. Because a camera and its shooting area are strictly parallel, a camera that is not mounted parallel makes its shooting area tilt at an angle to the illumination area. Without adjustment, the 4 cameras cannot reach the same shooting reference line (the illumination reference line), as shown in fig. 2:
In the figure, l′i (i = 1, 2, 3, 4) is the straight line obtained by translating the line containing camera i's shooting area to the coordinate origin. If a camera is slightly skewed during installation, its shooting area cannot completely coincide with the illumination area.
As shown in fig. 3, when the front-back position of a camera is close to correct but the camera is tilted left or right, the situations in panels (a) and (b) occur. Ignoring pitch, when the camera is too far forward or too far back, the situations in panels (c) and (d) occur. The imaging state depicted in panel (e) is what finally needs to be achieved.
In solving the problem of camera non-parallelism and front-to-back misalignment, we should first solve the problem of camera non-parallelism. When the camera is in a parallel state, a certain characteristic point on the camera can be randomly selected, and the horizontal distance between the point and the support is used for replacing the front-back position coordinate of the camera as a reference for adjusting the front-back position of the camera.
Because the camera and its imaging area are strictly parallel, whether the imaging area is parallel to the illumination datum line can serve as the basis for adjusting camera parallelism. To make the camera image clearly with suitable brightness, the front-back position of the camera is adjusted so that its shooting area lies within the illumination range. Since a pitch-angle problem also exists, this front-back position is not yet final; it needs further adjustment after the parallelism problem is solved by the calibration method described above.
To adjust the parallelism of the camera, as shown in fig. 4(a), the camera starts capturing images when the detected object passes the photoelectric sensor trigger position S. If the tilt angle of the camera is θ, the angle between the straight line l containing the shooting area and the illumination reference line Ly is also θ. When the glass substrate moves a distance Δs, the camera scanning speed being constant, the captured image region I is obtained by translating the straight line l a distance Δs in the imaging-system coordinate system. Clearly, in the imaging-system coordinate system, the angle between the boundary of the detected object and the initial boundary of the image is θ. By the camera imaging principle, in image I the angle between the initial boundary of the detected object and the image boundary is also θ. The left-right angle of the camera is adjusted continually until:
θ=0 (3)
the current camera is then parallel to the camera support. Applying the above steps to each camera in turn makes every camera parallel to the camera support.
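The check θ = 0 can be automated by measuring the slope of the detected object's start boundary in the captured image I. The least-squares sketch below is mine, not the patent's, and it assumes square sampling (one scan row per pixel of object travel), under which the boundary's column-per-row slope equals tan θ.

```python
import math

def tilt_angle_deg(edge_cols):
    """Estimate camera tilt θ from the object's start boundary in image I.

    edge_cols[r] is the column where the object boundary appears in scan
    row r.  The least-squares slope of column vs. row corresponds to
    tan(θ) under the square-sampling assumption; θ = 0 (formula (3))
    means the camera is parallel to the support.
    """
    n = len(edge_cols)
    mean_r = (n - 1) / 2
    mean_c = sum(edge_cols) / n
    num = sum((r - mean_r) * (c - mean_c) for r, c in enumerate(edge_cols))
    den = sum((r - mean_r) ** 2 for r in range(n))
    slope = num / den
    return math.degrees(math.atan(slope))

# A boundary drifting one column per row corresponds to a 45-degree tilt;
# a vertical boundary gives 0 degrees, i.e. the parallel condition θ = 0.
```

In practice one would iterate: capture, estimate θ, nudge the left-right angle, and repeat until the estimate is within tolerance of zero.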
Although the camera pitch angles have not yet been adjusted at this point, the pitch angles are small, so their influence on the front-back positions of the feature points on the cameras is negligible. A characteristic point on camera 1 can therefore be taken arbitrarily as a reference point, and its distance s1 to the camera support taken as the reference; the distances of the other cameras to the camera support are adjusted so that:
s1 = s2 = s3 = s4    (4)
the lines captured per frame by the cameras are now mutually parallel.
(III) Camera Pitch Angle
The cameras have now undergone height adjustment along Lz, mounting-position adjustment along Ly and front-back adjustment along Lx, and are parallel to the illumination datum line. At this point, as shown in fig. 5(b), the imaging positions of the 4 cameras in the Lx–Lz plane of the imaging system should theoretically coincide with a point on the vertical reference line Lz. Because of the pitch-angle problem, each camera's optical axis makes a very small angle with Lz; the shooting situation at this time is shown in fig. 5(a). Clearly, the imaging brightness of each camera can serve as the reference: when a camera's optical axis approaches the vertical, its shooting area approaches the illumination reference line and the image brightness increases. This criterion allows only a coarse pitch adjustment, because the brightness of the strip light source is uniform within a rectangular range; while the pitch angle is adjusted and the shooting position moves inside that rectangle, the images are all bright and the brightness change is not obvious.
The pitch angle should be determined more accurately by motion imaging. Beforehand, by adjusting the pitch angle of camera 1, record the pitch-angle range θ1–θn over which the image is bright with no obvious brightness change, and take the midpoint as the accurate angle of camera 1, namely:
θm=(θ1+θn)/2
The shooting position of camera 1 at this angle is regarded as the position of the illumination reference line. The 4 cameras then image the detected object simultaneously via the motion platform; the imaging effect is shown in fig. 6. When the detected object passes the photoelectric sensor trigger position S, the cameras are triggered to start acquisition. After the detected object moves Δs, the blank-region width x1 in the image captured by camera 1 is taken as the reference: when the acquisition position of camera i is lower than that of camera 1, the captured blank-region width satisfies xi > x1; conversely, when the acquisition position of camera i is higher than that of camera 1, xi < x1. Therefore, when:
x1=x2=x3=x4
the imaging positions of the cameras then lie on one straight line that is very close to, and parallel to, the illumination datum line.
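The blank-width comparison above reduces to a simple classification rule per camera; the sketch below is illustrative, and the tolerance parameter is my addition (the patent states the equality x1 = x2 = x3 = x4 exactly).

```python
def pitch_check(blank_widths, tol_px=0):
    """Classify each camera's acquisition line against camera 1's.

    blank_widths[i-1] is x_i, the blank-region width in camera i's image
    after the object has moved Δs.  Per the rule in the text:
    x_i > x1 means camera i acquires below camera 1's line,
    x_i < x1 means above, and equality means aligned.
    """
    x1 = blank_widths[0]
    report = {}
    for i, xi in enumerate(blank_widths[1:], start=2):
        if xi > x1 + tol_px:
            report[i] = "below camera 1"
        elif xi < x1 - tol_px:
            report[i] = "above camera 1"
        else:
            report[i] = "aligned"
    return report
```

Each "below"/"above" verdict tells the operator which direction to nudge that camera's pitch stage before re-imaging.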
The above description is only a preferred embodiment of the present invention, but the protection scope of the present invention is not limited thereto; any equivalent substitution or change made, within the technical scope disclosed by the present invention, by a person skilled in the art according to the technical solution of the present invention and its inventive concept shall fall within the protection scope of the present invention.
Claims (1)
1. A co-planar collinear imaging combined debugging method for a multi-line array CCD camera is characterized by comprising the following steps:
firstly, adjusting the height and position of a camera;
secondly, the front and back positions and the parallelism of the cameras are adjusted;
thirdly, adjusting the pitch angle of the camera;
the method for adjusting the height and the position of the camera comprises the following steps:
(1) establishing an imaging system space coordinate system for describing space position information of an imaging system, and setting Lx, Ly and Lz as three-dimensional reference lines of the imaging system space coordinate system respectively; lx is the left side edge of the TFT-LCD glass substrate of the detection object, Ly is a horizontal reference line drawn during light source adjustment, and Lz is a vertical reference line which passes through the intersection point of the straight lines Lx and Ly and is vertical to the reference plane where the detection object is located;
(2) when the standard magnification of the lens is n, the working distance of the lens is h millimeters, i.e. the distance from the detection reference surface T to the lower end of the lens is h millimeters, so that the position of the height reference line Lh can be determined; adjust the whole camera support so that, with the support horizontal, camera 1 reaches the height reference position; record the height difference parameter h1 between camera 1 and the support at a certain characteristic position on camera 1; taking h1 as the reference, adjust the camera bases so that the height difference parameters of the other 3 cameras equal h1; when h1 = h2 = h3 = h4, the cameras are considered equal in height and to meet the lens imaging working-distance requirement, the shooting areas of the cameras are separated from each other as a whole, and adjacent edge areas are joined in pairs;
(3) mark the equal-height detection width range OE on the camera bracket by the plumb method, taking the boundaries on both sides of the detection surface as reference; the field-of-view width yv imaged by a single camera is w1 mm, and the width lOE of the detection area is w2 mm; the fields of view of the 4 cameras of the imaging system completely cover OE; the mutual position interval Δd of the cameras is set to the same value, so that on the glass-substrate detection plane T the widths v of the three overlap areas of the 4 camera shooting areas are equal, and the relation among the camera field-of-view width yv, the overlap width v and the camera installation interval Δd is:
yv=Δd+v (1)
here O denotes the bracket start point, E denotes the bracket end point; di denotes the distance of camera i from point O, de the distance from camera 4 to point E, and Δd the interval between camera i and camera i+1; points P1–P4 denote the positions of cameras 1–4 projected onto OE, and the Ly-axis coordinate of the position Pi of camera i can be expressed as di = d1 + (i−1)Δd, or di = lOE − de − (4−i)Δd, where i = 1, 2, 3, 4; the equal-height detection width range OE and the camera interval Δd then satisfy:
lOE = d1 + 3Δd + de    (2)
to facilitate subsequent image processing, the effective field width of each camera should be equal; when the effective field widths of cameras 1 and 4 are the same, symmetry gives d1 = de = Δd/2, and substituting into formula (2) yields Δd = lOE/4; the specific position coordinates of cameras 1–4 on the camera support can then be determined, and the cameras are translated to the marked positions;
the width of the field-of-view overlap region within the detection range is v = w1 − Δd; because of the symmetry of the camera imaging system, a field width of v/2 should be discarded in the left and right regions of each camera's field of view;
the step (II) of adjusting the front and back positions and the parallelism of the cameras specifically comprises the following steps:
(1) each camera is respectively arranged on a camera base for adjusting the inclination angle and the pitch angle of the camera, whether an imaging area is parallel to an illumination reference line is used as a basis for adjusting the parallelism of the camera, and the front and back positions of the camera are adjusted to enable a shooting area of the camera to be in an illumination range;
(2) when the detected object passes the photoelectric sensor trigger position S, the camera starts capturing images; if the tilt angle of the camera is θ, the angle between the straight line l containing the shooting area and the illumination datum line Ly is also θ; when the glass substrate moves a distance Δs, the camera scanning speed being constant, the captured image region I is obtained by translating the straight line l a distance Δs in the imaging-system coordinate system; clearly, in the imaging-system coordinate system, the angle between the boundary of the detected object and the initial boundary of the image is θ; by the camera imaging principle, in image I the angle between the initial boundary of the detected object and the image boundary is also θ; the left-right angle of the camera is adjusted continually until:
θ = 0 (3)
the camera is then parallel to the camera support; applying the above steps to each camera in turn makes every camera parallel to the camera support;
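The tilt check above amounts to measuring the angle of the detected object's boundary in the captured image: when the boundary is parallel to the image border, θ = 0. A minimal sketch of that measurement, assuming the boundary has already been extracted as (row, column) pixel coordinates (the function name and sample data are illustrative, not from the patent):

```python
import math

def boundary_tilt_deg(boundary_points):
    """Least-squares fit a line to (row, col) boundary points and return
    its angle, in degrees, relative to the vertical image border."""
    n = len(boundary_points)
    mean_r = sum(r for r, _ in boundary_points) / n
    mean_c = sum(c for _, c in boundary_points) / n
    num = sum((r - mean_r) * (c - mean_c) for r, c in boundary_points)
    den = sum((r - mean_r) ** 2 for r, _ in boundary_points)
    slope = num / den           # column drift per row; 0 when the boundary is vertical
    return math.degrees(math.atan(slope))

# A perfectly vertical boundary gives theta = 0: camera parallel to the support
pts = [(r, 50.0) for r in range(100)]
print(abs(boundary_tilt_deg(pts)) < 1e-9)   # True
```

In practice the camera's left-right angle would be nudged and the image re-captured until the returned angle is within tolerance of zero.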
(3) an arbitrary feature point on camera 1 is taken as a reference point, and its distance s1 to the camera support is measured; the distances from the corresponding feature points on the other cameras to the camera support are adjusted so that:
s1 = s2 = s3 = s4 (4)
at this point, the images captured by the cameras are mutually parallel;
the step (III) of adjusting the pitch angle of the cameras specifically comprises the following steps:
(1) the imaging brightness of each camera is used as a reference: as the optical axis of the camera approaches the vertical, the shooting area approaches the illumination reference line and the image brightness increases, so the pitch angle is first coarsely adjusted by this criterion; because the strip light source illuminates a rectangular region with uniform brightness, when the pitch angle is adjusted so that the shooting position of the camera moves within this rectangular region, the images remain bright and the change in brightness is not obvious;
(2) the pitch angle is then determined more precisely by motion imaging: the pitch angle of camera 1 is swept, the range of pitch angles θ1~θn over which the image is bright with no obvious brightness change is recorded, and the middle value is taken as the accurate angle of camera 1, namely:
θm = (θ1 + θn)/2
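The mid-value rule above can be sketched as a plateau search over (pitch angle, mean brightness) samples. This is an illustrative sketch under assumptions not stated in the patent: the function name, the brightness tolerance, and the sample data are made up, and the bright range is assumed to be contiguous.

```python
def pitch_plateau_mid(samples, tol=2.0):
    """samples: (pitch_angle_deg, mean_brightness) pairs from a pitch sweep.
    Find the angles whose brightness is within tol of the peak (the
    "high brightness, no obvious change" range theta_1..theta_n) and
    return its midpoint theta_m = (theta_1 + theta_n) / 2."""
    peak = max(b for _, b in samples)
    bright_angles = [a for a, b in samples if b >= peak - tol]
    return (min(bright_angles) + max(bright_angles)) / 2.0

# Made-up sweep: brightness plateaus between -1 and +1 degrees
sweep = [(-2, 100.0), (-1, 250.0), (0, 252.0), (1, 251.0), (2, 120.0)]
print(pitch_plateau_mid(sweep, tol=5.0))   # 0.0
```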
the shooting position of camera 1 at this point is regarded as the position of the illumination reference line; the four cameras then image the detected object via the motion platform; when the detected object passes the trigger position S of the photoelectric sensor, the cameras are triggered to begin acquisition; after the detected object has moved Δs, the blank-region width x1 in the image captured by camera 1 is taken as the reference: when the acquisition position of camera i is lower than that of camera 1, the blank-region width in its image satisfies xi > x1, whereas when the acquisition position of camera i is higher than that of camera 1, xi < x1; therefore, when:
x1 = x2 = x3 = x4
the imaging positions of all cameras lie on a single straight line that is very close and parallel to the illumination reference line.
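The blank-width comparison above can be sketched as a per-camera diagnosis. This is an illustrative sketch, not the patent's implementation: the function name, the pixel tolerance, and the example widths are assumptions, while the xi-versus-x1 reasoning follows the text.

```python
def diagnose_pitch(blank_widths, tol_px=1.0):
    """Compare each camera's blank-region width x_i against camera 1's x_1.
    x_i > x_1: camera i acquires lower than camera 1;
    x_i < x_1: camera i acquires higher than camera 1;
    all widths equal (within tol_px): imaging positions are collinear."""
    x1 = blank_widths[0]
    verdicts = []
    for x in blank_widths:
        if abs(x - x1) <= tol_px:
            verdicts.append("aligned")
        elif x > x1:
            verdicts.append("lower")
        else:
            verdicts.append("higher")
    return verdicts

# Made-up blank-region widths in pixels for cameras 1-4
print(diagnose_pitch([120.0, 126.0, 115.0, 120.3]))
# ['aligned', 'lower', 'higher', 'aligned']
```

Cameras flagged "lower" or "higher" would have their pitch angle re-adjusted and the motion-imaging pass repeated until all four report "aligned".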
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910480255.5A CN110166766B (en) | 2019-06-04 | 2019-06-04 | Multi-line array CCD camera coplanar collinear imaging combined debugging method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110166766A CN110166766A (en) | 2019-08-23 |
CN110166766B true CN110166766B (en) | 2020-09-08 |
Family
ID=67627156
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910480255.5A Active CN110166766B (en) | 2019-06-04 | 2019-06-04 | Multi-line array CCD camera coplanar collinear imaging combined debugging method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110166766B (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021114090A1 (en) * | 2019-12-10 | 2021-06-17 | 深圳迈瑞生物医疗电子股份有限公司 | Cell image analysis device and camera installation parallelism checking method |
CN111327835B (en) | 2020-03-20 | 2021-07-09 | 合肥埃科光电科技有限公司 | Multi-line time-sharing exposure processing method and system for camera |
CN111366531B (en) * | 2020-04-03 | 2023-02-17 | 湖南讯目科技有限公司 | Linear array camera adjusting auxiliary device and adjusting method |
CN112428264B (en) * | 2020-10-26 | 2021-12-07 | 中国计量大学 | Robot arm correction method and system |
CN112460414B (en) * | 2020-12-09 | 2024-07-02 | 广东龙天智能仪器股份有限公司 | Linear array camera group mounting mechanism |
CN112581547B (en) * | 2020-12-30 | 2022-11-08 | 安徽地势坤光电科技有限公司 | Rapid method for adjusting installation angle of imaging lens |
CN113364980B (en) * | 2021-05-31 | 2022-12-06 | 浙江大华技术股份有限公司 | Device control method, device, storage medium, and electronic apparatus |
CN116256909B (en) * | 2023-05-15 | 2023-08-08 | 苏州优备精密智能装备股份有限公司 | Real-time detection processing system and processing method for liquid crystal coating |
CN117241012B (en) * | 2023-11-16 | 2024-02-06 | 杭州百子尖科技股份有限公司 | Calibrating device, calibrating method and machine vision detection system of linear array camera |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106289106A (en) * | 2016-08-04 | 2017-01-04 | 北京航空航天大学 | Stereo vision sensor that a kind of line-scan digital camera and area array cameras combine and scaling method |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7708204B2 (en) * | 2005-02-07 | 2010-05-04 | Hamar Laser Instruments, Inc. | Laser alignment apparatus |
US7309161B1 (en) * | 2006-10-25 | 2007-12-18 | Agilent Technologies, Inc. | Method for determination of magnification in a line scan camera |
CN101419177B (en) * | 2007-10-25 | 2011-06-15 | 宝山钢铁股份有限公司 | Method for demarcating multi line scan video cameras |
CN106982370B (en) * | 2017-05-03 | 2018-07-06 | 武汉科技大学 | A kind of camera high-precision calibration scaling board and the method for realizing calibration |
CN107976146B (en) * | 2017-11-01 | 2019-12-10 | 中国船舶重工集团公司第七一九研究所 | Self-calibration method and measurement method of linear array CCD camera |
CN109724623B (en) * | 2018-12-26 | 2020-08-21 | 中国科学院长春光学精密机械与物理研究所 | Two-dimensional calibration method and device for mapping internal orientation elements of camera |
Also Published As
Publication number | Publication date |
---|---|
CN110166766A (en) | 2019-08-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110166766B (en) | Multi-line array CCD camera coplanar collinear imaging combined debugging method | |
CN110006905B (en) | Large-caliber ultra-clean smooth surface defect detection device combined with linear area array camera | |
US4611292A (en) | Robot vision system | |
CN109462752B (en) | Method and device for measuring optical center position of camera module | |
CN102023164A (en) | Device and method for detecting local defects of transparent surface plate | |
US11978222B2 (en) | Three-dimensional light field technology-based optical unmanned aerial vehicle monitoring system | |
CN107683401A (en) | Shape measuring apparatus and process for measuring shape | |
CN109712139B (en) | Monocular vision size measurement method based on linear motion module | |
CN112697112A (en) | Method and device for measuring horizontal plane inclination angle of camera | |
CN110136047A (en) | Static target 3 D information obtaining method in a kind of vehicle-mounted monocular image | |
ES2733651T3 (en) | Measurement of the angular position of a lenticular lens sheet | |
CN109541626B (en) | Target plane normal vector detection device and detection method | |
CN109751987A (en) | A kind of vision laser locating apparatus and localization method for mechanical actuating mechanism | |
US20110018965A1 (en) | Apparatus for optimizing geometric calibration | |
JP5875676B2 (en) | Imaging apparatus and image processing apparatus | |
CN106846284A (en) | Active-mode intelligent sensing device and method based on cell | |
WO2002075245A1 (en) | Apparatus and method for measuring three dimensional shape with multi-stripe patterns | |
CN108323154B (en) | SMT chip mounter multi-view flying camera and method for imaging data selection processing | |
CN218728592U (en) | Workpiece batch exposure equipment | |
KR100752989B1 (en) | Device capable of measuring 2-dimensional and 3-dimensional images | |
Sueishi et al. | Mirror-based high-speed gaze controller calibration with optics and illumination control | |
CN112557407B (en) | Optical detection module and optical detection method for detecting corner defects of notebook shell | |
CN111260737B (en) | Method and device for adjusting optical center of integrated camera | |
JPH0820207B2 (en) | Optical 3D position measurement method | |
US20070065042A1 (en) | Method for measuring dimensions by means of a digital camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||