CN106767913B - Compound eye system calibration device and calibration method based on single LED luminous point and two-dimensional rotary table


Info

Publication number: CN106767913B
Application number: CN201611094763.2A
Authority: CN (China)
Prior art keywords: angle, LED, compound eye, calibration, dimensional
Other languages: Chinese (zh)
Other versions: CN106767913A (en)
Inventors: 王克逸, 何建争, 简慧杰, 陶家园
Current assignee: University of Science and Technology of China USTC
Original assignee: University of Science and Technology of China USTC
Application filed by University of Science and Technology of China USTC
Events: priority to CN201611094763.2A; publication of CN106767913A; application granted; publication of CN106767913B
Legal status: Active (granted)

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00: Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass

Abstract

The invention discloses a compound eye system calibration device and a calibration method based on a single LED luminous point and a two-dimensional rotary table. During calibration, the computer drives the rotary table by controlling the driver, acquires the target images captured by the compound eye at different rotation angles of the two motors, calculates the center coordinates of the light spots in the images, records the rotation angles of the turntable motors together with the corresponding spot coordinates, and establishes the mapping relation between the two through a mathematical model. The invention can construct a high-precision target covering the full field of view of the compound eye, can unify the coordinate systems of the numerous sub-eyes, does not need to consider complex system imaging and distortion models, can realize unattended automatic calibration, and can complete the precise calibration of a compound eye system with many sub-eyes, complex distortion and a large field of view; the calibration device is simple and the calibration precision is high.

Description

Compound eye system calibration device and calibration method based on single LED luminous point and two-dimensional rotary table
Technical Field
The invention belongs to the field of machine vision, and particularly relates to a compound eye system calibration device and a compound eye system calibration method based on a single LED luminous point and a two-dimensional rotary table.
Background
The compound eye of natural insects has great potential application value in fields such as industry, security and the military owing to its large field of view, high motion sensitivity and compact structure. A novel bionic compound eye device has therefore been designed, as shown in FIG. 1. The system preserves the curved-surface sub-eye distribution of the natural insect compound eye; a meniscus lens is introduced so that light rays reach the image sensor more vertically; 141 sub-eyes are distributed on the spherical shell of the hemispherical compound eye, and the sub-eyes adopt logarithmic cone lenses to improve the focusing characteristic of the light spots; all sub-eyes share one CMOS camera with an image sensor resolution of 2048 × 2048 pixels and a pixel size of 12 μm × 12 μm. The compound eye device is about 110 mm × 80 mm in size, the overall field of view is 100° × 100°, and the captured images can be transmitted to a computer through a USB interface.
The imaging of each sub-eye of the compound eye system inevitably exhibits nonlinear distortion, so the device must be calibrated before it can be used for target detection and positioning. In machine vision applications, the purpose of camera calibration is to determine the correspondence between the three-dimensional geometric position of a point on the surface of an object in space and its corresponding point in the image. To determine the correspondence between image points and space points in a compound eye system, the calibration of the compound eye must solve the following problems: how to establish the relationship between the image light spot of each sub-eye channel and the incident light vector; how to unify the numerous sub-eye coordinate systems; and how to reduce the time and resources spent on calibrating the numerous sub-eyes.
The currently common camera calibration methods first construct an imaging model and a distortion model, and then solve the intrinsic, extrinsic and distortion parameters of the camera model using calibration techniques based on radial constraints, 2D targets, cross-ratio invariance and the like, thereby establishing the relationship between the image and space points. In a compound eye system the number of sub-eyes is large, and the distortion of a sub-eye becomes larger and more complex the closer it lies to the edge of the spherical shell; calibrating every sub-eye in this way faces problems such as models that are difficult to construct, a large number of sub-eye coordinate systems and a heavy calibration workload. Moreover, calibration over the large field of view of the compound eye would require a large-scale target that is expensive and whose precision is difficult to guarantee. These conventional calibration methods are therefore not suitable for this application.
In consideration of the complexity of the compound eye system calibration, a set of calibration device with high calibration precision, uniform coordinate system and automatic operation and a corresponding efficient and flexible calibration method must be designed.
According to the structural characteristics of the bionic compound eye system, a calibration device and a corresponding calibration method are designed that no longer require solving for the specific imaging and distortion parameters of the system. This overcomes the defects of the existing calibration methods: a high-precision target covering the whole field of view of the compound eye is constructed, the calibration process runs automatically under software control, and the nonlinear correspondence between the light spot of each sub-eye channel and the target point angle is established in a unified coordinate system, which solves the calibration problem of a compound eye system with many sub-eyes, complicated distortion and a large field of view, improves the calibration efficiency, and reduces the system calibration error.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: to overcome the defects of the existing calibration methods, the invention provides a calibration device with high calibration precision, a unified coordinate system and automatic operation, together with a corresponding efficient and flexible calibration method, realizing precise calibration of the compound eye system over a close-range large field of view; the calibration device is simple and the calibration precision is high.
The technical scheme adopted by the invention is as follows: a compound eye system calibration device based on a single LED luminous point and a two-dimensional rotary table comprises the two-dimensional rotary table, a motor driver, an LED, a magnetic gauge stand support frame, a one-dimensional translation stage, a computer and a compound eye system. The two-dimensional rotary table comprises high-precision servo motors and a fixed frame; the compound eye system is installed at the tail end of the transverse shaft of the rotary table and rotates together with the vertical shaft and the transverse shaft. The LED is arranged at the top end of the magnetic gauge stand support frame and its placing position can be adjusted; the support frame is fixed on the one-dimensional translation stage, on which the LED can translate along the direction of the transverse shaft of the rotary table. In the calibration process the drivers are controlled to rotate the vertical shaft and the transverse shaft, the LED position is moved by controlling the one-dimensional translation stage, the compound eye system captures the three-dimensional space target point to obtain a light spot image, the image is transmitted to the computer through a USB interface, the center coordinates of the image light spot are calculated, and the angle values of the vertical and transverse shafts together with the spot center coordinates are recorded, so that the coordinate-angle mapping relation of each sub-eye is established at two positions in the unified coordinate system.
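The spot-center computation mentioned above can be realized, for instance, as an intensity-weighted centroid over a thresholded frame. The following sketch is illustrative only; the threshold value and the 8-bit grayscale input are assumptions and not part of this disclosure.

```python
import numpy as np

def spot_center(image, threshold=50):
    """Intensity-weighted centroid of the brightest spot in a grayscale frame.

    image: 2-D uint8 array from the CMOS sensor; threshold suppresses background.
    Returns (u, v) pixel coordinates, or None if no pixel exceeds the threshold.
    """
    img = image.astype(np.float64)
    img[img < threshold] = 0.0            # suppress dark background
    total = img.sum()
    if total == 0:
        return None                        # no spot detected
    v_idx, u_idx = np.indices(img.shape)   # row (v) and column (u) indices
    u = (img * u_idx).sum() / total        # weighted mean column
    v = (img * v_idx).sum() / total        # weighted mean row
    return u, v
```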
In addition, the invention also provides a compound eye system calibration method based on the single LED luminous point and the two-dimensional turntable, which comprises the following steps:
Step (1): adjust the two-dimensional turntable so that the vertical axis and the horizontal axis intersect at the spherical center of the spherical crown surface of the compound eye and are perpendicular to each other.
Step (2): set the zero-point angle positions of the two motor shafts as the starting points of the angle rotation in the calibration process, and rotate the two motors to the zero-point positions.
Step (3): place the LED approximately 500 mm in front of the compound eye, rotate the transverse shaft at equal angular intervals within 360°, collect the spot image captured by the compound eye at each position, and transmit the images to the computer through the USB interface to calculate the spot center coordinates; adjust the position of the LED along the guide-rail direction until the center position of the imaging light spot of the central sub-eye remains unchanged while the transverse shaft rotates.
Step (4): starting from the angle zero position, the calibration process runs automatically under the computer control flow (FIG. 5), which drives the upper and lower motors so that the compound eye rotates (an illustrative acquisition-loop sketch follows these steps). Record the position of the LED at this time as Z1, acquire the light spot images formed by the LED on the CMOS at a set series of angle positions, calculate the coordinates of the spot center points, match each spot with the corresponding sub-eye channel to determine the sub-eye to which it belongs, and record the rotation angles of the upper and lower motors and the spot coordinates for that sub-eye channel.
Step (5): after recording at the first position is complete, measure the distance from the LED to the spherical center of the spherical crown surface of the compound eye.
Step (6): control the one-dimensional translation stage to move the LED to a new position Z2, and repeat steps (4) and (5) at Z2.
Step (7): when the two shafts rotate, their rotation can equivalently be regarded as rotation of the LED, so the turntable can be considered static while the LED rotates in space around the intersection point of the two shafts. Based on this relationship, a unified coordinate system is established, as shown in FIG. 3; virtual three-dimensional targets whose LED positions are distributed over a spherical surface in space are obtained at the two positions Z1 and Z2 respectively, and for both Z1 and Z2 the distance from the LED to the intersection point of the two shafts, the rotation angles of the two motor shafts corresponding to each sub-eye, and the image point coordinates are recorded.
Step (8): according to the recorded coordinate-angle relations, the corresponding position of each spot on each virtual sphere is obtained by discrete-point interpolation, and the mapping relation between each image spot and its incident ray vector is then obtained, completing the calibration process.
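As referenced in step (4), the unattended acquisition can be organized as a simple control loop. The sketch below is purely illustrative: the turntable/camera style interfaces (move_to, capture) and the match_sub_eye routine are hypothetical placeholders rather than part of this disclosure, it reuses the spot_center sketch given earlier, and for brevity it records a single spot per frame.

```python
def record_position(turntable, camera, angle_list, match_sub_eye):
    """Drive the two axes through angle_list and record (alpha, beta, u, v) per sub-eye.

    turntable and camera are hypothetical interfaces: turntable.move_to(alpha, beta)
    positions the vertical and horizontal axes, camera.capture() returns one
    grayscale frame from the compound-eye CMOS.  match_sub_eye maps a spot center
    (u, v) to the sub-eye channel it belongs to.
    """
    samples = {}                                   # sub-eye id -> list of (alpha, beta, u, v)
    for alpha, beta in angle_list:                 # the set series of angle positions
        turntable.move_to(alpha, beta)
        frame = camera.capture()
        center = spot_center(frame)                # centroid routine from the earlier sketch
        if center is None:
            continue                               # LED not visible at this orientation
        eye_id = match_sub_eye(center)
        samples.setdefault(eye_id, []).append((alpha, beta, center[0], center[1]))
    return samples
```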
The above calibration method is expressed by a mathematical model as:
the vertical axis is represented by α with respect to its zero angle rotation angle, the horizontal axis is represented by β with respect to its zero angle rotation angle, the coordinates of the light spots on the image are represented by P (u, v), and then Z for each sub-eye1The angle of rotation of the two motors and the spot position can be recorded as (α)1111,P11),(α1212,P12),…,(α1n1n,P1n) At Z2The angle of rotation of the two motors and the spot position are noted (α)2121,P21),(α2222,P22),…,(α2n2n,P2n). Therefore we have established the image upper point for each sub-eye and the corresponding Z1And Z2Relation between upper point of spherical surface at two positions and included angle of coordinate system of rotary table
Figure BDA0001169258070000031
And
Figure BDA0001169258070000032
Figure BDA0001169258070000033
An OXYZ world coordinate system is established as shown in FIG. 3: the origin is located at the intersection of the two axes, the Y axis coincides with the vertical axis with its positive direction pointing away from the lower motor, the Z axis coincides with the horizontal axis with its positive direction pointing away from the upper motor, and the coordinate system is right-handed, the X direction being as shown in the figure. As shown in FIG. 4, at a position at distance $R$ from the origin, with the lower-motor (vertical axis) angle $\alpha$ and the upper-motor (horizontal axis) angle $\beta$, the three-dimensional coordinates $(x, y, z)$ of the virtual target point are:

$$\begin{cases} x = R\sin\alpha\cos\beta \\ y = R\sin\alpha\sin\beta \\ z = R\cos\alpha \end{cases}$$
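As an illustrative sketch under the spherical parametrization written above (the signs depend on the chosen positive rotation directions), a recorded angle pair can be converted to a virtual target point as follows.

```python
import numpy as np

def virtual_target_point(alpha_deg, beta_deg, R):
    """3-D coordinates of the equivalent LED position for motor angles (alpha, beta).

    alpha_deg: vertical-axis angle relative to its zero position (degrees).
    beta_deg:  horizontal-axis angle relative to its zero position (degrees).
    R:         measured distance from the LED to the intersection point of the axes.
    """
    a, b = np.radians(alpha_deg), np.radians(beta_deg)
    x = R * np.sin(a) * np.cos(b)
    y = R * np.sin(a) * np.sin(b)
    z = R * np.cos(a)
    return np.array([x, y, z])
```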
A spot-angle correspondence is established at the two positions in this way. For a known spot coordinate, the corresponding angles are obtained by interpolation, giving the corresponding points on the virtual spheres at the two positions $Z_1$ and $Z_2$; the straight line through these corresponding points is the incident ray. Denoting the angles between the incident ray and the coordinate axes as

$$(\theta_x, \theta_y, \theta_z),$$

the relationship between the image spot and the incident ray vector can be established as

$$P(u,v) \mapsto (\theta_x, \theta_y, \theta_z),$$

thereby realizing the calibration.
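One possible way to realize the coordinate-angle mapping of a sub-eye is scattered-data interpolation over its recorded (u, v) to (alpha, beta) samples, for example with SciPy's griddata. The sketch below is an illustration under that assumption and is not the specific interpolation prescribed by this disclosure.

```python
import numpy as np
from scipy.interpolate import griddata

def interpolate_angles(samples, uv_query):
    """Interpolate (alpha, beta) for a measured spot center from recorded samples.

    samples:  sequence of (alpha, beta, u, v) tuples for one sub-eye at one LED
              position (Z1 or Z2), as recorded during calibration.
    uv_query: (u, v) spot center computed from a newly captured image.
    """
    data = np.asarray(samples, dtype=float)
    points = data[:, 2:4]                                  # known spot coordinates (u, v)
    query = np.atleast_2d(uv_query)
    alpha = griddata(points, data[:, 0], query, method='linear')[0]
    beta = griddata(points, data[:, 1], query, method='linear')[0]
    return float(alpha), float(beta)                       # NaN if outside the sampled region
```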
Compared with the prior art, the invention has the advantages that:
(1) the calibration device provided by the invention consists of a single LED and a two-dimensional rotary table, can construct a target covering the full field of view of the compound eye, unifies the coordinate systems of a plurality of sub-eyes, and does not need to consider specific models of imaging and distortion, so that the calibration device is suitable for calibrating a compound eye system with a plurality of sub-eyes, complicated distortion and large field of view.
(2) The calibration device adopts high-precision servo motors, so the rotation angle positions are accurate; using an LED luminous point avoids blurred image points and improves the extraction precision of the spot centers; and because the calibration process is automated, there is less human interference and the calibration precision is therefore higher.
(3) Once the system has been adjusted, unattended automatic calibration can be realized under the control of the programmed software: at each position the angles of the two turntable shafts are obtained automatically and the spot coordinates computed from the compound eye's capture of the target are recorded, making the calibration process more convenient and faster.
Drawings
FIG. 1 is a schematic diagram of the structure of a novel bionic compound eye to be calibrated according to the present invention;
FIG. 2 is a schematic view of a calibration system of the present invention;
FIG. 3 is a schematic diagram of a calibration method of the present invention;
FIG. 4 is a schematic diagram of the position relationship of the LEDs in the coordinate system according to the present invention;
FIG. 5 is a flow chart of computer control according to the present invention;
FIG. 6 is a schematic view of the present invention for three-dimensional object localization;
Reference numbers in the figures: 1: sub-eye lens; 2: meniscus lens; 3: CMOS image sensor; 4: PCB; 5: compound eye spherical shell; 6: computer; 7: upper motor; 8: lower motor; 9: compound eye system; 10: turntable fixed frame; 11: motor driver; 12: LED (light emitting diode); 13: magnetic gauge stand support frame; 14: one-dimensional translation stage; 15: upper motor rotating shaft (transverse shaft); 16: lower motor rotating shaft (vertical shaft); 17: spherical virtual target formed by the LED at the first position; 18: spherical virtual target formed by the LED at the second position.
Detailed Description
The technical solution of the present invention will be further described in detail with reference to the accompanying drawings and the specific implementation steps.
The compound eye system to be calibrated is shown in FIG. 1 and consists of sub-eye lenses, a meniscus lens, a CMOS camera, a spherical shell and a housing. Each sub-eye lens on the hemispherical shell is responsible for target imaging within a field area of about 20°; the meniscus lens is positioned between the sub-eye lenses and the CMOS camera to optimize the incident light passing through the sub-eye lenses; the CMOS camera converts the optical images received through the sub-eye and meniscus lenses into digital images and outputs them; and the spherical shell and housing are the mechanical supporting structure of the whole compound eye system.
The novel bionic compound eye system calibration device provided by the invention is formed by the coordinated arrangement of the following parts:
as shown in fig. 2, the invention mainly comprises a two-dimensional turntable, an LED lamp bead 12, a compound eye system 9 and a computer 6, wherein the two-dimensional turntable mainly comprises two servo motors (7, 8) and a fixed frame 10, the two motors are respectively provided with a high-precision driver 11, the compound eye system 9 is fixed at the tail end of a transverse shaft and can synchronously rotate along with the transverse shaft 15, the rotation of a vertical shaft 16 can drive the two-shaft turntable and the transverse shaft 15 to move in the horizontal direction, and the transverse shaft 15 and the vertical shaft 16 can rotate within the range of 360 degrees. The calibrated compound eye system 9 is arranged on the transverse shaft 15, and the up-down and left-right positions of the compound eye fixed on the transverse shaft 15 can be finely adjusted by adjusting the fastening nuts around the compound eye, so that the intersection of the vertical shaft and the transverse shaft with the spherical center of the spherical crown surface of the compound eye is convenient to adjust. LED lamp pearl 12 is laid on magnetism gauge stand support frame 13, and support frame 13 is fixed on one-dimensional translation platform 14, and LED lamp pearl 12 is located the dead ahead of revolving stage initial point position.
The invention provides a calibration method corresponding to a novel bionic compound eye system calibration device, which specifically comprises the following steps:
(1) The two-dimensional turntable is adjusted so that the horizontal axis 15 and the vertical axis 16 intersect at the spherical center of the spherical crown surface of the compound eye 9 and are perpendicular to each other, as shown in FIG. 3.
(2) The zero-point angle positions of the two motor shafts are set as the starting points of the angle rotation in the calibration process, and the horizontal and vertical shafts are rotated to the zero-point positions.
(3) The LED lamp bead 12 is placed about 500 mm in front of the compound eye and the transverse shaft 15 is rotated at equal angular intervals within 360°. At each angle position the sub-eyes 1 of the compound eye system capture the LED luminous point 12 and image it onto the CMOS image sensor 3 through the meniscus lens 2, and the images are transmitted to the computer 6 through the USB interface to calculate the spot center coordinates; the position of the LED lamp bead 12 is adjusted until the centers of the imaging light spots of the central sub-eye coincide at the different angles.
(4) Starting from the zero position of the two axes, the computer control flow shown in FIG. 5 drives the horizontal axis 15 and the vertical axis 16 so that the compound eye 9 rotates; the rotation angles are not equally spaced, which ensures a uniform distribution of the image spot points obtained by each sub-eye channel 1. The position of the LED at this time is recorded as Z1. At each set angle position the light spot image formed by the LED luminous point 12 on the CMOS image sensor 3 is collected, the coordinates of the spot center point are calculated, the spot is matched with the corresponding sub-eye channel to determine the sub-eye to which it belongs, and the rotation angles of the horizontal and vertical axes together with the corresponding spot coordinates are recorded for that sub-eye.
(5) After recording at position Z1 is complete, the distance from the LED lamp bead 12 to the spherical center of the spherical crown surface of the compound eye 9 is measured.
(6) The one-dimensional translation stage 14 is controlled to move the LED lamp bead 12 to a new position Z2, and steps (4) and (5) are repeated at the new position of the LED lamp bead 12.
(7) When the two shafts rotate, their rotation can equivalently be regarded as rotation of the LED lamp bead 12, so the turntable can be considered static while the LED lamp bead 12 rotates in space around the intersection point of the two shafts. According to this relation, as shown in FIG. 3, virtual three-dimensional targets 17 and 18, in which the LED luminous point 12 is distributed over a spherical surface in space, are obtained at the two positions Z1 and Z2 respectively, and at both positions the distance between the LED lamp bead 12 and the intersection point of the two shafts, the rotation angles of the two motor shafts corresponding to each sub-eye, and the image point coordinates are recorded.
(8) For the sub-eye numbered m (1 ≤ m ≤ 141), the coordinates of the center point of an image spot are P(u, v) and the rotation angles of the upper and lower motors are (α, β), so the spot-angle correspondence (α, β, P) of that sub-eye can be established. After a target is captured by the compound eye, the spot coordinates on the image are obtained by calculation and the corresponding angle information must be determined; since the measured spot coordinates generally do not coincide exactly with the recorded data, the unknown angle information is obtained by discrete-point interpolation from the known spot coordinates and the existing spot-angle relation. From the angle information at the two positions Z1 and Z2, the corresponding points of the image light spot on the two spherical surfaces are obtained, and the straight line through the two points corresponding to one light spot is the incident ray; the relation between the image coordinates and the incident ray vector is thereby established, completing the calibration of the compound eye system.
The implementation process of the novel bionic compound eye system calibration device and the corresponding calibration method provided by the invention is given in the above process, and the high-precision automatic calibration of the bionic compound eye system can be realized. After calibration is completed, the compound eye device can be used for carrying out the work of detecting, positioning and the like of the three-dimensional space target.
As shown in FIG. 6, when a target point in space with three-dimensional coordinates (x, y, z) is captured by the compound eye, the corresponding angles at the two positions, $(\alpha_1, \beta_1)$ and $(\alpha_2, \beta_2)$, are obtained for each spot on the image by angle interpolation; knowing these angles, the corresponding points on the two spheres can be obtained. As shown in FIG. 4, let the distance between the first spherical surface and the origin of the coordinate system be $L_1$ and the distance between the second spherical surface and the origin be $L_2$; then the two points computed on the two spherical surfaces are $M_1(x_1, y_1, z_1)$ and $M_2(x_2, y_2, z_2)$ respectively:

$$M_1 = (L_1\sin\alpha_1\cos\beta_1,\ L_1\sin\alpha_1\sin\beta_1,\ L_1\cos\alpha_1), \qquad M_2 = (L_2\sin\alpha_2\cos\beta_2,\ L_2\sin\alpha_2\sin\beta_2,\ L_2\cos\alpha_2)$$

The straight line formed by the corresponding points of the same light spot on the two spherical surfaces is the incident ray, and its linear equation is:

$$\frac{x - x_1}{x_2 - x_1} = \frac{y - y_1}{y_2 - y_1} = \frac{z - z_1}{z_2 - z_1}$$

When the luminous point is captured by several sub-eyes, several incident rays are obtained; since these incident rays are all emitted from the target point, their intersection point is the target point to be measured, and the simultaneous equations are:

$$\frac{x - x_{1i}}{x_{2i} - x_{1i}} = \frac{y - y_{1i}}{y_{2i} - y_{1i}} = \frac{z - z_{1i}}{z_{2i} - z_{1i}}, \qquad i = 1, 2, \ldots, n$$

In practice a space target point is generally captured by 4 to 8 sub-eyes, so 4 ≤ n ≤ 8 (n denotes the number of spots). Only three coordinate parameters need to be solved, and the number of equations exceeds the number of unknowns, so the system is overdetermined and the optimal solution for the three-dimensional coordinates (x, y, z) of the space point can be obtained by the least squares method.
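The over-determined intersection can be solved, for example, by combining the perpendicular-distance conditions of all incident rays into one small linear system. The following sketch assumes each ray is given by its two virtual-sphere points M1 and M2 and is one possible least-squares formulation, not necessarily the one used here.

```python
import numpy as np

def intersect_rays(m1_points, m2_points):
    """Least-squares intersection of incident rays, each defined by two points.

    m1_points, m2_points: arrays of shape (n, 3); row i holds the points M1_i and
    M2_i of ray i on the Z1 and Z2 virtual spheres.  Returns the 3-D point that
    minimizes the summed squared perpendicular distance to all rays.
    """
    m1 = np.asarray(m1_points, dtype=float)
    m2 = np.asarray(m2_points, dtype=float)
    d = m2 - m1
    d /= np.linalg.norm(d, axis=1, keepdims=True)   # unit ray directions
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, u in zip(m1, d):
        proj = np.eye(3) - np.outer(u, u)           # projector onto the plane normal to the ray
        A += proj
        b += proj @ p
    return np.linalg.solve(A, b)                    # optimal (x, y, z)
```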
On the basis of the theory and the work, the method can measure a three-dimensional large-scale object and can perform point cloud scanning on the surface of the object so as to realize surface shape reconstruction. The functions realized by the calibration device and the calibration method can be well applied to the work of space positioning, surface shape measurement, three-dimensional reconstruction and the like.

Claims (3)

1. A compound eye system calibration device based on a single LED luminous point and a two-dimensional rotary table, characterized in that: it comprises a two-dimensional rotary table, a motor driver, an LED, a magnetic gauge stand support frame, a one-dimensional translation stage, a computer and a compound eye system; the two-dimensional rotary table comprises a high-precision servo motor and a fixed frame; the compound eye system is arranged at the tail end of the transverse shaft of the rotary table and rotates together with the vertical shaft and the transverse shaft; the LED is arranged at the top end of the support frame of the magnetic gauge stand and its placing position can be adjusted; the support frame is fixed on the one-dimensional translation stage, on which the LED can perform one-dimensional translation along the direction of the transverse shaft of the rotary table; in the calibration process the driver is controlled to drive the vertical shaft and the transverse shaft to rotate, the LED position is moved by controlling the one-dimensional translation stage, the compound eye system captures the three-dimensional space target point to obtain a light spot image, the image is transmitted to the computer through a USB interface, the center coordinates of the image light spot are calculated, and the angle values of the vertical and transverse shafts together with the spot center coordinates are recorded, so that the coordinate-angle mapping relation of each sub-eye is established at two positions in the unified coordinate system.
2. A compound eye system calibration method based on a single LED light-emitting point and a two-dimensional rotary table, using the compound eye system calibration device based on a single LED light-emitting point and a two-dimensional rotary table of claim 1, characterized in that: the method comprises the following steps:
adjusting a two-dimensional turntable device to realize that a vertical shaft and a horizontal shaft are intersected at the spherical center position of a spherical crown surface of a compound eye, and enabling the vertical shaft and the horizontal shaft to be perpendicular to each other;
setting the zero-point angle positions of the rotating shafts of the two motors as starting points of angle rotation in the calibration process, and rotating the two motors to the zero-point positions;
step (3) placing an LED 500mm in front of the compound eye, rotating a cross shaft at equal angular intervals within 360 degrees, collecting a spot image captured by the compound eye at each position, transmitting the image to a computer through a USB interface to calculate the central coordinates of light spots, adjusting the position of the LED along the direction of the cross shaft of a turntable, and keeping the central position of the imaging light spots of the central sub-eye unchanged when the cross shaft is rotated;
and (4) starting from the angle zero position, realizing the automatic operation of the calibration process by the control flow of the computer, driving the upper and lower motors to drive the compound eye to rotate, and recording the position of the LED as Z1; collecting light spot images of the LED imaged on the CMOS at a set series of angle positions, calculating the coordinates of the spot center points, matching the spots with the corresponding sub-eye channels to determine the sub-eyes to which they belong, and recording the rotation angles of the upper and lower motors and the spot coordinates for those sub-eye channels at that moment;
and (5) after the first position is recorded, measuring the distance from the LED to the spherical center of the spherical crown surface of the compound eye;
step (6) controlling the one-dimensional translation stage to move the LED to a new position Z2, and repeating steps (4) and (5) at Z2;
step (7) when the two shafts rotate, their rotation can be regarded as rotation of the LED, so the turntable can be considered static and the LED rotates in space around the intersection point of the two shafts; a unified coordinate system is established, virtual three-dimensional targets with the LED distributed over a spherical surface in space are obtained at the two positions Z1 and Z2 respectively, and the distance between the LED and the intersection point of the two shafts at the two positions, the rotation angles of the two motor shafts corresponding to each sub-eye, and the image point coordinates are recorded;
and (8) according to the recorded coordinate and angle relation, the corresponding position of each spot on each virtual spherical surface can be obtained by a discrete point interpolation method, and further the mapping relation between each image spot and the incident ray vector can be obtained, so that the calibration process can be completed.
3. The method as claimed in claim 2, wherein in the mathematical model used in the calibration steps the rotation angle of the vertical axis relative to its zero position is denoted $\alpha$, the rotation angle of the horizontal axis relative to its zero position is denoted $\beta$, and the light spot coordinates on the image are denoted $P(u, v)$; for each sub-eye the motor angles and spot positions recorded at $Z_1$ are $(\alpha_{11},\beta_{11},P_{11}), (\alpha_{12},\beta_{12},P_{12}), \ldots, (\alpha_{1n},\beta_{1n},P_{1n})$ and those recorded at $Z_2$ are $(\alpha_{21},\beta_{21},P_{21}), (\alpha_{22},\beta_{22},P_{22}), \ldots, (\alpha_{2n},\beta_{2n},P_{2n})$, establishing for each sub-eye the relation between a point on the image and the turntable-coordinate-system angles of the corresponding points on the spheres at the two positions $Z_1$ and $Z_2$:

$$P(u,v) \mapsto (\alpha_1, \beta_1)$$

and

$$P(u,v) \mapsto (\alpha_2, \beta_2);$$

an OXYZ world coordinate system is established, the origin of which is located at the intersection point of the two shafts, the Y axis coincides with the vertical axis with its positive direction pointing away from the lower motor, the Z axis coincides with the horizontal axis with its positive direction pointing away from the upper motor, and the coordinate system is right-handed; at a position at distance $R$ from the origin, with the lower-motor (vertical axis) angle $\alpha$ and the upper-motor (horizontal axis) angle $\beta$, the three-dimensional coordinates $(x, y, z)$ of the virtual target point are:

$$\begin{cases} x = R\sin\alpha\cos\beta \\ y = R\sin\alpha\sin\beta \\ z = R\cos\alpha \end{cases}$$

the spot-angle correspondence is established at the two positions in this way; for a known spot coordinate, the corresponding angles are obtained by interpolation so as to obtain the corresponding points on the virtual spheres at the two positions $Z_1$ and $Z_2$, the linear equation of the line connecting the corresponding points is the incident ray, and the angles between the incident ray and the coordinate axes are set as

$$(\theta_x, \theta_y, \theta_z);$$

thus the relationship between the spot and the incident ray vector can be established as

$$P(u,v) \mapsto (\theta_x, \theta_y, \theta_z),$$

realizing the calibration.
Priority Applications (1)

Application CN201611094763.2A, priority date 2016-12-02, filing date 2016-12-02: Compound eye system calibration device and calibration method based on single LED luminous point and two-dimensional rotary table. Granted as CN106767913B; status: Active. Family ID: 58883700.

Publications (2)

CN106767913A (en), published 2017-05-31
CN106767913B (en), granted 2020-05-12

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108007390A (en) * 2017-11-17 2018-05-08 北京长峰科威光电技术有限公司 A kind of pendulum mirror linearity test system based on CCD
CN108648232B (en) * 2018-05-04 2022-03-25 北京航空航天大学 Binocular stereoscopic vision sensor integrated calibration method based on precise two-axis turntable
CN110009686B (en) * 2019-03-13 2022-02-18 北京小马智行科技有限公司 Camera calibration method, device and platform and camera calibration method applied to calibration platform
CN112393882B (en) * 2020-04-21 2022-08-23 哈尔滨工业大学 Compound eye imaging adjustment method based on micro-imaging micro-lens parameter detection
CN114688996B (en) * 2020-12-31 2023-11-03 北京华航无线电测量研究所 Method for measuring rotation precision angle of turntable
CN112950727B (en) * 2021-03-30 2023-01-06 中国科学院西安光学精密机械研究所 Large-view-field multi-target simultaneous ranging method based on bionic curved compound eye
CN113188776B (en) * 2021-04-27 2022-11-11 哈尔滨工业大学 Compound eye imaging contact ratio detection system and detection method
CN116681732B (en) * 2023-08-03 2023-10-20 南昌工程学院 Target motion recognition method and system based on compound eye morphological vision


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8102516B2 (en) * 2008-02-26 2012-01-24 Panasonic Corporation Test method for compound-eye distance measuring apparatus, test apparatus, and chart used for the same

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4717863B2 (en) * 2007-09-28 2011-07-06 富士フイルム株式会社 Calibration method and apparatus for compound eye imaging apparatus and calibration chart used therefor
CN101755190A (en) * 2008-05-19 2010-06-23 松下电器产业株式会社 Calibration method, calibration device, and calibration system having the device
CN104165626A (en) * 2014-06-18 2014-11-26 长春理工大学 Bionic facetted eye imaging target positioning system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Calibration of compound eye system for large-field-of-view target location; Guo Fang et al.; Optics and Precision Engineering; 2012-05-31; Vol. 20, No. 5; pp. 913-920 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP02 Change in the address of a patent holder

Address after: No.443 Huangshan Road, Shushan District, Hefei City, Anhui Province 230022

Patentee after: University of Science and Technology of China

Address before: 230026 Jinzhai Road, Baohe District, Hefei, Anhui Province, No. 96

Patentee before: University of Science and Technology of China