CN105806316A - Trinocular vision sensor for micro/nano satellite and measurement method thereof - Google Patents

Trinocular vision sensor for micro/nano satellite and measurement method thereof

Info

Publication number
CN105806316A
Authority
CN
China
Prior art keywords
target
lens
field
point
vision
Prior art date
Legal status
Granted
Application number
CN201410845011.XA
Other languages
Chinese (zh)
Other versions
CN105806316B (en)
Inventor
张宇
刘宗明
卢山
韩飞
曹姝清
贺亮
Current Assignee
Shanghai Xinyue Instrument Factory
Original Assignee
Shanghai Xinyue Instrument Factory
Priority date
Filing date
Publication date
Application filed by Shanghai Xinyue Instrument Factory
Priority to CN201410845011.XA
Publication of CN105806316A
Application granted
Publication of CN105806316B
Legal status: Active
Anticipated expiration

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a trinocular vision sensor for micro/nano satellites and a measurement method thereof. The sensor comprises two far-field lenses and one near-field lens, the two far-field lenses being located on either side of the near-field lens. The two far-field lenses serve as the lenses of a binocular vision system and the near-field lens serves as the lens of a monocular vision system. At long range, a small-field-of-view far-field binocular measurement mode is used; when the distance closes to the point where the target can no longer fit in the binocular field of view, the sensor switches to a large-field-of-view near-field monocular measurement mode. In monocular measurement, the relative position and attitude of the target are obtained from the target size previously measured by binocular vision. The sensor offers a large measurement range together with low mass and small volume.

Description

Trinocular vision sensor for micro/nano satellites and measurement method thereof
Technical field
The present invention relates to vision measurement technology, and in particular to a trinocular vision sensor combining monocular vision and binocular vision, and to a measurement method thereof.
Background art
As the complexity of human space activity grows, rendezvous, docking, and capture technologies of varying degrees of autonomy have been widely applied in fields such as on-orbit spacecraft servicing and space-platform payload upgrading, and are increasingly becoming an important component of space logistics support and satellite-capture systems.
In rendezvous and capture, information such as the target's current position and attitude is sent to the satellite and combined with the satellite's own state to plan the rendezvous and capture strategy, and the resulting commands are passed to the corresponding subsystems. The sensor, as the core of rendezvous and capture technology, performs the measurement and tracking of the target's position, attitude, and related information. Optical vision sensors, by virtue of their high measurement accuracy, small size, and low cost, hold clear advantages in close-range rendezvous and docking.
In the prior art, a combination of near-field binocular vision and far-field binocular vision is used to measure and track the position and attitude of a non-cooperative target over a wide range. This approach is suitable for large satellite platforms, but it cannot meet the volume and mass constraints that micro/nano satellites place on their payloads. Reducing mass by using a single binocular system, on the other hand, limits the measurement range.
Summary of the invention
It is an object of the present invention to provide a trinocular vision sensor for micro/nano satellites and a measurement method thereof, offering a large measurement range together with low mass and small volume.
To solve the above problems, the present invention provides a trinocular vision sensor for micro/nano satellites, comprising two far-field lenses and one near-field lens, each lens having its own imaging unit and the three lenses sharing one data processing unit. The two far-field lenses are identical and are located on either side of the near-field lens; they serve as the lenses of the binocular vision system, while the near-field lens serves as the lens of the monocular vision system.
In the above trinocular vision sensor for micro/nano satellites, the far-field lens comprises a first barrel; a first lens element, a first spacer ring, a second lens element, a second spacer ring, a third lens element, a third spacer ring, a first diaphragm, and a fourth lens element are arranged in sequence inside the first barrel; a retaining ring is fitted at the first-lens-element end of the first barrel, and a lens hood is mounted at the front of the retaining ring.
In the above trinocular vision sensor for micro/nano satellites, the first barrel and the first, second, and third spacer rings are each provided with an air-release groove; all non-imaging surfaces of the first, second, third, and fourth lens elements are painted, and all metal parts of the far-field lens are blackened.
In the above trinocular vision sensor for micro/nano satellites, the near-field lens comprises a second barrel; a fifth lens element, a fourth spacer ring, a sixth lens element, a fifth spacer ring, a seventh lens element, a second diaphragm, an eighth lens element, a sixth spacer ring, and a ninth lens element are arranged in sequence inside the second barrel; a front retaining ring is fitted at the fifth-lens-element end of the second barrel and a rear retaining ring at the ninth-lens-element end; a lens hood is mounted at the front of the front retaining ring.
In the above trinocular vision sensor for micro/nano satellites, the second barrel and the fourth, fifth, and sixth spacer rings are each provided with an air-release groove; all non-imaging surfaces of the fifth, sixth, seventh, eighth, and ninth lens elements are painted, and all metal parts of the near-field lens are blackened.
A further technical solution provided by the invention is a trinocular-vision-sensor measurement method for non-cooperative targets using the above sensor: at long range, the small-field-of-view far-field binocular measurement mode is used; when the distance closes to the point where the target can no longer enter both binocular fields of view simultaneously, the method switches to the large-field-of-view near-field monocular measurement mode, in which monocular vision obtains the relative position of the target from the target size previously obtained by binocular vision.
In the above measurement method, the small-field-of-view far-field binocular mode covers two cases: when the target appears as a point target in the binocular field of view, binocular vision measures the target's elevation and azimuth angles; when the target is a non-point target in the binocular field of view, binocular vision measures the target's relative position and attitude.
In the above measurement method, when the target is a non-point target in the binocular field of view, measuring the relative position and attitude of the target by binocular vision specifically comprises the following steps:
Step S1: perform image pre-processing, edge detection, and edge post-processing;
Step S2: from the ellipse extracted in the image, perform three-dimensional reconstruction of the corresponding circle on the target, estimate the geometric parameters of the space circle, and derive from them the relative position of the target;
Step S3: from the two-dimensional target contour extracted in the image, perform three-dimensional reconstruction of the target contour in space, and from it estimate the relative attitude of the target.
In the above measurement method, in step S1 the pre-processing comprises logarithmic image transformation and lens-distortion correction;
edge detection uses the standard Canny edge detector;
the edge post-processing is specifically as follows: first, remove intersection points and sharply bent points, thereby splitting the edges in the image into separate edge segments; next, apply a border-following algorithm and discard the short edges obtained by tracking; then fit line segments and circles, rejecting line segments with large fitting errors; finally, using line-segment merging and similar techniques, retain only the edges most likely to be line segments or ellipses and discard the rest.
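The post-processing chain just described (split edge chains at sharp bends, drop short tracked edges) can be sketched in pure Python. The chain representation as point lists, the angle threshold, and the step size are illustrative assumptions, not values from the patent.

```python
import math

def split_at_corners(chain, angle_thresh_deg=60.0, step=3):
    """Split an edge chain (list of (x, y) points) wherever the local
    direction turns sharply, approximating the 'remove sharply bent
    points' step that divides edges into separate segments."""
    if len(chain) < 2 * step + 1:
        return [chain]
    segments, start = [], 0
    for i in range(step, len(chain) - step):
        (x0, y0), (x1, y1), (x2, y2) = chain[i - step], chain[i], chain[i + step]
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        # wrap the turn angle into (-pi, pi] before comparing
        turn = abs(math.degrees((a2 - a1 + math.pi) % (2 * math.pi) - math.pi))
        if turn > angle_thresh_deg:
            segments.append(chain[start:i])
            start = i + 1
    segments.append(chain[start:])
    return [s for s in segments if s]

def drop_short_edges(chains, min_len=10):
    """Discard short tracked edges, as in the border-following step."""
    return [c for c in chains if len(c) >= min_len]
```

An L-shaped chain, for instance, splits into two straight segments at the corner.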
In the above measurement method, in step S2 the three-dimensional reconstruction of the space circle specifically comprises:
coarse ellipse extraction: fit ellipses to the edges remaining after step S1, discard candidates whose semi-axis ratio is not sufficiently close to a given value, retain those ellipses for which the portion supported by valid edge points exceeds a given fraction of the whole perimeter, and take the candidate with the largest mean semi-axis as the ellipse used for computation;
iterative ellipse refinement: to improve the accuracy and stability of the position computation, refine the coarsely extracted ellipse iteratively, as follows:
Step S2.1: assume the initial ellipse equation is ax^2 + bxy + cy^2 + dx + ey + f = 0, where a, b, c, d, e, f are the ellipse parameters and x, y are coordinates along the x- and y-axes;
Step S2.2: discretise the ellipse equation to obtain n discrete elliptical boundary points, where n is the number of discrete points;
Step S2.3: along the line through the ellipse centre O and each boundary point, search within a certain range in front of and behind the point, at a fixed step, for the local gradient maximum, thereby obtaining a new ring of elliptical boundary points;
Step S2.4: discard those new boundary points whose local gradient or fitting error deviates by more than a factor of three, then fit a new ellipse;
Step S2.5: repeat the above steps until the change in ellipse centre and semi-axes between two successive iterations falls below a given threshold; the centre position of the final ellipse then represents the relative position of the target.
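The refinement loop of steps S2.2 to S2.5 can be sketched with two simplifications, flagged here: the conic fit is replaced by an algebraic (Kåsa) circle fit, and the image-gradient search of step S2.3 is abstracted into a caller-supplied `observe` function that snaps a predicted boundary point to the nearest edge. All names and default values are illustrative.

```python
import math

def fit_circle(points):
    """Kåsa algebraic circle fit (stand-in for the ellipse fit): solve
    the 3x3 normal equations of x^2 + y^2 + a*x + b*y + c ~ 0."""
    S = [[0.0] * 3 for _ in range(3)]
    t = [0.0] * 3
    for x, y in points:
        row = (x, y, 1.0)
        rhs = -(x * x + y * y)
        for i in range(3):
            t[i] += row[i] * rhs
            for j in range(3):
                S[i][j] += row[i] * row[j]
    # Gaussian elimination with partial pivoting on the 3x3 system
    for i in range(3):
        piv = max(range(i, 3), key=lambda r: abs(S[r][i]))
        S[i], S[piv] = S[piv], S[i]
        t[i], t[piv] = t[piv], t[i]
        for r in range(i + 1, 3):
            fct = S[r][i] / S[i][i]
            for c in range(i, 3):
                S[r][c] -= fct * S[i][c]
            t[r] -= fct * t[i]
    p = [0.0] * 3
    for i in (2, 1, 0):
        p[i] = (t[i] - sum(S[i][j] * p[j] for j in range(i + 1, 3))) / S[i][i]
    a, b, c = p
    cx, cy = -a / 2.0, -b / 2.0
    return cx, cy, math.sqrt(cx * cx + cy * cy - c)

def refine(observe, cx, cy, r, n=36, eps=1e-4, max_iter=20):
    """Iterative refinement (S2.2-S2.5): discretise the current boundary,
    let `observe` snap each point to the local gradient maximum along the
    radial direction, refit, stop once centre and radius settle."""
    for _ in range(max_iter):
        pts = [observe(cx + r * math.cos(2 * math.pi * k / n),
                       cy + r * math.sin(2 * math.pi * k / n)) for k in range(n)]
        ncx, ncy, nr = fit_circle(pts)
        if max(abs(ncx - cx), abs(ncy - cy), abs(nr - r)) < eps:
            return ncx, ncy, nr
        cx, cy, r = ncx, ncy, nr
    return cx, cy, r
```

With an `observe` that projects points onto a true circular edge, the loop converges to that circle's centre and radius in a couple of iterations.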
In the above measurement method, in step S3 the three-dimensional reconstruction of the target contour proceeds as follows:
Step S3.1: sort the corner points of the target contour clockwise, taking the lower-right point as the starting point; contour corners with the same index in the left and right images are then corresponding points, and forward intersection yields the three-dimensional coordinates of the four contour corners, (Xi, Yi, Zi) for i = 1, ..., 4;
Step S3.2: compute the centroid G of these four contour corners;
Step S3.3: compute the scatter matrix N of the corner points about their centroid;
Step S3.4: perform a singular value decomposition of N; the singular vector corresponding to the smallest singular value is the normal vector n of the target contour;
Step S3.5: estimate two unit vectors u and v perpendicular to n and as parallel as possible to the target contour, their directions chosen so that u, v, and n form a right-handed system; these three vectors form the axes of the target body frame and thereby determine the attitude of the target. The attitude can be expressed as the rotation matrix whose columns are u, v, and n, or equivalently by three attitude angles;
Step S3.6: from the four contour corners, further estimate the half side length of each side of the quadrilateral they form; this quadrilateral is the target contour;
Step S3.7: estimate all parameters of the target contour and express the four contour corners in terms of the relative position, the body-frame axes, and the half side length of the contour;
Step S3.8: substitute these expressions into the collinearity equations to obtain the error equations;
Step S3.9: the unknowns of these error equations are the contour parameters and the relative position; write the equations compactly and linearise them by first-order Taylor expansion;
Step S3.10: for the four contour corners, considering both the left and right images, list the linearised error equations to obtain an error equation in matrix form;
Step S3.11: solve the matrix error equation by the least-squares principle to obtain the corrections;
Step S3.12: correct the parameters of the target contour accordingly; if the corrections are found to be very small, the optimal estimate of the contour parameters has been obtained; otherwise return to step S3.5 for the next iteration.
From the quadrilateral structure thus extracted, the two orthogonal directions it provides are used to establish a local coordinate frame, and the three attitude angles of the target are then obtained by decomposing the directions of the frame's coordinate axes.
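The frame construction of steps S3.1 to S3.5 can be illustrated with a simplified sketch: for four exactly coplanar corners the SVD step reduces to a cross product, and the in-plane axes are taken along the contour edges. This is a hedged stand-in, not the patent's exact computation, which uses the SVD to tolerate noise in the intersected corners.

```python
import math

def contour_frame(corners):
    """Build the target body frame from four coplanar contour corners
    (cf. steps S3.1-S3.5): centroid, one in-plane axis along an edge,
    normal from a cross product, second axis completing a right-handed
    system. `corners` is a list of four (X, Y, Z) tuples."""
    def sub(p, q):
        return tuple(pi - qi for pi, qi in zip(p, q))
    def cross(u, v):
        return (u[1] * v[2] - u[2] * v[1],
                u[2] * v[0] - u[0] * v[2],
                u[0] * v[1] - u[1] * v[0])
    def norm(u):
        n = math.sqrt(sum(c * c for c in u))
        return tuple(c / n for c in u)
    g = tuple(sum(c) / 4.0 for c in zip(*corners))       # centroid (S3.2)
    e1 = norm(sub(corners[1], corners[0]))               # first in-plane axis
    n = norm(cross(e1, sub(corners[3], corners[0])))     # contour normal
    e2 = cross(n, e1)                                    # e1, e2, n right-handed
    return g, e1, e2, n
```

For a unit square in the plane z = 5, the frame comes out as the expected axis-aligned triad with centroid (0.5, 0.5, 5).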
Compared with the prior art, the present invention has the following beneficial effects:
the trinocular vision sensor of the invention achieves wide-range measurement and tracking of non-cooperative targets; it overcomes both the large weight, volume, and power consumption caused by the conventional combination of near-field and far-field binocular vision and the limited measurement range of binocular vision alone, achieving a compact, lightweight design that meets the miniaturisation demands of micro/nano satellite platforms.
Brief description of the drawings
Fig. 1 is a schematic diagram of the trinocular vision sensor for micro/nano satellites of the present invention.
Fig. 2 is a structural diagram of the far-field lens of the preferred embodiment.
Fig. 3 is a structural diagram of the near-field lens of the preferred embodiment.
Fig. 4 is a schematic diagram of the non-cooperative target contour of the preferred embodiment.
Fig. 5 is a flow chart of the binocular target relative-pose measurement method of the preferred embodiment.
Detailed description of the invention
To make the above objects, features, and advantages of the present invention clearer and easier to understand, the invention is described in further detail below with reference to the drawings and specific embodiments.
The trinocular vision sensor for micro/nano satellites of the present invention comprises an optical unit, imaging units, and a data processing unit. The optical unit comprises two far-field lenses 1 and one near-field lens 2; the two far-field lenses serve as the lenses of the binocular vision system and the near-field lens as the lens of the monocular vision system. Each lens has its own imaging unit, and the three lenses share one data processing unit. Each imaging unit controls the exposure and imaging of its lens and transfers the image data to the data processing unit, which processes the data to obtain the position, attitude, and other information of the non-cooperative target (hereafter simply "target"). Preferably, the two far-field lenses are located on either side of the near-field lens, as shown in Fig. 1; this arrangement both maintains a spacing between the two far-field lenses (the farther apart the two far-field lenses, the better the imaging quality) and saves the space occupied by the three lenses, favouring miniaturisation of the sensor. Preferably, the three lenses use identical imaging units.
The sensor works in a combined monocular/binocular mode. At long range (the distance between the micro/nano satellite and the target), the small-field-of-view far-field binocular measurement mode is used; when the distance closes to the point where the target can no longer enter both binocular fields of view simultaneously, the sensor switches to the large-field-of-view near-field monocular measurement mode, thereby guaranteeing relative measurement over a wide range. The binocular mode covers two cases: when the target appears as a point target in the binocular field of view, binocular vision measures the target's elevation and azimuth angles; when the target is a non-point target (its shape and contour can be clearly seen), binocular vision measures the target's relative position and attitude. In the near-field monocular mode, monocular vision obtains the relative position of the target from the target size previously obtained by binocular vision.
A preferred embodiment of the trinocular vision sensor for micro/nano satellites and its measurement method is now described in detail.
In the present embodiment, the two far-field lenses are identical; Fig. 2 shows the structure of the far-field lens. The far-field lens comprises a first barrel 100; a first lens element 101, a first spacer ring 102, a second lens element 103, a second spacer ring 104, a third lens element 105, a third spacer ring 106, a first diaphragm 107, and a fourth lens element 108 are arranged in sequence inside the first barrel 100, and a retaining ring 109 is fitted at the first-lens-element end of the barrel. The front end of the retaining ring 109 (in the optical unit, the object side is the front and the image side the rear) carries an external thread for mounting a lens hood. The rear end of the first barrel 100 carries a first flange; the flange face has three φ1.8 first through holes on a φ16 mm circle, allowing the lens to be easily fixed to the camera body with screws, and the φ9 outer cylindrical surface of the flange serves as the locating feature for connecting the lens to the body. The retaining ring 109 is screwed onto the first barrel 100 and, to reduce volume, is arranged on the outside of the barrel. To balance air pressure in a vacuum environment, the lens has an air-release structure: the first barrel 100 and the first, second, and third spacer rings 102, 104, 106 are each provided with an air-release groove.
In the present embodiment the parameters of the far-field lens are: focal length 15.28 mm, field of view 30°, F-number 10, distortion about 0.1%; the lens volume is φ19.4 mm × 14.7 mm, the material is titanium alloy, the total weight is below 10 g, and the distance from the flange face to the image plane is 7.5 mm. To minimise stray light, all non-imaging surfaces of the far-field lens elements are painted matte black, a lens hood with an internal anti-reflection thread is fitted in front of the lens, and all metal parts are blackened. Within reasonable tolerances the lens meets the imaging requirements without special machining or alignment measures, and it maintains good performance over the temperature range of −40 °C to 60 °C, being little affected by temperature.
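Given the 30° far-field field of view stated above, one can estimate the range below which a target of a given width no longer fits in the binocular view, which is the trigger for switching to the 62° near-field monocular mode. The target width here is a hypothetical input; the patent does not fix one.

```python
import math

def min_range_for_full_view(target_width_m, fov_deg=30.0):
    """Range at which a flat target of the given width exactly spans the
    field of view; any closer and it no longer fits (simple half-angle
    trigonometry, ignoring the stereo baseline)."""
    return (target_width_m / 2.0) / math.tan(math.radians(fov_deg / 2.0))
```

For a hypothetical 2 m target this gives about 3.73 m, the same order as the embodiment's 5 m binocular-to-monocular switch distance.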
Fig. 3 shows the structure of the near-field lens of the present embodiment. The near-field lens comprises a second barrel 200; a fifth lens element 201, a fourth spacer ring 202, a sixth lens element 203, a fifth spacer ring 204, a seventh lens element 205, a second diaphragm 206, an eighth lens element 207, a sixth spacer ring 208, and a ninth lens element 209 are arranged in sequence inside the second barrel 200; a front retaining ring 210 is fitted at the fifth-lens-element end of the barrel and a rear retaining ring 211 at the ninth-lens-element end. The front end of the front retaining ring 210 carries an external thread for mounting a lens hood. To reduce volume, the rear retaining ring 211 also serves as the flange: its flange face 212 has three φ1.8 second through holes 213 on a φ18 circle, allowing the lens to be easily fixed to the camera body with screws, and the φ14.5 outer cylindrical surface 214 of the flange serves as the locating feature for connecting the lens to the body. The front retaining ring 210 and the rear retaining ring 211 are both screwed onto the second barrel 200 and, to reduce volume, are arranged on the outside of the barrel. To balance air pressure in a vacuum environment, the lens has an air-release structure: the second barrel 200 and the fourth, fifth, and sixth spacer rings 202, 204, 208 are each provided with an air-release groove.
In the present embodiment the parameters of the near-field lens are: focal length 6.815 mm, field of view 62°, F-number 10, distortion about 0.5%; the lens volume is φ21.4 mm × 8.8 mm, the material is titanium alloy, the total weight is below 15 g, the distance from the flange face to the image plane is 5.0 mm, and the distance from the last metal surface to the image plane is 3.0 mm. To minimise stray light, all non-imaging surfaces of the near-field lens elements are painted matte black, a lens hood with an internal anti-reflection thread is fitted in front of the lens, and all metal parts are blackened. Within reasonable tolerances the lens meets the imaging requirements without special machining or alignment measures, and it maintains good performance over the temperature range of −40 °C to 60 °C, being little affected by temperature.
In the present embodiment, the trinocular vision sensor covers a measurement range of 0.2 m to 35 m: within 0.2 m to 5 m, the large-field-of-view monocular mode measures the relative position and attitude of the target; within 5 m to 35 m, the small-field-of-view binocular mode is used, measuring the relative position and attitude of the target within 5 m to 10 m and the elevation and azimuth angles of the target within 10 m to 35 m.
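The three-band mode selection just described can be written as a small dispatcher. The thresholds are the embodiment's figures; the return strings are purely illustrative.

```python
def choose_mode(distance_m):
    """Select the measurement mode from the target range, following the
    0.2-35 m coverage of the embodiment."""
    if 0.2 <= distance_m < 5.0:
        return "near-field monocular: relative position and attitude"
    if 5.0 <= distance_m < 10.0:
        return "far-field binocular: relative position and attitude"
    if 10.0 <= distance_m <= 35.0:
        return "far-field binocular: elevation and azimuth"
    return "out of measurement range"
```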
Within 0.2 m to 5 m, the monocular mode relies on the target size obtained by the binocular mode to derive the relative position of the target; the algorithms involved can all use the prior art and are not elaborated in this embodiment.
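The monocular range recovery described here is, in essence, a pinhole similar-triangles computation: with the physical target width known from the earlier binocular phase, range follows from the width of the target's image. The 6.815 mm focal length is the embodiment's; the pixel pitch is an assumed value, since the patent does not state the detector's.

```python
def monocular_range(target_width_m, image_width_px, focal_mm=6.815, pixel_um=5.0):
    """Range Z = f * W / w from the pinhole model, where W is the known
    target width and w the width of its image on the sensor.
    The 5 um pixel pitch is an assumption for illustration."""
    image_width_mm = image_width_px * pixel_um / 1000.0
    return target_width_m * focal_mm / image_width_mm
```

For instance, a 1 m target imaged 272.6 px wide (1.363 mm on the sensor at the assumed pitch) yields a 5 m range.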
Within 10 m to 35 m, the target appears as a point in the binocular field of view; binocular vision measures the target's elevation and azimuth angles, and the algorithms involved can all use the prior art and are not elaborated in this embodiment.
Within 5 m to 10 m, the target can no longer be regarded as a point target in the binocular field of view; binocular vision then measures the target's relative position and attitude. The present embodiment mainly covers target contours consisting of quadrilaterals and circles, as shown in Fig. 4, and introduces a binocular target relative-pose measurement method which, as shown in Fig. 5, comprises the following steps:
Step S1: perform image pre-processing, edge detection, and edge post-processing.
Preferably, the pre-processing comprises logarithmic image transformation and lens-distortion correction. The distortion correction mainly serves to simplify the subsequent position and attitude computation, while the logarithmic transformation enhances the image for the subsequent edge detection, so that as many edges as possible are detected and the useful edge features can be fully exploited later.
Preferably, edge detection uses the standard Canny edge detector.
Edge detection yields many edges that are cluttered and noisy. To allow the subsequent extraction of line segments, ellipses, and contour features to proceed more easily, the cluttered, superfluous edges must be rejected, i.e. edge post-processing is performed. Preferably, this proceeds as follows: first, remove intersection points and sharply bent points, thereby splitting the edges in the image into separate edge segments; next, apply a border-following algorithm and discard the short edges obtained by tracking; then fit line segments and circles, rejecting line segments with large fitting errors; finally, using line-segment merging and similar techniques, retain only the edges most likely to be line segments or ellipses and discard the rest.
The algorithms used in the above pre-processing, edge detection, and edge post-processing can all use the prior art.
Step S2: from the ellipse extracted in the image, perform three-dimensional reconstruction of the corresponding circle on the target, estimate the geometric parameters of the space circle, and derive from them the relative position of the target.
Preferably, the three-dimensional reconstruction of the space circle specifically comprises:
coarse ellipse extraction: fit ellipses to the edges remaining after step S1, discard candidates whose semi-axis ratio is not sufficiently close to a given value (this value is generally determined empirically, for instance 1.0), retain those ellipses for which the portion supported by valid edge points exceeds a given fraction of the whole perimeter (likewise determined empirically, for instance 40%), and take the candidate with the largest mean semi-axis as the ellipse used for computation;
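The coarse-selection rules in this paragraph (semi-axis ratio near 1.0, edge support of at least about 40% of the perimeter, then the largest mean semi-axis wins) can be sketched directly. The candidate record layout and the ratio tolerance are illustrative assumptions.

```python
def select_ellipse(candidates, ratio_tol=0.5, support_min=0.40):
    """Coarse ellipse selection: drop fits whose minor/major semi-axis
    ratio strays too far from 1, drop fits supported by too little of
    the perimeter, return the survivor with the largest mean semi-axis.
    Candidates are dicts with keys 'a', 'b' (semi-axes) and 'support'
    (fraction of perimeter covered by valid edge points) -- an assumed
    representation, not the patent's."""
    kept = [c for c in candidates
            if abs(min(c["a"], c["b"]) / max(c["a"], c["b"]) - 1.0) <= ratio_tol
            and c["support"] >= support_min]
    return max(kept, key=lambda c: (c["a"] + c["b"]) / 2.0) if kept else None
```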
iterative ellipse refinement: to improve the accuracy and stability of the position computation, refine the coarsely extracted ellipse iteratively, as follows:
Step S2.1: assume the initial ellipse equation is ax^2 + bxy + cy^2 + dx + ey + f = 0, where a, b, c, d, e, f are the ellipse parameters and x, y are coordinates along the x- and y-axes;
Step S2.2: discretise the ellipse equation to obtain n discrete elliptical boundary points, where n is the number of discrete points;
Step S2.3: along the line through the ellipse centre O and each boundary point, search within a certain range in front of and behind the point, at a fixed step, for the local gradient maximum, thereby obtaining a new ring of elliptical boundary points.
This range is chosen as a trade-off between computational accuracy and real-time performance (computational load): the larger the range, the higher the accuracy but the greater the load and the worse the real-time performance. Beyond a certain range, the accuracy gain becomes insignificant while the load still grows markedly.
Step S2.4: discard those new boundary points whose local gradient or fitting error deviates by more than a factor of three, then fit a new ellipse;
Step S2.5: repeat the above steps until the change in ellipse centre and semi-axes between two successive iterations falls below a given threshold; the centre position of the final ellipse (its position relative to the micro/nano satellite) then represents the relative position of the target.
Step S3: from the two-dimensional target contour extracted in the image, perform three-dimensional reconstruction of the target contour in space, and from it estimate the relative attitude of the target.
Preferably, the three-dimensional reconstruction of the target contour proceeds as follows:
Step S3.1: sort the corner points of the target contour clockwise, taking the lower-right point as the starting point; contour corners with the same index in the left and right images (the images obtained by the two far-field lenses) are then corresponding points, and forward intersection yields the three-dimensional coordinates of the four contour corners, (Xi, Yi, Zi) for i = 1, ..., 4, where Xi, Yi, Zi are the coordinates of the i-th contour corner in the camera frame;
Step S3.2: compute the centroid G of these four contour corners;
Step S3.3: compute the scatter matrix N of the corner points about their centroid;
Step S3.4: perform a singular value decomposition of N; the singular vector corresponding to the smallest singular value is the normal vector n of the target contour;
Step S3.5: estimate two unit vectors u and v perpendicular to n and as parallel as possible to the target contour (their directions must be chosen so that u, v, and n form a right-handed system); these three vectors form the axes of the target body frame and thereby also determine the attitude of the target. The attitude of the target can be expressed as the rotation matrix whose columns are u, v, and n, or equivalently by three attitude angles;
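Step S3.5's closing remark, that the rotation matrix can equivalently be expressed by three attitude angles, can be illustrated with a standard extraction. The Z-Y-X (yaw-pitch-roll) convention is an assumption, as the patent does not name its angle convention.

```python
import math

def euler_zyx_from_matrix(R):
    """Extract (yaw, pitch, roll) in radians from a 3x3 rotation matrix
    given as nested lists, assuming the Z-Y-X convention R = Rz*Ry*Rx."""
    pitch = math.asin(max(-1.0, min(1.0, -R[2][0])))  # clamp for safety
    roll = math.atan2(R[2][1], R[2][2])
    yaw = math.atan2(R[1][0], R[0][0])
    return yaw, pitch, roll
```

A pure 30° rotation about the z-axis, for example, comes back as yaw = 30° with zero pitch and roll.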
Step S3.6: from the four contour corner points, further estimate the half side length (i.e. half of the side length) of each side of the quadrilateral they form; this quadrilateral is the target contour, which in the present embodiment is a square;
Step S3.7: estimate all parameters of the target contour, expressing the four contour corner points by their coordinates (±s, ±s, 0) in the target coordinate system, where s is the half side length of the target contour, together with the position of the camera coordinate system relative to the target coordinate system, i.e. the relative position;
Step S3.8: substitute the above expressions into the collinearity equations x_i = −f·[a1(X−Xs)+b1(Y−Ys)+c1(Z−Zs)]/[a3(X−Xs)+b3(Y−Ys)+c3(Z−Zs)] and y_i = −f·[a2(X−Xs)+b2(Y−Ys)+c2(Z−Zs)]/[a3(X−Xs)+b3(Y−Ys)+c3(Z−Zs)] to obtain the error equations, where Xs, Ys and Zs are the coordinates of the projection centre, a1, a2, a3, b1, b2, b3, c1, c2 and c3 are the nine direction cosines of the rotation matrix, x_i and y_i are the image plane coordinates of the image points, f is the principal distance (an interior orientation element), and v_x and v_y are the residuals of the image plane coordinates;
Step S3.9: analysing the above error equations shows that their unknowns are the relative position, the attitude and the half side length; the error equations can therefore be written in functional form and linearised by a first-order Taylor expansion, giving
Step S3.10: for the four contour corner points, considering both the left and right images, list the above linearised error equations (eight in total) to obtain the error equation in matrix form;
Step S3.11: using the least squares principle, solve the above equation to obtain the corrections to the unknowns;
Step S3.12: correct the target contour parameters accordingly; if the corrections are found to be very small, the optimal estimate of the target contour parameters has been obtained; otherwise return to step S3.5 for the next iteration;
From the quadrilateral structure extracted above, the two orthogonal directions it provides are used to establish a local coordinate system, so that the three attitude angles of the target can be resolved from the directions of the axes of the local coordinate system.
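Steps S3.8 to S3.12 describe a classical Gauss-Newton adjustment: linearise the error equations, solve by least squares, apply the correction, and stop once the correction is small. The generic loop below is a sketch of that scheme, assuming NumPy and a numerical Jacobian in place of the patent's analytic collinearity-equation derivatives; it is not the patent's implementation.

```python
import numpy as np

def gauss_newton(residuals, x0, tol=1e-8, max_iter=50):
    """Generic Gauss-Newton loop mirroring steps S3.9-S3.12."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residuals(x)
        # numerical Jacobian in place of the analytic Taylor expansion (S3.9)
        eps = 1e-6
        J = np.column_stack([
            (residuals(x + eps * e) - r) / eps
            for e in np.eye(x.size)
        ])
        dx, *_ = np.linalg.lstsq(J, -r, rcond=None)  # S3.11: least squares
        x = x + dx                                   # S3.12: apply correction
        if np.linalg.norm(dx) < tol:                 # correction "very small"
            break
    return x
```

For the contour problem, `residuals` would stack the eight collinearity residuals of the four corner points in the left and right images, and the parameter vector would hold the relative position, the attitude angles and the half side length.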

Claims (11)

1. A trinocular vision sensor for a micro/nano satellite, characterised in that it comprises two far-field lenses and one near-field lens, each lens being provided with its own imaging unit and the three lenses sharing one data processing unit; the two far-field lenses are located on either side of the near-field lens and are identical; the two far-field lenses serve as the lenses of binocular vision, and the near-field lens serves as the lens of monocular vision.
2. The trinocular vision sensor for a micro/nano satellite according to claim 1, characterised in that the far-field lens comprises a first lens barrel, in which a first lens element, a first spacer ring, a second lens element, a second spacer ring, a third lens element, a third spacer ring, a first diaphragm and a fourth lens element are arranged in sequence; a clamping ring is fitted at the first-lens-element end of the first lens barrel, and a lens hood is mounted at the front end of the clamping ring.
3. The trinocular vision sensor for a micro/nano satellite according to claim 2, characterised in that gas-release grooves are provided in the first lens barrel, the first spacer ring, the second spacer ring and the third spacer ring; all non-imaging surfaces of the first, second, third and fourth lens elements are painted, and all metal parts of the far-field lens are blackened.
4. The trinocular vision sensor for a micro/nano satellite according to claim 1, characterised in that the near-field lens comprises a second lens barrel, in which a fifth lens element, a fourth spacer ring, a sixth lens element, a fifth spacer ring, a seventh lens element, a second diaphragm, an eighth lens element, a sixth spacer ring and a ninth lens element are arranged in sequence; a front clamping ring is fitted at the fifth-lens-element end of the second lens barrel and a rear clamping ring at the ninth-lens-element end; a lens hood is mounted at the front end of the front clamping ring.
5. The trinocular vision sensor for a micro/nano satellite according to claim 4, characterised in that gas-release grooves are provided in the second lens barrel, the fourth spacer ring, the fifth spacer ring and the sixth spacer ring; all non-imaging surfaces of the fifth, sixth, seventh, eighth and ninth lens elements are painted, and all metal parts of the near-field lens are blackened.
6. A trinocular vision sensor non-cooperative target measuring method, characterised in that a trinocular vision sensor according to any one of claims 1 to 5 is used; at long range a small-field-of-view far-field binocular vision measurement mode is adopted; when the distance becomes so short that the target can no longer enter both binocular fields of view simultaneously, the method switches to a large-field-of-view near-field monocular vision measurement mode; in the large-field-of-view near-field monocular vision measurement mode, monocular vision obtains the relative position and attitude of the target from the target size previously obtained by binocular vision.
7. The trinocular vision sensor non-cooperative target measuring method according to claim 6, characterised in that the small-field-of-view far-field binocular vision measurement mode covers two cases: when the target is a point target in the binocular field of view, the elevation angle and azimuth angle of the target are measured by binocular vision; when the target is a non-point target in the binocular field of view, the relative position and attitude of the target are measured by binocular vision.
8. The trinocular vision sensor non-cooperative target measuring method according to claim 7, characterised in that when the target is a non-point target in the binocular field of view, measuring the relative position and attitude of the target by binocular vision specifically comprises the following steps:
Step S1: perform preprocessing, edge detection and edge post-processing on the image;
Step S2: from the ellipse extracted in the image, carry out the three-dimensional reconstruction of the corresponding circle on the target in space, estimate the geometric parameters of the space circle, and derive the relative position of the target from them;
Step S3: from the two-dimensional target contour extracted in the image, carry out the three-dimensional reconstruction of the target contour in space, and thereby estimate the relative attitude of the target.
9. The trinocular vision sensor non-cooperative target measuring method according to claim 8, characterised in that in step S1 the preprocessing comprises logarithmic transformation of the image and lens distortion correction;
edge detection is performed with the standard Canny edge detection operator;
the edge post-processing is specifically as follows: first, intersection points and points of relatively large curvature are removed, so that the edges in the image are divided into separate edge segments; next, a boundary-following algorithm is applied and the short edges found by tracking are removed; then, by fitting straight line segments and circles, line segments with large fitting errors are rejected; finally, by means such as merging of straight line segments, only the edges most likely to be straight line segments or elliptical edges are retained, and the other edges are discarded.
10. The trinocular vision sensor non-cooperative target measuring method according to claim 8, characterised in that in step S2 the three-dimensional reconstruction of the space circle specifically comprises:
coarse ellipse extraction: ellipses are fitted to the edges remaining after the processing of step S1; ellipses whose ratio of semi-major to semi-minor axis is close to a certain limiting proportion are rejected, and only ellipses for which the edge points with valid support cover more than a certain proportion of the whole ellipse perimeter are retained; among these, the ellipse with the largest mean of semi-major and semi-minor axes is taken as the ellipse required for the calculation;
iterative ellipse refinement: to improve the accuracy and stability of the position calculation, refined extraction is iterated on the basis of the coarse ellipse extraction, specifically as follows:
Step S2.1: assume the initial ellipse equation is ax² + bxy + cy² + dx + ey + f = 0, where a, b, c, d, e and f are the ellipse parameters and x and y are the coordinates in the x-axis and y-axis directions of the ellipse;
Step S2.2: discretise the ellipse equation to obtain n discrete ellipse boundary points, where n is the number of discrete points;
Step S2.3: along the line through the ellipse centre O and each boundary point, within a certain range on either side of the point, search with a fixed step size for the point of maximum local gradient, obtaining a new ring of ellipse boundary points;
Step S2.4: exclude those new boundary points whose local gradient or fitting error deviates by more than three times the average, then fit a new ellipse;
Step S2.5: repeat the above steps until the change in the ellipse centre and in the semi-major and semi-minor radii between two successive iterations falls below a certain threshold, then finish the iteration; the ellipse centre position finally obtained represents the relative position of the target.
11. The trinocular vision sensor non-cooperative target measuring method according to claim 8, characterised in that in step S3 the steps of the target contour three-dimensional reconstruction are as follows:
Step S3.1: sort the corner points of the target contour in clockwise order, taking the point in the lower-right corner as the starting point; contour corner points with the same index on the left and right images are then corresponding points, and the three-dimensional coordinates of the four contour corner points are obtained by forward intersection;
Step S3.2: calculate the centre of gravity of the above four contour corner points;
Step S3.3: calculate the matrix N from the centroid-subtracted corner coordinates;
Step S3.4: perform singular value decomposition on N; the singular vector corresponding to the minimum singular value obtained by the decomposition is the normal vector of the target contour;
Step S3.5: estimate two unit vectors perpendicular to the normal vector and as nearly parallel to the target contour as possible, their directions chosen so that they form a right-handed system with the normal vector; these two unit vectors together with the normal vector constitute the three axes of the target body coordinate system and determine the attitude of the target; the attitude of the target is represented by the corresponding rotation matrix or, equivalently, by three attitude angles;
Step S3.6: from the four contour corner points, further estimate the half side length of each side of the quadrilateral they form; this quadrilateral is the target contour;
Step S3.7: estimate all parameters of the target contour, expressing the four contour corner points in terms of the half side length of the target contour in the target coordinate system;
Step S3.8: substitute the above expressions into the collinearity equations to obtain the error equations;
Step S3.9: analyse the above error equations, identify their unknowns, and linearise them by a first-order Taylor expansion;
Step S3.10: for the four contour corner points, considering the left and right images, list the above linearised error equations to obtain the error equation in matrix form;
Step S3.11: using the least squares principle, solve the error equation in matrix form to obtain the corrections;
Step S3.12: correct the target contour parameters accordingly; if the corrections are found to be very small, the optimal estimate of the target contour parameters has been obtained; otherwise return to step S3.5 for the next iteration;
from the quadrilateral structure extracted above, the two orthogonal directions it provides are used to establish a local coordinate system, and the three attitude angles of the target are resolved from the directions of the coordinate axes of the local coordinate system.
CN201410845011.XA 2014-12-31 2014-12-31 Trinocular vision sensor and its measurement method for micro-nano satellite Active CN105806316B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410845011.XA CN105806316B (en) 2014-12-31 2014-12-31 Trinocular vision sensor and its measurement method for micro-nano satellite

Publications (2)

Publication Number Publication Date
CN105806316A true CN105806316A (en) 2016-07-27
CN105806316B CN105806316B (en) 2018-11-06

Family

ID=56420204

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410845011.XA Active CN105806316B (en) 2014-12-31 2014-12-31 Trinocular vision sensor and its measurement method for micro-nano satellite

Country Status (1)

Country Link
CN (1) CN105806316B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107065920A (en) * 2016-12-26 2017-08-18 歌尔科技有限公司 Avoidance obstacle method, device and unmanned plane
CN113658251A (en) * 2021-08-25 2021-11-16 北京市商汤科技开发有限公司 Distance measuring method, device, electronic equipment, storage medium and system
JP7170401B2 (en) 2018-02-14 2022-11-14 キヤノン電子株式会社 Light source angle measurement device, light source position detection device, and artificial satellite

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5737655A (en) * 1996-08-23 1998-04-07 Inaba; Minoru Stereo camera
CN201041488Y (en) * 2007-05-11 2008-03-26 同济大学 Rock surface three-dimensional appearance measuring instrument
CN102538793A (en) * 2011-12-23 2012-07-04 北京控制工程研究所 Double-base-line non-cooperative target binocular measurement system
CN102937811A (en) * 2012-10-22 2013-02-20 西北工业大学 Monocular vision and binocular vision switching device for small robot
CN103051887A (en) * 2013-01-23 2013-04-17 河海大学常州校区 Eagle eye-imitated intelligent visual sensing node and work method thereof
CN103278139A (en) * 2013-05-06 2013-09-04 北京航空航天大学 Variable-focus monocular and binocular vision sensing device


Also Published As

Publication number Publication date
CN105806316B (en) 2018-11-06

Similar Documents

Publication Publication Date Title
CN107729808B (en) Intelligent image acquisition system and method for unmanned aerial vehicle inspection of power transmission line
CN108428255B (en) Real-time three-dimensional reconstruction method based on unmanned aerial vehicle
CN108492333B (en) Spacecraft attitude estimation method based on satellite-rocket docking ring image information
Peng et al. A pose measurement method of a space noncooperative target based on maximum outer contour recognition
CN109115184B (en) Collaborative measurement method and system based on non-cooperative target
Yang et al. Panoramic UAV surveillance and recycling system based on structure-free camera array
Eynard et al. Real time UAV altitude, attitude and motion estimation from hybrid stereovision
CN113393571B (en) Cloud-free satellite image generation method and device
CN111045455A (en) Visual correction method for flight course angle error of indoor corridor of micro unmanned aerial vehicle
CN105806316A (en) Trinocular vision sensor for micro/nano satellite and measurement method thereof
Luo et al. Docking navigation method for UAV autonomous aerial refueling
CN112444245A (en) Insect-imitated vision integrated navigation method based on polarized light, optical flow vector and binocular vision sensor
CN110160503B (en) Unmanned aerial vehicle landscape matching positioning method considering elevation
CN116681733B (en) Near-distance real-time pose tracking method for space non-cooperative target
CN113129377A (en) Three-dimensional laser radar rapid robust SLAM method and device
Zhuang et al. Method of pose estimation for UAV landing
CN114723184B (en) Wind driven generator measuring method, device and equipment based on visual perception
CN112985398A (en) Target positioning method and system
CN112150546B (en) Monocular vision pose estimation method based on auxiliary point geometric constraint
Yingying et al. Fast-swirl space non-cooperative target spin state measurements based on a monocular camera
Zhang et al. Hybrid iteration and optimization-based three-dimensional reconstruction for space non-cooperative targets with monocular vision and sparse lidar fusion
CN113206951B (en) Real-time electronic image stabilization method based on flapping wing flight system
CN115307646A (en) Multi-sensor fusion robot positioning method, system and device
CN105243639B (en) Image adjusting method, device and the system of the photoelectric nacelle of sleeping dress
CN112577463B (en) Attitude parameter corrected spacecraft monocular vision distance measuring method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant