CN116045919A - Space cooperation target based on TOF system and relative pose measurement method thereof - Google Patents
- Publication number: CN116045919A
- Application number: CN202211722303.5A
- Authority
- CN
- China
- Prior art keywords
- target
- intersection point
- relative pose
- cooperative
- gesture
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/002—Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
Abstract
The invention discloses a space cooperative target based on the TOF system and a relative pose measurement method thereof. A TOF camera continuously images the target to obtain a depth image and a gray-level image; identification, matching, and tracking are completed from these images, and the relative position and attitude information is solved using target information pre-stored in a memory.
Description
Technical Field
The invention relates to the technical field of space relative pose measurement, and in particular to a space cooperative target based on the TOF (time-of-flight) system and a relative pose measurement method thereof.
Background
Relative pose measurement technology is widely applied in various space missions; with the growth of space rendezvous-and-docking and space manipulation tasks, the requirements on space relative pose measurement have become increasingly prominent. Space relative pose measurement uses an optical measurement system to determine the relative attitude and relative position between a target coordinate system and a measurement coordinate system; this information is then fed to the GNC system for the attitude control of the subsequent spacecraft.
The currently common spatial relative pose measurement approaches include binocular cameras, laser range finders, and TOF cameras. A laser range finder provides target distance information and features low power consumption, low cost, simple structure, and insensitivity to illumination, but it demands strong target tracking and re-capture capability. Binocular stereo vision features high resolution, low power consumption, and low cost, but still suffers from a relatively short measurement range, complex calibration, complex algorithms, poor real-time performance, and poor adaptability to illumination conditions.
Disclosure of Invention
The invention aims to provide a space cooperative target based on the TOF system and a relative pose measurement method thereof, so as to solve the problems of traditional methods in space relative pose measurement: high requirements on target tracking and re-capture capability, complex algorithms, and poor real-time performance.
To achieve the above object, in one aspect, the present invention provides a space cooperative target based on the TOF system, comprising:
a corner cube prism (corner cube retroreflector); and
seven far-field markers, the corner cube prism and the seven far-field markers being used to identify and measure the target position in different distance segments respectively,
wherein the positions and serial numbers of the seven far-field markers are recognized based on four target recognition principles.
Preferably, the seven far-field markers are respectively: a first target, a second target, a third target, a fourth target, a fifth target, a sixth target, and a seventh target.
Preferably, the four target recognition principles include:
the first target, the second target, and the third target meet a collinearity requirement;
the first target, the fifth target, and the sixth target meet a collinearity requirement;
the intersection point of the line through the fourth target and the fifth target with the line through the first target and the third target is a first intersection point C1, and the intersection point of the line through the fourth target and the sixth target with the line through the first target and the third target is a second intersection point C2; the first target, the second target, the first intersection point C1, and the third target satisfy a cross-ratio condition, and the first target, the second target, the second intersection point C2, and the third target satisfy a cross-ratio condition;
the intersection point of the line through the second target and the fourth target with the line through the first target and the sixth target is a third intersection point C3; the first target, the third intersection point C3, the fifth target, and the sixth target satisfy a cross-ratio condition.
On the other hand, the invention further provides a relative pose measurement method of a space cooperative target based on the TOF system, which performs relative pose measurement using the above cooperative target and comprises the following steps:
step S1: shooting the cooperative targets by using a TOF camera to obtain a depth image and a gray level image;
step S2: processing the gray level image, and extracting to obtain the centroid of the cooperative target;
step S3: selecting the pixel with the highest gray level in the gray-level image and extracting the depth information corresponding to that pixel to obtain its line-of-sight distance;
step S4: judging whether the line-of-sight distance lies within the attitude-solution distance range to obtain a judgment result, and searching and matching the cooperative target according to the judgment result to obtain a recognition result;
step S5: performing the attitude solution of the current frame according to the recognition result and the attitude solution result of the previous frame.
Preferably, in step S3, extracting the depth information corresponding to the pixel to obtain its line-of-sight distance specifically comprises: defining the mean of the distance values of the 4 pixels adjacent to the pixel as the line-of-sight distance, and calculating the line-of-sight angle of the target position from the pixel coordinates.
Preferably, in step S4, judging whether the line-of-sight distance lies within the attitude-solution distance range specifically comprises: if it does, performing the attitude solution; if not, returning to step S1 to process the next frame of image data.
Preferably, in the step S4, the cooperative targets are searched and matched to obtain a recognition result, which specifically includes: and sequentially identifying the cooperative targets according to the four target identification principles.
Preferably, in step S5, performing the attitude solution of the current frame according to the recognition result and the attitude solution result of the previous frame specifically comprises: if recognition succeeds, performing the attitude solution of the current frame; and if recognition succeeds and the attitude solution of the previous frame has been completed, performing target tracking according to the attitude solved in the previous frame, back-projecting the coordinate position of the target to obtain the centroid of the cooperative target, and performing the attitude solution of the current frame.
Preferably, the attitude solution of the current frame specifically comprises: iteratively calculating the rotation matrix according to the collinearity equations and the indirect adjustment principle.
Compared with the prior art, the invention has the following beneficial effects:
according to the invention, the TOF camera continuously shoots the synthetic target to obtain the depth image and the gray level image, the recognition, the matching and the tracking are completed, the relative position and the gesture information are obtained by resolving the target information prestored in the memory, the requirements on the target tracking and recapturing capability are lower, and the algorithm is simple and strong in real-time performance.
Drawings
For a clearer description of the technical solution of the present invention, the drawings needed in the description are briefly introduced below. The drawings show one embodiment of the present invention; those skilled in the art can obtain other drawings from them without inventive effort.
fig. 1 is a schematic diagram of a space cooperation target design based on the TOF system according to an embodiment of the present invention;
fig. 2 is a flow chart of a method for measuring relative pose of space cooperative targets based on TOF system according to an embodiment of the present invention.
Reference numerals: 101 - first target, 102 - second target, 103 - third target, 104 - fourth target, 105 - fifth target, 106 - sixth target, 107 - seventh target, 201 - corner cube prism, 301 - near-field marker.
Detailed Description
The space cooperative target based on the TOF system and its relative pose measurement method are described in further detail below with reference to figures 1-2. The advantages and features of the invention will become more apparent from the following description. It should be noted that the drawings are in a very simplified form and not to precise scale; they serve only to aid in conveniently and clearly describing the embodiments of the invention. Modifications, changes of proportion, or adjustments of size that do not alter the effects and objects achievable by the invention remain within the scope of its technical disclosure.
The traditional methods of space relative pose measurement place high demands on target tracking and re-capture capability, rely on complex algorithms, and offer poor real-time performance; the embodiments below address these problems.
In one aspect, this embodiment provides a space cooperative target based on the TOF system, comprising: a corner cube prism 201 and seven far-field markers, used respectively to identify and measure the target position in different distance segments; the positions and serial numbers of the seven far-field markers are recognized based on four target recognition principles. A near-field marker 301 is also provided.
The TOF camera directly obtains the depth and intensity information of the target without a scanning mechanism; it has the advantages of low power consumption, compact structure, good real-time performance, and insensitivity to illumination, and is well suited to short-range measurement of space targets.
As shown in fig. 1, in this embodiment the seven far-field markers are respectively: a first target 101, a second target 102, a third target 103, a fourth target 104, a fifth target 105, a sixth target 106, and a seventh target 107.
The four target recognition principles are: 1. the first target 101, the second target 102, and the third target 103 meet a collinearity requirement; 2. the first target 101, the fifth target 105, and the sixth target 106 meet a collinearity requirement; 3. the intersection point of the line through the fourth target 104 and the fifth target 105 with the line through the first target 101 and the third target 103 is a first intersection point C1, and the intersection point of the line through the fourth target 104 and the sixth target 106 with the line through the first target 101 and the third target 103 is a second intersection point C2; the first target 101, the second target 102, the first intersection point C1, and the third target 103 satisfy a cross-ratio condition, as do the first target 101, the second target 102, the second intersection point C2, and the third target 103; 4. the intersection point of the line through the second target 102 and the fourth target 104 with the line through the first target 101 and the sixth target 106 is a third intersection point C3; the first target 101, the third intersection point C3, the fifth target 105, and the sixth target 106 satisfy a cross-ratio condition.
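The collinearity and cross-ratio checks above can be sketched in a few lines of plain Python. The cross ratio is projectively invariant, so the ratio measured among image points should match the ratio of the known physical marker layout; the function names and the collinearity tolerance below are illustrative assumptions, not part of the patent.

```python
import math

def intersect(p1, p2, p3, p4):
    """Intersection of line p1-p2 with line p3-p4 (2D points as (x, y))."""
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3; x4, y4 = p4
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / d
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

def collinear(a, b, c, tol=1e-6):
    """True if the triangle a-b-c has (near-)zero signed area."""
    area2 = (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
    return abs(area2) < tol

def cross_ratio(a, b, c, d):
    """Cross ratio (AC/BC)/(AD/BD) of four collinear points."""
    def dist(p, q):
        return math.hypot(q[0] - p[0], q[1] - p[1])
    return (dist(a, c) / dist(b, c)) / (dist(a, d) / dist(b, d))
```

For example, for evenly spaced points (0,0), (1,0), (2,0), (3,0), `cross_ratio` returns 4/3; criterion 3 would compare this value, computed on the image points and the intersection point C1, against the value known from the marker layout.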
On the other hand, the embodiment also provides a relative pose measurement method of a space cooperative target based on TOF system, which adopts the cooperative target to carry out relative pose measurement, and comprises the following steps:
step S1: and shooting the cooperative target by adopting a TOF camera to obtain a depth image and a gray level image.
Step S2: and processing the gray level image, and extracting to obtain the centroid of the cooperative target.
Step S3: and selecting a pixel point with the highest gray level in the gray level image, and extracting depth information corresponding to the pixel point to obtain the sight distance of the pixel point.
Extracting the depth information corresponding to the pixel to obtain its line-of-sight distance specifically comprises: defining the mean of the distance values of the 4 pixels adjacent to the pixel as the line-of-sight distance, and calculating the line-of-sight angle of the target position from the pixel coordinates.
Step S4: whether the line-of-sight distance lies within the attitude-solution distance range is judged to obtain a judgment result, and the cooperative target is searched and matched according to the judgment result to obtain a recognition result.
Judging whether the line-of-sight distance lies within the attitude-solution distance range (15 m in this embodiment): if the line-of-sight distance is within the range (i.e. at most 15 m), the attitude solution is performed; if not (i.e. the line-of-sight distance exceeds 15 m), the process returns to step S1 to process the next frame of image data.
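The range gate of step S4 reduces to a one-line predicate. The 15 m threshold is the value given for this embodiment; the constant and function names are illustrative assumptions:

```python
POSE_RANGE_MAX_M = 15.0  # attitude-solution range used in this embodiment

def should_solve_attitude(los_distance_m, max_range_m=POSE_RANGE_MAX_M):
    """Range gate of step S4: solve the attitude only when the
    line-of-sight distance is at most the configured range;
    otherwise skip back to step S1 for the next frame."""
    return los_distance_m <= max_range_m
```

A frame loop would call this with the distance from step S3 and either proceed to matching or fetch the next frame.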
Searching and matching the cooperative targets to obtain identification results, wherein the identification results specifically comprise: and sequentially identifying the cooperative targets according to the four target identification principles.
Step S5: the attitude solution of the current frame is performed according to the recognition result and the attitude solution result of the previous frame.
Referring to fig. 2, after the attitude solution of the current frame is performed, the system returns to step S1 to process the next frame.
Performing the attitude solution of the current frame according to the recognition result and the attitude solution result of the previous frame comprises the following: if recognition succeeds, the attitude solution of the current frame is performed; if recognition succeeds and the attitude solution of the previous frame has been completed, target tracking is performed according to the attitude solved in the previous frame, the coordinate position of the target is back-projected to obtain the centroid of the cooperative target, and the attitude solution of the current frame is performed.
The attitude solution of the current frame specifically comprises: iteratively calculating the rotation matrix according to the collinearity equations and the indirect adjustment principle.
A specific embodiment of the attitude solution is as follows: the corner cube prism images on the detector as a circular spot; the spot centroid is calculated with a weighted gray centroid algorithm, and the elevation and azimuth angles of the spot centroid are then extracted.
Let the spot pixels be (x_1, y_1), (x_2, y_2), …, (x_n, y_n); the gray-scale centroid coordinates (x_0, y_0) are then given by:

x_0 = (Σ w_i·x_i) / (Σ w_i),  y_0 = (Σ w_i·y_i) / (Σ w_i)  (1)

where w_i is the gray value at coordinates (x_i, y_i).
The elevation angle β is:

β = (y_0 − Y_0) × θ  (2)

The azimuth angle α is:

α = (x_0 − X_0) × θ  (3)

where (X_0, Y_0) are the detector principal-point coordinates and θ is the detector angular resolution.
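A minimal sketch of equations (1)-(3), assuming the gray values are supplied as (x, y, w) triples and θ is expressed in radians per pixel; the function names are illustrative assumptions:

```python
def gray_centroid(pixels):
    """Weighted gray centroid (eq. 1): pixels is a list of (x, y, w)
    triples with w the gray value at pixel (x, y)."""
    wsum = sum(w for _, _, w in pixels)
    x0 = sum(x * w for x, _, w in pixels) / wsum
    y0 = sum(y * w for _, y, w in pixels) / wsum
    return x0, y0

def los_angles(x0, y0, principal_point, theta):
    """Elevation beta and azimuth alpha (eqs. 2-3); theta is the
    detector angular resolution, principal_point = (X0, Y0)."""
    X0, Y0 = principal_point
    beta = (y0 - Y0) * theta
    alpha = (x0 - X0) * theta
    return beta, alpha
```

A spot centered at the principal point yields zero angles; off-center spots scale linearly with the angular resolution.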
Let the gray extremum coordinate of the corner cube prism target be (x_m, y_m); the line-of-sight distance L is taken as the mean of the distance values of its 4 adjacent pixels:

L = [d(x_m−1, y_m) + d(x_m+1, y_m) + d(x_m, y_m−1) + d(x_m, y_m+1)] / 4  (4)

where d(·,·) denotes the measured distance value at a pixel.
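Equation (4) can be sketched as follows. It assumes, as the text states, that only the four neighbours are averaged (the extremum pixel itself is excluded), and that the depth image is indexed row-first; both conventions are assumptions of this sketch:

```python
def los_distance(depth, xm, ym):
    """Line-of-sight distance at the gray extremum (xm, ym) as the mean
    of the 4-neighbour depth values (eq. 4); depth is a 2D list
    indexed as depth[y][x]."""
    neighbours = [depth[ym][xm - 1], depth[ym][xm + 1],
                  depth[ym - 1][xm], depth[ym + 1][xm]]
    return sum(neighbours) / 4.0
```

Averaging the neighbours avoids relying on the extremum pixel itself, which may be saturated by the retroreflector return.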
In the distance range of 1.5 m to 30 m, attitude and position measurement is performed using the far-field targets.
Referring to fig. 1, the captured gray-level image is identified and solved; the identification criteria are:
the four target recognition principles already given for the cooperative target: collinearity of targets 101, 102, 103 and of targets 101, 105, 106, and the cross-ratio conditions on the intersection points C1, C2, and C3.
The extracted target points are cyclically matched against these criteria; if the conditions are satisfied, recognition succeeds, otherwise it fails.
The target tracking function module has three parts: the first part back-projects the positions of the large targets (i.e. far-field markers) according to the pose of the previous frame; the second part back-projects all targets and, if more than 5 large targets are in the field of view, calculates the pose using the large targets; the third part back-projects all targets. The target points are back-projected according to the final iteration result, the back-projection error is calculated, and whether the current pose is valid is judged from this error.
After target recognition or tracking is completed, the relative attitude and position can be solved; the rotation matrix is calculated iteratively using the collinearity equations and the indirect adjustment principle.
The collinearity condition equations of photogrammetry are:

x − x_0 = −f · [a_1(X − X_S) + b_1(Y − Y_S) + c_1(Z − Z_S)] / [a_3(X − X_S) + b_3(Y − Y_S) + c_3(Z − Z_S)]
y − y_0 = −f · [a_2(X − X_S) + b_2(Y − Y_S) + c_2(Z − Z_S)] / [a_3(X − X_S) + b_3(Y − Y_S) + c_3(Z − Z_S)]  (6)

where (x, y) are the image-point coordinates; (x_0, y_0, f) are the interior orientation elements of the camera; (X, Y, Z) are the object-space coordinates of the object point corresponding to the image point; (X_S, Y_S, Z_S) are the exterior orientation line elements of the camera; and a_1, a_2, a_3, b_1, b_2, b_3, c_1, c_2, c_3 form the rotation matrix M:

M = [a_1 a_2 a_3; b_1 b_2 b_3; c_1 c_2 c_3]
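A sketch of the collinearity equations (6) as a forward projection, using the row layout of M given above; this is an illustrative helper under those conventions, not the patent's implementation:

```python
def project(point, camera, M, f, x0=0.0, y0=0.0):
    """Collinearity equations (eq. 6): project object point (X, Y, Z)
    given the exterior position camera = (XS, YS, ZS), rotation matrix
    M = [[a1, a2, a3], [b1, b2, b3], [c1, c2, c3]], and focal length f."""
    X, Y, Z = point
    XS, YS, ZS = camera
    dX, dY, dZ = X - XS, Y - YS, Z - ZS
    (a1, a2, a3), (b1, b2, b3), (c1, c2, c3) = M
    denom = a3 * dX + b3 * dY + c3 * dZ
    x = x0 - f * (a1 * dX + b1 * dY + c1 * dZ) / denom
    y = y0 - f * (a2 * dX + b2 * dY + c2 * dZ) / denom
    return x, y
```

The same helper serves for back-projection in the tracking module: predicted image positions of the markers are computed from the previous frame's pose and compared against the extracted centroids.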
Instead of describing the matrix M through the angular exterior orientation elements, the algorithm takes X_S, Y_S, Z_S, a_1, a_2, a_3, b_1, b_2, b_3, c_1, c_2, c_3 directly as the unknowns and the image-point coordinates (x, y) as the observations. With the interior orientation elements of the camera and the object-space coordinates (X, Y, Z) known, equation (6) is linearized to obtain the error equation (7).
Writing the above error equation (7) in matrix form:

V = C·δ_X + l  (8)

where

V = [v_x  v_y]^T  (9)

C is the coefficient matrix of the linearized observation equations (formula (10)), the correction vector is

δ_X = [dX_S  dY_S  dZ_S  da_1  db_1  …  dc_3]^T  (11)

and

l = [l_x  l_y]^T  (12)
Since the matrix M is orthogonal, the above unknowns δ_X also implicitly satisfy the following 6 constraints, given by formula (13):

a_1² + b_1² + c_1² = 1,  a_2² + b_2² + c_2² = 1,  a_3² + b_3² + c_3² = 1
a_1·a_2 + b_1·b_2 + c_1·c_2 = 0,  a_1·a_3 + b_1·b_3 + c_1·c_3 = 0,  a_2·a_3 + b_2·b_3 + c_2·c_3 = 0  (13)

Linearizing these yields:

B_X·δ_X + W = 0  (14)

where B_X and W are the coefficient matrix and constant vector of the linearized constraints (formula (15)).
For 8 control points, two error equations can be written per point as above, and the error equation of the overall adjustment takes the same form, where V and l are 16×1 matrices and C is a 16×12 matrix. With P the weight matrix of the observations, solving with the indirect adjustment model gives:

Y = −N_Y⁻¹·W_Y  (16)

Given the coordinate position and an initial value of the rotation matrix, the solution is obtained from formula (16) and iterated step by step until the correction δ_X is smaller than the tolerance.
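The patent iterates over all 12 unknowns (X_S, Y_S, Z_S and the nine elements of M) under the orthogonality constraints (13)-(14). As a simplified illustration of the Gauss-Newton/indirect-adjustment idea only, the sketch below estimates just the position with the rotation held fixed, forming derivatives numerically; the `project` helper is repeated so the sketch is self-contained, and all names and the numeric-derivative step are assumptions:

```python
def project(point, camera, M, f, x0=0.0, y0=0.0):
    """Forward collinearity projection (eq. 6)."""
    X, Y, Z = point
    XS, YS, ZS = camera
    dX, dY, dZ = X - XS, Y - YS, Z - ZS
    (a1, a2, a3), (b1, b2, b3), (c1, c2, c3) = M
    denom = a3 * dX + b3 * dY + c3 * dZ
    x = x0 - f * (a1 * dX + b1 * dY + c1 * dZ) / denom
    y = y0 - f * (a2 * dX + b2 * dY + c2 * dZ) / denom
    return x, y

def det3(m):
    """Determinant of a 3x3 matrix."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def solve3(N, b):
    """Solve the 3x3 system N d = b by Cramer's rule."""
    D = det3(N)
    d = []
    for i in range(3):
        Ni = [row[:] for row in N]
        for r in range(3):
            Ni[r][i] = b[r]
        d.append(det3(Ni) / D)
    return d

def solve_position(obs, points, M, f, init, iters=20, h=1e-6):
    """Gauss-Newton on the collinearity equations for (XS, YS, ZS) only,
    with the rotation M held fixed; derivatives formed numerically."""
    XS, YS, ZS = init
    for _ in range(iters):
        A, l = [], []  # design-matrix rows and residual vector
        for (xo, yo), P in zip(obs, points):
            xb, yb = project(P, (XS, YS, ZS), M, f)
            rows = [[0.0] * 3, [0.0] * 3]
            for j, dp in enumerate([(h, 0, 0), (0, h, 0), (0, 0, h)]):
                xp, yp = project(P, (XS + dp[0], YS + dp[1], ZS + dp[2]), M, f)
                rows[0][j] = (xp - xb) / h
                rows[1][j] = (yp - yb) / h
            A.extend(rows)
            l.extend([xo - xb, yo - yb])
        # normal equations N d = b of the indirect adjustment (unit weights)
        N = [[sum(A[k][i] * A[k][j] for k in range(len(A))) for j in range(3)]
             for i in range(3)]
        b = [sum(A[k][i] * l[k] for k in range(len(A))) for i in range(3)]
        d = solve3(N, b)
        XS, YS, ZS = XS + d[0], YS + d[1], ZS + d[2]
        if max(abs(v) for v in d) < 1e-10:  # correction below tolerance
            break
    return XS, YS, ZS
```

With a handful of synthetic control points at Z = 0 and an identity rotation, the iteration recovers an assumed true camera position from a perturbed initial guess; the full 12-parameter adjustment additionally carries the constraint equations (14) into the normal-equation system.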
In summary, in this embodiment the TOF camera continuously images the cooperative target to obtain the depth image and the gray-level image, completing recognition, matching, and tracking; the relative position and attitude information is then solved from target information pre-stored in a memory. The method places low demands on target tracking and re-capture capability and uses a simple algorithm with strong real-time performance, thereby solving the prior-art problems of high tracking and re-capture demands, complex algorithms, and poor real-time performance in space relative pose measurement.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
It should be noted that the apparatus and methods disclosed in the embodiments herein may be implemented in other ways. The apparatus embodiments described above are merely illustrative, for example, flow diagrams and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments herein. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
While the present invention has been described in detail through the foregoing description of the preferred embodiment, it should be understood that the foregoing description is not to be considered as limiting the invention. Many modifications and substitutions of the present invention will become apparent to those of ordinary skill in the art upon reading the foregoing. Accordingly, the scope of the invention should be limited only by the attached claims.
Claims (9)
1. A space cooperative target based on TOF system, comprising:
a corner cube prism; and
seven far-field markers, the corner cube prism and the seven far-field markers being used to identify and measure the target position in different distance segments respectively,
wherein the positions and serial numbers of the seven far-field markers are recognized based on four target recognition principles.
2. The space cooperative target based on TOF system according to claim 1, wherein the seven far-field markers are respectively: a first target, a second target, a third target, a fourth target, a fifth target, a sixth target, and a seventh target.
3. The space cooperative target based on TOF system according to claim 2, wherein the four target recognition principles include:
the first target, the second target, and the third target meet a collinearity requirement;
the first target, the fifth target, and the sixth target meet a collinearity requirement;
the intersection point of the line through the fourth target and the fifth target with the line through the first target and the third target is a first intersection point C1, and the intersection point of the line through the fourth target and the sixth target with the line through the first target and the third target is a second intersection point C2; the first target, the second target, the first intersection point C1, and the third target satisfy a cross-ratio condition, and the first target, the second target, the second intersection point C2, and the third target satisfy a cross-ratio condition;
the intersection point of the line through the second target and the fourth target with the line through the first target and the sixth target is a third intersection point C3; the first target, the third intersection point C3, the fifth target, and the sixth target satisfy a cross-ratio condition.
4. A relative pose measurement method of a space cooperative target based on TOF system, characterized in that the relative pose measurement is performed using the cooperative target according to any one of claims 1-3, the method comprising:
step S1: shooting the cooperative targets by using a TOF camera to obtain a depth image and a gray level image;
step S2: processing the gray level image, and extracting to obtain the centroid of the cooperative target;
step S3: selecting the pixel with the highest gray level in the gray-level image and extracting the depth information corresponding to that pixel to obtain its line-of-sight distance;
step S4: judging whether the line-of-sight distance lies within the attitude-solution distance range to obtain a judgment result, and searching and matching the cooperative target according to the judgment result to obtain a recognition result;
step S5: performing the attitude solution of the current frame according to the recognition result and the attitude solution result of the previous frame.
5. The method for measuring relative pose of space cooperative targets based on TOF system according to claim 4, wherein in step S3, extracting the depth information corresponding to the pixel to obtain its line-of-sight distance specifically comprises: defining the mean of the distance values of the 4 pixels adjacent to the pixel as the line-of-sight distance, and calculating the line-of-sight angle of the target position from the pixel coordinates.
6. The method for measuring relative pose of space cooperative targets based on TOF system according to claim 5, wherein in step S4, judging whether the line-of-sight distance lies within the attitude-solution distance range specifically comprises: if it does, performing the attitude solution; if not, returning to step S1 to process the next frame of image data.
7. The relative pose measurement method for a TOF-based space cooperative target according to claim 6, wherein in step S4 the cooperative targets are searched for and matched to obtain a recognition result, specifically: the cooperative targets are identified in sequence according to the four target-identification principles.
8. The relative pose measurement method for a TOF-based space cooperative target according to claim 7, wherein in step S5 the pose calculation of the current frame is performed according to the recognition result and the pose-calculation result of the previous frame, specifically: if the recognition result indicates successful identification, the pose calculation of the current frame is performed; if the recognition result indicates successful identification and the pose calculation of the previous frame has been completed, target tracking is performed according to the pose calculated in the previous frame, the coordinate positions of the targets are back-projected to obtain the centroids of the cooperative targets, and the pose calculation of the current frame is then performed.
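The tracking step in this claim amounts to projecting the known 3-D target coordinates with the previous frame's pose to predict where each target should appear in the current image. A minimal sketch, in which the intrinsics (f, cx, cy) and the target coordinates P are assumed inputs not stated in the claim:

```python
import numpy as np

def predict_pixels(P, R_prev, t_prev, f, cx, cy):
    """Claim sketch: forward-project the 3-D target points with the pose
    (R_prev, t_prev) solved in the previous frame; the search windows for
    centroid extraction in the current frame are centred on these
    predicted pixel positions."""
    pix = []
    for X in P:
        x, y, z = R_prev @ np.asarray(X) + t_prev
        pix.append((cx + f * x / z, cy + f * y / z))
    return pix
```

Seeding the search this way keeps the matching step local, so identification only needs to run from scratch when tracking is lost.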
9. The relative pose measurement method for a TOF-based space cooperative target according to claim 8, wherein the pose calculation of the current frame specifically comprises: iteratively calculating the rotation matrix according to the collinearity equations and the principle of indirect adjustment.
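The iteration named in this claim can be sketched as a Gauss-Newton loop on the collinearity (central-projection) equations, with the linearized normal equations playing the role of the indirect adjustment. The known target coordinates P, the translation t, and the focal length f are all assumed inputs; the patent names only the collinearity equations and indirect adjustment, not this exact formulation:

```python
import numpy as np

def rodrigues(w):
    """Rotation matrix from an axis-angle vector (Rodrigues formula)."""
    th = np.linalg.norm(w)
    if th < 1e-12:
        return np.eye(3)
    k = w / th
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(th) * K + (1 - np.cos(th)) * (K @ K)

def solve_rotation(P, uv, t, f, iters=30):
    """Claim sketch: iterate small rotation updates so that the projected
    3-D points P match the measured image points uv."""
    R = np.eye(3)

    def residuals(R):
        res = []
        for X, (u, v) in zip(P, uv):
            x, y, z = R @ np.asarray(X) + t
            res += [u - f * x / z, v - f * y / z]
        return np.array(res)

    for _ in range(iters):
        r0 = residuals(R)
        # Numerical Jacobian of the residuals w.r.t. a small left-applied
        # rotation (forward differences).
        J = np.zeros((len(r0), 3))
        eps = 1e-6
        for j in range(3):
            dw = np.zeros(3)
            dw[j] = eps
            J[:, j] = (residuals(rodrigues(dw) @ R) - r0) / eps
        # Normal equations of the adjustment: solve J dw ~ -r0.
        dw, *_ = np.linalg.lstsq(J, -r0, rcond=None)
        R = rodrigues(dw) @ R
        if np.linalg.norm(dw) < 1e-10:
            break
    return R
```

With exact measurements the loop converges in a handful of iterations; in practice the same loop would be run on noisy centroids, where the least-squares solve averages the measurement errors.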
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211722303.5A CN116045919A (en) | 2022-12-30 | 2022-12-30 | Space cooperation target based on TOF system and relative pose measurement method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116045919A true CN116045919A (en) | 2023-05-02 |
Family
ID=86128955
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211722303.5A Pending CN116045919A (en) | 2022-12-30 | 2022-12-30 | Space cooperation target based on TOF system and relative pose measurement method thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116045919A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010069160A1 (en) * | 2008-12-19 | 2010-06-24 | 中国科学院沈阳自动化研究所 | Apparatus for measuring six-dimension attitude of an object |
CN113048938A (en) * | 2021-03-04 | 2021-06-29 | 湖北工业大学 | Cooperative target design and attitude angle measurement system and method |
CN114659523A (en) * | 2022-03-04 | 2022-06-24 | 中国科学院微电子研究所 | Large-range high-precision attitude measurement method and device |
WO2022143796A1 (en) * | 2020-12-29 | 2022-07-07 | 杭州海康机器人技术有限公司 | Calibration method and calibration device for line structured light measurement system, and system |
CN115471562A (en) * | 2021-06-11 | 2022-12-13 | 深圳市汇顶科技股份有限公司 | TOF module calibration method and device and electronic equipment |
- 2022-12-30: CN application CN202211722303.5A filed (publication CN116045919A, status: Pending)
Non-Patent Citations (1)
Title |
---|
ZHAO Shulei (赵树磊): "Design of a Target Recognition and Pose Measurement System Based on a TOF Camera" (基于TOF相机的靶标识别与位姿测量系统设计), Computer Technology and Application (计算机技术与应用), vol. 45, no. 1, 23 January 2019 (2019-01-23), pages 81-84 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021233029A1 (en) | Simultaneous localization and mapping method, device, system and storage medium | |
CN111830953B (en) | Vehicle self-positioning method, device and system | |
CN111220154A (en) | Vehicle positioning method, device, equipment and medium | |
CN109520500B (en) | Accurate positioning and street view library acquisition method based on terminal shooting image matching | |
CN113989450B (en) | Image processing method, device, electronic equipment and medium | |
CN111882612A (en) | Vehicle multi-scale positioning method based on three-dimensional laser detection lane line | |
CN104200086A (en) | Wide-baseline visible light camera pose estimation method | |
CN107917880B (en) | cloud base height inversion method based on foundation cloud picture | |
CN112862881B (en) | Road map construction and fusion method based on crowd-sourced multi-vehicle camera data | |
CN101826157A (en) | Ground static target real-time identifying and tracking method | |
CN107609547B (en) | Method and device for quickly identifying stars and telescope | |
CN111260539A (en) | Fisheye pattern target identification method and system | |
CN111998862A (en) | Dense binocular SLAM method based on BNN | |
Chellappa et al. | On the positioning of multisensor imagery for exploitation and target recognition | |
CN112649803B (en) | Camera and radar target matching method based on cross-correlation coefficient | |
CN116091804B (en) | Star suppression method based on adjacent frame configuration matching | |
Chenchen et al. | A camera calibration method for obstacle distance measurement based on monocular vision | |
CN116045919A (en) | Space cooperation target based on TOF system and relative pose measurement method thereof | |
CN115375774A (en) | Method, apparatus, device and storage medium for determining external parameters of a camera | |
CN113112551B (en) | Camera parameter determining method and device, road side equipment and cloud control platform | |
RU2406071C1 (en) | Method of mobile object navigation | |
CN115471555A (en) | Unmanned aerial vehicle infrared inspection pose determination method based on image feature point matching | |
CN114078144A (en) | Point cloud matching and deformity correction method between two detection devices | |
CN110992413A (en) | High-precision rapid registration method for airborne remote sensing image | |
CN112162252B (en) | Data calibration method for millimeter wave radar and visible light sensor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||