CN113674353A - Method for measuring accurate pose of space non-cooperative target - Google Patents
- Publication number: CN113674353A (application CN202110948038.1A)
- Authority
- CN
- China
- Legal status: Granted
Classifications
- G06T7/73 — Determining position or orientation of objects or cameras using feature-based methods
- G06T3/06 — Topological mapping of higher dimensional structures onto lower dimensional surfaces
- G06T7/60 — Analysis of geometric attributes
- G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T2207/10028 — Range image; depth image; 3D point clouds
Abstract
The invention relates to a method for measuring the accurate pose of a space non-cooperative target, which uses a TOF camera and a color camera to estimate the accurate relative pose of the target. The method comprises the following steps: acquiring three-dimensional point clouds of the space non-cooperative target with the TOF (time of flight) camera and splicing them with the ICP (Iterative Closest Point) algorithm to obtain a complete three-dimensional point cloud of the target; extracting three-dimensional feature points and three-dimensional straight lines from the complete point cloud; acquiring a sequence of two-dimensional images of the target with the color camera, extracting two-dimensional feature points and two-dimensional straight lines from them, and solving the relative pose of the target from the 2D-3D correspondences of feature points and straight lines. The invention combines the imaging advantages of a TOF camera and a color camera, can accurately solve the pose of a space non-cooperative target, and can be applied to space tasks such as deep space exploration and situational awareness.
Description
Technical Field
The invention relates to the field of image measurement, in particular to a method for measuring the accurate pose of a space non-cooperative target.
Background
With the progress of science and technology and the development of the aerospace industry, deep space exploration and situational awareness have become important links in humanity's exploration of space. In space exploration, more and more task objects are space non-cooperative targets: they lack cooperative markers and cannot provide effective prior information. Accurate pose estimation of a completely unknown space target in a complex space environment has therefore attracted wide attention from scholars, and has important research value and engineering significance.
Common spatial target pose measurement devices include color cameras, binocular cameras, lidar, and the like. A color camera cannot directly acquire target depth information; the measurement accuracy of a binocular camera is limited by its baseline; laser three-dimensional imaging is constrained by the manufacturing process and its resolution is low. Multi-sensor fusion can therefore effectively combine the imaging advantages of each sensor and compensate for the inherent defects of any single one. The invention provides a precise pose measurement scheme for space non-cooperative targets based on a TOF (time of flight) camera and a color camera, which makes full use of the imaging advantages of both and realizes accurate pose measurement of the target.
Disclosure of Invention
Aiming at the problems, the invention provides a method for measuring the accurate pose of a space non-cooperative target.
The technical scheme adopted by the invention for solving the technical problems is as follows: a method for measuring the accurate pose of a spatial non-cooperative target comprises the following steps:
step 1, a TOF camera is used for obtaining complete three-dimensional point cloud of a space non-cooperative target;
step 2, extracting three-dimensional characteristic points and straight lines from the acquired complete three-dimensional point cloud of the space non-cooperative target;
step 3, carrying out image data acquisition on the space non-cooperative target in motion by using a color camera to obtain a sequence two-dimensional image of the space non-cooperative target;
step 4, extracting two-dimensional characteristic points and two-dimensional straight lines of the space non-cooperative target from the obtained sequence two-dimensional image;
and 5, projecting the three-dimensional characteristic points and the three-dimensional straight lines onto a two-dimensional plane, and solving the pose parameters of the spatial non-cooperative target according to the corresponding relation of the characteristic points and the straight lines.
Preferably, the step 5 specifically comprises:
step 5.1, calibrating the color camera to obtain its equivalent focal lengths f_x, f_y;
step 5.2, assuming the initial value of the pose of the space non-cooperative target is known, projecting each three-dimensional straight line onto the two-dimensional plane according to the pose initial value and matching it with the extracted two-dimensional straight lines of the target; the relationship between a point P on the three-dimensional straight line and its projection p = (p_x, p_y) can be described by the pinhole camera model:
p_x = f_x·X_c/Z_c, p_y = f_y·Y_c/Z_c, with (X_c, Y_c, Z_c)^T = R·P + t (8)
where the rotation matrix R and the translation vector t describe the rigid transformation from the world coordinate system to the camera coordinate system, and the straight line in the image can be expressed in polar coordinates as:
p_x·cosθ + p_y·sinθ − ρ_d = 0 (9)
and 5.3, calculating the distance from the three-dimensional straight line end point to the two-dimensional straight line, and combining equations (8) and (9) to obtain:
N = (f_x·cosθ, f_y·sinθ, −ρ_d)^T (11)
wherein N is the normal vector of the projection plane; the distance d from the projection of a three-dimensional straight-line endpoint onto the plane to the corresponding two-dimensional straight line is expressed as:
step 5.4, the corresponding distance of a 2D-3D straight-line pair is expressed as d_l,
wherein d_l1, d_l2 respectively denote the distances from the two projected endpoints p_1, p_2 of the three-dimensional straight line to the two-dimensional straight line;
step 5.5, projecting the three-dimensional feature points onto the two-dimensional plane according to the pose initial value; denoting a three-dimensional feature point as Q and its projection onto the plane as q = [q_x, q_y], the projection relationship between the two can be expressed as:
step 5.6, matching each three-dimensional feature point projected onto the two-dimensional plane with its corresponding two-dimensional feature point q′ = [q′_x, q′_y] in the image; the distance between the projected point and the two-dimensional feature point is denoted d_q:
step 5.7, projecting all feature points and straight-line features of the space non-cooperative target onto the two-dimensional plane; let the distance of the i-th 2D-3D corresponding point pair be d_q^i, with N corresponding point pairs in total, and let the distance of the j-th 2D-3D corresponding straight-line pair be d_l^j, with M corresponding straight-line pairs in total; the target pose is then solved by minimizing equation (16):
Equation (16) is solved by the least-squares method to obtain the pose of the space non-cooperative target: the rotation matrix R and the translation vector t.
Compared with the prior art, the invention has the following beneficial effects:
1. The method combines a TOF camera and a color camera: the color camera acquires clear sequence two-dimensional images, while the TOF camera directly acquires target depth information, realizing accurate pose estimation of a space non-cooperative target. Specifically, the TOF camera acquires target point clouds for three-dimensional reconstruction, yielding the three-dimensional structure of the target; the color camera acquires high-resolution sequence two-dimensional images of the target in motion, from which two-dimensional feature points and two-dimensional straight lines are extracted and combined with the three-dimensional feature points and straight lines to jointly solve the motion pose of the target;
2. The method targets completely unknown space non-cooperative targets and realizes accurate pose estimation for them; it can be applied to space tasks such as deep space exploration and situational awareness, and can provide effective information for subsequent space tasks such as capture, attack, and defense.
3. Compared with the paper "Relative position estimation of uncooperative spatial use 2D-3D line correlation", the invention exploits not only the straight-line features but also the point features of the space non-cooperative target. Making full use of both point and line features of the target's structure and edges has the advantage that, when straight lines are hard to extract stably under occlusion or background interference and line-based pose solving becomes difficult, the pose can still be solved from the corresponding key feature points. The invention also optimizes the objective function, giving different weights to key points and straight lines, making the pose measurement more accurate.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Detailed Description
The invention will now be described in detail with reference to fig. 1, wherein exemplary embodiments and descriptions of the invention are provided to explain the invention, but not to limit the invention.
A method for measuring the accurate pose of a spatial non-cooperative target comprises the following steps:
step 1, acquiring a complete three-dimensional point cloud of a spatial non-cooperative target by using a TOF camera, and specifically comprising the following steps:
step 1.1, calibrating the TOF camera with Zhang's calibration method ("A flexible new technique for camera calibration", published in IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000), using a chessboard calibration board to obtain the intrinsic parameters K of the TOF camera;
step 1.2, shooting around each angle of a space non-cooperative target by using a TOF camera to obtain a sequence depth map;
step 1.3, mapping the depth map to a space according to the internal parameters of the camera to obtain a local point cloud of a space non-cooperative target;
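For illustration only, the depth-to-point-cloud mapping of step 1.3 can be sketched as standard pinhole back-projection; the intrinsic-parameter names f_x, f_y, c_x, c_y (entries of K) and the function name are assumptions, not part of the claimed method:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    # Back-project each pixel (u, v) with depth Z into the camera frame:
    # X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop pixels with no valid depth measurement
```

Applying this to every depth frame of the sequence yields the local point clouds that are denoised in step 1.4.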
step 1.4, denoising the spatial non-cooperative target point cloud, and removing noise points to obtain a fine local point cloud of the spatial non-cooperative target;
step 1.5, providing an initial value for local point cloud registration by using a Fast Point Feature Histogram (FPFH) algorithm;
step 1.6, after the initial value is obtained, point clouds of two adjacent frames of the space non-cooperative target are matched by utilizing an ICP algorithm, and it is assumed that adjacent point sets to be matched are respectively represented as A and B:
A={a1,...,ai,...,an},B={b1,...,bj,...,bm}
wherein a_i, b_j denote three-dimensional points in point sets A and B, and n and m denote the sizes of A and B respectively; the ICP algorithm achieves point-cloud matching by minimizing the distance between the two point sets, with the rigid transformation from point set A to point set B described by a rotation matrix R_t and a translation vector T_t; the specific steps are as follows:
step 1.6.1, finding correspondences between the three-dimensional points of adjacent point sets A and B: the ICP method takes the two three-dimensional points with the smallest Euclidean distance as corresponding points, denoted a_i, b_i;
step 1.6.2, solving the rotation matrix R_t and the translation vector T_t by minimizing the distance between the corresponding point pairs:
Step 1.6.3, calculating the centroid coordinates of the point sets A and B:
step 1.6.4, removing the centroid from each point in point sets A and B:
a_i′ = a_i − μ_a, b_i′ = b_i − μ_b (3)
step 1.6.5, solving the rotation matrix R_t by the SVD method:
W = UΣV^T (5)
R_t = UV^T (6)
step 1.6.6, solving the translation vector T_t:
T_t = μ_b − R_t·μ_a (7)
Steps 1.6.1-1.6.6 are repeated until the distance meets the threshold requirement, which yields the optimal rotation matrix R_t and translation vector T_t and splices the two frames of point clouds;
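The ICP iteration of steps 1.6.1-1.6.6 can be sketched as follows. Equation (4) is not reproduced in the text; a cross-covariance W = Σ_i b_i′·a_i′^T is assumed here because it makes equations (5)-(7) mutually consistent (W = UΣV^T, R_t = UV^T, T_t = μ_b − R_t·μ_a). A brute-force nearest-neighbour search stands in for the correspondence step, and a fixed iteration count replaces the distance-threshold test:

```python
import numpy as np

def kabsch(A, B):
    # Steps 1.6.3-1.6.6: optimal rigid transform (R_t, T_t) from A to B
    # via centroid removal and SVD; W = sum_i b_i' a_i'^T is assumed (eq (4)
    # is not shown in the text), so that R_t = U V^T as in eq (6).
    mu_a, mu_b = A.mean(axis=0), B.mean(axis=0)
    W = (B - mu_b).T @ (A - mu_a)              # assumed cross-covariance
    U, _, Vt = np.linalg.svd(W)
    # guard against a reflection so that det(R_t) = +1
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ D @ Vt
    return R, mu_b - R @ mu_a                  # eq (7): T_t = mu_b - R_t mu_a

def icp(A, B, iters=30):
    # Steps 1.6.1-1.6.6 iterated: nearest neighbours give correspondences,
    # then kabsch() refines (R_t, T_t).
    R, T = np.eye(3), np.zeros(3)
    for _ in range(iters):
        moved = A @ R.T + T
        # step 1.6.1: for each moved point of A, the closest point of B
        d2 = ((moved[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        R, T = kabsch(A, B[d2.argmin(axis=1)])
    return R, T
```

In practice the FPFH result of step 1.5 would seed the first correspondence search; the brute-force O(nm) neighbour search is only for clarity.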
step 1.7, repeating the step 1.6, and splicing all the point clouds to obtain a complete three-dimensional point cloud of a space non-cooperative target;
step 2, extracting three-dimensional feature points and three-dimensional straight lines from the acquired complete three-dimensional point cloud of the space non-cooperative target;
step 3, carrying out image data acquisition on the space non-cooperative target in motion by using a color camera to obtain a sequence two-dimensional image of the space non-cooperative target;
step 4, extracting two-dimensional feature points and two-dimensional straight lines of the space non-cooperative target from the obtained sequence two-dimensional image by using an EDlines two-dimensional straight line detection algorithm and an SIFT feature point extraction algorithm;
step 5, matching the extracted two-dimensional feature points and straight lines with the three-dimensional feature points and straight lines, and solving the pose parameters of the space non-cooperative target from the 2D-3D correspondences, which specifically comprises:
step 5.1, calibrating the color camera with Zhang's calibration method to obtain its equivalent focal lengths f_x, f_y;
step 5.2, assuming the initial value of the pose of the space non-cooperative target is known, projecting each three-dimensional straight line onto the two-dimensional plane according to the pose initial value and matching it with the extracted two-dimensional straight lines of the target; the relationship between a point P on the three-dimensional straight line and its projection p = (p_x, p_y) can be described by the pinhole camera model:
p_x = f_x·X_c/Z_c, p_y = f_y·Y_c/Z_c, with (X_c, Y_c, Z_c)^T = R·P + t (8)
where the rotation matrix R and the translation vector t describe the rigid transformation from the world coordinate system to the camera coordinate system, and the straight line in the image can be expressed in polar coordinates as:
p_x·cosθ + p_y·sinθ − ρ_d = 0 (9)
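As an illustrative sketch of step 5.2, the pinhole projection and the polar-line residual of equation (9) can be written as below. The principal point is assumed to lie at the image origin, since the text introduces only the equivalent focal lengths f_x, f_y; function names are illustrative:

```python
import numpy as np

def project(P, R, t, fx, fy):
    # Pinhole projection: camera-frame point Pc = R P + t,
    # image point p = (fx * Xc / Zc, fy * Yc / Zc).
    Pc = R @ P + t
    return np.array([fx * Pc[0] / Pc[2], fy * Pc[1] / Pc[2]])

def polar_line_residual(p, theta, rho):
    # Eq (9): the projected point lies on the detected 2D line when
    # px*cos(theta) + py*sin(theta) - rho = 0.
    return p[0] * np.cos(theta) + p[1] * np.sin(theta) - rho
```

A projected endpoint of a matched 3D line should drive this residual toward zero as the pose estimate improves.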
and 5.3, calculating the distance from the three-dimensional straight line end point to the two-dimensional straight line, and combining equations (8) and (9) to obtain:
N = (f_x·cosθ, f_y·sinθ, −ρ_d)^T (11)
wherein N is the normal vector of the projection plane; the distance d from the projection of a three-dimensional straight-line endpoint onto the plane to the corresponding two-dimensional straight line is expressed as:
step 5.4, the corresponding distance of a 2D-3D straight-line pair is expressed as d_l,
wherein d_l1, d_l2 respectively denote the distances from the two projected endpoints p_1, p_2 of the three-dimensional straight line to the two-dimensional straight line;
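Steps 5.3-5.4 can be sketched as follows. Because (cosθ, sinθ) is a unit normal, N·Pc/Zc with N from equation (11) equals the signed image distance from a projected endpoint to the 2D line; the text does not show how d_l1 and d_l2 are combined into d_l, so their absolute sum is used here as one plausible choice:

```python
import numpy as np

def line_pair_distance(P1, P2, R, t, fx, fy, theta, rho):
    # N = (fx cos(theta), fy sin(theta), -rho)^T is the projection-plane
    # normal of eq (11); N . Pc / Zc is the signed endpoint-to-line distance.
    N = np.array([fx * np.cos(theta), fy * np.sin(theta), -rho])
    d = []
    for P in (P1, P2):          # the two endpoints of the 3D line segment
        Pc = R @ P + t
        d.append(N @ Pc / Pc[2])
    # combination of d_l1, d_l2 is an assumption (absolute sum)
    return abs(d[0]) + abs(d[1])
```

The distance vanishes exactly when both projected endpoints fall on the detected 2D line.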
step 5.5, projecting the three-dimensional feature points onto the two-dimensional plane according to the pose initial value; denoting a three-dimensional feature point as Q and its projection onto the plane as q = [q_x, q_y], the projection relationship between the two can be expressed as:
step 5.6, matching each three-dimensional feature point projected onto the two-dimensional plane with its corresponding two-dimensional feature point q′ = [q′_x, q′_y] in the image; the distance between the projected point and the two-dimensional feature point is denoted d_q:
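The point-distance term d_q of steps 5.5-5.6 is simply the reprojection error; a minimal sketch (function name assumed, principal point again taken as the image origin):

```python
import numpy as np

def point_pair_distance(Q3, q2, R, t, fx, fy):
    # Project the 3D feature point Q3 with the current pose, then d_q is
    # the Euclidean distance between projection q and matched image point q'.
    Qc = R @ Q3 + t
    q = np.array([fx * Qc[0] / Qc[2], fy * Qc[1] / Qc[2]])
    return np.linalg.norm(q - q2)
```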
step 5.7, projecting all feature points and straight-line features of the space non-cooperative target onto the two-dimensional plane; let the distance of the i-th 2D-3D corresponding point pair be d_q^i, with N corresponding point pairs in total, and let the distance of the j-th 2D-3D corresponding straight-line pair be d_l^j, with M corresponding straight-line pairs in total; the target pose is then solved by minimizing equation (16):
Equation (16) is solved by the least-squares method to obtain the pose of the space non-cooperative target: the rotation matrix R and the translation vector t.
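A minimal sketch of the least-squares minimization of equation (16), using a Gauss-Newton solver with a numerical Jacobian and an axis-angle pose parameterization (both implementation choices are assumptions, not specified by the text). Only the point terms d_q are included; line terms would be appended to the same residual vector analogously:

```python
import numpy as np

def rodrigues(w):
    # axis-angle vector -> rotation matrix
    th = np.linalg.norm(w)
    if th < 1e-12:
        return np.eye(3)
    k = w / th
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(th) * K + (1 - np.cos(th)) * (K @ K)

def residuals(x, Q3, q2, fx, fy):
    # reprojection residuals of all 3D feature points Q3 against image points q2
    R, t = rodrigues(x[:3]), x[3:]
    Qc = Q3 @ R.T + t
    q = np.stack([fx * Qc[:, 0] / Qc[:, 2], fy * Qc[:, 1] / Qc[:, 2]], axis=1)
    return (q - q2).ravel()

def solve_pose(Q3, q2, fx, fy, x0=None, iters=50):
    # Gauss-Newton on the 6-dof pose x = (axis-angle, translation)
    x = np.zeros(6) if x0 is None else np.asarray(x0, float)
    for _ in range(iters):
        r = residuals(x, Q3, q2, fx, fy)
        J = np.empty((r.size, 6))
        for j in range(6):                     # forward-difference Jacobian
            dx = np.zeros(6)
            dx[j] = 1e-6
            J[:, j] = (residuals(x + dx, Q3, q2, fx, fy) - r) / 1e-6
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + step
        if np.linalg.norm(step) < 1e-10:
            break
    return rodrigues(x[:3]), x[3:]
```

The pose initial value assumed in step 5.2 seeds x0; with a reasonable initialization the iteration converges to the rotation matrix R and translation vector t.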
The technical solutions provided by the embodiments of the present invention are described in detail above. Specific examples are used herein to explain the principles and implementations of the invention, and the description of the embodiments is intended only to aid understanding of those principles. For those skilled in the art, the specific implementation and scope of application may vary according to the ideas of the embodiments; in summary, the content of this description should not be construed as limiting the invention.
Claims (2)
1. A method for measuring the accurate pose of a spatial non-cooperative target is characterized by comprising the following steps:
step 1, a TOF camera is used for obtaining complete three-dimensional point cloud of a space non-cooperative target;
step 2, extracting three-dimensional feature points and three-dimensional straight lines from the acquired complete three-dimensional point cloud of the space non-cooperative target;
step 3, carrying out image data acquisition on the space non-cooperative target in motion by using a color camera to obtain a sequence two-dimensional image of the space non-cooperative target;
step 4, extracting two-dimensional characteristic points and two-dimensional straight lines of the space non-cooperative target from the obtained sequence two-dimensional image;
and 5, respectively matching according to the solved 2D-3D characteristic points and straight lines, and solving the pose parameters of the spatial non-cooperative target by utilizing the corresponding relation.
2. The method for measuring the accurate pose of the spatial non-cooperative target according to claim 1, wherein the step 5 specifically comprises:
step 5.1, calibrating the color camera to obtain its equivalent focal lengths f_x, f_y;
step 5.2, assuming the initial value of the pose of the space non-cooperative target is known, projecting each three-dimensional straight line onto the two-dimensional plane according to the pose initial value and matching it with the extracted two-dimensional straight lines of the target; the relationship between a point P on the three-dimensional straight line and its projection p = (p_x, p_y) can be described as:
wherein the pose of the target is described by the rotation matrix R and the translation vector t, i.e. the rigid transformation from the world coordinate system to the camera coordinate system, and the straight line in the image can be expressed in polar coordinates as:
p_x·cosθ + p_y·sinθ − ρ_d = 0 (2)
step 5.3, calculating the distance from the three-dimensional straight-line endpoints to the two-dimensional straight line; combining the pinhole projection relation with equation (2) gives:
N = (f_x·cosθ, f_y·sinθ, −ρ_d)^T (4)
wherein N is the normal vector of the projection plane; the distance d from the projection of a three-dimensional straight-line endpoint onto the plane to the corresponding two-dimensional straight line is expressed as:
step 5.4, the corresponding distance of a 2D-3D straight-line pair is expressed as d_l,
wherein d_l1, d_l2 respectively denote the distances from the two projected endpoints p_1, p_2 of the three-dimensional straight line to the two-dimensional straight line;
step 5.5, projecting the three-dimensional feature points onto the two-dimensional plane according to the pose initial value; denoting a three-dimensional feature point as Q and its projection onto the plane as q = [q_x, q_y], the projection relationship between the two can be expressed as:
step 5.6, matching each three-dimensional feature point projected onto the two-dimensional plane with its corresponding two-dimensional feature point q′ = [q′_x, q′_y] in the image; the distance between the projected point and the two-dimensional feature point is denoted d_q:
step 5.7, projecting all feature points and straight-line features of the space non-cooperative target onto the two-dimensional plane; let the distance of the i-th 2D-3D corresponding point pair be d_q^i, with N corresponding point pairs in total, and let the distance of the j-th 2D-3D corresponding straight-line pair be d_l^j, with M corresponding straight-line pairs in total; the target pose is then solved by minimizing the objective function:
The minimization is solved by the least-squares method to obtain the pose of the space non-cooperative target: the rotation matrix R and the translation vector t.
Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202110948038.1A | 2021-08-18 | 2021-08-18 | Accurate pose measurement method for space non-cooperative target
Publications (2)

Publication Number | Publication Date
---|---
CN113674353A | 2021-11-19
CN113674353B | 2023-05-16

Family ID: 78543640
Family Applications (1)

Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN202110948038.1A | Accurate pose measurement method for space non-cooperative target | 2021-08-18 | 2021-08-18

Country: CN (China)
Cited By (1)

Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
CN115661493A | 2022-12-28 | 2023-01-31 | 航天云机(北京)科技有限公司 | Object pose determination method and device, equipment and storage medium
Citations (5)

Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
US20120268567A1 | 2010-02-24 | 2012-10-25 | Canon Kabushiki Kaisha | Three-dimensional measurement apparatus, processing method, and non-transitory computer-readable storage medium
CN109242873A | 2018-08-22 | 2019-01-18 | 浙江大学 | Method for real-time 360-degree three-dimensional reconstruction of an object based on a consumer-grade color-depth camera
CN111243002A | 2020-01-15 | 2020-06-05 | 中国人民解放军国防科技大学 | Monocular laser speckle projection system calibration and depth estimation method applied to high-precision three-dimensional measurement
CN112179357A | 2020-09-25 | 2021-01-05 | 中国人民解放军国防科技大学 | Monocular camera-based visual navigation method and system for plane moving target
CN112284293A | 2020-12-24 | 2021-01-29 | 中国人民解放军国防科技大学 | Method for measuring space non-cooperative target fine three-dimensional morphology
Non-Patent Citations (4)

Title |
---|
孙聪, 刘海波, 陈圣义, 尚洋: "Scheimpflug camera calibration method based on a generalized imaging model", Acta Optica Sinica |
张跃强, 苏昂, 刘海波, 尚洋, 于起峰: "Optimization algorithm for 3D target pose tracking based on multi-level line representation and M-estimation", Acta Optica Sinica |
张雄锋, 刘海波, 尚洋: "Robust orthogonal iteration method for monocular camera pose estimation", Acta Optica Sinica |
徐影, 张进, 于沫尧, 许丹丹: "Research on attitude-orbit coupled control for multi-satellite close-range fly-around observation missions", Chinese Space Science and Technology |
Legal Events

Date | Code | Title
---|---|---
 | PB01 | Publication
 | SE01 | Entry into force of request for substantive examination
 | GR01 | Patent grant