CN113674353B - Accurate pose measurement method for space non-cooperative target - Google Patents

Accurate pose measurement method for space non-cooperative target

Info

Publication number
CN113674353B
Authority
CN
China
Prior art keywords
dimensional
cooperative target
straight line
space non
pose
Prior art date
Legal status
Active
Application number
CN202110948038.1A
Other languages
Chinese (zh)
Other versions
CN113674353A (en
Inventor
刘海波
刘子宾
宋俊尧
张进
Current Assignee
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date
Filing date
Publication date
Application filed by National University of Defense Technology
Priority to CN202110948038.1A
Publication of CN113674353A
Application granted
Publication of CN113674353B
Legal status: Active

Classifications

    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis > G06T7/70 Determining position or orientation of objects or cameras > G06T7/73 using feature-based methods
    • G06T3/00 Geometric image transformations in the plane of the image > G06T3/06 Topological mapping of higher dimensional structures onto lower dimensional surfaces
    • G06T7/60 Analysis of geometric attributes
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T2207/00 Indexing scheme for image analysis or image enhancement > G06T2207/10 Image acquisition modality > G06T2207/10028 Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a method for accurately measuring the pose of a space non-cooperative target, which uses a TOF camera and a color camera to achieve accurate relative pose estimation of the target: the TOF camera acquires three-dimensional point clouds of the space non-cooperative target, which are stitched with the ICP algorithm into a complete three-dimensional point cloud; three-dimensional feature points and three-dimensional straight lines are extracted from this complete point cloud; the color camera acquires sequence two-dimensional images of the space non-cooperative target, from which two-dimensional feature points and two-dimensional straight lines are extracted; and the relative pose of the space non-cooperative target is solved from the correspondences between the 2D and 3D feature points and straight lines. The method combines the imaging advantages of the TOF camera and the color camera, accurately solves the pose of the space non-cooperative target, and can be applied to space tasks such as deep space exploration and situation awareness.

Description

Accurate pose measurement method for space non-cooperative target
Technical Field
The invention relates to the field of image measurement, in particular to a method for measuring the accurate pose of a space non-cooperative target.
Background
With the progress of science and technology and the development of the aerospace industry, deep space exploration and situation awareness have become important links in humanity's exploration of space. In space exploration, more and more mission objects are space non-cooperative targets, which lack cooperative markers and provide no valid a priori information. Therefore, in the complex space environment, the problem of accurate pose estimation of a completely unknown space target has attracted wide attention from researchers and has important research value and engineering significance.
Common space target pose measurement devices include color cameras, binocular cameras, lidar, and the like. A color camera cannot directly acquire target depth information; the measurement accuracy of a binocular camera is limited by its baseline; and laser three-dimensional imaging is limited by the current state of the technology and offers low resolution. Multi-sensor fusion measurement can therefore effectively combine the imaging advantages of each sensor and compensate for the inherent shortcomings of any single sensor. The invention provides an accurate pose measurement scheme for space non-cooperative targets based on a TOF camera and a color camera, which fully exploits the imaging advantages of both to achieve accurate pose measurement of the space non-cooperative target.
Disclosure of Invention
Aiming at the problems, the invention provides a method for measuring the accurate pose of a space non-cooperative target.
The technical solution adopted to solve the above technical problem is as follows: a method for accurately measuring the pose of a space non-cooperative target, comprising the following steps:
step 1, acquiring a complete three-dimensional point cloud of a space non-cooperative target by using a TOF camera;
step 2, extracting three-dimensional characteristic points and straight lines from the acquired complete three-dimensional point cloud of the space non-cooperative target;
step 3, acquiring image data of the moving space non-cooperative target by using a color camera to obtain a sequence two-dimensional image of the space non-cooperative target;
step 4, extracting two-dimensional characteristic points and two-dimensional straight lines of the space non-cooperative targets from the obtained sequence two-dimensional images;
step 5, projecting the three-dimensional feature points and three-dimensional straight lines onto the two-dimensional plane, and solving the pose parameters of the space non-cooperative target from the correspondences between the feature points and straight lines.
Preferably, the step 5 specifically includes:
step 5.1, calibrating the color camera to obtain the equivalent focal length f of the color camera x ,f y
Step 5.2, assuming the initial pose of the space non-cooperative target is known, projecting the three-dimensional straight lines onto the two-dimensional plane according to the initial pose and matching them with the extracted two-dimensional straight lines of the space non-cooperative target; the relationship between a three-dimensional line endpoint P and its projection p = (p_x, p_y) can be described by the pinhole camera model:

p_x = f_x X_c / Z_c,  p_y = f_y Y_c / Z_c,  (X_c, Y_c, Z_c)^T = RP + t   (8)

wherein the rotation matrix R and translation vector t describe the rigid transformation from the world coordinate system to the camera coordinate system; the corresponding two-dimensional straight line can be expressed in polar coordinates as:

p_x cos θ + p_y sin θ − ρ_d = 0   (9)
step 5.3, calculating the distance from the projected three-dimensional line endpoints to the two-dimensional straight line; combining equations (8) and (9) gives:

p_x cos θ + p_y sin θ − ρ_d = (f_x X_c cos θ + f_y Y_c sin θ − ρ_d Z_c) / Z_c = N^T(RP + t) / Z_c   (10)

N = (f_x cos θ, f_y sin θ, −ρ_d)^T   (11)

where N is the normal vector of the projection plane; the distance d from the projection of a three-dimensional line endpoint onto the image plane to the two-dimensional straight line is expressed as:

d = |N^T(RP + t)| / Z_c   (12)
step 5.4, the distance of a 2D-3D line correspondence is expressed as

d_l = d_l1^2 + d_l2^2   (13)

where d_l1, d_l2 denote the distances from the two endpoints p_1, p_2 of the three-dimensional line, projected onto the plane, to the two-dimensional straight line;
step 5.5, projecting the three-dimensional feature points onto the two-dimensional plane according to the initial pose; let a three-dimensional feature point and its projection onto the plane be denoted Q and q = [q_x, q_y] respectively; the projection relationship between the two can be expressed as:

q_x = f_x X_c / Z_c,  q_y = f_y Y_c / Z_c,  (X_c, Y_c, Z_c)^T = RQ + t   (14)
step 5.6, matching each three-dimensional feature point projected onto the two-dimensional plane with its corresponding two-dimensional feature point q′ = [q′_x, q′_y] on the image, and letting the distance between the projection and the two-dimensional feature point be d_q:

d_q = sqrt( (q_x − q′_x)^2 + (q_y − q′_y)^2 )   (15)
Step 5.7, with the feature points and line features of the space non-cooperative target projected onto the two-dimensional plane, letting the distance of the i-th 2D-3D point correspondence be d_{q_i}, with N point correspondences in total, and the distance of the j-th 2D-3D line correspondence be d_{l_j}, with M line correspondences in total, the target pose is solved by minimizing formula (16):

min_{R,t} Σ_{i=1}^{N} w_p d_{q_i}^2 + Σ_{j=1}^{M} w_l d_{l_j}   (16)

where w_p and w_l are the weights assigned to the point and line features respectively; solving formula (16) by the least square method yields the pose of the space non-cooperative target: the rotation matrix R and the translation vector t.
Compared with the prior art, the invention has the following beneficial effects:
1. The invention combines a TOF camera and a color camera: the color camera provides clear sequence two-dimensional images, while the TOF camera directly acquires target depth information, enabling accurate pose estimation of the space non-cooperative target. Specifically, the TOF camera acquires the target point cloud and supports three-dimensional reconstruction of the target, yielding the three-dimensional structure information of the space non-cooperative target; the color camera acquires high-resolution sequence two-dimensional images of the target in motion, from which two-dimensional feature points and two-dimensional line information are extracted and combined with the three-dimensional feature points and three-dimensional line information to solve the motion pose of the space non-cooperative target;
2. For a completely unknown space non-cooperative target, the method achieves accurate pose estimation; it can be applied to space tasks such as deep space exploration and situation awareness, and can provide effective information for subsequent tasks such as capture, attack, and defense.
3. Compared with the paper Relative pose estimation of uncooperative spacecraft using 2D-3D line correspondences, the method uses not only the line features but also the point features of the space non-cooperative target, and thus fully exploits the point and line features of the target structure and edges. Line features of a space non-cooperative target are difficult to extract stably under occlusion, background interference, and similar conditions; when pose calculation from lines is difficult, the pose can still be solved from the corresponding key feature points. The invention also optimizes the objective function, assigning different weights to the key points and straight lines, which makes the pose measurement more accurate.
Drawings
Fig. 1 is a flow chart of the method of the present invention.
Detailed Description
The present invention will be described in detail below with reference to fig. 1; the exemplary embodiments and the accompanying description serve to explain the present invention and are not limiting.
A method for measuring the accurate pose of a space non-cooperative target comprises the following steps:
step 1, acquiring a complete three-dimensional point cloud of a space non-cooperative target by utilizing a TOF camera, and specifically comprising the following steps:
step 1.1, calibrating the TOF camera with the Zhang Zhengyou calibration method (A flexible new technique for camera calibration, IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000); calibration can be performed with a checkerboard calibration plate to obtain the intrinsic parameter matrix K of the TOF camera;
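As an illustrative sketch (not part of the patent text), the Zhang Zhengyou calibration of step 1.1 can be carried out with OpenCV's built-in routines; the checkerboard size and image directory below are assumed examples:

```python
import glob
import cv2
import numpy as np

# Assumed checkerboard geometry: 9 x 6 inner corners, unit square size.
pattern = (9, 6)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_pts, img_pts = [], []
for path in glob.glob("calib/*.png"):  # hypothetical calibration image directory
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# K is the intrinsic parameter matrix; dist holds the distortion coefficients.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
```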
step 1.2, capturing the space non-cooperative target from all viewing angles with the TOF camera to obtain a sequence of depth maps;
step 1.3, mapping each depth map into three-dimensional space according to the camera intrinsic parameters to obtain local point clouds of the space non-cooperative target;
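A minimal sketch of the back-projection in step 1.3, assuming a depth map in metres and an intrinsic matrix K = [[f_x, 0, c_x], [0, f_y, c_y], [0, 0, 1]]:

```python
import numpy as np

def depth_to_point_cloud(depth, K):
    """Map a TOF depth image to a 3D point cloud using the pinhole model."""
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    v, u = np.indices(depth.shape)   # pixel rows (v) and columns (u)
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]        # discard pixels with no depth return
```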
step 1.4, denoising the space non-cooperative target point cloud, and removing noise points to obtain a fine local point cloud of the space non-cooperative target;
step 1.5, providing an initial value for local point cloud registration by using a Fast Point Feature Histogram (FPFH) algorithm;
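Steps 1.4-1.5 could look as follows with the Open3D library (an assumption; the patent does not name an implementation), using statistical outlier removal for denoising and FPFH features with RANSAC matching for the coarse initial transform; the voxel size and thresholds are assumed values:

```python
import open3d as o3d  # assumes Open3D >= 0.12 API

def coarse_register(src, dst, voxel=0.05):
    # Step 1.4: drop statistical outliers from each local cloud.
    src, _ = src.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)
    dst, _ = dst.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)

    def prep(pcd):
        p = pcd.voxel_down_sample(voxel)
        p.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(
            radius=2 * voxel, max_nn=30))
        f = o3d.pipelines.registration.compute_fpfh_feature(
            p, o3d.geometry.KDTreeSearchParamHybrid(radius=5 * voxel, max_nn=100))
        return p, f

    s, fs = prep(src)
    d, fd = prep(dst)
    # Step 1.5: RANSAC over FPFH matches gives the initial value for ICP.
    result = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
        s, d, fs, fd, True, 1.5 * voxel,
        o3d.pipelines.registration.TransformationEstimationPointToPoint(False), 3,
        [o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(1.5 * voxel)],
        o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))
    return result.transformation
```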
step 1.6, after obtaining the initial value, matching the point clouds of two adjacent frames of the space non-cooperative target with the ICP algorithm; let the adjacent point sets to be matched be denoted A and B:

A = {a_1, ..., a_i, ..., a_n},  B = {b_1, ..., b_j, ..., b_m}

where a_i, b_j denote three-dimensional points in the point sets A and B, and n and m denote the numbers of points in A and B respectively; the ICP algorithm matches the point clouds by minimizing the distance between the two point sets, describing the rigid transformation from point set A to point set B with a rotation matrix R_t and translation vector T_t, as follows:
step 1.6.1, searching for correspondences between the three-dimensional points in the adjacent point sets A and B; ICP takes the two three-dimensional points with the smallest Euclidean distance as corresponding points, and the corresponding pair is denoted a_i, b_i;
Step 1.6.2, solving for the rotation matrix R_t and translation vector T_t by minimizing the distance over the corresponding point pairs in the point sets:

(R_t, T_t) = argmin Σ_{i=1}^{n} ‖R_t a_i + T_t − b_i‖^2   (1)
Step 1.6.3, calculating the centroid coordinates of the point sets A and B (the sums run over the n corresponding point pairs found in step 1.6.1):

μ_a = (1/n) Σ_{i=1}^{n} a_i,  μ_b = (1/n) Σ_{i=1}^{n} b_i   (2)
step 1.6.4, calculating the centroid-removed coordinates of each point in the point sets A and B:

a_i′ = a_i − μ_a,  b_i′ = b_i − μ_b   (3)
step 1.6.5, solving for the rotation matrix R_t by the SVD method:

W = Σ_{i=1}^{n} b_i′ a_i′^T   (4)

W = UΣV^T   (5)

R_t = UV^T   (6)
Step 1.6.6, solving for the translation vector T_t:

T_t = μ_b − R_t μ_a   (7)
Repeating steps 1.6.1-1.6.6 until the distance meets the threshold requirement, obtaining the optimal rotation matrix R_t and translation vector T_t and stitching the two frames of point clouds;
step 1.7, repeating step 1.6 and stitching all point clouds to obtain the complete three-dimensional point cloud of the space non-cooperative target;
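As an illustrative aside (not part of the patent text), the ICP iteration of steps 1.6.1-1.6.6 admits a compact NumPy sketch; the KD-tree nearest-neighbour search, the iteration cap, and the convergence threshold are assumed choices:

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(A, B, max_iter=50, tol=1e-6):
    """Align point set A (n x 3) to B (m x 3); returns R_t, T_t with b ~ R_t a + T_t."""
    R_t, T_t = np.eye(3), np.zeros(3)
    tree = cKDTree(B)
    prev_err = np.inf
    for _ in range(max_iter):
        dist, idx = tree.query(A @ R_t.T + T_t)       # step 1.6.1: nearest-neighbour pairs
        Bc = B[idx]
        mu_a, mu_b = A.mean(axis=0), Bc.mean(axis=0)  # step 1.6.3: centroids, equation (2)
        W = (Bc - mu_b).T @ (A - mu_a)                # equation (4): W = sum b_i' a_i'^T
        U, _, Vt = np.linalg.svd(W)                   # equation (5)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # reflection guard
        R_t = U @ D @ Vt                              # equation (6): R_t = U V^T
        T_t = mu_b - R_t @ mu_a                       # equation (7)
        err = dist.mean()
        if abs(prev_err - err) < tol:                 # distance meets threshold requirement
            break
        prev_err = err
    return R_t, T_t
```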
step 2, extracting three-dimensional characteristic points and three-dimensional straight lines from the acquired complete three-dimensional point cloud of the space non-cooperative target;
step 3, acquiring image data of the moving space non-cooperative target by using a color camera to obtain a sequence two-dimensional image of the space non-cooperative target;
step 4, extracting the two-dimensional feature points and two-dimensional straight lines of the space non-cooperative target from the obtained sequence two-dimensional images, using the EDLines two-dimensional line detection algorithm and the SIFT feature point extraction algorithm;
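A sketch of step 4 using OpenCV (assumptions: cv2.SIFT_create for the feature points, and the FastLineDetector from opencv-contrib as a stand-in for EDLines, which OpenCV does not ship under that name):

```python
import cv2

img = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)  # hypothetical frame

# Two-dimensional feature points via SIFT.
sift = cv2.SIFT_create()
keypoints, descriptors = sift.detectAndCompute(img, None)

# Two-dimensional line segments; FastLineDetector (opencv-contrib)
# stands in here for the EDLines detector named in the patent.
fld = cv2.ximgproc.createFastLineDetector()
lines = fld.detect(img)  # each line is (x1, y1, x2, y2)
```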
step 5, matching the extracted two-dimensional feature points and two-dimensional straight lines with the three-dimensional feature points and three-dimensional straight lines, and solving the pose parameters of the space non-cooperative target from the 2D-3D point and line correspondences, with the following specific steps:
step 5.1, calibrating the color camera with the Zhang Zhengyou calibration method to obtain its equivalent focal lengths f_x, f_y;
Step 5.2, assuming the initial pose of the space non-cooperative target is known, projecting the three-dimensional straight lines onto the two-dimensional plane according to the initial pose and matching them with the extracted two-dimensional straight lines of the space non-cooperative target; the relationship between a three-dimensional line endpoint P and its projection p = (p_x, p_y) can be described by the pinhole camera model:

p_x = f_x X_c / Z_c,  p_y = f_y Y_c / Z_c,  (X_c, Y_c, Z_c)^T = RP + t   (8)

wherein the rotation matrix R and translation vector t describe the rigid transformation from the world coordinate system to the camera coordinate system; the corresponding two-dimensional straight line can be expressed in polar coordinates as:

p_x cos θ + p_y sin θ − ρ_d = 0   (9)
step 5.3, calculating the distance from the projected three-dimensional line endpoints to the two-dimensional straight line; combining equations (8) and (9) gives:

p_x cos θ + p_y sin θ − ρ_d = (f_x X_c cos θ + f_y Y_c sin θ − ρ_d Z_c) / Z_c = N^T(RP + t) / Z_c   (10)

N = (f_x cos θ, f_y sin θ, −ρ_d)^T   (11)

where N is the normal vector of the projection plane; the distance d from the projection of a three-dimensional line endpoint onto the image plane to the two-dimensional straight line is expressed as:

d = |N^T(RP + t)| / Z_c   (12)
step 5.4, the distance of a 2D-3D line correspondence is expressed as

d_l = d_l1^2 + d_l2^2   (13)

where d_l1, d_l2 denote the distances from the two endpoints p_1, p_2 of the three-dimensional line, projected onto the plane, to the two-dimensional straight line;
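A small sketch of the endpoint-to-line distance of steps 5.2-5.4 (equations (8)-(13)); it assumes image coordinates centred on the principal point, as the pinhole model above does, and a 2D line given in polar form (θ, ρ_d):

```python
import numpy as np

def line_distance(R, t, P1, P2, theta, rho_d, fx, fy):
    """d_l of equation (13): squared distances of both projected endpoints to the 2D line."""
    N = np.array([fx * np.cos(theta), fy * np.sin(theta), -rho_d])  # equation (11)
    d = []
    for P in (P1, P2):
        Pc = R @ P + t              # endpoint in the camera frame, equation (8)
        d.append(N @ Pc / Pc[2])    # signed point-to-line distance, equation (12)
    return d[0] ** 2 + d[1] ** 2    # equation (13)
```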
step 5.5, projecting the three-dimensional feature points onto the two-dimensional plane according to the initial pose; let a three-dimensional feature point and its projection onto the plane be denoted Q and q = [q_x, q_y] respectively; the projection relationship between the two can be expressed as:

q_x = f_x X_c / Z_c,  q_y = f_y Y_c / Z_c,  (X_c, Y_c, Z_c)^T = RQ + t   (14)
step 5.6, matching each three-dimensional feature point projected onto the two-dimensional plane with its corresponding two-dimensional feature point q′ = [q′_x, q′_y] on the image, and letting the distance between the projection and the two-dimensional feature point be d_q:

d_q = sqrt( (q_x − q′_x)^2 + (q_y − q′_y)^2 )   (15)
Step 5.7, with the feature points and line features of the space non-cooperative target projected onto the two-dimensional plane, letting the distance of the i-th 2D-3D point correspondence be d_{q_i}, with N point correspondences in total, and the distance of the j-th 2D-3D line correspondence be d_{l_j}, with M line correspondences in total, the target pose is solved by minimizing formula (16):

min_{R,t} Σ_{i=1}^{N} w_p d_{q_i}^2 + Σ_{j=1}^{M} w_l d_{l_j}   (16)

where w_p and w_l are the weights assigned to the point and line features respectively; solving formula (16) by the least square method yields the pose of the space non-cooperative target: the rotation matrix R and the translation vector t.
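As a non-authoritative sketch of step 5.7, the weighted objective of formula (16) can be minimized with SciPy's least-squares solver; the Rodrigues-vector parametrization of R, the weight values, and all variable names are assumptions:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def solve_pose(x0, pts3d, pts2d, line_pts3d, lines2d, fx, fy, w_p=1.0, w_l=1.0):
    """Refine the pose; x0 = [rotation vector (3), translation (3)] from the initial pose."""
    def residuals(x):
        R = Rotation.from_rotvec(x[:3]).as_matrix()
        t = x[3:]
        res = []
        # Point terms: least_squares squares these, giving w_p * d_q^2 of formula (16).
        for Q, q_obs in zip(pts3d, pts2d):
            Qc = R @ Q + t
            q = np.array([fx * Qc[0] / Qc[2], fy * Qc[1] / Qc[2]])  # equation (14)
            res.extend(np.sqrt(w_p) * (q - q_obs))
        # Line terms: squared residuals give w_l * (d_l1^2 + d_l2^2) = w_l * d_l.
        for (P1, P2), (theta, rho_d) in zip(line_pts3d, lines2d):
            N = np.array([fx * np.cos(theta), fy * np.sin(theta), -rho_d])
            for P in (P1, P2):
                Pc = R @ P + t
                res.append(np.sqrt(w_l) * (N @ Pc / Pc[2]))   # equation (12)
        return np.array(res)
    sol = least_squares(residuals, x0)
    return Rotation.from_rotvec(sol.x[:3]).as_matrix(), sol.x[3:]
```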
The foregoing has described in detail the technical solutions provided by the embodiments of the present invention, using specific examples to illustrate their principles and implementations; the above description of the embodiments is intended only to aid understanding of those principles. Those skilled in the art may vary the specific embodiments and the scope of application according to the embodiments of the present invention, and this description should not be construed as limiting the invention.

Claims (1)

1. The accurate pose measurement method for the space non-cooperative target is characterized by comprising the following steps of:
step 1, acquiring a complete three-dimensional point cloud of a space non-cooperative target by using a TOF camera;
step 2, extracting three-dimensional characteristic points and three-dimensional straight lines from the acquired complete three-dimensional point cloud of the space non-cooperative target;
step 3, acquiring image data of the moving space non-cooperative target by using a color camera to obtain a sequence two-dimensional image of the space non-cooperative target;
step 4, extracting two-dimensional characteristic points and two-dimensional straight lines of the space non-cooperative targets from the obtained sequence two-dimensional images;
step 5, matching the solved 2D-3D feature points and straight lines respectively, and solving the pose parameters of the space non-cooperative target using the correspondences:
step 5.1, calibrating the color camera to obtain its equivalent focal lengths f_x, f_y;
Step 5.2, assuming the initial pose of the space non-cooperative target is known, projecting the three-dimensional straight lines onto the two-dimensional plane according to the initial pose and matching them with the extracted two-dimensional straight lines of the space non-cooperative target; the relationship between a three-dimensional line endpoint P and its projection p = (p_x, p_y) can be described as:

p_x = f_x X_c / Z_c,  p_y = f_y Y_c / Z_c,  (X_c, Y_c, Z_c)^T = RP + t   (1)

wherein the pose of the target, i.e. the rigid transformation from the world coordinate system to the camera coordinate system, is described by a rotation matrix R and a translation vector t; the corresponding two-dimensional straight line can be expressed in polar coordinates as:

p_x cos θ + p_y sin θ − ρ_d = 0   (2)
step 5.3, calculating the distance from the projected three-dimensional line endpoints to the two-dimensional straight line; combining equations (1) and (2) gives:

p_x cos θ + p_y sin θ − ρ_d = (f_x X_c cos θ + f_y Y_c sin θ − ρ_d Z_c) / Z_c = N^T(RP + t) / Z_c   (3)

N = (f_x cos θ, f_y sin θ, −ρ_d)^T   (4)

where N is the normal vector of the projection plane; the distance d from the projection of a three-dimensional line endpoint onto the image plane to the two-dimensional straight line is expressed as:

d = |N^T(RP + t)| / Z_c   (5)
step 5.4, the distance of a 2D-3D line correspondence is expressed as

d_l = d_l1^2 + d_l2^2   (6)

where d_l1, d_l2 denote the distances from the two endpoints p_1, p_2 of the three-dimensional line, projected onto the plane, to the two-dimensional straight line;
step 5.5, projecting the three-dimensional feature points onto the two-dimensional plane according to the initial pose; let a three-dimensional feature point and its projection onto the plane be denoted Q and q = [q_x, q_y] respectively; the projection relationship between the two can be expressed as:

q_x = f_x X_c / Z_c,  q_y = f_y Y_c / Z_c,  (X_c, Y_c, Z_c)^T = RQ + t   (7)
step 5.6, matching each three-dimensional feature point projected onto the two-dimensional plane with its corresponding two-dimensional feature point q′ = [q′_x, q′_y] on the image, and letting the distance between the projection and the two-dimensional feature point be d_q:

d_q = sqrt( (q_x − q′_x)^2 + (q_y − q′_y)^2 )   (8)
Step 5.7, the feature points and the linear features of the space non-cooperative targets are projected onto a two-dimensional plane, and the i-th pair of 2D-3D corresponding point pair distances are set as
Figure QLYQS_7
N pairs of corresponding points are shared, and the distance between the j-th pair and the 2D-3D corresponding straight line pair is set as +.>
Figure QLYQS_8
And (3) sharing M pairs of 2D-3D corresponding straight line pairs, and solving the target pose by a minimization formula (9):
Figure QLYQS_9
solving a formula (9) through a least square method to solve the pose of the space non-cooperative target: the rotation matrix R and the translation vector t.
CN202110948038.1A 2021-08-18 2021-08-18 Accurate pose measurement method for space non-cooperative target Active CN113674353B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110948038.1A CN113674353B (en) 2021-08-18 2021-08-18 Accurate pose measurement method for space non-cooperative target


Publications (2)

Publication Number Publication Date
CN113674353A 2021-11-19
CN113674353B 2023-05-16

Family

ID=78543640

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110948038.1A Active CN113674353B (en) 2021-08-18 2021-08-18 Accurate pose measurement method for space non-cooperative target

Country Status (1)

Country Link
CN (1) CN113674353B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115661493B (en) * 2022-12-28 2023-07-04 航天云机(北京)科技有限公司 Method, device, equipment and storage medium for determining object pose

Citations (4)

Publication number Priority date Publication date Assignee Title
CN109242873A (en) * 2018-08-22 2019-01-18 浙江大学 A method of 360 degree of real-time three-dimensionals are carried out to object based on consumer level color depth camera and are rebuild
CN111243002A (en) * 2020-01-15 2020-06-05 中国人民解放军国防科技大学 Monocular laser speckle projection system calibration and depth estimation method applied to high-precision three-dimensional measurement
CN112179357A (en) * 2020-09-25 2021-01-05 中国人民解放军国防科技大学 Monocular camera-based visual navigation method and system for plane moving target
CN112284293A (en) * 2020-12-24 2021-01-29 中国人民解放军国防科技大学 Method for measuring space non-cooperative target fine three-dimensional morphology

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
JP2011175477A (en) * 2010-02-24 2011-09-08 Canon Inc Three-dimensional measurement apparatus, processing method and program

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
CN109242873A (en) * 2018-08-22 2019-01-18 浙江大学 A method of 360 degree of real-time three-dimensionals are carried out to object based on consumer level color depth camera and are rebuild
CN111243002A (en) * 2020-01-15 2020-06-05 中国人民解放军国防科技大学 Monocular laser speckle projection system calibration and depth estimation method applied to high-precision three-dimensional measurement
CN112179357A (en) * 2020-09-25 2021-01-05 中国人民解放军国防科技大学 Monocular camera-based visual navigation method and system for plane moving target
CN112284293A (en) * 2020-12-24 2021-01-29 中国人民解放军国防科技大学 Method for measuring space non-cooperative target fine three-dimensional morphology

Non-Patent Citations (4)

Title
A robust orthogonal iteration method for monocular camera pose estimation; Zhang Xiongfeng, Liu Haibo, Shang Yang; Acta Optica Sinica; Vol. 39, No. 9; 262-267 *
An optimization algorithm for 3D target pose tracking based on multi-level line representation and M-estimation; Zhang Yueqiang, Su Ang, Liu Haibo, Shang Yang, Yu Qifeng; Acta Optica Sinica; Vol. 35, No. 1; 256-265 *
A Scheimpflug camera calibration method based on a generalized imaging model; Sun Cong, Liu Haibo, Chen Shengyi, Shang Yang; Acta Optica Sinica; Vol. 38, No. 8; 114-122 *
Research on coupled attitude-orbit control for multi-satellite close-range fly-around observation missions; Xu Ying, Zhang Jin, Yu Moyao, Xu Dandan; Chinese Space Science and Technology; Vol. 39, No. 6; 21-29 *

Also Published As

Publication number Publication date
CN113674353A (en) 2021-11-19

Similar Documents

Publication Publication Date Title
CN109146980B (en) Monocular vision based optimized depth extraction and passive distance measurement method
CN109035320B (en) Monocular vision-based depth extraction method
RU2609434C2 (en) Detection of objects arrangement and location
CN103971378B (en) A kind of mix the three-dimensional rebuilding method of panoramic picture in visual system
Alismail et al. Automatic calibration of a range sensor and camera system
CN109919911B (en) Mobile three-dimensional reconstruction method based on multi-view photometric stereo
CN109559355B (en) Multi-camera global calibration device and method without public view field based on camera set
CN107729893B (en) Visual positioning method and system of die spotting machine and storage medium
CN107560592B (en) Precise distance measurement method for photoelectric tracker linkage target
CN110728715A (en) Camera angle self-adaptive adjusting method of intelligent inspection robot
CN107084680B (en) Target depth measuring method based on machine monocular vision
CN108362205B (en) Space distance measuring method based on fringe projection
CN108171715B (en) Image segmentation method and device
CN109255818B (en) Novel target and extraction method of sub-pixel level angular points thereof
Nagy et al. Online targetless end-to-end camera-LiDAR self-calibration
CN113050074B (en) Camera and laser radar calibration system and calibration method in unmanned environment perception
CN112365545B (en) Calibration method of laser radar and visible light camera based on large-plane composite target
CN111524174A (en) Binocular vision three-dimensional construction method for moving target of moving platform
CN104504691A (en) Camera position and posture measuring method on basis of low-rank textures
CN113674353B (en) Accurate pose measurement method for space non-cooperative target
Han et al. Target positioning method in binocular vision manipulator control based on improved canny operator
CN108596947A (en) A kind of fast-moving target tracking method suitable for RGB-D cameras
WO2020133080A1 (en) Object positioning method and apparatus, computer device, and storage medium
Miled et al. Hybrid online mobile laser scanner calibration through image alignment by mutual information
CN112017259A (en) Indoor positioning and image building method based on depth camera and thermal imager

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant