CN103115613A - Three-dimensional space positioning method - Google Patents

Three-dimensional space positioning method

Info

Publication number
CN103115613A
CN103115613A
Authority
CN
China
Prior art keywords
camera
calibration board
image
target object
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2013100575432A
Other languages
Chinese (zh)
Other versions
CN103115613B (en)
Inventor
刘政怡
李炜
吴建国
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changzhou Yutian Electronics Co., Ltd.
Original Assignee
Anhui University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui University filed Critical Anhui University
Priority to CN201310057543.2A priority Critical patent/CN103115613B/en
Publication of CN103115613A publication Critical patent/CN103115613A/en
Application granted granted Critical
Publication of CN103115613B publication Critical patent/CN103115613B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention provides a three-dimensional space positioning method. The method comprises a setup process and a positioning process. In the setup process, a correspondence between image pixel coordinates and the spatial coordinates of mapped points on a calibration board is established, where a mapped point on the calibration board is defined as the intersection of the calibration board with the line joining a real space point imaged by the camera and the camera itself. In the positioning process, the two-dimensional coordinates of a target object are identified by image processing, and the three-dimensional coordinates of the mapped points on the calibration boards are calculated from the correspondence; each camera is connected with the mapped point on its calibration board to form two space lines, and the intersection of the two space lines is the position of the target object. The method can be applied in situations that do not require high accuracy to realize three-dimensional positioning of an object. The correspondence is established by manually moving the calibration board, and the target object is positioned with the aid of a few measured distances, so a complicated camera calibration process is avoided; the positioning method is simple, direct and easy to implement.

Description

Spatial three-dimensional positioning method
Technical field
The present invention relates to the field of image processing and machine vision, and in particular to a spatial three-dimensional positioning method implemented with two cameras.
Background technology
A spatial three-dimensional positioning method that uses two cameras usually adopts binocular stereo vision: two images of the measured object are acquired by imaging devices at different positions, and the three-dimensional geometric information of the object is obtained by computing the positional deviation between corresponding points of the two images. Patents 200910108185.7 "Object three-dimensional positioning method and camera" and 201210075073.8 "Space three-dimensional positioning technology operating system" relate to positioning under ideal conditions, that is, the two cameras are assumed to be identical, with the same focal length, perfectly parallel optical axes and no lens distortion. According to similar-triangle theory, using the image coordinates p1(x_l, y_l) and p2(x_r, y_r) of the object on the two camera imaging planes, the camera focal length f and the distance b between the two cameras, the three-dimensional coordinates (x_w, y_w, z_w) of the object can be obtained:
x_w = b·x_l / (x_l - x_r)
y_w = b·y_l / (x_l - x_r)
z_w = b·f / (x_l - x_r)
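For illustration only, a minimal sketch of this ideal-case computation is given below (Python; variable names follow the symbols above, and the identical-camera, parallel-axis, distortion-free assumptions of the formula are taken as given):

```python
def ideal_triangulation(x_l, y_l, x_r, f, b):
    """Ideal binocular triangulation: identical cameras, perfectly parallel
    optical axes, no lens distortion (the assumptions of the formula above)."""
    disparity = x_l - x_r
    if disparity == 0:
        raise ValueError("zero disparity: the point cannot be triangulated")
    x_w = b * x_l / disparity
    y_w = b * y_l / disparity
    z_w = b * f / disparity
    return x_w, y_w, z_w
```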
In practice, the optical axes of the two cameras cannot be made perfectly parallel, so positioning under the ideal-condition assumption has a relatively large error. To improve accuracy, binocular stereo vision techniques usually calibrate the cameras first, that is, determine the correspondence between the position of an object in the world coordinate system and its image point on the camera imaging plane; the two cameras then simultaneously capture images of the target object, and the three-dimensional position of the object is obtained from the intrinsic and extrinsic camera parameters given by the calibration, as in the master's thesis of Liu Jingjing of Huazhong University of Science and Technology, "Research on three-dimensional positioning technology based on binocular stereo vision", and patent 201110390863.0 "Spatial positioning method based on binocular stereo vision". Camera calibration is the key link in obtaining three-dimensional information from two-dimensional images; its accuracy is high, but the algorithms are complicated and the operation is cumbersome.
Summary of the invention
To avoid the shortcomings of the above prior art, the present invention provides a method for three-dimensional positioning of an object that is suitable for occasions where the accuracy requirement is not high. It avoids the complicated camera calibration process: the correspondence between image pixel coordinates and the spatial coordinates of the mapped points on the calibration board is determined by manually moving the calibration board, and the three-dimensional position of the measured object is obtained from this correspondence together with a few measured distances.
To solve the technical problem, the present invention adopts the following technical scheme:
The spatial three-dimensional positioning method of the present invention comprises two parts, a setup process and a positioning process. The setup process comprises the following steps:
(1) prepare a calibration board of alternating black and white squares whose aspect ratio is identical to the aspect ratio of the image resolution of the cameras;
(2) install the two cameras so that their optical axes are substantially parallel, and measure and record the distance between the two cameras;
(3) perform the same procedure for each of the two cameras: turn on the camera, move the calibration board along the camera's optical axis while observing the image taken by the camera, keeping the center of the calibration board at the center of the image, until the calibration board completely fills the image taken by the camera, and then measure and record the distance between the calibration board and the camera;
(4) establish, for each camera, the correspondence between the pixel coordinates of the image it takes and the spatial coordinates of the mapped points on the corresponding calibration board, a mapped point on the calibration board being defined as the intersection of the calibration board with the line joining a real space point imaged by the camera and the camera itself.
The positioning process comprises the following steps:
(1) turn on the two cameras simultaneously and capture images containing the target object to be located;
(2) identify the target object by image processing and compute its two-dimensional coordinates;
(3) using the correspondence, established in the setup process, between the pixel coordinates of the image taken by each camera and the spatial coordinates of the mapped points on the corresponding calibration board, compute from the two-dimensional coordinates of the target object in the two images the three-dimensional coordinates of the mapped points on the two calibration boards;
(4) connect each camera with the mapped point on its corresponding calibration board to form two space lines;
(5) compute the intersection of the two space lines as the three-dimensional coordinates of the target object.
Compared with the prior art, the beneficial effects of the present invention are:
1. The spatial three-dimensional positioning method of the present invention locates the target object by establishing the correspondence between image pixel coordinates and the spatial coordinates of the mapped points on the calibration board; the positioning method is simple, direct and easy to understand.
2. The spatial three-dimensional positioning method of the present invention avoids the complicated and cumbersome camera calibration process; the correspondence between image pixel coordinates and the spatial coordinates of the mapped points on the calibration board is determined by manually moving the calibration board, which is easy to carry out.
Description of drawings
Fig. 1 shows the calibration board used by the spatial three-dimensional positioning method of the present invention.
Fig. 2 shows the three-dimensional space coordinate system set up by the spatial three-dimensional positioning method of the present invention.
Fig. 3 shows the calibration board image captured when the calibration board has been moved until it completely fills the image taken by the camera.
Fig. 4 is a schematic diagram of moving the calibration board in the spatial three-dimensional positioning method of the present invention.
Fig. 5 is a schematic diagram of the spatial three-dimensional positioning method of the present invention.
Fig. 6 is a schematic diagram of the spatial three-dimensional positioning method of the present invention.
Symbols in the figures:
Camera C1; camera C2; distance d_cam1_cam2 between camera C1 and camera C2; distances d_cb1 and d_cb2 between the calibration board and cameras C1 and C2 when the calibration board has been moved until it completely fills the image taken by the camera; target object P; imaging plane V1 of camera C1; calibration board plane B1 corresponding to camera C1; imaging plane V2 of camera C2; calibration board plane B2 corresponding to camera C2; the line joining P and C1 intersects imaging plane V1 at point P1′ and intersects calibration board plane B1 at point P1″; the line joining P and C2 intersects imaging plane V2 at point P2′ and intersects calibration board plane B2 at point P2″.
The invention is further described below by way of an embodiment with reference to the accompanying drawings, but embodiments of the present invention are not limited thereto.
Embodiment
The spatial three-dimensional positioning method of this embodiment comprises two parts, a setup process and a positioning process. The setup process comprises the following steps:
(1) prepare a calibration board of alternating black and white squares whose aspect ratio is identical to the aspect ratio of the image resolution of the cameras.
If the resolution of the image captured by the camera is M × N and the width and height of the calibration board are BoardWidth and BoardHeight respectively, then it is required that
BoardWidth / BoardHeight = M / N        (1)
In this embodiment the resolution of the image captured by the camera is 1024 × 768, and the calibration board, referring to Fig. 1, is a cardboard of 8 × 6 alternating black and white squares, 40 cm wide and 30 cm high, which satisfies formula (1).
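As a quick illustration, the embodiment's values can be checked against constraint (1) as follows (a sketch; the variable names are illustrative):

```python
M, N = 1024, 768                    # camera image resolution
board_width, board_height = 40, 30  # calibration board size in cm

# BoardWidth / BoardHeight == M / N, checked without floating-point division
assert board_width * N == board_height * M
```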
(2) install the two cameras so that their optical axes are substantially parallel, and measure and record the distance d_cam1_cam2 between the two cameras.
For convenience, a spatial rectangular coordinate system is set up, referring to Fig. 2: the horizontal direction is the Z axis, the vertical direction is the Y axis, and the X axis is perpendicular to the YZ plane. Taking the position of camera C1 as the coordinate origin O(0, 0, 0), the coordinates of camera C2 are (0, d_cam1_cam2, 0).
(3) perform the same procedure for each of the two cameras: turn on the camera, move the calibration board along the camera's optical axis while observing the image taken by the camera, keeping the center of the calibration board at the center of the image, until the calibration board completely fills the image taken by the camera, referring to Fig. 3; at this moment measure and record the distances d_cb1 and d_cb2 between the calibration board and the cameras, referring to Fig. 4.
(4) establish, for each camera, the correspondence between the pixel coordinates of the image it takes and the spatial coordinates of the mapped points on the corresponding calibration board, a mapped point on the calibration board being defined as the intersection of the calibration board with the line joining a real space point imaged by the camera and the camera itself.
Take camera C1 as an example. The camera adopts a perspective projection imaging model similar to that of the human visual system. Referring to Fig. 5, the basic perspective projection model consists of two parts, a viewpoint and a view plane; the viewpoint can be regarded as the position where the camera is placed, and there are infinitely many possible view planes. The imaging plane V1 of the camera is one view plane; during the setup process, as the calibration board moves along the camera's optical axis, every plane in which the calibration board lies is a view plane, and when the calibration board completely fills the image taken by the camera, the plane B1 in which it then lies is likewise a view plane. For any point P in the real world, the line joining P and the viewpoint C1 intersects the view plane V1 containing the imaging plane at a point P1′, which is the image point of P on the imaging plane; the same line intersects the view plane B1 containing the calibration board at a point P1″, which is the mapped point on the calibration board.
For camera C1, the pixel at the lower-left corner of the image is chosen as the origin O1, the horizontal edge of the image is the x1 axis, and the vertical edge of the image is the y1 axis. By means of a lookup table, the correspondence between the pixel coordinates of the image taken by camera C1 and the spatial coordinates of the mapped points on the calibration board, in the absence of lens distortion, is recorded as follows, where P1′ denotes a pixel two-dimensional coordinate on the image taken by camera C1 and P1″ denotes the three-dimensional coordinate of the corresponding mapped point on the calibration board:
P1′(0, 0) → P1″(-BoardWidth/2, -BoardHeight/2, d_cb1)
P1′(M/8, 0) → P1″(-BoardWidth/2 + BoardWidth/8, -BoardHeight/2, d_cb1)
P1′(M/4, 0) → P1″(-BoardWidth/2 + BoardWidth/4, -BoardHeight/2, d_cb1)
P1′(3·M/8, 0) → P1″(-BoardWidth/2 + 3·BoardWidth/8, -BoardHeight/2, d_cb1)
P1′(M/2, 0) → P1″(0, -BoardHeight/2, d_cb1)
P1′(0, N/6) → P1″(-BoardWidth/2, -BoardHeight/2 + BoardHeight/6, d_cb1)
P1′(M/8, N/6) → P1″(-BoardWidth/2 + BoardWidth/8, -BoardHeight/2 + BoardHeight/6, d_cb1)
......
In this embodiment the resolution of the image captured by the camera is 1024 × 768 and the calibration board is 40 cm wide and 30 cm high. In the absence of lens distortion, the lookup table between the pixel two-dimensional coordinates on the image taken by camera C1 and the three-dimensional coordinates of the mapped points on the calibration board is as follows:
P1′(0, 0) → P1″(-20, -15, d_cb1)
P1′(128, 0) → P1″(-15, -15, d_cb1)
P1′(256, 0) → P1″(-10, -15, d_cb1)
P1′(384, 0) → P1″(-5, -15, d_cb1)
P1′(512, 0) → P1″(0, -15, d_cb1)
P1′(0, 128) → P1″(-20, -10, d_cb1)
P1′(128, 128) → P1″(-15, -10, d_cb1)
......
However, images taken by ordinary cameras all exhibit some distortion, so the actual P1′(x, y) is obtained with a corner detection algorithm from image processing. The correspondence between the corner two-dimensional coordinates on the image taken by camera C1 and the three-dimensional coordinates of the mapped points on the calibration board is denoted f1: P1′(x, y) → P1″(x, y, z).
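A possible sketch of how the table f1 might be built in code is given below (Python). The grid spacing, the dictionary representation, and the use of ideal corner positions are assumptions for illustration; the patent only states that, with distortion present, the actual corner pixel coordinates come from a corner detection algorithm (for instance, OpenCV's cv2.findChessboardCorners could supply the detected inner corners in place of the ideal positions).

```python
def build_correspondence_table(M, N, board_width, board_height, d_cb,
                               cols=8, rows=6):
    """Build f: pixel corner coordinate -> mapped-point coordinate on the board.

    Assumes no lens distortion, so the board's grid corners project to evenly
    spaced pixel positions; with distortion, detected corner pixel coordinates
    would replace the ideal (px, py) values used here."""
    table = {}
    for j in range(rows + 1):       # grid lines along the board height
        for i in range(cols + 1):   # grid lines along the board width
            px = i * M / cols       # ideal pixel x of this grid corner
            py = j * N / rows       # ideal pixel y of this grid corner
            bx = -board_width / 2 + i * board_width / cols    # board x (origin at board center)
            by = -board_height / 2 + j * board_height / rows  # board y
            table[(px, py)] = (bx, by, d_cb)  # z is the measured board-camera distance
    return table

# Embodiment values: 1024x768 image, 40cm x 30cm board; d_cb1 is whatever was
# measured in setup step (3) -- 100 cm here is purely an assumed placeholder.
f1 = build_correspondence_table(1024, 768, 40.0, 30.0, d_cb=100.0)
```

With these values, f1[(0.0, 0.0)] gives (-20.0, -15.0, 100.0) and f1[(128.0, 0.0)] gives (-15.0, -15.0, 100.0), matching the lookup table above.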
The positioning process comprises the following steps:
(1) turn on the two cameras simultaneously and capture images containing the target object to be located;
(2) identify the target object by image processing and compute its two-dimensional coordinates;
(3) using the correspondence, established in the setup process, between the pixel coordinates of the image taken by each camera and the spatial coordinates of the mapped points on the corresponding calibration board, compute from the two-dimensional coordinates of the target object in the two images the three-dimensional coordinates of the mapped points on the two calibration boards.
Using formula (2) and the correspondence f1, established in the setup process, between the pixel coordinates of the image taken by camera C1 and the spatial coordinates of the mapped points on the calibration board, the three-dimensional coordinate P1″(x″, y″, z″) of the mapped point on the calibration board corresponding to the target object in the image taken by camera C1 is obtained. Here (x′, y′) is the two-dimensional coordinate of the target object in the image taken by camera C1, (x′_near, y′_near) is the corner coordinate nearest to the detected target object coordinate (x′, y′), f1x(x′_near, y′_near) is the x coordinate of the corresponding mapped point on the calibration board, and f1y(x′_near, y′_near) is its y coordinate.
x″ - f1x(x′_near, y′_near) = (x′ - x′_near) × (BoardWidth / M)
y″ - f1y(x′_near, y′_near) = (y′ - y′_near) × (BoardHeight / N)
z″ = f1z(x′_near, y′_near) = d_cb1        (2)
Similarly, using formula (3) and the correspondence f2, established in the setup process, between the pixel coordinates of the image taken by camera C2 and the spatial coordinates of the mapped points on the calibration board, the three-dimensional coordinate P2″(x″, y″, z″) of the mapped point on the calibration board corresponding to the target object in the image taken by camera C2 is obtained:
x″ - f2x(x′_near, y′_near) = (x′ - x′_near) × (BoardWidth / M)
y″ - f2y(x′_near, y′_near) - d_cam1_cam2 = (y′ - y′_near) × (BoardHeight / N)
z″ = f2z(x′_near, y′_near) = d_cb2        (3)
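A sketch of this mapping for camera C1, reusing the dictionary-style table from the earlier sketch, might look as follows (choosing the nearest corner by Euclidean pixel distance is an assumption; the patent does not specify the distance measure):

```python
def map_to_board_point(x_p, y_p, table, M, N, board_width, board_height):
    """Map the target's pixel coordinate (x', y') in one camera image to the 3D
    coordinate (x'', y'', z'') of its mapped point on that camera's calibration
    board, following formulas (2)/(3): offset from the nearest recorded corner."""
    # nearest corner (x'_near, y'_near) of the correspondence table
    x_near, y_near = min(table, key=lambda c: (c[0] - x_p) ** 2 + (c[1] - y_p) ** 2)
    fx, fy, fz = table[(x_near, y_near)]
    x2 = fx + (x_p - x_near) * (board_width / M)    # x''
    y2 = fy + (y_p - y_near) * (board_height / N)   # y''
    z2 = fz                                         # z'' = d_cb1 (or d_cb2)
    return x2, y2, z2
```

For camera C2, per formula (3), the inter-camera distance d_cam1_cam2 would additionally be added to the resulting y″, since C2 lies at (0, d_cam1_cam2, 0) in the coordinate system of Fig. 2.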
(4) connect each camera with the mapped point on its corresponding calibration board to form two space lines.
Referring to Fig. 6, connecting C1 with P1″ and C2 with P2″ forms two space lines.
(5) compute the intersection of the two space lines as the three-dimensional coordinates of the target object.
The intersection of the two space lines is the position of the target object. Because of errors, the two space lines may not actually intersect, so the midpoint of their common perpendicular segment is taken as the three-dimensional coordinate of the target object. The computation of this midpoint is a matter of elementary solid geometry and is not described in detail here; a sketch is given below.
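A minimal numpy sketch of the midpoint of the common perpendicular between the two space lines (standard skew-line geometry, not text taken from the patent):

```python
import numpy as np

def common_perpendicular_midpoint(c1, p1, c2, p2):
    """Midpoint of the common perpendicular segment between line C1-P1'' and
    line C2-P2''; used as the target's 3D coordinate when the lines miss each
    other because of measurement error."""
    c1, p1, c2, p2 = (np.asarray(v, dtype=float) for v in (c1, p1, c2, p2))
    d1, d2 = p1 - c1, p2 - c2              # direction vectors of the two lines
    w0 = c1 - c2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-12:                 # lines are (nearly) parallel
        s, t = 0.0, e / c
    else:
        s = (b * e - c * d) / denom
        t = (a * e - b * d) / denom
    q1 = c1 + s * d1                       # closest point on line C1-P1''
    q2 = c2 + t * d2                       # closest point on line C2-P2''
    return (q1 + q2) / 2
```

In the coordinate system of Fig. 2, c1 = (0, 0, 0) and c2 = (0, d_cam1_cam2, 0), while p1 and p2 are the mapped points P1″ and P2″ obtained above.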
In the setup process, the more black and white squares the calibration board has, the higher the positioning accuracy.

Claims (1)

1. A spatial three-dimensional positioning method comprising two parts, a setup process and a positioning process, the setup process comprising the steps of:
(1) preparing a calibration board of alternating black and white squares whose aspect ratio is identical to the aspect ratio of the image resolution of the cameras;
(2) installing two cameras so that their optical axes are substantially parallel, and measuring and recording the distance between the two cameras;
(3) performing the same procedure for each of the two cameras: turning on the camera, moving the calibration board along the camera's optical axis while observing the image taken by the camera, keeping the center of the calibration board at the center of the image, until the calibration board completely fills the image taken by the camera, and then measuring and recording the distance between the calibration board and the camera;
(4) establishing, for each camera, the correspondence between the pixel coordinates of the image it takes and the spatial coordinates of the mapped points on the corresponding calibration board, a mapped point on the calibration board being defined as the intersection of the calibration board with the line joining a real space point imaged by the camera and the camera itself;
the positioning process comprising the steps of:
(1) turning on the two cameras simultaneously and capturing images containing the target object to be located;
(2) identifying the target object by image processing and computing its two-dimensional coordinates;
(3) using the correspondence, established in the setup process, between the pixel coordinates of the image taken by each camera and the spatial coordinates of the mapped points on the corresponding calibration board, computing from the two-dimensional coordinates of the target object in the two images the three-dimensional coordinates of the mapped points on the two calibration boards;
(4) connecting each camera with the mapped point on its corresponding calibration board to form two space lines;
(5) computing the intersection of the two space lines as the three-dimensional coordinates of the target object.
CN201310057543.2A 2013-02-04 2013-02-04 Three-dimensional space positioning method Active CN103115613B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310057543.2A CN103115613B (en) 2013-02-04 2013-02-04 Three-dimensional space positioning method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310057543.2A CN103115613B (en) 2013-02-04 2013-02-04 Three-dimensional space positioning method

Publications (2)

Publication Number Publication Date
CN103115613A true CN103115613A (en) 2013-05-22
CN103115613B CN103115613B (en) 2015-04-08

Family

ID=48414033

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310057543.2A Active CN103115613B (en) 2013-02-04 2013-02-04 Three-dimensional space positioning method

Country Status (1)

Country Link
CN (1) CN103115613B (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103471500A (en) * 2013-06-05 2013-12-25 江南大学 Conversion method of plane coordinate and space three-dimensional coordinate point in vision of monocular machine
CN103593658A (en) * 2013-11-22 2014-02-19 中国电子科技集团公司第三十八研究所 Three-dimensional space positioning system based on infrared image recognition
CN103940407A (en) * 2014-02-13 2014-07-23 鲁东大学 Method used for gully erosion extraction based on landform and remote sensing image fusion technology
CN104776832A (en) * 2015-04-16 2015-07-15 浪潮软件集团有限公司 Method, set top box and system for positioning objects in space
CN105865327A (en) * 2015-01-22 2016-08-17 成都飞机工业(集团)有限责任公司 Zoom photograph-based relative position measurement method
CN108257182A (en) * 2016-12-29 2018-07-06 深圳超多维光电子有限公司 A kind of scaling method and device of three-dimensional camera module
CN108257181A (en) * 2016-12-29 2018-07-06 深圳超多维光电子有限公司 A kind of space-location method and device
CN108616753A (en) * 2016-12-29 2018-10-02 深圳超多维科技有限公司 A kind of Nakedness-yet stereoscopic display method and device
CN108648237A (en) * 2018-03-16 2018-10-12 中国科学院信息工程研究所 A kind of space-location method of view-based access control model
CN108731644A (en) * 2017-09-12 2018-11-02 武汉天际航信息科技股份有限公司 Oblique photograph plotting method and its system based on vertical auxiliary line
CN109341530A (en) * 2018-10-25 2019-02-15 华中科技大学 Object point positioning method and system in a kind of binocular stereo vision
CN109751992A (en) * 2017-11-03 2019-05-14 北京凌宇智控科技有限公司 The positioning correction method of three-dimensional space, localization method and its equipment in faced chamber
CN110670860A (en) * 2019-10-15 2020-01-10 广东博智林机器人有限公司 Laying method, laying robot and storage medium
CN112066950A (en) * 2020-07-24 2020-12-11 北京空间机电研究所 Multi-optical-axis parallel mapping camera single-center projection conversion method
WO2022188733A1 (en) * 2021-03-08 2022-09-15 Hangzhou Taro Positioning Technology Co., Ltd. Scenario triggering and interaction based on target positioning and identification

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110148173B (en) * 2019-05-21 2021-08-06 北京百度网讯科技有限公司 Method and device for positioning target in vehicle-road cooperation, electronic equipment and medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5694483A (en) * 1994-06-13 1997-12-02 Kabushiki Kaisha Toshiba Measurement face extraction apparatus and method
US6637016B1 (en) * 2001-04-25 2003-10-21 Lsi Logic Corporation Assignment of cell coordinates
DE102011015893A1 (en) * 2011-04-01 2012-10-04 NUMENUS GmbH Method for visualizing freeform surfaces by means of ray tracing
CN102567989A (en) * 2011-11-30 2012-07-11 重庆大学 Space positioning method based on binocular stereo vision

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103471500A (en) * 2013-06-05 2013-12-25 江南大学 Conversion method of plane coordinate and space three-dimensional coordinate point in vision of monocular machine
CN103471500B (en) * 2013-06-05 2016-09-21 江南大学 A kind of monocular camera machine vision midplane coordinate and the conversion method of 3 d space coordinate point
CN103593658A (en) * 2013-11-22 2014-02-19 中国电子科技集团公司第三十八研究所 Three-dimensional space positioning system based on infrared image recognition
CN103940407A (en) * 2014-02-13 2014-07-23 鲁东大学 Method used for gully erosion extraction based on landform and remote sensing image fusion technology
CN103940407B (en) * 2014-02-13 2016-09-07 鲁东大学 One extracts coombe method based on landform and Remote Sensing Image Fusion technology
CN105865327A (en) * 2015-01-22 2016-08-17 成都飞机工业(集团)有限责任公司 Zoom photograph-based relative position measurement method
CN104776832A (en) * 2015-04-16 2015-07-15 浪潮软件集团有限公司 Method, set top box and system for positioning objects in space
CN108616753B (en) * 2016-12-29 2020-08-04 深圳超多维科技有限公司 Naked eye three-dimensional display method and device
CN108616753A (en) * 2016-12-29 2018-10-02 深圳超多维科技有限公司 A kind of Nakedness-yet stereoscopic display method and device
CN108257181A (en) * 2016-12-29 2018-07-06 深圳超多维光电子有限公司 A kind of space-location method and device
CN108257182A (en) * 2016-12-29 2018-07-06 深圳超多维光电子有限公司 A kind of scaling method and device of three-dimensional camera module
CN108731644A (en) * 2017-09-12 2018-11-02 武汉天际航信息科技股份有限公司 Oblique photograph plotting method and its system based on vertical auxiliary line
CN108731644B (en) * 2017-09-12 2020-08-11 武汉天际航信息科技股份有限公司 Oblique photography mapping method and system based on vertical auxiliary line
CN109751992B (en) * 2017-11-03 2021-07-20 北京凌宇智控科技有限公司 Indoor three-dimensional space-oriented positioning correction method, positioning method and equipment thereof
CN109751992A (en) * 2017-11-03 2019-05-14 北京凌宇智控科技有限公司 The positioning correction method of three-dimensional space, localization method and its equipment in faced chamber
CN108648237A (en) * 2018-03-16 2018-10-12 中国科学院信息工程研究所 A kind of space-location method of view-based access control model
CN108648237B (en) * 2018-03-16 2022-05-03 中国科学院信息工程研究所 Space positioning method based on vision
CN109341530A (en) * 2018-10-25 2019-02-15 华中科技大学 Object point positioning method and system in a kind of binocular stereo vision
CN110670860A (en) * 2019-10-15 2020-01-10 广东博智林机器人有限公司 Laying method, laying robot and storage medium
WO2021073458A1 (en) * 2019-10-15 2021-04-22 广东博智林机器人有限公司 Laying method and laying robot
CN112066950A (en) * 2020-07-24 2020-12-11 北京空间机电研究所 Multi-optical-axis parallel mapping camera single-center projection conversion method
WO2022188733A1 (en) * 2021-03-08 2022-09-15 Hangzhou Taro Positioning Technology Co., Ltd. Scenario triggering and interaction based on target positioning and identification

Also Published As

Publication number Publication date
CN103115613B (en) 2015-04-08

Similar Documents

Publication Publication Date Title
CN103115613B (en) Three-dimensional space positioning method
CN110288642B (en) Three-dimensional object rapid reconstruction method based on camera array
CN110728715B (en) Intelligent inspection robot camera angle self-adaptive adjustment method
CN104616292B (en) Monocular vision measuring method based on global homography matrix
CN103292695B (en) A kind of single eye stereo vision measuring method
CN104537707B (en) Image space type stereoscopic vision moves real-time measurement system online
WO2016037486A1 (en) Three-dimensional imaging method and system for human body
CN107063129A (en) A kind of array parallel laser projection three-dimensional scan method
CN106920276B (en) A kind of three-dimensional rebuilding method and system
CN104677330A (en) Small binocular stereoscopic vision ranging system
CN110375648A (en) The spatial point three-dimensional coordinate measurement method that the single camera of gridiron pattern target auxiliary is realized
WO2018032841A1 (en) Method, device and system for drawing three-dimensional image
CN103983186A (en) Binocular vision system correcting method and device
CN102914295A (en) Computer vision cube calibration based three-dimensional measurement method
CN103093460A (en) Moving camera virtual array calibration method based on parallel parallax
CA3233222A1 (en) Method, apparatus and device for photogrammetry, and storage medium
CN105374067A (en) Three-dimensional reconstruction method based on PAL cameras and reconstruction system thereof
Ye et al. An accurate 3D point cloud registration approach for the turntable-based 3D scanning system
CN108180888A (en) A kind of distance detection method based on rotating pick-up head
CN111640156A (en) Three-dimensional reconstruction method, equipment and storage equipment for outdoor weak texture target
CN111854636A (en) Multi-camera array three-dimensional detection system and method
CN111009030A (en) Multi-view high-resolution texture image and binocular three-dimensional point cloud mapping method
Deng et al. Registration of multiple rgbd cameras via local rigid transformations
Nagy et al. Development of an omnidirectional stereo vision system
Harvent et al. Multi-view dense 3D modelling of untextured objects from a moving projector-cameras system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20170309

Address after: 213000 Changzhou, Jiangsu new North Road, No. 8

Patentee after: Changzhou Yutian Electronics Co., Ltd.

Address before: 230039 Hefei West Road, Anhui, No. 3

Patentee before: Anhui University

TR01 Transfer of patent right