CN105069781A - Salient object spatial three-dimensional positioning method - Google Patents
- Publication number
- CN105069781A (application CN201510426644.1A)
- Authority
- CN
- China
- Prior art keywords
- well-marked target
- dimensional
- positioning method
- spatial
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention relates to a salient object spatial three-dimensional positioning method comprising: calibrating and rectifying a left camera and a right camera to obtain their intrinsic and extrinsic parameters; obtaining the salient object in the images captured by the two cameras by image saliency detection; computing the two-dimensional center coordinate from the object contour; and computing the three-dimensional coordinate of the salient object's center from the intrinsic and extrinsic parameters by the binocular stereo vision principle, thereby achieving spatial three-dimensional positioning of the salient object. Because the method obtains the objects in the images captured by the left and right cameras by image saliency detection, it facilitates acquisition of the salient object within the binocular field of view.
Description
Technical field
The present invention relates to the fields of computer vision and human-computer interaction, and specifically to a spatial three-dimensional positioning method for a salient object.
Background art
In robot applications based on binocular stereo vision, such as obstacle avoidance, the object a robot needs to avoid usually belongs to the obvious objects within the robot's binocular field of view; the spatial three-dimensional positioning of salient objects is therefore of great significance.
A traditional spatial positioning method based on binocular stereo vision uses two cameras to photograph a spatial object simultaneously, obtaining two images of it from different angles, and computes the three-dimensional position of the target object from the differences between corresponding pixels in the two images. The target in the two images is usually obtained by image segmentation or contour detection, as in invention patent application No. 201410665285.0, "A robot de-stacking method based on binocular stereo vision", and the Beijing Jiaotong University thesis by Gao Lining, "Binocular stereo vision object recognition and localization". Because of noise, image segmentation or contour detection yields many candidate targets, so the salient object cannot be obtained directly, which impairs its localization.
Summary of the invention
To solve the salient-object positioning problem in binocular vision applications, the present invention provides a spatial three-dimensional positioning method for a salient object: the salient object in the images captured by the two cameras is obtained by image saliency detection, and from its two-dimensional center coordinates the spatial three-dimensional position of the target is obtained by binocular stereo vision.
The present invention solves this problem by the following technical scheme:
In the spatial three-dimensional positioning method for a salient object of the present invention, the left and right cameras are calibrated and rectified to obtain their intrinsic and extrinsic parameters; the salient object in the image captured by each of the two cameras is obtained by image saliency detection; the two-dimensional center coordinate is computed from the target contour; the three-dimensional coordinate of the salient object's center is then computed from the intrinsic and extrinsic parameters by the binocular stereo vision principle, achieving the spatial three-dimensional positioning of the salient object.
Compared with the prior art, the beneficial effects of the present invention are as follows:
In the spatial three-dimensional positioning method for a salient object of the present invention, the targets in the images captured by the left and right cameras are obtained by image saliency detection, which facilitates acquisition of the salient object within the binocular field of view.
Brief description of the drawings
Fig. 1 is a flow chart of the spatial three-dimensional positioning method for a salient object of the present invention.
The invention is further described below by way of an embodiment with reference to the accompanying drawings, but embodiments of the present invention are not limited thereto.
Embodiment
In the spatial three-dimensional positioning method for a salient object of the present embodiment, the left and right cameras are calibrated and rectified to obtain their intrinsic and extrinsic parameters; the salient object in the image captured by each of the two cameras is obtained by image saliency detection; the two-dimensional center coordinate is computed from the target contour; the three-dimensional coordinate of the salient object's center is then computed from the intrinsic and extrinsic parameters by the binocular stereo vision principle, achieving the spatial three-dimensional positioning of the salient object.
The spatial three-dimensional positioning method for the salient object comprises five parts: a calibration process, a stereo rectification process, a saliency detection process, a two-dimensional positioning process for the salient object, and a three-dimensional positioning process for the salient object; see Fig. 1.
The calibration process comprises:
(1) Calibrate the left camera with the calibration toolbox provided by Matlab to obtain the intrinsic matrix M1 and the distortion vector D1 of the left camera;
(2) Calibrate the right camera with the calibration toolbox provided by Matlab to obtain the intrinsic matrix M2 and the distortion vector D2 of the right camera;
(3) Perform stereo calibration of the left and right cameras with the calibration toolbox provided by Matlab to obtain the rotation matrix R and the translation matrix T between the left and right camera coordinate systems.
The stereo rectification process comprises:
Compute the reprojection rotation matrices: taking as input the intrinsic matrices M1, M2 and distortion vectors D1, D2 of the left and right cameras obtained in the calibration process, together with the rotation matrix R and translation matrix T between the binocular camera coordinate systems, compute with the function stereoRectify the projection matrices P1 and P2 of the rectified left and right cameras in the new coordinate system.
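The projection matrices produced by this step combine intrinsics and extrinsics. A minimal sketch of that composition follows, assuming illustrative intrinsic values and a hypothetical 60 mm baseline; it ignores distortion and the rectifying rotations that OpenCV's stereoRectify additionally computes, and only shows how a 3x4 projection matrix P = M [R | t] is formed:

```python
# Toy sketch (NOT stereoRectify itself): compose P = M [R | t] from an
# intrinsic matrix M, rotation R, and translation t. All numbers are
# illustrative assumptions; distortion is ignored.

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def projection_matrix(M, R, t):
    """P = M [R | t]: M is 3x3 intrinsics, R 3x3 rotation, t length-3."""
    Rt = [row + [ti] for row, ti in zip(R, t)]  # 3x4 block [R | t]
    return matmul(M, Rt)

# Assumed intrinsics: 800 px focal length, principal point (320, 240).
M1 = [[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]]
I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]

# Left camera at the origin; right camera translated 60 mm along x.
P1 = projection_matrix(M1, I3, [0.0, 0.0, 0.0])
P2 = projection_matrix(M1, I3, [-60.0, 0.0, 0.0])
```

In the rectified geometry the two matrices differ only in their last column, which is where the baseline enters the later triangulation step.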
The saliency detection process comprises:
Perform saliency detection on the images captured by the left and right cameras respectively, obtaining two saliency maps S1 and S2. Any of the existing saliency detection algorithms may be used, e.g. GS-SP, MR, HS, AMC, PCA, or SF.
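As a toy illustration only (not any of the algorithms listed above), a saliency map can be sketched as per-pixel distance from the image's mean intensity, a crude stand-in for the global-contrast idea behind detectors such as SF; the tiny grayscale grid below is a made-up example:

```python
# Toy global-contrast "saliency": |pixel - mean intensity|. A real
# pipeline would run one of the cited detectors (GS-SP, MR, HS, AMC,
# PCA, SF) on the camera images; this only shows the data shape.

def saliency_map(img):
    """img: 2D list of grayscale values in [0, 255]; returns a 2D map."""
    pixels = [v for row in img for v in row]
    mean = sum(pixels) / len(pixels)
    return [[abs(v - mean) for v in row] for row in img]

# Illustrative 4x4 image: a bright object on a dark background.
img = [
    [10, 10, 10, 10],
    [10, 200, 210, 10],
    [10, 205, 200, 10],
    [10, 10, 10, 10],
]
S = saliency_map(img)  # bright interior pixels get the highest saliency
```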
The two-dimensional positioning process for the salient object comprises:
For the two saliency maps S1 and S2 obtained in the saliency detection process, extract the contours with the FindContours function provided by OpenCV, and compute the corresponding center coordinates (X_1, Y_1) and (X_2, Y_2) according to formula (1):

X = \frac{1}{n}\sum_{i=1}^{n} x_i, \qquad Y = \frac{1}{n}\sum_{i=1}^{n} y_i \quad (1)

where (X, Y) is the center coordinate of the salient object in the image, n is the number of points in the target contour, and (x_i, y_i) is the coordinate of the i-th point in the target contour.
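The center computation of formula (1) can be sketched as follows; in the described pipeline the contour would come from OpenCV's findContours on the saliency map, while the square contour here is hand-made for illustration:

```python
# Sketch of formula (1): the salient object's image center as the
# mean of its contour points.

def contour_center(contour):
    """contour: list of (x, y) points; returns the mean point (X, Y)."""
    n = len(contour)
    X = sum(x for x, _ in contour) / n
    Y = sum(y for _, y in contour) / n
    return X, Y

# Illustrative contour: the four corners of an axis-aligned square.
square = [(10, 10), (30, 10), (30, 30), (10, 30)]
center = contour_center(square)  # (20.0, 20.0)
```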
The three-dimensional positioning process for the salient object comprises:
Using the binocular stereo vision principle, the actual three-dimensional space coordinate of the target is located from the two-dimensional coordinates of the salient object's center in the two images.
From the projection matrices P1 and P2 obtained in the stereo rectification process, the relation between the two-dimensional coordinates of the salient object's center in the two images and its actual three-dimensional coordinate is set up as in formulas (2) and (3):

Z_1 \, (X_1, Y_1, 1)^T = P_1 \, (X, Y, Z, 1)^T \quad (2)
Z_2 \, (X_2, Y_2, 1)^T = P_2 \, (X, Y, Z, 1)^T \quad (3)

where Z_1 and Z_2 are the Z coordinates of the salient object's center point in the left and right camera coordinate systems respectively. Eliminating Z_1 and Z_2 yields four linear equations in (X, Y, Z), as in formula (4); writing p_j^{(k)T} for the j-th row of P_k:

(X_k \, p_3^{(k)} - p_1^{(k)})^T (X, Y, Z, 1)^T = 0, \quad (Y_k \, p_3^{(k)} - p_2^{(k)})^T (X, Y, Z, 1)^T = 0, \quad k = 1, 2 \quad (4)
Compute the three-dimensional coordinate (X, Y, Z) of the salient object's center point by the least squares method, achieving the spatial three-dimensional positioning of the salient object.
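The triangulation step can be sketched as follows: build the four linear equations of formula (4) from P1, P2 and the two image centers, then solve for (X, Y, Z) by least squares via the 3x3 normal equations. The projection matrices and the image point below are illustrative assumptions (an idealized rectified pair with identity intrinsics and a 0.1 m baseline), not values from the patent:

```python
# Least-squares triangulation sketch for formula (4).

def triangulate(P1, P2, pt1, pt2):
    """P1, P2: 3x4 projection matrices; pt1, pt2: (u, v) image points."""
    rows, rhs = [], []
    for P, (u, v) in ((P1, pt1), (P2, pt2)):
        for coeff, p in ((u, P[0]), (v, P[1])):
            # (coeff * P_row3 - P_row) . (X, Y, Z, 1) = 0
            r = [coeff * P[2][j] - p[j] for j in range(4)]
            rows.append(r[:3])
            rhs.append(-r[3])
    # Normal equations: (A^T A) x = A^T b
    AtA = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    Atb = [sum(r[i] * b for r, b in zip(rows, rhs)) for i in range(3)]
    return solve3(AtA, Atb)

def solve3(A, b):
    """Solve a 3x3 linear system by Cramer's rule."""
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det(A)
    return [det([[b[i] if j == k else A[i][j] for j in range(3)]
                 for i in range(3)]) / d
            for k in range(3)]

# Idealized rectified pair: identity intrinsics, 0.1 m baseline.
P1 = [[1.0, 0, 0, 0], [0, 1.0, 0, 0], [0, 0, 1.0, 0]]
P2 = [[1.0, 0, 0, -0.1], [0, 1.0, 0, 0], [0, 0, 1.0, 0]]
point = (0.2, 0.3, 2.0)                                   # ground truth
pt1 = (point[0] / point[2], point[1] / point[2])          # left image
pt2 = ((point[0] - 0.1) / point[2], point[1] / point[2])  # right image
X, Y, Z = triangulate(P1, P2, pt1, pt2)  # recovers (0.2, 0.3, 2.0)
```

With noise-free inputs the four equations are consistent and the least-squares solution is exact; with real saliency-derived centers the normal equations give the closest fit.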
If there are multiple salient objects, they are matched from left to right according to their x coordinates.
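That left-to-right pairing rule can be sketched as a simple sort on x in each image; the center coordinates below are illustrative:

```python
# Sketch of the multi-target pairing rule: centers detected in the
# left and right images are paired left-to-right by x coordinate.

def pair_by_x(centers_left, centers_right):
    """Sort both center lists by x (tuple order) and zip into pairs."""
    return list(zip(sorted(centers_left), sorted(centers_right)))

# Two salient objects per image, listed in arbitrary detection order.
left = [(250, 120), (40, 130)]
right = [(20, 130), (230, 120)]
pairs = pair_by_x(left, right)
# [((40, 130), (20, 130)), ((250, 120), (230, 120))]
```

Each resulting pair then feeds the triangulation step independently. Note this simple rule assumes the targets keep the same left-to-right order in both views.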
In the spatial three-dimensional positioning method for a salient object of this embodiment, the targets in the images captured by the left and right cameras are obtained by image saliency detection, which facilitates acquisition of the salient object within the binocular field of view.
Claims (1)
1. A spatial three-dimensional positioning method for a salient object, wherein the left and right cameras are calibrated and rectified to obtain their intrinsic and extrinsic parameters; the salient object in the image captured by each of the two cameras is obtained by image saliency detection; the two-dimensional center coordinate is computed from the target contour; the three-dimensional coordinate of the salient object's center is then computed from the intrinsic and extrinsic parameters by the binocular stereo vision principle, achieving the spatial three-dimensional positioning of the salient object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510426644.1A CN105069781B (en) | 2015-07-17 | 2015-07-17 | A kind of spatial three-dimensional positioning method of well-marked target |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105069781A true CN105069781A (en) | 2015-11-18 |
CN105069781B CN105069781B (en) | 2017-11-28 |
Family
ID=54499139
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510426644.1A Expired - Fee Related CN105069781B (en) | 2015-07-17 | 2015-07-17 | A kind of spatial three-dimensional positioning method of well-marked target |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105069781B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104463911A (en) * | 2014-12-09 | 2015-03-25 | 上海新跃仪表厂 | Small infrared moving target detection method based on complicated background estimation |
WO2015039911A1 (en) * | 2013-09-17 | 2015-03-26 | Thomson Licensing | Method for capturing the 3d motion of an object by means of an unmanned aerial vehicle and a motion capture system |
CN104484883A (en) * | 2014-12-24 | 2015-04-01 | 河海大学常州校区 | Video-based three-dimensional virtual ship positioning and track simulation method |
- 2015-07-17: Application CN201510426644.1A granted as CN105069781B (en); status: not active, Expired - Fee Related
Non-Patent Citations (3)
Title |
---|
ZHOU Xinglin et al.: "On-line automatic detection system for vehicle running deviation based on binocular vision", China Mechanical Engineering *
ZHANG Yuanhui: "Research on calibration and trajectory tracking technology of a table-tennis robot based on real-time vision", China Doctoral Dissertations Full-text Database (Information Science and Technology) *
LI Dongyang et al.: "Contour extraction method based on gradient saliency", Computer Engineering and Applications *
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106952301A (en) * | 2017-03-10 | 2017-07-14 | 安徽大学 | A kind of RGB D saliency computational methods |
CN106952301B (en) * | 2017-03-10 | 2020-04-03 | 安徽大学 | RGB-D image significance calculation method |
CN107248138A (en) * | 2017-06-16 | 2017-10-13 | 中国科学技术大学 | Human vision conspicuousness Forecasting Methodology in reality environment |
CN107248138B (en) * | 2017-06-16 | 2020-01-03 | 中国科学技术大学 | Method for predicting human visual saliency in virtual reality environment |
Also Published As
Publication number | Publication date |
---|---|
CN105069781B (en) | 2017-11-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104463108B (en) | A kind of monocular real time target recognition and pose measuring method | |
CN106570903B (en) | A kind of visual identity and localization method based on RGB-D camera | |
KR101633620B1 (en) | Feature registration apparatus for image based localization and method the same | |
CN101924953B (en) | Simple matching method based on datum point | |
CN111151463B (en) | Mechanical arm sorting and grabbing system and method based on 3D vision | |
CN103115613B (en) | Three-dimensional space positioning method | |
CN110555878B (en) | Method and device for determining object space position form, storage medium and robot | |
EP3114647A2 (en) | Method and system for 3d capture based on structure from motion with simplified pose detection | |
CN106033614B (en) | A kind of mobile camera motion object detection method under strong parallax | |
EP3229208B1 (en) | Camera pose estimation | |
US20190073796A1 (en) | Method and Image Processing System for Determining Parameters of a Camera | |
CN104268853A (en) | Infrared image and visible image registering method | |
CN108765495B (en) | Rapid calibration method and system based on binocular vision detection technology | |
CN106203429A (en) | Based on the shelter target detection method under binocular stereo vision complex background | |
CN106352817A (en) | Non-contact four-wheel positioner and positioning method thereof | |
Huang et al. | Mobile robot localization using ceiling landmarks and images captured from an rgb-d camera | |
Wang et al. | Improvement in real-time obstacle detection system for USV | |
Pi et al. | Stereo visual SLAM system in underwater environment | |
CN205726180U (en) | Terminal guidance video image three dimensional data collection system | |
CN206074001U (en) | A kind of robot indoor locating system based on 3D video cameras | |
CN105069781A (en) | Salient object spatial three-dimensional positioning method | |
Yamaguchi | Three dimensional measurement using fisheye stereo vision | |
Petrovai et al. | Obstacle detection using stereovision for Android-based mobile devices | |
Lu et al. | Binocular stereo vision based on OpenCV | |
CN112884832B (en) | Intelligent trolley track prediction method based on multi-view vision |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
CB03 | Change of inventor or designer information |
Inventor after: Liu Zhengyi; Xie Feng; Huang Zichao |
Inventor before: Liu Zhengyi; Wang Yiheng; Huang Zichao |
|
CB03 | Change of inventor or designer information | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 2017-11-28; Termination date: 2021-07-17 |
|
CF01 | Termination of patent right due to non-payment of annual fee |