CN102254345A - Method for registering natural characteristic based on cloud computation - Google Patents

Method for registering natural characteristic based on cloud computation

Info

Publication number
CN102254345A
CN102254345A (application CN2011101808816A)
Authority
CN
China
Prior art keywords
natural feature
feature point
method
extract
Prior art date
Application number
CN2011101808816A
Other languages
Chinese (zh)
Inventor
陈明
凌晨
田丰
Original Assignee
上海大学 (Shanghai University)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海大学 (Shanghai University)
Priority to CN2011101808816A priority Critical patent/CN102254345A/en
Publication of CN102254345A publication Critical patent/CN102254345A/en


Abstract

The invention discloses a natural feature registration method based on cloud computing. The method comprises the following steps: first, extracting a set of natural feature points from an object and establishing a coordinate system; second, extracting natural feature points from a key frame captured by a camera; third, matching the natural feature points in the cloud; and finally, correcting the spatial coordinates according to the matched feature points, registering the virtual object, and rendering it for display. The advantage of the method is that commercial cloud computing raises the matching speed of natural features, so that the real-time and accuracy requirements of markerless augmented-reality registration are met.

Description

Natural feature registration method based on cloud computing

Technical field

The present invention relates to an augmented-reality registration technique, and specifically to a natural feature registration method based on cloud computing.

Background technology

As users demand ever richer interactive experiences, augmented reality (AR) applications have developed rapidly. Registration is one of the key technologies of augmented reality: it keeps a virtual object consistent with its position in the real scene, i.e. fused with it in three-dimensional space, and tracking registration is a continuous, dynamic process. When the user's viewpoint moves, the virtual object must stay registered to the real target the user sees, so the system must know the exact position and orientation of the virtual object; untimely tracking introduces large registration errors and inaccurate display results. As application demands diversify, augmented-reality environments have evolved from indoor, marker-based settings toward outdoor, markerless natural environments, and registering virtual objects accurately and in real time from natural features has become an important research direction.

Among prior studies of augmented-reality registration: Chinese patent application 201010523833.8, "A new markerless tracking registration algorithm for outdoor augmented reality", expresses the scene with two kinds of features, texture and contour, and proposes tracking and registration based on composite features; its algorithm is complex and its real-time performance is unsatisfactory. Chinese patent application 200710118266.6, "Marker-point-based three-dimensional registration method and system for augmented reality", projects generated invisible-light marker points onto a bearing surface in the real environment and photographs them with a camera fitted with an invisible-light filter, obtaining the two-dimensional screen coordinates of the marker points for registration; its demanding camera hardware limits its practical value. Chinese patent application 201010535231.4, "Multi-plane dynamic augmented reality registration based on homography matrices", automatically recognizes known positions in the real environment from specific properties of scene features to achieve three-dimensional registration, and uses CF information for multi-level dynamic registration, but it is strongly affected by the external environment.

In current natural-feature-based registration algorithms, discovering the natural features of an object remains an open problem, and neither the real-time performance nor the accuracy of these methods satisfies application requirements. The main cause is that computing power is insufficient and cannot be allocated flexibly.

Summary of the invention

In view of the problems and shortcomings of the prior art, the object of the present invention is to provide a natural feature registration method based on cloud computing. The method first extracts a set of natural feature points from the object and establishes a coordinate system; it then extracts natural feature points from a key frame captured by the camera; next it matches the natural feature points in the cloud; and finally it corrects the spatial coordinates according to the matched feature points, registers the virtual object, and renders it for display.

To achieve the above object, the present invention adopts the following technical concept: drawing on the physics of photography, computer vision, optics, and cloud computing, the speed of natural-feature-point matching is improved through distributed computation so as to meet real-time requirements.

The present invention realizes the above natural feature registration method based on cloud computing through the following technical solution, whose steps are as follows:

1. Extract natural feature points and establish a coordinate system: photograph a front view of the object to be registered with a camera and use it as the reference image, extracting natural feature points from it. Segment the region to be registered as needed and remove feature points outside that region. From the obtained natural feature points, establish a three-dimensional spatial coordinate system, and simultaneously obtain the projection matrix of the camera.
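By way of illustration only (this example and its numeric values are not part of the patent's disclosure), the projection matrix obtained in this step maps points of the established three-dimensional coordinate system to pixel coordinates. A minimal sketch, assuming a pinhole camera with focal length 800 and principal point (320, 240):

```python
def project(P, X):
    """Project a homogeneous 3D point X = (x, y, z, 1) through a 3x4
    camera projection matrix P, returning pixel coordinates (u, v)."""
    u, v, w = (sum(P[r][c] * X[c] for c in range(4)) for r in range(3))
    return (u / w, v / w)

# Assumed intrinsics: focal length f, principal point (cx, cy), no rotation.
f, cx, cy = 800.0, 320.0, 240.0
P = [[f,   0.0, cx,  0.0],
     [0.0, f,   cy,  0.0],
     [0.0, 0.0, 1.0, 0.0]]

# A point 4 units in front of the camera and 1 unit to the right:
print(project(P, (1.0, 0.0, 4.0, 1.0)))  # -> (520.0, 240.0)
```

A registered virtual object is drawn at the pixel this matrix predicts; the coordinate correction of step 4 keeps that prediction aligned with the real target.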

2. Extract natural feature points from the key frame: photograph the real environment with the camera and extract a key frame from the footage. Perform natural feature extraction on the key frame; the extracted feature points serve as the points to be matched.

3. Match in the cloud: use distributed computation, one of the key technologies of cloud computing, to match the natural feature points.

1) Group the natural feature points obtained in step 2 according to the number of match nodes. Use the number of each feature point as the raw data, i.e. <key, -1>, where key is the number of the feature point and -1 is the initial similarity value, representing "unmatched".
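The raw-data format and grouping can be sketched as follows (an illustrative toy: the function name and the round-robin assignment are assumptions, since the patent does not fix a particular partitioning scheme):

```python
def group_for_nodes(keys, n_nodes):
    """Split feature-point numbers into n_nodes groups of <key, -1>
    records; -1 is the initial similarity value meaning 'unmatched'."""
    groups = [[] for _ in range(n_nodes)]
    for i, key in enumerate(keys):
        groups[i % n_nodes].append((key, -1))
    return groups

# Seven feature points spread over three match nodes:
print(group_for_nodes([0, 1, 2, 3, 4, 5, 6], 3))
# -> [[(0, -1), (3, -1), (6, -1)], [(1, -1), (4, -1)], [(2, -1), (5, -1)]]
```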

2) Send each group's raw data <key, -1> into a match node for similarity computation. Compute the similarity against each natural feature point extracted from the reference image, obtaining records <key, similarity>, where similarity holds the number of the corresponding reference-image feature point and its similarity value.

3) Send all records <key, similarity> that share the same key into the same decision node. For each natural feature point, divide the best similarity value achieved among the reference-image feature points by the second-best; if the ratio is less than a prescribed threshold, the point is regarded as a matched feature point. For example, with a distance formula as the similarity measure, this can be expressed as:

nearest distance / second-nearest distance < threshold

This yields preliminary matching results <key, key_match> between the feature points of the image to be matched and those of the reference image, where key_match is the number of the reference-image feature point matched to feature point key. From these preliminary results, mismatches are eliminated to obtain the correct matching results.
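The nearest/second-nearest decision described above can be sketched in Python (an illustrative toy: the helper name, the 2-D descriptors, and the ratio 0.8 are assumptions; real SIFT descriptors are 128-dimensional):

```python
import math

def match_features(query, reference, ratio=0.8):
    """For each query descriptor, find the nearest and second-nearest
    reference descriptors by Euclidean distance, and accept the match
    only if nearest / second_nearest < ratio. Returns (key, key_match)
    pairs, key_match being the index of the matched reference point."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    matches = []
    for key, q in enumerate(query):
        ranked = sorted(range(len(reference)), key=lambda j: dist(q, reference[j]))
        d1 = dist(q, reference[ranked[0]])
        d2 = dist(q, reference[ranked[1]])
        if d2 > 0 and d1 / d2 < ratio:
            matches.append((key, ranked[0]))
    return matches

ref = [(0.0, 0.0), (10.0, 10.0), (20.0, 0.0)]
qry = [(0.5, 0.1),   # clearly closest to ref[0] -> accepted
       (15.0, 5.0)]  # equidistant from ref[1] and ref[2] -> rejected
print(match_features(qry, ref))  # -> [(0, 0)]
```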

4. Correct the spatial coordinates, register the virtual object, and render it for display: according to the matching relationship between the feature points and those of the reference image, correct the three-dimensional spatial coordinate system and compute the virtual object in this coordinate system, thereby registering it so that it occupies the correct three-dimensional position, and render it for display. When finished, return to step 2 for a new round of the loop.
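How the matched pairs can drive the coordinate correction may be illustrated with a least-squares 2-D similarity transform (an assumed simplification: the patent does not specify the correction model, and a full system would update a 3-D pose; points are encoded as complex numbers, with dst ≈ a·src + b):

```python
def fit_similarity(src, dst):
    """Least-squares 2D similarity transform (rotation + scale +
    translation) mapping reference points src onto observed points dst,
    both given as complex numbers: dst ≈ a * src + b."""
    n = len(src)
    ms, md = sum(src) / n, sum(dst) / n
    num = sum((d - md) * (s - ms).conjugate() for s, d in zip(src, dst))
    den = sum(abs(s - ms) ** 2 for s in src)
    a = num / den          # complex a encodes rotation and scale
    b = md - a * ms        # complex b encodes translation
    return a, b

# Synthetic check: rotate 90 degrees (a = 1j) and translate by (1 + 2j).
src = [0 + 0j, 1 + 0j, 0 + 1j]
dst = [1j * s + (1 + 2j) for s in src]
a, b = fit_similarity(src, dst)
print(abs(a - 1j) < 1e-9, abs(b - (1 + 2j)) < 1e-9)  # -> True True
```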

Compared with the prior art, the present invention has the following outstanding features and remarkable advantages: it uses distributed computation in the cloud to match the natural feature points, so it is not limited by insufficient computing power, allocates resources flexibly, greatly improves computation speed, and achieves real-time performance. Unlike grid computing, cloud computing is commercialized: at any time and place, as long as the user is connected to the network, the computing power of a supercomputer is available, which makes mobile augmented reality feasible. The invention is also highly general: for natural-feature-point extraction, for example, existing mature algorithms such as SIFT or FAST can be used, and if a better extraction algorithm appears in the future, only that module needs to be replaced for a simple upgrade.
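The replaceable-module design can be sketched as a common extractor interface (everything here is hypothetical: the class names and the toy brightness-threshold "detectors" merely stand in for real SIFT/FAST implementations):

```python
class SiftLikeExtractor:
    """Stand-in for a SIFT module: the toy rule reports every pixel
    brighter than 128 as a 'keypoint' (row, col)."""
    def extract(self, image):
        return [(r, c) for r, row in enumerate(image)
                for c, v in enumerate(row) if v > 128]

class FastLikeExtractor:
    """Stand-in for a FAST module with a stricter toy threshold."""
    def extract(self, image):
        return [(r, c) for r, row in enumerate(image)
                for c, v in enumerate(row) if v > 200]

def extract_keyframe_features(image, extractor):
    """The rest of the pipeline calls only extractor.extract(image),
    so the detector module can be upgraded in isolation."""
    return extractor.extract(image)

img = [[0, 150],
       [220, 90]]
print(extract_keyframe_features(img, SiftLikeExtractor()))  # -> [(0, 1), (1, 0)]
print(extract_keyframe_features(img, FastLikeExtractor()))  # -> [(1, 0)]
```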

Description of drawings

Fig. 1 is a flow chart of the cloud-computing-based natural feature registration method of the present invention;

Fig. 2 is a schematic diagram of cloud-based matching according to the present invention.

Embodiment

An embodiment of the present invention is described in detail below with reference to the accompanying drawings.

In a preferred embodiment of the present invention, as shown in Fig. 1, the cloud-computing-based natural feature registration method comprises the following steps:

1. Extract the natural feature point set and establish a coordinate system;

2. Extract natural feature points from the key frame;

3. Match in the cloud;

4. Correct the spatial coordinates, register the virtual object, and render it for display.

In the above step 1, the natural feature point set is extracted and the coordinate system established; the concrete steps are as follows:

(1) Photograph a front view of the object to be registered with the camera and use it as the reference image;

(2) Extract the natural feature points with the SIFT algorithm;

(3) Segment the region that needs to be registered;

(4) Remove the SIFT feature points outside the registration region;

(5) Obtain the projection matrix of the camera;

(6) Establish the three-dimensional spatial coordinate system from the obtained SIFT feature points.

In the above step 2, natural feature points are extracted from the key frame; the concrete steps are as follows:

(7) Photograph the real environment with the camera and extract a key frame from the footage;

(8) Perform SIFT feature-point extraction on the key frame; the extracted points serve as the points to be matched.

In the above step 3, matching is performed in the cloud; Fig. 2 is a schematic diagram of cloud-based matching with N match nodes and N decision nodes. The concrete steps are as follows:

(9) Group the SIFT feature points obtained in step 2 according to the number of match nodes; in Fig. 2, for example, the number of match nodes is 3, so all SIFT features are divided into 3 groups of raw data;

(10) Denote the number of each SIFT feature point as key, form the raw data <key, -1>, and send it into a match node;

(11) In the match node, compute the Euclidean distance between each incoming SIFT feature point and every feature point of the reference image;

(12) Record each Euclidean distance together with the number of the corresponding reference-image SIFT feature point as similarity;

(13) Sort the records <key, similarity> output by the match nodes according to key, and send the records sharing a key, i.e. the records of the same SIFT feature point, into the same decision node;
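The sort-and-route of step (13) resembles the shuffle phase of a MapReduce job; a sketch (the function name and the key-modulo routing rule are assumptions):

```python
from itertools import groupby

def shuffle_to_decision_nodes(records, n_nodes):
    """Sort <key, similarity> records emitted by the match nodes and
    route all records sharing a key to one decision node (key % n_nodes),
    so each decision node sees every candidate for its feature points."""
    buckets = [[] for _ in range(n_nodes)]
    for key, grp in groupby(sorted(records), key=lambda rec: rec[0]):
        buckets[key % n_nodes].extend(grp)
    return buckets

recs = [(1, 0.9), (0, 0.2), (1, 0.4), (2, 0.7), (0, 0.5)]
print(shuffle_to_decision_nodes(recs, 2))
# -> [[(0, 0.2), (0, 0.5), (2, 0.7)], [(1, 0.4), (1, 0.9)]]
```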

(14) In the decision node, sort the incoming records <key, similarity> by the magnitude of the Euclidean distance;

(15) Judge whether the nearest distance divided by the second-nearest distance is less than the preset threshold; if so, the SIFT feature point matches the reference-image SIFT feature point corresponding to the nearest distance, denoted key_match; go to step (16). If not, go to step (14) and judge another SIFT feature point;

(16) For the matched SIFT feature points key_match, eliminate mismatches with RANSAC to obtain the correct matching result.
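The mismatch elimination of step (16) can be illustrated with a toy RANSAC (assumed simplifications: a pure-translation model and hand-made point pairs; a real system would fit a homography with the RANSAC routine of a vision library):

```python
import random

def ransac_filter(pairs, tol=1.0, iters=100, seed=0):
    """Keep the largest consensus set: repeatedly hypothesize the offset
    implied by one randomly chosen match and count how many other
    matches agree with it within tol pixels."""
    rng = random.Random(seed)
    best = []
    for _ in range(iters):
        (sx, sy), (dx, dy) = rng.choice(pairs)
        ox, oy = dx - sx, dy - sy
        inliers = [((px, py), (qx, qy)) for (px, py), (qx, qy) in pairs
                   if abs(qx - px - ox) < tol and abs(qy - py - oy) < tol]
        if len(inliers) > len(best):
            best = inliers
    return best

good = [((x, y), (x + 5, y + 3)) for x, y in [(0, 0), (1, 2), (4, 1)]]
bad = [((2, 2), (9, 9))]  # a mismatch: its offset (7, 7) disagrees
print(len(ransac_filter(good + bad)))  # -> 3
```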

In the above step 4, the spatial coordinates are corrected and the virtual object is registered and rendered for display; the concrete steps are as follows:

(17) According to the correct matching relationship, obtained in step (16), between the SIFT feature points and those of the reference image, correct the three-dimensional spatial coordinate system;

(18) Compute the coordinates of the virtual object in the corrected coordinate system;

(19) Render the virtual object, output it on the display device, and go to step (7).

Claims (5)

1. A natural feature registration method based on cloud computing, characterized in that the operation steps are as follows: 1. extract the natural feature point set and establish a coordinate system; 2. extract natural feature points from a key frame captured by a camera; 3. match the natural feature points in the cloud; 4. correct the spatial coordinates, register the virtual object, and render it for display.
2. The natural feature registration method based on cloud computing according to claim 1, characterized in that the method of step 1 for extracting the natural feature point set and establishing the coordinate system is: photograph a front view of the object to be registered with a camera and use it as the reference image, extract natural feature points from it, and establish the three-dimensional spatial coordinate system from these feature points.
3. The natural feature registration method based on cloud computing according to claim 1, characterized in that the method of step 2 for extracting natural feature points from the key frame captured by the camera is: extract key frame images from the real images captured by the camera, and extract natural feature points from those images as the points to be matched.
4. The natural feature registration method based on cloud computing according to claim 1, characterized in that the method of step 3 for matching the natural feature points in the cloud is: group the natural feature points, send each group into a match node for similarity computation, then send the output into a decision node for computation, finally obtaining preliminary matching results between the feature points of the image to be matched and those of the reference image; from the preliminary results, eliminate mismatches to obtain the correct matching results.
5. The natural feature registration method based on cloud computing according to claim 1, characterized in that the method of step 4 for correcting the spatial coordinates and registering and rendering the virtual object is: from the matching relationship between the feature points and those of the reference image, correct the three-dimensional spatial coordinates and register the virtual object so that it occupies the correct three-dimensional position, and render it for display.
CN2011101808816A 2011-06-30 2011-06-30 Method for registering natural characteristic based on cloud computation CN102254345A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011101808816A CN102254345A (en) 2011-06-30 2011-06-30 Method for registering natural characteristic based on cloud computation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2011101808816A CN102254345A (en) 2011-06-30 2011-06-30 Method for registering natural characteristic based on cloud computation

Publications (1)

Publication Number Publication Date
CN102254345A true CN102254345A (en) 2011-11-23

Family

ID=44981586

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2011101808816A CN102254345A (en) 2011-06-30 2011-06-30 Method for registering natural characteristic based on cloud computation

Country Status (1)

Country Link
CN (1) CN102254345A (en)


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101996255A (en) * 2010-11-22 2011-03-30 何吴迪 Method for constructing required-interface oriented cloud computing searching system


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
吴忠年 (Wu Zhongnian): "Research and Implementation of Virtual Panoramic Space and Binocular-Vision Three-Dimensional Reconstruction", China Master's Theses Full-text Database, Information Science and Technology, no. 5, 15 May 2009 (2009-05-15), pages 138-852 *
章国锋 (Zhang Guofeng) et al.: "Camera Calibration Based on Structure and Motion Recovery for Augmented Video", Chinese Journal of Computers, vol. 29, no. 12, 12 December 2006 (2006-12-12) *
陈明 (Chen Ming) et al.: "Video Object Tracking Algorithm in Augmented Reality", Computer Engineering, vol. 36, no. 12, 30 June 2010 (2010-06-30), pages 229-231 *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014012444A1 (en) * 2012-07-17 2014-01-23 中兴通讯股份有限公司 Method, device and system for realizing augmented reality information sharing
CN102880854B (en) * 2012-08-16 2015-02-18 北京理工大学 Distributed processing and Hash mapping-based outdoor massive object identification method and system
CN102880854A (en) * 2012-08-16 2013-01-16 北京理工大学 Distributed processing and Hash mapping-based outdoor massive object identification method and system
CN102831405B (en) * 2012-08-16 2014-11-26 北京理工大学 Method and system for outdoor large-scale object identification on basis of distributed and brute-force matching
CN102831405A (en) * 2012-08-16 2012-12-19 北京理工大学 Method and system for outdoor large-scale object identification on basis of distributed and brute-force matching
US9704247B2 (en) 2013-02-06 2017-07-11 Alibaba Group Holding Limited Information processing method and system
US9430835B2 (en) 2013-02-06 2016-08-30 Alibaba Group Holding Limited Information processing method and system
US9195871B2 (en) 2013-02-06 2015-11-24 Alibaba Group Holding Limited Information processing method and system
US10121099B2 (en) 2013-02-06 2018-11-06 Alibaba Group Holding Limited Information processing method and system
CN103177468A (en) * 2013-03-29 2013-06-26 渤海大学 Three-dimensional motion object augmented reality registration method based on no marks
CN103500452A (en) * 2013-10-12 2014-01-08 杭州师范大学 Scenic spot scenery moving augmented reality method based on space relationship and image analysis
CN103929669A (en) * 2014-04-30 2014-07-16 成都理想境界科技有限公司 Interactive video generator, player, generating method and playing method
CN104268519B (en) * 2014-09-19 2018-03-30 袁荣辉 Image recognition terminal and its recognition methods based on pattern match
CN104268519A (en) * 2014-09-19 2015-01-07 袁荣辉 Image recognition terminal based on mode matching and recognition method of image recognition terminal


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20111123