CN101799938B - Method for producing real object simulated computer image - Google Patents


Info

Publication number
CN101799938B
CN101799938B
Authority
CN
China
Prior art keywords
point
viewing angle
data point
point set
nearest neighbor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201010158363XA
Other languages
Chinese (zh)
Other versions
CN101799938A (en)
Inventor
何炳蔚
林泽明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujian Qishan Lake Medical Technology Co ltd
Unnamed Fujian Investment Group Co ltd
Original Assignee
Fuzhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuzhou University filed Critical Fuzhou University
Priority to CN201010158363XA priority Critical patent/CN101799938B/en
Publication of CN101799938A publication Critical patent/CN101799938A/en
Application granted granted Critical
Publication of CN101799938B publication Critical patent/CN101799938B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Abstract

The invention relates to a method for producing a simulated computer image of a real object, characterized by the following steps: computing the curvature of each data point; searching for corresponding data-point pairs; and computing the transformation matrices R and T. The invention helps the user render the measured object as an efficient and accurate three-dimensional simulation, and achieves high registration accuracy.

Description

A method for producing a simulated computer image of a real object
Technical field
The present invention relates to a method for producing a simulated computer image of a real object.
Background art
The production of simulated computer images of real objects has important applications in actual manufacturing; its key is the accuracy of registering multi-view three-dimensional point-cloud data. A three-dimensional optical measuring device can only see one side of an object at a time, so obtaining complete three-dimensional data requires measurements from several angles that together cover the whole object surface. Because the data acquired from each angle are expressed in different coordinate systems, the views must be registered, that is, unified into a single global coordinate system. Clearly, the registration accuracy directly determines the measurement accuracy of the whole system.
Traditional registration methods attach positioning labels to the measured object or use a high-precision rotary table. Fixed reference balls and positioning labels occlude part of the object and are generally attached to relatively flat surface regions. Portable optical scanners apply special circular labels to the object or to the fixture holding it; these serve the same role as fixed reference balls. Registration of the data then relies on three or more non-collinear labels common to two successive views. A rotary table allows measurement data to be registered directly, but the data points on the bottom of the object cannot be acquired. These traditional registration techniques are simple, but they cannot achieve accurate localization and their degree of automation is low.
In the automatic registration of three-dimensional point clouds, the most widely used method is the ICP (Iterative Closest Point) algorithm proposed by Besl et al. in the paper "A method for registration of 3-D shapes", which establishes correspondences between point sets by repeatedly finding, for each point in the data point set, the closest point in the model point set. However, ICP requires the data set to be a subset of the other data set, places high demands on the initial position, and at every iteration must compute, for each target point, its corresponding closest point in the reference set, so the computational cost is large and the efficiency low. Chen et al., in the paper "Object modeling by registration of multiple range images", replace the point-to-point distance with the distance along the normal direction as the matching evaluation function, so the points in the two views need not correspond one to one; but this method requires nonlinear least squares, and its computational efficiency is low. Masuda et al. propose random sampling of the point set with the least-median-of-squares error as the matching criterion, but this method must resample after every iteration. Soon-Yang et al. proposed a CCP (Contractive Projection Point) algorithm, but it still cannot solve complex multi-view registration problems. Most of the above algorithms are suitable only for registration between a data point set and a model with an obvious correspondence, and all place high demands on the initial position. How to address these problems is the subject of the present invention.
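The ICP baseline criticized above is easy to state compactly. The sketch below is a minimal illustrative implementation of Besl and McKay's loop — brute-force nearest-neighbour correspondences followed by a closed-form rigid fit (here via SVD) — and is not the method of the present invention; the function name, iteration count, and SVD-based inner step are our own choices.

```python
# Minimal single-loop sketch of baseline ICP (illustrative only, not the
# patent's method): alternate nearest-neighbour matching with a
# closed-form rigid fit (Kabsch/SVD) until convergence.
import numpy as np

def icp(data, model, iters=20):
    """Align `data` (N x 3) to `model` (M x 3); returns R, t with
    data @ R.T + t approximately equal to the matched model points."""
    R, t = np.eye(3), np.zeros(3)
    src = data.copy()
    for _ in range(iters):
        # 1. closest model point for every data point (brute force, O(N*M))
        d2 = ((src[:, None, :] - model[None, :, :]) ** 2).sum(-1)
        corr = model[d2.argmin(axis=1)]
        # 2. best rigid transform for these correspondences (Kabsch/SVD)
        mu_s, mu_c = src.mean(0), corr.mean(0)
        H = (src - mu_s).T @ (corr - mu_c)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        Ri = Vt.T @ D @ U.T
        ti = mu_c - Ri @ mu_s
        src = src @ Ri.T + ti
        R, t = Ri @ R, Ri @ t + ti   # accumulate the global transform
    return R, t
```

As the background above notes, this loop needs a good initial position: when the starting misalignment is large, the nearest-neighbour correspondences are mostly wrong and the iteration can converge to a poor local minimum.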
Summary of the invention
The object of the present invention is to provide a method for producing a simulated computer image of a real object that both helps the user render the measured object as an efficient and accurate three-dimensional simulation and achieves high registration accuracy.
The solution is as follows: the invention proposes a curvature-based registration method that makes combined use of the properties of the total curvature, searching for corresponding data-point pairs by means of the Hausdorff distance between view point sets and a normal-vector invariant, and thereby realizes the production of a simulated computer image of a real object. The method is characterized by the following three steps: compute the curvature of each data point; search for corresponding data-point pairs; compute the transformation matrices R and T.
The advantages of the invention are:
(1) the registration uses the geometric features inherent in the data points themselves, so the registration of the data points is automatic and its accuracy is high;
(2) there is no position requirement on the point clouds to be registered and no containment (subset) relation is needed; a sufficient overlapping region suffices;
(3) for surface point clouds with large curvature variation, good registration can be achieved even when the overlapping region is small;
(4) no labels, positioning balls, or other markers are required, nor a high-precision rotary table.
Description of drawings
Fig. 1 is a flowchart of the corresponding data-point-pair search of the present invention
Fig. 2 is embodiment example figure 1 of the present invention
Fig. 3 is embodiment example figure 2 of the present invention
Fig. 4 is embodiment example figure 3 of the present invention.
Embodiment
With reference to Fig. 1, Fig. 2, Fig. 3 and Fig. 4, a method for producing a simulated computer image of a real object is characterized by the following steps:
1. Compute the curvature of each data point
1.1 Determine a three-dimensional coordinate frame for each of at least two viewing orientations of the physical model. In each frame, determine the coordinate parameters of every data point on the model surface, so that the surface points acquired from each viewing orientation form a corresponding view point set.
1.2 For each view point set obtained above, select the K nearest neighbors around each data point in the set and record the coordinate parameters of those neighbors.
1.3 Use the K nearest neighbors to compute the curvature of each data point.
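The patent computes each point's curvature from its K nearest neighbours (steps 1.2–1.3) but fixes no formula. As a hedged stand-in, the sketch below uses the PCA "surface variation" of each neighbourhood — the smallest covariance eigenvalue over the eigenvalue sum — which is zero on a plane and grows with local curvature; the function names and the choice of this estimator are our own assumptions, not the patent's.

```python
# Sketch of steps 1.2-1.3: per-point curvature estimation from K nearest
# neighbours.  Uses PCA "surface variation" as a simple curvature proxy
# (an assumption; the patent names Gaussian curvature without a formula).
import numpy as np

def knn(points, i, k):
    """Indices of the k nearest neighbours of points[i] (excluding i)."""
    d2 = ((points - points[i]) ** 2).sum(axis=1)
    return np.argsort(d2)[1:k + 1]

def curvature(points, k=8):
    """Per-point surface-variation curvature estimate for an N x 3 cloud."""
    curv = np.empty(len(points))
    for i in range(len(points)):
        nbhd = points[knn(points, i, k)]
        cov = np.cov(nbhd.T)              # 3x3 neighbourhood covariance
        lam = np.linalg.eigvalsh(cov)     # eigenvalues in ascending order
        curv[i] = lam[0] / lam.sum()      # 0 for a perfectly flat patch
    return curv
```

A planar patch yields values near zero while a curved patch yields strictly positive values, which is all the later Hausdorff comparison of step 2 requires of the estimator.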
2. Search for corresponding data-point pairs
2.1 Choose a data point p_i in view point set A.
2.2 In view point set B, choose the m data points whose normal-vector-invariant Euclidean distance satisfies DL(p_i, q_j) < ε; if no such data point can be found, return to step 2.1.
2.3 Compute the Hausdorff distance between the curvature of data point p_i in view point set A and that of each of the m data points in view point set B. For p_i and each candidate data point q_j among the m points, the Hausdorff distance is computed as follows:
(1) Select K nearest neighbors around data point p_i to form the neighbor point set Nbhd(p_i), and K nearest neighbors around data point q_j to form the neighbor point set Nbhd(q_j);
(2) Using the formula ‖p − q‖ = |K_p − K_q|, where K_p and K_q are the Gaussian curvatures of the points in the two neighbor sets, compute the distance from each neighbor p_ik in Nbhd(p_i) to every neighbor in Nbhd(q_j). Select the neighbor q_jn in Nbhd(q_j) closest to p_ik; p_ik and q_jn then correspond, forming a neighbor pair (p_ik, q_jn);
(3) Take the largest distance over all neighbor pairs as the one-way Hausdorff distance of curvature h(p_i, q_j) from data point p_i in view point set A to data point q_j in view point set B;
(4) Likewise obtain the one-way Hausdorff distance of curvature h(q_j, p_i) from data point q_j in view point set B to data point p_i in view point set A;
(5) Take the maximum of h(p_i, q_j) and h(q_j, p_i) as the Hausdorff distance of curvature H(p_i, q_j) between p_i in view point set A and q_j in view point set B;
2.4 Among the m Hausdorff distances computed above, take the minimum, denoted min. If min < ξ, the data-point pair (p_i, q_j) is a corresponding pair (b_t, b_t′) of view point sets A and B; otherwise, return to step 2.1.
2.5 Choose the next data point p_{i+1} in view point set A and repeat steps 2.2 to 2.4 to find its corresponding data point q_{j+1} in view point set B, forming another corresponding pair (b_{t+1}, b_{t+1}′); in this way, compute every corresponding data-point pair of view point sets A and B one by one.
The flowchart of step 2 is shown in Figure 1.
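The Hausdorff-distance test of step 2.3 reduces, for each candidate pair, to a max–min computation over the curvature values of the two neighbourhoods. A minimal sketch follows; the function names are our own, and `curv_a` / `curv_b` are assumed to hold the precomputed curvatures of the neighbours in Nbhd(p_i) and Nbhd(q_j), with |K_p − K_q| as the point-to-point distance per the formula in step 2.3(2).

```python
# Sketch of step 2.3: Hausdorff distance of curvature between the
# neighbourhoods of two candidate points p_i and q_j.

def one_way_hausdorff(curv_a, curv_b):
    """h(p_i, q_j): for each neighbour curvature in Nbhd(p_i), find the
    closest curvature in Nbhd(q_j); return the largest such distance
    (steps 2.3(2)-(3))."""
    return max(min(abs(ka - kb) for kb in curv_b) for ka in curv_a)

def hausdorff(curv_a, curv_b):
    """H(p_i, q_j): the larger of the two one-way distances (step 2.3(5))."""
    return max(one_way_hausdorff(curv_a, curv_b),
               one_way_hausdorff(curv_b, curv_a))
```

Per step 2.4, a candidate q_j is accepted as the correspondent of p_i when the minimum of H(p_i, q_j) over the m candidates falls below the threshold ξ.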
3. Compute the transformation matrices R and T
3.1 From the corresponding data-point pairs obtained above, solve for the coordinate transformation between the three-dimensional frames of the two viewing orientations using the four-tuple (quaternion) method, obtaining the rotation matrix R and the translation matrix T.
3.2 Using the rotation matrix R and translation matrix T obtained above, transform the data points of view point set A and view point set B into the coordinate frame of the reference point set, completing the registration of the data points from the two views and thereby forming the simulated image on the computer.
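Step 3.1 names the "four-tuple" (quaternion) method without giving formulas. The sketch below follows Horn's closed-form quaternion solution for absolute orientation, a standard reading of that step; the function name and array layout are our own assumptions.

```python
# Sketch of step 3.1: solve for R and T from corresponding point pairs
# via the quaternion ("four-tuple") method, in the spirit of Horn's
# closed-form absolute-orientation solution.  A and B are N x 3 arrays
# of corresponding points (row i of A pairs with row i of B); returns
# R, T such that B is approximately A @ R.T + T.
import numpy as np

def quaternion_register(A, B):
    a = A - A.mean(axis=0)
    b = B - B.mean(axis=0)
    S = a.T @ b                                   # 3x3 cross-covariance
    Sxx, Sxy, Sxz = S[0]
    Syx, Syy, Syz = S[1]
    Szx, Szy, Szz = S[2]
    # Symmetric 4x4 matrix whose top eigenvector is the optimal quaternion
    N = np.array([
        [Sxx + Syy + Szz, Syz - Szy,        Szx - Sxz,        Sxy - Syx],
        [Syz - Szy,       Sxx - Syy - Szz,  Sxy + Syx,        Szx + Sxz],
        [Szx - Sxz,       Sxy + Syx,       -Sxx + Syy - Szz,  Syz + Szy],
        [Sxy - Syx,       Szx + Sxz,        Syz + Szy,       -Sxx - Syy + Szz]])
    w, x, y, z = np.linalg.eigh(N)[1][:, -1]      # eigenvector of largest eigenvalue
    R = np.array([                                # quaternion -> rotation matrix
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])
    T = B.mean(axis=0) - R @ A.mean(axis=0)
    return R, T
```

With exact correspondences the recovery is exact, which is why step 2's careful pair search matters: R and T are only as good as the pairs fed into this solve.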
A practical embodiment of the invention is as follows:
Fig. 2 and Fig. 3 show two views of a physical object, from which two view point sets are generated. After the two view point sets are fed into the program, the system automatically performs the curvature computation, the corresponding data-point search, the coordinate transformation, and related operations, and finally outputs the well-registered three-dimensional data together with the rotation matrix R and translation matrix T, forming the simulated image on the computer; the image is shown in Fig. 4. The registration error is 0.0812 mm, i.e. the registration accuracy is high.

Claims (1)

1. A method for producing a simulated computer image of a real object, characterized by the following steps:
(1) Compute the curvature of each data point
(1.1) Determine a three-dimensional coordinate frame for each of at least two viewing orientations of the physical model; in each frame, determine the coordinate parameters of every data point on the model surface, so that the surface points acquired from each viewing orientation form a corresponding view point set;
(1.2) For each view point set obtained above, select the K nearest neighbors around each data point in the set and record the coordinate parameters of those neighbors;
(1.3) Use the K nearest neighbors to compute the curvature of each data point;
(2) Search for corresponding data-point pairs
(2.1) Choose a data point p_i in view point set A;
(2.2) In view point set B, choose the m data points whose normal-vector-invariant Euclidean distance satisfies DL(p_i, q_j) < ε; if no such data point can be found, return to step (2.1);
(2.3) Compute the Hausdorff distance between the curvature of data point p_i in view point set A and that of each of the m data points in view point set B; for p_i and each candidate data point q_j among the m points, the Hausdorff distance is computed as follows:
(1′) Select K nearest neighbors around data point p_i to form the neighbor point set Nbhd(p_i), and K nearest neighbors around data point q_j to form the neighbor point set Nbhd(q_j);
(2′) Using the formula ‖p − q‖ = |K_p − K_q|, where K_p and K_q are the Gaussian curvatures of the points in the two neighbor sets, compute the distance from each neighbor p_ik in Nbhd(p_i) to every neighbor in Nbhd(q_j); select the neighbor q_jn in Nbhd(q_j) closest to p_ik; p_ik and q_jn then correspond, forming a neighbor pair (p_ik, q_jn);
(3′) Take the largest distance over all neighbor pairs as the one-way Hausdorff distance of curvature h(p_i, q_j) from data point p_i in view point set A to data point q_j in view point set B;
(4′) Likewise obtain the one-way Hausdorff distance of curvature h(q_j, p_i) from data point q_j in view point set B to data point p_i in view point set A;
(5′) Take the maximum of h(p_i, q_j) and h(q_j, p_i) as the Hausdorff distance of curvature H(p_i, q_j) between p_i in view point set A and q_j in view point set B;
(2.4) Among the m Hausdorff distances computed above, take the minimum, denoted min; if min < ξ, the data-point pair (p_i, q_j) is a corresponding pair (b_t, b_t′) of view point sets A and B; otherwise, return to step (2.1);
(2.5) Choose the next data point p_{i+1} in view point set A and repeat steps (2.2) to (2.4) to find its corresponding data point q_{j+1} in view point set B, forming another corresponding pair (b_{t+1}, b_{t+1}′); in this way, compute every corresponding data-point pair of view point sets A and B one by one;
(3) Compute the transformation matrices R and T
(3.1) From the corresponding data-point pairs obtained above, solve for the coordinate transformation between the three-dimensional frames of the two viewing orientations using the four-tuple (quaternion) method, obtaining the rotation matrix R and the translation matrix T;
(3.2) Using the rotation matrix R and translation matrix T obtained above, transform the data points of view point set A and view point set B into the coordinate frame of the reference point set, completing the registration of the data points from the two views and thereby forming the simulated image on the computer.
CN201010158363XA 2010-04-28 2010-04-28 Method for producing real object simulated computer image Active CN101799938B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201010158363XA CN101799938B (en) 2010-04-28 2010-04-28 Method for producing real object simulated computer image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201010158363XA CN101799938B (en) 2010-04-28 2010-04-28 Method for producing real object simulated computer image

Publications (2)

Publication Number Publication Date
CN101799938A CN101799938A (en) 2010-08-11
CN101799938B true CN101799938B (en) 2012-04-25

Family

ID=42595610

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201010158363XA Active CN101799938B (en) 2010-04-28 2010-04-28 Method for producing real object simulated computer image

Country Status (1)

Country Link
CN (1) CN101799938B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1919157A (en) * 2006-09-15 2007-02-28 李晓峰 Manufacturing method of fine personalized skull model capable of describing teeth occluding relation
CN1967596A (en) * 2006-08-14 2007-05-23 东南大学 Construction method of triangulation of 3D scattered point set in 3D scan system
CN101151730A (en) * 2005-04-13 2008-03-26 (株)赛丽康 Separation type unit pixel having 3D structure for image sensor and manufacturing method thereof

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000076482A (en) * 1998-09-02 2000-03-14 Meidensha Corp Three-dimensional computer graphics compositing method
JP2000194863A * 1998-12-28 2000-07-14 Nippon Telegr & Teleph Corp <Ntt> Three-dimensional structure acquisition/restoration method and device and storage medium recording three-dimensional structure acquisition/restoration program

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101151730A (en) * 2005-04-13 2008-03-26 (株)赛丽康 Separation type unit pixel having 3D structure for image sensor and manufacturing method thereof
CN1967596A (en) * 2006-08-14 2007-05-23 东南大学 Construction method of triangulation of 3D scattered point set in 3D scan system
CN1919157A (en) * 2006-09-15 2007-02-28 李晓峰 Manufacturing method of fine personalized skull model capable of describing teeth occluding relation

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JP Laid-Open No. 2000-194863A 2000.07.14
JP Laid-Open No. 2000-76482A 2000.03.14

Also Published As

Publication number Publication date
CN101799938A (en) 2010-08-11


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20230302

Address after: Room 501-3, Floor 5, Building 14, Phase I, Innovation Park, No. 3, Keji East Road, High-tech Zone, Fuzhou, Fujian 350100

Patentee after: Fujian Qishan Lake Medical Technology Co.,Ltd.

Address before: 350100 Room 501, Floor 5, Building 14, Phase I, "Haixi High-tech Industrial Park", High-tech Zone, Fuzhou City, Fujian Province (located at No. 3, Keji East Road, Shangjie Town, Minhou County)

Patentee before: Unnamed (Fujian) Investment Group Co.,Ltd.

Effective date of registration: 20230302

Address after: 350100 Room 501, Floor 5, Building 14, Phase I, "Haixi High-tech Industrial Park", High-tech Zone, Fuzhou City, Fujian Province (located at No. 3, Keji East Road, Shangjie Town, Minhou County)

Patentee after: Unnamed (Fujian) Investment Group Co.,Ltd.

Address before: 350108 new campus of Fuzhou University, No. 2, Xue Yuan Road, University Town, Minhou street, Minhou, Fujian.

Patentee before: FUZHOU University