CN102506758A - Object surface three-dimensional morphology multi-sensor flexible dynamic vision measurement system and method - Google Patents

Object surface three-dimensional morphology multi-sensor flexible dynamic vision measurement system and method

Info

Publication number
CN102506758A
Authority
CN
China
Prior art keywords
dimensional
camera
planar target
field of view
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2011103089051A
Other languages
Chinese (zh)
Other versions
CN102506758B (en)
Inventor
刘震
张广军
樊巧云
黄邦奎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN 201110308905 priority Critical patent/CN102506758B/en
Publication of CN102506758A publication Critical patent/CN102506758A/en
Application granted granted Critical
Publication of CN102506758B publication Critical patent/CN102506758B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a multi-sensor flexible dynamic vision measurement system and method for object surface three-dimensional morphology. The system comprises a plurality of three-dimensional optical measuring heads and planar targets arranged around an object to be measured, a controller, an image acquisition system, and a core computer. Each three-dimensional optical measuring head comprises a grating-type binocular vision sensor and a wide-field camera. The core computer lights the feature points of each planar target through the controller, each wide-field camera measures the planar targets around it, and each grating-type binocular vision sensor measures the surface three-dimensional morphology of the object to be measured; with the planar targets in the common field of view of the wide-field cameras as an intermediary, the local three-dimensional data measured by the grating-type binocular vision sensors are unified into a global coordinate system. The invention solves the problem in conventional multi-vision-sensor measurement systems that, because the vision sensors are rigidly connected, the on-site vibration environment strongly affects dynamic measurement accuracy.

Description

Multi-sensor flexible dynamic vision measurement system and method for object surface three-dimensional morphology
Technical field
The present invention relates to dynamic vision measurement of object surface three-dimensional morphology, and in particular to a multi-sensor flexible dynamic vision measurement system and method for object surface three-dimensional morphology.
Background technology
Large mechanical components (hereinafter referred to as large components) are key core parts of major national equipment, such as the various rotating shafts of large ships, the rotating shafts and rotors of large generator sets, and the rotors and blades used in aerospace. At present, the key difficulties limiting dynamic vision measurement of large components are: the components are large in size, and the measurement site often presents complicated and harsh working conditions (hereinafter referred to as a complicated site environment) such as strong vibration, components in a high-temperature state, and narrow measurement space. These characteristics require a dynamic measurement system to have a large measurement range, resist strong on-site vibration, measure high-temperature components, and work in real time.
At present, detection of the three-dimensional surface morphology of large components falls into two categories: contact and non-contact. Contact detection techniques mainly include manual gauging with jigs and coordinate measuring machines. Manual gauging with jigs is simple to operate and low in cost and is still widely adopted in production, but its efficiency and accuracy are low and it cannot realize dynamic measurement. The coordinate measuring machine is common equipment for three-dimensional coordinate measurement and has good accuracy, but its measurement range is limited and it likewise cannot realize dynamic measurement. Non-contact detection techniques mainly include the laser tracker, 3D laser range finder, total station, theodolite, and vision detection technology. The laser tracker, 3D laser range finder, total station, and theodolite are suitable for general field conditions and offer a large measurement range and high accuracy, but their main drawback is low measurement efficiency: only one point can be measured at a time, so high-density dynamic measurement of the three-dimensional surface morphology of large components cannot be realized.
Currently, widely used vision detection technology mainly comprises passive and active vision detection techniques.
Passive vision detection techniques have simple equipment requirements and can obtain three-dimensional information of the measured surface under natural lighting; they have been widely applied to automobile body measurement, large antenna deformation measurement, large aircraft assembly, large forging processing, and other fields. Their main drawback is that, without auxiliary means, the measured information is limited, so they are only suitable for simple geometric parameter measurement.
Active vision detection techniques mainly include Fourier transform profilometry, phase measuring profilometry, and structured-light vision detection. Fourier transform profilometry is suitable for dynamic measurement but is computationally slow and poorly automated. Phase measuring profilometry has high measurement accuracy but is not suitable for dynamic measurement. Among structured-light vision detection methods, the line-structured-light method and the grating-structured-light method are the most widely used three-dimensional dynamic vision detection methods at industrial sites; both offer simple equipment, a high degree of automation, and suitability for dynamic measurement. The key difference between the two is that the line-structured-light method projects one light stripe and measures one surface profile of the object, whereas the grating-structured-light method projects many light stripes and measures several surface profiles at once. The grating-structured-light method is more efficient but suffers from complicated system calibration and difficult automatic stripe identification.
Large components are large and self-occluding, so a single vision sensor cannot dynamically measure the three-dimensional surface morphology of a whole large component. The measured region is therefore usually divided into several sub-regions, and the three-dimensional data of each sub-region are unified into a global coordinate system to obtain the three-dimensional morphology of the whole component surface. According to how this global unification is performed, vision detection is mainly divided into two categories: the mobile (roving) vision detection method and the multi-sensor vision detection method.
In the mobile vision detection method, a single vision sensor measures the three-dimensional morphology of the whole surface of a large component by moving from station to station; marker points stuck on the component, or target marker points placed in front of it, serve as the intermediary for unifying all sub-region measurements into the global coordinate system. The advantages of this method are simple equipment and easy operation, and it suits static measurement at industrial sites, but it cannot meet dynamic measurement requirements.
In the multi-sensor vision detection method, the multiple vision sensors are globally calibrated before measurement, and during measurement the sub-region data measured by each vision sensor are unified into the global coordinate system according to the global calibration result. The principle is simple and dynamic vision measurement of the whole component surface can be realized; however, because the vision sensors are rigidly connected, on-site global calibration of the multiple sensors is difficult, and after calibration the measurement accuracy is strongly affected by the strong on-site vibration environment.
Summary of the invention
In view of this, the main purpose of the present invention is to provide a multi-sensor flexible dynamic vision measurement system and method for object surface three-dimensional morphology, so as to solve the problems of existing multi-vision-sensor measurement systems in which the vision sensors are rigidly connected and dynamic measurement accuracy is strongly affected by the on-site vibration environment.
To achieve the above object, the technical solution of the present invention is implemented as follows:
The invention provides a multi-sensor flexible dynamic vision measurement system for object surface three-dimensional morphology. The system comprises: a plurality of three-dimensional optical measuring heads, a plurality of planar targets for global unification, a controller, an image acquisition system, and a core computer. Said three-dimensional optical measuring heads and planar targets are distributed around the object to be measured; said three-dimensional optical measuring heads and planar targets are connected to said controller and image acquisition system, and said controller and image acquisition system are connected to said core computer. Each said three-dimensional optical measuring head comprises a grating-type binocular vision sensor and a wide-field camera. Wherein,
said core computer is used for lighting, through the controller, the feature points on said planar targets, controlling each said wide-field camera to measure the planar targets around it, controlling each said grating-type binocular vision sensor to measure the surface three-dimensional morphology of said object to be measured, and, with the planar targets in the common field of view of said wide-field cameras as an intermediary, unifying the local three-dimensional data measured by said grating-type binocular vision sensors into a global coordinate system;
said image acquisition system is used for acquiring the measurement data of said wide-field cameras and grating-type binocular vision sensors and providing them to said core computer.
Wherein, the wide-field camera of each said three-dimensional optical measuring head is used for identifying and locating the image coordinates of the planar target feature-point centers after correcting the distortion of the measured planar target feature-point center images according to its own intrinsic parameter calibration result;
the grating-type binocular vision sensor of each said three-dimensional optical measuring head is used for identifying and locating the grating light stripe image center points after correcting the distortion of the measured grating light stripe images according to its own intrinsic parameter calibration result.
Wherein, said grating-type binocular vision sensor comprises a grating laser, a first camera, and a second camera, wherein
said grating laser is used for projecting grating light stripes;
said first camera and second camera are used for acquiring grating light stripe images and providing them to said image acquisition system.
Wherein, the grating-type binocular vision sensor of each said three-dimensional optical measuring head is further used for calculating, according to the matching result of the grating light stripe images acquired by said first camera and second camera and by adopting the binocular stereo vision principle, the three-dimensional coordinates of the grating light stripe image center points, thereby measuring the local three-dimensional data.
Wherein, said core computer is further used for arbitrarily choosing one planar target coordinate system as the global coordinate system, calculating, from the identified and located image coordinates of the planar target feature-point centers, the transformation matrix from the wide-field camera coordinate system of each three-dimensional optical measuring head to said global coordinate system, and globally unifying the local three-dimensional data measured by said grating-type binocular vision sensors into said global coordinate system.
Wherein, said wide-field camera is a fisheye camera or a catadioptric camera.
The present invention also provides a multi-sensor flexible dynamic vision measurement method for object surface three-dimensional morphology. A plurality of three-dimensional optical measuring heads and a plurality of planar targets for global unification are distributed around the object to be measured; said three-dimensional optical measuring heads and planar targets are connected to a controller and an image acquisition system, and said controller and image acquisition system are connected to a core computer; each said three-dimensional optical measuring head comprises a grating-type binocular vision sensor and a wide-field camera. The method comprises:
the core computer lights, through the controller, the feature points on said planar targets, controls each said wide-field camera to measure the planar targets around it, and controls each said grating-type binocular vision sensor to measure the surface three-dimensional morphology of said object to be measured;
with the planar targets in the common field of view of said wide-field cameras as an intermediary, the local three-dimensional data measured by said grating-type binocular vision sensors are unified into a global coordinate system.
Wherein, said wide-field camera measuring the planar targets around it is specifically: the wide-field camera of each said three-dimensional optical measuring head corrects the distortion of the measured planar target feature-point center images according to its own intrinsic parameter calibration result, and then identifies and locates the image coordinates of the planar target feature-point centers;
said grating-type binocular vision sensor measuring the surface three-dimensional morphology of said object to be measured is specifically: the grating-type binocular vision sensor of each said three-dimensional optical measuring head corrects the distortion of the measured grating light stripe images according to its own intrinsic parameter calibration result, and then identifies and locates the grating light stripe image center points.
Wherein, said grating-type binocular vision sensor comprises a grating laser, a first camera, and a second camera; said grating laser projects grating light stripes, and said first camera and second camera acquire grating light stripe images and provide them to said image acquisition system.
Wherein, said grating-type binocular vision sensor measuring the surface three-dimensional morphology of said object to be measured is specifically:
the grating-type binocular vision sensor of each said three-dimensional optical measuring head calculates, according to the matching result of the grating light stripe images acquired by said first camera and second camera and by adopting the binocular stereo vision principle, the three-dimensional coordinates of the grating light stripe image center points, thereby measuring the local three-dimensional data.
Wherein, said unifying, with the planar targets in the common field of view of the wide-field cameras as an intermediary, the local three-dimensional data measured by said grating-type binocular vision sensors into the global coordinate system is specifically:
the core computer arbitrarily chooses one planar target coordinate system as the global coordinate system, calculates, from the identified and located image coordinates of the planar target feature-point centers, the transformation matrix from the wide-field camera coordinate system of each three-dimensional optical measuring head to said global coordinate system, and globally unifies the local three-dimensional data measured by said grating-type binocular vision sensors into said global coordinate system.
Wherein, said wide-field camera is a fisheye camera or a catadioptric camera.
In the multi-sensor flexible dynamic vision measurement system and method for object surface three-dimensional morphology provided by the present invention, a grating-type binocular vision sensor and a wide-field camera are combined into a three-dimensional optical measuring head; the three-dimensional optical measuring heads are arranged around the object to be measured, and with the planar targets arranged around them as an intermediary, the local data measured by each three-dimensional optical measuring head are unified under a global coordinate system, realizing flexible dynamic vision measurement of the three-dimensional morphology of the whole object surface.
The multiple vision sensors of the present invention need not be rigidly fixed during measurement; global unification of the local measurement data of each vision sensor is realized with the planar targets in the wide field of view as an intermediary, so the vision sensors are flexibly connected. Global calibration of the three-dimensional optical measuring heads at the measurement site is not required, and local measurement and global unification are completed synchronously; therefore strong on-site vibration has little influence on the measurement accuracy of the measuring system.
Description of drawings
Fig. 1 is a schematic diagram of a multi-sensor flexible dynamic vision measurement system for object surface three-dimensional morphology according to an embodiment of the invention;
Fig. 2 is a schematic structural diagram of a three-dimensional optical measuring head in an embodiment of the invention;
Fig. 3 is a schematic structural diagram of a wide-field camera in an embodiment of the invention;
Fig. 4 is a schematic diagram of a planar target for global unification in an embodiment of the invention;
Fig. 5 is a schematic diagram of the measurement model of a grating-type binocular vision sensor in an embodiment of the invention;
Fig. 6 is a schematic diagram of the global unification model for local three-dimensional data in an embodiment of the invention.
Embodiment
The technical solution of the present invention is further elaborated below in conjunction with the accompanying drawings and specific embodiments.
The multi-sensor flexible dynamic vision measurement system for object surface three-dimensional morphology provided by the present invention comprises: a plurality of three-dimensional optical measuring heads, a plurality of planar targets for global unification, a controller, an image acquisition system, and a core computer. The three-dimensional optical measuring heads and planar targets are distributed around the object; the three-dimensional optical measuring heads and planar targets are connected to the controller and image acquisition system, and the controller and image acquisition system are connected to the core computer. Each three-dimensional optical measuring head comprises a grating-type binocular vision sensor and a wide-field camera.
The core computer lights the feature points on the planar targets through the controller, controls each wide-field camera to measure the planar targets around it, controls each grating-type binocular vision sensor to measure the surface three-dimensional morphology of the object, and, with the planar targets in the common field of view of the wide-field cameras as an intermediary, unifies the local three-dimensional data measured by the grating-type binocular vision sensors into a global coordinate system.
The image acquisition system acquires the measurement data of the wide-field cameras and grating-type binocular vision sensors and provides them to the core computer.
Wherein, the wide-field camera of each three-dimensional optical measuring head corrects the distortion of the measured planar target feature-point center images according to its own intrinsic parameter calibration result and then identifies and locates the image coordinates of the planar target feature-point centers;
the grating-type binocular vision sensor of each three-dimensional optical measuring head corrects the distortion of the measured grating light stripe images according to its own intrinsic parameter calibration result and then identifies and locates the grating light stripe image center points.
Further, the grating-type binocular vision sensor comprises a grating laser, a first camera, and a second camera, wherein the grating laser projects grating light stripes, and the first camera and second camera acquire grating light stripe images and provide them to the image acquisition system.
The grating-type binocular vision sensor of each three-dimensional optical measuring head further calculates, according to the matching result of the grating light stripe images acquired by the first camera and second camera and by adopting the binocular stereo vision principle, the three-dimensional coordinates of the grating light stripe image center points, thereby measuring the local three-dimensional data.
The core computer further arbitrarily chooses one planar target coordinate system as the global coordinate system, calculates, from the image coordinates of the planar target feature-point centers identified and located by the wide-field camera of each three-dimensional optical measuring head, the transformation matrix from the wide-field camera coordinate system of each measuring head to the global coordinate system, and globally unifies the local three-dimensional data measured by the grating-type binocular vision sensors into the global coordinate system.
Corresponding to the above system, the multi-sensor flexible dynamic vision measurement method for object surface three-dimensional morphology provided by the present invention mainly comprises:
Step 1: the core computer lights the feature points on the planar targets through the controller, controls each wide-field camera to measure the planar targets around it, and controls each grating-type binocular vision sensor to measure the surface three-dimensional morphology of the object.
Wherein, the wide-field camera measuring the planar targets around it is specifically: the wide-field camera of each three-dimensional optical measuring head corrects the distortion of the measured planar target feature-point center images according to its own intrinsic parameter calibration result, and then identifies and locates the image coordinates of the planar target feature-point centers;
the grating-type binocular vision sensor measuring the surface three-dimensional morphology of the object is specifically: the grating-type binocular vision sensor of each three-dimensional optical measuring head corrects the distortion of the measured grating light stripe images according to its own intrinsic parameter calibration result, and then identifies and locates the grating light stripe image center points.
Further, the grating-type binocular vision sensor comprises a grating laser, a first camera, and a second camera; the grating laser projects grating light stripes, and the first camera and second camera acquire grating light stripe images and provide them to the image acquisition system. Accordingly, the grating-type binocular vision sensor measuring the surface three-dimensional morphology of the object is specifically: the grating-type binocular vision sensor of each three-dimensional optical measuring head calculates, according to the matching result of the grating light stripe images acquired by the first camera and second camera and by adopting the binocular stereo vision principle, the three-dimensional coordinates of the grating light stripe image center points, thereby measuring the local three-dimensional data.
Step 2: with the planar targets in the common field of view of the wide-field cameras as an intermediary, the local three-dimensional data measured by the grating-type binocular vision sensors are unified into the global coordinate system.
Specifically, the core computer arbitrarily chooses one planar target coordinate system as the global coordinate system, calculates, from the image coordinates of the planar target feature-point centers identified and located by the wide-field camera of each three-dimensional optical measuring head, the transformation matrix from the wide-field camera coordinate system of each measuring head to the global coordinate system, and globally unifies the local three-dimensional data measured by the grating-type binocular vision sensors into the global coordinate system.
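As an illustration of this step, the sketch below shows one way the pose of a planar target relative to a wide-field (or mirrored) camera could be computed from the identified feature-point image coordinates; the use of OpenCV's solvePnP, the function name, and the data layout are assumptions made for illustration, not a procedure prescribed by the patent.

```python
# Sketch: estimating the transformation from a planar-target coordinate system to a
# camera coordinate system from identified feature-point image coordinates.
import numpy as np
import cv2

def target_to_camera_pose(target_points_3d, image_points_2d, K, dist_coeffs):
    """Return the 4x4 homogeneous matrix mapping target coordinates to camera coordinates.
    target_points_3d: Nx3 feature-point coordinates in the target frame (z = 0 on a planar target).
    image_points_2d:  Nx2 identified feature-point center image coordinates."""
    ok, rvec, tvec = cv2.solvePnP(
        target_points_3d.astype(np.float64),
        image_points_2d.astype(np.float64),
        K, dist_coeffs, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)          # rotation vector -> rotation matrix
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = tvec.ravel()
    return T
```

The transformation between two planar targets seen in the same wide-field image then follows by composing one such pose with the inverse of the other.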
It should be noted that the dynamic vision measurement system and method of the present invention are applicable not only to dynamic vision measurement of the three-dimensional morphology of large component surfaces, but also to that of ordinary-size objects and micro-size object surfaces.
Taking dynamic vision measurement of the three-dimensional morphology of a large component surface as an example, the dynamic vision measurement system and method of the present invention are further explained below.
The large-component three-dimensional surface morphology dynamic vision measurement system of embodiment 1 of the invention, shown in Fig. 1, comprises: three-dimensional optical measuring heads 1, 2 and 3; planar targets 1, 2 and 3 for global unification; a high-speed image acquisition system; a controller and a core computer; as well as the corresponding measurement software, mechanical structure, tripods, and the like. Three-dimensional optical measuring head 1 is a combination of a grating-type binocular vision sensor and a wide-field camera, as shown in Fig. 2; the grating-type binocular vision sensor consists of a grating laser, camera 1, and camera 2. The relative position between the wide-field camera and the grating-type binocular vision sensor is kept fixed.
According to the field of view of the three-dimensional optical measuring heads and the shape of the large component under test, measuring heads 1, 2 and 3 are distributed around the large component at certain angles; each measuring head measures one sub-region of the component surface in real time. A plurality of planar targets with identifiable marker points are placed around the measurement site; the wide-field camera of each optical measuring head photographs the planar targets around it, and the grating-type binocular vision sensor measures the three-dimensional morphology of the large-component surface. With planar targets 1, 2 and 3 in the common field of view of the wide-field cameras as an intermediary, the local three-dimensional data measured by the grating-type binocular vision sensors are unified into the global coordinate system, realizing dynamic vision measurement of large-component three-dimensional surface morphology in a complicated site environment.
The wide-field camera may be a fisheye camera or a catadioptric camera. A catadioptric camera consists of a pinhole camera and a mirror surface directly in front of it, and can be divided into two types: single-viewpoint (central) and non-central. The single-viewpoint catadioptric camera readily produces perspective images and is widely used in practice; its mirror surface can be of four kinds: paraboloid of revolution, hyperboloid of revolution, ellipsoid of revolution, and plane mirror. Without loss of generality and for convenience, the embodiments of the invention are described taking a catadioptric camera that adopts plane mirrors as an example. As shown in Fig. 3, the wide-field camera consists of a camera 11 and plane mirrors 12 (a four-face mirror in this embodiment); four mirrored cameras 13, 14, 15 and 16 are formed through the plane mirrors 12, realizing wide-field measurement.
As shown in Fig. 4, the feature points of the planar targets for global unification are LED light sources, and the feature-point positions are coded so that the target serial number can be recognized from the positional relationship of the feature points. The planar targets are supported by tripods and arranged around the object to be measured.
The implementation steps for realizing dynamic vision measurement of large-component three-dimensional surface morphology are described in detail below:
Step 101: calibrating the dynamic vision measurement system.
A. Calibration of the camera intrinsic parameters in the three-dimensional optical measuring heads
A planar target is moved freely in front of each camera more than five times, and the image coordinates of the planar target feature points are extracted. The camera calibration method described in the article "A flexible new technique for camera calibration [J]. IEEE Trans. on Pattern Analysis and Machine Intelligence", published by Zhang Zhengyou in November 2000, is adopted to calibrate the intrinsic parameters of each camera in the three-dimensional optical measuring heads.
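By way of illustration, a minimal sketch of this intrinsic calibration step is given below, assuming the target feature-point centers have already been extracted for each of the five or more target placements; the use of OpenCV's calibrateCamera is an assumption, since the patent only cites Zhang's method.

```python
# Sketch: intrinsic calibration from >= 5 free placements of the planar target
# (Zhang, IEEE TPAMI 2000), using OpenCV as one possible implementation.
import numpy as np
import cv2

def calibrate_intrinsics(object_points_per_view, image_points_per_view, image_size):
    """object_points_per_view: list of Nx3 target-frame coordinates (z = 0);
    image_points_per_view: list of Nx2 detected feature-point centers;
    image_size: (width, height). Returns camera matrix, distortion coefficients, RMS error."""
    obj = [p.reshape(-1, 1, 3).astype(np.float32) for p in object_points_per_view]
    img = [p.reshape(-1, 1, 2).astype(np.float32) for p in image_points_per_view]
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj, img, image_size, None, None)
    return K, dist, rms
```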
B. Calibration of the grating-type binocular vision sensor in the three-dimensional optical measuring heads
The binocular vision sensor calibration method described in the article "Calibration of binocular vision sensors based on a 1D target with unknown motion [J], Journal of Mechanical Engineering", published by Zhou Fuqiang in June 2006, is adopted: a one-dimensional target is moved more than twice in front of the binocular vision sensor, the cameras of the binocular vision sensor photograph the one-dimensional target, and the target image feature points are extracted. The essential matrix between the two cameras is solved, and with the known distances between the one-dimensional target feature points as a constraint condition, the rotation matrix $R_{12}$ and translation vector $t_{12}$ between the two cameras of the grating-type binocular vision sensor are calibrated.
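The sketch below illustrates the general structure of such a calibration: the essential matrix of matched feature points gives the rotation and the direction of translation between the two cameras, and a known spacing between target feature points fixes the translation scale. The OpenCV calls and the scale-fixing details are assumptions, not the exact procedure of the cited method.

```python
# Sketch: stereo extrinsics (R12, t12) from matched 1-D target feature points,
# with the translation scale fixed by a known feature-point spacing.
import numpy as np
import cv2

def stereo_extrinsics(pts1, pts2, K1, d1, K2, d2, known_distance, pair_idx):
    """pts1, pts2: Nx2 matched feature-point image coordinates in cameras 1 and 2.
    known_distance: true distance between the two points indexed by pair_idx."""
    # Work in normalised image coordinates so one essential matrix serves both cameras.
    n1 = cv2.undistortPoints(pts1.reshape(-1, 1, 2).astype(np.float64), K1, d1)
    n2 = cv2.undistortPoints(pts2.reshape(-1, 1, 2).astype(np.float64), K2, d2)
    E, _ = cv2.findEssentialMat(n1, n2, focal=1.0, pp=(0.0, 0.0),
                                method=cv2.RANSAC, threshold=1e-3)
    _, R12, t_dir, _ = cv2.recoverPose(E, n1, n2)       # t_dir has unit length
    # Triangulate with unit baseline, then rescale so that two points whose true
    # spacing on the 1-D target is known end up the right distance apart.
    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = np.hstack([R12, t_dir])
    X = cv2.triangulatePoints(P1, P2, n1.reshape(-1, 2).T, n2.reshape(-1, 2).T)
    X = (X[:3] / X[3]).T
    i, j = pair_idx
    scale = known_distance / np.linalg.norm(X[i] - X[j])
    return R12, t_dir * scale
```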
C. Global calibration between the four mirrored cameras and the grating-type binocular vision sensor in the three-dimensional optical measuring heads
The multi-vision-sensor calibration method described in the article "On-site global calibration of multiple vision sensors based on dual planar targets [J], Journal of Mechanical Engineering", published by Zhang Guangjun in July 2009, is adopted: a dual planar target is moved freely more than three times in front of the two vision sensors to be calibrated, the two vision sensors photograph the planar target images, and with the invariant relative position between the two planar targets as a constraint condition, the transformation matrix between the two vision sensors is calculated. Finally, by calibrating pairwise, the transformation matrices $T_{cij}$ ($i, j = 1, 2, 3, 4$) between the coordinate systems of the four mirrored cameras in the wide-field camera are calculated, where the subscripts $i$ and $j$ denote mirrored-camera serial numbers; for example, $T_{c12}$ denotes the transformation matrix from mirrored camera 1 to mirrored camera 2. The wide-field camera coordinate system is established on mirrored camera 13 (it may of course be established on any other mirrored camera), and the transformation matrix $T_h$ from the grating-type binocular vision sensor coordinate system to the wide-field camera coordinate system is calculated.
Step 102: fast identification and location of the grating light stripe image centers.
Through the controller, the core computer simultaneously controls the grating laser to project the grating light stripes, the grating-type binocular vision sensor to capture the light stripe images, and the wide-field camera to capture the planar targets for global unification.
The light stripe image center extraction method described in the paper "An unbiased detector of curvilinear structures, IEEE Transactions on Pattern Analysis and Machine Intelligence", published by Steger in February 1998, is adopted to extract the grating light stripe image center points. First, the Hessian matrix of every image point is computed; according to the features of the light stripe gray-level surface, stripe center candidate points are judged from the eigenvalues and eigenvectors of the Hessian matrix of each image point, and the candidate points are then linked together to form the light stripe image data. Finally, the spatial position constraint of the light stripes combined with the epipolar constraint of binocular stereo vision is used to identify and locate corresponding stripes in the left and right camera images.
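A simple, unvectorized sketch of this Hessian-based stripe-center detection is shown below; the Gaussian scale, the response threshold, and the use of SciPy Gaussian derivatives are illustrative assumptions rather than details fixed by the patent.

```python
# Sketch: Steger-style detection of bright-stripe centre candidates in one image.
import numpy as np
from scipy.ndimage import gaussian_filter

def stripe_centers(img, sigma=2.0, response_thresh=1.0):
    """Return sub-pixel (x, y) candidates of bright-stripe centres."""
    img = img.astype(np.float64)
    rx  = gaussian_filter(img, sigma, order=(0, 1))   # d/dx  (axis 1 = columns)
    ry  = gaussian_filter(img, sigma, order=(1, 0))   # d/dy  (axis 0 = rows)
    rxx = gaussian_filter(img, sigma, order=(0, 2))
    ryy = gaussian_filter(img, sigma, order=(2, 0))
    rxy = gaussian_filter(img, sigma, order=(1, 1))
    centers = []
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            H = np.array([[rxx[y, x], rxy[y, x]],
                          [rxy[y, x], ryy[y, x]]])
            evals, evecs = np.linalg.eigh(H)
            k = np.argmax(np.abs(evals))               # direction of strongest curvature
            if evals[k] > -response_thresh:            # bright stripe: strongly negative
                continue
            nx, ny = evecs[:, k]
            denom = rxx[y, x]*nx*nx + 2*rxy[y, x]*nx*ny + ryy[y, x]*ny*ny
            if denom == 0:
                continue
            t = -(rx[y, x]*nx + ry[y, x]*ny) / denom
            if abs(t*nx) <= 0.5 and abs(t*ny) <= 0.5:  # maximum lies within this pixel
                centers.append((x + t*nx, y + t*ny))
    return centers
```

Linking the candidates into stripes and matching them across the two cameras with the epipolar constraint would follow as separate steps.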
Step 103: local three-dimensional reconstruction by the grating-type binocular vision sensor.
As shown in Fig. 5, a grating light stripe point P is imaged in the left and right cameras, and its three-dimensional coordinates under the binocular vision sensor coordinate system are solved with the binocular stereo vision model; the binocular vision sensor coordinate system is established on the left camera coordinate system. Let $p_1$ and $p_2$ be the undistorted homogeneous image coordinates of the grating light stripe point P in the left and right camera image coordinate systems, $l_1$ be the epipolar line associated with $p_1$ in the left camera image, and $l_2$ be the epipolar line associated with $p_2$ in the right camera image. The left camera coordinate system is $O_{c1}x_{c1}y_{c1}z_{c1}$ and the right camera coordinate system is $O_{c2}x_{c2}y_{c2}z_{c2}$. The rotation matrix and translation vector from the left camera coordinate system to the right camera coordinate system are $R_{12}$ and $t_{12}$, obtained in the calibration of step 101.
The measurement model of the grating-type binocular vision sensor is shown in formula (1):
$$\rho_1 p_1 = A_1\,[\,I \;\; 0\,]\,P, \qquad \rho_2 p_2 = A_2\,[\,R_{12} \;\; t_{12}\,]\,P \tag{1}$$
In the above formula, $A_1$ and $A_2$ are the intrinsic parameter matrices of the left and right cameras respectively, and $\rho_1$, $\rho_2$ are non-zero scale factors.
In actual measurement, lens distortion often exists in the camera imaging system. Let $p_d = (u_d, v_d, 1)^T$ be the distorted homogeneous image coordinates, $p = (u, v, 1)^T$ the undistorted homogeneous image coordinates, and $p_n = (u_n, v_n, 1)^T$ the normalized homogeneous image coordinates. The lens distortion model adopted by the embodiments of the invention can then be expressed as:
$$u_d = u + (u - u_0)(k_1 r^2 + k_2 r^4), \qquad v_d = v + (v - v_0)(k_1 r^2 + k_2 r^4) \tag{2}$$
where $r^2 = u_n^2 + v_n^2$, $k_1$ and $k_2$ are the radial distortion coefficients of the lens, and $(u_0, v_0)$ are the principal point coordinates of the camera.
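The sketch below illustrates equation (2) with a forward distortion function and a simple fixed-point inversion; treating the normalized coordinates as $u_n = (u - u_0)/f_x$ and $v_n = (v - v_0)/f_y$ is an assumption made only for this illustration.

```python
# Sketch: radial distortion model of equation (2) and an iterative inverse.
def distort(u, v, u0, v0, k1, k2, un, vn):
    """Equation (2): map undistorted pixel (u, v) to distorted pixel (ud, vd).
    (un, vn) are the normalised coordinates of the same point, r^2 = un^2 + vn^2."""
    r2 = un**2 + vn**2
    f = k1*r2 + k2*r2**2
    return u + (u - u0)*f, v + (v - v0)*f

def undistort(ud, vd, u0, v0, k1, k2, fx, fy, iters=5):
    """Fixed-point inversion of equation (2), assuming un = (u-u0)/fx, vn = (v-v0)/fy."""
    u, v = ud, vd
    for _ in range(iters):
        un, vn = (u - u0)/fx, (v - v0)/fy
        r2 = un**2 + vn**2
        f = k1*r2 + k2*r2**2
        u = (ud + u0*f) / (1.0 + f)     # from ud = u(1+f) - u0*f
        v = (vd + v0*f) / (1.0 + f)
    return u, v
```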
From the matched stripe center-point image coordinates in the two cameras of the grating-type binocular vision sensor, the three-dimensional coordinates of the grating light stripe center points under the grating-type binocular vision sensor coordinate system are obtained through formula (1), realizing local three-dimensional reconstruction by the grating-type binocular vision sensor.
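As an illustration of this reconstruction, the following sketch triangulates matched, undistorted stripe-center points using the projection matrices of equation (1); the use of OpenCV's linear triangulation is an assumption, since the patent does not prescribe a particular solver.

```python
# Sketch: solving equation (1) for the stripe centre points by linear triangulation.
import numpy as np
import cv2

def reconstruct_stripe_points(pts1, pts2, A1, A2, R12, t12):
    """pts1, pts2: Nx2 matched, undistorted stripe-centre image coordinates in the
    left and right cameras. Returns Nx3 points in the left-camera (sensor) frame."""
    P1 = A1 @ np.hstack([np.eye(3), np.zeros((3, 1))])   # A1 [I | 0]
    P2 = A2 @ np.hstack([R12, t12.reshape(3, 1)])        # A2 [R12 | t12]
    X = cv2.triangulatePoints(P1, P2,
                              pts1.T.astype(np.float64),
                              pts2.T.astype(np.float64))
    return (X[:3] / X[3]).T                              # dehomogenise
```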
Step 104: identification and location of the light spot centers of the planar targets for global unification.
The embodiments of the invention adopt LED light sources as the luminous feature points (light spots for short) on the targets, which reduces the influence of background light on the extraction accuracy of the spot image centers. The gray-level distribution of the spot image generated by an LED light source follows a Gaussian distribution, and the spot image center is exactly the vertex of the gray-level surface of the spot image.
The present invention adopts the spot image center extraction method described in Chinese patent application CN101408985, published on April 15, 2009, and entitled "A circular light spot sub-pixel center extraction method and device", to identify and locate the light spot centers of the planar targets for global unification. This method first computes the Hessian matrix of every image point, locates the pixel-level coordinates of the spot center according to a criterion formed by the eigenvalues of the Hessian, then represents the gray-level surface in the neighborhood of the spot image center by a second-order Taylor expansion, and determines the sub-pixel image coordinates of the spot center from the vertex property of the surface.
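A minimal sketch of such a second-order (Taylor-expansion) sub-pixel refinement around a pixel-level peak is given below; the central-difference gradient and Hessian estimates are assumptions for illustration and do not reproduce the cited patent application's exact procedure.

```python
# Sketch: sub-pixel spot centre as the vertex of a quadratic fit of the grey-level
# surface around the pixel-level peak (offset = -H^{-1} g).
import numpy as np

def refine_spot_center(img, x, y):
    """(x, y): pixel-level peak of the spot, away from the image border.
    Returns the sub-pixel centre."""
    I = img.astype(np.float64)
    gx  = (I[y, x+1] - I[y, x-1]) / 2.0
    gy  = (I[y+1, x] - I[y-1, x]) / 2.0
    gxx =  I[y, x+1] - 2*I[y, x] + I[y, x-1]
    gyy =  I[y+1, x] - 2*I[y, x] + I[y-1, x]
    gxy = (I[y+1, x+1] - I[y+1, x-1] - I[y-1, x+1] + I[y-1, x-1]) / 4.0
    H = np.array([[gxx, gxy], [gxy, gyy]])
    g = np.array([gx, gy])
    dx, dy = -np.linalg.solve(H, g)      # vertex of the fitted quadratic surface
    return x + dx, y + dy
```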
In addition, to facilitate identification, the relative positions of the planar target feature points are known and coded, so the planar target serial number can be identified from the feature-point positions.
Step 105: global unification of the local three-dimensional data.
The three-dimensional optical measuring head coordinate system is established on one of the mirrored cameras of the wide-field camera. Fig. 6 depicts the global unification principle adopted by the embodiment of the invention: the grating-type binocular vision sensor measures the local surface three-dimensional morphology of the large component, and the wide-field camera photographs the planar targets arranged around the measurement site; the transformation matrices $T_{t_i,t_j}$ between the coordinate systems of planar targets within the field of view ($t_i$ denoting the i-th planar target and $t_j$ the j-th planar target) and the transformation matrices $T_{o_i,t_j}$ from the three-dimensional optical measuring head coordinate systems to the planar target coordinate systems ($o_i$ denoting the i-th measuring head) are calculated; with the planar targets in the common field of view of the wide-field cameras as an intermediary, global unification of the local measurement data of the three-dimensional optical measuring heads is realized.
Without loss of generality, the global coordinate system is established on the coordinate system of planar target 1. Three-dimensional optical measuring heads 1 and 2 can unify their measured local three-dimensional data into the global coordinate system directly through $T_{o_1,t_1}$ and $T_{o_2,t_1}$, while measuring head 3 unifies its local three-dimensional data into the global coordinate system through planar target 2, as shown in formula (3):
$$P_{G1} = T_{o_1,t_1}\,T_{h1}\,P_{o1}, \qquad P_{G2} = T_{o_2,t_1}\,T_{h2}\,P_{o2}, \qquad P_{G3} = T_{t_2,t_1}\,T_{o_3,t_2}\,T_{h3}\,P_{o3} \tag{3}$$
In the above formula, $P_{o1}$, $P_{o2}$, $P_{o3}$ are the local three-dimensional data measured by the grating-type binocular vision sensors in three-dimensional optical measuring heads 1, 2 and 3 respectively; $P_{G1}$, $P_{G2}$, $P_{G3}$ are the coordinates of $P_{o1}$, $P_{o2}$, $P_{o3}$ under the global coordinate system; and $T_{h1}$, $T_{h2}$, $T_{h3}$ are the transformation matrices from the grating-type binocular vision sensor coordinate systems to the three-dimensional optical measuring head coordinate systems in measuring heads 1, 2 and 3 respectively.
In actual measurement, $T_{h1}$, $T_{h2}$, $T_{h3}$ are known from the prior calibration and remain unchanged, whereas $T_{o_i,t_j}$ and $T_{t_i,t_j}$ are calculated from the wide-field cameras at every measurement. To improve the accuracy of global unification, several planar targets can be arranged around the measurement site to reduce the influence of target feature-point extraction errors on the unification result.
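The following sketch illustrates equation (3): local point clouds are mapped into the global coordinate system by chaining 4x4 homogeneous transformation matrices. Variable names mirror the text and are otherwise arbitrary.

```python
# Sketch: global unification of local 3-D data by chaining homogeneous transforms.
import numpy as np

def to_global(points_local, *transforms):
    """points_local: Nx3 points in a sensor frame; transforms: 4x4 matrices applied
    right-to-left, e.g. to_global(P_o3, T_t2_t1, T_o3_t2, T_h3) implements
    P_G3 = T_t2,t1 . T_o3,t2 . T_h3 . P_o3 of equation (3)."""
    T = np.eye(4)
    for M in transforms:
        T = T @ M
    pts_h = np.hstack([points_local, np.ones((len(points_local), 1))])
    return (T @ pts_h.T).T[:, :3]
```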
In summary, the present invention combines a grating-type binocular vision sensor and a wide-field camera into a three-dimensional optical measuring head; the measuring heads are arranged around the object to be measured, and with the planar targets arranged around them as an intermediary, the local data measured by each measuring head are unified under the global coordinate system, realizing flexible dynamic vision measurement of the three-dimensional morphology of the whole object surface. The multiple vision sensors of the present invention need not be rigidly fixed during measurement; global unification of the local measurement data of each vision sensor is realized with the planar targets in the wide field of view as an intermediary, so the vision sensors are flexibly connected; global calibration of the three-dimensional optical measuring heads at the measurement site is not required, and local measurement and global unification are completed synchronously; therefore strong on-site vibration has little influence on the measurement accuracy of the measuring system.
In addition, the dynamic vision measurement system and method of the present invention are applicable not only to dynamic vision measurement of the three-dimensional morphology of large component surfaces, but also to that of ordinary-size objects and micro-size object surfaces.
The above description is merely a preferred embodiment of the present invention and is not intended to limit the protection scope of the present invention.

Claims (12)

1. A multi-sensor flexible dynamic vision measurement system for object surface three-dimensional morphology, characterized in that the system comprises: a plurality of three-dimensional optical measuring heads, a plurality of planar targets for global unification, a controller, an image acquisition system, and a core computer; said three-dimensional optical measuring heads and planar targets are distributed around the object to be measured; said three-dimensional optical measuring heads and planar targets are connected to said controller and image acquisition system, and said controller and image acquisition system are connected to said core computer; each said three-dimensional optical measuring head comprises a grating-type binocular vision sensor and a wide-field camera; wherein,
said core computer is used for lighting, through the controller, the feature points on said planar targets, controlling each said wide-field camera to measure the planar targets around it, controlling each said grating-type binocular vision sensor to measure the surface three-dimensional morphology of said object to be measured, and, with the planar targets in the common field of view of said wide-field cameras as an intermediary, unifying the local three-dimensional data measured by said grating-type binocular vision sensors into a global coordinate system;
said image acquisition system is used for acquiring the measurement data of said wide-field cameras and grating-type binocular vision sensors and providing them to said core computer.
2. The multi-sensor flexible dynamic vision measurement system for object surface three-dimensional morphology according to claim 1, characterized in that
the wide-field camera of each said three-dimensional optical measuring head is used for identifying and locating the image coordinates of the planar target feature-point centers after correcting the distortion of the measured planar target feature-point center images according to its own intrinsic parameter calibration result;
the grating-type binocular vision sensor of each said three-dimensional optical measuring head is used for identifying and locating the grating light stripe image center points after correcting the distortion of the measured grating light stripe images according to its own intrinsic parameter calibration result.
3. The multi-sensor flexible dynamic vision measurement system for object surface three-dimensional morphology according to claim 2, characterized in that said grating-type binocular vision sensor comprises a grating laser, a first camera, and a second camera, wherein
said grating laser is used for projecting grating light stripes;
said first camera and second camera are used for acquiring grating light stripe images and providing them to said image acquisition system.
4. The multi-sensor flexible dynamic vision measurement system for object surface three-dimensional morphology according to claim 3, characterized in that the grating-type binocular vision sensor of each said three-dimensional optical measuring head is further used for calculating, according to the matching result of the grating light stripe images acquired by said first camera and second camera and by adopting the binocular stereo vision principle, the three-dimensional coordinates of the grating light stripe image center points, thereby measuring the local three-dimensional data.
5. The multi-sensor flexible dynamic vision measurement system for object surface three-dimensional morphology according to claim 2, 3 or 4, characterized in that said core computer is further used for arbitrarily choosing one planar target coordinate system as the global coordinate system, calculating, from the identified and located image coordinates of the planar target feature-point centers, the transformation matrix from the wide-field camera coordinate system of each three-dimensional optical measuring head to said global coordinate system, and globally unifying the local three-dimensional data measured by said grating-type binocular vision sensors into said global coordinate system.
6. The multi-sensor flexible dynamic vision measurement system for object surface three-dimensional morphology according to any one of claims 1 to 4, characterized in that said wide-field camera is a fisheye camera or a catadioptric camera.
7. A multi-sensor flexible dynamic vision measurement method for object surface three-dimensional morphology, characterized in that a plurality of three-dimensional optical measuring heads and a plurality of planar targets for global unification are distributed around the object to be measured; said three-dimensional optical measuring heads and planar targets are connected to a controller and an image acquisition system, and said controller and image acquisition system are connected to a core computer; each said three-dimensional optical measuring head comprises a grating-type binocular vision sensor and a wide-field camera; the method comprises:
the core computer lights, through the controller, the feature points on said planar targets, controls each said wide-field camera to measure the planar targets around it, and controls each said grating-type binocular vision sensor to measure the surface three-dimensional morphology of said object to be measured;
with the planar targets in the common field of view of said wide-field cameras as an intermediary, unifying the local three-dimensional data measured by said grating-type binocular vision sensors into a global coordinate system.
8. The multi-sensor flexible dynamic vision measurement method for object surface three-dimensional morphology according to claim 7, characterized in that said wide-field camera measuring the planar targets around it is specifically: the wide-field camera of each said three-dimensional optical measuring head corrects the distortion of the measured planar target feature-point center images according to its own intrinsic parameter calibration result, and then identifies and locates the image coordinates of the planar target feature-point centers;
said grating-type binocular vision sensor measuring the surface three-dimensional morphology of said object to be measured is specifically: the grating-type binocular vision sensor of each said three-dimensional optical measuring head corrects the distortion of the measured grating light stripe images according to its own intrinsic parameter calibration result, and then identifies and locates the grating light stripe image center points.
9. The multi-sensor flexible dynamic vision measurement method for object surface three-dimensional morphology according to claim 8, characterized in that said grating-type binocular vision sensor comprises a grating laser, a first camera, and a second camera, wherein said grating laser projects grating light stripes, and said first camera and second camera acquire grating light stripe images and provide them to said image acquisition system.
10. The multi-sensor flexible dynamic vision measurement method for object surface three-dimensional morphology according to claim 9, characterized in that said grating-type binocular vision sensor measuring the surface three-dimensional morphology of said object to be measured is specifically:
the grating-type binocular vision sensor of each said three-dimensional optical measuring head calculates, according to the matching result of the grating light stripe images acquired by said first camera and second camera and by adopting the binocular stereo vision principle, the three-dimensional coordinates of the grating light stripe image center points, thereby measuring the local three-dimensional data.
11. The multi-sensor flexible dynamic vision measurement method for object surface three-dimensional morphology according to claim 8, 9 or 10, characterized in that said unifying, with the planar targets in the common field of view of the wide-field cameras as an intermediary, the local three-dimensional data measured by said grating-type binocular vision sensors into the global coordinate system is specifically:
the core computer arbitrarily chooses one planar target coordinate system as the global coordinate system, calculates, from the identified and located image coordinates of the planar target feature-point centers, the transformation matrix from the wide-field camera coordinate system of each three-dimensional optical measuring head to said global coordinate system, and globally unifies the local three-dimensional data measured by said grating-type binocular vision sensors into said global coordinate system.
12. The multi-sensor flexible dynamic vision measurement method for object surface three-dimensional morphology according to any one of claims 8 to 11, characterized in that said wide-field camera is a fisheye camera or a catadioptric camera.
CN 201110308905 2011-10-12 2011-10-12 Object surface three-dimensional morphology multi-sensor flexible dynamic vision measurement system and method Expired - Fee Related CN102506758B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201110308905 CN102506758B (en) 2011-10-12 2011-10-12 Object surface three-dimensional morphology multi-sensor flexible dynamic vision measurement system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201110308905 CN102506758B (en) 2011-10-12 2011-10-12 Object surface three-dimensional morphology multi-sensor flexible dynamic vision measurement system and method

Publications (2)

Publication Number Publication Date
CN102506758A true CN102506758A (en) 2012-06-20
CN102506758B CN102506758B (en) 2012-12-12

Family

ID=46218866

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201110308905 Expired - Fee Related CN102506758B (en) 2011-10-12 2011-10-12 Object surface three-dimensional morphology multi-sensor flexible dynamic vision measurement system and method

Country Status (1)

Country Link
CN (1) CN102506758B (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103438803A (en) * 2013-09-18 2013-12-11 苏州晓创光电科技有限公司 Method for performing view-field-across accurate measurement on size of rectangular part through computer vision technology
CN104897060A (en) * 2015-06-17 2015-09-09 大连理工大学 large Large field of view global measurement method using coordinates tracking control board
CN105823420A (en) * 2016-05-16 2016-08-03 北京控制工程研究所 Method for precise derivation of light-return energy center coordinates of pyramid combined part
CN106441200A (en) * 2016-07-20 2017-02-22 杭州先临三维科技股份有限公司 3 dimensional measuring method having multi-measuring modes
CN107429991A (en) * 2015-04-14 2017-12-01 雅马哈发动机株式会社 Appearance inspection device and appearance inspection method
CN108107837A (en) * 2018-01-16 2018-06-01 三峡大学 A kind of glass processing device and method of view-based access control model guiding
CN109883326A (en) * 2019-03-29 2019-06-14 湖南省鹰眼在线电子科技有限公司 A kind of videographic measurment formula automobile three-dimensional four-wheel aligner method, system and medium
CN110763141A (en) * 2019-08-29 2020-02-07 北京空间飞行器总体设计部 Precision verification method and system of high-precision six-degree-of-freedom measurement system
CN111561867A (en) * 2020-04-15 2020-08-21 成都飞机工业(集团)有限责任公司 Airplane surface appearance digital measurement method
CN113129385A (en) * 2020-12-23 2021-07-16 合肥工业大学 Binocular camera internal and external parameter calibration method based on multi-coding plane target in space
CN113503825A (en) * 2021-05-31 2021-10-15 北京卫星制造厂有限公司 Visual measurement method for deformation of dynamic structure
CN114279362A (en) * 2021-12-28 2022-04-05 中国航天空气动力技术研究院 Dynamic shape measuring device and method for heat-proof structure
WO2022134939A1 (en) * 2020-12-24 2022-06-30 上海智能制造功能平台有限公司 Data splicing and system calibration method for human body digital measurement device
CN114910021A (en) * 2022-05-07 2022-08-16 泰州市创新电子有限公司 Grating type binocular stereoscopic vision three-dimensional measurement system and method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009270915A (en) * 2008-05-07 2009-11-19 Kagawa Univ Method and device for measuring three-dimensional shape
US20100073461A1 (en) * 2008-09-23 2010-03-25 Sick Ag Lighting unit and method for the generation of an irregular pattern
CN101825445A (en) * 2010-05-10 2010-09-08 华中科技大学 Three-dimension measuring system for dynamic object
CN102062588A (en) * 2009-11-11 2011-05-18 中国科学院沈阳自动化研究所 Computer binocular vision denture scanning device and three-dimensional reconstruction method thereof

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009270915A (en) * 2008-05-07 2009-11-19 Kagawa Univ Method and device for measuring three-dimensional shape
US20100073461A1 (en) * 2008-09-23 2010-03-25 Sick Ag Lighting unit and method for the generation of an irregular pattern
CN102062588A (en) * 2009-11-11 2011-05-18 中国科学院沈阳自动化研究所 Computer binocular vision denture scanning device and three-dimensional reconstruction method thereof
CN101825445A (en) * 2010-05-10 2010-09-08 华中科技大学 Three-dimension measuring system for dynamic object

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
R.S.LU,ET AL: "A global calibration method for large-scale multi-sensor visual measurement systems", 《SENSORS AND ACTUATORS A》, vol. 116, 25 June 2004 (2004-06-25), pages 384 - 393, XP004572641, DOI: doi:10.1016/j.sna.2004.05.019 *
刘震 (Liu Zhen) et al.: "基于双平面靶标的多视觉传感器现场全局校准" [On-site global calibration of multiple vision sensors based on dual planar targets], 《机械工程学报》 [Journal of Mechanical Engineering], vol. 45, no. 7, 31 July 2009 (2009-07-31), pages 228-232 *

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103438803A (en) * 2013-09-18 2013-12-11 苏州晓创光电科技有限公司 Method for performing view-field-across accurate measurement on size of rectangular part through computer vision technology
CN103438803B (en) * 2013-09-18 2015-10-28 苏州晓创光电科技有限公司 Computer vision technique accurately measures the method for Rectangular Parts size across visual field
CN107429991A (en) * 2015-04-14 2017-12-01 雅马哈发动机株式会社 Appearance inspection device and appearance inspection method
CN104897060A (en) * 2015-06-17 2015-09-09 大连理工大学 large Large field of view global measurement method using coordinates tracking control board
CN104897060B (en) * 2015-06-17 2017-08-01 大连理工大学 Using the big visual field global measuring method of coordinate tracking control panel
CN105823420A (en) * 2016-05-16 2016-08-03 北京控制工程研究所 Method for precise derivation of light-return energy center coordinates of pyramid combined part
CN105823420B (en) * 2016-05-16 2018-06-01 北京控制工程研究所 A kind of pyramid sub-assembly light echo center of energy coordinate precise deriving method
CN106441200A (en) * 2016-07-20 2017-02-22 杭州先临三维科技股份有限公司 3 dimensional measuring method having multi-measuring modes
CN106441200B (en) * 2016-07-20 2019-03-08 先临三维科技股份有限公司 A kind of method for three-dimensional measurement of more measurement patterns
CN108107837A (en) * 2018-01-16 2018-06-01 三峡大学 A kind of glass processing device and method of view-based access control model guiding
CN109883326A (en) * 2019-03-29 2019-06-14 湖南省鹰眼在线电子科技有限公司 A kind of videographic measurment formula automobile three-dimensional four-wheel aligner method, system and medium
CN110763141A (en) * 2019-08-29 2020-02-07 北京空间飞行器总体设计部 Precision verification method and system of high-precision six-degree-of-freedom measurement system
CN110763141B (en) * 2019-08-29 2021-09-03 北京空间飞行器总体设计部 Precision verification method and system of high-precision six-degree-of-freedom measurement system
CN111561867A (en) * 2020-04-15 2020-08-21 成都飞机工业(集团)有限责任公司 Airplane surface appearance digital measurement method
CN113129385A (en) * 2020-12-23 2021-07-16 合肥工业大学 Binocular camera internal and external parameter calibration method based on multi-coding plane target in space
CN113129385B (en) * 2020-12-23 2022-08-26 合肥工业大学 Binocular camera internal and external parameter calibration method based on multi-coding plane target in space
WO2022134939A1 (en) * 2020-12-24 2022-06-30 上海智能制造功能平台有限公司 Data splicing and system calibration method for human body digital measurement device
CN113503825A (en) * 2021-05-31 2021-10-15 北京卫星制造厂有限公司 Visual measurement method for deformation of dynamic structure
CN113503825B (en) * 2021-05-31 2023-02-03 北京卫星制造厂有限公司 Visual measurement method for deformation of dynamic structure
CN114279362A (en) * 2021-12-28 2022-04-05 中国航天空气动力技术研究院 Dynamic shape measuring device and method for heat-proof structure
CN114910021A (en) * 2022-05-07 2022-08-16 泰州市创新电子有限公司 Grating type binocular stereoscopic vision three-dimensional measurement system and method

Also Published As

Publication number Publication date
CN102506758B (en) 2012-12-12

Similar Documents

Publication Publication Date Title
CN102506758B (en) Object surface three-dimensional morphology multi-sensor flexible dynamic vision measurement system and method
CN102445164B (en) Three-dimensional shape vision measuring method and system for large component surface
US8086026B2 (en) Method and system for the determination of object positions in a volume
RU2609434C2 (en) Detection of objects arrangement and location
CN102788559B (en) Optical vision measuring system with wide-field structure and measuring method thereof
CN104990515B (en) Large-sized object three-dimensional shape measure system and its measuring method
KR102424135B1 (en) Structured light matching of a set of curves from two cameras
CN105066962B (en) A kind of high-precision photogrammetric apparatus of the big angle of visual field of multiresolution
CN102155923A (en) Splicing measuring method and system based on three-dimensional target
CN105115560B (en) A kind of non-contact measurement method of cabin volume of compartment
CN103759670A (en) Object three-dimensional information acquisition method based on digital close range photography
CN103759669A (en) Monocular vision measuring method for large parts
CN104142157A (en) Calibration method, device and equipment
JP6333396B2 (en) Method and apparatus for measuring displacement of mobile platform
Rodríguez-Garavito et al. Automatic laser and camera extrinsic calibration for data fusion using road plane
JP2016217941A (en) Three-dimensional evaluation device, three-dimensional data measurement system and three-dimensional measurement method
CN104036518B (en) Camera calibration method based on vector method and three collinear points
CN105043301A (en) Grating strip phase solving method used for three-dimensional measurement
Jiang et al. Combined shape measurement based on locating and tracking of an optical scanner
CN105806319B (en) A kind of across shaft type image measuring method for glass pipeline three-dimensional motion analysis
Radhakrishna et al. Development of a robot-mounted 3D scanner and multi-view registration techniques for industrial applications
CN110686593B (en) Method for measuring relative position relation of image sensors in spliced focal plane
Tehrani et al. A new approach to 3D modeling using structured light pattern
Gong et al. Feature-based three-dimensional registration for repetitive geometry in machine vision
CN100370220C (en) Single-image self-calibration for relative parameter of light structural three-dimensional system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20121212

Termination date: 20171012

CF01 Termination of patent right due to non-payment of annual fee