CN104504691B - Camera position and posture measuring method on basis of low-rank textures - Google Patents
- Publication number
- CN104504691B CN104504691B CN201410777911.5A CN201410777911A CN104504691B CN 104504691 B CN104504691 B CN 104504691B CN 201410777911 A CN201410777911 A CN 201410777911A CN 104504691 B CN104504691 B CN 104504691B
- Authority
- CN
- China
- Prior art keywords
- low
- video camera
- image
- camera
- coordinate system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/40—Analysis of texture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
Abstract
The invention relates to a camera position and attitude measuring method based on low-rank textures and belongs to the technical field of computer vision measurement. The method uses low-rank textures present in an indoor scene, such as the grid texture of a ceiling, to measure the position and attitude of a camera. Images of the low-rank textures are shot with the camera; the projection matrix of the camera during shooting is parameterized with Euler angles, and the Euler angles representing the camera attitude are obtained by solving an optimization problem built on a low-rank texture imaging model. The geometric relationships of the textures in the scene are then combined with the camera imaging principle to work out the position of the camera in the scene. The effect and benefit of the method are that a six-degree-of-freedom measurement of the camera's attitude and spatial position can be obtained from the low-rank textures in the scene; moreover, the camera may rotate during measurement, so the method applies to a wider range of vision-measurement tasks.
Description
Technical field
The invention belongs to the technical field of computer vision measurement and relates to a camera spatial attitude and position measurement method based on low-rank textures.
Background technology
Indoor measurement is a widely used technology. With the continual improvement of CCD and image-processing technology in recent years, vision measurement has attracted growing attention from researchers thanks to its wide applicability and high precision. Vision positioning techniques use one or more cameras to acquire feature-bearing images of a scene, express and extract the image features by various methods, and finally use the information these features provide to solve for the camera's position and attitude in space. Image features divide into local features and global features. Local features mainly include corners, straight lines, and the like. Most current vision positioning methods are based on local image features: points or lines are extracted from the image to localize the camera. The measurement precision of such methods depends on the precision of local feature extraction, is highly susceptible to noise, and in some scenes does not reach the precision required by practical applications. Global features mainly include image color, texture, and so on. Textures are easy to find on indoor planes such as ceilings, floors, and walls, and they often exhibit repetition or symmetry; these properties can be exploited for camera-related measurement. At present, however, few methods use texture features to measure camera position and attitude.
In addition, most current methods place the camera in some fixed plane with a fixed shooting angle, so the camera cannot rotate while shooting. Moreover, most methods solve only the camera's two-dimensional position coordinates within its holding plane, not its three-dimensional coordinates in space.
Summary of the invention
The object of the invention is to provide a camera position and attitude measurement method based on low-rank textures. The method uses a camera whose mounting angle is not fixed to shoot textures in an indoor scene that have the low-rank property, such as the grid texture on a ceiling, and measures all six degrees of freedom of the camera's spatial position and attitude by processing the captured images.
The technical scheme of the invention is as follows:
A camera is used to shoot images of textures in the indoor scene, such as the grid texture on a ceiling. The projection matrix in the low-rank texture imaging model is parameterized with Euler angles, the imaging model so parameterized is solved by optimization, and the Euler angles representing the camera attitude are obtained. The geometric features of the texture and the imaging relations of the camera are then used to solve for the camera's position in space. The technical scheme is described in detail below: low-rank textures and how the camera shoots them are introduced first, then how the camera attitude is solved using the low-rank property of the texture, and finally the steps for solving the camera's spatial position.
Step one: Obtain an image containing a low-rank texture
Textures such as those shown in Fig. 1 can often be found on ceilings, floors, and walls in real scenes. These textures have symmetry and repeatability, and the rank of the matrix formed by their image is low; we therefore call them low-rank textures. A definition of a low-rank texture is given here: a texture, generally two-dimensional (2D), is regarded as a function I0(x, y) defined on the plane R2. If the family of functions {I0(x, ·)} spans a finite low-dimensional subspace, the texture I0 is defined as a low-rank texture. As shown in formula (1), for some small integer k:
rank(I0) = dim(span{I0(x, ·)}) ≤ k    (1)
I0 is then regarded as a rank-r texture. In general, when k is less than half the number of pixels of the evaluated window, the texture is considered low-rank.
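As an illustration of this definition, the numerical rank of a synthetic checkerboard window can be checked directly; the 64 × 64 pattern below is constructed for this sketch and is not taken from the patent:

```python
import numpy as np

# Synthetic checkerboard "low-rank texture": an 8x8 alternating 0/1
# pattern expanded into 8x8 constant blocks, i.e. a 64x64 image window.
pattern = np.indices((8, 8)).sum(axis=0) % 2      # only two distinct row types
texture = np.kron(pattern, np.ones((8, 8)))       # 64x64 checkerboard image

# The window has 64 rows but only 2 linearly independent ones, so the
# rank k is far below half the window size: a low-rank texture.
k = np.linalg.matrix_rank(texture)
print(k)  # 2
```

Any texture with this block-repetitive structure keeps its low rank regardless of the window size, which is why grid-like ceiling patterns qualify.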
The method first requires a calibrated camera to shoot the low-rank texture in the real scene. A calibrated camera is one whose focal length and principal point have been obtained by some calibration method. During shooting, the mounting angle of the camera need not be fixed: the camera may adjust its own attitude while shooting, measuring position and attitude at the same time.
Step two: Solve the camera attitude using the low-rank property of the texture
When the camera shoots a low-rank texture, projection deformation and noise are introduced, so the captured image no longer possesses the low-rank property. The low-rank texture can be recovered, and the shooting attitude of the camera obtained, by the following steps.
1) Establish the required coordinate systems
First establish the image coordinate system, the camera coordinate system, and the world coordinate system. As shown in Fig. 2, the image coordinate system takes the center of the captured image as its origin O, with the transverse axis OU and the longitudinal axis OV parallel to the image edges. The camera coordinate system takes the optical center Oc of the measuring camera as its origin, with the transverse axis OcXc and longitudinal axis OcYc parallel to the OU and OV axes of the image coordinate system, and the vertical axis OcZc perpendicular to the XcOcYc plane. The world coordinate system OwXwYwZw is chosen according to need; typically its XwOwYw plane is established in the plane of the low-rank texture features, and the directions of its axes are chosen so that the rank of the matrix expressing the low-rank texture is minimal.
2) Establish the camera imaging model and represent the camera projection matrix with Euler angles
Choose a point P on the low-rank texture; let its coordinates in the world coordinate system be (xw, yw, zw, 1)^T, its coordinates in the camera coordinate system (xc, yc, zc)^T, and the coordinates of its image p in the image plane (u, v, 1)^T. Under the pinhole imaging model, the relation between P and p is described by formula (2):
s [u, v, 1]^T = N [xc, yc, zc]^T = N (R [xw, yw, zw]^T + T)    (2)
where
N = [[f/dx, 0, u0], [0, f/dy, v0], [0, 0, 1]]
In formula (2), s is a nonzero scale factor, N is the intrinsic matrix of the camera, and f is the focal length of the camera; dx, dy are the pixel dimensions in the U and V directions, and (u0, v0) are the coordinates of the camera's optical center in the image coordinate system. R and T denote the rotation matrix and translation vector of the camera relative to the world coordinate system. Formula (2) is the camera imaging model.
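A minimal numeric sketch of the imaging model of formula (2); the intrinsics (f/dx = f/dy = 800, principal point (640, 512)) and the pose are invented for illustration only:

```python
import numpy as np

def intrinsic_matrix(fx, fy, u0, v0):
    # N with f/dx, f/dy folded into fx, fy
    return np.array([[fx, 0.0, u0],
                     [0.0, fy, v0],
                     [0.0, 0.0, 1.0]])

def project(N, R, T, Pw):
    # s [u, v, 1]^T = N (R Pw + T): the pinhole model of formula (2)
    Pc = R @ np.asarray(Pw) + T          # world -> camera coordinates
    uv = N @ Pc                          # camera -> homogeneous pixel coords
    return uv[:2] / uv[2]                # divide out the scale factor s

N = intrinsic_matrix(800, 800, 640, 512)
R, T = np.eye(3), np.array([0.0, 0.0, 3.0])   # frontal pose at depth 3

# A point on the optical axis projects to the principal point.
print(project(N, R, T, [0.0, 0.0, 0.0]))      # [640. 512.]
```

An off-axis point, e.g. (0.3, 0, 0) at the same depth, lands at 800·0.3/3 = 80 pixels to the right of the principal point, which is a quick sanity check on the scale factor.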
According to Euler's theorem, the rotation matrix R in formula (2) can be represented with three Euler angles. The rotation order of the camera axes is chosen as: first about the Z axis, then about the Y axis, and finally about the X axis, with clockwise rotation taken as positive; the corresponding three Euler angles are θx, θy, θz. The projection matrix can then be written as:
[R | T] = [Rx(θx) Ry(θy) Rz(θz) | T]    (3)
Formula (3) is the projection matrix represented with Euler angles. With the coordinate systems and the camera imaging model established, measuring the camera's attitude and position amounts to finding the Euler angles θx, θy, θz of the camera coordinate axes in the world coordinate system, together with the coordinates of the camera optical center Oc in the world coordinate system.
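The Euler-angle factorization of formula (3) can be sketched as follows; the clockwise-positive sign convention is my reading of the text, so the exact signs of the sine terms should be treated as an assumption:

```python
import numpy as np

def rot_zyx(theta_x, theta_y, theta_z):
    """Rotation composed in the order stated in the text: first about Z,
    then about Y, then about X (clockwise turns taken as positive)."""
    cz, sz = np.cos(theta_z), np.sin(theta_z)
    cy, sy = np.cos(theta_y), np.sin(theta_y)
    cx, sx = np.cos(theta_x), np.sin(theta_x)
    Rz = np.array([[cz,  sz, 0.0], [-sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, -sy], [0.0, 1.0, 0.0], [sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx,  sx], [0.0, -sx, cx]])
    return Rx @ Ry @ Rz          # Z applied first, X applied last

R = rot_zyx(0.1, -0.2, 0.3)
# Any such composition is a proper rotation: orthonormal, determinant +1.
print(np.allclose(R @ R.T, np.eye(3)), np.isclose(np.linalg.det(R), 1.0))
```

Whichever sign convention is adopted, the three angles recovered by the optimization below are the camera-attitude parameters sought.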
3) Establish the low-rank texture imaging model based on Euler angles
Let I0 denote the original low-rank texture image and I the low-rank texture image actually captured. As in Fig. 3, the XwOwYw plane of the world coordinate system is established in the plane of the low-rank texture features, and the axis directions are chosen so that the rank of the matrix expressing the texture is minimal. I0 is regarded as the image obtained when the camera shoots from a position translated along the Zw axis of the world coordinate system with no rotation, called the frontal image; I is regarded as the image obtained when the camera shoots from a position with some rotation and translation relative to the world coordinate system, called the tilted image. According to the camera model and the imaging principle, when a point P(xw, yw, zw) on the low-rank texture is shot frontally, the relation between its image coordinates (u1, v1) and world coordinates (xw, yw, zw) is expressed by formula (4); when it is shot obliquely, the relation between the image coordinates (u2, v2) and the world coordinates (xw, yw, zw) is expressed by formula (5):
s1 [u1, v1, 1]^T = N (R1 [xw, yw, zw]^T + T1)    (4)
s2 [u2, v2, 1]^T = N (R2 [xw, yw, zw]^T + T2)    (5)
where Ri, Ti (i = 1, 2) denote the rotation matrix and translation vector between the camera coordinate system and the world coordinate system for the frontal and oblique shots respectively. In the frontal case R1 is the identity and T1 = (0, 0, d)^T, where d is the translation depth between the camera coordinate system and the world coordinate system during frontal shooting; rmn (m, n = 1, 2, 3) are the corresponding entries of the rotation matrix represented by the Euler angles, and t1, t2, t3 are the three components of the translation vector.
Combining (4) and (5), the relation between (u1, v1) and (u2, v2) is obtained as follows:
s [u2, v2, 1]^T = N [r1, r2, t'] N^-1 [u1, v1, 1]^T    (6)
where r1, r2 are the first two columns of the rotation matrix R2 and t' = [t'1, t'2, t'3] = [t1/d, t2/d, t3/d] is the normalized translation vector. In deriving (6) it is assumed, without loss of generality, that the captured planar scene is the zw = 0 plane of the world coordinate system.
Let the symbol ∘ denote the operation between (u1, v1) and (u2, v2) in formula (6); the relation between the two pixels is then rewritten as:
(u2, v2) = (u1, v1) ∘ τ    (7)
where τ is the set of parameters {rmn, t'1, t'2, t'3} (m, n = 1, 2, 3) on which the ∘ operation depends. The rmn are the entries of the rotation matrix between the world coordinate system and the camera, represented by the Euler angles θx, θy, θz. From the above derivation, any point of the frontal image I0 can be converted by the ∘ operation into a point of the tilted image I. Generalizing the relation from single pixels to all pixels of the images, and accounting for random noise E, the relation between I0 and I follows:
I ∘ τ = I0 + E    (8)
where τ is the parameter set on which the ∘ operation in formula (7) depends. This establishes the imaging-relation model of the low-rank texture.
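The frontal-to-tilted relation described here is a plane-induced homography, and it can be checked numerically: projections of a point on the zw = 0 plane in the two views are linked by one 3×3 matrix. The intrinsics and the two poses below are invented for this sketch:

```python
import numpy as np

def plane_to_image(N, R, T):
    # Homography from plane coordinates (xw, yw, 1) on z_w = 0 to pixels:
    # only the first two rotation columns and the translation survive.
    return N @ np.column_stack([R[:, 0], R[:, 1], T])

N = np.array([[800, 0, 640], [0, 800, 512], [0, 0, 1.0]])
R1, T1 = np.eye(3), np.array([0.0, 0.0, 3.0])      # frontal shot at depth d = 3
th = 0.2                                           # tilted shot: rotation about X
R2 = np.array([[1, 0, 0],
               [0, np.cos(th), -np.sin(th)],
               [0, np.sin(th),  np.cos(th)]])
T2 = np.array([0.1, -0.2, 3.2])

# Homography taking frontal pixels to tilted pixels.
H = plane_to_image(N, R2, T2) @ np.linalg.inv(plane_to_image(N, R1, T1))

# Check: a plane point projected into both views is linked by H.
P = np.array([0.4, -0.3, 1.0])                     # (x_w, y_w, 1) on the texture
p1 = plane_to_image(N, R1, T1) @ P; p1 /= p1[2]
p2 = plane_to_image(N, R2, T2) @ P; p2 /= p2[2]
q = H @ p1; q /= q[2]
print(np.allclose(q, p2))  # True
```

The parameter set τ of the text is exactly the rotation entries and normalized translation hidden inside such an H.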
4) Solve the low-rank texture imaging model to obtain the camera attitude
Solving for the camera's shooting attitude means solving for τ. To this end, formula (8) is first converted into the optimization problem of formula (9):
min over I0, E, τ of rank(I0) + λ ||E||0   subject to  I ∘ τ = I0 + E    (9)
where ||E||0 denotes the number of nonzero entries of E. To solve this optimization problem, formula (9) is approximated as:
min over I0, E, τ of ||I0||* + λ ||E||1   subject to  I ∘ τ = I0 + E    (10)
where ||I0||* denotes the nuclear norm of I0 and ||E||1 the l1-norm of E. The constraint in formula (10) is nonlinear; it is linearized by expanding it at the point (u, v) according to Taylor's formula. Since u and v depend on τ, after expansion:
I ∘ τ + J Δτ = I0 + E    (11)
where J is the Jacobian of the image I with respect to the parameters in τ. With the above derivation, the original optimization problem is rewritten as:
min over I0, E, Δτ of ||I0||* + λ ||E||1   subject to  I ∘ τ + J Δτ = I0 + E    (12)
The optimization problem expressed by formula (12) is solved iteratively; the concrete steps are as follows:
S1. Receive the captured low-rank texture image as the input image. Set the initial value of τ to τ0 = (0, 0, 0, 0, 0, 1), whose first three parameters are the initial Euler angles and last three the initial normalized translations. Select a convergence precision ε > 0 and a weight λ > 0.
S2. Take a rectangular window containing the low-rank texture over the input image; denote the resulting image I.
S3. Repeat the following iterative steps on image I until the objective function f = ||I0||* + λ||E||1 converges globally:
Normalize the image I and assign the normalized value back to I;
Compute for I the Jacobian with respect to the Euler angles and the normalized translations;
Solve the problem stated by formula (13) with the augmented Lagrange multiplier method to obtain a local optimum τ' of τ:
min over I0, E, Δτ of ||I0||* + λ ||E||1   subject to  I ∘ τ + J Δτ = I0 + E    (13)
where I(uij(τ), vij(τ)) is the image I expressed by pixel values, with u, v functions of τ; ||I(uij(τ), vij(τ))||F denotes the F-norm of I(uij(τ), vij(τ)); and J is the Jacobian of I with respect to τ;
Using the ∘ operation, apply the projective transformation corresponding to the local optimum τ' to every point of image I, converting I into a new image; assign the new image back to I.
S4. Output the global optimum τ*, whose first three values are the Euler angles representing the camera attitude.
The optimal solution of τ is obtained through this iteration, and the parameters θx, θy, θz representing the camera attitude are read directly from τ. It should be pointed out that the normalized translation found by the iteration cannot represent the actual displacement of the camera; the camera's displacement must be solved through step three.
Step three: Solve the camera's position in space using the imaging principle and the geometric properties of the low-rank texture
Solving the camera's position in space proceeds in two steps. First the coordinates of the world origin Ow in the camera coordinate system are found; then the coordinates of the camera optical center Oc in the world coordinate system are obtained by coordinate transformation. The coordinates of Oc in the world coordinate system are the camera's position in space. The concrete steps are as follows:
Denote the coordinates of the optical center Oc in the world coordinate system by (xoc, yoc, zoc) and the coordinates of the world origin Ow in the camera coordinate system by (xow, yow, zow). Choose two feature points P1, P2 in the plane of the low-rank texture, as in Fig. 4. The points are selected so that their images are easy to extract from the picture; for a square low-rank texture its vertices can be chosen. The length between the two points is measured with a ruler and converted into coordinates (xwi, ywi, zwi) in the world coordinate system, i = 1, 2. Let the coordinates of P1, P2 in the camera coordinate system be (xci, yci, zci), and let the image coordinates of the two feature points extracted from the picture be (upi, vpi). Combining formula (2), (xci, yci, zci) and (upi, vpi) satisfy the relation of formula (14):
si [upi, vpi, 1]^T = N [xci, yci, zci]^T    (14)
from which, using the known distance between P1 and P2 to fix the scale, the coordinates (xci, yci, zci) can be solved.
Since camera coordinates (xc, yc, zc) and world coordinates (xw, yw, zw) are related by a rotation and a translation, the following relation holds:
[xc, yc, zc]^T = R [xw, yw, zw]^T + T    (16)
Substituting the coordinates (0, 0, 0) of the camera optical center in the camera coordinate system into formula (16) yields the coordinates of the optical center Oc in the world coordinate system:
[xoc, yoc, zoc]^T = -R^T T    (17)
where Rij (i, j = 1, 2, 3) are the entries of R. With the coordinates of Oc in the world coordinate system found, the camera's position in space is obtained.
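The coordinate-transformation step of formulas (16) and (17) reduces to one line: substituting Pc = 0 places the optical center at -R^T T. A small sketch with an invented pose:

```python
import numpy as np

def camera_center_world(R, T):
    # From P_c = R P_w + T, the optical center (P_c = 0) sits at -R^T T.
    return -R.T @ T

# Invented pose for the demo: camera rotated 30 degrees about Z,
# with the translation constructed from a known ground-truth center.
th = np.pi / 6
R = np.array([[np.cos(th), -np.sin(th), 0.0],
              [np.sin(th),  np.cos(th), 0.0],
              [0.0, 0.0, 1.0]])
C = np.array([1.0, 2.0, 3.0])   # ground-truth center in world coordinates
T = -R @ C                      # translation consistent with that center
print(np.allclose(camera_center_world(R, T), C))  # True
```

Round-tripping through a known center like this is a convenient self-check before applying the formula to measured R and T.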
The invention has the advantages that low-rank texture features present in indoor scenes are used, combined with the geometric relationships of these texture features, to find the camera's position and attitude in space. The method reaches high precision; during measurement the camera may rotate, and a six-degree-of-freedom measurement of camera attitude and spatial position is realized, so the method applies more widely and flexibly.
Brief description of the drawings
Fig. 1 shows low-rank textures taken from real scenes. Fig. 1(a) is a striped low-rank texture, Fig. 1(b) a checkerboard low-rank texture, and Fig. 1(c) a square low-rank texture.
Fig. 2 is a schematic diagram of the coordinate systems required for the solution. Therein, 1 is the world coordinate system OwXwYwZw, used to describe the positions of points in the real world and established as needed. 2 is the image coordinate system OUV, with the CCD center as origin O and the transverse axis OU and longitudinal axis OV parallel to the two perpendicular edges of the CCD. 3 is the camera coordinate system, with the camera optical center Oc as origin, transverse axis OcXc and longitudinal axis OcYc parallel to the OU and OV axes of the image coordinate system, and vertical axis OcZc perpendicular to the XcOcYc plane. P is a point on the low-rank texture in the scene, and p is its image in the image plane.
Fig. 3 is a schematic diagram of frontal and oblique shooting. Therein, 4 is the world coordinate system established as needed, 5 is the square texture target during shooting, 6 is the camera coordinate system during frontal shooting, and 7 is the camera coordinate system during oblique shooting.
Fig. 4 is a schematic diagram of solving the camera's three-dimensional spatial coordinates using the geometric relationships of the low-rank texture in the scene. The labels 1, 2, 3 correspond to the labels 1, 2, 3 in Fig. 2. P1, P2 are two feature points of the low-rank texture, and p1, p2 are their images in the image plane.
Fig. 5 is a schematic diagram of the experimental scene in the embodiment. Therein, 8 is the world coordinate system selected for the experiment, 9 the camera coordinate system selected for the experiment, and 10 the starting point of the measurement. During the experiment the camera starts from the point labeled 10, moves along the arrow direction, and shoots at the positions marked *.
Specific embodiment
A specific embodiment of the invention is described in detail below in conjunction with the technical scheme and the accompanying drawings. The procedure for solving the camera attitude with low-rank texture images is summarized as follows:
Step one: Select a low-rank texture usable for measurement. The selected texture should have symmetry or repeatability, and in general should have features such as points or straight lines so as to be easy to recognize in the image. In indoor scenes, ceiling grid textures, wall tile textures, and floor grid textures can be chosen as measurement textures.
After the low-rank texture has been selected, place the camera in the indoor scene; its position and angle need not be fixed, but the camera must be able to photograph the low-rank texture in the scene. Adjust the camera focal length and bring the texture into focus. Calibrate the focused camera with a calibration method (such as Zhang Zhengyou's method) to obtain the focal length and the image principal point. Once the camera has been calibrated, its focal length must not be adjusted again. Use the calibrated camera to shoot images of low-rank textures in the indoor scene that have symmetric and repetitive features.
Step two: Process the captured image and solve for the camera's shooting attitude. Choose in the captured image a rectangular region containing the low-rank texture and denote it I as input. Select the initial value τ0 = (0, 0, 0, 0, 0, 1) of the parameter set τ and a weight λ > 0. Apply the following steps to the input to find the parameters τ representing the camera attitude:
1. Receive the input: the image I, the initial value τ0, and the weight λ.
2. While the objective function f = ||I0||* + λ||E||1 has not converged globally, repeat the loop:
S1: Normalize the image I, assign the value to I, and compute the Jacobian J of I with respect to τ, where I(uij(τ), vij(τ)) is the image I expressed by pixels and ||I(uij(τ), vij(τ))||F denotes the F-norm of I(uij(τ), vij(τ)).
S2: Solve the following problem to obtain the optimal solution I0*, E*, Δτ*:
min over I0, E, Δτ of ||I0||* + λ ||E||1   subject to  I ∘ τ + J Δτ = I0 + E
S3: Update the parameter τ: τ ← τ + Δτ*.
3. Output: the final optimal solution τ* of the optimization problem, and take the first three values of τ* as the Euler-angle solution.
The problem required in step S2 above is solved with the augmented Lagrange multiplier method; its concrete algorithm steps are summarized as follows:
1. Receive the input: the image I ∘ τ after the ∘ operation, the Jacobian J of I with respect to τ, and the weight λ > 0; here τ is the local optimum of the current round of the outer iteration.
2. Substitute initial values: Y0 = 0, E0 = 0, Δτ0 = 0, μ0 > 0, ρ > 1, k = 0, where Y0 is the Lagrange multiplier, E0 the initial value of E, Δτ0 the initial update step of τ, μ0 the initial value of the augmentation coefficient μ in the augmented Lagrangian function, ρ the update factor of μ, and k the iteration count.
3. Iterate the following computation until convergence:
(Uk, Σk, Vk) = SVD(I ∘ τ + J Δτk - Ek + Yk/μk)
I0(k+1) = Uk S[1/μk](Σk) Vk^T
E(k+1) = S[λ/μk](I ∘ τ + J Δτk - I0(k+1) + Yk/μk)
Δτ(k+1) = J^+ (I0(k+1) + E(k+1) - I ∘ τ - Yk/μk)
Y(k+1) = Yk + μk (I ∘ τ + J Δτ(k+1) - I0(k+1) - E(k+1))
μ(k+1) = ρ μk
In the above, SVD(·) denotes singular value decomposition of the bracketed expression, S[ε](·) soft-thresholding with threshold ε, and J^+ the pseudo-inverse of J; Σk is the singular-value matrix of the k-th iteration; Yk, Y(k+1) are the k-th and (k+1)-th iterates of the Lagrange multiplier; Ek, E(k+1) the k-th and (k+1)-th iterates of the random noise; Δτk, Δτ(k+1) the k-th and (k+1)-th iteration steps of τ; μk, μ(k+1) the k-th and (k+1)-th iterates of the augmentation coefficient in the augmented Lagrangian function.
4. Output: the local optimum I0 of the low-rank image, the local optimum E of the random noise, and the local optimum Δτ of the step.
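The two proximal operators behind these ALM updates, soft-thresholding for the l1 term on E and singular-value thresholding for the nuclear norm on I0, can be sketched and exercised on a toy low-rank-plus-sparse matrix. The toy data, the iteration count, and λ = 0.5 are my choices for the sketch, not values from the patent:

```python
import numpy as np

def soft(X, t):
    # Prox of t*||.||_1: elementwise shrinkage toward zero.
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def svt(X, t):
    # Prox of t*||.||_*: shrink the singular values.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(soft(s, t)) @ Vt

# Toy instance of the constraint D = I0 + E: rank-1 pattern plus one spike.
D = np.outer([1.0, 2.0, 3.0, 4.0], np.ones(4))
D[1, 2] += 5.0                         # sparse corruption playing the role of E

A = np.zeros_like(D); E = np.zeros_like(D); Y = np.zeros_like(D)
mu, rho, lam = 1.0, 1.5, 0.5
for _ in range(60):                    # inexact ALM loop (fixed length for the demo)
    A = svt(D - E + Y / mu, 1.0 / mu)  # low-rank update
    E = soft(D - A + Y / mu, lam / mu) # sparse-noise update
    Y = Y + mu * (D - A - E)           # multiplier update
    mu *= rho                          # grow the augmentation coefficient
print(np.allclose(A + E, D, atol=1e-4))  # constraint satisfied at convergence
```

This is the same alternating prox-then-multiplier pattern as the iteration above, minus the Δτ step, which in the full algorithm is a least-squares update through the Jacobian J.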
A supplementary note: if, when shooting the low-rank texture, the camera's deflection angle relative to the world coordinate system is small, the area of the low-rank texture in the image can be assumed unchanged before and after projection, and the corresponding area-preservation condition can be added to the optimization problem of step two to make the solution more accurate. Through the above steps, the Euler-angle parameters representing the camera attitude are solved.
Step three: Extract features from the captured image, map the geometric properties of the texture in the scene to its geometric properties in the image coordinate system through the imaging relations, and find the coordinates of the camera optical center in the world coordinate system, that is, the camera's position in space. The camera optical center coordinates are computed as follows:
1. Choose on the real low-rank texture two feature points to be used for measurement. The points are selected so that their images are easy to extract; for a square texture its vertices are usually chosen. Measure the actual length between the feature points with a ruler and derive from it their coordinates in the world coordinate system. At least one pair of feature points is selected; several pairs may also be chosen. When multiple pairs are chosen, the camera position is computed from each pair separately, and the combined result over all pairs is taken as the camera's actual position.
2. Extract from the image, by image-processing methods, the pixel positions corresponding to the selected texture feature points, obtaining the pixel coordinates of the feature points. The image-processing method can be chosen according to the texture: typically the Harris robust corner-extraction algorithm extracts corners, the Hough transform extracts straight lines, and operators such as Canny extract edges. For a square texture whose vertices are chosen as feature points, straight lines can be extracted with the Hough transform and their intersections computed to obtain the pixel coordinates of the feature points.
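The vertex-from-lines step can be done in homogeneous coordinates, where both the line through two pixels and the intersection of two lines are cross products. The example corner at (100, 50) is invented for illustration:

```python
import numpy as np

def line_through(p, q):
    # Homogeneous line through two pixel points.
    return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

def intersect(l1, l2):
    # Intersection of two homogeneous lines, mapped back to pixel coordinates.
    x = np.cross(l1, l2)
    return x[:2] / x[2]

# Two texture edges (e.g. as fitted by a Hough transform) meeting at a vertex.
edge1 = line_through((0.0, 50.0), (200.0, 50.0))     # horizontal edge, y = 50
edge2 = line_through((100.0, 0.0), (100.0, 300.0))   # vertical edge, x = 100
print(intersect(edge1, edge2))   # [100.  50.]
```

Intersecting fitted lines rather than reading off a single detected corner averages out pixel noise along each edge, which is why the text prefers it for square textures.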
3. Substitute the Euler-angle parameters of the τ found in step two into formula (3) to compute the projection matrix [R | T]. Substitute the coordinates of the selected feature points in the world coordinate system and their coordinates in the pixel coordinate system into formulas (15) and (17) to compute the camera's position in space.
Embodiment:
An embodiment of an actual measurement using the square texture on a ceiling is given below. The experiment used an MCV1000SAM-HD camera with a 1/2-inch optical format, a pixel size of 5.2 × 5.2 μm2, an image resolution of 1280 × 1024, and a gigabit Ethernet interface, fitted with a 4–10 mm zoom lens. In the experiment the ceiling texture was a square of side length 600 mm, the ceiling was 3330 mm from the floor, and the camera was mounted, without a fixed angle, on a tripod 425 mm above the floor. The camera intrinsics were calibrated before the experiment to obtain the focal length and the coordinates of the optical center in the image plane. The experimental scene is sketched in Fig. 5; the camera moved through the positions labeled 1 to 24 in Fig. 5, shooting the ceiling. The vertices of the ceiling grid were chosen as feature points; edge extraction, morphological opening, morphological closing, and the Harris corner-extraction algorithm were applied in turn to the ceiling texture feature points in the captured images to obtain their pixel coordinates, and the corresponding camera optical center position in the world coordinate system was computed for each picture by combining these coordinates. The measured Euler angles of each shooting point, the measured coordinates (xi, yi, zi) of each shooting point in the world coordinate system, and the reference values of each shooting point are shown in Table 1. The data in the table are averages over multiple measurements. i denotes the index of the measurement point, and di,i-1 denotes the measured distance between point i and point i-1, computed by formula (18):
di,i-1 = sqrt((xi - xi-1)^2 + (yi - yi-1)^2 + (zi - zi-1)^2)    (18)
d0i,i-1 denotes the actual distance between point i and point i-1, obtained by ruler measurement. The absolute error is the absolute value of the difference between di,i-1 and d0i,i-1.
Table 1: Experimental results of measuring camera position and attitude using the ceiling texture
Claims (2)
1. A camera position and attitude measurement method based on low-rank textures, characterized in that the steps are as follows:
Step one: Obtain an image containing a low-rank texture
Shoot the low-rank texture in the scene with a camera whose focal length and principal point position have been calibrated, obtaining an image containing the low-rank texture;
Step two: Solve the camera's shooting attitude using the captured low-rank texture image
2.1 Establish the image coordinate system at the CCD center, the world coordinate system on the low-rank texture plane, and the camera coordinate system at the camera optical center; based on these coordinate systems, derive the relation among the coordinates of any point on the low-rank texture in the three coordinate systems, as in formula (2):
s [u, v, 1]^T = N [xc, yc, zc]^T = N (R [xw, yw, zw]^T + T)    (2)
where
N = [[f/dx, 0, u0], [0, f/dy, v0], [0, 0, 1]]
In formula (2), (u, v) are the pixel coordinates of a point on the low-rank texture, (xc, yc, zc) the coordinates of that point in the camera coordinate system, and (xw, yw, zw) its coordinates in the world coordinate system; s is a nonzero scale factor, N is the intrinsic parameter matrix of the camera, and f is the focal length of the camera; dx, dy are the pixel dimensions in the U and V directions, and (u0, v0) are the coordinates of the camera optical center in the image coordinate system, namely the principal point; R and T denote the rotation matrix and translation vector of the camera relative to the world coordinate system;
2.2 Represent with Euler angles the projection matrix of the camera when it shoots the low-rank texture; based on formula (2), revise the low-rank texture imaging model into a low-rank texture imaging model based on Euler angles, and convert the problem of solving the camera attitude into an optimization problem:
min over I0, E, τ of rank(I0) + λ ||E||0   subject to  I ∘ τ = I0 + E
where I0 denotes the original low-rank texture image and I the actually captured image of the low-rank texture; τ denotes the parameter set on which the projective transformation depends, containing six parameters: the three Euler angles θx, θy, θz of the camera coordinate axes about the world coordinate axes, and the normalized translations t'1, t'2, t'3 characterizing the camera optical center relative to the world origin; the symbol ∘ denotes an operation acting the projective transformation on I; E denotes random noise, ||E||0 denotes the number of nonzero entries of the random noise E, and λ is the weight of E;
2.3 Solve the optimization problem with the augmented Lagrange multiplier method to obtain the Euler-angle parameters representing the camera attitude; the concrete steps of solving the Euler angles are as follows:
S1. Receive the captured low-rank texture image as the input image; set the initial value of τ to τ0 = (0, 0, 0, 0, 0, 1), whose first three parameters are the initial Euler angles and last three the initial normalized translations; select a convergence precision ε > 0; select a weight λ > 0;
S2, the rectangular window containing low-rank texture is taken over an input image, gained image is designated as I';
S3, to image I', repeat following iterative step, until object function f=| | I0||*+λ||E||1Global convergence:
Image I' is normalized, and normalized value is assigned to I ' ';
The Jacobian matrix on Eulerian angles and normalized displacements is asked for I ' ';
The problem that formula (13) is stated is solved with augmented vector approach, the locally optimal solution τ ' of τ is obtained:
Wherein | | I0||*Represent I0Nuclear norm, | | E | |1Represent the l of E1- norm, I ' ' (uij(τ),vij(τ)) it is to use pixel value
The image I ' ' that (u, v) is represented, and u, v are the functions on τ,It is Jacobian matrixs of the I ' ' on τ,
Δ τ is the increment of τ, and i, j represent the row sequence number and row sequence number of I ' ' respectively;
Using o computings, the corresponding projective transformations of locally optimal solution τ ' are acted on each point of image I ' ', such I ' ' is just
It is converted into the new image of a width;The new images of gained are assigned to I' again;
S4, output globally optimal solution τ*, the value of its first three value as Eulerian angles of expression video camera attitude;
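The objective minimized in step S3 combines the nuclear norm of the recovered low-rank texture with a weighted l1-norm of the sparse noise term, a standard convex surrogate for rank plus ||E||0. A minimal sketch of evaluating f = ||I0||* + λ||E||1 with NumPy (the function name `objective` and the SVD-based evaluation are illustrative, not from the patent):

```python
import numpy as np

def objective(I0, E, lam):
    """Objective f = ||I0||_* + lam * ||E||_1 from step S3.
    The nuclear norm is the sum of singular values of I0;
    the l1-norm is the sum of absolute entries of E."""
    nuclear = np.linalg.svd(I0, compute_uv=False).sum()
    l1 = np.abs(E).sum()
    return nuclear + lam * l1

# For a diagonal matrix the singular values are the absolute diagonal entries
f = objective(np.diag([3.0, 4.0]), np.array([[1.0, -1.0], [0.0, 0.0]]), 0.5)
```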
Step 3: Solve the spatial position of the camera using the imaging principle and the geometric relations of the low-rank texture
Choose two feature points on the low-rank texture, measure their geometric relation in space, and compute their coordinates in the world coordinate system; meanwhile, extract the pixels corresponding to the feature points from the image captured by the camera to obtain the pixel coordinates of the feature points; derive the solution formula for the camera position from the pinhole imaging principle:
wherein one quantity represents the coordinate of the world coordinate system origin Ow under the camera coordinate system and another represents the coordinate of the camera's optical center Oc in the world coordinate system; (upi, vpi) denotes the pixel coordinate of a feature point; (u0, v0) is the camera principal point; fd is the imaging focal length of the camera;
and i, j = 1, 2; Rmn are entries of the inverse of the projection matrix, and rmn are entries of the projection matrix, m, n = 1, 2, 3,
as in formula (3):
wherein θx, θy, θz are the three Euler angles by which the axes of the camera coordinate system are rotated about the axes of the world coordinate system;
substitute the values of the Euler angles, the coordinates of the feature points in the world coordinate system, and the pixel coordinates of the images of the feature points in the image coordinate system into formulas (15) and (17) to compute the spatial position of the camera.
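The quantity sought in this step, the optical center Oc in world coordinates, also follows directly from the rigid transform in formula (2): setting the camera-frame coordinate of the optical center to zero gives Oc = -Rᵀt, a standard identity. The sketch below uses that identity as a cross-check; the patent's formulas (15) and (17), which are not reproduced on this page, instead recover Oc from two measured feature points, and the function name is illustrative.

```python
import numpy as np

def camera_center_world(R, t):
    """Optical center Oc in world coordinates.
    From pc = R @ pw + t, setting pc = 0 at the optical center
    gives Oc = -R^T @ t (standard pinhole identity)."""
    return -R.T @ t

# With no rotation, the center is simply the negated translation
oc = camera_center_world(np.eye(3), np.array([1.0, 2.0, 3.0]))
```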
2. The camera position and attitude measurement method based on low-rank textures according to claim 1, characterized in that the o operation in step 2.2 of step two transforms a point (u1, v1) on the low-rank texture I0, via formula (6), into the corresponding point (u2, v2) on the actually captured image I:
wherein t' = [t′1, t′2, t′3] = [t1/d, t2/d, t3/d] is the normalized displacement vector, and t1, t2, t3 is the true displacement vector of the camera; d is the translation depth of the camera; fd is the focal length of the camera; the parameters rmn are entries of the projection matrix, with m, n = 1, 2, 3,
as in formula (3):
wherein θx, θy, θz are the three Euler angles by which the axes of the camera coordinate system are rotated about the axes of the world coordinate system.
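Formula (3) builds the projection-matrix entries rmn from the three Euler angles θx, θy, θz. The exact rotation order is fixed by formula (3), which is not reproduced on this page; the sketch below assumes the common z-y-x composition Rz·Ry·Rx, and the function name is illustrative.

```python
import numpy as np

def rotation_from_euler(tx, ty, tz):
    """Rotation matrix from Euler angles (assumed z-y-x order, Rz @ Ry @ Rx);
    each factor rotates about one axis of the world coordinate system."""
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(tx), -np.sin(tx)],
                   [0.0, np.sin(tx),  np.cos(tx)]])
    Ry = np.array([[ np.cos(ty), 0.0, np.sin(ty)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(ty), 0.0, np.cos(ty)]])
    Rz = np.array([[np.cos(tz), -np.sin(tz), 0.0],
                   [np.sin(tz),  np.cos(tz), 0.0],
                   [0.0, 0.0, 1.0]])
    return Rz @ Ry @ Rx

R = rotation_from_euler(0.1, 0.2, 0.3)
```

Whatever the order, the result is orthonormal with determinant 1, so its inverse (the Rmn entries used in step three) is simply its transpose.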
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410777911.5A CN104504691B (en) | 2014-12-15 | 2014-12-15 | Camera position and posture measuring method on basis of low-rank textures |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104504691A CN104504691A (en) | 2015-04-08 |
CN104504691B true CN104504691B (en) | 2017-05-24 |
Family
ID=52946085
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410777911.5A Active CN104504691B (en) | 2014-12-15 | 2014-12-15 | Camera position and posture measuring method on basis of low-rank textures |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104504691B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20180086218A (en) * | 2015-11-20 | 2018-07-30 | 매직 립, 인코포레이티드 | Methods and systems for large-scale determination of RGBD camera poses |
CN109936712B (en) * | 2017-12-19 | 2020-12-11 | 陕西外号信息技术有限公司 | Positioning method and system based on optical label |
CN108827300A (en) * | 2018-04-17 | 2018-11-16 | 四川九洲电器集团有限责任公司 | A kind of the equipment posture position measurement method and system of view-based access control model |
WO2022021132A1 (en) * | 2020-07-29 | 2022-02-03 | 上海高仙自动化科技发展有限公司 | Computer device positioning method and apparatus, computer device, and storage medium |
CN113221253B (en) * | 2021-06-01 | 2023-02-07 | 山东贝特建筑项目管理咨询有限公司 | Unmanned aerial vehicle control method and system for anchor bolt image detection |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4859205B2 (en) * | 2005-02-04 | 2012-01-25 | キヤノン株式会社 | Information processing apparatus, information processing method, and program |
CN102122172B (en) * | 2010-12-31 | 2013-03-13 | 中国科学院计算技术研究所 | Image pickup system and control method thereof for machine motion control |
CN103268612B (en) * | 2013-05-27 | 2015-10-28 | 浙江大学 | Based on the method for the single image fisheye camera calibration of low-rank characteristic recovery |
- 2014-12-15: CN application CN201410777911.5A filed; granted as CN104504691B, status Active
Also Published As
Publication number | Publication date |
---|---|
CN104504691A (en) | 2015-04-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110296691B (en) | IMU calibration-fused binocular stereo vision measurement method and system | |
CN106651942B (en) | Three-dimensional rotating detection and rotary shaft localization method based on characteristic point | |
US10026218B1 (en) | Modeling indoor scenes based on digital images | |
CN108510551B (en) | Method and system for calibrating camera parameters under long-distance large-field-of-view condition | |
Sturm | Algorithms for plane-based pose estimation | |
US8098958B2 (en) | Processing architecture for automatic image registration | |
CN109919911B (en) | Mobile three-dimensional reconstruction method based on multi-view photometric stereo | |
CN104537707B (en) | Image space type stereoscopic vision moves real-time measurement system online | |
CN104504691B (en) | Camera position and posture measuring method on basis of low-rank textures | |
Saurer et al. | Homography based visual odometry with known vertical direction and weak manhattan world assumption | |
US20060215935A1 (en) | System and architecture for automatic image registration | |
CN107843251B (en) | Pose estimation method of mobile robot | |
CN101887585B (en) | Method for calibrating camera based on non-coplanar characteristic point | |
CN108629829B (en) | Three-dimensional modeling method and system of the one bulb curtain camera in conjunction with depth camera | |
CN106530358A (en) | Method for calibrating PTZ camera by using only two scene images | |
GB2506411A (en) | Determination of position from images and associated camera positions | |
CN105654547B (en) | Three-dimensional rebuilding method | |
CN108645398A (en) | A kind of instant positioning and map constructing method and system based on structured environment | |
CN110044374A (en) | A kind of method and odometer of the monocular vision measurement mileage based on characteristics of image | |
CN109613974B (en) | AR home experience method in large scene | |
CN107038753B (en) | Stereoscopic vision three-dimensional reconstruction system and method | |
CN102914295A (en) | Computer vision cube calibration based three-dimensional measurement method | |
CN113329179A (en) | Shooting alignment method, device, equipment and storage medium | |
Rousso et al. | Robust recovery of camera rotation from three frames | |
Kurillo et al. | Framework for hierarchical calibration of multi-camera systems for teleimmersion |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||