CN106101689B - Method for performing augmented reality on virtual reality glasses using a mobile phone monocular camera - Google Patents

Method for performing augmented reality on virtual reality glasses using a mobile phone monocular camera

Info

Publication number
CN106101689B
CN106101689B CN201610420931.6A CN201610420931A CN106101689B CN 106101689 B CN106101689 B CN 106101689B CN 201610420931 A CN201610420931 A CN 201610420931A CN 106101689 B CN106101689 B CN 106101689B
Authority
CN
China
Prior art keywords
coordinate system
camera
mobile phone
virtual camera
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610420931.6A
Other languages
Chinese (zh)
Other versions
CN106101689A (en)
Inventor
姜光
常河河
马超群
贾静
彭亲利
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xidian University
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN201610420931.6A priority Critical patent/CN106101689B/en
Publication of CN106101689A publication Critical patent/CN106101689A/en
Application granted granted Critical
Publication of CN106101689B publication Critical patent/CN106101689B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/361Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2213/00Details of stereoscopic systems
    • H04N2213/008Aspects relating to glasses for viewing stereoscopic images

Abstract

The invention discloses a method for performing augmented reality on virtual reality glasses using a mobile phone monocular camera, mainly solving the problem that the prior art serves only a single function. The technical scheme is: 1. establish a geodetic coordinate system from a marker map, and shoot the scene containing the marker map with the mobile phone's rear camera; 2. solve the homography transformation matrix H between the marker-map plane and the imaging plane of the mobile phone camera, and the projection matrix P of the camera; 3. establish a three-dimensional model of the virtual object and two virtual cameras C1 and C2; 4. calculate the imaging I1 and I2 of the three-dimensional model and the scene under the virtual cameras C1 and C2; 5. display I1 and I2 on the left and right halves of the mobile phone screen, with the mobile phone screen serving as the VR glasses display, completing the AR stereoscopic display effect. The invention realizes the function of letting virtual reality glasses perform augmented-reality stereoscopic display using a mobile phone monocular camera, has the advantages of low cost and simple installation, and can be used for virtual visualization in industry, education, and entertainment.

Description

Method for enhancing reality of virtual reality glasses by using monocular camera of mobile phone
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to a method for enhancing reality of virtual reality glasses, which can be used for virtual visualization of industry, education and entertainment.
Background
Augmented reality technology, AR technology for short, is a technology for calculating the position and angle of a camera image in real time and adding a corresponding virtual image, and the aim of the technology is to superimpose a real environment and a virtual object on the same picture in real time so that the real environment and the virtual object exist at the same time. With the development of AR technology, its application in the fields of education, medical treatment, art, etc. is becoming more and more extensive, and accordingly, wearable AR stereoscopic display technology is also in rapid upgrade.
At present, wearable AR stereoscopic display equipment is mainly AR glasses, and the function of the AR glasses is to add a virtual object model for a current scene on the premise that a user watches a real scene, and perform stereoscopic display on the model.
AR glasses on the market fall mainly into open and closed types. Open AR glasses resemble everyday eyewear, have a transparent open field of view, and create a stereoscopic display by projecting virtual pictures into the wearer's field of view with a micro projector. Closed AR glasses generally capture the scene with a binocular camera and add virtual data to the camera pictures in order to simulate the binocular parallax of the human eyes, i.e. human stereoscopic vision; the two cameras must be placed according to the human interpupillary distance, and a screen-splitting operation on the display then shows the wearer a different picture for each eye, which the brain fuses into a stereoscopic impression. These two systems have the following problems:
1) The projection technology is used for the open type AR glasses, the whole system is high in cost, the requirement on hardware is high, and the market cannot be well popularized;
2) The closed type AR glasses are heavy due to the fact that the two cameras are arranged, cost is high, meanwhile, software needs to process two paths of video textures at the same time, and high requirements are placed on performance of a processor.
The virtual reality glasses (VR glasses) currently popular on the market use the binocular parallax principle: they block outside vision, provide the left and right eyes with two virtual-scene pictures that conform to binocular parallax, and adjust the viewing distance, finally achieving stereoscopic display of the virtual scene. These glasses are inexpensive, and the display screen can be replaced by a mobile phone screen, so the user can be completely immersed in a virtual scene built by a developer. However, the two images that VR glasses provide to the user's eyes contain only information about the virtual scene; the user cannot see the real scene while wearing them, and the real scene cannot be displayed simultaneously with a virtual object model added to it.
Disclosure of Invention
The invention aims to provide a method for enhancing reality of virtual reality glasses by using a monocular camera of a mobile phone so as to realize simultaneous display of a real scene and a virtual object model added in the scene.
In order to achieve the purpose, the technical scheme of the invention comprises the following steps:
(1) Placing the identification graph on a plane in a real scene and establishing a geodetic coordinate system OXYZ, with the identification graph forming the XOY plane of the geodetic coordinate system and the Z axis of the geodetic coordinate system perpendicular to the identification graph;
(2) Shooting a scene containing the identification graph with the rear camera C of the mobile phone;
(3) Utilizing coordinate values of the characteristic points on the identification graph under a geodetic coordinate system and a mobile phone camera imaging plane coordinate system respectively to solve a homography transformation matrix H between the identification graph plane and the mobile phone camera imaging plane, and decomposing a camera projection matrix P of the geodetic coordinate system and the mobile phone camera imaging plane coordinate system from the H so as to obtain a position parameter of the mobile phone camera C;
(4) Establishing a virtual object three-dimensional model by utilizing modeling software, obtaining information of all points on the model, defining the specific positions of the model on a marking graph plane, and obtaining the coordinates of all points on the three-dimensional model under a geodetic coordinate system;
(5) Defining two virtual cameras C1 and C2, making the positions of the first virtual camera C1 and the second virtual camera C2 coincide with the positions of the user's two eyes behind the display screen when wearing the VR glasses, and obtaining the first rotation-translation matrix H1 from the mobile phone camera coordinate system to the coordinate system of the first virtual camera C1 and the second rotation-translation matrix H2 to the coordinate system of the second virtual camera C2;
(6) Utilizing the two rotation-translation matrices H1 and H2 obtained in step (5) and the camera projection matrix P obtained in step (3), calculating the imaging I_m1 of the points on the three-dimensional model in the first virtual camera C1 and the imaging I_m2 in the second virtual camera C2, and calculating the imaging I_e1 of all points in the scene in the first virtual camera C1 and the imaging I_e2 in the second virtual camera C2; superimposing I_m1 and I_e1 to obtain the total imaging I_1 of the scene and model in the first virtual camera C1, and superimposing I_m2 and I_e2 to obtain the total imaging I_2 of the scene and model in the second virtual camera C2;
(7) Placing the mobile phone screen transversely, dividing it into a left screen and a right screen from the middle, and writing, in code, the two images I_1 and I_2 obtained in step (6) into the left and right screens of the mobile phone respectively; then placing the mobile phone at the front end of the VR glasses, with the mobile phone screen serving as the display screen of the VR glasses; a user wearing the glasses looks at the marker map, realizing stereoscopic display of the virtual three-dimensional model in the real scene.
Compared with the prior art, the invention has the following advantages:
The invention combines the mobile phone with the virtual reality glasses, remedying the drawback that such glasses can only display a virtual scene, and displays the real scene and a virtual object model added to the scene simultaneously. Moreover, since VR glasses are simple in principle and inexpensive, the manufacturing cost is greatly reduced. In addition, since the mobile phone is an article available everywhere in daily life, the system can be put into use simply by installing the corresponding application package, so it can be popularized widely.
Drawings
FIG. 1 is a flow chart of an implementation of the present invention;
FIG. 2 is a schematic diagram of the two virtual cameras C1 and C2 constructed in the present invention and the positional relationship between their coordinate systems.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings.
Referring to fig. 1, the method comprises the following steps:
step 1, establishing a geodetic coordinate system according to the identification chart
The identification map is placed on a plane in a real scene, an earth coordinate system OXYZ is established, an XOY plane of the earth coordinate system is formed by the identification map, and the Z axis of the earth coordinate system is perpendicular to the identification map.
And 2, shooting a scene containing the identification map with the rear camera C of the mobile phone to obtain a picture containing the identification map.
And 3, solving the homography transformation matrix H and the projection matrix P of the camera by using the picture containing the identification map in the step 2.
The solving method of the step adopts a camera calibration method, which is realized as follows:
3a) The coordinate values of the feature points on the identification map in the geodetic coordinate system are recorded as (X, Y, Z); since the Z-axis component is 0, the homogeneous coordinate of a point on the identification map is represented as (X, Y, 1)^T;
3b) By detecting and locating the feature points of the identification map in the picture shot by the mobile phone, the coordinate values of the feature points in the mobile phone camera imaging plane coordinate system are obtained and recorded as (μ, ν), with homogeneous form (μ, ν, 1)^T;
3c) Solving a homography transformation matrix H:
3c1) Define H = K[r1, r2, t], where K is a 3×3 matrix of camera intrinsic parameters, known by default:

K = [ f_x   0    μ_0 ]
    [  0   f_y   ν_0 ]
    [  0    0     1  ]

where (f_x, f_y) is the camera focal length and (μ_0, ν_0) is the intersection of the camera's principal optical axis with the camera imaging plane;
3c2) Let r1 and r2 both be unknown 3×1 rotation vectors, written r1 = (α1, α2, α3)^T and r2 = (β1, β2, β3)^T, and let t be an unknown 3×1 translation vector, written t = (τ1, τ2, τ3)^T;
3c3) According to the projection equation between the identification-map plane and the mobile phone camera imaging plane, establish the following equation and substitute in the expression for H:

λ (μ, ν, 1)^T = H (X, Y, 1)^T    <1>

where λ is a scale factor. Define a 3×3 parameter matrix W:

W = [ ω1  ω2  ω3 ]
    [ ω4  ω5  ω6 ]  = H    <2>
    [ ω7  ω8  ω9 ]

where ω1, ω2, ..., ω9 are 9 unknown parameters to be solved; since the coordinates are in homogeneous form, ω9 is taken as 1, i.e. only the eight parameters ω1, ..., ω8 need to be found. Substituting formula <2> into formula <1> simplifies formula <1> to:

λ (μ, ν, 1)^T = W (X, Y, 1)^T    <3>

Expanding equation <3>:

λμ = ω1 X + ω2 Y + ω3
λν = ω4 X + ω5 Y + ω6
λ = ω7 X + ω8 Y + ω9    <4>

Further simplifying formula <4> (dividing the first two equations by the third, with ω9 = 1) yields:

μ = (ω1 X + ω2 Y + ω3) / (ω7 X + ω8 Y + 1)
ν = (ω4 X + ω5 Y + ω6) / (ω7 X + ω8 Y + 1)    <5>

Formula <5> contains the 8 unknown parameters ω1, ..., ω8; each pair of corresponding points supplies two equations, so the eight parameters can be solved from only 4 pairs of corresponding points.
From formula <2> it follows that:

[r1, r2, t] = K^(-1) W    <6>

From <6>, α1, α2, α3, β1, β2, β3, τ1, τ2, τ3 can be calculated, and H is solved;
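The linear system of formula <5> can be set up and solved numerically. The following is a numpy-only sketch, not code from the patent: `solve_homography`, the synthetic homography `H_true`, and the four marker-corner coordinates are illustrative assumptions.

```python
import numpy as np

def solve_homography(world_pts, image_pts):
    """Solve the 8 unknowns w1..w8 of the homography W (w9 fixed to 1)
    from >= 4 point correspondences, following formula <5>:
        mu = (w1*X + w2*Y + w3) / (w7*X + w8*Y + 1)
        nu = (w4*X + w5*Y + w6) / (w7*X + w8*Y + 1)
    world_pts: marker-plane coordinates (X, Y); image_pts: pixels (mu, nu)."""
    A, b = [], []
    for (X, Y), (mu, nu) in zip(world_pts, image_pts):
        # mu*(w7 X + w8 Y + 1) = w1 X + w2 Y + w3  -> linear in w1..w8
        A.append([X, Y, 1, 0, 0, 0, -mu * X, -mu * Y]); b.append(mu)
        A.append([0, 0, 0, X, Y, 1, -nu * X, -nu * Y]); b.append(nu)
    w, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)
    return np.append(w, 1.0).reshape(3, 3)  # W with w9 = 1

# 4 marker corners (marker units) and their synthetic image projections
world = [(0, 0), (1, 0), (1, 1), (0, 1)]
H_true = np.array([[200., 10., 320.], [5., 210., 240.], [0.02, 0.01, 1.]])
img = [(H_true @ [X, Y, 1])[:2] / (H_true @ [X, Y, 1])[2] for X, Y in world]
W = solve_homography(world, img)
print(np.allclose(W, H_true))  # recovered homography matches the synthetic one
```

Four correspondences give exactly eight equations for the eight unknowns; using `lstsq` also accommodates more than four points from a real marker detector.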
3d) Solving from H the projection matrix P from the geodetic coordinate system to the mobile phone camera imaging plane coordinate system:

P = K[r1, r2, r3, t],

where K and r1, r2, t in P are the same as K and r1, r2, t in H, and r3 is the unknown 3×1 rotation vector to be solved. Since r1, r2, r3 are the columns of a rotation matrix, r3 = r1 × r2; writing r3 = (γ1, γ2, γ3)^T, P can be expressed as:

P = K[(α1, α2, α3)^T, (β1, β2, β3)^T, (γ1, γ2, γ3)^T, (τ1, τ2, τ3)^T]

where:

γ1 = α2 β3 - α3 β2
γ2 = α3 β1 - α1 β3
γ3 = α1 β2 - α2 β1

Thus all parameters in P are obtained, giving P.
And 4, establishing a three-dimensional model of the virtual object.
And establishing a three-dimensional model of the virtual object by utilizing modeling software, obtaining information of all points on the model, defining the specific positions of the model on the plane of the marker graph, and obtaining the coordinates of all points on the three-dimensional model in a geodetic coordinate system.
Step 5, defining two virtual cameras C1 and C2 and calculating the rotation-translation matrices H1 and H2 from the mobile phone camera to C1 and C2 respectively.
5a) Defining two virtual cameras C1 and C2. Referring to FIG. 2, the first virtual camera C1 is placed at the position of the left eye when the user wears the VR glasses, and the second virtual camera C2 is placed at the position of the right eye when the user wears the VR glasses; the mobile phone screen serves as the VR glasses display screen;
5b) By measuring the distance from the mobile phone camera to the first virtual camera C1 and computing the projection of that distance on each axis of the mobile phone camera coordinate system, the rotation-translation matrix from the mobile phone camera coordinate system to the coordinate system of the first virtual camera C1, i.e. the first rotation-translation matrix H1, is obtained:

H1 = [ r_c1  r_c2  r_c3  t_c ]
     [  0     0     0     1  ]

where r_c1, r_c2, r_c3 are respectively the first, second, and third 3×1 rotation transformation vectors from the mobile phone camera coordinate system to the coordinate system of the first virtual camera C1, and t_c is the 3×1 translation vector from the mobile phone camera coordinate system to the coordinate system of the first virtual camera C1.
5c) By measuring the distance from the mobile phone camera to the second virtual camera C2 and computing the projection of that distance on each axis of the mobile phone camera coordinate system, the rotation-translation matrix from the mobile phone camera coordinate system to the coordinate system of the second virtual camera C2, i.e. the second rotation-translation matrix H2, is obtained:

H2 = [ r'_c1  r'_c2  r'_c3  t'_c ]
     [   0      0      0      1  ]

where r'_c1, r'_c2, r'_c3 are respectively the first, second, and third 3×1 rotation transformation vectors from the mobile phone camera coordinate system to the coordinate system of the second virtual camera C2, and t'_c is the 3×1 translation vector from the mobile phone camera coordinate system to the coordinate system of the second virtual camera C2.
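A hedged sketch of step 5: building 4×4 rotation-translation matrices from a measured camera-to-eye offset. The 64 mm interpupillary distance, the sign convention of the offsets, and the assumption that the virtual camera axes stay parallel to the phone camera axes are all illustrative choices, not values from the patent.

```python
import numpy as np

def eye_transform(offset, R=np.eye(3)):
    """4x4 rotation-translation matrix from the phone-camera coordinate
    system to a virtual (eye) camera coordinate system: the rotation block R
    holds the r_c vectors, `offset` is the measured camera-to-eye displacement
    projected on the phone camera axes."""
    H = np.eye(4)
    H[:3, :3] = R
    H[:3, 3] = offset
    return H

ipd = 0.064                                  # assumed interpupillary distance, m
H1 = eye_transform([+ipd / 2, 0.0, 0.0])     # left eye, C1
H2 = eye_transform([-ipd / 2, 0.0, 0.0])     # right eye, C2
x_c = np.array([0.0, 0.0, 2.0, 1.0])         # a point 2 m in front of the camera
print(H1 @ x_c, H2 @ x_c)  # the point shifts oppositely in the two eye frames
```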
Step 6, calculating the imaging I_1 and I_2 of all points on the three-dimensional model and all points in the scene under the virtual cameras C1 and C2.
6a) Solving the transformation relation from the geodetic coordinate system to the mobile phone camera coordinate system:
From the camera projection matrix P = K[r1, r2, r3, t] obtained in step (3), the transformation from the geodetic coordinate system to the mobile phone camera coordinate system is:

x_c = T X_e,  T = [ r1  r2  r3  t ]    <7>
                  [ 0   0   0   1 ]

where x_c is the homogeneous form of a point in the mobile phone camera coordinate system and X_e is the homogeneous representation of the point in the geodetic coordinate system;
6b) Obtaining the transformation from the geodetic coordinate system to the coordinate system of the first virtual camera C1:
6b1) According to the first rotation-translation matrix H1 from step (5), the transformation from the mobile phone camera coordinate system to the coordinate system of the first virtual camera C1 is:

x_c1 = H1 x_c    <8>

where x_c is the homogeneous form of a point in the mobile phone camera coordinate system and x_c1 is its homogeneous form in the coordinate system of the first virtual camera C1.
6b2) Substituting formula <7> into formula <8> gives the transformation from the geodetic coordinate system to the coordinate system of the first virtual camera C1:

x_c1 = H1 T X_e    <9>
6c) Calculating the transformation from the geodetic coordinate system to the coordinate system of the second virtual camera C2:
6c1) According to the second rotation-translation matrix H2 from step (5), the transformation from the mobile phone camera coordinate system to the coordinate system of the second virtual camera C2 is:

x_c2 = H2 x_c    <10>

where x_c2 is the homogeneous form of a point in the coordinate system of the second virtual camera C2;
6c2) Substituting formula <7> into formula <10> gives the transformation from the geodetic coordinate system to the coordinate system of the second virtual camera C2:

x_c2 = H2 T X_e    <11>
6d) From formulas <9> and <11>, the camera projection matrices P1 and P2 of the first virtual camera C1 and the second virtual camera C2 are obtained:

P1 = K [H1 T]_(3×4),  P2 = K [H2 T]_(3×4)

where [·]_(3×4) denotes the first three rows of the 4×4 product, and the K in P1 and P2 is consistent with the K in the projection matrix P of the mobile phone camera;
6e) Establishing the following equations using the projection equation, and calculating the projections of points on the three-dimensional model onto the imaging planes of the first virtual camera C1 and the second virtual camera C2 respectively:

λ m1 = P1 X_M,  λ m2 = P2 X_M

where m1 is the homogeneous representation of the projected point coordinates of a point on the three-dimensional model on the imaging plane of the first virtual camera C1, m2 is the homogeneous representation of the projected point coordinates on the imaging plane of the second virtual camera C2, and X_M is the homogeneous coordinate of the point on the three-dimensional model in the geodetic coordinate system;
6f) Repeating the calculation of step 6e) for all points on the three-dimensional model finally gives the imaging I_m1 of the three-dimensional model in the first virtual camera C1 and the imaging I_m2 in the second virtual camera C2;
6g) Establishing the following equations using the projection equation, and calculating the projections of points in the scene onto the imaging planes of the first virtual camera C1 and the second virtual camera C2 respectively:

λ e1 = P1 X_E,  λ e2 = P2 X_E

where e1 is the homogeneous representation of the projection coordinates of a point in the scene on the imaging plane of the first virtual camera C1, e2 is the homogeneous representation of the projection coordinates on the imaging plane of the second virtual camera C2, and X_E is the homogeneous coordinate of the point in the scene in the geodetic coordinate system;
6h) Repeating the calculation of step 6g) for all points in the scene finally gives the imaging I_e1 of the scene in the first virtual camera C1 and the imaging I_e2 in the second virtual camera C2;
6i) Superimposing the scene imaging I_e1 in the first virtual camera C1 obtained in step 6h) with the model imaging I_m1 in the first virtual camera C1 obtained in step 6f) to obtain the total imaging I_1 of the model and scene in the first virtual camera C1; superimposing the scene imaging I_e2 in the second virtual camera C2 obtained in step 6h) with the model imaging I_m2 in the second virtual camera C2 obtained in step 6f) to obtain the total imaging I_2 of the model and scene in the second virtual camera C2.
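The superposition of step 6i) amounts to compositing the model imaging over the scene imaging. A minimal sketch, assuming the model image carries a mask marking where the model was rendered; `overlay` and the toy 4×4 images are illustrative, not from the patent.

```python
import numpy as np

def overlay(scene_img, model_img, model_mask):
    """Composite the model imaging over the scene imaging: wherever the
    boolean mask is True, the model pixel replaces the scene pixel."""
    out = scene_img.copy()
    out[model_mask] = model_img[model_mask]
    return out

scene = np.zeros((4, 4, 3), np.uint8) + 50       # grey "camera" background
model = np.zeros((4, 4, 3), np.uint8)
model[1:3, 1:3] = 255                            # white 2x2 rendered model
mask = model.any(axis=2)                         # where the model was drawn
I1 = overlay(scene, model, mask)
print(I1[2, 2], I1[0, 0])  # model pixel inside, scene pixel outside
```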
Step 7, placing the mobile phone screen transversely and dividing it into a left screen and a right screen from the middle; writing, in code, the total imaging I_1 of the model and scene in the first virtual camera C1 and the total imaging I_2 of the model and scene in the second virtual camera C2 into the left and right screens of the mobile phone respectively; then placing the mobile phone at the front end of the VR glasses, with the mobile phone screen serving as the display screen of the VR glasses; a user wearing the glasses looks at the experimental scene containing the identification chart, and the real scene and the virtual three-dimensional model are displayed stereoscopically at the same time.
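The split-screen write of step 7 can be sketched as concatenating the two total imagings side by side into one landscape frame buffer; the 1920×1080 screen size and the solid-colour test images are assumptions for illustration only.

```python
import numpy as np

H, W = 1080, 1920                            # assumed landscape screen, pixels
I1 = np.full((H, W // 2, 3), 10, np.uint8)   # left-eye total imaging I_1
I2 = np.full((H, W // 2, 3), 200, np.uint8)  # right-eye total imaging I_2
screen = np.concatenate([I1, I2], axis=1)    # left | right split screen
print(screen.shape, screen[0, 0, 0], screen[0, -1, 0])
```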
While the foregoing is directed to embodiments of the present invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.

Claims (6)

1. A method for performing augmented reality on virtual reality glasses using a mobile phone monocular camera, comprising the following steps:
(1) Placing the identification graph on a plane in a real scene and establishing a geodetic coordinate system OXYZ, with the identification graph forming the XOY plane of the geodetic coordinate system and the Z axis of the geodetic coordinate system perpendicular to the identification graph;
(2) Shooting a scene containing the identification graph with the rear camera C of the mobile phone;
(3) Utilizing coordinate values of the characteristic points on the identification graph under a geodetic coordinate system and a mobile phone camera imaging plane coordinate system respectively, solving a homography transformation matrix H between the identification graph plane and the mobile phone camera imaging plane, and decomposing a camera projection matrix P of the geodetic coordinate system and the mobile phone camera imaging plane coordinate system from the H, thereby obtaining the position parameters of the mobile phone camera C;
(4) Establishing a virtual object three-dimensional model by utilizing modeling software, obtaining information of all points on the model, defining the specific positions of the model on a marking graph plane, and obtaining the coordinates of all points on the three-dimensional model under a geodetic coordinate system;
(5) Defining two virtual cameras C1 and C2, making the positions of the first virtual camera C1 and the second virtual camera C2 coincide with the positions of the user's two eyes behind the display screen when wearing the VR glasses, and obtaining the first rotation-translation matrix H1 from the mobile phone camera coordinate system to the coordinate system of the first virtual camera C1 and the second rotation-translation matrix H2 to the coordinate system of the second virtual camera C2;
(6) Utilizing the two rotation-translation matrices H1 and H2 obtained in step (5) and the camera projection matrix P obtained in step (3), calculating the imaging I_m1 of the points on the three-dimensional model in the first virtual camera C1 and the imaging I_m2 in the second virtual camera C2, and calculating the imaging I_e1 of all points in the scene in the first virtual camera C1 and the imaging I_e2 in the second virtual camera C2; superimposing I_m1 and I_e1 to obtain the total imaging I_1 of the scene and model in the first virtual camera C1, and superimposing I_m2 and I_e2 to obtain the total imaging I_2 of the scene and model in the second virtual camera C2;
(7) Placing the mobile phone screen transversely, dividing it into a left screen and a right screen from the middle, and writing, in code, the two images I_1 and I_2 obtained in step (6) into the left and right screens of the mobile phone respectively; then placing the mobile phone at the front end of the VR glasses, with the mobile phone screen serving as the display screen of the VR glasses; a user wearing the glasses looks at the marker map, realizing stereoscopic display of the virtual three-dimensional model in the real scene.
2. The method for augmented reality of virtual reality glasses by using a monocular mobile phone camera as claimed in claim 1, wherein the homography transformation matrix H and the projection matrix P in step (3) are solved by the following steps:
3a) The coordinate values of the feature points on the identification graph in the geodetic coordinate system are recorded as (X, Y, Z); since the Z-axis component is 0, the homogeneous coordinate of a point on the identification graph is represented as (X, Y, 1)^T;
3b) By detecting and locating the feature points of the identification graph in the picture shot by the mobile phone, the coordinate values of the feature points in the mobile phone camera imaging plane coordinate system are obtained and recorded as (μ, ν), with homogeneous form (μ, ν, 1)^T;
3c) Solving a homography transformation matrix H:
3c1) Define H = K[r1, r2, t], where K is a 3×3 matrix of camera intrinsic parameters, known by default:

K = [ f_x   0    μ_0 ]
    [  0   f_y   ν_0 ]
    [  0    0     1  ]

where (f_x, f_y) is the camera focal length and (μ_0, ν_0) is the intersection of the camera's principal optical axis with the camera imaging plane;
3c2) Let r1 and r2 both be unknown 3×1 rotation vectors, written r1 = (α1, α2, α3)^T and r2 = (β1, β2, β3)^T, and let t be an unknown 3×1 translation vector, written t = (τ1, τ2, τ3)^T;
3c3) According to the projection equation between the identification-graph plane and the mobile phone camera imaging plane, establish the following equation and substitute in the expression for H:

λ (μ, ν, 1)^T = H (X, Y, 1)^T    <1>

where λ is a scale factor. Define a 3×3 parameter matrix W:

W = [ ω1  ω2  ω3 ]
    [ ω4  ω5  ω6 ]  = H    <2>
    [ ω7  ω8  ω9 ]

where ω1, ω2, ..., ω9 are 9 unknown parameters to be solved; since the coordinates are in homogeneous form, ω9 is taken as 1, i.e. only the eight parameters ω1, ..., ω8 need to be found. Substituting formula <2> into formula <1> simplifies formula <1> to:

λ (μ, ν, 1)^T = W (X, Y, 1)^T    <3>

Expanding equation <3>:

λμ = ω1 X + ω2 Y + ω3
λν = ω4 X + ω5 Y + ω6
λ = ω7 X + ω8 Y + ω9    <4>

Further simplifying formula <4> (dividing the first two equations by the third, with ω9 = 1) yields:

μ = (ω1 X + ω2 Y + ω3) / (ω7 X + ω8 Y + 1)
ν = (ω4 X + ω5 Y + ω6) / (ω7 X + ω8 Y + 1)    <5>

Formula <5> contains the 8 unknown parameters ω1, ..., ω8; each pair of corresponding points supplies two equations, so the eight parameters can be solved from only 4 pairs of corresponding points.
From formula <2> it follows that:

[r1, r2, t] = K^(-1) W    <6>

From <6>, α1, α2, α3, β1, β2, β3, τ1, τ2, τ3 can be calculated, and H is solved;
3d) Solving from H the projection matrix P from the geodetic coordinate system to the mobile phone camera imaging plane coordinate system:

P = K[r1, r2, r3, t],

where K and r1, r2, t in P are the same as K and r1, r2, t in H, and r3 is the unknown 3×1 rotation vector to be solved. Since r1, r2, r3 are the columns of a rotation matrix, r3 = r1 × r2; writing r3 = (γ1, γ2, γ3)^T, P can be expressed as:

P = K[(α1, α2, α3)^T, (β1, β2, β3)^T, (γ1, γ2, γ3)^T, (τ1, τ2, τ3)^T]

where:

γ1 = α2 β3 - α3 β2
γ2 = α3 β1 - α1 β3
γ3 = α1 β2 - α2 β1

Thus all parameters in P are obtained, giving P.
3. The method for augmented reality of virtual reality glasses by using a monocular mobile phone camera as claimed in claim 1, wherein the first rotation-translation matrix H1 of the first virtual camera C1 in step (5) is expressed as follows:

H1 = [ r_c1  r_c2  r_c3  t_c ]
     [  0     0     0     1  ]

where r_c1, r_c2, r_c3 are respectively the first, second, and third 3×1 rotation transformation vectors from the mobile phone camera coordinate system to the coordinate system of the first virtual camera C1, and t_c is the 3×1 translation vector from the mobile phone camera coordinate system to the coordinate system of the first virtual camera C1.
4. The method for augmented reality of virtual reality glasses by using a monocular mobile phone camera as claimed in claim 1, wherein the second rotation-translation matrix H2 of the second virtual camera C2 in step (5) is expressed as follows:

H2 = [ r'_c1  r'_c2  r'_c3  t'_c ]
     [   0      0      0      1  ]

where r'_c1, r'_c2, r'_c3 are respectively the first, second, and third 3×1 rotation transformation vectors from the mobile phone camera coordinate system to the coordinate system of the second virtual camera C2, and t'_c is the 3×1 translation vector from the mobile phone camera coordinate system to the coordinate system of the second virtual camera C2.
5. The method for augmented reality of virtual reality glasses by using a monocular mobile phone camera as claimed in claim 1, wherein step (6) calculates the imaging I_m1 of the three-dimensional model in the first virtual camera C1 and the imaging I_m2 in the second virtual camera C2 by the following steps:
6a) And (3) solving a transformation relation between a geodetic coordinate system and a mobile phone camera coordinate system:
the projection matrix P = K [ r ] of the camera obtained in the step (3) 1 ,r 2 ,r 3 ,t]The transformation relation of the geodetic coordinate system to the mobile phone camera coordinate system is obtained as follows:
wherein X̃_c is the homogeneous form of coordinates in the mobile phone camera coordinate system, and X̃_w is the homogeneous representation of coordinates in the geodetic coordinate system;
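Formula <7> is a rigid-body transform applied in homogeneous coordinates. A minimal sketch follows (pure Python for portability; the rotation columns r1, r2, r3, the translation t, and the test point are illustrative values, not the patent's calibration data):

```python
# Rigid transform from geodetic (world) coordinates to the phone
# camera frame in homogeneous form: X_c = [r1 r2 r3 t; 0 1] X_w.

def make_homogeneous_transform(r1, r2, r3, t):
    """Build the 4x4 matrix [r1 r2 r3 t; 0 0 0 1] from column vectors."""
    rows = []
    for i in range(3):
        rows.append([r1[i], r2[i], r3[i], t[i]])
    rows.append([0.0, 0.0, 0.0, 1.0])
    return rows

def apply_transform(M, X):
    """Multiply a 4x4 matrix by a homogeneous 4-vector."""
    return [sum(M[i][j] * X[j] for j in range(4)) for i in range(4)]

# Illustrative extrinsics: identity rotation, camera 2 units along +Z.
r1, r2, r3 = [1, 0, 0], [0, 1, 0], [0, 0, 1]
t = [0, 0, 2]
G = make_homogeneous_transform(r1, r2, r3, t)

X_w = [1.0, 0.5, 0.0, 1.0]      # world point, homogeneous
X_c = apply_transform(G, X_w)   # -> [1.0, 0.5, 2.0, 1.0]
```

In practice a linear-algebra library (e.g. NumPy) would replace these helpers; the structure of the computation is the same.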
6b) Obtain the transformation relation from the geodetic coordinate system to the first virtual camera C1 coordinate system:
6b1) According to the first rotation-translation matrix H1 in step (5), establish the transformation from the mobile phone camera coordinate system to the first virtual camera C1 coordinate system:

X̃_c1 = H1 X̃_c      <8>
wherein X̃_c is the homogeneous form of the mobile phone camera coordinate system, and X̃_c1 is the homogeneous form of the first virtual camera C1 coordinate system;
6b2) Substituting formula <7> into formula <8> gives the transformation from the geodetic coordinate system to the first virtual camera C1 coordinate system:

X̃_c1 = H1 [r1  r2  r3  t; 0ᵀ  1] X̃_w      <9>
6c) Calculate the transformation from the geodetic coordinate system to the second virtual camera C2 coordinate system:
6c1) According to the second rotation-translation matrix H2 in step (5), establish the transformation from the mobile phone camera coordinate system to the second virtual camera C2 coordinate system:

X̃_c2 = H2 X̃_c      <10>
wherein X̃_c2 is the homogeneous form of the second virtual camera C2 coordinate system;
6c2) Substituting formula <7> into formula <10> gives the transformation from the geodetic coordinate system to the second virtual camera C2 coordinate system:

X̃_c2 = H2 [r1  r2  r3  t; 0ᵀ  1] X̃_w      <11>
6d) From formulas <9> and <11>, obtain the camera projection matrices P1 and P2 of the first virtual camera C1 and the second virtual camera C2:

P1 = K [I  0] H1 [r1  r2  r3  t; 0ᵀ  1]
P2 = K [I  0] H2 [r1  r2  r3  t; 0ᵀ  1]
wherein the K in matrices P1 and P2 is the same as the K in the projection matrix P of the mobile phone camera;
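The composition in steps 6b)-6d) is a chain of plain matrix products. The following sketch assumes, for illustration only, that H1 and H2 are pure horizontal shifts of half an eye baseline (0.06 assumed) and uses example intrinsics K; none of these values come from the patent:

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Illustrative intrinsics (fx = fy = 800, principal point (320, 240)).
K = [[800, 0, 320],
     [0, 800, 240],
     [0,   0,   1]]

# World-to-phone-camera extrinsics [R t; 0 1]; identity rotation here.
G = [[1, 0, 0, 0],
     [0, 1, 0, 0],
     [0, 0, 1, 2],
     [0, 0, 0, 1]]

# Phone camera to each virtual camera: assumed pure horizontal shift
# of half the eye baseline, i.e. H = [I c; 0 1].
def eye_transform(shift_x):
    return [[1, 0, 0, shift_x],
            [0, 1, 0, 0],
            [0, 0, 1, 0],
            [0, 0, 0, 1]]

H1, H2 = eye_transform(+0.03), eye_transform(-0.03)

# [I | 0] drops the homogeneous row, leaving a 3x4 projection.
I0 = [[1, 0, 0, 0],
      [0, 1, 0, 0],
      [0, 0, 1, 0]]

P1 = matmul(K, matmul(I0, matmul(H1, G)))  # 3x4, virtual camera C1
P2 = matmul(K, matmul(I0, matmul(H2, G)))  # 3x4, virtual camera C2
```

The two resulting 3×4 matrices differ only in their last column, which is what produces the left/right disparity between the two rendered views.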
6e) Using the projection equation, establish the following equations to calculate the projections of a point on the three-dimensional model onto the imaging planes of the first virtual camera C1 and the second virtual camera C2, respectively:

m̃_1 = P1 X̃_M,  m̃_2 = P2 X̃_M
wherein m̃_1 is the homogeneous representation of the coordinates of the point's projection on the imaging plane of the first virtual camera C1, m̃_2 is the homogeneous representation of the coordinates of the point's projection on the imaging plane of the second virtual camera C2, and X̃_M is the homogeneous form of the coordinates of the point on the three-dimensional model in the geodetic coordinate system;
6f) Repeat the calculation of step 6e) for all points on the three-dimensional model to finally obtain the imaging I_m1 of the three-dimensional model in the first virtual camera C1 and its imaging I_m2 in the second virtual camera C2.
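Steps 6e)-6f) amount to projecting every model point through a 3×4 camera matrix and dehomogenising the result. A sketch under assumed example values (the matrix P1 and the model points below are illustrative, not the patent's data):

```python
def project(P, X):
    """Project a homogeneous world point X (4-vector) through a 3x4
    camera matrix P and dehomogenise to pixel coordinates."""
    m = [sum(P[i][j] * X[j] for j in range(4)) for i in range(3)]
    return (m[0] / m[2], m[1] / m[2])

# Illustrative 3x4 camera P = K [R | t]: f = 800, principal point
# (320, 240), identity rotation, camera 2 units along +Z, so the
# last column is K @ t = [640, 480, 2].
P1 = [[800, 0, 320, 640],
      [0, 800, 240, 480],
      [0,   0,   1,   2]]

# Step 6f): apply the same projection to every point on the model.
model_points = [[0, 0, 0, 1], [0.1, 0, 0, 1], [0, 0.1, 0, 1]]
imaging = [project(P1, X) for X in model_points]
# The world origin projects to the principal point: (320.0, 240.0).
```

The scene points of steps 6g)-6h) go through exactly the same routine, only with scene coordinates in place of model coordinates.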
6. The method for performing augmented reality with virtual reality glasses using a mobile phone monocular camera as claimed in claim 1, wherein calculating in step (6) the imaging I_e1 of the scene in the first virtual camera C1 and its imaging I_e2 in the second virtual camera C2 comprises the following steps:
6g) Using the projection equation, establish the following equations to calculate the projections of a point in the scene onto the imaging planes of the first virtual camera C1 and the second virtual camera C2, respectively:

m̃'_1 = P1 X̃_E,  m̃'_2 = P2 X̃_E
wherein m̃'_1 is the homogeneous representation of the coordinates of the point's projection on the imaging plane of the first virtual camera C1, m̃'_2 is the homogeneous representation of the coordinates of the point's projection on the imaging plane of the second virtual camera C2, and X̃_E is the homogeneous form of the coordinates of the point in the scene in the geodetic coordinate system;
6h) Repeat the calculation of step 6g) for all points in the scene to finally obtain the imaging I_e1 of the scene in the first virtual camera C1 and its imaging I_e2 in the second virtual camera C2.
CN201610420931.6A 2016-06-13 2016-06-13 The method that using mobile phone monocular cam virtual reality glasses are carried out with augmented reality Active CN106101689B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610420931.6A CN106101689B (en) 2016-06-13 2016-06-13 The method that using mobile phone monocular cam virtual reality glasses are carried out with augmented reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610420931.6A CN106101689B (en) 2016-06-13 2016-06-13 The method that using mobile phone monocular cam virtual reality glasses are carried out with augmented reality

Publications (2)

Publication Number Publication Date
CN106101689A CN106101689A (en) 2016-11-09
CN106101689B true CN106101689B (en) 2018-03-06

Family

ID=57846309

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610420931.6A Active CN106101689B (en) 2016-06-13 2016-06-13 The method that using mobile phone monocular cam virtual reality glasses are carried out with augmented reality

Country Status (1)

Country Link
CN (1) CN106101689B (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106774870A (en) * 2016-12-09 2017-05-31 武汉秀宝软件有限公司 A kind of augmented reality exchange method and system
CN106534800A (en) * 2016-12-12 2017-03-22 大连文森特软件科技有限公司 Drawing auxiliary system based on Augmented Reality (AR) technology and wireless communication technology
CN106534802A (en) * 2016-12-12 2017-03-22 大连文森特软件科技有限公司 AR technology and wireless communication technology based drawing auxiliary system
CN106534801A (en) * 2016-12-12 2017-03-22 大连文森特软件科技有限公司 Auxiliary painting system based on AR technique and data mining
CN106937085A (en) * 2016-12-12 2017-07-07 大连文森特软件科技有限公司 Drawing accessory system based on AR augmented realities
US20180211447A1 (en) * 2017-01-24 2018-07-26 Lonza Limited Methods and Systems for Using a Virtual or Augmented Reality Display to Perform Industrial Maintenance
JP6426772B2 (en) * 2017-02-07 2018-11-21 ファナック株式会社 Coordinate information conversion apparatus and coordinate information conversion program
CN106980371B (en) * 2017-03-24 2019-11-05 电子科技大学 It is a kind of based on the mobile augmented reality exchange method for closing on heterogeneous distributed structure
CN107071384B (en) * 2017-04-01 2018-07-06 上海讯陌通讯技术有限公司 The binocular rendering intent and system of virtual active disparity computation compensation
CN106982298A (en) * 2017-04-01 2017-07-25 小派科技(上海)有限责任公司 A kind of implementation method of virtual reality mobile terminal and virtual reality mobile terminal
CN107168520B (en) * 2017-04-07 2020-12-18 北京小鸟看看科技有限公司 Monocular camera-based tracking method, VR (virtual reality) equipment and VR head-mounted equipment
CN108984075B (en) * 2017-05-31 2021-09-07 华为技术有限公司 Display mode switching method and device and terminal
CN112805075A (en) * 2018-06-15 2021-05-14 伊瓦·阿尔布佐夫 Advanced game visualization system
US10777012B2 (en) 2018-09-27 2020-09-15 Universal City Studios Llc Display systems in an entertainment environment
CN110111413A (en) * 2019-04-08 2019-08-09 西安电子科技大学 A kind of sparse cloud three-dimension modeling method based on land and water coexistence scenario
CN112631424A (en) * 2020-12-18 2021-04-09 上海影创信息科技有限公司 Gesture priority control method and system and VR glasses thereof
CN113188439B (en) * 2021-04-01 2022-08-12 深圳市磐锋精密技术有限公司 Internet-based automatic positioning method for mobile phone camera shooting
CN113112545B (en) * 2021-04-15 2023-03-21 西安电子科技大学 Handheld mobile printing device positioning method based on computer vision

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011159757A2 (en) * 2010-06-15 2011-12-22 Sensics Inc. Systems and methods for personal viewing devices
CN104345802A (en) * 2013-08-08 2015-02-11 派布勒斯有限公司 Method and device for controlling a near eye display
CN104580986A (en) * 2015-02-15 2015-04-29 王生安 Video communication system combining virtual reality glasses
CN204462541U (en) * 2015-01-02 2015-07-08 靳卫强 A kind of intelligent glasses realizing augmented reality

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8743244B2 (en) * 2011-03-21 2014-06-03 HJ Laboratories, LLC Providing augmented reality based on third party information

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011159757A2 (en) * 2010-06-15 2011-12-22 Sensics Inc. Systems and methods for personal viewing devices
CN104345802A (en) * 2013-08-08 2015-02-11 派布勒斯有限公司 Method and device for controlling a near eye display
CN204462541U (en) * 2015-01-02 2015-07-08 靳卫强 A kind of intelligent glasses realizing augmented reality
CN104580986A (en) * 2015-02-15 2015-04-29 王生安 Video communication system combining virtual reality glasses

Also Published As

Publication number Publication date
CN106101689A (en) 2016-11-09

Similar Documents

Publication Publication Date Title
CN106101689B (en) The method that using mobile phone monocular cam virtual reality glasses are carried out with augmented reality
US10269177B2 (en) Headset removal in virtual, augmented, and mixed reality using an eye gaze database
JP4804256B2 (en) Information processing method
CN108513123B (en) Image array generation method for integrated imaging light field display
CN107016704A (en) A kind of virtual reality implementation method based on augmented reality
KR20200012043A (en) Head-mounted display for virtual and mixed reality with inside-out positional, user body and environment tracking
CN103839227B (en) Fisheye image correcting method and device
US6388666B1 (en) System and method for generating stereoscopic image data
CN105916022A (en) Video image processing method and apparatus based on virtual reality technology
CN104599317B (en) A kind of mobile terminal and method for realizing 3D scanning modeling functions
KR20140108128A (en) Method and apparatus for providing augmented reality
CN109361913A (en) For providing the method and apparatus of 3-D image for head-mounted display
CN101631257A (en) Method and device for realizing three-dimensional playing of two-dimensional video code stream
CN205610834U (en) Stereo display system
CN101729920A (en) Method for displaying stereoscopic video with free visual angles
CN203746012U (en) Three-dimensional virtual scene human-computer interaction stereo display system
CN108616752B (en) Head-mounted equipment supporting augmented reality interaction and control method
JP2014095808A (en) Image creation method, image display method, image creation program, image creation system, and image display device
CN110337674A (en) Three-dimensional rebuilding method, device, equipment and storage medium
JP2014095809A (en) Image creation method, image display method, image creation program, image creation system, and image display device
US20210295587A1 (en) Stylized image painting
CN108830944B (en) Optical perspective three-dimensional near-to-eye display system and display method
CN107545537A (en) A kind of method from dense point cloud generation 3D panoramic pictures
CN103530869B (en) For mating the system and method that moving mass controls
CN109003294A A kind of unreal & real space location registration and accurate matching process

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant