CN106878698B - Method and system for mobile naked-eye three-dimensional virtual reality based on optical-path acquisition - Google Patents
Method and system for mobile naked-eye three-dimensional virtual reality based on optical-path acquisition
- Publication number
- CN106878698B (application CN201611202874.0A)
- Authority
- CN
- China
- Prior art keywords
- image
- virtual reality
- viewpoint
- naked eye
- optical path
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/261—Image signal generators with monoscopic-to-stereoscopic image conversion
- H04N13/30—Image reproducers
- H04N13/302—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
Abstract
An embodiment of the present invention proposes a method and system for mobile naked-eye three-dimensional virtual reality based on optical-path acquisition. The method may include: acquiring images; obtaining composite-image index maps for two viewpoints from the acquired images; obtaining two viewpoint images according to position information derived from acceleration and rotation information derived from angular velocity; and rendering the two viewpoint images into a composite image using the two-viewpoint composite-image index maps, thereby obtaining a three-dimensional virtual-reality display image. Preferably, these steps may include: determining sub-pixel brightness influence weights on the image and then determining the composite-image index maps of the two viewpoints by least squares with a bounding box; and adjusting the orientation of the acquired images according to the position and rotation information to obtain the two viewpoint images. By adopting the above technical solution, the embodiment of the present invention solves the technical problem of how to realize a high-quality naked-eye three-dimensional virtual-reality display.
Description
Technical field
The present invention relates to three-dimensional display methods, and in particular to a method and system for mobile naked-eye three-dimensional virtual reality based on optical-path acquisition.
Background art
At present, virtual reality has set off a wave around the world, and three-dimensional virtual reality is a newer direction of development. Virtual-reality technology is a comprehensive achievement that integrates tracking systems, haptic systems, image generation and display systems, and visual display devices, but many problems remain to be solved on the road to practical commercialization. Virtual reality is mostly delivered through a head-mounted helmet that integrates the relevant technologies, but the headband size cannot be adjusted, the helmet is too heavy, and the device is poorly ventilated and dissipates heat badly when worn for a long time; all of these factors reduce the viewer's comfort. Since a virtual-reality device needs to be connected to a computer for signal transmission, a very long cable is often used for communication. When the viewer moves with attention fixed on the display device, the cable underfoot is easily ignored and poses a tripping hazard. Moreover, virtual-reality devices with better performance are generally expensive: as display tools, the content resources available for them are extremely limited, yet their price exceeds what the general public can bear. Compared with traditional virtual reality, three-dimensional virtual reality is closer to actual human perception and can intuitively give the viewer a sense of depth. Among the many three-dimensional display technologies at home and abroad, holography records and restores real-object images by means of the optical phenomena of interference and diffraction; however, current holographic devices use lasers for image acquisition and display, and the equipment is extremely expensive. Integral-imaging three-dimensional display technology still has problems in aspects such as inter-viewpoint crosstalk and the number of viewpoints. Naked-eye stereoscopic three-dimensional display has considerable commercial promise and has already walked out of the laboratory stage, but most such devices are structurally complex and bulky, requiring multiple material devices and self-designed control hardware, and are therefore unsuitable for the highly mobile setting of virtual-reality technology.
Mobile terminals, thanks to their portability and popularity, hold an inherent advantage in three-dimensional virtual-reality technology. Microlens-array naked-eye three-dimensional display, based on a microlens array covering a two-dimensional flat screen, is relatively simple to realize, and the quality of its reconstructed three-dimensional images is sufficient for many consumer electronic devices. However, existing rendering methods for this display technology, such as the Philips modulo algorithm or the light back-projection method, rely on accurate device measurement and calibration to produce a high-quality three-dimensional display effect.
In view of this, the present invention is proposed.
Summary of the invention
In order to solve the above problems in the prior art, namely the technical problem of how to realize a high-quality naked-eye three-dimensional virtual-reality display, a method for mobile naked-eye three-dimensional virtual reality based on optical-path acquisition is proposed. In addition, a system for mobile naked-eye three-dimensional virtual reality based on optical-path acquisition is also provided.
To achieve the above object, according to one aspect of the present invention, the following technical solution is provided:
A method for mobile naked-eye three-dimensional virtual reality based on optical-path acquisition, the method comprising:
acquiring images;
obtaining composite-image index maps for two viewpoints from the acquired images;
obtaining two viewpoint images according to the position information derived from acceleration and the rotation information derived from angular velocity;
rendering the two viewpoint images into a composite image using the two-viewpoint composite-image index maps, thereby obtaining a three-dimensional virtual-reality display image.
Further, obtaining the two-viewpoint composite-image index maps from the acquired images specifically includes:
determining sub-pixel brightness influence weights on the image;
determining the composite-image index maps of the two viewpoints by least squares with a bounding box, based on the sub-pixel brightness influence weights on the image.
Further, determining the sub-pixel brightness influence weights on the image may specifically include:
determining the sub-pixel brightness influence weights on the image according to the following formula:
where Image_c(t)(i, j, k) denotes a sub-pixel of the acquired image at the t-th viewpoint, with t = 0, 1, i = 1~H, j = 1~W; H denotes the height of the image; W denotes the width of the image; r denotes the security-window radius; m, n denote window coordinates with the security-window center as the origin; q = 1~3; Image_m(i + m, j + n, q) denotes a sub-pixel on the displayed image; and the weight term denotes the normalized brightness influence weight of sub-pixel (i + m, j + n, q), when lit to its maximum brightness, on the neighboring sub-pixel (i, j, k).
Further, determining the composite-image index maps of the two viewpoints by least squares with a bounding box, based on the sub-pixel brightness influence weights on the image, may specifically include:
determining the composite-image index maps of the two viewpoints according to the following formula, using least squares with a bounding box:
where the weight-difference term denotes the difference between the corresponding sub-pixel brightness influence weights of the two viewpoints; I denotes the composite-image index map of the two viewpoints and is confined to the bounding box B_I = {I ∈ Z^(1×H): L ≤ I ≤ U}, with L = 0 × 1_(1×H) and U = 255 × 1_(1×H), where 1_(1×H) denotes the H-dimensional all-ones vector; the image-difference term denotes the difference between the images acquired at the two viewpoints; and H denotes the height of the image.
Further, before the step of determining the sub-pixel brightness influence weights on the image, the method may also include:
undistorting the acquired image and performing region-of-interest extraction on it.
Further, obtaining the two viewpoint images according to the position information derived from acceleration and the rotation information derived from angular velocity may specifically include:
adjusting the orientation of the acquired images according to the position information derived from acceleration and the rotation information derived from angular velocity, thereby obtaining the two viewpoint images.
Rendering the two viewpoint images into a composite image using the two-viewpoint composite-image index maps to obtain the three-dimensional virtual-reality display image specifically includes:
rendering the composite image according to the following formula to obtain the three-dimensional virtual-reality display image:
Syn = I_0 × view_0 + I_1 × view_1
where Syn denotes the three-dimensional virtual-reality display image; I_0 denotes the composite-image index map of the 0th viewpoint; I_1 denotes the composite-image index map of the 1st viewpoint; view_0 denotes the 0th viewpoint image; and view_1 denotes the 1st viewpoint image.
To achieve the above object, according to another aspect of the present invention, the following technical solution is provided:
A system for mobile naked-eye three-dimensional virtual reality based on optical-path acquisition, the system comprising:
an image acquisition unit, for acquiring images and sending them to a controller;
a mobile terminal, including a screen covered with a microlens-array film, for sending the position information derived from acceleration and the rotation information derived from angular velocity to the controller, and for displaying the three-dimensional virtual-reality display image on the screen;
a controller, communicatively connected with the image acquisition unit and the mobile terminal respectively, for processing the images to obtain the two-viewpoint composite-image index maps, obtaining two viewpoint images according to the position information derived from acceleration and the rotation information derived from angular velocity, rendering the two viewpoint images into a composite image using the two-viewpoint composite-image index maps to obtain the three-dimensional virtual-reality display image, and sending the three-dimensional virtual-reality display image to the mobile terminal.
Further, the mobile terminal may also include:
an accelerometer, for obtaining the position information derived from acceleration;
a gyroscope, for obtaining the rotation information derived from angular velocity.
Further, the image acquisition unit is a monocular camera or a binocular camera.
Further, the mobile terminal is a mobile phone or a personal digital assistant.
Further, the controller is a computer, a laptop, a server, or an industrial personal computer.
An embodiment of the present invention proposes a method and system for mobile naked-eye three-dimensional virtual reality based on optical-path acquisition. The method may include: acquiring images; obtaining two-viewpoint composite-image index maps from the acquired images; obtaining two viewpoint images according to the position information derived from acceleration and the rotation information derived from angular velocity; and rendering the two viewpoint images into a composite image using the two-viewpoint composite-image index maps to obtain the three-dimensional virtual-reality display image. By combining three-dimensional display technology with virtual-reality technology, the embodiment of the present invention can present a high-quality three-dimensional display effect even when the parameters of the microlens array and the mobile-terminal screen are unknown. Compared with a traditional virtual-reality device, it removes the weight of the helmet and the sense of constraint caused by poor comfort, while adding a genuine three-dimensional display effect from the device itself rather than merely relying on the two eyes receiving different left and right scene pictures to achieve a stereoscopic effect.
Brief description of the drawings
Fig. 1 is a structural schematic diagram of a system for mobile naked-eye three-dimensional virtual reality based on optical-path acquisition according to an embodiment of the present invention;
Fig. 2 is a structural schematic diagram of a system for mobile naked-eye three-dimensional virtual reality based on optical-path acquisition according to another embodiment of the present invention;
Fig. 3 is a schematic diagram of the naked-eye three-dimensional virtual-reality display effect on a hand-held mobile phone at three different observation points according to an embodiment of the present invention;
Fig. 4 is a flow diagram of a method for mobile naked-eye three-dimensional virtual reality based on optical-path acquisition according to an embodiment of the present invention.
Detailed description of embodiments
Preferred embodiments of the present invention are described below with reference to the accompanying drawings. Those skilled in the art will appreciate that these embodiments are only used to explain the technical principle of the present invention and are not intended to limit its scope of protection.
An exemplary embodiment is used below to illustrate the application platform of the embodiment of the present invention. Two image acquisition units with a baseline of 60 mm can be used to simulate human eyes, placed about 300 mm from the mobile-terminal screen. A sealed experiment box provides a closed environment, isolated from ambient light, for the optical-path acquisition of the image acquisition units, and a fan is arranged behind the image acquisition units for heat dissipation. Preferably, in order to maximize the utilization of the pixels in the captured image, the image acquisition units may be equipped with telephoto lenses. The test atlas is generated on a computer and transferred in turn to the mobile terminal for display over a local area network.
The above image acquisition unit includes, but is not limited to, a monocular camera, a binocular camera, or an industrial camera. If a monocular camera is selected, two monocular cameras are used in practical applications.
The embodiment of the present invention provides a system for mobile naked-eye three-dimensional virtual reality based on optical-path acquisition. As shown in Fig. 1, the system includes an image acquisition unit 12, a mobile terminal 14, and a controller 16. The image acquisition unit 12 acquires images and sends them to the controller. The mobile terminal 14 includes a screen covered with a microlens-array film; it sends the position information derived from acceleration and the rotation information derived from angular velocity to the controller, and displays the three-dimensional virtual-reality display image on the screen. The controller 16 is communicatively connected with the image acquisition unit and the mobile terminal respectively; it processes the images to obtain the two-viewpoint composite-image index maps, obtains two viewpoint images according to the position information derived from acceleration and the rotation information derived from angular velocity, renders the two viewpoint images into a composite image using the two-viewpoint composite-image index maps to obtain the three-dimensional virtual-reality display image, and sends the three-dimensional virtual-reality display image to the mobile terminal.
The mobile terminal includes, but is not limited to, a mobile phone or a personal digital assistant. In this preferred embodiment, mobile three-dimensional display technology is combined with virtual-reality technology, so that a movable naked-eye three-dimensional virtual-reality display can be realized on a mobile terminal.
The above image acquisition unit serves as the optical-path acquisition device and includes, but is not limited to, a monocular camera and a binocular camera. The image acquisition unit can perform an initial correction and calibration with the mobile terminal before the three-dimensional virtual-reality display is carried out.
The above controller includes, but is not limited to, a computer, a laptop, an industrial personal computer, or a server. The server may also be a server cluster.
A microlens-array film composed of thousands of convex lenses is covered on the screen of the mobile terminal. In this way, when the screen is watched from different directions, the sub-pixels at different places below each convex lens are magnified into the observer's eyes.
In this embodiment, the means by which the controller communicates with the image acquisition unit and the mobile terminal includes, but is not limited to, WiFi, Bluetooth, ZigBee, 2G, 3G, 4G, and 5G.
By adopting the above technical solution, the embodiment of the present invention can realize a movable three-dimensional virtual-reality display that is easy to reproduce, cost-controllable, and of high resolution, even when the parameters of the microlens array and the mobile-terminal LCD screen are unknown.
In an alternative embodiment, the mobile terminal may include a gyroscope and an accelerometer. The accelerometer is used to obtain the position information derived from acceleration, and the gyroscope is used to obtain the rotation information derived from angular velocity.
The working process of the system for mobile naked-eye three-dimensional virtual reality based on optical-path acquisition is described in detail below with reference to Fig. 2 and Fig. 3 by way of a preferred embodiment, which uses a mobile phone, cameras, and a control computer as the mobile terminal, the image acquisition unit, and the controller respectively. Two monocular industrial cameras are used, a microlens-array film is covered on the screen of the mobile phone, and the control computer communicates wirelessly with the mobile phone and the industrial cameras through a router.
This preferred embodiment sets the distances as follows: the binocular distance of the viewing user and the distance from the screen to the eyes while holding the mobile phone 3 are measured; the two cameras 6 are arranged horizontally with their spacing set to the binocular spacing of the user 7; and the distance between the mobile-phone screen and the cameras 6 is kept consistent with the distance at which the user 7 holds the mobile phone 3.
Optical-path acquisition is carried out in the experiment box 4, where the industrial cameras 6 are corrected and calibrated via the light path 5 from the mobile phone 3. The calibration method includes, but is not limited to, active-vision camera calibration and camera self-calibration.
The industrial cameras 6 are connected to the control computer 1 for image acquisition, and the two initial composite-image index maps are optimized so that, when shown on the mobile phone 3, the display effects of the two index maps are black-and-white complementary.
The displacement and rotation attitude of the viewer are acquired by the mobile phone 3 and sent to the control computer 1 through signal communication over the WiFi network.
The control computer 1 communicates with the mobile phone 3 and the industrial cameras 6 respectively through the router 2: it controls the mobile phone to light single pixels while controlling the cameras 6 to acquire images; it optimizes the images acquired by the two industrial cameras 6 to generate the two viewpoint images; and it receives the displacement and rotation-attitude data of the viewer sent by the mobile phone 3 and, according to these data, combines the two viewpoint images to be displayed with the initial composite-image index maps to generate the virtual three-dimensional scene image (the three-dimensional virtual-reality display image), which it feeds back to the mobile phone 3 for naked-eye three-dimensional display. The viewer can then watch the three-dimensional effect of the virtual scene without any other external auxiliary equipment.
Fig. 3 schematically illustrates the naked-eye three-dimensional virtual-reality display effect on the hand-held mobile phone at three different observation points.
This preferred embodiment constructs a system for naked-eye three-dimensional virtual reality based on optical-path acquisition that is easy to reproduce, cost-controllable, of high resolution, and movable, and it incorporates virtual-reality technology. As a scheme that combines automatic calibration based on optical-path acquisition with crosstalk-reduced mobile three-dimensional display, it can present a high-quality, fine three-dimensional display effect even when the parameters of the microlens array and the mobile-phone LCD screen are unknown, and it can be customized for individual hand-held phones. Compared with a traditional virtual-reality device, it removes the weight of the helmet and the sense of constraint caused by poor comfort, while adding a genuine three-dimensional display effect from the device itself rather than merely relying on the two eyes receiving different left and right scene pictures to achieve a stereoscopic effect.
In addition, the embodiment of the present invention provides a method for mobile naked-eye three-dimensional virtual reality based on optical-path acquisition. The method can be applied to the above system for mobile naked-eye three-dimensional virtual reality based on optical-path acquisition. As shown in Fig. 4, the method may include:
S400: acquiring images.
S410: obtaining the composite-image index maps of the two viewpoints from the acquired images.
Specifically, this step can be realized by steps S414 to S418.
S414: determining the sub-pixel brightness influence weights on the image.
For each sub-pixel value on the composite-image index maps of the two viewpoints, this step considers the influence of the surrounding sub-pixels on it, and obtains the influence weights of the different sub-pixels from the acquired images.
For example, when any sub-pixel Image_m(i, j, q) within the window Image_m(i, j) is lit, it will influence every sub-pixel of the pixel Image_c(t)(i, j) in the captured image.
Specifically, this step may further include determining the sub-pixel brightness influence weights on the image according to the following formula:
where Image_c(t)(i, j, k) denotes a sub-pixel of the acquired image at the t-th viewpoint, with t = 0, 1, i = 1~H, j = 1~W; H denotes the height of the image; W denotes the width of the image; r denotes the security-window radius; m, n denote window coordinates with the security-window center as the origin; q = 1~3; Image_m(i + m, j + n, q) denotes a sub-pixel on the displayed image; and the weight term denotes the normalized brightness influence weight of sub-pixel (i + m, j + n, q), when lit to its maximum brightness, on the neighboring sub-pixel (i, j, k).
From the sub-pixel brightness influence weights obtained in this step, a weight matrix can be indexed out for subsequent processing.
S416: determining the composite-image index maps of the two viewpoints by least squares with a bounding box, based on the sub-pixel brightness influence weights on the image.
Specifically, this step may further include determining the composite-image index maps of the two viewpoints according to the following formula, using least squares with a bounding box:
where the weight-difference term denotes the difference between the corresponding sub-pixel brightness influence weights of the two viewpoints; I denotes the composite-image index map of the two viewpoints and is confined to the bounding box B_I = {I ∈ Z^(1×H): L ≤ I ≤ U}, with L = 0 × 1_(1×H) and U = 255 × 1_(1×H), where 1_(1×H) denotes the H-dimensional all-ones vector; the image-difference term denotes the difference between the images acquired at the two viewpoints; and H denotes the height of the image.
As an example, the composite-image index map I_0 of the 0th viewpoint is determined in the following manner:
Step 1: I is confined to the bounding box B_I = {I ∈ Z^(1×H): L ≤ I ≤ U}, where L = 0 × 1_(1×H) and U = 255 × 1_(1×H), and 1_(1×H) denotes the H-dimensional all-ones vector.
Step 2: least squares with the bounding box is carried out according to the following formula:
where the image-difference term denotes the difference between the images captured at viewpoint 0 and viewpoint 1; the weight-difference term denotes the difference between the corresponding sub-pixel brightness influence weights of the two viewpoints and is a full-rank sparse matrix of size (H × W × 3, H × W × 3); H denotes the height of the image and W its width; and I denotes the composite-image index map of the viewpoint.
The composite-image index map of the 1st viewpoint can be obtained in the same way.
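The bound-constrained least-squares step can be illustrated with a toy problem. The sketch below is a simplification under stated assumptions: it solves the unconstrained problem and then projects into the [0, 255] box, whereas a true bound-constrained solver (e.g. an active-set or projected-gradient method such as `scipy.optimize.lsq_linear`) handles the bounds during the solve; the matrix `A` and vector `b` are hypothetical stand-ins for the weight-difference matrix and capture-difference vector.

```python
import numpy as np

def boxed_lstsq(A, b, lo=0.0, hi=255.0):
    # Solve min ||A @ I - b||_2, then round to integer index-map entries
    # and clip into the bounding box [lo, hi]. Clip-after-solve is only
    # an approximation of genuinely bound-constrained least squares.
    I, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.clip(np.round(I), lo, hi)

# Hypothetical 3x3 weight-difference matrix and capture-difference vector.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
b = np.array([600.0, 128.0, -50.0])
I = boxed_lstsq(A, b)
print(I)  # entries outside [0, 255] are clamped to the box bounds
```

The unconstrained solution here is [300, 128, -50]; the box projection clamps it to [255, 128, 0], mirroring how index-map entries must stay valid 8-bit values.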
In a preferred embodiment, before step S414 the method may also include:
S412: undistorting the captured image and performing region-of-interest extraction on it.
For example, considering that on the mobile-terminal screen the brightnesses of two pixels that are N pixels apart (N an integer) do not influence each other, the mobile-terminal screen can be divided into several security windows of size N × N. A test image Image_m of size N × N × 3 can therefore be transmitted to the mobile terminal, where the resolution of the test image is H × W, with H denoting its height and W its width. In the specific implementation, the image acquisition units located at the two viewpoint observation points capture the screen respectively to obtain the captured images, which are then undistorted and subjected to region-of-interest extraction, yielding an image Image_c(0) of resolution H × W and an image Image_c(1) of resolution H × W. The image acquisition unit includes, but is not limited to, a monocular camera, a binocular camera, and an industrial camera.
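Step S412 can be sketched as below. In practice one would typically use a calibrated camera model (e.g. OpenCV's `undistort` with measured distortion coefficients); the one-parameter radial model, nearest-neighbour remapping, and all values here are illustrative assumptions.

```python
import numpy as np

def undistort_radial(img, k1):
    # Nearest-neighbour inverse mapping for a one-parameter radial
    # distortion model: x_distorted = x_undistorted * (1 + k1 * r^2),
    # in coordinates normalized to [-1, 1] about the image centre.
    h, w = img.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    xn, yn = (xx - cx) / cx, (yy - cy) / cy
    r2 = xn**2 + yn**2
    xs = np.clip(xn * (1 + k1 * r2) * cx + cx, 0, w - 1).round().astype(int)
    ys = np.clip(yn * (1 + k1 * r2) * cy + cy, 0, h - 1).round().astype(int)
    return img[ys, xs]

def extract_roi(img, top, left, height, width):
    # Region-of-interest extraction is a plain crop once the screen
    # area has been located in the captured frame.
    return img[top:top + height, left:left + width]

frame = np.arange(100, dtype=float).reshape(10, 10)  # hypothetical capture
flat = undistort_radial(frame, k1=0.0)               # k1 = 0: identity
roi = extract_roi(flat, 2, 2, 4, 4)
print(roi.shape)  # (4, 4)
```

With k1 = 0 the remap is the identity, which gives a quick sanity check before plugging in measured distortion coefficients.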
S420: obtaining the two viewpoint images according to the position information derived from acceleration and the rotation information derived from angular velocity.
Specifically, this step can be accomplished by adjusting the orientation of the acquired images according to the position information derived from acceleration and the rotation information derived from angular velocity, thereby obtaining the two viewpoint images.
The position information derived from acceleration can be collected by the accelerometer sensor of the mobile terminal, and the rotation information derived from angular velocity by its gyroscope sensor. The displacement and rotation attitude of the viewer can thus be collected and used to realize the mobile naked-eye three-dimensional virtual-reality display.
In practical applications, on the platform built from the mobile phone, cameras, and computer, step S420 adjusts the pose of the virtual cameras at the computer to obtain the two viewpoint images.
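One way such a virtual-camera pose adjustment could look is sketched below. This is an assumption-heavy illustration, not the patent's method: it considers only a yaw angle integrated from the gyroscope and an accelerometer-derived head position, and every name and value (`virtual_camera_positions`, `head`, the 60 mm baseline) is hypothetical.

```python
import numpy as np

def virtual_camera_positions(center, baseline, yaw):
    # Place the two virtual cameras on either side of the head position,
    # rotated about the vertical axis by the yaw angle integrated from
    # the gyroscope; `center` is the accelerometer-derived position.
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, 0.0, s],
                  [0.0, 1.0, 0.0],
                  [-s, 0.0, c]])               # rotation about the y axis
    half = R @ np.array([baseline / 2.0, 0.0, 0.0])
    return center - half, center + half

head = np.array([0.0, 0.0, 300.0])             # hypothetical position, mm
left, right = virtual_camera_positions(head, baseline=60.0, yaw=0.0)
print(np.linalg.norm(right - left))            # the baseline is preserved
```

Rendering the scene from `left` and `right` would then yield the two viewpoint images for the current viewer attitude.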
S430: rendering the two viewpoint images into a composite image using the two-viewpoint composite-image index maps to obtain the three-dimensional virtual-reality display image.
For example, the composite image is rendered according to the following formula to obtain the three-dimensional virtual-reality display image:
Syn = I_0 × view_0 + I_1 × view_1
where Syn denotes the three-dimensional virtual-reality display image; I_0 denotes the composite-image index map of the 0th viewpoint; I_1 denotes the composite-image index map of the 1st viewpoint; view_0 denotes the 0th viewpoint image; and view_1 denotes the 1st viewpoint image.
In a preferred embodiment, step S430 can also be accomplished by synthesizing the two viewpoint images with two predetermined composite-image index maps whose display effects are black-and-white complementary, thereby obtaining the three-dimensional virtual-reality display image.
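The rendering formula Syn = I_0 × view_0 + I_1 × view_1 with black-and-white complementary index maps reduces to an element-wise blend, sketched below with tiny hypothetical single-channel arrays (real index maps would follow the microlens layout rather than a checkerboard).

```python
import numpy as np

# Two hypothetical 2x2 single-channel viewpoint images.
view0 = np.full((2, 2), 200.0)
view1 = np.full((2, 2), 100.0)

# Black-and-white complementary index maps: where one map selects a
# sub-pixel (weight 1), the other suppresses it (weight 0).
I0 = np.array([[1.0, 0.0],
               [0.0, 1.0]])
I1 = 1.0 - I0

# Syn = I0 x view0 + I1 x view1, evaluated element-wise per sub-pixel.
syn = I0 * view0 + I1 * view1
print(syn)  # a checkerboard interleaving of the two viewpoint images
```

Each display sub-pixel thus comes from exactly one viewpoint image, which is what lets the microlens array steer the two images to the two eyes.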
Although the steps are described above in the stated order, those skilled in the art will appreciate that, in order to realize the effect of this embodiment, the different steps need not be executed in such order; they may be executed simultaneously (in parallel) or in reverse order, and these simple variations all fall within the scope of protection of the present invention.
It should be noted that, for the sake of conciseness, identical parts are omitted from the descriptions of the respective embodiments of the present invention; those skilled in the art will understand that, where no conflict arises, the explanation of one embodiment can also be applied to another embodiment.
It should also be noted that the language used in this specification is selected primarily for readability and teaching purposes, and is not intended to explain or limit the protection scope of the present invention.
The above detailed description of example embodiments of the present invention is provided for the purpose of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms described; obviously, many variations and modifications will be apparent to those skilled in the art. The embodiments were selected and described in order to best illustrate the principle of the present invention and its practical application, so that others skilled in the art can appreciate the various embodiments of the present invention and the various modifications suited to the specific use contemplated. An embodiment of the present invention may omit some of the above technical features and solve only part of the technical problems existing in the prior art; moreover, the described technical features may be combined in any manner. The protection scope of the present invention is defined by the appended claims and their equivalents; others skilled in the art can make various changes, replacements and combinations to the technical solutions described in the appended claims, and the technical solutions after such changes or replacements will fall within the protection scope of the present invention.
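The "least square method with a bounding box" used throughout to obtain the composite-image index map, with the bounding box restricting index values to the range 0–255, can be sketched as projected gradient descent on a least-squares objective. The diagonal system and the data layout below are illustrative assumptions, not the patent's exact formulation:

```python
import numpy as np

def solve_index_map(dW, dView, steps=500, lr=None):
    """Minimise ||diag(dW) @ I - dView||^2 subject to the bounding box
    0 <= I <= 255, by projected gradient descent.  dW stands in for the
    difference of the brightness influence weights of the two viewpoints,
    dView for the difference of the two viewpoint images (both length-H
    vectors); the diagonal model is an illustrative assumption."""
    dW = np.asarray(dW, dtype=float)
    dView = np.asarray(dView, dtype=float)
    if lr is None:
        lr = 1.0 / (dW.max() ** 2 + 1e-12)      # safe step for a diagonal system
    I = np.full_like(dW, 128.0)                 # start mid-range
    for _ in range(steps):
        grad = dW * (dW * I - dView)            # gradient of the squared residual
        I = np.clip(I - lr * grad, 0.0, 255.0)  # project back into the box
    return I
```

Entries whose unconstrained solution would exceed 255 are clipped to the box boundary, which is exactly the role the bounding box plays for 8-bit index maps.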
Claims (11)
1. A method of mobile naked eye three-dimensional virtual reality based on optical path acquisition, characterized in that the method comprises:
acquiring an image;
obtaining composite-image index maps of two viewpoints based on the acquired image;
obtaining two viewpoint images according to position information of the acceleration of a viewer and rotation information of the angular velocity of the viewer;
based on the composite-image index maps of the two viewpoints of the image, rendering a composite image from the two viewpoint images to obtain a three-dimensional virtual reality display image;
wherein obtaining composite-image index maps of two viewpoints based on the acquired image specifically comprises:
determining, for each sub-pixel in the image, a brightness influence weight of the peripheral sub-pixels on that sub-pixel;
based on the brightness influence weights in the image, determining the composite-image index maps of the two viewpoints using a least square method with a bounding box.
2. The method of mobile naked eye three-dimensional virtual reality based on optical path acquisition according to claim 1, characterized in that determining the brightness influence weight specifically comprises:
determining the brightness influence weight in the image according to the following formula:
where Image_c(t)(i, j, k) denotes a sub-pixel on the image Image_c acquired at the t-th viewpoint, with t = 0, 1, i = 1~H, j = 1~W, and k denoting the color channel of the image Image_c, k = 1~3; H denotes the height of the image; W denotes the width of the image; r denotes the security window radius; m, n denote the window coordinates with the security window center as the origin; q denotes the color channel of the image and q = 1~3; Image_m(i + m, j + n, q) denotes a sub-pixel in the image; and the weight term denotes the normalized brightness influence weight of sub-pixel (i + m, j + n, q) on the neighboring pixel (i, j, k) when the former is illuminated to maximum brightness.
3. The method of mobile naked eye three-dimensional virtual reality based on optical path acquisition according to claim 1, characterized in that determining the composite-image index maps of the two viewpoints using a least square method with a bounding box based on the brightness influence weights in the image specifically comprises:
determining the composite-image index maps of the two viewpoints according to the following formula, using the least square method with a bounding box based on the brightness influence weights:
where the first quantity denotes the difference of the brightness influence weights corresponding to the two viewpoints; I denotes the composite-image index map of the two viewpoints, restricted to the bounding box B_I = {I ∈ Z^(1×H): L ≤ I ≤ U}, with L = 0 × I_(1×H) and U = 255 × I_(1×H), where I_(1×H) denotes the all-ones column vector of dimension H; the second quantity denotes the difference of the images acquired at the two viewpoints; H denotes the height of the image; and Z denotes the set of integers.
4. The method of mobile naked eye three-dimensional virtual reality based on optical path acquisition according to claim 1, characterized in that, before the step of determining the brightness influence weight, the method further comprises:
performing anti-distortion and region-of-interest extraction processing on the image.
5. The method of mobile naked eye three-dimensional virtual reality based on optical path acquisition according to claim 1, characterized in that obtaining two viewpoint images according to the position information of the acceleration and the rotation information of the angular velocity specifically comprises:
adjusting the orientation of the acquired image according to the position information of the acceleration and the rotation information of the angular velocity, to obtain the two viewpoint images.
6. The method of mobile naked eye three-dimensional virtual reality based on optical path acquisition according to claim 1, characterized in that, based on the composite-image index maps of the two viewpoints of the image, rendering a composite image from the two viewpoint images to obtain a three-dimensional virtual reality display image specifically comprises:
rendering the composite image according to the following formula to obtain the three-dimensional virtual reality display image:
Syn = I0 × view0 + I1 × view1;
where Syn denotes the three-dimensional virtual reality display image; I0 denotes the composite-image index map of the 0th viewpoint; I1 denotes the composite-image index map of the 1st viewpoint; view0 denotes the 0th viewpoint image; and view1 denotes the 1st viewpoint image.
7. A system of mobile naked eye three-dimensional virtual reality based on optical path acquisition, characterized in that the system comprises:
an image acquisition unit, configured to acquire an image and send the image to a controller;
a mobile terminal, comprising a screen on which a microlens array film is mounted, configured to send position information of acceleration and rotation information of angular velocity to the controller, and to display a three-dimensional virtual reality display image through the screen;
the controller, in communication connection with the image acquisition unit and the mobile terminal respectively, configured to: process the image to obtain composite-image index maps of two viewpoints; obtain two viewpoint images according to the position information of the acceleration and the rotation information of the angular velocity; based on the composite-image index maps of the two viewpoints of the image, render a composite image from the two viewpoint images to obtain the three-dimensional virtual reality display image; and send the three-dimensional virtual reality display image to the mobile terminal;
wherein the controller is further configured to perform the following operations: determining, for each sub-pixel in the image, a brightness influence weight of the peripheral sub-pixels on that sub-pixel; and, based on the brightness influence weights in the image, determining the composite-image index maps of the two viewpoints using a least square method with a bounding box.
8. The system of mobile naked eye three-dimensional virtual reality based on optical path acquisition according to claim 7, characterized in that the mobile terminal comprises:
an accelerometer, configured to obtain the position information of the acceleration of the viewer; and
a gyroscope, configured to obtain the rotation information of the angular velocity of the viewer.
9. The system of mobile naked eye three-dimensional virtual reality based on optical path acquisition according to claim 7, characterized in that the image acquisition unit is a monocular camera or a binocular camera.
10. The system of mobile naked eye three-dimensional virtual reality based on optical path acquisition according to claim 7, characterized in that the mobile terminal is a mobile phone or a personal digital assistant.
11. The system of mobile naked eye three-dimensional virtual reality based on optical path acquisition according to claim 7, characterized in that the controller is a computer, a server or an industrial personal computer.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611202874.0A CN106878698B (en) | 2016-12-23 | 2016-12-23 | The method and system of mobile naked eye three-dimensional virtual reality based on optical path acquisition |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106878698A CN106878698A (en) | 2017-06-20 |
CN106878698B true CN106878698B (en) | 2019-05-24 |
Family
ID=59164901
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611202874.0A Active CN106878698B (en) | 2016-12-23 | 2016-12-23 | The method and system of mobile naked eye three-dimensional virtual reality based on optical path acquisition |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106878698B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108391117A (en) * | 2018-02-01 | 2018-08-10 | 周金润 | A mobile phone naked-eye 3D display technology based on viewpoint positioning and single-view relief imaging |
CN108769642A (en) * | 2018-03-27 | 2018-11-06 | 深圳负空间科技有限公司 | A live-action shooting method based on cloud rendering and naked-eye VR interaction |
CN112184878B (en) * | 2020-10-15 | 2023-08-25 | 洛阳众智软件科技股份有限公司 | Method, device and equipment for automatically generating and rendering three-dimensional night scene lamplight |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103019021B (en) * | 2012-12-27 | 2016-05-11 | Tcl集团股份有限公司 | The processing method of a kind of 3D light field camera and photographic images thereof |
CN103702099B (en) * | 2013-12-17 | 2015-08-05 | 四川大学 | A kind of super large visual angle integration imaging 3D display packing based on head-tracking |
TWI572906B (en) * | 2015-02-25 | 2017-03-01 | 台達電子工業股份有限公司 | Three-dimension light field construction apparatus |
CN106154567B (en) * | 2016-07-18 | 2018-12-04 | 北京邮电大学 | A kind of imaging method and device of 3 d light fields display system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107105213B (en) | Stereoscopic display device | |
CN102098524B (en) | Tracking type stereo display device and method | |
US7091931B2 (en) | Method and system of stereoscopic image display for guiding a viewer's eye motion using a three-dimensional mouse | |
WO2019105323A1 (en) | Display module, head-mounted display device, and stereoscopic image display method and apparatus | |
CN115951504A (en) | Three-dimensional glasses-free light field display using eye positions | |
CN102209254B (en) | One-dimensional integrated imaging method and device | |
CN104714302A (en) | Head mounted display device | |
CN106878698B (en) | The method and system of mobile naked eye three-dimensional virtual reality based on optical path acquisition | |
CN102109677A (en) | Head-mounted stereoscopic visual slide viewer | |
WO2020106443A1 (en) | Apparatus systems, and methods for local dimming in brightness-controlled environments | |
CN108345108A (en) | Head-mounted display apparatus, the generation method of three-dimensional image information and device | |
CA2476612A1 (en) | Method and system for displaying stereoscopic image | |
CN206133120U (en) | Display panel and display device | |
CN1319310A (en) | Device for producing three-dimensional image | |
Chappuis et al. | Subjective evaluation of an active crosstalk reduction system for mobile autostereoscopic displays | |
JP2013080224A (en) | Barrier panel, and 3d image display device and method | |
CN209991990U (en) | Map interaction system | |
Buchroithner et al. | Stereoscopic 3-D solutions for online maps and virtual globes | |
CN116583879A (en) | Vehicle terrain capture system and display of 3D digital images and 3D sequences | |
CN113031299A (en) | Desktop true three-dimensional display method | |
Jacobs et al. | 3-D movies using microprocessor-controlled optoelectronic spectacles | |
CN103442162A (en) | Portable type 3D shooting device and 3D shooting method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||