CN108632599B - Display control system and display control method of VR image - Google Patents
- Publication number
- CN108632599B (application CN201810287177.2A)
- Authority
- CN
- China
- Prior art keywords
- image
- movable lens
- distance
- image distance
- lens group
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Abstract
The invention provides a display control system and a display control method for VR images. The system comprises a display screen, a VR main lens, an eyeball tracking device, a movable lens group, and a calculation and control device and a driving device connected with the movable lens group. The eyeball tracking device identifies the gaze point region of the human eyes; the calculation and control device calculates, from the data of the gaze point region, the position data of the movable lens group at which the imaging image distance equals the image distance of the gaze point region; and the driving device drives the movable lens group to adjust the image distance of the VR image to match the image distance of the gaze point region. The system and the method first acquire the gaze point region of the human eyes and then use the movable lens group to eliminate the difference between the imaging image distance and the image distance of the gaze point region, thereby overcoming the vergence-accommodation conflict caused by a fixed VR imaging position.
Description
Technical Field
The invention relates to the technical field of display control, in particular to a display control system and a display control method for VR images.
Background
In existing VR display technology, 2D images of the same object taken from different angles are displayed in front of the observer's left and right eyes, and binocular parallax creates a 3D visual impression. Because the image distance of the 2D images seen by the left and right eyes is fixed, the focal accommodation of the eyes does not match the depth perceived from the parallax, producing a vergence-accommodation conflict; viewing for a long time can cause eye fatigue and even dizziness.
Specifically, as shown in fig. 1a and fig. 1b, fig. 1a is a schematic diagram of viewing a real 3D object and fig. 1b is a schematic diagram of stereoscopic vision under existing VR technology. In the figures, 101 and 102 denote the left and right eyes, 103 is a real 3D object, 104 is an existing VR device, 105 is the VR imaging position, and 106 is the position of the perceived 3D image; Lb and La denote the vergence distance and the focus distance, respectively. As shown in fig. 1a, when the human eye observes the real world, the vergence distance Lb and the focus distance La are equal and there is no vergence-accommodation conflict. Under existing VR stereoscopic display technology, however, Lb and La differ considerably, so the vergence-accommodation conflict is pronounced and degrades the viewing experience.
In the prior art there are mainly two different VR display technologies. The light path of the first is shown in fig. 2: the picture generated by a display screen 210 passes through a VR main lens 220 to form a VR image 230 viewed by the observer's left and right eyes. Since the focal length of the VR main lens 220 and its distance from the display screen 210 are both fixed, the image distance of the VR image 230 is fixed. The drawback of this technology is that the fixed VR imaging position cannot be kept consistent with the viewing distance of the 3D parallax image, so the vergence-accommodation conflict remains. The second is a light field display technology based on a microlens array, whose light path is shown in fig. 3: display screen pixels 310 located near the focal plane of a microlens array 320 form light field vectors 330 through the microlens array, which are converged and imaged by the human eye 340; by selecting different pixels the angle of the light field vectors, and hence the image distance, can be adjusted to avoid the vergence-accommodation conflict. As can be seen from fig. 3, each pixel forms one light field vector, while several light field vectors together form one image point. The drawback of the microlens-array light field display technology is that several pixels are needed to display a single image point: providing angular resolution for the light field vectors reduces the spatial resolution of the displayed image, so angular resolution and spatial resolution are in conflict.
Therefore, the prior art is subject to further improvement.
Disclosure of Invention
In view of the above disadvantages of the prior art, the present invention provides a display control system and a display control method for VR images, which overcome the fixed imaging position and the resulting vergence-accommodation conflict of prior VR displays.
A first embodiment of the present invention is a display control system for a VR image, comprising a display screen and a VR main lens, and further comprising:
an eyeball tracking device for tracking the eyes of the viewer and identifying the fixation point area of the eyes;
a movable lens group arranged between the VR main lens and the display screen for adjusting the VR imaging position;
and a computing and control device and a driving device which are connected with the movable lens group;
the eyeball tracking device is connected with the calculation and control device and is used for transmitting the data information of the identified human eye fixation point area to the calculation and control device;
the calculation and control device is used for calculating position data required to be adjusted by the movable lens group when the imaging distance is the same as the image distance of the gazing point region according to the data information of the gazing point region, and generating a driving control instruction according to the position data;
and the driving device receives the driving control instruction, drives the movable lens group to move to the optical axis coordinate corresponding to the position data, and adjusts the image distance of the VR image to be consistent with the image distance of the gaze point area image.
Optionally, the eyeball tracking device acquires a human eye fixation point area based on an eyeball tracking technology.
Optionally, the calculating and controlling device is further configured to obtain image distance information of a corresponding sub-image in the image to be displayed according to the gaze point region of human eyes, and calculate imaging image distance information according to the image distance information of the sub-image.
Optionally, the display control system further includes: a graphics processor connected to the display screen;
the calculation and control device is also used for calculating an image distance difference value between the human eye fixation point area and the non-fixation point area, and calculating the blurring parameters of the sub-images corresponding to the non-fixation point areas according to the image distance difference value;
and the graphic processor is used for blurring the sub-images corresponding to the non-fixation point areas according to the calculated blurring parameters.
Optionally, the movable lens group comprises at least one lens, which is parallel to and coaxial with the VR main lens.
A second embodiment provided by the present invention is a display control method of the display control system, including the steps of:
tracking the eyes of a viewer and identifying a fixation point area of the eyes;
calculating position data required to be adjusted by the movable lens group when the imaging distance is the same as the image distance of the gaze point region according to the identified data information of the gaze point region of the human eye, and generating a driving control instruction according to the position data;
and receiving the driving control instruction, driving the movable lens group to move to the optical axis coordinate corresponding to the position data, and adjusting the image distance of the VR image to be consistent with the image distance of the image of the fixation point area.
Optionally, the step of calculating, according to the identified data information of the gaze point region of the human eye, position data that the movable lens group needs to be adjusted when the image distance is the same as the image distance of the gaze point region further includes:
and acquiring image distance information of corresponding sub-images in the image to be displayed according to the fixation point area of human eyes, and calculating imaging image distance information according to the image distance information of the sub-images.
Optionally, the step of calculating, according to the identified gaze point region of the human eye, position data that the movable lens group needs to adjust when the imaging distance is the same as the image distance of the gaze point region includes:
keeping the position between the display screen and the VR main lens unchanged, and deducing the position data of the movable lens group required to be adjusted according to a Gaussian imaging formula.
Optionally, the display control method further includes:
calculating an image distance difference value between a fixation point area and a non-fixation point area of the human eye, and calculating a blurring parameter of a sub-image corresponding to each non-fixation point area according to the image distance difference value;
and performing blurring processing on the sub-images corresponding to the non-fixation point areas according to the calculated blurring parameters.
Optionally, the movable lens group comprises at least one lens, and the lens is parallel to the main lens and coaxial with the main lens.
The beneficial effects of the invention are as follows. The invention provides a display control system and a display control method for VR images. The system comprises a display screen, a VR main lens, an eyeball tracking device, a movable lens group, and a calculation and control device and a driving device connected with the movable lens group. The eyeball tracking device identifies the gaze point region of the human eyes and transmits it to the calculation and control device; the calculation and control device calculates, from the gaze point region, the position data to which the movable lens group needs to be adjusted so that the imaging image distance equals the image distance of the gaze point region; and the driving device drives the movable lens group to the optical-axis coordinate corresponding to the position data, adjusting the image distance of the VR image to match the image distance of the gaze point region. By first acquiring the gaze point region of the human eyes and then using the movable lens group to eliminate the difference between the imaging image distance and the image distance of the gaze point region, the system and method overcome the vergence-accommodation conflict caused by a fixed VR imaging position and relieve the resulting discomfort when a user watches VR video.
Drawings
FIG. 1a is a schematic diagram of a prior art viewing of real 3D objects;
FIG. 1b is a schematic diagram of stereo vision of VR technology in the prior art;
FIG. 2 is an optical diagram of a prior art VR display technique;
FIG. 3 is a prior art optical path diagram of a VR light field display technique based on a microlens array;
FIG. 4 is a schematic structural diagram of a VR image display control system according to the present invention;
FIG. 5 is a schematic diagram of the structure of an image to be displayed and a sub-image in an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of sub-images corresponding to the eye gaze point area in an embodiment of the present invention;
FIG. 7 is an optical diagram of a VR image display control system provided by the present invention;
FIG. 8 is a flow chart of method steps for a method of controlling the display of a VR image provided in accordance with the present invention;
FIG. 9 is a flow chart of steps implemented by a specific application of the method of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit it.
In existing VR display technology, 2D images with parallax are displayed in front of the observer's left and right eyes to create a 3D visual impression. Because the image distance of the 2D images seen by the two eyes is fixed, a vergence-accommodation conflict arises, and the observer experiences eye fatigue and even dizziness after prolonged viewing; this is a problem that 3D display needs to solve. The present invention provides a display control system and a display control method for VR images that address the vergence-accommodation conflict of existing VR display technology.
A first embodiment of the present invention provides a display control system for a VR image, as shown in fig. 4, comprising a display screen 440 and a VR main lens 460, and further comprising:
an eyeball tracking device 420 for tracking the eyes of the viewer and identifying the gaze point area of the eyes;
a movable lens group (comprising a concave lens 451 and a convex lens 452) arranged between the VR main lens 460 and the display screen 440 for adjusting the VR imaging position; and a calculation and control device 400 and a driving device 470 connected with the movable lens group;
the eyeball tracking device 420 is connected with the calculation and control device 400 and is used for transmitting the data information of the identified human eye fixation point area to the calculation and control device 400;
the calculation and control device 400 is configured to calculate, according to the data information of the gazing point region, position data that the movable lens group needs to adjust when an imaging distance is the same as an image distance of the gazing point region, and generate a driving control instruction according to the position data;
and the driving device 470 receives the driving control command, drives the movable lens group to move to the optical axis coordinate corresponding to the position data, and adjusts the image distance of the VR image to be consistent with the image distance of the gaze point area image.
In the present invention, the gaze point region is the region gazed at by the eyes of the viewer. In a specific implementation, the eyeball tracking device preferably acquires the gaze point region of the eyes based on an eye tracking technique; for example, an infrared light source and an infrared image recognition module may be used to recognize the pupil and the Purkinje spot, as long as the information of the gaze point region can be accurately acquired.
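As an illustration of the pupil-Purkinje (corneal-reflection) approach just mentioned, the sketch below maps the vector between the pupil centre and the Purkinje spot to display coordinates through a polynomial fitted during a calibration phase; the quadratic model, the function names and the calibration procedure are assumptions for illustration and are not taken from the patent.

```python
# Illustrative pupil/corneal-reflection gaze mapping (assumed model, not the patent's algorithm).
import numpy as np

def fit_gaze_mapping(pc_vectors, screen_points):
    """Least-squares fit of a quadratic map (dx, dy) -> (sx, sy) from calibration samples,
    where (dx, dy) is the pupil-centre minus Purkinje-spot vector in camera pixels."""
    dx, dy = np.asarray(pc_vectors, dtype=float).T
    A = np.column_stack([np.ones_like(dx), dx, dy, dx * dy, dx**2, dy**2])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points, dtype=float), rcond=None)
    return coeffs                                  # shape (6, 2)

def gaze_point(pc_vector, coeffs):
    """Map one pupil/Purkinje vector to a point (sx, sy) on the display."""
    dx, dy = pc_vector
    return np.array([1.0, dx, dy, dx * dy, dx**2, dy**2]) @ coeffs
```

The resulting screen point would then be binned into the sub-image grid to obtain the gaze point region used in the rest of the system.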
Specifically, the calculating and controlling device 400 is further configured to obtain image distance information of a corresponding sub-image in the image to be displayed according to the gaze point region of human eyes, and calculate imaging image distance information according to the image distance information of the sub-image.
As shown in fig. 5, the calculation and control device obtains the image data of the image 51 to be displayed from the GPU, and obtains the image distance information of the sub-image 52 corresponding to the gazing point region in the image 51 to be displayed according to the image data.
And calculating the optical axis coordinate position where the movable lens group is to be located when the image distance of VR imaging is adjusted to be the same as the image distance of the gazing point area.
Preferably, in order to better adjust the image distance of the VR image, the movable lens group includes at least one lens. For more convenient adjustment a convex lens and a concave lens may be used; either a single lens or a combination of one convex lens and one concave lens can be chosen for image distance adjustment, and every lens is parallel to and coaxial with the VR main lens.
Referring to fig. 6 and fig. 7, and taking as an example a movable lens group that includes a convex lens and a concave lens, the step of calculating the optical-axis coordinate position where the movable lens group should be located according to the data of the gaze point region is described in more detail. Specifically, the step includes the following.
First, the calculation and control device acquires the sub-image image distance data contained in the image to be displayed. Preferably, the image distance data of the sub-images is stored as a two-dimensional array in which X[i][j] is the image distance value and i, j are the coordinates of the sub-image within the image.
Next, the eyeball tracking device calculates the coordinates of the gaze point the observer is watching based on an eye tracking technique. As shown in fig. 6, the tracked gaze point coordinates are (i, j), i.e., the middle sub-image containing the hot-air balloon. The calculation and control device reads the sub-image image distance data corresponding to these coordinates and determines the position at which the image should be displayed, i.e., the image distance: since the coordinates (i, j) correspond to the image distance X[i][j], the image should be formed at the image distance X[i][j].
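A minimal sketch of this lookup, assuming the two-dimensional array X[i][j] described above; the array contents and the 3x3 tiling are illustrative placeholders, not values from the patent.

```python
# Sub-image image-distance lookup (illustrative 3x3 tiling and placeholder distances in mm).
image_distances_mm = [
    [4000, 2000, 4000],   # X[i][j]: image distance of sub-image (i, j)
    [1000,  330, 1000],   # the centre tile (e.g. the hot-air balloon) is nearest
    [2000,  500, 2000],
]

def target_image_distance(gaze_i, gaze_j, distances=image_distances_mm):
    """Return the image distance at which the VR image should be formed,
    i.e. the image distance of the sub-image the viewer is gazing at."""
    return distances[gaze_i][gaze_j]

print(target_image_distance(1, 1))   # eye tracker reports the centre sub-image -> 330
```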
The calculation and control device then calculates the movable lens positions from the image distance data of the sub-image. In the optical path of this embodiment, shown in fig. 7, the movable lens group includes two movable lenses: a first movable lens 720 and a second movable lens 730 arranged in sequence along the direction of the light emitted from the display screen. The first movable lens is a concave lens, the second movable lens is a convex lens, and both are parallel to the VR main lens and share its optical axis.
As shown in fig. 7, the image displayed on the display screen 710 passes in turn through the first movable lens 720 and the second movable lens 730 to form intermediate image 1 (760) and intermediate image 2 (770); both intermediate images are virtual images, and intermediate image 2 (770) then passes through the VR main lens 740 to form the VR image. In fig. 7: Hp is the image height on the display screen; Le is the distance from the human eye to the VR main lens; f is the focal length of the VR main lens; S is the distance from the VR image to the VR main lens, i.e., the image distance; Ss is the distance from the VR image to the human eye, i.e., the apparent distance, Ss = S + Le; Hx is the VR image height; L1 is the distance from the display screen to the first movable lens (a quantity to be calculated); S1 is the distance from intermediate image 1 to the first movable lens; L2 is the distance from intermediate image 1 to the second movable lens (a quantity to be calculated); L02 is the distance from the second movable lens to the display screen; S2 is the distance from intermediate image 2 to the second movable lens; and Lz is the distance from intermediate image 2 to the VR main lens.
The positions of the first movable lens and the second movable lens must satisfy the following requirements:
1) the viewing angle does not change while the image distance is adjusted, i.e., the ratio v between the VR image height Hx and the apparent distance Ss is kept unchanged;
2) the distance between the display screen and the VR main lens is kept unchanged, i.e., Ls = L1 - S1 + L2 + Lz - S2 remains constant.
According to the above requirements and the optical path diagram of fig. 7, the positions of the first movable lens and the second movable lens can be derived from the Gaussian imaging formula as follows:
the VR main lens has a focal length f (positive sign), an object distance Lz, and an image distance (-S) (virtual image);
the first movable lens has a focal length -f1 (concave lens), an object distance L1, and an image distance (-S1) (virtual image);
the second movable lens has a focal length f2 (positive sign), an object distance L2, and an image distance (-S2) (virtual image);
wherein f1 and f2 are fixed values and L1 and L2 are the quantities to be determined; the following requirements must be satisfied:
1) the VR imaging distance is S;
2) the height of the virtual image remains unchanged while the positions of the first and second movable lenses are adjusted, i.e., the view-angle-dependent scale factor v is kept constant;
3) during the position adjustment of the movable lens group, the distance Ls between the display screen 710 and the VR main lens 740 is kept constant.
Using the gaussian imaging formula for the VR main lens we obtain:
thus:
The total magnification k of the first movable lens and the second movable lens is equal to the total magnification of the optical path divided by the VR main lens magnification, and then according to the above formula (03) and formula (04), it can be obtained:
using the gaussian imaging formula for the first movable lens is:
substituting (10) into (09) (08) yields:
by sequentially substituting the formulas (06), (07), (011) and (012) into (01), the following can be obtained:
can be obtained by simplifying the above formula
For a given image distance S, Lm, k, f1, and f2 in the above equation are all fixed, so the above equation is a quadratic equation with respect to r, and solving the equation can yield:
wherein:
finally, the following formula is obtained:
wherein:
the calculation and control means calculates the lens position according to the above equations (1) to (7) to obtain the current distance between the first movable lens and the display screen and the distance between the second movable lens and the display screen.
The lens positions calculated according to the above formulas are given below.
Given data:
main lens focal length f: 50 mm; display screen height Hp: 30 mm; view-angle-dependent scale factor v = 0.8;
distance Le from the eye to the VR main lens: 10 mm;
focal length f1 of the first movable lens: 90 mm;
focal length f2 of the second movable lens: 80 mm;
distance Ls between the main lens and the display screen: 38 mm.
For VR imaging distance S = 330 mm: L1 = 10.09 mm, L02 = 20.72 mm;
for S = 500 mm: L1 = 10.99 mm, L02 = 23.53 mm;
for S = 1000 mm: L1 = 11.23 mm, L02 = 25.79 mm;
for S = 2000 mm: L1 = 11.21 mm, L02 = 26.82 mm;
for S = 4000 mm: L1 = 11.13 mm, L02 = 27.30 mm.
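Although the closed-form expressions (1)-(7) are not shown above, the listed values can be reproduced numerically from the reconstructed thin-lens relations and the two constraints. The following sketch does this with SciPy's fsolve; the helper names, the initial guess and the use of a generic root finder are illustrative choices, not the patent's method, but the computed L1 and L02 agree with the values above to within rounding.

```python
# Numerical solution of the movable-lens positions from the reconstructed thin-lens
# relations (illustrative sketch; not the patent's closed-form equations).
from scipy.optimize import fsolve

f, f1, f2 = 50.0, 90.0, 80.0            # focal lengths: main lens, concave lens |f1|, convex lens [mm]
Hp, Le, Ls, v = 30.0, 10.0, 38.0, 0.8   # screen height, eye-to-main-lens distance, screen-to-main-lens distance, scale factor

def virtual_image(do, focal):
    """Distance (on the object side) and magnification of the virtual image of an object at do."""
    di = 1.0 / (1.0 / focal - 1.0 / do)  # negative for a virtual image
    return -di, -di / do

def solve_positions(S):
    """Return (L1, L02) that place the VR image at distance S from the main lens."""
    Lz = 1.0 / (1.0 / f + 1.0 / S)       # required object distance at the main lens
    m0 = S / Lz                          # main-lens magnification

    def residuals(x):
        L1, d12 = x                      # display->first lens, first-lens->second-lens spacing
        S1, m1 = virtual_image(L1, -f1)  # concave first lens
        L2 = d12 + S1                    # object distance at the second lens
        S2, m2 = virtual_image(L2, f2)   # convex second lens
        d2m = Ls - L1 - d12              # remaining spacing up to the main lens
        return [S2 + d2m - Lz,                        # intermediate image 2 must sit at Lz
                Hp * m0 * m1 * m2 - v * (S + Le)]     # keep Hx = v * Ss

    L1, d12 = fsolve(residuals, x0=[11.0, 12.0])
    return L1, L1 + d12                  # L02 = distance of the second lens from the screen

for S in (330, 500, 1000, 2000, 4000):
    L1, L02 = solve_positions(S)
    print(f"S = {S:5.0f} mm  ->  L1 = {L1:.2f} mm, L02 = {L02:.2f} mm")
```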
After the movable lens positions have been calculated, a driving control instruction for the driving device is generated from the distance data.
The driving device receives the driving control instruction and moves the first movable lens and the second movable lens to the calculated positions L1 and L02, thereby controlling the image distance of the VR image to be consistent with the image distance of the gaze point region image.
Preferably, in order to better display the VR imaging corresponding to the user gaze point region, the display control system disclosed in the present invention further includes: a graphics processor (i.e., the GPU shown in FIG. 4) coupled to a display screen 440;
the calculation and control device is also used for calculating an image distance difference value between the human eye fixation point area and the non-fixation point area, and calculating the blurring parameters of the sub-images corresponding to the non-fixation point areas according to the image distance difference value;
and the graphic processor is used for blurring the sub-images corresponding to the non-fixation point areas according to the calculated blurring parameters.
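As an illustration of how the blurring parameters could drive the GPU-side processing, the sketch below maps the image-distance difference of each non-gaze sub-image to a Gaussian blur radius; the linear distance-to-radius mapping, the tiling of the frame into equally sized sub-images, and all names are assumptions for illustration, not the patent's algorithm.

```python
# Depth-difference-driven blurring of non-gaze sub-images (illustrative sketch).
import numpy as np
from scipy.ndimage import gaussian_filter

def blur_non_gaze_regions(frame, distance_map, gaze_ij, tile, k=0.002, max_sigma=6.0):
    """frame: HxWx3 image; distance_map[i][j]: image distance of sub-image (i, j) in mm;
    gaze_ij: gaze-point sub-image coordinates; tile: (tile_h, tile_w) in pixels."""
    out = frame.astype(np.float32).copy()
    gaze_distance = distance_map[gaze_ij[0]][gaze_ij[1]]
    th, tw = tile
    for i in range(len(distance_map)):
        for j in range(len(distance_map[0])):
            if (i, j) == tuple(gaze_ij):
                continue                                          # gaze region stays sharp
            sigma = min(k * abs(distance_map[i][j] - gaze_distance), max_sigma)
            region = out[i * th:(i + 1) * th, j * tw:(j + 1) * tw]
            out[i * th:(i + 1) * th, j * tw:(j + 1) * tw] = gaussian_filter(region, sigma=(sigma, sigma, 0))
    return out.astype(frame.dtype)
```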
The display control system for VR images determines the image region being observed, i.e., the current gaze point region of the observer, based on an eye tracking technique; determines from the gaze point region the image distance at which the left-eye and right-eye images should be formed; adjusts the positions of the lenses in the movable lens group so that the left-eye and right-eye VR images fall at that image distance; and then calculates blurring parameters for the non-gaze regions from the image distance difference between the gaze point region and the non-gaze regions and blurs the non-gaze regions accordingly. This removes the mismatch between the VR image distance and the image distance required by the human eye, prevents other image content from interfering with the display of the gaze point region, and provides the user with a more comfortable VR viewing environment.
The advantage of the display control system is that the position of the VR image on the optical axis, i.e., the image distance, is adjusted in real time according to the observer's gaze point, and the images in the non-gaze regions can be blurred according to their image distance difference from the gaze point region, thereby alleviating the dizziness and visual fatigue caused by the vergence-accommodation conflict when viewing 3D images.
A second embodiment of the present invention is a display control method based on the above display control system, as shown in fig. 8, comprising the following steps:
step S810, tracking eyes of a viewer, and identifying a fixation point area of the eyes; the function of which is as described above for the eye tracking device of the system.
Step S820, calculating position data which needs to be adjusted by the movable lens group when the imaging distance is the same as the image distance of the gaze point region according to the data information of the identified gaze point region of the human eye, and generating a driving control command according to the position data; the function of which is described in the calculation and control device of the above-mentioned system.
Step S830, receiving the driving control command, driving the movable lens group to move to the optical axis coordinate corresponding to the position data, and adjusting the image distance of the VR image to be consistent with the image distance of the image in the gazing point region, where the function of the driving device is as described in the above system.
Preferably, the step of calculating the position data required to be adjusted by the movable lens group when the image distance is the same as the image distance of the gaze point region according to the identified gaze point region of the human eye further comprises:
and acquiring image distance information of corresponding sub-images in the image to be displayed according to the fixation point area of human eyes, and calculating imaging image distance information according to the image distance information of the sub-images.
Preferably, the step of calculating the position data of the movable lens group to be adjusted when the image distance is the same as the image distance of the gaze point region according to the identified data information of the gaze point region of the human eye comprises:
the position between the display screen and the VR main lens is kept unchanged, and the position data of the movable lens group can be deduced according to a Gaussian imaging formula.
Preferably, the movable lens group comprises a convex lens and a concave lens, and the convex lens and the concave lens are parallel to the main lens and share an optical axis.
In order to achieve better display of the VR image, the display control method further includes:
calculating an image distance difference value between a fixation point area and a non-fixation point area of the human eye, and calculating a blurring parameter of a sub-image corresponding to each non-fixation point area according to the image distance difference value;
and performing blurring processing on the sub-images corresponding to the non-fixation point areas according to the calculated blurring parameters.
In order to explain the display control method provided by the present invention in more detail, the following describes the steps of the present invention in more detail when the embodiment is specifically applied.
In specific implementation, as shown in fig. 9, the method provided by the present invention has the following steps:
step S910, the calculating and controlling device obtains sub-image distance data included in the image to be displayed.
In step S920, the eyeball tracking device calculates the coordinates of the gaze point the observer is watching based on an eye tracking technique; for example, the gaze point coordinates may lie in the middle region of the image to be displayed.
In step S930, the calculation and control device reads the sub-image image distance data corresponding to the gaze point coordinates and determines the position at which the image is to be displayed, i.e., the image distance.
In step S940, the calculation and control device calculates the position coordinates at which each lens of the movable lens group should be located and generates a driving control instruction. In this embodiment, the movable lens group includes two movable lenses: a first movable lens 720 and a second movable lens 730 arranged in sequence along the direction of the light emitted from the display screen; the first movable lens is a concave lens, the second movable lens is a convex lens, and both lenses are parallel to the main lens and share its optical axis.
In step S950, the driving device adjusts the position of the movable lens group according to the driving control instruction, thereby adjusting the image distance of the VR image to be consistent with the image distance of the gaze point region image.
Step S960, calculating the blurring factor of the sub-image of each non-gazing point region according to the image distance difference between the gazing point region and the non-gazing point region.
In step S970, the GPU performs blurring processing on each non-gaze point region according to the blurring factor.
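A per-frame sketch tying steps S910-S970 together, reusing the solve_positions() and blur_non_gaze_regions() sketches given earlier; the device objects and their methods (eye_tracker, lens_driver, display) are hypothetical placeholders, not interfaces defined by the patent.

```python
# Per-frame control loop (illustrative; device objects are hypothetical placeholders).
def render_frame(frame, distance_map, tile, eye_tracker, lens_driver, display):
    i, j = eye_tracker.gaze_subimage()                     # S920: gaze-point sub-image coordinates
    S = distance_map[i][j]                                 # S930: target image distance
    L1, L02 = solve_positions(S)                           # S940: movable-lens positions
    lens_driver.move_to(L1, L02)                           # S950: adjust the VR image distance
    blurred = blur_non_gaze_regions(frame, distance_map, (i, j), tile)  # S960: blur non-gaze regions
    display.show(blurred)                                  # S970: present the processed frame
```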
In summary, the invention provides a display control system and a display control method for VR images. The system comprises a display screen, a VR main lens, an eyeball tracking device, a movable lens group, and a calculation and control device and a driving device connected with the movable lens group. The eyeball tracking device is connected with the calculation and control device and transmits the acquired gaze point region data to it; the calculation and control device calculates, from the gaze point region data, the position data of the movable lens group at which the imaging image distance equals the image distance of the gaze point region; and the driving device drives the movable lens group to the optical-axis coordinate corresponding to the position data, adjusting the image distance of the VR image to match the image distance of the gaze point region. By first acquiring the gaze point region of the human eyes and then using the movable lens group to eliminate the difference between the imaging image distance and the image distance of the gaze point region, the system and method overcome the vergence-accommodation conflict caused by a fixed VR imaging position and relieve the eye fatigue it causes when users watch VR videos or images.
It should be understood that equivalents and modifications of the technical solution and inventive concept thereof may occur to those skilled in the art, and all such modifications and alterations should fall within the scope of the appended claims.
Claims (2)
1. A display control system of a VR image, comprising a display screen and a VR main lens, characterized in that it further comprises:
an eyeball tracking device for tracking the eyes of the viewer and identifying the fixation point area of the eyes;
a movable lens group arranged between the VR main lens and the display screen for adjusting the VR imaging position;
and a computing and control device and a driving device which are connected with the movable lens group;
the eyeball tracking device is connected with the calculation and control device and is used for transmitting the data information of the identified human eye fixation point area to the calculation and control device;
the calculation and control device is used for calculating position data required to be adjusted by the movable lens group when the imaging distance is the same as the image distance of the gazing point region according to the data information of the gazing point region, and generating a driving control instruction according to the position data;
the driving device receives the driving control instruction, drives the movable lens group to move to the optical axis coordinate corresponding to the position data, and adjusts the image distance of the VR image to be consistent with the image distance of the gaze point area image;
the calculation and control device is also used for acquiring image distance information of corresponding sub-images in the image to be displayed according to the fixation point area of human eyes and calculating imaging image distance information according to the image distance information of the sub-images; the computing and control device acquires image data of an image to be displayed from the graphic processor, and acquires image distance information of a sub-image in the image to be displayed, which corresponds to the fixation point area, according to the image data; calculating the optical axis coordinate position where the movable lens group is positioned when the image distance of VR imaging is adjusted to be the same as the image distance of the fixation point area;
the movable lens group comprises a first movable lens and a second movable lens, and the first movable lens and the second movable lens are parallel to the VR main lens and share an optical axis;
the calculating and controlling device is used for calculating the positions of the first movable lens and the second movable lens and is required to meet the following requirements: the visual angle is unchanged in the image distance adjusting process; the distance between the display screen and the VR main lens is kept unchanged;
the display control system further includes: a graphics processor connected to the display screen;
the calculation and control device is also used for calculating an image distance difference value between the human eye fixation point area and the non-fixation point area, and calculating the blurring parameters of the sub-images corresponding to the non-fixation point areas according to the image distance difference value;
the image processor is used for blurring the sub-images corresponding to the non-fixation point areas according to the calculated blurring parameters;
the eyeball tracking device acquires a human eye fixation point area based on an eyeball tracking technology;
the calculating steps of the positions of the first and second movable lenses are as follows:
the VR main lens has a focal length f (positive sign), an object distance Lz and an image distance (-S);
the first movable lens has a focal length -f1 (the first movable lens is a concave lens), an object distance L1 and an image distance (-S1);
the second movable lens has a focal length f2 (positive sign), an object distance L2 and an image distance (-S2);
wherein f1 and f2 are fixed values and L1 and L2 are the quantities to be determined; the following requirements must be satisfied:
1) the VR imaging distance is S;
2) the height of the virtual image remains unchanged while the positions of the first and second movable lenses are adjusted, i.e., the view-angle-dependent scale factor v is kept constant;
3) during the position adjustment of the movable lens group, the distance Ls between the display screen and the VR main lens is kept unchanged.
Using the gaussian imaging formula for the VR main lens we obtain:
thus:
The total magnification k of the first movable lens and the second movable lens is equal to the total magnification of the optical path divided by the VR main lens magnification, and then according to the above formula (03) and formula (04), it can be obtained:
using the gaussian imaging formula for the first movable lens is:
substituting (10) into (09) (08) yields:
by sequentially substituting the formulas (06), (07), (011) and (012) into (01), the following can be obtained:
can be obtained by simplifying the above formula
For a given image distance S, Lm, k, f1, and f2 in the above equation are all fixed, so the above equation is a quadratic equation with respect to r, and solving the equation can yield:
wherein:
finally, the following formula is obtained:
wherein:
the calculation and control means calculates the lens position according to the above equations (1) to (7) to obtain the current distance between the first movable lens and the display screen and the distance between the second movable lens and the display screen.
2. A display control method of a display control system according to claim 1, characterized by comprising the steps of:
identifying a human eye fixation point area based on an eyeball tracking technology;
calculating position data required to be adjusted by the movable lens group when the imaging distance is the same as the image distance of the gaze point region according to the identified data information of the gaze point region of the human eye, and generating a driving control instruction according to the position data;
receiving the driving control instruction, driving the movable lens group to move to the optical axis coordinate corresponding to the position data, and adjusting the image distance of the VR image to be consistent with the image distance of the image of the fixation point area;
calculating an image distance difference value between a fixation point area and a non-fixation point area of the human eye, and calculating a blurring parameter of a sub-image corresponding to each non-fixation point area according to the image distance difference value;
blurring the sub-images corresponding to the non-fixation point areas according to the calculated blurring parameters;
the step of calculating the position data that the movable lens group needs to adjust when the imaging distance is the same as the image distance of the gaze point region according to the identified data information of the gaze point region of the human eye further comprises:
acquiring image distance information of corresponding sub-images in an image to be displayed according to a fixation point area of human eyes, and calculating imaging image distance information according to the image distance information of the sub-images;
the step of calculating the position data required to be adjusted by the movable lens group when the imaging distance is the same as the image distance of the gaze point region according to the identified data information of the gaze point region of the human eye comprises the following steps:
keeping the position between the display screen and the VR main lens unchanged, and deducing position data required to be adjusted by the movable lens group according to a Gaussian imaging formula;
the movable lens group comprises at least one lens which is parallel to the main lens and is coaxial with the main lens.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810287177.2A CN108632599B (en) | 2018-03-30 | 2018-03-30 | Display control system and display control method of VR image |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810287177.2A CN108632599B (en) | 2018-03-30 | 2018-03-30 | Display control system and display control method of VR image |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108632599A CN108632599A (en) | 2018-10-09 |
CN108632599B (en) | 2020-10-09 |
Family
ID=63696579
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810287177.2A Active CN108632599B (en) | 2018-03-30 | 2018-03-30 | Display control system and display control method of VR image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108632599B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108663799B (en) * | 2018-03-30 | 2020-10-09 | 蒋昊涵 | Display control system and display control method of VR image |
CN109298793B (en) * | 2018-11-22 | 2022-05-20 | 京东方科技集团股份有限公司 | Screen position adjusting method and device |
CN109491091B (en) * | 2019-01-10 | 2024-04-16 | 京东方科技集团股份有限公司 | Optical system applied to VRAR system and focusing method thereof |
CN112213859B (en) * | 2020-10-12 | 2022-09-23 | 歌尔科技有限公司 | Head-mounted display device and imaging method thereof |
CN114047817B (en) * | 2021-10-15 | 2023-04-07 | 中邮通建设咨询有限公司 | Virtual reality VR interactive system of meta universe |
CN114415368B (en) * | 2021-12-15 | 2023-05-12 | 青岛歌尔声学科技有限公司 | Regulation and control method and device of VR equipment, system and storage medium |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006285482A (en) * | 2005-03-31 | 2006-10-19 | Toppan Printing Co Ltd | Device for correcting image geometry |
JP5434848B2 (en) * | 2010-08-18 | 2014-03-05 | ソニー株式会社 | Display device |
CN102944935B (en) * | 2012-11-13 | 2014-12-24 | 京东方科技集团股份有限公司 | Binocular head-wearing display device and method thereof for adjusting image spacing |
CN103698884A (en) * | 2013-12-12 | 2014-04-02 | 京东方科技集团股份有限公司 | Opening type head-mounted display device and display method thereof |
CN106199964B (en) * | 2015-01-21 | 2019-06-21 | 成都理想境界科技有限公司 | The binocular AR helmet and depth of field adjusting method of the depth of field can be automatically adjusted |
TWI569040B (en) * | 2015-05-07 | 2017-02-01 | 尚立光電股份有限公司 | Autofocus head mounted display device |
CN107272200A (en) * | 2017-05-02 | 2017-10-20 | 北京奇艺世纪科技有限公司 | A kind of focal distance control apparatus, method and VR glasses |
CN108663799B (en) * | 2018-03-30 | 2020-10-09 | 蒋昊涵 | Display control system and display control method of VR image |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1611975A (en) * | 2003-09-02 | 2005-05-04 | 佳能株式会社 | Imaging apparatus |
CN101137920A (en) * | 2005-03-08 | 2008-03-05 | 株式会社理光 | Lens barrel, lens driving device, camera and personal digital assistant device |
CN101454705A (en) * | 2006-05-26 | 2009-06-10 | 株式会社理光 | Lens driving-control device and imaging apparatus including the lens driving-control device |
CN205581417U (en) * | 2016-04-13 | 2016-09-14 | 中山联合光电科技股份有限公司 | Virtual reality optical system |
Non-Patent Citations (1)
Title |
---|
"紧凑型长波致冷红外变焦距透镜系统";白玉琢 等;《红外技术》;20110831;全文 * |
Also Published As
Publication number | Publication date |
---|---|
CN108632599A (en) | 2018-10-09 |
Similar Documents
Publication | Title |
---|---|
CN108663799B (en) | Display control system and display control method of VR image | |
CN108632599B (en) | Display control system and display control method of VR image | |
US10241329B2 (en) | Varifocal aberration compensation for near-eye displays | |
JP6520119B2 (en) | Image processing apparatus and image processing method | |
CN104869389B (en) | Off-axis formula virtual video camera parameter determination method and system | |
JPH10210506A (en) | Three-dimensional image information input device and three-dimensional image information input output device | |
US9905143B1 (en) | Display apparatus and method of displaying using image renderers and optical combiners | |
JPH08317429A (en) | Stereoscopic electronic zoom device and stereoscopic picture quality controller | |
JP2000013818A (en) | Stereoscopic display device and stereoscopic display method | |
US8692870B2 (en) | Adaptive adjustment of depth cues in a stereo telepresence system | |
JP2014219621A (en) | Display device and display control program | |
US20230239457A1 (en) | System and method for corrected video-see-through for head mounted displays | |
CN109799899B (en) | Interaction control method and device, storage medium and computer equipment | |
TWI589150B (en) | Three-dimensional auto-focusing method and the system thereof | |
CN108287609B (en) | Image drawing method for AR glasses | |
KR100439341B1 (en) | Depth of field adjustment apparatus and method of stereo image for reduction of visual fatigue | |
CN110794590B (en) | Virtual reality display system and display method thereof | |
JPH06235885A (en) | Stereoscopic picture display device | |
WO2017085803A1 (en) | Video display device and video display method | |
CN115202475A (en) | Display method, display device, electronic equipment and computer-readable storage medium | |
CN211786414U (en) | Virtual reality display system | |
JP2001218231A (en) | Device and method for displaying stereoscopic image | |
CN211791831U (en) | Integrated imaging display system | |
CN110933396A (en) | Integrated imaging display system and display method thereof | |
KR101173280B1 (en) | Method and apparatus for processing stereoscopic image signals for controlling convergence of stereoscopic images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||