CN108632599B - Display control system and display control method of VR image - Google Patents

Display control system and display control method of VR image

Info

Publication number
CN108632599B
Authority
CN
China
Prior art keywords
image
movable lens
distance
image distance
lens group
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810287177.2A
Other languages
Chinese (zh)
Other versions
CN108632599A (en)
Inventor
蒋昊涵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN201810287177.2A priority Critical patent/CN108632599B/en
Publication of CN108632599A publication Critical patent/CN108632599A/en
Application granted granted Critical
Publication of CN108632599B publication Critical patent/CN108632599B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention provides a display control system and a display control method for VR images. The system comprises a display screen, a VR main lens, an eyeball tracking device, a movable lens group, and a calculation and control device and a driving device connected with the movable lens group. The eyeball tracking device identifies the gaze point region of the human eye; the calculation and control device calculates, from the gaze point region data, the position of the movable lens group at which the imaging image distance equals the image distance of the gaze point region; and the driving device drives the movable lens group to that position, adjusting the image distance of the VR image to match the image distance of the gaze point region. The system and method first acquire the gaze point region of the human eye and then use the movable lens group to remove the difference between the imaging image distance and the image distance of the gaze point region, overcoming the vergence-accommodation conflict caused by a fixed VR imaging position.

Description

Display control system and display control method of VR image
Technical Field
The invention relates to the technical field of display control, in particular to a display control system and a display control method for VR images.
Background
In existing VR display technology, 2D images of the same object taken from different angles are displayed in front of the observer's left and right eyes, and the parallax between them produces a 3D visual impression. Because the image distance of the 2D images seen by the left and right eyes is fixed, the focus adjustment (accommodation) required for the 2D images does not match the depth perceived from the left-right parallax, producing a vergence-accommodation conflict; viewing for a long time causes eye fatigue and even dizziness.
Specifically, as shown in fig. 1a and fig. 1b, fig. 1a is a schematic diagram of viewing a real 3D object and fig. 1b is a schematic diagram of stereoscopic vision under existing VR technology. In the figures, 101 and 102 denote the left eye and the right eye, 103 is a real 3D object, 104 is an existing VR device, 105 is the VR imaging position, 106 is the 3D visually perceived position, and Lb and La denote the vergence distance and the focus distance respectively. As shown in fig. 1a, when the human eye observes the real world, the vergence distance Lb equals the focus distance La and there is no vergence-accommodation conflict. Under existing VR stereoscopic display technology, however, Lb and La differ greatly, the vergence-accommodation conflict is prominent, and the viewing experience suffers.
In the prior art there are mainly two VR display technologies. The light path of the first is shown in fig. 2: a display screen 210 produces a picture, which forms a VR image 230 through a VR main lens 220 for the observer's left and right eyes. Since the focal length of the VR main lens 220 is fixed and its distance from the display screen 210 is fixed, the image distance of the VR image 230 is fixed. The drawback of this technology is that the VR imaging position cannot follow the viewing distance implied by the 3D parallax images, so the vergence-accommodation conflict remains. The second technology is light field display based on a microlens array, shown in fig. 3: display screen pixels 310 near the focal plane of the microlens array 320 form light field vectors 330 through the microlens array, which converge into an image through the human eye 340; by selecting different pixels, the angle of the light field vectors, and therefore the image distance, can be adjusted to avoid the vergence-accommodation conflict. As can be seen from fig. 3, each pixel forms one light field vector, while several light field vectors form one image point. The drawback of microlens-array light field display is that many pixels are needed to display a single image point: providing angular resolution for the light field vectors reduces the spatial resolution of the displayed image, and the two resolutions are in direct conflict.
Therefore, the prior art is subject to further improvement.
Disclosure of Invention
In view of the above disadvantages in the prior art, the present invention provides a display control system and a display control method for VR images, which overcome the drawbacks of a fixed imaging position and the resulting vergence-accommodation conflict in existing VR displays.
A first embodiment of the present invention is a display control system for VR images, including a display screen and a VR main lens, and further including:
an eyeball tracking device for tracking the eyes of the viewer and identifying the fixation point area of the eyes;
a movable lens group arranged between the VR main lens and the display screen for adjusting the VR imaging position;
and a computing and control device and a driving device which are connected with the movable lens group;
the eyeball tracking device is connected with the calculation and control device and is used for transmitting the data information of the identified human eye fixation point area to the calculation and control device;
the calculation and control device is used for calculating position data required to be adjusted by the movable lens group when the imaging distance is the same as the image distance of the gazing point region according to the data information of the gazing point region, and generating a driving control instruction according to the position data;
and the driving device receives the driving control instruction, drives the movable lens group to move to the optical axis coordinate corresponding to the position data, and adjusts the image distance of the VR image to be consistent with the image distance of the gaze point area image.
Optionally, the eyeball tracking device acquires a human eye fixation point area based on an eyeball tracking technology.
Optionally, the calculating and controlling device is further configured to obtain image distance information of a corresponding sub-image in the image to be displayed according to the gaze point region of human eyes, and calculate imaging image distance information according to the image distance information of the sub-image.
Optionally, the display control system further includes: a graphics processor connected to the display screen;
the calculation and control device is also used for calculating an image distance difference value between the human eye fixation point area and the non-fixation point area, and calculating the blurring parameters of the sub-images corresponding to the non-fixation point areas according to the image distance difference value;
and the graphic processor is used for blurring the sub-images corresponding to the non-fixation point areas according to the calculated blurring parameters.
Optionally, the movable lens group comprises at least one lens, which is parallel to and coaxial with the VR main lens.
A second embodiment provided by the present invention is a display control method of the display control system, including the steps of:
tracking the eyes of a viewer and identifying a fixation point area of the eyes;
calculating position data required to be adjusted by the movable lens group when the imaging distance is the same as the image distance of the gaze point region according to the identified data information of the gaze point region of the human eye, and generating a driving control instruction according to the position data;
and receiving the driving control instruction, driving the movable lens group to move to the optical axis coordinate corresponding to the position data, and adjusting the image distance of the VR image to be consistent with the image distance of the image of the fixation point area.
Optionally, the step of calculating, according to the identified data information of the gaze point region of the human eye, position data that the movable lens group needs to be adjusted when the image distance is the same as the image distance of the gaze point region further includes:
and acquiring image distance information of corresponding sub-images in the image to be displayed according to the fixation point area of human eyes, and calculating imaging image distance information according to the image distance information of the sub-images.
Optionally, the step of calculating, according to the identified gaze point region of the human eye, position data that the movable lens group needs to adjust when the imaging distance is the same as the image distance of the gaze point region includes:
keeping the position between the display screen and the VR main lens unchanged, and deducing the position data of the movable lens group required to be adjusted according to a Gaussian imaging formula.
Optionally, the display control method further includes:
calculating an image distance difference value between a fixation point area and a non-fixation point area of the human eye, and calculating a blurring parameter of a sub-image corresponding to each non-fixation point area according to the image distance difference value;
and performing blurring processing on the sub-images corresponding to the non-fixation point areas according to the calculated blurring parameters.
Optionally, the movable lens group comprises at least one lens, and the lens is parallel to the main lens and coaxial with the main lens.
The beneficial effects of the invention are as follows. The invention provides a display control system and a display control method for VR images, comprising a display screen, a VR main lens, an eyeball tracking device, a movable lens group, and a calculation and control device and a driving device connected with the movable lens group. The eyeball tracking device identifies the gaze point region of the human eye and transmits it to the calculation and control device; the calculation and control device calculates, from the gaze point region, the position to which the movable lens group must be adjusted so that the imaging image distance equals the image distance of the gaze point region; and the driving device drives the movable lens group to the optical axis coordinate corresponding to the position data, adjusting the image distance of the VR image to match the image distance of the gaze point region. By first acquiring the gaze point region of the human eye and then using the movable lens group to remove the difference between the imaging image distance and the image distance of the gaze point region, the display control system and method overcome the vergence-accommodation conflict caused by a fixed VR imaging position when a user watches VR video.
Drawings
FIG. 1a is a schematic diagram of a prior art viewing of real 3D objects;
FIG. 1b is a schematic diagram of stereo vision of VR technology in the prior art;
FIG. 2 is an optical diagram of a prior art VR display technique;
FIG. 3 is a prior art optical path diagram of a VR light field display technique based on a microlens array;
FIG. 4 is a schematic structural diagram of a VR image display control system according to the present invention;
FIG. 5 is a schematic diagram of the structure of an image to be displayed and a sub-image in an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of sub-images corresponding to the eye gaze point area in an embodiment of the present invention;
FIG. 7 is an optical diagram of a VR image display control system provided by the present invention;
FIG. 8 is a flow chart of method steps for a method of controlling the display of a VR image provided in accordance with the present invention;
FIG. 9 is a flow chart of steps implemented by a specific application of the method of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer and clearer, the present invention is further described in detail below with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In existing VR display technology, parallax 2D images are displayed in front of the observer's left and right eyes to produce a 3D visual impression. Because the image distance of the observed 2D images is fixed, a vergence-accommodation conflict arises, and long viewing sessions cause eyestrain and even dizziness; this is a problem that 3D display must solve. The present invention provides a display control system and a display control method for VR images to resolve the vergence-accommodation conflict of existing VR display technology.
A first embodiment of the present invention provides a display control system for VR images, as shown in fig. 4, including a display screen 440 and a VR main lens 460, and further including:
an eyeball tracking device 420 for tracking the eyes of the viewer and identifying the gaze point area of the eyes;
a movable lens group (including a concave lens 451 and a convex lens 452) arranged between the VR main lens 460 and the display screen 440 for adjusting the VR imaging position; and a calculation and control device 400 and a driving device 470, both connected with the movable lens group;
the eyeball tracking device 420 is connected with the calculation and control device 400 and is used for transmitting the data information of the identified human eye fixation point area to the calculation and control device 400;
the calculation and control device 400 is configured to calculate, according to the data information of the gazing point region, position data that the movable lens group needs to adjust when an imaging distance is the same as an image distance of the gazing point region, and generate a driving control instruction according to the position data;
and the driving device 470 receives the driving control command, drives the movable lens group to move to the optical axis coordinate corresponding to the position data, and adjusts the image distance of the VR image to be consistent with the image distance of the gaze point area image.
In the present invention, the gaze point region is the region at which the viewer's eyes are gazing. In a specific implementation, the eyeball tracking device preferably acquires the gaze point region of the human eye based on an eyeball tracking technique, for example using an infrared light source and an infrared image recognition module to identify the pupil and the Purkinje spot; any technique that accurately acquires the gaze point region of the human eye may be used.
Specifically, the calculating and controlling device 400 is further configured to obtain image distance information of a corresponding sub-image in the image to be displayed according to the gaze point region of human eyes, and calculate imaging image distance information according to the image distance information of the sub-image.
As shown in fig. 5, the calculation and control device obtains the image data of the image 51 to be displayed from the GPU, and from that data obtains the image distance information of the sub-image 52 corresponding to the gaze point region in the image 51 to be displayed. It then calculates the optical axis coordinate position at which the movable lens group must be placed so that the image distance of the VR image equals the image distance of the gaze point region.
Preferably, in order to adjust the image distance of the VR image, the movable lens group includes at least one lens. For more convenient image-distance adjustment a convex lens and a concave lens may be used; a single lens, or one convex lens plus one concave lens, are both conceivable. The lenses are all parallel to and coaxial with the VR main lens.
Referring to fig. 6 and fig. 7, and taking as an example a movable lens group consisting of one convex lens and one concave lens, the step of calculating the optical axis coordinate position of the movable lens group from the data of the gaze point region is described in more detail below. Specifically, the step includes:
First, the calculation and control device acquires the sub-image image distance data contained in the image to be displayed. Preferably, the image distance data of the sub-images is stored as a two-dimensional array, where X[i][j] is the image distance value and i, j are the coordinates of the sub-image within the image.
Next, the eyeball tracking device calculates the coordinates of the gaze point position the observer is watching, based on an eyeball tracking technique. As shown in fig. 6, the gaze point observed by the human eye is tracked to coordinates (i, j), i.e. the sub-image in the middle of the hot-air balloon. The calculation and control device then looks up the sub-image image distance corresponding to the gaze point coordinates and determines the position at which the image should be displayed, i.e. the image distance. Since the coordinates (i, j) correspond to the image distance X[i][j], the image should be formed at the image distance X[i][j].
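As a minimal sketch of this lookup (assuming the two-dimensional array layout described above; the function name, types and example values are illustrative and not taken from the patent):

```python
from typing import Sequence

def target_image_distance(X: Sequence[Sequence[float]], i: int, j: int) -> float:
    """Return the image distance (e.g. in mm) at which the VR image should be
    formed, given the per-sub-image image-distance map X and the gaze point
    coordinates (i, j) reported by the eyeball tracking device."""
    return X[i][j]

# Illustrative 3x3 image-distance map: the central sub-image (the hot-air
# balloon of fig. 6) is closer to the viewer than the background.
X = [[4000.0, 4000.0, 4000.0],
     [4000.0,  500.0, 4000.0],
     [4000.0, 4000.0, 4000.0]]
print(target_image_distance(X, 1, 1))   # gaze on the central sub-image -> 500.0
```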
The calculation and control unit calculates the movable lens position from the image distance data of the sub-images. In the optical path of this embodiment as shown in fig. 7, the movable lens group includes 2 movable lenses, and a first movable lens 720 and a second movable lens 730 are sequentially arranged along the incident light direction emitted from the display screen, wherein the first movable lens is a concave lens, the second movable lens is a convex lens, and both the first movable lens and the second movable lens are parallel to the VR main lens and share the same optical axis.
As shown in fig. 7, the image displayed on the display screen 710 passes in turn through the first movable lens 720 and the second movable lens 730 to form intermediate image 1 (760) and intermediate image 2 (770), both of which are virtual images; intermediate image 2 (770) then forms the VR image through the VR main lens 740. In fig. 7: Hp is the image height on the display screen; Le is the distance from the human eye to the VR main lens; f is the VR main lens focal length; S is the distance from the VR image to the VR main lens, i.e. the image distance; Ss is the distance from the VR image to the human eye, i.e. the apparent viewing distance, Ss = S + Le; Hx is the VR image height; L1 is the distance from the display screen to the first movable lens (a parameter to be calculated); S1 is the distance from intermediate image 1 to the first movable lens; L2 is the distance from intermediate image 1 to the second movable lens (a parameter to be calculated); L02 is the distance from the second movable lens to the display screen; S2 is the distance from intermediate image 2 to the second movable lens; and Lz is the distance from intermediate image 2 to the VR main lens.
When calculating the positions of the first movable lens and the second movable lens, the following requirements must be satisfied:

1) the viewing angle does not change during image distance adjustment, i.e. the ratio v = Hx/Ss is kept constant;

2) the distance between the display screen and the VR main lens is kept constant, i.e. Ls = L1 - S1 + L2 + Lz - S2 does not change.
According to the above requirements, and referring to the optical path diagram of fig. 7, the positions of the first movable lens and the second movable lens can be derived from the Gaussian imaging formula as follows.

For the VR main lens, the focal length is f (positive sign), the object distance is Lz, and the image distance is (-S) (virtual image);

for the first movable lens, the focal length is -f1 (concave lens), the object distance is L1, and the image distance is (-S1) (virtual image);

for the second movable lens, the focal length is f2 (positive sign), the object distance is L2, and the image distance is (-S2) (virtual image).

Here f1 and f2 are fixed values, while L1 and L2 are the quantities to be determined. The requirements to be satisfied are:

1) the VR imaging distance is S;

2) the height of the virtual image is Hx = v·Ss = v·(S + Le), where the viewing-angle scale factor v is kept unchanged while the positions of the first and second movable lenses are adjusted;

3) during the position adjustment of the movable lens group, the distance Ls between the display screen 710 and the VR main lens 740 is kept constant:

Ls = L1 - S1 + L2 + Lz - S2 (01)

Let Lm = Ls - Lz.

Applying the Gaussian imaging formula to the VR main lens, 1/Lz + 1/(-S) = 1/f, thus:

Lz = f·S / (f + S) (02)

The VR main lens magnification is:

m = S / Lz = (f + S) / f (03)

The total magnification of the light path should be:

k_total = Hx / Hp = v·(S + Le) / Hp (04)

The total magnification k of the first movable lens and the second movable lens equals the total magnification of the light path divided by the VR main lens magnification; from formula (03) and formula (04):

k = k_total / m = v·(S + Le)·f / (Hp·(f + S)) (05)

Let the magnification of the first movable lens be r, that is:

r = S1 / L1 (06)

Applying the Gaussian imaging formula to the first movable lens, 1/L1 + 1/(-S1) = 1/(-f1), then:

L1 = f1·(1 - r) / r (07)

Let the magnification of the second movable lens be r2 = S2 / L2. Applying the Gaussian imaging formula to the second movable lens:

1/L2 + 1/(-S2) = 1/f2 (08)

then:

L2 = f2·(r2 - 1) / r2 (09)

And since k = r·r2, namely:

r2 = k / r (010)

Substituting (010) into (09) and (08) yields:

L2 = f2·(k - r) / k (011)

and

S2 = r2·L2 = f2·(k - r) / r (012)

Substituting formulas (06), (07), (011) and (012) into (01) gives:

Lm = Ls - Lz = L1·(1 - r) + L2 - S2 = f1·(1 - r)²/r - f2·(k - r)²/(k·r)

Simplifying the above formula gives the quadratic equation:

(k·f1 - f2)·r² + k·(2·f2 - 2·f1 - Lm)·r + k·(f1 - k·f2) = 0

For a given image distance S, the quantities Lm, k, f1 and f2 in the above equation are all fixed, so it is a quadratic equation in r; solving it yields:

r = (-b ± √(b² - 4·a·c)) / (2·a)

wherein:

a = k·f1 - f2

b = k·(2·f2 - 2·f1 - Lm)

c = k·(f1 - k·f2)

And since the first movable lens is a concave lens producing a reduced virtual image, 0 < r < 1, the root satisfying this condition is taken.

Finally, the following formulas are obtained:

L1 = f1·(1 - r) / r

L02 = L1 - S1 + L2 = L1·(1 - r) + f2·(k - r) / k

wherein:

S1 = r·L1

r2 = k / r

L2 = f2·(k - r) / k

S2 = f2·(k - r) / r

Lm = Ls - Lz = Ls - f·S / (f + S)

The calculation and control device calculates the lens positions according to the above formulas, obtaining the current distance L1 between the first movable lens and the display screen and the distance L02 between the second movable lens and the display screen.
The lens positions calculated according to the above formulas, for the following given data, are:

Given data: main lens focal length f: 50 mm; display screen height Hp: 30 mm; viewing-angle scale factor v = 0.8; distance Le from the eye to the VR main lens: 10 mm; first movable lens focal length f1: 90 mm; second movable lens focal length f2: 80 mm; distance Ls between main lens and display screen: 38 mm.

For VR imaging distance S = 330 mm: L1 = 10.09 mm, L02 = 20.72 mm;
for VR imaging distance S = 500 mm: L1 = 10.99 mm, L02 = 23.53 mm;
for VR imaging distance S = 1000 mm: L1 = 11.23 mm, L02 = 25.79 mm;
for VR imaging distance S = 2000 mm: L1 = 11.21 mm, L02 = 26.82 mm;
for VR imaging distance S = 4000 mm: L1 = 11.13 mm, L02 = 27.30 mm.
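The numbers above can be cross-checked with a short script implementing the formulas as reconstructed here (a sketch under the stated assumptions; variable names follow fig. 7, and the root with 0 < r < 1 is taken). With the example data it reproduces the published L1 and L02 values to within rounding:

```python
import math

def movable_lens_positions(S, f=50.0, f1=90.0, f2=80.0,
                           Ls=38.0, Le=10.0, Hp=30.0, v=0.8):
    """Return (L1, L02): display-to-first-lens and display-to-second-lens
    distances (mm) that place the VR image at image distance S (mm)."""
    Lz = f * S / (f + S)              # (02) intermediate image 2 to main lens
    m = (f + S) / f                   # (03) main lens magnification
    k_total = v * (S + Le) / Hp       # (04) total magnification of the light path
    k = k_total / m                   # (05) combined magnification of the two movable lenses
    Lm = Ls - Lz                      # fixed for a given S
    # quadratic a*r^2 + b*r + c = 0 in the first-lens magnification r
    a = k * f1 - f2
    b = k * (2 * f2 - 2 * f1 - Lm)
    c = k * (f1 - k * f2)
    r = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)   # root with 0 < r < 1
    L1 = f1 * (1 - r) / r             # (07) display screen to first movable lens
    S1 = r * L1
    L2 = f2 * (k - r) / k             # (011) intermediate image 1 to second movable lens
    L02 = L1 - S1 + L2                # second movable lens to display screen
    return L1, L02

for S in (330, 500, 1000, 2000, 4000):
    L1, L02 = movable_lens_positions(S)
    print(f"S = {S:5d} mm  ->  L1 = {L1:5.2f} mm, L02 = {L02:5.2f} mm")
```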
After the movable lens positions are obtained by calculation, a driving control command for the driving device is generated from the distance data.
The driving device receives the driving control command and moves the first movable lens and the second movable lens to the positions given by the calculated L1 and L02, thereby adjusting the image distance of the VR image to match the image distance of the gaze point region image.
Preferably, in order to better display the VR imaging corresponding to the user gaze point region, the display control system disclosed in the present invention further includes: a graphics processor (i.e., the GPU shown in FIG. 4) coupled to a display screen 440;
the calculation and control device is also used for calculating an image distance difference value between the human eye fixation point area and the non-fixation point area, and calculating the blurring parameters of the sub-images corresponding to the non-fixation point areas according to the image distance difference value;
and the graphic processor is used for blurring the sub-images corresponding to the non-fixation point areas according to the calculated blurring parameters.
The display control system of the VR image determines, based on an eyeball tracking technique, the image region the human eye is observing, i.e. the observer's current gaze point region; determines from the gaze point region the image distance at which the left-eye and right-eye images should be formed; adjusts the lens positions of the movable lens group so that the left-eye and right-eye VR images fall at that imaging distance; and then calculates blurring parameters for the non-gaze-point regions from the image distance difference between the gaze point region and the non-gaze-point regions and blurs the non-gaze-point regions accordingly. This removes the difference between the image distance of the VR image and the imaging image distance required by the human eye, prevents other image content from interfering with the display of the user's gaze point region, and provides a more comfortable VR viewing environment.
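The patent leaves the exact form of the blurring function open; purely as an illustration, the blurring parameter for a non-gaze-point sub-image could be made to grow with its image distance difference from the gaze point region (the mapping, gain and cap below are assumptions, not part of the patent):

```python
def blur_radius(x_gaze_mm: float, x_sub_mm: float,
                gain: float = 0.005, max_radius: float = 12.0) -> float:
    """Illustrative blurring parameter: a blur radius (in pixels) that grows
    with the image-distance difference between the gaze point region and a
    non-gaze-point sub-image, clamped to max_radius."""
    return min(max_radius, gain * abs(x_sub_mm - x_gaze_mm))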
The advantage of the display control system is that the position of the VR image on the optical axis, i.e. the image distance, is adjusted in real time according to the observer's gaze point, and images in the non-gaze-point regions can be blurred according to their image distance difference from the gaze point region, thereby alleviating the dizziness and visual fatigue caused by the vergence-accommodation conflict when the human eye views 3D images.
A second embodiment provided by the present invention is a display control method of the above display control system, as shown in fig. 8, comprising the following steps:
step S810, tracking eyes of a viewer, and identifying a fixation point area of the eyes; the function of which is as described above for the eye tracking device of the system.
Step S820, calculating position data which needs to be adjusted by the movable lens group when the imaging distance is the same as the image distance of the gaze point region according to the data information of the identified gaze point region of the human eye, and generating a driving control command according to the position data; the function of which is described in the calculation and control device of the above-mentioned system.
Step S830, receiving the driving control command, driving the movable lens group to move to the optical axis coordinate corresponding to the position data, and adjusting the image distance of the VR image to be consistent with the image distance of the image in the gazing point region, where the function of the driving device is as described in the above system.
Preferably, the step of calculating the position data required to be adjusted by the movable lens group when the image distance is the same as the image distance of the gaze point region according to the identified gaze point region of the human eye further comprises:
and acquiring image distance information of corresponding sub-images in the image to be displayed according to the fixation point area of human eyes, and calculating imaging image distance information according to the image distance information of the sub-images.
Preferably, the step of calculating the position data of the movable lens group to be adjusted when the image distance is the same as the image distance of the gaze point region according to the identified data information of the gaze point region of the human eye comprises:
the position between the display screen and the VR main lens is kept unchanged, and the position data of the movable lens group can be deduced according to a Gaussian imaging formula.
Preferably, the movable lens group comprises a convex lens and a concave lens, and the convex lens and the concave lens are parallel to the main lens and share an optical axis.
In order to achieve better display of the VR image, the display control method further includes:
calculating an image distance difference value between a fixation point area and a non-fixation point area of the human eye, and calculating a blurring parameter of a sub-image corresponding to each non-fixation point area according to the image distance difference value;
and performing blurring processing on the sub-images corresponding to the non-fixation point areas according to the calculated blurring parameters.
To explain the display control method provided by the present invention in more detail, its steps in a specific application are described below.
In specific implementation, as shown in fig. 9, the method provided by the present invention has the following steps:
step S910, the calculating and controlling device obtains sub-image distance data included in the image to be displayed.
In step S920, the eye tracking apparatus calculates the coordinates of the gazing point position that the observer is watching based on the eye tracking technique. For example: the position coordinates of the gazing point which is watched by the eyes of the observer are positioned in the middle area of the image to be displayed.
Step S930, the calculation and control device reads the sub-image image distance data corresponding to the gaze point coordinates and determines the position at which the image is to be displayed, i.e. the image distance.
In step S940, the calculation and control device calculates position coordinates at which each lens group of the movable lens group should be located, and generates a drive control command. In this embodiment, the movable lens group includes 2 movable lenses, and the first movable lens 720 and the second movable lens 730 are sequentially arranged along the incident light direction emitted from the display screen, the first movable lens is a concave lens, the second movable lens is a convex lens, and the lenses are parallel to the main lens and share the same optical axis.
And step S950, the driving device adjusts the position of the movable lens group according to the driving control instruction, and further adjusts the image distance of the VR image to be consistent with the image distance of the gaze point area image.
Step S960, calculating the blurring factor of the sub-image of each non-gazing point region according to the image distance difference between the gazing point region and the non-gazing point region.
In step S970, the GPU performs blurring processing on each non-gaze point region according to the blurring factor.
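Putting steps S910 to S970 together, one iteration of the method could be sketched as below (the eye_tracker, lens_drive, frame and gpu objects stand for the eyeball tracking device, driving device, frame buffer and graphics processor; their interfaces are assumed for illustration, and movable_lens_positions and blur_radius are the sketches given earlier):

```python
def iter_subimages(X):
    """Yield ((row, col), image_distance) for every sub-image in the map X."""
    for p, row in enumerate(X):
        for q, x_sub in enumerate(row):
            yield (p, q), x_sub

def display_control_step(frame, X, eye_tracker, lens_drive, gpu):
    """One pass of the display control method (steps S910-S970)."""
    i, j = eye_tracker.gaze_point()               # S920: gaze point coordinates
    S = X[i][j]                                   # S910/S930: target image distance
    L1, L02 = movable_lens_positions(S)           # S940: movable lens positions
    lens_drive.move_to(L1, L02)                   # S950: adjust the movable lens group
    for (p, q), x_sub in iter_subimages(X):       # S960: blurring factor per sub-image
        if (p, q) != (i, j):
            frame.blur_subimage(p, q, blur_radius(S, x_sub))  # S970: GPU blurring
    gpu.present(frame)
```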
In summary, the invention provides a display control system and a display control method for VR images, comprising a display screen, a VR main lens, an eyeball tracking device, a movable lens group, and a calculation and control device and a driving device connected with the movable lens group. The eyeball tracking device is connected with the calculation and control device and transmits the acquired gaze point region data to it; the calculation and control device calculates, from the gaze point region data, the position of the movable lens group at which the imaging image distance equals the image distance of the gaze point region; and the driving device drives the movable lens group to the optical axis coordinate corresponding to the position data, adjusting the image distance of the VR image to match the image distance of the gaze point region. By first acquiring the gaze point region of the human eye and then using the movable lens group to remove the difference between the imaging image distance and the image distance of the gaze point region, the display control system and method overcome the vergence-accommodation conflict caused by a fixed VR imaging position and relieve the eye fatigue it causes when users watch VR videos or images.
It should be understood that equivalents and modifications of the technical solution and inventive concept thereof may occur to those skilled in the art, and all such modifications and alterations should fall within the scope of the appended claims.

Claims (2)

1. A display control system for VR images, comprising a display screen and a VR main lens, characterized in that it further comprises:
an eyeball tracking device for tracking the eyes of the viewer and identifying the fixation point area of the eyes;
a movable lens group arranged between the VR main lens and the display screen for adjusting the VR imaging position;
and a computing and control device and a driving device which are connected with the movable lens group;
the eyeball tracking device is connected with the calculation and control device and is used for transmitting the data information of the identified human eye fixation point area to the calculation and control device;
the calculation and control device is used for calculating position data required to be adjusted by the movable lens group when the imaging distance is the same as the image distance of the gazing point region according to the data information of the gazing point region, and generating a driving control instruction according to the position data;
the driving device receives the driving control instruction, drives the movable lens group to move to the optical axis coordinate corresponding to the position data, and adjusts the image distance of the VR image to be consistent with the image distance of the gaze point area image;
the calculation and control device is also used for acquiring image distance information of corresponding sub-images in the image to be displayed according to the fixation point area of human eyes and calculating imaging image distance information according to the image distance information of the sub-images; the computing and control device acquires image data of an image to be displayed from the graphic processor, and acquires image distance information of a sub-image in the image to be displayed, which corresponds to the fixation point area, according to the image data; calculating the optical axis coordinate position where the movable lens group is positioned when the image distance of VR imaging is adjusted to be the same as the image distance of the fixation point area;
the movable lens group comprises a first movable lens and a second movable lens, and the first movable lens and the second movable lens are parallel to the VR main lens and share an optical axis;
the calculating and controlling device is used for calculating the positions of the first movable lens and the second movable lens and is required to meet the following requirements: the visual angle is unchanged in the image distance adjusting process; the distance between the display screen and the VR main lens is kept unchanged;
the display control system further includes: a graphics processor connected to the display screen;
the calculation and control device is also used for calculating an image distance difference value between the human eye fixation point area and the non-fixation point area, and calculating the blurring parameters of the sub-images corresponding to the non-fixation point areas according to the image distance difference value;
the image processor is used for blurring the sub-images corresponding to the non-fixation point areas according to the calculated blurring parameters;
the eyeball tracking device acquires a human eye fixation point area based on an eyeball tracking technology;
the calculating steps of the positions of the first and second movable lenses are as follows:
for the VR main lens, the focal length is f (positive sign), the object distance is Lz, and the image distance is (-S) (virtual image);

for the first movable lens, the focal length is -f1 (the first movable lens is a concave lens), the object distance is L1, and the image distance is (-S1) (virtual image);

for the second movable lens, the focal length is f2 (positive sign), the object distance is L2, and the image distance is (-S2) (virtual image);

wherein f1 and f2 are fixed values, L1 and L2 are the quantities to be determined, and the requirements to be satisfied are:

1) the VR imaging distance is S;

2) the height of the virtual image is Hx = v·Ss = v·(S + Le), where the viewing-angle scale factor v is kept unchanged while the positions of the first and second movable lenses are adjusted;

3) during the position adjustment of the movable lens group, the distance Ls between the display screen and the VR main lens is kept constant, Ls = L1 - S1 + L2 + Lz - S2, and Lm = Ls - Lz;

applying the Gaussian imaging formula to the VR main lens, 1/Lz + 1/(-S) = 1/f, thus Lz = f·S / (f + S);

the VR main lens magnification is m = S / Lz = (f + S) / f;

the total magnification of the light path is k_total = Hx / Hp = v·(S + Le) / Hp;

the total magnification k of the first movable lens and the second movable lens equals the total magnification of the light path divided by the VR main lens magnification, k = k_total / m = v·(S + Le)·f / (Hp·(f + S));

the magnification of the first movable lens is r = S1 / L1; applying the Gaussian imaging formula to the first movable lens, 1/L1 + 1/(-S1) = 1/(-f1), then L1 = f1·(1 - r) / r;

the magnification of the second movable lens is r2 = S2 / L2; applying the Gaussian imaging formula to the second movable lens, 1/L2 + 1/(-S2) = 1/f2, then L2 = f2·(r2 - 1) / r2;

and since k = r·r2, r2 = k / r, so L2 = f2·(k - r) / k and S2 = r2·L2 = f2·(k - r) / r;

substituting these expressions into Ls = L1 - S1 + L2 + Lz - S2 gives Lm = L1·(1 - r) + L2 - S2 = f1·(1 - r)²/r - f2·(k - r)²/(k·r), which simplifies to the quadratic equation

(k·f1 - f2)·r² + k·(2·f2 - 2·f1 - Lm)·r + k·(f1 - k·f2) = 0;

for a given image distance S, the quantities Lm, k, f1 and f2 are all fixed, so the equation is a quadratic in r, whose solution is r = (-b ± √(b² - 4·a·c)) / (2·a), wherein a = k·f1 - f2, b = k·(2·f2 - 2·f1 - Lm), and c = k·(f1 - k·f2); and since the first movable lens is a concave lens producing a reduced virtual image, 0 < r < 1, the root satisfying this condition is taken;

finally, L1 = f1·(1 - r) / r and L02 = L1 - S1 + L2 = L1·(1 - r) + f2·(k - r) / k, wherein S1 = r·L1, r2 = k / r, L2 = f2·(k - r) / k, S2 = f2·(k - r) / r, and Lm = Ls - Lz = Ls - f·S / (f + S);

the calculation and control device calculates the lens positions according to the above formulas to obtain the current distance between the first movable lens and the display screen and the distance between the second movable lens and the display screen.
2. A display control method of a display control system according to claim 1, characterized by comprising the steps of:
identifying a human eye fixation point area based on an eyeball tracking technology;
calculating position data required to be adjusted by the movable lens group when the imaging distance is the same as the image distance of the gaze point region according to the identified data information of the gaze point region of the human eye, and generating a driving control instruction according to the position data;
receiving the driving control instruction, driving the movable lens group to move to the optical axis coordinate corresponding to the position data, and adjusting the image distance of the VR image to be consistent with the image distance of the image of the fixation point area;
calculating an image distance difference value between a fixation point area and a non-fixation point area of the human eye, and calculating a blurring parameter of a sub-image corresponding to each non-fixation point area according to the image distance difference value;
blurring the sub-images corresponding to the non-fixation point areas according to the calculated blurring parameters;
the step of calculating the position data that the movable lens group needs to adjust when the imaging distance is the same as the image distance of the gaze point region according to the identified data information of the gaze point region of the human eye further comprises:
acquiring image distance information of corresponding sub-images in an image to be displayed according to a fixation point area of human eyes, and calculating imaging image distance information according to the image distance information of the sub-images;
the step of calculating the position data required to be adjusted by the movable lens group when the imaging distance is the same as the image distance of the gaze point region according to the identified data information of the gaze point region of the human eye comprises the following steps:
keeping the position between the display screen and the VR main lens unchanged, and deducing position data required to be adjusted by the movable lens group according to a Gaussian imaging formula;
the movable lens group comprises at least one lens which is parallel to the main lens and is coaxial with the main lens.
CN201810287177.2A 2018-03-30 2018-03-30 Display control system and display control method of VR image Active CN108632599B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810287177.2A CN108632599B (en) 2018-03-30 2018-03-30 Display control system and display control method of VR image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810287177.2A CN108632599B (en) 2018-03-30 2018-03-30 Display control system and display control method of VR image

Publications (2)

Publication Number Publication Date
CN108632599A CN108632599A (en) 2018-10-09
CN108632599B true CN108632599B (en) 2020-10-09

Family

ID=63696579

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810287177.2A Active CN108632599B (en) 2018-03-30 2018-03-30 Display control system and display control method of VR image

Country Status (1)

Country Link
CN (1) CN108632599B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108663799B (en) * 2018-03-30 2020-10-09 蒋昊涵 Display control system and display control method of VR image
CN109298793B (en) * 2018-11-22 2022-05-20 京东方科技集团股份有限公司 Screen position adjusting method and device
CN109491091B (en) * 2019-01-10 2024-04-16 京东方科技集团股份有限公司 Optical system applied to VRAR system and focusing method thereof
CN112213859B (en) * 2020-10-12 2022-09-23 歌尔科技有限公司 Head-mounted display device and imaging method thereof
CN114047817B (en) * 2021-10-15 2023-04-07 中邮通建设咨询有限公司 Virtual reality VR interactive system of meta universe
CN114415368B (en) * 2021-12-15 2023-05-12 青岛歌尔声学科技有限公司 Regulation and control method and device of VR equipment, system and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1611975A (en) * 2003-09-02 2005-05-04 佳能株式会社 Imaging apparatus
CN101137920A (en) * 2005-03-08 2008-03-05 株式会社理光 Lens barrel, lens driving device, camera and personal digital assistant device
CN101454705A (en) * 2006-05-26 2009-06-10 株式会社理光 Lens driving-control device and imaging apparatus including the lens driving-control device
CN205581417U (en) * 2016-04-13 2016-09-14 中山联合光电科技股份有限公司 Virtual reality optical system

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006285482A (en) * 2005-03-31 2006-10-19 Toppan Printing Co Ltd Device for correcting image geometry
JP5434848B2 (en) * 2010-08-18 2014-03-05 ソニー株式会社 Display device
CN102944935B (en) * 2012-11-13 2014-12-24 京东方科技集团股份有限公司 Binocular head-wearing display device and method thereof for adjusting image spacing
CN103698884A (en) * 2013-12-12 2014-04-02 京东方科技集团股份有限公司 Opening type head-mounted display device and display method thereof
WO2016115874A1 (en) * 2015-01-21 2016-07-28 成都理想境界科技有限公司 Binocular ar head-mounted device capable of automatically adjusting depth of field and depth of field adjusting method
TWI569040B (en) * 2015-05-07 2017-02-01 尚立光電股份有限公司 Autofocus head mounted display device
CN107272200A (en) * 2017-05-02 2017-10-20 北京奇艺世纪科技有限公司 A kind of focal distance control apparatus, method and VR glasses
CN108663799B (en) * 2018-03-30 2020-10-09 蒋昊涵 Display control system and display control method of VR image

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1611975A (en) * 2003-09-02 2005-05-04 佳能株式会社 Imaging apparatus
CN101137920A (en) * 2005-03-08 2008-03-05 株式会社理光 Lens barrel, lens driving device, camera and personal digital assistant device
CN101454705A (en) * 2006-05-26 2009-06-10 株式会社理光 Lens driving-control device and imaging apparatus including the lens driving-control device
CN205581417U (en) * 2016-04-13 2016-09-14 中山联合光电科技股份有限公司 Virtual reality optical system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"紧凑型长波致冷红外变焦距透镜系统";白玉琢 等;《红外技术》;20110831;全文 *

Also Published As

Publication number Publication date
CN108632599A (en) 2018-10-09

Similar Documents

Publication Publication Date Title
CN108663799B (en) Display control system and display control method of VR image
CN108632599B (en) Display control system and display control method of VR image
JP6520119B2 (en) Image processing apparatus and image processing method
US10241329B2 (en) Varifocal aberration compensation for near-eye displays
JPH10210506A (en) Three-dimensional image information input device and three-dimensional image information input output device
JPH08317429A (en) Stereoscopic electronic zoom device and stereoscopic picture quality controller
JP2000013818A (en) Stereoscopic display device and stereoscopic display method
US8692870B2 (en) Adaptive adjustment of depth cues in a stereo telepresence system
US9905143B1 (en) Display apparatus and method of displaying using image renderers and optical combiners
JP2014219621A (en) Display device and display control program
CN109188700A (en) Optical presentation system and AR/VR display device
US20230239457A1 (en) System and method for corrected video-see-through for head mounted displays
CN109799899B (en) Interaction control method and device, storage medium and computer equipment
TWI589150B (en) Three-dimensional auto-focusing method and the system thereof
KR100439341B1 (en) Depth of field adjustment apparatus and method of stereo image for reduction of visual fatigue
CN110794590B (en) Virtual reality display system and display method thereof
JPH06235885A (en) Stereoscopic picture display device
CN108287609B (en) Image drawing method for AR glasses
CN115202475A (en) Display method, display device, electronic equipment and computer-readable storage medium
CN211786414U (en) Virtual reality display system
WO2017085803A1 (en) Video display device and video display method
JP2001218231A (en) Device and method for displaying stereoscopic image
CN211791831U (en) Integrated imaging display system
CN110933396A (en) Integrated imaging display system and display method thereof
KR101173280B1 (en) Method and apparatus for processing stereoscopic image signals for controlling convergence of stereoscopic images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant