CN110035274B - Three-dimensional display method based on grating - Google Patents

Three-dimensional display method based on grating

Info

Publication number
CN110035274B
CN110035274B (application CN201810031064.6A)
Authority
CN
China
Prior art keywords: visual, visual area, area, grating, view
Prior art date
Legal status
Active
Application number
CN201810031064.6A
Other languages
Chinese (zh)
Other versions
CN110035274A (en)
Inventor
刘立林
滕东东
Current Assignee
Park view (Guangzhou) Technology Co., Ltd
Original Assignee
National Sun Yat Sen University
Priority date
Filing date
Publication date
Application filed by National Sun Yat Sen University filed Critical National Sun Yat Sen University
Priority to CN201810031064.6A priority Critical patent/CN110035274B/en
Priority to PCT/CN2019/070029 priority patent/WO2019137272A1/en
Priority to US16/479,926 priority patent/US11012673B2/en
Publication of CN110035274A publication Critical patent/CN110035274A/en
Application granted
Publication of CN110035274B publication Critical patent/CN110035274B/en

Classifications

    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • G02B30/27 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays

Abstract

The invention relates to the technical field of three-dimensional image display, in particular to a grating-based three-dimensional display method. According to the spatial position of the observer's eyes, the method adjusts the relevant parameters of the grating so that light passing through all or part of each of several adjacent viewing zones enters the observer's left pupil, while light passing through all or part of other adjacent viewing zones enters the observer's right pupil. The light reaching the two pupils is kept spatially separate either through the design of the arrangement direction of the grating units or with the aid of other means of projecting different images to the observer's left and right eyes.

Description

Three-dimensional display method based on grating
Technical Field
The invention relates to the technical field of three-dimensional image display, in particular to a three-dimensional display method based on a grating.
Background
Compared with the real three-dimensional world, a two-dimensional display loses the depth information of the third dimension, so three-dimensional display technologies that present stereoscopic scenes are receiving increasing attention. Grating-based three-dimensional display is compatible with mainstream flat-panel displays and has therefore become the most widely applied three-dimensional display technology at present. Through the light-splitting function of the grating, conventional grating-type three-dimensional display guides the outgoing beams of different pixel groups of the display screen to different viewing zones, so that the two eyes of an observer located near the viewpoints corresponding to different viewing zones receive light information from different pixel groups, and a three-dimensional image is presented based on binocular parallax. Limited by the space-bandwidth product of the display screen, the number of viewing zones and corresponding viewpoints that a grating display can present is limited; at the same time, to ensure that both eyes of the observer receive the corresponding light information, the limited number of viewing zones must cover both eyes, which results in a large spacing between adjacent viewing zones, so that each eye of the observer receives only one two-dimensional view. To see the respective two-dimensional view clearly, the two eyes must focus on the display screen presenting that view, while, when three-dimensional vision is obtained through binocular parallax, the viewing directions of the two eyes converge on the displayed three-dimensional spatial point. The resulting focus-convergence conflict causes dizziness and other visual discomfort for the observer.
In existing grating-based three-dimensional display technology, the long direction of the grating units is placed at a large angle to the line connecting the observer's eyes, and the long direction of the viewing zones, whose number is very limited because of the limited resolution of the display screen, coincides with the long direction of the grating units. Since the distribution of viewing zones along the direction of the binocular line must cover both of the observer's eyes, the spacing of adjacent viewing zones along that direction has to be large, and monocular multi-view (including two-view) presentation cannot be realized. A display method in which each eye receives only one corresponding two-dimensional view therefore suffers from the inconsistency between the focusing distance and the convergence distance, which is one of the important sources of visual discomfort when viewing a three-dimensional image. Meanwhile, various existing non-grating three-dimensional display technologies, for example helmet-type virtual/augmented reality technology using two displays and three-dimensional technologies using polarized or shutter glasses, use two equivalent screens on a display surface (the two equivalent screens in a helmet-type virtual/augmented reality system are the virtual images of the two displays, and the two equivalent screens in a polarized-glasses three-dimensional system are the projections of the two projectors on a screen) to present one corresponding two-dimensional view to each of the observer's eyes; these likewise suffer from the inconsistency between the focusing distance and the convergence distance.
If a single eye can receive two or more two-dimensional views, the light rays from these views that pass through a displayed object point are superimposed at that point to form an object point on which the eye can focus, thereby overcoming the above-mentioned focus-convergence conflict.
Disclosure of Invention
The invention aims to overcome the defects of existing three-dimensional display technology and to provide a grating-based three-dimensional display method that realizes monocular multi-view (including two-view) three-dimensional display by increasing the density of viewpoints that can be presented at the pupils of an observer, thereby achieving comfortable three-dimensional vision. The invention forms viewing zones by the light splitting of a one-dimensional grating; in the following, the direction perpendicular to the arrangement direction of the grating units of the one-dimensional grating is defined as the long direction of the grating units, and the long direction of each viewing zone formed by the light splitting coincides with the long direction of the grating units. The technical scheme adopted by the invention is as follows:
A grating-based three-dimensional display method is provided, which comprises the following steps:
S1. At time point t+l×∆t, determine the spatial position of the observer's two pupils by tracking and locating, where l is a natural number;
S2. Arrange a one-dimensional grating in front of the display screen that presents the pixel light information, along the transmission direction of the emergent light of the display screen; the one-dimensional grating splits the emergent beams of the pixels of the display screen so that N groups of pixels are respectively visible in N viewing zones, where N is a positive integer;
S3. Design the grating parameters of the one-dimensional grating and the inclination angle θ of the long direction of the grating units relative to the line connecting the centers of the observer's two pupils, so that among the viewing zones formed by the light splitting of the one-dimensional grating there is a left viewing-zone group, composed of two or more viewing zones and corresponding to the observer's left pupil, in which the emergent beams of the pixels corresponding to each viewing zone can enter the left pupil, and there is also a right viewing-zone group, composed of two or more viewing zones and corresponding to the observer's right pupil, in which the emergent beams of the pixels corresponding to each viewing zone can enter the right pupil;
S4. In each viewing zone of the left viewing-zone group, take a point as the viewpoint corresponding to that viewing zone, and in each viewing zone of the right viewing-zone group, take a point as the viewpoint corresponding to that viewing zone;
S5. For each pixel on the display screen corresponding to the left and right viewing-zone groups, load the view information corresponding to the viewpoint of its viewing zone;
S6. At the next time point t+l×∆t+∆t, repeat steps S1 to S5.
In this scheme, mainly according to the spatial position of the observer, the grating parameters of the one-dimensional grating and the inclination angle θ of the long direction of the grating units relative to the line connecting the observer's two pupils are designed so that the observer's left pupil can receive at least two two-dimensional views presented by the pixels corresponding to at least two adjacent viewing zones, and the right pupil can receive at least two two-dimensional views presented by the pixels corresponding to at least two other adjacent viewing zones, thereby realizing comfortable three-dimensional vision.
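Purely as an illustration of how steps S1 to S6 fit together, the following Python sketch models each viewing zone only by the position at which it crosses the line through the observer's pupils. All function names, data structures and numerical values (pupil diameter, zone pitch, pupil positions) are assumptions made for the sketch and are not taken from the disclosure.

```python
# Minimal 1-D sketch of steps S1-S5 along the line through the observer's pupils.
# Every number below is an illustrative assumption (units: millimetres).

PUPIL_DIAMETER = 4.0   # assumed pupil diameter D_e
ZONE_PITCH = 2.5       # assumed pitch of adjacent viewing zones measured along that line

def viewing_zone_centers(n_zones, first_center):
    """S2/S3: positions where the N viewing zones cross the binocular line."""
    return [first_center + i * ZONE_PITCH for i in range(n_zones)]

def zones_covering(pupil_center, centers):
    """Viewing zones whose strip overlaps the pupil (a left or right viewing-zone group)."""
    half = (ZONE_PITCH + PUPIL_DIAMETER) / 2.0
    return [i for i, c in enumerate(centers) if abs(c - pupil_center) < half]

def viewpoints_for(group, centers, pupil_center):
    """S4: one viewpoint per zone of the group, clamped so that it lies inside the pupil."""
    lo, hi = pupil_center - PUPIL_DIAMETER / 2, pupil_center + PUPIL_DIAMETER / 2
    return {i: min(max(centers[i], lo), hi) for i in group}

if __name__ == "__main__":
    # S1: pupil positions obtained from tracking (illustrative values)
    left_pupil, right_pupil = -32.0, 32.0
    centers = viewing_zone_centers(n_zones=27, first_center=-33.0)
    left_group = zones_covering(left_pupil, centers)    # two adjacent zones -> two views
    right_group = zones_covering(right_pupil, centers)
    # S5 would render one view per entry below and load it onto that zone's pixel group
    print("left viewing-zone group:", left_group, viewpoints_for(left_group, centers, left_pupil))
    print("right viewing-zone group:", right_group, viewpoints_for(right_group, centers, right_pupil))
```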
In a preferred embodiment, step S4 specifically includes: draw two lines passing through the left pupil and the right pupil respectively, named the left viewpoint positioning line and the right viewpoint positioning line; in each viewing zone of the left viewing-zone group, take a point on the left viewpoint positioning line as the viewpoint corresponding to that viewing zone, and in each viewing zone of the right viewing-zone group, take a point on the right viewpoint positioning line as the viewpoint corresponding to that viewing zone, such that the distance between the viewpoints of two adjacent viewing zones in the same viewing-zone group is smaller than or equal to the pupil diameter. It should be understood that selecting the viewpoint of each viewing zone by drawing left and right viewpoint positioning lines is only one preferred embodiment; in other embodiments, the viewpoint of each viewing zone of the left and right viewing-zone groups may be determined by taking a point directly in the viewing zone.
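The viewpoint-positioning-line variant of step S4 can also be sketched geometrically. The following hypothetical example treats the viewing zones as parallel strips of perpendicular pitch d whose long direction is tilted by the angle θ against the binocular line, places the viewpoints where a positioning line drawn through the pupil crosses the center lines of two adjacent zones, and checks their spacing against the pupil diameter D_e; all numbers are illustrative assumptions.

```python
# Sketch of viewpoint selection on a positioning line (illustrative geometry only).
import math

PUPIL_DIAMETER = 4.0        # assumed D_e in millimetres

def viewpoints_on_positioning_line(pupil, theta_deg, zone_pitch, zone_indices):
    """Intersections of a positioning line through `pupil`, drawn along the binocular
    x-direction, with the center lines of the listed viewing zones.  The zones are
    strips of perpendicular pitch `zone_pitch` whose long direction makes the
    (non-zero) angle theta with the binocular line."""
    theta = math.radians(theta_deg)
    nx, ny = math.sin(theta), -math.cos(theta)      # unit normal of the strips
    px, py = pupil
    points = {}
    for i in zone_indices:
        center = (i + 0.5) * zone_pitch             # signed position of the zone's center line
        t = (center - (px * nx + py * ny)) / nx     # walk along (1, 0) until that line is reached
        points[i] = (px + t, py)
    return points

if __name__ == "__main__":
    vps = viewpoints_on_positioning_line(pupil=(0.0, 0.0), theta_deg=45.0,
                                         zone_pitch=2.0, zone_indices=[-1, 0])
    spacing = abs(vps[0][0] - vps[-1][0])           # equals zone_pitch / sin(theta)
    print(vps, "spacing:", round(spacing, 2), "<= D_e:", spacing <= PUPIL_DIAMETER)
```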
In a more preferred embodiment, on the basis that the light splitting of the one-dimensional grating makes the N pixel groups of the display screen respectively visible in the N viewing zones, the three-dimensional display method further comprises a time-division-multiplexing procedure, which comprises the following steps:
P1. At time point t+l×∆t, determine the spatial position of the observer's pupils as in step S1, and adopt the grating parameters of the one-dimensional grating, the inclination angle θ of the long direction of the grating units relative to the line connecting the observer's two pupils, and the left and right viewing-zone groups and their corresponding viewpoints determined in steps S3 and S4; number the N pixel groups and the N viewing zones so that pixel group n corresponds to viewing zone n, and load, on each pixel on the display screen corresponding to the left and right viewing-zone groups, the view information corresponding to the viewpoint of its viewing zone, where l and n are natural numbers and 1 ≤ n ≤ N;
P2. At a time point t+l×∆t+k×∆t/N between time point t+l×∆t and time point t+l×∆t+∆t, translate the one-dimensional grating so that each pixel of pixel group n is visible in viewing zone n+k when n+k ≤ N and in viewing zone n+k-N when n+k > N, and load, on each pixel on the display screen corresponding to the left and right viewing-zone groups, the view information corresponding to the viewpoint of its viewing zone, where k is a natural number and 1 ≤ k ≤ N-1;
P3. At some or all of the N-1 time points, spaced ∆t/N apart, between time point t+l×∆t and time point t+l×∆t+∆t, perform step P2.
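The schedule implied by steps P1 to P3 amounts to a cyclic reassignment of pixel groups to viewing zones within one ∆t period. A minimal sketch, assuming only that zones and pixel groups are numbered 1 to N and using placeholder viewpoint labels, is given below.

```python
# Hedged sketch of the time-division-multiplexing schedule P1-P3 (indexing is an assumption).

def zone_seen_by_pixel_group(n, k, N):
    """Viewing zone (1..N) in which pixel group n (1..N) is visible at sub-step k (0..N-1),
    i.e. n+k when n+k <= N and n+k-N otherwise."""
    return (n + k - 1) % N + 1

def tdm_schedule(N, zone_viewpoints):
    """For every sub-step k, map each pixel group to the viewpoint whose view it must carry;
    pixel groups whose zone is outside the left/right viewing-zone groups carry no view."""
    schedule = []
    for k in range(N):                      # k = 0 is step P1, k = 1..N-1 are steps P2/P3
        frame = {n: zone_viewpoints.get(zone_seen_by_pixel_group(n, k, N))
                 for n in range(1, N + 1)}
        schedule.append(frame)
    return schedule

if __name__ == "__main__":
    # illustrative viewpoints for N = 9: zones 3 and 4 feed the left pupil, 6 and 7 the right
    viewpoints = {3: "left view 3", 4: "left view 4", 6: "right view 6", 7: "right view 7"}
    for k, frame in enumerate(tdm_schedule(9, viewpoints)):
        print("sub-step", k, {n: v for n, v in frame.items() if v})
```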
The invention discloses an alternative mode of three-dimensional display based on a grating, which comprises the following steps:
SS1. Use as the image input device a display device capable of projecting different images to the observer's two eyes respectively, the display device comprising a display screen that carries the pixel light information;
SS2. At time point t+l×∆t, determine the spatial position of the observer's two pupils by tracking and locating, where l is a natural number;
SS3. Arrange a one-dimensional grating in front of the display screen along the transmission direction of the emergent light of the display screen; the one-dimensional grating splits the emergent beams of the pixels of the display screen so that N groups of pixels of the display screen are respectively visible in N viewing zones, where N is a positive integer;
SS4. Design the grating parameters of the one-dimensional grating and the inclination angle θ of the long direction of the grating units relative to the line connecting the centers of the observer's two pupils and, in combination with the display device's ability to project different images to the observer's two eyes respectively, make the light splitting of the one-dimensional grating form a left viewing-zone group, composed of two or more viewing zones and corresponding to the observer's left pupil, in which the emergent beams of the pixels corresponding to each viewing zone can enter the left pupil, and also a right viewing-zone group, composed of two or more viewing zones and corresponding to the observer's right pupil, in which the emergent beams of the pixels corresponding to each viewing zone can enter the right pupil;
SS5. In each viewing zone of the left viewing-zone group, take a point as the viewpoint corresponding to that viewing zone, and in each viewing zone of the right viewing-zone group, take a point as the viewpoint corresponding to that viewing zone;
SS6. For each pixel on the display screen corresponding to the left and right viewing-zone groups, load the view information corresponding to the viewpoint of its viewing zone;
SS7. At the next time point t+l×∆t+∆t, repeat steps SS2 to SS6.
In a preferred embodiment, step SS5 specifically includes: draw two lines passing through the left pupil and the right pupil respectively, named the left viewpoint positioning line and the right viewpoint positioning line; in each viewing zone of the left viewing-zone group, take a point on the left viewpoint positioning line as the viewpoint corresponding to that viewing zone, and in each viewing zone of the right viewing-zone group, take a point on the right viewpoint positioning line as the viewpoint corresponding to that viewing zone, with the distance between the viewpoints of two adjacent viewing zones in the same viewing-zone group smaller than or equal to the observer's pupil diameter.
It should be understood that selecting the viewpoint of each viewing zone by drawing left and right viewpoint positioning lines is only one preferred embodiment. In other embodiments, the viewpoint may be selected directly in each viewing zone, provided that the distance between the viewpoints of adjacent viewing zones in the same viewing-zone group is smaller than or equal to the observer's pupil diameter.
In a more preferred embodiment, on the basis that the light splitting of the one-dimensional grating makes the N pixel groups of the display screen respectively visible in the N viewing zones, the three-dimensional display method further comprises a time-division-multiplexing procedure, which comprises the following steps:
P1. At time point t+l×∆t, determine the spatial position of the observer's pupils as in step SS2, and adopt the grating parameters of the one-dimensional grating, the inclination angle θ of the long direction of the grating units relative to the line connecting the observer's two pupils, and the left and right viewing-zone groups and their corresponding viewpoints determined in steps SS4 and SS5; number the N pixel groups and the N viewing zones so that pixel group n corresponds to viewing zone n, and load, on each pixel on the display screen corresponding to the left and right viewing-zone groups, the view information corresponding to the viewpoint of its viewing zone, where l and n are natural numbers and 1 ≤ n ≤ N;
P2. At a time point t+l×∆t+k×∆t/N between time point t+l×∆t and time point t+l×∆t+∆t, translate the one-dimensional grating so that each pixel of pixel group n is visible in viewing zone n+k when n+k ≤ N and in viewing zone n+k-N when n+k > N, and load, on each pixel on the display screen corresponding to the left and right viewing-zone groups, the view information corresponding to the viewpoint of its viewing zone, where k is a natural number and 1 ≤ k ≤ N-1;
P3. At some or all of the N-1 time points, spaced ∆t/N apart, between time point t+l×∆t and time point t+l×∆t+∆t, perform step P2.
Compared with the prior art, the invention has the following beneficial effects: even if the spacing of adjacent viewing zones is large, monocular two-view presentation can be realized by controlling the boundary between adjacent viewing zones to lie at the observer's pupil; or, when the spacing of adjacent viewing zones is small, monocular multi-view presentation can be realized directly by the light splitting of the one-dimensional grating. For the observer's two eyes, even when the spacing of adjacent viewing zones is small, a limited number of closely spaced viewing zones (spacing smaller than the pupil diameter) can still cover both eyes by designing the angle of the grating-unit orientation relative to the direction of the binocular line, or comfortable binocular three-dimensional presentation can be realized in combination with another device that projects different images to the two eyes. Furthermore, the invention increases the display resolution through time division multiplexing, further improving the display effect.
Drawings
Fig. 1 is a schematic diagram illustrating an implementation principle of a three-dimensional display method based on a grating for implementing monocular two-view presentation through adjacent partial areas of two viewing zones according to an embodiment of the present invention.
Fig. 2 is a schematic diagram illustrating an implementation principle of a grating-based three-dimensional display method for implementing monocular multi-view presentation by a single pupil across multiple viewing zones according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of the implementation principle of a grating-based three-dimensional display method that realizes monocular two-view presentation through adjacent partial areas of two viewing zones in combination with a device that projects different images to the observer's two eyes, according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of the implementation principle of a grating-based three-dimensional display method that realizes monocular multi-view presentation with a single pupil spanning multiple viewing zones, in combination with a device that projects different images to the observer's two eyes, according to an embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the figures and the specific embodiments. The drawings are for illustrative purposes only and are not to be construed as limiting the patent; for the purpose of better illustrating the embodiments, certain features of the drawings may be omitted, enlarged or reduced, and do not represent the size of an actual product; it will be understood by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted.
Example 1
The invention discloses a grating-based three-dimensional display method in which a one-dimensional grating splits the emergent beams of the pixels of a display screen. The one-dimensional grating is arranged in front of the display screen along the transmission direction of the emergent light of the display screen and splits the emergent beams of the pixels, so that different groups of pixels are respectively visible in different viewing zones, such as viewing zones 1, 2, 3, 4, 5, 6, 7, 8 and 9 shown in Fig. 1. The grating parameters of the one-dimensional grating make the spacing d between adjacent viewing zones, measured perpendicular to the long direction of the grating units, relatively large, so that the observer's pupil cannot completely cover a whole viewing zone. At time point t+l×∆t (l a natural number), the spatial position of the observer's two pupils is determined by tracking and locating. According to the positions of the two pupils, the grating parameters of the one-dimensional grating and the inclination angle θ of the long direction of the grating units relative to the line connecting the two pupils are designed so that, among the viewing zones formed by the light splitting of the one-dimensional grating, there are two adjacent viewing zones corresponding to the observer's left pupil, such as viewing zone 3 and viewing zone 4 in Fig. 1, whose corresponding pixels can be observed by the observer's left eye through adjacent partial areas of the two zones; these are correspondingly named left viewing zone 3 and left viewing zone 4. There are likewise two other adjacent viewing zones corresponding to the observer's right pupil, such as viewing zone 6 and viewing zone 7 in Fig. 1, whose corresponding pixels can be observed by the observer's right eye through adjacent partial areas; they are correspondingly named right viewing zone 6 and right viewing zone 7. Left viewing zone 3 and left viewing zone 4 make up the left viewing-zone group, and right viewing zone 6 and right viewing zone 7 make up the right viewing-zone group. Two lines are drawn through the left pupil and the right pupil respectively, named the left viewpoint positioning line and the right viewpoint positioning line, as shown in Fig. 1. In each viewing zone of the left viewing-zone group a point on the left viewpoint positioning line is taken as the corresponding viewpoint (named left viewpoint 3 and left viewpoint 4 respectively), and in each viewing zone of the right viewing-zone group a point on the right viewpoint positioning line is taken as the corresponding viewpoint (named right viewpoint 6 and right viewpoint 7 respectively), with the distance between the two viewpoints of the same viewing-zone group smaller than or equal to the observer's pupil diameter D_e. Each viewpoint positioning line in Fig. 1 is a straight line passing through the corresponding pupil, the left viewpoint 3 and left viewpoint 4 lying within the left pupil and the right viewpoint 6 and right viewpoint 7 lying within the right pupil. Each viewpoint positioning line may in practice also be a curve passing through the corresponding pupil, and this also applies to the following embodiments.
Further, in an actual system, the viewpoint positioning lines may be two lines that do not pass through the left and right pupils but are spaced from them by a certain distance along the direction of the binocular line; with such a design the positions of the displayed spatial light spots are merely shifted by a corresponding amount. This extension of the viewpoint positioning lines also applies to the following embodiments. The viewpoint may also be determined without using viewpoint positioning lines, for example by taking, as the viewpoint corresponding to each viewing zone of the left and right viewing-zone groups, a point in the overlap area of that viewing zone and the observer's pupil; this also applies to the following embodiments. The pixels on the display screen corresponding to left viewing zone 3 are loaded with the view whose viewpoint is left viewpoint 3, the pixels corresponding to left viewing zone 4 with the view whose viewpoint is left viewpoint 4, the pixels corresponding to right viewing zone 6 with the view whose viewpoint is right viewpoint 6, and the pixels corresponding to right viewing zone 7 with the view whose viewpoint is right viewpoint 7. Each pupil of the observer thus receives two views, and the spatial superposition of the emergent rays of the two views forms a real distribution of spatial light spots, realizing the presentation of spatial light spots on which a single eye can focus. At the next time point the above process is executed in the same way, and by repeating it in this manner a visually comfortable three-dimensional scene is presented. In this embodiment, the display screen presenting the pixel light information may be a display having real display pixels, for example an OLED display or a liquid-crystal display, but may also be another form of display surface, such as a screen onto which a reflective or transmissive projector projects images.
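Geometrically, the condition used in this embodiment, that each pupil receives light through adjacent partial areas of two viewing zones, means that the boundary between those two zones passes through the pupil. A hedged sketch of that check follows, assuming the viewing zones are parallel strips of perpendicular pitch d tilted by θ against the binocular line, so that their pitch measured along the binocular line is d/sin θ; all numbers are illustrative, not values from the disclosure.

```python
# Illustrative check that a viewing-zone boundary lies inside each pupil (embodiment 1).
import math

def distance_to_zone_boundary(pupil, boundary_index, pitch, theta_deg):
    """Perpendicular distance from the pupil center to the boundary between viewing zones
    `boundary_index` and `boundary_index` + 1, for strips whose long direction is tilted
    by theta against the binocular (x) direction and whose perpendicular pitch is `pitch`."""
    theta = math.radians(theta_deg)
    nx, ny = math.sin(theta), -math.cos(theta)          # unit normal of the strips
    boundary = boundary_index * pitch                   # signed position of the boundary line
    return abs(pupil[0] * nx + pupil[1] * ny - boundary)

if __name__ == "__main__":
    D_e = 4.0                                           # assumed pupil diameter, mm
    d, theta = 6.0, 10.0                                # large perpendicular pitch, shallow tilt
    left_pupil, right_pupil = (-32.0, 0.0), (32.0, 0.0)
    print("pitch along the binocular line:", round(d / math.sin(math.radians(theta)), 1), "mm")
    for pupil, boundary in ((left_pupil, -1), (right_pupil, 1)):
        dist = distance_to_zone_boundary(pupil, boundary, d, theta)
        print(pupil, "straddles boundary", boundary, ":", dist < D_e / 2, "(distance", round(dist, 2), "mm)")
```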
Example 2
The invention discloses a grating-based three-dimensional display method in which a one-dimensional grating splits the emergent beams of the pixels of a display screen. The one-dimensional grating is arranged in front of the display screen along the transmission direction of the emergent light of the display screen and splits the emergent beams of the pixels, so that different groups of pixels are respectively visible in different viewing zones, such as viewing zones 1, 2, 3, 4, 5, 6, 7, 8 and 9 shown in Fig. 2. The grating parameters of the one-dimensional grating make the spacing d between adjacent viewing zones, measured perpendicular to the long direction of the grating units, smaller than the observer's pupil diameter D_e; in this case, as long as a pupil lies within the distribution range of the viewing zones, it overlaps at least partially with two or more viewing zones. At time point t+l×∆t (l a natural number), the spatial position of the observer's two pupils is determined by tracking and locating. According to the positions of the two pupils, the grating parameters of the one-dimensional grating and the inclination angle θ of the long direction of the grating units relative to the line connecting the two pupils are designed so that the viewing zones formed by the light splitting of the one-dimensional grating can be divided into a left viewing-zone group and a right viewing-zone group which do not overlap and which cover the observer's left and right pupils respectively. In Fig. 2 the left viewing-zone group consists of viewing zones 1, 2 and 3, named left viewing zone 1, left viewing zone 2 and left viewing zone 3 respectively, and the right viewing-zone group consists of viewing zones 6, 7 and 8, named right viewing zone 6, right viewing zone 7 and right viewing zone 8 respectively. Two lines are drawn through the left pupil and the right pupil respectively, named the left viewpoint positioning line and the right viewpoint positioning line, as shown in Fig. 2. In each left viewing zone a point on the left viewpoint positioning line is taken as the corresponding viewpoint, such as left viewpoints 1, 2 and 3 in Fig. 2; in each right viewing zone a point on the right viewpoint positioning line is taken as the corresponding viewpoint, such as right viewpoints 6, 7 and 8 in Fig. 2. The pixels on the display screen corresponding to left viewing zone 1 are loaded with the view whose viewpoint is left viewpoint 1; the pixels corresponding to left viewing zone 2 with the view whose viewpoint is left viewpoint 2; the pixels corresponding to left viewing zone 3 with the view whose viewpoint is left viewpoint 3; the pixels corresponding to right viewing zone 6 with the view whose viewpoint is right viewpoint 6; the pixels corresponding to right viewing zone 7 with the view whose viewpoint is right viewpoint 7; and the pixels corresponding to right viewing zone 8 with the view whose viewpoint is right viewpoint 8. In this case the left and right pupils each receive three views, and the spatial superposition of the emergent rays of the three views forms a real distribution of spatial light spots, realizing the presentation of spatial light spots on which a single eye can focus.
In this process there are other viewing zones that do not overlap with either pupil, such as viewing zones 4, 5 and 9 in Fig. 2. According to their spatial positions they can also be assigned to the nearer viewing-zone group, which enlarges the spatial range of each group: in Fig. 2, viewing zone 4 is assigned to the left viewing-zone group and correspondingly named left viewing zone 4, while viewing zones 5 and 9 are assigned to the right viewing-zone group and correspondingly named right viewing zone 5 and right viewing zone 9; the corresponding viewpoints are then determined and the images are loaded onto the corresponding pixels by the method described above. The advantage of this is that, as long as the movement range of each of the observer's eyes does not exceed the spatial region of the corresponding enlarged viewing-zone group, it is no longer necessary to determine the spatial position of the observer's two pupils by tracking and locating. At the next time point the above process is executed in the same way, and by repeating it in this manner a visually comfortable three-dimensional scene is presented. In this embodiment, the display screen presenting the pixel light information may be a display having real display pixels, for example an OLED display or a liquid-crystal display, but may also be another form of display surface, such as a screen onto which a reflective or transmissive projector projects images.
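The assignment of the remaining viewing zones to the nearer viewing-zone group reduces to a simple proximity rule. In the following hypothetical sketch each zone is represented only by the x position at which it crosses the binocular line; the values are chosen by hand to mimic Fig. 2 qualitatively (in a real grating the periodic repetition of the zone pattern supplies the zones near the second pupil) and are not taken from the disclosure.

```python
# Sketch of the viewing-zone grouping of embodiment 2 (all positions are assumptions, in mm).

def group_viewing_zones(zone_centers, left_pupil_x, right_pupil_x):
    """Assign every viewing zone to the left or right viewing-zone group of the nearer pupil."""
    left_group, right_group = [], []
    for index, x in sorted(zone_centers.items()):
        (left_group if abs(x - left_pupil_x) <= abs(x - right_pupil_x) else right_group).append(index)
    return left_group, right_group

if __name__ == "__main__":
    # hand-picked centers: zones 1-3 overlap the left pupil, zones 6-8 the right pupil,
    # and zones 4, 5 and 9 overlap neither pupil (cf. Fig. 2)
    centers = {1: -34.5, 2: -32.0, 3: -29.5, 4: -27.0,
               5: 27.0, 6: 29.5, 7: 32.0, 8: 34.5, 9: 37.0}
    left, right = group_viewing_zones(centers, left_pupil_x=-32.0, right_pupil_x=32.0)
    print("left viewing-zone group:", left)    # zone 4 joins the left group
    print("right viewing-zone group:", right)  # zones 5 and 9 join the right group
```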
Example 3
This grating-based three-dimensional display method uses, as the image input device, a display device capable of projecting different images to the observer's two eyes respectively. The image input device realizes the presentation of different images to the two eyes through a light-splitting technique, for example light splitting by an inherent grating attached to its display screen. The viewing zones generated by the image input device are called the intrinsic viewing zones of the image input device, such as intrinsic viewing zones 1, 2, 3, 4 and 5 in Fig. 3; in different intrinsic viewing zones, the images displayed by different groups of pixels of the display screen can be received. A one-dimensional grating is then arranged in front of the display screen along the transmission direction of the emergent light of the display screen; it splits the emergent beams of the pixels of the display screen so that different groups of pixels are respectively visible in different viewing zones, such as viewing zones 1, 2, 3 and 4 shown in Fig. 3. The long direction of the viewing zones formed by the one-dimensional grating and the long direction of the intrinsic viewing zones of the image input device are not in the same direction. The grating parameters of the one-dimensional grating make the spacing d between adjacent viewing zones, measured perpendicular to the long direction of the grating units, relatively large, so that the observer's pupil cannot completely cover a whole viewing zone. At time point t+l×∆t (l a natural number), the spatial position of the observer's two pupils is determined by tracking and locating. According to the positions of the two pupils, the grating parameters of the one-dimensional grating and the inclination angle θ of the long direction of the grating units relative to the line connecting the two pupils are designed so that, among the viewing zones formed by the light splitting of the one-dimensional grating, there are two adjacent viewing zones corresponding to the observer's left pupil, such as viewing zone 2 and viewing zone 3 in Fig. 3, whose corresponding pixels, if the intrinsic viewing zones of the image input device were not considered, could be observed by both the left eye and the right eye of the observer through adjacent partial areas of the two zones.
However, because of the intrinsic viewing zones of the image input device, the pixels jointly covered by viewing zone 2 or viewing zone 3 and intrinsic viewing zone 2 of the image input device can only be observed by the observer's left eye; the region jointly covered by viewing zone 2 and intrinsic viewing zone 2 is correspondingly named left viewing region 2, and the region jointly covered by viewing zone 3 and intrinsic viewing zone 2 is named left viewing region 3. Likewise, the pixels jointly covered by viewing zone 2 or viewing zone 3 and intrinsic viewing zone 5 of the image input device can only be observed by the observer's right eye; the region jointly covered by viewing zone 2 and intrinsic viewing zone 5 is named right viewing region 2, and the region jointly covered by viewing zone 3 and intrinsic viewing zone 5 is named right viewing region 3. The pixels on the display screen corresponding to the different left or right viewing regions do not overlap one another and can display images separately. Left viewing region 2 and left viewing region 3 then make up the left viewing-zone group, and right viewing region 2 and right viewing region 3 make up the right viewing-zone group. In Fig. 3, left viewing region 2 and right viewing region 2 are different parts of viewing zone 2 formed by the light splitting of the one-dimensional grating, and left viewing region 3 and right viewing region 3 are different parts of viewing zone 3. For other values of the inclination angle θ, the viewing regions of the left and right viewing-zone groups need not belong pairwise to the same viewing zone formed by the light splitting of the one-dimensional grating. Two lines are drawn through the left pupil and the right pupil respectively, named the left viewpoint positioning line and the right viewpoint positioning line, which intersect each viewing region of the left viewing-zone group and of the right viewing-zone group respectively, as shown in Fig. 3. In each viewing region of the left viewing-zone group, a point that lies on the left viewpoint positioning line and within the left pupil is taken as the corresponding viewpoint, such as left viewpoint 2 and left viewpoint 3 in Fig. 3; in each viewing region of the right viewing-zone group, a point that lies on the right viewpoint positioning line and within the right pupil is taken as the corresponding viewpoint, such as right viewpoint 2 and right viewpoint 3 in Fig. 3. The pixels corresponding to left viewing region 2 project the view corresponding to left viewpoint 2, and the pixels corresponding to right viewing region 2 project the view corresponding to right viewpoint 2; similarly, the pixels corresponding to left viewing region 3 project the view corresponding to left viewpoint 3, and the pixels corresponding to right viewing region 3 project the view corresponding to right viewpoint 3.
Each pupil of the observer can thus see two views, and the spatial superposition of the emergent rays of the two views forms a real distribution of spatial light spots, realizing the presentation of spatial light spots on which a single eye can focus. At the next time point the above process is executed in the same way, and by repeating it in this manner a visually comfortable three-dimensional scene is presented. In this embodiment, when the position of an observer's pupil straddles two adjacent intrinsic viewing zones of the image input device, the two adjacent intrinsic viewing zones are treated as one combined intrinsic viewing zone whose function in the above process is equivalent to that of a single intrinsic viewing zone, and the determination of the left and right viewing regions, the determination of the corresponding viewpoints and the loading of information onto the corresponding pixels are carried out by the same method. In this embodiment, the display screen presenting the pixel light information may be a display having real display pixels, for example an OLED display or a liquid-crystal display, but may also be another form of display surface, such as a screen onto which a reflective or transmissive projector projects images.
In this embodiment, if the intrinsic viewing zones generated by the image input device are produced time-sequentially, for example by time-sequential switching of the pointing direction of a directional backlight, the pixels corresponding to the different intrinsic viewing zones are spatially the same pixels, but their emergent light information is directed to different intrinsic viewing zones at different times. In that case the left and right viewing regions are determined by the same method as above; although the corresponding pixel sets may overlap in space, they are separated in time, so the views corresponding to the respective viewpoints can still be presented.
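The left and right viewing regions of this embodiment can be summarized as set intersections between the pixels of a viewing zone of the one-dimensional grating and the pixels of the intrinsic viewing zone of the image input device that covers one eye. The sketch below uses toy pixel-index sets that are assumptions for illustration, not the patent's actual pixel layout.

```python
# Sketch of the region formation of embodiment 3 via set intersection (toy pixel indices).

def viewing_regions(grating_zone_pixels, intrinsic_zone_pixels, grating_zones,
                    left_intrinsic, right_intrinsic):
    """Return {("left"|"right", zone): pixel set}; each region is the set of pixels seen both
    through that grating viewing zone and through the intrinsic zone covering one eye."""
    regions = {}
    for zone in grating_zones:
        regions[("left", zone)] = grating_zone_pixels[zone] & intrinsic_zone_pixels[left_intrinsic]
        regions[("right", zone)] = grating_zone_pixels[zone] & intrinsic_zone_pixels[right_intrinsic]
    return regions

if __name__ == "__main__":
    grating_zone_pixels = {2: {0, 1, 4, 5, 8, 9}, 3: {2, 3, 6, 7, 10, 11}}
    intrinsic_zone_pixels = {2: {0, 2, 4, 6, 8, 10},    # pixels routed to the left eye
                             5: {1, 3, 5, 7, 9, 11}}    # pixels routed to the right eye
    regions = viewing_regions(grating_zone_pixels, intrinsic_zone_pixels,
                              grating_zones=(2, 3), left_intrinsic=2, right_intrinsic=5)
    for key, pixels in sorted(regions.items()):
        print(key, sorted(pixels))
    # the four pixel sets are pairwise disjoint, so each can carry its own view:
    # two views for the left eye (left regions 2 and 3) and two for the right eye
```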
Example 4
This grating-based three-dimensional display method uses, as the image input device, a display device capable of projecting different images to the observer's two eyes respectively. The image input device splits light by means of an inherent grating attached to its display screen. The viewing zones generated by the image input device are called the intrinsic viewing zones of the image input device, such as intrinsic viewing zones 1, 2, 3, 4 and 5 in Fig. 4; in different intrinsic viewing zones, the images displayed by different groups of pixels of the display screen can be received. A one-dimensional grating is arranged in front of the display screen along the transmission direction of the emergent light of the display screen; it splits the emergent beams of the pixels so that different groups of pixels are respectively visible in different viewing zones, such as viewing zones 1, 2, 3 and 4 shown in Fig. 4. The long direction of the viewing zones formed by the one-dimensional grating and the long direction of the intrinsic viewing zones of the image input device are not in the same direction. The spacing d between adjacent viewing zones formed by the one-dimensional grating, measured perpendicular to the long direction of the grating units, is smaller than the observer's pupil diameter D_e. At time point t+l×∆t (l a natural number), the spatial position of the observer's two pupils is determined by tracking and locating. According to the positions of the two pupils, the grating parameters of the one-dimensional grating and the inclination angle θ of the long direction of the grating units relative to the line connecting the two pupils are designed so that the observer's left pupil can receive, through at least two of the viewing zones formed by the light splitting of the one-dimensional grating, such as viewing zones 2, 3 and 4 in Fig. 4, the light emitted by their corresponding pixels; the pixels that these zones share with intrinsic viewing zone 2 of the image input device can be observed only by the observer's left eye. The region jointly covered by viewing zone 2 and intrinsic viewing zone 2 is named left viewing region 2, the region jointly covered by viewing zone 3 and intrinsic viewing zone 2 is named left viewing region 3, and the region jointly covered by viewing zone 4 and intrinsic viewing zone 2 is named left viewing region 4. Similarly, the observer's right pupil can receive, through at least two viewing zones, such as viewing zones 2, 3 and 4 in Fig. 4, the light emitted by their corresponding pixels; the pixels that these zones share with intrinsic viewing zone 5 of the image input device can be observed only by the observer's right eye. The region jointly covered by viewing zone 2 and intrinsic viewing zone 5 is named right viewing region 2, the region jointly covered by viewing zone 3 and intrinsic viewing zone 5 is named right viewing region 3, and the region jointly covered by viewing zone 4 and intrinsic viewing zone 5 is named right viewing region 4. In each left viewing region or right viewing region, the images displayed by the different corresponding pixel groups can be received.
Left viewing region 2, left viewing region 3 and left viewing region 4 then make up the left viewing-zone group, and right viewing region 2, right viewing region 3 and right viewing region 4 make up the right viewing-zone group. In Fig. 4, left viewing region 2 and right viewing region 2 belong to viewing zone 2 formed by the light splitting of the one-dimensional grating, left viewing region 3 and right viewing region 3 belong to viewing zone 3, and left viewing region 4 and right viewing region 4 belong to viewing zone 4. For other values of the inclination angle θ, the constituent viewing regions of the left and right viewing-zone groups need not belong pairwise to the same viewing zone formed by the light splitting of the one-dimensional grating. Two lines are drawn through the left pupil and the right pupil respectively, named the left viewpoint positioning line and the right viewpoint positioning line, as shown in Fig. 4. In each viewing region of the left viewing-zone group a point on the left viewpoint positioning line is taken as the corresponding viewpoint, such as left viewpoint 2, left viewpoint 3 and left viewpoint 4 in Fig. 4, and in each viewing region of the right viewing-zone group a point on the right viewpoint positioning line is taken as the corresponding viewpoint, such as right viewpoint 2, right viewpoint 3 and right viewpoint 4 in Fig. 4. The pixels on the display screen corresponding to left viewing region 2 project the view corresponding to left viewpoint 2, and the pixels corresponding to right viewing region 2 project the view corresponding to right viewpoint 2; information is loaded onto the pixels corresponding to the other left or right viewing regions on the same principle. Each pupil of the observer thus receives three views, and the spatial superposition of the emergent rays of the three views forms a real distribution of spatial light spots, realizing the presentation of spatial light spots on which a single eye can focus. In this process there are, within the intrinsic viewing zone of the image input device in which a pupil lies, other viewing zones that do not overlap with the pupil, such as viewing zone 1 in Fig. 4. The overlap regions of such zones with the intrinsic viewing zone containing the left or right pupil can also be assigned to the left or right viewing-zone group and are correspondingly named left viewing region 1 and right viewing region 1; the corresponding viewpoints are then determined and the images are loaded onto the corresponding pixels by the method described above. The advantage of this is that, as long as the movement range of each of the observer's eyes does not exceed the spatial range of the corresponding intrinsic viewing zone of the image input device, it may no longer be necessary to determine the spatial position of the observer's two pupils by tracking and locating.
Furthermore, the intrinsic viewing zones of the image input device and the viewing zones formed by the one-dimensional grating can be combined pairwise; when, by the above method, view information relative to a viewpoint taken in the jointly covered area is loaded onto the pixels corresponding to each combination, the observer's two eyes receive their respective corresponding views wherever the observer moves, and monocular multi-view display is realized without the need for tracking and locating. At the next time point the above process is executed in the same way, and by repeating it in this manner a visually comfortable three-dimensional scene is presented. In this embodiment, when the position of an observer's pupil straddles two adjacent intrinsic viewing zones of the image input device, the two adjacent intrinsic viewing zones are treated as one combined intrinsic viewing zone whose function in the above process is equivalent to that of a single intrinsic viewing zone, and the determination of the left and right viewing regions, the determination of the corresponding viewpoints and the loading of information onto the corresponding pixels are carried out by the same method. In this embodiment, the display screen presenting the pixel light information may be a display having real display pixels, for example an OLED display or a liquid-crystal display, but may also be another form of display surface, such as a screen onto which a reflective or transmissive projector projects images.
In this embodiment, if the intrinsic viewing zones generated by the image input device are produced time-sequentially, for example by time-sequential switching of the pointing direction of a directional backlight, the pixels corresponding to the different intrinsic viewing zones are spatially the same pixels, but their emergent light information is directed to different intrinsic viewing zones at different times. In that case the left and right viewing regions are determined by the same method as above; although the corresponding pixel sets may overlap in space, they are separated in time, so the views corresponding to the respective viewpoints can still be presented.
Example 5
This grating-based three-dimensional display method uses, as the image input device, a display device capable of projecting different images to the observer's two eyes respectively. The image input device presents different views to the observer's two eyes through an equivalent pair of screens. For example, in head-mounted virtual/augmented reality technology, two displays present enlarged images to the observer's two eyes through their respective eyepieces; the enlarged virtual images of the two displays coincide in a common area on a virtual image surface that serves as the display screen, so the display screen functions as an equivalent double screen whose two equivalent screens are the enlarged virtual images of the two displays, each presenting the corresponding view to one of the observer's eyes. In a shutter-glasses three-dimensional display system, the display screen presents different images time-sequentially to the observer's two eyes through the time-sequential switching of the shutter glasses; although both eyes see images from the same display screen, the images are separated in time, so the display screen is an equivalent double screen separated in time, its two equivalent screens being the display screen at two adjacent time points, presenting the corresponding views time-sequentially to the observer's two eyes. In a polarized-glasses three-dimensional display system, two projectors project images with mutually orthogonal polarization states onto a screen (the display screen), and these images are reflected or transmitted through the polarized glasses to the observer's left and right eyes respectively; the display screen again functions as an equivalent double screen whose two equivalent screens are the projections of the two projectors on the screen. For such equivalent double screens, each equivalent screen projects the corresponding image to one of the observer's eyes, and the intrinsic viewing zones of the image input device corresponding to the equivalent screen seen by the observer's left eye and to the equivalent screen seen by the observer's right eye are spatially separated; they are named the intrinsic viewing zone of the left image input device and the intrinsic viewing zone of the right image input device respectively. In this case, similarly to embodiments 3 and 4, monocular multi-view (including two-view) three-dimensional display can be realized by the same method and procedure. The difference is that, in this embodiment, the intrinsic viewing zones of the image input device illustrated in Figs. 3 and 4 are replaced by only two intrinsic viewing zones, those of the left and right image input devices, which cover the observer's left and right eyes respectively with corresponding pixels from the two different equivalent screens. By determining the left and right viewing regions and their corresponding viewpoints and loading the information by the method described in embodiment 3 or 4, monocular multi-view display can be realized.
Example 6
Embodiment 1 describes, for any time point t+l×∆t (l a natural number), three-dimensional display by means of a one-dimensional grating whose light splitting makes the N (N a positive integer) pixel groups of the display screen (pixel group 1, pixel group 2, ..., pixel group n, ..., pixel group N) respectively visible in N viewing zones (viewing zone 1, viewing zone 2, ..., viewing zone n, ..., viewing zone N). The present embodiment, based on embodiment 1 (in which N = 9), shows that the resolution of the views received at the viewpoints can be increased by introducing time division multiplexing.
At time point t+l×∆t (l a natural number), the positions of the observer's two pupils are determined, the grating parameters of the one-dimensional grating and the inclination angle θ of the long direction of the grating units relative to the line connecting the two pupils are set by the method described in embodiment 1, and the viewpoints corresponding to the left and right viewing zones are determined. The viewing zones generated by the light splitting of the one-dimensional grating and the corresponding pixel groups are then numbered so that pixel group n (n a natural number, 1 ≤ n ≤ N) corresponds to viewing zone n, and each pixel is loaded with the light information relative to the viewpoint of its viewing zone. At a time point t+l×∆t+k×∆t/N between time point t+l×∆t and time point t+l×∆t+∆t, the grating is translated so that each pixel of pixel group n is visible in viewing zone n+k when n+k ≤ N and in viewing zone n+k-N when n+k > N, where k is a natural number and 1 ≤ k ≤ N-1. For example, at time point t+l×∆t+1×∆t/N the grating is translated so that pixel group 1 of the display screen is visible through the one-dimensional grating in viewing zone 2, pixel group 2 in viewing zone 3, pixel group 3 in viewing zone 4, pixel group 4 in viewing zone 5, pixel group 5 in viewing zone 6, pixel group 6 in viewing zone 7, pixel group 7 in viewing zone 8, pixel group 8 in viewing zone 9, and pixel group 9 in viewing zone 1. In this process the spatial distribution of the viewing zones formed by the light splitting of the one-dimensional grating does not change; only the pixel group corresponding to each viewing zone changes. In this way the spatial positions of the viewing zones of the left and right viewing-zone groups and their corresponding viewpoints remain unchanged while the corresponding pixels change. Each viewing zone of the left and right viewing-zone groups then corresponds to a new set of pixels, which is loaded with the view rendered from the viewpoint of that viewing zone, so monocular multi-view display is again realized. At the remaining (N-2) time points, spaced ∆t/N apart, the translation of the grating and the loading of the light information are carried out in the same way. When ∆t is sufficiently small, by persistence of vision the view received by each of the observer's eyes at each viewpoint is synthesized time-sequentially from all N pixel groups, and its resolution is increased (N-1)-fold relative to the case without time division multiplexing. In each subsequent ∆t period the above operations are performed in the same way.
Within each time division multiplexing period ∆t, among its N time points spaced ∆t/N apart, the operation may be omitted at some of the time points, and the operations assigned to different time points within the period may be interchanged.
This embodiment applies time division multiplexing on the basis of embodiment 1; the same operation can likewise be applied to embodiments 2 to 5 described above.
The time division multiplexing described in this embodiment may also be applied to other grating-based three-dimensional display technologies, for example existing multi-view three-dimensional display technologies that project corresponding two-dimensional views to the two eyes of an observer.
The above is only a preferred embodiment of the present invention; the design concept of the present invention is not limited thereto, and any insubstantial modification made using this design concept falls within the protection scope of the present invention.

Claims (6)

1. A three-dimensional display method based on a grating, characterized by comprising the following steps:
S1. at a time point t + l×∆t, determining the spatial positions of the observer's two pupils by position tracking, wherein l is a natural number;
S2. arranging a one-dimensional grating in front of a display screen presenting pixel light information, along the propagation direction of the light emitted by the display screen, the one-dimensional grating splitting the light emitted by the pixels of the display screen so that N groups of pixels are respectively visible in N viewing zones, wherein N is a positive integer;
S3. designing the grating parameters of the one-dimensional grating and the inclination angle θ of the lengthwise direction of the grating units relative to the line connecting the observer's two pupils, so that among the viewing zones formed by the light splitting of the one-dimensional grating there exists a left viewing-zone group, composed of two or more viewing zones and corresponding to the observer's left pupil, the light beams emitted by the pixels corresponding to each viewing zone in the left viewing-zone group being able to enter the left pupil, and there also exists a right viewing-zone group, composed of two or more viewing zones and corresponding to the observer's right pupil, the light beams emitted by the pixels corresponding to each viewing zone in the right viewing-zone group being able to enter the right pupil;
S4. taking a point within each viewing zone of the left viewing-zone group as the viewpoint corresponding to that viewing zone, and taking a point within each viewing zone of the right viewing-zone group as the viewpoint corresponding to that viewing zone;
S5. loading, onto each pixel of the display screen corresponding to the left and right viewing-zone groups, view information relative to the viewpoint in the viewing zone corresponding to that pixel;
S6. at the next time point t + l×∆t + ∆t, repeating steps S1 to S5.
2. The grating-based three-dimensional display method according to claim 1, wherein step S4 comprises: drawing a left line passing through the left pupil and a right line passing through the right pupil, named the left viewpoint positioning line and the right viewpoint positioning line respectively; taking, in each viewing zone of the left viewing-zone group, a point on the left viewpoint positioning line as the viewpoint corresponding to that viewing zone; taking, in each viewing zone of the right viewing-zone group, a point on the right viewpoint positioning line as the viewpoint corresponding to that viewing zone; and making the distance between the viewpoints corresponding to two adjacent viewing zones in the same viewing-zone group smaller than or equal to the diameter of the observer's pupil.
3. The grating-based three-dimensional display method according to any one of claims 1 to 2, wherein the light is split by the one-dimensional grating so that the N groups of pixels of the display screen are respectively visible in the N viewing zones, the three-dimensional display method further comprising a time division multiplexing step, the time division multiplexing step comprising:
P1. at the time point t + l×∆t, determining the spatial positions of the observer's pupils on the basis of step S1; adopting the grating parameters of the one-dimensional grating, the inclination angle θ of the lengthwise direction of the grating units relative to the line connecting the observer's two pupils, and the left and right viewing-zone groups with the viewpoints corresponding to their viewing zones, as determined in steps S3 and S4; numbering the N pixel groups and the N viewing zones so that pixel group n corresponds to viewing zone n; and loading, onto each pixel of the display screen corresponding to the left and right viewing-zone groups, view information relative to the viewpoint in the viewing zone corresponding to that pixel, wherein l and n are natural numbers and 1 ≤ n ≤ N;
P2. at a time point t + l×∆t + k×∆t/N between the time point t + l×∆t and the time point t + l×∆t + ∆t, translating the one-dimensional grating so that each pixel of pixel group n is visible in viewing zone n + k when n + k ≤ N, and in viewing zone n + k - N when n + k > N, and loading, onto each pixel of the display screen corresponding to the left and right viewing-zone groups, view information relative to the viewpoint in the viewing zone currently corresponding to that pixel, wherein k is a natural number and 1 ≤ k ≤ N - 1;
P3. performing step P2 at some or all of the N - 1 time points, spaced ∆t/N apart, between the time point t + l×∆t and the time point t + l×∆t + ∆t.
4. A three-dimensional display method based on a grating, characterized by comprising the following steps:
SS1. using, as the image input device, a display device capable of projecting different images respectively to the two eyes of an observer, the display device comprising a display screen presenting pixel light information;
SS2. at a time point t + l×∆t, determining the spatial positions of the observer's two pupils by position tracking, wherein l is a natural number;
SS3. arranging a one-dimensional grating in front of the display screen along the propagation direction of the light emitted by the display screen, the one-dimensional grating splitting the light beams emitted by the pixels of the display screen so that N groups of pixels of the display screen are respectively visible in N viewing zones, wherein N is a positive integer;
SS4. designing the grating parameters of the one-dimensional grating and the inclination angle θ of the lengthwise direction of the grating units relative to the line connecting the observer's two pupils, and combining the ability of the display device to project different images respectively to the observer's two eyes, so that among the viewing zones formed by the light splitting of the one-dimensional grating there exists a left viewing-zone group, composed of two or more viewing zones and corresponding to the observer's left pupil, the light beams emitted by the pixels corresponding to each viewing zone in the left viewing-zone group being able to enter the left pupil, and there also exists a right viewing-zone group, composed of two or more viewing zones and corresponding to the observer's right pupil, the light beams emitted by the pixels corresponding to each viewing zone in the right viewing-zone group being able to enter the right pupil;
SS5. taking a point within each viewing zone of the left viewing-zone group as the viewpoint corresponding to that viewing zone, and taking a point within each viewing zone of the right viewing-zone group as the viewpoint corresponding to that viewing zone;
SS6. loading, onto each pixel of the display screen corresponding to the left and right viewing-zone groups, view information relative to the viewpoint in the viewing zone corresponding to that pixel;
SS7. at the next time point t + l×∆t + ∆t, repeating steps SS2 to SS6.
5. The grating-based three-dimensional display method according to claim 4, wherein step SS5 comprises: drawing a left line passing through the left pupil and a right line passing through the right pupil, named the left viewpoint positioning line and the right viewpoint positioning line respectively; taking, in each viewing zone of the left viewing-zone group, a point on the left viewpoint positioning line as the viewpoint corresponding to that viewing zone; taking, in each viewing zone of the right viewing-zone group, a point on the right viewpoint positioning line as the viewpoint corresponding to that viewing zone; and making the distance between the viewpoints corresponding to two adjacent viewing zones in the same viewing-zone group smaller than or equal to the diameter of the observer's pupil.
6. The grating-based three-dimensional display method according to any one of claims 4 to 5, wherein the light is split by the one-dimensional grating so that the N groups of pixels of the display screen are respectively visible in the N viewing zones, the three-dimensional display method further comprising a time division multiplexing step, the time division multiplexing step comprising:
P1. at the time point t + l×∆t, determining the spatial positions of the observer's pupils on the basis of step SS2; adopting the grating parameters of the one-dimensional grating, the inclination angle θ of the lengthwise direction of the grating units relative to the line connecting the observer's two pupils, and the left and right viewing-zone groups with the viewpoints corresponding to their viewing zones, as determined in steps SS4 and SS5; numbering the N pixel groups and the N viewing zones so that pixel group n corresponds to viewing zone n; and loading, onto each pixel of the display screen corresponding to the left and right viewing-zone groups, view information relative to the viewpoint in the viewing zone corresponding to that pixel, wherein l and n are natural numbers and 1 ≤ n ≤ N;
P2. at a time point t + l×∆t + k×∆t/N between the time point t + l×∆t and the time point t + l×∆t + ∆t, translating the one-dimensional grating so that each pixel of pixel group n is visible in viewing zone n + k when n + k ≤ N, and in viewing zone n + k - N when n + k > N, and loading, onto each pixel of the display screen corresponding to the left and right viewing-zone groups, view information relative to the viewpoint in the viewing zone currently corresponding to that pixel, wherein k is a natural number and 1 ≤ k ≤ N - 1;
P3. performing step P2 at some or all of the N - 1 time points, spaced ∆t/N apart, between the time point t + l×∆t and the time point t + l×∆t + ∆t.
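The viewpoint-placement rule of claims 2 and 5 (viewpoints of a viewing-zone group taken on a viewpoint positioning line through the corresponding pupil, with adjacent viewpoints no farther apart than the pupil diameter) can be checked with a minimal sketch in Python; it is an illustration only, and the coordinate values, the 4 mm pupil diameter and the function names are assumptions rather than part of the claims.

    # Hedged sketch of the viewpoint-placement rule of claims 2 and 5. All
    # numeric values below are illustrative assumptions (metres).
    import numpy as np

    PUPIL_DIAMETER = 4e-3    # assumed typical pupil diameter, ~4 mm

    def place_viewpoints(pupil_center, line_direction, n_zones, spacing):
        """Place n_zones viewpoints on a viewpoint positioning line through the
        pupil centre, centred on the pupil and spaced `spacing` apart."""
        d = np.asarray(line_direction, dtype=float)
        d /= np.linalg.norm(d)
        offsets = (np.arange(n_zones) - (n_zones - 1) / 2) * spacing
        return [np.asarray(pupil_center, dtype=float) + o * d for o in offsets]

    def spacing_ok(viewpoints, pupil_diameter=PUPIL_DIAMETER):
        """Claim 2/5 condition: adjacent viewpoints of one viewing-zone group
        must be no farther apart than the pupil diameter."""
        gaps = [np.linalg.norm(b - a) for a, b in zip(viewpoints, viewpoints[1:])]
        return all(g <= pupil_diameter + 1e-12 for g in gaps)

    # Example: two viewpoints per pupil (monocular multi-view), 3 mm apart,
    # on a horizontal positioning line through an assumed left-pupil position.
    left_viewpoints = place_viewpoints(pupil_center=(-0.032, 0.0, 0.5),
                                       line_direction=(1.0, 0.0, 0.0),
                                       n_zones=2, spacing=3e-3)
    print(spacing_ok(left_viewpoints))            # True: 3 mm <= 4 mm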
CN201810031064.6A 2018-01-12 2018-01-12 Three-dimensional display method based on grating Active CN110035274B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201810031064.6A CN110035274B (en) 2018-01-12 2018-01-12 Three-dimensional display method based on grating
PCT/CN2019/070029 WO2019137272A1 (en) 2018-01-12 2019-01-02 Grating based three-dimentional display method for presenting more than one views to each pupil
US16/479,926 US11012673B2 (en) 2018-01-12 2019-01-02 Grating based three-dimentional display method for presenting more than one views to each pupil

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810031064.6A CN110035274B (en) 2018-01-12 2018-01-12 Three-dimensional display method based on grating

Publications (2)

Publication Number Publication Date
CN110035274A (en) 2019-07-19
CN110035274B (en) 2020-10-16

Family

ID=67234845

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810031064.6A Active CN110035274B (en) 2018-01-12 2018-01-12 Three-dimensional display method based on grating

Country Status (1)

Country Link
CN (1) CN110035274B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112748585B (en) * 2019-10-30 2022-07-19 驻景(广州)科技有限公司 Small-distance visual area guiding type three-dimensional display system and method
CN113495366B (en) * 2020-04-03 2022-05-17 驻景(广州)科技有限公司 Three-dimensional display method based on sub-pixel emergent light space superposition
CN113495365B (en) * 2020-04-03 2022-09-30 驻景(广州)科技有限公司 Monocular multiview display method using sub-pixel as display unit
CN113835235B (en) * 2020-06-24 2023-12-15 中山大学 Multi-user-oriented three-dimensional display system based on entrance pupil division multiplexing
WO2022036691A1 (en) * 2020-08-21 2022-02-24 深圳市立体通科技有限公司 Naked-eye 3d display method and smart terminal
WO2022036692A1 (en) * 2020-08-21 2022-02-24 深圳市立体通科技有限公司 Naked eye 3d display method and intelligent terminal
CN113395510B (en) * 2021-05-21 2023-04-07 深圳英伦科技股份有限公司 Three-dimensional display method and system, computer-readable storage medium, and program product
CN113687523B (en) * 2021-08-05 2022-05-17 中山大学 Naked eye light field display method based on asymmetric distribution of projection light

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101562756A (en) * 2009-05-07 2009-10-21 昆山龙腾光电有限公司 Stereo display device as well as display method and stereo display jointing wall thereof
CN103558690A (en) * 2013-10-30 2014-02-05 青岛海信电器股份有限公司 Grating type stereoscopic display device, signal processing method and image processing device
CN106526878A (en) * 2016-12-08 2017-03-22 南京大学 Multidimensional free stereoscopic display device
CN206260048U (en) * 2016-09-28 2017-06-16 擎中科技(上海)有限公司 A kind of bore hole 3D display devices

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130135588A1 (en) * 2011-11-29 2013-05-30 Milan Momcilo Popovich 3D display apparatus

Also Published As

Publication number Publication date
CN110035274A (en) 2019-07-19

Similar Documents

Publication Publication Date Title
CN110035274B (en) Three-dimensional display method based on grating
US8547422B2 (en) Multi-user autostereoscopic display
JP4607208B2 (en) 3D display method
US6252707B1 (en) Systems for three-dimensional viewing and projection
JP2007503606A (en) Autostereoscopic multi-user display
US6788274B2 (en) Apparatus and method for displaying stereoscopic images
JP2011034086A (en) Three-dimensional video display method, system thereof and recording medium with three-dimensional video display program recorded therein
WO2011086874A1 (en) Display device and display method
CN104503093B (en) Light complementary splicing technology for generating space gradual transition view and three-dimensional display system based on light complementary splicing technology
CN102566250B (en) A kind of optical projection system of naked-eye auto-stereoscopic display and display
KR101309313B1 (en) 3-dimension display device using devided screen
JP4213210B2 (en) 3D observation and projection system
EP0830630B1 (en) Stereoscopic display device
JP7335233B2 (en) A system and method for displaying two-viewpoint autostereoscopic images on an N-viewpoint autostereoscopic display screen and a method for controlling the display on such a display screen
CN114981711B (en) Display device and driving method thereof
JP2001218231A (en) Device and method for displaying stereoscopic image
KR101544841B1 (en) Autostereoscopic multi-view 3d display system with triple segmented-slanted parallax barrier
JPH08314034A (en) Stereoscopic picture displaying method and device therefor
JP2004258594A (en) Three-dimensional image display device realizing appreciation from wide angle
US9217875B1 (en) Multi-view auto-stereoscopic display and angle magnifying screen thereof
KR101272015B1 (en) Three dimensional image display device
KR101831748B1 (en) Autostereoscopic 3d display apparatus
KR20170122687A (en) Non optical plate multiview 2d/3d conversion parallax lighting system
JP4893821B2 (en) Image display device
Brar et al. Helium3D: a laser-based 3D display with '3D+' capability

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right
Effective date of registration: 20211019
Address after: Room c053, 3rd floor, No. 18, GUANGTANG West Road, Tianhe District, Guangzhou City, Guangdong Province (office only)
Patentee after: Park view (Guangzhou) Technology Co., Ltd
Address before: 510275 No. 135 West Xingang Road, Guangdong, Guangzhou
Patentee before: SUN YAT-SEN University