CN108769664B - Naked eye 3D display method, device, equipment and medium based on human eye tracking - Google Patents


Info

Publication number
CN108769664B
CN108769664B (application CN201810521797.8A)
Authority
CN
China
Prior art keywords
phase
eye
viewpoint
user
current
Prior art date
Legal status
Active
Application number
CN201810521797.8A
Other languages
Chinese (zh)
Other versions
CN108769664A (en
Inventor
陈佳搏
Current Assignee
Zhangjiagang Kangdexin Optronics Material Co Ltd
Original Assignee
Zhangjiagang Kangdexin Optronics Material Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhangjiagang Kangdexin Optronics Material Co Ltd filed Critical Zhangjiagang Kangdexin Optronics Material Co Ltd
Priority to CN201810521797.8A priority Critical patent/CN108769664B/en
Publication of CN108769664A publication Critical patent/CN108769664A/en
Application granted granted Critical
Publication of CN108769664B publication Critical patent/CN108769664B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; face representation
    • G06V40/171 Local features and components; facial parts; occluding parts, e.g. glasses; geometrical relationships
    • G06V40/18 Eye characteristics, e.g. of the iris

Abstract

The embodiments of the invention disclose a naked eye 3D display method, device, equipment and medium based on human eye tracking. The method comprises the following steps: determining an initial phase of each viewpoint according to the number of viewpoints to be arranged; determining feature information of the original visible area within each viewing angle according to a preset arrangement of the viewpoint images to be arranged, wherein the original visible area is the display area in which the screen can present a naked eye 3D display effect when human eye tracking is not performed; and adjusting the phase of the viewpoint image corresponding to the initial phase according to the feature information and the current phase of the user's two eyes relative to the screen. With this technical scheme, the viewpoint image content moves correspondingly as the position of the human eyes moves, so that the left and right eyes of every user watch the correct viewpoint image content, image aliasing and inversion are avoided, and the user's viewing effect and viewing experience are improved.

Description

Naked eye 3D display method, device, equipment and medium based on human eye tracking
Technical Field
The embodiments of the invention relate to naked eye 3D technology, and in particular to a naked eye 3D display method, device, equipment and medium based on human eye tracking.
Background
Naked-eye 3D displays are widely applied in fields such as advertising, media, demonstration teaching, exhibitions, and film and television. Unlike traditional binocular 3D display technology, naked eye 3D display has the unique characteristic that viewers can see the 3D effect without wearing glasses or helmets. Its vivid depth of field and stereoscopic impression greatly improve the visual impact and sense of immersion of the viewing experience, making it an ideal display product for product promotion, public publicity and image playback.
The principle of naked eye 3D display is that the image shown on the display is split by a lens: through the refraction of light, the lens directs different display contents to different places in space, so that the contents are separated by the time they reach the human eyes. The two eyes thus receive two images with parallax, producing a stereoscopic effect. If the user looks from outside the visible area, image inversion may occur. At present, a multi-viewpoint mode can be adopted to enlarge the visible area, but at a given angular resolution the multi-viewpoint mode reduces definition, and aliasing of the image affects the actual viewing effect.
Disclosure of Invention
The embodiments of the invention provide a naked eye 3D display method, device, equipment and medium based on human eye tracking, which solve the technical problem that inversion and image aliasing easily occur during naked eye 3D display.
In a first aspect, an embodiment of the present invention provides a naked eye 3D display method based on human eye tracking. The method includes: determining an initial phase of each viewpoint according to the number of viewpoints to be arranged;
determining feature information of the original visible area within each viewing angle according to a preset arrangement of the viewpoint images to be arranged, wherein the original visible area is the display area in which the screen can present a naked eye 3D display effect when human eye tracking is not performed;
and adjusting the phase of the viewpoint image corresponding to the initial phase according to the feature information and the current phase of the user's two eyes relative to the screen.
In a second aspect, an embodiment of the present invention further provides a naked eye 3D display device based on human eye tracking. The device includes:
an initial phase determining module, configured to determine the initial phase of each viewpoint according to the number of viewpoints to be arranged;
a feature information determining module, configured to determine feature information of the original visible area within each viewing angle according to a preset arrangement of the viewpoint images to be arranged, wherein the original visible area is the display area in which the screen can present a naked eye 3D display effect when human eye tracking is not performed;
and a phase adjusting module, configured to adjust the phase of the viewpoint image corresponding to the initial phase according to the feature information and the current phase of the user's eyes relative to the screen.
In a third aspect, an embodiment of the present invention further provides an apparatus that includes one or more processors
and a storage device for storing one or more programs
which, when executed by the one or more processors, cause the one or more processors to implement the naked eye 3D display method based on human eye tracking provided by any embodiment of the invention.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program implements the naked eye 3D display method based on human eye tracking according to any embodiment of the present invention.
According to the technical scheme provided by the embodiments of the invention, the initial phase of each viewpoint can be determined from the number of viewpoints to be arranged, and the feature information of the original visible area within each viewing angle can be determined from the preset arrangement of the viewpoint images. The current phase of the human eyes relative to the screen is acquired in real time, and the phase of the displayed viewpoint image content is adjusted according to the current phase and the feature information. The viewpoint image content therefore moves correspondingly as the human eyes move, so that the left and right eyes of every user watch the correct viewpoint image content, image aliasing and inversion are avoided, and the user's viewing effect and viewing experience are improved.
Drawings
Fig. 1 is a flowchart of a naked eye 3D display method based on human eye tracking according to an embodiment of the present invention;
fig. 2a is a schematic view of multi-viewpoint optical design in a naked-eye 3D display method according to an embodiment of the present invention;
fig. 2b is a schematic view illustrating a discontinuous viewing region within a viewing angle according to an embodiment of the present invention;
fig. 2c is a schematic view illustrating a continuous viewing region within a viewing angle according to an embodiment of the present invention;
fig. 2D is a schematic diagram of a phase relationship between a position of a human eye and a screen in the naked-eye 3D display method according to the first embodiment of the present invention;
fig. 3 is a flowchart of a naked eye 3D display method based on human eye tracking according to a second embodiment of the present invention;
fig. 4 is a flowchart of a naked eye 3D display method based on human eye tracking according to a third embodiment of the present invention;
fig. 5 is a flowchart of a naked eye 3D display method based on human eye tracking according to a fourth embodiment of the present invention;
fig. 6 is a schematic diagram illustrating allocation of multi-view channels and view maps in a naked eye 3D display method according to a fourth embodiment of the present invention;
fig. 7 is a flowchart of a naked eye 3D display method based on human eye tracking according to a fifth embodiment of the present invention;
fig. 8 is a schematic structural diagram of a naked eye 3D display device based on human eye tracking according to a sixth embodiment of the present invention;
fig. 9 is a schematic structural diagram of an apparatus according to a seventh embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
For the purpose of describing the embodiments of the present invention clearly and completely, the implementation principle of the invention is first briefly introduced:
To realize the naked eye 3D display effect, a grating film is arranged in front of the liquid crystal screen of the naked eye 3D display. The image pixels beneath the grating film are divided into R, G and B sub-pixels; after passing through the grating film, each sub-pixel is projected in a different direction, so that different display contents are refracted to different places in space and viewers watch different views from different directions. For each user, the left eye and right eye each see an appropriate image, forming binocular parallax, producing a sense of depth and space, and creating the naked eye 3D display experience.
After the light is split by the grating film, multiple viewing angles exist. Within each viewing angle there are multiple regions in which the viewpoints are arranged periodically. Within a region, if adjacent views overlap effectively, no aliasing or inversion of the image occurs, and the region can be regarded as a visible area. Once the viewpoint images in the visible area are arranged according to the preset arrangement, the user can watch an ideal 3D display effect.
Since the left-right view format transmits the left and right views side by side, the left view seen by the left eye and the right view seen by the right eye need to be adjusted as the viewer's position changes. If the arrangement of the viewpoint images does not change with the positions of the user's eyes, the eyes may move from the corresponding visible area into an invisible area, where the image exhibits aliasing or inversion; the user then cannot watch the 3D display effect and may even experience symptoms such as dizziness. The technical scheme provided by the embodiments of the invention therefore adopts human eye tracking: the current phases of the user's two eyes relative to the screen are acquired in real time, and the phase of each viewpoint image within each viewing angle is adjusted in combination with the feature information of the visible area, so that the content of each viewpoint image moves correspondingly as the eyes move and every user sees the correct viewpoint content matching his or her eye position.
Example one
Fig. 1 is a flowchart of a naked eye 3D display method based on human eye tracking according to an embodiment of the present invention, where the method may be performed by a naked eye 3D display device based on human eye tracking, and the device may be implemented by software and/or hardware. Referring to fig. 1, the method of the present embodiment specifically includes:
and S110, determining the initial phase of each viewpoint according to the number of the viewpoints to be mapped.
For naked eye 3D there generally exists an optimal viewing distance: when the vertical distance from the screen equals the optimal viewing distance, the projection positions of the sub-pixels separated by the grating film match the positions of the human eyes, so the viewer's left and right eyes each see the appropriate corresponding image, forming binocular parallax and producing a sense of depth and space.
Fig. 2a is a schematic view of the multi-viewpoint optical design in the naked eye 3D display method according to the first embodiment of the present invention. As shown in fig. 2a, OVD denotes the optimal viewing distance and D_OVD denotes the width of the central viewing angle. In this embodiment, the central visible zone can be designed at the optimal viewing distance, i.e. the central viewing line segment is assigned the phase range [0, 1], and the initial phase of each viewpoint image, together with the viewpoint image corresponding to each initial phase, is determined according to the set number of viewpoints. Generally the viewpoints are uniformly distributed, so the phase range corresponding to each viewpoint can be determined: the viewpoint ranges are contiguous, equal, and cover the whole phase range [0, 1]. Taking five viewpoint images as an example, viewpoint images 1, 2, 3, 4 and 5 correspond to the phase ranges {[0, 0.2), [0.2, 0.4), [0.4, 0.6), [0.6, 0.8), [0.8, 1)}.
Correspondingly, K viewpoint images are rendered, where K ≥ 2 and K is less than the number of viewpoints in the optical design, and the phase range corresponding to each viewpoint image is determined. Typically the viewpoint ranges are contiguous and equal, covering the entire phase range [0, 1], i.e. viewpoint image k (k = 1, …, K) corresponds to the phase range [(k − 1)/K, k/K). Taking K = 5 as an example, viewpoint images 1, 2, 3, 4 and 5 correspond to the phase ranges {[0, 0.2), [0.2, 0.4), [0.4, 0.6), [0.6, 0.8), [0.8, 1)}.
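The uniform viewpoint-to-phase mapping described above can be sketched in a few lines. The function names below are illustrative, not from the patent; the logic simply partitions the central-view phase range [0, 1) into K equal, contiguous sub-ranges.

```python
def viewpoint_phase_ranges(k: int) -> list:
    """Return the half-open phase range [(i-1)/K, i/K) for viewpoints 1..K."""
    return [(i / k, (i + 1) / k) for i in range(k)]

def viewpoint_for_phase(phase: float, k: int) -> int:
    """Return the 1-based index of the viewpoint whose range contains `phase`."""
    # Clamp so that phase == 1.0 still maps to the last viewpoint.
    return min(int(phase * k), k - 1) + 1

ranges = viewpoint_phase_ranges(5)
print(ranges)   # [(0.0, 0.2), (0.2, 0.4), (0.4, 0.6), (0.6, 0.8), (0.8, 1.0)]
print(viewpoint_for_phase(0.45, 5))  # 3
```

With K = 5 this reproduces exactly the five phase ranges listed in the text.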
S120, determining the feature information of the original visible area within each viewing angle according to a preset arrangement of the viewpoint images to be arranged.
The original visible area is the display area in which the screen can present a naked eye 3D display effect when human eye tracking is not performed; within this area, no image aliasing or inversion occurs between adjacent viewpoints. The preset arrangement is the initial arrangement of the viewpoint images within each viewing angle, preset when human eye tracking is not performed. Specifically, multiple original visible areas may exist within each viewing angle, with the viewpoint images arranged periodically; the preset arrangement is the same for every period. For convenience of calculation, the technical scheme of this embodiment preferably maps the phases of the viewpoints in the respective viewing angles into the central viewing angle, and subsequent calculations are preferably performed on the basis of the central viewing angle.
Illustratively, the feature information of the original visible area includes its continuity feature, its starting phase feature and its size feature. After the phases of the viewpoint images corresponding to the initial phases are adjusted according to the positions of the user's eyes, the feature information of the original visible area does not change; for example, if the original visible area is continuous, the visible area formed by the phase-adjusted viewpoint images is still continuous.
Illustratively, the continuity feature of the original visible area is determined as follows: according to the preset arrangement of the viewpoint images, within each viewing angle, if no invisible area exists between any two adjacent original visible areas, the adjacent original visible areas are continuous within that viewing angle, where an invisible area is a region in which image aliasing or inversion occurs between adjacent viewpoints.
For example, fig. 2b is a schematic diagram of a discontinuous visible region within a viewing angle according to an embodiment of the present invention. As shown in fig. 2b, for a preset arrangement of two viewpoints, since there are only two viewpoints V1 and V2 within each viewing angle, the parallax between adjacent viewpoints is large and generally higher than a set threshold. Therefore, within one viewing angle, the edge positions of the two viewpoints are all invisible regions (1, 2 and 3 in fig. 2b), so the original visible region is discontinuous.
For example, fig. 2c is a schematic diagram of a visible region within a viewing angle according to an embodiment of the present invention. As shown in fig. 2c, for an arrangement with multiple viewpoints (five viewpoints V1-V5 in fig. 2c), the parallax between adjacent viewpoints is smaller than the set threshold. Apart from the edge positions of the viewing angle, which are invisible regions because of the larger parallax there (4 and 5 in fig. 2c), the central region transitions smoothly thanks to the sufficient number of viewpoints and is a continuous visible region.
S130, adjusting the phase of the viewpoint image corresponding to the initial phase according to the feature information and the current phase of the user's two eyes relative to the screen.
Illustratively, the current phase of the user's eyes relative to the screen may be calculated by obtaining the spatial position of the user's eyes within the viewing area of the screen.
Optionally, an image containing the human face may be obtained by a camera arranged on the display device and facing the viewing area of the screen. The face in the image is recognized, and the spatial positions of the viewer's left and right eyes are determined from it, for example the perpendicular distance from the screen and the distance from the vertical line through the center of the screen. In addition, an infrared device can be used to assist ranging and obtain a more accurate spatial position of the eyes.
Preferably, images containing the face are captured by the camera periodically, and the spatial position of the eyes is determined from the multiple face images, so as to avoid spatial-position deviations caused by an accidental sway of the viewer.
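The patent does not specify how the multiple face images are combined into one eye position; a median filter over a short window is one robust choice that suppresses exactly the kind of accidental-sway outlier mentioned above. The following is only a sketch under that assumption, with illustrative names and units.

```python
from collections import deque
from statistics import median

class EyePositionFilter:
    """Keep the last `window` eye-position samples and report their median,
    suppressing outlier frames caused by an accidental sway of the viewer."""

    def __init__(self, window: int = 5):
        self.xs = deque(maxlen=window)  # offset from the screen's vertical center line
        self.zs = deque(maxlen=window)  # perpendicular distance from the screen plane

    def update(self, x_mm: float, z_mm: float):
        self.xs.append(x_mm)
        self.zs.append(z_mm)
        return median(self.xs), median(self.zs)

f = EyePositionFilter(window=5)
for sample in [(10, 600), (12, 602), (11, 601), (300, 900), (12, 603)]:
    smoothed = f.update(*sample)
print(smoothed)  # (12, 602): the single outlier frame is suppressed
```

A plain mean would be pulled toward the outlier frame; the median ignores it as long as outliers are a minority of the window.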
Specifically, fig. 2d is a schematic diagram of the phase relationship between the eye position and the screen in the naked eye 3D display method according to the first embodiment of the present invention. As shown in fig. 2d, the current phase of the eye position relative to the screen can be calculated according to three formulas (reproduced only as images in the original patent) in which:
f is the distance between the eye position and the vertical center line of the screen; VD is the distance between the eye position and the screen; OVD is the optimal viewing distance from the screen; x is the distance between a pixel point on the screen and the center of the screen; D_OVD is the width of the central viewing angle; p_g denotes the phase of a single eye; t is the distance between the edge position of the visible area and the vertical center line of the screen; and p is the phase of the eye position relative to the screen.
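The three formulas themselves appear only as images in the original patent and are not reproduced here. Purely as an illustration of the kind of geometry involved (and not the patent's actual formulas), the sketch below projects the eye position onto the optimal-viewing-distance plane by similar triangles and takes the fractional position within one viewing-zone period as a phase; every name and the formula itself are assumptions.

```python
def eye_phase(f_mm: float, vd_mm: float, ovd_mm: float,
              x_mm: float, d_ovd_mm: float):
    """Illustrative only: intersect the ray from screen point x through the
    eye at (f, VD) with the plane at distance OVD, then take the fractional
    position within one viewing-zone period d_ovd as the phase."""
    # Similar triangles: where the ray (x -> eye) crosses the OVD plane.
    t = x_mm + (f_mm - x_mm) * ovd_mm / vd_mm
    # Fractional position within the zone width, wrapped into [0, 1).
    p = (t / d_ovd_mm) % 1.0
    return t, p

# An eye exactly at the optimal viewing distance, zone width 65 mm:
t, p = eye_phase(f_mm=40.0, vd_mm=700.0, ovd_mm=700.0, x_mm=0.0, d_ovd_mm=65.0)
```

When VD equals OVD the projection is the eye offset itself (t = f), which is the sanity check used above.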
After the current phases of the user's two eyes are determined, the phase of the viewpoint image corresponding to the initial phase can be adjusted according to the feature information of the original visible area and the current phases.
For example, when the phase of the viewpoint image corresponding to the initial phase is adjusted, the adjustment may be performed according to the continuity characteristic in the original visual area characteristic information.
Specifically, if the original visible area is discontinuous (this embodiment takes two viewpoints as an example), then when no human eye tracking is performed, the original visible area of the left viewpoint starts at phase S1 (0 < S1 < 0.5) and the original visible area of the right viewpoint starts at phase S2 (0 < S2 < 0.5); the sizes of the two areas are given by expressions reproduced only as images in the original patent.
Since the feature information of the original visible area does not change as the eye position moves, once the current left-eye phase and right-eye phase of the user are determined, the left-eye target visible-area phase corresponding to the current left-eye phase and the right-eye target visible-area phase corresponding to the current right-eye phase can be determined from the starting phases and sizes of the left-viewpoint and right-viewpoint original visible areas. The left-eye and right-eye target visible-area phases are the target intervals into which the phases of the viewpoint images corresponding to the initial phases are adjusted as the spatial positions of the viewer's eyes change. Moving the viewpoint images corresponding to the initial phases into the left-eye and right-eye target visible areas ensures that the content seen by the viewer's left and right eyes does not change as the positions of the two eyes change.
Specifically, if the original visible area is continuous, then when adjusting the phase of the viewpoint image corresponding to the initial phase, the relative phase relationship between the left-eye phase and the right-eye phase can be calculated for each user from the current phases, and the maximum and minimum values satisfying each relative phase relationship are determined from the users' current left-eye and right-eye phases. The phase of the viewpoint image corresponding to the initial phase is then adjusted according to the relative phase relationship, the feature information, the maximum value and the minimum value. Since the technical scheme of this embodiment involves eye tracking of multiple users, the arrangement can classify users according to the relative phase relationship between their left-eye and right-eye phases, for example all users whose left-eye phase lies to the left of the right-eye phase, or all users whose left-eye phase lies to the right of the right-eye phase. In the process of adjusting the initial phases, the various cases of whether the positions of a user's two eyes fall within a visible area can be treated case by case, and the viewpoint images corresponding to the initial phases of users satisfying the same relative phase relationship are phase-adjusted in the same way, so that as far as possible every user can watch the naked eye 3D effect at his or her current position, improving the viewing experience.
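The classification step above can be sketched as follows: group users by the relative order of their left-eye and right-eye phases, then take the extreme phases within each group, which are the maxima and minima fed into the adjustment. The function and key names are illustrative, not from the patent.

```python
def group_and_extremes(users):
    """users: list of (left_phase, right_phase) pairs in [0, 1).
    Split users by the relative order of their eye phases, then return the
    min/max left- and right-eye phases within each non-empty group."""
    groups = {"left_of_right": [], "right_of_left": []}
    for l, r in users:
        key = "left_of_right" if l < r else "right_of_left"
        groups[key].append((l, r))
    extremes = {}
    for key, members in groups.items():
        if members:
            ls = [l for l, _ in members]
            rs = [r for _, r in members]
            extremes[key] = {"min_l": min(ls), "max_l": max(ls),
                             "min_r": min(rs), "max_r": max(rs)}
    return extremes

# Two users with the left eye phase left of the right, one wrapped-around user:
ext = group_and_extremes([(0.30, 0.50), (0.35, 0.55), (0.80, 0.10)])
print(ext["left_of_right"]["max_l"])  # 0.35
```

Users in the same group can then share one phase adjustment, as the text describes.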
According to the naked eye 3D display method based on human eye tracking provided by this embodiment of the invention, the initial phase of each viewpoint is determined according to the number of viewpoints to be arranged, and the feature information of the original visible area within each viewing angle is determined according to the preset arrangement of the viewpoint images. The current phase of the human eyes relative to the screen is acquired in real time, and the phase of the displayed viewpoint image content is adjusted according to the current phase and the feature information, so that the viewpoint image content moves correspondingly as the eyes move, the left and right eyes of every user watch the correct viewpoint image content, image aliasing and inversion are avoided, and the user's viewing effect and viewing experience are improved.
Example two
Fig. 3 is a flowchart of a naked eye 3D display method based on human eye tracking according to a second embodiment of the present invention. On the basis of the above embodiment, this embodiment optimizes the case in which the original visible area is discontinuous when two viewpoints are arranged; explanations of terms that are the same as or correspond to those of the first embodiment are not repeated here. Referring to fig. 3, this embodiment provides a naked eye 3D display method based on human eye tracking, including:
S210, determining the initial phase of each viewpoint according to the number of viewpoints to be arranged.
Illustratively, the viewpoints to be arranged may include a left viewpoint and a right viewpoint.
S220, determining the feature information of the original visible area within each viewing angle according to a preset arrangement of the viewpoint images to be arranged.
The original visible area is the display area in which the screen can present a naked eye 3D display effect when human eye tracking is not performed.
Illustratively, when the viewpoints to be arranged are a left viewpoint and a right viewpoint, the original visible area includes a left-viewpoint original visible area and a right-viewpoint original visible area.
S230, if the original visible areas are discontinuous, determining the left-eye target visible-area phase corresponding to the current left-eye phase and the right-eye target visible-area phase corresponding to the current right-eye phase, according to the current left-eye and right-eye phases of the different users and the feature information of the left-viewpoint and right-viewpoint original visible areas.
Because the technical scheme provided by this embodiment of the invention arranges multiple viewpoints for multiple users, in the actual arrangement process the left and right eyes of all users need to correspond to the range of a target visible area as far as possible.
Illustratively, let the current left-eye phase of the first user be p_1L and the current right-eye phase be p_1R, and let the current left-eye phase of the N-th user (N ≥ 2) be p_NL and the current right-eye phase be p_NR. According to the size of the left-viewpoint original visible area, the left-eye target visible-area phase interval φ_1L corresponding to the first user's current left-eye phase and the right-eye target interval φ_1R corresponding to the current right-eye phase can be determined, and likewise the intervals φ_NL and φ_NR for the N-th user; the exact interval expressions are reproduced only as images in the original patent.
S240, calculating a first intersection of all the left-eye target visible-area phases and a second intersection of all the right-eye target visible-area phases.
After the target visible-area phases of every user's two eyes are determined, the intersection of all users' left-eye target visible-area phases, φ_L = φ_1L ∩ φ_2L ∩ … ∩ φ_NL, and the intersection of all users' right-eye target visible-area phases, φ_R = φ_1R ∩ φ_2R ∩ … ∩ φ_NR, are calculated. In this way, the common part of all users' target visible areas is obtained, so that moving the viewpoint images corresponding to the initial phases into this region allows all users to observe the naked eye 3D display effect.
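Intersecting the per-user target intervals reduces to taking the largest lower bound and the smallest upper bound; an empty result means no placement works for everyone. A minimal sketch (names illustrative):

```python
def intersect(intervals):
    """Intersect closed phase intervals given as (lo, hi) pairs.
    Returns the common interval, or None if the intersection is empty."""
    lo = max(i[0] for i in intervals)
    hi = min(i[1] for i in intervals)
    return (lo, hi) if lo <= hi else None

# Left-eye target visible-area intervals of three users:
phi_l = intersect([(0.10, 0.40), (0.15, 0.45), (0.05, 0.35)])
print(phi_l)       # (0.15, 0.35)

phi_empty = intersect([(0.10, 0.20), (0.30, 0.40)])
print(phi_empty)   # None: not every user can be placed in a visible area
```

The `None` case corresponds to the empty-set situation handled later, where a primary user is tracked preferentially.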
S250, if the first intersection and the second intersection are both non-empty sets, adjusting the phase of the viewpoint image corresponding to the initial phase according to the relationship between the maximum and minimum values in the first intersection and the maximum and minimum values in the second intersection.
Illustratively, the phase of the viewpoint image corresponding to the initial phase is adjusted according to formulas (reproduced only as images in the original patent) in which:
K is the number of viewpoints to be arranged, and k is any one of the K viewpoints; max(φ_L) and min(φ_L) denote the maximum and minimum values in the first intersection; max(φ_R) and min(φ_R) denote the maximum and minimum values in the second intersection; S1 denotes the starting phase of the left-eye original visible area; S2 denotes the starting phase of the right-eye original visible area; one image expression denotes the size of the original visible area corresponding to the left or right eye, and another denotes the phase range corresponding to any one viewpoint; Φ is the phase adjustment amplitude corresponding to any viewpoint; and the final expression denotes the adjusted phase range of the viewpoint image corresponding to any initial phase, after the different viewing angles are mapped to the phase range [0, 1] of the central viewing angle.
By moving the viewpoint image corresponding to the initial phase into the adjusted phase range, the viewpoint image corresponding to the initial phase correspondingly moves along with the movement of the human eye position, and the user can always watch the correct viewpoint image content.
It should be noted that if φ_L or φ_R is an empty set, not all users can be within the visual area at the same time. In this case, the users may be ranked by importance: the user closest to the screen, or the user closest to the center of the screen within the optimal viewing range, is taken as the third target user, and the other users are treated as secondary users. During eye tracking, the third target user is tracked preferentially, and the phase of the viewpoint image corresponding to the initial phase is adjusted according to the current phases of the third target user's eyes and the characteristic information.
Illustratively, the secondary users may be prompted to adjust their positions so as to reduce the number of secondary users step by step, and the phase adjustment scheme provided by the above embodiment is executed iteratively until all users satisfy the mapping condition of eye tracking, so that all users can view the 3D display effect.
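The importance ranking and fallback described above can be sketched as follows; the `User` fields and the tie-breaking rule (nearest to the screen, then nearest to the screen center) are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class User:
    name: str
    distance_to_screen: float   # assumed units: metres from the display
    offset_from_center: float   # assumed: lateral offset from the screen center

def rank_users(users: List[User]) -> Tuple[User, List[User]]:
    """Pick the third target user; everyone else is a secondary user.

    Importance here means closest to the screen, with ties broken by
    closeness to the screen center, one plausible reading of the text.
    """
    ordered = sorted(users,
                     key=lambda u: (u.distance_to_screen, abs(u.offset_from_center)))
    return ordered[0], ordered[1:]

primary, secondary = rank_users([
    User("A", 2.0, 0.4),
    User("B", 1.5, 0.1),
    User("C", 1.5, 0.3),
])
# primary is B: nearest the screen and, among ties, nearest the center
```

The secondary list is the set of users who would be prompted to move before the scheme is re-run.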
On the basis of the above embodiment, a first intersection of the left-eye target visual area phases and a second intersection of the right-eye target visual area phases of all users are calculated. If the first intersection and the second intersection are both non-empty sets, all users' target visual areas share a common portion; the viewpoint images corresponding to the initial phases are therefore moved into that common portion, so that all users can view the naked-eye 3D display effect. If the first intersection or the second intersection is an empty set, not all users can be in the target visual area. In that case, the positions of the eyes of the third target user, the one closest to the screen or closest to the middle of the screen, are tracked preferentially, so that this user can view the 3D effect first. In addition, the secondary users other than the third target user are prompted to adjust their positions, continually reducing the number of secondary users until all users meet the mapping condition of eye tracking and can observe an ideal naked-eye 3D display effect.
Example Three
Fig. 4 is a flowchart of a naked-eye 3D display method based on eye tracking according to a third embodiment of the present invention. On the basis of the above embodiments, this embodiment optimizes the case in which the original visual area is continuous when multiple viewpoints are mapped. Explanations of terms that are the same as or correspond to those in the above embodiments are not repeated here. Referring to fig. 4, the naked-eye 3D display method based on eye tracking provided by this embodiment includes:
S310, determining the initial phase of each viewpoint according to the number of the viewpoints to be mapped.
S320, determining the characteristic information of the original visual area in each visual angle according to the preset layout mode of the viewpoint to be laid.
The original visual area is a display area where a naked eye 3D display effect can be presented on the screen when human eye tracking is not performed.
And S330, for each user, calculating the relative phase relationship between the left eye phase and the right eye phase of the user according to the current phase of the two eyes of the user relative to the screen.
And S340, adjusting the phase of the viewpoint image corresponding to the initial phase according to the relative phase relationship, the characteristic information and the maximum value and the minimum value which respectively satisfy the relative phase relationship in the current left eye phase and the current right eye phase of each user.
Through the refraction of light, the lens refracts different display contents to different places in space, forming a plurality of optical channels. The viewpoint maps corresponding to the same phase in different phase ranges (0, 1) are the same. The human eye views the corresponding viewpoint image through an optical channel.
Illustratively, according to the relative phase relationship, the starting phase of the original visual area, the size of the original visual area, and the maximum and minimum values satisfying the relative phase relationship, it can be determined whether the human eyes have moved out of the range of the original visual area. If so, the viewpoint image of the optical channel corresponding to the eyes is adjusted so that the left eye and the right eye of each user can watch the correct viewpoint image content; if the eyes have not moved out of the range of the original visual area, no phase adjustment of the viewpoint images is needed, and the images are arranged according to the original preset mapping mode.
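The moved-out-of-area test can be sketched directly from the characteristic information. This is a minimal sketch assuming phases are normalized to the viewing-angle range (0, 1) and the original visual area is the interval [s, s + w]:

```python
def in_original_visual_area(phase: float, s: float, w: float) -> bool:
    """True when an eye phase lies inside the original visual area [s, s + w].

    s is the starting phase feature and w the size feature from the text;
    phases are assumed to be normalized to the viewing-angle range (0, 1).
    """
    return s <= phase <= s + w

def needs_adjustment(left: float, right: float, s: float, w: float) -> bool:
    """Phase adjustment is needed as soon as either eye leaves the area."""
    return not (in_original_visual_area(left, s, w)
                and in_original_visual_area(right, s, w))

# With s = 0.2, w = 0.5 the area is [0.2, 0.7]:
needs_adjustment(0.25, 0.60, 0.2, 0.5)  # both eyes inside
needs_adjustment(0.10, 0.60, 0.2, 0.5)  # left eye outside
```

When `needs_adjustment` is false, the images stay in the original preset mapping mode, matching case 1 below.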
Illustratively, the relative phase relationship includes the left-eye phase being smaller than the right-eye phase and the left-eye phase being larger than the right-eye phase. Because different users move by different amounts, the eyes of some users may still correspond to the original visual area after moving, while for other users only one eye corresponds to the original visual area and the other eye has moved into the invisible area. Therefore, whether the two eyes of different users correspond to the original visual area can be discussed case by case according to the maximum and minimum values satisfying the different phase relationships, so that the viewpoint images corresponding to the eyes of users at different positions can be adjusted in a targeted manner, improving the users' viewing experience.
In this embodiment, by introducing the relative phase relationship between the user's left eye and right eye, the area corresponding to the user's current left and right eye phases can be identified from the relative phase relationship combined with the maximum and minimum values of the current left and right eye phases satisfying that relationship. The viewpoint images of the left-eye and right-eye visual areas of different users can thus be adjusted in a targeted manner according to the positions of their eyes, improving the users' viewing experience.
Example Four
Fig. 5 is a flowchart of a naked-eye 3D display method based on eye tracking according to a fourth embodiment of the present invention. On the basis of the foregoing embodiments, this embodiment optimizes the case in which the original visual area is continuous when multiple viewpoints are mapped, with the relative phase relationship of each user being that the current left-eye phase is located on the left side of the current right-eye phase. Explanations of terms that are the same as or correspond to those in the foregoing embodiments are not repeated here. Referring to fig. 5, the naked-eye 3D display method based on eye tracking provided by this embodiment includes:
S410, determining the initial phase of each viewpoint according to the number of the viewpoints to be mapped.
And S420, determining the characteristic information of the original visual area in each visual angle according to a preset layout mode of the viewpoint to be laid.
The original visual area is a display area where a naked eye 3D display effect can be presented on the screen when human eye tracking is not performed.
And S430, for each user, calculating the relative phase relationship between the left eye phase and the right eye phase of the user according to the current phases of the two eyes of the user relative to the screen.
Illustratively, let the left-eye phase of the first user be p_1L and the right-eye phase of the first user be p_1R, and let the left eye of the first user fall within the Kth optically designed viewpoint.

Let the left-eye phase of the Nth user (N ≥ 2) be p_NL and the right-eye phase of the Nth user be p_NR. The relative phase relationship of a user's left and right eyes is Δp = p_R - p_L.

For example, in this embodiment the relative phase relationship is: the user's current left-eye phase is located to the left of the current right-eye phase, i.e., Δp > 0.
And S440, taking the user with the current left eye phase positioned on the left side of the current right eye phase as a first target user.
S450, determining a first left-eye phase with the minimum value from the current left-eye phases corresponding to the first target users, and determining a first right-eye phase with the maximum value from the current right-eye phases corresponding to the first target users.
Illustratively, among the first target users satisfying Δp > 0, let the minimum value of the current left-eye phases be minpSeq_L = min(p_1L, p_2L, …, p_NL), and the maximum value of the current right-eye phases be maxpSeq_R = max(p_1R, p_2R, …, p_NR).
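The selection of minpSeq_L and maxpSeq_R can be sketched as follows, assuming each user's eyes are given as a (left, right) phase pair:

```python
from typing import List, Tuple

def first_target_extremes(eyes: List[Tuple[float, float]]) -> Tuple[float, float]:
    """Among users with delta_p = p_R - p_L > 0 (first target users), return
    (minpSeq_L, maxpSeq_R): the minimum current left-eye phase and the
    maximum current right-eye phase, as defined in the text.
    """
    firsts = [(pl, pr) for pl, pr in eyes if pr - pl > 0]
    min_seq_l = min(pl for pl, _ in firsts)
    max_seq_r = max(pr for _, pr in firsts)
    return min_seq_l, max_seq_r

# Three users' (left, right) eye phases; all satisfy delta_p > 0 here.
minpSeqL, maxpSeqR = first_target_extremes([(0.30, 0.45), (0.25, 0.40), (0.35, 0.55)])
```

Comparing these two extremes against s and s + w selects one of the three cases discussed below.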
And S460, adjusting the phase of the viewpoint image corresponding to the initial phase according to the first left-eye phase, the first right-eye phase, the initial phase characteristic and the size characteristic.
In this embodiment, the characteristic information of the original visual area further includes a starting phase feature s (0 ≤ s < 0.5) and a size feature w (0 < w < 1), where s + w < 1.
Illustratively, if all users are first target users, whether the left-eye and right-eye phases of the first target users are located in the original visual area is determined by the relationships among the first left-eye phase, the first right-eye phase, the starting phase feature and the size feature. Whether the current left-eye and right-eye phases of different first target users correspond to the original visual area is discussed case by case below:
1. If the first left-eye phase is greater than or equal to the starting phase of the original visual area, i.e., minpSeq_L ≥ s, and the first right-eye phase is smaller than the sum of the starting phase and the size of the original visual area, i.e., maxpSeq_R ≤ s + w, it is determined that the first left-eye and first right-eye phases corresponding to all first target users are located in the original visual area. In this case, although the users' eyes have moved, their positions are still within the original visual area of the same phase range (0, 1). Therefore, the viewpoint map does not need to be adjusted, that is, the adjustment amplitude of the viewpoint map corresponding to the initial phase is 0, and the left and right eyes of each first target user can still see the correct viewpoint maps and observe the 3D effect of the object from different angles.
For example, if no phase adjustment of the viewpoint map is required, the change of the viewpoint map may be implemented by re-assigning values to the sub-pixels of the pixel points based on the initial phase of the viewpoint map corresponding to the preset mapping. The sub-pixels are preferably assigned by linear interpolation.
Optionally, the sub-pixels of a viewpoint channel are assigned according to the viewpoint map corresponding to the initial phase. Specifically: if the original phase has no corresponding viewpoint map, the sub-pixels of the channel are set to full black or set according to the view content in the nearest channel; if the original phase has a corresponding viewpoint map, the adjusted viewpoint map is calculated as c_j = Σ_i d_i · v_i, where c_j is the adjusted viewpoint map, v_i is any one viewpoint map included in the channel, and d_i is the weight of that viewpoint map; the sub-pixels are then set according to the adjusted viewpoint map content. The weight may be the proportion of each viewpoint map within the channel. Fig. 6 is a schematic diagram of the allocation of multi-viewpoint channels and viewpoint maps in the naked-eye 3D display method according to the fourth embodiment of the present invention; fig. 6 illustrates the allocation relationship of channels and viewpoint maps within the phase range [0, 1].
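The weighted combination c_j = Σ_i d_i · v_i can be sketched as follows; representing each viewpoint map as a flat list of grey values is an illustrative assumption:

```python
from typing import List

def blend_channel_views(views: List[List[float]], weights: List[float]) -> List[float]:
    """Compute the adjusted viewpoint map c_j = sum_i d_i * v_i for one channel.

    views   -- per-viewpoint pixel rows v_i (flattened grey values, assumed)
    weights -- d_i, the proportion of each viewpoint map within the channel;
               expected to sum to 1.
    """
    assert len(views) == len(weights)
    n = len(views[0])
    return [sum(w * v[px] for v, w in zip(views, weights)) for px in range(n)]

# A channel covering two viewpoint maps with proportions 75% / 25%:
c_j = blend_channel_views([[100.0, 200.0], [200.0, 100.0]], [0.75, 0.25])
```

The resulting values are what the channel's sub-pixels would be set to.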
2. If, according to the first left-eye phase, the first right-eye phase, the starting phase feature and the size feature, it is determined that the current right-eye phases of the first target users are located in the original visual area, i.e., maxpSeq_R ≤ s + w, while the current left-eye phase of at least one first target user is outside the left edge of the original visual area, i.e., minpSeq_L < s, the phase of the viewpoint image corresponding to the initial phase is adjusted according to the following formula:

[formula]

where minpSeq_L is the first left-eye phase; K is the number of viewpoints to be mapped and k is any one of the K viewpoints; s is the starting phase of the original visual area; further terms denote the phase range corresponding to any one viewpoint and the mapping of the different views to the phase range [0, 1] corresponding to the central view, giving the adjusted phase of the viewpoint image corresponding to any initial phase.
With this formula, the phase of the viewpoint image corresponding to the initial left-eye phase is adjusted so that its position corresponds to the user's current left-eye phase. Although the user's left eye has moved into the invisible area, with this arrangement the user can still see the correct viewpoint image content and experience the 3D display effect.
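The adjustment step can be sketched as a wrap-around phase shift. Because the exact formula appears only as an image in the source, the per-viewpoint initial phase k/K and the mod-1 wrap used here are assumptions consistent with the surrounding description, not the patent's formula:

```python
from typing import List

def shift_viewpoint_phases(K: int, offset: float) -> List[float]:
    """Shift every viewpoint's initial phase by `offset` and wrap into [0, 1).

    Sketch of the adjustment step only: the initial phase k/K and the
    wrap-around mod 1 are assumptions.  For the case where the left eye
    crossed the left edge, offset = minpSeq_L - s (a negative amount).
    """
    return [((k / K) + offset) % 1.0 for k in range(K)]

# 4 viewpoints, left eye 0.1 phase beyond the left edge of the area:
shifted = shift_viewpoint_phases(4, -0.1)
```

The wrap keeps every viewpoint image inside the phase range [0, 1), so the map corresponding to the moved eye follows it across the edge.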
3. If, according to the first left-eye phase, the first right-eye phase, the starting phase feature and the size feature, it is determined that the current left-eye phases of the first target users are located in the original visual area, i.e., minpSeq_L ≥ s, while the current right-eye phase of at least one first target user is outside the right edge of the original visual area, i.e., maxpSeq_R > s + w, the phase of the viewpoint image corresponding to the initial phase is adjusted according to the following formula:

[formula]

where maxpSeq_R is the first right-eye phase; K is the number of viewpoints to be mapped and k is any one of the K viewpoints; s is the starting phase of the original visual area; w is the size of the original visual area; further terms denote the phase range corresponding to any one viewpoint and the mapping of the different views to the phase range [0, 1] corresponding to the central view, giving the adjusted phase of the viewpoint image corresponding to any initial phase.
With this formula, the phase of the viewpoint image corresponding to the initial right-eye phase is adjusted so that its position corresponds to the user's current right-eye phase. Although the user's right eye has moved into the invisible area, with this arrangement the user can still see the correct viewpoint image content and experience the 3D display effect.
On the basis of the above embodiments, this embodiment discusses case by case, according to the magnitude relationships among the first left-eye phase, the first right-eye phase, the starting phase feature and the size feature, whether the two-eye phases of different first target users correspond to the original visual area, and designs a corresponding viewpoint-image phase adjustment mode for each case. As a result, the left and right eyes of all first target users can view the correct viewpoint image content, aliasing and inversion of images are avoided, and the users' viewing effect and viewing experience are improved.
Example Five
Fig. 7 is a flowchart of a naked-eye 3D display method based on eye tracking according to a fifth embodiment of the present invention. On the basis of the above embodiments, this embodiment optimizes the case in which the original visual area is continuous when multiple viewpoints are mapped, with the relative phase relationship being that the user's current left-eye phase is located on the right side of the current right-eye phase; that is, the user's left and right eyes fall in different adjacent viewing angles and the left and right eye phases are flipped. Explanations of terms that are the same as or correspond to those in the above embodiments are not repeated here. Referring to fig. 7, the naked-eye 3D display method based on eye tracking provided by this embodiment includes:
S510, determining an initial phase of each viewpoint according to the number of the viewpoints to be mapped.
S520, determining the characteristic information of the original visual area in each visual angle according to the preset layout mode of the viewpoint to be laid.
The original visual area is a display area where a naked eye 3D display effect can be presented on the screen when human eye tracking is not performed.
And S530, for each user, calculating the relative phase relation between the left eye phase and the right eye phase of the user according to the current phases of the two eyes of the user relative to the screen.
The relative phase relationship in this embodiment is: the user's current left-eye phase is located on the right side of the current right-eye phase, i.e., Δp < 0.
And S540, taking the user with the current left eye phase positioned on the right side of the current right eye phase as a second target user.
And S550, determining a second left-eye phase with the minimum value in the current left-eye phases corresponding to the second target users, and determining a second right-eye phase with the maximum value in the current right-eye phases corresponding to the second target users.
Illustratively, among the second target users satisfying Δp < 0, let the minimum value of the current left-eye phases be minpInv_L = min(p_1L, p_2L, …, p_NL), and the maximum value of the current right-eye phases be maxpInv_R = max(p_1R, p_2R, …, p_NR).
And S560, calculating the size of the invisible area according to the size of the preset visual angle and the size characteristic of the original visible area.
Specifically, the size of the preset viewing angle is 1, the size of the original visible area is w, and for each viewing angle, except for the original visible area, the size of the invisible area is 1-w.
And S570, when at least one user is a second target user, performing phase adjustment on the viewpoint image corresponding to the initial phase according to the second left-eye phase, the second right-eye phase and the size of the invisible area.
For example, whether the left-eye and right-eye phases of the second target users are located in the original visual area is determined by the relationships among the second left-eye phase, the second right-eye phase, the starting phase feature and the invisible area. Similar to the case analysis for the first target users above, whether the current left-eye and right-eye phases of different second target users correspond to the original visual area is discussed case by case below:
1. If the absolute value of the difference between the second left-eye phase and the second right-eye phase is greater than the size of the invisible area, i.e., minpInv_L - maxpInv_R > 1 - w (that is, within the viewing angle the left-eye phases of all users are located on the right side of the right-eye phases of all users), the phase of the viewpoint image corresponding to the initial phase is adjusted according to the following formula:

[formula]

where minpInv_L is the second left-eye phase; maxpInv_R is the second right-eye phase; K is the number of viewpoints to be mapped and k is any one of the K viewpoints; further terms denote the phase range corresponding to any one viewpoint and the mapping of the different views to the phase range [0, 1] corresponding to the central view, giving the adjusted phase of the viewpoint image corresponding to any initial phase.
With this formula, the phases of the viewpoint images corresponding to the initial left-eye and right-eye phases are adjusted so that their positions correspond to the user's current left-eye and right-eye phases. Although the phases of the user's two eyes are flipped, with this arrangement the user can still see the correct viewpoint image content and experience the 3D display effect.
2. If, among all users, there is at least one first target user and at least one second target user, and the absolute value of the difference between the first left-eye phase and the second right-eye phase is greater than the size of the invisible area, i.e., minpSeq_L - maxpInv_R > 1 - w, then the second right-eye phase (the maximum of the second target users' current right-eye phases) is located on the left side of the first left-eye phase (the minimum of the first target users' current left-eye phases). In this case, the phase of the viewpoint image corresponding to the initial phase may be adjusted according to the following formula:

[formula]

where minpSeq_L is the first left-eye phase; maxpInv_R is the second right-eye phase; K is the number of viewpoints to be mapped and k is any one of the K viewpoints; further terms denote the phase range corresponding to any one viewpoint and the mapping of the different views to the phase range [0, 1] corresponding to the central view, giving the adjusted phase of the viewpoint image corresponding to any initial phase.
With this formula, the phases of the viewpoint images corresponding to the initial left-eye and right-eye phases are adjusted so that their positions correspond to the users' current left-eye and right-eye phases. Although the phases of some users' eyes are flipped, with this arrangement the users can still see the correct viewpoint image content and experience the 3D display effect.
3. If, among all users, there is at least one first target user and at least one second target user, and the absolute value of the difference between the second left-eye phase and the first right-eye phase is greater than the size of the invisible area, i.e., minpInv_L - maxpSeq_R > 1 - w, then the second left-eye phase (the minimum of the second target users' current left-eye phases) is located on the right side of the first right-eye phase (the maximum of the first target users' current right-eye phases). In this case, the phase of the viewpoint image corresponding to the initial phase may be adjusted according to the following formula:

[formula]

where minpInv_L is the second left-eye phase; maxpSeq_R is the first right-eye phase; K is the number of viewpoints to be mapped and k is any one of the K viewpoints; further terms denote the phase range corresponding to any one viewpoint and the mapping of the different views to the phase range [0, 1] corresponding to the central view, giving the adjusted phase of the viewpoint image corresponding to any initial phase.
With this formula, the phases of the viewpoint images corresponding to the initial left-eye and right-eye phases are adjusted so that their positions correspond to the users' current left-eye and right-eye phases. Although the phases of some users' eyes are flipped, with this arrangement the users can still see the correct viewpoint image content and experience the 3D display effect.
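The three case conditions of this embodiment can be collected into a small dispatcher; the condition expressions follow the text, while the function signature and the `None` convention for an absent user group are assumptions:

```python
def flipped_case(min_inv_l, max_inv_r, min_seq_l, max_seq_r, w):
    """Decide which of the three flipped-phase cases of this embodiment holds.

    1 - w is the size of the invisible area (viewing angle of size 1 minus
    the original visual area of size w).  min_inv_l / max_inv_r are the
    second left/right-eye phases (delta_p < 0 users); min_seq_l / max_seq_r
    are the first left/right-eye phases (delta_p > 0 users); pass None when
    a group is absent.  Returns 0 when no boundary case applies.
    """
    invisible = 1.0 - w
    if (min_inv_l is not None and max_inv_r is not None
            and min_inv_l - max_inv_r > invisible):
        return 1  # all flipped users: left-eye phases right of right-eye phases
    if (min_seq_l is not None and max_inv_r is not None
            and min_seq_l - max_inv_r > invisible):
        return 2  # second right-eye phase left of first left-eye phase
    if (min_inv_l is not None and max_seq_r is not None
            and min_inv_l - max_seq_r > invisible):
        return 3  # second left-eye phase right of first right-eye phase
    return 0

flipped_case(0.9, 0.1, None, None, 0.5)   # only flipped users present
flipped_case(0.55, 0.1, 0.7, 0.9, 0.5)    # mixed first and second target users
```

Each returned case number selects the corresponding adjustment formula above.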
It should be noted that, if the phases of all users' eyes cannot all be located in the corresponding original visual area, all users may be ranked by importance in the manner used to determine the third target user, and the important user closest to the middle of the screen or closest to the screen is tracked preferentially. The other, secondary users are prompted to adjust their positions, reducing the number of secondary users step by step, until all users satisfy the mapping condition of eye tracking and can watch the 3D display effect, improving the users' viewing experience.
On the basis of the above embodiments, this embodiment discusses case by case, according to the magnitude relationships among the first left-eye phase, the first right-eye phase, the second left-eye phase, the second right-eye phase and the size of the invisible area, whether the two-eye phases of different second target users correspond to the original visual area, and designs a corresponding viewpoint-image phase adjustment mode for each case. As a result, the left and right eyes of all second target users can view the correct viewpoint image content, aliasing and flipping of images are avoided, and the users' viewing effect and viewing experience are improved.
Example Six
Fig. 8 is a schematic structural diagram of a naked eye 3D display device based on human eye tracking according to a sixth embodiment of the present invention, and as shown in fig. 8, the device includes: an initial phase determination module 610, a characteristic information determination module 620, and a phase adjustment module 630.
The initial phase determining module 610 is configured to determine an initial phase of each viewpoint according to the number of viewpoints to be mapped;
the characteristic information determining module 620 is configured to determine characteristic information of an original visual area in each viewing angle according to a preset layout mode of the viewpoint to be laid out, where the original visual area is a display area where a naked-eye 3D display effect can be presented on a screen when human eye tracking is not performed;
and a phase adjusting module 630, configured to perform phase adjustment on the viewpoint image corresponding to the initial phase according to the feature information and the current phase of the two eyes of the user relative to the screen.
According to the naked eye 3D display device based on human eye tracking, the initial phase of each viewpoint can be determined according to the number of the viewpoints of the image to be arranged. And determining the characteristic information of the original visual area in each visual angle according to the preset layout mode of the viewpoint of the picture to be laid. The current phase of the human eyes relative to the screen is acquired in real time, and phase adjustment can be carried out on the played viewpoint image content according to the current phase and the characteristic information, so that the viewpoint image content correspondingly moves along with the movement of the positions of the human eyes, the left eye and the right eye of each user can watch correct viewpoint image content, image aliasing or inversion is avoided, and the watching effect and the watching experience of the users are improved.
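The three-module structure can be sketched as a single class; the method names, data shapes, and the simplified adjustment rule are illustrative assumptions, not the patent's API:

```python
class NakedEye3DDisplayDevice:
    """Sketch of the device in Fig. 8: three cooperating modules.

    The module split mirrors the text (610 / 620 / 630); everything else
    here is an assumption made for illustration.
    """

    def __init__(self, num_viewpoints: int, s: float, w: float):
        self.num_viewpoints = num_viewpoints  # viewpoints to be mapped
        self.s, self.w = s, w                 # original visual area [s, s + w]

    def determine_initial_phases(self):
        # Initial phase determination module 610: one phase per viewpoint.
        return [k / self.num_viewpoints for k in range(self.num_viewpoints)]

    def determine_feature_info(self):
        # Characteristic information determination module 620.
        return {"start": self.s, "size": self.w}

    def adjust_phases(self, left_eye: float, right_eye: float):
        # Phase adjustment module 630: shift so the left eye re-enters the
        # area, wrapping mod 1 (a simplification of the patent's formulas).
        offset = min(0.0, left_eye - self.s)
        return [(p + offset) % 1.0 for p in self.determine_initial_phases()]

device = NakedEye3DDisplayDevice(num_viewpoints=4, s=0.2, w=0.5)
phases = device.adjust_phases(left_eye=0.1, right_eye=0.6)
```

Splitting the pipeline this way keeps module 630 free to be re-run on every new eye-tracking sample while 610 and 620 stay fixed for a given screen.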
On the basis of the above embodiments, the characteristic information includes a continuity feature.

Accordingly, the characteristic information determining module 620 is specifically configured to determine, according to the preset mapping mode of the viewpoints to be mapped, that the adjacent original visual areas in each viewing angle are continuous if no invisible area exists between any two adjacent original visual areas;

wherein the invisible area is an area in which aliasing or inversion of images occurs between adjacent viewpoints.
On the basis of the above embodiment, if the original visible area is continuous, the phase adjustment module 630 includes:
the relative phase relation calculation unit is used for calculating the relative phase relation between the left eye phase and the right eye phase of each user according to the current phases of the two eyes of each user relative to the screen;
and the first phase adjusting unit is used for carrying out phase adjustment on the viewpoint image corresponding to the initial phase according to the relative phase relationship, the characteristic information and the maximum value and the minimum value which respectively meet the relative phase relationship in the current left eye phase and the current right eye phase of each user.
On the basis of the above embodiment, the viewpoint to be mapped comprises a left viewpoint and a right viewpoint;
correspondingly, the original visual area comprises a left-viewpoint original visual area and a right-viewpoint original visual area;
accordingly, if the original visible region is not continuous, the phase adjustment module 630 includes:
a target visual area determining unit, configured to determine, according to current left-eye phases and current right-eye phases of different users and feature information of the left-viewpoint original visual area and the right-viewpoint original visual area, a left-eye target visual area phase corresponding to the current left-eye phase and a right-eye target visual area phase corresponding to the current right-eye phase;
the intersection calculation unit is used for calculating a first intersection between all left-eye target visual area phases and a second intersection between all right-eye target visual area phases;
and the second phase adjustment unit is used for adjusting the phase of the viewpoint image corresponding to the initial phase according to the magnitude relation between the maximum value and the minimum value in the first intersection and the maximum value and the minimum value in the second intersection if the first intersection and the second intersection are both non-empty sets.
On the basis of the foregoing embodiment, the second phase adjustment unit is specifically configured to perform phase adjustment on the viewpoint map corresponding to the initial phase according to the following formula:
[The two phase-adjustment formulas are rendered as images in the source and are not reproduced here.]

Wherein K is the number of viewpoints to be arranged, and k is any one of the K viewpoints; max(φ_L) and min(φ_L) are the maximum and minimum values in the first intersection; max(φ_R) and min(φ_R) are the maximum and minimum values in the second intersection; s_1 is the starting phase of the left-eye original visual area; s_2 is the starting phase of the right-eye original visual area; W is the size of the original visual area corresponding to the left eye or the right eye; 1/K is the phase range corresponding to any one viewpoint; Φ is the phase adjustment amplitude corresponding to any one viewpoint; and the result of each formula is the adjusted phase of the viewpoint image corresponding to any initial phase, after the different viewpoints are mapped to the phase range [0, 1] of the central visual angle.
On the basis of the above embodiment, the feature information includes a start phase feature and a magnitude feature;
the relative phase relationship includes: the current left eye phase of the user is positioned on the left side of the current right eye phase;
correspondingly, the first phase adjustment unit comprises:
a first target user determination subunit configured to determine, as a first target user, a user whose current left-eye phase is on the left side of the current right-eye phase;
a first phase determining subunit, configured to determine a first left-eye phase with a minimum value from current left-eye phases corresponding to the first target user, and determine a first right-eye phase with a maximum value from current right-eye phases corresponding to the first target user;
and the first phase adjusting subunit is configured to perform phase adjustment on the viewpoint image corresponding to the initial phase according to the first left-eye phase, the first right-eye phase, the initial phase feature, and the magnitude feature.
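A sketch of what these three subunits compute, under the assumption that each user is given as a (left-eye phase, right-eye phase) pair; function and variable names are illustrative:

```python
def first_target_extremes(users):
    """Select first target users (left-eye phase left of right-eye phase),
    then take the minimum left-eye phase and the maximum right-eye phase,
    i.e. the widest span the adjustment must cover."""
    targets = [(l, r) for l, r in users if l < r]
    if not targets:
        return None  # no first target user among the viewers
    min_pseq_l = min(l for l, _ in targets)
    max_pseq_r = max(r for _, r in targets)
    return min_pseq_l, max_pseq_r

# The inverted pair (0.9, 0.1) is not a first target user and is skipped
print(first_target_extremes([(0.2, 0.3), (0.1, 0.5), (0.9, 0.1)]))
# prints (0.1, 0.5)
```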
On the basis of the foregoing embodiment, when all users are the first target user, the first phase adjustment subunit is specifically configured to:
according to the first left-eye phase, the first right-eye phase, the initial phase feature and the size feature, if it is determined that the first left-eye phase and the first right-eye phase corresponding to all the first target users are located in the original visual area, the adjustment amplitude of the viewpoint image corresponding to the initial phase is 0.
On the basis of the foregoing embodiment, the first phase adjustment subunit is specifically configured to:
according to the first left-eye phase, the first right-eye phase, the starting phase characteristic and the size characteristic, if it is determined that the current right-eye phase of the first target user is located in the original visual region and the current left-eye phase of at least one first target user is located outside the left edge of the original visual region, performing phase adjustment on a viewpoint image corresponding to the initial phase according to the following formula:
[The phase-adjustment formula is rendered as an image in the source and is not reproduced here.]

Wherein min pSeq_L is the first left-eye phase; K is the number of viewpoints to be arranged, and k is any one of the K viewpoints; s is the starting phase of the original visual area; 1/K is the phase range corresponding to any one viewpoint; and the result of the formula is the adjusted phase of the viewpoint image corresponding to any initial phase, after the different viewpoints are mapped to the phase range [0, 1] of the central visual angle.
On the basis of the foregoing embodiment, the first phase adjustment subunit is specifically configured to:
according to the first left-eye phase, the first right-eye phase, the starting phase characteristic and the size characteristic, if it is determined that the current left-eye phase of the first target user is located in the original visual region and the current right-eye phase of at least one first target user is located outside the right edge of the original visual region, performing phase adjustment on a viewpoint image corresponding to the initial phase according to the following formula:
[The phase-adjustment formula is rendered as an image in the source and is not reproduced here.]

Wherein max pSeq_R is the first right-eye phase; K is the number of viewpoints to be arranged, and k is any one of the K viewpoints; s is the starting phase of the original visual area; W is the size of the original visual area; 1/K is the phase range corresponding to any one viewpoint; and the result of the formula is the adjusted phase of the viewpoint image corresponding to any initial phase, after the different viewpoints are mapped to the phase range [0, 1] of the central visual angle.
On the basis of the above embodiment, the relative phase relationship further includes: the user's current left eye phase is located to the right of the current right eye phase:
correspondingly, the first phase adjustment unit further comprises:
a second target user determination subunit, configured to determine, as a second target user, a user whose current left-eye phase is located on the right side of the current right-eye phase;
a second phase determining subunit, configured to determine a second left-eye phase with a minimum value in current left-eye phases corresponding to the second target user, and determine a second right-eye phase with a maximum value from current right-eye phases corresponding to the second target user;
the size calculation unit of the invisible area is used for calculating the size of the invisible area according to the size of a preset visual angle and the size characteristic of the original visible area;
and the second phase adjustment subunit is configured to, when at least one user is a second target user, perform phase adjustment on the viewpoint image corresponding to the initial phase according to the second left-eye phase, the second right-eye phase, and the size of the invisible area.
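A sketch of the two computations above. The text does not give the numeric relation between the preset viewing-angle size and the invisible-area size, so the subtraction below is an assumption (whatever phase range of one viewing angle is not visible is invisible); all names are illustrative:

```python
def invisible_area_size(view_angle_size: float, visible_size: float) -> float:
    """Assumed relation: the invisible area is the part of one viewing
    angle not covered by the original visible area."""
    return view_angle_size - visible_size

def second_target_adjustable(left_phase: float, right_phase: float,
                             invisible: float) -> bool:
    """The adjustment formula applies only when |p_L - p_R| exceeds the
    invisible-area size, so both eyes can be moved clear of it."""
    return abs(left_phase - right_phase) > invisible

inv = invisible_area_size(1.0, 0.8)             # 0.2 under the assumption above
print(second_target_adjustable(0.9, 0.1, inv))  # prints True
```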
On the basis of the foregoing embodiment, the second phase adjustment subunit is specifically configured to:
if the absolute value of the difference between the second left-eye phase and the second right-eye phase is larger than the size of the invisible area, performing phase adjustment on the viewpoint image corresponding to the initial phase according to the following formula:
[The phase-adjustment formula is rendered as an image in the source and is not reproduced here.]

Wherein min pInv_L is the second left-eye phase; max pInv_R is the second right-eye phase; K is the number of viewpoints to be arranged, and k is any one of the K viewpoints; 1/K is the phase range corresponding to any one viewpoint; and the result of the formula is the adjusted phase of the viewpoint image corresponding to any initial phase, after the different viewpoints are mapped to the phase range [0, 1] of the central visual angle.
On the basis of the above embodiment, the apparatus further includes: a third phase determining subunit, configured to, if there is at least one first target user and at least one second target user among all users, perform phase adjustment on the viewpoint image corresponding to the initial phase according to the relationship among the first left-eye phase, the second right-eye phase and the size of the invisible area, or according to the relationship among the first right-eye phase, the second left-eye phase and the size of the invisible area.
On the basis of the above embodiment, the third phase determining subunit is specifically configured to:
if the absolute value of the difference between the first left-eye phase and the second right-eye phase is larger than the size of the invisible area, performing phase adjustment on the viewpoint image corresponding to the initial phase according to the following formula:
[The phase-adjustment formula is rendered as an image in the source and is not reproduced here.]

Wherein min pSeq_L is the first left-eye phase; max pInv_R is the second right-eye phase; K is the number of viewpoints to be arranged, and k is any one of the K viewpoints; 1/K is the phase range corresponding to any one viewpoint; and the result of the formula is the adjusted phase of the viewpoint image corresponding to any initial phase, after the different viewpoints are mapped to the phase range [0, 1] of the central visual angle.
On the basis of the above embodiment, the third phase determining subunit is specifically configured to:
if the absolute value of the difference between the second left-eye phase and the first right-eye phase is larger than the size of the invisible area, performing phase adjustment on the viewpoint image corresponding to the initial phase according to the following formula:
[The phase-adjustment formula is rendered as an image in the source and is not reproduced here.]

Wherein min pInv_L is the second left-eye phase; max pSeq_R is the first right-eye phase; K is the number of viewpoints to be arranged, and k is any one of the K viewpoints; 1/K is the phase range corresponding to any one viewpoint; and the result of the formula is the adjusted phase of the viewpoint image corresponding to any initial phase, after the different viewpoints are mapped to the phase range [0, 1] of the central visual angle.
On the basis of the above embodiment, the apparatus further includes:
a third target user determination module, configured to determine a third target user in a screen viewing area if the first intersection or the second intersection is an empty set, so as to perform phase adjustment on the viewpoint image corresponding to the initial phase according to a current phase of both eyes of the third target user and the feature information;
and the third target user is the user closest to the screen or the user closest to the center position of the screen in the optimal viewing range.
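A sketch of the fallback selection, assuming each tracked user carries a perpendicular distance z to the screen and lateral offsets x, y from the screen centre; the dictionary fields and the tie-breaking order are assumptions:

```python
import math

def pick_third_target(users):
    """Fallback reference viewer when the visual-area intersections are
    empty: prefer the user closest to the screen, breaking ties by
    distance to the screen centre."""
    return min(users, key=lambda u: (u["z"], math.hypot(u["x"], u["y"])))

viewers = [{"name": "A", "x": 0.3, "y": 0.0, "z": 2.0},
           {"name": "B", "x": 0.0, "y": 0.1, "z": 1.5}]
print(pick_third_target(viewers)["name"])  # prints B
```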
The naked eye 3D display device based on human eye tracking can execute the naked eye 3D display method based on human eye tracking provided by any embodiment of the present invention, and has the functional modules corresponding to that method and its beneficial effects. For technical details not described in detail in the above embodiments, reference may be made to the naked eye 3D display method based on human eye tracking provided by any embodiment of the present invention.
Example seven
Fig. 9 is a schematic structural diagram of an apparatus according to a seventh embodiment of the present invention. FIG. 9 illustrates a block diagram of an exemplary device 12 suitable for use in implementing embodiments of the present invention. The device 12 shown in fig. 9 is only an example and should not bring any limitation to the function and scope of use of the embodiments of the present invention.
As shown in FIG. 9, device 12 is in the form of a general purpose computing device. The components of device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, enhanced ISA bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
Device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM)30 and/or cache memory 32. Device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 9, and commonly referred to as a "hard drive"). Although not shown in FIG. 9, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
The device may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with the device 12, and/or with any devices (e.g., network card, modem, etc.) that enable the device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, the device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) via the network adapter 20. As shown, the network adapter 20 communicates with the other modules of the device 12 via the bus 18. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 runs programs stored in the system memory 28 to perform various functional applications and data processing, for example, to implement the naked eye 3D display method based on human eye tracking provided by the embodiments of the present invention.
Example eight
The eighth embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the naked eye 3D display method based on human eye tracking provided in any embodiment of the present invention.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (17)

1. A naked eye 3D display method based on human eye tracking is characterized by comprising the following steps:
determining an initial phase of each viewpoint according to the number of viewpoints of the viewpoints to be arranged;
determining characteristic information of an original visual area in each visual angle according to a preset layout mode of the viewpoint of the diagram to be laid, wherein the original visual area is a display area of which a screen can present a naked eye 3D display effect when human eye tracking is not carried out;
according to the characteristic information and the current phase of the two eyes of the user relative to the screen, phase adjustment is carried out on the viewpoint image corresponding to the initial phase;
if the original visual area is continuous, adjusting the phase of the viewpoint image corresponding to the initial phase according to the characteristic information and the current phase of the two eyes of the user relative to the screen, wherein the adjusting comprises the following steps:
for each user, calculating the relative phase relation between the left eye phase and the right eye phase of the user according to the current phases of the two eyes of the user relative to the screen;
performing phase adjustment on the viewpoint image corresponding to the initial phase according to the relative phase relationship, the characteristic information and the maximum value and the minimum value which respectively satisfy the relative phase relationship in the current left eye phase and the current right eye phase of each user;
determining an initial phase of each viewpoint according to the number of viewpoints to be mapped, wherein the method comprises the following steps:
designing a central visual area at the optimal viewing distance, setting the phase of the viewing line segment of the central visual angle to [0, 1], and determining the initial phase of each viewpoint according to the number of viewpoints;
wherein the optimal viewing distance is the perpendicular distance from the viewer to the screen.
2. The method of claim 1, wherein the feature information comprises continuity features;
correspondingly, determining the characteristic information of the original visual area in each visual angle according to the preset layout mode of the viewpoint of the diagram to be laid, which comprises the following steps:
according to a preset layout mode of a viewpoint of a to-be-laid picture, within each view angle, for any two adjacent original visual areas, if no invisible area exists, the adjacent original visual areas are continuous within the view angle;
wherein the invisible region is a region where aliasing or inversion of an image occurs between adjacent viewpoints.
3. The method of claim 2, wherein the to-be-mapped viewpoint comprises a left viewpoint and a right viewpoint;
correspondingly, the original visual area comprises a left-viewpoint original visual area and a right-viewpoint original visual area;
correspondingly, if the original visual area is discontinuous, performing phase adjustment on the viewpoint image corresponding to the initial phase according to the feature information and the current phase of the user's eyes relative to the screen, including:
determining a left eye target visual area phase corresponding to the current left eye phase and a right eye target visual area phase corresponding to the current right eye phase according to the current left eye phase and the current right eye phase of different users and the characteristic information of the left viewpoint original visual area and the right viewpoint original visual area;
calculating a first intersection between all left-eye target visual area phases and a second intersection between all right-eye target visual area phases;
and if the first intersection and the second intersection are both non-empty sets, performing phase adjustment on the viewpoint image corresponding to the initial phase according to the magnitude relation between the maximum value and the minimum value in the first intersection and the maximum value and the minimum value in the second intersection.
4. The method according to claim 3, wherein performing phase adjustment on the viewpoint map corresponding to the initial phase according to a magnitude relationship between a maximum value and a minimum value in the first intersection and a maximum value and a minimum value in the second intersection specifically comprises:
and adjusting the phase of the viewpoint image corresponding to the initial phase according to the following formula:
[The two phase-adjustment formulas are rendered as images in the source and are not reproduced here.]

Wherein K is the number of viewpoints to be arranged, and k is any one of the K viewpoints; max(φ_L) and min(φ_L) are the maximum and minimum values in the first intersection; max(φ_R) and min(φ_R) are the maximum and minimum values in the second intersection; s_1 is the starting phase of the left-eye original visual area; s_2 is the starting phase of the right-eye original visual area; W is the size of the original visual area corresponding to the left eye or the right eye; 1/K is the phase range corresponding to any one viewpoint; Φ is the phase adjustment amplitude corresponding to any one viewpoint; and the result of each formula is the adjusted phase of the viewpoint image corresponding to any initial phase, after the different viewpoints are mapped to the phase range [0, 1] of the central visual angle.
5. The method of claim 1, wherein the characteristic information comprises a start phase characteristic and a magnitude characteristic;
the relative phase relationship includes: the current left eye phase of the user is positioned on the left side of the current right eye phase;
correspondingly, performing phase adjustment on the viewpoint image corresponding to the initial phase according to the relative phase relationship, the feature information, and the maximum value and the minimum value which respectively satisfy the relative phase relationship in the current left-eye and right-eye phases of each user, includes:
taking a user with a current left eye phase positioned on the left side of the current right eye phase as a first target user;
determining a first left eye phase with a minimum value from the current left eye phases corresponding to the first target users, and determining a first right eye phase with a maximum value from the current right eye phases corresponding to the first target users;
and adjusting the phase of the viewpoint image corresponding to the initial phase according to the first left-eye phase, the first right-eye phase, the initial phase characteristic and the size characteristic.
6. The method of claim 5, wherein if all users are first target users, performing phase adjustment on the viewpoint map corresponding to the initial phase according to the first left-eye phase, the first right-eye phase, the start phase feature, and the magnitude feature comprises:
according to the first left-eye phase, the first right-eye phase, the initial phase feature and the size feature, if it is determined that the first left-eye phase and the first right-eye phase corresponding to all the first target users are located in the original visual area, the adjustment amplitude of the viewpoint image corresponding to the initial phase is 0.
7. The method of claim 5, wherein performing a phase adjustment on the view map corresponding to the initial phase according to the first left-eye phase, the first right-eye phase, the start phase feature, and the magnitude feature comprises:
according to the first left-eye phase, the first right-eye phase, the starting phase characteristic and the size characteristic, if it is determined that the current right-eye phase of the first target user is located in the original visual region and the current left-eye phase of at least one first target user is located outside the left edge of the original visual region, performing phase adjustment on a viewpoint image corresponding to the initial phase according to the following formula:
[The phase-adjustment formula is rendered as an image in the source and is not reproduced here.]

Wherein min pSeq_L is the first left-eye phase; K is the number of viewpoints to be arranged, and k is any one of the K viewpoints; s is the starting phase of the original visual area; 1/K is the phase range corresponding to any one viewpoint; and the result of the formula is the adjusted phase of the viewpoint image corresponding to any initial phase, after the different viewpoints are mapped to the phase range [0, 1] of the central visual angle.
8. The method of claim 5, wherein performing a phase adjustment on the view map corresponding to the initial phase according to the first left-eye phase, the first right-eye phase, the start phase feature, and the magnitude feature comprises:
according to the first left-eye phase, the first right-eye phase, the starting phase characteristic and the size characteristic, if it is determined that the current left-eye phase of the first target user is located in the original visual region and the current right-eye phase of at least one first target user is located outside the right edge of the original visual region, performing phase adjustment on a viewpoint image corresponding to the initial phase according to the following formula:
[The phase-adjustment formula is rendered as an image in the source and is not reproduced here.]

Wherein max pSeq_R is the first right-eye phase; K is the number of viewpoints to be arranged, and k is any one of the K viewpoints; s is the starting phase of the original visual area; W is the size of the original visual area; 1/K is the phase range corresponding to any one viewpoint; and the result of the formula is the adjusted phase of the viewpoint image corresponding to any initial phase, after the different viewpoints are mapped to the phase range [0, 1] of the central visual angle.
9. The method of claim 5, wherein the relative phase relationship further comprises: the user's current left eye phase is located to the right of the current right eye phase:
correspondingly, performing phase adjustment on the viewpoint image corresponding to the initial phase according to the relative phase relationship, the feature information, and the maximum value and the minimum value which respectively satisfy the relative phase relationship in the current left-eye and right-eye phases of each user, includes:
taking the user with the current left eye phase positioned on the right side of the current right eye phase as a second target user;
determining a second left-eye phase with the minimum value in the current left-eye phases corresponding to the second target users, and determining a second right-eye phase with the maximum value from the current right-eye phases corresponding to the second target users;
calculating the size of the invisible area according to the size of a preset visual angle and the size characteristic of the original visible area;
and when at least one user is a second target user, carrying out phase adjustment on the viewpoint image corresponding to the initial phase according to the second left-eye phase, the second right-eye phase and the size of the invisible area.
10. The method of claim 9, wherein adjusting the phase of the viewpoint map corresponding to the initial phase according to the second left-eye phase, the second right-eye phase and the size of the invisible area comprises:
if the absolute value of the difference between the second left-eye phase and the second right-eye phase is larger than the size of the invisible area, performing phase adjustment on the viewpoint image corresponding to the initial phase according to the following formula:
[The phase-adjustment formula is rendered as an image in the source and is not reproduced here.]

Wherein min pInv_L is the second left-eye phase; max pInv_R is the second right-eye phase; K is the number of viewpoints to be arranged, and k is any one of the K viewpoints; 1/K is the phase range corresponding to any one viewpoint; and the result of the formula is the adjusted phase of the viewpoint image corresponding to any initial phase, after the different viewpoints are mapped to the phase range [0, 1] of the central visual angle.
11. The method of claim 9, further comprising:
and if at least one first target user and at least one second target user exist in all the users, carrying out phase adjustment on the viewpoint image corresponding to the initial phase according to the relationship among the first left-eye phase, the second right-eye phase and the size of the invisible area, or according to the relationship among the first right-eye phase, the second left-eye phase and the size of the invisible area.
12. The method of claim 11, wherein adjusting the viewpoint map corresponding to the initial phase according to the relationship between the first left-eye phase, the second right-eye phase and the size of the invisible area comprises:
if the absolute value of the difference between the first left-eye phase and the second right-eye phase is larger than the size of the invisible area, performing phase adjustment on the viewpoint map corresponding to the initial phase according to the following formula:
(formula image FDA0002681757370000071)
wherein min p_Seq^L is the first left-eye phase; max p_Inv^R is the second right-eye phase; K is the number of viewpoints to be arranged, and k is any one of the K viewpoints;
(formula image FDA0002681757370000072) represents the phase range corresponding to any one viewpoint; and
(formula image FDA0002681757370000073) represents the phase, after the different viewpoints are mapped into the phase range [0, 1] corresponding to the central viewing zone, of the viewpoint map corresponding to any initial phase.
13. The method of claim 11, wherein adjusting the phase of the viewpoint map corresponding to the initial phase according to the relationship between the first right-eye phase, the second left-eye phase and the size of the invisible area comprises:
if the absolute value of the difference between the second left-eye phase and the first right-eye phase is larger than the size of the invisible area, performing phase adjustment on the viewpoint map corresponding to the initial phase according to the following formula:
(formula image FDA0002681757370000074)
wherein min p_Inv^L is the second left-eye phase; max p_Seq^R is the first right-eye phase; K is the number of viewpoints to be arranged, and k is any one of the K viewpoints;
(formula image FDA0002681757370000075) represents the phase range corresponding to any one viewpoint; and
(formula image FDA0002681757370000081) represents the phase, after the different viewpoints are mapped into the phase range [0, 1] corresponding to the central viewing zone, of the viewpoint map corresponding to any initial phase.
14. The method of claim 3, further comprising:
if the first intersection or the second intersection is an empty set, determining a third target user in the screen viewing area, and performing phase adjustment on the viewpoint map corresponding to the initial phase according to the current phases of both eyes of the third target user and the feature information;
wherein the third target user is the user closest to the screen, or the user closest to the center of the screen within the optimal viewing range.
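Claim 14's fallback choice of a third target user reduces to a nearest-user search. A minimal sketch, assuming users are given as (x, y, z) eye-midpoint positions relative to the screen center with z the perpendicular distance to the screen, and the optimal viewing range as a (z_min, z_max) interval; the claim does not fix this data model:

```python
import math

def pick_third_target_user(users, optimal_range):
    """Sketch of the claim-14 fallback: select the third target user.

    users: list of (x, y, z) positions relative to the screen center.
    optimal_range: (z_min, z_max) of the optimal viewing range.
    Both representations are illustrative assumptions.
    """
    z_min, z_max = optimal_range
    in_range = [u for u in users if z_min <= u[2] <= z_max]
    if in_range:
        # Within the optimal viewing range: closest to the screen center.
        return min(in_range, key=lambda u: math.hypot(u[0], u[1]))
    # Otherwise: simply the user closest to the screen.
    return min(users, key=lambda u: u[2])
```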
15. A naked eye 3D display device based on human eye tracking, comprising:
an initial phase determining module, configured to determine the initial phase of each viewpoint according to the number of viewpoints to be arranged;
a feature information determining module, configured to determine feature information of an original visible area within each viewing angle according to a preset arrangement mode of the viewpoint maps to be arranged, wherein the original visible area is a display area in which the screen can present a naked eye 3D display effect when human eye tracking is not performed; and
a phase adjusting module, configured to perform phase adjustment on the viewpoint map corresponding to the initial phase according to the feature information and the current phases of both eyes of the user relative to the screen;
wherein, if the original visible area is continuous, performing phase adjustment on the viewpoint map corresponding to the initial phase according to the feature information and the current phases of both eyes of the user relative to the screen comprises:
for each user, calculating a relative phase relationship between the left-eye phase and the right-eye phase of the user according to the current phases of both eyes of the user relative to the screen; and
performing phase adjustment on the viewpoint map corresponding to the initial phase according to the relative phase relationship, the feature information, and the maximum and minimum values that satisfy the relative phase relationship among the current left-eye phases and current right-eye phases of all the users;
and wherein determining the initial phase of each viewpoint according to the number of viewpoints to be arranged comprises:
designing a central visible area at the optimal viewing distance, setting the phase of the viewing line segment of the central viewing angle to [0, 1], and determining the initial phase of each viewpoint according to the number of viewpoints;
wherein the optimal viewing distance is the perpendicular distance from the screen.
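The initial-phase determination in claim 15 assigns each of the K viewpoints a phase inside the central segment's [0, 1] range. A minimal sketch, under the assumption (not stated in the claim) that each viewpoint sits at the center of its 1/K-wide sub-range:

```python
def initial_phases(K):
    """Sketch of the initial phase determining module (claim 15).

    The central viewing line segment at the optimal viewing distance is
    given phase [0, 1]; each of the K viewpoints is assumed to sit at
    the center of its 1/K-wide sub-range (an illustrative choice).
    """
    return [(2 * k - 1) / (2 * K) for k in range(1, K + 1)]
```

For K = 4 this places the viewpoints at phases 0.125, 0.375, 0.625, and 0.875.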
16. A naked eye 3D display apparatus based on human eye tracking, characterized in that the apparatus comprises:
one or more processors; and
a storage device for storing one or more programs,
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the naked eye 3D display method based on human eye tracking according to any one of claims 1-14.
17. A computer-readable storage medium for naked eye 3D display based on human eye tracking, having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the naked eye 3D display method based on human eye tracking according to any one of claims 1-14.
CN201810521797.8A 2018-05-28 2018-05-28 Naked eye 3D display method, device, equipment and medium based on human eye tracking Active CN108769664B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810521797.8A CN108769664B (en) 2018-05-28 2018-05-28 Naked eye 3D display method, device, equipment and medium based on human eye tracking


Publications (2)

Publication Number Publication Date
CN108769664A CN108769664A (en) 2018-11-06
CN108769664B true CN108769664B (en) 2020-12-08

Family

ID=64006168

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810521797.8A Active CN108769664B (en) 2018-05-28 2018-05-28 Naked eye 3D display method, device, equipment and medium based on human eye tracking

Country Status (1)

Country Link
CN (1) CN108769664B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112135116A (en) * 2020-08-21 2020-12-25 深圳市立体通科技有限公司 Naked eye 3D display method and intelligent terminal
CN113271452B (en) * 2021-05-17 2023-04-21 京东方科技集团股份有限公司 Multi-view naked eye 3D display device and display method thereof
CN113660480B (en) * 2021-08-16 2023-10-31 纵深视觉科技(南京)有限责任公司 Method and device for realizing looking-around function, electronic equipment and storage medium
CN113689551A (en) * 2021-08-20 2021-11-23 纵深视觉科技(南京)有限责任公司 Three-dimensional content display method, device, medium and electronic equipment
CN114173108B (en) * 2021-09-30 2023-12-12 合肥京东方光电科技有限公司 Control method and device of 3D display panel, computer equipment and storage medium
CN114327346B (en) * 2021-12-27 2023-09-29 北京百度网讯科技有限公司 Display method, display device, electronic apparatus, and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104320647A (en) * 2014-10-13 2015-01-28 深圳超多维光电子有限公司 Three-dimensional image generating method and display device
CN105072431A (en) * 2015-07-28 2015-11-18 上海玮舟微电子科技有限公司 Glasses-free 3D playing method and glasses-free 3D playing system based on human eye tracking
KR20170077331A (en) * 2015-12-28 2017-07-06 전자부품연구원 Arbitrary View Image Generation Method and System
CN107167926A (en) * 2017-06-22 2017-09-15 上海玮舟微电子科技有限公司 A kind of bore hole 3D display methods and device
CN107249125A (en) * 2017-06-22 2017-10-13 上海玮舟微电子科技有限公司 A kind of bore hole 3D display methods and device
CN107454381A (en) * 2017-06-22 2017-12-08 上海玮舟微电子科技有限公司 A kind of bore hole 3D display method and device



Similar Documents

Publication Publication Date Title
CN108769664B (en) Naked eye 3D display method, device, equipment and medium based on human eye tracking
US10715782B2 (en) 3D system including a marker mode
KR102415502B1 (en) Method and apparatus of light filed rendering for plurality of user
EP2659680B1 (en) Method and apparatus for providing mono-vision in multi-view system
KR102121389B1 (en) Glassless 3d display apparatus and contorl method thereof
JP2014045474A (en) Stereoscopic image display device, image processing apparatus, and stereoscopic image processing method
CN102223549A (en) Three-dimensional image display device and three-dimensional image display method
KR20140089860A (en) Display apparatus and display method thereof
Lee et al. Autostereoscopic 3D display using directional subpixel rendering
CN102281454B (en) Stereoscopic image display device
CN102404592A (en) Image processing device and method, and stereoscopic image display device
CN108881893A (en) Naked eye 3D display method, apparatus, equipment and medium based on tracing of human eye
KR101975246B1 (en) Multi view image display apparatus and contorl method thereof
US20140267235A1 (en) Tilt-based look around effect image enhancement method
US20140071237A1 (en) Image processing device and method thereof, and program
Date et al. Highly realistic 3D display system for space composition telecommunication
US20150054928A1 (en) Auto-stereoscopic display apparatus and storage media
US10122987B2 (en) 3D system including additional 2D to 3D conversion
US10212416B2 (en) Multi view image display apparatus and control method thereof
CN108881878B (en) Naked eye 3D display device and method
Lee et al. Eye tracking based glasses-free 3D display by dynamic light field rendering
Wu et al. Design of stereoscopic viewing system based on a compact mirror and dual monitor
CN113660480B (en) Method and device for realizing looking-around function, electronic equipment and storage medium
US20220232201A1 (en) Image generation system and method
US20240121373A1 (en) Image display method and 3d display system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200401

Address after: 215634 north side of Chengang road and west side of Ganghua Road, Jiangsu environmental protection new material industrial park, Zhangjiagang City, Suzhou City, Jiangsu Province

Applicant after: ZHANGJIAGANG KANGDE XIN OPTRONICS MATERIAL Co.,Ltd.

Address before: 201203, room 5, building 690, No. 202 blue wave road, Zhangjiang hi tech park, Shanghai, Pudong New Area

Applicant before: WZ TECHNOLOGY Inc.

GR01 Patent grant