CN114063310A - Light field source viewpoint confirmation method - Google Patents

Light field source viewpoint confirmation method

Info

Publication number
CN114063310A
CN114063310A (application CN202111236891.7A)
Authority
CN
China
Prior art keywords
sub
image
horizontal direction
viewpoints
viewpoint
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111236891.7A
Other languages
Chinese (zh)
Other versions
CN114063310B (en)
Inventor
赵健
李小猛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fengmang Technology Nanjing Co ltd
Original Assignee
Fengmang Technology Nanjing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fengmang Technology Nanjing Co ltd filed Critical Fengmang Technology Nanjing Co ltd
Priority to CN202111236891.7A priority Critical patent/CN114063310B/en
Publication of CN114063310A publication Critical patent/CN114063310A/en
Application granted granted Critical
Publication of CN114063310B publication Critical patent/CN114063310B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • G02B30/27 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0012 Optical design, e.g. procedures, algorithms, optimisation routines
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • G02B30/27 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays
    • G02B30/29 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays characterised by the geometry of the lenticular array, e.g. slanted arrays, irregular arrays or arrays of varying shape or size
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/302 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/305 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/349 Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Geometry (AREA)

Abstract

The invention relates to the technical field of naked-eye 3D display and discloses a light field film source viewpoint confirmation method, which comprises the following steps. S1: angle adjustment, in which the rectangular optical device is tilted by a certain angle, attached closely to the display screen and strictly aligned, and the number of cylindrical lenses spanned by a single sub-image of the image source in the horizontal direction is N (N > 2). S2: parameter setting, in which the display sub-pixel width w, the inclination angle θ, the cylindrical lens pitch width P, the number X of pixels covered by a single lens in a single row, the number N of cylindrical lenses spanned by a single sub-image in the horizontal direction, and the number M of sub-pixels of a single sub-image in the horizontal direction are set. S3: the number of viewpoints is calculated. By analyzing the number N of cylindrical lenses spanned by a single sub-image in the horizontal direction, the invention determines the minimum repeatable unit, reduces the number of parallax images and improves the processing speed while ensuring monocular definition, binocular parallax and a large depth of field of the stereoscopic display.

Description

Light field source viewpoint confirmation method
Technical Field
The invention relates to the technical field of naked-eye 3D display, and in particular to a light field film source viewpoint confirmation method.
Background
In recent years, three-dimensional imaging and display technologies have received more and more attention. Integral imaging based on a microlens array stands out because it provides full parallax and continuous viewpoints and requires neither viewing glasses nor special illumination, and it has gradually developed into the most promising autostereoscopic display technology.
Next-generation high-resolution three-dimensional display technology can faithfully present the positions, angles, colors, detail features and other rich information of all objects in a three-dimensional scene, offers continuous viewing angles and a sense of spatial depth, and better matches the viewing habits of the human eye. However, the data volume required for display is 2-3 orders of magnitude greater than in existing systems. This enormous amount of information places higher demands on high-speed characterization of complex scenes, real-time light-wave-field description and the space-bandwidth product of the reproduction system. Image source processing methods applicable to three-dimensional display of such vast amounts of information therefore become particularly important.
Super-multi-viewpoint stereoscopic display can increase the angular resolution and improve the stereoscopic imaging effect, but its main problems are that a large number of parallax images are needed, the parallax interval is small, and image source acquisition is difficult. Patent CN103813153A adopts a method of weighting adjacent viewpoints, but the number of viewpoints and the number of parallax images remain equal. In the cylindrical-lens-to-pixel correspondence proposed by Philips in documents and patents such as "Characterization and optimization of 3D-LCD module design", a single sub-image may span several lenses, but the number of viewpoints and the number of parallax images are still equal. In papers such as "1D integral imaging based on parallel images 'virtual retrieval'", Sichuan University proposed synthesizing multiple viewpoints from sparse viewpoints, but the number of sparse viewpoints is still not determined in combination with the number of cycles N; there is therefore currently no scientific method for determining the number of sparse viewpoints.
Disclosure of Invention
Technical problem to be solved
Aiming at the defects of the prior art, the invention provides a light field film source viewpoint confirmation method, which solves the problems that the number of sparse viewpoints for super-multi-viewpoint stereoscopic display cannot be confirmed and that the processing speed is too slow to meet people's needs.
(II) technical scheme
In order to achieve the purpose, the invention provides the following technical scheme:
a light field source viewpoint confirming method comprises the following steps:
s1: adjusting the angle, wherein the rectangular optical device is tightly attached to the display screen and strictly aligned after being inclined by a certain angle, and the number of cylindrical lenses spanned by a single sub-image in the image source in the horizontal direction is N (N is more than 2);
s2: setting parameters, setting the width w of a sub-pixel of the display, the inclination angle a, the pitch width P of the cylindrical lens, the number X of the pixels covered by a single row of the single lens, the number N of the cylindrical lenses spanned by a single sub-image in the horizontal direction, and the number M of the sub-pixels of the single sub-image in the horizontal direction, wherein the following relations are satisfied: n is M/X, namely in the horizontal direction, and the single-row viewpoint arrangement period is equal to the product of the number of sub-pixels covered by the single lens in the horizontal direction and the number of cylindrical lenses spanned by the single sub-image;
s3: calculating the number of viewpoints, determining N, and starting from a first viewpoint, wherein every N viewpoints are from the same disparity map, namely, no disparity exists between the N viewpoints, and the required total number Q of disparity maps meets the formula:
Figure BDA0003317970230000021
ntot is the total number of viewpoints, ceil is an upward rounding symbol, the total number of effective viewpoints is set at the optimal viewing position, and the sub-image interval corresponding to the horizontal binocular interval of 6.5cm is c, so that the final total number of effective viewpoints can satisfy the formula: q _ final equals mod (Q/c), and the number of views is calculated by a formula.
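As an illustration of steps S2 and S3, the following Python sketch strings the stated relations together (equation (2): N*X = M, equation (3): X = P/(w*cosθ), equation (6): Q = ceil(Ntot/N)). The function names and numeric inputs are hypothetical and chosen only to exercise the formulas; they are not taken from the patent's embodiments, and the Q_final step is omitted.

    import math

    def viewpoint_parameters(w, theta_deg, P, M):
        """Step S2: derive X and N from the display and lens parameters.

        w         : sub-pixel width (same length unit as P)
        theta_deg : inclination angle of the lenticular sheet, in degrees
        P         : cylindrical-lens pitch width
        M         : sub-pixels occupied by a single sub-image per row
        """
        X = P / (w * math.cos(math.radians(theta_deg)))  # Eq. (3); may be non-integer
        N = M / X                                        # Eq. (2): N * X = M
        return X, N

    def sparse_map_count(Ntot, N):
        """Step S3 / Eq. (6): Q = ceil(Ntot / N) disparity maps are required."""
        return math.ceil(Ntot / N)

    # Hypothetical parameters, chosen only to exercise the formulas.
    X, N = viewpoint_parameters(w=0.02, theta_deg=18.43, P=0.114, M=18)
    Q = sparse_map_count(Ntot=34, N=round(N))
    print(f"X = {X:.3f}, N = {N:.3f}, Q = {Q}")  # roughly X = 6.0, N = 3.0, Q = 12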
As a further aspect of the present invention, in S2 the number X of sub-pixels covered by a single lens satisfies the relation X = P/(w*cosθ); that is, X equals the horizontal pitch width of the cylindrical lens divided by the width of a single sub-pixel, and X may be a non-integer.
Further, regarding the number N of cylindrical lenses spanned by a single sub-image in the horizontal direction in S2, the cylindrical lens array can be strictly aligned with the display with N as its period, where N is an integer greater than 2.
Based on the foregoing solution, regarding the number M of sub-pixels of a single sub-image in the horizontal direction in S2: in the image source viewpoint distribution, each row is arranged periodically with a period equal to M, the number of sub-pixels occupied by the sub-image in that row.
Further, the total number Q of disparity maps in S3 is the number of sparse viewpoints; within one cylindrical-lens period N, the N viewpoints whose corresponding positions under each cylindrical lens are closest come from the same disparity map.
(III) advantageous effects
Compared with the prior art, the light field film source viewpoint confirmation method provided by the invention has the following beneficial effects:
1. By analyzing the number N of cylindrical lenses spanned by a single sub-image in the horizontal direction, the invention determines the minimum repeatable unit, reduces the number of parallax images and improves the processing speed while ensuring monocular definition, binocular parallax and a large depth of field of the stereoscopic display.
2. The cylindrical lens array can be strictly aligned with the display with N as its period, where N is an integer greater than 2, which improves the depth of field of the stereoscopic display.
Drawings
Fig. 1 is a schematic structural diagram of the light field film source viewpoint confirmation method according to the present invention.
Fig. 2 is a schematic diagram of the depth-of-field analysis principle of the light field film source viewpoint confirmation method according to the present invention.
Fig. 3 is a schematic diagram of the pixel correspondence in embodiment 1 of the light field film source viewpoint confirmation method according to the present invention.
Fig. 4 is a schematic diagram of the pixel correspondence in embodiment 2 of the light field film source viewpoint confirmation method according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The depth value of the depth of field of the stereoscopic display satisfies the following relationship:
depth = 4*λ*(L/P)^2    (1)
where L is the lens imaging distance, P is the lens pitch width, and λ is the wavelength. As the formula shows, the depth value increases with L and λ and decreases as P increases. Since L and λ are fixed values, the depth of field of the stereoscopic display depends only on the lens pitch width P.
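A minimal sketch of equation (1), with hypothetical values (all lengths in millimetres), shows how strongly the depth value depends on the pitch width P:

    def depth_of_field(L, P, lam):
        """Eq. (1): depth = 4 * lambda * (L / P) ** 2."""
        return 4 * lam * (L / P) ** 2

    # Hypothetical values: L = 500 mm viewing distance, wavelength 550 nm expressed in mm.
    L, lam = 500.0, 550e-6
    print(depth_of_field(L, P=0.2, lam=lam))   # wider pitch gives a smaller depth value
    print(depth_of_field(L, P=0.1, lam=lam))   # halving P quadruples the depth value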
Set the sub-pixel width w, the inclination angle θ, the cylindrical lens pitch width P, the number X of pixels covered by a single lens in a single row, the number N of cylindrical lenses spanned by a single sub-image in the horizontal direction, and the number M of sub-pixels of a single sub-image in the horizontal direction; they satisfy the following relations:
N*X=M (2)
the number X of the sub-pixels covered by the single lens is characterized in that: satisfy the relation:
X=P/(w*cosθ) (3)
substituting formula (3) into formula (2) can obtain:
N*P=(w*cosθ)*M (4)
the width w of the sub-pixel is a fixed value, and if the inclination angle theta of the cylindrical lens and the number M of the sub-pixels of a single sub-image in the horizontal direction are both fixed values, if P is reduced, the number N of cycles can be increased.
After determining the pitch width P and the number of cycles N, according to the viewpoint arrangement formula:
[viewpoint arrangement formula, Equation (5)]
the viewpoint distribution table for naked-eye three-dimensional display can be completed.
In the traditional method, however many total viewpoints Ntot there are, that many disparity maps are required, which makes practical operation relatively difficult. In addition, in the super-multi-viewpoint case the observer's single eye receives image information from more than one viewpoint at the optimal viewing distance L. If the parallax between these viewpoints is too large, monocular crosstalk results and the viewing effect deteriorates. In view of these considerations, the present invention proposes obtaining the sparse viewpoint number from the number of cycles N.
After N is determined, counting from the first viewpoint, every N consecutive viewpoints come from the same disparity map; in other words, there is no disparity among these N viewpoints. The total number Q of required disparity maps satisfies the formula:
Q = ceil(Ntot / N)    (6)
where Ntot is the total number of viewpoints and ceil denotes rounding up.
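The rule that every N consecutive viewpoints share one disparity map can be written as a simple index mapping. The sketch below uses hypothetical numbers (Ntot = 20, N = 3) and a 1-based indexing convention, which is an assumption made only for illustration:

    import math

    def disparity_map_index(viewpoint, N):
        """Viewpoints 1..N map to disparity map 1, viewpoints N+1..2N to map 2, and so on."""
        return math.ceil(viewpoint / N)

    Ntot, N = 20, 3                  # hypothetical totals
    Q = math.ceil(Ntot / N)          # Eq. (6): 7 sparse disparity maps
    groups = {}
    for v in range(1, Ntot + 1):
        groups.setdefault(disparity_map_index(v, N), []).append(v)
    print(Q)          # 7
    print(groups[1])  # [1, 2, 3] -- no parallax among these viewpoints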
Example 1
Referring to fig. 1 to 3, a light field film source viewpoint confirmation method includes the following steps:
the pixel distribution shown in fig. 3 can be obtained by setting the display subpixel width w to 0.0191mm, the inclination angle a to 18.43 °, when the number N of cylindrical lenses spanned by a single sub-image in the horizontal direction is 3, the cylindrical lens pitch width P to 0.1mm, the number X of single-lens single-row covered pixels to 5.667, and the number M of sub-pixels of a single sub-image in the horizontal direction to 17, according to equation (5).
In fig. 3, since N = 3, according to equation (6) the total number of sparse viewpoints should be at least 7. The dark-gray viewpoints 1, 2 and 3 are the 3 viewpoints whose corresponding positions within one lenticular-lens period are closest; when the pixels are filled, they come from the same sparse disparity map. Similarly, the light-gray viewpoints 4, 5 and 6 come from the same sparse disparity map, and the same holds for the remaining viewpoints.
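A quick numeric check of the Example 1 parameters against equation (2), written as a short sketch (the tolerance value is an arbitrary choice):

    X, M, N = 5.667, 17, 3
    assert abs(N * X - M) < 0.01  # Eq. (2): 3 * 5.667 = 17.001, consistent with M = 17
    print("Example 1 parameters satisfy N * X = M to within rounding")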
Example 2
Referring to fig. 1, fig. 2 and fig. 4, a light field film source viewpoint confirmation method includes the following steps:
the display sub-pixel width w is set to be 0.0191mm, the inclination angle a is set to be 9.4623 degrees, when the number of cylindrical lenses which a single sub-image crosses in the horizontal direction is set to be 3, the number of pixels which the single lens covers in a single row is set to be 5.3333, the number of sub-pixels of the single sub-image in the horizontal direction is set to be M16, and the total viewpoint number is set to be 34. The pixel distribution shown in fig. 4 can be obtained according to equation (5).
In fig. 4, since N = 3, the total number of sparse viewpoints should be at least 12 according to equation (6). Viewpoints 1, 2 and 3 are the 3 viewpoints whose corresponding positions within one cylindrical-lens period are closest; when the pixels are filled, they come from the same sparse disparity map. Similarly, the light-gray viewpoints 4, 5 and 6 come from the same sparse disparity map, and the same holds for the remaining viewpoints.
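The corresponding check for Example 2 combines equations (2) and (6); rounding N to the nearest integer before applying equation (6) is an assumption made only for this check:

    import math

    X, M, Ntot = 5.3333, 16, 34
    N = M / X                          # Eq. (2): 16 / 5.3333 is approximately 3
    Q = math.ceil(Ntot / round(N))     # Eq. (6): ceil(34 / 3) = 12
    print(round(N), Q)                 # 3 12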
In the description herein, it is noted that relational terms such as first and second, and the like, are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (5)

1. A light field film source viewpoint confirmation method, characterized by comprising the following steps:
s1: adjusting the angle, wherein the rectangular optical device is tightly attached to the display screen and strictly aligned after being inclined by a certain angle, and the number of cylindrical lenses spanned by a single sub-image in the image source in the horizontal direction is N (N is more than 2);
s2: setting parameters, setting the width w of a sub-pixel of the display, the inclination angle a, the pitch width P of the cylindrical lens, the number X of the pixels covered by a single row of the single lens, the number N of the cylindrical lenses spanned by a single sub-image in the horizontal direction, and the number M of the sub-pixels of the single sub-image in the horizontal direction, wherein the following relations are satisfied: n is M/X, namely in the horizontal direction, and the single-row viewpoint arrangement period is equal to the product of the number of sub-pixels covered by the single lens in the horizontal direction and the number of cylindrical lenses spanned by the single sub-image;
s3: calculating the number of viewpoints, determining N, and starting from a first viewpoint, wherein every N viewpoints are from the same disparity map, namely, no disparity exists between the N viewpoints, and the required total number Q of disparity maps meets the formula:
Figure FDA0003317970220000011
ntot is the total number of viewpoints, ceil is an upward rounding symbol, the total number of effective viewpoints is set at the optimal viewing position, and the sub-image interval corresponding to the horizontal binocular interval of 6.5cm is c, so that the final total number of effective viewpoints can satisfy the formula: q _ final equals mod (Q/c), and the number of views is calculated by a formula.
2. The light field film source viewpoint confirmation method according to claim 1, wherein the number X of sub-pixels covered by a single lens in S2 satisfies the relation X = P/(w*cosθ); that is, X equals the horizontal pitch width of the cylindrical lens divided by the width of a single sub-pixel, and X may be a non-integer.
3. The light field film source viewpoint confirmation method according to claim 1, wherein, regarding the number N of cylindrical lenses spanned by a single sub-image in the horizontal direction in S2, the cylindrical lens array can be strictly aligned with the display with N as its period, and N is an integer greater than 2.
4. The light field film source viewpoint confirmation method according to claim 1, wherein, regarding the number M of sub-pixels of a single sub-image in the horizontal direction in S2, in the image source viewpoint distribution each row is arranged periodically with a period equal to M, the number of sub-pixels occupied by the sub-image in that row.
5. The light field film source viewpoint confirmation method according to claim 1, wherein the total number Q of disparity maps in S3 is the number of sparse viewpoints, and within the cylindrical-lens period N, the N viewpoints whose corresponding positions under each cylindrical lens are closest come from the same disparity map.
CN202111236891.7A 2021-10-24 2021-10-24 Light field slice source viewpoint confirmation method Active CN114063310B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111236891.7A CN114063310B (en) 2021-10-24 2021-10-24 Light field slice source viewpoint confirmation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111236891.7A CN114063310B (en) 2021-10-24 2021-10-24 Light field slice source viewpoint confirmation method

Publications (2)

Publication Number Publication Date
CN114063310A true CN114063310A (en) 2022-02-18
CN114063310B CN114063310B (en) 2023-11-24

Family

ID=80235259

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111236891.7A Active CN114063310B (en) 2021-10-24 2021-10-24 Light field slice source viewpoint confirmation method

Country Status (1)

Country Link
CN (1) CN114063310B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101175223A (en) * 2007-07-10 2008-05-07 天津大学 Multi-view point stereoscopic picture synthesizing method for LCD free stereoscopic display device based on optical grating
US20090116108A1 (en) * 2004-10-18 2009-05-07 Xavier Levecq Lenticular Autostereoscopic Display Device and Method, and Associated Autostereoscopic Image Synthesizing Method
CN102149003A (en) * 2011-04-26 2011-08-10 黑龙江省四维影像数码科技有限公司 Method for synthesizing multi-viewpoint stereo image based on prism grating
KR20130015936A (en) * 2011-08-05 2013-02-14 엘지전자 주식회사 An apparatus and a method for displaying a 3-dimensional image
CN103248908A (en) * 2013-04-28 2013-08-14 四川大学 Method for eliminating visual area jumping of multi-view-point autostereoscopic display and adding view points
CN103813153A (en) * 2014-01-27 2014-05-21 北京乐成光视科技发展有限公司 Weighted sum based naked eye three-dimensional (3D) multi-view image synthesis method
CN104811686A (en) * 2015-04-14 2015-07-29 西安交通大学 Hardware implementation method for floating-point type multi-view naked-eye three-dimensional synthetic image
CN105898286A (en) * 2016-04-11 2016-08-24 北京邮电大学 Three-dimensional image display device
CN105911712A (en) * 2016-06-30 2016-08-31 北京邮电大学 Multi-view-point liquid crystal display LCD naked-eye 3D (Three Dimensional) display method and device
CN113395510A (en) * 2021-05-21 2021-09-14 深圳英伦科技股份有限公司 Three-dimensional display method and system, computer-readable storage medium, and program product

Also Published As

Publication number Publication date
CN114063310B (en) 2023-11-24

Similar Documents

Publication Publication Date Title
CN103988504B (en) The image processing equipment rendered for sub-pixel and method
CN102164298B (en) Method for acquiring element image based on stereo matching in panoramic imaging system
KR101265893B1 (en) Controlling the angular extent of autostereoscopic viewing zones
CN102801999B (en) Synthetizing algorithm based on naked eye three-dimensional displaying technology
CN102932659B (en) Method for removing integral imaging three-dimensional displaying crosstalk images
CN102209254B (en) One-dimensional integrated imaging method and device
CN103297796A (en) Double-vision 3D (three-dimensional) display method based on integrated imaging
CN105376558B (en) Multi-view image shows equipment and its control method
CN107105216B (en) A kind of 3 d light fields display device of continuous parallax based on pinhole array, wide viewing angle
CN208257981U (en) A kind of LED naked-eye 3D display device based on sub-pixel
CN102621702B (en) Method and system for naked eye three dimensional (3D) image generation during unconventional arrangement of liquid crystal display pixels
CN106604018A (en) 3D display apparatus and control method thereof
CN111781737B (en) High-resolution double-view 3D display device and method
CN110913201B (en) Light field display structure and synthetic image coding method
CN212276124U (en) Double-vision 3D display device based on polarization array
Yang et al. Demonstration of a large-size horizontal light-field display based on the LED panel and the micro-pinhole unit array
KR101957243B1 (en) Multi view image display apparatus and multi view image display method thereof
Park et al. Viewpoint vector rendering for efficient elemental image generation
CN212276123U (en) High-resolution double-vision 3D display device
CN111781734B (en) Dual-view 3D display device and method based on dual display screens
CN102447936B (en) Method for generating LED (Light-Emitting Diode) large-screen stereoscopic image file
CN111193921B (en) LED screen one-dimensional integrated imaging display method based on combined discrete grating
CN202565397U (en) 3D video monitoring system allowing videos to be watched with naked eyes
CN116708746A (en) Naked eye 3D-based intelligent display processing method
CN103676176B (en) A kind of 3 d display device and its imaging method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant