WO2019156807A1 - Plenoptic cellular imaging system - Google Patents

Plenoptic cellular imaging system

Info

Publication number
WO2019156807A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
layer
sensor
cells
facet
Application number
PCT/US2019/014668
Other languages
French (fr)
Inventor
Mark A. LAMKIN
Kyle Martin RINGGENBERG
Jordan David LAMKIN
Original Assignee
Lockheed Martin Corporation
Application filed by Lockheed Martin Corporation filed Critical Lockheed Martin Corporation
Priority to JP2020542664A priority Critical patent/JP7077411B2/en
Priority to CA3090661A priority patent/CA3090661C/en
Priority to AU2019218741A priority patent/AU2019218741B2/en
Priority to ES19704502T priority patent/ES2953632T3/en
Priority to MYPI2020004059A priority patent/MY197937A/en
Priority to EP19704502.4A priority patent/EP3750000B1/en
Priority to CN201980019851.1A priority patent/CN111902762B/en
Priority to KR1020207025315A priority patent/KR102479029B1/en
Priority to SG11202007486YA priority patent/SG11202007486YA/en
Publication of WO2019156807A1 publication Critical patent/WO2019156807A1/en


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/388Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0075Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for altering, e.g. increasing, the depth of field or depth of focus
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00Simple or compound lenses
    • G02B3/0006Arrays
    • G02B3/0037Arrays characterized by the distribution or form of lenses
    • G02B3/0056Arrays characterized by the distribution or form of lenses arranged along two different directions in a plane, e.g. honeycomb arrangement of lenses
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • G02B30/27Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/557Depth or shape recovery from multiple images from light fields, e.g. from plenoptic cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2213/00Details of stereoscopic systems
    • H04N2213/001Constructional or mechanical details

Abstract

In one embodiment, an electronic display assembly includes a microlens layer and a pixel array layer. The microlens layer includes a plurality of cells. The pixel array layer is adjacent to the microlens layer. The pixel array layer includes a plurality of pixels. Each cell of the plurality of cells includes a transparent lenslet and one or both of: a plurality of opaque walls configured to prevent light from bleeding into adjacent cells; and a filter layer on one end of each cell of the plurality of cells. The filter layer is configured to limit light passing through the filter layer to a specific incident angle.

Description

PLENOPTIC CELLULAR IMAGING SYSTEM
TECHNICAL FIELD
[1] This disclosure relates generally to light field displays and cameras, and more particularly to plenoptic cellular imaging systems.
BACKGROUND
[2] Electronic displays are utilized in a variety of applications. For example, displays are used in smartphones, laptop computers, and digital cameras. Some devices, such as smartphones and digital cameras, may include an image sensor in addition to an electronic display. While some cameras and electronic displays separately capture and reproduce light fields, light field displays and light field cameras are generally not integrated with one another.
SUMMARY OF PARTICULAR EMBODIMENTS
[3] In one embodiment, an electronic display assembly includes a microlens layer and a pixel array layer. The microlens layer includes a plurality of cells. The pixel array layer is adjacent to the microlens layer. The pixel array layer includes a plurality of pixels. Each cell of the plurality of cells includes a transparent lenslet and one or both of: a plurality of opaque walls configured to prevent light from bleeding into adjacent cells; and a filter layer on one end of each cell of the plurality of cells. The filter layer is configured to limit light passing through the filter layer to a specific incident angle. [4] The present disclosure presents several technical advantages. Some embodiments provide a complete and accurate recreation of a target light field while remaining lightweight and comfortable to wear for a user. Some embodiments provide a thin electronic system which offers both opacity and controllable unidirectional emulated transparency, as well as digital display capabilities such as virtual reality (VR), augmented reality (AR), and mixed reality (MR). Some embodiments provide a direct sensor-to-display system that utilizes a direct association of input pixels to corollary output pixels to circumvent the need for image transformation. This reduces the complexity, cost, and power requirements for some systems. Some embodiments provide in-layer signal processing configurations that provide for local, distributed processing of large quantities of data (e.g., 160k of image data or more), thereby circumventing bottlenecks as well as performance, power, and transmission line issues associated with existing solutions. Some embodiments utilize microlens layers with arrays of plenoptic cells to accurately capture and display a volume of light to a viewer. The plenoptic cells include opaque cell walls to eliminate optical cross talk between cells, thereby improving the accuracy of the replicated light field.
[5] Some embodiments provide three-dimensional electronics by geodesic faceting. In such embodiments, a flexible circuit board with an array of small, rigid surfaces (e.g., display and/or sensor facets) may be formed into any 3D shape, which is especially useful to accommodate the narrow radii of curvature (e.g., 30-60 mm) necessary for head-mounted near-eye wrapped displays. Some embodiments provide distributed multi-screen arrays for high density displays. In such embodiments, an array of small, high-resolution micro displays (e.g., display facets) of custom sizes and shapes is formed and then assembled on a larger, flexible circuit board that may then be formed into a 3D shape (e.g., a semispherical surface). Each micro display may act independently of any other display, thereby providing a large array of many high-resolution displays with unique content on each, such that the whole assembly together forms essentially a single extremely high-resolution display. Some embodiments provide a distributed multi-aperture camera array. Such embodiments provide an array of small image sensors (e.g., sensor facets) of custom sizes and shapes, all of which are assembled on a larger, flexible circuit board that is then formed to a 3D (e.g., semispherical) shape. Each discrete image sensor may act independently of any other image sensor in order to provide a large array of many apertures capturing unique content on each, such that the whole assembly essentially becomes a seamless, very high resolution, multi-node camera.
[6] Other technical advantages will be readily apparent to one skilled in the art from FIGURES 1A through 42, their descriptions, and the claims. Moreover, while specific advantages have been enumerated above, various embodiments may include all, some, or none of the enumerated advantages.
BRIEF DESCRIPTION OF THE DRAWINGS
[7] For a more complete understanding of the present disclosure and its advantages, reference is now made to the following description, taken in conjunction with the accompanying drawings, in which: [8] FIGURES 1A-1C illustrate a reference scene with various three-dimensional (3D) objects and various viewing positions, according to certain embodiments;
[9] FIGURES 2A-2C illustrate viewing the 3D objects of FIGURES 1A-1C through a transparent panel, according to certain embodiments;
[10] FIGURES 3A-3C illustrate viewing the 3D objects of FIGURES 1A-1C through a camera image panel, according to certain embodiments;
[11] FIGURES 4A-4C illustrate viewing the 3D objects of FIGURES 1A-1C through an emulated-transparency electronic panel, according to certain embodiments;
[12] FIGURES 5A-5C illustrate viewing the 3D objects of FIGURES 1A-1C through the camera image panel of FIGURES 3A-3C from an alternate angle, according to certain embodiments;
[13] FIGURES 6A-6C illustrate viewing the 3D objects of FIGURES 1A-1C through the emulated-transparency electronic panel of FIGURES 4A-4C from an alternate angle, according to certain embodiments;
[14] FIGURE 7 illustrates a cut-away view of an emulated transparency assembly, according to certain embodiments;
[15] FIGURE 8 illustrates an exploded view of the emulated transparency assembly of FIGURE 7, according to certain embodiments;
[16] FIGURE 9 illustrates a method of manufacturing the emulated transparency assembly of FIGURE 7, according to certain embodiments;
[17] FIGURE 10 illustrates a direct sensor-to-display system that may be used by the emulated transparency assembly of FIGURE 7, according to certain embodiments; [18] FIGURE 11 illustrates a method of manufacturing the direct sensor-to-display system of FIGURE 10, according to certain embodiments;
[19] FIGURES 12-13 illustrate various in-layer signal processing configurations that may be used by the emulated transparency assembly of FIGURE 7, according to certain embodiments;
[20] FIGURE 14 illustrates a method of manufacturing the in-layer signal processing systems of FIGURES 12-13, according to certain embodiments;
[21] FIGURE 15 illustrates a plenoptic cell assembly that may be used by the emulated transparency assembly of FIGURE 7, according to certain embodiments;
[22] FIGURE 16 illustrates a cross section of a portion of the plenoptic cell assembly of FIGURE 15, according to certain embodiments;
[23] FIGURES 17A-17C illustrate cross sections of a portion of the plenoptic cell assembly of FIGURE 15 with various incoming fields of light, according to certain embodiments;
[24] FIGURES 18A-18B illustrate a method of manufacturing the plenoptic cell assembly of FIGURE 15, according to certain embodiments;
[25] FIGURES 19A-19B illustrate another method of manufacturing the plenoptic cell assembly of FIGURE 15, according to certain embodiments;
[26] FIGURES 20-21 illustrate a plenoptic cell assembly that may be manufactured by the methods of FIGURES 18A-19B, according to certain embodiments; [27] FIGURE 22 illustrates a flexible circuit board that may be used by the emulated transparency assembly of FIGURE 7, according to certain embodiments;
[28] FIGURE 23 illustrates additional details of the flexible circuit board of FIGURE 22, according to certain embodiments;
[29] FIGURE 24 illustrates a data flow through the flexible circuit board of FIGURE 22, according to certain embodiments;
[30] FIGURE 25 illustrates a method of manufacturing an electronic assembly using the flexible circuit board of FIGURE 22, according to certain embodiments;
[31] FIGURE 26 illustrates a cut-away view of a curved multi-display array, according to certain embodiments;
[32] FIGURE 27 illustrates an exploded view of the curved multi-display array of FIGURE 26, according to certain embodiments;
[33] FIGURES 28-29 illustrate logic facets and display facets of the curved multi-display array of FIGURE 26, according to certain embodiments;
[34] FIGURE 30 illustrates a back side of the flexible circuit board of FIGURE 22, according to certain embodiments;
[35] FIGURE 31 illustrates a data flow through the flexible circuit board of FIGURE 30, according to certain embodiments;
[36] FIGURE 32 illustrates the flexible circuit board of FIGURE 30 that has been formed into a semispherical shape, according to certain embodiments;
[37] FIGURE 33 illustrates a data flow through the flexible circuit board of FIGURE 32, according to certain embodiments; [38] FIGURE 34 illustrates an array of logic facets that have been formed into a semispherical shape, according to certain embodiments;
[39] FIGURE 35 illustrates communications between the logic facets of FIGURE 34, according to certain embodiments;
[40] FIGURE 36 illustrates a method of manufacturing the curved multi-display array of FIGURE 26, according to certain embodiments;
[41] FIGURE 37 illustrates a cut-away view of a curved multi-camera array, according to certain embodiments;
[42] FIGURES 38-39 illustrate exploded views of the curved multi-camera array of FIGURE 37, according to certain embodiments;
[43] FIGURE 40 illustrates a back view of the flexible circuit board of FIGURE 32, according to certain embodiments;
[44] FIGURE 41 illustrates a data flow through the flexible circuit board of FIGURE 40, according to certain embodiments; and
[45] FIGURE 42 illustrates a method of manufacturing the curved multi-camera array of FIGURE 37, according to certain embodiments.
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[46] Electronic displays are utilized in a variety of applications. For example, displays are used in smartphones, laptop computers, and digital cameras. Some devices, such as smartphones and digital cameras, may include an image sensor in addition to an electronic display. Devices with displays and image sensors, however, are generally limited in their ability to accurately capture and display the full photonic environment. [47] To address problems and limitations associated with existing electronic displays, embodiments of the disclosure provide various electronic assemblies for capturing and displaying light fields. FIGURES 1A-9 are directed to display assemblies with electronically emulated transparency, FIGURES 10-11 are directed to direct camera-to-display systems, FIGURES 12-14 are directed to in-layer signal processing, FIGURES 15-21 are directed to plenoptic cellular imaging systems, FIGURES 22-25 are directed to three-dimensional (3D) electronics distribution by geodesic faceting, FIGURES 26-36 are directed to distributed multi-screen arrays for high density displays, and FIGURES 37-42 are directed to distributed multi-aperture camera arrays.
[48] To facilitate a better understanding of the present disclosure, the following examples of certain embodiments are given. The following examples are not to be read to limit or define the scope of the disclosure. Embodiments of the present disclosure and its advantages are best understood by referring to FIGURES 1A-42, where like numbers are used to indicate like and corresponding parts.
[49] FIGURES 1A-9 illustrate various aspects of an assembly with electronically emulated transparency, according to certain embodiments. In general, the electronic assembly illustrated in detail in FIGURES 7-8 may be used in different applications to provide features such as virtual reality (VR), augmented reality (AR), and mixed reality (MR). For VR applications, a digital display is required which can completely replace a view of the real world, similar to how a standard computer monitor blocks the view of the scene behind it. However, for AR applications, a digital display is required which can overlay data on top of that view of the real world, such as a pilot's heads-up display in a modern cockpit. MR applications require a combination of both. Typical systems used to provide some or all of these features are not desirable for a number of reasons. For example, typical solutions do not provide an accurate or complete recreation of a target light field. As another example, existing solutions are typically bulky and not comfortable for users.
[50] To address problems and limitations with existing electronic displays, embodiments of the disclosure provide a thin electronic system which offers both opacity and controllable unidirectional emulated transparency, as well as digital display capabilities. From one side the surface appears opaque, but from the opposite side the surface can appear fully transparent, appear fully opaque, act as a digital display, or any combination of these. In some embodiments, simultaneous plenoptic sensing and display technologies are combined within a single layered structure to form what appears to be a unidirectional visually transparent surface. The system may include multiple layers of electronics and optics for the purpose of artificially recreating transparency that may be augmented and/or digitally controlled. Individual image sensor pixels on one side may be arranged spatially to match the positions of display pixels on the opposite side of the assembly. In some embodiments, all electronic driving circuitry as well as some display logic circuitry may be sandwiched between the sensor layer and display layer, and each sensor pixel's output signal may be channeled through the circuitry to the corresponding display pixel on the opposite side. In some embodiments, this centrally-processed signal is aggregated with the incoming signal from the plenoptic imaging sensor array on the opposite side, and is handled according to the following modes of operation. In VR mode, the external video feed overrides the camera data, completely replacing the user's view of the outside world with the incoming view from the video. In AR mode, the external video feed is overlaid on the camera data, resulting in a combined view of both the external world and the view from the video (e.g., the video data is simply added to the scene). In MR mode, the external video feed is mixed with the camera data, allowing virtual objects to appear to interact with actual objects in the real world, altering the virtual content to make it appear integrated with the actual environment through object occlusion, lighting, etc.
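The mode handling described above can be summarized in a short sketch. The following Python fragment is illustrative only: the array shapes, the [0, 1] value range, and the per-pixel occlusion mask used for the MR case are assumptions made for exposition, not the patent's signal path.

    import numpy as np

    def composite(camera, video, mode, occlusion_mask=None):
        """Blend the camera light-field data with an external video feed.

        camera, video: float arrays in [0, 1] with identical shapes.
        occlusion_mask: boolean array (same shape, or broadcastable)
        marking pixels where virtual content should win; an assumption
        used only to illustrate MR-style object occlusion.
        """
        if mode == "VR":
            # External feed overrides the camera data entirely.
            return video
        if mode == "AR":
            # External feed is overlaid (added) onto the camera data.
            return np.clip(camera + video, 0.0, 1.0)
        if mode == "MR":
            # Virtual content is mixed with the camera data; the mask
            # decides, per pixel, whether real or virtual content wins.
            return np.where(occlusion_mask, video, camera)
        raise ValueError("unknown mode: " + mode)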
[51] Some embodiments combine stacked transparent high dynamic range (HDR) sensor and display pixels into a single structure, with sensor pixels on one side of the assembly and display pixels on the other, and with pixel-for-pixel alignment between camera and display. Both the sensor and display pixel arrays may be focused by groups of microlenses to capture and display four-dimensional light fields. This means that the complete view of the real world is captured on one side of the assembly and electronically reproduced on the other, allowing for partial or complete alteration of the incoming image while maintaining image clarity, luminance, and enough angular resolution for the display side to appear transparent, even when viewed at oblique angles.
[52] FIGURES 1A-6C are provided to illustrate the differences between electronically emulated transparency provided by embodiments of the disclosure and typical camera images (such as through a camera viewfinder or using a smartphone to display its current camera image). FIGURES 1A-1C illustrate a reference scene with various 3D objects 110 (i.e., 110A-C) and a frontal viewing position, according to certain embodiments. FIGURE 1A is a top view of an arrangement of 3D objects 110 and a frontal viewing direction of 3D objects 110. FIGURE 1B is a perspective view of the same arrangement of 3D objects 110 and frontal viewing direction as FIGURE 1A. FIGURE 1C is the resulting front view of 3D objects 110 from the position illustrated in FIGURES 1A and 1B. As can be seen, the view in FIGURE 1C of 3D objects 110 is a normal, expected view of 3D objects 110 (i.e., the view of 3D objects 110 is not altered at all because there is nothing between the viewer and 3D objects 110).
[53] FIGURES 2A-2C illustrate viewing the 3D objects 110 of FIGURES 1A-1C through a transparent panel 210, according to certain embodiments. Transparent panel 210 may be, for example, a piece of transparent glass. FIGURE 2A is a top view of a frontal viewing direction of 3D objects 110 through transparent panel 210, and FIGURE 2B is a perspective view of the same arrangement of 3D objects 110 and frontal viewing direction as FIGURE 2A. FIGURE 2C is the resulting front view of 3D objects 110 through transparent panel 210 from the position illustrated in FIGURES 2A and 2B. As can be seen, the view in FIGURE 2C of 3D objects 110 through transparent panel 210 is a normal, expected view of 3D objects 110 (i.e., the view of 3D objects 110 is not altered at all because the viewer is looking through a transparent panel 210). In other words, the view of 3D objects 110 through transparent panel 210 in FIGURE 2C is the same as the view in FIGURE 1C where no object is between the viewer and 3D objects 110 (i.e., "perceived" transparency). Stated another way, the edges of the projected imagery on transparent panel 210 line up with the view of the actual 3D objects 110 behind transparent panel 210 to create a view-aligned image 220A of 3D object 110A, a view-aligned image 220B of 3D object 110B, and a view-aligned image 220C of 3D object 110C.
[54] FIGURES 3A-3C illustrate viewing the 3D objects 110 of FIGURES 1A-1C through a camera image panel 310, according to certain embodiments. Camera image panel 310 may be, for example, a camera viewfinder or a display of a smartphone that is displaying its current camera image. In these images, camera image panel 310 is at an angle (e.g., 30 degrees) to the viewer to illustrate how such systems do not provide true emulated transparency. FIGURE 3A is a top view of a frontal viewing direction of 3D objects 110 through camera image panel 310, and FIGURE 3B is a perspective view of the same arrangement of 3D objects 110 and frontal viewing direction as FIGURE 3A. FIGURE 3C is the resulting front view of 3D objects 110 through camera image panel 310 from the position illustrated in FIGURES 3A and 3B. As can be seen, the view in FIGURE 3C of 3D objects 110 through camera image panel 310 is different from a view of 3D objects 110 through transparent panel 210. Here, camera image panel 310 redirects the lines of sight that are normal to camera image panel 310, thereby showing no perceived transparency (i.e., the image on camera image panel 310 is not aligned with the view but instead depicts the image acquired by the redirected lines of sight). Stated another way, the edges of the projected imagery on camera image panel 310 do not line up with the view of the actual 3D objects 110 behind camera image panel 310. This is illustrated by an unaligned image 320A of 3D object 110A and an unaligned image 320B of 3D object 110B on camera image panel 310 in FIGURE 3C. [55] FIGURES 4A-4C illustrate viewing the 3D objects 110 of FIGURES 1A-1C through an emulated-transparency electronic panel 410, according to certain embodiments. In these images, emulated transparency panel 410 is at an angle (e.g., 30 degrees) to the viewer to illustrate how emulated transparency panel 410 provides true emulated transparency unlike camera image panels 310. FIGURE 4A is a top view of a frontal viewing direction of 3D objects 110 through emulated transparency panel 410, and FIGURE 4B is a perspective view of the same arrangement of 3D objects 110 and frontal viewing direction as FIGURE 4A. FIGURE 4C is the resulting front view of 3D objects 110 through emulated transparency panel 410 from the position illustrated in FIGURES 4A and 4B. As can be seen, the view in FIGURE 4C of 3D objects 110 through emulated transparency panel 410 is different from a view of 3D objects 110 through camera image panel 310 but is similar to a view of 3D objects 110 through transparent panel 210. Here, emulated transparency panel 410 does not redirect the lines of sight from the viewer through emulated transparency panel 410, but allows them to remain virtually unchanged, thereby providing emulated transparency (i.e., the image on emulated transparency panel 410 is aligned with the view as in transparent panel 210). Like transparent panel 210, the edges of the projected imagery on emulated transparency panel 410 line up with the view of the actual 3D objects 110 behind emulated transparency panel 410 to create view-aligned image 220A of 3D object 110A, view-aligned image 220B of 3D object 110B, and view-aligned image 220C of 3D object 110C.
[56] FIGURES 5A-5C illustrate viewing the 3D objects 110 of FIGURES 1A-1C through the camera image panel 310 of FIGURES 3A-3C, but from an alternate angle. In these images, camera image panel 310 is at a different 30-degree angle to the viewer to further illustrate how such systems do not provide true emulated transparency. Like in FIGURES 3A-3C, the edges of the projected imagery on camera image panel 310 do not line up with the view of the actual 3D objects 110 behind camera image panel 310. This is illustrated by an unaligned image 320C of 3D object 110C and an unaligned image 320B of 3D object 110B on camera image panel 310 in FIGURE 5C.
[57] FIGURES 6A-6C illustrate viewing the 3D objects 110 of FIGURES 1A-1C through the emulated-transparency electronic panel 410 of FIGURES 4A-4C, but from an alternate angle. Like in FIGURES 4A-4C, the edges of the projected imagery on emulated transparency panel 410 in FIGURE 6C line up with the view of the actual 3D objects 110 behind emulated transparency panel 410 to create view-aligned image 220B of 3D object 110B and view-aligned image 220C of 3D object 110C.
[58] As illustrated above in FIGURES 4A-4C and 6A-6C, emulated transparency panel 410 provides view-aligned images 220 of 3D objects 110 behind emulated transparency panel 410, thereby providing electronically emulated transparency. FIGURES 7-8 illustrate an example embodiment of emulated transparency panel 410. FIGURE 7 illustrates a cut-away view of an emulated transparency assembly 710 which may be emulated transparency panel 410, and FIGURE 8 illustrates an exploded view of the emulated transparency assembly 710 of FIGURE 7, according to certain embodiments.
[59] In some embodiments, emulated transparency assembly 710 includes two microlens arrays 720 (i.e., a sensor side microlens array 720A and a display side microlens array 720B), an image sensor layer 730, a circuit board 740, and an electronic display layer 760. In general, incoming light field 701 enters sensor side microlens array 720A where it is detected by image sensor layer 730. Electronically-replicated outgoing light field 702 is then generated by electronic display layer 760 and projected through display side microlens array 720B. As explained in more detail below, the unique arrangement and features of emulated transparency assembly 710 permit it to provide electronically-emulated transparency via electronically-replicated outgoing light field 702, as well as other features described below. While a specific shape of emulated transparency assembly 710 is illustrated in FIGURES 7-8, emulated transparency assembly 710 may have any appropriate shape including any polygonal or non-polygonal shape, and both flat and non-flat configurations.
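As a reading aid, the layer ordering just described can be expressed as a simple data path. This Python sketch is an illustration under stated assumptions: the stage names and callable signatures are invented for exposition, and each real layer is hardware, not a function.

    from typing import Callable
    import numpy as np

    def passthrough(field: np.ndarray) -> np.ndarray:
        return field

    def emulated_transparency(incoming_field_701: np.ndarray,
                              sensor_mla_720a: Callable = passthrough,
                              sensor_layer_730: Callable = passthrough,
                              logic_layer_750: Callable = passthrough,
                              display_layer_760: Callable = passthrough,
                              display_mla_720b: Callable = passthrough) -> np.ndarray:
        # Capture: incoming light field 701 passes through the sensor
        # side microlens array and is detected by the image sensor layer.
        captured = sensor_layer_730(sensor_mla_720a(incoming_field_701))
        # Optional in-layer processing by the logic unit layer.
        processed = logic_layer_750(captured)
        # Reproduction: the display layer generates outgoing light field
        # 702, projected through the display side microlens array.
        return display_mla_720b(display_layer_760(processed))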
[60] Microlens arrays 720 (i.e., sensor side microlens array 720A and display side microlens array 720B) are generally layers of microlenses. In some embodiments, each microlens of microlens arrays 720 is a plenoptic cell 1510 as described in more detail below in reference to FIGURE 15. In general, each microlens of sensor side microlens array 720A is configured to capture a portion of incoming light field 701 and direct it to pixels within image sensor layer 730. Similarly, each microlens of display side microlens array 720B is configured to emit a portion of electronically-replicated outgoing light field 702 that is generated by pixels of electronic display layer 760. In some embodiments, each microlens of sensor side microlens array 720A and display side microlens array 720B is in a 3D shape with a collimating lens on one end of the 3D shape. The 3D shape may be, for example, a triangular polyhedron, a rectangular cuboid, a pentagonal polyhedron, a hexagonal polyhedron, a heptagonal polyhedron, or an octagonal polyhedron. In some embodiments, each microlens of sensor side microlens array 720A and display side microlens array 720B includes opaque walls such as cell walls 1514 (discussed below in reference to FIGURE 15) that are configured to prevent light from bleeding into adjacent microlenses. In some embodiments, each microlens of sensor side microlens array 720A and display side microlens array 720B additionally or alternatively includes a light incidence angle rejection coating such as filter layer 1640 described below to prevent light from bleeding into adjacent microlenses.
[61] In some embodiments, the microlenses of sensor side microlens array 720A are oriented towards a first direction, and the microlenses of display side microlens array 720B are oriented towards a second direction that is 180 degrees from the first direction. In other words, some embodiments of emulated transparency assembly 710 include a sensor side microlens array 720A that is oriented exactly opposite from display side microlens array 720B. In other embodiments, any other orientation of sensor side microlens array 720A and display side microlens array 720B is possible.
[62] In general, image sensor layer 730 includes a plurality of sensor pixels that are configured to detect incoming light field 701 after it passes through sensor side microlens array 720A. In some embodiments, image sensor layer 730 includes an array of sensor units 735 (e.g., sensor units 735A-C as illustrated in FIGURE 8). Each sensor unit 735 may be a defined portion of image sensor layer 730 (e.g., a specific area such as a portion of a rectangular grid) or a specific number or pattern of sensor pixels within image sensor layer 730. In some embodiments, each sensor unit 735 corresponds to a specific logic unit 755 of logic unit layer 750 as described below. In some embodiments, image sensor layer 730 is coupled to or otherwise immediately adjacent to sensor side microlens array 720A. In some embodiments, image sensor layer 730 is between sensor side microlens array 720A and circuit board 740. In other embodiments, image sensor layer 730 is between sensor side microlens array 720A and logic unit layer 750. In some embodiments, other appropriate layers may be included in emulated transparency assembly 710 on either side of image sensor layer 730. Furthermore, while a specific number and pattern of sensor units 735 are illustrated, any appropriate number (including only one) and pattern of sensor units 735 may be used.
[63] Circuit board 740 is any appropriate rigid or flexible circuit board. In general, circuit board 740 includes various pads and traces that provide electrical connections between various layers of emulated transparency assembly 710. As one example, in embodiments that include circuit board 740, circuit board 740 may be located between image sensor layer 730 and logic unit layer 750 as illustrated in FIGURES 7-8 in order to provide electrical connections between image sensor layer 730 and logic unit layer 750. In other embodiments, circuit board 740 may be located between logic unit layer 750 and electronic display layer 760 in order to provide electrical connections between logic unit layer 750 and electronic display layer 760. In some embodiments, circuit board 740 includes an array of unit attachment locations 745 (e.g., unit attachment locations 745A-C as illustrated in FIGURE 8). Each unit attachment location 745 may be a defined portion of circuit board 740 (e.g., a specific area such as a portion of a rectangular grid) and may include a plurality of pads (e.g., ball grid array (BGA) pads) and/or vias. In some embodiments, each unit attachment location 745 corresponds to a specific sensor unit 735 of image sensor layer 730 and a specific display unit 765 of electronic display layer 760 (e.g., unit attachment location 745A corresponds to sensor unit 735A and display unit 765A) and is configured to permit electrical communication between the corresponding specific sensor unit 735 and the specific display unit 765.
[64] Logic unit layer 750 provides optional/additional logic and/or processing for emulated transparency assembly 710. In general, logic unit layer 750 emulates transparency by directing signals from the plurality of sensor pixels of image sensor layer 730 to the plurality of display pixels of electronic display layer 760, thereby emitting electronically-replicated outgoing light field 702 from display side microlens array 720B at angles that correspond to angles of the incoming light field 701 detected through sensor side microlens array 720A. By emitting electronically-replicated outgoing light field 702 from display side microlens array 720B at angles that correspond to angles of the incoming light field 701 detected through sensor side microlens array 720A, an image is displayed that matches what would be seen if emulated transparency assembly 710 was not present (i.e., emulated transparency). In some embodiments, logic unit layer 750 includes an array of logic units 755 (e.g., logic units 755A-C as illustrated in FIGURE 8). Each logic unit 755 may be a defined portion of logic unit layer 750 (e.g., a specific area such as a portion of a rectangular grid). In some embodiments, each logic unit 755 is a separate physical, rigid unit that is later joined to or coupled to other logic units 755 in order to form logic unit layer 750. In some embodiments, each logic unit 755 corresponds to a specific sensor unit 735 of image sensor layer 730 and a specific display unit 765 of electronic display layer 760 (e.g., logic unit 755A corresponds to (and is electrically coupled to) sensor unit 735A and display unit 765A). In some embodiments, logic unit layer 750 is located between circuit board 740 and electronic display layer 760. In other embodiments, logic unit layer 750 is between image sensor layer 730 and circuit board 740. In some embodiments, other appropriate layers may be included in emulated transparency assembly 710 on either side of logic unit layer 750. Furthermore, while a specific number and pattern of logic units 755 is illustrated, any appropriate number (including none or only one) and pattern of logic units 755 may be used.
[65] In general, electronic display layer 760 includes a plurality of display pixels that are configured to generate and project electronically-replicated outgoing light field 702 through display side microlens array 720B. In some embodiments, electronic display layer 760 includes an array of display units 765 (e.g., display units 765A-C as illustrated in FIGURE 8). Each display unit 765 may be a defined portion of electronic display layer 760 (e.g., a specific area such as a portion of a rectangular grid) or a specific number or pattern of display pixels within electronic display layer 760. In some embodiments, each display unit 765 corresponds to a specific logic unit 755 of logic unit layer 750. In some embodiments, electronic display layer 760 is coupled to or otherwise immediately adjacent to display side microlens array 720B. In some embodiments, electronic display layer 760 is between display side microlens array 720B and circuit board 740. In other embodiments, electronic display layer 760 is between display side microlens array 720B and logic unit layer 750. In some embodiments, other appropriate layers may be included in emulated transparency assembly 710 on either side of electronic display layer 760. Furthermore, while a specific number and pattern of display units 765 are illustrated, any appropriate number (including only one) and pattern of display units 765 may be used.
[66] In some embodiments, the sensor pixels of image sensor layer 730 may be sensor pixels 1800 as described in FIGURES 18-20 and their associated descriptions in U.S. Patent Application No. 15/724,027 entitled "Stacked Transparent Pixel Structures for Image Sensors," which is incorporated herein by reference in its entirety. In some embodiments, the display pixels of electronic display layer 760 are display pixels 100 as described in FIGURES 1-4 and their associated descriptions in U.S. Patent Application No. 15/724,004 entitled "Stacked Transparent Pixel Structures for Electronic Displays," which is incorporated herein by reference in its entirety.
[67] While FIGURES 7-8 depict emulated transparency assembly 710 as having arrays of sensors, displays, and electronics, other embodiments may have single-unit setups. Furthermore, while the illustrated embodiments of emulated transparency assembly 710 depict unidirectional emulated transparency (i.e., allowing the capture of incoming light field 701 from a single direction and displaying a corresponding electronically-replicated outgoing light field 702 in the opposite direction), other embodiments may include arrangements and combinations of emulated transparency assembly 710 that permit bidirectional transparency.
[68] FIGURE 9 illustrates a method 900 of manufacturing the emulated transparency assembly 710 of FIGURE 7, according to certain embodiments. Method 900 may begin in step 910 where a plurality of unit attachment locations are formed on a circuit board. In some embodiments, the circuit board is circuit board 740 and the unit attachment locations are unit attachment locations 745. In some embodiments, each unit attachment location corresponds to one of a plurality of display units such as display units 765 and one of a plurality of sensor units such as sensor units 735.
[69] At step 920, a plurality of sensor units are coupled to a first side of the circuit board. In some embodiments, the sensor units are sensor units 735. In some embodiments, each sensor unit is coupled in step 920 to a respective one of the unit attachment locations of step 910. In some embodiments, the sensor units are first formed into an image sensor layer such as image sensor layer 730, and the image sensor layer is coupled to the first side of the circuit board in this step.
[70] At step 930, a plurality of display units are coupled to a second side of the circuit board that is opposite the first side. In some embodiments, the display units are display units 765. In some embodiments, each display unit is coupled to a respective one of the unit attachment locations. In some embodiments, the display units are first formed into a display layer such as electronic display layer 760, and the display layer is coupled to the second side of the circuit board in this step.
[71] At step 940, a first plurality of microlenses are coupled to the plurality of sensor units of step 920. In some embodiments, the microlenses are plenoptic cells 1510. In some embodiments, the microlenses are first formed into a microlens array layer such as sensor side microlens array 720A, and the microlens array layer is coupled to the sensor units.
[72] At step 950, a second plurality of microlenses are coupled to the plurality of display units of step 930. In some embodiments, the microlenses are plenoptic cells 1510. In some embodiments, the microlenses are first formed into a microlens array layer such as display side microlens array 720B, and the microlens array layer is coupled to the display units. After step 950, method 900 may end.
[73] In some embodiments, method 900 may additionally include coupling a plurality of logic units between the circuit board of step 910 and the plurality of display units of step 930. In some embodiments, the logic units are logic units 755. In some embodiments, the plurality of logic units are coupled between the circuit board and the plurality of sensor units of step 920.
[74] Particular embodiments may repeat one or more steps of method 900, where appropriate. Although this disclosure describes and illustrates particular steps of method 900 as occurring in a particular order, this disclosure contemplates any suitable steps of method 900 occurring in any suitable order (e.g., any temporal order). Moreover, although this disclosure describes and illustrates an example emulated transparency assembly manufacturing method including the particular steps of method 900, this disclosure contemplates any suitable emulated transparency assembly manufacturing method including any suitable steps, which may include all, some, or none of the steps of method 900, where appropriate. Furthermore, although this disclosure describes and illustrates particular components, devices, or systems carrying out particular steps of method 900, this disclosure contemplates any suitable combination of any suitable components, devices, or systems carrying out any suitable steps of method 900.
[75] FIGURE 10 illustrates a direct sensor-to-display system 1000 that may be implemented by the emulated transparency assembly of FIGURE 7, according to certain embodiments. In general, FIGURE 10 illustrates how embodiments of emulated transparency assembly 710 utilize a direct association of input pixels to corollary output pixels. In some embodiments, this is accomplished by using a layered approach such that the image sensor layer 730 and electronic display layer 760 are in close proximity to one another, mounted on opposite sides of a shared substrate (e.g., circuit board 740) as illustrated in FIGURES 7-8. Signals from image sensor layer 730 may be propagated directly to electronic display layer 760 through circuit board 740 (and logic unit layer 750 in some embodiments). Logic unit layer 750 provides simple processing with optional input for any necessary control or augmentation. Typical electronic sensor/display pairs (e.g., a digital camera) do not express a one-to-one relationship in that the display is not coupled directly with the input sensor and thus requires some degree of image transformation. Certain embodiments of the disclosure, however, implement a one-to-one mapping between input and output pixels (i.e., the sensor pixel and display pixel layouts are identical), thereby circumventing the need for any image transformation. This reduces the complexity and power requirements of emulated transparency assembly 710.
[76] As illustrated in FIGURE 10, each sensor unit 735 is directly coupled to a corresponding display unit 765. For example, sensor unit 735A may be directly coupled to display unit 765A, sensor unit 735B may be directly coupled to display unit 765B, and so on. In some embodiments, the signaling between sensor units 735 and display units 765 may be any appropriate differential signaling such as low-voltage differential signaling (LVDS). More specifically, each sensor unit 735 may output first signals in a specific format (e.g., LVDS) that corresponds to incoming light field 701. In some embodiments, the first signals are sent via a corresponding logic unit 755, which in turn sends second signals to display unit 765 in the same format as the first signals (e.g., LVDS). In other embodiments, the first signals are sent directly to display units 765 from sensor units 735 (e.g., sensor units 735 and display units 765 are coupled directly to opposite sides of circuit board 740). Display unit 765 receives the second signals from the logic unit 755 (or the first signals directly from the sensor unit 735 via circuit board 740) and uses them to generate outgoing light field 702.
[77] Because no conversion is needed in the signaling between sensor units 735 and display units 765, emulated transparency assembly 710 may provide many benefits over typical display/sensor combinations. First, no signal processors are needed to convert the signals from sensor units 735 to display units 765. For example, no off-board signal processors are needed to perform image transformation between sensor units 735 and display units 765. This reduces the space, complexity, weight, and cost requirements for emulated transparency assembly 710. Second, emulated transparency assembly 710 may provide greater resolutions than would typically be possible for display/sensor combinations. By directly coupling sensor units 735 with display units 765 and not requiring any processing or transformation of data between the units, the resolution of sensor units 735 and display units 765 may be far greater than would typically be possible. Furthermore, emulated transparency assembly 710 may provide heterogeneous resolutions across sensor units 735 and display units 765 at any particular time. That is, a particular sensor unit 735 and corresponding display unit 765 may have a particular resolution that is different from other sensor units 735 and display units 765 at a particular time, and the resolutions of each sensor unit 735 and display unit 765 may be changed at any time.
[78] In some embodiments, each particular sensor pixel of a sensor unit 735 is mapped to a single display pixel of a corresponding display unit 765, and the display pixel displays light corresponding to light captured by its mapped sensor pixel. This is illustrated best in FIGURES 17A-17B. As one example, each center sensing pixel 1725 of a particular plenoptic cell 1510 of sensor side microlens array 720A (e.g., the bottom plenoptic cell 1510 of sensor side microlens array 720A in FIGURE 17A) is mapped to a center display pixel 1735 of a corresponding plenoptic cell 1510 of display side microlens array 720B (e.g., the bottom plenoptic cell 1510 of display side microlens array 720B in FIGURE 17A). As another example, each top sensing pixel 1725 of a particular plenoptic cell 1510 of sensor side microlens array 720A (e.g., the top plenoptic cell 1510 of sensor side microlens array 720A in FIGURE 17B) is mapped to a bottom display pixel 1735 of a corresponding plenoptic cell 1510 of display side microlens array 720B (e.g., the top plenoptic cell 1510 of display side microlens array 720B in FIGURE 17B).
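A minimal sketch of this fixed re-indexing follows, assuming (purely for illustration) that each plenoptic cell's pixels are stored as an n-by-n block of a NumPy array and that the sensor-to-display correspondence is an inversion of each cell's block (center to center, top to bottom), consistent with the examples above; the exact mapping in hardware may differ.

    import numpy as np

    def sensor_to_display(sensor: np.ndarray) -> np.ndarray:
        """One-to-one pixel mapping with no image transformation.

        sensor is assumed to be shaped (cell_rows, cell_cols, n, n):
        a grid of plenoptic cells, each holding an n-by-n pixel block.
        """
        # Invert each cell's pixel block (axes 2 and 3); the cell grid
        # itself (axes 0 and 1) keeps its spatial arrangement, so a top
        # sensing pixel feeds a bottom display pixel of the same cell.
        return sensor[:, :, ::-1, ::-1]

Because this is pure re-indexing rather than computation, it models why no signal processor or image transformation is required between the sensor and display sides.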
[79] In some embodiments, sensor units 735 are coupled directly to circuit board 740 while display units 765 are coupled to logic units 755 (which are in turn coupled to circuit board 740) as illustrated in FIGURE 8. In other embodiments, display units 765 are coupled directly to circuit board 740 while sensor units 735 are coupled to logic units 755 (which are in turn coupled to circuit board 740). In other embodiments, both sensor units 735 and display units 765 are coupled directly to circuit board 740 (i.e., without any intervening logic units 755). In such embodiments, sensor units 735 and display units 765 are coupled to opposite sides of circuit board 740 at unit attachment locations 745 (e.g., sensor unit 735A and display unit 765A are coupled to opposite sides of circuit board 740 at unit attachment location 745A).
[80] FIGURE 11 illustrates a method 1100 of manufacturing the direct sensor-to-display system 1000 of FIGURE 10, according to certain embodiments. Method 1100 may begin at step 1110 where a plurality of unit attachment locations are formed on a circuit board. In some embodiments, the circuit board is circuit board 740 and the unit attachment locations are unit attachment locations 745. In some embodiments, each unit attachment location corresponds to one of a plurality of display units and one of a plurality of sensor units. The display units may be display units 765 and the sensor units may be sensor units 735. In some embodiments, each particular unit attachment location includes BGA pads that are configured to couple to one of the plurality of sensor units and/or one of the plurality of logic units. In some embodiments, each particular unit attachment location includes a plurality of interconnection pads configured to electrically couple the particular unit attachment location to one or more adjacent unit attachment locations. In some embodiments, the unit attachment locations are arranged into a plurality of columns and a plurality of rows as illustrated in FIGURE 8.
[81] At step 1120, a plurality of sensor units are coupled to a first side of the circuit board. In some embodiments, each sensor unit is coupled to a respective one of the unit attachment locations of step 1110. At step 1130, a plurality of display units are coupled to a second side of the circuit board that is opposite to the first side. In some embodiments, each display unit is coupled to a respective one of the unit attachment locations of step 1110 such that each particular one of the plurality of sensor pixel units is mapped to a corresponding one of the plurality of display pixel units. By mapping each particular sensor pixel unit to one of the display pixel units, the display pixels of each particular one of the plurality of display pixel units are configured to display light corresponding to light captured by sensor pixels of its mapped sensor pixel unit. After step 1130, method 1100 may end.
[82] Particular embodiments may repeat one or more steps of method 1100, where appropriate. Although this disclosure describes and illustrates particular steps of method 1100 as occurring in a particular order, this disclosure contemplates any suitable steps of method 1100 occurring in any suitable order (e.g., any temporal order). Moreover, although this disclosure describes and illustrates an example direct sensor-to-display system manufacturing method including the particular steps of method 1100, this disclosure contemplates any suitable direct sensor-to-display system manufacturing method including any suitable steps, which may include all, some, or none of the steps of method 1100, where appropriate. Furthermore, although this disclosure describes and illustrates particular components, devices, or systems carrying out particular steps of method 1100, this disclosure contemplates any suitable combination of any suitable components, devices, or systems carrying out any suitable steps of method 1100.
[83] FIGURES 12-13 illustrate various in-layer signal processing configurations that may be used by emulated transparency assembly 710 of FIGURE 7, according to certain embodiments. In general, the configurations of FIGURES 12-13 utilize a layer of digital logic (e.g., logic unit layer 750) that is sandwiched between the camera and display (i.e., between image sensor layer 730 and electronic display layer 760). These configurations allow for local, distributed processing of large quantities of data (e.g., 160k of image data or more), thereby circumventing bottlenecks as well as performance, power, and transmission line issues associated with typical configurations. Human visual acuity represents a tremendous amount of data which must be processed in real time. Typical imaging systems propagate a single data stream to/from a high-powered processor (e.g., a CPU or GPU), which may or may not serialize the data for manipulation. The bandwidth required for this approach at human 20/20 visual acuity far exceeds that of any known transmission protocols. Typical systems also use a master controller which is responsible for either processing all incoming/outgoing data or managing distribution to smaller processing nodes. Regardless, all data must be transported off-system/off-chip, manipulated, and then returned to the display device(s). However, this typical approach is unable to handle the enormous amount of data required by human visual acuity. Embodiments of the disclosure, however, harness the faceted nature of a sensor/display combination as described herein to decentralize and localize signal processing. This enables previously unachievable real-time digital image processing.
[84] As illustrated in FIGURES 12-13, certain embodiments of emulated transparency assembly 710 include logic unit layer 750 that contains the necessary logic to manipulate input signals from image sensor layer 730 and provide output signals to electronic display layer 760. In some embodiments, logic unit layer 750 is located between image sensor layer 730 and circuit board 740 as illustrated in FIGURE 12. In other embodiments, logic unit layer 750 is located between circuit board 740 and electronic display layer 760 as illustrated in FIGURE 13. In general, logic unit layer 750 is a specialized image processing layer that is capable of mixing an input signal directly from image sensor layer 730 and performing one or more mathematical operations (e.g., matrix transforms) on the input signal before outputting a resulting signal directly to electronic display layer 760. Since each logic unit 755 of logic unit layer 750 is responsible only for its associated facet (i.e., sensor unit 735 or display unit 765), the data of the particular logic unit 755 can be manipulated with no appreciable impact to the system-level I/O. This effectively circumvents the need to parallelize any incoming sensor data for centralized processing. The distributed approach enables emulated transparency assembly 710 to provide multiple features such as magnification/zoom (each facet applies a scaling transform to its input), vision correction (each facet applies a simulated optical transformation compensating for common vision issues such as near-sightedness, far-sightedness, astigmatism, etc.), color blindness correction (each facet applies a color transformation compensating for common color blindness issues), polarization (each facet applies a transformation simulating wave polarization, allowing for glare reduction), and dynamic range reduction (each facet applies a transformation that darkens high-intensity regions (e.g., the Sun) and lightens low-intensity regions (e.g., shadows)). Furthermore, since any data transformations remain localized to logic unit layer 750 of each facet, there may be no need for long transmission lines. This circumvents issues of cross talk, signal integrity, etc. Additionally, since the disclosed embodiments do not require optical transparency (but instead harness emulated transparency), there is no functional impact to placing an opaque processing layer between the sensor and display facets.
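Because each logic unit 755 touches only its own facet's pixels, the transformations above can be sketched as purely local operations. In the following Python fragment the color matrix and contrast factor are placeholder values chosen for illustration, not the patent's circuits.

    import numpy as np

    # Placeholder 3x3 color transform (e.g., a color-blindness
    # compensation matrix); the values here are illustrative only.
    COLOR_MATRIX = np.array([[1.0, 0.0, 0.0],
                             [0.3, 0.7, 0.0],
                             [0.0, 0.0, 1.0]])

    def process_facet(rgb: np.ndarray) -> np.ndarray:
        """Apply per-facet transforms to one facet's (h, w, 3) pixels."""
        # Matrix transform applied per pixel (color correction).
        out = rgb @ COLOR_MATRIX.T
        # Dynamic range reduction: compress values toward mid-gray,
        # darkening high-intensity regions and lightening low-intensity
        # regions (the 0.6 factor is an arbitrary illustrative choice).
        out = 0.5 + (np.clip(out, 0.0, 1.0) - 0.5) * 0.6
        return out

    # Each facet's logic unit runs independently on its own data, so no
    # facet's pixels ever cross the system-level I/O for centralized
    # processing.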
[85] In some embodiments, logic unit layer 750 contains discrete logic units (e.g., transistors) that are formed directly on circuit board 740. For example, standard photolithography techniques may be used to form logic unit layer 750 directly on circuit board 740. In other embodiments, each logic unit 755 is a separate integrated circuit (IC) that is coupled to either a sensor facet or a display facet, or directly to circuit board 740. As used herein, "facet" refers to a discrete unit that is separately manufactured and then coupled to circuit board 740. For example, a "display facet" may refer to a unit that includes a combination of an electronic display layer 760 and a display side microlens array 720B, and a "sensor facet" may refer to a unit that includes a combination of an image sensor layer 730 and a sensor side microlens array 720A. In some embodiments, a display facet may include a single display unit 765, or it may include multiple display units 765. Similarly, a sensor facet may include a single sensor unit 735, or it may include multiple sensor units 735. In some embodiments, a logic unit 755 may be included in either a sensor facet or a display facet. In embodiments where a logic unit 755 is a separate IC that is coupled directly to either a display or sensor facet (as opposed to being formed directly on circuit board 740), any appropriate technique such as 3D IC design with through-silicon vias may be used to couple the IC of logic unit 755 to a wafer of the facet.
[86] In some embodiments, logic unit layer 750 is an application-specific integrated circuit (ASIC) or an arithmetic logic unit (ALU), but not a general-purpose processor. This allows logic unit layer 750 to be power efficient. Furthermore, this allows logic unit layer 750 to operate without cooling, further reducing cost and power requirements of emulated transparency assembly 710.
[87] In some embodiments, logic units 755 are configured to communicate using the same protocol as sensor units 735 and display units 765. For example, in embodiments where logic units 755 are discrete ICs, the ICs may be configured to communicate in a same protocol as the sensor and display facets (e.g., LVDS or Inter-Integrated Circuit (I2C)). This eliminates the problem of having to translate between the sensor and display facet, thereby reducing power and cost.
[88] In some embodiments, logic unit layer 750 performs one or more operations on signals received from image sensor layer 730 before transmitting output signals to electronic display layer 760. For example, logic unit layer 750 may transform received signals from image sensor layer 730 to include augmented information for display on electronic display layer 760. This may be used, for example, to provide AR to a viewer. In some embodiments, logic unit layer 750 may completely replace received signals from image sensor layer 730 with alternate information for display on electronic display layer 760. This may be used, for example, to provide VR to a viewer.
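A minimal sketch of these two modes follows, assuming hypothetical names and a simple alpha-blending scheme (neither of which is specified in the disclosure): in "AR" mode the logic unit mixes augmented content into the live sensor signal, while in "VR" mode it replaces the sensor signal entirely.

```python
# Illustrative only: how a logic unit might augment or replace sensor data.
import numpy as np

def logic_unit_output(sensor: np.ndarray, overlay: np.ndarray,
                      alpha: np.ndarray, mode: str) -> np.ndarray:
    if mode == "VR":
        return overlay                          # replace the camera feed
    # AR: alpha-blend augmented information over the live camera feed.
    return alpha * overlay + (1.0 - alpha) * sensor

sensor = np.zeros((128, 128, 3))                # live facet imagery
overlay = np.ones((128, 128, 3))                # augmented content
alpha = np.zeros((128, 128, 1))
alpha[32:96, 32:96] = 0.6                       # overlay only a region
frame = logic_unit_output(sensor, overlay, alpha, mode="AR")
```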
[89] FIGURE 14 illustrates a method 1400 of manufacturing the in-layer signal processing systems of FIGURES 12-13, according to certain embodiments. Method 1400 may begin in step 1410 where a plurality of sensor units are coupled to a first side of a circuit board. In some embodiments, the sensor units are sensor units 735, and the circuit board is circuit board 740. In some embodiments, each sensor unit is coupled to one of a plurality of unit attachment locations such as unit attachment locations 745. Each sensor unit includes a plurality of sensor pixels.
[90] At step 1420, a plurality of display units are formed. In some embodiments, the display units are a combination of display units 765 and logic units 755. Each display unit may be formed by combining an electronic display and a logic unit into a single 3D integrated circuit using through-silicon vias. Each display unit includes a plurality of display pixels.
[91] At step 1430, the plurality of display units of step 1420 are coupled to a second side of the circuit board that is opposite the first side. In some embodiments, each logic unit is coupled to a respective one of the unit attachment locations. After step 1430, method 1400 may end.
[92] Particular embodiments may repeat one or more steps of method 1400, where appropriate. Although this disclosure describes and illustrates particular steps of method 1400 as occurring in a particular order, this disclosure contemplates any suitable steps of method 1400 occurring in any suitable order (e.g., any temporal order). Moreover, although this disclosure describes and illustrates an example in-layer signal processing system manufacturing method including the particular steps of method 1400, this disclosure contemplates any suitable in-layer signal processing system manufacturing method including any suitable steps, which may include all, some, or none of the steps of method 1400, where appropriate. Furthermore, although this disclosure describes and illustrates particular components, devices, or systems carrying out particular steps of method 1400, this disclosure contemplates any suitable combination of any suitable components, devices, or systems carrying out any suitable steps of method 1400.
[93] FIGURES 15-17C illustrate various views of an array 1500 of plenoptic cells 1510 that may be used within microlens arrays 720A-B of emulated transparency assembly 710. FIGURE 15 illustrates a plenoptic cell assembly 1500, FIGURE 16 illustrates a cross section of a portion of the plenoptic cell assembly 1500 of FIGURE 15, and FIGURES 17A-17C illustrate cross sections of a portion of the plenoptic cell assembly 1500 of FIGURE 15 with various incoming and outgoing fields of light.
[94] Standard electronic displays typically include planar arrangements of pixels which form a two-dimensional rasterized image, conveying inherently two-dimensional data. One limitation is that the planar image cannot be rotated in order to perceive a different perspective within the scene being conveyed. In order to clearly view this image, regardless of what is portrayed within the image itself, either a viewer's eyes or the lens of a camera must focus on the screen. By contrast, a volume of light entering the eyes from the real world allows the eyes to naturally focus on any point within that volume of light. This plenoptic "field" of light contains rays of light from the scene as they naturally enter the eye, as opposed to a virtual image focused by an external lens at a single focal plane. While existing light field displays may be able to replicate this phenomenon, they present substantial tradeoffs between spatial and angular resolutions, resulting in the perceived volume of light looking fuzzy or scant in detail.
[95] To overcome problems and limitations with existing light field displays, embodiments of the disclosure provide a coupled light field capture and display system that is capable of recording and then electronically recreating the incoming plenoptic volume of light. Both the capture and the display processes are accomplished by an arrangement of plenoptic cells 1510 responsible for recording or displaying smaller views of a larger compound image. Each plenoptic cell 1510 of the sensor is itself comprised of a dense cluster of image sensor pixels, and each plenoptic cell of the display is itself comprised of a dense cluster of display pixels. In both cases, light rays entering the sensor cells or exiting the display cells are focused by one or more transparent lenslets 1512 to produce a precisely tuned distribution of near-collimated rays. This essentially records an incoming light field and reproduces it on the opposite side of the assembly. More specifically, for the sensor, the volume of light entering the lens (or series of lenses) of this cell is focused onto the image pixels such that each pixel gathers light from only one direction, as determined by its position within the cell and the profile of the lens. This allows rasterized encoding of the various angular rays within the light field, with the number of pixels in the cell determining the angular resolution recorded. For the display, the light emitted from the pixels is focused by an identical lens (or series of lenses) to create a volume of light that matches what was recorded by the sensor, plus any electronic augmentation or alterations (e.g., from logic unit layer 750 described above). The cone of emitted light from this cell contains a subset of rays at enough interval angles to enable the formation of a light field for the viewer, where each output ray direction is determined by the position of its originating pixel within the cell and the profile of the lens.
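The angular encoding described above can be sketched with a simple one-dimensional thin-lens model. The focal length, pixel pitch, and pixel count below are assumed values for illustration only; the disclosure does not specify them. With an ideal lenslet, a ray's incidence angle maps to a pixel index on capture, and the identical display-side geometry maps that index back to the same output angle:

```python
# Toy 1-D model of rasterized angular encoding in a plenoptic cell.
import math

FOCAL_LEN_UM = 500.0    # lenslet focal length (assumed)
PITCH_UM = 10.0         # sensor/display pixel pitch (assumed)
N_PIXELS = 64           # pixels per cell -> angular resolution of the cell

def angle_to_pixel(theta_deg: float) -> int:
    """Capture: a ray at theta lands focal_len*tan(theta) from cell center."""
    offset_um = FOCAL_LEN_UM * math.tan(math.radians(theta_deg))
    index = N_PIXELS // 2 + round(offset_um / PITCH_UM)
    return max(0, min(N_PIXELS - 1, index))

def pixel_to_angle(index: int) -> float:
    """Display: the same geometry in reverse recreates the ray direction."""
    offset_um = (index - N_PIXELS // 2) * PITCH_UM
    return math.degrees(math.atan2(offset_um, FOCAL_LEN_UM))

# Round-trip: capture then redisplay preserves direction to within the
# quantization set by the number of pixels in the cell.
for theta in (0.0, 5.0, -12.0):
    print(theta, "->", pixel_to_angle(angle_to_pixel(theta)))
```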
[96] Plenoptic cells 1510 may be utilized by both sensor side microlens array 720A and display side microlens array 720B. For example, multiple plenoptic cells 1510A may be included in sensor side microlens array 720A, and each plenoptic cell 1510A may be coupled to or otherwise adjacent to an image sensor 1520. Image sensor 1520 may be a portion of image sensor layer 730 and may include a sensor pixel array 1525 that includes sensing pixels 1725. Similarly, multiple plenoptic cells 1510B may be included in display side microlens array 720B, and each plenoptic cell 1510B may be coupled to or otherwise adjacent to a display 1530. Display 1530 may be a portion of electronic display layer 760 and may include a display pixel array 1625 that includes display pixels 1735. Sensing pixels 1725 may be sensor pixels 1800 as described in FIGURES 18-20 and their associated descriptions in U.S. Patent Application No. 15/724,027 entitled "Stacked Transparent Pixel Structures for Image Sensors," which is incorporated herein by reference in its entirety. Display pixels 1735 may be display pixels 100 as described in FIGURES 1-4 and their associated descriptions in U.S. Patent Application No. 15/724,004 entitled "Stacked Transparent Pixel Structures for Electronic Displays," which is incorporated herein by reference in its entirety.
[97] In some embodiments, plenoptic cell 1510 includes a transparent lenslet 1512 and cell walls 1514. Specifically, plenoptic cell 1510A includes transparent lenslet 1512A and cell walls 1514A, and plenoptic cell 1510B includes transparent lenslet 1512B and cell walls 1514B. In some embodiments, transparent lenslet 1512 contains a 3D shape with a collimating lens on one end of the 3D shape. For example, as illustrated in FIGURE 15, transparent lenslet 1512 may be a rectangular cuboid with a collimating lens on one end of the rectangular cuboid. In other embodiments, the 3D shape of transparent lenslet 1512 may be a triangular polyhedron, a pentagonal polyhedron, a hexagonal polyhedron, a heptagonal polyhedron, an octagonal polyhedron, a cylinder, or any other appropriate shape. Each plenoptic cell 1510A includes an input field of view (FOV) 1610 (e.g., 30 degrees), and each plenoptic cell 1510B includes an output FOV 1620 (e.g., 30 degrees). In some embodiments, input FOV 1610 matches output FOV 1620 for corresponding plenoptic cells 1510.
[98] Transparent lenslet 1512 may be formed from any appropriate transparent optical material. For example, transparent lenslet 1512 may be formed from a polymer, silica glass, or sapphire. In some embodiments, transparent lenslet 1512 may be formed from a polymer such as polycarbonate or acrylic. In some embodiments, transparent lenslets 1512 may be replaced with waveguides and/or photonic crystals in order to capture and/or produce a light field.

[99] In general, cell walls 1514 are barriers to prevent optical crosstalk between adjacent plenoptic cells 1510. Cell walls 1514 may be formed from any appropriate material that is opaque to visible light when hardened. In some embodiments, cell walls 1514 are formed from a polymer. Preventing optical crosstalk using cell walls 1514 is described in more detail below in reference to FIGURES 17A and 17C.
[100] In some embodiments, image sensor 1520 includes or is coupled to backplane circuitry 1630A, and display 1530 includes or is coupled to backplane circuitry 1630B. In general, backplane circuitry 1630A-B provides electrical connections to permit image data to flow from image sensor 1520 to display 1530. In some embodiments, backplane circuitry 1630A and backplane circuitry 1630B are the opposite sides of a single backplane. In some embodiments, backplane circuitry 1630A and backplane circuitry 1630B are circuit board 740.
[101] In some embodiments, a filter layer 1640 may be included on one or both ends of transparent lenslet 1512 in order to restrict the entry or exit of light to a specific incidence angle. For example, a first filter layer 1640A may be included on the convex end of transparent lenslet 1512, and/or a second filter layer 1640B may be included on the opposite end of transparent lenslet 1512. Similar to cell walls 1514, such a coating or film may also limit image bleed between adjacent transparent lenslets 1512 to an acceptable amount. Filter layer 1640 may be used in addition to or in place of cell walls 1514.
[102] FIGURES 17A-17C each illustrate a cross-sectional view of seven adjacent plenoptic cells 1510 for a sensor side microlens array 720A and a corresponding display side microlens array 720B. These figures show how incoming light fields 701 are captured by image sensors 1520 and electronically replicated on display 1530 to emit a virtually identical field of light. In FIGURE 17A, an incoming light field 1710 from objects directly in front of the sensor plenoptic cells 1510 is focused by the transparent lenslets 1512 of the sensor plenoptic cells 1510 onto center sensing pixels 1725. Corresponding light is then transmitted by corresponding center display pixels 1735 of corresponding display plenoptic cells 1510. The transmitted light is focused and emitted as emitted light field 1711 by the transparent lenslets 1512 of display plenoptic cells 1510. Emitted light field 1711 precisely matches the zero-degree source light field (i.e., incoming light field 1710). In addition, emitted light rays striking cell walls 1514 at location 1740 that would otherwise penetrate adjacent display plenoptic cells 1510 are blocked by the opaque cell walls 1514, thereby preventing optical crosstalk.
[103] In FIGURE 17B, an incoming light field 1720 from objects fourteen degrees off the axis of sensor plenoptic cells 1510 is focused by the transparent lenslets 1512 of the sensor plenoptic cells 1510 onto top sensing pixels 1725. Corresponding light is then transmitted by corresponding opposite (i.e., bottom) display pixels 1735 of corresponding display plenoptic cells 1510. The transmitted light is focused and emitted as emitted light field 1721 by the transparent lenslets 1512 of display plenoptic cells 1510. Emitted light field 1721 precisely matches the 14-degree source light field (i.e., incoming light field 1720).

[104] In FIGURE 17C, an incoming light field 1730 from objects 25 degrees off the axis of sensor plenoptic cells 1510 is focused by the transparent lenslets 1512 of the sensor plenoptic cells 1510 entirely onto cell walls 1514. Because incoming light field 1730 is focused entirely onto cell walls 1514 of sensor plenoptic cells 1510 instead of sensing pixels 1725, no corresponding light is transmitted by corresponding display plenoptic cells 1510. In addition, incoming light rays striking cell walls 1514 at location 1750 that would otherwise penetrate adjacent sensor plenoptic cells 1510 are blocked by the opaque cell walls 1514, thereby preventing optical crosstalk.
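The behavior of FIGURES 17A-17C can be summarized in a short sketch, assuming an idealized cell with the example 30-degree FOV mentioned above: rays within the half-FOV map to a sensing pixel (zero degrees to the center pixels, fourteen degrees toward the edge pixels), while steeper rays such as the 25-degree field focus onto the opaque cell walls and are absorbed. The pixel count and the linear angle-to-pixel mapping are simplifying assumptions:

```python
# Sketch of per-cell ray capture with cell-wall blocking (FIGURES 17A-17C).
HALF_FOV_DEG = 15.0     # half of the example 30-degree cell FOV
N_PIXELS = 64           # sensing pixel rows per cell (assumed)

def trace_into_cell(theta_deg: float):
    if abs(theta_deg) > HALF_FOV_DEG:
        return None      # focused onto an opaque cell wall: absorbed,
                         # so no crosstalk and no corresponding display output
    # Map [-HALF_FOV, +HALF_FOV] onto pixel rows [0, N_PIXELS - 1]; the
    # display side re-emits from the corresponding opposite pixel.
    frac = (theta_deg + HALF_FOV_DEG) / (2 * HALF_FOV_DEG)
    return min(N_PIXELS - 1, int(frac * N_PIXELS))

for theta in (0.0, 14.0, 25.0):
    hit = trace_into_cell(theta)
    print(f"{theta:5.1f} deg ->",
          "cell wall (blocked)" if hit is None else f"pixel row {hit}")
```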
[105] FIGURES 18A-18B illustrate a method of manufacturing the plenoptic cell assembly of FIGURE 15, according to certain embodiments. In FIGURE 18A, a microlens array (MLA) sheet 1810 is formed or obtained. MLA sheet 1810 includes a plurality of lenslets as illustrated. In FIGURE 18B, a plurality of grooves 1820 are cut around each of the plurality of lenslets of MLA sheet 1810 to a predetermined depth. In some embodiments, grooves 1820 may be cut using multiple passes to achieve the desired depth. In some embodiments, grooves 1820 may be cut using laser ablation, etching, lithographic processes, or any other appropriate method. After grooves 1820 are cut to the desired depth, they are filled with a material configured to prevent light from bleeding through grooves 1820. In some embodiments, the material is any light-absorbing material (e.g., carbon nanotubes) or opaque material (e.g., a non-reflective opaque material or a tinted polymer) when hardened. The resulting plenoptic cell assembly after grooves 1820 are filled and allowed to harden is illustrated in FIGURES 20-21.

[106] FIGURES 19A-19B illustrate another method of manufacturing the plenoptic cell assembly of FIGURE 15, according to certain embodiments. In FIGURE 19A, a pre-formed lattice 1830 having voids 1840 is obtained or formed. Lattice 1830 is made of any suitable material as described above for cell walls 1514. Lattice 1830 may be formed using any suitable method including, but not limited to, additive manufacturing and ablation of cell matter.
[107] In FIGURE 19B, voids 1840 are filled with an optical polymer 1850. Optical polymer 1850 may be any suitable material as described above for transparent lenslet 1512. After voids 1840 are filled with optical polymer 1850, the final lens profile is created using molding or ablation. An example of the resulting plenoptic cell assembly after the lenses are formed is illustrated in FIGURES 20-21.
[108] FIGURES 22-23 illustrate a flexible circuit board 2210 that may be used as circuit board 740 by the emulated transparency assembly 710 of FIGURE 7, according to certain embodiments. Generally, wrapping electronics around a 3D shape such as a spherical or semispherical surface is a nontrivial task. Though various examples of flexible and even stretchable circuitry are currently available, there are several hurdles to overcome when positioning such electronics on a small-radius (e.g., 30-60 mm) spherical or semispherical surface. For example, bending of flexible electronics substrates in one direction does not inherently indicate adaptability to compound curvature, as the torsional forces required for such curvature can be damaging to the thin films involved. As another example, questions remain about the degree of stretchability and lifetime of stretchable electronics currently available.

[109] To address the problems and limitations of current solutions, embodiments of the disclosure present a 3D (e.g., spherical or semispherical) electronics manufacturing method using a geodesic faceted approach consisting of an array of small, rigid surfaces built on a single flexible circuit. In some embodiments, the flexible circuit is cut to a specific net shape and then wrapped to a 3D shape (e.g., a spherical or semispherical shape) and locked into place to prevent wear and tear from repeated flexing. The method is especially useful to accommodate the narrow radii of curvature (e.g., 30-60 mm) necessary for head-mounted near-eye wrapped displays. In some embodiments, the assembly includes a single, foundational flexible printed circuitry layer, with rigid sensor and display arrays layered on opposite sides of the flexible circuit. The entire assembly including sensor and display layers may be manufactured by standard planar semiconductor processes (e.g., spin coatings, photolithography, etc.). The rigid electronics layers may be etched to form individual sensor and display units (i.e., "facets") and then connected to the flexible circuitry by connection pads and adhered through patterned conductive and non-conductive adhesives. This permits the flexible circuitry to fold slightly at the edges between the rigid facets. In some embodiments, following planar manufacturing, the fully cured and functional electronic stack is formed to the desired final 3D shape using one side of a final rigid polymer casing as a mold. In this way, the arrays of rigid electronics facets are not deformed but simply fall into place in their mold, with the flexible circuitry bending at defined creases/gaps to match the faceted interior of the casing. The assembly may be finally capped and sealed using an opposite matching side of the rigid casing.

[110] Embodiments of the disclosure are not limited to only spherical or semispherical shapes, although such shapes are certainly contemplated. The disclosed embodiments may be formed into any compound curvature or any other revolved shape. Furthermore, the disclosed embodiments may be formed into any non-uniform curvature, as well as non-curved (i.e., flat) surfaces.
[111] FIGURE 22 illustrates flexible circuit board 2210 in two different states: a flat flexible circuit board 2210A and a 3D-shaped flexible circuit board 2210B. Flexible circuit board 2210 includes facet locations 2220, which in general are locations in which facets (e.g., sensor facets 3735, display facets 2665, or logic facets 2655 discussed below) may be installed on flexible circuit board 2210. In some embodiments, flexible circuit board 2210 includes gaps 2215. As illustrated in the bottom portion of FIGURE 22, when flexible circuit board 2210 is flat, at least some of facet locations 2220 are separated from one or more adjacent facet locations 2220 by one or more gaps 2215. As illustrated in the top portion of FIGURE 22, when flexible circuit board 2210 is formed into a 3D shape, gaps 2215 may be substantially eliminated, thereby forming a continuous surface across at least some of the facets that are coupled at facet locations 2220 (e.g., a continuous sensing surface across multiple sensor facets 3735 or a continuous display surface across multiple display facets 2665).
[112] In general, facet locations 2220 may have any shape. In some embodiments, facet locations 2220 are in the shape of a polygon (e.g., a triangle, square, rectangle, pentagon, hexagon, heptagon, or octagon). In some embodiments, facet locations 2220 are all identical. In other embodiments, however, facet locations 2220 all share the same polygon shape (e.g., all are hexagonal) but have different dimensions. In some embodiments, facet locations 2220 have heterogeneous shapes (e.g., some are rectangular and some are hexagonal). Any appropriate shape of facet locations 2220 may be used.
[113] In some embodiments, facet locations 2220 are arranged in columns 2201. In some embodiments, facet locations 2220 are additionally or alternatively arranged in rows 2202. While a specific pattern of facet locations 2220 is illustrated, any appropriate pattern of facet locations 2220 may be used.
[114] FIGURE 23 illustrates additional details of flexible circuit board 2210, according to certain embodiments. In some embodiments, each facet location 2220 includes pads and/or vias for coupling sensor or display facets to flexible circuit board 2210. As an example, some embodiments of flexible circuit board 2210 include BGA pads 2240 at each facet location 2220. Any appropriate pattern and number of pads/vias may be included at each facet location 2220.
[115] In general, each particular facet location 2220 is configured to transmit signals between a particular sensor facet coupled to the particular facet location and a particular display facet coupled to an opposite side of the particular facet location. For example, a particular facet location 2220 may have a sensor facet 3735 coupled to one side and a display facet 2665 coupled to its opposite side. The particular facet location 2220 provides the necessary electrical connections to permit signals from the sensor facet 3735 to travel directly to the display facet 2665, thereby enabling the display facet 2665 to display light that corresponds to light captured by the sensor facet 3735.
[116] In some embodiments, wire traces 2230 are included on flexible circuit board 2210 to electrically connect facet locations 2220. For example, wire traces 2230 may connect to interconnection pads 2250 of each facet location 2220 in order to electrically connect adjacent facet locations 2220. In some embodiments, facet locations 2220 are serially connected via wire traces 2230. For example, FIGURE 24 illustrates a serial data flow through flexible circuit board 2210, according to certain embodiments. In this example, each facet location 2220 is assigned a unique identifier (e.g., "1," "2," and so on), and data flows serially through facet locations 2220 via wire traces 2230 as illustrated. In this manner, each facet location 2220 may be addressed by a single processor or logic unit using its unique identifier. Any appropriate addressing scheme and data flow pattern may be used.
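A hedged sketch of this serial addressing scheme follows. The record format, the FacetLocation class, and the chain-walking logic are assumptions for illustration; the disclosure specifies only that facet locations carry unique identifiers and are serially connected by wire traces 2230:

```python
# Illustrative daisy-chain addressing: a packed stream of (id, frame)
# records flows down the chain, and each facet keeps only its own section.

class FacetLocation:
    def __init__(self, uid: int):
        self.uid = uid
        self.frame = None
        self.next = None          # wire trace 2230 to the next location

    def receive(self, packets):
        for uid, frame in packets:
            if uid == self.uid:   # addressed to this facet: consume it
                self.frame = frame
        if self.next:             # pass the stream along the chain
            self.next.receive(packets)

# Build a small chain 1 -> 2 -> 3 and push a multiplexed stream through.
chain = [FacetLocation(uid) for uid in (1, 2, 3)]
for a, b in zip(chain, chain[1:]):
    a.next = b
chain[0].receive([(1, "frame-A"), (2, "frame-B"), (3, "frame-C")])
print([loc.frame for loc in chain])   # ['frame-A', 'frame-B', 'frame-C']
```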
[117] FIGURE 25 illustrates a method 2500 of manufacturing an electronic assembly using flexible circuit board 2210 of FIGURE 22, according to certain embodiments. Method 2500 may begin in step 2510, where a plurality of facet locations are formed on a flexible circuit board. In some embodiments, the facet locations are facet locations 2220, and the flexible circuit board is flexible circuit board 2210. Each facet location corresponds to one of a plurality of sensor facets and one of a plurality of display facets. The sensor facets may be sensor facets 3735, and the display facets may be display facets 2665. In some embodiments, the plurality of facet locations are arranged into a plurality of facet columns such as columns 2201. In some embodiments, the plurality of facet locations are additionally or alternatively arranged into a plurality of facet rows such as rows 2202.
[118] At step 2520, the flexible circuit board of step 2510 is cut or otherwise shaped into a pattern that permits the flexible circuit board to be later formed into a 3D shape, such as a spherical or semispherical shape. When the flexible circuit board is flat, at least some of the facet locations are separated from one or more adjacent facet locations by a plurality of gaps such as gaps 2215. When the flexible circuit board is formed into the 3D shape, the plurality of gaps are substantially eliminated.
[119] At step 2530, the electronic assembly is assembled by coupling a first plurality of rigid facets to a first side of the flexible circuit board. The first plurality of rigid facets may be sensor facets 3735 or display facets 2665. Each rigid facet is coupled to a respective one of the facet locations. In some embodiments, the first plurality of rigid facets are coupled to connection pads on the first side of the flexible circuit board using patterned conductive and non-conductive adhesives.
[120] In some embodiments, the first plurality of rigid facets of step 2530 are rigid sensor facets such as sensor facets 3735, and method 2500 further includes coupling a plurality of rigid display facets such as display facets 2665 to a second side of the flexible circuit board that is opposite the first side. In this case, each particular facet location is configured to transmit signals between a particular rigid sensor facet electrically coupled to the particular facet location and a particular rigid display facet electrically coupled to the same particular facet location. This permits light to be displayed from the particular rigid display facet that corresponds to light captured by the corresponding rigid sensor facet.
[121] At step 2540, the assembled electronic assembly is formed into the desired 3D shape. In some embodiments, this step involves placing the flexible circuit board with its coupled rigid facets into one side of a rigid casing that is in the desired shape. This allows the rigid facets to fall into defined spaces in the casing and the flexible circuit board to bend at defined creases/gaps between the rigid facets. After placing the flexible circuit board with its coupled rigid facets into one side of the rigid casing, an opposite matching side of the rigid casing may be attached to the first side, thereby sealing the assembly into the desired shape.
[122] Particular embodiments may repeat one or more steps of method 2500, where appropriate. Although this disclosure describes and illustrates particular steps of method 2500 as occurring in a particular order, this disclosure contemplates any suitable steps of method 2500 occurring in any suitable order (e.g., any temporal order). Moreover, although this disclosure describes and illustrates an example method of manufacturing an electronic assembly using a flexible circuit board, this disclosure contemplates any suitable method of manufacturing an electronic assembly using a flexible circuit board, which may include all, some, or none of the steps of method 2500, where appropriate. Furthermore, although this disclosure describes and illustrates particular components, devices, or systems carrying out particular steps of method 2500, this disclosure contemplates any suitable combination of any suitable components, devices, or systems carrying out any suitable steps of method 2500.
[123] FIGURES 26-36 illustrate distributed multi-screen arrays for high-density displays, according to certain embodiments. In general, to provide a near-eye display capable of emulating the entire visual field of a single human eye, a high dynamic range image display with a resolution orders of magnitude greater than current common display screens is required. Such displays should be able to provide a light field display with enough angular and spatial resolution to accommodate 20/20 human visual acuity. This is an enormous amount of information, equating to a total horizontal pixel count of 100K to 200K. These displays should also wrap around the entire field of vision of one human eye (approximately 160° horizontally and 130° vertically). For rendering binocular vision, a pair of such displays spanning the entirety of a curved surface around each eye would be necessary. Typical displays available today, however, are unable to meet these requirements.

[124] To address these and other limitations of current displays, embodiments of the disclosure provide an array of small, high-resolution micro displays (e.g., display facets 2665) of custom sizes and shapes, all of which are formed and then assembled on a larger, flexible circuit board 2210 that may be formed into a 3D shape (e.g., a semispherical surface). The micro displays may be mounted to the interior side of semispherical circuitry, where another layer containing an array of TFT logic units (e.g., logic units 755) may be included to handle all the power and signal management. Typically, one logic unit 755 may be included for each micro display. Each micro display operates as a discrete unit, displaying data from the logic unit behind it. Any additional information (e.g., external video for AR, VR, or MR applications) may be passed to the entire array via a central control processor. In some embodiments, the external data signal progresses serially from one micro display to the next as a packed multiplex stream, while the TFT logic unit for each display determines the source and section of the signal to read. This allows each unit to act independently of any other display, providing a large array of many high-resolution displays with unique content on each, such that the whole assembly together forms essentially a single extremely high-resolution display.
[125] To fulfill the requirements of resolution, color clarity, and luminance output, each micro display may have a unique, high-performance pixel architecture. For example, each micro display screen may include arrays of display pixels 100 as described in FIGURES 1-4 and their associated descriptions in U.S. Patent Application No. 15/724,004 entitled "Stacked Transparent Pixel Structures for Electronic Displays," which is incorporated herein by reference in its entirety. The micro display screens may be assembled on the same substrate using any appropriate method. Such simultaneous manufacturing using standard semiconductor layering and photolithographic processes virtually eliminates the overhead and costs associated with production and packaging of many individual screens, greatly improving affordability.
[126] FIGURE 26 illustrates a cut-away view of a curved multi-display array 2600, according to certain embodiments. FIGURE 26 is essentially the back side of flexible circuit board 2210B of FIGURE 22 with the addition of logic facets 2655 and display facets 2665 coupled to flexible circuit board 2210B at facet locations 2220. In general, each logic facet 2655 is an individual logic unit 755 from logic unit layer 750. Similarly, each display facet 2665 is an individual display unit 765 from display layer 760 coupled with a portion of microlens array 720.
[127] In some embodiments, each individual logic facet 2655 is coupled to flexible circuit board 2210, and each individual display facet 2665 is then coupled to one of the logic facets 2655. In other embodiments, each logic facet 2655 is first coupled to one of the display facets 2665, and the combined facet is then coupled to flexible circuit board 2210. In such embodiments, the combined logic facet 2655 and display facet 2665 may be referred to as a display facet 2665 for simplicity. As used herein, "display facet" may refer to both embodiments (i.e., an individual display facet 2665 or a combination of a display facet 2665 with a logic facet 2655).
[128] In general, each display facet 2665 can be individually addressed (e.g., by a central control processor not pictured), and a collection of display facets 2665 may represent a dynamic, heterogeneous collection forming a singular collective. In other words, multi-display array 2600 provides a tiled electronic display system showing imagery through individual display facets 2665 that together form a complete whole. Each individual display facet 2665 is capable of providing multiple different display resolutions and can be customized on the fly to run a different resolution, color range, frame rate, etc. For example, one display facet 2665 may have a 512x512 display resolution while an adjacent display facet 2665 (of equal size) has a 128x128 display resolution, wherein the former represents a higher concentration of imagery data. In this example, these two displays are heterogeneous but are individually controllable and work in unison to form a singular display image.
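The heterogeneous-but-unified behavior described above can be sketched as a simple per-facet configuration record. The DisplayFacet class, its field names, and the example color depths are assumptions chosen for illustration; the 512x512 and 128x128 resolutions come from the example above:

```python
# Illustrative per-facet configuration for a heterogeneous tiled display.
from dataclasses import dataclass

@dataclass
class DisplayFacet:
    uid: int
    resolution: tuple        # (width, height), configurable on the fly
    frame_rate_hz: int       # selectable from a plurality of frame rates
    color_depth_bits: int    # stands in for a selectable color range

array_2600 = [
    DisplayFacet(1, (512, 512), 60, 10),   # higher imagery-data concentration
    DisplayFacet(2, (128, 128), 60, 8),    # adjacent facet of equal size
]

# Per-facet bandwidth differs, yet all facets form one singular image.
for f in array_2600:
    w, h = f.resolution
    print(f.uid, w * h * 3 * f.color_depth_bits * f.frame_rate_hz // 8, "bytes/s")
```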
[129] The overall collection of display facets 2665 can follow any curved or flat surface structure. For example, display facets 2665 may be formed into a semispherical surface, a cylindrical surface, an oblong spherical surface, or any other shaped surface.
[130] Logic facets 2655 and display facets 2665 may be in any appropriate shape. In some embodiments, the shapes of logic facets 2655 and display facets 2665 match each other and the shape of facet locations 2220. In some embodiments, logic facets 2655 and display facets 2665 are in the shape of a polygon such as a triangle, a quadrilateral, a pentagon, a hexagon, a heptagon, or an octagon. In some embodiments, some or all of logic facets 2655 and display facets 2665 have non-polygonal shapes. For example, display facets 2665 on the edges of flexible circuit board 2210 may not be polygonal, as they may have curved cutoffs so as to enhance the aesthetic of the overall assembly.

[131] In addition to having a selectable/controllable display resolution, each display facet 2665 may in some embodiments also have a selectable color range from a plurality of color ranges and/or a selectable frame rate from a plurality of frame rates. In such embodiments, the display facets 2665 of a particular flexible circuit board 2210 are configurable to provide heterogeneous frame rates and heterogeneous color ranges. For example, one display facet 2665 may have a particular color range while another display facet 2665 has a different color range. Similarly, one display facet 2665 may have a particular frame rate while another display facet 2665 has a different frame rate.
[132] FIGURE 27 illustrates an exploded view of the curved multi-display array 2600 of FIGURE 26, and FIGURES 28-29 illustrate additional details of logic facet 2655 and display facet 2665, according to certain embodiments. As illustrated in these figures, each logic facet 2655 may include interconnection pads 2850 that may be electrically coupled to interconnection pads 2250 of adjacent logic facets 2655. This may enable display facets 2665 to be serially coupled via wire traces 2230. In addition, each logic facet 2655 may include pads 2840 in a pattern that matches pads 2940 on the back side of display facet 2665. This permits logic facet 2655 and display facet 2665 to be coupled together using any appropriate technique in the art. In some embodiments, pads 2840 and pads 2940 are BGA pads or any other appropriate surface-mounting pads.
[133] FIGURES 30 and 32 illustrate a back side of flexible circuit board 2210 of FIGURE 22, and show similar details as described in reference to FIGURE 23. FIGURES 31 and 33 illustrate a serial data flow through flexible circuit board 2210, and show similar details as described in reference to FIGURE 24. FIGURE 34 illustrates an array of logic facets 2655 that have been formed into a semispherical shape, according to certain embodiments. In this figure, flexible circuit board 2210 and display facets 2665 have been removed for clarity. FIGURE 35 illustrates communications between the logic facets 2655 of FIGURE 34, according to certain embodiments. As illustrated in this figure, each logic facet 2655 may communicate with adjacent logic facets 2655 using interconnection pads 2850. In addition, each logic facet 2655 may have a unique identification as illustrated in FIGURE 35. This permits each logic facet 2655 to be uniquely addressed by, for example, a central processing unit.
[134] FIGURE 36 illustrates a method 3600 of manufacturing the curved multi-display array of FIGURE 26, according to certain embodiments. Method 3600 may begin in step 3610 where a plurality of facet locations are formed on a circuit board. In some embodiments, the facet locations are facet locations 2220 and the circuit board is flexible circuit board 2210. In some embodiments, each facet location corresponds to one of a plurality of display facets such as display facets 2665.
[135] At step 3620, the flexible circuit board is cut or otherwise formed into a pattern that permits the flexible circuit board to be later formed into a 3D shape. When the flexible circuit board is flat, at least some of the facet locations are separated from one or more adjacent facet locations by a plurality of gaps such as gaps 2215. When the flexible circuit board is formed into the 3D shape, the plurality of gaps are substantially eliminated.

[136] At step 3630, a plurality of logic facets are coupled to a first side of the flexible circuit board. Each logic facet is coupled to a respective one of the facet locations of step 3610. At step 3640, a plurality of display facets are coupled to a respective one of the plurality of logic facets of step 3630. In alternate embodiments, the display facets may be mounted to the logic facets of step 3630 at the wafer level prior to coupling the logic facets to the first side of the flexible circuit board. At step 3650, the assembled electronic display assembly is formed into the 3D shape. In some embodiments, this step may be similar to step 2540 of method 2500 described above. After step 3650, method 3600 may end.
[137] Particular embodiments may repeat one or more steps of method 3600, where appropriate. Although this disclosure describes and illustrates particular steps of method 3600 as occurring in a particular order, this disclosure contemplates any suitable steps of method 3600 occurring in any suitable order (e.g., any temporal order). Moreover, although this disclosure describes and illustrates an example method of manufacturing a curved multi-display array, this disclosure contemplates any suitable method of manufacturing a curved multi-display array, which may include all, some, or none of the steps of method 3600, where appropriate. Furthermore, although this disclosure describes and illustrates particular components, devices, or systems carrying out particular steps of method 3600, this disclosure contemplates any suitable combination of any suitable components, devices, or systems carrying out any suitable steps of method 3600.

[138] FIGURES 37-42 illustrate a distributed multi-aperture camera array 3700, according to certain embodiments. In general, to capture the full light field of the entire visual field of a single human eye, a large, high dynamic range image sensor with a resolution much higher than currently available is needed. Such an image sensor would enable a light field camera with enough angular and spatial resolution to accommodate 20/20 human visual acuity. This is an enormous amount of information, equating to a total horizontal pixel count of 100K to 200K. This multi-aperture image sensor must also wrap around the entire field of vision of one human eye (approximately 160° horizontally and 130° vertically). For imaging binocular vision, a pair of such cameras spanning the entirety of a curved surface around each eye is necessary. Typical image sensor assemblies available today are unable to meet these requirements.
[139] To overcome these and other limitations of typical image sensors, embodiments of the disclosure provide an array of small image sensors of custom sizes and shapes, all of which are assembled on a larger, flexible circuit board 2210 that is formed to a 3D (e.g., semispherical) shape. The image sensors (e.g., sensor facets 3735) are mounted to the exterior side of flexible circuit board 2210, where another layer containing an array of TFT logic units (e.g., logic units 755) may be provided to handle all the power and signal management, one logic unit for each display. Each image sensor operates as a discrete unit passing readout data to the logic unit behind it (in embodiments that include logic units), where it is handled and routed accordingly (e.g., to a corresponding display facet 2665 in some embodiments). This allows each sensor facet 3735 to act independently of any other sensor facet 3735, providing a large array of many apertures capturing unique content on each, such that the whole assembly essentially becomes a seamless, very high resolution, multi-node camera. It should be noted that while image sensors may pass data to their paired logic units in some embodiments, the functionality of the image sensors themselves does not necessarily require logic unit coupling.
[140] To fulfill the requirements of resolution, color clarity, and luminance output, each micro sensor may have a unique, high-performance pixel architecture. For example, each micro sensor may include arrays of sensor pixels 1800 as described in FIGURES 18-20 and their associated descriptions in U.S. Patent Application No. 15/724,027 entitled "Stacked Transparent Pixel Structures for Image Sensors," which is incorporated herein by reference in its entirety. The micro sensors may be assembled on the same substrate using any appropriate method. Such simultaneous manufacturing using standard semiconductor layering and photolithographic processes virtually eliminates the overhead and costs associated with production and packaging of many individual sensors, greatly improving affordability.
[141] Another characteristic of certain embodiments of distributed multi-aperture camera array 3700 is built-in depth perception based on parallax between different plenoptic cells. Imagery produced by cells on opposite sides of a given sensor may be used to calculate the offset of image detail, where offset distance directly correlates with proximity of the detail to the sensor surface. This scene information may be used by a central processor when overlaying any augmented video signal, resulting in AR/MR content placed in front of the viewer at the appropriate depth. The information can also be used for a variety of artificial focus blurring and depth-sensing tasks, including emulated depth of field, spatial edge detection, and other visual effects.
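This parallax-based depth perception can be sketched with classic stereo triangulation, assuming two cells on opposite sides of the sensor act as a stereo pair. The baseline, focal length, and pixel pitch below are assumed values; the relationship shown (larger offset means closer detail) is the one stated above:

```python
# Illustrative depth-from-disparity between two widely separated cells.
BASELINE_MM = 50.0       # separation between cells on opposite sides (assumed)
FOCAL_LEN_MM = 0.5       # per-cell lenslet focal length (assumed)
PITCH_MM = 0.01          # pixel pitch (assumed)

def depth_from_disparity(disparity_px: float) -> float:
    """Triangulation: depth = f * B / d, so larger offset -> closer detail."""
    d_mm = disparity_px * PITCH_MM
    return FOCAL_LEN_MM * BASELINE_MM / d_mm

for disp in (1.0, 5.0, 25.0):
    print(f"disparity {disp:4.1f} px -> depth {depth_from_disparity(disp):7.1f} mm")
```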
[142] FIGURE 37 illustrates a cut-away view of distributed multi-aperture camera array 3700, according to certain embodiments. FIGURE 37 is essentially the flexible circuit board 2210B of FIGURE 22 with the addition of sensor facets 3735 coupled to flexible circuit board 2210B at facet locations 2220. In some embodiments, each sensor facet 3735 is an individual sensor unit 735 from image sensor layer 730.

[143] In some embodiments, each individual sensor facet 3735 is coupled to flexible circuit board 2210. In other embodiments, each individual sensor facet 3735 is coupled to one of the logic facets 2655 that has been coupled to flexible circuit board 2210. In other embodiments, each logic facet 2655 is first coupled to one of the sensor facets 3735, and the combined facet is then coupled to flexible circuit board 2210. In such embodiments, the combined logic facet 2655 and sensor facet 3735 may be referred to as a sensor facet 3735 for simplicity. As used herein, "sensor facet" may refer to both embodiments (i.e., an individual sensor facet 3735 or a combination of a sensor facet 3735 with a logic facet 2655).
[144] In general, each sensor facet 3735 can be individually addressed (e.g., by a central control processor not pictured), and a collection of sensor facets 3735 may represent a dynamic, heterogeneous collection forming a singular collective. In other words, distributed multi-aperture camera array 3700 provides a tiled electronic sensor system providing imagery captured through individual sensor facets 3735 that together form a complete whole. Each individual sensor facet 3735 is capable of capturing images at multiple different resolutions and can be customized on the fly to capture a different resolution, color range, frame rate, etc. For example, one sensor facet 3735 may have a 512x512 capture resolution while an adjacent sensor facet 3735 (of equal size) has a 128x128 capture resolution, wherein the former represents a higher concentration of imagery data. In this example, these two sensors are heterogeneous but are individually controllable and work in unison to capture a singular light field.
[145] The overall collection of sensor facets 3735 can follow any curved or flat surface structure. For example, sensor facets 3735 may be formed into a semispherical surface, a cylindrical surface, an oblong spherical surface, or any other shaped surface.
[146] Sensor facets 3735 may be in any appropriate shape. In some embodiments, the shapes of sensor facets 3735 match the shapes of display facets 2665 and the shape of facet locations 2220. In some embodiments, sensor facets 3735 are in the shape of a polygon such as a triangle, a quadrilateral, a pentagon, a hexagon, a heptagon, or an octagon. In some embodiments, some or all of sensor facets 3735 have non-polygonal shapes. For example, sensor facets 3735 on the edges of flexible circuit board 2210 may not be polygonal, as they may have curved cutoffs so as to enhance the aesthetic of the overall assembly.
[147] In addition to having a selectable/controllable resolution, each sensor facet 3735 may in some embodiments also have a selectable color range from a plurality of color ranges and/or a selectable frame rate from a plurality of frame rates. In such embodiments, the sensor facets 3735 of a particular flexible circuit board 2210 are configurable to provide heterogeneous frame rates and heterogeneous color ranges. For example, one sensor facet 3735 may have a particular color range while another sensor facet 3735 has a different color range. Similarly, one sensor facet 3735 may have a particular frame rate while another sensor facet 3735 has a different frame rate.
[148] FIGURES 38-39 illustrate exploded views of the distributed multi-aperture camera array 3700 of FIGURE 37, according to certain embodiments. As illustrated in these figures, each sensor facet 3735 may include pads 3940 in a pattern that matches pads 2240 on flexible circuit board 2210 or pads 2840 on logic facet 2655. This permits sensor facet 3735 to be coupled to logic facet 2655 or flexible circuit board 2210 using any appropriate technique in the art. In some embodiments, pads 3940 are BGA pads or any other appropriate surface-mounting pads. FIGURES 40-41 illustrate similar views of flexible circuit board 2210 as shown in FIGURES 23-24, except that flexible circuit board 2210 has been formed into a 3D shape.
[149] FIGURE 42 illustrates a method 4200 of manufacturing distributed multi-aperture camera array 3700, according to certain embodiments. Method 4200 may begin in step 4210 where a plurality of facet locations are formed on a circuit board. In some embodiments, the facet locations are facet locations 2220 and the circuit board is flexible circuit board 2210. In some embodiments, each facet location corresponds to one of a plurality of sensor facets such as sensor facets 3735.
[150] At step 4220, the flexible circuit board is cut or otherwise formed into a pattern that permits the flexible circuit board to be later formed into a 3D shape. When the flexible circuit board is flat, at least some of the facet locations are separated from one or more adjacent facet locations by a plurality of gaps such as gaps 2215. When the flexible circuit board is formed into the 3D shape, the plurality of gaps are substantially eliminated.
[151] At step 4230, a plurality of sensor facets are coupled to a first side of the flexible circuit board. Each sensor facet is coupled to a respective one of the facet locations of step 4210. At step 4240, the assembled electronic camera assembly is formed into the 3D shape. In some embodiments, this step may be similar to step 2540 of method 2500 described above. After step 4240, method 4200 may end.
[152] Particular embodiments may repeat one or more steps of method 4200, where appropriate. Although this disclosure describes and illustrates particular steps of method 4200 as occurring in a particular order, this disclosure contemplates any suitable steps of method 4200 occurring in any suitable order (e.g., any temporal order). Moreover, although this disclosure describes and illustrates an example method of manufacturing a distributed multi-aperture camera array, this disclosure contemplates any suitable method of manufacturing a distributed multi-aperture camera array, which may include all, some, or none of the steps of method 4200, where appropriate. Furthermore, although this disclosure describes and illustrates particular components, devices, or systems carrying out particular steps of method 4200, this disclosure contemplates any suitable combination of any suitable components, devices, or systems carrying out any suitable steps of method 4200.

[153] Herein, "or" is inclusive and not exclusive, unless expressly indicated otherwise or indicated otherwise by context. Therefore, herein, "A or B" means "A, B, or both," unless expressly indicated otherwise or indicated otherwise by context. Moreover, "and" is both joint and several, unless expressly indicated otherwise or indicated otherwise by context. Therefore, herein, "A and B" means "A and B, jointly or severally," unless expressly indicated otherwise or indicated otherwise by context.
[154] The scope of this disclosure encompasses all changes, substitutions, variations, alterations, and modifications to the example embodiments described or illustrated herein that a person having ordinary skill in the art would comprehend. The scope of this disclosure is not limited to the example embodiments described or illustrated herein. Moreover, although this disclosure describes and illustrates respective embodiments herein as including particular components, elements, functions, operations, or steps, any of these embodiments may include any combination or permutation of any of the components, elements, functions, operations, or steps described or illustrated anywhere herein that a person having ordinary skill in the art would comprehend. Furthermore, reference in the appended claims to an apparatus or system or a component of an apparatus or system being adapted to, arranged to, capable of, configured to, enabled to, operable to, or operative to perform a particular function encompasses that apparatus, system, or component, whether or not it or that particular function is activated, turned on, or unlocked, as long as that apparatus, system, or component is so adapted, arranged, capable, configured, enabled, operable, or operative.

[155] Although this disclosure describes and illustrates respective embodiments herein as including particular components, elements, functions, operations, or steps, any of these embodiments may include any combination or permutation of any of the components, elements, functions, operations, or steps described or illustrated anywhere herein that a person having ordinary skill in the art would comprehend.
[156] Furthermore, reference in the appended claims to an apparatus or system or a component of an apparatus or system being adapted to, arranged to, capable of, configured to, enabled to, operable to, or operative to perform a particular function encompasses that apparatus, system, or component, whether or not it or that particular function is activated, turned on, or unlocked, as long as that apparatus, system, or component is so adapted, arranged, capable, configured, enabled, operable, or operative.

Claims

WHAT IS CLAIMED IS:
1. An electronic display assembly comprising:
a first microlens layer on a first side of a circuit board, the first microlens layer comprising a first plurality of cells;
a second microlens layer on an opposite side of the circuit board from the first microlens layer, the second microlens layer comprising a second plurality of cells;
an image sensor layer adjacent to the first microlens layer, the image sensor layer comprising a plurality of sensor pixels configured to detect incoming light through the first plurality of cells; and
a display layer adjacent to the second microlens layer, the display layer comprising a plurality of display pixels configured to emit light through the second plurality of cells;
wherein each cell of the first and second plurality of cells comprises:
a transparent lenslet; and
a plurality of opaque walls configured to prevent light from bleeding into adjacent cells.
2. The electronic display assembly of Claim 1, further comprising a filter layer on at least one end of each cell of the first and second plurality of cells, the filter layer operable to limit light passing through the filter layer to a specific incident angle.
3. The electronic display assembly of Claim 2, wherein the filter layer comprises a coating or a film.
4. The electronic display assembly of Claim 1, wherein the transparent lenslet of each cell of the first and second plurality of cells is formed from a polymer, silica glass, or sapphire.
5. The electronic display assembly of Claim 4, wherein the polymer comprises polycarbonate or acrylic.
6. The electronic display assembly of Claim 1, wherein the transparent lenslet of each cell of the first and second plurality of cells comprises a three-dimensional shape with a collimating lens on one end of the three-dimensional shape, the three-dimensional shape comprising:
a triangular polyhedron;
a rectangular cuboid;
a pentagonal polyhedron;
a hexagonal polyhedron;
a heptagonal polyhedron; or
an octagonal polyhedron.
7. An electronic display assembly comprising:
a microlens layer comprising a plurality of cells; and
a pixel array layer adjacent to the microlens layer, the pixel array layer comprising a plurality of pixels;
wherein each cell of the plurality of cells comprises: a transparent lenslet; and
one or both of:
a plurality of opaque walls configured to prevent light from bleeding into adjacent cells; and
a filter layer on one end of each cell of the plurality of cells, the filter layer configured to limit light passing through the filter layer to a specific incident angle.
8. The electronic display assembly of Claim 7, wherein the filter layer comprises a coating or a film.
9. The electronic display assembly of Claim 7, wherein the transparent lenslet of each cell of the plurality of cells is formed from a polymer, silica glass, or sapphire.
10. The electronic display assembly of Claim 9, wherein the polymer comprises polycarbonate or acrylic.
11. The electronic display assembly of Claim 7, wherein the transparent lenslet of each cell of the plurality of cells comprises a three-dimensional shape with a collimating lens on one end of the three-dimensional shape, the three-dimensional shape comprising:
a triangular polyhedron;
a rectangular cuboid;
a pentagonal polyhedron;
a hexagonal polyhedron;
a heptagonal polyhedron; or
an octagonal polyhedron.
12. The electronic display assembly of Claim 7, wherein the plurality of pixels comprises:
a plurality of sensor pixels configured to detect incoming light through the plurality of cells; or
a plurality of display pixels configured to emit light through the plurality of cells.
13. A method of manufacturing an electronic display, the method comprising:
obtaining a microlens array (MLA), the MLA comprising a plurality of lenslets;
cutting a plurality of grooves around each of the plurality of lenslets to a predetermined depth; and
filling the plurality of grooves with a material configured to prevent light from bleeding through the plurality of grooves.
14. The method of manufacturing an electronic display of Claim 13, wherein the cutting comprises lithography.
15. The method of manufacturing an electronic display of Claim 13, wherein the cutting comprises laser etching.
16. The method of manufacturing an electronic display of Claim 13, wherein the material comprises an opaque material.
17. The method of manufacturing an electronic display of Claim 16, wherein the opaque material comprises a non-reflective opaque material or a tinted polymer.
18. The method of manufacturing an electronic display of Claim 13, wherein the material comprises a light-absorbing material.
19. The method of manufacturing an electronic display of Claim 18, wherein the light-absorbing material comprises a solution of carbon nanotubes.
20. The method of manufacturing an electronic display of Claim 13, wherein cutting the plurality of grooves around each of the plurality of lenslets to the predetermined depth comprises performing multiple cutting passes to achieve the predetermined depth.
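Claims 13-20 recite the manufacturing flow: obtain a microlens array, cut grooves around each lenslet by lithography or laser etching, in multiple passes where needed to reach the predetermined depth, and fill the grooves with an opaque or light-absorbing material such as a tinted polymer or a carbon-nanotube solution. The procedural sketch below outlines that flow; the depth, per-pass increment, and default material are assumed example values, not figures from the application.

```python
import math

# Outline of the claimed manufacturing flow (Claims 13-20). The depth,
# per-pass increment, and material names are assumed example values.

GROOVE_DEPTH_UM = 120.0   # the "predetermined depth" (value assumed)
DEPTH_PER_PASS_UM = 40.0  # Claim 20: multiple passes reach the full depth

def cut_grooves(lenslet_count: int, method: str = "laser etching") -> None:
    """Cut grooves around each lenslet by lithography or laser etching."""
    assert method in ("lithography", "laser etching")  # Claims 14-15
    passes = math.ceil(GROOVE_DEPTH_UM / DEPTH_PER_PASS_UM)
    for p in range(1, passes + 1):
        depth = min(p * DEPTH_PER_PASS_UM, GROOVE_DEPTH_UM)
        print(f"pass {p}: {method} around {lenslet_count} lenslets to {depth} um")

def fill_grooves(material: str = "carbon-nanotube solution") -> None:
    """Fill the grooves so light cannot bleed between cells (Claims 16-19)."""
    print(f"filling grooves with {material}")

cut_grooves(lenslet_count=10_000)
fill_grooves()
```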
PCT/US2019/014668 2018-02-07 2019-01-23 Plenoptic cellular imaging system WO2019156807A1 (en)

Priority Applications (9)

Application Number Priority Date Filing Date Title
JP2020542664A JP7077411B2 (en) 2018-02-07 2019-01-23 Plenoptic Cellular Imaging System
CA3090661A CA3090661C (en) 2018-02-07 2019-01-23 Plenoptic cellular imaging system
AU2019218741A AU2019218741B2 (en) 2018-02-07 2019-01-23 Plenoptic cellular imaging system
ES19704502T ES2953632T3 (en) 2018-02-07 2019-01-23 Plenoptic cellular imaging system
MYPI2020004059A MY197937A (en) 2018-02-07 2019-01-23 Plenoptic cellular imaging system
EP19704502.4A EP3750000B1 (en) 2018-02-07 2019-01-23 Plenoptic cellular imaging system
CN201980019851.1A CN111902762B (en) 2018-02-07 2019-01-23 All-optical element imaging system
KR1020207025315A KR102479029B1 (en) 2018-02-07 2019-01-23 Plenoptic Cellular Imaging System
SG11202007486YA SG11202007486YA (en) 2018-02-07 2019-01-23 Plenoptic cellular imaging system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15/890,711 2018-02-07
US15/890,711 US10979699B2 (en) 2018-02-07 2018-02-07 Plenoptic cellular imaging system

Publications (1)

Publication Number Publication Date
WO2019156807A1 true WO2019156807A1 (en) 2019-08-15

Family

ID=65363385

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/014668 WO2019156807A1 (en) 2018-02-07 2019-01-23 Plenoptic cellular imaging system

Country Status (11)

Country Link
US (1) US10979699B2 (en)
EP (1) EP3750000B1 (en)
JP (1) JP7077411B2 (en)
KR (1) KR102479029B1 (en)
CN (1) CN111902762B (en)
AU (1) AU2019218741B2 (en)
CA (1) CA3090661C (en)
ES (1) ES2953632T3 (en)
MY (1) MY197937A (en)
SG (1) SG11202007486YA (en)
WO (1) WO2019156807A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10652529B2 (en) 2018-02-07 2020-05-12 Lockheed Martin Corporation In-layer signal processing
US11616941B2 (en) 2018-02-07 2023-03-28 Lockheed Martin Corporation Direct camera-to-display system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5731899A (en) * 1996-12-20 1998-03-24 Eastman Kodak Company Lenslet array system incorporating an integral field lens/reimager lenslet array
US20070109438A1 (en) * 2004-01-20 2007-05-17 Jacques Duparre Image recognition system and use thereof

Family Cites Families (220)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5475514A (en) 1990-12-31 1995-12-12 Kopin Corporation Transferred single crystal arrayed devices including a light shield for projection displays
US7075501B1 (en) 1990-12-31 2006-07-11 Kopin Corporation Head mounted display system
US6548956B2 (en) 1994-12-13 2003-04-15 The Trustees Of Princeton University Transparent contacts for organic devices
US6744912B2 (en) 1996-11-29 2004-06-01 Varian Medical Systems Technologies, Inc. Multiple mode digital X-ray imaging system
WO1999038046A1 (en) 1998-01-23 1999-07-29 Burger Robert J Lenslet array systems and methods
JPH11272209A (en) 1998-01-30 1999-10-08 Hewlett Packard Co <Hp> Integrated circuit video tile for display
ATE287114T1 (en) * 2000-04-14 2005-01-15 C 360 Inc LUMINOUS VISION DEVICE, DISPLAY SYSTEM COMPRISING THE SAME AND METHOD FOR VISUALIZING THEREOF
US6775048B1 (en) * 2000-10-31 2004-08-10 Microsoft Corporation Microelectrical mechanical structure (MEMS) optical modulator and optical display system
US6724012B2 (en) 2000-12-14 2004-04-20 Semiconductor Energy Laboratory Co., Ltd. Display matrix with pixels having sensor and light emitting portions
JP4543560B2 (en) * 2001-02-09 2010-09-15 日本電気株式会社 Image input device with built-in display function
US7387913B2 (en) 2001-08-08 2008-06-17 Jsr Corporation 3D optoelectronic micro system
TWI289896B (en) 2001-11-09 2007-11-11 Semiconductor Energy Lab Laser irradiation apparatus, laser irradiation method, and method of manufacturing a semiconductor device
US7233354B2 (en) 2002-10-11 2007-06-19 Hewlett-Packard Development Company, L.P. Digital camera that adjusts resolution for low light conditions
US7015639B2 (en) 2002-10-22 2006-03-21 Osram Opto Semiconductors Gmbh Electroluminescent devices and method of making transparent cathodes
FR2852147B1 (en) 2003-03-06 2005-09-30 Commissariat Energie Atomique PIXEL MATRIX DETECTORS INTEGRATED ON LOAD READING CIRCUIT
US6909233B2 (en) 2003-06-11 2005-06-21 Eastman Kodak Company Stacked OLED display having improved efficiency
US8884845B2 (en) 2003-10-28 2014-11-11 Semiconductor Energy Laboratory Co., Ltd. Display device and telecommunication system
US20050205879A1 (en) 2004-03-17 2005-09-22 Fuji Photo Film Co., Ltd. Photoelectric converting film stack type solid-state image pickup device
US20050206755A1 (en) 2004-03-17 2005-09-22 Fuji Photo Film Co., Ltd. Solid-state imaging device
JP2005268609A (en) 2004-03-19 2005-09-29 Fuji Photo Film Co Ltd Multilayer lamination multi-pixel imaging element and television camera
US7352415B1 (en) * 2004-04-23 2008-04-01 Geronimi Heather A Electronic image display systems
US20050242712A1 (en) 2004-04-29 2005-11-03 Chao-Chin Sung Multicolor electroluminescent display
KR101084853B1 (en) 2004-08-03 2011-11-21 실버브룩 리서치 피티와이 리미티드 Walk-up printing
US20060054782A1 (en) 2004-08-25 2006-03-16 Olsen Richard I Apparatus for multiple camera devices and method of operating same
GB2420536A (en) 2004-11-27 2006-05-31 Tom Vlaming Automotive head up display system
JP2007012359A (en) 2005-06-29 2007-01-18 Hitachi Displays Ltd Organic el display device
KR100704765B1 (en) * 2005-07-12 2007-04-10 에이디반도체(주) Capacitance switch which can indicate the operating state by light
US11428937B2 (en) 2005-10-07 2022-08-30 Percept Technologies Enhanced optical and perceptual digital eyewear
US20070123133A1 (en) 2005-11-30 2007-05-31 Eastman Kodak Company OLED devices with color filter array units
US9101279B2 (en) 2006-02-15 2015-08-11 Virtual Video Reality By Ritchey, Llc Mobile user borne brain activity data and surrounding environment data correlation system
TW200803606A (en) 2006-06-13 2008-01-01 Itc Inc Ltd The fabrication of full color OLED panel using micro-cavity structure
US7542210B2 (en) 2006-06-29 2009-06-02 Chirieleison Sr Anthony Eye tracking head mounted display
JP2008066814A (en) * 2006-09-05 2008-03-21 Murata Mach Ltd Optical reader and image reader
US7697053B2 (en) 2006-11-02 2010-04-13 Eastman Kodak Company Integrated display having multiple capture devices
US7714923B2 (en) 2006-11-02 2010-05-11 Eastman Kodak Company Integrated display and capture apparatus
US7808540B2 (en) 2007-01-09 2010-10-05 Eastman Kodak Company Image capture and integrated display apparatus
US20080218331A1 (en) 2007-03-08 2008-09-11 Itt Manufacturing Enterprises, Inc. Augmented reality-based system and method to show the location of personnel and sensors inside occluded structures and provide increased situation awareness
US8456393B2 (en) 2007-05-31 2013-06-04 Nthdegree Technologies Worldwide Inc Method of manufacturing a light emitting, photovoltaic or other electronic apparatus and system
US7952059B2 (en) 2007-06-13 2011-05-31 Eyes Of God, Inc. Viewing system for augmented reality head mounted display with rotationally symmetric aspheric lenses
CN101345248B (en) 2007-07-09 2010-07-14 博立码杰通讯(深圳)有限公司 Multi-optical spectrum light-sensitive device and preparation thereof
KR20090038242A (en) 2007-10-15 2009-04-20 삼성전자주식회사 An image sensor including photoelectric transducing - charge trap layer
KR100916321B1 (en) 2007-12-17 2009-09-11 한국전자통신연구원 Apparatus for Organic Light Emitting Diode touch screen and manufacturing method thereof
US8866920B2 (en) 2008-05-20 2014-10-21 Pelican Imaging Corporation Capturing and processing of images using monolithic camera array with heterogeneous imagers
US8339429B2 (en) 2008-07-24 2012-12-25 International Business Machines Corporation Display monitor electric power consumption optimization
US7948547B2 (en) * 2008-08-05 2011-05-24 Eastman Kodak Company Apparatus and method for capturing and viewing images
US8610650B2 (en) 2009-05-20 2013-12-17 Dialog Semiconductor Gmbh Advanced multi line addressing
DE102010000925B4 (en) 2010-01-14 2012-04-19 Kreyenborg Beteiligungen Und Verwaltungen Gmbh & Co. Kg Siebträgerelement for a filtration device with at least one Siebkavität
US8749620B1 (en) 2010-02-20 2014-06-10 Lytro, Inc. 3D light field cameras, images and files, and methods of using, operating, processing and viewing same
US20120249797A1 (en) 2010-02-28 2012-10-04 Osterhout Group, Inc. Head-worn adaptive display
US8482859B2 (en) 2010-02-28 2013-07-09 Osterhout Group, Inc. See-through near-eye display glasses wherein image light is transmitted to and reflected from an optically flat film
US8964298B2 (en) 2010-02-28 2015-02-24 Microsoft Corporation Video display modification based on sensor input for a see-through near-to-eye display
US9229227B2 (en) 2010-02-28 2016-01-05 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US9852496B2 (en) 2010-06-11 2017-12-26 Back In Focus Systems and methods for rendering a display to compensate for a viewer's visual impairment
US8163581B1 (en) 2010-10-13 2012-04-24 Monolith IC 3D Semiconductor and optoelectronic devices
US9632315B2 (en) 2010-10-21 2017-04-25 Lockheed Martin Corporation Head-mounted display apparatus employing one or more fresnel lenses
US8576276B2 (en) 2010-11-18 2013-11-05 Microsoft Corporation Head-mounted display device which provides surround video
TWI462265B (en) 2010-11-30 2014-11-21 Ind Tech Res Inst Image capture device
CA2821401C (en) 2010-12-16 2019-04-30 Lockheed Martin Corporation Collimating display with pixel lenses
CN102565918B (en) 2010-12-17 2015-04-22 杜比实验室特许公司 Quantum dot lighting engineering
US9179134B2 (en) 2011-01-18 2015-11-03 Disney Enterprises, Inc. Multi-layer plenoptic displays that combine multiple emissive and light modulating planes
JPWO2012117670A1 (en) 2011-03-01 2014-07-07 パナソニック株式会社 Solid-state imaging device
JP2012205015A (en) 2011-03-24 2012-10-22 Casio Comput Co Ltd Image processor and image processing method, and program
US8605082B2 (en) 2011-04-18 2013-12-10 Brian K. Buchheit Rendering adjustments to autocompensate for users with ocular abnormalities
US20150077312A1 (en) 2011-05-13 2015-03-19 Google Inc. Near-to-eye display having adaptive optics
US8508830B1 (en) 2011-05-13 2013-08-13 Google Inc. Quantum dot near-to-eye display
KR101252169B1 (en) * 2011-05-27 2013-04-05 엘지전자 주식회사 Mobile terminal and operation control method thereof
US9454851B2 (en) 2011-06-24 2016-09-27 Intel Corporation Efficient approach to estimate disparity map
US20130175557A1 (en) 2011-09-02 2013-07-11 The Procter & Gamble Company Light emitting apparatus
US20130083003A1 (en) 2011-09-30 2013-04-04 Kathryn Stone Perez Personal audio/visual system
US20130188045A1 (en) 2012-01-20 2013-07-25 Nokia Corporation High Resolution Surveillance Camera
US8848006B2 (en) * 2012-01-25 2014-09-30 Massachusetts Institute Of Technology Tensor displays
US9368546B2 (en) 2012-02-15 2016-06-14 Microsoft Technology Licensing, Llc Imaging structure with embedded light sources
US9578318B2 (en) 2012-03-14 2017-02-21 Microsoft Technology Licensing, Llc Imaging structure emitter calibration
US20130286053A1 (en) 2012-04-25 2013-10-31 Rod G. Fleck Direct view augmented reality eyeglass-type display
KR101260287B1 (en) 2012-04-27 2013-05-03 (주)뷰아이텍 Method for simulating spectacle lens image using augmented reality
US9210392B2 (en) 2012-05-01 2015-12-08 Pelican Imaging Coporation Camera modules patterned with pi filter groups
US9001427B2 (en) 2012-05-30 2015-04-07 Microsoft Technology Licensing, Llc Customized head-mounted display device
US9179126B2 (en) 2012-06-01 2015-11-03 Ostendo Technologies, Inc. Spatio-temporal light field cameras
WO2014005123A1 (en) 2012-06-28 2014-01-03 Pelican Imaging Corporation Systems and methods for detecting defective camera arrays, optic arrays, and sensors
US9841537B2 (en) * 2012-07-02 2017-12-12 Nvidia Corporation Near-eye microlens array displays
US9494797B2 (en) 2012-07-02 2016-11-15 Nvidia Corporation Near-eye parallax barrier displays
US9992453B2 (en) 2012-07-20 2018-06-05 Freedom Scientific, Inc. Multiposition magnifier camera
EP2690483A1 (en) 2012-07-25 2014-01-29 Johnson Controls Automotive Electronics SAS Head-up display and method for operating it
US8754829B2 (en) 2012-08-04 2014-06-17 Paul Lapstun Scanning light field camera and display
US9250445B2 (en) * 2012-08-08 2016-02-02 Carol Ann Tosaya Multiple-pixel-beam retinal displays
US9349769B2 (en) * 2012-08-22 2016-05-24 Taiwan Semiconductor Manufacturing Company, Ltd. Image sensor comprising reflective guide layer and method of forming the same
JP2014048652A (en) 2012-09-04 2014-03-17 Japan Display Inc Liquid crystal display device
US9237263B2 (en) 2012-10-05 2016-01-12 Vidinoti Sa Annotation method and apparatus
EP2915156A4 (en) 2012-11-01 2015-10-28 Lellan Inc Seamless illuminated modular panel
US9933684B2 (en) 2012-11-16 2018-04-03 Rockwell Collins, Inc. Transparent waveguide display providing upper and lower fields of view having a specific light output aperture configuration
JP6220125B2 (en) 2012-12-28 2017-10-25 キヤノン株式会社 Imaging apparatus and control method thereof
US9497380B1 (en) 2013-02-15 2016-11-15 Red.Com, Inc. Dense field imaging
US9379359B2 (en) 2013-03-13 2016-06-28 Panasonic Intellectual Property Management Co., Ltd. Organic electroluminescence element and lighting device using same
KR20230113418A (en) 2013-03-15 2023-07-28 매직 립, 인코포레이티드 Display system and method
TWI649675B (en) * 2013-03-28 2019-02-01 新力股份有限公司 Display device
KR20150136519A (en) 2013-05-01 2015-12-07 코니카 미놀타 가부시키가이샤 Organic electroluminescent element
US9036078B1 (en) 2013-05-14 2015-05-19 Google Inc. Reducing light damage in shutterless imaging devices
DE112014002456B4 (en) 2013-05-17 2017-07-13 Panasonic Intellectual Property Management Co., Ltd. Organic electroluminescent element and lighting device
US9519144B2 (en) 2013-05-17 2016-12-13 Nvidia Corporation System, method, and computer program product to produce images for a near-eye light field display having a defect
ES2872927T3 (en) 2013-05-21 2021-11-03 Photonic Sensors & Algorithms S L Monolithic integration of plenoptic lenses on photosensor substrates
KR20140140861A (en) 2013-05-30 2014-12-10 영남대학교 산학협력단 Organic Light Emitting Display
JP6098375B2 (en) 2013-05-31 2017-03-22 日本精機株式会社 Head-up display device
US9874749B2 (en) 2013-11-27 2018-01-23 Magic Leap, Inc. Virtual and augmented reality systems and methods
CN103353663B (en) 2013-06-28 2016-08-10 北京智谷睿拓技术服务有限公司 Imaging adjusting apparatus and method
KR20150006731A (en) 2013-07-09 2015-01-19 삼성디스플레이 주식회사 Display device and driving method thereof
US20150022643A1 (en) 2013-07-19 2015-01-22 Google Inc. Asymmetric Sensor Array for Capturing Images
GB201314984D0 (en) 2013-08-21 2013-10-02 Sony Comp Entertainment Europe Head-mountable apparatus and systems
US9158115B1 (en) 2013-09-16 2015-10-13 Amazon Technologies, Inc. Touch control for immersion in a tablet goggles accessory
US9454008B2 (en) 2013-10-07 2016-09-27 Resonance Technology, Inc. Wide angle personal displays
TWI498769B (en) 2013-11-18 2015-09-01 Quanta Comp Inc Head mounted display apparatus and login method thereof
CN107272199B (en) 2013-11-27 2023-04-07 奇跃公司 Virtual and augmented reality systems and methods
US8958158B1 (en) 2013-12-03 2015-02-17 Google Inc. On-head detection for head-mounted display
KR102283111B1 (en) 2013-12-17 2021-07-29 마수피얼 홀딩스 아이엔시. Integrated microoptic imager, processor and display
US10274731B2 (en) 2013-12-19 2019-04-30 The University Of North Carolina At Chapel Hill Optical see-through near-eye display using point light source backlight
US10381395B2 (en) 2013-12-24 2019-08-13 Sony Semiconductor Solutions Corporation Light control device with stacked light control layers
EP3087427B1 (en) 2013-12-27 2019-01-30 Intel Corporation Device, method, and system of providing extended display with head mounted display
WO2015100714A1 (en) 2014-01-02 2015-07-09 Empire Technology Development Llc Augmented reality (ar) system
US9671613B2 (en) 2014-09-26 2017-06-06 Osterhout Group, Inc. See-through computer display systems
US9651784B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
KR102291619B1 (en) 2014-02-26 2021-08-23 삼성디스플레이 주식회사 Organic light emiiting dispaly device
WO2015134740A1 (en) 2014-03-05 2015-09-11 Arizona Board Of Regents On Behalf Of The University Of Arizona Wearable 3d augmented reality display with variable focus and/or object recognition
US9568727B2 (en) 2014-03-25 2017-02-14 Amazon Technologies, Inc. Electrowetting display pixel architecture
US10168534B2 (en) 2014-03-26 2019-01-01 Essilor International Methods and systems for augmented reality
US9710881B2 (en) 2014-04-05 2017-07-18 Sony Interactive Entertainment America Llc Varying effective resolution by screen location by altering rasterization parameters
US9843787B2 (en) 2014-04-24 2017-12-12 Qualcomm Incorporated Generation and use of a 3D radon image
WO2015162098A1 (en) 2014-04-24 2015-10-29 Carl Zeiss Meditec, Inc. Functional vision testing using light field displays
WO2015170410A1 (en) 2014-05-09 2015-11-12 日立マクセル株式会社 Image playback device, display device, and transmission device
EP3146715B1 (en) 2014-05-20 2022-03-23 University Of Washington Through Its Center For Commercialization Systems and methods for mediated-reality surgical visualization
US9706910B1 (en) 2014-05-29 2017-07-18 Vivid Vision, Inc. Interactive system for vision assessment and correction
KR102205856B1 (en) 2014-06-11 2021-01-21 삼성디스플레이 주식회사 Organic light emitting diode display device including sensors
EP3155668B1 (en) 2014-06-16 2021-02-17 B.G. Negev Technologies & Applications Ltd., at Ben-Gurion University Swir to visible image up-conversion integrated device
US20160267851A1 (en) 2014-06-17 2016-09-15 Nato Pirtskhlava One Way Display
US9581821B2 (en) 2014-06-24 2017-02-28 Fakespace Labs, Inc. Head mounted augmented reality display
US10198865B2 (en) 2014-07-10 2019-02-05 Seiko Epson Corporation HMD calibration with direct geometric modeling
US9596456B2 (en) 2014-07-16 2017-03-14 Quanta Computer Inc. Head mounted display system
JP6571110B2 (en) 2014-07-18 2019-09-04 ビュージックス コーポレーションVuzix Corporation Eyepiece display with self-emitting microdisplay engine
CN104166259B (en) 2014-07-30 2017-02-08 京东方科技集团股份有限公司 Display substrate, driving method thereof and display device
US10739882B2 (en) 2014-08-06 2020-08-11 Apple Inc. Electronic device display with array of discrete light-emitting diodes
JP6501462B2 (en) 2014-08-08 2019-04-17 キヤノン株式会社 PHOTOELECTRIC CONVERSION DEVICE AND DRIVING METHOD OF PHOTOELECTRIC CONVERSION DEVICE
US9465991B2 (en) 2014-08-11 2016-10-11 Microsoft Technology Licensing, Llc Determining lens characteristics
US9626936B2 (en) 2014-08-21 2017-04-18 Microsoft Technology Licensing, Llc Dimming module for augmented and virtual reality
KR102271817B1 (en) 2014-09-26 2021-07-01 삼성전자주식회사 Smart contact lens for augmented reality and methods of manufacturing and operating the same
GB2532003A (en) 2014-10-31 2016-05-11 Nokia Technologies Oy Method for alignment of low-quality noisy depth map to the high-resolution colour image
CN105744257B (en) 2014-12-08 2018-06-12 北京蚁视科技有限公司 A kind of optical invisible device
US9824498B2 (en) 2014-12-30 2017-11-21 Sony Interactive Entertainment Inc. Scanning display system in head-mounted display for virtual reality
US10049495B2 (en) 2015-01-14 2018-08-14 Hashplay Inc. System and method for providing virtual reality content
KR102311741B1 (en) 2015-01-14 2021-10-12 삼성디스플레이 주식회사 Head mounted display apparatus
US9734554B2 (en) 2015-02-27 2017-08-15 Charter Communications Operating, Llc Compensation for viewing with common vision abnormalities
WO2016138428A1 (en) 2015-02-27 2016-09-01 LAFORGE Optical, Inc. Augmented reality eyewear
US20160269720A1 (en) 2015-03-11 2016-09-15 Oculus Vr, Llc Flexible substrate display panel and panel geometry for ergonomics
US10078922B2 (en) 2015-03-11 2018-09-18 Oculus Vr, Llc Eye tracking for display resolution adjustment in a virtual reality system
CN104854651B (en) 2015-03-17 2018-02-02 深圳云英谷科技有限公司 Display pixel arrangement and its drive circuit
EP3281057A4 (en) 2015-04-08 2018-12-05 Dispelix Oy Optical see-through display element and device utilizing such element
US10085005B2 (en) 2015-04-15 2018-09-25 Lytro, Inc. Capturing light-field volume image and video data using tiled light-field cameras
US10152906B2 (en) 2015-04-26 2018-12-11 Mems Start, Llc Near-eye display system and method
US9912897B2 (en) 2015-05-11 2018-03-06 Semiconductor Energy Laboratory Co., Ltd. Imaging device and electronic device
US20180104106A1 (en) 2015-05-12 2018-04-19 Agency For Science, Technology And Research A system and method for displaying a video image
CN104795026A (en) 2015-05-13 2015-07-22 京东方科技集团股份有限公司 Driving circuit of full-color organic light emitting diode pixel and driving method thereof
JP2016225221A (en) 2015-06-02 2016-12-28 コニカミノルタ株式会社 Electroluminescence device
US9728143B2 (en) 2015-06-29 2017-08-08 Amazon Technologies, Inc. System and method for driving electrowetting display device
US10359630B2 (en) 2015-06-30 2019-07-23 Massachusetts Institute Of Technology Display apparatus comprising first and second optical phased arrays and method for augmented reality
CN107850788B (en) 2015-07-03 2020-10-27 依视路国际公司 Method and system for augmented reality
KR102348760B1 (en) 2015-07-24 2022-01-07 삼성전자주식회사 Image sensor and signal processing method thereof
US10451876B2 (en) 2015-08-03 2019-10-22 Facebook Technologies, Llc Enhanced visual perception through distance-based ocular projection
CN105093532A (en) 2015-08-03 2015-11-25 京东方科技集团股份有限公司 Virtual reality glasses and display method
US20170038607A1 (en) 2015-08-04 2017-02-09 Rafael Camara Enhanced-reality electronic device for low-vision pathologies, and implant procedure
KR101808852B1 (en) 2015-08-18 2017-12-13 권혁제 Eyeglass lens simulation system using virtual reality headset and method thereof
US20170061696A1 (en) 2015-08-31 2017-03-02 Samsung Electronics Co., Ltd. Virtual reality display apparatus and display method thereof
CN105192982B (en) 2015-09-07 2018-03-23 北京小鸟看看科技有限公司 The image correction method and system of adjustable virtual implementing helmet
KR20170033462A (en) 2015-09-16 2017-03-27 삼성디스플레이 주식회사 Electronic device and method for displaying image of a head mounted display device
KR102404648B1 (en) 2015-09-21 2022-05-31 엘지디스플레이 주식회사 Display device
US9958941B2 (en) 2015-09-24 2018-05-01 Tobii Ab Eye-tracking enabled wearable devices
US20170090194A1 (en) 2015-09-24 2017-03-30 Halo Augmented Reality Ltd. System And Method For Subtractive Augmented Reality And Display Contrast Enhancement
US20170092232A1 (en) 2015-09-30 2017-03-30 Brian Mullins Optical true time delay circuit in a head-mounted display
KR102521944B1 (en) 2015-10-02 2023-04-18 삼성디스플레이 주식회사 Head mounted display and fabrication method thereof
KR102406606B1 (en) 2015-10-08 2022-06-09 삼성디스플레이 주식회사 Organic light emitting device, organic light emitting display device having the same and fabricating method of the same
US11609427B2 (en) 2015-10-16 2023-03-21 Ostendo Technologies, Inc. Dual-mode augmented/virtual reality (AR/VR) near-eye wearable displays
US20170123488A1 (en) 2015-10-28 2017-05-04 Microsoft Technology Licensing, Llc Tracking of wearer's eyes relative to wearable device
US10338677B2 (en) 2015-10-28 2019-07-02 Microsoft Technology Licensing, Llc Adjusting image frames based on tracking motion of eyes
KR102448611B1 (en) 2015-10-30 2022-09-27 엘지디스플레이 주식회사 Organic light emitting display
KR102516054B1 (en) 2015-11-13 2023-03-31 삼성디스플레이 주식회사 Organic light emitting display apparatus and method for manufacturing the same
KR20170061602A (en) 2015-11-26 2017-06-05 가부시키가이샤 한도오따이 에네루기 켄큐쇼 Semiconductor device and electronic device
US10204451B2 (en) 2015-11-30 2019-02-12 Microsoft Technology Licensing, Llc Multi-optical surface optical design
US10445860B2 (en) 2015-12-08 2019-10-15 Facebook Technologies, Llc Autofocus virtual reality headset
EP3179334A1 (en) 2015-12-09 2017-06-14 Airbus Defence and Space GmbH Device and method for testing function or use of a head worn see through augmented reality device
KR20170070693A (en) 2015-12-14 2017-06-22 삼성전자주식회사 Image sensor
US20170169612A1 (en) 2015-12-15 2017-06-15 N.S. International, LTD Augmented reality alignment system and method
US10438410B2 (en) 2015-12-22 2019-10-08 Intel Corporation Text enhancements for head-mounted displays
JP6641987B2 (en) 2015-12-25 2020-02-05 セイコーエプソン株式会社 Virtual image display
KR102458645B1 (en) 2015-12-28 2022-10-25 엘지디스플레이 주식회사 Display device and driving method thereof
US10241319B2 (en) 2015-12-28 2019-03-26 Amazon Technologies, Inc. Electrowetting display pixels with fluid motion initiator
JP6607777B2 (en) 2015-12-28 2019-11-20 ルネサスエレクトロニクス株式会社 Semiconductor device and manufacturing method thereof
KR102570950B1 (en) 2015-12-28 2023-08-25 엘지디스플레이 주식회사 Display device for personal immersion apparatus
US10038033B2 (en) 2015-12-29 2018-07-31 Industrial Technology Research Institute Image sensor
KR102470377B1 (en) 2015-12-31 2022-11-23 엘지디스플레이 주식회사 Display device for personal immersion apparatus
US9946073B2 (en) 2015-12-31 2018-04-17 Oculus Vr, Llc Methods and systems for eliminating strobing by switching display modes in response to detecting saccades
US10043305B2 (en) 2016-01-06 2018-08-07 Meta Company Apparatuses, methods and systems for pre-warping images for a display system with a distorting optical component
CA3193007A1 (en) 2016-01-12 2017-07-20 Esight Corp. Language element vision augmentation methods and devices
US9998695B2 (en) 2016-01-29 2018-06-12 Ford Global Technologies, Llc Automotive imaging system including an electronic image sensor having a sparse color filter array
JP2017134365A (en) * 2016-01-29 2017-08-03 大日本印刷株式会社 Lens sheet, imaging module, and imaging device
US9836652B2 (en) 2016-02-02 2017-12-05 International Business Machines Corporation Showing danger areas associated with objects using augmented-reality display techniques
CN105572877B (en) 2016-02-03 2018-11-09 上海群英软件有限公司 A kind of wear-type augmented reality intelligent display device
CN109070804B (en) 2016-04-14 2021-09-21 金泰克斯公司 Vision-corrected vehicle display
WO2018022521A1 (en) 2016-07-25 2018-02-01 Magic Leap, Inc. Light field processor system
CN106309089B (en) 2016-08-29 2019-03-01 深圳市爱思拓信息存储技术有限公司 VR vision correction procedure and device
EP3293959A1 (en) 2016-09-07 2018-03-14 Thomson Licensing Plenoptic imaging device equipped with an enhanced optical system
US11160688B2 (en) 2016-11-10 2021-11-02 Samsung Electronics Co., Ltd. Visual aid display device and method of operating the same
US9977248B1 (en) 2016-12-21 2018-05-22 PhantaField, Inc. Augmented reality display system
US20190057957A1 (en) 2016-12-21 2019-02-21 PhantaField, Inc. Augmented reality display system
DE102017200112B4 (en) 2017-01-05 2021-03-18 Volkswagen Aktiengesellschaft Method and device for generating a dynamic light field
US20180269260A1 (en) 2017-01-29 2018-09-20 Emagin Corporation Quantum dot array on directly patterned amoled displays and method of fabrication
CN107260505A (en) 2017-02-14 2017-10-20 合肥中感微电子有限公司 Sight protectio method, device and the VR glasses with eyesight protection function
US10163963B2 (en) 2017-04-05 2018-12-25 Semiconductor Components Industries, Llc Image sensors with vertically stacked photodiodes and vertical transfer gates
US10638069B2 (en) 2017-04-07 2020-04-28 Raytheon Company System for and method of configurable line scan array imaging
US10178334B2 (en) 2017-04-07 2019-01-08 Raytheon Company System for and method of configurable diagonal and multi-mission line scan array imaging
US10249800B1 (en) 2017-10-03 2019-04-02 Lockheed Martin Corporation Stacked transparent pixel structures for electronic displays
US10930709B2 (en) 2017-10-03 2021-02-23 Lockheed Martin Corporation Stacked transparent pixel structures for image sensors
US10510812B2 (en) 2017-11-09 2019-12-17 Lockheed Martin Corporation Display-integrated infrared emitter and sensor structures
US10129984B1 (en) 2018-02-07 2018-11-13 Lockheed Martin Corporation Three-dimensional electronics distribution by geodesic faceting
US10690910B2 (en) * 2018-02-07 2020-06-23 Lockheed Martin Corporation Plenoptic cellular vision correction

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5731899A (en) * 1996-12-20 1998-03-24 Eastman Kodak Company Lenslet array system incorporating an integral field lens/reimager lenslet array
US20070109438A1 (en) * 2004-01-20 2007-05-17 Jacques Duparre Image recognition system and use thereof

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
BASEL SALAHIEH ET AL: "Light Field Retargeting from Plenoptic Camera to Integral Display", IMAGING AND APPLIED OPTICS 2017 (3D, AIO, COSI, IS, MATH, PCAOP), vol. 3, 1 January 2017 (2017-01-01), Washington, D.C., pages DTu3F.3, XP055584294, ISBN: 978-1-943580-29-3, DOI: 10.1364/3D.2017.DTu3F.3 *
SUNG-WOOK MIN ET AL: "Three-dimensional electro-floating display system using an integral imaging method", OPTICS EXPRESS, vol. 13, no. 12, 2005, pages 4358 - 4369, XP055585556, Retrieved from the Internet <URL:https://www.osapublishing.org/DirectPDFAccess/7831B52A-DF5F-01E3-B759E5E1DE829905_84314/oe-13-12-4358.pdf?da=1&id=84314&seq=0&mobile=no> *
XIAO XIAO ET AL: "Advances in three-dimensional integral imaging: sensing, display, and applications [Invited]", APPLIED OPTICS, OPTICAL SOCIETY OF AMERICA, WASHINGTON, DC; US, vol. 52, no. 4, 1 February 2013 (2013-02-01), pages 546 - 560, XP001580772, ISSN: 0003-6935, DOI: HTTP://DX.DOI.ORG/10.1364/AO.52.000546 *

Also Published As

Publication number Publication date
EP3750000A1 (en) 2020-12-16
CA3090661A1 (en) 2019-08-15
AU2019218741A1 (en) 2020-09-03
US10979699B2 (en) 2021-04-13
KR20200113272A (en) 2020-10-06
AU2019218741B2 (en) 2022-02-24
JP2021513271A (en) 2021-05-20
KR102479029B1 (en) 2022-12-20
JP7077411B2 (en) 2022-05-30
EP3750000B1 (en) 2023-07-05
US20190246097A1 (en) 2019-08-08
CN111902762A (en) 2020-11-06
CN111902762B (en) 2022-12-09
ES2953632T3 (en) 2023-11-14
MY197937A (en) 2023-07-25
SG11202007486YA (en) 2020-09-29
CA3090661C (en) 2022-10-04

Similar Documents

Publication Publication Date Title
EP3750184B1 (en) Three-dimensional electronics distribution by geodesic faceting
US10594951B2 (en) Distributed multi-aperture camera array
US10866413B2 (en) Eccentric incident luminance pupil tracking
US10698201B1 (en) Plenoptic cellular axis redirection
US11146781B2 (en) In-layer signal processing
AU2019218741B2 (en) Plenoptic cellular imaging system
EP3750183B1 (en) Display assemblies with electronically emulated transparency
US10951883B2 (en) Distributed multi-screen array for high density display
US11616941B2 (en) Direct camera-to-display system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19704502

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 3090661

Country of ref document: CA

Ref document number: 2020542664

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 20207025315

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2019218741

Country of ref document: AU

Date of ref document: 20190123

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2019704502

Country of ref document: EP

Effective date: 20200907