CA2223126A1 - Three-dimensional imaging system - Google Patents

Three-dimensional imaging system

Info

Publication number
CA2223126A1
CA2223126A1 CA002223126A CA2223126A CA2223126A1 CA 2223126 A1 CA2223126 A1 CA 2223126A1 CA 002223126 A CA002223126 A CA 002223126A CA 2223126 A CA2223126 A CA 2223126A CA 2223126 A1 CA2223126 A1 CA 2223126A1
Authority
CA
Canada
Prior art keywords
image
lenses
micro
array
focus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA002223126A
Other languages
French (fr)
Inventor
Jacob N. Wohlstadter
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US08/476,852 external-priority patent/US6014259A/en
Priority claimed from US08/476,854 external-priority patent/US5986811A/en
Priority claimed from US08/476,853 external-priority patent/US5717453A/en
Application filed by Individual filed Critical Individual
Publication of CA2223126A1 publication Critical patent/CA2223126A1/en
Abandoned legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/122Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B82NANOTECHNOLOGY
    • B82YSPECIFIC USES OR APPLICATIONS OF NANOSTRUCTURES; MEASUREMENT OR ANALYSIS OF NANOSTRUCTURES; MANUFACTURE OR TREATMENT OF NANOSTRUCTURES
    • B82Y30/00Nanotechnology for materials or surface science, e.g. nanocomposites
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10Beam splitting or combining systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00Simple or compound lenses
    • G02B3/12Fluid-filled or evacuated lenses
    • G02B3/14Fluid-filled or evacuated lenses of variable focal length
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/10Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images using integral imaging methods
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/207Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N13/229Image signal generators using stereoscopic image cameras using a single 2D image sensor using lenticular lenses, e.g. arrangements of cylindrical lenses
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/207Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N13/236Image signal generators using stereoscopic image cameras using a single 2D image sensor using varifocal lenses or mirrors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/243Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/246Calibration of cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/282Image signal generators for generating image signals corresponding to three or more geometrical viewpoints, e.g. multi-view systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/296Synchronisation thereof; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/305Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/322Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using varifocal lenses or mirrors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/167Synchronising or controlling image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/189Recording image signals; Reproducing recorded image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/194Transmission of image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/207Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N13/232Image signal generators using stereoscopic image cameras using a single 2D image sensor using fly-eye lenses, e.g. arrangements of circular lenses
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/25Image signal generators using stereoscopic image cameras using two or more image sensors with different characteristics other than in their location or field of view, e.g. having different resolutions or colour pickup characteristics; using image signals from one sensor to control the characteristics of another sensor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/257Colour aspects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/286Image signal generators having separate monoscopic and stereoscopic modes
    • H04N13/289Switching between monoscopic and stereoscopic modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/293Generating mixed stereoscopic images; Generating mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/307Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using fly-eye lenses, e.g. arrangements of circular lenses
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/324Colour aspects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/327Calibration thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/361Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/388Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
    • H04N13/395Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume with depth sampling, i.e. the volume being constructed from a stack or sequence of 2D image planes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/597Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding specially adapted for multi-view video sequence encoding
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074Stereoscopic image analysis
    • H04N2013/0081Depth or disparity estimation from stereoscopic image signals

Abstract

Recent advances in surface techniques have led to the development of extremely small (sub-micron) scale features. These techniques allow the formation of polymer micro-lenses (14) as well as variable focus liquid lenses (52). The present invention primarily concerns the use of small scale lenses for the fabrication of novel displays which exhibit three-dimensional (3D) effects. Both still images and video images (or other motion images) can be generated.

Description

THREE-DIMENSIONAL IMAGING SYSTEM

FIELD OF THE INVENTION

The present invention relates generally to optical systems, and more specifically to three dimensional imaging systems incorporating reflective, refractive, or diffractive compound lenses.

BACKGROUND

HUMAN VISION

Normal human vision provides a perception of space in the visual field of view that is in color and three dimensions (3D). A better realization of the optical requirements for a photographic system to present an acceptable 3D stereoscopic image or stereo-model to the viewer is given by an understanding of stereopsis, or visual perception of space.

The stimulus conditions for space perception are termed cues, and are in two groups. The monocular group allows perception with one eye and includes relative sizes of subjects, their interposition, linear and aerial perspective, distribution of light and shade, movement parallax of subject and background, and visual accommodation. The binocular group uses the two coordinated activities of both eyes: firstly, visual convergence, where the optical axes converge muscularly from parallel for distant vision to a convergence angle of 23° for a near point of 150 mm; and secondly, stereoscopic vision, where, due to the two different visual viewpoints, the imaging geometry gives two dissimilar retinal images for the left and right eyes. The disparities are due primarily to the relative displacement of corresponding or homologous image points of a subject point away from the optical axis due to its position in the binocular field of view.

Retinal images are encoded for transmission as frequency modulated voltage impulses along the optic nerve, with signal processing taking place at the intermediate lateral geniculate bodies and then the visual cortex of the brain. The resulting visual perception is unique to the observer. For a further discussion of human 3D perception, see, e.g., Sidney F. Ray, "Applied Photographic Optics: Imaging Systems For Photography, Film and Video," Focal Press, pp. 469-484 (1988), which is incorporated herein by reference.

3D TECHNIQUES

Many prior art 3D imaging systems use parallax to generate the 3D effect. Section 65.5 of Ray, cited above and which is incorporated herein by reference, provides a good description of several parallax-based techniques, such as 3D movies, stereo viewing of two side-by-side offset images, 3D post cards, etc. Although these parallax-only based schemes offer some degree of 3D effect, they are disappointingly unrealistic.

Another well known, but far more complex, technique for generating 3D images is holography. While holography can produce quite realistic 3D images, its use is quite limited because of the need for coherent light sources (such as lasers) and the darkroom or near darkroom conditions required to generate holograms.

One prior art technique for generating 3D images, known as integral photography, uses an array of small lenses (referred to as a fly's eye lens or a micro-lens array) to both generate and reproduce 3D images. The technique of integral photography is described in Ives, Herbert E., "Optical Properties of a Lippmann Lenticulated Sheet," Journal of the Optical Society of America 21:171-176 (1931).
Other techniques incorporating micro-lens arrays for the generation of 3D images are described in Yang et al., 1988, "Discussion of the Optics of a New 3-D Imaging System," Applied Optics 27(21):4529-4534; Davies et al., 1988, "Three-Dimensional Imaging Systems: A New Development," Applied Optics 27(21):4520-4528; Davies et al., 1994, "Design and Analysis of an Image Transfer System Using Micro-lens Arrays," Optical Engineering 33(11):3624-3633; Benton, Stephen A., 1972, "Direct Orthoscopic Stereo Panoramagram Camera," USPN 3,657,981; Nims et al., 1974, "Three Dimensional Pictures and Method of Composing Them," USPN 3,852,787; and Davies et al., 1991, "Imaging System," USPN 5,040,871, each of which is incorporated herein by reference. A drawback of the above micro-lens array based 3D optical systems is that all lenses in the array have a fixed focal length. This greatly limits the type of 3D effects that can be generated by such arrays.

THE FABRICATION OF MICRO-LENS ARRAYS

Great advances in the generation of very small scale surface features have been made recently. Micro-stamping techniques using self assembling monolayers (SAMs) have allowed low cost production of features on sub-micron (<10^-6 m) scales.


Certain compounds, when placed in an appropriate environment, are capable of spontaneously forming an ordered two-dimensional crystalline array. For example, solutions of alkane thiols exhibit this property on gold. Micro-stamping or micro contact printing uses a 'rubber' (silicone elastomer) stamp to selectively deposit alkane thiols in small domains on gold surfaces. A 'master' mold with the desired feature shapes and sizes is fabricated using optical lithographic techniques well known in the electronic arts. Poly(dimethylsiloxane) (PDMS), a silicone elastomer, is poured over the master, allowed to cure, and then gently removed. The resulting stamp is then inked by wetting the PDMS surface with a solution of the appropriate alkane thiol. The PDMS stamp is then placed on a gold surface and the desired pattern of alkane thiols is deposited selectively as a monolayer on the surface. The monolayers may be derivatized with various head groups (exposed to the environment away from the metallic surface) in order to tailor the properties of the surface.

In this fashion, alternating domains, hydrophilic and hydrophobic, may be easily fabricated on a surface on a very small scale. Under appropriate conditions such a surface, when cooled in the presence of water vapor, will selectively condense water droplets on the hydrophilic surface domains. Such droplets can act as convergent or divergent micro-lenses. Any shape lens or lens element may be produced. SAMs may be selectively deposited on planar or curved surfaces which may or may not be optically transparent. Offsetting, adjacent, stacked, and other configurations of SAM surfaces may all be used to generate complex lens shapes.

Using techniques similar to the SAM techniques discussed above, transparent polymers have been used to make stable micro-lenses. For example, a solution of unpolymerized monomers (which are hydrophilic) will selectively adsorb to hydrophilic domains on a derivatized SAM surface. At that point, polymerization may be initiated (e.g., by heating). By varying the shape of the derivatized surface domains, the amount of solution on the domain, and the solution composition, a great variety of different lenses with different optical properties may be formed.

For examples of optical techniques incorporating liquid optical elements and SAMs, see Kumar et al., 1994, "Patterned Condensation Figures as Optical Diffraction Gratings," Science 263:60-62; Kumar et al., 1993, "Features of Gold Having Micrometer to Centimeter Dimensions Can be Formed Through a Combination of Stamping With an Elastomeric Stamp and an Alkanethiol 'Ink' Followed by Chemical Etching," Appl. Phys. Lett. 63(14):2002-2004; Kumar et al., 1994, "Patterned Self-Assembled Monolayers: Applications in Materials Science," Langmuir 10:1498-1511;

Chaudhury et al., 1992, "How to Make Water Run Uphill," Science 256:1539-1541; Abbott et al., 1994, "Potential-Dependent Wetting of Aqueous Solutions on Self-Assembled Monolayers Formed From 15-(Ferrocenylcarbonyl)pentadecanethiol on Gold," Langmuir 10:1493-1497; and Gorman et al., in press, "Control of the Shape of Liquid Lenses on a Modified Gold Surface Using an Applied Electrical Potential Across a Self-Assembled Monolayer," Harvard University, Department of Chemistry, each of which is incorporated herein by reference.

Micro-lens arrays can also be fabricated using several other well known techniques. Some illustrative techniques for the generation of micro-lens or micro-mirror arrays are disclosed in the following articles, each of which is incorporated herein by reference: Liau et al., 1994, "Large-Numerical-Aperture Micro-lens Fabrication by One-Step Etching and Mass-Transport Smoothing," Appl. Phys. Lett. 64:1484-1486; Jay et al., 1994, "Preshaping Photoresist for Refractive Micro-lens Fabrication," Optical Engineering 33:3552-3555; MacFarlane et al., 1994, "Microjet Fabrication of Micro-lens Arrays," IEEE Photonics Technology Letters 6:1112-1114; Stern et al., 1994, "Dry Etching for Coherent Refractive Micro-lens Arrays," Optical Engineering 33(11):3547-3551; and Kendall et al., 1994, "Micromirror Arrays Using KOH:H2O Micromachining of Silicon for Lens Templates, Geodesic Lenses, and Other Applications," Optical Engineering 33(11):3578-3588.

FOCAL LENGTH VARIATION AND CONTROL

Using the micro-stamping technique discussed above, small lenses may be fabricated with variable focal lengths. Variable focus may be achieved through several general means, e.g., (i) through the use of electrical potentials; (ii) through mechanical deformation; (iii) through selective deposition, such as deposition of liquid water drops from the vapor phase (as described in Kumar et al. (Science, 1994), cited above); and (iv) heating or melting (e.g., structures may be melted to change optical properties, as in some micro-lens arrays which are crudely molded and then melted into finer optical elements).

The degree to which a solution wets or spreads on a surface may be controlled by varying the electronic properties of the system. For instance, by placing microelectrodes within the liquid lens and varying the potential with respect to the surface, the curvature of the lens may be varied. See Abbott et al., cited above. In other configurations, hydrophobic liquid micro-lenses are formed on a surface and covered with an aqueous solution, and the surface potential is varied versus the aqueous solution. Such systems have demonstrated extremely small volume lenses (~1 nL) which are capable of reversibly and rapidly varying focus (see Gorman et al., cited above).

Referring now to Figure 3, a schematic diagram of a variable focus lens 50 is shown. Variable focus lens 50 includes a liquid lens 52 and two SAM surfaces 54. SAM surfaces 54 adhere to liquid lens 52. As can be seen in the progression from Figures 3(a) through 3(c), by varying the distance between the SAM surfaces 54, the shape, and therefore the optical characteristics, of liquid lens 52 can be altered. There are also several other ways to vary the shape and optical properties of liquid lens 52. For example, the electrical potential between lens 52 and surface 54 can be varied, causing changes in the shape of lens 52, as is discussed further below with respect to Figure 4. The index of refraction of lens 52 can be varied by using different liquid materials. The cohesive and adhesive properties of liquid lens 52 can be adjusted by varying the chemistry of the liquid material and by varying the chemistry of surface 54. The three dimensional characteristics of surface 54 can be varied. For example, when viewed from the top or bottom, surface 54 can be circular, rectangular, hexagonal or any other shape, and may be moved up and down. These techniques may be used individually or in combination to create a variety of lens shapes and optical effects.

Referring now to Figure 4, a schematic diagram of an electrically variable focus lens as disclosed in the above cited Abbott et al. article is shown. A drop of liquid 52 is placed on SAM surface 54, which is in turn formed on metallic surface 56, preferably gold. By varying the electric potential between microelectrode 58 and SAM surface 54, the curvature (and thus the optical characteristics) of liquid lens 52 can be varied. The progression from Figure 4(a) to 4(c) shows schematically how the shape of liquid lens 52 can be changed. Similar effects can be achieved using the techniques described in the above Gorman et al. article, although microelectrodes 58 need not be used.
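
As a rough illustration of the curvature-to-focus relationship described above, the focal length of a droplet lens can be estimated with the thin-lens (lensmaker's) approximation once its radius of curvature is known. The sketch below is a minimal illustration only; the spherical-cap geometry, the refractive indices, and the droplet size are assumptions for the example and are not values specified in this description.

```python
import math

def droplet_focal_length(base_radius_m, contact_angle_deg, n_liquid=1.33, n_ambient=1.0):
    """Estimate the focal length of a plano-convex droplet lens.

    The droplet is modeled as a spherical cap on a flat surface, so its radius
    of curvature R follows from the base radius and contact angle, and the
    thin-lens approximation gives f = R / (n_rel - 1). All geometric and
    material values here are illustrative assumptions.
    """
    theta = math.radians(contact_angle_deg)
    R = base_radius_m / math.sin(theta)   # radius of curvature of the cap
    n_rel = n_liquid / n_ambient          # relative refractive index
    return R / (n_rel - 1.0)

# Example: a 50-micron water droplet; increasing the contact angle (e.g. by
# changing the applied potential) increases curvature and shortens the focus.
for angle in (30, 60, 90):
    f = droplet_focal_length(50e-6, angle)
    print(f"contact angle {angle:3d} deg -> f ~ {f * 1e6:7.1f} microns")
```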

Alternatively, such micro-lenses may be focused through mechanical means. For example, flexible polymeric or elastomeric lenses may be compressed or relaxed so as to vary focus through piezoelectric means. Alternatively, liquid lenses encased in flexible casings may be mechanically compressed or relaxed.

SUMMARY OF THE INVENTION
The present invention provides a 3D optical system which, in contrast to the prior art, includes a variable focus micro-lens array and an image that appears to have been taken with an optical system having a relatively high depth of field; that is, objects at varying distances within the image are substantially in focus over a predetermined area. In an alternative embodiment, variable focus micro-lens arrays can be used in combination with still or motion images to cause the apparent distance of the image to change. Another embodiment uses fixed arrays having elements with varying focal lengths to create 3D and other optical effects.

DESCRIPTION OF THE FIGURES

Figure 1 is a schematic diagram showing a 3D imaging system incorporating a micro-lens array according to a preferred embodiment.

Figures 2(a) - 2(c) are schematic diagrams showing the path of light directed to an observer under various conditions.

Figures 3(a) - 3(c) are schematic diagrams showing one technique for varying the focal length of a liquid micro-lens through the use of SAMs.

Figures 4(a) - 4(c) are schematic diagrams showing another technique for varying the focal length of a liquid micro-lens through the use of SAMs.

Figure 5 is a block diagram of a camera used to make two dimensional images of the type used in a preferred embodiment.

DETAILED DESCRIPTION

The structure and function of the preferred embodiments can best be understood by reference to the drawings. The reader will note that the same reference numerals appear in multiple figures. Where this is the case, the numerals refer to the same or corresponding structure. In a preferred embodiment, variable focus micro-lens arrays, such as those fabricated using the techniques discussed above, along with still or motion images having relatively great depth of field, are used to create 3D effects.

Referring to Figure 2(a), images viewed by the human eye comprise a plurality of extremely fine points which are perceived in continuous detail. As light falls on each object point, the light is scattered and the point diffusely reflects a cone of light 30 (i.e., light which subtends some solid angle) outward. If an object is viewed at a considerable distance by an observer 20, then a very small portion of cone 30 is collected and the rays of light that are collected are nearly parallel (see Figure 2(a): far focus). As the viewing distance decreases, however, the rays collected by the eyes of observer 20 are less parallel and are received at greater diverging angles (see Figure 2(a), medium focus and close focus). The complex of the cornea and lenses changes shape so that objects at varying distances can be focused. For a more complete discussion of diffuse reflection of the type discussed above, see, e.g., Tipler, Paul A., Physics for Scientists and Engineers, Third Edition, Extended Version, Worth Publishers, pp. 982-984, which is incorporated herein by reference.
Acco~ g to a plef~ ,d embo~lim~nt, a two ~ n~l rh~to~rh or lO image which is in focus at all points of the image is overlaid with an array of micro-lenses. With proper i11nmin~tioI~, such a system can generate light cones of ~ yhlg di~ gence and ~imn1~te 3D space.

~ ecau~e photographic lenses only have one ~; .inl~u~ point of focus, there lS iS only one plane in the photograph which is in exact focus; in front of and behind this plane the image is pro~,~~h"~,ly out of focus. This ef~ect can be l~,luce~ by i~ g ~he depth of field, but can only be coll~,te~ to a certain extent.
In g~n~l, a ~ ,f~ d eml~oflim~nt of the present invention will work with images g~ cl using an optical system having a large depth of fidd. For certain 20 imagec~ proper p1~c~ of the plane of focus and use of depth of field is ~leq~l~tç to attain perceived ~h&l~lles~ throughout the entire image. In other sitl~tion~ more a~lv~ced techniques are required to attain perceived exact focus for all points within an image. Mor1ifi~d ç~m~r~c and/or digital im~gin~ techniques may be used. For eY~mrle, some out of focus areas within an image may be focused using digital sOn~
2S ';,l~ g' filters.

Referring now to Figure 5, a block ~ r~m of a camera 60 used to make two ~im~n~jonal images of the type used in a prtfcl..,d embodiment is shown. Camera 60 inrllldes con~,.,.llional motorized optics 62 having an input lens 64 and an output lens 66. While lenses 64 and 66 have been depicted as convex lenses, those skilled in the art ~,vill 1m~l~t~ntl that lenses 64 and 66 may be of any desired collfi~lrati~ Ml~t~.. ;,. ~1 optics 62 focuses an image on image recorder 72. An image can also be f~cnQe~ onimage ,ecol.le~ 72 by varying the ~ t~nre b~ image ~COLd~1 72 and output lens 66 either in-lepf l,.i. .~11y~ or in combination with adju~ in mo~ ,d opffcs 62. Image 3S lccG-dc. 72 may be a charge coupled device'~CCD), photoml-1tir1ier tube (PMT), o1ol1io~e, avalanche photo~iQde~ photographic film, plates, or other light se~ili~_ materials. In addition, image l~corde~ 72 may be a combin~tiQn of any of the above light lccor~ling or co11~cting devices.

W O 96/41227 PCT~US96/10181 The focus of motorized optics 62 is controlled by controller 68, which is coupled to mr)t-)ri7l~A optics 62 via control line 70~ Controller 68 may be a microprocessor, micro-controller, or any other device which gen~l~t~s a digital or analog s signal that can be used to control the focus of motori~d optics 70.

If image recorder 72 is a digital device, then images c~tw~d by image o..lei 72 are stored in memory 74. If image recorder 72 is a pk~JIo~ ic or light3~,~ili~_ m~pri~l then memory 74 is not needed.

Memory 74 may be semiconductor memory, m~gn~tic m~n,.,l ~, optical memory, or any other type of memory used to store digital inforrn~tion Image leco.de.
72 is col pl,c~ to memory 74 via data line 76. Controller 68 may also control memory 74 and Image ~CCl~ 72 via control lines 78 and 80.
1S Through the op~tion of camera 60, a collage of sha~p areas may be formed to make an image which is sha~p at all points. For ~. . .pl~ a series of digital images ofthe same scene may be cal~tuled with Image l~,co.d~. 72, each foc~ at adi~ ....re~ That is, controller 68 causes m~to,;,~l optics 64 to cycle ll~o~L a range of focuses (e.g., from S meters to infinity), image recorder 72 c~l~tu~s images of a 20 scene taken at di~.e.l~ focuses, and memory 74 stores the ca~u~l images. The focus of moto.;~d optics 64 can be varied co~ l-Qùsly, or in steps, depending on col~-lition~ and the image ~ uil~,d.

And further ~leplon~lin~ on cl nllitioll~ and the image ~ uhed, one to many 25 hundreds of images may be c~lu.~d. For example, if the image is entirely of a distant hori7. m, only a far focus image would be required. Thelefo.~, the overall shutter speed may be very short.

Camera 60 may be a still camera or a video camera. Controller 68 can be 30 used to sequence motol;~d optics 64 through any range of focuses, as the desired range of focuses may change with the type of scene and li~htin~ colulition~. If camera 60 is used as a video ~rn~rs~ motc!ri7P~l optics 64 must be made to operate very quickly, as several frames (each inclu~ling several images taken at d;~ 1 focuses) per second must be ch~u.~d. To save time, controller 68 could be pro~,.anl.lled to cycle ...ol.J~; -~ optics~s 64 from the closest desired focus to the fiurthest desired focus to capture the images d to ~ e one frame, and then cycle motori_ed optics 64 from the rul~Le~
desired focus to ~e closest desired focus to capture the images ~ d to ~ . te the next fraîne. This process could then be ç~ested for all :iul~Se~lu~ll frames.

W O 96/412~7 PCTAUS96/10181 The same segTnlont of the scene in each of the digital images stored in c~nol~ 74 (say a 5XS pixel array) may be sampled for contrast (the highest co..~.,.c co..~,ol-Ac to the ~ e~l focus). Each 5xS high contrast se~ may then be S r~ into a single image which will be ~b.,l~ ;Qlly in focus over the entire scene.
This may be done with more adv~lced sonw~e algo~ s which will reco~i7~
I'c~ ol~ shapes" or objects to simplify the process and make it more rapid. The m~nirllation is most easily carried out in digital form (either ~om ~ligiti7~d analog nri~inQlc or from digital ~riginQl~) but may also be done in an analog forrnat (cut and lo paste).

P~eferrin~ now to Figure 1, a ~ref~lcd embodiment of the present invention is il~ trated Objects 15A-15C re~ the position of several objects in space as perceived by a viewer 20. Objècts 15A-15C are rli~tQncee 22A-22C, rc~ ly, away from viewer 20. Objects 15A-15C also reflect light cones 16A-16 tow~ls viewer 20. As ~ c~ ed above, the degree to which a light cone 16 is diverging when it reaches viewer 20 varies with the distance of an object lS from viewer 20. To ecle~t~, a 3D image of objec1s lSA-15C, an image 10 (which is l~lef~,.ably ~ClC~;~.'~ as sharp over its entire area) is placed in registered Qli~m.ont with an alTay 12 of micro-20 lenses 14. However, the ~,~f~..ed embodiment can also operate on an irnage 10 that is not sharp at each point.

Array 12 can be a s~bst~nti~lly flat two ~1im-on~ionQl array, or it can be an array having a desired degree of curvature or shape, which depends on the c~ ~ or 25 shape of image 10. The characteristics of each rnicro-lens 14 COll~O~ Q to each point or pixel on image 10 are chosen based on the focus ~ tQnre of the camera lens which made that point or pixel of the image sharp. The focal lengths of the micro-lenses 14 may be chosen so that light cones 18A-18C duplicate light cones 16A-16C (based on the ~l.e.,ted or known viewing ~ tQn~e from the micro-lenses, or based on a relative 30 scale or an ~bill~y scale to vary the perceived image). In this respect, viewer 20A will see the same 3D image seen by viewer 20.

Since image 10 can be viewed as a coherent 2D image when viewed by itself, the a~ ce of image 10 can be made to vary or QlternQte b~ n 2D and 3D.
35 If 2D viewing is desired, lenses l4 in array 12 can either be removed, or can be adjusted to be optically neutral. If 3D viewing is desired, lenses 14 in array 12 can be zdj- k d as ~es~ l above.

W O9~41227 PCT~US96tlO181 A similar procedure may be utilized to produce 3D motion pictures/video.
As is known to those skilled in the art, motion video is achieved by rapidly displaying images in seq~lenti~l fashion. Therefore, sequential images in focus over ~e entire image (or to the degree desired) must be created. To achieve this, a video camera which is made s to rapidly and cQIltinllQusly cycle ~twcen near and far focus is used. Each overall sharp image is produced by the techniques ~li.ccl-cced above (l-tili~ing depth of ffeld, knowledge ofthe scene, collage techniques, etc.). Further, intelligent sGn~.~e can be used in comhin~ti~n with still or video ~ "-c to upti~ , depth of ffdd, llWll~ of focus steps on a focus cycle, etc., based on ambient c~-n-lition.c, previously ;~ p.~cf~.c~es, 10 and/or the past (immeAi~tely prior or overall past history) a~ iale s~ttings-Additional sor~ w~ manipulation can be used to make sharp images over the entire scene or to the degree desired. For example, the pe~il)hcl ~ of a scene may be sele~,tivcly out of focus.

Although the overall field of view of the humari eye is large, the brain focuses on a centTal portion and the ~ hc,y is often subst~nti~lly out of focus. In the ideal case the image behind the micro-lens array is sharp over the entire scene so that as the viewer P~-..;r~s .liLr.,.~ t portions of the scene each will come into focus as the viewer focuses ~lol~e,ly. There are, however, situations in which alh~.~l.ess over the 20 entire image is not nPe~lefl such as in video sequences when the viewer only follows a particular field within a scene.

Once the desired video images are c~plul~d~ 3D display is achieved by placing the images behind an array 12 of variable focus lenses 14, as ~l;e-~iu~e~l above 25 with respect to Figure 1. In each frarne in the video sequence, for each point or pixel of the frame there is a co~c~onding focus setting for the lens 14 which is in l~gi~t~I with that pixel. As each frame is sequentially displayed each pixel varies its focus to the a~plol,l;ale pre~let~minçd setting for the pixel of that frarne.

Since each point or pixel has with it an associated lens or compound lens, the rays from each pixel can be controlled to reach the eye at a predctu~ fd angle cul~o--Air~g to the 3D depth desired for that pixel. There may be mllltirle lens designs which may suit the desired effect for any given situation.

Referring again to Figure 2, an important cone;~ler~tion in the operation of the present invention is the eye to pixel ~ t~nce Di~. l~ lens designs are ~ d for close screens such as goggles (see Figure 2(b)) than are l~uil~d for rnore distant screens (see Figure 2(c)). As is de~:ct ~ in Figure 2(b) (~ .1i,..,. and far focus), there are ~it-~tion~ where cnmhin~tion~ of elements (such as a po~iLive and a negative lens) can be moved relative to each other to create the desired optical effect. Thus, in one embo-lim~nt, mllltirle arrays could be moved relative to each other to create the proper light output. For a more complete des~ tion of the properties of combin~tion~ of optical 5 el~ n~ ;, see, e.g., Ray (cited above), pp. 43~9, which is also illc~olal~d herein by ~f~,.e.lce.

Consider the analogous behavior of a point of diffuse reflectinn and a point of focus from a lens; i:E both the point of focus and the point of refl~ctinn are at the 10 same rli~t~nr~ from the eye, the angle of the rays upon reaching the eye will be the same.
Rec~ e the pupil of the eye is relatively small, about Smm, only a small fraction of the ifru3elyreflectefl light cones are observed by the eye, and one does not need to ,le..t~" rays which are not observed by the eye.

The above described techniques may be used for display screens such as television, video, ~ideo callleras, computer displays, advertising displays such as counter top and wi~dow displays, billboards, clothes, interior deco.dti.,g, fashion ~ S7~ v~ s, eYt~riors, camouflage, joke items, ~ ~Y~ park rides, garnes, virtual reality, books, ~~aj~ 5, pO:~l ;al~S and other printed ~ t . ;~1, art, s~ 4~
20 lighting effects which cause light to become more intense or diffuse, as may be desired in photogr~rhic or home use applications, and any other applications where three flim~n~ion~l or variable optical effects are desired.

Co.~ L~ . displays are typically placed close to a user, and the user's eyes 2s are c.~ y set at a single ~lict~nce which puts strain on the eye mll~clf c To prevent e~ LI~ll and long-term deleterious effects, it is recommPn-led that one perioAi~ ~lly look at distant objects. By using the present invention, a lens array can be adjusted so that the viewer can focus near or far to view the display. Such variation in a~a.~,lt viewing fl;~ n~C (the display itself may be kept at the same fli~t~nce) may be m~ml~lly user 30 controlled, or may follow a predetermined algorithrn (such as slowly and i~ )libly cycling but moving through a range to prevent strain). Such ~lg~. ;Ll....~ rnay also be used for 1~ , pul~oses. The viewing di~t~n~e may be mo~ tpd to ~t..., ~ lly benefit certain muscle groups. The technique may be used for books a~s well as other close-field i~ .lsi~e work.

One ~ppli~tion of the still 3D images, according to the present invention, would be in the field of fine art and collectibles. Moreover, still images may be paired with fixed focal length lens 3~rays as well as variable focus arrays. Unique effects can be W O 96/41227 PCT~US96/10181 achieved by mod~ ting the focal length of the lenses in conjunction ~-vith a still image.
FccPntric art 8S well as eye-c~tf~l~ing displays or adverti~em~nt~ could be achieved by lm-llll~tin~ the focus of a still image. In particular, this technique can be used to guide the viewer's ~ttPntion to particular portions of an image by selectively mocllll~tir~ the 5 a~ l,l viewing area of interest and leaving the rest of the image static--or vice versa, or alter the focus of a region and its a~.~l size. For example, if the size of an object (in terms of its pe.ce.ll~ge of an observer's field of view) stays the sarne, and the obs~
eye s~ h~s from near focus to far focus, then the observer's sense of how large the object is wiU change (i.e., the observer will p~ce;ve the object as being bigger).
10 Simil~rly, if the size of an object (in terms of its pe~e.~lage of an obs~ 's field of view) stays the same, and the obs~ . v. ,'s eye switches from far focus to near focus, then the observer will p~ ,. the object as being smaller). This effect is further aided by inr.ln~1ing "lef~,le.lce" images -- images of objects of known size. The.efule, such a screen could selectively cause changes in appare.ll size, for ~ Jlc, to grab the lS obsel~e.'sattention.
.
W~ ~v~ld, or all enco.,.p~ views are &lv~ 40~bcc~e they e~ t~ t~ ~v~ non-relevant pcll~ ,.al; ~ ~ r~ ;on and im~gP5 There are two general tec~niques for giving the viewer an all encor.~p~ view of a scene. The first 20 is to use c ~ .ncly large and/or curved viewing screens most useful for group viewing (e.g. the Sony IMAX lh~ , or a plan~l~,;ulll). The second technique is the use of individual viewing goggles or glasses. In this technique relatively small screens are placed close to the eyes. An advantage to using the micro-lenses is that even at very close .l;~ r~s, it is difficult for the average person to dîscern rec.lu,~s of less than 100 2s microns -- so if the micro-lenses in the array are made small enough (but are large enough so that unwanted diffraction effects do not preclomin~te) the screen can remain virtually contin-lQus without pixel effects. Because the screens are smaU, reductionc in cost to achieve the wrap-around all encompassing views are achieved. Ad~lition~lly~ it is possible to use bl~ PnPd areas around the screen if the screen does not fill the entire 30 viewing angle so as to remove distractions. Alternatively, some applir~tion~ would al~ ageO~ y inCOI~JOldle PYter~ visual images. For example, a partially L~
display could overlap images from the environrnent with displayed images (this can bc used in other embo~iim~nt~ such as heads up displays). Such displays could have military as well as civilian use. In particular, information can be displayed to Op~,~alul;i of moving 3S vehicles. When ~sing goggles, such displays could be visible to one eye or both.

If a co-..p~ display were ge.lc.d~ed wi~in wrap-around goggles, the err~ screen size wo~d be m~x;i~ There is a trend lo~-~s il~C~Si~g~

, wo 96/41227 PC rlus96/10181 sizes for CGlll~ S as the total information/number of computer applications ~imlllt~neoucly running increases. ~ wrap-around goggle co~ ul~,. display would allow the user to use his entire field of vision as a desktop. This could be CQ~nbinPd with 3D
effects as well as the strain re~lucing features described above.

~ <ldition~lly, goggles may have one screen for each eye. Such ~oggles would require app.u~l;dte p~r~ r correction so that the two images coincide and are ,d as a single image by the viewer. An advantage of using two screens is that the irldividual screens may be placed very close to their ~ e~ eyes. The two images of 10 Lfr~ L p~rall~ x may be obL~ined from a variety of modified camera ~y~ ls (see Ray, Figure 65.10, Section 65.5 (cited above)). Alternatively, software algC)~ S may be used to gel)f..11~ second imàges from single views with altered ps~rs~ Two screen goggles may also be used without parallax corrected images--that is, with the same ~ c~ e displayed to both eyes. This would likely result in some loss of natural 3D
lS effect. However, many factors contribute to 3D effects, of which p~r~ c is only one.

Refernn~ again to Figure 1, the display 10 behind the lens array 12 may be analog or digital, and it may be printed, drawn, typed, etc. It may be a ~ otc,g.~ h or , in color or black and white, a positive or l~ ,e~t~d or offset by any 20 angle or ~r~.,~ly o. ;~ in its original fashion--it may emit or reflect light of many e-lL wavel~n~th~ visible or non-visible. It may be lithograph, sequential Ç;~
images and may be an XY plane in two or three dimensions. It may be a CRT, LCD, plasma display, ele.;l,ocl~ ic display, electrochemilumil~csce.lt display or other disylay~ well known in the ~t.

Lenses 14 in array 12 may vary in terms of:

Size; preferably ranging from I cm to 1 micron.

Shape; preferably circular, cylindrical, convex, concave, sph~
AAeph~rir~l, ellipsoid, rectilinç~r~ complex (e.g Fresnel), or any other optical configuration known in the art.

Constitution; the lenses may be primarily l~r.~clive, prim~rily 35 ~ , or a hybrid diffractive-refractive design, such as the design ~ closed in Missig et al., 1995, "Diff~active Optics Applied to Eyepiece Design," ~lied Optire 34(14!:2452-2461, which is illcol~olaled herein by lc~.ellce.

Number of lenses in the array; the arrays may range from 2x2 to a virtually llnlimited array, as the lens array 12 could be in the form of a very large sheet.

The number of lens elements used for each 'pixel'; as is known in the art, s c~...pol~ lenses may be useful for correcting optical aberrations and/or useful for di~,lll optical effects. For t;,~ le, sph~ori-~l or cl..unlalic ~berr~tinn~ may be cGll~,ct~d and zoom lens optics may be incol~,ol~Led into an alTay. Moreover, one could use a fixed focus array in front of a display and then a zoom array on top of the first array. Or in lirr~,l&ll applic~tion~ di~c~ll optical elem~nt designs could ~e hlco.~,u.~d 10 into the same array.

Color of the lenses; the lenses may be colored or colorless and may be lJ~ e.ll to a variety of visible and non-visible wave lengths. For exarnple, stacked arrays of red, green, and blue lenses may be used. Alternatively, colored display pixels 15 could be used with non-colored lenses Ccl~o~ilion of the lenses; as ~i~cllc~ed above, ~e lenses may be c~...pos~3 of a variety of m~t~ lc in a variety of states. The lenses may be liquid solllt;- n~, col1c:~, el~ ~t.~ " polyrners, solids, crysPIIine, su~ etc.
Lens co-.l~ ion, relaxation, and deforrnation; the lenses may be ~r~....~d by electrical and/or mechanical (e.g. pie7-~el~ctric) means. Der~ ion may be employed to control effective focal length and/or to vary other optical pl.,l,ellies ofthe lens or lens system (e.g. aberrations or ~lignm,qnt__~lig~m~nt may be 1~l~ ,l lenses 2S and/or ~ nmP!nt with the display) Finally, arrays may be combined or stacked to vary or increase ~lifr~.e.ll optical propc.lies. The arrays can be curved or flat.

30Many other various elements can be included in the preferred embo~ f .1~;. For eY~mrle filters may be used in the arrays, between the array and the display, and in front of the array. Such filters may be global, cu ~_..ng all or most pixels, or rnay be in .c;~ . with only one pixel or a select group of pixels. Of particular note are neutral density filters (e.g. an LCD array). Other filters include color filters, {~Mrli~nt 3S filters, pol~-i7~r.~ (circular and linear) and others know to those skilled in the art.

Further, the s~ es of the di~ l co~ )one.ll~ of the i~ tioll may be coated wi~ a variety of Ccs&~ ,s~ such as, antiglare co~tin~ (often multilayer). Other W O 96/41227 PCT~US96/10181 co~tin~!e provide scratch re~ nre or mechanical stability and protection from c"lvil~v~ nt~l factors.
.
Light b~fflin~ structures or m~t~ri~l~ may be used to prevent unwanted S s~ay light or reflçction~ For eY~mr le, it may be desirable to isoiate each pixel optically from neighboring pixels. In one embodiment, SAMs may be used to form micro light'' baffles. For eY~mrle, micro-lenses which occupy hydrophilic regions may be circ~ ribed by hydrophobic regions whose snrf~e~ are sel~livcly occupied by light a~;,vLb~ m~tPri~ ;v~,ly, micro-m~ inf~d light baffle shuclu.~,s may be uset.

The components of the invention may advantageously have varying optical ~lv~ ies. For some applications Sllhst~nti~llyL~ tcvll~on~ and support m~teri~l~ would be used--e.g. for use in a heads up display. In other cases, ll~lvl~d s... r;1~s may be des;rable--e.g. as a backing to m~Yim~lly utilize reflected light ls and also for the use of ~ lu~,d optical elernent~. Other m~t~ri~l~ include sç-~ yA~ ~.ll mirrors/beam splitters, optical gr~qtin~s~ Fresnel lenses, and other m~tf~ri~l~ known to those skilled in the art.

Sh-ltt~ s andtor a~e.lu,~s may be placed in various loc~ ns thc system 20 and may be global or specific (as the filters above). Shutters may be usefi~l, for example, if a film based çin-~m~tic video scene were used as the display. A~e.~ules could be used to vary light illl~ iLy and depth of field.

The overall systems may vary in size b~ a few microns and llu~lL~
2s of meters or more. The system may be curved or flat. It may be a kit. It may be a p. . ~ f .ll in~t~ tion or it may be portable. Screens may fold or roll for easyol~lion. The screens may have covers for protection and may be i..~,.,.t~ d intocomrlçY units (e.g. a laptop computer). The system may be used in ~im~ tnr~ and virtual reality systems. The system can be used as a range finder by coll~,laling ~ c 30 focus on the array with a plane of focus in the envhu~ c.ll. The system may be used for ~lva~ced ~ntofoc~ systems. For example, the system could be used to rapidly findoptimal focus since the micro-lens can focus much faster than a large l.~ l camera lens and tihen ~e lens can be set to the accurate focus. The system can be used for direction~l viewing of a display--for example by using long efrc~ focal lengths. The 3s sy~ llS may also be disposable.

An hl~ol~ll consideration in the present invention is the type and direction of li~htin~ The lis~htin~ may be from the front (reflected) or from the rear (backlit) and/or from a variety of int~rme~ te angles. There may be one light source or mlllti. '- light sources. In some cases both reflected and luminous b~ hting areto more ~rcnrAtP,Iy represent a scene. For example, when indoors looking out a window, one rnay perceive strong b~lighting through the window and reflectP~d softer S light with directi- n~l shadows within the room. Combining bae~light, reflected light and the i~ y/neutral density filtPring will give a more realistic image. Direction~
reflected light may be r~,.;u~ed on a single pixel or specific area or may be global (as with bn~l~lightin,E). The light may be filtered, po}arized, coherent or non-cohe..,~l. For . le, the color t ~ of slmli~ht varies through the day. A sllnli~ht co.l~,t~d o source light could then be filtered to .~lese.lt the reddish tones of a sunset image etc.
The light may be placed in a variety of positions (as with the filters above) and may be from a variety of known light sources to one skilled in the art in~ in~ in~n-lescPnt h~logPn, flu0l-,3ca~, mercury lamps, strobes, lasers, natural snnlight luminP.scin~
m~t~ le, phosphorescing materials, chemilumin~scent mz-t~rislle, ele~ u~ minPscent etc. Another embodiment is that of l............ ;.. ~c;.,g lenses. Liquid lenses or lenses which may be suitably doped with ll~minescPnt m~t~ri~1e may be useful, ~lRci3l1y in ~ ~s~hle systems. For c~ ,le, conei(lPr a liquid ph~e lens res~ng on an de~,~de. Such a lens (if it col.~ d an ECL tag) could be caused to l-~..;n~sc~

The present invention has been described in terms of a preferred embodiment. The invention, however, is not limited to the embodiment depicted and described. Rather, the scope of the invention is defined by the appended claims.

Claims (68)

What is claimed is:
1. A three dimensional imaging system comprising:
a two dimensional array of micro-lenses, at least some of the lenses having a variable focal length; and a two dimensional image having a plurality of image points or pixels;
at least one micro-lens in the array in registered alignment with one or more of the image points or pixels.
2. A three dimensional imaging system comprising:
a two dimensional array of variable focal length micro-lenses; and a two dimensional image having a plurality of image points or pixels;
at least one micro-lens in the array in registered alignment with one or more of the image points or pixels.
3. A three dimensional imaging system comprising:
an array of micro-lenses, at least some of the lenses having a variable focal length; and an image having a plurality of image points or pixels;
each image point or pixel in registered alignment with one or more micro-lenses in the array.
4. The invention of claim 3 wherein the imaging system is sold as a kit.
5. The invention of claim 3 wherein the imaging system is incorporated in a pair of goggles.
6. The invention of claim 3 wherein the imaging system is incorporated in a transparent heads-up display.
7. The invention of claim 3 wherein the image has a depth of field greater than that which could be taken at any one focus distance.
8. The invention of claim 3 wherein the imaging system can alternate between 3D
and 2D images.
9. The invention of claim 8 wherein the micro-lenses can be made optically neutral.
10. The invention of claim 8 wherein the micro-lenses can be removed.
11. The invention of claim 3 wherein the imaging system is incorporated in art work.
12. The invention of claim 3 wherein the imaging system is incorporated in advertisements.
13. The invention of claim 3 wherein the imaging system is incorporated in a virtual reality device.
14. A three dimensional imaging system comprising:
a first array of micro-lenses, at least some of the lenses in the first array having a variable focal length;
a second array of micro-lenses, at least some of the lenses in the second array having a variable focal length; and an image having a plurality of image points or pixels;
each image point or pixel in registered alignment with one or more micro-lenses in the first array;
at least one micro-lens in the second array in registered alignment with one or more micro-lenses in the first array.
15. A three dimensional imaging system comprising:
an array of variable focal length micro-lenses; and an image having a plurality of image points or pixels;
each image point or pixel in registered alignment with one or more micro-lenses in the array.
16. A three dimensional optical system comprising:
an array of micro-lenses, each micro-lens having a fixed but individually predetermined focal length, the micro-lenses in the array having a plurality of focal lengths; and an image having a plurality of image points or pixels;
each image point or pixel in registered alignment with one or more micro-lenses in the array.
17. A three dimensional optical system comprising:
an array of micro-lenses, each micro-lens having a fixed focal length; and an image having a plurality of image points or pixels;
each image point or pixel in registered alignment with one or more micro-lenses in the array.
18. A three dimensional optical system comprising:
an array of micro-lenses, each micro-lens having a fixed but individually predetermined focal length; and an image having a plurality of image points or pixels;
each image point or pixel in registered alignment with one or more micro-lenses in the array.
19. A three dimensional optical system comprising:
an array of micro-lenses, each micro-lens having a fixed but individually predetermined focal length; and an image having a number of image points or pixels;
at least one micro-lens in registered alignment with one or more image points or pixels.
20. An optical system for varying the apparent distance of a computer screen, comprising:
an array of variable focal length micro-lenses; and a computer screen having a plurality of pixels;
each pixel in registered alignment with one or more micro-lenses in the array.
21. In an optical system comprising an array of variable focal length micro-lenses and a computer screen having a plurality of pixels in registered alignment with one or more micro-lenses in the array, a method for reducing eyestrain comprising the step of:
periodically varying the focal length of all of the micro-lenses in the array, whereby the computer screen appears to become closer or further away.
22. In an optical system comprising an array of variable focal length micro-lenses and a computer screen having a plurality of pixels in registered alignment with one or more micro-lenses in the array, a method for reducing eyestrain comprising the step of:
periodically varying the focal length of a subset of the micro-lenses in the array, whereby a portion of the computer screen appears to become closer or further away.
23. An optical system for varying the apparent distance of a computer screen, comprising:
an array of variable focal length micro-lenses; and a computer screen having a plurality of pixels;
at least one micro-lens in registered alignment with one or more pixels.
24. An optical system for varying the apparent distance of a two dimensional object, comprising:
an array of variable focal length micro-lenses; and a two dimensional object having a plurality of points or pixels;
each point or pixel in registered alignment with one or more micro-lenses in the array.
25. An optical system for varying the apparent distance of a two dimensional object, comprising:
an array of variable focal length micro-lenses; and a two dimensional object having a plurality of points or pixels;
at least one micro-lens in registered alignment with one or more points or pixels.
26. An optical system for varying the apparent distance of a two dimensional object, comprising:
an array of fixed focal length micro-lenses; and a two dimensional object having a plurality of points or pixels;
each point or pixel in registered alignment with one or more micro-lenses in the array.
27. An optical system for varying the apparent distance of a two dimensional object, comprising:
an array of fixed focal length micro-lenses; and a two dimensional object having a plurality of points or pixels;
at least one micro-lens in registered alignment with one or more points or pixels.
28. A method for generating a three dimensional image, comprising the steps of:
generating a two dimensional image having high depth of field and having a number of image points or pixels; and projecting light reflected from or emitted by each of the image points or pixels so as to generate a cone of light having a predetermined solid angle, the solid angle varying with the perceived distance of the image point or pixel from a viewer.
29. A method for generating a three dimensional image, comprising the steps of:

generating a two dimensional image having high depth of field and having a number of image points or pixels; and reflecting, transmitting, or emitting light from each of the image points or pixels so as to generate a cone of light having a predetermined solid angle, the solid angle varying with the perceived distance of the image point or pixel from a viewer.
30. A method for generating a three dimensional image, comprising the steps of:
generating a two dimensional image having a number of image points or pixels, the image being substantially in focus over a predetermined area; and projecting light reflected from or emitted by each of the image points or pixels so as to generate a cone of light having a predetermined solid angle, the solid angle varying with the perceived distance of the image point or pixel from a viewer.
31. A method for generating a three dimensional image, comprising the steps of:
generating a two dimensional image having a number of image points or pixels, the image being substantially in focus over a predetermined area; and reflecting, transmitting, or emitting light from each of the image points or pixels so as to generate a cone of light having a predetermined solid angle, the solid angle varying with the perceived distance of the image point or pixel from a viewer.
32. A method for generating optical effects, comprising the steps of:
generating a two dimensional image having a number of image points or pixels; and projecting light reflected from or emitted by each of the image points or pixels so as to generate a cone of light having a variable predetermined solid angle.
33. A method for generating optical effects, comprising the steps of:
generating a two dimensional image having a number of image points or pixels; and reflecting, transmitting, or emitting light from each of the image points or pixels so as to generate a cone of light having a variable predetermined solid angle.
34. A method for generating a three dimensional image, comprising the steps of:
generating a two dimensional image having high depth of field and having a number of image points or pixels; and projecting light reflected from or emitted by each of the image points or pixels so as to generate a cone of light having a variable predetermined solid angle.
35. A method for generating a three dimensional image, comprising the steps of:
generating a two dimensional image having high depth of field and having a number of image points or pixels; and reflecting, transmitting, or emitting light from each of the image points or pixels so as to generate a cone of light having a variable predetermined solid angle.
36. A method for generating a three dimensional image, comprising the steps of:
generating a two dimensional image having a number of image points or pixels, the image being substantially in focus over a predetermined area; and projecting light reflected from or emitted by each of the image points or pixels so as to generate a cone of light having a variable predetermined solid angle.
37. A method for generating a three dimensional image, comprising the steps of:
generating a two dimensional image having a number of image points or pixels, the image being substantially in focus over a predetermined area; and reflecting, transmitting, or emitting light from each of the image points or pixels so as to generate a cone of light having a variable predetermined solid angle.
38. A method for generating a three dimensional image, comprising the steps of:
generating a two dimensional image using an optical system having a large depth of field, the image having a number of image points or pixels; and projecting light reflected from or emitted by each of the image points or pixels so as to generate a cone of light having a variable predetermined solid angle.
39. A method for generating a changing three dimensional image using a variable focus micro-lens array, comprising the steps of:
generating a sequential series of two dimensional images that are substantially in focus over a predetermined area, the images having a number of image points or pixels;
projecting light reflected from or emitted by each of the image points or pixels through the micro-lenses in the array so as to generate a cone of light having a predetermined solid angle, the solid angle varying with the perceived distance of the image point or pixel from a viewer; and varying the focal length of each micro-lens in the array as appropriate with each sequential image.
40. A three dimensional imaging system comprising:

an array of variable focal length liquid micro-lenses formed on a SAM;
and an image having a plurality of image points or pixels;
each image point or pixel in registered alignment with one or more micro-lenses in the array.
41. A three dimensional imaging system comprising:
an array of variable focal length liquid micro-lenses formed on a SAM;
and an image having a plurality of image points or pixels;
at least one micro-lens in the array in registered alignment with one or more image points or pixels.
42. The imaging system of claim 41 wherein the micro-lenses are liquid lenses adherent to one or more SAMs.
43. The imaging system of claim 42 wherein the focal lengths of the liquid lenses are adjusted by the application of an electric field.
44. The imaging system of claim 41 wherein the micro-lenses are flexible lenses.
45. The imaging system of claim 44 wherein the focal lengths of the flexible lenses are adjusted by elastic deformation.
46. The imaging system of claim 45 wherein the elastic deformation is caused by pressure exerted by a piezoelectric element.
47. A three dimensional imaging system comprising:
a first array of micro-lenses, at least some of the lenses in the first array having a variable focal length and being colored;
a second array of micro-lenses, at least some of the lenses in the second array having a variable focal length and being colored;
a third array of micro-lenses, at least some of the lenses in the third array having a variable focal length and being colored; and an image having a plurality of image points or pixels;
each image point or pixel in registered alignment with one or more micro-lenses in the first array;

at least one micro-lens in the second array in registered alignment with one or more micro-lenses in the first array;
at least one micro-lens in the third array in registered alignment with one or more micro-lenses in the second array.
48. The three dimensional imaging system of claim 47 wherein the lenses in the first array are colored red, the lenses in the second array are colored green, and the lenses in the third array are colored blue.
49. A method for generating an image of a scene having a plurality of objects, comprising steps of:
a) focusing on one or more of the objects;
b) capturing an image of the scene so focused;
c) focusing on one or more different objects;
d) capturing a separate image of the scene so focused;
e) repeating steps (c) and (d) until images focusing on the desired number of objects are captured; and f) combining the captured images to generate a single image having a large depth of field.
50. The method of claim 49 wherein the captured images are digital images.
51. The method of claim 49 wherein the combining steps are performed digitally.
52. A method for generating an image of a scene, comprising the steps of:
a) focusing at a particular distance;
b) capturing an image of the scene so focused;
c) focusing at a different distance;
d) capturing a separate image of the scene so focused;
e) repeating steps (c) and (d) until images focused at the desired number of distances are captured; and f) combining the captured images to generate a single image.
53. The method of claim 52 wherein the captured images are digital images.
54. The method of claim 52 wherein the combining steps are performed digitally.
55. A method for generating a digital image using a camera having a variable focus, comprising the steps of:
a) sequentially capturing a series of digital images while the focus of the camera is varied between near and far focus; and b) digitally combining the captured images to generate a single image having a large depth of field.
56. The method of claim 55 wherein the capturing step is performed with a CCD.
57. A method for generating a digital moving image using a video camera having a variable focus, comprising the steps of:
a) sequentially capturing a first series of digital images while the focus of the camera varies from near focus to far focus;
b) digitally combining the first series of captured images to generate a first video frame;
c) sequentially capturing a second series of digital images while the focus of the camera varies from far focus to near focus;
d) digitally combining the second series of captured images to generate a second video frame;
e) repeating steps (a) through (d) until the desired number of video frames is generated.
58. The method of claim 57 wherein the capturing step is performed with a CCD.
59. The method of claim 57 wherein the focus of the camera varies in a stepwise fashion.
60. The method of claim 57 wherein the focus of the camera varies in a stepwise fashion.
61. A method for generating a digital moving image using a video camera having a variable focus, comprising the steps of:
a) sequentially capturing a first series of digital images while the focus of the camera varies from near focus to far focus;
b) digitally combining the first series of captured images to generate a first video frame that is substantially in focus over a predetermined area;
c) sequentially capturing a second series of digital images while the focus of the camera varies from far focus to near focus;

d) digitally combining the second series of captured images to generate a second video frame that is substantially in focus over a predetermined area;
e) repeating steps (a) through (d) until the desired number of video frames is generated.
62. A camera for generating images which are substantially focused over a predetermined area, comprising:
motorized optics for generating a series of images;
the motorized optics having a focus variable from a near focus to a far focus;
the focus of the motorized optics responsive to a variable focus signal;
a controller, coupled to the motorized optics, for generating the variable focus signal;
an image recorder for receiving and capturing the series of images, or a subset of the series of images, generated by the motorized optics; and a memory, coupled to the image recorder, for storing the images captured by the image recorder.
63. The camera of claim 62 wherein the image recorder is a CCD.
64. A camera for generating images which are substantially focused over a predetermined area, comprising:
motorized optics for generating a series of images;
the motorized optics having a focus variable from a near focus to a far focus;
the focus of the motorized optics responsive to a variable focus signal;
a controller, coupled to the motorized optics, for generating the variable focus signal; and an image recorder for receiving and capturing the series of images, or a subset of the series of images, generated by the motorized optics.
65. A camera for generating images which are substantially focused over a predetermined area, comprising:
motorized optical means, having a focus variable from a near focus to a far focus, for generating a series of images;
the focus of the motorized optical means responsive to a variable focus signal;

controller means, coupled to the motorized optical means, for generating the variable focus signal and controlling the focus of the motorized optical means;
means for receiving and capturing the series of images, or a subset of the series of images, generated by the motorized optical means; and memory means for storing the images captured by the means for receiving and capturing.
66. A camera for generating images which are substantially focused over a predetermined area, comprising:
motorized optics which automatically scan from a near focus to a far focus;
a CCD;
the motorized optics generating a changing image which is projected on the CCD;
the CCD operative to capture the changing image at several sequential points in time; and a memory, coupled to the CCD, for storing the images captured by the CCD.
67. A video camera for generating images which are substantially focused over a predetermined area, comprising:
motorized optics which automatically scan from a near focus to a far focus and from a far focus to a near focus;
a CCD;
the motorized optics generating a changing image which is projected on the CCD;
the CCD operative to capture the changing image at several sequential points in time; and a memory, coupled to the CCD, for storing the images captured by the CCD.
68. A video camera for generating images which are substantially focused over a predetermined area, comprising:
motorized optical means for automatically scanning from a near focus to a far focus and from a far focus to a near focus;
means for capturing an image;
the motorized optical means generating a changing image which is projected on means for capturing an image;

the means for capturing an image operative to capture the changing image at several sequential points in time; and a memory, coupled to the means for capturing an image, for storing the captured images.
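For context on the depth cue behind the micro-lens claims above (claims 1 through 39): a pixel sitting closer to its registered micro-lens than that lens's focal length is seen as a virtual image behind the lens, and changing the focal length moves that virtual image, changing both the apparent depth of the pixel and the divergence (solid angle) of the light cone reaching the viewer. The worked numbers below use the standard thin-lens relation with assumed dimensions; they are illustrative only and are not taken from the patent.

    1/f = 1/d_o + 1/d_i   so   d_i = (1/f - 1/d_o)^(-1)

Assuming a pixel-to-lens spacing of d_o = 0.9 mm:

    f = 1.00 mm:  d_i = (1/1.00 - 1/0.9)^(-1) = -9.0 mm
    f = 0.95 mm:  d_i = (1/0.95 - 1/0.9)^(-1) ≈ -17.1 mm

A negative d_i indicates a virtual image on the pixel's side of the lens, so a roughly five percent change in focal length moves the pixel's apparent depth from about 9 mm to about 17 mm behind the array.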
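Claims 49 through 61 recite capturing a series of images at different focus settings and digitally combining them into a single image with a large depth of field. One common way such a combination can be performed, per-pixel selection of the sharpest source frame using a local gradient measure, is sketched below as an assumption for illustration; it is not asserted to be the combining method contemplated by the patent, and the function names are hypothetical.

import numpy as np

def sharpness_map(gray, window=7):
    # Local sharpness score: squared gradient magnitude averaged over a
    # small window (box filter built from shifted sums, no extra dependencies).
    gy, gx = np.gradient(gray.astype(np.float64))
    energy = gx * gx + gy * gy
    pad = window // 2
    padded = np.pad(energy, pad, mode="edge")
    out = np.zeros_like(energy)
    for dy in range(window):
        for dx in range(window):
            out += padded[dy:dy + energy.shape[0], dx:dx + energy.shape[1]]
    return out / (window * window)

def focus_stack(frames):
    # Combine focus-bracketed grayscale frames into one image that is sharp
    # everywhere by choosing, per pixel, the frame with the highest local
    # sharpness score.
    stack = np.stack(frames)                         # shape (N, H, W)
    scores = np.stack([sharpness_map(f) for f in stack])
    best = np.argmax(scores, axis=0)                 # (H, W) sharpest-frame index
    return np.take_along_axis(stack, best[None], axis=0)[0]

# Hypothetical usage with synthetic frames; real input would be the
# focus-bracketed captures described in claims 49, 52, and 55.
frames = [np.random.rand(240, 320) for _ in range(5)]
combined = focus_stack(frames)

For color frames, the sharpness would typically be computed on a luminance channel and the same per-pixel choice applied to each color channel.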
CA002223126A 1995-06-07 1996-06-06 Three-dimensional imaging system Abandoned CA2223126A1 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US08/476,852 US6014259A (en) 1995-06-07 1995-06-07 Three dimensional imaging system
US08/476,854 1995-06-07
US08/476,854 US5986811A (en) 1995-06-07 1995-06-07 Method of and apparatus for generating a 3-D image from a 2-D image having a changeable focusing micro-lens array
US08/476,853 US5717453A (en) 1995-06-07 1995-06-07 Three dimensional imaging system
US08/476,853 1995-06-07
US08/476,852 1995-06-07

Publications (1)

Publication Number Publication Date
CA2223126A1 true CA2223126A1 (en) 1996-12-19

Family

ID=27413393

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002223126A Abandoned CA2223126A1 (en) 1995-06-07 1996-06-06 Three-dimensional imaging system

Country Status (8)

Country Link
EP (1) EP0871917A4 (en)
JP (1) JPH11513129A (en)
KR (2) KR100417567B1 (en)
CN (2) CN1188727C (en)
AU (1) AU6276496A (en)
CA (1) CA2223126A1 (en)
TW (1) TW355756B (en)
WO (1) WO1996041227A1 (en)

Families Citing this family (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3255087B2 (en) * 1997-06-23 2002-02-12 株式会社エム・アール・システム研究所 3D image display device
US20050002113A1 (en) 1997-10-08 2005-01-06 Varioptic Drop centering device
FR2791439B1 (en) 1999-03-26 2002-01-25 Univ Joseph Fourier DEVICE FOR CENTERING A DROP
DE19949011C2 (en) * 1999-10-11 2001-10-25 Werner Breit Passage of light waves
WO2001044858A2 (en) * 1999-12-16 2001-06-21 Reveo, Inc. Three-dimensional volumetric display
KR100433277B1 (en) * 2001-07-30 2004-05-31 대한민국 Three-dimensional display
CN100430777C (en) * 2002-02-20 2008-11-05 皇家飞利浦电子股份有限公司 Display apparatus
US7428001B2 (en) 2002-03-15 2008-09-23 University Of Washington Materials and methods for simulating focal shifts in viewers using large depth of focus displays
AU2003267797A1 (en) * 2002-10-25 2004-05-13 Koninklijke Philips Electronics N.V. Zoom lens
DE60308161T2 (en) * 2003-06-27 2007-08-09 Asml Netherlands B.V. Lithographic apparatus and method for making an article
EP1665815A2 (en) * 2003-09-15 2006-06-07 Armin Grasnick Method for creating a stereoscopic image master for imaging methods with three-dimensional depth rendition and device for displaying a stereoscopic image master
KR101057769B1 (en) 2003-10-20 2011-08-19 엘지디스플레이 주식회사 Lens array for image conversion and image display device and method using same
US7077523B2 (en) * 2004-02-13 2006-07-18 Angstorm Inc. Three-dimensional display using variable focusing lens
WO2006017771A1 (en) * 2004-08-06 2006-02-16 University Of Washington Variable fixation viewing distance scanned light displays
JP4670299B2 (en) 2004-09-30 2011-04-13 カシオ計算機株式会社 Lens unit, camera, optical equipment, and program
JP5119567B2 (en) 2004-09-30 2013-01-16 カシオ計算機株式会社 camera
JP4864326B2 (en) * 2005-01-21 2012-02-01 Hoya株式会社 Solid-state image sensor
DE102006010971A1 (en) * 2005-03-09 2006-09-21 Newsight Gmbh Autostereoscopic viewing method e.g. for images, involves having arrays providing defined propagation directions for light which emerge from one of arrays through one array from light source and oriented to array of transparent elements
JP4578294B2 (en) 2005-03-18 2010-11-10 株式会社エヌ・ティ・ティ・データ三洋システム Stereoscopic image display device, stereoscopic image display method, and computer program
JP4334495B2 (en) * 2005-03-29 2009-09-30 株式会社東芝 Stereoscopic image display device
FR2887638B1 (en) 2005-06-23 2007-08-31 Varioptic Sa VARIABLE FOCAL LENS WITH REDUCED INTERNAL PRESSURE VARIATION
CN101278557A (en) 2005-10-04 2008-10-01 皇家飞利浦电子股份有限公司 A 3D display with an improved pixel structure (pixelsplitting)
JP2009515213A (en) * 2005-11-02 2009-04-09 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Optical system for 3D display
KR100813492B1 (en) * 2006-11-22 2008-03-13 엘지전자 주식회사 Head up display system for vehicle
WO2008076399A2 (en) * 2006-12-15 2008-06-26 Hand Held Products, Inc. Apparatus and method comprising deformable lens element
KR20120087647A (en) * 2011-01-28 2012-08-07 삼성전자주식회사 Displaying device
KR101302415B1 (en) 2012-06-27 2013-09-06 주식회사 나무가 Signal process system for automatically adjusting the focus of the 3 dimensional depth lens
KR101984701B1 (en) 2012-11-13 2019-05-31 삼성전자주식회사 3D image dispaly apparatus including electrowetting lens array and 3D image pickup apparatus including electrowetting lens array
TWI556037B (en) * 2014-12-17 2016-11-01 宇勤科技(深圳)有限公司 Lcd and electrically-controlled 3d grating structure thereof
US9857594B2 (en) 2015-01-29 2018-01-02 Kabushiki Kaisha Toshiba Optical device and head-mounted display device and imaging device equipped with the same
US9686458B2 (en) * 2015-04-16 2017-06-20 Sony Corporation Camera LED flash with variable lens gain
CN107405049B (en) * 2015-05-12 2020-05-26 奥林巴斯株式会社 Stereoscopic endoscope device
CN106303315B (en) 2015-05-30 2019-08-16 北京智谷睿拓技术服务有限公司 Video display control method and device, display equipment
CN106303498B (en) 2015-05-30 2018-10-16 北京智谷睿拓技术服务有限公司 Video display control method and device, display equipment
CN106303499B (en) 2015-05-30 2018-10-16 北京智谷睿拓技术服务有限公司 Video display control method and device, display equipment
CN106254857B (en) * 2015-12-31 2018-05-04 北京智谷睿拓技术服务有限公司 Light field display control method and device, light field display device
WO2017160484A1 (en) * 2016-03-15 2017-09-21 Deepsee Inc. 3d display apparatus, method, and applications
GB2550885A (en) * 2016-05-26 2017-12-06 Euro Electronics (Uk) Ltd Method and apparatus for an enhanced-resolution light field display
KR20190119093A (en) * 2017-03-03 2019-10-21 오스텐도 테크놀로지스 인코포레이티드 Split injection pupil head up display system and method
GB2564850A (en) * 2017-07-18 2019-01-30 Euro Electronics Uk Ltd Apparatus and method of light field display
CN109307935B (en) * 2018-11-13 2023-12-01 深圳创维新世界科技有限公司 Space projection display device
CN110879478B (en) * 2019-11-28 2022-02-01 四川大学 Integrated imaging 3D display device based on compound lens array
CN113677981A (en) * 2021-07-06 2021-11-19 香港应用科技研究院有限公司 Flexible display inspection system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3006244C2 (en) * 1979-02-20 1984-08-30 Ricoh Co., Ltd., Tokio/Tokyo Device for determining the focus of a lens on an object
JPS57210326A (en) * 1981-06-22 1982-12-23 Nippon Kogaku Kk <Nikon> Automatic focus detecting device for camera
GB9102903D0 (en) * 1991-02-12 1991-03-27 Oxford Sensor Tech An optical sensor
US5291334A (en) * 1992-01-30 1994-03-01 United Technologies Corporation Micro-lens panoramic imager
US5439621A (en) * 1993-04-12 1995-08-08 Minnesota Mining And Manufacturing Company Method of making an array of variable focal length microlenses
US5398125A (en) * 1993-11-10 1995-03-14 Minnesota Mining And Manufacturing Company Liquid crystal projection panel having microlens arrays, on each side of the liquid crystal, with a focus beyond the liquid crystal

Also Published As

Publication number Publication date
EP0871917A1 (en) 1998-10-21
TW355756B (en) 1999-04-11
CN1188727C (en) 2005-02-09
CN1193389A (en) 1998-09-16
AU6276496A (en) 1996-12-30
KR100436538B1 (en) 2004-09-16
JPH11513129A (en) 1999-11-09
WO1996041227A1 (en) 1996-12-19
KR19990022726A (en) 1999-03-25
KR100417567B1 (en) 2004-02-05
CN1645187A (en) 2005-07-27
EP0871917A4 (en) 1999-11-24

Similar Documents

Publication Publication Date Title
CA2223126A1 (en) Three-dimensional imaging system
US6437920B1 (en) Three Dimensional imaging system
US5986811A (en) Method of and apparatus for generating a 3-D image from a 2-D image having a changeable focusing micro-lens array
US5717453A (en) Three dimensional imaging system
US11726325B2 (en) Near-eye optical imaging system, near-eye display device and head-mounted display device
US7002749B2 (en) Modular integral magnifier
JP4470355B2 (en) Display system without glasses
CN1910937A (en) Volumetric display
JPH11285030A (en) Stereoscopic image display method and stereoscopic image display device
KR20110107815A (en) Spatial image display device
WO2005079244A2 (en) Three-dimensional display using variable focusing lens
WO1999008150A1 (en) Apparatus and method for creating and displaying planar virtual images
KR20010093245A (en) Three-dimensional image sensing device and method, three-dimensional image displaying device and method, and three-dimensional image position changing device and method
KR20210127744A (en) High resolution 3D display
KR100786860B1 (en) Autostereoscopic display appratus having varifocal lens
US10728534B2 (en) Volumetric display system and method of displaying three-dimensional image
AU746605B2 (en) Three-dimensional imaging system
Surman et al. Latest developments in a multi-user 3D display
JPH07140419A (en) Stereoscopic method, stereoscopic spectacles used for the same, and its manufacture
JPH11113029A (en) Stereoscopic image display device and stereoscopic photographing and display device
JP2002107665A (en) Stereoscopic viewing device
JPH10221645A (en) Stereoscopic picture display device
WO2022074409A1 (en) Method and device for displaying a 3d image
KR20130124744A (en) Device for displaying three dimensional image, mirror therefor and method for manufacturing mirror
Sayinta 3d display system using scanning LED array modules

Legal Events

Date Code Title Description
EEER Examination request
FZDE Discontinued