US20190139290A9 - Integrated 3D-2D visual effects display - Google Patents

Integrated 3D-2D visual effects display

Info

Publication number
US20190139290A9
Authority
US
United States
Prior art keywords: display system, visual display, accordance, display, presents
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US14/189,232
Other versions
US10593092B2 (en)
US20150243068A1 (en)
Inventor
Dennis J. Solomon
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from FR9015462A (external priority; FR2670301B1)
Priority claimed from US12/456,401 (external priority; US8730129B2)
Priority claimed from US13/294,011 (external priority; US20120056799A1)
Priority to US14/189,232 (granted as US10593092B2)
Application filed by Individual
Publication of US20150243068A1
Priority to US16/190,044 (US20190258061A1)
Publication of US20190139290A9
Priority to US16/819,091 (US11199714B2)
Publication of US10593092B2
Application granted
Legal status: Expired - Fee Related
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G02B27/017 - Head mounted
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G02B27/017 - Head mounted
    • G02B27/0172 - Head mounted characterised by optical features
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G02B27/017 - Head mounted
    • G02B27/0176 - Head mounted characterised by mechanical features
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 - Eye tracking input arrangements
    • H - ELECTRICITY
    • H01 - ELECTRIC ELEMENTS
    • H01L - SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 - Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/15 - Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components with at least one potential-jump barrier or surface barrier specially adapted for light emission
    • H01L27/153 - Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components with at least one potential-jump barrier or surface barrier specially adapted for light emission in a repetitive configuration, e.g. LED bars
    • H01L27/156 - Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components with at least one potential-jump barrier or surface barrier specially adapted for light emission in a repetitive configuration, e.g. LED bars, two-dimensional arrays
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G02B27/0101 - Head-up displays characterised by optical features
    • G02B2027/013 - Head-up displays characterised by optical features comprising a combiner of particular shape, e.g. curvature
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G02B27/0101 - Head-up displays characterised by optical features
    • G02B2027/0132 - Head-up displays characterised by optical features comprising binocular systems
    • G02B2027/0134 - Head-up displays characterised by optical features comprising binocular systems of stereoscopic type
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G02B27/0101 - Head-up displays characterised by optical features
    • G02B2027/014 - Head-up displays characterised by optical features comprising information/image processing systems
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G02B27/017 - Head mounted
    • G02B2027/0178 - Eyeglass type
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G02B2027/0192 - Supplementary details
    • G02B2027/0194 - Supplementary details with combiner of laminated type, for optical or mechanical aspects
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00 - Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B26/10 - Scanning systems
    • G02B26/101 - Scanning systems with both horizontal and vertical deflecting means, e.g. raster or XY scanners
    • G - PHYSICS
    • G02 - OPTICS
    • G02C - SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C5/00 - Constructions of non-optical parts
    • G02C5/006 - Collapsible frames
    • G - PHYSICS
    • G02 - OPTICS
    • G02C - SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C5/00 - Constructions of non-optical parts
    • G02C5/02 - Bridges; Browbars; Intermediate bars
    • G02C5/08 - Bridges; Browbars; Intermediate bars, foldable
    • G - PHYSICS
    • G02 - OPTICS
    • G02C - SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C5/00 - Constructions of non-optical parts
    • G02C5/14 - Side-members
    • G02C5/20 - Side-members adjustable, e.g. telescopic

Definitions

  • This invention relates generally to audiovisual effects displays and environments, and more particularly to the novel integration of personal head-mounted glasses, 3D imaging display environments and other audio-visual effects.
  • This continuation-in-part and divisional application incorporates by reference my related and earlier-filed applications and disclosures including Ser. Nos. 12/456,401 and 13/294,011 and provisional applications 61/962,877; 61/850,920; 61/850,082.
  • Miniature displays are also well known and may involve a miniaturized version of planar or stereoscopic 3D technologies which display a distinct image to each eye.
  • HMD: head-mounted display
  • the 3D HMD display technology has numerous extensions including Near-to-Eye (NTD)—periscopes and tank sights; Heads-Up (HUD)—windshield and augmented reality—and immersive displays (IMD)—including CAVE, dome and theater-size environments.
  • NTD: Near-to-Eye Display
  • HUD: Heads-Up Display
  • IMD: immersive display
  • wavefront-based technologies, such as digital phase and diffractive holography, may at high resolutions convey a limited amount of accommodation data; however, their limitations, including coherent effects, impart significant specular and other aberrations, degrading performance and inducing observer fatigue.
  • Augmented reality had its origins at MIT Lincoln Laboratory in the 1960s and involved a translucent HMD with head-orientation tracking in a wall-projection immersive environment.
  • the ‘virtual image’ in the HMD did not have accommodation, and the immersive environment did not include spatially-tracked, portable audience elements with multiplicative effects.
  • a further problem solved by the present invention is a method and apparatus to comfortably and usefully carry and use an audio-visual display on one's person.
  • the present invention solves these problems, particularly related to the portable multiphasic design, augmented reality, environmental dynamics and the accurate display of 3D pixels.
  • the present invention discloses an improved method and device for the display of a visual image in two or three dimensions including stereoscopic and/or visual accommodation.
  • Another object of the present invention is an improved method and device for an immersive, augmented reality environment.
  • Another object of the present invention is an improved method and device for monitoring the physiological, psychological, fixation, processing, awareness and response of an individual.
  • Another object of the present invention is an improved method and device for constructing an accurate, augmented reality, visual display with automatic biocular alignment.
  • Another object of the present invention is an improved method and device for constructing an accurate, augmented reality, visual display without an intermediate image plane.
  • Another object of the present invention is an improved method and device for manufacturing a visual display independent of coherence and wavefront curvature constraints.
  • Another object of the present invention is an improved method and device for thin, wave-guided display.
  • Another object of the present invention is an improved method of presenting visual information
  • Another object of the present invention is an improved method and device for an immersive, augmented reality, audience performance environment.
  • Another object of the present invention is an improved method and device to present visual information in compact form unaffected by an external environment.
  • FIG. 1 presents a general view of binocular stereoscopic viewers.
  • FIGS. 2A, 2B & 2C presents a general view of the wrist-wearable, stereoviewer embodiment of the present invention
  • FIGS. 3A-D presents a preferred embodiment having an articulated bridge.
  • FIGS. 4A-B presents a preferred embodiment having a foldable temple.
  • FIGS. 5A-B presents a preferred embodiment having repositionable elements.
  • SB4 and V1 present an HMD embodiment.
  • FIGS. 6A-B present a preferred embodiment having wrappable temples.
  • FIGS. 7A-C present the layered construction of the viewer panels.
  • FIGS. 8A-C present a preferred embodiment having a variable, semi-transparent viewer panel.
  • FIG. 9A presents an augmented, virtual reality embodiment with eye monitoring and response.
  • FIG. 9B references parent U.S. patent application Ser. No. 12/456,401 FIG. 5.
  • FIG. 10 presents a preferred embodiment having a pivotable frame with storage headband.
  • FIG. 11 presents a generalized viewer of an audience effects system.
  • FIG. 12 presents a scanning display embodiment.
  • FIG. 13 presents another embodiment of a scanning display embodiment.
  • FIG. 14 presents a variable focal length lens embodiment.
  • FIG. 15 presents a scanning stereo viewer embodiment using micro optic domains with a polarizing aperture
  • FIG. 16 shows a perspective view of the linear array, continuous focal distance embodiment of the present invention
  • FIG. 17 presents a side view of the linear array, continuous focal distance embodiment.
  • FIG. 18 presents a perspective view of the planar array, continuous focal distance, environmental embodiment of the present invention.
  • FIG. 19 shows a perspective view of a two photon activation embodiment of the present invention.
  • FIGS. 20 and 21 present perspective views of a plasma activation embodiment of the present invention.
  • FIG. 22 presents a sectional view of a scanning embodiment.
  • FIG. 23 presents a sectional view of a deflected, tethered light emitting element activation embodiment of the present invention
  • FIG. 24 presents a perspective view of a three dimensional optic deflection of apparent light source embodiment of the present invention.
  • FIG. 25 presents an interneural motion embodiment of the present invention.
  • FIG. 26 presents a dynamic interocular display embodiment.
  • FIGS. 27 and 28 present a top view of a transfer reflector embodiment.
  • FIG. 29 presents a photonic induction of nerve transmission embodiment.
  • FIGS. 30 and 31 present preferred deformable mirror membrane embodiments of the present invention.
  • FIG. 32 presents a preferred embodiment having a focusable element.
  • FIG. 33 presents a preferred embodiment having a fiber optic element.
  • FIGS. 34 and 35 present a preferred embodiment having an offset reflector.
  • FIG. 36 presents a preferred embodiment having an active, augmented reality visor.
  • FIGS. 37-45 present preferred embodiments employing total internal reflection.
  • FIGS. 46 and 47 present preferred embodiments having thin-film optics.
  • FIGS. 48-50 present preferred embodiments of audience effects elements employed with the present invention.
  • FIG. 51 presents an immersive environment embodiment.
  • FIGS. 52A-54 present enhanced embodiments of the present invention.
  • FIGS. 55 and 56 present an enhanced data structure embodiment of the present invention.
  • FIGS. 57A-C present preferred saccadic enhancement embodiments of the present invention.
  • the present invention relates to improved methods and constructions to achieve a complex visual display environment, which may include a dynamic and precise focal length for each pixel of each frame of a visual display and the ability to present a comfortable image, two- or three-dimensional, virtual or augmented, to a single observer or to multiple observers who may be participating in an audience performance.
  • One preferred embodiment of the personal device enables a fully accommodative, stereoscopic, augmented reality, audio-visual experience with convenient portability and use.
  • This preferred embodiment provides a complete personal audio-visual experience in the form of wearable eyeglasses or head-mounted environment (HME) which may be removably-affixed as a wristband or watch.
  • HME: head-mounted environment
  • the present invention allows but does not require that the pixels or light sources be transparent, or that any light is actually transmitted through them.
  • This approach has many technological, performance and manufacturing advantages including the ability to present discrete, collinear pixels of different focal distances with improved acuity within the period of visual integration.
  • FIG. 1 presents prior art with elements locally renumbered from allowed, parent U.S. patent application Ser. No. 12/456,401 FIG.
  • FIG. 2A presents a preferred convertible embodiment of the present invention having a convertible frame 100 in the form of eyeglasses or goggles, a first display region 102 , a second display region 104 , a first arm 106 , a second arm 108 , a first ear assemblage 110 , a second ear assemblage 112 , a first camera 116 , a second camera 118 , a first display 122 , a second display 124 .
  • the present invention functions as a head-mounted, augmented reality display having some or all of the features common to the field or disclosed in my previous or pending patent applications included by reference.
  • the displays 122 , 124 may be any type including but not limited to those disclosed herein or in my previously-filed applications, miniature LCD, LED, OLED, QD, fluorescent, electroluminescent, transformed, array, scanned, static or dynamic focus, one or two-sided, of occlusive, transparent, or variable optics, engaging a limited or comprehensive field of view.
  • These may include the display of images to the user's eyes at a static or dynamic focal distance, accompanying audio, one or more affixed or mobile cameras capturing the user's environment, one or more cameras capturing the user's eye position, motion and state which may be incorporated into the display, sensors monitoring geophysical position (GPS), motion (acceleration, gyro), physiological state (temp, heart beat, brain waves, etc.), environmental state (temperature, humidity, pressure, audio, air quality, insolation, ambient illumination, scents, etc.).
  • GPS: geophysical position
  • motion: acceleration, gyro
  • physiological state: temp, heart beat, brain waves, etc.
  • environmental state: temperature, humidity, pressure, audio, air quality, insolation, ambient illumination, scents, etc.
  • the convertible frame 100 may have flexible, elastic, hinged regions 114 on the bridge, and on the arms 106 , 108 which allow the frame 100 to be draped or wrapped about a curved object, including an arm, and fastened.
  • the frame 100 material may be flexible. Examples include but are not limited to: elastomers, woven fabrics, linked chains, pinned segments, or other configurations.
  • FIG. 2B presents a posterior view of the second configuration of the present invention where the convertible frame 100 is folded about the wrist 101 of the user having the second display 124 presenting on the posterior wrist and the first display 122 (not shown) presenting on the anterior wrist.
  • FIG. 2C presents a top view of the second configuration of the present invention where the convertible frame 100 is folded about the wrist 101 of the user, having the second outer display 124 presenting on the posterior wrist and the first outer display 122 presenting on the anterior wrist.
  • FIG. 3A presents a front view of the convertible frame 100, which may have flexible, elastic, hinged regions 114 on the bridge and on the arms 106, 108 which allow the frame 100 to be draped or wrapped about a curved object, including an arm, and fastened.
  • FIG. 3B presents a front view of the convertible frame 100 having a flexible, elastic, hinged pivot region 130 on the bridge and on the arms 106, 108 which allow the frame 100 to be folded upon itself, the arms 106, 108 forming a strap-like fastener to secure the frame 100 to the wrist.
  • FIG. 3C presents a front view of the folded convertible frame 100 of FIG. 5, having a flexible, elastic, hinged pivot region 130 on the bridge and on the arms 106, 108 which allow the frame 100 to be folded upon itself, the arms 106, 108 forming a strap-like fastener to secure the frame 100 to the wrist.
  • FIG. 3D presents an end view of the folded convertible frame 100 of FIG. 5, having a flexible, elastic, hinged pivot region 130 on the bridge and on the arms 106, 108 which allow the frame 100 to be folded upon itself, the arms 106, 108 forming a strap-like fastener to secure the frame 100 to the wrist, and showing the first outer display 122 presenting upward.
  • FIG. 4A presents a front unfolded view of the convertible frame 100 and arm having a first fold region 106 and a second fold region 106′ enabling the arm to be folded upon itself.
  • FIG. 4B presents a front folded view of the convertible frame 100 and arm having a first fold region 106 and a second fold region 106′ enabling the arm to be folded upon itself.
  • FIG. 5A presents an expanded top view of the present invention where the displays 122, 124 may be separated or overlapped to convert to a wrist configuration.
  • FIG. 5B presents a wrist-converted top view of the present invention where the displays 122, 124 are overlapped to convert to a wrist configuration.
  • the ‘hinge’ region 114 may have a connected/loop structure 114′ to which the opposing arm/strap 106, 108 is removably affixed in the wrist configuration.
  • FIG. 6A presents an expanded front view of the present invention where the ear arms 128 are in the form of wire and may be looped around and over the alternate display 122, 124 to convert to a wrist configuration.
  • FIG. 6B presents a front wrist view of the present invention where the ear arms 128 are in the form of wire and may be looped around and over the alternate display 122, 124 to convert to a wrist configuration.
  • accommodation may also be dynamic, and change according to situational, user-defined, or programmed factors.
  • Personal displays include a broad range of configurations, from screens embedded in contact lenses to handheld tablets. Many exhibit unique advantages and limitations on their utility based on factors including but not limited to weight, cost, resolution, performance, durability and suitability. For example, a large, high-resolution tablet may be useful for detailed graphics, but cumbersome as a real-time navigation aid while skiing, where a micro-display on transparent goggles would be suitable.
  • FIGS. 7A and 7B present a preferred embodiment of the display 124 of the present invention having one or more layers including but not limited to: outer protective and outer display layer 132, occlusion layer 134, inner display 136 and inner protective layer.
  • FIG. 7C presents a preferred embodiment of the display 124 of the present invention having one or more layers including but not limited to: outer protective and outer reflective layer 138, display layer 136, inner reflection display 140.
  • FIG. 8A presents a sparse emitter embodiment of the display layer 132 having emitters 142, 142′, 142′′ spaced in a sparse array with transparent spaces in between.
  • FIG. 8B presents a sparse emitter embodiment of the display layer 132 having emitters 142, 142′, 142′′ spaced in a sparse array with transparent spaces in between, with the emitters 142 visible to an external observer 12.
  • FIG. 8C presents a sparse emitter embodiment of the display layer 132 having emitters 142, 142′, 142′′ spaced in a sparse array with transparent spaces in between, with the emitters 142 visible to an internal observer 10.
  • Various prismatic constructions may be employed to present a larger apparent source of emission of the emitter 142 .
  • the display layers may be edge-illuminated from a position with the relevant display illumination exiting the substrate by any fixed or dynamic optical means including but not limited to optical path prisms, holograms, TIR violations, and acousto- or electro-optic waveguides such as the leaky- or tunneling-mode transitions described in Sanford U.S. Pat. No. 4,791,388 and more contemporary references.
  • FIG. 9A presents a schematic view of an augmented reality embodiment of the present invention having the user's eye 10, eye lens 14, optical path combiner 150, external object 32, display 136 and focal-length lens 152.
  • the image projected from the display 136 combines in optical path combiner 150 with the external scene 32 .
  • the user's lens 14 will normally accommodate to focus on the object 32 of attention in its field of view.
  • The display focal-length lens 152 may be programmed to follow the focal length of the user's accommodated eye lens 14 by monitoring the user's vision. This is commonly done in eye examinations and may be accomplished by a monitoring eye camera 160 in the display optical path. Monitoring may be enhanced by projecting a pattern, in visible or non-visible light, onto the eye, generally from the display. Near-IR is a common wavelength used for this purpose.
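  • A minimal control-loop sketch of this accommodation-following behavior. The measurement and lens-drive callables are assumptions for illustration (the patent specifies only the eye camera 160 and the programmable lens 152), and the exponential filter is one plausible way to smooth a noisy focus estimate:

```python
import itertools

def follow_accommodation(read_eye_diopters, set_lens_diopters,
                         n_steps=100, smoothing=0.2):
    """Keep the display's variable focal-length lens (lens 152) matched
    to the user's measured accommodation.

    read_eye_diopters: callable returning the eye's current focus (D),
        e.g. derived from a near-IR pattern seen by the eye camera 160.
    set_lens_diopters: callable driving the tunable display lens.
    smoothing: 0..1; higher tracks faster but passes more sensor noise.
    """
    lens = read_eye_diopters()                 # start at current focus
    for _ in range(n_steps):
        measured = read_eye_diopters()
        lens += smoothing * (measured - lens)  # low-pass the estimate
        set_lens_diopters(lens)

# Toy run: the "eye" refocuses from a 2 m object (0.5 D) to 40 cm (2.5 D).
eye = itertools.chain([0.5] * 50, itertools.repeat(2.5))
follow_accommodation(lambda: next(eye), lambda d: None)
```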
  • the preferred embodiment of the present invention presents a single, dynamic display focal length by means of the single dynamic lens 152 .
  • This embodiment may present a programmed image and display focal length based on the design of the programmer.
  • This image may be based on the user's eye position, eye focal length or other design factors.
  • an external camera may identify an object of interest at a distant location while the observer is focused on a near object.
  • the user may be directed to the object of interest faster and with the ability to more accurately evaluate critical factors related to the object.
  • Programmed Global View: an entertainment, educational or other presentation may be prepared to guide the user through a real, virtual or synthetic (fantasy) visual environment by adding focal-distance parameters to the scene/timeline of the presentation.
  • Predictive Global View: complex, predictive algorithms may be employed to program the projected display focal distance based on the user's actions, physiology, progression, program or other factors (a sketch follows below). A wide range of tests, evaluations and educational tools may be enhanced by this method.
  • the display focal distance may be based on the user's eye position, eye focal length, or other complex factors responsive to the global view, including but not limited to enhancing the scene by adding realistic brightness to dimly illuminated parts of the view.
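  • A sketch of how a presentation timeline could carry authored focal-distance keyframes (Programmed Global View) and blend them with a predicted distance (Predictive Global View). The keyframe format, step-hold lookup and blend weight are all assumptions for illustration:

```python
from bisect import bisect_right

# Assumed timeline format: (time_s, diopters) keyframes authored with
# the presentation.
TIMELINE = [(0.0, 0.5), (12.0, 2.5), (30.0, 1.0)]

def programmed_diopters(timeline, t):
    """Return the authored focal distance active at time t (step-hold)."""
    times = [k[0] for k in timeline]
    return timeline[max(bisect_right(times, t) - 1, 0)][1]

def display_diopters(timeline, t, predicted=None, weight=0.3):
    """Blend the programmed focal distance with a prediction derived
    from the user's actions/physiology, when one is available."""
    d = programmed_diopters(timeline, t)
    if predicted is not None:
        d = (1 - weight) * d + weight * predicted
    return d

print(display_diopters(TIMELINE, 15.0, predicted=2.0))  # -> 2.35
```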
  • Optometry: Science, Techniques and Clinical Management, 2nd edition, 2009, is incorporated herein by reference.
  • FIG. 9B references and retains the element numbering of parent U.S. Patent Application FIG. 5.
  • a handheld device may be used as the base unit/power charger for the WristHMD/HD-HMD.
  • the wrist band may be permanent, incorporating the telephonic circuitry and charging power for a low-power, data-streamed HMD.
  • FIG. 10 presents an integrated headphones and HMD 100 embodiment where the HMD 100 may pivot about the ear and fold into a protective cover which is part of upper headband 120 connecting the individual ear speakers.
  • the convertible frame 100 may have flexible, elastic, hinged regions 114 on the bridge, and on the arms 106 , 108 which allow the frame 100 to be draped or wrapped about a curved object, including an arm, and fastened.
  • the frame 100 material may be flexible. Examples include but are not limited to: elastomers, woven fabrics, linked chains, pinned segments, or other configurations.
  • FIG. 11 presents the generalized elements of the performance display system 250 in a venue comprising an audience 252 with a plurality of audience members 252′′ and a stage 254.
  • the venue may include any space ranging from the interior of an automobile, a living room, a dining hall, or nightclub to major event venues such as theatres, concert halls, football stadiums or outdoor festivals.
  • Although the term audience unit or audience receiver unit 200 is used to describe both the simple and autostereoscopic-effects units, it may be understood that the module may take any shape or be incorporated into any independent handheld, worn, or positioned effects device including but not limited to tickets, badges, buttons, globes, cylinders, signs, sashes, headdresses, jewelry, clothing, shields, panels and emblems affixed or held to a member of the audience or to any object, moveable or stationary.
  • the insert in FIG. 11 presents a front view of the present invention having an illuminated audience unit 200 with some or all of the elements of the audience unit of FIG. 11 , as described in my U.S. Pat. No.
  • the show director at the control board 18 or instrument sends a sequence of commands, live or from a stored visual or audio program, over the performance system data network 14 to the projector/signal generator 100, which emits a precisely timed series of directional signals 106, 106′, 106′′ programmed to activate the audience units 200 at a precise location impacted by the directional signal 106.
  • the projector/signal generator 100 displays an invisible IR image 106 at a specific wavelength (880 nanometers, for example) on the audience 22, which causes the wavelength-specific audience unit communication receiver 202 to activate one or more light emitters or modulators 206.
  • the projector/signal generator 100 may also transmit a program sequence for later execution and display.
  • Each audience unit may contain a unique encoded identifier entered during manufacture; at the time of purchase or distribution; or transmitted by the projection system to the audience at any time, including during the performance.
  • the data protocol may include well-known communication protocols such as but not limited to IR RS-232, IrDA, Fibre Channel, Fiber Ethernet, etc.
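  • An illustration of how uniquely addressed commands might be framed over such a link. The frame layout (4-byte unit ID with 0 as broadcast, 1-byte opcode, RGB payload, CRC32 trailer) is invented for the example; the patent states only that each unit may carry a unique encoded identifier:

```python
import struct
import zlib

def encode_frame(unit_id: int, opcode: int, rgb: tuple) -> bytes:
    """Pack an addressed command for an audience unit 200."""
    body = struct.pack(">IB3B", unit_id, opcode, *rgb)
    return body + struct.pack(">I", zlib.crc32(body))

def decode_frame(frame: bytes, my_id: int):
    """Audience-unit side: validate, then accept broadcast or own ID."""
    body, (crc,) = frame[:-4], struct.unpack(">I", frame[-4:])
    if zlib.crc32(body) != crc:
        return None                      # corrupted frame: ignore
    unit_id, opcode, r, g, b = struct.unpack(">IB3B", body)
    if unit_id not in (0, my_id):        # not broadcast, not for us
        return None
    return opcode, (r, g, b)

frame = encode_frame(0, 0x01, (255, 0, 64))   # broadcast: magenta flash
print(decode_frame(frame, my_id=42))          # -> (1, (255, 0, 64))
```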
  • the projector/signal generator 100 may also project a visible light beam containing visual content as well as a data stream by modulating the frequency above the human visual system integration frequency of 30 Hz. It may be understood that the projector/signal generator 100 in its photonic form encompasses the simplest gobo projector as well as the most complex, integrated terahertz-modulated photonic signal generator and spatial light modulator.
  • the Light emitting elements 206 may refer to any type of photonic source such as but not limited to incandescent, fluorescent, neon, electroluminescent, chemical, LED, laser, or quantum dot; or to combinations of light modulating combinations such as but not limited to thin film LCDs, backlit or reflective, E*INK type reflective modulators, chemical, photonic or electronic chromatic modulators.
  • a camera system 300 may be employed to monitor the audience and/or audience units, and provide feedback for a number of manual or automated design, setup and operating procedures. The camera system may be incorporated into the projector/signal generator unit 100. If not properly configured, the data signals 106 may interfere with one another and degrade the rate and integrity of transmission.
  • a time code signal may be transmitted from the system control board 18 or a designated master controller 100.
  • Each data projector 100 may be programmed with a calculated offset from the time-code signal based on its distance from ‘center of mass’ of the audience, the location of other controllers, external environment, and other factors.
  • a central timecode beacon 140 may transmit the time-code signal to each of the data projectors 100 by means including but not limited to photonic, acoustic, or RF signals.
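  • A toy calculation of the per-projector offsets described above, under assumed 2D positions in meters and an acoustic sync signal; the positions, units and center point are illustrative, not from the patent:

```python
SPEED = 343.0   # m/s for an acoustic signal; use ~3e8 for RF/photonic

def timecode_offsets(projectors, audience_center):
    """Delay each data projector 100 so its signal reaches the audience
    'center of mass' in step with the farthest projector's signal."""
    def dist(pos):
        return sum((a - b) ** 2 for a, b in zip(pos, audience_center)) ** 0.5
    arrival = {name: dist(pos) / SPEED for name, pos in projectors.items()}
    latest = max(arrival.values())
    return {name: latest - t for name, t in arrival.items()}

projectors = {"stage_left": (0.0, 0.0), "balcony": (60.0, 25.0)}
print(timecode_offsets(projectors, audience_center=(30.0, 10.0)))
```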
  • a feedback system from the cameras 300 may be used to adjust the performance including but not limited to projecting a fine pattern and adjusting the intensity of the data signal 106 until the appropriate resolution is achieved.
  • the audience unit may employ an IR or other non-visible emitter for adjustment, diagnostic and other purposes.
  • Various user input devices including microphones, buttons, switches, motion detectors, gyroscopes, light detectors, cameras, GPS and other devices may be included 216 .
  • FIG. 12 presents a cross section of the translocation reflector method with a lenticular type screen.
  • the components are an LEE array 1320, a FOE array 1360, a translocation reflector 1322, an actuator 1330, a counterweight 1332, a position encoder 1340 and a screen 1350.
  • a section of the full view is presented on the LEE 1320, focused by the FOE array 1360, reflected by the translocation reflector 1322 and presented on the screen 1350.
  • the screen may be of a fresnel, lenticular, stepped or holographic construction such as to present a focused image of the LEE 1320 to a viewer.
  • a circular polarizing window 1360 may be placed between the observer and the screen to extinguish external ambient light.
  • FIG. 13 presents a rotating polygon embodiment of the present invention.
  • the system projects an image of the LEE 1510 by scanning a rotating reflective polygon 1520 and projecting the image onto a viewing screen or reflective micro-optic surface 1530 viewed by the observer 1540 .
  • a circular polarizing aperture 1550 may be placed between the screen 1530 and the observer 1540, and the LEE 1510 output modulated to produce a range of elliptical polarization whereby the external ambient light is extinguished while the image of the LEE remains visible.
  • the LEE 1510 modulation may be used to control color and intensity as well.
  • the LEE 1510, although shown as a single row, may be constructed of multiple rows, thereby projecting either a 1D array of elements optically combined for increased brightness or intensity modulation, or a 2D array. As a 2D array with appropriate spacing between elements, the optical deflection angle may be reduced to the spacing arc. This technique in combination may be used for large stereoscopic, autostereoscopic and monoscopic projection systems.
  • FIG. 14 presents a perspective view of one embodiment of a single element of the focal distance optical element.
  • the components are the LEE 2020 , a piezoelectric cylinder 2030 and a variable optical element 2040 .
  • an electrical charge applied to the piezoelectric cylinder 2030 varies the compression of the enclosed optical material 2040 resulting in a change in the focal length of the optical element.
  • the LEE will appear to vary in distance when the eye adjusts to the minimum focus. This approach requires a dark region 2060 adjacent to the focusable element for single elements, or an image edge.
  • Focal length adjustment may also be effected by electrostatic reflective membrane arrays, gradient index liquid crystal arrays, SLMs, diffractive elements, multiple internal reflections and other known technologies.
  • FIG. 15 presents a scanning stereo viewer using micro optic domains with a polarizing aperture. Similar to the embodiment of FIG. 21, an image is projected onto a screen 2220 from scanner 2230 or 2232 and viewed by observer 2210. A transparent polarizer window 2250 is interposed between the observer 2210 and the screen 2220.
  • the screen may be constructed of reflective micro domains which focus the image to one observer or disperse the image for multiple observers.
  • the beams of light from the scanner 2230 are either unpolarized or the polarization is modulated to control intensity or color.
  • FIG. 16 shows a perspective view of the linear array, continuous focal distance embodiment of the present invention where the component parts of the light source and scanning assembly A 100 are shown, including an image computer A 90, a linear array of light sources A 110, and a two-axis scanning mirror A 120.
  • the computer A 90 communicates with the scanning mirror A 120 through an open loop drive system, closed loop position feedback or other known positioning system and illuminates those light sources A 110 which correspond to the image points A 310 to be displayed.
  • the divergent beams from each light sources A 110 may be focused by the eye A 24 to correspond to the appropriate object distance.
  • While the linear array of light sources A 110 is shown as an array of light emitters such as LEDs (light emitting diodes) which are driven by an image computer A 90 through circuits not shown, alternative light sources may be employed. Examples of such alternatives include electronically, optically or mechanically activated emitters, shutters, reflectors, and beam modulators. Specifically, an FLCD shutter array as shown in Fig., a fluorescent or two-photon emitter as described by Elizabeth Dowling, or a micromechanical reflector such as the Texas Instruments DMD device may be used.
  • the axial image or zero-order view may be blocked and the image formed from the divergent beams from the emitter.
  • FIG. 17 shows a perspective view of the 2D planar array, continuous focal distance embodiment of the present invention where a two-dimensional matrix of light sources A 110, A 110′ produces the image beams A 304.
  • a multiplicity of 2D arrays A 110 may be used to produce a full 3D-matrix display
  • a preferred embodiment combines the 2D array with a scanning mechanism A 120 to create the full image.
  • FIG. 18 shows a side view of the planar array, continuous focal distance embodiment of the present invention applied to an autostereoscopic display where the light source A 110 and scanning assembly A 120 project the beams towards the screen A 200 and then to the observer's eye A 24 .
  • the scanning assembly A 120 , projection optics and screen A 200 may include embodiments of my previously filed and co-pending patent applications for autostereoscopic displays, thereby incorporating the present invention in the function of the light source and focal distance control.
  • FIG. 19 shows a perspective view of a two-photon activation embodiment of the present invention.
  • researchers have developed a number of techniques for the photo-activation of light emitters.
  • Elizabeth Dowling of Stanford University has perfected a technique using a two-photon activation method. This approach may be usefully employed as a light emitter in the present invention.
  • FIG. 20 shows a perspective view of a plasma or floating emitter activation embodiment of the present invention where a defined light emitter region A 110 is displaced in space and activated under the control of the image computer A 90, the displacement field control structures A 150 and the activation signal A 154.
  • the output beam A 340 is structured by output optics A 410.
  • FIG. 21 shows a perspective view of the reflector or optically activated emitter activation embodiment of the present invention where a defined light emitter region A 110 is displaced in space and activated under the control of the image computer A 90, the displacement field control structures A 150 and the activation signal A 154.
  • the output beam A 340 is structured by output optics A 410 .
  • FIG. 22 shows a side view of the angled reflective planar array, continuous focal distance embodiment of the present invention where the light source A 110 and scanning assembly A 120 projects the beam towards the screen A 200 and then to the observer's eye A 24 .
  • a light source A 102 and reflector A 104 illuminate an array A 110 , A 110 ′, A 110 ′′ shown as a section of a planar array which provides depth function for a multiplicity of image pixels.
  • a ray A 304 from the appropriate pixel A 110 corresponding to the depth function of the pixel is reflected to the imaging optics A 410, the scanning optics A 120 shown as a rotating mirror, and a reflective HOE optical element A 410′ which imparts the angular divergence required to present the proper cone of rays to the HOE augmented reality screen A 200 and then to the observer's eye A 24.
  • FIG. 23 shows a side view of an improved aberration-free light source and scanning assembly A 10 where a light source A 110 is affixed to a movable member A 400, itself affixed to a point on the plane of the projection optics A 410, and the output beam is emitted along a path diverging generally along the movable member A 400.
  • the light source A 110 and movable member A 400 may be chemically, electrodynamically, mechanically (physical, piezo, acousto), or optically displaced in a resonant or pixel-determined fashion. Multiple light sources A 110 may be affixed to the movable member A 400 with intervening non-emitting regions, thus reducing the required displacement.
  • the movable member may be cyclically or predeterminably lengthened and shortened to impart a variable focal length. A multiplicity of movable members may be employed.
  • the electronic circuits, which may be formed from transparent conductive films, are not shown. This approach may be used in low cost consumer and toy applications.
  • the present invention optimizes the current performance/cost parameters of commercially available processes.
  • Contemporary, medium cost, high-speed, light sources, either emitters or shutters, together with associated electronics have digital modulation frequencies in the range of 10-100 MHz.
  • a full-field display should have at least 2000×1000 pixels of resolution (2 megapixels) and a refresh rate of 72 Hz.
  • the resultant data rate for a single-plane, single-emitter light source is 144 MHz.
  • the digital modulation frequency must then be increased by at least a factor of 8; adding a focal depth of 10,000 points, a modulation frequency of over 10 terahertz is required (the arithmetic is worked below).
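  • Written out, the quoted figures follow from serial addressing by a single emitter:

```latex
% Worked check of the quoted data rates (single emitter, serial addressing):
\begin{align*}
  2000 \times 1000~\text{px} \times 72~\text{Hz} &= 1.44\times10^{8}~\text{Hz} = 144~\text{MHz},\\
  144~\text{MHz} \times 8 &\approx 1.15~\text{GHz},\\
  1.15~\text{GHz} \times 10^{4}~\text{focal points} &\approx 1.2\times10^{13}~\text{Hz} \approx 10~\text{THz}.
\end{align*}
```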
  • the present invention provides a direct solution to this problem.
Section Two
  • FIG. 24 Multiple Axis—presents a perspective view of a preferred embodiment of the present invention wherein the deformable membrane incorporates a pattern permitting an increased range of the redirection of the incident radiation.
  • the structure is comprised of a deformable membrane N 100 suspended above or between one or more programmable electrodes N 102 , which may be transparent.
  • the incident beam N 104 is reflected from the membrane N 100 towards the visor mirror 230 and observer's eye 200 .
  • the control electronics N 110 applies a variable charge to electrodes N 102 causing a localized deformation N 114 of membrane N 100 .
  • the amplitude and timing of the applied charge may cause the localized deformation N 114 to travel about membrane N 100 in a vector or raster pattern.
  • the deformation of membrane N 100 is synchronized with the modulation of LEE 220 causing a specific image pixel to be illuminated.
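  • A timing sketch of this deformation/emitter synchronization. The electrode-drive and emitter interfaces are invented for illustration; the patent specifies only that the traveling deformation N 114 is synchronized with the modulation of LEE 220 per pixel:

```python
def raster_frame(set_electrode_charges, set_lee, image, charges_for):
    """Steer the membrane deformation pixel-by-pixel and fire the LEE.

    image: rows x cols of intensities; charges_for(r, c) returns the
    electrode-charge pattern that parks the deformation at pixel (r, c).
    """
    for r, row in enumerate(image):
        for c, intensity in enumerate(row):
            set_electrode_charges(charges_for(r, c))  # move deformation
            set_lee(intensity)                        # fire emitter now
    set_lee(0)                                        # blank after frame

# Toy usage with stubbed hardware interfaces.
raster_frame(lambda q: None, lambda i: None,
             [[0, 255], [128, 64]], lambda r, c: (r, c))
```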
  • the pattern may simultaneously control the spatial distribution and the wavefront of the beam, creating the impression of a variable focal distance with spectral and 3rd- and 5th-order optical aberrations corrected.
  • the membrane N 100 and structure may be mounted upon a translocatable, movable or resonant structure to further enhance its range and applications.
  • the membrane may have lateral or other incisions/discontinuities for a linear translocation.
  • Heterogeneous chemical and mechanical domains in the membrane may be included and individually activated by photonic, mechanical, magnetic or electronic means.
  • FIG. 25 Interneural Motion Processing—presents a preferred embodiment of pixel pattern N 2100 containing multiple pixels N 2102 which are illuminated simultaneously or with discrete precalculated intervals. While the human retina captures photons in microseconds, processing by the retinal neural system imparts a time course which acts to enhance or inhibit adjacent biological vision pathways. A single scanned photon may, when illuminated at a certain frequency, induce the cognitive visual impression of motion in the opposite direction. At the image level, this is observed in the spoked wagon wheels of older Western films. At the biological level, the result may be confusing and ambiguous, substantially reducing a fighter pilot's response time, for example.
  • FIG. 26 Interocular and Retinal Distance, Shape and Range of Movement—presents a preferred embodiment incorporating the dynamic interocular distance and orientation control.
  • One method of alignment and orientation of immersive displays employs one or more test patterns which provide the observer an alignment or adjustment reference. Standard tests for image position, focal distance and stereo alignment may be incorporated in manner similar to adjusting a pair of binoculars or stereomicroscope. Additional tests which incorporate dynamic motion and require hand-eye coordination may be included.
  • the first part measures the range of eye motion of each eye by recording the limits of iris movement.
  • the second part measures the range of retinal image focus and position by projecting a visible or invisible test image and recording the dynamic changes of eye position and focus.
  • An incident beam 170, which may be visible or invisible, is reflected from the iris N 7200, the retina N 7202, or the eye lens N 7204.
  • Spectrographic analysis may be used to identify the source of the reflected beam.
  • the control computer 160 receives the data from the image detector N 7112 and other external systems including the interocular distance which is either fixed or includes a known measuring detector (not shown). This provides sufficient information for the calculation of the orthogonal visual axis of the immersive display relative to the observer and permits an adjustment of the display image including apparent focal distance, stereo image disparity, and visual axis orientation.
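  • A geometric sketch (units assumed: meters and radians) of one adjustment the control computer 160 could derive from this data: the per-eye convergence angle toward a fixation point, from which stereo image disparity and visual-axis orientation can be set:

```python
import math

def eye_vergence_angles(interocular_m, fixation_x, fixation_z):
    """Per-eye yaw toward a fixation point, in a head frame centered
    between the eyes (x lateral, z forward). Returns radians."""
    half = interocular_m / 2.0
    left_yaw = math.atan2(fixation_x + half, fixation_z)
    right_yaw = math.atan2(fixation_x - half, fixation_z)
    return left_yaw, right_yaw

# Example: 64 mm interocular distance, fixating 0.5 m straight ahead.
l, r = eye_vergence_angles(0.064, 0.0, 0.5)
print(math.degrees(l), math.degrees(r))   # about +3.66 and -3.66 degrees
```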
  • This dynamic adjustment may be a useful convenience for all users and of crucial importance to fighter pilots and in other environments where high stresses may cause a physical displacement or distortion of the display or body morphology.
  • A test example for dynamic control would measure the retinal shape and curvature by monitoring the focus of a scanned point in a single-photodiode detector system, or the width and curvature of a line with a two-dimensional detector array. Dynamic monitoring of the retina would correct for G-forces and other anomalies during high-speed turns by fighter pilots and astronauts.
  • Additional external eye-state systems such as those manufactured by ISCAN, Inc. may be employed and the data integrated by the control computer 160.
  • FIG. 27 Distant Focus—presents a preferred embodiment wherein a fixed focal length is set by multiple horizontal elements which are vertically scanned. Other orientations may be employed.
  • one or more emitters 220 may be used in a scanning system.
  • emitter may include the other optical emitter group components including variable focal length.
  • the left eye 200 L observes a virtual image at point N 4102 .
  • the right eye 200 R observes an image set at infinity. While the relative position of point N 4102 in relation to the left eye 200 L is important, it is less so in the infinite focal length example. With all image points being compressed into the infinite plane, image object occlusion disappears. An object viewed only through an aperture would still be subject to minor occlusion at a global scale.
  • the variable focal length faculty of the present invention may be exploited to permit a global or sectional virtual screen at a fixed focal length—with or without correct stereoscopic image disparity.
  • This technique may be used for medical and performance diagnostics, data compression and reduction, as well as all other purposes.
  • a virtual screen set beyond the normal accommodative limits of the human eye may minimize the impact of incorrect stereoscopic interocular alignment.
  • the projected cone of rays emanating from each pixel need not illuminate the entire pupil travel domain but may subtend the solid angle from the general region of the image object.
  • FIG. 28 shows a representative example where an intermediate transfer reflector (or transmitter) N 4110 is employed.
  • the beam 170 exits the optional focal length control 1620 if employed and is reflected (or transmitted) by intermediate transfer reflector (transmitter) N 4010 towards the visor reflector 230 and to the observer 200 .
  • the reflectors may be positioned in any location or combination including but not limited to above and below the eye plane, across the field of vision, at the periphery or the center.
  • FIG. 29 Induction of Vision—The use of photonic induction of nerve transmission has been disclosed by the author in previous U.S. patent applications and papers.
  • the preferred embodiment of the present invention discloses a method and apparatus for the direct photonic innervation of the human visual system.
  • a retinal implant N 5100 receives the beam 170 which causes a localized nerve depolarization N 5102 sending a signal N 5104 to a brain image location N 5106 .
  • the user may then identify the location in the viewer's reference (imaginary) which may or may not correspond to the virtual spatial source of the beam N 5108 .
  • the difference is received and computed by the processing computer 160 to generate a viewer's lookup table which permits a mosaic image to provide a correct view for the individual viewer's cognitive vision.
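  • A calibration sketch for this viewer lookup table. The stimulate/report interfaces and the nearest-percept inversion are assumptions for illustration; the patent describes only that the computed differences yield a table that corrects the mosaic image for the individual viewer:

```python
def build_lookup(sites, stimulate, report):
    """Map each implant stimulation site to the location the viewer
    reports perceiving it (viewer's reference frame)."""
    table = {}
    for site in sites:
        stimulate(site)
        table[site] = report()          # perceived (x, y)
    return table

def site_for(table, target_xy):
    """Invert the table: pick the site whose percept lands nearest the
    desired image location, so the mosaic reads correctly."""
    def err(item):
        sx, sy = item[1]
        return (sx - target_xy[0]) ** 2 + (sy - target_xy[1]) ** 2
    return min(table.items(), key=err)[0]

# Toy usage with a fake two-site implant.
table = build_lookup([0, 1],
                     stimulate=lambda s: None,
                     report=iter([(0.0, 0.1), (0.5, 0.4)]).__next__)
print(site_for(table, (0.45, 0.45)))    # -> 1
```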
  • the retinal implant N 5100 is the subject of the inventor's previous and pending applications and papers.
  • the process may be used on sense, motor and aural nerves as well, where the processing computer 160 receives the instructions from the user's biological processes (Solomon, 1979) or other control systems and generates a mosaic image to activate the implant N 5100.
  • FIG. 30 Variable Membrane Tension—The use of variable shape reflective and transmissive materials such as reflective membranes, transmissive liquid lenses, and materials wherein a localized change in refractive index is induced for beam forming and scanning are well known. In a preferred embodiment of the present invention these materials are utilized to vary the focal length and beam direction in a novel construction, using both integrated and multiple elements.
  • FIG. 30 an elongated concave membrane N 6100 with multiple electrodes N 6102 is shown.
  • the membrane N 6100 is shown connected at the corners but any configuration may be used.
  • the membrane may be in tension flat or designed with a distinct neutral shape.
  • FIG. 31 shows the operation wherein a shaped portion N 6104 of a convex membrane N 6100 oscillates between alternative positions N 6104 and N 6106 during a view cycle of approximately 72 hertz.
  • the beam 170 is reflected from the surface.
  • the membrane undergoes a multiplicity of subtle changes which reflect the integration of the field forces generated between the multiple electrodes N 6102 and the membrane N 6100 .
  • These changes are controlled by the processing computer 160 and incorporate the focal length and beam direction information.
  • the membrane may represent the surface of deformable or refractive index variable, transmissive material using transparent or reflective electrodes at surface N 6102 .
  • deformable membrane mirrors as a method for controlling the beam direction, the focal length, the modulation of intensity and chromaticity and the correction of errors has been the subject of extensive research.
  • In Applied Optics, Vol. 31, No. 20, p. 3987, a general equation for membrane deformation in electrostatic systems as a function of diameter and membrane tension is given. It is shown that deformation varies as the square of the pixel diameter [a] or voltage [V], and is inversely proportional to the tension [T]. In many applications where the invention is proximal to the human eye, increasing the pixel diameter or the voltage is impractical. Consequently, dynamic changes in membrane tension offer an acceptable method for variation.
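  • The cited scaling, written out; the proportionality constant (which involves the permittivity and the electrode gap) is not given in the text:

```latex
% Electrostatic membrane deflection scaling quoted above:
w_{\max} \;\propto\; \frac{a^{2} V^{2}}{T}
% w_max: center deflection, a: pixel/membrane diameter,
% V: applied voltage, T: membrane tension.
```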
  • Variable membranes utilizing known mechanical, photonic, acoustic and magnetic deformation may be employed.
  • FIG. 32 shows the preferred embodiment as disclosed in related government proposals wherein the display system is comprised of a processing computer 160 which coordinates the illumination of LEEs 220 , the modulation of display beam integrated translocation and focal length component N 7110 and the eye state feedback component N 7112 .
  • the light emitted from LEEs 220 is combined in the optical waveguide 1050 and directed as a discrete beam 170 to the translocation and focal length component N 7110.
  • the beam 170 is directed and focused towards the beam splitter N 7114 , an optional conditioning optic 228 which may be positioned at any point between the exit aperture of the optical waveguide 1050 and the visor reflector 230 , and the visor reflector 230 .
  • the beam 170 is then directed to the viewer's eye 200 , presenting a replica beam of that which would have been produced by a real point N 7118 on a real object 100 .
  • a real point N 7118 would generate a cone of light whose virtual representation is beams 170 and 171 .
  • the observer will perceive the object point N 7118 as long as image beams 170 or 171 enter the observer's iris N 7200 at a viewable angle.
  • a reflected beam N 7120 is recorded by the eye state feedback component N 7112, which incorporates a detector and conditioning optic N 7122 which may range from a single photodiode to a complex, high-speed, full-color camera.
  • Data collected by the eye state component N 7112 may be received and analyzed by the processing computer 160 .
  • the preferred embodiment of the present invention may incorporate a membrane structure which dynamically and reversibly changes tension in response to applied field, charge density and photonic irradiation.
  • FIG. 33 Field optic transfer of emitter aperture—presents a preferred embodiment wherein the emitter and combiner exit aperture N 8102 , N 8102 A is transferred by means of an optical waveguide N 8104 to the focal distance optical element N 7110 or projection optics 228 .
  • Various shapes of waveguides including micro-optical elements may be employed.
  • the present invention may be applied to alternative constructions, orientations, spacing, and shapes including but not limited to horizontal, oblique, curved or discontinuous arrays and scans.
  • the intensity of the light source may vary during a cycle of up to 8 periods weighted by the binary increments 1, 2, 4, 8 . . . . Each pixel is illuminated during some subset of these weighted periods, resulting in varying intensities of 0-255 and an individual pixel density increase of a factor of 4 (see the sketch below).
  • the base-two series may be expanded to any power.
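  • A sketch of this binary-weighted (bit-plane) modulation: eight on/off subframes with durations 1, 2, 4, ..., 128 units reproduce any intensity from 0 to 255. The helper below illustrates the scheme and is not code from the patent:

```python
def bit_planes(image):
    """image: rows x cols of 0-255 ints. Yield (duration, mask) pairs,
    mask giving each pixel's on/off state during that subframe."""
    for bit in range(8):
        duration = 1 << bit                     # 1, 2, 4, ..., 128
        mask = [[(px >> bit) & 1 for px in row] for row in image]
        yield duration, mask

# A pixel of value 200 (binary 11001000) is lit during the subframes of
# duration 8, 64 and 128: 200 of 255 total time units.
for duration, mask in bit_planes([[200]]):
    print(duration, mask[0][0])
```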
  • HMD with image generated in ear arm and optically bent by TIR at the arm-visor junction
  • HMD as Personal Communicator
  • FIGS. 34 and 35 show a preferred embodiment having a light source 10 , variable focal length element 12 , a first scanning element 14 , a first optical element 16 and a visor optical element 18 .
  • the light source 10 is focused by focal length element 12 and scanned by scanning element 14 onto the first optic 16 and then onto the visor optical element 18 .
  • the first optic 16 causes the virtual position of the light source to be displaced, which is expanded by the proper complementary visor optics as viewed by the observer. This embodiment expands the visual aperture of the HMD.
  • Complementary optics includes various combinations of circular, parabolic, and elliptical forms.
  • One example shown is a circular first optic 16 and an elliptic visor optic 18 .
  • Corrections for 1st- and 3rd-order aberrations may be introduced.
  • Factors such as field of view, precision, scanning control and light source modulation may determine the optimum design for a given market.
  • Eye position feedback may be used to adjust the image for placement, registration with the external environment, or distortion.
  • The embodiment disclosed in FIG. 46 is described in large part in my earlier and pending applications, which integrate the scanning and first-optic properties by displacing the reflective surface of the scanning element 14, which may be but is not limited to a resonant mirror, from the axis of rotation. This occurs naturally with a polygon scanner.
  • the observer aperture is determined in part by the relative size of the light source aperture (pixel) and the virtual position displacement caused by the scanning optics.
  • a wide observer aperture dictates a small light source and a larger virtual displacement.
  • FIG. 36 shows a preferred embodiment having active, augmented-reality visor optics 28 comprising a reflective prismatic form 30, a liquid crystal medium 32 and an external substrate.
  • the reflective forms 30 a - c are sequentially switched from reflective to transmissive in coordination with the scanning of the light source 10.
  • the ratio of reflective to transmissive periods determines the occlusion of the ambient environment.
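To make the duty-cycle relation concrete, a minimal sketch (assuming, for illustration only, that the eye integrates a simple time average of the switched states):

```python
def ambient_transmission(t_reflective, t_transmissive):
    """Fraction of ambient light passed through the active visor,
    modeled as the transmissive share of each switching cycle."""
    return t_transmissive / (t_reflective + t_transmissive)

# e.g. 3 ms reflective (projection) vs 1 ms transmissive per cycle
print(ambient_transmission(3.0, 1.0))  # 0.25, i.e. 75% occlusion
```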
  • a second liquid crystal and substrate 40 may be employed to increase the occlusion of the ambient environment.
  • the polarization optics for occlusion are not shown, but are commonly understood from sequential-shutter stereo glasses such as those used by IMAX or manufactured by Stereographics.
  • the active visor optics 28 complements and may be applied to the embodiments in my pending applications.
  • FIG. 37 shows a preferred embodiment applied to the Johnson art of total internal reflection, where the beam(s) 28 from one or more light sources 10 , including but not limited to a linear array, are modified by a focal length element 12 and scanned by scanner 14 , which may include a displacement reflector 16 , into the Johnson prism 40 .
  • the beam is totally internally reflected one or more times between the exit face 46 and the prism face 48 , finally exiting to the observer 20 when its angle of incidence at the exit face 46 falls below the critical angle.
  • a redirecting optical element 60 is shown in FIG. 4 which may be a diffuser, Fresnel lens, micro-optic lens, HOE or other optical element depending on the use (HMD, NTE, heads-up display, screen) and the position(s) of the observer(s).
  • FIG. 38A shows a second prism 42 proximal but spaced from the first prism 40 which directs the light from the environment 100 through the first prism 40 to the observer(s) 20 .
  • a shutter system 50 (which may be but is not limited to liquid crystal shutters, electrophoretic, electro-optic, MEMS or other systems) may be configured and activated as rows, columns or both.
  • the shutter acts to occlude the external environment 100 and increases the contrast of the projected ray 30 .
  • the shutter 50 may act in synchrony with the scanning system 14 .
  • FIG. 38B shows that the shutter system 50 may be placed next to the second prism 42 with a space 52 between the shutter and the first prism 40 .
  • the change in the refractive index may alter the critical angle or reflectivity, or evanescent coupling, thereby increasing resolution and contrast.
  • the shutter system 50 may be spaced from both prisms.
  • FIG. 39 shows that the shutter system 50 may be applied to the observer face 50 ′ or the environment face 50 .
  • FIG. 40 shows a redirecting optical element 60 which may be a diffuser, Fresnel lens, micro-optic lens, HOE or other optical element depending on the use (HMD, NTE, heads-up display, screen) and the position(s) of the observer(s).
  • FIG. 41 shows a method of manufacturing the linear array shutter system where the shutter material (LCD, for example) 50 is applied to a film which is placed on roll 208 and serially sliced 210 (etched by laser, for example).
  • FIG. 42 presents an active shutter reflector element 50 ′ which may function as the redirecting optics 1350 shown in FIG. 13 and FIG. 50 . One or more shutter systems 50 , 50 ′ may be incorporated, with a redirecting optic 60 placed before or after.
  • when the shutter system 50 ′ is between the observer and the prism exit face 46 , it may additionally function to increase the resolution of the beam (shown as vertical lines but not limited to any direction) by masking the adjacent regions 50 a, b, c when opened in synchrony with the scan.
  • the scans may be interlaced (alternating patterns).
  • FIG. 43 presents a linear accommodation embodiment where the LEE array 10 projects a fan-shaped beam 28 , 28 ′, 28 ′′′ from each pixel.
  • the optical path lengths are symmetrical about the principal axis of the beam 28 and facilitate visual accommodation. Further, the necessary optics are simplified and the resolution of the system improved.
  • Chromatic control may be integrated or distinct, with separate LEEs for each color. While RGB combinations are well-known, additional colors including yellow, amber and purple may be included.
  • a LUT may be provided in the software to introduce the correction.
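Where such a correction LUT is provided, it might take the following form (a minimal sketch; the per-channel table, the gamma-style curve and all constants are illustrative assumptions, not the disclosed correction):

```python
GAMMA = 2.2  # assumed correction curve; a real table would be measured per LEE

# Hypothetical 8-bit lookup table mapping commanded value -> drive value
LUT = [round(255 * (v / 255) ** (1 / GAMMA)) for v in range(256)]

def correct(rgb):
    """Apply the correction LUT to an (r, g, b) triple."""
    return tuple(LUT[c] for c in rgb)

print(correct((128, 64, 200)))  # corrected drive values
```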
  • the shutter element 50 may comprise optically-active materials such as liquid crystal (LC, FLC) or dyes, or displaceable elements such as micro-mirrors, electrophoretic spheres, piezo-vanes, etc. While the embodiment shown places the LEE and prism vertically, the orientation may be horizontal or oblique.
  • the TIR pathway may begin in the ear arm of a pair of eyeglasses and bend around the corner.
  • the visor, LEE and other components may be curved or conform to a unique shape.
  • FIG. 44 shows a perspective view of the combined system A 10 having a light emitting element (LEE) array A 110 , scanning optics A 120 in the form of a two-axis, reflective scanner, and a partially reflective, micro-optical element visor or screen A 300 .
  • the LEE array A 110 and scanning optics A 120 are controlled by computer assembly A 90 .
  • Common to all head mounted displays and well known to those skilled in the art are a power source such as a battery A 90 B and a data receiving channel such as a television broadcast decoder or other data link. These are usually incorporated in the computer assembly A 90 and therefore not shown separately.
  • the light beams A 200 , A 200 ′ (shown by single and double arrows respectively) from one of the LEE array elements A 110 x are cyclically scanned by the two-axis (vertical A 120 v and horizontal A 120 h ), reflective scanner A 120 across the partial reflective visor A 300 .
  • the reflected beams A 200 , A 200 ′ are directed towards the observer's eye A 22 and, when in focus, converge to a single point on the retina A 22 ′.
  • the partial reflective screen A 300 also permits the observer to view the external environment A 304 .
  • the percentage of reflectivity is commonly controllable by a number of well-known technologies including but not limited to LCD shutters.
  • the apparent distance between oneself and a light emitting element A 110 ′ is a function of the design focal length of the system which includes the focal lengths incorporated in the visor A 300 , the scanner A 120 , and the LEE array A 110 .
  • HMDs are commonly set at about 12 feet.
  • the LEE array A 110 is co-axial with the principal optical axis of the system and along this axis, the distal LEE element A 110 ′′ is further away than the proximal LEE element A 110 ′′′.
  • the LEE elements A 110 will each focus at a different virtual distance A 310 , and they may be simultaneously illuminated.
  • FIG. 45 shows the present invention with a two-dimensional (7×3) light emitting element array A 110 D. It may be understood that the array may generally be sized up to 4096×1024 and the virtual image from 640 up to 4096×1024. Two advantages of this preferred embodiment are the simplification of the scanner A 120 from two-axis to one A 120 H, and a reduction in the required frequency of illumination of the individual light emitting elements A 110 for a given image resolution. While Fig. X2 shows the placement of the light source and scanning assembly A 100 on the side of the head, any placement may be employed including but not limited to the top or bottom of the head, a cockpit dashboard, or a desktop.
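The rate reduction can be made concrete with a back-of-envelope sketch (the 60 Hz frame rate is an assumption; the resolutions are taken from the text):

```python
FRAME_RATE = 60                  # Hz, assumed for illustration
IMAGE_W, IMAGE_H = 4096, 1024    # virtual image resolution from the text

def element_rate(array_w, array_h):
    """Pixel updates each emitter must supply per second: the image
    pixels are divided among the array elements every frame."""
    return (IMAGE_W * IMAGE_H) / (array_w * array_h) * FRAME_RATE

print(element_rate(1, 1))   # single scanned source: ~2.5e8 updates/s
print(element_rate(7, 3))   # the 7x3 array: ~1.2e7 updates/s per element
```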
  • Displays with visual accommodation produce an image by scanning a divergent beam from each image pixel directly into the field of view of the observer rather than forming a real image on a screen or surface, though some embodiments may omit this attribute.
  • the divergent beam is generally circular in cross-section, orthogonal to the principal axis between the center of the observer's eyelens and the originating image pixel.
  • the beam may also be elliptical or linear. Nonetheless, human visual accommodation is able to respond accurately.
  • a number of display configurations and technologies including those enabling visual accommodation may be enhanced, both in performance and manufacturability, by projecting a linear form of the divergent beam.
  • This preferred embodiment of the present invention addresses the application of the Johnson Wedge to devices which maintain the optical focal distance to the LEE.
  • FIG. 46 presents the thin-film preferred embodiment of the present invention having a generally linear pixel source 1100 , a thin-film waveguide 1112 , an extraction/activation layer 1114 , and an augmented occlusion layer 1110 .
  • the vertically divergent beams 28 , 28 ′ are emitted by the pixel source 1100 and coupled into the thin-film waveguide 1112 , in which they travel by total internal reflection or evanescent wave, exiting at the proper exit position 1116 along the waveguide 1112 and directed to the observer's eye 20 .
  • the visual accommodation faculty of human vision will adjust the focal distance of the observer's eye in response to the vertical divergence of the beams, obviating the need for a horizontal divergence which would demand a more complex optical waveguide for high resolution transmission.
  • the extraction/activation layer 1114 and thin film layer may be active or passive, reversed and function by direct or reflected extraction/activation.
  • for an active extraction layer 1114 , the construction may include but is not limited to an array of liquid crystal (LC, FLC) vertical linear apertures timed with the transmission, wavelength conversion using quantum dots, two-photon conversion, embedded conversion elements, evanescent wave coupling, optical coherence tuning and other known optical technologies.
  • the construction may be of multiple planar layers with a thickness approaching evanescent wave dimensions and terminating or transitioning at a fixed distance.
  • a 2000-layer system comprising transmission and spacing sub-layers of 1-micron thickness may be less than 2 millimeters thick overall.
  • FIG. 47 presents one of many locations for an integrated camera element 1150 which records the position, orientation, iris, and focal length of the observer's eye from the reflected beam, which may be the image-forming beam or an auxiliary beam including but not limited to a non-visible wavelength such as infrared or ultraviolet.
  • FIG. 48 presents an integrated visual display system which may be applied broadly to my related inventions having one or more fixed, movable, independent, handheld, suspended, and/or otherwise located devices 2000 , one-, two- or three-dimensional visual emitters 2010 , a wireless communications element 2012 R, and a special effects module 2013 which may include audio, tactile, inertial, olfactory, or other effects, controlled by RF, acoustic or photonic devices 2012 from a control board or computer 2014 .
  • the photonic control devices 2012 may be static or moving sources and spots having patterns that are static or mechanically, optically or electronically generated, including but not limited to omnidirectional sources, regional fixed IR spotlights, moving gobo patterned spots, digital micro-mirror device (DMD) electromechanically controlled patterning, LCD or other electronic patterning devices.
  • Multiple overlapping control devices 2012 L, 2012 C, 2012 R may be used to provide full data signal coverage, and the specific patterns may be adjusted to present a single seamless data pattern of controlled intensity including but not limited to the methods employed with visible light projectors.
  • a carrier frequency 2024 such as 38 kHz or 450 kHz may be imposed under the data signal.
  • the carrier frequency may be synchronized electronically or optically, including by a wireless master carrier frequency synch signal 2020 and corresponding receivers 2022 .
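A minimal sketch of imposing data on such a carrier (on-off keying as used in common IR protocols is assumed here; the sample and bit rates are illustrative):

```python
import math

CARRIER_HZ = 38_000     # carrier from the text; 450 kHz is the alternative
SAMPLE_HZ = 1_000_000   # assumed emitter sample rate
BIT_HZ = 1_000          # assumed data rate, far below the carrier

def modulate(bits):
    """On-off-key data bits onto the carrier: the square carrier runs
    during '1' bits and is suppressed during '0' bits."""
    samples_per_bit = SAMPLE_HZ // BIT_HZ
    out = []
    for i, bit in enumerate(bits):
        for n in range(samples_per_bit):
            t = (i * samples_per_bit + n) / SAMPLE_HZ
            square = 0.5 * (1 + math.copysign(1, math.sin(2 * math.pi * CARRIER_HZ * t)))
            out.append(square if bit else 0.0)
    return out

signal = modulate([1, 0, 1, 1, 0])
```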
  • FIG. 49 presents a moving device 2000 having an embedded pattern 2036 which may be pre-coded or transmitted, and which displays upon receiving a sequence of activating signals at locations 2028 , 2030 , 2032 , 2034 .
  • a history of the device 2000 locations may be stored and used to adjust the proper timing and direction of the displayed pattern 2036 .
  • FIG. 50 presents a balloon embodiment of the moving device 2000 having an additional special-effect altitude control 2042 including but not limited to a volume heater/cooler, volume pump, balloon surface tension material, chemical reaction or other known device to regulate the volume or buoyancy of the balloon.
  • a bistable wire web may be employed to alternatively contract and expand the volume.
  • an upper 2014 and lower signal 2012 may be provided to regulate the altitude to a given layer.
  • the signal strength may be employed to cause the balloon to descend once it reaches a defined level or is lost.
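One way such layer-keeping might be realized in the balloon's controller (a hypothetical sketch: the two beacon strengths, the deadband and the lost-signal fail-safe are assumptions consistent with the text):

```python
def altitude_command(upper_strength, lower_strength, deadband=0.1):
    """Return -1 (descend), 0 (hold) or +1 (ascend) from the relative
    strengths of the upper and lower layer signals."""
    if upper_strength == 0 and lower_strength == 0:
        return -1  # signal lost: descend, per the fail-safe in the text
    diff = upper_strength - lower_strength
    if diff > deadband:
        return -1  # too high: contract volume / cool to descend
    if diff < -deadband:
        return +1  # too low: expand volume / heat to ascend
    return 0
```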
  • FIG. 51 presents an improved beam holographic background display 3000 having one or more digital beam holographic pixels 3012 which emit a complex pattern of light, horizontally and/or vertically, replicative of a virtual visual screen through which one views a two- or three-dimensional image of the design. Details of the principles of operation have been presented in my earlier related applications.
  • the improved display may be constructed by one or more columns 3002 of pixels, each column 3002 derived from the projection of one or more projection spatial light modulators (SLM) 3010 .
  • an SLM 3010 having a base resolution of 1024×768 may be expanded into a column of 768 pixels and 1024 horizontal views 3014 ′.
  • the column beam presentation 3014 L at a given angle may be monitored by a sensor or camera 3040 L and an appropriate correction may be applied by the internal controller or a central server 3004 .
  • the beam presentation may be at a non-visible wavelength such as but not limited to infrared. If a number of the peripheral views of the SLM are reserved for correction, the system will be able to dynamically correct for substantial vibration, displacement or other interruptions. The percentage required depends on the conditions: a fixed, stable system may require only 4-6 pixel views while a mobile, stage-mounted system for outdoor shows may require 20-40 views.
  • Multiple sensors 3040 L, 3040 C, 3040 R may be employed to increase the accuracy.
  • FIG. 52A presents a top view of a total-internal-reflection differential expansion of the projected pattern of the SLM 3010 through a series of waveguides 3020 .
  • FIG. 52B presents a perspective view having the SLM 3010 centrally mounted proximal to the column 3002 and the projection grid 3030 shown.
  • FIG. 53 presents the columns angled.
  • FIG. 54 presents the columns staggered and in parts.
  • an integrated, coordinated display system may be created having a dynamic, three-dimensional beam holographic background 3000 , a visual space filled with moving pixel devices 2000 , and an augmented, observer-mounted display system.
  • FIG. 55 presents a preferred method applying the data error bits 602 to an upper nibble 604 , thereby enabling a verified first approximation of the greater value, with the remaining lower nibble or bits 606 refining it.
  • the algorithm may allow a state change if the upper nibble data is OK, but not if only the lower nibble is OK. When used in an RGB (600×3) or other sequence for display applications, this enhances the continuity of the show at the cost of a small chance of minor error in the low-order values of the display.
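A minimal sketch of this upper-nibble-first rule (the even-parity check and field layout are assumptions; the disclosure specifies only that error bits protect the upper nibble):

```python
def parity_ok(nibble, parity_bit):
    """Even-parity check over a 4-bit value (assumed error-bit scheme)."""
    return bin(nibble).count("1") % 2 == parity_bit

def update_channel(prev, upper, upper_parity, lower, lower_parity):
    """Accept a coarse-but-verified update: if the protected upper nibble
    verifies, use it (and the lower nibble only if it also verifies); a
    valid lower nibble alone never changes state."""
    if not parity_ok(upper, upper_parity):
        return prev  # coarse value corrupt: hold the previous state
    low = lower if parity_ok(lower, lower_parity) else prev & 0x0F
    return (upper << 4) | low
```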
  • FIG. 56 presents a preferred method which enables the coordination of the controllers by the central control computer data stream 502 having a synch mark 504 and optional synchronization offset value 506 .
  • the synch mark initiates the controller emission sequence and the offset value 506 corrects the timing of the sequence to reflect the individual controller position, data transmission delay and other factors which would otherwise degrade the signal.
  • FIG. 2-AO presents a preferred method which enables the coordination of the controller/projector 100 by one or more global synch marks 504 in the form of a radiative pulse of RF, photonic, acoustic or other means from a suitably located generator.
  • the central control computer data stream 502 , which may include but is not limited to DMX, RDM, or ArtNET universes, carries the synch mark 504 and optional synchronization offset value 506 .
  • Both methods may be used to synchronize, at the receiver position, the data streams from multiple controllers, optionally including the phase and carrier frequency if used.
  • Using these methods with line-of-sight photonic data streams enables multiple controllers from different directions to control receivers, often at lower power and with fewer spurious reflections from the audience or environment than otherwise. These methods improve spatial resolution and controller system efficiency.
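As a hedged sketch of the offset correction (the microsecond unit and the per-controller offsets are illustrative assumptions):

```python
import time

def emission_start(sync_mark_time, offset_us):
    """Each controller starts its emission sequence at the shared synch
    mark plus its individual offset, compensating for its position and
    transmission delay."""
    return sync_mark_time + offset_us * 1e-6

mark = time.monotonic()  # stand-in for a received synch mark 504
for controller, offset_us in [("left", 120), ("center", 0), ("right", 95)]:
    print(controller, emission_start(mark, offset_us) - mark)
```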
  • Wizard elements on an audience participant may have a single emitter or, in order to increase the apparent resolution, multiple spatially-independent emitters.
  • Each emitter may be controlled by a high resolution controller or alternatively by an increased data stream which incorporates the data required for each emitter.
  • Lower resolution displays may be enhanced through sequential or interlaced frames.
  • Sequential frames cause the emitter to present sequentially a multiplicity of pixels of the image, each shifted by a small amount, at a higher emitter rate than normal.
  • the human visual system integrates the sequential images, perceiving each sequential pixel as properly displaced in space even though the pixel has not physically moved. Apparent resolution multiplications of over 100 are possible.
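A minimal sketch of generating such sub-pixel-shifted sequential frames (bilinear resampling and the grayscale row-list format are assumptions made for illustration):

```python
def shifted_frame(image, dx, dy):
    """Resample a grayscale image (list of rows) at a sub-pixel offset
    (dx, dy); presented rapidly, the sequence of shifted frames is
    integrated by the eye into a higher apparent resolution."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            sx = min(x + dx, w - 1.0001)
            sy = min(y + dy, h - 1.0001)
            x0, y0 = int(sx), int(sy)
            fx, fy = sx - x0, sy - y0
            x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
            out[y][x] = (image[y0][x0] * (1 - fx) * (1 - fy)
                         + image[y0][x1] * fx * (1 - fy)
                         + image[y1][x0] * (1 - fx) * fy
                         + image[y1][x1] * fx * fy)
    return out

base = [[0, 64, 128, 255] for _ in range(4)]                         # 4x4 base matrix
frames = [shifted_frame(base, dx, 0) for dx in (0.0, 1 / 3, 2 / 3)]  # 3 frames
```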
  • FIG. 57A presents a preferred embodiment of the sequential frame method having a base image matrix of 4×4 elements 400 and 3 frames including the base frame 410 and virtual sequential frames 412 and 414 , shown in their virtual positions. It may be understood that the actual emitters are the base image matrix 400 . Scene content and cues cause the virtual image to shift and the apparent resolution to increase along the y-axis 404 .
  • FIG. 57B presents a preferred embodiment of the interlaced frame method having a base image matrix of 4×4 elements 400 and 5 frames including the base frame 410 and virtual interlaced frames 412 through 418 , shown in their virtual positions interposed about the base frame 410 . It may be understood that the actual emitters are the base image matrix 400 . Scene content and cues cause the virtual image to shift and the apparent resolution to increase along both axes.
  • FIG. 57C presents a preferred embodiment of the enhanced saccadic method where a virtual frame having a fixation-attractive (light) and fixation-repulsive (dark) overlay on the matrix base image causes the observer's fixation (saccades) to shift in the designated direction.

Abstract

The present invention discloses an improved method and device for the immersive display of three-dimensional images with convertible eyeglasses and wristwatch structures. An improved method for manufacturing a visual display incorporating a scanned light source and an improved method of presenting visual information are disclosed. A complete, immersive display environment is also presented.

Description

    TECHNICAL FIELD
  • This invention relates generally to audiovisual effects displays and environments, and more particularly to the novel integration of personal head-mounted glasses, 3D imaging display environments and other audio-visual effects. This continuation-in-part and divisional application incorporates by reference my related and earlier-filed applications and disclosures including Ser. Nos. 12/456,401 and 13/294,011 and provisional applications 61/962,877; 61/850,920; 61/850,082.
  • Allowed patent application Ser. No. 12/456,401 is incorporated herein by reference.
  • Further including continuation-in-part priority benefit of Ser. No. 11/149,638 incorporated herein by reference.
  • BACKGROUND ART
  • Miniature displays are also well known and may involve a miniaturized version of planar or stereoscopic 3D technologies which display a distinct image to each eye. With increased miniaturization and incorporation into eyeglass designs, head-mounted displays (HMDs) have enjoyed increasing popularity for applications ranging from fighter-pilot helmet displays and endoscopic surgery to virtual reality games and augmented reality glasses. The 3D HMD display technology has numerous extensions including Near-to-Eye (NTE) displays such as periscopes and tank sights; Heads-Up Displays (HUD) such as windshield and augmented reality displays; and immersive displays (IMD) including CAVE, dome and theater-size environments. The principle employed varies little from that of the 1930 Polaroid™ glasses or the barrier stereoscopic displays of the 1890s, despite the extensive invention of the past twenty years related to the active technologies that produce each display. As applied to small displays, these techniques evolved to include miniature liquid crystal, field emission, OLED, quantum dot and other two-dimensional matrix displays, and variations of virtual-screen and retinal-scanning methodologies. Other approaches include scanning fiber optic point sources such as disclosed by Palmer, U.S. Pat. No. 4,234,788, and compact folded, total internal reflection optical displays disclosed by Johnson in U.S. Pat. No. 4,109,263. These inventions have provided practical solutions to the problem of providing lightweight, high-resolution displays but are limited to providing a stereoscopic view by means of image disparity.
  • Object visual accommodation, however, is not incorporated in these previous inventions. A solution to the problem of accommodation for all displays was disclosed by A. C. Traub in U.S. Pat. No. 3,493,390, by Sher in U.S. Pat. No. 4,130,832, and by others. These inventors proposed a modulated scanning signal beam coordinated with a resonantly varying focal length element disposed in the optical path between the image display and the observer. These solutions are bulky and do not scale for practical usage.
  • It is also well known in the field that wavefront-based technologies, such as digital phase and diffractive holography, may at high resolutions convey a limited amount of accommodation data. However, their limitations, including coherence effects, impart significant speckle and other aberrations, degrading performance and inducing observer fatigue.
  • Alternative approaches, in which a data-controlled, variable focal length optical element is associated with each pixel of the display, were the subject of experimentation by this inventor and others, including Sony Corporation researchers, in Cambridge, Mass. during the late 1980s. In 1990, Ashizaki of the Sony Corporation, U.S. Pat. No. 5,355,181, disclosed an HMD with a variable focus optical system.
  • Augmented reality had its origins at MIT Lincoln Laboratory in the 1960s and involved a translucent HMD with head-orientation tracking in a wall-projection immersive environment. The ‘virtual image’ in the HMD did not have accommodation, and the immersive environment did not include spatially-tracked, portable audience elements with multiplicative effects.
  • Despite the improvements of the past decades, the significant problem of providing a low cost, highly accurate visual display with full accommodation remains. One of the principal limitations has been the inability of sequentially resonant or programmed variable focal length optics combined with scanning configurations to properly display solid three-dimensional pixels orthogonal to the scanning plane. Another limitation is the inability of the observer's eye to properly and comfortably focus on rapidly flashing elements. Numerous inventions have been proposed which have generally been too complicated to be reliable, too expensive to manufacture, or without sufficient resolution, accuracy or stability to gain wide acceptance.
  • A further problem solved by the innovation of the present invention is a method and apparatus to comfortably and usefully carry and use an audio-visual display on one's person.
  • The present invention solves these problems, particularly related to the portable multiphasic design, augmented reality, environmental dynamics and the accurate display of 3D pixels.
  • BRIEF SUMMARY OF THE INVENTION
  • The present invention discloses an improved method and device for the display of a visual image in two or three dimensions including stereoscopic and/or visual accommodation.
  • Another object of the present invention is an improved method and device for an immersive, augmented reality environment.
  • Another object of the present invention is an improved method and device for monitoring the physiological, psychological, fixation, processing, awareness and response of an individual.
  • Another object of the present invention is an improved method and device for constructing an accurate, augmented reality, visual display with automatic biocular alignment.
  • Another object of the present invention is an improved method and device for constructing an accurate, augmented reality, visual display without an intermediate image plane.
  • Another object of the present invention is an improved method and device for manufacturing a visual display independent of coherence and wavefront curvature constraints.
  • Another object of the present invention is an improved method and device for thin, wave-guided display.
  • Another object of the present invention is an improved method of presenting visual information.
  • Another object of the present invention is an improved method and device for an immersive, augmented reality, audience performance environment.
  • Another object of the present invention is an improved method and device to present visual information in compact form unaffected by an external environment.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and still further objects, features and advantages of the present invention will become apparent upon consideration of the following detailed disclosure of specific embodiments of the invention, especially when taken in conjunction with the accompanying drawings, wherein:
  • FIG. 1 presents a general view of binocular stereoscopic viewers.
  • FIGS. 2A, 2B & 2C present a general view of the wrist-wearable, stereoviewer embodiment of the present invention.
  • FIGS. 3A-D present a preferred embodiment having an articulated bridge.
  • FIGS. 4A-B present a preferred embodiment having a foldable temple.
  • FIGS. 5A-B present a preferred embodiment having repositionable elements.
  • FIGS. SB4 and V1 present an HMD embodiment.
  • FIGS. 6A-B present a preferred embodiment having wrappable temples.
  • FIGS. 7A-C present the layered construction of the viewer panels.
  • FIGS. 8A-C present a preferred embodiment having a variable, semi-transparent viewer panel.
  • FIG. 9A presents a preferred augmented, virtual reality embodiment with eye monitoring and response.
  • FIG. 9B references parent U.S. patent application Ser. No. 12/456,401 FIG. 5.
  • FIG. 10 presents a preferred embodiment having a pivotable frame with storage headband.
  • FIG. 11 presents a generalized viewer of an audience effects system.
  • FIG. 12 presents a scanning display embodiment.
  • FIG. 13 presents another embodiment of a scanning display embodiment.
  • FIG. 14 presents a variable focal length lens embodiment.
  • FIG. 15 presents a scanning stereo viewer embodiment using micro optic domains with a polarizing aperture.
  • FIG. 16 shows a perspective view of the linear array, continuous focal distance embodiment of the present invention.
  • FIG. 17 presents a side view of the linear array, continuous focal distance embodiment.
  • FIG. 18 presents a perspective view of the planar array, continuous focal distance, environmental embodiment of the present invention.
  • FIG. 19 shows a perspective view of a two photon activation embodiment of the present invention.
  • FIGS. 20 and 21 present perspective views of a plasma activation embodiment of the present invention.
  • FIG. 22 presents a sectional view of a scanning embodiment.
  • FIG. 23 presents a sectional view of a deflected, tethered light emitting element activation embodiment of the present invention.
  • FIG. 24 presents a perspective view of a three dimensional optic deflection of apparent light source embodiment of the present invention.
  • FIG. 25 presents an interneural motion embodiment of the present invention.
  • FIG. 26 presents a dynamic interocular display embodiment.
  • FIGS. 27 and 28 present a top view of a transfer reflector embodiment.
  • FIG. 29 presents a photonic induction of nerve transmission embodiment.
  • FIGS. 30 and 31 present preferred deformable mirror membrane embodiments of the present invention.
  • FIG. 32 presents a preferred embodiment having a focusable element.
  • FIG. 33 presents a preferred embodiment having a fiber optic element.
  • FIGS. 34 and 35 present a preferred embodiment having an offset reflector.
  • FIG. 36 presents a preferred embodiment having an active, augmented reality visor.
  • FIGS. 37-45 present preferred embodiments employing total internal reflection.
  • FIGS. 46 and 47 present preferred embodiments having thin-film optics.
  • FIGS. 48-50 present preferred embodiments of audience effects elements employed with the present invention.
  • FIG. 51 presents an immersive environment embodiment.
  • FIGS. 52A-54 present enhanced embodiments of the present invention.
  • FIGS. 55 and 56 present an enhanced data structure embodiment of the present invention.
  • FIGS. 57A-C present preferred saccadic enhancement embodiments of the present invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • The present invention relates to improved methods and constructions to achieve a complex visual display environment which may include dynamic and precise focal length for each pixel of each frame of a visual display and the ability to present a comfortable image, two or three-dimensional, virtual or augmented, to a single or multiple observers which may be participating in an audience performance.
  • It discloses the general elements of the present invention which may include but is not limited to:
      • 1. A personal audio-visual device which may be wearable such as a handheld wand or communications device; body or clothing-affixed device such as a watch, wrist or armband, earring, necklace, medallion or other jewelry; a head-mounted device such as a helmet or glasses; or a dispersible device such as a ball, balloon, Frisbee, etc., all of which may be responsive to or interactive to user state and input, or optionally external control,
      • 2. A communications control device which may control one or more personal audio-visual devices, and
      • 3. An augmented-environment device which may include a real-time display or projection.
  • One preferred embodiment of the personal device enables a full accommodative, stereoscopic, augmented reality, audio-visual experience with convenient portability and use. This preferred embodiment provides a complete personal audio-visual experience in the form of wearable eyeglasses or head-mounted environment (HME) which may be removably-affixed as a wristband or watch. The present invention allows but does not require that the pixels or light sources be transparent, or that any light is actually transmitted through them. This approach has many technological, performance and manufacturing advantages including the ability to present discrete, collinear pixels of different focal distances with improved acuity within the period of visual integration. FIG. 1 presents prior art with elements locally renumbered from allowed, parent U.S. patent application Ser. No. 12/456,401 FIG. 1, disclosing a generalized stereo viewing system which “presents the image of an object 32 taken by two cameras 48 and 48′, displaced by a small distance equivalent of the separation of a viewers's eyes, to tv- type viewer panels 122 and 124, which corresponds to the view that would be seen by each eye. Commonly, the viewer panels 122 and 124 are mounted on an eyeglass or goggle-type frame 100.”
  • FIG. 2A presents a preferred convertible embodiment of the present invention having a convertible frame 100 in the form of eyeglasses or goggles, a first display region 102, a second display region 104, a first arm 106, a second arm 108, a first ear assemblage 110, a second ear assemblage 112, a first camera 116, a second camera 118, a first display 122, a second display 124.
  • In a first configuration, the present invention functions as a head-mounted, augmented reality display having some or all of the features common to the field or disclosed in my previous or pending patent applications included by reference.
  • The displays 122, 124 may be any type including but not limited to those disclosed herein or in my previously-filed applications, miniature LCD, LED, OLED, QD, fluorescent, electroluminescent, transformed, array, scanned, static or dynamic focus, one or two-sided, of occlusive, transparent, or variable optics, engaging a limited or comprehensive field of view.
  • These may include the display of images to the user's eyes at a static or dynamic focal distance, accompanying audio, one or more affixed or mobile cameras capturing the user's environment, one or more cameras capturing the user's eye position, motion and state which may be incorporated into the display, sensors monitoring geophysical position (GPS), motion (acceleration, gyro), physiological state (temp, heart beat, brain waves, etc.), environmental state (temperature, humidity, pressure, audio, air quality, insolation, ambient illumination, scents, etc.).
  • The convertible frame 100 may have flexible, elastic, hinged regions 114 on the bridge, and on the arms 106, 108 which allow the frame 100 to be draped or wrapped about a curved object, including an arm, and fastened. Alternatively, the frame 100 material may be flexible. Examples include but are not limited to: elastomers, woven fabrics, linked chains, pinned segments, or other configurations.
  • FIG. 2B presents a posterior view of the second configuration of the present invention where the convertible frame 100 is folded about the wrist 101 of the user having the second display 124 presenting on the posterior wrist and the first display 122 (not shown) presenting on the anterior wrist.
  • FIG. 2C presents a top view of the second configuration of the present invention where the convertible frame 100 is folded about the wrist 101 of the user having the second outer display 124 presenting on the posterior wrist and the first outer display 122 presenting on the anterior wrist.
  • FIG. 3A presents a front view of the convertible frame 100 having flexible, elastic, hinged regions 114 on the bridge and on the arms 106 , 108 which allow the frame 100 to be draped or wrapped about a curved object, including an arm, and fastened.
  • FIG. 3B presents a front view of the convertible frame 100 having a flexible, elastic, hinged pivot region 130 on the bridge and on the arms 106 , 108 which allow the frame 100 to be folded upon itself, the arms 106 , 108 forming a strap-like fastener to secure the frame 100 to the wrist.
  • FIG. 3C presents a front view of the folded convertible frame 100 of FIG. 5, having a flexible, elastic, hinged pivot region 130 on the bridge and on the arms 106 , 108 which allow the frame 100 to be folded upon itself, the arms 106 , 108 forming a strap-like fastener to secure the frame 100 to the wrist.
  • FIG. 3D presents an end view of the folded convertible frame 100 of FIG. 5, having a flexible, elastic, hinged pivot region 130 on the bridge and on the arms 106 , 108 which allow the frame 100 to be folded upon itself, the arms 106 , 108 forming a strap-like fastener to secure the frame 100 to the wrist, and showing the first outer display 122 presenting upward.
  • FIG. 4A presents a front unfolded view of the convertible frame 100 and arm having a first fold region 106 and a second fold region 106 ′ enabling the arm to be folded upon itself.
  • FIG. 4B presents a front folded view of the convertible frame 100 and arm having a first fold region 106 and a second fold region 106 ′ enabling the arm to be folded upon itself.
  • FIG. 5A presents an expanded top view of the present invention where the display 122, 124 may be separated or overlapped to convert to a wrist configuration.
  • FIG. 5B presents a wrist-converted top view of the present invention where the displays 122 , 124 are overlapped to convert to a wrist configuration. The ‘hinge’ region 114 may have a connector/loop structure 114 ′ to which the opposing arm/strap 106 , 108 is removably affixed in the wrist configuration.
  • FIG. 6A presents an expanded front view of the present invention where the ear arms 128 are in the form of wire and may be looped around and over the alternate display 122 , 124 to convert to a wrist configuration.
  • FIG. 6B presents a front wrist view of the present invention where the ear arms 128 are in the form of wire and may be looped around and over the alternate display 122 , 124 to convert to a wrist configuration.
  • Other constructions may include but are not limited to the following features:
  • Construction
      • 1. folding
        • flat
        • fixed curvature
        • flexible
      • 2. sliding
        • two pieces
        • internal
        • slide and twist
      • 3. layers
        • outer protective
        • outer reflective
        • outer occlusive
        • outer display
        • mid-occlusive
        • inner display
        • inner protective
  • Display
      • 1. 2D screen
      • 2. Composite Screen
        • Point
          • 1. R-screen
        • Linear Array
          • 1. Continuous
          • 2. Discontinuous
          • 3. Sparse
        • Indirect
          • 1. Fluors
          • 2. QD
          • 3. Other
        • R-Screen
          • 1. Passive
            • a. Optics—TIR
          • 2. Active
            • a. Micro-mirrors
            • b. Other nano-mechanical optical elements
            • c. Refractive
            •  i. LC
            •  ii. Electrophoretic
            •  iii. Acousto
  • With Advanced Visualization
      • 1. Augmented Reality—augmented reality refers to the methods of modifying the presentation to the user based on situational factors. For example, a location/direction sensor would enable the display to present the names of stores as the user views a scene; a directional camera may be used to identify persons and display their names and other information. The examples and variations are well-documented.
      • 2. Accommodation—accommodation refers to the focal distance of the lens of the eye. In most HMDs and digital eyepieces (video camera, for example) the apparent distance of the display is uniform and static. A common distance is 12 feet. Many digital eyepieces permit a small range of focal distance adjustment to accommodate natural aberrations in the human lens.
  • In the present invention, accommodation may also be dynamic, and change according to situational, user-defined, or programmed factors.
  • Personal displays include a broad range of configurations from screens embedded in contact lenses to handheld tablets. Many exhibit unique advantages and limitations on their utility based on factors including but not limited to weight, cost, resolution, performance, durability and suitability. For example, a large, high resolution tablet may be useful for detailed graphics, but cumbersome as a real-time navigation aid while skiing, where a micro-display on transparent goggles would be suitable.
  • Present technology has not produced a core display technology applicable to many different displays, nor has a versatile construction been invented. The present invention solves both problems and discloses a versatile display technology together with constructions of significant utility.
  • FIGS. 7A and 7B present a preferred embodiment of the display 124 of the present invention having one or more layers including but not limited to: outer protective and outer display 132 , occlusion layer 134 , inner display 136 and inner protective layer.
  • FIG. 7C presents a preferred embodiment of the display 124 of the present invention having one or more layers including but not limited to: outer protective and outer reflective layer 138 , display layer 136 , inner reflection display 140 .
  • FIG. 8A presents a sparse emitter embodiment of the display layer 132 having emitters 142 , 142 ′, 142 ″ spaced in a sparse array with transparent spaces in between.
  • FIG. 8B presents a sparse emitter embodiment of the display layer 132 having emitters 142 , 142 ′, 142 ″ spaced in a sparse array with transparent spaces in between, with the emitters 142 visible to an external observer 12 .
  • FIG. 8C presents a sparse emitter embodiment of the display layer 132 having emitters 142 , 142 ′, 142 ″ spaced in a sparse array with transparent spaces in between, with the emitters 142 visible to an internal observer 10 . Various prismatic constructions may be employed to present a larger apparent source of emission of the emitter 142 .
  • Layers:
      • 1. outer occlusive
      • 2. outer display
      • 3. mid-occlusive
      • 4. inner display
      • 5. inner protective
  • It may be noted that the display layers may be edge-illuminated from a position with the relevant display illumination exiting the substrate by any fixed or dynamic optical means including but not limited to optical path prisms, holograms, TIR violations, acousto or electro-optic waveguides such as leaky or tunneling mode transition described in Sanford U.S. Pat. No. 4,791,388 and more contemporary references.
  • FIG. 9 presents a schematic view of an augmented reality embodiment of the present invention having the user's eye 10 , eye lens 14 , optical path combiner 150 , external object 32 , display 136 and focal-length lens 152 . In a preferred embodiment, the image projected from the display 136 combines in the optical path combiner 150 with the external scene 32 . The user's lens 14 will normally accommodate to focus on the object 32 of attention in its field of view. The display focal-length lens 152 may be programmed to follow the user's accommodated eye lens 14 focal length by monitoring the user's vision. This is commonly done in eye examinations and may be accomplished by a monitoring eye camera 160 in the display optical path. Monitoring may be enhanced by projecting a pattern, in visible or non-visible light, on the eye, generally from the display. Near IR is a common wavelength used for this purpose. The preferred embodiment of the present invention presents a single, dynamic display focal length by means of the single dynamic lens 152 .
  • This embodiment may present a programmed image and display focal length based on the design of the programmer. This image may be based on the user's eye position, eye focal length or other design factors. For example, an external camera may identify an object of interest at a distant location while the observer is focused on a near object. By programming a distant focal length together with other visual cues projected from the display, the user may be directed to the object of interest faster and with the ability to more accurately evaluate critical factors related to the object. Programmed Global View: an entertainment, educational or other presentation may be prepared to guide the user through a real, virtual or synthetic (fantasy) visual environment by adding focal distance parameters to the scene/timeline of the presentation.
  • Predictive Global View: complex, predictive algorithms may be employed to program the projected display focal distance based on the user's actions, physiology, progression, program or other factors. A wide range of tests, evaluations and educational tools may be enhanced by this method.
  • Responsive Global View: the display focal distance may be based on the user's eye position, eye focal length, or other complex factors responsive to the global view, including but not limited to enhancing the scene by adding realistic brightness to dimly illuminated parts of the view.
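A minimal sketch of the responsive case, in which the display lens 152 tracks the measured accommodation of the eye lens 14 (the gain, slew limit and diopter values are assumptions; the measurement itself would come from the eye camera 160):

```python
def follow_accommodation(measured_diopters, lens_diopters,
                         gain=0.5, slew_limit=0.25):
    """One control step: move the display focal-length lens a fraction
    of the way toward the eye's measured accommodation, with a
    per-frame slew limit to avoid visible focus jumps."""
    error = measured_diopters - lens_diopters
    step = max(-slew_limit, min(slew_limit, gain * error))
    return lens_diopters + step

# e.g. the eye refocuses from ~12 ft (0.27 D) to ~2 ft (1.64 D)
lens = 0.27
for _ in range(10):
    lens = follow_accommodation(1.64, lens)
    print(round(lens, 3))
```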
  • With integrated physiological monitoring (Ophthalmology and Optometry have developed a large number of tests and evaluations which may be incorporated in the physiologic monitoring of the user. The standard text, Optometry: Science, Techniques and Clinical Management, 2e, 2009, is incorporated herein by reference.)
      • 1. Lens focus—focal length of the eye's lens
      • 2. Eye motion—saccades and other motions of the iris of the eye as well as blinking
      • 3. Morphological Delta—both external and internal (retinal morphology)
      • 4. Spectral Delta—distinct wavelengths may be monitored
        • Ambient
        • Spectral illuminated (including Raman)
        • Fluorescent Illuminated (including Raman)
        • OCT (optical coherence tomography)
          • 1. Global
          • 2. Scanned—the present invention lends itself to scanned OCT integrated in the display projection path.
      • 5. Doppler blood flow (pulse)
  • FIG. 9B references and retains the element numbering of parent U.S. Patent Application FIG. 5.
  • Wrist—HMD Additions
  • Wrist/Arm/Handheld Permanent base—A handheld device (smartphone) may be used as the base unit/power charger for the WristHMD/HD-HMD.
  • Wrist/Arm Permanent base—the wrist band may be permanent, incorporating the telephonic circuitry and charging power for a low-power, data-streamed HMD.
      • 1. The Wrist/Arm band may have a permanent screen
        • with a secondary, removable HMD appliance
      • 2. Alternatively, the removable HMD may hold the permanent screen
  • User Input
      • 1. The user input may be any known method, such as but not limited to touch switches and devices on the HMD frame; finger and hand gestures captured by gloves or by HMD-mounted detectors/cameras; or monitored eye movements.
      • 2. Innovative technologies may also be employed including, but not limited to:
        • Fingers—the movement of the fingers and gestures may be monitored by a ring, knuckle, wrist or arm band using optical variations, motion and/or vibration to differentiate the motion.
        • Optical Coherence Tomography (OCT) may be employed as a sensor.
      • 3. Eye—the activity and state of the eye and eyelids may be used as input gestures
        • Iris—position (fixation) activity may be utilized including but not limited to position, timing and direction of motion.
        • Eyelids—activity may be utilized including but not limited to position, timing and direction of motion.
        • OCT—Optical Coherence Tomography and other
      • 4. Electro/optical encephalography
      • 5. Face muscles
        • OCT on HP
  • Wrist/Ring Input Device—
  • On thumb/wrist—“sees” fingers, surface—records sounds (taps) and Position
  • On wrist—place on surface, sees fingers, records sounds
  • Hears tab with Bio-change
  • FIG. 10 presents an integrated headphones and HMD 100 embodiment where the HMD 100 may pivot about the ear and fold into a protective cover which is part of upper headband 120 connecting the individual ear speakers. The convertible frame 100 may have flexible, elastic, hinged regions 114 on the bridge, and on the arms 106, 108 which allow the frame 100 to be draped or wrapped about a curved object, including an arm, and fastened. Alternatively, the frame 100 material may be flexible. Examples include but are not limited to: elastomers, woven fabrics, linked chains, pinned segments, or other configurations.
  • Audience Effects
  • FIG. 11 presents the generalized elements of the performance display system 250 in a venue comprising an audience 252 with a plurality of audience members 252 ″ and a stage 254 . The venue may include any space ranging from the interior of an automobile, a living room, a dining hall, or nightclub to major event venues such as theatres, concert halls, football stadiums or outdoor festivals. Although the term audience unit or audience receiver unit 200 is used to describe both the simple and autostereoscopic-effects unit, it may be understood that the module may take any shape or be incorporated into any independent handheld, worn, or positioned effects device including but not limited to tickets, badges, buttons, globes, cylinders, signs, sashes, headdresses, jewelry, clothing, shields, panels and emblems affixed or held to a member of the audience or any object, moveable or stationary. The insert in FIG. 11 presents a front view of the present invention having an illuminated audience unit 200 with some or all of the elements of the audience unit of FIG. 11 , as described in my U.S. Pat. No. 8,194,118, incorporated herein by reference, having one or more light emitting elements 206 which may be referred to as light emitters or modulators, light modulator, light emitting/modulator elements, LEDs, light emitter or light array, a connecting member 214 , handle 212 and an active receiver 202 capable of receiving optical or acoustic signals. In operation, the show director at the control board 18 or instrument sends a sequence of commands, live or from a stored visual or audio program, over the performance system data network 14 to the projector/signal generator 100 which emits a precisely timed series of directional signals 106 , 106 ′, 106 ″ programmed to activate the audience units 200 at a precise location impacted by the directional signal 106 . In its simplest embodiment, the projector/signal generator 100 displays an invisible IR image 106 at a specific wavelength (880 nanometers, for example) on the audience, which causes the wavelength-specific audience unit communication receiver 202 to activate one or more light emitters or modulators 206 . The projector/signal generator 100 may also transmit a program sequence for later execution and display. Each audience unit may contain a unique encoded identifier entered during manufacture; at the time of purchase or distribution; or transmitted by the projection system to the audience at any time, including during the performance. The data protocol may include well-known communication protocols such as but not limited to IR RS-232, IRDA, Fiber Channel, Fiber Ethernet, etc. The projector/signal generator 100 , referred to hereafter principally as “projector 100 ”, may also project a visible light beam containing visual content as well as a data stream by modulating the frequency above the human visual system integration frequency of 30 Hz. It may be understood that the projector/signal generator 100 in its photonic form encompasses the simplest gobo projector as well as the most complex, integrated terahertz-modulated photonic signal generator and spatial light modulator.
  • The light emitting elements 206 may refer to any type of photonic source such as but not limited to incandescent, fluorescent, neon, electroluminescent, chemical, LED, laser, or quantum dot; or to combinations of light modulating elements such as but not limited to thin film LCDs, backlit or reflective, E*INK type reflective modulators, or chemical, photonic or electronic chromatic modulators. A camera system 300 may be employed to monitor the audience and/or audience units, and provide feedback for a number of manual or automated design, setup and operating procedures. The camera system may be incorporated into the projector/signal generator unit 100 . If not properly configured, the data signals 106 may interfere and degrade the rate and integrity of transmission. In order to synchronize the data projectors 100 , a time-code signal may be transmitted from the system control board 18 or a designated master controller 100 . Each data projector 100 may be programmed with a calculated offset from the time-code signal based on its distance from the ‘center of mass’ of the audience, the location of other controllers, the external environment, and other factors. A central timecode beacon 140 may transmit the time-code signal to each of the data projectors 100 by means including but not limited to photonic, acoustic, or RF signals.
  • A feedback system from the cameras 300 may be used to adjust the performance, including but not limited to projecting a fine pattern and adjusting the intensity of the data signal 106 until the appropriate resolution is achieved. The audience unit may employ an IR or other non-visible emitter for adjustment, diagnostic and other purposes. Various user input devices 216 including microphones, buttons, switches, motion detectors, gyroscopes, light detectors, cameras, GPS and other devices may be included.
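A hedged sketch of what an addressed audience-unit command might look like (the framing, fields and checksum are hypothetical; the disclosure specifies only that units carry unique identifiers and that standard protocols such as IRDA may be used):

```python
def make_packet(unit_id, command, payload):
    """Frame a command for one audience unit: start byte, 16-bit unit
    or group identifier, command byte, payload, and XOR checksum."""
    body = bytes([unit_id >> 8, unit_id & 0xFF, command]) + bytes(payload)
    checksum = 0
    for b in body:
        checksum ^= b
    return bytes([0x7E]) + body + bytes([checksum])

# Address unit 0x0102: hypothetical command 0x01 = set emitter color
pkt = make_packet(0x0102, 0x01, [255, 0, 0])
print(pkt.hex())
```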
  • FIG. 12 presents a cross section of the translocation reflector method with a lenticular-type screen. The components are an LEE array 1320 , a FOE array 1360 , a translocation reflector 1322 , an actuator 1330 , a counterweight 1332 , a position encoder 1340 and a screen 1350 . In operation, a section of the full view is presented on the LEE 1320 , focused by the FOE array 1360 , reflected by the translocation reflector 1322 and projected onto the screen 1350 . The screen may be of a fresnel, lenticular, stepped or holographic construction such as to present a focused image of the LEE 1320 to a viewer. A circular polarizing window 1360 may be placed between the observer and the screen to extinct external ambient light.
  • FIG. 13 (POLARIZED) presents a rotating polygon embodiment of the present invention. The system projects an image of the LEE 1510 by scanning a rotating reflective polygon 1520 and projecting the image onto a viewing screen or reflective micro-optic surface 1530 viewed by the observer 1540 . A circular polarizing aperture 1550 may be placed between the screen 1530 and the observer 1540 and the LEE 1510 output modulated to produce a range of elliptical polarization whereby the external ambient light is extincted while the image of the LEE remains visible. The LEE 1510 modulation may be used to control color and intensity as well. The LEE 1510 , although shown as a single row, may be constructed of multiple rows, thereby projecting either a 1D array of elements optically combined for increased brightness or intensity modulation, or a 2D array. As a 2D array with appropriate spacing between elements, the optical deflection angle may be reduced to the spacing arc. This technique in combination may be used for large stereoscopic, autostereoscopic and monoscopic projection systems.
  • FIG. 14 presents a perspective view of one embodiment of a single element of the focal distance optical element. The components are the LEE 2020, a piezoelectric cylinder 2030 and a variable optical element 2040. In operation, an electrical charge applied to the piezoelectric cylinder 2030 varies the compression of the enclosed optical material 2040 resulting in a change in the focal length of the optical element. To a viewer, the LEE will appear to vary in distance when the eye adjusts to the minimum focus. This approach requires a dark region 2060 adjacent to the focusable element for single elements, or an image edge. Focal length adjustment may also be effected by electrostatic reflective membrane arrays, gradient index liquid crystal arrays, SLMs, diffractive elements, multiple internal reflections and other known technologies.
  • FIG. 15 presents a scanning stereo viewer using micro optic domains with a polarizing aperture. Similar to the embodiment of FIG. 21 , an image is projected onto a screen 2220 from scanner 2230 or 2232 and viewed by observer 2210 . A transparent polarizer window 2250 is interposed between the observer 2210 and the screen 2220 . The screen may be constructed of reflective micro domains which focus the image to one observer or disperse the image for multiple observers. The beams of light from the scanner 2230 are either unpolarized or the polarization is modulated to control intensity or color.
  • FIG. 16 shows a perspective view of the linear array, continuous focal distance embodiment of the present invention where the component parts of the light source and scanning assembly A 100 are shown including an image computer A 90 , a linear array of light sources A 110 , and a two-axis scanning mirror A 120 . In operation, the computer A 90 communicates with the scanning mirror A 120 through an open-loop drive system, closed-loop position feedback or other known positioning system and illuminates those light sources A 110 which correspond to the image points A 310 to be displayed. The divergent beams from each light source A 110 may be focused by the eye A 24 to correspond to the appropriate object distance.
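A minimal sketch of that synchronization (an open-loop angle-to-column mapping is assumed; a real system would use the position feedback named above):

```python
def column_for_angle(angle_deg, fov_deg, image_width):
    """Map the scanner's instantaneous horizontal angle to the image
    column the linear array should display at that moment."""
    frac = (angle_deg + fov_deg / 2) / fov_deg   # normalize to [0, 1]
    return max(0, min(image_width - 1, int(frac * image_width)))

def drive_array(image_columns, angle_deg, fov_deg):
    """Return the per-element intensities (one full image column) for
    the linear light-source array at this scan angle."""
    return image_columns[column_for_angle(angle_deg, fov_deg, len(image_columns))]

# e.g. a 40-degree horizontal field and a 640-column image
columns = [[0] * 480 for _ in range(640)]      # placeholder image data
print(len(drive_array(columns, -5.0, 40.0)))   # 480 element intensities
```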
  • While the linear array of light sources A110 is shown as an array of light emitters such as LEDs (light emitting diodes) which are driven by an image computer A90 through circuits not shown, alternative light sources may be employed. Examples of such alternatives include electronically, optically or mechanically activated emitters, shutters, reflectors, and beam modulators. Specifically, an FLCD shutter array as shown in Fig., a fluorescent or two-photon emitter as described by Elizabeth Dowling, or a mechanical reflector such as the Texas Instruments DMD device may be used.
  • In all optical systems the axial image or zero-order view may be blocked and the image formed from the divergent beams from the emitter.
  • FIG. 17 shows a perspective view of the 2D planar array, continuous focal distance embodiment of the present invention where a two-dimensional matrix of light sources A110, A110′ produces the image beams A304. Although a multiplicity of 2D arrays A110 may be used to produce a full 3D matrix display, a preferred embodiment combines the 2D array with a scanning mechanism A120 to create the full image.
  • FIG. 18 shows a side view of the planar array, continuous focal distance embodiment of the present invention applied to an autostereoscopic display where the light source A110 and scanning assembly A120 project the beams towards the screen A200 and then to the observer's eye A24. It may be understood that the scanning assembly A120, projection optics and screen A200 may include embodiments of my previously filed and co-pending patent applications for autostereoscopic displays, thereby incorporating the present invention in the function of the light source and focal distance control.
  • FIG. 19 shows a perspective view of a two-photon activation embodiment of the present invention. Over the past fifty years, researchers have developed a number of techniques for the photo-activation of light emitters. In recent years, Elizabeth Dowling of Stanford University has perfected a technique using a two-photon activation method. This approach may be usefully employed as a light emitter in the present invention.
  • FIG. 20 shows a perspective view of a plasma or floating emitter activation embodiment of the present invention where a defined light emitter region A110 is displaced in space and activated under the control of the image computer A90, the displacement field control structures A150 and the activation signal A154. The output beam A340 is structured by output optics A410.
  • FIG. 21 shows a perspective view of the reflector or optically activated emitter embodiment of the present invention where a defined light emitter region A110 is displaced in space and activated under the control of the image computer A90, the displacement field control structures A150 and the activation signal A154. The output beam A340 is structured by output optics A410.
  • FIG. 22 shows a side view of the angled reflective planar array, continuous focal distance embodiment of the present invention where the light source A110 and scanning assembly A120 project the beam towards the screen A200 and then to the observer's eye A24. Specifically, a light source A102 and reflector A104 illuminate an array A110, A110′, A110″, shown as a section of a planar array, which provides the depth function for a multiplicity of image pixels. A ray A304 from the appropriate pixel A110, corresponding to the depth function of the pixel, is reflected to the imaging optics A410, the scanning optics A120 shown as a rotating mirror, and a reflective HOE optical element A410′ which imparts the angular divergence required to present the proper cone of rays to the HOE augmented reality screen A200 and then to the observer's eye A24.
  • FIG. 23 shows a side view of an improved aberration-free light source and scanning assembly A10 where a light source A110 affixed to a movable member A400, itself affixed to a point on the plane of the projection optics A410, is scanned, and the output beam is emitted about a path diverging generally along the movable member A400.
  • The light source A110 and movable member A400 may be chemically, electrodynamically, mechanically (physical, piezo, acousto), or optically displaced in a resonant or pixel-determined fashion. Multiple light sources A110 may be affixed to the movable member A400 with intervening non-emitting regions, thus reducing the required displacement. The movable member may be cyclically or predeterminably lengthened and shortened to impart a variable focal length. A multiplicity of movable members may be employed. The electronic circuits, which may be formed from transparent conductive films, are not shown. This approach may be used in low-cost consumer and toy applications.
  • The present invention optimizes the current performance/cost parameters of commercially available processes. Contemporary, medium-cost, high-speed light sources, either emitters or shutters, together with their associated electronics, have digital modulation frequencies in the range of 10-100 MHz. A full-field display should have at least 2000×1000 pixels of resolution (2 megapixels) and a refresh rate of 72 Hz. The resultant data rate for a single-plane, single-emitter light source is 144 MHz. When 24-bit color depth is added, the digital modulation frequency must be increased by at least a factor of 8. Adding a focal depth of 10,000 points, a modulation frequency of over 10 terahertz is required. Thus it is apparent that a simpler, more cost-effective approach is an increase in the number of light sources. The present invention provides a direct solution to this problem.
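By way of illustration only, the arithmetic above may be checked with a minimal Python sketch (variable names are illustrative; the 2-megapixel, 72 Hz, factor-of-8 color and 10,000-point depth figures are those of the paragraph above):

    # Back-of-the-envelope serial modulation frequency for a single-emitter source.
    pixels = 2000 * 1000        # full-field resolution: 2 megapixels
    refresh_hz = 72             # visual refresh rate
    color_factor = 8            # 24-bit color adds at least a factor of 8
    focal_points = 10_000       # points of focal depth along the axis

    monochrome_hz = pixels * refresh_hz          # 144 MHz
    color_hz = monochrome_hz * color_factor      # ~1.15 GHz
    full_hz = color_hz * focal_points            # ~11.5 THz, i.e. over 10 THz

    print(f"{monochrome_hz/1e6:.0f} MHz, {color_hz/1e9:.2f} GHz, {full_hz/1e12:.1f} THz")

The terahertz-scale result is what motivates multiplying the number of light sources rather than the per-emitter modulation rate.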
  • Section Two
  • FIG. 24—Multiple Axis—presents a perspective view of a preferred embodiment of the present invention wherein the deformable membrane incorporates a pattern permitting an increased range of redirection of the incident radiation. The structure comprises a deformable membrane N100 suspended above or between one or more programmable electrodes N102, which may be transparent. In one configuration, the incident beam N104 is reflected from the membrane N100 towards the visor mirror 230 and observer's eye 200. In operation, the control electronics N110 applies a variable charge to electrodes N102, causing a localized deformation N114 of membrane N100. The amplitude and timing of the applied charge may cause the localized deformation N114 to travel about membrane N100 in a vector or raster pattern. The deformation of membrane N100 is synchronized with the modulation of LEE 220, causing a specific image pixel to be illuminated. The pattern may simultaneously control the spatial distribution and the wavefront of the beam, creating the impression of a variable focal distance with spectral and 3rd and 5th order optical aberrations corrected. The membrane N100 and structure may be mounted upon a translocatable, movable or resonant structure to further enhance its range and applications.
  • The membrane may have lateral or other incisions/discontinuities for a linear translocation.
  • Heterogeneous chemical and mechanical domains in the membrane may be included and individually activated by photonic, mechanical, magnetic or electronic means.
  • FIG. 25—Interneural Motion Processing—presents a preferred embodiment of pixel pattern N2100 containing multiple pixels N2102 which are illuminated simultaneously or at discrete precalculated intervals. While the human retina captures photons in microseconds, processing by the retinal neural system imparts a time course which acts to enhance or inhibit adjacent biological vision pathways. A single scanned photon may, when illuminated at a certain frequency, induce the cognitive visual impression of motion in the opposite direction. At the image level, this is observed in the spoked wagon wheels of older Western films. At the biological level, the result may be confusing and ambiguous, thereby substantially reducing a fighter pilot's response time, for example.
  • Many image processing systems compute the next image well in advance of the 72-hertz visual refresh rate and may extrapolate images to include the intensification of certain pixels N2104 or the reduction of other pixels N2106. When correlated to visual field speed, this enhances the observer's response. Reference: USAF Advanced Flight Cockpit Study, MIT, 1997.
  • FIG. 26—Interocular and Retinal Distance, Shape and Range of Movement—presents a preferred embodiment incorporating dynamic interocular distance and orientation control. One method of alignment and orientation of immersive displays employs one or more test patterns which provide the observer an alignment or adjustment reference. Standard tests for image position, focal distance and stereo alignment may be incorporated in a manner similar to adjusting a pair of binoculars or a stereomicroscope. Additional tests which incorporate dynamic motion and require hand-eye coordination may be included.
  • In the present invention, two complementary improvements are employed which permit dynamic adjustment. The first measures the range of motion of each eye by recording the limits of iris movement. The second measures the range of retinal image focus and position by projecting a visible or invisible test image and recording the dynamic changes of eye position and focus.
  • This is accomplished by monitoring the eye state by means of a reflected beam N7120 and a reflected image detector N7112, which may range from a single photodiode to a full-color, high-speed camera. An incident beam 170, which may be visible or invisible, is reflected from the iris N7200, the retina N7202, or the eye lens N7204. Spectrographic analysis may be used to identify the source of the reflected beam.
  • The control computer 160 receives the data from the image detector N7112 and other external systems including the interocular distance which is either fixed or includes a known measuring detector (not shown). This provides sufficient information for the calculation of the orthogonal visual axis of the immersive display relative to the observer and permits an adjustment of the display image including apparent focal distance, stereo image disparity, and visual axis orientation.
  • This dynamic adjustment may be a useful convenience for all users and of crucial importance to fighter pilots and others in environments where high stresses may cause a physical displacement or distortion of the display or body morphology. A test example for dynamic control would measure the retinal shape and curvature by monitoring the focus of a scanned point in a single-photodiode detector system, or the width and curvature of a line with a two-dimensional detector array. Dynamic monitoring of the retina would correct for G-forces and other anomalies during high-speed turns by fighter pilots and astronauts.
  • Additional external eye state systems, such as those manufactured by ISCAN, Inc., may be employed and the data integrated by the control computer 160.
  • FIG. 27—Distant Focus—presents a preferred embodiment wherein a fixed focal length is set by multiple horizontal elements which are vertically scanned. Other orientations may be employed. Alternatively, as shown in FIG. 4A, one or more emitters 220 may be used in a scanning system. In this figure, the emitter may include the other optical emitter group components, including variable focal length. The left eye 200L observes a virtual image at point N4102. The right eye 200R observes an image set at infinity. While the relative position of point N4102 in relation to the left eye 200L is important, it is less so in the infinite focal length example. With all image points being compressed into the infinite plane, image object occlusion disappears. An object viewed only through an aperture would still be subject to minor occlusion at a global scale.
  • The variable focal length faculty of the present invention may be exploited to permit a global or sectional virtual screen at a fixed focal length—with or without correct stereoscopic image disparity. This technique may be used for medical and performance diagnostics, data compression and reduction, as well as all other purposes. A virtual screen set beyond the normal accommodative limits of the human eye (approximately 400 meters through infinity) may minimize the impact of incorrect stereoscopic interocular alignment. Under these circumstances, the projected cone of rays emanating from each pixel need not illuminate the entire pupil travel domain but may subtend the solid angle from the general region of the image object.
  • FIG. 28 shows a representative example where an intermediate transfer reflector (or transmitter) N4110 is employed. The beam 170 exits the optional focal length control 1620, if employed, and is reflected (or transmitted) by the intermediate transfer reflector (transmitter) N4010 towards the visor reflector 230 and to the observer 200. The reflectors may be positioned in any location or combination including but not limited to above and below the eye plane, across the field of vision, at the periphery or the center.
  • FIG. 29—Induction of Vision—The use of photonic induction of nerve transmission has been disclosed by the author in previous U.S. patent applications and papers. The preferred embodiment of the present invention discloses a method and apparatus for the direct photonic innervation of the human visual system.
  • It has been shown (Salzburg, 1979, this inventor and others) that the state of a neuron may be monitored optically. The reverse process is also true. The preferred embodiment incorporates the disclosed optical system in a novel way. A retinal implant N5100 receives the beam 170 which causes a localized nerve depolarization N5102 sending a signal N5104 to a brain image location N5106. The user may then identify the location in the viewer's reference (imaginary) which may or may not correspond to the virtual spatial source of the beam N5108.
  • The difference is received and computed by the processing computer 160 to generate a viewer's lookup table which permits a mosaic image to provide a correct view for the individual viewer's cognitive vision.
  • The retinal implant N5100 is the subject of the inventor's previous and pending applications and papers. The process may be used on sensory, motor and aural nerves as well, where the processing computer 160 receives instructions from the user's biological processes (Solomon, 1979) or other control systems and generates a mosaic image to activate the implant N5100.
  • FIG. 30—Variable Membrane Tension—The use of variable-shape reflective and transmissive materials, such as reflective membranes, transmissive liquid lenses, and materials wherein a localized change in refractive index is induced, for beam forming and scanning is well known. In a preferred embodiment of the present invention, these materials are utilized to vary the focal length and beam direction in a novel construction, using both integrated and multiple elements.
  • In FIG. 30, an elongated concave membrane N6100 with multiple electrodes N6102 is shown. The membrane N6100 is shown connected at the corners, but any configuration may be used. The membrane may be held flat in tension or designed with a distinct neutral shape.
  • FIG. 31 shows the operation wherein a shaped portion N6104 of a convex membrane N6100 oscillates between alternative positions N6104 and N6106 during a view cycle of approximately 72 hertz. The beam 170 is reflected from the surface. During each cycle the membrane undergoes a multiplicity of subtle changes which reflect the integration of the field forces generated between the multiple electrodes N6102 and the membrane N6100. These changes are controlled by the processing computer 160 and incorporate the focal length and beam direction information.
  • It is understood that the membrane may represent the surface of deformable or refractive index variable, transmissive material using transparent or reflective electrodes at surface N6102.
  • The use of deformable membrane mirrors as a method for controlling the beam direction, the focal length, the modulation of intensity and chromaticity, and the correction of errors has been the subject of extensive research. In Applied Optics, Vol. 31, No. 20, p. 3987, a general equation for membrane deformation in electrostatic systems as a function of diameter and membrane tension is given. It is shown that deformation varies as the square of the pixel diameter [a] or voltage [V], and is inversely proportional to the tension [T]. In many applications where the invention is proximal to the human eye, increasing the pixel diameter or the voltage is impractical. Consequently, dynamic changes in membrane tension offer an acceptable method for variation. Variable membranes utilizing known mechanical, photonic, acoustic and magnetic deformation may be employed.
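In the bracketed notation above, the cited relationship reduces to the proportionality below (a sketch only; the constant factors and electrode-gap dependence of the full equation in the reference are omitted):

    d \propto \frac{V^{2}\, a^{2}}{T}

where d is the membrane deformation, a the pixel diameter, V the applied voltage and T the membrane tension. With a and V constrained near the eye, T remains the practical control variable, as stated above.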
  • FIG. 32 shows the preferred embodiment as disclosed in related government proposals wherein the display system is comprised of a processing computer 160 which coordinates the illumination of LEEs 220, the modulation of the display-beam integrated translocation and focal length component N7110, and the eye state feedback component N7112. In operation, the light emitted from LEEs 220 is combined in the optical waveguide 1050 and directed as a discrete beam 170 to the translocation and focal length component N7110. The beam 170 is directed and focused towards the beam splitter N7114, an optional conditioning optic 228 which may be positioned at any point between the exit aperture of the optical waveguide 1050 and the visor reflector 230, and the visor reflector 230. The beam 170 is then directed to the viewer's eye 200, presenting a replica beam of that which would have been produced by a real point N7118 on a real object 100.
  • Under normal illumination, a real point N7118 would generate a cone of light whose virtual representation is beams 170 and 171. The observer will perceive the object point N7118 as long as image beams 170 or 171 enter the observer's iris N7200 at a viewable angle.
  • A reflected beam N7120 is recorded by the eye state feedback component N7112, which incorporates a detector and conditioning optic N7122 which may range from a single photodiode to a complex, high-speed, full-color camera. Data collected by the eye state component N7112 may be received and analyzed by the processing computer 160.
  • The preferred embodiment of the present invention may incorporate a membrane structure which dynamically and reversibly changes tension in response to applied field, charge density and photonic irradiation.
  • FIG. 33—Fiber optic transfer of emitter aperture—presents a preferred embodiment wherein the emitter and combiner exit aperture N8102, N8102A is transferred by means of an optical waveguide N8104 to the focal distance optical element N7110 or projection optics 228. Various shapes of waveguides including micro-optical elements may be employed.
  • It may be understood that the present invention may be applied to alternative constructions, orientations, spacing, and shapes including but not limited to horizontal, oblique, curved or discontinuous arrays and scans.
  • In the present invention, the intensity of the light source may vary during a cycle of at most 8 periods, weighted by the binary increments 1, 2, 4, 8 . . . . Each pixel is illuminated for 0 to 8 of these periods, resulting in intensities of 0-255 and an individual pixel density increase by a factor of 4. The base-two series may be expanded to any power.
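A minimal Python sketch of this binary-weighted scheme (the helper name is hypothetical; eight periods weighted 1, 2, 4, ... 128 are assumed, as described above):

    def illumination_periods(intensity: int) -> list[int]:
        # Return the binary-weighted period lengths (1, 2, 4, ... 128)
        # during which a pixel is lit to reach an intensity of 0-255.
        assert 0 <= intensity <= 255
        return [1 << bit for bit in range(8) if intensity & (1 << bit)]

    # Intensity 100 = 4 + 32 + 64: the pixel is lit during those three periods.
    print(illumination_periods(100))   # [4, 32, 64]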
  • Additions: Composite Linear Array Having:
  • pixel LEE driven analog
  • pixel LEE driven digital
  • group pixel LEE driven analog
  • group pixel LEE driven digitally
  • additive
  • binary intensity sequence
  • with integrated color
  • with distinct color
  • vertical scan
  • horizontal
  • with TIR visor optic
  • color separation
  • image enhancement
  • by F/LCD shutter
  • by static directed prismatic
  • variable ambient occlusion
  • forming TIR layer
  • with separator from TIR
  • integrated eye-tracker
  • horizontal FDOE
  • vertical FDOE
  • With TIR Screen
  • With FDOE enabled
  • With FD corrected for TIR
  • with dynamic HOE visor optic
  • HMD with image generated in ear arm and optically bent by TIR at the arm-visor junction
  • HMD as Personal Communicator
  • HMD with Dynamically Focusable Transmissive External View Lens
  • FIGS. 34 and 35 show a preferred embodiment having a light source 10, a variable focal length element 12, a first scanning element 14, a first optical element 16 and a visor optical element 18. In operation, the light source 10 is focused by the focal length element 12 and scanned by the scanning element 14 onto the first optic 16 and then onto the visor optical element 18. The first optic 16 causes the virtual position of the light source to displace, which is expanded by the proper complementary visor optics as viewed by the observer. This embodiment expands the visual aperture of the HMD.
  • Complementary optics include various combinations of circular, parabolic, and elliptical forms. One example shown is a circular first optic 16 and an elliptic visor optic 18. Corrections for 1st and 3rd order aberrations may be introduced. Factors such as field of view, precision, scanning control and light source modulation may determine the optimum design for a given market.
  • Eye position feedback may be used to adjust the image for placement, registration with the external environment, or distortion.
  • The embodiment disclosed in FIG. 46 is described in large part in my earlier and pending applications, which integrate the scanning and first optic properties by displacing the reflective surface of the scanning element 14, which may be but is not limited to a resonant mirror, from the axis of rotation. This naturally occurs with a polygon scanner.
  • It may be noted that the observer aperture is determined in part by the relative size of the light source aperture (pixel) and the virtual position displacement caused by the scanning optics. Thus, a wide observer aperture dictates a small light source and a larger virtual displacement.
  • FIG. 36 shows a preferred embodiment having active, augmented-reality visor optics 28 comprising a reflective prismatic form 30, a liquid crystal medium 32 and an external substrate. In operation, the reflective forms 30a-c are sequentially switched from reflective to transmissive in coordination with the scanning of the light source 10. The ratio of reflective to transmissive periods determines the occlusion of the ambient environment. A second liquid crystal and substrate 40 may be employed to increase the occlusion of the ambient environment. The polarization optics for occlusion are not shown, but are commonly understood from sequential-shutter stereo glasses such as those used by IMAX or manufactured by Stereographics.
  • The active visor optics 28 complements and may be applied to the embodiments in my pending applications.
  • FIG. 37 shows a preferred embodiment applied to the Johnson art of total internal reflection, where the beam(s) 28 from one or more light sources 10, including but not limited to a linear array, are modified by a focal length element 12 and scanned by scanner 14, which may include a displacement reflector 16, into the Johnson prism 40. Within the prism 40, the beam is totally internally reflected one or more times between the exit face 46 and the prism face 48, finally exiting to the observer 20 when the angle of intersection with the exit face 46 passes the critical angle. A redirecting optical element 60, shown in FIG. 4, may be a diffuser, fresnel lens, micro-optic lens, HOE or other optical element depending on the use (HMD, NTE, heads-up display, screen) and position(s) of the observer(s).
  • FIG. 38A shows a second prism 42, proximal to but spaced from the first prism 40, which directs the light from the environment 100 through the first prism 40 to the observer(s) 20. Interposed between the prisms is a shutter system 50 (which may be, but is not limited to, liquid crystal shutters, electrophoretic, electro-optic, MEMS or other systems) configured and activated as rows, columns or both. In operation, the shutter acts to occlude the external environment 100 and increase the contrast of the projected ray 30. The shutter 50 may act in synchrony with the scanning system 14.
  • FIG. 38B shows that the shutter system 50 may be placed next to the second prism 42 with a space 52 between the shutter and the first prism 40. When used with LCD, electro-optic or acousto-optic materials, the change in the refractive index may alter the critical angle, the reflectivity, or the evanescent coupling, thereby increasing resolution and contrast. Alternately, the shutter system 50 may be spaced from both prisms.
  • FIG. 39 shows that the shutter system 50 may be applied to the observer face 50′ or the environment face 50.
  • FIG. 40 shows a redirecting optical element 60 which may be a diffuser, fresnel lens, micro-optic lens, HOE or other optical element depending on the use (HMD, NTE, heads-up display, screen) and position(s) of the observer(s).
  • FIG. 41 shows a method of manufacturing the linear array shutter system where the shutter material (LCD, for example) 50 is applied to a film which is placed on roll 208 and serially sliced 210 (etched by laser, for example).
  • FIG. 42 presents an active shutter reflector element 50′ which may function as the redirecting optics 1350 shown in FIG. 13 and FIG. 50. One or more shutter systems 50, 50′ may be incorporated with a redirecting optic 60 placed before or after. When the shutter system 50′ is between the observer and the prism exit face 46, it may additionally function to increase the resolution of the beam (shown as vertical lines, but not limited to any direction) by masking the adjacent regions 50a, b, c when opened in synchrony with the scan. The scans may be interlaced (alternating patterns).
  • FIG. 43 presents a linear accommodation embodiment where the LEE array 10 projects a fan-shaped beam 28, 28′, 28′″ from each pixel. When the fan beam 28 is perpendicular to the wedge axis of the TIR prism 40, the optical path lengths are symmetrical about the principal axis of the beam 28 and facilitate visual accommodation. Further, the necessary optics are simplified and the resolution of the system improved.
  • Chromatic control may be integrated or distinct, with separate LEEs for each color. While RGB combinations are well-known, additional colors including yellow, amber and purple may be included.
  • Accurate accommodation requires adjustment of the base level for objects in the system. Thus a virtual object designed to be at 1 meter will require focal distance adjustment as it moves along the wedge axis. A LUT may be provided in the software to introduce the correction.
  • The shutter element 50 may be an optically-active material such as liquid crystal (LC, FLC) or dyes, or displaceable elements such as micro-mirrors, electrophoretic spheres, piezo-vanes, etc. While the embodiment shown places the LEE and prism vertically, the orientation may be horizontal or oblique. The TIR pathway may begin in the ear arm of a pair of eyeglasses and bend around the corner. The visor, LEE and other components may be curved or conform to a unique shape.
  • FIG. 44 shows a perspective view of the combined system A10 having a light emitting element (LEE) array A110, scanning optics A120 in the form of a two-axis, reflective scanner, and a partially reflective, micro-optical element visor or screen A300. The LEE array A110 and scanning optics A120 are controlled by computer assembly A90. Common to all head mounted displays and well known to those skilled in the art are a power source such as a battery A90B and a data receiving channel such as a television broadcast decoder or other data link. These are usually incorporated in the computer assembly A90 and therefore not shown separately.
  • In operation, the light beams A200, A200′ (shown by single and double arrows respectively) from one of the LEE array elements A110x are cyclically scanned by the two-axis (vertical A120v and horizontal A120h) reflective scanner A120 across the partially reflective visor A300. The reflected beams A200, A200′ are directed towards the observer's eye A22 and, when in focus, converge to a single point on the retina A22′. As is common in augmented reality systems, the partially reflective screen A300 also permits the observer to view the external environment A304. The percentage of reflectivity is commonly controllable by a number of well-known technologies including but not limited to LCD shutters. By scanning the entire screen at 30 frames per second, a stable, full virtual image A310 over a wide field of view is presented.
  • To the observer, the apparent distance between oneself and a light emitting element A110′ is a function of the design focal length of the system, which includes the focal lengths incorporated in the visor A300, the scanner A120, and the LEE array A110. Commonly, HMDs are set at about 12 feet. In a preferred embodiment of the present invention, the LEE array A110 is co-axial with the principal optical axis of the system, and along this axis the distal LEE element A110″ is further away than the proximal LEE element A110′″. As a result, the LEE elements A110 will each focus at a different virtual distance A310, and they may be simultaneously illuminated.
  • In my earlier inventions disclosed in U.S. patent application Ser. No. 07/779,066 and subsequent applications, co-axial image points could only be presented sequentially in time. One of the significant advantages of the present invention is that a multiplicity of co-axial elements may be simultaneously illuminated. In defense, medical and other applications where multiple targets frequently align co-axially, the present invention increases image comprehension and accuracy while improving the reaction time.
  • FIG. 45 shows the present invention with a two-dimensional (7×3) light emitting element array A110D. It may be understood that the size of the array is generally 4096×1024 and the virtual image 640-4096×1024. Two advantages of this preferred embodiment are the simplification of the scanner A120 from two axes to one A120H, and a reduction in the required illumination frequency of the individual light emitting elements A110 for a given image resolution. While Fig. X2 shows the placement of the light source and scanning assembly A100 on the side of the head, any placement may be employed, including but not limited to on the top or bottom of the head, on the cockpit dashboard, or on a desktop.
  • Displays with visual accommodation produce an image by scanning a divergent beam from each image pixel directly into the field of view of the observer rather than forming a real image on a screen or surface, though some embodiments may not implement this attribute. In the natural environment, the divergent beam is generally circular in cross-section orthogonal to the principal axis between the center of the observer's eyelens and the originating image pixel. However, under certain natural and normal circumstances, including polarized reflections from the surface of a body of water, the beam may be elliptical or linear. Nonetheless, human visual accommodation is able to respond accurately.
  • A number of display configurations and technologies including those enabling visual accommodation may be enhanced, both in performance and manufacturability, by projecting a linear form of the divergent beam.
  • In my earlier patent applications including U.S. Pat. No. 7,799,066, I disclosed improvements to the well-known waveguide wedge taught in U.S. Pat. No. 4,212,048 by Donald Castleberry and U.S. Pat. No. 4,109,263 by Bruce Johnson of the Polaroid Corporation of Cambridge, Mass. Mr. Johnson was a co-employee of my colleague at MIT and Woods Hole, and his total internal reflection camera was often used as a visual display screen with a ground glass focusing element in place of the film. Both natural and projected images were used. My referenced enhancements have also been the subject of discussions with collaborators at MIT, Professors Stephen Benton and Cardinal Warde.
  • While the application of the Johnson Wedge was well known at MIT, its application was limited to compacting the optical path in connection with reprojection of the image from an often diffusive screen in the Johnson film plane. This is in part due to the substantially different optical path lengths and visual focal distances between the display exit pixels at the base and tip of the wedge.
  • This preferred embodiment of the present invention addresses the application of the Johnson Wedge to devices which maintain the optical focal distance to the LEE.
  • FIG. 46 presents the thin-film preferred embodiment of the present invention having a generally linear pixel source 1100, a thin-film waveguide 1112, an extraction/activation layer 1114, and an augmented occlusion layer 1110. In direct transmission operation, the vertically divergent beams 28, 28′ are emitted by the pixel source 1100 and coupled into the thin-film waveguide 1112, in which they travel by total internal reflection or evanescent wave, exiting at the proper exit position 1116 along the waveguide 1112 and directed to the observer's eye 20. The visual accommodation faculty of human vision will adjust the focal distance of the observer's eye in response to the vertical divergence of the beams, obviating the need for a horizontal divergence, which would demand a more complex optical waveguide for high-resolution transmission.
  • The extraction/activation layer 1114 and thin-film layer may be active or passive, reversed, and function by direct or reflected extraction/activation. As an active extraction layer 1114, the construction may include but is not limited to an array of liquid crystal (LC, FLC) vertical linear apertures timed with the transmission, wavelength conversion using quantum dots, two-photon conversion, embedded conversion elements, coupling of evanescent waves, optical coherence tuning and other known optical technologies.
  • In addition, as a passive extraction/activation layer, the construction may be of multiple planar layers with a thickness approaching evanescent wave dimensions and terminating or transitioning at a fixed distance. A 2000-layer system comprised of transmission and spacing sub-layers may be less than 2 millimeters in thickness (1-micron-thick layers).
  • FIG. 47 presents one of many locations for an integrated camera element 1150 which records the position, orientation, iris, and focal length of the observer's eye from the reflected beam—which may be the image-forming beam or an auxiliary beam, including but not limited to a non-visible wavelength such as infrared or ultraviolet.
  • Audience Effects Part
  • FIG. 48 presents an integrated visual display system, which may be applied broadly to my related inventions, having one or more fixed, movable, independent, handheld, suspended, and/or otherwise located devices 2000; one-, two- or three-dimensional visual emitters 2010; a wireless communications element 2012R; and a special effects module 2013 which may include audio, tactile, inertial, olfactory, or other effects, controlled by RF, acoustic or photonic devices 2012 from a control board or computer 2014.
  • In operation using infrared communication, the photonic control devices 2012 may be static or moving sources and spots having static, mechanically, optically or electronically generated patterns, including but not limited to omnidirectional sources, regional fixed IR spotlights, moving gobo-patterned spots, digital micro-mirror device (DMD) electromechanically controlled patterning, LCD or other electronic patterning devices. Multiple overlapping control devices 2012L, 2012C, 2012R may be used to provide full data signal coverage, and the specific patterns may be adjusted to present a single seamless data pattern of controlled intensity, including but not limited to the methods employed with visible light projectors.
  • In operation, a carrier frequency 2024, such as 38 kHz or 450 kHz, may be imposed under the data signal. When multiple control devices 2012 are employed, the carrier frequency may be synchronized electronically or optically, including by a wireless master carrier frequency synch signal 2020 and corresponding receivers 2022.
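A minimal sketch of imposing such a carrier under a binary data signal (Python/NumPy; on-off keying is assumed, the 38 kHz carrier is from the text, and the bit and sample rates are illustrative):

    import numpy as np

    def modulate(bits, carrier_hz=38_000, bit_rate=1_000, sample_hz=1_000_000):
        # On-off key a bit stream onto a square-wave carrier: the carrier
        # runs during '1' bits and is silent during '0' bits.
        samples_per_bit = sample_hz // bit_rate
        t = np.arange(len(bits) * samples_per_bit) / sample_hz
        carrier = ((t * carrier_hz) % 1.0 < 0.5).astype(float)
        gate = np.repeat(np.asarray(bits, dtype=float), samples_per_bit)
        return carrier * gate

    signal = modulate([1, 0, 1, 1, 0])   # 5 ms of modulated output

Synchronizing the carrier phase across overlapping controllers, as described above, prevents their coverage regions from beating against one another at the receiver.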
  • FIG. 49 presents a moving device 2000 having an embedded pattern 2036, which may be pre-coded or transmitted, and which displays upon receiving a sequence of activating signals at locations 2028, 2030, 2032, 2034. A history of the device 2000 locations may be stored and used to adjust the proper timing and direction of the displayed pattern 2036.
  • FIG. 50 presents a balloon embodiment of the moving device 2000 having an additional special-effect altitude control 2042, including but not limited to a volume heater/cooler, volume pump, balloon surface tension material, chemical reaction or other known device to regulate the volume or buoyancy of a balloon. A bistable wire web may be employed to alternately contract and expand the volume.
  • In operation, an upper signal 2014 and a lower signal 2012 may be provided to regulate the altitude to a given layer. Alternatively, the signal strength may be employed to cause the balloon to descend once it reaches a defined level or is lost.
  • Immersive Environment
  • FIG. 51 presents an improved beam holographic background display 3000 having one or more digital beam holographic pixels 3012 which emit a complex pattern of light, horizontally and/or vertically, replicative of a virtual visual screen through which one views a 2- or 3-dimensional image of design. Details of the principles of operation have been presented in my earlier related applications. The improved display may be constructed of one or more columns 3002 of pixels, each column 3002 derived from the projection of one or more projection spatial light modulators (SLMs) 3010. An SLM 3010 having a base resolution of 1024×768 may be expanded into a column of 768 pixels and 1024 horizontal views 3014′.
  • The column beam presentation 3014L at a given angle may be monitored by a sensor or camera 3040L, and an appropriate correction may be applied by the internal controller or a central server 3004. The beam presentation may be at a non-visible wavelength such as but not limited to infrared. If a number of the peripheral views of the SLM are reserved for correction, the system will be able to dynamically correct for substantial vibration, displacement or other interruptions. The percentage required is dependent on the conditions, such that a fixed stable system may require only 4-6 pixel views while a mobile stage-mounted system for outdoor shows may require 20-40 views.
  • Multiple sensors 3040L, 3040C, 3040R may be employed to increase the accuracy.
  • FIG. 52A presents a top view of a total-internal-reflection differential expansion of the projected pattern of the SLM 3010 through a series of waveguides 3020.
  • FIG. 52B presents a perspective view having the SLM 3010 centrally mounted proximal to the column 3002, with the projection grid 3030 shown.
  • FIG. 53 presents the columns angled.
  • FIG. 54 presents the columns staggered and in parts.
  • As shown in FIG. 1, an integrated, coordinated display system may be created having a dynamic, three-dimensional beam holographic background 3000, a visual space filled with moving pixel devices 2000, and an augmented observer-mounted display system.
  • Synchronization of Data, Phase and Carrier Frequency
  • High-speed optical data streams are degraded by multimode paths, whether in fiber or free air. It is well known that most data communication protocols incorporate data error checks. These checks, such as the basic checksum, enable the receiver to ignore improperly formatted or received data. More sophisticated data error checks may provide correction algorithms.
  • FIG. 55 presents a preferred method applying the data error bits 602 to an upper nibble 604, thereby enabling a first approximation of the greater value, and to the remaining lower nibble or bits 606. The algorithm may allow a state change if the upper nibble data is OK, but not if only the lower nibble is OK. When used in an RGB (600×3) or other sequence for display applications, it enhances the continuity of the show at a small chance of minor error in the small values of the display.
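A minimal sketch of this nibble-weighted acceptance rule (Python; simple even parity stands in for whatever form the error bits 602 take, and the frame layout shown is an assumption):

    def parity(nibble: int) -> int:
        # Even parity of a 4-bit value.
        return bin(nibble & 0xF).count("1") & 1

    def receive(value: int, hi_check: int, lo_check: int, last: int) -> int:
        # Allow a state change only when the upper nibble verifies; a bad
        # lower nibble costs only a minor error in the small bits.
        hi, lo = (value >> 4) & 0xF, value & 0xF
        if parity(hi) != hi_check:
            return last                       # upper nibble bad: hold state
        if parity(lo) != lo_check:
            return (hi << 4) | (last & 0x0F)  # first approximation of the value
        return value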
  • When these observations are applied to an audience effects system, multimode paths principally arise from reflections off the environment and the audience, and from uncoordinated data and carrier frequency effects.
  • FIG. 56 presents a preferred method which enables the coordination of the controllers by the central control computer data stream 502 having a synch mark 504 and an optional synchronization offset value 506. In operation, the synch mark initiates the controller emission sequence, and the offset value 506 corrects the timing of the sequence to reflect the individual controller position, data transmission delay and other factors which cause degradation of the signal.
  • In relation to synchronization, FIG. 2-AO presents a preferred method which enables the coordination of the controller/projector 100 by one or more global synch marks 504, in the form of a radiative pulse of RF, photonic, acoustic or other means from a properly located generator, within the central control computer data stream 502, which may include but is not limited to DMX, RDM or ArtNET universes, and which carries the synch mark 504 and an optional synchronization offset value 506.
  • Both methods may be used to synchronize, at the receiver position, the data stream from multiple controllers, optionally including the phase and carrier frequency if used. Using these methods with line-of-sight photonic data streams enables multiple controllers from different directions to control receivers, often at lower power and with fewer spurious reflections from the audience or environment than otherwise. These methods improve spatial resolution and controller system efficiency.
  • Image Enhancements by Sequential, Interlaced and Saccadic Frames
  • Wizard elements on an audience participant may have a single emitter or, in order to increase the apparent resolution, multiple spatially-independent emitters. Each emitter may be controlled by a high-resolution controller or, alternatively, by an increased data stream which incorporates the data required for each emitter.
  • Lower resolution displays may be enhanced through sequential or interlaced frames. Sequential frames cause the emitter to present sequentially a multiplicity of pixels of the image, each shifted by a small amount, at a higher emitter rate than normal. The human visual system integrates the sequential images by virtually spatially displacing each sequential pixel properly. In reality, the pixel hasn't physically moved. Virtual apparent resolution multiplications over 100 are possible.
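A minimal sketch of deriving such sub-pixel-shifted sequential frames (Python/NumPy; the 4×4 emitter matrix matches FIG. 57A below, while the shift amounts and helper name are illustrative assumptions):

    import numpy as np

    def sequential_frames(target: np.ndarray, shifts) -> list[np.ndarray]:
        # Sample a high-resolution target once per sub-pixel shift,
        # producing the sequence a 4x4 emitter matrix would present.
        h, w = target.shape
        frames = []
        for dy, dx in shifts:                 # shifts in target-pixel units
            ys = (np.arange(4) * (h // 4) + dy).clip(0, h - 1)
            xs = (np.arange(4) * (w // 4) + dx).clip(0, w - 1)
            frames.append(target[np.ix_(ys, xs)])
        return frames

    target = np.arange(64, dtype=float).reshape(8, 8)
    base, shift_y, shift_x = sequential_frames(target, [(0, 0), (1, 0), (0, 1)])

Presented fast enough, the visual system fuses the shifted samples into an image finer than the physical emitter grid, as described above.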
  • Sequential Frames
      • a. Higher Speed Data Frame and Refresh Rate (over 30 fps)
      • b. Multiple Sequential Data at Normal or Other Data Frame Rate—Higher Sequential Series
      • c. Compressed Data—The individual pixel data (RGB, for example) may be compressed using existing or custom codecs.
  • Interlaced Content Dependent Frames
      • d. Higher Speed Data Frame and Refresh Rate (over 30 fps)
      • e. Multiple Sequential Data at Normal or Other Data Frame Rate—Higher Sequential Series
      • f. Compressed Data
  • Enhanced Saccadic Spatial Displacement
      • g. Saccadic Frame(s)
  • FIG. 57A presents a preferred embodiment of the sequential frame method having a base image matrix of 4×4 elements 400 and 3 frames including the base frame 410 and virtual sequential frames 412 and 414, shown in their virtual positions. It may be understood that the actual emitters are the base image matrix 400. Scene content and cues cause the virtual image to shift and the apparent resolution to increase along the y-axis 404.
  • FIG. 57B presents a preferred embodiment of the interlaced frame method having a base image matrix of 4×4 elements 400 and 5 frames including the base frame 410 and virtual interlaced frames 412 through 418, shown in their virtual positions interposed about the base frame 410. It may be understood that the actual emitters are the base image matrix 400. Scene content and cues cause the virtual image to shift and the apparent resolution to increase along both axes.
  • FIG. 57C presents a preferred embodiment of the enhanced saccadic method where a virtual frame having a fixation-attractive (light) and repulsive (dark) overlay on the base image matrix causes the observer's fixation (saccade) to shift in the designated direction. Many combinations of relative color and intensity cause predictable saccades and may be found in the relevant journals, including the Journal of Vision Science and Vision Research, which are incorporated by reference.
  • The embodiment of the invention particularly disclosed and described herein above is presented merely as an example of the invention. While the present invention is presented in a binocular environment, the novel elements may be applied to monoscopic or polyscopic devices, head mounted, near to eye, immersive, planar, television and cinema configurations. Other embodiments, forms and modifications of the invention coming within the proper scope and spirit of the appended claims will, of course, readily suggest themselves to those skilled in the art.

Claims (14)

1. A visual display system comprising:
light emitting element means for projecting one or more parts of a full image to the eye of an observer.
2. A visual display system in accordance with claim 1 wherein said light array means includes light emitting diodes.
3. A visual display system in accordance with claim 1 wherein said light array means includes a transparent light emitting medium.
4. A visual display system in accordance with claim 1 wherein said transparent light emitting medium is modulated by two-photon up conversion.
5. A visual display system in accordance with claim 1 wherein said light array means is scanned to describe a visual volume corresponding to the field and depth of view of the virtual image.
6. A visual display system in accordance with claim 1 further comprising eye state monitoring means for providing said controller means data to conform the modulation of said light array means and focus optical means to the observer's eye state for optimum performance.
7. A visual display system in accordance with claim 1, further comprising one or more thin-film optical conduit means.
8. A visual display system in accordance with claim 1, further comprising one or more thin-film optical conduit means and an extraction/activation layer means.
9. A visual display system comprising:
light emitting element array means for projecting one or more parts of a full image;
interlacing means for providing a sub-element illumination pattern transducible into a full virtual image of increased pixel number and density;
optical scanning means for displacing the optical radiation from the light emitting element array means across the field of view;
screen means for projecting the optical radiation from the light emitting element array means toward the observer's eye;
controller means for synchronizing the light emitting element array means, interlacing means and optical scanning means; and
storing means for storing said display system.
10. A visual display system in accordance with claim 9, wherein said storing means is in the form of a wristband which presents a display.
11. A visual display system in accordance with claim 10 further comprising focus optical means for providing optical focal distance for each element of the light emitting element array means.
12. A visual display system in accordance with claim 9 further comprising eye state monitoring means for providing said controller means data to conform the modulation of said light array means and focus optical means to the observer's eye state for optimum performance.
13. A visual display system in accordance with claim 9, further comprising one or more thin-film optical conduit means.
14. A visual display system in accordance with claim 9, further comprising one or more thin-film optical conduit means and an extraction/activation layer means.
US14/189,232 1990-12-07 2014-02-25 Integrated 3D-D2 visual effects display Expired - Fee Related US10593092B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US14/189,232 US10593092B2 (en) 1990-12-07 2014-02-25 Integrated 3D-D2 visual effects display
US16/190,044 US20190258061A1 (en) 2011-11-10 2018-11-13 Integrated Augmented Virtual Reality System
US16/819,091 US11199714B2 (en) 2014-02-25 2020-03-14 Experimental reality integrated system

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
FR9015462 1990-12-07
FR9015462A FR2670301B1 (en) 1990-12-07 1990-12-07 EXTENDED DYNAMIC NEUTRONIC DETECTION DEVICE FOR MONITORING AND CONTROLLING NUCLEAR REACTORS.
US12/456,401 US8730129B2 (en) 1990-12-07 2009-06-15 Advanced immersive visual display system
US13/294,011 US20120056799A1 (en) 2001-02-24 2011-11-10 Performance Audience Display System
US201361850920P 2013-02-25 2013-02-25
US201361962877P 2013-11-18 2013-11-18
US14/189,232 US10593092B2 (en) 1990-12-07 2014-02-25 Integrated 3D-D2 visual effects display

Related Parent Applications (3)

Application Number Title Priority Date Filing Date
US12/456,401 Continuation-In-Part US8730129B2 (en) 1990-12-07 2009-06-15 Advanced immersive visual display system
US13/294,011 Continuation-In-Part US20120056799A1 (en) 1990-12-07 2011-11-10 Performance Audience Display System
US16/190,044 Continuation-In-Part US20190258061A1 (en) 2011-11-10 2018-11-13 Integrated Augmented Virtual Reality System

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US16/190,044 Continuation-In-Part US20190258061A1 (en) 2011-11-10 2018-11-13 Integrated Augmented Virtual Reality System
US16/819,091 Continuation-In-Part US11199714B2 (en) 2014-02-25 2020-03-14 Experimental reality integrated system

Publications (3)

Publication Number Publication Date
US20150243068A1 US20150243068A1 (en) 2015-08-27
US20190139290A9 true US20190139290A9 (en) 2019-05-09
US10593092B2 US10593092B2 (en) 2020-03-17

Family

ID=53882720

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/189,232 Expired - Fee Related US10593092B2 (en) 1990-12-07 2014-02-25 Integrated 3D-D2 visual effects display

Country Status (1)

Country Link
US (1) US10593092B2 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190285886A1 (en) * 2018-03-16 2019-09-19 Toshihiro Yamashiro Optical scanning apparatus, image projecting apparatus, and mobile object
US20210068540A1 (en) * 2018-01-10 2021-03-11 Segos Co., Ltd. Sliding device for drawer
US20210109349A1 (en) * 2012-09-10 2021-04-15 Elbit Systems Ltd. Microsurgery system for displaying in real-time magnified digital image sequences of an operated area
WO2023192656A1 (en) * 2022-04-01 2023-10-05 Meta Platforms Technologies, Llc Smart glasses with enhanced optical structures for augmented reality applications

Families Citing this family (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0718706D0 (en) 2007-09-25 2007-11-07 Creative Physics Ltd Method and apparatus for reducing laser speckle
US11726332B2 (en) 2009-04-27 2023-08-15 Digilens Inc. Diffractive projection apparatus
US9335604B2 (en) 2013-12-11 2016-05-10 Milan Momcilo Popovich Holographic waveguide display
US9274349B2 (en) 2011-04-07 2016-03-01 Digilens Inc. Laser despeckler based on angular diversity
US10670876B2 (en) 2011-08-24 2020-06-02 Digilens Inc. Waveguide laser illuminator incorporating a despeckler
EP2995986B1 (en) 2011-08-24 2017-04-12 Rockwell Collins, Inc. Data display
WO2016020630A2 (en) 2014-08-08 2016-02-11 Milan Momcilo Popovich Waveguide laser illuminator incorporating a despeckler
WO2013102759A2 (en) 2012-01-06 2013-07-11 Milan Momcilo Popovich Contact image sensor using switchable bragg gratings
US9417660B2 (en) 2012-04-25 2016-08-16 Kopin Corporation Collapsible head set computer
US9933684B2 (en) 2012-11-16 2018-04-03 Rockwell Collins, Inc. Transparent waveguide display providing upper and lower fields of view having a specific light output aperture configuration
US9727772B2 (en) 2013-07-31 2017-08-08 Digilens, Inc. Method and apparatus for contact image sensing
US9759918B2 (en) * 2014-05-01 2017-09-12 Microsoft Technology Licensing, Llc 3D mapping with flexible camera rig
WO2016020632A1 (en) 2014-08-08 2016-02-11 Milan Momcilo Popovich Method for holographic mastering and replication
US10241330B2 (en) 2014-09-19 2019-03-26 Digilens, Inc. Method and apparatus for generating input images for holographic waveguide displays
CN107873086B (en) 2015-01-12 2020-03-20 迪吉伦斯公司 Environmentally isolated waveguide display
US9632226B2 (en) 2015-02-12 2017-04-25 Digilens Inc. Waveguide grating device
KR101678642B1 (en) * 2015-06-01 2016-11-22 오영권 Smart glass and wristwatch comprising same
US10349045B2 (en) * 2015-06-03 2019-07-09 Stanley Shao-Ying Lee Three-dimensional (3D) viewing device and system thereof
CN108474945B (en) * 2015-10-05 2021-10-01 迪吉伦斯公司 Waveguide display
US10511895B2 (en) * 2015-10-09 2019-12-17 Warner Bros. Entertainment Inc. Cinematic mastering for virtual reality and augmented reality
JP6895451B2 (en) 2016-03-24 2021-06-30 ディジレンズ インコーポレイテッド Methods and Devices for Providing Polarized Selective Holography Waveguide Devices
US10118696B1 (en) 2016-03-31 2018-11-06 Steven M. Hoffberg Steerable rotating projectile
CN109154717B (en) 2016-04-11 2022-05-13 迪吉伦斯公司 Holographic waveguide device for structured light projection
EP3281909A1 (en) * 2016-08-08 2018-02-14 Essilor International Projector configured to project an image onto a surface and portable device comprising such projector
US10324291B2 (en) 2016-09-12 2019-06-18 Microsoft Technology Licensing, Llc Display active alignment system for waveguide displays
US10216263B2 (en) * 2016-09-12 2019-02-26 Microsoft Technology Licensing, Llc Display active alignment systems utilizing test patterns for calibrating signals in waveguide displays
WO2018102834A2 (en) 2016-12-02 2018-06-07 Digilens, Inc. Waveguide device with uniform output illumination
US10366674B1 (en) * 2016-12-27 2019-07-30 Facebook Technologies, Llc Display calibration in electronic displays
US10545346B2 (en) 2017-01-05 2020-01-28 Digilens Inc. Wearable heads up displays
WO2018175343A1 (en) * 2017-03-21 2018-09-27 Magic Leap, Inc. Eye-imaging apparatus using diffractive optical elements
US10419716B1 (en) * 2017-06-28 2019-09-17 Vulcan Technologies Llc Ad-hoc dynamic capture of an immersive virtual reality experience
CN111095190A (en) * 2017-07-24 2020-05-01 维奈·K·梅塔 Static display and manufacturing method thereof
WO2019079350A2 (en) 2017-10-16 2019-04-25 Digilens, Inc. Systems and methods for multiplying the image resolution of a pixelated display
US10937348B2 (en) * 2017-11-23 2021-03-02 Facebook Technologies, Llc Analog data shifter for a current mode display
WO2019136476A1 (en) 2018-01-08 2019-07-11 Digilens, Inc. Waveguide architectures and related methods of manufacturing
CN111566571B (en) 2018-01-08 2022-05-13 迪吉伦斯公司 System and method for holographic grating high throughput recording in waveguide cells
US11253149B2 (en) * 2018-02-26 2022-02-22 Veyezer, Llc Holographic real space refractive sequence
RU183466U1 (en) * 2018-03-07 2018-09-24 Алексей Владимирович Непрокин Videonistamography Device
US11712637B1 (en) 2018-03-23 2023-08-01 Steven M. Hoffberg Steerable disk or ball
CN110376734B (en) * 2018-04-12 2021-11-19 肥鲨技术 Single-panel head-mounted display
US10861380B2 (en) 2018-05-14 2020-12-08 Facebook Technologies, Llc Display systems with hybrid emitter circuits
WO2020023779A1 (en) 2018-07-25 2020-01-30 Digilens Inc. Systems and methods for fabricating a multilayer optical structure
US11215828B1 (en) * 2018-08-03 2022-01-04 Rockwell Collins, Inc. In field visor characterization for visor projected displays
US11435580B1 (en) * 2018-09-20 2022-09-06 Rockwell Collins, Inc. High dynamic range head-up display
US10971061B2 (en) 2019-01-11 2021-04-06 Facebook Technologies, Llc Control scheme for a scanning display
EP3924759A4 (en) 2019-02-15 2022-12-28 Digilens Inc. Methods and apparatuses for providing a holographic waveguide display using integrated gratings
JP2022525165A (en) 2019-03-12 2022-05-11 Digilens Inc. Holographic waveguide backlights and related manufacturing methods
US11481889B2 (en) * 2019-04-03 2022-10-25 Pittsburgh Glass Works, Llc Fixture for evaluating heads-up windshields
US11619825B2 (en) * 2019-04-10 2023-04-04 Electronics And Telecommunications Research Institute Method and apparatus for displaying binocular hologram image
JP6641055B2 (en) * 2019-05-29 2020-02-05 Toshiba Corporation Wearable terminal, system and display method
EP3980825A4 (en) 2019-06-07 2023-05-03 Digilens Inc. Waveguides incorporating transmissive and reflective gratings and related methods of manufacturing
US11067809B1 (en) * 2019-07-29 2021-07-20 Facebook Technologies, Llc Systems and methods for minimizing external light leakage from artificial-reality displays
JP2022543571A (en) 2019-07-29 2022-10-13 Digilens Inc. Method and apparatus for multiplying image resolution and field of view for pixelated displays
US10885819B1 (en) * 2019-08-02 2021-01-05 Harman International Industries, Incorporated In-vehicle augmented reality system
EP4022370A4 (en) 2019-08-29 2023-08-30 Digilens Inc. Evacuating bragg gratings and methods of manufacturing
EP4229471A2 (en) * 2020-09-21 2023-08-23 Apple Inc. Systems with displays and sensors
WO2022154847A1 (en) 2021-01-12 2022-07-21 Emed Labs, Llc Health testing and diagnostics platform
US11615888B2 (en) 2021-03-23 2023-03-28 Emed Labs, Llc Remote diagnostic testing and treatment
US11929168B2 (en) 2021-05-24 2024-03-12 Emed Labs, Llc Systems, devices, and methods for diagnostic aid kit apparatus
US11369454B1 (en) 2021-05-24 2022-06-28 Emed Labs, Llc Systems, devices, and methods for diagnostic aid kit apparatus
GB2623461A (en) 2021-06-22 2024-04-17 Emed Labs Llc Systems, methods, and devices for non-human readable diagnostic tests
CN114022529B (en) * 2021-10-12 2024-04-16 Northeastern University Depth perception method and device based on adaptive binocular structured light

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120056799A1 (en) * 2001-02-24 2012-03-08 Dennis Solomon Performance Audience Display System
US8730129B2 (en) * 1990-12-07 2014-05-20 Dennis J Solomon Advanced immersive visual display system
DE19958436B4 (en) * 1999-12-03 2014-07-17 Carl Zeiss Meditec Ag Apparatus and method for active, physiologically evaluated, comprehensive correction of the aberrations of the human eye
US8482488B2 (en) * 2004-12-22 2013-07-09 Oakley, Inc. Data input management system for wearable electronically enabled interface
US7292760B2 (en) * 2002-12-09 2007-11-06 Eastman Kodak Company Optical converter formed from flexible strips
WO2005083546A1 (en) * 2004-02-27 2005-09-09 Simon Richard Daniel Wearable modular interface strap
TWI317826B (en) * 2005-09-15 2009-12-01 Asustek Comp Inc Ear-hook display and its electrical display apparatus
US8248462B2 (en) * 2006-12-15 2012-08-21 The Board Of Trustees Of The University Of Illinois Dynamic parallax barrier autostereoscopic display system and method
US8692886B2 (en) * 2007-10-31 2014-04-08 Timothy James Ennis Multidirectional video capture assembly
JPWO2009150747A1 (en) * 2008-06-13 2011-11-10 Pioneer Corporation Gaze-input user interface device, user interface method, user interface program, and recording medium on which the user interface program is recorded
US20100253700A1 (en) * 2009-04-02 2010-10-07 Philippe Bergeron Real-Time 3-D Interactions Between Real And Virtual Environments
US20130278631A1 (en) * 2010-02-28 2013-10-24 Osterhout Group, Inc. 3d positioning of augmented reality information
US9927611B2 (en) * 2010-03-29 2018-03-27 Soraa Laser Diode, Inc. Wearable laser based display method and system
JP5885129B2 (en) * 2012-09-11 2016-03-15 Casio Computer Co., Ltd. Exercise support device, exercise support method, and exercise support program
US10134170B2 (en) * 2013-09-26 2018-11-20 Intel Corporation Stereoscopic rendering using vertex shader instancing
US9454008B2 (en) * 2013-10-07 2016-09-27 Resonance Technology, Inc. Wide angle personal displays

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210109349A1 (en) * 2012-09-10 2021-04-15 Elbit Systems Ltd. Microsurgery system for displaying in real-time magnified digital image sequences of an operated area
US20210068540A1 (en) * 2018-01-10 2021-03-11 Segos Co., Ltd. Sliding device for drawer
US20190285886A1 (en) * 2018-03-16 2019-09-19 Toshihiro Yamashiro Optical scanning apparatus, image projecting apparatus, and mobile object
US10831024B2 (en) * 2018-03-16 2020-11-10 Ricoh Company, Ltd. Optical scanning apparatus, image projecting apparatus, and mobile object
WO2023192656A1 (en) * 2022-04-01 2023-10-05 Meta Platforms Technologies, Llc Smart glasses with enhanced optical structures for augmented reality applications

Also Published As

Publication number Publication date
US10593092B2 (en) 2020-03-17
US20150243068A1 (en) 2015-08-27

Similar Documents

Publication Publication Date Title
US10593092B2 (en) Integrated 3D-D2 visual effects display
US8730129B2 (en) Advanced immersive visual display system
JP7228581B2 (en) Augmented reality display with eyepiece having transparent emissive display
US20210144361A1 (en) Near Eye Wavefront Emulating Display
EP3542206B1 (en) Near-eye sequential light-field projector with correct monocular depth cues
US20060033992A1 (en) Advanced integrated scanning focal immersive visual display
Kramida Resolving the vergence-accommodation conflict in head-mounted displays
CN105492957B (en) Image display in the form of a pair of eyeglasses
CN107300769B (en) Virtual and augmented reality systems and methods
JP2023058677A (en) Display system and method for providing variable accommodation cues using multiple intra-pupil parallax views formed by light emitter arrays
US20190258061A1 (en) Integrated Augmented Virtual Reality System
US20040130783A1 (en) Visual display with full accommodation
JP2019537060A (en) Multi-resolution display for head mounted display system
US20040108971A1 (en) Method of and apparatus for viewing an image
JP2000506998A (en) Method and apparatus for viewing images
JP2008509438A (en) Optical display device scanned at a variable fixation viewing distance
CN106170729A (en) Method and apparatus for a head-mounted display with multiple exit pupils
CN109633905A (en) Multi-focal-plane display system and device
JP2024045723A (en) Display system having one-dimensional pixel array with scanning mirror
US20220175326A1 (en) Non-Invasive Experimental Integrated Reality System
US11199714B2 (en) Experimental reality integrated system
CN109963141A (en) Visual display system and method, and head-mounted display device

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO MICRO (ORIGINAL EVENT CODE: MICR); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

FEPP Fee payment procedure

Free format text: PETITION RELATED TO MAINTENANCE FEES GRANTED (ORIGINAL EVENT CODE: PTGR); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO MICRO (ORIGINAL EVENT CODE: MICR); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY