WO2018078633A1 - Reflector eye sight with compact beam combiner
- Publication number: WO2018078633A1
- Application number: PCT/IL2017/051180
- Authority: WIPO (PCT)
Classifications
- G02B3/08 — Simple or compound lenses with non-spherical, discontinuous faces, e.g. Fresnel lenses
- G02B27/0172 — Head-up displays; head mounted, characterised by optical features
- G02B27/4216 — Diffraction optics; diffractive optical element (DOE) contributing to image formation, correcting geometrical aberrations
- G02B2027/013 — Head-up displays characterised by optical features comprising a combiner of particular shape, e.g. curvature
- G02B27/0043 — Optical systems for optical correction (e.g. distortion, aberration) with diffracting elements in projection exposure systems
Definitions
- the lens is a thin film applied to a transparent surface on or within the eye sight, the transparent surface lying along the user's optical path and transmitting incident light without modification or distortion.
- the thickness of the thin film is between 200 μm and 500 μm.
- the thin film is applied to the transparent surface with an adhesive agent.
- the virtual data comprises one or more of dimensions, coordinates, bearings, timestamps, heat-maps, reticles, graticules, weather analysis, and text.
- the eye sight is for use on a weapon, telescope, periscope, binoculars, or a camera.
- Figure 3 is an exemplary non-limiting diagram graphically illustrating the diffraction pattern resulting from the conjugated diffractive lens of figure 2B. These patterns may be represented mathematically by the following equations:
- Q1 represents the transfer function (depicted as 301) produced on the first side of the conjugated diffractive lens 302
- Q2 represents the conjugate transfer function (depicted as 303) produced on the second side of the conjugated diffractive lens 302
- A1 and A2 are constants
- λ represents the wavelength of light emitted from the synthetic data source (i.e. from the AR data source); f is the focal length of the incident surface
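Given the stated definitions (conjugate transfer functions on opposing sides of the lens, constants A1 and A2, wavelength λ, and focal length f), a plausible form for the referenced equations — a reconstruction offered as an assumption, not the verbatim original — is a pair of conjugate quadratic-phase functions:

```latex
Q_1(r) = A_1 \exp\!\left(\frac{i\pi r^2}{\lambda f}\right), \qquad
Q_2(r) = A_2 \exp\!\left(-\frac{i\pi r^2}{\lambda f}\right)
```

Under this reading, the phase imparted on one side is cancelled on the other for transmitted light (preserving the undistorted real-world view), while light reflected from the AR source sees only one pattern and acquires the focusing phase.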
- each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures.
- aspects of the present invention may be embodied as a system or an apparatus. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit", "module" or "system".
- Methods of the present invention may be implemented by performing or completing manually, automatically, or a combination thereof, selected steps or tasks.
- method may refer to manners, means, techniques and procedures for accomplishing a given task including, but not limited to, those manners, means, techniques and procedures either known to, or readily developed from known manners, means, techniques and procedures by practitioners of the art to which the invention belongs.
Abstract
An eye sight operable to augment a real-world view with virtual data is disclosed. The eye sight comprises a lens positioned between a user and the real-world view, said lens comprising conjugated diffractive elements wherein the conjugated diffractive elements comprise a surface with an antireflective coating, and wherein said conjugated diffractive elements are joined to form a concave surface and convex surface, said concave surface being orientated toward the real-world view and said convex surface being orientated toward the user. The eye sight further comprises a computer processing device comprising a transmission component and a transmission lens, said computer processing device being operable to transmit virtual data via the transmission component to be augmented with said real-world view, said transmission lens being operable to focus said virtual data onto said convex surface of said lens, wherein said lens transmits said real-world view to said user without distortion or modification.
Description
REFLECTOR EYE SIGHT WITH COMPACT BEAM COMBINER
FIELD OF THE INVENTION
[0001] The present invention relates generally to an eye sight with a compact beam combiner, and more particularly to a compact eye sight, such as a rifle sight, camera or telescope, operable to overlay graphics onto an undistorted and unmodified real-world view.
BACKGROUND OF THE INVENTION
[0002] Augmented reality (AR) refers to techniques whereby a real-world view, object or scene as seen by an observer is enhanced with an additional visual layer of digital information. Implementation of AR systems typically requires use of an optical enabling device (i.e. an optical imaging system) to display virtual objects directly into the observer's field of view (FOV). These devices are especially vital where there is a need for virtual objects to be aligned with, or to overlap with, objects or scenery in the real-world view. Standard optical enabling devices are based on two channels, the first being a transmissive channel operable to enable an observer to view a real-world scene without optical modulation (i.e. without modification, magnification or distortion), and the second being a virtual channel operable to obtain data (e.g. computer-generated digital data of a textual or graphical variety) from an external source and to project it to the observer in alignment with the real-world scene. The transmissive and virtual channels are generally superimposed (i.e. combined and aligned) using a combiner, such as a beam splitter or dichroic mirror. The combiner is positioned along the optical path between the observer and the scene (i.e. positioned as an optical-incident surface).
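The superposition performed by the combiner can be sketched as a simple linear intensity model (a toy illustration only; the transmittance and reflectance values below are assumptions, not figures from the disclosure):

```python
def combine(scene, virtual, transmittance=0.85, reflectance=0.15):
    """Toy model of an optical beam combiner: the observer sees the
    transmitted real-world scene superimposed with the reflected
    virtual channel. Intensities are linear values in [0, 1]."""
    assert transmittance + reflectance <= 1.0  # passive element: no gain
    return [min(1.0, transmittance * s + reflectance * v)
            for s, v in zip(scene, virtual)]

# A 1-D slice of a uniform scene with a bright virtual reticle at the centre pixel
scene = [0.4] * 9
virtual = [0.0] * 4 + [1.0] + [0.0] * 4
print(combine(scene, virtual))
```

The reticle pixel is brightened by the reflected virtual channel while every other pixel carries only the (slightly attenuated) scene, which is the essence of the two-channel superposition described above.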
[0003] An exemplary AR system is illustrated in figure 1, wherein the transmissive channel is depicted in figure 1a, the virtual channel is depicted in figure 1b, and the combined transmissive-virtual channel is depicted in figure 1c. In the case of the transmissive channel, light rays 103a from the real-world view 101a are transmitted via AR system 104 to the eyes of an observer 102a without modification or distortion. In the case of the virtual channel, additional data 105a for augmentation with the real-world view is defined within AR system 104, is focused via imaging lens 106 to have appropriate virtual image characteristics 105b (i.e. appropriate dimensions and magnification for alignment with the real-world view), and is then projected to the eye of the observer 102b. In the case of the combined transmissive-virtual channel, light rays 103c from the real-world view 101c pass unmodified via the beam splitter 107 to the eye of the observer 102c, and data 105a defined by the AR system is projected and focused via imaging lens 106 onto an alternative surface of the beam splitter 107 (which may have a dichroic coating for a particular wavelength) and aligned with the real-world view into the eye of the observer 102c.
[0004] Reflector eye sights are known to utilize AR arrangements of this type to enable an observer to look through a partially reflecting element to see a superimposed virtual image, such as an aim point reticle, in their field of view. As the virtual image is projected to infinity, it remains in alignment with the device the reflector eye sight is attached to regardless of the observer's eye position, thereby obviating many of the parallax and other sighting errors found in prevalent iron sight arrangements. One such common reflector eye sight is the ubiquitous red dot sight which uses a red LED at the focus of collimating optics to generate a dot style illuminated reticle in the observer's field of view.
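Why projecting the reticle to infinity eliminates parallax can be sketched numerically: the apparent direction of a finite-distance image shifts as the eye moves, while a collimated image does not. The distances used here are illustrative assumptions, not values from the disclosure:

```python
import math

def apparent_angle(image_distance_m, eye_offset_m):
    """Angular position (radians) of an on-axis image point as seen by an
    eye displaced laterally by eye_offset_m. For a finite image distance
    the angle shifts with eye position (parallax); as the distance goes
    to infinity the shift vanishes."""
    if math.isinf(image_distance_m):
        return 0.0  # collimated: rays are parallel, same direction everywhere
    return math.atan2(eye_offset_m, image_distance_m)

# Reticle focused at 0.5 m vs. projected to infinity, eye moved 5 mm off-axis
finite_shift = apparent_angle(0.5, 0.005)           # ~10 milliradians of parallax
collimated_shift = apparent_angle(math.inf, 0.005)  # 0: no parallax

print(f"finite-distance reticle shift: {finite_shift * 1000:.2f} mrad")
print(f"collimated reticle shift:      {collimated_shift * 1000:.2f} mrad")
```

At rifle-engagement distances even a few milliradians of reticle shift translates into a large aim error, which is why the infinity projection matters.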
[0005] As most reflector eye sight arrangements are primarily assembled from standard optical components including mirrors, lenses and light sources (i.e. incandescent bulbs, LEDs etc.), there can be considerable size, cost, weight, reliability, performance, and complexity benefits in decreasing the number of optical components present in a given eye sight. One such method of achieving these benefits is to merge or combine optical components into unitary composite components fulfilling the same functionality as those components in isolation. Accordingly, it is an objective of the present invention to produce a compact reflector eye sight wherein an imaging lens and/or a concave mirror is merged with a beam splitter into a single optical component fulfilling the same function.
[0006] Further, AR systems are typically constrained in their application due to limitations arising from the physical size of the imaging lens. Specifically, the physical size of the imaging lens places constraints on the size of the exit pupil and its axial distance, and this consequently also dictates the effective area and axial location in which the observer's eyes are capable of perceiving projected virtual images (known as the 'eye-motion-box' or EMB). In essence, the smaller the physical size of the imaging lens, the smaller the EMB, and therefore the smaller the area in which the projected data is refracted from the optical combiner (i.e. beam splitter etc.) and can be rendered in the observer's eye. These limitations are particularly problematic in situations where it may be desirable for virtual images to be rendered in peripheral areas of a view through a reflector eye sight, rather than solely in the central or immediate field of view (FOV). Where peripheral rendering is desirable, the imaging lens either needs to be comparable in size to the sight so as to exhibit the requisite exit pupil, or else will be restricted to define only a small effective area in which an observer may see virtual images. Installing a large imaging lens is generally cost- and weight-prohibitive, often infeasible, and frequently leads to modification and/or distortion in transmissive light (i.e. distortion/modification to the transmissive channel illustrated in figure 1a). Accordingly, it is a further objective of the present invention to merge the imaging lens and optical combiner in a manner where the EMB substantially encompasses the entirety of the FOV through a reflector eye sight without necessitating prohibitively expensive optical components.
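The dependence of the EMB on aperture size can be illustrated with a crude similar-triangles estimate: for the eye to receive the full field of view, the field-angle footprint at the eye-relief distance must fit inside the optic's aperture. This is a rough geometric model under assumed dimensions, not the disclosure's optical design:

```python
import math

def emb_diameter(lens_diameter_mm, eye_relief_mm, half_fov_deg):
    """Toy estimate of the eye-motion-box diameter: the lateral range over
    which the eye still receives rays from the full field of view.
    EMB ~ lens aperture minus the footprint swept by the field angle."""
    swept = 2 * eye_relief_mm * math.tan(math.radians(half_fov_deg))
    return max(0.0, lens_diameter_mm - swept)

# A 30 mm combiner at 70 mm eye relief with a +/-5 degree field:
print(emb_diameter(30, 70, 5))   # ~17.75 mm of usable EMB
# Halving the aperture shrinks the EMB far faster than in proportion:
print(emb_diameter(15, 70, 5))   # ~2.75 mm
```

The second result illustrates the paragraph's point: below a certain aperture the EMB collapses, so either the optic grows to near the size of the sight or peripheral rendering is lost.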
SUMMARY OF THE PRESENT INVENTION
[0007] An eye sight operable to augment a real-world view with virtual data is proposed. The eye sight comprises: a lens positioned between a user and the real-world view, said lens comprising conjugated diffractive elements wherein one or more of said conjugated diffractive elements comprises a surface with an antireflective coating, and wherein said conjugated diffractive elements are joined to form a concave surface and convex surface, said concave surface being orientated toward the real-world view and said convex surface being orientated toward the user; and, a computer processing device, said computer processing device comprising a transmission component and a transmission lens, said computer processing device being operable to transmit virtual data via the transmission component to be augmented with said real-world view, said transmission lens being operable to focus said virtual data onto said convex surface of said lens, wherein said lens transmits said real-world view to said user without distortion or modification, and wherein said lens further modulates and aligns said overlaying data with said undistorted and unmodified real-world view.
[0008] A method of using an eye sight to augment a real-world view with virtual data is also proposed. The method comprising: positioning a lens between a user and the real-world view, said lens comprising conjugated diffractive elements wherein one or more of said conjugated diffractive elements comprises a surface with an antireflective coating, and wherein said conjugated diffractive elements are joined to form a concave surface and convex surface, said concave surface being orientated toward the real-world view and said convex surface being orientated toward the user; and, operating a computer processing device comprising a transmission component and a transmission lens, said computer processing device being operable to transmit virtual data via the transmission component to be augmented with said real-world view, said transmission lens being operable to focus said virtual data onto said convex surface of said lens, wherein said lens transmits said real-world view to said user without distortion or modification, and wherein said lens further modulates and aligns said overlaying data with said undistorted and unmodified real-world view.
[0009] Advantages of the present invention are set forth in detail in the following description.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] For a better understanding of the invention and in order to show how the invention may be implemented, references are made, purely by way of example, to the accompanying drawings in which like numerals designate corresponding elements or sections. In the accompanying drawings:
[0011] Figure 1 is a block diagram illustrating exemplary architecture of a standard AR system.
[0012] Figure 2A is a schematic diagram illustrating exemplary architecture of a standard meniscus lens.
[0013] Figure 2B is a schematic diagram illustrating exemplary non-limiting architecture of a conjugated diffractive lens according to embodiments of the present invention.
[0014] Figure 3 is an exemplary non-limiting diagram illustrating diffraction patterns on both sides of a conjugated diffractive lens.
DETAILED DESCRIPTION OF THE INVENTION
[0015] With specific reference now to the drawings in detail, it is stressed that the particulars shown are for the purpose of example and solely for discussing the preferred embodiments of the present invention, and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention. In this regard, no attempt is made to show structural details of the invention in more detail than is necessary for a fundamental understanding of the invention. The description taken with the drawings makes apparent to those skilled in the art how the several forms of the invention may be embodied in practice.
[0016] Before explaining the embodiments of the invention in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of the components set forth in the following descriptions or illustrated in the drawings. The invention is applicable to other embodiments and may be practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting.
[0017] Figure 2A is a schematic diagram illustrating exemplary architecture of a standard meniscus lens. A meniscus lens is a unitary lens with a convex-concave (outward-inward curved) profile; the meniscus lens considered here has an optical power of zero. The exemplary meniscus lens comprises a substantially convex lens element cemented with a substantially concave lens element, the boundary or mutual surface 204 having a dichroic narrow band-pass reflective coating. The substantially convex and concave lens elements each comprise an external surface with an antireflective coating 203a, the composite lens, when formed together, having uniform thickness. Light rays 202a from the real-world scene pass through the meniscus lens absent spatial modulation due to the lens having uniform thickness. Light rays 205a originating from an object point 206a, such as an AR source, are reflected by the dichroic narrow band-pass reflective coating 204, modulated, and collimated in alignment with the real-world scene light rays 202a.
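The zero-power property can be checked against the thick-lens lensmaker's equation. The index, radius and thickness below are illustrative assumptions; note also that the exact zero-power rear radius differs slightly from the strictly uniform-thickness (concentric) choice, whose residual power is negligible for thin lenses:

```python
def thick_lens_power(n, R1_mm, R2_mm, d_mm):
    """Optical power (1/mm) of a thick lens via the lensmaker's equation:
    P = (n-1)(1/R1 - 1/R2) + (n-1)^2 d / (n R1 R2)."""
    return (n - 1) * (1 / R1_mm - 1 / R2_mm) + ((n - 1) ** 2 * d_mm) / (n * R1_mm * R2_mm)

n, R1, d = 1.5, 50.0, 3.0          # illustrative glass index, front radius, thickness
R2 = R1 - (n - 1) * d / n          # rear radius chosen so the net power vanishes

# The surfaces are still curved (so the mutual surface can act as a concave
# mirror for the reflected AR channel) yet the transmitted scene sees no
# net focusing power:
assert abs(thick_lens_power(n, R1, R2, d)) < 1e-12
```

This is exactly the "window for transmission, mirror for reflection" behaviour the next paragraph describes.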
[0018] Meniscus lens arrangements of this type are commonly used in reflector 'red dot' rifle-mounted sights, the dichroic narrow-band-pass reflective coating being appropriate to reflect red light emitted from the object point, and the antireflective coatings being appropriate to transmit visible spectrum light. In essence, the meniscus lens behaves, in part, like a window, operable to transmit light arriving from an external scene without distortion or modulation. The meniscus lens also behaves like a concave mirror, operable to reflect rays projected from the AR source 206a onto the opposite, internal, side of the meniscus lens back to an observer. The transmitted unmodulated 202a and reflected modulated 205a rays arrive in alignment at the eye 201a of an observer. Notably, however, spherical concave surfaces of the type illustrated in figure 2A have been found to have constrained effective areas in which the AR data source may render virtual data due to aberrations occurring in peripheral areas of the lens. This, at least in part, is due to large-angle ray distortion arising as a result of significant optical path differences. As a consequence, lenses of this type are, in practice, limited to displaying only small object points (e.g. 'red dots' for rifle targeting).
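The growth of those optical path differences toward the lens periphery can be illustrated by comparing the sag of a spherical surface with the ideal paraxial parabola of the same vertex curvature; the departure grows roughly as r^4 / (8R^3), so it explodes at large aperture heights. The radius used is an illustrative assumption:

```python
import math

def sphere_sag(r, R):
    """Depth of a spherical surface of radius R at aperture height r."""
    return R - math.sqrt(R * R - r * r)

def parabola_sag(r, R):
    """Depth of the ideal (paraxial-focusing) parabola with the same vertex curvature."""
    return r * r / (2 * R)

R = 100.0  # mm, illustrative mirror radius of curvature
for r in (5, 15, 30):  # aperture heights from centre toward the periphery
    opd = sphere_sag(r, R) - parabola_sag(r, R)
    print(f"r = {r:2d} mm: sphere departs from the ideal surface by {opd * 1000:.2f} um")
```

The r^4 scaling is why a sight of this type can display a small central 'red dot' cleanly yet cannot render virtual data in peripheral regions without visible aberration.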
[0019] A theoretical approach to overcoming such aberration phenomena is to use a doublet lens with an aspherical mutual surface (e.g. a parabolic mirror or ellipsoidal surface). Unlike a meniscus lens, a doublet lens is a composite lens made up of two simple lenses paired together and has additional degrees of freedom. These additional degrees of freedom are instrumental in enabling a designer to more rigorously correct for optical aberrations, such as chromatic or spherical aberrations. The correction of aberrations also, in turn, somewhat improves the effective area in which to support and render AR virtual data. Producing an aspherical doublet lens is, however, challenging with current manufacturing methods, and prohibitively expensive. The total width, and therefore also weight, of an optical component featuring an aspherical mutual surface is also a factor and, in some instances, may give rise to ghost images (i.e. additional shifted images). These ghost images may be addressed by applying a dichroic coating to the spherical incident surface; however, such coatings invariably lead to discoloration in the transmissive channel (e.g. a red dichroic mirror may lead to a bluish discoloration being applied to the real-world view). Moreover, the ability to render AR virtual data in peripheral areas of the field of view is still undesirably constrained as the doublet lens provides only limited improvement in aberrations arising from large-angle distortion.
[0020] The arrangement in figure 2B discloses an exemplary non-limiting architecture of a conjugated diffractive lens according to embodiments of the present invention. As with the meniscus lens, the proposed conjugated diffractive lens comprises two opposing lens elements, where one presents a substantially concave surface and the other presents a substantially convex surface. In some embodiments, the lens elements may comprise a Fresnel architecture to facilitate application in compact or space-constrained environments. The conjugated diffractive elements are formed together to fulfill optical beam combiner functionality, and may be imposed onto a thin film of a transparent dispersive material, for example onto a film of silica (SiO2) or any other appropriate material. In some embodiments, the thin film may be applied directly or indirectly, with or without an appropriate adhesive agent, onto a transparent surface within the reflector eye sight. Light rays 202b from the real-world scene pass through the conjugated diffractive elements absent spatial modulation due to the lens having uniform thickness (i.e. the same thickness at every location on the composite lens). Light rays 205b originating from an object point 206b, such as an AR source, are reflected, modulated and collimated in alignment with the real-world scene light rays 202b. The substantially convex external surface comprises an antireflective coating 203b, and rays from the real-world scene 202b and AR source 205b progress in alignment to the eye 201b of an observer. Unlike with the doublet lens, as the thickness of the conjugated diffractive lens may be kept relatively small (e.g. around 300 μm), the effect of ghost image phenomena is negligible and a dichroic coating is unnecessary. Moreover, spherical aberrations may be overcome by implementing an appropriate a-spherical pattern on the film, and the rendering of virtual data in peripheral regions of the field of view is achievable.
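The beam-combining behaviour described above can be illustrated with a brief numerical sketch (not part of the original disclosure; the wavelength and focal-length values below are illustrative assumptions): equal and opposite quadratic phases on the two conjugated surfaces cancel for transmitted scene light, while the reflected AR channel behaves like a concave mirror of focal distance f = f1 + d.

```python
import numpy as np

# Illustrative sketch (hypothetical parameters): two conjugated diffractive
# surfaces impose equal and opposite quadratic phases, so transmitted light
# from the scene is spatially unmodulated, while light reflected back toward
# the observer sees a focusing (concave-mirror-like) quadratic phase.
lam = 650e-9   # assumed AR source wavelength (red light), metres
f1 = 0.05      # assumed focal length of the incident surface, metres
d = 300e-6     # film thickness of ~300 um, per the description
f = f1 + d     # focal distance of the conjugated lens, per the description

x = np.linspace(-5e-3, 5e-3, 101)    # sample points across a 10 mm aperture
r2 = x**2                            # radial coordinate squared (1-D slice)

phase_Q1 = -np.pi / (lam * f1) * r2  # scene-facing (incident) surface
phase_Q2 = +np.pi / (lam * f1) * r2  # conjugate, observer-facing surface

# Transmission: the two phases cancel, leaving a spatially uniform wavefront,
# so the real-world view passes without distortion or modulation.
transmitted = phase_Q1 + phase_Q2
assert np.allclose(transmitted, 0.0)

# Reflection (AR channel): a quadratic phase focusing with focal distance f,
# i.e. the film acts on the AR source like a concave mirror.
phase_reflected = -np.pi / (lam * f) * r2
```

The cancellation in `transmitted` is the numerical counterpart of the uniform-thickness argument in the text; the nonzero `phase_reflected` profile is what collimates the AR object point toward the observer.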
[0021] In some embodiments, the lens is a thin film applied to a transparent surface on or within the eye sight, the transparent surface lying along the user's optical path and transmitting incident light without modification or distortion.
[0022] In some embodiments, the thickness of the thin film is between 200 μm and 500 μm.
[0023] In some embodiments, the thin film is applied to the transparent surface with an adhesive agent.
[0024] In some embodiments, the virtual data comprises one or more of dimensions, coordinates, bearings, timestamps, heat-maps, reticles, graticules, weather analysis, and text.
[0025] In some embodiments the eye sight is for use on a weapon, telescope, periscope, binoculars, or a camera.
[0026] Figure 3 is an exemplary non-limiting diagram graphically illustrating the diffraction pattern resulting from the conjugated diffractive lens of figure 2B. These patterns may be represented mathematically by the following equations:
where Q1 represents the transfer function (depicted as 301) produced on the first side of the conjugated diffractive lens 302; Q2 represents the conjugate transfer function (depicted as 303) produced on the second side of the conjugated diffractive lens 302; A1 and A2 are constants; λ represents the wavelength of light emitted from the synthetic data source (i.e. from the AR data source); f1 is the focal length of the incident surface; and f = f1 + d is the focal distance of the conjugated lens, where d is the thickness of the film, and where in both f1 and f the refractive index of the media is taken into consideration.
As aforementioned, light rays from the real-world scene are not modulated by the conjugated diffractive lens as the lens has uniform thickness. The transfer function for light rays originating from the real-world scene may therefore be represented mathematically as: t(x, y) = B · exp(2πid/λ), where B is a constant and λ = λ0/n is any typical wavelength in the visible range (λ0) attenuated by the refractive index of the media (denoted as n).
Light originating from the AR data source and incident on the rear surface of the conjugated diffractive lens is, however, modulated. The transfer function for this may be represented mathematically as: r(x, y) = C · exp[−(iπ/(λf))(x² + y²)].
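The equations for Q1 and Q2 themselves are rendered as images in the published document. Consistent with the definitions above — a conjugate pair of quadratic phases whose product is spatially uniform, and a reflected channel acting as a mirror of focal distance f — they may be reconstructed, as an assumption rather than a reproduction of the originals, in the following form:

```latex
Q_1(x,y) = A_1 \exp\!\left[-\frac{i\pi}{\lambda f_1}\bigl(x^2+y^2\bigr)\right],
\qquad
Q_2(x,y) = A_2 \exp\!\left[+\frac{i\pi}{\lambda f_1}\bigl(x^2+y^2\bigr)\right]
```

Under this reconstruction the product Q1 · Q2 carries no spatial phase, matching the unmodulated transmission t(x, y) = B · exp(2πid/λ), while the reflected AR channel accumulates the quadratic phase of r(x, y) with effective focal distance f = f1 + d.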
[0027] The aforementioned flowchart and diagrams illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each portion in the flowchart or portion diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the portion may occur out of the order noted in the figures. For example, two portions shown in succession may, in fact, be executed substantially concurrently, or the portions may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each portion of the portion diagrams and/or flowchart illustration, and combinations of portions in the portion diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or by combinations of special purpose hardware and computer instructions.
[0028] As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system or an apparatus. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit", "module" or "system".
[0029] The aforementioned figures illustrate the architecture, functionality, and operation of possible implementations of systems and apparatus according to various embodiments of the present invention. Where referred to in the above description, an embodiment is an example or implementation of the invention. The various appearances of "one embodiment," "an embodiment" or "some embodiments" do not necessarily all refer to the same embodiments.
[0030] Although various features of the invention may be described in the context of a single embodiment, the features may also be provided separately or in any suitable combination. Conversely, although the invention may be described herein in the context of separate embodiments for clarity, the invention may also be implemented in a single embodiment.
[0031] Reference in the specification to "some embodiments", "an embodiment", "one embodiment" or "other embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least some embodiments, but not necessarily all embodiments, of the inventions. It will further be recognized that the aspects of the invention described hereinabove may be combined or otherwise coexist in embodiments of the invention.
[0032] It is to be understood that the phraseology and terminology employed herein is not to be construed as limiting and are for descriptive purpose only.
[0033] The principles and uses of the teachings of the present invention may be better understood with reference to the accompanying description, figures and examples.
[0034] It is to be understood that the details set forth herein are not to be construed as limiting the application of the invention.
[0035] Furthermore, it is to be understood that the invention can be carried out or practiced in various ways and that the invention can be implemented in embodiments other than the ones outlined in the description above.
[0036] It is to be understood that the terms "including", "comprising", "consisting" and grammatical variants thereof do not preclude the addition of one or more components, features, steps, or integers or groups thereof and that the terms are to be construed as specifying components, features, steps or integers.
[0037] If the specification or claims refer to "an additional" element, that does not preclude there being more than one of the additional element.
[0038] It is to be understood that where the claims or specification refer to "a" or "an" element, such reference is not to be construed as meaning that there is only one of that element.
[0039] It is to be understood that where the specification states that a component, feature, structure, or characteristic "may", "might", "can" or "could" be included, that particular component, feature, structure, or characteristic is not required to be included.
[0040] Where applicable, although state diagrams, flow diagrams or both may be used to describe embodiments, the invention is not limited to those diagrams or to the corresponding descriptions. For example, flow need not move through each illustrated box or state, or in exactly the same order as illustrated and described.
[0041] Methods of the present invention may be implemented by performing or completing manually, automatically, or a combination thereof, selected steps or tasks.
[0042] The term "method" may refer to manners, means, techniques and procedures for accomplishing a given task including, but not limited to, those manners, means, techniques and procedures either known to, or readily developed from known manners, means, techniques and procedures by practitioners of the art to which the invention belongs.
[0043] The descriptions, examples and materials presented in the claims and the specification are not to be construed as limiting but rather as illustrative only.
[0044] Meanings of technical and scientific terms used herein are to be commonly understood as by one of ordinary skill in the art to which the invention belongs, unless otherwise defined.
[0045] The present invention may be implemented in the testing or practice with materials equivalent or similar to those described herein.
[0046] While the invention has been described with respect to a limited number of embodiments, these should not be construed as limitations on the scope of the invention, but rather as exemplifications of some of the preferred embodiments. Other or equivalent variations, modifications, and applications are also within the scope of the invention. Accordingly, the scope of the invention should not be limited by what has thus far been described, but by the appended claims and their legal equivalents.
Claims
1. An eye sight operable to augment a real-world view with virtual data, said eye sight comprising:
a lens positioned between a user and the real-world view, said lens comprising conjugated diffractive elements wherein one or more of said conjugated diffractive elements comprises a surface with an antireflective coating, and wherein said conjugated diffractive elements are joined to form a concave surface and convex surface, said concave surface being orientated toward the real-world view and said convex surface being orientated toward the user; and,
a computer processing device, said computer processing device comprising a transmission component and a transmission lens, said computer processing device being operable to transmit virtual data via the transmission component to be augmented with said real-world view, said transmission lens being operable to focus said virtual data onto said convex surface of said lens, wherein said lens transmits said real-world view to said user without distortion or modification, and wherein said lens further modulates and aligns said overlaying data with said undistorted and unmodified real-world view.
2. The eye sight according to claim 1, wherein said lens is a thin film applied to a transparent surface on or within said eye sight, said transparent surface lying along the user's optical path and transmitting incident light without modification or distortion.
3. The eye sight according to claim 2, wherein the thickness of said thin film is between 200 μm and 500 μm.
4. The eye sight according to claim 2, wherein said thin film is applied to said transparent surface with an adhesive agent.
5. The eye sight according to claim 1, wherein said virtual data comprises one or more of dimensions, coordinates, bearings, timestamps, heat-maps, reticles, graticules, weather analysis, and text.
6. The eye sight according to claim 1 for use on a weapon, telescope, periscope, binoculars, or camera.
7. A method of using an eye sight to augment a real-world view with virtual data, said method comprising:
positioning a lens between a user and the real-world view, said lens comprising conjugated diffractive elements wherein one or more of said conjugated diffractive elements comprises a surface with an antireflective coating, and wherein said conjugated diffractive elements are joined to form a concave surface and convex surface, said concave surface being orientated toward the real-world view and said convex surface being orientated toward the user; and,
operating a computer processing device comprising a transmission component and a transmission lens, said computer processing device being operable to transmit virtual data via the transmission component to be augmented with said real-world view, said transmission lens being operable to focus said virtual data onto said convex surface of said lens, wherein said lens transmits said real-world view to said user without distortion or modification, and wherein said lens further modulates and aligns said overlaying data with said undistorted and unmodified real-world view.
8. The method according to claim 7, wherein said lens is a thin film applied to a transparent surface on or within said eye sight, said transparent surface lying along the user's optical path and transmitting incident light without modification or distortion.
9. The method according to claim 8, wherein the thickness of said thin film is between 200 μm and 500 μm.
10. The method according to claim 8, wherein said thin film is applied to said transparent surface with an adhesive agent.
11. The method according to claim 7, wherein said virtual data comprises one or more of dimensions, coordinates, bearings, timestamps, heat-maps, reticles, graticules, weather analysis, and text.
12. The method according to claim 7 for use on a weapon, telescope, periscope, binoculars, or camera.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662414810P | 2016-10-31 | 2016-10-31 | |
US62/414,810 | 2016-10-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018078633A1 true WO2018078633A1 (en) | 2018-05-03 |
Family
ID=62024508
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IL2017/051180 WO2018078633A1 (en) | 2016-10-31 | 2017-10-30 | Reflector eye sight with compact beam combiner |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2018078633A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108761794A (en) * | 2018-07-09 | 2018-11-06 | 深圳市昊日科技有限公司 | A kind of AR imaging systems based on transparent screen |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040263981A1 (en) * | 2003-06-27 | 2004-12-30 | Coleman Christopher L. | Diffractive optical element with anti-reflection coating |
US6867927B2 (en) * | 2002-03-11 | 2005-03-15 | Eastman Kodak Company | Transparent surface formed complex polymer lenses |
US20120002294A1 (en) * | 2009-02-25 | 2012-01-05 | Carl Zeiss Ag | Beam combiner for use in a head-mounted display device and beam splitter |
US20140146394A1 (en) * | 2012-11-28 | 2014-05-29 | Nigel David Tout | Peripheral display for a near-eye display device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17865248 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 14/08/2019) |
122 | Ep: pct application non-entry in european phase |
Ref document number: 17865248 Country of ref document: EP Kind code of ref document: A1 |