WO2023129289A1 - Eyewear electronic tinting lens with integrated waveguide - Google Patents

Eyewear electronic tinting lens with integrated waveguide

Info

Publication number
WO2023129289A1
WO2023129289A1 (PCT/US2022/049760)
Authority
WO
WIPO (PCT)
Prior art keywords
lens
waveguide
electronic
tinting
eyewear
Prior art date
Application number
PCT/US2022/049760
Other languages
French (fr)
Inventor
David FLISZAR
Amit Singh
Original Assignee
Snap Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US17/586,265 external-priority patent/US20230204958A1/en
Application filed by Snap Inc. filed Critical Snap Inc.
Publication of WO2023129289A1 publication Critical patent/WO2023129289A1/en

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 Head-up displays
    • G02B 27/017 Head mounted
    • G02B 27/0172 Head mounted characterised by optical features
    • G02B 27/0081 Optical systems or apparatus with means for altering, e.g. enlarging, the entrance or exit pupil
    • G02B 27/0093 Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G02B 2027/0118 Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brilliance control visibility
    • G02B 2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G02B 2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G02C SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C 7/00 Optical parts
    • G02C 7/10 Filters, e.g. for facilitating adaptation of the eyes to the dark; Sunglasses
    • G02C 7/101 Filters having an electro-optical light valve
    • G02C 11/00 Non-optical adjuncts; Attachment thereof
    • G02C 11/10 Electronic devices other than hearing aids

Definitions

  • the present subject matter relates to an eyewear device, e.g., smart glasses and see-through displays.
  • Portable eyewear devices such as smart glasses, headwear, and headgear available today integrate cameras and see-through displays.
  • the see-through displays render an image viewable by a user.
  • FIG. 1A is a side view of an example hardware configuration of an eyewear device, which shows a right optical assembly with an image display, and field of view adjustments are applied to a user interface presented on the image display based on detected head or eye movement by a user;
  • FIG. 1B is a top cross-sectional view of a temple of the eyewear device of FIG. 1A depicting a visible light camera, a head movement tracker for tracking the head movement of the user of the eyewear device, and a circuit board;
  • FIG. 2A is a rear view of an example hardware configuration of an eyewear device, which includes an eye scanner on a frame, for use in a system for identifying a user of the eyewear device;
  • FIG. 2B is a rear view of an example hardware configuration of another eyewear device, which includes an eye scanner on a temple, for use in a system for identifying a user of the eyewear device;
  • FIGS. 2C and 2D are rear views of example hardware configurations of the eyewear device, including two different types of image displays.
  • FIG. 3 shows a rear perspective view of the eyewear device of FIG. 2A depicting an infrared emitter, an infrared camera, a frame front, a frame back, and a circuit board;
  • FIG. 4 is a cross-sectional view taken through the infrared emitter and the frame of the eyewear device of FIG. 3;
  • FIG. 5 illustrates detecting eye gaze direction;
  • FIG. 6 illustrates detecting eye position;
  • FIG. 7 depicts an example of visible light captured by the left visible light camera as a left raw image and visible light captured by the right visible light camera as a right raw image;
  • FIG. 8A illustrates a front view of the frame including electronic tinting lenses;
  • FIG. 8B illustrates an exploded view of the frame including the electronic tinting lenses and waveguides;
  • FIG. 8C illustrates a cross-sectional view of the integrated electronic tinting lens and waveguide;
  • FIG. 9 illustrates a block diagram of electronic components of the eyewear device.
  • FIG. 10 illustrates a method of operating the eyewear.
  • This disclosure is directed to eyewear having an electronic tinting lens for controlling a light transmissive property of an optical assembly.
  • the electronic tinting lens has a substrate and a separator that is integrated with the waveguide to reduce the number of layers of the electronic tinting lens and therefore reduce the lens weight.
  • the upper protective glass layer of the waveguide is used as a substrate for the electronic tinting lens, wherein the electronic tinting lens substrate and the waveguide substrate each include an electrode configured to control a tint of the electronic tinting lens.
  • the glass layer additionally encapsulates the waveguide to protect the waveguide against environmental factors, such as moisture.
  • the electronic tinting lens can be an electrochromic lens, a liquid crystal with dye lens, or another form of electronic tinting lens.
  • Coupled refers to any logical, optical, physical, or electrical connection, link, or the like by which signals or light produced or supplied by one system element are imparted to another coupled element. Unless described otherwise, coupled elements or devices are not necessarily directly connected to one another and may be separated by intermediate components, elements or communication media that may modify, manipulate, or carry the light or signals.
  • the orientations of the eyewear device, associated components and any complete devices incorporating an eye scanner and camera such as shown in any of the drawings, are given by way of example only, for illustration and discussion purposes.
  • the eyewear device may be oriented in any other direction suitable to the particular application of the eyewear device, for example up, down, sideways, or any other orientation.
  • any directional term such as front, rear, inwards, outwards, towards, left, right, lateral, longitudinal, up, down, upper, lower, top, bottom and side, are used by way of example only, and are not limiting as to direction or orientation of any optic or component of an optic constructed as otherwise described herein.
  • FIG. 1A is a side view of an example hardware configuration of an eyewear device 100, which includes a right optical assembly 180B with an image display 180D (FIG. 2A).
  • Eyewear device 100 includes multiple visible light cameras 114A-B (FIG. 7) that form a stereo camera, of which the right visible light camera 114B is located on a right temple 110B.
  • the left and right visible light cameras 114A-B have an image sensor that is sensitive to the visible light range wavelength.
  • Each of the visible light cameras 114A-B has a different frontward facing angle of coverage, for example, visible light camera 114B has the depicted angle of coverage 111B.
  • the angle of coverage is the angle range over which the image sensor of the visible light camera 114A-B picks up electromagnetic radiation and generates images.
  • Examples of such visible light cameras 114A-B include a high-resolution complementary metal-oxide-semiconductor (CMOS) image sensor and a video graphic array (VGA) camera, such as 640p (e.g., 640 x 480 pixels for a total of 0.3 megapixels), 720p, or 1080p.
  • Image sensor data from the visible light cameras 114A-B are captured along with geolocation data, digitized by an image processor, and stored in a memory.
  • visible light cameras 114A-B may be coupled to an image processor (element 912 of FIG. 9) for digital processing along with a timestamp in which the image of the scene is captured.
  • Image processor 912 includes circuitry to receive signals from the visible light cameras 114A-B and process those signals into a format suitable for storage in the memory (element 934 of FIG. 9). The timestamp can be added by the image processor 912 or other processor, which controls operation of the visible light cameras 114A-B.
  • Visible light cameras 114A-B allow the stereo camera to simulate human binocular vision. Stereo cameras provide the ability to reproduce three-dimensional images (element 715 of FIG. 7) based on two captured images (elements 758A-B of FIG. 7) having the same timestamp.
  • the pair of images 758A-B are generated at a given moment in time - one image for each of the left and right visible light cameras 114A-B.
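Purely as an illustration of the pairing described above (not an implementation from the disclosure), same-moment left and right frames can be matched by their capture timestamps; the Frame type, pair_stereo_frames name, and tolerance value are all hypothetical:

    from dataclasses import dataclass

    @dataclass
    class Frame:
        timestamp_us: int  # capture time added by the image processor
        pixels: bytes      # raw image data from one camera

    def pair_stereo_frames(left, right, tolerance_us=1000):
        """Pair each left frame with the nearest-in-time right frame so both
        images of a pair represent the same moment, as needed for 3D images."""
        right_sorted = sorted(right, key=lambda f: f.timestamp_us)
        if not right_sorted:
            return []
        pairs = []
        for lf in left:
            rf = min(right_sorted, key=lambda f: abs(f.timestamp_us - lf.timestamp_us))
            if abs(rf.timestamp_us - lf.timestamp_us) <= tolerance_us:
                pairs.append((lf, rf))  # a same-timestamp stereo pair
        return pairs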
  • a user interface field of view adjustment system includes the eyewear device 100.
  • the eyewear device 100 includes a frame 105, a right temple 110B extending from a right lateral side 170B of the frame 105, and a see-through image display 180D (FIGS. 2A-B) comprising optical assembly 180B to present a graphical user interface to a user.
  • the eyewear device 100 includes the left visible light camera 114A connected to the frame 105 or the left temple 110A to capture a first image of the scene.
  • Eyewear device 100 further includes the right visible light camera 114B connected to the frame 105 or the right temple 110B to capture (e.g., simultaneously with the left visible light camera 114A) a second image of the scene which partially overlaps the first image.
  • the user interface field of view adjustment system further includes the processor 932 coupled to the eyewear device 100 and connected to the visible light cameras 114A-B, the memory 934 accessible to the processor 932, and programming in the memory 934, for example in the eyewear device 100 itself or another part of the user interface field of view adjustment system.
  • the eyewear device 100 also includes a head movement tracker (element 109 of FIG. 1B) or an eye movement tracker (element 213 of FIG. 2B).
  • Eyewear device 100 further includes the see-through image displays 180C-D of optical assembly 180A-B, respectively, for presenting a sequence of displayed images, and an image display driver (element 942 of FIG. 9) coupled to the see-through image displays 180C-D of optical assembly 180A-B to control the image displays 180C-D of optical assembly 180A-B to present the sequence of displayed images 715, which are described in further detail below.
  • Eyewear device 100 further includes the memory 934 and the processor 932 having access to the image display driver 942 and the memory 934.
  • Eyewear device 100 further includes programming (element 934 of FIG. 9) in the memory. Execution of the programming by the processor 932 configures the eyewear device 100 to perform functions, including functions to present, via the see-through image displays 180C-D, an initial displayed image of the sequence of displayed images, the initial displayed image having an initial field of view corresponding to an initial head direction or an initial eye gaze direction (element 230 of FIG. 5).
  • Execution of the programming by the processor 932 further configures the eyewear device 100 to detect movement of a user of the eyewear device by: (i) tracking, via the head movement tracker (element 109 of FIG. 1B), a head movement of a head of the user, or (ii) tracking, via an eye movement tracker (element 213 of FIG. 2B, FIG. 5), an eye movement of an eye of the user of the eyewear device 100.
  • Execution of the programming by the processor 932 further configures the eyewear device 100 to determine a field of view adjustment to the initial field of view of the initial displayed image based on the detected movement of the user.
  • the field of view adjustment includes a successive field of view corresponding to a successive head direction or a successive eye direction.
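The field of view adjustment described in the preceding items can be sketched, for illustration only, as shifting the displayed view by the detected change in direction; the function name and the (horizontal, vertical) degree-tuple representation are assumptions, not taken from the disclosure:

    def adjust_field_of_view(fov_center, initial_direction, successive_direction):
        """Offset the displayed field of view by the detected change in head
        or eye direction; all arguments are (horizontal_deg, vertical_deg)."""
        dx = successive_direction[0] - initial_direction[0]
        dy = successive_direction[1] - initial_direction[1]
        return (fov_center[0] + dx, fov_center[1] + dy)  # successive field of view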
  • FIG. 1B is a top cross-sectional view of the temple of the eyewear device 100 of FIG. 1A depicting the right visible light camera 114B, a head movement tracker 109, and a circuit board. Construction and placement of the left visible light camera 114A is substantially similar to the right visible light camera 114B, except the connections and coupling are on the left lateral side 170A.
  • the eyewear device 100 includes the right visible light camera 114B and a circuit board, which may be a flexible printed circuit board (PCB) 140.
  • the right hinge 126B connects the right temple 110B to a right temple 125B of the eyewear device 100.
  • components of the right visible light camera 114B, the flexible PCB 140, or other electrical connectors or contacts may be located on the right temple 125B or the right hinge 126B.
  • eyewear device 100 has a head movement tracker 109, which includes, for example, an inertial measurement unit (IMU).
  • An inertial measurement unit is an electronic device that measures and reports a body’s specific force, angular rate, and sometimes the magnetic field surrounding the body, using a combination of accelerometers and gyroscopes, sometimes also magnetometers.
  • the inertial measurement unit works by detecting linear acceleration using one or more accelerometers and rotational rate using one or more gyroscopes.
  • Typical configurations of inertial measurement units contain one accelerometer, gyro, and magnetometer per axis for each of the three axes: a horizontal axis for left-right movement (X), a vertical axis (Y) for top-bottom movement, and a depth or distance axis for front-to-back movement (Z).
  • the accelerometer detects the gravity vector.
  • the magnetometer defines the rotation in the magnetic field (e.g., facing south, north, etc.) like a compass which generates a heading reference.
  • the three accelerometers detect acceleration along the horizontal, vertical, and depth axes defined above, which can be defined relative to the ground, the eyewear device 100, or the user wearing the eyewear device 100.
  • Eyewear device 100 detects movement of the user of the eyewear device 100 by tracking, via the head movement tracker 109, the head movement of the head of the user.
  • the head movement includes a variation of head direction on a horizontal axis, a vertical axis, or a combination thereof from the initial head direction during presentation of the initial displayed image on the image display.
  • tracking, via the head movement tracker 109, the head movement of the head of the user includes measuring, via the inertial measurement unit 109, the initial head direction on the horizontal axis (e.g., X axis), the vertical axis (e.g., Y axis), or the combination thereof (e.g., transverse or diagonal movement).
  • Tracking, via the head movement tracker 109, the head movement of the head of the user further includes measuring, via the inertial measurement unit 109, a successive head direction on the horizontal axis, the vertical axis, or the combination thereof during presentation of the initial displayed image.
  • Tracking, via the head movement tracker 109, the head movement of the head of the user further includes determining the variation of head direction based on both the initial head direction and the successive head direction.
  • Detecting movement of the user of the eyewear device 100 further includes in response to tracking, via the head movement tracker 109, the head movement of the head of the user, determining that the variation of head direction exceeds a deviation angle threshold on the horizontal axis, the vertical axis, or the combination thereof.
  • the deviation angle threshold is between about 3° and 10°.
  • the term “about” when referring to an angle means ± 10% from the stated amount.
  • the eyewear device 100 may power down.
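The threshold test above reduces to a simple comparison. A minimal sketch follows; the 5° default is one assumed value inside the stated 3° to 10° range, and all names are hypothetical rather than from the disclosure:

    DEVIATION_THRESHOLD_DEG = 5.0  # assumed value within the ~3°-10° range above

    def head_movement_detected(initial_dir, successive_dir,
                               threshold=DEVIATION_THRESHOLD_DEG):
        """Return True when the variation of head direction on the horizontal
        or vertical axis exceeds the deviation angle threshold."""
        dx = abs(successive_dir[0] - initial_dir[0])  # horizontal (X) axis
        dy = abs(successive_dir[1] - initial_dir[1])  # vertical (Y) axis
        return dx > threshold or dy > threshold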
  • the right temple 110B includes temple body 211 and a temple cap, with the temple cap omitted in the cross-section of FIG. 1B.
  • various interconnected circuit boards, such as PCBs or flexible PCBs, include controller circuits for the right visible light camera 114B, microphone(s) 130, speaker(s) 132, low-power wireless circuitry (e.g., for wireless short-range network communication via Bluetooth™), and high-speed wireless circuitry (e.g., for wireless local area network communication via WiFi).
  • the right visible light camera 114B is coupled to or disposed on the flexible PCB 140 and covered by a visible light camera cover lens, which is aimed through opening(s) formed in the right temple 110B.
  • the frame 105 connected to the right temple 110B includes the opening(s) for the visible light camera cover lens.
  • the frame 105 includes a front-facing side configured to face outwards away from the eye of the user.
  • the opening for the visible light camera cover lens is formed on and through the front-facing side.
  • the right visible light camera 114B has an outward facing angle of coverage 111B with a line of sight or perspective of the right eye of the user of the eyewear device 100.
  • the visible light camera cover lens can also be adhered to an outward facing surface of the right temple 110B in which an opening is formed with an outwards facing angle of coverage, but in a different outwards direction.
  • the coupling can also be indirect via intervening components.
  • Left (first) visible light camera 114A is connected to the left see-through image display 180C of left optical assembly 180A to generate a first background scene of a first successive displayed image.
  • the right (second) visible light camera 114B is connected to the right see-through image display 180D of right optical assembly 180B to generate a second background scene of a second successive displayed image.
  • the first background scene and the second background scene partially overlap to present a three-dimensional observable area of the successive displayed image.
  • FIG. 2A is a rear view of an example hardware configuration of an eyewear device 100, which includes an eye scanner 113 on a frame 105, for use in a system for determining an eye position and gaze direction of a wearer/user of the eyewear device 100.
  • the eyewear device 100 is in a form configured for wearing by a user, which are eyeglasses in the example of FIG. 2A.
  • the eyewear device 100 can take other forms and may incorporate other types of frameworks, for example, a headgear, a headset, or a helmet.
  • eyewear device 100 includes the frame 105 which includes the left rim 107A connected to the right rim 107B via the bridge 106 adapted for a nose of the user.
  • the left and right rims 107A-B include respective apertures 175A-B which hold the respective optical element 180A-B, such as a lens and the see-through displays 180C-D.
  • the term lens is meant to cover transparent or translucent pieces of glass or plastic having curved and flat surfaces that cause light to converge/diverge or that cause little or no convergence/divergence.
  • eyewear device 100 can include other arrangements, such as a single optical element depending on the application or intended user of the eyewear device 100.
  • eyewear device 100 includes the left temple 110A adjacent the left lateral side 170A of the frame 105 and the right temple 110B adjacent the right lateral side 170B of the frame 105.
  • the temples 110A-B may be integrated into the frame 105 on the respective sides 170A-B (as illustrated) or implemented as separate components attached to the frame 105 on the respective sides 170A-B.
  • the temples 110A-B may be integrated into temples (not shown) attached to the frame 105.
  • the eye scanner 113 includes an infrared emitter 115 and an infrared camera 120.
  • the infrared camera 120 is a visible light camera, such as a low-resolution video graphic array (VGA) camera (e.g., 640 x 480 pixels for a total of 0.3 megapixels), with the blue filter removed.
  • the infrared emitter 115 and the infrared camera 120 are co-located on the frame 105, for example, both are shown as connected to the upper portion of the left rim 107A.
  • the frame 105 or one or more of the left and right temples 110A-B include a circuit board (not shown) that includes the infrared emitter 115 and the infrared camera 120.
  • the infrared emitter 115 and the infrared camera 120 can be connected to the circuit board by soldering, for example.
  • Different arrangements of the infrared emitter 115 and infrared camera 120 can be implemented, including arrangements in which the infrared emitter 115 and infrared camera 120 are both on the right rim 107B, or in different locations on the frame 105, for example, the infrared emitter 115 on the left rim 107A and the infrared camera 120 on the right rim 107B. In another example, the infrared emitter 115 is on the frame 105 and the infrared camera 120 is on one of the temples 110A-B, or vice versa.
  • the infrared emitter 115 can be connected essentially anywhere on the frame 105, left temple 110A, or right temple 110B to emit a pattern of infrared light.
  • the infrared camera 120 can be connected essentially anywhere on the frame 105, left temple 110A, or right temple 110B to capture at least one reflection variation in the emitted pattern of infrared light.
  • the infrared emitter 115 and infrared camera 120 are arranged to face inwards towards an eye of the user with a partial or full field of view of the eye in order to identify the respective eye position and gaze direction.
  • the infrared emitter 115 and infrared camera 120 are positioned directly in front of the eye, in the upper part of the frame 105 or in the temples 110A-B at either end of the frame 105.
  • FIG. 2B is a rear view of an example hardware configuration of another eyewear device 200.
  • the eyewear device 200 is depicted as including an eye scanner 213 on a right temple 210B.
  • an infrared emitter 215 and an infrared camera 220 are co-located on the right temple 210B.
  • the eye scanner 213 or one or more components of the eye scanner 213 can be located on the left temple 210A and other locations of the eyewear device 200, for example, the frame 105.
  • the infrared emitter 215 and infrared camera 220 are like those of FIG. 2A, but the eye scanner 213 can be varied to be sensitive to different light wavelengths as described previously in FIG. 2A.
  • the eyewear device 200 includes a frame 105 which includes a left rim 107A which is connected to a right rim 107B via a bridge 106; and the left and right rims 107A-B include respective apertures which hold the respective optical elements 180A-B comprising the see-through display 180C-D.
  • FIGS. 2C-D are rear views of example hardware configurations of the eyewear device 100, including two different types of see-through image displays 180C-D.
  • these see-through image displays 180C-D of optical assembly 180A-B include an integrated image display.
  • the optical assemblies 180A-B include a suitable display matrix 180C-D of any suitable type, such as a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, a waveguide display, or any other such display.
  • the optical assembly 180A-B also includes an optical layer or layers 176, which can include lenses, optical coatings, prisms, mirrors, waveguides, optical strips, and other optical components in any combination.
  • the optical layers 176A-N can include a prism having a suitable size and configuration and including a first surface for receiving light from the display matrix and a second surface for emitting light to the eye of the user.
  • the prism of the optical layers 176A-N extends over all or at least a portion of the respective apertures 175A-B formed in the left and right rims 107A-B to permit the user to see the second surface of the prism when the eye of the user is viewing through the corresponding left and right rims 107A-B.
  • the first surface of the prism of the optical layers 176A-N faces upwardly from the frame 105 and the display matrix overlies the prism so that photons and light emitted by the display matrix impinge the first surface.
  • the prism is sized and shaped so that the light is refracted within the prism and is directed towards the eye of the user by the second surface of the prism of the optical layers 176A-N.
  • the second surface of the prism of the optical layers 176A-N can be convex to direct the light towards the center of the eye.
  • the prism can optionally be sized and shaped to magnify the image projected by the see-through image displays 180C-D, and the light travels through the prism so that the image viewed from the second surface is larger in one or more dimensions than the image emitted from the see-through image displays 180C-D.
  • the see-through image displays 180C-D of optical assembly 180A-B include a projection image display as shown in FIG. 2D.
  • the optical assembly 180A-B includes a laser projector 150, which is a three-color laser projector using a scanning mirror or galvanometer.
  • an optical source such as a laser projector 150 is disposed in or on one of the temples 125A-B of the eyewear device 100.
  • Optical assembly 180A-B includes one or more optical strips 155A-N spaced apart across the width of the lens of the optical assembly 180A-B or across a depth of the lens between the front surface and the rear surface of the lens.
  • As the photons projected by the laser projector 150 travel across the lens of the optical assembly 180A-B, the photons encounter the optical strips 155A-N.
  • the photon is either redirected towards the user’s eye, or it passes to the next optical strip.
  • a combination of modulation of laser projector 150, and modulation of optical strips may control specific photons or beams of light.
  • a processor controls optical strips 155A-N by initiating mechanical, acoustic, or electromagnetic signals.
  • the eyewear device 100 can include other arrangements, such as a single or three optical assemblies, or the optical assembly 180A-B may have a different arrangement depending on the application or intended user of the eyewear device 100.
  • eyewear device 100 includes a left temple 110A adjacent the left lateral side 170A of the frame 105 and a right temple 110B adjacent the right lateral side 170B of the frame 105.
  • the temples 110A-B may be integrated into the frame 105 on the respective lateral sides 170A-B (as illustrated) or implemented as separate components attached to the frame 105 on the respective sides 170A-B.
  • the temples 110A-B may be integrated into temples 125A-B attached to the frame 105.
  • the see-through image displays include the first see-through image display 180C and the second see-through image display 180D.
  • Eyewear device 100 includes first and second apertures 175A-B which hold the respective first and second optical assembly 180A-B.
  • the first optical assembly 180A includes the first see-through image display 180C (e.g., a display matrix of FIG. 2C or optical strips and a projector (not shown)).
  • the second optical assembly 180B includes the second see-through image display 180D (e.g., a display matrix of FIG. 2C or optical strips 155A-N and a projector 150).
  • the successive field of view of the successive displayed image includes an angle of view between about 15° and 30°, and more specifically 24°, measured horizontally, vertically, or diagonally.
  • the successive displayed image having the successive field of view represents a combined three-dimensional observable area visible through stitching together of two displayed images presented on the first and second image displays.
  • an angle of view describes the angular extent of the field of view associated with the displayed images presented on each of the left and right image displays 180C-D of optical assembly 180A-B.
  • the “angle of coverage” describes the angle range that a lens of visible light cameras 114A-B or infrared camera 220 can image.
  • the image circle produced by a lens is large enough to cover the film or sensor completely, possibly including some vignetting (i.e., a reduction of an image's brightness or saturation toward the periphery compared to the image center).
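For reference, the angle of coverage of a camera lens follows the standard pinhole relationship between sensor dimension and focal length; this formula and the example numbers are general optics background and an assumption for illustration, not values from the disclosure:

    import math

    def angle_of_coverage_deg(sensor_dim_mm, focal_length_mm):
        """Angle range a lens can image across one sensor dimension
        (standard pinhole/thin-lens approximation)."""
        return math.degrees(2 * math.atan(sensor_dim_mm / (2 * focal_length_mm)))

    # Example with assumed numbers: a 4.8 mm-wide sensor behind a 6 mm lens
    # yields an angle of coverage of about 43.6 degrees.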
  • FIG. 3 shows a rear perspective view of the eyewear device of FIG. 2A.
  • the eyewear device 100 includes an infrared emitter 215, infrared camera 220, a frame front 330, a frame back 335, and a circuit board 340. It can be seen in FIG. 3 that the upper portion of the left rim of the frame of the eyewear device 100 includes the frame front 330 and the frame back 335. An opening for the infrared emitter 215 is formed on the frame back 335. As shown in the encircled cross-section 4 in the upper middle portion of the left rim of the frame, a circuit board, which is a flexible PCB 340, is sandwiched between the frame front 330 and the frame back 335.
  • components of the eye movement tracker 213, including the infrared emitter 215, the flexible PCB 340, or other electrical connectors or contacts may be located on the left temple 325A or the left hinge 126A.
  • FIG. 4 is a cross-sectional view through the infrared emitter 215 and the frame corresponding to the encircled cross-section 4 of the eyewear device of FIG. 3. Multiple layers of the eyewear device 100 are illustrated in the cross-section of FIG. 4; as shown, the frame includes the frame front 330 and the frame back 335.
  • the flexible PCB 340 is disposed on the frame front 330 and connected to the frame back 335.
  • the infrared emitter 215 is disposed on the flexible PCB 340 and covered by an infrared emitter cover lens 445. For example, the infrared emitter 215 is reflowed to the back of the flexible PCB 340.
  • Reflowing attaches the infrared emitter 215 to contact pad(s) formed on the back of the flexible PCB 340 by subjecting the flexible PCB 340 to controlled heat which melts a solder paste to connect the two components.
  • reflowing is used to surface mount the infrared emitter 215 on the flexible PCB 340 and electrically connect the two components.
  • through-holes can be used to connect leads from the infrared emitter 215 to the flexible PCB 340 via interconnects, for example.
  • the frame back 335 includes an infrared emitter opening 450 for the infrared emitter cover lens 445.
  • the infrared emitter opening 450 is formed on a rear-facing side of the frame back 335 that is configured to face inwards towards the eye of the user.
  • the flexible PCB 340 can be connected to the frame front 330 via the flexible PCB adhesive 460.
  • the infrared emitter cover lens 445 can be connected to the frame back 335 via infrared emitter cover lens adhesive 455.
  • the coupling can also be indirect via intervening components.
  • the processor 932 utilizes eye tracker 213 to determine an eye gaze direction 230 of a wearer’s eye 234 as shown in FIG. 5, and an eye position 236 of the wearer’s eye 234 within an eyebox as shown in FIG. 6.
  • the eye tracker 213 is a scanner which uses infrared light illumination (e.g., near-infrared, short-wavelength infrared, mid-wavelength infrared, long-wavelength infrared, or far infrared) to capture images of reflection variations of infrared light from the eye 234 to determine the gaze direction 230 of a pupil 232 of the eye 234, and also the eye position 236 with respect to the see-through display 180D.
  • FIG. 7 depicts an example of capturing visible light with cameras 114A-B. Visible light is captured by the left visible light camera 114A with a round field of view (FOV) 111A. A chosen rectangular left raw image 758A is used for image processing by image processor 912 (FIG. 9). Visible light is captured by the right visible light camera 114B with a round FOV 111B. A rectangular right raw image 758B chosen by the image processor 912 is used for image processing by processor 912.
  • a three-dimensional image 715 of a three-dimensional scene is generated by processor 912 and displayed by displays 180C and 180D and which is viewable by the user.
  • FIG. 8A illustrates a front view 800 of the frame 105 of the eyewear 100 having a left frame opening 802A and a right frame opening 802B that support a powered electronic tinting lens 804A and 804B, respectively, and a waveguide stack 810A and 810B, respectively.
  • the electronic tinting lenses 804A and 804B comprise a substrate 806A and 806B and a separator 808A and 808B, respectively, and are each configured to receive electrical control signals that establish different tint states of the separator 808A and 808B, respectively (FIGS. 8B-8C).
  • the separators 808A and 808B are directly coupled to an upper layer of waveguide stack 810A and 810B, respectively, as shown in FIGS. 8B-8C.
  • Each waveguide stack 810A-B is configured to generate images viewable by a user of the eyewear 100.
  • the waveguide stack 810A consists of two substrates: a first waveguide substrate 812A and a second waveguide substrate 813A.
  • the term waveguide stack is also referred to as a waveguide in this disclosure.
  • the left electronic tinting lens 804A is part of the left optical assembly 180A and the right electronic tinting lens 804B is part of the right optical assembly 180B.
  • Each of the separators 808A and 808B are configured to be controlled by a processor 932 (FIG. 9) to establish a selected tint state.
  • the separators 808A and 808B are used to optimize image readability in different lighting conditions.
  • the separators 808A and 808B are configured in a first state to have a dark tint to improve visibility and contrast of an image displayed by the waveguide stacks 810A and 810B, respectively (FIG. 8C), in bright outdoor conditions.
  • the separators 808A and 808B are configured to have a light tint or no tint for indoor conditions.
  • the eyewear 100 is suitable for use with augmented reality (AR) features of the eyewear 100.
  • the electronic tinting lens 804A includes the substrate 806A coupled to the waveguide stack 810A by an adhesive 818 with the separator 808A disposed between the substrate 806A and the waveguide stack 810A.
  • the separator 808A is an electronic tinting electrolyte.
  • the separator 808A is a liquid crystal with dye material.
  • a first electrode 814A is coupled to an inside face of the substrate 806A to interact with the separator 808A.
  • a second electrode 816A is coupled to an upper face of the first waveguide substrate 812A to interact with the separator 808A.
  • the electrodes 814A and 816A control the tinting of the respective separator disposed between them as a function of the control signals from the processor 932.
  • the electrodes 814A and 816A are a coating of visibly transparent Indium Tin Oxide (ITO) formed on the respective surface of the first substrate 806A and the first waveguide substrate 812A.
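How the processor 932 might map a tint command onto the electrode pair can be sketched as below; the voltage limit, the dac_write callback into the drive electronics, and the linear mapping are all assumptions for illustration, not details from the disclosure:

    MAX_DRIVE_VOLTS = 3.0  # assumed drive limit for the ITO electrode pair

    def set_lens_tint(dac_write, tint_level):
        """Drive electrodes 814A/816A from a normalized tint level
        (0.0 = clear, 1.0 = darkest)."""
        level = min(max(tint_level, 0.0), 1.0)  # clamp to the valid range
        dac_write(level * MAX_DRIVE_VOLTS)      # voltage across the separator 808A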
  • the substrate 806A of the electronic tinting lens 804A is made of a plastic material to reduce overall system weight, as plastic is lighter than glass.
  • the first waveguide substrate 812A is made of glass to provide protection for the waveguide stack 810A.
  • the right electronic tinting lens 804B and waveguide stack 810B are identical in construction to the left electronic tinting lens 804A and waveguide stack 810A.
  • This integration is done by utilizing the first waveguide substrate 812A as a second substrate for the electronic tinting lenses 804A and 804B, respectively.
  • the first waveguide substrate 812A is additionally used as a protective cover, and also to encapsulate the waveguide stacks 810A and 810B, respectively.
  • the first waveguide substrate 812A maintains the protection needed for the waveguide stack 810A, against environmental factors, such as moisture.
  • the use of glass for the second substrate (the first waveguide substrate 812A) additionally provides increased flatness and rigidity to the electronic tinting lens 804A as compared to the traditional use of plastic.
  • FIG. 9 depicts a high-level functional block diagram including example electronic components disposed in eyewear 100 and 200.
  • the illustrated electronic components include the processor 932, the memory 934, and the see-through image display 180C and 180D.
  • Memory 934 includes instructions for execution by processor 932 to implement functionality of eyewear 100/200, including instructions for processor 932 to control presentation of the image 715.
  • Processor 932 receives power from a battery (not shown) and executes the instructions stored in memory 934, or integrated with the processor 932 on-chip, to perform functionality of eyewear 100/200 and to communicate with external devices via wireless connections.
  • a user interface adjustment system 900 includes a wearable device, which is the eyewear device 100 with an eye movement tracker 213 (e.g., shown as infrared emitter 215 and infrared camera 220 in FIG. 2B).
  • User interface adjustment system 900 also includes a mobile device 990 and a server system 998 connected via various networks.
  • Mobile device 990 may be a smartphone, tablet, laptop computer, access point, or any other such device capable of connecting with eyewear device 100 using both a low-power wireless connection 925 and a high-speed wireless connection 937.
  • Mobile device 990 is connected to server system 998 and network 995.
  • the network 995 may include any combination of wired and wireless connections.
  • Eyewear device 100 includes at least two visible light cameras 114A-B (one associated with the left lateral side 170A and one associated with the right lateral side 170B). Eyewear device 100 further includes two see-through image displays 180C-D of the optical assembly 180A-B (one associated with the left lateral side 170A and one associated with the right lateral side 170B). Eyewear device 100 also includes image display driver 942, image processor 912, low-power circuitry 920, and high-speed circuitry 930. The components shown in FIG. 9 for the eyewear devices 100 and 200 are located on one or more circuit boards, for example a PCB or flexible PCB, in the temples.
  • Left and right visible light cameras 114A-B can include digital camera elements such as a complementary metal-oxide-semiconductor (CMOS) image sensor, charge coupled device, a lens, or any other respective visible or light capturing elements that may be used to capture data, including images of scenes with unknown objects.
  • Eye movement tracking programming implements the user interface field of view adjustment instructions, including, to cause the eyewear device 100 to track, via the eye movement tracker 213, the eye movement of the eye of the user of the eyewear device 100.
  • Other implemented instructions cause the eyewear devices 100 and 200 to determine the FOV adjustment to the initial FOV 111A-B based on the detected eye movement of the user corresponding to a successive eye direction.
  • Further implemented instructions generate a successive displayed image of the sequence of displayed images based on the field of view adjustment. The successive displayed image is produced as visible output to the user via the user interface. This visible output appears on the see-through image displays 180C-D of optical assembly 180A-B, which is driven by image display driver 942 to present the sequence of displayed images, including the initial displayed image with the initial field of view and the successive displayed image with the successive field of view.
  • high-speed circuitry 930 includes high-speed processor 932, memory 934, and high-speed wireless circuitry 936.
  • the image display driver 942 is coupled to the high-speed circuitry 930 and operated by the high-speed processor 932 in order to drive the left and right image displays 180C-D of the optical assembly 180A-B.
  • High-speed processor 932 may be any processor capable of managing high-speed communications and operation of any general computing system needed for eyewear device 100.
  • High-speed processor 932 includes processing resources needed for managing high-speed data transfers on high-speed wireless connection 937 to a wireless local area network (WLAN) using high-speed wireless circuitry 936.
  • the high-speed processor 932 executes an operating system such as a LINUX operating system or other such operating system of the eyewear device 100 and the operating system is stored in memory 934 for execution. In addition to any other responsibilities, the high-speed processor 932 executing a software architecture for the eyewear device 100 is used to manage data transfers with high-speed wireless circuitry 936.
  • high-speed wireless circuitry 936 is configured to implement Institute of Electrical and Electronics Engineers (IEEE) 802.11 communication standards, also referred to herein as Wi-Fi. In other examples, other high-speed communications standards may be implemented by high-speed wireless circuitry 936.
  • Low-power wireless circuitry 924 and the high-speed wireless circuitry 936 of the eyewear devices 100 and 200 can include short-range transceivers (Bluetooth™) and wireless wide-area or local-area network transceivers (e.g., cellular or WiFi).
  • Mobile device 990, including the transceivers communicating via the low-power wireless connection 925 and high-speed wireless connection 937, may be implemented using details of the architecture of the eyewear device 100, as can other elements of network 995.
  • Memory 934 includes any storage device capable of storing various data and applications, including, among other things, color maps, camera data generated by the left and right visible light cameras 114A-B and the image processor 912, as well as images generated for display by the image display driver 942 on the see-through image displays 180C-D of the optical assembly 180A-B. While memory 934 is shown as integrated with high-speed circuitry 930, in other examples, memory 934 may be an independent standalone element of the eyewear device 100. In certain such examples, electrical routing lines may provide a connection through a chip that includes the high-speed processor 932 from the image processor 912 or low-power processor 922 to the memory 934. In other examples, the high-speed processor 932 may manage addressing of memory 934 such that the low-power processor 922 will boot the high-speed processor 932 any time that a read or write operation involving memory 934 is needed.
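The boot-on-demand arrangement described above can be sketched roughly as follows; the class name and the callbacks standing in for the actual power-management hardware are hypothetical, not from the disclosure:

    import threading

    class MemoryGate:
        """Boot the high-speed processor 932 on demand before any read or
        write involving memory 934 (sketch; callbacks are assumptions)."""

        def __init__(self, boot_high_speed, is_booted):
            self._boot = boot_high_speed  # powers up the high-speed processor
            self._is_booted = is_booted   # reports current boot state
            self._lock = threading.Lock()

        def access(self, operation):
            with self._lock:
                if not self._is_booted():
                    self._boot()          # low-power processor wakes the chip first
                return operation()        # then perform the read or write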
  • Server system 998 may be one or more computing devices as part of a service or network computing system, for example, that include a processor, a memory, and network communication interface to communicate over the network 995 with the mobile device 990 and eyewear device 100/200.
  • Eyewear devices 100 and 200 can be connected with a host computer.
  • the eyewear device 100 is paired with the mobile device 990 via the high-speed wireless connection 937 or connected to the server system 998 via the network 995.
  • Output components of the eyewear device 100 include visual components, such as the left and right image displays 180C-D of optical assembly 180A-B as described in FIGS. 2C-D (e.g., a display such as a liquid crystal display (LCD), a plasma display panel (PDP), a light emitting diode (LED) display, a projector, or a waveguide).
  • the image displays 180C-D of the optical assembly 180A-B are driven by the image display driver 942.
  • the output components of the eyewear device 100 further include acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor), other signal generators, and so forth.
  • the input components of the eyewear device 100 and 200, the mobile device 990, and server system 998 may include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point-based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instruments), tactile input components (e.g., a physical button, a touch screen that provides location and force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like.
  • Eyewear device 100 may optionally include additional peripheral device elements.
  • peripheral device elements may include ambient light and spectral sensors, biometric sensors, additional sensors, or display elements integrated with eyewear device 100.
  • peripheral device elements may include any I/O components including output components, motion components, position components, or any other such elements described herein.
  • the eyewear device 100 can take other forms and may incorporate other types of frameworks, for example, a headgear, a headset, or a helmet.
  • the biometric components of the user interface field of view adjustment system 900 include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram based identification), and the like.
  • the motion components include acceleration sensor components (e.g., accelerometer), gravitation sensor components, rotation sensor components (e.g., gyroscope), and so forth.
  • the position components include location sensor components to generate location coordinates (e.g., a Global Positioning System (GPS) receiver component), WiFi or Bluetooth™ transceivers to generate positioning system coordinates, altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like.
  • an “application” or “applications” are program(s) that execute functions defined in the programs.
  • Various programming languages can be employed to create one or more of the applications, structured in a variety of manners, such as object-oriented programming languages (e.g., Objective-C, Java, or C++) or procedural programming languages (e.g., C or assembly language).
  • a third party application, e.g., an application developed using the ANDROID™ or IOS™ software development kit (SDK) by an entity other than the vendor of the particular platform
  • the third-party application may be mobile software running on a mobile operating system such as IOS™, ANDROID™, WINDOWS® Phone, or another mobile operating system.
  • the third-party application can invoke API calls provided by the operating system to facilitate functionality described herein.
  • FIG. 10 shows a method 1000 for operating the eyewear 100 including the electronic tinting lenses 804A and 804B.
  • the processor 932 determines the light transmissive property/tinting of respective electronic tinting lenses 804A and 804B.
  • the processor 932 may determine the light transmissive property/tinting based on the ambient light detected by an ambient light sensor about the eyewear 100, or the tinting may be user selected, such as by using a control of the eyewear 100.
  • the processor 932 sends control signals to the electrodes 814 and 816 of the respective electronic tinting lenses 804A and 804B to control the light transmissive property/tinting of respective electronic tinting lenses 804A and 804B.
  • the electronic tinting lenses 804A and 804B can be adjusted to improve visibility and contrast in bright lighting conditions. Having two or more electronic tinting lenses 804A and 804B allows opacity to be tuned independently.
  • the light transmissive property/tinting of respective separators 808A and 808B of electronic tinting lens 804A and 804B is altered by the control signals.
  • the tint of the separators 808A and 808B is used to optimize user viewing of a displayed image 715 by the waveguide stack 810A and 810B, as well as real-world images.
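Putting the method of FIG. 10 together, one plausible control loop is sketched below; the ambient-light threshold, the tint levels, and all callback names are assumptions for illustration, not details from the disclosure:

    import time

    DARK_TINT, LIGHT_TINT = 1.0, 0.1   # assumed levels for outdoor/indoor use
    BRIGHT_LUX = 10_000                # assumed "bright outdoor" threshold

    def run_tint_loop(read_ambient_lux, user_selected_tint,
                      apply_tint_left, apply_tint_right, period_s=0.5):
        """Determine a tint from ambient light (or a user selection) and send
        control signals to each lens independently."""
        while True:
            tint = user_selected_tint()       # None unless the user chose a tint
            if tint is None:
                lux = read_ambient_lux()      # determine the light transmissive property
                tint = DARK_TINT if lux >= BRIGHT_LUX else LIGHT_TINT
            apply_tint_left(tint)             # lenses 804A and 804B can be
            apply_tint_right(tint)            # tuned independently
            time.sleep(period_s)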

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • General Health & Medical Sciences (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

Eyewear having an electronic tinting lens for controlling a light transmissive property of an optical assembly. The electronic tinting lens has a substrate and a separator that is integrated with the waveguide to reduce the number of layers within the electronic tinting lens and therefore reduce the weight of the assembly. The upper protective glass layer of the waveguide is used as a substrate for the electronic tinting lens, wherein the electronic tinting lens substrate and the upper waveguide substrate each include an electrode configured to control a tint of the electronic tinting lens. The glass layer additionally encapsulates the waveguide to protect the waveguide against environmental factors, such as moisture.

Description

EYEWEAR ELECTRONIC TINTING LENS WITH INTEGRATED WAVEGUIDE
Cross-Reference to Related Applications
[0001] This application claims priority to U.S. Provisional Application Serial No. 63/294,317 filed on December 28, 2021, and U.S. Application Serial No. 17/586,265 filed on January 27, 2022, the contents of both of which are incorporated fully herein by reference.
Technical Field
[0002] The present subject matter relates to an eyewear device, e.g., smart glasses and see-through displays.
Background
[0003] Portable eyewear devices, such as smart glasses, headwear, and headgear available today integrate cameras and see-through displays. The see-through displays render an image viewable by a user.
Brief Description of the Drawings
[0004] The drawing figures depict one or more implementations, by way of example only, not by way of limitations. In the figures, like reference numerals refer to the same or similar elements.
[0005] FIG. 1A is a side view of an example hardware configuration of an eyewear device, which shows a right optical assembly with an image display, and field of view adjustments are applied to a user interface presented on the image display based on detected head or eye movement by a user;
[0006] FIG. 1B is a top cross-sectional view of a temple of the eyewear device of FIG. 1A depicting a visible light camera, a head movement tracker for tracking the head movement of the user of the eyewear device, and a circuit board;
[0007] FIG. 2A is a rear view of an example hardware configuration of an eyewear device, which includes an eye scanner on a frame, for use in a system for identifying a user of the eyewear device;
[0008] FIG. 2B is a rear view of an example hardware configuration of another eyewear device, which includes an eye scanner on a temple, for use in a system for identifying a user of the eyewear device;
[0009] FIGS. 2C and 2D are rear views of example hardware configurations of the eyewear device, including two different types of image displays.
[0010] FIG. 3 shows a rear perspective view of the eyewear device of FIG. 2A depicting an infrared emitter, an infrared camera, a frame front, a frame back, and a circuit board;
[0011] FIG. 4 is a cross-sectional view taken through the infrared emitter and the frame of the eyewear device of FIG. 3;
[0012] FIG. 5 illustrates detecting eye gaze direction;
[0013] FIG. 6 illustrates detecting eye position;
[0014] FIG. 7 depicts an example of visible light captured by the left visible light camera as a left raw image and visible light captured by the right visible light camera as a right raw image;
[0015] FIG. 8A illustrates a front view of the frame including electronic tinting lenses;
[0016] FIG. 8B illustrates an exploded view of the frame including the electronic tinting lenses and waveguides;
[0017] FIG. 8C illustrates a cross-sectional view of the integrated electronic tinting lens and waveguide;
[0018] FIG. 9 illustrates a block diagram of electronic components of the eyewear device; and
[0019] FIG. 10 illustrates a method of operating the eyewear.
Detailed Description
[0020] This disclosure is directed to eyewear having an electronic tinting lens for controlling a light transmissive property of an optical assembly. The electronic tinting lens has a substrate and a separator that is integrated with the waveguide to reduce the number of layers of the electronic tinting lens and therefore reduce the lens weight. The upper protective glass layer of the waveguide is used as a substrate for the electronic tinting lens, wherein the electronic tinting lens substrate and the waveguide substrate each include an electrode configured to control a tint of the electronic tinting lens. The glass layer additionally encapsulates the waveguide to protect the waveguide against environmental factors, such as moisture. The electronic tinting lens can be an electrochromic lens, a liquid crystal with dye lens, or another form of electronic tinting lens.
[0021] Additional objects, advantages and novel features of the examples will be set forth in part in the description which follows, and in part will become apparent to those skilled in the art upon examination of the following and the accompanying drawings or may be learned by production or operation of the examples. The objects and advantages of the present subject matter may be realized and attained by means of the methodologies, instrumentalities and combinations particularly pointed out in the appended claims.
[0022] In the following detailed description, numerous specific details are set forth by way of examples in order to provide a thorough understanding of the relevant teachings. However, it should be apparent to those skilled in the art that the present teachings may be practiced without such details. In other instances, well known methods, procedures, components, and circuitry have been described at a relatively high-level, without detail, in order to avoid unnecessarily obscuring aspects of the present teachings.
[0023] The term “coupled” as used herein refers to any logical, optical, physical, or electrical connection, link, or the like by which signals or light produced or supplied by one system element are imparted to another coupled element. Unless described otherwise, coupled elements or devices are not necessarily directly connected to one another and may be separated by intermediate components, elements or communication media that may modify, manipulate, or carry the light or signals.
[0024] The orientations of the eyewear device, associated components and any complete devices incorporating an eye scanner and camera such as shown in any of the drawings, are given by way of example only, for illustration and discussion purposes. In operation for a particular variable optical processing application, the eyewear device may be oriented in any other direction suitable to the particular application of the eyewear device, for example up, down, sideways, or any other orientation. Also, to the extent used herein, any directional term, such as front, rear, inwards, outwards, towards, left, right, lateral, longitudinal, up, down, upper, lower, top, bottom and side, is used by way of example only, and is not limiting as to direction or orientation of any optic or component of an optic constructed as otherwise described herein.
[0025] Reference now is made in detail to the examples illustrated in the accompanying drawings and discussed below.
[0026] FIG. 1A is a side view of an example hardware configuration of an eyewear device 100, which includes a right optical assembly 180B with an image display 180D (FIG. 2A). Eyewear device 100 includes multiple visible light cameras 114A-B (FIG. 7) that form a stereo camera, of which the right visible light camera 114B is located on a right temple 110B.
[0027] The left and right visible light cameras 114A-B have an image sensor that is sensitive to the visible light range wavelength. Each of the visible light cameras 114A-B has a different frontward facing angle of coverage, for example, visible light camera 114B has the depicted angle of coverage 111B. The angle of coverage is the angle range over which the image sensor of the visible light camera 114A-B picks up electromagnetic radiation and generates images. Examples of such visible light cameras 114A-B include a high-resolution complementary metal-oxide-semiconductor (CMOS) image sensor and a video graphic array (VGA) camera, such as 640p (e.g., 640 x 480 pixels for a total of 0.3 megapixels), 720p, or 1080p. Image sensor data from the visible light cameras 114A-B are captured along with geolocation data, digitized by an image processor, and stored in a memory.
[0028] To provide stereoscopic vision, visible light cameras 114A-B may be coupled to an image processor (element 912 of FIG. 9) for digital processing along with a timestamp at which the image of the scene is captured. Image processor 912 includes circuitry to receive signals from the visible light cameras 114A-B and process those signals into a format suitable for storage in the memory (element 934 of FIG. 9). The timestamp can be added by the image processor 912 or other processor, which controls operation of the visible light cameras 114A-B. Visible light cameras 114A-B allow the stereo camera to simulate human binocular vision. Stereo cameras provide the ability to reproduce three-dimensional images (element 715 of FIG. 7) based on two captured images (elements 758A-B of FIG. 7) from the visible light cameras 114A-B having the same timestamp. Such three-dimensional images 715 allow for an immersive lifelike experience, e.g., for virtual reality or video gaming. For stereoscopic vision, the pair of images 758A-B are generated at a given moment in time - one image for each of the left and right visible light cameras 114A-B. When the pair of generated images 758A-B from the frontward facing fields of view (FOV) 111A-B of the left and right visible light cameras 114A-B are stitched together (e.g., by the image processor 912), depth perception is provided by the optical assembly 180A-B.
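By way of illustration only, the following Python sketch shows how two timestamped raw images might be paired and combined as described above. The class and function names are hypothetical (the disclosure provides no code), and a real image processor 912 would rectify the frames and compute per-pixel disparity rather than simply blending them.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class RawImage:
    """A rectangular raw image (cf. 758A or 758B) from one camera."""
    pixels: np.ndarray   # H x W x 3 RGB array
    timestamp_ns: int    # timestamp added by the image processor


def pair_stereo_frames(left: RawImage, right: RawImage) -> tuple[RawImage, RawImage]:
    """Accept a left/right pair only if both frames share the same timestamp."""
    if left.timestamp_ns != right.timestamp_ns:
        raise ValueError("stereo pair must be captured at the same moment")
    return left, right


def stitch_to_immersive(left: RawImage, right: RawImage) -> np.ndarray:
    """Combine the overlapping left/right images into one image (cf. 715).

    Placeholder combination: average the two frames (assumed same shape);
    a real pipeline would rectify both frames and compute disparity for depth.
    """
    left, right = pair_stereo_frames(left, right)
    return (left.pixels.astype(np.float32) + right.pixels.astype(np.float32)) / 2.0
```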
[0029] In an example, a user interface field of view adjustment system includes the eyewear device 100. The eyewear device 100 includes a frame 105, a right temple 110B extending from a right lateral side 170B of the frame 105, and a see-through image display 180D (FIGS. 2A-B) comprising optical assembly 180B to present a graphical user interface to a user. The eyewear device 100 includes the left visible light camera 114A connected to the frame 105 or the left temple 110A to capture a first image of the scene. Eyewear device 100 further includes the right visible light camera 114B connected to the frame 105 or the right temple 110B to capture (e.g., simultaneously with the left visible light camera 114A) a second image of the scene which partially overlaps the first image. Although not shown in FIGS. 1A-B, the user interface field of view adjustment system further includes the processor 932 coupled to the eyewear device 100 and connected to the visible light cameras 114A-B, the memory 934 accessible to the processor 932, and programming in the memory 934, for example in the eyewear device 100 itself or another part of the user interface field of view adjustment system.
[0030] Although not shown in FIG. 1A, the eyewear device 100 also includes a head movement tracker (element 109 of FIG. 1B) or an eye movement tracker (element 213 of FIG. 2B). Eyewear device 100 further includes the see-through image displays 180C-D of optical assembly 180A-B, respectively, for presenting a sequence of displayed images, and an image display driver (element 942 of FIG. 9) coupled to the see-through image displays 180C-D of optical assembly 180A-B to control the image displays 180C-D of optical assembly 180A-B to present the sequence of displayed images 715, which are described in further detail below. Eyewear device 100 further includes the memory 934 and the processor 932 having access to the image display driver 942 and the memory 934. Eyewear device 100 further includes programming (element 934 of FIG. 9) in the memory. Execution of the programming by the processor 932 configures the eyewear device 100 to perform functions, including functions to present, via the see-through image displays 180C-D, an initial displayed image of the sequence of displayed images, the initial displayed image having an initial field of view corresponding to an initial head direction or an initial eye gaze direction (element 230 of FIG. 5).
[0031] Execution of the programming by the processor 932 further configures the eyewear device 100 to detect movement of a user of the eyewear device by: (i) tracking, via the head movement tracker (element 109 of FIG. 1B), a head movement of a head of the user, or (ii) tracking, via an eye movement tracker (element 213 of FIG. 2B, FIG. 5), an eye movement of an eye of the user of the eyewear device 100. Execution of the programming by the processor 932 further configures the eyewear device 100 to determine a field of view adjustment to the initial field of view of the initial displayed image based on the detected movement of the user. The field of view adjustment includes a successive field of view corresponding to a successive head direction or a successive eye direction. Execution of the programming by the processor 932 further configures the eyewear device 100 to generate a successive displayed image of the sequence of displayed images based on the field of view adjustment. Execution of the programming by the processor 932 further configures the eyewear device 100 to present, via the see-through image displays 180C-D of the optical assembly 180A-B, the successive displayed images.

[0032] FIG. 1B is a top cross-sectional view of the temple of the eyewear device 100 of FIG. 1A depicting the right visible light camera 114B, a head movement tracker 109, and a circuit board. Construction and placement of the left visible light camera 114A is substantially similar to the right visible light camera 114B, except the connections and coupling are on the left lateral side 170A. As shown, the eyewear device 100 includes the right visible light camera 114B and a circuit board, which may be a flexible printed circuit board (PCB) 140. The right hinge 126B connects the right temple 110B to a right temple 125B of the eyewear device 100. In some examples, components of the right visible light camera 114B, the flexible PCB 140, or other electrical connectors or contacts may be located on the right temple 125B or the right hinge 126B.
[0033] As shown, eyewear device 100 has a head movement tracker 109, which includes, for example, an inertial measurement unit (IMU). An inertial measurement unit is an electronic device that measures and reports a body’s specific force, angular rate, and sometimes the magnetic field surrounding the body, using a combination of accelerometers and gyroscopes, sometimes also magnetometers. The inertial measurement unit works by detecting linear acceleration using one or more accelerometers and rotational rate using one or more gyroscopes. Typical configurations of inertial measurement units contain one accelerometer, gyroscope, and magnetometer per axis for each of the three axes: horizontal axis for left-right movement (X), vertical axis (Y) for top-bottom movement, and depth or distance axis (Z) for front-back movement. The accelerometer detects the gravity vector. The magnetometer defines the rotation in the magnetic field (e.g., facing south, north, etc.) like a compass which generates a heading reference. The three accelerometers detect acceleration along the horizontal, vertical, and depth axes defined above, which can be defined relative to the ground, the eyewear device 100, or the user wearing the eyewear device 100.
[0034] Eyewear device 100 detects movement of the user of the eyewear device 100 by tracking, via the head movement tracker 109, the head movement of the head of the user. The head movement includes a variation of head direction on a horizontal axis, a vertical axis, or a combination thereof from the initial head direction during presentation of the initial displayed image on the image display. In one example, tracking, via the head movement tracker 109, the head movement of the head of the user includes measuring, via the inertial measurement unit 109, the initial head direction on the horizontal axis (e.g., X axis), the vertical axis (e.g., Y axis), or the combination thereof (e.g., transverse or diagonal movement). Tracking, via the head movement tracker 109, the head movement of the head of the user further includes measuring, via the inertial measurement unit 109, a successive head direction on the horizontal axis, the vertical axis, or the combination thereof during presentation of the initial displayed image.
[0035] Tracking, via the head movement tracker 109, the head movement of the head of the user further includes determining the variation of head direction based on both the initial head direction and the successive head direction. Detecting movement of the user of the eyewear device 100 further includes, in response to tracking, via the head movement tracker 109, the head movement of the head of the user, determining that the variation of head direction exceeds a deviation angle threshold on the horizontal axis, the vertical axis, or the combination thereof. The deviation angle threshold is between about 3° and 10°. As used herein, the term “about” when referring to an angle means ± 10% from the stated amount.

[0036] Variation along the horizontal axis slides three-dimensional objects, such as characters, Bitmojis, application icons, etc. in and out of the field of view by, for example, hiding, unhiding, or otherwise adjusting visibility of the three-dimensional object. Variation along the vertical axis, for example, when the user looks upwards, in one example, displays weather information, time of day, date, calendar appointments, etc. In another example, when the user looks downwards on the vertical axis, the eyewear device 100 may power down.
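As an illustration of the threshold logic of paragraph [0035], the sketch below computes the variation of head direction and tests it against the deviation angle threshold. The function names are hypothetical; the 3° default is the low end of the stated range, and applying the same threshold to the diagonal (combined) variation is an assumption, since the disclosure leaves that choice open.

```python
import math


def head_direction_variation(initial_deg: tuple[float, float],
                             successive_deg: tuple[float, float]) -> tuple[float, float]:
    """Variation of head direction on the horizontal (X) and vertical (Y) axes,
    from the initial and successive head directions measured by the IMU 109."""
    return (successive_deg[0] - initial_deg[0],
            successive_deg[1] - initial_deg[1])


def exceeds_deviation_threshold(variation_deg: tuple[float, float],
                                threshold_deg: float = 3.0) -> bool:
    """True when the variation on either axis, or their combination,
    exceeds the deviation angle threshold (about 3 degrees to 10 degrees)."""
    dx, dy = variation_deg
    return (abs(dx) > threshold_deg
            or abs(dy) > threshold_deg
            or math.hypot(dx, dy) > threshold_deg)  # transverse/diagonal movement
```

When the threshold is exceeded, the field of view adjustment of paragraph [0031] is applied to generate the successive displayed image.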
[0037] The right temple 110B includes a temple body 211 and a temple cap, with the temple cap omitted in the cross-section of FIG. 1B. Disposed inside the right temple 110B are various interconnected circuit boards, such as PCBs or flexible PCBs, that include controller circuits for the right visible light camera 114B, microphone(s) 130, speaker(s) 132, low-power wireless circuitry (e.g., for wireless short-range network communication via Bluetooth™), and high-speed wireless circuitry (e.g., for wireless local area network communication via WiFi).
[0038] The right visible light camera 114B is coupled to or disposed on the flexible PCB 140 and covered by a visible light camera cover lens, which is aimed through opening(s) formed in the right temple 110B. In some examples, the frame 105 connected to the right temple 110B includes the opening(s) for the visible light camera cover lens. The frame 105 includes a front-facing side configured to face outwards away from the eye of the user. The opening for the visible light camera cover lens is formed on and through the front-facing side. In the example, the right visible light camera 114B has an outward facing angle of coverage 111B with a line of sight or perspective of the right eye of the user of the eyewear device 100. The visible light camera cover lens can also be adhered to an outward facing surface of the right temple 110B in which an opening is formed with an outwards facing angle of coverage, but in a different outwards direction. The coupling can also be indirect via intervening components.
[0039] Left (first) visible light camera 114A is connected to the left see-through image display 180C of left optical assembly 180A to generate a first background scene of a first successive displayed image. The right (second) visible light camera 114B is connected to the right see-through image display 180D of right optical assembly 180B to generate a second background scene of a second successive displayed image. The first background scene and the second background scene partially overlap to present a three-dimensional observable area of the successive displayed image.
[0040] Flexible PCB 140 is disposed inside the right temple 110B and is coupled to one or more other components housed in the right temple 110B. Although shown as being formed on the circuit boards of the right temple 110B, the right visible light camera 114B can be formed on the circuit boards of the left temple 110A, the temples 125A-B, or frame 105.

[0041] FIG. 2A is a rear view of an example hardware configuration of an eyewear device 100, which includes an eye scanner 113 on a frame 105, for use in a system for determining an eye position and gaze direction of a wearer/user of the eyewear device 100. As shown in FIG. 2A, the eyewear device 100 is in a form configured for wearing by a user, which is eyeglasses in the example of FIG. 2A. The eyewear device 100 can take other forms and may incorporate other types of frameworks, for example, a headgear, a headset, or a helmet.
[0042] In the eyeglasses example, eyewear device 100 includes the frame 105 which includes the left rim 107A connected to the right rim 107B via the bridge 106 adapted for a nose of the user. The left and right rims 107A-B include respective apertures 175A-B which hold the respective optical element 180A-B, such as a lens and the see-through displays 180C-D. As used herein, the term lens is meant to cover transparent or translucent pieces of glass or plastic having curved and flat surfaces that cause light to converge/diverge or that cause little or no convergence/divergence.
[0043] Although shown as having two optical elements 180A-B, the eyewear device 100 can include other arrangements, such as a single optical element depending on the application or intended user of the eyewear device 100. As further shown, eyewear device 100 includes the left temple 110A adjacent the left lateral side 170A of the frame 105 and the right temple 110B adjacent the right lateral side 170B of the frame 105. The temples 110A-B may be integrated into the frame 105 on the respective sides 170A-B (as illustrated) or implemented as separate components attached to the frame 105 on the respective sides 170A-B. Alternatively, the temples 110A-B may be integrated into temples (not shown) attached to the frame 105.
[0044] In the example of FIG. 2A, the eye scanner 113 includes an infrared emitter 115 and an infrared camera 120. Visible light cameras typically include a blue light filter to block infrared light detection. In an example, the infrared camera 120 is a visible light camera, such as a low-resolution video graphic array (VGA) camera (e.g., 640 x 480 pixels for a total of 0.3 megapixels), with the blue filter removed. The infrared emitter 115 and the infrared camera 120 are co-located on the frame 105; for example, both are shown as connected to the upper portion of the left rim 107A. The frame 105 or one or more of the left and right temples 110A-B include a circuit board (not shown) that includes the infrared emitter 115 and the infrared camera 120. The infrared emitter 115 and the infrared camera 120 can be connected to the circuit board by soldering, for example.
[0045] Other arrangements of the infrared emitter 115 and infrared camera 120 can be implemented, including arrangements in which the infrared emitter 115 and infrared camera 120 are both on the right rim 107B, or in different locations on the frame 105, for example, the infrared emitter 115 is on the left rim 107A and the infrared camera 120 is on the right rim 107B. In another example, the infrared emitter 115 is on the frame 105 and the infrared camera 120 is on one of the temples 110A-B, or vice versa. The infrared emitter 115 can be connected essentially anywhere on the frame 105, left temple 110A, or right temple 110B to emit a pattern of infrared light. Similarly, the infrared camera 120 can be connected essentially anywhere on the frame 105, left temple 110A, or right temple 110B to capture at least one reflection variation in the emitted pattern of infrared light.
[0046] The infrared emitter 115 and infrared camera 120 are arranged to face inwards towards an eye of the user with a partial or full field of view of the eye in order to identify the respective eye position and gaze direction. For example, the infrared emitter 115 and infrared camera 120 are positioned directly in front of the eye, in the upper part of the frame 105, or in the temples 110A-B at either end of the frame 105.
[0047] FIG. 2B is a rear view of an example hardware configuration of another eyewear device 200. In this example configuration, the eyewear device 200 is depicted as including an eye scanner 213 on a right temple 210B. As shown, an infrared emitter 215 and an infrared camera 220 are co-located on the right temple 210B. It should be understood that the eye scanner 213 or one or more components of the eye scanner 213 can be located on the left temple 210A and other locations of the eyewear device 200, for example, the frame 105. The infrared emitter 215 and infrared camera 220 are like that of FIG. 2A, but the eye scanner 213 can be varied to be sensitive to different light wavelengths as described previously in FIG. 2A.
[0048] Similar to FIG. 2A, the eyewear device 200 includes a frame 105 which includes a left rim 107A which is connected to a right rim 107B via a bridge 106; and the left and right rims 107A-B include respective apertures which hold the respective optical elements 180A-B comprising the see-through displays 180C-D.
[0049] FIGS. 2C-D are rear views of example hardware configurations of the eyewear device 100, including two different types of see-through image displays 180C-D. In one example, these see-through image displays 180C-D of optical assembly 180A-B include an integrated image display. As shown in FIG. 2C, the optical assemblies 180A-B include a suitable display matrix 180C-D of any suitable type, such as a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, a waveguide display, or any other such display.
[0050] The optical assembly 180A-B also includes an optical layer or layers 176, which can include lenses, optical coatings, prisms, mirrors, waveguides, optical strips, and other optical components in any combination. The optical layers 176A-N can include a prism having a suitable size and configuration and including a first surface for receiving light from the display matrix and a second surface for emitting light to the eye of the user. The prism of the optical layers 176A-N extends over all or at least a portion of the respective apertures 175A-B formed in the left and right rims 107A-B to permit the user to see the second surface of the prism when the eye of the user is viewing through the corresponding left and right rims 107A-B. The first surface of the prism of the optical layers 176A-N faces upwardly from the frame 105 and the display matrix overlies the prism so that photons and light emitted by the display matrix impinge the first surface. The prism is sized and shaped so that the light is refracted within the prism and is directed towards the eye of the user by the second surface of the prism of the optical layers 176A-N. In this regard, the second surface of the prism of the optical layers 176A-N can be convex to direct the light towards the center of the eye. The prism can optionally be sized and shaped to magnify the image projected by the see-through image displays 180C-D, and the light travels through the prism so that the image viewed from the second surface is larger in one or more dimensions than the image emitted from the see-through image displays 180C-D.
[0051] In another example, the see-through image displays 180C-D of optical assembly 180A-B include a projection image display as shown in FIG. 2D. The optical assembly 180A-B includes a laser projector 150, which is a three-color laser projector using a scanning mirror or galvanometer. During operation, an optical source such as a laser projector 150 is disposed in or on one of the temples 125A-B of the eyewear device 100. Optical assembly 180A-B includes one or more optical strips 155A-N spaced apart across the width of the lens of the optical assembly 180A-B or across a depth of the lens between the front surface and the rear surface of the lens.
[0052] As the photons projected by the laser projector 150 travel across the lens of the optical assembly 180A-B, the photons encounter the optical strips 155A-N. When a particular photon encounters a particular optical strip, the photon is either redirected towards the user’s eye, or it passes to the next optical strip. A combination of modulation of the laser projector 150 and modulation of the optical strips may control specific photons or beams of light. In an example, a processor controls optical strips 155A-N by initiating mechanical, acoustic, or electromagnetic signals. Although shown as having two optical assemblies 180A-B, the eyewear device 100 can include other arrangements, such as a single or three optical assemblies, or the optical assembly 180A-B may have a different arrangement depending on the application or intended user of the eyewear device 100.
[0053] As further shown in FIGS. 2C-D, eyewear device 100 includes a left temple 110A adjacent the left lateral side 170A of the frame 105 and a right temple 110B adjacent the right lateral side 170B of the frame 105. The temples 110A-B may be integrated into the frame 105 on the respective lateral sides 170A-B (as illustrated) or implemented as separate components attached to the frame 105 on the respective sides 170A-B. Alternatively, the temples 110A-B may be integrated into temples 125A-B attached to the frame 105.
[0054] In one example, the see-through image displays include the first see-through image display 180C and the second see-through image display 180D. Eyewear device 100 includes first and second apertures 175A-B which hold the respective first and second optical assembly 180A-B. The first optical assembly 180A includes the first see-through image display 180C (e.g., a display matrix of FIG. 2C or optical strips and a projector (not shown)). The second optical assembly 180B includes the second see-through image display 180D (e.g., a display matrix of FIG. 2C or optical strips 155A-N and a projector 150). The successive field of view of the successive displayed image includes an angle of view between about 15° and 30°, and more specifically 24°, measured horizontally, vertically, or diagonally. The successive displayed image having the successive field of view represents a combined three-dimensional observable area visible through stitching together of two displayed images presented on the first and second image displays.

[0055] As used herein, “an angle of view” describes the angular extent of the field of view associated with the displayed images presented on each of the left and right image displays 180C-D of optical assembly 180A-B. The “angle of coverage” describes the angle range that a lens of visible light cameras 114A-B or infrared camera 220 can image. Typically, the image circle produced by a lens is large enough to cover the film or sensor completely, possibly including some vignetting (i.e., a reduction of an image's brightness or saturation toward the periphery compared to the image center). If the angle of coverage of the lens does not fill the sensor, the image circle will be visible, typically with strong vignetting toward the edge, and the effective angle of view will be limited to the angle of coverage. The “field of view” is intended to describe the field of observable area which the user of the eyewear device 100 can see through his or her eyes via the displayed images presented on the left and right image displays 180C-D of the optical assembly 180A-B. Image display 180C of optical assembly 180A-B can have a field of view with an angle of coverage between 15° and 30°, for example 24°, and have a resolution of 480 x 480 pixels.

[0056] FIG. 3 shows a rear perspective view of the eyewear device of FIG. 2A. The eyewear device 100 includes an infrared emitter 215, infrared camera 220, a frame front 330, a frame back 335, and a circuit board 340. It can be seen in FIG. 3 that the upper portion of the left rim of the frame of the eyewear device 100 includes the frame front 330 and the frame back 335. An opening for the infrared emitter 215 is formed on the frame back 335.

[0057] As shown in the encircled cross-section 4 in the upper middle portion of the left rim of the frame, a circuit board, which is a flexible PCB 340, is sandwiched between the frame front 330 and the frame back 335. Also shown in further detail is the attachment of the left temple 110A to the left temple 325A via the left hinge 126A. In some examples, components of the eye movement tracker 213, including the infrared emitter 215, the flexible PCB 340, or other electrical connectors or contacts may be located on the left temple 325A or the left hinge 126A.
[0058] FIG. 4 is a cross-sectional view through the infrared emitter 215 and the frame corresponding to the encircled cross-section 4 of the eyewear device of FIG. 3. Multiple layers of the eyewear device 100 are illustrated in the cross-section of FIG. 4. As shown, the frame includes the frame front 330 and the frame back 335. The flexible PCB 340 is disposed on the frame front 330 and connected to the frame back 335. The infrared emitter 215 is disposed on the flexible PCB 340 and covered by an infrared emitter cover lens 445. For example, the infrared emitter 215 is reflowed to the back of the flexible PCB 340. Reflowing attaches the infrared emitter 215 to contact pad(s) formed on the back of the flexible PCB 340 by subjecting the flexible PCB 340 to controlled heat which melts a solder paste to connect the two components. In one example, reflowing is used to surface mount the infrared emitter 215 on the flexible PCB 340 and electrically connect the two components. However, it should be understood that through-holes can be used to connect leads from the infrared emitter 215 to the flexible PCB 340 via interconnects, for example.
[0059] The frame back 335 includes an infrared emitter opening 450 for the infrared emitter cover lens 445. The infrared emitter opening 450 is formed on a rear-facing side of the frame back 335 that is configured to face inwards towards the eye of the user. In the example, the flexible PCB 340 can be connected to the frame front 330 via the flexible PCB adhesive 460. The infrared emitter cover lens 445 can be connected to the frame back 335 via infrared emitter cover lens adhesive 455. The coupling can also be indirect via intervening components.
[0060] In an example, the processor 932 utilizes eye tracker 213 to determine an eye gaze direction 230 of a wearer’s eye 234 as shown in FIG. 5, and an eye position 236 of the wearer’s eye 234 within an eyebox as shown in FIG. 6. The eye tracker 213 is a scanner which uses infrared light illumination (e.g., near-infrared, short-wavelength infrared, mid-wavelength infrared, long-wavelength infrared, or far infrared) to capture an image of reflection variations of infrared light from the eye 234 to determine the gaze direction 230 of a pupil 232 of the eye 234, and also the eye position 236 with respect to the see-through display 180D.
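The disclosure does not spell out how the reflection variations are converted into a gaze direction. The sketch below assumes one common technique, pupil-center/corneal-reflection tracking, purely for illustration; the function name and gain constant are hypothetical.

```python
import numpy as np


def estimate_gaze_direction_deg(pupil_center_px: np.ndarray,
                                glint_center_px: np.ndarray,
                                gain_deg_per_px: float = 0.05) -> np.ndarray:
    """Estimate gaze direction (horizontal, vertical) in degrees from the
    offset between the pupil center 232 and the corneal reflection (glint)
    of the infrared emitter, as seen in the infrared camera image.

    The linear gain is an illustrative stand-in for a per-user calibration.
    """
    offset_px = pupil_center_px.astype(np.float32) - glint_center_px.astype(np.float32)
    return gain_deg_per_px * offset_px
```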
[0061] FIG. 7 depicts an example of capturing visible light with cameras 114A-B. Visible light is captured by the left visible light camera 114A with a round field of view (FOV) 111A. A chosen rectangular left raw image 758A is used for image processing by image processor 912 (FIG. 9). Visible light is captured by the right visible light camera 114B with a round FOV 111B. A rectangular right raw image 758B chosen by the image processor 912 is used for image processing by processor 912. Based on processing of the left raw image 758A and the right raw image 758B (which overlaps 713 with the left raw image 758A), a three-dimensional image 715 of a three-dimensional scene, referred to hereafter as an immersive image, is generated by processor 912, displayed by displays 180C and 180D, and viewable by the user.
[0062] FIG. 8A illustrates a front view 800 of the frame 105 of the eyewear 100 having a left frame opening 802A and a right frame opening 802B that support powered electronic tinting lenses 804A and 804B and waveguide stacks 810A and 810B, respectively. The electronic tinting lenses 804A and 804B comprise substrates 806A and 806B and separators 808A and 808B, respectively, and are each configured to receive electrical control signals that establish different tint states of the separators 808A and 808B (FIGS. 8B-8C). In an example, the separators 808A and 808B are directly coupled to an upper layer of the waveguide stacks 810A and 810B, respectively, as shown in FIGS. 8B-8C. Each waveguide stack 810A-B is configured to generate images viewable by a user of the eyewear 100. The waveguide stack 810A consists of two substrates: a first waveguide substrate 812A and a second waveguide substrate 813A. The term waveguide stack is also referred to as a waveguide in this disclosure.
[0063] The left electronic tinting lens 804A is part of the left optical assembly 180A and the right electronic tinting lens 804B is part of the right optical assembly 180B. Each of the separators 808A and 808B is configured to be controlled by a processor 932 (FIG. 9) to establish a selected tint state. In an example, the separators 808A and 808B are used to optimize image readability in different lighting conditions. For example, in a first state the separators 808A and 808B are configured to have a dark tint to improve visibility and contrast of an image displayed by the waveguide stacks 810A and 810B (FIG. 8C) in bright outdoor conditions. In a second state, the separators 808A and 808B are configured to have a light tint or no tint for indoor conditions. In an example, the eyewear 100 is suitable for use with augmented reality (AR) features.
[0064] Referring to FIGS. 8B and 8C, the electronic tinting lens 804A includes the substrate 806A coupled to the waveguide stack 810A by an adhesive 818, with the separator 808A disposed between the substrate 806A and the waveguide stack 810A. In one embodiment, the separator 808A is an electronic tinting electrolyte. In another embodiment, the separator 808A is a liquid crystal with dye material. A first electrode 814A is coupled to an inside face of the substrate 806A to interact with the separator 808A, and a second electrode 816A is coupled to an upper face of the first waveguide substrate 812A to interact with the separator 808A. The electrodes 814A and 816A control the tinting of the separator disposed between them as a function of the control signals from the processor 932. In one embodiment, the electrodes 814A and 816A are a coating of visibly transparent indium tin oxide (ITO) formed on the respective surfaces of the substrate 806A and the first waveguide substrate 812A. In one example, the substrate 806A of the electronic tinting lens 804A is made of a plastic material to reduce overall system weight, as plastic is lighter than glass. The first waveguide substrate 812A is made of glass to provide protection for the waveguide stack 810A. The right electronic tinting lens 804B and waveguide stack 810B are identical in construction to the left electronic tinting lens 804A and waveguide stack 810A.

[0065] Directly coupling the separators 808A and 808B to the waveguide stacks 810A and 810B, respectively, reduces the number of layers present in this assembly, therefore reducing the thickness and weight of the assembly. This integration is done by utilizing the first waveguide substrate 812A as a second substrate for the electronic tinting lenses 804A and 804B, respectively. The first waveguide substrate 812A is additionally used as a protective cover, and also encapsulates the waveguide stacks 810A and 810B, respectively. The first waveguide substrate 812A maintains the protection needed for the waveguide stack 810A against environmental factors, such as moisture. The use of glass for this second substrate (the first waveguide substrate 812A) additionally provides increased flatness and rigidity to the electronic tinting lens 804A as compared to the traditional use of plastic.
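For clarity, the layer order described in paragraphs [0064] and [0065] can be summarized in a small data model. This is an illustrative restatement of FIGS. 8B-8C, not code from the disclosure; the type and field names are hypothetical.

```python
from dataclasses import dataclass, field


@dataclass
class IntegratedLensStack:
    """Layers of the integrated electronic tinting lens and waveguide,
    ordered from the world-facing side toward the eye (after FIGS. 8B-8C)."""
    layers: list[str] = field(default_factory=lambda: [
        "plastic lens substrate 806A",
        "first electrode 814A (ITO on the inside face of 806A)",
        "separator 808A (electrolyte, or liquid crystal with dye)",
        "second electrode 816A (ITO on the upper face of 812A)",
        "glass first waveguide substrate 812A (shared lens substrate and cover)",
        "second waveguide substrate 813A",
    ])

    def layer_count(self) -> int:
        """The glass substrate 812A does double duty, so the stack has one
        fewer layer than a design with a dedicated second lens substrate."""
        return len(self.layers)
```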
[0066] FIG. 9 depicts a high-level functional block diagram including example electronic components disposed in eyewear 100 and 200. The illustrated electronic components include the processor 932, the memory 934, and the see-through image displays 180C and 180D.
[0067] Memory 934 includes instructions for execution by processor 932 to implement the functionality of eyewear 100/200, including instructions for processor 932 to control display of the image 715. Processor 932 receives power from a battery (not shown) and executes the instructions stored in memory 934, or integrated with the processor 932 on-chip, to perform the functionality of eyewear 100/200 and to communicate with external devices via wireless connections.
[0068] A user interface adjustment system 900 includes a wearable device, which is the eyewear device 100 with an eye movement tracker 213 (e.g., shown as infrared emitter 215 and infrared camera 220 in FIG. 2B). The user interface adjustment system 900 also includes a mobile device 990 and a server system 998 connected via various networks. Mobile device 990 may be a smartphone, tablet, laptop computer, access point, or any other such device capable of connecting with eyewear device 100 using both a low-power wireless connection 925 and a high-speed wireless connection 937. Mobile device 990 is connected to server system 998 and network 995. The network 995 may include any combination of wired and wireless connections.
[0069] Eyewear device 100 includes at least two visible light cameras 114A-B (one associated with the left lateral side 170A and one associated with the right lateral side 170B). Eyewear device 100 further includes two see-through image displays 180C-D of the optical assembly 180A-B (one associated with the left lateral side 170A and one associated with the right lateral side 170B). Eyewear device 100 also includes image display driver 942, image processor 912, low-power circuitry 920, and high-speed circuitry 930. The components shown in FIG. 9 for the eyewear device 100 and 200 are located on one or more circuit boards, for example a PCB or flexible PCB, in the temples. Alternatively, or additionally, the depicted components can be located in the temples, frames, hinges, or bridge of the eyewear device 100 and 200. Left and right visible light cameras 114A-B can include digital camera elements such as a complementary metal-oxide-semiconductor (CMOS) image sensor, a charge-coupled device, a lens, or any other respective visible or light capturing elements that may be used to capture data, including images of scenes with unknown objects.
[0070] Eye movement tracking programming implements the user interface field of view adjustment instructions, including instructions to cause the eyewear device 100 to track, via the eye movement tracker 213, the eye movement of the eye of the user of the eyewear device 100. Other implemented instructions (functions) cause the eyewear device 100 and 200 to determine the FOV adjustment to the initial FOV 111A-B based on the detected eye movement of the user corresponding to a successive eye direction. Further implemented instructions generate a successive displayed image of the sequence of displayed images based on the field of view adjustment. The successive displayed image is produced as visible output to the user via the user interface. This visible output appears on the see-through image displays 180C-D of optical assembly 180A-B, which is driven by image display driver 942 to present the sequence of displayed images, including the initial displayed image with the initial field of view and the successive displayed image with the successive field of view.
[0071] As shown in FIG. 9, high-speed circuitry 930 includes high-speed processor 932, memory 934, and high-speed wireless circuitry 936. In the example, the image display driver 942 is coupled to the high-speed circuitry 930 and operated by the high-speed processor 932 in order to drive the left and right image displays 180C-D of the optical assembly 180A-B. High-speed processor 932 may be any processor capable of managing high-speed communications and operation of any general computing system needed for eyewear device 100. High-speed processor 932 includes processing resources needed for managing high-speed data transfers on high-speed wireless connection 937 to a wireless local area network (WLAN) using high-speed wireless circuitry 936. In certain examples, the high-speed processor 932 executes an operating system such as a LINUX operating system or other such operating system of the eyewear device 100 and the operating system is stored in memory 934 for execution. In addition to any other responsibilities, the high-speed processor 932 executing a software architecture for the eyewear device 100 is used to manage data transfers with high-speed wireless circuitry 936. In certain examples, high-speed wireless circuitry 936 is configured to implement Institute of Electrical and Electronics Engineers (IEEE) 802.11 communication standards, also referred to herein as Wi-Fi. In other examples, other high-speed communications standards may be implemented by high-speed wireless circuitry 936.
[0072] Low-power wireless circuitry 924 and the high-speed wireless circuitry 936 of the eyewear device 100 and 200 can include short-range transceivers (Bluetooth™) and wireless local or wide area network transceivers (e.g., cellular or WiFi). Mobile device 990, including the transceivers communicating via the low-power wireless connection 925 and high-speed wireless connection 937, may be implemented using details of the architecture of the eyewear device 100, as can other elements of network 995.
[0073] Memory 934 includes any storage device capable of storing various data and applications, including, among other things, color maps, camera data generated by the left and right visible light cameras 114A-B and the image processor 912, as well as images generated for display by the image display driver 942 on the see-through image displays 180C-D of the optical assembly 180A-B. While memory 934 is shown as integrated with high-speed circuitry 930, in other examples, memory 934 may be an independent standalone element of the eyewear device 100. In certain such examples, electrical routing lines may provide a connection through a chip that includes the high-speed processor 932 from the image processor 912 or low-power processor 922 to the memory 934. In other examples, the high-speed processor 932 may manage addressing of memory 934 such that the low-power processor 922 will boot the high-speed processor 932 any time that a read or write operation involving memory 934 is needed.
[0074] Server system 998 may be one or more computing devices as part of a service or network computing system, for example, that include a processor, a memory, and network communication interface to communicate over the network 995 with the mobile device 990 and eyewear device 100/200. Eyewear devices 100 and 200 can be connected with a host computer. For example, the eyewear device 100 is paired with the mobile device 990 via the high-speed wireless connection 937 or connected to the server system 998 via the network 995.
[0075] Output components of the eyewear device 100 include visual components, such as the left and right image displays 180C-D of optical assembly 180A-B as described in FIGS. 2C-D (e.g., a display such as a liquid crystal display (LCD), a plasma display panel (PDP), a light emitting diode (LED) display, a projector, or a waveguide). The image displays 180C-D of the optical assembly 180A-B are driven by the image display driver 942. The output components of the eyewear device 100 further include acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor), other signal generators, and so forth. The input components of the eyewear device 100 and 200, the mobile device 990, and server system 998, may include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point-based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instruments), tactile input components (e.g., a physical button, a touch screen that provides location and force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like.

[0076] Eyewear device 100 may optionally include additional peripheral device elements. Such peripheral device elements may include ambient light and spectral sensors, biometric sensors, additional sensors, or display elements integrated with eyewear device 100. For example, peripheral device elements may include any I/O components including output components, motion components, position components, or any other such elements described herein. The eyewear device 100 can take other forms and may incorporate other types of frameworks, for example, a headgear, a headset, or a helmet.

[0077] For example, the biometric components of the user interface field of view adjustment system 900 include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram based identification), and the like. The motion components include acceleration sensor components (e.g., accelerometer), gravitation sensor components, rotation sensor components (e.g., gyroscope), and so forth. The position components include location sensor components to generate location coordinates (e.g., a Global Positioning System (GPS) receiver component), WiFi or Bluetooth™ transceivers to generate positioning system coordinates, altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like. Such positioning system coordinates can also be received over wireless connections 925 and 937 from the mobile device 990 via the low-power wireless circuitry 924 or high-speed wireless circuitry 936.
[0078] According to some examples, an “application” or “applications” are program(s) that execute functions defined in the programs. Various programming languages can be employed to create one or more of the applications, structured in a variety of manners, such as object-oriented programming languages (e.g., Objective-C, Java, or C++) or procedural programming languages (e.g., C or assembly language). In a specific example, a third-party application (e.g., an application developed using the ANDROID™ or IOS™ software development kit (SDK) by an entity other than the vendor of the particular platform) may be mobile software running on a mobile operating system such as IOS™, ANDROID™, WINDOWS® Phone, or another mobile operating system. In this example, the third-party application can invoke API calls provided by the operating system to facilitate functionality described herein.
[0079] Referring now to FIG. 10, there is shown a method 1000 for operating the eyewear 100 including the electronic tinting lenses 804A and 804B.
[0080] At block 1002, the processor 932 determines the light transmissive property/tinting of the respective electronic tinting lenses 804A and 804B. The processor 932 may determine the light transmissive property/tinting based on the ambient light detected by an ambient light sensor about the eyewear 100, or the tinting may be user selected, such as by using a control of the eyewear 100.
[0081] At block 1004, the processor 932 sends control signals to the electrodes 814 and 816 of the respective electronic tinting lenses 804A and 804B to control the light transmissive property/tinting of the respective electronic tinting lenses 804A and 804B. The electronic tinting lenses 804A and 804B can be adjusted to improve visibility and contrast in bright lighting conditions. Having two or more electronic tinting lenses 804A and 804B allows the opacity of each to be tuned independently.
[0082] At block 1006, the light transmissive property/tinting of the respective separators 808A and 808B of electronic tinting lenses 804A and 804B is altered by the control signals. The tint of the separators 808A and 808B is used to optimize user viewing of a displayed image 715 generated by the waveguide stacks 810A and 810B, as well as of real-world images.
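The control flow of blocks 1002-1006 can be summarized in the following sketch. The ambient light sensor object, the `apply_voltage` method, and the lux-to-tint mapping are hypothetical stand-ins; the disclosure specifies only that the tint is determined from ambient light or a user control and applied via the electrode pairs.

```python
def operate_tinting(lenses, ambient_light_sensor=None, user_setting=None):
    """Sketch of method 1000 for the processor 932: determine the light
    transmissive property, then drive each lens's electrodes to set its tint."""
    # Block 1002: determine the tint from a user control or ambient light.
    if user_setting is not None:
        level = user_setting                       # user-selected tint level, 0..1
    else:
        lux = ambient_light_sensor.read_lux()
        level = min(1.0, lux / 10_000.0)           # darker tint in brighter scenes

    # Block 1004: send control signals; each lens can be tuned independently.
    for lens in lenses:                            # e.g., lenses 804A and 804B
        lens.apply_voltage(level)                  # drives electrodes 814/816

    # Block 1006: the separator's transmissivity changes in response,
    # balancing waveguide imagery against the real-world scene.
    return level
```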
[0083] It will be understood that the terms and expressions used herein have the ordinary meaning as is accorded to such terms and expressions with respect to their corresponding respective areas of inquiry and study except where specific meanings have otherwise been set forth herein. Relational terms such as first and second and the like may be used solely to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” “includes,” “including,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises or includes a list of elements or steps does not include only those elements or steps but may include other elements or steps not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “a” or “an” does not, without further constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element.
[0084] Unless otherwise stated, any and all measurements, values, ratings, positions, magnitudes, sizes, and other specifications that are set forth in this specification, including in the claims that follow, are approximate, not exact. Such amounts are intended to have a reasonable range that is consistent with the functions to which they relate and with what is customary in the art to which they pertain. For example, unless expressly stated otherwise, a parameter value or the like may vary by as much as ± 10% from the stated amount.
[0085] In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various examples for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed examples require more features than are expressly recited in each claim. Rather, as the following claims reflect, the subject matter to be protected lies in less than all features of any single disclosed example. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.
[0086] While the foregoing has described what are considered to be the best mode and other examples, it is understood that various modifications may be made therein and that the subject matter disclosed herein may be implemented in various forms and examples, and that they may be applied in numerous applications, only some of which have been described herein. It is intended by the following claims to claim any and all modifications and variations that fall within the true scope of the present concepts.

Claims

What is claimed is:
1. Eyewear, comprising: a frame; a waveguide coupled to the frame and configured to display images; a processor configured to process images and display the processed images on the waveguide; and an electronic tinting lens coupled to the frame and configured to receive and pass a real-world image, wherein the electronic tinting lens comprises a lens substrate and a separator, wherein the separator is directly coupled to the waveguide and has a tint electrically controllable by the processor.
2. The eyewear of claim 1, wherein the waveguide comprises a first waveguide substrate and a second waveguide substrate, wherein the separator is interposed between the lens substrate and the first waveguide substrate.
3. The eyewear of claim 2, wherein the first waveguide substrate is a glass protective layer of the waveguide.
4. The eyewear of claim 3, wherein the lens substrate is a plastic.
5. The eyewear of claim 2, wherein the lens substrate further comprises a first electrode, and the first waveguide substrate further comprises a second electrode, wherein the first and second electrodes are configured to control the tint of the separator.
6. The eyewear of claim 5, wherein the separator is an electrolyte.
7. The eyewear of claim 5, wherein the separator is a liquid crystal with dye.
8. The eyewear of claim 7, wherein the processor is coupled to the electronic tinting lens.
9. The eyewear of claim 8, wherein the processor is configured to control the electronic tinting lens.
10. The eyewear of claim 9, wherein the lens substrate is coupled to the first waveguide substrate with an adhesive.
11. A method of operating eyewear having: a frame; a waveguide coupled to the frame and configured to display images; a processor configured to process images and display the processed images on the waveguide; and an electronic tinting lens coupled to the frame and configured to receive and pass a real-world image, wherein the electronic tinting lens comprises a lens substrate and a separator, wherein the separator is directly coupled to the waveguide and has a tint electrically controllable by the processor, the method comprising the steps of: the processor determining a light transmissive property of the electronic tinting lens; and the processor controlling the electronic tinting lens to selectively control the tint of the electronic tinting lens.
12. The method of claim 11, wherein the waveguide comprises a first waveguide substrate and a second waveguide substrate, wherein the separator is interposed between the lens substrate and the first waveguide substrate.
13. The method of claim 12, wherein the first waveguide substrate is a glass protective layer of the waveguide.
14. The method of claim 13, wherein the lens substrate is a plastic.
15. The method of claim 14, wherein the lens substrate further comprises a first electrode, and the first waveguide substrate further comprises a second electrode, wherein the first and second electrodes are configured to control the tint of the separator.
16. The method of claim 15, wherein the separator is an electrolyte.
17. The method of claim 15, wherein the separator is a liquid crystal with dye.
18. The method of claim 17, wherein the processor is coupled to the electronic tinting lens.
19. The method of claim 18, wherein the processor is configured to control the electronic tinting lens.
20. A non-transitory computer-readable medium storing program code which, when executed, is operative to cause an electronic processor of eyewear having a frame, a waveguide coupled to the frame and configured to display images, a processor configured to process images and display the processed images on the waveguide, and an electronic tinting lens coupled to the frame and configured to receive and pass a real-world image, wherein the electronic tinting lens comprises a lens substrate and a separator, wherein the separator is directly coupled to the waveguide and has a tint electrically controllable by the processor, to perform the steps of: determining a light transmissive property of the electronic tinting lens; and selectively controlling the electronic tinting lens to selectively control the tint of the electronic tinting lens.
PCT/US2022/049760 2021-12-28 2022-11-14 Eyewear electronic tinting lens with integrated waveguide WO2023129289A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202163294317P 2021-12-28 2021-12-28
US63/294,317 2021-12-28
US17/586,265 2022-01-27
US17/586,265 US20230204958A1 (en) 2021-12-28 2022-01-27 Eyewear electronic tinting lens with integrated waveguide

Publications (1)

Publication Number Publication Date
WO2023129289A1 true WO2023129289A1 (en) 2023-07-06

Family

ID=84519579

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/049760 WO2023129289A1 (en) 2021-12-28 2022-11-14 Eyewear electronic tinting lens with integrated waveguide

Country Status (1)

Country Link
WO (1) WO2023129289A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170184857A1 (en) * 2013-02-20 2017-06-29 Sony Corporation Display device
US11204501B2 (en) * 2018-04-24 2021-12-21 Mentor Acquisition One, Llc See-through computer display systems with vision correction and increased content density


Similar Documents

Publication Publication Date Title
US11792500B2 (en) Eyewear determining facial expressions using muscle sensors
US11284058B2 (en) Utilizing dual cameras for continuous camera capture
US11899283B2 (en) Antenna implementation embedded in optical waveguide module
US20220082867A1 (en) Wearable proximity sensing array for the visually impaired
US11506902B2 (en) Digital glasses having display vision enhancement
EP4107576A1 (en) Hyperextending hinge for wearable electronic device
US11619819B2 (en) Eyewear display for generating an immersive image
US11933977B2 (en) Eyewear eye-tracking using optical waveguide
WO2021167756A1 (en) Hyperextending hinge having fpc service loops for eyewear
US11880039B2 (en) Polarized reflective pinhole mirror display
US11789294B2 (en) Eyewear frame as charging contact
WO2021167762A1 (en) Hyperextending hinge having cosmetic trim for eyewear
US20230204958A1 (en) Eyewear electronic tinting lens with integrated waveguide
US20220365370A1 (en) Eyewear electrochromic lens with multiple tint regions
US11860371B1 (en) Eyewear with eye-tracking reflective element
US11852500B1 (en) Navigation assistance for the visually impaired
US20230367137A1 (en) Eyewear with a shape memory alloy actuator
WO2023129289A1 (en) Eyewear electronic tinting lens with integrated waveguide
US20230301045A1 (en) Eyewear with rf shielding having grounding springs
US20220373401A1 (en) Eyewear surface temperature evaluation
US20230137052A1 (en) Eyewear having current consumption optimization of wireless system interface
WO2023192082A1 (en) Eyewear with combined flexible pcb and wire assembly

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22823211

Country of ref document: EP

Kind code of ref document: A1