WO2023091771A2 - Hybrid optics - Google Patents

Hybrid optics

Info

Publication number
WO2023091771A2
WO2023091771A2 (application PCT/US2022/050640; US2022050640W)
Authority
WO
WIPO (PCT)
Prior art keywords
eyewear
user
disposed
shading
amount
Prior art date
Application number
PCT/US2022/050640
Other languages
English (en)
Other versions
WO2023091771A3 (fr)
Inventor
Scott W. Lewis
Original Assignee
Percept Technologies, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Percept Technologies, Inc. filed Critical Percept Technologies, Inc.
Publication of WO2023091771A2
Publication of WO2023091771A3

Classifications

    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00 - Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/02 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the intensity of light
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00 - Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/06 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the phase of light
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00 - Simple or compound lenses
    • G02B3/10 - Bifocal lenses; Multifocal lenses
    • G - PHYSICS
    • G02 - OPTICS
    • G02C - SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C7/00 - Optical parts
    • G02C7/10 - Filters, e.g. for facilitating adaptation of the eyes to the dark; Sunglasses
    • G02C7/104 - Filters, e.g. for facilitating adaptation of the eyes to the dark; Sunglasses having spectral characteristics for purposes other than sun-protection
    • G - PHYSICS
    • G02 - OPTICS
    • G02C - SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C7/00 - Optical parts
    • G02C7/10 - Filters, e.g. for facilitating adaptation of the eyes to the dark; Sunglasses
    • G02C7/105 - Filters, e.g. for facilitating adaptation of the eyes to the dark; Sunglasses having inhomogeneously distributed colouring
    • G - PHYSICS
    • G02 - OPTICS
    • G02C - SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C2200/00 - Generic mechanical aspects applicable to one or more of the groups G02C1/00 - G02C5/00 and G02C9/00 - G02C13/00 and their subgroups
    • G02C2200/10 - Frame or frame portions made from wire

Definitions

  • Human eyesight sometimes requires correction, and sometimes involves different amounts or types of correction, depending on a selected object being viewed. For example, when the object is at relatively close range, an amount of refraction to be preferred might differ substantially from when the object is at relatively distant range. For another example, when the object is disposed in a frontal region of a field of view, a user’s eyesight might differ substantially from when the object is disposed in a peripheral region of a field of view.
  • One issue that can arise with respect to eyesight is that a user might spend excessive time viewing close-range objects, such as a computing display or a mobile device (such as a smartphone). Long-term excessive viewing of close-range objects can lead to dry eye effects and/or myopia.
  • Medical personnel sometimes recommend that the user follow a “20-20-20” plan, that is, every 20 minutes, looking away for 20 seconds at an object at least 20 feet away. While this can reduce the likelihood of the user developing a dry eye effect or myopia, users can easily forget to practice this procedure.
  • One technique for delaying the onset of myopia is for the user to wear hard contact lenses, such as “Ortho-K” (orthokeratology) lenses, relatively large contact lenses that cover the sclera and force the user’s eye into a different shape. These lenses can be worn at night when the user’s eye is not otherwise busy focusing. This can have the effect of squeezing the user’s eyeball into a shape better suited for non-myopic vision. While this technique can be effective at delaying the onset of myopia, it can be less effective when the user restarts their day with the same process of staring at close-range objects such as close computing displays or small screens on mobile devices.
  • Another technique for delaying the onset of myopia is to dilate the user’s eyes, such as with atropine.
  • This can have the advantage of providing more information to the retina, as having a larger pupil size allows more light, and thus more information, to enter the user’s eye. This can avoid any need for the user to squint at a small screen, such as on a mobile device.
  • While this technique can be effective at slowing the progression of, and thus delaying the onset of, myopia, it has the drawback that it can induce photophobia in response to having the eyes frequently dilated. This can be particularly painful when the user has their eyes dilated, such as during an optometry exam. With some users, the effect can be sufficiently strong that they need tinted glasses even when their eyes are no longer dilated.
  • corrective eyewear can include multiple lens regions, such as an upper clear region having little or no refractive effect and suitable for distance viewing, and a lower “reader” region having a refractive effect suitable for close-range viewing (such as when reading a book or viewing a mobile device display). In such cases, it can be desirable to encourage a viewer to direct their vision through a region best suited for viewing those objects selected to be viewed.
  • Other types of vision correction might be applied by distinct lens regions and might be appropriate to objects selected for viewing.
  • distinct polarization might be preferred with respect to objects subject to glare, mobile device displays, or laptop/ desktop displays.
  • eyewear can include multiple lens regions, such as a first region having a first polarization effect and a second region having a second polarization effect.
  • it can be desirable to encourage a viewer to direct their vision through a region best suited for viewing those objects selected to be viewed.
  • Eyewear can be disposed to apply still other types of vision correction, whether dynamically in selected regions (or for regions defined by dynamically selected sets of pixels) or statically in known regions. Similarly, in such cases, it can be desirable to encourage a viewer to direct their vision through a region best suited for viewing those objects selected to be viewed. Eyewear disposed to apply one or more amounts or types of vision correction can be made usable in one or more selected circumstances, such as reading, operating a vehicle, performing a selected activity for which particular vision correction is optimized, or otherwise as described herein.
  • each of these issues might cause difficulty in aspects of vision correction and in encouraging a user to select a defined portion of a lens in eyewear through which to view a particular object.
  • each of these issues, as well as other possible considerations, might cause difficulty in aspects of eyewear for which the user might wish to have more than one function, or a different function, applied at each of more than one portion of the user’s field of view.
  • devices, and methods for using them, are capable of optimizing sensory inputs using lenses divided into distinct portions.
  • Devices and methods described herein can include lenses distinguishing between close-range and distance viewing; distinguishing between frontal and peripheral viewing; or otherwise as described herein.
  • Devices and methods described herein can include a hybrid lens having multiple portions, such as upper/lower lens portions, each of which can be disposed to perform a distinct function; thus, an upper portion can perform a first function while a lower portion can perform a second function.
  • the distinct functions can include different amounts of refraction (such as for close-range and distant viewing), different amounts of shading/inverse-shading, different amounts of coloring/tinting, different amounts of polarization (possibly including planar polarization, circular polarization, or a combination thereof), different dynamic visual optimization (“DVO”) functions or DVO functions with different parameters, combinations or conjunctions of one or more such functions, or otherwise as described herein.
  • the distinct functions can operate in response to a situational context, such as a time of day, a known location, a nature of an ambient environment, a set of user inputs, or otherwise as described herein.
  • the user can determine and set one or more “shading bookmarks”, describing an amount of shading/inverse-shading specified to be applied in the context described by each selected bookmark, and one or more “tinting bookmarks”, describing an amount and nature of coloring/tinting specified to be applied in the context described by each selected bookmark.
  • the user can determine and set one or more other types of bookmarks, describing a nature of another type of effect to be applied in the context described by each selected bookmark, such as refraction, polarization, DVO functions, or as otherwise described herein.
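As a rough illustration, the bookmark mechanism described above amounts to a context-keyed lookup of user-chosen effect settings; this is a minimal sketch, and every name in it (Bookmark, select_bookmark, the example contexts and values) is hypothetical rather than taken from the specification.

```python
from dataclasses import dataclass

@dataclass
class Bookmark:
    """A user-defined preset naming a context and the effect to apply there."""
    context: str      # e.g. "office", "outdoors-noon", "night-driving"
    shading: float    # 0.0 = clear, 1.0 = fully shaded
    tint_rgb: tuple   # relative coloring/tinting balance per channel

def select_bookmark(bookmarks, context, default=None):
    """Return the bookmark matching the current situational context, if any."""
    for bookmark in bookmarks:
        if bookmark.context == context:
            return bookmark
    return default

bookmarks = [
    Bookmark("office", shading=0.1, tint_rgb=(1.0, 1.0, 0.9)),
    Bookmark("outdoors-noon", shading=0.7, tint_rgb=(0.9, 1.0, 0.8)),
]
```

Other bookmark types (refraction, polarization, DVO) would be additional fields or parallel presets keyed on the same context.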
  • Devices and methods described herein can be disposed to adjust an amount of shading/inverse-shading and/or coloring/tinting in at least a portion of the user’s field of view: so as to prevent refraction of bright light from damaging or injuring a viewer or objects near the lenses, or otherwise as described herein.
  • Shading/inverse-shading can be combined with coloring/tinting, with each one possibly being responsive to the other, such as one or more of: (A) adjusting coloring/tinting in response to an amount of luminance in an ambient environment, or (B) adjusting shading/inverse-shading in response to an amount of luminance in a particular frequency range, such as green or blue/ultraviolet.
  • Devices and methods described herein can be disposed to adjust the shading/inverse- shading and/or coloring/tinting autonomously in response to an ambient environment, and/or in response to a user-defined set point.
  • the set point can be defined in response to a relatively normal brightness for a location where the user is located.
  • the amount and nature of shading/inverse-shading can be disposed to have a base value and a relative increase/decrease in response to an amount of luminance in an ambient environment, and similarly for coloring/tinting.
  • a GPS receiver or other location identifier can be disposed to identify a lighting context as a proxy for direct measurement of the ambient environment.
  • devices and methods can be disposed to adjust in response to one or more threshold values, such that the shading/inverse-shading and/or coloring/tinting is responsive to an ambient environment up to the threshold values and converts to a selected user set point at each threshold.
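The base-value-plus-relative-adjustment behavior, with a user set point taking over at each threshold, can be sketched as follows; the constants and the function name are illustrative assumptions, not values from the specification.

```python
def shading_amount(ambient_lux, base=0.2, gain=0.5 / 10000.0,
                   thresholds=((20000.0, 0.8),)):
    """
    Compute a shading level from ambient luminance.

    Below every threshold, the level tracks the environment: a base value
    plus a relative increase proportional to ambient luminance. At or above
    a threshold, the level converts to the user's set point for that threshold.
    """
    # Check the highest threshold first so the strongest crossed set point wins.
    for limit, set_point in sorted(thresholds, reverse=True):
        if ambient_lux >= limit:
            return max(0.0, min(1.0, set_point))
    return max(0.0, min(1.0, base + gain * ambient_lux))
```

A GPS-derived lighting context (as described above) could supply `ambient_lux` as a proxy when no direct measurement is available.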
  • Devices and methods described herein can be disposed to adjust for a selected viewing direction or depth of focus, in response to user behavior; in response to external signals; or otherwise as described herein.
  • the magnification effect can be disposed to be “turned off” or significantly reduced in response to a user focusing on different objects or different lines of sight.
  • jewelers, dentists, surgeons, and similar users can perform professional functions using a magnification effect with respect to detailed objects, while still being able to see substantially normally with respect to other objects.
  • devices or methods described herein can be disposed to adjust an amount of refraction in response to a distance to an object at which the user is looking, so as to optimize the amount of refraction to match the distance of that object, and thus the user’s depth of focus.
  • the amount of refraction can be disposed to be adjusted in one or both (or more if the lens is multi-focal) portions of the lens in response to where the user is looking and the user’s relative depth of focus, such as in response to an angle of the user’s head or body with respect to a vehicle.
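The depth-of-focus-matched refraction adjustment described above can be approximated with a thin-lens accommodation model, in which an object at distance d meters demands 1/d diopters; the prescription and add-power numbers here are hypothetical.

```python
def refraction_diopters(focus_distance_m, distance_rx=-2.0, add_power=2.5):
    """
    Choose a refraction (in diopters) to match the user's depth of focus.

    The lens supplies the accommodation demand of the focused object
    (1/d diopters, capped at `add_power`) on top of the user's distance
    prescription. All numbers are illustrative, not from the specification.
    """
    demand = 1.0 / max(focus_distance_m, 0.1)  # diopters needed at that distance
    return distance_rx + min(demand, add_power)
```

In a multi-focal lens, each portion would run this calculation against the distance associated with its region of the user's field of view.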
  • Devices and methods described herein can be disposed to respond to control inputs, such as from a user; from an evaluator, supervisor, or trainer of that user; or otherwise as described herein.
  • Devices and methods described herein can be disposed to respond to signal inputs, such as from a sensor disposed to detect an amount of luminance or a color balance of a user’s field of view.
  • optional light sensors can be disposed to respond to one or more of: an amount of luminance in an ambient environment, an amount of luminance from a specific light source, an amount of luminance from an object being focused upon by a user or in a background of such an object, or as otherwise described herein.
  • multiple such sensors can be disposed in the eyewear, as in one or more of: forward-pointing, downward-pointing, or sideways-pointing. The multiple such sensors can be disposed to determine an amount of luminance of particular portions of the user’s field of view, such as a distant viewing range, a reading range, and/or a luminance of an ambient environment.
  • actual sensors can be disposed within the eyewear with one or more external inputs disposed to receive light and transmit that light to the sensors. This can have the effect that the location within the eyewear where light is received, and the location within the eyewear where received light is actually measured, differ substantially. This can also have the effect that the location within the eyewear where light is received can be shielded against stray light inputs (or disposed to measure at two points and process the different values) so as to assure that the luminance measured by the sensor is accurate.
  • individual sensors can be disposed to measure luminance at a selected frequency or range of frequencies, such as only measuring light in a 500-560 nm range (green light), only measuring blue/ultraviolet light, only measuring red/amber light, or some combination or conjunction thereof (such as red and blue, but not green).
  • a single receiving location can be disposed to split incoming light into multiple sets of frequencies and route those multiple sets to separate sensors so as to receive that light jointly but measure it separately.
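Splitting light received at a single location into multiple sets of frequencies routed to separate sensors can be modeled as below; the band boundaries follow the green (500-560 nm), blue/ultraviolet, and red/amber ranges named in the text, while the spectrum representation (wavelength in nm mapped to power) is an assumption for illustration.

```python
# Hypothetical band layout: light is received jointly but measured separately.
BANDS = {
    "blue/uv":   (300, 500),
    "green":     (500, 560),   # the green range named in the text
    "red/amber": (560, 700),
}

def measure_bands(spectrum, bands=BANDS):
    """Split one incoming spectrum into per-band luminance totals."""
    totals = {name: 0.0 for name in bands}
    for wavelength_nm, power in spectrum.items():
        for name, (lo, hi) in bands.items():
            if lo <= wavelength_nm < hi:
                totals[name] += power
    return totals

spectrum = {450: 1.0, 530: 2.0, 610: 0.5}
```

Combinations such as "red and blue, but not green" then reduce to summing the relevant band totals.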
  • Devices and methods described herein can be disposed to encourage a user to use selected portions of lenses suitable for monitoring, prevention, or treatment of adverse optical conditions; appropriate to healthy viewing habits; or otherwise as described herein.
  • Devices and methods described herein can be disposed to provide coloring/tinting effects, including adjusting a set of frequencies presented in at least a portion of the user’s field of view: such as by injecting selected frequencies; using a chromatic filter; using a photomultiplier; or otherwise as described herein.
  • Devices and methods described herein can be disposed to provide a combination of coloring/tinting and/or shading/inverse-shading, so as to enable a viewer to primarily view selected advantageous frequencies (such as green) while still being able to view other selected frequencies (such as red) that are common in an ambient environment.
  • Devices and methods described herein can be disposed to provide additional coloring/tinting and/or other lighting, including adjusting a luminance presented in at least a portion of the user’s field of view: such as by injecting selected frequencies or other lighting; so as to urge the viewer’s eye and/ or brain out of scotopic vision or into a mesopic vision; or otherwise as described herein.
  • Lighting by the eyewear can include one or more of: a lamp disposed to inject lighting (such as one or more advantageous frequencies) into the eye, a lamp disposed to illuminate an object (such as an object of focus by the user), a lamp disposed to illuminate a region in the user’s field of view (such as a region the user is to be encouraged to view by the eyewear). Similar to a lamp, the eyewear can be disposed to control a display so as to brighten or darken the display (such as using a signal transmitted to the display or a computing device controlling the display).
  • Devices and methods described herein can be disposed to adjust dynamic visual optimization (“DVO”) in response to user behavior; in response to control inputs; or otherwise as described herein.
  • Dynamic visual optimization can be combined with control, for at least a portion of the viewer’s field of view, of one or more of: an amount of refraction, shading/inverse-shading, coloring/tinting effects.
  • Devices and methods described herein can be disposed to adjust a set of shading/inverse-shading and/or coloring/tinting effects, DVO effects, or other visual effects, presented in at least a portion of the user’s field of view, in response to an audio or audio/video presentation, or in response to another selected signal: such as a presentation selected by the user, a presentation from an ambient environment, a presentation associated with a signal received by digital eyewear, or otherwise as described herein.
  • devices and methods described herein can be disposed for use with particular user devices; for use with particular user activities; or otherwise as described herein.
  • devices and methods described herein can be disposed for use with one or more combinations of properties and/or methods, one or more particular user activities, or otherwise as described herein.
  • Fig. 1 (collectively including figs. 1A-1C) shows a conceptual drawing of a first example eyewear system, showing a lens divided into sections for close-range and distance viewing.
  • Fig. 2 (collectively including figs. 2A-2B) shows a conceptual drawing of a second example eyewear system, showing a lens divided into sections for frontal and peripheral viewing.
  • FIG. 3 (collectively including fig. 3A-3C) shows a conceptual drawing of a third example eyewear system, showing a contact lens.
  • FIG. 4 shows a conceptual drawing of an example eyewear system, showing adjustment of refraction in response to a depth of focus.
  • this Application describes devices, and methods for using them, capable of optimizing sensory inputs, such as using lenses having multiple capabilities that can be disposed to apply differing visual effects to distinct portions of a user’s field of view.
  • Devices described herein are sometimes referred to herein as “digital eyewear” or “eyewear”, even though they might be operating under control of one or more computing devices.
  • Elements in such devices can include “lenses”, which might include lenses in glasses; contact lenses; intraocular lenses; lenses in face masks, goggles, helmets, or similar equipment; “telescopic” lenses; lenses in jeweler’s loupes, dental lenses, surgical lenses, or similar equipment; lenses in binoculars, sights, scopes, or similar equipment; or as otherwise described herein.
  • devices can include any similar equipment or any similar form factor, such as a first form factor having a front piece to support one or more lenses and one or more temples (ear pieces) disposed to support the front piece using a user’s ears, or such as a second form factor having a front piece to support one or more lenses and a nose piece disposed to support the front piece using a user’s nose (such as a pince-nez).
  • the eyewear can have multiple regions having distinct refractive properties; this can have the effect that the eyewear can be suitable for either nearer or farther viewing, or for either viewing in a frontal region or a peripheral region.
  • the eyewear can have multiple regions having distinct shading/inverse-shading properties; this can have the effect that the eyewear can be suitable for viewing objects with different shading/inverse-shading or with no shading/inverse-shading.
  • the eyewear can have multiple regions having distinct polarization properties; this can have the effect that the eyewear can be suitable for viewing objects with different polarization or with no polarization.
  • the eyewear can have multiple regions having distinct dynamic visual optimization; this can have the effect that the eyewear can be suitable for viewing objects with different flickering/periodic display properties, or no flickering/periodic display properties.
  • the eyewear can have multiple regions having distinct color balance; this can have the effect that the eyewear can be suitable for viewing objects with different color frequencies or mixtures thereof.
  • Hybrid lenses and contextual functions. This Application describes devices, and methods for using them, capable of providing hybrid lenses.
  • the hybrid lenses can be disposed to have multiple portions, such as upper/lower lens portions. Each portion can be disposed to perform a distinct function.
  • the distinct functions can include one or more of:
  • Different amounts of shading/inverse-shading, such as might be used to encourage the user to look through selected portions, or such as might be used to decrease an effect of bright light or glare infalling from a particular direction, or such as might be used to prevent damage from high refractive strength in bright sunlight.
  • Different amounts of coloring/tinting, such as might be used to reduce the likelihood or severity of an oncoming episode of migraine or photophobia, or such as might be used to treat or ameliorate a current episode of migraine or photophobia, or such as might be used with respect to using the hybrid lenses in contexts in which seeing particular colors is important (such as driving).
  • Different amounts of polarization, such as might be used to reduce an amount of glare from one or more sources, such as sunlight, reflections, computing displays, or otherwise as described herein.
  • Different dynamic visual optimization (“DVO”) functions, or DVO functions with different parameters.
  • the distinct functions can operate in response to a situational context, such as a time of day, a known location, a nature of an ambient environment, a set of user inputs, or otherwise as described herein.
  • the user can determine and set one or more “shading bookmarks”, describing an amount of shading/inverse-shading specified to be applied in the context described by each selected bookmark.
  • the eyewear can be disposed to adjust its refractive properties in response to relatively bright light; this can have the effect that the eyewear can be suitable for use in bright sunlight, such as for reading in an outside environment or when placing lenses for close-range viewing on or near a light-sensitive object.
  • the light-sensitive object can include paper (which can discolor or burn) or electronic equipment (which can be damaged by concentrated light or heat).
  • the eyewear can be disposed to adjust an amount of shading/inverse-shading and/or coloring/tinting of at least a portion of a lens, such as a portion having concentrated refractive properties, in response to relatively bright light.
  • the eyewear can be disposed to determine whether light in an ambient environment is sufficiently bright, and whether the user’s viewing direction includes that bright light, such that damage or injury can result.
  • the eyewear can be used when switching between close-range viewing and distance viewing: such as when switching between distance viewing and reading from a book or an electronic display; switching between piloting a vehicle and operating its controls, such as an aircraft, ground vehicle, or watercraft; switching between performing a detailed or life-threatening activity that includes both close-range viewing and distance viewing, such as law enforcement or search/rescue, or emergency medical response or treatment; or switching between close-range viewing and distance viewing as otherwise described herein, such as for example in bright ambient light.
  • the eyewear can be used when switching between frontal viewing and peripheral viewing, such as when switching between piloting a vehicle and watching for traffic, such as an aircraft, ground vehicle, or watercraft; performing a detailed or life-threatening activity that includes both frontal viewing and peripheral viewing, such as law enforcement or search/rescue; or otherwise as described herein.
  • the eyewear can determine where a “close-range” location is, such as in response to a signal from one or more of: an external device (such as a computing display or mobile device), a control panel or dashboard of a vehicle, a medical instrument, a signal from a facemask or helmet that can determine whether the user’s eyes are directed at the close-range location, or otherwise as described herein.
  • the eyewear can similarly determine where a peripheral location is, such as in response to a signal from one or more of: an external device, a control panel or dashboard, a medical instrument, a facemask or helmet, or otherwise as described herein.
  • the eyewear can determine whether a location is close-range versus distance, or frontal versus peripheral, in response to an input signal from a user, such as described below, indicating when the user believes they are looking at a close-range versus a distance location, or at a frontal versus a peripheral location.
  • This can have the effect that the user can dynamically adjust one or more dividing borders on the lens between close-range versus distance effects, or between frontal versus peripheral effects, rather than having those effects occur in response to a fixed dividing line on the lens.
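A movable dividing border between close-range and distance regions, adjustable by user input rather than fixed on the lens, might be modeled as in this minimal sketch; the class and method names (HybridLens, nudge_border, region_at) are hypothetical, not taken from the specification.

```python
class HybridLens:
    """A lens whose close-range/distance dividing border can be moved by the user."""

    def __init__(self, border=0.5):
        # Border as a fraction of lens height; below it, the close-range effect applies.
        self.border = border

    def nudge_border(self, delta):
        """A user input moves the dividing border rather than relying on a fixed line."""
        self.border = max(0.0, min(1.0, self.border + delta))

    def region_at(self, gaze_height):
        """Which effect applies at a gaze position (0.0 = bottom of lens, 1.0 = top)."""
        return "close-range" if gaze_height < self.border else "distance"
```

With this shape, a touch or gesture control simply calls `nudge_border`, and the lens consults `region_at` for the current gaze point.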
  • This Application describes devices, and methods for using them, capable of optimizing sensory inputs in bright ambient light, such as when using eyewear under control of a user wearing that eyewear, or under control of an evaluator, supervisor, or trainer of the user wearing that eyewear. For some examples:
  • the eyewear can be disposed to respond to user controls, such as touch controls (including capacitive or contact touch), gesture controls (including eye or facial gestures), body movement controls (including hand/finger, head, or other body gestures), voice controls (including spoken commands or echolocation), controls by an application (“app”) operating on a computing device or mobile device, or otherwise as described herein.
  • the eyewear can be disposed to respond to user control by adjusting a polarization or shading amount, thus providing user control of a sunglasses effect, or adjusting a color balance, thus providing user control of a filtering or tinting effect.
  • the eyewear can be disposed to respond to controls by an evaluator, supervisor, or trainer, such as a coach, teacher, test proctor, sports scout, work supervisor, or otherwise as described herein.
  • an evaluator, supervisor, or trainer can be disposed to receive sensory inputs from the user’s eyewear and to adjust those sensory inputs for a process in which the user is evaluated, reviewed, trained, or otherwise as described herein.
  • the evaluator, supervisor, or trainer can be disposed to use a computing device or mobile device to adjust sensory inputs designated for the user.
  • the evaluator, supervisor, or trainer can be disposed to adjust the user’s sensory inputs to add new inputs, such as an augmented reality or virtual reality overlay in an otherwise natural environment.
  • the evaluator, supervisor, or trainer can be disposed to block or remove inputs, such as by filtering, shading, or tinting, a portion of the user’s field of view.
  • the eyewear can be disposed to determine an amount of luminance infalling from a user’s field of view, such as either the entire field of view or a portion thereof.
  • the eyewear can be disposed to shade a portion of the lenses associated with close-range vision.
  • the eyewear can be disposed to shade a portion of the lenses associated with distant vision.
  • the eyewear can be disposed to shade a portion of the lenses associated with peripheral vision.
  • One or more sensors can be disposed in particular peripheral directions, such as to allow the eyewear to determine whether luminance is excessive from that particular direction.
  • when the eyewear determines that the color balance of the user’s field of view is inapposite to the user’s medical condition, such as when the coloring/tinting is skewed too far toward blue/ultraviolet, the eyewear can be disposed to provide coloring/tinting to adjust the color balance, such as to be more appropriate to the user’s medical condition.
  • when the eyewear receives a signal indicating that coloring/tinting or shading/inverse-shading should be adjusted, such as a signal from medical personnel or from a caretaker, the eyewear can be disposed to adjust the coloring/tinting or shading/inverse-shading for all or a portion of the user’s field of view, as indicated by the received signal.
  • the eyewear can be disposed to remind the user that they have been focusing on close-range objects for longer than is healthy. In such cases, the eyewear can be disposed to remind the user to look away from close-range objects, such as by presenting an indicator to the user that sufficient time has passed and that the user should look toward a distant object for at least a short time.
  • the eyewear can be disposed to present an indicator to the user to follow a 20-20-20 plan, that is, every 20 minutes to look away for at least 20 seconds at an object at least 20 feet away.
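The 20-20-20 prompt described above reduces to a timer over the eyewear's focal-distance record. A minimal sketch of such a state machine, assuming hypothetical (focal distance, timestamp) sensor samples; the class name and the 1 m close-range threshold are illustrative, not from the application:

```python
CLOSE_RANGE_M = 1.0        # focal distances under ~1 m treated as close-range (assumed)
WORK_INTERVAL_S = 20 * 60  # 20 minutes of continuous close-range viewing

class TwentyTwentyTwenty:
    """Track continuous close-range viewing; signal a reminder at 20 minutes."""
    def __init__(self):
        self.close_since = None

    def update(self, focal_distance_m, now_s):
        """Return True when the user should look ~20 feet away for 20 seconds."""
        if focal_distance_m < CLOSE_RANGE_M:
            if self.close_since is None:
                self.close_since = now_s
            return now_s - self.close_since >= WORK_INTERVAL_S
        self.close_since = None  # a distant glance resets the work timer
        return False
```

In practice the reminder indicator itself could be any of the cues described herein (an audio beep, a video flash, or tinting of a lens portion).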
  • the eyewear can be disposed to so inform the user.
  • the eyewear can be disposed to tint a close-range portion of the lenses so as to prompt the user to look away from a close-range object and toward a distance. For example, after warning the user, the eyewear can be disposed to darken the close-range portion of the lenses at greater and greater amounts, so as to prevent the user from continuing to squint at close-range objects.
  • the eyewear can be disposed to tint all or a portion of the lenses so as to prompt the user to blink. For example, after warning the user, such as using an audio beep or a video flash, the eyewear can be disposed to darken the entirety of the lenses at greater and greater amounts, so as to prevent the user from continuing to stare without blinking, prompting the user to blink. For another example, the eyewear can be disposed to use this effect or a similar effect to ameliorate an effect of migraine or photophobia.
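The "greater and greater" darkening described in the two bullets above can be modeled as a shading level that ramps up after the warning, capped below full opacity so the user is prompted rather than blinded. A minimal sketch; the ramp time and cap are assumed values, not from the application:

```python
def shading_level(seconds_since_warning, ramp_s=10.0, max_shade=0.9):
    """Ramp lens shading from 0.0 (clear) toward max_shade after a warning,
    so the user is increasingly prompted to blink or to look away."""
    if seconds_since_warning <= 0:
        return 0.0
    return min(max_shade, max_shade * (seconds_since_warning / ramp_s))
```

A controller would reset the ramp (return to 0.0 shading) once the eyewear detects the blink or the change in focal distance it was prompting for.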
  • the eyewear can be disposed to tint the lenses so as to prompt the user’s eyes to naturally dilate (or to tint the lenses in combination with dilation using atropine).
  • the tinting can be restricted to regions where the user is not looking, so as to prompt the user’s eyes to naturally dilate, while still allowing the user’s eyes to receive sufficient light to avoid any necessity to squint, and without acclimatizing the user’s eyes to excessive dilation.
  • the eyewear can be disposed to alter a refraction of a close-range portion of the lenses so as to prompt the user to look away from a close-range object and toward a distance.
  • the eyewear can be disposed to alter the refraction of the close-range portion of the lenses away from the user’s natural prescription, so as to prevent the user from focusing effectively on close-range objects. In such cases, the user can be prompted to focus on a more distant object in response to a change in effect of the lens.
  • the eyewear can include an artificial pupil generated in response to a radial shading on a lens, such as a contact lens or an intraocular (IOC) lens, disposed to perform one or more of: (A) reducing an amount of light admitted to the user’s eye, or (B) altering a focal length of the user’s eye.
  • the user can be prompted to blink, or to focus on a more distant object, in response to a change in effect of the lens.
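For effect (A) above, stopping the eye down from its natural pupil diameter to a smaller shaded aperture admits light in proportion to the ratio of aperture areas. A sketch of that relationship; the diameters in the example are illustrative:

```python
def light_fraction(artificial_pupil_mm, natural_pupil_mm):
    """Fraction of light admitted when radial shading reduces the effective
    pupil from its natural diameter to a smaller artificial aperture.
    Area scales with diameter squared, so a half-diameter aperture
    admits one quarter of the light."""
    return (artificial_pupil_mm / natural_pupil_mm) ** 2
```

For example, a 2 mm artificial pupil in front of a 4 mm natural pupil admits one quarter of the light; the smaller aperture also increases depth of field, which relates to effect (B).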
  • the eyewear can include an artificial refraction effect generated in response to a radial shading on a lens, such as a contact lens or an intraocular (IOC) lens, disposed to alter a refractive effect on light admitted to the user’s eye.
  • the user can be prompted to blink in response to a change in effect of the lens.
  • This Application also describes devices, and methods for using them, capable of monitoring the user’s use of the eyewear, so as to allow medical personnel, such as an optometrist or ophthalmologist, to determine the user’s typical eye behavior, particularly with respect to concentrating on close-range or distant viewing.
  • the eyewear can be disposed to maintain a record of the user’s gaze direction and/or focal length. For example, the eyewear can be disposed to determine how frequently and for how long the user views objects at a distance and how frequently and for how long the user views objects at close-range. For another example, the eyewear can be disposed to determine in what direction the user is looking when looking at a distance and in what direction the user is looking when viewing objects at close-range.
  • the eyewear can also be disposed to use the record of the user’s gaze direction and/or focal length to determine whether the user is staring at a computing display or mobile device or is viewing different objects, such as the user might do while being active outside (or possibly being active inside).
  • the eyewear can also be disposed to use the record of the user’s gaze direction and/or focal length to determine whether the user is focused at a frontal portion of their field of view or is occasionally moving their eyes to identify objects or movement in a peripheral portion of their field of view.
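The record described above can be summarized for medical personnel as total viewing time per range. A minimal sketch, assuming hypothetical (focal distance, duration) samples and an assumed 1 m close-range threshold:

```python
from collections import Counter

def summarize_gaze(samples, close_m=1.0):
    """Aggregate (focal_distance_m, duration_s) samples into total seconds
    of close-range vs. distant viewing, as in the record described above."""
    totals = Counter()
    for distance_m, duration_s in samples:
        totals["close" if distance_m < close_m else "distant"] += duration_s
    return dict(totals)
```

The same aggregation could be keyed by gaze direction instead of distance to support the frontal-vs-peripheral determination.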
  • the eyewear can be disposed to maintain a record of a brightness level where the user is located or is looking, so as to determine whether the user is indoors or outdoors. For example, when the local brightness level is between about 300 and 700 lux, the eyewear can determine that the user is indoors or is looking at objects that are indoors, while when the local brightness level is over 1000 lux (or has a high variability), the eyewear can determine that the user is outdoors or is looking at objects that are outdoors.
  • the eyewear can be disposed to maintain a record of a color balance where the user is located or is looking, so as to alternatively determine whether the user is indoors or outdoors. For example, when the blue or ultraviolet component of the ambient light environment is greater than a selected fraction, the eyewear can be disposed to determine that the user is indoors or is looking at objects illuminated by indoor lighting, while when the green component is greater than a selected fraction, the eyewear can be disposed to determine that the user is outdoors or is looking at objects illuminated by outdoor lighting.
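The brightness heuristic above can be sketched as a simple classifier. The lux bands come from the text; the variability threshold is an assumption, and a real implementation could combine this with the color-balance signal also described above:

```python
def classify_environment(lux, lux_variability=0.0):
    """Classify indoors vs. outdoors from local brightness:
    ~300-700 lux suggests indoors; over 1000 lux, or highly
    variable brightness, suggests outdoors."""
    if lux > 1000 or lux_variability > 0.5:
        return "outdoors"
    if 300 <= lux <= 700:
        return "indoors"
    return "unknown"
```

Ambiguous readings (for example, 850 lux with stable brightness) fall through to "unknown", which a fuller implementation might resolve with the color-balance record.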
  • the eyewear can be disposed to use the record of the color balance to determine a set of times of day, or an evaluation of ambient weather, when the user is outside.
  • the eyewear can be disposed to determine, in response to a color balance, a temperature associated with times of day when the user is outside, particularly in combination with an evaluation of the user’s location (as might be determined in response to a GPS receiver).
  • the eyewear can be disposed to perform dynamic visual optimization (DVO) (possibly such as described in the Incorporated Disclosures) with respect to a first portion of a lens, such as a portion (or multiple such portions) directed to a computing display or a mobile device, and to refrain from performing such optimization with respect to a second portion of the lens, such as a portion (or multiple such portions) directed to distant vision.
  • the eyewear can be disposed to perform dynamic visual optimization (DVO) with respect to distant vision, and to refrain from doing so with respect to the computing display or mobile device.
  • When the eyewear performs dynamic visual optimization (DVO) with respect to a portion of a lens directed to a computing display or a mobile device, the eyewear can be synchronized to a display frequency of the computing display or mobile device. Alternatively, the eyewear can be synchronized to a fraction or a multiple of that frequency, so as to cause the computing display or mobile device to appear, to the user, to be dimmed or shaded.
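When the lens shutter is synchronized to the display, the display's perceived brightness scales with the fraction of each frame during which the lens is clear. A sketch of that duty-cycle dimming relationship, with illustrative numbers; the function name and interface are assumptions:

```python
def shutter_open_ms(display_hz, dim_factor):
    """Open time per display frame so that a display synchronized with the
    lens shutter appears dimmed to dim_factor of its native brightness."""
    if not 0.0 <= dim_factor <= 1.0:
        raise ValueError("dim_factor must be between 0 and 1")
    frame_ms = 1000.0 / display_hz  # duration of one display frame
    return frame_ms * dim_factor    # duty-cycle dimming
```

For example, a 50 Hz display (20 ms frames) appears at half brightness when the lens is clear for 10 ms of each frame; an unsynchronized shutter would instead produce visible beating.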
  • Multi-focal contact lens effects
  • This Application also describes devices, and methods for using them, capable of adjusting sensory inputs passing through contact lenses, so as to allow the user bifocal or multi-focal lens effects. For some examples:
  • the eyewear can be disposed to include a contact lens having multiple portions, each with a separate refractive effect.
  • a first refractive effect can be optimized for close-range viewing while a second refractive effect can be optimized for distant viewing.
  • a first set of portions can be disposed having the first refractive effect and a second set of portions can be disposed having the second refractive effect.
  • the first set of portions and the second set of portions can be disposed in multiple sets of concentric rings, such as interlaced rings with even rings being included in the first set of portions and odd rings being included in the second set of portions.
  • the contact lens can use shading/inverse-shading to select one or more of those portions in response to the eyewear selecting whether the user is, or should be, focusing on close-range viewing or distant viewing. In such cases, when the user is, or should be, focusing on a selected viewing range, the eyewear can adjust the shading/inverse-shading to activate an associated refractive effect.
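The interlaced-ring selection above reduces to shading whichever set of rings carries the unselected refractive effect. A minimal sketch, with even rings assumed to carry the close-range effect as in the example; names are illustrative:

```python
def ring_state(ring_index, selected_range):
    """Even concentric rings carry the close-range refractive effect and odd
    rings the distant effect; shading deactivates the unselected set."""
    ring_is_close = ring_index % 2 == 0
    wants_close = selected_range == "close"
    return "clear" if ring_is_close == wants_close else "shaded"
```

When the eyewear selects close-range viewing, the even rings stay clear and the odd rings are shaded, activating only the close-range refractive effect, and vice versa for distant viewing.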
  • the lens can be disposed to use a local power supply, such as (A) an antenna collecting power from ambient electromagnetic fields, (B) a battery replaceable when the contact lens is replaced, (C) a photodiode collecting power from incoming light, (D) a ratchet or spring collecting power from the user’s eye movements, head movements, or other movements, or (E) otherwise as described herein.
  • eyewear and associated optical systems such as glasses or sunglasses, contact lenses, goggles, facemasks or helmets, or other similar elements;
  • optical systems such as automobile mirrors and windshields, binoculars, cameras, computer screens (including laptops, tablets, smartphones, and otherwise), microscopes and telescopes, or rifle scopes or other gun scopes;
  • When the eyewear is embodied in goggles, the eyewear can be disposed with foam (or another substance) coupling the goggles to the surface of the user’s face, so as to provide a shield against stray or otherwise untoward incoming light. As described herein, in one embodiment, the eyewear can be disposed to provide a selected amount of luminance input to the user’s eyes, such as using shading/inverse-shading, so as to provide a limited amount of luminance, so as to prevent, ameliorate, or treat migraine, photophobia, or neuro-ophthalmic disorder.
  • the eyewear can be disposed to use coloring/tinting, so as to provide light input to the user’s eyes in a selected frequency range, such as 500-560 nm (green), so as to prevent, ameliorate, or treat migraine, photophobia, or neuro-ophthalmic disorder.
  • the selected shading/inverse-shading or selected coloring/ tinting can be controlled by the eyewear.
  • One type of particular device can include lenses that are disposed to provide enhanced user vision of elements that are relatively distant, relatively small, relatively undistinguished against their background or their setting, or otherwise difficult to see with the user’s unaided eyesight.
  • eyewear can include lenses that allow the user to see objects that are relatively small or are otherwise difficult to distinguish from the object’s background or setting. Examples include jeweler’s loupes, dental lenses, surgical lenses, or similar equipment.
  • the eyewear can be disposed to allow the user to examine such objects while performing one particular function (such as surgery) and to operate with normal eyesight while performing another function (such as when observing instruments or tools, monitoring equipment, or other medical personnel).
  • the eyewear can be disposed to allow the user to examine an augmented reality (AR) or virtual reality (VR) enhanced presentation of elements for which the user is performing operations.
  • the eyewear can be disposed to present an AR view on at least a portion of the lenses that identifies a diseased portion of a patient’s anatomy (such as a decayed tooth) or a nerve in the patient’s anatomy. This can have the effect that the user is assisted when performing their selected activity.
  • This Application describes devices, and methods for using them, capable of providing the user with the ability to automatically or rapidly switch between close-range and distance viewing, or between frontal and peripheral viewing, in one or more of the following circumstances:
  • This Application also describes devices, and methods for using them, capable of providing the user with the ability to automatically or rapidly switch between close-range and distance viewing, or between frontal and peripheral viewing, while participating in activities in which switching between viewing parameters is valuable to the viewer, such as: — (A) operating a flying vehicle, such as an aircraft, an ultralight aircraft, a glider, a hang- glider, a helicopter, or a similar vehicle;
  • (E) participating in a sport using relatively rapid sports equipment such as baseball, basketball, an equestrian sport (such as dressage or horse racing), football, field hockey, ice hockey, jai alai, lacrosse, a snow sport (such as skiing, sledding, snowboarding, operating a snowmobile, or tobogganing or luge), soccer, or a similar sport;
  • digital eyewear generally refers to any device coupled to a wearer’s (or other user’s) input senses, including without limitation: glasses (such as those including lens frames and lenses), contact lenses (such as so-called “hard” and “soft” contact lenses applied to the surface of the eye, as well as lenses implanted in the eye), retinal image displays (RID), laser and other external lighting images, “heads-up” displays (HUD), holographic displays, electro-optical stimulation, artificial vision induced using other senses, transfer of brain signals or other neural signals, headphones and other auditory stimulation, bone conductive stimulation, wearable and implantable devices, and other devices disposed to influence (or be influenced by) the wearer.
  • the eyewear can be wearable by the user, either directly as eyeglasses or as part of one or more clothing items, or implantable in the user, either above or below the skin, in or on the eyes (such as contact lenses), or otherwise.
  • the phrase “eyewear” is not limited to visual inputs only; it can also operate with respect to audio inputs, haptic inputs, olfactory inputs, or other sensory inputs.
  • the eyewear can include one or more devices operating in concert, or operating with other devices that are themselves not part of the eyewear.
  • eyewear can include any device worn by a user in eyeglasses and similar wearable elements; face masks, goggles, helmets, and similar wearable elements; “telescopic” lenses; lenses in jeweler’s loupes, dental lenses, surgical lenses, and similar wearable elements; other lenses in optical systems, such as vehicle mirrors and windshields; lenses in binoculars, cameras, computing device displays and screens, or similar equipment; microscopes and telescopes; other sights and scopes, such as in gun sights or similar equipment; contact lenses and intraocular lenses; or as otherwise described herein.
  • devices can include any similar equipment or any similar form factor, such as glasses having a front piece and temples (ear-pieces) or glasses in a pince-nez form factor and disposed to couple to the user’s nose.
  • devices can include any similar equipment or any similar form factor, such as devices disposed to allow a user to focus on objects at a selected distance or having selected properties, whether relatively close, relatively far, or otherwise difficult to see with unaided vision.
  • the phrases “coloring”, “color balance”, the term “tinting”, and variants thereof, generally refer to any technique by which a set of one or more frequencies or frequency ranges can be selected for emphasis or deemphasis by eyewear, including one or more of: (A) adding or injecting light of one or more frequencies or frequency ranges to the user’s eye or to one or more lenses for receipt by the user’s eye; (B) illuminating eyewear or the user’s eye so as to improve the user’s ability to see in one or more frequencies or frequency ranges; (C) filtering or removing light of one or more frequencies or frequency ranges from infalling light, so as to prevent light of those frequencies or frequency ranges from reaching the user’s eye; or otherwise as described herein.
  • coloring/tinting can have the property that the user’s field of view can be improved so as to reduce the likelihood or severity of a medical condition, or to otherwise treat or ameliorate the medical condition.
  • dynamic visual optimization generally refers to any technique by which a moving object can be presented to an observer in a substantially nonmoving manner, including one or more of: (A) presenting a sequence of substantially still images, each separately identifiable to the observer with at least some distinction between successive ones of that sequence, which collectively show a continuous motion of the object; (B) presenting a sequence of substantially short moving images, each separately identifiable to the observer with at least some distinction between successive ones of that sequence, which collectively show a continuous motion of the object; or (C) any other technique described herein by which the observer can distinguish between substantially local positions and direction of motion of the object, without the observer losing the ability to determine a relatively longer motion of the object.
  • dynamic visual optimization can have the property that the observer’s view of the moving object improves the observer’s visual acuity and reduces the cognitive load on the observer when viewing the object.
  • eye-tracking generally refers to any technique by which a gaze direction and/or distance to an object being looked at can be determined, including one or more of:
  • motion blur generally refers to artifacts of viewing objects for which there is relative motion between the user and the object, in which the object appears blurred, smeared, or otherwise unclear, due to that relative motion.
  • motion blur can occur when the object and user are moving or rotating relatively quickly with respect to each other.
  • motion blur can occur when the object is disposed in the user’s field of view other than focused upon, such as a peripheral vision field of view or an upper or lower range of the user’s field of view.
  • real time generally refers to timing, particularly with respect to sensory input or adjustment thereto, operating substantially in synchrony with real world activity, such as when a user is performing an action with respect to real world sensory input.
  • real time operation of eyewear with respect to sensory input generally includes user receipt of sensory input and activity substantially promptly in response to that sensory input, rather than user receipt of sensory input in preparation for later activity with respect to other sensory input.
  • sensory input generally refers to any input detectable by a human or animal user.
  • sensory inputs include audio stimuli such as in response to sound; haptic stimuli such as in response to touch, vibration, or electricity; visual stimuli such as in response to light of any detectable frequency; nasal or oral stimuli such as in response to aroma, odor, scent, taste, or otherwise; other stimuli such as balance; or otherwise.
  • sensor overload generally refers to any case in which excessive volume of a sensory input (such as brightness, loudness, or another measure) can cause information to be lost due to human sensory limitations.
  • excessive luminance in all or part of an image can cause human vision to be unable to detect some details in the image.
  • images having sensory overload can cause human vision to be unable to properly determine the presence or location of objects of interest.
  • “shade” and “shading/inverse-shading”, and variants thereof, generally refer to any technique for altering a sensory input, including but not limited to:
  • altering a luminance associated with a portion of an image such as by increasing luminance at a selected portion of the image, to brighten that portion of the image, to highlight a border around or near that portion of the image, to improve visibility of that portion of the image, or otherwise;
  • altering a loudness associated with an auditory signal such as by reducing loudness at substantially each portion of the auditory signal;
  • altering a loudness associated with a portion of an auditory signal such as by increasing loudness at a selected set of times or frequencies in that auditory signal, to improve listening to that portion of the auditory signal, or otherwise;
  • altering a selected set of frequencies associated with an image such as to provide a “false color” image of a signal not originally viewable by the human eye, such as to provide a visible image in response to IR (infrared) or UV (ultraviolet) light or other information ordinarily not available to human senses;
  • signal input generally refers to any input detectable by eyewear or other devices.
  • signal inputs can include electromagnetic signals other than human senses, such as signals disposed in a telephone protocol, a messaging protocol such as SMS or MMS or a variant thereof, an electromagnetic signal such as NFC or RFID or a variant thereof, an internet protocol such as TCP/IP or a variant thereof, or similar elements;
  • a mobile device generally refers to any relatively portable device disposed to receive inputs from, and provide outputs to, one or more users.
  • a mobile device can include a smartphone, an MP3 player, a laptop or notebook computer, a computing tablet or phablet, or any other relatively portable device disposed to be capable as further described herein.
  • the mobile device can include input elements such as a capacitive touchscreen; a keyboard; an audio input; an accelerometer or haptic input device; an input couplable to an electromagnetic signal, to an SMS or MMS signal or a variant thereof, to an NFC or RFID signal or a variant thereof, to a signal disposed using TCP/IP or another internet protocol or a variant thereof, to a signal using a telephone protocol or a variant thereof; another type of input device; or otherwise.
  • random generally refers to any process or technique having a substantially non-predictable result, and includes pseudorandom processes and functions.
  • a remote device generally refers to any device disposed to be accessed, and not already integrated into the accessing device, such as disposed to be accessed by eyewear.
  • a remote device can include a database or a server, or another device or otherwise, coupled to a communication network, accessible using a communication protocol.
  • a remote device can include one or more mobile devices other than a user’s eyewear, accessible using a telephone protocol, a messaging protocol such as SMS or MMS or a variant thereof, an electromagnetic signal such as NFC or RFID or a variant thereof, an internet protocol such as TCP/IP or a variant thereof, or otherwise.
  • user input generally refers to information received from the user, such as in response to audio/video conditions, requests by other persons, requests by the eyewear, or otherwise.
  • user input can be received by the eyewear in response to an input device (whether real or virtual), a gesture (whether by the users’ eyes, hands, or otherwise), using a smartphone or controlling device, or otherwise.
  • user parameters generally refers to information with respect to the user as determined by eyewear, user input, or other examination about the user.
  • user parameters can include measures of whether the user is able to distinguish objects from audio/video background signals, whether the user is currently undergoing an overload of audio/video signals (such as from excessive luminance or sound), a measure of confidence or probability thereof, a measure of severity or duration thereof, other information with respect to such events, or otherwise.
  • the phrase “visual acuity”, and variants thereof, generally refers to the ability of a user to determine a clear identification of an object in the user’s field of view, such as one or more of:
  • the object is presented in the user’s field of view against a background that involves the user having relatively greater difficulty identifying the object against that background. This is sometimes called “static” visual acuity herein.
  • the object is moving at relatively high speed, or relatively unexpected speed, in the user’s field of view, that involves the user having relatively greater difficulty identifying a path of the object. This is sometimes called “dynamic” visual acuity herein.
  • the object is presented in the user’s field of view at an angle, such as a peripheral vision angle or another non-frontal visual angle, that involves the user having relatively greater difficulty identifying the object. This is sometimes called “peripheral” visual acuity herein.
  • the object is in motion with respect to the user, such as objects that are moving directly toward or away from the user, or objects that are moving in a region of the user’s peripheral vision.
  • the object is located poorly for viewing with respect to a background, such as an object that is brightly backlit, or for which the sun or other lighting is in the user’s eyes, or an object which appears before a visually noisy background, or otherwise is difficult to distinguish.
  • the phrase “improving visual acuity”, and variants thereof, generally refers to improving the user’s audio and/or visual acuity, or improving the user’s ability to see motion, without degrading the user’s normal ability to sense audio and/or visual information, and without interfering with the user’s normal sensory activity.
  • when the user’s visual acuity is improved, the user should still be able to operate a vehicle, such as driving a motor vehicle, piloting an aircraft, or operating another type of vehicle.
  • the phrases “cognitive load”, “cognitive overload”, and “cognitive underload”, and variants thereof, with respect to observing an object, generally refer to a measure of how difficult an observer might find determining a location or movement of that object, such as with respect to a foreground or a background.
  • when the user’s cognitive load is reduced (whether due to a reduced amount of cognitive overload or cognitive underload), the user’s visual acuity is generally improved.
  • cognitive overload generally refers to a measure of excessive sensory input, with the effect that the user loses visual acuity due to that overload.
  • cognitive overload can occur with respect to a moving object when that moving object has a relatively bright sky or a relatively noisy image behind it.
  • cognitive underload generally refers to a measure of inadequate sensory input, with the effect that the user loses visual acuity due to that underload.
  • cognitive underload can occur with respect to a moving object when that moving object is relatively dim or indistinct with respect to its background.
  • FIG. 1 (collectively including figs. 1A-1C) shows a conceptual drawing of a first example eyewear system, showing a lens divided into sections for close-range and distance viewing.
  • FIG. 1A shows a conceptual drawing of an example eyewear system, showing a lens divided into an upper portion and a lower portion.
  • a system 100 such as operated with respect to a user 101 and with respect to an object 102 in the user’s field of view 103, is described with respect to elements as shown in the figure, and as otherwise described herein, such as:
  • eyewear 110 including one or more lenses 111 (possibly including one or more portions, as described herein), an eye-tracking element 112, an object-tracking element 113, an (optional) lamp 114, one or more sensors 115, and possibly other elements;
  • a computing device 120 including a processor 121, program and data memory 122, one or more input/output elements 123, and possibly other elements;
  • a communication system 130 including one or more communication devices 131 and being disposed to couple to one or more remote devices 132 (such as a database, server, a second eyewear 110, and possibly other such devices or elements).
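The element list for system 100 can be modeled as a simple data structure keyed to the figure's reference numerals; the field names and types below are illustrative placeholders, not part of the application:

```python
from dataclasses import dataclass, field

@dataclass
class System100:
    """Elements of the example eyewear system 100, keyed to figure numerals."""
    eyewear: dict = field(default_factory=lambda: {
        "lenses": [],             # 111, possibly multiple portions
        "eye_tracking": None,     # 112
        "object_tracking": None,  # 113
        "lamp": None,             # 114 (optional)
        "sensors": [],            # 115
    })
    computing_device: dict = field(default_factory=lambda: {
        "processor": None,        # 121
        "memory": None,           # 122, program and data memory
        "io": [],                 # 123, input/output elements
    })
    remote_devices: list = field(default_factory=list)  # 132, via devices 131
```

This merely mirrors the component list above; the actual coupling between elements (e.g., which sensors feed the eye-tracking element) is described in the remainder of the specification.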
  • the eyewear 110 can be disposed to include eyewear or associated optical systems, such as glasses or sunglasses (including “reader” versions thereof), contact lenses, goggles, facemasks or helmets, or other eyewear.
  • the eyewear 110 can include glasses having lenses operating under control of a computing device 120, in which the glasses include lenses 111 that can be controlled by the computing device.
  • the lenses 111 can have a corrective lens effect, such as using refraction to correct for myopia, presbyopia, astigmatism, or otherwise.
  • the lenses 111 can include a shading/inverse-shading element, whether additional to corrective lenses or not, and disposed in line between the user’s eye(s) and the user’s field of view.
  • the eyewear 110 can be disposed to rest on the user 101’s nose 141 and be supported using one or more temples 142 (such as right/left temples 142R/142L). Alternatively, the eyewear 110 can be disposed to rest on the user 101’s nose 141 and be supported thereby (such as with a pince-nez form factor). When the eyewear 110 is disposed in a face mask or goggles, the eyewear 110 can be disposed to be supported by one or more straps or other features supporting the eyewear 110 between the user 101’s eyes and the user 101’s field of view 103.
  • the user 101 can include one or more natural persons, operating individually or cooperatively, with or without assistance from an artificial intelligence (AI) or machine learning (ML) technique, and with or without assistance from one or more other software elements.
  • the user 101 can include one or more software elements, disposed to perform functions as further described herein with respect to the user 101.
  • the user 101 can include one or more communication devices disposed to couple to one or more remote devices, which can be disposed to perform functions as further described herein with respect to the user 101.
  • the object 102 can include one or more objects recognizable by the user 101, such as a physical object within the user 101’s field of view 103, an image of such an object, or an image presented on a display.
  • the physical object 102 can include a person known to the user 101, such as a social media “friend” or as otherwise described herein, a person displaying credentials known to the user 101, such as an emergency responder, medical personnel, law enforcement officer, salesperson, sports participant, or as otherwise described herein.
  • the physical object 102 can also include an object of interest to the user 101, such as an object being offered for sale, an item of sports equipment (e.g., a ball being used in a sports event), a weapon, or as otherwise described herein.
  • an image of an object can include an occlusion or a shadow of a physical object 102, such as might be observed when a ball or other sports equipment crosses a region just outside the user 101’s field of view 103.
  • an image of an object 102 can include an image presented on a display, such as including text, emoji, still or moving pictures, or other elements known to the user 101, such as might be presented on a smartphone or other mobile device, on a computing device display, or as otherwise described herein.
  • the user 101’s field of view 103 can include a whole field of view, a close-range or distant field of view, a central or peripheral field of view, another designated portion of the user 101’s field of view, or as otherwise described herein.
  • Eyewear elements
  • the lenses 111 can include one or more lenses such as supportable by glasses frames (as further described herein), one or more contact lenses, or as otherwise described herein.
  • the lenses 111 can include other suitable elements such as described in the Incorporated Disclosures; for example, the lenses 111 can include one or more lenses supportable by face masks or goggles, one or more intraocular lenses, one or more retinal image devices, or as otherwise described in the Incorporated Disclosures.
  • the lenses 111 can also include one or more lenses disposed to include different regions having refraction parameters suitable for close-range or distant viewing, central or peripheral viewing, or as otherwise described herein.
  • while the lenses 111 are shown in the figure as being relatively square or blocky, in alternative embodiments the lenses can be disposed to have any reasonable shape consistent with the functions described herein.
  • the lenses 111 can be disposed to have an “aviator” shape or any other shape that is deemed fashionable or otherwise attractive by the user.
  • the lenses 111 can be disposed to have a relatively circular shape, so as to provide the possibility that an additional lens (or a filter) can be coupled to one or more of the lenses in a rotatable configuration.
  • there is no particular requirement that the lenses 111 have a particular shape as shown in any one or more of the figures; however, some particular shapes can be necessary or convenient for some of the functions described herein, such as the possibility of coupling to an additional (rotatable) lens or filter.
  • the eye-tracking element 112 can include a dynamic eye-tracking element disposed to determine one or more of: a gaze direction, or a depth of focus, of one or more of the user’s eyes.
  • the eye-tracking element 112 can include an inward-looking (thus, directed toward the user’s eye) camera disposed to determine a set of one or more directions in which the user’s eye is disposed. This can have the effect that the eye-tracking element 112 can determine a direction in which the user’s eye is pointed, thus a gaze direction associated with that user’s eye. With two such gaze directions, the eye-tracking element 112 can determine a depth of focus or focal length associated with the user’s two eyes. Alternatively, the eye-tracking element 112 can measure a pupil width to determine a depth of focus or focal length associated with the user’s eye having that pupil width.
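The vergence computation described above — deriving a depth of focus from the two gaze directions — can be illustrated with a minimal sketch. This is a hypothetical Python example; the geometry (eyes on a horizontal axis, yaw measured toward the nose) and all names are assumptions for illustration, not part of the disclosure.

```python
import math

def depth_of_focus(ipd_mm: float, left_yaw_rad: float, right_yaw_rad: float) -> float:
    """Estimate focal distance (mm) from the inward rotation (yaw) of each eye.

    Eyes sit at x = -ipd/2 (left) and x = +ipd/2 (right); yaw is measured
    from straight ahead, positive when the eye rotates toward the nose.
    """
    # Gaze ray from left eye:  x = -ipd/2 + z * tan(left_yaw)
    # Gaze ray from right eye: x = +ipd/2 - z * tan(right_yaw)
    # Intersection:            z = ipd / (tan(left_yaw) + tan(right_yaw))
    denom = math.tan(left_yaw_rad) + math.tan(right_yaw_rad)
    if denom <= 0:
        return float("inf")  # parallel or diverging gaze rays: distant focus
    return ipd_mm / denom
```

For example, with a 62 mm interpupillary distance and both eyes rotated 2° inward, the estimated focal distance is roughly 0.89 m; parallel gaze directions indicate distant focus.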
  • the object-tracking element 113 can include an outward-facing camera (thus, directed toward the user’s field of view 103) coupled to the computing device 120.
  • the computing device 120 can be disposed to receive one or more signals from the camera and to determine a location of a designated object 102, such as a baseball or other sports equipment.
  • the computing device 120 can use an artificial intelligence (AI) or machine learning (ML) technique to determine the location of the designated object 102, in response to a signal from the camera indicating information about images available within the user’s field of view 103.
  • the computing device 120 can be disposed to track the designated object within the user’s field of view 103.
  • the computing device 120 can also possibly predict the location of the designated object 102 outside the user’s field of view 103 when that object is moving inertially, such as a baseball or similar sports equipment.
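The inertial prediction mentioned above can be sketched under the usual ballistic assumption (constant gravity, drag ignored). This hypothetical Python fragment is illustrative only; the function name and units are assumptions.

```python
def predict_position(pos, vel, dt, gravity=(0.0, -9.81, 0.0)):
    """Predict where an inertially moving object (e.g. a thrown ball) will be
    after dt seconds, from its last tracked position (m) and velocity (m/s),
    using constant-acceleration kinematics: p + v*dt + g*dt^2/2."""
    return tuple(p + v * dt + 0.5 * g * dt * dt
                 for p, v, g in zip(pos, vel, gravity))
```

A ball last seen at the origin moving at 10 m/s upward and 10 m/s forward would be predicted about 5.1 m up and 10 m ahead one second later.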
  • the lamp 114 can include an electromagnetic emitter including frequencies in a visible spectrum.
  • the lamp 114 can include an emitter including frequencies in a near-visible spectrum, such as near-infrared or near-ultraviolet, which might be usable by one or more detectors to determine whether any objects 102 reflect those frequencies, or whether any objects fluoresce in response to those frequencies.
  • the lamp 114 can include an emitter using ultrasonic energy or another output, which might be usable by one or more detectors to determine whether any objects 102 reflect those energies, or whether any objects react in a detectable way to those energies (such as possibly by vibrating or as otherwise described herein).
  • the lamp 114 can be disposed to shine on one or more objects 102 determined to be within the user’s field of view 103, so as to present those objects to the user 101 in response to the user’s gaze direction or depth of focus.
  • the one or more sensors 115 can include light sensors, such as a photocell, photodiode, or other light sensor, disposed to determine an amount of luminance.
  • the sensors 115 can be disposed to detect luminance either (A) specifically directed to be detected by the sensors 115, such as by the computing device 120; or (B) present in an ambient environment and available to the sensor 115, such as light emitted or reflected by nearby objects 102, such as an environment near the user’s eye.
  • the sensors 115 can be disposed to determine an amount of luminance in response to a particular function determined by the computing device 120, in response to the user’s gaze direction or depth of focus, in response to one or more user gestures, in response to one or more user inputs, or as otherwise described herein.
  • the sensors 115 can be disposed to detect when a user gesture occludes one or more of the sensors, so as to allow the user 101 to enter a continuous value (thus, not limited to one of a discrete set of values).
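The continuous-value entry described above can be sketched as a normalization of the occluded sensor’s reading between its uncovered and fully covered levels. This is a hypothetical Python illustration; the calibration values are assumptions.

```python
def occlusion_value(reading: float, ambient: float, dark: float) -> float:
    """Map a photodiode reading to a continuous value in [0, 1] as the
    user's finger progressively covers the sensor.

    ambient: reading with the sensor uncovered; dark: fully covered."""
    if ambient == dark:
        return 0.0  # no usable calibration range
    value = (ambient - reading) / (ambient - dark)
    return min(1.0, max(0.0, value))  # clamp to [0, 1]
```

A reading halfway between the calibrated uncovered and covered levels yields 0.5, so the user can dial in any intermediate value rather than choosing from a discrete set.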
  • the eyewear 110 can include other elements, such as one or more other detectors disposed to determine information about the user’s environment.
  • the eyewear 110 can include one or more sonar detectors disposed to determine whether any objects are disposed near the user 101, one or more electromagnetic sensors disposed to determine whether any other eyewear are disposed near the user, one or more accelerometers disposed to determine a direction/velocity in which the user is moving, or as otherwise described herein.
  • the computing device 120 can include a processor 121, program and data memory 122, one or more input/output elements 123, and possibly other elements.
  • the computing device can include a smartphone or mobile device 120a, such as an iPhone™ or Android™ mobile phone, an MP3 or other music player, a Wi-Fi™ router or “hotspot”, or another device suitable to the functions described herein.
  • the smartphone or mobile device 120a can be disposed to interact with the user 101, such as by receiving requests from the user using a touch interface, and presenting responses to the user using a display screen.
  • the computing device 120 can include a smartphone or mobile device 120a as an input/output element 123, similarly receiving requests from the user 101 using a touch or voice interface, and presenting responses to the user using a display screen or audio output.
  • the computing device 120 can include program and data memory 122 and can be disposed to interact with another device (whether a smartphone or other mobile device 120a or otherwise) as an input or output element 123, or as both an input and output element, such as by similarly using the smartphone or other mobile device to receive requests from the user using a touch or voice interface, and presenting responses to the user 101 using a display screen or audio output.
  • the computing device 120 can include program and data memory 122 including software using an artificial intelligence (AI) or machine learning (ML) technique, disposed to determine the nature of one or more objects in the user’s field of view 103.
  • the computing device 120 can include program and data memory 122 including software using an artificial intelligence (AI) or machine learning (ML) technique, disposed to determine whether there is an object 102 of likely interest to the user 101 in the user’s field of view 103.
  • the object 102 can include an object having material which the user 101 is likely to read, such as a book or paper, a display such as on a smartphone or other mobile device 120a (or another type of display), an object at close-range, a relatively small object at which the user is likely to focus at relatively close-range, an object at a distance, a relatively large object at which the user is likely to focus at a relatively distant range, or as otherwise described herein.
  • the computing device 120 can include program and data memory 122 including software using an artificial intelligence (AI) or machine learning (ML) technique, disposed to determine whether the user 101 is performing a gesture, such as one or more of: a head movement, an eye/face gesture (such as a gesture including movement of the user’s eye, eyebrows, mouth, ears or nose, or such as combinations, repetitions, or sequences thereof), a hand/finger gesture, another body movement, or as otherwise described herein.
  • the communication system 130 can include one or more communication devices 131, such as disposed in a wireless communication system.
  • the wireless communication system can include a local transceiver disposed to exchange information with a remote transceiver, and the remote transceiver can be coupled to a remote device 132.
  • the remote device 132 can include a server device 132a disposed to receive requests and make responses, a database device 132b disposed to receive and respond to database commands/queries, a second eyewear of the same or similar type (or of a different type), or as otherwise described herein.
  • the remote device 132 can include a smartphone or other mobile device 120a, or another input/output device.
  • the smartphone or other mobile device 120a can be disposed to couple using the communication system 130 to the eyewear 110, so as to allow the eyewear to use the smartphone or other mobile device (or other input/output device) to receive and respond to commands/queries from the user 101.
  • the eyewear 110 can be disposed to include eyewear or associated optical systems, such as glasses or sunglasses (including “reader” versions thereof), contact lenses, goggles, face masks or helmets, or other eyewear.
  • the eyewear 110 can include glasses having lenses operating under control of a computing device 120, in which the glasses include lenses 111 that can be controlled by the computing device.
  • the lenses 111 can have a corrective lens effect, such as using refraction to correct for myopia, presbyopia, astigmatism, or otherwise.
  • the lenses 111 can include a shading/inverse-shading element, whether additional to corrective lenses or not, and disposed in line between the user’s eye(s) and the user’s field of view.
  • the eyewear 110 can optionally include a front-piece 116 disposed to support or otherwise hold or affix the lenses 111, and a set of temples 117 (sometimes referred to herein as “ear-pieces”) disposed to support or otherwise hold or affix the front-piece while being worn by the user 101.
  • the temples 117 can be disposed to fit over the user’s ears and support the eyewear 110.
  • while the front-piece 116 and temples 117 are shown in the figure as surrounding the lenses and resting on the user’s ears, there is no particular requirement for any such limitation.
  • the front piece 116 can be coupled to the user’s nose (such as in a pince-nez), obviating a specific need for temples 117.
  • the eyewear 110 can also include one or more lamps disposed to provide lighting.
  • the eyewear 110 can include a first lamp 114a disposed to inject light into the user’s eye.
  • the first lamp 114a can be disposed to inject light between 500–560 nm (green light) into the user’s eye; this is believed to have an advantageous effect with respect to migraine, photophobia, or neuro-ophthalmic disorder. This can have the effect that the user can have migraine or photophobia prevented, delayed, alleviated, or otherwise treated.
  • the eyewear 110 can include a second lamp 114b disposed to illuminate an object, such as an object being viewed by the user or otherwise recognized by the eyewear 110.
  • the eyewear 110 can be disposed to recognize the object and to direct the second lamp 114b at the object so as to illuminate that object. This can have the effect that the user can more easily view the recognized object, even when recognition of that object is performed by a computing device in the eyewear 110.
  • the eyewear 110 can include a third lamp 114c disposed to illuminate a region in the user’s field of view, such as a region the eyewear 110 is encouraging the user to look at.
  • the eyewear 110 can be disposed to identify a portion of the user’s field of view at which to encourage the user to look and can be disposed to direct the third lamp 114c in that direction. This can have the effect that the selected portion of the user’s field of view is made brighter than other portions and the user is encouraged to look in that direction.
  • the eyewear 110 can be disposed to control a display so as to brighten or darken a display, or a portion of a display, of a computing device.
  • the eyewear 110 can be disposed to send a signal to the computing device, or to the display itself, directing the display to brighten/darken, or directing the display to brighten/darken a portion thereof. This can have the effect that the selected display, or portion thereof, can attract the user’s attention and prompt the user to look in that direction.
  • the user 101 can include one or more natural persons, operating individually or cooperatively, with or without assistance from an ML (machine learning) or AI (artificial intelligence) technique, and with or without assistance from another software element.
  • the user 101 can include one or more software elements, disposed to perform functions as further described herein with respect to the user.
  • the eyewear 110 can be disposed to include eyewear or associated optical systems, such as glasses or sunglasses (including “reader” versions thereof), contact lenses, goggles, facemasks or helmets, or other eyewear.
  • the eyewear 110 can include glasses having lenses operating under control of a computing device 120, in which the glasses include lenses 111 that can be controlled by the computing device.
  • the lenses 111 can have a corrective lens effect, such as using refraction to correct for myopia, presbyopia, astigmatism, or otherwise.
  • the lenses 111 can include a shading/inverse-shading element, whether additional to corrective lenses or not, and disposed in line between the user’s eye(s) and the user’s field of view.
  • one or more lenses 111 can include a first layer 111* and a second layer 111**, disposed so as to provide that light passing through the lens 111 is subject to a designated effect, such as refraction, shading/inverse-shading or coloring/tinting, dynamic visual optimization (DVO), or as otherwise described herein.
  • the first layer 111* can include a substantially transparent substrate on which is disposed a material having a computer-controlled visual effect.
  • the second layer 111** can include a substance which is sensitive to an electromagnetic signal, and can be disposed to provide the visual effect in response to that electromagnetic signal.
  • the first layer 111* can include glass or plastic, or another substance suitable for being substantially transparent to electromagnetic signals or images.
  • the first layer 111* can also be disposed to support a coating or another layer (such as the second layer 111**), so as to provide a combined visual effect on the electromagnetic signal.
  • the second layer 111** can include a layer of glass, a layer of plastic, or a chemical coating on the first layer 111*, such as disposed on one or both sides of the lens 111.
  • Either the first layer 111*, or the second layer 111**, or both, can be disposed to be chemically or electromagnetically (or otherwise) responsive to a signal from the computing device 120, so as to alter one or more viewing functions applied to images received from the user 101’s field of view 103. This can have the effect that the combination of the first layer 111* and the second layer 111** can be disposed to provide a combined visual effect on an image in the user 101’s field of view 103.
  • the first layer 111* can be disposed to be responsive to an electromagnetic signal from the computing device 120, such as a signal directing it to alter the amount of refraction, the amount of shading/inverse-shading, the amount and hue of coloring/tinting, or the amount or parameters of another function it imposes on light passing therethrough.
  • the first layer 111* can also be disposed to alter other viewing features of one or more lenses 111, such as a dynamic viewing optimization (“DVO”) function, or as otherwise described herein.
  • the first layer 111* can be disposed to use chemical (or other) darkening or tinting, polarization, or other techniques suitable for control by a computer-controlled electrical signal.
  • one or more of the first layer 111* or the second layer 111** can operate under control of the computing device 120, to perform its functions as directed by the computing device.
  • the first layer 111* can provide an amount of refraction, while the second layer 111** can provide an amount of shading/inverse-shading, an amount and hue of coloring/tinting, an amount of polarization, an amount of prismatic angle deflection, or as otherwise described herein.
  • the amount of refraction provided by the first layer 111* can vary with location within the user 101’s field of view 103, while the amount of another visual effect provided by the second layer 111** can vary with a control signal from the computing device 120 (as further described herein).
  • the second layer 111** can include a coating (such as a chemical coating) including a substance which is either substantially transparent, or which alternatively provides a selected visual effect, under control of an electromagnetic signal from the computing device 120.
  • the electromagnetic signal can include a current or voltage, such as a current or voltage which might either be near zero, or which alternatively might exceed a selected threshold, such as an electromagnetic signal equivalent to a logical “0” or a logical “1”.
  • the electromagnetic signal can include a signal having selected information content, such as a sine wave at one or more selected frequencies, or an information signal encoded using a selected modulation technique, such as frequency modulation, pulse-code or pulse-width modulation, code-division multiple access (“CDMA”), or as otherwise described herein.
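As one hypothetical illustration of such a modulation technique, a shading amount could be encoded as the duty cycle of a pulse-width-modulated control signal. The Python sketch below is illustrative only; the sample count and function name are assumptions, not part of the disclosure.

```python
def pwm_waveform(duty_cycle: float, period_samples: int = 10) -> list:
    """Encode a shading level (0.0 = transparent, 1.0 = fully shaded) as one
    period of a pulse-width-modulated on/off waveform: the fraction of
    samples that are "on" equals the requested duty cycle."""
    on_samples = round(duty_cycle * period_samples)
    return [1] * on_samples + [0] * (period_samples - on_samples)
```

Averaged over a period, the waveform delivers a drive level proportional to the duty cycle, which a responsive coating could integrate into a continuous amount of shading.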
  • the first portion 111T can be disposed to provide one or more viewing functions so as to allow the user 101 (not shown) to focus on one or more relatively distant objects, thus at a relatively distant depth of view (i.e., focal length within the user 101’s field of view).
  • the second portion 111B can be disposed to provide one or more viewing functions, possibly different from the first portion 111T, so as to allow the user 101 (not shown) to focus on one or more relatively close-range objects, thus at a relatively nearby depth of view.
  • the first layer 111* might provide a first amount of refraction for an upper portion of the lens 111 suitable for distance viewing and a second amount of refraction for a lower portion of the lens 111 suitable for closer viewing, such as when the user 101 is using the eyewear 110 as “reader” glasses.
  • the closer-viewing portion of the lens 111 can be disposed to have a lesser amount of shading/inverse-shading or coloring/tinting, so as to allow the user 101 to read text on a page or a display without substantial alteration.
  • the distance-viewing portion of the lens 111 can be disposed to have a greater amount of shading/inverse-shading or coloring/tinting, so as to allow the user 101 to peruse their field of view without bright ambient light interfering with their eyesight.
  • the closer-viewing portion of the lens 111 can be disposed to have a selected amount of shading/inverse-shading or coloring/tinting, so as to allow the user 101 to read text on a page or a display in bright ambient light, without the ambient light causing the page to be so bright that the user 101 cannot read.
  • the first portion 111T and the second portion 111B can include an upper portion and a lower portion of the lenses 111, respectively. This can have the effect that the first portion 111T can be disposed to cover a portion of the user 101’s field of view 103 through which the user 101 would look when viewing one or more distant objects.
  • the second portion 111B can be disposed to cover a portion of the user 101’s field of view 103 through which the user 101 would look when viewing one or more close-range objects.
  • the second portion 111B can be disposed to cover a lower 20mm portion, while the first portion 111T can be disposed to cover the remaining portion, of the lenses 111.
  • the eyewear 110 can be disposed to be responsive to the user’s viewing direction, as determined by the computing device 120 in response to the eye-tracking element 112 or the object-tracking element 113.
  • the computing device 120 can be disposed to determine one or more of: (A) the user’s depth of focus or focal length within the user’s field of view 103; or (B) a recognized object 102 within the user’s field of view 103. In such cases, the computing device 120 can determine that the user 101 is looking through a particular portion of the lens 111, such as a first portion or a second portion (or as described herein with respect to fig. 1B, such as the distance portion 111T or the close-range portion 111B).
  • when the computing device 120 so determines, it can be disposed to urge the user 101 to look through the “correct” portion of the lens 111, thus, the portion of the lenses 111 associated with the user’s particular depth of focus or focal length, or the portion of the lenses associated with the recognized object within the user 101’s field of view 103.
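The selection of the “correct” portion described above can be sketched as a simple threshold on the measured depth of focus. This hypothetical Python fragment assumes a 1 m near/far threshold and the portion names, both of which are illustrative parameters, not values from the disclosure.

```python
def select_portion(focal_length_m: float, near_threshold_m: float = 1.0) -> str:
    """Pick which lens portion matches the user's current depth of focus:
    the close-range ("reader") portion 111B for nearby focus, the distance
    portion 111T otherwise. The 1 m threshold is an assumed parameter."""
    return "close-range" if focal_length_m < near_threshold_m else "distance"
```

The eyewear could then shade or tint the other portion to urge the user’s gaze toward the selected one.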
  • the computing device 120 can determine, in response to the lamp 114, that the user 101 is looking at a relatively close-range object 102.
  • the computing device 120 can so determine in response to whether the object 102 reflects, or otherwise responds to, frequencies emitted by the lamp 114, in such manner as to indicate that the object 102 is relatively close-range.
  • the computing device 120 can be disposed to determine a distance to the object 102 in response to (A) an amount of reflection by the object 102 from the lamp 114 back to one or more sensors 115, such as an ambient light sensor 115; (B) an amount of heat energy generated by the object 102 from the lamp 114, absorbed by the object 102 and re-radiated back to one or more sensors 115, such as an ambient light sensor 115; (C) an amount of fluorescence, or other re-radiation, by the object 102 from ultraviolet emitted by the lamp 114 and re-radiated by the object 102 back to one or more sensors 115, such as an ambient light sensor 115; (D) a time-of-flight or equivalent indicator of distance between the lamp 114 and the object 102, such as when the lamp 114 is disposed to emit ultrasonic energy or another signal detected by the object 102 and to which the object 102 responds; or as otherwise described herein.
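Alternative (D) above, the time-of-flight indicator, reduces to halving the round-trip path length. A hypothetical Python sketch, assuming an ultrasonic pulse at roughly 343 m/s in room-temperature air:

```python
def distance_from_time_of_flight(round_trip_s: float,
                                 wave_speed_m_s: float = 343.0) -> float:
    """Estimate distance to an object from the round-trip time of an emitted
    pulse (ultrasonic by default). The pulse travels out and back, so the
    one-way distance is half the total path length."""
    return wave_speed_m_s * round_trip_s / 2.0
```

A 10 ms echo delay thus corresponds to an object roughly 1.7 m away; substituting the speed of light gives the electromagnetic case.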
  • the first layer 111* of the lens 111 can be substantially transparent and have an upper portion 111a and a lower portion 111b, the upper portion 111a being disposed for distance viewing and having no substantially refractive effect, and the lower portion 111b being disposed for close-range (“reader”) viewing and having a refractive effect of between about +1 diopter and +3 diopter, or more or less.
  • the first layer 111* can operate as “reader” glasses, providing the user 101 with the option of switching between distance viewing using the upper portion 111a and close-range viewing using the lower portion 111b, using eye movement, head movement, or similar activity.
  • the second layer 111** of the lens can have a corresponding upper portion 111a and lower portion 111b, and can impose a shading/inverse-shading effect (either on the upper portion 111a or the lower portion 111b, or both, or neither), so as to implement functions described herein.
  • the second layer 111** can operate to shade/inverse-shade or color/tint portions of the first layer 111* under control of the computing device 120, so as to encourage the user 101 to direct their gaze through a selected portion of the lens 111, to reduce an amount of light incoming through one or more selected portions of the lens 111 into the eye, to ameliorate an effect of light incoming through one or more selected portions of the lens 111 into the eye, or otherwise as described herein.
  • when the second layer 111** is controlled to shade/inverse-shade or color/tint the upper portion 111a of the first layer 111* of the lens 111, the user’s eye can be shaded against bright incoming light, such as floodlights, glare, or sunlight. This can effectively provide the user with sunglasses (when the user directs their eyes at a relative distance) that still allow the user to read from a book or a display (when the user directs their eyes at a relatively close range).
  • the user’s eye can be shaded against bright incoming light that is magnified by the “reader” portion of the lens 111 and thus has a magnified effect on the user’s eye, such as when incoming light is reflected from a book or a display into a magnifying “reader” portion of the lens 111.
  • This can effectively provide the user’s eyes with protection against bright light that is untowardly magnified by the “reader” portion of the lens 111.
  • the user can be protected against excessive incoming light, such as (A) when an ambient environment provides excessive incoming light that can trigger migraine, photophobia, neuro-ophthalmic disorder, or other eye disorders; or (B) when an object within the user’s field of view is deliberately disposed to overload the user’s viewing capability, such as a dazzling laser, a “flashbang” grenade, a floodlight, or another eyesight-disabling device.
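The protective shading described above can be sketched as a mapping from measured ambient luminance to a shading amount, ramping logarithmically (since eye response is roughly logarithmic) between a comfortable level and a maximum. The thresholds in this hypothetical Python sketch are assumptions, not values from the disclosure.

```python
import math

def shading_level(luminance_lux: float,
                  comfort_lux: float = 1000.0,
                  max_lux: float = 100000.0) -> float:
    """Map measured ambient luminance to a shading amount in [0, 1]:
    no shading up to a comfortable level, ramping logarithmically to
    full shading at max_lux (assumed calibration parameters)."""
    if luminance_lux <= comfort_lux:
        return 0.0
    ratio = math.log10(luminance_lux / comfort_lux) / math.log10(max_lux / comfort_lux)
    return min(1.0, ratio)
```

A sudden spike past the maximum (e.g. a dazzling laser or floodlight) saturates the output at full shading, while ordinary indoor light leaves the lens clear.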
  • the first layer 111* can be disposed to provide an amount of refraction on the lens 111.
  • the first layer 111* can be disposed to provide a first amount of refraction on a first portion 111a of the lens 111, such as an upper portion having a distant amount of refraction for relatively distant viewing, and a second amount of refraction on a second portion 111b of the lens, such as a lower portion having a closer amount of refraction for relatively closer viewing.
  • Lenses 111 that are divided into two portions are sometimes commonly referred to as “bifocals” or “bifocal lenses”.
  • Lenses 111 that are divided into more than two portions are sometimes referred to herein as “trifocals”, “multifocals”, or “progressive lenses”.
  • the lens 111 can thus include a similar upper portion 111a, disposed to facilitate distance (or “far”) viewing, such as being relatively clear and absent of any strong refractive viewing effect.
  • the lens 111 can also include a “reader” portion 111c, localized in a corner of a similar lower portion 111b of the lens, disposed in a region where the user is likely to be focused when reading text at relatively close range.
  • the user’s eyes might be focused on a relatively close-range portion of the user’s field of view. This can have the effect that the user’s pupils are more closely positioned and are directed at a more centralized portion of the lens 111 on its lower portion 111b.
  • while the lenses 111 are shown in fig. 1A as being relatively square or blocky, in alternative embodiments the lenses can be disposed to have any reasonable shape.
  • the lenses 111 can be disposed to have a relatively circular shape, so as to provide the possibility that an additional lens (or a filter) can be coupled to one or more of the lenses in a rotatable configuration.
  • the lenses 111 can be disposed to have an “aviator” shape or any other shape that is deemed fashionable or otherwise attractive by the user.
  • while this Application primarily describes the lens 111 having a first portion 111a and a second portion 111b, there is no particular requirement for any such limitation.
  • the lens 111 (or its first layer 111*) can be disposed to have a first portion in an upper region for relatively distant viewing, a second portion in a lower region for relatively closer viewing, and a third portion in a middle region for relatively mid-range viewing.
  • the lens 111 (or its first layer 111*) can be disposed to have a sequence of portions ranging from an upper region to a lower region, each having a slightly different amount of refraction, so as to present the user 101 with a substantially continuous range of refraction across the lens.
  • the first layer 111* can be disposed to include glass or plastic having the first portion 111a and the second portion 111b with different amounts of refraction.
  • the second layer 111** can be disposed as a coating on one or both sides of the first layer 111*, and can be responsive to a first control signal 111a* with respect to the first portion 111a and a second control signal 111b* with respect to the second portion 111b.
  • the first control signal 111a* can be disposed to alter an amount of shading/inverse-shading or coloring/tinting with respect to the first portion 111a, and the second control signal 111b* can be disposed to alter an amount of shading/inverse-shading or coloring/tinting with respect to the second portion 111b.
  • This can have the effect that the eyewear 110 can be disposed to provide different visual effects with respect to a first part of the user 101’s field of view 103 and a second part of the user 101’s field of view 103.
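  The two-signal arrangement described above can be sketched as follows. This is a minimal illustration only: the `make_control_signals` helper, the dictionary layout, and the 0.0–1.0 shading range are assumptions, not anything specified in this Application.

```python
# Hypothetical sketch of per-portion control signals for the second
# layer 111**: one signal for the upper portion 111a and one for the
# lower portion 111b. Value ranges (0.0 clear .. 1.0 opaque) and names
# are assumptions.

def clamp01(x):
    """Clamp a shading value into the 0.0..1.0 range."""
    return max(0.0, min(1.0, x))

def make_control_signals(upper_shading, lower_shading,
                         upper_tint=None, lower_tint=None):
    """Build independent control signals for the two lens portions."""
    return {
        "111a": {"shading": clamp01(upper_shading), "tint": upper_tint},
        "111b": {"shading": clamp01(lower_shading), "tint": lower_tint},
    }

# Distinct effects: half shading plus green tint for distant viewing
# (upper portion), nearly clear for reading (lower portion).
signals = make_control_signals(0.5, 0.1, upper_tint="green")
```

  Because each portion receives its own signal, the upper and lower parts of the user's field of view can be given entirely different visual effects, as the text describes.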
  • the second layer 111** can be controlled to shade/inverse-shade or tint the lower portion 111b.
  • the eyewear 110 can be disposed to provide a neutral density of approximately one-half green tint and one-half shading in the upper portion 111a of the first layer 111* of the lens 111, such as a region the user can use for distant viewing. This can have the effect that the user obtains the benefit of substantially green light, but also does not filter out all of the red light, thus (for example) allowing the user to see traffic lights, brake lights, warning lights, and other hazard indicators associated with driving.
  • the eyewear can be disposed to provide a neutral density of approximately one-half green tint and one-half shading, in a portion of the lenses (either the upper portion 111a or the lower portion 111b of the lenses 111) where the user is viewing a presentation, so as to provide a substantially green environment while still allowing the display to present red colors (and colors related to red, such as orange) to the user.
  • the neutral density can be disposed either on the upper portion 111a of the lenses 111, on the lower portion 111b of the lenses, or on both the upper portion 111a and the lower portion 111b of the lenses.
  • the eyewear 110 can be disposed to control whether to present the neutral density on selected portions of the lenses 111 in response to one or more of:
  • a user control, such as the user indicating a preference;
  • a measure of light entering the lenses, such as a measure of the amount of red light available in the ambient environment.
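  A minimal decision rule consistent with the two inputs listed above might look like the following. The function name, the red-light threshold, and the way the two inputs are combined are illustrative assumptions rather than this Application's algorithm.

```python
# Hypothetical rule: present the half-green/half-shading neutral
# density only when the user has indicated a preference for it and the
# ambient environment still carries enough red light for hazard
# indicators to remain visible. The 0.15 threshold is an assumed value.

def should_apply_neutral_density(user_prefers, ambient_red_fraction,
                                 red_threshold=0.15):
    """Combine the user control with the ambient red-light measure."""
    return bool(user_prefers and ambient_red_fraction >= red_threshold)
```

  Either input alone can veto the effect: the user preference acts as a master switch, and the red-light measure prevents tinting away colors the environment barely provides.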
  • the eyewear can be disposed to provide a neutral density shading and tinting (as described herein), so as to filter commercials to a color balance more amenable to the user being able to sleep.
  • the eyewear can be disposed to shade bright blue and replace that coloring with less-bright green (possibly with a slight delay for processing), or to filter bright blue or other colors that might trigger migraines or photophobia, so as to replace those colors with black/white, or otherwise to assist the user in avoiding an episode of migraine or photophobia (or another medical condition, such as epilepsy, which might be triggered by bright or flashing commercials), or in allowing better sleep.
  • the eyewear can be disposed to use that signal to engage in shading/inverse-shading or tinting in response to a prediction of likely brightness or color balance.
  • the upper portion 111a and the lower portion 111b can be disposed to provide two distinct coloring/tinting effects. This can have the effect that the user can use the upper portion 111a for a first type of viewing, such as relatively distant viewing of a television or a movie, in which accurate color perception is preferred, and the user can use the lower portion 111b for a second type of viewing, such as relatively close-range viewing of reading material, in which the user might make use of primarily black/white perception.
  • the upper portion 111a and the lower portion 111b can be disposed to provide distinct coloring/tinting effects with other purposes or values.
  • the eyewear 110 can be disposed to provide distinct coloring/tinting effects in response to one or more of:
  • a determination of a color balance of the user’s field of view, such as a relatively close-range field of view and a relatively distant field of view;
  • a determination of a user medical condition such as whether the user is particularly subject to migraine or photophobia, or other medical conditions such as depression, epilepsy, or vertigo;
  • a determination of another person’s preference, such as in response to a caretaker; medical personnel; evaluating, observing, or training personnel; or otherwise as described herein;
  • the eyewear 110 can be disposed to urge the user 101 to look through a selected one of the portions of the lens 111, such as a selected one of the first portion 111a or the second portion 111b.
  • a selected object such as a moving object (such as a ball or other sports equipment)
  • the eyewear 110 can determine a distance to the selected object and urge the user 101 to look through a selected preferred portion of the lens 111, such as a close-range portion or distance portion of the lens.
  • the eyewear 110 can select one or more of the first portion 111a or second portion 111b, and use the second layer 111** to shade/inverse-shade or color/tint the de-selected portion (if any) so as to urge the user 101 to look through the selected portion.
  • the eyewear 110 can be disposed to use the computing device 120 to send one or more electromagnetic signals to the second layer 111** so as to cause the second layer to alter its viewing effect in combination with the first layer 111*.
  • the computing device 120 can be disposed to alter the first control signal 111a* and the second control signal 111b* so as to darken or tint the de-selected portion of the lens 111 so as to urge the user 101 to look through the selected portion of the lens.
  • the computing device 120 can be disposed to alter the first control signal 111a* and the second control signal 111b* so as to perform one or more of the following:
  • the second layer 111** can be disposed to darken or strongly tint the de-selected portion of the lens 111 so as to prevent the user 101 from easily seeing through the de-selected portion of the lens. This can have the effect that the user 101 will instead view their field of view 103 through the selected portion of the lens.
  • the second layer 111** can be disposed to slowly increase an amount of shading/inverse-shading or coloring/tinting of the de-selected portion of the lens 111 so as to warn the user 101 that the de-selected portion is about to be strongly shaded/inverse-shaded or colored/tinted. This can have the effect that the user 101 will realize that the de-selected portion is about to become difficult to view and will instead view their field of view 103 through the selected portion of the lens.
  • the second layer 111** can be disposed to provide the user 101 with a warning that the de-selected portion of the lens 111 is about to be de-selected before shading/inverse-shading or coloring/tinting the de-selected portion.
  • a warning signal can include one or more of several possibilities.
  • the warning can include a symbol or text presented on the lens 111, such as an arrow pointing toward the selected portion of the lens.
  • the warning can include a function presented on one or more portions of the lens 111, such as varying an amount of shading/inverse-shading or coloring/tinting so as to provide a viewable “flash”, or another signal known to the user 101.
  • the viewable “flash” can be presented on the selected portion, the de-selected portion, or alternately on one after the other.
  • the warning can include a signal other than a visual signal.
  • the warning signal might include an audio signal which might include a beeping or buzzing, or a voice instruction.
  • the warning signal might include a haptic signal, which might include a buzzing or shaking.
  • the warning signal might include another signal known to the user 101, or might include a signal as otherwise described herein.
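  The warn-then-shade sequence described above can be sketched as a simple ramp: the de-selected portion darkens in gradual steps so the user has time to shift their gaze before it becomes hard to see through. The step count, levels, and generator shape are illustrative assumptions.

```python
# Hypothetical sketch: gradually ramp the shading of the de-selected
# lens portion from its current level to a strong level, acting as its
# own warning before full shading is applied.

def shading_ramp(start, end, steps):
    """Yield evenly spaced shading levels from just above start to end."""
    for i in range(1, steps + 1):
        yield start + (end - start) * i / steps

# Ramp the de-selected portion from clear to strongly shaded in 3 steps;
# each level would be sent to the second layer 111** in turn.
levels = list(shading_ramp(0.0, 0.9, 3))
```

  A separate warning signal (flash, beep, or haptic buzz, as listed above) could be emitted before the first step of the ramp.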
  • the eyewear 110 can be disposed to use shading/inverse-shading so as to encourage users to avoid behavior that can lead to adverse optical conditions, such as dry eyes or myopia.
  • the eyewear 110 can be disposed to perform one or more of:
  • the eyewear 110 can be disposed to shade/inverse-shade or color/tint a close-range portion of the lenses so as to prompt the user 101 to look away from a close-range object and toward a distance. For example, after warning the user 101, the eyewear 110 can be disposed to darken the close-range portion of the lenses at greater and greater amounts, so as to prevent the user 101 from continuing to squint at close-range objects.
  • the eyewear 110 can be disposed to shade/inverse-shade or color/tint all or a portion of the lenses so as to prompt the user 101 to blink. For example, after warning the user 101, such as using an audio beep or a video flash, the eyewear 110 can be disposed to darken the entirety of the lenses at greater and greater amounts, so as to prevent the user 101 from continuing to see, thus prompting the user 101 to blink. For another example, the eyewear 110 can be disposed to use this effect or a similar effect to ameliorate an effect of migraine or photophobia.
  • the eyewear 110 can be disposed to shade/inverse-shade or color/tint the lenses so as to prompt the user 101’s eyes to naturally dilate (or to shade/inverse-shade or color/tint the lenses in combination with dilation using atropine).
  • the shading/inverse-shading or coloring/tinting can be restricted to regions where the user 101 is not looking, so as to prompt the user 101’s eyes to naturally dilate, while still allowing the user 101’s eyes to receive sufficient light to avoid any necessity to squint, and without acclimatizing the user 101’s eyes to excessive dilation.
  • the eyewear 110 can be disposed to use other techniques to encourage the user 101 to avoid behavior that can lead to adverse optical conditions, such as dry eyes or myopia, such as by prompting the user 101 to blink or otherwise focus on distant objects in response to a change in effect of the lenses.
  • the eyewear 110 can be disposed to perform one or more of:
  • the eyewear 110 can be disposed to alter a refraction of a close-range portion of the lenses so as to prompt the user 101 to look away from a close-range object and toward a distance.
  • the eyewear 110 can be disposed to alter the refraction of the close-range portion of the lenses away from the user 101’s natural prescription, so as to prevent the user 101 from focusing effectively on close-range objects. In such cases, the user 101 can be prompted to focus on a more distant object in response to a change in effect of the lens.
  • the eyewear 110 can include an artificial pupil generated in response to a radial shading on a lens, such as a contact lens or an intraocular (IOC) lens, disposed to perform one or more of: (A) reducing an amount of light admitted to the user 101’s eye, or (B) altering a focal length of the user 101’s eye. In such cases, the user 101 can be prompted to blink, or to focus on a more distant object, in response to a change in effect of the lens.
  • the eyewear 110 can include an artificial refraction effect generated in response to a radial shading on a lens, such as a contact lens or an intraocular (IOC) lens, disposed to alter a refractive effect on light admitted to the user 101’s eye.
  • the user 101 can be prompted to blink in response to a change in effect of the lens.
  • the eyewear 110 can be disposed to be responsive to the user 101’s viewing direction, as determined by the computing device 120 in response to the eyetracking element 112 or the object-tracking element 113.
  • the computing device 120 can be disposed to determine one or more of: (A) the user 101’s depth of focus or focal length within the user 101’s field of view 103, or (B) a recognized object within the user 101’s field of view 103. In such cases, the computing device 120 can determine that the user 101 is looking through a particular portion of the lens 111, such as the first portion 111a or the second portion 111b (or as described herein with respect to fig. 1B, such as the distance portion 111a or the close-range portion 111c).
  • when the computing device 120 so determines, it can be disposed to urge the user 101 to look through the “correct” portion of the lenses 111; thus, the portion of the lenses associated with the user 101’s particular depth of focus or focal length, or the portion of the lenses associated with the recognized object within the user 101’s field of view 103.
  • the computing device 120 can determine, in response to the lamp 114, that the user 101 is looking at a relatively close-range object 102.
  • the computing device 120 can so determine in response to whether the object 102 reflects, or otherwise responds to, frequencies emitted by the lamp 114, in such manner as to indicate that the object 102 is relatively close-range.
  • the computing device 120 can be disposed to determine a distance to the object 102 in response to (A) an amount of reflection by the object 102 from the lamp 114 back to one or more sensors 115, such as an ambient light sensor 115; (B) an amount of heat energy generated by the object 102 from the lamp 114, absorbed by the object 102 and re-radiated back to one or more sensors 115, such as an ambient light sensor 115; (C) an amount of fluorescence, or other re-radiation, by the object 102 from ultraviolet emitted by the lamp 114 and re-radiated by the object 102 back to one or more sensors 115, such as an ambient light sensor 115; (D) a time-of-flight or equivalent indicator of distance between the lamp 114 and the object 102, such as when the lamp 114 is disposed to emit ultrasonic energy or another signal detected by the object 102 and to which the object 102 responds; or as otherwise described herein.
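  Of the distance cues listed above, time-of-flight is the simplest to sketch. The propagation constant below is the standard room-temperature speed of sound for an ultrasonic emitter; the function shape and signal model are assumptions, not this Application's method.

```python
# Hypothetical time-of-flight sketch: the computing device 120 measures
# the round trip from the lamp 114 to the object 102 and back to a
# sensor 115, and halves the path length to get the distance.

SPEED_OF_SOUND_M_S = 343.0  # ultrasonic pulse in room-temperature air

def distance_from_round_trip(round_trip_s, speed=SPEED_OF_SOUND_M_S):
    """Distance to the object 102: half the round-trip path length."""
    return speed * round_trip_s / 2.0

# A 10 ms echo corresponds to an object roughly 1.7 m away.
d = distance_from_round_trip(0.010)
```

  An electromagnetic emitter would use the same formula with the speed of light, at the cost of needing far finer timing resolution.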
  • the computing device 120 can be disposed to determine an object 102 that is relatively well-lit with respect to other objects in the user 101’s field of view 103, such as defined by one or more of: (A) the lamp 114, or (B) one or more of the sensors 115.
  • the computing device 120 can be disposed to identify an object 102 that is relatively well-lit with respect to a background behind the object 102, such as when the object 102 is being lit by the lamp 114.
  • the computing device 120 can be disposed to identify an object 102 that is relatively in shadow with respect to the background behind the object 102, such as when the object 102 is substantially darker than its background.
  • the computing device 120 can be disposed to identify an object 102 that is moving into or out of a relatively well-lit region, or into or out of a region in relative shadow, possibly in response to one or more of: (A) whether the user 101 is looking at the object 102; (B) whether the object 102 is moving toward or away from the user 101, or is otherwise moving relative to the user 101.
  • the computing device 120 can be disposed to adjust, in response to the eye-tracking element 112, the shading/inverse-shading or tinting effect of the second layer 111**, so as to encourage the user 101 to use a portion of eyewear 110 with optimal adjustment for their selected view.
  • the eye-tracking element 112 can be coupled to the computing device 120, which can be disposed to use information from the eye-tracking element to determine whether the user 101 is directing their gaze (or focal length) at a distance or at a close range.
  • the eyewear 110 can be disposed to encourage the user 101 to adjust their gaze direction (or focal length) in response to their selected activity. For example, if the user 101 is reading a book or viewing a display in sunlight and switches their gaze direction (or focal length) between a close-range view and a distance view, the eyewear 110 can be disposed to encourage the user 101 to direct their gaze direction through a selected portion of the lens 111. This can have the effect that the user 101 need not adjust the eyewear 110 so as to move the selected portion of the lens 111 in front of their gaze direction; instead, the user 101’s gaze direction can be encouraged to move toward the selected portion of the lens 111.
  • the eyewear 110 can be disposed to similarly encourage the user 101 to direct their gaze direction through a selected portion of the lens 111.
  • the eyewear 110 can be disposed to so operate when the user 101 is operating any type of vehicle having a control panel, such as an aircraft, a ground vehicle, or a watercraft.
  • the eyewear 110 can be disposed to similarly encourage the user 101 to direct their gaze direction through a selected portion of the lens 111.
  • the eyewear 110 can be disposed to so operate when the user 101 is performing a detailed activity, such as emergency medical response or treatment, when the user 101 is performing a potentially life-threatening activity, such as law enforcement or search/rescue, when the user 101 is performing an activity in the presence of a possibly distracting bright ambient light, or otherwise as described herein.
  • the eyewear 110 can be disposed to adjust shading/inverse-shading or polarization to encourage the user 101 to use a portion of eyewear 110 with optimal adjustment for their selected view. For some examples:
  • the eyewear 110 can be used when switching between close-range viewing and distance viewing: such as when switching between distance viewing and when reading from a book or an electronic display; switching between piloting a vehicle and operating its controls, such as an aircraft, ground vehicle, or watercraft; switching between performing a detailed or life-threatening activity that includes both close-range viewing and distance viewing, such as law enforcement or search/rescue, or emergency medical response or treatment; or switching between close-range viewing and distance viewing as otherwise described herein, such as for example in bright ambient light.
  • the eyewear 110 can be used when switching between frontal viewing and peripheral viewing, such as when switching between piloting a vehicle and watching for traffic, such as an aircraft, ground vehicle, or watercraft; performing a detailed or life-threatening activity that includes both frontal viewing and peripheral viewing, such as law enforcement or search/rescue; or otherwise as described herein.
  • the eyewear 110 can determine where a “close-range” location is, such as in response to a signal from one or more of: an external device (such as a computing display or mobile device), a control panel or dashboard of a vehicle, a medical instrument, a signal from a facemask or helmet that can determine whether the user 101’s eyes are directed at the close-range location, or otherwise as described herein.
  • the eyewear 110 can similarly determine where a peripheral location is, such as in response to a signal from one or more of: an external device, a control panel or dashboard, a medical instrument, a facemask or helmet, or otherwise as described herein.
  • the eyewear 110 can determine whether a location is close-range versus distance, or frontal versus peripheral, in response to an input signal from the user 101, such as described below, indicating when the user 101 believes they are looking at a close-range versus a distance location, or looking at a frontal versus a peripheral location.
  • This can have the effect that the user 101 can dynamically adjust one or more dividing borders on the lens between close-range versus distance effects, or between frontal versus peripheral effects, rather than having those effects occur in response to a fixed dividing line on the lens.
  • This Application describes devices, and methods for using them, capable of optimizing sensory inputs in bright ambient light, such as when using eyewear 110 having multiple viewing regions. For some examples:
  • the eyewear 110 can have multiple regions having distinct refractive properties; this can have the effect that the eyewear 110 can be suitable for either nearer or farther viewing, or for either viewing in a frontal region or a peripheral region.
  • the eyewear 110 can have multiple regions having distinct shading/inverse-shading properties; this can have the effect that the eyewear 110 can be suitable for viewing objects with different shading/inverse-shading or with no shading/inverse-shading.
  • the eyewear 110 can have multiple regions having distinct polarization properties; this can have the effect that the eyewear 110 can be suitable for viewing objects with different polarization or with no polarization.
  • the eyewear 110 can have multiple regions having distinct dynamic visual optimization; this can have the effect that the eyewear 110 can be suitable for viewing objects with different flickering/periodic display properties, or no flickering/periodic display properties.
  • the eyewear 110 can have multiple regions having distinct color balance; this can have the effect that the eyewear 110 can be suitable for viewing objects with different color frequencies or mixtures thereof.

Adjusting for light level
  • the eyewear 110 can include a lamp 114 disposed proximally toward the user’s eye with respect to the lenses 111 (such as disposed inside the eyewear), so as to provide illumination viewable by the user without alteration by the lenses.
  • the lamp 114 can thus adjust a coloring or color balance, or a luminance perceived by the user’s eye. This can have the effect that the user’s eye receives a sufficient amount of luminance so that the user’s visual perception is urged from a scotopic mode (responsive to a low-level amount of luminance) to a mesopic mode (responsive to a mid-level amount of luminance).
  • when the user’s visual perception is mesopic, the user can more easily distinguish colors in the user’s field of view, so as to allow the user to perform activities for which color perception is valuable. Examples of such activities can include one or more of:
  • the lamp 114 can be disposed to present an increase in ambient luminance when the lenses 111 are being adjusted to shade/inverse-shade infalling light to reduce glare or other bright light with possible adverse effects on a user’s medical condition, such as migraine or photophobia.
  • when the lenses 111 are being adjusted to provide coloring/tinting to increase an amount of green light, such as to treat or ameliorate migraine or photophobia, the lamp 114 can be disposed to present an increase in luminance perceived by the user’s eye; this can allow the user to interpret their field of view using color vision, even when infalling light is heavily skewed toward green light.
  • the lamp 114 can be disposed to provide a relatively small amount of luminance in a broad spectrum of visible light, such as not to degrade the user’s night vision.
  • the lamp 114 can be disposed to provide a greater amount of luminance in a narrow spectrum of visible light, sufficient to urge the user’s eye toward mesopic operation (but not sufficient to either degrade the user’s night vision or to provide enough blue/ultraviolet to increase the likelihood or severity of migraine or photophobia).
  • the lamp 114 can be disposed to provide luminance in a relatively narrow spectrum of visible light, such as light suitable to reduce the likelihood or severity of, or to treat or ameliorate, migraine or photophobia.
  • the lamp 114 can be disposed to provide light in a frequency range of approximately 500-560nm (green), which is believed to be helpful with respect to migraine or photophobia.
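  The scotopic/mesopic/photopic regimes discussed above can be illustrated with a simple classifier. The cd/m² thresholds below are rough textbook figures for human vision, not values taken from this Application, and the decision rule for the lamp 114 is an assumption.

```python
# Illustrative classification of visual regimes by perceived luminance.
# Approximate textbook boundaries: scotopic below ~0.001 cd/m^2,
# mesopic up to ~3 cd/m^2, photopic above that.

def visual_regime(luminance_cd_m2):
    """Classify a perceived luminance level."""
    if luminance_cd_m2 < 0.001:
        return "scotopic"
    if luminance_cd_m2 < 3.0:
        return "mesopic"
    return "photopic"

def lamp_should_boost(luminance_cd_m2):
    """Hypothetical rule: the lamp 114 adds light only when the eye
    would otherwise be scotopic, urging it toward mesopic
    (color-capable) operation."""
    return visual_regime(luminance_cd_m2) == "scotopic"
```

  Restricting the boost to the scotopic regime matches the text's goal of enabling color perception without flooding an already well-lit eye.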
  • the eyewear 110 can be disposed to be responsive to a user input indicating a likelihood or severity of migraine or photophobia, such as a measure of pain being experienced by the user.
  • the eyewear 110 can also be disposed to record the user input and to associate that input with a set of related parameters, such as ambient environment (including luminance, color balance, allergens, pollution, air pressure and/or other weather parameters), time of day, season, and/or other factors known to be associated with migraine or photophobia.
  • the eyewear 110 can be disposed to adjust an amount of luminance from the lamp 114 (down to zero such luminance) so as to urge the user’s eyes to operate in a scotopic mode (as noted, responsive to a low-level amount of luminance). This can have the effect of allowing the user to train their eyesight so as to recognize object color from texture in a black/white environment, in response to information from the user’s rods, which provide an essentially monochromatic view.
  • the eyewear 110 can be disposed to adjust coloring/tinting, or otherwise adjust color balance, so as to show primarily green light. Showing green light is believed to provide the user with a reduced likelihood or severity of migraine or photophobia; however, showing only green light can have the effect of degrading the user’s ability to see objects or signals which are red. Accordingly, showing primarily green light can be performed once the user’s eyesight is trained to recognize red objects or signals.
  • the lamp 114 can be disposed to illuminate a portion of the user’s field of view as directed by the user’s gaze direction and/or focal length.
  • the eyewear 110 can determine a location at which the user is looking. Having determined where the user is looking, the eyewear 110 can either add more light to that location using the lamp 114, subtract less light from that location using shading/inverse-shading, or some of each of those effects.
  • the eyewear 110 can be disposed to continuously adjust illumination and/or shading/inverse-shading in response thereto.
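  The "add more light / subtract less light" trade-off described above can be sketched as follows: a luminance deficit at the gaze location is met first by reducing any shading already applied there, and only then by adding lamp light. The units and the split policy are assumptions for illustration.

```python
# Hypothetical split of a luminance deficit at the gaze location
# between reduced shading and added light from the lamp 114. Values
# are in arbitrary normalized luminance units.

def gaze_region_adjustment(deficit, shading_level):
    """Return (shading_reduction, lamp_addition) covering the deficit."""
    shading_reduction = min(deficit, shading_level)
    lamp_addition = deficit - shading_reduction
    return shading_reduction, lamp_addition
```

  Preferring shading reduction over lamp output is one plausible policy (it spends no power and adds no stray light), but the text allows any mix of the two effects.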
  • the eyewear 110 can be disposed, in response to a user input, to adjust refraction of the lenses 111 in response to a distance to the location where the user is looking (such as to “zoom in” on that location), adjust shading/inverse-shading to encourage the user to focus on that location, and/or adjust coloring/tinting to ease the user’s view of that location.
  • the lamp 114 can be disposed to be adjusted in brightness or focus to maintain a relatively fixed measure of total brightness at the object.
  • the eyewear 110 can be disposed to measure a focal length of the user’s eye(s), or to measure a time of flight of an ultrasonic or electromagnetic signal to the object and back, or otherwise to measure a distance to the object.
  • the lamp 114 can be adjusted to maintain the total brightness at each such new object.
  • the eyewear 110 can also be disposed to maintain a record of lamp settings, so as to avoid any need to recompute those lamp settings when the user switches viewpoint between or among multiple objects.
  • the eyewear 110 can be disposed to receive user input indicating which objects are of interest to the user and which objects have simply caught the user’s attention for a short time span.
  • the eyewear 110 can be disposed to maintain bookmarks of which objects are of more than passing interest to the user, and to return to those bookmarks for use in determining lamp settings when the user’s gaze direction returns to those objects (or to near those objects).
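  The lamp-setting record and "bookmarks" described above can be sketched as a small cache: a setting is computed once per object and reused when the user's gaze returns, avoiding recomputation. The inverse-square brightness model, the class shape, and the object identifiers are all assumptions.

```python
# Hypothetical bookmark cache for lamp settings. To keep the brightness
# at the object roughly fixed, lamp power scales with the square of the
# distance (an assumed inverse-square model).

class LampBookmarks:
    def __init__(self, target_brightness=1.0):
        self.target = target_brightness
        self._settings = {}  # object id -> bookmarked lamp power

    def power_for(self, object_id, distance_m=None):
        """Compute lamp power for a freshly measured distance, or reuse
        the bookmarked setting when no new measurement is given."""
        if distance_m is not None:
            self._settings[object_id] = self.target * distance_m ** 2
        return self._settings[object_id]

lamps = LampBookmarks()
lamps.power_for("book", 2.0)   # measured once, bookmarked
lamps.power_for("wall", 3.0)   # measured once, bookmarked
```

  When the gaze returns to a bookmarked object, `power_for` is called without a distance and the stored setting is applied immediately.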
  • the eyewear 110 can include (optional) light sensors disposed to respond to one or more of: an amount of luminance in an ambient environment, an amount of luminance from a specific light source, an amount of luminance from an object being focused upon by a user or in a background of such an object, or as otherwise described herein.
  • multiple such sensors can be disposed in the eyewear, as in one or more of: forward-pointing, downward-pointing, sideways-pointing, or as otherwise described herein.
  • the multiple such sensors can be disposed to determine an amount of luminance of particular portions of the user’s field of view, such as a distant viewing range, a reading range, and/or a luminance of an ambient environment.
  • the actual sensors can be disposed within the eyewear, not necessarily near a light input to the eyewear 110, with one or more external inputs disposed to receive light and transmit that light to the sensors.
  • a location disposed to receive light can be coupled to a light pipe, such as a fiber optic pathway, which is coupled to the actual sensor. This can have the effect that the location on the eyewear 110 where the light is received (thus, at a surface or similar location thereof) can be substantially different from the location where the received light is actually measured (thus, where the sensor is actually located).
  • the location on the eyewear where the light is received can be shielded against stray light, so as to assure that the luminance measured by the sensor is accurate.
  • the location where the light is received can include two measurement points and coupled to a comparator to process the different values, so as to identify stray light and account therefor.
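  The two-point comparator for stray light described above can be sketched as follows: two readings of the same received light are compared, and a disagreement beyond a tolerance flags stray light instead of returning a value. The 5% tolerance and the averaging policy are assumed values, not this Application's design.

```python
# Hypothetical stray-light check: two measurement points sample the
# same received light; if they agree within a tolerance the average is
# trusted, otherwise stray light is suspected and None is returned.

def resolve_luminance(reading_a, reading_b, tolerance=0.05):
    """Return the averaged luminance, or None when the measurement
    points disagree enough to indicate stray light."""
    scale = max(reading_a, reading_b, 1e-9)  # avoid divide-by-zero
    if abs(reading_a - reading_b) <= tolerance * scale:
        return (reading_a + reading_b) / 2.0
    return None
```

  A real comparator might instead subtract a modeled stray-light component rather than discarding the sample; the discard policy here is the simplest illustration.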
  • one or more of the sensors can be disposed to measure luminance at a selected frequency or range of frequencies, such as only measuring light in a 500-560nm range (green light), only measuring blue/ultraviolet light, only measuring red/amber light, or some combination or conjunction thereof (such as red and blue, but not green).
  • a first sensor can be disposed to measure a first selected set of frequencies
  • a second sensor can be disposed to measure a second selected set of frequencies
  • both the first and second sensors can be coupled to a computing element so as to measure a combination of the first and second set of frequencies.
  • a single location can be disposed to receive the light to which the sensors are responsive.
  • a set of multiple light pipes such as a single location at which light is received that is coupled to multiple separate fiber optic pathways, can be disposed to couple the received light to the first and second sensors. This can have the effect that light is received jointly at one location and measured separately by different sensors.
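  Combining two band-limited sensors through a computing element, as described above, can be sketched like this: one sensor covers blue/ultraviolet, the other red/amber, and their sum deliberately excludes the green band between them. The band edges (in nm) and the spectrum representation are illustrative assumptions.

```python
# Hypothetical combination of two band-limited sensors. A spectrum is
# modeled as {wavelength_nm: power}; each sensor sums power inside its
# own band, and the computing element adds the two measurements.

def band_power(spectrum, lo_nm, hi_nm):
    """Total power of the spectral samples inside a wavelength band."""
    return sum(p for wl, p in spectrum.items() if lo_nm <= wl <= hi_nm)

def red_and_blue_not_green(spectrum):
    """Combined reading of a blue/UV sensor (<=490nm) and a red/amber
    sensor (>=590nm), skipping the green band between them."""
    return band_power(spectrum, 0, 490) + band_power(spectrum, 590, 2000)

sample = {450: 2.0, 530: 5.0, 620: 1.5}  # blue, green, red samples
```

  With a single light pipe split to both sensors, the same received light is measured twice, once per band, and only the computing element sees the combined value.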
  • the second layer 111** can be controlled to shade/inverse-shade or tint the lower portion 111b. This can protect the user against excessive incoming light at close range, such as when the user’s field of view includes bright sunlight.
  • a refractive effect from a “reading glasses” portion might be sufficiently strong that in bright sunlight the lower portion can damage or injure the user’s eyes.
  • the eyewear 110 can be disposed to shade/inverse-shade the reading glasses portion of the lenses 111 when a bright image is within the user's field of view.
  • a refractive effect from a “reading glasses” portion might cause eyestrain or even damage to the user’s eyes when the user has bright snow in their field of view, such as might occur on a ski slope, snowboarding trail, or otherwise out and about on a snowy day.
  • the eyewear 110 can be disposed to shade/inverse-shade the reading glasses portion of the lenses 111 when the ambient environment includes a large region of snow or other bright terrain.
  • a refractive effect from a “reading glasses” portion might cause heating or burning of a surface that is exposed to concentrated sunlight; this can include a reading surface of a page or mobile device, or an object upon which the eyewear is resting.
  • the eyewear 110 can be disposed to shade/inverse-shade the reading glasses portion of the lenses 111 when the eyewear is disposed near equipment or paper that might be damaged by sunlight.
  • the second layer 111** can be disposed having portions corresponding to the portions of the first layer 111* and each capable of optimizing sensory inputs, such as in bright ambient light, or otherwise as described herein.
  • the individual portions of the second layer 111** corresponding to the individual portions of the first layer 111* can be disposed to have other types of visual optimization, such as otherwise described herein.
  • the second layer 111** can have multiple regions having distinct shading/inverse-shading or tinting properties, color balance properties, polarization properties, dynamic visual optimization properties, other properties as described herein, and similar properties.
  • the second layer 111** can be disposed to impose different shading/inverse-shading or tinting with respect to the upper portion 111T and the lower portion 111B.
  • the second layer 111** can be disposed to impose relatively little shading or tinting with respect to the lower portion 111B (unless in an excessively brightly lit ambient environment).
  • the second layer 111** can be disposed to impose substantial shading, so as to operate the lenses 111 as sunglasses.
  • the second layer 111** can be disposed to impose shading/inverse-shading or tinting with respect to medical conditions determined with respect to the user 101.
  • the second layer 111** can be disposed to impose shading/inverse-shading or tinting so as to ameliorate the effect of bright light or glare in an ambient environment.
  • the second layer 111** can be disposed to impose different color balance with respect to the upper portion 111T and the lower portion 111B. This can have the effect that the eyewear 110 can be made suitable for viewing objects with different color frequencies or mixtures thereof. In such cases, the eyewear 110 can be disposed to select color balance suitable for one or more of:
  • the second layer 111** can be disposed to impose different polarization properties with respect to the upper portion 111T and the lower portion 111B. This can have the effect that the eyewear 110 can be suitable for viewing objects with different polarization. For example, objects providing glare that are polarized can have the glare removed from the user 101’s field of view.
  • the second layer 111** can be disposed to use polarization to shade/inverse-shade incoming light, with the effect of providing shading/inverse-shading or tinting as described above.
  • the eyewear 110 can be disposed to remind the user 101 that they have been focusing on close-range objects for longer than is healthy. In such cases, the eyewear 110 can be disposed to remind the user 101 to look away from close-range objects, such as by presenting an indicator to the user 101 that sufficient time has passed and that the user 101 should look toward a distant object for at least a short time.
  • the eyewear 110 can be disposed to present an indicator to the user 101 to follow a 20-20-20 plan, that is, every 20 minutes, to look away for at least 20 seconds at a distance of at least 20 feet.
  • the eyewear 110 can be disposed to so inform the user 101.
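The 20-20-20 reminder described above can be sketched as a pair of simple checks. The 20-minute, 20-second, and 20-foot constants come from the text; the function names and decision structure are illustrative assumptions:

```python
# Hedged sketch of the 20-20-20 reminder: every 20 minutes of close-range
# focus, prompt the user to look at least 20 feet away for at least 20
# seconds. Names here are illustrative, not from the disclosure.

CLOSE_RANGE_LIMIT_S = 20 * 60   # 20 minutes of close-range viewing
LOOK_AWAY_TIME_S = 20           # look away for at least 20 seconds
LOOK_AWAY_DISTANCE_FT = 20      # at a distance of at least 20 feet

def should_remind(close_range_seconds):
    """True once the user has focused close-range longer than is healthy."""
    return close_range_seconds >= CLOSE_RANGE_LIMIT_S

def reminder_satisfied(look_away_seconds, look_away_distance_ft):
    """True once the user has looked far enough away for long enough."""
    return (look_away_seconds >= LOOK_AWAY_TIME_S
            and look_away_distance_ft >= LOOK_AWAY_DISTANCE_FT)
```

In practice the eyewear would feed these checks from its gaze-direction or focal-length tracking, however that is implemented.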
  • the second layer 111** can be disposed to impose different dynamic visual optimization (DVO) properties with respect to the upper portion 111T and the lower portion 111B.
  • the eyewear 110 can use the second layer 111** to impose shading/inverse-shading at a selected frequency, so as to impose dynamic visual optimization on the upper portion 111T, while not so doing with respect to the lower portion 111B. This can have the effect that the eyewear 110 can be suitable for viewing objects using dynamic visual optimization with distance viewing, while refraining from using dynamic visual optimization with close-range viewing.
  • the eyewear 110 can be disposed to perform one or more of:
  • the eyewear 110 can be disposed to perform dynamic visual optimization (DVO), possibly such as described in the Incorporated Disclosures, with respect to at least a first portion of a lens 111, such as a portion (or multiple such portions) directed to a computing display (not shown) or a smartphone or mobile device, and to refrain from performing such optimization with respect to a second portion of the lens, such as a portion (or multiple such portions) disposed for distant vision 111T.
  • the eyewear 110 can be disposed to perform dynamic visual optimization (DVO) with respect to at least a portion of a lens 111 disposed for distant vision 111T, and to refrain from doing so with respect to the computing display (not shown) or a smartphone or mobile device, such as a portion disposed for close-range vision 111B.
  • when the eyewear 110 performs dynamic visual optimization (DVO) with respect to a portion of a lens 111 directed to a computing display (not shown) or a smartphone or mobile device, the eyewear 110 can be synchronized to a display frequency of the computing display or the smartphone or mobile device. This can have the effect that the eyewear 110 presents a continuous image including a sequence of frames from the computing display or the smartphone or mobile device, rather than the sequence of intermittent frames normally presented by that display.
  • the eyewear 110 can be synchronized to a fraction or a multiple of that frequency, so as to cause the computing display or mobile device to appear, to the user 101, to be dimmed or shaded, or otherwise subject to shading/inverse-shading.
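The frequency-synchronization idea above can be sketched as follows. The shutter/duty-cycle model and all names are assumptions for illustration; the disclosure does not specify an implementation:

```python
# Illustrative sketch: synchronizing the lens shading frequency to a
# display's refresh rate, or to a fraction/multiple of it so the display
# appears dimmed to the user. Names are assumptions, not from the source.

def shutter_frequency(display_hz, ratio=1.0):
    """Shading rate as a fraction or multiple of the display frequency."""
    return display_hz * ratio

def apparent_brightness(open_duty_cycle):
    """Fraction of light passed when the lens is unshaded this fraction
    of each cycle (clamped to the physically meaningful 0..1 range)."""
    return max(0.0, min(1.0, open_duty_cycle))

# Matching a 60 Hz display passes whole frames; an open duty cycle of 0.5
# makes the display appear roughly half as bright to the user.
sync_hz = shutter_frequency(60.0, ratio=1.0)
dimmed = apparent_brightness(0.5)
```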
  • the lens 111 can be disposed to be shaded/inverse-shaded so as to perform dynamic visual optimization.
  • the upper portion 111T of each lens can be disposed to be shaded/inverse-shaded at a selected frequency, so as to provide a sequence of still images of a moving object or moving images of that moving object in place of a continuous image of that moving object.
  • this can have the effect of providing the user 101 with improved visual acuity when viewing that moving object.
  • the lens 111 can be disposed to perform dynamic visual optimization, such as by using shading/inverse-shading with respect to the upper portion 111T, for distance viewing, while refraining from providing dynamic visual optimization with respect to the lower portion 111B.
  • the lens 111 can be disposed to perform dynamic visual optimization with respect to both the upper portion 111T and the lower portion 111B, either at the same frequencies or at different frequencies.
  • the lens 111 can be disposed to provide dynamic visual optimization with respect to the upper portion 111T (thus, distance viewing) while refraining from providing dynamic visual optimization with respect to the lower portion 111B (thus, close-range viewing, such as with respect to a mobile device).
  • the eyewear 110 can be synchronized to a fraction or a multiple of that frequency, so as to cause the computing display or mobile device to appear, to the user 101, to be dimmed or shaded.
  • the lens 111 can be disposed to provide dynamic visual optimization with respect to the lower portion 111B (thus, close-range viewing, such as with respect to a rotating element), while refraining from providing dynamic visual optimization with respect to the upper portion 111T (thus, distance viewing, such as with respect to communication with another person or while reviewing documentation).

Adjusting for control inputs
  • the computing device 120 can be disposed to adjust, in response to one or more user 101 controls, the shading/inverse-shading or tinting effect of the second layer 111**, so as to allow the user 101 to direct the eyewear 110 to use a portion thereof for optimal adjustment for their selected view.
  • the eyewear 110 can be disposed to receive one or more intentional inputs from the user 101, at one or more input/output elements 123, to switch between close-range viewing and distance viewing.
  • the user 101 inputs can include one or more touch controls (including capacitive or contact touch), gesture controls (including eye or facial gestures), body movement controls (including hand/finger, head, or other body gestures), voice controls (including spoken commands or echolocation), controls by an application (“app”) operating on a computing device or mobile device, or otherwise as described herein.
  • the eyewear 110 can be disposed to respond to user 101 controls, such as touch controls (including capacitive controls, buttons, switches, sliders, or as otherwise described herein), gesture controls (including eye or facial gestures, or as otherwise described herein), body movement controls (including hand/finger, head, or other body gestures, or as otherwise described herein), voice controls (including spoken commands or echolocation, or as otherwise described herein), controls by an application (“app”) operating on a computing device or mobile device, or as otherwise described herein.
  • the eyewear 110 can be disposed to respond to user 101 control by adjusting a polarization or shading amount, thus providing user 101 control of a sunglasses effect, or adjusting a color balance, thus providing user 101 control of a filtering or tinting effect.
  • the touch controls can be disposed on the front-piece 116 or on one or more of the temples (ear-pieces) 117 and disposed to be touched by the user 101’s finger, hand, or other body part.
  • the touch controls can include a button, a capacitive sensor, a slide-bar, a switch, other touch controls described herein, or similar touch controls.
  • the touch controls can include a capacitive or contact touch.
  • the gesture controls can include the eye-tracking element 112, so as to detect user 101 eye gestures, including one or more of: eye movements, eye blinks, eye squints, other eye gestures described herein, or similar eye activity.
  • the gesture controls can additionally or alternatively include one or more facial recognition devices, so as to determine user 101 facial gestures, including one or more of: cheek, chin, jaw, lip, or mouth movements; eyebrow movements or surprise indications; grimaces, glares, smiles or frowns, sneers, squints; other facial gestures described herein, or similar facial activity.
  • the gesture controls can include one or more gestures in sequence, such as a gesture including a look-left followed by a blink, a sequence of multiple blinks, a look-left followed by a look-right, or some other sequence of eye gestures.
  • the gesture controls can include one or more gestures in combination or conjunction with other types of gestures, such as a sequence of blinks in combination or conjunction with a voice command, a sequence of eye darts (looking in a particular direction for a short duration) in combination or conjunction with a touch control, a sequence of blinks in combination or conjunction with a sequence of eye darts, a sequence of blinks in combination or conjunction with another type of gesture, or as otherwise described herein.
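The gesture sequences described above (such as a look-left followed by a blink) can be recognized with a simple sequence matcher. This is a minimal sketch; the event names and the contiguous-match rule are assumptions, not from the disclosure:

```python
# Hedged sketch of recognizing a gesture sequence (e.g. look-left then
# blink) as a control input. Event names are illustrative assumptions.

def matches_sequence(events, pattern):
    """True when `pattern` appears as a contiguous run within `events`."""
    n, m = len(events), len(pattern)
    return any(events[i:i + m] == pattern for i in range(n - m + 1))

# A look-left followed by a blink triggers a control action.
events = ["blink", "look-left", "blink", "look-right"]
triggered = matches_sequence(events, ["look-left", "blink"])
```

A real implementation would also need timing windows (how close together the gestures must occur), which this sketch omits.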
  • the body movement controls can include detectors for hand or finger gestures, or gestures using other parts of the user 101’s body, such as for an arm or elbow, whether within the user 101’s field of view or otherwise.
  • body movement gestures can be detected using a camera and using the computing device operating using an artificial intelligence or machine learning technique, with the camera disposed either to view one or more portions of the user 101’s field of view or to view one or more areas outside the user 101’s field of view.
  • the voice controls can include detectors for voice commands, other sounds as described herein, or similar sounds.
  • voice controls can be detected using a microphone and using the computing device operating using an artificial intelligence or machine learning technique, with the microphone disposed either to receive a voice signal or another sound signal from the user 101 (such as a finger snap or a similar sound).
  • the controls by an app can include input/output devices coupled to a mobile device (not shown), such as a smartphone, another mobile device described herein, or a similar mobile device.
  • the controls by an app can be detected using a mobile device operating under control of an app, receiving gesture inputs (such as moving the mobile device), touch inputs, voice inputs, other inputs described herein, or similar mobile device inputs.
  • a mobile device operating under such control can be disposed to provide user 101 inputs to the computing device 120 so as to adjust the shading/inverse-shading or tinting effect of the second layer 111**, as described herein.
  • the eyewear 110 can be disposed to respond to control inputs such as those described with respect to an ambient light sensor (or other sensor to determine a distance of a hand/finger gesture), a hand/finger gesture to adjust an input value, and a hand/finger “swipe” to set that input value.
  • the eyewear 110 can be disposed to determine a luminance value of the ambient environment. In such cases, the eyewear 110 can be disposed to determine a base level with which to measure a user 101 input value in response to the hand/finger gesture.
  • the eyewear 110 can be disposed to determine an amount of that luminance that is obscured or occluded by a hand/finger gesture by the user 101. In such cases, the eyewear 110 can be disposed to continuously adjust a user 101 input value as the user 101 adjusts the hand/finger gesture.
  • the eyewear 110 can be disposed to determine when the user 101 performs a hand/finger “swipe”. In such cases, the eyewear 110 can be disposed to set the user 101 input value to the value selected in response to the continuous adjustment by the hand/finger gesture.
  • the eyewear 110 can be disposed to allow the user 101 to set multiple continuous values, such as by offering each such value in turn, setting the associated value, and proceeding to the next such value.
  • the eyewear 110 can be disposed to allow the user 101 to select among such multiple values, or can be responsive to a gesture that it interprets as meaning to proceed to the next such value.
  • the eyewear 110 can be disposed to cycle among them until the user 101 presents a gesture indicating that the user 101 has finished entering such values.
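The occlusion-based input described above (a hand/finger gesture blocks some of the baseline ambient luminance to continuously adjust a value, and a "swipe" commits it) can be sketched as follows. All names and the linear occlusion-to-value mapping are illustrative assumptions:

```python
# Illustrative sketch of the hand/finger luminance-occlusion input: the
# fraction of baseline ambient luminance blocked by the hand continuously
# adjusts an input value; a "swipe" gesture sets (commits) it.

def occlusion_fraction(baseline_lux, measured_lux):
    """How much of the baseline luminance the hand is blocking (0..1)."""
    if baseline_lux <= 0:
        return 0.0
    return max(0.0, min(1.0, 1.0 - measured_lux / baseline_lux))

def track_input(baseline_lux, samples):
    """Follow (measured_lux, swiped) samples; return the value in effect
    at the moment of the swipe, or the latest value if no swipe yet."""
    value = 0.0
    for measured_lux, swiped in samples:
        value = occlusion_fraction(baseline_lux, measured_lux)
        if swiped:
            return value        # the swipe sets the input value
    return value                # no swipe yet; value still tracking
```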
  • the user 101 can provide inputs to switch the eyewear 110 between operating as sunglasses and operating as “reader” glasses, such as by invoking a touch control or another type of control on the eyewear 110.
  • an operator other than the user 101, such as an evaluator, supervisor, or trainer of the user 101 wearing that eyewear 110, can review the user 101’s gaze direction or focal length, or can provide inputs to switch the eyewear 110 between close-range viewing and distance viewing.
  • the eyewear 110 can determine whether the user 101 is looking at a close-range versus a distant location in response to an input signal from the user 101, indicating when the user 101 believes they are looking at a close-range versus a distant location. This can have the effect that the user 101 can dynamically adjust one or more dividing borders on the lens between close-range and distance effects, rather than having those effects occur in response to a fixed dividing line on the lens.
  • the eyewear 110 can also be disposed to optimize sensory inputs in bright ambient light, such as when using eyewear 110 under control of a user 101 wearing that eyewear 110, or under control of an evaluator, supervisor, or trainer of the user 101 wearing that eyewear 110. For some examples:
  • the eyewear 110 can be disposed to respond to controls by an evaluator, supervisor, or trainer, such as a coach, teacher, test proctor, sports scout, work supervisor, or otherwise as described herein.
  • an evaluator, supervisor, or trainer can be disposed to receive sensory inputs from the user 101’s eyewear 110 and to adjust those sensory inputs for a process in which the user 101 is evaluated, reviewed, trained, or otherwise as described herein.
  • the evaluator, supervisor, or trainer can be disposed to use a computing device or mobile device to adjust sensory inputs designated for the user 101.
  • the evaluator, supervisor, or trainer can be disposed to adjust the user 101’s sensory inputs to add new inputs, such as an augmented reality or virtual reality overlay in an otherwise natural environment.
  • the evaluator, supervisor, or trainer can be disposed to block or remove inputs, such as by filtering, shading, or tinting a portion of the user 101’s field of view.
  • the eyewear 110 can include one or more sensors 115 disposed to measure light incoming from the ambient environment.
  • the sensors 115 can be disposed to measure total luminance, luminance of a portion of the user 101’s field of view, color balance of all or a portion of the user 101’s field of view, or other parameters derivable from the ambient environment.
  • the eyewear 110 can be disposed to adjust one or more of: coloring/tinting, shading/inverse-shading, or other viewing parameters, in response to one or more of the sensors 115.
  • the sensors 115 can include a forward-facing sensor 115F, one or more side-facing sensors 115R and 115L, an upward-facing sensor 115U, a downward-facing sensor 115D, and possibly other sensors such as peripheral sensors 115P disposed to observe the user 101’s peripheral fields of view.
  • the sensors 115 can also include an internal sensor 115N disposed proximal to the user 101 with respect to the lenses 111, so as to detect incoming light at or near the user 101’s eye after passing through and being adjusted by the lenses.
  • the eyewear 110 can be disposed to be responsive to one or more of the sensors 115, so as to determine whether the user 101’s eyes are subject to incoming light from behind (as reflected by the lenses 111 and measured by the internal sensor 115N), from a side (as measured by one or more of the side-facing sensors 115R or 115L), from above (as measured by the upward-facing sensor 115U), or from below (as measured by the downward-facing sensor 115D).
  • the eyewear 110 can be disposed to determine whether the user 101 is blocking sunlight using a hat or visor, such as by determining whether there is excessive incoming light from above.
  • While determining whether the user 101 is blocking sunlight using a hat might not, without more, inform the eyewear 110 whether the user 101 is wearing a “hard hat” appropriate to a construction zone or otherwise hazardous location, it can help prompt other persons to check whether the user 101 is wearing one.
  • the eyewear 110 can be disposed to shade/inverse-shade at least a portion of the user 101’s field of view in response to one or more of the sensors 115.
  • the eyewear 110 can be disposed to shade/inverse-shade that portion of the lenses 111.
  • the eyewear 110 can be disposed to shade/inverse-shade that portion of the lenses 111 as appropriate.
  • the eyewear 110 can be disposed to shade/inverse-shade the upper portion 111T or the lower portion 111B in response to the upward-facing sensor 115U or the downward-facing sensor 115D, respectively.
  • the eyewear 110 can be disposed to measure an amount of luminance in a portion of the user 101’s field of view. When the luminance in that portion of the user 101’s field of view is too bright, the eyewear 110 can be disposed to shade/inverse-shade a portion of the lens or lenses 111 associated with that portion of the user 101’s field of view. For example, one or more sensors can be disposed in particular peripheral directions, such as to allow the eyewear 110 to determine whether luminance is excessive from that particular direction. When the luminance in the user 101’s peripheral view is too bright, the eyewear 110 can be disposed to shade/inverse-shade a peripheral portion of lens or lenses 111.
  • the eyewear 110 can include sensors disposed to measure luminance in other portions of the user 101’s field of view and can be disposed to shade/inverse-shade those portions of the lenses 111 as needed: such as a close-range viewing portion, a mid-range viewing portion, a distant viewing portion, or otherwise as described herein.
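The directional shading described above (each sensor's reading drives shading of the lens portion it is associated with, such as 115U for the upper portion and 115D for the lower portion) can be sketched as a simple mapping. The threshold value and the sensor-to-portion table are illustrative assumptions:

```python
# Hedged sketch: mapping directional luminance readings to the lens
# portions to shade (upper for 115U, lower for 115D, and so on). The
# glare threshold and portion names are assumptions for illustration.

TOO_BRIGHT_LUX = 10000.0  # assumed glare threshold

SENSOR_TO_PORTION = {
    "115U": "upper",       # upward-facing sensor shades the upper portion
    "115D": "lower",       # downward-facing sensor shades the lower portion
    "115P": "peripheral",  # peripheral sensor shades a peripheral portion
}

def portions_to_shade(readings_lux):
    """Return lens portions whose associated sensor reads excessive light."""
    return sorted(
        SENSOR_TO_PORTION[sensor]
        for sensor, lux in readings_lux.items()
        if sensor in SENSOR_TO_PORTION and lux > TOO_BRIGHT_LUX
    )
```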
  • the eyewear 110 can be disposed to determine a color balance of the user 101’s field of view, or a portion thereof.
  • the eyewear 110 can couple inputs from one or more sensors to the computing device 120, which can be disposed to determine color balance in response thereto.
  • the computing device 120 can be disposed to direct the eyewear 110 to adjust the coloring/tinting of the lenses 111 to account therefor.
  • the eyewear 110 can be disposed to inject particular colors, such as the 500-560nm (green) frequency range, to adjust the color balance.
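The color-balance adjustment described above can be sketched as a check for excessive blue/ultraviolet skew followed by a green-band boost. The channel model, threshold, and boost factor are assumptions for illustration only:

```python
# Illustrative sketch of color-balance adjustment: when the measured
# balance skews too far toward blue/ultraviolet, boost the green
# (500-560nm) band. Thresholds and names are illustrative assumptions.

def needs_green_injection(red, green, blue, blue_skew_limit=0.45):
    """True when blue dominates the measured color balance."""
    total = red + green + blue
    return total > 0 and blue / total > blue_skew_limit

def inject_green(red, green, blue, boost=1.25):
    """Return channel weights with the green band boosted."""
    return (red, green * boost, blue)
```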
  • when the eyewear 110 receives a signal indicating that coloring/tinting or shading/inverse-shading should be adjusted, such as a signal from medical personnel or from a caretaker, the eyewear 110 can be disposed to adjust the coloring/tinting or shading/inverse-shading for all or a portion of the user 101’s field of view, as indicated by the received signal.
  • the eyewear 110 can also be disposed to respond to signal inputs, such as from one or more sensors disposed to detect features of an ambient environment. For some examples:
  • the eyewear 110 can be disposed to determine an amount of luminance incoming from a user 101’s field of view, such as either the entire field of view or a portion thereof. When the luminance in the user 101’s close-range view is too bright, the eyewear 110 can be disposed to shade a portion of the lenses associated with close-range vision. Similarly, when the luminance in the user 101’s distant view is too bright, the eyewear 110 can be disposed to shade a portion of the lenses associated with distant vision.
  • the eyewear 110 can be disposed to shade a portion of the lenses associated with peripheral vision.
  • One or more sensors can be disposed in particular peripheral directions, such as to allow the eyewear 110 to determine whether luminance is excessive from that particular direction.
  • when the eyewear 110 determines that the color balance of the user 101’s field of view is inapposite to the user 101’s medical condition, such as when the coloring/tinting is skewed too far toward blue/ultraviolet, the eyewear 110 can be disposed to provide coloring/tinting to adjust the color balance so as to be more appropriate to the user 101’s medical condition.
  • when the eyewear 110 receives a signal indicating that coloring/tinting or shading/inverse-shading should be adjusted, such as a signal from medical personnel or from a caretaker, the eyewear 110 can be disposed to adjust the coloring/tinting or shading/inverse-shading for all or a portion of the user 101’s field of view, as indicated by the received signal.

Adjusting for object signaling
  • the object-tracking element 113 can be disposed to adjust the shading/inverse-shading or tinting effect of the second layer 111** to encourage the user 101 to use a portion of eyewear 110 with optimal adjustment for a view of the selected object (not shown).
  • the object-tracking element 113 can be coupled to the computing device 120, which can be disposed to use information from the object-tracking element to determine whether the user 101 is properly directing their gaze (or focal length) at the selected object.
  • the object-tracking element 113 can be disposed to determine whether the user 101 is directing their gaze (or focal length) at the moving object or at a background such as the sky.
  • the object-tracking element 113 can be disposed to encourage the user 101 to direct their gaze through a portion of the lens 111 for close-range viewing, such as when the moving object is at a relatively close range.
  • the moving object can be a baseball the user 101 is about to hit or catch, a golf ball the user 101 is about to drive or putt, or a similar object.
  • the object-tracking element 113 can be disposed to encourage the user 101 to direct their gaze through a portion of the lens 111 for distance viewing.
  • the eyewear 110 can include a distance-measuring device 124 (possibly coupled to one or more of the computing device’s input/output elements 123), such as a motion detector; a lidar, radar, sonar, or similar device; an object-recognition device; or otherwise as described herein.
  • the distance-measuring device 124 can be disposed to actively send a signal 125a in a direction corresponding to the user 101’s gaze direction, so as to determine whether the user 101 is attempting to use close-range viewing or distance viewing.
  • one or more distance-measuring devices 124 can be disposed on a front-piece of eyewear 110 in the form of glasses, or on one or more of the temples (earpieces) thereof, or on a nose-piece of eyewear 110 in the form of glasses not having temples, so as to maintain alignment with the user 101’s eyes or head.
  • one or more of the distance-measuring devices 124 can be disposed on a separate device, such as a smartphone or other mobile device (not shown) and targeted in the direction, or at an object at which, the user 101 is looking.
  • a smartphone or other mobile device can be disposed to communicate with one or more objects or instruments so as to determine where the user 101 is looking, or is intending to look, as described herein.
  • the signal 125a can be disposed to be reflected by an object at which the user 101 is looking, providing the computing device 120 with a reflected signal 125b.
  • the computing device 120 can be disposed to process the reflected signal 125b so as to determine a distance to the object at which the user 101 is looking.
  • a lidar signal 125a can be disposed to be reflected by a baseball so as to allow the computing device 120 to determine a distance to the baseball.
  • the eyewear 110 can be disposed to encourage the user 101 to use a portion of the lens 111 best suitable for viewing that object.
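The reflected-signal ranging described above (a lidar pulse 125a reflects off the object, and the computing device 120 derives a distance from the reflection 125b) can be sketched with the standard time-of-flight relation. The speed of light and the round-trip formula are standard physics; the close-range threshold and function names are assumptions:

```python
# Hedged sketch: deriving object distance from a reflected ranging
# signal's round-trip time, then choosing which lens portion to
# encourage. The 1-meter close-range boundary is an assumption.

SPEED_OF_LIGHT_M_S = 299_792_458.0
CLOSE_RANGE_LIMIT_M = 1.0  # assumed close-range/distance boundary

def distance_from_round_trip(round_trip_s):
    """Distance to the reflecting object from the round-trip time
    (the signal travels out and back, hence the division by two)."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

def lens_portion_for(distance_m):
    """Which lens portion to encourage for an object at this distance."""
    if distance_m < CLOSE_RANGE_LIMIT_M:
        return "close-range (111B)"
    return "distance (111T)"
```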
  • the signal 125a can be disposed to be received by the object at which the user 101 is looking and processed by that object.
  • the object can be disposed to send a responsive signal 125b to the computing device 120, which can be disposed so as to determine a distance to the object at which the user 101 is looking.
  • an electromagnetic signal 125a can be disposed to be received by a dashboard or instrument of an aircraft or other vehicle.
  • An active element on the dashboard or instrument can be disposed to send back a responsive signal 125b to the eyewear 110, such as to indicate a correct direction at which the user 101 should look to see the dashboard or instrument.
  • the eyewear 110 can be disposed to encourage the user 101 to use a portion of the lens 111 best suitable for viewing that object. Similar to a dashboard or instrument, the eyewear 110 can be disposed to determine where a “close-range” location is, such as in response to a signal from one or more of: a computing display or mobile device, a medical instrument, or otherwise as described herein.
  • the distance-measuring device 124 can be coupled to a facemask or helmet, or an item of jewelry or another headpiece, so as to determine whether the user 101’s eyes are directed at the close-range location, without obscuring the user 101’s vision.
  • Fig. 1B shows a conceptual drawing of an example eyewear system, showing a lens divided into an upper portion and a “reader” portion.
  • the lens 111 can include a similar upper portion 111T, disposed to facilitate distance (or “far”) viewing, such as being relatively clear and absent of any strong refractive viewing effect.
  • the lens 111 can also include a “reader” portion 111R (sometimes known as a “flat-top” portion) localized in a corner of a similar lower portion 111B of the lens, and disposed in a region where the user is likely to be focused when reading text at relatively close range.
  • the upper portion 111T can include a portion of the lenses 111 specifically disposed for distance viewing, while the reader portion 111R can include a remaining portion of the lenses 111 specifically disposed for close-range viewing. Similar to fig. 1A, this can have the effect that the upper portion 111T can be disposed to cover a specific portion of the user’s field of view 103 through which the user would look when viewing one or more distant objects, while the reader portion 111R can be disposed to cover a portion of the user’s field of view through which the user 101 would look when viewing one or more close-range objects.
  • the reader portion 111R can be disposed to cover a specific region of the lenses 111, such as a relatively “flat top” shaped area, disposed within the user’s viewing direction when objects in the user’s field of view 103 are at a relatively close range.
  • the user’s eyes might be focused on a relatively close-range portion of the user’s field of view. This can have the effect that the user’s pupils are more closely positioned and are directed at a more centralized portion of the lens 111 on its lower portion 111B.
  • the reader portion 111R can be disposed to include a region of the lenses 111 through which the user 101 would look when viewing close-range objects, thus, in locations 111R’ and 111R” that are associated with the user’s pupils focusing at relatively close range.
  • This can have the effect that the reader portion 111R is asymmetrically disposed in locations 111R’ and 111R” that are near the user’s nose 141, not symmetrically disposed centered in the lenses 111, thus, located to the right of the user’s left lens and to the left of the user’s right lens. Accordingly, the reader portion 111R need only include a relatively smaller part of the lenses 111.
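The gaze-dependent selection between the distance portion and the asymmetric "reader" portion described above can be sketched as a simple classifier. This is an illustrative sketch only; the function name, parameters, and thresholds are assumptions for illustration and are not taken from the application.

```python
# Hypothetical sketch: classify which lens portion the wearer is looking
# through, from a downward gaze angle and inter-pupil convergence.
# Thresholds are illustrative assumptions, not values from the application.

def select_portion(gaze_y: float, convergence: float) -> str:
    """Classify the active lens portion.

    gaze_y: vertical gaze angle in degrees (negative = looking down).
    convergence: inter-pupil convergence angle in degrees; larger values
    indicate the eyes are aimed at a closer object.
    """
    # Looking down while converging strongly suggests close-range reading,
    # i.e. the asymmetric "reader" portion located near the nose.
    if gaze_y < -10.0 and convergence > 4.0:
        return "reader"
    return "upper"
```

Under this sketch, the eyewear would query eye-tracking sensors periodically and apply the refractive or shading behavior of whichever portion is returned.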
  • the reader portion 111R can be disposed with respect to the first portion 111a so as to provide different viewing functions to be applied, each to a different portion of the user’s field of view 103.
  • the reader portion 111R can be disposed with respect to the first portion 111a so as to provide a close-range viewing function and a distance viewing function, respectively.
  • the second portion 111b can be disposed with respect to the first portion 111a as described with respect to fig. 1A.
  • the reader portion 111R can be disposed in combination with a portion 111R** of a second layer 111** so as to provide a combined viewing function with respect to the close-range viewing portion of the user’s field of view 103.
  • the first portion 111a can be disposed in combination with a portion 111a** of a second layer 111** so as to provide a combined viewing function with respect to the distance viewing portion of the user’s field of view 103.
  • This can have the effect that the eyewear 110 can be disposed to provide the user 101 with both (A) close-range viewing functions and distance viewing functions; and (B) augmented reality functions or virtual reality functions associated with those close-range and distance viewing functions, similar to those functions described with respect to fig. 1A.
  • the reader portion 111R can be disposed with a substantially “flat” upper delimiter and a substantially curved lower delimiter.
  • the reader portion 111R can have any shape suitable to perform its designated functions.
  • the reader portion 111R can include a substantially elliptical portion disposed near a corner of the lens 111 where the user 101 can look through it, such as near the user’s nose 141.
  • FIG. 1C shows a conceptual drawing of an example eyewear system, showing a lens divided into multiple portions.
  • the eyewear 110 can include lenses 111 divided into multiple portions.
  • the lenses 111 can be divided into an upper portion 111a, a middle portion 111b, and a bottom portion 111c, such as in a trifocal lens 111.
  • the lenses 111 can be divided into a set of more than three portions, such as a sequence of portions disposed with a corresponding sequence of distinct refractive effects.
  • the sequence of distinct refractive effects can be disposed to provide a lens 111 with a substantially progressive set of refractive effects, such as in a progressive lens 111.
  • each portion of the lens 111 (such as the three portions 111a, 111b, and 111c in a trifocal lens 111, or the multiple portions in a progressive lens 111) can be disposed having a first layer 111* and a second layer 111** as described with respect to fig. 1A.
  • the first layer 111* can be disposed to provide refractive effects and the second layer 111** can be disposed to provide shading/inverse-shading or tinting effects.
  • the first layer 111* and the second layer 111** can be disposed having multiple portions that are closely coupled, or which fade into one another at their edges.
  • the eyewear 110 can dispose the hybrid lenses 111 with multiple portions, such as an upper portion 111a and a lower portion 111b. Alternatively, there can also be a middle portion 111c, or other configurations with multiple portions. For example, the eyewear 110 can be disposed with portions allocated to peripheral portions of the user’s field of view, to extreme upper/lower portions of the user’s field of view, or otherwise as described herein.
  • each individual portion can be disposed to perform a distinct function, such as a function with respect to refraction, shading/inverse-shading, coloring/tinting, polarization, dynamic visual optimization (sometimes referred to herein as “DVO”), or otherwise as described herein.
  • distinct portions can be disposed to provide different amounts of refraction.
  • an upper portion 111a can be disposed with an amount of refraction associated with relatively distant viewing
  • a lower portion 111b can be disposed with an amount of refraction associated with relatively close-range viewing, such as for reading.
  • distinct portions can be disposed to provide different amounts of shading/inverse-shading and/or coloring/tinting.
  • an upper portion 111a can be disposed to provide shading/inverse-shading and/or coloring/tinting when the user is looking through the lower portion 111b, thus providing the user with a relatively less distracting view of their reading material.
  • the lower portion 111b can be disposed to provide shading/inverse-shading and/or coloring/tinting when the user is looking through the upper portion 111a, thus avoiding bright light being refracted into the user’s eye at high intensity.
  • Other and further possibilities are described herein.
  • shading/inverse-shading and/or coloring/tinting can be used to encourage the user to look through selected portions of the lens or lenses 111, such as when the user has been looking through a relatively close-range portion 111b at reading material for a relatively long time and should take a break to relax their eyes by looking at a relatively distant range portion 111a.
  • the eyewear 110 can be disposed to encourage the user to follow a “20-20-20” guideline, thus, each 20 minutes the user can be encouraged to look away from reading at a distance of at least 20 feet for at least 20 seconds.
  • the eyewear 110 can be disposed to shade/inverse-shade and/or color/tint the lower portion 111b (thus, the relatively close-range portion) after the user has spent 20 minutes reading and to maintain that shading/inverse-shading for at least 20 seconds while the user looks through the upper portion 111a (thus, the relatively distant range portion).
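The "20-20-20" encouragement described above can be sketched as a small state machine that shades the close-range portion after 20 minutes of reading and holds the shading for a 20-second break. The class name, tick interface, and field names are illustrative assumptions, not taken from the application.

```python
# Hypothetical sketch of the 20-20-20 shading logic: after 20 minutes of
# close-range viewing, shade the reader portion for a 20-second break.

class TwentyTwentyTwenty:
    READING_LIMIT_S = 20 * 60   # 20 minutes of close-range viewing
    BREAK_S = 20                # 20-second rest looking at distance

    def __init__(self):
        self.reading_elapsed = 0.0
        self.break_elapsed = 0.0
        self.shade_reader = False

    def tick(self, dt: float, looking_close: bool) -> bool:
        """Advance by dt seconds; return True while the reader portion
        should be shaded to encourage a distance break."""
        if self.shade_reader:
            self.break_elapsed += dt
            if self.break_elapsed >= self.BREAK_S:
                # Break complete: clear shading and restart the cycle.
                self.shade_reader = False
                self.reading_elapsed = 0.0
                self.break_elapsed = 0.0
        elif looking_close:
            self.reading_elapsed += dt
            if self.reading_elapsed >= self.READING_LIMIT_S:
                self.shade_reader = True
        return self.shade_reader
```

In a real device the `looking_close` input would come from the gaze-direction and focal-length sensing described elsewhere in the application.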
  • shading/inverse-shading and/or coloring/tinting can be used to encourage the user to relax their eyes in response to a determination that the user is subject to a “dry eye” circumstance, such as when the user is presenting inadequate tearing, or in response to other measures of possible dry eye circumstances.
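One plausible way to detect the "dry eye" circumstance mentioned above is a blink-rate heuristic: a sustained blink rate well below a typical baseline can correlate with inadequate tearing. The function name, threshold, and window are illustrative assumptions, not clinical guidance and not taken from the application.

```python
# Hypothetical dry-eye heuristic based on blink rate over a time window.
# The 8 blinks/minute baseline is an illustrative assumption.

def possible_dry_eye(blinks_in_window: int, window_minutes: float,
                     min_rate_per_min: float = 8.0) -> bool:
    """Flag a possible dry-eye circumstance when the observed blink rate
    (blinks per minute) falls below a baseline rate."""
    if window_minutes <= 0:
        return False
    return (blinks_in_window / window_minutes) < min_rate_per_min
```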
  • distinct portions can be disposed to provide different amounts of shading/inverse-shading and/or coloring/tinting as a preventative and/or therapeutic measure.
  • the eyewear 110 can be disposed to use its computing device 120 to determine whether the user is subject to an oncoming or current migraine or photophobia event.
  • the eyewear 110 can be disposed to use shading/inverse-shading and/or coloring/tinting to reduce the likelihood or probable severity of an oncoming migraine or photophobia event.
  • the eyewear 110 can be disposed to reduce an amount of blue/ultraviolet in the user’s field of view, to inject green light into the user’s field of view, or to filter the user’s field of view to only admit green light (such as between about 500-560 nm).
  • the eyewear 110 can similarly be disposed to use shading/inverse-shading and/or coloring/tinting to reduce the likelihood or severity of an oncoming migraine or photophobia event.
  • the eyewear 110 can similarly be disposed to reduce an amount of blue/ultraviolet in the user’s field of view, to inject green light into the user’s field of view, or to filter the user’s field of view to only admit green light (such as between about 500-560 nm).
  • the eyewear 110 can be disposed to combine partial shading/inverse-shading with coloring/tinting, so as to provide the user with the ability to see red/amber objects or signals, while still providing the positive effect of green light.
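The green band-pass behavior described above (admitting roughly 500-560 nm while attenuating blue/ultraviolet, yet leaving some red/amber so signals remain visible) can be sketched as a transmittance function. The function name, pass-band edges, and the 0.3 red/amber transmission factor are illustrative assumptions.

```python
# Hypothetical spectral transmittance sketch for the migraine/photophobia
# tinting described above. Pass-band and attenuation values are
# illustrative assumptions within the ranges the text gives.

GREEN_BAND_NM = (500.0, 560.0)

def transmittance(wavelength_nm: float) -> float:
    """Fraction of light admitted at a given wavelength: full transmission
    inside the green band, partial in red/amber (so the wearer can still
    see red/amber objects or signals), strong attenuation elsewhere."""
    lo, hi = GREEN_BAND_NM
    if lo <= wavelength_nm <= hi:
        return 1.0
    if wavelength_nm >= 590.0:
        return 0.3  # keep red/amber signals visible, per the text above
    return 0.0      # block blue/ultraviolet and other wavelengths
```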
  • distinct portions can be disposed to provide different amounts of polarization.
  • the eyewear 110 can be disposed to use polarization to reduce bright light or glare, such as might be received from reflections (such as in bright sunlight), from a computing display, or otherwise as described herein.
  • distinct portions can be disposed to provide different dynamic visual optimization (“DVO”) functions or DVO functions with different parameters.
  • the eyewear 110 can be disposed to use its computing device 120 to determine whether the user can be helped with respect to cognitive or visual acuity, and by which DVO functions.
  • the eyewear 110 can be disposed to adjust its relatively distant viewing portion to account for possible motion blur, while concurrently adjusting its relatively close-range viewing portion to allow the user to relatively continuously view a control panel or dashboard.
  • the eyewear 110 can be disposed to use coloring/tinting in addition to or in lieu of shading/inverse-shading to provide DVO.
  • a background image includes a substantial amount of a known color, such as blue/ultraviolet
  • the eyewear 110 can be disposed to use coloring/tinting to filter the blue/ultraviolet in addition to or in lieu of shading/inverse-shading, to provide a DVO effect that provides the user 101 with an adjusted image with less cognitive distraction and better visual acuity.
  • the eyewear 110 can be disposed to use coloring/tinting to false-color or otherwise augment a recognized object, to provide a DVO effect that provides the user 101 with an adjusted image with less cognitive distraction and better visual acuity.
  • a background image includes a reduced amount of a known color, such as green
  • the eyewear 110 can be disposed to use coloring/tinting to false-color or otherwise augment a recognized object, to provide a DVO effect that provides the user 101 with an adjusted image with less cognitive distraction and better visual acuity.
  • the distinct functions can operate in response to a situational context, such as a time of day, a known location, a nature of an ambient environment, a set of user inputs, or otherwise as described herein.
  • the user can determine and set one or more “shading bookmarks”, describing an amount of shading/inverse-shading specified to be applied in the context described by each selected bookmark.
  • shading/inverse-shading can be used to encourage the user 101 to look through selected portions of the lens or lenses 111, such as when the user 101 has been looking through a relatively close-range portion (here identified with the lower portion 111B) at reading material for a relatively long time and should take a break to relax their eyes by looking at a relatively distant range portion (here identified with the upper portion 111T).
  • the eyewear 110 can be disposed to encourage the user 101 to follow a “20-20-20” guideline, thus, each 20 minutes the user 101 can be encouraged to look away from reading at a distance of at least 20 feet for at least 20 seconds.
  • the eyewear 110 can be disposed to shade/inverse-shade the lower portion 111B (thus, the relatively close-range portion) after the user 101 has spent 20 minutes reading and to maintain that shading/inverse-shading for at least 20 seconds while the user 101 looks through the upper portion 111T (thus, the relatively distant range portion).
  • the distinct functions can operate in response to a situational context, such as a time of day, a known location, a nature of an ambient environment, a set of user 101 inputs, or otherwise as described herein.
  • the user 101 can determine and set one or more “shading bookmarks”, describing an amount of shading/inverse-shading and/or coloring/tinting preferred by the user 101 and specified to be applied in the context described by each selected contextual/situational bookmark.
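The "shading bookmark" concept above amounts to a lookup from a situational context to a preferred shading/tinting setting. The dictionary layout, context keys, and values below are illustrative assumptions, not a format given by the application.

```python
# Hypothetical shading-bookmark store: each bookmark records a preferred
# shading level (0..1) and tint for a named context. Keys and values are
# illustrative assumptions.

bookmarks = {
    ("outdoors", "midday"): {"shade": 0.8, "tint": "gray"},
    ("indoors", "reading"): {"shade": 0.2, "tint": "amber"},
}

def shading_for(location: str, activity: str) -> dict:
    """Return the bookmarked shading for a context, or a clear default
    when no bookmark matches."""
    return bookmarks.get((location, activity), {"shade": 0.0, "tint": None})
```

In use, the eyewear's context determination (time of day, location, ambient environment, user inputs) would supply the lookup key.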
  • the computing device 120 can be disposed to adjust, in response to one or more user 101 controls, the shading/inverse-shading or coloring/tinting effect of the second layer 111**, so as to present an interpretation of an audio/video experience, or another experience, to the user 101.
  • the audio/video experience can include one or more of: — A time sequence of shading/inverse-shading or coloring/tinting effects on the second layer 111**, in response to an audio signal, or another signal, such as including one or more of: music, ambient sound, or voice effects.
  • an audio/video presentation or another presentation, such as a ballet or opera, a play or other live performance, a movie or AR/VR (augmented reality or virtual reality) presentation, a presentation of a live performance, or otherwise as described herein.
  • a signal received by the eyewear such as a communication signal, a sensor signal, a signal responsive to the ambient environment, a signal selected by the user or by another person (such as a reviewer or supervisor, one or more medical personnel, another viewer, or otherwise as described herein).
  • the shading/inverse-shading or coloring/tinting effects can be responsive to beat, echo, pitch, volume, or other aspects of audio music.
  • the shading/inverse-shading and/or coloring/tinting effects can also be synchronized with the audio signal, so as to improve the user’s experience of the music.
  • the audio music can include music selected by the user 101, music available to the user in an ambient environment (such as being played by a band or soloist, a music system, or in association with a movie or other audio/video presentation), or music triggered by the computing device 120 in response to a pre-selected user preference or in response to a medical intervention (such as to induce calm).
  • the shading/inverse-shading or tinting effects can also be responsive to a music style recognized by the computing device 120, such as whether the music is classical, country and western, heavy metal, rock, or another musical style.
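The volume-responsive shading described above can be sketched as a mapping from an audio frame's RMS level to a shading level, low-pass filtered so the lens tracks the music without flicker. The function name, smoothing constant, and normalization assumption are illustrative, not taken from the application.

```python
# Hypothetical sketch: map the RMS volume of an audio frame to a 0..1
# shading level, smoothed with an exponential moving average.
# Assumes samples are normalized to [-1, 1].

def shade_from_volume(samples: list, prev_shade: float = 0.0,
                      alpha: float = 0.3) -> float:
    """Return the next shading level (0..1) for one audio frame."""
    if not samples:
        return prev_shade  # no audio: hold the current shading
    rms = (sum(s * s for s in samples) / len(samples)) ** 0.5
    target = min(1.0, rms)
    # Exponential smoothing keeps the shading from flickering on the beat.
    return prev_shade + alpha * (target - prev_shade)
```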
  • the shading/inverse-shading or coloring/tinting effects can be responsive to pitch, volume, or other aspects of audio sounds available to the user 101 in a local environment.
  • the shading/inverse-shading and/or coloring/tinting effects can also be synchronized with the audio signal, so as to improve the user’s experience of the ambient sound.
  • the shading/inverse-shading or tinting effects can also be responsive to a type of ambient sound, such as one or more of: animal noise, electrical or mechanical devices (such as lawn mowers or smartphones), movement of branches or leaves, vehicle traffic (such as engine noise, honking, sirens, tire squealing), water sounds (such as running water or flooding), wind or other weather (such as rain, sleet), or otherwise as described herein.
  • the shading/inverse-shading or coloring/tinting effects can be responsive to pitch, volume, information content, or other aspects of voices from or near the user 101.
  • the shading/inverse-shading and/or coloring/tinting effects can also be synchronized with the audio signal, so as to improve the user’s experience of the voice effects.
  • the shading/inverse-shading or tinting effects can also be responsive to a type of source of ambient sound, such as the user’s voice, a nearby voice of a person speaking to the user 101, a nearby voice of a recognized person (such as a friend of the user), or other nearby voices.
  • the computing device 120 can be disposed to receive a signal indicating (A) a set of shading/inverse-shading or tinting effects to present to the user 101, (B) an audio signal from which the computing device can select an associated set of shading/inverse-shading or tinting effects to present, or otherwise as described herein.
  • the shading/inverse-shading and/or coloring/tinting effects can also be synchronized with the audio signal, so as to improve the user’s experience of the audio/video presentation.
  • the computing device 120 can be disposed to adjust, such as in response to one or more user 101 controls, the shading/inverse-shading or coloring/tinting effect of the second layer 111**, separately with respect to: a first time sequence of shading/inverse-shading or tinting effects on the upper portion 111a of the lens 111 and a second time sequence of such effects on the lower portion 111b of the lens (such as using the second layer 111** to present those time sequences of effects), in response to one or more audio or audio/video signals, or other signals such as described herein.
  • the shading/inverse-shading or coloring/tinting effects include one or more portions (e.g., the upper portion 111a or the lower portion 111b) of the lens 111 having shading/inverse-shading or coloring/tinting effects that are substantially fixed
  • those substantially fixed portions can be disposed to be (A) substantially clear, (B) substantially shaded/inverse-shaded and/or colored/tinted in a selected amount of shading/inverse-shading and/or coloring/tinting, or otherwise as described herein.
  • a first portion of the lens 111 can be shaded/inverse-shaded or colored/tinted in response to a music signal selected by the user 101, while a second portion of the lens is shaded/inverse-shaded or colored/tinted in response to an amount of luminance in an ambient environment.
  • the first portion of the lens 111 can be shaded/inverse-shaded or colored/tinted separately from the second portion of the lens as described with respect to other and further possibilities as described herein.
  • the selected shading/inverse-shading or coloring/tinting effects can be disposed to (A) present an enjoyable accompaniment to the user 101 associated with an audio or audio/video presentation, (B) assist the user in detecting an audio or audio/video signal in an ambient environment, or determining a type of nearby audio or audio/video signal in the ambient environment, (C) present a selected set of shading/inverse-shading or coloring/tinting effects to induce or prompt an emotional or medical response by the user, or otherwise as described herein.
  • This Application also describes devices, and methods for using them, capable of monitoring the user 101’s use of the eyewear 110, so as to allow medical personnel, such as an optometrist or ophthalmologist, to determine the user 101’s typical eye behavior, particularly with respect to concentrating on close-range or distant viewing.
  • the eyewear 110 can be disposed to maintain a record of the user 101’s gaze direction and/or focal length.
  • the eyewear 110 can be disposed to determine how frequently and for how long the user 101 views objects at a distance and how frequently and for how long the user 101 views objects at close-range.
  • the eyewear 110 can be disposed to determine in what direction the user 101 is looking when looking at a distance and in what direction the user 101 is looking when viewing objects at close-range.
  • the eyewear 110 can also be disposed to use the record of the user 101’s gaze direction and/or focal length to determine whether the user 101 is staring at a computing display or mobile device or is viewing different objects, such as the user 101 might do while being active outside (or possibly being active inside).
  • the eyewear 110 can also be disposed to use the record of the user 101’s gaze direction and/or focal length to determine whether the user 101 is focused at a center portion of their field of view or is occasionally moving their eyes to identify objects or movement in a peripheral portion of their field of view.
  • the eyewear 110 can be disposed to maintain a record of a brightness level where the user 101 is located or is looking, so as to determine whether the user 101 is indoors or outdoors. For example, when the local brightness level is between about 300-700 lux, the eyewear 110 can determine that the user 101 is indoors or is looking at objects that are indoors, while when the local brightness level is over 1000 lux (or has a high variability), the eyewear 110 can determine that the user 101 is outdoors or is looking at objects that are outdoors.
  • the eyewear 110 can be disposed to maintain a record of a color balance where the user 101 is located or is looking, so as to alternatively determine whether the user 101 is indoors or outdoors. For example, when the blue or ultraviolet component of the ambient light environment is greater than a selected fraction, the eyewear 110 can be disposed to determine that the user 101 is indoors or is looking at objects illuminated by indoor lighting, while when the green component is greater than a selected fraction, the eyewear 110 can be disposed to determine that the user 101 is outdoors or is looking at objects illuminated by outdoor lighting.
  • the eyewear 110 can be disposed to use the record of the color balance to determine a set of times of day, or an evaluation of ambient weather, when the user 101 is outside.
  • the eyewear 110 can be disposed to determine, in response to a color balance, a temperature associated with times of day when the user 101 is outside, particularly in combination with an evaluation of the user 101’s location (as might be determined in response to a GPS receiver).
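The indoor/outdoor inference described above can be sketched as two small classifiers: one using the 300-700 lux indoor band and >1000 lux (or high-variability) outdoor threshold from the text, and one using the relative blue vs. green color balance. The lux thresholds follow the text; the function names and the simple fraction comparison are illustrative assumptions.

```python
# Hypothetical sketches of the indoor/outdoor determinations above.
# Lux thresholds are from the description; everything else is assumed.

def classify_by_lux(lux: float, high_variability: bool = False) -> str:
    """Infer location from ambient brightness (per the thresholds above)."""
    if lux > 1000.0 or high_variability:
        return "outdoors"
    if 300.0 <= lux <= 700.0:
        return "indoors"
    return "unknown"

def classify_by_color(blue_fraction: float, green_fraction: float) -> str:
    """Infer location from color balance: a dominant blue/ultraviolet
    component suggests indoor lighting, a dominant green component
    suggests outdoor lighting (per the description above)."""
    if blue_fraction > green_fraction:
        return "indoors"
    if green_fraction > blue_fraction:
        return "outdoors"
    return "unknown"
```

A combined determination could cross-check the two classifiers and fall back to "unknown" when they disagree.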
  • an evaluator, supervisor, trainer, or other evaluating party might include a coach (such as in a sports event or other competitive event), a teacher (such as for improving the user 101’s skills), a sports scout or test proctor (such as for evaluating the user 101’s skills), otherwise as described herein, or a similar reviewing party.
  • the reviewing party might typically be another natural person.
  • the reviewing party might alternatively be a group of persons, a computing device disposed to perform such techniques, an artificial intelligence or machine learning technique disposed to perform such techniques, or a similar reviewing party or device.
  • the reviewing party can be disposed to receive sensory inputs from the user 101’s eyewear 110, so as to determine what the user 101 can view with respect to a close-range view or a distance view.
  • the reviewing party can also be disposed to provide inputs to a computing device or a mobile device to adjust sensory inputs provided by the eyewear 110 to the user 101.
  • the reviewing party can be disposed to use the computing device or mobile device to adjust the first layer 111* or the second layer 111** of the lens 111, so as to perform one or more of: (A) altering the user’s view so as to optimize that view; or (B) altering the user’s view so as to test the user 101’s actions in response to that view.
  • altering the user 101’s view can include providing an augmented reality or virtual reality overlay in an otherwise natural environment, blocking or removing inputs (such as by filtering, shading, or tinting) in a portion of the user 101’s field of view, or similarly altering the user 101’s view.
  • the reviewing party can alter operation of the eyewear 110 similarly to described above with respect to other sections of this Application, including one or more of: “Adjusting for control inputs”: thus, adjusting a color balance effect, adjusting a polarization or shading/inverse-shading effect, adjusting a refraction effect, adjusting a dynamic visual optimization effect, adjusting other effects described herein, or adjusting similar audio/video effects.
  • FIG. 2 shows a conceptual drawing of a second example eyewear system, showing a lens divided into sections for frontal and peripheral viewing.
  • Fig. 2A shows a first conceptual drawing of an example eyewear system, showing a lens divided into an upper portion and a lower portion, and a center portion with right/left peripheral portions.
  • the lens 211 can be substantially transparent and have an upper portion 211a and a lower portion 211b.
  • the lens 211 can also be disposed with a center portion including a frontal portion 211c and a peripheral portion 211d.
  • the frontal portion 211c can be disposed in a portion of the user’s field of view where light is infalling from a focused-upon region
  • the peripheral portion 211d can be disposed in a portion of the user’s field of view where light is infalling from a peripheral region.
  • a first layer 211* of the lens 211 can be substantially transparent and have a set of refractive effects, in addition to the upper portion 211a and the lower portion 211b, for the frontal portion 211c and the peripheral portion 211d.
  • the refractive effects for the upper portion 211a might be similar to the upper portion 111a of the lens 111
  • the refractive effects for the lower portion 211b might be similar to the lower portion 111b of the lens 111.
  • the refractive effects for the upper portion 211a might differ from the upper portion 111a of the lens 111, and the refractive effects for the lower portion 211b might be similar to the lower portion 111b of the lens 111, in response to differing sizes of the upper portion 211a from the upper portion 111a and of the lower portion 211b from the lower portion 111b.
  • the upper portion 211a can be disposed for distance viewing and have no substantial refractive effect, while the lower portion 211b can be disposed for close-range (“reader”) viewing and have a refractive effect of between about +1 diopter and +3 diopter, or more or less.
  • the first layer 211* can operate as “reader” glasses, providing the user 101 with the option of switching between distance viewing using the upper portion 211a and close-range viewing using the lower portion 211b, using eye movement, head movement, or similar activity.
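The "reader"-style first layer above assigns no added power to the distance portion and roughly +1 to +3 diopters to the close-range portion. A per-portion power table, with the standard focal-length relation f = 1/D, can be sketched as follows; the table layout, the +2.0 D example value, and the function name are illustrative assumptions within the range the text gives.

```python
# Hypothetical per-portion refraction table for a reader-style layer.
# The +2.0 D close-range value is an illustrative choice within the
# +1 to +3 diopter range given in the text.

portion_diopters = {
    "upper": 0.0,   # distance viewing: no substantial refractive effect
    "lower": 2.0,   # close-range "reader" power, typically +1 to +3 D
}

def focal_length_m(diopters: float) -> float:
    """Focal length in meters for a given optical power (f = 1/D);
    zero power corresponds to an unfocused (infinite) focal length."""
    if diopters == 0.0:
        return float("inf")
    return 1.0 / diopters
```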
  • the frontal portion 211c can be disposed for direct viewing and have no substantial refractive effect, while the peripheral portion 211d can be disposed for peripheral viewing and have a refractive effect tailored for peripheral viewing.
  • the set of refractive effects for the first layer 211* of the frontal portion 211c and the peripheral portion 211d of the lens 211 can be disposed to adjust for a difference between the user’s view in a frontal portion and a peripheral portion of their field of view.
  • the refractive effects for the frontal portion 211c might be substantially different from the refractive effects for the peripheral portion 211d.
  • a second layer 211** can be disposed having portions corresponding to the portions of the first layer 211* and each capable of optimizing sensory inputs, such as in bright ambient light, or otherwise as described herein.
  • the second layer 211** can have multiple regions with distinct shading/inverse-shading or coloring/tinting properties, can have multiple regions having distinct polarization properties, can have multiple regions having distinct dynamic visual optimization, can have multiple regions having distinct color balance, can have multiple regions with other features as described herein, or can have other similar features.
  • the second layer 211** can also be disposed as described herein with respect to other sections of this Application, including one or more of: “adjusting for gaze direction”, “adjusting for object signaling”, “adjusting for control inputs”, “alternative lens viewing”, “reviewing party for eyewear”, as otherwise described herein, and similar techniques for adjustment.
  • Fig. 2B shows a second conceptual drawing of an example eyewear system, showing a lens divided into an upper portion and a lower portion, and a center portion with a frontal portion and right/left peripheral portions.
  • the lens 211 can be substantially transparent and have an upper portion 211a and a lower portion 211b.
  • the lens 211 can also be disposed with a center portion including a frontal portion 211c and right/left peripheral portions 211d/211e.
  • the frontal portion 211c can be disposed in a portion of the user’s field of view where light is infalling from a focused-upon region, while the right/left peripheral portions 211d/211e can be disposed in a portion of the user’s field of view where light is infalling from right/left peripheral regions.
  • a first layer 211* of the lens 211 can be substantially transparent and have a set of refractive effects, in addition to the upper portion 211a and the lower portion 211b, for the frontal portion 211c and the right/left peripheral portions 211d/211e.
  • the refractive effects for the upper portion 211a might be similar to the upper portion 111a of the lens 111
  • the refractive effects for the lower portion 211b might be similar to the lower portion 111b of the lens 111.
  • the upper portion 211a can be disposed for distance viewing and have no substantial refractive effect, while the lower portion 211b can be disposed for close-range (“reader”) viewing.
  • the first layer 211* can operate as “reader” glasses, providing the user 101 with the option of switching between distance viewing and close-range viewing, using eye movement, head movement, or similar activity.
  • the frontal portion 211c can be disposed for direct viewing and have no substantial refractive effect, while the right/left peripheral portions 211d/211e can be disposed for peripheral viewing and have a refractive effect tailored for peripheral viewing from the right/left peripheral portions of the user’s field of view.
  • the set of refractive effects for the first layer 211* of the frontal portion 211c and the right/left peripheral portions 211d/211e of the lens 211 can be disposed to adjust for a difference between the user’s view in a frontal portion and right/left peripheral portions of their field of view.
  • the second layer 211** can be disposed having portions corresponding to the portions of the first layer 211* and each capable of optimizing sensory inputs.
  • the second layer 211** can have multiple regions with distinct shading/inverse-shading or coloring/ tinting properties, can have multiple regions having distinct polarization properties, can have multiple regions having distinct dynamic visual optimization, can have multiple regions having distinct color balance, can have multiple regions with other features as described herein, or can have other similar features.
  • the second layer 211** can also be disposed as described herein with respect to “adjusting for gaze direction”, “adjusting for object tracking”, “adjusting for control inputs”, “reviewing party for eyewear”, as otherwise described herein, and similar techniques for adjustment.
  • the eyewear 110 can be disposed to distinguish between close-range versus distance viewing, distinguish between frontal versus peripheral viewing, and distinguish between right/left peripheral viewing.
  • the eyewear 110 can be disposed to make such determinations in response to a user’s gaze direction, an object signal, a user input, an input from a reviewing party, as otherwise described herein, and similar techniques for making such determinations.
  • the eyewear 110 can also be disposed to distinguish a peripheral location, or a right/left peripheral location, in response to a recognized object, such as an external device, a control panel or dashboard, a medical instrument, otherwise as described herein, or using a similar technique.
  • Fig. 3 (collectively including fig. 3A-3C) shows a conceptual drawing of a third example eyewear system, showing a contact lens.
  • Fig. 3A shows a system including a contact lens having multiple portions, so as to provide distinct visual adjustments to distinct portions of a user’s field of view.
  • a system 300 such as operated with respect to a user 101, is described with respect to elements as shown in the figure, and as otherwise described herein, such as:
  • a contact lens 310 including at least a first portion 310a and a second portion 310b, and having a first layer 311* and a second layer 311**;
  • a computing device 320 including a processor 321, program and data memory 322, one or more input/output elements 323;
  • a communication system 330 including a communication device 331 and a power supply 332.
  • the contact lens 310 can be disposed in a substantially circular configuration, such as an ellipse with almost zero eccentricity and having a curved shape disposed to fit the user’s eye.
  • the contact lens 310 can be disposed to have a segment removed at one side, so as to provide an alignment of the contact lens, when worn, that maintains a first portion 310a toward its top (as described herein) above a second portion 310b toward its bottom (as described herein), particularly as viewed by the user’s eye.
  • the contact lens 310 can be disposed to have a segment disposed with additional weight, so as to alternatively provide an alignment of the contact lens that maintains a first portion 310a above a second portion 310b, again, particularly as viewed by the user’s eye.
  • the first portion 310a of the contact lens 310 can be disposed with a first refractive effect, such as a distance viewing effect.
  • the first refractive effect can be disposed to be relatively minimal or substantially zero (thus, providing a relatively clear lens).
  • the first refractive effect can be disposed to be equivalent to that refractive correction.
  • the second portion 310b of the contact lens 310 can be disposed with a second refractive effect, such as a close-range viewing effect.
  • the second refractive effect can be disposed to account for the user’s needed refractive correction.
  • the user’s needed refractive correction can be disposed to be equivalent to “reader” glasses, such as between about +1 diopter and +3 diopter, more or less.
  • the combination of the first portion 310a and the second portion 310b of the contact lens 310 can provide a “bifocal” contact lens for the user 101.
  • the contact lens 310 can be disposed to have multiple portions beyond the first portion 310a and the second portion 310b. Each such portion can be disposed to provide additional refractive effect, so as to provide the user 101 with a relatively continuous sequence of refractive effects, thus appearing as a relatively continuous amount of refractive effect, in response to the user’s gaze direction.
  • the first portion 310a and the second portion 310b, and each additional portion, should any be present, can be disposed in one or more sets of concentric rings.
  • the concentric rings can be disposed as interlaced rings with even rings being included in the first set of portions and odd rings being included in the second set of portions.
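By way of illustration, the interlaced ring layout described above can be sketched as follows; the function name, ring count, and diopter values are hypothetical, not taken from the disclosure.

```python
# Hypothetical sketch: assigning refractive powers to interlaced concentric
# rings of a multi-portion contact lens. Even-indexed rings form the first
# (distance) set; odd-indexed rings form the second (close-range) set.

def ring_powers(num_rings: int, distance_power: float, near_power: float):
    """Return a per-ring refractive power (diopters) for an interlaced layout."""
    powers = []
    for ring in range(num_rings):
        if ring % 2 == 0:
            powers.append(distance_power)   # first set: distance viewing
        else:
            powers.append(near_power)       # second set: close-range viewing
    return powers

# Example: six rings, a plano distance set and a +2.0 diopter "reader" set.
print(ring_powers(6, 0.0, 2.0))  # [0.0, 2.0, 0.0, 2.0, 0.0, 2.0]
```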
  • Fig. 3B shows a system including a contact lens having multiple layers, so as to provide multiple visual adjustments to at least some portions of a user’s field of view.
  • the contact lens 310 can include a first layer 311* and a second layer 311**, having similar form and performing similar functions as the first layer 111* and the second layer 111** of the lens 111.
  • the first layer 311* can be disposed to be substantially transparent.
  • the second layer 311** can be disposed to provide shading/inverse-shading or coloring/tinting, under control of the computing device 320.
  • the first layer 311* and the second layer 311** in the contact lens 310 can be disposed to provide more than one visual adjustment to a user’s field of view.
  • the first layer 311* can be disposed to provide a first adjustment (such as a distinct refractive effect, a distinct coloring/tinting effect, or some combination thereof) to the first portion 310a of the contact lens 310
  • the second layer 311** can be disposed to provide a second adjustment (such as an amount of shading/inverse-shading) to both the first portion 310a and the second portion 310b of the contact lens.
  • the first layer 311* can be disposed to provide a first adjustment (such as a distinct refractive effect, a distinct coloring/tinting effect, or some combination thereof) to the first portion 310a of the contact lens 310, while the second layer 311** can be disposed to provide a second adjustment (such as an amount of shading/inverse-shading) to only the first portion 310a, or to only the second portion 310b, of the contact lens.
  • the computing device 320 can be disposed with its processor 321 operating under control of the program and data memory 322 and having at least one of its input/output elements coupled to the communication device 331.
  • the communication device 331 can be disposed to provide commands and receive responses from the second layer 311**.
  • the second layer 311** can be disposed to communicate with the communication device 331 using electromagnetic power drawn from the power supply 332.
  • the power supply 332 can be disposed to obtain power from one or more of:
  • a ratchet or spring collecting power from the user’s eye movements, head movements, or other movements; or otherwise as described herein.
  • FIG. 3C shows a conceptual drawing of an example eyewear system, showing a contact lens divided into multiple portions.
  • the lenses 111 can include an upper portion 111a and a lower portion 111b, so as to provide a bifocal lens set.
  • the upper portion 111a can be disposed to provide a relatively distant vision amount of refraction
  • the lower portion 111b can be disposed to provide a relatively close-range vision amount of refraction.
  • the eyewear 110 can be disposed to determine a most frequent distance, or a most frequent set of distances, at which the user typically looks, and can be disposed to dynamically adjust the refraction of those portions of the lenses 111 so as to optimize use of the eyewear on behalf of the user.
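The dynamic adjustment just described might be sketched as follows; the distance log, the 10 cm bucketing, and the 1/distance diopter approximation are illustrative assumptions, not the disclosure's stated method.

```python
# Hypothetical sketch: choosing an add power for a lens portion from the
# user's most frequent viewing distances. The logging source and the
# near-add approximation (power ~ 1/distance) are assumptions.
from collections import Counter

def most_frequent_distances(distance_log_m, top_n=2):
    """Bucket logged viewing distances (meters) into 10 cm bins and
    return the top_n most frequent buckets."""
    buckets = [round(d, 1) for d in distance_log_m]
    return [d for d, _ in Counter(buckets).most_common(top_n)]

def add_power_for(distance_m: float) -> float:
    """Approximate add power in diopters for a given viewing distance;
    treat 2 m and beyond as distance viewing (no add)."""
    return 0.0 if distance_m >= 2.0 else round(1.0 / distance_m, 2)

log = [0.4, 0.4, 0.42, 3.0, 3.1, 0.4, 5.0, 0.38]
for d in most_frequent_distances(log):
    print(d, add_power_for(d))
```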
  • the lenses 111 can include a larger number of portions than just upper/lower portions, such as multi-focal lenses 111 having three or more such portions.
  • the multi-focal lenses 111 can be disposed with individual portions dividing the lenses 111 into multiple horizontal sections; thus, a set of stripes each being horizontal and transitioning from an amount of refraction for a most distant viewing range at a top edge to an amount of refraction for a least distant viewing range at a bottom edge.
  • multi-focal contact lenses 310 can be disposed with individual portions dividing the contact lenses 310 into multiple concentric circular sections 312 (shown in fig. 3); thus, a set of circular sections 312 each having the same center as the center of the contact lens and transitioning from an amount of refraction for a most distant viewing range at an outer edge to an amount of refraction for a least distant viewing range at a center of the lens.
  • multi-focal contact lenses 310 can be disposed with individual portions dividing the contact lenses 310 into top/bottom portions 313U and 313B and right/left peripheral portions 313R and 313L; thus, a contact lens can be partitioned approximately in quarters with a top quarter 313U for an amount of refraction for a most distant viewing range, a bottom quarter 313B for an amount of refraction for a least distant viewing range, and two side quarters 313R and 313L for peripheral viewing ranges.
  • one or more contact lenses 310 can include combinations of one or more of these possibilities.
  • a variant contact lens 310 can include a top portion (not shown), a central portion (not shown), and a bottom portion (not shown), and one or more peripheral portions (not shown), collectively a single contact lens 310 with 4-5 portions, capable of providing long-range, middle-range, and close-range viewing, as well as peripheral viewing.
  • there is no particular requirement that two contact lenses 310 in use by the user be identical.
  • two contact lenses 310 might be different because the user’s eyes have distinct refraction or astigmatism prescriptions.
  • two contact lenses 310 might be different because the user’s view from each eye is toward distinct objects, leading the eyewear 110 to electronically adjust the refraction, shading/inverse-shading, coloring/tinting, or other parameters to different values.
  • a right-hand contact lens 310R might include portions disposed in concentric circular sections 312, while a left-hand contact lens 310L might include portions disposed having top/bottom portions 313U and 313B and right/left portions 313R and 313L.
  • there is no particular requirement that two contact lenses 310 in use by the user be controlled by the eyewear 110 in an identical manner.
  • two distinct contact lenses 310 might be distinguished by the user by coloring, patterning, stippling, or micro-printed labeling at an edge or in a circular region away from the pupil.
  • the eyewear 110 might control a first contact lens 310 primarily by adjusting its shading/inverse-shading and might control a second contact lens 310 primarily by adjusting its coloring/tinting.
  • the eyewear 110 might control a first contact lens 310 primarily at a first speed and might control a second contact lens 310 primarily at a second speed.
  • the eyewear 110 might control a first contact lens 310 primarily with respect to a first set of objects and might control a second contact lens 310 primarily with respect to a second set of objects.
  • Fig. 4 shows a conceptual drawing of an example eyewear system, showing adjustment of refraction in response to a depth of focus.
  • an eyewear 400 can be disposed to include one or more lenses 410, at least one of which includes a first portion 411 and a second portion 412.
  • the first portion 411 can include an upper portion and the second portion 412 can include a lower portion.
  • the first portion 411 can include a central portion and the second portion 412 can include a peripheral portion.
  • the first and second portions can be disposed in a different manner.
  • the lenses 410 can also be disposed with more than two portions, such as lenses 410 with four portions: upper/lower and left/right, disposed as shown in the figure.
  • the lenses 410 can also be disposed with more than two portions, such as lenses 410 with multiple portions disposed to form a trifocal, multi-focal, or “progressive” lens.
  • a lens 410 having multiple portions can be disposed to adjust an amount of refraction 421 in response to the user’s depth of focus 422.
  • the user’s depth of focus 422 can be determined in response to one or more of:
  • the user’s iris/pupil size or inter-pupillary distance (not shown), or whether the user is squinting or performing other eye/facial muscle movements (not shown).
  • a distance 431 to a recognized object 432 such as determined in response to a relative brightness thereof or a time-of-flight to/from the recognized object 432.
  • information from a selected object 432, such as a signal 433 from the object 432, wherein the signal 433 is disposed to include information with respect to a location of the object 432, or wherein a signal strength is disposed to be measured by the eyewear 400 or by an associated signal receiver (not shown).
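One of the inputs above, the inter-pupillary geometry, admits a simple vergence-based sketch of depth-of-focus estimation; the function name, the sensor values, and the use of a convergence angle are hypothetical assumptions, though the triangulation geometry itself is standard.

```python
# Hypothetical sketch: estimating the user's depth of focus from the
# inter-pupillary distance and the convergence (vergence) angle between
# the two eyes' gaze directions, by simple triangulation.
import math

def depth_from_vergence(ipd_m: float, vergence_deg: float) -> float:
    """Estimate focal distance (meters): half the IPD divided by the
    tangent of half the convergence angle."""
    half_angle = math.radians(vergence_deg) / 2.0
    return (ipd_m / 2.0) / math.tan(half_angle)

# Example: a 64 mm IPD with about 7.33 degrees of convergence
# corresponds to a focus depth of roughly half a meter.
print(round(depth_from_vergence(0.064, 7.33), 2))
```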
  • the eyewear 400 can be disposed to adjust the amount of refraction 421 so as to provide accurate focus at the user’s depth of focus 422 or at the recognized object 432.
  • This can have the effect that the eyewear 400 can be disposed to adjust for the user’s viewing direction, where any change in viewing direction has an effect on depth of focus 422.
  • the eyewear 400 can adjust an amount of refraction when appropriate.
  • the eyewear 400 can be disposed to adjust the amount of refraction 421 in response to an external signal, such as from a recognized object 432 being viewed by the user.
  • the eyewear 400 can be disposed to adjust its amount of refraction 421 in response to that signal 433.
  • the eyewear 400 can be disposed to adjust its amount of refraction 421 in response to an external signal from a different person, such as a reviewer or supervisor of the user, so as to allow the different person to direct the user’s attention to a different part of the user’s field of view.
  • jewelers, dentists, surgeons, and similar users can perform professional functions while using a magnification effect with respect to detailed objects, and can still be able to see substantially normally when not looking in the direction of those detailed objects.
  • the eyewear 400 can be disposed to alter an amount of refraction in response to whether the user is looking in the direction of their vehicle instruments, in a direction of travel, or in a direction of the ground or other terrain.
  • the amount of refraction 421 can be disposed to be separately adjusted with respect to each portion of the lenses 410.
  • an upper portion 411 can be disposed to be separately adjusted from a lower portion 412.
  • the user can look through the upper portion 411 when viewing a relatively distant object 432 and can look through the lower portion 412 when viewing a relatively close-range object 432, as shown in the figure.
  • This can have the effect that the eyewear 400 can be disposed to allow the user to focus on objects at different ranges, adjusting a focal length of the eyewear 400 in response to a distance of an object being focused upon.
  • each one of the multiple portions of the lenses 410 can be disposed to perform a distinct function in addition to or in lieu of refraction, such as one or more of: shad- ing/inverse-shading, coloring/tinting, polarization, prismatic deflection, dynamic visual optimization, or another function thereof.
  • the eyewear 400 can be disposed to adjust an amount or nature of shading/inverse-shading, such as a minimum amount of shading/inverse-shading, a responsiveness of shading/inverse-shading to an ambient environment, a maximum amount of shading/inverse-shading, and one or more thresholds at which selected intermediate amounts of shading/inverse-shading are applicable.
  • the eyewear 400 can be disposed to adjust an amount of shading/inverse-shading from a minimum of 10% when luminance of an ambient environment exceeds 400 lux and to increase the amount of shading/inverse-shading by 1% for each 10 lux above that threshold up to a maximum of 90% when luminance of the ambient environment reaches 1200 lux.
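The shading schedule in the preceding example can be expressed directly as a small function; the name and the zero-shading behavior below the 400 lux threshold are assumptions.

```python
# Sketch of the shading schedule described above: a 10% minimum at
# 400 lux, rising 1% for each 10 lux above that threshold, capped at
# 90% when ambient luminance reaches 1200 lux.

def shading_percent(lux: float) -> float:
    if lux < 400:
        return 0.0  # assumed: no shading below the activation threshold
    return min(90.0, 10.0 + (lux - 400) / 10.0)

print(shading_percent(400))   # 10.0
print(shading_percent(800))   # 50.0
print(shading_percent(1500))  # 90.0 (capped)
```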
  • each such threshold can be defined in response to a relatively normal brightness for a location at which the eyewear 400 is found or toward which one or more lenses 410 (or portions of lenses 410) are directed.
  • one or more such thresholds can be defined by “a sunny day”, “a cloudy day”, “a rainy day”, and similar conditions.
  • a GPS receiver or other location identifier can be disposed to identify where the user is, as a proxy for a description of an ambient environment.
  • the GPS receiver coupled to the eyewear 400 can be disposed to determine whether the user is indoors or outdoors.
  • a weather report received by the eyewear 400 can be disposed to identify a lighting condition.
  • the eyewear 400 can be disposed to adjust an amount or nature of coloring/tinting, such as a set of lower/upper frequencies for a set of frequency ranges.
  • the amounts of coloring/tinting can also be disposed in amounts similar to amounts and with thresholds of shading/inverse-shading.
  • the eyewear 400 can be disposed to inject light in a frequency range of 500-560 nm (green light) and/or to remove light in a frequency range below 480 nm (blue/ultraviolet light).
  • the eyewear 400 can be disposed to adjust an amount or nature of polarization, such as whether the polarization is circular or planar, and an angle or direction of the polarization.
  • the eyewear 400 can be disposed to adjust an angle of planar polarization to 90 degrees away from an angle of maximum luminance.
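The 90-degree polarization rule above reduces to a one-line computation; the function name and the [0, 180) wrap convention are assumptions.

```python
# Hypothetical sketch: orienting the planar polarizer 90 degrees away
# from the measured angle of maximum luminance, wrapped into [0, 180)
# since a planar polarizer's axis is symmetric under 180-degree rotation.

def polarizer_angle(glare_angle_deg: float) -> float:
    return (glare_angle_deg + 90.0) % 180.0

print(polarizer_angle(30.0))   # 120.0
print(polarizer_angle(170.0))  # 80.0
```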
  • the eyewear 400 can be disposed to adjust an amount or nature of prismatic deflection, such as a direction of deflection and an angle thereof.
  • the eyewear 400 can be disposed to adjust an angle of prismatic deflection so as to allow the user to view a display with an upper portion 411 of a lens 410 and a keyboard with a lower portion 412 of the lens 410.
  • the eyewear 400 can be disposed to adjust an amount or nature of dynamic visual optimization (“DVO”), such as a duty cycle or frequency thereof.
  • the eyewear 400 can be disposed to increase/decrease a duty cycle of a DVO function in response to a brightness of a recognized object 432 being viewed, and/ or to increase/ decrease a frequency of a DVO function in response to a velocity of movement of the object 432.
  • the eyewear 400 can be disposed to perform dynamic visual optimization using alternating portions of the lenses.
  • the eyewear 400 can first shade a first portion of the lenses, then can unshade that first portion of the lenses, shade a second portion of the lenses, then can unshade that second portion of the lenses.
  • This can have the effect that the first and second portions of the lenses alternate shading/inverse-shading, so as to allow the user to view alternately through the first portion and the second portion of the lenses.
  • the user can look through either the first portion or the second portion of the lenses, or both at once, and see a dynamically optimized view.
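The alternating shade/unshade sequence described above can be sketched as a simple schedule; the portion names and the schedule representation are hypothetical, and a real system would drive the lens hardware rather than return a list.

```python
# Hypothetical sketch of the DVO alternation described above: at each
# step exactly one portion is shaded and the remaining portions are
# unshaded, cycling through the portions in order.
import itertools

def dvo_cycle(portions, steps):
    """Return a list of (shaded_portion, unshaded_portions) per DVO step."""
    schedule = []
    for portion in itertools.islice(itertools.cycle(portions), steps):
        clear = [p for p in portions if p != portion]
        schedule.append((portion, clear))
    return schedule

for shaded, clear in dvo_cycle(["first", "second"], 4):
    print(f"shade {shaded}; unshade {', '.join(clear)}")
```

The same schedule applies unchanged to alternating whole lenses, as in the variant described next, by passing lens identifiers instead of portion identifiers.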
  • the eyewear 400 can be disposed to perform dynamic visual optimization using alternating lenses.
  • the eyewear 400 can first shade a first lens, then can unshade that first lens, shade a second lens, then can unshade that second lens.
  • This can have the effect that the lenses alternate shading/inverse-shading, so as to allow the user to view alternately through the first lens and the second lens.
  • the user can still look through either the first or second lens, or both at once, and see a dynamically optimized view.
  • the lens functions can be disposed to operate in response to one or more lens function bookmarks, such as a shading/inverse-shading or coloring/tinting bookmark.
  • the eyewear 400 can be disposed to allow the user to set one or more bookmarks in response to an amount of luminance in an ambient environment, such as: (A) a shading/inverse-shading bookmark in response to indoor reading of a book; (B) a shading/inverse-shading bookmark in response to indoor viewing of a computing device display; (C) a shading/inverse-shading bookmark in response to outdoor activities involving a sporting event; (D) a coloring/tinting bookmark in response to onset of migraine when indoors; (E) a coloring/tinting bookmark in response to migraine onset when outdoors; or as otherwise described herein.
  • the eyewear 400 can be disposed to allow the user to specify one or more bookmarks with respect to other lens functions, such as polarization, prismatic deflection, dynamic visual optimization, or as otherwise described herein.
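A minimal sketch of such bookmarks follows; the class name, preset names, and parameter fields are hypothetical assumptions rather than the disclosure's interface.

```python
# Hypothetical sketch of lens-function "bookmarks": named presets the
# user can store and recall for shading/inverse-shading, coloring/
# tinting, or other lens functions.

class LensBookmarks:
    def __init__(self):
        self._presets = {}

    def set(self, name, **params):
        """Store a named preset of lens-function parameters."""
        self._presets[name] = dict(params)

    def apply(self, name):
        """Return the stored parameters; a real system would push them
        to the lens controller."""
        return self._presets[name]

marks = LensBookmarks()
marks.set("indoor-reading", shading=15, tint=None)
marks.set("migraine-indoors", shading=40, tint="green-550nm")
print(marks.apply("migraine-indoors"))
```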
  • the lens functions can be disposed to operate in response to one or more situational contexts, such as a time of day, a known location, a nature of an ambient environment, a set of user inputs, or as otherwise described herein.
  • the eyewear 400 can be disposed to adjust shading/inverse-shading to prevent refraction of sunlight or other bright light from damaging or injuring the user’s eye or a printed page, such as in response to concentration of light on the user’s lens/retina or on the printed page.
  • the lenses 111 and 211, and the contact lenses 310 can be disposed to provide users with the ability to automatically or rapidly switch between close-range and distance viewing, or between frontal and peripheral viewing, in a variety of circumstances.
  • the user can be using any form of eyewear or associated accessories, including without limitation glasses or sunglasses, contact lenses, goggles, facemasks or helmets, or other eyewear.
  • the user can be using any form of optical systems, including without limitation automobile mirrors and windshields, binoculars, cameras, display screens (including computer displays, laptop displays, tablets, smartphones, and otherwise), microscopes and telescopes, rifle scopes or other gun scopes, or other optical systems.
  • the user can be performing any form of review of another person, including without limitation coaching, performing as an agent, scouting for talent, or otherwise, with respect to sports or players.
  • the user can be performing any form of coaching, observing, training, or otherwise, with respect to activities where the presence of a secondary party could interfere; such as aircraft piloting, bomb defusal, combat training, contact sports, horseback riding, marksmanship, military training, or otherwise.
  • the user can be participating in any form of entertainment, including without limitation interactive entertainment, comedy or horror shows, dinner theater, live action role playing, or otherwise.
  • the user can be observing entertainment or sports, such as being a spectator for a careful or high-speed sport: such as automobile racing, an equestrian sport (such as dressage or horse racing), marksmanship, shooting skeet, skateboarding, skiing, or otherwise; or such as participating in or viewing a competitive sport: such as baseball, basketball, field hockey or ice hockey, football, golf, racing, soccer, skiing, snowboarding, tennis or table tennis, video games, or otherwise.
  • the user can be participating in a sports activity, such as participating in a careful or high-speed sport, such as automobile racing, an equestrian sport, marksmanship, shooting skeet, skateboarding, skiing, or otherwise; or such as participating in or viewing a competitive sport: such as baseball, basketball, field hockey or ice hockey, football, golf, racing, soccer, skiing, snowboarding, tennis or table tennis, video games, or otherwise.
  • the user can be operating a vehicle, such as driving a ground vehicle, such as an automobile or racing car; operating a bicycle, motorcycle, or mountain bike; piloting an aircraft, such as a fixed-wing aircraft, glider, ultralight aircraft, or helicopter; operating a watercraft, such as a motorboat, sailboat, or jet ski; operating a space craft; or otherwise as described herein.
  • the user can be conducting or reviewing law enforcement or military operations, such as decisions whether to shoot, piloting, use of suppression devices, and otherwise.
  • the user can be conducting or reviewing search/rescue operations, emergency responder operations, or other observational operations, such as decisions whether to look more closely, look for more detail, or otherwise identify subjects, and otherwise.
  • the user can be experiencing medical conditions, such as migraines, photophobia, neuro-ophthalmic disorders; PTSD and other psychological triggers of trauma, and variants thereof; and otherwise.
  • the eyewear can be disposed to provide the user 101 with the ability to automatically or rapidly switch between close-range and distance viewing, between frontal and peripheral viewing, or otherwise as described herein, while participating in activities in which switching between viewing parameters is valuable to the viewer. At least some of these activities can include one or more of:
  • while participating in or viewing a sports activity, the user 101 might concurrently view the sports activity, a billboard or monitor associated with the sports activity, a program or scoresheet associated with the sports activity, or otherwise as described herein.
  • the user 101 might periodically (or otherwise from time to time) switch between (A) observing the sports activity, (B) reviewing game information, a scoresheet, a set of betting odds or other parameters, or other information with respect to the sports activity, (C) socializing with other observers of the sports activity, or otherwise as described herein.
  • the user 101 might periodically (or otherwise from time to time) similarly switch between (A) observing the sports activity and (B) reviewing game information or other information as described herein, (C) socializing with other observers of the sports activity, or otherwise as described herein.
  • the user 101 might periodically (or otherwise from time to time) similarly switch between (A) observing other players in the sports activity, (B) observing selected sports equipment, such as a moving bat or ball, or otherwise as described herein.
  • the user 101 might periodically (or otherwise from time to time) similarly switch between (A) observing the sports activity, (B) adjusting or selecting sports equipment, such as selecting golf clubs, adjusting rifle sights or restocking ammunition, or similar activities with respect to the sport in which the user is participating, (C) reviewing game information or other information as described herein, such as a description of a golf course or other sports environment, (D) socializing with other participants in the sports activity, or otherwise as described herein.
  • the user 101 might periodically (or otherwise from time to time) similarly switch between (A) observing the user’s action with respect to the sports activity, such as when dirt biking, observing a trail on which the user is moving, (B) observing an environment near the sports activity, such as when dirt biking, observing potential obstacles near the trail on which the user is moving, (C) observing controls and sensors associated with the equipment the user is controlling, or otherwise as described herein.
  • while conducting or reviewing law enforcement or military operations, the user 101 might concurrently switch between (A) viewing the operations at a relatively close range or a relatively distant range, (B) viewing controls or sights of equipment being operated by the law enforcement officer or military personnel, (C) viewing persons or equipment near the law enforcement officer or military personnel, or otherwise as described herein.
  • the user 101 might periodically (or otherwise from time to time) switch between (A) observing a relatively wide-angle view of the law enforcement activity, (B) observing a relatively narrow-angle view of a portion of the law enforcement activity, such as a particular person or object therein (such as a suspect, possible evidence, or a potential weapon), (C) observing a relatively close range view of a control or sensor associated with the law enforcement personnel, such as a weapon (particularly a gun sight, a safety, or a target associated with a weapon), or otherwise as described herein.
  • the user 101 might periodically (or otherwise from time to time) switch between (A) observing a relatively wide-angle view of the military operation, (B) observing a relatively narrow-angle view of a portion of the military operation, such as a particular location or object therein (such as enemy equipment or personnel, an objective, a field of fire or a target, or friendly military personnel), (C) observing a relatively close range view of a control or sensor associated with the law enforcement personnel, such as a weapon (particularly a gun sight or a target associated with a weapon), or otherwise as described herein.
  • While conducting or reviewing search/rescue operations, emergency responder operations, or other observational operations, the user 101 might concurrently switch between (A) viewing the operations at a relatively close range or a relatively distant range, (B) viewing controls or sights of equipment being operated by the search/rescue personnel, emergency responders, or other observers, (C) viewing persons or equipment near the emergency responders or other observers, or otherwise as described herein.
  • the user 101 might periodically (or otherwise from time to time) switch between (A) observing a relatively wide-angle view of the search/rescue operation, (B) observing a relatively narrow-angle view of a portion of the search/rescue operation, such as a particular location or object therein (such as an object being searched for or evidence thereof), or otherwise as described herein.
  • the user 101 might periodically (or otherwise from time to time) switch between (A) observing a relatively wide-angle view of the emergency situation, (B) observing a relatively narrow-angle view of a portion of the emergency situation, such as a particular person or object therein (such as a patient or medical equipment), (C) observing a relatively close range view of a control or sensor associated with the emergency responder, such as medical equipment (particularly medical equipment associated with a patient), or otherwise as described herein.
  • While experiencing medical conditions, such as migraines, photophobia, neuro-ophthalmic disorders, PTSD and other psychological triggers of trauma, or variants thereof, the user 101 might concurrently switch between viewing (A) a clear, substantially adjustment-free view of the ambient environment, (B) an adjusted view of the ambient environment, such as adjusted with respect to shading/inverse-shading, coloring/tinting, or filtering with respect to a frequency of change or with respect to dynamic visual optimization, or otherwise as described herein.
  • the user 101 might periodically (or otherwise from time to time) switch between (A) a clear, substantially adjustment-free view of the ambient environment, (B) a view of the ambient environment subject to shading/inverse-shading and/or coloring/tinting, so as to reduce an effect of excess luminance, glare, or other bright lighting on the user’s eyes, (C) a view of the ambient environment subject to coloring/tinting, color filtering, or color balancing, so as to encourage a user emotional state less subject to one or more of these conditions, or (D) a combination of shading/inverse-shading and coloring/tinting, and possibly other visual or audio/video effects, or otherwise as described herein.
  • For example, tinting the user’s view of the ambient environment with green coloring can have the effect of reducing the likelihood of, or ameliorating the effect of, migraine or photophobia.
  • the user 101 might periodically (or otherwise from time to time) switch between (A) a clear, substantially adjustment-free view of the ambient environment, (B) a view of the ambient environment subject to shading/inverse-shading, coloring/tinting, or time filtering (so as to reduce the effect of flashing or other high-frequency lighting), or otherwise as described herein.
  • This can have the effect of reducing the likelihood of, ameliorating the effect or severity of, at least partially treating, or otherwise helping the user with respect to, one or more of these disorders.
  • Time filtering, such as low-pass filtering or dynamic visual optimization (such as described in the Incorporated Disclosures), of the user’s view of the ambient environment can have the effect of reducing the likelihood of, or ameliorating the effect of, PTSD or epilepsy.
  • the user can be operating a device or vehicle, such as one or more of:
  • a flying vehicle such as an aircraft, an ultralight aircraft, a glider, a hang-glider, a helicopter, or a similar vehicle;
  • a ground vehicle such as an automobile, a race car, or a similar vehicle;
  • a water vehicle such as a kayak, motorboat, sailboat or yacht, speedboat, a cigarette boat, or a similar vehicle;
  • tracking moving equipment such as viewing rotating turbines or wheels, or for which it is useful to tune a viewing frequency to a frequency of angular position or movement, so as to operate in synchrony therewith;
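The "tracking moving equipment" item above, in which a viewing frequency is tuned to a frequency of angular position or movement, can be sketched as a small calculation. The function name, its parameters, and the choice of the feature-passing frequency as the reference are illustrative assumptions for this sketch, not details taken from this Application.

```python
def synchronized_shutter_hz(rpm, features=1, harmonic=1):
    """Hypothetical helper: pick a shutter/refresh frequency (Hz) tuned to
    rotating equipment, so each viewing interval catches the equipment at
    the same angular position and it appears stationary through the eyewear.

    rpm      -- rotation speed of the equipment, revolutions per minute
    features -- identical features (e.g., turbine blades) per revolution
    harmonic -- integer multiple of the feature-passing frequency to use
    """
    revolutions_per_second = rpm / 60.0
    feature_pass_hz = revolutions_per_second * features
    return feature_pass_hz * harmonic
```

For example, a two-bladed rotor at 3000 RPM passes a blade 100 times per second, so a viewing frequency of 100 Hz (or an integer multiple of it) would operate in synchrony with the rotor.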
  • This Application also describes devices, and methods for using them, capable of providing the user with the ability to automatically or rapidly switch between close-range and distance viewing, or between frontal and peripheral viewing, while participating in activities in which switching between viewing parameters is valuable to the viewer, such as:
  • a flying vehicle such as an aircraft, an ultralight aircraft, a glider, a hang-glider, a helicopter, or a similar vehicle;
  • a ground vehicle such as an automobile, a race car, or a similar vehicle;
  • a water vehicle such as a kayak, motorboat, sailboat or yacht, speedboat, a cigarette boat, or a similar vehicle;
  • baseball, basketball, an equestrian sport such as dressage or horse racing;
  • football, field hockey, ice hockey, jai alai, lacrosse;
  • a snow sport such as skiing, sledding, snowboarding, operating a snowmobile, tobogganing, or luge;
  • soccer or a similar sport;
  • tracking moving equipment such as viewing rotating turbines or wheels, or for which it is useful to tune a viewing frequency to a frequency of angular position or movement, so as to operate in synchrony therewith;
  • While operating a vehicle, such as a flying vehicle, a ground vehicle, or a water vehicle, the user 101 might concurrently switch between (A) observing the ambient environment with respect to movement of the vehicle, such as when the vehicle is being operated at a relatively high speed with respect to the ambient environment, (B) observing an environment near the movement of the vehicle, such as when the vehicle is moving near obstacles or terrain of concern to the user, (C) observing controls and sensors associated with the equipment the user is controlling, or otherwise as described herein.
  • While participating in a sport using relatively rapid sports equipment (e.g., baseball, basketball, football, racing, soccer, tennis, or other such sports), the user 101 might periodically (or otherwise from time to time) similarly switch between (A) observing other players in the sports activity, (B) observing selected sports equipment, such as a moving bat or ball, or otherwise as described herein.
  • While participating in a sport in which shooting might occur, or in which a sight might be used, the user 101 might periodically (or otherwise from time to time) similarly switch between (A) observing a target without a sight, (B) observing a target with a sight, (C) adjusting sports equipment, such as adjusting rifle sights or other sights, or similar activities with respect to the sport in which the user is participating, or otherwise as described herein.
  • While participating in an activity in which critical, such as life-critical, decisions are made, the user 101 might periodically (or otherwise from time to time) similarly switch between (A) observing a relatively wide-angle view of the activity (such as a medical emergency situation), (B) observing a relatively narrow-angle view of a portion of the activity, such as a particular person or object therein (such as a patient or medical equipment), (C) observing a relatively close range view of a control or sensor associated with the activity, such as medical equipment (particularly medical equipment associated with a patient), or otherwise as described herein.
  • a law enforcement officer might be helped by switching between a relatively broader view of a scene when responding to a request for assistance, and a relatively narrower view of a particular person (such as observing for a possible weapon or other threat).
  • a law enforcement officer might be helped by switching between a relatively broader view when reviewing a crime scene, and a relatively narrower view of a particular item that might be used as evidence.
  • a law enforcement officer might also be helped by switching to a relatively narrower view of equipment, such as a gun safety or gun sight, when appropriate.
  • medical personnel might be helped by switching between a relatively broader view of a scene when responding to a request for assistance, and a relatively narrower view of a particular person (such as observing for possible illness, injury, or a medical condition).
  • medical personnel might be helped by switching between a relatively broader view when observing a patient, and a relatively narrower view of particular medical indicators, such as the patient’s irises or one or more sensors coupled to the patient.
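The shading/inverse-shading and time-filtering behavior described above (adjusting an amount of shading in response to ambient luminance, while low-pass filtering so that flashing or other high-frequency lighting does not drive abrupt changes) can be sketched as a simple controller. The class name, the lux thresholds, and the smoothing constant below are illustrative assumptions for this sketch, not parameters specified by this Application.

```python
class ShadingController:
    """Hypothetical sketch: maps ambient luminance to a 0..1 shading
    amount, low-pass filtering the sensor reading so brief flashes do
    not cause abrupt shading changes."""

    def __init__(self, alpha=0.1, lux_clear=500.0, lux_dark=20000.0):
        self.alpha = alpha          # smoothing factor in (0, 1]
        self.filtered = None        # filtered luminance estimate (lux)
        self.lux_clear = lux_clear  # at or below: no shading applied
        self.lux_dark = lux_dark    # at or above: maximum shading applied

    def update(self, lux):
        """Feed one luminance sample; return the shading amount (0..1)."""
        # Exponential moving average = a first-order low-pass filter,
        # so high-frequency lighting changes are attenuated.
        if self.filtered is None:
            self.filtered = lux
        else:
            self.filtered += self.alpha * (lux - self.filtered)
        # Map the filtered luminance linearly onto a 0..1 shading amount.
        span = self.lux_dark - self.lux_clear
        shade = (self.filtered - self.lux_clear) / span
        return min(1.0, max(0.0, shade))
```

A smaller `alpha` gives stronger low-pass filtering (slower response); in practice the thresholds would be tuned per user, for example lowered for a user with photophobia.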

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • General Health & Medical Sciences (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Eye Examination Apparatus (AREA)
  • Eyeglasses (AREA)

Abstract

Devices, and methods for using them, optimize sensory inputs using lenses divided into distinct portions. These include lenses distinguishing close-range viewing from distance viewing, or distinguishing frontal viewing from peripheral viewing. The lenses adjust a selected viewing direction in response to user behavior or to external signals. The lenses respond to control inputs, for example from a user or from a reviewing party. The lenses encourage a user to use selected lens portions appropriate for monitoring, preventing, or treating adverse optical conditions, or appropriate for healthy viewing habits. The lenses adjust dynamic visual optimization parameters in response to user behavior or in response to control inputs. The lenses are usable with particular devices or particular activities.
PCT/US2022/050640 2021-11-21 2022-11-21 Optique hybride WO2023091771A2 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202117991400A 2021-11-21 2021-11-21
US17/991,400 2021-11-21
US202117534447A 2021-11-23 2021-11-23
US17/534,447 2021-11-23

Publications (2)

Publication Number Publication Date
WO2023091771A2 true WO2023091771A2 (fr) 2023-05-25
WO2023091771A3 WO2023091771A3 (fr) 2023-09-28

Family

ID=86397839

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/050640 WO2023091771A2 (fr) 2021-11-21 2022-11-21 Optique hybride

Country Status (1)

Country Link
WO (1) WO2023091771A2 (fr)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11428937B2 (en) * 2005-10-07 2022-08-30 Percept Technologies Enhanced optical and perceptual digital eyewear
US11181740B1 (en) * 2013-03-15 2021-11-23 Percept Technologies Inc Digital eyewear procedures related to dry eyes
KR102564748B1 (ko) * 2015-03-16 2023-08-07 Magic Leap, Inc. Methods and systems for diagnosing and treating health ailments

Also Published As

Publication number Publication date
WO2023091771A3 (fr) 2023-09-28

Similar Documents

Publication Publication Date Title
US11956414B2 (en) Wearable image manipulation and control system with correction for vision defects and augmentation of vision and sensing
US20210349535A1 (en) Eye glint imaging in see-through computer display systems
US10231614B2 (en) Systems and methods for using virtual reality, augmented reality, and/or a synthetic 3-dimensional information for the measurement of human ocular performance
US11428955B1 (en) Personalized optics
US20200387226A9 (en) Systems and methods for monitoring a user's eye
CN106662750B (zh) See-through computer display systems
CN105142498B (zh) Enhanced optical and perceptual digital eyewear
KR102256992B1 (ko) Method for controlling a head-mounted electro-optical device adapted to a wearer
US20240042232A1 (en) Head-worn therapy device
US20130242262A1 (en) Enhanced optical and perceptual digital eyewear
JP2002278670A (ja) Information system
AU2023285715A1 (en) Wearable image manipulation and control system with correction for vision defects and augmentation of vision and sensing
US11669163B2 (en) Eye glint imaging in see-through computer display systems
EP3615986A1 (fr) Wearable image manipulation and control system with correction for vision defects and augmentation of vision and sensing
CN109964230A (zh) Method and device for eye metric acquisition
WO2023091771A2 (fr) Hybrid optics
US11747897B2 (en) Data processing apparatus and method of using gaze data to generate images
KR102315674B1 (ko) Apparatus for vision recovery through virtual-reality-based eye-movement induction
WO2023096713A1 (fr) Personalized optics
CN107111366A (zh) Method for adapting a sensory output mode of a sensory output device to a user
KR102315680B1 (ko) Apparatus for vision recovery through virtual-reality-based pupil-movement induction
WO2022261031A9 (fr) Dynamic visual optimization
KR20230085614A (ko) Virtual reality device for setting a virtual display, and operating method of the device
Levy et al. Low vision goggles: optical design studies
Bertolli et al. VISION SCIENCE: Self-Guided Visual Therapy for Law Enforcement Skill Enhancement.

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22896588

Country of ref document: EP

Kind code of ref document: A2