US20120249797A1 - Head-worn adaptive display

Head-worn adaptive display

Info

Publication number
US20120249797A1
Authority
US
United States
Prior art keywords
eyepiece
user
embodiments
image
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/429,721
Inventor
John D. Haddick
Ralph F. Osterhout
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Osterhout Group Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US30897310P
Priority to US37379110P
Priority to US38257810P
Priority to US41098310P
Priority to US201161429447P
Priority to US201161429445P
Priority to US13/037,335 (US20110213664A1)
Priority to US13/037,324 (US20110214082A1)
Priority to US201161472491P
Priority to US201161483400P
Priority to US201161487371P
Priority to US201161504513P
Priority to US13/232,930 (US9128281B2)
Priority to US201161557289P
Priority to US13/341,758 (US20120194549A1)
Priority to US201261584029P
Priority to US13/429,721 (US20120249797A1)
Application filed by Osterhout Group Inc
Publication of US20120249797A1
Assigned to OSTERHOUT GROUP, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: OSTERHOUT, RALPH F.; HADDICK, JOHN D.
Assigned to MICROSOFT CORPORATION. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: OSTERHOUT GROUP, INC.
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MICROSOFT CORPORATION

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in preceding groups G01C1/00-G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in preceding groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3629 Guidance using speech or audio output, e.g. text-to-speech
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G PHYSICS
    • G04 HOROLOGY
    • G04G ELECTRONIC TIME-PIECES
    • G04G21/00 Input or output devices integrated in time-pieces
    • G04G21/02 Detectors of external physical values, e.g. temperature
    • G04G21/025 Detectors of external physical values, e.g. temperature for measuring physiological data
    • G PHYSICS
    • G04 HOROLOGY
    • G04G ELECTRONIC TIME-PIECES
    • G04G21/00 Input or output devices integrated in time-pieces
    • G04G21/04 Input or output devices integrated in time-pieces using radio waves
    • G PHYSICS
    • G04 HOROLOGY
    • G04G ELECTRONIC TIME-PIECES
    • G04G21/00 Input or output devices integrated in time-pieces
    • G04G21/08 Touch switches specially adapted for time-pieces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 – G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/163 Wearable computers, e.g. on a belt
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002 Specific input/output arrangements not covered by G06F3/02 - G06F3/16, e.g. facsimile, microfilm
    • G06F3/005 Input arrangements through a video camera
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547 Touch pads, in which fingers can move on a surface
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0428 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by sensing at the edges of the touch surface the interruption of optical paths, e.g. an illumination plane, parallel to the touch surface which may be virtual
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06Q DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce, e.g. shopping or e-commerce
    • G06Q30/02 Marketing, e.g. market research and analysis, surveying, promotions, advertising, buyer profiling, customer management or rewards; Price estimation or determination
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B2027/0178 Eyeglass type, eyeglass details G02C
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0187 Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • G PHYSICS
    • G02 OPTICS
    • G02C SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C7/00 Optical parts
    • G02C7/02 Lenses; Lens systems; Methods of designing lenses
    • G02C7/08 Auxiliary lenses; Arrangements for varying focal length
    • G PHYSICS
    • G02 OPTICS
    • G02C SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C7/00 Optical parts
    • G02C7/02 Lenses; Lens systems; Methods of designing lenses
    • G02C7/08 Auxiliary lenses; Arrangements for varying focal length
    • G02C7/086 Auxiliary lenses located directly on a main spectacle lens or in the immediate vicinity of main spectacles
    • G PHYSICS
    • G02 OPTICS
    • G02C SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C9/00 Attaching auxiliary optical parts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/033 Indexing scheme relating to G06F3/033
    • G06F2203/0331 Finger worn pointing device
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/033 Indexing scheme relating to G06F3/033
    • G06F2203/0338 Fingerprint track pad, i.e. fingerprint sensor used as pointing device tracking the fingertip image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00221 Acquiring or recognising human faces, facial parts, facial sketches, facial expressions
    • G06K9/00228 Detection; Localisation; Normalisation
    • G06K9/00255 Detection; Localisation; Normalisation using acquisition arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00624 Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
    • G06K9/00664 Recognising scenes such as could be captured by a camera operated by a pedestrian or robot, including objects at substantially different ranges from the camera
    • G06K9/00671 Recognising scenes such as could be captured by a camera operated by a pedestrian or robot, including objects at substantially different ranges from the camera for providing information about objects in the scene to a user, e.g. as in augmented reality applications

Abstract

This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content. The optical assembly includes absorptive polarizers or anti-reflective coatings to reduce stray light.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims priority to U.S. Provisional Patent Application 61/584,029, filed Jan. 6, 2012, which is incorporated herein by reference in its entirety.
  • This application is a continuation-in-part of the following United States non-provisional patent applications, each of which is incorporated herein by reference in its entirety:
  • U.S. Non-Provisional application Ser. No. 13/341,758, filed Dec. 30, 2011, which claims the benefit of the following provisional application, which is hereby incorporated herein by reference in its entirety: U.S. Provisional Patent Application 61/557,289, filed Nov. 8, 2011.
  • U.S. Non-Provisional application Ser. No. 13/232,930, filed Sep. 14, 2011, which claims the benefit of the following provisional applications, each of which is hereby incorporated herein by reference in its entirety: U.S. Provisional Application 61/382,578, filed Sep. 14, 2010; U.S. Provisional Application 61/472,491, filed Apr. 6, 2011; U.S. Provisional Application 61/483,400, filed May 6, 2011; U.S. Provisional Application 61/487,371, filed May 18, 2011; and U.S. Provisional Application 61/504,513, filed Jul. 5, 2011.
  • U.S. patent application Ser. No. 13/037,324, filed Feb. 28, 2011 and U.S. patent application Ser. No. 13/037,335, filed Feb. 28, 2011, each of which claims the benefit of the following provisional applications, each of which is hereby incorporated herein by reference in its entirety: U.S. Provisional Patent Application 61/308,973, filed Feb. 28, 2010; U.S. Provisional Patent Application 61/373,791, filed Aug. 13, 2010; U.S. Provisional Patent Application 61/382,578, filed Sep. 14, 2010; U.S. Provisional Patent Application 61/410,983, filed Nov. 8, 2010; U.S. Provisional Patent Application 61/429,445, filed Jan. 3, 2011; and U.S. Provisional Patent Application 61/429,447, filed Jan. 3, 2011.
  • BACKGROUND
  • Field
  • The present disclosure relates to an augmented reality eyepiece, associated control technologies, and applications for use, and more specifically to software applications running on the eyepiece.
  • SUMMARY
  • In embodiments, the eyepiece may include an internal software application running on an integrated multimedia computing facility that has been adapted for 3D augmented reality (AR) content display and interaction with the eyepiece. 3D AR software applications may be developed in conjunction with mobile applications and provided through application stores, or as stand-alone applications that specifically target the eyepiece as the end-use platform and are delivered through a dedicated 3D AR eyepiece store. Internal software applications may interface with input and output facilities internal and external to the eyepiece, such as inputs initiated from the surrounding environment, sensing devices, user action capture devices, internal processing facilities, internal multimedia processing facilities, other internal applications, cameras, sensors, microphones, a transceiver, a tactile interface, external computing facilities, external applications, event and/or data feeds, external devices, third parties, and the like. Command and control modes operating in conjunction with the eyepiece may be initiated by sensing inputs through input devices, user action, external device interaction, reception of events and/or data feeds, internal application execution, external application execution, and the like. In embodiments, execution control as provided through the internal software application may include a series of steps comprising at least two of the following: events and/or data feeds; sensing inputs and/or sensing devices; user action capture inputs and/or outputs; user movements and/or actions for controlling and/or initiating commands; command and/or control modes and interfaces in which the inputs may be reflected; applications on the platform that may use commands to respond to inputs; communications and/or connections from the on-platform interface to external systems and/or devices; external devices; external applications; feedback to the user (such as feedback related to external devices or external applications); and the like. A minimal sketch of this control flow follows.
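  • The following is a minimal, hypothetical Python sketch of such an execution-control flow, in which sensing inputs are routed through a command/control mode to an on-platform application that responds and can provide feedback. All names here (SensorEvent, CommandRouter, on_gesture) are illustrative assumptions and are not part of the disclosure, which does not specify an API.

    # Hedged sketch: sensing input -> command routing -> application response.
    from dataclasses import dataclass
    from typing import Callable, Dict, List

    @dataclass
    class SensorEvent:
        source: str    # e.g. "camera", "microphone", "tactile", "data_feed"
        payload: dict  # raw data captured by the sensing device

    class CommandRouter:
        """Reflects sensing inputs into the command/control modes that accept them."""

        def __init__(self) -> None:
            self._handlers: Dict[str, List[Callable[[SensorEvent], None]]] = {}

        def register(self, source: str, handler: Callable[[SensorEvent], None]) -> None:
            # An on-platform application subscribes to a class of inputs.
            self._handlers.setdefault(source, []).append(handler)

        def dispatch(self, event: SensorEvent) -> None:
            # Deliver the input to every registered command handler.
            for handler in self._handlers.get(event.source, []):
                handler(event)

    def on_gesture(event: SensorEvent) -> None:
        # An application uses the command to respond to the input; in a fuller
        # system it would then provide feedback to the user or an external device.
        print(f"gesture command: {event.payload.get('gesture')}")

    router = CommandRouter()
    router.register("camera", on_gesture)
    router.dispatch(SensorEvent(source="camera", payload={"gesture": "swipe_left"}))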
  • These and other systems, methods, objects, features, and advantages of the present disclosure will be apparent to those skilled in the art from the following detailed description of the embodiments and the drawings.
  • All documents mentioned herein are hereby incorporated in their entirety by reference. References to items in the singular should be understood to include items in the plural, and vice versa, unless explicitly stated otherwise or clear from the text. Grammatical conjunctions are intended to express any and all disjunctive and conjunctive combinations of conjoined clauses, sentences, words, and the like, unless otherwise stated or clear from the context.
  • BRIEF DESCRIPTION OF THE FIGURES
  • The present disclosure and the following detailed description of certain embodiments thereof may be understood by reference to the following figures:
  • FIG. 1 depicts an illustrative embodiment of the optical arrangement.
  • FIG. 2 depicts an RGB LED projector.
  • FIG. 3 depicts the projector in use.
  • FIG. 4 depicts an embodiment of the waveguide and correction lens disposed in a frame.
  • FIG. 5 depicts a design for a waveguide eyepiece.
  • FIG. 6 depicts an embodiment of the eyepiece with a see-through lens.
  • FIG. 7 depicts an embodiment of the eyepiece with a see-through lens.
  • FIGS. 8A-C depict embodiments of the eyepiece arranged in a flip-up/flip-down configuration.
  • FIGS. 8D-E depict embodiments of snap-fit elements of a secondary optic.
  • FIG. 8F depicts embodiments of flip-up/flip-down electro-optics modules.
  • FIG. 9 depicts an electrochromic layer of the eyepiece.
  • FIG. 10 depicts the advantages of the eyepiece in real-time image enhancement, keystone correction, and virtual perspective correction.
  • FIG. 11 depicts a plot of responsivity versus wavelength for three substrates.
  • FIG. 12 illustrates the performance of the black silicon sensor.
  • FIG. 13A depicts an incumbent night vision system, FIG. 13B depicts the night vision system of the present disclosure, and FIG. 13C illustrates the difference in responsivity between the two.
  • FIG. 14 depicts a tactile interface of the eyepiece.
  • FIG. 14A depicts motions in an embodiment of the eyepiece featuring nod control.
  • FIG. 15 depicts a ring that controls the eyepiece.
  • FIG. 15AA depicts a ring with an integrated camera that controls the eyepiece, which in an embodiment may allow the user to provide a video image of themselves as part of a videoconference.
  • FIG. 15A depicts hand-mounted sensors in an embodiment of a virtual mouse.
  • FIG. 15B depicts a facial actuation sensor as mounted on the eyepiece.
  • FIG. 15C depicts a hand pointing control of the eyepiece.
  • FIG. 15D depicts a hand pointing control of the eyepiece.
  • FIG. 15E depicts an example of eye tracking control.
  • FIG. 15F depicts a hand positioning control of the eyepiece.
  • FIG. 16 depicts a location-based application mode of the eyepiece.
  • FIG. 17 shows the difference in image quality between A) a flexible platform of uncooled CMOS image sensors capable of VIS/NIR/SWIR imaging and B) an image-intensified night vision system.
  • FIG. 18 depicts an augmented reality-enabled custom billboard.
  • FIG. 19 depicts an augmented reality-enabled custom advertisement.
  • FIG. 20 depicts an augmented reality-enabled custom artwork.
  • FIG. 20A depicts a method for posting messages to be transmitted when a viewer reaches a certain location.
  • FIG. 21 depicts an alternative arrangement of the eyepiece optics and electronics.
  • FIG. 22 depicts an alternative arrangement of the eyepiece optics and electronics.
  • FIG. 22A depicts the eyepiece with an example of eyeglow.
  • FIG. 22B depicts a cross-section of the eyepiece with a light control element for reducing eyeglow.
  • FIG. 23 depicts an alternative arrangement of the eyepiece optics and electronics.
  • FIG. 24 depicts a lock position of a virtual keyboard.
  • FIG. 24A depicts an embodiment of a virtually projected image on a part of the human body.
  • FIG. 25 depicts a detailed view of the projector.
  • FIG. 26 depicts a detailed view of the RGB LED module.
  • FIG. 27 depicts a gaming network.
  • FIG. 28 depicts a method for gaming using augmented reality glasses.
  • FIG. 29 depicts an exemplary electronic circuit diagram for an augmented reality eyepiece.
  • FIG. 29A depicts a control circuit for eye-tracking control of an external device.
  • FIG. 29B depicts a communication network among users of augmented reality eyepieces.
  • FIG. 30 depicts partial image removal by the eyepiece.
  • FIG. 31 depicts a flowchart for a method of identifying a person based on speech of the person as captured by microphones of the augmented reality device.
  • FIG. 32 depicts a typical camera for use in video calling or conferencing.
  • FIG. 33 illustrates an embodiment of a block diagram of a video calling camera.
  • FIG. 34 depicts embodiments of the eyepiece for optical or digital stabilization.
  • FIG. 35 depicts an embodiment of a classic Cassegrain configuration.
  • FIG. 36 depicts the configuration of the micro-Cassegrain telescoping folded-optic camera.
  • FIG. 37 depicts a swipe process with a virtual keyboard.
  • FIG. 38 depicts a target marker process for a virtual keyboard.
  • FIG. 38A depicts an embodiment of a visual word translator.
  • FIG. 39 illustrates glasses for biometric data capture according to an embodiment.
  • FIG. 40 illustrates iris recognition using the biometric data capture glasses according to an embodiment.
  • FIG. 41 depicts face and iris recognition according to an embodiment.
  • FIG. 42 illustrates use of dual omni-microphones according to an embodiment.
  • FIG. 43 depicts the directionality improvements with multiple microphones.
  • FIG. 44 shows the use of adaptive arrays to steer the audio capture facility according to an embodiment.
  • FIG. 45 shows the mosaic finger and palm enrollment system according to an embodiment.
  • FIG. 46 illustrates the traditional optical approach used by other finger and palm print systems.
  • FIG. 47 shows the approach used by the mosaic sensor according to an embodiment.
  • FIG. 48 depicts the device layout of the mosaic sensor according to an embodiment.
  • FIG. 49 illustrates the camera field of view and number of cameras used in a mosaic sensor according to another embodiment.
  • FIG. 50 shows the bio-phone and tactical computer according to an embodiment.
  • FIG. 51 shows the use of the bio-phone and tactical computer in capturing latent fingerprints and palm prints according to an embodiment.
  • FIG. 52 illustrates a typical DOMEX collection.
  • FIG. 53 shows the relationship between the biometric images captured using the bio-phone and tactical computer and a biometric watch list according to an embodiment.
  • FIG. 54 illustrates a pocket bio-kit according to an embodiment.
  • FIG. 55 shows the components of the pocket bio-kit according to an embodiment.
  • FIG. 56 depicts the fingerprint, palm print, geo-location and POI enrollment device according to an embodiment.
  • FIG. 57 shows a system for multi-modal biometric collection, identification, geo-location, and POI enrollment according to an embodiment.
  • FIG. 58 illustrates a fingerprint, palm print, geo-location, and POI enrollment forearm wearable device according to an embodiment.
  • FIG. 59 shows a mobile folding biometric enrollment kit according to an embodiment.
  • FIG. 60 is a high level system diagram of a biometric enrollment kit according to an embodiment.
  • FIG. 61 is a system diagram of a folding biometric enrollment device according to an embodiment.
  • FIG. 62 shows a thin-film finger and palm print sensor according to an embodiment.
  • FIG. 63 shows a biometric collection device for finger, palm, and enrollment data collection according to an embodiment.
  • FIG. 64 illustrates capture of a two stage palm print according to an embodiment.
  • FIG. 65 illustrates capture of a fingertip tap according to an embodiment.
  • FIG. 66 illustrates capture of a slap and roll print according to an embodiment.
  • FIG. 67 depicts a system for taking contactless fingerprints, palmprints or other biometric prints.
  • FIG. 68 depicts a process for taking contactless fingerprints, palmprints or other biometric prints.
  • FIG. 69 depicts an embodiment of a watch controller.
  • FIGS. 70A-D depict embodiments of cases for the eyepiece, including charging capabilities and an integrated display.
  • FIG. 71 depicts an embodiment of a ground stake data system.
  • FIG. 72 depicts a block diagram of a control mapping system including the eyepiece.
  • FIG. 73 depicts a biometric flashlight.
  • FIG. 74 depicts a helmet-mounted version of the eyepiece.
  • FIG. 75 depicts an embodiment of situational awareness glasses.
  • FIG. 76A depicts an assembled 360° imager and FIG. 76B depicts a cutaway view of the 360° imager.
  • FIG. 77 depicts an exploded view of the multi-coincident view camera.