US20210181536A1 - Eyewear device with finger activated touch sensor - Google Patents

Eyewear device with finger activated touch sensor

Info

Publication number
US20210181536A1
Authority
US
United States
Prior art keywords
touch sensor
input surface
finger
image display
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/182,943
Inventor
Julio Cesar Castañeda
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Snap Inc
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US 17/182,943
Publication of US 2021/0181536 A1
Assigned to Snap Inc. (assignor: Julio Cesar Castañeda)
Legal status: Pending

Classifications

    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B 1/00-G02B 26/00, G02B 30/00
    • G02B 27/01: Head-up displays
    • G02B 27/017: Head mounted
    • G02B 27/0176: Head mounted, characterised by mechanical features
    • G02B 27/0101: Head-up displays characterised by optical features
    • G02B 2027/0138: Comprising image capture systems, e.g. camera
    • G02B 27/0149: Head-up displays characterised by mechanical features
    • G02B 2027/0161: Characterised by the relative positioning of the constitutive elements
    • G02B 2027/0163: Electric or electronic control thereof
    • G02B 2027/0178: Eyeglass type
    • G02C: SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C 11/00: Non-optical adjuncts; Attachment thereof
    • G02C 11/10: Electronic devices other than hearing aids
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044: Digitisers characterised by capacitive transducing means
    • G06F 3/045: Digitisers using resistive elements, e.g. a single continuous surface or two parallel surfaces put in contact
    • G06F 2203/00: Indexing scheme relating to G06F 3/00-G06F 3/048
    • G06F 2203/041: Indexing scheme relating to G06F 3/041-G06F 3/045
    • G06F 2203/04104: Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger

Definitions

  • the present subject matter relates to eyewear devices, e.g., smart glasses, and, more particularly, to eyewear devices with touch sensors (e.g., slide controllers) for receiving user gestures.
  • Portable eyewear devices such as smartglasses, headwear, and headgear available today integrate lenses, cameras, and wireless network transceiver devices.
  • size limitations and the form factor of an eyewear device can make a user interface difficult to incorporate into the eyewear device.
  • the available area for placement of various control buttons on an eyewear device, e.g., to operate a camera, is limited. Due to the small form factor of the eyewear device, manipulation and interacting with, for example, displayed content on an image display is difficult.
  • FIG. 1A is a side view of an example hardware configuration of an eyewear device, which includes a touch sensor on a temple, for use in identifying a finger gesture for adjusting an image presented on an image display of the eyewear device.
  • FIGS. 1B-C are rear views of example hardware configurations of the eyewear device of FIG. 1A , including two different types of image displays.
  • FIG. 2A shows a side view of a temple of the eyewear device of FIGS. 1A-C depicting a capacitive type touch sensor example.
  • FIG. 2B illustrates an external side view of a portion of the temple of the eyewear device of FIGS. 1A-C and FIG. 2A .
  • FIG. 2C illustrates an internal side view of the components of the portion of temple of the eyewear device of FIGS. 1A-C and FIG. 2B with a cross-sectional view of a circuit board with the touch sensor, a sensing circuit, an image display driver, and a processor.
  • FIG. 2D depicts a capacitive array pattern formed on the circuit board of FIG. 2C to receive finger contacts.
  • FIG. 3A shows an external side view of a temple of the eyewear device of FIG. 1 depicting another capacitive type touch sensor.
  • FIG. 3B illustrates an external side view of a portion of the temple of the eyewear device of FIGS. 1A-C and FIG. 3A .
  • FIG. 3C illustrates an internal side view of the components of the portion of temple of the eyewear device of FIGS. 1A-C and FIG. 3B with a cross-sectional view of a circuit board with the touch sensor, a sensing circuit, an image display driver, and a processor.
  • FIG. 3D depicts the capacitive array pattern formed on the circuit board of FIG. 3C to receive finger contacts.
  • FIGS. 4A-B show operation and a circuit diagram of the capacitive type touch sensor of FIGS. 2A-D and 3 A-D to receive finger contacts and the sensing circuit to track the finger contacts.
  • FIG. 5A shows an external side view of a temple of the eyewear device of FIGS. 1A-C depicting a resistive type touch sensor example.
  • FIG. 5B illustrates an external side view of a portion of the temple of the eyewear device of FIGS. 1A-C and FIG. 5A .
  • FIG. 5C illustrates an internal side view of the components of the portion of temple of the eyewear device of FIGS. 1A-C and FIG. 5B with a cross-sectional view of a circuit board with the touch sensor, a sensing circuit, an image display driver, and a processor.
  • FIG. 5D depicts a resistive array pattern formed on the circuit board of FIG. 5C to receive finger contacts.
  • FIG. 6 shows operation and a circuit diagram of the resistive type touch sensor of FIGS. 5A-D to receive finger contacts.
  • FIGS. 7A-C illustrate press and hold detected touch events on the input surface of the touch sensor.
  • FIG. 8 illustrates finger pinching and unpinching detected touch events on the input surface of the touch sensor.
  • FIG. 9 illustrates finger rotation detected touch events on the input surface of the touch sensor.
  • FIG. 10 illustrates finger swiping detected touch events on the input surface of the touch sensor.
  • FIG. 11 is a high-level functional block diagram of an example finger activated touch sensor system including the eyewear device, a mobile device, and a server system connected via various networks.
  • Coupled refers to any logical, optical, physical or electrical connection, link or the like by which electrical signals produced or supplied by one system element are imparted to another coupled element. Unless described otherwise, coupled elements or devices are not necessarily directly connected to one another and may be separated by intermediate components, elements or communication media that may modify, manipulate or carry the electrical signals.
  • the term “on” means directly supported by an element or indirectly supported by the element through another element integrated into or supported by the element.
  • the eyewear device may be oriented in any other direction suitable to the particular application of the eyewear device, for example up, down, sideways, or any other orientation.
  • any directional term such as front, rear, inwards, outwards, towards, left, right, lateral, longitudinal, up, down, upper, lower, top, bottom and side, are used by way of example only, and are not limiting as to direction or orientation of any touch sensor or component of a touch sensor constructed as otherwise described herein.
  • an eyewear device includes a frame, a temple connected to a lateral side of the frame, an image display, a processor, and a touch sensor.
  • the touch sensor includes an input surface and a sensor array that is coupled to the input surface to receive at least one finger contact inputted from a user.
  • the sensor array can be a capacitive array or a resistive array.
  • the eyewear device further includes a sensing circuit integrated into or connected to the touch sensor and connected to the processor. The sensing circuit is configured to measure voltage to track the at least one finger contact on the input surface.
  • the eyewear device further includes a memory accessible to the processor.
  • FIG. 1A is a side view of an example hardware configuration of an eyewear device 100 , which includes a touch sensor 113 on a temple 125 B.
  • the touch sensor 113 identifies finger gestures for adjusting an image presented on an image display of an optical assembly 180 B of the eyewear device 100 .
  • the touch gestures are inputs to the human-machine interface of the eyewear device 100 to perform specific actions in applications executing on the eyewear device 100 or to navigate through displayed images in an intuitive manner which enhances and simplifies the user experience.
  • the eyewear device 100 is in a form configured for wearing by a user, which is eyeglasses in the example of FIGS. 1A-C .
  • the eyewear device 100 can take other forms and may incorporate other types of frameworks, for example, a headgear, a headset, or a helmet. It should be understood that in some examples, the touch sensor 113 may receive input in a manner other than finger contact, for example, a stylus or other mechanical input device.
  • eyewear device 100 includes a frame 105 including a left rim 107 A connected to a right rim 107 B via a bridge 106 adapted for a nose of the user.
  • the left and right rims 107 A-B include respective apertures 175 A-B, which hold a respective optical assembly 180 A-B.
  • Optical assembly 180 A-B can include various optical layers 176 A-N and an image display device.
  • the left and right temples 125 A-B are connected to respective lateral sides of the frame 105 , for example, via respective left and right chunks 110 A-B.
  • a substrate or materials forming the temple 125 A-B can include plastic, acetate, metal, or a combination thereof.
  • the chunks 110 A-B can be integrated into or connected to the frame 105 on the lateral side.
  • Eyewear device 100 includes touch sensor 113 on the frame 105 , the temple 125 A-B, or the chunk 110 A-B.
  • the touch sensor 113 includes an input surface 181 and a capacitive array or a resistive array that is coupled to the input surface 181 to receive at least one finger contact input by a user.
  • eyewear device 100 includes a processor, a memory accessible to the processor, and a sensing circuit.
  • the sensing circuit is integrated into or connected to the touch sensor 113 and is connected to the processor.
  • the sensing circuit is configured to measure voltage to track the at least one finger contact on the input surface 181 .
  • the eyewear device 100 includes programming in the memory. Execution of the programming by the processor configures the eyewear device 100 to perform functions, including functions to receive on the input surface 181 of the touch sensor 113 the at least one finger contact input by the user. The execution of the programming by the processor further configures the eyewear device 100 to track, via the sensing circuit, the at least one finger contact on the input surface 181 . The execution of the programming by the processor further configures the eyewear device 100 to detect at least one touch event on the input surface 181 of the touch sensor 113 based on the at least one finger contact on the input surface 181 .
  • a touch event represents when the state of contacts with the touch sensor 113 changes.
  • the touch event can describe one or more points of contact with the touch sensor 113 and can include detecting movement, and the addition or removal of contact points.
  • the touch event can be described by a position on the touch sensor 113 , size, shape, amount of pressure, and time.
  • the execution of the programming by the processor further configures the eyewear device 100 to identify a finger gesture based on the at least one detected touch event.
  • the execution of the programming by the processor further configures the eyewear device 100 to adjust an image presented on the image display of the optical assembly 180 A-B based on the identified finger gesture.
  • the identified finger gesture is selection or pressing of a graphical user interface element in the image presented on the image display of the optical assembly 180 A-B.
  • the adjustment to the image presented on the image display of the optical assembly 180 A-B based on the identified finger gesture is a primary action which selects or submits the graphical user interface element on the image display of the optical assembly 180 A-B for further display or execution.
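To make the flow in the preceding bullets concrete, the following is a minimal sketch, assuming a single-contact model and hypothetical names (FingerContact, TouchEvent, identify_gesture); it illustrates the described receive/track/detect/identify/adjust sequence and is not the programming actually stored in the eyewear device memory.

```python
# Hypothetical sketch of the gesture-processing flow: receive finger contacts,
# track them via the sensing circuit, detect touch events, identify a gesture,
# and adjust the displayed image. Names and thresholds are illustrative only.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class FingerContact:
    x: float          # location coordinate on the input surface
    t: float          # input time in seconds


@dataclass
class TouchEvent:
    kind: str         # "press", "release", or "move"
    contact: FingerContact


def detect_touch_events(contacts: List[Optional[FingerContact]]) -> List[TouchEvent]:
    """Turn a time-ordered stream of tracked contacts (None = no finger)
    into touch events whenever the contact state changes."""
    events, previous = [], None
    for contact in contacts:
        if contact and not previous:
            events.append(TouchEvent("press", contact))
        elif previous and not contact:
            events.append(TouchEvent("release", previous))
        elif contact and previous and contact.x != previous.x:
            events.append(TouchEvent("move", contact))
        previous = contact
    return events


def identify_gesture(events: List[TouchEvent]) -> str:
    """Very small classifier: a press followed directly by a release is a tap,
    a press with movement is a swipe."""
    kinds = [e.kind for e in events]
    if kinds[:2] == ["press", "release"]:
        return "tap"
    if "move" in kinds:
        return "swipe"
    return "none"


def adjust_image(gesture: str) -> str:
    """Map the identified gesture to an adjustment of the presented image."""
    return {"tap": "select GUI element", "swipe": "scroll image"}.get(gesture, "no change")


if __name__ == "__main__":
    stream = [None, FingerContact(0.2, 0.00), FingerContact(0.6, 0.05), None]
    gesture = identify_gesture(detect_touch_events(stream))
    print(gesture, "->", adjust_image(gesture))
```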
  • the touch sensor 113 may control other output components, such as a speaker of the eyewear device 100, with the touch sensor 113 controlling volume, for example.
  • Eyewear device 100 may include wireless network transceivers, for example cellular or local area network transceivers (e.g., WiFi or Bluetooth™), and run sophisticated applications. Some of the applications may include a web browser to navigate the Internet, an application to place phone calls, video or image codecs to watch videos or interact with pictures, codecs to listen to music, a turn-by-turn navigation application (e.g., to enter a destination address and view maps), an augmented reality application, and an email application (e.g., to read and compose emails). Gestures inputted on the touch sensor 113 can be used to manipulate and interact with the displayed content on the image display and control the applications.
  • Although touch screens exist for mobile devices such as tablets and smartphones, incorporating a touch screen in the lens of an eyewear device can interfere with the line of sight of the user of the eyewear device 100 and hinder the user's view.
  • finger touches can also smudge the optical assembly 180A-B (e.g., optical layers, image display, and lens) and cloud or obstruct the user's vision.
  • the touch sensor 113 is located on the right temple 125 B.
  • Touch sensor 113 can include a sensor array, such as a capacitive or resistive array, for example, horizontal strips or vertical and horizontal grids to provide the user with variable slide functionality, or combinations thereof.
  • the capacitive array or the resistive array of the touch sensor 113 is a grid that forms a two-dimensional rectangular coordinate system to track X and Y axes location coordinates.
  • the capacitive array or the resistive array of the touch sensor 113 is linear and forms a one-dimensional linear coordinate system to track an X axis location coordinate.
  • the touch sensor 113 may be an optical type sensor that includes an image sensor that captures images and is coupled to an image processor for digital processing along with a timestamp in which the image is captured.
  • the timestamp can be added by a coupled sensing circuit 241 which controls operation of the touch sensor 113 and takes measurements from the touch sensor 113 .
  • the sensing circuit 241 uses algorithms to detect patterns of the finger contact on the input surface 181 , such as ridges of the fingers, from the digitized images that are generated by the image processor. Light and dark areas of the captured images are then analyzed to track the finger contact and detect a touch event, which can be further based on a time that each image is captured.
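As an illustration of the optical variant just described, the sketch below analyzes light and dark areas of a captured frame and pairs the result with the capture timestamp; the thresholding and centroid approach is an assumption for clarity, not the patent's ridge-detection algorithm.

```python
# Illustrative (not the patented algorithm): treat each captured frame as a
# grayscale grid, threshold dark ridge pixels, and report the centroid of the
# dark area together with the frame timestamp as the tracked finger contact.
from typing import List, Optional, Tuple

Frame = List[List[int]]  # grayscale values 0..255


def track_contact(frame: Frame, timestamp: float,
                  dark_threshold: int = 80) -> Optional[Tuple[float, float, float]]:
    """Return (x, y, timestamp) of the dark-pixel centroid, or None if the
    frame contains too few dark pixels to be a finger contact."""
    dark = [(x, y) for y, row in enumerate(frame)
            for x, value in enumerate(row) if value < dark_threshold]
    if len(dark) < 4:                      # not enough ridge pixels: no contact
        return None
    cx = sum(x for x, _ in dark) / len(dark)
    cy = sum(y for _, y in dark) / len(dark)
    return cx, cy, timestamp


if __name__ == "__main__":
    bright, ridge = 200, 40
    frame = [[bright] * 8 for _ in range(8)]
    for y in range(2, 5):                  # synthetic finger ridge patch
        for x in range(3, 6):
            frame[y][x] = ridge
    print(track_contact(frame, timestamp=0.016))
```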
  • Touch sensor 113 can enable several functions. For example, touching anywhere on the touch sensor 113 may highlight an item on the screen of the image display of the optical assembly 180A-B. Double tapping on the touch sensor 113 may select an item. Sliding (or swiping) a finger from front to back may slide or scroll in one direction, for example, to move to a previous video, image, page, or slide. Sliding the finger from back to front may slide or scroll in the opposite direction, for example, to move to the next video, image, page, or slide. Pinching with two fingers may provide a zoom-in function to zoom in on content of a displayed image. Unpinching with two fingers provides a zoom-out function to zoom out of content of a displayed image.
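The example functions above amount to a mapping from identified gestures to display adjustments; a hypothetical dispatch table makes that mapping explicit (the gesture names and action strings are illustrative only).

```python
# Hypothetical dispatch table for the example gestures described above.
ACTIONS = {
    "tap": "highlight item on the image display",
    "double_tap": "select item",
    "slide_front_to_back": "scroll in one direction (e.g., previous item)",
    "slide_back_to_front": "scroll in the opposite direction (e.g., next item)",
    "pinch": "zoom in on displayed content",
    "unpinch": "zoom out of displayed content",
}


def handle_gesture(gesture: str) -> str:
    """Look up the display adjustment for an identified gesture."""
    return ACTIONS.get(gesture, "ignore")


print(handle_gesture("double_tap"))   # -> select item
```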
  • the touch sensor 113 can be provided on both the left and right temples 125A-B to increase available functionality or on other components of the eyewear device 100, and in some examples, two, three, four, or more touch sensors 113 can be incorporated into the eyewear device 100 in different locations.
  • the type of touch sensor 113 depends on the intended application. For example, a capacitive type touch sensor 113 has limited functionality when the user wears gloves, and rain can trigger false touch registrations on the capacitive type touch sensor 113. A resistive type touch sensor 113, on the other hand, requires more applied force, which may not be optimal for the user wearing the eyewear device 100 on their head. Given these limitations, both capacitive and resistive type technologies can be leveraged by including multiple touch sensors 113 in the eyewear device 100.
  • the eyewear device includes at least one visible light camera 114 that is sensitive to the visible light range wavelength.
  • the visible light camera 114 has a frontward facing field of view.
  • Examples of such a visible light camera 114 include a high-resolution complementary metal-oxide-semiconductor (CMOS) image sensor and a video graphics array (VGA) camera, such as 640p (e.g., 640×480 pixels for a total of 0.3 megapixels), 720p, or 1080p.
  • Image sensor data from the visible light camera 114 is captured along with geolocation data, digitized by an image processor, stored in a memory, and displayed on the image display device of optical assembly 180 A-B.
  • the touch sensor 113 is responsive to provide image or video capture via the visible light camera 114 , for example, in response to any of the identified finger gestures disclosed herein.
  • FIGS. 1B-C are rear views of example hardware configurations of the eyewear device 100 of FIG. 1A , including two different types of image displays.
  • the image display of optical assembly 180 A-B includes an integrated image display.
  • An example of such an integrated image display is disclosed in FIG. 5 of U.S. Pat. No. 9,678,338, filed Jun. 19, 2015, titled “Systems and Methods for Reducing Boot Time and Power Consumption in Wearable Display Systems,” which is incorporated by reference herein.
  • the optical assembly 180 A-B includes a display matrix 170 of any suitable type, such as a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or any other such display.
  • the optical assembly 180 A-B also includes an optical layer or layers 176 , which can include lenses, optical coatings, prisms, mirrors, waveguides, optical strips, and other optical components in any combination.
  • the optical layers 176 A-N can include a prism having a suitable size and configuration and including a first surface for receiving light from the display matrix and a second surface for emitting light to the eye of the user.
  • the prism of the optical layers 176 A-N extends over all or at least a portion of the respective apertures 175 A-B formed in the left and right rims 107 A-B to permit the user to see the second surface of the prism when the eye of the user is viewing through the corresponding left and right rims 107 A-B.
  • the first surface of the prism of the optical layers 176 A-N faces upwardly from the frame 105 and the display matrix overlies the prism so that photons and light emitted by the display matrix impinge the first surface.
  • the prism is sized and shaped so that the light is refracted within the prism and is directed towards the eye of the user by the second surface of the prism of the optical layers 176 A-N.
  • the second surface of the prism of the optical layers 176 A-N can be convex so as to direct the light towards the center of the eye.
  • the prism can optionally be sized and shaped so as to magnify the image projected by the display matrix 170 , and the light travels through the prism so that the image viewed from the second surface is larger in one or more dimensions than the image emitted from the display matrix 170 .
  • the image display device of optical assembly 180 A-B includes a projection image display as shown in FIG. 1C .
  • An example of a projection image display is disclosed in FIG. 6 of U.S. Pat. No. 9,678,338, filed Jun. 19, 2015, titled “Systems and Methods for Reducing Boot Time and Power Consumption in Wearable Display Systems,” which is incorporated by reference herein.
  • the optical assembly 180 A-B includes a laser projector 150 , which is a three-color laser projector using a scanning mirror or galvanometer. During operation, an optical source such as a laser projector 150 is disposed in or on one of the temples 125 A-B of the eyewear device 100 .
  • Optical assembly 180A-B includes one or more optical strips 155A-N spaced apart across the width of the lens of the optical assembly 180A-B or across a depth of the lens between the front surface and the rear surface of the lens.
  • As the photons projected by the laser projector 150 travel across the lens of the optical assembly 180A-B, the photons encounter the optical strips 155A-N. When a particular photon encounters a particular optical strip, the photon is either redirected towards the user's eye or passes to the next optical strip.
  • Specific photons or beams of light may be controlled by a combination of modulation of laser projector 150 , and modulation of optical strips 155 A-N.
  • a processor controls optical strips 155 A-N by initiating mechanical, acoustic, or electromagnetic signals.
  • the eyewear device 100 can include other arrangements, such as a single optical assembly or three optical assemblies, or the optical assembly 180A-B may have a different arrangement depending on the application or intended user of the eyewear device 100.
  • eyewear device 100 includes a left chunk 110 A adjacent the left lateral side 170 A of the frame 105 and a right chunk 110 B adjacent the right lateral side 170 B of the frame 105 .
  • the chunks 110 A-B may be integrated into the frame 105 on the respective lateral sides 170 A-B (as illustrated) or implemented as separate components attached to the frame 105 on the respective sides 170 A-B.
  • the chunks 110 A-B may be integrated into temples 125 A-B attached to the frame 105 .
  • FIG. 2A shows a side view of a temple 125 B of the eyewear device 100 of FIG. 1A depicting a capacitive type touch sensor 113 example.
  • the right temple 125 B includes the touch sensor 113 and the touch sensor 113 has an input surface 181 .
  • a protruding ridge 281 surrounds the input surface 181 of the touch sensor 113 to indicate to the user an outside boundary of the input surface 181 of the touch sensor 113 .
  • the protruding ridge 281 orients the user by indicating to the user that their finger is on top of the touch sensor 113 and is in the correct position to manipulate the touch sensor 113 .
  • FIG. 2B illustrates an external side view of a portion of the temple of the eyewear device 100 of FIGS. 1A-C and FIG. 2A .
  • plastic or acetate forms the right temple 125B.
  • the right temple 125 B is connected to the right chunk 110 B via the right hinge 126 B.
  • FIG. 2C illustrates an internal side view of the components of the portion of temple of the eyewear device 100 of FIGS. 1A-C and FIG. 2B with a cross-sectional view of a circuit board 240 with the touch sensor 113 , a sensing circuit 241 , an image display driver 242 , and a processor 243 .
  • Although the circuit board 240 is a flexible printed circuit board (PCB) in this example, it should be understood that the circuit board 240 can be rigid in some examples.
  • the frame 105 or the chunk 110 A-B can include the circuit board 140 that includes the touch sensor 113 .
  • sensing circuit 241 includes a dedicated microprocessor integrated circuit (IC) customized for processing sensor data from the touch sensor 113 , along with volatile memory used by the microprocessor to operate.
  • the sensing circuit 241 and processor 243 may not be separate components, for example, functions and circuitry implemented in the sensing circuit 241 can be incorporated or integrated into the processor 243 itself.
  • Image display driver 242 commands and controls the image display of the optical assembly 180 A-B.
  • Image display driver 242 may deliver image data directly to the image display of the optical assembly 180 A-B for presentation or may have to convert the image data into a signal or data format suitable for delivery to the image display device.
  • the image data may be video data formatted according to compression formats, such as H.264 (MPEG-4 Part 10), HEVC, Theora, Dirac, RealVideo RV40, VP8, VP9, or the like, and still image data may be formatted according to compression formats such as Portable Network Graphics (PNG), Joint Photographic Experts Group (JPEG), Tagged Image File Format (TIFF), or Exchangeable image file format (Exif), or the like.
  • the touch sensor 113 is disposed on the flexible printed circuit board 240 .
  • the touch sensor 113 includes a capacitive array that is coupled to the input surface 181 to receive at least one finger contact inputted from a user.
  • the sensing circuit 241 is integrated into or connected to the touch sensor 113 and connected to the processor 243 .
  • the sensing circuit 241 is configured to measure voltage to track the at least one finger contact on the input surface 181 .
  • FIG. 2D depicts a capacitive array pattern formed on the circuit board of FIG. 2C to receive finger contacts.
  • the pattern of the capacitive array 214 of the touch sensor 113 includes patterned conductive traces formed of at least one of metal, indium tin oxide, or a combination thereof on the flexible printed circuit board 240.
  • the conductive traces are rectangular shaped copper pads.
  • FIG. 3A shows an external side view of a temple 125 B of the eyewear device 100 of FIG. 1 depicting another capacitive type touch sensor 113 .
  • the right temple 125 B includes the touch sensor 113 and the touch sensor 113 has a protruding ridge 281 that surrounds an input surface 181 .
  • FIG. 3B illustrates an external side view of a portion of the temple 125 B of the eyewear device 100 of FIG. 1A and FIG. 3A .
  • Metal may form the right temple 125 B and a plastic external layer can cover the metal layer.
  • FIG. 3C illustrates an internal side view of the components of the portion of temple 125 B of the eyewear device 100 of FIG. 1A and FIG. 3B with a cross-sectional view of a circuit board 240 with the touch sensor 113 , a sensing circuit 241 , an image display driver 242 , and a processor 243 .
  • the touch sensor 113 is disposed on the flexible printed circuit board 240 .
  • Various electrical interconnect(s) 294 are formed to convey electrical signals from the input surface 181 to the flexible printed circuit board 240 .
  • FIG. 3D depicts a pattern of the capacitive array 214 formed on the flexible printed circuit board 240 of FIG. 3C to receive finger contacts similar to FIG. 2C .
  • FIGS. 4A-B show operation and a circuit diagram of the capacitive type touch sensor 113 of FIGS. 2A-D and 3 A-D to receive finger contacts and the sensing circuit 241 to track the finger contacts 410 .
  • the view of FIG. 4A is intended to give a cross-sectional view of two capacitors of the capacitive array 214 of the touch sensor 113 of FIGS. 2A-D and 3 A-D, and the coupled sensing circuit 241 .
  • the touch sensor 113 includes the capacitive array 214 formed by capacitors, including capacitors C A and C B .
  • the capacitive array 214 includes multiple patterned conductive sensor electrodes 415 A-B, and it should be understood that although only two sensor electrodes are shown, the number can be 20, 100, 1000, etc. or essentially any number depending on the application.
  • In one example, the capacitive array 214 includes 100 sensor electrodes; in other examples, the 100 sensor electrodes are arranged in a 10×10 grid.
  • the sensor electrodes 415 A-B are connected to the flexible printed circuit board 240 and disposed below the input surface 181 . At least one respective electrical interconnect connects the sensing circuit 241 to the sensor electrodes 415 A-B.
  • the sensing circuit 241 is configured to measure capacitance changes of each of the sensor electrodes 415 A-B of the capacitive array 214 .
  • the sensor electrodes 415 A-B are rectangular patterned conductive traces formed of at least one of metal, indium tin oxide, or a combination thereof.
  • Because capacitors CA and CB of the capacitive array 214 store electrical charge, connecting them to conductive plates on the input surface 181 allows the capacitors to track the details of finger contacts 410.
  • Charge stored in the capacitor CA changes slightly (e.g., the charge becomes higher) when a finger is placed over the conductive plates of capacitor CA, while an air gap leaves the charge at the capacitor CB relatively unchanged (e.g., the charge remains lower). As shown in FIG. 4B, the sensing circuit 241 can include an op-amp integrator circuit that tracks these changes in capacitance of the capacitive array 214; the capacitance changes can then be recorded by an analog-to-digital converter (ADC) and stored in a memory along with timing data of when each capacitance change is sensed.
  • the sensing circuit 241 is further configured to determine a respective location coordinate and a respective input time of the at least one finger contact 410 on the input surface 181 .
  • Execution of the programming by the processor configures the eyewear device 100 to perform functions, including functions to track, via the sensing circuit 241 , the respective location coordinate and the respective input time of the at least one finger contact on the input surface 181 .
  • the function to detect the at least one touch event on the input surface 181 of the touch sensor 113 is based on the at least one respective location coordinate and the respective input time of the at least one finger contact 410 .
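A rough sketch of how per-electrode readings could be turned into the location coordinate and input time that the sensing circuit 241 is described as tracking; the baseline, threshold, and electrode pitch values are assumptions, and the op-amp integrator and ADC are abstracted into a list of counts.

```python
# Illustrative only: turn per-electrode ADC readings from a linear capacitive
# array into (location coordinate, input time) samples. The baseline value,
# threshold, and electrode pitch are assumptions, not values from the patent.
from typing import List, Optional, Tuple

BASELINE = 512        # ADC counts with no finger present (assumed)
THRESHOLD = 40        # minimum rise above baseline to count as a contact
PITCH_MM = 2.0        # assumed spacing between sensor electrodes


def locate_contact(adc_counts: List[int], input_time: float
                   ) -> Optional[Tuple[float, float]]:
    """Return (x position in mm, input time) using a weighted average of the
    electrodes whose capacitance rose above the baseline, or None."""
    deltas = [max(0, c - BASELINE) for c in adc_counts]
    if max(deltas) < THRESHOLD:
        return None
    total = sum(deltas)
    x = sum(i * d for i, d in enumerate(deltas)) / total * PITCH_MM
    return x, input_time


if __name__ == "__main__":
    # Finger centred between electrodes 3 and 4 of a 10-electrode strip.
    counts = [512, 512, 515, 600, 600, 520, 512, 512, 512, 512]
    print(locate_contact(counts, input_time=1.250))
```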
  • FIG. 5A shows an external side view of a temple 125B of the eyewear device of FIGS. 1A-C depicting a resistive type touch sensor 113 on the temple 125B.
  • the right temple 125 B includes the touch sensor 113 and the touch sensor 113 has an input surface 181 surrounded by a protruding ridge 281 .
  • the touch sensor 113 includes a resistive array 514 .
  • FIG. 5B illustrates an external side view of a portion of the temple of the eyewear device 100 of FIG. 5A . Plastic or metal may form the right temple 125 B.
  • FIG. 5C illustrates an internal side view of the components of the portion of temple of the eyewear device of FIGS. 1A-C and FIG. 5A with a cross-sectional view of a circuit board 540 with the touch sensor 113 , a sensing circuit 241 , an image display driver 242 , and a processor 243 .
  • the touch sensor 113 is disposed on the flexible printed circuit board 540 .
  • Various electrical interconnect(s) 294 are formed to convey electrical signals from the input surface 181 to the flexible printed circuit board 540 .
  • FIG. 5D depicts a pattern of the resistive array 514 formed on the circuit board 540 of FIG. 5C to receive finger contacts similar to FIG. 2C .
  • the flexible printed circuit board 540 is an air gapped dual layer flexible printed circuit board with a resistive pattern thereon.
  • the resistive array 514 includes two conductive layers, including a first conductive layer 583 (e.g., ground) and a second conductive layer 585 (e.g., signal).
  • An air gap 584 between the two conductive layers 583 and 585 separates the first and second conductive layers.
  • the first and second conductive layers 583 and 585 of the resistive array 514 can include rectangular patterned conductive traces formed of at least one metal, indium tin oxide, or a combination thereof.
  • the two conductive layers 583 and 585 are connected to the flexible printed circuit board 540 and are disposed below the input surface 181 of the touch sensor 113 .
  • FIG. 6 shows operation and a circuit diagram of the resistive type touch sensor of FIGS. 5A-D to receive finger contacts.
  • the view of FIG. 6 is intended to give a cross-sectional view of a single resistor of the resistive array 514 of the touch sensor 113 of FIG. 5A , and the coupled sensing circuit (not shown).
  • the first conductive layer 583 and the second conductive layer 585 are separated by insulating spacers 570 A-B (shown as dots) to form an air gap 584 between the two conductive layers 583 and 585 which may be deposited or layered on respective substrates.
  • the sensing circuit 241 (not shown) is connected to the flexible printed circuit board 540 and connected to the two conductive layers 583 and 585 and configured to measure a voltage drop between the two conductive layers 583 and 585 in response to the at least one finger contact 410 .
  • the second conductive layer 585 is deposited on the flexible printed circuit board 540 and is separated from the first conductive layer 583 by the insulating spacers 570 A-B.
  • a flexible layer of protective insulation may be layered on the first conductive layer 583.
  • the sensing circuit 241 can track touch location coordinates on the resistive array 514 using four wires that are connected to the sensing circuit 241 and the conductive layers 583 and 585. Two wires are connected to the left and right sides of the second conductive layer 585, and two wires are connected to the top and bottom of the first conductive layer 583. To determine the X-coordinate, a voltage gradient is applied across the second conductive layer 585; when contact presses the first conductive layer 583 onto the second conductive layer 585, the resulting circuit mimics a voltage divider, and the voltage probed at the first conductive layer 583 indicates the X-coordinate of the touch location.
  • the sensing circuit 241 may employ a 5-wire method with a fifth wire behaving as a top layer voltage probe, in which the second conductive layer 585 is utilized for both X and Y-axis measurements.
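Under the voltage-divider model described above, the touched position along the driven axis is proportional to the probed voltage; the sketch below assumes an idealized panel, a 3.3 V drive, and a 12-bit ADC, none of which are specified by the patent.

```python
# Idealized 4-wire resistive readout (not the patent's circuit values):
# drive one layer with V_DRIVE, probe the other layer, and convert the probed
# voltage into a fractional position along the driven axis.

V_DRIVE = 3.3          # assumed drive voltage across the driven layer (volts)
ADC_FULL_SCALE = 4095  # assumed 12-bit ADC


def axis_position(adc_reading: int) -> float:
    """Fractional position (0.0 .. 1.0) along the driven axis, treating the
    touched panel as a voltage divider probed by the other layer."""
    probed_voltage = adc_reading / ADC_FULL_SCALE * V_DRIVE
    return probed_voltage / V_DRIVE


def read_xy(adc_x: int, adc_y: int) -> tuple:
    """Measure X with the gradient across one layer, then Y with the gradient
    across the other layer (the two measurements are taken sequentially)."""
    return axis_position(adc_x), axis_position(adc_y)


print(read_xy(adc_x=1024, adc_y=3072))   # -> roughly (0.25, 0.75)
```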
  • FIGS. 7-10 illustrate several examples of multiple finger contact detected touch events and identified finger gestures.
  • the function to receive on the input surface 181 of the touch sensor 113 the at least one finger contact input by the user includes functions to: receive on the input surface 181 of the touch sensor 113 a first finger contact 710A input by the user at a first input time; and receive on the input surface 181 of the touch sensor 113 a second finger contact 710B input by the user at a second input time which is within a predetermined time period of the first input time.
  • the function to detect the at least one touch event on the input surface 181 of the touch sensor 113 based on the at least one finger contact inputted from the user includes functions to: detect a first touch event on the input surface 181 of the touch sensor 113 based on the first finger contact inputted from the user at the first input time; and detect a second touch event on the input surface 181 of the touch sensor 113 based on the second finger contact inputted from the user at the second input time within the predetermined time period of the first input time.
  • the function to identify the finger gesture is based on the first and second detected touch events, the first input time, the second input time, and the predetermined time period.
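The first/second contact logic above turns on whether the second input time falls within the predetermined time period of the first; a minimal check follows, with the period value chosen only for illustration.

```python
# Illustrative check: treat two finger contacts as one multi-contact gesture
# only when the second input time falls within a predetermined period of the
# first. The 0.5 s period is an assumption, not a value from the patent.
PREDETERMINED_PERIOD_S = 0.5


def is_multi_contact(first_input_time: float, second_input_time: float) -> bool:
    return 0.0 <= (second_input_time - first_input_time) <= PREDETERMINED_PERIOD_S


print(is_multi_contact(1.00, 1.30))   # True: second contact 0.3 s later
print(is_multi_contact(1.00, 1.80))   # False: outside the predetermined period
```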
  • FIGS. 7A-C illustrate press and hold detected touch events on the input surface 181 of the touch sensor 113 .
  • multiple finger contacts occur on the touch sensor 113 , which include pressing (the first finger contact 710 A), holding (the second finger contact 710 B), and no finger contact 710 C by releasing the touch sensor 113 .
  • the first and second detected touch events are a press and hold on the input surface 181 of the touch sensor 113 .
  • the identified finger gesture is a press and hold of a graphical user interface element in the image presented on the image display.
  • the adjustment to the image presented on the image display based on the identified finger gesture is configured to allow a drag and drop (e.g., move) of the graphical user interface element on the image display or provide display options (e.g., a context menu associated with the graphical user interface element).
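One way to picture press-and-hold detection is a contact that persists beyond a hold threshold with little movement before release; the hold time and movement tolerance below are assumptions, not values from the patent.

```python
# Illustrative press-and-hold detector with assumed threshold values.
HOLD_TIME_S = 0.6
MOVE_TOLERANCE = 0.05   # fraction of the input surface


def is_press_and_hold(press_time: float, release_time: float,
                      press_x: float, release_x: float) -> bool:
    held_long_enough = (release_time - press_time) >= HOLD_TIME_S
    stayed_in_place = abs(release_x - press_x) <= MOVE_TOLERANCE
    return held_long_enough and stayed_in_place


print(is_press_and_hold(0.0, 0.9, 0.40, 0.42))   # True -> enable drag and drop
print(is_press_and_hold(0.0, 0.2, 0.40, 0.42))   # False -> treat as a tap
```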
  • FIG. 8 illustrates finger pinching and unpinching detected touch events on the input surface 181 of the touch sensor 113 .
  • Multiple finger contacts occur on the touch sensor 113 , in which two fingers (first finger contact 810 A and second finger contact 810 B) move apart from each other (finger unpinching) or move toward each other (finger pinching).
  • the first and second detected touch events are finger pinching on the input surface 181 of the touch sensor 113 .
  • the identified finger gesture is a zoom in of the image presented on the image display. The adjustment to the image presented on the image display based on the identified finger gesture zooms in on the image presented on the image display.
  • the first and second detected touch events are finger unpinching on the input surface of the touch sensor 113 .
  • the identified finger gesture is a zoom out of the image presented on the image display.
  • the adjustment to the image presented on the image display based on the identified finger gesture zooms out of the image presented on the image display.
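Pinch versus unpinch reduces to whether the distance between the two tracked contacts shrinks or grows between samples; a small sketch under that assumption (the minimum relative change is illustrative).

```python
# Illustrative pinch/unpinch classifier based on the change in distance
# between two tracked finger contacts on the input surface.
import math


def classify_pinch(first_start, second_start, first_end, second_end,
                   min_change: float = 0.05) -> str:
    """Each argument is an (x, y) location coordinate; min_change is an
    assumed minimum relative distance change to count as a gesture."""
    d0 = math.dist(first_start, second_start)
    d1 = math.dist(first_end, second_end)
    if d1 < d0 * (1.0 - min_change):
        return "pinch -> zoom in"
    if d1 > d0 * (1.0 + min_change):
        return "unpinch -> zoom out"
    return "no zoom"


print(classify_pinch((0.3, 0.5), (0.7, 0.5), (0.4, 0.5), (0.6, 0.5)))  # zoom in
```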
  • FIG. 9 illustrates finger rotation detected touch events on the input surface 181 of the touch sensor 113 .
  • multiple finger contacts occur on the touch sensor 113 , which include continuously rotating two fingers in a circle from two initial points, a first finger contact 910 A and a second finger contact 910 B, to two final points of contact for those two fingers. In some examples, only one finger may be rotated in a circle.
  • the first and second detected touch events are finger rotation on the input surface 181 of the touch sensor 113 .
  • the identified finger gesture is a finger rotation of the image presented on the image display.
  • the adjustment to the image presented on the display based on the identified finger gesture rotates the image presented on the image display, for example, to rotate a view.
  • the rotation gesture can occur when two fingers rotate around each other.
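The rotation gesture can be quantified as the change in angle of the line joining the two finger contacts; the helper below is an illustrative calculation, not a method taken from the patent.

```python
# Illustrative rotation-gesture measurement: the rotation applied to the image
# follows the change in angle of the segment joining the two finger contacts.
import math


def rotation_degrees(first_start, second_start, first_end, second_end) -> float:
    """Signed rotation, in degrees, between the start and end contact pairs."""
    a0 = math.atan2(second_start[1] - first_start[1], second_start[0] - first_start[0])
    a1 = math.atan2(second_end[1] - first_end[1], second_end[0] - first_end[0])
    return math.degrees(a1 - a0)


# Two contacts rotating a quarter turn counter-clockwise around their midpoint.
print(rotation_degrees((0.4, 0.5), (0.6, 0.5), (0.5, 0.4), (0.5, 0.6)))  # ~90.0
```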
  • FIG. 10 illustrates finger swiping detected touch events on the input surface 181 of the touch sensor 113 .
  • multiple finger contacts occur on the touch sensor 113 , which include dragging one finger left or right from a point of initial finger contact 1010 A to a final point of second finger contact 1010 B or 1010 C.
  • the first and second detected touch events are finger swiping from front to back ( 1010 A to 1010 C) or back to front ( 1010 A to 1010 B) on the input surface 181 of the touch sensor 113 .
  • the identified finger gesture is a scroll of the image presented on the image display. The adjustment to the image presented on the image display based on the identified finger gesture scrolls the image presented on the image display.
  • such a scroll or swipe gesture can occur when the user moves one or more fingers across the input surface 181 in a specific horizontal direction without significantly deviating from the main direction of travel. However, it should be understood that the direction of travel can be vertical as well, for example, if the touch sensor 113 is an X and Y coordinate grid or a vertical strip.
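A swipe, as described above, is travel that stays close to one axis; the sketch below checks for dominant X-axis travel with limited off-axis deviation (the travel and deviation limits, and the mapping of sign to front-to-back versus back-to-front, are assumptions).

```python
# Illustrative swipe/scroll detection: the contact must travel mainly along
# one axis (here X) without deviating significantly in the other axis.
MIN_TRAVEL = 0.2        # assumed minimum travel as a fraction of the surface
MAX_DEVIATION = 0.1     # assumed maximum allowed off-axis deviation


def detect_swipe(start, end) -> str:
    dx, dy = end[0] - start[0], end[1] - start[1]
    if abs(dx) >= MIN_TRAVEL and abs(dy) <= MAX_DEVIATION:
        return "swipe front-to-back" if dx > 0 else "swipe back-to-front"
    return "no swipe"


print(detect_swipe((0.2, 0.5), (0.7, 0.55)))   # swipe front-to-back -> scroll
```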
  • FIG. 11 is a high-level functional block diagram of an example finger activated touch sensor system.
  • the system 1100 includes eyewear device 100 , mobile device 1190 , and server system 1198 .
  • Mobile device 1190 may be a smartphone, tablet, laptop computer, access point, or any other such device capable of connecting with eyewear device 100 using both a low-power wireless connection 1125 and a high-speed wireless connection 1137 .
  • Mobile device 1190 is connected to server system 1198 and network 1195 .
  • the network 1195 may include any combination of wired and wireless connections.
  • Server system 1198 may be one or more computing devices as part of a service or network computing system, for example, that include a processor, a memory, and network communication interface to communicate over the network 1195 with the mobile device 1190 and eyewear device 100 .
  • Low-power wireless circuitry 1124 and the high-speed wireless circuitry 1136 of the eyewear device 100 can include short-range transceivers (Bluetooth™) and wireless wide or local area network transceivers (e.g., cellular or WiFi).
  • Mobile device 1190, including the transceivers communicating via the low-power wireless connection 1125 and high-speed wireless connection 1137, may be implemented using details of the architecture of the eyewear device 100, as can other elements of network 1195.
  • Output components of the eyewear device 100 include visual components, such as the image display of the optical assembly 180 as described in FIGS. 1B-C (e.g., a display such as a liquid crystal display (LCD), a plasma display panel (PDP), a light emitting diode (LED) display, or a projector).
  • the image display of the optical assembly 180 is driven by the image display driver 242 .
  • the output components of the eyewear device 100 further include acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor), other signal generators, and so forth.
  • the input components of the eyewear device 100 include the touch sensor 113 , and various components of the system, including the mobile device 1190 and server system 1198 , may include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point-based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instruments), tactile input components (e.g., a physical button, a touch screen that provides location and force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like.
  • System 1100 may optionally include additional peripheral device elements 1119 .
  • peripheral device elements 1119 may include biometric sensors, additional sensors, or display elements integrated with eyewear device 100 .
  • peripheral device elements 1119 may include any I/O components including output components, motion components, position components, or any other such elements described herein.
  • the biometric components of the system include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram based identification), and the like.
  • the motion components include acceleration sensor components (e.g., accelerometer), gravitation sensor components, rotation sensor components (e.g., gyroscope), and so forth.
  • the position components include location sensor components to generate location coordinates (e.g., a Global Positioning System (GPS) receiver component), WiFi or Bluetooth™ transceivers to generate positioning system coordinates, altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like.
  • Eyewear device 100 includes a touch sensor 113 , visible light camera 114 , image display of the optical assembly 180 , sensing circuit 241 , image display driver 242 , image processor 1112 , low-power circuitry 1120 , and high-speed circuitry 1130 .
  • the components shown in FIG. 11 for the eyewear device 100 are located on one or more circuit boards, for example a PCB or flexible PCB, in the temples. Alternatively or additionally, the depicted components can be located in the chunks, frames, hinges, or bridge of the eyewear device 100 .
  • Visible light camera 114 can include digital camera elements such as a complementary metal-oxide-semiconductor (CMOS) image sensor, charge coupled device, a lens, or any other respective visible or light capturing elements that may be used to capture data.
  • Touch sensor 113 can receive user input commands (e.g., finger contacts) as input, and the sensing circuit 241, along with the depicted gesture application 1144 stored in memory 1134, can track those finger contacts and identify particular input gestures. In one implementation, the identified gesture sends a user input signal to the low-power processor 243A. In some examples, the touch sensor 113 is located on different portions of the eyewear device 100, such as on a different temple, chunk, or the frame, but is electrically connected via a circuit board to the visible light camera 114, sensing circuit 241, image processor 1112, image display driver 242, and image display of the optical assembly 180.
  • interaction with the touch sensor 113 by the user (e.g., tactile input) can be processed by the low-power processor 243A as a request to capture a single image by the visible light camera 114.
  • the tactile input for a first period of time may be processed by low-power processor 243 A as a request to capture video data while the touch sensor 113 is being contacted by a finger, and to cease video capture when no finger contact is detected on the touch sensor 113 , with the video captured while the touch sensor 113 was continuously contacted stored as a single video file.
  • the low-power processor 243A may apply a threshold time period to the inputted touch gesture, such as 500 milliseconds or one second, below which the finger contact with the touch sensor 113 is processed as an image capture request, and above which the finger contact with the touch sensor 113 is interpreted as a video capture request.
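The image-versus-video decision just described can be expressed as a simple comparison of the contact duration against the threshold time period; the sketch below uses the 500 millisecond example value.

```python
# Illustrative decision made by the low-power processor: short finger contact
# requests a single image, longer contact requests video capture for as long
# as the finger stays on the touch sensor. 0.5 s is one example threshold.
THRESHOLD_S = 0.5


def capture_request(contact_duration_s: float) -> str:
    if contact_duration_s < THRESHOLD_S:
        return "capture single image"
    return "capture video while contact persists, save as one video file"


print(capture_request(0.2))   # -> capture single image
print(capture_request(2.0))   # -> capture video while contact persists, ...
```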
  • Image processor 1112 includes circuitry to receive signals from the visible light camera 114 and process those signals from the visible light camera 114 into a format suitable for storage in the memory 1134 .
  • Memory 1134 includes various captured images and videos, as well as a gesture application 1144 to perform the functions of the programming described herein, for example the gesture identification operations outlined in further detail in FIGS. 1-10.
  • the gesture application 1144 can be part of the operating system stored in the memory 1134 of the eyewear device 100 and provides an application programming interface (API) which is responsive to calls from other applications.
  • Identified gestures can be utilized to allow the user to interact with and manipulate various applications, including the depicted augmented reality application 1145, web browser application 1146, turn-by-turn navigation application 1147, phone application 1148, photo and video viewer application 1149, music player application 1150, and email application 1151.
  • the applications 1145-1151 can manipulate and interact with the displayed content (e.g., graphical user interface) presented on the image display of the optical assembly 180 to control the applications 1145-1151.
  • an API call to the gesture application 1144 can return identified finger gestures.
  • the applications 1145 - 1151 can adjust the image presented on the display based on the identified finger gesture.
  • the underlying detected touch events of the identified finger gesture may also be returned to the applications 1145-1151 by the API call to the gesture application 1144. This allows custom gestures to be developed and implemented in the applications 1145-1151 for identification (e.g., via a software development kit), with resulting adjustments to images presented on the display based on the identified finger gesture.
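The gesture application 1144 is described as exposing an API that returns identified finger gestures, and optionally the underlying touch events, to the applications 1145-1151; the class below is only a hypothetical shape for such an API, since the patent does not define its actual signatures.

```python
# Hypothetical shape of the gesture application API described above: callers
# receive the identified gesture and, optionally, the underlying touch events
# so that custom gestures can be implemented in the applications.
from dataclasses import dataclass, field
from typing import List


@dataclass
class GestureResult:
    gesture: str                       # e.g. "swipe", "pinch", "press_and_hold"
    touch_events: List[dict] = field(default_factory=list)


class GestureApplication:
    """Stand-in for gesture application 1144; the real module internals are
    not described in the patent at this level of detail."""

    def __init__(self):
        self._pending: List[GestureResult] = []

    def push(self, result: GestureResult) -> None:
        self._pending.append(result)

    def poll_gesture(self, include_touch_events: bool = False) -> GestureResult:
        """API call made by applications 1145-1151 to fetch the next gesture."""
        result = self._pending.pop(0) if self._pending else GestureResult("none")
        if not include_touch_events:
            result = GestureResult(result.gesture)
        return result


# Example: a photo viewer application polling for the next identified gesture.
api = GestureApplication()
api.push(GestureResult("swipe", [{"kind": "press"}, {"kind": "move"}]))
print(api.poll_gesture().gesture)      # -> swipe
```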
  • eyewear device 100 may include cellular wireless network transceivers or other wireless network transceivers (e.g., WiFi or Bluetooth™), and run sophisticated applications. Some of the applications may include web browsers to navigate the Internet, a phone application to place phone calls, video or image codecs to watch videos or interact with pictures, codecs to listen to music, a turn-by-turn navigation application, an augmented or virtual reality application, or an email application. Gestures inputted on the touch sensor 113 can be used to manipulate and interact with the displayed content on the image display of the optical assembly 180 and control the applications.
  • the API of the gesture application 1144 can be configured to enable gestures to navigate the Internet in the web browser application 1146 .
  • the API of the gesture application 1144 can be configured to enable gestures to enter addresses or zoom in and out of maps and locations displayed in the turn-by-turn navigation application 1147 .
  • the API of the gesture application 1144 can be configured to enable gestures to select a contact or enter a phone number to place phone calls to in the phone application 1148 .
  • the API of the gesture application 1144 can be configured to enable gestures to view photos by swiping or select videos to view in the photo and video viewer application 1149 , including pause, stop, play, etc.
  • the API of the gesture application 1144 can be configured to enable gestures to select audio files to be played in the music player application 1150 , including pause, stop, play, etc.
  • the API of the gesture application 1144 can be configured to enable gestures to read, send, delete, and compose emails in the email application 1151 .
  • Image processor 1112 , touch sensor 113 , and sensing circuit 241 are structured within eyewear device 100 such that the components may be powered on and booted under the control of low-power circuitry 1120 .
  • Image processor 1112 , touch sensor 113 , and sensing circuit 241 may additionally be powered down by low-power circuitry 1120 .
  • these components may still consume a small amount of power even when in an off state. This power will, however, be negligible compared to the power used by image processor 1112 , touch sensor 113 , and sensing circuit 241 when in an on state, and will also have a negligible impact on battery life.
  • device elements in an “off” state are still configured within a device such that low-power processor 243 A is able to power on and power down the devices.
  • a device that is referred to as “off” or “powered down” during operation of eyewear device 100 does not necessarily consume zero power due to leakage or other aspects of a system design.
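  • A minimal sketch, assuming a hypothetical board-specific set_power_rail() driver, of how low-power circuitry 1120 might power these components on and down; it is illustrative only and not taken from this disclosure.

      #include <cstdint>

      enum class Component : std::uint8_t { ImageProcessor, TouchSensor, SensingCircuit };

      // Placeholder for a board-specific driver that gates a power rail on or off.
      void set_power_rail(Component, bool) { /* register write omitted */ }

      void enter_low_power_state() {
          // "Off" here means the rail is gated; leakage may still draw a negligible current.
          set_power_rail(Component::ImageProcessor, false);
          set_power_rail(Component::SensingCircuit, false);
          set_power_rail(Component::TouchSensor,    false);
      }

      void wake_for_touch_and_capture() {
          set_power_rail(Component::TouchSensor,    true);
          set_power_rail(Component::SensingCircuit, true);
          set_power_rail(Component::ImageProcessor, true);  // powers on and boots to process sensor data
      }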
  • image processor 1112 comprises a microprocessor integrated circuit (IC) customized for processing sensor data from the touch sensor 113 , sensing circuit 241 , and visible light camera 114 , along with volatile memory used by the microprocessor to operate.
  • In some examples, to reduce the time needed for the image processor 1112 to begin processing sensor data after power on, a read only memory (ROM) holding boot instructions can be integrated with the microprocessor integrated circuit (IC) of the image processor 1112.
  • This ROM may be minimized to match a minimum size needed to provide basic functionality for gathering sensor data from the touch sensor 113, sensing circuit 241, and visible light camera 114, such that no extra functionality that would cause delays in boot time is present.
  • the ROM may be configured with direct memory access (DMA) to the volatile memory of the microprocessor of image processor 1112 .
  • DMA allows memory-to-memory transfer of data from the ROM to system memory of the image processor 1112 independent of operation of a main controller of image processor 1112 .
  • Providing DMA to this boot ROM further reduces the amount of time from power on of the image processor 1112 until sensor data from the touch sensor 113 , sensing circuit 241 , and visible light camera 114 can be processed and stored.
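  • The following is a hedged sketch of a ROM-to-RAM boot copy performed by a memory-to-memory DMA transfer, leaving the main controller free during the copy; the DmaDescriptor type and the dma_start()/dma_complete() calls are hypothetical placeholders rather than the disclosed hardware interface.

      #include <cstddef>
      #include <cstdint>

      struct DmaDescriptor {
          const std::uint8_t* source;       // boot ROM base address
          std::uint8_t*       destination;  // volatile system memory of the image processor
          std::size_t         length;       // size of the minimized boot image
      };

      // Placeholder driver calls for a memory-to-memory DMA engine.
      void dma_start(const DmaDescriptor&) { /* program the DMA engine */ }
      bool dma_complete() { return true; }  // poll a completion flag

      void boot_image_processor(const std::uint8_t* rom, std::uint8_t* ram, std::size_t boot_size) {
          dma_start({rom, ram, boot_size});
          while (!dma_complete()) {
              // The main controller stays free to initialize other peripherals during the copy.
          }
          // Execution would then jump to the copied boot code (omitted).
      }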
  • minimal processing of the signals from the touch sensor 113, sensing circuit 241, and visible light camera 114 is performed by the image processor 1112, and additional processing may be performed by applications operating on the mobile device 1190 or server system 1198.
  • Low-power circuitry 1120 includes low-power processor 243 A and low-power wireless circuitry 1124 . These elements of low-power circuitry 1120 may be implemented as separate elements or may be implemented on a single IC as part of a system on a single chip.
  • Low-power processor 243 A includes logic for managing the other elements of the eyewear device 100 . As described above, for example, low power processor 243 A may accept user input signals from the touch sensor 113 . Low-power processor 243 A may also be configured to receive input signals or instruction communications from mobile device 1190 via low-power wireless connection 1125 . Additional details related to such instructions are described further below.
  • Low-power wireless circuitry 1124 includes circuit elements for implementing a low-power wireless communication system via a short-range network. Bluetooth™ Smart, also known as Bluetooth™ low energy, is one standard implementation of a low power wireless communication system that may be used to implement low-power wireless circuitry 1124. In other embodiments, other low power communication systems may be used.
  • High-speed circuitry 1130 includes high-speed processor 243 B, memory 1134 , and high-speed wireless circuitry 1136 .
  • the sensing circuit 241 and touch sensor 113 are shown as being coupled to the low-power circuitry 1120 and operated by the low-power processor 243A.
  • the touch sensor 113 and sensing circuit 241 can be coupled to the high-speed circuitry 1130 and operated by the high-speed processor 243 B.
  • the image display driver 242 is coupled to the high-speed circuitry 1130 and operated by the high-speed processor 243 B in order to drive the image display of the optical assembly 180 .
  • High-speed processor 243 B may be any processor capable of managing high-speed communications and operation of any general computing system needed for eyewear device 100 .
  • High speed processor 243 B includes processing resources needed for managing high-speed data transfers on high-speed wireless connection 1137 to a wireless local area network (WLAN) using high-speed wireless circuitry 1136 .
  • the high-speed processor 243 B executes an operating system such as a LINUX operating system or other such operating system of the eyewear device 100 and the operating system is stored in memory 1134 for execution.
  • the high-speed processor 243 B executing a software architecture for the eyewear device 100 is used to manage data transfers with high-speed wireless circuitry 1136 .
  • high-speed wireless circuitry 1136 is configured to implement Institute of Electrical and Electronics Engineers (IEEE) 802.11 communication standards, also referred to herein as Wi-Fi. In other embodiments, other high-speed communications standards may be implemented by high-speed wireless circuitry 1136.
  • Memory 1134 includes any storage device capable of storing various applications 1144 - 1151 and data, including camera data generated by the visible light camera 114 and the image processor 1112 , as well as images generated for display by the image display driver 242 on the image display of the optical assembly 180 . While memory 1134 is shown as integrated with high-speed circuitry 1130 , in other embodiments, memory 1134 may be an independent standalone element of the eyewear device 100 . In certain such embodiments, electrical routing lines may provide a connection through a chip that includes the high-speed processor 243 B from the image processor 1112 or low-power processor 243 A to the memory 1134 . In other embodiments, the high-speed processor 243 B may manage addressing of memory 1134 such that the low-power processor 243 A will boot the high-speed processor 243 B any time that a read or write operation involving memory 1134 is needed.
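  • As an illustrative sketch only (the names are hypothetical), the arrangement in which the low-power processor 243A boots the high-speed processor 243B before any read or write operation involving memory 1134 could be wrapped as follows.

      #include <utility>

      // Placeholders standing in for the low-power processor's view of system state.
      bool high_speed_processor_running() { return false; }
      void boot_high_speed_processor() { /* bring the chip containing 243B out of reset */ }

      // Run a memory operation, booting the high-speed processor first if needed,
      // since only that chip addresses memory 1134 in this arrangement.
      template <typename Fn>
      void with_memory_access(Fn&& memory_operation) {
          if (!high_speed_processor_running()) {
              boot_high_speed_processor();
          }
          std::forward<Fn>(memory_operation)();  // e.g., store camera data or fetch a display image
      }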
  • any of the touch sensor or other functions described herein for the eyewear device 100, mobile device 1190, and server system 1198 can be embodied in one or more methods as method steps or in one or more applications as described previously.
  • an “application” or “applications” are program(s) that execute functions defined in the programs.
  • Various programming languages can be employed to create one or more of the applications, structured in a variety of manners, such as object-oriented programming languages (e.g., Objective-C, Java, or C++) or procedural programming languages (e.g., C or assembly language).
  • a third party application may be mobile software running on a mobile operating system such as IOS™, ANDROID™, WINDOWS® Phone, or another mobile operating system.
  • the third party application can invoke API calls provided by the operating system to facilitate functionality described herein.
  • the applications can be stored in any type of computer readable medium or computer storage device and be executed by one or more general purpose computers.
  • the methods and processes disclosed herein can alternatively be embodied in specialized computer hardware or an application specific integrated circuit (ASIC), field programmable gate array (FPGA) or a complex programmable logic device (CPLD).
  • Program aspects of the technology may be thought of as “products” or “articles of manufacture” typically in the form of executable code and/or associated data that is carried on or embodied in a type of machine readable medium.
  • programming code could include code for the touch sensor or other functions described herein.
  • “Storage” type media include any or all of the tangible memory of the computers, processors or the like, or associated modules thereof, such as various semiconductor memories, tape drives, disk drives and the like, which may provide non-transitory storage at any time for the software programming. All or portions of the software may at times be communicated through the Internet or various other telecommunication networks.
  • Such communications may enable loading of the software from one computer or processor into another, for example, from the server system 1198 or host computer of the service provider into the computer platforms of the eyewear device 100 and mobile device 1190 .
  • another type of media that may bear the programming, media content or meta-data files includes optical, electrical and electromagnetic waves, such as used across physical interfaces between local devices, through wired and optical landline networks and over various air-links.
  • the physical elements that carry such waves, such as wired or wireless links, optical links or the like, also may be considered as media bearing the software.
  • terms such as computer or machine “readable medium” refer to any medium that participates in providing instructions or data to a processor for execution.
  • Non-volatile storage media include, for example, optical or magnetic disks, such as any of the storage devices in any computer(s) or the like, such as may be used to implement the client device, media gateway, transcoder, etc. shown in the drawings.
  • Volatile storage media include dynamic memory, such as main memory of such a computer platform.
  • Tangible transmission media include coaxial cables, copper wire, and fiber optics, including the wires that comprise a bus within a computer system.
  • Carrier-wave transmission media may take the form of electric or electromagnetic signals, or acoustic or light waves such as those generated during radio frequency (RF) and infrared (IR) data communications.
  • Computer-readable media therefore include, for example: a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD or DVD-ROM, any other optical medium, punch cards, paper tape, any other physical storage medium with patterns of holes, a RAM, a PROM and EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave transporting data or instructions, cables or links transporting such a carrier wave, or any other medium from which a computer may read programming code and/or data. Many of these forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to a processor for execution.
  • any and all measurements, values, ratings, positions, magnitudes, sizes, and other specifications that are set forth in this specification, including in the claims that follow, are approximate, not exact. Such amounts are intended to have a reasonable range that is consistent with the functions to which they relate and with what is customary in the art to which they pertain. For example, unless expressly stated otherwise, a parameter value or the like may vary by as much as ±10% from the stated amount.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Optics & Photonics (AREA)
  • Acoustics & Sound (AREA)
  • General Health & Medical Sciences (AREA)
  • Otolaryngology (AREA)
  • Ophthalmology & Optometry (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An eyewear device includes a frame, a temple connected to a lateral side of the frame, a processor, and an image display. The eyewear device further includes a touch sensor. The touch sensor includes an input surface and a sensor array that is coupled to the input surface to receive at least one finger contact inputted from a user. The sensor array can be a capacitive array or a resistive array. A sensing circuit is configured to measure voltage to track the at least one finger contact on the input surface. The processor of the eyewear device can identify a finger gesture based on at least one detected touch event, and adjust an image presented on the image display based on the identified finger gesture.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 16/241,063, filed Jan. 7, 2019, which claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/615,664, filed Jan. 10, 2018, which applications are hereby incorporated herein by reference in their entireties.
  • TECHNICAL FIELD
  • The present subject matter relates to eyewear devices, e.g., smart glasses, and, more particularly, to eyewear devices with touch sensors (e.g., slide controllers) for receiving user gestures.
  • BACKGROUND
  • Portable eyewear devices, such as smartglasses, headwear, and headgear available today integrate lenses, cameras, and wireless network transceiver devices. Unfortunately, size limitations and the form factor of an eyewear device can make a user interface difficult to incorporate into the eyewear device. The available area for placement of various control buttons on an eyewear device, e.g., to operate a camera, is limited. Due to the small form factor of the eyewear device, manipulation and interacting with, for example, displayed content on an image display is difficult.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The drawing figures depict one or more implementations, by way of example only, not by way of limitations. In the figures, like reference numerals refer to the same or similar elements.
  • FIG. 1A is a side view of an example hardware configuration of an eyewear device, which includes a touch sensor on a temple, for use in identifying a finger gesture for adjusting an image presented on an image display of the eyewear device.
  • FIGS. 1B-C are rear views of example hardware configurations of the eyewear device of FIG. 1A, including two different types of image displays.
  • FIG. 2A shows a side view of a temple of the eyewear device of FIGS. 1A-C depicting a capacitive type touch sensor example.
  • FIG. 2B illustrates an external side view of a portion of the temple of the eyewear device of FIGS. 1A-C and FIG. 2A.
  • FIG. 2C illustrates an internal side view of the components of the portion of temple of the eyewear device of FIGS. 1A-C and FIG. 2B with a cross-sectional view of a circuit board with the touch sensor, a sensing circuit, an image display driver, and a processor.
  • FIG. 2D depicts a capacitive array pattern formed on the circuit board of FIG. 2C to receive finger contacts.
  • FIG. 3A shows an external side view of a temple of the eyewear device of FIG. 1 depicting another capacitive type touch sensor.
  • FIG. 3B illustrates an external side view of a portion of the temple of the eyewear device of FIGS. 1A-C and FIG. 3A.
  • FIG. 3C illustrates an internal side view of the components of the portion of temple of the eyewear device of FIGS. 1A-C and FIG. 3B with a cross-sectional view of a circuit board with the touch sensor, a sensing circuit, an image display driver, and a processor.
  • FIG. 3D depicts the capacitive array pattern formed on the circuit board of FIG. 3C to receive finger contacts.
  • FIGS. 4A-B show operation and a circuit diagram of the capacitive type touch sensor of FIGS. 2A-D and 3A-D to receive finger contacts and the sensing circuit to track the finger contacts.
  • FIG. 5A shows an external side view of a temple of the eyewear device of FIGS. 1A-C depicting a resistive type touch sensor example.
  • FIG. 5B illustrates an external side view of a portion of the temple of the eyewear device of FIGS. 1A-C and FIG. 5A.
  • FIG. 5C illustrates an internal side view of the components of the portion of temple of the eyewear device of FIGS. 1A-C and FIG. 5B with a cross-sectional view of a circuit board with the touch sensor, a sensing circuit, an image display driver, and a processor.
  • FIG. 5D depicts a resistive array pattern formed on the circuit board of FIG. 5C to receive finger contacts.
  • FIG. 6 shows operation and a circuit diagram of the resistive type touch sensor of FIGS. 5A-D to receive finger contacts.
  • FIGS. 7A-C illustrate press and hold detected touch events on the input surface of the touch sensor.
  • FIG. 8 illustrates finger pinching and unpinching detected touch events on the input surface of the touch sensor.
  • FIG. 9 illustrates finger rotation detected touch events on the input surface of the touch sensor.
  • FIG. 10 illustrates finger swiping detected touch events on the input surface of the touch sensor.
  • FIG. 11 is a high-level functional block diagram of an example finger activated touch sensor system including the eyewear device, a mobile device, and a server system connected via various networks.
  • DETAILED DESCRIPTION
  • In the following detailed description, numerous specific details are set forth by way of examples in order to provide a thorough understanding of the relevant teachings. However, it should be apparent to those skilled in the art that the present teachings may be practiced without such details. In other instances, well known methods, procedures, components, and circuitry have been described at a relatively high-level, without detail, in order to avoid unnecessarily obscuring aspects of the present teachings.
  • The term “coupled” as used herein refers to any logical, optical, physical or electrical connection, link or the like by which electrical signals produced or supplied by one system element are imparted to another coupled element. Unless described otherwise, coupled elements or devices are not necessarily directly connected to one another and may be separated by intermediate components, elements or communication media that may modify, manipulate or carry the electrical signals. The term “on” means directly supported by an element or indirectly supported by the element through another element integrated into or supported by the element.
  • The orientations of the eyewear device, associated components and any complete devices incorporating a touch sensor such as shown in any of the drawings, are given by way of example only, for illustration and discussion purposes. In operation for a particular touch sensing application, the eyewear device may be oriented in any other direction suitable to the particular application of the eyewear device, for example up, down, sideways, or any other orientation. Also, to the extent used herein, any directional term, such as front, rear, inwards, outwards, towards, left, right, lateral, longitudinal, up, down, upper, lower, top, bottom and side, are used by way of example only, and are not limiting as to direction or orientation of any touch sensor or component of a touch sensor constructed as otherwise described herein.
  • In an example, an eyewear device includes a frame, a temple connected to a lateral side of the frame, an image display, a processor, and a touch sensor. The touch sensor includes an input surface and a sensor array that is coupled to the input surface to receive at least one finger contact inputted from a user. The sensor array can be a capacitive array or a resistive array. The eyewear device further includes a sensing circuit integrated into or connected to the touch sensor and connected to the processor. The sensing circuit is configured to measure voltage to track the at least one finger contact on the input surface. The eyewear device further includes a memory accessible to the processor.
  • Additional objects, advantages and novel features of the examples will be set forth in part in the description which follows, and in part will become apparent to those skilled in the art upon examination of the following and the accompanying drawings or may be learned by production or operation of the examples. The objects and advantages of the present subject matter may be realized and attained by means of the methodologies, instrumentalities and combinations particularly pointed out in the appended claims.
  • Reference now is made in detail to the examples illustrated in the accompanying drawings and discussed below.
  • FIG. 1A is a side view of an example hardware configuration of an eyewear device 100, which includes a touch sensor 113 on a temple 125B. The touch sensor 113 identifies finger gestures for adjusting an image presented on an image display of an optical assembly 180B of the eyewear device 100. The touch gestures are inputs to the human-machine interface of the eyewear device 100 to perform specific actions in applications executing on the eyewear device 100 or to navigate through displayed images in an intuitive manner which enhances and simplifies the user experience. As shown in FIGS. 1A-C, the eyewear device 100 is in a form configured for wearing by a user, which are eyeglasses in the example of FIGS. 1A-C. The eyewear device 100 can take other forms and may incorporate other types of frameworks, for example, a headgear, a headset, or a helmet. It should be understood that in some examples, the touch sensor 113 may receive input in a manner other than finger contact, for example, a stylus or other mechanical input device.
  • In the eyeglasses example, eyewear device 100 includes a frame 105 including a left rim 107A connected to a right rim 107B via a bridge 106 adapted for a nose of the user. The left and right rims 107A-B include respective apertures 175A-B, which hold a respective optical assembly 180A-B. Optical assembly 180A-B can include various optical layers 176A-N and an image display device. The left and right temples 125A-B are connected to respective lateral sides of the frame 105, for example, via respective left and right chunks 110A-B. A substrate or materials forming the temple 125A-B can include plastic, acetate, metal, or a combination thereof. The chunks 110A-B can be integrated into or connected to the frame 105 on the lateral side.
  • Eyewear device 100 includes touch sensor 113 on the frame 105, the temple 125A-B, or the chunk 110A-B. The touch sensor 113 includes an input surface 181 and a capacitive array or a resistive array that is coupled to the input surface 181 to receive at least one finger contact input by a user. Although not shown in FIGS. 1A-B, eyewear device 100 includes a processor, a memory accessible to the processor, and a sensing circuit. The sensing circuit is integrated into or connected to the touch sensor 113 and is connected to the processor. The sensing circuit is configured to measure voltage to track the at least one finger contact on the input surface 181.
  • The eyewear device 100 includes programming in the memory. Execution of the programming by the processor configures the eyewear device 100 to perform functions, including functions to receive on the input surface 181 of the touch sensor 113 the at least one finger contact input by the user. The execution of the programming by the processor further configures the eyewear device 100 to track, via the sensing circuit, the at least one finger contact on the input surface 181. The execution of the programming by the processor further configures the eyewear device 100 to detect at least one touch event on the input surface 181 of the touch sensor 113 based on the at least one finger contact on the input surface 181.
  • A touch event represents when the state of contacts with the touch sensor 113 changes. The touch event can describe one or more points of contact with the touch sensor 113 and can include detecting movement, and the addition or removal of contact points. The touch event can be described by a position on the touch sensor 113, size, shape, amount of pressure, and time. The execution of the programming by the processor further configures the eyewear device 100 to identify a finger gesture based on the at least one detected touch event.
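  • A minimal sketch of a record that captures the touch-event attributes listed above (change of contact state, position, size, shape, pressure, and time); the field names are illustrative only and not taken from this disclosure.

      #include <cstdint>

      // How the set of contacts changed when this event was generated.
      enum class ContactChange : std::uint8_t { Added, Moved, Removed };

      struct TouchEventRecord {
          ContactChange change;          // addition, movement, or removal of a contact point
          float x;                       // position on the touch sensor
          float y;
          float major_axis;              // approximate size/shape of the contact patch
          float minor_axis;
          float pressure;                // amount of pressure, if the sensor reports it
          std::uint64_t timestamp_ms;    // time the change in contact state was detected
      };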
  • The execution of the programming by the processor further configures the eyewear device 100 to adjust an image presented on the image display of the optical assembly 180A-B based on the identified finger gesture. For example, when the at least one detected touch event is a single tap on the input surface 181 of the touch sensor 113, the identified finger gesture is selection or pressing of a graphical user interface element in the image presented on the image display of the optical assembly 180A-B. Hence, the adjustment to the image presented on the image display of the optical assembly 180A-B based on the identified finger gesture is a primary action which selects or submits the graphical user interface element on the image display of the optical assembly 180A-B for further display or execution. This is just one example of a supported finger gesture, and it should be understood that several finger gesture types are supported by the eyewear device 100 which can include single or multiple finger contacts. Examples of multiple finger contact detected touch events and identified finger gestures are provided in FIGS. 7-10. Moreover, in some examples, the touch sensor 113 may control other output components, such as a speaker of the eyewear device 100, with the touch sensor 113 controlling volume, for example.
  • Eyewear device 100 may include wireless network transceivers, for example cellular or local area network transceivers (e.g., WiFi or Bluetooth™), and run sophisticated applications. Some of the applications may include a web browser to navigate the Internet, an application to place phone calls, video or image codecs to watch videos or interact with pictures, codecs to listen to music, a turn-by-turn navigation application (e.g., to enter a destination address and view maps), an augmented reality application, and an email application (e.g., to read and compose emails). Gestures inputted on the touch sensor 113 can be used to manipulate and interact with the displayed content on the image display and control the applications.
  • While touch screens exist for mobile devices, such as tablets and smartphones, utilization of a touch screen in the lens of an eyewear device can interfere with the line of sight of the user of the eyewear device 100 and hinder the user's view. For example, finger touches can smudge the optical assembly 180A-B (e.g., optical layers, image display, and lens) and cloud or obstruct the user's vision. To avoid creating blurriness and poor clarity when the user's eyes look through the transparent portion of the optical assembly 180A-B, the touch sensor 113 is located on the right temple 125B.
  • Touch sensor 113 can include a sensor array, such as a capacitive or resistive array, for example, horizontal strips or vertical and horizontal grids to provide the user with variable slide functionality, or combinations thereof. In one example, the capacitive array or the resistive array of the touch sensor 113 is a grid that forms a two-dimensional rectangular coordinate system to track X and Y axes location coordinates. In another example, the capacitive array or the resistive array of the touch sensor 113 is linear and forms a one-dimensional linear coordinate system to track an X axis location coordinate. Alternatively or additionally, the touch sensor 113 may be an optical type sensor that includes an image sensor that captures images and is coupled to an image processor for digital processing along with a timestamp in which the image is captured. The timestamp can be added by a coupled sensing circuit 241 which controls operation of the touch sensor 113 and takes measurements from the touch sensor 113. The sensing circuit 241 uses algorithms to detect patterns of the finger contact on the input surface 181, such as ridges of the fingers, from the digitized images that are generated by the image processor. Light and dark areas of the captured images are then analyzed to track the finger contact and detect a touch event, which can be further based on a time that each image is captured.
  • Touch sensor 113 can enable several functions, for example, touching anywhere on the touch sensor 113 may highlight an item on the screen of the image display of the optical assembly 180A-B. Double tapping on the touch sensor 113 may select an item. Sliding (or swiping) a finger from front to back may slide or scroll in one direction, for example, to move to a previous video, image, page, or slide. Sliding the finger from back to front may slide or scroll in the opposite direction, for example, to move to the next video, image, page, or slide. Pinching with two fingers may provide a zoom-in function to zoom in on content of a displayed image. Unpinching with two fingers provides a zoom-out function to zoom out of content of a displayed image. The touch sensor 113 can be provided on both the left and right temples 125A-B to increase available functionality or on other components of the eyewear device 100, and in some examples, two, three, four, or more touch sensors 113 can be incorporated into the eyewear device 100 in different locations.
  • The type of touch sensor 113 depends on the intended application. For example, a capacitive type touch sensor 113 has limited functionality when the user wears gloves. Additionally, rain can trip false registers on the capacitive type touch sensor 113. A resistive type touch sensor 113, on the other hand, requires more applied force, which may not be optimal for the user wearing the eyewear device 100 on their head. Given these limitations, both capacitive and resistive type technologies can be leveraged by including multiple touch sensors 113 in the eyewear device 100.
  • In the example of FIG. 1A, the eyewear device includes at least one visible light camera 114 that is sensitive to the visible light range wavelength. As shown in the example, the visible light camera 114 has a frontward facing field of view. Examples of such a visible light camera 114 include a high resolution complementary metal-oxide-semiconductor (CMOS) image sensor and a video graphic array (VGA) camera, such as 640 p (e.g., 640×480 pixels for a total of 0.3 megapixels), 720 p, or 1080 p. Image sensor data from the visible light camera 114 is captured along with geolocation data, digitized by an image processor, stored in a memory, and displayed on the image display device of optical assembly 180A-B. In some examples, the touch sensor 113 is responsive to provide image or video capture via the visible light camera 114, for example, in response to any of the identified finger gestures disclosed herein.
  • FIGS. 1B-C are rear views of example hardware configurations of the eyewear device 100 of FIG. 1A, including two different types of image displays. In one example, the image display of optical assembly 180A-B includes an integrated image display. An example of such an integrated image display is disclosed in FIG. 5 of U.S. Pat. No. 9,678,338, filed Jun. 19, 2015, titled “Systems and Methods for Reducing Boot Time and Power Consumption in Wearable Display Systems,” which is incorporated by reference herein. As shown in FIG. 1B, the optical assembly 180A-B includes a suitable display matrix 170 of any suitable type, such as a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or any other such display. The optical assembly 180A-B also includes an optical layer or layers 176, which can include lenses, optical coatings, prisms, mirrors, waveguides, optical strips, and other optical components in any combination. The optical layers 176A-N can include a prism having a suitable size and configuration and including a first surface for receiving light from display matrix and a second surface for emitting light to the eye of the user. The prism of the optical layers 176A-N extends over all or at least a portion of the respective apertures 175A-B formed in the left and right rims 107A-B to permit the user to see the second surface of the prism when the eye of the user is viewing through the corresponding left and right rims 107A-B. The first surface of the prism of the optical layers 176A-N faces upwardly from the frame 105 and the display matrix overlies the prism so that photons and light emitted by the display matrix impinge the first surface. The prism is sized and shaped so that the light is refracted within the prism and is directed towards the eye of the user by the second surface of the prism of the optical layers 176A-N. In this regard, the second surface of the prism of the optical layers 176A-N can be convex so as to direct the light towards the center of the eye. The prism can optionally be sized and shaped so as to magnify the image projected by the display matrix 170, and the light travels through the prism so that the image viewed from the second surface is larger in one or more dimensions than the image emitted from the display matrix 170.
  • In another example, the image display device of optical assembly 180A-B includes a projection image display as shown in FIG. 1C. An example of a projection image display is disclosed in FIG. 6 of U.S. Pat. No. 9,678,338, filed Jun. 19, 2015, titled "Systems and Methods for Reducing Boot Time and Power Consumption in Wearable Display Systems," which is incorporated by reference herein. The optical assembly 180A-B includes a laser projector 150, which is a three-color laser projector using a scanning mirror or galvanometer. During operation, an optical source such as a laser projector 150 is disposed in or on one of the temples 125A-B of the eyewear device 100. Optical assembly 180A-B includes one or more optical strips 155A-N spaced apart across the width of the lens of the optical assembly 180A-B or across a depth of the lens between the front surface and the rear surface of the lens.
  • As the photons projected by the laser projector 150 travel across the lens of the optical assembly 180A-B, the photons encounter the optical strips 155A-N. When a particular photon encounters a particular optical strip, it is either redirected towards the user's eye, or it passes to the next optical strip. Specific photons or beams of light may be controlled by a combination of modulation of laser projector 150, and modulation of optical strips 155A-N. In an example, a processor controls optical strips 155A-N by initiating mechanical, acoustic, or electromagnetic signals. Although shown as having two optical assemblies 180A-B, the eyewear device 100 can include other arrangements, such as a single or three optical assemblies, or the optical assembly 180A-B may have a different arrangement depending on the application or intended user of the eyewear device 100.
  • As further shown in FIG. 1B, eyewear device 100 includes a left chunk 110A adjacent the left lateral side 170A of the frame 105 and a right chunk 110B adjacent the right lateral side 170B of the frame 105. The chunks 110A-B may be integrated into the frame 105 on the respective lateral sides 170A-B (as illustrated) or implemented as separate components attached to the frame 105 on the respective sides 170A-B. Alternatively, the chunks 110A-B may be integrated into temples 125A-B attached to the frame 105.
  • FIG. 2A shows a side view of a temple 125B of the eyewear device 100 of FIG. 1A depicting a capacitive type touch sensor 113 example. As shown, the right temple 125B includes the touch sensor 113 and the touch sensor 113 has an input surface 181. A protruding ridge 281 surrounds the input surface 181 of the touch sensor 113 to indicate to the user an outside boundary of the input surface 181 of the touch sensor 113. The protruding ridge 281 orients the user by indicating to the user that their finger is on top of the touch sensor 113 and is in the correct position to manipulate the touch sensor 113.
  • FIG. 2B illustrates an external side view of a portion of the temple of the eyewear device 100 of FIGS. 1A-C and FIG. 2A. In the capacitive type touch sensor 113 example of FIGS. 2A-D, plastic or acetate form the right temple 125B. The right temple 125B is connected to the right chunk 110B via the right hinge 126B.
  • FIG. 2C illustrates an internal side view of the components of the portion of temple of the eyewear device 100 of FIGS. 1A-C and FIG. 2B with a cross-sectional view of a circuit board 240 with the touch sensor 113, a sensing circuit 241, an image display driver 242, and a processor 243. Although the circuit board 240 is a flexible printed circuit board (PCB), it should be understood that the circuit board 240 can be rigid in some examples. In some examples, the frame 105 or the chunk 110A-B can include the circuit board 140 that includes the touch sensor 113. In one example, sensing circuit 241 includes a dedicated microprocessor integrated circuit (IC) customized for processing sensor data from the touch sensor 113, along with volatile memory used by the microprocessor to operate. In some examples, the sensing circuit 241 and processor 243 may not be separate components, for example, functions and circuitry implemented in the sensing circuit 241 can be incorporated or integrated into the processor 243 itself.
  • Image display driver 242 commands and controls the image display of the optical assembly 180A-B. Image display driver 242 may deliver image data directly to the image display of the optical assembly 180A-B for presentation or may have to convert the image data into a signal or data format suitable for delivery to the image display device. For example, the image data may be video data formatted according to compression formats, such as H.264 (MPEG-4 Part 10), HEVC, Theora, Dirac, RealVideo RV40, VP8, VP9, or the like, and still image data may be formatted according to compression formats such as Portable Network Graphics (PNG), Joint Photographic Experts Group (JPEG), Tagged Image File Format (TIFF), or Exchangeable Image File Format (Exif), or the like.
  • The touch sensor 113 is disposed on the flexible printed circuit board 240. The touch sensor 113 includes a capacitive array that is coupled to the input surface 181 to receive at least one finger contact inputted from a user. The sensing circuit 241 is integrated into or connected to the touch sensor 113 and connected to the processor 243. The sensing circuit 241 is configured to measure voltage to track the at least one finger contact on the input surface 181.
  • FIG. 2D depicts a capacitive array pattern formed on the circuit board of FIG. 2C to receive finger contacts. The pattern of the capacitive array 214 of the touch sensor 113 includes patterned conductive traces formed of at least one metal, indium tin oxide, or a combination thereof on the flexible printed circuit board 240. In the example, the conductive traces are rectangular shaped copper pads.
  • FIG. 3A shows an external side view of a temple 125B of the eyewear device 100 of FIG. 1 depicting another capacitive type touch sensor 113. Similar to the example of FIGS. 2A-D, the right temple 125B includes the touch sensor 113 and the touch sensor 113 has a protruding ridge 281 that surrounds an input surface 181. FIG. 3B illustrates an external side view of a portion of the temple 125B of the eyewear device 100 of FIG. 1A and FIG. 3A. Metal may form the right temple 125B and a plastic external layer can cover the metal layer.
  • FIG. 3C illustrates an internal side view of the components of the portion of temple 125B of the eyewear device 100 of FIG. 1A and FIG. 3B with a cross-sectional view of a circuit board 240 with the touch sensor 113, a sensing circuit 241, an image display driver 242, and a processor 243. Similar to FIG. 2C, the touch sensor 113 is disposed on the flexible printed circuit board 240. Various electrical interconnect(s) 294 are formed to convey electrical signals from the input surface 181 to the flexible printed circuit board 240. FIG. 3D depicts a pattern of the capacitive array 214 formed on the flexible printed circuit board 240 of FIG. 3C to receive finger contacts similar to FIG. 2C.
  • FIGS. 4A-B show operation and a circuit diagram of the capacitive type touch sensor 113 of FIGS. 2A-D and 3A-D to receive finger contacts and the sensing circuit 241 to track the finger contacts 410. The view of FIG. 4A is intended to give a cross-sectional view of two capacitors of the capacitive array 214 of the touch sensor 113 of FIGS. 2A-D and 3A-D, and the coupled sensing circuit 241. As shown, the touch sensor 113 includes the capacitive array 214 formed by capacitors, including capacitors CA and CB. The capacitive array 214 includes multiple patterned conductive sensor electrodes 415A-B, and it should be understood that although only two sensor electrodes are shown, the number can be 20, 100, 1000, etc. or essentially any number depending on the application. In one example, the capacitive array 214 includes 100 sensor electrodes, in other examples, the 100 sensor electrodes are arranged in a 10×10 grid. The sensor electrodes 415A-B are connected to the flexible printed circuit board 240 and disposed below the input surface 181. At least one respective electrical interconnect connects the sensing circuit 241 to the sensor electrodes 415A-B. The sensing circuit 241 is configured to measure capacitance changes of each of the sensor electrodes 415A-B of the capacitive array 214. In the example, the sensor electrodes 415A-B are rectangular patterned conductive traces formed of at least one of metal, indium tin oxide, or a combination thereof.
  • Since the capacitors CA and CB of the capacitive array 214 store electrical charge, connecting them up to conductive plates on the input surface 181 allows the capacitors to track the details of finger contacts 410. Charge stored in the capacitor CA changes slightly (e.g., the charge becomes higher) when the finger is placed over the conductive plates of capacitor CA, while an air gap will leave the charge at the capacitor CB relatively unchanged (e.g., the charge remains lower). As shown in FIG. 4B, the sensing circuit 241 can include an op-amp integrator circuit which can track these changes in capacitance of capacitive array 214, and the capacitance changes can then be recorded by an analog-to-digital converter (ADC) and stored in a memory along with timing data of when the capacitance change is sensed.
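  • The sensing loop implied above can be sketched as follows; adc_read_millivolts() and now_ms() are hypothetical platform calls, and the baseline/threshold scheme is an illustrative assumption rather than the disclosed circuit.

      #include <cstdint>
      #include <vector>

      struct CapacitanceSample {
          int electrode;               // index into the capacitive array
          int millivolts;              // integrator output digitized by the ADC
          std::uint64_t time_ms;       // when the capacitance change was sensed
      };

      // Placeholder platform calls for the ADC readout and a millisecond timer.
      int adc_read_millivolts(int /*electrode*/) { return 0; }
      std::uint64_t now_ms() { return 0; }

      // Scan every electrode and record those whose reading rises above its idle baseline,
      // which indicates a finger over that electrode's conductive plates.
      std::vector<CapacitanceSample> scan_capacitive_array(const std::vector<int>& baseline_mv,
                                                           int threshold_mv) {
          std::vector<CapacitanceSample> touched;
          for (int e = 0; e < static_cast<int>(baseline_mv.size()); ++e) {
              int mv = adc_read_millivolts(e);
              if (mv - baseline_mv[e] > threshold_mv) {
                  touched.push_back({e, mv, now_ms()});
              }
          }
          return touched;
      }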
  • Hence, the sensing circuit 241 is further configured to determine a respective location coordinate and a respective input time of the at least one finger contact 410 on the input surface 181. Execution of the programming by the processor configures the eyewear device 100 to perform functions, including functions to track, via the sensing circuit 241, the respective location coordinate and the respective input time of the at least one finger contact on the input surface 181. The function to detect the at least one touch event on the input surface 181 of the touch sensor 113 is based on the at least one respective location coordinate and the respective input time of the at least one finger contact 410.
  • FIG. 5A shows an external side view of a temple 125B of the eyewear device of FIGS. 1A-C depicting a resistive type touch sensor 113 on the temple 125B. Similar to the example of FIGS. 2A-D, the right temple 125B includes the touch sensor 113 and the touch sensor 113 has an input surface 181 surrounded by a protruding ridge 281. In this example, however, the touch sensor 113 includes a resistive array 514. FIG. 5B illustrates an external side view of a portion of the temple of the eyewear device 100 of FIG. 5A. Plastic or metal may form the right temple 125B.
  • FIG. 5C illustrates an internal side view of the components of the portion of temple of the eyewear device of FIGS. 1A-C and FIG. 5A with a cross-sectional view of a circuit board 540 with the touch sensor 113, a sensing circuit 241, an image display driver 242, and a processor 243. Similar to FIG. 2C, the touch sensor 113 is disposed on the flexible printed circuit board 540. Various electrical interconnect(s) 294 are formed to convey electrical signals from the input surface 181 to the flexible printed circuit board 540. FIG. 5D depicts a pattern of the resistive array 514 formed on the circuit board 540 of FIG. 5C to receive finger contacts similar to FIG. 2C. The flexible printed circuit board 540 is an air gapped dual layer flexible printed circuit board with a resistive pattern thereon.
  • As shown, the resistive array 514 includes two conductive layers, including a first conductive layer 583 (e.g., ground) and a second conductive layer 585 (e.g., signal). An air gap 584 between the two conductive layers 583 and 585 separates the first and second conductive layers. The first and second conductive layers 583 and 585 of the resistive array 514 can include rectangular patterned conductive traces formed of at least one metal, indium tin oxide, or a combination thereof. The two conductive layers 583 and 585 are connected to the flexible printed circuit board 540 and are disposed below the input surface 181 of the touch sensor 113.
  • When the outer first conductive layer 583 is pressed so that it makes contact with the inner second conductive layer 585, an electrical connection is made between the layers. In effect, this closes an electrical switch with the voltage measurements on the resistive array 514 taken by the sensing circuit 241 being directly correlated to where the touch sensor 113 is touched. A voltage gradient is applied either in a horizontal or a vertical direction of the resistive array 514 to acquire the X or Y location coordinates of the finger contact, and the measurement is repeated for the other direction, requiring two measurements in total. The sensing circuit 241 of the eyewear device 100 correlates the voltage measurement to the location coordinates of the finger contact.
  • FIG. 6 shows operation and a circuit diagram of the resistive type touch sensor of FIGS. 5A-D to receive finger contacts. The view of FIG. 6 is intended to give a cross-sectional view of a single resistor of the resistive array 514 of the touch sensor 113 of FIG. 5A, and the coupled sensing circuit (not shown). The first conductive layer 583 and the second conductive layer 585 are separated by insulating spacers 570A-B (shown as dots) to form an air gap 584 between the two conductive layers 583 and 585 which may be deposited or layered on respective substrates.
  • The sensing circuit 241 (not shown) is connected to the flexible printed circuit board 540 and connected to the two conductive layers 583 and 585 and configured to measure a voltage drop between the two conductive layers 583 and 585 in response to the at least one finger contact 410. In an example, the second conductive layer 585 is deposited on the flexible printed circuit board 540 and is separated from the first conductive layer 583 by the insulating spacers 570A-B. A flexible layer of protective insulation may be layered on the first conductive layer 583.
  • In one example, the sensing circuit 241 can track touch location coordinates on the resistive array 514 using four wires that are connected to the sensing circuit 241 and the conductive layers 583 and 585. Two wires are connected to the left and right sides of the second conductive layer 585, and two wires are connected to the top and bottom of the first conductive layer 583. A voltage gradient is applied across the first conductive layer 583 and when contact is made with the first conductive layer 583 the resulting circuit mimics a voltage divider. The voltage is then probed at the first conductive layer 583 to determine the x-coordinate of the touch location. This process is repeated for the y-axis by applying a potential across the first conductive layer 583 and measuring the voltage of the second conductive layer 585. In some examples, the sensing circuit 241 may employ a 5-wire method with a fifth wire behaving as a top layer voltage probe, in which the second conductive layer 585 is utilized for both X and Y-axis measurements.
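  • A hedged sketch of the voltage-divider readout: with a gradient applied along one axis, the probed voltage is (to first order) proportional to the contact position along that axis; the function names and the linear scaling are illustrative assumptions rather than the disclosed implementation.

      // Placeholder readouts: with the gradient applied along one axis, probe the voltage
      // produced by the divider formed at the contact point, then repeat for the other axis.
      float read_probe_voltage_x() { return 0.0f; }
      float read_probe_voltage_y() { return 0.0f; }

      struct TouchPoint { float x; float y; };

      // Convert the two probed voltages to location coordinates: the probed voltage scales
      // linearly with the contact position along the applied gradient.
      TouchPoint resistive_touch_location(float supply_volts, float width_mm, float height_mm) {
          float vx = read_probe_voltage_x();
          float vy = read_probe_voltage_y();
          return { (vx / supply_volts) * width_mm, (vy / supply_volts) * height_mm };
      }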
  • FIGS. 7-10 illustrate several examples of multiple finger contact detected touch events and identified finger gestures. In each of the examples of FIGS. 7-10, the function to receive on the input surface 181 of the touch sensor 113 the at least one finger contact input by the user includes functions to: receive on the input surface 181 of the touch sensor 113 a first finger contact input by the user at a first input time; and receive on the input surface 181 of the touch sensor 113 a second finger contact 710B input by the user at a second input time which is within a predetermined time period of the first input time.
  • Further, in each of the examples of FIGS. 7-10, the function to detect the at least one touch event on the input surface 181 of the touch sensor 113 based on the at least one finger contact inputted from the user includes functions to: detect a first touch event on the input surface 181 of the touch sensor 113 based on the first finger contact inputted from the user at the first input time; and detect a second touch event on the input surface 181 of the touch sensor 113 based on the second finger contact inputted from the user at the second input time within the predetermined time period of the first input time. The function to identify the finger gesture is based on the first and second detected touch events, the first input time, the second input time, and the predetermined time period.
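  • The pairing of a first and second finger contact within the predetermined time period can be sketched as follows; the Contact type and the pair_contacts() helper are hypothetical names for illustration only.

      #include <cstdint>
      #include <optional>
      #include <utility>

      struct Contact { float x; float y; std::uint64_t time_ms; };

      // Pair a first and second finger contact into the first and second touch events of a
      // multi-finger gesture only when the second arrives within the predetermined period.
      std::optional<std::pair<Contact, Contact>>
      pair_contacts(const Contact& first, const Contact& second, std::uint64_t max_gap_ms) {
          if (second.time_ms >= first.time_ms && second.time_ms - first.time_ms <= max_gap_ms) {
              return std::make_pair(first, second);  // eligible for multi-finger gesture identification
          }
          return std::nullopt;                       // otherwise treat as independent single-finger input
      }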
  • FIGS. 7A-C illustrate press and hold detected touch events on the input surface 181 of the touch sensor 113. As shown, multiple finger contacts occur on the touch sensor 113, which include pressing (the first finger contact 710A), holding (the second finger contact 710B), and no finger contact 710C by releasing the touch sensor 113. Accordingly, the first and second detected touch events are a press and hold on the input surface 181 of the touch sensor 113. The identified finger gesture is a press and hold of a graphical user interface element in the image presented on the image display. The adjustment to the image presented on the image display based on the identified finger gesture is configured to allow a drag and drop (e.g., move) of the graphical user interface element on the image display or provide display options (e.g., a context menu associated with the graphical user interface element).
  • FIG. 8 illustrates finger pinching and unpinching detected touch events on the input surface 181 of the touch sensor 113. Multiple finger contacts occur on the touch sensor 113, in which two fingers (first finger contact 810A and second finger contact 810B) move apart from each other (finger unpinching) or move toward each other (finger pinching). In the finger pinching detected touch event example, the first and second detected touch events are finger pinching on the input surface 181 of the touch sensor 113. The identified finger gesture is a zoom in of the image presented on the image display. The adjustment to the image presented on the image display based on the identified finger gesture zooms in on the image presented on the image display.
  • In the finger unpinching detected touch event example, the first and second detected touch events are finger unpinching on the input surface of the touch sensor 113. The identified finger gesture is a zoom out of the image presented on the image display. The adjustment to the image presented on the image display based on the identified finger gesture zooms out of the image presented on the image display.
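  • A minimal sketch of classifying the two tracked contacts as pinching or unpinching by comparing their separation at the start and end of the gesture; per the description above, a pinch maps to a zoom in and an unpinch to a zoom out. The names and the min_change threshold are illustrative assumptions.

      #include <cmath>

      struct Point2 { float x; float y; };

      enum class PinchGesture { None, Pinch, Unpinch };

      float separation(Point2 a, Point2 b) { return std::hypot(a.x - b.x, a.y - b.y); }

      // Compare the finger separation at the start and end of the two tracked contacts.
      PinchGesture classify_pinch(Point2 f1_start, Point2 f2_start,
                                  Point2 f1_end, Point2 f2_end, float min_change) {
          float delta = separation(f1_end, f2_end) - separation(f1_start, f2_start);
          if (delta >  min_change) return PinchGesture::Unpinch;  // fingers moved apart
          if (delta < -min_change) return PinchGesture::Pinch;    // fingers moved together
          return PinchGesture::None;
      }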
  • FIG. 9 illustrates finger rotation detected touch events on the input surface 181 of the touch sensor 113. As shown, multiple finger contacts occur on the touch sensor 113, which include continuously rotating two fingers in a circle from two initial points, a first finger contact 910A and a second finger contact 910B, to two final points of contact for those two fingers. In some examples, only one finger may be rotated in a circle. The first and second detected touch events are finger rotation on the input surface 181 of the touch sensor 113. The identified finger gesture is a finger rotation of the image presented on the image display. The adjustment to the image presented on the display based on the identified finger gesture rotates the image presented on the image display, for example, to rotate a view. The rotation gesture can occur when two fingers rotate around each other.
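  • A rotation angle for such a gesture can be derived from the two contacts as sketched below; the names are illustrative and the angle convention is an assumption, not part of this disclosure.

      #include <cmath>

      struct Point2 { float x; float y; };

      constexpr float kPi = 3.14159265f;

      // Signed rotation (radians) between the line joining the two contacts at the start of
      // the gesture and the line joining them at the end; this angle could drive the image
      // rotation described above.
      float rotation_angle(Point2 f1_start, Point2 f2_start, Point2 f1_end, Point2 f2_end) {
          float a0 = std::atan2(f2_start.y - f1_start.y, f2_start.x - f1_start.x);
          float a1 = std::atan2(f2_end.y   - f1_end.y,   f2_end.x   - f1_end.x);
          float d  = a1 - a0;
          if (d >  kPi) d -= 2.0f * kPi;   // wrap into (-pi, pi]
          if (d < -kPi) d += 2.0f * kPi;
          return d;
      }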
  • FIG. 10 illustrates finger swiping detected touch events on the input surface 181 of the touch sensor 113. As shown, multiple finger contacts occur on the touch sensor 113, which include dragging one finger left or right from a point of initial finger contact 1010A to a final point of second finger contact 1010B or 1010C. The first and second detected touch events are finger swiping from front to back (1010A to 1010C) or back to front (1010A to 1010B) on the input surface 181 of the touch sensor 113. The identified finger gesture is a scroll of the image presented on the image display. The adjustment to the image presented on the image display based on the identified finger gesture scrolls the image presented on the image display. As shown, such a scroll or swipe gesture can occur when the user moves one or more fingers across the screen in a specific horizontal direction without significantly deviating from the main direction of travel; however, it should be understood that the direction of travel can be vertical as well, for example if the touch sensor 113 is an X and Y coordinate grid or a vertical strip.
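  • A minimal sketch of classifying such a swipe, requiring sufficient travel along the sensor's horizontal axis without significant deviation from it; the sign convention (positive X toward the rear of the temple) and the thresholds are assumptions for illustration.

      #include <cmath>

      struct Point2 { float x; float y; };

      enum class Swipe { None, FrontToBack, BackToFront };

      // Classify a single-finger drag as a swipe when it travels far enough along the
      // horizontal axis of the sensor without significantly deviating from that direction.
      Swipe classify_swipe(Point2 start, Point2 end, float min_travel, float max_cross_ratio) {
          float dx = end.x - start.x;
          float dy = end.y - start.y;
          if (std::fabs(dx) < min_travel) return Swipe::None;                        // too short
          if (std::fabs(dy) > max_cross_ratio * std::fabs(dx)) return Swipe::None;   // deviated
          return dx > 0.0f ? Swipe::FrontToBack : Swipe::BackToFront;
      }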
  • FIG. 11 is a high-level functional block diagram of an example finger activated touch sensor system. The system 1100 includes eyewear device 100, mobile device 1190, and server system 1198. Mobile device 1190 may be a smartphone, tablet, laptop computer, access point, or any other such device capable of connecting with eyewear device 100 using both a low-power wireless connection 1125 and a high-speed wireless connection 1137. Mobile device 1190 is connected to server system 1198 and network 1195. The network 1195 may include any combination of wired and wireless connections.
  • Server system 1198 may be one or more computing devices as part of a service or network computing system, for example, that include a processor, a memory, and network communication interface to communicate over the network 1195 with the mobile device 1190 and eyewear device 100.
  • Low-power wireless circuitry 1124 and the high-speed wireless circuitry 1136 of the eyewear device 100 can include short range transceivers (Bluetooth™) and wireless wide or local area network transceivers (e.g., cellular or WiFi). Mobile device 1190, including the transceivers communicating via the low-power wireless connection 1125 and high-speed wireless connection 1137, may be implemented using details of the architecture of the eyewear device 100, as can other elements of network 1195.
  • Output components of the eyewear device 100 include visual components, such as the image display of the optical assembly 180 as described in FIGS. 1B-C (e.g., a display such as a liquid crystal display (LCD), a plasma display panel (PDP), a light emitting diode (LED) display, or a projector). The image display of the optical assembly 180 is driven by the image display driver 242. The output components of the eyewear device 100 further include acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor), other signal generators, and so forth. The input components of the eyewear device 100 include the touch sensor 113, and various components of the system, including the mobile device 1190 and server system 1198, may include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point-based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instruments), tactile input components (e.g., a physical button, a touch screen that provides location and force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like.
  • System 1100 may optionally include additional peripheral device elements 1119. Such peripheral device elements 1119 may include biometric sensors, additional sensors, or display elements integrated with eyewear device 100. For example, peripheral device elements 1119 may include any I/O components including output components, motion components, position components, or any other such elements described herein.
  • For example, the biometric components of the system include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram based identification), and the like. The motion components include acceleration sensor components (e.g., accelerometer), gravitation sensor components, rotation sensor components (e.g., gyroscope), and so forth. The position components include location sensor components to generate location coordinates (e.g., a Global Positioning System (GPS) receiver component), WiFi or Bluetooth™ transceivers to generate positioning system coordinates, altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like. Such positioning system coordinates can also be received over wireless connections 1125 and 1137 from the mobile device 1190 via the low-power wireless circuitry 1124 or high-speed wireless circuitry 1136.
  • Eyewear device 100 includes a touch sensor 113, visible light camera 114, image display of the optical assembly 180, sensing circuit 241, image display driver 242, image processor 1112, low-power circuitry 1120, and high-speed circuitry 1130. The components shown in FIG. 11 for the eyewear device 100 are located on one or more circuit boards, for example a PCB or flexible PCB, in the temples. Alternatively or additionally, the depicted components can be located in the chunks, frames, hinges, or bridge of the eyewear device 100. Visible light camera 114 can include digital camera elements such as a complementary metal-oxide-semiconductor (CMOS) image sensor, a charge coupled device, a lens, or any other visible-light capturing elements that may be used to capture data.
  • Touch sensor 113 can receive user input commands (e.g., finger contacts) as input, and the sensing circuit 241, along with the depicted gesture application 1144 stored in memory 1134, can track those finger contacts and identify particular input gestures. In one implementation, the identified gesture causes a user input signal to be sent to the low-power processor 243A. In some examples, the touch sensor 113 is located on different portions of the eyewear device 100, such as on a different temple, chunk, or the frame, but is electrically connected via a circuit board to the visible light camera 114, sensing circuit 241, image processor 1112, image display driver 242, and image display of the optical assembly 180.
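As an editorial illustration of the tracking and identification step just described, the following Python sketch shows one way buffered finger-contact samples could be classified as a tap or a swipe. The names (TouchSample, identify_gesture) and the numeric thresholds are assumptions for illustration only and are not part of the disclosed implementation.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TouchSample:
    x: float  # location coordinate on the input surface (X axis)
    y: float  # location coordinate on the input surface (Y axis)
    t: float  # input time, in seconds

def identify_gesture(samples: List[TouchSample]) -> Optional[str]:
    """Classify a buffered sequence of finger-contact samples.

    A short contact with little movement is treated as a tap; a larger
    horizontal displacement is treated as a front/back swipe. Thresholds
    are illustrative only.
    """
    if not samples:
        return None
    duration = samples[-1].t - samples[0].t
    dx = samples[-1].x - samples[0].x
    dy = samples[-1].y - samples[0].y
    if duration < 0.3 and abs(dx) < 5 and abs(dy) < 5:
        return "single_tap"
    if abs(dx) > 20 and abs(dx) > abs(dy):
        return "swipe_front" if dx > 0 else "swipe_back"
    return None

# Example: samples reported while a finger moves from front to back
samples = [TouchSample(0, 0, 0.00), TouchSample(15, 1, 0.10), TouchSample(40, 2, 0.20)]
print(identify_gesture(samples))  # -> "swipe_front"
```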
  • In one example, interaction with the touch sensor 113 by the user, e.g., tactile input, can be processed by low-power processor 243A as a request to capture a single image with the visible light camera 114. Tactile input for a first period of time may be processed by low-power processor 243A as a request to capture video data while the touch sensor 113 is being contacted by a finger, and to cease video capture when no finger contact is detected on the touch sensor 113, with the video captured while the touch sensor 113 was continuously contacted stored as a single video file. In certain embodiments, the low-power processor 243A may apply a threshold time period to the inputted touch gesture, such as 500 milliseconds or one second, below which the finger contact with the touch sensor 113 is processed as an image request, and above which the finger contact with the touch sensor 113 is interpreted as a video request. Image processor 1112 includes circuitry to receive signals from the visible light camera 114 and process those signals from the visible light camera 114 into a format suitable for storage in the memory 1134.
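A minimal sketch of that image-versus-video decision, assuming a hypothetical 500 ms threshold and an illustrative handler name (handle_touch_release); the actual firmware logic running on the low-power processor is not disclosed at this level of detail.

```python
IMAGE_VIDEO_THRESHOLD_S = 0.5  # e.g., 500 milliseconds (illustrative value)

def handle_touch_release(contact_start: float, contact_end: float) -> str:
    """Decide how a completed finger contact should be interpreted.

    Contacts shorter than the threshold are treated as a single-image
    request; longer contacts are treated as a video request whose capture
    spans the duration of the finger contact.
    """
    held_for = contact_end - contact_start
    if held_for < IMAGE_VIDEO_THRESHOLD_S:
        return "capture_single_image"
    return "store_video_of_contact_duration"

print(handle_touch_release(0.0, 0.2))  # short tap  -> single image request
print(handle_touch_release(0.0, 2.5))  # long hold  -> single video file
```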
  • Memory 1134 includes various captured images, videos, and a gesture application 1144 to perform the functions of the programming described herein, for example the gesture identification operations outlined in further detail in FIGS. 1-10. Although shown as an application, it should be understood that the gesture application 1144 can be part of the operating system stored in the memory 1134 of the eyewear device 100 and can provide an application programming interface (API) that is responsive to calls from other applications. Identified gestures can be utilized to allow the user to interact with and manipulate various applications, including the depicted augmented reality application 1145, web browser application 1146, turn-by-turn navigation application 1147, phone application 1148, photo and video viewer application 1149, music player application 1150, and email application 1151. Through a series of one or more calls to the API of the gesture application 1144, the applications 1145-1151 can manipulate and interact with the content (e.g., a graphical user interface) displayed on the image display of the optical assembly 180. For example, an API call to the gesture application 1144 can return identified finger gestures. In response to the identified finger gestures, the applications 1145-1151 can adjust the image presented on the display based on the identified finger gesture. In some examples, the underlying detected touch events of the identified finger gesture may also be returned to the applications 1145-1151 by the API call to the gesture application 1144. This can allow custom gestures to be developed and implemented in the applications 1145-1151 for identification (e.g., via a software development kit), with resulting adjustments to images presented on the display based on the identified finger gesture.
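The sketch below illustrates, under assumed names, how a client application such as a photo viewer might retrieve identified gestures (and their underlying touch events) from a gesture-application API and adjust the presented image. GestureAPI, push, poll, and photo_viewer_update are hypothetical stand-ins for whatever interface the operating system actually exposes.

```python
class GestureAPI:
    """Hypothetical API surface of the gesture application."""

    def __init__(self):
        self._pending = []

    def push(self, gesture: str, touch_events: list) -> None:
        # Called by the gesture application when a finger gesture is identified.
        self._pending.append((gesture, touch_events))

    def poll(self):
        # Called by client applications (1145-1151) to retrieve the identified
        # gesture and, optionally, the underlying detected touch events.
        return self._pending.pop(0) if self._pending else None


def photo_viewer_update(api: GestureAPI, zoom: float) -> float:
    """Adjust the presented image in response to an identified gesture."""
    result = api.poll()
    if result is None:
        return zoom
    gesture, _events = result
    if gesture == "finger_pinch":
        return zoom * 1.25   # zoom in on the presented image
    if gesture == "finger_unpinch":
        return zoom * 0.8    # zoom out of the presented image
    return zoom

api = GestureAPI()
api.push("finger_pinch", [("down", 0.0), ("move", 0.1)])
print(photo_viewer_update(api, zoom=1.0))  # -> 1.25
```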
  • As noted above, eyewear device 100 may include cellular wireless network transceivers or other wireless network transceivers (e.g., WiFi or Bluetooth™), and run sophisticated applications. Some of the applications may include web browsers to navigate the Internet, a phone application to place phone calls, video or image codecs to watch videos or interact with pictures, codecs to listen to music, a turn-by-turn navigation application, an augmented or virtual reality application, or an email application. Gestures inputted on the touch sensor 113 can be used to manipulate and interact with the displayed content on the image display of the optical assembly 180 and control the applications.
  • Following are some examples of finger gestures that can be identified by the API of the gesture application 1144, along with use cases. The API of the gesture application 1144 can be configured to enable gestures to navigate the Internet in the web browser application 1146. The API of the gesture application 1144 can be configured to enable gestures to enter addresses or zoom in and out of maps and locations displayed in the turn-by-turn navigation application 1147. The API of the gesture application 1144 can be configured to enable gestures to select a contact or enter a phone number to place phone calls in the phone application 1148. The API of the gesture application 1144 can be configured to enable gestures to view photos by swiping or to select videos to view in the photo and video viewer application 1149, including pause, stop, play, etc. The API of the gesture application 1144 can be configured to enable gestures to select audio files to be played in the music player application 1150, including pause, stop, play, etc. The API of the gesture application 1144 can be configured to enable gestures to read, send, delete, and compose emails in the email application 1151.
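A table-driven sketch of how those per-application use cases could be wired to identified gestures; the gesture names, application keys, and action strings are assumptions chosen for illustration, not the disclosed mapping.

```python
# Hypothetical mapping of identified finger gestures to per-application actions,
# mirroring the use cases listed above.
GESTURE_ACTIONS = {
    "web_browser":  {"swipe_front": "scroll_page", "single_tap": "follow_link"},
    "navigation":   {"finger_pinch": "zoom_in_map", "finger_unpinch": "zoom_out_map"},
    "phone":        {"single_tap": "select_contact"},
    "photo_viewer": {"swipe_front": "next_photo", "swipe_back": "previous_photo"},
    "music_player": {"single_tap": "play_pause"},
    "email":        {"swipe_back": "delete_message"},
}

def dispatch(application: str, gesture: str) -> str:
    """Look up the action an application takes for an identified gesture."""
    return GESTURE_ACTIONS.get(application, {}).get(gesture, "ignore")

print(dispatch("photo_viewer", "swipe_front"))  # -> "next_photo"
print(dispatch("email", "single_tap"))          # -> "ignore"
```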
  • Image processor 1112, touch sensor 113, and sensing circuit 241 are structured within eyewear device 100 such that the components may be powered on and booted under the control of low-power circuitry 1120. Image processor 1112, touch sensor 113, and sensing circuit 241 may additionally be powered down by low-power circuitry 1120. Depending on various power design elements associated with image processor 1112, touch sensor 113, and sensing circuit 241, these components may still consume a small amount of power even when in an off state. This power will, however, be negligible compared to the power used by image processor 1112, touch sensor 113, and sensing circuit 241 when in an on state, and will also have a negligible impact on battery life. As described herein, device elements in an “off” state are still configured within a device such that low-power processor 243A is able to power on and power down the devices. A device that is referred to as “off” or “powered down” during operation of eyewear device 100 does not necessarily consume zero power due to leakage or other aspects of a system design.
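A simplified sketch of the power gating described above, with hypothetical class and method names; it models only the bookkeeping the low-power processor might keep for which components are on or off, not actual hardware power rails or leakage behavior.

```python
class PowerDomain:
    """Hypothetical model of components gated by the low-power circuitry."""

    def __init__(self, components):
        # All components start in the "off" state; as noted above, "off" does
        # not necessarily mean zero power, only that the low-power processor
        # has not powered on and booted the component.
        self.state = {name: "off" for name in components}

    def power_on(self, name: str) -> None:
        self.state[name] = "on"

    def power_down(self, name: str) -> None:
        self.state[name] = "off"

domain = PowerDomain(["image_processor", "touch_sensor", "sensing_circuit"])
domain.power_on("touch_sensor")    # wake the sensor to watch for finger contacts
print(domain.state)
domain.power_down("touch_sensor")  # return to the low-power idle state
print(domain.state)
```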
  • In one example embodiment, image processor 1112 comprises a microprocessor integrated circuit (IC) customized for processing sensor data from the touch sensor 113, sensing circuit 241, and visible light camera 114, along with volatile memory used by the microprocessor to operate. In order to reduce the amount of time that image processor 1112 takes from powering on to processing data, a non-volatile read only memory (ROM) may be integrated on the IC with instructions for operating or booting the image processor 1112. This ROM may be minimized to match a minimum size needed to provide basic functionality for gathering sensor data from the touch sensor 113, sensing circuit 241, and visible light camera 114, such that no extra functionality that would cause delays in boot time is present. The ROM may be configured with direct memory access (DMA) to the volatile memory of the microprocessor of image processor 1112. DMA allows memory-to-memory transfer of data from the ROM to system memory of the image processor 1112 independent of operation of a main controller of image processor 1112. Providing DMA to this boot ROM further reduces the amount of time from power on of the image processor 1112 until sensor data from the touch sensor 113, sensing circuit 241, and visible light camera 114 can be processed and stored. In certain embodiments, minimal processing of the signals from the touch sensor 113, sensing circuit 241, and visible light camera 114 is performed by the image processor 1112, and additional processing may be performed by applications operating on the mobile device 1190 or server system 1198.
  • Low-power circuitry 1120 includes low-power processor 243A and low-power wireless circuitry 1124. These elements of low-power circuitry 1120 may be implemented as separate elements or may be implemented on a single IC as part of a system on a single chip. Low-power processor 243A includes logic for managing the other elements of the eyewear device 100. As described above, for example, low power processor 243A may accept user input signals from the touch sensor 113. Low-power processor 243A may also be configured to receive input signals or instruction communications from mobile device 1190 via low-power wireless connection 1125. Additional details related to such instructions are described further below. Low-power wireless circuitry 1124 includes circuit elements for implementing a low-power wireless communication system via a short-range network. Bluetooth™ Smart, also known as Bluetooth™ low energy, is one standard implementation of a low power wireless communication system that may be used to implement low-power wireless circuitry 1124. In other embodiments, other low power communication systems may be used.
  • High-speed circuitry 1130 includes high-speed processor 243B, memory 1134, and high-speed wireless circuitry 1136. In the example, the sensing circuit 241 and touch sensor 113 are shown as being coupled to the low-power circuitry 1120 and operated by the low-power processor 243A. However, it should be understood that in some examples the touch sensor 113 and sensing circuit 241 can be coupled to the high-speed circuitry 1130 and operated by the high-speed processor 243B. In the example, the image display driver 242 is coupled to the high-speed circuitry 1130 and operated by the high-speed processor 243B in order to drive the image display of the optical assembly 180.
  • High-speed processor 243B may be any processor capable of managing high-speed communications and operation of any general computing system needed for eyewear device 100. High speed processor 243B includes processing resources needed for managing high-speed data transfers on high-speed wireless connection 1137 to a wireless local area network (WLAN) using high-speed wireless circuitry 1136. In certain embodiments, the high-speed processor 243B executes an operating system such as a LINUX operating system or other such operating system of the eyewear device 100 and the operating system is stored in memory 1134 for execution. In addition to any other responsibilities, the high-speed processor 243B executing a software architecture for the eyewear device 100 is used to manage data transfers with high-speed wireless circuitry 1136. In certain embodiments, high-speed wireless circuitry 1136 is configured to implement Institute of Electrical and Electronic Engineers (IEEE) 802.11 communication standards, also referred to herein as Wi-Fi. In other embodiments, other high-speed communications standards may be implemented by high-speed wireless circuitry 1136.
  • Memory 1134 includes any storage device capable of storing various applications 1144-1151 and data, including camera data generated by the visible light camera 114 and the image processor 1112, as well as images generated for display by the image display driver 242 on the image display of the optical assembly 180. While memory 1134 is shown as integrated with high-speed circuitry 1130, in other embodiments, memory 1134 may be an independent standalone element of the eyewear device 100. In certain such embodiments, electrical routing lines may provide a connection through a chip that includes the high-speed processor 243B from the image processor 1112 or low-power processor 243A to the memory 1134. In other embodiments, the high-speed processor 243B may manage addressing of memory 1134 such that the low-power processor 243A will boot the high-speed processor 243B any time that a read or write operation involving memory 1134 is needed.
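A sketch of the memory-access handoff described above, under assumed names (HighSpeedProcessor, low_power_store); it shows only the control flow in which the low-power processor boots the high-speed processor whenever a read or write involving memory 1134 is needed, not the actual addressing hardware.

```python
class HighSpeedProcessor:
    """Hypothetical stand-in for the processor that manages memory addressing."""

    def __init__(self):
        self.booted = False
        self._memory = {}

    def boot(self) -> None:
        self.booted = True

    def write(self, address: int, value: bytes) -> None:
        assert self.booted, "memory is only addressable once the processor is booted"
        self._memory[address] = value


def low_power_store(hs: HighSpeedProcessor, address: int, value: bytes) -> None:
    """Low-power processor path: boot the high-speed processor on demand,
    then delegate the memory operation to it."""
    if not hs.booted:
        hs.boot()
    hs.write(address, value)

hs = HighSpeedProcessor()
low_power_store(hs, 0x1000, b"captured-frame")
print(hs.booted)  # -> True
```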
  • Any of the touch sensor or other functions described herein for the eyewear device 100, mobile device 1190, and server system 1198 can be embodied in one or more methods as method steps or in one or more applications as described previously. According to some embodiments, an "application" or "applications" are program(s) that execute functions defined in the programs. Various programming languages can be employed to create one or more of the applications, structured in a variety of manners, such as object-oriented programming languages (e.g., Objective-C, Java, or C++) or procedural programming languages (e.g., C or assembly language). In a specific example, a third party application (e.g., an application developed using the ANDROID™ or IOS™ software development kit (SDK) by an entity other than the vendor of the particular platform) may be mobile software running on a mobile operating system such as IOS™, ANDROID™, WINDOWS® Phone, or another mobile operating system. In this example, the third party application can invoke API calls provided by the operating system to facilitate functionality described herein. The applications can be stored in any type of computer readable medium or computer storage device and be executed by one or more general purpose computers. In addition, the methods and processes disclosed herein can alternatively be embodied in specialized computer hardware or an application specific integrated circuit (ASIC), field programmable gate array (FPGA), or complex programmable logic device (CPLD).
  • Program aspects of the technology may be thought of as “products” or “articles of manufacture” typically in the form of executable code and/or associated data that is carried on or embodied in a type of machine readable medium. For example, programming code could include code for the touch sensor or other functions described herein. “Storage” type media include any or all of the tangible memory of the computers, processors or the like, or associated modules thereof, such as various semiconductor memories, tape drives, disk drives and the like, which may provide non-transitory storage at any time for the software programming. All or portions of the software may at times be communicated through the Internet or various other telecommunication networks. Such communications, for example, may enable loading of the software from one computer or processor into another, for example, from the server system 1198 or host computer of the service provider into the computer platforms of the eyewear device 100 and mobile device 1190. Thus, another type of media that may bear the programming, media content or meta-data files includes optical, electrical and electromagnetic waves, such as used across physical interfaces between local devices, through wired and optical landline networks and over various air-links. The physical elements that carry such waves, such as wired or wireless links, optical links or the like, also may be considered as media bearing the software. As used herein, unless restricted to “non-transitory”, “tangible”, or “storage” media, terms such as computer or machine “readable medium” refer to any medium that participates in providing instructions or data to a processor for execution.
  • Hence, a machine readable medium may take many forms of tangible storage medium. Non-volatile storage media include, for example, optical or magnetic disks, such as any of the storage devices in any computer(s) or the like, such as may be used to implement the client device, media gateway, transcoder, etc. shown in the drawings. Volatile storage media include dynamic memory, such as main memory of such a computer platform. Tangible transmission media include coaxial cables, copper wire, and fiber optics, including the wires that comprise a bus within a computer system. Carrier-wave transmission media may take the form of electric or electromagnetic signals, or acoustic or light waves such as those generated during radio frequency (RF) and infrared (IR) data communications. Common forms of computer-readable media therefore include, for example: a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD or DVD-ROM, any other optical medium, punch cards, paper tape, any other physical storage medium with patterns of holes, a RAM, a PROM and EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave transporting data or instructions, cables or links transporting such a carrier wave, or any other medium from which a computer may read programming code and/or data. Many of these forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to a processor for execution.
  • The scope of protection is limited solely by the claims that now follow. That scope is intended and should be interpreted to be as broad as is consistent with the ordinary meaning of the language that is used in the claims when interpreted in light of this specification and the prosecution history that follows and to encompass all structural and functional equivalents. Notwithstanding, none of the claims are intended to embrace subject matter that fails to satisfy the requirement of Sections 101, 102, or 103 of the Patent Act, nor should they be interpreted in such a way. Any unintended embracement of such subject matter is hereby disclaimed.
  • Except as stated immediately above, nothing that has been stated or illustrated is intended or should be interpreted to cause a dedication of any component, step, feature, object, benefit, advantage, or equivalent to the public, regardless of whether it is or is not recited in the claims.
  • It will be understood that the terms and expressions used herein have the ordinary meaning as is accorded to such terms and expressions with respect to their corresponding respective areas of inquiry and study except where specific meanings have otherwise been set forth herein. Relational terms such as first and second and the like may be used solely to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” “includes,” “including,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises or includes a list of elements or steps does not include only those elements or steps but may include other elements or steps not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “a” or “an” does not, without further constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element.
  • Unless otherwise stated, any and all measurements, values, ratings, positions, magnitudes, sizes, and other specifications that are set forth in this specification, including in the claims that follow, are approximate, not exact. Such amounts are intended to have a reasonable range that is consistent with the functions to which they relate and with what is customary in the art to which they pertain. For example, unless expressly stated otherwise, a parameter value or the like may vary by as much as ±10% from the stated amount.
  • In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various examples for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed examples require more features than are expressly recited in each claim. Rather, as the following claims reflect, the subject matter to be protected lies in less than all features of any single disclosed example. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.
  • While the foregoing has described what are considered to be the best mode and other examples, it is understood that various modifications may be made therein and that the subject matter disclosed herein may be implemented in various forms and examples, and that they may be applied in numerous applications, only some of which have been described herein. It is intended by the following claims to claim any and all modifications and variations that fall within the true scope of the present concepts.

Claims (20)

What is claimed is:
1. A method of providing input to an eyewear device, comprising:
receiving on an input surface of a touch sensor at least one finger contact inputted from a user, the touch sensor including an input surface and a sensor array that is coupled to the input surface to receive the at least one finger contact inputted from the user;
tracking the at least one finger contact on the input surface;
detecting at least one touch event on the input surface of the touch sensor based on the at least one finger contact on the input surface;
identifying a finger gesture based on the at least one detected touch event on the input surface; and
adjusting an image presented on the image display based on the identified finger gesture.
2. The method of claim 1, wherein the sensor array is a capacitive array or a resistive array, further comprising:
determining a respective location coordinate and a respective input time of the at least one finger contact on the input surface; and
tracking the respective location coordinate and the respective input time of the at least one finger contact on the input surface,
wherein detecting the at least one touch event on the input surface of the touch sensor is based on the at least one respective location coordinate and the respective input time of the at least one finger contact.
3. The method of claim 2, further comprising disposing the touch sensor on a circuit board.
4. The method of claim 3, wherein the circuit board is a flexible printed circuit board.
5. The method of claim 4, further comprising forming patterned conductive traces in the capacitive array or the resistive array of the touch sensor, the patterned conductive traces formed of at least one metal, indium tin oxide, or a combination thereof on the flexible printed circuit board.
6. The method of claim 4, further comprising surrounding the input surface of the touch sensor by a protruding ridge to indicate to the user an outside boundary of the input surface of the touch sensor.
7. The method of claim 4, wherein the touch sensor includes the capacitive array, the capacitive array being formed of patterned conductive sensor electrodes connected to the flexible printed circuit board and disposed below the input surface, and a sensing circuit is connected to the sensor electrodes and integrated into or connected to the touch sensor via at least one respective electrical interconnect, further comprising:
the sensing circuit measuring capacitance changes of each of the sensor electrodes of the capacitive array to track the at least one finger contact on the input surface of the touch sensor.
8. The method of claim 4, wherein the touch sensor includes the resistive array, the resistive array includes two conductive layers separated by at least one spacer to form an air gap between the two conductive layers connected to the flexible printed circuit board, and a sensing circuit is connected to the flexible printed circuit board and connected to the two conductive layers disposed below the input surface, further comprising:
the sensing circuit measuring a voltage drop between the two conductive layers in response to the at least one finger contact.
9. The method of claim 2, further comprising forming the capacitive array or the resistive array into a grid that forms a two-dimensional rectangular coordinate system to track X and Y axes location coordinates.
10. The method of claim 9, wherein the at least one detected touch event is a single tap on the input surface of the touch sensor and the identified finger gesture is a selection of a graphical user interface element in the image presented on the image display, further comprising:
adjusting the image presented on the image display based on the identified finger gesture by selecting the graphical user interface element for display or execution on the image display.
11. The method of claim 2, further comprising forming the capacitive array or the resistive array into a linear array and forming a one-dimensional linear coordinate system to track an X axis location coordinate.
12. The method of claim 2, wherein:
receiving on the input surface of the touch sensor the at least one finger contact inputted from the user comprises:
receiving on the input surface of the touch sensor a first finger contact inputted from the user at a first input time; and
receiving on the input surface of the touch sensor a second finger contact inputted from the user at a second input time which is within a predetermined time period of the first input time.
13. The method of claim 12, wherein:
detecting the at least one touch event on the input surface of the touch sensor based on the at least one finger contact inputted from the user comprises:
detecting a first touch event on the input surface of the touch sensor based on the first finger contact inputted from the user at the first input time; and
detecting a second touch event on the input surface of the touch sensor based on the second finger contact inputted from the user at the second input time within the predetermined time period of the first input time,
wherein identifying the finger gesture is based on the first and second detected touch events, the first input time, the second input time, and the predetermined time period.
14. The method of claim 13, wherein the first and second detected touch events are a press and hold on the input surface of the touch sensor, and the identified finger gesture is a press and hold of a graphical user interface element in the image presented on the image display, further comprising:
adjusting the image presented on the image display based on the identified finger gesture by dragging and dropping of the graphical user interface element on the image display.
15. The method of claim 13, wherein the first and second detected touch events are finger swiping from front to back or back to front on the input surface of the touch sensor, and the identified finger gesture is a scroll of the image presented on the image display, further comprising:
adjusting the image presented on the image display based on the identified finger gesture by scrolling the image presented on the image display.
16. The method of claim 13, wherein the first and second detected touch events are finger pinching on the input surface of the touch sensor, and the identified finger gesture is a zoom in of the image presented on the image display, further comprising:
adjusting the image presented on the image display based on the identified finger gesture by zooming in on the image presented on the image display.
17. The method of claim 13, wherein the first and second detected touch events are finger unpinching on the input surface of the touch sensor, and the identified finger gesture is a zoom out of the image presented on the image display, further comprising:
adjusting the image presented on the image display based on the identified finger gesture by zooming out of the image presented on the image display.
18. The method of claim 13, wherein the first and second detected touch events are finger rotations on the input surface of the touch sensor, and the identified finger gesture is a finger rotation of the image presented on the image display, further comprising:
adjusting the image presented on the display based on the identified finger gesture by rotating the image presented on the image display.
19. A non-transitory computer readable medium comprising instructions which, when executed by a processor, cause an electronic system to provide input to an eyewear device, by:
receiving on an input surface of a touch sensor at least one finger contact inputted from a user, the touch sensor including an input surface and a sensor array that is coupled to the input surface to receive the at least one finger contact inputted from the user;
tracking the at least one finger contact on the input surface;
detecting at least one touch event on the input surface of the touch sensor based on the at least one finger contact on the input surface;
identifying a finger gesture based on the at least one detected touch event on the input surface; and
adjusting an image presented on the image display based on the identified finger gesture.
20. The non-transitory computer readable medium of claim 19, wherein the sensor array is a capacitive array or a resistive array, the instructions further comprising instructions to:
determine a respective location coordinate and a respective input time of the at least one finger contact on the input surface; and
track the respective location coordinate and the respective input time of the at least one finger contact on the input surface,
wherein detecting the at least one touch event on the input surface of the touch sensor is based on the at least one respective location coordinate and the respective input time of the at least one finger contact.
US17/182,943 2018-01-10 2021-02-23 Eyewear device with finger activated touch sensor Pending US20210181536A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/182,943 US20210181536A1 (en) 2018-01-10 2021-02-23 Eyewear device with finger activated touch sensor

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201862615664P 2018-01-10 2018-01-10
US16/241,063 US10962809B1 (en) 2018-01-10 2019-01-07 Eyewear device with finger activated touch sensor
US17/182,943 US20210181536A1 (en) 2018-01-10 2021-02-23 Eyewear device with finger activated touch sensor

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/241,063 Continuation US10962809B1 (en) 2018-01-10 2019-01-07 Eyewear device with finger activated touch sensor

Publications (1)

Publication Number Publication Date
US20210181536A1 true US20210181536A1 (en) 2021-06-17

Family

ID=75164468

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/241,063 Active 2039-08-23 US10962809B1 (en) 2018-01-10 2019-01-07 Eyewear device with finger activated touch sensor
US17/182,943 Pending US20210181536A1 (en) 2018-01-10 2021-02-23 Eyewear device with finger activated touch sensor

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/241,063 Active 2039-08-23 US10962809B1 (en) 2018-01-10 2019-01-07 Eyewear device with finger activated touch sensor

Country Status (1)

Country Link
US (2) US10962809B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230168500A1 (en) * 2021-11-26 2023-06-01 Merry Electronics (Suzhou) Co., Ltd. Smart glasses and camera device thereof

Families Citing this family (84)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10509466B1 (en) 2011-05-11 2019-12-17 Snap Inc. Headwear with computer and optical element for use therewith and systems utilizing same
US10439972B1 (en) 2013-05-30 2019-10-08 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US10586570B2 (en) 2014-02-05 2020-03-10 Snap Inc. Real time video processing for changing proportions of an object in the video
US9276886B1 (en) 2014-05-09 2016-03-01 Snapchat, Inc. Apparatus and method for dynamically configuring application component tiles
US9396354B1 (en) 2014-05-28 2016-07-19 Snapchat, Inc. Apparatus and method for automated privacy protection in distributed images
US9537811B2 (en) 2014-10-02 2017-01-03 Snap Inc. Ephemeral gallery of ephemeral messages
US9225897B1 (en) 2014-07-07 2015-12-29 Snapchat, Inc. Apparatus and method for supplying content aware photo filters
US10775996B2 (en) 2014-11-26 2020-09-15 Snap Inc. Hybridization of voice notes and calling
US10311916B2 (en) 2014-12-19 2019-06-04 Snap Inc. Gallery of videos set to an audio time line
US9385983B1 (en) 2014-12-19 2016-07-05 Snapchat, Inc. Gallery of messages from individuals with a shared interest
KR102035405B1 (en) 2015-03-18 2019-10-22 스냅 인코포레이티드 Geo-Fence Authorized Provisioning
US9668217B1 (en) 2015-05-14 2017-05-30 Snap Inc. Systems and methods for wearable initiated handshaking
US10503264B1 (en) 2015-06-16 2019-12-10 Snap Inc. Radial gesture navigation
US10354425B2 (en) 2015-12-18 2019-07-16 Snap Inc. Method and system for providing context relevant media augmentation
US10055895B2 (en) 2016-01-29 2018-08-21 Snap Inc. Local augmented reality persistent sticker objects
US10474353B2 (en) 2016-05-31 2019-11-12 Snap Inc. Application control using a gesture based trigger
US10102423B2 (en) 2016-06-30 2018-10-16 Snap Inc. Object modeling and replacement in a video stream
US10768639B1 (en) 2016-06-30 2020-09-08 Snap Inc. Motion and image-based control system
US10609036B1 (en) 2016-10-10 2020-03-31 Snap Inc. Social media post subscribe requests for buffer user accounts
US10579869B1 (en) 2017-07-18 2020-03-03 Snap Inc. Virtual object machine learning
US11323398B1 (en) 2017-07-31 2022-05-03 Snap Inc. Systems, devices, and methods for progressive attachments
US11204949B1 (en) 2017-07-31 2021-12-21 Snap Inc. Systems, devices, and methods for content selection
US10591730B2 (en) 2017-08-25 2020-03-17 II Jonathan M. Rodriguez Wristwatch based interface for augmented reality eyewear
US11531357B1 (en) 2017-10-05 2022-12-20 Snap Inc. Spatial vector-based drone control
US11847426B2 (en) 2017-11-08 2023-12-19 Snap Inc. Computer vision based sign language interpreter
US10217488B1 (en) 2017-12-15 2019-02-26 Snap Inc. Spherical video editing
US10567321B2 (en) 2018-01-02 2020-02-18 Snap Inc. Generating interactive messages with asynchronous media content
US10523606B2 (en) 2018-01-02 2019-12-31 Snap Inc. Generating interactive messages with asynchronous media content
US10962809B1 (en) * 2018-01-10 2021-03-30 Snap Inc. Eyewear device with finger activated touch sensor
US11063889B2 (en) 2018-06-08 2021-07-13 Snap Inc. Generating interactive messages with entity assets
US10796482B2 (en) 2018-12-05 2020-10-06 Snap Inc. 3D hand shape and pose estimation
US12071228B1 (en) * 2019-03-28 2024-08-27 Snap Inc. Drone with propeller guard configured as an airfoil
US11012390B1 (en) 2019-03-28 2021-05-18 Snap Inc. Media content response in a messaging system
US11036368B1 (en) 2019-03-29 2021-06-15 Snap Inc. Messaging system with message transmission user interface
US11019011B1 (en) 2019-03-29 2021-05-25 Snap Inc. Messaging system with discard user interface
US11106342B1 (en) 2019-06-03 2021-08-31 Snap Inc. User interfaces to facilitate multiple modes of electronic communication
US11151794B1 (en) 2019-06-28 2021-10-19 Snap Inc. Messaging system with augmented reality messages
US11307747B2 (en) 2019-07-11 2022-04-19 Snap Inc. Edge gesture interface with smart interactions
US11551374B2 (en) 2019-09-09 2023-01-10 Snap Inc. Hand pose estimation from stereo cameras
US11775168B1 (en) * 2019-09-25 2023-10-03 Snap Inc. Eyewear device user interface
US11062498B1 (en) 2019-12-30 2021-07-13 Snap Inc. Animated pull-to-refresh
US11488358B2 (en) 2020-02-05 2022-11-01 Snap Inc. Augmented reality session creation using skeleton tracking
US11265274B1 (en) 2020-02-28 2022-03-01 Snap Inc. Access and routing of interactive messages
US11409368B2 (en) 2020-03-26 2022-08-09 Snap Inc. Navigating through augmented reality content
US11675494B2 (en) 2020-03-26 2023-06-13 Snap Inc. Combining first user interface content into second user interface
US11960651B2 (en) 2020-03-30 2024-04-16 Snap Inc. Gesture-based shared AR session creation
US11455078B1 (en) 2020-03-31 2022-09-27 Snap Inc. Spatial navigation and creation interface
US11900784B2 (en) 2020-05-26 2024-02-13 Motorola Mobility Llc Electronic device having capacitance-based force sensing
US11763177B2 (en) * 2020-05-26 2023-09-19 Motorola Mobility Llc Electronic device having capacitance-based force sensing for user interface input
EP4173257A1 (en) 2020-06-30 2023-05-03 Snap Inc. Skeletal tracking for real-time virtual effects
EP4197180A1 (en) 2020-08-13 2023-06-21 Snap Inc. User interface for pose driven virtual effects
US11671559B2 (en) 2020-09-30 2023-06-06 Snap Inc. Real time video editing
US12105283B2 (en) 2020-12-22 2024-10-01 Snap Inc. Conversation interface on an eyewear device
US11782577B2 (en) 2020-12-22 2023-10-10 Snap Inc. Media content player on an eyewear device
US11797162B2 (en) 2020-12-22 2023-10-24 Snap Inc. 3D painting on an eyewear device
EP4272406B1 (en) 2020-12-29 2024-10-02 Snap Inc. Body ui for augmented reality components
US12008152B1 (en) 2020-12-31 2024-06-11 Snap Inc. Distance determination for mixed reality interaction
EP4272059A1 (en) 2020-12-31 2023-11-08 Snap Inc. Electronic communication interface with haptic feedback response
US11989348B2 (en) 2020-12-31 2024-05-21 Snap Inc. Media content items with haptic feedback augmentations
CN116670635A (en) 2020-12-31 2023-08-29 斯纳普公司 Real-time video communication interface with haptic feedback
US11809633B2 (en) 2021-03-16 2023-11-07 Snap Inc. Mirroring device with pointing based navigation
US11734959B2 (en) 2021-03-16 2023-08-22 Snap Inc. Activating hands-free mode on mirroring device
US11908243B2 (en) 2021-03-16 2024-02-20 Snap Inc. Menu hierarchy navigation on electronic mirroring devices
US11798201B2 (en) 2021-03-16 2023-10-24 Snap Inc. Mirroring device with whole-body outfits
US11978283B2 (en) 2021-03-16 2024-05-07 Snap Inc. Mirroring device with a hands-free mode
USD998637S1 (en) 2021-03-16 2023-09-12 Snap Inc. Display screen or portion thereof with a graphical user interface
US12050729B2 (en) 2021-03-31 2024-07-30 Snap Inc. Real-time communication interface with haptic and audio feedback response
US11880542B2 (en) 2021-05-19 2024-01-23 Snap Inc. Touchpad input for augmented reality display device
US11928306B2 (en) 2021-05-19 2024-03-12 Snap Inc. Touchpad navigation for augmented reality display device
US11670059B2 (en) 2021-09-01 2023-06-06 Snap Inc. Controlling interactive fashion based on body gestures
US11960784B2 (en) 2021-12-07 2024-04-16 Snap Inc. Shared augmented reality unboxing experience
US11748958B2 (en) 2021-12-07 2023-09-05 Snap Inc. Augmented reality unboxing experience
US11861801B2 (en) 2021-12-30 2024-01-02 Snap Inc. Enhanced reading with AR glasses
US11579747B1 (en) 2022-03-14 2023-02-14 Snap Inc. 3D user interface depth forgiveness
US11960653B2 (en) 2022-05-10 2024-04-16 Snap Inc. Controlling augmented reality effects through multi-modal human interaction
US12001878B2 (en) 2022-06-03 2024-06-04 Snap Inc. Auto-recovery for AR wearable devices
US12002168B2 (en) 2022-06-20 2024-06-04 Snap Inc. Low latency hand-tracking in augmented reality systems
US12069399B2 (en) 2022-07-07 2024-08-20 Snap Inc. Dynamically switching between RGB and IR capture
US11948266B1 (en) 2022-09-09 2024-04-02 Snap Inc. Virtual object manipulation with gestures in a messaging system
US11995780B2 (en) 2022-09-09 2024-05-28 Snap Inc. Shooting interaction using augmented reality content in a messaging system
US11797099B1 (en) 2022-09-19 2023-10-24 Snap Inc. Visual and audio wake commands
US11747912B1 (en) 2022-09-22 2023-09-05 Snap Inc. Steerable camera for AR hand tracking
US12112025B2 (en) 2023-02-16 2024-10-08 Snap Inc. Gesture-driven message content resizing
US12093443B1 (en) 2023-10-30 2024-09-17 Snap Inc. Grasping virtual objects with real hands for extended reality

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130106745A1 (en) * 2008-01-21 2013-05-02 Elan Microelectronics Corporation Touch pad operable with multi-objects and method of operating same
US20130285966A1 (en) * 2010-12-28 2013-10-31 Sharp Kabushiki Kaisha Display apparatus
US20150103021A1 (en) * 2013-10-15 2015-04-16 Lg Electronics Inc. Glass type terminal having three-dimensional input device and screen control method thereof
US20160187662A1 (en) * 2014-12-25 2016-06-30 Seiko Epson Corporation Display device, and method of controlling display device
US20160334911A1 (en) * 2015-05-13 2016-11-17 Seiko Epson Corporation Display apparatus and method of controlling display apparatus
US20170090557A1 (en) * 2014-01-29 2017-03-30 Google Inc. Systems and Devices for Implementing a Side-Mounted Optical Sensor
US10962809B1 (en) * 2018-01-10 2021-03-30 Snap Inc. Eyewear device with finger activated touch sensor

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9692967B1 (en) 2015-03-23 2017-06-27 Snap Inc. Systems and methods for reducing boot time and power consumption in camera systems


Also Published As

Publication number Publication date
US10962809B1 (en) 2021-03-30

Similar Documents

Publication Publication Date Title
US20210181536A1 (en) Eyewear device with finger activated touch sensor
US11886633B2 (en) Virtual object display interface between a wearable device and a mobile device
US11892710B2 (en) Eyewear device with fingerprint sensor for user input
US11573607B2 (en) Facilitating dynamic detection and intelligent use of segmentation on flexible display screens
US11500536B2 (en) Neural network system for gesture, wear, activity, or carry detection on a wearable or mobile device
US11275453B1 (en) Smart ring for manipulating virtual objects displayed by a wearable device
KR20150091322A (en) Multi-touch interactions on eyewear
US20220326530A1 (en) Eyewear including virtual scene with 3d frames
CN113260951A (en) Fade-in user interface display based on finger distance or hand proximity
CN110998497A (en) Electronic device including force sensor and electronic device control method thereof
US20230126025A1 (en) Context-sensitive remote eyewear controller
US20240221331A1 (en) Geospatial image surfacing and selection
US20240061798A1 (en) Debug access of eyewear having multiple socs
US11900058B2 (en) Ring motion capture and message composition system
CN118696288A (en) Dual system-on-chip glasses with MIPI bridge chip
CN118202314A (en) System-on-two-piece glasses
WO2022072205A1 (en) Image capture eyewear with automatic image sending to recipients determined based on context
KR101397812B1 (en) Input system of touch and drag type in remote
KR102353919B1 (en) Electronic device and method for performing predefined operations in response to pressure of touch
US11762202B1 (en) Ring-mounted flexible circuit remote control
US11733789B1 (en) Selectively activating a handheld device to control a user interface displayed by a wearable device

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

AS Assignment

Owner name: SNAP INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CASTANEDA, JULIO CESAR;REEL/FRAME:068627/0514

Effective date: 20180108

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER