CN117762275A - Photo-induced sensing enabled display for stylus detection - Google Patents

Publication number
CN117762275A
Authority
CN (China)
Legal status
Pending
Application number
CN202311230108.5A
Other languages
Chinese (zh)
Inventor
C. H. Krah
M. Yeke Yazdandoost
C. J. Butler
P. J. Gelsinger
Current Assignee
Apple Inc
Original Assignee
Apple Inc
Priority claimed from U.S. patent application No. 18/462,330 (published as US 2024/0118773 A1)
Application filed by Apple Inc
Priority to CN202311423947.9A (published as CN117762276A)


Abstract

The present disclosure relates to a photo-induced sensing enabled display for stylus detection. An optical stylus system is disclosed that includes an optical stylus and an optical sensing system that operate together to determine one or more of a target or touch position, centroid, hover distance, tilt angle, azimuth angle, and in some cases orientation and rotation of the stylus. In some examples, light illuminator and detector angle filters are employed to limit the illumination angle and detection angle of light to minimize false object detection. In other examples, the stylus is a passive stylus having a surface that reflects light at a consistent angular reflection profile or reflected light pattern, regardless of stylus tilt. In still other examples, the stylus may detect light at different modulation frequencies emitted from an array of light emitters in the optical sensing system, or the stylus may emit light and detect reflected light having different spectral distributions across the optical sensing system to determine stylus position.

Description

Photo-induced sensing enabled display for stylus detection
Cross Reference to Related Applications
The present application claims the benefit of U.S. provisional application No. 63/377,003, filed on month 23 of 2022, U.S. provisional application No. 63/496,258, filed on month 9 of 2023, and U.S. patent application No. 18/462,330, filed on month 9 of 2023, the disclosures of which are incorporated herein by reference in their entirety for all purposes.
Technical Field
The present disclosure relates generally to optical touch and/or proximity sensing, and more particularly to an optical sensing system that works in conjunction with an optical stylus to determine one or more of a target or touch location, centroid, hover distance, tilt angle, azimuth angle, and in some cases orientation and rotation of the stylus.
Background
Many types of input devices are currently available for performing operations in a computing system, such as buttons or keys, a mouse, a trackball, a joystick, a touch sensor panel, a touch screen, and the like. In particular, touch screens are popular because of their simplicity and flexibility in operation and their ever-decreasing price. A touch screen may include a touch sensor panel, which may be a transparent panel having a touch-sensitive surface, and a display device, such as a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, or an Organic Light Emitting Diode (OLED) display, which may be positioned partially or fully behind the panel such that the touch-sensitive surface covers at least a portion of the viewable area of the display device. Touch screens may allow a user to perform various functions by touching the touch sensor panel with a finger, stylus, or other object at a location generally indicated by a User Interface (UI) displayed by the display device. In general, a touch screen may identify a touch and its location on the touch sensor panel, and the computing system may then interpret the touch according to the content displayed when the touch occurred and perform one or more actions based on the touch. For some touch sensing systems, detecting a touch does not require a physical touch on the display. For example, in some capacitive touch sensing systems, the fringe electric field used to detect a touch may extend beyond the surface of the display, and an object near the surface may be detected without actually touching the surface.
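The identify-and-locate step described above can be sketched as a scan of touch-node readings against a threshold. This is an illustrative sketch only; the function name, node layout, and simple thresholding are assumptions for explanation, not taken from the disclosure (a real controller would also group adjacent nodes into contacts):

```python
def detect_touches(node_values, threshold):
    """Scan a matrix of touch-node readings (rows of equal length) and
    return the (x, y) coordinates of nodes whose signal meets or
    exceeds the touch threshold."""
    return [(x, y)
            for y, row in enumerate(node_values)
            for x, v in enumerate(row)
            if v >= threshold]
```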
In some examples, the capacitive touch sensor panel may be formed from a matrix of transparent, translucent, or opaque conductive plates (e.g., touch electrodes) made of a material such as Indium Tin Oxide (ITO). In some examples, the conductive plate may be formed from other materials, including conductive polymers, metal grids, graphene, nanowires (e.g., silver nanowires), or nanotubes (e.g., carbon nanotubes). As described above, some capacitive touch sensor panels may be overlaid on a display to form a touch screen, in part because they are substantially transparent. Some touch screens may be formed by integrating touch sensing circuitry partially into a display pixel stack-up structure (i.e., stacked layers of material forming display pixels).
The stylus has become a popular input device for touch-sensitive devices such as touch panels and touch screens. Detecting the touch position or target position of the stylus (e.g., the illuminated area of a detection surface that the user of the stylus intends to engage), the tilt angle and direction of a stylus touching the detection surface or hovering over it (but not in direct contact), or the orientation and rotation of the stylus, can provide a variety of input modes and increased stylus functionality. However, the tilt angle and hover distance of the stylus may affect the accuracy of parameters derived from the target position (e.g., the centroid of the illumination pattern representing the target position), which may affect the accuracy of other operations (e.g., tracking accuracy) and result in reduced performance.
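To see why tilt can bias a naively computed centroid, consider a simplified geometric model (assumed for illustration; not taken from the disclosure) in which the stylus emits a light cone of fixed half-angle from a tip hovering at height h. Along the tilt axis, the footprint's midpoint drifts away from the point where the beam axis meets the surface as tilt grows:

```python
import math

def footprint_extent(h, tilt_deg, half_angle_deg):
    """Along the tilt axis, a light cone of the given half-angle from a
    tip at hover height h spans [h*tan(t-a), h*tan(t+a)]. Returns
    (near edge, far edge, footprint midpoint, beam-axis intersection);
    the midpoint diverges from h*tan(t) as tilt grows, biasing a
    naive centroid estimate of the target position."""
    t = math.radians(tilt_deg)
    a = math.radians(half_angle_deg)
    near = h * math.tan(t - a)
    far = h * math.tan(t + a)
    return near, far, (near + far) / 2.0, h * math.tan(t)
```

For a vertical stylus the midpoint and the beam-axis intersection coincide; at large tilt the far edge of the cone stretches the footprint asymmetrically, pulling the midpoint past the true target.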
Disclosure of Invention
The present disclosure relates to an optical stylus system that includes an optical stylus and an optical sensing system that operate together to detect one or more of a target or touch position, centroid, hover distance, tilt angle, azimuth angle, and in some cases an orientation and rotation of the optical stylus relative to the optical sensing system. In particular, detecting rotation of the stylus may provide additional input modes that enable additional stylus functionality. For example, rotating the stylus while interacting with a drawing application may allow an artist to control the perceived texture, thickness, or color of the drawn line.
In some embodiments, the optical sensing system is an electronic device with an integrated touch screen having microcircuits that can be configured for both display operation and touch/proximity sensing of objects. In some implementations, the integrated touch screen may include light emitting diodes or organic light emitting diodes (LEDs/OLEDs), display driving circuitry, and touch sensing circuitry. In some embodiments, the LEDs/OLEDs may be implemented as a micro-LED display including an array of micro-LEDs and micro-driver circuits. In some implementations, the array of micro-LEDs and the micro-driver circuits can be configured in a Direct Current (DC) photoconductive mode to detect the presence of a stylus by detecting unmodulated light emitted by the stylus. In other implementations, the array of micro-LEDs and the micro-driver circuits may be configured in an Alternating Current (AC) photoconductive mode to detect the presence of multiple styli by detecting modulated light emitted by those styli. In still other implementations, the array of micro-LEDs and the micro-driver circuits may be configured in an optically reflective touch mode to detect the presence of an object, such as a finger or stylus, by detecting modulated light generated by some of the micro-LEDs and reflected from the object.
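The DC and AC photoconductive modes can be illustrated with a simplified signal model (the mode names follow the text above; the processing shown is an assumption for illustration, not the disclosure's circuit): DC mode looks at the mean photocurrent from unmodulated light, while AC mode recovers the amplitude at a known modulation frequency with a lock-in style correlation, which rejects ambient light and other styli's frequencies.

```python
import math
from enum import Enum

class SenseMode(Enum):
    DISPLAY = 0      # micro-LED forward-biased, emitting display light
    DC_PHOTO = 1     # detect unmodulated stylus light (DC level)
    AC_PHOTO = 2     # demodulate stylus light at a known frequency
    REFLECTIVE = 3   # emit modulated light, detect its reflection

def dc_level(samples):
    """DC photoconductive mode: the mean photocurrent indicates the
    presence of an unmodulated light source such as a stylus."""
    return sum(samples) / len(samples)

def lock_in_amplitude(samples, f_mod, fs):
    """AC photoconductive mode: recover the amplitude of light modulated
    at f_mod by correlating with in-phase and quadrature references,
    rejecting ambient (DC) light and other modulation frequencies."""
    i = q = 0.0
    for n, s in enumerate(samples):
        i += s * math.cos(2 * math.pi * f_mod * n / fs)
        q += s * math.sin(2 * math.pi * f_mod * n / fs)
    n = len(samples)
    return 2.0 * math.hypot(i / n, q / n)
```

Because the correlation averages to zero for any frequency other than f_mod (over an integer number of cycles), two styli modulating at different frequencies can be separated at the same detector.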
The detection modes described above rely on light reaching the underlying LEDs, OLEDs, or micro-LEDs through the detection surface of the cover material above the integrated touch screen. However, light from above or below the detection surface that impinges on the boundary between the detection surface and the medium above it (e.g., air, water, a stylus, or a finger) may reflect off the boundary or be refracted as it passes through the boundary. In some cases, such reflected or refracted light may be detected and incorrectly identified as an object such as a finger or stylus. Thus, in some embodiments of the present disclosure, an optical illuminator angle filter may be employed over those micro-LEDs configured as illuminators within the integrated touch screen to limit the illumination angle of those illuminators, and/or a photodetector angle filter may be employed over those micro-LEDs configured as detectors to limit the detection angle of those detectors. These angle filters effectively block or filter light transmitted, reflected, or refracted within the cover material, reducing or eliminating false detections caused by, for example, water drops on the touch surface.
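The role of the angle filters can be illustrated with Snell's law: light traveling inside a cover of higher refractive index is totally internally reflected at the upper boundary beyond a critical angle that depends on the medium above (air versus water), so a detector acceptance cone narrower than those angles rejects light guided along the cover. The refractive indices and the 30 degree acceptance cone below are assumptions for illustration only:

```python
import math

N_COVER = 1.50   # assumed refractive index of the cover material
N_AIR = 1.00
N_WATER = 1.33

def critical_angle_deg(n_dense, n_rare):
    """Angle of incidence (measured from the surface normal) beyond
    which light in the denser medium is totally internally reflected
    at the boundary with the rarer medium."""
    return math.degrees(math.asin(n_rare / n_dense))

def passes_angle_filter(incidence_deg, acceptance_deg=30.0):
    """Model of a photodetector angle filter: only light arriving
    within the acceptance cone reaches the micro-LED configured as a
    detector, so shallow light guided through the cover is rejected."""
    return incidence_deg <= acceptance_deg
```

With these assumed indices, the cover-air critical angle is about 41.8 degrees and the cover-water critical angle is about 62.5 degrees, so a water droplet admits light into the cover at angles an air boundary would not; an acceptance cone well inside both angles filters out that stray light.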
After the optical sensing system detects the angularly filtered light, the resulting illumination pattern (e.g., at the target position of a hovering stylus) may be processed to determine the hover distance and tilt angle of the object, to calculate various parameters (e.g., the centroid of the illumination pattern representing the target position), and to perform other operations (e.g., stylus tracking) with greater accuracy.
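The centroid of a detected illumination pattern can be computed as an irradiance-weighted mean of node coordinates. This minimal sketch assumes a rectangular grid of touch-node readings (the grid layout and function name are illustrative assumptions):

```python
def illumination_centroid(irradiance):
    """Weighted centroid of a 2-D irradiance map (rows of equal
    length), e.g., photocurrents from touch node electrodes.
    Returns (x, y) in node coordinates, or None if no light was
    detected; used as an estimate of the stylus target position."""
    total = sx = sy = 0.0
    for y, row in enumerate(irradiance):
        for x, v in enumerate(row):
            total += v
            sx += v * x
            sy += v * y
    if total == 0.0:
        return None
    return (sx / total, sy / total)
```

For a tilted stylus the illumination pattern becomes elliptical, so in practice the centroid may be refined by interpolating the irradiance map and fitting an ellipse to its boundary, as discussed in connection with the figures.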
In some embodiments, the optical stylus is a passive stylus including diffuse reflector facets or retroreflector facets, so as to reflect light emitted from the optical sensing system with a consistent angular reflection profile. Different tilt angles may produce different reflected energy profiles, and these different reflected energy profiles may be evaluated to determine the position, hover distance (if any), and tilt angle of the stylus. In some embodiments, a passive stylus that includes a diffractive (patterned) reflector may also reflect light emitted from the optical sensing system in a consistent reflected light pattern, regardless of the tilt angle of the stylus relative to the surface. Different tilt angles and rotations of the stylus may produce different reflected light patterns, and these different reflected light patterns may be evaluated to determine the position, hover distance (if any), tilt angle, orientation, and rotation of the stylus. In some embodiments, a semi-active stylus including amplitude sensors in its tip, and optionally at radial positions along the stylus sides, may detect the amplitudes of modulated light of different frequencies emitted from the optical sensing system to determine the position and hover distance (if any) of the stylus, and in some cases also the tilt angle and rotation of the stylus. In some examples, an active stylus that includes both an optical emitter and a detector may emit light and receive that light after it is reflected from an optical sensing system having a retroreflector layer formed between an array of display elements. Different positions of the stylus on or over the display surface may produce different spectral distributions of reflected light, which may be analyzed to determine the stylus's position.
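The semi-active scheme can be sketched as follows, under the assumption (for illustration only; the disclosure's estimator is not specified here) that each emitter is assigned a unique modulation frequency and that the amplitude measured at the stylus tip decreases with distance from the emitter, so an amplitude-weighted centroid of the emitter locations approximates the stylus position:

```python
def estimate_stylus_position(emitter_pos, amplitudes):
    """emitter_pos maps each modulation frequency to its emitter's
    (x, y) location; amplitudes maps each frequency to the amplitude
    the stylus tip measured at that frequency. Returns the
    amplitude-weighted centroid of the emitter locations, or None if
    no modulated light was detected."""
    total = sx = sy = 0.0
    for f_mod, (x, y) in emitter_pos.items():
        a = amplitudes.get(f_mod, 0.0)
        total += a
        sx += a * x
        sy += a * y
    if total == 0.0:
        return None
    return (sx / total, sy / total)
```

A stylus hovering midway between emitters measures similar amplitudes on each frequency and is placed between them; a stylus directly over one emitter is dominated by that emitter's frequency.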
Drawings
Fig. 1A-1E illustrate an optical stylus system including an optical stylus and an electronic device including an optical sensing system, in which one or more of a target or touch location, centroid, hover distance, tilt angle, azimuth, and in some cases orientation and rotation of the optical stylus touching or approaching a surface may be determined, according to examples of the present disclosure.
FIG. 2A is a block diagram of a computing system showing one implementation of an integrated touch screen in which one or more of a target or touch location, centroid, hover distance, tilt angle, azimuth angle, and in some cases orientation and rotation of an optical stylus touching or approaching a surface can be determined, in accordance with examples of the present disclosure.
Fig. 2B is a block diagram of one implementation of a semi-active or active optical stylus forming part of an optical stylus system for detecting one or more of a target or touch location, centroid, hover distance, tilt angle, azimuth, and in some cases orientation and rotation of the optical stylus touching or approaching a surface, according to examples of the present disclosure.
Fig. 3A-3B illustrate a stacked structure of an integrated touch screen in which one or more of a target or touch location, centroid, hover distance, tilt angle, azimuth angle, and in some cases orientation and rotation of an optical stylus touching or approaching a surface may be determined, according to examples of the present disclosure.
Fig. 4A illustrates a corresponding circuit of a portion of a conductive layer and a portion of an example touch and display circuit layer in an optical sensing system according to examples of the present disclosure, wherein one or more of a target or touch location, centroid, hover distance, tilt angle, azimuth angle, and in some cases orientation and rotation of an optical stylus touching or approaching a surface may be determined.
Fig. 4B shows a block diagram of a touch node electrode according to an example of the present disclosure.
Fig. 4C shows an expanded view of a touch node electrode including two micro LED modules configured in DC photoconductive mode for detecting unmodulated light emitted by a stylus and a micro driver block according to an example of the present disclosure.
Fig. 4D shows an expanded view of a touch node electrode including two micro LED modules and a micro driver block configured in an AC photoconductive mode for detecting modulated light emitted by one or more styluses, according to an example of the present disclosure.
Fig. 4E illustrates an expanded view of a touch node electrode including two micro LED modules and a micro driver block configured to be in an optically reflective touch mode for emitting modulated light and detecting reflection of the modulated light from an object such as a finger or a passive stylus, according to an example of the present disclosure.
Fig. 4F shows an expanded view of a touch node electrode including two micro LED modules and a micro driver block in an analog demodulation configuration, according to an example of the present disclosure.
Fig. 5A illustrates a cross-sectional view of a portion of an integrated touch screen including micro LEDs, a cover material, and an object such as a proximity stylus, and transmission of light across a boundary between the object and the cover material, according to an example of the present disclosure.
Fig. 5B illustrates a cross-sectional view of a portion of an integrated touch screen including micro LEDs and a cover material, and reflection or refraction of light across a boundary between air and the cover material, according to an example of the present disclosure.
Fig. 5C illustrates a cross-sectional view of a portion of an integrated touch screen including micro LEDs and a cover material, and reflection or refraction of light across a boundary between a water droplet and the cover material, according to an example of the present disclosure.
Fig. 5D illustrates a cross-sectional view of a portion of an integrated touch screen including micro LEDs and a cover material, and the concept of blocking or filtering light reflected or refracted at certain angles within or through the cover material, according to an example of the present disclosure.
Fig. 5E illustrates a cross-sectional view of a portion of an integrated touch screen including micro LEDs configured as illuminators and generating light in the direction of a boundary represented by an interface between air and a cover material, according to an example of the present disclosure.
Fig. 5F illustrates a cross-sectional view of a portion of an integrated touch screen including micro LEDs configured as illuminators and generating light in the direction of a boundary represented by an interface between a water droplet and a cover material, according to an example of the present disclosure.
Fig. 5G illustrates a cross-sectional view of a portion of an integrated touch screen including micro LEDs and a cover material, and the concept of blocking or filtering light emitted at certain angles by micro LEDs configured as illuminators, according to examples of the present disclosure.
Fig. 5H-5K illustrate cross-sectional views of portions of an integrated touch screen including alternative illuminator and photodetector embodiments, according to examples of the present disclosure.
Fig. 6A illustrates a cross-sectional view of a portion of an integrated touch screen including representative micro LEDs, a cover material, a light blocking layer, and transmission and reception of light through the light blocking layer, according to an example of the present disclosure.
Fig. 6B illustrates a top view of a portion of the integrated touch screen of fig. 6A showing a light blocking layer according to an example of the present disclosure.
Fig. 7A illustrates a geometric perspective view of a stylus hovering in a perpendicular orientation relative to a detection surface and producing an illumination pattern, according to an example of the present disclosure.
Fig. 7B illustrates a flowchart of a method of calculating an illumination pattern of a stylus oriented perpendicular to a detection surface according to an example of the present disclosure.
Fig. 8A illustrates a visual comparison between a vertical stylus and a tilted stylus and an elliptical illumination pattern produced by the tilted stylus according to an example of the present disclosure.
Fig. 8B illustrates a flowchart of a method of calculating an illumination pattern of a stylus tilted with respect to a detection surface according to an example of the present disclosure.
Fig. 8C illustrates a flowchart of an alternative method of calculating an illumination pattern of a stylus tilted relative to a detection surface, according to an example of the present disclosure.
FIG. 9A illustrates irradiance profiles of a plurality of touch node electrodes in a portion of an integrated touch screen according to one example of the present disclosure.
Fig. 9B illustrates an irradiance profile after interpolation and upsampling of the irradiance profile of FIG. 9A have been performed to increase the granularity of the illumination pattern, according to one example of the present disclosure.
FIG. 9C illustrates a two-dimensional view of those touch node electrodes that have been identified as boundary touch node electrodes in the irradiance profile of FIG. 9B, according to one example of the disclosure.
Fig. 9D illustrates an ellipse as a result of fitting the ellipse to the boundary map of fig. 9C according to one example of the present disclosure.
Fig. 9E shows an ellipse as a result of fitting the ellipse to the boundary map of fig. 9C, but in this case the boundary map is incomplete, according to one embodiment of the present disclosure.
Fig. 10A illustrates a cross-sectional view of a portion of an optical stylus system including a passive diffuse reflector stylus and an optical sensing system having an array of optical light emitting and/or detecting devices, according to some examples of the present disclosure.
Fig. 10B illustrates a cross-sectional view of a portion of an optical stylus system including a passive diffuse reflector stylus and an optical sensing system having a single light emitting device (for purposes of explanation only) according to some examples of the present disclosure.
Fig. 10C illustrates a plot of reflected light intensity versus scattering angle β for light reflected from a passive diffuse reflector stylus according to some examples of the present disclosure.
Figs. 10D-1 through 10D-3 illustrate reflected energy profiles of light that has been reflected from a diffuse reflecting surface of a passive diffuse reflector stylus at three different tilt angles and now impinges on an array of detectors in an optical sensing system, according to examples of the present disclosure.
Fig. 11A illustrates a cross-sectional view of a portion of an optical stylus system including a passive, retro-reflective stylus and an optical sensing system having an array of optical light emitting and/or detecting devices, according to some examples of the present disclosure.
Fig. 11B shows a symbolic representation of a cross-section of a retroreflector facet to illustrate the principle of retroreflection, according to some examples of the present disclosure.
Fig. 11C-1 illustrates a portion of an optical stylus system having a passive, retro-reflective stylus with a retro-reflective surface including retro-reflector facets, according to some examples of the present disclosure.
Fig. 11C-2 illustrates a retroreflector facet according to some examples of the present disclosure.
Figs. 11D-1 through 11D-3 illustrate energy profiles of light that has been reflected from the retroreflective surface of a passive retroreflective stylus at three different tilt angles and now impinges on an array of detectors in an optical sensing system, according to an example of the present disclosure.
Fig. 12A illustrates a cross-sectional view of a portion of an optical stylus system including a passive diffraction reflector stylus and an optical sensing system having an array of light emitting and/or detecting devices, according to some examples of the present disclosure.
Fig. 12B illustrates a perspective view of a portion of an optical stylus system with a passive diffraction reflector stylus and an optical sensing system, according to some examples of the present disclosure.
Fig. 12C illustrates a stylus pattern and a corresponding reflected light pattern that occurs at an optical sensing system according to some examples of the present disclosure.
Fig. 12D illustrates an alternative reflected light pattern that occurs at an optical sensing system according to some examples of the present disclosure.
Fig. 13A illustrates a plan view of a portion of an optical sensing system including an array of light emitting devices operating with a semi-active light detection stylus, according to some examples of the present disclosure.
Fig. 13B illustrates a semi-active stylus with a light detection device embedded within the tip of the stylus, according to some examples of the present disclosure.
Fig. 13C-1 illustrates a semi-active stylus with a light detection device embedded within the tip of the semi-active stylus and an additional light detection device embedded within the side of the stylus, according to some examples of the present disclosure.
Fig. 13C-2 illustrates a view of a semi-active stylus with a light detection device along its axis according to some examples of the present disclosure.
Fig. 13D illustrates a touch node electrode implemented within a portion of the optical sensing system shown in fig. 13A and including a micro LED module and micro driver block configured to emit modulated light to a semi-active stylus, according to some examples of the present disclosure.
Fig. 13E illustrates a light detection device that may be embedded within the semi-active stylus illustrated in fig. 13B, 13C-1, or 13C-2 and configured to detect modulated light emitted from one or more light emitting devices in the array of fig. 13A, according to some examples of the present disclosure.
Fig. 13F is a flow chart for estimating a position of a semi-active stylus on or over an optical sensing system including an array of light emitting devices, according to some examples of the present disclosure.
Figs. 13G-1, 13G-2, and 13G-3 illustrate a symbolic optical sensing system with 16 light emitting devices and two semi-active stylus positions, according to some examples of the present disclosure.
Fig. 13G-4 illustrates groups of nine light emitting devices that may emit light at up to nine different modulation frequencies and up to nine different phases, according to some examples of the present disclosure.
Fig. 14A illustrates a perspective view of a portion of an optical stylus system having an optical sensing system and an active light-emitting stylus including a light-emitting device, according to some examples of the present disclosure.
Fig. 14B illustrates a portion of an optical stylus system with an active stylus including a laser and patterned apertures according to some examples of the present disclosure.
Fig. 14C illustrates two illumination patterns occurring at an optical sensing system having an array of light detection devices according to some examples of the present disclosure.
Fig. 15A illustrates a cross-sectional view of a portion of an optical stylus system including an active stylus, having a light emitting device, a separating element, and a plurality of light detecting devices, in contact with or hovering over an optical sensing system of a display device, according to some examples of the present disclosure.
Fig. 15B illustrates a plan view of a portion of an optical sensing system having a display element and a retroreflector layer according to some examples of the present disclosure.
Detailed Description
In the following description of the examples, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration specific examples which may be practiced. It is to be understood that other examples may be utilized and structural changes may be made without departing from the scope of the disclosed examples.
Fig. 1A-1E illustrate an optical stylus system including an optical stylus and an electronic device including an optical sensing system, in which one or more of a target or touch location, centroid, hover distance, tilt angle, azimuth, and in some cases orientation and rotation of the optical stylus touching or approaching a surface may be determined, according to examples of the present disclosure. Fig. 1A illustrates a mobile phone 100 including an optical sensing system with an integrated touch screen 102 that operates with an optical stylus 114 to determine one or more of a target or touch location, centroid, hover distance, tilt angle, azimuth angle, and in some cases orientation and rotation of the optical stylus, according to examples of the present disclosure. Fig. 1B illustrates a digital media player 104 including an optical sensing system with an integrated touch screen 102 that operates with an optical stylus 114 to determine one or more of a target or touch location, centroid, hover distance, tilt angle, azimuth, and in some cases orientation and rotation of the optical stylus, according to examples of the present disclosure. Fig. 1C illustrates a personal computer 106 including an optical sensing system with a touch pad 108 and an integrated touch screen 102 that operate with an optical stylus 114 to determine one or more of a target or touch location, centroid, hover distance, tilt angle, azimuth angle, and in some cases orientation and rotation of the optical stylus, according to examples of the present disclosure. Fig. 1D illustrates a tablet computer 110 including an optical sensing system with an integrated touch screen 102 that operates with an optical stylus 114 to determine one or more of a target or touch location, centroid, hover distance, tilt angle, azimuth, and in some cases orientation and rotation of the optical stylus, according to examples of the disclosure. Fig. 
1E illustrates a wearable device 112 (e.g., a watch) including an optical sensing system with an integrated touch screen 102 that operates with an optical stylus 114 to determine one or more of a target or touch location, centroid, hover distance, tilt angle, azimuth, and in some cases orientation and rotation of the optical stylus, according to examples of the disclosure. It should be appreciated that the integrated touch screen described above may also be implemented in other devices. Additionally, it should be understood that although the disclosure herein focuses primarily on integrated touch screens, some examples of the present disclosure also apply to touch sensor panels without a corresponding display.
FIG. 2A is a block diagram of a computing system 214 showing one implementation of an integrated touch screen 202 in which one or more of a target or touch location, centroid, hover distance, tilt angle, azimuth angle, and in some cases orientation and rotation of an optical stylus touching or approaching a surface may be determined, in accordance with examples of the present disclosure. As described in greater detail herein, the integrated touch screen 202 can include micro Light Emitting Diodes (LEDs) 216, or Organic LEDs (OLEDs), and a chiplet 218 (e.g., an integrated chiplet including LED/OLED drivers, touch sensing circuitry, and/or optical sensing circuitry). In some examples, the functionality of the chiplet can be divided into separate display chiplets 220 (e.g., including LED/OLED drivers) and touch chiplets 222 (e.g., including touch sensing circuitry and/or optical sensing circuitry). The computing system 214 may be included, for example, in the mobile phone 100, digital media player 104, personal computer 106, tablet computer 110, or wearable device 112, or any mobile or non-mobile computing device including a touch screen, as shown in figs. 1A-1E. Computing system 214 can include an optical sensing system such as an integrated touch and display module 224, a host processor 226 (which can include one or more processors), and a program storage device 228. The integrated touch and display module 224 may include the integrated touch screen 202 and integrated circuitry for controlling the operation of the touch screen 202. In some examples, integrated touch and display module 224 may be formed on a single substrate with micro-LEDs 216 and chiplets 218 (or display chiplets 220 and/or touch chiplets 222) of integrated touch screen 202 on one side of the single substrate and integrated circuitry to control the operation of micro-LEDs 216 and chiplets 218 mounted on the opposite side of the single substrate.
Forming the integrated touch and display module 224 in this manner may provide simplified manufacturing and assembly of a device having a touch screen. In some examples, integrated touch and display module 224 may be formed on a single substrate with micro-LEDs 216 on one side of the substrate and chiplets 218 (or display chiplets 220 and/or touch chiplets 222) of integrated touch screen 202 mounted on the opposite side of the single substrate along with integrated circuitry that controls the operation of micro-LEDs 216 and chiplets 218.
The integrated circuits for controlling the operation of the touch screen 202 may include an integrated touch and display Integrated Circuit (IC) (touch and display controller) 230, a Power Management Unit (PMU) 232, and optionally a protection integrated circuit (protection IC) 234. In some examples, protection IC 234 may be used to operate integrated touch and display module 224 in a protected power domain during protected touch operations and to otherwise operate touch and display module 224 in a chassis power domain (e.g., during unprotected touch operations or during display operations). The power management unit 232 may be an integrated circuit configured to provide the voltages necessary for the touch and display controller 230, including a protection reference power supply when operating in the protected power domain. The touch and display controller 230 may include circuitry for performing touch sensing, optical sensing, and display operations. Although illustrated in fig. 2A as a single integrated circuit, the various components and/or functionality of touch and display controller 230 may be implemented with multiple circuits, elements, chips, and/or discrete components (e.g., separate touch integrated circuits and separate display integrated circuits, with integrated circuits for handling switching therebetween).
The touch and display controller 230 may include display circuitry 236 for performing display operations. The display circuitry 236 may include hardware for processing one or more still images and/or one or more video sequences for display on the integrated touch screen 202. The display circuitry 236 may be configured to generate read memory operations to read data representing a frame/video sequence from a memory (not shown) via, for example, a memory controller (not shown), or may receive data representing a frame/video sequence from the host processor 226. The display circuitry 236 may be configured to perform various processing on image data (e.g., still images, video sequences, etc.). In some examples, the display circuitry 236 may be configured to scale still images and to dither, scale, and/or perform color space conversion on frames of a video sequence. The display circuitry 236 may be configured to mix still image frames with video sequence frames to produce output frames for display. The display circuitry 236 may also be more generally referred to as a display controller, a display pipe, a display control unit, or a display pipeline. A display control unit may generally be any hardware and/or firmware configured to prepare frames for display from one or more sources (e.g., still images and/or video sequences). More specifically, the display circuitry 236 may be configured to retrieve source frames from one or more source buffers stored in memory, composite frames from the source buffers, and display the resulting frames on the integrated touch screen 202. Accordingly, the display circuitry 236 may be configured to read one or more source buffers and composite the image data to produce an output frame. The display circuitry 236 can provide various control and data signals to the display via the chiplet 218 (or via the display chiplet 220), including timing signals (e.g., one or more clock signals) and pixel selection signals.
The timing signal may include a pixel clock that may indicate a pixel transfer. The data signals may include color signals (e.g., red, green, blue) for the micro LEDs 216. The display circuitry may control the integrated touch screen 202 in real time to provide data indicative of the pixels to be displayed while the touch screen is displaying the image indicated by the frame. Such an interface of integrated touch screen 202 may be, for example, a Video Graphics Array (VGA) interface, a High Definition Multimedia Interface (HDMI), a Mobile Industry Processor Interface (MIPI), a Digital Video Interface (DVI), an LCD/LED/OLED interface, a plasma interface, or any other suitable interface.
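As a toy illustration of the frame composition described above (mixing a still-image frame with a video-sequence frame into one output frame), a per-pixel alpha blend might look like the following; the frame layout and blend equation are illustrative assumptions, not the controller's actual hardware path:

```python
def blend_frames(still, video, alpha):
    """Blend a still-image layer over a video layer, pixel by pixel.
    Frames are rows of (r, g, b) tuples; alpha=1.0 shows only the still layer.
    (Frame layout and blend equation are illustrative assumptions.)"""
    return [[tuple(round(alpha * s + (1.0 - alpha) * v)
                   for s, v in zip(still_px, video_px))
             for still_px, video_px in zip(still_row, video_row)]
            for still_row, video_row in zip(still, video)]

still = [[(255, 0, 0), (255, 0, 0)]]  # one row of red pixels
video = [[(0, 0, 255), (0, 0, 255)]]  # one row of blue pixels
mixed = blend_frames(still, video, 0.5)
```

In hardware, each composited output frame would then be streamed to the display through the chiplets at the pixel clock rate rather than materialized as Python lists.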
The touch and display controller 230 may include touch circuitry 238 for performing touch operations. Touch circuitry 238 may include one or more touch processors, peripherals (e.g., Random Access Memory (RAM) or other types of memory or storage, watchdog timers, and the like), and a touch controller. The touch controller can include, but is not limited to, channel scan logic (e.g., implemented in programmable logic or as hard-coded logic) that can provide configuration and control for touch sensing operations of the chiplet 218 (or the touch chiplet 222). For example, the touch chiplet 222 can be configured to drive, sense, and/or ground touch node electrodes according to a mode of touch sensing operation. Additionally or alternatively, the chiplet 218 (or touch chiplet 222) can be configured for optical sensing (e.g., using the touch circuitry 238 of the touch and display controller 230 or using separate circuitry and separate controllers for optical sensing operations). In some examples, the pattern of touch sensing and/or optical sensing operations may be determined by a scan plan stored in a memory (e.g., RAM) in touch circuitry 238. The scan plan may provide a sequence of scan events to be performed during a frame. The scan plan can also include information necessary to provide control signals to the chiplet 218, program the chiplet, and analyze data from the chiplet 218 in accordance with the particular scan event to be performed. Scan events may include, but are not limited to, mutual capacitance scanning, self-capacitance scanning, stylus scanning, touch spectral analysis scanning, stylus spectral analysis scanning, and optical sensing scanning. Channel scan logic or other circuitry in touch circuitry 238 may provide stimulation signals of various frequencies and phases that may be selectively applied to touch node electrodes of integrated touch screen 202 or used for demodulation, as described in more detail below.
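The scan plan described above can be pictured as an ordered list of scan events stored in memory and executed once per frame. The event names mirror the scan types listed above; the data layout and field names are illustrative assumptions, not taken from this disclosure:

```python
# Each entry names a scan event and the configuration pushed to the chiplets
# for that event (fields are illustrative assumptions, not from the disclosure).
SCAN_PLAN = [
    {"event": "mutual_capacitance_scan", "stim_freq_hz": 300_000},
    {"event": "self_capacitance_scan",   "stim_freq_hz": 300_000},
    {"event": "stylus_scan",             "stim_freq_hz": None},
    {"event": "optical_sensing_scan",    "stim_freq_hz": None},
]

def run_frame(scan_plan, execute_event):
    """Run one frame's scan events in order. `execute_event` stands in for
    programming the chiplets and collecting the resulting data."""
    return [execute_event(event) for event in scan_plan]

executed = run_frame(SCAN_PLAN, lambda event: event["event"])
```

Storing the sequence as data rather than hard-coding it is what lets the controller reconfigure the mix of touch, stylus, and optical scans from frame to frame.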
Touch circuitry 238 can also receive touch data from chiplet 218 (or touch chiplet 222), store the touch data in memory (e.g., RAM), and/or process the touch data (e.g., by one or more touch processors or touch controllers) to determine touch locations and/or clean operating frequencies for touch sensing operations (e.g., via spectral analysis). The touch circuitry 238 (or a separate optical sensing circuit) can also receive ambient light data from the chiplet 218 (or touch chiplet 222), store the ambient light data in memory (e.g., RAM), and/or process the ambient light data (e.g., via one or more touch processors or touch controllers, or optical sensing processors/controllers) to determine an ambient light condition.
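The spectral-analysis step above amounts to measuring noise at each candidate stimulation frequency and selecting the quietest one as the operating frequency. A minimal sketch, where the noise values and names are illustrative assumptions:

```python
def pick_clean_frequency(noise_by_freq):
    """Return the candidate stimulation frequency with the least measured
    noise, as a spectral-analysis scan would select it."""
    return min(noise_by_freq, key=noise_by_freq.get)

# Illustrative noise amplitudes (arbitrary units) per candidate frequency,
# as might be gathered by a spectral-analysis scan event.
noise = {250_000: 0.9, 300_000: 0.2, 350_000: 0.5}
clean = pick_clean_frequency(noise)  # → 300000
```

Re-running this selection periodically lets the system hop away from frequencies that have become noisy (e.g., due to a charger or display interference).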
The integrated touch screen 202 may be used to derive touch data at a plurality of discrete locations of the touch screen, referred to herein as touch nodes. For example, the integrated touch screen 202 may include touch sensing circuitry that may include a capacitive sensing medium having a plurality of electrically isolated touch node electrodes. Touch node electrodes can be coupled to chiplet 218 (or touch chiplet 222) for touch sensing through the sense channel circuitry. As used herein, an electronic component "coupled to" or "connected to" another electronic component includes a direct or indirect connection that provides an electrical path for communication or operation between the coupled components. Thus, for example, touch node electrodes of integrated touch screen 202 can be connected directly to chiplet 218 or indirectly to chiplet 218 (e.g., to touch chiplet 222 via display chiplet 220), but in either case provide electrical paths for driving and/or sensing touch node electrodes. Labeling the conductive plates (or groups of conductive plates) used for detecting touch as touch node electrodes corresponding to touch nodes (discrete locations of the touch screen) may be particularly useful when the integrated touch screen 202 is viewed as capturing an "image" of touch (or "touch image"). The touch image may be a two-dimensional representation of values indicative of an amount of touch detected at each touch node electrode corresponding to a touch node in the integrated touch screen 202. The pattern of touch nodes where a touch occurs may be considered a touch image (e.g., a pattern of fingers touching a touch screen). In such examples, each touch node electrode in the pixelated touch screen may be sensed for a corresponding touch node represented in the touch image.
The host processor 226 may be connected to a program storage device 228 (e.g., a non-transitory computer readable storage medium) to execute instructions stored in the program storage device 228. The host processor 226 may provide, for example, control and data signals such that the touch and display controller 230 may generate a display image, such as a display image of a User Interface (UI), on the integrated touch screen 202. Host processor 226 can also receive output from touch and display controller 230 (e.g., touch input from one or more touch processors, ambient light information, etc.) and perform actions based on these outputs. Touch input can be used by a computer program stored in program storage device 228 to perform actions, which can include, but are not limited to: moving an object (such as a cursor or pointer), scrolling or panning, adjusting control settings, opening files or documents, viewing menus, making selections, executing instructions, operating peripheral devices connected to the host device, answering telephone calls, placing telephone calls, terminating telephone calls, changing volume or audio settings, storing information related to telephone communications (such as addresses, frequently dialed numbers, missed calls), logging onto a computer or computer network, allowing authorized individuals to access restricted areas of the computer or computer network, loading user profiles associated with preferred arrangements of the user's computer desktop, allowing access to web page content, launching specific programs, encrypting or decoding messages, and the like. Host processor 226 may also perform additional functions that may not be relevant to touch processing, optical sensing, and display.
It is noted that one or more of the functions described herein, including the configuration and operation of the chiplet, may be performed by firmware stored in memory (e.g., one of the peripherals in touch and display controller 230) and executed by one or more processors (in touch and display controller 230), or stored in program storage 228 and executed by host processor 226. The firmware may also be stored and/or transported within any non-transitory computer-readable storage medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a "non-transitory computer readable storage medium" may be any medium (excluding signals) that can contain or store the program for use by or in connection with the instruction execution system, apparatus, or device. The computer-readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, a portable computer diskette (magnetic), a Random Access Memory (RAM) (magnetic), a read-only memory (ROM) (magnetic), an erasable programmable read-only memory (EPROM) (magnetic), a portable optical disc such as a CD, CD-R, CD-RW, DVD, DVD-R, or DVD-RW, or a flash memory such as a compact flash card, a secure digital card, a USB memory device, a memory stick, etc.
The firmware may also be propagated within any transport medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a "transport medium" may be any medium that can communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. Transport media can include, but are not limited to, electronic, magnetic, optical, electromagnetic, or infrared wired or wireless propagation media.
It should be appreciated that computing system 214 is not limited to the components and configuration of fig. 2A, but may include other components or additional components in a variety of configurations according to various examples. Additionally, components of computing system 214 may be included within a single device or may be distributed among multiple devices. In some examples, PMU 232 and protection IC 234 may be integrated into a power management and protection integrated circuit. In some examples, the power management and protection integrated circuit may provide (e.g., a protection reference) power and protection signals to the touch screen 202 directly rather than via the touch and display IC 230. In some examples, the touch and display IC 230 may be directly coupled to the host processor 226, and a portion of the touch and display IC 230 in communication with the chiplet 218 may be included in an isolation well (e.g., deep N-well isolation) referenced to the protection signal from the protection IC 234. In some examples, computing system 214 may include an energy storage device (e.g., a battery). In some examples, computing system 214 may include wired or wireless communication circuitry (e.g., bluetooth, wiFi, etc.).
The integrated touch screen 202 may be manufactured such that touch sensing circuit elements of the touch sensing system may be integrated with the display stack-up and some of the circuit elements may be shared between touch operations and display operations. It should be noted that circuit elements are not limited to entire circuit components, such as an entire capacitor, an entire transistor, and the like, but can also include portions of a circuit, such as a conductive plate.
Fig. 2B is a block diagram of one implementation of a semi-active or active optical stylus forming part of an optical stylus system for detecting one or more of a target or touch location, centroid, hover distance, tilt angle, azimuth, and in some cases orientation and rotation of the optical stylus touching or approaching a surface, according to examples of the present disclosure. As described in more detail herein, the exemplary semi-active or active optical stylus 280 of fig. 2B may include a micro Light Emitting Diode (LED) 201, or an Organic LED (OLED), and a chiplet 203 (e.g., an integrated chiplet including LED/OLED drivers and optical sensing circuitry). In some examples, the functionality of the chiplet can be divided into separate illumination chiplets 282 (e.g., including LED/OLED drivers) and optical sensing chiplets 284 (e.g., including optical sensing circuitry). Semi-active or active optical stylus 280 may be included in any semi-active or active stylus that includes a light emitting and/or light detecting device. Semi-active or active optical stylus 280 may include integrated optical sensing and illumination module 286, processor 288 (which may include one or more processors), and program storage 205.
The integrated optical sensing and illumination module 286 may include an integrated optical sensing and illumination controller 290 and a Power Management Unit (PMU) 207. The power management unit 207 may be an integrated circuit configured to provide the voltage necessary for the optical sensing and lighting controller 290. The optical sensing and illumination controller 290 may include circuitry for performing optical sensing and illumination (light detection and illumination) operations. Although shown as a single integrated circuit in fig. 2B, the various components and/or functionality of the optical sensing and illumination controller 290 may be implemented with multiple circuits, elements, chips, and/or discrete components (e.g., separate optical sensing integrated circuits and separate illumination integrated circuits, with integrated circuits for handling switching therebetween).
The optical sensing and illumination controller 290 may include illumination circuitry 292 for performing illumination operations. The illumination circuitry 292 can provide various control and data signals, including timing signals (e.g., one or more clock signals), to the light emitting device (e.g., the micro LED 201) via the chiplet 203 (or via the illumination chiplet 282).
The optical sensing and illumination controller 290 may include sensing circuitry 294 for performing optical sensing (light detection) operations. The sensing circuitry 294 may include one or more processors, peripherals (e.g., Random Access Memory (RAM) or other types of memory or storage, watchdog timers, and the like), and an optical sensing controller. The optical sensing controller may include, but is not limited to, logic (e.g., implemented in programmable logic or as hard-coded logic) that may provide configuration and control for the light detection operation of the chiplet 203 (or optical sensing chiplet 284). The sensing circuitry 294 may also receive light detection data from the chiplet 203 (or optical sensing chiplet 284), store the light detection data in memory (e.g., RAM), and/or process the light detection data (e.g., by one or more processors or controllers) to determine one or more of the frequency, wavelength, and amplitude of the detected light at different times. The sensing circuitry 294 (or a separate optical sensing circuit) may also receive ambient light data from the chiplet 203 (or optical sensing chiplet 284), store the ambient light data in memory (e.g., RAM), and/or process the ambient light data (e.g., by one or more processors or controllers, or an optical sensing processor/controller) to determine an ambient light condition.
The processor 288 may be connected to a program storage device 205 (e.g., a non-transitory computer readable storage medium) to execute instructions stored in the program storage device. The processor 288 may provide, for example, control and data signals such that the optical sensing and illumination controller 290 may cause the stylus to emit or detect light. The processor 288 may also perform additional functions that may not be related to optical illumination or detection.
It is noted that one or more of the stylus functions described herein, including the configuration and operation of the chiplet, may be performed by firmware stored in memory (e.g., one of the peripherals in the optical sensing and illumination controller 290) and executed by one or more processors (in the optical sensing and illumination controller 290), or stored in the program storage device 205 and executed by the processor 288. The firmware may also be stored and/or transported within any non-transitory computer-readable storage medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a "non-transitory computer readable storage medium" may be any medium (excluding signals) that can contain or store the program for use by or in connection with the instruction execution system, apparatus, or device. The computer-readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, a portable computer diskette (magnetic), a Random Access Memory (RAM) (magnetic), a read-only memory (ROM) (magnetic), an erasable programmable read-only memory (EPROM) (magnetic), a portable optical disc such as a CD, CD-R, CD-RW, DVD, DVD-R, or DVD-RW, or a flash memory such as a compact flash card, a secure digital card, a USB memory device, a memory stick, etc.
The firmware may also be propagated within any transport medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a "transport medium" may be any medium that can communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. Transport media can include, but are not limited to, electronic, magnetic, optical, electromagnetic, or infrared wired or wireless propagation media.
It should be appreciated that semi-active or active optical stylus 280 is not limited to the components and configuration of fig. 2B, but may include other or additional components in a variety of configurations according to various examples, or fewer components in a variety of configurations in other examples. In some examples, semi-active or active optical stylus 280 may include an energy storage device (e.g., a battery). In some examples, semi-active or active optical stylus 280 may include wired or wireless communication circuitry (e.g., bluetooth, wiFi, etc.).
Fig. 3A-3B illustrate stacked structures of an integrated touch screen in which one or more of a target or touch location, centroid, hover distance, tilt angle, azimuth angle, and in some cases orientation and rotation of an optical stylus touching or approaching a surface may be determined, according to examples of the present disclosure. FIG. 3A illustrates an exemplary stacked structure of a touch screen that includes chiplets (or touch chiplets and display chiplets) in the visible region of the display. The integrated touch screen 302A includes a substrate 340 (e.g., a printed circuit board) on which chiplets (or touch chiplets and/or display chiplets) and micro-LEDs can be mounted in a touch and display circuit layer 342. In some examples, the chiplets and/or the micro-LEDs may be partially or fully embedded in the substrate (e.g., the components may be placed in recesses in the substrate). In some examples, chiplets can be mounted on one and/or both sides of substrate 340. For example, some or all of the chiplets can be mounted on the second side of the substrate 340 (or some or all of the touch chiplets and/or some or all of the display chiplets can be mounted on the second side of the substrate 340). In some examples, the chiplets can be disposed on a second side of the substrate (opposite the first side of the substrate including the micro-LEDs). FIG. 3B illustrates an exemplary stacked structure of a touch screen 302B that includes chiplets (or touch chiplets and/or display chiplets) outside the visible area of the display. Unlike the stacked structure of integrated touch screen 302A, in which the chiplets and micro-LEDs may both be mounted in touch and display circuit layer 342, the stacked structure of integrated touch screen 302B may include chiplets mounted in touch and display circuit layer 342 on a second (bottom) side of substrate 340, separate from the micro-LEDs mounted in display pixel layer 346 on a first (top, visible) side of substrate 340.
In some examples, placing the chiplets on the second side of the substrate can allow for uniform spacing of the micro-LEDs and/or increased density of micro-LEDs on the first side of the substrate 340.
Substrate 340 may include routing traces in one or more layers to route signals between the micro-LEDs, the chiplets, and the touch and display controller. The substrate 340 may also optionally include a guard plane 348 for guard operations. Although shown on the bottom of the substrate 340 in fig. 3A, the guard plane 348 may be formed in a layer of the substrate 340 other than the bottom layer (e.g., in an interior layer of the substrate 340, as shown in fig. 3B).
After the micro-LEDs and chiplets are mounted in the touch and display circuit layer 342 in fig. 3A (e.g., during pick and place assembly), a planarization layer (e.g., transparent epoxy) can be deposited over the micro-LEDs and chiplets. A planarization layer may likewise be deposited over the micro-LEDs in display pixel layer 346 in the stacked structure of fig. 3B. A fully or partially transparent conductor layer 350 (e.g., ITO) may be deposited over the planarized touch and display circuit layer 342 in fig. 3A or over the display pixel layer 346 in fig. 3B. The conductor layer 350 may include a pattern of individual conductor plates that may be used to integrate the touch and display functions of the touch screen 302A or 302B. For example, during a display operation (and/or an optical sensing operation), an individual conductor plate may be used as a cathode terminal of a micro-LED, and a group of conductor plates may form a touch node electrode for a touch operation. A polarizer 352 may be disposed over the transparent conductor layer 350 (optionally with another planarization layer between them). A cover material or glass (or front crystal) 354 may be disposed over the polarizer 352 and form an outer surface of the integrated touch screen 302A or 302B. The stacked structure of integrated touch screen 302A and/or 302B may provide a number of benefits, including: reduced cost (e.g., due to simplified assembly of devices including integrated touch and display modules, and a reduced number of integrated circuits by combining touch and display functionality into an integrated touch and display controller); reduced stack height (the shared conductor layer eliminates a separate touch node electrode layer, and integrating chiplets (or touch chiplets and display chiplets) into the stacked structure on the same layer as the micro-LEDs does not increase the stack height of fig. 3A); simplified support for guarded self-capacitance scanning (by including touch circuitry on the integrated touch and display module and extending a guard plane through the substrate of the integrated touch and display module); and reduced edge area around the touch screen (because routing may be done through the substrate instead of in the edge area).
Fig. 4A illustrates a portion of a conductive layer and corresponding circuitry of a portion of an example touch and display circuit layer in an optical sensing system in which one or more of a target or touch location, centroid, hover distance, tilt angle, azimuth angle, and in some cases orientation and rotation of an optical stylus touching or approaching a surface may be determined, according to examples of the present disclosure. The integrated touch screen may include a conductive layer (e.g., corresponding to conductor layer 350 in fig. 3A or 3B), a portion of which is shown in fig. 4A as touch pixel 456. Touch pixel 456 can define an area of X1 by Y1 (e.g., 5 mm x 5 mm) that includes 16 touch node electrodes 458, although in other examples a different number of touch node electrodes can be employed. Each touch node electrode 458 may be formed of 16 ITO groups 460 (e.g., eight rows and two columns in the orientation of fig. 4A) and may define an X2 by Y2 (e.g., 1.25 mm x 1.25 mm) area that is smaller than X1 by Y1, although in other examples a different number of ITO groups and a different number of rows and columns may be employed. In some examples, touch chiplets can be used to measure touch for some or all of the smaller areas (e.g., touch node electrodes 458 having an X2 by Y2 area), as described herein. In some examples, a touch image for determining a touch input from a user to an integrated touch screen may combine these touch measurements of some or all of the smaller areas into a touch image with a lower resolution corresponding to the larger area (e.g., touch pixels with an X1 by Y1 area), as described herein.
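The combining step described above (folding the 1.25 mm touch node measurements into 5 mm touch pixels, i.e., a 4 x 4 group of nodes per pixel) can be sketched as a block sum; the function name and equal-weight summation are illustrative assumptions, not taken from this disclosure:

```python
def downsample_touch_image(fine, block=4):
    """Combine a fine-grained touch image (one value per touch node
    electrode) into a lower-resolution touch image (one value per touch
    pixel) by summing each block x block group of node measurements."""
    rows, cols = len(fine), len(fine[0])
    return [[sum(fine[r + dr][c + dc]
                 for dr in range(block) for dc in range(block))
             for c in range(0, cols, block)]
            for r in range(0, rows, block)]

# 4 x 8 grid of node measurements → 1 x 2 grid of touch pixel values.
fine = [[1, 1, 1, 1, 2, 2, 2, 2]] * 4
coarse = downsample_touch_image(fine)  # → [[16, 32]]
```

Keeping the fine-grained node measurements available while reporting the coarser touch image lets the system trade spatial resolution for signal-to-noise ratio per touch pixel.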
Fig. 4A also shows an expanded view of touch node electrode 458 showing the component ITO banks 462 of an ITO group 460 and touch and display circuitry (e.g., components corresponding to touch and display circuitry layer 342), according to some examples. The touch and display circuitry may include micro-LEDs 464 (with the exemplary micro-LED subpixels 464-R, 464-G, and 464-B shown in fig. 4A), display chiplets, and touch chiplets (not shown), although in other examples, LEDs other than micro-LEDs may be employed and the use of chiplets is not required. In another embodiment, the display and touch functions may be integrated into a single chiplet. Touch node electrode 458 of fig. 4A includes 128 ITO banks 462 (i.e., eight ITO banks 462 per ITO group 460), although in other examples, a different number of ITO banks may be used. In some examples, each ITO bank may be formed over a micro LED bank and may serve as a cathode terminal for the micro LED bank during a display operation, and may be coupled to one or more display chiplets to update micro LEDs in a corresponding ITO group. As shown in fig. 4A, each ITO bank 462 may serve as a cathode for two display pixels (e.g., each including red, green, and blue sub-pixels). In some examples, each ITO bank 462 may serve as a cathode for more or fewer display pixels.
During touch operations, in some examples, the ITO banks 462 may be coupled together to form touch node electrodes 458, and the touch node electrodes 458 may be coupled to one or more touch chiplets (not shown) for touch sensing operations.
As shown in fig. 4A, one or more display chiplets can include a display micro-driver 470 and a switch 444. The display micro-driver 470 may be coupled to one or more red, green, and blue LED/OLED devices 464-R, 464-G, and 464-B, such as micro-LEDs that emit red, green, and blue light, respectively. The RGB arrangement is exemplary, and other examples may include alternative subpixel arrangements (e.g., red-green-blue-yellow (RGBY), red-green-blue-yellow-cyan (RGBYC), or red-green-blue-white (RGBW)) or other subpixel matrix schemes in which the pixels may have different numbers of subpixels.
Display micro-driver 470 may include current drivers coupled to the anodes of the sub-pixel elements in two pixel columns. For example, the anodes of each blue sub-pixel in a first pixel column may be coupled together and to one of the current drivers, while the anodes of each blue sub-pixel in a second pixel column may be coupled together and to a different one of the current drivers. Likewise, the anodes of the green or red subpixels in the first and second pixel columns may be coupled together and to a corresponding current driver, respectively. Thus, during a display operation, the illumination of each pixel in the ITO group 460 can be addressed by using one or more switches to select one of the ITO banks 462 and adjusting the operating current of the corresponding current driver in the display micro-driver. In some examples, refresh and/or timing signals may be provided by the touch and display controller to address each LED device individually to enable asynchronous or adaptively synchronous display updating. In some examples, the display brightness may be adjusted by manipulating a reference voltage (not shown) supplied to the display micro-driver.
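The bank-select-then-drive addressing sequence described above can be modeled with a toy class (all names are hypothetical; the real hardware uses analog switches and current drivers, not software state):

```python
# Illustrative software model (hypothetical names) of the addressing scheme:
# one current driver per subpixel column, with a bank-select switch choosing
# which ITO bank's pixels the drivers currently update.

class MicroDriverModel:
    def __init__(self, num_banks, drivers_per_bank):
        self.selected_bank = None
        self.currents = [[0.0] * drivers_per_bank for _ in range(num_banks)]

    def select_bank(self, bank):
        """Model closing the switch that selects one ITO (cathode) bank."""
        self.selected_bank = bank

    def set_current(self, driver, amps):
        """Model programming one current driver for the selected bank."""
        self.currents[self.selected_bank][driver] = amps

drv = MicroDriverModel(num_banks=8, drivers_per_bank=6)
drv.select_bank(3)
drv.set_current(0, 2e-6)  # e.g., blue subpixels of the first pixel column
```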
As described above, during a display operation, one or more switches may select a respective ITO bank (e.g., one of the ITO banks may be selected using a multiplexer or a corresponding group of discrete switches) to couple to a cathode node, which in turn is coupled to Vneg through other switches. During touch operations through the integrated touch screen, one or more switches may instead couple the ITO banks 462 in the ITO group 460 together and couple the ITO group 460 to the touch chiplet. Additionally, the one or more switches may be configured such that the anode and cathode of the LED device are shorted together to prevent noise from the LED (e.g., leakage current or photocurrent) from interfering with touch sensing. Additionally, multiple ITO groups corresponding to multiple display chiplets can be coupled together to form touch node electrodes and can be coupled to one or more touch chiplets.
As mentioned above, ITO banks 462 can be coupled together to form touch node electrodes for touch sensing operations. In some examples, the ITO banks 462 coupled to the chiplets can be coupled together to form touch electrodes using switching circuitry within the chiplets. In some examples, groups of ITO banks may be coupled together using display chiplets to form touch node electrodes for touch sensing operations. Each of the touch node electrodes formed from the groups of ITO banks may be coupled to one of the touch chiplets during a touch operation.
In some examples, the number of ITO banks 462 in the touch node electrodes may be selected according to a desired sensing resolution. In some examples, the number of ITO banks 462 in the touch node may be limited by the space available for the chiplet, which may vary depending on the density of LEDs/display pixels.
As noted above, in some examples, the ITO group 460 may be coupled to both the display chiplet and the touch chiplet. The chiplet can include sensing circuitry (also referred to herein as a sense channel or sense channel circuitry), switching circuitry, and control circuitry. The sensing circuitry may be configured to couple to the ITO group 460 for a sensing operation. The switching circuitry may include switches (e.g., multiplexers, discrete switches, etc.) to implement the display and sensing configurations described herein. For example, the switches may include an ITO switch (cathode switch), an anode switch, and an excitation voltage switch for coupling the touch node electrode to a positive or negative phase excitation signal for a touch sensing operation. The control circuitry may include interface and register circuitry that provides input and output functionality to enable communication between the touch chiplet and the controller and/or host processor and to store configuration information for the chiplet (e.g., configuration of the sense channel circuitry). The control circuit may also include switch control logic configured to operate the switching circuit for display and sensing operations.
Fig. 4B illustrates a block diagram of touch node electrode 458 in accordance with an example of the present disclosure. The size and micro LED density of the touch node electrodes 458 may vary depending on the size of the device, the size of the integrated touch screen, and the desired display and touch and/or proximity sensing granularity. In one non-limiting example, touch node electrode 458 may define an X2 by Y2 area of 1.25 mm × 1.25 mm and include micro LED module 0 and micro LED module 1 and micro driver block 472 located between module 0 and module 1, where each module contains 16 × 16 pixels (assuming a redundant set of pixels). As noted above, the micro-driver block 472 may include some or all of the micro-driver 470, the switch 444, and other circuits such as amplifiers, analog-to-digital converters, filters, demodulators, result registers, and the like. In another non-limiting example, touch node electrode 458 may define an X2 by Y2 area of 3.546 mm × 3.546 mm and include micro LED module 0 and micro LED module 1 and micro driver block 472 located between module 0 and module 1, where each module contains 32 × 64 pixels. Although the example of fig. 4B shows one micro-driver block 472 and two modules, in other examples of the present disclosure, a different number of micro-driver blocks may control a different number of modules.
As noted above, in various embodiments of the present disclosure, an electronic device may detect unmodulated light emitted by a stylus, modulated light emitted by one or more styluses, or modulated/unmodulated light generated by micro LEDs and reflected from an object such as a finger or a passive stylus. Various configurations of micro LED arrays and micro driver circuits may be employed to perform these detections.
Fig. 4C shows an expanded view of a touch node electrode 458 comprising two micro LED modules configured in DC photoconductive mode for detecting unmodulated light emitted by a stylus and a micro driver block 472 according to an example of the present disclosure. In the DC photoconductive mode, light (e.g., unmodulated light) generated by the active stylus is detectable by micro LED 464, which has been configured as a photodetector. In the example of fig. 4C, the anodes of micro LEDs 464 in both module 0 and module 1 may be configured to remain at ground using switch 444, while the cathodes may be reverse biased by coupling to the inverting input of a transimpedance amplifier 466, which may be held at a reference voltage such as 0.65V. The amplifier 466 (also referred to herein as an Analog Front End (AFE)) may be configured as a transimpedance amplifier or charge amplifier to convert the current on the inverting input of the amplifier (indicative of the intensity of the light received at the micro LED) to a voltage on the output of the amplifier using the feedback network of the amplifier. In some examples, the analog output of amplifier 466 may be converted to a differential signal using a single-ended-to-differential (S2D) converter 468, and the differential signal may be converted to a digital signal using a sigma-delta ADC 474 and a subsequent decimation filter 476. In some embodiments, instead of a sigma-delta ADC and a decimation filter, a Nyquist ADC (such as a SAR ADC) may be used. The digital signal at the output of the decimation filter 476 (or Nyquist ADC, if applicable) may be a combination of a DC offset value due to the dark current through the micro LED 464 (reverse bias leakage current) and a dynamic component that is the signal of interest. In the DC photoconductive mode, one or more demodulators 478 (as indicated by the dashed line in fig. 4C) may be bypassed because the light detected at micro LED 464 does not need to be modulated.
The digitized data stream from ADC 474 may then be processed (along with data from other micro LEDs) by one or more downstream processors to produce an image of the illumination pattern indicative of the target location of the active stylus and the intensity of light across the illumination pattern.
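As an illustration of the DC-mode stream described above (a dark-current DC offset plus a dynamic component of interest), the following sketch subtracts a baseline captured with no incident light before forming the illumination image; all names and numbers are hypothetical:

```python
# Sketch of DC photoconductive post-processing (illustrative values): the
# digitized stream is a dark-current offset plus the signal of interest, so a
# baseline measured with no stylus present is removed first.

def estimate_baseline(dark_samples):
    """Estimate the DC offset from a capture taken with no incident light."""
    return sum(dark_samples) / len(dark_samples)

def extract_signal(samples, dark_baseline):
    """Remove the static dark-current offset from digitized detector samples."""
    return [s - dark_baseline for s in samples]

dark = [0.101, 0.099, 0.100, 0.100]           # dark-current-only capture
baseline = estimate_baseline(dark)
stream = [0.100, 0.100, 0.350, 0.600, 0.100]  # stylus light on samples 2-3
signal = extract_signal(stream, baseline)
```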
In the example of fig. 4C, all micro LEDs 464 in both module 0 and module 1 are coupled to a single amplifier 466 and its associated downstream circuitry to produce a single digitized data stream for each touch node electrode. However, it should be understood that in other embodiments, the touch node electrode 458 may be divided into multiple groups of micro-LEDs 464, and each group of micro-LEDs 464 may be coupled to one of a plurality of amplifiers 466 within the micro-driver block 472, each amplifier producing its own digitized data stream for processing by one or more processors.
Fig. 4D shows an expanded view of a touch node electrode 458 comprising two micro LED modules configured in an AC photoconductive mode for detecting modulated light emitted by one or more styli and a micro driver block 472 according to an example of the present disclosure. In the AC photoconductive mode, modulated light generated by an active stylus or multiple active styli, each of which generates modulated light at a different frequency, may be detected by micro LED 464, which has been configured as a photodetector. In the example of fig. 4D, the anodes of micro LEDs 464 in both module 0 and module 1 may be configured to remain at ground using switch 444, while the cathodes may be reverse biased by coupling to the inverting input of a transimpedance (or charge) amplifier 466, which may be held at a reference voltage such as 0.65V. The amplifier 466 may be configured as a transimpedance amplifier as discussed above with respect to fig. 4C. In the AC photoconductive mode, multiple demodulators 478 (finite impulse response (FIR) filters) are required to demodulate the signals received from one or more active styli, each of which may produce light having a different modulation frequency. In some examples, in-phase (I) and quadrature (Q) demodulation (I/Q demodulation) may be employed to achieve phase-agnostic (phase-independent) operation, which may be required because the carrier wave on the stylus may not be synchronized with the demodulation waveform in the electronic device.
Because of the I/Q demodulation, two demodulators 478 are needed for each demodulation frequency, one for the I component and one for the Q component. In some examples, serialized demodulation coefficients 490 can be selected by multiplexer 484 and transmitted to a demodulator. In some implementations, the multipliers in demodulator 478 can be implemented with shift registers and adders, and the serialized demodulation coefficients can be gated into the multipliers over the course of the touch scan. In another embodiment, the serialized demodulation coefficients may be parallelized and then applied to an array multiplier that is time-shared across multiple channels. In the above example, the serialized demodulation coefficients may be generated by a numerically controlled oscillator (NCO) residing in an off-panel Display Driver IC (DDIC). However, in other examples, one or more local NCOs may be used to generate the demodulation waveform. The demodulated output of each demodulator 478 can be accumulated and fed to a result register 488, and the digitized data stream stored in the result register can be processed by one or more downstream processors to produce images of illumination patterns indicative of a target location of each of the one or more active styli, and the intensity of light across each of those illumination patterns.
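The phase-agnostic property of I/Q demodulation mentioned above can be illustrated numerically: multiplying the digitized stream by quadrature references and taking sqrt(I² + Q²) yields the same result for any carrier phase. A minimal sketch with illustrative sample rates and frequencies (not values from the disclosure):

```python
import math

# Minimal sketch of phase-agnostic I/Q demodulation: two demodulators multiply
# the digitized stream by quadrature references, and the combined magnitude is
# independent of the unknown stylus carrier phase. Numbers are illustrative.

def iq_magnitude(samples, freq, sample_rate):
    i_acc = q_acc = 0.0
    for n, s in enumerate(samples):
        ref = 2.0 * math.pi * freq * n / sample_rate
        i_acc += s * math.cos(ref)   # I demodulator
        q_acc += s * math.sin(ref)   # Q demodulator
    return math.hypot(i_acc, q_acc)

fs, f, n_samp = 48000.0, 6000.0, 480
for phase in (0.0, 1.0, 2.5):  # unknown carrier phase of the stylus
    sig = [math.cos(2.0 * math.pi * f * n / fs + phase) for n in range(n_samp)]
    mag = iq_magnitude(sig, f, fs)  # ~ n_samp / 2 regardless of phase
```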
In some examples, pairs of multiple demodulators 478 may be employed within the micro-driver block 472, one pair for each of multiple possible frequencies that may be generated by multiple active styli. However, if some pairs of demodulators are not needed, having all pairs always actively performing digital demodulation on the incoming digitized data stream may waste resources. Thus, in some examples, a spectrum analysis scan may be performed by at least some of the demodulators to determine the incoming modulation frequencies and to determine which channel or channels (pairs of demodulators 478) should perform demodulation at those frequencies. Alternatively, a wireless communication channel may be established between the active stylus and the electronic device to identify the active modulation frequencies. If some of the demodulator pairs do not correspond to any of the determined incoming modulation frequencies, those channels may be deactivated.
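One way to realize the spectrum analysis scan described above is sketched below using the Goertzel algorithm, a common choice for measuring energy at a few candidate frequencies; the algorithm choice and all numbers are assumptions for illustration only:

```python
import math

# Sketch of a spectrum-analysis scan (illustrative): measure the energy at
# each candidate stylus modulation frequency, then activate only the
# demodulator channels whose frequency shows significant energy.

def goertzel_power(samples, freq, sample_rate):
    """Energy of `samples` at `freq` via the Goertzel recurrence."""
    coeff = 2.0 * math.cos(2.0 * math.pi * freq / sample_rate)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2

fs = 48000.0
candidates = [4000.0, 6000.0, 8000.0]  # possible stylus frequencies (assumed)
sig = [math.sin(2.0 * math.pi * 6000.0 * n / fs) for n in range(480)]
powers = {f: goertzel_power(sig, f, fs) for f in candidates}
# Enable only channels with significant energy; deactivate the rest.
active = [f for f in candidates if powers[f] > 0.1 * max(powers.values())]
```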
In the example of fig. 4D, all micro LEDs 464 in both module 0 and module 1 are coupled to a single amplifier 466 and its associated downstream circuitry, and the resulting data stream is then demodulated by multiple demodulators 478 to produce separate results at different frequencies representing different active styli (if there is more than one active stylus). However, it should be understood that in other embodiments, the touch node electrode 458 may be divided into multiple groups of micro-LEDs 464, and each group of micro-LEDs 464 may be coupled to one of a plurality of amplifiers 466 within the micro-driver block 472, each amplifier producing its own digitized data stream for demodulation by multiple demodulators to produce separate results at different frequencies representing different active styli.
Fig. 4E illustrates an expanded view of a touch node electrode 458 comprising two micro LED modules and a micro driver block 472 configured to be in an optically reflective touch mode for emitting modulated light and detecting reflection of the modulated light from an object such as a finger or a passive stylus, according to an example of the present disclosure. In an optically reflective touch mode, modulated light generated by one or more micro-LEDs 464 configured as light illuminators may be reflected from an object such as a finger or a passive stylus and received by one or more micro-LEDs configured as light detectors. In the example of fig. 4E, micro LED 464 in module 1 can be configured as an illuminator by coupling the anode to micro drivers 470 (e.g., current sources) using switches 444 (which are coupled to a reference voltage such as 1.29V), while the cathode can be biased by coupling to a reference voltage such as-3.7V. In some examples, the micro-driver 470 in module 1 may be modulated by receiving an excitation signal from a discrete oscillator 486 (as shown in fig. 4E) or alternatively by receiving coefficients that may be used by a device such as an NCO to generate an excitation signal at a particular modulation frequency. In either case, the micro-driver 470 may be modulated according to the excitation signal to cause the micro-LEDs 464 in the module 1 to generate modulated light.
In the example of fig. 4E, the anode of micro LED 464 in module 0 may be configured as a detector by coupling the anode to a reference voltage, such as ground, using switch 444, while the cathode may be coupled to the inverting input of amplifier 466. The amplifier 466 may be configured as a transimpedance amplifier as discussed above with respect to fig. 4C. In the optically reflective touch mode, two demodulators 478 may be utilized to demodulate modulated light that has been reflected from an object such as a finger or a passive stylus, one demodulator for demodulating the I component of frequency and one demodulator for demodulating the Q component of frequency, as described above with respect to fig. 4D. The demodulated output of each demodulator 478 can be accumulated and fed into a result register 488, and the digitized data stream stored in the result register 488 can be processed by one or more downstream processors to produce an image of the illumination pattern indicative of the target location of the subject and the intensity of light across the illumination pattern.
As discussed above, stylus detection may be performed by detecting light emitted by the stylus or by emitting light and detecting reflection of that light from the stylus. However, accurate detection of emitted or reflected light may depend on the stylus, the cover material, the interface between the surrounding environment and the cover material, and the reflective properties of interfering objects such as water droplets.
Fig. 4F shows an expanded view of a touch node electrode 458 comprising two micro LED modules and a micro driver block 472 in an analog demodulation configuration, according to an example of the present disclosure. Although fig. 4F illustrates analog demodulation in the context of an optically reflective touch mode (e.g., fig. 4E), analog demodulation may also be utilized in a DC photoconductive mode (e.g., fig. 4C) and an AC photoconductive mode (e.g., fig. 4D). In the example of fig. 4F, an analog multiplier 425 is interposed between the S2D converter 468 and the sigma-delta ADC 474. A single demodulator 478 receives the output of decimation filter 476 and, after demodulation, passes its output to a result register 488. In fig. 4F, the sigma-delta ADC 474 can have a lower bandwidth than in the other modes discussed above because the signal has already been down-converted by the analog multiplier 425 using the demodulation signal 423 before reaching the sigma-delta ADC. In this configuration, demodulator 478 receives a demodulation window (as opposed to a sinusoidal demodulation waveform) to obtain improved interference rejection. For the DC photoconductive mode, the analog multiplier 425 is bypassed. For the AC photoconductive mode, the I scan and the Q scan may be performed sequentially (e.g., by separating a 300 μs scan into two 150 μs scans, with the first scan in the I phase and the second scan in the Q phase).
Fig. 5A illustrates a cross-sectional view of a portion of an integrated touch screen 502 including micro LEDs 564, a cover material 554, and an object such as a proximate stylus 596, and transmission of light across the boundary between the object and the cover material, according to an example of the present disclosure. In the example of fig. 5A, the conductive layer 550 is a display layer including a plurality of micro LEDs 564 configured as illuminators and photodetectors (although only three are shown in fig. 5A for the sake of simplifying the drawing). For example, micro-LEDs 564-R may be reverse biased and configured as photodetectors, while micro-LEDs 564-G and 564-B may be configured as illuminators. However, as discussed below, other types and configurations of illuminators and photodetectors may also be employed in FIG. 5A. The cover material 554 is formed over the display layer and may be formed of glass having a refractive index of about 1.5 in one example, although other materials (e.g., plastics) having other refractive indices may also be used. Light 594 (which may include light at any angle that has been emitted from micro-LEDs 564-G and/or 564-B, or light that has undergone one or more reflections within cover material 554) may impinge on an object 596 (e.g., a stylus or other medium) in contact with the detection surface of cover material 554 and reflect back into the cover material at any reflection angle, as shown at 598. Due to reflection from object 596 and/or reflection, absorption, and scattering within object 596, and also due to the similarity of the refractive indices of object 596 and cover material 554, reflection 598 may occur at any angle relative to the surface normal of cover material 554 (see dashed line in fig. 5A) and may in some cases be detected by micro-LEDs 564-R configured as photodetectors.
Fig. 5B illustrates a cross-sectional view of a portion of an integrated touch screen 502 including micro-LEDs 564 and cover material 554, and reflection or refraction of light across the boundary between air and the cover material, according to an example of the present disclosure. In the example of fig. 5B, the conductive layer 550 is a display layer including a plurality of micro LEDs 564 configured as illuminators and photodetectors (although only three are shown in fig. 5B for the sake of simplifying the drawing). For example, micro-LEDs 564-R may be reverse biased and configured as photodetectors, while micro-LEDs 564-G and 564-B may be configured as illuminators. However, as discussed below, other types and configurations of illuminators and photodetectors may also be employed in FIG. 5B. In the example of fig. 5B, light 594 (which may include light at any angle that has been emitted from micro-LEDs 564-G and/or 564-B, or light that has undergone one or more reflections within cover material 554) may impinge from within the cover material on the boundary between the cover material and air and be reflected at least partially back into the cover material as light 595 and, in some cases, detected by micro-LEDs 564-R configured as photodetectors. According to Snell's law, the critical angle (at which light is no longer refracted into air, but is completely reflected back into the cover material) of light 594 (relative to the surface normal) impinging on the detection surface of the cover material (from within cover material 554) can be calculated as θ_crit = sin⁻¹(n₁/n₂), where n₁ is the refractive index of air and n₂ is the refractive index of the cover material.
In addition, light 599 (which may include ambient light, light that has been emitted from or reflected by an object such as a stylus, or light from any other light source) may impinge on the cover material 554 from outside and be refracted as light 595 as it passes through the boundary between air and the cover material. According to Snell's law (the law of refraction), the angle of refraction θ₁ within the cover material is related to the angle of incidence θ₂ of the light impinging on the cover material by sin θ₂ / sin θ₁ = n₂ / n₁, where n₁ is the refractive index of air and n₂ is the refractive index of the cover material. However, at the critical angle of the light impinging from air on the detection surface of the cover material 554, the light will no longer be refracted into the cover material, but will be totally reflected back into the air.
As noted above, the critical angle of light that impinges on the detection surface of the cover material 554 from air as light 599 (below which the light begins to refract into the cover material), and also the critical angle of light that impinges on the detection surface of the cover material from within the cover material as light 594 (above which the light begins to totally reflect back into the cover material) may depend on and be determined (computationally or empirically) by the type of cover material 554 (e.g., glass) and the medium (e.g., air) in contact with the cover material. In the example of fig. 5B, the critical angle may be determined to be +/-42 degrees from the surface normal. For practical applications, the critical angle may include some margin, such as +/-42 degrees +/-1 degree, or +/-42 degrees +/-2% from the surface normal.
Fig. 5C illustrates a cross-sectional view of a portion of an integrated touch screen 502 including micro LEDs 564 and a cover material 554, and reflection or refraction of light across a boundary between a water droplet 597 and the cover material, according to an example of the present disclosure. In the example of fig. 5C, the conductive layer 550 is a display layer including a plurality of micro LEDs 564 configured as illuminators and photodetectors (although only three are shown in fig. 5C for the sake of simplifying the drawing). For example, micro-LEDs 564-R may be reverse biased and configured as photodetectors, while micro-LEDs 564-G and 564-B may be configured as illuminators. However, as discussed below, other types and configurations of illuminators and photodetectors may also be employed in FIG. 5C. In the example of fig. 5C, light 594 (which may include light at any angle that has been emitted from micro-LEDs 564-G and/or 564-B or light that has undergone one or more reflections within cover material 554) may impinge from within the cover material on the boundary between the cover material and water droplets 597 (having a refractive index of about 1.3) in contact with the detection surface of the cover material and reflect at least partially back into the cover material as light 593 and in some cases be detected by micro-LEDs 564-R configured as photodetectors. At the critical angle of the cover material/water droplet interface, light 594 may be totally reflected back into cover material 554. In addition, light 599 (which may include ambient light, light that has been emitted from or reflected by an object such as a stylus, or light from any other light source) may enter water droplet 597 and impinge from within the water droplet on cover material 554 and be refracted as light 593 as it passes through the boundary between the water droplet and the cover material. 
However, at the critical angle of light impinging on the detection surface of the cover material 554 from within the water droplet 597, the light will no longer be refracted into the cover material, but will be totally reflected back into the water droplet.
As noted above, the critical angle of light impinging on the detection surface of the cover material 554 from within the water droplet 597 as light 599 (below which light begins to refract into the cover material), and also the critical angle of light impinging on the detection surface of the cover material from within the cover material as light 594 (above which light begins to totally reflect back into the cover material), may depend on and be determined (computationally or empirically) by the type of cover material 554 (e.g., glass) and the medium (e.g., water) in contact with the cover material. In the example of fig. 5C, the critical angle may be determined to be +/-62.7 degrees from the surface normal. For practical applications, the critical angle may include some margin, such as +/-62.7 degrees +/-1 degree, or +/-62.7 degrees +/-2% from the surface normal.
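The +/-42 degree and +/-62.7 degree figures quoted above follow directly from Snell's law and the stated refractive indices (glass about 1.5, air 1.0, water about 1.333), as this short numeric check shows:

```python
import math

# Critical-angle calculation from the refractive indices discussed above:
# theta_crit = asin(n_outer / n_cover), measured from the surface normal.
# The water index 1.333 is assumed; the disclosure only says "about 1.3".

def critical_angle_deg(n_outer, n_cover):
    """Angle beyond which light inside the cover is totally internally
    reflected at the boundary with the outer medium."""
    return math.degrees(math.asin(n_outer / n_cover))

air_glass = critical_angle_deg(1.0, 1.5)      # air/cover boundary, ~42 deg
water_glass = critical_angle_deg(1.333, 1.5)  # water droplet/cover, ~62.7 deg
```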
Fig. 5D illustrates a cross-sectional view of a portion of integrated touch screen 502 including micro LEDs 564 and cover material 554, and the concept of blocking or filtering reflected or refracted light at some angles from or through the cover material, according to an example of the present disclosure. In the example of fig. 5D, the conductive layer 550 is a display layer including a plurality of micro LEDs 564 configured as illuminators and photodetectors (although only three are shown in fig. 5D for the sake of simplifying the drawing). For example, micro-LEDs 564-R may be reverse biased and configured as photodetectors, while micro-LEDs 564-G and 564-B may be configured as illuminators. However, as discussed below, other types and configurations of illuminators and photodetectors may also be employed in FIG. 5D. The example of fig. 5D superimposes an air/cover material critical angle of +/-42 degrees from the surface normal, as shown in fig. 5B, and a water drop/cover material critical angle of +/-62.7 degrees from the surface normal, as shown in fig. 5C. To reduce the likelihood that reflected or refracted light from ambient light, light sources other than a stylus, or water drops is detected and used erroneously to determine a proximity image, some embodiments of the present disclosure may filter or block light such that light having a detection angle less than the larger of the two critical angles (e.g., less than +/-62.7 degrees relative to the surface normal, indicated by the dashed, white-tipped arrow 591) is blocked from being received by the detector, while light having an angle greater than the larger of the two critical angles (e.g., greater than +/-62.7 degrees relative to the surface normal, indicated by the solid, black-tipped arrow 589) may be received by the detector. Filters or light blocking elements (not shown in fig. 5D) may be established at locations associated with particular micro LEDs configured as detectors to allow only light (reflected or otherwise) having an angle of +/-62.7 degrees or greater to reach the detectors.
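The detection-side filtering rule described above can be expressed as a simple predicate (the constants come from the figures above, but the margin and all names are illustrative):

```python
# Sketch of the detection-side angle filter: only light arriving at the
# photodetector at an angle (from the surface normal) at or beyond the larger
# of the two critical angles is accepted, so boundary reflections from the
# air or water interfaces cannot masquerade as an object. Margin is assumed.

AIR_COVER_CRIT = 42.0     # degrees from surface normal (fig. 5B)
WATER_COVER_CRIT = 62.7   # degrees from surface normal (fig. 5C)
MARGIN = 1.0              # practical tolerance, e.g. +/-1 degree

def detector_accepts(angle_deg):
    """True if a detection-side angle filter would pass this ray."""
    return abs(angle_deg) >= max(AIR_COVER_CRIT, WATER_COVER_CRIT) - MARGIN
```

For example, a ray at 40 degrees (which could be a boundary reflection) is blocked, while a ray at 65 degrees passes.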
As discussed above with respect to fig. 5A, 5C, and 5D, in some embodiments of the present disclosure, light emitted by an illuminator may reflect from the stylus back to a photodetector, and detection of this reflected light may be used to capture an image of the stylus. However, it may be important to control the illumination angle of the illuminator to minimize reflections of the emitted light that are caused not by the object, but by the emitted light striking the air/cover material boundary at or near (or beyond) the critical angle, resulting in total internal reflection (TIR) or near-TIR. This type of reflected light may have an angle sufficient to be received by the detector, which may lead to false object detection.
Fig. 5E illustrates a cross-sectional view of a portion of an integrated touch screen 502 including micro-LEDs 564-B configured as illuminators and generating light 585 in the direction of a boundary represented by an interface between air and a cover material 554, according to an example of the present disclosure. In the example of fig. 5E, the conductive layer 550 is a display layer including a plurality of micro LEDs 564 configured as an illuminator and a photodetector (but only two pixels (each representing three micro LEDs) are shown in fig. 5E for the sake of simplifying the drawing). For example, micro-LEDs 564-R may be reverse biased and configured as photodetectors, while micro-LEDs 564-G and 564-B may be configured as illuminators. However, as discussed below, other types and configurations of illuminators and photodetectors may also be employed in FIG. 5E. In the example of fig. 5E, micro LED 564-B produces light 585 at a critical angle relative to the surface normal, resulting in total (or near total) internal reflection of the emitted light at the air/cover material interface and the production of reflected light 583, even in the absence of an object that causes reflection. To reduce the chance of the reflected light 583 causing false detection, the light 585 generated by the illuminator may be limited to not greater than the critical angle of the cover material/air interface, which may reduce the reflection 583. The critical angle may depend on and be determined (computationally or empirically) by the type of cover material 554 (e.g., glass) and the medium (e.g., air) in contact with the cover material. In the example of fig. 5E, the critical angle may be determined to be +/-42 degrees from the surface normal. For practical applications, the critical angle may include some margin, such as +/-42 degrees +/-1 degree, or +/-42 degrees +/-2% from the surface normal.
Fig. 5F illustrates a cross-sectional view of a portion of an integrated touch screen 502 including micro-LEDs 564-B configured as illuminators and generating light 581 in the direction of a boundary represented by the interface between a water droplet 597 and a cover material 554, according to an example of the present disclosure. In the example of fig. 5F, the conductive layer 550 is a display layer including a plurality of micro-LEDs configured as illuminators and photodetectors (although only two pixels, each representing three micro-LEDs, are shown in fig. 5F for the sake of simplifying the drawing). For example, micro-LEDs 564-R may be reverse biased and configured as photodetectors, while micro-LEDs 564-G and 564-B may be configured as illuminators. However, as discussed below, other types and configurations of illuminators and photodetectors may also be employed in fig. 5F. In the example of fig. 5F, micro-LED 564-B produces light 581 at the critical angle with respect to the surface normal, resulting in total (or near total) internal reflection of the emitted light at the water droplet/cover material interface and the production of reflected light 579. To reduce the chance of the reflected light 579 causing false detection, the light 581 generated by the illuminator may be limited to angles not greater than the critical angle of the cover material/water droplet interface, which may reduce the reflection 579. The critical angle may depend on and be determined (computationally or empirically) from the type of cover material 554 (e.g., glass) and the medium (e.g., water) in contact with the cover material. In the example of fig. 5F, the critical angle may be determined to be +/-62.7 degrees from the surface normal. For practical applications, the critical angle may include some margin, such as +/-62.7 degrees +/-1 degree, or +/-62.7 degrees +/-2%, from the surface normal.
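The critical angles cited in figs. 5E and 5F follow from Snell's law, θ_c = arcsin(n_medium/n_cover). The sketch below is illustrative only; the refractive indices used (about 1.5 for glass, 1.0 for air, 1.333 for water) are typical textbook values assumed here, not values stated in the disclosure:

```python
import math

def critical_angle_deg(n_cover: float, n_medium: float) -> float:
    """Critical angle for total internal reflection at a cover/medium
    boundary, measured from the surface normal, in degrees."""
    if n_medium >= n_cover:
        raise ValueError("TIR requires n_medium < n_cover")
    return math.degrees(math.asin(n_medium / n_cover))

# Assumed indices: glass cover ~1.5, air ~1.0, water ~1.333.
print(round(critical_angle_deg(1.5, 1.0), 1))    # glass/air: ~41.8 degrees
print(round(critical_angle_deg(1.5, 1.333), 1))  # glass/water: ~62.7 degrees
```

With these assumed indices the computed angles match the +/-42 degree (air) and +/-62.7 degree (water) figures quoted above.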
Fig. 5G illustrates a cross-sectional view of a portion of integrated touch screen 502 including micro-LEDs 564 and cover material 554, and the concept of blocking or filtering some angles of light emitted by micro-LEDs 564-B configured as illuminators, according to an example of the present disclosure. In the example of fig. 5G, the conductive layer 550 is a display layer including a plurality of micro-LEDs 564 configured as illuminators and photodetectors (although only two pixels, each representing three micro-LEDs, are shown in fig. 5G for the sake of simplifying the drawing). For example, micro-LEDs 564-R may be reverse biased and configured as photodetectors, while micro-LEDs 564-G and 564-B may be configured as illuminators. However, as discussed below, other types and configurations of illuminators and photodetectors may also be employed in fig. 5G. The example of fig. 5G superimposes the air/cover material critical angle of +/-42 degrees from the surface normal, as shown in fig. 5E, and the water droplet/cover material critical angle of +/-62.7 degrees from the surface normal, as shown in fig. 5F. To reduce the likelihood that emitted light reflected from the air/cover material boundary or the water droplet/cover material boundary is detected and used erroneously to determine a proximity image, some embodiments of the present disclosure may filter or block light such that emitted light having an illumination angle greater than the smaller of the two critical angles (e.g., greater than +/-42 degrees relative to the surface normal, indicated by dashed, white-tipped arrow 577) is blocked from transmission, while light having an illumination angle less than the smaller of the two critical angles (e.g., less than +/-42 degrees, indicated by solid, black-tipped arrow 575) is transmitted. A filter or light blocking element (not shown in fig. 5G) may be established at a location associated with a particular micro-LED configured as an illuminator (such as micro-LED 564-B in the example of fig. 5G) to allow only light from those illuminators having an illumination angle of +/-42 degrees or less to be transmitted.
Fig. 5H illustrates a cross-sectional view of a portion of an integrated touch screen 502 including alternative illuminator and photodetector embodiments, according to an example of the present disclosure. In one embodiment within the example of FIG. 5H, conductive layer 550 is a display layer that includes reverse-biased micro-LEDs 564-R configured as photodetectors and micro-LEDs 564-G and 564-B configured as illuminators (and includes other display micro-LEDs that are not shown in FIG. 5H for the sake of simplifying the drawing).
In another embodiment within the example of fig. 5H, the display layer includes dedicated photodetectors 592 (e.g., photodetectors separate from the display LEDs) on the same layer as the micro-LEDs 564 configured as display elements, or as part of an array of photodetectors formed on a different layer 513 below the display layer. The use of an array of photodetectors on a separate layer 513 may be advantageous because the design of the display layer need not be disrupted, and it provides flexibility to use different color combinations in the display layer without having to provide photodetection there. It should be noted that while only three micro-LEDs 564 and two dedicated photodetectors 592 are shown in fig. 5H in alternative locations for the purpose of simplifying the drawing, it should be understood that each layer may contain more micro-LEDs and dedicated photodetectors.
Fig. 5I illustrates a cross-sectional view of a portion of an integrated touch screen 502 including alternative illuminator and photodetector embodiments, according to examples of the present disclosure. In one embodiment within the example of fig. 5I, the conductive layer 550 is a display layer including an array of Near Infrared (NIR) micro LEDs 509 configured to emit NIR light and an array of NIR sensitive photodetectors 511 (although only one NIR micro LED and one NIR sensitive photodetector are shown in the display layer of fig. 5I for purposes of simplifying the drawing). It is also noted that the display layer includes other display micro-LEDs that are not shown in fig. 5I for the purpose of simplifying the drawing.
In another embodiment within the example of fig. 5I, the array of NIR micro-LEDs 509 and the array of NIR-sensitive photodetectors 511 are both formed on a different layer 513 below the display layer (although only one NIR micro-LED and one NIR-sensitive photodetector are shown in layer 513 in fig. 5I for the sake of simplifying the drawing). In other embodiments, NIR micro-LED 509 may be formed on layer 550 or 513, and NIR-sensitive photodetector 511 may be formed on another layer. Utilizing an array of NIR illuminators and an array of photodetectors on a separate layer 513 may be advantageous because the design of the display layer need not be disrupted, and the use of NIR may make the photodetectors invisible to the user.
Fig. 5J illustrates a cross-sectional view of a portion of an integrated touch screen 502 including alternative illuminator and photodetector embodiments, according to an example of the present disclosure. In the example of fig. 5J, the conductive layer 550 is a display layer including display micro LEDs that are not shown in fig. 5J for the purpose of simplifying the drawing. An illuminator (not shown in fig. 5J) is optically coupled to the covering material 554 and is configured to emit light and pass the light through a filter 507 that directs the light laterally into the covering material 554, as indicated by the wide arrows in fig. 5J. The filter 507 may be a pinhole, slit, optical element, or collimator that emits light into the cover material 554. In various implementations, the array of dedicated photodetectors 592 can be formed in the display layer, or on a different layer 513 below the display layer (although only two dedicated photodetectors are shown on each of layers 550 and 513 for the sake of simplifying the drawing).
In the embodiment of fig. 5J, layer 505 may separate cover material 554 from the display layer. Layer 505 may be an air gap, an Optically Clear Adhesive (OCA) layer (e.g., having a refractive index of 1.3) that adheres the cover material to the display layer, or another material that does not match the refractive index of the cover material. The filter 507 may ensure that light entering the cover material 554 from the side is at or beyond the critical angle for total internal reflection, so that light 503 propagates within the cover material 554 without escaping the boundaries established by the cover material and the air or OCA interfaces. For example, due to the principle of Total Internal Reflection (TIR), light 503 cannot escape the bottom surface of cover material 554 and cannot be detected by photodetector 592. However, fig. 5J shows that when light reflects from the detection surface of the cover material 554 at a location where an object such as stylus 596 is present (and the refractive index of the object is similar to that of the cover material), the total internal reflection may be "broken" (frustrated) and the light may change its reflection angle, as shown at 501. The altered angle may cause light to pass through the lower boundary of the cover material 554 and refract into layer 550, and optionally into layer 513 where the photodetector 592 is located, thereby enabling detection of the object. Note that in the embodiment of fig. 5J, angular filtering (discussed below) is not required.
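The guiding condition of fig. 5J can be sketched numerically: a ray inside the cover stays trapped only if its angle exceeds the critical angle of both bounding interfaces, and a touching object with a cover-like index frustrates TIR locally. The 1.3 OCA index is from the text above; the cover index of 1.5 and the stylus-contact index of 1.48 are illustrative assumptions:

```python
import math

def is_guided(theta_deg: float, n_cover: float = 1.5,
              n_below: float = 1.3, n_above: float = 1.0) -> bool:
    """True if a ray at theta_deg (from the normal) inside the cover
    undergoes TIR at both the top (cover/above) and bottom (cover/below)
    interfaces, i.e., the ray is waveguided within the cover."""
    crit_top = math.degrees(math.asin(n_above / n_cover))
    crit_bottom = math.degrees(math.asin(n_below / n_cover))
    return theta_deg >= max(crit_top, crit_bottom)

print(is_guided(65.0))  # guided: stays in the cover, undetected
# A touching object with an index close to the cover (e.g., a stylus tip)
# raises n_above toward n_cover, frustrating TIR at the touch location:
print(is_guided(65.0, n_above=1.48))
```

With these assumed indices, a 65-degree ray is guided against air but escapes (and can reach a photodetector below) where the stylus contacts the surface.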
Fig. 5K illustrates a cross-sectional view of a portion of an integrated touch screen 502 including alternative illuminator and photodetector embodiments, according to an example of the present disclosure. In the example of fig. 5K, the conductive layer 550 is a display layer including display micro LEDs that are not shown in fig. 5K for the purpose of simplifying the drawing. The display layer includes an array of dedicated illuminators 509 (e.g., illuminators separate from the display LEDs) configured to emit light (although only one dedicated illuminator is shown in fig. 5K for purposes of simplifying the drawing). An array of photodetectors 592 are formed on a different layer 513 below the display layer (although only one photodetector is shown in fig. 5K for the sake of simplifying the drawing). It is noted that although fig. 5K shows illuminator 509 in the display layer and photodetector 592 in layer 513, in other examples, the illuminator may be formed in layer 513 and the photodetector may be formed in the display layer.
As discussed above, in some examples of the present disclosure, some angular filtering of reflected or refracted light received from or through the cover material may be advantageous for detecting objects such as a stylus while minimizing false detection due to water or total internal reflection. Additionally, in some examples of the present disclosure, some angular filtering of the light produced by the micro-LEDs may be advantageous to minimize false detection of objects due to water or total internal reflection. To achieve this, in some examples of the present disclosure, light blocking or light allowing elements may be formed in one or more opaque layers of the integrated touch screen.
Fig. 6A illustrates a cross-sectional view of a portion of an integrated touch screen 602 including a representative micro-LED 664, a cover material 654, a light blocking layer 673, and the transmission and reception of light through the light blocking layer, according to examples of the present disclosure. In some examples, the light blocking layer 673 may be an opaque mask layer, an opaque passivation layer, or any other opaque layer. In the example of fig. 6A, holes or openings may be formed between portions 673-A and 673-B of the light blocking layer 673 to create an illuminator angle filter that allows light from micro-LED 664-B (configured as an illuminator) to pass through with an illumination angle within +/-42 degrees, in accordance with fig. 5G. It should be appreciated that while portions 673-A and 673-B of the light blocking layer 673 are shown in fig. 6A as having outer edges where the light blocking layer terminates, these outer edges are merely for simplicity of the drawing, and the light blocking layer may continue beyond these outer edges to more broadly block light.
In some embodiments, one or more additional light blocking layers (symbolically shown as a single layer 671) may be employed with portions 671-a and 671-B that create holes or openings that align or mate with holes or openings in light blocking layer 673 to maintain a desired illumination angle. In one illustrative example, the holes or openings in the two layers may be approximately the same size to create a "point source" with a very narrow illumination angle, if desired.
In the example of FIG. 6A, one or more apertures may be formed between portions 673-C and 673-E of the light blocking layer 673 to create a detector angle filter that allows light at a detection angle greater than +/-62.7 degrees to be received and detected at micro-LEDs 664-R (configured as detectors) according to FIG. 5D. It should be appreciated that while portions 673-C and 673-E of the light blocking layer 673 are shown in fig. 6A as having outer edges where the light blocking layer terminates, these outer edges are merely for simplicity of the drawing, and the light blocking layer may continue beyond these outer edges to more broadly block light.
Although the example of fig. 5D discussed above describes an allowable detection angle greater than +/-62.7 degrees, allowing all angles greater than +/-62.7 degrees may allow micro-LEDs 664-R to detect some undesirable reflections and refractions. Thus, portion 673-D may be included in the light blocking layer 673 to limit the detection angle to a very narrow range, such as within plus or minus one degree (or some fixed percentage) of +/-62.7 degrees. In one example, portion 673-D, in combination with portions 673-C and 673-E, may pass light between a first detection angle (e.g., 62.7 degrees) and a second detection angle that is a fixed percentage or a fixed number of degrees greater than the first detection angle.
In some embodiments, one or more additional light blocking layers 671 can include portions 671-C and 671-D that create apertures that mate with apertures in light blocking layer 673 to maintain a desired detection angle.
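The radii of such an annular opening can be sketched from simple geometry: a detector a vertical distance h below the blocking layer receives, through an opening at radius r, rays arriving at angle θ = arctan(r/h) from the surface normal. The 10-unit layer separation and the one-degree pass band are illustrative assumptions:

```python
import math

def annulus_radii(h: float, theta1_deg: float, theta2_deg: float):
    """Inner and outer radii (same units as h) of an annular opening that
    passes rays with detection angles between theta1 and theta2, for a
    detector a vertical distance h below the light blocking layer."""
    r_inner = h * math.tan(math.radians(theta1_deg))
    r_outer = h * math.tan(math.radians(theta2_deg))
    return r_inner, r_outer

# Assumed separation of 10 units; pass a narrow band of 62.7 to 63.7 degrees.
r_in, r_out = annulus_radii(10.0, 62.7, 63.7)
print(round(r_in, 2), round(r_out, 2))
```

The rapid growth of tan(θ) near these grazing angles shows why even a one-degree band translates into a measurable annulus width in the blocking layer.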
It is noted that the spacing between micro-LEDs 664-B configured as an illuminator and micro-LEDs 664-R configured as a detector is only one example, and other spacings between an illuminator and a detector are also contemplated, such as 1.25mm. Further, it should be appreciated that while FIG. 6A shows micro-LEDs 664-B configured to generate light and micro-LEDs 664-R configured to detect light, other micro-LEDs may be employed, such as micro-LEDs configured for NIR, IR or UV light generation and detection.
Fig. 6B illustrates a top view of a portion of the integrated touch screen 602 of fig. 6A showing the light blocking layer 673, according to examples of the present disclosure. The top view of fig. 6B shows that in some examples, portions 673-A and 673-B of the light blocking layer 673 may form a ring around the opening 669 to allow light from an underlying illuminator (not shown in fig. 6B) to pass through with an illumination angle within +/-42 degrees. It should be appreciated that while portions 673-A and 673-B of the light blocking layer 673 are shown in figs. 6A and 6B as having outer edges where the light blocking layer terminates, these outer edges are merely for simplicity of the drawing, and the light blocking layer may continue beyond these outer edges to more broadly block light. In addition, the portions 673-A and 673-B of the light blocking layer 673 need not form a ring, but may instead form discontinuous light blocking portions.
The top view of fig. 6B also shows that in some examples, portions 673-C, 673-D, and 673-E of the light blocking layer 673 may define an annular aperture 667-A to allow light at an angle of about +/-62.7 degrees to pass through and be received by an underlying detector (not shown in fig. 6B). It should be appreciated that while portions 673-C and 673-E of the light blocking layer 673 are shown in figs. 6A and 6B as having outer edges where the light blocking layer terminates, these outer edges are merely for simplicity of the drawing, and the light blocking layer may continue beyond these outer edges to more broadly block light. In addition, the portions 673-C, 673-D, and 673-E of the light blocking layer 673 need not form annular holes between them, but may instead form discontinuous light blocking portions. Fig. 6B also shows other optional areas of the light blocking layer 673 having light blocking portions that create openings 667-B, 667-C, and 667-D for allowing light at particular angles to pass through and be received by underlying detectors. Various arrangements of light blocking regions corresponding to the illuminators and detectors may be arranged across the integrated touch screen.
After the micro-LEDs configured as detectors detect the angularly filtered light, the resulting illumination pattern (e.g., touch or target location of a stylus) may be processed to calculate various parameters (e.g., centroid of the illumination pattern representing touch or target location) and perform various operations (e.g., stylus tracking). However, merely determining one or more illumination patterns may not be sufficient to calculate other parameters needed to perform accurate stylus operations, and may limit the functionality of the stylus as an input device. For example, determining the hover distance and tilt angle of the stylus over the detection surface may enable additional and more accurate stylus operation.
Fig. 7A illustrates a geometric perspective view of a stylus 796 hovering in a perpendicular orientation relative to a detection surface and producing an illumination area 749, according to an example of the present disclosure. Fig. 7A illustrates the simplest orientation of a hovering stylus, with zero tilt angle. As will be explained below, for a vertical stylus, given the known parameters of the light source (of the active stylus) and the average light intensity I_H determined at the detection surface, the hover distance D_H can be calculated. The hover distance D_H follows from the change between the light intensity I_c (e.g., illumination intensity) at the stylus and the projection of the light onto the detection surface, and is related to the ratio of the projected area of light at the stylus to the illuminated area on the detection surface. In the geometric perspective view of fig. 7A, A_c is the known effective illumination area at the stylus tip, I_c is the known illumination intensity at the stylus tip, r_c is the known radius of the effective illumination area at the stylus tip, and δ is the known illumination divergence angle of the light emanating from the stylus tip. If the average illumination intensity I_H (and optionally the illumination area A_H and illumination radius r_H at the detection surface) is determined from the illumination pattern captured and processed according to one of the above-described modes for detecting received light, then, because the spot radius grows as r_H = r_c + D_H·tan(δ) while conservation of the emitted flux gives r_H = r_c·sqrt(I_c/I_H), the hover distance D_H can be calculated as:

D_H = (r_c / tan(δ)) · (sqrt(I_c / I_H) − 1)    (1)
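A numeric sketch of this vertical-stylus calculation follows. The function name and the sample values (tip radius, divergence, intensity ratio) are illustrative assumptions; the closed form assumes the conical-beam geometry described above:

```python
import math

def hover_distance(r_c: float, i_c: float, i_h: float,
                   delta_deg: float) -> float:
    """Hover distance of a vertical stylus from cone geometry:
    the spot radius grows as r_H = r_c + D_H * tan(delta), and flux
    conservation gives I_c * r_c**2 ~= I_H * r_H**2, i.e.
    r_H = r_c * sqrt(I_c / I_H)."""
    r_h = r_c * math.sqrt(i_c / i_h)
    return (r_h - r_c) / math.tan(math.radians(delta_deg))

# Illustrative values: 0.5 mm tip radius, 22.5 degree divergence, and a
# detected average intensity 1/9 of the tip intensity (so r_H = 3 * r_c).
d = hover_distance(0.5, 9.0, 1.0, 22.5)
print(round(d, 3))  # ~2.414 mm
```

As expected, zero hover distance is recovered when the detected intensity equals the tip intensity (the spot has not spread).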
Fig. 7B illustrates a flowchart of a method of calculating an illumination area or pattern of a stylus oriented perpendicular to a detection surface, according to an example of the present disclosure. Note that in the example of fig. 7B, the illumination pattern is a cone. However, many other structured illumination patterns are possible, including axially symmetric shapes such as hollow cones, rectangular cones, or star cones, which can produce a circular, square, or star-shaped projection onto the cover material at zero tilt, and a stretched version of that projection at non-zero tilt. The illumination pattern may be selected based on the desired accuracy, sensor pitch, and the like. For example, a star pattern may result in improved azimuth detection (e.g., detection of the angle between a vector representing the direction of the stylus projected onto the x-y plane and a reference direction or vector in the x-y plane). Other detection algorithms, such as AI-based image recognition algorithms, may be used to disambiguate certain features in the structured illumination pattern. Regarding the detection algorithms described below, the elliptical algorithm is a structural algorithm that measures structural elements of the illumination pattern (e.g., circles and ellipses), while Principal Component Analysis (PCA)-based algorithms focus on the beam pattern distribution and associated statistical information.
In the example of fig. 7B, an image may be acquired at 765 by aggregating multiple illumination intensity results from multiple touch node electrodes (which may be referred to herein as pixels) configured and operated according to one or more of the image acquisition modes described above. A two-dimensional image of pixels having non-zero illumination intensity values may be referred to herein as an illumination region or pattern, and a three-dimensional image of pixels having non-zero illumination intensity values (where the illumination intensity values are plotted in a third dimension) may be referred to herein as an irradiance profile. When operating in DC photoconductive mode, baselining may optionally be performed at 763 to compensate for micro-LED dark current by removing the contribution of dark current to the illumination signal in the irradiance profile. Grass suppression may optionally be performed at 761 to exclude touch node electrodes (pixels) whose illumination intensity values are primarily due to noise. To perform the grass suppression, an optional adaptive grass suppression threshold may be used to ignore pixels whose illumination intensity value is below the threshold. By removing such pixels, noise is removed from the calculation of the illumination pattern and other parameters, such as the centroid of the illumination pattern, and the signal-to-noise ratio (SNR) may be increased. Optionally, the centroid and other parameters may then be calculated at 759 from the optionally linearized, grass-suppressed pixels. Spatial (and temporal) filtering may optionally be applied at 757 to the X and Y coordinates of the calculated illumination pattern to eliminate some pixels from further calculations based on their location in the integrated touch screen or within the touch node electrode, and based on their capture time.
The number of pixels whose illumination intensity values exceed the grass suppression threshold may be determined at 755, and the illumination intensity values for those pixels may be summed at 753. The average illumination intensity value I_H may be calculated at 751 by dividing the summed illumination intensity value by the number of pixels whose illumination intensity values exceed the grass suppression threshold (or by dividing by the elliptical area π·a·b, where a and b are the semi-major and semi-minor axes of the ellipse). The average illumination intensity value I_H may then be used in equation (1) above to calculate the hover distance D_H.
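The steps at 761, 755, 753, and 751 can be sketched as follows. The function name, the toy 4x4 frame, and the threshold are assumed for illustration only:

```python
import numpy as np

def average_intensity(image, grass_threshold):
    """Zero out noise pixels below the grass suppression threshold (761),
    count the surviving pixels (755), sum their intensities (753), and
    average (751). Returns (I_H, suppressed_image)."""
    suppressed = np.where(image >= grass_threshold, image, 0.0)
    active = suppressed > 0
    n = int(active.sum())
    if n == 0:
        return 0.0, suppressed
    return float(suppressed[active].sum() / n), suppressed

# Illustrative frame: a bright 2x2 spot over low-level noise.
frame = np.array([[0.1, 0.2, 0.1, 0.0],
                  [0.1, 5.0, 6.0, 0.2],
                  [0.0, 4.0, 5.0, 0.1],
                  [0.2, 0.1, 0.0, 0.1]])
i_h, _ = average_intensity(frame, grass_threshold=1.0)
print(i_h)  # 5.0
```

Only the four spot pixels survive the threshold, so I_H is their mean, (5 + 6 + 4 + 5) / 4 = 5.0.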
A stylus oriented perpendicular to the detection surface represents a simplified case of a more general and complex orientation of the stylus hovering a distance above the detection surface in an orientation (tilt) that is not perpendicular to the detection surface.
Fig. 8A illustrates a visual comparison between a vertical stylus and a tilted stylus, and the elliptical illumination pattern produced by the tilted stylus, according to an example of the present disclosure. In fig. 8A, a stylus 896 oriented perpendicular to the detection surface produces a circular illumination pattern 849, and the hover distance D_H may be calculated by determining the illumination intensity I_H of the illumination pattern, as discussed above with respect to figs. 7A and 7B. However, when the stylus is tilted, the illumination pattern 847 becomes elliptical, the illumination area A_H increases, and it may be desirable to calculate the tilt angle and hover distance of the stylus. The light cone can be described by the following expression:

(x·sin(α) − z·cos(α))² = (x² + y² + z²)·cos²(δ)
where δ is the divergence angle of the light cone, α is the tilt angle of the light cone, and x, y, and z are the spatial coordinates. For example, for z < 0, a divergence angle of 22.5 degrees, and tilt angles of 0 degrees and 45 degrees, the projections on the sensor area may be circular and elliptical, respectively, as shown in fig. 8A.
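Under the assumption that the light cone is the set of points making angle δ with an axis tilted by α in the x-z plane (apex at the stylus tip), i.e., (x·sin(α) − z·cos(α))² = (x² + y² + z²)·cos²(δ), the stated behavior can be checked numerically: at zero tilt, the z = const cross-section is a circle of radius |z|·tan(δ). The sign conventions here are assumptions for illustration:

```python
import math

def on_cone(x: float, y: float, z: float, alpha_deg: float,
            delta_deg: float, tol: float = 1e-9) -> bool:
    """True if (x, y, z) lies on a cone of half-angle delta whose axis is
    tilted by alpha in the x-z plane, with the apex at the origin."""
    a = math.radians(alpha_deg)
    d = math.radians(delta_deg)
    lhs = (x * math.sin(a) - z * math.cos(a)) ** 2
    rhs = (x * x + y * y + z * z) * math.cos(d) ** 2
    return abs(lhs - rhs) < tol

# Zero tilt: the z = -1 cross-section is a circle of radius tan(22.5 deg).
r = math.tan(math.radians(22.5))
print(on_cone(r, 0.0, -1.0, 0.0, 22.5))  # point on the circle
print(on_cone(0.0, r, -1.0, 0.0, 22.5))  # symmetric in x and y
```

Repeating the check with α = 45 degrees shows the x and y extents of the cross-section differ, i.e., the projection stretches into an ellipse, consistent with fig. 8A.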
Fig. 8B illustrates a flowchart of a method of calculating an illumination pattern of a stylus tilted with respect to a detection surface, according to an example of the present disclosure. In the example of fig. 8B, an image may be acquired at 865 by aggregating a plurality of illumination intensity values from a plurality of touch node electrodes (pixels) configured and operated in accordance with one or more of the image acquisition modes described above. When operating in the DC photoconductive mode, baselining may optionally be performed at 863 to compensate for micro-LED dark current by removing the contribution of dark current to the illumination intensity values. Grass suppression may be performed at 861 to exclude pixels whose illumination values are primarily due to noise. To perform the grass suppression, an optional adaptive grass suppression threshold may be used to ignore pixels whose illumination intensity value is below the threshold by setting those noise pixels to zero illumination intensity values. By removing such pixels, noise is removed from the calculation of the illumination pattern and other parameters, such as the centroid of the illumination pattern, and the signal-to-noise ratio (SNR) may be increased. The illumination intensity values for those pixels above the grass suppression threshold may optionally be interpolated and upsampled at 845 to increase the granularity of the illumination pattern. Coordinates of the boundary pixels may be captured at 843, and a least squares fit of an ellipse may be applied to the boundary pixels at 841. After computing the ellipse, parameters such as the centroid, width, height, and rotation of the ellipse may be extracted at 839, and the tilt of the stylus may be computed at 837 based on the width and height of the ellipse. Then, at 835, the hover distance of the stylus may be calculated from the tilt. These steps are discussed in more detail below.
Fig. 9A illustrates an irradiance profile 987 for a plurality of touch node electrodes in a portion of an integrated touch screen, according to one example of the disclosure. Fig. 9A is a three-dimensional plot of an array of touch node electrodes arranged in the X-Y plane, with the illumination intensity of those touch node electrodes plotted along the Z-axis. In one example, the size of each touch node electrode in the array is 1.25mm by 1.25mm. In the example of fig. 9A, irradiance profile 987 may be the result of one or more of operations 865, 863, and 861 in fig. 8B, such that the illumination signals from all touch node electrodes have been acquired, aggregated, and grass suppressed, with all touch node electrodes having illumination intensities below the grass suppression threshold set to zero.
Fig. 9B illustrates an irradiance profile 933 after interpolation and upsampling have been performed on the irradiance profile 987 of fig. 9A to increase the granularity of the illumination pattern, according to one example of the present disclosure. The irradiance profile of fig. 9B may be the result of operation 845 in fig. 8B. In general, interpolation is used to estimate a function y = f(x) based on N points (x_i, y_i), where i = 0 … N−1 and x_grid = (x_{i+1} − x_i) is a uniform grid spacing; after estimation, the resulting function is evaluated at finer resolution at M points (x_j, y_j), where j = 0 … M−1, M/N > 1 is the upsampling ratio, and (x_{j+1} − x_j) is the finer grid spacing. The process is performed across all rows and columns of the captured image. In some examples, linear interpolation may be employed, in which the function y = f(x) is approximated piecewise between points x_i and x_{i+1} (where i = 0 to N−2) by a linear function such as y(x) = m·x + y_0. In some examples, nonlinear polynomial interpolation may be employed, in which the function y = f(x) is approximated with a nonlinear (e.g., 3rd-order) polynomial function, e.g., y(x) = a_0 + a_1·x + a_2·x² + a_3·x³, where a_0, a_1, a_2, and a_3 are polynomial coefficients, and y(x) is derived based on all points (x_i, y_i), where i = 0 … N−1. In some examples, nonlinear spline (cubic) interpolation may be employed, in which the function y = f(x) is approximated piecewise between points x_i and x_{i+1} (where i = 0 to N−2) by nonlinear (e.g., 3rd-order) splines whose slopes are continuous at the points x_i, thereby ensuring that the piecewise interpolation function is smooth. Cubic interpolation is used to effectively increase the sampling rate in both x and y by a factor of several in order to obtain more spatial resolution of the boundary pixels.
Generally, the cubic interpolation receives an irradiance profile of the illumination pattern (e.g., a profile of the varying intensities of the illumination signals at each touch node electrode above the grass suppression threshold) and matches a third-order polynomial to the profile. Alternatively, linear interpolation or 5th-order interpolation may also be performed. After computing the polynomial, the polynomial may be sub-sampled to increase the granularity of the irradiance profile. The result is a granularity much finer than that of the touch node electrodes (e.g., much finer than 1.25mm x 1.25mm).
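The row-wise upsampling bookkeeping can be sketched as follows. Linear interpolation is used here purely to keep the example dependency-light (the disclosure prefers cubic); the function name and sample row are illustrative:

```python
import numpy as np

def upsample_row(row, ratio: int):
    """Interpolate one image row onto a grid `ratio` times finer.
    N coarse samples become ratio*(N-1)+1 fine samples."""
    n = len(row)
    x = np.arange(n)                              # coarse grid x_i
    x_fine = np.linspace(0, n - 1, ratio * (n - 1) + 1)  # fine grid x_j
    return np.interp(x_fine, x, row)              # piecewise-linear y(x)

row = np.array([0.0, 2.0, 4.0])    # N = 3 coarse samples
fine = upsample_row(row, ratio=4)  # grid 4x finer
print(len(fine), fine[1])          # 9 0.5
```

Applying the same operation across all rows and then all columns of the grass-suppressed image yields the finer-granularity irradiance profile from which boundary pixels are extracted.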
In some examples, the number of samples in each of the X and Y directions may be increased by an order of magnitude due to interpolation and upsampling. As can be seen intuitively from fig. 9B, the boundary of the irradiance profile, which may indicate the presence of an illumination pattern and an object such as a hovering stylus, may be determined (in one example) by identifying those illumination signals (from both actual touch node electrodes and upsampled illumination signals) whose illumination intensity is non-zero (and optionally above a certain threshold) and that are adjacent to touch node electrodes that were set to zero due to grass suppression.
FIG. 9C shows a two-dimensional map 931 of those touch node electrodes that have been identified as boundary touch node electrodes in irradiance profile 933 of FIG. 9B, according to one example of the disclosure. Boundary map 931 may be the result of operation 843 in FIG. 8B. In the example of fig. 9C, the identified boundaries may be generally elliptical in shape due to the earlier interpolation and upsampling and increased number of illumination signal samples, but have irregular edges due to the individual touch node electrode boundary determinations.
FIG. 9D illustrates an ellipse 929 resulting from fitting an ellipse to the boundary map 931 of FIG. 9C, according to one example of the present disclosure. Ellipse 929 may be the result of operation 841 in FIG. 8B. In one example, the least squares fit of an ellipse to boundary map 931 may be performed according to an algorithm described in the 1998 article entitled "Numerically Stable Direct Least Squares Fitting of Ellipses" by Halir and Flusser, the entire contents of which are incorporated by reference for all purposes. Note that for the general expression of an ellipse, F(x, y) = ax² + bxy + cy² + dx + ey + f = 0, F(x, y) is defined as the algebraic distance, which is essentially the deviation of a point (x_i, y_i) from the fitted ellipse. For points (x_i, y_i) not on the ellipse, F(x_i, y_i) is non-zero; for points (x_i, y_i) on the ellipse, F(x_i, y_i) = 0. The purpose of the algorithm is essentially to derive the parameters a through f such that F(x_i, y_i) is minimized in the least-mean-square sense across all points i = 0 to N − 1 (where N is 20 in one example). The general expression for the ellipse in matrix form is F_a(x) = x · a = 0, where x = (x², xy, y², x, y, 1) and a = (a, b, c, d, e, f)ᵀ, and the least mean square estimate is:

min over a of Σ_{i=0}^{N−1} F(x_i, y_i)²

The purpose is to minimize the average of the sum of squares of the algebraic distances between the points (x_i, y_i) and the fitted ellipse.
Fig. 9E shows an ellipse 927 as a result of fitting an ellipse to the boundary map 931 of fig. 9C, but in the dashed line region of fig. 9E, the boundary map 931 is incomplete, according to one embodiment of the present disclosure. Although not shown in fig. 9C, in some cases a complete boundary cannot be constructed from the illumination data, such as when the stylus is near the edge of the integrated touch screen and thus does not cause light to be detected over the complete elliptical illumination pattern. In this case, the ellipse fitting algorithm is still able to generate a mathematical expression of the full ellipse 927 based on the partial boundary data. After the mathematical expression of the ellipse is generated, all points on the ellipse, including the missing portion indicated by the dashed line, may be estimated, although some fidelity may be lost. The azimuth φ of the optical stylus may be calculated from tan(2φ) = b/(a − c), where the parameters a, b, and c are taken from the general expression of the ellipse.
Referring again to fig. 8B, after generating the mathematical expression for the ellipse at 841, all points along the ellipse may be calculated, and parameters such as centroid, width, height, and rotation of the ellipse (relative to the X-axis or Y-axis) may be calculated at 839. The tilt of the stylus may be calculated based on the width and height of the ellipse at 837, and then the hover distance of the stylus may be calculated from the tilt at 835, as will be explained in further detail below.
The general form of an ellipse is:

(x − x_c)²/a² + (y − y_c)²/b² = 1

where x_c and y_c represent the coordinates of the center of the ellipse. Substituting for a, b, x_c, and y_c yields the following equations:
where z is the hover distance (also referred to herein as D_H). For a tilted stylus, given the known divergence angle δ of the light source at the stylus, after the ellipse length a and width b are extracted from the boundary data, the values of a and b can be substituted into the ellipse equation described above and solved for the tilt angle α and the hover distance z.
Because there are two equations and two unknowns, the a/b ratio can be calculated as:

where the hover distance z appears in both the numerator and the denominator, using the following substitutions:

where x_c0 is the x-axis coordinate of the center of the ellipse, set to zero. By factoring the hover distance z in the denominator out of the sqrt() term, z can be eliminated, resulting in the expression:
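The substitutions referenced above are not reproduced here. As a hedged reconstruction only (an assumption, not the original equations), one cone-intersection geometry consistent with the surrounding description, namely a light source at hover height z with divergence half-angle δ, tilted by α from the surface normal, gives:

```latex
% Reconstructed cone-intersection geometry (an assumption, not verbatim
% from the original): source at height z, tilt \alpha, divergence \delta.
\begin{aligned}
a   &= \tfrac{z}{2}\left[\tan(\alpha+\delta) - \tan(\alpha-\delta)\right] \\[2pt]
b   &= \frac{z\tan\delta}{\sqrt{\cos^{2}\alpha - \sin^{2}\alpha\,\tan^{2}\delta}} \\[2pt]
x_c &= x_{c0} + \tfrac{z}{2}\left[\tan(\alpha+\delta) + \tan(\alpha-\delta)\right] \\[2pt]
\frac{a}{b} &= \frac{\left[\tan(\alpha+\delta) - \tan(\alpha-\delta)\right]
               \sqrt{\cos^{2}\alpha - \sin^{2}\alpha\,\tan^{2}\delta}}{2\tan\delta}
\end{aligned}
```

With these substitutions, z cancels from the ratio a/b, which then depends only on δ and α, and at α = 0 both semi-axes reduce to z·tanδ, the expected circular footprint.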
In equation (9), the ratio a/b now depends entirely on the divergence angle δ and the tilt angle α, and because the divergence angle δ is known, the equation can be solved for the tilt angle α. In some embodiments of the present disclosure, rather than performing these calculations with one or more processors, a lookup table may be generated and stored in a computing device that produces the tilt angle α when provided with the ellipse length a and width b. In some examples, separate lookup tables may be stored for different divergence angles δ of different intended styluses and their known light sources.
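The lookup-table approach can be sketched as follows: tabulate the a/b ratio over a grid of tilt angles for a known divergence angle, then invert the table by interpolation at runtime. The `ab_ratio` model below is a simple cone-intersection assumption, not the patent's equation (9), and the chosen divergence angle is illustrative.

```python
import numpy as np

DELTA = np.radians(10.0)   # assumed known divergence half-angle

def ab_ratio(alpha, delta=DELTA):
    """Ellipse length/width ratio for tilt alpha, under a simple
    cone-intersection model (an assumption, not the patent's eq. (9))."""
    a = 0.5*(np.tan(alpha + delta) - np.tan(alpha - delta))
    b = np.tan(delta)/np.sqrt(np.cos(alpha)**2
                              - np.sin(alpha)**2*np.tan(delta)**2)
    return a/b

# Build the lookup table once; invert it with interpolation at runtime.
alphas = np.radians(np.linspace(0.0, 60.0, 601))
ratios = ab_ratio(alphas)              # monotonically increasing in alpha

def tilt_from_ratio(r):
    return np.interp(r, ratios, alphas)

measured = ab_ratio(np.radians(30.0))  # pretend this came from the fit
print(round(float(np.degrees(tilt_from_ratio(measured))), 2))   # 30.0
```

A per-stylus table (one per divergence angle δ) would simply be a family of such `ratios` arrays selected by stylus model.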
With the tilt angle α now known, the hover distance z may be calculated using equation (6) and the substitution of equation (8). In some examples, equation (8), for the width b of the ellipse, may be selected for use because as the stylus tilts, the illumination level falls off (less illumination intensity will be detected) as the distance between the stylus and the detection surface increases (e.g., at the distal end of the ellipse), whereas the light level around the minor axis at the ellipse width b remains fairly constant as the stylus tilt angle α increases. Solving for the hover distance z yields:
In some embodiments of the present disclosure, rather than performing these calculations with one or more processors, a lookup table may be generated and stored in a computing device that produces the hover distance z when provided with the ellipse width b. In some examples, separate lookup tables may be stored for different divergence angles δ of different intended styluses and their known light sources.
Fig. 8C illustrates a flowchart of an alternative method of calculating an illumination pattern of a stylus tilted relative to a detection surface, according to an example of the present disclosure. Instead of using the ellipse fitting portion of fig. 8B, which involves interpolating and upsampling the illumination intensity values of those pixels that are above the grass suppression threshold, capturing the coordinates of the boundary pixels, performing a least squares fit of an ellipse to the boundary pixels, and extracting the centroid, width, height, and rotation of the ellipse (e.g., blocks 845, 843, 841, and 839 of fig. 8B), a Principal Component Analysis (PCA) method is used.
In the example of fig. 8C, a centroid of the pixels above the grass suppression threshold is calculated at 821. A 2 × 2 covariance matrix (a measure of the relationship between the variables) may be calculated at 819 based on the calculated centroid, the touch data (e.g., the pixels above the grass suppression threshold; the irradiance distribution), and the sensor grid. The 2 × 2 covariance matrix can be expressed as:

C = [ C_xx  C_xy ]
    [ C_yx  C_yy ]

where C_xx and C_yy are the variances of the touch data in X and Y, respectively, C_xy is the covariance of the data between X and Y, and C_yx = C_xy.
Eigenvalues of the covariance matrix are then calculated at 817. The eigenvalues λ of the covariance matrix may be found from det(A − λI) = 0, where det is the determinant, A is the covariance matrix, λ is an eigenvalue, and I is the identity matrix. The calculation of the eigenvalues produces a quadratic equation with two solutions, one for each semi-axis. The actual width (a) and height (b) of the ellipse fitted at 815 to the captured, grass-suppressed touch pixels may be calculated from the eigenvalues as a = β√λ₁ and b = β√λ₂, where λ₁ and λ₂ are the larger and smaller eigenvalues, respectively, and β is a scalar that can be derived during calibration. The azimuth of the stylus creating the elliptical illumination area may also be calculated at 815 from the orientation of the eigenvector corresponding to the larger eigenvalue. The remaining blocks, in which the tilt is calculated based on the width and height at 837 and the hover distance is calculated based on the tilt at 835, are left unchanged.
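The PCA path can be sketched end to end: weighted centroid, 2 × 2 covariance, eigenvalues for the semi-axes, and the leading eigenvector for azimuth. This is an illustrative sketch under the assumptions stated in the comments, with the calibration scalar β left at 1.

```python
import numpy as np

def pca_ellipse(xs, ys, w, beta=1.0):
    """Centroid, semi-axes, and azimuth of an elliptical irradiance blob
    via PCA; beta is the calibration scalar from the text (assumed 1 here)."""
    w = w/w.sum()
    cx, cy = (w*xs).sum(), (w*ys).sum()        # intensity-weighted centroid
    dx, dy = xs - cx, ys - cy
    C = np.array([[(w*dx*dx).sum(), (w*dx*dy).sum()],
                  [(w*dx*dy).sum(), (w*dy*dy).sum()]])   # 2x2 covariance
    evals, evecs = np.linalg.eigh(C)           # eigenvalues in ascending order
    a, b = beta*np.sqrt(evals[1]), beta*np.sqrt(evals[0])
    azimuth = np.arctan2(evecs[1, 1], evecs[0, 1])  # major-axis direction
    return (cx, cy), (a, b), azimuth

# Synthetic blob: uniform weights on an ellipse 3x longer in x than in y.
t = np.linspace(0, 2*np.pi, 100, endpoint=False)
(cx, cy), (a, b), az = pca_ellipse(5 + 3*np.cos(t), 2 + np.sin(t),
                                   np.ones(100))
print(round(cx, 3), round(cy, 3), round(a/b, 3))   # 5.0 2.0 3.0
```

Compared to the ellipse fit, no boundary extraction is needed; the covariance of the suppressed touch data directly yields the axes and azimuth.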
In some cases, when the stylus is tilted for a given azimuth angle as calculated at 815, the centroid calculated at 821 may shift due to the imbalance of the beam profile. Thus, in some examples, centroid compensation may be required in order to compensate for the offset between stylus tip position and calculated centroid for a given tilt and azimuth. To achieve this, in some examples, a 2D lookup table may be allocated that contains x, y centroid offset corrections that vary according to stylus tilt and azimuth angle. The look-up table values may be derived as part of a factory calibration in which the offset between stylus tip position and centroid is captured across the range of tilt and azimuth angles. Centroid offset correction may then be applied to the centroid to complete centroid compensation.
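The 2D centroid-correction lookup described above can be sketched with a small bilinearly interpolated table. The table values and grid below are hypothetical stand-ins for factory calibration data; only the x-offset channel is shown, and the same lookup would be repeated for y.

```python
import numpy as np

# Hypothetical factory-calibration table: x-offset correction indexed by
# tilt (rows, degrees) and azimuth (cols, degrees). Values are illustrative.
tilts    = np.array([0.0, 30.0, 60.0])
azimuths = np.array([0.0, 90.0, 180.0, 270.0])
dx_table = np.array([[0.0, 0.0,  0.0, 0.0],
                     [0.4, 0.0, -0.4, 0.0],
                     [1.0, 0.0, -1.0, 0.0]])   # mm

def bilinear(table, rows, cols, r, c):
    """Bilinear interpolation of a calibration table at (r, c)."""
    i = np.clip(np.searchsorted(rows, r) - 1, 0, len(rows) - 2)
    j = np.clip(np.searchsorted(cols, c) - 1, 0, len(cols) - 2)
    fr = (r - rows[i])/(rows[i+1] - rows[i])
    fc = (c - cols[j])/(cols[j+1] - cols[j])
    top = table[i, j]*(1 - fc) + table[i, j+1]*fc
    bot = table[i+1, j]*(1 - fc) + table[i+1, j+1]*fc
    return top*(1 - fr) + bot*fr

# Correction midway between the 30- and 60-degree rows at azimuth 0:
dx = bilinear(dx_table, tilts, azimuths, 45.0, 0.0)
print(dx)   # 0.7
```

The interpolated correction would then be subtracted from the computed centroid to recover the stylus tip position.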
In some examples, the choice of whether to use the ellipse fitting method (fig. 8B) or the PCA method (fig. 8C) may depend on the position of the stylus relative to the sensor. In some cases, the PCA method may be less accurate when a portion of the illumination profile is outside the sensor region (as in the example of fig. 9E). Thus, in some examples, the ellipse fitting method of fig. 8B may be used when a portion of the illumination pattern is outside the sensor area (e.g., on an edge or corner), but the PCA method of fig. 8C may be used if the illumination pattern is within the sensor area.
As discussed above with respect to fig. 5A, light generated from an illuminator in the optical sensing system may impinge on a passive stylus in contact with a detection surface of the optical sensing system and reflect back to a detector in the optical sensing system at any reflection angle (if angle filtering is not employed). However, in alternative embodiments of the present disclosure, a passive stylus having various reflective surfaces may be employed to reflect light at a consistent angular reflection profile and/or pattern, as will be discussed below. These reflection profiles may be evaluated to determine one or more of the touch location, hover distance, tilt angle, orientation, and rotation of the stylus.
Fig. 10A illustrates a cross-sectional view of a portion of an optical stylus system including a passive diffuse reflector stylus 1000 and an optical sensing system 1002 having an array of optical light emitting and/or detecting devices 1004 according to some examples of the present disclosure. The optical device 1004 may include a light emitting device and a light detecting device alone, or a device that may be configured as a light emitting device or a light detecting device, or a combination thereof. As used herein, a light emitting device may refer to a light emitting device alone or configured as a light emitting device, and a light detecting device may refer to a light detecting device alone or configured as a light detecting device. In the example of fig. 10A, a passive diffuse reflector stylus 1000 (only a portion of which is shown in fig. 10A) includes a stylus body having a tip and a side, wherein at least a portion of the tip and side have a diffuse reflective surface 1006 that reflects light 1008 emitted from a light emitting device 1004 in an optical sensing system 1002 at various locations along the tip and side of the stylus body. Light 1008 reflected from various locations may exhibit diffuse reflection within a relatively wide (e.g., >90 degrees) but consistent angular reflection profile (range of reflection angles) 1010 regardless of the angle of inclination γ of the stylus relative to the surface normal (vector perpendicular to the surface of the optical sensing system). However, in other examples, the diffusely reflective surface 1006 may be designed to reflect light in a diffusely reflective manner within a relatively narrow angular reflection profile. 
In either case, the angular reflection profile of the reflected light may be consistent within manufacturing tolerances of the diffuse reflective surface 1006 at the various reflection locations, and in other examples may be consistent within +/-1%, +/-2%, +/-5%, +/-10%, or +/-20% across the various reflection locations. Different stylus tilt angles γ may produce different distributions of reflected light energy received at the light detection devices 1004, and these changes in reflected light energy (reflected energy profiles) across the surface of the optical sensing system 1002 may be evaluated by one or more processors executing software or firmware within the optical sensing system to determine the target position, hover distance (if any), and tilt angle of the stylus.
In some examples, the light emitting device 1004 may be a micro LED (such as those described with respect to fig. 4E) with illumination angles 1012 (e.g., +/-30 degrees) that do not undergo total internal reflection (e.g., angles up to the critical angle of the surface material) and will not interfere with any other detection scheme employed, such as a water-agnostic detector, and still provide an acceptable angular distribution when light is refracted into the ambient air. In some examples, light emitting device 1004 may be any type of device that produces light in the visible, infrared, or near infrared spectrum. Near infrared light emitting device 1004 may generate light having a wavelength between 800nm and 2500nm, and in some examples between 980nm and 1 micron, and in some particular implementations at 850nm or 940nm. However, in other embodiments, a light emitting device 1004 (and corresponding light detecting device) having a wavelength of 1 micron and above (such as 1.3 microns) may be employed, as well as LEDs and detectors in the visible spectrum. In some examples, light detection device 1004 may be a micro LED configured as a detector, such as those described with respect to figs. 4E and 4F.
The passive diffuse reflector stylus 1000 may reflect light 1008 with a diffuse reflective surface 1006 at an angular reflection profile 1010 that does not vary with the stylus tilt angle γ relative to the surface normal. In some examples, the diffuse reflective surface 1006 (e.g., a diffuse reflector) may be formed of a volume scattering material that reflects light within the desired angular reflection profile 1010. The diffuse reflective surface 1006 may be a matte or textured surface whose brightness may be isotropic and whose luminous intensity follows Lambert's cosine law, and in some examples has a reflectance (a Lambertian reflector) of greater than 99% in the range of near infrared or visible wavelengths (400nm to 1500nm) received at the surface. In some examples, the diffuse reflective surface 1006 may be continuous and uniform across the portion of the passive diffuse reflector stylus 1000 that is intended to be in contact with or in close proximity to the optical sensing system 1002. However, in other examples, the diffuse reflective surface 1006 may be designed to have different reflective characteristics in different areas of the passive diffuse reflector stylus 1000 (e.g., concentric rings around the stylus, where the reflective characteristics within each concentric ring are the same, but each ring has different reflective characteristics). In other examples, the diffuse reflective surface 1006 may be patterned to be present and absent in different regions of the stylus (e.g., columns of volume scattering material separated by non-diffuse regions disposed along the length of the stylus) such that the energy profile of reflected light impinging on the optical sensing system 1002 may vary according to different reflectivity patterns, thereby producing a spatial signature that may be utilized by one or more processors executing software or firmware within the optical sensing system to determine stylus orientation and rotation.
In some examples, a spatial signature of reflected energy profiles captured by one or more processors may be tracked over time to determine stylus orientation (e.g., static axial position of the stylus relative to the optical sensing system 1002) and stylus rotation (e.g., pivoting about an axis of the stylus relative to the optical sensing system).
Fig. 10B illustrates a cross-sectional view of a portion of an optical stylus system including a passive diffuse reflector stylus 1000 and an optical sensing system 1002 having a single light emitting device 1004 (for purposes of explanation only) according to some examples of the present disclosure. In the example of fig. 10B, a passive diffuse reflector stylus 1000 (only a portion of which is shown in fig. 10B) includes a diffuse reflecting surface 1006 that reflects light 1008 emitted from a light emitting device 1004 in an optical sensing system 1002 and has a stylus tilt angle γ relative to a surface normal. It should be appreciated that the energy profile of the reflected light received at the optical sensing system 1002 may change as the stylus tilt angle changes. θ represents the angle of incidence of the emitted light 1008 on the passive diffuse reflector stylus 1000, measured relative to the stylus surface normal (a vector normal to the stylus surface at the point of incidence), and β represents the specific angle, measured relative to the stylus surface normal, at which light is scattered from the stylus within the angular reflection profile 1010.
Fig. 10C illustrates a graph of reflected light intensity versus scattering angle β for light reflected from a passive diffuse reflector stylus 1000 according to some examples of the present disclosure. In the example of fig. 10C, it can be seen that the reflected light intensity decreases with increasing scattering angle, and the roll-off is a cosine function. The scattering angle β will have a cosine angular distribution, regardless of the angle θ of the incident light ray.
FIGS. 10D-1 through 10D-3 illustrate reflected energy profiles of light that has been reflected from the diffuse reflecting surface 1006 of the passive diffuse reflector stylus 1000 at three different tilt angles and now impinges on an array of detectors in the optical sensing system 1002, according to an example of the present disclosure. When the reflected light impinges on the optical sensing system 1002, a portion of the light is reflected from the surface of the optical sensing system, while the remainder of the light is refracted and enters the optical sensing system where it is received by the array of light detection devices 1004. In some examples, the light detection device is configured to detect light having a near infrared wavelength. The energy returned to the light detection device 1004 is relatively low due to the relatively wide angular distribution of the light reflected from the diffuse reflecting surface 1006. Examples of energy profiles of the received reflected light when the passive diffuse reflector stylus 1000 has tilt angles of 0 degrees, 30 degrees, and 60 degrees are shown in FIGS. 10D-1 through 10D-3, respectively. As can be seen in fig. 10D-1, when the passive diffuse reflector stylus 1000 has a tilt angle of 0 degrees, the energy distribution of the reflected energy profile is relatively symmetric about the stylus tip position 1014. However, as can be seen in figs. 10D-2 and 10D-3, the energy distribution becomes increasingly asymmetric in the tilt direction as the tilt angle increases, as represented by tilt vectors 1024 and 1026, respectively. While the examples of FIGS. 10D-1 through 10D-3 represent energy profiles of all reflected light received at the array of light detection devices 1004, in some examples, an angular filter may be employed at the optical sensing system to limit light received at the light detection devices to only certain angles and block unwanted light.
When the array of light detection devices 1004 captures the energy level of the reflected light, one or more processors executing software or firmware within the optical sensing system can use multiple thresholds to generate various energy distribution patterns from the reflected energy profile. In the examples of FIGS. 10D-1 through 10D-3, a first energy threshold (e.g., 0.0025 W/mm²) may define a pattern 1016, a second energy threshold (e.g., 0.0010 W/mm²) lower than the first energy threshold may define a pattern 1018, a third energy threshold (e.g., 0.0005 W/mm²) lower than the second energy threshold may define a pattern 1020, and a fourth energy threshold (e.g., 0.0002 W/mm²) lower than the third energy threshold may define a pattern 1022. Algorithms may be applied to these patterns to determine the stylus contact location (whether passive diffuse reflector stylus 1000 is in contact with or hovering over optical sensing system 1002), the stylus tilt direction, and the stylus tilt angle. It should be understood that the thresholds and patterns described above and shown in FIGS. 10D-1 through 10D-3 are for illustration purposes only, and that the thresholds may be selected to define patterns that provide the most accurate information. For example, pattern 1016 does not provide a very clear indication of the stylus tilt direction, as can be seen from the similar patterns in FIGS. 10D-1 (no tilt) and 10D-2 (some tilt). Instead, the threshold defining pattern 1018 may be selected to provide a more accurate indication of the direction of stylus tilt. However, pattern 1016, which is a relatively small pattern of high reflected light energy, or alternatively its centroid, may be used to determine the location where passive diffuse reflector stylus 1000 is in contact with optical sensing system 1002.
As another example, the distance between the centroid of pattern 1016 and the centroid of pattern 1018 may indicate the amount of stylus tilt, and the vector between these centroids may indicate the direction of tilt.
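The threshold-pattern and centroid-vector idea can be sketched on a synthetic energy map. The map below (a bright peak at the tip plus a weaker lobe displaced toward +x) is purely illustrative; the thresholds are arbitrary normalized values, not the W/mm² figures from the text.

```python
import numpy as np

def pattern_centroid(energy, xs, ys, threshold):
    """Centroid of the pixels whose reflected energy exceeds threshold."""
    m = energy > threshold
    return np.array([xs[m].mean(), ys[m].mean()])

# Synthetic reflected-energy map: a bright peak at the tip position plus a
# weaker lobe displaced toward +x, mimicking a tilted stylus (illustrative).
ys, xs = np.mgrid[-10:11, -10:11].astype(float)
energy = (np.exp(-(xs**2 + ys**2)/4)
          + 0.3*np.exp(-((xs - 5)**2 + ys**2)/16))

c_hi = pattern_centroid(energy, xs, ys, 0.5)    # small high-energy pattern
c_lo = pattern_centroid(energy, xs, ys, 0.05)   # larger low-energy pattern
tilt_vec = c_lo - c_hi   # points in the tilt direction; length ~ tilt amount
print(tilt_vec[0] > 0.5, abs(tilt_vec[1]) < 1e-9)   # True True
```

The high-threshold centroid approximates the contact location, while the vector to the low-threshold centroid indicates the tilt direction, as described above.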
Fig. 11A illustrates a cross-sectional view of a portion of an optical stylus system including a passive, retro-reflective stylus 1100 and an optical sensing system 1102 having an array of optical light emitting and/or detecting devices 1104, according to some examples of the present disclosure. Generally, retroreflection occurs when a surface reflects a substantial portion of the received light back to the source of the light. The optical device 1104 may include a separate light emitting device and light detecting device, or a device that may be configured as a light emitting device or light detecting device, or a combination thereof. As used herein, light emitting device 1104 may refer to a light emitting device alone or configured as a light emitting device, and light detecting device 1104 may refer to a light detecting device alone or configured as a light detecting device. In the example of fig. 11A, a passive retro-reflective stylus 1100 (only a portion of which is shown in fig. 11A) includes a stylus body having a tip and a side, where at least a portion of the tip and side have a retro-reflective surface 1106 that reflects light 1108 emitted from a light emitting device 1104 in an optical sensing system 1102 at various locations along the tip and side of the stylus body at a relatively narrow but consistent angular reflection profile (range of reflection angles) 1110 as compared to the angle of the incoming light, regardless of the tilt angle γ of the stylus relative to the surface normal (vector perpendicular to the surface of the optical sensing system). Such a narrow angle reflection profile causes the reflected light to return parallel or substantially parallel to the incoming light (e.g., toward the source of the incoming light). 
In one example, the angular reflection profile of the reflected light (as compared to the incoming light) may be consistent within manufacturing tolerances of the retroreflective surface 1106 at the various reflection locations, and in other examples may be consistent within +/-1%, +/-2%, +/-5%, +/-10%, or +/-20% across the various reflection locations. Different stylus tilt angles γ may produce different energy profiles of reflected light received at the light detection devices 1104, and these different reflected energy profiles may be evaluated by one or more processors executing software or firmware within the optical sensing system 1102 to determine the target position, hover distance (if any), and tilt angle of the stylus.
In some examples, the light emitting device 1104 may be a micro LED (such as those described with respect to fig. 4E) with illumination angles 1112 (e.g., +/-30 degrees) that do not undergo total internal reflection (e.g., angles up to the critical angle of the surface material) and will not interfere with any other detection scheme employed, such as a water-agnostic detector, and still provide an acceptable angular distribution when light is refracted into the surrounding air. In some examples, the light emitting device 1104 may be any type of device that generates light in the visible, infrared, or near infrared spectrum. Near infrared light emitting device 1104 may generate light having a wavelength between 800nm and 2500nm, and in some examples between 980nm and 1 micron, and in some particular implementations at 850nm or 940 nm. However, in other embodiments, light emitting devices 1104 (and corresponding light detecting devices) having wavelengths of 1 micron and above (such as 1.3 microns) may be employed, as well as LEDs and detectors in the visible spectrum. In some examples, the light detection device 1104 may be a micro LED configured as a detector such as those described with respect to fig. 4E and 4F.
Generally, the retroreflective surface 1106 can be designed to shape the return distribution of reflected light and return the reflected light in the direction of its source. In so doing, the energy level of the reflected light may increase because more of the return energy is redirected and confined to the light detection device in the general area of the stylus contact or hover position. Additionally, by concentrating the reflected light to a general area of the stylus contact or hover location, a more accurate and identifiable pattern of the stylus may be produced from the energy profile of the reflected light. To achieve this, the passive, retro-reflective stylus 1100 may utilize retro-reflector facets 1130 (symbolically shown as triangles in fig. 11A) on the retro-reflective surface 1106 or within the transparent stylus tip to reflect light 1108 emitted from the light emitting device 1104 in the optical sensing system 1102 such that the reflected light 1110 is parallel or substantially parallel to the incoming light. In some examples, the retroreflector facets 1130 may be created by forming a saw tooth pattern (e.g., surface relief structures) in the retroreflective surface 1106. Alternatively, the retroreflector facets 1130 may be implemented as individual pyramid-shaped facets or depressions in the retroreflective surface 1106. In either case, the sawtooth pattern or pyramid-shaped depressions may be formed with right angles (90 degrees) to create a retroreflective inner surface. Retroreflective surface 1106 may reflect light at an angular retroreflective profile and pattern that does not change even when the stylus tilt angle is changed. In other words, the reflected light will return at the same angle (e.g., toward the source) as the incoming light, regardless of the stylus tilt, regardless of whether the incoming light impinges on the side or tip of the stylus.
In some examples, the retroreflector facets 1130 may be continuously and uniformly formed across those portions of the retroreflective surface 1106 on the passive retroreflective stylus 1100 that are expected to be in contact with or in close proximity to the optical sensing system 1102. However, in other examples, the retroreflector facets 1130 may be patterned to be present and absent in different areas of the retroreflective surface 1106, such that the energy profile of the reflected light impinging on the optical sensing system 1102 may vary according to different patterns and may be utilized by one or more processors executing software or firmware within the optical sensing system to determine stylus orientation and rotation. For example, in the rightmost passive retroreflective stylus 1100 in fig. 11A, the retroreflector facets 1130 are symbolically shown in a linear arrangement in a divergent column or an asymmetric pattern. These columns or asymmetric patterns may be formed by a number of retroreflector facets 1130 that reflect light back toward the direction of the emitted light. Reflection from these columns or patterns produces columns, patterns, or flashes of high energy reflected light impinging on the optical sensing system 1102, producing a spatial signature that can be utilized by one or more processors executing software or firmware within the optical sensing system to determine stylus orientation and rotation. In some examples, a spatial signature of reflected energy profiles captured by one or more processors may be tracked over time to determine stylus orientation (e.g., a static axial position of the stylus relative to the optical sensing system 1102) and stylus rotation (e.g., pivoting about an axis of the stylus relative to the optical sensing system).
Fig. 11B shows a symbolic representation of a cross-section of a retroreflector facet 1130 to illustrate the principle of retroreflection, according to some examples of the present disclosure. In the example of fig. 11B, two opposite sides of the retroreflector facet 1130 are formed at right angles. The incoming light 1108-a may impinge on the first side at a 45 degree angle relative to the surface normal, reflect off the first side at a 45 degree angle, then impinge on the second side at a 45 degree angle, reflect off the second side at a 45 degree angle, and exit the retroreflector facet parallel to the incoming light. Similarly, the incoming light 1108-B may impinge on the first side at an angle greater than 45 degrees relative to the surface normal, reflect off both sides, and exit the retroreflector facet 1130 parallel to the incoming light.
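The two-bounce principle of fig. 11B can be checked numerically: reflecting a direction vector off two perpendicular mirror faces returns it antiparallel to the incoming ray. This is a minimal sketch; the incidence angle and vector names are arbitrary.

```python
import numpy as np

def mirror(v, n):
    """Reflect direction v off a mirror with unit normal n."""
    return v - 2*np.dot(v, n)*n

# Two perpendicular mirror faces, as in the right-angled facet cross-section.
n1 = np.array([1.0, 0.0])   # normal of the first face
n2 = np.array([0.0, 1.0])   # normal of the second (perpendicular) face

# An incoming ray at an arbitrary 25-degree angle.
incoming = np.array([np.cos(np.radians(25)), -np.sin(np.radians(25))])
out = mirror(mirror(incoming, n2), n1)
print(np.allclose(out, -incoming))   # True: exits antiparallel to arrival
```

Because each bounce negates one component of the direction, the result holds for any incidence angle in the plane, which is why the retroreflected light returns toward its source regardless of stylus tilt.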
Fig. 11C-1 illustrates a portion of an optical stylus system having a passive retroreflective stylus 1100 with a retroreflective surface 1106 including retroreflector facets 1130, according to some examples of the present disclosure. In the example of fig. 11C-1, the retroreflector facets 1130 may be pyramid-shaped and may be arranged in an array of progressively diverging facets (generally along the axis of the stylus) and rings of circumferentially increasing size (only a portion of which is shown in fig. 11C-1) about the axis of the stylus 1100. In some examples, the retroreflector facets 1130 may be quadrangular pyramids, with each interior surface forming a right angle with respect to its opposing surface to provide retroreflection, but in one variation, the pyramids may have cut-off points such that the back surface of the pyramids is a flat wall, while still providing retroreflection in some cases. In some examples, the retroreflector facets 1130 face outward and have axes oriented normal to the surface or body of the stylus 1100, and openings with edges oriented in the same general direction as or perpendicular to the axis of the stylus. In other examples, some of the facets may be individually rotated about their own axes such that the facets have a non-uniform orientation, or tilted on their axes such that they are oriented at a non-zero angle relative to the surface normal of the stylus body to provide the desired retroreflection at a particular location.
Fig. 11C-2 illustrates a retroreflector facet 1130 according to some examples of the present disclosure. In the example of fig. 11C-2, the retroreflector facets 1130 are shown as complete four-sided pyramids, but in one variation mentioned above, the pyramids may be cut at the plane 1132 (e.g., at the tips or points of the pyramids). Each side 1134 of the retroreflector facets 1130 may be angled at 45 degrees relative to the planar opening of the retroreflector facets, such that the opposing sides are at right angles (note that fig. 11C-2 is not drawn to scale in this regard). In some examples, each retroreflector facet 1130 may have an opening of 0.05mm by 0.05mm, and adjacent retroreflector facets may have a center-to-center spacing of 0.25mm, although other dimensions are also contemplated.
FIGS. 11D-1 through 11D-3 illustrate energy profiles of light that has been reflected from the retroreflective surface 1106 of the passive retroreflective stylus 1100 at three different tilt angles and now impinges on an array of detectors in the optical sensing system 1102, according to an example of the present disclosure. When the reflected light impinges on the surface of the optical sensing system 1102, a portion of the light is reflected from the surface, while the remainder of the light is refracted and enters the optical sensing system where it is received by the array of light detection devices 1104. In some examples, the light detection device is configured to detect light having a near infrared wavelength. Due to the relatively narrow angular distribution of the light reflected from the retroreflective surface 1106, the energy returned to the light detection device 1104 is relatively high compared to a diffuse reflective surface. FIGS. 11D-1 through 11D-3 show examples of energy profiles of the received reflected light when the passive retroreflective stylus 1100 has tilt angles of 0 degrees, 30 degrees, and 60 degrees, respectively. As can be seen in fig. 11D-1, when passive retroreflective stylus 1100 has a 0 degree tilt angle, the energy distribution of the reflected energy profile is relatively symmetric about stylus tip position 1114. However, as can be seen in figs. 11D-2 and 11D-3, the energy distribution becomes increasingly asymmetric in the tilt direction as the tilt angle increases, as represented by tilt vectors 1124 and 1126, respectively. While the examples of FIGS. 11D-1 through 11D-3 represent energy profiles of all reflected light received at the array of light detection devices 1104, in some examples, an angular filter may be employed at the optical sensing system to limit the light received at the light detection devices to only certain angles and block unwanted light.
When the array of detectors captures the energy levels of the reflected light, one or more processors executing software or firmware within the optical sensing system may use multiple thresholds to generate various energy distribution patterns. In the examples of figs. 11D-1 through 11D-3, a first energy threshold (e.g., 0.0025 W/mm²) may define pattern 1116, a second energy threshold (e.g., 0.0010 W/mm²) lower than the first energy threshold may define pattern 1118, a third energy threshold (e.g., 0.0005 W/mm²) lower than the second energy threshold may define pattern 1120, and a fourth energy threshold (e.g., 0.0002 W/mm²) lower than the third energy threshold may define pattern 1122. Algorithms may be applied to these patterns to determine the stylus contact location (whether the passive retroreflective stylus 1100 is in contact with or hovering over the optical sensing system 1102), the stylus tilt direction, and the stylus tilt angle. It should be understood that the thresholds and patterns described above and shown in figs. 11D-1 through 11D-3 are for illustration purposes only, and that the thresholds may be selected to define patterns that provide the most accurate information. For example, pattern 1116 does not provide a very clear indication of the direction of stylus tilt, as can be seen from the similar patterns in figs. 11D-1 (no tilt) and 11D-2 (some tilt). Instead, the threshold defining pattern 1118 may be selected to provide a more accurate indication of the direction of stylus tilt. However, pattern 1116, a relatively small pattern of highly reflected light energy, or alternatively its centroid, may be used to determine the location at which the passive retroreflective stylus 1100 is in contact with the surface of the optical sensing system 1102.
As another example, the distance between the centroid of pattern 1116 and the centroid of pattern 1118 may indicate the amount of stylus tilt, and the vector between these centroids may indicate the direction of tilt.
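The threshold-and-centroid approach described above can be sketched as follows. This is an illustrative implementation only; the threshold values are borrowed from the example numbers in the text, and the synthetic energy map, function names, and tolerances are hypothetical:

```python
import numpy as np

def centroid(mask):
    """Centroid (row, col) of the True pixels in a boolean pattern."""
    ys, xs = np.nonzero(mask)
    return np.array([ys.mean(), xs.mean()])

def analyze(energy, t_high=0.0025, t_low=0.0010):
    """Estimate contact point and tilt vector from a reflected-energy map.

    energy: 2-D array of detected power per detector (e.g., W/mm^2).
    t_high defines the small inner pattern (analogous to pattern 1116);
    t_low defines the larger outer pattern (analogous to pattern 1118).
    """
    inner = energy >= t_high
    outer = energy >= t_low
    c_in, c_out = centroid(inner), centroid(outer)
    tilt_vec = c_out - c_in          # points in the tilt direction
    return c_in, tilt_vec, np.linalg.norm(tilt_vec)

# Synthetic example: a bright central spot plus a low-energy skirt shifted
# in +x, mimicking the asymmetric profile of a tilted stylus.
y, x = np.mgrid[0:64, 0:64]
spot = 0.003 * np.exp(-((y - 32) ** 2 + (x - 32) ** 2) / 20.0)
skirt = 0.0015 * np.exp(-((y - 32) ** 2 + (x - 40) ** 2) / 200.0)
contact, tilt_vec, tilt_mag = analyze(spot + skirt)
print(np.round(contact, 1), np.round(tilt_vec, 1))
```

The contact estimate stays near the bright spot, while the centroid offset between the two thresholded patterns points in the simulated tilt direction, matching the qualitative behavior of figs. 11D-1 through 11D-3.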
Fig. 12A illustrates a cross-sectional view of a portion of an optical stylus system including a passive diffraction reflector stylus 1200 and an optical sensing system 1202 having an array of light emitting and/or detecting devices 1204, according to some examples of the present disclosure. In the example of fig. 12A, the passive diffraction reflector stylus 1200 (only a portion of which is shown in fig. 12A) includes a stylus body having a tip and a side, where at least a portion of the tip and side have a diffractive reflective surface 1206 that reflects light 1208 emitted from the light emitting devices 1204 in the optical sensing system 1202 at various locations along the tip and side of the stylus body, such that the reflected light 1210 returns to the optical sensing system in a consistent reflected light pattern regardless of the tilt angle γ of the stylus relative to the surface normal (the vector perpendicular to the surface of the optical sensing system). In one example, the reflected light patterns may be consistent within manufacturing tolerances of the diffractive reflective surface 1206 at the various reflective locations, and in other examples may be consistent within +/-1%, +/-2%, +/-5%, +/-10%, or +/-20% at the various reflective locations. Different stylus tilt angles γ may produce different energy (or phase) distributions of the reflected light patterns received at the light detection devices 1204, and these different energy (or phase) distributions (reflected energy profiles) may be evaluated by one or more processors executing software or firmware within the optical sensing system 1202 to determine one or more of a target position, hover distance (if any), tilt angle, orientation, and rotation of the stylus.
In some examples, the light emitting device 1204 may be a micro LED (such as those described with respect to fig. 4E) with illumination angles 1212 (e.g., +/-30 degrees) that do not undergo total internal reflection (e.g., angles up to the critical angle of the surface material) and will not interfere with any other detection scheme employed, such as a water-agnostic detector, and still provide an acceptable angular distribution when light is refracted into the surrounding air. In some examples, the light emitting device 1204 may be any type of device that produces light in the visible, infrared, or near infrared spectrum. The near infrared light emitting device 1204 may generate light having a wavelength between 800nm and 2500nm, and in some examples between 980nm and 1 micron, and in some particular implementations at 850nm or 940 nm. However, in other embodiments, other light emitting devices 1204 (and corresponding light detecting devices) having wavelengths of 1 micron and above (such as 1.3 microns), as well as LEDs and detectors in the visible spectrum, may be employed. In some examples, light detection device 1204 may be a micro LED configured as a detector such as those described with respect to fig. 4E and 4F and capable of detecting near infrared light.
In general, the diffractive reflective surface 1206 utilizes the wave nature of the light to shape the return distribution of the reflected light and substantially return the reflected light in a pattern to the optical sensing system 1202. In so doing, the energy level of the reflected light may increase because more of the return energy is redirected and confined to the pattern at the light detection device in the general area of the stylus contact or hover position. Additionally, by concentrating the reflected light pattern in a general area of the stylus contact or hover location, a more accurate and identifiable pattern of the stylus may be produced from the energy profile of the reflected light. To achieve this, the diffractive reflective surface 1206 can include a stylus pattern 1236 (shown in FIG. 12A as an X-Y grid of materials having different (higher and lower) reflectivities) on the surface or within the transparent stylus tip to reflect light 1208 emitted from the light-emitting device 1204 in the optical sensing system 1202 such that reflected light 1210 returns to the optical sensing system in a pattern. Although the stylus pattern 1236 is shown as a separate patch in fig. 12A for simplicity, it should be understood that the stylus pattern may be continuously and uniformly formed across most or all of the stylus surface, or in other examples, the pattern may be selectively formed on only some areas of the stylus surface. In some examples, stylus pattern 1236 may be a plurality of diffractive optical elements implanted into the curved tip of passive diffractive reflector stylus 1200 and optionally in the sides of the stylus. 
In various examples, the diffractive optical element may be a microstructure or other surface relief texture having different heights, a kinoform surface, a volume diffraction grating, a volume hologram, or a reflective surface pattern that does not have a surface profile but has varying amounts of reflectivity to produce both glossy and opaque regions. As shown in fig. 12A, a stylus pattern 1236 at different locations on the stylus 1200 may reflect light back to the optical sensing system 1202 in the general area of the stylus tip even when the stylus tilt angle changes. For example, the same pattern of reflected light will appear at the optical sensing system 1202 in the general area of the stylus tip, whether the emitted light impinges on the stylus tip or on the stylus pattern 1236 on the side of the stylus (due to stylus tilt). In some examples, a spatial signature of reflected energy profiles of reflected light patterns captured by one or more processors may be tracked over time to determine stylus orientation (e.g., a static axial position of the stylus relative to the optical sensing system 1202) and stylus rotation (e.g., pivoting about an axis of the stylus relative to the optical sensing system).
Fig. 12B illustrates a perspective view of a portion of an optical stylus system having a passive diffraction reflector stylus 1200 and an optical sensing system 1202, according to some examples of the present disclosure. In the example of fig. 12B, the passive diffraction reflector stylus 1200 (only a portion of which is shown in fig. 12B) includes a diffractive reflective surface 1206 having stylus patterns 1236 (symbolically shown as a grid in fig. 12B) that reflect light 1208 emitted from a light emitting device in the optical sensing system 1202 such that the reflected light 1210 returns to the optical sensing system in reflected light patterns 1238 (shown as cross-hairs in fig. 12B). Although reflected light pattern 1238 appears to be a single reticle, the pattern may be a composite of multiple reflected light patterns reflected back to the general area of the stylus tip, where the wave nature of the light is used to create constructive and destructive interference at optical sensing system 1202. In the reticle example of fig. 12B, the two axes or dimensions of the reticle may be unique (e.g., different lengths, thicknesses, etc.) such that one or more processors executing software or firmware within optical sensing system 1202 can more easily track stylus orientation and rotation. It should be understood that the cross-hairs are merely examples and that other patterns may be similarly utilized.
In addition, the number or density of features in the reflected light pattern 1238 may depend on the number of light detection devices 1204 utilized in the optical sensing system 1202. For example, if the density of light detection devices 1204 is relatively high, fewer features in reflected light pattern 1238 may be required, and the placement of stylus pattern 1236 may be simplified (e.g., the density of stylus patterns may be reduced) because small changes (e.g., rotations) of sparse reflected light patterns may be detected by dense arrays of light detection devices. On the other hand, if the density of light detection devices 1204 is relatively low, more features in reflected light pattern 1238 may be required to enable small changes (e.g., rotations) in the pattern to be detected with a sparse array of light detection devices. Different stylus rotations 1240 (and tilt angles not shown in fig. 12B) may produce different orientations and energy profiles of the reflected light pattern 1238, and these different orientations and energy profiles may be detected and estimated to determine one or more of the target position, hover distance (if any), tilt angle, orientation, and rotation of the stylus.
Fig. 12C illustrates a stylus pattern 1236 and a corresponding reflected light pattern 1238 that appears at the optical sensing system 1202, according to some examples of the present disclosure. As noted above, the stylus pattern 1236 may be a microstructure implanted into the curved tip of the passive diffraction reflector stylus 1200 and optionally implanted in the side of the stylus. In the example of fig. 12C, the stylus pattern 1236 is a lithographically-producible grid of surface regions 1244 having a higher reflectivity separating surface regions 1242 having a lower reflectivity, but in other examples the pattern may be any arrangement (e.g., a binary pattern) of regions having lower reflectivity and higher reflectivity that produce a detectable reflected light pattern 1238 at the optical sensing system 1202. Alternatively, the stylus pattern 1236 may be formed of regions having different characteristics, wherein light is reflected back with different reflected phase shifts or changes, and the light detection device 1204 may be designed to detect these phase shifts.
Fig. 12C also shows a reflected light pattern 1238 resulting from light reflected from a stylus pattern 1236 located on the passive diffraction reflector stylus 1200. When light is reflected from the horizontal and vertical grids in the stylus pattern 1236 and returned to the optical sensing system 1202, the reflected light from these grids may become spherical waves that constructively and destructively interfere to form reflected light pattern 1238, with the strongest light energy at the origin of the pattern and lower-energy cross-hairs appearing along the horizontal and vertical axes. The Fourier transform relationship between the stylus pattern 1236 and the reflected light pattern 1238 may be utilized during the design stage to determine the stylus pattern for a given desired reflected light pattern, as will be explained in further detail below. Over a sufficiently large distance, light 1208 emitted from light emitting device 1204 and impinging on stylus pattern 1236 on the passive diffraction reflector stylus 1200 will be coherent (e.g., have the same frequency and waveform) and will exhibit diffraction effects as the light reflects and returns to optical sensing system 1202 as a series of spherical waves having different diffraction orders, where waves whose path lengths differ by particular multiples of the wavelength may combine and constructively interfere.
The condition for constructive interference is that, within the coherence length of the light (the light can be assumed to be coherent if it is monochromatic (single frequency)), the path length difference between two reflected waves is a multiple of the wavelength of the light. In the example of fig. 12C, reflection from each of the higher-reflectivity slits or lines 1244 in the stylus pattern 1236 in a particular direction (vertical or horizontal) will produce a cosine pattern in that direction. Reflections from slits or lines in a particular direction may combine at the optical sensing system 1202 with different matching (or mismatched) path conditions based on their wavelength, where the energy of the combined light is related to the cosine of the path length difference. Thus, mostly destructive interference occurs in the quadrants of the reflected light pattern 1238, while constructive interference occurs along the principal axes, creating a cross-hair of varying reflected energy levels, as shown in the example of fig. 12C. By detecting the cross-hair (or another reflected light pattern 1238 generated as a result of the stylus pattern 1236) and optionally detecting the reflected energy levels within the cross-hair, one or more processors executing software or firmware within the optical sensing system 1202 can determine one or more of a target position, hover distance (if any), tilt angle, orientation, and rotation of the stylus.
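The concentration of reflected energy along the principal axes can be reproduced numerically. In the Fraunhofer (far-field) approximation, the reflected intensity distribution is proportional to the squared magnitude of the two-dimensional Fourier transform of the reflectivity pattern. The toy grid below (a scalar-diffraction sketch with hypothetical dimensions, not the disclosure's design) shows that nearly all of the energy lands on the two axes of the cross-hair:

```python
import numpy as np

# Binary reflectivity grid: higher-reflectivity lines every 8 samples in
# both directions, a toy stand-in for the X-Y grid of stylus pattern 1236.
N, pitch = 64, 8
y, x = np.mgrid[0:N, 0:N]
grid = ((x % pitch == 0) | (y % pitch == 0)).astype(float)

# Far-field intensity ~ |FFT|^2 of the reflectivity pattern.
intensity = np.abs(np.fft.fft2(grid)) ** 2

# Energy on the two principal axes of the diffraction pattern (the
# "cross-hair"), counting the shared origin bin only once, versus total.
axes_energy = intensity[0, :].sum() + intensity[:, 0].sum() - intensity[0, 0]
fraction = axes_energy / intensity.sum()
print(round(fraction, 3))  # ~0.95: the cross-hair dominates the quadrants
```

The small remainder of energy falls on a sparse lattice in the quadrants, produced by the crossing points of the horizontal and vertical lines, consistent with the mostly destructive interference in the quadrants described above.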
Fig. 12D illustrates an alternative reflected light pattern 1246 that appears at the optical sensing system 1202, in accordance with some examples of the present disclosure. Although figs. 12B and 12C illustrate reflected light pattern 1238 with a single reticle, fig. 12D illustrates an embodiment in which the stylus pattern 1236 is designed to reflect light back to optical sensing system 1202 in a pattern of multiple different reticles. The array of light detection devices 1204 may detect the amount of reflected light received at each light detection device. From these readings, one or more processors executing software or firmware within optical sensing system 1202 can calculate a weighted reflected energy profile across the light detection devices 1204 and can determine the touch center, orientation, and rotation of the passive diffraction reflector stylus 1200. A dense reflected light pattern 1246 such as that shown in fig. 12D may require fewer light detection devices 1204 (e.g., a sparse array) to detect the position, hover distance (if any), tilt angle, orientation, and rotation of the stylus, while a sparse reflected light pattern 1238 such as that shown in figs. 12B and 12C (e.g., a single reticle) may require more light detection devices.
As noted above, a Fourier transform relationship between the stylus pattern 1236 and the reflected light pattern 1238 may be employed during the design phase to determine the stylus pattern at any particular location on the passive diffraction reflector stylus 1200 for a given desired reflected light pattern. The stylus pattern 1236 may be determined by starting with a desired reflected light pattern 1238 (e.g., a reticle or other pattern) that will appear on the surface of the optical sensing system 1202, which itself may represent a design tradeoff between the number of light detection devices 1204 in the optical sensing system 1202 and the complexity (e.g., the number of features) of the reflected light pattern. After identifying the desired reflected light pattern 1238, a Fourier transform may be performed on the desired reflected light pattern to produce the stylus pattern 1236 at various locations around the passive diffraction reflector stylus 1200 (in some cases taking into account positional parameters such as the angle of light reflection and the path length of the reflected light). For example, to determine the stylus pattern 1236 that should be formed at a particular location on the side of the passive diffraction reflector stylus 1200 such that it produces the desired reflected light pattern 1238 at the approximate location of the stylus tip, a Fourier transform may be performed on the desired reflected light pattern to identify the particular stylus pattern at that location on the side of the stylus. The identified stylus pattern 1236 at a particular location on the side of the passive diffraction reflector stylus 1200 may be a modified version of the default stylus pattern at the stylus tip, including a change in one or more of the orientation of the pattern, the compression of the pattern, or the tilt of the pattern relative to the surface normal of the stylus. Such a modification is shown in fig. 12A, where modified stylus pattern 1236-2 is a modified version of default stylus pattern 1236-1 at the stylus tip. Additionally, the generation of the stylus pattern 1236 may depend on whether a spherical wave or a plane wave impinges on the passive diffraction reflector stylus 1200. For example, a spherical wave may require incremental changes in the feature spacing of the stylus pattern 1236 at various locations on the stylus surface (e.g., producing a chirp (a change in frequency or spacing) in the grid).
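The Fourier-pair design loop can be sketched numerically. In this idealized scalar model (an illustration of the principle, not the disclosure's actual procedure), the designer Fourier-transforms the desired reflected pattern to obtain a candidate stylus pattern, and the reflection physics effectively applies a second Fourier transform, which, up to a scale factor and a coordinate flip, reproduces the desired pattern at the sensor. The complex-valued intermediate pattern would correspond physically to a phase-and-amplitude structure such as the kinoform or volume-grating elements mentioned above:

```python
import numpy as np

N = 64
desired = np.zeros((N, N))
desired[0, :] = 1.0   # horizontal arm of a reticle (cross-hair)
desired[:, 0] = 1.0   # vertical arm

# Design step: candidate stylus pattern as the Fourier transform of the target.
stylus_pattern = np.fft.fft2(desired)

# Physics step: far-field reflection applies another Fourier transform.
observed = np.fft.fft2(stylus_pattern)

# fft(fft(f))[k] = N^2 * f[-k mod N]; this reticle, centered on index 0,
# is symmetric under that flip, so the desired pattern reappears.
recovered = observed.real / N ** 2
print(np.allclose(recovered, desired))  # True
```

A real design would additionally fold in the positional parameters noted above (reflection angle and path length at each location on the stylus), which is what distinguishes the default tip pattern from the modified side patterns.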
Fig. 13A illustrates a plan view of a portion of an optical sensing system 1302 including an array 1350 of light emitting devices 1304 operating with a semi-active light detection stylus, according to some examples of the disclosure. In the example of fig. 13A, each light emitting device 1304 in array 1350 can emit light at a particular modulation frequency, such that within the array, light at a plurality of different modulation frequencies can be emitted from a plurality of light emitting devices. In some examples, each light emitting device 1304 may emit light at a different modulation frequency, but in other examples, some groups of light emitting devices (e.g., closely spaced clusters of devices, or dispersed sets of devices) may emit light at the same modulation frequency while other groups emit light at different modulation frequencies. The semi-active light detection stylus, described in more detail below, may receive modulated light from those light emitting devices 1304 that are close to the stylus. One or more processors executing software or firmware within the optical stylus (e.g., within a computing system in the semi-active light detection stylus similar to the system shown in fig. 2B) may determine the amplitude of the received light at the various modulation frequencies and utilize the amplitude information to determine the position of the stylus at the optical sensing system 1302. By equipping the stylus with only a receiver instead of a transceiver, low-energy return signals from the stylus can be avoided, and the higher-energy signals received at the semi-active stylus can achieve an improved signal-to-noise ratio (SNR) of the detected signals.
In some examples, the light emitting device 1304 may be micro LEDs (such as those described with respect to fig. 4E) with illumination angles 512 (e.g., +/-30 degrees) that do not undergo total internal reflection (e.g., angles up to the critical angle of the surface material) and will not interfere with any other detection scheme employed, such as a water-agnostic detector, and still provide an acceptable angular distribution when light is refracted into the surrounding air. In some examples, the light emitting device 1304 may be any type of device that produces light in the visible, infrared, or near infrared spectrum. The near infrared light emitting device 1304 may generate light having a wavelength between 800nm and 2500nm, and in some examples between 980nm and 1 micron, and in some particular implementations at 850nm or 940 nm. However, in other implementations, a light emitting device 1304 (and corresponding light detecting device) having a wavelength of 1 micron and above (such as 1.3 microns) may be employed.
Fig. 13B illustrates a semi-active stylus 1300 with a light detection device 1348-1 embedded within the tip of the stylus, according to some examples of the present disclosure. In some examples, the light detection device 1348-1 may be a micro LED configured as a detector, such as those described with respect to fig. 4C. The tip of the semi-active stylus 1300 (only a portion of which is shown in fig. 13B) may be transparent to enable light to enter the stylus and be detected by the light detection device 1348-1. In the example of fig. 13B, the semi-active stylus (only a portion of which is shown in fig. 13B) includes a stylus body having a tip and sides. The tip contains a light detection device 1348-1 (e.g., photodetector, photodiode) that is capable of detecting the amplitude of the modulated light at each modulation frequency produced by the light emitting devices 1304 in the array 1350, but will in fact detect only those light emitting devices that are closest to the light detection device. The transimpedance amplifier may be coupled to a light detection device 1348-1 to generate an output signal from the amplitude detector. Because photodiodes have spectral responsivity (ratio of photocurrent to incident power) that varies according to wavelength, in examples of the present disclosure in which the light detection device 1348-1 is a photodiode, one or more processors executing software or firmware within the optical stylus can apply a fourier transform to the output signal from the light detection device to extract frequency components of the output signal and determine incident power (amplitude of modulated light) at various modulation frequencies that are representative of nearby light emitting devices 1304 (e.g., determine the frequency response of the light detection device).
When the semi-active stylus 1300 is placed on or over the optical sensing system 1302, the light detection device 1348-1 can detect the amplitude of the modulated light from the one or more light emitting devices 1304 sufficiently close to the light detection device. In some cases, the time domain signals from those light emitting devices 1304 may be received by one or more processors executing software or firmware within the optical stylus and transformed into the frequency domain using a Fast Fourier Transform (FFT). Because the location of each light emitting device 1304 and its modulation frequency are known, the detected amplitude of the modulated light from one or more light emitting devices 1304 can be used to calculate the location of the semi-active stylus 1300 and, in some cases, the proximity of the stylus to the optical sensing system 1302. For example, triangulation or similar algorithms may be employed to determine the position of the semi-active stylus 1300, and the amplitude may be used to determine whether the stylus is in contact with the surface of the optical sensing system 1302 or hovering over the optical sensing system at a distance. In some examples, amplitude and frequency information may be transmitted from semi-active stylus 1300 to a device including optical sensing system 1302 for processing using any suitable wireless or wired communication protocol. In other examples, the processing of the amplitude and frequency information may be performed using a Digital Signal Processor (DSP) or other processor in an optical stylus.
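One simple instance of the position calculation described above is an amplitude-weighted centroid over the known emitter locations. This is a hedged sketch only: the disclosure mentions triangulation and similar algorithms without specifying them, and the grid layout, fall-off model, and numbers below are hypothetical:

```python
import numpy as np

# Known emitter layout: a 4x4 patch of emitters on a 1 mm grid, each
# assumed to have a unique, known modulation frequency.
xs, ys = np.meshgrid(np.arange(4.0), np.arange(4.0))
emitters = np.column_stack([xs.ravel(), ys.ravel()])  # (16, 2) positions, mm

def simulate_amplitudes(tip, height=1.0):
    """Toy model: detected amplitude falls off with distance from the tip."""
    d2 = ((emitters - tip) ** 2).sum(axis=1) + height ** 2
    return 1.0 / d2

def estimate_position(amps):
    """Amplitude-weighted centroid of the known emitter positions."""
    w = amps / amps.sum()
    return w @ emitters

true_tip = np.array([1.5, 1.5])   # centered in the patch
amps = simulate_amplitudes(true_tip)
est = estimate_position(amps)
print(np.round(est, 2))  # [1.5 1.5] by symmetry
```

The weighted centroid is exact here only because the tip sits symmetrically within the patch; away from symmetry it is biased toward the array center, which is one reason a triangulation or least-squares fit over the amplitude model may be preferred in practice.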
Fig. 13C-1 illustrates a semi-active stylus with a light detection device 1348-1 embedded within the tip of the semi-active stylus 1300 and an additional light detection device 1348-2 embedded within the side of the stylus, according to some examples of the present disclosure. The example of fig. 13C-1 is similar to the example of fig. 13B, except that a light detection device 1348-2 is added. The tip and sides of the semi-active stylus 1300 (only a portion of which is shown in fig. 13C-1) may be transparent to enable light to enter the stylus and be detected by light detection devices 1348-1 and 1348-2. Although fig. 13C-1 shows two light detection devices 1348-2 on the side of the stylus 1300, in some examples, only one light detection device or more than two light detection devices may be employed on the side of the stylus. In the example of fig. 13C-1, light detection devices 1348-1 and 1348-2 (e.g., photodetectors, photodiodes) are capable of detecting the amplitude of modulated light at each modulation frequency produced by light emitting devices 1304 in array 1350. Fourier transforms may then be applied to the output signals of light detection devices 1348-1 and 1348-2 to determine the frequency response of the light detection devices, which may then be used to estimate the stylus position at optical sensing system 1302. The unique location of the light detection device 1348-2 on the side of the stylus 1300 may enable these devices to better detect modulated light from the light emitting device 1304 when the stylus is tilted, and thus provide a better estimate of the stylus position. 
In some examples, the frequency response (and thus the position) of the light detection device 1348-2 may be calculated by one or more processors and tracked over time to determine the stylus orientation (e.g., the static axial position of the stylus relative to the optical sensing system 1302) and the stylus rotation (e.g., pivoting about the axis of the stylus relative to the optical sensing system).
Fig. 13C-2 illustrates a view of a semi-active stylus 1300 along its axis with light detection devices 1348-1 and 1348-2 according to some examples of the present disclosure. In the example of fig. 13C-2, three light detection devices 1348-2 are arranged radially about the axis of the semi-active stylus 1300 (only a portion of which is shown in fig. 13C-2), but it should be understood that other arrangements are possible and may depend on the desired rotational accuracy (e.g., a denser detector arrangement when higher rotational accuracy is desired). Although fig. 13C-2 shows each light detection device 1348-2 in an elongated linear arrangement, in other examples, the light detection devices may be other shapes, such as circular, and may be formed from a single detector (e.g., a single photodiode) or multiple detectors (e.g., a row of photodiodes). In addition, although FIG. 13C-2 shows three light detection devices 1348-2, in some examples, one, two, or more than three light detection devices may be employed. Each light detection device 1348-2 may detect the amplitude of the modulated light at nearby light emitting devices 1304, but because the detectors are positioned around the semi-active stylus 1300 rather than along the axis of the semi-active stylus (e.g., on the tip of the semi-active stylus), the detectors will be located beside different light emitting devices having different modulation frequencies, and thus the measurements of the different light emitting devices will be different. These different measurements from the different light emitting devices 1304 may be used by one or more processors executing software or firmware within the optical stylus to determine stylus orientation, rotation, and tilt, as described with respect to fig. 13C-1.
Fig. 13D illustrates a touch node electrode 1358 implemented within portions of the optical sensing system illustrated in fig. 13A and including a micro LED module and micro driver block 1372 configured to emit modulated light to a semi-active stylus, according to some examples of the present disclosure. In some examples, the micro LED module and micro driver block 1372 of fig. 13D corresponds to the touch node electrode of fig. 4D, with some components shown in fig. 13D being optional, depending on the desired configurability of the optical sensing system. For example, if only micro driver 1370 and waveform generator 1386 were used to drive micro LEDs 1364, then amplifier 1366 and other downstream electronics such as demodulator 1378 may not be needed. In the example of fig. 13D, the modulated light (in the visible, near-infrared, or infrared spectrum) may be generated by a plurality of micro LEDs 1364 configured as light illuminators. These micro LEDs 1364 may correspond to the light emitting devices 1304 in the array 1350 of fig. 13A, and in some examples may correspond to display pixels in the electronic devices shown in figs. 1A-1E. The modulated light may be received by one or more light detection devices (e.g., photodetectors, such as micro LEDs configured as photodetectors) within a semi-active stylus (not shown in fig. 13D) capable of detecting light in the visible, near-infrared, or infrared spectrum.
In the example of fig. 13D, micro LEDs 1364 in module 1 may be configured as illuminators by coupling the anodes of the micro LEDs to micro drivers 1370 (e.g., current sources) in micro driver block 1372 using switches 1344 (the micro drivers are coupled to a reference voltage such as 1.29V), while the cathodes of the micro LEDs may be biased by being coupled to a reference voltage such as, for example, -3.7V. The micro driver 1370 may be modulated by receiving an excitation signal from waveform generator 1386. In some examples, waveform generator 1386 may be a precision waveform generator, based on the Minsky algorithm, that generates a sinusoid having a frequency set by a frequency code. In some examples, the sinusoidal signal may then be converted to an analog signal having three levels by a 1.5-bit DAC. The 1.5-bit DAC may produce a gated square-wave optical transmit signal with a 33% duty cycle that has no third-harmonic distortion (HD3) component (the first main harmonic distortion component is HD5), providing the best tradeoff between complexity and crosstalk with other frequencies. In other embodiments where improved spectral purity is desired, higher-order DACs may be used. Alternatively, in other examples, a digital comparator may be used instead of a DAC to generate the optical transmit signal, but with odd harmonics present. The analog signal from waveform generator 1386 may drive micro driver 1370, which may provide a current proportional to the modulation voltage to micro LED 1364. Micro LED 1364 may then produce modulated light at the frequency set by the frequency code.
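The Minsky "circle algorithm" referenced above can be sketched in a few lines: it generates a stable near-sinusoid from two state variables using only multiply-accumulate operations, with the oscillation frequency set by a small coefficient. In this illustrative sketch (not the disclosure's implementation), the coefficient `eps` plays the role of the frequency code, and all numbers are hypothetical:

```python
import math

def minsky_oscillator(eps, n):
    """Minsky recurrence: x -= eps*y, then y += eps*x (using the updated x).

    The update matrix has determinant 1, so the trajectory stays on a
    fixed ellipse and the amplitude neither grows nor decays.
    """
    x, y = 1.0, 0.0
    out = []
    for _ in range(n):
        x -= eps * y
        y += eps * x
        out.append(x)
    return out

eps = 0.1   # oscillation frequency theta = 2*asin(eps/2) radians per sample
samples = minsky_oscillator(eps, 10000)
theta = 2.0 * math.asin(eps / 2.0)
print(round(max(abs(s) for s in samples), 3))   # ~1.001: amplitude is stable
print(round(2.0 * math.pi / theta, 1))          # ~62.8 samples per period
```

Because the recurrence is numerically self-stabilizing, it suits a hardware waveform generator where a long-running tone must not drift in amplitude; the slight amplitude ripple (here about 0.1%) comes from the trajectory being an ellipse rather than a perfect circle.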
In some examples, each micro LED 1364 may produce modulated light having a unique optical emission frequency (and optionally a particular phase), where the number of micro LEDs may be equal to the number of bins in the FFT (discussed below) that processes the received light. However, in other examples, some micro LEDs 1364 within an array of micro LEDs (see, e.g., fig. 13A) may produce modulated light having the same frequency. To disambiguate locations in an optical sensing system having multiple micro LEDs 1364 emitting modulated light with the same frequency (and optionally the same phase), unique frequency (and optionally phase) patterns may be generated by the micro LEDs and individually detected (but evaluated as a group) by the light detection device in the stylus. For example, the optical sensing system may group micro LEDs 1364 into multiple 3 x 3 arrays, each having a unique arrangement of frequencies (and optionally phases) within it, while the same frequencies (and optionally the same phases) may be repeated in the micro LEDs of other arrays. In one particular example, for illustration purposes only, while the upper-right micro LED 1364 in one 3 x 3 array may have the same modulation frequency as the lower-left micro LED in another 3 x 3 array, the other micro LEDs in each of the two 3 x 3 arrays form unique patterns, distinguishable from each other, that are detectable by one or more light detection devices in the semi-active stylus to determine the position of the stylus.
Fig. 13E illustrates a light detection device 1348, which may be embedded within the semi-active stylus illustrated in fig. 13B, 13C-1, or 13C-2 and configured to detect modulated light emitted from one or more light emitting devices 1304 in the array 1350 of fig. 13A, according to some examples of the disclosure. In the example of fig. 13E, the photodiode may be configured as a photodetector 1390 by coupling its anode to a reference voltage, such as ground, while coupling its cathode to the inverting input of amplifier 1392. The amplifier 1392 may be configured as a transimpedance amplifier or a charge amplifier to convert the current on the inverting input of the amplifier (indicative of the intensity of light received at the photodetector 1390) to a voltage on the output of the amplifier using the feedback network of the amplifier. In some examples, the analog output of amplifier 1392 may be filtered using an anti-aliasing filter (AAF) 1394 and converted to a digital signal using an ADC 1374 (which may be a Nyquist ADC, such as a SAR ADC, in some examples) to produce raw light detection data. In the example of fig. 13E, amplifier 1392, AAF 1394, and ADC 1374 may collectively function as an AFE. However, in other examples, AAF 1394 and ADC 1374 may be replaced with a sigma-delta ADC. In the case where the ADC is a continuous-time sigma-delta ADC, the AAF may not be required. The FFT block 1396 may extract the amplitude and phase of the raw light detection data at the FFT bin frequencies, and the position estimation block 1398 (discussed in further detail below) may estimate the position of the optical stylus based on the intensity (amplitude) and optionally the phase of the light received from the one or more light emitting devices 1304 in the array 1350 of fig. 13A. Note that in some examples, the FFT 1396 and the position estimation block 1398 may be implemented by one or more processors executing software or firmware within the optical stylus (see fig. 2B), and the estimated position information may be communicated to other devices via a wireless interface (I/F) 1399, as shown in fig. 13E. However, in other examples, the FFT 1396 and the position estimation block 1398 may be implemented in components of an optical sensing system other than the stylus. In these examples, the stylus may communicate digitized data (e.g., raw light detection data) from ADC 1374 to these components via wireless I/F 1399.
As noted above, the light intensity and optionally phase information of the various modulation frequencies received at the light detection device 1348 may be used to estimate the position of the semi-active stylus on or over an optical sensing system comprising an array of light emitting devices. This estimation relies on constructing a stylus touch/proximity image by correlating amplitude and optionally phase information from the FFT frequency bins with actual light emitting device positions in the optical sensing system. This association is possible because the modulation frequency (and optionally phase) and the location of each light emitting device 1304 in array 1350 are known a priori, or in other examples the location and frequency (and optionally phase) arrangement of groups of light emitting devices are known. Thus, a map of the locations of the light emitting devices (or groups of light emitting devices) and their different modulation frequencies (and optionally their phases) (or patterns of modulation frequencies (and optionally their phases) within the group) may be stored in advance.
Fig. 13F is a flow chart for estimating a position of a semi-active stylus on or over an optical sensing system including an array of light emitting devices, according to some examples of the present disclosure. In the example of fig. 13F, at block 1397, a scan of the optical sensing system may first be performed to acquire raw light detection data (e.g., measurements from light detection device 1348 within the semi-active stylus). The raw light detection data (e.g., raw light signals) may then be digitized within device 1348.
Then, at block 1395, an FFT may be applied to the digitized raw light detection data (e.g., the digitized raw light signal) to extract the amplitude (and optionally phase information) at each FFT bin frequency. Alternatively, as will be explained in further detail below, I/Q demodulation may be performed using a smaller number of frequency bins than the FFT to calculate the amplitude (and optionally the phase) of the raw light detection data. After calculating the FFT magnitudes, these values may be linearized at block 1393, if desired, such that intensity is mapped linearly to distance (instead of falling off as 1/r²) so as to be operable with various downstream processes (e.g., centroid algorithms that assume a linear mapping). This can be accomplished by generating a look-up table that maps the 1/r² intensity values to linear distance. At block 1391, an initial image of the location of the semi-active stylus on the optical sensing system 1302 may be constructed by correlating the extracted and linearized intensity values with the locations of the light emitting devices using a previously stored mapping of the modulation frequency (and optionally phase) and the location of each light emitting device 1304. Then, grass suppression of the stylus image may be performed at block 1389 to remove illumination intensity values indicative of noise, and a centroid algorithm may be applied at block 1387 to derive the x-y position of the semi-active stylus.
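The linearization, grass-suppression, and centroid steps of blocks 1393, 1389, and 1387 can be illustrated with a minimal Python sketch (the names, the noise floor, and the lumped constant are hypothetical; a shipping implementation would use a calibrated look-up table rather than a square root):

```python
import math

def linearize(amplitude, k=1.0):
    """Block 1393: map an FFT amplitude, which falls off as 1/r^2 for a
    Lambertian source, onto a linear distance r = sqrt(k / amplitude).
    In hardware this would be a precomputed look-up table; k lumps
    together source irradiance and detector area (hypothetical constant)."""
    return math.sqrt(k / amplitude)

def estimate_position(bin_amps, led_xy, noise_floor=0.05):
    """Blocks 1389 and 1387: grass suppression followed by an
    amplitude-weighted centroid over known emitter locations.
    bin_amps maps FFT-bin frequency -> amplitude; led_xy maps
    frequency -> (x, y) emitter position (the pre-stored mapping)."""
    # grass suppression: discard bins indistinguishable from noise
    live = {f: a for f, a in bin_amps.items() if a > noise_floor}
    total = sum(live.values())
    x = sum(a * led_xy[f][0] for f, a in live.items()) / total
    y = sum(a * led_xy[f][1] for f, a in live.items()) / total
    return x, y
```

A stylus centered among four equally bright emitters lands at their geometric center, and noise-level bins do not perturb the result.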
For purposes of explanation, an illustration of an exemplary construction of an initial image of the positions of two semi-active styluses in the optical sensing system at block 1391 of fig. 13F will now be provided.
Figs. 13G-1, 13G-2, and 13G-3 illustrate a symbolic optical sensing system with 16 light emitting devices 1304 and two semi-active stylus positions (0, 0) and (1, 1) according to some examples of the present disclosure. In the illustration of fig. 13G-1, the semi-active stylus located at point (0, 0) will detect the highest light intensities (after grass suppression) at modulation frequencies f1, f2, f5 and f6, indicated by shading 1385 in fig. 13G-2, from the four nearest light emitting devices. The semi-active stylus located at point (1, 1) will detect the highest light intensity at modulation frequency f11, indicated by shading 1383 in fig. 13G-3, from the nearest light emitting device. In addition, lower light intensity values at modulation frequencies f7, f10, f12, and f15, indicated by shading 1381 in fig. 13G-3, will be detected from those four light emitting devices, and the lowest light intensity values at modulation frequencies f6, f8, f14, and f16, indicated by shading 1379 in fig. 13G-3, will be detected from those four light emitting devices. These light intensities may be adjusted such that the FFT amplitude scales linearly with the distance between the semi-active stylus and the light emitting device of interest. A centroid algorithm (in some cases considering the light intensities of the four shaded micro LEDs, or alternatively treating all four shaded micro LEDs as having the same light intensity to simplify the determination) may then be performed on the four shaded micro LEDs in fig. 13G-2 to calculate the position of the stylus. Similarly, a centroid algorithm (in some cases considering the light intensities of the nine shaded micro LEDs) may then be performed on the nine shaded micro LEDs in fig. 13G-3 to calculate the position of the stylus.
Because the light intensity varies according to distance, the distance between the stylus and a given light source (e.g., a light emitting device) associated with a particular modulation frequency (and optionally a particular phase) may be determined based on the detected light intensity. For a Lambertian light source, the light intensity on a hemispherical surface is I₀/(2π·r²), where r is the distance between the semi-active stylus and the light emitting device of interest, and I₀ is the irradiance of the light emitting device at its source. For a given photodetector area Aₛ, the relationship between detected intensity and distance is I(r) = Aₛ·I₀/(2π·r²), which means r = sqrt(Aₛ·I₀/(2π·I(r))). Thus, given a determined light intensity I(r) detected at the photodetector, the known irradiance I₀ of the light emitting device at its source, and the known photodetector area Aₛ, the distance r to the light emitting device may be calculated. The equation shows that the light intensity is nonlinear and decreases rapidly with distance from the source, significantly decreasing the relevance of a light emitting device as its distance from the photodetector increases. Thus, grouping micro LEDs 1364 into unique arrays to identify stylus locations as discussed above may be limited to small arrays (e.g., 3 x 3 arrays) without significant loss of fidelity.
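The forward model and its inversion from the paragraph above can be expressed directly; the following sketch assumes the simple Lambertian relationship stated in the text (the function names and constants are illustrative):

```python
import math

def intensity_at(r, i0, a_s):
    """Forward Lambertian model from the text: I(r) = A_S * I0 / (2*pi*r^2),
    where i0 is the source irradiance and a_s the photodetector area."""
    return a_s * i0 / (2 * math.pi * r * r)

def distance_from_intensity(i_r, i0, a_s):
    """Inversion used for ranging: r = sqrt(A_S * I0 / (2*pi*I(r)))."""
    return math.sqrt(a_s * i0 / (2 * math.pi * i_r))
```

Doubling the distance quarters the detected intensity, which is the nonlinearity the look-up-table linearization step removes.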
As noted above, in addition to using the modulation frequency and amplitude to identify stylus position, in other examples phase information may also be employed. For example, a plurality of phases may be associated with the plurality of modulation frequencies generated by the light emitting devices. The use of phase information may enable the use of fewer modulation frequencies while still providing information sufficient to determine stylus position, in some examples without performing an FFT. In one particular example for illustration purposes, instead of having 1024 FFT bins, only nine modulation frequencies may be used, and nine I/Q demodulators may be employed instead of an FFT (see, e.g., fig. 13E, where FFT 1396 may be replaced with nine I/Q demodulators). Because the illumination may not be sinusoidal, the I/Q demodulation may be performed using a Goertzel filter, which is less sensitive to harmonics. In one example, each light emitting device may be represented by a unique modulation frequency and phase combination, and a mapping of the locations of the light emitting devices (or groups of light emitting devices) and their different modulation frequencies and phases (or the patterns of modulation frequencies and phases within each group) may be pre-stored. When the light detection device extracts amplitude and phase information for a particular modulation frequency, the stored mapping can be used to disambiguate and determine stylus position even when two different light emitting devices have the same modulation frequency.
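A single-bin Goertzel evaluation, one instance per modulation frequency, recovers the same amplitude and phase information as the corresponding FFT bin. A minimal Python sketch (not the patent's implementation) is:

```python
import cmath
import math

def goertzel(samples, k):
    """Single-bin DFT via the Goertzel recursion; returns the complex DFT
    coefficient at bin k (amplitude and phase), so a handful of these
    replace a full FFT when only a few modulation frequencies are used."""
    n = len(samples)
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    # combine the two final recursion states into the DFT bin value;
    # the e^{jw} factor phase-aligns the result with a direct DFT
    return (s_prev - s_prev2 * cmath.exp(-1j * w)) * cmath.exp(1j * w)
```

For a cosine at bin k with phase phi, the returned coefficient has magnitude N/2 and phase phi, matching the direct DFT.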
As mentioned above, the light emitting devices 1304 in the array 1350 of fig. 13A may be configured to emit modulated light at a plurality of different frequencies (and optionally with a plurality of phases). In some examples, each light emitting device 1304 may emit light at a unique frequency (and optionally with a particular phase), and the modulated light from each light emitting device may be individually detected and analyzed to determine stylus position. In other examples, groups of light emitting devices 1304 may be used instead of individual light emitting devices. For example, each group of multiple light emitting devices 1304 (e.g., a 3 x 3 array of light emitting devices) may emit light having a unique frequency (and optionally phase) arrangement, and these arrangements may be detected and analyzed to identify a particular group and its location on the panel, and then determine a stylus location, as will be explained in further detail below.
Fig. 13G-4 illustrates a group of nine light emitting devices 1304 that may emit light at up to nine different modulation frequencies and with up to nine different phases, according to some examples of the present disclosure. The 3 x 3 array of fig. 13G-4 is an example of one particular grouping of light emitting devices with a unique frequency and phase arrangement that can be estimated together to determine the position of the stylus. Note that the 3 x 3 grouping of light emitting devices 1304 in fig. 13G-4 is only a subset of the entire array 1350 of light emitting devices 1304 shown in fig. 13A. The entire array 1350 may be pre-designed from multiple 3 x 3 groups of light emitting devices 1304, each group having light emitting devices that emit modulated light with a unique frequency and phase arrangement.
Referring back to fig. 13D, in some examples of the present disclosure, nine unique frequency codes may be utilized to set the frequency of the waveform generator 1386 and modulate the micro-driver 1370 with the nine frequencies. In addition, waveform generator 1386 may be programmed with nine start phases. Using these nine frequencies and nine phases, the number of possible unique groups (e.g., the number of unique 3 x 3 arrays) that can be used to identify unique locations on a touch screen or touch panel is 2^9 = 512. The optical sensing system may configure each group of light emitting devices at each location on the touch screen or panel to produce a desired unique frequency and phase arrangement. After performing a stylus scan of the touch screen or panel (block 1397 in fig. 13F) and having extracted and processed the data (see blocks 1397, 1395 and 1393 in fig. 13F), an initial image of the location of the semi-active stylus may be constructed by associating the extracted and linearized intensity values with a particular group of light emitting devices using a previously stored mapping of the location and frequency/phase arrangement of each group of light emitting devices. Fig. 13G-4 may represent a so-called "heat map" (e.g., intensity map) of the group of light emitting devices after grass suppression has been performed in block 1389 of fig. 13F. Because a particular group of light emitting devices is associated with a particular known location on the panel, a coarse stylus location may be determined. A centroid computation on the image of fig. 13G-4 (see block 1387 in fig. 13F) may then be performed to derive a more specific x-y position of the stylus.
The examples of figs. 13D, 13E, 13F, and 13G-1 through 13G-4 are described in the context of a single light detection device 1348-1 at the tip of the semi-active stylus 1300 as shown in fig. 13B. However, when multiple detectors are employed at the semi-active stylus 1300, such as detector 1348-2 in figs. 13C-1 and 13C-2, each detector may include its own amplifier 1392 and associated AFE circuitry as shown in fig. 13E, and the flowchart of fig. 13F may be performed to obtain a separate stylus position determination from each light detection device. Stylus position information from each detector may be input into further algorithms to determine stylus tilt angle, tilt direction, and position. For example, the angle of stylus tilt may be derived by mapping the intensities of the light received at the plurality of light detection devices to distances between the light emitting device and each light detection device according to the foregoing equation, and then calculating the tilt angle from those distances using the known position of the light emitting device and the known positions of the light detection devices within the stylus.
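One plausible way to turn per-detector position fixes into a tilt estimate, assuming two detectors separated by a known distance along the stylus axis, is sketched below (the geometry and function names are assumptions for illustration, not the patent's algorithm):

```python
import math

def tilt_from_detector_fixes(xy_tip, xy_ring, axial_sep):
    """Hypothetical tilt estimate from two per-detector position fixes.
    xy_tip / xy_ring are the (x, y) panel-plane estimates from the tip
    detector and a second detector mounted axial_sep further up the
    barrel; the horizontal offset between the two fixes sets the tilt
    angle from the panel normal, and its direction sets the azimuth."""
    dx = xy_ring[0] - xy_tip[0]
    dy = xy_ring[1] - xy_tip[1]
    d = math.hypot(dx, dy)                       # horizontal offset
    tilt = math.degrees(math.asin(min(d / axial_sep, 1.0)))
    azimuth = math.degrees(math.atan2(dy, dx))   # tilt direction
    return tilt, azimuth
```

A barrel detector displaced half its axial separation in x corresponds to a 30-degree tilt toward the +x direction.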
The previously discussed examples of the present disclosure employ a light emitting device within an optical sensing system. The examples of fig. 13A-13C-2 also employ a light detection device within the stylus to avoid the need for light reflection and low energy return signals from the reflected light. In other examples, which will be discussed below, the light emitting device may be located within the stylus and the light detecting device may be located within the optical sensing system, which may also avoid the need for light reflection and low energy return signals from the reflected light. The light emitting device may direct light onto and through the pattern generator to create a pattern (e.g., a reticle) on the optical sensing system. Determining the position and energy of the pattern may enable determination of the target position, hover distance (if any), tilt angle, orientation, and rotation of the stylus. As discussed above in the previous examples, the number or density of features in the pattern may depend on the number of light detection devices to be employed within the optical sensing system.
Fig. 14A illustrates a perspective view of a portion of an optical stylus system having an optical sensing system 1402 and an active light emitting stylus 1400 including a light emitting device 1454 according to some examples of the present disclosure. In the example of fig. 14A, the light emitting device 1454 in the active stylus 1400 (only a portion of which is shown in fig. 14A) may be a Light Emitting Diode (LED) that emits light through the patterned aperture 1452 and out through the tip of the stylus. Patterned aperture 1452 may be a reticle as shown in fig. 14A, or another opening shaped such that a desired light pattern 1438 appears at optical sensing system 1402. The array of light detection devices within the optical sensing system 1402 can detect the position and energy of the light pattern 1438, and one or more processors executing software or firmware within the optical stylus system (e.g., within a computing system in communication with the optical sensing system 1402 similar to the system shown in fig. 2A) can determine the target position, hover distance (if any), tilt angle, orientation, and rotation of the stylus.
In some examples, the light emitting device 1454 may be a micro LED (such as those described with respect to fig. 4E). In other examples, the light emitting device 1454 may be any type of device that produces light in the visible, infrared, or near infrared spectrum. The near infrared light emitting device 1454 may produce light having a wavelength between 800nm and 2500nm, and in some examples between 980nm and 1 micron, and in some particular implementations at 850nm or 940 nm. However, in other embodiments, a light emitting device 1454 (and corresponding light detecting device) having a wavelength of 1 micron and above (such as 1.3 microns) may be employed.
Fig. 14B illustrates a portion of an optical stylus system having an active stylus 1400 including a laser 1456 and patterned aperture 1452, according to some examples of the present disclosure. In the example of fig. 14B, a laser 1456 (only a portion of which is shown in fig. 14B) in the active stylus 1400 emits coherent light through the patterned aperture 1452 and out through the tip of the stylus. The laser 1456 may generate light of any wavelength that produces sufficient diffraction, such as light in the visible or infrared spectrum. In some examples, the laser 1456 may be frequency modulated by one or more processors executing software or firmware within the optical stylus. Patterned aperture 1452 may be designed to produce an illumination pattern 1428-1 similar to the single reticle shown in fig. 14A or the multiple reticles shown in fig. 14B, or may be another opening shaped such that a desired pattern appears at optical sensing system 1402. In the reticle example of fig. 14A, the two axes or dimensions of the reticle may be unique (e.g., different lengths, thicknesses, etc.) such that one or more processors executing software or firmware within the optical sensing system 1402 can track the stylus orientation and rotation more easily. It should be understood that the cross-hairs are merely examples and that other patterns may be similarly utilized. In some examples, patterned aperture 1452 may be a diffraction pattern generator similar to those described with respect to figs. 9A-9D, but in the example of fig. 14B, light passes through or is refracted by the patterned aperture rather than being reflected from it.
Ambient light may also be detected at a light detection device within the optical sensing system 1402 along with the patterned light from the laser 1456. To reduce the effect of ambient light on stylus detection, in some examples a lock-in amplifier may employ a homodyne detection scheme and low-pass filtering to measure the amplitude and phase of the patterned laser light with respect to a periodic reference, effectively suppressing all frequency components (e.g., ambient light) other than the modulated patterned light from the laser 1456.
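The homodyne lock-in measurement can be sketched as quadrature mixing followed by averaging over an integer number of reference cycles (a simplified digital model; real lock-in amplifiers use analog mixers and low-pass filters, and the sample counts here are hypothetical):

```python
import math

def lock_in(samples, fs, f_ref):
    """Homodyne lock-in sketch: mix the detector signal with quadrature
    references at the laser modulation frequency f_ref, then average over
    an integer number of cycles (the low-pass step).  DC ambient light and
    other frequency components average toward zero; only the component at
    f_ref survives, yielding its amplitude and phase."""
    n = len(samples)
    i = sum(s * math.cos(2 * math.pi * f_ref * k / fs)
            for k, s in enumerate(samples)) / n
    q = sum(s * math.sin(2 * math.pi * f_ref * k / fs)
            for k, s in enumerate(samples)) / n
    amp = 2.0 * math.hypot(i, q)       # recovered modulation amplitude
    phase = math.atan2(-q, i)          # phase relative to the reference
    return amp, phase
```

In the test below, a 0.3-unit ambient offset rides on a 0.8-unit modulated signal, and the lock-in recovers the modulation amplitude and phase while ignoring the offset.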
Fig. 14C illustrates two illumination patterns 1428-1 and 1428-2 that occur at an optical sensing system having an array of light detection devices 1404, according to some examples of the disclosure. In some examples, light detection devices 1404 may be micro LEDs configured as detectors, such as those described with respect to fig. 4D, capable of detecting near infrared light. In the example of fig. 14C, illumination pattern 1428-2 is a rotated image of illumination pattern 1428-1 produced by a small rotation of the stylus that produced the illumination pattern. As shown in fig. 14C, light detection devices 1404 may receive an amount of light energy that varies according to the rotation of illumination pattern 1428-2. In other examples, the light detection devices 1404 may receive an amount of light energy that varies according to the tilt of the stylus. These energy changes may be detected as an illumination pattern having an illumination energy profile, and the illumination pattern and energy profile may be processed by one or more processors executing software or firmware within the optical sensing system to determine a target location, hover distance (if any), and tilt angle of the stylus. In some examples, a spatial signature of the illumination energy profile captured by the one or more processors may be tracked over time to determine stylus orientation (e.g., a static axial position of the stylus relative to the optical sensing system 1402) and stylus rotation (e.g., pivoting about an axis of the stylus relative to the optical sensing system).
The number or density of light detection devices 1404 utilized in the optical sensing system 1402 can depend on the number of features in the illumination pattern 1428-1. For example, if the density of illumination pattern 1428-1 is relatively high (e.g., multiple cross hairs as shown in fig. 14C), fewer light detection devices 1404 may be required, because even a sparse array of light detection devices can detect minor changes (e.g., rotations) of the dense illumination pattern. On the other hand, if the density of the illumination pattern is relatively low (e.g., a single reticle as shown in fig. 14A), more light detection devices 1404 may be needed to enable detection of small changes (e.g., rotations) in the sparse illumination pattern. As mentioned above, different stylus rotations (and tilt angles, not shown in fig. 14C) may produce different orientations and energy profiles of illumination patterns 1428-1 and 1428-2, and these different orientations and illumination energy profiles may be detected and estimated to determine one or more of a target position, hover distance (if any), tilt angle, orientation, and rotation of the stylus.
Fig. 15A illustrates a cross-sectional view of a portion of an optical stylus system including an active stylus 1500 having a light emitting device 1504-1, a separation element 1564, and a plurality of light detection devices 1504-2 in contact with or hovering over an optical sensing system 1502 of a display device, according to some examples of the present disclosure. In some examples, the light emitting device 1504-1 in the active stylus 1500 (only a portion of which is shown in fig. 15A) may be a micro LED whose illumination angle is limited by total internal reflection (e.g., to angles up to the critical angle of the stylus surface material) and that will not interfere with any other detection scheme employed, such as a water-agnostic detector. In some examples, the light emitting device 1504-1 may be any type of device that generates light in the visible, infrared, or near infrared spectrum. The near infrared light emitting device 1504-1 may generate light at wavelengths between 800nm and 2500nm, and in some examples between 980nm and 1 micron, and in some particular implementations at 850nm or 940nm. However, in other implementations, other emitters (and corresponding detectors) having wavelengths of 1 micron and above (such as 1.3 microns) may be employed.
The separation element 1564 in the active stylus 1500 may be formed from a grating, prism, or other optical wavelength beam splitter to distribute or split the incoming reflected light 1510 into light of separate wavelengths, such as red and blue light. In some examples, the plurality of light detection devices 1504-2 may be configured to receive light of separate wavelengths from the separation element 1564 and generate one or more output signals. In some examples, light detection devices 1504-2 may be micro LEDs configured as detectors, such as those described with respect to fig. 4D, capable of detecting near infrared light. One or more processors executing software or firmware within an optical stylus (e.g., within a computing system similar to the system shown in fig. 2B) may use the output signals from the plurality of light detection devices 1504-2 to determine the wavelength (e.g., color) and optionally the intensity of reflected light 1510, in one example by calculating the ratio of the outputs of two photodiodes responsive to two different wavelengths, or by other suitable means. As will be explained in further detail below, one or more determinations of the wavelength and optionally the intensity of the reflected light 1510 at one or more locations along the optical sensing system 1502 can be used to estimate the location of the active stylus 1500 at the optical sensing system.
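The two-photodiode ratio computation might look like the following sketch, where the linear ratio-to-wavelength mapping and the 450-650nm span are illustrative assumptions standing in for a calibrated responsivity curve:

```python
def estimate_wavelength(sig_red, sig_blue, lo_nm=450.0, hi_nm=650.0):
    """Two-detector colorimetry sketch: after the separation element,
    one photodiode favors red and the other blue.  The normalized red
    fraction r/(r+b) is mapped linearly onto an assumed wavelength span.
    A real system would calibrate against measured photodiode
    responsivities instead of this linear stand-in."""
    frac = sig_red / (sig_red + sig_blue)
    return lo_nm + frac * (hi_nm - lo_nm)
```

Equal signals land at the middle of the span; a redder reflection yields a longer estimated wavelength.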
Fig. 15A also shows a layered structure of optical sensing system 1502, which may include a transparent cover material, polarizer 1566, touch sensor layer 1568 with encapsulant, buffer layer 1570, pixel Definition Layer (PDL) 1562 with retroreflector layer 1560 and display elements 1558-1 through 1558-3 (such as blue, red, and green OLEDs), electrical layer 1572 containing circuitry and traces for display updating, and structural layer 1574.
Fig. 15B illustrates a plan view of a portion of an optical sensing system 1502 having display elements 1558-1 through 1558-3 and a retroreflector layer 1560 according to some examples of the disclosure. In some examples, slits or holes 1576 may be formed in the retroreflector layer 1560 to allow light to pass through the display for other purposes. Although shown as a uniform layer in the example of fig. 15B, retroreflector layer 1560 may be formed to have different retroreflective properties at different locations across optical sensing system 1502. In some examples, retroreflector layer 1560 may be a diffraction grating with a chirp (spatially varying periodicity) to achieve a spectral shift of the reflected light. In one particular example, the periodicity of the grating of the retroreflector layer 1560 may gradually change from reflecting light in the blue spectrum in the upper left corner of the optical sensing system 1502 to reflecting light in the red spectrum in the lower left corner of the optical sensing system, and a spectrogram may be calculated for all locations on the optical sensing system. By sensing the wavelength (e.g., color) of reflected light in the active stylus 1500 at one or more locations in the optical sensing system 1502 in conjunction with the spectrogram, one or more processors executing software or firmware within the optical sensing system can estimate the stylus's location on the optical sensing system. 
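Given a stored spectrogram of the chirped retroreflector, position lookup reduces to inverting the wavelength gradient. The sketch below assumes a linear blue-to-red chirp along one axis of a 100mm panel (all constants hypothetical; a real spectrogram would be measured, not linear):

```python
def position_from_wavelength(wl_nm, blue_nm=450.0, red_nm=650.0, span_mm=100.0):
    """Spectrogram lookup for a linearly chirped retroreflector: the
    detected reflected wavelength is mapped back to a coordinate along
    the chirp axis.  Wavelength endpoints and panel span are assumed
    calibration constants; out-of-range readings are clamped."""
    frac = (wl_nm - blue_nm) / (red_nm - blue_nm)
    return min(max(frac, 0.0), 1.0) * span_mm
```

A detected 550nm reflection, midway between the blue and red ends of the chirp, maps to the middle of the panel.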
However, it should be understood that while the disclosed examples described herein relate to a retroreflector layer 1560 having a chirp that spans two reflective colors (red and blue) and a stylus configured to detect the two colors, in other examples, different reflective colors (more than two reflective colors) may be generated by the retroreflector layer and detected by the active stylus 1500, and different spectrograms of the varying reflective colors across multiple locations of the optical sensing system 1502 may be calculated and utilized to determine stylus position.
While the previous discussion focused on utilizing the progressively changing retroreflection properties of retroreflector layer 1560 to determine stylus location, in other examples of the present disclosure, differences in the reflection properties of structures other than the retroreflector layer may also be utilized to determine stylus location. In some examples, the reflective characteristics of the red, green, and blue display elements (1558-2, 1558-3, and 1558-1, respectively) may be different from those of the surrounding retroreflector layer 1560, as there may be no metallic composition or other reflective material under the retroreflector layer. Similarly, holes 1576 in retroreflector layer 1560 may allow reflection from different underlying materials that also produce different reflection characteristics. As indicated by the arrows in the example of fig. 15A, light from active stylus 1500 may reflect from conductive touch electrodes 1578 in touch sensor layer 1568, conductive layers in display element 1558 and electrical layer 1572, and retroreflector layer 1560, to name a few examples.
Thus, there is spatially varying reflectivity across optical sensing system 1502, and these variations can be mapped to different locations across the surface and stored to help determine stylus position. The active stylus 1500 may be designed to spatially resolve these reflectivity differences and generate a time domain signal as the stylus moves across the surface of the optical sensing system. The characteristics (e.g., amplitude) of the time domain signal may be processed by one or more processors executing software or firmware within the optical stylus to determine the position and optionally the rate of movement of the active stylus 1500. For example, while the detected reflection from display element 1558-2 (without more) is insufficient to determine stylus position due to the presence of multiple display elements 1558-2 across detection surface 1502, if reflection from retroreflector layer 1560 with a particular color is detected at approximately the same time, the color of the reflection from the retroreflector layer may be used to narrow the range of position of active stylus 1500 to a particular area of detection surface 1502 and the reflection from the display elements may be used to further narrow the range of position.
Thus, in accordance with the above, some examples of the present disclosure relate to an integrated touch screen for performing display operations and optical object sensing, the integrated touch screen comprising: a cover material having a detection surface; an array of photodetectors disposed beneath the cover material; and a first light blocking layer disposed between the photodetectors and the cover material, the first light blocking layer including a plurality of first apertures configured as a detector angle filter to block light having a detection angle smaller than a first critical angle from illuminating the photodetectors, the first critical angle being determined according to the cover material and a first medium in contact with the cover material and defined with respect to a normal of the detection surface. Additionally or alternatively to one or more of the examples disclosed above, in some examples the photodetectors comprise a plurality of Light Emitting Diodes (LEDs) configured as photodetectors and formed in the display layer. Additionally or alternatively to one or more of the examples disclosed above, in some examples the photodetectors comprise a plurality of discrete photodetectors that are separate from a plurality of Light Emitting Diodes (LEDs) configured as display elements in the display layer. Additionally or alternatively to one or more of the examples disclosed above, in some examples the photodetectors are formed in a second layer separate from the display layer. Additionally or alternatively to one or more of the examples disclosed above, in some examples the photodetectors comprise a plurality of Near Infrared (NIR) sensitive photodetectors. Additionally or alternatively to one or more of the examples disclosed above, in some examples the photodetectors are formed in a first layer separate from a second layer, the second layer comprising a plurality of discrete illuminators.
Additionally or alternatively to one or more of the examples disclosed above, in some examples, the first medium is water and the first critical angle is +/-62.7 degrees +/-1 degree. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the first critical angle is less than at least one first reflection angle determined from the cover material and a second medium in contact with the cover material. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the plurality of first apertures are configured to pass light between the first critical angle and a second detection angle that is a fixed number of degrees greater than the first critical angle. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the first critical angle is the greater of the first critical angle and a second critical angle determined from the cover material and a second medium in contact with the cover material. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the second medium is air and the second critical angle is +/-42 degrees +/-1 degree. Additionally or alternatively to one or more of the examples disclosed above, in some examples the integrated touch screen further comprises: a second light blocking layer disposed between the photodetectors and the first light blocking layer, the second light blocking layer comprising a plurality of second apertures aligned with the plurality of first apertures and configured to further block a portion of the light passing the detector angle filters that has a detection angle less than the first critical angle.
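The specific critical angles recited above (about 62.7 degrees for water, about 42 degrees for air) follow directly from Snell's law at the cover/medium boundary. As an illustrative sketch only (not part of the claims), and assuming typical refractive indices of roughly 1.50 for a glass-like cover, 1.333 for water, and 1.000 for air:

```python
import math

def critical_angle_deg(n_cover: float, n_medium: float) -> float:
    """Critical angle for total internal reflection at the cover/medium
    boundary, measured from the surface normal (Snell's law)."""
    if n_medium >= n_cover:
        raise ValueError("no total internal reflection: n_medium >= n_cover")
    return math.degrees(math.asin(n_medium / n_cover))

# Assumed indices: glass-like cover n ~ 1.50, water n ~ 1.333, air n = 1.000.
theta_water = critical_angle_deg(1.50, 1.333)  # ~62.7 degrees
theta_air = critical_angle_deg(1.50, 1.000)    # ~41.8 degrees
```

With these assumed indices the computed angles match the +/-62.7 degree and +/-42 degree figures in the text to within the stated +/-1 degree tolerance; a cover with a different index would shift both values.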
Additionally or alternatively to one or more of the examples disclosed above, in some examples the integrated touch screen further comprises: an array of illuminators, wherein the first light blocking layer comprises a plurality of second apertures configured as illuminator angle filters for blocking light from the illuminators having an illumination angle greater than a second critical angle, and the second critical angle is determined from the cover material and a second medium in contact with the cover material and is defined relative to a normal of the detection surface. Additionally or alternatively to one or more of the examples disclosed above, in some examples the illuminators include a plurality of Light Emitting Diodes (LEDs) configured as light illuminators. Additionally or alternatively to one or more of the examples disclosed above, in some examples the illuminator includes a plurality of Near Infrared (NIR) micro LEDs configured to emit NIR light. Additionally or alternatively to one or more of the examples disclosed above, in some examples the illuminator is separate from a plurality of Light Emitting Diodes (LEDs) configured as display elements in the display layer. Additionally or alternatively to one or more of the examples disclosed above, in some examples the illuminator is formed in a first layer separate from a second layer, the second layer comprising a plurality of discrete photodetectors. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the second critical angle is the smaller of the second critical angle and a third critical angle determined from the cover material and a third medium in contact with the cover material. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the second medium is air and the second critical angle is +/-42 degrees +/-1 degree.
Additionally or alternatively to one or more of the examples disclosed above, in some examples, the third medium is water and the third critical angle is +/-62.7 degrees +/-1 degree. Additionally or alternatively to one or more of the examples disclosed above, in some examples the integrated touch screen further comprises: a second light blocking layer disposed between the illuminators and the first light blocking layer, the second light blocking layer comprising a plurality of third apertures aligned with the plurality of second apertures and configured to further block a portion of the light passing the illuminator angle filters that has an illumination angle greater than the second critical angle.
Some examples of the present disclosure relate to a method for angular filtering light at a touch sensing device to improve object detection, the method comprising: determining a first critical angle from the cover material of the touch sensing device and a first medium in contact with the cover material; and blocking light having a detection angle smaller than the first critical angle from impinging on the plurality of photodetectors located below the cover material. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: light having a detection angle greater than the first critical angle is detected at a plurality of first light emitting diodes configured as the plurality of photodetectors. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: light having a detection angle greater than the first critical angle is allowed to impinge on the plurality of photodetectors through a plurality of first apertures in a first light blocking layer located between the plurality of photodetectors and the cover material. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the first critical angle is less than at least one first reflection angle determined from the cover material and a second medium in contact with the cover material. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: light having a detection angle between the first critical angle and a second detection angle that is a fixed number of degrees greater than the first critical angle is allowed to impinge on the plurality of photodetectors through the plurality of first apertures. 
Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: a second critical angle is determined from the cover material and a second medium in contact with the cover material, and the first critical angle is determined to be the greater of the first critical angle and the second critical angle. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: light having a detection angle greater than the first critical angle is allowed to impinge on the plurality of photodetectors through a plurality of second apertures in a second light blocking layer located between the plurality of photodetectors and the first light blocking layer. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: a second critical angle is determined from the cover material of the touch sensing device and a second medium in contact with the cover material, and light emitted from a plurality of illuminators located below the cover material having an illumination angle greater than the second critical angle is blocked. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: light is emitted from a plurality of second light emitting diodes configured as the plurality of illuminators. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: a third critical angle is determined from the cover material and a third medium in contact with the cover material, and the second critical angle is determined to be the smaller of the second critical angle and the third critical angle. 
Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: light having an illumination angle less than the second critical angle is allowed to pass through a plurality of first apertures in a first light blocking layer located between the plurality of illuminators and the cover material.
Some examples of the present disclosure relate to an integrated touch screen for performing display operations and optical object sensing, the integrated touch screen comprising: a cover material having a detection surface; an illuminator optically coupled to the cover material for transmitting light laterally into the cover material at a first angle equal to or greater than a critical angle of the cover material to cause total internal reflection within the cover material at the first angle; and an array of photodetectors disposed below the cover material, wherein the critical angle is defined in terms of the cover material and one or more media in contact with the cover material relative to a normal to the detection surface. Additionally or alternatively to one or more of the examples disclosed above, in some examples the integrated touch screen further comprises: a layer between the cover layer and the array of photodetectors, the layer being index mismatched with the cover material. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the cover material is selected to cause a change in total internal reflection from a first angle to a second angle when light reflected within the cover material contacts the detection surface at a location where the object contacts the detection surface, and the total internal reflection at the second angle is capable of being received and detected at one or more of the photodetectors.
Some examples of the present disclosure relate to a method for determining a hover distance for a proximate stylus, the method comprising: capturing an irradiance profile by aggregating a plurality of illumination intensity values from a plurality of light detection pixels at a touch sensing device comprising one or more processors and the plurality of light detection pixels configured to detect light from a stylus proximate a detection surface of the touch sensing device; determining an ellipse from the irradiance profile; determining a tilt of the stylus from the width and height of the ellipse; and determining a hover distance of the stylus from the tilt. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: the azimuth of the stylus is determined as the angle between the projection of the stylus onto the detection surface and a reference direction on the detection surface. Additionally or alternatively to one or more of the examples disclosed above, in some examples, determining the ellipse from the irradiance profile includes identifying a plurality of boundary pixels in the irradiance profile, and fitting the ellipse to the plurality of boundary pixels. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: the irradiance profile is baselined by removing dark current from the plurality of illumination intensity values in the irradiance profile. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: the irradiance profile is subjected to grass suppression by setting to zero the illumination intensity values of those pixels in the irradiance profile that have an illumination intensity value below a grass suppression threshold.
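The preprocessing chain described here (baselining against dark current, grass suppression against the noise floor, then identifying boundary pixels as non-zero pixels adjacent to zero pixels) can be sketched as follows. This is an illustrative reading of the claim language, not the claimed implementation; 4-connectivity and the clipping of negative baselined values are assumptions:

```python
import numpy as np

def preprocess_irradiance(profile, dark_current, grass_threshold):
    """Baseline and grass-suppress a 2D irradiance profile (array of
    illumination intensity values), then find its boundary pixels."""
    # Baselining: remove the dark-current contribution from every pixel
    # (clipped at zero, an assumption for negative residuals).
    p = np.clip(profile - dark_current, 0.0, None)
    # Grass suppression: zero out pixels below the noise ("grass") threshold.
    p[p < grass_threshold] = 0.0
    # Boundary pixels: non-zero pixels with at least one zero 4-neighbor.
    # Edge pixels with no in-bounds zero neighbor are not flagged (adequate
    # for a sketch).
    boundary = []
    rows, cols = p.shape
    for r in range(rows):
        for c in range(cols):
            if p[r, c] == 0.0:
                continue
            neighbors = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if any(0 <= rr < rows and 0 <= cc < cols and p[rr, cc] == 0.0
                   for rr, cc in neighbors):
                boundary.append((r, c))
    return p, boundary

# A 3x3 bright spot: its 8 perimeter pixels are boundary, the center is not.
profile = np.zeros((5, 5))
profile[1:4, 1:4] = 10.0
clean, boundary = preprocess_irradiance(profile, dark_current=0.5,
                                        grass_threshold=1.0)
```

The resulting boundary pixels are what the least-squares ellipse fit described above would consume.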
Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: the irradiance profile is interpolated and upsampled to produce additional illumination intensity values within the irradiance profile. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: the plurality of boundary pixels in the irradiance profile are identified by identifying those illumination intensity values within the irradiance profile that have non-zero illumination intensity values and that are adjacent to pixels in the irradiance profile that have zero illumination intensity values. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: an ellipse is fitted to the plurality of boundary pixels by performing a least squares ellipse fitting algorithm on the plurality of boundary pixels to derive parameters representing a mathematical expression of the ellipse. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: the width and height of an ellipse are calculated by calculating the centroid, minor axis and major axis of the ellipse using mathematical expressions for the ellipse. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: the tilt of the stylus is determined by applying the calculated width and height of the ellipse to a look-up table. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: the hover distance of the stylus is determined by applying the determined tilt of the stylus to a look-up table. 
Additionally or alternatively to one or more of the examples disclosed above, in some examples, determining the ellipse from the irradiance profile includes calculating a centroid from the irradiance profile, calculating a covariance matrix based on the centroid, the irradiance profile, and the sensor grid, calculating eigenvalues of the covariance matrix, and calculating a width, height, and azimuth of the ellipse corresponding to the irradiance profile from the eigenvalues of the covariance matrix. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: the irradiance profile is baselined by removing dark current from the plurality of illumination intensity values in the irradiance profile. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: the irradiance profile is subjected to grass suppression by setting to zero the illumination intensity values of those pixels in the irradiance profile that have an illumination intensity value below a grass suppression threshold. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: the tilt of the stylus is determined by applying the calculated width and height of the ellipse to a look-up table. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: the hover distance of the stylus is determined by applying the determined tilt of the stylus to a look-up table. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: the shift of the centroid is compensated for by determining a centroid offset correction value according to changes in tilt and azimuth angle and applying the offset correction value to the calculated centroid.
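The covariance-matrix variant described above (centroid, intensity-weighted covariance over the sensor grid, eigenvalues, then width/height/azimuth) can be sketched as follows. This is illustrative only, not the claimed implementation; the 2-sigma axis scaling (factor of 4 on the square-rooted eigenvalues) is an assumption:

```python
import numpy as np

def ellipse_from_profile(profile):
    """Estimate ellipse parameters (centroid, width, height, azimuth) from a
    2D irradiance profile via the intensity-weighted covariance matrix."""
    rows, cols = profile.shape
    ys, xs = np.mgrid[0:rows, 0:cols]          # sensor grid coordinates
    w = profile / profile.sum()                # normalized intensity weights
    cx, cy = (w * xs).sum(), (w * ys).sum()    # intensity-weighted centroid
    # Second central moments form the 2x2 covariance matrix.
    cxx = (w * (xs - cx) ** 2).sum()
    cyy = (w * (ys - cy) ** 2).sum()
    cxy = (w * (xs - cx) * (ys - cy)).sum()
    cov = np.array([[cxx, cxy], [cxy, cyy]])
    evals, evecs = np.linalg.eigh(cov)         # eigenvalues in ascending order
    # Axis lengths scale with the square root of the eigenvalues
    # (a 2-sigma extent is assumed here).
    height = 4.0 * np.sqrt(evals[0])           # minor axis
    width = 4.0 * np.sqrt(evals[1])            # major axis
    azimuth = np.degrees(np.arctan2(evecs[1, 1], evecs[0, 1]))
    return (cx, cy), width, height, azimuth

# An elongated Gaussian spot: major axis along x, centered at (10, 10).
yy, xx = np.mgrid[0:21, 0:21]
blob = np.exp(-((xx - 10.0) ** 2 / 18.0 + (yy - 10.0) ** 2 / 2.0))
(cx, cy), width, height, azimuth = ellipse_from_profile(blob)
```

The width and height recovered this way would then feed the look-up tables for tilt and hover distance described above.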
Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: determining an ellipse from the irradiance profile in accordance with a determination that the stylus is partially outside the detection surface of the touch-sensing device includes identifying a plurality of boundary pixels in the irradiance profile and fitting the ellipse to the plurality of boundary pixels, and determining an ellipse from the irradiance profile in accordance with a determination that the stylus is not partially outside the detection surface of the touch-sensing device includes calculating a centroid from the irradiance profile, calculating a covariance matrix based on the centroid, the irradiance profile, and the sensor grid, calculating eigenvalues of the covariance matrix, and calculating a width, a height, and an azimuth of the ellipse corresponding to the irradiance profile from the eigenvalues of the covariance matrix.
Some examples of the present disclosure relate to a method for calculating a hover distance of an object proximate to a detection surface of a touch sensing device, the method comprising: capturing an irradiance profile by aggregating a plurality of illumination intensity values from a plurality of photodetectors at a touch sensing device comprising one or more processors and the plurality of photodetectors configured to detect light from an object proximate to a detection surface of the touch sensing device; calculating an average illumination intensity value I_H from the plurality of illumination intensity values; and calculating a hover distance from the average illumination intensity value I_H, a radius r_c of an illumination source of the object, a divergence angle δ of light emanating from the illumination source, and an intensity I_c of light at the illumination source. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: the irradiance profile is captured in an axisymmetric shape. Additionally or alternatively to one or more of the examples disclosed above, in some examples the axisymmetric shape comprises one of a cone, hollow cone, rectangular cone, or star cone. Additionally or alternatively to one or more of the examples disclosed above, in some examples the object is a stylus, and the method further comprises: an irradiance profile of the modulated light received from the stylus is captured by performing analog demodulation of the light received at the plurality of photodetectors and aggregating the plurality of illumination intensity values from the plurality of photodetectors.
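The claim lists the inputs to the hover-distance calculation (I_H, r_c, δ, I_c) but not a closed form. One plausible geometric model, stated here purely as an assumption and not as the patented formula, treats the tip as a diverging cone: the illuminated radius grows as r(z) = r_c + z·tan(δ), and conserving power over the spot area gives I_H = I_c·(r_c / r(z))², which inverts to z = r_c·(√(I_c/I_H) − 1) / tan(δ):

```python
import math

def hover_distance(i_h, i_c, r_c, delta_deg):
    """Hover distance from mean received intensity under an assumed
    diverging-cone model: beam radius r(z) = r_c + z*tan(delta), and
    intensity falls with illuminated area, I_H = I_c * (r_c / r(z))**2."""
    ratio = math.sqrt(i_c / i_h)   # equals r(z) / r_c under the model
    return r_c * (ratio - 1.0) / math.tan(math.radians(delta_deg))

# Round trip: a tip at z = 5.0 (same length unit as r_c) produces an I_H
# consistent with the model, and the inverse recovers z.
r_c, delta, i_c = 1.0, 10.0, 1.0
z_true = 5.0
i_h = i_c * (r_c / (r_c + z_true * math.tan(math.radians(delta)))) ** 2
z_est = hover_distance(i_h, i_c, r_c, delta)
```

At the surface (I_H = I_c) the model returns zero hover distance, as expected.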
Some examples of the present disclosure relate to an integrated touch screen for performing display operations and optical object sensing, the integrated touch screen comprising: an array of Light Emitting Diodes (LEDs), the array of LEDs configured as a photodetector; at least one Analog Front End (AFE) comprising at least one amplifier coupleable to an array of photodetectors; and a plurality of demodulators coupled to the at least one AFE, at least some of the plurality of demodulators configured to be in a first configuration to demodulate signals at a plurality of demodulation frequencies received from the light detector, wherein the plurality of demodulation frequencies correspond to modulation frequencies of the plurality of styluses. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the plurality of demodulators includes a plurality of demodulator pairs, each demodulator pair configured to demodulate an in-phase (I) component and a quadrature (Q) component of a particular demodulation frequency corresponding to a modulation frequency of one of the plurality of styluses. Additionally or alternatively to one or more of the examples disclosed above, in some examples the integrated touch screen further comprises: at least some of the plurality of demodulators configured to be in a second configuration to identify incoming modulation frequencies, and control logic configured to deactivate those demodulator pairs that do not match any of the identified incoming modulation frequencies.
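The I/Q demodulator pairs described above recover, for each stylus modulation frequency, an amplitude that is insensitive to the unknown carrier phase: multiply the detector signal by a cosine and a sine at the demodulation frequency and combine as √(I² + Q²). A minimal digital sketch (the claimed hardware demodulators are analog; the frequencies and sample rate below are arbitrary illustration values):

```python
import numpy as np

def iq_amplitude(signal, fs, f_demod):
    """Demodulate the in-phase (I) and quadrature (Q) components of `signal`
    at `f_demod` and return the recovered amplitude sqrt(I^2 + Q^2)."""
    t = np.arange(len(signal)) / fs
    i = 2.0 * np.mean(signal * np.cos(2 * np.pi * f_demod * t))
    q = 2.0 * np.mean(signal * np.sin(2 * np.pi * f_demod * t))
    return np.hypot(i, q)

# Two styluses modulating at different frequencies on one detector signal
# (2000 samples at 100 kHz gives an integer number of cycles of each tone).
fs = 100_000.0
t = np.arange(2_000) / fs
rx = 0.8 * np.sin(2 * np.pi * 5_000 * t) + 0.3 * np.sin(2 * np.pi * 7_000 * t)
a1 = iq_amplitude(rx, fs, 5_000)   # recovers ~0.8
a2 = iq_amplitude(rx, fs, 7_000)   # recovers ~0.3
```

Each demodulator pair rejects the other stylus's tone, which is why per-frequency pairs can separate multiple styluses on the same analog front end.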
Some examples of the present disclosure relate to an optical stylus that operates with an optical sensing system for performing stylus sensing, the optical stylus comprising: a stylus body having a tip and sides; and a reflective surface located on at least a portion of the tip and the side of the optical stylus, wherein the reflective surface is configured to reflect incoming light received from the optical sensing system at a uniform angular reflection profile at a plurality of locations regardless of the angle of inclination of the stylus relative to a surface normal of the optical sensing system. Additionally or alternatively to one or more of the examples disclosed above, in some examples the reflective surface comprises a volume scattering material. Additionally or alternatively to one or more of the examples disclosed above, in some examples the volume scattering material comprises a diffuse reflector. Additionally or alternatively to one or more of the examples disclosed above, in some examples the volume scattering material comprises a lambertian reflector. Additionally or alternatively to one or more of the examples disclosed above, in some examples the volume scattering material has a reflectivity of greater than 99% for incoming light having a wavelength between 400nm and 1500 nm. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the volume scattering material is formed on the stylus in one or more patterns configured to generate a spatial signature of the reflected light for stylus orientation and rotation detection. Additionally or alternatively to one or more examples described above, in some examples the angular reflection profile is greater than 90 degrees. Additionally or alternatively to one or more of the examples disclosed above, in some examples the volume scattering material is configured to produce scattering angles having a cosine angular distribution. 
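The cosine angular distribution of a Lambertian (volume-scattering) reflector mentioned above can be simulated with standard inverse-CDF sampling: the per-angle density is proportional to cos(θ)·sin(θ), so θ = arcsin(√u) for uniform u. This is a generic Monte Carlo sketch of Lambertian scattering, not anything specific to the disclosed stylus coating:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_lambertian_angles(n):
    """Draw n scattering angles (measured from the surface normal) with a
    cosine (Lambertian) angular distribution, pdf(theta) = sin(2*theta) on
    [0, pi/2], via the inverse CDF theta = arcsin(sqrt(u))."""
    u = rng.random(n)
    return np.arcsin(np.sqrt(u))

theta = sample_lambertian_angles(100_000)
# The analytic mean of this distribution is pi/4 (~45 degrees), reflecting
# the wide angular reflection profile such a diffuse surface produces.
```

The breadth of this distribution is what makes the reflection profile consistent regardless of stylus tilt, at the cost of spreading the reflected energy over a wide angle.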
Additionally or alternatively to one or more of the examples disclosed above, in some examples the reflective surface comprises a retroreflective surface. Additionally or alternatively to one or more of the examples disclosed above, in some examples the retroreflective surface includes a saw tooth surface relief structure. Additionally or alternatively to one or more of the examples disclosed above, in some examples the retroreflective surface includes a plurality of retroreflector facets. Additionally or alternatively to one or more of the examples disclosed above, in some examples, at least some of the plurality of retroreflector facets comprise pyramidal facets. Additionally or alternatively to one or more of the examples disclosed above, in some examples, at least some of the pyramid-shaped facets have a tangent point. Additionally or alternatively to one or more of the examples disclosed above, in some examples, each pyramid-shaped facet has at least two opposing interior surfaces oriented at 90 degrees to create retroreflection. Additionally or alternatively to one or more of the examples disclosed above, in some examples, an axis of at least some of the plurality of retroreflector facets is oriented normal to the stylus body. Additionally or alternatively to one or more of the examples disclosed above, in some examples, at least one of the plurality of retroreflector facets rotates along an axis of the retroreflector facet. Additionally or alternatively to one or more of the examples disclosed above, in some examples, axes of at least some of the plurality of retroreflector facets are oriented at a non-zero angle relative to a surface normal of the stylus body. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the retroreflective surface is formed on the stylus in one or more patterns configured to generate a spatial signature of the reflected light for stylus orientation and rotation detection. 
Additionally or alternatively to one or more of the examples disclosed above, in some examples the retroreflective surface is configured to concentrate reflected light back to a location of a source of incoming light to improve stylus detection. Additionally or alternatively to one or more of the examples disclosed above, in some examples the angular reflection profile coincides with an angle of the incoming light. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the energy distribution of the angular reflection profile varies according to the tilt angle of the stylus.
Some examples of the present disclosure relate to an optical sensing system operating with an optical stylus for performing stylus sensing, the optical sensing system comprising: a plurality of optical devices, some of the plurality of optical devices configured as light emitting devices having an illumination angle less than a critical angle, and some of the plurality of optical devices configured as light detecting devices capturing a reflected energy profile of the reflected light; and one or more processors programmed to determine stylus position and tilt from the reflected energy profile. Additionally or alternatively to one or more of the examples disclosed above, in some examples the light emitting device is configured to emit light having a near infrared wavelength. Additionally or alternatively to one or more of the examples disclosed above, in some examples the light emitting device is configured to emit light having a wavelength between 980nm and 1 micron. Additionally or alternatively to one or more of the examples disclosed above, in some examples the light detection device is configured to detect reflected light having a near infrared wavelength. Additionally or alternatively to one or more of the examples disclosed above, in some examples the one or more processors are further programmed to track the reflected energy profile over time to determine stylus orientation and rotation. Additionally or alternatively to one or more of the examples disclosed above, in some examples the one or more processors are further programmed to determine the stylus tilt using one or more energy distribution thresholds within the reflected energy profile.
Some examples of the present disclosure relate to an optical stylus that operates with an optical sensing system for performing stylus sensing, the optical stylus comprising: a stylus body having a tip and sides; and a diffractive reflective surface located on at least a portion of the tip and the side of the stylus, wherein the diffractive reflective surface is configured to reflect incoming light received from the optical sensing system at a plurality of locations in a uniform reflected light pattern regardless of an angle of inclination of the stylus relative to a surface normal of the optical sensing system. Additionally or alternatively to one or more of the examples disclosed above, in some examples the diffractive reflective surface comprises a plurality of diffractive optical elements. Additionally or alternatively to one or more of the examples disclosed above, in some examples the plurality of diffractive optical elements comprises a volumetric diffraction grating. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the plurality of diffractive optical elements comprises a grid of material having a higher reflectivity and a lower reflectivity. Additionally or alternatively to one or more of the examples disclosed above, in some examples the plurality of diffractive optical elements comprise grids of materials having different reflective phase shifts. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the plurality of diffractive optical elements have a feature density that varies according to a density of the light detection device in the optical sensing system. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the diffractive reflective surface is further configured to reflect the incoming light as a reflected light pattern of a single reticle having a first axis longer than a second axis. 
Additionally or alternatively to one or more of the examples disclosed above, in some examples, the diffractive reflective surface is further configured to reflect the incoming light as a reflected light pattern of a single reticle having a first axis thicker than a second axis. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the diffractive reflective surface is further configured to reflect the incoming light as a reflected light pattern of a plurality of cross hairs.
Some examples of the present disclosure relate to an optical sensing system operating with an optical stylus for performing stylus sensing, the optical sensing system comprising: a plurality of optical devices, some of the plurality of optical devices configured as light emitting devices having an illumination angle less than a critical angle, and some of the plurality of optical devices configured as light detecting devices configured to capture a reflected energy profile of the reflected light pattern; and one or more processors programmed to determine stylus position and tilt from the reflected energy profile of the reflected light pattern. Additionally or alternatively to one or more of the examples disclosed above, in some examples the light emitting device is configured to emit light having a near infrared wavelength. Additionally or alternatively to one or more of the examples disclosed above, in some examples the light emitting device is configured to emit light having a wavelength between 980nm and 1 micron. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the number of light detection devices varies according to a feature density of the reflected light pattern. Additionally or alternatively to one or more of the examples disclosed above, in some examples the light detection device is configured to detect reflected light having a near infrared wavelength. Additionally or alternatively to one or more of the examples disclosed above, in some examples the one or more processors are further programmed to track the reflected energy profile over time to determine stylus orientation and rotation.
Some examples of the present disclosure relate to a method for determining a first stylus pattern on an optical stylus that operates with an optical sensing system for performing stylus sensing, the method comprising: determining a first location on the optical stylus and a reflected light pattern to be generated at the optical sensing system; and performing a fourier transform on the reflected light pattern to obtain the first stylus pattern at the first location. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the first stylus pattern is a reoriented version of a default stylus pattern located at a stylus tip. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the first stylus pattern is a compressed version of a default stylus pattern located at the stylus tip. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the first stylus pattern is a sloped version of a default stylus pattern located at the stylus tip. Additionally or alternatively to one or more of the examples disclosed above, in some examples the method further comprises: the pitch of the features in the first stylus pattern is incrementally changed at a plurality of locations across the optical stylus to account for incoming spherical waves.
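The Fourier-transform relationship described above (the stylus surface pattern and the far-field reflected light pattern form a transform pair) can be demonstrated with a toy numerical round trip. This is a heavily simplified sketch: a real diffractive surface quantizes amplitude and phase and must respect fabrication constraints, none of which are modeled here:

```python
import numpy as np

# Desired far-field reflected light pattern: a single reticle (cross) of
# bright lines on a 64x64 grid.
n = 64
target = np.zeros((n, n))
target[n // 2, :] = 1.0   # horizontal arm of the cross
target[:, n // 2] = 1.0   # vertical arm of the cross

# The inverse Fourier transform of the target gives a candidate (complex)
# stylus surface pattern; taking its forward FFT reproduces the target.
stylus_pattern = np.fft.ifft2(np.fft.ifftshift(target))
reproduced = np.abs(np.fft.fftshift(np.fft.fft2(stylus_pattern)))
```

Reorienting, compressing, or tilting the stylus pattern, as the claims describe for positions away from the tip, corresponds to the standard rotation, scaling, and shear properties of the Fourier transform.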
Some examples of the present disclosure relate to an optical stylus that operates with an optical sensing system for performing stylus sensing, the optical stylus comprising: a stylus body having a tip and sides; and a first light detection device located at a tip of the stylus body, wherein the first light detection device is configured to receive modulated light at a plurality of first modulation frequencies and to generate first light detection data at each of the first modulation frequencies. Additionally or alternatively to one or more of the examples disclosed above, in some examples the optical stylus further comprises: a processor programmed to determine a frequency response of the received modulated light from the first light detection data, derive an amplitude at each of the received plurality of first modulation frequencies from the determined frequency response, and determine a position of the optical stylus at the optical sensing system from the derived amplitudes at the received plurality of first modulation frequencies. Additionally or alternatively to one or more of the examples disclosed above, in some examples the processor is further programmed to determine the position of the optical stylus using a mapping of the positions of the plurality of light emitting devices in the optical sensing system to the first modulation frequency of each of the plurality of first light emitting devices. Additionally or alternatively to one or more of the examples disclosed above, in some examples the processor is further programmed to determine the location of the optical stylus by correlating the derived amplitude at each of the received plurality of first modulation frequencies with the locations of the plurality of light emitting devices in the optical sensing system using the mapping. 
Additionally or alternatively to one or more of the examples disclosed above, in some examples the processor is further programmed to linearize the derived amplitude at each of the received plurality of first modulation frequencies with distance from the light emitting device. Additionally or alternatively to one or more of the examples disclosed above, in some examples the processor is further programmed to determine the position of the optical stylus using a mapping of the positions of the plurality of first groups of light emitting devices in the optical sensing system to an arrangement of modulation frequencies within each of the groups. Additionally or alternatively to one or more of the examples disclosed above, in some examples the processor is further programmed to determine the location of the optical stylus by associating the arrangement of the detected first modulation frequencies within the detected group of light emitting devices with a location in the optical sensing system using the mapping. Additionally or alternatively to one or more of the examples disclosed above, in some examples the processor is further programmed to derive phase information at each of the plurality of first modulation frequencies from the determined frequency response, and determine the position of the optical stylus at the optical sensing system from the derived amplitudes and derived phase information at the received plurality of first modulation frequencies. Additionally or alternatively to one or more of the examples disclosed above, in some examples the optical stylus further comprises: a second light detection device positioned around a side of the stylus body, wherein each of the second light detection devices is configured to receive modulated light at a plurality of second modulation frequencies and to generate second light detection data at each of the second modulation frequencies. 
Additionally or alternatively to one or more of the examples disclosed above, in some examples the optical stylus further comprises: a processor programmed to determine a frequency response of the received modulated light from the second light detection data, derive an amplitude at each of the received plurality of second modulation frequencies from the determined frequency response at each of the second light detection devices, and determine a position of the optical stylus at the optical sensing system from the derived amplitudes at the received plurality of second modulation frequencies at the first light detection device and the one or more second light detection devices. Additionally or alternatively to one or more of the examples disclosed above, in some examples the processor is further programmed to determine rotation of the optical stylus about its axis by tracking the derived amplitudes at the received plurality of modulation frequencies at the one or more second light detection devices over time.
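The frequency-domain localization described above can be sketched as follows. All specifics here are hypothetical: the emitter-position-to-modulation-frequency mapping, the sample rate, the 1/(1+d) amplitude falloff, and the amplitude-weighted centroid estimator are illustrative stand-ins, not the disclosed implementation.

```python
import numpy as np

# Hypothetical mapping of modulation frequencies (Hz) to emitter
# positions (mm); the disclosure assumes such a mapping is known.
emitters = {1000.0: (0.0, 0.0), 1500.0: (10.0, 0.0), 2000.0: (0.0, 10.0)}

fs = 20000.0                   # photodetector sample rate (assumed)
t = np.arange(0, 0.1, 1 / fs)  # 100 ms capture window

# Simulated received signal: each emitter contributes a sinusoid whose
# amplitude falls off with distance to a stylus tip at (2, 1) mm.
tip = np.array([2.0, 1.0])
def amp(pos):  # illustrative 1/(1+d) falloff
    return 1.0 / (1.0 + np.linalg.norm(np.array(pos) - tip))
signal = sum(amp(p) * np.sin(2 * np.pi * f * t) for f, p in emitters.items())

# Frequency response via FFT; read the amplitude at each known
# modulation frequency (frequencies chosen to land on exact bins).
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(t), 1 / fs)
amps = {f: 2 * np.abs(spectrum[np.argmin(np.abs(freqs - f))]) / len(t)
        for f in emitters}

# Position estimate: amplitude-weighted centroid of emitter positions.
w = np.array(list(amps.values()))
pts = np.array([emitters[f] for f in amps])
estimate = (w[:, None] * pts).sum(axis=0) / w.sum()
```

Linearizing amplitude against distance, or using phase information alongside amplitude, as the paragraphs above contemplate, would replace the simple centroid with a fitted distance model.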
Some examples of the present disclosure relate to an optical sensing system operating with an optical stylus for performing stylus sensing, the optical sensing system comprising: an array of light emitting devices, each light emitting device configured to emit light at a particular modulation frequency such that light at a plurality of modulation frequencies is emitted from the array of light emitting devices, wherein the plurality of modulation frequencies is selected to be detectable by the optical stylus. Additionally or alternatively to one or more of the examples disclosed above, in some examples the light emitting device is configured to emit light having a near infrared wavelength. Additionally or alternatively to one or more of the examples disclosed above, in some examples the light emitting device is configured to emit light having a wavelength between 980nm and 1 micron.
Some examples of the present disclosure relate to an optical stylus that operates with an optical sensing system for performing stylus sensing, the optical stylus comprising: a stylus body having a tip, a light emitting device, and a pattern generator disposed between the light emitting device and the tip, wherein the pattern generator is configured to generate an illumination pattern through the tip of the stylus body when light from the light emitting device impinges on the pattern generator. Additionally or alternatively to one or more of the examples disclosed above, in some examples the light emitting device is a Light Emitting Diode (LED) and the pattern generator is a patterned aperture configured to generate the illumination pattern. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the LEDs are configured to emit light having near infrared wavelengths. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the LEDs are configured to emit light having a wavelength between 980nm and 1 micron. Additionally or alternatively to one or more of the examples disclosed above, in some examples the light emitting device is a laser and the pattern generator is a diffraction pattern generator configured to generate the illumination pattern. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the diffraction pattern generator is further configured to generate the illumination pattern as a single cross hair with a first axis longer than a second axis. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the diffraction pattern generator is further configured to generate the illumination pattern as a single cross hair with a first axis thicker than a second axis.
Additionally or alternatively to one or more of the examples disclosed above, in some examples the diffraction pattern generator is configured to generate the illumination pattern as a plurality of cross hairs.
Some examples of the present disclosure relate to an optical sensing system operating with an optical stylus for performing stylus sensing, the optical sensing system comprising: a plurality of light detection devices configured to capture an illumination energy profile of the illumination pattern; and one or more processors programmed to determine one or more of stylus position, tilt, orientation, and rotation from the illumination energy profile of the illumination pattern. Additionally or alternatively to one or more of the examples disclosed above, in some examples the optical sensing system further comprises: a lock-in amplifier communicatively coupled to one or more of the plurality of light detection devices to filter out ambient light and separate it from the illumination pattern. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the number of light detection devices varies according to a feature density of the illumination pattern. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the plurality of light detection devices are configured to detect light having near infrared wavelengths forming the illumination pattern. Additionally or alternatively to one or more of the examples disclosed above, in some examples the one or more processors are further programmed to track the illumination energy profile over time to determine stylus orientation and rotation.
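The energy-profile processing described above can be sketched as a simple weighted centroid. The 8x8 detector grid and its readings are invented for illustration; a real system would derive tilt, orientation, and rotation from richer features of the profile than a single centroid.

```python
import numpy as np

# Hypothetical 8x8 grid of photodetector readings (arbitrary units)
# capturing the illumination energy profile of the stylus pattern.
profile = np.zeros((8, 8))
profile[3, 5] = 4.0   # peak of the projected pattern
profile[3, 4] = 2.0
profile[2, 5] = 2.0
profile[4, 5] = 1.0

# Stylus position estimate: energy-weighted centroid of the profile.
rows, cols = np.indices(profile.shape)
total = profile.sum()
position = (float((rows * profile).sum() / total),
            float((cols * profile).sum() / total))

# Asymmetry of energy around the centroid (here, more energy on one
# side of the peak) is one possible cue for tilt and orientation.
```

Tracking `position` and the profile's asymmetry over successive frames corresponds to the orientation-and-rotation tracking mentioned above.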
Some examples of the present disclosure relate to an optical stylus that operates with an optical sensing system for performing stylus sensing, the optical stylus comprising: a stylus body having a tip and sides; a light emitting device located in the stylus tip; a separation element in the tip and configured to separate the incoming light into a plurality of wavelengths; a plurality of light detection devices located in the tip and optically coupled to the separating element, each light detection device for receiving light from the separating element having a different wavelength; and a processor communicatively coupled to the plurality of light detection devices and configured to determine one or more wavelengths of the incoming light and determine a position of the stylus on the optical sensing system from the one or more wavelengths and a spectrogram of the optical sensing system.
Some examples of the present disclosure relate to an optical sensing system operating with an optical stylus for performing stylus sensing, the optical sensing system comprising: an array of display elements; and a retroreflector layer formed between the array of display elements, the retroreflector layer formed to have varying retroreflection properties at different locations across the optical sensing system, wherein the display elements and the retroreflector layer are configured to reflect light at different wavelengths for detection by the optical stylus. Additionally or alternatively to one or more of the examples disclosed above, in some examples the retroreflector layer includes a diffraction grating having a varying periodicity to produce a spectral shift. Additionally or alternatively to one or more of the examples disclosed above, in some examples the retroreflector layer forms a chirp across the optical sensing system. Additionally or alternatively to one or more of the examples disclosed above, in some examples the optical sensing system further comprises: a plurality of touch electrodes; and display electronics, wherein the plurality of electrodes and the display electronics are configured to reflect light at different wavelengths for detection by the optical stylus.
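The wavelength-to-position scheme of the two paragraphs above can be sketched with a linear chirp model. The endpoint wavelengths, display width, and linear mapping are assumptions for illustration only; the disclosure requires only that retroreflection properties vary with location across the optical sensing system.

```python
# Assumed linear chirp: reflected wavelength increases linearly with
# the x position across the display. All constants are illustrative.
WL_LEFT = 980.0    # nm, reflected wavelength at x = 0 mm (assumed)
WL_RIGHT = 1000.0  # nm, reflected wavelength at x = 200 mm (assumed)
WIDTH_MM = 200.0   # display width (assumed)

def position_from_wavelength(wl_nm: float) -> float:
    """Invert the linear chirp to recover the x position in mm."""
    return (wl_nm - WL_LEFT) / (WL_RIGHT - WL_LEFT) * WIDTH_MM

# A reflection detected at 990 nm maps to the middle of the display.
x = position_from_wavelength(990.0)
```

In the stylus of the preceding paragraph, the separation element and per-wavelength detectors would supply `wl_nm`, and the spectrogram of the optical sensing system would play the role of this lookup.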
Although examples of the present disclosure have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. It is to be understood that such variations and modifications are to be considered included within the scope of the examples of the present disclosure as defined by the appended claims.

Claims (14)

1. An optical stylus operative with an optical sensing system for performing stylus sensing, comprising:
a stylus body having a tip and sides; and
a first light detection device located at the tip of the stylus body;
wherein the first light detection device is configured to receive modulated light at a plurality of first modulation frequencies and to generate first light detection data at each first modulation frequency.
2. The optical stylus of claim 1, further comprising: a processor programmed to:
determining a frequency response of the received modulated light from the first light detection data;
deriving from the determined frequency response an amplitude at each of the received plurality of first modulation frequencies; and
determining a position of the optical stylus at the optical sensing system from the derived amplitudes at the received plurality of first modulation frequencies.
3. The optical stylus of claim 2, the processor further programmed to determine the location of the optical stylus using a mapping of locations of a plurality of light emitting devices in the optical sensing system to the first modulation frequency of each of the plurality of light emitting devices.
4. The optical stylus of claim 3, the processor further programmed to determine the location of the optical stylus by correlating the derived amplitude at each of the received plurality of first modulation frequencies with locations of a plurality of light emitting devices in the optical sensing system using the mapping.
5. The optical stylus of claim 2, the processor further programmed to linearize the derived amplitude at each of the received plurality of first modulation frequencies with distance from the light emitting device.
6. The optical stylus of claim 2, the processor further programmed to determine the location of the optical stylus using a mapping of locations of a plurality of first groups of light emitting devices in the optical sensing system to an arrangement of modulation frequencies within each of the groups.
7. The optical stylus of claim 6, the processor further programmed to determine the location of the optical stylus by correlating an arrangement of detected first modulation frequencies within a detected group of light emitting devices with locations in the optical sensing system using the mapping.
8. The optical stylus of claim 2, the processor further programmed to:
deriving phase information at each of the plurality of first modulation frequencies from the determined frequency response; and
determining the position of the optical stylus at the optical sensing system from the derived amplitudes and derived phase information at the received plurality of first modulation frequencies.
9. The optical stylus of claim 1, further comprising:
a second light detection device positioned around the side of the stylus body;
wherein each of the second light detection devices is configured to receive modulated light at a plurality of second modulation frequencies and to generate second light detection data at each of the second modulation frequencies.
10. The optical stylus of claim 9, further comprising: a processor programmed to:
determining a frequency response of the received modulated light from the second light detection data;
deriving an amplitude at each of the received plurality of second modulation frequencies from the determined frequency response at each second light detection device; and
determining a position of the optical stylus at the optical sensing system from the derived amplitudes at the received plurality of second modulation frequencies at the first light detection device and the one or more second light detection devices.
11. The optical stylus of claim 10, the processor further programmed to determine rotation of the optical stylus about its axis by tracking the derived amplitudes at the received plurality of modulation frequencies at the one or more second light detection devices over time.
12. An optical sensing system operative with an optical stylus for performing stylus sensing, comprising:
an array of light emitting devices, each light emitting device configured to emit light at a particular modulation frequency such that light at a plurality of modulation frequencies is emitted from the array of light emitting devices;
wherein the plurality of modulation frequencies are selected to be detectable by the optical stylus.
13. The optical sensing system of claim 12, wherein the light emitting device is configured to emit light having a near infrared wavelength.
14. The optical sensing system of claim 13, wherein the light emitting device is configured to emit light having a wavelength between 980nm and 1 micron.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311423947.9A CN117762276A (en) 2022-09-23 2023-09-22 Photo-induced sensing enabled display for stylus detection

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US63/377,003 2022-09-23
US63/496,258 2023-04-14
US18/462,330 2023-09-06
US18/462,330 US20240118773A1 (en) 2022-09-23 2023-09-06 Photo-sensing enabled display for stylus detection

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202311423947.9A Division CN117762276A (en) 2022-09-23 2023-09-22 Photo-induced sensing enabled display for stylus detection

Publications (1)

Publication Number Publication Date
CN117762275A true CN117762275A (en) 2024-03-26

Family

ID=90318944

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202311230108.5A Pending CN117762275A (en) 2022-09-23 2023-09-22 Photo-induced sensing enabled display for stylus detection
CN202311423947.9A Pending CN117762276A (en) 2022-09-23 2023-09-22 Photo-induced sensing enabled display for stylus detection

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202311423947.9A Pending CN117762276A (en) 2022-09-23 2023-09-22 Photo-induced sensing enabled display for stylus detection

Country Status (1)

Country Link
CN (2) CN117762275A (en)

Also Published As

Publication number Publication date
CN117762276A (en) 2024-03-26

Similar Documents

Publication Publication Date Title
US10963671B2 (en) Multifunction fingerprint sensor having optical sensing capability
US11422661B2 (en) Sensing system for detection of light incident to a light emitting layer of an electronic device display
WO2019114276A1 (en) Fingerprint recognition device, fingerprint recognition method, and display device
US10268884B2 (en) Optical fingerprint sensor under a display
US9103658B2 (en) Optical navigation module with capacitive sensor
CN106133664B (en) Frequency conversion in touch sensors
US20180335893A1 (en) Heterogeneous tactile sensing via multiple sensor types using spatial information processing acting on initial image processed data from each sensor
US20210089741A1 (en) Thin-Film Transistor Optical Imaging System with Integrated Optics for Through-Display Biometric Imaging
US8035625B2 (en) Touch screen
CN112070018B (en) Fingerprint identification device and electronic equipment
US9377355B2 (en) Optical sensor apparatus and image sensing apparatus integrating multiple functions
US8766949B2 (en) Systems and methods for determining user input using simultaneous transmission from multiple electrodes
US11314355B2 (en) Capacitive sensor patterns
KR20170018837A (en) Detector for determining a position of at least one object
US20080029316A1 (en) Method for detecting position of input devices on a screen using infrared light emission
US10866447B2 (en) Display panel, display apparatus, and method for manufacturing a display panel
CN114120374A (en) Fingerprint authentication apparatus, display apparatus including the same, and method of authenticating fingerprint
US9201511B1 (en) Optical navigation sensor and method
CN117762275A (en) Photo-induced sensing enabled display for stylus detection
US20240118773A1 (en) Photo-sensing enabled display for stylus detection
KR20100075327A (en) Touch type display device
KR20120066381A (en) Optical touch screen panel
US8896553B1 (en) Hybrid sensor module
CN215869390U (en) Optical sensing system and electronic device
US9098144B1 (en) Adaptive ambient light auto-movement blocking in optical navigation modules

Legal Events

Date Code Title Description
PB01 Publication