WO2021211127A1 - Light signal identification - Google Patents

Light signal identification

Info

Publication number
WO2021211127A1
WO2021211127A1 (PCT application PCT/US2020/028525)
Authority
WO
WIPO (PCT)
Prior art keywords
waveform
spectral
processor
computing device
examples
Prior art date
Application number
PCT/US2020/028525
Other languages
French (fr)
Inventor
James Michael MANN
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P. filed Critical Hewlett-Packard Development Company, L.P.
Priority to PCT/US2020/028525 priority Critical patent/WO2021211127A1/en
Publication of WO2021211127A1 publication Critical patent/WO2021211127A1/en

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04R — LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R23/00 — Transducers other than those covered by groups H04R9/00 - H04R21/00
    • H04R23/008 — Transducers other than those covered by groups H04R9/00 - H04R21/00 using optical signals for detecting or generating sound
    • H04R2201/00 — Details of transducers, loudspeakers or microphones covered by H04R1/00 but not provided for in any of its subgroups
    • H04R2201/40 — Details of arrangements for obtaining desired directional characteristic by combining a number of identical transducers covered by H04R1/40 but not provided for in any of its subgroups
    • H04R2201/401 — 2D or 3D arrays of transducers
    • H04R2201/403 — Linear arrays of transducers
    • H04R2203/00 — Details of circuits for transducers, loudspeakers or microphones covered by H04R3/00 but not provided for in any of its subgroups

Definitions

  • FIG. 1 is a block diagram of an example of a computing device that may be utilized to identify or detect a light signal
  • FIG. 2 is a diagram illustrating examples of waveforms to illustrate some of the techniques that may be utilized for light signal identification described herein;
  • FIG. 3 is a diagram illustrating examples of spectral waveforms to illustrate some of the techniques that may be utilized for light signal identification described herein;
  • FIG. 4 is a block diagram of an example of an audio device that may be utilized for light signal identification.
  • FIG. 5 is a block diagram illustrating an example of a computer-readable medium for light signal identification.
  • An electronic device may be a device that includes electronic circuitry.
  • an electronic device may include integrated circuitry (e.g., transistors, digital logic, semiconductor technology, etc.).
  • Examples of electronic devices include computing devices.
  • a computing device may be a device to execute logic and/or perform computation. Examples of computing devices may include laptop computers, desktop computers, smart speakers, audio devices, digital assistants, smartphones, tablet devices, wireless communication devices, game consoles, smart appliances, vehicles with electronic components, aircraft, drones, robots, Internet of Things (IoT) devices, etc.
  • a light signal may be a signal that is transmitted by light.
  • light may be modulated (e.g., amplitude-modulated) to convey a signal.
  • an audio signal may be converted to an amplitude-modulated light signal.
  • Laser signals may be an example of light signals.
  • a light signal (e.g., laser signal) may be directed to a microphone.
  • the microphone may capture the light signal due to the photoacoustic effect.
  • the microphone may convert the light signal to an electronic audio signal.
  • a light signal may be injected into a device with a microphone.
  • attackers may use lasers to command or disrupt devices with microphones.
  • a laser attack on microphones may be carried out when an attacker's laser has line-of-sight to a microphone and can accurately target the microphone membrane.
  • Approaches to guard against a laser attack that introduce extra components may be costly.
  • It may be beneficial to detect a light signal. For example, some approaches described herein use a microphone array or arrays to detect a light signal (e.g., a light signal attack or laser attack).
  • a microphone array may be a group of microphones.
  • a microphone array may include two, three, or more microphones.
  • Some approaches described herein may analyze multiple waveforms from respective microphones of a microphone array to determine whether a microphone is capturing a waveform that is different from a waveform or waveforms from another microphone or microphones. For example, because an attacker may use a single laser to target a single microphone, the other microphone(s) may produce waveforms (e.g., a user’s speech, ambient sound, and/or noise) that differ significantly from the waveform of the targeted microphone.
  • Through waveform analysis, a computing device may determine whether one microphone is capturing a significantly different or unexpected waveform, which may occur if the microphone is targeted with a light signal (e.g., light signal attack, laser attack, etc.).
  • the computing device may alert a user when a light signal is detected and/or may disregard information (e.g., voice commands, etc.) conveyed by the light signal. Accordingly, some examples of the techniques described herein may be beneficial to improve security for computing devices (e.g., smart speakers, smart phones, tablet devices, computers, vehicles, etc.) that receive waveforms from multiple microphones.
  • the drawings provide examples in accordance with the description; however, the description is not limited to the examples provided in the drawings.
  • FIG. 1 is a block diagram of an example of a computing device 102 that may be utilized to identify or detect a light signal.
  • Examples of the computing device 102 may include audio devices, smart speakers, laptop computers, smartphones, tablet devices, wireless communication devices, game consoles, vehicles with electronic components, aircraft, drones, robots, etc.
  • the computing device 102 may perform one, some, or all of the functions, operations, elements, procedures, etc., described in one, some, or all of FIG. 1-5.
  • the computing device 102 may include a processor 104.
  • the processor 104 may be any of a central processing unit (CPU), a semiconductor-based microprocessor, a graphics processing unit (GPU), a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), and/or another hardware device for logic execution and/or computation.
  • the processor 104 may retrieve and/or execute instructions stored in a memory (not shown in FIG. 1). For example, the processor 104 may fetch, decode, and/or execute instructions stored in memory.
  • a memory may be any electronic, magnetic, optical, and/or other physical storage device that contains or stores electronic information (e.g., instructions and/or data).
  • the memory may be, for example, Random Access Memory (RAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Dynamic Random Access Memory (DRAM), Synchronous DRAM (SDRAM), magnetoresistive random-access memory (MRAM), phase change RAM (PCRAM), memristor, flash memory, a storage device, and/or an optical disc, etc.
  • the memory may be a non-transitory tangible machine-readable storage medium, where the term “non-transitory” does not encompass transitory propagating signals.
  • the processor 104 may be in electronic communication with memory.
  • the computing device 102 may include an input/output interface (not shown) through which the computing device 102 may communicate with an external and/or remote device or devices (not shown).
  • the input/output interface may include hardware and/or machine-readable instructions to enable the processor 104 to communicate with the external and/or remote device or devices.
  • the input/output interface may enable a wired or wireless connection to the external and/or remote device or devices.
  • the computing device 102 (e.g., input/output interface) may include a transmitter or transmitters to send a signal or signals.
  • the computing device 102 may include a receiver or receivers to receive a signal or signals.
  • the transmitter and/or the receiver may be coupled to an antenna or antennas of the computing device 102 to transmit and/or receive a wireless signal.
  • a wireless signal may be an electromagnetic signal and/or radio signal. Examples of wireless signals may include communication signals, cellular signals, Wi-Fi signals, Bluetooth signals, etc.
  • the input/output interface may include a network interface card and/or may also include hardware and/or machine-readable instructions to enable the processor 104 to communicate with various input and/or output devices, such as a keyboard, a mouse, a display or displays, a speaker or speakers, a light or lights, a haptic motor or motors, a transmitter or transmitters, another apparatus, etc.
  • a user may utilize the input device(s) to input instructions and/or indications into the computing device 102.
  • the output device(s) may be utilized to output an alert as described herein.
  • the computing device 102 may include a microphone array 106.
  • the microphone array 106 may be a group of microphones (e.g., two, three, or more microphones).
  • a microphone may be a transducer that converts ambient waves (e.g., acoustic waves, sounds, noise, light signals, etc.) into electronic signals.
  • the microphone array 106 may convert ambient waves into waveform signals.
  • a first microphone of the microphone array 106 may produce a first waveform signal 112 and a second microphone of the microphone array 106 may produce a second waveform signal 114.
  • a waveform may be a representation of a wave or waves (e.g., variations in air pressure, sensor position, and/or microphone diaphragm displacement).
  • a waveform signal may be an electronic signal that indicates a waveform.
  • a first microphone of the microphone array 106 may produce a first waveform signal 112 indicating a first waveform that represents waves captured by the first microphone.
  • a second microphone of the microphone array 106 may produce a second waveform signal 114 indicating a second waveform that represents waves captured by the second microphone.
  • different microphones of the microphone array 106 may be arranged, distanced, and/or spaced.
  • the microphone array 106 may include an array of microphones that face the same direction or that face different directions (e.g., opposite directions, different angular directions, orthogonal directions, etc.).
  • the microphone array 106 may include a group of microphones arranged in a line (e.g., a linear array), a group of microphones on different sides of the computing device 102, and/or microphones arranged to face different angles.
  • different microphones in the microphone array 106 may capture waveform signals with waveforms that exhibit a similar aspect or aspects.
  • different microphones may capture waveform signals with waveforms that exhibit similarities in amplitudes, magnitudes, frequency (e.g., dominant frequency or frequencies, pitch, etc.), and/or timing.
  • An aspect or aspects of waveforms from the microphone array 106 may be utilized to determine whether a light signal 110 is sensed by the microphone array 106 (e.g., by a microphone of the microphone array 106).
  • An aspect may be a characteristic or parameter of a waveform.
  • the processor 104 may analyze a first waveform and a second waveform from the microphone array 106 to identify a light signal 110 sensed by the microphone array 106. For example, the processor 104 may perform light signal identification 108 based on waveforms from the microphone array 106. In some examples, the processor 104 may execute light signal identification instructions stored in a memory to perform light signal identification 108. In some examples, the microphone array 106 may provide a first waveform signal 112 and a second waveform signal 114 to the processor 104.
  • the processor 104 may utilize a first waveform indicated by the first waveform signal 112 from a first microphone and a second waveform indicated by the second waveform signal 114 from a second microphone to determine and/or identify whether a light signal 110 is captured by the microphone array.
  • two waveforms corresponding to two microphones may be utilized.
  • three or more waveforms corresponding to three or more respective microphones may be utilized. For instance, the operations described herein for two waveforms may be performed for three or more waveforms in some examples.
  • the processor 104 may determine an aspect or aspects (e.g., parameter(s)) of the first waveform and the second waveform.
  • aspects may include amplitudes, magnitudes, peak amplitudes (e.g., temporal and/or spectral peak amplitudes), peak magnitudes (e.g., temporal and/or spectral peak magnitudes), average amplitudes (e.g., temporal average amplitudes and/or spectral average amplitudes), average magnitudes (e.g., temporal average magnitudes and/or spectral average magnitudes), envelopes (e.g., temporal envelopes and/or spectral envelopes), timing (e.g., peak timing), frequencies (e.g., peak frequencies, pitches), statistical measures (e.g., average, standard deviation, and/or variance of temporal waveforms and/or spectral waveforms), noise waveform aspects, etc.
  • the analysis performed herein may be performed on a time period of a waveform or waveforms (e.g., the first waveform and the second waveform).
  • analyzing the first waveform and the second waveform may include performing the analysis relative to a period or portion of the first waveform signal 112 and/or the second waveform signal 114.
  • the processor 104 may compare the aspect(s) (e.g., parameter(s)) to determine whether the first waveform and the second waveform are similar or dissimilar. For example, the processor 104 may utilize a threshold or thresholds, a range or ranges, and/or a correlation or correlations to determine similarity or dissimilarity. In some examples, when the first waveform and the second waveform are similar, the first microphone and the second microphone may have likely received acoustic waves that are expressed by the first waveform signal 112 and the second waveform signal 114. When the first waveform and the second waveform are dissimilar, the first microphone or the second microphone may have likely received a light signal 110 that is expressed by the first waveform signal 112 or the second waveform signal 114.
  • the processor 104 may determine and/or provide an indication of whether a light signal 110 is identified based on a comparison between the waveforms. For instance, if the processor 104 determines that the waveforms are dissimilar, the processor 104 may determine and/or provide an indication (e.g., alert) that a light signal 110 is identified. If the processor 104 determines that the waveforms are similar, the processor 104 may determine that no light signal is identified. In some examples, the processor 104 may provide an indication that no light signal is identified. Examples of techniques to determine similarity and/or dissimilarity between waveforms from different microphones of the microphone array 106 are described herein. The computing device 102 may utilize a technique or a combination of techniques to identify whether a light signal 110 (e.g., laser) is sensed by a microphone of the microphone array 106.
  • the processor 104 may detect a temporal peak or peaks of the first waveform and/or of the second waveform.
  • the term “temporal” may relate to time and/or the time domain.
  • the processor 104 may detect the temporal peak(s) using a temporal peak detection threshold.
  • a temporal peak detection threshold may be static or may be relative (e.g., determined based on the waveform(s)).
  • the processor 104 may determine a maximum or maxima for values of a waveform that are greater than a static temporal peak threshold to produce a temporal peak amplitude or magnitude.
  • the processor 104 may determine a maximum or maxima for values of a waveform that are greater than a relative temporal peak threshold.
  • a relative temporal peak threshold may be determined relative to a statistical measure (e.g., a multiple of a standard deviation of amplitude or magnitude of the waveform).
  • a maximum or maxima for values of a waveform that are greater than the relative temporal peak threshold may be the temporal peak amplitude(s) or magnitude(s).
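As a sketch of the temporal peak detection just described (a Python outline; the k = 3 standard-deviation multiple for the relative threshold is an assumed value, and the function name is hypothetical):

```python
import statistics

def detect_temporal_peaks(waveform, static_threshold=None, k=3.0):
    """Return (sample_index, magnitude) pairs for local maxima whose
    magnitude exceeds a threshold. When static_threshold is None, a
    relative threshold of k standard deviations of the sample
    magnitudes is used (k = 3 is an illustrative multiple)."""
    magnitudes = [abs(x) for x in waveform]
    if static_threshold is None:
        threshold = k * statistics.pstdev(magnitudes)
    else:
        threshold = static_threshold
    peaks = []
    for i in range(1, len(magnitudes) - 1):
        if (magnitudes[i] > threshold
                and magnitudes[i] >= magnitudes[i - 1]
                and magnitudes[i] > magnitudes[i + 1]):
            peaks.append((i, magnitudes[i]))
    return peaks
```

With the relative threshold, only an unusually strong peak survives; passing a static threshold instead keeps every local maximum above that fixed level.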
  • the computing device 102 may determine a difference between the first waveform and the second waveform.
  • differences may include a temporal subtraction, a temporal peak amplitude difference, a temporal peak magnitude difference, temporal signal-to-noise ratio (SNR) difference, and/or a statistical temporal difference (e.g., difference of averages, standard deviations, variances, etc.) between the first waveform and the second waveform.
  • the processor 104 may determine a subtraction between the first waveform and the second waveform to produce the temporal subtraction.
  • the processor 104 may determine a temporal peak amplitude or magnitude (e.g., a maximum) or temporal peak amplitudes or magnitudes (e.g., localized maxima) of the first waveform and the second waveform. The processor 104 may determine a subtraction or distance between the temporal peak amplitude(s) and/or the temporal peak magnitude(s) to determine the difference. In some examples, the processor 104 may determine a first temporal SNR for the first waveform and a second temporal SNR for the second waveform. The processor 104 may determine a subtraction or difference between the first temporal SNR and the second temporal SNR to determine the temporal SNR difference.
  • the computing device 102 may determine whether the difference satisfies a temporal difference criterion to identify the light signal.
  • a temporal difference criterion may include a temporal difference threshold or thresholds.
  • the processor 104 may compare the difference(s) (e.g., temporal subtraction, temporal peak amplitude difference, temporal SNR difference, and/or statistical temporal difference, etc.) to a temporal difference threshold or thresholds (e.g., temporal amplitude difference threshold, temporal magnitude difference threshold, temporal SNR difference threshold, and/or temporal statistic difference threshold, etc.).
  • the temporal difference threshold(s) may be expressed in terms of decibels (dB), voltage(s), percentage(s), etc.
  • the processor 104 may determine whether the difference is greater than 30%, 40%, 50%, 60%, 65%, etc. (e.g., whether the temporal peak amplitude difference is greater than 50% of a temporal peak amplitude of the first waveform or the second waveform). In a case that a difference is greater than the temporal threshold, the processor 104 may identify that the light signal 110 is sensed by the microphone array.
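The percentage-based criterion in the example above might be checked as follows (a Python sketch; the 50% figure follows the example in the text, and comparing against the larger of the two peaks is an assumed reading):

```python
def temporal_peak_difference_flag(waveform_1, waveform_2, fraction=0.5):
    """Flag a possible light signal when the temporal peak magnitudes
    of two microphones differ by more than `fraction` (e.g., 50%) of
    the larger peak, per the percentage example in the text."""
    peak_1 = max(abs(x) for x in waveform_1)
    peak_2 = max(abs(x) for x in waveform_2)
    reference = max(peak_1, peak_2)
    return reference > 0 and abs(peak_1 - peak_2) > fraction * reference
```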
  • the computing device 102 may determine a first time component of the first waveform and a second time component of the second waveform.
  • a time component may be a value or values that indicate a temporal distribution and/or temporal characteristic of a waveform. Examples of time components may include peak time or times, a temporal envelope, and/or envelope coefficients.
  • the processor 104 may determine maximum temporal peak times and/or localized temporal peak times of the first waveform and the second waveform. For instance, the processor 104 may determine the time or times corresponding to the temporal peak(s), which may be determined as described above.
  • the processor 104 may determine a temporal envelope or envelopes. For example, the processor 104 may determine a curve that outlines the peaks of a waveform. In some examples, the processor 104 may perform linear prediction analysis to determine an envelope or envelopes (e.g., and/or coefficients) of the waveforms.
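One simple way to produce a curve that outlines the peaks is a peak follower (an alternative to the linear prediction analysis the text also mentions; the decay factor is an assumed value):

```python
def temporal_envelope(waveform, decay=0.9):
    """Peak-follower envelope: jumps to each new magnitude peak and
    decays geometrically between peaks (decay = 0.9 is illustrative)."""
    envelope = []
    level = 0.0
    for sample in waveform:
        level = max(abs(sample), level * decay)
        envelope.append(level)
    return envelope
```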
  • the computing device 102 may identify the light signal 110 in response to determining that the first time component does not match the second time component. For example, the processor 104 may determine whether the first time component matches the second time component by comparing the time components. In some examples, the processor 104 may correlate the temporal envelopes of the first waveform and the second waveform. If a maximum correlation is less than a correlation threshold (e.g., 50%, 60%, 70%, 75%, etc.), the processor 104 may determine that the first time component does not match the second time component and/or that a light signal 110 is sensed by the microphone array 106.
  • the processor 104 may correlate the first waveform and the second waveform. If a maximum correlation is less than a correlation threshold (e.g., 50%, 60%, 70%, 75%, etc.), the processor 104 may determine that the first waveform does not match the second waveform and/or that a light signal 110 is sensed by the microphone array 106.
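The correlation test described above can be sketched as a direct O(n²) normalized cross-correlation in Python; the 70% threshold is one of the example values from the text, and the function names are hypothetical:

```python
import math

def max_normalized_correlation(waveform_1, waveform_2):
    """Maximum normalized cross-correlation over all integer lags for
    two equal-length waveforms; the result lies in [-1, 1]."""
    n = len(waveform_1)
    energy_1 = math.sqrt(sum(x * x for x in waveform_1))
    energy_2 = math.sqrt(sum(x * x for x in waveform_2))
    if energy_1 == 0 or energy_2 == 0:
        return 0.0
    best = -1.0
    for lag in range(-(n - 1), n):
        product = sum(waveform_1[i] * waveform_2[i - lag]
                      for i in range(max(0, lag), min(n, n + lag)))
        best = max(best, product / (energy_1 * energy_2))
    return best

def waveforms_mismatch(waveform_1, waveform_2, threshold=0.7):
    """True when the best correlation falls below the threshold (70%
    is one of the example threshold values in the text)."""
    return max_normalized_correlation(waveform_1, waveform_2) < threshold
```

Searching over lags makes the test insensitive to the small inter-microphone delays that a legitimate acoustic source would produce.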
  • the processor 104 may determine whether a peak time (e.g., time of a maximum peak) of the first waveform is within a threshold temporal range from a peak time (e.g., time of a maximum peak or pitch) of the second waveform. In a case that the peak time of the first waveform is not within the threshold temporal range, the processor 104 may determine that the first time component does not match the second time component. In some examples, a difference in time between a peak time of the first waveform and a peak time of the second waveform may indicate a phase between the first waveform and the second waveform.
  • the computing device 102 may determine a phase between the first waveform and the second waveform. For instance, the processor 104 may determine a difference between the peak times described above to determine the phase (delay).
  • the computing device 102 (e.g., processor 104) may compare the phase to a phase range that is based on microphone array spacing to identify the light signal. For instance, the microphone array spacing (e.g., spacing between a first microphone and a second microphone) may establish the phase range.
  • a maximum phase range may be an approximate amount of time for an acoustic wave to travel between the first microphone and the second microphone according to the spacing.
  • the phase range may range from zero (for when an acoustic wave source is equidistant from the microphones, for example) to the maximum phase range (when an acoustic wave source is positioned in line with the microphones, for example).
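The phase-range check above can be sketched as follows; the speed of sound (≈ 343 m/s at room temperature) is an assumed constant, not a value from the text:

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # approximate, at room temperature (assumption)

def max_acoustic_delay_s(microphone_spacing_m):
    """Longest possible acoustic inter-microphone delay: the travel
    time of sound across the microphone spacing."""
    return microphone_spacing_m / SPEED_OF_SOUND_M_PER_S

def phase_out_of_range(peak_time_1_s, peak_time_2_s, microphone_spacing_m):
    """True when the peak-time difference exceeds what any acoustic
    source could produce, suggesting a non-acoustic (light) signal."""
    return abs(peak_time_1_s - peak_time_2_s) > max_acoustic_delay_s(
        microphone_spacing_m)
```

For microphones 10 cm apart, any acoustic source yields a delay of at most about 0.29 ms, so a 1 ms peak-time difference falls outside the valid phase range.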
  • the processor 104 may determine corresponding temporal peaks between the first waveform and the second waveform.
  • a temporal peak with a highest peak amplitude in one waveform may correspond to a following (e.g., next) highest peak amplitude in another waveform (e.g., the second waveform), due to amplitude decline over distance.
  • in a case that the phase is outside of the phase range, the processor 104 may determine that a light signal 110 is sensed by the microphone array.
  • the computing device 102 may transform the first waveform to a first spectral waveform and/or may transform the second waveform to a second spectral waveform.
  • the term “spectral” may refer to frequency domain spectrum.
  • the processor 104 may perform a Fourier transform (e.g., fast Fourier transform (FFT), discrete Fourier transform (DFT)), a Laplace transform, a Z-transform, etc., to transform the first waveform to a first spectral waveform and/or to transform the second waveform to a second spectral waveform. Transforming the first waveform and the second waveform (and/or another waveform or waveforms) may enable comparison between a spectral aspect or aspects of the waveforms.
  • the processor 104 may detect a peak or peaks of the first spectral waveform and/or of the second spectral waveform. In some examples, the processor 104 may detect the peak(s) using a spectral peak detection threshold. A spectral peak detection threshold may be static or may be relative (e.g., determined based on the spectral waveform(s)). In some examples, the processor 104 may determine a maximum or maxima for values of a spectral waveform that are greater than a static spectral peak threshold to produce a spectral peak amplitude or magnitude. In some examples, the processor 104 may determine a maximum or maxima for values of a spectral waveform that are greater than a relative spectral peak threshold.
  • a spectral peak threshold may be determined relative to a statistical measure (e.g., a multiple of a standard deviation of amplitude or magnitude of the spectral waveform).
  • a maximum or maxima for values of a spectral waveform that are greater than the relative spectral peak threshold may be the spectral peak amplitude(s) or magnitude(s).
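A dependency-free sketch of the transform and spectral peak pick follows; a direct DFT stands in here for the FFT an implementation would use, and skipping the DC bin in the peak search is an assumed simplification:

```python
import cmath

def magnitude_spectrum(waveform):
    """Magnitude spectrum via a direct DFT over the first half of the
    bins (a real FFT would be used in practice for speed)."""
    n = len(waveform)
    return [abs(sum(waveform[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

def spectral_peak(waveform):
    """Return (bin_index, magnitude) of the strongest non-DC bin."""
    spectrum = magnitude_spectrum(waveform)
    k = max(range(1, len(spectrum)), key=spectrum.__getitem__)
    return k, spectrum[k]
```

A pure sinusoid of amplitude 1 over n samples concentrates its energy in one bin with magnitude n/2, which the peak pick recovers.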
  • the computing device 102 may determine a difference between the first spectral waveform of the first waveform and the second spectral waveform of the second waveform. Examples of differences may include a spectral subtraction, a spectral peak amplitude difference, a spectral peak magnitude difference, SNR difference, and/or a statistical spectral difference (e.g., difference of averages, standard deviations, variances, etc.) between the first spectral waveform and the second spectral waveform.
  • the processor 104 may determine a subtraction between the first spectral waveform and the second spectral waveform to produce the spectral subtraction.
  • the processor 104 may determine a spectral peak amplitude or magnitude (e.g., a maximum) or spectral peak amplitudes or magnitudes (e.g., localized maxima) of the first spectral waveform and the second spectral waveform. The processor 104 may determine a subtraction or distance between the spectral peak amplitude(s) and/or the spectral peak magnitude(s) to determine the difference. In some examples, the processor 104 may determine a first spectral SNR for the first spectral waveform and a second spectral SNR for the second spectral waveform. The processor 104 may determine a subtraction or difference between the first spectral SNR and the second spectral SNR to determine the difference.
  • the computing device 102 may determine whether the difference satisfies a spectral difference criterion to identify the light signal.
  • a spectral difference criterion may include a spectral difference threshold or thresholds.
  • the processor 104 may compare the difference(s) (e.g., spectral subtraction, spectral peak amplitude difference, spectral SNR difference, and/or statistical spectral difference, etc.) to a spectral difference threshold or thresholds (e.g., spectral amplitude difference threshold, spectral magnitude difference threshold, spectral SNR difference threshold, and/or spectral statistic difference threshold, etc.).
  • the spectral difference threshold(s) may be expressed in terms of decibels (dB), voltage(s), percentage(s), etc.
  • the processor 104 may determine whether the difference is greater than 30%, 40%, 50%, 60%, 65%, etc. (e.g., whether the spectral peak amplitude difference is greater than 50% of a spectral peak amplitude of the first waveform or the second waveform). In a case that a difference is greater than the spectral threshold, the processor 104 may identify that the light signal 110 is sensed by the microphone array.
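The spectral SNR difference test can be sketched as follows. The 6 dB flag threshold and the one-bin guard band around the peak are assumed values for illustration, not figures from the text:

```python
import math

def spectral_snr_db(spectrum, guard_bins=1):
    """SNR (dB) of the strongest bin's power against the mean power of
    the remaining bins, excluding guard_bins neighbors of the peak
    from the noise estimate."""
    peak_bin = max(range(len(spectrum)), key=spectrum.__getitem__)
    noise = [m for i, m in enumerate(spectrum)
             if abs(i - peak_bin) > guard_bins]
    noise_power = sum(m * m for m in noise) / len(noise)
    return 10.0 * math.log10(spectrum[peak_bin] ** 2 / noise_power)

def spectral_snr_difference_flag(spectrum_1, spectrum_2, threshold_db=6.0):
    """Flag when the two microphones' spectral SNRs differ by more
    than threshold_db (6 dB here is an illustrative threshold)."""
    return abs(spectral_snr_db(spectrum_1)
               - spectral_snr_db(spectrum_2)) > threshold_db
```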
  • the computing device 102 may determine a first frequency component of the first spectral waveform and a second frequency component of the second spectral waveform.
  • a frequency component may be a value or values that indicate a spectral distribution and/or a spectral characteristic of a waveform. Examples of frequency components may include peak frequency or frequencies, a spectral envelope, and/or envelope coefficients.
  • the processor 104 may determine maximum spectral peak frequencies (e.g., pitch(es)) and/or localized spectral peaks of the first spectral waveform and the second spectral waveform.
  • the processor 104 may determine the frequency or frequencies corresponding to the spectral peak(s), which may be determined as described above. In some examples, the processor 104 may determine a spectral envelope or envelopes. For example, the processor 104 may determine a curve that outlines the peaks of a spectral waveform. In some examples, the processor 104 may perform linear prediction analysis and/or cepstral windowing to determine an envelope or envelopes (e.g., and/or coefficients) of the spectral waveforms.
  • the computing device 102 may identify the light signal 110 in response to determining that the first frequency component does not match the second frequency component. For example, the processor 104 may determine whether the first frequency component matches the second frequency component by comparing the frequency components. For example, the processor 104 may determine whether a peak frequency (e.g., frequency of a maximum peak or pitch) of the first spectral waveform is within a threshold spectral range from a peak frequency (e.g., frequency of a maximum peak or pitch) of the second spectral waveform.
  • in a case that the peak frequency of the first spectral waveform is not within the threshold spectral range, the processor 104 may determine that the first frequency component does not match the second frequency component. In some examples, the processor 104 may correlate the spectral envelopes of the first spectral waveform and the second spectral waveform. If a maximum correlation is less than a correlation threshold (e.g., 50%, 60%, 70%, 75%, etc.), the processor 104 may determine that the first frequency component does not match the second frequency component and/or that a light signal 110 is sensed by the microphone array 106.
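A sketch of the peak-frequency comparison follows; the text specifies only a "threshold spectral range", so the 50 Hz tolerance here is an assumed value, and the spectra are taken to cover bins 0..n/2-1 of an n-point transform:

```python
def peak_frequency_hz(magnitude_spectrum, sample_rate_hz):
    """Dominant frequency of a magnitude spectrum covering bins
    0..n/2-1 of an n-point transform; the DC bin is skipped."""
    n_fft = 2 * len(magnitude_spectrum)
    k = max(range(1, len(magnitude_spectrum)),
            key=magnitude_spectrum.__getitem__)
    return k * sample_rate_hz / n_fft

def frequency_components_mismatch(spectrum_1, spectrum_2,
                                  sample_rate_hz, tolerance_hz=50.0):
    """True when the dominant frequencies differ by more than
    tolerance_hz (the 50 Hz tolerance is illustrative)."""
    f1 = peak_frequency_hz(spectrum_1, sample_rate_hz)
    f2 = peak_frequency_hz(spectrum_2, sample_rate_hz)
    return abs(f1 - f2) > tolerance_hz
```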
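The peak-frequency and envelope-correlation comparisons above can be sketched as follows. This is a hedged illustration: the names, the zero-lag correlation (rather than a full lag search), and the example thresholds are assumptions, not the patent's implementation.

```python
import math

def peak_frequency(magnitudes, freqs):
    # Frequency of the maximum spectral peak (e.g., the pitch).
    i = max(range(len(magnitudes)), key=lambda k: magnitudes[k])
    return freqs[i]

def envelope_correlation(env_a, env_b):
    # Normalized correlation (zero lag) between two spectral envelopes.
    dot = sum(a * b for a, b in zip(env_a, env_b))
    norm = (math.sqrt(sum(a * a for a in env_a))
            * math.sqrt(sum(b * b for b in env_b)))
    return dot / norm if norm else 0.0

def frequency_components_match(spec_a, spec_b, freqs,
                               range_hz=40.0, corr_threshold=0.7):
    # Match only if the peak frequencies fall within the threshold
    # spectral range AND the envelopes correlate above the threshold.
    peaks_close = abs(peak_frequency(spec_a, freqs)
                      - peak_frequency(spec_b, freqs)) <= range_hz
    correlated = envelope_correlation(spec_a, spec_b) >= corr_threshold
    return peaks_close and correlated
```

A mismatch on either check would mark the light signal as identified.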
  • the processor 104 may filter the first waveform to produce a first filtered waveform and/or may filter the second waveform to produce a second filtered waveform.
  • filtering may include noise filtering, low-pass filtering, band-stop filtering, notch filtering, and/or band-pass filtering. Performing filtering may enable comparison of a variety of characteristics (e.g., filtered waveforms, noise waveforms, spectral bands, etc.), which may be performed to identify whether a light signal 110 is sensed by the microphone array 106.
  • the computing device 102 may compare the first filtered waveform and the second filtered waveform to identify the light signal 110.
  • the processor 104 may determine and/or compare an aspect or aspects for the first filtered waveform and the second filtered waveform, which aspect or aspects may be determined and/or which comparisons may be performed as similarly described above.
  • the processor 104 may determine and/or compare temporal peaks, temporal peak amplitude, temporal peak magnitude, temporal SNRs, temporal subtraction, temporal peak amplitude difference, temporal peak magnitude difference, temporal SNR difference, statistical temporal difference, time components, temporal envelopes, temporal envelope coefficients, correlation, phase, spectral peaks, spectral peak amplitude, spectral peak magnitude, spectral subtraction, spectral peak amplitude difference, spectral peak magnitude difference, statistical spectral difference, spectral components, spectral envelopes, spectral envelope coefficients, and/or correlation for the first filtered waveform and the second filtered waveform.
  • the processor 104 may identify that a light signal 110 is sensed by the microphone array 106 in response to determining that the first filtered waveform does not match the second filtered waveform based on the comparison(s) similar to those described above.
  • the processor 104 may filter the first waveform to produce a first noise waveform and/or may filter the second waveform to produce a second noise waveform.
  • the processor 104 may perform noise filtering (e.g., noise suppression) on the first waveform and the second waveform to produce the first noise waveform and the second noise waveform.
  • some approaches to noise filtering (e.g., Wiener filtering) may produce a noise estimate for filtering.
  • the produced noise estimate for the first waveform may be the first noise waveform and/or the produced noise estimate for the second waveform may be the second noise waveform.
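One hedged way to picture this: a minimum-statistics noise estimate per microphone, a simplified stand-in for the noise tracker inside a Wiener filter. The function names and the divergence threshold are illustrative assumptions.

```python
def noise_estimate(frames):
    # Per-bin noise estimate: the minimum magnitude observed in each
    # frequency bin across a run of short-time frames.
    bins = len(frames[0])
    return [min(frame[b] for frame in frames) for b in range(bins)]

def noise_estimates_diverge(noise_a, noise_b, threshold=0.5):
    # A sensed light signal may mask or change the noise at one
    # microphone, so strongly diverging noise estimates are suspicious.
    diff = sum(abs(a - b) for a, b in zip(noise_a, noise_b))
    total = sum(abs(a) + abs(b) for a, b in zip(noise_a, noise_b)) or 1.0
    return diff / total > threshold
```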
  • for different microphones (e.g., a first microphone and a second microphone), the sensed light signal 110 may mask and/or change the noise for one of the microphones.
  • the computing device 102 may compare the first noise waveform and the second noise waveform to identify the light signal 110.
  • the processor 104 may determine and/or compare an aspect or aspects for the first noise waveform and the second noise waveform, which aspect or aspects may be determined and/or which comparisons may be performed as similarly described above.
  • the processor 104 may determine and/or compare temporal peaks, temporal peak amplitude, temporal peak magnitude, temporal subtraction, temporal peak amplitude difference, temporal peak magnitude difference, statistical temporal difference, time components, temporal envelopes, temporal envelope coefficients, correlation, phase, spectral peaks, spectral peak amplitude, spectral peak magnitude, spectral subtraction, spectral peak amplitude difference, spectral peak magnitude difference, statistical spectral difference, spectral components, spectral envelopes, spectral envelope coefficients, and/or correlation for the first noise waveform and the second noise waveform.
  • the processor 104 may identify that a light signal 110 is sensed by the microphone array 106 in response to determining that the first noise waveform does not match the second noise waveform based on comparison(s) similar to those described above.
  • the computing device 102 may determine a combination of a noise waveform comparison and a filtered waveform comparison.
  • the noise waveform comparison may be expressed as a noise waveform matching score (e.g., 0-1, 0-100%, etc.), where the noise waveform matching score may indicate a degree of matching between the first noise waveform and the second noise waveform.
  • the filtered waveform comparison may be expressed as a filtered waveform matching score (e.g., 0-1, 0-100%, etc.), where the filtered waveform matching score may indicate a degree of matching between the first filtered waveform and the second filtered waveform.
  • the processor 104 may determine a combination by determining a combined score (e.g., sum, average, weighted average, etc.) of the noise waveform matching score and the filtered waveform matching score.
  • the processor 104 may compare the combined score with a combined score threshold (e.g., 0.4, 0.5, 0.7, 0.9, 55%, 65%, etc.). In a case that the combined score is less than the combined score threshold, the processor 104 may determine that a light signal 110 is sensed by the microphone array 106. In some examples, other combinations of the comparisons described herein may be utilized.
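The score combination might look like the following sketch. The equal weighting and the 0.5 threshold are example values drawn from the ranges above, not prescribed ones, and the function name is hypothetical.

```python
def scores_indicate_light_signal(noise_score, filtered_score,
                                 weights=(0.5, 0.5), threshold=0.5):
    # Combine the matching scores (each in 0..1) as a weighted average;
    # a low combined score means the two comparisons disagree with a
    # "match" verdict, which may indicate a sensed light signal.
    combined = weights[0] * noise_score + weights[1] * filtered_score
    return combined < threshold
```

For instance, strong matches (0.9 and 0.8) keep the combined score above the threshold, while weak matches (0.2 and 0.3) trip the detector.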
  • the computing device 102 may determine a combination of a waveform comparison, a spectral waveform comparison, a noise waveform comparison and/or a filtered waveform comparison.
  • the computing device 102 may generate a filter based on the first waveform.
  • the processor 104 may determine filter coefficients based on the first waveform.
  • the processor 104 may transform the first waveform into the frequency domain to produce a first spectral waveform.
  • the processor 104 may generate a filter to filter out the first spectral waveform.
  • the processor 104 may determine a band stop filter with a stop band or stop bands corresponding to a spectral range or spectral ranges occupied by the first spectral waveform (e.g., spectral ranges with signal energy greater than a noise level).
  • the band stop filter may be determined as a unity gain filter with stop bands (e.g., attenuation or 0 gain) corresponding to spectral characteristics (e.g., spectral peaks) of the first spectral waveform.
  • the processor 104 may determine a filter coefficient or filter coefficients to reduce or cancel the first spectral waveform.
  • the processor 104 may determine filter coefficients to filter out a spectral envelope of the first spectral waveform (e.g., filter coefficients that provide a filter shape complementary to the spectral envelope of the first spectral waveform).
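A minimal sketch of such a unity-gain band-stop filter built from the first spectral waveform. The per-bin granularity and the function name are illustrative assumptions.

```python
def band_stop_mask(first_spectrum, noise_level):
    # Unity gain everywhere except stop bands (0 gain) at bins where
    # the first spectral waveform has energy above the noise level.
    return [0.0 if m > noise_level else 1.0 for m in first_spectrum]

print(band_stop_mask([0.05, 0.9, 0.7, 0.1], noise_level=0.2))
# → [1.0, 0.0, 0.0, 1.0]
```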
  • the computing device 102 may filter the second waveform based on the filter to produce a second filtered waveform.
  • the processor 104 may apply the filter to the second waveform.
  • the processor 104 may convolve the second waveform with a time-domain version of the filter.
  • the processor 104 may transform the second waveform into the frequency domain to produce a second spectral waveform and may multiply (e.g., perform sample-wise multiplication of) the second spectral waveform with a frequency-domain version of the filter. Filtering the second waveform based on the filter may provide an indication of whether a light signal is identified.
  • the filter generated from the first waveform may target waveforms similar to the first waveform for attenuation.
  • the filter generated based on the first waveform may significantly attenuate the second waveform, which may result in a second filtered waveform with relatively low signal amplitude, magnitude, and/or energy.
  • the second waveform may be attenuated, which may result in a second filtered waveform with relatively small amplitude, magnitude, and/or energy.
  • the filter generated based on the first waveform may pass a dissimilar portion or portions of the second waveform, which may result in a second filtered waveform with relatively higher signal amplitude, magnitude, and/or energy, which may indicate that a light signal is sensed.
  • the second waveform may be time-shifted for filtering, to account for inter-microphone spacing.
  • the computing device 102 may determine whether the second filtered waveform meets a criterion (or criteria) to identify the light signal.
  • the criterion or criteria may be a threshold or thresholds.
  • the processor 104 may compare the second filtered waveform (e.g., a characteristic or characteristics of the second filtered waveform) to a threshold.
  • the threshold may be a static threshold or may be determined based on a noise level and/or signal level of the first waveform and/or of the second waveform.
  • the threshold may be set at a maximum noise level of a noise signal (of a first noise signal determined from the first waveform or of a second noise signal determined from the second waveform) or at a percentage (e.g., +10%, +20%, etc.) or offset relative to a maximum or average noise level.
  • the threshold may be set at a proportion or percentage (e.g., 20%, 30%, 50%, etc.) of a maximum amplitude or magnitude of the first waveform or of the second waveform or at a percentage (e.g., -20%, -30%, -50%, -70%, -80%, etc.) or offset relative to a maximum or average amplitude or magnitude of the first waveform or second waveform.
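Putting the frequency-domain filtering step and the criterion together, as a sketch under the same assumptions (hypothetical names; the 30% proportion is one of the example percentages above):

```python
def residual_energy(mask, spectrum):
    # Sample-wise multiplication in the frequency domain, then the
    # total energy of whatever survives the filter.
    return sum((g * m) ** 2 for g, m in zip(mask, spectrum))

def light_signal_suspected(mask, first_spectrum, second_spectrum,
                           fraction=0.3):
    # If the filter generated from the first waveform fails to
    # attenuate the second waveform, the surviving energy stays
    # comparatively high, which may indicate a light signal.
    threshold = fraction * sum(m ** 2 for m in first_spectrum)
    return residual_energy(mask, second_spectrum) > threshold
```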
  • the computing device 102 may identify the light signal and/or produce an alert.
  • a filter may be generated based on the second waveform (or a waveform from another microphone) and may be similarly applied to the first waveform (or to another waveform from another microphone) to identify a light signal.
  • the computing device 102 may analyze the first waveform and the second waveform based on historical audio data.
  • the computing device 102 may store historical audio data that indicates a previously captured waveform or waveforms and/or an aspect or aspects of the previously captured waveform or waveforms.
  • the computing device 102 (e.g., processor 104) may compare the first waveform and/or the second waveform with the historical audio data to identify the light signal 110.
  • the processor 104 may determine and/or compare an aspect or aspects for the first waveform and/or the second waveform, which aspect or aspects may be determined and/or which comparisons may be performed as similarly described above.
  • the processor 104 may determine and/or compare temporal peaks, temporal peak amplitude, temporal peak magnitude, temporal SNRs, temporal subtraction, temporal peak amplitude difference, temporal peak magnitude difference, temporal SNR difference, statistical temporal difference, time components, temporal envelopes, temporal envelope coefficients, correlation, phase, spectral peaks, spectral peak amplitude, spectral peak magnitude, spectral subtraction, spectral peak amplitude difference, spectral peak magnitude difference, statistical spectral difference, spectral components, spectral envelopes, spectral envelope coefficients, and/or correlation for the historical audio data and the first waveform and/or the second waveform.
  • the processor 104 may identify that a light signal 110 is sensed by the microphone array 106 in response to determining that the first waveform and/or the second waveform do not match the historical audio data based on the comparison(s) similar to those described above.
  • FIG. 2 is a diagram illustrating examples of waveforms 216, 218, 220 to illustrate some of the techniques that may be utilized for light signal identification described herein.
  • FIG. 2 illustrates an example of a first waveform 216 from a first microphone, an example of a second waveform A 218 from a second microphone, and an example of a second waveform B 220 from a second microphone.
  • FIG. 2 illustrates the waveforms 216, 218, 220 in graphs, where the horizontal axes illustrate time (in seconds) and the vertical axes illustrate amplitude. For simplicity, the vertical axes are illustrated on a scale of -1 to 1, but different examples may represent amplitude by voltage, current, displacement, or another unit.
  • the first waveform 216 may be provided by a first microphone and the second waveform A 218 may be provided by a second microphone.
  • a computing device (e.g., computing device 102) may determine a temporal peak amplitude 222 of the first waveform and a temporal peak amplitude 224 of the second waveform A 218.
  • the computing device may determine a temporal peak amplitude difference between the temporal peak amplitude 222 of the first waveform and a temporal peak amplitude 224 of the second waveform A 218 (e.g., approximately 0.1).
  • the temporal peak amplitude difference may be compared to a temporal amplitude difference threshold (e.g., 0.3). Because the temporal peak amplitude difference is less than the temporal amplitude difference threshold, the computing device may determine that the first waveform 216 and the second waveform A 218 match and/or that a light signal is not sensed by the first microphone or the second microphone.
  • a computing device may determine a temporal peak time 228 of the first waveform and a temporal peak time 230 of the second waveform A 218.
  • the computing device may determine a phase between the temporal peak time 228 of the first waveform and the temporal peak time 230 of the second waveform A 218 (e.g., approximately 0.1 seconds (s)).
  • the phase may be compared to a phase range (e.g., 0.2 s). Because the phase is less than the phase range, the computing device may determine that the first waveform 216 and the second waveform A 218 match and/or that a light signal is not sensed by the first microphone or the second microphone.
  • Other techniques or a combination of techniques as described in FIG. 1 may be utilized to determine that the first waveform and second waveform A 218 match and/or that a light signal is not sensed by a microphone array.
  • the first waveform 216 may be provided by a first microphone and the second waveform B 220 may be provided by a second microphone.
  • a computing device (e.g., computing device 102) may determine a temporal peak amplitude 222 of the first waveform and a temporal peak amplitude 226 of the second waveform B 220.
  • the computing device may determine a temporal peak amplitude difference between the temporal peak amplitude 222 of the first waveform and a temporal peak amplitude 226 of the second waveform B 220 (e.g., approximately 0.4).
  • the temporal peak amplitude difference may be compared to a temporal amplitude difference threshold (e.g., 0.3).
  • a computing device may determine that a light signal is sensed by the first microphone and/or that the first waveform 216 does not match the second waveform B 220.
  • a computing device (e.g., computing device 102) may determine a temporal peak time 228 of the first waveform and a temporal peak time 232 of the second waveform B 220.
  • the computing device may determine a phase between the temporal peak time 228 of the first waveform and the temporal peak time 232 of the second waveform B 220 (e.g., approximately 0.45 s). The phase may be compared to a phase range (e.g., 0.2 s).
  • the computing device may determine that the first waveform 216 and the second waveform B 220 do not match and/or that a light signal is sensed by the first microphone.
  • Other techniques or a combination of techniques as described in FIG. 1 may be utilized to determine that the first waveform and second waveform B 220 do not match and/or that a light signal is sensed by a microphone array.
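The FIG. 2 walk-through can be condensed into a sketch. The sample-index peak picking and the function name are assumptions; the thresholds mirror the example values (0.3 amplitude difference, 0.2 s phase range) used above.

```python
def temporal_match(wave_a, wave_b, sample_rate,
                   amp_threshold=0.3, phase_range_s=0.2):
    # Compare temporal peak amplitudes and temporal peak times; both
    # checks must pass for the two waveforms to be considered a match.
    peak_a = max(range(len(wave_a)), key=lambda i: abs(wave_a[i]))
    peak_b = max(range(len(wave_b)), key=lambda i: abs(wave_b[i]))
    amp_diff = abs(abs(wave_a[peak_a]) - abs(wave_b[peak_b]))
    phase_s = abs(peak_a - peak_b) / sample_rate
    return amp_diff < amp_threshold and phase_s < phase_range_s
```

A `False` result would correspond to identifying that a light signal is sensed.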
  • FIG. 3 is a diagram illustrating examples of spectral waveforms 334, 336, 338 to illustrate some of the techniques that may be utilized for light signal identification described herein.
  • FIG. 3 illustrates an example of a first spectral waveform 334 from a first microphone, an example of a second spectral waveform A 336 from a second microphone, and an example of a second spectral waveform B 338 from a second microphone.
  • FIG. 3 illustrates the spectral waveforms 334, 336, 338 in graphs, where the horizontal axes illustrate frequency (in Hertz (Hz)) and the vertical axes illustrate magnitude. For simplicity, the vertical axes are illustrated on a scale of 0 to 1, but different examples may represent magnitude by voltage, current, displacement, or another unit.
  • the first spectral waveform 334 may be from a first microphone and the second spectral waveform A 336 may be from a second microphone.
  • a computing device (e.g., computing device 102) may determine a spectral peak magnitude 340 of the first spectral waveform and a spectral peak magnitude 342 of the second spectral waveform A 336.
  • the computing device may determine a spectral peak magnitude difference between the spectral peak magnitude 340 of the first spectral waveform and a spectral peak magnitude 342 of the second spectral waveform A 336 (e.g., approximately 0.15).
  • the spectral peak magnitude difference may be compared to a spectral magnitude difference threshold (e.g., 0.4).
  • the computing device may determine that the first spectral waveform 334 and the second spectral waveform A 336 match and/or that a light signal is not sensed by the first microphone or the second microphone.
  • a computing device may determine a spectral peak frequency 346 of the first spectral waveform 334 and a spectral peak frequency 348 of the second spectral waveform A 336.
  • the computing device may determine whether the spectral peak frequency 346 of the first spectral waveform 334 is within a threshold spectral range of the spectral peak frequency 348 of the second spectral waveform A 336 (e.g., within 40 Hz).
  • the computing device may determine that the first spectral waveform 334 and the second spectral waveform A 336 match and/or that a light signal is not sensed by the first microphone or the second microphone.
  • Other techniques or a combination of techniques as described in FIG. 1 may be utilized to determine that the first spectral waveform 334 and second spectral waveform A 336 match and/or that a light signal is not sensed by a microphone array.
  • the first spectral waveform 334 may be from a first microphone and the second spectral waveform B 338 may be from a second microphone.
  • a computing device (e.g., computing device 102) may determine a spectral peak magnitude 340 of the first spectral waveform and a spectral peak magnitude 344 of the second spectral waveform B 338.
  • the computing device may determine a spectral peak magnitude difference between the spectral peak magnitude 340 of the first spectral waveform and a spectral peak magnitude 344 of the second spectral waveform B 338 (e.g., approximately 0.8).
  • the spectral peak magnitude difference may be compared to a spectral magnitude difference threshold (e.g., 0.4).
  • a computing device may determine that a light signal is sensed by the first microphone and/or that the first spectral waveform 334 does not match the second spectral waveform B 338.
  • a computing device (e.g., computing device 102) may determine a spectral peak frequency 346 of the first spectral waveform 334 and a spectral peak frequency 350 of the second spectral waveform B 338. The computing device may determine whether the spectral peak frequency 346 of the first spectral waveform 334 is within a threshold spectral range of the spectral peak frequency 350 of the second spectral waveform B 338 (e.g., within 40 Hz).
  • the computing device may determine that the first spectral waveform 334 and the second spectral waveform B 338 do not match and/or that a light signal is sensed by the first microphone.
  • Other techniques or a combination of techniques as described in FIG. 1 may be utilized to determine that the first spectral waveform 334 and second spectral waveform B 338 do not match and/or that a light signal is sensed by a microphone array.
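Similarly, the FIG. 3 spectral checks can be sketched with the example 0.4 magnitude threshold and 40 Hz threshold spectral range. The function name and the per-bin representation are assumptions.

```python
def spectral_match(spec_a, spec_b, freqs,
                   mag_threshold=0.4, range_hz=40.0):
    # Compare spectral peak magnitudes and spectral peak frequencies;
    # both checks must pass for the spectral waveforms to match.
    ia = max(range(len(spec_a)), key=lambda k: spec_a[k])
    ib = max(range(len(spec_b)), key=lambda k: spec_b[k])
    mag_ok = abs(spec_a[ia] - spec_b[ib]) < mag_threshold
    freq_ok = abs(freqs[ia] - freqs[ib]) <= range_hz
    return mag_ok and freq_ok
```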
  • FIG. 4 is a block diagram of an example of an audio device 482 that may be utilized for light signal identification.
  • the audio device 482 may be a device that captures audio from multiple microphones 454, 456. Some examples of the audio device 482 include smart speakers, smartphones, laptop computers, tablet devices, game consoles, mobile devices, etc.
  • the audio device 482 may be an example of the computing device 102 described in FIG. 1.
  • the audio device 482 may be included in a computing device (e.g., the computing device 102 described in FIG. 1).
  • the audio device 482 may perform one, some, or all of the functions, operations, elements, procedures, etc., described in one, some, or all of FIG. 1-5.
  • the audio device 482 may include a first microphone 454 and a second microphone 456. In some examples, the audio device may include three or more microphones. In some examples, the audio device 482 may include a filter or filters 458, a processor 462, and/or an output device or devices 472. In some examples, the filter(s) 458 and the processor 462 may be separate circuitries (e.g., ASICs, processors, integrated circuits, etc.). In some examples, the filter(s) 458 and the processor 462 may be combined into a circuitry (e.g., the processor 104 described in FIG. 1). The microphones 454, 456, the processor 462, and/or the output device(s) 472 may be in electronic communication.
  • in some examples, the filter(s) 458, the processor 462, and/or the output device(s) 472 may be separate from the audio device 482. For instance, the audio device 482 may be included in a computing device (e.g., the computing device 102 described in FIG. 1), where the processor 462 may be a separate processor (e.g., the processor 104 described in FIG. 1) of the computing device and/or the output device(s) may be separate output device(s) of the computing device.
  • the first microphone 454 may capture a first waveform (or a waveform signal that may be represented as a first waveform). The first waveform may be provided to the filter(s) 458 and/or to the processor 462.
  • the second microphone 456 may capture a second waveform (or a waveform signal that may be represented as a second waveform). The second waveform may be provided to the filter(s) 458 and/or to the processor 462.
  • another waveform or waveforms corresponding to another microphone or microphones may be provided to the filter(s) 458 and/or to the processor 462.
  • the filter(s) 458 may filter the first waveform to produce a first filtered waveform 474 and/or may filter the second waveform to produce a second filtered waveform 476.
  • the filter(s) 458 may provide noise filtering, low-pass filtering, band-stop, and/or band-pass filtering to produce the first filtered waveform 474 and/or the second filtered waveform 476.
  • the filter(s) 458 may produce a first noise waveform 478 and/or a second noise waveform 480.
  • the filter(s) 458 may produce another noise waveform or waveforms corresponding to another microphone or microphones.
  • the processor 462 may determine whether the first waveform (e.g., the first filtered waveform 474) matches the second waveform (e.g., the second filtered waveform 476).
  • the processor 462 may perform a transform function or functions 464, an aspect determination function or functions 466, a difference determination function or functions 468, and/or a matching determination function or functions 470.
  • a function or functions performed by the processor 462 may be performed as described in FIG. 1.
  • the transform function(s) 464 may include transforming a waveform or waveforms (e.g., first filtered waveform 474 and second filtered waveform 476) into the frequency domain.
  • the aspect determination function(s) 466 may determine an aspect or aspects described in FIG. 1.
  • the difference determination function(s) 468 may determine a difference or differences between waveforms (e.g., first filtered waveform 474 and second filtered waveform 476) as described in FIG. 1.
  • the matching determination function(s) 470 may compare waveforms and/or determine whether waveforms (e.g., first filtered waveform 474 and second filtered waveform 476) match. For instance, the processor 462 may determine whether a first waveform matches a second waveform. As used herein, the term “match” and variations thereof may mean that a waveform satisfies a criterion or criteria (e.g., threshold(s)) to be similar to another waveform.
  • a function or functions of the processor 462 may utilize a waveform or waveforms, a filtered waveform or waveforms 474, 476, a noise waveform or waveforms, and/or historical audio data 460 to perform a transform or transforms, to determine an aspect or aspects, to determine a difference or differences, and/or to determine a match or matches between waveforms as described in FIG. 1.
  • the historical audio data 460 may be obtained from memory and/or from another device.
  • the processor 462 may determine whether a first time component of the first waveform matches a second time component of the second waveform to determine whether the first waveform matches the second waveform. In some examples, determining whether the first time component matches the second time component may be determined as described in FIG. 1. For instance, the processor 462 may determine whether a maximum correlation of the first waveform and the second waveform satisfies a correlation threshold and/or may determine whether a first peak time of the first waveform is within a threshold temporal range from a second peak time of the second waveform, etc.
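A sketch of the maximum-correlation check in pure Python. The lag-limited search (to tolerate inter-microphone delay) and the 0.7 correlation threshold are illustrative assumptions.

```python
import math

def max_normalized_correlation(a, b, max_lag):
    # Maximum normalized cross-correlation over a small range of lags.
    best = 0.0
    for lag in range(-max_lag, max_lag + 1):
        pairs = [(a[i], b[i + lag]) for i in range(len(a))
                 if 0 <= i + lag < len(b)]
        dot = sum(x * y for x, y in pairs)
        na = math.sqrt(sum(x * x for x, _ in pairs))
        nb = math.sqrt(sum(y * y for _, y in pairs))
        if na and nb:
            best = max(best, dot / (na * nb))
    return best

def time_components_match(a, b, max_lag=2, corr_threshold=0.7):
    # Match if the best-aligned correlation satisfies the threshold.
    return max_normalized_correlation(a, b, max_lag) >= corr_threshold
```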
  • the processor 462 may generate the filter or filters 458 based on a first waveform and/or a second waveform as described in FIG. 1.
  • the matching determination function(s) 470 may determine whether the first waveform and second waveform match based on filtering the second waveform with a filter 458 that is generated based on the first waveform (or based on filtering the first waveform with a filter that is generated based on the second waveform, for example). For instance, in a case that the filter 458 generated based on the first waveform significantly attenuates the second waveform (to below a threshold, for instance) as indicated by a second filtered waveform, it may be determined that the first waveform matches the second waveform.
  • the processor 462 may provide an indicator or indicators to the output device(s) 472.
  • the output device(s) 472 may output an alert that a light signal is detected in response to determining that the first waveform does not match the second waveform (and/or in response to determining that other waveforms do not match).
  • the light signal may be a laser attack on the audio device 482.
  • the light signal may be an attempt to inject a voice command to be carried out by the audio device 482 (e.g., to unlock a door, open a garage door, disable a security device, start a car, etc.).
  • An output device may be a device for providing output. Examples of output devices may include speaker(s), display(s), light(s), haptic motor(s), radio frequency (RF) transmitter(s), network card(s), etc.
  • An alert may be an indication or message. Examples of alerts may include a sound (e.g., speech, tone, etc.), an image (e.g., image depicting text), a flashing light, an email, a text message, and/or a video, etc.
  • FIG. 5 is a block diagram illustrating an example of a computer-readable medium 584 for light signal identification.
  • the computer-readable medium 584 may be a non-transitory, tangible computer-readable medium 584.
  • the computer-readable medium 584 may be, for example, RAM, EEPROM, a storage device, an optical disc, and the like.
  • the computer-readable medium 584 may be volatile and/or non-volatile memory, such as DRAM, EEPROM, MRAM, PCRAM, memristor, flash memory, and the like.
  • the computer-readable medium 584 described in FIG. 5 may be an example of the memory described in FIG. 1 or memory described in FIG. 4.
  • code (e.g., data and/or executable code or instructions) of the computer-readable medium 584 may be transferred and/or loaded to memory or memories of a computing device and/or audio device.
  • the computer-readable medium 584 may include code (e.g., data and/or executable code or instructions).
  • the computer-readable medium 584 may include spectral analysis instructions 586 and/or comparison instructions 588.
  • the spectral analysis instructions 586 may be instructions that, when executed, cause a processor of an electronic device to perform a first spectral analysis of a first waveform from a first microphone of a microphone array.
  • a processor may execute the spectral analysis instructions 586 to transform a first waveform into the frequency domain to produce a first spectral waveform and/or to determine a spectral aspect or aspects of the first spectral waveform as described in FIG. 1, FIG. 3, and/or FIG. 4.
  • the spectral analysis instructions 586 may be instructions that, when executed, cause the processor of the electronic device to perform a second spectral analysis of a second waveform from a second microphone of a microphone array.
  • the processor may execute the spectral analysis instructions 586 to transform a second waveform into the frequency domain to produce a second spectral waveform and/or to determine a spectral aspect or aspects of the second spectral waveform as described in FIG. 1, FIG. 3, and/or FIG. 4.
  • the comparison instructions 588 may be instructions that, when executed, cause the processor of the electronic device to compare the first spectral analysis with the second spectral analysis to determine whether the microphone array has received a light signal.
  • the processor may execute the comparison instructions 588 to perform a comparison or comparisons and/or to determine a match or matches based on a spectral aspect or spectral aspects described in FIG. 1, FIG. 3, and/or FIG. 4.
  • the comparison instructions 588 may include instructions that, when executed, cause the processor of the electronic device to determine whether a first frequency component or components of the first spectral analysis match a second frequency component or components of the second spectral analysis.
  • if the first frequency component or components do not match the second frequency component or components, the processor may determine that the microphone array has received a light signal.
  • the first frequency component may be a first peak frequency and the second frequency component may be a second peak frequency.
  • the comparison instructions 588 may include instructions that, when executed, cause the processor to determine whether the first peak frequency is within a threshold spectral range from the second peak frequency to determine whether the first frequency component matches the second frequency component. If the first peak frequency is within the threshold spectral range, for instance, the first peak frequency may be determined to match the second peak frequency.
  • the computer-readable medium 584 may include instructions to cause a processor to compare a first waveform to a second waveform based on a temporal aspect or aspects in addition to the spectral aspect(s).
  • the term “and/or” may mean an item or items.
  • the phrase “A, B, and/or C” may mean any of: A (without B and C), B (without A and C), C (without A and B), A and B (but not C), B and C (but not A), A and C (but not B), or all of A, B, and C.

Abstract

Examples of computing devices are described herein. In some examples, a computing device may include a microphone array. In some examples, a computing device may include a processor. In some examples, the processor may analyze a first waveform and a second waveform from the microphone array to identify a light signal sensed by the microphone array.

Description

LIGHT SIGNAL IDENTIFICATION
BACKGROUND
[0001] Electronic technology has advanced to become virtually ubiquitous in society and has been used to improve many activities in society. For example, electronic devices are used to perform a variety of tasks, including work activities, communication, research, and entertainment. Different varieties of electronic circuitry may be utilized to provide different varieties of electronic technology.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] FIG. 1 is a block diagram of an example of a computing device that may be utilized to identify or detect a light signal;
[0003] FIG. 2 is a diagram illustrating examples of waveforms to illustrate some of the techniques that may be utilized for light signal identification described herein;
[0004] FIG. 3 is a diagram illustrating examples of spectral waveforms to illustrate some of the techniques that may be utilized for light signal identification described herein;
[0005] FIG. 4 is a block diagram of an example of an audio device that may be utilized for light signal identification; and
[0005] FIG. 5 is a block diagram illustrating an example of a computer-readable medium for light signal identification.

DETAILED DESCRIPTION
[0007] An electronic device may be a device that includes electronic circuitry. For instance, an electronic device may include integrated circuitry (e.g., transistors, digital logic, semiconductor technology, etc.). Examples of electronic devices include computing devices. A computing device may be a device to execute logic and/or perform computation. Examples of computing devices may include laptop computers, desktop computers, smart speakers, audio devices, digital assistants, smartphones, tablet devices, wireless communication devices, game consoles, smart appliances, vehicles with electronic components, aircraft, drones, robots, Internet of Things (IoT) devices, etc.
[0008] Devices with microphones may be susceptible to capturing light signals. A light signal may be a signal that is transmitted by light. For example, light may be modulated (e.g., amplitude-modulated) to convey a signal. For instance, an audio signal may be converted to an amplitude-modulated light signal. Laser signals may be an example of light signals. In some examples, a light signal (e.g., laser signal) may be directed to a microphone. The microphone may capture the light signal due to the photoacoustic effect. For example, the microphone may convert the light signal to an electronic audio signal.
[0009] In some cases, a light signal may be injected into a device with a microphone. For example, attackers may use lasers to command or disrupt devices with microphones. In some examples, a laser attack on microphones may be carried out when line-of-sight to a microphone and accurate targeting of a microphone membrane are available to an attacker's laser. Approaches to guard against a laser attack that introduce extra components may be costly.

[0010] Some approaches to detect a light signal may be beneficial. For example, some procedural approaches may be utilized to detect a light signal (e.g., a light signal attack or laser attack) that use a microphone array or arrays. A microphone array may be a group of microphones. For example, a microphone array may include two, three, or more microphones. Some approaches described herein may analyze multiple waveforms from respective microphones of a microphone array to determine whether a microphone is capturing a waveform that is different from a waveform or waveforms from another microphone or microphones. For example, because an attacker may use a single laser to target a single microphone, the other microphone(s) may produce waveforms (e.g., a user’s speech, ambient sound, and/or noise) that differ significantly from the waveform of the targeted microphone. Through waveform analysis, a computing device may determine if one microphone is capturing a significantly different or unexpected waveform, which may occur if the microphone is targeted with a light signal (e.g., light signal attack, laser attack, etc.). In some examples, the computing device may alert a user when a light signal is detected and/or may disregard information (e.g., voice commands, etc.) conveyed by the light signal. Accordingly, some examples of the techniques described herein may be beneficial to improve security for computing devices (e.g., smart speakers, smart phones, tablet devices, computers, vehicles, etc.)
that receive waveforms from multiple microphones.

[0011] Throughout the drawings, identical or similar reference numbers may designate similar, but not necessarily identical, elements. The figures are not necessarily to scale, and the size of some parts may be exaggerated to more clearly illustrate the example shown. Moreover, the drawings provide examples in accordance with the description; however, the description is not limited to the examples provided in the drawings.
[0012] FIG. 1 is a block diagram of an example of a computing device 102 that may be utilized to identify or detect a light signal. Examples of the computing device 102 may include audio devices, smart speakers, laptop computers, smartphones, tablet devices, wireless communication devices, game consoles, vehicles with electronic components, aircraft, drones, robots, etc. In some examples, the computing device 102 may perform one, some, or all of the functions, operations, elements, procedures, etc., described in one, some, or all of FIG. 1-5.
[0013] In some examples, the computing device 102 may include a processor 104. The processor 104 may be any of a central processing unit (CPU), a semiconductor-based microprocessor, a graphics processing unit (GPU), a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), and/or other hardware device for logic execution and/or computation. In some examples, the processor 104 may retrieve and/or execute instructions stored in a memory (not shown in FIG. 1). For example, the processor 104 may fetch, decode, and/or execute instructions stored in memory.
[0014] In some examples, a memory may be any electronic, magnetic, optical, and/or other physical storage device that contains or stores electronic information (e.g., instructions and/or data). The memory may be, for example, Random Access Memory (RAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Dynamic Random Access Memory (DRAM), Synchronous DRAM (SDRAM), magnetoresistive random-access memory (MRAM), phase change RAM (PCRAM), memristor, flash memory, a storage device, and/or an optical disc, etc. In some examples, the memory may be a non-transitory tangible machine-readable storage medium, where the term “non-transitory” does not encompass transitory propagating signals. In some examples, the processor 104 may be in electronic communication with memory.

[0015] In some examples, the computing device 102 may include an input/output interface (not shown) through which the computing device 102 may communicate with an external and/or remote device or devices (not shown). The input/output interface may include hardware and/or machine-readable instructions to enable the processor 104 to communicate with the external and/or remote device or devices. The input/output interface may enable a wired or wireless connection to the external and/or remote device or devices. In some examples, the computing device 102 (e.g., input/output interface) may include a transmitter or transmitters to send a signal or signals. In some examples, the computing device 102 (e.g., input/output interface) may include a receiver or receivers to receive a signal or signals. In some examples, the transmitter and/or the receiver may be coupled to an antenna or antennas of the computing device 102 to transmit and/or receive a wireless signal. A wireless signal may be an electromagnetic signal and/or radio signal. Examples of wireless signals may include communication signals, cellular signals, Wi-Fi signals, Bluetooth signals, etc. 
In some examples, the input/output interface may include a network interface card and/or may also include hardware and/or machine-readable instructions to enable the processor 104 to communicate with various input and/or output devices, such as a keyboard, a mouse, a display or displays, a speaker or speakers, a light or lights, a haptic motor or motors, a transmitter or transmitters, another apparatus, etc. In some examples, a user may utilize the input device(s) to input instructions and/or indications into the computing device 102. In some examples, the output device(s) may be utilized to output an alert as described herein.
[0016] In some examples, the computing device 102 may include a microphone array 106. The microphone array 106 may be a group of microphones (e.g., two, three, or more microphones). A microphone may be a transducer that converts ambient waves (e.g., acoustic waves, sounds, noise, light signals, etc.) into electronic signals. For example, the microphone array 106 may convert ambient waves into waveform signals. For instance, a first microphone of the microphone array 106 may produce a first waveform signal 112 and a second microphone of the microphone array 106 may produce a second waveform signal 114. A waveform may be a representation of a wave or waves (e.g., variations in air pressure, sensor position, and/or microphone diaphragm displacement). A waveform signal may be an electronic signal that indicates a waveform. For instance, a first microphone of the microphone array 106 may produce a first waveform signal 112 indicating a first waveform that represents waves captured by the first microphone. A second microphone of the microphone array 106 may produce a second waveform signal 114 indicating a second waveform that represents waves captured by the second microphone.

[0017] In some examples, different microphones of the microphone array 106 may be arranged, distanced, and/or spaced. In some examples, the microphone array 106 may include an array of microphones that face the same direction or that face different directions (e.g., opposite directions, different angular directions, orthogonal directions, etc.). For instance, the microphone array 106 may include a group of microphones arranged in a line (e.g., a linear array), a group of microphones on different sides of the computing device 102, and/or microphones arranged to face different angles. When ambient waves are acoustic in nature, different microphones in the microphone array 106 may capture waveform signals with waveforms that exhibit a similar aspect or aspects. 
For example, different microphones may capture waveform signals with waveforms that exhibit similarities in amplitudes, magnitudes, frequency (e.g., dominant frequency or frequencies, pitch, etc.), and/or timing. An aspect or aspects of waveforms from the microphone array 106 may be utilized to determine whether a light signal 110 is sensed by the microphone array 106 (e.g., by a microphone of the microphone array 106). An aspect may be a characteristic or parameter of a waveform.
[0018] In some examples, the processor 104 may analyze a first waveform and a second waveform from the microphone array 106 to identify a light signal 110 sensed by the microphone array 106. For example, the processor 104 may perform light signal identification 108 based on waveforms from the microphone array 106. In some examples, the processor 104 may execute light signal identification instructions stored in a memory to perform light signal identification 108. In some examples, the microphone array 106 may provide a first waveform signal 112 and a second waveform signal 114 to the processor 104. The processor 104 may utilize a first waveform indicated by the first waveform signal 112 from a first microphone and a second waveform indicated by the second waveform signal 114 from a second microphone to determine and/or identify whether a light signal 110 is captured by the microphone array. In some examples, two waveforms corresponding to two microphones may be utilized. In some examples, three or more waveforms corresponding to three or more respective microphones may be utilized. For instance, the operations described herein for two waveforms in some examples may be performed for three or more waveforms in some examples.
[0019] To analyze the first waveform and the second waveform, the processor 104 may determine an aspect or aspects (e.g., parameter(s)) of the first waveform and the second waveform. Examples of aspects may include amplitudes, magnitudes, peak amplitudes (e.g., temporal and/or spectral peak amplitudes), peak magnitudes (e.g., temporal and/or spectral peak magnitudes), average amplitudes (e.g., temporal average amplitudes and/or spectral average amplitudes), average magnitudes (e.g., temporal average magnitudes and/or spectral average magnitudes), envelopes (e.g., temporal envelopes and/or spectral envelopes), timing (e.g., peak timing), frequencies (e.g., peak frequencies, pitches), statistical measures (e.g., average, standard deviation, and/or variance of temporal waveforms and/or spectral waveforms), noise waveform aspects, etc. In some examples, the analysis performed herein may be performed on a time period of a waveform or waveforms (e.g., the first waveform and the second waveform). For example, analyzing the first waveform and the second waveform may include performing the analysis relative to a period or portion of the first waveform signal 112 and/or the second waveform signal 114.
[0020] In some examples, the processor 104 may compare the aspect(s) (e.g., parameter(s)) to determine whether the first waveform and the second waveform are similar or dissimilar. For example, the processor 104 may utilize a threshold or thresholds, a range or ranges, and/or a correlation or correlations to determine similarity or dissimilarity. In some examples, when the first waveform and the second waveform are similar, the first microphone and the second microphone may have likely received acoustic waves that are expressed by the first waveform signal 112 and the second waveform signal 114. When the first waveform and the second waveform are dissimilar, the first microphone or the second microphone may have likely received a light signal 110 that is expressed by the first waveform signal 112 or the second waveform signal 114.
[0021] In some examples, the processor 104 may determine and/or provide an indication of whether a light signal 110 is identified based on a comparison between the waveforms. For instance, if the processor 104 determines that the waveforms are dissimilar, the processor 104 may determine and/or provide an indication (e.g., alert) that a light signal 110 is identified. If the processor 104 determines that the waveforms are similar, the processor 104 may determine that no light signal is identified. In some examples, the processor 104 may provide an indication that no light signal is identified. Examples of techniques to determine similarity and/or dissimilarity between waveforms from different microphones of the microphone array 106 are described herein. The computing device 102 may utilize a technique or a combination of techniques to identify whether a light signal 110 (e.g., laser) is sensed by a microphone of the microphone array 106.
[0022] In some examples, the processor 104 may detect a temporal peak or peaks of the first waveform and/or of the second waveform. As used herein, the term “temporal” may relate to time and/or the time domain. In some examples, the processor 104 may detect the temporal peak(s) using a temporal peak detection threshold. A temporal peak detection threshold may be static or may be relative (e.g., determined based on the waveform(s)). In some examples, the processor 104 may determine a maximum or maxima for values of a waveform that are greater than a static temporal peak threshold to produce a temporal peak amplitude or magnitude. In some examples, the processor 104 may determine a maximum or maxima for values of a waveform that are greater than a relative temporal peak threshold. For example, a relative temporal peak threshold may be determined relative to a statistical measure (e.g., a multiple of a standard deviation of amplitude or magnitude of the waveform). A maximum or maxima for values of a waveform that are greater than the relative temporal peak threshold may be the temporal peak amplitude(s) or magnitude(s).
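The relative-threshold peak detection described above might be sketched as follows in pure Python. The two-standard-deviation multiple and the simple local-maximum test are illustrative assumptions, not values taken from the description:

```python
import statistics

def detect_temporal_peaks(waveform, k=2.0):
    # Relative threshold: mean plus k standard deviations of the
    # waveform (k = 2.0 is an illustrative multiple).
    threshold = statistics.fmean(waveform) + k * statistics.pstdev(waveform)
    peaks = []
    for i in range(1, len(waveform) - 1):
        v = waveform[i]
        # A temporal peak: a local maximum whose value exceeds the threshold.
        if v > threshold and v >= waveform[i - 1] and v >= waveform[i + 1]:
            peaks.append((i, v))
    return peaks
```

A static temporal peak threshold would replace the computed `threshold` with a fixed constant.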
[0023] In some examples, to analyze the first waveform and the second waveform, the computing device 102 (e.g., processor 104) may determine a difference between the first waveform and the second waveform. Examples of differences may include a temporal subtraction, a temporal peak amplitude difference, a temporal peak magnitude difference, temporal signal-to-noise ratio (SNR) difference, and/or a statistical temporal difference (e.g., difference of averages, standard deviations, variances, etc.) between the first waveform and the second waveform. In some examples, the processor 104 may determine a subtraction between the first waveform and the second waveform to produce the temporal subtraction. In some examples, the processor 104 may determine a temporal peak amplitude or magnitude (e.g., a maximum) or temporal peak amplitudes or magnitudes (e.g., localized maxima) of the first waveform and the second waveform. The processor 104 may determine a subtraction or distance between the temporal peak amplitude(s) and/or the temporal peak magnitude(s) to determine the difference. In some examples, the processor 104 may determine a first temporal SNR for the first waveform and a second temporal SNR for the second waveform. The processor 104 may determine a subtraction or difference between the first temporal SNR and the second temporal SNR to determine the temporal SNR difference.
[0024] In some examples, to analyze the first waveform and the second waveform, the computing device 102 (e.g., processor 104) may determine whether the difference satisfies a temporal difference criterion to identify the light signal. Examples of the temporal difference criterion may include a temporal difference threshold or thresholds. For instance, the processor 104 may compare the difference(s) (e.g., temporal subtraction, temporal peak amplitude difference, temporal SNR difference, and/or statistical temporal difference, etc.) to a temporal difference threshold or thresholds (e.g., temporal amplitude difference threshold, temporal magnitude difference threshold, temporal SNR difference threshold, and/or temporal statistic difference threshold, etc.). The temporal difference threshold(s) may be expressed in terms of decibels (dB), voltage(s), percentage(s), etc. For example, the processor 104 may determine whether the difference is greater than 30%, 40%, 50%, 60%, 65%, etc. (e.g., whether the temporal peak amplitude difference is greater than 50% of a temporal peak amplitude of the first waveform or the second waveform). In a case that a difference is greater than the temporal threshold, the processor 104 may identify that the light signal 110 is sensed by the microphone array.
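The temporal difference criterion above could be sketched as a comparison against a fractional threshold. The 50% figure is one of the example thresholds mentioned; using the larger peak as the reference is an assumption:

```python
def exceeds_temporal_difference(peak_a, peak_b, fraction=0.5):
    # Flag when the peak-amplitude difference exceeds a fraction of
    # the larger peak, suggesting a possible light signal.
    reference = max(abs(peak_a), abs(peak_b))
    if reference == 0:
        return False
    return abs(peak_a - peak_b) / reference > fraction
```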
[0025] In some examples, to analyze the first waveform and the second waveform, the computing device 102 (e.g., processor 104) may determine a first time component of the first waveform and a second time component of the second waveform. A time component may be a value or values that indicate a temporal distribution and/or temporal characteristic of a waveform. Examples of time components may include peak time or times, a temporal envelope, and/or envelope coefficients. For example, the processor 104 may determine maximum temporal peak times and/or localized temporal peak times of the first waveform and the second waveform. For instance, the processor 104 may determine the time or times corresponding to the temporal peak(s), which may be determined as described above. In some examples, the processor 104 may determine a temporal envelope or envelopes. For example, the processor 104 may determine a curve that outlines the peaks of a waveform. In some examples, the processor 104 may perform linear prediction analysis to determine an envelope or envelopes (e.g., and/or coefficients) of the waveforms.
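One minimal way to realize the temporal envelope mentioned above is a frame-wise maximum of the rectified waveform. The frame length is an illustrative choice; linear prediction analysis, also mentioned, is an alternative:

```python
def temporal_envelope(waveform, frame=4):
    # Coarse envelope: the maximum absolute amplitude in each
    # non-overlapping frame of the waveform.
    return [max(abs(v) for v in waveform[i:i + frame])
            for i in range(0, len(waveform), frame)]
```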
[0026] In some examples, to analyze the first waveform and the second waveform, the computing device 102 (e.g., processor 104) may identify the light signal 110 in response to determining that the first time component does not match the second time component. For example, the processor 104 may determine whether the first time component matches the second time component by comparing the time components. In some examples, the processor 104 may correlate the temporal envelopes of the first waveform and the second waveform. If a maximum correlation is less than a correlation threshold (e.g., 50%, 60%, 70%, 75%, etc.), the processor 104 may determine that the first time component does not match the second time component and/or that a light signal 110 is sensed by the microphone array 106. In some examples, the processor 104 may correlate the first waveform and the second waveform. If a maximum correlation is less than a correlation threshold (e.g., 50%, 60%, 70%, 75%, etc.), the processor 104 may determine that the first waveform does not match the second waveform and/or that a light signal 110 is sensed by the microphone array 106.
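The envelope-correlation check might be sketched with a zero-lag normalized correlation, a simplification of the maximum-over-lags correlation described. The 0.7 threshold is one of the example values above:

```python
import math

def normalized_correlation(a, b):
    # Zero-lag normalized correlation of two equal-length sequences;
    # values near 1.0 indicate a close match.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    if na == 0 or nb == 0:
        return 0.0
    return dot / (na * nb)

def envelopes_match(a, b, threshold=0.7):
    # A correlation below the threshold suggests the waveforms do not
    # match, which may indicate a light signal.
    return normalized_correlation(a, b) >= threshold
```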
[0027] In some examples, the processor 104 may determine whether a peak time (e.g., time of a maximum peak) of the first waveform is within a threshold temporal range from a peak time (e.g., time of a maximum peak or pitch) of the second waveform. In a case that the peak time of the first waveform is not within the threshold temporal range, the processor 104 may determine that the first time component does not match the second time component. In some examples, a difference in time between a peak time of the first waveform and a peak time of the second waveform may indicate a phase between the first waveform and the second waveform.

[0028] In some examples, to analyze the first waveform and the second waveform, the computing device 102 (e.g., processor 104) may determine a phase between the first waveform and the second waveform. For instance, the processor 104 may determine a difference between peak times described above to determine the phase (e.g., delay). In some examples, to analyze the first waveform and the second waveform, the computing device 102 (e.g., processor 104) may compare the phase to a phase range that is based on microphone array spacing to identify the light signal. For instance, the microphone array spacing (e.g., spacing between a first microphone and a second microphone) may establish the phase range. For example, a maximum phase range may be an approximate amount of time for an acoustic wave to travel between the first microphone and the second microphone according to the spacing. The phase range may range from zero (for when an acoustic wave source is equidistant from the microphones, for example) to the maximum phase range (when an acoustic wave source is positioned in line with the microphones, for example). In some examples, to determine the phase, the processor 104 may determine corresponding temporal peaks between the first waveform and the second waveform. 
For instance, a temporal peak with a highest peak amplitude in one waveform (e.g., the first waveform) may correspond to a following (e.g., next) highest peak amplitude in another waveform (e.g., the second waveform), due to amplitude decline over distance. In a case that the phase is greater than the phase range (or in a case that a corresponding peak is not found within the phase range, for example), the processor 104 may determine that a light signal 110 is sensed by the microphone array.
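The spacing-based phase range can be sketched directly from the travel time of sound between the microphones. The 343 m/s speed of sound and the function names are assumptions for illustration:

```python
SPEED_OF_SOUND = 343.0  # m/s, approximate value in air at ~20 °C

def max_phase_range(spacing_m):
    # Maximum acoustic delay between two microphones: the travel
    # time of a wave arriving in line with the array.
    return spacing_m / SPEED_OF_SOUND

def phase_is_plausible(delay_s, spacing_m):
    # A delay larger than the maximum travel time cannot come from a
    # single acoustic source and may indicate a light signal.
    return abs(delay_s) <= max_phase_range(spacing_m)
```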
[0029] In some examples, to analyze the first waveform and the second waveform, the computing device 102 (e.g., processor 104) may transform the first waveform to a first spectral waveform and/or may transform the second waveform to a second spectral waveform. As used herein, the term “spectral” may refer to frequency domain spectrum. For example, the processor 104 may perform a Fourier transform (e.g., fast Fourier transform (FFT), discrete Fourier transform (DFT)), a Laplace transform, a Z-transform, etc., to transform the first waveform to a first spectral waveform and/or to transform the second waveform to a second spectral waveform. Transforming the first waveform and the second waveform (and/or another waveform or waveforms) may enable comparison between a spectral aspect or aspects of the waveforms.
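The time-to-frequency transform could be sketched with a naive discrete Fourier transform. An FFT would be used in practice; this O(n²) form is only for illustration:

```python
import cmath

def magnitude_spectrum(waveform):
    # Naive DFT: magnitude of each frequency bin of the waveform.
    n = len(waveform)
    mags = []
    for k in range(n):
        s = sum(waveform[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
        mags.append(abs(s))
    return mags
```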
[0030] In some examples, the processor 104 may detect a peak or peaks of the first spectral waveform and/or of the second spectral waveform. In some examples, the processor 104 may detect the peak(s) using a spectral peak detection threshold. A spectral peak detection threshold may be static or may be relative (e.g., determined based on the spectral waveform(s)). In some examples, the processor 104 may determine a maximum or maxima for values of a spectral waveform that are greater than a static spectral peak threshold to produce a spectral peak amplitude or magnitude. In some examples, the processor 104 may determine a maximum or maxima for values of a spectral waveform that are greater than a relative spectral peak threshold. For example, a spectral peak threshold may be determined relative to a statistical measure (e.g., a multiple of a standard deviation of amplitude or magnitude of the spectral waveform). A maximum or maxima for values of a spectral waveform that are greater than the relative spectral peak threshold may be the spectral peak amplitude(s) or magnitude(s).
[0031] In some examples, to analyze the first waveform and the second waveform, the computing device 102 (e.g., processor 104) may determine a difference between the first spectral waveform of the first waveform and the second spectral waveform of the second waveform. Examples of differences may include a spectral subtraction, a spectral peak amplitude difference, a spectral peak magnitude difference, spectral SNR difference, and/or a statistical spectral difference (e.g., difference of averages, standard deviations, variances, etc.) between the first spectral waveform and the second spectral waveform. In some examples, the processor 104 may determine a subtraction between the first spectral waveform and the second spectral waveform to produce the spectral subtraction. In some examples, the processor 104 may determine a spectral peak amplitude or magnitude (e.g., a maximum) or spectral peak amplitudes or magnitudes (e.g., localized maxima) of the first spectral waveform and the second spectral waveform. The processor 104 may determine a subtraction or distance between the spectral peak amplitude(s) and/or the spectral peak magnitude(s) to determine the difference. In some examples, the processor 104 may determine a first spectral SNR for the first spectral waveform and a second spectral SNR for the second spectral waveform. The processor 104 may determine a subtraction or difference between the first spectral SNR and the second spectral SNR to determine the difference.
[0032] In some examples, to analyze the first waveform and the second waveform, the computing device 102 (e.g., processor 104) may determine whether the difference satisfies a spectral difference criterion to identify the light signal. Examples of the spectral difference criterion may include a spectral difference threshold or thresholds. For instance, the processor 104 may compare the difference(s) (e.g., spectral subtraction, spectral peak amplitude difference, spectral SNR difference, and/or statistical spectral difference, etc.) to a spectral difference threshold or thresholds (e.g., spectral amplitude difference threshold, spectral magnitude difference threshold, spectral SNR difference threshold, and/or spectral statistic difference threshold, etc.). The spectral difference threshold(s) may be expressed in terms of decibels (dB), voltage(s), percentage(s), etc. For example, the processor 104 may determine whether the difference is greater than 30%, 40%, 50%, 60%, 65%, etc. (e.g., whether the spectral peak amplitude difference is greater than 50% of a spectral peak amplitude of the first waveform or the second waveform). In a case that a difference is greater than the spectral threshold, the processor 104 may identify that the light signal 110 is sensed by the microphone array.
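The spectral difference criterion might be sketched with a bin-wise spectral subtraction and a fractional peak comparison. The 50% figure is one of the example thresholds; treating the larger peak as the reference is an assumption:

```python
def spectral_subtraction_energy(spec_a, spec_b):
    # Aggregate bin-wise magnitude difference between two spectra;
    # a large value suggests dissimilar spectra.
    return sum(abs(x - y) for x, y in zip(spec_a, spec_b))

def spectral_peaks_differ(spec_a, spec_b, fraction=0.5):
    # Flag when the spectral peak magnitudes differ by more than a
    # fraction of the larger peak.
    peak_a, peak_b = max(spec_a), max(spec_b)
    reference = max(peak_a, peak_b)
    if reference == 0:
        return False
    return abs(peak_a - peak_b) / reference > fraction
```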
[0033] In some examples, to analyze the first waveform and the second waveform, the computing device 102 (e.g., processor 104) may determine a first frequency component of the first spectral waveform and a second frequency component of the second spectral waveform. A frequency component may be a value or values that indicate a spectral distribution and/or a spectral characteristic of a waveform. Examples of frequency components may include peak frequency or frequencies, a spectral envelope, and/or envelope coefficients. For example, the processor 104 may determine maximum spectral peak frequencies (e.g., pitch(es)) and/or localized spectral peaks of the first spectral waveform and the second spectral waveform. For instance, the processor 104 may determine the frequency or frequencies corresponding to the spectral peak(s), which may be determined as described above. In some examples, the processor 104 may determine a spectral envelope or envelopes. For example, the processor 104 may determine a curve that outlines the peaks of a spectral waveform. In some examples, the processor 104 may perform linear prediction analysis and/or cepstral windowing to determine an envelope or envelopes (e.g., and/or coefficients) of the spectral waveforms.
[0034] In some examples, to analyze the first waveform and the second waveform, the computing device 102 (e.g., processor 104) may identify the light signal 110 in response to determining that the first frequency component does not match the second frequency component. For example, the processor 104 may determine whether the first frequency component matches the second frequency component by comparing the frequency components. For example, the processor 104 may determine whether a peak frequency (e.g., frequency of a maximum peak or pitch) of the first spectral waveform is within a threshold spectral range from a peak frequency (e.g., frequency of a maximum peak or pitch) of the second spectral waveform. In a case that the peak frequency of the first spectral waveform is not within the threshold spectral range, the processor 104 may determine that the first frequency component does not match the second frequency component. In some examples, the processor 104 may correlate the spectral envelopes of the first spectral waveform and the second spectral waveform. If a maximum correlation is less than a correlation threshold (e.g., 50%, 60%, 70%, 75%, etc.), the processor 104 may determine that the first frequency component does not match the second frequency component and/or that a light signal 110 is sensed by the microphone array 106.
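The peak-frequency match test could be sketched as follows. The 50 Hz threshold spectral range is an illustrative assumption, and a real-valued input signal is assumed so only the first half of the spectrum is searched:

```python
def peak_frequency(spectrum, sample_rate):
    # Frequency (Hz) of the largest magnitude bin in the first half
    # of the spectrum (real-valued input assumed).
    half = spectrum[: len(spectrum) // 2 + 1]
    k = max(range(len(half)), key=lambda i: half[i])
    return k * sample_rate / len(spectrum)

def frequencies_match(f1, f2, tolerance_hz=50.0):
    # Peak frequencies within the threshold spectral range are
    # considered a match.
    return abs(f1 - f2) <= tolerance_hz
```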
[0035] In some examples, the processor 104 may filter the first waveform to produce a first filtered waveform and/or may filter the second waveform to produce a second filtered waveform. Examples of filtering may include noise filtering, low-pass filtering, band-stop filtering, notch filtering, and/or band-pass filtering. Performing filtering may enable comparison of a variety of characteristics (e.g., filtered waveforms, noise waveforms, spectral bands, etc.), which may be performed to identify whether a light signal 110 is sensed by the microphone array 106.
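As one minimal example of the filtering mentioned above, a single-pole low-pass filter can be sketched in a few lines. The smoothing coefficient is an illustrative choice:

```python
def low_pass(waveform, alpha=0.2):
    # Single-pole low-pass filter (exponential smoothing): each
    # output moves a fraction alpha toward the current sample.
    out = []
    prev = 0.0
    for x in waveform:
        prev = prev + alpha * (x - prev)
        out.append(prev)
    return out
```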
[0036] In some examples, to analyze the first waveform and the second waveform, the computing device 102 (e.g., processor 104) may compare the first filtered waveform and the second filtered waveform to identify the light signal 110. For example, the processor 104 may determine and/or compare an aspect or aspects for the first filtered waveform and the second filtered waveform, which aspect or aspects may be determined and/or which comparisons may be performed as similarly described above. For instance, the processor 104 may determine and/or compare temporal peaks, temporal peak amplitude, temporal peak magnitude, temporal SNRs, temporal subtraction, temporal peak amplitude difference, temporal peak magnitude difference, temporal SNR difference, statistical temporal difference, time components, temporal envelopes, temporal envelope coefficients, correlation, phase, spectral peaks, spectral peak amplitude, spectral peak magnitude, spectral subtraction, spectral peak amplitude difference, spectral peak magnitude difference, statistical spectral difference, spectral components, spectral envelopes, spectral envelope coefficients, and/or correlation for the first filtered waveform and the second filtered waveform. The processor 104 may identify that a light signal 110 is sensed by the microphone array 106 in response to determining that the first filtered waveform does not match the second filtered waveform based on the comparison(s) similar to those described above.
[0037] In some examples, the processor 104 may filter the first waveform to produce a first noise waveform and/or may filter the second waveform to produce a second noise waveform. For instance, the processor 104 may perform noise filtering (e.g., noise suppression) on the first waveform and the second waveform to produce the first noise waveform and the second noise waveform. For example, some approaches to noise filtering (e.g., Wiener filtering) may produce a noise estimate for filtering. The produced noise estimate for the first waveform may be the first noise waveform and/or the produced noise estimate for the second waveform may be the second noise waveform. Different microphones (e.g., a first microphone and a second microphone) may sense similar acoustic noise waves. However, when a light signal 110 is injected into a microphone of the microphone array 106, the sensed light signal 110 may mask and/or change the noise for one of the microphones.
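One crude way to obtain a noise waveform, in the spirit of the noise estimate described above, is to smooth the raw waveform and treat the residual as noise. This is an illustrative stand-in only; a production noise suppressor (e.g., a Wiener filter) is considerably more involved, and the function name and window size here are assumptions.

```python
def smooth_and_residual(samples, window=3):
    """Split a waveform into a smoothed (moving-average) estimate and a
    residual; the residual stands in for the 'noise waveform'."""
    half = window // 2
    smoothed = []
    for i in range(len(samples)):
        lo = max(0, i - half)
        hi = min(len(samples), i + half + 1)
        smoothed.append(sum(samples[lo:hi]) / (hi - lo))
    noise = [s - m for s, m in zip(samples, smoothed)]
    return smoothed, noise

# A constant waveform has no residual noise.
_, noise = smooth_and_residual([0.5] * 8)
print(max(abs(x) for x in noise))  # 0.0
```

Two microphones sensing the same acoustic scene would yield similar residuals; an injected light signal could perturb the residual for one microphone only.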
[0038] In some examples, to analyze the first waveform and the second waveform, the computing device 102 (e.g., processor 104) may compare the first noise waveform and the second noise waveform to identify the light signal 110. For example, the processor 104 may determine and/or compare an aspect or aspects for the first noise waveform and the second noise waveform, which aspect or aspects may be determined and/or which comparisons may be performed as similarly described above. For instance, the processor 104 may determine and/or compare temporal peaks, temporal peak amplitude, temporal peak magnitude, temporal subtraction, temporal peak amplitude difference, temporal peak magnitude difference, statistical temporal difference, time components, temporal envelopes, temporal envelope coefficients, correlation, phase, spectral peaks, spectral peak amplitude, spectral peak magnitude, spectral subtraction, spectral peak amplitude difference, spectral peak magnitude difference, statistical spectral difference, spectral components, spectral envelopes, spectral envelope coefficients, and/or correlation for the first noise waveform and the second noise waveform. The processor 104 may identify that a light signal 110 is sensed by the microphone array 106 in response to determining that the first noise waveform does not match the second noise waveform based on comparison(s) similar to those described above.
[0039] In some examples, to analyze the first waveform and the second waveform, the computing device 102 (e.g., processor 104) may determine a combination of a noise waveform comparison and a filtered waveform comparison. For example, the noise waveform comparison may be expressed as a noise waveform matching score (e.g., 0-1, 0-100%, etc.), where the noise waveform matching score may indicate a degree of matching between the first noise waveform and the second noise waveform. In some examples, the filtered waveform comparison may be expressed as a filtered waveform matching score (e.g., 0-1, 0-100%, etc.), where the filtered waveform matching score may indicate a degree of matching between the first filtered waveform and the second filtered waveform. The processor 104 may determine a combination by determining a combined score (e.g., sum, average, weighted average, etc.) of the noise waveform matching score and the filtered waveform matching score. The processor 104 may compare the combined score with a combined score threshold (e.g., 0.4, 0.5, 0.7, 0.9, 55%, 65%, etc.). In a case that the combined score is less than the combined score threshold, the processor 104 may determine that a light signal 110 is sensed by the microphone array 106. In some examples, other combinations of the comparisons described herein may be utilized. For instance, to analyze the first waveform and the second waveform, the computing device 102 (e.g., processor 104) may determine a combination of a waveform comparison, a spectral waveform comparison, a noise waveform comparison, and/or a filtered waveform comparison.
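The combined-score logic above amounts to simple arithmetic, sketched below. The function names, equal weights, and 0.5 threshold are illustrative assumptions; the disclosure permits sums, averages, or weighted averages with a range of thresholds.

```python
def combined_score(noise_match, filtered_match, weights=(0.5, 0.5)):
    """Weighted average of two matching scores, each in [0, 1]."""
    w_n, w_f = weights
    return (w_n * noise_match + w_f * filtered_match) / (w_n + w_f)

def light_signal_suspected(noise_match, filtered_match, threshold=0.5):
    """Flag a possible light signal when the combined score falls
    below the combined score threshold."""
    return combined_score(noise_match, filtered_match) < threshold

print(light_signal_suspected(0.9, 0.8))  # waveforms agree -> False
print(light_signal_suspected(0.2, 0.3))  # waveforms disagree -> True
```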
[0040] In some examples, to analyze the first waveform and the second waveform, the computing device 102 (e.g., processor 104) may generate a filter based on the first waveform. For example, the processor 104 may determine filter coefficients based on the first waveform. In some examples, the processor 104 may transform the first waveform into the frequency domain to produce a first spectral waveform. The processor 104 may generate a filter to filter out the first spectral waveform. For example, the processor 104 may determine a band stop filter with a stop band or stop bands corresponding to a spectral range or spectral ranges occupied by the first spectral waveform (e.g., spectral ranges with signal energy greater than a noise level). For instance, the band stop filter may be determined as a unity gain filter with stop bands (e.g., attenuation or 0 gain) corresponding to spectral characteristics (e.g., spectral peaks) of the first spectral waveform. In some examples, the processor 104 may determine a filter coefficient or filter coefficients to reduce or cancel the first spectral waveform. For instance, the processor 104 may determine filter coefficients to filter out a spectral envelope of the first spectral waveform (e.g., filter coefficients that provide a filter shape complementary to the spectral envelope of the first spectral waveform). [0041] In some examples, the computing device 102 (e.g., processor 104) may filter the second waveform based on the filter to produce a second filtered waveform. For example, the processor 104 may apply the filter to the second waveform. In some examples, to apply the filter, the processor 104 may convolve the second waveform with a time-domain version of the filter. 
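The band-stop filter generation described in [0040] can be sketched as follows: compute the first waveform's magnitude spectrum, then build a unity-gain filter with zero gain at bins whose energy exceeds a noise level. The naive DFT, function names, and noise level are illustrative assumptions (a real implementation would use an FFT and a measured noise floor).

```python
import math

def dft_magnitudes(samples):
    """Naive DFT magnitude spectrum (adequate for short illustrative signals)."""
    n = len(samples)
    mags = []
    for k in range(n // 2):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mags.append(math.hypot(re, im))
    return mags

def band_stop_gains(first_waveform, noise_level):
    """Unity-gain filter with zero gain (a stop band) at every bin where
    the first waveform's spectral magnitude exceeds the noise level."""
    return [0.0 if m > noise_level else 1.0 for m in dft_magnitudes(first_waveform)]

# A pure tone at bin 2 produces a stop band at that bin only.
tone = [math.sin(2 * math.pi * 2 * i / 16) for i in range(16)]
print(band_stop_gains(tone, noise_level=1.0))
```

Applying these gains sample-wise to the second spectral waveform implements the frequency-domain filtering described in [0041].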
In some examples, the processor 104 may transform the second waveform into the frequency domain to produce a second spectral waveform and may multiply (e.g., perform sample-wise multiplication of) the second spectral waveform with a frequency-domain version of the filter. Filtering the second waveform based on the filter may provide an indication of whether a light signal is identified. For instance, the filter generated from the first waveform may target waveforms similar to the first waveform for attenuation. For example, in a case that the first waveform and the second waveform are similar, the filter generated based on the first waveform may significantly attenuate the second waveform, which may result in a second filtered waveform with relatively low signal amplitude, magnitude, and/or energy. In a case that the first waveform and the second waveform are dissimilar, the filter generated based on the first waveform may pass a dissimilar portion or portions of the second waveform, which may result in a second filtered waveform with relatively higher signal amplitude, magnitude, and/or energy, which may indicate that a light signal is sensed. In some examples, the second waveform may be time-shifted for filtering, to account for inter-microphone spacing.
[0042] In some examples, the computing device 102 (e.g., processor 104) may determine whether the second filtered waveform meets a criterion (or criteria) to identify the light signal. The criterion or criteria may be a threshold or thresholds. In some examples, the processor 104 may compare the second filtered waveform (e.g., a characteristic or characteristics of the second filtered waveform) to a threshold. The threshold may be a static threshold or may be determined based on a noise level and/or signal level of the first waveform and/or of the second waveform. In some examples, the threshold may be set at a maximum noise level of a noise signal (of a first noise signal determined from the first waveform or of a second noise signal determined from the second waveform) or at a percentage (e.g., +10%, +20%, etc.) or offset relative to a maximum or average noise level. In some examples, the threshold may be set at a proportion or percentage (e.g., 20%, 30%, 50%, etc.) of a maximum amplitude or magnitude of the first waveform or of the second waveform or at a percentage (e.g., -20%, -30%, -50%, -70%, -80%, etc.) or offset relative to a maximum or average amplitude or magnitude of the first waveform or second waveform. In a case that the second filtered waveform (e.g., amplitude or magnitude peak(s) of the second filtered waveform, average energy of the second filtered waveform, etc.) is greater than the threshold, the computing device 102 (e.g., processor 104) may identify the light signal and/or produce an alert. In some examples, a filter may be generated based on the second waveform (or a waveform from another microphone) and may be similarly applied to the first waveform (or to another waveform from another microphone) to identify a light signal.
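The threshold check in [0042] can be sketched as comparing the filtered waveform's peak magnitude against a fraction of the first waveform's peak magnitude. The function name, the 30% fraction, and the example residuals are illustrative assumptions.

```python
def light_signal_flagged(second_filtered, first_waveform, fraction=0.3):
    """Flag a light signal when the second filtered waveform's peak
    magnitude exceeds a fraction of the first waveform's peak magnitude."""
    threshold = fraction * max(abs(s) for s in first_waveform)
    return max(abs(s) for s in second_filtered) > threshold

first = [0.0, 0.5, 1.0, 0.5, 0.0]
print(light_signal_flagged([0.01, -0.02, 0.015], first))  # attenuated -> False
print(light_signal_flagged([0.1, 0.6, -0.4], first))      # energy passed -> True
```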
[0043] In some examples, to analyze the first waveform and the second waveform, the computing device 102 (e.g., processor 104) may analyze the first waveform and the second waveform based on historical audio data. For example, the computing device 102 may store historical audio data that indicates a previously captured waveform or waveforms and/or an aspect or aspects of the previously captured waveform or waveforms. In some examples, the computing device 102 (e.g., processor 104) may compare the first waveform and/or the second waveform with the historical audio data to identify the light signal 110. For example, the processor 104 may determine and/or compare an aspect or aspects for the first waveform and/or the second waveform, which aspect or aspects may be determined and/or which comparisons may be performed as similarly described above. For instance, the processor 104 may determine and/or compare temporal peaks, temporal peak amplitude, temporal peak magnitude, temporal SNRs, temporal subtraction, temporal peak amplitude difference, temporal peak magnitude difference, temporal SNR difference, statistical temporal difference, time components, temporal envelopes, temporal envelope coefficients, correlation, phase, spectral peaks, spectral peak amplitude, spectral peak magnitude, spectral subtraction, spectral peak amplitude difference, spectral peak magnitude difference, statistical spectral difference, spectral components, spectral envelopes, spectral envelope coefficients, and/or correlation for the historical audio data and the first waveform and/or the second waveform. The processor 104 may identify that a light signal 110 is sensed by the microphone array 106 in response to determining that the first waveform and/or the second waveform do not match the historical audio data based on the comparison(s) similar to those described above.
[0044] FIG. 2 is a diagram illustrating examples of waveforms 216, 218, 220 to illustrate some of the techniques that may be utilized for light signal identification described herein. FIG. 2 illustrates an example of a first waveform 216 from a first microphone, an example of a second waveform A 218 from a second microphone, and an example of a second waveform B 220 from a second microphone. FIG. 2 illustrates the waveforms 216, 218, 220 in graphs, where the horizontal axes illustrate time (in seconds) and the vertical axes illustrate amplitude. For simplicity, the vertical axes are illustrated on a scale of -1 to 1, but different examples may represent amplitude by voltage, current, displacement, or another unit.
[0045] In one scenario, the first waveform 216 may be provided by a first microphone and the second waveform A 218 may be provided by a second microphone. In this scenario, a computing device (e.g., computing device 102) may determine a temporal peak amplitude 222 of the first waveform and a temporal peak amplitude 224 of the second waveform A 218. The computing device may determine a temporal peak amplitude difference between the temporal peak amplitude 222 of the first waveform and a temporal peak amplitude 224 of the second waveform A 218 (e.g., approximately 0.1). The temporal peak amplitude difference may be compared to a temporal amplitude difference threshold (e.g., 0.3). Because the temporal peak amplitude difference is less than the temporal amplitude difference threshold, the computing device may determine that the first waveform 216 and the second waveform A 218 match and/or that a light signal is not sensed by the first microphone or the second microphone.
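The peak amplitude comparison in this scenario reduces to the check below, using the same example values (a 0.1 difference against a 0.3 threshold). The function names are illustrative assumptions.

```python
def temporal_peak_amplitude(waveform):
    """Largest sample value of the waveform."""
    return max(waveform)

def peak_amplitudes_match(w1, w2, threshold=0.3):
    """Match when the temporal peak amplitude difference is below the
    temporal amplitude difference threshold."""
    return abs(temporal_peak_amplitude(w1) - temporal_peak_amplitude(w2)) < threshold

# Peaks of 0.8 and 0.7 differ by about 0.1 (< 0.3): the waveforms match.
print(peak_amplitudes_match([0.2, 0.8, 0.1], [0.3, 0.7, 0.0]))  # True
# Peaks of 0.8 and 0.4 differ by about 0.4 (> 0.3): no match.
print(peak_amplitudes_match([0.2, 0.8, 0.1], [0.1, 0.4, 0.0]))  # False
```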
[0046] In another approach, a computing device (e.g., computing device 102) may determine a temporal peak time 228 of the first waveform and a temporal peak time 230 of the second waveform A 218. The computing device may determine a phase between the temporal peak time 228 of the first waveform and the temporal peak time 230 of the second waveform A 218 (e.g., approximately 0.1 seconds (s)). The phase may be compared to a phase range (e.g., 0.2 s). Because the phase is less than the phase range, the computing device may determine that the first waveform 216 and the second waveform A 218 match and/or that a light signal is not sensed by the first microphone or the second microphone. Other techniques or a combination of techniques as described in FIG. 1 may be utilized to determine that the first waveform and second waveform A 218 match and/or that a light signal is not sensed by a microphone array.
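The phase comparison above (peak-time difference against a phase range) can be sketched as follows. The 10 Hz sample rate and the toy impulse waveforms are illustrative assumptions chosen to mirror the 0.1 s and 0.2 s figures in the text.

```python
def temporal_peak_time(waveform, sample_rate_hz):
    """Time (in seconds) of the largest sample."""
    idx = max(range(len(waveform)), key=lambda i: waveform[i])
    return idx / sample_rate_hz

def phases_match(w1, w2, sample_rate_hz, phase_range_s=0.2):
    """Match when the peak times differ by less than the phase range."""
    dt = abs(temporal_peak_time(w1, sample_rate_hz)
             - temporal_peak_time(w2, sample_rate_hz))
    return dt < phase_range_s

w1 = [0, 0, 0, 1, 0, 0, 0, 0, 0, 0]   # peak at 0.3 s (10 Hz sampling)
w2 = [0, 0, 0, 0, 1, 0, 0, 0, 0, 0]   # peak at 0.4 s
w3 = [0, 0, 0, 0, 0, 0, 0, 0, 1, 0]   # peak at 0.8 s
print(phases_match(w1, w2, 10))  # 0.1 s apart -> True
print(phases_match(w1, w3, 10))  # 0.5 s apart -> False
```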
[0047] In another scenario, the first waveform 216 may be provided by a first microphone and the second waveform B 220 may be provided by a second microphone. In this scenario, a computing device (e.g., computing device 102) may determine a temporal peak amplitude 222 of the first waveform and a temporal peak amplitude 226 of the second waveform B 220. The computing device may determine a temporal peak amplitude difference between the temporal peak amplitude 222 of the first waveform and a temporal peak amplitude 226 of the second waveform B 220 (e.g., approximately 0.4). The temporal peak amplitude difference may be compared to a temporal amplitude difference threshold (e.g., 0.3). Because the temporal peak amplitude difference is greater than the temporal amplitude difference threshold, the computing device may determine that a light signal is sensed by the first microphone and/or that the first waveform 216 does not match the second waveform B 220. [0048] In another approach, a computing device (e.g., computing device 102) may determine a temporal peak time 228 of the first waveform and a temporal peak time 232 of the second waveform B 220. The computing device may determine a phase between the temporal peak time 228 of the first waveform and the temporal peak time 232 of the second waveform B 220 (e.g., approximately 0.45 s). The phase may be compared to a phase range (e.g., 0.2 s). Because the phase is greater than the phase range, the computing device may determine that the first waveform 216 and the second waveform B 220 do not match and/or that a light signal is sensed by the first microphone. Other techniques or a combination of techniques as described in FIG. 1 may be utilized to determine that the first waveform and second waveform B 220 do not match and/or that a light signal is sensed by a microphone array.
[0049] FIG. 3 is a diagram illustrating examples of spectral waveforms 334, 336, 338 to illustrate some of the techniques that may be utilized for light signal identification described herein. FIG. 3 illustrates an example of a first spectral waveform 334 from a first microphone, an example of a second spectral waveform A 336 from a second microphone, and an example of a second spectral waveform B 338 from a second microphone. FIG. 3 illustrates the spectral waveforms 334, 336, 338 in graphs, where the horizontal axes illustrate frequency (in Hertz (Hz)) and the vertical axes illustrate magnitude. For simplicity, the vertical axes are illustrated on a scale of 0 to 1, but different examples may represent magnitude by voltage, current, displacement, or another unit.
[0050] In one scenario, the first spectral waveform 334 may be from a first microphone and the second spectral waveform A 336 may be from a second microphone. In this scenario, a computing device (e.g., computing device 102) may determine a spectral peak magnitude 340 of the first spectral waveform and a spectral peak magnitude 342 of the second spectral waveform A 336. The computing device may determine a spectral peak magnitude difference between the spectral peak magnitude 340 of the first spectral waveform and a spectral peak magnitude 342 of the second spectral waveform A 336 (e.g., approximately 0.15). The spectral peak magnitude difference may be compared to a spectral magnitude difference threshold (e.g., 0.4). Because the spectral peak magnitude difference is less than the spectral magnitude difference threshold, the computing device may determine that the first spectral waveform 334 and the second spectral waveform A 336 match and/or that a light signal is not sensed by the first microphone or the second microphone.
[0051] In another approach, a computing device (e.g., computing device 102) may determine a spectral peak frequency 346 of the first spectral waveform 334 and a spectral peak frequency 348 of the second spectral waveform A 336. The computing device may determine whether the spectral peak frequency 346 of the first spectral waveform 334 is within a threshold spectral range of the spectral peak frequency 348 of the second spectral waveform A 336 (e.g., within 40 Hz). Because the spectral peak frequency 346 of the first spectral waveform 334 is within the threshold spectral range from the spectral peak frequency 348 of the second spectral waveform A 336, the computing device may determine that the first spectral waveform 334 and the second spectral waveform A 336 match and/or that a light signal is not sensed by the first microphone or the second microphone. Other techniques or a combination of techniques as described in FIG. 1 may be utilized to determine that the first spectral waveform 334 and second spectral waveform A 336 match and/or that a light signal is not sensed by a microphone array.
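The peak-frequency check above can be sketched over magnitude spectra, given the spectral bin width. The toy spectra, 20 Hz bin width, and function names are illustrative assumptions; only the 40 Hz tolerance comes from the example above.

```python
def spectral_peak_frequency(mags, bin_width_hz):
    """Frequency of the largest spectral magnitude bin."""
    peak_bin = max(range(len(mags)), key=lambda k: mags[k])
    return peak_bin * bin_width_hz

def peak_frequencies_match(mags1, mags2, bin_width_hz, tolerance_hz=40.0):
    """Match when the spectral peak frequencies lie within the
    threshold spectral range of one another."""
    f1 = spectral_peak_frequency(mags1, bin_width_hz)
    f2 = spectral_peak_frequency(mags2, bin_width_hz)
    return abs(f1 - f2) <= tolerance_hz

a = [0.1, 0.2, 0.1, 0.0, 0.0, 1.0, 0.1, 0.0, 0.0, 0.0, 0.1]   # peak bin 5
b = [0.1, 0.1, 0.2, 0.0, 0.1, 0.2, 0.9, 0.1, 0.0, 0.0, 0.0]   # peak bin 6
c = [0.1, 0.0, 0.1, 0.0, 0.0, 0.1, 0.0, 0.0, 0.0, 0.0, 0.8]   # peak bin 10
print(peak_frequencies_match(a, b, 20.0))  # 100 Hz vs 120 Hz -> True
print(peak_frequencies_match(a, c, 20.0))  # 100 Hz vs 200 Hz -> False
```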
[0052] In another scenario, the first spectral waveform 334 may be from a first microphone and the second spectral waveform B 338 may be from a second microphone. In this scenario, a computing device (e.g., computing device 102) may determine a spectral peak magnitude 340 of the first spectral waveform and a spectral peak magnitude 344 of the second spectral waveform B 338. The computing device may determine a spectral peak magnitude difference between the spectral peak magnitude 340 of the first spectral waveform and a spectral peak magnitude 344 of the second spectral waveform B 338 (e.g., approximately 0.8). The spectral peak magnitude difference may be compared to a spectral magnitude difference threshold (e.g., 0.4). Because the spectral peak magnitude difference is greater than the spectral magnitude difference threshold, the computing device may determine that a light signal is sensed by the first microphone and/or that the first spectral waveform 334 does not match the second spectral waveform B 338. [0053] In another approach, a computing device (e.g., computing device 102) may determine a spectral peak frequency 346 of the first spectral waveform 334 and a spectral peak frequency 350 of the second spectral waveform B 338. The computing device may determine whether the spectral peak frequency 346 of the first spectral waveform 334 is within a threshold spectral range of the spectral peak frequency 350 of the second spectral waveform B 338 (e.g., within 40 Hz). Because the spectral peak frequency 346 of the first waveform 334 is not within the threshold spectral range from the spectral peak frequency 350 of the second spectral waveform B 338, the computing device may determine that the first spectral waveform 334 and the second spectral waveform B 338 do not match and/or that a light signal is sensed by the first microphone. Other techniques or a combination of techniques as described in FIG. 
1 may be utilized to determine that the first spectral waveform 334 and second spectral waveform B 338 do not match and/or that a light signal is sensed by a microphone array.
[0054] FIG. 4 is a block diagram of an example of an audio device 482 that may be utilized for light signal identification. The audio device 482 may be a device that captures audio from multiple microphones 454, 456. Some examples of the audio device 482 include smart speakers, smartphones, laptop computers, tablet devices, game consoles, mobile devices, etc. In some examples, the audio device 482 may be an example of the computing device 102 described in FIG. 1. In some examples, the audio device 482 may be included in a computing device (e.g., the computing device 102 described in FIG. 1). In some examples, the audio device 482 may perform one, some, or all of the functions, operations, elements, procedures, etc., described in one, some, or all of FIG. 1-5.
[0055] In some examples, the audio device 482 may include a first microphone 454 and a second microphone 456. In some examples, the audio device may include three or more microphones. In some examples, the audio device 482 may include a filter or filters 458, a processor 462, and/or an output device or devices 472. In some examples, the filter(s) 458 and the processor 462 may be separate circuitries (e.g., ASICs, processors, integrated circuits, etc.). In some examples, the filter(s) 458 and the processor 462 may be combined into a circuitry (e.g., the processor 104 described in FIG. 1). The microphones 454, 456, the processor 462, and/or the output device(s) 472 may be in electronic communication. In some examples, the filter(s) 458, the processor 462, and/or the output device(s) 472 may be separate from the audio device 482. For example, the audio device 482 may be included in a computing device (e.g., the computing device 102 described in FIG. 1), the processor 462 may be a separate processor (e.g., processor 104 described in FIG. 1) of the computing device, and/or the output device(s) may be separate output device(s) of the computing device.
[0056] In some examples, the first microphone 454 may capture a first waveform (or a waveform signal that may be represented as a first waveform). The first waveform may be provided to the filter(s) 458 and/or to the processor 462. In some examples, the second microphone 456 may capture a second waveform (or a waveform signal that may be represented as a second waveform). The second waveform may be provided to the filter(s) 458 and/or to the processor 462. In some examples, another waveform or waveforms corresponding to another microphone or microphones may be provided to the filter(s) 458 and/or to the processor 462.
[0057] In some examples, the filter(s) 458 may filter the first waveform to produce a first filtered waveform 474 and/or may filter the second waveform to produce a second filtered waveform 476. For example, the filter(s) 458 may provide noise filtering, low-pass filtering, band-stop filtering, and/or band-pass filtering to produce the first filtered waveform 474 and/or the second filtered waveform 476. In some examples, the filter(s) 458 may produce a first noise waveform 478 and/or a second noise waveform 480. In some examples, the filter(s) 458 may produce another noise waveform or waveforms corresponding to another microphone or microphones.
[0058] In some examples, the processor 462 may determine whether the first waveform (e.g., the first filtered waveform 474) matches the second waveform (e.g., the second filtered waveform 476). In some examples, the processor 462 may perform a transform function or functions 464, an aspect determination function or functions 466, a difference determination function or functions 468, and/or a matching determination function or functions 470. A function or functions performed by the processor 462 may be performed as described in FIG. 1. For example, the transform function(s) 464 may include transforming a waveform or waveforms (e.g., first filtered waveform 474 and second filtered waveform 476) into the frequency domain. The aspect determination function(s) 466 may determine an aspect or aspects described in FIG. 1 for a waveform or waveforms (e.g., first filtered waveform 474 and second filtered waveform 476). The difference determination function(s) 468 may determine a difference or differences between waveforms (e.g., first filtered waveform 474 and second filtered waveform 476) as described in FIG. 1.
[0059] The matching determination function(s) 470 may compare waveforms and/or determine whether waveforms (e.g., first filtered waveform 474 and second filtered waveform 476) match. For instance, the processor 462 may determine whether a first waveform matches a second waveform. As used herein, the term “match” and variations thereof may mean that a waveform satisfies a criterion or criteria (e.g., threshold(s)) to be similar to another waveform. In some examples, a function or functions of the processor 462 may utilize a waveform or waveforms, a filtered waveform or waveforms 474, 476, a noise waveform or waveforms, and/or historical audio data 460 to perform a transform or transforms, to determine an aspect or aspects, to determine a difference or differences, and/or to determine a match or matches between waveforms as described in FIG. 1. In some examples, the historical audio data 460 may be obtained from memory and/or from another device.
[0060] In some examples, the processor 462 (e.g., matching determination function(s) 470) may determine whether a first time component of the first waveform matches a second time component of the second waveform to determine whether the first waveform matches the second waveform. In some examples, determining whether the first time component matches the second time component may be determined as described in FIG. 1. For instance, the processor 462 may determine whether a maximum correlation of the first waveform and the second waveform satisfies a correlation threshold and/or may determine whether a first peak time of the first waveform is within a threshold temporal range from a second peak time of the second waveform, etc.
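The maximum-correlation check described above can be sketched as a normalized cross-correlation over all relative lags. The function names, the 0.8 correlation threshold, and the clamping of negative correlations to zero are illustrative assumptions.

```python
import math

def normalized_max_correlation(w1, w2):
    """Maximum of the normalized cross-correlation over all relative lags."""
    e1 = math.sqrt(sum(s * s for s in w1))
    e2 = math.sqrt(sum(s * s for s in w2))
    if e1 == 0.0 or e2 == 0.0:
        return 0.0
    best = 0.0
    for lag in range(-(len(w2) - 1), len(w1)):
        acc = sum(w1[lag + j] * w2[j]
                  for j in range(len(w2)) if 0 <= lag + j < len(w1))
        best = max(best, acc / (e1 * e2))
    return best

def time_components_match(w1, w2, correlation_threshold=0.8):
    """Match when the waveforms align well at some lag."""
    return normalized_max_correlation(w1, w2) >= correlation_threshold

# The same pulse shifted in time correlates strongly at the right lag.
print(time_components_match([0, 0, 1, 2, 1, 0], [1, 2, 1, 0, 0, 0]))  # True
# A dissimilar alternating waveform does not.
print(time_components_match([0, 0, 1, 2, 1, 0], [1, -1, 1, -1, 1, -1]))  # False
```

Searching over lags also accommodates the small time shifts expected from inter-microphone spacing.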
[0061] In some examples, the processor 462 may generate the filter or filters 458 based on a first waveform and/or a second waveform as described in FIG. 1. In some examples, the matching determination function(s) 470 may determine whether the first waveform and second waveform match based on filtering the second waveform with a filter 458 that is generated based on the first waveform (or based on filtering the first waveform with a filter that is generated based on the second waveform, for example). For instance, in a case that the filter 458 generated based on the first waveform significantly attenuates the second waveform (to below a threshold, for instance) as indicated by a second filtered waveform, it may be determined that the first waveform matches the second waveform. Otherwise, it may be determined that the first waveform does not match the second waveform, and/or that a light signal is identified. [0062] In a case that waveforms do not match, the processor 462 may provide an indicator or indicators to the output device(s) 472. For example, the output device(s) 472 may output an alert that a light signal is detected in response to determining that the first waveform does not match the second waveform (and/or in response to determining that other waveforms do not match). The light signal may be a laser attack on the audio device 482. For example, the light signal may be an attempt to inject a voice command to be carried out by the audio device 482 (e.g., to unlock a door, open a garage door, disable a security device, start a car, etc.).
[0063] An output device may be a device for providing output. Examples of output devices may include a speaker(s), display(s), light(s), haptic motor(s), radio frequency (RF) transmitter(s), network card(s), etc. An alert may be an indication or message. Examples of alerts may include a sound (e.g., speech, tone, etc.), an image (e.g., image depicting text), a flashing light, an email, a text message, and/or a video, etc.
[0064] FIG. 5 is a block diagram illustrating an example of a computer-readable medium 584 for light signal identification. The computer-readable medium 584 may be a non-transitory, tangible computer-readable medium 584. The computer-readable medium 584 may be, for example, RAM, EEPROM, a storage device, an optical disc, and the like. In some examples, the computer-readable medium 584 may be volatile and/or non-volatile memory, such as DRAM, EEPROM, MRAM, PCRAM, memristor, flash memory, and the like. In some examples, the computer-readable medium 584 described in FIG. 5 may be an example of the memory described in FIG. 1 or memory described in FIG. 4. In some examples, code (e.g., data and/or executable code or instructions) of the computer-readable medium 584 may be transferred and/or loaded to memory or memories of a computing device and/or audio device.
[0065] The computer-readable medium 584 may include code (e.g., data and/or executable code or instructions). For example, the computer-readable medium 584 may include spectral analysis instructions 586 and/or comparison instructions 588.
[0066] In some examples, the spectral analysis instructions 586 may be instructions that, when executed, cause a processor of an electronic device to perform a first spectral analysis of a first waveform from a first microphone of a microphone array. For example, a processor may execute the spectral analysis instructions 586 to transform a first waveform into the frequency domain to produce a first spectral waveform and/or to determine a spectral aspect or aspects of the first spectral waveform as described in FIG. 1, FIG. 3, and/or FIG. 4.
[0067] In some examples, the spectral analysis instructions 586 may be instructions that, when executed, cause the processor of the electronic device to perform a second spectral analysis of a second waveform from a second microphone of a microphone array. For example, the processor may execute the spectral analysis instructions 586 to transform a second waveform into the frequency domain to produce a second spectral waveform and/or to determine a spectral aspect or aspects of the second spectral waveform as described in FIG. 1, FIG. 3, and/or FIG. 4.
[0068] In some examples, the comparison instructions 588 may be instructions that, when executed, cause the processor of the electronic device to compare the first spectral analysis with the second spectral analysis to determine whether the microphone array has received a light signal. For example, the processor may execute the comparison instructions 588 to perform a comparison or comparisons and/or to determine a match or matches based on a spectral aspect or spectral aspects described in FIG. 1, FIG. 3, and/or FIG. 4. For instance, the comparison instructions 588 may include instructions that, when executed, cause the processor of the electronic device to determine whether a first frequency component or components of the first spectral analysis match a second frequency component or components of the second spectral analysis. For example, if a first frequency component of a first spectral waveform does not match a second frequency component of a second spectral waveform, the processor may determine that the microphone array has received a light signal. In some examples, the first frequency component may be a first peak frequency and the second frequency component may be a second peak frequency. For instance, the comparison instructions 588 may include instructions that, when executed, cause the processor to determine whether the first peak frequency is within a threshold spectral range from the second peak frequency to determine whether the first frequency component matches the second frequency component. If the first peak frequency is within the threshold spectral range, for instance, the first peak frequency may be determined to match the second peak frequency. If the first peak frequency is not within the threshold spectral range, for instance, the first peak frequency may be determined to not match the second peak frequency.
In some examples, the computer-readable medium 584 may include instructions to cause the processor to compare the first waveform to the second waveform based on a temporal aspect or aspects in addition to the spectral aspect or aspects.
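One hypothetical way to add a temporal aspect to the comparison is cross-correlation of the two waveforms; the helper below is an assumption for illustration and is not prescribed by the source.

```python
import numpy as np

def temporal_offset(waveform_a, waveform_b):
    """Offset (in samples) at which waveform_a best aligns with
    waveform_b, estimated by full cross-correlation; a temporal
    aspect that can supplement the spectral comparison."""
    corr = np.correlate(waveform_a, waveform_b, mode="full")
    return int(np.argmax(corr)) - (len(waveform_b) - 1)

# A sound reaching two closely spaced microphones arrives with a small,
# bounded inter-microphone delay; a light signal induced at one
# microphone has no acoustically consistent counterpart at the other.
mic_b = np.zeros(100)
mic_b[10] = 1.0    # impulse at microphone B
mic_a = np.zeros(100)
mic_a[15] = 1.0    # the same impulse, 5 samples later, at microphone A
offset = temporal_offset(mic_a, mic_b)   # 5
```

An offset outside the range implied by the physical microphone spacing (or the absence of any correlated arrival) could then be treated as evidence of a non-acoustic signal.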
[0069] As used herein, the term “and/or” may mean an item or items. For example, the phrase “A, B, and/or C” may mean any of: A (without B and C), B (without A and C), C (without A and B), A and B (but not C), B and C (but not A), A and C (but not B), or all of A, B, and C.
[0070] While various examples are described herein, the disclosure is not limited to the examples. Variations of the examples described herein may be within the scope of the disclosure. For example, operations, functions, aspects, or elements of the examples described herein may be omitted or combined.

Claims

1. A computing device, comprising:
  a microphone array; and
  a processor to analyze a first waveform and a second waveform from the microphone array to identify a light signal sensed by the microphone array.

2. The computing device of claim 1, wherein to analyze the first waveform and the second waveform, the processor is to:
  determine a difference between a first spectral waveform of the first waveform and a second spectral waveform of the second waveform; and
  determine whether the difference satisfies a spectral difference criterion to identify the light signal.

3. The computing device of claim 1, wherein the processor is to filter the first waveform to produce a first filtered waveform and is to filter the second waveform to produce a second filtered waveform.

4. The computing device of claim 3, wherein to analyze the first waveform and the second waveform, the processor is to compare the first filtered waveform and the second filtered waveform to identify the light signal.

5. The computing device of claim 1, wherein the processor is to filter the first waveform to produce a first noise waveform and is to filter the second waveform to produce a second noise waveform, wherein to analyze the first waveform and the second waveform, the processor is to compare the first noise waveform and the second noise waveform to identify the light signal.

6. The computing device of claim 5, wherein to analyze the first waveform and the second waveform, the processor is to determine a combination of a noise waveform comparison and filtered waveform comparison.

7. The computing device of claim 1, wherein to analyze the first waveform and the second waveform, the processor is to:
  generate a filter based on the first waveform;
  filter the second waveform based on the filter to produce a second filtered waveform; and
  determine whether the second filtered waveform meets a criterion to identify the light signal.

8. The computing device of claim 1, wherein to analyze the first waveform and the second waveform, the processor is to analyze the first waveform and the second waveform based on historical audio data.

9. The computing device of claim 1, wherein to analyze the first waveform and the second waveform, the processor is to:
  determine a phase between the first waveform and the second waveform; and
  compare the phase to a phase range that is based on microphone array spacing to identify the light signal.

10. A computing device, comprising:
  an audio device, comprising:
    a first microphone to capture a first waveform; and
    a second microphone to capture a second waveform;
  a processor to determine whether the first waveform matches the second waveform; and
  an output device to output an alert that a light signal is detected in response to determining that the first waveform does not match the second waveform.

11. The computing device of claim 10, wherein the processor is to determine whether a first time component of the first waveform matches a second time component of the second waveform to determine whether the first waveform matches the second waveform.

12. The computing device of claim 10, wherein the light signal is a laser attack on the audio device.

13. A non-transitory tangible computer-readable medium comprising instructions when executed cause a processor of an electronic device to:
  perform a first spectral analysis of a first waveform from a first microphone of a microphone array;
  perform a second spectral analysis of a second waveform from a second microphone of the microphone array; and
  compare the first spectral analysis with the second spectral analysis to determine whether the microphone array has received a light signal.

14. The computer-readable medium of claim 13, wherein the instructions when executed further cause the processor to determine whether a first frequency component of the first spectral analysis matches a second frequency component of the second spectral analysis.

15. The computer-readable medium of claim 14, wherein the first frequency component is a first peak frequency and the second frequency component is a second peak frequency, and wherein the instructions when executed cause the processor to determine whether the first peak frequency is within a threshold spectral range from the second peak frequency to determine whether the first frequency component matches the second frequency component.

Priority Applications (1)

PCT/US2020/028525 (WO2021211127A1): priority date 2020-04-16, filing date 2020-04-16, "Light signal identification"

Publications (1)

WO2021211127A1, published 2021-10-21

Family

ID=78084541


Cited By (1)

* Cited by examiner, † Cited by third party

US11757531B1 * (United Services Automobile Association (USAA)): priority date 2020-08-26, published 2023-09-12, "Systems and methods for preventing lightbeam access to microphones of smart devices"

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0296588A2 (en) * 1987-06-24 1988-12-28 Media Control-Musik-Medien-Analysen Gesellschaft Mit Beschränkter Haftung Method and circuit arrangement for the automatic recognition of signal sequences
US20090296946A1 (en) * 2008-05-27 2009-12-03 Fortemedia, Inc. Defect detection method for an audio device utilizing a microphone array
JP4508862B2 (en) * 2004-12-28 2010-07-21 カシオ計算機株式会社 Optical microphone system
US8462321B2 (en) * 2008-06-30 2013-06-11 Nellcor Puritan Bennet Ireland Methods and systems for discriminating bands in scalograms
JP5721160B2 (en) * 2009-03-23 2015-05-20 Necプラットフォームズ株式会社 Equipment inspection device and equipment inspection method
US20180262831A1 (en) * 2013-03-21 2018-09-13 Nuance Communications, Inc. System and method for identifying suboptimal microphone performance
CN107917665B (en) * 2016-10-09 2020-02-11 睿励科学仪器(上海)有限公司 Method and apparatus for determining the position of a light spot
DE112017007695T5 (en) * 2017-06-26 2020-03-12 Mitsubishi Electric Corporation FACETE EYE IMAGING DEVICE, IMAGE PROCESSING METHOD, PROGRAM AND RECORDING MEDIUM
WO2020057963A1 (en) * 2018-09-20 2020-03-26 Signify Holding B.V. A method and a controller for configuring a distributed microphone system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
TAKESHI SUGAWARA; BENJAMIN CYR; SARA RAMPAZZI; DANIEL GENKIN; KEVIN FU: "Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems", arXiv.org, Cornell University Library, Ithaca, NY, 22 June 2020 (2020-06-22), XP081699974 *

Legal Events

121 (EP): the EPO has been informed by WIPO that EP was designated in this application. Ref document number: 20931287; Country of ref document: EP; Kind code of ref document: A1.

NENP: non-entry into the national phase. Ref country code: DE.

122 (EP): PCT application non-entry in European phase. Ref document number: 20931287; Country of ref document: EP; Kind code of ref document: A1.