WO2021257071A1 - Radar-based touch interface - Google Patents

Radar-based touch interface

Info

Publication number
WO2021257071A1
WO2021257071A1 (PCT/US2020/038189)
Authority
WO
WIPO (PCT)
Prior art keywords
radar
radar system
touch
input
touch input
Prior art date
Application number
PCT/US2020/038189
Other languages
French (fr)
Inventor
Patrick M. AMIHOOD
Octavio PONCE MADRIGAL
Cody Blair Wortham
Original Assignee
Google Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google Llc filed Critical Google Llc
Priority to PCT/US2020/038189 priority Critical patent/WO2021257071A1/en
Publication of WO2021257071A1 publication Critical patent/WO2021257071A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06 Systems determining position data of a target
    • G01S13/42 Simultaneous measurement of distance and other co-ordinates
    • G01S13/87 Combinations of radar systems, e.g. primary radar and secondary radar
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/415 Identification of targets based on measurements of movement associated with the target

Definitions

  • Some devices include a physical interface, which enables a user to interact with the device.
  • a device can include a physical button that the user can push or a physical switch that the user can flip back and forth.
  • a physical interface can be relatively easy to integrate within the device. Over time, however, the physical interface can experience a mechanical failure, which can cause the physical interface to become inoperable. Furthermore, the size of some physical interfaces can make it challenging to integrate within space-constrained devices.
  • some devices include a touch user interface (TUI), such as a touch screen.
  • the touch screen can include resistive or capacitive sensors, which enable the touch screen to detect a user’s touch.
  • There can be some disadvantages to using a touch screen, however.
  • One such disadvantage is the fragility of a glass layer used to implement the touch screen. Sometimes, the glass layer can break, which can significantly degrade the performance of the touch screen.
  • some touch screens may be unable to detect a user’s touch if the user is wearing gloves. To address this, the user can operate the device without using gloves, which can be uncomfortable in cold weather, or operate the device using a particular type of glove that can interface with the touch screen. Compared to a physical interface, the touch screen can also further increase the complexity and cost of the device.
  • the radar-based touch interface utilizes at least one radar system to detect a touch input provided by a user.
  • the radar system can recognize a variety of different touch inputs, including a tap-and-release input, a tap-and-hold input, a swipe input, a position-dependent input, a motion-dependent input, a duration-dependent input, or a pressure-dependent input (e.g., a hard tap, a soft tap), or some combination thereof.
  • the radar system measures an amount of saturation that occurs due to a portion of the user being proximate to one or more antennas of the radar system.
  • the radar system can further measure the position and motion associated with the touch input to distinguish between different types of touch inputs.
  • an operation of the radar system is altered to facilitate the detection of the touch input.
  • components within the radar system’s transceiver can be configured to increase a probability of a received radar signal becoming saturated.
  • the radar system can also distinguish between an intentional touch input and an occlusion. In this way, the radar system can detect self-occlusion without relying on a nearby proximity sensor.
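The saturation-based detection described above can be sketched in a few lines. The clipping test and the 0.5 frame-fraction cutoff below are illustrative assumptions, not values from the application:

```python
import math

def saturation_fraction(samples, saturation_threshold):
    """Fraction of samples whose amplitude is clipped at or beyond
    the receiver's saturation threshold."""
    clipped = sum(1 for s in samples if abs(s) >= saturation_threshold)
    return clipped / len(samples)

def classify_frame(samples, saturation_threshold, touch_fraction=0.5):
    """Label a frame as a touch when a large share of it is saturated.
    The 0.5 cutoff is an assumed tuning value."""
    if saturation_fraction(samples, saturation_threshold) >= touch_fraction:
        return "touch"
    return "no-touch"

# A distant user yields a clean sinusoid; a finger on the antenna yields
# a waveform clipped at the saturation threshold (here 1.0).
clean = [0.5 * math.sin(2 * math.pi * k / 100) for k in range(1000)]
clipped = [max(-1.0, min(1.0, 2.0 * math.sin(2 * math.pi * k / 100))) for k in range(1000)]
```

Measuring a fraction rather than a single clipped sample gives the detector some robustness to isolated noise spikes.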
  • the radar-based touch interface includes at least one radar system.
  • the method includes transmitting a radar transmit signal using at least one transmit antenna of the radar system and receiving a radar receive signal using at least one receive antenna of the radar system.
  • the radar receive signal comprises a reflected version of the radar transmit signal.
  • the radar transmit signal is reflected by a portion of a user that touches a location on the device.
  • the method additionally includes recognizing, based on the radar receive signal, a touch input provided by the user.
  • the method further includes providing the touch input to the device to cause the device to perform a function associated with the touch input.
  • aspects described below also include an apparatus comprising a radar-based touch interface with at least one radar system.
  • the radar-based touch interface is configured to perform any of the methods described herein.
  • FIG. 1 illustrates example environments in which a radar-based touch interface can operate
  • FIG. 2 illustrates an example implementation of a radar-based touch interface as part of a user device
  • FIG. 3 illustrates an example radar system of a radar-based touch interface
  • FIG. 4 illustrates an example scheme implemented by a system processor of a radar system
  • FIG. 5 illustrates example positions of multiple radar systems on a smartphone
  • FIG. 6 illustrates an example method performed by a radar-based touch interface
  • FIG. 7 illustrates an example computing system embodying, or in which techniques may be implemented that enable use of, a radar-based touch interface.
  • FIG. 1 is an illustration of example environments in which techniques using, and an apparatus including, a radar-based touch interface 102 may be embodied.
  • a user device 104 includes the radar-based touch interface 102, which is capable of detecting a touch input provided by a user.
  • Although the user device 104 is shown as a smartphone in FIG. 1, the user device 104 can be implemented as any suitable computing or electronic device, as described in further detail with respect to FIG. 2.
  • the user interacts with the user device 104 through the radar-based touch interface 102.
  • the user taps a position on the user device 104 to select an item displayed by the user device 104.
  • This type of touch can represent a tap-and-release input in which the user’s finger is temporarily in contact with the user device 104.
  • the user can provide a tap-and-hold input by keeping their finger in contact with the user device 104 for a longer period of time relative to the tap-and-release input.
  • the user slides their thumb up or down along a side of the user device 104 to provide a swipe input.
  • the radar-based touch interface 102 detects this action and directs the user device 104 to scroll through information that is presented on the display.
  • the radar-based touch interface 102 can also recognize other types of touch inputs that are not shown.
  • the radar-based touch interface 102 can recognize a position- dependent input, which involves the user performing a tap or a swipe at a particular position on the user device 104.
  • a first tap-and-release input performed at a first position can cause the user device 104 to perform a different function than a second tap-and-release input performed at a second position that is different than the first position.
  • Another type of touch input can be a motion-dependent input, which involves the user performing a tap or a swipe at a particular speed or along a particular direction.
  • a right swipe can cause the user device 104 to perform a different function than a left swipe.
  • a slow swipe can cause the user device 104 to perform a different function than a fast swipe.
  • the touch input can be a duration-specific input, which involves the user performing the tap or swipe for a particular duration.
  • the duration-specific input enables the radar-based touch interface 102 to recognize different types of tap-and-hold inputs or different types of swipe inputs that are held for different durations.
  • Another type of touch input can include a pressure-sensitive input, which involves the user performing a tap or a swipe with different amounts of pressure.
  • the radar-based touch interface 102 can distinguish between a hard tap-and-release input and a soft tap-and-release input. In this case, the user performs the hard tap-and-release input with greater force than the soft tap-and-release input.
  • touch inputs can be some combination of the types of inputs described above (e.g., a combination of a tap-and-release input, a position-dependent input, a motion-dependent input, and a duration-dependent input; or a combination of multiple tap-and-release inputs).
  • Upon detecting any of these touch inputs, the radar-based touch interface 102 provides the touch input to the user device 104. This causes (e.g., prompts) the user device 104 to perform an action, such as present new content on the display, move a cursor on the display, activate or deactivate one or more components within the user device 104 (e.g., a camera, a wireless communication transceiver), open an application on the user device 104, change an operational mode of the user device 104 (e.g., transition between a low-power mode and a high-power mode), scroll through visual content that is presented on the display, and so forth.
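The dispatch step described above, in which a recognized touch input prompts a device function, can be sketched as a simple lookup. Both the input names and the action names below are hypothetical, not from the application:

```python
# Hypothetical mapping from recognized touch inputs to device functions.
ACTIONS = {
    "tap-and-release": "select_item",
    "tap-and-hold": "open_context_menu",
    "swipe-up": "scroll_up",
    "swipe-down": "scroll_down",
}

def dispatch(touch_input):
    """Forward a recognized touch input to the function it triggers;
    unrecognized inputs are ignored."""
    return ACTIONS.get(touch_input, "ignore")
```

A position-dependent or motion-dependent input would simply key the table on a (position, gesture) tuple instead of the gesture alone.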
  • the radar-based touch interface 102 can replace other types of touch-sensitive sensors, such as a physical button or switch, a resistance-sensing sensor, or a capacitance-sensing sensor.
  • the radar-based touch interface 102 includes at least one radar system 106.
  • the user does not interact with the radar-based touch interface 102.
  • the user can be at a sufficiently far distance from the radar system 106 or can have a sufficiently small radar cross section such that the radar system 106 is not saturated.
  • the radar system 106 generates a non-saturated receive signal 110, as shown in a graph 112 at the bottom right of FIG. 1.
  • the non-saturated receive signal 110 is a sinusoidal signal having a non-clipped amplitude. Characteristics of the non-saturated receive signal 110 can therefore be directly analyzed by the radar system 106 for radar-based applications, such as presence detection, gesture recognition, vital-sign detection, or collision avoidance.
  • the user interacts with the radar-based touch interface 102.
  • at least a portion of the user is at a sufficiently close distance to the radar system 106 or has a sufficiently large radar cross section such that the radar system 106 is saturated.
  • signal clipping occurs and the radar system 106 generates a saturated receive signal 114, as shown in a graph 116 at the bottom left of FIG. 1.
  • the saturated receive signal 114 is a non-sinusoidal signal. More specifically, the signal clipping causes an amplitude of the saturated receive signal 114 to be constrained within a saturation threshold 118 of the radar system 106.
  • the radar system 106 analyzes the amount of saturation present within the saturated receive signal 114 to recognize touch inputs for the radar-based touch interface 102 or perform occlusion detection.
  • the radar system 106 can detect an occlusion, such as a portion of the user in environments 100-1 and 100-2 that at least partially occludes (e.g., covers) one or more antennas of the radar system 106.
  • the radar system 106 also detects an occlusion.
  • In this case, the user’s purse occludes one or more antennas of the radar system 106.
  • the occlusion can degrade the performance of the radar system 106 by preventing a majority of a transmitted radar signal from propagating further in space beyond the occlusion.
  • Similar to the user interaction with the radar-based touch interface 102, the occlusion also causes the radar system 106 to generate a saturated receive signal 114. Responsive to detecting the occlusion, the radar system 106 can transition to a low-power state to conserve power. In some cases, the radar system 106 can be in an off state for a predetermined amount of time. In other cases, the radar system 106 can continue to operate in order to detect whether or not the occlusion has been removed.
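The occlusion handling described above can be sketched as simple state logic: saturation that persists well beyond the duration of a plausible touch is treated as an occlusion, and the radar drops to a low-power state until the saturation clears. The frame-count threshold is an assumed tuning parameter, not a value from the application:

```python
def occlusion_monitor(saturated_frames, hold_frames=10):
    """Illustrative state logic: saturation persisting beyond `hold_frames`
    consecutive frames is treated as an occlusion (e.g., a purse covering
    the antennas) rather than a touch, and the radar enters a low-power
    state. Once the saturation clears, normal operation resumes."""
    state = "active"
    run = 0
    for saturated in saturated_frames:
        run = run + 1 if saturated else 0
        if run > hold_frames:
            state = "low-power"   # persistent occlusion: conserve power
        elif run == 0:
            state = "active"      # occlusion removed (or never present)
    return state
```

A brief tap-and-hold produces a short saturation run and leaves the radar active; only a long run triggers the power transition.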
  • the user device 104 and the radar-based touch interface 102 are further described with respect to FIG. 2.
  • FIG. 2 illustrates the radar-based touch interface 102 as part of the user device 104.
  • the user device 104 is illustrated with various non-limiting example devices including a desktop computer 104-1, a tablet 104-2, a laptop 104-3, a television 104-4, a computing watch 104-5 (e.g., a smart watch), computing glasses 104-6, a gaming system 104-7, a home appliance (e.g., microwave) 104-8, and a vehicle 104-9.
  • Other examples of the user device 104 include a home service device, such as a smart speaker, a smart thermostat, a security camera, a baby monitor, a Wi-Fi™ router, a drone, a trackpad, a drawing pad, a netbook, an e-reader, a home-automation and control system, a wall display, a virtual-reality headset, and another home appliance.
  • the user device 104 can be wearable, non-wearable but mobile, or relatively immobile (e.g., desktops, appliances).
  • the radar-based touch interface 102 can be used with, or embedded within, many different user devices 104 or peripherals, such as in control panels that control home appliances and systems, in automobiles to control internal functions (e.g., volume, cruise control, driving of the car), or as an attachment to a laptop computer to control computing applications on the laptop.
  • the user device 104 includes at least one computer processor 202 and at least one computer-readable medium 204, which includes memory media and storage media. Applications and/or an operating system (not shown) embodied as computer-readable instructions on the computer-readable medium 204 can be executed by the computer processor 202 to provide some of the functionalities described herein.
  • the computer-readable medium 204 also includes applications (not shown), which perform a function based on data provided by the radar-based touch interface 102 or the radar system 106.
  • Example applications can include radar-based applications, which can utilize information provided by the radar system 106 for collision avoidance or touch-free control of the user device 104.
  • Other example applications perform a function responsive to the radar-based touch interface 102 recognizing a touch input provided by a user.
  • the user device 104 can also include a display 206.
  • the display 206 can present different types of information to the user based on the recognized touch input.
  • the user device 104 can also include a network interface 208 for communicating data over wired, wireless, or optical networks.
  • the network interface 208 may communicate data over a local-area-network (LAN), a wireless local-area-network (WLAN), a personal-area-network (PAN), a wide-area-network (WAN), an intranet, the Internet, a peer-to-peer network, a point-to-point network, a mesh network, and the like.
  • the radar-based touch interface 102 includes at least one radar system 106.
  • the radar-based touch interface 102 includes multiple radar systems 106, which can be located at different positions on the user device 104 (e.g., within an interior of the user device 104, mounted to an exterior surface of the user device 104).
  • the radar system 106 can be implemented on a single integrated circuit or distributed across multiple integrated circuits.
  • the radar system 106 includes at least one antenna 210 and at least one transceiver 212.
  • the radar system 106 includes a single antenna 210 coupled to a single transceiver 212, which can together transmit and receive radar signals to implement a pulse-Doppler radar.
  • the radar system 106 includes at least one antenna 210 coupled to a transmitter of the transceiver 212 and at least one other antenna 210 coupled to a receiver of the transceiver 212 to implement a continuous-wave radar.
  • the antenna 210 can be circularly polarized, horizontally polarized, or vertically polarized.
  • the antenna 210 can be implemented together with the transceiver 212 on a same integrated circuit or implemented separate from the integrated circuit that includes the transceiver 212.
  • the radar system 106 includes multiple antennas 210, which represent antenna elements of one or more antenna arrays.
  • An antenna array enables the radar system 106 to use analog or digital beamforming techniques during transmission and/or reception to improve the sensitivity and angular resolution.
  • the radar system 106 includes an antenna 210 for transmission, and multiple antennas 210, which form receive antenna elements of an antenna array, for reception.
  • the receive antenna elements can be positioned to form a one-dimensional shape (e.g., a line) or a two-dimensional shape (e.g., a rectangular arrangement, a triangular arrangement, an “L” shape arrangement) for implementations that include three or more receive antenna elements.
  • the one-dimensional shape enables the radar system 106 to measure one angular dimension (e.g., an azimuth, an elevation) while the two-dimensional shape enables the radar system 106 to measure two angular dimensions (e.g., both azimuth and elevation).
  • An element spacing associated with the receive antenna elements can be less than, greater than, or equal to half a center wavelength of the radar signal.
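The angular measurement described above follows from the inter-element phase relationship: for element spacing d, the phase difference between adjacent receive elements is 2*pi*d*sin(theta)/lambda, and half-wavelength spacing keeps the mapping to the arrival angle theta unambiguous. A minimal sketch with illustrative values (the 60-GHz carrier is one of the frequencies the application mentions; the 30-degree angle is assumed):

```python
import math

def arrival_angle_deg(delta_phase_rad, spacing_m, wavelength_m):
    """Angle of arrival from the inter-element phase difference:
    delta_phi = 2*pi*d*sin(theta)/lambda, solved for theta."""
    s = delta_phase_rad * wavelength_m / (2 * math.pi * spacing_m)
    return math.degrees(math.asin(max(-1.0, min(1.0, s))))

# Assumed example: 60-GHz signal (wavelength ~5 mm), half-wavelength spacing.
wavelength = 3e8 / 60e9
spacing = wavelength / 2
# A wave arriving 30 degrees off boresight produces a phase difference of
# 2*pi*d*sin(30 deg)/lambda = pi/2 between adjacent elements.
delta_phi = 2 * math.pi * spacing * math.sin(math.radians(30)) / wavelength
theta = arrival_angle_deg(delta_phi, spacing, wavelength)
```

With a two-dimensional element arrangement, the same relation is applied along each axis to recover both azimuth and elevation.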
  • the radar system 106 can form beams that are steered or un-steered, wide or narrow, or shaped (e.g., hemisphere, cube, fan, cone, cylinder).
  • the steering and shaping can be achieved through analog beamforming or digital beamforming.
  • the radar system 106 can have, for instance, an un-steered omnidirectional radiation pattern or can produce a wide steerable beam to illuminate a large volume of space during transmission.
  • the radar system 106 can use multiple antennas 210 to generate hundreds or thousands of narrow steered beams with digital beamforming during reception. In this way, the radar system 106 can efficiently monitor an external environment and detect one or more users.
  • the transceiver 212 includes circuitry and logic for transmitting and/or receiving radar signals via the antenna 210.
  • Components of the transceiver 212 can include amplifiers, mixers, switches, analog-to-digital converters, digital-to-analog converters, or filters for conditioning the radar signals.
  • the transceiver 212 also includes logic to perform in-phase/quadrature (I/Q) operations, such as modulation or demodulation. A variety of modulations can be used, including linear frequency modulations, triangular frequency modulations, stepped frequency modulations, or phase modulations.
  • the transceiver 212 can produce radar signals having a relatively constant frequency or a single tone.
  • the transceiver 212 can be configured to support continuous-wave or pulsed radar operations.
  • a frequency spectrum (e.g., range of frequencies) that the transceiver 212 uses to generate the radar signals can encompass frequencies between 1 and 400 gigahertz (GHz), between 4 and 100 GHz, between 1 and 24 GHz, between 2 and 4 GHz, between 57 and 64 GHz, or at approximately 2.4 GHz.
  • the frequency spectrum can be divided into multiple sub-spectrums that have similar or different bandwidths. The bandwidths can be on the order of 500 megahertz (MHz), 1 GHz, 2 GHz, and so forth.
  • Different frequency sub-spectrums may include, for example, frequencies between approximately 57 and 59 GHz, 59 and 61 GHz, or 61 and 63 GHz.
  • frequency sub-spectrums described above are contiguous, other frequency sub-spectrums may not be contiguous.
  • multiple frequency sub-spectrums that have a same bandwidth may be used by the transceiver 212 to generate multiple radar signals, which are transmitted simultaneously or separated in time.
  • multiple contiguous frequency sub-spectrums may be used to transmit a single radar signal, thereby enabling the radar signal to have a wide bandwidth.
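Combining contiguous sub-spectrums matters because an FMCW radar's range resolution improves with bandwidth as delta_R = c / (2 * B). A quick check using the bandwidths mentioned above:

```python
def range_resolution_m(bandwidth_hz, c=3e8):
    """FMCW range resolution: delta_R = c / (2 * B). Wider bandwidth
    resolves more closely spaced reflections."""
    return c / (2 * bandwidth_hz)

# A single 500-MHz sub-spectrum resolves reflections ~0.3 m apart;
# combining the three contiguous 2-GHz sub-spectrums (57 to 63 GHz)
# into one 6-GHz signal resolves reflections ~0.025 m apart.
single = range_resolution_m(500e6)
combined = range_resolution_m(6e9)
```

The finer resolution of the combined signal is what makes close-in distinctions, such as a finger directly over an antenna versus slightly offset, practical.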
  • the transceiver 212 is further described with respect to FIG. 3.
  • the radar system 106 can also include at least one system processor 214 and at least one system medium 216 (e.g., one or more computer-readable storage media).
  • the system processor 214 can be implemented together with the transceiver 212 on a same integrated circuit or implemented on a different integrated circuit that is separate from the integrated circuit with the transceiver 212.
  • the system processor 214 executes instructions stored within the system medium 216 to analyze information provided by the transceiver 212. For example, the system processor 214 can perform Fourier Transform (FT) operations, presence detection, gesture recognition, collision avoidance, or vital-sign detection.
  • the radar-based touch interface 102 includes a touch-input recognition module 218, which can be implemented using hardware, software, firmware, or a combination thereof.
  • the touch-input recognition module 218 is shown as instructions stored within the system medium 216.
  • the system processor 214 can implement the touch-input recognition module 218.
  • the touch-input recognition module 218 recognizes a touch input provided by the user and provides the recognized touch input to the computer processor 202 or an application executed by the computer processor 202. This causes the user device 104 to perform a function or operation according to the touch input.
  • the touch-input recognition module 218 can perform additional functions, such as control an operational mode of the radar system 106 or distinguish between a touch input and an occlusion.
  • the touch-input recognition module 218 can tailor the operation of components within the transceiver 212 to increase the saturation sensitivity of the radar system 106, which increases a probability of the radar system 106 becoming saturated.
  • the touch-input recognition module 218 can selectively alter the operational mode of the radar system 106 to enable the radar system 106 to conserve power during situations in which an occlusion is detected.
  • the touch-input recognition module 218 is further described with respect to FIG. 4.
  • the user device 104 can also include at least one touch-sensitive sensor 220, such as a sensor that senses changes in resistance or capacitance.
  • the touch-sensitive sensor 220 can include a touch screen of the user device 104.
  • the radar-based touch interface 102 and the touch-sensitive sensor 220 operate together to detect additional types of touch inputs that involve both the radar-based touch interface 102 and the touch-sensitive sensor 220. Operation of the radar system 106 is further described with respect to FIG. 3.
  • FIG. 3 illustrates an example radar system 106 of the radar-based touch interface 102.
  • the transceiver 212 of the radar system 106 includes at least one transmitter 302 and at least one receiver 304.
  • the transmitter 302 includes at least one voltage-controlled oscillator (VCO) 306 and at least one power amplifier (PA) 308.
  • the receiver 304 includes one or more receive channels 310-1 to 310-M, where M is a positive integer.
  • Each receive channel 310-1 to 310-M includes at least one low-noise amplifier (LNA) 312, at least one mixer 314, at least one filter 316, and at least one analog-to-digital converter (ADC) 318.
  • the radar system 106 also includes multiple antennas 210, which include at least one transmit antenna 320 and at least two receive antennas 322-1 to 322-M.
  • the transmit antenna 320 is coupled to the transmitter 302.
  • the receive antennas 322-1 to 322-M form an antenna array, such as a linear antenna array, and are respectively coupled to the receive channels 310-1 to 310-M.
  • the radar system 106 of FIG. 3 is shown to include multiple receive antennas 322-1 to 322-M and multiple receive channels 310-1 to 310-M, other implementations can include a single receive antenna 322 and a single receive channel 310.
  • During transmission, the voltage-controlled oscillator 306 generates a frequency-modulated radar signal 324 at radio frequencies.
  • the frequency-modulated radar signal 324 can include a sequence of chirps that are transmitted in a continuous burst or as time-separated pulses.
  • a duration of each chirp can be on the order of tens or thousands of microseconds (e.g., between approximately 40 microseconds (µs) and 5 milliseconds (ms)), for instance.
  • Individual frequencies of the chirps can increase or decrease over time.
  • the radar system 106 employs a two-slope cycle (e.g., triangular frequency modulation) to linearly increase and linearly decrease the frequencies of the chirps over time.
  • the two-slope cycle enables the radar system 106 to measure the Doppler frequency shift caused by the motion of a user (or object).
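The two-slope measurement works because, under one common sign convention, the up-chirp beat frequency is f_r - f_d while the down-chirp beat frequency is f_r + f_d, so the half-sum isolates the range-induced beat and the half-difference isolates the Doppler shift. A sketch with illustrative numbers (the sweep, carrier, and target values are assumed, not from the application):

```python
def range_and_velocity(f_up_hz, f_down_hz, slope_hz_per_s, fc_hz, c=3e8):
    """Triangular FMCW: half-sum of the up/down beat frequencies gives the
    range-induced beat; half-difference gives the Doppler shift."""
    f_range = (f_up_hz + f_down_hz) / 2
    f_doppler = (f_down_hz - f_up_hz) / 2
    distance = c * f_range / (2 * slope_hz_per_s)   # R = c * f_r / (2 * S)
    velocity = c * f_doppler / (2 * fc_hz)          # v = c * f_d / (2 * fc)
    return distance, velocity

# Assumed example: 4-GHz sweep over 40 us (slope 1e14 Hz/s), 60-GHz carrier.
# A target 0.3 m away closing at 1 m/s produces beat frequencies of
# 199.6 kHz (up-chirp) and 200.4 kHz (down-chirp).
d, v = range_and_velocity(199600.0, 200400.0, 1e14, 60e9)
```

A single-slope chirp would fold the Doppler shift into the range estimate; the second slope is what separates the two.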
  • transmission characteristics of the chirps (e.g., bandwidth, center frequency, duration, and transmit power) can vary depending on the implementation.
  • the power amplifier 308 amplifies the frequency-modulated radar signal 324 for transmission via the transmit antenna 320.
  • the transmitted frequency-modulated radar signal 324 is represented by a radar transmit signal 326. At least a portion of the radar transmit signal 326 is reflected by an object (e.g., the user). This reflected portion represents a radar receive signal 328.
  • the radar receive signal 328 represents the collection of radar receive signals 328-1 to 328-M. An amplitude of the radar receive signal 328 is smaller than an amplitude of the radar transmit signal 326 due to losses incurred during propagation and reflection.
  • the radar receive signal 328 represents a delayed version of the radar transmit signal 326.
  • the amount of delay is proportional to a slant range (e.g., distance) from the radar system 106 to the user.
  • this delay represents a summation of a time it takes for the radar transmit signal 326 to propagate from the transmit antenna 320 to the object and a time it takes for the radar receive signal 328 to propagate from the object to a receive antenna 322.
  • the radar receive signal 328 is shifted in frequency relative to the radar transmit signal 326 due to the Doppler effect.
  • characteristics of the radar receive signal 328 are dependent upon motion of the object and/or motion of the radar system 106.
  • the radar receive signal 328 is composed of one or more chirps.
  • the radar system 106 receives and processes the radar receive signal 328.
  • the receive antennas 322-1 to 322-M receive respective versions of the radar receive signal 328, which are represented by radar receive signals 328-1 to 328-M.
  • relative phase differences between these versions of the radar receive signals 328-1 to 328-M are due to differences in locations of the receive antennas 322-1 to 322-M.
  • the low-noise amplifier 312 amplifies the radar receive signal 328
  • the mixer 314 mixes the amplified radar receive signal 328 with the frequency-modulated radar signal 324.
  • the mixer 314 performs a beating operation, which downconverts and demodulates the radar receive signal 328 using the frequency-modulated radar signal 324 to generate a beat signal 330.
  • a frequency of the beat signal 330 represents a frequency difference between the frequency-modulated radar signal 324 and the radar receive signal 328, which is proportional to the slant range to the object.
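A minimal numeric sketch of this beat-frequency-to-range relationship, with assumed parameters (sample rate, samples per chirp, and chirp slope are illustrative, not values from the description): a point reflector at slant range R produces a beat tone at f_b = 2·R·slope/c, which a DFT peak search over the digitized beat signal can recover.

```python
# Illustrative numeric check (assumed parameters): synthesize the complex beat
# tone for a near-range reflector, then recover its range from the FFT peak of
# the digitized beat signal, mirroring how a digital beat signal encodes range.
import numpy as np

C = 3e8               # speed of light (m/s)
FS = 2e6              # assumed ADC sample rate (Hz)
N = 2048              # assumed samples per chirp
SLOPE = 4e9 / 100e-6  # assumed chirp slope: 4 GHz swept in 100 us (Hz/s)

def beat_signal(slant_range_m):
    """Complex beat tone produced by a point reflector at the given range."""
    f_beat = 2 * slant_range_m * SLOPE / C
    t = np.arange(N) / FS
    return np.exp(2j * np.pi * f_beat * t)

def estimate_range(samples):
    """Locate the beat tone with an FFT and convert its frequency to range."""
    spectrum = np.abs(np.fft.fft(samples))
    f_beat = np.argmax(spectrum) * FS / N
    return C * f_beat / (2 * SLOPE)
```

With these assumed parameters, a reflector at 10 cm is recovered to within a few millimeters, the granularity set by the FFT bin width.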
  • the beat signal 330 can include multiple frequencies, which represent reflections from different portions of the object (e.g., different fingers of a user, different portions of a user’s hand, different body parts of a user). In some cases, these different portions move at different speeds, move in different directions, or are positioned at different slant ranges relative to the radar system 106.
  • the filter 316 filters the beat signal 330, and the analog-to-digital converter 318 digitizes the filtered beat signal 330.
  • the receive channels 310-1 to 310-M respectively generate digital beat signals 332-1 to 332-M.
  • one or more of the digital beat signals 332-1 to 332-M represent the saturated receive signal 114 of FIG. 1. This can occur in situations in which the user interacts with the radar-based touch interface 102 or an object occludes the radar system 106.
  • the digital beat signals 332-1 to 332-M are provided to the system processor 214 for processing, as further described with respect to FIG. 4.
  • FIG. 4 illustrates an example scheme implemented by the system processor 214.
  • the system processor 214 is coupled to the receive channels 310-1 to 310-M of the transceiver 212.
  • the system processor 214 implements the touch-input recognition module 218, which can analyze the digital beat signals 332-1 to 332-M to recognize a touch input provided by a user.
  • the touch-input recognition module 218 includes a saturation counter 402 and a touch-input identification (ID) module 404. Additionally, the touch-input recognition module 218 can optionally include a position and/or motion estimator 406, an occlusion detection module 408, a mode-control module 410, or some combination thereof.
  • the saturation counter 402 accepts the digital beat signals 332-1 to 332-M and determines the amount of saturation present within each of these signals. For example, the saturation counter 402 can count the number of samples that are saturated (e.g., the number of samples whose amplitudes are clipped) and determine the percentage of saturated samples within each digital beat signal 332-1 to 332-M. This percentage can be provided as saturation data 412 to the touch-input identification module 404.
  • the touch-input identification module 404 determines whether or not the saturation data 412 indicates that a user provided a touch input. For example, the touch-input identification module 404 compares the percentage of saturated samples to a predetermined saturation threshold. If the percentage of saturated samples is greater than or equal to the predetermined saturation threshold, the touch-input identification module 404 determines that the user provided a touch input. Alternatively, if the percentage of saturated samples is less than the predetermined saturation threshold, the touch-input identification module 404 determines that the user has not provided a touch input.
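The saturation-count decision above can be sketched as follows. The ADC full-scale value and the 30% threshold are illustrative assumptions, not values taken from the description:

```python
# Sketch of the saturation-percentage touch test. The 12-bit full-scale value
# and the 30% threshold are assumptions for illustration.

ADC_FULL_SCALE = 2047       # clipping level of a hypothetical 12-bit ADC
TOUCH_THRESHOLD_PCT = 30.0  # assumed saturation threshold for a touch input

def saturation_percentage(samples):
    """Percentage of samples whose amplitude is clipped at full scale."""
    clipped = sum(1 for s in samples if abs(s) >= ADC_FULL_SCALE)
    return 100.0 * clipped / len(samples)

def is_touch_input(samples, threshold_pct=TOUCH_THRESHOLD_PCT):
    """Declare a touch when the saturated fraction reaches the threshold."""
    return saturation_percentage(samples) >= threshold_pct
```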
  • the touch-input identification module 404 can monitor the saturation data 412 over time to distinguish between different types of touch inputs.
  • the tap-and-release input, for instance, can cause the digital beat signals 332-1 to 332-M to be saturated for a shorter time period relative to the tap-and-hold input.
  • the touch-input identification module 404 can also distinguish between different pressure-sensitive inputs based on the saturation data 412.
  • a soft (e.g., small force) tap-and-release input can have a smaller percentage of saturated samples compared to a hard (e.g., large force) tap-and-release input. This occurs due to the difference in the amount of area that the user’s finger spreads over as a result of the difference in the amount of applied force.
  • the user’s finger can touch a smaller area on the surface of the user device 104.
  • the user’s finger can spread over a larger area on the surface of the user device 104 during a hard tap-and-release input in which the user is applying a larger amount of force.
  • increasing the area covered by the user’s finger increases the quantity of saturated samples within the digital beat signals 332-1 to 332-M.
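A hypothetical pressure classifier based on this spreading effect: a harder press spreads the finger over more antenna area and clips a larger fraction of samples. The band edges below are assumptions for illustration only.

```python
# Hypothetical mapping from saturated-sample percentage to a coarse pressure
# level. The 30%/60% band edges are illustrative assumptions.

def classify_tap_pressure(saturation_pct):
    """Map a saturated-sample percentage to a coarse pressure category."""
    if saturation_pct < 30.0:
        return "no-touch"
    if saturation_pct < 60.0:
        return "soft-tap"   # small applied force, finger covers less area
    return "hard-tap"       # large applied force, finger spreads further
```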
  • the touch-input identification module 404 analyzes additional information to recognize the type of touch input provided by the user.
  • the position and/or motion estimator 406 can accept the digital beat signals 332-1 to 332-M and generate position and/or motion data 414, which can further characterize a touch input.
  • the position and/or motion estimator 406 can generate range-Doppler maps based on the digital beat signals 332-1 to 332-M.
  • the range-Doppler maps include amplitude and phase information for a set of range bins and a set of Doppler bins.
  • the position and/or motion estimator 406 can use digital beamforming techniques to generate range-azimuth-elevation maps based on the digital beat signals 332-1 to 332-M.
  • the range-azimuth-elevation maps include amplitude and phase information for a set of range bins, a set of azimuth bins, and a set of elevation bins.
  • the position and/or motion estimator 406 can provide the range-Doppler maps and/or the range-azimuth-elevation maps as the position and/or motion data 414.
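One common way a range-Doppler map is formed, sketched here under assumed frame shapes (the description does not specify the processing chain): a frame of K chirps with N fast-time samples each is transformed by a range FFT along fast time and a Doppler FFT along slow time.

```python
# Minimal range-Doppler map sketch (assumed shapes): a fast-time FFT resolves
# range bins, and a slow-time FFT across chirps resolves Doppler bins, yielding
# amplitude and phase per (Doppler, range) bin as described for the maps above.
import numpy as np

def range_doppler_map(frame):
    """frame: complex array of shape (num_chirps, num_samples_per_chirp)."""
    range_fft = np.fft.fft(frame, axis=1)  # fast time -> range bins
    doppler_fft = np.fft.fftshift(         # slow time -> centered Doppler bins
        np.fft.fft(range_fft, axis=0), axes=0)
    return doppler_fft
```

For a stationary reflector, the map peaks at the zero-Doppler row (the center row after the shift) in the range bin matching the beat tone.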
  • the touch-input ID module 404 analyzes the position and/or motion data 414 to assist with recognizing various types of touch inputs. For example, the touch-input ID module 404 can determine a range associated with the touch input and confirm that the range is approximately equal to a distance between the antennas 210 of the radar system 106 and an exterior surface of the user device 104. Over time, the touch-input ID module 404 can also observe changes in the range rate associated with the touch input by analyzing the range-Doppler maps.
  • the touch-input ID module 404 can recognize a tap input by observing a detection within the range-Doppler maps having a relatively large negative range rate as the user’s finger approaches the radar-based touch interface 102 to touch the user device 104 and a relatively large positive range rate as the user’s finger moves away after tapping the user device 104.
  • the touch-input ID module 404 can recognize a swipe input by observing a detection within the range-Doppler maps having a range rate that changes by a relatively small amount compared to the tap-and-release input as the user slides their finger across the user device 104.
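The two signatures above can be condensed into a hypothetical classifier: a tap shows a large negative range rate (approach) followed by a large positive one (retreat), while a swipe keeps the range rate near zero. The ±0.2 m/s threshold is an assumption for illustration.

```python
# Hypothetical tap-vs-swipe classifier over a track of range-rate detections
# drawn from successive range-Doppler maps. The threshold is an assumption.

TAP_RANGE_RATE_MPS = 0.2  # assumed magnitude separating tap motion from swipe

def classify_motion(range_rates):
    """range_rates: sequence of range-rate detections (m/s) over a gesture."""
    approach = min(range_rates)
    retreat = max(range_rates)
    if approach <= -TAP_RANGE_RATE_MPS and retreat >= TAP_RANGE_RATE_MPS:
        return "tap"    # fast approach, then fast retreat
    if max(abs(r) for r in range_rates) < TAP_RANGE_RATE_MPS:
        return "swipe"  # range rate changes by a relatively small amount
    return "unknown"
```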
  • the touch-input ID module 404 can recognize different types of position-dependent inputs and/or motion-dependent inputs.
  • the touch-input ID module 404 can also determine the direction the swipe input travels (e.g., left, right, up, down).
  • the touch-input ID module 404 can combine information from the saturation data 412 and the position and/or motion data 414 to recognize different types of touch inputs.
  • the touch-input ID module 404 generates touch-input data 416, which identifies the touch input recognized by the touch-input recognition module 218.
  • the touch-input data 416 can be provided to the mode-control module 410 and/or other components of the user device 104, such as the computer processor 202 (of FIG. 2). This can prompt (e.g., cause) the user device 104 to perform an action or execute a function according to the touch input identified by the touch-input data 416.
  • the touch-input ID module 404 can accept sensor data 418 from the touch-sensitive sensor 220.
  • the sensor data 418 provides information regarding a touch input that is detected and recognized by the touch-sensitive sensor 220.
  • the touch-input ID module 404 can recognize a multi-touch input that involves both a touch input at the touch-sensitive sensor 220 and a touch input at the radar-based touch interface 102.
  • An example multi-touch input can include a tap-and-hold input via the radar-based touch interface 102 and a tap-and-release input via the touch-sensitive sensor 220.
  • Another example multi-touch input can include a swipe input via the radar-based touch interface 102 and a tap-and-hold input via the touch-sensitive sensor 220.
  • the occlusion detection module 408 can accept the saturation data 412 and the position and/or motion data 414. The occlusion detection module 408 determines whether an object is occluding the radar system 106. Using the occlusion detection module 408, the touch-input recognition module 218 can distinguish between a touch input (e.g., a short-term intentional occlusion) and an occlusion (e.g., a long-term occlusion that may or may not be intentional). In some implementations, the occlusion detection module 408 monitors a duration that the percentage of saturated samples is greater than a predetermined saturation threshold for detecting an occlusion.
  • the saturation threshold for detecting an occlusion can be less than the saturation threshold for detecting a touch input. This is because some objects may have a smaller radar cross section relative to the human body part used to provide the touch input. In contrast to a rigid inanimate object, the human body part is partially composed of fluid, which enables the human body part to spread across the surface of the user device 104. Therefore, the human body part can cause a larger amount of signal clipping to occur relative to the rigid inanimate object.
  • the occlusion detection module 408 determines that an occlusion is present if this duration is greater than or equal to a predetermined duration threshold.
  • the duration threshold can be set to be greater than a duration associated with a touch input, such as a duration of the tap-and-hold input.
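The duration-based occlusion test above can be sketched as follows. The thresholds (a 15% saturation floor, lower than the touch threshold, and a 3-second duration floor, longer than a tap-and-hold) are illustrative assumptions, not values from the description:

```python
# Sketch of duration-based occlusion detection over per-frame saturation data.
# Both thresholds below are assumptions for illustration.

OCCLUSION_SATURATION_PCT = 15.0  # assumed; lower than the touch threshold
OCCLUSION_DURATION_S = 3.0       # assumed; longer than a tap-and-hold input

def is_occluded(saturation_history, frame_period_s):
    """saturation_history: most-recent-last saturation percentages, one per
    radar frame, with frames spaced frame_period_s apart."""
    consecutive = 0
    for pct in reversed(saturation_history):
        if pct < OCCLUSION_SATURATION_PCT:
            break  # saturation dropped: streak of occluded frames ends here
        consecutive += 1
    return consecutive * frame_period_s >= OCCLUSION_DURATION_S
```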
  • the occlusion detection module 408 can also analyze information from the position and/or motion data 414 to determine that the radar system 106 is occluded. For example, the occlusion detection module 408 can determine that a detection within the range-Doppler map represents an occlusion if the detection is associated with a near range and a low Doppler filter, which can indicate that the occlusion is relatively stationary. The occlusion detection module 408 can also monitor the range-Doppler maps to detect a change in phase. If the phase associated with the detection does not change significantly over time, then the detection is likely associated with a stationary object.
  • the occlusion detection module 408 generates occlusion data 420, which can be provided to the user device 104 and/or the mode-control module 410.
  • the occlusion data 420 indicates whether or not the radar system 106 is occluded.
  • the user device 104 performs an action or executes a function according to the occlusion data 420.
  • the user device 104 can activate one or more components that are adjacent to the radar system 106 if the radar system 106 is not occluded and deactivate one or more of these components if the radar system 106 is occluded, as further described with respect to FIG. 5.
  • the user device 104 can alert the user to the presence of the occlusion.
  • the mode-control module 410 controls an operational mode 422 of the transceiver 212.
  • the mode-control module 410 directs the radar system 106 to operate according to a saturation mode, which increases the radar system 106’s saturation sensitivity for touch recognition or occlusion detection.
  • the transceiver 212 can operate with a higher transmit power and/or increase the gain of one or more of its amplifiers, such as the power amplifier 308, the low-noise amplifier 312, or a variable gain amplifier within the transmitter 302 or the receiver 304 (not shown).
  • the saturation mode causes the transceiver 212 to lower a cutoff frequency associated with a high-pass filter implemented by the filter 316. Lowering the cutoff frequency increases the filter 316’s response to near-range objects.
  • the transceiver 212 can also activate a low-pass filter, which can be implemented as part of the filter 316, to reduce noise and pass objects associated with small Doppler frequencies.
  • the mode-control module 410 activates the saturation mode responsive to the touch-input ID module 404 providing the touch-input data 416.
  • the mode-control module 410 analyzes the touch input data 416 and determines whether to activate the saturation mode or modify parameters of the saturation mode to improve touch recognition.
  • the mode-control module 410 activates the saturation mode responsive to the occlusion detection module 408 determining that the radar system 106 is occluded. This allows the occlusion detection module 408 to confirm that the radar system 106 is occluded.
  • the mode-control module 410 can activate the saturation mode at pre-specified time intervals, such as every 500 milliseconds.
  • the mode-control module 410 can also selectively alter the operational mode of the radar system 106. For example, if the occlusion detection module 408 determines that the radar system 106 is occluded, the mode-control module 410 can analyze the occlusion data 420 and cause the radar system 106 to operate in a low-power mode or transition to an off state (e.g., a powered-down state) to reduce power consumption while the radar system 106 is occluded. During the low-power mode, the occlusion detection module 408 can continuously and/or intermittently monitor the saturation data 412 and detect if the occlusion is no longer present. Once the radar system 106 is no longer occluded, the mode-control module 410 can cause the radar system 106 to operate according to a previous operational mode.
  • the mode-control module 410 can periodically wake up the radar system 106 by causing the radar system 106 to transition to another operational mode, such as the low-power mode.
  • the occlusion detection module 408 can confirm whether the occlusion is still present.
  • the proximity sensor can wake up the radar system 106 responsive to detecting that the occlusion is no longer present.
  • the mode-control module 410 can cause the radar system 106 to operate according to a soft-gate mode if the occlusion detection module 408 determines that the radar system 106 is occluded. In the soft-gate mode, the radar system 106 temporarily ceases processing detections. This can help reduce a false-alarm rate of the radar system 106 while the radar system 106 is occluded.
  • the mode-control module 410 can direct the radar system 106 to remain in the soft-gate mode for a predetermined amount of time before returning to a previous operational mode.
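A hypothetical state machine condensing the mode transitions described above: occlusion drives the radar into a low-power state and the previous operational mode is restored once the occlusion clears. The state names are assumptions for illustration; other designs might use the soft-gate or off states instead.

```python
# Hypothetical mode-control state machine. State names and the restore-on-clear
# behavior follow the transitions described above; details are assumptions.

class ModeControl:
    def __init__(self):
        self.mode = "normal"
        self._previous = "normal"

    def on_occlusion(self, occluded):
        if occluded and self.mode != "low-power":
            self._previous = self.mode  # remember the mode to restore later
            self.mode = "low-power"     # reduce power while occluded
        elif not occluded and self.mode == "low-power":
            self.mode = self._previous  # occlusion cleared: restore prior mode
```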
  • the radar system 106 can operate as a proximity sensor by detecting occlusion.
  • the system processor 214 can implement the saturation counter 402, the position and/or motion estimator 406, the occlusion detection module 408, the mode-control module 410, or some combination thereof. Multiple instances of the radar system 106 can be implemented within the user device 104, as further described with respect to FIG. 5.
  • FIG. 5 illustrates example positions of the radar-based touch interface 102’s radar systems 106 on a smartphone 500.
  • the smartphone 500 includes radar systems 106-1 to 106-8.
  • the radar systems 106-1 to 106-8 are positioned within or under an exterior housing of the smartphone 500, which can be substantially transparent to radar signals (e.g., minimally attenuate radar signals).
  • the radar systems 106-1 to 106-8 are positioned around the smartphone 500 such that a portion of each radar system 106’s antenna pattern overlaps at least one other radar system 106’s antenna pattern. In this way, an object can be detected in the overlapping antenna patterns of at least two of the radar systems 106-1 to 106-8 at various locations around the smartphone 500.
  • the one or more antennas 210 of each radar system 106 can face up along the Y axis towards an upper side of the smartphone 500, face left along the X axis towards a left side of the smartphone 500, face down along the Y axis towards a bottom side of the smartphone 500, or face right along the X axis towards a right side of the smartphone 500.
  • the antennas 210 of the radar systems 106-1 and 106- 3 can face up along the Y axis
  • the antennas 210 of the radar systems 106-2 and 106-4 can face left along the X axis
  • the antennas 210 of the radar systems 106-5 and 106-6 can face down along the Y axis
  • the antennas 210 of the radar systems 106-7 and 106-8 can face right along the X axis.
  • the antennas 210 of one or more of the radar systems 106-1 to 106-8 can face up out of the page along the Z axis towards a front face of the smartphone 500 or face down into the page along the Z axis towards a back side of the smartphone 500.
  • although the antennas 210 of the radar systems 106-1 to 106-8 can face a particular side of the smartphone 500, the antenna patterns of these antennas 210 can encompass a volume of space above the front face of the smartphone 500 and/or another volume of space behind the back side of the smartphone 500. In this way, the radar systems 106-1 to 106-8 can detect a user interacting with the smartphone 500.
  • the radar systems 106-1 to 106-8 have multiple antennas 210 that form an antenna array, such as multiple transmit antennas 320 or multiple receive antennas 322.
  • the radar systems 106-1 to 106-8 each include at least two receive antennas 322-1 and 322-2, which form a linear antenna array. Orientations of these linear antenna arrays can vary to enable the radar systems 106 to operate together to determine two-dimensional angular information associated with an object. In particular, orientations of some linear antenna arrays can differ by approximately 90 degrees.
  • the receive antennas 322-1 and 322-2 of the radar system 106-1 can be aligned along the X axis to measure azimuth angles of objects and the receive antennas 322-1 and 322-2 of the radar system 106-2 can be aligned along the Y axis to measure elevation angles of the objects.
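The two-element angle measurement underlying this arrangement can be sketched as follows, assuming half-wavelength element spacing (an assumption, not a value from the description): the phase difference between the two receive antennas maps to an arrival angle via sin(θ) = λ·Δφ / (2π·d). An array oriented along the X axis then yields azimuth and one along the Y axis yields elevation.

```python
# Illustrative two-element angle-of-arrival estimate. The half-wavelength
# element spacing is an assumption for illustration.
import math

def arrival_angle_deg(phase_diff_rad, spacing_wavelengths=0.5):
    """Angle off broadside from the inter-element phase difference."""
    s = phase_diff_rad / (2 * math.pi * spacing_wavelengths)
    return math.degrees(math.asin(max(-1.0, min(1.0, s))))
```

With half-wavelength spacing, a phase difference of π/2 corresponds to an arrival angle of 30 degrees off broadside.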
  • although the smartphone 500 of FIG. 5 is shown to include eight radar systems 106-1 to 106-8, other implementations of the smartphone 500 can have other quantities of radar systems 106 (e.g., fewer than eight radar systems 106 or more than eight radar systems 106).
  • the smartphone 500 can include two radar systems 106, such as radar systems 106-1 and 106-2. In some cases, the two radar systems 106 are oriented along different axes to enable two- dimensional angular information to be determined.
  • the various positions of the radar systems 106-1 to 106-8 can facilitate implementation of the radar-based touch interface 102.
  • the radar system 106-8 can be at a convenient position for detecting touch inputs provided by a right-handed user's thumb.
  • the radar system 106-4 can be at a convenient position for detecting touch inputs provided by a left-handed user's thumb.
  • the radar systems 106-1, 106-3, 106-5, and/or 106-6 can be used to enable a user to provide touch inputs for a gaming application.
  • Some radar systems 106-1 to 106-8 can be at positions that are likely to be occluded by the user.
  • the radar system 106-4 can be occluded by a right-handed user’s fingers, which wrap around the smartphone 500.
  • one or more of the radar systems 106-1 to 106-8 are not adjacent to a proximity sensor. With the ability to detect occlusion, however, the radar systems 106-1 to 106-8 can independently transition to an appropriate operational mode.
  • the radar systems 106-1 to 106-8 can also operate as separate proximity sensors for other components within the smartphone 500.
  • a radio-frequency component is adjacent to the radar system 106-4 (e.g., above the radar system 106-4, below the radar system 106-4).
  • This radio-frequency component can include a wireless communication transceiver, which transmits wireless communication signals, or another radar system 106 (not shown).
  • FIG. 6 depicts an example method 600 performed by a radar-based touch interface.
  • Method 600 is shown as sets of operations (or acts) performed, but the method is not necessarily limited to the order or combinations in which the operations are shown herein. Further, any one or more of the operations may be repeated, combined, reorganized, or linked to provide a wide array of additional and/or alternate methods.
  • reference may be made to the environments 100-1 to 100-3 of FIG. 1, and entities detailed in FIGs. 2-4, reference to which is made for example only. The techniques are not limited to performance by one entity or multiple entities operating on one device.
  • a radar transmit signal is transmitted using at least one transmit antenna of a radar system.
  • a radar system 106 of a radar-based touch interface 102 transmits a radar transmit signal 326 using at least one transmit antenna 320, as shown in FIG. 3.
  • the radar system 106 operates according to a saturation mode, which increases a power level of the radar transmit signal to increase a likelihood that the radar system 106 becomes saturated.
  • a radar receive signal is received using at least one receive antenna of the radar system.
  • the radar system 106 receives a radar receive signal 328 using at least one receive antenna 322, as shown in FIG. 3.
  • the radar receive signal 328 represents a reflected version of the radar transmit signal 326.
  • the user interacts with the radar-based touch interface 102. Due to this interaction, the radar transmit signal 326 is reflected by a portion of the user that touches a location on the user device 104. In general, this location is proximate to (e.g., close to, near) one or more of the antennas 210 of the radar system 106 and causes the radar system 106 to become saturated.
  • this location can be directly above one or more of the antennas 210 such that the user covers at least a portion of one or more of the antennas 210.
  • this location can be adjacent (e.g., next to, to the side of) one or more of the antennas 210. In this case, the user is not directly above the antennas 210.
  • an object at least partially occludes the radar system 106 and causes the radar system 106 to become saturated. Due to this occlusion, a majority of the radar transmit signal 326 is reflected by the object and does not propagate further in space.
  • the object can be an inanimate object or a portion of the user, such as the user’s hand or finger.
  • the radar system 106 is configured to recognize a touch input provided by the user based on the radar receive signal.
  • the touch-input recognition module 218 recognizes the touch input provided by the user, as shown in FIG. 4.
  • the touch input can include a tap-and-release input, a tap-and-hold input, a swipe input, a position-dependent input, a motion- dependent input, a duration-dependent input, a pressure-dependent input (e.g., a hard tap, a soft tap), or combinations thereof.
  • the touch input is provided to a device to cause the device to perform a function associated with the touch input.
  • the touch-input recognition module 218 provides touch-input data 416 to the user device 104 to cause the user device 104 to perform a function (or an action) associated with the touch input.
  • the user device 104 can present new content on the display, move a cursor on the display, activate or deactivate one or more components within the user device 104 (e.g., a camera, a wireless communication transceiver), open an application on the user device 104, change an operational mode of the user device 104 (e.g., transition between a low-power mode and a high-power mode), or scroll through visual content that is presented on the display according to the touch input.
  • FIG. 7 illustrates various components of an example computing system 700 that can be implemented as any type of client, server, and/or computing device as described with reference to the previous FIG. 2 to implement a radar-based touch interface 102.
  • the computing system 700 includes communication devices 702 that enable wired and/or wireless communication of device data 704 (e.g., received data, data that is being received, data scheduled for broadcast, data packets of the data).
  • the computing system 700 also includes one or more radar systems 106.
  • the device data 704 or other device content can include configuration settings of the device, media content stored on the device, and/or information associated with a user of the device.
  • Media content stored on the computing system 700 can include any type of audio, video, and/or image data.
  • the computing system 700 includes one or more data inputs 706 via which any type of data, media content, and/or inputs can be received, such as human utterances, user-selectable inputs (explicit or implicit), messages, music, television media content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source.
  • the computing system 700 also includes communication interfaces 708, which can be implemented as any one or more of a serial and/or parallel interface, a wireless interface, any type of network interface, a modem, and as any other type of communication interface.
  • the communication interfaces 708 provide a connection and/or communication links between the computing system 700 and a communication network by which other electronic, computing, and communication devices communicate data with the computing system 700.
  • the computing system 700 includes one or more processors 710 (e.g., any of microprocessors, controllers, and the like), which process various computer-executable instructions to control the operation of the computing system 700 and to enable techniques for, or in which can be embodied, gesture recognition in the presence of saturation.
  • the computing system 700 can be implemented with any one or combination of hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits which are generally identified at 712.
  • the computing system 700 can include a system bus or data transfer system that couples the various components within the device.
  • a system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
  • the computing system 700 also includes a computer-readable media 714, such as one or more memory devices that enable persistent and/or non-transitory data storage (i.e., in contrast to mere signal transmission), examples of which include random access memory (RAM), non-volatile memory (e.g., any one or more of a read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device.
  • the disk storage device may be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable compact disc (CD), any type of a digital versatile disc (DVD), and the like.
  • the computing system 700 can also include a mass storage media device (storage media) 716.
  • the computer-readable media 714 provides data storage mechanisms to store the device data 704, as well as various device applications 718 and any other types of information and/or data related to operational aspects of the computing system 700.
  • an operating system 720 can be maintained as a computer application within the computer-readable media 714 and executed on the processors 710.
  • the device applications 718 may include a device manager, such as any form of a control application, software application, signal-processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, and so on.
  • the device applications 718 also include any system components, engines, or managers to implement the radar-based touch interface 102.
  • the device applications 718 include the touch-input recognition module 218 of FIG. 2.
  • Example 1 A method performed by a radar-based touch interface included within a device, the radar-based touch interface including at least one radar system, the method comprising: transmitting a radar transmit signal using at least one transmit antenna of the radar system; receiving a radar receive signal using at least one receive antenna of the radar system, the radar receive signal comprising a reflected version of the radar transmit signal, the radar transmit signal reflected by a portion of a user that touches a location on the device; recognizing, based on the radar receive signal, a touch input provided by the user; and providing the touch input to the device to cause the device to perform a function associated with the touch input.
  • Example 2 The method of example 1, wherein: the recognizing of the touch input comprises using a touch-input recognition module of the at least one radar system for recognizing the touch input provided by the user; and the touch-input recognition module comprises at least one of: a saturation counter, a touch-input identification module, a position estimator, a motion estimator, an occlusion detection module, or a mode control module.
  • Example 3 The method of example 1 or 2, wherein the recognizing of the touch input comprises recognizing at least one of: a tap-and-release input; a tap-and-hold input; a swipe input; a position-dependent input; a motion-dependent input; a duration-dependent input; or a pressure-dependent input.
  • Example 4 The method of at least one of the preceding examples, wherein the function comprises at least one of: present new content on a display of the device; move a cursor on the display of the device; activate one or more components within the device; deactivate one or more other components within the device; open an application on the device; scroll through visual content that is presented on the display of the device; or adjust an operational mode of the device.
  • Example 5 The method of any preceding example, wherein the recognizing of the touch input comprises recognizing the touch input based on at least one of the following: an amount of saturation associated with the touch input; a position associated with the touch input; a motion associated with the touch input; or a duration of the touch input.
  • Example 6 The method of any preceding example, wherein the recognizing of the touch input further comprises: generating a digital beat signal based on the radar receive signal, the digital beat signal comprising multiple samples; measuring a percentage of the multiple samples of the digital beat signal that are saturated; and recognizing the touch input based on the percentage being greater than or equal to a threshold.
  • Example 7 The method of example 6, wherein: the receiving of the radar receive signal using the at least one receive antenna of the radar system comprises receiving the radar receive signal using two or more receive antennas that form a linear antenna array; the generating of the digital beat signal comprises generating two or more digital beat signals based on the two or more radar receive signals, respectively; and the recognizing of the touch input comprises: analyzing phase differences between the two or more digital beat signals to determine an angular position of the location; and recognizing the touch input based on the angular position and the percentage being greater than or equal to the threshold.
  • Example 8 The method of any preceding example, further comprising: accepting sensor data from a touch-sensitive sensor, the sensor data comprising information regarding another portion of the user that touches another location on the device, wherein the recognizing of the touch input comprises recognizing the touch input based on the radar receive signal and the sensor data.
  • Example 9 The method of any preceding example, further comprising: prior to transmitting the radar transmit signal, altering operation of the radar system to increase a saturation sensitivity of the radar system.
  • Example 10 The method of example 9, wherein the altering the operation of the radar system comprises at least one of the following: increasing a transmit power level of the radar system; increasing a gain of an amplifier of the radar system; decreasing a cutoff frequency of a high-pass filter of the radar system; or enabling a low-pass filter of the radar system.
  • Example 11 The method of any preceding example, further comprising: transmitting another radar transmit signal using the at least one transmit antenna of the radar system; receiving another radar receive signal using the at least one receive antenna of the radar system, the other radar receive signal comprising a reflected version of the other radar transmit signal, the other radar transmit signal reflected by an object that at least partially occludes the at least one transmit antenna or the at least one receive antenna; and determining, based on the other radar receive signal, that the radar system is at least partially occluded by the object.
  • Example 12 The method of example 11, wherein: the method further comprising, responsive to determining that the radar system is at least partially occluded by the object, disabling a radio-frequency component of the device that is adjacent to the radar system; and the radio-frequency component comprises at least one of: another radar system; or a wireless transceiver configured to transmit wireless communication signals.
  • Example 13 The method of example 11, further comprising: responsive to determining that the radar system is at least partially occluded by the object, altering operation of the radar system to reduce power consumption.
  • Example 14 The method of any preceding example, wherein the device comprises: a smartphone; a smart speaker; a smart thermostat; a smart watch; a gaming system; or a home appliance.
  • Example 15 The method of any preceding example, wherein: the radar system comprises multiple radar systems at different positions on the device, the multiple radar systems including a first radar system and a second radar system; the at least one receive antenna of the first radar system comprises multiple receive antennas that form a first linear antenna array; the at least one receive antenna of the second radar system comprises other multiple receive antennas that form a second linear antenna array; and the first linear antenna array of the first radar system is oriented ninety degrees with respect to the second linear antenna array of the second radar system.
  • Example 16 An apparatus comprising: a radar-based touch interface comprising at least one radar system, the radar-based touch interface configured to perform any of the methods of examples 1 to 15.
  • Example 17 A device with a radar-based touch interface comprising: a radar system comprising: at least one antenna; at least one radar transceiver coupled to the at least one antenna, the at least one antenna and the at least one radar transceiver jointly configured to transmit and receive radar signals; and a touch input recognition module configured to process the radar signals, the radar signals reflected by a user of the device, the radar-based touch interface operable to perform any of the methods of examples 1 to 15.
  • Example 18 The device of example 17, wherein the touch input recognition module comprises at least one of: a saturation counter; a position estimator; a motion estimator; an occlusion detection module; or a touch-input identification module.
  • Example 19 The device of example 17 or 18, wherein the device comprises: a smartphone; a smart speaker; a smart thermostat; a smart watch; a gaming system; or a home appliance.
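The saturation-counting recognition of Example 6 can be illustrated with a short sketch. This is a minimal, hypothetical model: the full-scale value, the 95% clipping test, and the 30% touch threshold are assumptions chosen for illustration, not values from the disclosure.

```python
import numpy as np

# Hypothetical ADC scale and thresholds, for illustration only; the actual
# saturation threshold 118 of a radar system 106 depends on its analog front end.
ADC_FULL_SCALE = 1.0
SATURATION_FRACTION = 0.95   # a sample within 5% of full scale counts as clipped
TOUCH_THRESHOLD = 0.30       # recognize a touch when >= 30% of samples saturate

def is_touch_input(beat_signal: np.ndarray) -> bool:
    """Recognize a touch input from a digital beat signal, per Example 6:
    count the samples pinned near full scale and compare against a threshold."""
    saturated = np.abs(beat_signal) >= SATURATION_FRACTION * ADC_FULL_SCALE
    return bool(saturated.mean() >= TOUCH_THRESHOLD)

# A heavily clipped sinusoid (finger over the antenna) versus a small clean one.
t = np.linspace(0.0, 1.0, 1000)
clipped = np.clip(3.0 * np.sin(2 * np.pi * 5.0 * t), -1.0, 1.0)
clean = 0.4 * np.sin(2 * np.pi * 5.0 * t)
```

Example 7 extends this test by also checking the angular position derived from phase differences across the receive array before declaring a touch.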

Abstract

Techniques and apparatuses are described that implement a radar-based touch interface (102). The radar-based touch interface (102) utilizes at least one radar system (106) to detect a touch input provided by a user. To recognize the touch input, the radar system (106) measures an amount of saturation that occurs due to a portion of the user being proximate to the radar system. The radar system (106) can further measure the angular position and range rate associated with the touch input to distinguish between different types of touch inputs. In some implementations, an operation of the radar system (106) can be altered to facilitate the detection of the touch input. For example, this operational mode can configure components within the radar system's (106) transceiver (212) to increase a probability of a radar signal becoming saturated. The radar system (106) can also distinguish between an intentional touch input and an occlusion.

Description

RADAR-BASED TOUCH INTERFACE
BACKGROUND
[0001] Some devices include a physical interface, which enables a user to interact with the device. For example, a device can include a physical button that the user can push or a physical switch that the user can flip back and forth. In general, a physical interface can be relatively easy to integrate within the device. Over time, however, the physical interface can experience a mechanical failure, which can cause the physical interface to become inoperable. Furthermore, the size of some physical interfaces can make it challenging to integrate within space-constrained devices.
[0002] To address this issue, some devices include a touch user interface (TUI), such as a touch screen. The touch screen can include resistive or capacitive sensors, which enable the touch screen to detect a user’s touch. There can be some disadvantages to using a touch screen, however. One such disadvantage is the fragility of a glass layer used to implement the touch screen. Sometimes, the glass layer can break, which can significantly degrade the performance of the touch screen. Additionally, some touch screens may be unable to detect a user’s touch if the user is wearing gloves. To address this, the user can operate the device without using gloves, which can be uncomfortable in cold weather, or operate the device using a particular type of glove that can interface with the touch screen. Compared to a physical interface, the touch screen can also further increase the complexity and cost of the device.
SUMMARY
[0003] Techniques and apparatuses are described that implement a radar-based touch interface. Instead of, or in addition to, using resistive-sensing or capacitive-sensing, the radar- based touch interface utilizes at least one radar system to detect a touch input provided by a user. The radar system can recognize a variety of different touch inputs, including a tap-and-release input, a tap-and-hold input, a swipe input, a position-dependent input, a motion-dependent input, a duration-dependent input, or a pressure-dependent input (e.g., a hard tap, a soft tap), or some combination thereof. To recognize the touch input, the radar system measures an amount of saturation that occurs due to a portion of the user being proximate to one or more antennas of the radar system. The radar system can further measure the position and motion associated with the touch input to distinguish between different types of touch inputs.
[0004] In some implementations, an operation of the radar system is altered to facilitate the detection of the touch input. For example, components within the radar system’s transceiver can be configured to increase a probability of a received radar signal becoming saturated. The radar system can also distinguish between an intentional touch input and an occlusion. In this way, the radar system can detect self-occlusion without relying on a nearby proximity sensor.
[0005] Aspects below include a method performed by a radar-based touch interface included within a device. The radar-based touch interface includes at least one radar system. The method includes transmitting a radar transmit signal using at least one transmit antenna of the radar system and receiving a radar receive signal using at least one receive antenna of the radar system. The radar receive signal comprises a reflected version of the radar transmit signal. The radar transmit signal is reflected by a portion of a user that touches a location on the device. The method additionally includes recognizing, based on the radar receive signal, a touch input provided by the user. The method further includes providing the touch input to the device to cause the device to perform a function associated with the touch input.
[0006] Aspects described below also include an apparatus comprising a radar-based touch interface with at least one radar system. Using the at least one radar system, the radar-based touch interface is configured to perform any of the methods described herein.
[0007] Aspects described below also include radar means for providing a touch interface.
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] Apparatuses and techniques for implementing a radar-based touch interface are described with reference to the following drawings. The same numbers are used throughout the drawings to reference like features and components:
FIG. 1 illustrates example environments in which a radar-based touch interface can operate;
FIG. 2 illustrates an example implementation of a radar-based touch interface as part of a user device;
FIG. 3 illustrates an example radar system of a radar-based touch interface;
FIG. 4 illustrates an example scheme implemented by a system processor of a radar system;
FIG. 5 illustrates example positions of multiple radar systems on a smartphone;
FIG. 6 illustrates an example method performed by a radar-based touch interface; and
FIG. 7 illustrates an example computing system embodying, or in which techniques may be implemented that enable use of, a radar-based touch interface.
DETAILED DESCRIPTION
Overview
[0009] Some devices include a physical interface, which enables a user to interact with the device. For example, a device can include a physical button that the user can push or a physical switch that the user can flip back and forth. In general, a physical interface can be relatively easy to integrate within the device. Over time, however, the physical interface can experience a mechanical failure, which can cause the physical interface to become inoperable. Furthermore, the size of some physical interfaces can make it challenging to integrate within space-constrained devices.
[0010] To address this issue, some devices include a touch user interface (TUI), such as a touch screen. The touch screen can include resistive or capacitive sensors, which enable the touch screen to detect a user’s touch. There can be some disadvantages to using a touch screen, however. One such disadvantage is the fragility of a glass layer used to implement the touch screen. Sometimes, the glass layer can break, which can significantly degrade the performance of the touch screen. Additionally, some touch screens may be unable to detect a user’s touch if the user is wearing gloves. To address this, the user can operate the device without using gloves, which can be uncomfortable in cold weather, or operate the device using a particular type of glove that can interface with the touch screen. Compared to a physical interface, the touch screen can also further increase the complexity and cost of the device.
[0011] In contrast, this document describes techniques and devices that implement a radar-based touch interface. Instead of, or in addition to, using resistive-sensing or capacitive-sensing, the radar-based touch interface utilizes at least one radar system to detect a touch input provided by a user. The radar system can recognize a variety of different touch inputs, including a tap-and-release input, a tap-and-hold input, a swipe input, a position-dependent input, a motion-dependent input, a duration-dependent input, or a pressure-dependent input (e.g., a hard tap-and-release input, a soft tap-and-release input), or some combination thereof. To recognize the touch input, the radar system measures an amount of saturation that occurs due to a portion of the user being proximate to one or more antennas of the radar system. The radar system can further measure the position and motion associated with the touch input to distinguish between different types of touch inputs.
[0012] In some implementations, an operation of the radar system is altered to facilitate the detection of the touch input. For example, components within the radar system’s transceiver can be configured to increase a probability of a received radar signal becoming saturated. The radar system can also distinguish between an intentional touch input and an occlusion. In this way, the radar system can detect self-occlusion without relying on a nearby proximity sensor.
Example Environment
[0013] FIG. 1 is an illustration of example environments in which techniques using, and an apparatus including, a radar-based touch interface 102 may be embodied. In the depicted environments 100-1, 100-2, and 100-3, a user device 104 includes the radar-based touch interface 102, which is capable of detecting a touch input provided by a user. Although the user device 104 is shown to be a smartphone in FIG. 1, the user device 104 can be implemented as any suitable computing or electronic device, as described in further detail with respect to FIG. 2.
[0014] In the environments 100-1 and 100-2, the user interacts with the user device 104 through the radar-based touch interface 102. In the environment 100-1, the user taps a position on the user device 104 to select an item displayed by the user device 104. This type of touch can represent a tap-and-release input in which the user’s finger is temporarily in contact with the user device 104. Alternatively, the user can provide a tap-and-hold input by keeping their finger in contact with the user device 104 for a longer period of time relative to the tap-and-release input. In the environment 100-2, the user slides their thumb up or down along a side of the user device 104 to provide a swipe input. The radar-based touch interface 102 detects this action and directs the user device 104 to scroll through information that is presented on the display.
[0015] The radar-based touch interface 102 can also recognize other types of touch inputs that are not shown. For example, the radar-based touch interface 102 can recognize a position- dependent input, which involves the user performing a tap or a swipe at a particular position on the user device 104. In this way, a first tap-and-release input performed at a first position can cause the user device 104 to perform a different function than a second tap-and-release input performed at a second position that is different than the first position. Another type of touch input can be a motion-dependent input, which involves the user performing a tap or a swipe at a particular speed or along a particular direction. For example, a right swipe can cause the user device 104 to perform a different function than a left swipe. As another example, a slow swipe can cause the user device 104 to perform a different function than a fast swipe.
[0016] Additionally or alternatively, the touch input can be a duration-dependent input, which involves the user performing the tap or swipe for a particular duration. The duration-dependent input enables the radar-based touch interface 102 to recognize different types of tap-and-hold inputs or different types of swipe inputs that are held for different durations. Another type of touch input can include a pressure-dependent input, which involves the user performing a tap or a swipe with different amounts of pressure. For example, the radar-based touch interface 102 can distinguish between a hard tap-and-release input and a soft tap-and-release input. In this case, the user performs the hard tap-and-release input with greater force than the soft tap-and-release input.
Other types of touch inputs can be some combination of the types of inputs described above (e.g., a combination of a tap-and-release input, a position-dependent input, a motion-dependent input, and a duration-dependent input; or a combination of multiple tap-and-release inputs).
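The decision logic implied by these input types can be sketched as follows. All attribute names and thresholds are hypothetical, chosen only to show how position, motion, and duration measurements could be combined; a real touch-input recognition module 218 would derive them from the radar measurements described later.

```python
from dataclasses import dataclass

@dataclass
class TouchObservation:
    """Attributes a radar system might report for one detected touch.
    Field names and units here are illustrative, not from the patent."""
    angular_position_deg: float   # where on the device the touch occurred
    range_rate_mps: float         # motion of the finger (sign gives direction)
    duration_s: float             # how long the saturation persisted

def classify_touch(obs: TouchObservation) -> str:
    """Illustrative decision rules for the input types described above."""
    if abs(obs.range_rate_mps) > 0.05:          # sustained motion -> swipe
        direction = "up" if obs.range_rate_mps > 0 else "down"
        return f"swipe-{direction}"
    if obs.duration_s >= 0.5:                   # long contact -> tap-and-hold
        return "tap-and-hold"
    return "tap-and-release"
```

A position-dependent input would additionally branch on `angular_position_deg`, so that the same tap at two different locations maps to two different device functions.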
[0017] Upon detecting any of these touch inputs, the radar-based touch interface 102 provides the touch input to the user device 104. This causes (e.g., prompts) the user device 104 to perform an action, such as present new content on the display, move a cursor on the display, activate or deactivate one or more components within the user device 104 (e.g., a camera, a wireless communication transceiver), open an application on the user device 104, change an operational mode of the user device 104 (e.g., transition between a low-power mode and a high- power mode), scroll through visual content that is presented on the display, and so forth. In this way, the radar-based touch interface 102 can replace other types of touch-sensitive sensors, such as a physical button or switch, a resistance-sensing sensor, or a capacitance-sensing sensor.
[0018] The radar-based touch interface 102 includes at least one radar system 106. In some situations, the user does not interact with the radar-based touch interface 102. In this case, the user can be at a sufficiently far distance from the radar system 106 or can have a sufficiently small radar cross section such that the radar system 106 is not saturated. In this case, the radar system 106 generates a non-saturated receive signal 110, as shown in a graph 112 at the bottom right of FIG. 1. As the radar system 106 is not saturated, signal clipping does not occur and the non-saturated receive signal 110 is a sinusoidal signal having a non-clipped amplitude. Characteristics of the non-saturated receive signal 110 can therefore be directly analyzed by the radar system 106 for radar-based applications, such as presence detection, gesture recognition, vital-sign detection, or collision avoidance.
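The sinusoidal versus non-sinusoidal distinction can be demonstrated numerically. The sketch below uses illustrative amplitudes and frequencies (chosen for clarity, not taken from the disclosure): clipping a sinusoid symmetrically introduces odd-harmonic energy, which is what makes a saturated receive signal non-sinusoidal.

```python
import numpy as np

fs = 1000.0
t = np.arange(0.0, 1.0, 1.0 / fs)         # 1 s of samples
tone = np.sin(2 * np.pi * 50.0 * t)       # non-saturated receive signal (cf. 110)
clipped = np.clip(2.5 * tone, -1.0, 1.0)  # saturated receive signal (cf. 114)

def third_harmonic_ratio(x: np.ndarray) -> float:
    """Magnitude at the 3rd harmonic (150 Hz) relative to the fundamental
    (50 Hz). With 1 s of data, FFT bin k corresponds to k Hz exactly."""
    spectrum = np.abs(np.fft.rfft(x))
    return float(spectrum[150] / spectrum[50])
```

For the clean tone the ratio is essentially zero; for the clipped signal a substantial fraction of the fundamental's energy appears at the third harmonic.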
[0019] In other situations, the user interacts with the radar-based touch interface 102. In this case, at least a portion of the user is at a sufficiently close distance to the radar system 106 or has a sufficiently large radar cross section such that the radar system 106 is saturated. Without an automatic gain control circuit to automatically adjust transmission power to avoid the saturation, signal clipping occurs and the radar system 106 generates a saturated receive signal 114, as shown in a graph 116 at the bottom left of FIG. 1. Due to the signal clipping, the saturated receive signal 114 is a non-sinusoidal signal. More specifically, the signal clipping causes an amplitude of the saturated receive signal 114 to be constrained within a saturation threshold 118 of the radar system 106. In other words, at least a portion of the amplitude of the saturated receive signal 114 is relatively constant and does not change based on an amplitude of a reflected radar signal. Although the signal clipping can make it challenging for the radar system 106 to support the radar-based applications, the radar system 106 analyzes the amount of saturation present within the saturated receive signal 114 to recognize touch inputs for the radar-based touch interface 102 or perform occlusion detection.
[0020] In the environments 100-1, 100-2, and 100-3, the radar system 106 can detect an occlusion, such as a portion of the user in environments 100-1 and 100-2 that at least partially occludes (e.g., covers) one or more antennas of the radar system 106. In the environment 100-3, the radar system 106 also detects an occlusion. In this case, the user’s purse occludes one or more antennas of the radar system 106. In these example environments 100-1 to 100-3, the occlusion can degrade the performance of the radar system 106 by preventing a majority of a transmitted radar signal from propagating further in space beyond the occlusion.
Similar to the user interaction with the radar-based touch interface 102, the occlusion also causes the radar system 106 to generate a saturated receive signal 114. Responsive to detecting the occlusion, the radar system 106 can transition to a low-power state to conserve power. In some cases, the radar system 106 can be in an off state for a predetermined amount of time. In other cases, the radar system 106 can continue to operate in order to detect whether or not the occlusion has been removed. The user device 104 and the radar-based touch interface 102 are further described with respect to FIG. 2.
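The occlusion-driven power behavior described above can be sketched as a small state machine. The state names and transition rules below are illustrative assumptions, not the patent's implementation:

```python
import enum

class RadarMode(enum.Enum):
    ACTIVE = "active"          # normal touch-input recognition
    LOW_POWER = "low-power"    # occlusion detected; periodic re-checks only
    OFF = "off"                # disabled for a predetermined interval

def next_mode(current: RadarMode, occluded: bool) -> RadarMode:
    """Illustrative occlusion-driven mode transitions. An occluded radar
    cannot see past the obstruction, so it conserves power while keeping
    enough duty cycle to notice when the occlusion is removed."""
    if occluded:
        return RadarMode.LOW_POWER
    if current is RadarMode.LOW_POWER:
        return RadarMode.ACTIVE    # occlusion removed; resume full operation
    return current
```

A variant matching the "off state for a predetermined amount of time" case would enter `RadarMode.OFF` with a timer instead of continuing to poll.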
Example Radar-Based Touch Interface
[0021] FIG. 2 illustrates the radar-based touch interface 102 as part of the user device 104. The user device 104 is illustrated with various non-limiting example devices including a desktop computer 104-1, a tablet 104-2, a laptop 104-3, a television 104-4, a computing watch 104-5 (e.g., a smart watch), computing glasses 104-6, a gaming system 104-7, a home appliance (e.g., microwave) 104-8, and a vehicle 104-9. Other devices may also be used, such as a home service device, a smart speaker, a smart thermostat, a security camera, a baby monitor, a Wi-Fi™ router, a drone, a trackpad, a drawing pad, a netbook, an e-reader, a home-automation and control system, a wall display, a virtual reality headset, and another home appliance. Note that the user device 104 can be wearable, non-wearable but mobile, or relatively immobile (e.g., desktops, appliances). The radar-based touch interface 102 can be used with, or embedded within, many different user devices 104 or peripherals, such as in control panels that control home appliances and systems, in automobiles to control internal functions (e.g., volume, cruise control, driving of the car), or as an attachment to a laptop computer to control computing applications on the laptop.
[0022] The user device 104 includes at least one computer processor 202 and at least one computer-readable medium 204, which includes memory media and storage media. Applications and/or an operating system (not shown) embodied as computer-readable instructions on the computer-readable medium 204 can be executed by the computer processor 202 to provide some of the functionalities described herein. The computer-readable medium 204 also includes applications (not shown), which perform a function based on data provided by the radar-based touch interface 102 or the radar system 106. Example applications can include radar-based applications, which can utilize information provided by the radar system 106 for collision avoidance or touch-free control of the user device 104. Other example applications perform a function responsive to the radar-based touch interface 102 recognizing a touch input provided by a user.
[0023] The user device 104 can also include a display 206. The display 206 can present different types of information to the user based on the recognized touch input. The user device 104 can also include a network interface 208 for communicating data over wired, wireless, or optical networks. For example, the network interface 208 may communicate data over a local-area-network (LAN), a wireless local-area-network (WLAN), a personal-area-network (PAN), a wide-area-network (WAN), an intranet, the Internet, a peer-to-peer network, a point-to-point network, a mesh network, and the like.
[0024] The radar-based touch interface 102 includes at least one radar system 106. In some implementations, the radar-based touch interface 102 includes multiple radar systems 106, which can be located at different positions on the user device 104 (e.g., within an interior of the user device 104, mounted to an exterior surface of the user device 104). The radar system 106 can be implemented on a single integrated circuit or distributed across multiple integrated circuits.
[0025] The radar system 106 includes at least one antenna 210 and at least one transceiver 212. In some cases, the radar system 106 includes a single antenna 210 coupled to a single transceiver 212, which can together transmit and receive radar signals to implement a pulse- Doppler radar. In other cases, the radar system 106 includes at least one antenna 210 coupled to a transmitter of the transceiver 212 and at least one other antenna 210 coupled to a receiver of the transceiver 212 to implement a continuous-wave radar. The antenna 210 can be circularly polarized, horizontally polarized, or vertically polarized. The antenna 210 can be implemented together with the transceiver 212 on a same integrated circuit or implemented separate from the integrated circuit that includes the transceiver 212.
[0026] In some implementations, the radar system 106 includes multiple antennas 210, which represent antenna elements of one or more antenna arrays. An antenna array enables the radar system 106 to use analog or digital beamforming techniques during transmission and/or reception to improve the sensitivity and angular resolution. Consider an example in which the radar system 106 includes an antenna 210 for transmission, and multiple antennas 210, which form receive antenna elements of an antenna array, for reception. The receive antenna elements can be positioned to form a one-dimensional shape (e.g., a line) or a two-dimensional shape (e.g., a rectangular arrangement, a triangular arrangement, an “L” shape arrangement) for implementations that include three or more receive antenna elements. The one-dimensional shape enables the radar system 106 to measure one angular dimension (e.g., an azimuth, an elevation) while the two-dimensional shape enables the radar system 106 to measure two angular dimensions (e.g., both azimuth and elevation). An element spacing associated with the receive antenna elements can be less than, greater than, or equal to half a center wavelength of the radar signal.
[0027] Using one or more antennas 210, the radar system 106 can form beams that are steered or un-steered, wide or narrow, or shaped (e.g., hemisphere, cube, fan, cone, cylinder). The steering and shaping can be achieved through analog beamforming or digital beamforming. In some implementations, the radar system 106 can have, for instance, an un-steered omnidirectional radiation pattern or can produce a wide steerable beam to illuminate a large volume of space during transmission. To achieve target angular accuracies and angular resolutions, the radar system 106 can use multiple antennas 210 to generate hundreds or thousands of narrow steered beams with digital beamforming during reception. In this way, the radar system 106 can efficiently monitor an external environment and detect one or more users.
[0028] The transceiver 212 includes circuitry and logic for transmitting and/or receiving radar signals via the antenna 210. Components of the transceiver 212 can include amplifiers, mixers, switches, analog-to-digital converters, digital-to-analog converters, or filters for conditioning the radar signals. The transceiver 212 also includes logic to perform in-phase/quadrature (I/Q) operations, such as modulation or demodulation. A variety of modulations can be used, including linear frequency modulations, triangular frequency modulations, stepped frequency modulations, or phase modulations. Alternatively, the transceiver 212 can produce radar signals having a relatively constant frequency or a single tone. The transceiver 212 can be configured to support continuous-wave or pulsed radar operations.
[0029] A frequency spectrum (e.g., range of frequencies) that the transceiver 212 uses to generate the radar signals can encompass frequencies between 1 and 400 gigahertz (GHz), between 4 and 100 GHz, between 1 and 24 GHz, between 2 and 4 GHz, between 57 and 64 GHz, or at approximately 2.4 GHz. In some cases, the frequency spectrum can be divided into multiple sub-spectrums that have similar or different bandwidths. The bandwidths can be on the order of 500 megahertz (MHz), 1 GHz, 2 GHz, and so forth. Different frequency sub-spectrums may include, for example, frequencies between approximately 57 and 59 GHz, 59 and 61 GHz, or 61 and 63 GHz. Although the example frequency sub-spectrums described above are contiguous, other frequency sub-spectrums may not be contiguous. To achieve coherence, multiple frequency sub-spectrums (contiguous or not) that have a same bandwidth may be used by the transceiver 212 to generate multiple radar signals, which are transmitted simultaneously or separated in time. In some situations, multiple contiguous frequency sub-spectrums may be used to transmit a single radar signal, thereby enabling the radar signal to have a wide bandwidth. The transceiver 212 is further described with respect to FIG. 3.
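The benefit of combining contiguous sub-spectrums into one wide-bandwidth radar signal can be quantified with the standard range-resolution relation c/(2B) — a general radar formula, not one stated in this disclosure:

```python
C = 3e8  # speed of light, m/s

def range_resolution_m(bandwidth_hz: float) -> float:
    """Radar range resolution c / (2B): a wider swept bandwidth resolves
    closer-spaced reflections, which is why contiguous sub-spectrums can
    usefully be combined into a single wide-bandwidth radar signal."""
    return C / (2.0 * bandwidth_hz)
```

For the bandwidths mentioned above, a 500 MHz signal resolves reflections about 0.3 m apart, while a 2 GHz signal resolves reflections about 7.5 cm apart.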
[0030] The radar system 106 can also include at least one system processor 214 and at least one system medium 216 (e.g., one or more computer-readable storage media). The system processor 214 can be implemented together with the transceiver 212 on a same integrated circuit or implemented on a different integrated circuit that is separate from the integrated circuit with the transceiver 212. The system processor 214 executes instructions stored within the system medium 216 to analyze information provided by the transceiver 212. For example, the system processor 214 can perform Fourier Transform (FT) operations, presence detection, gesture recognition, collision avoidance, or vital-sign detection.
[0031] The radar-based touch interface 102 includes a touch-input recognition module 218, which can be implemented using hardware, software, firmware, or a combination thereof. In FIG. 2, the touch-input recognition module 218 is shown as instructions stored within the system medium 216. In this case, the system processor 214 can implement the touch-input recognition module 218. The touch-input recognition module 218 recognizes a touch input provided by the user and provides the recognized touch input to the computer processor 202 or an application executed by the computer processor 202. This causes the user device 104 to perform a function or operation according to the touch input.
[0032] In some implementations, the touch-input recognition module 218 can perform additional functions, such as controlling an operational mode of the radar system 106 or distinguishing between a touch input and an occlusion. For example, the touch-input recognition module 218 can tailor the operation of components within the transceiver 212 to increase the saturation sensitivity of the radar system 106, which increases a probability of the radar system 106 becoming saturated. In other cases, the touch-input recognition module 218 can selectively alter the operational mode of the radar system 106 to enable the radar system 106 to conserve power during situations in which an occlusion is detected. The touch-input recognition module 218 is further described with respect to FIG. 4.
[0033] The user device 104 can also include at least one touch-sensitive sensor 220, such as a sensor that senses changes in resistance or capacitance. As an example, the touch-sensitive sensor 220 can include a touch screen of the user device 104. In some implementations, the radar-based touch interface 102 and the touch-sensitive sensor 220 operate together to detect additional types of touch inputs that involve both the radar-based touch interface 102 and the touch-sensitive sensor 220. Operation of the radar system 106 is further described with respect to FIG. 3.
[0034] FIG. 3 illustrates an example radar system 106 of the radar-based touch interface 102. In the depicted configuration, the radar system 106 implements a frequency-modulated continuous-wave radar. However, other types of radar architectures can be implemented, as described above with respect to FIG. 2. The transceiver 212 of the radar system 106 includes at least one transmitter 302 and at least one receiver 304. The transmitter 302 includes at least one voltage-controlled oscillator (VCO) 306 and at least one power amplifier (PA) 308. The receiver 304 includes one or more receive channels 310-1 to 310-M, where M is a positive integer. Each receive channel 310-1 to 310-M includes at least one low-noise amplifier (LNA) 312, at least one mixer 314, at least one filter 316, and at least one analog-to-digital converter (ADC) 318.
[0035] The radar system 106 also includes multiple antennas 210, which include at least one transmit antenna 320 and at least two receive antennas 322-1 to 322-M. The transmit antenna 320 is coupled to the transmitter 302. The receive antennas 322-1 to 322-M form an antenna array, such as a linear antenna array, and are respectively coupled to the receive channels 310-1 to 310-M. Although the radar system 106 of FIG. 3 is shown to include multiple receive antennas 322-1 to 322-M and multiple receive channels 310-1 to 310-M, other implementations can include a single receive antenna 322 and a single receive channel 310.
[0036] During transmission, the voltage-controlled oscillator 306 generates a frequency-modulated radar signal 324 at radio frequencies. The frequency-modulated radar signal 324 can include a sequence of chirps that are transmitted in a continuous burst or as time-separated pulses. A duration of each chirp can be on the order of tens or thousands of microseconds (e.g., between approximately 40 microseconds (μs) and 5 milliseconds (ms)), for instance.
[0037] Individual frequencies of the chirps can increase or decrease over time. As an example, the radar system 106 employs a two-slope cycle (e.g., triangular frequency modulation) to linearly increase and linearly decrease the frequencies of the chirps over time. The two-slope cycle enables the radar system 106 to measure the Doppler frequency shift caused by the motion of a user (or object). In general, transmission characteristics of the chirps (e.g., bandwidth, center frequency, duration, and transmit power) can be tailored to achieve a particular detection range, range resolution, or Doppler sensitivity for detecting one or more characteristics of the user or one or more actions performed by the user.
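The two-slope (triangular) frequency modulation described above can be sketched numerically. The parameter values below (a 60 GHz center frequency, 2 GHz bandwidth, and 100 μs cycle) are illustrative assumptions for this sketch and are not values specified by the disclosure:

```python
import numpy as np

# Illustrative chirp parameters (assumptions, not from the disclosure).
F_CENTER = 60e9      # center frequency (Hz)
BANDWIDTH = 2e9      # chirp bandwidth (Hz)
CYCLE = 100e-6       # duration of one up/down two-slope cycle (s)

def triangular_chirp_frequency(t):
    """Instantaneous frequency of a triangular (two-slope) FM chirp.

    The frequency ramps linearly up during the first half of the cycle
    and linearly down during the second half, matching the two-slope
    cycle used to measure Doppler frequency shifts.
    """
    phase = (t % CYCLE) / CYCLE              # position within the cycle [0, 1)
    tri = 2 * np.minimum(phase, 1 - phase)   # triangle wave in [0, 1]
    return F_CENTER - BANDWIDTH / 2 + BANDWIDTH * tri

t = np.linspace(0, CYCLE, 1001)
f = triangular_chirp_frequency(t)
print(f.min() / 1e9, f.max() / 1e9)  # sweeps between 59 and 61 GHz
```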
[0038] The power amplifier 308 amplifies the frequency-modulated radar signal 324 for transmission via the transmit antenna 320. The transmitted frequency-modulated radar signal 324 is represented by a radar transmit signal 326. At least a portion of the radar transmit signal 326 is reflected by an object (e.g., the user). This reflected portion represents a radar receive signal 328. Although not explicitly shown in FIG. 3, the radar receive signal 328 represents the collection of radar receive signals 328-1 to 328-M. An amplitude of the radar receive signal 328 is smaller than an amplitude of the radar transmit signal 326 due to losses incurred during propagation and reflection.

[0039] At the radar system 106, the radar receive signal 328 represents a delayed version of the radar transmit signal 326. The amount of delay is proportional to a slant range (e.g., distance) from the radar system 106 to the user. In particular, this delay represents a summation of a time it takes for the radar transmit signal 326 to propagate from the transmit antenna 320 to the object and a time it takes for the radar receive signal 328 to propagate from the object to a receive antenna 322. If the object and/or the radar system 106 is moving, the radar receive signal 328 is shifted in frequency relative to the radar transmit signal 326 due to the Doppler effect. In other words, characteristics of the radar receive signal 328 are dependent upon motion of the object and/or motion of the radar system 106. Similar to the radar transmit signal 326, the radar receive signal 328 is composed of one or more chirps.
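The delay-to-range relationship in paragraph [0039] follows directly from round-trip propagation time. A minimal sketch, assuming co-located transmit and receive antennas so the delay is twice the one-way distance:

```python
C = 3e8  # speed of light (m/s)

def round_trip_delay(slant_range_m):
    """Round-trip propagation delay for a reflector at the given slant range.

    The delay sums the out-and-back propagation times (transmit antenna ->
    object -> receive antenna), approximated here as 2 * R / c.
    """
    return 2 * slant_range_m / C

# A touch input happens millimeters from the antennas, so the delay is tiny:
print(round_trip_delay(0.005))  # 5 mm slant range -> roughly 33 picoseconds
```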
[0040] During reception, the radar system 106 receives and processes the radar receive signal 328. In particular, the receive antennas 322-1 to 322-M receive respective versions of the radar receive signal 328, which are represented by radar receive signals 328-1 to 328-M. In general, relative phase differences between these versions of the radar receive signals 328-1 to 328-M are due to differences in locations of the receive antennas 322-1 to 322-M. Within each receive channel 310-1 to 310-M, the low-noise amplifier 312 amplifies the radar receive signal 328, and the mixer 314 mixes the amplified radar receive signal 328 with the frequency-modulated radar signal 324. In particular, the mixer 314 performs a beating operation, which downconverts and demodulates the radar receive signal 328 using the frequency-modulated radar signal 324 to generate a beat signal 330.
[0041] A frequency of the beat signal 330 represents a frequency difference between the frequency-modulated radar signal 324 and the radar receive signal 328, which is proportional to the slant range to the object. Although not shown, the beat signal 330 can include multiple frequencies, which represent reflections from different portions of the object (e.g., different fingers of a user, different portions of a user’s hand, different body parts of a user). In some cases, these different portions move at different speeds, move in different directions, or are positioned at different slant ranges relative to the radar system 106.
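The proportionality between beat frequency and slant range stated in paragraph [0041] can be made concrete with the standard linear-FMCW relation (beat frequency = chirp slope × round-trip delay). The chirp parameters below are illustrative assumptions, not values from the disclosure:

```python
C = 3e8  # speed of light (m/s)

def beat_frequency_to_range(f_beat_hz, bandwidth_hz, chirp_duration_s):
    """Slant range implied by a beat frequency for a linear FMCW chirp.

    Standard FMCW relation: the beat frequency equals the chirp slope
    (B / T) times the round-trip delay (2R / c), so R = c * f_beat * T / (2B).
    """
    slope = bandwidth_hz / chirp_duration_s  # Hz per second
    delay = f_beat_hz / slope                # round-trip delay (s)
    return C * delay / 2

# 2 GHz bandwidth swept over a 100 us chirp; a 10 kHz beat tone
# corresponds to a reflector 7.5 cm away.
print(beat_frequency_to_range(10e3, 2e9, 100e-6))
```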
[0042] The filter 316 filters the beat signal 330, and the analog-to-digital converter 318 digitizes the filtered beat signal 330. The receive channels 310-1 to 310-M respectively generate digital beat signals 332-1 to 332-M. In some situations, one or more of the digital beat signals 332-1 to 332-M represent the saturated receive signal 114 of FIG. 1. This can occur in situations in which the user interacts with the radar-based touch interface 102 or an object occludes the radar system 106. The digital beat signals 332-1 to 332-M are provided to the system processor 214 for processing, as further described with respect to FIG. 4.

[0043] FIG. 4 illustrates an example scheme implemented by the system processor 214. In the depicted configuration, the system processor 214 is coupled to the receive channels 310-1 to 310-M of the transceiver 212. The system processor 214 implements the touch-input recognition module 218, which can analyze the digital beat signals 332-1 to 332-M to recognize a touch input provided by a user. The touch-input recognition module 218 includes a saturation counter 402 and a touch-input identification (ID) module 404. Additionally, the touch-input recognition module 218 can optionally include a position and/or motion estimator 406, an occlusion detection module 408, a mode-control module 410, or some combination thereof.
[0044] During operation, the saturation counter 402 accepts the digital beat signals 332-1 to 332-M and determines the amount of saturation present within each of these signals. For example, the saturation counter 402 can count the number of samples that are saturated (e.g., the number of samples whose amplitudes are clipped) and determine the percentage of saturated samples within each digital beat signal 332-1 to 332-M. This percentage can be provided as saturation data 412 to the touch-input identification module 404.
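The counting step performed by the saturation counter 402 can be sketched as follows. The full-scale value and the clipping test are assumptions about a generic ADC; the disclosure does not specify them:

```python
import numpy as np

# Assumed ADC full scale for this sketch; real limits depend on the
# analog-to-digital converter in the receive channel.
ADC_FULL_SCALE = 1.0

def saturation_percentage(samples, full_scale=ADC_FULL_SCALE):
    """Percentage of samples whose amplitudes are clipped at full scale.

    Mirrors the saturation counter's role: count the clipped samples
    within a digital beat signal and report them as a percentage.
    """
    samples = np.asarray(samples)
    clipped = np.abs(samples) >= full_scale
    return 100.0 * np.count_nonzero(clipped) / samples.size

# A beat signal whose last quarter rails at full scale:
signal = np.concatenate([0.5 * np.sin(np.linspace(0, 20, 300)),
                         np.full(100, ADC_FULL_SCALE)])
print(saturation_percentage(signal))  # 25.0
```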
[0045] The touch-input identification module 404 determines whether or not the saturation data 412 indicates that a user provided a touch input. For example, the touch-input identification module 404 compares the percentage of saturated samples to a predetermined saturation threshold. If the percentage of saturated samples is greater than or equal to the predetermined saturation threshold, the touch-input identification module 404 determines that the user provided a touch input. Alternatively, if the percentage of saturated samples is less than the predetermined saturation threshold, the touch-input identification module 404 determines that the user has not provided a touch input.
[0046] The touch-input identification module 404 can monitor the saturation data 412 over time to distinguish between different types of touch inputs. The tap-and-release input, for instance, can cause the digital beat signals 332-1 to 332-M to be saturated for a shorter time period relative to the tap-and-hold input. The touch-input identification module 404 can also distinguish between different pressure-sensitive inputs based on the saturation data 412. A soft (e.g., small force) tap-and-release input can have a smaller percentage of saturated samples compared to a hard (e.g., large force) tap-and-release input. This occurs due to the difference in the amount of area that the user’s finger spreads over as a result of the difference in the amount of applied force. During the soft tap-and-release input, for instance, the user’s finger can touch a smaller area on the surface of the user device 104. In contrast, the user’s finger can spread over a larger area on the surface of the user device 104 during a hard tap-and-release input in which the user is applying a larger amount of force. In general, increasing the area covered by the user’s finger increases the quantity of saturated samples within the digital beat signals 332-1 to 332-M. In some cases, the touch-input identification module 404 analyzes additional information to recognize the type of touch input provided by the user.
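A minimal sketch of the duration-based and pressure-based discrimination described above. The per-frame representation, the thresholds, and the label names are all illustrative assumptions:

```python
# Illustrative thresholds; the disclosure does not specify numeric values.
SATURATION_THRESHOLD = 50.0   # percent of clipped samples => touch present
HOLD_FRAMES = 10              # frames of saturation => "hold" vs "release"
HARD_TAP_THRESHOLD = 80.0     # higher peak clipping => harder press

def classify_touch(saturation_history):
    """Classify a touch input from per-frame saturation percentages.

    A touch is present when the saturation percentage meets the threshold;
    its duration separates tap-and-release from tap-and-hold, and its peak
    level separates soft from hard presses (a harder press spreads the
    finger over more area, clipping more samples).
    """
    touched = [p for p in saturation_history if p >= SATURATION_THRESHOLD]
    if not touched:
        return "no-touch"
    duration = "hold" if len(touched) >= HOLD_FRAMES else "release"
    pressure = "hard" if max(touched) >= HARD_TAP_THRESHOLD else "soft"
    return f"{pressure} tap-and-{duration}"

print(classify_touch([10, 60, 65, 12]))        # soft tap-and-release
print(classify_touch([10] + [90] * 12 + [5]))  # hard tap-and-hold
```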
[0047] The position and/or motion estimator 406, for instance, can accept the digital beat signals 332-1 to 332-M and generate position and/or motion data 414, which can further characterize a touch input. For example, the position and/or motion estimator 406 can generate range-Doppler maps based on the digital beat signals 332-1 to 332-M. The range-Doppler maps include amplitude and phase information for a set of range bins and a set of Doppler bins. In some cases, the position and/or motion estimator 406 can use digital beamforming techniques to generate range-azimuth-elevation maps based on the digital beat signals 332-1 to 332-M. The range-azimuth-elevation maps include amplitude and phase information for a set of range bins, a set of azimuth bins, and a set of elevation bins. The position and/or motion estimator 406 can provide the range-Doppler maps and/or the range-azimuth-elevation maps as the position and/or motion data 414.
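Generating a range-Doppler map from digital beat signals is conventionally done with a two-dimensional FFT. The sketch below assumes that standard processing chain; the disclosure does not detail the estimator's internals, and windowing and calibration are omitted:

```python
import numpy as np

def range_doppler_map(beat_frames):
    """Range-Doppler map from a stack of digitized beat-signal frames.

    beat_frames: 2-D array with shape (num_chirps, samples_per_chirp).
    A fast-time FFT across each chirp resolves range bins; a slow-time
    FFT across chirps resolves Doppler bins.
    """
    range_fft = np.fft.fft(beat_frames, axis=1)   # fast time -> range bins
    doppler_fft = np.fft.fft(range_fft, axis=0)   # slow time -> Doppler bins
    return np.fft.fftshift(doppler_fft, axes=0)   # center zero Doppler

# A stationary reflector: the same beat tone in every chirp lands in a
# single range bin on the zero-Doppler row.
chirps, samples = 16, 64
tone = np.exp(2j * np.pi * 5 * np.arange(samples) / samples)
frames = np.tile(tone, (chirps, 1))
rd = np.abs(range_doppler_map(frames))
doppler_bin, range_bin = np.unravel_index(np.argmax(rd), rd.shape)
print(doppler_bin, range_bin)  # 8 5 (zero-Doppler row after shift, range bin 5)
```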
[0048] The touch-input ID module 404 analyzes the position and/or motion data 414 to assist with recognizing various types of touch inputs. For example, the touch-input ID module 404 can determine a range associated with the touch input and confirm that the range is approximately equal to a distance between the antennas 210 of the radar system 106 and an exterior surface of the user device 104. Over time, the touch-input ID module 404 can also observe changes in the range rate associated with the touch input by analyzing the range-Doppler maps. For example, the touch-input ID module 404 can recognize a tap input by observing a detection within the range-Doppler maps having a relatively large negative range rate as the user’s finger approaches the radar-based touch interface 102 to touch the user device 104 and a relatively large positive range rate as the user’s finger moves away after tapping the user device 104. In contrast, the touch-input ID module 404 can recognize a swipe input by observing a detection within the range-Doppler maps having a range rate that changes by a relatively small amount compared to the tap-and-release input as the user slides their finger across the user device 104.
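The range-rate signatures described above (a large negative then large positive range rate for a tap, a persistently small range rate for a swipe) can be tested with a simple rule. The threshold value is an illustrative assumption:

```python
def classify_motion(range_rates, tap_rate=0.5):
    """Distinguish a tap from a swipe using per-frame range rates (m/s).

    A tap shows a large negative range rate (finger approaching) followed
    by a large positive one (finger retreating); a swipe keeps the range
    rate small as the finger slides across the surface. The tap_rate
    threshold is an assumption for illustration.
    """
    has_approach = any(r <= -tap_rate for r in range_rates)
    has_retreat = any(r >= tap_rate for r in range_rates)
    if has_approach and has_retreat:
        return "tap"
    if all(abs(r) < tap_rate for r in range_rates):
        return "swipe"
    return "unknown"

print(classify_motion([-1.2, -0.8, 0.9, 1.1]))    # tap
print(classify_motion([0.05, -0.1, 0.08, 0.02]))  # swipe
```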
[0049] If the position and/or motion data 414 includes the range-azimuth-elevation maps, the touch-input ID module 404 can recognize different types of position-dependent inputs and/or motion-dependent inputs. The touch-input ID module 404 can also determine the direction the swipe input travels (e.g., left, right, up, down). In general, the touch-input ID module 404 can combine information from the saturation data 412 and the position and/or motion data 414 to recognize different types of touch inputs.
[0050] The touch-input ID module 404 generates touch-input data 416, which identifies the touch input recognized by the touch-input recognition module 218. The touch-input data 416 can be provided to the mode-control module 410 and/or other components of the user device 104, such as the computer processor 202 (of FIG. 2). This can prompt (e.g., cause) the user device 104 to perform an action or execute a function according to the touch input identified by the touch- input data 416.
[0051] In some implementations that include the touch-sensitive sensor 220, the touch- input ID module 404 can accept sensor data 418 from the touch-sensitive sensor 220. The sensor data 418 provides information regarding a touch input that is detected and recognized by the touch-sensitive sensor 220. In this manner, the touch-input ID module 404 can recognize a multi- touch input that involves both a touch input at the touch-sensitive sensor 220 and a touch input at the radar-based touch interface 102. An example multi -touch input can include a tap-and-hold input via the radar-based touch interface 102 and a tap-and-release input via the touch-sensitive sensor 220. Another example multi-touch input can include a swipe input via the radar-based touch interface 102 and a tap-and-hold input via the touch-sensitive sensor 220.
[0052] The occlusion detection module 408 can accept the saturation data 412 and the position and/or motion data 414. The occlusion detection module 408 determines whether an object is occluding the radar system 106. Using the occlusion detection module 408, the touch-input recognition module 218 can distinguish between a touch input (e.g., a short-term intentional occlusion) and an occlusion (e.g., a long-term occlusion that may or may not be intentional). In some implementations, the occlusion detection module 408 monitors the duration for which the percentage of saturated samples is greater than a predetermined saturation threshold for detecting an occlusion. In some cases, the saturation threshold for detecting an occlusion can be less than the saturation threshold for detecting a touch input. This is because some objects may have a smaller radar cross section relative to the human body part used to provide the touch input. In contrast to a rigid inanimate object, the human body part is partially composed of fluid, which enables the human body part to spread across the surface of the user device 104. Therefore, the human body part can cause a larger amount of signal clipping to occur relative to the rigid inanimate object. The occlusion detection module 408 determines that an occlusion is present if this duration is greater than or equal to a predetermined duration threshold. The duration threshold can be set to be greater than a duration associated with a touch input, such as a duration of the tap-and-hold input.
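A sketch of the duration-based occlusion test described above, using a lower saturation threshold and a longer duration threshold than the touch case. Both numeric values are illustrative assumptions, as the disclosure leaves the thresholds implementation-defined:

```python
# Illustrative thresholds (assumptions, not from the disclosure).
OCCLUSION_SATURATION = 30.0   # percent; lower than the touch threshold
OCCLUSION_FRAMES = 50         # consecutive frames above threshold => occlusion

def is_occluded(saturation_history):
    """Detect a long-term occlusion from per-frame saturation percentages.

    An occlusion saturates the receiver for much longer than a touch and
    may clip fewer samples (an occluding object can have a smaller radar
    cross section than a finger pressed flat against the surface), so a
    lower threshold and a longer duration are used than for touch detection.
    """
    run = 0
    for p in saturation_history:
        run = run + 1 if p >= OCCLUSION_SATURATION else 0
        if run >= OCCLUSION_FRAMES:
            return True
    return False

print(is_occluded([40.0] * 60))         # True: long saturation run
print(is_occluded([40.0] * 5 + [0.0]))  # False: brief, touch-like event
```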
[0053] The occlusion detection module 408 can also analyze information from the position and/or motion data 414 to determine that the radar system 106 is occluded. For example, the occlusion detection module 408 can determine that a detection within the range-Doppler map represents an occlusion if the detection is associated with a near range and a low Doppler filter, which can indicate that the occlusion is relatively stationary. The occlusion detection module 408 can also monitor the range-Doppler maps to detect a change in phase. If the phase associated with the detection does not change significantly over time, then the detection is likely associated with a stationary object. The occlusion detection module 408 generates occlusion data 420, which can be provided to the user device 104 and/or the mode-control module 410. The occlusion data 420 indicates whether or not the radar system 106 is occluded. In some cases, the user device 104 performs an action or executes a function according to the occlusion data 420. For example, the user device 104 can activate one or more components that are adjacent to the radar system 106 if the radar system 106 is not occluded and deactivate one or more of these components if the radar system 106 is occluded, as further described with respect to FIG. 5. As another example, the user device 104 can alert the user to the presence of the occlusion.
[0054] The mode-control module 410 controls an operational mode 422 of the transceiver 212. In some cases, the mode-control module 410 directs the radar system 106 to operate according to a saturation mode, which increases the radar system 106’s saturation sensitivity for touch recognition or occlusion detection. In the saturation mode, the transceiver 212 can operate with a higher transmit power and/or increase the gain of one or more of its amplifiers, such as the power amplifier 308, the low-noise amplifier 312, or a variable gain amplifier within the transmitter 302 or the receiver 304 (not shown). Additionally or alternatively, the saturation mode causes the transceiver 212 to lower a cutoff frequency associated with a high-pass filter implemented by the filter 316. Lowering the cutoff frequency increases the filter 316’s response to near-range objects. The transceiver 212 can also activate a low-pass filter, which can be implemented as part of the filter 316, to reduce noise and pass objects associated with small Doppler frequencies.
[0055] In some cases, the mode-control module 410 activates the saturation mode responsive to the touch-input ID module 404 providing the touch-input data 416. As an example, the mode-control module 410 analyzes the touch-input data 416 and determines whether to activate the saturation mode or modify parameters of the saturation mode to improve touch recognition. In other cases, the mode-control module 410 activates the saturation mode responsive to the occlusion detection module 408 determining that the radar system 106 is occluded. This allows the occlusion detection module 408 to confirm that the radar system 106 is occluded. In still other cases, the mode-control module 410 can activate the saturation mode at pre-specified time intervals, such as every 500 milliseconds.
[0056] The mode-control module 410 can also selectively alter the operational mode of the radar system 106. For example, if the occlusion detection module 408 determines that the radar system 106 is occluded, the mode-control module 410 can analyze the occlusion data 420 and cause the radar system 106 to operate in a low-power mode or transition to an off state (e.g., a powered-down state) to reduce power consumption while the radar system 106 is occluded. During the low-power mode, the occlusion detection module 408 can continuously and/or intermittently monitor the saturation data 412 and detect if the occlusion is no longer present. Once the radar system 106 is no longer occluded, the mode-control module 410 can cause the radar system 106 to operate according to a previous operational mode.
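The occlusion-driven mode switching described above can be sketched as a small state machine. The state names and the restore-previous-mode policy are assumptions for illustration, not a description of the mode-control module 410's actual implementation:

```python
class ModeControl:
    """Minimal sketch of occlusion-driven power-mode switching."""

    def __init__(self):
        self.mode = "normal"
        self._previous = "normal"

    def update(self, occluded):
        if occluded and self.mode not in ("low-power", "off"):
            self._previous = self.mode   # remember the mode to restore later
            self.mode = "low-power"      # conserve power while occluded
        elif not occluded and self.mode in ("low-power", "off"):
            self.mode = self._previous   # occlusion cleared: restore prior mode
        return self.mode

mc = ModeControl()
print(mc.update(occluded=True))   # low-power
print(mc.update(occluded=True))   # low-power (remains while occluded)
print(mc.update(occluded=False))  # normal (previous mode restored)
```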
[0057] If the mode-control module 410 causes the radar system 106 to transition to the off state, the mode-control module 410 can periodically wake up the radar system 106 by causing the radar system 106 to transition to another operational mode, such as the low-power mode. During the low-power mode, the occlusion detection module 408 can confirm whether the occlusion is still present. Alternatively, if a proximity sensor is near the radar system 106 and also detects the occlusion, the proximity sensor can wake up the radar system 106 responsive to detecting that the occlusion is no longer present.
[0058] In another example, if the radar system 106 is supporting other radar-based applications in addition to or instead of touch recognition, such as gesture recognition, the mode-control module 410 can cause the radar system 106 to operate according to a soft-gate mode if the occlusion detection module 408 determines that the radar system 106 is occluded. In the soft-gate mode, the radar system 106 temporarily ceases processing detections. This can help reduce a false-alarm rate of the radar system 106 while the radar system 106 is occluded. The mode-control module 410 can direct the radar system 106 to remain in the soft-gate mode for a predetermined amount of time before returning to a previous operational mode.
[0059] In other implementations not shown, the radar system 106 can operate as a proximity sensor by detecting occlusion. To support occlusion detection, the system processor 214 can implement the saturation counter 402, the position and/or motion estimator 406, the occlusion detection module 408, the mode-control module 410, or some combination thereof. Multiple instances of the radar system 106 can be implemented within the user device 104, as further described with respect to FIG. 5.
[0060] FIG. 5 illustrates example positions of the radar-based touch interface 102’s radar systems 106 on a smartphone 500. In the depicted configuration, the smartphone 500 includes radar systems 106-1 to 106-8. In some implementations, the radar systems 106-1 to 106-8 are positioned within or under an exterior housing of the smartphone 500, which can be substantially transparent to radar signals (e.g., minimally attenuate radar signals).
[0061] In some implementations, the radar systems 106-1 to 106-8 are positioned around the smartphone 500 such that a portion of each radar system 106's antenna pattern overlaps at least one other radar system 106's antenna pattern. In this way, an object can be detected in the overlapping antenna patterns of at least two of the radar systems 106-1 to 106-8 at various locations around the smartphone 500.

[0062] In a depicted portrait orientation of the smartphone 500, the one or more antennas 210 of each radar system 106 can face up along the Y axis towards an upper side of the smartphone 500, face left along the X axis towards a left side of the smartphone 500, face down along the Y axis towards a bottom side of the smartphone 500, or face right along the X axis towards a right side of the smartphone 500. For example, the antennas 210 of the radar systems 106-1 and 106-3 can face up along the Y axis, the antennas 210 of the radar systems 106-2 and 106-4 can face left along the X axis, the antennas 210 of the radar systems 106-5 and 106-6 can face down along the Y axis, and the antennas 210 of the radar systems 106-7 and 106-8 can face right along the X axis. In other implementations, the antennas 210 of one or more of the radar systems 106-1 to 106-8 can face up out of the page along the Z axis towards a front face of the smartphone 500 or face down into the page along the Z axis towards a back side of the smartphone 500.
[0063] Although the antennas 210 of the radar systems 106-1 to 106-8 can face a particular side of the smartphone 500, the antenna patterns of these antennas 210 can encompass a volume of space above the front face of the smartphone 500 and/or another volume of space behind the backside of the smartphone 500. In this way, the radar systems 106-1 to 106-8 can detect a user interacting with the smartphone 500.
[0064] In some implementations, the radar systems 106-1 to 106-8 have multiple antennas 210 that form an antenna array, such as multiple transmit antennas 320 or multiple receive antennas 322. Consider an example in which the radar systems 106-1 to 106-8 each include at least two receive antennas 322-1 and 322-2, which form a linear antenna array. Orientations of these linear antenna arrays can vary to enable the radar systems 106 to operate together to determine two-dimensional angular information associated with an object. In particular, orientations of some linear antenna arrays can differ by approximately 90 degrees. For example, the receive antennas 322-1 and 322-2 of the radar system 106-1 can be aligned along the X axis to measure azimuth angles of objects and the receive antennas 322-1 and 322-2 of the radar system 106-2 can be aligned along the Y axis to measure elevation angles of the objects.
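Recovering an arrival angle from a two-element array's phase difference uses the standard relation phase = 2πd·sin(θ)/λ; one array aligned with the X axis yields azimuth and an orthogonal array yields elevation. The sketch below assumes that relation and an illustrative 60 GHz carrier:

```python
import numpy as np

C = 3e8       # speed of light (m/s)
FREQ = 60e9   # carrier frequency (Hz); an illustrative assumption

def angle_from_phase(phase_diff_rad, spacing_m):
    """Arrival angle from the phase difference between two receive antennas.

    For a two-element linear array with the given spacing d, the standard
    relation is phase_diff = 2*pi*d*sin(theta)/lambda; inverting it gives
    the angle in degrees.
    """
    wavelength = C / FREQ
    sin_theta = phase_diff_rad * wavelength / (2 * np.pi * spacing_m)
    return np.degrees(np.arcsin(sin_theta))

# Half-wavelength spacing; a pi/2 phase difference implies a 30-degree angle.
d = (C / FREQ) / 2
print(angle_from_phase(np.pi / 2, d))
```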
[0065] Although the smartphone 500 of FIG. 5 is shown to include eight radar systems 106-1 to 106-8, other implementations of the smartphone 500 can have other quantities of radar systems 106 (e.g., fewer than eight radar systems 106 or more than eight radar systems 106). For example, the smartphone 500 can include two radar systems 106, such as radar systems 106-1 and 106-2. In some cases, the two radar systems 106 are oriented along different axes to enable two- dimensional angular information to be determined.
[0066] The various positions of the radar systems 106-1 to 106-8 can facilitate implementation of the radar-based touch interface 102. For example, in the portrait orientation, the radar system 106-8 can be at a convenient position for detecting touch inputs provided by a right-handed user's thumb. In contrast, the radar system 106-4 can be at a convenient position for detecting touch inputs provided by a left-handed user's thumb. In the landscape orientation, the radar systems 106-1, 106-3, 106-5, and/or 106-6 can be used to enable a user to provide touch inputs for a gaming application.
[0067] Some radar systems 106-1 to 106-8 can be at positions that are likely to be occluded by the user. For example, the radar system 106-4 can be occluded by a right-handed user’s fingers, which wrap around the smartphone 500. In some implementations, one or more of the radar systems 106-1 to 106-8 are not adjacent to a proximity sensor. With the ability to detect occlusion, however, the radar systems 106-1 to 106-8 can independently transition to an appropriate operational mode.
[0068] The radar systems 106-1 to 106-8 can also operate as separate proximity sensors for other components within the smartphone 500. For example, consider that a radio-frequency component is adjacent to the radar system 106-4 (e.g., above the radar system 106-4, below the radar system 106-4). This radio-frequency component can include a wireless communication transceiver, which transmits wireless communication signals, or another radar system 106 (not shown).
Example Method
[0069] FIG. 6 depicts an example method 600 performed by a radar-based touch interface. Method 600 is shown as sets of operations (or acts) performed but not necessarily limited to the order or combinations in which the operations are shown herein. Further, any of one or more of the operations may be repeated, combined, reorganized, or linked to provide a wide array of additional and/or alternate methods. In portions of the following discussion, reference may be made to the environments 100-1 to 100-3 of FIG. 1, and entities detailed in FIGs. 2-4, reference to which is made for example only. The techniques are not limited to performance by one entity or multiple entities operating on one device.
[0070] At 602, a radar transmit signal is transmitted using at least one transmit antenna of a radar system. For example, a radar system 106 of a radar-based touch interface 102 transmits a radar transmit signal 326 using at least one transmit antenna 320, as shown in FIG. 3. In some situations, the radar system 106 operates according to a saturation mode, which increases a power level of the radar transmit signal to increase a likelihood that the radar system 106 becomes saturated.
[0071] At 604, a radar receive signal is received using at least one receive antenna of the radar system. For example, the radar system 106 receives a radar receive signal 328 using at least one receive antenna 322, as shown in FIG. 3. The radar receive signal 328 represents a reflected version of the radar transmit signal 326. In a first example, the user interacts with the radar-based touch interface 102. Due to this interaction, the radar transmit signal 326 is reflected by a portion of the user that touches a location on the user device 104. In general, this location is proximate to (e.g., close to, near) one or more of the antennas 210 of the radar system 106 and causes the radar system 106 to become saturated. For example, this location can be directly above one or more of the antennas 210 such that the user covers at least a portion of one or more of the antennas 210. Alternatively, this location can be adjacent (e.g., next to, to the side of) one or more of the antennas 210. In this case, the user is not directly above the antennas 210.
[0072] In a second example, an object at least partially occludes the radar system 106 and causes the radar system 106 to become saturated. Due to this occlusion, a majority of the radar transmit signal 326 is reflected by the object and does not propagate further in space. The object can be an inanimate object or a portion of the user, such as the user’s hand or finger.
[0073] At 606, the radar system 106 is configured to recognize a touch input provided by the user based on the radar receive signal. For example, the touch-input recognition module 218 recognizes the touch input provided by the user, as shown in FIG. 4. The touch input can include a tap-and-release input, a tap-and-hold input, a swipe input, a position-dependent input, a motion-dependent input, a duration-dependent input, a pressure-dependent input (e.g., a hard tap, a soft tap), or combinations thereof.
[0074] At 608, the touch input is provided to a device to cause the device to perform a function associated with the touch input. For example, the touch-input recognition module 218 provides touch-input data 416 to the user device 104 to cause the user device 104 to perform a function (or an action) associated with the touch input. As an example, the user device 104 can present new content on the display, move a cursor on the display, activate or deactivate one or more components within the user device 104 (e.g., a camera, a wireless communication transceiver), open an application on the user device 104, change an operational mode of the user device 104 (e.g., transition between a low-power mode and a high-power mode), or scroll through visual content that is presented on the display according to the touch input.
Example Computing System
[0075] FIG. 7 illustrates various components of an example computing system 700 that can be implemented as any type of client, server, and/or computing device, as described with reference to FIG. 2, to implement the radar-based touch interface 102.
[0076] The computing system 700 includes communication devices 702 that enable wired and/or wireless communication of device data 704 (e.g., received data, data that is being received, data scheduled for broadcast, data packets of the data). The computing system 700 also includes one or more radar systems 106. The device data 704 or other device content can include configuration settings of the device, media content stored on the device, and/or information associated with a user of the device. Media content stored on the computing system 700 can include any type of audio, video, and/or image data. The computing system 700 includes one or more data inputs 706 via which any type of data, media content, and/or inputs can be received, such as human utterances, user-selectable inputs (explicit or implicit), messages, music, television media content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source.
[0077] The computing system 700 also includes communication interfaces 708, which can be implemented as any one or more of a serial and/or parallel interface, a wireless interface, any type of network interface, a modem, and as any other type of communication interface. The communication interfaces 708 provide a connection and/or communication links between the computing system 700 and a communication network by which other electronic, computing, and communication devices communicate data with the computing system 700.
[0078] The computing system 700 includes one or more processors 710 (e.g., any of microprocessors, controllers, and the like), which process various computer-executable instructions to control the operation of the computing system 700 and to enable techniques for, or in which can be embodied, gesture recognition in the presence of saturation. Alternatively or additionally, the computing system 700 can be implemented with any one or combination of hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits which are generally identified at 712. Although not shown, the computing system 700 can include a system bus or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
[0079] The computing system 700 also includes a computer-readable media 714, such as one or more memory devices that enable persistent and/or non-transitory data storage (i.e., in contrast to mere signal transmission), examples of which include random access memory (RAM), non-volatile memory (e.g., any one or more of a read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. The disk storage device may be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable compact disc (CD), any type of a digital versatile disc (DVD), and the like. The computing system 700 can also include a mass storage media device (storage media) 716.
[0080] The computer-readable media 714 provides data storage mechanisms to store the device data 704, as well as various device applications 718 and any other types of information and/or data related to operational aspects of the computing system 700. For example, an operating system 720 can be maintained as a computer application with the computer-readable media 714 and executed on the processors 710. The device applications 718 may include a device manager, such as any form of a control application, software application, signal-processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, and so on.
[0081] The device applications 718 also include any system components, engines, or managers to implement the radar-based touch interface 102. In this example, the device applications 718 include the touch-input recognition module 218 of FIG. 2.
Conclusion
[0082] Although techniques using, and apparatuses including, a radar-based touch interface have been described in language specific to features, it is to be understood that the subject of the appended claims is not necessarily limited to the specific features. Rather, the specific features are disclosed as example implementations of a radar-based touch interface.
[0083] Some examples are described below:
[0084] Example 1: A method performed by a radar-based touch interface included within a device, the radar-based touch interface including at least one radar system, the method comprising: transmitting a radar transmit signal using at least one transmit antenna of the radar system; receiving a radar receive signal using at least one receive antenna of the radar system, the radar receive signal comprising a reflected version of the radar transmit signal, the radar transmit signal reflected by a portion of a user that touches a location on the device; recognizing, based on the radar receive signal, a touch input provided by the user; and providing the touch input to the device to cause the device to perform a function associated with the touch input.
[0085] Example 2: The method of example 1, wherein: the recognizing of the touch input comprises using a touch-input recognition module of the at least one radar system for recognizing the touch input provided by the user; and the touch-input recognition module comprises at least one of: a saturation counter, a touch-input identification module, a position estimator, a motion estimator, an occlusion detection module, or a mode control module.

[0086] Example 3: The method of example 1 or 2, wherein the recognizing of the touch input comprises recognizing at least one of: a tap-and-release input; a tap-and-hold input; a swipe input; a position-dependent input; a motion-dependent input; a duration-dependent input; or a pressure-dependent input.
[0087] Example 4: The method of any preceding example, wherein the function comprises at least one of: present new content on a display of the device; move a cursor on the display of the device; activate one or more components within the device; deactivate one or more other components within the device; open an application on the device; scroll through visual content that is presented on the display of the device; or adjust an operational mode of the device.
[0088] Example 5: The method of any preceding example, wherein the recognizing of the touch input comprises recognizing the touch input based on at least one of the following: an amount of saturation associated with the touch input; a position associated with the touch input; a motion associated with the touch input; or a duration of the touch input.
[0089] Example 6: The method of any preceding example, wherein the recognizing of the touch input further comprises: generating a digital beat signal based on the radar receive signal, the digital beat signal comprising multiple samples; measuring a percentage of the multiple samples of the digital beat signal that are saturated; and recognizing the touch input based on the percentage being greater than or equal to a threshold.

[0090] Example 7: The method of example 6, wherein: the receiving of the radar receive signal using the at least one receive antenna of the radar system comprises receiving the radar receive signal using two or more receive antennas that form a linear antenna array; the generating of the digital beat signal comprises generating two or more digital beat signals based on the two or more radar receive signals, respectively; and the recognizing of the touch input comprises: analyzing phase differences between the two or more digital beat signals to determine an angular position of the location; and recognizing the touch input based on the angular position and the percentage being greater than or equal to the threshold.
[0091] Example 8: The method of any preceding example, further comprising: accepting sensor data from a touch-sensitive sensor, the sensor data comprising information regarding another portion of the user that touches another location on the device, wherein the recognizing of the touch input comprises recognizing the touch input based on the radar receive signal and the sensor data.
[0092] Example 9: The method of any preceding example, further comprising: prior to transmitting the radar transmit signal, altering operation of the radar system to increase a saturation sensitivity of the radar system.
[0093] Example 10: The method of example 9, wherein the altering the operation of the radar system comprises at least one of the following: increasing a transmit power level of the radar system; increasing a gain of an amplifier of the radar system; decreasing a cutoff frequency of a high-pass filter of the radar system; or enabling a low-pass filter of the radar system.

[0094] Example 11: The method of any preceding example, further comprising: transmitting another radar transmit signal using the at least one transmit antenna of the radar system; receiving another radar receive signal using the at least one receive antenna of the radar system, the other radar receive signal comprising a reflected version of the other radar transmit signal, the other radar transmit signal reflected by an object that at least partially occludes the at least one transmit antenna or the at least one receive antenna; and determining, based on the other radar receive signal, that the radar system is at least partially occluded by the object.
[0095] Example 12: The method of example 11, wherein: the method further comprising, responsive to determining that the radar system is at least partially occluded by the object, disabling a radio-frequency component of the device that is adjacent to the radar system; and the radio-frequency component comprises at least one of: another radar system; or a wireless transceiver configured to transmit wireless communication signals.
[0096] Example 13: The method of example 11, further comprising: responsive to determining that the radar system is at least partially occluded by the object, altering operation of the radar system to reduce power consumption.
[0097] Example 14: The method of any preceding example, wherein the device comprises: a smartphone; a smart speaker; a smart thermostat; a smart watch; a gaming system; or a home appliance.

[0098] Example 15: The method of any preceding example, wherein: the radar system comprises multiple radar systems at different positions on the device, the multiple radar systems including a first radar system and a second radar system; the at least one receive antenna of the first radar system comprises multiple receive antennas that form a first linear antenna array; the at least one receive antenna of the second radar system comprises other multiple receive antennas that form a second linear antenna array; and the first linear antenna array of the first radar system is oriented ninety degrees with respect to the second linear antenna array of the second radar system.
[0099] Example 16: An apparatus comprising: a radar-based touch interface comprising at least one radar system, the radar-based touch interface configured to perform any of the methods of examples 1 to 15.
[0100] Example 17: A device with a radar-based touch interface comprising: a radar system comprising: at least one antenna; at least one radar transceiver coupled to the at least one antenna, the at least one antenna and the at least one radar transceiver jointly configured to transmit and receive radar signals; and a touch-input recognition module configured to process the radar signals, the radar signals reflected by a user of the device, the radar-based touch interface operable to perform any of the methods of examples 1 to 15.
[0101] Example 18: The device of example 17, wherein the touch-input recognition module comprises at least one of: a saturation counter; a position estimator; a motion estimator; an occlusion detection module; or a touch-input identification module.

[0102] Example 19: The device of example 17 or 18, wherein the device comprises: a smartphone; a smart speaker; a smart thermostat; a smart watch; a gaming system; or a home appliance.
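Example 10's alterations (raising transmit power and amplifier gain, lowering the high-pass cutoff, and enabling a low-pass filter) all have the effect of making near-field reflections saturate the receiver more readily. As a rough sketch only, and not part of the disclosure, the parameter names and step sizes below are invented for illustration:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class RadarConfig:
    """Hypothetical tunable parameters of a radar front end."""
    transmit_power_dbm: float = 10.0
    amplifier_gain_db: float = 20.0
    highpass_cutoff_hz: float = 1000.0
    lowpass_enabled: bool = False


def increase_saturation_sensitivity(cfg: RadarConfig) -> RadarConfig:
    """Alter radar operation (in the spirit of Example 10) so that
    near-field touches saturate the receive chain more readily."""
    return RadarConfig(
        transmit_power_dbm=cfg.transmit_power_dbm + 3.0,  # stronger transmit signal
        amplifier_gain_db=cfg.amplifier_gain_db + 6.0,    # higher receive gain
        highpass_cutoff_hz=cfg.highpass_cutoff_hz / 2.0,  # pass more low-frequency energy
        lowpass_enabled=True,                             # enable the low-pass filter
    )
```

A mode-control module (Example 2) could apply such a configuration before transmitting, per Example 9, and revert it when saturation sensing is no longer needed.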

Claims

1. A method performed by a radar-based touch interface included within a device, the radar-based touch interface including at least one radar system, the method comprising: transmitting a radar transmit signal using at least one transmit antenna of the radar system; receiving a radar receive signal using at least one receive antenna of the radar system, the radar receive signal comprising a reflected version of the radar transmit signal, the radar transmit signal reflected by a portion of a user that touches a location on the device; recognizing, based on the radar receive signal, a touch input provided by the user; and providing the touch input to the device to cause the device to perform a function associated with the touch input.
2. The method of claim 1, wherein: the recognizing of the touch input comprises using a touch-input recognition module of the at least one radar system for recognizing the touch input provided by the user; and the touch-input recognition module comprises at least one of: a saturation counter, a touch-input identification module, a position estimator, a motion estimator, an occlusion detection module, or a mode control module.
3. The method of claim 1 or 2, wherein the recognizing of the touch input comprises recognizing at least one of: a tap-and-release input; a tap-and-hold input; a swipe input; a position-dependent input; a motion-dependent input; a duration-dependent input; or a pressure-dependent input.
4. The method of any preceding claim, wherein the function comprises at least one of: present new content on a display of the device; move a cursor on the display of the device; activate one or more components within the device; deactivate one or more other components within the device; open an application on the device; scroll through visual content that is presented on the display of the device; or adjust an operational mode of the device.
5. The method of any preceding claim, wherein the recognizing of the touch input comprises recognizing the touch input based on at least one of the following: an amount of saturation associated with the touch input; a position associated with the touch input; a motion associated with the touch input; or a duration of the touch input.
6. The method of any preceding claim, wherein the recognizing of the touch input further comprises: generating a digital beat signal based on the radar receive signal, the digital beat signal comprising multiple samples; measuring a percentage of the multiple samples of the digital beat signal that are saturated; and recognizing the touch input based on the percentage being greater than or equal to a threshold.
7. The method of claim 6, wherein: the receiving of the radar receive signal using the at least one receive antenna of the radar system comprises receiving the radar receive signal using two or more receive antennas that form a linear antenna array; the generating of the digital beat signal comprises generating two or more digital beat signals based on the two or more radar receive signals, respectively; and the recognizing of the touch input comprises: analyzing phase differences between the two or more digital beat signals to determine an angular position of the location; and recognizing the touch input based on the angular position and the percentage being greater than or equal to the threshold.
8. The method of any preceding claim, further comprising: accepting sensor data from a touch-sensitive sensor, the sensor data comprising information regarding another portion of the user that touches another location on the device, wherein the recognizing of the touch input comprises recognizing the touch input based on the radar receive signal and the sensor data.
9. The method of any preceding claim, further comprising: prior to transmitting the radar transmit signal, altering operation of the radar system to increase a saturation sensitivity of the radar system.
10. The method of claim 9, wherein the altering the operation of the radar system comprises at least one of the following: increasing a transmit power level of the radar system; increasing a gain of an amplifier of the radar system; decreasing a cutoff frequency of a high-pass filter of the radar system; or enabling a low-pass filter of the radar system.
11. The method of any preceding claim, further comprising: transmitting another radar transmit signal using the at least one transmit antenna of the radar system; receiving another radar receive signal using the at least one receive antenna of the radar system, the other radar receive signal comprising a reflected version of the other radar transmit signal, the other radar transmit signal reflected by an object that at least partially occludes the at least one transmit antenna or the at least one receive antenna; and determining, based on the other radar receive signal, that the radar system is at least partially occluded by the object.
12. The method of claim 11, wherein: the method further comprising, responsive to determining that the radar system is at least partially occluded by the object, disabling a radio-frequency component of the device that is adjacent to the radar system; and the radio-frequency component comprises at least one of: another radar system; or a wireless transceiver configured to transmit wireless communication signals.
13. The method of claim 11, further comprising: responsive to determining that the radar system is at least partially occluded by the object, altering operation of the radar system to reduce power consumption.
14. The method of any preceding claim, wherein the device comprises: a smartphone; a smart speaker; a smart thermostat; a smart watch; a gaming system; or a home appliance.
15. The method of any preceding claim, wherein: the radar system comprises multiple radar systems at different positions on the device, the multiple radar systems including a first radar system and a second radar system; the at least one receive antenna of the first radar system comprises multiple receive antennas that form a first linear antenna array; the at least one receive antenna of the second radar system comprises other multiple receive antennas that form a second linear antenna array; and the first linear antenna array of the first radar system is oriented ninety degrees with respect to the second linear antenna array of the second radar system.
16. An apparatus comprising: a radar-based touch interface comprising at least one radar system, the radar-based touch interface configured to perform any of the methods of claims 1 to 15.
17. A device with a radar-based touch interface comprising: a radar system comprising: at least one antenna; at least one radar transceiver coupled to the at least one antenna, the at least one antenna and the at least one radar transceiver jointly configured to transmit and receive radar signals; and a touch-input recognition module configured to process the radar signals, the radar signals reflected by a user of the device, the radar-based touch interface operable to perform any of the methods of claims 1 to 15.
18. The device of claim 17, wherein the touch-input recognition module comprises at least one of: a saturation counter; a position estimator; a motion estimator; an occlusion detection module; or a touch-input identification module.
19. The device of claim 17 or 18, wherein the device comprises: a smartphone; a smart speaker; a smart thermostat; a smart watch; a gaming system; or a home appliance.
PCT/US2020/038189, filed 2020-06-17, published as WO2021257071A1 — Radar-based touch interface.