CN111812633A - Detecting reference frame changes in smart device based radar systems - Google Patents


Info

Publication number
CN111812633A
Authority
CN
China
Prior art keywords
radar
data
reference frame
radar system
machine learning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010663766.3A
Other languages
Chinese (zh)
Other versions
CN111812633B (en)
Inventor
Nicholas Edward Gillian
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Google LLC
Original Assignee
Google LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google LLC filed Critical Google LLC
Priority to CN202410233722.5A (published as CN118191817A)
Publication of CN111812633A
Application granted
Publication of CN111812633B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S 7/35 Details of non-pulse systems
    • G01S 7/352 Receivers
    • G01S 7/354 Extracting wanted echo-signals
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S 13/50 Systems of measurement based on relative movement of target
    • G01S 13/52 Discriminating between fixed and moving objects or between objects moving at different speeds
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S 7/28 Details of pulse systems
    • G01S 7/285 Receivers
    • G01S 7/295 Means for transforming co-ordinates or for evaluating data, e.g. using computers
    • G01S 7/2955 Means for determining the position of the radar coordinate system for evaluating the position data of the target in another coordinate system
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S 13/50 Systems of measurement based on relative movement of target
    • G01S 13/52 Discriminating between fixed and moving objects or between objects moving at different speeds
    • G01S 13/522 Discriminating between fixed and moving objects or between objects moving at different speeds using transmissions of interrupted pulse modulated waves
    • G01S 13/524 Discriminating between fixed and moving objects or between objects moving at different speeds using transmissions of interrupted pulse modulated waves based upon the phase or frequency shift resulting from movement of objects, with reference to the transmitted signals, e.g. coherent MTi
    • G01S 13/53 Discriminating between fixed and moving objects or between objects moving at different speeds using transmissions of interrupted pulse modulated waves based upon the phase or frequency shift resulting from movement of objects, with reference to the transmitted signals, e.g. coherent MTi performing filtering on a single spectral line and associated with one or more range gates with a phase detector or a frequency mixer to extract the Doppler information, e.g. pulse Doppler radar
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S 13/50 Systems of measurement based on relative movement of target
    • G01S 13/58 Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G01S 13/583 Velocity or trajectory determination systems; Sense-of-movement determination systems using transmission of continuous unmodulated waves, amplitude-, frequency-, or phase-modulated waves and based upon the Doppler effect resulting from movement of targets
    • G01S 13/584 Velocity or trajectory determination systems; Sense-of-movement determination systems using transmission of continuous unmodulated waves, amplitude-, frequency-, or phase-modulated waves and based upon the Doppler effect resulting from movement of targets adapted for simultaneous range and velocity measurements
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88 Radar or analogous systems specially adapted for specific applications
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S 7/41 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S 7/415 Identification of targets based on measurements of movement associated with the target
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S 7/41 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S 7/417 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section involving the use of neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Electromagnetism (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to detecting reference frame changes in smart device based radar systems. Techniques and apparatus are described that implement a smart device-based radar system capable of detecting reference frame changes. In particular, the radar system includes a reference frame machine learning module trained to identify whether a reference frame of the radar system changes. The reference frame machine learning module analyzes complex radar data generated from at least one chirp of the reflected radar signals to evaluate the relative motion of at least one object over time. By directly using machine learning to analyze the complex radar data, the radar system can operate as a motion sensor without relying on non-radar-based sensors, such as gyroscopes, inertial sensors, or accelerometers. With knowledge of whether the reference frame is stationary or moving, the radar system may determine whether a gesture is likely to occur and, in some cases, compensate for relative motion of the radar system itself.

Description

Detecting reference frame changes in smart device based radar systems
Technical Field
The present disclosure relates to detecting reference frame changes in smart device based radar systems.
Background
Radar is a useful device that can detect objects. Relative to other types of sensors, such as a camera, radar can provide improved performance in many different environmental conditions, such as low lighting and fog, or with moving or overlapping objects. Radar can also detect objects through one or more obstructions, such as a purse or a pocket. Despite its many advantages, there are many challenges associated with integrating radar into electronic devices.
One challenge relates to operating a radar in an electronic device that may be mobile, such as a mobile device or a wearable device. Because the radar may operate while the electronic device is stationary or moving, there is uncertainty as to whether the reference frame of the radar is fixed or changing. It is challenging for the radar to distinguish between cases in which the radar is stationary and an object is moving, the object is stationary and the radar is moving, or both the radar and the object are moving.
Disclosure of Invention
Techniques and apparatus are described that implement a smart device-based radar system capable of detecting reference frame changes. The radar system includes a reference frame machine learning module trained to operate as a motion sensor. The reference frame machine learning module uses machine learning to identify whether a reference frame of the radar system changes. In particular, the reference frame machine learning module analyzes complex radar data generated from at least one chirp of the reflected radar signals to identify subtle patterns in the relative motion of at least one object over time. In some cases, the reference frame machine learning module compares (e.g., correlates) the relative motion of two or more objects. By analyzing the complex radar data directly using machine learning, the reference frame machine learning module can determine whether the reference frame of the radar system is changing without relying on non-radar-based sensors, such as gyroscopes, inertial sensors, or accelerometers. With knowledge of whether the reference frame is stationary or moving, the radar system may determine whether a gesture is likely to occur, and in some cases, may compensate for relative motion of the radar system itself.
Aspects described below include a method performed by a radar system for detecting a reference frame change. The method includes transmitting a first radar transmit signal using an antenna array of the radar system and receiving a first radar receive signal using the antenna array. The first radar receive signal comprises a version of the first radar transmit signal that is reflected by at least one object. The method also includes generating complex radar data based on the first radar receive signal. The method additionally includes analyzing the complex radar data using machine learning to detect a change in a reference frame of the radar system.
Aspects described below also include an apparatus including a radar system. The radar system includes an antenna array and a transceiver. The radar system also includes a processor and a computer-readable storage medium configured to perform any of the methods.
Aspects described below include a computer-readable storage medium comprising computer-executable instructions that, in response to execution by a processor, implement a reference frame machine learning module configured to accept complex radar data associated with radar receive signals reflected by at least one object. The reference frame machine learning module is further configured to analyze the complex radar data using machine learning to generate reference frame data. The reference frame machine learning module is further configured to determine whether an antenna array receiving the radar receive signal is stationary or moving based on the reference frame data.
Aspects described below also include a system having a machine learning means for detecting reference frame changes based on complex radar data.
Drawings
Apparatus and techniques for implementing a smart device-based radar system capable of detecting changes in a frame of reference are described with reference to the following figures. The same numbers are used throughout the drawings to reference like features and components:
FIG. 1 illustrates an example environment in which a smart device-based radar system capable of detecting changes in a reference frame may be implemented.
Fig. 2-1 illustrates an example embodiment of a radar system as part of a smart device.
Fig. 2-2 illustrates an example embodiment of a reference frame machine learning module.
Fig. 3-1 illustrates operation of an example radar system.
Fig. 3-2 illustrates an exemplary radar framing structure for detecting reference frame changes.
Fig. 4 illustrates an exemplary antenna array and an exemplary transceiver of a radar system.
FIG. 5 illustrates an example scheme implemented by a radar system for detecting a reference frame change.
FIG. 6 illustrates an example portion of a hardware abstraction module for detecting a reference frame change.
FIG. 7-1 illustrates an example spatiotemporal neural network for detecting reference frame changes.
Fig. 7-2 illustrates an example linear chirp level analysis module of a spatio-temporal neural network.
Fig. 7-3 illustrates an example feature level analysis module of a spatio-temporal neural network.
Fig. 7-4 illustrates an example master analysis module of a spatio-temporal neural network.
FIG. 8 illustrates an example method for performing the operation of a smart device-based radar system capable of detecting reference frame changes.
FIG. 9 illustrates an example computing system embodying, or in which techniques may be implemented that enable use of, a radar system capable of detecting reference frame changes.
Detailed Description
Overview
Integrating radar systems within electronic devices can be challenging. One challenge relates to operating a radar system in a mobile electronic device, such as a mobile device or a wearable device. Since radar systems may operate when the electronic device is stationary or moving, there is uncertainty as to whether the reference frame of the radar system is fixed or changing. This can make it challenging for the radar system to distinguish between situations where the radar system is stationary and the object is moving towards the radar system, where the object is stationary and the radar system is moving towards the object, or where both the radar system and the object are moving towards each other. Without knowing whether the reference frame is fixed or changing, the radar system may attempt to recognize the gesture if the user only moves the electronic device without performing the gesture.
Some techniques may augment the radar system with additional motion sensors. However, these motion sensors can increase the power consumption and cost of the electronic device. In addition, integrating motion sensors in space-constrained electronic devices can be challenging. Other, closed-form signal processing techniques can model different situations in which the radar system and other objects are moving. However, modeling non-uniform motion, including intentional and unintentional motion performed by a user, can be challenging. These motions may include situations in which the user is handling an electronic device that includes the radar system, or situations in which the user is moving while in the vicinity of the radar system.
In contrast, this document describes techniques and devices that implement a smart device-based radar system capable of detecting reference frame changes. The radar system includes a reference frame machine learning module trained to operate as a motion sensor. The reference frame machine learning module uses machine learning to identify whether a reference frame of the radar system changes. In particular, the reference frame machine learning module analyzes complex radar data generated from at least one chirp of the reflected radar signals to identify subtle patterns in the relative motion of at least one object over time. In some cases, the reference frame machine learning module compares (e.g., correlates) the relative motion of two or more objects. By analyzing complex radar data directly using machine learning, the reference frame machine learning module can determine whether the reference frame of the radar system is changing without relying on non-radar-based sensors, such as gyroscopes, inertial sensors, or accelerometers. With knowledge of whether the reference frame is stationary or moving, the radar system may determine whether a gesture is likely to occur, and in some cases, may compensate for relative motion of the radar system itself.
Example Environment
FIG. 1 is an illustration of example environments 100-1 through 100-8 in which techniques using, and an apparatus including, a smart device-based radar system capable of detecting reference frame changes may be embodied. In the depicted environments 100-1 through 100-8, the smart device 104 includes a radar system 102 capable of detecting one or more objects (e.g., users) using machine learning. The smart device 104 is shown in the environments 100-1 through 100-7 as a smart phone and in the environment 100-8 as a smart vehicle.
In environments 100-1 through 100-4 and 100-6, the user performs different types of gestures, which are detected by radar system 102. In some cases, the user performs gestures using appendages or body parts. Alternatively, the user may also perform gestures using a stylus, a handheld object, a ring, or any type of material that may reflect radar signals.
In environment 100-1, a user makes a scroll gesture by moving a hand over the smart device 104 in a horizontal direction (e.g., from a left side of the smart device 104 to a right side of the smart device 104). In environment 100-2, the user makes a reaching gesture, which decreases the distance between the smart device 104 and the user's hand. The user in environment 100-3 makes a gesture to play a game on the smart device 104. In one case, the user makes a push gesture by moving a hand over the smart device 104 in a vertical direction (e.g., from a bottom side of the smart device 104 to a top side of the smart device 104). In environment 100-4, the smart device 104 is stored within a purse, and the radar system 102 provides occluded-gesture recognition by detecting gestures occluded by the purse. In environment 100-6, a user waves their hand in front of the smart device 104.
The radar system 102 may also recognize other types of gestures or motions not shown in fig. 1. Example types of gestures include a handle turn gesture in which a user curls their fingers to grip an imaginary door handle and rotates their fingers and hand in a clockwise or counterclockwise manner to mimic the action of turning the imaginary door handle. Another example type of gesture includes a shaft twist gesture that a user performs by rubbing a thumb and at least one other finger together. The gesture may be two-dimensional, such as a gesture used with a touch-sensitive display (e.g., a two-finger pinch, a two-finger spread, or a tap). Gestures may also be three-dimensional, such as many sign language gestures, e.g., gestures in American Sign Language (ASL) and other sign languages worldwide. Upon detecting each of these gestures, the smart device 104 may perform actions such as displaying new content, moving a cursor, activating one or more sensors, opening an application, and so forth. In this manner, the radar system 102 provides contactless control of the smart device 104.
In environment 100-7, radar system 102 generates a three-dimensional map of the surrounding environment for contextual awareness. The radar system 102 also detects and tracks multiple users to enable both users to interact with the smart device 104. The radar system 102 may also perform vital sign detection. In environment 100-8, radar system 102 monitors vital signs of a user driving a vehicle. Example vital signs include heart rate and respiration rate. For example, if the radar system 102 determines that the driver is falling asleep, the radar system 102 may cause the smart device 104 to alert the user. Alternatively, if radar system 102 detects a life-threatening emergency event, such as a heart attack, radar system 102 may cause smart device 104 to alert a medical professional or emergency service.
In the different environments 100-1 through 100-8, the radar system 102 may be stationary or mobile. For example, in environments 100-1 through 100-3 and 100-7, smart device 104 is positioned on a non-moving surface such as a table. In this case, the reference frame of radar system 102 is stationary (e.g., fixed) 106, while one or more objects (e.g., users) move. In contrast, environment 100-5 illustrates a situation in which a portion of the user that is visible to radar system 102 remains stationary and radar system 102 moves. In particular, the user swings their arm in a manner that causes the smart device 104 to swing past their leg. In this case, the reference frame of radar system 102 is moving 108 (e.g., changing), while the portion of the user observed by radar system 102 is stationary. In other environments, both the user and radar system 102 move. As an example, users in environments 100-4 and 100-6 move radar system 102 (intentionally or unintentionally) while performing gestures. To distinguish between these different cases, the radar system 102 uses machine learning to detect reference frame changes, as further described with respect to fig. 2.
Some embodiments of radar system 102 are particularly advantageous when applied in the context of smart device 104, for which there is a convergence of issues, such as a need for limitations on the spacing and layout of the radar system 102 and low power consumption. An exemplary overall lateral dimension of the smart device 104 may be, for example, approximately eight centimeters by approximately fifteen centimeters. An exemplary footprint of radar system 102 may be even more limited, such as approximately four millimeters by six millimeters including the antennas. An exemplary power consumption of radar system 102 may be on the order of a few milliwatts to tens of milliwatts (e.g., between approximately two and twenty milliwatts). These limited footprint and power-consumption requirements of the radar system 102 enable the smart device 104 to include other desirable features (e.g., a camera sensor, a fingerprint sensor, a display, etc.) in a space-constrained package. The smart device 104 and the radar system 102 are further described with reference to fig. 2.
Fig. 2-1 illustrates the radar system 102 as part of the smart device 104. The smart device 104 is illustrated with various non-limiting example devices including a desktop computer 104-1, a tablet computer 104-2, a laptop computer 104-3, a television 104-4, a computing watch 104-5, computing glasses 104-6, a gaming system 104-7, a microwave oven 104-8, and a vehicle 104-9. Other devices may also be used, such as home service devices, smart speakers, smart thermostats, security cameras, baby monitors, Wi-Fi™ routers, drones, touch pads, drawing pads, netbooks, e-readers, home automation and control systems, wall displays, and other household appliances. Note that the smart device 104 may be wearable, non-wearable but mobile, or relatively immobile (e.g., desktops and appliances). The radar system 102 may be used as a standalone radar system, or used with or embedded in many different smart devices 104 or peripherals, such as in a control panel of a household appliance or system, in an automobile to control internal functions (e.g., volume, cruise control, or even driving of the car), or as an attachment to a laptop computer to control computing applications on the laptop.
The smart device 104 includes one or more computer processors 202 and computer-readable media 204, the computer-readable media 204 including memory media and storage media. An application and/or operating system (not shown) embodied as computer-readable instructions on computer-readable media 204 may be executed by computer processor 202 to provide some of the functionality described herein. The computer-readable media 204 also includes a radar-based application 206, the radar-based application 206 performing functions, such as motion sensing, presence detection, gesture-based contactless control, collision avoidance for autonomous driving, human vital sign notification, and the like, using radar data generated by the radar system 102.
The smart device 104 may also include a network interface 208 for communicating data over a wired, wireless, or optical network. For example, the network interface 208 may communicate data over a local area network (LAN), a wireless local area network (WLAN), a personal area network (PAN), a wide area network (WAN), an intranet, the internet, a peer-to-peer network, a mesh network, and so forth. The smart device 104 may also include a display (not shown).
The radar system 102 includes a communication interface 210 for communicating radar data to remote devices, however this interface need not be used when the radar system 102 is integrated within the smart device 104. Typically, the radar data provided by the communication interface 210 is in a format usable by the radar-based application 206.
Radar system 102 also includes at least one antenna array 212 and at least one transceiver 214 to transmit and receive radar signals. Antenna array 212 includes at least one transmit antenna element and at least two receive antenna elements. In some cases, antenna array 212 includes multiple transmit antenna elements to enable multiple-input multiple-output (MIMO) radar capable of transmitting multiple different waveforms at a given time (e.g., each transmit antenna element transmits a different waveform). The antenna elements may be circularly polarized, horizontally polarized, vertically polarized, or a combination thereof.
The receive antenna elements of antenna array 212 may be positioned in a one-dimensional shape (e.g., a line) or a two-dimensional shape (e.g., a rectangular arrangement, a triangular arrangement, or an "L" shaped arrangement) for implementations that include three or more receive antenna elements. One-dimensional shapes enable radar system 102 to measure one angular dimension (e.g., azimuth or elevation), while two-dimensional shapes enable radar system 102 to measure two angular dimensions (e.g., to determine azimuth and elevation of object 302). The element spacing associated with the receive antenna elements may be less than, greater than, or equal to half the center wavelength of the radar signal.
Using antenna array 212, radar system 102 may form beams that are steered or un-steered, wide or narrow, or shaped (e.g., as a hemisphere, cube, fan, cone, or cylinder). Steering and shaping can be achieved through analog beamforming or digital beamforming. One or more of the transmit antenna elements may have, for example, an un-steered omnidirectional radiation pattern, or may produce a wide steerable beam to illuminate a large volume of space. To achieve target angular accuracies and resolutions, the receive antenna elements may be used to generate hundreds or thousands of narrow steered beams via digital beamforming. In this manner, radar system 102 can efficiently monitor the external environment and detect one or more users.
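As a concrete illustration of digital beamforming, the sketch below (Python/NumPy, not taken from the patent; the element count, spacing, and steering angle are arbitrary assumptions) coherently combines the complex samples from two receive antenna elements of a uniform linear array to form one steered receive beam:

```python
import numpy as np

def steer(channels: np.ndarray, angle_deg: float, spacing_wavelengths: float = 0.5) -> np.ndarray:
    """Combine per-element complex samples toward one azimuth angle.

    channels: array of shape (num_elements, num_samples) of complex baseband data.
    """
    num_elements = channels.shape[0]
    # Per-element phase progression for a plane wave arriving from angle_deg.
    phase_step = 2j * np.pi * spacing_wavelengths * np.sin(np.radians(angle_deg))
    weights = np.exp(-phase_step * np.arange(num_elements))  # steering vector
    return weights @ channels                                # beamformed samples

# Example: two receive channels with 64 complex samples each.
rx = np.random.randn(2, 64) + 1j * np.random.randn(2, 64)
beam = steer(rx, angle_deg=20.0)
```

Repeating this combination for many candidate angles is one way to produce the many narrow steered beams mentioned above.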
The transceiver 214 includes circuitry and logic for transmitting and receiving radar signals via the antenna array 212. The components of transceiver 214 may include amplifiers, mixers, switches, analog-to-digital converters, or filters for conditioning the radar signals. The transceiver 214 also includes logic for performing in-phase/quadrature (I/Q) operations, such as modulation or demodulation. Various modulations may be used, including linear frequency modulation, triangular frequency modulation, stepped frequency modulation, or phase modulation. Alternatively, the transceiver 214 may generate radar signals having a relatively constant frequency or a single tone. The transceiver 214 may be configured to support continuous-wave or pulsed radar operation.
The frequency spectrum (e.g., range of frequencies) that the transceiver 214 uses to generate radar signals may include frequencies between 1 and 400 gigahertz (GHz), between 4 and 100 GHz, between 1 and 24 GHz, between 2 and 4 GHz, between 57 and 64 GHz, or at approximately 2.4 GHz. In some cases, the frequency spectrum may be divided into multiple sub-spectra with similar or different bandwidths. The bandwidths may be on the order of 500 megahertz (MHz), 1 GHz, 2 GHz, and so forth. Different frequency sub-spectra may include, for example, frequencies between approximately 57 and 59 GHz, 59 and 61 GHz, or 61 and 63 GHz. Although the example frequency sub-spectra described above are contiguous, other frequency sub-spectra may not be contiguous. To achieve coherence, multiple frequency sub-spectra (contiguous or non-contiguous) that have the same bandwidth may be used by the transceiver 214 to generate multiple radar signals, which are transmitted simultaneously or separated in time. In some cases, multiple contiguous frequency sub-spectra may be used to transmit a single radar signal, thereby providing a radar signal with a wide bandwidth.
Radar system 102 also includes one or more system processors 216 and system media 218 (e.g., one or more computer-readable storage media). The system media 218 optionally includes a hardware abstraction module 220 and at least one circular buffer 224. The system media 218 also includes at least one reference frame Machine Learning (ML) module 222. The hardware abstraction module 220, the reference frame machine learning module 222, and the circular buffer 224 may be implemented using hardware, software, firmware, or a combination thereof. In this example, the system processor 216 implements a hardware abstraction module 220 and a reference frame machine learning module 222. The circular buffer 224 may be implemented and managed by the system processor 216 or a memory controller. Together, the hardware abstraction module 220, the reference frame machine learning module 222, and the circular buffer 224 enable the system processor 216 to process responses from the receive antenna elements in the antenna array 212 to detect a user, determine the location of an object, recognize a gesture performed by a user, and/or detect a reference frame change.
In alternative embodiments (not shown), the hardware abstraction module 220, the reference frame machine learning module 222, or the circular buffer 224 are included within the computer-readable media 204 and implemented by the computer processor 202. This enables the radar system 102 to provide raw data to the smart device 104 via the communication interface 210 so that the computer processor 202 can process the raw data for the radar-based application 206.
The hardware abstraction module 220 transforms the raw data provided by the transceiver 214 into hardware-independent complex radar data, which can be processed by the reference frame machine learning module 222. In particular, the hardware abstraction module 220 conforms complex radar data from various different types of radar signals to the expected inputs of the reference frame machine learning module 222. This enables the reference frame machine learning module 222 to process different types of radar signals received by the radar system 102, including those utilizing different modulation schemes for frequency modulated continuous wave radar, phase modulated spread spectrum radar, or pulse radar. The hardware abstraction module 220 may also normalize the complex radar data according to radar signals having different center frequencies, bandwidths, transmit power levels, or pulse widths.
Furthermore, the hardware abstraction module 220 conforms complex radar data generated using different hardware architectures. Different hardware architectures may include different antenna arrays 212 or different sets of antenna elements within antenna array 212 positioned on different surfaces of the smart device 104. By using the hardware abstraction module 220, the reference frame machine learning module 222 can process complex radar data generated by sets of antenna elements with different gains, by different numbers of antenna elements, or by sets of antenna elements with different antenna-element spacings.
Through the hardware abstraction module 220, the reference frame machine learning module 222 may operate in the radar system 102 with different constraints affecting available radar modulation schemes, transmission parameters, or hardware architecture types. The hardware abstraction module 220 is further described with respect to fig. 5 and 6.
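As a rough sketch of what such an abstraction step could look like (the function, shapes, and normalization choice are hypothetical, not the patent's implementation), the following conforms raw per-channel complex samples to a fixed shape and scale so that a downstream module always sees the same input format:

```python
import numpy as np

def abstract_hardware(raw: np.ndarray, out_channels: int = 2, out_samples: int = 64) -> np.ndarray:
    """raw: complex array of shape (channels, samples) produced by the transceiver."""
    data = np.asarray(raw, dtype=np.complex64)[:out_channels]
    # Resample each channel to a common number of samples per chirp.
    idx = np.linspace(0, data.shape[1] - 1, out_samples).round().astype(int)
    data = data[:, idx]
    # Scale so different transmit powers, gains, or pulse widths look alike downstream.
    peak = float(np.max(np.abs(data)))
    return data / peak if peak > 0 else data

# Example: 3 receive channels with 128 samples per chirp, reduced to a 2 x 64 block.
block = abstract_hardware(np.random.randn(3, 128) + 1j * np.random.randn(3, 128))
print(block.shape)  # (2, 64)
```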
The reference frame machine learning module 222 uses machine learning to analyze complex radar data, such as hardware independent complex radar data, and determines whether the reference frame of the radar system 102 is stationary or moving. In particular, the reference frame machine learning module 222 analyzes the relative motion of at least one object over time and/or compares (e.g., correlates) the relative motion of two or more objects. The reference frame machine learning module 222 may analyze amplitude and phase information of the complex radar data to improve accuracy for detecting reference frame changes. The design of the reference frame machine learning module 222 may be customized to support smart devices 104 having different amounts of available memory, different amounts of available power, or different computing capabilities. In some cases, the reference frame machine learning module 222 includes a suite of machine learning architectures that may be individually selected depending on the type of smart device 104 or radar-based application 206.
In some cases, the reference frame machine learning module 222 implements a portion of a gesture recognition module (not shown) or provides reference frame data to the gesture recognition module. The gesture recognition module may use the reference frame data to determine whether a gesture is likely to have occurred. For example, if the reference frame data indicates that the reference frame of the radar system 102 is moving, the gesture recognition module may determine that the detected user motion is due, at least in part, to the moving reference frame. Generally, a user is less likely to be performing an intentional gesture while moving the radar system 102. In this case, the gesture recognition module may decide not to perform gesture recognition in order to avoid potential false alarms. Conversely, if the reference frame data indicates that the reference frame of the radar system 102 is stationary, the gesture recognition module may determine that the detected user motion is likely to represent an intentional gesture. In this case, the gesture recognition module may decide to perform gesture recognition.
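The gating decision described above can be summarized with a small sketch; the field names and the 0.5 threshold are illustrative assumptions, not values from the patent:

```python
from dataclasses import dataclass

@dataclass
class ReferenceFrameData:
    probability_stationary: float  # e.g., output of a regression model, in [0, 1]
    probability_moving: float

def should_run_gesture_recognition(ref: ReferenceFrameData, threshold: float = 0.5) -> bool:
    # If the radar's own reference frame is likely moving, detected motion may be
    # caused by the device rather than an intentional gesture, so skip recognition
    # to avoid potential false alarms.
    return ref.probability_stationary >= threshold

print(should_run_gesture_recognition(ReferenceFrameData(0.9, 0.1)))  # True
print(should_run_gesture_recognition(ReferenceFrameData(0.2, 0.8)))  # False
```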
Additionally or alternatively, the reference frame machine learning module 222 determines one or more characteristics about the reference frame variations that describe the trajectory of the radar system 102. For example, these characteristics may be used to update a clutter map maintained by the radar system 102, or to determine absolute motion of an object by removing motion components caused by motion of the radar system 102. In some cases, the radar system 102 maps the external environment and determines the location of the smart device 104 relative to stationary objects within the external environment (such as furniture, walls, buildings, or guideboards). Using the reference frame machine learning module 222, the radar system 102 can update its known location within the external environment.
In some implementations, the radar system 102 avoids the use of other motion sensors and may be used to determine the orientation, acceleration, or position of the smart device 104. In this manner, a spatially constrained device, such as a wearable device, may utilize the radar system 102 to provide motion data in addition to data of the radar-based application 206.
The circular buffer 224 is a fixed-size memory buffer implemented using an allocated portion of memory within the system media 218. The circular buffer 224 includes a plurality of memory elements and provides a first-in-first-out queue in which data is stored and accessed sequentially using the memory elements. Once all of the memory elements store data, the oldest stored data is overwritten. In some embodiments of the reference frame machine learning module 222, the circular buffer 224 is implemented between two stages of the reference frame machine learning module 222. In this way, the circular buffer 224 stores data generated by the first stage of the reference frame machine learning module 222 and enables the second stage of the reference frame machine learning module 222 to access the stored data. The circular buffer 224 is further described with reference to fig. 7-3. The reference frame machine learning module 222 can implement, at least in part, detection of reference frame changes, with or without the hardware abstraction module 220 and the circular buffer 224, as further described with respect to fig. 2-2.
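A minimal, generic sketch of the fixed-size circular buffer behavior described here (ordinary ring-buffer code, not the patent's implementation): data is stored first-in-first-out, and once every memory element is full the oldest entry is overwritten.

```python
class CircularBuffer:
    def __init__(self, capacity: int):
        self._slots = [None] * capacity
        self._next = 0      # index of the slot that will be written next
        self._count = 0

    def push(self, item) -> None:
        self._slots[self._next] = item                  # overwrites oldest when full
        self._next = (self._next + 1) % len(self._slots)
        self._count = min(self._count + 1, len(self._slots))

    def snapshot(self) -> list:
        """Return stored items oldest-first, e.g., for a second processing stage."""
        if self._count < len(self._slots):
            return self._slots[:self._count]
        return self._slots[self._next:] + self._slots[:self._next]

buf = CircularBuffer(3)
for frame in ["f1", "f2", "f3", "f4"]:
    buf.push(frame)
print(buf.snapshot())  # ['f2', 'f3', 'f4'] -- 'f1' has been overwritten
```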
Fig. 2-2 illustrates an example implementation of the reference frame machine learning module 222. The reference frame machine learning module 222 may include one or more artificial neural networks (referred to herein as neural networks). A neural network includes groups of connected nodes (e.g., neurons or perceptrons) organized into one or more layers. As an example, the reference frame machine learning module 222 includes a deep neural network 230, which includes an input layer 232, an output layer 234, and one or more hidden layers 236 positioned between the input layer 232 and the output layer 234. The nodes of the deep neural network 230 may be partially connected or fully connected between the layers.
The input layer 232 includes a plurality of inputs 238-1, 238-2 … 238-X, where X represents a positive integer. The plurality of hidden layers 236 includes layers 240-1, 240-2 … 240-W, where W represents a positive integer. Each hidden layer 236 includes a plurality of neurons, such as neurons 242-1, 242-2 … 242-Y, where Y is a positive integer. Each neuron 242 is connected to at least one other neuron 242 in another hidden layer 236. The number of neurons 242 may be similar or different for different hidden layers 236. In some cases, a hidden layer 236 may be a copy of a previous layer (e.g., layer 240-2 may be a copy of layer 240-1). The output layer 234 includes outputs 244-1, 244-2 … 244-Z, where Z represents a positive integer. Various different deep neural networks 230 may be used with various numbers of inputs 238, hidden layers 236, neurons 242, and outputs 244.
In some cases, the deep neural network 230 is a recurrent deep neural network (e.g., a Long Short Term Memory (LSTM) recurrent deep neural network) with connections between nodes that form a loop to retain information from previous portions of the input data sequence for subsequent portions of the input data sequence. In other cases, the deep neural network 230 is a feed-forward deep neural network in which the connections between nodes do not form a loop. Additionally or alternatively, the reference frame machine learning module 222 may include another type of neural network, such as a convolutional neural network.
In general, the machine learning architecture of the reference frame machine learning module 222 may be customized based on available power, available memory, or computing power. The machine learning architecture may also be customized to implement a classification model that indicates whether a reference frame change is detected, a regression model that indicates a probability associated with the reference frame being stationary and moving, or another regression model that characterizes a reference frame change (e.g., describes the trajectory of radar system 102 in terms of distance, direction, and/or speed).
During operation, complex radar data 246 is provided to the input layer 232. The complex radar data 246 may include a complex range-Doppler map, complex interferometry data, a plurality of digital beat signals associated with a reflected radar signal, or a frequency-domain representation of the plurality of digital beat signals, as further described with respect to fig. 5. Typically, the complex radar data 246 includes a matrix (or vector) of complex numbers. In some cases, each element of the matrix is provided to one of the inputs 238-1 through 238-X. In other cases, a number of consecutive elements are combined and provided to one of the inputs 238-1 through 238-X.
Each neuron 242 in the hidden layer 236 uses an activation function to analyze a different section or portion of the complex radar data 246. A neuron 242 activates (or inversely activates) when a particular type of characteristic is detected within the complex radar data 246. Exemplary activation functions include non-linear functions, such as the hyperbolic tangent function. Towards the top of fig. 2-2, a neuron 242 is shown accepting inputs IN1W1, IN2W2 … INXWX and a bias W0, where IN1, IN2 … INX correspond to outputs of a previous input or hidden layer (e.g., layer 240-1 in fig. 2-2) and W1, W2 … WX correspond to the respective weights applied to IN1, IN2 … INX. The output (OUT) generated by the neuron 242 is determined based on an activation function f(z), where z represents the weighted sum of the inputs plus the bias W0. In the depicted example, X equals Y for a fully connected network. The output OUT may be scaled by another weight and provided as an input to another layer 240 or to the output layer 234.
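A worked numeric example of that neuron computation, with arbitrary made-up values: the weighted inputs and the bias are summed and passed through a hyperbolic-tangent activation.

```python
import math

inputs  = [0.5, -1.2, 0.3]   # IN1 ... INX from the previous layer
weights = [0.8,  0.1, -0.4]  # W1 ... WX
bias    = 0.05               # W0

z = sum(i * w for i, w in zip(inputs, weights)) + bias  # weighted sum plus bias
out = math.tanh(z)                                      # activation function f(z)
print(round(z, 3), round(out, 3))                       # 0.21 0.207
```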
At the output layer 234, the reference frame machine learning module 222 generates reference frame data 248. As described above, the reference frame data 248 may include a boolean value indicating whether a change in the reference frame is detected, a continuous value indicating a first probability that the reference frame is stationary and a second probability that the reference frame is moving, or other continuous values indicating a distance, direction, and/or speed associated with the change in the reference frame.
Through training, the reference frame machine learning module 222 can detect reference frame changes by recognizing subtle patterns in the relative motion of at least one object. Additionally or alternatively, the reference frame machine learning module 222 detects reference frame changes by comparing (e.g., correlating) the relative motion of two or more objects. The reference frame machine learning module 222 may compare different characteristics of the relative motion, including changes in position (e.g., range, azimuth, or elevation), range rate, Doppler frequency, or velocity.
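One way to picture the comparison of two objects' relative motion (a hedged NumPy sketch, not the patent's method): if the range-rate tracks of two otherwise independent objects are strongly correlated, the shared motion component plausibly comes from motion of the radar itself.

```python
import numpy as np

def common_motion_score(range_rate_a: np.ndarray, range_rate_b: np.ndarray) -> float:
    """Normalized correlation of two range-rate tracks; values near 1.0 suggest a
    shared motion component, consistent with a moving reference frame."""
    a = range_rate_a - range_rate_a.mean()
    b = range_rate_b - range_rate_b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0 else 0.0

# Two stationary objects observed while the device accelerates toward them:
shared = np.linspace(0.0, 0.4, 32)  # apparent range rate (m/s) induced by the device
score = common_motion_score(shared + 0.01 * np.random.randn(32),
                            shared + 0.01 * np.random.randn(32))
print(score)  # close to 1.0
```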
In some implementations, the reference frame machine learning module 222 relies on supervised learning and can use measured (e.g., real) data for machine learning training purposes. Typically, real data is collected where radar system 102 is moving and an object (e.g., a user) is stationary, an object (e.g., a user) is moving and radar system 102 is stationary, and both radar system 102 and the object are moving. Other scenes may also include other objects that are moving or not moving. Some of these movements are made by humans, which can be difficult to recognize for closed form signal processing techniques. Training enables the reference frame machine learning module 222 to learn a non-linear mapping function for detecting reference frame states based on the complex radar data 246. In other embodiments, the reference frame machine learning module 222 relies on unsupervised learning to determine the non-linear mapping function.
An exemplary offline training process uses a motion capture system to generate ground truth data for training the reference frame machine learning module 222. The motion capture system may include a plurality of optical sensors, such as infrared sensors or cameras, and measures the positions of multiple markers placed on different parts of a person's body, such as the arms, hands, torso, or head. Some markers are also placed on the smart device 104 or the radar system 102. As the person or radar system 102 moves, complex radar data 246 from the radar system 102 and position data from the motion capture system are recorded. The complex radar data 246 represents the training data. The position data recorded from the motion capture system is converted to a format that conforms to the reference frame data 248 and represents the ground truth data. The ground truth data and training data are synchronized in time and provided to the reference frame machine learning module 222. The reference frame machine learning module 222 generates an estimate of the reference frame data 248 based on the training data and determines an amount of error between the estimated reference frame data 248 and the ground truth data. The reference frame machine learning module 222 adjusts machine learning parameters (e.g., weights and biases) to minimize these errors. Based on this offline training process, the determined weights and biases are preprogrammed into the reference frame machine learning module 222 to enable subsequent reference frame changes to be detected using machine learning. In some cases, the offline training process can provide a relatively noise-free environment and high-resolution ground truth data for training the reference frame machine learning module 222.
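The parameter-adjustment step can be pictured with a toy supervised-training loop; this is a simplified stand-in using a single-layer logistic model and synthetic data rather than the described deep network, complex radar data, and motion-capture ground truth.

```python
import numpy as np

rng = np.random.default_rng(0)
features = rng.standard_normal((256, 8))         # stand-in for training data
w_true = rng.standard_normal(8)
truth = (features @ w_true > 0).astype(float)    # stand-in ground truth: 1 = frame moving

weights = np.zeros(8)                            # machine learning parameters (weights)
for _ in range(500):
    estimate = 1.0 / (1.0 + np.exp(-(features @ weights)))   # estimated reference frame data
    gradient = features.T @ (estimate - truth) / len(truth)  # error gradient
    weights -= 0.5 * gradient                                 # adjust weights to reduce error

print(float(np.mean((estimate > 0.5) == truth)))  # training accuracy approaches 1.0
```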
Additionally or alternatively, a real-time training process may use available sensors in the smart device 104 to generate ground truth data for training the reference frame machine learning module 222. In this case, the training process may be initiated by a user of the smart device 104. As the user moves around the smart device 104 and/or moves the smart device 104, data from one or more sensors of the smart device 104 (e.g., an accelerometer, a gyroscope, an inertial sensor, a camera, or an infrared sensor) and the complex radar data 246 generated by the radar system 102 are collected and provided to the reference frame machine learning module 222. The reference frame machine learning module 222 determines or adjusts machine learning parameters to minimize errors between the estimated reference frame data 248 and the ground truth data provided by the sensors. Using the real-time training process, the reference frame machine learning module 222 can be customized for the user, account for current environmental conditions, and account for the current position or orientation of the smart device 104.
Fig. 3-1 illustrates an example operation of radar system 102. In the depicted configuration, radar system 102 is implemented as a frequency-modulated continuous-wave radar. However, as described above with respect to fig. 2, other types of radar architectures may be implemented. In environment 300, user 302 is located at a particular slant range 304 from radar system 102. To detect user 302, radar system 102 transmits a radar transmit signal 306. At least a portion of radar transmit signal 306 is reflected by user 302. The reflected portion represents radar receive signal 308. Radar system 102 receives radar receive signal 308 and processes it to extract data for the radar-based application 206. As depicted, the amplitude of radar receive signal 308 is less than the amplitude of radar transmit signal 306 due to losses incurred during propagation and reflection.
Radar transmit signal 306 includes a sequence of chirps 310-1 through 310-N, where N represents a positive integer greater than one. The radar system 102 may transmit the chirps 310-1 to 310-N in a continuous burst, or transmit the chirps 310-1 to 310-N as time-separated pulses, as further described with respect to fig. 3-2. For example, the duration of each chirp 310-1 to 310-N may be on the order of tens to thousands of microseconds (e.g., between approximately 30 microseconds (μs) and 5 milliseconds (ms)).
The individual frequencies of the chirps 310-1 to 310-N may increase or decrease over time. In the depicted example, the radar system 102 employs a dual-slope cycle (e.g., triangular frequency modulation) to linearly increase and linearly decrease the frequencies of the chirps 310-1 to 310-N over time. The dual-slope cycle enables radar system 102 to measure the Doppler shift caused by the motion of user 302. In general, the transmission characteristics (e.g., bandwidth, center frequency, duration, and transmit power) of the chirps 310-1 to 310-N may be customized to achieve a particular detection range, range resolution, or Doppler sensitivity for detecting one or more characteristics of the user 302 or one or more actions performed by the user 302.
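For illustration only, the following synthesizes one complex baseband dual-slope (triangular) chirp; the sample rate, duration, and bandwidth are made-up values chosen to keep the example simple and representable, not parameters from the patent.

```python
import numpy as np

fs = 10e6            # baseband sample rate (assumed)
duration = 200e-6    # one chirp, within the "tens to thousands of microseconds" range
bandwidth = 4e6      # scaled-down sweep so the example stays representable at fs
t = np.arange(0, duration, 1 / fs)
half = duration / 2

# Frequency ramps up linearly for the first half and down for the second half.
freq = np.where(t < half,
                bandwidth * t / half,
                bandwidth * (duration - t) / half)
phase = 2 * np.pi * np.cumsum(freq) / fs   # integrate frequency to obtain phase
chirp = np.exp(1j * phase)                 # complex baseband chirp samples
print(len(chirp))                          # 2000 samples
```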
At radar system 102, radar receive signal 308 represents a delayed version of radar transmit signal 306. The amount of delay is proportional to the slant range 304 (e.g., distance) from the antenna array 212 of radar system 102 to user 302. Specifically, this delay represents the sum of the time it takes radar transmit signal 306 to propagate from radar system 102 to user 302 and the time it takes radar receive signal 308 to propagate from user 302 to radar system 102. If user 302 and/or radar system 102 are moving, radar receive signal 308 is shifted in frequency relative to radar transmit signal 306 due to the Doppler effect. In other words, the characteristics of radar receive signal 308 depend on the motion of user 302 and/or the motion of radar system 102. Similar to radar transmit signal 306, radar receive signal 308 is composed of one or more chirps 310-1 to 310-N.
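The two dependences described here follow the standard radar relations (generic equations and example numbers, not values from the patent): the round-trip delay is twice the slant range divided by the speed of light, and radial motion shifts the received frequency by roughly twice the radial velocity times the carrier frequency divided by the speed of light.

```python
SPEED_OF_LIGHT = 3.0e8  # m/s

def round_trip_delay_s(slant_range_m: float) -> float:
    return 2.0 * slant_range_m / SPEED_OF_LIGHT

def doppler_shift_hz(radial_velocity_mps: float, carrier_hz: float) -> float:
    return 2.0 * radial_velocity_mps * carrier_hz / SPEED_OF_LIGHT

print(round_trip_delay_s(0.3))       # an object 0.3 m away -> ~2 ns of delay
print(doppler_shift_hz(0.5, 60e9))   # 0.5 m/s toward a 60 GHz radar -> ~200 Hz shift
```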
The plurality of chirps 310-1 to 310-N cause radar system 102 to make a plurality of observations of user 302 over a predetermined period of time. The radar framing structure determines the timing of the chirps 310-1 to 310-N, as further described with respect to fig. 3-2.
Fig. 3-2 illustrates an exemplary radar framing structure 312 for detecting reference frame changes. In the depicted configuration, radar framing structure 312 includes three different types of frames. At the highest level, radar framing structure 312 includes a sequence of primary frames 314, which may be in an active state or an inactive state. Generally, the active state consumes a greater amount of power relative to the inactive state. At an intermediate level, radar framing structure 312 includes a sequence of feature frames 316, which may similarly be in an active state or an inactive state. The different types of feature frames 316 include a pulse-mode feature frame 318 (shown at the bottom left of fig. 3-2) and a burst-mode feature frame 320 (shown at the bottom right of fig. 3-2). At the lowest level, radar framing structure 312 includes a sequence of radar frames (RF) 322, which may also be in an active state or an inactive state.
Radar system 102 transmits and receives radar signals during active radar frames 322. In some cases, the radar frames 322 are analyzed separately for basic radar operations such as search and tracking, clutter map generation, user location determination, and so forth. The radar data collected during each active radar frame 322 may be saved to a buffer after completion of the radar frame 322 or provided directly to the system processor 216 of fig. 2.
Radar system 102 analyzes radar data across multiple radar frames 322 (e.g., across a group of radar frames 322 associated with active feature frame 316) to identify a particular feature. Example types of features include a particular type of motion, motions associated with a particular appendage (e.g., a hand or various fingers), and features associated with different portions of a gesture. To detect changes in the reference frame of radar system 102 or to recognize gestures performed by user 302 during primary frame 314 of activity, radar system 102 analyzes radar data associated with one or more feature frames 316 of activity.
The duration of the main frame 314 may be on the order of milliseconds or seconds (e.g., between approximately 10ms and 10 seconds). After the occurrence of active main frames 314-1 and 314-2, radar system 102 is inactive, as shown by inactive main frames 314-3 and 314-4. The duration of the inactive main frames 314-3 and 314-4 is characterized by a deep sleep time 324, which may be on the order of tens of milliseconds or longer (e.g., greater than 50 ms). In an example embodiment, the radar system 102 turns off all active components within the transceiver 214 (e.g., amplifiers, active filters, Voltage Controlled Oscillators (VCOs), voltage controlled buffers, multiplexers, analog to digital converters, Phase Locked Loops (PLLs), or crystal oscillators) to save power during the deep sleep time 324.
In the depicted radar framing structure 312, each primary frame 314 includes K feature frames 316, where K is a positive integer. If the primary frame 314 is in an inactive state, all feature frames 316 associated with the primary frame 314 are also in an inactive state. In contrast, the active main frame 314 includes J active feature frames 316 and K-J inactive feature frames 316, where J is a positive integer less than or equal to K. The number of feature frames 316 may be adjusted based on the complexity of the environment or the complexity of the gesture. For example, the primary frame 314 may include several to one hundred feature frames 316 (e.g., K may equal 2, 10, 30, 60, or 100). The duration of each feature frame 316 may be on the order of milliseconds (e.g., between approximately 1ms and 50 ms).
To conserve power, active feature frames 316-1 through 316-J occur before inactive feature frames 316-(J+1) through 316-K. The duration of the inactive feature frames 316-(J+1) through 316-K is characterized by a sleep time 326. In this manner, the inactive feature frames 316-(J+1) through 316-K occur consecutively, such that radar system 102 may be in a powered-down state for a longer duration relative to other techniques that interleave the inactive feature frames with the active feature frames 316-1 through 316-J. In general, increasing the duration of the sleep time 326 enables the radar system 102 to shut down components within the transceiver 214 that require longer startup times.
Each feature frame 316 includes L radar frames 322, where L is a positive integer that may or may not be equal to J or K. In some implementations, the number of radar frames 322 may vary across different feature frames 316 and may include several or hundreds of frames (e.g., L may equal 5, 15, 30, 100, or 500). The duration of a radar frame 322 may be on the order of tens to thousands of microseconds (e.g., between approximately 30 μs and 5 ms). The radar frames 322 within a particular feature frame 316 may be customized for a predetermined detection range, range resolution, or doppler sensitivity, which facilitates detection of a particular feature or gesture. For example, a radar frame 322 may utilize a particular type of modulation, bandwidth, frequency, transmit power, or timing. If a feature frame 316 is in an inactive state, all radar frames 322 associated with that feature frame 316 are also in an inactive state.
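To make this three-level hierarchy concrete, the following sketch models a main frame that contains K feature frames, each containing L radar frames, with the active frames grouped first as described above. It is an illustration only; the class names, fields, and default duration are assumptions and not part of the described radar system.

```python
# Illustrative model (assumptions only) of the radar framing hierarchy:
# main frame -> K feature frames -> L radar frames, each active or inactive.
from dataclasses import dataclass, field
from typing import List

@dataclass
class RadarFrame:
    active: bool
    duration_us: float            # e.g., roughly 30 us to 5 ms per the text above

@dataclass
class FeatureFrame:
    active: bool
    radar_frames: List[RadarFrame] = field(default_factory=list)      # L radar frames

@dataclass
class MainFrame:
    active: bool
    feature_frames: List[FeatureFrame] = field(default_factory=list)  # K feature frames

def make_active_main_frame(K: int, J: int, L: int, radar_frame_us: float = 1000.0) -> MainFrame:
    """Build an active main frame with J active feature frames followed by K - J
    inactive ones; radar frames inherit the inactive state of an inactive feature frame."""
    feature_frames = []
    for k in range(K):
        ff_active = k < J
        radar_frames = [RadarFrame(active=ff_active, duration_us=radar_frame_us) for _ in range(L)]
        feature_frames.append(FeatureFrame(active=ff_active, radar_frames=radar_frames))
    return MainFrame(active=True, feature_frames=feature_frames)
```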
The pulse pattern feature frame 318 and the burst mode feature frame 320 comprise different sequences of radar frames 322. In general, radar frames 322 within an active pulse pattern feature frame 318 transmit pulses that are separated in time by a predetermined amount. Spreading the observations out in time in this manner may make it easier for radar system 102 to detect changes in the reference frame, due to the larger changes observed across the chirps 310-1 to 310-N within the pulse pattern feature frame 318 relative to the burst mode feature frame 320. In contrast, radar frames 322 within an active burst mode feature frame 320 transmit pulses continuously across a portion of the burst mode feature frame 320 (e.g., the pulses are not separated by a predetermined amount of time). This enables the active burst mode feature frame 320 to consume less power than the pulse pattern feature frame 318 by shutting down a larger number of components, including components with longer startup times, as described further below.
Within each active pulse pattern feature frame 318, the sequence of radar frames 322 alternates between an active state and an inactive state. Each active radar frame 322 transmits a chirp 310 (e.g., a pulse), which is illustrated by a triangle. The duration of the chirp 310 is characterized by an active time 328. During active time 328, components within transceiver 214 are powered on. During a short idle time 330, which includes the remaining time within an active radar frame 322 and the duration of a subsequent inactive radar frame 322, radar system 102 conserves power by turning off one or more active components within transceiver 214 that have an activation time within the duration of short idle time 330.
Active burst mode feature frames 320 include P active radar frames 322 and L-P inactive radar frames 322, where P is a positive integer less than or equal to L. To conserve power, active radar frames 322-1 through 322-P occur before inactive radar frames 322-(P+1) through 322-L. The duration of inactive radar frames 322-(P+1) through 322-L is characterized by a long idle time 332. By grouping inactive radar frames 322-(P+1) through 322-L together, radar system 102 may be in a powered-down state for a longer duration relative to the short idle time 330 that occurs during the pulse pattern feature frame 318. Additionally, the radar system 102 may shut down additional components within the transceiver 214 that have an activation time that is longer than the short idle time 330 and shorter than the long idle time 332.
Each active radar frame 322 in the active burst mode feature frames 320 transmits a portion of the chirp 310. In this example, active radar frames 322-1 through 322-P alternate between transmitting a portion of the chirp 310 with increasing frequency and a portion of the chirp 310 with decreasing frequency.
Radar framing structure 312 enables power conservation through an adjustable duty cycle within each frame type. The first duty cycle 334 is based on the number of active feature frames 316(J) relative to the total number of feature frames 316 (K). The second duty cycle 336 is based on the number of active radar frames 322 (e.g., L/2 or P) relative to the total number of radar frames 322 (L). The third duty cycle 338 is based on the duration of the chirp 310 relative to the duration of the radar frame 322.
Consider an exemplary radar framing structure 312 for a power state that consumes approximately 2 milliwatts (mW) of power and has a primary frame update rate between approximately 1 and 4 hertz (Hz). In this example, radar framing structure 312 includes a primary frame 314 having a duration between approximately 250 ms and 1 second. The primary frame 314 includes thirty-one pulse pattern feature frames 318 (e.g., K equals 31). One of the thirty-one pulse pattern feature frames 318 is active. This results in a duty cycle 334 approximately equal to 3.2%. The duration of each pulse pattern feature frame 318 is between approximately 8 and 32 ms. Each pulse pattern feature frame 318 consists of eight radar frames 322 (e.g., L equals 8). Within the active pulse pattern feature frame 318, all eight radar frames 322 are active. This results in a duty cycle 336 equal to 100%. The duration of each radar frame 322 is between approximately 1 and 4 ms. The active time 328 within each active radar frame 322 is between approximately 32 and 128 μs. Thus, the resulting duty cycle 338 is approximately 3.2%. This example radar framing structure 312 has been found to yield good performance results in terms of reference frame detection, gesture recognition, and state detection, while also yielding good power-efficiency results in the context of a low-power-state handheld smartphone application. The generation of radar transmit signal 306 (of fig. 3-1) and the processing of radar receive signal 308 (of fig. 3-1) are further described with respect to fig. 4.
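As a quick, illustrative check of the arithmetic in this example (the values below simply restate the numbers given above), the three duty cycles can be computed directly:

```python
# Duty-cycle arithmetic for the example framing structure described above.
K, J = 31, 1                # feature frames per primary frame, active feature frames
L, P = 8, 8                 # radar frames per feature frame, active radar frames
active_time_us = 32.0       # lower end of the stated 32-128 us active time
radar_frame_us = 1000.0     # lower end of the stated 1-4 ms radar frame duration

duty_cycle_334 = J / K                            # ~0.032, i.e., approximately 3.2%
duty_cycle_336 = P / L                            # 1.0, i.e., 100%
duty_cycle_338 = active_time_us / radar_frame_us  # ~0.032, i.e., approximately 3.2%

print(f"feature-frame duty cycle 334: {duty_cycle_334:.1%}")
print(f"radar-frame duty cycle 336:   {duty_cycle_336:.1%}")
print(f"chirp duty cycle 338:         {duty_cycle_338:.1%}")
```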
Fig. 4 illustrates an exemplary antenna array 212 and an exemplary transceiver 214 of radar system 102. In the depicted configuration, the transceiver 214 includes a transmitter 402 and a receiver 404. The transmitter 402 includes at least one voltage controlled oscillator 406 and at least one power amplifier 408. The receiver 404 includes at least two receive channels 410-1 through 410-M, where M is a positive integer greater than 1. Each receive channel 410-1 to 410-M includes at least one low noise amplifier 412, at least one mixer 414, at least one filter 416, and at least one analog-to-digital converter 418.
Antenna array 212 includes at least one transmit antenna element 420 and at least two receive antenna elements 422-1 to 422-M. The transmit antenna element 420 is coupled to the transmitter 402. Receive antenna elements 422-1 through 422-M are coupled to receive channels 410-1 through 410-M, respectively.
During transmission, the voltage controlled oscillator 406 generates a frequency modulated radar signal 424 at radio frequency. Power amplifier 408 amplifies frequency modulated radar signal 424 for transmission via transmit antenna element 420. Transmitted frequency modulated radar signal 424 is represented by radar transmitted signal 306, which may include a plurality of chirps 310-1 through 310-N based on radar framing structure 312 of FIG. 3-2. As an example, radar-transmitted signal 306 is generated in accordance with burst-mode feature frame 320 of fig. 3-2, and radar-transmitted signal 306 includes 16 chirps 310 (e.g., N equals 16).
During reception, each receive antenna element 422-1 to 422-M receives a version of radar receive signal 308-1 to 308-M. In general, the relative phase difference between these versions of radar receive signals 308-1 through 308-M is due to the difference in the positions of receive antenna elements 422-1 through 422-M. Within each receive channel 410-1 to 410-M, low noise amplifier 412 amplifies radar receive signal 308 and mixer 414 mixes amplified radar receive signal 308 with frequency modulated radar signal 424. In particular, the mixer performs a beating operation that down-converts and demodulates radar receive signal 308 to generate beating signal 426.
The frequency of beat signal 426 represents the frequency difference between frequency modulated radar signal 424 and radar receive signal 308, which is proportional to tilt range 304 of fig. 3-1. Although not shown, beat signal 426 may include multiple frequencies that represent reflections from different portions of user 302 (e.g., different fingers, different portions of a hand, or different body parts). In some cases, these different portions move at different speeds, move in different directions, or are positioned at different tilt ranges relative to radar system 102.
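For readers less familiar with frequency-modulated continuous-wave processing, the following sketch illustrates the standard dechirp relationship behind beat signal 426: mixing the received chirp with the transmitted chirp yields a tone whose frequency is proportional to the round-trip delay, and therefore to range. The chirp bandwidth, duration, sample rate, and range below are assumptions chosen for illustration and are not taken from this document.

```python
# Standard FMCW beat-frequency relationship: f_beat ~= slope * (2 * R / c).
import numpy as np

c = 3e8                                   # speed of light, m/s
bandwidth_hz = 4e9                        # assumed chirp bandwidth
chirp_duration_s = 128e-6                 # assumed chirp duration
slope = bandwidth_hz / chirp_duration_s   # chirp slope, Hz/s

def beat_frequency(range_m: float) -> float:
    """Beat frequency produced by a stationary reflector at the given range."""
    round_trip_delay_s = 2.0 * range_m / c
    return slope * round_trip_delay_s

# Analytic per-sample model of the mixing (beating) operation for one reflector.
fs = 2e6                                         # assumed beat-signal sample rate
t = np.arange(0.0, chirp_duration_s, 1.0 / fs)
delay = 2.0 * 0.3 / c                            # object (e.g., a hand) 0.3 m away
tx_phase = np.pi * slope * t**2
rx_phase = np.pi * slope * (t - delay)**2
beat = np.exp(1j * (rx_phase - tx_phase))        # tone whose frequency magnitude is
                                                 # beat_frequency(0.3), about 62.5 kHz here
```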
Filter 416 filters beat signal 426 and analog-to-digital converter 418 digitizes filtered beat signal 426. Receive channels 410-1 through 410-M generate digital beat signals 428-1 through 428-M, respectively, that are provided to system processor 216 for processing. The receive channels 410-1 through 410-M of the transceiver 214 are coupled to the system processor 216 as shown in fig. 5.
Fig. 5 illustrates an example scheme implemented by radar system 102 for detecting a change in a reference frame. In the depicted configuration, the system processor 216 implements a hardware abstraction module 220 and a reference frame machine learning module 222. System processor 216 is coupled to receive channels 410-1 through 410-M. The system processor 216 may also be in communication with the computer processor 202. Although not shown, the hardware abstraction module 220 and/or the reference frame machine learning module 222 may be implemented by the computer processor 202.
In this example, the reference frame machine learning module 222 is implemented using a spatio-temporal neural network 500 comprising a multi-stage machine learning architecture. In the first stage, the spatial recursive network analyzes the complex radar data 246 over the spatial domain to generate feature data. The feature data identifies one or more features (e.g., characteristics) associated with the motion trajectory of the at least one object. The feature data is stored in memory elements of the circular buffer 224 for at least a portion of the time. Over time, other feature data is stored in other memory elements of the circular buffer 224. The other feature data may correspond to the same object or another object. In the second stage, the temporal recursive network analyzes the feature data across multiple memory elements within the circular buffer 224 to detect the reference frame change. The spatio-temporal neural network 500 is further described with respect to fig. 7-1 through 7-4.
This multi-stage design enables radar system 102 to conserve power and detect reference frame changes in real-time (e.g., while performing gestures). For example, using the circular buffer 224 enables the radar system 102 to save power and memory by mitigating the need to regenerate the feature data or store the complex radar data 246. Storing feature data instead of complex radar data 246 also reduces the computation time for detecting reference frame changes. The spatio-temporal neural network 500 is also suitable and can be extended to detect reference frame changes in a variety of different situations without significantly increasing the size, computational requirements, or latency. Additionally, the spatiotemporal neural network 500 may be customized to recognize multiple types of gestures, such as swipe gestures and reach gestures.
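A minimal sketch of such a circular buffer is shown below; it is an assumption about one possible implementation, not the patent's. Feature vectors from successive active feature frames overwrite the oldest entries, so the temporal stage can read a fixed-size window without regenerating or retaining the underlying complex radar data.

```python
import numpy as np

class CircularFeatureBuffer:
    """Fixed-size buffer of feature vectors; the oldest entries are overwritten first."""

    def __init__(self, num_slots: int, feature_dim: int):
        self.data = np.zeros((num_slots, feature_dim), dtype=np.float32)
        self.next_slot = 0
        self.count = 0

    def push(self, feature_vector: np.ndarray) -> None:
        self.data[self.next_slot] = feature_vector
        self.next_slot = (self.next_slot + 1) % self.data.shape[0]
        self.count = min(self.count + 1, self.data.shape[0])

    def compiled(self) -> np.ndarray:
        """Return the stored feature vectors in oldest-to-newest order."""
        if self.count < self.data.shape[0]:
            return self.data[:self.count]
        return np.roll(self.data, -self.next_slot, axis=0)
```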
In this example, the hardware abstraction module 220 accepts digital beat signals 428-1 through 428-M from the receive channels 410-1 through 410-M. Digital beat signals 428-1 through 428-M represent raw or unprocessed complex radar data. Hardware abstraction module 220 performs one or more operations based on digital beat signals 428-1 through 428-M to generate hardware-independent complex radar data 502-1 through 502-M. The hardware abstraction module 220 transforms the complex radar data provided by the digital beat signals 428-1 to 428-M into the form expected by the spatio-temporal neural network 500. In some cases, the hardware abstraction module 220 normalizes amplitudes associated with different transmit power levels or transforms complex radar data into a frequency domain representation.
The hardware-independent complex radar data 502-1 to 502-M includes amplitude and phase information (e.g., in-phase and quadrature components). In some embodiments, the hardware-independent complex radar data 502-1 to 502-M includes a range-Doppler map for each receive channel 410-1 to 410-M and for a particular active feature frame 316. In other embodiments, the hardware-independent complex radar data 502-1 to 502-M includes complex interferometric data that is an orthogonal representation of a range-Doppler map. As another example, the hardware-independent complex radar data 502-1 to 502-M includes a frequency domain representation of the digital beat signals 428-1 to 428-M for the active feature frame 316. Although not shown, other embodiments of radar system 102 may provide digital beat signals 428-1 through 428-M directly to the spatio-temporal neural network 500.
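As one hedged illustration of the normalization mentioned above, a hardware abstraction step might rescale each channel's complex frame so that differences in transmit power do not leak into the learned model. The specific scaling below is an assumption for illustration, not the document's method.

```python
import numpy as np

def normalize_complex_radar_data(frame: np.ndarray, tx_power_scale: float) -> np.ndarray:
    """frame: complex array (e.g., chirps x range bins) for one receive channel."""
    scaled = frame / tx_power_scale          # undo transmit-power differences
    peak = np.max(np.abs(scaled))
    if peak == 0.0:
        return scaled                        # empty frame; nothing to normalize
    return scaled / peak                     # unit-peak magnitude, phase preserved
```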
The spatio-temporal neural network 500 uses a trained regression or classification model to analyze the hardware-independent complex radar data 502-1 to 502-M and generate reference frame data 248. Although described with respect to motion sensing, the training process performed by the spatio-temporal neural network 500 and the machine learning architecture of the spatio-temporal neural network 500 may be adapted to support other types of applications, including presence detection, gesture recognition, collision avoidance, and human vital sign detection. An example embodiment of the spatio-temporal neural network 500 is further described with respect to fig. 7-1 through 7-4.
FIG. 6 illustrates an example hardware abstraction module 220 for detecting reference frame changes. In the depicted configuration, the hardware abstraction module 220 includes a pre-processing stage 602 and a signal transformation stage 604. The pre-processing stage 602 operates on each chirp 310-1 to 310-N within the digital beat signals 428-1 to 428-M. In other words, the pre-processing stage 602 performs operations on each active radar frame 322. In this example, the pre-processing stage 602 includes M one-dimensional (1D) Fast Fourier Transform (FFT) modules that process the digital beat signals 428-1 to 428-M, respectively. Other types of modules performing similar operations are also possible, such as a Fourier transform module.
For simplicity, the hardware abstraction module 220 is shown as processing a digital beat signal 428-1 associated with the receive channel 410-1. Digital beat signal 428-1 includes chirps 310-1 to 310-N, which are time domain signals. The chirps 310-1 to 310-N are passed to the one-dimensional FFT module 606-1 in the order in which they are received and processed by the transceiver 214. Assuming that the radar receive signals 308-1 to 308-M include 16 chirps 310-1 to 310-N (e.g., N is equal to 16), the one-dimensional FFT module 606-1 performs 16 FFT operations to generate the pre-processed complex radar data 612-1 for each chirp.
Signal transformation stage 604 operates on the sequence of chirps 310-1 to 310-N within each digital beat signal 428-1 to 428-M. In other words, the signal transformation stage 604 performs an operation on each active feature frame 316. In this example, the signal transformation stage 604 includes M buffers and M two-dimensional (2D) FFT modules. For simplicity, the signal transformation stage 604 is shown to include a buffer 608-1 and a two-dimensional FFT module 610-1.
The buffer 608-1 stores a first portion of the pre-processed complex radar data 612-1 associated with the first chirp 310-1. The one-dimensional FFT module 606-1 continues to process subsequent chirps 310-2 through 310-N, and the buffer 608-1 continues to store corresponding portions of the pre-processed complex radar data 612-1. The process continues until the buffer 608-1 stores the last portion of the pre-processed complex radar data 612-1 associated with the chirp 310-N.
At this point, buffer 608-1 stores preprocessed complex radar data 614-1 associated with a particular feature frame. The pre-processed complex radar data 614-1 for each feature frame represents amplitude information (as shown) and phase information (not shown) across different chirps 310-1 through 310-N and across different range blocks (range bins) 616-1 through 616-A, where A represents a positive integer.
The two-dimensional FFT 610-1 accepts the pre-processed complex radar data 614-1 per feature frame and performs a two-dimensional FFT operation to form hardware-independent complex radar data 502-1 representing a range-Doppler plot. The range-Doppler plot includes complex radar data for range blocks 616-1 through 616-A and Doppler blocks 618-1 through 618-B, where B represents a positive integer. In other words, each range block 616-1 through 616-A and Doppler block 618-1 through 618-B comprises a complex number having a real and imaginary part that together represent amplitude and phase information. The number of range blocks 616-1 to 616-A may be on the order of tens or hundreds, such as 64 or 128 (e.g., A equals 64 or 128). The number of Doppler blocks may be on the order of tens or hundreds, such as 32, 64, or 124 (e.g., B equals 32, 64, or 124). The hardware-independent complex radar data 502-1, along with the hardware-independent complex radar data 502-2 through 502-M generated by the other receive channels, is provided to the spatio-temporal neural network 500, as shown in fig. 7-1.
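The per-chirp range FFT followed by a Doppler FFT across the chirp sequence can be sketched as follows. The array shapes, the use of NumPy, and the real-valued input are assumptions for illustration; the actual module operates on the digital beat signals 428 described above.

```python
import numpy as np

def range_doppler_map(beat_samples: np.ndarray) -> np.ndarray:
    """beat_samples: real samples of shape (num_chirps, samples_per_chirp) for one
    receive channel and one feature frame. Returns a complex range-Doppler map of
    shape (num_doppler_bins, num_range_bins)."""
    range_fft = np.fft.rfft(beat_samples, axis=1)   # one range FFT per chirp (pre-processing stage)
    doppler_fft = np.fft.fft(range_fft, axis=0)     # FFT across the chirp sequence (signal transformation stage)
    return np.fft.fftshift(doppler_fft, axes=0)     # center zero Doppler

chirps = np.random.randn(16, 256)       # e.g., N = 16 chirps per active feature frame
rd_map = range_doppler_map(chirps)      # amplitude and phase per range/Doppler block
```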
Fig. 7-1 illustrates an example spatio-temporal neural network 500 for detecting reference frame changes. In the depicted configuration, the spatio-temporal neural network 500 includes two stages implemented by a spatial recursive network 702 and a temporal recursive network 704, respectively. The spatial recursive network 702 includes a chirp level analysis module 706 and a feature level analysis module 708. In general, the spatial recursive network 702 analyzes the hardware-independent complex radar data 502-1 to 502-M in the spatial domain for each active feature frame 316. The resulting data is stored by the circular buffer 224. The temporal recursive network 704 includes a main-level analysis module 710 that analyzes the data stored within the circular buffer 224 for two or more active feature frames 316. In this manner, the temporal recursive network 704 analyzes data in the time domain for at least a portion of the active primary frame 314.
During reception, the chirp level analysis module 706 processes the complex radar data for each range block 616-1 to 616-A to generate channel doppler data 712. The feature level analysis module 708 analyzes the channel doppler data 712 to generate feature data 714, the feature data 714 characterizing one or more features used to detect the reference frame change. These features may include relative motion of one or more objects detected by radar system 102. The circular buffer 224 stores the feature data 714.
Over time, the circular buffer 224 stores feature data 714 associated with different active feature frames 316. The feature data 714 associated with two or more active feature frames 316 is referred to as compiled feature data 716. Compiled feature data 716 is provided to the main-level analysis module 710 or is accessed by the main-level analysis module 710. The main-level analysis module 710 analyzes the compiled feature data 716 to generate reference frame data 248. By way of example, reference frame data 248 includes a prediction as to whether the reference frame is moving. As feature data 714 associated with a larger number of active feature frames 316 is stored by the circular buffer 224, the accuracy of the prediction improves. In some cases, the main-level analysis module 710 continuously generates or updates reference frame data 248 as the spatial recursive network 702 processes subsequent feature frames 316 associated with primary frame 314. Alternatively, the main-level analysis module 710 delays the generation of the reference frame data 248 until all feature frames 316 associated with the main frame 314 have been processed by the spatial recursive network 702. Embodiments of the chirp level analysis module 706, the feature level analysis module 708, and the main-level analysis module 710 are further described with respect to fig. 7-2 through 7-4.
Fig. 7-2 illustrates an example chirp level analysis module 706 of the spatio-temporal neural network 500. In the depicted configuration, the chirp level analysis module 706 includes channel doppler processing modules 718-1 through 718-A. Each channel doppler processing module 718-1 through 718-A includes a neural network having one or more layers 720-1 through 720-Q, where Q is a positive integer. The value of Q may vary depending on the implementation. As an example, Q may be equal to 2, 4, or 10. The layers 720-1 through 720-Q may be fully connected or partially connected. For example, the nodes within the layers 720-1 through 720-Q perform a non-linear rectifier activation function. Channel doppler processing modules 718-1 through 718-A may also perform additions and multiplications.
Channel doppler processing modules 718-1 through 718-A accept respective portions of the hardware-independent complex radar data 502-1 through 502-M according to range blocks 616-1 through 616-A. In particular, the channel Doppler processing module 718-1 receives complex radar data associated with the first range block 616-1 across all of the receive channels 410-1 through 410-M and across all of the Doppler blocks 618-1 through 618-B. Each complex number is provided as an input to a respective node of layer 720-1. The layers 720-1 through 720-Q analyze the data using a non-linear rectifier activation function to generate channel doppler data 712-1. Channel doppler processing modules 718-2 through 718-A also perform similar operations. The combined channel doppler data 712-1 to 712-A represents a vector. For example, assuming there are three receive channels 410-1 through 410-M (e.g., M equals 3), 32 Doppler blocks 618-1 through 618-B (e.g., B equals 32), and 16 range blocks 616-1 through 616-A (e.g., A equals 16), the channel Doppler data 712-1 through 712-A forms a 1x16 vector of values that represents the relationship across the receive channels in the Doppler domain, enabling the feature level analysis module 708 to identify one or more features associated with reference frame changes.
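A toy version of one channel doppler processing module is sketched below: the complex values for a single range block, across all receive channels and Doppler blocks, are flattened into real and imaginary parts and passed through fully connected layers with a rectifier activation. The layer sizes and the random, untrained weights are assumptions; a deployed module would use trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense_relu(x: np.ndarray, w: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Fully connected layer followed by a rectifier (ReLU) activation."""
    return np.maximum(w @ x + b, 0.0)

M, B = 3, 32                    # receive channels and Doppler blocks (example values from the text)
in_dim = 2 * M * B              # real and imaginary parts of every complex input
w1, b1 = rng.normal(size=(64, in_dim)), np.zeros(64)
w2, b2 = rng.normal(size=(1, 64)), np.zeros(1)

def channel_doppler_value(range_block_data: np.ndarray) -> float:
    """range_block_data: complex array of shape (M, B) for one range block."""
    x = np.concatenate([range_block_data.real.ravel(), range_block_data.imag.ravel()])
    return dense_relu(dense_relu(x, w1, b1), w2, b2)[0]

# Applying channel_doppler_value() to each of the A range blocks yields the
# 1 x A vector of channel doppler data described above.
```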
Fig. 7-3 illustrates an exemplary feature level analysis module 708 of the spatio-temporal neural network 500. In the depicted configuration, the feature level analysis module 708 is implemented using one or more recursive layers 722-1 through 722-V, where V represents a positive integer. Within the recursive layers 722-1 through 722-V, the connections between nodes form a cycle that retains information from a previous active feature frame 316 for a subsequent active feature frame 316. For example, using recursive layers 722-1 through 722-V, the feature level analysis module 708 may implement a Long Short Term Memory (LSTM) network.
As described above, the feature level analysis module 708 processes the channel doppler data 712-1 through 712-A across the range blocks 616-1 through 616-A to generate feature data 714. Although not shown, some embodiments of the spatial recursive network 702 may include additional fully-connected layers 720 connected to the output of the recursive layer 722-V. Similar to the layers 720 of the chirp level analysis module 706, these layers 720 may also perform nonlinear transformations.
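The recursion across active feature frames can be illustrated with a generic LSTM step; this is a standard textbook formulation written as a sketch, not the trained network described in this document, and the parameter shapes are assumptions.

```python
import numpy as np

def sigmoid(x: np.ndarray) -> np.ndarray:
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step. x: input for the current feature frame; h_prev, c_prev:
    state carried over from the previous active feature frame. W has shape
    (4 * hidden, input), U has shape (4 * hidden, hidden), b has shape (4 * hidden,);
    the four blocks hold the input, forget, cell, and output gate parameters."""
    z = W @ x + U @ h_prev + b
    i, f, g, o = np.split(z, 4)
    c = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)   # updated cell state
    h = sigmoid(o) * np.tanh(c)                         # updated hidden state
    return h, c
```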
Over time, the feature data 714-1 through 714-J associated with the active feature frames 316-1 through 316-J is sequentially stored in different memory elements of the circular buffer 224. The feature data 714-1 through 714-J represents the compiled feature data 716 processed by the main-level analysis module 710, as further described with respect to fig. 7-4.
Fig. 7-4 illustrates an example main-level analysis module 710 of the spatio-temporal neural network 500. In the illustrated configuration, the main-level analysis module 710 is implemented using one or more recursive layers 724-1 through 724-T, where T represents a positive integer that may or may not be equal to V. For example, using recursive layers 724-1 through 724-T, the main-level analysis module 710 may implement a Long Short Term Memory (LSTM) network.
As described above, the main-level analysis module 710 processes two or more of the feature data 714-1 through 714-J stored within the circular buffer 224. For example, the main-level analysis module 710 forms a prediction as to whether the reference frame is moving based on two or more of the feature data 714-1 through 714-J. In some cases, the main-level analysis module 710 may wait to form a prediction until a specified amount of feature data 714-1 through 714-J is available, such as 15. If the active main frame 314 includes more than 15 active feature frames 316 (e.g., J is greater than 15), the main-level analysis module 710 may proceed to update its prediction based on the last 15 active feature frames 316. Generally, the accuracy of the prediction may increase over time or as a greater amount of the feature data 714-1 through 714-J is analyzed.
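Building on the LSTM step sketched earlier, a main-level prediction over the compiled feature data might look like the following. The output head, the oldest-to-newest iteration, and the 0.5 decision threshold are assumptions for illustration only.

```python
import numpy as np

def predict_reference_frame_moving(compiled_features, lstm_step, lstm_params, w_out, b_out) -> float:
    """compiled_features: array of shape (num_feature_frames, feature_dim), e.g. the
    15 most recent entries of the circular buffer in oldest-to-newest order.
    Returns a probability that the reference frame is moving."""
    hidden_dim = w_out.shape[1]
    hidden = np.zeros(hidden_dim)
    cell = np.zeros(hidden_dim)
    for feature_vector in compiled_features:
        hidden, cell = lstm_step(feature_vector, hidden, cell, *lstm_params)
    logit = (w_out @ hidden + b_out)[0]        # single output unit
    return 1.0 / (1.0 + np.exp(-logit))

# Example decision rule: treat probabilities above 0.5 as a moving reference frame.
```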
Example method
Fig. 8 depicts an example method 800 for performing operations of a smart device-based radar system capable of detecting reference frame changes. The method 800 is illustrated as a collection of operations (or acts) that are performed, but are not necessarily limited to the order or combination of operations shown herein. Moreover, any one or more of the operations may be repeated, combined, re-combined, or linked to provide various additional and/or alternative methods. In portions of the following discussion, reference may be made to environments 100-1 through 100-8 of FIG. 1, entities detailed in FIGS. 2, 4, or 5, reference to which is made only as an example. The techniques are not limited to being performed by one entity or multiple entities operating on one device.
At 802, a first radar transmit signal is transmitted using an antenna array of a radar system. For example, as shown in fig. 4, radar system 102 transmits radar transmit signal 306 using at least one transmit antenna element 420. In some embodiments, radar-transmitted signal 306 includes a plurality of chirps 310-1 to 310-N that are frequency modulated, as shown in FIG. 3-1.
At 804, a first radar receive signal is received using the antenna array. The first radar receive signal comprises a version of the first radar transmit signal reflected by the at least one user. For example, radar system 102 uses at least one receive antenna element 422 to receive radar receive signal 308, which includes a version of radar transmit signal 306 reflected by user 302, as shown in fig. 4.
At 806, complex radar data is generated based on the first radar receive signal. For example, receive channel 410 of radar system 102 generates digital beat signal 428 based on radar receive signal 308. The digital beat signal represents complex radar data 246 and includes amplitude and phase information.
At 808, the complex radar data is analyzed using machine learning to detect changes in a reference frame of the radar system. For example, the reference frame machine learning module 222 analyzes the digital beat signals 428-1 to 428-M or the hardware-independent complex radar data 502-1 to 502-M to generate reference frame data 248, which provides information about the reference frame of the radar system 102. This information may indicate whether the reference frame is stationary, whether the reference frame is moving, and/or characteristics of the reference frame change (e.g., distance, direction, velocity). Although described with respect to motion sensing, similar operations may be performed for other applications as well, including presence detection, gesture recognition, collision avoidance, vital sign detection, and so forth.
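The end-to-end flow of method 800 can be summarized with the sketch below. All function names, the model interface, and the threshold are assumptions introduced for illustration; the model stands in for the reference frame machine learning module 222.

```python
import numpy as np

def generate_complex_radar_data(beat_samples: np.ndarray) -> np.ndarray:
    """Step 806 (sketch): per-chirp range FFT followed by a Doppler FFT for one channel."""
    return np.fft.fftshift(np.fft.fft(np.fft.rfft(beat_samples, axis=1), axis=0), axes=0)

def detect_reference_frame_change(per_channel_beat_samples, model, threshold: float = 0.5) -> bool:
    """Step 808 (sketch): analyze complex radar data with a machine-learned model,
    assumed to expose a probability_moving() method, and report a reference frame change."""
    complex_radar_data = [generate_complex_radar_data(ch) for ch in per_channel_beat_samples]
    return model.probability_moving(complex_radar_data) > threshold
```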
Example computing System
Fig. 9 illustrates various components of an example computing system 900, the example computing system 900 being capable of operating as any type of client, server, and/or electronic device as described with reference to previous fig. 2 to detect a reference frame change.
The computing system 900 includes a communication device 902, the communication device 902 enabling wired and/or wireless communication of device data 904 (e.g., received data, data being received, data scheduled for broadcast, or data packets of data). Although not shown, communication device 902 or computing system 900 may include one or more radar systems 102. The device data 904 or other device content can include configuration settings of the device, media content stored on the device, and/or information associated with the user 302 of the device. Media content stored on computing system 900 can include any type of audio, video, and image data. Computing system 900 includes one or more data inputs 906 via which any type of data, media content, and/or inputs can be received, such as human speech, radar-based applications 206, user-selectable inputs (explicit or implicit), messages, music, television media content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source.
Computing system 900 also includes communication interfaces 908 that can be implemented as any one or more of a serial and/or parallel interface, a wireless interface, any type of network interface, a modem, and as any other type of communication interface. Communication interfaces 908 provide a connection and/or communication links between computing system 900 and a communication network by which other electronic, computing, and communication devices communicate data with computing system 900.
The computing system 900 includes one or more processors 910 (e.g., any of microprocessors, controllers, and the like), which processors 910 process various computer-executable instructions to control the operation of the computing system 900 and to enable techniques for, or in which, detecting a reference frame change may be embodied. Alternatively or in addition, the computing system 900 can be implemented with any one or combination of hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits which are generally identified at 912. Although not shown, the computing system 900 can include a system bus or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
Computing system 900 also includes computer-readable media 914, such as one or more memory devices capable of persistent and/or non-transitory data storage (i.e., as opposed to mere signal transmission), examples of which include Random Access Memory (RAM), non-volatile memory (e.g., any one or more of a read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. A disk storage device may be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable Compact Disc (CD), any type of a Digital Versatile Disc (DVD), and so forth. The computing system 900 can also include a mass storage media device (storage media) 916.
Computer-readable media 914 provides data storage mechanisms for storing the device data 904, as well as various device applications 918 and any other types of information and/or data related to operational aspects of the computing system 900. For example, an operating system 920 can be maintained as a computer application with the computer-readable media 914 and executed on processors 910. The device applications 918 may include a device manager, such as any form of a control application, software application, signal processing and control module, code that is native to a particular device, a hardware abstraction module layer for a particular device, and so forth.
The device applications 918 also include any system components, engines, or managers for detecting changes in the reference frame. In this example, the device applications 918 include the radar-based application 206 and the reference frame machine learning module 222 of fig. 2.
Conclusion
Although techniques using, and apparatuses including, a smart device-based radar system that detects reference frame changes have been described in language specific to features and/or methods, it is to be understood that the subject of the appended claims is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as example implementations of a smart device-based radar system that detects changes in a reference frame.
Some examples are described below.
Example 1: a method performed by a radar system, the method comprising:
transmitting a first radar transmission signal using an antenna array of the radar system;
receiving a first radar receive signal using the antenna array, the first radar receive signal comprising a version of the first radar transmit signal reflected by at least one object;
generating complex radar data based on the first radar receive signal; and
analyzing the complex radar data using machine learning to detect a change in a reference frame of the radar system.
Example 2: the method of example 1, wherein the at least one object comprises at least one user,
the method further comprises:
determining that the radar system is moving based on the detected change in the reference frame of the radar system; and
in response to determining that the radar system is moving, determining that the at least one user is not performing a gesture.
Example 3: the method of example 1 or example 2, further comprising:
transmitting a second radar transmission signal using the antenna array of the radar system;
receiving a second radar receive signal using the antenna array, the second radar receive signal comprising a version of the second radar transmit signal reflected by at least one other object;
generating other complex radar data based on the second radar receive signal; and
analyzing the other complex radar data using the machine learning to determine that the radar system is stationary.
Example 4: the method of example 3, wherein the at least one other object includes at least one other user,
the method further comprises:
in response to determining that the radar system is stationary, recognizing a gesture performed by the at least one other user based on the other complex radar data.
Example 5: the method of any preceding example, wherein:
the at least one object comprises a first object and a second object; and is
Analyzing the complex radar data includes:
determining relative motion of the first object based on the complex radar data;
determining relative motion of the second object based on the complex radar data; and
detecting a change in a reference frame of the radar system by comparing relative motion of the first object to relative motion of the second object using the machine learning.
Example 6: the method of example 5, wherein:
the first object is stationary and the second object is stationary;
the first object is stationary and the second object is moving; or
The first object is moving and the second object is moving.
Example 7: the method of example 5, wherein determining the relative motion of the first object, determining the relative motion of the second object, and detecting the change in the reference frame of the radar system comprises:
analyzing the complex radar data over a spatial domain using a spatial recursive network to generate feature data; and
analyzing the feature data in a time domain using a time-recursive network to recognize the gesture.
Example 8: the method of example 7, further comprising:
storing the feature data in a circular buffer; and
accessing the feature data stored in the circular buffer over the time-recursive network.
Example 9: the method of examples 7 or 8, wherein analyzing the complex radar data in a spatial domain comprises:
separately processing portions of the complex radar data associated with different range blocks using a nonlinear activation function to generate channel doppler data for each range block; and
analyzing the channel Doppler data across the different range blocks to generate the feature data.
Example 10: the method of any of examples 7 to 9, wherein analyzing the feature data in the time domain comprises: forming a prediction regarding movement of a reference frame of the radar system and a likelihood that the reference frame of the radar system is stationary.
Example 11: the method of any preceding example, wherein analyzing the complex radar data comprises: analyzing amplitude and/or phase information of the complex radar data using the machine learning.
Example 12: the method of any preceding example, wherein the complex radar data comprises an in-phase component and a quadrature component.
Example 13: the method of any preceding example, wherein the complex radar data comprises at least one of:
a complex range doppler plot;
complex interferometric data;
a plurality of digital beat signals associated with the radar receive signal; or
A frequency domain representation of the plurality of digital beat signals.
Example 14: the method of any preceding example, wherein the first radar-transmitted signal comprises at least one chirp.
Example 15: the method of any preceding example, wherein the machine learning uses at least one artificial neural network, in particular at least one artificial neural network with a deep neural network, in particular with a recursive deep neural network.
Example 16: the method of any preceding example, wherein the machine learning comprises supervised learning, in particular using real data for machine learning training purposes.
Example 17: the method of any preceding example, wherein the machine learning comprises offline training and/or real-time training.
Example 18: the method of example 1, wherein:
the radar system is part of a smart device; and
the smart device does not include an inertial sensor or use the inertial sensor to detect a change in a reference frame of the radar system.
Example 19: an apparatus, comprising:
a radar system, comprising:
an antenna array;
a transceiver; and
a processor and a computer readable storage medium configured to perform the method according to any one of examples 1 to 17.
Example 20: the apparatus of example 19, wherein the apparatus comprises a smart device comprising one of:
a smart phone;
a smart watch;
an intelligent speaker;
an intelligent thermostat;
a security camera;
a vehicle; or
A household appliance.
Example 21: the apparatus of example 19, wherein:
the apparatus comprises a smart device; and is
The smart device does not include an inertial sensor or use the inertial sensor to detect a change in a reference frame of the radar system.
Example 22: a computer-readable storage medium comprising computer-executable instructions that, in response to execution by a processor, implement:
a reference frame machine learning module configured to:
accepting complex radar data associated with radar receive signals reflected by at least one object;
analyzing the complex radar data using machine learning to generate reference frame data; and
determining, based on the reference frame data, whether an antenna array that received the radar receive signal is stationary or moving.
Example 23: the computer-readable storage medium of example 22, wherein the reference frame machine learning module is further configured to analyze both amplitude and phase information of the complex radar data to generate the reference frame data.
Example 24: the computer-readable storage medium of example 22 or example 23, wherein the reference frame machine learning module is further configured to:
analyzing the complex radar data over a spatial domain using the machine learning to generate feature data; and
analyzing the feature data in a time domain using the machine learning to generate radar application data.
Example 25: the computer readable storage medium of any of examples 22 to 24, wherein the computer executable instructions, in response to execution by the processor, implement a hardware abstraction module configured to:
generating hardware-independent complex radar data based on the complex radar data; and
providing the hardware-independent complex radar data to the reference frame machine learning module as the complex radar data.
Example 26: the computer-readable storage medium of example 22, wherein:
the computer-readable storage medium and the processor are part of a smart device; and
the smart device does not include an inertial sensor or use the inertial sensor to detect a change in a reference frame of the radar system.

Claims (20)

1. A method performed by a radar system, the method comprising:
transmitting a first radar transmission signal;
receiving a first radar receive signal comprising a version of the first radar transmit signal reflected by at least one object;
generating complex radar data based on the first radar receive signal; and
analyzing the complex radar data using machine learning to detect a change in a reference frame of the radar system.
2. The method of claim 1, wherein the at least one object comprises at least one user, the method further comprising:
determining that the radar system is moving based on the detected change in the reference frame of the radar system; and
in response to determining that the radar system is moving, determining that the at least one user is not performing a gesture.
3. The method of claim 1, further comprising:
transmitting a second radar transmission signal;
receiving a second radar receive signal comprising a version of the second radar transmit signal reflected by at least one other object;
generating second complex radar data based on the second radar receive signal; and
analyzing the second complex radar data using the machine learning to determine that the radar system is stationary.
4. The method of claim 3, wherein the at least one other object comprises at least one other user, the method further comprising:
in response to determining that the radar system is stationary, recognizing a gesture performed by the at least one other user based on the second complex radar data.
5. The method of claim 1, wherein:
the at least one object comprises a first object and a second object; and is
Analyzing the complex radar data includes:
determining relative motion of the first object based on the complex radar data;
determining relative motion of the second object based on the complex radar data; and
detecting a change in a reference frame of the radar system by comparing relative motion of the first object to relative motion of the second object using the machine learning.
6. The method of claim 5, wherein:
the first object is stationary and the second object is stationary;
the first object is stationary and the second object is moving; or
The first object is moving and the second object is moving.
7. The method of claim 5, wherein determining the relative motion of the first object, determining the relative motion of the second object, and detecting a change in a reference frame of the radar system comprises:
analyzing the complex radar data over a spatial domain using a spatial recursive network to generate feature data; and
analyzing the feature data in a time domain using a time-recursive network to recognize a gesture performed by the at least one user.
8. The method of claim 7, further comprising:
storing the feature data in a circular buffer; and
accessing, by the time-recursive network, the feature data stored in the circular buffer.
9. The method of claim 7, wherein analyzing the complex radar data in a spatial domain comprises:
separately processing portions of the complex radar data associated with different range blocks using a nonlinear activation function to generate channel doppler data for each range block; and
analyzing the channel Doppler data across the different range blocks to generate the feature data.
10. The method of claim 7, wherein analyzing the feature data in the time domain comprises: forming a prediction regarding a likelihood that a reference frame of the radar system is moving and that the reference frame of the radar system is stationary.
11. The method of claim 1, wherein the complex radar data comprises at least one of:
a complex range doppler plot;
complex interferometric data;
a plurality of digital beat signals associated with the first radar receive signal; or
A frequency domain representation of the plurality of digital beat signals.
12. The method of claim 1, wherein:
the machine learning uses at least one artificial neural network;
the machine learning comprises supervised learning; and
the machine learning includes offline training or real-time training.
13. The method of claim 1, wherein:
the radar system is part of a smart device; and
the smart device does not include an inertial sensor or use the inertial sensor to detect a change in a reference frame of the radar system.
14. An apparatus, comprising:
a radar system, comprising:
an antenna array;
a transceiver; and
a processor and computer readable storage medium configured to perform the method of any of claims 1 to 13.
15. The apparatus of claim 14, wherein the apparatus comprises a smart device comprising one of:
a smart phone;
a smart watch;
an intelligent speaker;
an intelligent thermostat;
a security camera;
a vehicle; or
A household appliance.
16. The apparatus of claim 14, wherein:
the apparatus comprises a smart device; and is
The smart device does not include an inertial sensor or use the inertial sensor to detect a change in a reference frame of the radar system.
17. A computer-readable storage medium comprising computer-executable instructions that, when executed by a processor, implement:
a reference frame machine learning module configured to:
accepting complex radar data associated with radar receive signals reflected by at least one object;
analyzing the complex radar data using machine learning to generate reference frame data; and
determining, based on the reference frame data, whether an antenna array that received the radar receive signal is stationary or moving.
18. The computer-readable storage medium of claim 17, wherein the reference frame machine learning module is further configured to analyze both amplitude and phase information of the complex radar data to generate the reference frame data.
19. The computer-readable storage medium of claim 17, wherein the reference frame machine learning module is further configured to:
analyzing the complex radar data over a spatial domain using the machine learning to generate feature data; and
analyzing the feature data in a time domain using the machine learning to generate radar application data.
20. The computer-readable storage medium of claim 17, wherein:
the computer-readable storage medium and the processor are part of a smart device; and
the smart device does not include or use inertial sensors to detect changes in a reference frame of the radar system.
CN202010663766.3A 2019-11-27 2020-07-10 Detecting reference system changes in smart device-based radar systems Active CN111812633B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410233722.5A CN118191817A (en) 2019-11-27 2020-07-10 Detecting reference system changes in smart device-based radar systems

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
USPCT/US2019/063776 2019-11-27
PCT/US2019/063776 WO2021107958A1 (en) 2019-11-27 2019-11-27 Detecting a frame-of-reference change in a smart-device-based radar system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202410233722.5A Division CN118191817A (en) 2019-11-27 2020-07-10 Detecting reference system changes in smart device-based radar systems

Publications (2)

Publication Number Publication Date
CN111812633A true CN111812633A (en) 2020-10-23
CN111812633B CN111812633B (en) 2024-03-12

Family

ID=69005906

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202010663766.3A Active CN111812633B (en) 2019-11-27 2020-07-10 Detecting reference system changes in smart device-based radar systems
CN202410233722.5A Pending CN118191817A (en) 2019-11-27 2020-07-10 Detecting reference system changes in smart device-based radar systems

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202410233722.5A Pending CN118191817A (en) 2019-11-27 2020-07-10 Detecting reference system changes in smart device-based radar systems

Country Status (4)

Country Link
US (2) US11460538B2 (en)
EP (2) EP4286996A3 (en)
CN (2) CN111812633B (en)
WO (1) WO2021107958A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113126050A (en) * 2021-03-05 2021-07-16 沃尔夫曼消防装备有限公司 Life detection method based on neural network
CN113253249A (en) * 2021-04-19 2021-08-13 中国电子科技集团公司第二十九研究所 MIMO radar power distribution design method based on deep reinforcement learning

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021107958A1 (en) * 2019-11-27 2021-06-03 Google Llc Detecting a frame-of-reference change in a smart-device-based radar system
WO2021118570A1 (en) 2019-12-12 2021-06-17 Google Llc Radar-based monitoring of a fall by a person
US11639985B2 (en) * 2020-07-02 2023-05-02 International Business Machines Corporation Three-dimensional feature extraction from frequency modulated continuous wave radar signals
US11808839B2 (en) 2020-08-11 2023-11-07 Google Llc Initializing sleep tracking on a contactless health tracking device
US11406281B2 (en) 2020-08-11 2022-08-09 Google Llc Contactless cough detection and attribution
US11832961B2 (en) 2020-08-11 2023-12-05 Google Llc Contactless sleep detection and disturbance attribution
US11754676B2 (en) 2020-08-11 2023-09-12 Google Llc Precision sleep tracking using a contactless sleep tracking device
US11256988B1 (en) * 2021-07-19 2022-02-22 Information Systems Laboratories, Inc. Process and method for real-time sensor neuromorphic processing

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110115592A (en) * 2018-02-07 2019-08-13 英飞凌科技股份有限公司 The system and method for the participation level of people are determined using millimetre-wave radar sensor
DE102018202903A1 (en) * 2018-02-27 2019-08-29 Zf Friedrichshafen Ag Method for evaluating measurement data of a radar measurement system using a neural network
WO2019195327A1 (en) * 2018-04-05 2019-10-10 Google Llc Smart-device-based radar system performing angular estimation using machine learning
WO2020113160A2 (en) * 2018-11-30 2020-06-04 Qualcomm Incorporated Radar deep learning

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5218680A (en) * 1990-03-15 1993-06-08 International Business Machines Corporation Data link controller with autonomous in tandem pipeline circuit elements relative to network channels for transferring multitasking data in cyclically recurrent time slots
EP2496958A4 (en) * 2009-11-06 2013-04-03 Saab Ab Radar system and method for detecting and tracking a target
US9827487B2 (en) * 2012-05-14 2017-11-28 Sphero, Inc. Interactive augmented reality using a self-propelled device
US20140282274A1 (en) * 2013-03-15 2014-09-18 Qualcomm Incorporated Detection of a gesture performed with at least two control objects
US10042041B2 (en) 2015-04-28 2018-08-07 Veoneer Us, Inc. Apparatus and method for detecting and correcting for blockage of an automotive radar sensor
GB2541658B (en) 2015-08-24 2020-01-01 Thales Holdings Uk Plc Video-assisted inverse synthetic aperture radar (VAISAR)
US20180049671A1 (en) * 2016-08-18 2018-02-22 Timothy W. Markison Wireless in-shoe physical activity monitoring dongle
US9720086B1 (en) * 2016-11-22 2017-08-01 4Sense, Inc. Thermal- and modulated-light-based passive tracking system
US10498951B2 (en) * 2017-01-23 2019-12-03 Digital Global Systems, Inc. Systems, methods, and devices for unmanned vehicle detection
CN115460463A (en) * 2018-01-04 2022-12-09 三星电子株式会社 Video playing device and control method thereof
CN111758237B (en) * 2018-02-27 2023-12-15 Iee国际电子工程股份公司 Method for joint radar communication
GB2585479B (en) * 2019-05-10 2024-01-31 Victor Kennedy Roderick Reduction of the effects of latency for extended reality experiences
WO2021107958A1 (en) * 2019-11-27 2021-06-03 Google Llc Detecting a frame-of-reference change in a smart-device-based radar system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110115592A (en) * 2018-02-07 2019-08-13 英飞凌科技股份有限公司 The system and method for the participation level of people are determined using millimetre-wave radar sensor
DE102018202903A1 (en) * 2018-02-27 2019-08-29 Zf Friedrichshafen Ag Method for evaluating measurement data of a radar measurement system using a neural network
WO2019195327A1 (en) * 2018-04-05 2019-10-10 Google Llc Smart-device-based radar system performing angular estimation using machine learning
WO2020113160A2 (en) * 2018-11-30 2020-06-04 Qualcomm Incorporated Radar deep learning

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SARAH H. 等: "Precise Ego-Motion Estimation with Millimeter-Wave Radar Under Diverse and Challenging Conditions", 《2018 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA)》, pages 6045 - 6052 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113126050A (en) * 2021-03-05 2021-07-16 沃尔夫曼消防装备有限公司 Life detection method based on neural network
CN113253249A (en) * 2021-04-19 2021-08-13 中国电子科技集团公司第二十九研究所 MIMO radar power distribution design method based on deep reinforcement learning

Also Published As

Publication number Publication date
US20210156957A1 (en) 2021-05-27
US11460538B2 (en) 2022-10-04
EP4286996A3 (en) 2024-02-21
EP4066008A1 (en) 2022-10-05
CN111812633B (en) 2024-03-12
CN118191817A (en) 2024-06-14
EP4286996A2 (en) 2023-12-06
EP4066008B1 (en) 2024-01-03
WO2021107958A1 (en) 2021-06-03
US20230008681A1 (en) 2023-01-12

Similar Documents

Publication Publication Date Title
CN111812633B (en) Detecting reference system changes in smart device-based radar systems
CN111433627B (en) Intelligent device-based radar system using machine learning to perform angle estimation
US10698603B2 (en) Smartphone-based radar system facilitating ease and accuracy of user interactions with displayed objects in an augmented-reality interface
JP7481434B2 (en) A Smart Device-Based Radar System Using Spatio-Temporal Neural Networks to Perform Gesture Recognition
US20210103348A1 (en) Facilitating User-Proficiency in Using Radar Gestures to Interact with an Electronic Device
EP4131064A1 (en) Gesture recognition method and related apparatus
US20230161027A1 (en) Smart-Device-Based Radar System Performing Near-Range Detection
US20240027600A1 (en) Smart-Device-Based Radar System Performing Angular Position Estimation
CN111830503B (en) Intelligent device-based radar system performing symmetric Doppler interference mitigation
CN113454481B (en) Smart device based radar system for detecting user gestures in the presence of saturation
KR20230165914A (en) Radar application programming interface
US20240231505A1 (en) Facilitating Ambient Computing Using a Radar System
CN111830503A (en) Smart device based radar system performing symmetric doppler interference mitigation
CN113454481A (en) Smart device based radar system to detect user gestures in the presence of saturation

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant