WO2023169448A1 - A method and apparatus for sensing a target (一种感知目标的方法和装置)

Info

Publication number: WO2023169448A1
Authority: WO (WIPO/PCT)
Prior art keywords: target, time, spectrum data, moment, time period
Application number: PCT/CN2023/080191
Other languages: English (en), French (fr)
Inventor: 丁根明
Original Assignee: 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2023169448A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 - Systems using reflection of radio waves, e.g. primary radar systems; analogous systems
    • G01S13/06 - Systems determining position data of a target
    • G01S13/08 - Systems for measuring distance only
    • G01S13/32 - Systems for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 - Details of systems according to group G01S13/00
    • G01S7/41 - Details of systems according to group G01S13/00 using analysis of echo signal for target characterisation; target signature; target cross-section
    • G01S7/417 - Details of systems according to group G01S13/00 using analysis of echo signal for target characterisation, involving the use of neural networks

Definitions

  • This solution relates to radar technology, which is applied in the fields of autonomous driving, intelligent driving, surveying and mapping, smart home, and intelligent manufacturing, and in particular to a method and apparatus for sensing targets.
  • Millimeter-wave radar requires a relatively high bandwidth to achieve high-precision perception when sensing targets.
  • Frequency-modulated continuous wave (FMCW) ranging is a commonly used approach; it requires a large bandwidth of more than 4GHz to achieve centimeter-level distance sensing.
  • For example, centimeter-level distance sensing can be achieved by using a 4GHz bandwidth in the 60GHz frequency band.
  • However, the 60GHz frequency band is not open for free use in many regions, and the generally available 24GHz millimeter-wave frequency band only offers a bandwidth of 250MHz. Under this bandwidth, the theoretical resolution of the above method is only 60cm, so centimeter-level proximity sensing cannot be achieved.
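  • For reference, these figures follow from the standard radar range-resolution relation ΔR = c / (2B): with B = 250MHz, ΔR = (3 × 10^8 m/s) / (2 × 250 × 10^6 Hz) = 0.6m, while B = 4GHz gives ΔR ≈ 3.75cm, which is why centimeter-level sensing conventionally requires a multi-GHz bandwidth.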
  • Embodiments of the present application provide a method and apparatus for sensing targets. Because the method senses targets through the changing characteristics of the spectrum data corresponding to frequency points whose distance values are smaller than a preset value, it does not require support for a higher bandwidth, and it can therefore improve the accuracy of target perception under limited-bandwidth conditions.
  • In a first aspect, embodiments of the present application provide a method for sensing a target.
  • The method includes:
  • calculating the feature value corresponding to each target moment to obtain a feature value sequence;
  • where the feature value sequence consists of the feature values corresponding to multiple target moments sorted in time order, and the target moments are moments between the first moment and the second moment;
  • the process of calculating the feature value corresponding to a target moment includes: based on multiple target echo signals received in the sub-time period before the target moment, obtaining the target spectrum data respectively corresponding to the multiple target echo signals, where the sub-time period includes the target moment and the target spectrum data is the spectrum data corresponding to the frequency points whose distance value is less than the preset value; and, based on the target spectrum data corresponding to the multiple target echo signals, obtaining the feature value corresponding to the target moment;
  • and sensing the target based on the feature value sequence.
  • In the embodiments of the present application, the feature value can be calculated based on the spectrum data of the received echo signals corresponding to the frequency points whose distance value is smaller than the preset value; further, the target can be sensed based on the correspondence between the feature value and distance.
  • This method does not depend on bandwidth and relies only on the correspondence between feature values and distance; therefore, it can improve perception accuracy under low-bandwidth conditions.
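  • As an illustration only (not the patented implementation; the function names, the window length, and the thresholds below are assumptions), the following sketch computes one feature value per target moment as the variance of the spectrum data of the near-zero-distance frequency point over the sub-time period before that moment, and senses a target when the feature sequence reaches a maximum threshold:

```python
import numpy as np

def feature_value_sequence(zero_bin_amplitudes, window):
    """Build the feature value sequence described above.

    zero_bin_amplitudes: one amplitude per received target echo signal, taken from the
        frequency point whose distance value is below the preset value (e.g. range-bin 0).
    window: number of echo signals in the sub-time period preceding each target moment.
    """
    x = np.asarray(zero_bin_amplitudes, dtype=float)
    features = []
    for t in range(window - 1, len(x)):
        segment = x[t - window + 1 : t + 1]   # echoes received in the sub-time period ending at t
        features.append(np.var(segment))      # variance (np.std(segment) would also fit the text)
    return np.array(features)

def target_sensed(features, max_threshold):
    """The target is considered close once the feature value reaches the maximum threshold."""
    return bool(np.any(np.asarray(features) >= max_threshold))
```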
  • the frequency point whose distance value is smaller than the preset value is the frequency point where the distance value is zero.
  • An N-point fast Fourier transform (FFT) can be performed on the chirp signal received by the radar to obtain the range amplitude spectrum (Range-FFT).
  • The range amplitude spectrum represents the echo signal energy in each distance unit (range-bin), and the spectrum data corresponding to the frequency point whose distance value is zero can be the amplitude corresponding to the zero frequency point. That is, the zero frequency point is the frequency point whose range-bin is 0, and the amplitude corresponding to the zero frequency point is the amplitude of that frequency point.
  • This method senses the target through the changing characteristics of the amplitude corresponding to the zero frequency point, which can improve sensing accuracy under low-bandwidth conditions. Experiments show that, under the constraints of the 24GHz frequency band and a 250MHz bandwidth, this method allows the embodiments of the present application to go beyond the theoretical resolution limit of conventional radar ranging and to realize centimeter-level proximity sensing under low bandwidth.
  • the target time is part or all of the time between the first time and the second time.
  • obtaining the characteristic value corresponding to the target time based on the target spectrum data corresponding to the multiple target echo signals includes: obtaining a portion of the target spectrum data corresponding to the multiple target echo signals.
  • the characteristic value corresponding to the target time is the variance or standard deviation of the target spectrum data respectively corresponding to the multiple target echo signals.
  • sensing the target based on the feature value sequence includes:
  • A first time period is determined between the first moment and the second moment, in which the feature values corresponding to the target moments rise from a minimum threshold to a maximum threshold;
  • the moving speed of the target in the first time period is the ratio of the difference between the maximum threshold and the minimum threshold to the duration of the first time period.
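  • A minimal sketch of this speed estimate, assuming the feature sequence is time-stamped and the minimum and maximum thresholds are pre-configured (all names are illustrative):

```python
import numpy as np

def approach_speed(features, timestamps, min_threshold, max_threshold):
    """Locate the first time period in which the feature value rises from the minimum
    threshold to the maximum threshold, and return the ratio of the threshold difference
    to the duration of that period, i.e. the moving-speed measure described above."""
    f = np.asarray(features, dtype=float)
    t = np.asarray(timestamps, dtype=float)
    start = None
    for i in range(len(f)):
        if start is None and f[i] >= min_threshold:
            start = i                          # feature value first reaches the minimum threshold
        if start is not None and f[i] >= max_threshold:
            duration = t[i] - t[start]
            return None if duration <= 0 else (max_threshold - min_threshold) / duration
    return None                                # no such first time period found in the sequence
```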
  • In some embodiments, sensing the target based on the feature value sequence includes:
  • determining a second time period between the first moment and the second moment, in which the feature values corresponding to the target moments are not less than the maximum threshold;
  • In some embodiments, obtaining the target spectrum data respectively corresponding to the multiple target echo signals based on the multiple target echo signals received in the sub-time period before the target moment includes:
  • obtaining a spectrum data group corresponding to each target echo signal, where the spectrum data group includes multiple spectrum data and the multiple spectrum data correspond to multiple frequency points;
  • normalizing each spectrum data group separately to obtain multiple processed spectrum data groups;
  • obtaining, from each processed spectrum data group, the spectrum data corresponding to the frequency points whose distance value is smaller than the preset value, so as to obtain the target spectrum data corresponding to each target echo signal.
  • In some embodiments, obtaining the spectrum data group corresponding to each target echo signal includes:
  • performing an N-point fast Fourier transform on each target echo signal, where the resulting spectrum data group includes N pieces of data and N is a positive integer.
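  • The per-echo processing above might look like the following sketch (the N-point FFT, peak normalization, and the choice of range-bin 0 are assumptions consistent with the description rather than the exact processing of the embodiments):

```python
import numpy as np

def target_spectrum_datum(if_samples, n_fft, target_bin=0):
    """From one target echo signal (its intermediate-frequency samples):
    1) N-point FFT  -> spectrum data group (N pieces of data, one per frequency point / range-bin);
    2) normalize the spectrum data group;
    3) return the spectrum data of the frequency point whose distance value is
       below the preset value (here: range-bin 0)."""
    group = np.abs(np.fft.fft(if_samples, n=n_fft))   # range amplitude spectrum (Range-FFT)
    peak = group.max()
    if peak > 0:
        group = group / peak                          # normalization of the spectrum data group
    return group[target_bin]
```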
  • In a second aspect, embodiments of the present application provide a device for sensing a target.
  • the device includes an antenna unit and a sensing unit:
  • an antenna unit, used for receiving echo signals from the first moment to the second moment;
  • a sensing unit, used to calculate, based on the received echo signals, the feature value corresponding to each target moment and obtain a feature value sequence; the feature value sequence consists of the feature values corresponding to multiple target moments sorted in time order, and the target moments are moments between the first moment and the second moment; the process of calculating the feature value corresponding to a target moment includes: based on multiple target echo signals received in the sub-time period before the target moment, obtaining the target spectrum data respectively corresponding to the multiple target echo signals,
  • where the sub-time period includes the target moment and the target spectrum data is the spectrum data corresponding to the frequency points whose distance value is less than the preset value; based on the target spectrum data corresponding to the multiple target echo signals, obtaining the feature value corresponding to the target moment; and sensing the target based on the feature value sequence.
  • embodiments of the present application provide a vehicle, which includes a device for sensing a target as shown in the second aspect.
  • Embodiments of the present application further provide an electronic device, which includes one or more processors and one or more memories; the one or more memories are coupled to the one or more processors and are used to store computer program code.
  • The computer program code includes computer instructions; when the one or more processors execute the computer instructions, the electronic device performs the method described in any possible implementation of the first aspect.
  • embodiments of the present application provide a computer program product containing instructions.
  • When the computer program product is run on an electronic device, it causes the electronic device to execute the method described in any possible implementation manner of the first aspect.
  • embodiments of the present application provide a computer-readable storage medium, including instructions.
  • When the instructions are run on an electronic device, they cause the electronic device to execute the methods described in the first aspect, the second aspect, or the third aspect above.
  • Figure 1 is a schematic diagram of a device for sensing targets provided by an embodiment of the present application
  • Figure 2 is an integrated schematic diagram of a device for sensing objects and a display provided by an embodiment of the present application
  • Figure 3 is a schematic diagram of a detection range provided by an embodiment of the present application.
  • Figure 4A is a schematic structural diagram of an electronic device 100 provided by an embodiment of the present application.
  • Figure 4B is a software structure block diagram of an electronic device 100 provided by an embodiment of the present application.
  • Figure 5 is a schematic diagram of an application scenario provided by the embodiment of the present application.
  • Figure 6 is a flow chart of a method for sensing a target provided by an embodiment of the present application.
  • Figure 7 is a schematic diagram of a detection time provided by an embodiment of the present application.
  • Figure 8 is a schematic diagram of a target time provided by an embodiment of the present application.
  • Figure 9 is a schematic flowchart of determining the characteristic value corresponding to the target moment provided by the embodiment of the present application.
  • Figure 10 is a schematic diagram of a distance spectrum provided by an embodiment of the present application.
  • Figure 11 is a schematic diagram of a characteristic value sequence when a target is approaching provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram of several tags provided by embodiments of the present application.
  • Figure 13 is a possible user interface on the display screen of the vehicle-mounted display provided by the embodiment of the present application.
  • Figure 14 is a possible user interface on the display screen of a smart home device provided by an embodiment of the present application.
  • FIG. 15 is a schematic diagram of a possible functional framework of a vehicle 10 provided by an embodiment of the present application.
  • the device for sensing an object used in the embodiment of the present application is first described below.
  • FIG. 1 is a schematic diagram of a device for sensing a target provided by an embodiment of the present application.
  • the device for sensing targets may include an antenna unit 11, a radar unit 12 and a detection unit 13, where:
  • the antenna unit 11 includes a transmitting antenna and a receiving antenna, wherein the transmitting antenna is used to transmit radio frequency signals, and the receiving antenna is used to receive echo signals of radio frequency signals.
  • the radar unit 12 is used to transmit and receive signals in combination with the antenna unit 11, and perform signal processing on the received echo signals to obtain spectrum data. For example, the radar unit 12 can calculate the range spectrum (Range-FFT) of the echo signal.
  • the detection unit 13 is used to sense the target according to the spectrum data output by the radar unit 12 and obtain a detection result.
  • For example, the antenna unit 11 can receive the echo signals within the detection time; the radar unit 12 performs an N-point Fourier transform on the intermediate frequency signal corresponding to each received echo signal to obtain the spectrum data group corresponding to that echo signal; the detection unit 13 obtains the feature values corresponding to multiple target moments based on the spectrum data groups corresponding to the echo signals, and then perceives the target based on the feature values corresponding to the multiple target moments, wherein each feature value is obtained based on the spectrum data corresponding to the frequency points whose distance value is less than the preset value.
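  • Purely as a structural illustration of how the three units of Figure 1 cooperate (the class and method names are invented for this sketch; the actual units are hardware and signal-processing blocks):

```python
import numpy as np

class RadarUnit:
    """Turns the intermediate frequency signal of each echo into a spectrum data group (N-point FFT)."""
    def __init__(self, n_fft):
        self.n_fft = n_fft

    def spectrum_group(self, if_samples):
        return np.abs(np.fft.fft(if_samples, n=self.n_fft))

class DetectionUnit:
    """Derives feature values from the zero range-bin of each spectrum data group and
    decides whether a target is sensed."""
    def __init__(self, window, max_threshold):
        self.window, self.max_threshold = window, max_threshold

    def sense(self, spectrum_groups):
        zero_bin = np.array([g[0] / g.max() for g in spectrum_groups])   # normalized zero-bin data
        features = [np.var(zero_bin[i - self.window + 1 : i + 1])
                    for i in range(self.window - 1, len(zero_bin))]
        return any(f >= self.max_threshold for f in features)
```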
  • the device for sensing a target may further include an application module.
  • the application module is configured to receive the detection result and perform corresponding operations in response to the detection result.
  • the target-sensing device includes an application module and a display screen. When it is determined that the target is sensed, the application module can display a preset user interface through the display screen, such as waking up the target-sensing device in a screen-off state.
  • the above-mentioned target sensing device can be used as a separate product; it can also be integrated into an electronic device, making the electronic device a terminal device with sensing capabilities.
  • The above-mentioned electronic devices include but are not limited to smartphones, tablets, personal digital assistants (PDAs), wearable electronic devices with wireless communication functions (such as smart watches and smart glasses), augmented reality (AR) devices, virtual reality (VR) devices, etc.
  • Exemplary embodiments of the electronic devices include, but are not limited to, portable electronic devices equipped with Linux or other operating systems.
  • the above-mentioned electronic device can also be other portable electronic devices, such as a laptop computer (Laptop). It should also be understood that in some other embodiments, the above-mentioned electronic device may not be a portable electronic device, but a desktop computer or the like.
  • the target-sensing device can be integrated on a display, such as a car smart screen, a central control screen, and a home device display.
  • Figure 2 is a schematic diagram of the integration of a device for sensing objects and a display provided by an embodiment of the present application.
  • the black solid circle represents the receiving antenna
  • the black solid square represents the transmitting antenna.
  • the receiving antenna and the transmitting antenna can be embedded in the frame of the display, specifically in the lower left corner of the display.
  • (B) in Figure 2 is a cross-sectional schematic diagram of the display, illustrating the integrated form of the receiving antenna.
  • the blank area represents the display
  • the horizontal line area represents the receiving antenna
  • the square area represents the radar unit.
  • the receiving antenna can be embedded in the display, and the triangular diagonal area is used to receive the echo signal and transmit it to the radar unit.
  • the receiving antenna and the transmitting antenna can also be arranged at other positions on the display screen. This is just an example and should not limit the integrated position of the device for sensing targets.
  • FIG. 3 exemplarily shows the detection range of the device for sensing targets shown in FIG. 2 .
  • the triangular area is used to represent the detection range in the vertical direction; the elliptical area is used to represent the detection range in the horizontal direction.
  • FIG. 4A is a schematic structural diagram of an electronic device 100 disclosed in an embodiment of the present application.
  • electronic device 100 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different component configuration.
  • the various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software including one or more signal processing and/or application specific integrated circuits.
  • The electronic device 100 may include: a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, etc.
  • The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figures, or some components may be combined, some components may be separated, or some components may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • The processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • different processing units can be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • The memory in the processor 110 is a cache memory. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated access, reduces the waiting time of the processor 110, and thus improves the efficiency of the system.
  • processor 110 may include one or more interfaces.
  • The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • The I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • processor 110 may include multiple sets of I2C buses.
  • the processor 110 can separately couple the touch sensor 180K, charger, flash, camera 193, etc. through different I2C bus interfaces.
  • the processor 110 can be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to implement the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 can be coupled with the audio module 170 through the I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface to implement the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communications to sample, quantize and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • The audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface to realize the function of answering calls through a Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is generally used to connect the processor 110 and the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface to implement the function of playing music through a Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 and the camera 193 communicate through the CSI interface to implement the shooting function of the electronic device 100 .
  • the processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100 .
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193, display screen 194, wireless communication module 160, audio module 170, sensor module 180, etc.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the SIM interface can be used to communicate with the SIM card interface 195 to implement the function of transmitting data to the SIM card or reading data in the SIM card.
  • the USB interface 130 is an interface that complies with the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type C interface, etc.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through them. This interface can also be used to connect other electronic devices, such as AR devices, etc.
  • the interface connection relationships between the modules illustrated in the embodiments of the present application are only schematic illustrations and do not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, internal memory 121, external memory, display screen 194, camera 193, wireless communication module 160, etc.
  • the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization. For example: Antenna 1 can be reused as a diversity antenna for a wireless LAN. In other embodiments, antennas may be used in conjunction with tuning switches.
  • the mobile communication module 150 can provide solutions for wireless communication including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • The mobile communication module 150 can receive electromagnetic waves through the antenna 1, perform filtering, amplification, and other processing on the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be disposed in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs sound signals through audio devices (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194.
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110 and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide solutions for wireless communication applied on the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, etc.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, frequency modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the Beidou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
  • the electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is an image processing microprocessor and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display screen 194 is used to display images, videos, etc.
  • Display 194 includes a display panel.
  • The display panel can use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Miniled, a MicroLed, a Micro-oLed, quantum dot light emitting diodes (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the electronic device 100 can implement the shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened, the light is transmitted to the camera sensor through the lens, the optical signal is converted into an electrical signal, and the camera sensor passes the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye. ISP can also perform algorithm optimization on image noise, brightness, etc. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
  • Camera 193 is used to capture still images or video.
  • the object passes through the lens to produce an optical image that is projected onto the photosensitive element.
  • the photosensitive element can be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other format image signals.
  • the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
  • Video codecs are used to compress or decompress digital video.
  • Electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • NPU is a neural network (NN) computing processor.
  • Intelligent cognitive applications of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, etc.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • The external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, saving files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes instructions stored in the internal memory 121 to execute various functional applications and data processing of the electronic device 100 .
  • the internal memory 121 may include a program storage area and a data storage area.
  • the stored program area can store the operating system, at least one application required for the function (such as face recognition function, fingerprint recognition function, mobile payment function, etc.).
  • the storage data area can store data created during the use of the electronic device 100 (such as face information template data, fingerprint information templates, etc.).
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one disk storage device, flash memory device, universal flash storage (UFS), etc.
  • the electronic device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • The audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110, or some functional modules of the audio module 170 may be provided in the processor 110.
  • Speaker 170A also called “speaker” is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to hands-free calls.
  • Receiver 170B also called “earpiece” is used to convert audio electrical signals into sound signals.
  • the electronic device 100 answers a call or a voice message, the voice can be heard by bringing the receiver 170B close to the human ear.
  • Microphone 170C, also called a "mic", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input a sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which in addition to collecting sound signals, may also implement a noise reduction function. In other embodiments, the electronic device 100 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions, etc.
  • the headphone interface 170D is used to connect wired headphones.
  • the headphone interface 170D may be a USB interface 130, or may be a 3.5mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense pressure signals and can convert the pressure signals into electrical signals.
  • pressure sensor 180A may be disposed on display screen 194 .
  • pressure sensors 180A there are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, etc.
  • a capacitive pressure sensor may include at least two parallel plates of conductive material.
  • the electronic device 100 determines the intensity of the pressure based on the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position based on the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation with a touch operation intensity less than the first pressure threshold is applied to the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold is applied to the short message application icon, an instruction to create a new short message is executed.
  • the gyro sensor 180B may be used to determine the motion posture of the electronic device 100 .
  • In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) can be determined through the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization. For example, when the shutter is pressed, the gyro sensor 180B detects the angle at which the electronic device 100 shakes, calculates the distance that the lens module needs to compensate based on the angle, and allows the lens to offset the shake of the electronic device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • Air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • Magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 may utilize the magnetic sensor 180D to detect opening and closing of the flip holster.
  • the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Then, based on the detected opening and closing status of the leather case or the opening and closing status of the flip cover, features such as automatic unlocking of the flip cover are set.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices and be used in horizontal and vertical screen switching, pedometer and other applications.
  • Distance sensor 180F for measuring distance.
  • electronic device 100 may measure distance via infrared or laser.
  • the electronic device 100 may utilize the distance sensor 180F to measure distance to achieve fast focusing.
  • the distance sensor 180F may include the above-mentioned device for sensing a target, and may specifically include an antenna unit, a radar unit, and a detection unit.
  • Proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light outwardly through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
  • the electronic device 100 can use the proximity light sensor 180G to detect when the user holds the electronic device 100 close to the ear for talking, so as to automatically turn off the screen to save power.
  • The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket to prevent accidental touching.
  • Fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to achieve fingerprint unlocking, access to application locks, fingerprint photography, fingerprint answering of incoming calls, etc.
  • Temperature sensor 180J is used to detect temperature.
  • the electronic device 100 utilizes the temperature detected by the temperature sensor 180J to execute the temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to prevent the low temperature from causing the electronic device 100 to shut down abnormally. In some other embodiments, when the temperature is lower than another threshold, the electronic device 100 performs boosting on the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
  • Touch sensor 180K also called “touch panel”.
  • the touch sensor 180K can be disposed on the display screen 194.
  • the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near the touch sensor 180K.
  • the touch sensor can pass the detected touch operation to the application processor to determine the touch event type.
  • Visual output related to the touch operation may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from that of the display screen 194 .
  • the buttons 190 include a power button, a volume button, etc.
  • Key 190 may be a mechanical key. It can also be a touch button.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for vibration prompts for incoming calls and can also be used for touch vibration feedback.
  • touch operations for different applications can correspond to different vibration feedback effects.
  • the motor 191 can also respond to different vibration feedback effects for touch operations in different areas of the display screen 194 .
  • Different application scenarios (for example, time reminders, receiving information, alarm clocks, games, etc.) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also be customized.
  • the indicator 192 may be an indicator light, which may be used to indicate charging status, power changes, or may be used to indicate messages, missed calls, notifications, etc.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be connected to or separated from the electronic device 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195 .
  • the electronic device 100 can support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 is also compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communications.
  • the electronic device 100 can use the processor 110 to execute the method of sensing a target, and the display screen 194 displays an interface after determining the sensing target.
  • FIG. 4B is a software structure block diagram of an electronic device 100 disclosed in the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has clear roles and division of labor.
  • the layers communicate through software interfaces.
  • the system is divided into four layers, from top to bottom: application layer, application framework layer, runtime and system library, and kernel layer.
  • the application layer can include a series of application packages.
  • the application layer also includes a perception module.
  • The application package can include applications (also called apps) such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
  • the application framework layer provides an application programming interface (API) and programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include a window manager, content provider, view system, phone manager, resource manager, notification manager, etc.
  • a window manager is used to manage window programs.
  • the window manager can obtain the display size, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make this data accessible to applications.
  • Said data can include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
  • the view system includes visual controls, such as controls that display text, controls that display pictures, etc.
  • a view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide communication functions of the electronic device 100 .
  • call status management including connected, hung up, etc.
  • the resource manager provides various resources to applications, such as localized strings, icons, pictures, layout files, video files, etc.
  • the notification manager allows applications to display notification information in the status bar, which can be used to convey notification-type messages and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify download completion, message reminders, etc.
  • The notification manager can also provide notifications that appear in the status bar at the top of the system in the form of graphics or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog interface. For example, text information is prompted in the status bar, a prompt tone sounds, the electronic device vibrates, or the indicator light flashes.
  • Runtime includes core libraries and virtual machines. Runtime is responsible for the scheduling and management of the system.
  • the core library contains two parts: one part is the functional functions that the programming language (for example, Java language) needs to call, and the other part is the core library of the system.
  • the application layer and application framework layer run in virtual machines.
  • The virtual machine executes the programming files (for example, java files) of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform object life cycle management, stack management, thread management, security and exception management, and garbage collection and other functions.
  • System libraries can include multiple functional modules. For example: surface manager (surface manager), media libraries (Media Libraries), three-dimensional graphics processing libraries (for example: OpenGL ES), two-dimensional graphics engines (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of two-dimensional (2-Dimensional, 2D) and three-dimensional (3-Dimensional, 3D) layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, composition, and layer processing.
  • 2D Graphics Engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer at least includes display driver, camera driver, audio driver, sensor driver, and virtual card driver.
  • the following exemplifies the workflow of the software and hardware of the electronic device 100 in conjunction with capturing the photographing scene.
  • When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes touch operations into raw input events (including touch coordinates, timestamps of touch operations, and other information). Raw input events are stored in the kernel layer.
  • the application framework layer obtains the original input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation as a touch click operation and the control corresponding to the click operation as a camera application icon control as an example, the camera application calls the interface of the application framework layer to start the camera application, and then starts the camera driver by calling the kernel layer. Camera 193 captures still images or video.
  • the terminal device is a terminal device with sensing capabilities.
  • the terminal device includes a display screen and the device for sensing the target.
  • the device for sensing the target of the terminal device is located in the lower left corner of the terminal device.
  • The user can bring a hand close to the lower left corner of the display screen; the radio frequency signal emitted by the transmitting antenna reaches the user's hand and generates an echo signal, which is received by the receiving antenna. The terminal device can determine, based on the received echo signal, whether a target is approaching, so that a corresponding interface is displayed on the display screen when the target approaches.
  • Figure 6 is a flow chart of a method for sensing a target provided by an embodiment of the present application.
  • the method can be executed by the terminal device shown in Figure 5.
  • the method can include some or all of the following steps.
  • the first time to the second time may also be called detection time.
  • Figure 7 is a schematic diagram of a detection time provided by an embodiment of the present application. As shown in Figure 7, the detection time is the time period from the first moment to the second moment, and this time period includes multiple moments.
  • the terminal device may be in a detection state in real time, that is, transmitting a radio frequency signal through a transmitting antenna and receiving an echo signal of the radio frequency signal through a receiving antenna in real time.
  • Alternatively, the terminal device may start receiving echo signals when it receives a user operation. For example, the terminal device detects the target through the apparatus for sensing a target when it is powered on, but does not detect the target when it is in the sleep state; as another example, a target application can invoke the sensing capability, and the terminal device can start receiving echo signals to detect the target when it detects a user operation input by the user for that target application.
  • Before the echo signals are received, key parameters of the radar's RF signal, such as the bandwidth, frame rate, and number of Chirp signals per frame, can be pre-configured.
  • For example, the radar can use FMCW modulation, with the bandwidth B set to β ≤ B ≤ 250 MHz and β set to 200 MHz. If B is 250 MHz, the theoretical resolution of conventional radar ranging is 60 cm. The radar frame rate is set to 20 frames per second or more, and each frame contains k Chirp signals, where k is a positive integer.
  • A Chirp signal is a signal whose frequency changes continuously and linearly over its duration, that is, a linear frequency-modulated signal, and is a commonly used radar signal.
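As a sanity check on the 60 cm figure quoted above, the classical range resolution of conventional FMCW ranging follows directly from the sweep bandwidth B; this is the standard textbook relation, not a formula quoted from the patent text:

$$\Delta R = \frac{c}{2B} = \frac{3\times10^{8}\ \mathrm{m/s}}{2\times 250\times10^{6}\ \mathrm{Hz}} = 0.6\ \mathrm{m}$$

This is exactly why a 250 MHz allocation in the 24 GHz band cannot reach centimeter-level proximity sensing with conventional range-Doppler processing alone.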
  • the target time is the time between the first time and the second time.
  • the first time to the second time include multiple target times.
  • the first moment to the second moment include M moments, and M is a positive integer greater than 1.
  • the first moment to the second moment may include m target moments, and m is a positive integer not greater than M. There is no limit to the number of target moments here.
  • The process of calculating the characteristic value corresponding to a target moment may include: for each target moment, determining the sub-time period corresponding to that target moment, where the sub-time period includes the target moment; obtaining, based on the target echo signals received in the sub-time period corresponding to the target moment, the target spectrum data corresponding to each target echo signal, where the target spectrum data is the spectrum data corresponding to the frequency point whose range value is less than a preset value; and obtaining the characteristic value corresponding to the target moment based on the target spectrum data corresponding to the multiple target echo signals.
  • For example, the target spectrum data can be the amplitude of the zero-frequency point. In this case, the amplitude of the zero-frequency point corresponding to each target echo signal can be obtained; further, the characteristic value corresponding to the target moment is calculated based on the obtained amplitudes of the multiple zero-frequency points.
  • the sub-time period corresponding to the target time can be determined through a sliding time window, so that the characteristic value corresponding to the target time is determined based on multiple target echo signals received in the sub-time period.
  • FIG. 8 is a schematic diagram of target moments provided by an embodiment of the present application. As shown in Figure 8, the slashed area represents the sliding time window; in Figure 8 the length of the sliding time window is illustrated as three moments, so the first target moment and the sub-time period corresponding to that target moment are as shown in Figure 8. Each time the sliding time window slides by one step, one target moment and the sub-time period corresponding to that target moment can be determined.
  • Assuming the step size of the sliding time window is 1 moment, every moment from the first moment to the second moment except the first three moments can be a target moment, and the multiple target moments from the first moment to the second moment can be as shown in Figure 8. That is to say, assuming the detection time includes Q moments and the sliding time window includes q moments, the number of target moments is not greater than Q-q and the number of characteristic values is not greater than Q-q, where Q is a positive integer greater than 1 and q is a positive integer greater than 0.
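To make the Q-q bookkeeping above concrete, the following minimal Python sketch enumerates the target moments and their sub-time periods for a sliding window; the function name and the convention that the sub-time period is the q moments ending at the target moment are illustrative assumptions, not taken from the patent text.

```python
def sliding_target_moments(Q: int, q: int, step: int = 1):
    """Yield (target_moment, sub_time_period) pairs over a detection time of Q moments.

    The sliding window covers q consecutive moments and, in this sketch, the
    sub-time period of a target moment is the q moments ending at (and including)
    that moment. With step == 1 this yields at most Q - q target moments, matching
    the bound stated above.
    """
    for t in range(q, Q, step):                       # first q moments cannot be target moments
        sub_period = list(range(t - q + 1, t + 1))    # q moments, ending at target moment t
        yield t, sub_period

# Illustrative usage: with Q = 10 moments and a 3-moment window, 7 target moments remain.
# list(sliding_target_moments(Q=10, q=3))
```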
  • Figure 9 is a schematic flowchart of determining the characteristic value corresponding to a target moment, provided by an embodiment of the present application. The method may include some or all of the following steps:
  • the W target echo signals may be part or all of the echo signals received in the sub-time period corresponding to the target time. That is to say, W target echo signals can be extracted from the received echo signals. Specifically, they can be extracted according to preset rules, for example, at preset time intervals.
  • the radio frequency signal can be an FMCW signal.
  • For example, each Chirp echo signal received by a fixed single receiving antenna can be used; further, the intermediate frequency signal corresponding to each Chirp echo signal can be obtained separately.
  • The intermediate frequency signal is the signal obtained after the radar's local oscillator signal and the echo signal received by the radar (that is, the radar transmit signal reflected by the target object) are processed by the mixer.
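For context, the standard FMCW dechirp relation (not quoted from the patent) links the intermediate-frequency (beat) frequency produced by the mixer to the target range R, where S = B/T_c is the chirp slope and c the speed of light:

$$f_{\mathrm{IF}} = \frac{2SR}{c} = \frac{2BR}{c\,T_c}$$

Intuitively, a target much closer than one range bin concentrates its beat energy near the zero-frequency bin, which is consistent with the embodiment's focus on the amplitude of range-bin 0.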
  • S902 Perform N-point Fourier transform on each of the W intermediate frequency signals to obtain W spectrum data groups, where N is a positive integer.
  • each spectrum data group includes N pieces of spectrum data; one spectrum data is the amplitude corresponding to one frequency point.
  • For example, r(n) is the intermediate frequency signal of a single Chirp signal at the receiving antenna, where n is the number of sampling points within a single Chirp signal period and n is a positive integer.
  • An FFT calculation, that is, a one-dimensional (1D) FFT, can be performed on r(n) to obtain the Range-FFT of the intermediate frequency signal.
  • One frequency point on the Range-FFT corresponds to one distance value and one reflection intensity value (i.e., amplitude), and after the N-point FFT the Range-FFT has N frequency points.
  • The sequence of all amplitudes of the intermediate frequency signal is the spectrum data group of that intermediate frequency signal, and one piece of spectrum data in the spectrum data group is the amplitude corresponding to one frequency point.
  • Range-FFT is the frequency domain signal (complex value or modulus value) obtained after performing N-point FFT calculation on the Chirp signal.
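A minimal numpy sketch of step S902, assuming the W intermediate-frequency signals are already available as rows of a real-valued array; all variable names are illustrative.

```python
import numpy as np

def range_fft(if_signals: np.ndarray, n_fft: int) -> np.ndarray:
    """Compute the N-point Range-FFT amplitude spectrum for each Chirp.

    if_signals: array of shape (W, n_samples), one intermediate-frequency signal
                r(n) per received Chirp echo signal.
    Returns an array of shape (W, n_fft) of amplitudes, i.e. one spectrum data
    group (one amplitude per frequency point / range-bin) per Chirp.
    """
    spectra = np.fft.fft(if_signals, n=n_fft, axis=-1)   # 1D-FFT per Chirp
    return np.abs(spectra)                               # amplitude at each frequency point
```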
  • Figure 10 is a schematic diagram of a distance spectrum provided by an embodiment of the present application.
  • t is used to represent time
  • A is used to represent amplitude
  • one waveform corresponds to one Chirp signal.
  • four Chirp signals are used in one frame.
  • S903 Perform normalization processing on each of the W spectrum data groups, respectively, to obtain W processed spectrum data groups.
  • the normalization method can be (0,1) standardization or Z-score standardization, etc.
  • the method of normalization processing is not limited here.
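A short sketch of step S903, under the assumption that either (0,1) min-max standardization or Z-score standardization is applied independently to each spectrum data group (row); the small epsilon guards against division by zero and is an implementation detail of this sketch.

```python
import numpy as np

def normalize_groups(spectra: np.ndarray, method: str = "minmax") -> np.ndarray:
    """Normalize each spectrum data group (each row) independently."""
    if method == "minmax":                               # (0,1) standardization
        mn = spectra.min(axis=-1, keepdims=True)
        mx = spectra.max(axis=-1, keepdims=True)
        return (spectra - mn) / (mx - mn + 1e-12)
    if method == "zscore":                               # Z-score standardization
        mu = spectra.mean(axis=-1, keepdims=True)
        sd = spectra.std(axis=-1, keepdims=True)
        return (spectra - mu) / (sd + 1e-12)
    raise ValueError(f"unknown normalization method: {method}")
```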
  • S904 Obtain spectrum data corresponding to frequency points with distance values smaller than the preset value from each of the W processed spectrum data groups, and obtain W amplitudes.
  • The spectrum data corresponding to a frequency point whose range value is smaller than the preset value is the amplitude of that frequency point, that is, the amplitude corresponding to the frequency point whose range-bin is 0.
  • Range-bin refers to each frequency point in Range-FFT, and each frequency point corresponds to the distance information between the radar and the target object, so each frequency point can be defined as Range-bin.
  • For example, the variance or standard deviation of the W amplitudes can be calculated to obtain a statistical value, and this statistical value is used as the characteristic value corresponding to the target moment.
  • Alternatively, w amplitudes can be extracted at even intervals from the W amplitudes, and the statistical value can be calculated over these w amplitudes and used as the characteristic value corresponding to the target moment. Understandably, this approach can reduce the amount of calculation.
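Putting step S904 and the statistic together, the following sketch computes the characteristic value for one target moment from the W processed spectrum data groups; the optional subsampling of w amplitudes mirrors the variant above, and all names and defaults are illustrative assumptions.

```python
import numpy as np

def characteristic_value(processed, zero_bin=0, w=None, use_std=False):
    """Characteristic value for one target moment.

    processed: (W, n_fft) array of normalized spectrum data groups for the W
               target echo signals of the sub-time period.
    zero_bin:  index of the frequency point whose range value is below the
               preset value (range-bin 0 in the embodiment).
    w:         optionally keep only w evenly spaced amplitudes to save computation.
    """
    amps = processed[:, zero_bin]                         # W zero-frequency amplitudes
    if w is not None and w < amps.size:
        idx = np.linspace(0, amps.size - 1, w).astype(int)
        amps = amps[idx]                                  # evenly spaced subset of w amplitudes
    return float(amps.std() if use_std else amps.var())  # statistic used as characteristic value
```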
  • A characteristic value sequence is obtained. The characteristic value sequence consists of the characteristic values corresponding to the multiple target moments, arranged in chronological order.
  • the feature values corresponding to part or all of the target moments from the first moment to the second moment may be used as a feature value sequence.
  • For example, with the sliding time window shown in Figure 8, after the characteristic value corresponding to one target moment has been determined based on the sub-time period corresponding to that target moment, and assuming the step size of the sliding time window is 1, the sliding time window slides by one step and the sub-time period corresponding to the next target moment can be determined; the characteristic value corresponding to that target moment is then determined based on its sub-time period. This continues until the characteristic value for the target moment equal to the second moment is obtained, at which point the characteristic values corresponding to all target moments, that is, the characteristic value sequence, have been obtained.
  • In some embodiments, the characteristic values corresponding to some or all of the target moments from the first moment to the second moment may first be normalized, and the normalized characteristic values are then used as the characteristic value sequence.
  • The parameters used in this normalization, such as the maximum value and the minimum value, can be set to the same global values, rather than to the maximum and minimum of the characteristic values corresponding to all target moments.
  • the target can be a detected object, and the target is not limited here.
  • the target is the user's hand.
  • whether the target is sensed may be determined based on a maximum threshold, a minimum threshold, and a sequence of characteristic values.
  • the minimum threshold and the maximum threshold are preset values.
  • the sensing range is from the minimum sensing distance to the maximum sensing distance. For example, if the sensing range is 5cm to 30cm, the minimum threshold can be the eigenvalue corresponding to 30cm; the maximum threshold can be the eigenvalue corresponding to 5cm.
  • In one implementation, the time period, within the first moment to the second moment, in which the characteristic value rises from the minimum threshold to the maximum threshold is determined as the first time period; further, when the moving speed of the target within the first time period is greater than a preset threshold, or when the duration of the first time period is greater than a preset duration, it is determined that the target is sensed, where the moving speed of the target within the first time period is the ratio of the difference between the maximum threshold and the minimum threshold to the duration of the first time period.
  • In this way, the terminal device can determine, based on the characteristic value sequence, the time it takes the target to move from the maximum sensing distance to the minimum sensing distance (that is, the first time period), as well as the target's moving speed or dwell time, so that cases where the target moves too fast or stays for too short a time can be judged as accidental touches. Understandably, this method can reduce accidental touches by the user and the unnecessary operations they bring to the terminal device.
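A minimal sketch of the first decision rule: find the stretch in which the characteristic value rises from the minimum threshold to the maximum threshold and check the implied moving speed. The thresholds, frame period, and speed threshold are assumed inputs, not values from the patent.

```python
def sensed_by_speed(values, t_min, t_max, frame_dt, speed_threshold):
    """Return True if the rise from t_min to t_max (the first time period) is fast enough.

    values:   characteristic value sequence, one value per target moment.
    frame_dt: time between consecutive target moments, in seconds.
    The moving speed is (t_max - t_min) divided by the duration of the rise.
    """
    start = None
    for i, v in enumerate(values):
        if start is None and v <= t_min:
            start = i                                   # candidate start of the first time period
        elif start is not None and v >= t_max:
            duration = (i - start) * frame_dt           # duration of the first time period
            return (t_max - t_min) / duration > speed_threshold
    return False
```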
  • For example, suppose the entire period from the first moment to the second moment is the first time period; that is to say, the terminal device determines that the characteristic value at the first moment is the minimum threshold and the characteristic value at the second moment is the maximum threshold.
  • In this case, the terminal device can determine that the target is sensed at the second moment, so that a next step such as step S605 is performed.
  • In another implementation, the duration of the detection time (that is, from the first moment to the second moment) can be set to a preset duration, and the period from the detection start time to the detection end time includes multiple detection times.
  • Each detection time can be determined based on the sliding-time-window method. Assuming that the step size is 1 moment, the terminal device determines, for each detection time, whether the target is sensed based on the first time period as described above.
  • For example, if the sensing range is 5 cm to 30 cm, when the user moves a hand from 30 cm to 5 cm and the moving speed meets the preset condition, the terminal device can determine that the target is sensed when the target moves to 5 cm.
  • In another implementation, the time period, within the first moment to the second moment, in which the characteristic values are all not less than the maximum threshold is determined as the second time period; further, when the duration of the second time period is greater than a preset duration, it is determined that the target is sensed.
  • Within the second time period, none of the characteristic values is less than the maximum threshold, that is, the distance to the target is less than or equal to the minimum sensing distance. In other words, this method determines that the target is sensed when the target remains within the minimum sensing distance for a preset period of time.
  • For example, if the sensing range is 5 cm to 30 cm, the terminal device determines that the target is sensed when the user moves a hand from 30 cm to a distance of 5 cm or less and keeps it there for the preset duration.
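A companion sketch of the second decision rule: the target is sensed when the characteristic values stay at or above the maximum threshold for longer than a preset dwell time (again, all parameters are assumed inputs).

```python
def sensed_by_dwell(values, t_max, frame_dt, min_dwell):
    """Return True if the characteristic values stay >= t_max for longer than min_dwell seconds."""
    run = 0
    for v in values:
        run = run + 1 if v >= t_max else 0      # length of the current second-time-period run
        if run * frame_dt > min_dwell:
            return True
    return False
```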
  • Figure 11 is a schematic diagram of a feature value sequence when a target approaches according to an embodiment of the present application.
  • Figure 11 exemplarily shows the changes in eigenvalues when the target approaches.
  • the abscissa is time and the ordinate is distance.
  • Assume the minimum threshold is the characteristic value corresponding to 30 cm and the maximum threshold is the characteristic value corresponding to 5 cm; it can then be determined that t1 is the first time period and t2 is the second time period.
  • When the target is located at 30 cm, the characteristic value is the minimum threshold; as the target moves from 30 cm to 5 cm, the characteristic value rises from the minimum threshold to the maximum threshold, and the time period during which the characteristic value rises from the minimum threshold to the maximum threshold is the first time period; when the target is located at a distance of less than 5 cm, the characteristic value is greater than the maximum threshold, and the time period during which the characteristic value is greater than the maximum threshold is the second time period.
  • the feature value sequence can be input into the trained neural network to obtain the perception result at the second moment.
  • The trained neural network is obtained through training with the characteristic value sequence corresponding to a sample time period as the input and the sensing result of the sample time period as the label.
  • For example, the neural network can use a long short-term memory (LSTM) network for binary-classification training, with a softmax layer as the output producing the classification result.
  • The classification result can be 0 or 1, where 0 indicates that the sensing result of the characteristic value sequence is that the target is not sensed, and 1 indicates that the sensing result of the characteristic value sequence is that the target is sensed.
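A compact PyTorch sketch of the kind of two-class LSTM classifier described here; the layer sizes, the single-feature input, and the usage line are illustrative assumptions, not details taken from the patent.

```python
import torch
import torch.nn as nn

class ProximityLSTM(nn.Module):
    """Binary classifier: characteristic value sequence -> sensed (1) / not sensed (0)."""

    def __init__(self, hidden_size: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 2)          # two classes, softmax over them

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, sequence_length, 1) characteristic value sequences
        _, (h_n, _) = self.lstm(x)                     # final hidden state summarizes the sequence
        return torch.softmax(self.head(h_n[-1]), dim=-1)

# Illustrative usage: class 1 means "target sensed" for the input sequence.
# probs = ProximityLSTM()(torch.randn(4, 60, 1)); result = probs.argmax(dim=-1)
```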
  • sample data can be obtained first.
  • The sample data includes characteristic value sequences corresponding to multiple sample time periods, and sample labels corresponding to the multiple characteristic value sequences, where a sample label is used to indicate the sensing result of the corresponding sample time period.
  • The sample labels can be 0 and 1.
  • For example, a characteristic value sequence whose values are greater than the maximum threshold for a duration exceeding the preset duration is labeled 1, and the other characteristic value sequences are labeled 0.
  • the method of marking sample labels can also refer to the marking situation shown in Figure 12 below.
  • Figure 12 is a schematic diagram of several tags provided by embodiments of the present application.
  • The abscissa in Figure 12 is time, the ordinate is the characteristic value, the dotted line represents the maximum threshold, and the curve is generated based on the correspondence between characteristic values and time.
  • (a), (b), and (c) in Figure 12 exemplarily show three situations in which the label is 0: when the characteristic value does not exceed the maximum threshold during the detection time, or when the duration for which the characteristic value is greater than the maximum threshold is less than the preset duration, it is determined that the target is not sensed within the detection time, and the characteristic value sequence is labeled 0.
  • (d) and (e) in Figure 12 exemplarily show two situations in which the label is 1: when the duration for which the characteristic value is greater than the maximum threshold within the detection time is greater than the preset duration, it is determined that the target is sensed within the detection time, and the characteristic value sequence is labeled 1.
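For preparing training data, the labeling rule sketched in Figure 12 can be written directly in terms of the dwell rule shown earlier; this one-liner simply reuses that illustrative sketch and is not the patent's own labeling code.

```python
def label_sequence(values, t_max, frame_dt, preset_duration):
    """Label 1 if the characteristic values stay above the maximum threshold long enough, else 0."""
    return int(sensed_by_dwell(values, t_max, frame_dt, preset_duration))
```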
  • the target interface can be determined based on the application scenario of the terminal device, and is not limited here.
  • the terminal device may be a vehicle-mounted display, and the sensing function is used by the vehicle-mounted display to display menu content when it senses a target.
  • FIG. 13 exemplarily shows a possible user interface on the display screen of the vehicle-mounted display.
  • (A) in Figure 13 shows the interface of the display screen before the target is sensed; when the vehicle-mounted display senses the target, the interface shown in (B) in Figure 13 can be displayed on the display screen.
  • As shown in (B) in Figure 13, the left side of the interface includes a menu with options such as call, settings, music, and weather. Understandably, the user can pop up the secondary submenu simply by bringing a hand close to the vehicle's central control screen; reducing the user's operation time can reduce driver distraction and improve driving safety.
  • the terminal device may be a smart home device, and the sensing function is used for the smart home device to display the content of the secondary menu on the display screen when it senses a target.
  • Figure 14 illustrates a possible user interface on a display screen of a smart home device.
  • (A) in Figure 14 is the interface of the display screen before the target is sensed.
  • This interface displays option bars such as the living room, master bedroom, second bedroom, and kitchen.
  • The slashed area indicates that the current option bar is selected, that is, the master bedroom is in the selected state. When the smart home device senses the target, the interface shown in (B) in Figure 14 can be displayed on the display screen.
  • This interface displays the secondary menu for the master bedroom, which includes options such as ceiling lights, air conditioning, and curtains.
  • the terminal device can be a home appliance.
  • When the home appliance senses a target, it wakes up the display panel. Understandably, this method can reduce the power consumption of the device.
  • The vehicle scenario shown in Figure 13 and the home scenario shown in Figure 14 are only exemplary implementations of the embodiments of the present application, and the application scenarios of the embodiments of the present application include but are not limited to the above scenarios.
  • The interfaces shown in Figures 13 and 14 are only exemplary interfaces provided by the embodiments of the present application and shall not limit the embodiments of the present application.
  • In some embodiments, the above embodiment may not include step S605; alternatively, the above embodiment may not include step S605 but may include other steps instead, such as driving the terminal device to move after the target is sensed, which is not limited here.
  • the device for sensing an object shown in Figure 1 can be applied to a vehicle, such as a display screen of the vehicle; the method for sensing an object shown in Figure 6 can also be executed by the vehicle or some equipment of the vehicle.
  • FIG. 15 is a schematic diagram of a possible functional framework of a vehicle 10 provided by an embodiment of the present application.
  • The functional framework of the vehicle 10 may include various subsystems, such as the sensor system 12, the control system 14, one or more peripheral devices 16 (one is shown as an example in the figure), the power supply 18, the computer system 20, and the device 22 for sensing targets shown in the figure.
  • Optionally, the vehicle 10 may also include other functional systems, such as an engine system that provides power for the vehicle 10, which are not limited here in this application.
  • The sensor system 12 may include several detection devices that can sense the measured information and convert the sensed information into electrical signals or other required forms of information output according to certain rules. As shown in the figure, these detection devices may include a global positioning system (GPS) 1201, a vehicle speed sensor 1202, an inertial measurement unit (IMU) 1203, a radar unit 1204, a laser rangefinder 1205, a camera unit 1206, a wheel speed sensor 1207, a steering sensor 1208, a gear sensor 1209, or other components for automatic detection, which are not limited by this application.
  • Global Positioning System GPS 1201 is a system that uses GPS positioning satellites to perform positioning and navigation in real time around the world.
  • the global positioning system GPS can be used to realize the real-time positioning of the vehicle and provide the vehicle's geographical location information.
  • the vehicle speed sensor 1202 is used to detect the driving speed of the vehicle.
  • the inertial measurement unit 1203 may include a combination of an accelerometer and a gyroscope and is a device that measures the angular rate and acceleration of the vehicle 10 . For example, while the vehicle is driving, the inertial measurement unit can measure the position and angle changes of the vehicle body based on the vehicle's inertial acceleration.
  • Radar unit 1204 may also be called a radar system.
  • the radar unit uses wireless signals to sense objects in the current environment in which the vehicle is traveling.
  • the radar unit can also sense information such as the object's speed and direction of travel.
  • the radar unit may be configured as one or more antennas for receiving or transmitting wireless signals.
  • the laser rangefinder 1205 is an instrument that can use modulated laser to measure the distance of a target object. That is, the laser rangefinder can be used to measure the distance of a target object.
  • the laser rangefinder may include, but is not limited to, any one or a combination of the following elements: laser source, laser scanner, and laser detector.
  • the camera unit 1206 is used to capture images, such as images and videos.
  • the camera unit can collect images of the environment where the vehicle is located in real time. For example, during the process of vehicles entering and exiting a tunnel, the camera unit can continuously collect corresponding images in real time.
  • The camera unit includes but is not limited to a driving recorder, a camera, or other components used for taking pictures or videos; the number of camera units is not limited in this application.
  • Wheel speed sensor 1207 is a sensor for detecting vehicle wheel rotation speed.
  • wheel speed sensors 1207 may include, but are not limited to, magnetoelectric wheel speed sensors and Hall wheel speed sensors.
  • Steering sensor 1208 which may also be referred to as a steering angle sensor, may represent a system for detecting the steering angle of a vehicle.
  • the steering sensor 1208 may be used to measure the steering angle of the vehicle steering wheel, or to measure an electrical signal representing the steering angle of the vehicle steering wheel.
  • the steering sensor 1208 can also be used to measure the steering angle of the vehicle tire, or to measure an electrical signal representing the steering angle of the vehicle tire, etc., which is not limited by this application.
  • That is, the steering sensor 1208 may be used to measure any one or a combination of the following: the steering angle of the steering wheel, an electrical signal representative of the steering angle of the steering wheel, the steering angle of the wheels (vehicle tires), and an electrical signal representative of the steering angle of the wheels.
  • the gear position sensor 1209 is used to detect the current gear position of the vehicle.
  • self-driving vehicles support 6 gears: P, R, N, D, 2 and L.
  • P (parking) gear is used for parking. It uses the vehicle's mechanical device to lock the braking part of the vehicle so that the vehicle cannot move.
  • R (reverse) gear also called reverse gear, is used for reversing the vehicle.
  • D (drive) gear also called forward gear, is used for vehicles driving on the road.
  • the 2nd gear (second gear) is also a forward gear and is used to adjust the vehicle's driving speed.
  • the second gear can usually be used for vehicles going up and down slopes.
  • L (low) gear also called low speed gear, is used to limit the driving speed of the vehicle. For example, on a downhill road, the vehicle enters L gear, so that the vehicle uses engine power to brake when going downhill. The driver does not have to apply the brakes for a long time and cause the brake pads to overheat and cause danger.
  • the control system 14 may include several elements, such as the illustrated steering unit 1401, braking unit 1402, lighting system 1403, automatic driving system 1404, map navigation system 1405, network time synchronization system 1406 and obstacle avoidance system 1407.
  • the control system 14 may also include components such as a throttle controller and an engine controller for controlling the driving speed of the vehicle, which are not limited in this application.
  • The steering unit 1401 may represent a system for adjusting the direction of travel of the vehicle 10, which may include, but is not limited to, a steering wheel or any other structural device used to adjust or control the direction of travel of the vehicle.
  • The braking unit 1402 may represent a system for slowing down the traveling speed of the vehicle 10, which may also be referred to as a vehicle braking system. It may include but is not limited to a brake controller, a reducer, or any other structural device used to decelerate the vehicle. In practical applications, the braking unit 1402 can use friction to slow down the vehicle tires, thereby reducing the driving speed of the vehicle.
  • the lighting system 1403 is used to provide lighting or warning functions for the vehicle.
  • the lighting system 1403 can activate the front lights and rear lights of the vehicle to provide lighting brightness for the vehicle to drive and ensure the safe driving of the vehicle.
  • lighting systems include but are not limited to front lights, rear lights, width lights, warning lights, etc.
  • The automatic driving system 1404 may include a hardware system and a software system for processing and analyzing data input to the automatic driving system 1404, so as to obtain the actual control parameters of each component in the control system 14, such as the desired braking pressure of the brake controller in the braking unit and the expected torque of the engine. This makes it convenient for the control system 14 to implement the corresponding control and ensure the safe driving of the vehicle.
  • the autonomous driving system 1404 can also determine information such as obstacles faced by the vehicle and characteristics of the environment in which the vehicle is located (such as the lane in which the vehicle is currently traveling, road boundaries, and upcoming traffic lights) by analyzing the data.
  • the data input to the automatic driving system 1404 can be image data collected by the camera unit, or data collected by various components in the sensor system 12, such as the steering wheel angle provided by the steering angle sensor, the wheel speed provided by the wheel speed sensor, etc. , this application is not limited.
  • the map navigation system 1405 is used to provide map information and navigation services for the vehicle 10 .
  • For example, the map navigation system 1405 can plan an optimal driving route, such as a route with the shortest distance or with less traffic, based on the vehicle's positioning information provided by the GPS (specifically, the vehicle's current location) and the destination address input by the user, so that the vehicle can navigate along this optimal driving route to reach the destination address.
  • the map navigation system can also provide or display corresponding map information to the user according to the user's actual needs, such as displaying the current road section of the vehicle on the map in real time, which is not limited by this application.
  • Network time system 1406 (network time system, NTS) is used to provide time synchronization services to ensure that the current system time of the vehicle is synchronized with the network standard time, which is beneficial to providing more accurate time information for the vehicle.
  • the network time synchronization system 1406 can obtain a standard time signal from a GPS satellite, and use the time signal to synchronously update the current system time of the vehicle to ensure that the current system time of the vehicle is consistent with the time of the obtained standard time signal.
  • the obstacle avoidance system 1407 is used to predict obstacles that the vehicle may encounter while driving, and then control the vehicle 10 to bypass or cross the obstacles to achieve normal driving of the vehicle 10 .
  • the obstacle avoidance system 1407 may analyze the sensor data collected by each element in the sensor system 12 to determine possible obstacles on the road where the vehicle is traveling. If the obstacle is large in size, such as a fixed building (building) on the roadside, the obstacle avoidance system 1407 can control the vehicle 10 to avoid the obstacle for safe driving. On the contrary, if the size of the obstacle is small, such as a small stone on the road, the obstacle avoidance system 1407 can control the vehicle 10 to overcome the obstacle and continue to drive forward.
  • Peripheral device 16 may include several elements, such as communication system 1601, touch screen 1602, user interface 1603, microphone 1604, speaker 1605, etc. in the illustration.
  • the communication system 1601 is used to implement network communication between the vehicle 10 and other devices other than the vehicle 10 .
  • the communication system 1601 may use wireless communication technology or wired communication technology to implement network communication between the vehicle 10 and other devices.
  • the wired communication technology may refer to communication between vehicles and other devices through network cables or optical fibers.
  • the wireless communication technology includes but is not limited to global system for mobile communications (GSM), general packet radio service (GPRS), Code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long-term Evolution (long term evolution, LTE), wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) network), Bluetooth (bluetooth, BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication technology (near field communication, NFC) and infrared technology (infrared, IR), etc.
  • the touch screen 1602 can be used to detect operating instructions on the touch screen 1602 .
  • the user performs a touch operation on the content data displayed on the touch screen 1602 according to actual needs to realize the function corresponding to the touch operation, such as playing music, video and other multimedia files.
  • the user interface 1603 may specifically be a touch panel and is used to detect operating instructions on the touch panel.
  • User interface 1603 may also be physical buttons or a mouse.
  • the user interface 1603 may also be a display screen for outputting data, displaying images or data.
  • the user interface 1603 may also be at least one device belonging to the category of peripheral devices, such as a touch screen, a microphone, a speaker, etc.
  • The microphone 1604, also called a "mic" or "sound transmitter", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak close to the microphone, and the sound signal is input into the microphone.
  • the speaker 1605 is also called a horn and is used to convert audio electrical signals into sound signals. The vehicle can listen to music or listen to hands-free calls through the speaker 1605.
  • the power source 18 represents a system that provides power or energy to the vehicle, which may include, but is not limited to, rechargeable lithium batteries or lead-acid batteries, etc. In practical applications, one or more battery components in the power supply are used to provide electric energy or energy for starting the vehicle. The type and material of the power supply are not limited in this application.
  • the power source 18 may also be an energy source used to provide an energy source for the vehicle, such as gasoline, diesel, ethanol, solar cells or panels, etc., which is not limited in this application.
  • the computer system 20 may include one or more processors 2001 (one processor is shown as an example) and a memory 2002 (which may also be referred to as a storage device).
  • The memory 2002 may be inside the computer system 20 or outside the computer system 20, for example as a cache in the vehicle 10, which is not limited in this application.
  • Processor 2001 may include one or more general-purpose processors, such as a graphics processing unit (GPU).
  • the processor 2001 may be used to run relevant programs or instructions corresponding to the programs stored in the memory 2002 to implement corresponding functions of the vehicle.
  • The memory 2002 may include volatile memory, such as RAM; the memory may also include non-volatile memory, such as ROM, flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the memory 2002 may also include combinations of the above types of memory.
  • the memory 2002 can be used to store a set of program codes or instructions corresponding to the program codes, so that the processor 2001 can call the program codes or instructions stored in the memory 2002 to implement corresponding functions of the vehicle. This function includes but is not limited to some or all of the functions in the vehicle function framework diagram shown in Figure 15. In this application, a set of program codes for vehicle control can be stored in the memory 2002, and the processor 2001 calls the program codes to control the safe driving of the vehicle. How to achieve safe driving of the vehicle will be described in detail below in this application.
  • the memory 2002 may also store information such as road maps, driving routes, sensor data, and the like.
  • The computer system 20 may cooperate with other elements in the vehicle functional framework diagram, such as the sensors and the GPS in the sensor system, to realize vehicle-related functions.
  • the computer system 20 can control the driving direction or driving speed of the vehicle 10 based on data input from the sensor system 12 , which is not limited in this application.
  • the device 22 for sensing targets may include several elements, such as the antenna unit 2201, the radar unit 2202 and the detection unit 2203 shown in FIG. 15 .
  • the device 22 for sensing the target may also include a control unit, which may be used to control the radar unit 2202 to send and receive signals through the antenna unit 2201 according to user instructions.
  • the functions of some elements of the device for sensing the target may also be provided by the vehicle.
  • the control unit may also be an element in the control system; for another example, the device 22 for sensing the target may not include the radar unit 2202 but be implemented by the radar unit 1204.
  • It should be noted that the four subsystems shown in Figure 15 of this application, namely the sensor system 12, the control system 14, the computer system 20, and the device 22 for sensing targets, are only examples and do not constitute a limitation.
  • the vehicle 10 can combine several components in the vehicle according to different functions, thereby obtaining subsystems with corresponding different functions.
  • the vehicle 10 may also include an electronic stability system (electronic stability program, ESP), an electric power steering system (electric power steering, EPS), etc., not shown in the figure.
  • the ESP system may be composed of some sensors in the sensor system 12 and some components in the control system 14.
  • For example, the ESP system may include the wheel speed sensor 1207, the steering sensor 1208, a lateral acceleration sensor, control units involved in the control system 14, and so on.
  • the EPS system may be composed of some sensors in the sensor system 12, some components in the control system 14, and the power supply 18.
  • For example, the EPS system may include the steering sensor 1208, the generator and reducer involved in the control system 14, the battery power supply, and so on.
  • the device for sensing the target may also include a user interface 1603 and a touch screen 1602 in peripheral devices to implement the function of receiving user instructions.
  • FIG. 15 is only a possible functional framework schematic diagram of the vehicle 10 .
  • the vehicle 10 may include more or fewer systems or components, which is not limited in this application.
  • The above-mentioned vehicle 10 can be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, a recreational vehicle, an amusement park vehicle, construction equipment, a tram, a golf cart, a train, a trolley, or the like, which is not particularly limited in the embodiments of this application.
  • all or part of the functions may be implemented by software, hardware, or a combination of software and hardware.
  • When implemented using software, the functions may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium.
  • The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device such as a server or a data center that integrates one or more available media.
  • the available media may be magnetic media (eg, floppy disk, hard disk, magnetic tape), optical media (eg, DVD), or semiconductor media (eg, solid state disk (SSD)), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

This application provides a method for sensing a target. The method includes: receiving echo signals from a first moment to a second moment; calculating, based on the received echo signals, a characteristic value corresponding to each target moment to obtain a characteristic value sequence, where the characteristic value sequence consists of characteristic values corresponding to multiple target moments arranged in chronological order, and a target moment is a moment between the first moment and the second moment; the process of calculating the characteristic value corresponding to a target moment includes: obtaining, based on multiple target echo signals received in a sub-time period preceding the target moment, target spectrum data respectively corresponding to the multiple target echo signals, where the sub-time period includes the target moment and the target spectrum data is the spectrum data corresponding to a frequency point whose range value is less than a preset value; and obtaining the characteristic value corresponding to the target moment based on the target spectrum data respectively corresponding to the multiple target echo signals; and sensing the target based on the characteristic value sequence. By implementing this technical solution, the target can be sensed accurately.

Description

A method and apparatus for sensing a target
This application claims priority to Chinese Patent Application No. 202210242015.3, filed with the China National Intellectual Property Administration on March 11, 2022 and entitled "一种感知目标的方法和装置" ("A method and apparatus for sensing a target"), which is incorporated herein by reference in its entirety.
Technical Field
This solution relates to radar technology applied in the fields of autonomous driving, intelligent driving, surveying and mapping, smart home, or intelligent manufacturing, and in particular to a method and apparatus for sensing a target.
Background
With the development of terminal technology, more and more terminal devices use proximity sensing to achieve energy saving and a minimalist control experience, for example on products such as vehicle central control screens and home appliance control panels. On such products, the target sensing function requires relatively high sensing accuracy, such as centimeter-level accuracy, to avoid accidental touches.
At present, methods for sensing a target include infrared, 2D/3D cameras, ultrasonic sensors, millimeter-wave radar, and the like; among them, the millimeter-wave radar approach requires a relatively high bandwidth to achieve high-accuracy sensing. For example, target distance detection based on frequency-modulated continuous wave (FMCW) millimeter-wave radar identifies valid moving targets by analyzing the range-Doppler spectrum and thereby obtains the distance of the target. To achieve centimeter-level distance sensing, this method requires a large bandwidth of more than 4 GHz; for example, using 4 GHz of bandwidth in the 60 GHz band can achieve centimeter-level distance sensing. However, because the 60 GHz band is not open for license-free use in many regions, the commonly usable 24 GHz millimeter-wave band offers only 250 MHz of bandwidth, under which the theoretical resolution of the above method is only 60 cm, so centimeter-level proximity sensing cannot be achieved.
How to improve the accuracy of target sensing under limited-bandwidth conditions is a problem that the industry urgently needs to solve.
Summary
Embodiments of this application provide a method and apparatus for sensing a target. Because the method senses the target through the variation characteristics of the spectrum data corresponding to frequency points whose range value is less than a preset value, it does not require the support of a high bandwidth, and can therefore improve the accuracy of target sensing under limited-bandwidth conditions.
According to a first aspect, an embodiment of this application provides a method for sensing a target, the method including:
receiving echo signals from a first moment to a second moment;
calculating, based on the received echo signals, a characteristic value corresponding to each target moment to obtain a characteristic value sequence, where the characteristic value sequence consists of characteristic values corresponding to multiple target moments arranged in chronological order, and a target moment is a moment between the first moment and the second moment;
where the process of calculating the characteristic value corresponding to a target moment includes: obtaining, based on multiple target echo signals received in a sub-time period preceding the target moment, target spectrum data respectively corresponding to the multiple target echo signals, where the sub-time period includes the target moment and the target spectrum data is the spectrum data corresponding to a frequency point whose range value is less than a preset value; and obtaining, based on the target spectrum data respectively corresponding to the multiple target echo signals, the characteristic value corresponding to the target moment; and
sensing the target based on the characteristic value sequence.
In embodiments of this application, the characteristic value can be calculated based on the spectrum data of the received echo signals corresponding to frequency points whose range value is less than the preset value; the target can then be sensed based on the correspondence between the characteristic value and distance. The method does not depend on bandwidth and only involves the correspondence between the characteristic value and distance, so it can improve sensing accuracy under low-bandwidth conditions.
In a possible implementation, the frequency point whose range value is less than the preset value is the frequency point whose range value is zero.
In a possible implementation, an N-point fast Fourier transform (FFT) can be performed on the chirp signals received by the radar to obtain a range amplitude spectrum, where the range amplitude spectrum represents the echo signal energy in each range bin, and the spectrum data corresponding to the frequency point whose range value is zero can be the amplitude corresponding to the zero-frequency point. That is, the zero-frequency point is the frequency point whose range-bin is 0, and the amplitude corresponding to the zero-frequency point is the amplitude corresponding to the frequency point whose range-bin is 0. This method senses the target through the variation characteristics of the amplitude corresponding to the zero-frequency point and can improve sensing accuracy under low-bandwidth conditions. Experiments show that, under the constraint of a 250 MHz bandwidth in the 24 GHz band, this method can break through the theoretical limitation of radar ranging and achieve centimeter-level proximity sensing with a low bandwidth.
In a possible implementation, the target moments are some or all of the moments between the first moment and the second moment.
In a possible implementation, obtaining the characteristic value corresponding to the target moment based on the target spectrum data respectively corresponding to the multiple target echo signals includes: obtaining part of the target spectrum data from the target spectrum data respectively corresponding to the multiple target echo signals; and obtaining, based on the part of the target spectrum data, the characteristic value corresponding to the target moment. Understandably, this method can save computation.
With reference to the first aspect, in a possible implementation, the characteristic value corresponding to the target moment is the variance or standard deviation of the target spectrum data respectively corresponding to the multiple target echo signals.
With reference to the first aspect, in a possible implementation, sensing the target based on the characteristic value sequence includes:
determining, based on the characteristic value sequence, a first time period from the first moment to the second moment, where the characteristic values corresponding to the target moments within the first time period rise from a minimum threshold to a maximum threshold; and
determining that the target is sensed when the moving speed of the target within the first time period is greater than a preset threshold, where the moving speed of the target within the first time period is the ratio of the difference between the maximum threshold and the minimum threshold to the duration of the first time period.
With reference to the first aspect, in a possible implementation, sensing the target based on the characteristic value sequence includes:
determining, based on the characteristic value sequence, a second time period from the first moment to the second moment, where none of the characteristic values corresponding to the target moments within the second time period is less than the maximum threshold; and
determining that the target is sensed when the duration of the second time period is greater than a preset duration.
With reference to the first aspect, in a possible implementation, sensing the target based on the characteristic value sequence includes:
inputting the characteristic value sequence into a trained neural network to obtain the sensing result at the second moment, where the trained neural network is obtained through training with the characteristic value sequence corresponding to a sample time period as the input and the sensing result of the sample time period as the label.
With reference to the first aspect, in a possible implementation, obtaining, based on the multiple target echo signals received in the sub-time period preceding the target moment, the target spectrum data respectively corresponding to the multiple target echo signals includes:
obtaining, based on the multiple target echo signals, a spectrum data group corresponding to each target echo signal, where the spectrum data group includes multiple pieces of spectrum data corresponding to multiple frequency points;
normalizing each spectrum data group separately to obtain multiple processed spectrum data groups; and
obtaining, from each processed spectrum data group, the spectrum data corresponding to the frequency point whose range value is less than the preset value, to obtain the target spectrum data corresponding to each target echo signal.
With reference to the first aspect, in a possible implementation, obtaining, based on the multiple target echo signals, the spectrum data group corresponding to each target echo signal includes:
calculating the intermediate frequency signal corresponding to the target echo signal; and
performing an N-point Fourier transform on the intermediate frequency signal to obtain the spectrum data group corresponding to the target echo signal, where the spectrum data group includes N pieces of data and N is a positive integer.
According to a second aspect, an embodiment of this application provides an apparatus for sensing a target. The apparatus includes an antenna unit and a sensing unit:
the antenna unit is configured to receive echo signals from a first moment to a second moment; and
the sensing unit is configured to calculate, based on the received echo signals, a characteristic value corresponding to each target moment to obtain a characteristic value sequence, where the characteristic value sequence consists of characteristic values corresponding to multiple target moments arranged in chronological order, and a target moment is a moment between the first moment and the second moment; the process of calculating the characteristic value corresponding to a target moment includes: obtaining, based on multiple target echo signals received in a sub-time period preceding the target moment, target spectrum data respectively corresponding to the multiple target echo signals, where the sub-time period includes the target moment and the target spectrum data is the spectrum data corresponding to a frequency point whose range value is less than a preset value; and obtaining, based on the target spectrum data respectively corresponding to the multiple target echo signals, the characteristic value corresponding to the target moment; and sensing the target based on the characteristic value sequence.
According to a third aspect, an embodiment of this application provides a vehicle, which includes the apparatus for sensing a target according to the second aspect.
According to a fourth aspect, an embodiment of this application provides an electronic device. The electronic device includes one or more processors and one or more memories, where the one or more memories are coupled to the one or more processors and are configured to store computer program code, the computer program code includes computer instructions, and when the one or more processors execute the computer instructions, the electronic device is caused to perform the method described in any possible implementation of the first aspect.
According to a fifth aspect, an embodiment of this application provides a computer program product containing instructions. When the computer program product runs on an electronic device, the electronic device is caused to perform the method described in any possible implementation of the first aspect.
According to a sixth aspect, an embodiment of this application provides a computer-readable storage medium including instructions. When the instructions run on an electronic device, the electronic device is caused to perform the method described in the first aspect, the second aspect, or the third aspect, or in any possible implementation of any of the above aspects.
It should be noted that some possible implementations of the second aspect, the third aspect, the fourth aspect, the fifth aspect, and the sixth aspect of this application share the same concept as some implementations of the first aspect; for the beneficial effects they bring, reference may be made to the beneficial effects of the first aspect, and details are not repeated here.
Brief Description of the Drawings
The accompanying drawings used in the embodiments of this application are described below.
Figure 1 is a schematic diagram of an apparatus for sensing a target provided by an embodiment of this application;
Figure 2 is a schematic diagram of the integration of an apparatus for sensing a target with a display provided by an embodiment of this application;
Figure 3 is a schematic diagram of a detection range provided by an embodiment of this application;
Figure 4A is a schematic structural diagram of an electronic device 100 provided by an embodiment of this application;
Figure 4B is a block diagram of the software structure of an electronic device 100 provided by an embodiment of this application;
Figure 5 is a schematic diagram of an application scenario provided by an embodiment of this application;
Figure 6 is a flowchart of a method for sensing a target provided by an embodiment of this application;
Figure 7 is a schematic diagram of a detection time provided by an embodiment of this application;
Figure 8 is a schematic diagram of a target moment provided by an embodiment of this application;
Figure 9 is a schematic flowchart of determining the characteristic value corresponding to a target moment provided by an embodiment of this application;
Figure 10 is a schematic diagram of a range spectrum provided by an embodiment of this application;
Figure 11 is a schematic diagram of a characteristic value sequence when a target approaches provided by an embodiment of this application;
Figure 12 is a schematic diagram of several labels provided by embodiments of this application;
Figure 13 shows a possible user interface on the display screen of a vehicle-mounted display provided by an embodiment of this application;
Figure 14 shows a possible user interface on the display screen of a smart home device provided by an embodiment of this application;
Figure 15 is a schematic diagram of a possible functional framework of a vehicle 10 provided by an embodiment of this application.
Detailed Description
The terms used in the following embodiments of this application are only intended to describe particular embodiments and are not intended to limit the embodiments of this application. As used in the specification and the appended claims of the embodiments of this application, the singular expressions "a", "an", "the", "the above", "this", and "the one" are intended to also include plural expressions, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" used in the embodiments of this application refers to and includes any or all possible combinations of one or more of the listed items.
For a better understanding of the method for sensing a target provided by the embodiments of this application, the apparatus for sensing a target used in the embodiments of this application is described first.
Referring to Figure 1, Figure 1 is a schematic diagram of an apparatus for sensing a target provided by an embodiment of this application. As shown in Figure 1, the apparatus for sensing a target may include an antenna unit 11, a radar unit 12, and a detection unit 13, where:
the antenna unit 11 includes a transmitting antenna and a receiving antenna, where the transmitting antenna is used to transmit radio frequency signals and the receiving antenna is used to receive echo signals of the radio frequency signals;
the radar unit 12 is used to send and receive signals together with the antenna unit 11 and to perform signal processing on the received echo signals to obtain spectrum data; for example, the radar unit 12 can calculate the range spectrum (Range-FFT) of the echo signals; and
the detection unit 13 is used to sense the target according to the spectrum data output by the radar unit 12 and obtain a detection result.
In one implementation, the antenna unit 11 may receive echo signals during the detection time; the radar unit 12 performs an N-point Fourier transform on the intermediate frequency signals corresponding to the received echo signals to obtain spectrum data groups corresponding to the echo signals; the detection unit 13 obtains, based on the spectrum data groups corresponding to the echo signals, the characteristic values corresponding to multiple target moments, and then senses the target based on the characteristic values corresponding to the multiple target moments, where a characteristic value is obtained based on the spectrum data corresponding to frequency points whose range value is less than a preset value.
In some embodiments, the apparatus for sensing a target may further include an application module, which is used to receive the detection result and perform a corresponding operation in response to the detection result. For example, if the apparatus for sensing a target includes an application module and a display screen, the application module may display a preset user interface on the display screen when it is determined that the target is sensed, for example waking up the apparatus from a screen-off state.
The above apparatus for sensing a target may be a stand-alone product, or may be integrated into an electronic device so that the electronic device becomes a terminal device with sensing capability.
The above electronic devices include but are not limited to smartphones, tablet computers, personal digital assistants (PDA), wearable electronic devices with wireless communication functions (such as smart watches and smart glasses), augmented reality (AR) devices, virtual reality (VR) devices, and the like. Exemplary embodiments of the electronic device include but are not limited to portable electronic devices running Linux or other operating systems. The above electronic device may also be another portable electronic device, such as a laptop computer. It should also be understood that, in some other embodiments, the above electronic device may not be a portable electronic device but a desktop computer or the like.
For example, the apparatus for sensing a target may be integrated into a display, such as a vehicle smart screen, a central control screen, or a home appliance display screen.
Referring to Figure 2, Figure 2 is a schematic diagram of the integration of an apparatus for sensing a target with a display provided by an embodiment of this application. As shown in (A) in Figure 2, the black solid circle represents the receiving antenna and the black solid square represents the transmitting antenna; the receiving antenna and the transmitting antenna can be embedded in the bezel of the display screen, specifically in the lower left corner of the display. (B) in Figure 2 is a schematic cross-sectional view of the display, showing an exemplary integration form of the receiving antenna: the blank area represents the display, the horizontally hatched area represents the receiving antenna, and the checkered area represents the radar unit. It can be seen that the receiving antenna can be embedded in the display, and the triangular hatched area is used to receive echo signals, which are then passed to the radar unit. It should be noted that the receiving antenna and the transmitting antenna may also be arranged at other positions of the display screen; this is only an example and shall not limit the integration position of the apparatus for sensing a target.
Figure 3 exemplarily shows the detection range of the apparatus for sensing a target shown in Figure 2. As shown in Figure 3, the triangular area represents the detection range in the vertical direction, and the elliptical area represents the detection range in the horizontal direction.
图4A为本申请实施例公开的一种电子设备100的结构示意图。
下面以电子设备100为例对实施例进行具体说明。应该理解的是,电子设备100可以具有比图中所示的更多的或者更少的部件,可以组合两个或多个的部件,或者可以具有不同的部件配置。图中所示出的各种部件可以在包括一个或多个信号处理和/或专用集成电路在内的硬件、软件、或硬件和软件的组合中实现。
电子设备100可以包括:处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192, 摄像头193,显示屏194以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本申请实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(derail clock line,SCL)。在一些实施例中,处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器180K,充电器,闪光灯,摄像头193等。例如:处理器110可以通过I2C接口耦合触摸传感器180K,使处理器110与触摸传感器180K通过I2C总线接口通信,实现电子设备100的触摸功能。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。在一些实施例中,音频模块170可以通过I2S接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块170与无线通信模块160可以通过PCM总线接口耦合。在一些实施例中,音频模块170 也可以通过PCM接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。所述I2S接口和所述PCM接口都可以用于音频通信。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器110与无线通信模块160。例如:处理器110通过UART接口与无线通信模块160中的蓝牙模块通信,实现蓝牙功能。在一些实施例中,音频模块170可以通过UART接口向无线通信模块160传递音频信号,实现通过蓝牙耳机播放音乐的功能。
MIPI接口可以被用于连接处理器110与显示屏194,摄像头193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和摄像头193通过CSI接口通信,实现电子设备100的拍摄功能。处理器110和显示屏194通过DSI接口通信,实现电子设备100的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头193,显示屏194,无线通信模块160,音频模块170,传感器模块180等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。
SIM接口可以被用于与SIM卡接口195通信,实现传送数据到SIM卡或读取SIM卡中数据的功能。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为电子设备100充电,也可以用于电子设备100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在本申请另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,外部存储器,显示屏194,摄像头193,和无线通信模块160等供电。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波 进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed, 量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度等进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行电子设备100的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用(比如人脸识别功能,指纹识别功能、移动支付功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如人脸信息模板数据,指纹信息模板等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转 换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备100可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备100接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。电子设备100可以设置至少一个麦克风170C。在另一些实施例中,电子设备100可以设置两个麦克风170C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,电子设备100还可以设置三个,四个或更多麦克风170C,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。电子设备100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏194,电子设备100根据压力传感器180A检测所述触摸操作强度。电子设备100也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当有触摸操作强度小于第一压力阈值的触摸操作作用于短消息应用图标时,执行查看短消息的指令。当有触摸操作强度大于或等于第一压力阈值的触摸操作作用于短消息应用图标时,执行新建短消息的指令。
陀螺仪传感器180B可以用于确定电子设备100的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定电子设备100围绕三个轴(即,x,y和z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。示例性的,当按下快门,陀螺仪传感器180B检测电子设备100抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消电子设备100的抖动,实现防抖。陀螺仪传感器180B还可以用于导航,体感游戏场景。
气压传感器180C用于测量气压。在一些实施例中,电子设备100通过气压传感器180C测得的气压值计算海拔高度,辅助定位和导航。
磁传感器180D包括霍尔传感器。电子设备100可以利用磁传感器180D检测翻盖皮套的开合。在一些实施例中,当电子设备100是翻盖机时,电子设备100可以根据磁传感器180D检测翻盖的开合。进而根据检测到的皮套的开合状态或翻盖的开合状态,设置翻盖自动解锁等特性。
加速度传感器180E可检测电子设备100在各个方向上(一般为三轴)加速度的大小。当电子设备100静止时可检测出重力的大小及方向。还可以用于识别电子设备姿态,应用于横竖屏切换,计步器等应用。
距离传感器180F,用于测量距离。例如,电子设备100可以通过红外或激光测量距离。在一些实施例中,拍摄场景,电子设备100可以利用距离传感器180F测距以实现快速对焦。
在本申请实施例中,距离传感器180F可以包括上述感知目标的装置,具体可以包括天线单元、雷达单元和检测单元。
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。电子设备100通过发光二极管向外发射红外光。电子设备100使用光电二极管检测来自附近物体的红外反射光。当检测到充分的反射光时,可以确定电子设备100附近有物体。当检测到不充分的反射光时,电子设备100可以确定电子设备100附近没有物体。电子设备100可以利用接近光传感器180G检测用户手持电子设备100贴近耳朵通话,以便自动熄灭屏幕达到省电的目的。接近光传感器180G也可用于皮套模式,口袋模式自动解锁与锁屏。
环境光传感器180L用于感知环境光亮度。电子设备100可以根据感知的环境光亮度自适应调节显示屏194亮度。环境光传感器180L也可用于拍照时自动调节白平衡。环境光传感器180L还可以与接近光传感器180G配合,检测电子设备100是否在口袋里,以防误触。
指纹传感器180H用于采集指纹。电子设备100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
温度传感器180J用于检测温度。在一些实施例中,电子设备100利用温度传感器180J检测的温度,执行温度处理策略。例如,当温度传感器180J上报的温度超过阈值,电子设备100执行降低位于温度传感器180J附近的处理器的性能,以便降低功耗实施热保护。在另一些实施例中,当温度低于另一阈值时,电子设备100对电池142加热,以避免低温导致电子设备100异常关机。在其他一些实施例中,当温度低于又一阈值时,电子设备100对电池142的输出电压执行升压,以避免低温导致的异常关机。
触摸传感器180K,也称“触控面板”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈 效果。触摸振动反馈效果还可以支持自定义。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和电子设备100的接触和分离。电子设备100可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口195可以支持Nano SIM卡,Micro SIM卡,SIM卡等。同一个SIM卡接口195可以同时插入多张卡。所述多张卡的类型可以相同,也可以不同。SIM卡接口195也可以兼容不同类型的SIM卡。SIM卡接口195也可以兼容外部存储卡。电子设备100通过SIM卡和网络交互,实现通话以及数据通信等功能。
本申请实施例中,电子设备100可以通过处理器110执行所述感知目标的方法,由显示屏194显示确定感知目标后的界面。
图4B为本申请实施例公开的一种电子设备100的软件结构框图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将系统分为四层,从上至下分别为应用程序层,应用程序框架层,运行时(Runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序包。
如图4B所示,应用程序层还包括感知模块,应用程序包可以包括相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息等应用程序(也可以称为应用)。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图4B所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供电子设备100的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后 台运行的应用程序的通知,还可以是以对话界面形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。
运行时(Runtime)包括核心库和虚拟机。Runtime负责系统的调度和管理。
核心库包含两部分:一部分是编程语言(例如,Java语言)需要调用的功能函数,另一部分是系统的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的编程文件(例如,Java文件)执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),二维图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了二维(2-Dimensional,2D)和三维(3-Dimensional,3D)图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现3D图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动,虚拟卡驱动。
下面结合捕获拍照场景,示例性说明电子设备100软件以及硬件的工作流程。
当触摸传感器180K接收到触摸操作,相应的硬件中断被发给内核层。内核层将触摸操作加工成原始输入事件(包括触摸坐标,触摸操作的时间戳等信息)。原始输入事件被存储在内核层。应用程序框架层从内核层获取原始输入事件,识别该输入事件所对应的控件。以该触摸操作是触摸单击操作,该单击操作所对应的控件为相机应用图标的控件为例,相机应用调用应用框架层的接口,启动相机应用,进而通过调用内核层启动摄像头驱动,通过摄像头193捕获静态图像或视频。
请参见图5,图5为本申请实施例提供的一种应用场景示意图。如图5所示,该终端设备为具备感知能力的终端设备,该终端设备包括显示屏和上述感知目标的装置,该终端设备的感知目标的装置位于终端设备左下角。用户可以用手靠近显示屏的左下方,发射天线发射的射频信号发射至用户手上,产生回波信号;回波信号被接收天线所接收;终端设备可以基于接收到回波信号确定是否有目标靠近,从而在目标靠近时通过显示屏显示相应的界面。
以图5的场景为例,介绍本申请实施例提供的一种感知目标的方法。
请参考图6,图6为本申请实施例提供的一种感知目标的方法的流程图,该方法可以由图5所示的终端设备执行,该方法可以包括以下部分或全部步骤。
S601、在第一时刻至第二时刻内接收回波信号。
其中,第一时刻至第二时刻也可以称为探测时间。请参见图7,图7是本申请实施例提供的一种探测时间的示意图,如图7所示,探测时间为第一时刻至第二时刻的时间段,该时间段内包括多个时刻。
在一些实施例中,终端设备可以实时处于探测状态,即实时通过发射天线发射射频信号和通过接收天线接收射频信号的回波信号。
在另一些实施例中,终端设备可以在接收到用户操作时,开始接收回波信号。例如,终端设备在开机状态时通过感知目标的装置探测目标,在休眠状态时不探测目标;又例如目标应用可以调用感知能力,终端设备可以在检测到用户针对目标应用输入的用户操作时,开始接收回波信号探测目标。
在一种实现中,在接收回波信号之前,可以预先配置雷达的射频信号的关键参数,如带宽、帧率和Chirp信号的信号数。例如,雷达可以采用FMCW调制方式,将带宽B设置为β≤B≤250MHz,β值设置为200MHz,如B值为250MHz,则采用传统的雷达测距下理论分辨率为60cm;雷达帧率设置为20帧每秒或以上,每帧包含k个Chirp信号,k为正整数。需要说明的是,以上仅为发射信号的一种示例,此处对射频信号的参数的设置不做限定。其中,Chirp信号为持续期间频率连续线性变化的信号,也即线性调频信号,是一种常用的雷达信号。
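作为对上述参数设置的补充说明,以下给出一个示意性的Python片段,按上述假设参数计算传统FMCW雷达测距的理论分辨率c/(2B),并组织带宽、帧率、每帧Chirp数等关键参数。其中的变量名与数值均为说明用途而假设,并非对射频信号参数的限定:

```python
# 示意性草稿:FMCW雷达关键参数配置与理论距离分辨率(变量名为假设)
C = 3e8  # 光速,单位:米/秒

def range_resolution(bandwidth_hz: float) -> float:
    """传统FMCW雷达测距的理论距离分辨率:c / (2 * B)。"""
    return C / (2.0 * bandwidth_hz)

radar_config = {
    "bandwidth_hz": 250e6,   # 带宽B,满足β≤B≤250MHz(此处示例取250MHz)
    "frame_rate_hz": 20,     # 帧率:每秒20帧或以上
    "chirps_per_frame": 4,   # 每帧包含k个Chirp信号(此处示例取k=4)
}

print(range_resolution(radar_config["bandwidth_hz"]))  # 输出0.6,即60cm
```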
S602、基于接收到的回波信号,计算每个目标时刻对应的特征值,目标时刻为第一时刻至第二时刻之间的时刻。
其中,第一时刻至第二时刻包括多个目标时刻。例如,第一时刻至第二时刻包括M个时刻,M为大于1的正整数,第一时刻至第二时刻内可以包括m个目标时刻,m为不大于M的正整数。此处对目标时刻的个数不做限定。
在一些实施例中,计算目标时刻对应的特征值的过程可以包括:针对每一个目标时刻,确定该目标时刻对应的子时间段,子时间段包括目标时刻;基于目标时刻对应的子时间段接收的目标回波信号,得到每一个目标回波信号对应的目标频谱数据,目标频谱数据为距离向取值小于预设值的频点对应的频谱数据;基于多个目标回波信号分别对应的目标频谱数据,得到目标时刻对应的特征值。例如目标频谱数据可以为零频点的幅值,则可以基于目标时刻对应的子时间段接收的目标回波信号,得到每一个目标回波信号对应的零频点的幅值;进而,可以基于得到的多个零频点的幅值,计算目标时刻对应的特征值。
在一种实现中,可以通过滑动时间窗确定目标时刻对应的子时间段,从而基于子时间段接收的多个目标回波信号确定该目标时刻对应的特征值。请参见图8,图8是本申请实施例提供的一种目标时刻的示意图。如图8所示,斜线区域表示滑动时间窗,图8中示例性的以三个时刻代表滑动时间窗的时长,则第一个目标时刻和该目标时刻对应的子时间段如图8所示;滑动时间窗每滑动一个步长可以确定一个目标时刻和该目标时刻对应的子时间段,假设滑动时间窗的步长为1个时刻,则第一时刻至第二时刻除前三个时刻外的其它时刻均可以为目标时刻,则第一时刻至第二时刻的多个目标时刻可以如图8所示。也即是说,假设探测时间包括Q个时刻,滑动时间窗包括q个时刻,则目标时刻的个数不大于Q-q,特征值的个数不大于Q-q,Q为大于1的正整数,q为大于0的正整数。
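为便于理解滑动时间窗与目标时刻、子时间段之间的对应关系,下面给出一个简化的Python草稿(假设窗长为q个时刻、步长为1个时刻,且前q个时刻不作为目标时刻;函数名与变量名均为说明假设):

```python
def sliding_sub_periods(Q: int, q: int):
    """在Q个时刻(编号1..Q)上使用长度为q、步长为1个时刻的滑动时间窗。
    返回每个目标时刻及其对应的子时间段(子时间段包含该目标时刻),
    目标时刻的个数不大于Q - q。"""
    results = []
    for t in range(q + 1, Q + 1):                   # 目标时刻t
        sub_period = list(range(t - q + 1, t + 1))  # 长度为q的子时间段
        results.append((t, sub_period))
    return results

# 示例:探测时间包含Q=10个时刻,滑动时间窗包含q=3个时刻,共得到7个目标时刻
for t, win in sliding_sub_periods(10, 3):
    print(t, win)
```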
以下示例性的以一个目标时刻为例,介绍一种确定目标时刻对应的特征值的方法。请参见图9,图9是本申请实施例提供的一种确定目标时刻对应的特征值的流程示意图,该方法包括以下部分或全部步骤:
S901、针对目标时刻对应的子时间段接收的W个目标回波信号,分别计算每个目标回波信号对应的中频信号,得到W个中频信号,其中,W为正整数。
其中,W个目标回波信号可以为目标时刻对应的子时间段接收的回波信号的部分或全部。也就是说,可以从接收的回波信号中抽取W个目标回波信号,具体可以根据预设规则进行抽取,例如间隔预设时间进行抽取。
在一些实施例中,射频信号可以为FMCW信号,在持续发射FMCW信号时,可以接收固定的单接收天线接收的每个Chirp回波信号;进而,分别获取对应每个Chirp回波信号的中频信号。其中,雷达的本振信号与雷达接收的回波信号(是雷达的发送信号经过目标物体反射后的信号)经过混频器处理后的信号,即为中频信号。
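以下是对"发射Chirp与回波混频得到中频信号"的一个数值示意(Python/NumPy,仅为复数基带下的简化草稿,忽略幅度衰减与噪声,采样率、目标距离等参数均为假设值,并非本申请的规定实现):

```python
import numpy as np

fs, T, B = 1e6, 1e-3, 250e6               # 采样率、Chirp时长、带宽(假设值)
t = np.arange(int(fs * T)) / fs
slope = B / T                              # 调频斜率

tx = np.exp(1j * np.pi * slope * t**2)            # 本振信号(发射Chirp的复数基带表示)
tau = 2 * 0.3 / 3e8                               # 假设目标距离0.3m,对应的双程时延
rx = np.exp(1j * np.pi * slope * (t - tau)**2)    # 回波信号(发送信号经目标物体反射后)

if_signal = tx * np.conj(rx)               # 混频(去调频)后的信号,即中频信号r(n)
```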
S902、分别对W个中频信号中每个中频信号进行N点傅里叶变换,得到W个频谱数据组,N为正整数。
其中,每个频谱数据组包括N个频谱数据;一个频谱数据为一个频点对应的幅值。
在一种实现中,假设r(n)为接收天线单个Chirp信号的中频信号,n为单个Chirp信号周期内采样数,n为正整数,则可以对r(n)做FFT计算,即1维(1D)-FFT计算,得到该中频信号的Range-FFT,Range-FFT上的1个频点对应一个距离值和一个反射强度值(即幅值),Range-FFT有n个频点;该中频信号中的所有幅值的序列即为该中频信号的频谱数据组,频谱数据组中的一个频谱数据即为一个频点对应的幅值。其中,Range-FFT为对Chirp信号进行N点FFT计算后得到的频域信号(复数值或模值)。
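下面给出对单个Chirp的中频信号r(n)做N点FFT得到Range-FFT的一个简化Python示例(沿用上一代码片段中假设的if_signal,函数名为说明假设):

```python
import numpy as np

def range_fft(if_signal: np.ndarray, N: int) -> np.ndarray:
    """对中频信号r(n)做N点FFT(即1D-FFT),得到Range-FFT;
    返回每个频点(Range-bin)对应的幅值,即该Chirp对应的频谱数据组。"""
    return np.abs(np.fft.fft(if_signal, n=N))

spectrum = range_fft(if_signal, N=1024)    # spectrum[k]为第k个Range-bin对应的幅值
```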
请参见图10,图10为本申请实施例提供的一种距离谱的示意图。如图10所示,t用于表示时间,A用于表示幅值,一个波形对应一个Chirp信号,示例性的以一帧包含4个Chirp信号为例。
S903、分别对W个频谱数据组中每个频谱数据组进行归一化处理,得到W个处理后的频谱数据组。
其中,归一化的方法可以采用(0,1)标准化或Z-score标准化等方法,此处对归一化处理的方法不作限定。
S904、分别从W个处理后的频谱数据组中每个处理后的频谱数据组中获取距离向取值小于预设值的频点对应的频谱数据,得到W个幅值。
其中,距离向取值小于预设值的频点对应的频谱数据即为距离向取值小于预设值的频点对应的幅值,也就是range-bin为0的频点对应的幅值。其中,Range-bin指Range-FFT中的每个频点,而每个频点对应雷达与目标物体的距离信息,所以可将每个频点定义为Range-bin。
S905、基于W个幅值,得到该目标时刻对应的特征值。
在一些实施例中,可以计算W个幅值的方差或标准差,得到统计值σt;将该统计值作为该目标时刻对应的特征值σt
在一种实现中,也可以从W个幅值中间隔均匀抽取w个幅值,计算该w个幅值的方差或标准差,得到统计值σt;将该统计值作为该目标时刻对应的特征值σt。可以理解的,该方法可以降低计算量。
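综合步骤S903至S905,下面给出计算单个目标时刻特征值σt的一个简化Python草稿(输入为该目标时刻对应子时间段内W个目标回波信号的频谱数据组,矩阵形状与函数名等均为说明假设):

```python
import numpy as np

def feature_for_moment(spectra: np.ndarray, subsample: int = 1) -> float:
    """spectra形状为(W, N):W个频谱数据组,每组包含N个频点的幅值。
    步骤:逐组做(0,1)归一化 -> 取range-bin为0的频点幅值 -> 以标准差作为特征值σt。"""
    mins = spectra.min(axis=1, keepdims=True)
    maxs = spectra.max(axis=1, keepdims=True)
    normed = (spectra - mins) / (maxs - mins + 1e-12)   # (0,1)标准化,加小量避免除零
    zero_bin = normed[::subsample, 0]                   # 可按间隔抽取以降低计算量
    return float(np.std(zero_bin))                      # 也可改用方差np.var
```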
S603、基于每个目标时刻对应的特征值,得到特征值序列,特征值序列由按时间先后排序的多个目标时刻对应的特征值组成。
在一些实施例中,可以将第一时刻至第二时刻中部分或所有目标时刻对应的特征值作为特征值序列。以图8所示的滑动时间窗为例,在基于目标时刻对应的子时间段确定该目标时刻对应的特征值后,假设滑动时间窗的步长为1,则滑动时间窗每滑动一个步长,可以确定一个目标时刻对应的子时间段;进而,基于该目标时刻对应的子时间段可以确定该目标时刻对应的特征值,直至获取到目标时刻为第二时刻时的特征值,得到所有目标时刻对应的特征值,即特征值序列。
在另一些实施例中,可以先将第一时刻至第二时刻中部分或所有目标时刻对应的特征值进行归一化处理后,将归一化处理后的特征值作为特征值序列。例如,可以采用最小最大缩放(Min-Max scaling)线性函数归一化或对数函数等归一化方式对所有目标时刻对应的特征值进行归一化,其中,归一化中的参数如最大值、最小值可以设置为全局相同值,而非所有目标时刻对应的特征值的最大值和最小值。
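下面是对"使用全局设定的最大值、最小值做Min-Max归一化"的一个简化示意(Python/NumPy,函数名与参数名为说明假设):

```python
import numpy as np

def normalize_sequence(features, global_min: float, global_max: float):
    """对目标时刻对应的特征值序列做Min-Max归一化;
    其中最大值、最小值采用全局设定值,而非该序列自身的极值。"""
    f = np.asarray(features, dtype=float)
    return np.clip((f - global_min) / (global_max - global_min), 0.0, 1.0)
```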
S604、基于特征值序列感知目标。
其中,目标可以为探测到的物体,此处对目标不作限定。在如图5所示的场景中,目标即可以为用户的手。
在一些实施例中,可以基于最大门限、最小门限和特征值序列确定是否感知到目标。其中,最小门限和最大门限为预设值。
以下示例性的介绍一种确定最小门限和最大门限的方法:可以先确定特征值与距离值之间的对应关系,例如可以预先统计每一个特征值对应的距离,进而将特征值和其对应的距离值进行线性或单调非线性(如对数曲线)拟合,从而实现特征值到距离值的映射,得到特征值和距离值的对应关系;进而,基于该对应关系,将最小感知距离对应的特征值确定为最大门限,将最大感知距离对应的特征值确定为最小门限,例如感知范围为最小感知距离至最大感知距离。例如,感知范围为5cm至30cm,则最小门限可以为30cm对应的特征值;最大门限可以为5cm对应的特征值。
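下面给出上述门限确定方法的一个简化Python示意:先用预先统计的(距离,特征值)样本做线性拟合得到对应关系,再将最小/最大感知距离映射为最大/最小门限。其中的样本数值仅为说明而假设:

```python
import numpy as np

# 预先统计得到的(距离, 特征值)样本,数值仅为示例假设
dist_cm   = np.array([5, 10, 15, 20, 25, 30], dtype=float)
feat_vals = np.array([0.90, 0.62, 0.41, 0.27, 0.18, 0.12])

coeffs = np.polyfit(dist_cm, feat_vals, deg=1)   # 线性拟合,也可改用对数等单调非线性拟合
feature_of = np.poly1d(coeffs)

sigma_max = feature_of(5.0)    # 最大门限:最小感知距离(5cm)对应的特征值
sigma_min = feature_of(30.0)   # 最小门限:最大感知距离(30cm)对应的特征值
```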
在一种实现中,基于特征值序列,将第一时刻至第二时刻中特征值由最小门限上升至最大门限的时间段确定为第一时间段;进而,在第一时间段内目标的移动速度大于预设阈值时或第一时间段的时长大于预设时长时,确定感知到目标,其中,第一时间段内目标的移动速度为最大门限和最小门限的差与第一时间段的时长的比值。也就是说,终端设备可以基于特征值序列确定目标从最大感知距离移动至最小感知距离的时间(即第一时间段)以及目标的移动速度或停留时长,从而将移动速度过快或停留时长过短的判定为误触,可以理解的,该方法可以减少用户误触而给终端设备带来不必要的操作。
例如第一时刻至第二时刻即为第一时间段,也即是说,终端设备确定第一时刻的特征值为最小门限,第二时刻的特征值为最大门限,第一时刻至第二时刻中特征值由最小门限上升至最大门限,且在第一时刻至第二时刻内目标的移动速度大于预设阈值时或第一时刻至第二时刻的时长大于预设时长时,终端设备可以确定在第二时刻感知到目标,从而执行下一个步骤如步骤S605。
可选地,可以将探测时间(即第一时刻至第二时刻)的时长设置为预设时长;终端设备的探测开始时刻到探测结束时刻包括多个探测时间,探测时间可以基于滑动时间窗的方法确定,假设步长为1个时刻,则终端设备在基于第一时间段确定在某一探测时间的第二时刻感知到目标时,即可在用户将目标移动至最小感知距离时确定感知到目标,从而触发下一操作。例如感知范围为5cm至30cm,则用户将手从30cm移动至5cm且移动速度满足预设条件时,终端设备可以在目标移动至5cm时确定感知到目标。
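对于上述基于第一时间段的实现,下面给出一个简化的Python判定草稿(对"特征值由最小门限上升至最大门限的时间段"的一种可能取法,函数名、参数均为说明假设):

```python
import numpy as np

def detect_by_rise(seq, sigma_min, sigma_max, speed_th, min_duration, dt=1.0):
    """在特征值序列seq中确定第一时间段(由最小门限上升至最大门限),
    按"移动速度=(最大门限-最小门限)/时长"或时长条件,判断是否感知到目标。"""
    seq = np.asarray(seq, dtype=float)
    below = np.where(seq <= sigma_min)[0]
    if below.size == 0:
        return False
    start = below[-1]                              # 最后一次不高于最小门限的时刻
    above = np.where(seq[start:] >= sigma_max)[0]
    if above.size == 0:
        return False
    duration = above[0] * dt                       # 第一时间段的时长
    speed = (sigma_max - sigma_min) / max(duration, dt)
    return speed > speed_th or duration > min_duration
```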
在另一种实现中,基于特征值序列,将第一时刻至第二时刻中特征值均不小于最大门限的时间段确定为第二时间段;进而,在第二时间段内的时长大于预设时长时,确定感知到目标。其中,特征值均不小于最大门限,也即是目标距离小于等于最小感应距离。也就是,该方法在目标处于最小感应距离内且保持预设时长时确定感知到目标。例如感知范围为5cm至30cm,则用户将手从30cm移动至小于等于5cm的距离且保持在该距离满足预设时长时,终端设备确定感知到目标。
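对于上述基于第二时间段的实现,下面给出一个简化的Python判定草稿(统计特征值连续不小于最大门限的最长时间段,变量名为说明假设):

```python
import numpy as np

def detect_by_dwell(seq, sigma_max, min_duration, dt=1.0):
    """统计特征值连续不小于最大门限的最长时间段(即第二时间段),
    其时长大于预设时长时判定感知到目标。"""
    longest = current = 0
    for v in np.asarray(seq, dtype=float):
        current = current + 1 if v >= sigma_max else 0
        longest = max(longest, current)
    return longest * dt > min_duration
```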
请参见图11,图11是本申请实施例提供的一种目标靠近时特征值序列的示意图。图11示例性的示出了目标靠近时特征值的变化情况,如图11所示,横坐标为时间,纵坐标为距离,假设最小门限σ0为30cm对应的特征值,最大门限σ1为5cm对应的特征值,则可以确定t1为第一时间段,t2为第二时间段。也就是说,目标在位于30cm处时,特征值为最小门限σ0;目标从位于30cm处移动至5cm时,特征值由最小门限σ0增大至最大门限σ1,特征值由最小门限σ0增大至最大门限σ1的时间段为第一时间段;目标位于距离小于5cm的位置时,特征值大于最大门限σ1,特征值大于最大门限σ1的时间段为第二时间段。
在另一些实施例中,可以基于训练后的神经网络,确定是否感知到目标。例如,可以将特征值序列输入训练后的神经网络,得到第二时刻的感知结果,其中,训练后的神经网络是基于样本时间段对应的特征值序列为输入,样本时间段的感知结果为标签训练得到的。其中,神经网络可以采用长短期记忆网络(LSTM,Long Short-Term Memory)进行二分类训练,softmax层作为输出,得出分类结果,分类结果可以为0或1,0用于表示该特征值序列的感知结果为未感知到目标,1用于表示该特征值序列的感知结果为感知到目标。
以下示例性的提供一种训练神经网络的实现:可以先获取样本数据,样本数据包括多个样本时间段对应的特征值序列,以及多个特征值序列对应的样本标签,其中,样本标签用于指示样本时间段的感知结果,例如样本标签可以为0和1,将特征值序列均大于最大门限且特征值序列对应的时长超过预设值的特征值序列的样本标签标记为1,其他标记为0;进而,将样本时间段对应的特征值序列输入神经网络;基于神经网络输出的标签与样本标签得到误差;基于误差优化该神经网络,在误差满足预设阈值时,得到训练后的神经网络。其中,标记样本标签的方法还可以参见以下图12所示的标记情况。
请参见图12,图12为本申请实施例提供的几种标签的示意图。图12中的横坐标为时间,纵坐标为特征值,虚线用于表示最大门限,曲线为基于特征值与时间的对应关系生成的曲线;图12中的(a)、(b)和(c)示例性的示出了三种标签为0的情况,即在探测时间内特征值不大于最大门限或特征值大于最大门限的时长小于预设时长时,判定该探测时间内未感知到目标,将该特征值序列标记为0;图12中的(d)和(e)示例性的示出了两种标签为1的情况,即在探测时间内特征值大于最大门限的时长大于预设时长时,判定该探测时间内感知到目标,将该特征值序列标记为1。
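下面给出基于LSTM做二分类训练的一个最小化示意(此处假设使用PyTorch框架,softmax在训练时并入交叉熵损失;网络规模、超参数与随机示例数据均为说明假设,并非本申请的规定实现):

```python
import torch
import torch.nn as nn

class PresenceLSTM(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.fc = nn.Linear(hidden, 2)            # 二分类:0未感知到目标,1感知到目标

    def forward(self, x):                          # x形状:(batch, 序列长度, 1)
        out, _ = self.lstm(x)
        return self.fc(out[:, -1, :])              # 取最后时刻的隐状态做分类

model = PresenceLSTM()
criterion = nn.CrossEntropyLoss()                  # 等效于softmax + 负对数似然
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# x为样本时间段对应的特征值序列,y为样本标签(0或1);此处用随机数据代替
x = torch.randn(8, 50, 1)
y = torch.randint(0, 2, (8,))
for _ in range(100):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
```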
S605、在感知到目标时,将当前界面显示为目标界面。
其中,目标界面可以基于终端设备的应用场景确定,此处不做限定。
例如在车载场景中,该终端设备可以为车载显示器,该感知功能用于车载显示器在感知到目标时显示菜单内容。请参见图13,图13示例性示出了车载显示器的显示屏上可能的用户界面。假设图13中的(A)为未感知到目标前的显示屏的界面;在车载显示器感知到目标时,可以通过显示屏显示如图13中的(B)所示的界面,如图13中的(B)所示,该界面左侧包括菜单,该菜单包括呼叫、设置、音乐以及天气等选项。可以理解的,用户可以在车载中控屏上通过感知手部靠近来弹出二级子菜单,减少用户的操作时间即能减少驾驶员的分心,提高驾驶安全。
又例如在家居场景中,该终端设备可以为智能家居设备,该感知功能用于智能家居设备在感知到目标时在显示屏显示二级菜单的内容。请参见图14,图14示例性示出了智能家居设备的显示屏上可能的用户界面。假设图14中的(A)为未感知到目标前的显示屏的界面,该界面显示有客厅、主卧、次卧和厨房等选项栏,其中,斜线区域用于指示当前选项栏被选中,即主卧处于选中状态;在智能家居设备感知到目标时,可以通过显示屏显示如图14中的(B)所示的界面,如图14中的(B)所示,该界面显示主卧的二级菜单,该二级菜单包括顶灯、空调和窗帘等选项。
又例如该终端设备可以为家电设备,在家电设备在感知到目标时,唤醒显示面板,可以理解的,该方法可降低设备功耗。
可以理解的是,图13所示的车载场景和图14所示的家居场景只是本申请实施例的示例性的实施方式,本申请实施例的应用场景包括但不仅限于以上场景;图13和图14所示的界面仅为本申请实施例示例性提供的界面,不应对本申请实施例造成限定。
在一些实施例中,上述实施例可以不包括步骤S605;或者,上述实施例可以不包括步骤S605,而包括其它步骤,例如感知到目标后,驱动终端设备移动等,此处不作限定。
在一些实施例中,图1所示的感知目标的装置可以应用于车辆,例如应用于车辆的显示屏;图6所示感知目标的方法也可以由车辆或车辆的部分设备执行。
请参见图15,图15是本申请实施例提供的一种车辆10的一种可能的功能框架示意图。如图15所示,车辆10的功能框架中可包括各种子系统,例如图示中的传感器系统12、控制系统14、一个或多个外围设备16(图示以一个为例示出)、电源18、计算机系统20和感知目标的装置22。可选地,车辆10还可包括其他功能系统,例如为车辆10提供动力的引擎系统等等,本申请这里不做限定。其中,
传感器系统12可包括若干检测装置,这些检测装置能感受到被测量的信息,并将感受到的信息按照一定规律将其转换为电信号或者其他所需形式的信息输出。如图示出,这些检测装置可包括全球定位系统1201(global positioning system,GPS)、车速传感器1202、惯性测量单元1203(inertial measurement unit,IMU)、雷达单元1204、激光测距仪1205、摄像单元1206、轮速传感器1207、转向传感器1208、档位传感器1209、或者其他用于自动检测的元件等等,本申请并不做限定。
全球定位系统GPS 1201是利用GPS定位卫星,在全球范围内实时进行定位、导航的系统。本申请中,全球定位系统GPS可用于实现车辆的实时定位,提供车辆的地理位置信息。车速传感器1202用于检测车辆的行车车速。惯性测量单元1203可以包括加速计和陀螺仪的组合,是测量车辆10的角速率和加速度的装置。例如,在车辆行驶过程中,惯性测量单元基于车辆的惯性加速可测量车身的位置和角度变化等。
雷达单元1204也可称雷达系统。雷达单元在车辆行驶所处的当前环境中,利用无线信号感测物体。可选地,雷达单元还可感测物体的运行速度和行进方向等信息。在实际应用中,雷达单元可被配置为用于接收或发送无线信号的一个或多个天线。激光测距仪1205可利用调制激光实现对目标物体的距离测量的仪器,也即是激光测距仪可用于实现对目标物体的距离测量。在实际应用中,该激光测距仪可包括但不限于以下中的任一种或多种元件的组合:激光源、激光扫描仪和激光检测器。
摄像单元1206用于拍摄影像,例如图像和视频等。本申请中,在车辆行驶过程中或者摄像单元启用后,该摄像单元可实时采集车辆所处环境中的图像。例如,在车辆进出隧道的过程中,摄像单元可实时、连续地采集相应的图像。在实际应用中,该摄像单元包括但不限于行车记录仪、摄像头、相机或其他用于拍照/摄影的元件等,该摄像单元的数量本申请也不做限定。
轮速传感器1207是用于检测车辆车轮转速的传感器。常用的轮速传感器1207可包括但不限于磁电式轮速传感器和霍尔式轮速传感器。转向传感器1208,也可称为转角传感器,可代表用于检测车辆的转向角的系统。在实际应用中,该转向传感器1208可用于测量车辆方向盘的转向角度,或者用于测量表示车辆方向盘的转向角的电信号。可选地,该转向传感器1208也可用于测量车辆轮胎的转向角度,或者用于测量表示车辆轮胎的转向角的电信号等等,本申请并不做限定。
也即是,转向传感器1208可用于测量以下中的任一种或多种的组合:方向盘的转向角、表示方向盘的转向角的电信号、车轮(车辆轮胎)的转向角和表示车轮的转向角的电信号等。
档位传感器1209,用于检测车辆行驶的当前档位。由于车辆的出厂商不同,则车辆中的档位也可能存在不同。以自动驾驶车辆为例,自动驾驶车辆支持6个档位,分别为:P档、R档、N档、D档、2档及L档。其中,P(parking)档用于停车,它利用车辆的机械装置锁住车辆的制动部分,使车辆不能移动。R(reverse)档,也称为倒档,用于车辆倒车。D(drive)档,也称前进档,用于车辆在道路上行驶。2(second gear)档也为前进档,用于调整车辆的行驶速度。2档通常可用作车辆上、下斜坡处使用。L(low)档,也称为低速档,用于限定车辆的行驶速度。例如在下坡道路上,车辆进入L档,使得车辆在下坡时使用发动机动力进行制动,驾驶员不必长时间踩刹车导致刹车片过热而发生危险。
控制系统14可包括若干元件,例如图示出的转向单元1401、制动单元1402、照明系统1403、自动驾驶系统1404、地图导航系统1405、网络对时系统1406和障碍规避系统1407。可选地,控制系统14还可包括诸如用于控制车辆行驶速度的油门控制器及发动机控制器等元件,本申请不做限定。
转向单元1401可代表用于调节车辆10的行进方向的系统,其可包括但不限于方向盘、 或其他用于调整或控制车辆行进方向的任意结构器件。制动单元1402可代表用于减慢车辆10的行驶速度的系统,也可称为车辆刹车系统。其可包括但不限于刹车控制器、减速器或其他用于车辆减速的任意结构器件等。在实际应用中,制动单元1402可利用摩擦来使车辆轮胎减慢,进而减慢车辆的行驶速度。照明系统1403用于为车辆提供照明功能或警示功能。例如,在车辆夜间行驶过程中,照明系统1403可启用车辆的前车灯和后车灯,以提供车辆行驶的光照亮度,保证车辆的安全行驶。在实际应用中,照明系统中包括但不限于前车灯、后车灯、示宽灯以及警示灯等。
自动驾驶系统1404可包括硬件系统和软件系统,用于处理和分析输入该自动驾驶系统1404的数据以获得控制系统14中各部件的实际控制参数,例如制动单元中刹车控制器的期望制动压力及发动机的期望扭矩等等。便于控制系统14实现相应控制,保证车辆的安全行驶。可选地,自动驾驶系统1404通过分析数据还可确定车辆面临的障碍物、车辆所处环境的特征(例如车辆当前行驶所在的车道、道路边界以及即将经过的交通红绿灯)等信息。其中,输入自动驾驶系统1404的数据可以是摄像单元采集的图像数据,也可以是传感器系统12中各元件采集的数据,例如转向角传感器提供的方向盘转角、轮速传感器提供的车轮轮速等等,本申请并不做限定。
地图导航系统1405用于为车辆10提供地图信息和导航服务。在实际应用中,地图导航系统1405可根据GPS提供的车辆的定位信息(具体可为车辆的当前位置)和用户输入的目的地址,规划一条最优驾驶路线,例如路程最短或车流量较少的路线等。便于车辆按照该最优驾驶路线进行导航行驶,以到达目的地址。可选地,地图导航系统除了提供导航功能外,还可根据用户实际需求向用户提供或展示相应的地图信息,例如在地图上实时展示车辆当前行驶的路段等,本申请不做限定。
网络对时系统1406(network time system,NTS)用于提供对时服务,以保证车辆的系统当前时间和网络标准时间同步,有利于为车辆提供更为精确的时间信息。具体实现中,网络对时系统1406可从GPS卫星上获得标准的时间信号,利用该时间信号来同步更新车辆的系统当前时间,保证车辆的系统当前时间和获得的标准时间信号的时间一致。
障碍规避系统1407用于预测车辆行驶过程中可能遇到的障碍物,进而控制车辆10绕过或越过障碍物以实现车辆10的正常行驶。例如,障碍规避系统1407可利用传感器系统12中各元件采集的传感器数据分析确定车辆行驶道路上可能存在的障碍物。如果该障碍物的尺寸较大,例如为路边的固定建筑物(楼房)等,障碍规避系统1407可控制车辆10绕开该障碍物以进行安全行驶。反之,如果该障碍物的尺寸较小,例如为路上的小石头等,障碍规避系统1407可控制车辆10越过该障碍物继续向前行驶等。
外围设备16可包括若干元件,例如图示中的通信系统1601、触摸屏1602、用户接口1603、麦克风1604以及扬声器1605等等。其中,通信系统1601用于实现车辆10和除车辆10之外的其他设备之间的网络通信。在实际应用中,通信系统1601可采用无线通信技术或有线通信技术实现车辆10和其他设备之间的网络通信。该有线通信技术可以是指车辆和其他设备之间通过网线或光纤等方式通信。该无线通信技术包括但不限于全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS), 码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE)、无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络)、蓝牙(bluetooth,BT)、全球导航卫星系统(global navigation satellite system,GNSS)、调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC)以及红外技术(infrared,IR)等等。
触摸屏1602可用于检测触摸屏1602上的操作指令。例如,用户根据实际需求对触摸屏1602上展示的内容数据进行触控操作,以实现该触控操作对应的功能,例如播放音乐、视频等多媒体文件等。用户接口1603具体可为触控面板,用于检测触控面板上的操作指令。用户接口1603也可以是物理按键或者鼠标。用户接口1603还可以是显示屏,用于输出数据,显示图像或数据。可选地,用户接口1603还可以是属于外围设备范畴中的至少一个设备,例如触摸屏、麦克风和扬声器等。
麦克风1604,也称为话筒、传声器,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户靠近麦克风发声,可将声音信号输入到麦克风中。扬声器1605也称为喇叭,用于将音频电信号转换为声音信号。车辆通过扬声器1605可以收听音乐,或者收听免提通话等。
电源18代表为车辆提供电力或能源的系统,其可包括但不限于再充电的锂电池或铅酸电池等。在实际应用中,电源中的一个或多个电池组件用于提供车辆启动的电能或能量,电源的种类和材料本申请并不限定。可选地,电源18也可为能量源,用于为车辆提供能量源,例如汽油、柴油、乙醇、太阳能电池或电池板等等,本申请不做限定。
车辆10的若干功能均由计算机系统20控制实现。计算机系统20可包括一个或多个处理器2001(图示以一个处理器为例示出)和存储器2002(也可称为存储装置)。在实际应用中,该存储器2002可在计算机系统20内部,也可在计算机系统20外部,例如作为车辆10中的缓存等,本申请不做限定。其中,
处理器2001可包括一个或多个通用处理器,例如图形处理器(graphics processing unit,GPU)。处理器2001可用于运行存储器2002中存储的相关程序或程序对应的指令,以实现车辆的相应功能。
存储器2002可以包括易失性存储器(volatile memory),例如RAM;存储器也可以包括非易失性存储器(non-volatile memory),例如ROM、快闪存储器(flash memory)、HDD或固态硬盘SSD;存储器2002还可以包括上述种类的存储器的组合。存储器2002可用于存储一组程序代码或程序代码对应的指令,以便于处理器2001调用存储器2002中存储的程序代码或指令以实现车辆的相应功能。该功能包括但不限于图15所示的车辆功能框架示意图中的部分功能或全部功能。本申请中,存储器2002中可存储一组用于车辆控制的程序代码,处理器2001调用该程序代码可控制车辆安全行驶,关于如何实现车辆安全行驶具体在本申请下文详述。
可选地,存储器2002除了存储程序代码或指令之外,还可存储诸如道路地图、驾驶线路、传感器数据等信息。计算机系统20可以结合车辆功能框架示意图中的其他元件,例如传感器 系统中的传感器、GPS等,实现车辆的相关功能。例如,计算机系统20可基于传感器系统12的数据输入控制车辆10的行驶方向或行驶速度等,本申请不做限定。
感知目标的装置22可包括若干元件,例如图15示出的天线单元2201,雷达单元2202和检测单元2203。感知目标的装置22还可以包括控制单元,该控制单元可以用于根据用户指令控制雷达单元2202通过天线单元2201收发信号等。需要说明的是,感知目标的装置中的部分元件的功能也可以由车辆的其它子系统来实现,例如,控制单元也可以为控制系统中的元件;又例如,感知目标的装置22可以不包括雷达单元2202,而由雷达单元1204来实现。
其中,本申请图15示出包括四个子系统,传感器系统12、控制系统14、计算机系统20和感知目标的装置22仅为示例,并不构成限定。在实际应用中,车辆10可根据不同功能对车辆中的若干元件进行组合,从而得到相应不同功能的子系统。例如,车辆10中也可包括电子稳定性系统(electronic stability program,ESP)和电动助力转向系统(electric power steering,EPS)等,图未示出。其中,ESP系统可由传感器系统12中的部分传感器及控制系统14中的部分元件组成,具体地该ESP系统可包括轮速传感器1207、转向传感器1208、横向加速度传感器及控制系统14中涉及的控制单元等等。EPS系统可由传感器系统12中的部分传感器、控制系统14中的部分元件及电源18等元件组成,具体地该EPS系统中可包括转向传感器1208、控制系统14中涉及的发电机及减速器、蓄电池电源等等。又例如,感知目标的装置也可以包括外围设备中的用户接口1603和触摸屏1602等,以实现接收用户指令的功能。
需要说明的是,上述图15仅为车辆10的一种可能的功能框架示意图。在实际应用中,车辆10可包括更多或更少的系统或元件,本申请不做限定。
上述车辆10可以为轿车、卡车、摩托车、公共汽车、船、飞机、直升飞机、割草机、娱乐车、游乐场车辆、施工设备、电车、高尔夫球车、火车、和手推车等,本申请实施例不做特别的限定。
在上述实施例中,全部或部分功能可以通过软件、硬件、或者软件加硬件的组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本申请实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如,固态硬盘(solid state disk,SSD))等。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,该流程可以由计算机程序来指令相关的硬件完成,该程序可存储于计算机可读取存储介质中,该程序在执行时,可包括如上述各方法实施例的流程。而前述的存储介质包括:ROM或随机存储记忆体RAM、磁碟或者光盘等各种可存储程序代码的介质。

Claims (13)

  1. 一种感知目标的方法,其特征在于,所述方法包括:
    在第一时刻至第二时刻内接收回波信号;
    基于接收到的回波信号,计算每个目标时刻对应的特征值,得到特征值序列;所述特征值序列由按时间先后排序的多个目标时刻对应的特征值组成,所述目标时刻为所述第一时刻至所述第二时刻之间的时刻;
    计算所述目标时刻对应的特征值的过程包括:基于所述目标时刻之前的子时间段接收的多个目标回波信号,得到所述多个目标回波信号分别对应的目标频谱数据,所述子时间段包括所述目标时刻,所述目标频谱数据为距离向取值小于预设值的频点对应的频谱数据;基于所述多个目标回波信号分别对应的目标频谱数据,得到所述目标时刻对应的特征值;
    基于所述特征值序列感知目标。
  2. 根据权利要求1所述的方法,其特征在于,所述距离向取值小于预设值的频点为距离向取值为零的频点。
  3. 根据权利要求1或2所述的方法,其特征在于,所述目标时刻对应的特征值为所述多个目标回波信号分别对应的目标频谱数据的方差或标准差。
  4. 根据权利要求1-3任一项所述的方法,其特征在于,所述基于所述特征值序列感知目标,包括:
    基于所述特征值序列,从所述第一时刻至所述第二时刻中确定第一时间段,所述第一时间段内的所述目标时刻对应的特征值由最小门限上升至最大门限;
    在所述第一时间段内所述目标的移动速度大于预设阈值时,确定感知到所述目标,所述第一时间段内所述目标的移动速度为所述最大门限和所述最小门限的差与所述第一时间段的时长的比值。
  5. 根据权利要求1-3任一项所述的方法,其特征在于,所述基于所述特征值序列感知目标,包括:
    基于所述特征值序列,从所述第一时刻至所述第二时刻中确定第二时间段,所述第二时间段内的所述目标时刻对应的特征值均不小于所述最大门限;
    在所述第二时间段内的时长大于预设时长时,确定感知到所述目标。
  6. 根据权利要求1-3任一项所述的方法,其特征在于,所述基于所述特征值序列感知目标,包括:
    将所述特征值序列输入训练后的神经网络,得到所述第二时刻的感知结果;所述训练后的神经网络是基于样本时间段对应的特征值序列为输入,所述样本时间段的感知结果为标签训练得到的。
  7. 根据权利要求1-6任一项所述的方法,其特征在于,所述基于所述目标时刻之前的子时间段接收的多个目标回波信号,得到所述多个目标回波信号分别对应的目标频谱数据,包括:
    基于所述多个目标回波信号,得到每个所述目标回波信号对应的频谱数据组,所述频谱数据组包括多个频谱数据,所述多个频谱数据对应多个频点;
    分别对每个所述频谱数据组进行归一化处理,得到多个处理后的频谱数据组;
    分别从每个所述处理后的频谱数据组中获取距离向取值小于所述预设值的频点对应的频谱数据,得到每个所述目标回波信号对应的目标频谱数据。
  8. 根据权利要求7所述的方法,其特征在于,所述基于所述多个目标回波信号,得到每个所述目标回波信号对应的频谱数据组,包括:
    计算所述目标回波信号对应的中频信号;
    对所述中频信号进行N点傅里叶变换,得到所述目标回波信号对应的频谱数据组,所述频谱数据组包括N个数据,所述N为正整数。
  9. 一种感知目标的装置,其特征在于,所述装置包括天线单元和感知单元:
    所述天线单元,用于在第一时刻至第二时刻内接收回波信号;
    所述感知单元,用于基于接收到的回波信号,计算每个目标时刻对应的特征值,得到特征值序列;所述特征值序列由按时间先后排序的多个目标时刻对应的特征值组成,所述目标时刻为所述第一时刻至所述第二时刻之间的时刻;基于所述特征值序列感知目标;
    其中,计算所述目标时刻对应的特征值的过程包括:基于所述目标时刻之前的子时间段接收的多个目标回波信号,得到所述多个目标回波信号分别对应的目标频谱数据,所述子时间段包括所述目标时刻,所述目标频谱数据为距离向取值小于预设值的频点对应的频谱数据;基于所述多个目标回波信号分别对应的目标频谱数据,得到所述目标时刻对应的特征值。
  10. 一种车辆,其特征在于,包括如权利要求9所述的感知目标的装置。
  11. 一种电子设备,其特征在于,所述电子设备包括一个或多个处理器和一个或多个存储器;其中,所述一个或多个存储器与所述一个或多个处理器耦合,所述一个或多个存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令,当所述一个或多个处理器执行所述计算机指令时,使得所述电子设备执行如权利要求1-8中任一项所述的方法。
  12. 一种包含指令的计算机程序产品,其特征在于,当所述计算机程序产品在电子设备上运行时,使得所述电子设备执行如权利要求1-8中任一项所述的方法。
  13. 一种计算机可读存储介质,包括指令,其特征在于,当所述指令在电子设备上运行时,使得所述电子设备执行如权利要求1-8中任一项所述的方法。
PCT/CN2023/080191 2022-03-11 2023-03-08 一种感知目标的方法和装置 WO2023169448A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210242015.3 2022-03-11
CN202210242015.3A CN116774203A (zh) 2022-03-11 2022-03-11 一种感知目标的方法和装置

Publications (1)

Publication Number Publication Date
WO2023169448A1 true WO2023169448A1 (zh) 2023-09-14

Family

ID=87936033

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/080191 WO2023169448A1 (zh) 2022-03-11 2023-03-08 一种感知目标的方法和装置

Country Status (2)

Country Link
CN (1) CN116774203A (zh)
WO (1) WO2023169448A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117110991B (zh) * 2023-10-25 2024-01-05 山西阳光三极科技股份有限公司 一种露天矿边坡安全监测方法、装置、电子设备以及介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109597058A (zh) * 2018-12-21 2019-04-09 上海科勒电子科技有限公司 感应水龙头的微波测量方法、电子设备及存储介质
CN111580086A (zh) * 2019-02-19 2020-08-25 富士通株式会社 生命体检测方法、检测装置和电子设备
CN112462357A (zh) * 2020-12-17 2021-03-09 广东蓝水花智能电子有限公司 一种基于fmcw原理的自动门控制方法及自动门控制系统
US20210121075A1 (en) * 2019-10-23 2021-04-29 National Sun Yat-Sen University Non-contact method of physiological characteristic detection
CN112986973A (zh) * 2019-12-18 2021-06-18 华为技术有限公司 距离测量方法和距离测量装置
CN113820704A (zh) * 2020-06-19 2021-12-21 富士通株式会社 检测运动目标的方法、装置和电子设备

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109597058A (zh) * 2018-12-21 2019-04-09 上海科勒电子科技有限公司 感应水龙头的微波测量方法、电子设备及存储介质
CN111580086A (zh) * 2019-02-19 2020-08-25 富士通株式会社 生命体检测方法、检测装置和电子设备
US20210121075A1 (en) * 2019-10-23 2021-04-29 National Sun Yat-Sen University Non-contact method of physiological characteristic detection
CN112986973A (zh) * 2019-12-18 2021-06-18 华为技术有限公司 距离测量方法和距离测量装置
CN113820704A (zh) * 2020-06-19 2021-12-21 富士通株式会社 检测运动目标的方法、装置和电子设备
CN112462357A (zh) * 2020-12-17 2021-03-09 广东蓝水花智能电子有限公司 一种基于fmcw原理的自动门控制方法及自动门控制系统

Also Published As

Publication number Publication date
CN116774203A (zh) 2023-09-19

Similar Documents

Publication Publication Date Title
CN108237918B (zh) 车辆及其控制方法
WO2020244622A1 (zh) 一种通知的提示方法、终端及系统
CN111983559A (zh) 室内定位导航方法及装置
WO2022242699A1 (zh) 一种信息推荐方法以及相关设备
WO2021088393A1 (zh) 确定位姿的方法、装置和系统
WO2022152024A1 (zh) 一种微件的显示方法与电子设备
CN113792589B (zh) 一种高架识别方法及装置
WO2022179604A1 (zh) 一种分割图置信度确定方法及装置
JP2023504945A (ja) 制御車両
WO2023169448A1 (zh) 一种感知目标的方法和装置
US20230005277A1 (en) Pose determining method and related device
WO2022022335A1 (zh) 天气信息的展示方法、装置和电子设备
WO2021238740A1 (zh) 一种截屏方法及电子设备
WO2023246563A1 (zh) 一种声音处理方法及电子设备
WO2023207667A1 (zh) 一种显示方法、汽车和电子设备
CN113742460A (zh) 生成虚拟角色的方法及装置
WO2023010923A1 (zh) 一种高架识别方法及装置
CN114789734A (zh) 感知信息补偿方法、装置、车辆、存储介质及程序
CN114422936A (zh) 隧道交通管理方法、装置及存储介质
WO2023241482A1 (zh) 一种人机对话方法、设备及系统
WO2023185687A1 (zh) 车辆位置的获取方法及电子设备
CN116844375B (zh) 停车信息的显示方法和电子设备
WO2024041180A1 (zh) 路径规划方法及装置
WO2023109636A1 (zh) 应用卡片显示方法、装置、终端设备及可读存储介质
CN113239901B (zh) 场景识别方法、装置、设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23766026

Country of ref document: EP

Kind code of ref document: A1