CN117376887A - Method, device, chip and equipment for sensing equipment - Google Patents

Method, device, chip and equipment for sensing equipment

Info

Publication number
CN117376887A
Authority
CN
China
Prior art keywords
processor
electronic device
condition
equipment
sensing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210775399.5A
Other languages
Chinese (zh)
Inventor
郑博文
熊张亮
林学森
侯伟波
王帅
彭红星
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202210775399.5A
Priority to PCT/CN2023/104124 (published as WO2024002282A1)
Publication of CN117376887A


Classifications

    • H ELECTRICITY
        • H04 ELECTRIC COMMUNICATION TECHNIQUE
            • H04M TELEPHONIC COMMUNICATION
                • H04M 1/00 Substation equipment, e.g. for use by subscribers
                    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
                        • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
                            • H04M 1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
                                • H04M 1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to context-related or environment-related conditions
            • H04W WIRELESS COMMUNICATION NETWORKS
                • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
                    • H04W 4/80 Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
                • H04W 8/00 Network data management
                    • H04W 8/005 Discovery of network devices, e.g. terminals
                • H04W 52/00 Power management, e.g. TPC [Transmission Power Control], power saving or power classes
                    • H04W 52/02 Power saving arrangements
                        • H04W 52/0209 Power saving arrangements in terminal devices
                            • H04W 52/0225 Power saving arrangements in terminal devices using monitoring of external events, e.g. the presence of a signal
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
        • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
            • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
                • Y02D 30/00 Reducing energy consumption in communication networks
                    • Y02D 30/70 Reducing energy consumption in communication networks in wireless communication networks

Abstract

The embodiment of the invention provides a method, a device, a chip and equipment for sensing equipment, wherein the method comprises the following steps: determining whether other devices exist around the electronic device; under the condition that other devices exist around the electronic device, obtaining positioning results corresponding to the other devices; determining whether the positioning result meets a preset first condition; and under the condition that the positioning result meets the first condition, waking up a processor of the electronic device, the processor executing a first operation corresponding to the other devices after being woken up. The embodiments of the invention enable a device to be sensed in real time with low power consumption.

Description

Method, device, chip and equipment for sensing equipment
Technical Field
The present invention relates to the field of electronic devices, and in particular, to a method, an apparatus, a chip, and a device for sensing a device.
Background
In one implementation, a device may be sensed using an ultrasonic medium; specifically, the ultrasonic medium is used to measure the direction and distance of the device. Because the sensing processing logic runs in the processor, the processor's power consumption is high, and sensing is not possible while the processor is asleep.
Disclosure of Invention
The embodiments of the invention provide a method, a device, a chip, and equipment for sensing a device, which enable a device to be sensed in real time with low power consumption.
In a first aspect, an embodiment of the present invention provides a method for sensing a device, including: determining whether other devices exist around the electronic device; under the condition that other devices exist around the electronic device, obtaining positioning results corresponding to the other devices; determining whether the positioning result meets a preset first condition; and under the condition that the positioning result meets the first condition, waking up a processor of the electronic device, the processor executing a first operation corresponding to the other devices after being woken up.
Optionally, the obtaining a positioning result corresponding to the other device includes: acquiring first data acquired by an ultrasonic medium in the electronic equipment; and obtaining positioning results corresponding to the other devices according to the first data.
Optionally, the positioning result includes: direction data of the other device relative to the electronic device; the first condition includes: a preset direction range; the determining whether the positioning result meets a preset first condition includes: determining whether the direction data is within the direction range; wherein the case that the positioning result satisfies the first condition includes: the direction data being within the direction range.
Optionally, the positioning result includes: the target distance between the other device and the electronic device; the first condition includes: a preset distance threshold; the determining whether the positioning result meets a preset first condition includes: determining whether the target distance is not greater than the distance threshold; wherein the case that the positioning result satisfies the first condition includes: the target distance being not greater than the distance threshold.
Optionally, after determining that other devices exist around the electronic device and before waking up the processor of the electronic device, the method further includes: determining whether the electronic device is moving; and executing the step of waking up the processor of the electronic device under the condition that the electronic device is determined to be moving.
Optionally, the method further comprises: acquiring second data acquired by an inertial measurement unit in the electronic equipment; and executing the step of determining whether the electronic equipment is moving according to the second data.
Optionally, the inertial measurement unit includes at least one of a gyroscope sensor, an acceleration sensor, a compass sensor.
Optionally, after determining that other devices exist around the electronic device and before waking up the processor of the electronic device, the method further includes: determining whether the other device and the electronic device are devices under the same user account; and the waking up a processor of the electronic device if the positioning result meets the first condition includes: waking up the processor of the electronic device under the condition that the positioning result meets the first condition and the other device and the electronic device are devices under the same user account.
Optionally, the first operation includes: controlling the electronic device to pop up a card, where the content displayed on the card includes information related to the use state of the other device.
Optionally, the method further comprises: acquiring third data acquired by a wireless medium in the electronic equipment; and executing the step of determining whether other devices exist around the electronic device according to the third data.
Optionally, the wireless medium includes at least one of a bluetooth chip and a wireless network communication technology chip.
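For illustration only, the first-aspect method and its optional features can be summarized in the following minimal sketch; the function names, data sources, and threshold values are assumptions made for this example and do not limit the embodiments.

```python
# Illustrative sketch of the first-aspect method; names, thresholds, and the
# direction/distance/account checks below are assumptions for this example only.
from dataclasses import dataclass

@dataclass
class Positioning:
    direction_deg: float   # direction data of the other device relative to this device
    distance_m: float      # target distance between the other device and this device

DIRECTION_RANGE_DEG = (-30.0, 30.0)   # preset direction range (first condition), assumed
DISTANCE_THRESHOLD_M = 1.0            # preset distance threshold (first condition), assumed

def meets_first_condition(p: Positioning) -> bool:
    in_range = DIRECTION_RANGE_DEG[0] <= p.direction_deg <= DIRECTION_RANGE_DEG[1]
    close_enough = p.distance_m <= DISTANCE_THRESHOLD_M
    return in_range and close_enough

def sense_device(other_device_found: bool, positioning: Positioning,
                 same_user_account: bool, device_is_moving: bool,
                 wake_processor, perform_first_operation) -> None:
    """Determine nearby device -> obtain positioning -> check first condition -> wake AP."""
    if not other_device_found:
        return
    # Optional screening steps from the optional features above.
    if not device_is_moving or not same_user_account:
        return
    if meets_first_condition(positioning):
        wake_processor()            # wake the processor of the electronic device
        perform_first_operation()   # e.g. pop up a card related to the other device
```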
In a second aspect, an embodiment of the present invention provides an apparatus for sensing a device, including: the first determining module is used for determining whether other devices exist around the electronic device; the acquisition module is used for acquiring positioning results corresponding to other equipment when the other equipment exists around the electronic equipment; the second determining module is used for determining whether the positioning result meets a preset first condition; and the awakening module is used for awakening a processor of the electronic equipment under the condition that the positioning result meets the first condition, and the processor executes a first operation corresponding to the other equipment after being awakened.
In a third aspect, the present application provides an electronic chip, comprising: a processor for executing computer program instructions stored on a memory, wherein the computer program instructions, when executed by the processor, trigger the electronic chip to perform the method of any of the first aspects of the present application.
In a fourth aspect, the present application provides an electronic device comprising a memory for storing computer program instructions, a processor for executing the computer program instructions and communication means, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method of any of the first aspects of the present application.
In a fifth aspect, the present application provides a computer readable storage medium having a computer program stored therein, which when run on a computer causes the computer to perform the method of any one of the first aspects of the present application.
In a sixth aspect, the present application provides a computer program product comprising a computer program which, when run on a computer, causes the computer to perform the method of any one of the first aspects of the present application.
The embodiments of the application enable a device to be sensed in real time with low power consumption.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an implementation of a sensing device according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a sensing device according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of another sensing device according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of yet another sensing device according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of another sensing implementation according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of yet another sensing implementation provided by an embodiment of the present invention;
FIG. 8 is a timing diagram of a sensing device according to an embodiment of the present invention;
FIG. 9 is a timing diagram of another sensing device according to an embodiment of the present invention;
FIG. 10 is a timing diagram of yet another sensing device according to an embodiment of the present invention;
FIG. 11 is a flowchart of a method for implementing sensing according to an embodiment of the present invention;
fig. 12 is a flowchart of yet another sensing implementation method according to an embodiment of the present invention.
Detailed Description
For a better understanding of the technical solution of the present invention, the following detailed description of the embodiments of the present invention refers to the accompanying drawings.
It should be understood that the described embodiments are merely some, but not all, embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that in embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. The term "and/or" as used herein is merely one association relationship describing the associated object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone. Wherein A, B may be singular or plural. In addition, the character "/" herein generally indicates that the front and rear associated objects are an "or" relationship. "at least one of the following" and the like means any combination of these items, including any combination of single or plural items. For example, at least one of a, b and c may represent: a, b, c, a and b, a and c, b and c or a and b and c, wherein a, b and c can be single or multiple.
It should be understood that although the terms first, second, etc. may be used in embodiments of the present invention to describe the set threshold values, these set threshold values should not be limited to these terms. These terms are only used to distinguish the set thresholds from each other. For example, a first set threshold may also be referred to as a second set threshold, and similarly, a second set threshold may also be referred to as a first set threshold, without departing from the scope of embodiments of the present invention.
The terminology used in the description section of the present application is for the purpose of describing particular embodiments of the present application only and is not intended to be limiting of the present application.
In one embodiment, the method provided in the embodiments of the present application may be applied to the electronic device 100 shown in fig. 1. Fig. 1 shows a schematic configuration of an electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini LED, a Micro LED, a Micro OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: dynamic picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A is of various types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a capacitive pressure sensor comprising at least two parallel plates with conductive material. The capacitance between the electrodes changes when a force is applied to the pressure sensor 180A. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the touch operation intensity according to the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location, but at different touch operation strengths, may correspond to different operation instructions. For example: and executing an instruction for checking the short message when the touch operation with the touch operation intensity smaller than the first pressure threshold acts on the short message application icon. And executing an instruction for newly creating the short message when the touch operation with the touch operation intensity being greater than or equal to the first pressure threshold acts on the short message application icon.
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation. The magnetic sensor 180D includes a hall sensor. The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus. The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The ambient light sensor 180L is used to sense ambient light level. The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc. The temperature sensor 180J is for detecting temperature.
The touch sensor 180K, also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195 to enable contact and separation with the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, micro SIM cards, and the like. The same SIM card interface 195 may be used to insert multiple cards simultaneously. The types of the plurality of cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to realize functions such as communication and data communication. In some embodiments, the electronic device 100 employs esims, i.e.: an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100. The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
In order to facilitate understanding of the improvements made by the solutions provided in the embodiments of the present application, the existing related technical solutions will be briefly described.
Related technical solution 1: a device can be sensed using an ultrasonic medium; specifically, the ultrasonic medium can be used to measure and locate the direction and distance of the device. The sensing processing logic of the device runs in a processor.
The processor may be an application processor (Application Processor, AP).
In one embodiment, fig. 2 shows a schematic diagram of one possible implementation of the present solution. In this implementation, the device 200A includes a processor 210A (such as an AP), a High-Fidelity (HIFI) chip 220A, a speaker 230A/microphone 240A (SPK/MIC), and a perception processing logic 211A disposed in the processor 210A, and an audio processing algorithm chain 221A disposed in the High-Fidelity chip 220A.
The device 200B includes a processor 210B, a hi-fi chip 220B, a speaker 230B/microphone 240B, a perception processing logic 211B disposed in the processor 210B, and an audio processing algorithm chain 221B disposed in the hi-fi chip 220B.
In this manner, device 200A may sense device 200B, and device 200B may sense device 200A as well, i.e., the two devices may sense each other.
The loudspeaker is abbreviated SPK; its full name is speaker.
The microphone is abbreviated MIC; its full name is microphone.
Referring to fig. 2, taking the device 200A to sense the device 200B as an example, when the device 200B is sensed by the speaker 230A/microphone 240A in the device 200A, the sensed corresponding audio signal is sent to the hi-fi chip 220A in the device 200A; the hi-fi chip 220A in the apparatus 200A processes the audio signal (for example, performs a data conversion process to convert the audio signal into data processable by a processor, etc.) based on the built-in audio processing algorithm chain 221A, and transmits the processing result to the processor 210A in the apparatus 200A; processor 210A in device 200A processes the processing results from hi-fi chip 220A based on built-in sensing processing logic 211A to sense the orientation and position of device 200B.
Similarly, the device 200B may also sense the direction and the position of the device 200A, and the specific implementation process refers to the above, which is not described herein.
Referring to Table 1 below, the power consumption of a processor making a sensing measurement is shown.
TABLE 1
Test item                                   Processor
Increased current                           25.18 mA
Program run time                            4.3 s
Dual-microphone current                     10 mA
Dual-microphone signal acquisition time     0.6 s
Speaker signaling current                   50 mA
Speaker signaling time                      0.6 s
Total power consumption                     0.04 mAh = (25.18 × 4.3 + 10 × 0.6 + 50 × 0.6) / 3600
As shown in Table 1, one sensing measurement by the processor consumes about 0.04 mAh; at 1000 sensing measurements per day, this adds about 40 mAh of power consumption per day, which is high.
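The total in Table 1 can be checked with a short calculation using the figures from the table (currents in mA, durations in seconds):

```python
# Reproduce the per-measurement power consumption from Table 1.
processor_charge = 25.18 * 4.3   # processor current x run time
mic_charge = 10 * 0.6            # dual-microphone current x acquisition time
speaker_charge = 50 * 0.6        # speaker signaling current x signaling time
total_mah = (processor_charge + mic_charge + speaker_charge) / 3600  # mA*s -> mAh
print(round(total_mah, 2))       # ~0.04 mAh per sensing measurement
print(round(total_mah * 1000))   # ~40 mAh per day at 1000 measurements
```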
In this technical solution, because the sensing processing logic of the device runs in the processor, the processor's power consumption is high. Due to this power-consumption constraint, the processor must be awake to provide the ultrasonic sensing capability, so sensing is not available while the processor is asleep, and the direction and distance of nearby devices cannot be sensed automatically in that state.
Related technical solution 2: an ultra wide band (UWB) chip can be added to the electronic device to implement the sensing function. UWB is a wireless carrier communication technology.
However, the added hardware correspondingly increases the cost of the electronic device (for example, 5 cents higher), and this solution likewise has no low-power design, so sensing while the processor is asleep cannot be achieved.
In view of this, the embodiments of the present application provide a solution to the problems that power consumption is high and that sensing is not available while the processor is asleep, so that devices can be sensed in real time with low power consumption.
The technical solution provided by the embodiments of the present application is first introduced below with reference to several application scenario examples.
Application scenario example 1:
In the following application scenario example, the electronic device is the user's smartphone and the sensed device is a smart home appliance in the user's home; after the phone senses the smart home appliance, it pops up a card for the user to use.
Application scenario example 1.1:
in one embodiment, referring to fig. 3, the electronic device is a mobile phone 301, and the smart home appliance is a speaker 302.
When the processor in the mobile phone is in a dormant state, the phone screen is off (black). However, the sensing component built into the low-power always-on chip in the phone can still sense, in real time and with low power consumption, whether a preset device is present around the phone.
If the user holds the mobile phone in the dormant state and moves the mobile phone so that the mobile phone is close to the sound box, the perception component in the mobile phone can perceive the sound box.
In implementation 1, the sensing component may sense the direction and the position of the sound box in real time, and if the sensed direction and distance meet the preset conditions, the processor may be awakened, and after the processor is awakened, the processor may perform further processing, for example, may pop up a corresponding card for the user to use.
In implementation 2, after the primary sensing component senses the sound box in real time, it can trigger the ranging module (secondary sensing component) in the mobile phone to perform accurate sensing, so as to sense the direction and position of the sound box. If the direction and distance sensed by the secondary sensing component meet the preset conditions, the processor can be woken up, and further processing can be performed after the processor is woken up, for example, a corresponding card can be popped up for the user to use.
In implementation 3, the primary sensing component may wake up the processor after sensing the sound box in real time, and the processor may perform further processing after being woken up, for example, controlling the ranging module (secondary sensing component) in the mobile phone to perform accurate sensing, so as to sense the direction and position of the sound box. If the sensed direction and distance meet the preset conditions, the corresponding card can be popped up for the user to use.
Information related to the current use state of the sound box can be displayed in the card popped up by the mobile phone for the user to use.
For example, assuming that the sound box in the user's living room is playing the audio of a piece of music (or a song), referring to fig. 3, the following information may be displayed in the card popped up by the mobile phone: information "sound box in living room" indicating the audio playback device; image information corresponding to the music (for example, the background picture shown while the music is playing) and the name of the music; a control indicating that the stream is to be transferred to the "native" device (i.e. the phone); a control indicating whether to open the mobile phone remote controller; and controls for controlling audio playback (e.g., a control for pausing the music).
Based on the information displayed in the card, the user can check which device the audio information of the music is played on, what the name of the played music is, the image information corresponding to the played music and other information, and can execute control operations of controlling the audio stream to be played by the local machine (namely the mobile phone), controlling the function of the remote controller of the mobile phone to be started or closed, controlling the audio to be played in a pause mode and the like.
In addition, other information can be displayed in the card, such as the name of the sound box, a control for controlling the playing effect (such as silence, increasing/decreasing volume) of the sound box, and a control for controlling the current use state (such as working, standby, shutdown, etc.) of the sound box.
Referring to fig. 3, a display area of a card on a mobile phone screen may occupy a part of the mobile phone screen area, for example, may occupy a middle-lower area in the mobile phone screen area, and an upper area in the mobile phone screen area may display conventional information, for example, information such as a place, a time, a date, weather, a mobile phone signal, a mobile phone residual power, etc. may be displayed, so as to satisfy a user's requirement for learning the conventional information.
In this application scenario example, from the perspective of user experience, when the user's dormant mobile phone (with its screen off) approaches the sound box, the phone screen automatically lights up and the card shown in fig. 3 automatically pops up for the user to use, giving a good user experience.
Application scenario example 1.2:
in another embodiment, referring to fig. 4, the electronic device is a mobile phone 401, the smart home appliance is a large screen device, i.e. a television 402, and the television is in a video playing state.
If the user holds the mobile phone in the dormant state and moves the mobile phone so that the mobile phone is close to the television, the sensing component in the mobile phone can sense the television.
Referring to implementations 1 to 3 above, and based on the same principle, if the direction and distance of the television sensed by the mobile phone satisfy the preset conditions, the mobile phone can pop up a corresponding card for the user to use.
Information related to the current use state of the television can be displayed in the card popped up by the mobile phone for the user to use.
For example, assuming that the smart screen (or television) in the user's living room is playing a video, referring to fig. 4, the following information may be displayed in the card popped up by the mobile phone: information "intelligent screen in living room" indicating the video playback device; image information of the video (one of its image frames, such as a cover image) and the name of the video; a control indicating that the stream is to be transferred to the "native" device; a control indicating whether to open the mobile phone remote controller; and controls for controlling video playback (e.g., a control for pausing the video).
Based on the information displayed in the card, the user can check what device the video is played on, what the name of the video is played, the image content of the played video and other information, and can execute control operations of controlling the video stream to be played by the local machine (namely the mobile phone), controlling the function of a remote controller of the mobile phone to be started or closed, controlling the video to be played in a pause mode and the like.
In addition, other information can be displayed in the card, such as the name of the intelligent screen, a control for controlling the video playing effect (such as mute, increasing/decreasing volume), and a control for controlling the current use state (such as working, standby, power off, etc.) of the intelligent screen.
As with the card popped up in application scenario example 1.1, referring to fig. 4, when the mobile phone pops up the card in the specific display area, conventional information may also be displayed in the other display areas.
Application scenario example 1.3:
in yet another embodiment, referring to fig. 5, the electronic device is a mobile phone 501, the smart home is a large screen device, i.e. a television 502, and the television is in a video display state.
If the user holds the mobile phone in the dormant state and moves the mobile phone so that the mobile phone is close to the television, the sensing component in the mobile phone can sense the television.
Referring to implementations 1 to 3 above, and based on the same principle, if the direction and distance of the television sensed by the mobile phone satisfy the preset conditions, the mobile phone can pop up a corresponding card for the user to use.
In some embodiments, the card popped up by the mobile phone may display information related to the current use state of the television for the user to use. Referring to fig. 4 and 5, when the current use state of the television differs, the information in the popped-up card may differ accordingly.
As shown in fig. 5, assuming that the television is in a video display state but not playing a video, the following information may be displayed in the card popped up by the mobile phone: information "intelligent screen in living room" indicating the video playback device; a control indicating whether to open the mobile phone remote controller; the video playing records of the television; and image information of several recommended videos (one image frame of each video).
For the video playing records of the television, specifically, the image information (an image frame of the video) and the corresponding playing progress (for example, the played duration as a percentage of the total duration) of each video that has been watched can be displayed. The current display interface may only show the image information and playing progress of some (for example, two) of the watched videos; the image information and playing progress of the other watched videos can be displayed when the user slides the corresponding area of the screen.
For the image information of the recommended videos, specifically, image information of a plurality of videos can be displayed. The current display interface may only show the image information of some (for example, three) of the recommended videos; the image information of the other recommended videos can be displayed when the user slides the corresponding area of the screen.
Based on the above information displayed on the card, the user can view on which device the corresponding video is played, view the video that has been viewed and the corresponding play amount, view information such as the video that is recommended to be viewed, and can perform a control operation that controls the video to start playing. For example, the user may click on an image of any of the viewed videos to cause the television to continue playing the video, and may click on an image of any of the recommended videos to cause the television to begin playing the video.
As with the card popped up in application scenario example 1.2, referring to fig. 5, when the mobile phone pops up the card in the specific display area, conventional information may also be displayed in the other display areas.
Application scenario example 2:
Unlike the card pop-up in application scenario example 1, in the present application scenario example the electronic device can, after sensing any other device that has been set in advance, control the state of that device. For example, if the other device is in a standby state, it is controlled to start operating; if the other device is operating, it is controlled to go to standby to suspend operation, or to shut down to stop operation.
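A minimal sketch of the first operation in this scenario is given below; the state names are assumptions used only for illustration.

```python
# Toggle the sensed device's state, as described in application scenario example 2.
# The state names "standby", "working", and "off" are illustrative assumptions.
def control_other_device(current_state: str) -> str:
    if current_state == "standby":
        return "working"   # standby -> start operating
    if current_state == "working":
        return "standby"   # working -> suspend operation (could also return "off")
    return current_state   # leave other states unchanged
```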
The implementation manner of the sensing device provided in the embodiment of the present application is specifically described below.
In one embodiment, referring to fig. 6, for the purpose of sensing devices, an electronic device 600 may include at least a processor (e.g., AP) 610, a high-fidelity chip (HIFI chip) 620, a sensor processor (e.g., sensor hub) 630, a speaker 640/microphone 650, a sensor 660 (e.g., inertial measurement unit), and a wireless network communication technology chip (WiFi chip) 670. The association between the components is shown by arrows in fig. 6.
The hi-fi chip 620 may be a low-power chip that specifically processes audio; the sensor processor 630 may be a low power chip that processes sensor data specifically.
Wherein both the wireless network communication technology chip 670 and the sensor 660 may be built into the sensor processor 630.
The electronic device 600 may implement the perception of other devices through any of the above-described implementations 1-3 based on the components described above.
Corresponding to implementation 1 above, the speaker 640/microphone 650 may sense the direction and position of other devices in real time. The hi-fi chip 620 determines whether the sensed direction and distance satisfy the predetermined condition and, if so, wakes up the processor 610. After being woken up, the processor 610 may perform further processing, such as popping up a corresponding card for the user to use.
Corresponding to implementation 2 above, the sensor 660 and the wireless network communication technology chip 670 may sense other devices in real time. When the sensor processor 630 determines, according to the sensing data collected by the sensor 660 and the wireless network communication technology chip 670, that another device is sensed, it may trigger the hi-fi chip 620 to perform accurate sensing. The hi-fi chip 620 senses the direction and position of the other device based on the speaker 640/microphone 650, determines whether the sensed direction and distance satisfy the predetermined condition and, if so, wakes up the processor 610. After being woken up, the processor 610 may perform further processing, such as popping up a corresponding card for the user to use.
Corresponding to implementation 3 above, the sensor 660 and the wireless network communication technology chip 670 may sense other devices in real time. When the sensor processor 630 determines, according to the sensing data collected by the sensor 660 and the wireless network communication technology chip 670, that another device is sensed, it may wake up the processor 610. After being woken up, the processor 610 may perform further processing, such as controlling the speaker 640/microphone 650 of the electronic device 600 to accurately sense the direction and position of the other device. The processor 610 determines whether the sensed direction and distance satisfy the predetermined condition and, if so, pops up the corresponding card for the user to use.
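To illustrate implementation 2, the two-stage screening described above can be sketched as follows; the component division follows fig. 6, while the thresholds and function names are assumptions and not the actual firmware of any chip.

```python
# Two-stage (hierarchical) sensing sketch for implementation 2; all thresholds
# and names are illustrative assumptions.
DIRECTION_RANGE_DEG = (-45.0, 45.0)   # assumed preset direction range
MAX_DISTANCE_M = 1.0                  # assumed preset distance threshold

def sensor_hub_screen(wireless_scan_found_device: bool, device_is_moving: bool) -> bool:
    """Stage 1: coarse, low-power screening on the sensor processor 630."""
    return wireless_scan_found_device and device_is_moving

def hifi_screen(direction_deg: float, distance_m: float) -> bool:
    """Stage 2: check the precise ultrasonic result on the hi-fi chip 620."""
    lo, hi = DIRECTION_RANGE_DEG
    return lo <= direction_deg <= hi and distance_m <= MAX_DISTANCE_M

def two_stage_sensing(scan_hit: bool, moving: bool, measure_ultrasonic, wake_processor) -> None:
    # The processor 610 is only woken when both stages pass, so it can stay
    # asleep during routine scanning.
    if not sensor_hub_screen(scan_hit, moving):
        return
    direction_deg, distance_m = measure_ultrasonic()   # speaker 640/microphone 650
    if hifi_screen(direction_deg, distance_m):
        wake_processor()   # the processor then pops up the corresponding card
```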
Based on the above, the implementation of the sensing device will be further described by taking the above implementation 2 as an example.
In another embodiment, referring to fig. 7, taking an example of sensing the electronic device 700b by the electronic device 700a, the electronic device 700a may include at least a processor 710a, a hi-fi chip 720a, a sensor processor 730a, an acceleration sensor 740 a/compass sensor 750a, a bluetooth chip 760a, and a speaker 770 a/microphone 780a for sensing the device. The association between the components is shown by arrows in fig. 7.
In this embodiment, when the electronic device senses the existence of the surrounding devices according to the wireless medium of the bluetooth chip 760a, it may further determine whether the electronic device is moving, for example, the movement of the electronic device may be determined according to the acceleration sensor 740a or the compass sensor 750 a. For example, if the direction sensed by the compass sensor 750a is changed, it may be confirmed that the electronic device is moving, whereas if the direction sensed by the compass sensor 750a is not changed, it may be confirmed that the electronic device is stationary.
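A minimal sketch of this movement check, assuming the compass readings are headings in degrees and using an assumed change threshold:

```python
# Movement check from compass headings: if the sensed direction changes, the
# device is considered to be moving. The 5-degree threshold is an assumption.
def heading_change_deg(prev_deg: float, curr_deg: float) -> float:
    diff = abs(curr_deg - prev_deg) % 360.0
    return min(diff, 360.0 - diff)   # shortest angular difference

def is_moving(prev_heading_deg: float, curr_heading_deg: float,
              threshold_deg: float = 5.0) -> bool:
    return heading_change_deg(prev_heading_deg, curr_heading_deg) > threshold_deg
```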
In other embodiments, the electronic device may sense the surrounding device according to other communication methods, for example, sensing the surrounding device according to a wireless network communication technology chip shown in fig. 6, which is not limited in this application.
Wherein the processor 710a includes an application processing module 711a; the hi-fi chip 720a includes a perception processing module 721a and an audio processing algorithm chain 722a; the sensor processor 730a includes a wake processing screening module 731a, a wake sensing screening module 732a.
In some embodiments, referring to fig. 7, the electronic device 700a and the electronic device 700b may have the same structural composition, so that the electronic device 700a may sense the electronic device 700b, and the electronic device 700b may sense the electronic device 700a, that is, the two devices may sense each other.
Thus, referring to fig. 7, the electronic device 700b may include at least a processor 710b, a hi-fi chip 720b, a sensor processor 730b, an acceleration sensor 740 b/compass sensor 750b, a bluetooth chip 760b, and a speaker 770 b/microphone 780b. The association between the components is shown by arrows in fig. 7.
Wherein the processor 710b includes an application processing module 711b; the hi-fi chip 720b includes a perception processing module 721b, an audio processing algorithm chain 722b; sensor processor 730b includes wake processing screening module 731b, wake sensing screening module 732b.
In other embodiments, the electronic device 700a and the electronic device 700b may also have different structural compositions, which is not limited in this application.
Based on the device configuration shown in fig. 7, taking the electronic device 700a to sense the electronic device 700b as an example, the implementation process of the sensing device may be as follows.
The acceleration sensor 740a/compass sensor 750a and the bluetooth chip 760a may sense other devices in real time, and the sensed data is sent to the sensor processor 730a. As described above, for example, the bluetooth chip 760a may sense whether a surrounding device is present, and further, whether the electronic device is moving may be determined based on the acceleration sensor 740a or the compass sensor 750a.
The wake-up sensing screening module 732a in the sensor processor 730a determines whether other devices are sensed according to the sensed data, and if so, may trigger the hi-fi chip 720a to perform accurate sensing. If not, the hi-fi chip 720a is not triggered to perform accurate sensing.
The hi-fi chip 720a may trigger the speaker 770a/microphone 780a to sense the direction and position of the other device, and the sensed audio signal is sent to the hi-fi chip 720a. The audio signal is processed by the audio processing algorithm chain 722a in the hi-fi chip 720a, and the processing result is sent to the perception processing module 721a. The perception processing module 721a determines the sensed direction and distance according to the processing result, and transmits the determined direction and distance to the sensor processor 730a.
The wake-up processing screening module 731a in the sensor processor 730a determines whether the perceived direction and distance satisfy a predetermined condition, and if so, wakes up the processor 710a. If not, the processor 710a is not awakened.
After the processor 710a is awakened, it may perform further processing, such as ejecting a corresponding card for use by the user.
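As a rough illustration of this control flow, the following C sketch mimics the role of the sensor processor in implementation 2: first-level sensing results gate a request to the hi-fi chip, and the returned direction and distance are checked against a predetermined condition before the processor is woken. The structure names, the simulated sensing result and the thresholds are assumptions made for this example; the real inter-chip calls would be platform-specific.

#include <stdbool.h>
#include <stdio.h>

/* Result returned by the accurate (ultrasonic) sensing stage. */
typedef struct {
    float distance_m;    /* distance to the other device                       */
    float direction_deg; /* direction of the other device, 0 = straight ahead  */
} SensingResult;

/* Illustrative wake-up condition -- assumed values, not from this application. */
#define MAX_DISTANCE_M   1.0f
#define MAX_ANGLE_DEG   30.0f

/* Stand-ins for the hi-fi chip trigger and the processor wake channel;
 * on a real platform these would be inter-processor calls. */
static SensingResult trigger_accurate_sensing(void)
{
    SensingResult r = { 0.6f, 12.0f };  /* simulated ultrasonic result */
    printf("hi-fi chip: ultrasonic sensing performed\n");
    return r;
}

static void wake_application_processor(void)
{
    printf("sensor processor: waking processor to eject card\n");
}

/* First-level result: peer present (wireless medium) and device moving (IMU). */
void on_first_level_sensing(bool peer_present, bool device_moving)
{
    if (!peer_present || !device_moving)
        return;                                   /* stay in low-power state      */

    SensingResult r = trigger_accurate_sensing(); /* second-level accurate sensing */

    if (r.distance_m <= MAX_DISTANCE_M &&
        r.direction_deg >= -MAX_ANGLE_DEG && r.direction_deg <= MAX_ANGLE_DEG)
        wake_application_processor();             /* condition met: wake on demand */
}

int main(void)
{
    on_first_level_sensing(true, true);   /* triggers sensing and wakes processor */
    on_first_level_sensing(true, false);  /* device not moving: nothing happens   */
    return 0;
}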
Referring to fig. 7, in this embodiment, the ultrasonic sensing module (speaker 770a/microphone 780a) is sunk onto the low-power, normally-on small core of the hi-fi chip to operate, and the processor is awakened on demand in combination with a processor wake-up condition that third-party applications are allowed to customize (the condition is used for waking up the processor 710a). In this way, low-power sensing capability can be realized, and the orientation of surrounding devices can be sensed in real time with low power consumption while the electronic device is asleep.
In addition, this embodiment can perform sensing based on a hierarchical sensing management policy. Specifically, primary sensing is first performed based on a bluetooth chip, an acceleration sensor and the like, and the high-precision sensing module in the hi-fi chip is triggered to perform further azimuth sensing only when certain conditions are met (for example, other devices exist in the surroundings and the electronic device is moving), so that even lower power consumption can be achieved through two-stage sensing.
This embodiment performs high-precision sensing and positioning based on the ultrasonic sensing module. In other embodiments, other high-precision sensing and positioning approaches may be employed. For example, accurate positioning can be performed in a pulse mode based on an ultra-wideband (UWB) chip in the electronic device.
The implementation manner of the sensing device provided in the embodiment of the present application is specifically described below.
Considering that, when a user views information of other devices or controls other devices through an electronic device (such as a mobile phone), a preset card ejection condition is usually met, whether the electronic device ejects a corresponding card can be determined by judging whether the card ejection condition is met, so that the user can view information of the other devices, or control the other devices, through the content of the card.
Illustratively, the card ejection conditions may include some or all of the following sub-conditions 1-3:
sub-condition 1: other devices are in the vicinity of the user and the electronic device;
sub-condition 2: the user, holding the electronic device, walks toward the other device (or the user brings the electronic device close to the other device);
sub-condition 3: the other devices are within a specific angular range and within a specific distance range of the user and the electronic device.
Considering that, when other devices exist around the user, the user may use the electronic device to view information of the other devices or to control them, it is optionally possible to first confirm whether other devices exist around the user, and if so, to further perceive whether the user is approaching the other devices.
Whether there is a possibility that the user will view or control the other device can be further confirmed by sensing whether the electronic device used by the user is moving. If it is moving, it may be assumed by default that the user is approaching the other device, i.e. the possibility exists; if it is not moving, it may be assumed by default that the user is not approaching the other device, i.e. the possibility does not exist.
For example, the card ejection condition may include the above sub-conditions 1 to 3 at the same time. In this case, the electronic device may first determine whether sub-condition 1 and sub-condition 2 are satisfied, and only then determine whether sub-condition 3 is satisfied, so that the determination operation related to sub-condition 3 does not need to be performed in real time.
Considering that the determination operation related to sub-condition 3 consumes considerable power when performed by the processor, and that sub-condition 3 is not always satisfied, this determination operation can instead be performed by a low-power component, so as to avoid repeatedly and inefficiently activating the processor.
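A minimal sketch of such staged evaluation is given below, assuming the expensive sub-condition 3 check is wrapped in a callback so that it only runs once the two cheap sub-conditions already hold; the policy fields and numeric values are illustrative assumptions, not values taken from this application.

#include <stdbool.h>
#include <stdio.h>

/* Sub-condition 3 needs accurate angle/distance sensing, which is the costly
 * part; wrapping it in a callback keeps it from running unless the two cheap
 * sub-conditions pass. Thresholds here are illustrative assumptions. */
typedef bool (*PreciseCheckFn)(float *out_distance_m, float *out_angle_deg);

typedef struct {
    float max_distance_m;   /* specific distance range for sub-condition 3 */
    float max_angle_deg;    /* specific angular range for sub-condition 3  */
} CardEjectPolicy;

bool card_eject_condition_met(bool other_device_nearby,       /* sub-condition 1 */
                              bool device_moving,              /* sub-condition 2 */
                              PreciseCheckFn precise_check,    /* sub-condition 3 */
                              const CardEjectPolicy *policy)
{
    /* Cheap checks first: no need to start accurate sensing otherwise. */
    if (!other_device_nearby || !device_moving)
        return false;

    float distance, angle;
    if (!precise_check(&distance, &angle))
        return false;

    return distance <= policy->max_distance_m &&
           angle >= -policy->max_angle_deg && angle <= policy->max_angle_deg;
}

/* Simulated accurate sensing result, for demonstration only. */
static bool fake_precise_check(float *d, float *a) { *d = 0.4f; *a = 10.0f; return true; }

int main(void)
{
    CardEjectPolicy policy = { 1.0f, 30.0f };
    printf("eject: %d\n", card_eject_condition_met(true, true, fake_precise_check, &policy));
    printf("eject: %d\n", card_eject_condition_met(true, false, fake_precise_check, &policy));
    return 0;
}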
Next, the device-aware implementation procedure of the present embodiment will be described by taking the implementation logic of the above-described implementation mode 2 as an example.
In one embodiment, please refer to fig. 8, which illustrates a timing diagram of an electronic device sensing another device based on two-stage sensing, involving the operation of a wireless network communication technology chip 810, a sensor processor 820, and a hi-fi chip 830 in the electronic device.
Referring to fig. 8, a wireless network communication technology chip 810 is configured to sense whether other devices exist around, and if so, to send notification information of the sensed surrounding devices to a sensor processor 820.
The sensor processor 820 senses whether the electronic device is moving in response to the notification information, and triggers the high-fidelity chip 830 to accurately sense when the electronic device is sensed to be moving.
The hi-fi chip 830, in response to the triggering operation for performing the accurate sensing, starts the ultrasonic sensing module to perform the ultrasonic sensing operation, and reports the ultrasonic sensing result to the sensor processor 820.
The sensor processor 820 may then determine whether to wake up the processor to eject the card based on the reported ultrasound sensing results.
Please refer to fig. 8: first-level sensing may be performed first, followed by second-level accurate sensing.
For primary sensing, whether there are other devices around the electronic device may be sensed through a wireless medium, such as a bluetooth chip or a wireless network communication technology chip (WiFi chip), in the sensor processor 820, and whether the electronic device is moving (which may be treated as equivalent to whether the user is moving closer to the other device) may be sensed through an inertial measurement unit, such as a gyroscope sensor, an acceleration sensor or a compass sensor, in the sensor processor 820.
The wireless medium and the inertial measurement unit may be embedded in the sensor processor 820, may be mounted on the sensor processor 820, or may be mounted outside the sensor processor 820 and connected to the sensor processor 820, which is not limited in this embodiment.
For the second-level accurate sensing, if the first-level sensing result is that other devices are around and the electronic device is moving, the ultrasonic sensing module sunk into the hi-fi chip 830 can be triggered to perform accurate sensing.
The accurate sensing results may be returned to the sensor processor 820, so that the sensor processor 820 can wake up the processor on demand via a communication channel, between the sensor processor 820 and the processor, that is used for waking up the processor. After being woken up, the processor may, for example, control the electronic device to light up the screen and eject the card.
In another possible implementation, a communication channel for waking up the processor may also be established between the hi-fi chip 830 and the processor. In this manner, the hi-fi chip 830 may process the accurate sensing result directly, without returning it to the sensor processor 820, and wake the processor on demand via this communication channel.
In another embodiment, please refer to fig. 9, fig. 9 shows a timing diagram of the electronic device waking up the processor to eject the card after sensing other devices, the timing diagram involving the hi-fi chip 910, the sensor processor 920 and the processor 930 in the electronic device.
The hi-fi chip 910 reports the ultrasonic sensing result obtained by performing the ultrasonic sensing operation to the sensor processor 920.
The sensor processor 920 determines, according to the reported ultrasonic sensing result, whether the condition for waking up the processor is satisfied, and sends an instruction for waking up the processor when it determines that the condition is met.
Processor 930, in response to an instruction to wake up the processor, controls the electronic device to light up the screen and eject the card for use by the user. For example, the user may view information of the perceived device or control the device based on the card content.
In yet another embodiment, please refer to fig. 10, which shows a timing diagram of an electronic device perceiving another device based on ultrasonic sensing, involving a microphone 1010 and an audio processing algorithm chain 1020 in the electronic device. By way of example, fig. 10 may show the internal timing of the hi-fi chip.
The microphone 1010 sends the collected ultrasonic sensing information to the audio processing algorithm chain 1020, and the audio processing algorithm chain 1020 calls the ultrasonic algorithm to process the ultrasonic sensing information to obtain an ultrasonic sensing result, wherein the ultrasonic sensing result comprises a distance and a direction.
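The application does not detail the ultrasonic algorithm itself. Purely as a generic illustration of how a distance and a direction can be derived from ultrasonic measurements, the following C sketch computes distance from the round-trip time of a pulse and direction from the arrival-time difference between two microphones; the two-microphone arrangement, the constants and the sample values are assumptions for this example, not the audio processing algorithm chain of this application.

#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define SPEED_OF_SOUND_MS 343.0f   /* speed of sound in air at roughly 20 degrees C */

/* One-way distance from the round-trip time of an ultrasonic pulse. */
float tof_distance_m(float round_trip_s)
{
    return SPEED_OF_SOUND_MS * round_trip_s * 0.5f;
}

/* Bearing (degrees, 0 = broadside) from the arrival-time difference between two
 * microphones separated by mic_spacing_m; valid while
 * |delay_s| * speed of sound <= mic_spacing_m. */
float tdoa_direction_deg(float delay_s, float mic_spacing_m)
{
    float x = SPEED_OF_SOUND_MS * delay_s / mic_spacing_m;
    if (x > 1.0f)  x = 1.0f;    /* clamp numeric noise */
    if (x < -1.0f) x = -1.0f;
    return asinf(x) * 180.0f / (float)M_PI;
}

int main(void)
{
    /* 3.5 ms round trip -> about 0.6 m; 30 us delay over 2 cm spacing -> about 31 degrees. */
    printf("distance:  %.2f m\n", tof_distance_m(0.0035f));
    printf("direction: %.1f deg\n", tdoa_direction_deg(30e-6f, 0.02f));
    return 0;
}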
This embodiment can adopt ultrasonic technology and perform high-precision sensing and positioning based on the ultrasonic sensing module. In other embodiments, other high-precision sensing and positioning approaches may be employed. For example, accurate positioning can be performed in a pulse mode based on a UWB chip in the electronic device.
Fig. 11 is a flowchart of a method for sensing a device according to an embodiment of the present application, where the method may include steps 1101 to 1106:
Step 1101, sensing whether other devices exist around the electronic device, if so, executing step 1102, otherwise, ending the current flow.
A wireless medium such as a bluetooth chip, a wireless network communication technology chip, etc. in an electronic device may be used to discover whether other devices are present around the electronic device.
The bluetooth chip may be a BLE (Bluetooth Low Energy) chip.
Step 1102, determining whether other devices and electronic devices are devices under the same user account, if so, executing step 1103, otherwise, ending the current flow.
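For illustration of this account check only, the following C sketch filters discovered peers by an account identifier; the idea that a discovered advertisement carries an account_id field, and the field names used, are assumptions made for this example rather than anything specified by this application.

#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* A discovered peer as the wireless medium might report it; the account_id
 * field is a hypothetical illustration of "device under the same user account". */
typedef struct {
    char device_name[32];
    char account_id[32];
} DiscoveredDevice;

/* Keep only peers bound to the same account as this electronic device. */
size_t filter_same_account(const DiscoveredDevice *in, size_t n,
                           const char *local_account_id,
                           DiscoveredDevice *out)
{
    size_t kept = 0;
    for (size_t i = 0; i < n; i++) {
        if (strcmp(in[i].account_id, local_account_id) == 0)
            out[kept++] = in[i];
    }
    return kept;
}

int main(void)
{
    DiscoveredDevice found[] = {
        { "speaker", "user-a" },
        { "tv",      "user-b" },
    };
    DiscoveredDevice same[2];
    size_t n = filter_same_account(found, 2, "user-a", same);
    printf("%zu same-account device(s), first: %s\n", n, n ? same[0].device_name : "-");
    return 0;
}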
Step 1103, it is perceived whether the electronic device is moving, if so, step 1104 is executed, otherwise the current flow is ended.
The movement of the electronic device may be sensed using one or more of an inertial measurement unit (IMU), an acceleration sensor, a compass sensor, etc. in the electronic device.
For example, if the direction information collected by the compass sensor changes, the electronic device can be considered to be moving.
Step 1104, performing the ultrasonic sensing operation to obtain an ultrasonic sensing result.
If the electronic device is moving, the ultrasonic sensing module can be awakened, and the ultrasonic sensing module then performs the ultrasonic sensing operation to realize azimuth (direction and position) sensing. Before being awakened, the ultrasonic sensing module may be in a dormant state.
The ultrasonic sensing module may be built into a high-fidelity chip of the electronic device.
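A minimal lifecycle sketch of this on-demand behaviour is shown below, assuming the module exposes a single request entry point and returns to the dormant state after one sensing pass; the state names and functions are illustrative assumptions, not an interface defined by this application.

#include <stdio.h>

/* Minimal lifecycle sketch: the module stays dormant, is woken only when the
 * first-level checks pass, performs one sensing pass, then sleeps again. */
typedef enum { ULTRA_DORMANT, ULTRA_SENSING } UltraState;

typedef struct {
    UltraState state;
} UltrasonicModule;

void ultra_request_sensing(UltrasonicModule *m)
{
    if (m->state != ULTRA_DORMANT)
        return;                      /* already busy */
    m->state = ULTRA_SENSING;
    printf("ultrasonic module: woken, emitting pulse and capturing echo\n");
    /* ... direction and position would be computed here ... */
    m->state = ULTRA_DORMANT;        /* back to the low-power state */
    printf("ultrasonic module: dormant again\n");
}

int main(void)
{
    UltrasonicModule m = { ULTRA_DORMANT };
    ultra_request_sensing(&m);
    return 0;
}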
This embodiment can adopt ultrasonic technology and perform high-precision sensing and positioning based on the ultrasonic sensing module. In other embodiments, other high-precision sensing and positioning approaches may be employed. For example, accurate positioning can be performed in a pulse mode based on a UWB chip in the electronic device.
Step 1105, determining whether a preset processor wake-up condition is met according to the ultrasonic sensing result, if yes, executing step 1106, otherwise ending the current flow.
The ultrasonic sensing result may include a distance, which may be expressed as the distance between the electronic device and the other device, and a direction, which may be expressed as the direction of the other device relative to the electronic device.
Correspondingly, the preset processor wake-up conditions may include: the distance in the ultrasonic sensing result is in the corresponding distance range, and the direction in the ultrasonic sensing result is in the corresponding direction range.
Therefore, the processor is awakened only when the other device is close to the electronic device and offset from it by only a small angle, which ensures that the processor is awakened accurately.
The processor wake-up condition may be a wake-up condition corresponding to the specific other device. A user may customize, as desired, the processor wake-up condition (or wake-up policy) corresponding to each other device through the application interface of that device.
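As an illustration of such per-device customization, the following C sketch keeps a small table of wake-up policies keyed by a device identifier and checks an ultrasonic sensing result against the policy registered for that device; the structure, the device identifiers and the numeric ranges are assumptions made for this example, not values defined by this application.

#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* Per-device wake-up policy: the condition combines a distance range and a
 * direction range; the concrete numbers are illustrative assumptions that a
 * third-party application might register. */
typedef struct {
    char  device_id[16];
    float max_distance_m;
    float max_angle_deg;
} WakePolicy;

static const WakePolicy policies[] = {
    { "speaker", 1.0f, 30.0f },
    { "tv",      2.0f, 45.0f },
};

static const WakePolicy *find_policy(const char *device_id)
{
    for (size_t i = 0; i < sizeof(policies) / sizeof(policies[0]); i++)
        if (strcmp(policies[i].device_id, device_id) == 0)
            return &policies[i];
    return NULL;
}

/* Decide whether the ultrasonic sensing result satisfies the wake-up condition
 * registered for this particular device. */
bool should_wake_processor(const char *device_id, float distance_m, float direction_deg)
{
    const WakePolicy *p = find_policy(device_id);
    if (p == NULL)
        return false;                        /* no policy registered: keep sleeping */
    return distance_m <= p->max_distance_m &&
           direction_deg >= -p->max_angle_deg && direction_deg <= p->max_angle_deg;
}

int main(void)
{
    printf("wake: %d\n", should_wake_processor("speaker", 0.8f, 10.0f)); /* 1 */
    printf("wake: %d\n", should_wake_processor("tv", 3.0f, 10.0f));      /* 0 */
    return 0;
}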
Step 1106, wake up a processor of the electronic device, wherein the processor is configured to control the electronic device to eject a card after being woken up, and the card display content includes information related to usage states of other devices.
Based on the method provided by this embodiment, while the internal processor of the electronic device is in a dormant state, whether other devices exist around the electronic device can be discovered in real time by means of a wireless medium built into the low-power sensor processor of the electronic device, whether the electronic device is moving can be sensed by means of an inertial measurement unit in the electronic device, and whether the electronic device and the other devices are devices under the same account can be judged. If all of these judgments are positive, the ultrasonic sensing module sunk into the low-power hi-fi chip of the electronic device is notified to perform azimuth sensing. The azimuth sensing information can be transmitted back to the sensor processor, which judges whether it satisfies the wake-up policy customized by the third-party application; if so, the processor of the electronic device can be woken up. After being woken up, the processor performs further processing, such as ejecting the card for use by the user.
As shown in fig. 11, after sensing that other devices exist in the periphery, the electronic device may first determine whether the other devices and the electronic device are devices under the same user account, and if so, then sense whether the electronic device is moving.
In other possible implementations, the electronic device may first sense whether the electronic device is moving after sensing that other devices exist in the periphery, and if so, determine whether the other devices and the electronic device are devices under the same user account.
In related technical solutions, when the electronic device is in the dormant state and approaches another device, the orientation of the other device cannot be perceived, so the electronic device cannot automatically eject a card for the user to use. Unlike those related technical solutions, with this embodiment the electronic device can still measure, in real time and with low power consumption, the orientation between itself and surrounding devices while it is dormant, and once the dormant device is found, according to the rules, to be close to another device, a card can be automatically ejected for the user to use.
It should be noted that, the method of sensing the device in the embodiment of the present application is not limited to what is shown in fig. 11, for example, in other possible implementations, the step 1102 and/or the step 1103 may not be performed, or the two steps may not be performed before the step 1104. The present application is not limited in this regard.
For example, if both steps are not performed, step 1104 may be performed directly after determining yes in step 1101. If step 1102 is not performed, step 1103 may be performed directly after step 1101 is determined to be yes. If step 1103 is not performed, step 1104 may be performed directly after the determination of step 1102 is yes.
For another example, step 1102 may be performed before step 1106, rather than before step 1103, i.e., if step 1105 is determined to be "yes" then step 1102 is performed, and if step 1102 is determined to be "yes" then step 1106 is performed.
Fig. 12 is a flowchart of another method for sensing a device according to another embodiment of the present application, where the method includes steps 1201-1209:
Step 1201, the wake-up sensing screening module in the sensor processor senses, according to the sensing data of the bluetooth chip, whether other devices exist around the electronic device; if so, step 1202 is executed, otherwise the current flow ends.
Step 1202, the wake-up sensing screening module determines whether the other devices and the electronic device are devices under the same user account; if so, step 1203 is executed, otherwise the current flow ends.
Step 1203, the wake-up sensing screening module senses, according to the sensing data of the acceleration sensor/compass sensor, whether the electronic device is moving; if so, step 1204 is executed, otherwise the current flow ends.
In step 1204, the wake-up sensing screening module triggers the hi-fi chip to perform the ultrasonic sensing operation.
This embodiment can adopt ultrasonic technology and perform high-precision sensing and positioning based on the ultrasonic sensing module. In other embodiments, other high-precision sensing and positioning approaches may be employed. For example, accurate positioning can be performed in a pulse mode based on a UWB chip in the electronic device.
In step 1205, the audio processing algorithm chain in the hi-fi chip processes the sensing data of the speaker/microphone to obtain a processing result.
In step 1206, the perception processing module in the hi-fi chip processes the processing result to obtain an ultrasonic sensing result.
Step 1207, the wake-up processing screening module in the sensor processor determines, according to the ultrasonic sensing result, whether the preset processor wake-up condition is satisfied; if so, step 1208 is executed, otherwise the current flow ends.
In step 1208, the wake-up processing screening module wakes up the processor of the electronic device.
In step 1209, after the processor is awakened, the application processing module in the processor controls the electronic device to eject the card, where the card display content includes information related to the usage states of the other devices.
As shown in fig. 12, after sensing that other devices exist in the periphery, the electronic device may first determine whether the other devices and the electronic device are devices under the same user account, and if so, then sense whether the electronic device is moving.
In other possible implementations, the electronic device may first sense whether the electronic device is moving after sensing that other devices exist in the periphery, and if so, determine whether the other devices and the electronic device are devices under the same user account.
It should be noted that the method of sensing the device in the embodiment of the present application is not limited to what is shown in fig. 12, for example, in other possible implementations, the step 1202 and/or the step 1203 may not be performed, or the two steps may not be performed before the step 1204. The present application is not limited in this regard.
For example, if the two steps are not performed, step 1204 may be performed directly after determining yes in step 1201. If step 1202 is not performed, step 1203 may be directly performed after step 1201 is determined to be yes. If step 1203 is not performed, step 1204 may be performed directly after determining yes in step 1202.
For another example, step 1202 may be performed before step 1208 instead of before step 1203, i.e., if step 1207 determines "yes", step 1202 is performed, and if step 1202 determines "yes", step 1208 is performed.
In the several embodiments provided by the present invention, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the elements is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices, or units, which may be in electrical, mechanical, or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The embodiment of the application also provides an electronic chip, including: a processor for executing computer program instructions stored in a memory, wherein the computer program instructions, when executed by the processor, trigger the electronic chip to perform the steps performed by the electronic device 100 in the above embodiments.
The embodiment of the application also provides an electronic device, which may include: a processor for running a computer program stored in a memory, so that the electronic device implements the steps performed by the electronic device 100 in the above embodiments. For one possible hardware structure of the electronic device provided in the embodiment of the present application, reference may be made to the schematic hardware structure shown in fig. 5.
In particular, in an embodiment of the present application, one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the apparatus, cause the apparatus to perform the method steps described in the embodiments of the present application.
Specifically, in an embodiment of the present application, the processor of the electronic device may be a system on chip (SoC), where the processor may include a central processing unit (Central Processing Unit, CPU) and may further include other types of processors. Specifically, in an embodiment of the present application, the processor of the electronic device may be a PWM control chip.
In particular, in an embodiment of the present application, the processor may include, for example, a CPU, a digital signal processor (DSP), or a microcontroller, and may further include a GPU, an embedded neural-network processing unit (Neural-network Processing Units, NPU), and an image signal processor (Image Signal Processing, ISP); the processor may further include a necessary hardware accelerator or logic processing hardware circuit, such as an ASIC, or one or more integrated circuits for controlling the execution of the programs of the present application, and so on. Further, the processor may have the function of running one or more software programs, which may be stored in a storage medium.
In particular, in an embodiment of the present application, the memory of the electronic device may be a read-only memory (ROM), other type of static storage device capable of storing static information and instructions, a random access memory (random access memory, RAM) or other type of dynamic storage device capable of storing information and instructions, an electrically erasable programmable read-only memory (electrically erasable programmable read-only memory, EEPROM), a compact disc read-only memory (compact disc read-only memory, CD-ROM) or other optical disc storage, optical disc storage (including compact disc, laser disc, optical disc, digital versatile disc, blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any computer readable medium capable of carrying or storing desired program code in the form of instructions or data structures and capable of being accessed by a computer.
In particular, in an embodiment of the present application, the processor and the memory may be combined into a single processing device, although they are more commonly separate components, and the processor is configured to execute the program code stored in the memory to implement the method described in the embodiments of the present application. In particular, the memory may also be integrated into the processor, or may be separate from the processor.
Further, the devices, apparatuses, modules illustrated in the embodiments of the present application may be implemented by a computer chip or entity, or by a product having a certain function.
It will be apparent to those skilled in the art that embodiments of the present application may be provided as a method, apparatus, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media having computer-usable program code embodied therein.
The present embodiment also provides a computer storage medium including a computer program, which when executed on an electronic device, causes the electronic device to perform the steps performed by the electronic device 100 in the above embodiment.
Embodiments of the present application also provide a computer program product comprising a computer program for causing a computer to carry out the steps carried out by the electronic device 100 in the above embodiments when the computer program is run on the computer.
Embodiments herein are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (devices), and computer program products according to embodiments herein. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be understood that the term "unit" in the embodiments of the present application may be implemented in software and/or hardware, which is not specifically limited. For example, a "unit" may be a software program, a hardware circuit or a combination of both that implements the functions described above. The hardware circuitry may include application specific integrated circuits (application specific integrated circuit, ASICs), electronic circuits, processors (e.g., shared, proprietary, or group processors, etc.) and memory for executing one or more software or firmware programs, merged logic circuits, and/or other suitable components that support the described functions.
Thus, the elements of the examples described in the embodiments of the present application can be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In several embodiments provided herein, any of the functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods of the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The integrated units, implemented in the form of software functional units, may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium, and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a Processor (Processor) to perform part of the steps of the methods according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing description is merely of preferred embodiments of the present invention and is not intended to limit the present invention; any modification, equivalent replacement, improvement, or the like made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.

Claims (16)

1. A method of sensing a device, comprising:
determining whether other devices exist around the electronic device;
under the condition that other devices exist around the electronic device, positioning results corresponding to the other devices are obtained;
determining whether the positioning result meets a preset first condition;
and under the condition that the positioning result meets the first condition, waking up a processor of the electronic equipment, and executing a first operation corresponding to the other equipment after the processor is waken up.
2. The method of claim 1, wherein the obtaining positioning results corresponding to the other devices comprises:
acquiring first data acquired by an ultrasonic medium in the electronic equipment;
and obtaining positioning results corresponding to the other devices according to the first data.
3. The method of claim 1, wherein the positioning result comprises: direction data of the other device relative to the electronic device;
the first condition includes: a preset direction range;
the determining whether the positioning result meets a preset first condition includes:
determining whether the direction data is within the direction range;
wherein the case that the positioning result satisfies the first condition includes: and the direction data is in the direction range.
4. The method of claim 1, wherein the positioning result comprises: the target distance between the other equipment and the electronic equipment;
the first condition includes: a preset distance threshold;
the determining whether the positioning result meets a preset first condition includes:
determining whether the target distance is not greater than the distance threshold;
wherein the case that the positioning result satisfies the first condition includes: and the target distance is not greater than the distance threshold.
5. The method of claim 1, wherein after determining that there are other devices around the electronic device and before the waking up the processor of the electronic device, the method further comprises:
determining whether the electronic device is moving;
and executing the step of waking up the processor of the electronic device under the condition that the electronic device is determined to be moving.
6. The method of claim 5, wherein the method further comprises:
acquiring second data acquired by an inertial measurement unit in the electronic equipment;
and executing the step of determining whether the electronic equipment is moving according to the second data.
7. The method of claim 6, wherein the inertial measurement unit comprises at least one of a gyroscope sensor, an acceleration sensor, a compass sensor.
8. The method of claim 1, wherein after determining that there are other devices around the electronic device and before the waking up the processor of the electronic device, the method further comprises: determining whether the other device and the electronic device are devices under the same user account;
and waking up a processor of the electronic device if the positioning result meets the first condition, including:
and waking up a processor of the electronic equipment under the condition that the positioning result meets the first condition and the other equipment and the electronic equipment are equipment under the same user account.
9. The method of claim 1, wherein the first operation comprises: and controlling the electronic equipment to eject the card, wherein the card display content comprises information related to the use states of the other equipment.
10. The method according to claim 1, wherein the method further comprises:
acquiring third data acquired by a wireless medium in the electronic equipment;
and executing the step of determining whether other devices exist around the electronic device according to the third data.
11. The method of claim 10, wherein the wireless medium comprises at least one of a bluetooth chip and a wireless network communication technology chip.
12. An apparatus for sensing a device, comprising:
the first determining module is used for determining whether other devices exist around the electronic device;
the acquisition module is used for acquiring positioning results corresponding to other equipment when the other equipment exists around the electronic equipment;
the second determining module is used for determining whether the positioning result meets a preset first condition;
and the awakening module is used for awakening a processor of the electronic equipment under the condition that the positioning result meets the first condition, and the processor executes a first operation corresponding to the other equipment after being awakened.
13. An electronic chip, comprising: a processor for executing computer program instructions stored on a memory, wherein the computer program instructions, when executed by the processor, trigger the electronic chip to perform the method of any of claims 1-11.
14. An electronic device comprising a memory for storing computer program instructions, a processor for executing the computer program instructions, and communication means, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method of any of claims 1-11.
15. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program which, when run on a computer, causes the computer to perform the method according to any of claims 1-11.
16. A computer program product, characterized in that the computer program product comprises a computer program which, when run on a computer, causes the computer to perform the method according to any of claims 1-11.
CN202210775399.5A 2022-07-01 2022-07-01 Method, device, chip and equipment for sensing equipment Pending CN117376887A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210775399.5A CN117376887A (en) 2022-07-01 2022-07-01 Method, device, chip and equipment for sensing equipment
PCT/CN2023/104124 WO2024002282A1 (en) 2022-07-01 2023-06-29 Method and apparatus for sensing device, and chip and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210775399.5A CN117376887A (en) 2022-07-01 2022-07-01 Method, device, chip and equipment for sensing equipment

Publications (1)

Publication Number Publication Date
CN117376887A true CN117376887A (en) 2024-01-09

Family

ID=89383307

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210775399.5A Pending CN117376887A (en) 2022-07-01 2022-07-01 Method, device, chip and equipment for sensing equipment

Country Status (2)

Country Link
CN (1) CN117376887A (en)
WO (1) WO2024002282A1 (en)


Also Published As

Publication number Publication date
WO2024002282A1 (en) 2024-01-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination