WO2022170863A1 - Ultra-wideband positioning method and system - Google Patents

Ultra-wideband positioning method and system

Info

Publication number
WO2022170863A1
Authority
WO
WIPO (PCT)
Prior art keywords
base station
distance information
base stations
user
target base
Prior art date
Application number
PCT/CN2021/140535
Other languages
English (en)
Chinese (zh)
Inventor
符谋政
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2022170863A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02: Services making use of location information
    • H04W4/023: Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W64/00: Locating users or terminals or network equipment for network management purposes, e.g. mobility management
    • H04W64/006: Locating users or terminals or network equipment for network management purposes, e.g. mobility management with additional information processing, e.g. for direction or speed determination
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00: Reducing energy consumption in communication networks
    • Y02D30/70: Reducing energy consumption in communication networks in wireless communication networks

Definitions

  • The present application relates to the field of communication technologies, and in particular to an ultra-wideband (UWB) positioning method and system.
  • Mainstream positioning technologies can be divided into two categories according to application scenario: outdoor positioning technology and indoor positioning technology.
  • Outdoor positioning technology mainly includes satellite positioning technology and base station positioning technology.
  • Satellite positioning technologies include, but are not limited to, Global Positioning System (GPS), Global Navigation Satellite System (GLONASS), and Beidou Navigation Satellite System (BDS).
  • Indoor positioning technologies include but are not limited to wireless fidelity (WiFi) positioning and UWB positioning technology.
  • UWB positioning technology offers a high data transmission rate (up to 1 Gbit/s), strong resistance to multipath interference, low power consumption, low cost, strong penetration, a low probability of interception, and the ability to share spectrum with existing wireless communication systems. Because of these characteristics, UWB positioning technology is widely used.
  • However, UWB positioning requires base stations with known locations to be deployed in advance, which makes deployment and maintenance costly, and the base stations and the measurement and calculation units, such as mobile phones and other electronic devices, require networked communication.
  • The embodiments of the present application provide a UWB positioning method and system, which can solve at least one of the above technical problems in the prior art.
  • an embodiment of the present application provides a UWB positioning method, including:
  • the at least two target base stations including ultra-wideband modules
  • The direction angles from the at least two target base stations deployed on the first user to the tag, and the distance information of the at least two target base stations, are obtained. Because the distance information and the direction information are highly reliable, positioning according to the direction angles and the distance information yields more accurate position information for the tag.
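The core computation described above, obtaining a tag position from a base station's own position, the direction angle to the tag, and the distance, reduces to a polar-to-Cartesian conversion; with two base stations, the two independent estimates can be combined. The function names, example coordinates, and the simple averaging step below are illustrative assumptions for a 2-D sketch, not the patented algorithm itself:

```python
import math

def locate_tag(base_xy, angle_rad, distance):
    """Estimate the tag position from one base station's position,
    the direction angle to the tag, and the measured distance."""
    bx, by = base_xy
    return (bx + distance * math.cos(angle_rad),
            by + distance * math.sin(angle_rad))

# Two target base stations on the first user (illustrative values).
est1 = locate_tag((0.0, 0.0), math.radians(45.0), 10.0)
est2 = locate_tag((0.5, 0.0), math.radians(49.0), 9.7)

# Combine the two independent estimates, e.g. by averaging.
tag_xy = ((est1[0] + est2[0]) / 2, (est1[1] + est2[1]) / 2)
```

In practice the two estimates would be fused with a weighting that reflects each measurement's reliability; plain averaging is the simplest choice.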
  • The acquiring of distance information of the at least two target base stations deployed on the first user includes: determining attribute information of the at least two target base stations deployed on the first user, and determining the distance information of the at least two target base stations according to the attribute information.
  • Because the distance information of a target base station is determined according to its attribute information, without complicated calculation, the distance information can be obtained quickly and the computing cost of the system is reduced.
  • the acquiring distance information of the at least two target base stations deployed on the first user includes: acquiring the input distance information of the at least two target base stations deployed on the first user.
  • The distance information of the target base stations is determined according to input from the user (including the first user). On the one hand, the distance information can be obtained quickly, reducing the computing cost of the system; on the other hand, more accurate distance information can be obtained, yielding more accurate positioning results.
  • the obtaining distance information of at least two target base stations deployed on the first user includes:
  • the third base station includes a hand base station or a foot base station
  • the second distance information is determined according to the more accurate first distance information, and then the positioning result is obtained according to the second distance information and the direction angle, so that a more accurate positioning result can be obtained.
  • the attribute information includes deployment site information.
  • the obtaining distance information of at least two target base stations deployed on the first user includes:
  • the ultra-wideband positioning method further includes:
  • the pose of the first user is determined according to the distance information of the at least three target base stations.
  • Because the pose of the first user is also determined, more user data is obtained and efficiency is improved; in rescue application scenarios, for example, better assistance can be provided.
  • the at least three target base stations include a head base station, a hand base station, and a foot base station;
  • the acquiring distance information of at least three target base stations deployed on the first user includes:
  • the determining the pose of the first user according to the distance information of the at least three target base stations includes:
  • the pose of the first user is determined according to the third distance information and the fourth distance information.
  • a display interface including a radar chart is displayed, and the radar chart identifies the positioning position of the first user and/or the second user.
  • the location information is provided visually, and navigation can be completed quickly.
  • An embodiment of the present application provides a UWB positioning system, including: at least two target base stations deployed on a first user, and a tag deployed on a second user; the at least two target base stations and the tag each include an ultra-wideband module.
  • the second aspect further includes: one or more processors;
  • the one or more processors are integrated into at least one of an electronic device deployed by the target base station, an electronic device deployed by the tag, and a server;
  • the processor is configured to acquire distance information of at least two target base stations
  • the processor is further configured to acquire a direction angle from each of the at least two target base stations to the tag, and obtain a positioning position of the tag based on the direction angle and the distance information.
  • the acquiring the distance information of the at least two target base stations includes: determining attribute information of the at least two target base stations, and determining the distance information of the at least two target base stations according to the attribute information.
  • the acquiring the distance information of the at least two target base stations includes: acquiring the input distance information of the at least two target base stations.
  • the at least two target base stations include a first head base station, a second head base station, and a third base station, and the third base station includes a hand base station or a foot base station;
  • the acquiring distance information of at least two target base stations includes:
  • the attribute information includes deployment site information.
  • the acquiring distance information of at least two target base stations includes:
  • the processor is further configured to determine the pose of the first user according to the distance information of the at least three target base stations.
  • the at least three target base stations include a head base station, a hand base station, and a foot base station
  • the acquiring distance information of the at least two target base stations includes:
  • the determining the pose of the first user according to the distance information of the at least three target base stations includes:
  • the pose of the first user is determined according to the third distance information and the fourth distance information.
  • The second aspect further includes a display, where the display is integrated into the target base station, into an electronic device on which the tag is deployed, or into a server;
  • the display is used for displaying a display interface including a radar chart, where the positioning positions of the first user and/or the second user are identified in the radar chart.
  • An embodiment of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the electronic device implements the UWB positioning method according to any one of the first aspect and its possible implementation manners.
  • An embodiment of the present application provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, the UWB positioning method according to any one of the first aspect and its possible implementation manners is implemented.
  • Embodiments of the present application provide a computer program product; when the computer program product runs on one or more electronic devices, the one or more electronic devices perform the UWB positioning method according to any one of the first aspect and its possible implementation manners.
  • FIG. 1 is a schematic structural diagram of a UWB positioning system provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of the principle of a TOF ranging provided by an embodiment of the present application.
  • FIG. 3A is a schematic diagram of the principle of a triangulation positioning method provided by an embodiment of the present application.
  • FIG. 3B is a schematic diagram of the principle of another triangulation positioning method provided by an embodiment of the present application.
  • FIG. 4A is a schematic diagram of the principle of an AOA-based positioning method provided by an embodiment of the present application.
  • FIG. 4B is a schematic structural diagram of an array antenna provided by an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of an application scenario provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of the principle of an AOA-based positioning method provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a display interface of a mobile phone provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of an application scenario provided by an embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of glasses provided by an embodiment of the present application.
  • FIG. 11 is an implementation flowchart of a UWB positioning method provided by an embodiment of the present application.
  • FIG. 12 is an implementation flowchart of a UWB positioning method provided by another embodiment of the present application.
  • References in this specification to "one embodiment" or "some embodiments" and the like mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application.
  • Appearances of the phrases "in one embodiment", "in some embodiments", "in other embodiments", and the like in various places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments" unless specifically emphasized otherwise.
  • The terms "comprising", "including", "having" and their variants mean "including but not limited to" unless specifically emphasized otherwise.
  • base stations with known positions need to be deployed in advance, and at least two base stations are usually required.
  • the deployment and maintenance costs are high, and the base station and the measurement and calculation unit, such as terminal equipment, need network communication.
  • the present application provides a positioning method and electronic device, which can realize positioning in a mobile scenario.
  • the positioning solution provided by the present application does not need to deploy base stations in advance, has low cost, and has high flexibility.
  • UWB technology communicates using baseband pulses with an extremely wide spectrum, so it is also called baseband communication technology or carrierless communication technology.
  • UWB technology transmits data using nanosecond-scale non-sinusoidal narrow pulses, spreading the pulses across a wide frequency range through orthogonal frequency division modulation or direct sequencing.
  • the main features of UWB technology are high transmission rate, large space capacity, low cost, and low power consumption.
  • UWB positioning technology is a kind of wireless positioning technology.
  • Wireless positioning technology refers to the measurement method and calculation method used to determine the location of a mobile user, that is, a positioning algorithm.
  • the UWB positioning system may include a data acquisition layer, a data transmission layer, and a data processing layer.
  • the data collection layer may include a positioning tag (hereinafter referred to as a tag) and a positioning base station (hereinafter referred to as a base station), etc.
  • the positioning of the positioning tag is realized through the UWB positioning channel between the base station and the tag.
  • the data transmission layer can include a wired transmission network and/or a wireless transmission network.
  • the wireless transmission network can provide a data transmission link for the base station through a WiFi channel, and the wired transmission network can provide a data transmission link for the base station through the Ethernet.
  • The wired transmission network may also provide a data transmission link for the wireless transmission network.
  • The data processing layer can include a server, a UWB positioning engine, and internal and external software interfaces.
  • the base station can send the positioning data to the UWB positioning engine in real time, and the UWB positioning engine can calculate the positioning data in real time to obtain the coordinate position of the tag.
  • a UWB positioning system is shown.
  • both the base station and the tag card are provided with UWB modules.
  • the UWB positioning system relies on different numbers of base station deployment strategies to realize zero-dimensional positioning, one-dimensional positioning, two-dimensional positioning, or three-dimensional positioning of the tag card.
  • Commonly used UWB positioning technologies include positioning based on angle of arrival (AOA), positioning based on time of arrival (TOA), positioning based on time difference of arrival (TDOA), and positioning based on received signal strength (RSS).
  • The most basic ranging methods include time of flight (TOF), time of arrival (TOA), time difference of arrival (TDOA), two-way time of flight (TW-TOF), and the like.
  • TOF is a two-way ranging technology, which calculates the distance by measuring the flight time of the UWB signal to and from the base station and the tag.
  • TOF calculates the distance S between the transmitter and the receiver by measuring the time difference of the radio frequency (RF) signal from the transmitter to the receiver and multiplying it by the speed of light.
  • S = c × [(Ta2 - Ta1) - (Tb2 - Tb1)] / 2, where c is the speed of light; Ta1 is the time when the transmitter transmits the RF signal; Ta2 is the time when the transmitter receives the reply RF signal from the receiver; Tb1 is the time when the receiver receives the RF signal from the transmitter; Tb2 is the time when the receiver transmits the reply RF signal.
  • On the one hand, this TOF ranging method requires precise time synchronization between the sender and the receiver; on the other hand, the receiver needs to report the duration of its response delay, that is, the time between receiving the RF signal and transmitting the reply.
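The TW-TOF ranging described above can be sketched directly: the one-way flight time is half of the measured round trip minus the receiver's reported response delay. This is a simplified model with illustrative timestamps; it ignores clock drift between the two devices:

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(ta1, ta2, tb1, tb2):
    """Two-way TOF ranging. Ta1/Ta2 are the transmitter's send/receive
    times, Tb1/Tb2 the receiver's receive/send times (all in seconds)."""
    round_trip = ta2 - ta1   # total elapsed time measured at the transmitter
    reply_delay = tb2 - tb1  # response delay measured at the receiver
    return C * (round_trip - reply_delay) / 2.0

# Example: 100 ns flight each way, 1 ms response delay at the receiver,
# giving a distance of roughly 30 m.
d = tof_distance(0.0, 2 * 100e-9 + 1e-3, 50e-9, 50e-9 + 1e-3)
```

Because the response delay is subtracted out, only the receiver's own clock needs to time that delay accurately, which is why the method hinges on the receiver reporting it.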
  • TDOA technology is a method of positioning using the time difference of arrival, also known as hyperbolic positioning.
  • TDOA technology requires high synchronization accuracy between base stations.
  • TOA technology is a method of calculating the distance between a tag or a terminal carrying a tag and a base station based on the propagation delay between them.
  • the position of the tag or the terminal carrying the tag can be determined according to the triangulation method.
  • the TOA ranging method also requires precise synchronization between the tag and the base station.
  • An embodiment of the present application provides a two-dimensional or three-dimensional position positioning solution based on multiple base stations.
  • The target object to be located, such as a person or an electronic device, carries a positioning tag; the tag sends an RF signal to the base stations, and each base station receives the RF signal sent by the tag.
  • the distance between each base station and the tag can be estimated according to the ranging algorithm, so as to calculate the two-dimensional or three-dimensional spatial position of the tag.
  • the position of the tag can be determined according to the triangulation method.
  • The triangulation method is illustrated, as a non-limiting example of the present application, in FIG. 3A and FIG. 3B.
  • The tag is deployed on the mobile phone 20.
  • The first distance r1 from the first UWB base station 21 to the mobile phone 20, the second distance r2 from the second UWB base station 22 to the mobile phone 20, and the third distance r3 from the third UWB base station 23 to the mobile phone 20 are measured, and combined with the known positions of the first UWB base station 21, the second UWB base station 22, and the third UWB base station 23, the position of the mobile phone 20 can be calculated.
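Solving for a position from three distances to known base stations is standard trilateration: subtracting the circle equations pairwise gives a linear system in the unknown coordinates. The sketch below is a minimal 2-D version with illustrative coordinates, not the patent's implementation:

```python
def trilaterate(p1, r1, p2, r2, p3, r3):
    """Solve for the 2-D point at distances r1, r2, r3 from the known
    base-station positions p1, p2, p3, by linearizing the three circle
    equations (subtracting the first from the other two)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero if the base stations are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# A tag at (1, 2) with base stations at three known positions:
pos = trilaterate((0, 0), 5**0.5, (4, 0), 13**0.5, (0, 4), 5**0.5)
```

With noisy real-world ranges the three circles do not meet at a single point, so a least-squares fit over the same linearized equations is typically used instead of this exact solve.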
  • An embodiment of the present application provides an AOA-based UWB positioning technology.
  • AOA positioning measures, through the base station antenna or antenna array, the angle of incidence of the radio wave emitted by the electronic device (the angle between the signal source and the antenna normal), thereby forming a radial line from the receiver to the electronic device, that is, a bearing line.
  • The intersection of multiple bearing lines, determined according to the AOA positioning algorithm, is the estimated position of the terminal device to be positioned.
  • For an electronic device such as a mobile phone, knowing the included angle α between the line connecting the first base station and the mobile phone and the reference direction, a ray L1 can be drawn; likewise, knowing the included angle β between the line connecting the second base station and the mobile phone and the reference direction, another ray L2 can be drawn. The intersection of the two rays L1 and L2 is the positioning position of the mobile phone. This is the basic mathematical principle of AOA positioning.
  • AOA positioning determines the position through the intersection of two straight lines; two non-parallel straight lines intersect at exactly one point, which avoids positioning ambiguity.
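The two-ray intersection can be sketched as follows: each base station contributes a ray defined by its position and measured angle, and the fix is where the rays cross. Names and the example geometry are illustrative; angles here are measured from a common reference x-axis:

```python
import math

def aoa_fix(p1, alpha, p2, beta):
    """Intersect ray L1 starting at base station p1 with direction angle
    alpha and ray L2 starting at base station p2 with direction angle
    beta (angles in radians from the reference direction)."""
    (x1, y1), (x2, y2) = p1, p2
    d1x, d1y = math.cos(alpha), math.sin(alpha)
    d2x, d2y = math.cos(beta), math.sin(beta)
    # Solve p1 + t*d1 = p2 + s*d2 for t by Cramer's rule.
    det = d1x * (-d2y) - d1y * (-d2x)  # zero if the rays are parallel
    t = ((x2 - x1) * (-d2y) - (y2 - y1) * (-d2x)) / det
    return (x1 + t * d1x, y1 + t * d1y)

# Base stations at (0,0) and (4,0), bearings 45° and 135°:
fix = aoa_fix((0, 0), math.radians(45), (4, 0), math.radians(135))
```

A real implementation would also reject near-parallel bearings (small `det`), where the intersection is numerically unstable.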
  • This AOA-based UWB positioning technology requires positioning base stations with known positions to be deployed in advance, usually at least two, and the deployment and maintenance costs are high.
  • the receiver needs to be equipped with an antenna array with strong directivity, as shown in FIG. 4B , and the antenna spacing d shown in FIG. 4B can be half a wavelength.
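With an antenna pair spaced d apart, the angle of arrival follows from the phase difference between the two received signals under the usual far-field assumption: θ = arcsin(Δφ·λ / (2π·d)). The sketch below uses that textbook relation; the 6.5 GHz wavelength value is an illustrative assumption, not taken from the patent:

```python
import math

def aoa_from_phase(delta_phi, wavelength, d):
    """Angle of arrival (radians) from the phase difference delta_phi
    (radians) between two antennas spaced d apart. With d = wavelength/2
    the measurable phase range (-pi, pi) covers angles (-90, 90) degrees
    without ambiguity."""
    return math.asin(delta_phi * wavelength / (2 * math.pi * d))

wl = 0.046  # ~ wavelength of a 6.5 GHz UWB carrier, in metres
theta = aoa_from_phase(math.pi / 2, wl, wl / 2)  # half-wavelength spacing
```

The half-wavelength spacing mentioned above is what makes the phase-to-angle mapping unambiguous: a larger spacing improves angular resolution but lets the phase wrap, producing multiple candidate angles.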
  • The positioning method provided in the embodiments of the present application can be applied to electronic devices, including but not limited to mobile phones, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPCs), netbooks, tablets, smart speakers, TVs, servers, and the like.
  • the electronic device may include a portable, handheld or mobile electronic device, such as a mobile phone, a tablet computer, or a wearable device.
  • FIG. 5 shows a schematic structural diagram of the electronic device 100 by taking a mobile phone as an example.
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device 100 .
  • The electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • The processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in the processor 110 is a cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate with each other through the I2C bus interface, so as to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communications, sampling, quantizing and encoding analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through a CSI interface, so as to realize the photographing function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area networks (WLAN) (such as Wi-Fi networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), UWB communication technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2.
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, UWB, and/or IR technology, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the electronic device 100 may include one or N display screens 194 , where N is a positive integer greater than one.
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193 .
  • when the shutter is opened, light is transmitted through the lens to the camera's photosensitive element, which converts the optical signal into an electrical signal and transmits it to the ISP for processing, where it is converted into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • a digital signal processor is used to process digital signals, in addition to processing digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy and so on.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos of various encoding formats, such as: Moving Picture Experts Group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the electronic device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playback, recording, etc.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • Speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B also referred to as "earpiece" is used to convert audio electrical signals into sound signals.
  • the voice can be answered by placing the receiver 170B close to the human ear.
  • the microphone 170C, also called "mic" or "mike", is used to convert sound signals into electrical signals.
  • the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the earphone jack 170D is used to connect wired earphones.
  • the earphone interface 170D may be the USB interface 130, or may be a 3.5mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the capacitive pressure sensor may be comprised of at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, the instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
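The threshold-based dispatch described above can be sketched as follows; the threshold value, the normalized pressure scale, and the function name are illustrative assumptions, not values taken from this application:

```python
# Illustrative only: the first pressure threshold below is an assumed
# normalized value, not one specified in the application.
FIRST_PRESSURE_THRESHOLD = 0.5

def handle_message_icon_touch(intensity: float) -> str:
    """Dispatch a touch on the short-message icon by touch intensity."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"        # light press: view the message
    return "create_new_short_message"      # firm press: create a new message
```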
  • the gyro sensor 180B may be used to determine the motion attitude of the electronic device 100 .
  • the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the shaking angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to offset the shaking of the electronic device 100 through reverse motion to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can detect the opening and closing of the flip holster using the magnetic sensor 180D.
  • the electronic device 100 can detect the opening and closing of the flip according to the magnetic sensor 180D. Further, according to the detected opening and closing state of the leather case or the opening and closing state of the flip cover, characteristics such as automatic unlocking of the flip cover are set.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
  • the distance sensor 180F is used to measure distance; the electronic device 100 can measure distance through infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 can use the distance sensor 180F to measure distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 100 emits infrared light to the outside through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket, so as to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking pictures with fingerprints, answering incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the electronic device 100 reduces the performance of the processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection.
  • the electronic device 100 when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid abnormal shutdown of the electronic device 100 caused by the low temperature.
  • the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
  • Touch sensor 180K is also called a "touch device".
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the location where the display screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice.
  • the bone conduction sensor 180M can also contact the pulse of the human body and receive the blood pressure beating signal.
  • the bone conduction sensor 180M can also be disposed in the earphone, combined with the bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vocal vibration bone block obtained by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M, and realize the function of heart rate detection.
  • the keys 190 include a power-on key, a volume key, and the like. Keys 190 may be mechanical keys. It can also be a touch key.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
  • Different application scenarios for example: time reminder, receiving information, alarm clock, games, etc.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 may be an indicator light, which may be used to indicate charging status and battery level changes, and may also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be contacted and separated from the electronic device 100 by inserting into the SIM card interface 195 or pulling out from the SIM card interface 195 .
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the electronic device 100 employs an eSIM, i.e., an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • the electronic device may increase or decrease components, for example, may include more or less components than the example shown in FIG. 5 .
  • positioning base stations are deployed on different body parts of the user, and the body part where each base station is located is detected manually and/or automatically to form a self-marking base station system, so as to locate the positioning tag on the mobile phone.
  • in this way, the three-dimensional spatial position of the positioning tag can be determined without deploying and calibrating positioning base stations in advance.
  • a positioning tag is provided on the mobile phone, and the positioning tag is denoted as Tm1.
  • the UWB positioning base stations are deployed as follows: the glasses worn by the user have two temples, and one UWB positioning base station is deployed on each temple; these two base stations are recorded as Bhead1 and Bhead2. The user wears a wristband or watch with a built-in UWB positioning base station, recorded as Bw1. A UWB positioning base station is deployed on the user's ankle and recorded as Ba1.
  • the user can manually specify the attributes of each UWB positioning base station according to which parts of the user's body the multiple UWB positioning base stations are deployed. For example, the user identifies the three UWB positioning base stations as the head base station, the wrist base station, and the ankle base station according to the human body parts deployed by the three UWB positioning base stations, namely the head, the wrist and the ankle.
  • each UWB positioning base station can automatically identify its own attributes. For example, the body part on which each UWB positioning base station is deployed can be determined according to the height it detects: the three UWB positioning base stations detect their own heights and are identified, in descending order of height, as the head base station, the wrist base station, and the ankle base station. For another example, the electronic device on which each UWB positioning base station is deployed can determine the base station's attributes according to its own device information: when the device information indicates a wristband or watch, the UWB positioning base station deployed on it is identified as the wrist base station; when the device information indicates glasses, the UWB positioning base station deployed on the glasses is identified as the head base station.
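The height-based self-identification above can be sketched as follows; the station names and the rule "highest = head, lowest = ankle, remainder = wrist" are assumptions for illustration, not the application's own algorithm:

```python
def identify_base_stations(heights):
    """Classify UWB base stations by their self-detected heights (meters).

    heights: dict mapping a station id to its detected height.
    Returns a dict mapping each station id to 'head', 'wrist' or 'ankle':
    the highest station is taken as the head base station, the lowest as
    the ankle base station, and any station in between as the wrist one.
    """
    ranked = sorted(heights, key=heights.get)   # ascending by height
    roles = {ranked[0]: "ankle", ranked[-1]: "head"}
    for station in ranked[1:-1]:
        roles[station] = "wrist"
    return roles
```

For example, `identify_base_stations({"Bhead1": 1.6, "Bw1": 0.9, "Ba1": 0.1})` maps Bhead1 to "head", Bw1 to "wrist" and Ba1 to "ankle".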
  • the distance between two UWB positioning base stations may be a preset value, or may be a manually set value.
  • the distance between the head base station and the ankle base station is typically 1 m to 1.5 m (unit: meters), or greater, such as 1.7 m. Therefore, the distance between the head base station and the ankle base station can be set to a preset value according to experience.
  • the user can manually input the height value; if the input height value is denoted D_Base, this value is used as the distance between the head base station and the ankle base station.
  • when the positioning base station is located on a wearable device on the user's body, the relative position and relative distance of the positioning base station can be estimated through prior knowledge of the wearing position of the wearable device, such as the head, ankle or wrist.
  • the mobile phone Tm is positioned by using the head base station Bh and the ankle base station Ba as an example.
  • the ward carries the head base station Bh and the ankle base station Ba, and the guardian carries the mobile phone Tm.
  • the direction angles from the position (x, y) of the mobile phone Tm to the head base station Bh and the ankle base station Ba can be measured respectively: the direction angle from the position (x, y) of the mobile phone Tm to the head base station Bh is α, and the direction angle from the position (x, y) of the mobile phone Tm to the ankle base station Ba is β.
  • the coordinate values of the head base station Bh and the ankle base station Ba can be determined: the coordinate value of the head base station Bh is (x1, y1), and the coordinate value of the ankle base station Ba is (x2, y2).
  • the position coordinates (x, y) of the positioning tag, that is, of the mobile phone Tm, can then be calculated.
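Given the base-station coordinates (x1, y1) and (x2, y2) and the direction angles α and β measured at the tag, the tag position is the intersection of the two rays running from the base stations back toward the tag. A minimal 2D sketch of that calculation (the function name is an illustrative assumption, not the application's own code):

```python
import math

def locate_by_angles(bh, ba, alpha, beta):
    """Recover the tag position (x, y) from two direction angles.

    bh, ba: (x, y) coordinates of the head and ankle base stations.
    alpha, beta: direction angles (radians) of the vectors from the tag
    to Bh and to Ba respectively, so Bh - T lies along angle alpha.
    """
    (x1, y1), (x2, y2) = bh, ba
    dx, dy = x2 - x1, y2 - y1
    det = math.sin(alpha - beta)  # sin(a)cos(b) - cos(a)sin(b)
    if abs(det) < 1e-12:
        raise ValueError("rays are parallel: no unique position fix")
    # Solve Bh - s*(cos a, sin a) = Ba - t*(cos b, sin b) for s (Cramer's rule).
    s = (dx * math.sin(beta) - dy * math.cos(beta)) / det
    return (x1 - s * math.cos(alpha), y1 - s * math.sin(alpha))
```

For instance, with Bh = (0, 0), Ba = (3, 0) and a tag at (1, 1), the measured angles would be atan2(-1, -1) and atan2(-1, 2), and the function returns (1.0, 1.0) up to floating-point error.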
  • the ankle base station Ba may also be used as the origin of the coordinate system, and/or other connecting lines may be used as the X-axis and the Y-axis of the coordinate system. That is to say, the present application does not limit the manner of establishing the coordinate system.
  • the three points of the head base station Bh, the ankle base station Ba, and the mobile phone Tm can locate the spatial coordinates of the mobile phone Tm on the plane, and complete the positioning calculation.
  • the distance D_Base between the head base station Bh and the ankle base station Ba is known, and the coordinate values of the head base station Bh and the ankle base station Ba are determined by establishing a coordinate system: the coordinate value of the head base station Bh is (x1, y1), and the coordinate value of the ankle base station Ba is (x2, y2).
  • the distance from the head base station Bh to the mobile phone Tm can be measured by the UWB ranging method, and the distance from the ankle base station Ba to the mobile phone Tm can be measured.
  • the coordinates (x, y) of the positioning tag, that is, of the location where the mobile phone Tm is located, are then calculated.
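The distance-based calculation above amounts to intersecting two circles centered on the base stations, with radii equal to the measured UWB ranges. A minimal 2D sketch under that assumption (names are illustrative; with only two base stations there are two mirror-image solutions, and the ambiguity must be resolved by other means, e.g. a third base station):

```python
import math

def locate_by_distances(bh, ba, d1, d2):
    """Return the two candidate tag positions from measured ranges.

    bh, ba: (x, y) coordinates of the head and ankle base stations.
    d1, d2: measured distances from the tag to Bh and Ba respectively.
    """
    (x1, y1), (x2, y2) = bh, ba
    d = math.hypot(x2 - x1, y2 - y1)        # inter-station distance
    a = (d1 ** 2 - d2 ** 2 + d ** 2) / (2 * d)
    h_sq = d1 ** 2 - a ** 2
    if h_sq < 0:
        raise ValueError("circles do not intersect: ranging error too large")
    h = math.sqrt(h_sq)
    # Foot of the perpendicular from the tag onto the Bh-Ba line.
    px, py = x1 + a * (x2 - x1) / d, y1 + a * (y2 - y1) / d
    ox, oy = -(y2 - y1) / d, (x2 - x1) / d  # unit vector perpendicular to Bh-Ba
    return (px + h * ox, py + h * oy), (px - h * ox, py - h * oy)
```

For example, with Bh = (0, 0), Ba = (0, -1.5) and ranges sqrt(1.25) and sqrt(2), the candidates are (1, -0.5) and (-1, -0.5).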
  • a self-calibration positioning system can be formed, which can quickly calibrate the position of the positioning base station, so as to locate the position of the positioning label.
  • the positioning engine deployed in the server performs a positioning calculation according to the original positioning data, and feeds back the positioning result to the mobile phone, and the guardian can view the positioning result through the mobile phone.
  • the positioning result is fed back to the electronic device deployed by the positioning base station, such as a wearable device such as a wristband or smart glasses, and the guardian can view the positioning result through the wearable device.
  • the positioning engine deployed in the mobile phone performs positioning calculation according to the original positioning data
  • the guardian can view the positioning result through the mobile phone
  • the mobile phone can feed the positioning results back to the electronic devices where the positioning base stations are deployed, such as wearable devices like wristbands or smart glasses, and the ward can view the positioning results through the wearable devices.
  • the electronic device where the positioning base station is deployed, such as a wearable device like a wristband or smart glasses, can use a pre-deployed positioning engine to perform the positioning calculation according to the original positioning data of the base stations and the mobile phone; the ward can view the positioning results through the wearable device, and/or the wearable device can feed the positioning results back to the mobile phone, through which the guardian can view them.
  • the relative positional relationship between the mobile phone and the base stations can be confirmed. That is to say, the relative positional relationship between the guardian and the ward can be determined through this positioning method. Therefore, the guardian can be guided by the positioning result to search for the ward, or the ward can search for the guardian.
  • the guardian's mobile phone or the ward's wearable device may display a radar map, on which the relative positional relationship between the guardian and the ward is mapped.
  • some finding instructions or navigation instructions may be added while the radar chart is displayed. For example, on the basis of displaying the radar map, the surrounding map is displayed superimposed.
  • the user is instructed to search for the person.
  • the guardian triggers the family search mode on the mobile phone, or the ward triggers the family search mode on the wearable device, and the guardian's mobile phone and the ward's wearable device perform positioning operations.
  • the wearable device can display a radar chart in response to the user operation for starting the family search.
  • the guardian initiates the family search on the mobile phone, and the mobile phone may display a radar chart in response to the user operation for initiating the family search.
  • Fig. 8 is a schematic diagram of the positioning result display interface of a guardian's mobile phone.
  • the center of the radar chart shown in Fig. 8 represents the guardian's position, and a human-shaped icon is displayed on the radar chart to identify the position of the ward.
  • the distance between the human-shaped icon and the center indicates the distance between the ward and the guardian.
  • in this example there are two wards; a base station is deployed on each of the two wards, and the positions of both wards are displayed on the mobile phone at the same time.
  • a radar chart can be displayed on the wearable device of the ward, the center of the radar chart represents the position of the ward, and the human-shaped icon displayed on the radar chart can be used to identify the position of the guardian.
  • in addition to displaying the positions of the guardian and the ward, the radar map can also display a navigation indicator, or the electronic device can play navigation voice prompts, to help the guardian or the ward find family members.
  • the positioning results of the guardian and the ward are intuitively mapped onto the radar chart, so that the guardian can quickly find the ward, and the ward can quickly find the guardian, through the indications on the radar chart.
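As an illustration of how a relative positioning result can be mapped onto such a radar chart, the sketch below converts a computed distance and bearing into chart offsets from the center (the viewer's own position). The chart radius, the maximum display range, and the function name `radar_point` are assumptions for this sketch, not part of the embodiment.

```python
import math

def radar_point(distance_m, bearing_rad, chart_radius_px=100, max_range_m=50):
    """Map a (distance, bearing) positioning result to radar-chart
    pixel offsets from the chart centre (the viewer's own position).

    Targets farther than max_range_m are clamped to the chart rim.
    Bearing 0 points "up" on the chart; x grows right, y grows up.
    """
    r = min(distance_m, max_range_m) / max_range_m * chart_radius_px
    return r * math.sin(bearing_rad), r * math.cos(bearing_rad)
```

Because the positioning result is recomputed periodically, the display layer only needs to re-evaluate this mapping each cycle to keep the icon positions current.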
  • the positioning base station and the positioning tag are also moving. Therefore, the positioning calculation can be performed at preset intervals, so as to realize real-time positioning and improve positioning accuracy.
  • when the positioning result between the positioning base station and the positioning tag changes in real time, the change is reflected on the radar chart in real time.
  • because the distance D_Base between the positioning base stations (that is, the distance between the head base station Bh and the ankle base station Ba) is an estimated value, or a value manually input by the user, it carries an error, so the positioning accuracy may be affected.
  • since the positioning base stations are worn on the user's body, the distance between the positioning base stations changes continuously as the user moves, thereby affecting the final positioning accuracy.
  • the positioning base station includes two head positioning base stations, namely B_head1 and B_head2, which are respectively mounted on the two temples of the glasses.
  • the positioning base station further includes a foot positioning base station Ba, and/or a hand bracelet positioning base station Bw.
  • the positioning system may also include more devices, such as servers, switches, and the like.
  • the distance D_BaseGlass between B_head1 and B_head2 can be set to an exact value; in this example, D_BaseGlass is set to 23.5 cm. It should be understood that D_BaseGlass can be set according to the actual design parameters of the glasses; with different design parameters, the value of D_BaseGlass may differ. The embodiments of the present application do not specifically limit the value of D_BaseGlass.
  • in some actual usage scenarios, after the glasses are initialized, they can automatically set the value of D_BaseGlass according to their own design parameters; in other actual usage scenarios, the user of the glasses can set the value of D_BaseGlass according to the design parameters. That is to say, the value of D_BaseGlass can be obtained automatically by the device or entered by the user.
  • the spatial coordinates of the positioning tag can be accurately calculated based on the two head base stations by using the positioning method of the first application scenario, that is, the AOA-based positioning method.
  • the accurate three-dimensional coordinates of the other positioning base stations, such as the foot positioning base station Ba and/or the wristband positioning base station Bw, can be calculated based on the two head base stations according to the positioning method of the first application scenario, that is, the AOA-based positioning method; this determines the exact value D_base1 of the distance between the foot positioning base station Ba and the head positioning base station, and the exact value D_base2 of the distance between the wristband positioning base station Bw and the head positioning base station.
  • the AOA-based positioning method can then accurately calculate the spatial coordinates of the positioning tag based on one head base station and the foot positioning base station, or based on one head base station and the wristband positioning base station.
  • the second application scenario may adopt the same positioning method as the first application scenario, which will not be repeated here.
  • the user's pose can also be determined according to the positioning result.
  • a motion sensor, such as a gyroscope, configured in the glasses can determine the current pose of the glasses after being pre-calibrated; the pose of the glasses is either horizontal or vertical.
  • when the pose of the glasses is horizontal, it can be determined that the user wearing the glasses holds the head upright; when the pose of the glasses is vertical, it can be determined that the user's head is tilted or the user is lying down, in which case the user is likely to be at risk.
  • when the line connecting the two head positioning base stations is vertical, the pose of the glasses can be determined to be vertical, and it can be determined that the head of the user wearing the glasses is tilted or that the user is lying down.
  • when the line connecting the two head positioning base stations is horizontal, the pose of the glasses can be determined to be horizontal, and it can be determined that the user wearing the glasses has an upright head posture.
  • the current posture information of the user may also be determined according to the two accurate distance values D_base1 and D_base2.
  • the accurate three-dimensional coordinates of the other positioning base stations, such as the foot positioning base station Ba and the wristband positioning base station Bw, determine the accurate value D_base1 of the distance between the foot positioning base station Ba and the head positioning base station, and the accurate value D_base2 of the distance between the wristband positioning base station Bw and the head positioning base station.
  • when D_base1 - D_base2 ≤ 50 cm (unit: centimeters), it can be determined that the user's posture is a sitting or collapsed posture.
  • when D_base1 - D_base2 > 80 cm, it can be determined that the user's posture is a standing or stretched posture.
  • the values 50 cm and 80 cm are only exemplary, empirical values and should not be interpreted as specific limitations of the present application; other values may be adopted in practical situations.
  • the pose of the glasses can be determined by a motion sensor or by the direction of the line connecting the two head positioning base stations, and at the same time the two accurate distance values D_base1 and D_base2 obtained by positioning can be combined with it to obtain more accurate current posture information of the user (the user on whom the positioning base stations are deployed).
  • when the current pose of the glasses is horizontal and D_base1 - D_base2 ≤ 50 cm, it can be determined that the user's posture is a sitting posture.
  • when the current pose of the glasses is vertical and D_base1 - D_base2 ≤ 50 cm, it can be determined that the user's posture is a collapsed posture.
  • when the current pose of the glasses is horizontal and D_base1 - D_base2 > 80 cm, it can be determined that the user's posture is a standing posture.
  • when the current pose of the glasses is vertical and D_base1 - D_base2 > 80 cm, it can be determined that the user's posture is a lying posture.
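The four rules above can be collected into one decision table. The following sketch assumes distances in centimeters and uses the hypothetical function name `classify_posture`; the 50 cm and 80 cm thresholds are the empirical example values from the description, and the gap between them is left undecided because the description gives no rule for it.

```python
def classify_posture(glasses_pose, d_base1, d_base2):
    """Classify the user's posture.

    glasses_pose: "horizontal" or "vertical", from the motion sensor or
                  from the line connecting the two head base stations.
    d_base1: head-to-ankle base station distance (cm).
    d_base2: head-to-wrist base station distance (cm).
    The 50 cm / 80 cm thresholds are empirical example values.
    """
    diff = d_base1 - d_base2
    if diff <= 50:
        return "sitting" if glasses_pose == "horizontal" else "collapsed"
    if diff > 80:
        return "standing" if glasses_pose == "horizontal" else "lying"
    return "indeterminate"  # between thresholds: no rule is given
```

The "collapsed" and "lying" outcomes are the ones that could trigger an alert to the guardian, since the description treats them as risk indicators.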
  • the posture of the ward can be displayed on the display interface of the guardian's mobile phone.
  • the human figure in the radar chart represents the ward
  • the two human figures respectively display the different current postures of the two wards, one is a standing posture and the other is a sitting posture.
  • an embodiment of the present application provides a UWB positioning method, and the UWB positioning method can be executed by an electronic device.
  • the UWB positioning method may be executed by one or more of the electronic devices in the aforementioned application scenarios, such as the base station, the wearable device, or the mobile phone; the complete flow of the method may be executed by one electronic device, or executed jointly by several electronic devices.
  • the UWB positioning method may also be executed by a server or a cloud.
  • the UWB positioning method may also be executed by one or more of a base station, a wearable device, a mobile phone, a server, or a cloud.
  • the UWB positioning method includes steps S1110 to S1120.
  • S1110 Acquire distance information of at least two target base stations deployed on the first user.
  • each positioning base station includes a UWB module.
  • the positioning base station may be independently deployed on the first user, or may be deployed or built in a wearable device of the first user.
  • At least two positioning base stations may be determined as target base stations for locating the tag or the second user.
  • the tag includes a UWB module, and the tag can be deployed independently on the second user, or deployed in or built into an electronic device of the second user, such as a mobile phone or a wearable device; the second user may be, for example, a guardian or a protected person. That is, the target base stations are at least two positioning base stations among the multiple positioning base stations deployed on the first user, and there may be more than two target base stations. When only two positioning base stations are deployed on the first user, both of them serve as target base stations.
  • the system or the user may randomly select at least two target base stations, or the target base stations may be determined according to the preset system settings.
  • for example, the two head base stations on the two temples can be determined as the target base stations; or two positioning base stations, namely a head base station on one of the temples and a wrist base station, can be determined as the target base stations; or three positioning base stations, namely a head base station on one of the temples, a wrist base station, and an ankle base station, can be determined as the target base stations.
  • in some practical situations, the UWB positioning method provided by the embodiments of the present application may be started upon detecting that the first user has enabled the person-seeking mode on any base station or on an electronic device in which any base station is deployed; in other practical situations, the steps of the UWB positioning method may be started upon detecting that the second user has enabled the person-seeking mode on the tag or on an electronic device in which the tag is deployed.
  • each positioning base station may determine its own deployment location information, such as head, wrist, or ankle, according to its own attribute information. Further, according to the deployment location information, the distance information between two positioning base stations can be determined in real time from big-data statistics or from a system value set according to experience.
  • when there are two target base stations, such as a head base station and an ankle base station, after it is determined that the deployment positions of the two target base stations are the head and the ankle respectively, the distance between the two target base stations is set according to a value stored by the system, for example 1.5 m.
  • when there are three target base stations, such as a head base station, a wrist base station, and an ankle base station, after determining that the deployment positions of the three target base stations are the head, the wrist, and the ankle respectively, the distance between the head base station and the wrist base station, the distance between the head base station and the ankle base station, and the distance between the wrist base station and the ankle base station can be set according to values stored by the system.
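A system-stored default table of this kind can be sketched as a lookup keyed by the unordered pair of deployment positions. Only the 1.5 m head-to-ankle value comes from the description; the other entries and the function name `default_distance` are placeholder assumptions for illustration.

```python
# Empirical default distances (metres) between deployment positions.
DEFAULT_DISTANCES = {
    frozenset({"head", "ankle"}): 1.5,   # example value from the description
    frozenset({"head", "wrist"}): 1.0,   # assumed placeholder
    frozenset({"wrist", "ankle"}): 0.9,  # assumed placeholder
}

def default_distance(pos_a, pos_b):
    """Return the preset distance between two base stations, given their
    deployment positions (e.g. "head", "wrist", "ankle"). The frozenset
    key makes the lookup order-independent."""
    return DEFAULT_DISTANCES[frozenset({pos_a, pos_b})]
```

A user-entered value (see below) would simply override the corresponding table entry.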
  • the user manually inputs the distance information between the target base stations, that is, the value input by the user is obtained as the distance between the two target base stations.
  • the user may be the first user or other users.
  • the user may be prompted to input the distance between the two target base stations.
  • if the value input by the user is, for example, 1.7 m, that value is used as the distance between the two target base stations.
  • when there are three target base stations, the user can input multiple values rather than only one, so that the distances of any two groups, or of all three groups, of target base stations among the three can be determined according to the values input by the user. It should be noted that any two target base stations are referred to herein as a group of target base stations.
  • S1120 Acquire a direction angle from each of the at least two target base stations to a tag of a second user, and obtain a positioning position of the tag based on the direction angle and the distance information.
  • the direction angle from each target base station to the tag can be measured, and the positioning position of the tag is then obtained based on the direction angles and the distance information between the target base stations.
  • positioning based on two target base stations is used as an example for description.
  • Obtaining the positioning position of the tag based on the direction angle and the distance information includes: taking a target base station as the origin of the coordinate system, and taking the direction from the target base station to another target base station as the positive direction of the X-axis or the Y-axis , establish a coordinate system; determine the coordinates of the two target base stations in the coordinate system according to the distance information; finally determine the coordinates of the tag in the coordinate system according to the coordinates of the two target base stations and the direction angle.
  • any target base station worn or carried by the ward (or the protected person) can be used as the origin of the coordinate system.
  • the electronic device (or tag) worn or carried by the guardian can also be used as the origin of the coordinate system: take the direction from the electronic device (or tag) to any target base station worn or carried by the ward (or the protected person) as the positive direction of the X-axis or Y-axis, and establish a coordinate system; determine the coordinates of the two target base stations in the coordinate system according to the distance information; finally, determine the coordinates of the tag in the coordinate system according to the coordinates of the two target base stations and the direction angles.
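The coordinate construction described above, with one target base station at the origin and the direction to the other as the positive X-axis, can be sketched as follows. This is a minimal two-dimensional illustration, not the claimed implementation; the function name `locate_tag` and the convention that direction angles are measured counterclockwise from the positive X-axis are assumptions.

```python
import math

def locate_tag(d_base, angle_a, angle_b):
    """Locate a tag from two base stations A and B.

    A is the coordinate origin; the direction from A to B is the
    positive X-axis, so B sits at (d_base, 0).  angle_a / angle_b are
    the direction angles (radians, from the +X axis) from A and B to
    the tag, as obtained by UWB AOA measurement.  The tag lies at the
    intersection of the two bearing rays.
    """
    # Ray from A: y = tan(angle_a) * x
    # Ray from B: y = tan(angle_b) * (x - d_base)
    ta, tb = math.tan(angle_a), math.tan(angle_b)
    if math.isclose(ta, tb):
        raise ValueError("rays are parallel; tag position is ambiguous")
    x = d_base * tb / (tb - ta)
    y = ta * x
    return x, y
```

For example, with the base stations 2 m apart and the tag seen at 45° from A and 135° from B, the intersection works out to the point (1, 1).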
  • the relative positions of the first user (or the positioning base stations) and the second user (or the tag) will change in the usage scenarios of the present application. Therefore, in the embodiments of the present application, the direction angles from at least two target base stations set on the first user to the tag are acquired.
  • the UWB positioning method shown in this embodiment is further optimized on the basis of the embodiment shown in FIG. 11 .
  • the number of target base stations is three, and a process of generating the user's pose is added.
  • the UWB positioning method includes steps S1210 to S1230. It should be understood that the steps in the embodiment shown in FIG. 12 are the same as those in the embodiment in FIG. 11 , which will not be repeated here, and refer to the foregoing.
  • S1210 Acquire distance information of at least three target base stations deployed on the first user.
  • the at least three target base stations include a head base station, a hand (eg wrist) base station and a foot (eg ankle) base station.
  • the distance information between the head base station and the wrist base station, and the distance information between the head base station and the ankle base station can be obtained.
  • the first user wears glasses
  • a UWB base station is deployed on each of the two temples of the glasses
  • a UWB base station is deployed on each of the wrist and ankle of the first user.
  • the UWB base station deployed on the wrist is the wrist base station, and the UWB base station deployed on the ankle is the ankle base station.
  • the positioning method calculates the accurate three-dimensional coordinates of the ankle base station Ba and the wrist base station Bw, so that the distance information from any head base station to the wrist base station Bw and to the ankle base station Ba can be determined respectively.
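Once the accurate three-dimensional coordinates of the ankle base station Ba and the wrist base station Bw are known relative to a head base station, the two distance values reduce to Euclidean norms. In the sketch below the coordinates are made-up example values for illustration only:

```python
import math

def euclidean(p, q):
    """Straight-line distance between two 3-D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

head = (0.0, 0.0, 0.0)      # head base station taken as the reference origin
ankle = (0.1, -0.2, -1.5)   # assumed example coordinates (metres)
wrist = (0.3, -0.1, -0.9)   # assumed example coordinates (metres)

d_base1 = euclidean(head, ankle)  # head-to-ankle distance
d_base2 = euclidean(head, wrist)  # head-to-wrist distance
```

These computed values replace the estimated or user-entered D_base distances, which is what removes the error source noted earlier.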
  • S1220 Acquire a direction angle from each of the at least two target base stations to a tag of a second user, and obtain a positioning position of the tag based on the direction angle and the distance information.
  • S1230 Determine the pose of the first user according to the distance information of the at least three target base stations.
  • in step S1210, the distance information of at least three target base stations is obtained; specifically, the distance information D_base2 between the head base station and the wrist base station, and the distance information D_base1 between the head base station and the ankle base station, can be obtained.
  • in step S1230, the current posture information of the first user can be determined according to the two accurate distance values D_base1 and D_base2.
  • an embodiment of the present application further provides a UWB positioning apparatus.
  • Each module included in the UWB positioning apparatus can correspondingly implement each step of the UWB positioning method.
  • the electronic device includes corresponding hardware and/or software modules for executing each function.
  • the present application can be implemented in hardware or a combination of hardware and computer software. Whether a function is performed by hardware or computer software driving hardware depends on the specific application and design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functionality for each particular application in conjunction with the embodiments, but such implementations should not be considered beyond the scope of this application.
  • Embodiments of the present application further provide an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, when the processor executes the computer program, the The electronic device implements the steps in each of the foregoing method embodiments.
  • the electronic device may include a wearable device, a mobile phone, or a server (including a cloud server, an independent server, a server cluster, etc.) and the like.
  • Embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the steps in the foregoing method embodiments can be implemented.
  • the embodiments of the present application provide a computer program product, when the computer program product runs on one or more electronic devices, the one or more electronic devices can implement the steps in the foregoing method embodiments.
  • the integrated modules/units if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium.
  • the present application realizes all or part of the processes in the methods of the above embodiments, which can be completed by instructing the relevant hardware through a computer program, and the computer program can be stored in a computer-readable storage medium.
  • the computer program includes computer program code
  • the computer program code may be in the form of source code, object code, executable file or some intermediate form, and the like.
  • the computer-readable medium may include at least: any entity or apparatus capable of carrying the computer program code to the photographing device/electronic device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, or a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk, or an optical disc.
  • in some jurisdictions, according to legislation and patent practice, computer-readable media may not include electrical carrier signals and telecommunication signals.
  • the units described as separate components may or may not be physically separated, and components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.

Abstract

The present invention relates to the technical field of communications, and provides an ultra-wideband positioning method and system. The ultra-wideband positioning method comprises: acquiring distance information of at least two target base stations deployed on a first user, each of the at least two target base stations comprising an ultra-wideband module; acquiring respective direction angles from the at least two target base stations to a tag of a second user, and obtaining a positioning position of the tag based on the direction angles and the distance information, the tag comprising an ultra-wideband module. According to embodiments of the present invention, the accuracy of a positioning result can be improved.
PCT/CN2021/140535 2021-02-09 2021-12-22 Ultra-wideband positioning method and system WO2022170863A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110174884.2 2021-02-09
CN202110174884.2A CN113038362B (zh) 2021-02-09 2021-02-09 Ultra-wideband positioning method and system

Publications (1)

Publication Number Publication Date
WO2022170863A1 true WO2022170863A1 (fr) 2022-08-18

Family

ID=76460742

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/140535 WO2022170863A1 (fr) 2021-02-09 2021-12-22 Procédé et système de positionnement à bande ultralarge

Country Status (2)

Country Link
CN (1) CN113038362B (fr)
WO (1) WO2022170863A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115497189A (zh) * 2022-09-16 2022-12-20 福建中锐网络股份有限公司 Reservoir inspection system of AR glasses based on 5G and UWB
CN116193581A (zh) * 2023-05-04 2023-05-30 广东工业大学 Indoor unmanned aerial vehicle hybrid positioning method and system based on set-membership filtering
CN116546620A (zh) * 2023-07-07 2023-08-04 深圳派特科技有限公司 Method and system for dual-base-station tag positioning based on UWB

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113038362B (zh) 2021-02-09 2022-10-11 华为技术有限公司 Ultra-wideband positioning method and system
CN113390433A (zh) * 2021-07-20 2021-09-14 上海擎朗智能科技有限公司 Robot positioning method and apparatus, robot, and storage medium
CN113570761A (zh) * 2021-08-06 2021-10-29 广州小鹏汽车科技有限公司 Vehicle control method and apparatus, in-vehicle terminal, and storage medium
WO2023065110A1 (fr) * 2021-10-19 2023-04-27 深圳市优必选科技股份有限公司 Base station calibration method, computing device, and storage medium
CN114071354B (zh) * 2021-11-05 2024-01-30 国能神东煤炭集团有限责任公司 Topological-map-based multi-modal UWB positioning method and system
CN113820659B (zh) * 2021-11-22 2022-08-26 嘉兴温芯智能科技有限公司 Wireless positioning method, energy conversion apparatus, wireless positioning system, and smart garment
CN114562993A (zh) * 2022-02-28 2022-05-31 联想(北京)有限公司 Trajectory processing method and apparatus, and electronic device
CN115002663A (zh) * 2022-06-09 2022-09-02 长春理工大学 UWB-based method for determining the static heading direction of an intelligent vehicle

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWM474921U (zh) * 2013-10-04 2014-03-21 Univ Taipei Chengshih Science Glasses positioning device
CN110913466A (zh) * 2019-11-28 2020-03-24 郑州芯力波通信息技术有限公司 Ultra-wideband (UWB) positioning system and method based on multi-communication fusion
CN111983559A (zh) * 2020-08-14 2020-11-24 Oppo广东移动通信有限公司 Indoor positioning and navigation method and apparatus
CN113038362A (zh) * 2021-02-09 2021-06-25 华为技术有限公司 Ultra-wideband positioning method and system

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103207383A (zh) * 2013-05-16 2013-07-17 沈阳化工大学 Method for two-dimensional wireless positioning of a stationary node based on a single mobile node
US10140637B2 (en) * 2013-11-26 2018-11-27 Paypal, Inc. Customer selection determination system
CN105849579B (zh) * 2014-10-08 2018-03-09 华为技术有限公司 Target device positioning method and mobile terminal
CN105631390B (zh) * 2014-10-28 2021-04-27 佛山市顺德区美的电热电器制造有限公司 Method and system for spatial finger positioning
CN104457736A (zh) * 2014-11-03 2015-03-25 深圳市邦彦信息技术有限公司 Method and apparatus for acquiring target position information
CN107105498B (zh) * 2016-02-22 2020-07-07 华为技术有限公司 Positioning method and apparatus
CN110018508A (zh) * 2018-01-10 2019-07-16 西安中兴新软件有限责任公司 Positioning method and apparatus
US10499194B1 (en) * 2018-07-30 2019-12-03 Motorola Mobility Llc Location correlation in a region based on signal strength indications
CN109041213A (zh) * 2018-08-16 2018-12-18 佛山科学技术学院 AP indoor adaptive positioning correction method and apparatus
CN109541541B (zh) * 2018-12-24 2022-11-08 广东理致技术有限公司 Indoor triangulation positioning accuracy correction method and apparatus
CN110673092A (zh) * 2019-09-10 2020-01-10 清研讯科(北京)科技有限公司 Ultra-wideband-based time-division positioning method, apparatus, and system
CN111405508A (zh) * 2020-02-19 2020-07-10 华为技术有限公司 Wearable device positioning method and wearable device


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115497189A (zh) * 2022-09-16 2022-12-20 福建中锐网络股份有限公司 Reservoir inspection system of AR glasses based on 5G and UWB
CN115497189B (zh) * 2022-09-16 2023-11-07 福建中锐网络股份有限公司 Reservoir inspection system of AR glasses based on 5G and UWB
CN116193581A (zh) * 2023-05-04 2023-05-30 广东工业大学 Indoor unmanned aerial vehicle hybrid positioning method and system based on set-membership filtering
CN116193581B (zh) * 2023-05-04 2023-08-04 广东工业大学 Indoor unmanned aerial vehicle hybrid positioning method and system based on set-membership filtering
CN116546620A (zh) * 2023-07-07 2023-08-04 深圳派特科技有限公司 Method and system for dual-base-station tag positioning based on UWB
CN116546620B (zh) * 2023-07-07 2023-09-15 深圳派特科技有限公司 Method and system for dual-base-station tag positioning based on UWB

Also Published As

Publication number Publication date
CN113038362B (zh) 2022-10-11
CN113038362A (zh) 2021-06-25

Similar Documents

Publication Publication Date Title
WO2022170863A1 (fr) Ultra-wideband positioning method and system
EP3961358B1 (fr) False-touch prevention method for curved screen, and electronic device
CN110458902B (zh) 3D illumination estimation method and electronic device
CN112637758B (zh) Device positioning method and related device
WO2021213165A1 (fr) Multi-source data processing method, electronic device, and computer-readable storage medium
WO2021023046A1 (fr) Electronic device control method and electronic device
CN110798568A (zh) Display control method for electronic device with folding screen, and electronic device
WO2022116930A1 (fr) Content sharing method, electronic device, and storage medium
WO2022100685A1 (fr) Drawing command processing method and related device
WO2022027972A1 (fr) Device searching method and electronic device
CN112686981A (zh) Picture rendering method and apparatus, electronic device, and storage medium
CN111835907A (zh) Method, device, and system for transferring a service across electronic devices
WO2023273476A1 (fr) Device detection method and electronic device
CN114257920B (zh) Audio playback method and system, and electronic device
CN114880251A (zh) Access method and access apparatus for storage unit, and terminal device
WO2022048453A1 (fr) Unlocking method and electronic device
WO2021175097A1 (fr) Non-line-of-sight object imaging method, and electronic device
CN112099741B (zh) Display screen position identification method, electronic device, and computer-readable storage medium
CN113518189B (zh) Photographing method and system, electronic device, and storage medium
WO2022042275A1 (fr) Distance measurement method and apparatus, electronic device, and readable storage medium
WO2022037575A1 (fr) Low-power-consumption positioning method and related apparatus
CN113436635A (zh) Self-calibration method and apparatus for distributed microphone array, and electronic device
WO2022042460A1 (fr) Device connection method and electronic device
WO2023207862A1 (fr) Method and apparatus for determining head posture
WO2023030067A1 (fr) Remote control method, remote control device, and controlled device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21925508

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21925508

Country of ref document: EP

Kind code of ref document: A1