WO2020187082A1 - Method for reporting location information by a device, and related apparatus - Google Patents
Method for reporting location information by a device, and related apparatus
- Publication number
- WO2020187082A1 (PCT/CN2020/078494)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- scene
- location information
- reporting period
- cloud server
- reporting
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/025—Services making use of location information using location based information parameters
- H04W4/027—Services making use of location information using location based information parameters using movement velocity, acceleration information
- H04W4/029—Location-based management or tracking services
Definitions
- the type of the object tracked by the first device is set by default by the first device.
- the first device may pre-store a fixed reporting period, and the object type corresponding to the fixed reporting period may be regarded as the tracking object type set by the first device by default.
- in a scene where the tracked object moves faster, the first device can use a shorter reporting period to report location information. That is, if the movement speed of the tracked object in the first scene is greater than that in the second scene, the first reporting period is shorter than the second reporting period.
- the reporting period of the object in different scenarios can be set by the user according to requirements.
- the first device may also adjust the reporting period according to the instruction.
- the process of the first device adjusting the reporting period according to the instruction may include the following steps: the second device sends a first instruction to the cloud server according to the input user operation, where the first instruction is used to request the cloud server to notify the first device to report location information as soon as possible;
- the cloud server sends a second instruction to the first device based on the first instruction, and the second instruction is used to instruct the first device to report location information as soon as possible;
- upon receiving the second instruction sent by the cloud server, the first device shortens the reporting period and reports location information at a higher frequency. In this way the user actively intervenes in the reporting period of the first device; in some emergency scenarios, the user can quickly learn the current location of the tracked object.
- before the first device obtains the scene data, it may use an initial reporting period to report location information to the cloud server; after acquiring the scene data, the first device may report location information to the cloud server according to the reporting period corresponding to the determined scene.
- the initial reporting period may be a reporting period corresponding to the type of the currently tracked object.
- the first device reports location information to the cloud server through a cellular network.
- the cellular network is a narrowband internet of things (NB-IoT)
- the first device activates the non-power saving mode when reporting location information, and activates the power saving mode (PSM) when not reporting location information.
- the device is used to track an object, and the distance between the device and the object is less than a first value; the scene data reflects the geographic location or environment where the object is located;
- the first reporting period is used to report the location information to the cloud server; the location information is the real-time location information of the device.
- the reporting period refers to the time interval between two consecutive reports of location information by the device.
- allowing the user to set the type of object tracked by the device enables the device to track different objects according to the user's needs, giving it a wider application range, more practicality, better alignment with the user's usage habits, and a better user experience.
- the reporting period of the object in different scenarios can be set by the user according to requirements.
- the device can independently adjust the reporting period according to the current scene, so that the device can adapt to the current scene and report location information according to actual needs.
- FIG. 3 is a schematic flowchart of a method for establishing a subordination relationship between a first device and a user according to an embodiment of the present application
- FIG. 7 is a schematic diagram of the functions provided by each device in the communication system of an embodiment of the present application.
- the embodiment of the present application provides a method for a device to report location information.
- the device can independently adjust the reporting period according to the current scenario, so that the device can adapt to the current scenario and report location information according to actual needs.
- the reporting period refers to the time interval between two consecutive reports of location information by the device.
- the first device 100 is a terminal having a positioning function and a communication function based on a cellular network.
- the first device 100 having a positioning function means that the first device 100 can obtain current location information in one or more of the following ways, including: global navigation satellite system (GNSS) positioning, Wi-Fi positioning, base station positioning, and positioning using data measured by configured sensors (such as acceleration sensors, air pressure sensors, etc.).
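The positioning sources above can be combined in practice. The sketch below (Python is used only for exposition; the provider functions, their order, and their return values are assumptions, not part of the disclosure) shows one plausible fallback from GNSS to Wi-Fi to base-station positioning:

```python
# Illustrative sketch only: provider names and fallback order are assumptions.
from typing import Callable, List, Optional, Tuple

Position = Tuple[float, float]  # (latitude, longitude)

def acquire_position(providers: List[Callable[[], Optional[Position]]]) -> Optional[Position]:
    """Try each positioning source in turn and return the first fix obtained."""
    for provider in providers:
        fix = provider()
        if fix is not None:
            return fix
    return None

# Hypothetical providers standing in for GNSS, Wi-Fi, and base-station positioning.
def gnss_fix() -> Optional[Position]: return None      # e.g. no satellite view indoors
def wifi_fix() -> Optional[Position]: return (39.9, 116.4)
def cell_fix() -> Optional[Position]: return (39.8, 116.3)

print(acquire_position([gnss_fix, wifi_fix, cell_fix]))  # -> (39.9, 116.4)
```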
- the first device 100 having a communication function based on a cellular network means that the first device 100 can be connected to the core network through the cellular network to communicate with other devices.
- the first device 100 may be connected to the cloud server 300 based on a cellular network, so as to communicate with the cloud server 300.
- the first device 100 can be used to track various objects, such as people (such as children or the elderly), pets, important items (such as luggage or medical equipment), and objects in industries such as agriculture, animal husbandry, logistics, and cold chain.
- the distance between the first device 100 and the object is less than the threshold.
- the threshold can be preset, such as 10 centimeters (cm), 5 cm, and so on. That is, in this embodiment of the present application, the location information of the first device 100 can be regarded as the location information of the object tracked by the first device 100.
- the first device 100 may be placed in an accessory, which is convenient for the user to use; the accessory is not limited to the bracelet in which the first device 100 is placed as shown in FIG.
- the second device 200 may be installed with an application (APP) for managing the first device 100, or the second device 200 may access a world wide web (web) page for managing the first device 100.
- the second device 200 can control the first device that has an affiliation with the associated user of the second device 200 through the application or web page.
- the associated user of the second device 200 refers to a user corresponding to a user account logged in to the second device 200 for managing the application or web page of the first device.
- the user account logged in to the application or web page of the second device 200 may be referred to as the associated user account of the second device 200.
- the structure of the first device 100 provided by the embodiment of the present application is described below.
- the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the first device 100.
- the first device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components.
- the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
- the first device 100 may also be configured with a display screen.
- the display screen may be used to display a two-dimensional code indicating the identification of the first device 100.
- the processor 110 may include one or more processing units.
- the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
- the different processing units may be independent devices or integrated in one or more processors.
- the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching and executing instructions.
- a memory may also be provided in the processor 110 to store instructions and data.
- the memory in the processor 110 is a cache memory.
- the memory can store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
- the memory 120 may be used to store one or more algorithms for identifying the scene where the first device 100 is located.
- the processor 110 may call the algorithm to analyze the scene data obtained by the sensors (such as the air pressure sensor 170B and the acceleration sensor 170C) and/or the wireless communication module 160, and identify the scene where the first device 100 is located. After the processor 110 recognizes the scene where the first device 100 is located, it may adjust the period of the first device 100 reporting location information according to the scene where the first device 100 is located.
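As a rough illustration of this flow, the sketch below models the processor calling a stored recognition algorithm on incoming scene data and adjusting the reporting period accordingly. The scene names, periods, and the stand-in lambda algorithm are illustrative assumptions, not the patented implementation:

```python
# A minimal sketch, assuming invented scene names and periods.
class ReportingPolicy:
    def __init__(self, algorithm, scene_periods_s: dict):
        self.algorithm = algorithm                   # stored scene-recognition algorithm
        self.scene_periods_s = scene_periods_s       # scene -> reporting period (seconds)
        self.period_s = 3600                         # initial reporting period

    def on_scene_data(self, scene_data: dict) -> None:
        scene = self.algorithm(scene_data)           # identify the current scene
        self.period_s = self.scene_periods_s[scene]  # adjust the reporting period

policy = ReportingPolicy(lambda d: "dog_walking" if d["speed_kmh"] > 4 else "home",
                         {"home": 3600, "dog_walking": 300})
policy.on_scene_data({"speed_kmh": 6})
print(policy.period_s)  # -> 300: report every 5 minutes in the dog-walking scene
```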
- the charging management module 130 is used to receive charging input from the charger.
- the charger can be a wireless charger or a wired charger.
- the charging management module 130 may receive the charging input of the wired charger through the USB interface.
- the charging management module 130 may receive the wireless charging input through the wireless charging coil of the first device 100. While the charging management module 130 charges the battery 142, it can also supply power to the electronic device through the power management module 141.
- the power management module 141 is used to connect the battery 142, the charging management module 130 and the processor 110.
- the power management module 141 receives input from the battery 142 and/or the charging management module 130, and supplies power to the processor 110, the memory 120, and the wireless communication module 160.
- the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
- the power management module 141 may also be provided in the processor 110.
- the power management module 141 and the charging management module 130 may also be provided in the same device.
- the wireless communication function of the first device 100 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
- the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
- Each antenna in the first device 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
- antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
- the antenna can be used in combination with a tuning switch.
- the modem processor may include a modulator and a demodulator.
- the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
- the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
- the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
- the air pressure sensor 170B is used to measure air pressure.
- the first device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 170B to assist positioning and navigation.
- the acceleration sensor 170C can detect the magnitude of the acceleration of the first device 100 in various directions (generally three axes). When the first device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and used in applications such as horizontal and vertical screen switching, pedometers and so on.
- the distance sensor 170D is used to measure distance.
- the first device 100 can measure the distance by infrared or laser.
- an embodiment of the present application provides a method for a device to report location information.
- the first device can independently adjust the reporting period according to the current scene, so that the first device can adapt to the current scene and report location information according to actual needs, which can not only meet the needs of tracking objects but also save power consumption.
- the location information of the first device may include one or more of the following: latitude and longitude coordinates, altitude, or geographic location name.
- the first device can obtain location information through any of the following methods or a combination of the following methods:
- the first device obtains the longitude and latitude coordinates or the geographic location name through a global navigation satellite system, such as GPS, GLONASS, BDS, QZSS, or SBAS.
- some sensors of the first device 100 can work continuously for a long time to collect data, so as to obtain the location information of the tracked object.
- the wireless communication module 160 of the first device 100 can continuously provide wireless communication solutions such as global navigation satellite systems (such as GPS), Wi-Fi, Bluetooth, infrared, etc. to obtain the location information of the first device 100.
- the above wireless communication solution may also be provided periodically (for example, in a fixed period) or as needed (for example, when or before the location information needs to be reported) to obtain the location information of the first device 100.
- the view user information control 101 can be used to receive an input user operation (for example, a click operation).
- in response, the second device 200 can display user information on the user interface 10 (for example, the avatar and name of the user account logged in to the application for managing the first device), controls for logging out or switching login accounts, and so on.
- the add control 103 may be used to receive an input user operation (for example, a click operation), and in response to the user operation, the second device 200 may display a small window including one or more controls on the user interface 10.
- the second device 200 may respond to a user operation (for example, a click operation) detected on the control in the small window, and provide scan QR code, view help feedback information, add friends or some other functions.
- the identifier of the first device 100 may be a service set identifier (SSID) of a service set constructed by the first device 100, and may be used to indicate the first device 100.
- the second device 200 obtains the identity of the first device 100, that is, the second device 200 finds that the first device 100 exists nearby.
- the second device 200 may execute any of the following methods to obtain the identity of the first device 100:
- the second device 200 scans a message sent by the nearby first device 100, and the message carries the identity of the first device 100.
- the first device 100 may send a message after being connected to a power source or after receiving a user operation (for example, the power button of a smart speaker receives a user's long press operation).
- the first device 100 may continuously send the message.
- the message sent by the first device 100 carries the identity of the first device 100.
- the message may be a broadcast packet sent by the first device 100 using Bluetooth technology.
- the second device 200 may perform a scanning operation on the Bluetooth channel in response to the user operation.
- the time period during which the second device 200 performs the scanning operation may be a preset time period, and the second device 200 may stop scanning after the preset time period.
- the second device 200 can scan the broadcast packet sent by the first device 100.
- 4b exemplarily shows the user interface 20 displayed by the second device 200 after it obtains the identification of the nearby first device 100 through the above-mentioned method (1), in response to the user operation received on the add device control 105.
- the user interface 20 may include: return to the upper level control 201, prompt information 202, and available device list 203.
- the return to upper level control 201 can be used to receive an input user operation (for example, a click operation), and the second device 200 can display the user interface 10 shown in 4a again in response to the user operation.
- the prompt information 202 is used to prompt the user that the second device 200 is currently scanning a message sent by a nearby first device.
- the available device list 203 is used to display the information of the nearby first device discovered by the second device 200, the information is carried in the message sent by the first device, and the information may be the name of the first device.
- the second device 200 has discovered two first devices whose names are "AAAAA" and "BBBBB" respectively.
- the name of each device in the available device list 203 can be used to receive user operations.
- the user can click "BBBBB" in the available device list 203, and in response to the click operation, the second device 200 can display the user interface 30 shown in 4c.
- the user interface 30 may include a return control 301, a prompt message 302, a temporarily unbound control 303, and an immediate binding control 304.
- the return to the upper level control 301 can be used to receive an input user operation (such as a click operation), and the second device 200 can display the user interface 20 shown in 4b again in response to the user operation.
- the prompt message 302 is used to prompt the user whether to bind the discovered first device.
- the temporarily unbound control 303 can be used to receive an input user operation (for example, a click operation), and the second device 200 can display the user interface 20 shown in 4b again in response to the user operation.
- the immediate binding control 304 can be used to receive an input user operation (such as a click operation).
- the second device 200 can trigger a process for the associated user of the second device 200 to establish a subordination relationship with the first device "BBBBB".
- the second device 200 scans the two-dimensional code provided by the nearby first device 100, and the two-dimensional code is used to indicate the identity of the first device 100.
- after receiving a user operation on the add device control 105 in the user interface 10, the second device 200 can turn on the camera, scan the QR code provided by the first device 100 through the camera, and obtain the identification of the first device 100 from the QR code.
- the embodiment of the present application can also obtain the identity of the first device 100 in other ways.
- the second device 200 can also receive the identity of the first device 100 input by the user.
- Step 4 The cloud server 300 allocates a device identifier to the first device 100, and sends the device identifier of the first device 100 to the second device 200.
- Step 5 The second device 200 sends the device identification to the first device 100.
- Step 6 The first device 100 sends a binding request to the cloud server 300.
- the binding request carries the device identifier assigned by the cloud server 300 to the first device 100, and is used to request the cloud server 300 to establish a subordination relationship between the associated user of the second device 200 and the first device 100.
- after the cloud server 300 binds the associated user account of the second device 200 with the device identifier of the first device 100, the user corresponding to the user account (i.e., the associated user of the second device 200) and the first device 100 successfully establish an affiliation.
- the associated user of the second device 200 can view the location information reported by the first device 100 through the management application or web page on the second device 200.
- Step 8 The cloud server 300 sends a notification message to the second device 200, where the notification message is used to notify the second device 200 that the associated user and the first device 100 are successfully bound.
- the second device 200 may prompt the user to successfully establish an affiliation with the first device 100.
- the prompt method may include, but is not limited to: displaying prompt information on the display screen, voice broadcast prompt information, or vibration.
- step 3 to step 6 can also be replaced with the following step 3a to step 6a:
- Step 3a The first device 100 requests the cloud server 300 to allocate a device ID (device ID) to the first device 100.
- the first device 100 may request the cloud server 300 to allocate a device ID to the first device 100 through a cellular network (for example, 2G/3G/4G/5G, NB-IoT, etc.).
- a cellular network for example, 2G/3G/4G/5G, NB-IoT, etc.
- Step 4a The cloud server 300 allocates a device ID to the first device 100, and sends the device ID of the first device 100 to the first device 100.
- Step 5a The first device 100 sends the device identifier to the second device 200.
- Step 6a The second device 200 sends a binding request to the cloud server 300.
- the binding request carries the device identifier assigned by the cloud server 300 to the first device 100 and the associated user account of the second device 200.
- the binding request is used to request the cloud server 300 to establish an affiliation relationship between the associated user of the second device 200 and the first device 100.
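The two binding variants (steps 3–6 and steps 3a–6a) differ only in which device relays the identifier and which device sends the binding request. A schematic sketch of the 3a–6a variant follows; the class shape, method names, and account string are invented for illustration:

```python
# Schematic sketch of binding steps 3a-6a; all names and message shapes are assumptions.
class CloudServer:
    def __init__(self):
        self.next_id, self.bindings = 1, {}

    def allocate_device_id(self) -> int:            # steps 3a/4a: allocate and return an ID
        self.next_id += 1
        return self.next_id - 1

    def bind(self, device_id: int, user_account: str) -> None:  # step 6a: binding request
        self.bindings[device_id] = user_account     # affiliation established

cloud = CloudServer()
device_id = cloud.allocate_device_id()              # first device requests an ID (step 3a)
# step 5a: the first device forwards the ID to the second device (out of band here)
cloud.bind(device_id, "user@example.com")           # second device sends the binding request
print(cloud.bindings)                               # -> {1: 'user@example.com'}
```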
- the second device 200 may display related information 108 of the first device 100 on the user interface 10.
- the related information 108 of the first device 100 may include: the picture and/or nickname of the object tracked by the first device 100, the name of the first device 100 (for example, "BBBBB"), the remaining power of the first device 100 (for example, 38%), the status of the location information reported by the first device 100 (for example, whether the device reports location information, the latest location information reported by the device, etc.), and so on.
- the picture and/or nickname of the object tracked by the first device 100 may be added or set by the user when establishing an affiliation with the first device 100.
- the remaining power of the first device 100 and the reported position information may be sent by the first device 100 to the second device 200 through the cloud server 300.
- the manner in which the first device 100 obtains the location information can refer to the previous related description
- the manner in which the first device 100 reports the location information can refer to the related description in the subsequent embodiments.
- the reporting period of the first device 100 determines the accuracy and granularity with which the user can grasp the position and activity track of the first device 100/tracked object. The shorter the reporting period, the more frequently the first device 100 reports location information, so the user can learn the location and activity track of the first device 100/tracked object more accurately. However, a shorter reporting period also means that the first device 100 needs to communicate with the cloud server 300 more frequently, which increases power consumption.
- each type of object tracked by the first device 100 corresponds to its own reporting period.
- family members such as children, the elderly, etc.
- the user wants to know the location of a family member more frequently, such as an updated location every 5 minutes, so as to ensure the family member's safety; therefore, the reporting period when the first device tracks a family member can be 5 minutes.
- the importance of pets is lower than that of family members.
- users hope to know a pet's location at a normal frequency, such as an updated location every hour, so the reporting period when the first device tracks a pet can be 1 hour.
Object type | Reporting period
---|---
Family member | 5 minutes
Pet | 1 hour

- Table 1 The reporting period corresponding to different object types
- the type of the object tracked by the first device 100 can be determined in any of the following ways:
- the type of the object tracked by the first device 100 may be set by the first device 100 by default.
- the first device 100 may pre-store a fixed reporting period, and the object type corresponding to the fixed reporting period may be regarded as the tracking object type set by the first device 100 by default. For example, when the report period pre-stored by the first device is 5 minutes, the first device can be used to track family members; when the report period pre-stored by the first device is 1 hour, the first device can be used to track pets.
- the type of the object tracked by the first device 100 may be set by the user.
- the user can select one of the object types from the menu containing multiple tracking object types provided by the second device 200 as the type of the object tracked by the first device 100; that is, the first device 100 is bound to the tracking object type.
- the second device 200 may display the user interface 40 shown in 5a in response to the user operation input on the immediate binding control 304 in the user interface 30.
- the user interface 40 may include: return to the upper level control 401, prompt information 402, tracking object type menu 403, and next control 404.
- the return to upper level control 401 can be used to receive an input user operation (such as a click operation), and the second device 200 can display the user interface 30 shown in 4c of FIG. 4 again in response to the user operation.
- the prompt information 402 is used to prompt the user to set the type of the object tracked by the first device 100.
- the tracking object type menu 403 may include the names and/or icons of multiple object types, such as luggage 403A, pets 403B, bicycles 403C, and valuables 403D shown in 5a. The menu is not limited to the tracking object type menu 403 shown in 5a; in a specific implementation, it may include more or fewer object types, which is not limited in the embodiment of the present application.
- the following describes a method for the first device 100 to report location information provided by an embodiment of the present application.
- the method for the first device 100 to report location information may include the following two approaches:
- the first device 100 may determine the reporting period corresponding to the type of the self-tracking object by any of the following methods:
- the first device 100 may pre-store the reporting period corresponding to each tracking object type.
- the first device 100 may find the reporting period corresponding to the type of the tracking object after learning the type of the tracking object.
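A minimal sketch of this pre-stored lookup, using only the family-member (5 minutes) and pet (1 hour) periods given in the text; the data layout is an assumption:

```python
# Hedged sketch of a pre-stored object-type -> reporting-period table.
REPORTING_PERIOD_S = {"family_member": 5 * 60, "pet": 60 * 60}

def period_for(object_type: str) -> int:
    """Look up the reporting period corresponding to the tracking object type."""
    return REPORTING_PERIOD_S[object_type]

print(period_for("pet"))  # -> 3600 seconds
```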
- FIG. 6 shows a schematic flowchart of a method for the first device 100 to adaptively determine a reporting period and report location information according to a scene in an embodiment of the present application.
- the method may include the following steps:
- Step S101 The first device 100 obtains scene data.
- Table 2 shows possible scenarios corresponding to several object types.
- the scene corresponding to each object type can be preset according to empirical data, or can be set independently by the user.
- the second device 200 may provide a user interface for setting or adding a scene corresponding to the object type, and the user can set or add a scene corresponding to the object type through the user interface.
- the user can set 3 scenes corresponding to pets: a home scene, a dog walking scene, and a travel scene.
- Step S102 The first device 100 determines the scene where the tracked object is currently located according to the scene data.
- the first device 100 may determine the scene where the tracked object is currently located in the scene corresponding to the type of the tracked object according to the acquired scene data.
- the scenes that the pet may be in include: a home scene, a dog walking scene, and a travel scene.
- if the location information indicates that the pet is at home, the first device 100 can determine that the pet is currently in the home scene; if the movement speed is > 4 km/h, the first device 100 can determine that the pet is currently in the dog walking scene; if the location information indicates that the distance between the pet and the home exceeds a preset value (for example, 100 km), the first device 100 may determine that it is currently in a traveling scene.
- the possible scenes of the bicycle include: a home scene and a riding scene.
- if the location information indicates that the bicycle is at home, the first device 100 can determine that it is currently in the home scene; if the movement speed is > 10 km/h, it can determine that it is currently in the riding scene.
- the scenes that the child may be in include: a home scene, a school scene, and an outside school scene.
- if the location information indicates that the child is at home, the first device 100 can determine that the child is currently in the home scene; if the location information indicates that the child is at school, the first device 100 can determine that the child is currently in the school scene; if the location information indicates that the child is neither at home nor at school, the first device 100 may determine that the child is currently in an out-of-school scene.
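These rules can be read as a small decision procedure. The sketch below encodes the pet rules using only the thresholds stated above (> 4 km/h, home distance > 100 km); the input field names are assumptions:

```python
# Rule-based sketch of step S102 for the "pet" type; data layout is assumed.
def pet_scene(scene_data: dict) -> str:
    if scene_data["distance_from_home_km"] > 100:
        return "traveling"       # far from home: traveling scene
    if scene_data["speed_kmh"] > 4:
        return "dog_walking"     # moving fast: dog walking scene
    return "home"                # otherwise: home scene

print(pet_scene({"speed_kmh": 6, "distance_from_home_km": 2}))  # -> dog_walking
```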
- the first device 100 may store one or more algorithms for recognizing scenes.
- the algorithm may be an artificial intelligence (AI) algorithm.
- the first device 100 may use the acquired scene data as the input of the algorithm, and the output of the algorithm is the scene where the currently tracked object is located. In other words, the first device 100 can recognize the scene where the tracked object is currently located through the algorithm.
- the first device 100 may store an algorithm corresponding to the type of the tracking object.
- the first device 100 may store multiple algorithms corresponding to different types of tracking objects; for example, it may store an algorithm corresponding to pets, an algorithm corresponding to luggage, and an algorithm corresponding to bicycles.
- one or more algorithms for scene recognition stored in the first device 100 can be obtained in the following ways:
- 1. The algorithm is obtained through preliminary training based on empirical data obtained by R&D personnel, and is preset in the first device 100 at the factory.
- 2. The cloud server 300 trains the algorithm according to the empirical data obtained by the R&D personnel and stores it; the first device 100 then obtains the algorithm from the cloud server 300.
- the empirical data used when training the algorithm specifically includes: the scene data collected by the researchers during the experiment, and the scene during the experiment. Specifically, the scene data collected in the experiment can be used as the input, and the scene during the experiment as the expected output, so as to train the corresponding algorithm.
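As an illustration of this input/output framing, the sketch below fits a 1-nearest-neighbour rule (KNN is among the algorithms the text mentions later) on invented experiment records; the features, labels, and values are assumptions:

```python
# Sketch only: scene data as input, experiment scene as expected output.
import math

# (speed km/h, distance from home km) -> scene label, from hypothetical experiments
TRAINING = [((0.5, 0.1), "home"),
            ((6.0, 1.0), "dog_walking"),
            ((80.0, 300.0), "traveling")]

def predict(sample: tuple) -> str:
    """Return the label of the nearest training sample (1-NN)."""
    return min(TRAINING, key=lambda pair: math.dist(pair[0], sample))[1]

print(predict((5.0, 0.8)))  # -> dog_walking
```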
- the first device 100 may also update one or more stored algorithms for scene recognition. Specifically, as more and more first devices are put into use, more and more abundant data will be generated.
- the cloud server 300 may obtain these data and use the data to update the one or more algorithms for recognizing scenes.
- the update process may include the following steps:
- Step 1 When more and more first devices track objects belonging to the specific object type, they can report the acquired scene data and the determined scene to the cloud server 300.
- the manner in which these first devices obtain scene data can refer to the related description in step S101.
- the manners for the first device to determine the scene in which it is located may include: (1) using a currently stored algorithm to obtain a calculation result with the acquired scene data as input, and taking the calculation result as the scene in which it is located; (2) determining the current scene by methods other than algorithms. For example, the scene can be determined according to keywords of the geographic location (for example, if the keywords of the suitcase's geographic location include an airport, the suitcase's scene can be determined to be an "airport scene").
- Step 2 The cloud server 300 may use the scene data and scenes reported by the first devices to update the algorithm corresponding to the specific object type. Specifically, the cloud server 300 may use the scene data reported by the first devices as input, and the corresponding scenes as the expected output, to update the algorithm corresponding to the specific object type.
- Step 3 The cloud server 300 may send the update data of the algorithm of the specific object type to the first device 100.
- Step 4 The first device 100 may update the stored algorithm corresponding to the specific object type according to the received update data.
- the updated algorithm can be used to more accurately identify the scene where the object of the specific object type is located.
- a machine learning algorithm can be used to update or train the algorithm for scene recognition.
- the machine learning algorithm can be a deep learning algorithm, and may include one or more of the following: the k-nearest neighbor (KNN) classification algorithm, convolutional neural networks (CNN), recurrent neural networks (RNN), or statistical algorithms, etc. It is understandable that in the embodiment of the present application, the machine learning algorithm used can be adjusted according to the update result or the training result.
- the reporting period corresponding to each scenario is described in detail below.
- when the user uses the first device 100 to track an object and the object is in different scenes, the frequency at which the user needs to know the location and activity trajectory of the tracked object differs; that is, the period at which the first device 100 reports location information differs.
- the following exemplarily provides two strategies for the first device 100 to report location information under different user requirements:
- the safe scene and the non-safe scene can be preset. For example, when a user uses the first device 100 to track a pet and the pet is in a “home scene”, the user does not need the first device 100 to report location information particularly frequently, because the degree of security at home is high. For another example, when the pet is in a “traveling scene”, the degree of safety is reduced, and the user needs the first device 100 to report location information more frequently to obtain the pet's location and movement track, so as to prevent the pet from being lost.
Scene | Reporting period
---|---
Home scene | 1 hour
Flying scene | No need to report location information; the reporting period tends to infinity
Airport scene | 5 minutes
Taxi scene | 10 minutes
Hotel scene | 0.5 hour

- Table 3 The reporting period of objects of type "luggage" in different scenarios
- The reporting periods are not limited to those shown in Table 3.
- Other types of objects also have different reporting periods in their corresponding scenarios.
- when the first device 100 tracks an object of the "pet" type and the pet dog is in a home scene, the reporting period can be 60 minutes; when the pet dog is in a dog walking scene, the reporting period can be 5 minutes; when the pet dog is in other scenes, the reporting period can be 30 minutes.
- as for the reporting period when the first device 100 tracks objects of the “child” type: when the child is in an out-of-school scene, the reporting period may be 1 minute; when the child is in a home scene or school scene, the reporting period may be 60 minutes; when the child is in a play scene, the reporting period may be 30 minutes.
- the first device 100 can learn the reporting period of the currently tracked object in different scenarios.
- the method for the first device 100 to learn the reporting period of the currently tracked object in different scenarios may include any of the following:
- the reporting period of the object in different scenarios can be preset based on empirical data. For example, R&D personnel can determine suitable reporting periods for the object in different scenarios through investigation, and preset them in the first device 100.
- the first device 100 may pre-store the reporting period of the object of this type in different scenarios. For example, when the first device 100 sets the tracking object type as "luggage” by default at the factory, the first device 100 may pre-store the reporting periods of objects of the type "luggage” in different scenarios.
- the first device 100 may pre-store the reporting periods of multiple types of objects in different scenarios. For example, the first device 100 may pre-store the reporting periods of object types such as "luggage", "pets", and "family members" in different scenarios. After the user sets the type of object tracked by the first device 100, the first device 100 may find the reporting periods of the tracked object in different scenarios from the pre-stored reporting periods of the multiple object types.
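A sketch of such a pre-stored multi-type table; the "luggage" row takes the Table 3 values and the "pet" row the periods stated above, with None marking the flying scene where no reporting is needed (the period tends to infinity). The nesting and key names are assumptions:

```python
# Sketch of per-type, per-scene reporting periods (in seconds).
PERIODS_S = {
    "luggage": {"home": 3600, "flying": None, "airport": 300, "taxi": 600, "hotel": 1800},
    "pet":     {"home": 3600, "dog_walking": 300, "other": 1800},
}

def lookup(object_type: str, scene: str):
    """Return the reporting period, or None when no reporting is needed."""
    return PERIODS_S[object_type][scene]

print(lookup("luggage", "airport"))  # -> 300 seconds (5 minutes)
```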
- the reporting period of the object in different scenarios can be set by the user according to requirements.
- the second device 200 may provide a user interface on the display screen for setting the reporting period of the object in different scenarios.
- the user can set the reporting period of the object in different scenarios on the user interface.
- the second device 200 may send the report period of the object in different scenarios set by the user to the cloud server 300, and the cloud server 300 may send the report period of the object in different scenarios to the first device 100.
- Step S104 The first device 100 uses the reporting period corresponding to the scene where the tracked object is currently located to report location information to the cloud server 300, where the location information is real-time location information obtained by the first device 100.
- the first device 100 may use the reporting period corresponding to the current scene determined in step S103 to report the location information.
- the reporting period refers to the interval between two consecutive reports of location information by the first device 100. For example, when the first device 100 determines that the tracked suitcase is currently in the "airport scene", it may report location information once every 5 minutes.
- the manner in which the first device 100 obtains the location information can refer to the previous related description, and it will not be repeated here.
- the first device 100 may use a wireless communication solution provided by a mobile communication module to report location information to the cloud server 300. That is, the first device 100 may report location information to the cloud server 300 through a cellular network (for example, 2G/3G/4G/5G, NB-IoT, etc.). Specifically, when the first device 100 needs to report location information, the first device 100 may use the cellular network to report the location information obtained by the first device 100 in real time to the cloud server 300.
- the real-time location information acquired by the first device 100 may refer to the location information acquired most recently, or the location information acquired shortly before (for example, 1 second before) or at the time the location information needs to be reported; this application does not restrict this. It is understandable that the time when location information needs to be reported in the embodiment of the present application refers to when the reporting period of the first device 100 arrives.
- the first device 100 may also report location information in combination with the power saving mode (PSM) under NB-IoT.
- a device based on NB-IoT can be in PSM when working. In PSM, the first device 100 disables the function of sending and receiving wireless signals, does not monitor paging on the core network side, cannot receive downlink data, and consumes very little power.
- the first device 100 can start PSM when it does not need to report location information; when it needs to report location information, if the first device 100 is in PSM, it starts non-PSM, that is, actively wakes up, and reports the newly acquired position information to the cloud server 300 through NB-IoT.
- the non-PSM of the first device 100 may include a connected (connected) state and an idle (idle) state.
- the first device 100 starts the non-PSM only when it needs to report location information. That is, the first device 100 may activate the non-power saving mode when reporting location information, and activate the power saving mode when not reporting location information.
- Combining the PSM of the first device 100 under NB-IoT to report location information can avoid wasting data traffic and power at the communication level, and can make full use of the PSM of the device under NB-IoT to achieve power saving effects.
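A conceptual sketch of this PSM-gated reporting loop follows; the modem object is a hypothetical stand-in, not a real NB-IoT modem API, and the loop is bounded only so the example terminates:

```python
# Illustrative sketch: StubModem and its methods are invented placeholders.
import time

class StubModem:
    def enter_psm(self): print("PSM: radio off, minimal power draw")
    def exit_psm(self):  print("non-PSM: radio on (connected/idle)")
    def send(self, loc): print("reported", loc, "over NB-IoT")

def reporting_loop(modem, get_location, period_s: float, cycles: int = 2):
    for _ in range(cycles):
        modem.enter_psm()            # sleep between reports with the radio off
        time.sleep(period_s)         # a hardware timer would wake the real device
        modem.exit_psm()             # actively wake into non-PSM to transmit uplink
        modem.send(get_location())   # report the freshest fix to the cloud server

reporting_loop(StubModem(), lambda: (39.9, 116.4), period_s=0.1)
```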
- the first device can independently adjust the reporting period according to the current scene, adapt to the current scene and report location information according to actual needs, which can meet the needs of tracking objects and save power consumption.
- the scene in which the tracked object is located, determined by the first device 100 according to the scene data in step S102, may be referred to as the first scene, and the reporting period corresponding to the current scene determined in step S103 may be referred to as the first reporting period.
- for example, if the movement speed of the tracked object in the first scene is greater than that in the second scene, the first reporting period is less than the second reporting period; conversely, the first reporting period may be greater than the second reporting period. In this way, the user's need to grasp the location and activity trajectory of the tracked object can be met.
- the first device 100 may also consider its own movement when reporting location information according to the corresponding reporting period. For example, when the reporting period of the first device 100 arrives, if the first device 100 has not moved within the preset time, it may skip reporting the location information. That is, when the first device 100 does not move for a long time, it may not report location information even when the reporting period arrives, which can save power consumption.
- the first device 100 may detect whether it moves within a preset time through sensors such as the acceleration sensor 170C, the gyro sensor 170A, and the vibration sensor 170F.
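One way to implement this stationarity check is to look at the variance of recent accelerometer magnitudes; the sample window and threshold below are illustrative assumptions:

```python
# Sketch of "skip the report if the device has not moved"; threshold is assumed.
from statistics import pvariance

def has_moved(accel_magnitudes: list, threshold: float = 0.05) -> bool:
    """Treat a near-constant accelerometer magnitude as 'stationary'."""
    return pvariance(accel_magnitudes) > threshold

samples = [9.81, 9.80, 9.82, 9.81]   # gravity only: device at rest
if has_moved(samples):
    print("reporting period reached: report location")
else:
    print("reporting period reached, but stationary: skip report to save power")
```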
- the first device 100 may also adjust the reporting period in combination with other situations. Two possible adjustment methods are described in detail below:
- when the first device 100 encounters a violent impact, it may be in a dangerous situation.
- the first device 100 can shorten the reporting period, for example, adjust the reporting period to 5 seconds, so that the user can obtain the location of the tracked object as soon as possible.
- the wireless communication module of the first device 100 provides an NB-IoT wireless communication solution
- if the first device 100 is in PSM when it encounters a severe impact, it can immediately start non-PSM, that is, wake itself up, and report the latest location information acquired by the first device 100 to the cloud server 300 through NB-IoT.
- the manner in which the first device 100 actively starts the non-PSM may include: the first device 100 actively sends uplink data to the core network side.
- the first device 100 may detect whether a severe impact is encountered through one or more of the acceleration sensor 170C and the vibration sensor 170F.
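A severe impact can be approximated as an acceleration spike well above gravity; the 4 g threshold and the 5-second emergency period below are illustrative assumptions, not values from the disclosure:

```python
# Sketch of impact detection triggering the shortened reporting period.
G = 9.81  # m/s^2

def severe_impact(accel_magnitude: float, threshold_g: float = 4.0) -> bool:
    """Flag acceleration spikes far above gravity as a violent impact."""
    return accel_magnitude > threshold_g * G

period_s = 300
if severe_impact(55.0):   # ~5.6 g spike measured by the acceleration sensor
    period_s = 5          # report every 5 seconds; wake from PSM immediately
print(period_s)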
- the first device 100 adjusts the reporting cycle according to the instruction.
- the method shown in FIG. 6 may further include: the first device 100 adjusts the reporting period according to the instruction.
- the process of the first device 100 adjusting the reporting period according to the instruction may include the following steps:
- Step 1 The second device 200 sends a first instruction to the cloud server 300 according to the input user operation.
- 5d of FIG. 5 exemplarily shows the user interface 50 displayed by the second device 200.
- the user interface 50 may be displayed by the second device 200 in response to an input user operation.
- the user operation may be a user operation (for example, a click operation) received on the picture or name of the object tracked by the first device 100 in the related information 108 shown in 5c of FIG. 5.
- the user interface 50 is used to display the details of the object tracked by the first device 100. As shown in 5d, the user interface 50 may include: a return to the upper level control 501, related information 502 of the first device 100, location information 503 reported by the first device 100, a view history track control 504, and an emergency search control 505. Among them:
- the return to the upper level control 501 can be used to receive an input user operation (for example, a click operation), and the second device 200 can display the user interface 10 shown in 5c again in response to the user operation.
- the related information 502 of the first device 100 is the same as the related information 108 of the first device 100 shown in the user interface 10, and the related description may be referred to.
- the location information 503 reported by the first device 100 may include the location information reported by the first device and the time when the location information was reported.
- the location information may be the location information last reported by the first device 100.
- For the method for the first device 100 to report the location information, reference may be made to the description of the foregoing related embodiments (for example, the method embodiment shown in FIG. 6).
- the viewing history track control 504 can be used to receive an input user operation (for example, a click operation), and the second device 200 can display the history track of the first device 100 in response to the user operation.
- the emergency search control 505 can be used to receive an input user operation (for example, a click operation), and the second device 200 can send a first instruction to the cloud server 300 in response to the user operation.
- the first instruction is used to request the cloud server 300 to notify the first device 100 to report location information as soon as possible.
- the first device 100 may be connected to the core network, but has not yet established a connection with the cloud server 300. When the first device 100 and the cloud server 300 are not connected, the cloud server 300 and the first device 100 cannot communicate.
- Step 3 When the first device 100 is connected to the cloud server 300, the cloud server 300 may send the second instruction to the first device 100 through the cellular network (for example, 2G/3G/4G/5G, NB-IoT, etc.) based on the first instruction.
- the second instruction is used to instruct the first device 100 to report location information as soon as possible.
- the cloud server 300 instructs the first device 100 to establish a connection with the cloud server 300 through the core network.
- when the first device 100 receives the instruction from the core network, it can use the constrained application protocol (CoAP) based on the user datagram protocol (UDP), that is, CoAP over UDP, to actively establish a connection with the cloud server 300.
- the cloud server 300 may send a second instruction to the first device 100 through a cellular network (for example, 2G/3G/4G/5G, NB-IoT, etc.) based on the first instruction.
- the second instruction is used to instruct the first device 100 to report location information as soon as possible.
- Step 4 When the first device 100 receives the second instruction sent by the cloud server 300, the reporting period is shortened, and the location information is reported at a faster frequency. For example, the first device 100 may adjust the reporting period to 5 seconds, so that the user can obtain the location of the tracked object as soon as possible.
- the first device 100 communicates with the cloud server 300 through NB-IoT, and the first device 100 may be in PSM or non-PSM.
- if the first device 100 is in non-PSM when the cloud server 300 sends the second instruction, the first device can immediately receive the second instruction and adjust the reporting period according to it.
- if the first device 100 is in PSM when the cloud server 300 sends the second instruction, the first device 100 needs to exit PSM before it can receive the second instruction and adjust the reporting period according to it.
- the first device 100 can exit the PSM in the following two situations: the first device 100 actively sends uplink data to the core network side, or the tracking area update (tracking area update, TAU) period of the first device 100 ends.
- the second instruction is time-sensitive. That is, after the first device 100 receives the second instruction and adjusts the reporting period accordingly, it reports location information according to the adjusted reporting period only within a preset time. After the preset time, the first device 100 adjusts the reporting period according to the method described in the previous embodiment (for example, the method shown in FIG. 6). That is, the first device 100 uses the shortened reporting period to report location information to the cloud server 300 within the preset time after receiving the second instruction; after that preset time has elapsed, it reports location information to the cloud server 300 using the reporting period corresponding to the scene where the tracked object is currently located.
- the preset time can be preset. In the embodiment of the present application, the preset time may be referred to as the first duration.
- the user can quickly learn the current location of the tracked object.
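This time-limited override can be modeled as a scheduler that prefers the shortened period until the first duration expires, then falls back to the scene-based period; the durations below are illustrative only:

```python
# Sketch of the time-limited override triggered by the second instruction.
import time

class ReportScheduler:
    def __init__(self, scene_period_s: float):
        self.scene_period_s = scene_period_s
        self.short_period_s = scene_period_s
        self.override_until = 0.0

    def apply_second_instruction(self, short_period_s: float, first_duration_s: float):
        self.short_period_s = short_period_s
        self.override_until = time.monotonic() + first_duration_s

    def current_period(self) -> float:
        if time.monotonic() < self.override_until:
            return self.short_period_s   # emergency: e.g. report every 5 seconds
        return self.scene_period_s       # back to the scene-based period

sched = ReportScheduler(scene_period_s=600)
sched.apply_second_instruction(short_period_s=5, first_duration_s=0.2)
print(sched.current_period())            # -> 5 while the first duration lasts
time.sleep(0.25)
print(sched.current_period())            # -> 600 after the override expires
```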
- the cloud server 300 can learn the current location information of the first device 100.
- the cloud server 300 may determine whether to synchronously send the received location information of the first device 100 to the second device 200 according to either of the following strategies:
- 1. Whenever the cloud server 300 receives location information sent by the first device 100, it sends the location information to the second device 200.
- 2. When the cloud server 300 receives the location information sent by the first device 100, it determines whether the location information is within a preset area, and if so, sends the location information to the second device 200.
- the preset area can be set independently by the user. It is understandable that the second strategy is similar to geofencing.
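Strategy two can be sketched as a simple containment test; the circular fence and the flat-earth distance approximation below are assumptions made for brevity, not how a production geofence service would compute distance:

```python
# Sketch of the geofence-like forwarding decision (strategy two).
import math

def in_preset_area(pos, center, radius_km: float) -> bool:
    """Rough planar distance check; adequate for a small fence, not geodesy."""
    dlat = (pos[0] - center[0]) * 111.0  # ~km per degree of latitude
    dlon = (pos[1] - center[1]) * 111.0 * math.cos(math.radians(center[0]))
    return math.hypot(dlat, dlon) <= radius_km

fence_center, fence_radius = (39.90, 116.40), 2.0   # user-set area (assumed circular)
report = (39.91, 116.41)                            # location reported by first device
if in_preset_area(report, fence_center, fence_radius):
    print("forward location to second device")      # strategy two fires
```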
- the second device 200 can receive the updated location information.
- the second device 200 may prompt the user that the current location of the object tracked by the first device 100 is updated.
- the way for the second device 200 to prompt the user may include one or more of the following:
- the second device 200 can display the location information through the user interface provided by the application for managing the first device, so that the user can keep track of the location of the tracked object.
- the user can view the related information 108 of the first device 100 on the user interface 10 shown in 5c provided by the second device, and check the last location information reported by the first device 100 in the location information 503 shown in 5d, so as to grasp the position of the tracked object.
- the second device 200 can call its notification manager to display a prompt window on the top of the display screen of the second device 200, the prompt window including the location information received by the second device 200.
- the prompt window displayed on the top of the display screen of the second device 200 can automatically disappear after a short stay without user interaction.
- the second device 200 can also prompt the user by emitting a prompt sound, vibrating, and flashing the indicator light, which is not limited in this embodiment of the application.
- the sensor and wireless communication module of the first device 100 can be used to acquire the scene data of the first device 100, and the acquisition method can refer to the relevant description above.
- the first device 100 may store a scene recognition algorithm, and the first device may use the acquired scene data as an input of the scene recognition algorithm to identify the scene in which the first device 100 is located.
- the processor of the first device 100 may be configured to adaptively adjust the reporting period according to the scene in which the first device 100 is located.
- the processor may also adjust the mode (PSM or non-PSM) of the first device 100, and the adjustment method may refer to the relevant description above.
- the sensor and wireless communication module of the first device 100 can also be used to obtain the location information of the first device 100, and the obtaining method can refer to the relevant description above.
- the mobile communication module of the first device 100 is further configured to send the acquired location information to the cloud server 300 according to the reporting cycle.
- the first device 100 can also be placed in an accessory for use, and the type of accessory can refer to the previous description.
- the cloud server 300 may provide electronic fence cloud services, algorithm training and updating, location cloud services, and the like.
- the algorithm training and update can refer to the previous description.
- the location cloud service means that the cloud server 300 can receive the location information reported by the first device 100 and provide the second device 200 with a service for querying the current location or the historical track.
- the electronic fence cloud service means that after the cloud server 300 receives the location information sent by the first device 100, it can determine whether the location information is within a preset area, and if so, it sends the location information to the second device 200. It is understandable that the various services provided by the cloud server 300 shown in FIG. 7 are not necessarily provided by one cloud server, but may also be provided by multiple cloud servers.
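To make the location cloud service concrete, here is a hedged server-side sketch (Python) supporting the latest-position and historical-track queries; the class and method names are illustrative, not from this application:

```python
import time
from collections import defaultdict

class LocationCloudService:
    """Stores each report so the managing device can query the latest
    position or the historical track of a tracked device."""

    def __init__(self) -> None:
        # device id -> list of (server timestamp, latitude, longitude)
        self._tracks: defaultdict[str, list[tuple[float, float, float]]] = defaultdict(list)

    def on_report(self, device_id: str, lat: float, lon: float) -> None:
        self._tracks[device_id].append((time.time(), lat, lon))

    def latest(self, device_id: str) -> tuple[float, float, float] | None:
        track = self._tracks.get(device_id)
        return track[-1] if track else None

    def history(self, device_id: str) -> list[tuple[float, float, float]]:
        return list(self._tracks.get(device_id, []))

svc = LocationCloudService()
svc.on_report("first-device-100", 31.2304, 121.4737)
print(svc.latest("first-device-100"))
```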
- the second device 200 can provide functions such as device scene binding, historical trajectory viewing, emergency search, etc. For details, please refer to the relevant description above, which will not be repeated here.
- by implementing the method for reporting location information by a device provided in the embodiments of the present application, the first device 100 can be used to track different types of objects and has wide applicability.
- the first device 100 can adjust the reporting cycle according to the current scene and report location information according to actual needs. This not only meets the needs of tracking the object, but also allows the first device 100 to work at low power in the current application scenario and improves its standby time.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Telephonic Communication Services (AREA)
- Telephone Function (AREA)
- Mobile Radio Communication Systems (AREA)
Abstract
The invention relates to a method for a device to report location information, and to a related apparatus. According to the method, when the device tracks an object, the scene in which the object is currently located can be determined, and a reporting period corresponding to the scene is used to report location information in real time. By implementing the present invention, the device can autonomously adjust the reporting period according to the current scene, which enables it to adapt to the current scene and report location information according to actual needs.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910197407.0A CN110049445A (zh) | 2019-03-15 | 2019-03-15 | Method for reporting location information by a device, and related apparatus |
CN201910197407.0 | 2019-03-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020187082A1 (fr) | 2020-09-24 |
Family
ID=67273735
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/078494 WO2020187082A1 (fr) | Method for reporting location information by a device, and related apparatus | 2019-03-15 | 2020-03-10 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110049445A (fr) |
WO (1) | WO2020187082A1 (fr) |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110049445A (zh) * | 2019-03-15 | 2019-07-23 | 华为技术有限公司 | Method for reporting location information by a device, and related apparatus |
CN112584319A (zh) * | 2019-09-30 | 2021-03-30 | 大唐移动通信设备有限公司 | Location reporting period adjustment method and apparatus, network device, and terminal |
CN110728823A (zh) * | 2019-10-09 | 2020-01-24 | 中兴健康科技有限公司 | Path-aware electronic fence system for low-power round-trip monitoring |
CN110868473B (zh) * | 2019-11-20 | 2022-07-12 | 青岛海信日立空调系统有限公司 | Household appliance |
CN110822638B (zh) * | 2019-11-20 | 2022-04-05 | 青岛海信日立空调系统有限公司 | Household appliance |
CN110971255A (zh) * | 2019-11-29 | 2020-04-07 | 四川科道芯国智能技术股份有限公司 | Wrist-worn device |
CN111896979A (zh) * | 2020-08-07 | 2020-11-06 | 成都思晗科技股份有限公司 | Positioning module and method for electric power tools |
CN112036532A (zh) * | 2020-09-03 | 2020-12-04 | 成都思晗科技股份有限公司 | Intelligent electric power safety tool management system and method |
CN115002650A (zh) * | 2021-12-08 | 2022-09-02 | 丰疆智能(深圳)有限公司 | Data transmission method and apparatus, and storage medium |
CN114513753A (zh) * | 2022-01-28 | 2022-05-17 | 青岛海信移动通信技术股份有限公司 | Terminal device, trajectory monitoring method, and storage medium |
CN116258314A (zh) * | 2022-11-23 | 2023-06-13 | 东土科技(宜昌)有限公司 | Scene management method and system for a production workshop, electronic device, and storage medium |
CN116546104A (zh) * | 2023-06-27 | 2023-08-04 | 北京集度科技有限公司 | Data reporting method, Internet-of-Things device, system, storage medium, and electronic device |
CN118362117B (zh) * | 2024-06-18 | 2024-08-20 | 贵州师范大学 | Rural tourism geographic data collection method and system, and electronic device |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20140118202A (ko) * | 2013-03-28 | 2014-10-08 | 주식회사 앰투앰넷 | Mobile terminal and positioning method thereof |
CN105353395A (zh) * | 2015-09-24 | 2016-02-24 | 广州视源电子科技股份有限公司 | Method and apparatus for adjusting location information reporting frequency |
CN106023540A (zh) * | 2016-05-20 | 2016-10-12 | 广州视源电子科技股份有限公司 | Anti-loss item prompting method and apparatus based on an anti-loss device |
CN108156186A (zh) * | 2016-12-02 | 2018-06-12 | 中移(杭州)信息技术有限公司 | Information collection method and apparatus |
CN108900976A (zh) * | 2018-07-10 | 2018-11-27 | 宇龙计算机通信科技(深圳)有限公司 | Location information reporting method and apparatus |
CN109392064A (zh) * | 2018-11-07 | 2019-02-26 | 深圳酷泰丰科技有限公司 | Method, system, device, and storage medium for reducing power consumption of a positioning wearable device |
CN110049445A (zh) * | 2019-03-15 | 2019-07-23 | 华为技术有限公司 | Method for reporting location information by a device, and related apparatus |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103916938A (zh) * | 2014-01-27 | 2014-07-09 | 北京奇虎科技有限公司 | Method, apparatus, and system for switching a portable positioning device to a power-saving mode |
CN104066163B (zh) * | 2014-04-24 | 2017-05-03 | 深圳市研强物联技术有限公司 | Power-saving apparatus, mobile terminal, and power-saving method |
CN105338466B (zh) * | 2014-07-08 | 2019-08-27 | 华为软件技术有限公司 | Location information acquisition method and device |
CN107466006A (zh) * | 2016-06-02 | 2017-12-12 | 韩军 | Mobile terminal trajectory tracking and recording technique with an adaptive mechanism |
WO2018098717A1 (fr) * | 2016-11-30 | 2018-06-07 | 华为技术有限公司 | Positioning period adjustment method and apparatus |
CN106792520B (zh) * | 2016-12-07 | 2020-07-28 | 朱策 | Location information recording method and apparatus |
CN106507311B (zh) * | 2016-12-13 | 2019-12-24 | 南京福盛建材有限公司 | Positioning processing method and system |
CN107197031A (zh) * | 2017-06-19 | 2017-09-22 | 深圳市盛路物联通讯技术有限公司 | Terminal device state detection method and system for the Internet of Things |
CN107277774A (zh) * | 2017-07-17 | 2017-10-20 | 上海斐讯数据通信技术有限公司 | Electronic fence-based monitoring method and system |
CN108228714A (zh) * | 2017-12-01 | 2018-06-29 | 兰雨晴 | Cloud management system and cloud management method thereof |
CN108174347A (zh) * | 2017-12-27 | 2018-06-15 | 广州小毛球智能科技有限公司 | Pet positioning device energy consumption management method, pet positioning device, and server |
- 2019-03-15: CN application CN201910197407.0A filed; published as CN110049445A (zh); status: active, pending
- 2020-03-10: WO application PCT/CN2020/078494 filed; published as WO2020187082A1 (fr); status: active, application filing
Also Published As
Publication number | Publication date |
---|---|
CN110049445A (zh) | 2019-07-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020187082A1 (fr) | Method for reporting location information by a device, and related apparatus | |
US11450196B2 (en) | XCB tracking devices, methods and systems | |
US11678141B2 (en) | Hybrid cellular Bluetooth tracking devices, methods and systems | |
US11184858B2 (en) | Bluecell devices and methods | |
CN113615217B (zh) | Method for determining that a terminal device is located inside a geofence, and terminal device | |
US10893659B2 (en) | Mobile telephone dog training tool and method | |
US11393323B2 (en) | XCB tracking devices, methods and systems | |
US9775003B2 (en) | Location and activity aware media content delivery system | |
CN110505572B (zh) | Indoor positioning method and electronic device | |
US20160050531A1 (en) | Method and apparatus for device positioning | |
TWI586988B (zh) | Tracking device and tracking device control method | |
CN110220516A (zh) | Method and device for controlling transmission and/or reception of safety messages | |
US20160142876A1 (en) | System and method for ad-hoc network for tracking the position of a subject | |
CN109688539A (zh) | Terminal positioning method and apparatus, mobile terminal, and readable storage medium | |
US10805900B2 (en) | Method and device for deriving location | |
WO2022257665A1 (fr) | Device tracking detection method and electronic device | |
WO2023046012A1 (fr) | Sensing service processing method, terminal, and network-side device | |
KR20200066902A (ko) | Wildlife tracking system and operating method thereof | |
KR101867548B1 (ko) | Method for retrieving a user's context using a mobile device based on wireless signal characteristics | |
US20190208051A1 (en) | Context detection with accelerated ai training and adaptive device engagement | |
CN108055393A (zh) | Speed reminder method and apparatus, and computer-readable storage medium | |
KR102707084B1 (ko) | Apparatus and method for estimating the location of an electronic device | |
JP7344342B2 (ja) | Transmission device, information processing system, transmission method, and program | |
CN116569572A (zh) | Sensing service provision method and apparatus, communication device, and storage medium | |
TW201736861A (zh) | Wireless positioning system and method | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20772910; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 20772910; Country of ref document: EP; Kind code of ref document: A1 |