WO2023082797A1 - Positioning method, positioning apparatus, storage medium and electronic device - Google Patents

Positioning method, positioning apparatus, storage medium and electronic device

Info

Publication number
WO2023082797A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
positioning data
positioning
inertial navigation
fingerprint
Prior art date
Application number
PCT/CN2022/117066
Other languages
English (en)
Chinese (zh)
Inventor
鲁晋杰
金珂
李姬俊男
郭彦东
Original Assignee
Oppo广东移动通信有限公司
Priority date
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司
Publication of WO2023082797A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations
    • G01C21/206: Instruments for performing navigational calculations specially adapted for indoor navigation

Definitions

  • the present disclosure relates to the technical field of positioning and navigation, and in particular, to a positioning method, a positioning device, a computer-readable storage medium, and electronic equipment.
  • GPS (Global Positioning System)
  • indoor scenarios such as shopping malls and supermarkets
  • engineering scenarios such as mines and tunnels.
  • the disclosure provides a positioning method, a positioning device, a computer-readable storage medium and electronic equipment.
  • a positioning method, including: acquiring inertial navigation positioning data of an object to be positioned determined based on inertial navigation sensing data; performing positioning prediction according to the inertial navigation positioning data to obtain predicted positioning data of the object to be positioned; acquiring fingerprint positioning data of the object to be positioned determined based on wireless signal data, the wireless signal data being signal data received by the object to be positioned from a fixedly set wireless access point; and updating the predicted positioning data by using the fingerprint positioning data to obtain fused positioning data of the object to be positioned.
  • a positioning device, including: a first acquisition module configured to acquire inertial navigation positioning data of an object to be positioned determined based on inertial navigation sensing data; a positioning prediction module configured to perform positioning prediction according to the inertial navigation positioning data to obtain predicted positioning data of the object to be positioned; a second acquisition module configured to acquire fingerprint positioning data of the object to be positioned determined based on wireless signal data, the wireless signal data being signal data received by the object to be positioned from a fixedly set wireless access point; and a positioning update module configured to update the predicted positioning data by using the fingerprint positioning data to obtain fused positioning data of the object to be positioned.
  • a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements the positioning method of the above first aspect and possible implementations thereof.
  • an electronic device, including: a processor; and a memory for storing executable instructions of the processor; where the processor is configured to execute the executable instructions so as to perform the positioning method of the above first aspect and possible implementations thereof.
  • FIG. 1 shows a schematic diagram of a system architecture in this exemplary embodiment
  • FIG. 2 shows a schematic structural diagram of an electronic device in this exemplary embodiment
  • FIG. 3 shows a flowchart of a positioning method in this exemplary embodiment
  • Fig. 4 shows a flow chart of positioning prediction in this exemplary embodiment
  • Fig. 5 shows a schematic diagram of positioning prediction in this exemplary embodiment
  • Fig. 6 shows a flow chart of determining fingerprint positioning data in this exemplary embodiment
  • Fig. 7 shows a flow chart of updating predicted positioning data in this exemplary embodiment
  • Fig. 8 shows a schematic diagram of updating predicted positioning data in this exemplary embodiment
  • Fig. 9 shows a flow chart of training a map matching model in this exemplary embodiment
  • FIG. 10 shows a schematic diagram of a first coordinate system and a second coordinate system in this exemplary embodiment
  • Fig. 11 shows a schematic diagram of calibrating the first coordinate system and the second coordinate system in this exemplary embodiment
  • Fig. 12 shows a schematic flowchart of a positioning method in this exemplary embodiment
  • Fig. 13 shows a schematic structural diagram of a positioning device in this exemplary embodiment
  • Fig. 14 shows a schematic structural diagram of another positioning device in this exemplary embodiment.
  • Example embodiments will now be described more fully with reference to the accompanying drawings.
  • Example embodiments may, however, be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the concept of example embodiments to those skilled in the art.
  • the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
  • numerous specific details are provided in order to give a thorough understanding of embodiments of the present disclosure.
  • those skilled in the art will appreciate that the technical solutions of the present disclosure may be practiced with one or more of the specific details omitted, or other methods, components, devices, steps, etc. may be adopted.
  • well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
  • In the related art, the accuracy of positioning is improved by combining GPS positioning with other positioning methods.
  • GPS positioning is easily affected by the environment. When there are factors that interfere or block GPS signals in the environment, it will seriously affect the accuracy of GPS positioning, and even make GPS positioning unavailable. For example, weak GPS signals often occur in areas with dense buildings, and it is difficult to achieve accurate positioning even when combined with other positioning methods.
  • exemplary embodiments of the present disclosure provide a positioning method for positioning an object to be positioned, and the object to be positioned may be a mobile device.
  • the system architecture of the operating environment of the positioning method will be described below with reference to FIG. 1 .
  • FIG. 1 shows a schematic diagram of a system architecture
  • the system architecture 100 may include a mobile device 110 and a processing device 120 .
  • the mobile device 110 may be a mobile phone, a tablet computer, a wearable device, a drone, and the like.
  • the mobile device 110 includes an inertial navigation module and a wireless communication module, which collect inertial navigation sensing data and wireless signal data respectively.
  • the processing device 120 may be a server or a locally located computer capable of providing related processing of the positioning data.
  • the mobile device 110 and the processing device 120 may be connected through a wired or wireless communication link for data exchange.
  • the mobile device 110 may send the collected inertial navigation sensor data and wireless signal data to the processing device 120 .
  • the processing device 120 determines the inertial navigation positioning data according to the inertial navigation sensing data, determines the fingerprint positioning data according to the wireless signal data, and realizes the positioning of the mobile device 110 by processing the inertial navigation positioning data and the fingerprint positioning data.
  • the mobile device 110 can process the collected inertial navigation sensing data and wireless signal data to obtain inertial navigation positioning data and fingerprint positioning data, and then send the inertial navigation positioning data and fingerprint positioning data to the processing device 120.
  • the processing device 120 realizes the positioning of the mobile device 110 by processing the inertial navigation positioning data and the fingerprint positioning data.
  • the mobile device 110 can also independently collect and process inertial navigation sensing data and wireless signal data, and process inertial navigation positioning data and fingerprint positioning data to realize its own positioning. That is to say, the positioning method of this exemplary embodiment can be implemented based on the mobile device 110 without setting the processing device 120 .
  • the execution subject of the positioning method in this exemplary embodiment may be the mobile device 110 or the processing device 120, which is not limited in the present disclosure.
  • Exemplary embodiments of the present disclosure also provide an electronic device for performing the above-mentioned positioning method, and the electronic device may be the above-mentioned mobile device 110 or the processing device 120 .
  • the electronic device includes a processor and a memory.
  • the memory is used to store executable instructions of the processor, and the processor is configured to implement the above positioning method by executing the executable instructions.
  • the electronic device 200 in FIG. 2 is taken as an example for description below. As shown in FIG. 2 , the electronic device 200 may specifically include: a processor 201 , a memory 202 , a bus 203 , a communication module 204 , an antenna 205 , a power supply module 206 and a sensor module 207 .
  • Processor 201 can include one or more processing units, for example: processor 201 can include AP (Application Processor, application processor), modem processor, GPU (Graphics Processing Unit, graphics processing unit), ISP (Image Signal Processor, image signal processor), controller, encoder, decoder, DSP (Digital Signal Processor, digital signal processor), baseband processor and/or NPU (Neural-Network Processing Unit, neural network processor), etc.
  • the positioning method in this exemplary embodiment may be executed by any one or more processing units described above, for example, the positioning method may be executed by an AP or a DSP.
  • the related processing can be performed by the NPU.
  • the processor 201 can form a connection with the memory 202 or other components through the bus 203 .
  • Memory 202 may be used to store computer-executable program code, including instructions.
  • the processor 201 executes various functional applications and data processing of the electronic device 200 by executing instructions stored in the memory 202 .
  • the memory 202 can also store application data, such as images, videos and other files.
  • the communication function of the electronic device 200 may be implemented by the communication module 204, the antenna 205, the modem processor, the baseband processor, and the like.
  • the antenna 205 is used to transmit and receive electromagnetic wave signals.
  • the antenna 205 can receive wireless signals, including but not limited to WiFi signals (wireless local area network-based signals), Bluetooth signals, NFC (Near Field Communication, near field communication) signals, infrared signals, visible light signals, etc.
  • the communication module 204 includes a wireless communication module 2041, which can collect wireless signal data and provide a wireless communication solution based on one or more of the above wireless signals.
  • the power module 206 is used to implement power management functions, such as charging the battery, supplying power to the device, and monitoring the state of the battery.
  • the sensor module 207 may include one or more sensors, so as to realize corresponding induction detection functions.
  • the sensor module 207 may include an inertial navigation module 2071, such as an IMU (Inertial Measurement Unit).
  • the inertial navigation module 2071 includes one or more inertial navigation sensors, such as gyroscopes, accelerometers, and the like.
  • the inertial navigation module 2071 is used to collect inertial navigation sensor data.
  • the electronic device 200 may further include a display screen, configured to implement a display function, such as displaying a user interface, images, videos, and the like.
  • the electronic device 200 may further include a camera module, configured to realize functions of shooting images, videos, and the like.
  • the electronic device 200 may further include an audio module, configured to implement audio functions, such as playing audio, collecting voice, and the like.
  • FIG. 3 shows an exemplary flow of the positioning method, which may include:
  • Step S310 acquiring the inertial navigation positioning data of the object to be positioned determined based on the inertial navigation sensing data
  • Step S320 perform positioning prediction according to the inertial navigation positioning data, and obtain predicted positioning data of the object to be positioned;
  • Step S330 acquiring fingerprint positioning data of the object to be positioned based on wireless signal data;
  • the wireless signal data is signal data received by the object to be positioned from a fixed wireless access point;
  • Step S340 using the fingerprint positioning data to update the predicted positioning data to obtain the fused positioning data of the object to be positioned.
  • positioning is realized by combining inertial navigation positioning data and fingerprint positioning data through positioning prediction and updating.
  • this solution integrates the positioning data of the two modalities, which can overcome the limitations of a single modality and improve the accuracy of positioning results, which is conducive to the application in scenarios with high accuracy requirements.
  • it does not need to rely on satellite positioning data such as GPS, thereby reducing the restrictions imposed by the positioning environment.
  • this solution is suitable for areas with dense buildings, indoor areas, mines, tunnels and other engineering areas, and has a wide range of applications.
  • step S310 the inertial navigation positioning data of the object to be positioned determined based on the inertial navigation sensing data is acquired.
  • Inertial navigation sensor data refers to the data output by the inertial navigation module, including but not limited to three-axis acceleration, three-axis angular acceleration, etc.
  • the inertial navigation sensing data may be the raw data collected by the inertial navigation sensor, or the data after preliminary processing of the raw data.
  • the inertial navigation positioning data is the positioning data obtained by analyzing and processing the inertial navigation sensing data.
  • the present disclosure does not limit the specific method of processing the inertial navigation sensing data. For example, after obtaining the initial positioning data of the object to be positioned, the PDR (Pedestrian Dead Reckoning) algorithm can be applied to the inertial navigation sensing data on this basis to calculate the moving trajectory of the object to be positioned, so as to obtain the inertial navigation positioning data of the object to be positioned.
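  • As a rough illustration of the PDR idea mentioned above (a hedged sketch, not the concrete implementation of this disclosure), the following assumes that step events, step lengths and headings have already been extracted from the accelerometer and gyroscope data; all names and values are illustrative:

```python
import numpy as np

def pdr_update(position, step_length, heading_rad):
    """Advance a 2D position by one detected step along the current heading.

    position     : np.array([x, y]) at the previous moment
    step_length  : estimated step length in meters (e.g. from acceleration amplitude)
    heading_rad  : heading angle in radians (e.g. integrated from gyroscope data)
    """
    delta = step_length * np.array([np.cos(heading_rad), np.sin(heading_rad)])
    return position + delta

# Start from an initial (reference) position and accumulate detected steps.
position = np.array([0.0, 0.0])          # initial positioning data
for step_length, heading in [(0.7, 0.0), (0.7, 0.1), (0.65, 0.15)]:
    position = pdr_update(position, step_length, heading)
print(position)                          # estimated position after three steps
```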
  • the inertial navigation sensing data can be collected in real time, and each time the inertial navigation sensing data is collected, the inertial navigation positioning data of the object to be positioned at the corresponding moment can be obtained.
  • For example, the frame rate of the inertial navigation sensing data output by the inertial navigation module is 30 fps (frames per second), that is, the inertial navigation module outputs the inertial navigation sensing data about every 33 ms, and the inertial navigation positioning data of the object to be positioned can be determined every 33 ms.
  • the inertial navigation sensing data or the inertial navigation positioning data may carry a time stamp, and the time stamp may represent the collection time of the inertial navigation sensing data, so that the inertial navigation positioning data can be aligned with the acquisition time of the inertial navigation sensing data, preventing positioning delays caused by data processing or transmission.
  • the positioning data described in this exemplary embodiment may include only position data, such as the three-dimensional position coordinates of the object to be positioned, or may further include pose data, such as 6DoF (Degrees of Freedom) data of the object to be positioned. Which positioning data to use can be determined according to actual needs.
  • the acquisition of the inertial navigation positioning data of the object to be positioned based on the inertial navigation sensor data may include the following steps:
  • the inertial navigation sensor data is processed by using the pre-trained inertial navigation positioning model to obtain the inertial navigation positioning data of the object to be positioned.
  • the inertial navigation positioning model may be a machine learning model, such as a neural network.
  • For example, the inertial navigation positioning model can be a TCN (Temporal Convolutional Network). The TCN is an end-to-end structure: the inertial navigation sensing data at different moments are arranged as time-series data and input into the TCN, which can output the inertial navigation positioning data corresponding to the different moments.
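  • For reference, a minimal TCN-style regressor of the kind described above could be sketched as follows in PyTorch; the 6-channel IMU input, the layer sizes and the per-frame 6-dimensional output are assumptions made for illustration, not details taken from the disclosure:

```python
import torch
import torch.nn as nn

class TinyTCN(nn.Module):
    """Maps a window of inertial sensing data to per-frame positioning outputs."""
    def __init__(self, in_channels=6, hidden=32, out_dim=6):
        super().__init__()
        # Dilated 1D convolutions over the time axis (a simplified TCN stack).
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, hidden, kernel_size=3, padding=1, dilation=1),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=3, padding=2, dilation=2),
            nn.ReLU(),
            nn.Conv1d(hidden, out_dim, kernel_size=1),
        )

    def forward(self, x):
        # x: (batch, time, channels) -> (batch, channels, time) for Conv1d
        y = self.net(x.transpose(1, 2))
        return y.transpose(1, 2)       # (batch, time, out_dim) positioning data

model = TinyTCN()
imu_window = torch.randn(1, 30, 6)     # one second of 30 fps IMU samples
print(model(imu_window).shape)         # torch.Size([1, 30, 6])
```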
  • the inertial navigation positioning data is usually used to characterize the relative positioning information of the object to be positioned.
  • the inertial navigation positioning data may include inertial navigation relative positioning data between a previous moment of the current moment and the current moment of the object to be positioned.
  • the current moment can be the moment corresponding to the latest inertial navigation sensing data acquired
  • the previous moment of the current moment can be the moment corresponding to the inertial navigation sensing data acquired last time
  • For example, if the inertial navigation module outputs the inertial navigation sensing data at a frame rate of 30 fps, the time difference between the current moment and the previous moment is about 33 ms.
  • the inertial navigation positioning model is used to process the inertial navigation sensor data and output the relative positioning data of the inertial navigation.
  • the reference positioning data of the object to be positioned can be obtained, such as the initial positioning data of the object to be positioned or the positioning data at a certain moment, which can be determined by means other than inertial navigation and positioning such as fingerprint positioning, and is absolute positioning information.
  • the absolute positioning information of the object to be positioned can then be determined according to the reference positioning data and the inertial navigation sensing data. For example, the reference positioning data and the inertial navigation sensing data are input into the inertial navigation positioning model, which outputs inertial navigation positioning data representing the absolute positioning information of the object to be positioned.
  • the training process of the inertial navigation positioning model may include: during the movement of the object to be positioned, collecting inertial navigation sample sensing data through the inertial navigation module while collecting positioning data through a VIO (Visual Inertial Odometry) system or another system as inertial navigation tag data.
  • the inertial navigation tag data can be relative positioning data, and the inertial navigation sample sensing data and the inertial navigation tag data can be associated through time stamps, thereby constructing a benchmark dataset for training the inertial navigation positioning model.
  • by training the inertial navigation positioning model on the benchmark dataset, the parameters are updated; when the test accuracy of the inertial navigation positioning model reaches the set standard, it is determined that the training is completed.
  • step S320 perform positioning prediction according to the inertial navigation positioning data, and obtain predicted positioning data of the object to be positioned.
  • Inertial navigation positioning is essentially based on the positioning change of the object to be positioned, which belongs to the continuous positioning method.
  • fingerprint positioning belongs to the way of discrete positioning.
  • location prediction is performed based on inertial navigation location data.
  • Positioning prediction can refer to the following formula:
  • k represents the current moment
  • k-1 represents the previous moment.
  • f(·) represents the motion equation
  • x_{k-1} represents the reference positioning data at the previous moment
  • v_k represents the input control amount at the current moment.
  • ε_in represents input noise, for example, Gaussian noise with a mean value of 0.
  • the predicted positioning data of the object to be positioned at the current moment can be calculated through the motion equation f( ⁇ ).
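  • The formula referred to at the start of this passage is not reproduced in this text; given the symbol definitions above, formula (1) presumably takes the standard process-model form (a reconstruction, with the check accent denoting the predicted quantity, a notational choice made here):

```latex
\check{x}_k = f\left(x_{k-1},\, v_k,\, \varepsilon_{in}\right) \tag{1}
```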
  • the inertial navigation positioning data may be relative positioning data, for example, it may be inertial navigation relative positioning data between the current moment and the previous moment of the current moment of the object to be positioned.
  • the predicted positioning data is absolute positioning data, for example, it may be a prediction result of 6DoF data of the object to be positioned.
  • the above positioning prediction based on the inertial navigation positioning data to obtain the predicted positioning data of the object to be positioned may include the following steps S410 and S420:
  • Step S410 acquiring reference positioning data of the object to be positioned at a previous moment, the reference positioning data including at least one of predicted positioning data, fingerprint positioning data, and fusion positioning data of the object to be positioned at a previous moment;
  • Step S420 according to the reference positioning data and the inertial navigation relative positioning data, the predicted positioning data of the object to be positioned at the current moment is obtained.
  • the reference positioning data of the object to be positioned at the initial moment may be the fingerprint positioning data at the initial moment, which provides a reference for absolute positioning data for inertial navigation positioning.
  • the predicted positioning data at each subsequent moment can be calculated.
  • the frame rate of inertial navigation positioning data output by the inertial navigation module is 30 fps
  • the frame rate of fingerprint positioning data output by the wireless communication module is 1 fps.
  • t0 represents the initial moment
  • the fingerprint positioning data is acquired at t0 (generally, the inertial navigation positioning data is not acquired at the initial moment).
  • the inertial navigation relative positioning data at each moment refers to the relative positioning data at this moment relative to the previous moment.
  • the predicted positioning data of t1 can be calculated by formula (1).
  • the equation of motion can be expressed as follows:
  • T_k represents the positioning data at time k obtained by inertial navigation positioning, which is equivalent to the predicted positioning data at time k.
  • formula (2) is equivalent to simplifying the motion equation, and multiplying the reference positioning data of the previous moment by the inertial navigation relative positioning data of the current moment to obtain the predicted positioning data of the current moment.
  • the predicted positioning data at the previous moment can be used as the reference positioning data.
  • the fingerprint positioning data and inertial navigation relative positioning data can be obtained.
  • the predicted positioning data of t30 can also be updated according to the fingerprint positioning data of t30 to obtain the fusion positioning data of t30. Therefore, predictive positioning data, fingerprint positioning data, and fusion positioning data are available at t30. In this way, when calculating the predicted positioning data at t31, any one of the predicted positioning data, fingerprint positioning data, and fusion positioning data at t30 can be used as the reference positioning data.
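  • As an illustration of the simplified motion equation of formula (2), with 4×4 homogeneous pose matrices assumed here as the representation of the positioning data, the prediction is simply the composition of the reference pose at the previous moment with the inertial navigation relative pose:

```python
import numpy as np

def predict_pose(reference_pose, relative_pose):
    """Formula (2) as matrix composition: predicted pose at the current moment."""
    return reference_pose @ relative_pose

# Reference pose at t0 (e.g. from fingerprint positioning) and the IMU-derived
# relative pose between t0 and t1, both as 4x4 homogeneous transforms.
T_ref = np.eye(4)
T_rel = np.eye(4)
T_rel[:3, 3] = [0.02, 0.0, 0.0]      # roughly 2 cm of motion in one 33 ms frame
T_pred_t1 = predict_pose(T_ref, T_rel)
print(T_pred_t1[:3, 3])              # predicted position at t1
```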
  • step S330 acquire fingerprint positioning data of the object to be positioned based on wireless signal data;
  • the wireless signal data is signal data received by the object to be positioned from a fixed wireless access point;
  • wireless access points may be set at multiple fixed positions in the environment, which may be access points based on radio frequency, infrared, visible light and other arbitrary signals, such as Bluetooth beacons.
  • the wireless communication module of the mobile device 110 can receive wireless signals sent by multiple nearby wireless access points, and obtain wireless signal data.
  • the wireless signal data can include the identification or location of the wireless access point sending the wireless signal, the strength of the wireless signal, and the like.
  • wireless signal data may include wireless fingerprint data.
  • the RSS (Received Signal Strength) data of the wireless signal can be arranged in the order of the wireless access points to form an array or vector, so as to obtain the wireless fingerprint data of the current location.
  • the object to be positioned is currently in the target scene, and 10 wireless access points are set in the target scene in advance, numbered 1 to 10 respectively.
  • After the wireless signals sent by the 10 wireless access points are received, the RSS data of the 10 wireless signals are arranged into a vector in the order of wireless access points 1 to 10. Some of the wireless access points may be far away, in which case the RSS of the signal received from them is 0.
  • the RSS vector is the wireless fingerprint data of the current location.
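  • A minimal sketch of assembling such wireless fingerprint data (the access point numbering and RSS values below are made up for illustration):

```python
# RSS values received from the fixed access points, keyed by AP number.
scanned = {1: -48, 2: -63, 4: -71, 7: -55}   # APs 3, 5, 6, 8, 9, 10 not heard

# Arrange RSS data in the fixed order of access points 1..10; unheard APs get 0.
fingerprint = [scanned.get(ap, 0) for ap in range(1, 11)]
print(fingerprint)   # wireless fingerprint data of the current location
```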
  • the fingerprint positioning data of the object to be positioned can be determined.
  • the fingerprint positioning data of the object to be positioned can be determined through geometric relationship calculation.
  • the acquisition of the fingerprint positioning data of the object to be positioned based on the wireless signal data may include the following steps S610 and S620:
  • Step S610 acquiring the wireless signal data of the object to be positioned at the sampling moment
  • Step S620 according to the similarity between the wireless signal data of the object to be located at the sampling time and the preconfigured wireless signal reference data, determine the fingerprint positioning data of the object to be located at the sampling time.
  • the sampling time refers to the time when fingerprint positioning is performed, which may be any time in the actual positioning process, for example, it may be the current time.
  • the wireless signal reference data is the wireless signal data collected at the reference position in advance. The present disclosure does not limit the setting of the reference position. For example, a reference position may be set at a certain distance in each direction, so as to form a dot matrix of the reference position in the target scene.
  • the wireless signal data sent by the wireless access point is collected at the reference position as a reference for subsequent fingerprint positioning.
  • the wireless signal received by each reference location can be collected to obtain wireless fingerprint data (such as RSS vector) as the wireless signal reference data of each reference location, so that a wireless fingerprint library can be established.
  • determining the fingerprint positioning data of the object to be positioned at the sampling moment according to the similarity between the wireless signal data of the object to be positioned at the sampling moment and the pre-configured wireless signal reference data may include the following steps:
  • According to the similarity between the wireless signal data of the object to be positioned at the sampling moment and the pre-configured wireless signal reference data, a plurality of wireless signal reference data similar to the wireless signal data of the object to be positioned at the sampling moment are determined;
  • the fingerprint positioning data of the object to be positioned at the sampling moment is determined by performing weighting processing on the multiple reference positions corresponding to the multiple wireless signal reference data.
  • any number of wireless signal reference data having a high similarity with the wireless signal data of the object to be positioned at the sampling moment may be determined.
  • a fixed number can be set, and the number of wireless signal reference data can be selected in order of similarity from high to low, or a similarity threshold can be set to select wireless signal reference data whose similarity with the wireless signal data is higher than the similarity threshold.
  • the fixed number and the similarity threshold may be determined according to experience or actual requirements, and are not limited in the present disclosure.
  • the weight of each reference position can be determined according to the similarity, and the multiple reference positions can be weighted to obtain the fingerprint positioning data of the object to be positioned at the sampling time.
  • the similarity corresponding to each wireless signal reference data is normalized, and the normalized similarity is used as a weight to perform weighting processing on the multiple reference positions, so as to obtain the fingerprint positioning data of the object to be positioned at the sampling moment.
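  • The weighting of reference positions described above can be sketched as a simple weighted nearest-neighbour computation; the similarity measure (inverse Euclidean distance between RSS vectors), the value of k and the data below are illustrative assumptions:

```python
import numpy as np

def fingerprint_locate(query, reference_fps, reference_pos, k=3):
    """Weight the k most similar reference positions by normalized similarity."""
    query = np.asarray(query, dtype=float)
    refs = np.asarray(reference_fps, dtype=float)
    # Similarity as the inverse of the Euclidean distance between RSS vectors.
    sim = 1.0 / (np.linalg.norm(refs - query, axis=1) + 1e-6)
    top = np.argsort(sim)[-k:]                  # k reference data with highest similarity
    weights = sim[top] / sim[top].sum()         # normalized similarities as weights
    return weights @ np.asarray(reference_pos, dtype=float)[top]

reference_fps = [[-50, -60, 0], [-48, -66, -70], [-70, -55, -62]]   # wireless fingerprint library
reference_pos = [[0.0, 0.0], [2.0, 0.0], [0.0, 2.0]]                # corresponding reference positions
print(fingerprint_locate([-49, -62, -75], reference_fps, reference_pos, k=2))
```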
  • the wireless signal data or the fingerprint positioning data may carry a time stamp, and the time stamp may represent the collection time of the wireless signal data (that is, the above-mentioned sampling moment), so that the fingerprint positioning data can be aligned with the acquisition time of the wireless signal data, preventing positioning delays caused by data processing or transmission.
  • step S340 the fingerprint positioning data is used to update the predicted positioning data to obtain fused positioning data of the object to be positioned.
  • Predicted positioning data is obtained by performing positioning prediction based on inertial navigation positioning data, and there is usually a certain error.
  • the fingerprint positioning data is used to update the predicted positioning data.
  • the update process can be regarded as the fusion of the fingerprint positioning data and the predicted positioning data (or inertial navigation positioning data), so the final positioning data is called fusion positioning data.
  • the fused positioning data can be output as the final positioning result.
  • updating the predicted positioning data by using the fingerprint positioning data to obtain the fusion positioning data of the object to be positioned may include the following steps:
  • Step S710 based on the prior positioning covariance matrix, observation matrix, observation noise control matrix and observation noise covariance matrix of the target time, determine the Kalman gain at the target time;
  • Step S720 using the fingerprint positioning data and Kalman gain at the target time to update the predicted positioning data at the target time to obtain the fusion positioning data of the object to be positioned at the target time.
  • the target time refers to the time when positioning needs to be performed, and may be any time in the actual positioning process, for example, it may be the current time or the sampling time.
  • the update of the predicted positioning data can be realized through the EKF (Extended Kalman Filter, extended Kalman filter) algorithm, referring to the following formula:
  • k represents the current time, and the current time is taken as the target time.
  • K_k represents the Kalman gain at the target time; the prior positioning covariance matrix at the target time is equivalent to the covariance matrix of the predicted positioning data at the target time; G_k represents the observation matrix at the target time (that is, the Jacobian matrix of the observation equation with respect to the fusion positioning data); C_k represents the observation noise control matrix at the target time (that is, the Jacobian matrix of the observation equation with respect to the observation noise); and R_k represents the observation noise covariance matrix at the target time.
  • ε_ob represents observation noise, for example, it may be Gaussian noise with a mean value of 0.
  • the fingerprint positioning data can be used as observations, as shown below:
  • Tal_k represents the fingerprint positioning data at the target moment.
  • the observation equation g( ⁇ ) characterizes the relationship between observations and predicted positioning data. In one embodiment, it can be simplified as:
  • In this case, the observation matrix and the observation noise control matrix at any time are identity matrices.
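  • The formulas referenced above are not reproduced in this text; from the symbol definitions given, they presumably follow the standard extended Kalman filter observation, gain and update relations (a reconstruction; the check and hat accents denote prior and posterior quantities, a notational choice made here):

```latex
\begin{aligned}
z_k &= g\left(\check{x}_k,\, \varepsilon_{ob}\right), \qquad z_k = Tal_k \\
K_k &= \check{P}_k\, G_k^{\mathsf T}\left(G_k \check{P}_k G_k^{\mathsf T} + C_k R_k C_k^{\mathsf T}\right)^{-1} \\
\hat{x}_k &= \check{x}_k + K_k\left(Tal_k - g(\check{x}_k)\right)
\end{aligned}
```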
  • the prior positioning covariance matrix at the target time may be determined based on the posterior positioning covariance matrix, the state transition matrix, the input noise control matrix and the input noise covariance matrix at the previous time of the target time. Refer to the formula below:
  • k-1 represents the previous moment of the target moment.
  • F_{k-1} represents the state transition matrix at the previous moment (that is, the Jacobian matrix of the motion equation with respect to the fusion positioning data at the previous moment)
  • B_{k-1} represents the input noise control matrix at the previous moment (that is, the Jacobian matrix of the motion equation with respect to the input noise)
  • Q_k represents the input noise covariance matrix at the target moment.
  • I represents the identity matrix.
  • the posterior positioning covariance matrix at the previous moment can be calculated by formula (8).
  • the prior positioning covariance matrix of the target moment can be calculated by formula (7).
  • the state transition matrix and the input noise control matrix at any time are identity matrices.
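  • Similarly, the prior and posterior covariance recursions implied by the symbol definitions above presumably take the standard form (a reconstruction of formulas (7) and (8) in the same assumed notation):

```latex
\check{P}_k = F_{k-1}\,\hat{P}_{k-1}\,F_{k-1}^{\mathsf T} + B_{k-1}\,Q_k\,B_{k-1}^{\mathsf T},
\qquad
\hat{P}_{k-1} = \left(I - K_{k-1} G_{k-1}\right)\check{P}_{k-1}
```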
  • the positioning method may also include the following steps:
  • Acquire n sets of fingerprint sample data, where each set of fingerprint sample data includes a fingerprint positioning sample matrix and a corresponding real positioning matrix, and n is a positive integer;
  • One of the fingerprint positioning sample matrix and the real positioning matrix in each group of fingerprint sample data is converted into an inverse matrix and multiplied by the other, and the observation noise covariance matrix is determined according to the result of the multiplication.
  • n can be any positive integer, and generally the larger the value of n is, the more accurate the obtained observation noise covariance matrix is.
  • For the n sets of samples, the fingerprint positioning sample data are respectively obtained and expressed in the form of matrices, yielding n fingerprint positioning sample matrices denoted as Tal; through the VIO system or other means, the corresponding real positioning data are obtained and also expressed in the form of matrices, yielding n real positioning matrices denoted as Ttrue.
  • formula (9) shows that the fingerprint positioning sample matrix Tal is converted into an inverse matrix and multiplied by the real positioning matrix Ttrue, and the observation noise covariance matrix R is calculated from the product of the multiplication result matrix and its transposed matrix.
  • the real positioning matrix Ttrue can also be converted into an inverse matrix and multiplied by the fingerprint positioning sample matrix Tal, and the observation noise covariance matrix R can be calculated by the product of the multiplication result matrix and its transposed matrix.
  • the positioning method may also include the following steps:
  • Acquire m sets of inertial navigation sample data, where each set of inertial navigation sample data includes an inertial navigation positioning sample matrix and a corresponding real positioning matrix, and m is a positive integer;
  • One of the inertial navigation positioning sample matrix and the real positioning matrix in each group of inertial navigation sample data is converted into an inverse matrix and multiplied with the other, and the input noise covariance matrix is determined according to the result of the multiplication.
  • m can be any positive integer. Generally, the larger the value of m is, the more accurate the obtained input noise covariance matrix is.
  • the inertial navigation positioning sample matrices are denoted as Tr.
  • formula (10) shows that the inertial navigation positioning sample matrix Tr is converted into an inverse matrix and multiplied by the real positioning matrix Ttrue, and the input noise covariance matrix Q is calculated from the product of the multiplication result matrix and its transposed matrix.
  • the real positioning matrix Ttrue can also be converted into an inverse matrix and multiplied by the inertial navigation positioning sample matrix Tr, and the input noise covariance matrix Q can be calculated by the product of the multiplication result matrix and its transposed matrix.
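  • Written out, the computations described for formulas (9) and (10) presumably amount to the following (a reconstruction; averaging over the n and m sample groups is an assumption made here):

```latex
R = \frac{1}{n}\sum_{i=1}^{n}\left(Tal_i^{-1}\,Ttrue_i\right)\left(Tal_i^{-1}\,Ttrue_i\right)^{\mathsf T},
\qquad
Q = \frac{1}{m}\sum_{j=1}^{m}\left(Tr_j^{-1}\,Ttrue_j\right)\left(Tr_j^{-1}\,Ttrue_j\right)^{\mathsf T}
```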
  • When the frame rates of the output data of the wireless communication module and the inertial navigation module are different, or there is a delay in the data processing and transmission of the two modules, the times at which the system acquires the inertial navigation positioning data and the fingerprint positioning data are not synchronized.
  • the frame rate of the inertial navigation positioning data output by the inertial navigation module is higher than the frame rate of the fingerprint positioning data output by the wireless communication module.
  • the inertial navigation module continues to output the inertial navigation positioning data, and continues to perform positioning prediction.
  • the fingerprint positioning data acquired in step S330 includes the fingerprint positioning data of the object to be located at the sampling moment.
  • the sampling moment may be determined according to the time stamp in the wireless signal data, or may be determined according to the time stamp in the fingerprint positioning data. If the current moment is used to indicate the moment of positioning prediction, the sampling moment may be earlier than the current moment.
  • the aforementioned updating of the predicted positioning data by using the fingerprint positioning data to obtain the fusion positioning data of the object to be positioned may include the following steps:
  • the fingerprint positioning data of the object to be positioned at the sampling time is used to update the predicted positioning data of the object to be positioned at the sampling time, and the fusion positioning data of the object to be positioned at the sampling time is obtained.
  • Take, as an example, the inertial navigation module outputting the inertial navigation positioning data at a frame rate of 30 fps and the wireless communication module outputting the fingerprint positioning data at a frame rate of 1 fps.
  • the wireless communication module collects wireless signal data, and obtains fingerprint positioning data after processing and sends it to the processor or processing device 120 .
  • the inertial navigation module also needs a certain amount of time for data processing and transmission. Considering the time difference between the wireless communication module and the inertial navigation module for data processing and transmission, it is assumed to be 0.5s.
  • For example, the wireless communication module collects the wireless signal data at t30, and the moment at which that wireless signal data has been parsed into the fingerprint positioning data of t30 and transmitted to the system is t45.
  • Since the fingerprint positioning data is not acquired during t30-t45, the predicted positioning data at the previous moment can be used as the reference positioning data in order to calculate the predicted positioning data at the current moment.
  • At t45, the fingerprint positioning data of t30 is received, and the predicted positioning data of t30 (rather than the current predicted positioning data of t45) is updated according to this fingerprint positioning data, which is equivalent to going back to time t30 for data fusion to obtain the fused positioning data of t30.
  • In this way, the time error caused by the asynchrony between the inertial navigation module and the wireless communication module can be alleviated, and the real-time performance of positioning can be improved.
  • the inertial navigation positioning data includes inertial navigation relative positioning data of the object to be positioned between the sampling moment and the current moment.
  • the positioning method may also include the following steps:
  • According to the fused positioning data of the object to be positioned at the sampling moment and the inertial navigation relative positioning data, the fused positioning data of the object to be positioned at the current moment is obtained.
  • For example, the fused positioning data of t30 can be used as the reference positioning data to recalculate the predicted positioning data of t31; formula (2) can then be further expressed as formula (11):
  • In formula (11), the fused positioning data of the previous moment is used as the reference positioning data of the previous moment, and can be regarded as the posterior positioning data after the fusion of the fingerprint positioning data. Since the reference positioning data incorporates fingerprint positioning information, the predicted positioning data of t31 calculated at this time actually also incorporates fingerprint positioning information, so it can be called the fused positioning data of t31, which can also be regarded as posterior positioning data.
  • the predicted positioning data of t31 calculated for the first time can be regarded as prior positioning data.
  • the fused positioning data at each subsequent moment is sequentially calculated, so that data fusion can also be realized at the moment when the fingerprint positioning data is missing, and the accuracy of the positioning result can be improved.
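  • A minimal sketch of the delayed-update bookkeeping described above: per-frame relative poses are buffered so that a late fingerprint fix can be fused at its sampling moment and the later frames re-propagated; the class, names and 4×4 pose representation are illustrative assumptions:

```python
import numpy as np

class DelayedFusion:
    """Keep relative poses so a late fingerprint fix can be applied at its sampling time."""
    def __init__(self, initial_pose):
        self.pose = initial_pose            # latest predicted/fused pose
        self.relatives = {}                 # timestamp -> relative pose of that frame

    def predict(self, timestamp, relative_pose):
        self.relatives[timestamp] = relative_pose
        self.pose = self.pose @ relative_pose        # formula (2)-style prediction
        return self.pose

    def fuse_delayed(self, sample_time, fused_pose_at_sample_time):
        """Replace the pose at the sampling moment, then re-apply later relative poses."""
        pose = fused_pose_at_sample_time
        for ts in sorted(t for t in self.relatives if t > sample_time):
            pose = pose @ self.relatives[ts]
        self.pose = pose
        return self.pose

f = DelayedFusion(np.eye(4))
for t in range(31, 46):                     # frames t31..t45 predicted from IMU only
    step = np.eye(4)
    step[0, 3] = 0.02
    f.predict(t, step)
fused_t30 = np.eye(4)                       # fused positioning data of t30 (from the EKF update)
print(f.fuse_delayed(30, fused_t30)[:3, 3]) # re-propagated pose at t45
```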
  • the positioning method may also include the following steps:
  • the fusion positioning data is processed by using the map matching model of the target scene to obtain the target positioning data;
  • the target scene is the real scene where the object to be positioned is currently located.
  • the map matching model is used to match the fusion positioning data to a more real position (or pose), so as to further improve the accuracy of the fusion positioning data.
  • the target positioning data may be final output positioning data.
  • the present disclosure does not limit the specific form of the map matching model, for example, it may be a neural network model.
  • This exemplary embodiment also provides a training method for a map matching model, as shown in FIG. 9 , which may include the following steps S910 and S920:
  • step S910 the pose labeling information in the target scene is obtained, and matching data pairs are obtained according to the pose labeling information, and each matching data pair includes a positioning data to be matched and a label positioning data having a matching relationship.
  • Pose annotation information refers to the authentic pose information in the target scene.
  • the pose labeling information may include region labeling information, which may be the information of reachable and unreachable regions in the target scene obtained by counting the real pose information in the target scene.
  • A reachable area refers to an area that can actually be reached, and an unreachable area refers to an area that cannot be reached, for example, due to obstacles or building restrictions.
  • the historical trajectory data in the target scene can be obtained, which refers to the data of the moving trajectory that has actually occurred in the target scene, and the real location in the target scene can be determined by making statistics on a large amount of historical trajectory data.
  • the real pose refers to the pose that a person or object can actually present
  • the unreal pose refers to the pose that a person or object cannot actually present.
  • For example, the poses in the above-mentioned unreachable areas are all non-real poses; a pose that lies in a reachable area but whose orientation deviates from any actually occurring pose is also a non-real pose.
  • the poses that have appeared in the historical trajectory data can be formed into a set of real poses, and poses outside the set are non-real poses.
  • the matching data pair is the data used to train the map matching model, wherein the positioning data to be matched is positioning data whose authenticity is unknown, and the label positioning data refers to real positioning data matched with the positioning data to be matched.
  • the above-mentioned matching data pair obtained according to the pose labeling information may include the following steps:
  • Generate positioning data to be matched, and judge, according to the pose labeling information, whether the positioning data to be matched corresponds to a real pose; if yes, the positioning data to be matched itself is used as the label positioning data matched with it; if not, the positioning data corresponding to the most similar real pose is used as the label positioning data matched with the positioning data to be matched.
  • the positioning data to be matched may be randomly generated in the target scene, for example, 6DoF data is randomly generated within the scope of the target scene.
  • the positioning data to be matched may also be generated based on a set strategy, for example, the positioning data to be matched is generated according to a set ratio in different areas of the target scene.
  • According to the pose annotation information, it can be judged whether the generated positioning data to be matched corresponds to a real pose or a non-real pose, for example, by judging whether the positioning data to be matched is in the set of the above-mentioned real poses. If it corresponds to a real pose, the positioning data to be matched itself can be used as its matching label positioning data, that is, the positioning data to be matched forms a matching data pair with itself. If it corresponds to a non-real pose, the positioning data corresponding to the real pose most similar to the positioning data to be matched can be obtained as the matching label positioning data, that is, the positioning data to be matched and the positioning data corresponding to that real pose form a matching data pair.
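  • A simplified sketch of generating such matching data pairs from a set of real poses; 2D positions stand in for full 6DoF poses, and the random sampling strategy and distance threshold are illustrative assumptions:

```python
import random
import numpy as np

def make_matching_pair(real_poses, scene_bounds):
    """Return (positioning data to be matched, label positioning data)."""
    real_poses = np.asarray(real_poses, dtype=float)
    (xmin, xmax), (ymin, ymax) = scene_bounds
    candidate = np.array([random.uniform(xmin, xmax), random.uniform(ymin, ymax)])
    dists = np.linalg.norm(real_poses - candidate, axis=1)
    if dists.min() < 0.1:                   # candidate corresponds to a real pose
        return candidate, candidate
    return candidate, real_poses[dists.argmin()]   # label = most similar real pose

real_poses = [[1.0, 1.0], [1.5, 1.0], [2.0, 1.0]]   # poses seen in historical trajectories
pair = make_matching_pair(real_poses, scene_bounds=((0, 3), (0, 3)))
print(pair)
```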
  • Step S920 input the positioning data to be matched in the matching data pair into the map matching model to be trained, and update the parameters of the map matching model based on the matched positioning data output by the map matching model and the tag positioning data in the matching data pair.
  • the map matching model can adopt an end-to-end network structure, for example, U-Net can be used as the network backbone (Backbone), and any intermediate layer can be set according to actual needs.
  • the parameters of the map matching model are updated by backpropagation through the value of the loss function.
  • In the actual positioning process, the fused positioning data is input into the trained map matching model for forward inference, which is equivalent to using the physical information in the target scene to constrain the fused positioning data. If the authenticity of the fused positioning data is low, the map matching model can match it to a similar real pose to further improve positioning accuracy.
  • the positioning method may also include the following steps:
  • Based on the predicted positioning data and the fingerprint positioning data at the same moment, the conversion parameters between the first coordinate system and the second coordinate system are determined.
  • the first coordinate system is the coordinate system in which the predicted positioning data is located; its origin is usually the position of the object to be positioned when the inertial navigation module is initialized, and its coordinate axis directions can be aligned with the internal coordinate axes of the inertial navigation module.
  • the second coordinate system is the coordinate system where the fingerprint positioning data is located. Its origin is usually the position of the object to be located when the target scene is initially mapped, and its coordinate axis direction is usually different from that of the first coordinate system. In order to achieve more accurate fusion of the predicted positioning data and the fingerprint positioning data, the first coordinate system and the second coordinate system may be unified.
  • When the two coordinate systems are not unified, the difference between the predicted positioning data and the fingerprint positioning data at the same moment is relatively large, and the conversion parameters between the two coordinate systems can be determined based on the difference between the two.
  • the predicted positioning data and the fingerprint positioning data can be transformed into the same coordinate system based on the above transformation parameters, for example, the predicted positioning data can be converted from the first coordinate system to the second coordinate system, or the fingerprint positioning data The data is converted from the second coordinate system to the first coordinate system, or both the predicted positioning data and the fingerprint positioning data are converted into a unified third coordinate system. Then, the fingerprint positioning data after the unified coordinate system is used to update the predicted positioning data to obtain the fusion positioning data of the object to be positioned.
  • the determination of the conversion parameters between the first coordinate system and the second coordinate system based on the predicted positioning data and fingerprint positioning data at the same time may include the following steps:
  • Low-pass filter processing is performed on the conversion parameter and the conversion parameter sampling data corresponding to the current calibration time to update the conversion parameter.
  • the calibration time refers to the time when the first coordinate system and the second coordinate system are calibrated, usually the time when both the predicted positioning data and the fingerprint positioning data are available.
  • Tr0 represents the predicted positioning data at time 0
  • Tal0 represents the fingerprint positioning data at time 0.
  • W1 represents the first coordinate system
  • W2 represents the second coordinate system.
  • the coordinate systems of the predicted positioning data and the fingerprint positioning data may be unified based on the current latest transformation parameters, and then fusion calculation is performed.
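  • The calibration of the two coordinate systems can be sketched as follows; poses are taken as 4×4 homogeneous matrices, the conversion parameter sample is computed from a simultaneous predicted/fingerprint pose pair as in the Tr0/Tal0 example above, and the exponential smoothing of the translation part is an assumed stand-in for the low-pass filtering:

```python
import numpy as np

def conversion_sample(pred_pose_w1, fp_pose_w2):
    """Conversion parameter sample mapping coordinate system W1 to W2 (e.g. Tal0 composed with the inverse of Tr0)."""
    return fp_pose_w2 @ np.linalg.inv(pred_pose_w1)

def lowpass_update(current, sample, alpha=0.1):
    """Very simple low-pass filter on the translation part of the conversion parameter."""
    updated = sample.copy()                       # take the latest rotation as-is
    updated[:3, 3] = (1 - alpha) * current[:3, 3] + alpha * sample[:3, 3]
    return updated

T_w2_from_w1 = conversion_sample(np.eye(4), np.eye(4))   # initial calibration at time 0
# At a later calibration time, refine the parameter with a new simultaneous sample.
pred = np.eye(4); pred[:3, 3] = [1.00, 0.0, 0.0]          # predicted positioning data in W1
fp   = np.eye(4); fp[:3, 3]   = [1.05, 0.0, 0.0]          # fingerprint positioning data in W2
T_w2_from_w1 = lowpass_update(T_w2_from_w1, conversion_sample(pred, fp))
print(T_w2_from_w1[:3, 3])
```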
  • the positioning method may also include the following steps:
  • Determine the deviation value between the predicted positioning data and the fingerprint positioning data at the same moment; when the deviation value exceeds a preset range, determine that at least one of the predicted positioning data and the fingerprint positioning data is abnormal data.
  • the deviation value may include a position deviation value or an angle deviation value, and corresponding preset ranges may be determined for the position deviation value and the angle deviation value respectively, for example, a first preset range for the position deviation value and a second preset range for the angle deviation value, determined according to experience and actual needs. If the position deviation value and the angle deviation value both exceed their preset ranges, or one of them exceeds its preset range, it is determined that at least one of the predicted positioning data and the fingerprint positioning data is abnormal data. Exemplarily, both the predicted positioning data and the fingerprint positioning data at this moment may be determined as abnormal data and discarded. In this way, abnormal positioning data can be filtered out to ensure the reliability of the positioning result.
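  • The abnormal-data check described above can be sketched as follows; the thresholds and the rotation-angle computation are illustrative assumptions:

```python
import numpy as np

def is_abnormal(pred_pose, fp_pose, max_pos_m=2.0, max_angle_deg=30.0):
    """Flag the pair as abnormal if the position or angle deviation exceeds its preset range."""
    pos_dev = np.linalg.norm(pred_pose[:3, 3] - fp_pose[:3, 3])
    rel_rot = pred_pose[:3, :3].T @ fp_pose[:3, :3]
    angle_dev = np.degrees(np.arccos(np.clip((np.trace(rel_rot) - 1) / 2, -1.0, 1.0)))
    return pos_dev > max_pos_m or angle_dev > max_angle_deg

print(is_abnormal(np.eye(4), np.eye(4)))   # False: both deviations are within range
```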
  • Fig. 12 shows a schematic flowchart of the positioning method in this exemplary embodiment, including:
  • Step S1210 acquiring inertial navigation sensing data
  • Step S1220 determining inertial navigation positioning data based on inertial navigation sensing data
  • Step S1230 acquiring wireless signal data
  • Step S1240 determining fingerprint positioning data based on wireless signal data
  • Step S1250 perform positioning prediction according to the inertial navigation positioning data, and obtain predicted positioning data
  • Step S1260 updating the predicted positioning data according to the fingerprint positioning data to obtain fused positioning data
  • Step S1270 input the fused positioning data into the pre-trained map matching model to obtain target positioning data and output it.
  • the positioning device 1300 may include:
  • the first acquiring module 1310 is configured to acquire the inertial navigation positioning data of the object to be positioned determined based on the inertial navigation sensing data;
  • the positioning prediction module 1320 is configured to perform positioning prediction according to the inertial navigation positioning data, and obtain predicted positioning data of the object to be positioned;
  • the second acquiring module 1330 is configured to acquire the fingerprint positioning data of the object to be positioned determined based on the wireless signal data; the wireless signal data is signal data received by the object to be positioned from a fixedly set wireless access point;
  • the positioning updating module 1340 is configured to update the predicted positioning data by using the fingerprint positioning data to obtain the fusion positioning data of the object to be positioned.
  • the acquisition of the inertial navigation positioning data of the object to be positioned based on the inertial navigation sensor data may include:
  • the inertial navigation sensor data is processed by using the pre-trained inertial navigation positioning model to obtain the inertial navigation positioning data of the object to be positioned.
  • the inertial navigation positioning data includes inertial navigation relative positioning data of the object to be positioned between the previous moment of the current moment and the current moment; the above-mentioned performing positioning prediction according to the inertial navigation positioning data to obtain the predicted positioning data of the object to be positioned includes:
  • the reference positioning data including at least one of predicted positioning data, fingerprint positioning data, and fusion positioning data of the object to be positioned at a previous moment;
  • the predicted positioning data of the object to be positioned at the current moment is obtained.
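  • A minimal sketch of this prediction step is given below, assuming the positioning data is represented as a planar pose (x, y, yaw) and the inertial navigation relative positioning data as a body-frame increment (dx, dy, dyaw); this representation is an assumption made for illustration.

```python
import numpy as np

def predict_pose(reference_pose, relative_pose):
    """Compose the reference pose at the previous moment with the inertial navigation
    relative positioning data to obtain the predicted pose at the current moment."""
    x, y, yaw = reference_pose
    dx, dy, dyaw = relative_pose
    # Rotate the body-frame displacement into the world frame, then translate.
    x_new = x + dx * np.cos(yaw) - dy * np.sin(yaw)
    y_new = y + dx * np.sin(yaw) + dy * np.cos(yaw)
    yaw_new = (yaw + dyaw + np.pi) % (2 * np.pi) - np.pi   # wrap to (-pi, pi]
    return np.array([x_new, y_new, yaw_new])
```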
  • the acquisition of the fingerprint positioning data of the object to be positioned based on the wireless signal data may include:
  • determining the fingerprint positioning data of the object to be positioned at the sampling time according to the similarity between the wireless signal data of the object to be positioned at the sampling time and pre-configured wireless signal reference data; the wireless signal reference data is the wireless signal data collected at reference positions.
  • the determination of the fingerprint positioning data of the object to be positioned at the sampling time according to the similarity between the wireless signal data of the object to be positioned at the sampling time and the pre-configured wireless signal reference data may include:
  • determining, according to the similarity between the wireless signal data of the object to be positioned at the sampling moment and the pre-configured wireless signal reference data, a plurality of wireless signal reference data similar to the wireless signal data of the object to be positioned at the sampling moment;
  • the fingerprint positioning data of the object to be positioned at the sampling moment is determined by performing weighting processing on the multiple reference positions corresponding to the multiple wireless signal reference data.
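  • The following is a sketch of such similarity matching and weighting in the style of weighted K-nearest neighbours; the use of aligned RSSI vectors, Euclidean distance as the (inverse) similarity measure, and the value of k are assumptions made for the example.

```python
import numpy as np

def fingerprint_locate(rssi_vector, reference_rssi, reference_positions, k=3, eps=1e-6):
    """reference_rssi: (N, D) RSSI vectors collected at N reference positions;
    reference_positions: (N, 2) coordinates of those reference positions;
    rssi_vector: (D,) RSSI vector observed by the object to be positioned."""
    distances = np.linalg.norm(np.asarray(reference_rssi, float) - np.asarray(rssi_vector, float), axis=1)
    nearest = np.argsort(distances)[:k]              # the k most similar reference data
    weights = 1.0 / (distances[nearest] + eps)       # higher similarity -> larger weight
    weights /= weights.sum()
    return weights @ np.asarray(reference_positions, float)[nearest]   # weighted position
```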
  • the above-mentioned fingerprint positioning data includes fingerprint positioning data of the object to be positioned at a sampling time, and the sampling time is earlier than the current time.
  • the aforementioned updating of the predicted positioning data by using the fingerprint positioning data to obtain the fusion positioning data of the object to be positioned may include:
  • the fingerprint positioning data of the object to be positioned at the sampling time is used to update the predicted positioning data of the object to be positioned at the sampling time, and the fusion positioning data of the object to be positioned at the sampling time is obtained.
  • the inertial navigation positioning data includes inertial navigation relative positioning data of the object to be positioned between the sampling moment and the current moment.
  • the positioning update module 1340 is further configured to:
  • the fused positioning data of the object to be positioned at the current moment is obtained according to the fused positioning data of the object to be positioned at the sampling moment and the inertial navigation relative positioning data.
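  • One way to realise this propagation, together with the timestamp handling mentioned below, is sketched here; the buffer structure and the planar pose representation are assumptions made for the example.

```python
from collections import deque

import numpy as np

class DelayCompensator:
    """Buffers inertial relative positioning increments so that a fused pose obtained
    at an earlier fingerprint sampling moment can be propagated to the current moment."""

    def __init__(self):
        self.increments = deque()   # entries of (timestamp, dx, dy, dyaw)

    def push(self, timestamp, dx, dy, dyaw):
        self.increments.append((timestamp, dx, dy, dyaw))

    def propagate(self, fused_at_sampling, sampling_timestamp):
        """Apply every buffered increment newer than the fingerprint sampling timestamp."""
        x, y, yaw = fused_at_sampling
        for t, dx, dy, dyaw in self.increments:
            if t <= sampling_timestamp:
                continue
            x += dx * np.cos(yaw) - dy * np.sin(yaw)
            y += dx * np.sin(yaw) + dy * np.cos(yaw)
            yaw = (yaw + dyaw + np.pi) % (2 * np.pi) - np.pi
        return np.array([x, y, yaw])
```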
  • the location update module 1340 is further configured to:
  • the sampling moment is determined according to the time stamp in the fingerprint positioning data.
  • the aforementioned update of the predicted positioning data by using the fingerprint positioning data to obtain the fusion positioning data of the object to be positioned may include:
  • the Kalman gain at the target time is determined;
  • the fingerprint positioning data and the Kalman gain at the target time are used to update the predicted positioning data at the target time to obtain the fused positioning data of the object to be positioned at the target time.
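  • A minimal sketch of such a Kalman update with the fingerprint positioning data as the observation is shown below; the state layout, observation matrix H and noise matrix R are assumptions, not the specific formulation of the present disclosure.

```python
import numpy as np

def kalman_update(x_pred, P_pred, z_fingerprint, H, R):
    """x_pred: prior (predicted) state; P_pred: prior positioning covariance;
    z_fingerprint: fingerprint observation; H: observation matrix; R: observation noise."""
    S = H @ P_pred @ H.T + R                        # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)             # Kalman gain at the target time
    x_fused = x_pred + K @ (z_fingerprint - H @ x_pred)
    P_fused = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_fused, P_fused
```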
  • the location update module 1340 is further configured to:
  • each group of fingerprint sample data includes a fingerprint positioning sample matrix and a corresponding real positioning matrix; n is a positive integer;
  • One of the fingerprint positioning sample matrix and the real positioning matrix in each group of fingerprint sample data is converted into an inverse matrix and multiplied by the other, and the observation noise covariance matrix is determined according to the result of the multiplication.
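  • The matrix operation above is stated only at a high level; as one common interpretation, the observation noise covariance can be estimated as the sample covariance of the fingerprint positioning error over the n groups of sample data, as sketched below, and the same scheme would apply to the inertial navigation sample data for the input noise covariance. This is an illustrative assumption rather than the exact computation of the present disclosure.

```python
import numpy as np

def estimate_noise_covariance(sample_matrix, true_matrix):
    """sample_matrix, true_matrix: (n, d) matrices of positioning samples and the
    corresponding ground-truth positions; returns a (d, d) covariance estimate."""
    errors = np.asarray(sample_matrix, float) - np.asarray(true_matrix, float)
    errors -= errors.mean(axis=0, keepdims=True)
    return errors.T @ errors / max(len(errors) - 1, 1)
```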
  • the location update module 1340 is further configured to:
  • the prior positioning covariance matrix of the target moment is determined.
  • the location update module 1340 is further configured to:
  • each set of inertial navigation sample data includes an inertial navigation positioning sample matrix and a corresponding real positioning matrix; m is a positive integer;
  • One of the inertial navigation positioning sample matrix and the real positioning matrix in each group of inertial navigation sample data is converted into an inverse matrix and multiplied with the other, and the input noise covariance matrix is determined according to the result of the multiplication.
  • the positioning device 1300 may further include a map matching module configured to:
  • the target scene is the real scene where the object to be positioned is currently located;
  • the map matching module trains and obtains the map matching model by performing the following steps:
  • each matching data pair includes one piece of positioning data to be matched and one piece of label positioning data that have a matching relationship;
  • the matching data pair is obtained according to the pose labeling information, including:
  • determining whether the positioning data to be matched corresponds to a real pose; if yes, the positioning data to be matched is used as the label positioning data matched with the positioning data to be matched; if not, the positioning data corresponding to the closest real pose is used as the label positioning data matched with the positioning data to be matched.
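  • A sketch of forming such matching data pairs is given below; representing the pose labeling information as a set of labelled positions and using a distance tolerance to decide whether a candidate corresponds to a real pose are assumptions made for the example.

```python
import numpy as np

def build_matching_pairs(candidates, labeled_positions, tol=0.5):
    """candidates: (M, d) positioning data to be matched; labeled_positions: (K, d)
    positioning data carrying pose labeling information; tol: distance tolerance."""
    labeled_positions = np.asarray(labeled_positions, float)
    pairs = []
    for pose in np.asarray(candidates, float):
        dists = np.linalg.norm(labeled_positions - pose, axis=1)
        nearest = int(np.argmin(dists))
        # If the candidate corresponds to a real pose (within tol) it is its own
        # label; otherwise the data of the closest real pose is used as the label.
        label = pose if dists[nearest] <= tol else labeled_positions[nearest]
        pairs.append((pose, label))
    return pairs
```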
  • the location update module 1340 is further configured to:
  • the first coordinate system is the coordinate system where the predicted positioning data is located
  • the second coordinate system is the coordinate system where the fingerprint positioning data is located
  • using the fingerprint positioning data to update the predicted positioning data to obtain the fused positioning data of the object to be positioned may include:
  • the predicted positioning data and the fingerprint positioning data are converted into the same coordinate system, and the fingerprint positioning data is used to update the predicted positioning data to obtain the fusion positioning data of the object to be positioned.
  • the determination of the conversion parameters between the first coordinate system and the second coordinate system based on the predicted positioning data and fingerprint positioning data at the same time includes:
  • Low-pass filter processing is performed on the conversion parameter and the conversion parameter sampling data corresponding to the current calibration time to update the conversion parameter.
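  • As a sketch only, the conversion parameters between the two coordinate systems can be estimated as a planar rotation and translation from predicted/fingerprint positions taken at the same moments, and then smoothed with a first-order low-pass filter; the least-squares formulation, the parameterisation as (theta, tx, ty) and the filter coefficient are assumptions made for the example.

```python
import numpy as np

def estimate_transform(points_first, points_second):
    """Least-squares 2D rotation and translation mapping points in the first
    coordinate system (predicted positioning data) onto corresponding points in the
    second coordinate system (fingerprint positioning data)."""
    A = np.asarray(points_first, float)
    B = np.asarray(points_second, float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    theta = np.arctan2(H[0, 1] - H[1, 0], H[0, 0] + H[1, 1])
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    t = cb - R @ ca
    return theta, t[0], t[1]

def low_pass_update(prev_params, sampled_params, alpha=0.1):
    """First-order low-pass filtering of the conversion parameters (theta, tx, ty);
    angle wrap-around is ignored here for brevity."""
    prev = np.asarray(prev_params, float)
    sampled = np.asarray(sampled_params, float)
    return (1.0 - alpha) * prev + alpha * sampled
```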
  • the location update module 1340 is further configured to:
  • the positioning device 1400 may include a processor 1410 and a memory 1420; the memory 1420 stores the following program modules:
  • the first acquiring module 1421 is configured to acquire the inertial navigation positioning data of the object to be positioned determined based on the inertial navigation sensing data;
  • the positioning prediction module 1422 is configured to perform positioning prediction according to the inertial navigation positioning data, and obtain predicted positioning data of the object to be positioned;
  • the second acquiring module 1423 is configured to acquire the fingerprint positioning data of the object to be positioned determined based on the wireless signal data; the wireless signal data is signal data received by the object to be positioned from a fixed wireless access point;
  • the positioning update module 1424 is configured to use the fingerprint positioning data to update the predicted positioning data to obtain the fusion positioning data of the object to be positioned;
  • the processor 1410 is used to execute the above program modules.
  • the acquisition of the inertial navigation positioning data of the object to be positioned determined based on the inertial navigation sensing data may include:
  • the inertial navigation sensing data is processed by using the pre-trained inertial navigation positioning model to obtain the inertial navigation positioning data of the object to be positioned.
  • the inertial navigation positioning data includes inertial navigation relative positioning data of the object to be positioned between the moment previous to the current moment and the current moment; performing positioning prediction according to the inertial navigation positioning data to obtain the predicted positioning data of the object to be positioned may include:
  • acquiring reference positioning data of the object to be positioned at the previous moment, the reference positioning data including at least one of the predicted positioning data, the fingerprint positioning data, and the fused positioning data of the object to be positioned at the previous moment;
  • determining the predicted positioning data of the object to be positioned at the current moment according to the reference positioning data and the inertial navigation relative positioning data.
  • the acquisition of the fingerprint positioning data of the object to be positioned based on the wireless signal data may include:
  • determining the fingerprint positioning data of the object to be positioned at the sampling time according to the similarity between the wireless signal data of the object to be positioned at the sampling time and pre-configured wireless signal reference data; the wireless signal reference data is the wireless signal data collected at reference positions.
  • the determination of the fingerprint positioning data of the object to be positioned at the sampling time according to the similarity between the wireless signal data of the object to be positioned at the sampling time and the pre-configured wireless signal reference data may include:
  • determining, according to the similarity between the wireless signal data of the object to be positioned at the sampling moment and the pre-configured wireless signal reference data, a plurality of wireless signal reference data similar to the wireless signal data of the object to be positioned at the sampling moment;
  • the fingerprint positioning data of the object to be positioned at the sampling moment is determined by performing weighting processing on the multiple reference positions corresponding to the multiple wireless signal reference data.
  • the above-mentioned fingerprint positioning data includes fingerprint positioning data of the object to be positioned at a sampling time, and the sampling time is earlier than the current time.
  • the aforementioned update of the predicted positioning data by using the fingerprint positioning data to obtain the fusion positioning data of the object to be positioned may include:
  • the fingerprint positioning data of the object to be positioned at the sampling time is used to update the predicted positioning data of the object to be positioned at the sampling time, and the fusion positioning data of the object to be positioned at the sampling time is obtained.
  • the inertial navigation positioning data includes inertial navigation relative positioning data of the object to be positioned between the sampling moment and the current moment.
  • the positioning update module 1424 is further configured to:
  • the fused positioning data of the object to be positioned at the current moment is obtained according to the fused positioning data of the object to be positioned at the sampling moment and the inertial navigation relative positioning data.
  • the location update module 1424 is further configured to:
  • the sampling moment is determined according to the time stamp in the fingerprint positioning data.
  • the aforementioned update of the predicted positioning data by using the fingerprint positioning data to obtain the fusion positioning data of the object to be positioned may include:
  • the Kalman gain at the target time is determined;
  • the fingerprint positioning data and the Kalman gain at the target time are used to update the predicted positioning data at the target time to obtain the fused positioning data of the object to be positioned at the target time.
  • the location update module 1424 is further configured to:
  • each set of fingerprint sample data includes a fingerprint positioning sample matrix and a corresponding real positioning matrix; n is a positive integer;
  • One of the fingerprint positioning sample matrix and the real positioning matrix in each group of fingerprint sample data is converted into an inverse matrix and multiplied by the other, and the observation noise covariance matrix is determined according to the result of the multiplication.
  • the location update module 1424 is further configured to:
  • the prior positioning covariance matrix of the target moment is determined.
  • the location update module 1424 is further configured to:
  • each set of inertial navigation sample data includes an inertial navigation positioning sample matrix and a corresponding real positioning matrix; m is a positive integer;
  • One of the inertial navigation positioning sample matrix and the real positioning matrix in each group of inertial navigation sample data is converted into an inverse matrix and multiplied with the other, and the input noise covariance matrix is determined according to the result of the multiplication.
  • the positioning device 1400 may further include a map matching module configured to:
  • the target scene is the real scene where the object to be positioned is currently located;
  • the map matching module trains and obtains the map matching model by performing the following steps:
  • each matching data pair includes one piece of positioning data to be matched and one piece of label positioning data that have a matching relationship;
  • the matching data pair is obtained according to the pose labeling information, including:
  • determining whether the positioning data to be matched corresponds to a real pose; if yes, the positioning data to be matched is used as the label positioning data matched with the positioning data to be matched; if not, the positioning data corresponding to the closest real pose is used as the label positioning data matched with the positioning data to be matched.
  • the location update module 1424 is further configured to:
  • the first coordinate system is the coordinate system where the predicted positioning data is located
  • the second coordinate system is the coordinate system where the fingerprint positioning data is located
  • the fingerprint positioning data is used to update the predicted positioning data to obtain the fusion positioning data of the object to be positioned, including:
  • the predicted positioning data and the fingerprint positioning data are converted into the same coordinate system, and the fingerprint positioning data is used to update the predicted positioning data to obtain the fusion positioning data of the object to be positioned.
  • the determination of the conversion parameters between the first coordinate system and the second coordinate system based on the predicted positioning data and fingerprint positioning data at the same time includes:
  • Low-pass filter processing is performed on the conversion parameter and the conversion parameter sampling data corresponding to the current calibration time to update the conversion parameter.
  • the location update module 1424 is further configured to:
  • Exemplary embodiments of the present disclosure also provide a computer-readable storage medium, which may be implemented in the form of a program product including program code.
  • When the program product runs on an electronic device, the program code causes the electronic device to perform the steps described in the "Exemplary Methods" section above of this specification according to various exemplary embodiments of the present disclosure.
  • the program product may be implemented as a portable compact disk read-only memory (CD-ROM) containing program code and run on an electronic device, such as a personal computer.
  • the program product of the present disclosure is not limited thereto.
  • a readable storage medium may be any tangible medium containing or storing a program, and the program may be used by or in combination with an instruction execution system, apparatus or device.
  • a program product may take the form of any combination of one or more readable media.
  • the readable medium may be a readable signal medium or a readable storage medium.
  • the readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, device, or device, or any combination thereof. More specific examples (non-exhaustive list) of readable storage media include: electrical connection with one or more conductors, portable disk, hard disk, random access memory (RAM), read only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), optical storage devices, magnetic storage devices, or any suitable combination of the foregoing.
  • a computer readable signal medium may include a data signal carrying readable program code in baseband or as part of a carrier wave. Such propagated data signals may take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • a readable signal medium may also be any readable medium other than a readable storage medium that can transmit, propagate, or transport a program for use by or in conjunction with an instruction execution system, apparatus, or device.
  • Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • Program code for performing the operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as "C" or similar programming languages.
  • the program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on a remote computing device or server.
  • the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

Positioning method, positioning apparatus, storage medium and electronic device. The positioning method comprises: acquiring inertial navigation positioning data of an object to be positioned determined on the basis of inertial navigation sensing data (S310); performing positioning prediction according to the inertial navigation positioning data to obtain predicted positioning data of the object to be positioned (S320); acquiring fingerprint positioning data of the object to be positioned determined on the basis of wireless signal data, the wireless signal data being signal data received by the object to be positioned from a fixedly arranged wireless access point (S330); and updating the predicted positioning data by means of the fingerprint positioning data to obtain fused positioning data of the object to be positioned (S340). Positioning accuracy is thereby improved.
PCT/CN2022/117066 2021-11-09 2022-09-05 Procédé de positionnement, appareil de positionnement, support de stockage et dispositif électronique WO2023082797A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111322570.9A CN114001736A (zh) 2021-11-09 2021-11-09 定位方法、定位装置、存储介质与电子设备
CN202111322570.9 2021-11-09

Publications (1)

Publication Number Publication Date
WO2023082797A1 true WO2023082797A1 (fr) 2023-05-19

Family

ID=79928396

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/117066 WO2023082797A1 (fr) 2021-11-09 2022-09-05 Procédé de positionnement, appareil de positionnement, support de stockage et dispositif électronique

Country Status (2)

Country Link
CN (1) CN114001736A (fr)
WO (1) WO2023082797A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114001736A (zh) * 2021-11-09 2022-02-01 Oppo广东移动通信有限公司 定位方法、定位装置、存储介质与电子设备
CN116753976B (zh) * 2023-08-18 2023-12-22 广西大也智能数据有限公司 一种步行导航方法和装置

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102419180A (zh) * 2011-09-02 2012-04-18 无锡智感星际科技有限公司 一种基于惯性导航系统和wifi的室内定位方法
US20140073345A1 (en) * 2012-09-07 2014-03-13 Microsoft Corporation Locating a mobile computing device in an indoor environment
CN105424030A (zh) * 2015-11-24 2016-03-23 东南大学 基于无线指纹和mems传感器的融合导航装置和方法
CN105588566A (zh) * 2016-01-08 2016-05-18 重庆邮电大学 一种基于蓝牙与mems融合的室内定位系统及方法
CN107702712A (zh) * 2017-09-18 2018-02-16 哈尔滨工程大学 基于惯性测量双层wlan指纹库的室内行人组合定位方法
US20180283882A1 (en) * 2017-04-04 2018-10-04 Appropolis Inc. Location-based services system and method therefor
CN112729301A (zh) * 2020-12-10 2021-04-30 深圳大学 一种基于多源数据融合的室内定位方法
CN113382357A (zh) * 2021-06-29 2021-09-10 上海电力大学 一种改进pdr与rssi融合的蓝牙室内定位方法
CN113554754A (zh) * 2021-07-30 2021-10-26 中国电子科技集团公司第五十四研究所 一种基于计算机视觉的室内定位方法
CN114001736A (zh) * 2021-11-09 2022-02-01 Oppo广东移动通信有限公司 定位方法、定位装置、存储介质与电子设备

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106441316B (zh) * 2016-09-08 2020-09-01 复旦大学 一种基于历史数据的单点路网匹配方法
CN106647784A (zh) * 2016-11-15 2017-05-10 天津大学 基于北斗导航系统的微小型无人飞行器定位与导航方法
CN109874112B (zh) * 2017-12-05 2021-06-15 华为技术有限公司 一种定位的方法及终端
CN109800454B (zh) * 2018-12-13 2023-04-18 武汉工程大学 一种基于卡尔曼滤波的煤层气采集方法、系统及存储介质
CN112147658B (zh) * 2019-06-27 2023-08-11 财付通支付科技有限公司 一种车辆移动方向的判断方法、装置、设备及存储介质
CN110543917B (zh) * 2019-09-06 2021-09-28 电子科技大学 一种利用行人惯导轨迹与视频信息的室内地图匹配方法
CN110530371B (zh) * 2019-09-06 2021-05-18 电子科技大学 一种基于深度强化学习的室内地图匹配方法
CN112084285B (zh) * 2020-09-11 2023-08-08 北京百度网讯科技有限公司 用于地图匹配的方法、装置、电子设备以及可读介质
CN112146660B (zh) * 2020-09-25 2022-05-03 电子科技大学 一种基于动态词向量的室内地图定位方法
CN112945227A (zh) * 2021-02-01 2021-06-11 北京嘀嘀无限科技发展有限公司 定位方法和装置
CN113203426A (zh) * 2021-04-13 2021-08-03 北京嘀嘀无限科技发展有限公司 地图匹配方法、地图匹配模型的确定方法及装置

Also Published As

Publication number Publication date
CN114001736A (zh) 2022-02-01

Similar Documents

Publication Publication Date Title
WO2023082797A1 (fr) Procédé de positionnement, appareil de positionnement, support de stockage et dispositif électronique
WO2019219077A1 (fr) Procédé de positionnement, appareil de positionnement, système de positionnement, support de stockage et procédé de construction de base de données cartographiques hors ligne
CN107179086B (zh) 一种基于激光雷达的制图方法、装置及系统
CN108230379B (zh) 用于融合点云数据的方法和装置
JP6339200B2 (ja) 軌跡を使用する位置推定のための方法および装置
KR20220028042A (ko) 포즈 결정 방법, 장치, 전자 기기, 저장 매체 및 프로그램
CN109461208B (zh) 三维地图处理方法、装置、介质和计算设备
WO2018081186A1 (fr) Procédé et système de mise en correspondance de forme globale pour une trajectoire
CN113406682B (zh) 一种定位方法、装置、电子设备及存储介质
EP3414524A1 (fr) Procédé et système d'utilisation d'une navigation portable améliorée assistée par des informations cartographiques hors ligne
CN107888828A (zh) 空间定位方法及装置、电子设备、以及存储介质
WO2016040166A1 (fr) Procédé et appareil permettant d'utiliser une navigation portative améliorée assistée par des informations cartographiques
WO2015023634A2 (fr) Navigation inertielle à base visuelle
US11162791B2 (en) Method and system for point of sale ordering
WO2021012879A1 (fr) Procédé et appareil de reconstruction de trajectoire, dispositif informatique et support de stockage
CN111935644B (zh) 一种基于融合信息的定位方法、装置及终端设备
WO2021088498A1 (fr) Procédé d'affichage d'objet virtuel et dispositif électronique
CN110553648A (zh) 一种用于室内导航的方法和系统
CN114013449B (zh) 针对自动驾驶车辆的数据处理方法、装置和自动驾驶车辆
KR20210142745A (ko) 정보 처리 방법, 장치, 전자 기기, 저장 매체 및 프로그램
Batstone et al. Towards real-time time-of-arrival self-calibration using ultra-wideband anchors
CN113610702B (zh) 一种建图方法、装置、电子设备及存储介质
CN112985394B (zh) 定位方法和装置、存储介质
WO2022246812A1 (fr) Procédé et appareil de positionnement, dispositif électronique et support de stockage
WO2021088497A1 (fr) Procédé d'affichage d'objet virtuel, procédé de mise à jour de carte globale et dispositif

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22891602

Country of ref document: EP

Kind code of ref document: A1