CN113252058B - IMU data processing method, system, device and storage medium - Google Patents

IMU data processing method, system, device and storage medium

Info

Publication number
CN113252058B
CN113252058B
Authority
CN
China
Prior art keywords
imu data
data
imu
initial
conversion
Prior art date
Legal status
Active
Application number
CN202110564425.5A
Other languages
Chinese (zh)
Other versions
CN113252058A (en)
Inventor
闫振强
胡达
王丰雷
Current Assignee
Beijing Voyager Technology Co Ltd
Original Assignee
Beijing Voyager Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Voyager Technology Co Ltd
Priority to CN202110564425.5A
Publication of CN113252058A
Priority to PCT/CN2022/080139
Application granted
Publication of CN113252058B
Legal status: Active

Links

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01C21/3407: Route searching; Route guidance specially adapted for specific applications
    • G01C21/343: Calculating itineraries, i.e. routes leading from a starting point to a series of categorical destinations using a global route restraint, round trips, touristic trips
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01C21/3446: Details of route searching algorithms, e.g. Dijkstra, A*, arc-flags, using precalculated routes
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00: Road transport of goods or passengers
    • Y02T10/10: Internal combustion engine [ICE] based vehicles
    • Y02T10/40: Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

The embodiments of this specification disclose an IMU data processing method, system, device, and storage medium. The method comprises the following steps: acquiring initial IMU data; determining converted IMU data corresponding to the initial IMU data based on a conversion model; and determining target IMU data based on the converted IMU data, wherein the accuracy of the target IMU data is greater than that of the initial IMU data.

Description

IMU data processing method, system, device and storage medium
Technical Field
The present disclosure relates to the field of positioning and navigation data processing technologies, and in particular, to an IMU data processing method, system, device, and storage medium.
Background
With the development of unmanned driving technology, continuous and stable high-precision positioning and navigation capability has become a basic requirement of unmanned vehicles. IMU (inertial measurement unit) based positioning is widely used because of its high update frequency and its ability to provide real-time position information. Generally, IMU devices can be classified into high-precision IMUs and low-precision IMUs; data collected by a high-precision IMU is highly accurate, but such devices are costly and cannot be widely deployed. It is therefore necessary to provide an IMU data processing method that converts data collected by a low-precision IMU into high-precision IMU data.
Disclosure of Invention
Some embodiments of the present description relate to a method for processing IMU data. The method comprises the following steps: acquiring initial IMU data; determining converted IMU data corresponding to the initial IMU data based on a conversion model; and determining target IMU data based on the converted IMU data, wherein the accuracy of the target IMU data is greater than the accuracy of the initial IMU data.
Some embodiments of the present description relate to a system for processing IMU data. The system comprises: a data acquisition module configured to acquire initial IMU data; a data conversion module configured to determine converted IMU data corresponding to the initial IMU data based on a conversion model; and a data determination module configured to determine target IMU data based on the converted IMU data, wherein the accuracy of the target IMU data is greater than the accuracy of the initial IMU data.
Some embodiments of the present description relate to an apparatus for processing IMU data. The apparatus comprises: at least one storage medium storing computer instructions; and at least one processor that executes the computer instructions to implement an IMU data conversion method.
Some embodiments of the present description relate to a computer-readable storage medium storing computer instructions that, when read by a computer, cause the computer to perform an IMU data conversion method.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present specification, the drawings used in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description are only some examples or embodiments of the present specification, and those of ordinary skill in the art can apply the present specification to other similar situations according to these drawings without inventive effort. In the drawings:
FIG. 1 is an exemplary application scenario diagram of an IMU data processing system according to some embodiments of the present description;
FIG. 2 is a block diagram of an exemplary processing device shown in accordance with some embodiments of the present description;
FIG. 3 is an exemplary flow chart of an exemplary IMU data processing method shown in accordance with some embodiments of the present description;
FIG. 4 is an exemplary flow diagram illustrating selection of a conversion model based on scene information according to some embodiments of the present description;
FIG. 5 is an exemplary flow diagram illustrating selection of a conversion model based on linear information according to some embodiments of the present description;
FIG. 6 is an exemplary flow chart of a training process for a conversion model shown in accordance with some embodiments of the present description;
FIG. 7 is an exemplary flow diagram illustrating determining target IMU data based on transformed IMU data according to some embodiments of the present description;
FIG. 8 is a schematic diagram of an exemplary architecture of a Transformer model according to some embodiments of the present description.
Detailed Description
To more clearly illustrate the technical solutions of the embodiments of the present specification, the drawings used in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description are only some examples or embodiments of the present specification, and those of ordinary skill in the art can apply the present specification to other similar situations according to these drawings without inventive effort. Unless otherwise apparent from the context or otherwise specified, like reference numerals in the figures refer to like structures or operations.
It will be appreciated that the terms "system," "apparatus," "unit," and/or "module" as used herein are one way of distinguishing between different components, elements, parts, portions, or assemblies at different levels. However, these terms may be replaced by other expressions that achieve the same purpose.
As used in this specification and the claims, the singular forms "a," "an," and "the" do not denote a limitation to the singular and may include the plural, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate the inclusion of explicitly identified steps and elements; these do not constitute an exclusive list, and a method or apparatus may also include other steps or elements.
A flowchart is used in this specification to describe the operations performed by the system according to embodiments of the present specification. It should be understood that the preceding or following operations are not necessarily performed in order precisely. Rather, the various steps may be processed in reverse order or simultaneously. Also, other operations may be added to or removed from these processes.
Embodiments of the present description may be applied to different transportation systems, including, but not limited to, one or a combination of land, river, lake, sea, and aviation systems, for example, automobiles (e.g., small cars, buses, large transport vehicles, etc.), rail transit (e.g., trains, motor cars, high-speed rail, subways, etc.), ships, airplanes, aircraft, hot air balloons, unmanned vehicles, and other transportation systems that employ positioning and/or navigation functions. It should be understood that the application scenarios of the systems and methods of the present description are merely some examples or embodiments of the present description, and the present description can also be applied to other similar scenarios, for example, other similar systems that guide users to park, without undue effort by one of ordinary skill in the art.
FIG. 1 is a schematic diagram of an exemplary application scenario of an exemplary IMU data processing system, according to some embodiments of the present description. In some embodiments, an IMU data processing system may be used to process IMU data. For example, the IMU data processing system may perform data conversion based on scene information, linear information, etc. of the initial IMU data, and further determine target IMU data based on the converted IMU data to improve data accuracy. In some embodiments, as shown in fig. 1, an application scenario 100 of an IMU data processing system may include a vehicle 110, a processing device 120, a user device 130, a storage device 140, and a network 150.
The vehicle 110 may be any type of vehicle. In some embodiments, the vehicle 110 may include a bicycle 110-1, an electric car 110-2, a motorcycle 110-3, an automobile 110-4, or the like, or any combination thereof. In some embodiments, the vehicle 110 may include an autonomous vehicle. In some embodiments, an autonomous vehicle may refer to a vehicle that is capable of achieving a level of driving automation. For example only, the driving automation level may include a first level (i.e., the vehicle is primarily supervised by a person and has a particular autonomous function (e.g., autonomous steering or acceleration)), a second level (i.e., the vehicle has one or more advanced driver assistance systems (ADAS, e.g., adaptive cruise control systems, lane keeping systems) that may control braking, steering, and/or accelerating the vehicle), a third level (i.e., the vehicle is capable of autonomous driving when one or more certain conditions are met), a fourth level (i.e., the vehicle may operate without human input or attention but is still subject to certain restrictions (e.g., limited to a certain area)), a fifth level (i.e., the vehicle may operate autonomously in all circumstances), etc., or any combination thereof.
In some embodiments, taking car 110-4 as an example, vehicle 110 may include the structure of a conventional vehicle, such as a body, wheels, chassis, suspension, steering device (e.g., steering wheel), braking device (e.g., brake pedal), accelerator, etc. In some embodiments, the body may be any type of body, such as a sports car, a sedan, a light truck, a recreational vehicle, a sport utility vehicle (SUV), a minivan, or the like. In some embodiments, the wheels may be configured as all-wheel drive (AWD), front-wheel drive (FWD), rear-wheel drive (RWD), and the like.
In some embodiments, the vehicle 110 may include a detection unit 112 or be in communication with the detection unit 112. In some embodiments, the detection unit 112 may capture data (e.g., speed, acceleration, direction of travel, etc.) related to the vehicle 110 itself and/or the surrounding environment in a moving/stationary state. In some embodiments, the detection unit 112 may include a Global Positioning System (GPS), radar, camera, IMU, or the like, or any combination thereof. As an example, the GPS module may receive geolocation and time information from GPS satellites and determine the geographic location of the vehicle 110; the radar may be configured to scan the surroundings of the vehicle 110 and generate corresponding scan data; the camera may be configured to acquire one or more images related to an object (e.g., a person, an animal, a tree, a roadblock, a building, a vehicle) within range of the camera; the IMU may measure and provide forces, angular rates, accelerations, etc. associated with the vehicle 110 using various inertial sensors.
In some embodiments, the detection unit 112 may also include a sound sensor, an image sensor, a temperature and humidity sensor, a position sensor, a pressure sensor, a distance sensor, a speed sensor, an acceleration sensor, a gravity sensor, a displacement sensor, a moment sensor, an inertia sensor, etc., or any combination thereof.
In some embodiments, after the vehicle 110 obtains the data through the detection unit 112, the data may be transmitted to the processing device 120, the user device 130, and/or the storage device 140 via the network 150.
The processing device 120 can process data and/or information obtained from the vehicle 110, the user device 130, and/or the storage device 140. For example, the processing device 120 may obtain IMU data captured by the detection unit 112 from the vehicle 110 and perform data conversion on the IMU data. For another example, the processing device 120 may obtain sample IMU data from the storage device 140 and train a conversion model based on the sample IMU data. In some embodiments, the processing device 120 may process information and/or data related to IMU data conversion to perform one or more functions described herein. For example, the processing device 120 may process the initial IMU data based on the transformation model to obtain precision-enhanced target IMU data.
In some embodiments, the processing device 120 may be a single server or a group of servers. The server farm may be centralized or distributed (e.g., the processing device 120 may be a distributed system). In some embodiments, the processing device 120 may be local or remote. For example, the processing device 120 may access information and/or data stored in the vehicle 110, the user device 130, and/or the storage device 140 via the network 150. As another example, the processing device 120 may be directly connected to the vehicle 110 and/or the storage device 140 to access stored information and/or data. In some embodiments, the processing device 120 may be implemented by a cloud platform. For example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multiple cloud, or the like, or any combination thereof.
In some embodiments, processing device 120 may include one or more processing devices (e.g., a single-core processor or a multi-core processor). By way of example only, the processing device 120 may include one or more hardware processors, such as a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction set computer (RISC), a microprocessor, or the like, or any combination thereof. In some embodiments, the processing device 120 may be integrated in the vehicle 110.
User device 130 may enable user interaction with vehicle 110, processing device 120, and/or storage device 140. In some embodiments, a user may initiate a service request associated with the vehicle 110 through the user device 130. For example, taking an autonomous vehicle as an example, a user may initiate a vehicle use request through the user device 130. In some embodiments, the user device 130 may include a mobile device 130-1, a tablet 130-2, a notebook 130-3, an in-vehicle device 130-4, a wearable device 130-5, and the like, or any combination thereof. In some embodiments, the mobile device 130-1 may include a smart home device, a smart mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof. The smart mobile device may include a smart phone, a Personal Digital Assistant (PDA), a gaming device, a navigation device, a point-of-sale (POS) device, and the like, or any combination thereof. The virtual reality device or augmented reality device may include a virtual reality helmet, virtual reality glasses, virtual reality eyepieces, augmented reality helmet, augmented reality glasses, augmented reality eyepieces, and the like, or any combination thereof. In some embodiments, the in-vehicle device 130-4 may include an in-vehicle computer, an in-vehicle television, or the like. In some embodiments, the wearable device 130-5 may include smart bracelets, smart footwear, smart glasses, smart helmets, smart watches, smart clothing, smart backpacks, smart accessories, and the like, or any combination thereof. In some embodiments, the user device 130 may be a device with positioning technology for locating the position of the user device 130.
The storage device 140 may store data and/or instructions. In some embodiments, the storage device 140 may store data/information obtained from components of the vehicle 110, the processing device 120, and/or the user device 130. For example, the storage device 140 may store initial IMU data acquired by the detection unit 112. In some embodiments, the storage device 140 may store the trained transformation model, the initial model, and/or training samples for training the initial model. In some embodiments, the storage device 140 may store data and/or instructions that are executed or used by the processing device 120 to perform the exemplary methods described in this specification.
In some embodiments, storage device 140 may include mass storage, removable storage, random access memory (RAM), read-only memory (ROM), and the like, or any combination thereof. Exemplary mass storage may include magnetic disks, optical disks, solid state disks, and the like. Exemplary removable storage may include flash drives, floppy disks, optical disks, memory cards, compact disks, tape, and the like. Exemplary random access memory may include dynamic RAM (DRAM), double data rate synchronous dynamic RAM (DDR SDRAM), static RAM (SRAM), thyristor RAM (T-RAM), zero-capacitor RAM (Z-RAM), and the like. Exemplary read-only memory may include mask ROM (MROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), compact disk ROM (CD-ROM), digital versatile disk ROM, and the like. In some embodiments, storage device 140 may be implemented by a cloud platform. For example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
In some embodiments, the storage device 140 may be connected to the network 150 to communicate with one or more components in the scenario 100 (e.g., the vehicle 110, the processing device 120, the user device 130). One or more components in the scenario 100 may access data or instructions stored in the storage device 140 through the network 150. In some embodiments, the storage device 140 may be directly connected or in communication with one or more components in the scenario 100 (e.g., the vehicle 110, the processing device 120, the user device 130). In some embodiments, the storage device 140 may be part of the vehicle 110 or the processing device 120. In some embodiments, the storage device 140 may be integrated in the vehicle 110.
Network 150 may be used to facilitate the exchange of information and/or data. In some embodiments, one or more components in the scene 100 (e.g., the vehicle 110, the processing device 120, the user device 130, the storage device 140) may send and/or receive information and/or data to/from other components in the scene 100 over the network 150. For example, the processing device 120 may obtain IMU data from the vehicle 110 and/or the storage device 140 over the network 150. In some embodiments, network 150 may be any form of wired or wireless network or any combination thereof. By way of example only, the network 150 may include a cable network, a wired network, a fiber optic network, a telecommunications network, an intranet, the internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth network, a ZigBee network, a near field communication (NFC) network, a global system for mobile communications (GSM) network, a code division multiple access (CDMA) network, a time division multiple access (TDMA) network, a general packet radio service (GPRS) network, an enhanced data rates for GSM evolution (EDGE) network, a wideband code division multiple access (WCDMA) network, a high speed downlink packet access (HSDPA) network, a long term evolution (LTE) network, a user datagram protocol (UDP) network, a transmission control protocol/internet protocol (TCP/IP) network, a short message service (SMS) network, a wireless application protocol (WAP) network, an ultra-wideband (UWB) network, a mobile communications (1G, 2G, 3G, 4G, 5G) network, Wi-Fi, Li-Fi, narrowband Internet of Things (NB-IoT), or the like, or any combination thereof.
In some embodiments, network 150 may include one or more network access points. For example, network 150 may include wired or wireless network access points (e.g., base stations and/or internet exchange points 150-1, 150-2, …) through which one or more components of the scenario 100 may connect to the network 150 to exchange data and/or information.
It should be noted that the above description of IMU data processing systems is for convenience of description only and is not intended to limit the present description to the scope of the illustrated embodiments. It will be understood by those skilled in the art that it is possible, after understanding the principles of the system, to combine the individual components arbitrarily or to construct a subsystem in connection with other components without departing from such principles.
Fig. 2 is a block diagram of an exemplary processing device, according to some embodiments of the present description. As shown in fig. 2, processing device 120 may include a data acquisition module 210, a data conversion module 220, a data determination module 230, and a model training module 240.
The data acquisition module 210 may acquire initial IMU data.
The data conversion module 220 may determine converted IMU data corresponding to the initial IMU data based on the conversion model. In some embodiments, the data conversion module 220 may determine scene information corresponding to the initial IMU data; based on the scene information, a conversion model is selected from a plurality of candidate conversion models, wherein the plurality of candidate conversion models respectively correspond to different candidate scene categories. In some embodiments, the data conversion module 220 may also extract linear information of the initial IMU data; based on the linear information, a conversion model is selected from a plurality of candidate conversion models, wherein the plurality of candidate conversion models respectively correspond to different candidate linear categories.
The data determination module 230 may determine target IMU data based on the converted IMU data, wherein the accuracy of the target IMU data is greater than the accuracy of the initial IMU data. In some embodiments, the data determination module 230 may further obtain reference parameters including at least linear correlation information of the reference IMU data; adjusting the converted IMU data based on the reference parameters; and determining target IMU data based on the adjusted IMU data.
Further details regarding the data acquisition module 210, the data conversion module 220, and the data determination module 230 can be found elsewhere in this specification (e.g., fig. 3-5 and their descriptions), and are not repeated here.
Model training module 240 may train the conversion model. Specifically, a plurality of training samples may be obtained, each of the plurality of training samples including sample IMU data, the accuracy of the sample IMU data being below a preset accuracy threshold; for each training sample in the plurality of training samples, acquiring standard IMU data corresponding to the training sample as a label of the training sample, wherein the precision of the standard IMU data is greater than that of the sample IMU data; the standard IMU data is the same as the sample IMU data in time stamp; based on the plurality of training samples, an initial conversion model is trained through at least one iterative process to determine a conversion model. For each of at least one iteration process, determining predicted IMU data corresponding to each of the plurality of training samples based on the conversion model in the current iteration; based on the difference between the predicted IMU data and the standard IMU data, model parameters of the conversion model in the current iteration are updated. The detailed description of the model training module 240 can be found in other parts of the present specification (e.g., fig. 6 and the description thereof), and will not be repeated here.
It should be appreciated that the processing device and its modules illustrated in fig. 2 may be implemented in a variety of ways. For example, in some embodiments, the processing device and its modules may be implemented in hardware, software, or a combination of software and hardware. The hardware portion may be implemented using dedicated logic, while the software portion may be stored in a memory and executed by a suitable instruction execution system (e.g., a microprocessor or dedicated design hardware). Those skilled in the art will appreciate that the processing devices and modules described above may be implemented by computer-executable instructions. The system of the present specification and its modules may be implemented not only by hardware circuitry such as very-large-scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field-programmable gate arrays and programmable logic devices, but also by software executed by various types of processors, or by a combination of the above hardware circuitry and software (e.g., firmware).
It should be noted that the above description of the processing device 120 and its modules is for convenience of description only and is not intended to limit the present description to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the principles of the system, various modules may be combined arbitrarily or a subsystem may be constructed in connection with other modules without departing from such principles. For example, the data acquisition module 210, the data conversion module 220, the data determination module 230, and the model training module 240 in fig. 2 may be different modules in one device, or the functions of two or more modules may be implemented by one module. For another example, the modules in the processing device 120 may share one memory module, or each module may have its own memory unit. As another example, the model training module 240 may be a separate component rather than a module internal to the processing device 120. Such variations are within the scope of the present description.
Fig. 3 is an exemplary flow chart of an exemplary IMU data processing method shown in accordance with some embodiments of the present description. In some embodiments, one or more steps of the flow 300 may be performed by the processing device 120 shown in fig. 1. For example, one or more steps of flowchart 300 may be stored in the form of instructions in a memory module or other storage device (e.g., storage device 140) of processing device 120 and invoked and/or executed by processing device 120. As shown in fig. 3, the flow 300 may include the following operations. In some embodiments, one or more additional operations not described above may be added and/or one or more operations discussed herein may be pruned when process 300 is performed. In addition, the order of the operations shown in FIG. 3 is not limiting.
In step 310, initial IMU data is acquired. In some embodiments, step 310 may be performed by the data acquisition module 210.
In some embodiments, initial IMU data may be acquired from the detection unit 112. In some embodiments, the initial IMU data may be read from the storage device 140. In some embodiments, the accuracy of the initial IMU data may be relatively low (e.g., below a preset accuracy threshold). For example, the initial IMU data may be acquired by a relatively less accurate IMU. In some embodiments, "precision" may refer to a ratio of measured values to true values (e.g., a ratio of measured angles to true angles). In some embodiments, the initial IMU data may include speed, acceleration, angular velocity, longitude, latitude, altitude, rotational angle (e.g., roll angle, pitch angle, yaw angle), etc., or any combination thereof.
In some embodiments, the initial IMU data may correspond to an acquisition timestamp (e.g., acquisition time). For example, the detection unit 112 may acquire IMU data at predetermined time intervals (e.g., 5ms, 10ms, 20 ms). Accordingly, the initial IMU data may be IMU data acquired by the detection unit 112 at a particular time. In some embodiments, the initial IMU data may correspond to an acquisition period. For example, the initial IMU data may include IMU data corresponding to a plurality of acquisition instants. Accordingly, the initial IMU data may be understood as sequence data corresponding to a series of acquisition instants.
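As a concrete, non-limiting illustration, the initial IMU data for an acquisition period can be viewed as a time-ordered sequence of per-timestamp readings. The following Python sketch shows one possible in-memory representation; the field names and units are illustrative assumptions, not a data layout defined by this specification.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ImuSample:
    timestamp_ms: int                             # acquisition time of this sample
    acceleration: Tuple[float, float, float]      # (ax, ay, az)
    angular_velocity: Tuple[float, float, float]  # (wx, wy, wz)
    roll: float                                   # rotation angles (roll, pitch, yaw)
    pitch: float
    yaw: float

# Initial IMU data for an acquisition period is then a time-ordered sequence of samples,
# e.g. one sample every 10 ms.
ImuSequence = List[ImuSample]
```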
In some embodiments, the initial IMU data may be real-time data. For example, the detection unit 112 may collect data in real time and transmit the data to the data acquisition module 210 in real time. In some embodiments, the initial IMU data may be non-real-time data. For example, the detection unit 112 may collect data and then transmit the data to the storage device 140 for storage. Further, the data acquisition module 210 may access the storage device 140 to read the initial IMU data.
Step 320, determining transformed IMU data corresponding to the initial IMU data based on the transformation model. In some embodiments, step 320 may be performed by data conversion module 220.
In some embodiments, based on the transformation model, the data accuracy, data type, data format, data dimensions, data size, etc. of the initial IMU data may be changed. In some embodiments, the accuracy of the transformed IMU data determined based on the transformation model is higher than the accuracy of the initial IMU data.
In some embodiments, the transformation model may include a machine learning model for sequential data processing. In some embodiments, the transformation model may include a nonlinear model for sequence data processing.
In some embodiments, the conversion model may include a CNN-based machine learning model (e.g., CNN, DCNN, GCNN, VDCNN, etc.), an RNN-based machine learning model (e.g., RNN, LSTM, Bi-LSTM, Bi-LSTM+CRF, Seq2Seq, etc.), an attention-mechanism-based machine learning model (e.g., AT-Seq2Seq, ATAE-LSTM, ABCNN, etc.), or any combination thereof. In some embodiments, the conversion model may include a Transformer-based machine learning model (e.g., Transformer, GPT, BERT, XLM, etc.).
In some embodiments, the conversion model can process sequence data of any length and capture both the features of an individual data point (e.g., the initial IMU data corresponding to a specific acquisition time) and the associations between different data points in the sequence (e.g., the initial IMU data corresponding to different acquisition times). The converted IMU data can therefore reflect not only the characteristics of the IMU data at a specific acquisition time but also the associations between the initial IMU data at different acquisition times, making the processing result more accurate and comprehensive.
In some embodiments, the initial IMU data may be preprocessed prior to entering the initial IMU data into the conversion model. In some embodiments, the pre-processing may include at least one of noise removal, smoothing, and the like.
In some embodiments, the conversion model may include a Transformer model. As an example, a Transformer-based conversion model may include at least one encoder (Encoder) and at least one decoder (Decoder). The encoder or decoder may be composed of a plurality of neural network layers (Layers) having the same structure. For a detailed description of the Transformer model, reference may be made to other parts of the specification (e.g., fig. 8 and its description), which are not repeated here.
In some embodiments, when the initial IMU data is IMU data corresponding to a single acquisition time, the initial IMU data may be input to an encoder, the encoding result of the encoder may be input to a decoder, and the decoding result of the decoder may then be input to one or more neural network layers to obtain the converted IMU data. When the initial IMU data is sequence data (i.e., it includes IMU data corresponding to a plurality of acquisition times), the IMU data corresponding to each acquisition time may be input to a respective encoder to obtain a plurality of encoding results; the first encoding result (i.e., the encoding result of the IMU data corresponding to the first acquisition time) is input to a first decoder to obtain a first decoding result; the second encoding result (i.e., the encoding result of the IMU data corresponding to the second acquisition time) and the first decoding result are input to a second decoder to obtain a second decoding result, and so on, until the last decoding result is obtained; the converted IMU data may then be obtained based on the last decoding result.
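For concreteness, the sequence-to-sequence conversion described above could be sketched in PyTorch roughly as follows. The 9-dimensional per-timestep feature vector, the layer sizes, and the use of nn.Transformer with the low-precision sequence fed as both encoder input and decoder reference are illustrative assumptions, not the exact architecture of this specification.

```python
import torch
import torch.nn as nn

class ImuConversionTransformer(nn.Module):
    def __init__(self, feature_dim: int = 9, d_model: int = 64):
        super().__init__()
        self.embed = nn.Linear(feature_dim, d_model)   # per-timestep input projection
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=4,
            num_encoder_layers=2, num_decoder_layers=2,
            batch_first=True,
        )
        self.head = nn.Linear(d_model, feature_dim)    # map back to IMU feature space

    def forward(self, low_precision_seq: torch.Tensor) -> torch.Tensor:
        # low_precision_seq: (batch, seq_len, feature_dim)
        x = self.embed(low_precision_seq)
        # The low-precision sequence serves both as encoder input and as the decoder's
        # reference sequence in this simplified sketch.
        y = self.transformer(src=x, tgt=x)
        return self.head(y)                            # converted IMU sequence

# Example: converted = ImuConversionTransformer()(torch.randn(2, 50, 9))
```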
In some embodiments, the transformation model may include a one-dimensional convolutional neural network model. As an example, the one-dimensional convolutional neural network model may include an input layer, a convolutional layer, a pooling layer, and the like. In some embodiments, the initial IMU data may be input to the input layer, and relevant features (for example, association relationships between the initial IMU data corresponding to different acquisition moments) may be further extracted by the convolution layer, so as to determine a feature matrix, and the feature matrix may be input to the pooling layer, so as to finally obtain converted IMU data.
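A comparable sketch for the one-dimensional convolutional variant is shown below, again with illustrative channel counts and kernel sizes; note that the pooling layer in this simplified form halves the sequence length.

```python
import torch
import torch.nn as nn

class ImuConversionConv1d(nn.Module):
    def __init__(self, feature_dim: int = 9):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(feature_dim, 32, kernel_size=5, padding=2),  # cross-timestep feature extraction
            nn.ReLU(),
            nn.AvgPool1d(kernel_size=2),                           # pooling layer (halves sequence length)
            nn.Conv1d(32, feature_dim, kernel_size=3, padding=1),  # back to IMU feature space
        )

    def forward(self, seq: torch.Tensor) -> torch.Tensor:
        # seq: (batch, seq_len, feature_dim); Conv1d expects (batch, channels, seq_len)
        return self.net(seq.transpose(1, 2)).transpose(1, 2)
```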
In some embodiments, a conversion model may be selected from a plurality of candidate conversion models for data conversion based on characteristics (e.g., scene information, linear information) of the initial IMU data. For a detailed description of selecting the conversion model, reference may be made to other parts of the present specification (e.g., fig. 4, 5 and descriptions thereof), and no further description is given here.
In some embodiments, a conversion model may also be selected from a plurality of candidate conversion models for data conversion based on travel information of the vehicle 110. In some embodiments, the vehicle travel information may include, but is not limited to, at least one of speed, curvature of the travel path, steering wheel angle, and the like.
In some embodiments, an initial conversion model may be trained based on a plurality of training samples to determine a conversion model. In some embodiments, each training sample may include sample IMU data having an accuracy below a preset accuracy threshold. For each training sample, its corresponding standard IMU data may also be obtained with a precision greater than that of the sample IMU data and for use as a label for the training sample. For a detailed description of training the transformation model, reference may be made to other parts of the present specification (e.g., fig. 6 and description thereof), and no further description is given here.
And step 330, determining target IMU data based on the converted IMU data, wherein the accuracy of the target IMU data is greater than that of the initial IMU data. In some embodiments, step 330 may be performed by data determination module 230.
As described in step 320, the converted IMU data is determined by processing the initial IMU data with the conversion model. In some embodiments, the converted IMU data may be used directly as the target IMU data. In other embodiments, the converted IMU data may be further processed to determine the target IMU data. For example, linear correlation information of the converted IMU data may be extracted and the converted IMU data may be adjusted based on reference parameters to determine the target IMU data. A detailed description of further processing of the converted IMU data can be found elsewhere in this specification (e.g., fig. 7 and its description) and is not repeated here.
It should be noted that the above description of the process 300 is for purposes of example and illustration only and is not intended to limit the scope of applicability of the present disclosure. Various modifications and changes to flow 300 will be apparent to those skilled in the art in light of the present description. However, such modifications and variations are still within the scope of the present description.
FIG. 4 is an exemplary flow diagram illustrating selection of a conversion model based on scene information according to some embodiments of the present description. In some embodiments, one or more steps of flowchart 400 may be performed by processing device 120 shown in fig. 1. For example, one or more steps in flowchart 400 may be stored in the form of instructions in a memory module or other storage device (e.g., storage device 140) of processing device 120 and invoked and/or executed by processing device 120. As shown in fig. 4, the flow 400 may include the following operations. In some embodiments, one or more additional operations not described above may be added and/or one or more operations discussed herein may be pruned when performing flow 400. In addition, the order of the operations shown in FIG. 4 is not limiting.
In step 410, scene information corresponding to the initial IMU data is determined. In some embodiments, step 410 may be performed by data conversion module 220.
In some embodiments, the scene information may include at least one of road conditions, environmental information, road condition information, etc. in which the vehicle 110 travels.
In some embodiments, the road conditions may include at least one of road type (e.g., expressway, urban road, rural road), road flatness (e.g., gentle, steep), road width, road speed limit (e.g., maximum speed per hour no more than 50 km/h), road camber (e.g., curvature), road grade (e.g., angle of inclination), etc.
In some embodiments, the environmental information may include at least one of meteorological conditions (e.g., sunny days, thunderstorms, cloudiness, haze), temperature, humidity, air quality, visibility, wind direction, wind speed, etc.
In some embodiments, the road condition information may include at least one of road congestion conditions (e.g., clear, creep, congestion, severe congestion), road congestion level, average travel speed, estimated congestion time, estimated congestion distance, etc.
In some embodiments, the scene information corresponding to the initial IMU data may be determined in a variety of ways. In some embodiments, the road condition may be determined based on location information of the vehicle 110 and/or the detection unit 112. For example, the road condition corresponding to the corresponding location may be read from the storage device 140 based on the location information. In some embodiments, the environmental information may be determined by various sensors (e.g., temperature sensor, humidity sensor, air quality sensor) included in the detection unit 112. In some embodiments, the environmental information may be read from the storage device 140. In some embodiments, the traffic information may be read from the storage device 140.
In some embodiments, in conjunction with the foregoing, the initial IMU data may be IMU data corresponding to a particular acquisition time or IMU data corresponding to multiple acquisition times. Accordingly, for IMU data corresponding to multiple acquisition times, the corresponding scene information may be the same or different. In some embodiments, for IMU data corresponding to multiple acquisition times, average scene information may be determined. For example, scene information corresponding to a plurality of acquisition times may be averaged as the average scene information. For another example, the scene information corresponding to a time point near the middle of the plurality of acquisition time points may be used as the average scene information.
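As a minimal sketch of the two averaging strategies mentioned above for a sequence of per-timestamp scene descriptors, the following assumes the descriptors are numeric vectors; categorical scene fields would instead need a mode or majority vote, and this helper is purely illustrative.

```python
def average_scene_info(scene_by_time):
    """scene_by_time: time-ordered list of numeric scene feature vectors."""
    n = len(scene_by_time)
    mean_info = [sum(values) / n for values in zip(*scene_by_time)]  # element-wise mean
    middle_info = scene_by_time[n // 2]                              # descriptor near the middle time
    return mean_info, middle_info
```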
Step 420, selecting a conversion model from a plurality of candidate conversion models based on the scene information, wherein the plurality of candidate conversion models respectively correspond to different candidate scene categories. In some embodiments, step 420 may be performed by data conversion module 220.
In some embodiments, the plurality of candidate conversion models may be determined by separately training the initial conversion model for a plurality of different candidate scene categories. For example, assuming that the plurality of candidate scene categories includes scene a, scene B, scene C, and scene D, corresponding training samples may be determined for scene a, scene B, scene C, and scene D, respectively, and the initial conversion model may be trained based on the training samples to determine the corresponding candidate conversion models.
In some embodiments, candidate scene categories may be determined based on road conditions, environmental information, road condition information, and the like. For example, road conditions may be classified into expressways, urban roads, and rural roads, environmental information may be classified into sunny days, cloudy days, haze, and rainy days, road condition information may be classified into clear, general congestion, and severe congestion, and the above respective categories may be arranged and combined accordingly to determine candidate scene categories. The above examples are merely illustrative, and various pieces of information may be classified in different dimensions or angles, and candidate scene categories in different dimensions or angles may be determined accordingly.
In some embodiments, the number of candidate scene categories may be a system default or may be set according to different situations. The greater the number of candidate scene categories, the more detailed the classification of the scene, and the more accurate the corresponding processing of the initial IMU data.
In some embodiments, the corresponding candidate scene category may be identified based on the scene information (or average scene information) corresponding to the initial IMU data, and the candidate conversion model corresponding to the candidate scene category may be used as the conversion model. In some embodiments, there may be no candidate scene category that corresponds exactly to the scene information (or average scene information) corresponding to the initial IMU data, in which case a candidate scene category similar to or near one or more of the scene information (or average scene information) may be selected.
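A minimal sketch of this selection step, including the fallback to a similar candidate scene category when no exact match exists, is shown below; the category fields and the field-overlap similarity measure are assumptions made for illustration.

```python
def select_conversion_model(scene_info, candidate_models):
    """candidate_models maps a (road, weather, traffic) tuple to a trained model."""
    key = (scene_info.get("road"), scene_info.get("weather"), scene_info.get("traffic"))
    if key in candidate_models:
        return candidate_models[key]
    # Fall back to the candidate scene category sharing the most fields with the query.
    def overlap(candidate_key):
        return sum(a == b for a, b in zip(candidate_key, key))
    best_key = max(candidate_models, key=overlap)
    return candidate_models[best_key]

# Usage:
# model = select_conversion_model({"road": "urban", "weather": "sunny", "traffic": "clear"},
#                                 candidate_models)
```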
Because scene information (such as road conditions, environmental information, and road condition information) can influence the characteristics of the acquired data, a corresponding conversion model is trained for each scene category. When IMU data is processed or converted, the conversion model matching the scene information can then be selected, making the conversion or processing result more accurate.
It should be noted that the above description of the process 400 is for purposes of illustration and description only, and is not intended to limit the scope of applicability of the present disclosure. Various modifications and changes to flow 400 will be apparent to those skilled in the art in light of the present description. However, such modifications and variations are still within the scope of the present description.
FIG. 5 is an exemplary flow chart for selecting a conversion model based on linear information according to some embodiments of the present description. In some embodiments, one or more steps of the flow 500 may be performed by the processing device 120 shown in fig. 1. For example, one or more steps of flowchart 500 may be stored in the form of instructions in a memory module or other storage device (e.g., storage device 140) of processing device 120 and invoked and/or executed by processing device 120. As shown in fig. 5, the flow 500 may include the following operations. In some embodiments, one or more additional operations not described above may be added and/or one or more operations discussed herein may be pruned when performing flow 500. In addition, the order of the operations shown in FIG. 5 is not limiting.
At step 510, linear information of the initial IMU data is extracted. In some embodiments, step 510 may be performed by data conversion module 220.
In some embodiments, the linear information of the initial IMU data may include at least one of a linear interval corresponding to the initial IMU data, a size of the linear interval, a distribution of the linear interval, and the like.
In some embodiments, in conjunction with the foregoing, the initial IMU data may be IMU data corresponding to a particular acquisition time or IMU data corresponding to multiple acquisition times. Accordingly, for IMU data corresponding to multiple acquisition times, the corresponding linear information may be the same or different. In some embodiments, average linear information may be determined for IMU data corresponding to a plurality of acquisition times. For example, the linear information corresponding to a plurality of acquisition times may be averaged as the average linear information. For another example, the linear information corresponding to a time point near the middle of the plurality of acquisition time points may be used as the average linear information.
Step 520, selecting a conversion model from a plurality of candidate conversion models based on the linear information, wherein the plurality of candidate conversion models respectively correspond to different candidate linear categories. In some embodiments, step 520 may be performed by data conversion module 220.
In some embodiments, the plurality of candidate transformation models may be determined by separately training the initial transformation model for a plurality of different candidate linear categories. For example, assuming that the plurality of candidate linear categories includes category a, category B, category C, and category D, a corresponding training sample may be determined for category a, category B, category C, and category D, respectively, and the initial conversion model may be trained based on the training sample to determine a corresponding candidate conversion model.
In some embodiments, the candidate linear category may be determined based on the size of the linear interval and/or the distribution of the linear interval, etc. For example, linear intervals ranging from a to b, from b to c, and from c to d may be classified into a first category, a second category, and a third category according to the size of the linear interval. The above examples are merely illustrative; the linear intervals may also be classified along different dimensions or from different angles, and candidate linear categories of different dimensions or angles may be determined accordingly.
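As an illustration of such a classification, the sketch below bins the size of the linear interval into candidate linear categories and picks the corresponding model, clamping to the nearest category when the value falls outside the listed ranges; the bin edges are purely illustrative assumptions.

```python
import bisect

def select_linear_category(linear_interval_size, bin_edges=(0.1, 0.5, 1.0)):
    """Return the index of the candidate linear category (first, second, third, ...)."""
    return bisect.bisect_left(bin_edges, linear_interval_size)

def select_model_by_linearity(linear_interval_size, candidate_models):
    # candidate_models[i] is the conversion model trained for linear category i;
    # clamp to the last category when the size exceeds the largest bin edge.
    idx = min(select_linear_category(linear_interval_size), len(candidate_models) - 1)
    return candidate_models[idx]
```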
In some embodiments, the number of candidate linear categories may be a system default or may be set according to different circumstances. The greater the number of candidate linear categories, the more detailed the classification of the linear intervals and the more accurate the corresponding processing of the initial IMU data.
In some embodiments, the corresponding candidate linear class may be identified based on the linear information (or average linear information) corresponding to the initial IMU data, and the candidate conversion model corresponding to the candidate linear class may be used as the conversion model. In some embodiments, there may be no candidate linear category that corresponds exactly to the linear information (or average linear information) corresponding to the initial IMU data, in which case a candidate linear category similar to or close to one or more of the linear information (or average linear information) may be selected.
Because the linear information reflects, to a certain extent, the accuracy of the initial IMU data, a corresponding conversion model is trained for each linear category. When IMU data is processed or converted, the conversion model matching the linear information can then be selected, making the conversion or processing result more accurate.
It should be noted that the above description of the process 500 is for purposes of illustration and description only, and is not intended to limit the scope of applicability of the present disclosure. Various modifications and changes to flow 500 will be apparent to those skilled in the art in light of the present description. However, such modifications and variations are still within the scope of the present description.
FIG. 6 is an exemplary flow chart of a training process for a conversion model shown in accordance with some embodiments of the present description. In some embodiments, one or more steps of flowchart 600 may be performed by processing device 120 shown in fig. 1. For example, one or more steps in flowchart 600 may be stored in the form of instructions in a memory module or other storage device (e.g., storage device 140) of processing device 120 and invoked and/or executed by processing device 120. As shown in fig. 6, the flow 600 may include the following operations. In some embodiments, one or more additional operations not described above may be added and/or one or more operations discussed herein may be eliminated when the flow 600 is performed. In addition, the order of the operations shown in FIG. 6 is not limiting.
At step 610, a plurality of training samples are obtained. In some embodiments, step 610 may be performed by model training module 240.
In some embodiments, each of the plurality of training samples may include sample IMU data having a precision below a preset precision threshold. In some embodiments, the sample IMU data may be data acquired by a relatively low-precision IMU. In some embodiments, the preset precision threshold may be a default value of the system, or may be set according to different requirements under different situations.
In some embodiments, a plurality of training samples may be obtained from a memory module of processing device 120 or memory device 140.
Step 620, for each training sample in the plurality of training samples, obtaining standard IMU data corresponding to the training sample as a label of the training sample, where the accuracy of the standard IMU data is greater than the accuracy of the sample IMU data. In some embodiments, step 620 may be performed by model training module 240.
In some embodiments, the standard IMU data may be data acquired by a relatively high-precision IMU. In some embodiments, the standard IMU data is the same timestamp as the sample IMU data. That is, the time or time period of acquisition of the standard IMU data is the same or substantially the same as the time or time period of acquisition of the sample IMU data. By way of example only, a vehicle having a low-precision IMU and a high-precision IMU mounted thereon may be driven on an urban road, the low-precision IMU and the high-precision IMU may simultaneously acquire data of a speed, an acceleration, etc. of the vehicle at a plurality of times, the low-precision IMU data acquired at each time being sample IMU data, the high-precision IMU data being standard IMU data.
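The pairing of sample and standard IMU data by timestamp might be sketched as follows; the small timestamp tolerance is an assumption added to allow for near-simultaneous readings from the two IMUs.

```python
def build_training_pairs(low_precision, high_precision, tolerance_ms=1):
    """Each argument is a time-ordered list of (timestamp_ms, feature_vector) tuples."""
    standard_by_time = {t: f for t, f in high_precision}
    pairs = []
    for t, features in low_precision:
        # Accept an exact or near-exact timestamp match between the two IMUs.
        for dt in range(-tolerance_ms, tolerance_ms + 1):
            if t + dt in standard_by_time:
                pairs.append((features, standard_by_time[t + dt]))  # (sample, label)
                break
    return pairs
```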
Step 630, training the initial conversion model through at least one iterative process based on the plurality of training samples, determining the conversion model. In some embodiments, step 630 may be performed by model training module 240.
In some embodiments, the initial conversion model may include initial model parameters. In some embodiments, the initial model parameters may be system default values or may be set according to different situations. For example only, the initial model may be a Transformer model whose initial model parameters (e.g., the weights and constant terms of individual neurons in the neural network layers) are set before training begins.
In some embodiments, for each of the at least one iteration, predicted IMU data corresponding to each of the plurality of training samples may be determined based on the conversion model in the current iteration, and the model parameters of the conversion model in the current iteration may be updated based on the differences between the predicted IMU data and the standard IMU data. Specifically, the value of a loss function (e.g., cross-entropy, KL-divergence) may be determined based on the difference between the predicted IMU data and the standard IMU data, and the model parameters of the conversion model may be optimized based on the value of the loss function. For example, the model parameters of the conversion model in the current iteration may be updated by a back-propagation algorithm based on the following formula:
w′_m = w_m − γ·(∂Loss/∂w_m)
where m indexes the neurons in the neural network, w′_m represents the model parameter (e.g., weight) of the m-th neuron after the update, w_m represents the model parameter before the update, γ represents the learning rate, and Loss represents the loss function.
In some embodiments, the entire training process ends when a preset condition is met (e.g., the value of the loss function is less than a preset value, or the number of iterations reaches a predetermined count).
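A minimal training-loop sketch under stated assumptions is given below. It is not the patent's implementation: PyTorch, the mean-squared-error loss (the text above names cross-entropy and KL divergence as examples), plain SGD, and the stopping threshold are all illustrative choices; optimizer.step() applies the update w′_m = w_m − γ·(∂Loss/∂w_m) shown above.

import torch
import torch.nn as nn

def train_conversion_model(model, loader, lr=1e-3, max_iters=1000, loss_eps=1e-4):
    criterion = nn.MSELoss()                                  # difference between predicted and standard IMU data
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)    # gradient-descent update of the model parameters
    for _ in range(max_iters):
        loss = None
        for sample_imu, standard_imu in loader:               # (sample IMU data, standard IMU data label)
            predicted_imu = model(sample_imu)                 # predicted IMU data under the current parameters
            loss = criterion(predicted_imu, standard_imu)
            optimizer.zero_grad()
            loss.backward()                                   # back-propagation
            optimizer.step()
        if loss is not None and loss.item() < loss_eps:       # preset condition: loss below a preset value
            break
    return model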
In some embodiments, different conversion models may be trained for different scene categories. Specifically, training samples for different scene categories may be obtained, and based on the corresponding training samples, a conversion model corresponding to the different scene categories may be trained. In some embodiments, different conversion models may be trained for different linear categories. Specifically, training samples for different linear categories may be obtained, and based on the corresponding training samples, a conversion model corresponding to the different linear categories may be trained. In some embodiments, different conversion models may be trained for different travel information. Specifically, training samples for different driving information may be obtained, and based on the corresponding training samples, a conversion model corresponding to the different driving information may be trained. In some embodiments, different transformation models may also be trained for other categories, such as the type of IMU, the use of IMU, etc.
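As a sketch only, keeping one conversion model per category (scene category, linear category, driving information, and so on) reduces to training and storing the models in a mapping; the helper names below (make_model, train_fn) and the category keys are assumptions, not part of the patent.

def train_models_per_category(samples_by_category, make_model, train_fn):
    # samples_by_category: dict mapping a category (e.g., "urban road") to its training data loader
    models = {}
    for category, loader in samples_by_category.items():
        models[category] = train_fn(make_model(), loader)
    return models

# At inference time, selecting the conversion model for the identified category is a lookup:
# conversion_model = models[scene_category]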
It should be noted that the above description of the process 600 is for purposes of example and illustration only and is not intended to limit the scope of applicability of the present disclosure. Various modifications and changes to flow 600 will be apparent to those skilled in the art in light of the present description. However, such modifications and variations are still within the scope of the present description.
FIG. 7 is an exemplary flowchart illustrating the determination of target IMU data based on converted IMU data according to some embodiments of the present description. In some embodiments, one or more steps of flow 700 may be performed by processing device 120 shown in FIG. 1. For example, one or more steps of flow 700 may be stored in the form of instructions in a memory module or other storage device (e.g., storage device 140) of processing device 120 and invoked and/or executed by processing device 120. As shown in FIG. 7, the flow 700 may include the following operations. In some embodiments, when flow 700 is performed, one or more additional operations not described below may be added and/or one or more operations discussed herein may be eliminated. In addition, the order of the operations shown in FIG. 7 is not limiting.
In step 710, reference parameters are obtained. In some embodiments, step 710 may be performed by the data determination module 230.
In some embodiments, the reference parameters may include at least linear-related information of reference IMU data. In some embodiments, the relevant parameters of the reference IMU data are similar or identical to the relevant parameters of the initial IMU data. In some embodiments, the accuracy of the reference IMU data is higher than the accuracy of the initial IMU data. In some embodiments, the relevant parameters may include at least one of IMU type, scene information, noise label, operation mode, and the like. In some embodiments, the reference IMU data may be high-precision ideal IMU data corresponding to the initial IMU data.
In some embodiments, the reference IMU data may be calculated by the processing device based on the above-described relevant parameters. In some embodiments, the reference IMU data may be actually acquired by a relatively high-precision IMU.
In some embodiments, the linear-related information of the reference IMU data may include at least one of a linear interval, a size of the linear interval, a distribution of the linear interval, and the like, corresponding to the reference IMU data.
At step 720, the converted IMU data is adjusted based on the reference parameters. In some embodiments, step 720 may be performed by the data determination module 230.
In some embodiments, the converted IMU data may be linearly fitted based on the linear-related information of the reference IMU data. In some embodiments, the linear information of the converted IMU data may be adjusted based on the linear-related information of the reference IMU data. For example, the size of the linear interval or the distribution of the linear interval of the converted IMU data may be adjusted (e.g., increased) based on the linear-related information of the reference IMU data.
In step 730, the target IMU data is determined based on the adjusted IMU data. In some embodiments, step 730 may be performed by the data determination module 230.
In some embodiments, the adjusted IMU data may be directly used as the target IMU data. In some embodiments, post-processing such as smoothing and de-blurring may be performed on the adjusted IMU data to determine target IMU data.
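A hedged sketch of steps 720-730 follows, for illustration only. NumPy, mapping the converted data into the reference linear interval as the "adjustment", and a moving-average filter as the smoothing post-processing are all assumptions; the patent does not prescribe a specific fitting or smoothing method.

import numpy as np

def adjust_to_reference(converted, ref_interval):
    # Map the converted IMU samples into the reference linear interval [lo, hi] (step 720).
    lo, hi = ref_interval
    c_lo, c_hi = float(converted.min()), float(converted.max())
    if c_hi == c_lo:
        return np.full_like(converted, (lo + hi) / 2.0)
    return lo + (converted - c_lo) * (hi - lo) / (c_hi - c_lo)

def determine_target(converted, ref_interval, smooth_window=5):
    # Adjust, then smooth, to obtain the target IMU data (step 730).
    adjusted = adjust_to_reference(np.asarray(converted, dtype=float), ref_interval)
    kernel = np.ones(smooth_window) / smooth_window
    return np.convolve(adjusted, kernel, mode="same")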
In some embodiments, the converted IMU data determined by the conversion model is further processed based on the reference IMU data, so that the linear interval of the IMU data can be further adjusted and the data accuracy further improved.
In some embodiments, the converted IMU data may be adjusted at preset time intervals. For example, the converted IMU data may be adjusted based on the reference parameters every 1 minute; at other times, the converted IMU data may be directly taken as the target IMU data. In this way, the operation or calculation load of the system can be reduced while the precision of the IMU data is ensured.
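Continuing the sketch above (again illustrative; the 60-second period and the reuse of the determine_target helper are assumptions), adjusting only at a preset interval could look like this:

def maybe_adjust(converted, ref_interval, t_now, state, period_s=60.0):
    # state: dict holding the time of the last adjustment; mutated in place
    if t_now - state.get("last_adjust", float("-inf")) >= period_s:
        state["last_adjust"] = t_now
        return determine_target(converted, ref_interval)   # adjust based on the reference parameters
    return converted                                        # otherwise pass through as the target IMU data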
FIG. 8 is a schematic diagram of an exemplary architecture of a Transformer model according to some embodiments of the present description.
As shown in FIG. 8, the Transformer model 800 may be composed of a plurality of identical encoders (Encoder) and a plurality of identical decoders (Decoder). The input of the Transformer model is the initial IMU data, and the output is the converted IMU data.
In some embodiments, when the initial IMU data is IMU data corresponding to a specific acquisition time, the initial IMU data may be input to an encoder, the encoding result of the encoder may be input to a decoder, and the decoding result of the decoder may further be input to one or more neural network layers to obtain the converted IMU data.
In some embodiments, when the initial IMU data is sequence data (i.e., includes IMU data corresponding to a plurality of acquisition moments), the IMU data corresponding to the plurality of acquisition moments may be respectively input to a plurality of encoders to obtain a plurality of encoding results. The first encoding result (i.e., the encoding result of the IMU data corresponding to the first acquisition moment) is input to a first decoder to obtain a first decoding result; the second encoding result (i.e., the encoding result of the IMU data corresponding to the second acquisition moment) and the first decoding result are input to a second decoder to obtain a second decoding result, and so on, until the last decoding result is obtained. The converted IMU data may then be obtained based on the last decoding result.
In some embodiments, each encoder may include a multi-layer structure (e.g., multiple neural network layers). Each layer may include two sub-layers: a first sub-layer based on a multi-head attention mechanism (Multi-head attention) and a second sub-layer that is a fully connected feed-forward network (Fully connected feed-forward network). Further, a residual connection (Residual connection) and a normalization module (Normalization) are added to each sub-layer.
In some embodiments, each decoder may also include a multi-layer structure (e.g., multiple neural network layers). Similar to the encoder, each layer may include two sub-layers: a first sub-layer based on a multi-head attention mechanism (Multi-head attention) and a second sub-layer that is a fully connected feed-forward network (Fully connected feed-forward network). In addition, each layer may also include a third sub-layer based on a masked multi-head attention mechanism (Masked multi-head attention).
With this model structure, sequence data can be processed efficiently and accurately, and the associations between IMU data corresponding to different acquisition moments can be extracted accurately, making the conversion result more accurate.
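For illustration only, a conversion model of this kind might be sketched as below. The choices here (PyTorch's nn.Transformer, a 6-dimensional IMU sample of 3-axis acceleration plus 3-axis angular rate, the model dimension and layer counts, and feeding the embedded input sequence to both encoder and decoder) are assumptions and not the patent's concrete design.

import torch
import torch.nn as nn

class IMUConversionTransformer(nn.Module):
    def __init__(self, imu_dim=6, d_model=64, nhead=4, num_layers=3):
        super().__init__()
        self.embed = nn.Linear(imu_dim, d_model)            # project each IMU sample into the model dimension
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            dim_feedforward=128, batch_first=True)
        self.head = nn.Linear(d_model, imu_dim)             # map decoder outputs back to IMU dimensions

    def forward(self, initial_imu_seq):
        # initial_imu_seq: (batch, seq_len, imu_dim), IMU data at multiple acquisition moments
        x = self.embed(initial_imu_seq)
        y = self.transformer(src=x, tgt=x)                  # encoder-decoder processing of the sequence
        return self.head(y)                                 # converted IMU data

# Usage sketch:
# model = IMUConversionTransformer()
# converted = model(torch.randn(2, 100, 6))                # two sequences of 100 IMU readings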
Some embodiments of the present specification also provide an IMU data processing apparatus including at least one storage medium and at least one processor. The at least one storage medium stores computer instructions; the at least one processor executes the computer instructions to implement the IMU data processing method described above.
Some embodiments of the present description also provide a computer-readable storage medium storing computer instructions. When a computer reads the computer instructions in the storage medium, the computer performs the IMU data processing method described above.
Possible benefits of embodiments of the present description include, but are not limited to: (1) low-precision IMU data is converted into high-precision IMU data through the conversion model, so the IMU device does not need to be replaced, the cost is low, and large-scale application is facilitated; (2) data conversion is performed through a nonlinear conversion model, so the association relations within sequence data can be extracted more accurately, making the conversion result more accurate; (3) different conversion models are selected according to different scenes or different data linear categories, further improving the accuracy of the conversion result; (4) the converted data is further processed in combination with ideal (reference) data, further improving data precision.
It should be noted that, the advantages that may be generated by different embodiments may be different, and in different embodiments, the advantages that may be generated may be any one or a combination of several of the above, or any other possible advantages that may be obtained.
The foregoing describes the present specification and/or some other examples. In light of the above, the present description can also be variously modified. The subject matter disclosed herein is capable of being embodied in various forms and examples and is capable of being used in a variety of applications. All applications, modifications and variations that are claimed in the following claims fall within the scope of the present specification.
Meanwhile, the specification uses specific words to describe the embodiments of the specification. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic is associated with at least one embodiment of the present description. Thus, it should be emphasized and should be appreciated that two or more references to "an embodiment", or "one embodiment", or "an alternative embodiment", or "another embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the present description may be combined as suitable.
Those skilled in the art will appreciate that various modifications and improvements to the disclosure herein may be made. For example, the different system components described above are described as being implemented by hardware devices, but they may also be implemented by software-only solutions, for example, by installing the described system on an existing server. Furthermore, the provision of location information as disclosed herein may be implemented by firmware, a combination of firmware and software, a combination of firmware and hardware, or a combination of hardware, firmware, and software.
All or a portion of the software may sometimes communicate over a network, such as the Internet or another communication network. Such communication enables the software to be loaded from one computer device or processor to another, for example, from a management server or host computer onto the hardware platform of the computing environment in which the system described herein is implemented, or onto another system providing similar functionality. Accordingly, other media capable of carrying software elements, such as optical, electrical, and electromagnetic waves propagating through cables, optical fibers, or the air, may also be used as physical connections between local devices. Physical media used for carrier waves, such as electrical cables, wireless links, or optical links, may also be considered software-bearing media. Unless limited to a tangible "storage" medium, other terms used herein to refer to a computer- or machine-"readable medium" mean any medium that participates in the execution of any instructions by a processor.
The computer program code necessary for the operation of portions of the present description may be written in any one or more programming languages, including an object-oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, or Python; a conventional procedural programming language such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, or ABAP; a dynamic programming language such as Python, Ruby, or Groovy; or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any form of network, for example, a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet), in a cloud computing environment, or as a service, for example, software as a service (SaaS).
Furthermore, the order in which the elements and sequences are processed, the use of numerical letters, or other designations in the description are not intended to limit the order in which the processes and methods of the description are performed unless explicitly recited in the claims. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure, by way of various examples, it is to be understood that such details are merely illustrative and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements included within the spirit and scope of the embodiments of the present disclosure. For example, while the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as installing the described system on an existing server or mobile device.
Likewise, it should be noted that, in order to simplify the presentation of the disclosure in this specification and thereby aid in the understanding of one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure does not imply that the subject matter of the present description requires more features than are recited in the claims. Indeed, the claimed subject matter may lie in less than all features of a single embodiment disclosed above.
In some embodiments, numbers describing attributes or quantities are used; it should be understood that such numbers used in the description of embodiments are modified in some examples by the modifiers "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the number allows a variation of 20%. Accordingly, in some embodiments, the numerical parameters set forth in the specification and claims are approximations that may vary depending upon the desired properties sought by individual embodiments. In some embodiments, the numerical parameters should take into account the specified number of significant digits and employ ordinary rounding methods. Although the numerical ranges and parameters set forth in some embodiments of this specification are approximations, in specific embodiments such numerical values are set as precisely as practicable.
Each patent, patent application publication, and other material, such as articles, books, specifications, publications, and documents, referred to in this specification is hereby incorporated by reference in its entirety. Excluded are application history documents that are inconsistent with or conflict with the contents of this specification, as well as documents (currently or later attached to this specification) that limit the broadest scope of the claims of this specification. It is noted that, if the description, definition, and/or use of a term in material attached to this specification is inconsistent with or conflicts with what is described in this specification, the description, definition, and/or use of the term in this specification controls.
Finally, it should be understood that the embodiments described in this specification are merely illustrative of the principles of the embodiments of this specification. Other variations are possible within the scope of this description. Thus, by way of example, and not limitation, alternative configurations of embodiments of the present specification may be considered as consistent with the teachings of the present specification. Accordingly, the embodiments of the present specification are not limited to the embodiments explicitly described and depicted in the present specification.

Claims (9)

1. A method for processing Inertial Measurement Unit (IMU) data, the method comprising:
acquiring initial IMU data;
determining scene information corresponding to the initial IMU data;
selecting a conversion model from a plurality of candidate conversion models based on the scene information, wherein the plurality of candidate conversion models respectively correspond to different candidate scene categories;
determining converted IMU data corresponding to the initial IMU data based on the conversion model; and
determining target IMU data based on the converted IMU data, wherein the accuracy of the target IMU data is greater than the accuracy of the initial IMU data.
2. The method of claim 1, wherein the conversion model comprises a Transformer model.
3. The method according to claim 1, wherein the method further comprises:
extracting linear information of the initial IMU data, wherein the linear information of the initial IMU data comprises at least one of a linear interval corresponding to the initial IMU data, a size of the linear interval, and a distribution of the linear interval; and
selecting the conversion model from a plurality of candidate conversion models based on the linear information, wherein the plurality of candidate conversion models respectively correspond to different candidate linear categories, and the candidate linear categories are determined based on the size of the linear interval and/or the distribution of the linear interval.
4. The method of claim 1, wherein the conversion model is determined by:
acquiring a plurality of training samples, wherein each training sample in the plurality of training samples comprises sample IMU data, and the precision of the sample IMU data is lower than a preset precision threshold;
for each training sample in the plurality of training samples, obtaining standard IMU data corresponding to the training sample as a label of the training sample, wherein
the accuracy of the standard IMU data is greater than the accuracy of the sample IMU data, and
the standard IMU data has the same timestamp as the sample IMU data; and
training an initial conversion model through at least one iterative process based on the plurality of training samples to determine the conversion model.
5. The method of claim 4, wherein training an initial conversion model through at least one iterative process based on the plurality of training samples, determining the conversion model comprises:
for each of the at least one iterative process,
determining, based on the conversion model in the current iteration, predicted IMU data respectively corresponding to the plurality of training samples; and
updating model parameters of the conversion model in the current iteration based on differences between the predicted IMU data and the standard IMU data.
6. The method of claim 1, wherein the determining target IMU data based on the converted IMU data comprises:
acquiring a reference parameter, wherein the reference parameter comprises at least linear-related information of reference IMU data, and the linear-related information of the reference IMU data comprises at least one of a linear interval corresponding to the reference IMU data, a size of the linear interval, and a distribution of the linear interval;
adjusting the converted IMU data based on the reference parameter; and
determining the target IMU data based on the adjusted IMU data.
7. A system for processing Inertial Measurement Unit (IMU) data, the system comprising:
a data acquisition module configured to acquire initial IMU data;
a data conversion module configured to: determine scene information corresponding to the initial IMU data;
select a conversion model from a plurality of candidate conversion models based on the scene information, wherein the plurality of candidate conversion models respectively correspond to different candidate scene categories; and
determine converted IMU data corresponding to the initial IMU data based on the conversion model; and
a data determination module configured to determine target IMU data based on the converted IMU data, wherein the accuracy of the target IMU data is greater than the accuracy of the initial IMU data.
8. An apparatus for processing Inertial Measurement Unit (IMU) data, the apparatus comprising:
at least one storage medium storing computer instructions;
At least one processor executing the computer instructions to implement the method of any one of claims 1-6.
9. A computer-readable storage medium storing computer instructions which, when read by a computer, cause the computer to perform the method of any one of claims 1-6.
CN202110564425.5A 2021-05-24 2021-05-24 IMU data processing method, system, device and storage medium Active CN113252058B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110564425.5A CN113252058B (en) 2021-05-24 2021-05-24 IMU data processing method, system, device and storage medium
PCT/CN2022/080139 WO2022247392A1 (en) 2021-05-24 2022-03-10 Method, system and device for processing imu data, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110564425.5A CN113252058B (en) 2021-05-24 2021-05-24 IMU data processing method, system, device and storage medium

Publications (2)

Publication Number Publication Date
CN113252058A CN113252058A (en) 2021-08-13
CN113252058B true CN113252058B (en) 2024-06-28

Family

ID=77183876

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110564425.5A Active CN113252058B (en) 2021-05-24 2021-05-24 IMU data processing method, system, device and storage medium

Country Status (2)

Country Link
CN (1) CN113252058B (en)
WO (1) WO2022247392A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113252058B (en) * 2021-05-24 2024-06-28 北京航迹科技有限公司 IMU data processing method, system, device and storage medium
CN114266013B (en) * 2021-12-31 2024-05-28 重庆大学 Transmission system vibration decoupling method based on deep learning virtual perception network

Citations (1)

Publication number Priority date Publication date Assignee Title
CN111811507A (en) * 2020-04-08 2020-10-23 北京嘀嘀无限科技发展有限公司 Method and device for determining posture of mobile equipment, storage medium and electronic equipment

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
US20100023265A1 (en) * 2008-07-24 2010-01-28 Gm Global Technology Operations, Inc. Adaptive vehicle control system with integrated driving style recognition
CN107148553A (en) * 2014-08-01 2017-09-08 迈克尔·科伦贝格 Method and system for improving Inertial Measurement Unit sensor signal
CN108921013B (en) * 2018-05-16 2020-08-18 浙江零跑科技有限公司 Visual scene recognition system and method based on deep neural network
CN109099910A (en) * 2018-06-29 2018-12-28 广东星舆科技有限公司 High Accuracy Inertial Navigation System and implementation method based on inertial navigation unit array
CN109284374B (en) * 2018-09-07 2024-07-05 百度在线网络技术(北京)有限公司 Method, apparatus, device and computer readable storage medium for determining entity class
US11195030B2 (en) * 2018-09-14 2021-12-07 Honda Motor Co., Ltd. Scene classification
US11205112B2 (en) * 2019-04-01 2021-12-21 Honeywell International Inc. Deep neural network-based inertial measurement unit (IMU) sensor compensation method
CN110232335A (en) * 2019-05-24 2019-09-13 国汽(北京)智能网联汽车研究院有限公司 Driving Scene classification method and electronic equipment
CN111572555B (en) * 2020-04-28 2021-09-14 东风汽车集团有限公司 Self-learning auxiliary driving method
CN112381303A (en) * 2020-11-19 2021-02-19 北京嘀嘀无限科技发展有限公司 Task index data prediction method and system
CN113252058B (en) * 2021-05-24 2024-06-28 北京航迹科技有限公司 IMU data processing method, system, device and storage medium

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
CN111811507A (en) * 2020-04-08 2020-10-23 北京嘀嘀无限科技发展有限公司 Method and device for determining posture of mobile equipment, storage medium and electronic equipment

Also Published As

Publication number Publication date
CN113252058A (en) 2021-08-13
WO2022247392A1 (en) 2022-12-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant