CN116405145A - Data interaction synchronization method and system suitable for mobile scene - Google Patents
- Publication number
- CN116405145A (application number CN202211577641.4A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04J—MULTIPLEX COMMUNICATION
- H04J3/00—Time-division multiplex systems
- H04J3/02—Details
- H04J3/06—Synchronising arrangements
- H04J3/0635—Clock or time synchronisation in a network
- H04J3/0638—Clock or time synchronisation among nodes; Internode synchronisation
- H04J3/0658—Clock or time synchronisation among packet nodes
- H04J3/0661—Clock or time synchronisation among packet nodes using timestamps
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/10—Protocols in which an application is distributed across nodes in the network
- H04L67/1095—Replication or mirroring of data, e.g. scheduling or transport for data synchronisation between network nodes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Abstract
The application provides a data interaction synchronization method and system applicable to a mobile scene, relating to the technical field of data interaction synchronization. The method comprises the following steps: a vehicle-mounted terminal acquires vehicle data and the acquisition time stamp of the vehicle data in a mobile scene, converts the vehicle data and its acquisition time stamp into a target communication data format, and periodically transmits the acquired data to the XR front end according to a target communication protocol; the XR front end receives and analyzes the data from the vehicle-mounted terminal, takes the time at which the data are received as the data receiving time stamp, performs data prediction and error filtering according to the analyzed vehicle data and its acquisition time stamp, and completes data interaction synchronization with the vehicle-mounted terminal based on the data receiving time stamp and the analyzed acquisition time stamp of the vehicle data. The invention realizes real-time posture synchronization, data synchronization and time synchronization between the XR front end and the intelligent automobile, thereby eliminating or reducing the dizziness experienced when the XR end plays in a moving vehicle scene.
Description
Technical Field
The invention belongs to the technical field of data interaction synchronization, and particularly relates to a data interaction synchronization method and system applicable to a mobile scene.
Background
Automobile simulated driving is also known as automobile driving simulation. An artificial environment is constructed by high-tech means such as real-time three-dimensional image generation, an automobile dynamics simulation physical system, large-field-of-view display technology (such as a multi-channel stereo projection system), a six-degree-of-freedom (6-DoF) or three-degree-of-freedom (3-DoF) motion platform, a user input hardware system, a stereo sound system and a central control system. Virtual driving allows an experimenter to feel visual, auditory and somatosensory car driving experiences close to real effects in a virtual driving environment. The method has the advantages of vivid driving simulation, energy conservation, safety, economy, freedom from limitations of time, climate and place, high driving training efficiency and a short training period.
However, in the process of simulated driving, in order to perceive the moving environment of the vehicle, different sensors are used to perform environment perception in multiple ways, and the result is then presented to a tester through virtual reality and augmented reality devices. The XR end refers to an environment, generated by technologies such as computing and artificial intelligence together with a wearable device, in which the real and virtual worlds are combined and human-machine interaction is possible. Extended reality (XR) includes virtual reality (VR), augmented reality (AR) and mixed reality (MR), and is referred to as the final modality of future virtual-real interaction. At present, however, the XR end is generally suited to home or stationary scenes; in a vehicle moving scene, when the XR end performs data interaction with an intelligent automobile, it cannot adapt to the posture matching required by the moving vehicle, which easily causes dizziness.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a data interaction synchronization method and system suitable for a mobile scene, and aims to solve the problem that, in a vehicle moving scene, the current XR end cannot adapt to the posture matching process of the moving vehicle when performing data interaction with an intelligent automobile, which easily causes dizziness.
In order to achieve the above purpose, the present application adopts the following technical scheme:
in a first aspect, the present application provides a method for data interaction synchronization applicable to mobile scenarios, including the following steps:
S1: the vehicle-mounted terminal acquires vehicle data and an acquisition time stamp of the vehicle data in a mobile scene, and converts the data format of the vehicle data and the acquisition time stamp of the vehicle data into a target communication data format;
S2: the vehicle-mounted terminal periodically transmits vehicle data in a target communication data format and an acquisition time stamp of the vehicle data to the XR front end according to a target communication protocol; the XR front-end comprises a VR device, an AR device, and an MR device;
S3: the XR front end receives and analyzes the vehicle data in the target communication data format and the acquisition time stamp of the vehicle data, and takes the time when the vehicle data is received as the data receiving time stamp;
S4: the XR front end acquires vehicle data and an acquisition time stamp of the vehicle data according to the analysis result, performs data prediction and error filtering on the vehicle data, and performs vehicle data interaction synchronization with the vehicle-mounted terminal based on the data receiving time stamp and the analyzed acquisition time stamp of the vehicle data.
Further, the vehicle data includes vehicle equipment information, vehicle running posture, vehicle driving posture, vehicle running state, vehicle surrounding environment state, and vehicle control data.
Further, the process of data prediction for the vehicle data specifically includes: according to the vehicle data obtained by the analysis result, GPS positioning data, vehicle speed and gyroscope data of the vehicle are obtained; judging whether the GPS positioning point is updated or not by using a linear regression model and taking the GPS positioning point in the current GPS positioning data as a predicted point, and if so, predicting the next GPS positioning point by using the vehicle speed and the gyroscope data on the basis of the latest GPS positioning point to be used as the latest predicted point for carrying out next-stage data prediction; if the GPS positioning point is not updated, the next GPS positioning point is predicted by utilizing the vehicle speed and the gyroscope data on the basis of the current prediction point, and the next stage of data prediction is performed by taking the next GPS positioning point as the latest prediction point.
Further, the process of predicting the next GPS fix using the vehicle speed and the gyroscope data specifically includes:
determining the direction of a gyroscope according to the position of a current GPS positioning point, and predicting the position distance of the next GPS positioning point of the vehicle according to the vehicle speed and the data uploading interval period of the vehicle-mounted terminal, wherein the prediction process is realized by adopting the following formula: s1=vt, where S1 is a position distance between a current GPS anchor point of the vehicle and a next GPS anchor point, v is a vehicle speed, and t is a data uploading interval period of the vehicle-mounted terminal;
according to the position distance S1 and the position of the current GPS positioning point, the position interval of the next GPS positioning point is calculated, and the calculation process is shown in the following formula: s=s (GPS) +s1
S is the position interval of the next GPS positioning point, and S (GPS) is the position coordinate of the current GPS positioning point;
and finally, predicting the position coordinates of the next GPS locating point in the position interval S of the next GPS locating point according to the direction of the gyroscope, and interpolating to move to the predicted position of the next GPS locating point.
Further, the target communication protocol is one or both of the TCP communication protocol and the Bluetooth communication protocol.
In a second aspect, the application provides a data interaction synchronization system applicable to a mobile scene, implemented using the above data interaction synchronization method, and comprising a vehicle-mounted terminal and an XR front end, wherein the vehicle-mounted terminal is connected with the XR front end through TCP or Bluetooth. The vehicle-mounted terminal is used for acquiring vehicle data and the acquisition time stamp of the vehicle data in a mobile scene, converting the vehicle data and its acquisition time stamp into a target communication data format, and transmitting the vehicle data and acquisition time stamp in the target communication data format to the XR front end according to a target communication protocol. The XR front end is used for receiving and analyzing the vehicle data in the target communication data format and the acquisition time stamp of the vehicle data, acquiring the vehicle data and its acquisition time stamp from the analysis result, performing data prediction and error filtering on the vehicle data, taking the time when the vehicle data is received as the data receiving time stamp, and completing vehicle data interaction synchronization between the XR front end and the vehicle-mounted terminal by utilizing the data receiving time stamp and the acquisition time stamp of the vehicle data.
Further, the vehicle-mounted terminal comprises a data acquisition module, a central processing unit and a data transmission module; the data acquisition module is used for acquiring vehicle data of the intelligent automobile;
The central processing unit is used for reading the vehicle data acquired by the data acquisition module and the acquisition time stamp of the vehicle data through the on-board diagnostics system, the CAN protocol or the OBD protocol, and converting the format of the vehicle data and its acquisition time stamp into the target communication data format;
the data transmission module is used for transmitting the vehicle data after format conversion and the acquisition time stamp of the vehicle data to the XR front end through the target communication protocol.
Further, the data acquisition module comprises a GPS (global positioning system) locator, a speed sensor, a gyroscope and an acceleration sensor; and the central processing unit is respectively connected with the GPS locator, the speed sensor, the gyroscope and the acceleration sensor.
The application adopts the technical scheme, possesses following beneficial effect at least:
according to the data interaction synchronization method and system applicable to the mobile scene, firstly, vehicle data and the acquisition time stamp of the vehicle data in the mobile scene are acquired by using a vehicle-mounted terminal, and the data format of the vehicle data and the acquisition time stamp of the vehicle data is converted into a target communication data format; the vehicle-mounted terminal periodically transmits vehicle data in a target communication data format and an acquisition time stamp of the vehicle data to the XR front end according to a target communication protocol; then the XR front end receives and analyzes the vehicle data in the target communication data format and the acquisition time stamp of the vehicle data, and simultaneously takes the time when the vehicle data is received as the data receiving time stamp; and finally, the XR front end acquires vehicle data and an acquisition time stamp of the vehicle data according to the analysis result, performs data prediction and error filtering on the vehicle data, and performs vehicle data interaction synchronization with the vehicle-mounted terminal based on the data receiving time stamp and the analyzed acquisition time stamp of the vehicle data. According to the technical scheme, the XR end can acquire the data of the intelligent vehicle machine end in the moving scene, and through data processing, prediction, filtering and other processing, the real-time gesture synchronization of the XR front end and the intelligent vehicle is realized, and the data synchronization and the time synchronization are realized, so that the dizziness of playing the XR end in the moving scene of the vehicle is eliminated or improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart illustrating the steps of a data interaction synchronization method applicable to a mobile scenario, according to one embodiment;
FIG. 2 is a flow diagram illustrating GPS fix prediction according to an embodiment;
FIG. 3 is a schematic diagram of high frequency noise and low frequency noise, according to an embodiment;
FIG. 4 is a schematic diagram of the architecture of a data interaction synchronization system applicable to a mobile scenario, according to one embodiment;
in the accompanying drawings: 1-vehicle terminal, 2-XR front end.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail below. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without inventive effort fall within the protection scope of the present application.
Referring to fig. 1, fig. 1 is a flowchart illustrating steps of a method for synchronizing data interactions for a mobile scenario, according to an exemplary embodiment. As shown in fig. 1, the data interaction synchronization method provided in the present application includes the following steps:
S1: the vehicle-mounted terminal acquires vehicle data and an acquisition time stamp of the vehicle data in a mobile scene, and converts the data format of the vehicle data and the acquisition time stamp of the vehicle data into a target communication data format;
S2: the vehicle-mounted terminal periodically transmits vehicle data in a target communication data format and an acquisition time stamp of the vehicle data to the XR front end according to a target communication protocol; wherein the XR front-end comprises a VR device, an AR device, and an MR device;
S3: the XR front end receives and analyzes the vehicle data in the target communication data format and the acquisition time stamp of the vehicle data, and takes the time when the vehicle data is received as the data receiving time stamp;
S4: the XR front end acquires vehicle data and an acquisition time stamp of the vehicle data according to the analysis result, performs data prediction and error filtering on the vehicle data, and performs vehicle data interaction synchronization with the vehicle-mounted terminal based on the data receiving time stamp and the analyzed acquisition time stamp of the vehicle data.
The XR front end includes VR devices, augmented reality (AR) devices and mixed reality (MR) devices, such as the Pico Neo3, Oculus Quest 2, HTC WAVE, etc. The XR front end adopts a 3D development engine for data processing, is used for data interaction with the intelligent automobile end, and serves as the front end for processing.
The vehicle data refers to interaction data sent to the front end of the XR by the intelligent vehicle through the vehicle-mounted terminal, and the vehicle data comprises vehicle equipment information, vehicle driving postures, vehicle running states, vehicle surrounding environment states and vehicle control data.
Specifically, in this application, the vehicle data required for interaction between the VR device and the vehicle includes:
1. Vehicle equipment information synchronization: brand, model, configuration, system version number, CAN protocol or OBD protocol version number;
2. Vehicle driving posture: position, speed, orientation (Euler angles);
3. Vehicle driving state: steering wheel (angle and speed), accelerator (fuel/electric) pedal depth, brake depth, gear;
4. Vehicle running state: door (including tailgate) open/closed state, window (including sunroof) state, air-conditioning state, seat state, DMS system, fragrance state, audio state, and the like;
5. State of the vehicle surroundings: ADAS data plus location-based map data;
6. Vehicle control: door opening and closing, window opening and closing, air-conditioning adjustment, seat adjustment, fragrance adjustment, audio adjustment, and the like.
Further, in one embodiment, in order to implement intelligent interconnection between the vehicle and the VR device, a target communication protocol needs to be established between the vehicle-mounted end and the VR end to implement data interaction. In this scheme, the intelligent automobile end reads the gyroscope, GPS and speed data of the vehicle-mounted terminal through the OBD or CAN protocol, and the vehicle-mounted terminal, acting as the server end, sends the acquired data to the XR front end in the target data communication format, so that the front end can process the vehicle-mounted terminal data.
The target data communication format can be selected according to actual requirements; data communication formats such as the json data format or the protobuf data format may be chosen. In this scheme, the communication data is tentatively based on json, and the protocol rule for requests sent by the XR front end to the server is as follows:
{
    "requestCode": "1000000",   // request codes begin with 1XXXXXX, e.g. 1000001, 1000002, sent from the front end to the client
    "requestData": "{}"         // request communication data, defined as json content
}
The rule for data sent by the server to the front end is as follows:
{
    "requestCode": "2000000",   // request communication protocol codes begin with 2XXXXXX, e.g. 2000001, 2000002, sent from the client to the front end
    "requestMsg": "Success",    // returned message content: Success indicates success, Fail indicates failure
    "requestData": "{}"         // returned json content; the concrete json body is defined according to the requestCode
}
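A minimal sketch of building and checking such messages, following the stated conventions (front-end request codes begin with 1, server response codes with 2, `requestData` carries nested json); the helper names are hypothetical:

```python
import json

def build_request(code: int, payload: dict) -> bytes:
    """Front end -> server: request codes must begin with 1 per the protocol rule."""
    assert str(code).startswith("1"), "front-end request codes are 1XXXXXX"
    msg = {"requestCode": str(code), "requestData": json.dumps(payload)}
    return json.dumps(msg).encode("utf-8")

def parse_response(raw: bytes) -> dict:
    """Server -> front end: check requestMsg, then unwrap the nested json body."""
    msg = json.loads(raw.decode("utf-8"))
    if msg.get("requestMsg") != "Success":
        raise RuntimeError("server returned failure for code %s" % msg.get("requestCode"))
    return json.loads(msg["requestData"])
```

Wrapping the payload as a json string inside `requestData` mirrors the `"{ }"` placeholder in the rules above, where the concrete body depends on the `requestCode`.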
Further, in one embodiment, the target communication protocol is one or both of the TCP communication protocol and the Bluetooth communication protocol. The TCP communication protocol uses a TCP Socket to build a local area network, with WiFi or USB providing the local-area-network link. Meanwhile, the active data sending period of the vehicle-mounted terminal is set at the millisecond level and can be configured according to actual requirements. The target communication protocol is suitable for the scenario in which the vehicle-mounted terminal (ELE BOX, vehicle SDK) sends sensor and other data to the VR end. The XR front end obtains the vehicle-mounted terminal data through TCP, performs json analysis, and processes the read GPS points, speed and gyroscope data to stay synchronized with the intelligent automobile.
Specifically, in this scheme, when the vehicle-mounted terminal acquires vehicle data in a mobile scene, it also records the acquisition time stamp of the vehicle data. The XR front end establishes a time synchronization mechanism according to this acquisition time stamp: local time is corrected through GPS or Beidou time to keep the two sides synchronized, or the time offset between the vehicle-mounted terminal and the XR front end is calculated and used for synchronization. The intelligent automobile synchronizes data time between the vehicle-mounted terminal and the XR front end, thereby maintaining the validity of the data.
Because data delay occurs during the front-end interaction with the intelligent automobile end, and the gyroscope, GPS and vehicle-speed data are delayed, the acquisition time stamp of each datum in the vehicle-mounted terminal is evaluated to determine which data correspond to which time stamp, so that the data remain synchronized at every moment; the data sent to the front end are likewise checked against their time stamps and then recalculated once, so that timeliness is maintained.
Each vehicle datum carries a corresponding timestamp field of unix timestamp type, accurate to the millisecond level. The vehicle-mounted terminal side transmits data with the originally recorded time stamps, so the time stamps of two data items may differ. The XR front-end side applies sampling, interpolation and similar logic to the different data according to the time stamps of the vehicle data.
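The interpolation logic described above can be sketched as follows: given one sensor stream as (timestamp, value) pairs, resample it at the timestamp of another stream. This is a minimal linear-interpolation sketch, not the patent's actual implementation:

```python
def interp_at(samples, t_ms):
    """Linearly interpolate a sorted (timestamp_ms, value) series at time t_ms.
    Values outside the recorded range are clamped to the nearest endpoint."""
    if t_ms <= samples[0][0]:
        return samples[0][1]
    if t_ms >= samples[-1][0]:
        return samples[-1][1]
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        if t0 <= t_ms <= t1:
            a = (t_ms - t0) / (t1 - t0)   # fraction of the way from t0 to t1
            return v0 + a * (v1 - v0)

# Align a vehicle-speed sample to a gyroscope acquisition timestamp:
speed = [(1000, 10.0), (1100, 12.0), (1200, 12.0)]
assert interp_at(speed, 1050) == 11.0
```

Because each stream keeps its original acquisition timestamps, resampling like this is what lets the front end combine gyroscope, GPS and speed data that were captured at slightly different moments.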
In addition, the vehicle-mounted terminal, acting as the server, needs to provide a way to obtain its IP address. If WiFi is available in the vehicle and the communication box, hotspot capability must be provided, and states such as WiFi hotspot startup, service startup and client connection are indicated through an indicator lamp.
When the XR front end is connected with the service end, a handshake is needed, and the specific flow is as follows:
client- > Server: a handshake request; 0x01 one byte; state synchronization of subsequent extended XR front-end equipment;
server- > Client, handshake response; service conditions; supported functions (4 bytes); data format (2 byte); protocol version (2 byte); VIN:64bytes (vehicle code). The protocol is only suitable for the scene that the vehicle-mounted terminal (ELE BOX, vehicle SDK) sends data such as a sensor to the VR terminal.
The XR front end acquires the vehicle-mounted data through TCP or Bluetooth, performs json analysis, reads the GPS points, and processes the speed and gyroscope data to keep its posture synchronized with the intelligent automobile.
Referring to fig. 2, in the present application, a process for predicting vehicle data specifically includes: according to the vehicle data obtained by the analysis result, GPS positioning data, vehicle speed and gyroscope data of the vehicle are obtained; judging whether the GPS positioning point is updated or not by using a linear regression model and taking the GPS positioning point in the current GPS positioning data as a predicted point, and if so, predicting the next GPS positioning point by using the vehicle speed and the gyroscope data on the basis of the latest GPS positioning point to be used as the latest predicted point for carrying out next-stage data prediction; if the GPS positioning point is not updated, the next GPS positioning point is predicted by utilizing the vehicle speed and the gyroscope data on the basis of the current prediction point, and the next stage of data prediction is performed by taking the next GPS positioning point as the latest prediction point.
Further, the process of predicting the next GPS fix using the vehicle speed and the gyroscope data specifically includes:
determining the direction of a gyroscope according to the position of a current GPS positioning point, and predicting the position distance of the next GPS positioning point of the vehicle according to the vehicle speed and the data uploading interval period of the vehicle-mounted terminal, wherein the prediction process is realized by adopting the following formula: s1=vt, where S1 is a position distance between a current GPS anchor point of the vehicle and a next GPS anchor point, v is a vehicle speed, and t is a data uploading interval period of the vehicle-mounted terminal;
according to the position distance S1 and the position of the current GPS positioning point, the position interval of the next GPS positioning point is calculated, and the calculation process is shown in the following formula: s=s (GPS) +s1
S is the position interval of the next GPS positioning point, and S (GPS) is the position coordinate of the current GPS positioning point;
and finally, predicting the position coordinates of the next GPS locating point in the position interval S of the next GPS locating point according to the direction of the gyroscope, and interpolating to move to the predicted position of the next GPS locating point.
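The prediction step above (distance S1 = v·t along the gyroscope heading from the current GPS point) can be sketched as follows. A flat-earth approximation is assumed for converting metres to degrees, which is adequate over one upload period; the function name and conversion constant are illustrative:

```python
import math

def predict_next_point(lat, lon, speed_mps, heading_deg, interval_s):
    """Predict the next GPS point from the current one: S1 = v * t along the
    gyroscope heading (0 deg = north), flat-earth small-step approximation."""
    s1 = speed_mps * interval_s                      # S1 = v * t
    dnorth = s1 * math.cos(math.radians(heading_deg))
    deast = s1 * math.sin(math.radians(heading_deg))
    # approximate metres-to-degrees conversion: 1 degree of latitude ~ 111,320 m
    lat2 = lat + dnorth / 111320.0
    lon2 = lon + deast / (111320.0 * math.cos(math.radians(lat)))
    return lat2, lon2
```

Each prediction cycle then interpolates the rendered position toward `(lat2, lon2)` until the next GPS update (or the next prediction) arrives, matching the two branches of the flow in FIG. 2.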
Further, in the scheme of the application, the error filtering process filters data errors using complementary filtering and Kalman filtering, preventing data errors and deviations from entering subsequent processing.
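The complementary-filter half of the error filtering mentioned above can be sketched in one line: trust the integrated gyroscope angle at high frequency (it is smooth but drifts) and the accelerometer-derived angle at low frequency (it is noisy but drift-free). The blend factor value is an illustrative assumption:

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """One complementary-filter update for a single attitude angle.
    alpha weights the gyro integral; (1 - alpha) weights the accelerometer angle."""
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

Run once per sample period, this suppresses the gyroscope's low-frequency drift and the accelerometer's high-frequency noise simultaneously, which is the complementary split illustrated in FIG. 3.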
For the Kalman filtering data error processing method, attitude drift becomes serious when the attitude is calculated by directly integrating the angular rate obtained from the gyroscope measurement. Accordingly, when pose estimation is performed with the Kalman filtering method, the nonlinear attitude differential equation serving as the filter state model can be approximated by a simple linear equation, as shown in the following formula:
where θ is the pitch angle, γ the roll angle, and ψ the heading angle. In the pose update calculation, ω_x, ω_y and ω_z are the measured angular velocities of the three gyroscope axes; the gyroscope measurements inevitably contain errors, among which the null shift (ε_x, ε_y, ε_z) is the most significant. Since the current experiment uses only an accelerometer, the parameters in the z direction cannot be estimated effectively, so the state variables are provisionally selected as:

X = [θ γ ψ ε_x ε_y]^T
The state equation is established as:
Here, F is the state transition matrix, and the control input u is the gyroscope triaxial angular velocity measurement.
Specifically, the measurement model is established by the following steps:
Since the MEMS sensor accuracy is poor, and the model is a simplified linear model, the integrated attitude needs to be corrected using the attitude angles obtained from the accelerometer. The pitch angle θ and roll angle γ obtained from the accelerometer (easily inferred from the relation between the gravity measured by the accelerometer and the actual gravity vector) are as follows:
where f is the measurement information of the accelerometer. Taking (3) and (4) as the measurement information Z, the measurement equation of the vehicle attitude filter is easily obtained:
wherein H is a measurement transfer matrix, and V is measurement noise. It should be noted that the attitude angle errors caused by the movement of the carrier are all regarded as attitude measurement noise herein.
So far, the attitude angle and the gyro zero drift error can be estimated by carrying out Kalman filtering update calculation through the state equation (2), the measurement information (3) and (4) and the measurement equation (5).
The Kalman filtering update calculation process specifically comprises the following steps:
a) One-step predictive estimation:
wherein, the discretized state transition matrix phi is:
Φ = I + F·ts (6)
control amount U k+1 Is u k+1 * ts, wherein ts is the update period.
b) Covariance of one-step prediction estimation error:
P_{k+1/k} = Φ·P_k·Φ^T + Q_k (7)
where Q_k is the system noise variance matrix.
c) Filtering gain matrix:
K_k = P_{k+1/k}·H^T·(H·P_{k+1/k}·H^T + R_k)^{-1} (8)
where R_k is the measurement noise variance matrix.
d) State estimation
e) Covariance of state estimation error
P_{k+1} = P_{k+1/k} − K_k·H·P_{k+1/k} (10)
Thus, the attitude angle and gyro drift can be estimated by formulas (6)-(10). However, using equations (6)-(10) directly is rather complex; in particular, the matrix inversion in equation (8) degrades the real-time performance of the algorithm. Because the system model and the measurement model are very simple and only a few elements of the state estimation error array are non-zero, the elements of the state estimation error covariance array can be computed directly, avoiding redundant matrix operations and improving real-time performance.
After each iterative update of the Kalman filter, apart from the diagonal elements of P_k, only P_k(1,4), P_k(4,1), P_k(2,5) and P_k(5,2) are non-zero, with P_k(1,4) = P_k(4,1) and P_k(2,5) = P_k(5,2).
This yields:
To simplify the calculation, let
The filtering gain K_k is:
Let
Then the covariance matrix of the estimated error is
Thus, the on-board execution of the pose estimation can be summarized as follows: a) One-step predictive estimation
b) Covariance of one-step prediction estimation error
c) Filtering gain
d) State estimation
e) Covariance of estimation errors
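Before the sparsity simplification, the generic update cycle of equations (6)-(10) can be sketched as follows (an illustrative NumPy sketch, not the patent's optimized on-board implementation; the correction step uses the standard Kalman form for the state estimate):

```python
import numpy as np

def kalman_update(x, P, Phi, U, H, z, Q, R):
    """One Kalman cycle following equations (6)-(10).

    x:   state estimate, e.g. [theta, gamma, psi, eps_x, eps_y]
    Phi: discretized state transition matrix, Phi = I + F*ts (eq. 6)
    U:   control quantity, U = u * ts
    H:   measurement transfer matrix; z: measurement; Q, R: noise variances
    """
    x_pred = Phi @ x + U                                    # one-step prediction
    P_pred = Phi @ P @ Phi.T + Q                            # eq. (7)
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)  # eq. (8)
    x_new = x_pred + K @ (z - H @ x_pred)                   # state estimate
    P_new = P_pred - K @ H @ P_pred                         # eq. (10)
    return x_new, P_new
```

The sparsity observation above (only P_k(1,4), P_k(4,1), P_k(2,5), P_k(5,2) non-zero off the diagonal) is what allows these matrix products to be replaced by scalar element updates, avoiding the inversion cost of equation (8) on the vehicle.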
Further, referring to fig. 3, the complementary filtering in the solution of the present application mainly removes high-frequency and low-frequency noise from the signal. The high-frequency noise (dominant in the accelerometer signal) is removed by a low-pass filter, the filtering process being implemented by a low-pass filtering formula:

The low-frequency noise (dominant in the gyroscope signal as drift) is removed by a high-pass filter, the filtering process being implemented by a high-pass filtering formula:
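A first-order complementary filter combining the two paths might be sketched as follows (an illustration only; the patent does not give the filter coefficient, so alpha = 0.98 is an assumption):

```python
def complementary_step(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """One complementary-filter step for an attitude angle.

    Gyro path (integrated rate): acts as an implicit high-pass, since its
    low-frequency drift decays through the (1 - alpha) leak each step.
    Accelerometer path: acts as an implicit low-pass, since its
    high-frequency noise enters only with the small (1 - alpha) weight.
    """
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

Iterating this step slowly pulls the estimate toward the accelerometer angle while tracking fast motion through the gyroscope.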
In a second aspect, the present application provides a data interaction synchronization system applicable to a mobile scenario, implemented by using the data interaction synchronization method applicable to a mobile scenario provided by the present application. Referring to fig. 4, the system includes a vehicle-mounted terminal and an XR front end, where the vehicle-mounted terminal is connected to the XR front end through TCP or Bluetooth. The vehicle-mounted terminal is used for acquiring vehicle data and the acquisition time stamp of the vehicle data in a mobile scene, converting the data format of the vehicle data and the acquisition time stamp into a target communication data format, and transmitting the vehicle data and the acquisition time stamp in the target communication data format to the XR front end according to a target communication protocol. The XR front end is used for receiving and analyzing the vehicle data in the target communication data format and the acquisition time stamp of the vehicle data, acquiring the vehicle data and the acquisition time stamp according to the analysis result, carrying out data prediction and error filtering on the vehicle data, taking the time when the vehicle data is received as the data receiving time stamp, and completing the vehicle data interaction synchronization between the XR front end and the vehicle-mounted terminal by utilizing the data receiving time stamp and the acquisition time stamp of the vehicle data.
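As a minimal sketch of how the two timestamps can drive the synchronization (the function and field names are hypothetical, and a common time base between the vehicle-mounted terminal and the XR front end is assumed):

```python
def transmission_delay(acq_ts, recv_ts):
    """Per-packet delay: data receiving time stamp minus acquisition time stamp."""
    return recv_ts - acq_ts

def compensate(value, rate_of_change, acq_ts, recv_ts):
    """Extrapolate a sampled quantity (e.g. speed or heading) over the delay,
    so the XR front end renders the vehicle state at the rendering instant
    rather than at the acquisition instant."""
    return value + rate_of_change * transmission_delay(acq_ts, recv_ts)
```

Compensating each packet for its own measured delay is one way such a system can keep rendered motion consistent with the vehicle and reduce the dizziness caused by lag.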
Further, in one embodiment, the vehicle-mounted terminal includes a data acquisition module, a central processing unit, and a data transmission module. The data acquisition module is used for acquiring vehicle data of the intelligent automobile;
the central processing unit is used for reading the vehicle data and the acquisition time stamp of the vehicle data acquired by the data acquisition module through a vehicle-mounted automatic diagnosis system or the CAN protocol, and carrying out format conversion on the vehicle data and the acquisition time stamp of the vehicle data according to a target communication data format;
the data transmission module is used for transmitting the vehicle data after format conversion and the acquisition time stamp of the vehicle data to the XR front end through the target communication protocol.
Further, in one embodiment, the data acquisition module includes a GPS locator, a speed sensor, a gyroscope, and an acceleration sensor; and the central processing unit is respectively connected with the GPS locator, the speed sensor, the gyroscope and the acceleration sensor.
Further, in one embodiment, the data transmission module may be implemented by a Bluetooth transmission module and a 5G/4G/3G network communication module, the vehicle data being transferred by LAN communication or Bluetooth wireless communication.
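The "target communication data format" is not specified by the patent; as one hypothetical concrete choice, a fixed little-endian binary layout carrying the vehicle data together with its acquisition time stamp could be:

```python
import struct

# Hypothetical layout: lat, lon (float64), speed, yaw rate (float32),
# acquisition time stamp (float64 Unix seconds) -- 32 bytes in total.
PACKET_FMT = "<ddffd"

def pack_vehicle_data(lat, lon, speed, yaw_rate, acq_ts):
    """Serialize one sample into the assumed target communication data format."""
    return struct.pack(PACKET_FMT, lat, lon, speed, yaw_rate, acq_ts)

def unpack_vehicle_data(payload):
    """Parse a received packet back into (lat, lon, speed, yaw_rate, acq_ts)."""
    return struct.unpack(PACKET_FMT, payload)
```

Because the payload is transport-agnostic, the same bytes can be sent unchanged over a TCP socket or a Bluetooth channel, which is what makes the two transports interchangeable in this system.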
It is to be understood that the same or similar parts in the above embodiments may be referred to each other, and that in some embodiments, the same or similar parts in other embodiments may be referred to.
It should be noted that in the description of the present application, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. Furthermore, in the description of the present application, unless otherwise indicated, the terms "plurality" and "multiple" mean at least two.
It will be understood that when an element is referred to as being "mounted" or "disposed" on another element, it can be directly on the other element or intervening elements may also be present; when an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may be present, and further, as used herein, connection may comprise a wireless connection; the use of the term "and/or" includes any and all combinations of one or more of the associated listed items.
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented using any one or a combination of the following techniques, as is well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, programmable gate arrays (PGAs), field-programmable gate arrays (FPGAs), and the like.
Those of ordinary skill in the art will appreciate that all or a portion of the steps carried out in the method of the above-described embodiments may be implemented by a program to instruct related hardware, where the program may be stored in a computer readable storage medium, and where the program, when executed, includes one or a combination of the steps of the method embodiments.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing module, or each unit may exist alone physically, or two or more units may be integrated in one module. The integrated modules may be implemented in hardware or in software functional modules. The integrated modules may also be stored in a computer readable storage medium if implemented in the form of software functional modules and sold or used as a stand-alone product.
The above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, or the like.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present application have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the application, and that variations, modifications, alternatives, and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the application.
Claims (8)
1. The data interaction synchronization method suitable for the mobile scene is characterized by comprising the following steps of:
s1: the vehicle-mounted terminal acquires vehicle data and an acquisition time stamp of the vehicle data in a mobile scene, and converts the data format of the vehicle data and the acquisition time stamp of the vehicle data into a target communication data format;
s2: the vehicle-mounted terminal periodically transmits vehicle data in a target communication data format and an acquisition time stamp of the vehicle data to the XR front end according to a target communication protocol; the XR front-end comprises a VR device, an AR device, and an MR device;
s3: the XR front end receives and analyzes the vehicle data in the target communication data format and the acquisition time stamp of the vehicle data, and takes the time when the vehicle data is received as the data receiving time stamp;
s4: the XR front end acquires vehicle data and an acquisition time stamp of the vehicle data according to the analysis result, performs data prediction and error filtering on the vehicle data, and performs vehicle data interaction synchronization with the vehicle-mounted terminal based on the data receiving time stamp and the analyzed acquisition time stamp of the vehicle data.
2. The method of claim 1, wherein the vehicle data includes vehicle equipment information, vehicle driving pose, vehicle operating state, vehicle surrounding environment state, and vehicle control data.
3. The method for data interaction synchronization applicable to mobile scenes according to claim 1, wherein the process of data prediction of vehicle data specifically comprises: according to the vehicle data obtained by the analysis result, GPS positioning data, vehicle speed and gyroscope data of the vehicle are obtained; judging whether the GPS positioning point is updated or not by using a linear regression model and taking the GPS positioning point in the current GPS positioning data as a predicted point, and if so, predicting the next GPS positioning point by using the vehicle speed and the gyroscope data on the basis of the latest GPS positioning point to be used as the latest predicted point for carrying out next-stage data prediction; if the GPS positioning point is not updated, the next GPS positioning point is predicted by utilizing the vehicle speed and the gyroscope data on the basis of the current prediction point, and the next stage of data prediction is performed by taking the next GPS positioning point as the latest prediction point.
4. A method for interactive synchronization of data applicable to mobile scenes according to claim 3, wherein the process of predicting the next GPS anchor point using the vehicle speed and gyroscope data comprises:
determining the direction of a gyroscope according to the position of a current GPS positioning point, and predicting the position distance of the next GPS positioning point of the vehicle according to the vehicle speed and the data uploading interval period of the vehicle-mounted terminal, wherein the prediction process is realized by adopting the following formula: s1=vt, where S1 is a position distance between a current GPS anchor point of the vehicle and a next GPS anchor point, v is a vehicle speed, and t is a data uploading interval period of the vehicle-mounted terminal;
according to the position distance S1 and the position of the current GPS positioning point, the position interval of the next GPS positioning point is calculated as shown in the following formula: S = S(GPS) + S1, where S is the position interval of the next GPS positioning point and S(GPS) is the position coordinate of the current GPS positioning point;
and finally, predicting the position coordinates of the next GPS locating point in the position interval S of the next GPS locating point according to the direction of the gyroscope, and interpolating to move to the predicted position of the next GPS locating point.
5. The method for data interaction synchronization applicable to a mobile scenario of claim 1, wherein the target communication protocol is one or more of a TCP communication protocol and a bluetooth communication protocol.
6. A data interaction synchronization system suitable for a mobile scene, implemented by adopting the data interaction synchronization method suitable for a mobile scene according to any one of claims 1 to 5, characterized by comprising a vehicle-mounted terminal and an XR front end, wherein the vehicle-mounted terminal is connected with the XR front end through TCP or Bluetooth; the vehicle-mounted terminal is used for acquiring vehicle data and the acquisition time stamp of the vehicle data in a mobile scene, converting the data format of the vehicle data and the acquisition time stamp into a target communication data format, and transmitting the vehicle data and the acquisition time stamp in the target communication data format to the XR front end according to a target communication protocol; the XR front end is used for receiving and analyzing the vehicle data in the target communication data format and the acquisition time stamp of the vehicle data, acquiring the vehicle data and the acquisition time stamp according to the analysis result, carrying out data prediction and error filtering on the vehicle data, taking the time when the vehicle data is received as the data receiving time stamp, and completing the vehicle data interaction synchronization between the XR front end and the vehicle-mounted terminal by utilizing the data receiving time stamp and the acquisition time stamp of the vehicle data.
7. The system for synchronizing data interaction applicable to a mobile scenario of claim 6, wherein the vehicle-mounted terminal comprises a data acquisition module, a central processing unit and a data transmission module; the data acquisition module is used for acquiring vehicle data of the intelligent automobile;
the central processing unit is used for reading the vehicle data acquired by the data acquisition module and the acquisition time stamp of the vehicle data through the vehicle-mounted automatic diagnosis system, the CAN protocol or the OBD protocol, and carrying out format conversion on the vehicle data and the acquisition time stamp of the vehicle data according to the target communication data format;
the data transmission module is used for transmitting the vehicle data after format conversion and the acquisition time stamp of the vehicle data to the XR front end through the target communication protocol.
8. The system of claim 7, wherein the data acquisition module comprises a GPS locator, a speed sensor, a gyroscope, and an acceleration sensor; and the central processing unit is respectively connected with the GPS locator, the speed sensor, the gyroscope and the acceleration sensor.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211577641.4A CN116405145A (en) | 2022-12-09 | 2022-12-09 | Data interaction synchronization method and system suitable for mobile scene |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211577641.4A CN116405145A (en) | 2022-12-09 | 2022-12-09 | Data interaction synchronization method and system suitable for mobile scene |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116405145A true CN116405145A (en) | 2023-07-07 |
Family
ID=87014825
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211577641.4A Pending CN116405145A (en) | 2022-12-09 | 2022-12-09 | Data interaction synchronization method and system suitable for mobile scene |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116405145A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117171701A (en) * | 2023-08-14 | 2023-12-05 | 陕西天行健车联网信息技术有限公司 | Vehicle running data processing method, device, equipment and medium |
CN117171701B (en) * | 2023-08-14 | 2024-05-14 | 陕西天行健车联网信息技术有限公司 | Vehicle running data processing method, device, equipment and medium |
CN117979412A (en) * | 2024-03-29 | 2024-05-03 | 江铃汽车股份有限公司 | Internal time synchronization method and system for vehicle-mounted communication remote terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | |