WO2020253260A1 - Time synchronization processing method, electronic apparatus, and storage medium - Google Patents
- Publication number: WO2020253260A1 (PCT application PCT/CN2020/076836)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- angular velocity
- information
- different sensors
- velocity information
- sensors
- Prior art date
Links
- 238000003672 processing method Methods 0.000 title claims abstract description 36
- 238000012545 processing Methods 0.000 claims abstract description 141
- 238000005259 measurement Methods 0.000 claims abstract description 67
- 239000011159 matrix material Substances 0.000 claims description 81
- 238000000034 method Methods 0.000 claims description 69
- 230000001360 synchronised effect Effects 0.000 claims description 22
- 238000007499 fusion processing Methods 0.000 claims description 18
- 238000004590 computer program Methods 0.000 claims description 12
- 230000004927 fusion Effects 0.000 claims description 7
- 238000001514 detection method Methods 0.000 claims description 6
- 230000033001 locomotion Effects 0.000 abstract description 29
- 230000008569 process Effects 0.000 description 27
- 238000010586 diagram Methods 0.000 description 25
- 230000000007 visual effect Effects 0.000 description 19
- 230000005291 magnetic effect Effects 0.000 description 10
- 238000005457 optimization Methods 0.000 description 8
- 230000001133 acceleration Effects 0.000 description 6
- 238000004422 calculation algorithm Methods 0.000 description 6
- 238000004891 communication Methods 0.000 description 5
- 230000006870 function Effects 0.000 description 4
- 230000003068 static effect Effects 0.000 description 4
- 230000008878 coupling Effects 0.000 description 3
- 238000010168 coupling process Methods 0.000 description 3
- 238000005859 coupling reaction Methods 0.000 description 3
- 230000005540 biological transmission Effects 0.000 description 2
- 230000001934 delay Effects 0.000 description 2
- 230000004069 differentiation Effects 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 238000012897 Levenberg–Marquardt algorithm Methods 0.000 description 1
- 238000007621 cluster analysis Methods 0.000 description 1
- 230000005294 ferromagnetic effect Effects 0.000 description 1
- 230000010365 information processing Effects 0.000 description 1
- 238000012886 linear function Methods 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
- 238000011160 research Methods 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
- 230000002194 synthesizing effect Effects 0.000 description 1
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
- G01C25/005—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/251—Fusion techniques of input or preprocessed data
Definitions
- the embodiments of the present disclosure relate to the field of computer vision, and specifically relate to time synchronization processing methods, electronic devices, and storage media.
- Visual inertial odometry is currently a hot research topic in the field of computer vision. It is widely used in the navigation and entertainment of electronic devices.
- the main principle is to fuse the measurements of vision and inertial sensors to estimate the position and posture of the camera itself during movement, so as to obtain accurate positioning information; this belongs to autonomous navigation.
- the electronic device needs to determine the delay time information between different sensors when fusing the measurement results of different sensors.
- in the related art, motion in three-dimensional space is usually used to calibrate the delay time information between different sensors, or the time delay between different sensors is treated as a constant.
- in practice, however, the delay time information between different sensors changes; therefore, there is a problem of low time synchronization accuracy.
- the embodiments of the present disclosure expect to provide a time synchronization processing method, electronic equipment and storage medium.
- an embodiment of the present disclosure provides a time synchronization processing method. The method includes: obtaining angular velocity information collected by two different sensors, where the angular velocity information is the angular velocity information when the electronic device rotates, and the two different sensors are both arranged on the electronic device and rigidly connected; aligning the two pieces of angular velocity information obtained, and determining the delay time information between the two different sensors; and performing time synchronization processing on the respective measurement results of the two different sensors according to the delay time information.
- an embodiment of the present disclosure provides a time synchronization processing device. The device includes a first acquisition module, an alignment module, and a synchronization module, where the first acquisition module is configured to obtain the angular velocity information collected by each of two different sensors, the angular velocity information being the angular velocity information when the electronic device rotates, and the two different sensors being both arranged on the electronic device and rigidly connected; the alignment module is configured to align the two pieces of angular velocity information obtained and determine the delay time information between the two different sensors; and the synchronization module is configured to perform time synchronization processing on the respective measurement results of the two different sensors according to the delay time information.
- embodiments of the present disclosure provide an electronic device that includes at least a processor and a memory for storing a computer program that can run on the processor; the processor is configured to perform, when running the computer program, the steps in the above-mentioned time synchronization processing method.
- embodiments of the present disclosure provide a computer-readable storage medium on which a computer program is stored, and when the computer program is executed by a processor, the steps in the above-mentioned time synchronization processing method are implemented.
- the embodiments of the present disclosure provide a time synchronization processing method, an electronic device, and a storage medium.
- the angular velocity information is obtained from two different sensors that are arranged on the electronic device and rigidly connected;
- the two pieces of angular velocity information are aligned to determine the delay time information between the two different sensors, and the respective measurement results of the two different sensors are then time synchronized according to the delay time information, thereby making full use of the principle that rigidly connected sensors have the same angular velocity information to realize the time synchronization of different sensors.
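The core idea above can be sketched as a minimal, non-authoritative example: once two angular velocity streams share a common sampling grid, the delay between them can be estimated by locating the peak of their cross-correlation. The function name `estimate_delay` and the sinusoidal test signal are illustrative assumptions, not part of the patent.

```python
import numpy as np

def estimate_delay(omega_a, omega_b, dt):
    """Estimate the delay of stream b relative to stream a (seconds)
    by cross-correlating two equally sampled angular-speed series."""
    a = omega_a - omega_a.mean()
    b = omega_b - omega_b.mean()
    corr = np.correlate(a, b, mode="full")
    # convert the argmax index to a lag in samples; positive => b lags a
    lag = (len(b) - 1) - np.argmax(corr)
    return lag * dt

# Illustrative signals: a 2 Hz sinusoidal rotation sampled at 100 Hz,
# with the second sensor delayed by 50 ms.
t = np.arange(0.0, 2.0, 0.01)
true_delay = 0.05
w1 = np.sin(2 * np.pi * 2 * t)
w2 = np.sin(2 * np.pi * 2 * (t - true_delay))
print(estimate_delay(w1, w2, 0.01))  # ≈ 0.05
```

The recovered delay is quantized to the sampling interval; in practice the streams would first be interpolated onto a common grid as described below.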
- FIG. 1 is a schematic diagram 1 of a system structure suitable for a time synchronization processing method according to an embodiment of the present disclosure
- FIG. 2A is a schematic diagram 2 of a system architecture suitable for a time synchronization processing method according to an embodiment of the present disclosure
- FIG. 2B is a third schematic diagram of a system architecture suitable for a time synchronization processing method according to an embodiment of the present disclosure
- FIG. 3 is a schematic diagram 1 of the implementation process of a time synchronization processing method provided by an embodiment of the present disclosure
- FIG. 4 is a schematic diagram of an exemplary time delay between the angular velocity of the visual sensor and the angular velocity of the inertial sensor in the embodiment of the disclosure;
- FIG. 5 is a schematic diagram after the angular velocity of the visual sensor and the angular velocity of the inertial sensor are synchronized in an exemplary embodiment of the disclosure
- FIG. 6 is a second schematic diagram of the implementation process of a time synchronization processing method provided by an embodiment of the disclosure.
- FIG. 7 is a third schematic diagram of the implementation process of a time synchronization processing method provided by an embodiment of the disclosure.
- FIG. 8 is a fourth schematic diagram of the implementation process of a time synchronization processing method provided by an embodiment of the present disclosure.
- FIG. 9 is a first schematic diagram of measuring pose rotation matrix information by using a quaternion to represent a pose sensor in an embodiment of the disclosure.
- FIG. 10 is a second schematic diagram of the embodiment of the disclosure using a quaternion to represent the pose sensor to measure the pose rotation matrix information
- FIG. 11 is a schematic diagram of the composition structure of a time synchronization processing device provided by an embodiment of the disclosure.
- FIG. 12 is a schematic diagram of the composition structure of an electronic device provided by an embodiment of the disclosure.
- Fig. 1 is a schematic diagram 1 of a system structure suitable for a time synchronization processing method provided by an embodiment of the present disclosure.
- the system may include a processor 11, two different sensors 12, and a memory 13; the two different sensors 12 respectively send the acquired information to the processor 11 for processing.
- One of the two different sensors 12 may be a sensor that directly measures angular velocity, and the other sensor may be a sensor that indirectly measures angular velocity.
- the sensor that indirectly measures angular velocity is a sensor capable of independently estimating its own rotational movement (that is, obtaining pose rotation matrix information).
- the two different sensors 12 may also be sensors of other structures, and the embodiment of the present disclosure does not limit the structure of the two different sensors.
- the two sensors of the two different sensors 12 are rigidly fixed, that is, the two different sensors 12 of the electronic device are relatively fixed, so that during movement the angular velocity measured by the different sensors at the same time is the same. The two different sensors 12 respectively input the acquired information into the processor 11, and the processor 11 performs time synchronization processing by executing the method provided in the embodiment of the present disclosure.
- the electronic device may include a time synchronization processing device and two different sensors 12; the time synchronization processing device may include the aforementioned processor 11 and memory 13.
- the system architecture of the time synchronization processing method may also be shown in FIGS. 2A and 2B.
- the system includes a time synchronization processing device 20, a first sensor 22a, and a second sensor 22b, where the first sensor 22a and the second sensor 22b are arranged on the electronic equipment of the vehicle; for example, the time synchronization processing device 20 may be an on-board device in the vehicle. The first sensor 22a and the second sensor 22b may send the acquired information to the time synchronization processing device 20 through an electrical connection or a wireless communication connection.
- the time synchronization processing device 20 has a processor and a memory.
- as shown in FIGS. 2A and 2B, the first sensor 22a and the second sensor 22b send the acquired information to the time synchronization processing device 20, where it is processed by the processor to implement the time synchronization method provided by the embodiments of the present disclosure.
- the first sensor 22a and the second sensor 22b may be arranged at different positions of the vehicle, or may be arranged at the same position of the vehicle.
- the first sensor 22a and the second sensor 22b may both be arranged on the tires of the vehicle; for another example, as shown in FIG. 2A, the first sensor 22a may be arranged on a tire of the vehicle while the second sensor 22b is arranged in another part of the vehicle, for example, in the vehicle-mounted device. The embodiment of the present disclosure does not limit the deployment positions of the first sensor 22a and the second sensor 22b.
- the processor processes the information sent by different sensors in different ways. For example, when the sensor directly measures angular velocity, the processor can directly process the obtained angular velocity information; when the sensor indirectly measures angular velocity, the processor first obtains the pose rotation matrix information, then acquires angular velocity information according to the pose rotation matrix information, and finally processes the acquired angular velocity information.
- the time synchronization processing apparatus can be applied to various types of electronic devices with information processing capabilities during implementation.
- the time synchronization processing device can obtain the data collected by two different sensors in real time in an online manner, and use the technical solutions of the embodiments of the present disclosure to perform time synchronization processing.
- the time synchronization processing device may also obtain data collected by two different sensors in an offline manner, and use the technical solutions of the embodiments of the present disclosure to perform time synchronization processing.
- obtaining the data collected by two different sensors in an offline manner may include: storing the data collected by two different sensors, and exporting the stored data when needed, so as to synchronize the respective measurement results of the different sensors;
- Obtaining the data collected by two different sensors in an online manner may include: obtaining data collected by two different sensors in real time, so as to synchronize the respective measurement results of the different sensors.
- the time synchronization processing device can be arranged in the electronic equipment or outside the electronic equipment; two different sensors are arranged in the electronic equipment.
- the electronic device may be, for example, a mobile robot device, an unmanned device, or various types of mobile terminals and other devices.
- the unmanned device may include, but is not limited to, a vehicle, an airplane, or a ship, which is not limited in the embodiment of the present disclosure.
- the time synchronization processing method can solve the problems of low time synchronization accuracy and complex time delay calibration between different sensors.
- the functions implemented by the time synchronization processing method can be implemented by the processor in the time synchronization processing device calling executable instructions.
- the executable instructions can be stored in a storage medium of the memory; it can be seen that the time synchronization processing device includes at least a processor and a storage medium.
- FIG. 3 is a schematic diagram 1 of the implementation process of a time synchronization processing method provided in an embodiment of the present disclosure, which is applied to a time synchronization processing device.
- the time synchronization processing method includes:
- two different sensors are rigidly connected, that is, the two different sensors are relatively fixed.
- the angular velocity represented by the angular velocity information acquired by two different sensors at the same time is the same.
- the two different sensors in the embodiments of the present disclosure include sensors with different structures, or sensors with the same structure but different deployment positions.
- the two different sensors may include any two of the following structures: vision sensors, inertial sensors, magnetic sensors, lidar sensors, and wheel sensors. There is no restriction here.
- the two different sensors may be vision sensors or magnetic sensors deployed in different positions of the electronic device, and the embodiments of the present disclosure are not limited here.
- the angular velocity information may be the angular velocity information when the electronic device rotates in a preset period of time, which is obtained by two different sensors. In other words, when the electronic device rotates, it rotates by a certain angle, so that the angular velocity information collected by different sensors can be obtained within a preset time period.
- the foregoing preset time period may be set by the user according to actual conditions.
- the preset time period may be set to half an hour or fifteen minutes, etc., which is not limited in the embodiments of the present disclosure.
- the above two different sensors can be classified based on acquisition methods such as direct acquisition of angular velocity information or indirect acquisition of angular velocity information.
- one type of sensor is a gyro sensor, which can directly measure angular velocity information; the other type of sensor is a pose sensor, which can indirectly measure angular velocity information.
- the above two different sensors can also be classified based on the structure of the sensor.
- the gyro sensor and the pose sensor are sensors with different structures.
- At least one of the two different sensors is a pose sensor
- the obtaining of the angular velocity information collected by the two different sensors includes: obtaining the pose rotation matrix information collected by the pose sensor, and determining the angular velocity information when the electronic device rotates from the pose rotation matrix information.
- At least one of the two different sensors is a gyro sensor, and obtaining angular velocity information collected by the two different sensors respectively includes: obtaining the information collected by the gyro sensor Angular velocity information.
- acquiring angular velocity information when an electronic device undergoes a rotational movement through two different sensors includes the following application scenarios:
- Scenario 1: The two different sensors are both gyro sensors, but the deployment positions of the two gyro sensors are different; the angular velocity information when the electronic device rotates is directly collected through the two gyro sensors.
- Scenario 2: The two different sensors are both pose sensors, but the deployment positions of the two pose sensors are different; the pose rotation matrix information when the electronic device rotates is obtained through the two pose sensors, and the angular velocity information when the electronic device rotates is then calculated from the pose rotation matrix information.
- Scenario 3: Among the two different sensors, one is a pose sensor and the other is a gyro sensor. On the one hand, the pose rotation matrix information when the electronic device rotates can be obtained through the pose sensor, and the angular velocity information when the electronic device rotates is then calculated from the pose rotation matrix information; on the other hand, the angular velocity information when the electronic device rotates can be directly collected through the gyro sensor.
- the gyro sensor includes any one of an inertial sensor, a magnetic sensor, and a wheel-type odometer sensor;
- the pose sensor includes any one of a vision sensor and a lidar sensor.
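For the pose-sensor scenarios above, angular velocity can be recovered by differentiating consecutive pose rotation matrices. The following is a hedged sketch of one standard way to do this (axis-angle of the relative rotation divided by the sampling interval); the helper names `angular_velocity_from_poses` and `rot_z` are assumptions for illustration, not names from the patent.

```python
import numpy as np

def angular_velocity_from_poses(R1, R2, dt):
    """Approximate the body-frame angular velocity from two consecutive
    pose rotation matrices via the axis-angle (matrix log) of R1^T R2."""
    dR = R1.T @ R2                                      # relative rotation over dt
    cos_theta = np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    if theta < 1e-12:
        return np.zeros(3)
    # "vee" of the skew-symmetric part, scaled to the rotation angle
    w = np.array([dR[2, 1] - dR[1, 2],
                  dR[0, 2] - dR[2, 0],
                  dR[1, 0] - dR[0, 1]]) * theta / (2.0 * np.sin(theta))
    return w / dt

def rot_z(a):
    """Helper: rotation matrix about the z axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# A device spinning about z at 0.5 rad/s, with poses sampled 0.1 s apart:
w = angular_velocity_from_poses(rot_z(0.0), rot_z(0.05), 0.1)
print(w)  # ≈ [0, 0, 0.5]
```

In practice the pose samples would first be interpolated (see S102a below) so that the finite-difference step uses a uniform time base.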
- S102: Perform alignment processing on the two pieces of angular velocity information obtained, and determine the delay time information between the two different sensors.
- the two angular velocity information need to be aligned to determine the delay time information between the two different sensors.
- the alignment processing of the two pieces of angular velocity information in the embodiment of the present disclosure makes the angular velocity information acquired by the two different sensors consistent in frequency, so that the delay time information between the two different sensors can be determined based on the two aligned pieces of angular velocity information.
- the two angular velocities can be interpolated to align the two angular velocity information.
- the interpolation processing may adopt any interpolation processing method among linear interpolation, cubic spline interpolation, and spherical interpolation, which is not limited in the embodiment of the present disclosure.
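Of the interpolation options mentioned above, spherical interpolation is the natural choice when the underlying samples are orientations rather than raw rates. Below is a minimal, non-authoritative slerp sketch, assuming unit quaternions in (w, x, y, z) order; the function name `slerp` and the test values are illustrative.

```python
import numpy as np

def slerp(q0, q1, u):
    """Spherical linear interpolation between unit quaternions q0 and q1
    at fraction u in [0, 1] -- one way to resample pose samples before
    differentiating them into angular velocity."""
    q0, q1 = np.asarray(q0, float), np.asarray(q1, float)
    dot = np.dot(q0, q1)
    if dot < 0.0:            # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:         # nearly parallel: fall back to normalized lerp
        q = q0 + u * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1 - u) * theta) * q0 + np.sin(u * theta) * q1) / np.sin(theta)

# Halfway between the identity and a 90-degree rotation about z
# is a 45-degree rotation about z:
q0 = np.array([1.0, 0.0, 0.0, 0.0])                      # (w, x, y, z)
q1 = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])
print(slerp(q0, q1, 0.5))  # ≈ [0.924, 0, 0, 0.383]
```

Linear or cubic-spline interpolation applies the same idea directly to scalar angular velocity samples.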
- the respective measurement results of the two different sensors can be time synchronized according to the delay time information. It should be noted that the time synchronization processing of the respective measurement results of two different sensors is to determine the respective measurement results of the two different sensors at the same moment, so as to solve the problem of low time synchronization accuracy caused by the trigger delays and transmission delays of different sensors.
- the respective measurement results of the two different sensors are the measurement results obtained by the two different sensors respectively measuring the rotational movement of the electronic device when the electronic device undergoes rotational movement.
- the measurement results obtained by the two different sensors may include at least one of angular velocity information, azimuth angle information, and acceleration information.
- the measurement result of the inertial sensor may include angular velocity information and acceleration information of the rotational movement of the electronic device;
- the measurement result of the magnetic sensor may include the azimuth angle information of the rotational movement of the electronic device.
- at least three sensors may also be provided on the electronic device, and the at least three sensors are rigidly connected; the angular velocity information collected by each of the at least three sensors is obtained; the obtained pieces of angular velocity information are aligned to determine the delay time information between the at least three sensors; and time synchronization processing is performed on the respective measurement results of the at least three sensors according to the delay time information.
- the at least three sensors can be combined in pairs to obtain the delay time information between any two sensors, and time synchronization processing is then performed on the respective measurement results of the two corresponding sensors according to that delay time information, so as to realize the time synchronization processing of the at least three sensors.
- the delay time information between the visual sensor and the inertial sensor can be acquired, and based on the delay time information, the measurement result of the visual sensor and the measurement result of the inertial sensor are processed in time synchronization, so as to realize the time synchronization of the measurement result of the visual sensor and the measurement result of the inertial sensor.
- the delay time information between the vision sensor and the inertial sensor may be determined first, and the measurement result of the vision sensor and the measurement result of the inertial sensor are time-synchronized based on that delay time information; then the delay time information between the vision sensor and the lidar sensor is determined, and the measurement result of the vision sensor and the measurement result of the lidar sensor are time-synchronized based on that delay time information. In this way, through two time synchronization processes, the time synchronization of the measurement results among the vision sensor, the inertial sensor, and the lidar sensor is realized.
- the following takes one of the two different sensors being a vision sensor, the other being an inertial sensor, and the measurement result being angular velocity information as an example to describe the time synchronization processing in the embodiment of the present disclosure.
- FIG. 4 is a schematic diagram showing the time delay between the angular velocity information of the visual sensor and the angular velocity information of the inertial sensor in an exemplary embodiment of the disclosure.
- FIG. 5 is a schematic diagram after the angular velocity of the visual sensor and the angular velocity of the inertial sensor are synchronized in an exemplary embodiment of the disclosure.
- the dotted line represents the angular velocity of the vision sensor
- the solid line represents the angular velocity of the inertial sensor.
- the angular velocity curve of the visual sensor is staggered from the angular velocity curve of the inertial sensor, and the angular velocity curve of the visual sensor is lagging.
- after synchronization, the angular velocity curve of the visual sensor and the angular velocity curve of the inertial sensor are aligned and no longer staggered; thus, according to the determined delay time information, the time synchronization processing of the angular velocity information between the visual sensor and the inertial sensor is realized.
- the respective measurement results of the two different sensors can be directly synchronized in time according to the delay time information.
- the delay time information may also be stored; when the electronic device is in a non-moving state, the respective measurement results of the two different sensors are time synchronized according to the stored delay time information.
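A minimal sketch of this stored-delay (offline) case, under the assumption that sensor b's clock lags sensor a's by a previously calibrated delay; the variable names and values are illustrative, not from the patent.

```python
import numpy as np

# Delay estimated earlier, while the device was rotating, then stored:
stored_delay = 0.05                       # seconds; b lags a

# Later (even with the device stationary), shift sensor b's measurement
# timestamps into sensor a's time base using the stored delay.
timestamps_b = np.array([0.10, 0.20, 0.30])
synced_b = timestamps_b - stored_delay
print(synced_b)  # ≈ [0.05 0.15 0.25]
```

The same subtraction works online, applied to each measurement as it arrives.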
- the technical solutions of the embodiments of the present disclosure can realize both online time synchronization processing and offline time synchronization processing, and the time synchronization processing is more flexible.
- the time synchronization processing in the embodiments of the present disclosure does not need to use a calibration reference object, such as a checkerboard image, and the time synchronization is more convenient and simple, and has strong universality.
- the embodiments of the present disclosure only require the electronic device to be capable of rotational movement; that is, the electronic device only needs to rotate around at least one rotation axis to calibrate the delay time information between the different sensors on the electronic device, without requiring movement around multiple axes.
- the complexity of the time synchronization of different sensors is thus reduced, and the method can be adapted to the needs of different scenarios; the embodiments of the present disclosure can also calibrate the delay time based on the rotational motion generated by the electronic device rotating around multiple axes, so that richer rotation information can be obtained.
- the embodiments of the present disclosure only require the sensors to be able to independently obtain angular velocity information in order to determine the delay time information between different sensors, so the method can be widely used for time synchronization between multiple sensors.
- the embodiment of the present disclosure is a software-based method for realizing time synchronization, without additional deployment of dedicated time synchronization hardware; the embodiment of the present disclosure performs time synchronization based on delay time information obtained in real time and can perform time synchronization processing online; the embodiment of the present disclosure determines the delay time information through the angular velocity information obtained when the electronic device rotates, instead of treating the delay time information as a constant, which improves the accuracy of time synchronization.
- FIG. 6 is a second schematic diagram of the implementation process of a time synchronization processing method provided by an embodiment of the present disclosure.
- the alignment of the angular velocity information acquired by the two different sensors to determine the delay time information between the two different sensors, namely S102, can include S102a and S102b, as follows:
- S102a Perform interpolation processing on at least one of the two angular velocity information to align the two angular velocity information.
- the interpolation processing of sensors of different structures may be different from the processing of sensors of the same structure. Therefore, based on the different structures of the two sensors, interpolation processing can be performed on at least one of the two angular velocity information to align the two angular velocity information.
- interpolation processing is performed on at least one of the obtained angular velocity information collected by the two gyroscopic sensors to align the two angular velocity information.
- interpolation processing is performed on at least one of the obtained pieces of pose rotation matrix information collected by the two pose sensors to align the two pieces of angular velocity information.
- interpolation processing is performed on the pose rotation matrix information collected by the obtained pose sensor to align the two angular velocity information.
- performing interpolation processing on at least one of the two angular velocity information includes performing interpolation processing on both angular velocity information and performing interpolation processing on any one of the two angular velocity information.
- performing interpolation processing on the two angular velocity information includes: selecting a data acquisition frequency as the standard data acquisition frequency, and performing interpolation processing on the two angular velocity information according to the standard data acquisition frequency.
- the standard data acquisition frequency may be between the respective data acquisition frequencies of the two sensors or higher than the respective data acquisition frequencies of the two sensors.
- a frequency higher than the data acquisition frequencies of both sensors can be selected as the standard data acquisition frequency; for example, the standard data acquisition frequency is 17 Hz.
- a frequency between the data acquisition frequencies of the two sensors can also be selected as the standard data acquisition frequency, for example, the standard data acquisition frequency is 13 Hz.
- the angular velocity information acquired by the two sensors is interpolated according to the standard data acquisition frequency, so that the angular velocity acquisition frequencies of the first sensor and the second sensor are both the same as the standard data acquisition frequency, namely 13 Hz. In this way, the problem of deviations in data processing caused by inconsistent data acquisition frequencies of different sensors can be solved, which helps to obtain more accurate delay time information based on the aligned angular velocity information.
- the interpolation processing includes: performing interpolation processing on the acquired angular velocity information collected by the second sensor according to the data acquisition frequency of the first sensor.
- the angular velocity information acquired by the second sensor can be interpolated according to the data acquisition frequency of the first sensor, 15 Hz, so that the angular velocity acquisition frequency of the first sensor and that of the second sensor are the same, both 15 Hz. This solves the problem of deviations in data processing due to inconsistent data acquisition frequencies of different sensors, and helps to obtain more accurate delay time information based on the aligned angular velocity information.
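- as an illustrative sketch of the frequency alignment described above (linear interpolation, the 15 Hz/10 Hz rates, and the `align_angular_velocity` helper are assumptions, not the patent's prescribed method):

```python
import numpy as np

def align_angular_velocity(t1, w1, t2, w2):
    """Resample sensor 2's angular velocity onto sensor 1's timestamps.

    Linear interpolation stands in for the unspecified interpolation
    processing; t1/t2 are sample times (s), w1/w2 the angular velocities.
    """
    return w1, np.interp(t1, t2, w2)

# Hypothetical example: sensor 1 samples at 15 Hz, sensor 2 at 10 Hz,
# both observing the same rotational motion.
t1 = np.linspace(0.0, 1.0, 16)          # 15 Hz over one second
t2 = np.linspace(0.0, 1.0, 11)          # 10 Hz over one second
w_true = lambda t: np.sin(2 * np.pi * t)
w1, w2_aligned = align_angular_velocity(t1, w_true(t1), t2, w_true(t2))
# After alignment, both streams share sensor 1's 16 timestamps.
```

After alignment, the two streams can be compared sample-by-sample, which is what the later error-model step relies on.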
- the obtained pose rotation matrix information collected by the pose sensor is subjected to interpolation processing to align the two angular velocity information.
- it can include S01, S02 and S03, as follows:
- S01 Obtain adjacent pose rotation matrix information from the multiple pose rotation matrix information. It should be noted that the pose rotation matrix information collected by the pose sensor in the preset time period comprises multiple pose rotation matrix information.
- when the pose sensor is a vision sensor, the pose rotation matrix information of adjacent frame images can be obtained; when the pose sensor is a lidar sensor, the adjacent pose rotation matrix information can be obtained by the lidar sensor.
- the pose rotation matrix information in the preset time period is used to estimate the rotation movement of the electronic device.
- S02 Perform interpolation processing on adjacent pose rotation matrix information according to the frequency at which the gyro sensor obtains angular velocity information, and obtain interpolated rotation matrix information.
- the interpolated rotation matrix information may be obtained according to the adjacent pose rotation matrix information.
- an interpolation model can be constructed, and then the interpolated rotation matrix information can be obtained by using the interpolation model and the adjacent pose rotation matrix information.
- the interpolation model is used to estimate intermediate pose rotation information between adjacent pose rotation information, based on the values of the adjacent pose rotation information at a limited number of points.
- the interpolation model may include a spherical linear interpolation (Spherical Linear Interpolation) model, a cubic spline interpolation (Cubic Spline Interpolation) model, and a nearest neighbor interpolation model.
- S03 Perform differentiation processing on the interpolated rotation matrix information in a geometric manner to obtain the angular velocity information corresponding to the pose sensor.
- in the process of performing differentiation processing on the interpolated rotation matrix information to obtain the angular velocity information corresponding to the pose sensor, a differential model can be constructed; the angular velocity information corresponding to the pose sensor is then obtained through the constructed differential model and the interpolated rotation matrix information.
- the differential model satisfies formula (1): q(t)' = 1/2 · w ⊗ q(t), where w is treated as the pure quaternion (0, w) and ⊗ denotes the quaternion product.
- q(t)' is the differential pose rotation matrix information
- w is the second sub-angular velocity information
- q(t) is the interpolated pose rotation matrix information
- the differential model in the embodiment of the present disclosure is a model used to characterize angular velocity information.
- the differential model may be a quaternion differential model or a model in other mathematical expression forms, which is not limited in the embodiment of the present disclosure.
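- a minimal numpy sketch of the quaternion differential model: inverting q(t)' = 1/2 · w ⊗ q(t) gives w = 2 · q(t)' ⊗ q(t)⁻¹, with q(t)' approximated by a finite difference (the [w, x, y, z] quaternion layout, the finite-difference step, and the test rotation are assumptions for illustration):

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of quaternions stored as [w, x, y, z]."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def angular_velocity(q, q_next, dt):
    """Invert q(t)' = 1/2 * w ⊗ q(t):  w = 2 * q' ⊗ q*  (unit quaternion,
    so the inverse equals the conjugate)."""
    q_dot = (q_next - q) / dt                        # finite-difference q'
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return 2.0 * quat_mul(q_dot, q_conj)[1:]         # vector part = w

# Illustrative check: rotation about z at 0.5 rad/s, q(t) = exp(w t / 2).
dt = 1e-4
q0 = np.array([1.0, 0.0, 0.0, 0.0])
q1 = np.array([np.cos(0.25 * dt), 0.0, 0.0, np.sin(0.25 * dt)])
w_est = angular_velocity(q0, q1, dt)                 # ≈ [0, 0, 0.5]
```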
- S102b Determine the delay time information between two different sensors according to the aligned two angular velocity information.
- since the two sensors are rigidly connected, the angular velocity information acquired by the different sensors at the same time is the same. Therefore, the delay time information between the two different sensors can be determined according to the two aligned angular velocity information.
- an error model can be constructed first, and then the delay time information between the two different sensors can be determined based on the two aligned angular velocity information and the error model.
- the error parameter between the two sensors is introduced, so that the delay time information determined by the error model can be more accurate.
- the error model can be as shown in formula (2): f(Q12, td, bg) = w1(t) - (Q12 · w2(t + td) + bg)
- Q12 is the rotation parameter between the different coordinate axes corresponding to the two different sensors
- td is the delay time information between the two different sensors
- bg is the error parameter between the two different sensors
- w1(t) and w2(t) are respectively the angular velocity information acquired by the two different sensors at the same time t
- f(Q12, td, bg) is the error between the angular velocities.
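- one plausible reading of the error model of formula (2), sketched in numpy: the residual compares sensor 1's angular velocity against sensor 2's, delayed by td, rotated by Q12, and bias-corrected (the residual form and all numeric values below are assumptions for illustration):

```python
import numpy as np

def error_model(Q12, td, bg, w1_func, w2_func, t):
    """Sub-error of formula (2): sensor 2's angular velocity is shifted
    by the delay td, rotated into sensor 1's frame by Q12, corrected by
    the bias bg, and compared with sensor 1's angular velocity."""
    return w1_func(t) - (Q12 @ w2_func(t + td) + bg)

# Synthetic ground truth (illustrative values): 90-degree yaw between the
# two sensor frames, 0.02 s delay, constant gyro bias.
Q12 = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
td, bg = 0.02, np.array([0.01, -0.02, 0.005])
w1 = lambda t: np.array([np.sin(t), np.cos(t), 0.3 * t])
w2 = lambda t: Q12.T @ (w1(t - td) - bg)    # consistent sensor-2 stream

residual = error_model(Q12, td, bg, w1, w2, t=1.0)
# With the true parameters the residual vanishes.
```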
- the method further includes: determining the external parameters between the two different sensors according to the two aligned angular velocity information.
- the external parameters include the rotation parameters between the different coordinate axes corresponding to the two different sensors and the error parameters between the two different sensors.
- in this way, not only can the delay time information be obtained, but also the rotation parameters and error parameters between the two different sensors can be obtained, so that the functions of the electronic device are more abundant.
- FIG. 7 is a schematic diagram of the third implementation flow of a time synchronization processing method provided by an embodiment of the disclosure.
- the delay time information between the two different sensors is determined according to the two aligned angular velocity information; that is, S102b may include S102b1, S102b2, and S102b3, as follows:
- S102b1 Determine the sub-error equations corresponding to the two different sensors at different moments during the rotation of the electronic device according to the two aligned angular velocity information.
- n different moments can be preset within the preset time period according to actual conditions. For example, 15 moments or 20 moments can be set within the preset time period, and then the sub-error equation corresponding to each moment can be obtained.
- for example, the angular velocity information collected by the different sensors corresponding to time 1 is w1(1) and w2(1+td) respectively; the angular velocity information corresponding to time 2 is w1(2) and w2(2+td), and so on; the angular velocity information corresponding to time n is w1(n) and w2(n+td). Then, when the error model satisfies formula (2), the sub-error equations corresponding to the different times 1 to n are: f_i = w1(i) - (Q12 · w2(i+td) + bg), for i = 1, ..., n.
- S102b2 After the sub-error equations corresponding to each time are determined, the sub-error equations corresponding to the different times are summed to obtain the final error equation.
- the embodiment of the present disclosure obtains the delay time information from the errors accumulated at different times, and therefore needs to sum the sub-error equations corresponding to the different times to obtain the final error equation.
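- the accumulation of the sub-errors into the final error equation can be sketched as follows (summing squared sub-errors, plus the sampling times and ground-truth parameters, are illustrative assumptions):

```python
import numpy as np

def final_error(Q12, td, bg, times, w1_samples, w2_func):
    """Sum the squared sub-errors over the n preset times to form the
    final error equation; w2_func is an interpolant for sensor 2 that
    can be evaluated at the shifted times t + td."""
    total = 0.0
    for t, w1 in zip(times, w1_samples):
        e = w1 - (Q12 @ w2_func(t + td) + bg)   # sub-error at time t
        total += float(e @ e)
    return total

# Synthetic ground truth: 90-degree yaw between sensors, 0.02 s delay, no bias.
Q12 = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
bg = np.zeros(3)
w1 = lambda t: np.array([np.sin(t), np.cos(t), 0.3])
w2 = lambda t: Q12.T @ w1(t - 0.02)             # sensor 2 lags by 0.02 s
times = np.linspace(0.0, 5.0, 30)
samples = [w1(t) for t in times]

cost_true = final_error(Q12, 0.02, bg, times, samples, w2)
cost_wrong = final_error(Q12, 0.00, bg, times, samples, w2)
# The cost vanishes at the true delay and grows for a wrong delay.
```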
- S102b3 Perform minimum processing on the final error equation to obtain delay time information.
- the final error equation can be processed to a minimum value to obtain delay time information.
- the above-mentioned minimum value processing is to minimize the final error equation value, and then estimate the delay time information in the error model, the rotation parameter, and the error parameter between the sensors.
- the minimum value processing of the final error equation may be performed through a nonlinear model or an iterative closest point model.
- the process of performing minimum value processing on the final error equation to obtain the delay time information may include: performing iterative closest point processing on the final error equation to obtain the second minimization equation; and solving the second minimization equation until the preset second threshold is met, to acquire the delay time information in the second minimization equation.
- when the iterative closest point processing is performed on the final error equation to obtain the second minimization equation, a candidate delay time can be selected within the preset time period. For example, the golden section method can be used to select a delay time, which is then substituted into the error model.
- the error term obtained at this time contains only two unknowns: the rotation parameter and the error parameter between the sensors.
- in this case, it can be processed by the iterative closest point method, and the rotation parameters and the error parameters between the sensors can then be obtained.
- f(Q12, bg) is the error between the angular velocities
- n is the number of nearest point pairs
- Q12 is the rotation matrix between the two sensors
- bg is the error parameter between the two sensors
- wi1(t) and wi2(t) are respectively points in the point clouds of the angular velocities corresponding to the two sensors.
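- the golden section selection of a candidate delay time mentioned above can be sketched as follows (the toy cost function is an assumption for illustration; in the embodiment the cost would come from the error model):

```python
import math

def golden_section_min(f, a, b, tol=1e-6):
    """Golden-section search for the minimizer of a unimodal f on [a, b]."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0            # 1/phi, about 0.618
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):                               # minimum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                                         # minimum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return 0.5 * (a + b)

# Toy cost: squared angular-velocity mismatch as a function of the candidate
# delay; the true delay is 0.02 s, so the cost is minimized there.
cost = lambda td: sum((math.sin(t) - math.sin(t - 0.02 + td)) ** 2
                      for t in (0.1 * k for k in range(50)))
td_est = golden_section_min(cost, -0.1, 0.1)
```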
- the process of performing minimum value processing on the final error equation to obtain the delay time information may further include: performing nonlinear optimization processing on the final error equation to obtain the first minimization equation; and solving the first minimization equation until the preset first threshold is met, to obtain the delay time information in the first minimization equation.
- since the final error equation is a nonlinear function, it needs to be Taylor-expanded. By minimizing the final error equation, the delay time information corresponding to the case where the preset first threshold is met needs to be found, so that the final error equation drops to a minimum.
- nonlinear optimization processing is performed on the final error equation, and a nonlinear optimization model can be constructed in the process of obtaining the first minimization equation. Based on the final error equation and the nonlinear optimization model, the first minimization equation is determined.
- the preset nonlinear optimization model may include a Gauss-Newton algorithm model or a Levenberg-Marquardt algorithm model, and the first threshold can be set according to the actual needs of the user, for example, 0.1 or 0.01; the embodiments of the present disclosure are not limited here.
- obtaining the delay time information in the first minimization equation includes: when the delay time information is determined for the first time, obtaining the current variable value according to the preset initial variable value and the preset nonlinear optimization model; determining the current solution value of the minimization equation according to the initial variable value, the current variable value, and the first minimization equation; and, when the current solution value meets the preset first threshold, obtaining the delay time information in the first minimization equation.
- the first minimization equation is composed of the final error equation corresponding to the adjacent variable values.
- the first minimization equation can be formula (8): e = f(x_{k+1}) - f(x_k), where f is the final error equation.
- x k+1 is the current variable value
- x k is the initial variable value
- e is the current solution value of the minimization equation.
- otherwise, the next variable value needs to be obtained according to the current variable value and the preset nonlinear optimization model; the next solution value of the minimization equation is determined according to the next variable value, the current variable value, and the first minimization equation; and whether the next solution value meets the preset first threshold is determined in turn. When the next solution value meets the preset first threshold, the iteration ends, and the delay time information in the first minimization equation is obtained.
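- a minimal sketch of the Gauss-Newton iteration mentioned above, x_{k+1} = x_k - (JᵀJ)⁻¹ Jᵀ r(x_k), stopping once the update drops below a threshold (the exponential-fit toy problem and the tolerance value are assumptions for illustration):

```python
import numpy as np

def gauss_newton(residual, jac, x0, tol=1e-6, max_iter=50):
    """Iterate x_{k+1} = x_k - (J^T J)^-1 J^T r(x_k) until the update
    step drops below the preset threshold."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r, J = residual(x), jac(x)
        step = np.linalg.solve(J.T @ J, J.T @ r)
        x = x - step
        if np.linalg.norm(step) < tol:
            break
    return x

# Toy problem (illustrative): fit y = exp(a t) to noiseless data, true a = 0.5.
t = np.linspace(0.0, 2.0, 20)
y = np.exp(0.5 * t)
residual = lambda x: np.exp(x[0] * t) - y
jac = lambda x: (t * np.exp(x[0] * t)).reshape(-1, 1)
a_est = gauss_newton(residual, jac, x0=[0.0])
```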
- rotation parameters and sensor deviation parameters can also be obtained.
- the obtained rotation parameters can transform the measurement information of different coordinate systems into the same coordinate system.
- after performing time synchronization processing on the respective measurement results of the two different sensors according to the delay time information, the method may further include: performing fusion processing on the synchronized measurement results;
- performing at least one of the following operations according to the measurement result of the fusion processing: positioning processing, ranging processing, target detection of the scene where the electronic device is located, and generating or updating a map.
- performing fusion processing on two synchronized measurement results includes: analyzing and synthesizing two measurement results at the same time to obtain a reliable fusion processing result.
- applying the result of the fusion processing to the positioning process can achieve precise positioning of the electronic device; applying the result of the fusion processing to the ranging process can improve the measurement accuracy; applying the result of the fusion processing to the In the target detection process of the scene where the electronic device is located, accurate target detection results can be obtained; when the fusion processing result is applied to the process of generating or updating a map, an accurate map can be obtained.
- in the process of fusing the synchronized measurement results, a fusion algorithm model can be constructed to obtain accurate fusion processing results.
- the fusion algorithm model may include a Kalman filter fusion algorithm model and a cluster analysis recognition algorithm model, which is not limited in the embodiment of the present disclosure.
- the Kalman filter fusion algorithm model can be used to perform fusion processing on the synchronized measurement results, including: using the measurement results of the two different sensors set on the electronic device to perform state propagation, so that the pose estimate at the previous moment is propagated through the first measurement result to obtain a preliminary pose estimate at the current moment; the second measurement result at the current moment is then used as observation information to correct the preliminary pose estimate, thereby obtaining an optimal pose estimate at the current moment.
- the best estimate is the result of the fusion processing.
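- a one-dimensional sketch of the predict/update cycle described above (all noise variances and measurement values are illustrative; a real implementation would propagate a full pose state):

```python
def kf_fuse(x_prev, P_prev, u, q, z, r):
    """One predict/update cycle of a scalar Kalman filter.

    x_prev, P_prev: pose estimate and variance at the previous moment;
    u, q: motion increment from the first sensor and its noise variance;
    z, r: observation from the second sensor and its noise variance.
    """
    x_pred = x_prev + u                 # state propagation (preliminary estimate)
    P_pred = P_prev + q
    K = P_pred / (P_pred + r)           # Kalman gain
    x_new = x_pred + K * (z - x_pred)   # correct with the observation
    P_new = (1.0 - K) * P_pred
    return x_new, P_new

# Illustrative numbers: the inertial sensor reports a +1.0 pose increment,
# the vision sensor observes the pose as 1.08.
x, P = kf_fuse(x_prev=0.0, P_prev=0.04, u=1.0, q=0.01, z=1.08, r=0.05)
# x lands between prediction (1.0) and observation (1.08); P shrinks.
```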
- the time synchronization processing method in the embodiment of the present disclosure can also be applied to more than three sensors.
- the measurement results after the time synchronization processing of every two sensors are acquired, and the measurement results after the time synchronization processing are fused to obtain the result of the fusion processing.
- more accurate measurement results can be obtained by fusing the measurement results corresponding to at least three sensors.
- FIG. 8 is a fourth schematic diagram of the implementation process of a time synchronization processing method provided by an embodiment of the present disclosure.
- two different sensors provided on an electronic device are a vision sensor and an inertial sensor.
- the time synchronization processing method of this embodiment may include the following steps:
- the inertial sensor on the electronic device can independently obtain angular velocity information. Since the inertial sensor is provided with a gyro unit, the angular velocity information when the electronic device rotates can be directly obtained through the gyro unit.
- the inertial sensor is a sensor that measures the three-axis attitude angle (or angular rate) and acceleration.
- the inertial sensor can also include an acceleration unit, where the acceleration unit can detect the acceleration information of the electronic device along three axes.
- the visual sensor on the electronic device can independently obtain the pose rotation matrix information.
- the vision sensor is provided with a pose estimation unit, which can obtain the pose rotation matrix information when the electronic device rotates.
- the vision sensor is mainly composed of one or two graphics sensors, and may also be equipped with a light projector and other auxiliary equipment. It can acquire original images within a preset time period, store the acquired images in memory, and compare and analyze them against a benchmark, thereby calculating the pose rotation matrix information of the electronic device.
- the rotational movement of the electronic device can be represented by a quaternion; the rotational movement of the electronic device can also be represented by a three-dimensional rotation group, which is not limited in the embodiment of the present disclosure.
- S203 Determine the second angular velocity information according to the pose rotation matrix information, and align the second angular velocity information with the first angular velocity information.
- the image frequency of the vision sensor is usually 10 Hz
- the frequency of the inertial sensor is usually 100 Hz
- At least one of the second angular velocity information and the first angular velocity information can be interpolated through an interpolation model.
- for different representations of the rotational motion, the interpolation models used are also different. For example, for the rotational motion represented by a quaternion, the pose rotation matrix can be interpolated through the spherical linear interpolation model; for the rotational motion represented by a three-dimensional rotation group, the cubic spline interpolation model or the nearest neighbor interpolation model can be used to interpolate the pose rotation matrix information.
- the rotation motion represented by a quaternion is used as an example to illustrate the interpolation processing process of the pose rotation matrix information of the vision sensor using a spherical linear interpolation model.
- a quaternion is composed of a real part plus imaginary parts, as in formula (9): q = e0 + e1·i + e2·j + e3·k
- e 0 , e 1 , e 2 , and e 3 are real numbers, i, j, and k are mutually orthogonal imaginary number units, and q is the pose rotation matrix information represented by a quaternion.
- the quaternion can also be expressed through the exponential map, as in formula (10): q = exp(w·t/2), where:
- w is the angular velocity
- t is the time
- q is the pose rotation matrix information represented by the exponential mapping of the quaternion.
- the constructed spherical interpolation model is formula (11): q(t) = [sin((1 - t)·θ) / sin θ] · q0 + [sin(t·θ) / sin θ] · q1, where θ is the angle between q0 and q1.
- q 0 and q 1 are adjacent posture rotation matrices
- q(t) is the posture rotation matrix information obtained by spherical interpolation of the posture rotation matrix.
- the posture rotation matrix information after interpolation can be obtained.
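- a minimal numpy sketch of the spherical linear interpolation of formula (11) (the [w, x, y, z] quaternion layout and the near-parallel linear fallback are implementation assumptions, not prescribed by the disclosure):

```python
import numpy as np

def slerp(q0, q1, t):
    """Formula (11): q(t) = sin((1-t)*theta)/sin(theta) * q0
                          + sin(t*theta)/sin(theta) * q1,
    where theta is the angle between the unit quaternions q0 and q1."""
    dot = float(np.dot(q0, q1))
    if dot > 0.9995:                   # nearly parallel: fall back to lerp
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1.0 - t) * theta) * q0
            + np.sin(t * theta) * q1) / np.sin(theta)

# Interpolate halfway between the identity and a 90-degree rotation about z;
# the result should be the 45-degree rotation [cos(pi/8), 0, 0, sin(pi/8)].
q0 = np.array([1.0, 0.0, 0.0, 0.0])
q1 = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])
q_half = slerp(q0, q1, 0.5)
```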
- FIG. 9 is a first schematic diagram of the embodiment of the present disclosure using a quaternion to represent the pose rotation matrix information measured by the pose sensor.
- the electronic device continuously moves from the previous moment to the current moment. That is, rotate from a solid circle to a dotted circle.
- FIG. 10 is a second schematic diagram of the embodiment of the disclosure using a quaternion to represent the pose rotation matrix information measured by the pose sensor. As shown in FIG. 10, it is a plan view of the dotted frame extracted from FIG. 9.
- q0 and q1 are the adjacent posture rotation matrices
- q(t) is the posture rotation matrix information after spherical interpolation is performed on the posture rotation matrix
- the angle of rotation is θ
- t is the time
- in the process of acquiring the second angular velocity information, the angular velocity information corresponding to the vision sensor can be determined according to the interpolated pose rotation matrix information and the differential model.
- the differential model is as in formula (1).
- the second angular velocity information can be obtained indirectly by first interpolating the pose rotation matrix information and then performing differential processing. In this way, the method can adapt to time synchronization between more sensors and has universal applicability.
- S204 Determine the delay time information between the visual sensor and the inertial sensor.
- the delay time information between the visual sensor and the inertial sensor can be determined according to the first angular velocity information and the second angular velocity information.
- the delay time information between the vision sensor and the inertial sensor can be solved, but also the rotation parameter and the error parameter between the sensors can be solved.
- S205 Perform time synchronization processing on the measurement result of the visual sensor and the measurement result of the inertial sensor according to the delay time information.
- the time synchronization processing method provided by the embodiments of the present disclosure can be applied to unmanned driving or mobile robot navigation.
- the unmanned electronic device or mobile robot electronic device can realize precise positioning through the time synchronization processing method provided by the embodiments of the present disclosure.
- the embodiment of the present disclosure only uses a vision sensor and an inertial sensor as an example to illustrate the process of the electronic device implementing the time synchronization processing method.
- the time synchronization processing method provided by the embodiment of the present disclosure is not limited to use between the vision sensor and the inertial sensor; it can also be applied to other sensors, as long as the other sensors can independently obtain angular velocity information or pose rotation matrix information.
- the embodiments of the present disclosure can also be applied to electronic devices including more than three sensors, that is, the above-mentioned time synchronization method is also applicable to electronic devices with more than three sensors.
- FIG. 11 is a schematic diagram of the composition structure of a time synchronization processing device provided by an embodiment of the disclosure.
- the synchronization processing device 300 includes a first acquisition module 301, an alignment module 302, and a synchronization module 303, where:
- the first acquisition module 301 is configured to obtain the angular velocity information collected by two different sensors; the angular velocity information is the angular velocity information when the electronic device rotates; the two different sensors are both set on the electronic device and rigidly connected;
- the alignment module 302 is configured to perform alignment processing on the two obtained angular velocity information, and determine the delay time information between the two different sensors;
- the synchronization module 303 is configured to perform time synchronization processing on the respective measurement results of the two different sensors according to the delay time information.
- the time synchronization processing device of the embodiment of the present disclosure only requires the electronic device to be capable of rotational motion, that is, the electronic device only needs to rotate around a rotation axis to calibrate the delay time information between different sensors on the electronic device, without requiring movement around multiple axes; this reduces the complexity of time synchronization of different sensors and can adapt to the needs of different scenarios. The embodiments of the present disclosure can also calibrate the delay time based on the rotational movement generated by the multi-axis rotation of the electronic device, obtaining richer rotation information to improve the accuracy of time synchronization. The embodiments of the present disclosure only require the sensors to independently obtain angular velocity information to determine the delay time information between different sensors, so the method can be widely applied to time synchronization between a variety of sensor devices and has universal adaptability;
- the embodiment of the present disclosure is a method for realizing time synchronization based on software, and there is no need to deploy additional dedicated hardware for time synchronization;
- the embodiment of the present disclosure performs time synchronization based on the delay time information obtained in real time, and can realize online time synchronization processing;
- the embodiments of the present disclosure determine the delay time information through the angular velocity information obtained when the electronic device rotates, instead of treating the delay time information as a constant, which improves the accuracy of time synchronization.
- the first acquisition module 301 is configured to obtain the pose rotation matrix information collected by the pose sensor; the angular velocity information when the electronic device rotates is determined from the pose rotation matrix information, where at least one of the two different sensors is a pose sensor.
- the first acquisition module 301 is configured to obtain angular velocity information collected by a gyro sensor, where at least one of the two different sensors is a gyro sensor.
- the alignment module 302 is configured to perform interpolation processing on at least one of the two angular velocity information to align the two angular velocity information, and determine the delay time information between the two different sensors according to the two aligned angular velocity information.
- the alignment module 302 is configured to, when the two different sensors are both gyro sensors, perform interpolation processing on at least one of the obtained angular velocity information collected by the two gyro sensors to align the two angular velocity information.
- the alignment module 302 is configured to, when the two different sensors are both pose sensors, perform interpolation processing on at least one of the obtained pose rotation matrix information collected by the two pose sensors to align the two angular velocity information.
- the alignment module 302 is configured to, when the two different sensors are a pose sensor and a gyro sensor, perform interpolation processing on the acquired pose rotation matrix information collected by the pose sensor to align the two angular velocity information.
- the time synchronization processing device 300 further includes: a second acquisition module 304, configured to determine the external parameters between the two different sensors after the alignment module 302 performs alignment processing on the two angular velocity information;
- the external parameters include rotation parameters between different coordinate axes corresponding to the two different sensors and error parameters between the two different sensors.
- the time synchronization processing device 300 further includes: a storage module 305, configured to store the delay time information;
- the synchronization module is configured to perform time synchronization processing on the respective measurement results of the two different sensors according to the delay time information stored by the storage module 305 when the electronic device is in a non-motion state.
- the alignment module 302 is configured to determine the sub-error equations corresponding to the two different sensors at different times according to the two aligned angular velocity information; sum the sub-error equations corresponding to the different times to obtain the final error equation; and perform minimum value processing on the final error equation to obtain the delay time information.
- the alignment module 302 is configured to perform nonlinear optimization processing on the final error equation to obtain a first minimization equation, and solve the first minimization equation until the preset first threshold is met, to acquire the delay time information in the first minimization equation.
- the alignment module 302 is configured to perform iterative closest point processing on the final error equation to obtain a second minimization equation, and solve the second minimization equation until the preset second threshold is met, to acquire the delay time information in the second minimization equation.
- the time synchronization processing apparatus 300 further includes:
- the fusion module 307 is used to perform fusion processing on the synchronized measurement results
- the execution module 308 is configured to perform at least one of the following operations according to the measurement result of the fusion processing: positioning processing, ranging processing, target detection of the scene where the electronic device is located, and generating or updating a map.
- it should be noted that when the time synchronization processing device provided in the above embodiment performs time synchronization processing, the division of the above program modules is only used as an example for illustration. In actual applications, the above processing can be allocated to different program modules as needed, that is, the internal structure of the device can be divided into different program modules to complete all or part of the processing described above.
- the time synchronization processing device provided in the foregoing embodiment belongs to the same concept as the time synchronization processing method embodiment; the specific implementation process is detailed in the method embodiment and will not be repeated here.
- FIG. 12 is a schematic diagram of the structure of the electronic device provided by the embodiment of the disclosure.
- the electronic device at least includes a processor 21 and a memory 23 storing a computer program capable of running on the processor 21; the processor 21 is used to execute the steps in the time synchronization processing method provided in the above embodiment when running the computer program.
- the electronic device used to execute the steps in the time synchronization processing method provided in the foregoing embodiment may be the same as or different from the electronic device provided with the two different sensors.
- the electronic device may further include a communication interface 24, which is used to obtain angular velocity information and pose rotation matrix information.
- the various components in the electronic device are coupled together through the bus system 25. It can be understood that the bus system 25 is used to implement connection and communication between these components.
- the bus system 25 also includes a power bus, a control bus, and a status signal bus. However, for the sake of clear description, various buses are marked as the bus system 25 in FIG. 12.
- the memory 23 may be a volatile memory or a non-volatile memory, and may also include both volatile and non-volatile memory.
- the non-volatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a ferromagnetic random access memory (FRAM), a flash memory, a magnetic surface memory, an optical disc, or a compact disc read-only memory (CD-ROM); the magnetic surface memory may be a magnetic disk memory or a magnetic tape memory.
- the volatile memory may be random access memory (RAM, Random Access Memory), which is used as an external cache.
- by way of example and not limitation, many forms of RAM are available, such as:
- static random access memory (SRAM)
- synchronous static random access memory (SSRAM)
- dynamic random access memory (DRAM)
- synchronous dynamic random access memory (SDRAM)
- double data rate synchronous dynamic random access memory (DDRSDRAM)
- enhanced synchronous dynamic random access memory (ESDRAM)
- SyncLink dynamic random access memory (SLDRAM)
- direct Rambus random access memory (DRRAM)
- the memory 23 described in the embodiment of the present invention is intended to include, but is not limited to, these and any other suitable types of memory.
- the method disclosed in the foregoing embodiment of the present invention may be applied to the processor 21 or implemented by the processor 21.
- the processor 21 may be an integrated circuit chip with signal processing capability. In the implementation process, the steps of the foregoing method can be completed by an integrated logic circuit of hardware in the processor 21 or instructions in the form of software.
- the aforementioned processor 21 may be a general-purpose processor, a digital signal processor (DSP, Digital Signal Processor), or other programmable logic devices, discrete gates or transistor logic devices, discrete hardware components, etc.
- the processor 21 may implement or execute various methods, steps, and logical block diagrams disclosed in the embodiments of the present invention.
- the general-purpose processor may be a microprocessor or any conventional processor.
- the steps of the method disclosed in the embodiments of the present invention can be directly embodied as being executed and completed by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
- the software module may be located in a storage medium, and the storage medium is located in the memory 23.
- the processor 21 reads the information in the memory 23 and completes the steps of the foregoing method in combination with its hardware.
- the embodiment of the present disclosure provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by the foregoing processor, the steps of the time synchronization processing method in the foregoing embodiment are implemented.
- the disclosed device and method may be implemented in other ways.
- the device embodiments described above are merely illustrative.
- the division of the units is only a logical function division, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
- the coupling, direct coupling, or communication connection between the components shown or discussed may be indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
- the units described above as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
- the functional units in the embodiments of the present disclosure may all be integrated into one processing unit, each unit may serve as a unit individually, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.
- the foregoing program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the foregoing method embodiment. The foregoing storage medium includes various media that can store program code, such as a removable storage device, a ROM, a RAM, a magnetic disk, or an optical disc.
- when the aforementioned integrated unit of the present disclosure is implemented in the form of a software function module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
- the computer software product is stored in a storage medium and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the various embodiments of the present disclosure.
- the aforementioned storage media include: removable storage devices, ROM, RAM, magnetic disks or optical disks and other media that can store program codes.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Manufacturing & Machinery (AREA)
- Data Mining & Analysis (AREA)
- Theoretical Computer Science (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- Evolutionary Biology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Artificial Intelligence (AREA)
- Life Sciences & Earth Sciences (AREA)
- Automation & Control Theory (AREA)
- Navigation (AREA)
- Gyroscopes (AREA)
- Traffic Control Systems (AREA)
Claims (28)
- 1. A time synchronization processing method, the method comprising: separately obtaining angular velocity information collected by two different sensors, wherein the angular velocity information is angular velocity information of an electronic device undergoing rotational motion, and the two different sensors are both arranged on the electronic device and rigidly connected; aligning the two pieces of obtained angular velocity information to determine delay time information between the two different sensors; and performing time synchronization processing on respective measurement results of the two different sensors according to the delay time information.
- 2. The method according to claim 1, wherein at least one of the two different sensors is a pose sensor, and separately obtaining the angular velocity information collected by the two different sensors comprises: obtaining pose rotation matrix information collected by the pose sensor; and determining, from the pose rotation matrix information, the angular velocity information of the electronic device undergoing rotational motion.
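Claim 2 derives angular velocity from consecutive pose rotation matrices. A minimal sketch of one common way to do this, using a finite-difference matrix log map (the function name, NumPy formulation, and the finite-difference approximation are illustrative assumptions, not the patent's prescribed computation):

```python
import numpy as np

def angular_velocity_from_rotations(R_prev, R_next, dt):
    """Approximate the body-frame angular velocity (rad/s) between two
    consecutive 3x3 pose rotation matrices sampled dt seconds apart."""
    # Relative rotation over the interval.
    dR = R_prev.T @ R_next
    # Rotation angle from the trace; clip guards against numeric drift.
    cos_theta = np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    if theta < 1e-8:
        return np.zeros(3)
    # Rotation axis from the skew-symmetric part of dR (matrix log map).
    axis = np.array([dR[2, 1] - dR[1, 2],
                     dR[0, 2] - dR[2, 0],
                     dR[1, 0] - dR[0, 1]]) / (2.0 * np.sin(theta))
    return axis * theta / dt
```

For example, a rotation of 0.1 rad about the z-axis over dt = 0.01 s yields an angular velocity of roughly [0, 0, 10] rad/s.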
- 3. The method according to claim 1 or 2, wherein at least one of the two different sensors is a gyro sensor, and separately obtaining the angular velocity information collected by the two different sensors comprises: obtaining the angular velocity information collected by the gyro sensor.
- 4. The method according to claim 1, wherein aligning the two pieces of obtained angular velocity information to determine the delay time information between the two different sensors comprises: performing interpolation processing on at least one of the two pieces of angular velocity information to align the two pieces of angular velocity information; and determining the delay time information between the two different sensors according to the two pieces of aligned angular velocity information.
- 5. The method according to claim 4, wherein performing interpolation processing on at least one of the two pieces of angular velocity information comprises: in a case where the two different sensors are both gyro sensors, performing interpolation processing on at least one of the pieces of angular velocity information separately collected by the two gyro sensors.
- 6. The method according to claim 4, wherein performing interpolation processing on at least one of the two pieces of angular velocity information comprises: in a case where the two different sensors are both pose sensors, performing interpolation processing on at least one of the pieces of pose rotation matrix information separately collected by the two pose sensors.
- 7. The method according to claim 4, wherein performing interpolation processing on at least one of the two pieces of angular velocity information comprises: in a case where the two different sensors are a pose sensor and a gyro sensor, respectively, performing interpolation processing on the pose rotation matrix information collected by the pose sensor.
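Claims 4-7 align the two streams by interpolating at least one of them onto a common set of timestamps. A minimal sketch of per-axis linear interpolation of an angular velocity series (the names and the choice of linear interpolation are assumptions; the claims do not prescribe a particular interpolation scheme, and for a pose sensor the rotation matrices would typically be converted to angular velocity first, per claim 2):

```python
import numpy as np

def align_angular_velocity(t_ref, t_other, w_other):
    """Linearly interpolate an (N, 3) angular velocity series sampled at
    strictly increasing times t_other onto the reference timestamps
    t_ref, axis by axis."""
    w_other = np.asarray(w_other, dtype=float)
    # np.interp requires increasing sample times and clamps outside them.
    return np.stack([np.interp(t_ref, t_other, w_other[:, k])
                     for k in range(w_other.shape[1])], axis=1)
```

After this step both sensors' angular velocities are defined on the same timestamps, which is what the delay estimation of claims 10-12 operates on.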
- 8. The method according to any one of claims 1-7, wherein after aligning the two pieces of obtained angular velocity information and determining the delay time information between the two different sensors, the method further comprises: determining extrinsic parameters between the two different sensors by aligning the two pieces of angular velocity information, the extrinsic parameters comprising rotation parameters between the respective coordinate axes of the two different sensors and an error parameter between the two different sensors.
- 9. The method according to any one of claims 1-8, wherein after aligning the two pieces of obtained angular velocity information and determining the delay time information between the two different sensors, the method further comprises: storing the delay time information; and in a case where the electronic device is in a non-moving state, performing time synchronization processing on the respective measurement results of the two different sensors according to the stored delay time information.
- 10. The method according to claim 4, wherein determining the delay time information between the two different sensors according to the two pieces of aligned angular velocity information comprises: determining sub-error equations corresponding to the two different sensors at different moments according to the two pieces of aligned angular velocity information; summing the sub-error equations corresponding to the different moments to obtain a final error equation; and performing minimization processing on the final error equation to obtain the delay time information.
- 11. The method according to claim 10, wherein performing minimization processing on the final error equation to obtain the delay time information comprises: performing nonlinear processing on the final error equation to obtain a first minimization equation; and solving the first minimization equation until a preset first threshold is satisfied, to obtain the delay time information in the first minimization equation.
- 12. The method according to claim 10, wherein performing minimization processing on the final error equation to obtain the delay time information comprises: performing iterative closest point processing on the final error equation to obtain a second minimization equation; and solving the second minimization equation until a preset second threshold is satisfied, to obtain the delay time information in the second minimization equation.
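Claims 10-12 obtain the delay by summing per-moment sub-errors into a final error equation and minimizing it. As a rough stand-in for the nonlinear and iterative-closest-point solvers named in claims 11-12, this sketch grid-searches candidate delays and keeps the one with the smallest mean squared error between the streams (all names, the grid search, and the mean normalization are illustrative assumptions, not the patent's solver):

```python
import numpy as np

def estimate_delay(t, w_a, w_b, max_delay=0.05, step=0.001):
    """Return the delay d minimizing the error between stream a at
    times t and stream b evaluated at times t + d; both streams are
    (N, 3) arrays sampled on the common time base t."""
    w_a, w_b = np.asarray(w_a, float), np.asarray(w_b, float)
    best_d, best_err = 0.0, np.inf
    for d in np.arange(-max_delay, max_delay + step / 2, step):
        t_shift = t + d
        # Only compare where the shifted timestamps stay inside stream b.
        valid = (t_shift >= t[0]) & (t_shift <= t[-1])
        w_b_at = np.stack(
            [np.interp(t_shift[valid], t, w_b[:, k])
             for k in range(w_b.shape[1])], axis=1)
        # Mean of per-moment squared sub-errors stands in for the summed
        # error equation (normalized so different overlaps compare fairly).
        err = np.mean((w_a[valid] - w_b_at) ** 2)
        if err < best_err:
            best_d, best_err = d, err
    return best_d
```

On a synthetic sinusoidal angular velocity with stream b lagging by 20 ms, the search recovers a delay of about 0.02 s.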
- 13. The method according to any one of claims 1-12, wherein after performing time synchronization processing on the respective measurement results of the two different sensors according to the delay time information, the method further comprises: performing fusion processing on the synchronized measurement results; and performing at least one of the following operations according to the fused measurement results: positioning processing, ranging processing, target detection of the scene where the electronic device is located, and generating or updating a map.
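Claim 13's fusion step presupposes that, once the delay is known, measurements from the two sensors can be matched in a common time base. A toy sketch of pairing samples after compensating the estimated delay (the nearest-neighbour pairing, the sign convention, and the tolerance are assumptions, not part of the claims):

```python
import numpy as np

def pair_synchronized(t_a, t_b, delay, tol=0.005):
    """Return index pairs (i, j) where sample j of sensor b, after
    compensating the estimated delay, falls within tol seconds of
    sample i of sensor a."""
    # Rebase b's timestamps onto a's clock (sign convention assumed here).
    t_b_sync = np.asarray(t_b, float) - delay
    pairs = []
    for i, ta in enumerate(np.asarray(t_a, float)):
        j = int(np.argmin(np.abs(t_b_sync - ta)))
        if abs(t_b_sync[j] - ta) <= tol:
            pairs.append((i, j))
    return pairs
```

The matched pairs would then feed whatever fusion, positioning, ranging, detection, or mapping step follows.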
- 14. A time synchronization processing device, the device comprising a first acquisition module, an alignment module, and a synchronization module, wherein: the first acquisition module is configured to separately obtain angular velocity information collected by two different sensors, the angular velocity information being angular velocity information of an electronic device undergoing rotational motion, and the two different sensors being both arranged on the electronic device and rigidly connected; the alignment module is configured to align the two pieces of obtained angular velocity information to determine delay time information between the two different sensors; and the synchronization module is configured to perform time synchronization processing on respective measurement results of the two different sensors according to the delay time information.
- 15. The device according to claim 14, wherein the first acquisition module is configured to obtain pose rotation matrix information collected by a pose sensor, and to determine, from the pose rotation matrix information, the angular velocity information of the electronic device undergoing rotational motion, wherein at least one of the two different sensors is a pose sensor.
- 16. The device according to claim 14 or 15, wherein the first acquisition module is configured to obtain angular velocity information collected by a gyro sensor, wherein at least one of the two different sensors is a gyro sensor.
- 17. The device according to claim 14, wherein the alignment module is configured to perform interpolation processing on at least one of the two pieces of angular velocity information to align the two pieces of angular velocity information, and to determine the delay time information between the two different sensors according to the two pieces of aligned angular velocity information.
- 18. The device according to claim 17, wherein the alignment module is configured to, in a case where the two different sensors are both gyro sensors, perform interpolation processing on at least one of the pieces of angular velocity information separately collected by the two gyro sensors, to align the two pieces of angular velocity information.
- 19. The device according to claim 17, wherein the alignment module is configured to, in a case where the two different sensors are both pose sensors, perform interpolation processing on at least one of the pieces of pose rotation matrix information separately collected by the two pose sensors, to align the two pieces of angular velocity information.
- 20. The device according to claim 17, wherein the alignment module is configured to, in a case where the two different sensors are a pose sensor and a gyro sensor, respectively, perform interpolation processing on the pose rotation matrix information collected by the pose sensor, to align the two pieces of angular velocity information.
- 21. The device according to any one of claims 14-20, wherein the device further comprises: a second acquisition module, configured to determine extrinsic parameters between the two different sensors through the alignment processing of the two pieces of angular velocity information by the alignment module, the extrinsic parameters comprising rotation parameters between the respective coordinate axes of the two different sensors and an error parameter between the two different sensors.
- 22. The device according to any one of claims 14-21, wherein the device further comprises a storage module, configured to store the delay time information; and the synchronization module is configured to, in a case where the electronic device is in a non-moving state, perform time synchronization processing on the respective measurement results of the two different sensors according to the delay time information stored in the storage module.
- 23. The device according to claim 17, wherein the alignment module is configured to determine sub-error equations corresponding to the two different sensors at different moments according to the two pieces of aligned angular velocity information, sum the sub-error equations corresponding to the different moments to obtain a final error equation, and perform minimization processing on the final error equation to obtain the delay time information.
- 24. The device according to claim 23, wherein the alignment module is configured to perform nonlinear processing on the final error equation to obtain a first minimization equation, and to solve the first minimization equation until a preset first threshold is satisfied, to obtain the delay time information in the first minimization equation.
- 25. The device according to claim 23, wherein the alignment module is configured to perform iterative closest point processing on the final error equation to obtain a second minimization equation, and to solve the second minimization equation until a preset second threshold is satisfied, to obtain the delay time information in the second minimization equation.
- 26. The device according to any one of claims 14-25, wherein the device further comprises: a fusion module, configured to perform fusion processing on the synchronized measurement results; and an execution module, configured to perform at least one of the following operations according to the fused measurement results: positioning processing, ranging processing, target detection of the scene where the electronic device is located, and generating or updating a map.
- 27. An electronic device, comprising at least a processor and a memory for storing a computer program capable of running on the processor; wherein the processor is configured to execute the method according to any one of claims 1 to 13 when running the computer program.
- 28. A computer-readable storage medium having a computer program stored thereon, wherein, when the computer program is executed by a processor, the method according to any one of claims 1 to 13 is implemented.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020217017070A KR20210084622A (en) | 2019-06-21 | 2020-02-26 | Time synchronization processing methods, electronic devices and storage media |
JP2021531851A JP2022510418A (en) | 2019-06-21 | 2020-02-26 | Time synchronization processing method, electronic devices and storage media |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910545218.8A CN112113582A (en) | 2019-06-21 | 2019-06-21 | Time synchronization processing method, electronic device, and storage medium |
CN201910545218.8 | 2019-06-21 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020253260A1 true WO2020253260A1 (en) | 2020-12-24 |
Family
ID=73796638
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/076836 WO2020253260A1 (en) | 2019-06-21 | 2020-02-26 | Time synchronization processing method, electronic apparatus, and storage medium |
Country Status (4)
Country | Link |
---|---|
JP (1) | JP2022510418A (en) |
KR (1) | KR20210084622A (en) |
CN (1) | CN112113582A (en) |
WO (1) | WO2020253260A1 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113310505A (en) * | 2021-06-15 | 2021-08-27 | 苏州挚途科技有限公司 | External parameter calibration method and device of sensor system and electronic equipment |
CN113591015A (en) * | 2021-07-30 | 2021-11-02 | 北京小狗吸尘器集团股份有限公司 | Time delay calculation method and device, storage medium and electronic equipment |
CN113848696A (en) * | 2021-09-15 | 2021-12-28 | 北京易航远智科技有限公司 | Multi-sensor time synchronization method based on position information |
CN114217665A (en) * | 2021-12-21 | 2022-03-22 | 清华大学 | Camera and laser radar time synchronization method, device and storage medium |
CN115235527A (en) * | 2022-07-20 | 2022-10-25 | 上海木蚁机器人科技有限公司 | Sensor external parameter calibration method and device and electronic equipment |
CN115451932A (en) * | 2022-09-16 | 2022-12-09 | 湖南航天机电设备与特种材料研究所 | Multichannel gyroscope data synchronous acquisition and calculation method and system |
CN115979277A (en) * | 2023-02-22 | 2023-04-18 | 广州导远电子科技有限公司 | Time synchronization method, device, electronic equipment and computer readable storage medium |
CN117034201A (en) * | 2023-10-08 | 2023-11-10 | 东营航空产业技术研究院 | Multi-source real-time data fusion method |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113177440A (en) * | 2021-04-09 | 2021-07-27 | 深圳市商汤科技有限公司 | Image synchronization method and device, electronic equipment and computer storage medium |
CN113610136A (en) * | 2021-07-30 | 2021-11-05 | 深圳元戎启行科技有限公司 | Sensor data synchronization method and device, computer equipment and storage medium |
CN114413878B (en) * | 2021-12-24 | 2024-02-13 | 苏州浪潮智能科技有限公司 | Time calibration system, method, electronic equipment and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103616710A (en) * | 2013-12-17 | 2014-03-05 | 靳文瑞 | Multi-sensor combined navigation time synchronizing system based on field programmable gate array (FPGA) |
CN104009833A (en) * | 2013-02-26 | 2014-08-27 | 赫克斯冈技术中心 | Sensor synchronization method and sensor measuring system appertaining thereto |
CN104501817A (en) * | 2014-11-24 | 2015-04-08 | 李青花 | Error elimination-based vehicle navigation system |
CN108680196A (en) * | 2018-04-28 | 2018-10-19 | 远形时空科技(北京)有限公司 | A kind of time delay adjustment method, system and computer-readable medium |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002090151A (en) * | 2000-09-14 | 2002-03-27 | Japan Aviation Electronics Industry Ltd | Sensor abnormality determination circuit and steering control device |
CN107728617B (en) * | 2017-09-27 | 2021-07-06 | 速感科技(北京)有限公司 | Multi-view online calibration method, mobile robot and system |
CN108871311B (en) * | 2018-05-31 | 2021-01-19 | 北京字节跳动网络技术有限公司 | Pose determination method and device |
CN109029433B (en) * | 2018-06-28 | 2020-12-11 | 东南大学 | Method for calibrating external parameters and time sequence based on vision and inertial navigation fusion SLAM on mobile platform |
CN109186596B (en) * | 2018-08-14 | 2020-11-10 | 深圳清华大学研究院 | IMU measurement data generation method, system, computer device and readable storage medium |
CN109506617B (en) * | 2018-12-28 | 2021-08-10 | 歌尔科技有限公司 | Sensor data processing method, storage medium, and electronic device |
- 2019
- 2019-06-21 CN CN201910545218.8A patent/CN112113582A/en active Pending
- 2020
- 2020-02-26 WO PCT/CN2020/076836 patent/WO2020253260A1/en active Application Filing
- 2020-02-26 KR KR1020217017070A patent/KR20210084622A/en active Search and Examination
- 2020-02-26 JP JP2021531851A patent/JP2022510418A/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104009833A (en) * | 2013-02-26 | 2014-08-27 | 赫克斯冈技术中心 | Sensor synchronization method and sensor measuring system appertaining thereto |
CN103616710A (en) * | 2013-12-17 | 2014-03-05 | 靳文瑞 | Multi-sensor combined navigation time synchronizing system based on field programmable gate array (FPGA) |
CN104501817A (en) * | 2014-11-24 | 2015-04-08 | 李青花 | Error elimination-based vehicle navigation system |
CN108680196A (en) * | 2018-04-28 | 2018-10-19 | 远形时空科技(北京)有限公司 | A kind of time delay adjustment method, system and computer-readable medium |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113310505A (en) * | 2021-06-15 | 2021-08-27 | 苏州挚途科技有限公司 | External parameter calibration method and device of sensor system and electronic equipment |
CN113310505B (en) * | 2021-06-15 | 2024-04-09 | 苏州挚途科技有限公司 | External parameter calibration method and device of sensor system and electronic equipment |
CN113591015A (en) * | 2021-07-30 | 2021-11-02 | 北京小狗吸尘器集团股份有限公司 | Time delay calculation method and device, storage medium and electronic equipment |
CN113848696A (en) * | 2021-09-15 | 2021-12-28 | 北京易航远智科技有限公司 | Multi-sensor time synchronization method based on position information |
CN113848696B (en) * | 2021-09-15 | 2022-09-16 | 北京易航远智科技有限公司 | Multi-sensor time synchronization method based on position information |
CN114217665A (en) * | 2021-12-21 | 2022-03-22 | 清华大学 | Camera and laser radar time synchronization method, device and storage medium |
CN115235527A (en) * | 2022-07-20 | 2022-10-25 | 上海木蚁机器人科技有限公司 | Sensor external parameter calibration method and device and electronic equipment |
CN115451932A (en) * | 2022-09-16 | 2022-12-09 | 湖南航天机电设备与特种材料研究所 | Multichannel gyroscope data synchronous acquisition and calculation method and system |
CN115451932B (en) * | 2022-09-16 | 2024-05-24 | 湖南航天机电设备与特种材料研究所 | Multichannel gyroscope data synchronous acquisition and calculation method and system |
CN115979277A (en) * | 2023-02-22 | 2023-04-18 | 广州导远电子科技有限公司 | Time synchronization method, device, electronic equipment and computer readable storage medium |
CN115979277B (en) * | 2023-02-22 | 2023-06-02 | 广州导远电子科技有限公司 | Time synchronization method, apparatus, electronic device, and computer-readable storage medium |
CN117034201A (en) * | 2023-10-08 | 2023-11-10 | 东营航空产业技术研究院 | Multi-source real-time data fusion method |
Also Published As
Publication number | Publication date |
---|---|
JP2022510418A (en) | 2022-01-26 |
CN112113582A (en) | 2020-12-22 |
KR20210084622A (en) | 2021-07-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020253260A1 (en) | Time synchronization processing method, electronic apparatus, and storage medium | |
CN111156998B (en) | Mobile robot positioning method based on RGB-D camera and IMU information fusion | |
Qin et al. | Vins-mono: A robust and versatile monocular visual-inertial state estimator | |
Heng et al. | Self-calibration and visual slam with a multi-camera system on a micro aerial vehicle | |
CN106803271B (en) | Camera calibration method and device for visual navigation unmanned aerial vehicle | |
Schmid et al. | Autonomous vision‐based micro air vehicle for indoor and outdoor navigation | |
US11205283B2 (en) | Camera auto-calibration with gyroscope | |
US9243916B2 (en) | Observability-constrained vision-aided inertial navigation | |
CN110207714B (en) | Method for determining vehicle pose, vehicle-mounted system and vehicle | |
US10322819B2 (en) | Autonomous system for taking moving images from a drone, with target tracking and improved target location | |
US20210183100A1 (en) | Data processing method and apparatus | |
CN108981687B (en) | Indoor positioning method with vision and inertia integration | |
CN106767785B (en) | Navigation method and device of double-loop unmanned aerial vehicle | |
US20180075614A1 (en) | Method of Depth Estimation Using a Camera and Inertial Sensor | |
US20220051031A1 (en) | Moving object tracking method and apparatus | |
CN112116651B (en) | Ground target positioning method and system based on monocular vision of unmanned aerial vehicle | |
Zhang et al. | Vision-aided localization for ground robots | |
KR101985344B1 (en) | Sliding windows based structure-less localization method using inertial and single optical sensor, recording medium and device for performing the method | |
WO2019191288A1 (en) | Direct sparse visual-inertial odometry using dynamic marginalization | |
WO2021081774A1 (en) | Parameter optimization method and apparatus, control device, and aircraft | |
CN110824453A (en) | Unmanned aerial vehicle target motion estimation method based on image tracking and laser ranging | |
CN113551665A (en) | High dynamic motion state sensing system and sensing method for motion carrier | |
Hinzmann et al. | Flexible stereo: constrained, non-rigid, wide-baseline stereo vision for fixed-wing aerial platforms | |
WO2020019175A1 (en) | Image processing method and apparatus, and photographing device and unmanned aerial vehicle | |
CN110503684A (en) | Camera position and orientation estimation method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20827563; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2021531851; Country of ref document: JP; Kind code of ref document: A; Ref document number: 20217017070; Country of ref document: KR; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 20827563; Country of ref document: EP; Kind code of ref document: A1 |
| 32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 25-05-2022) |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 20827563; Country of ref document: EP; Kind code of ref document: A1 |