WO2023185215A1 - Data calibration - Google Patents

Data calibration

Info

Publication number
WO2023185215A1
WO2023185215A1 (application PCT/CN2023/071951)
Authority
WO
WIPO (PCT)
Prior art keywords
navigation data
data
calibration
determined
visual navigation
Prior art date
Application number
PCT/CN2023/071951
Other languages
English (en)
French (fr)
Inventor
李子恒
胡君
郎小明
Original Assignee
北京三快在线科技有限公司
Priority date
Filing date
Publication date
Application filed by 北京三快在线科技有限公司
Publication of WO2023185215A1

Classifications

    • GPHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005 Initial alignment, calibration or starting-up of inertial devices
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/48 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system

Definitions

  • This application relates to the field of autonomous driving technology, and in particular to data calibration.
  • Sensors include an Inertial Measurement Unit (IMU), cameras, and a Global Navigation Satellite System (GNSS) receiver.
  • IMU: Inertial Measurement Unit
  • GNSS: Global Navigation Satellite System
  • VIO: visual-inertial odometry
  • this application provides a data calibration method, including: determining the visual navigation data and satellite navigation data of an unmanned driving device at the current moment based on data collected by sensors; determining at least one piece of calibration data for calibration from the satellite navigation data according to the satellite navigation data solution type at the current moment;
  • determining, according to the visual navigation data and the calibration data, with the constraint that at least one of the position and speed of the visual navigation data and the calibration data is the same, the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data;
  • determining a correction amount corresponding to the visual navigation data according to the determined conversion relationship, the visual navigation data and the calibration data, and calibrating the visual navigation data according to the correction amount.
  • Determining at least one piece of calibration data for calibration from the satellite navigation data according to the satellite navigation data solution type at the current moment includes:
  • when the satellite navigation data solution type at the current moment is a real-time differential positioning fixed solution, determining, from the satellite navigation data, data representing the position and speed of the unmanned equipment as calibration data;
  • when the satellite navigation data solution type at the current moment is a single-point solution, determining, from the satellite navigation data, data representing the speed of the unmanned equipment as calibration data.
  • Determining the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data includes:
  • when the satellite navigation data solution type at the current moment is a real-time differential positioning fixed solution, solving the rotation matrix to be solved based on the visual navigation data and the calibration data, with the constraint that the visual navigation data and the calibration data have the same speed;
  • solving the translation matrix to be solved according to the solved rotation matrix, the visual navigation data and the calibration data, with the constraint that the positions of the visual navigation data and the calibration data are the same;
  • determining, according to the solved rotation matrix and translation matrix, the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data.
  • Determining the correction amount corresponding to the visual navigation data according to the determined conversion relationship, the visual navigation data and the calibration data includes:
  • determining the projection of the visual navigation data in the second coordinate system according to the determined conversion relationship and the visual navigation data;
  • determining an observation matrix and the gap between the calibration data and the projection;
  • determining, based on the observation matrix and the gap, the correction amount corresponding to the visual navigation data.
  • When the satellite navigation data solution type at the current moment is a single-point solution, determining the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data includes:
  • solving the rotation matrix to be solved with the constraint that the speeds of the visual navigation data and the calibration data are the same;
  • determining, according to the solved rotation matrix, the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data.
  • Determining the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data may further include:
  • obtaining the visual navigation data and calibration data of the unmanned driving equipment at each historical moment, and determining, for each moment in chronological order, the conversion relationship at that moment from the visual navigation data and calibration data at that moment;
  • determining, based on the updated conversion relationship, the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data at the current moment.
  • Updating the conversion relationship includes:
  • updating the conversion relationship determined at the previous moment based on the conversion relationship determined at the current moment.
  • This application further provides a data calibration device, including:
  • a navigation data determination module configured to determine the visual navigation data and satellite navigation data of the unmanned driving equipment at the current moment based on the data collected by the sensors;
  • a calibration data determination module configured to determine at least one piece of calibration data for calibration from the satellite navigation data according to the satellite navigation data solution type at the current moment;
  • a conversion relationship determination module configured to determine, according to the visual navigation data and the calibration data, with the constraint that at least one of the position and speed of the visual navigation data and the calibration data is the same, the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data;
  • a data calibration module configured to determine the correction amount corresponding to the visual navigation data according to the determined conversion relationship, the visual navigation data and the calibration data, and calibrate the visual navigation data according to the correction amount.
  • the calibration data determination module is configured to determine, from the satellite navigation data, data representing the position and speed of the unmanned driving equipment as calibration data when it is determined that the satellite navigation data solution type at the current moment is a real-time differential positioning fixed solution; and to determine, from the satellite navigation data, data representing the speed of the unmanned driving equipment as calibration data when it is determined that the solution type at the current moment is a single-point solution.
  • the conversion relationship determination module is configured to, when it is determined that the satellite navigation data solution type at the current moment is a real-time differential positioning fixed solution: solve the rotation matrix to be solved based on the visual navigation data and the calibration data, with the constraint that the speeds of the visual navigation data and the calibration data are the same; solve the translation matrix to be solved according to the solved rotation matrix, the visual navigation data and the calibration data, with the constraint that the positions of the visual navigation data and the calibration data are the same; and determine, according to the solved rotation matrix and translation matrix, the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data.
  • the conversion relationship determination module is further configured to, when it is determined that the satellite navigation data solution type at the current moment is a single-point solution: solve the rotation matrix to be solved based on the visual navigation data and the calibration data, with the constraint that the speeds of the visual navigation data and the calibration data are the same; and determine, according to the solved rotation matrix, the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data.
  • the data calibration module is configured to determine the projection of the visual navigation data in the second coordinate system according to the determined conversion relationship and the visual navigation data; determine, according to the calibration data and the projection, an observation matrix and the gap between the calibration data and the projection; and determine, based on the observation matrix and the gap, the correction amount corresponding to the visual navigation data.
  • the conversion relationship determination module is used to obtain the visual navigation data and calibration data of the unmanned driving equipment at each historical moment; in chronological order, for each moment, according to the visual navigation data and calibration data at that moment, with the constraint that at least one of the position and speed of the visual navigation data and the calibration data at that moment is the same, determine the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data at that moment; based on the conversion relationship determined at that moment, update the conversion relationship determined at the previous moment, until the stability of the updated conversion relationship is less than the preset stability threshold; and determine, based on the updated conversion relationship, the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data at the current moment.
  • the conversion relationship determination module is further used to determine whether the deviation of the conversion relationship determined at this moment relative to the conversion relationship determined at the previous moment is greater than a preset deviation threshold; if so, the conversion relationship determined at the previous moment is not updated based on the conversion relationship determined at this moment; if not, the conversion relationship determined at the previous moment is updated according to the conversion relationship determined at this moment.
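The rolling update with the deviation and stability thresholds described in the two bullets above can be sketched as follows; this is an illustrative reading of the patent text, not part of it, and representing a transformation by a single scalar (e.g. the yaw) is a simplification:

```python
def update_transformation(estimates, deviation_threshold, stability_threshold):
    """Rolling update over per-moment transformation estimates.

    Each new estimate is rejected when it deviates too much from the
    running value; otherwise it replaces it, and updating stops once
    the change between accepted values falls below the stability
    threshold. All names and the scalar representation are illustrative.
    """
    current = None
    for est in estimates:
        if current is None:
            current = est
            continue
        deviation = abs(est - current)
        if deviation > deviation_threshold:
            continue          # discard: deviates too much from history
        stable = deviation < stability_threshold
        current = est         # accept the new estimate
        if stable:
            break             # converged; stop updating
    return current
```

The outlier estimate (30.0 below) is skipped, and iteration stops at the first stable pair of accepted values.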
  • Embodiments of the present application provide a computer-readable storage medium, which stores a computer program; when the computer program is executed by a processor, the above data calibration method is implemented.
  • An embodiment of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the above data calibration method is implemented.
  • The data calibration method provided by the embodiments of the present application first determines the visual navigation data and satellite navigation data of the unmanned driving equipment at the current moment based on the collected data, and then determines at least one piece of calibration data for calibration from the satellite navigation data according to the solution type of the satellite navigation data at the current moment. Then, with the constraint that at least one of the position and speed of the visual navigation data and the calibration data is the same, the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data is determined. Afterwards, based on the determined conversion relationship, the visual navigation data and the calibration data, the correction amount corresponding to the visual navigation data is determined, and the visual navigation data is calibrated based on the correction amount. By choosing, according to the solution type of the satellite navigation data at the current moment, which data in the satellite navigation data to use for calibrating the visual navigation data, a better calibration result under that solution type can be obtained.
  • Figure 1 is a schematic flow chart of a data calibration method provided by an embodiment of the present application.
  • Figure 2 is a schematic flow chart of a data calibration method provided by an embodiment of the present application.
  • Figure 3 is a schematic diagram of a data calibration device provided by an embodiment of the present application.
  • Figure 4 is a schematic diagram of an electronic device implementing a data calibration method provided by an embodiment of the present application.
  • the visual navigation data in the VIO system needs to be calibrated based on satellite navigation data.
  • the difference between the satellite navigation data and the visual navigation data can be determined based on the satellite navigation data obtained during the same period, and the visual navigation data can be calibrated.
  • unmanned driving equipment may adopt different satellite navigation solution types to determine satellite navigation data under different actual circumstances.
  • different satellite navigation solution types correspond to different data accuracy.
  • the accuracy of the data representing the position of the unmanned driving equipment corresponding to the single-point solution is not high enough. If the visual navigation data is calibrated based on this, the error will be large and it will be difficult to obtain better calibration results.
  • the accuracy of the satellite navigation data obtained by the unmanned driving device is also different. If the accuracy of the satellite navigation data acquired by the unmanned driving equipment is low, it will be difficult to obtain better calibration results when calibrating the visual navigation data based on the acquired satellite navigation data.
  • Figure 1 is a schematic flow chart of a data calibration method in an embodiment of the present application, including the following steps:
  • S100 Determine the visual navigation data and satellite navigation data of the unmanned driving equipment at the current moment based on the data collected by the sensor.
  • the unmanned driving equipment positions itself based on the data collected by the sensors during driving, so as to determine the driving strategy according to the actual situation, to navigate based on its own location, and so on. Based on this, in some embodiments, the unmanned driving equipment can determine its visual navigation data and satellite navigation data at the current moment based on the data collected by the sensors.
  • the sensor can be an IMU, a camera, a satellite signal receiving device, etc.
  • Unmanned driving equipment navigates based on the VIO system and the satellite navigation system; therefore, the sensors need to be able to collect data for at least one of the VIO system and the satellite navigation system.
  • the embodiments of this application do not limit the type of sensor used for data collection.
  • the visual navigation data mentioned above is the data that the VIO system of the unmanned driving device calculates and determines based on the collected data, which represents the position information and speed information of the unmanned driving device.
  • the satellite navigation data may be data representing the position information and speed information of the unmanned driving equipment determined by the unmanned driving equipment based on the data collected by the satellite navigation system.
  • the solved visual navigation data can include the current state of the unmanned driving equipment, i.e. the state quantities (P_V, R_V, v_V, b_a, b_ω), and the covariance matrix P_v.
  • P_V represents the position of the unmanned driving equipment in the first coordinate system
  • R_V represents the attitude of the unmanned driving equipment in the first coordinate system
  • v_V represents the speed of the unmanned driving equipment in the first coordinate system
  • b_a represents the bias of the IMU accelerometer, and the remaining state quantity (consistent with the 15 × 15 covariance below) is the bias of the IMU gyroscope, b_ω
  • the covariance information corresponding to each state quantity is a matrix of dimension 3 × 3, so P_v is a matrix of dimension 15 × 15.
  • the visual navigation data can be the visual navigation data obtained after calibration at the previous moment.
  • p x , p y , and p z represent the position of the unmanned driving equipment in the second coordinate system
  • v x , v y , and v z represent the speed of the unmanned driving equipment in the second coordinate system.
  • the unmanned driving equipment can determine whether a frame of satellite navigation data can be used to calibrate the visual navigation data based on the accuracy of each state quantity in the satellite navigation data and preset accuracy conditions. For example, it can judge from the standard deviation of each state quantity whether the frame of satellite navigation data meets the accuracy requirements and is therefore available. That is to say, for a frame of satellite navigation data containing several state quantities, if it is judged, based on the standard deviation of at least one of those state quantities, that the frame meets the accuracy requirements, then the frame is determined to be available, i.e., it can be used to calibrate the visual navigation data. The satellite navigation data that can be used to calibrate the visual navigation data is then used as the satellite navigation data determined by the solution.
  • the accuracy conditions can be set as needed, and the embodiments of the present application do not limit this.
  • for example, the unmanned driving equipment can determine whether the standard deviation σ of p_x is less than 0.04, whether the standard deviation σ of p_y is less than 0.05, whether the standard deviations of the remaining state quantities satisfy similar conditions, whether the number of satellites used in this frame of satellite navigation data is greater than 20, and other conditions, and thereby determine whether this frame of satellite navigation data can be used to calibrate the visual navigation data.
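The per-frame availability check described above can be sketched in Python as follows; this is an illustrative reading of the text, not part of the patent, and the p_z and velocity thresholds (`pos_sigma_max[2]`, `vel_sigma_max`) are assumed placeholders because the source truncates those conditions:

```python
def frame_is_usable(stds, num_satellites,
                    pos_sigma_max=(0.04, 0.05, 0.05),
                    vel_sigma_max=0.1,
                    min_satellites=20):
    """Accuracy gate for one frame of satellite navigation data.

    stds: standard deviations of the state quantities, keyed
    'px', 'py', 'pz', 'vx', 'vy', 'vz'. The 0.04/0.05 position
    thresholds and the satellite count of 20 come from the text;
    the remaining thresholds are assumed placeholders.
    """
    if num_satellites <= min_satellites:
        return False
    px_max, py_max, pz_max = pos_sigma_max
    if stds['px'] >= px_max or stds['py'] >= py_max or stds['pz'] >= pz_max:
        return False
    return all(stds[k] < vel_sigma_max for k in ('vx', 'vy', 'vz'))
```

A frame that fails any single gate is simply not used for calibration; the next frame is checked independently.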
  • the unmanned driving device can also send the collected data to the server, and the server will perform subsequent steps.
  • the following description takes unmanned driving equipment as the execution subject by way of example.
  • the unmanned driving equipment mentioned in the embodiments of this application may refer to unmanned passenger vehicles, unmanned delivery equipment, and other equipment capable of automatic driving.
  • S102 Determine at least one piece of calibration data for calibration from the satellite navigation data according to the satellite navigation data solution type at the current moment.
  • the unmanned driving equipment can further determine, based on the accuracy of the solved satellite navigation data, which data in the satellite navigation data to use for calibrating the visual navigation data. Therefore, in some embodiments, the unmanned driving equipment may determine at least one piece of data for calibration from the satellite navigation data according to the satellite navigation data solution type at the current moment determined in step S100.
  • when the satellite navigation data solution type at the current moment is a real-time differential positioning fixed solution, data representing the position and speed of the unmanned equipment are determined from the satellite navigation data as calibration data.
  • when the satellite navigation data solution type at the current moment is a single-point solution, data representing the speed of the unmanned equipment are determined from the satellite navigation data as calibration data.
  • the real-time differential positioning fixed solution refers to the solution result obtained by positioning based on carrier phase observations, with the carrier-phase narrow-lane integer ambiguity fixed.
  • the accuracy of satellite navigation data corresponding to this solution type is relatively high, reaching centimeter or even millimeter level.
  • the difference between the data representing the position of the unmanned equipment in the satellite navigation data and the actual position of the unmanned equipment is very small, and the difference between the data representing the speed of the unmanned equipment and its actual speed is also very small, so the visual navigation data can be calibrated based on this satellite navigation data.
  • the small gap may include the gap being smaller than a first threshold, and the first threshold may be set based on experience or actual needs. Therefore, data characterizing the position and speed of the unmanned device can be determined from the satellite navigation data as calibration data for calibration.
  • for a single-point solution, the accuracy of the data representing the position of the unmanned equipment in the corresponding satellite navigation data is not high enough, usually with an error of 2 to 5 meters. In this case, there is a large gap between the data representing the position of the unmanned equipment in the satellite navigation data and the actual position of the unmanned equipment, and the visual navigation data cannot be calibrated based on that position data. However, the difference between the data representing the speed of the unmanned equipment in the satellite navigation data and the actual speed of the unmanned equipment is very small. Therefore, data characterizing the speed of the unmanned equipment can be determined from the satellite navigation data as calibration data for calibration.
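The choice of calibration data by solution type can be summarized in a small sketch; the function and the string labels 'rtk_fixed' and 'single_point' are illustrative names, not identifiers from the patent:

```python
def select_calibration_data(solution_type, sat_nav):
    """Choose which satellite-navigation quantities serve as calibration data.

    sat_nav: dict with 'position' and 'velocity' entries.
    """
    if solution_type == 'rtk_fixed':
        # RTK fixed solution: centimeter- to millimeter-level accuracy,
        # so both position and speed can calibrate the visual data.
        return {'position': sat_nav['position'],
                'velocity': sat_nav['velocity']}
    if solution_type == 'single_point':
        # Single-point solution: position error of roughly 2-5 m, so
        # only the speed is trustworthy enough for calibration.
        return {'velocity': sat_nav['velocity']}
    raise ValueError('unsupported solution type: %r' % (solution_type,))
```

Other solution types would add further branches with their own trusted subsets of the satellite navigation data.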
  • the unmanned driving equipment can reduce the confidence of the data representing the location.
  • the part of the covariance matrix of the observation noise corresponding to the position can be multiplied by a very large coefficient to indicate that the observation noise of the position part is very large and its confidence is low, so that it has no impact, or only a very low impact, on the data calibration.
  • the above-mentioned "very large" may mean that the value is greater than the second threshold.
  • the second threshold may be set based on experience or actual needs, and is not limited here.
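A minimal sketch of this confidence reduction, assuming a 6 × 6 observation-noise covariance ordered (p_x, p_y, p_z, v_x, v_y, v_z) and an illustrative inflation factor (neither the ordering nor the factor is specified by the patent):

```python
import numpy as np

def deflate_position_confidence(obs_noise_cov, inflation=1e6):
    """Inflate the position block of the observation-noise covariance R.

    Multiplying that block by a very large coefficient tells the filter
    that the position observation carries almost no information, so it
    has no impact (or only a very low one) on the calibration.
    """
    r = obs_noise_cov.copy()
    r[:3, :3] *= inflation  # position block only; velocity block untouched
    return r
```

The original matrix is copied rather than modified in place, so the untouched covariance remains available for frames where the position is trusted again.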
  • S104 According to the visual navigation data and the calibration data, with the constraint that at least one of the position and speed of the visual navigation data and the calibration data is the same, determine the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data.
  • the unmanned driving equipment needs to calibrate the visual navigation data based on the gap between it and the calibration data.
  • the gap between the calibration data and the visual navigation data can be determined only after the two are expressed in the same coordinate system.
  • therefore, the unmanned driving equipment can determine, according to the visual navigation data and the calibration data, with the constraint that at least one of the position and speed of the visual navigation data and the calibration data is the same, the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data.
  • the transformation relationship can include a rotation relationship and a translation relationship. Since both the pitch angle and the roll angle of the unmanned equipment can be obtained by measurement, once the yaw angle (yaw) between the first coordinate system and the second coordinate system and the relationship between their coordinate origins are determined, the transformation relationship between the first coordinate system and the second coordinate system can be determined.
  • the unmanned driving device can determine the relationship between the first coordinate system and the second coordinate system based on the rotation relationship and the translation relationship. conversion relationship between.
  • the unmanned driving device can first solve the rotation matrix to be solved based on the visual navigation data and the calibration data, with the constraint that the visual navigation data and the calibration data have the same speed.
  • the conversion of speed between the first coordinate system and the second coordinate system is related only to the rotation matrix: v_G = R_V^G · v_V. In the formula, R_V^G represents the rotation matrix to be solved, v_G represents the speed of the calibration data in the second coordinate system, and v_V represents the speed of the visual navigation data in the first coordinate system. The rotation matrix to be solved, R_V^G, can then be calculated; that is, yaw is determined.
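A minimal numeric sketch of solving the yaw from the velocity constraint v_G = R · v_V, assuming pitch and roll are already measured and compensated (taken as zero here); all names are illustrative, not from the patent:

```python
import numpy as np

def solve_yaw(v_vio, v_gnss):
    """Estimate the yaw of the rotation satisfying v_G = R * v_V.

    Only the planar velocity components constrain the heading once
    pitch and roll are compensated.
    """
    yaw_v = np.arctan2(v_vio[1], v_vio[0])    # heading in the first frame
    yaw_g = np.arctan2(v_gnss[1], v_gnss[0])  # heading in the second frame
    # Wrap the difference into [-pi, pi).
    return (yaw_g - yaw_v + np.pi) % (2.0 * np.pi) - np.pi

def yaw_to_rotation(yaw):
    """Rotation matrix about the vertical axis for the given yaw angle."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])
```

In practice the yaw would be estimated from many velocity pairs (e.g. by averaging or least squares) rather than a single sample.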
  • for the translation, the unmanned driving equipment may use the position relationship between the two coordinate systems, P_G = R_V^G · P_V + t_V^G. According to this conversion relationship, the translation matrix to be solved can be calculated as t_V^G = P_G − R_V^G · P_V. Among them, according to the description in step S100, P_V represents the position of the unmanned driving equipment in the first coordinate system, and correspondingly P_G represents the position of the unmanned driving equipment in the second coordinate system.
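The translation step can then be sketched as t = P_G − R · P_V, using the rotation solved from the velocity constraint; this is an illustrative sketch, not the patent's implementation:

```python
import numpy as np

def solve_translation(rotation, p_vio, p_gnss):
    """Translation mapping the first (VIO) frame into the second (GNSS) frame.

    From the position constraint p_G = R * p_V + t it follows that
    t = p_G - R * p_V.
    """
    return np.asarray(p_gnss, dtype=float) - rotation @ np.asarray(p_vio, dtype=float)
```

Together with the rotation, this completes the conversion relationship between the two coordinate systems in the RTK-fixed case.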
  • the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data can be determined.
  • when it is determined that the satellite navigation data solution type at the current moment is a single-point solution, since the accuracy of the position-characterizing data in the corresponding satellite navigation data is not high enough, the unmanned driving equipment does not calibrate based on the position data in the satellite navigation data. Therefore, the unmanned driving equipment can determine the transformation relationship between the first coordinate system and the second coordinate system based only on the rotation relationship.
  • the unmanned driving device can first solve the rotation matrix to be solved based on the visual navigation data and the calibration data, with the constraint that the visual navigation data and the calibration data have the same speed.
  • the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data is determined.
  • only the yaw is actually determined here; the three-dimensional quantity corresponding to the translation relationship can be set to 0, and only the rotation relationship is used later.
  • the translation relationship may refer to the above-mentioned translation matrix
  • the rotation relationship may refer to the above-mentioned rotation matrix, that is, yaw.
  • S106 Determine the correction amount corresponding to the visual navigation data according to the determined conversion relationship, the visual navigation data and the calibration data, and calibrate the visual navigation data according to the correction amount.
  • the unmanned driving equipment can convert the visual navigation data and the calibration data into the same coordinate system, determine the gap between the visual navigation data and the calibration data, and calibrate the visual navigation data accordingly.
  • taking the conversion of the visual navigation data into the coordinate system of the calibration data as an example, the unmanned driving equipment first determines the projection of the visual navigation data in the second coordinate system based on the determined conversion relationship and the visual navigation data.
  • the visual navigation data also includes a covariance matrix; for the covariance matrix P_v in step S100, according to P_G = T · P_v · T^T, where T is the transformation matrix constructed from the conversion relationship and T^T is its transpose, the projection P_G of the covariance in the second coordinate system can be determined.
  • Z_m represents the currently observed state of the unmanned driving equipment in the satellite navigation data (see the corresponding description in step S100), and the observation matrix H represents the derivative of the measured value with respect to the state quantity.
  • the gap here can refer to the position and speed difference between the two, that is, Z_m - (p_G, v_G), where p_G and v_G are the projected position and speed.
  • the correction amount is d_y = P_G * H^T * (H * P_G * H^T + R)^-1 * (Z_m - (p_G, v_G)).
  • d_y represents the update amount of the filter;
  • H represents the previously determined observation matrix;
  • H^T represents the transpose of the observation matrix H;
  • R represents the covariance matrix of the observation noise, which can be determined as needed.
  • the update amount of the covariance is f_y = -P_G * H^T * (H * P_G * H^T + R)^-1 * H * P_G^T.
  • P_G^T represents the transpose of the projection P_G.
  • the meaning of the other quantities can refer to the corresponding description above and will not be repeated here.
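The correction amount and the covariance update above follow the standard Kalman measurement-update form. The sketch below assumes a toy two-dimensional state (position, speed) observed directly (H = I); the dimensions and noise values are illustrative, not taken from the patent.

```python
import numpy as np

def measurement_update(P_G, H, R, innovation):
    """Kalman-style update:
    d_y = P_G H^T (H P_G H^T + R)^-1 * innovation   (state correction)
    f_y = -P_G H^T (H P_G H^T + R)^-1 H P_G^T       (covariance change)"""
    S = H @ P_G @ H.T + R                # innovation covariance
    K = P_G @ H.T @ np.linalg.inv(S)     # Kalman gain
    d_y = K @ innovation
    f_y = -K @ H @ P_G.T
    return d_y, f_y

H = np.eye(2)                            # observe position and speed directly
P_G = np.diag([4.0, 1.0])                # projected covariance
R = np.diag([1.0, 1.0])                  # observation noise covariance
innovation = np.array([1.0, -0.5])       # Z_m - (p_G, v_G)
d_y, f_y = measurement_update(P_G, H, R, innovation)
# With H = I the gain is P_G (P_G + R)^-1 = diag(0.8, 0.5),
# so d_y = [0.8, -0.25] and f_y = diag(-3.2, -0.5).
```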
  • the visual navigation data is then calibrated according to the determined correction amount: the visual navigation data is updated according to the filter update amount d_y, and the covariance is updated according to the covariance update amount f_y.
  • the unmanned driving equipment can perform the next calibration based on the visual navigation data obtained from this calibration.
  • in the data calibration method above, the visual navigation data and satellite navigation data of the unmanned driving equipment at the current moment are first determined based on the collected data; then, based on the solution type of the satellite navigation data at the current moment, at least one type of calibration data used for calibration is determined from the satellite navigation data. Then, with the constraint that at least one of the position and speed of the visual navigation data and the calibration data is the same, a conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data is determined. According to the determined conversion relationship, the visual navigation data and the calibration data, the correction amount corresponding to the visual navigation data is determined, and the visual navigation data is calibrated. By determining, based on the solution type of the satellite navigation data at the current moment, which data in the satellite navigation data is used to calibrate the visual navigation data, a better calibration result under that solution type can be obtained.
  • in step S104, when determining the conversion relationship between the first coordinate system and the second coordinate system, the relationship determined from a single observation may contain errors, so the unmanned driving equipment can update the conversion relationship through a filter.
  • the unmanned driving equipment may first obtain its visual navigation data and calibration data at each historical moment. Then, in chronological order, for each moment, based on the visual navigation data and calibration data at that moment, and with the constraint that at least one of the position and speed of the two is the same, the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data at that moment is determined.
  • the conversion relationship determined at the previous moment is updated based on the one determined at this moment, until the stability of the updated conversion relationship is less than the preset stability threshold. The conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data at the current moment is then determined according to the updated conversion relationship.
  • for determining the conversion relationship between the first coordinate system and the second coordinate system at each moment, reference may be made to the corresponding description in step S104, which will not be repeated here. The conversion relationship determined at this moment is then used to update the conversion relationship determined at the previous moment.
  • d_x = P * H^T * (H * P * H^T + R)^-1 * (X_m - X_e), where d_x represents the update amount of the filter, H represents the observation matrix, P represents the covariance matrix of the filter, and R represents the covariance matrix of the observation noise, which can be set as needed.
  • the update amount of the covariance is f_x = -P * H^T * (H * P * H^T + R)^-1 * H * P^T.
  • the initial state and covariance can then be updated according to the corresponding update amounts: the state X_e is updated according to the filter update amount d_x, and the covariance P is updated according to the covariance update amount f_x.
  • the condition that the stability of the updated conversion relationship is less than the preset stability threshold is used to determine whether the updated conversion relationship has converged, that is, whether it is stable.
  • the stability threshold can be set as needed, and the embodiments of this application do not limit this. Taking yaw as an example, when the variance of the yaw is less than 0.000289, the updated conversion relationship can be determined to have converged. Then, through these updates in chronological order, the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data at the current moment can be determined.
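The convergence test described above (yaw variance below a threshold) can be sketched as follows. Note that 0.000289 rad^2 corresponds to a standard deviation of about 0.017 rad, roughly one degree; the running-variance helper here is an illustrative assumption, not the patent's implementation.

```python
import statistics

YAW_VARIANCE_THRESHOLD = 0.000289  # rad^2, i.e. std of ~0.017 rad (~1 degree)

def yaw_converged(yaw_samples):
    """Treat the conversion relationship as converged once the sample
    variance of the recent yaw estimates drops below the threshold."""
    if len(yaw_samples) < 2:
        return False
    return statistics.variance(yaw_samples) < YAW_VARIANCE_THRESHOLD

# A tightly clustered run of yaw estimates (radians) counts as stable;
# a widely scattered run does not.
stable = [0.5100, 0.5102, 0.5099, 0.5101, 0.5100]
noisy = [0.40, 0.55, 0.48, 0.60, 0.45]
print(yaw_converged(stable), yaw_converged(noisy))  # True False
```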
  • the conversion relationship determined at this moment may deviate significantly from the conversion relationship determined at the previous moment.
  • the satellite navigation signal may be poor at this moment and the accuracy of the collected data may be insufficient, so the conversion relationship determined from the satellite navigation data at the current moment may not be accurate enough. Therefore, the unmanned driving equipment can decide whether the conversion relationship determined at this moment can be used for updating, based on its degree of deviation from the conversion relationship determined at the previous moment.
  • the unmanned driving equipment may first determine whether the degree of deviation of the conversion relationship determined at this moment from the one determined at the previous moment is greater than a preset deviation threshold. If so, the conversion relationship determined at the previous moment is not updated based on the one determined at this moment; if not, it is updated accordingly.
  • the degree of deviation can be reflected by the Euclidean distance or Mahalanobis distance between the two, and the preset deviation threshold can be set as needed, which is not limited in the embodiments of the present application.
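One way to realize this gate is a Mahalanobis-distance test between the two estimates; the sketch below is illustrative (the state vector layout, covariance and threshold are assumptions, not the patent's actual values).

```python
import numpy as np

def deviation_too_large(x_new, x_prev, cov, threshold):
    """Gate an observation: reject the newly determined conversion
    relationship if its Mahalanobis distance from the previous one
    exceeds the preset deviation threshold."""
    diff = x_new - x_prev
    d2 = diff @ np.linalg.inv(cov) @ diff    # squared Mahalanobis distance
    return np.sqrt(d2) > threshold

cov = np.diag([0.01, 0.01])                  # uncertainty of the estimate
prev = np.array([0.50, 1.20])                # e.g. (yaw, translation norm)
near = np.array([0.51, 1.21])                # small deviation: accept
far = np.array([0.90, 2.00])                 # large deviation: reject
print(deviation_too_large(near, prev, cov, 3.0),
      deviation_too_large(far, prev, cov, 3.0))  # False True
```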
  • in that case, the state of the filter can be reset, and the state of the filter can then continue to be updated based on subsequent observations.
  • for example, the solution type of the satellite navigation data of the unmanned driving equipment at the previous moment may be a real-time differential positioning fixed solution, but at the next moment no differential correction information is available, and the solution type changes to a single-point solution.
  • therefore, before determining the conversion relationship between the first coordinate system and the second coordinate system, the unmanned driving equipment can determine, based on the solution type of the satellite navigation data at the current moment, which data it will subsequently use to calibrate the visual navigation data.
  • this determination can be made according to the application scenarios of different situations.
  • embodiments of the present application also provide a process of another data calibration method, as shown in Figure 2.
  • FIG. 2 is a schematic flow chart of another data calibration method provided by an embodiment of the present application, including the following steps:
  • S200 Determine the visual navigation data and satellite navigation data of the unmanned driving equipment at the current moment based on the data collected by the sensor.
  • S202: Based on the accuracy of the satellite navigation data, determine whether the satellite navigation data can be used for data calibration. If so, execute step S204; if not, execute step S200.
  • to determine whether the satellite navigation data can be used for data calibration, refer to the corresponding description in step S100. If the accuracy of the current frame of satellite navigation data is poor and it cannot be used to calibrate the visual navigation data, the unmanned driving equipment can determine the visual navigation data and satellite navigation data at the next moment from the data collected by the sensor, and restart the calibration process.
  • S206: Determine whether there is a conversion relationship, between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data, that was determined when the solution type of the satellite navigation data was a single-point solution. If so, execute step S212; if not, execute step S208.
  • S208: Determine whether there is a conversion relationship between the first coordinate system and the second coordinate system that was determined when the solution type of the satellite navigation data was a real-time differential positioning fixed solution. If so, execute step S212; if not, execute step S210.
  • since the conversion relationship is a relationship between the two coordinate systems, it generally does not change over time.
  • if the unmanned driving equipment does not have such a conversion relationship at the initial stage of operation, it can first determine the conversion relationship from a certain amount of data, including at least one of the conversion relationship corresponding to the single-point solution type and the conversion relationship corresponding to the real-time differential positioning fixed solution type, and store it in the storage device. At subsequent moments, if there is an available conversion relationship, there is no need to determine it again.
  • if the unmanned driving equipment determines through step S204 that the solution type of the satellite navigation data at the current moment is a single-point solution, it can first determine whether there is a conversion relationship, between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data, that was determined when the solution type was a single-point solution.
  • the unmanned driving equipment can directly perform subsequent data calibration steps based on the determined conversion relationship.
  • if not, the unmanned driving equipment can further determine whether there is a conversion relationship that was determined when the solution type of the satellite navigation data was a real-time differential positioning fixed solution.
  • it can be seen from the content of the aforementioned step S104 that when the solution type is a single-point solution, only the rotation matrix is used to determine the conversion relationship between the first coordinate system and the second coordinate system; that is, only the yaw is actually determined, and only the rotation relationship is actually used in the subsequent steps. Therefore, the three-dimensional quantity corresponding to the translation relationship in the conversion relationship for the fixed solution can be set to 0, and the result used as the conversion relationship for the single-point solution.
  • the unmanned driving equipment can perform subsequent data calibration steps based on the determined conversion relationship.
  • that is, when the unmanned driving equipment determines through step S204 that the solution type of the satellite navigation data at the current moment is a single-point solution, as long as there is either a conversion relationship corresponding to the single-point solution type or one corresponding to the real-time differential positioning fixed solution type, the unmanned driving equipment can perform the subsequent data calibration steps based on the existing conversion relationship. Therefore, step S206 and step S208 may be executed in no particular order.
  • S212 Determine the data characterizing the speed of the unmanned driving device from the satellite navigation data as calibration data for calibration, and perform step S220.
  • S214: Determine whether there is a conversion relationship between the first coordinate system and the second coordinate system that was determined when the solution type of the satellite navigation data was a real-time differential positioning fixed solution. If so, execute step S218; if not, execute step S216.
  • if the unmanned driving equipment determines in step S204 that the solution type of the satellite navigation data at the current moment is a real-time differential positioning fixed solution, the conversion relationship corresponding to the fixed solution includes both a rotation relationship and a translation relationship, while the conversion relationship corresponding to the single-point solution only includes a rotation relationship. Therefore, in this case, even if a conversion relationship corresponding to the single-point solution exists at the current moment, the unmanned driving equipment would not obtain a more accurate calibration result by performing the subsequent calibration steps based on that existing conversion relationship.
  • the unmanned driving equipment therefore only determines whether there is a conversion relationship, between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data, that was determined when the solution type of the satellite navigation data was a real-time differential positioning fixed solution.
  • the unmanned driving equipment can directly perform subsequent data calibration steps based on the determined conversion relationship.
  • if not, the unmanned driving equipment needs to determine the conversion relationship corresponding to the real-time differential positioning fixed solution in the subsequent step S216, and then perform data calibration.
  • S216: According to the visual navigation data and the satellite navigation data, with the constraint that the positions and speeds of the visual navigation data and the satellite navigation data are the same, determine the conversion relationship between the first coordinate system and the second coordinate system.
  • S218 Determine the data representing the position and speed of the unmanned driving device from the satellite navigation data as calibration data for calibration, and perform step S220.
  • S220 Determine the correction amount corresponding to the visual navigation data according to the determined conversion relationship, the visual navigation data and the calibration data, and calibrate the visual navigation data according to the correction amount.
  • in addition, when the unmanned driving equipment determines through step S204 that the solution type of the satellite navigation data at the current moment is a real-time differential positioning fixed solution, but no conversion relationship corresponding to the fixed solution exists at the current moment, then under special circumstances the unmanned driving equipment can further determine whether a conversion relationship corresponding to the single-point solution exists at the current moment. If so, step S212 is executed; if not, step S216 is executed.
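The lookup logic of steps S206 to S214 can be sketched as follows. The dictionary layout, the type labels, and the simplification that omits the special-circumstances fallback are illustrative assumptions, not the patent's actual data structures.

```python
def usable_conversion(solution_type, stored):
    """Decide which stored conversion relationship can serve the current
    solution type. `stored` maps type -> (rotation, translation)."""
    if solution_type == "single_point":
        if "single_point" in stored:
            return stored["single_point"]
        if "rtk_fixed" in stored:
            # Only yaw is needed for a single-point solution: reuse the
            # fixed-solution rotation and zero the translation (step S104).
            rotation, _ = stored["rtk_fixed"]
            return rotation, (0.0, 0.0, 0.0)
    elif solution_type == "rtk_fixed":
        # A single-point relationship lacks a translation, so only a
        # fixed-solution relationship is accurate enough here.
        if "rtk_fixed" in stored:
            return stored["rtk_fixed"]
    return None  # nothing usable: re-determine the relationship (S216)

stored = {"rtk_fixed": ("R_fix", (1.0, 2.0, 3.0))}
print(usable_conversion("single_point", stored))  # ('R_fix', (0.0, 0.0, 0.0))
```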
  • the special circumstances may be one or more of the following: the unmanned driving equipment has little remaining energy and needs to save energy; the computing power of the unmanned driving equipment is tight; the unmanned driving equipment faces an emergency and needs to obtain a calibration result in the shortest time; and so on.
  • the data calibration method provided by the embodiment of the present application can be applied to navigation scenarios based on VIO systems and satellite navigation systems.
  • the unmanned driving equipment can determine the visual navigation data and satellite navigation data based on the data collected by the sensor, and adaptively calibrate the visual navigation data based on the satellite navigation data according to different situations.
  • the method provided by the embodiments of the present application proposes corresponding data calibration methods for different solution types of satellite navigation data, so that better calibration results can be obtained under each solution type.
  • Figure 3 is a schematic diagram of a data calibration device provided by an embodiment of the present application, including:
  • the navigation data determination module 300 is used to determine the visual navigation data and satellite navigation data of the unmanned driving device at the current moment based on the data collected by the sensor.
  • the calibration data determination module 302 is configured to determine at least one calibration data for calibration from the satellite navigation data according to the satellite navigation data resolution type at the current moment.
  • the conversion relationship determination module 304 is configured to determine, according to the visual navigation data and the calibration data, and with the constraint that at least one of the position and speed of the visual navigation data and the calibration data is the same, the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data.
  • the data calibration module 306 is configured to determine the correction amount corresponding to the visual navigation data according to the determined conversion relationship, the visual navigation data and the calibration data, and calibrate the visual navigation data according to the correction amount.
  • the calibration data determination module 302 is configured to: when it is determined that the solution type of the satellite navigation data at the current moment is a real-time differential positioning fixed solution, determine, from the satellite navigation data, the data characterizing the position and speed of the unmanned driving equipment as the calibration data for calibration; and when it is determined that the solution type is a single-point solution, determine, from the satellite navigation data, the data characterizing the speed of the unmanned driving equipment as the calibration data for calibration.
  • the conversion relationship determination module 304 is configured to: when it is determined that the solution type of the satellite navigation data at the current moment is a real-time differential positioning fixed solution, solve the rotation matrix to be solved based on the visual navigation data and the calibration data, with the constraint that the speeds of the visual navigation data and the calibration data are the same; solve the translation matrix to be solved based on the solved rotation matrix, the visual navigation data and the calibration data, with the constraint that the positions of the visual navigation data and the calibration data are the same; and determine the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data based on the solved rotation matrix and translation matrix.
  • the data calibration module 306 is configured to determine the projection of the visual navigation data in the second coordinate system according to the determined conversion relationship and the visual navigation data; determine, according to the calibration data and the projection, the observation matrix and the gap between the calibration data and the projection; determine the correction amount corresponding to the visual navigation data based on the observation matrix and the gap; and calibrate the visual navigation data according to the determined correction amount.
  • the conversion relationship determination module 304 is configured to: when it is determined that the solution type of the satellite navigation data at the current moment is a single-point solution, solve the rotation matrix to be solved based on the visual navigation data and the calibration data, with the constraint that the speeds of the visual navigation data and the calibration data are the same, and determine the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data based on the solved rotation matrix.
  • the conversion relationship determination module 304 is configured to obtain the visual navigation data and calibration data of the unmanned driving equipment at each historical moment; in chronological order, for each moment, determine, according to the visual navigation data and calibration data at that moment, and with the constraint that at least one of the position and speed of the two is the same, the conversion relationship between the first coordinate system of the visual navigation data and the second coordinate system of the satellite navigation data at that moment; update the conversion relationship determined at the previous moment based on the one determined at this moment, until the stability of the updated conversion relationship is less than the preset stability threshold; and determine, based on the updated conversion relationship, the conversion relationship between the first coordinate system and the second coordinate system at the current moment.
  • the conversion relationship determination module 304 is configured to determine whether the degree of deviation of the conversion relationship determined at this moment from the one determined at the previous moment is greater than a preset deviation threshold; if so, the conversion relationship determined at the previous moment is not updated based on the one determined at this moment; if not, it is updated accordingly.
  • Embodiments of the present application also provide a computer-readable storage medium that stores a computer program.
  • the computer program can be used to execute the data calibration method provided in Figure 1 above.
  • the embodiment of the present application also provides a schematic structural diagram of the electronic device shown in FIG. 4 .
  • the electronic device includes a processor, internal bus, network interface, memory and non-volatile memory, and of course may also include other hardware required for business.
  • the processor reads the corresponding computer program from the non-volatile memory into the memory and then runs it to implement the data calibration method described in Figure 1 above.
  • a programmable logic device (such as a Field Programmable Gate Array, FPGA) is an integrated circuit whose logic function is determined by the user programming the device. The programming is mostly implemented with a Hardware Description Language (HDL), of which there is not just one kind but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, RHDL (Ruby Hardware Description Language), Verilog, etc.
  • Those skilled in the art should also know that by simply logically programming the method flow using the above-mentioned hardware description languages and programming it into the integrated circuit, the hardware circuit that implements the logical method flow can be easily obtained.
  • the controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an Application Specific Integrated Circuit (ASIC), a programmable logic controller, or an embedded microcontroller.
  • examples of controllers include but are not limited to the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20 and Silicon Labs C8051F320. A memory controller can also be implemented as part of the memory's control logic.
  • in addition to implementing the controller in the form of pure computer-readable program code, it is entirely possible, by logically programming the method steps, to make the controller achieve the same function in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Therefore, such a controller can be considered a hardware component, and the devices included therein for implementing various functions can also be considered structures within the hardware component. Or even, the means for implementing various functions can be considered both software modules implementing the method and structures within the hardware component.
  • a typical implementation device is a computer.
  • the computer may be, for example, a personal computer, a laptop computer, a cellular phone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
  • embodiments of the present invention may be provided as methods, systems, or computer program products.
  • the invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects.
  • the invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
  • these computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means, which implement the functions specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.
  • these computer program instructions may also be loaded onto a computer or other programmable data processing device, such that a series of operational steps are performed on the computer or other programmable device to produce a computer-implemented process, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.
  • a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • Memory may include non-permanent storage in computer-readable media, random access memory (RAM) and/or non-volatile memory in the form of read-only memory (ROM) or flash memory (flash RAM). Memory is an example of computer-readable media.
  • Computer-readable media include persistent and non-persistent, removable and non-removable media, and information storage can be implemented by any method or technology.
  • Information may be computer-readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic tape cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device.
  • computer-readable media does not include transitory media, such as modulated data signals and carrier waves.
  • embodiments of the present application may be provided as methods, systems or computer program products. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
  • the application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer.
  • program modules include routines, programs, objects, components, data structures, etc. that perform specific tasks or implement specific abstract data types.
  • the present application may also be practiced in distributed computing environments where tasks are performed by remote processing devices connected through a communications network.
  • program modules may be located in both local and remote computer storage media including storage devices.


Abstract

A data calibration method: first, visual navigation data and satellite navigation data of an unmanned driving device at the current moment are determined from collected data (S100); then, according to the solution type of the satellite navigation data at the current moment, at least one kind of calibration data to be used for calibration is determined from the satellite navigation data (S102); next, under the constraint that the visual navigation data and the calibration data agree in at least one of position and velocity, a transformation between a first coordinate frame of the visual navigation data and a second coordinate frame of the satellite navigation data is determined (S104); finally, a correction corresponding to the visual navigation data is determined from the determined transformation, the visual navigation data, and the calibration data, and the visual navigation data is calibrated with the correction (S106).

Description

Data Calibration

This application claims priority to Chinese Patent Application No. 202210351011.9, filed on April 2, 2022 and entitled "Adaptive data calibration method and apparatus", the entire contents of which are incorporated herein by reference.

Technical Field

This application relates to the field of autonomous driving technology, and in particular to data calibration.

Background

At present, with the development of autonomous driving technology, an unmanned driving device can continuously localize itself while traveling, based on data collected by its sensors, and then make decisions or navigate according to the localization result. The sensors include, for example, an inertial measurement unit (IMU), cameras, and a Global Navigation Satellite System (GNSS) receiver. Fusing the IMU and a camera yields visual-inertial odometry (VIO).
Summary

Embodiments of this application provide data calibration. This application adopts the following technical solutions.

In one aspect, this application provides a data calibration method, including:

determining, from data collected by sensors, visual navigation data and satellite navigation data of an unmanned driving device at a current moment;

determining, according to a solution type of the satellite navigation data at the current moment, at least one kind of calibration data from the satellite navigation data for calibration;

determining, based on the visual navigation data and the calibration data, a transformation between a first coordinate frame of the visual navigation data and a second coordinate frame of the satellite navigation data, under a constraint that the visual navigation data and the calibration data agree in at least one of position and velocity;

determining, from the determined transformation, the visual navigation data, and the calibration data, a correction corresponding to the visual navigation data, and calibrating the visual navigation data with the correction.

In some implementations, determining, according to the solution type of the satellite navigation data at the current moment, at least one kind of calibration data from the satellite navigation data includes:

when the solution type of the satellite navigation data at the current moment is determined to be a real-time kinematic (RTK) fixed solution, determining, from the satellite navigation data, the data characterizing the position and velocity of the unmanned driving device as the calibration data;

when the solution type of the satellite navigation data at the current moment is determined to be a single-point solution, determining, from the satellite navigation data, the data characterizing the velocity of the unmanned driving device as the calibration data.

In some implementations, determining the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data, based on the visual navigation data and the calibration data and under the constraint that they agree in at least one of position and velocity, includes:

when the solution type of the satellite navigation data at the current moment is determined to be an RTK fixed solution, solving for the rotation matrix, based on the visual navigation data and the calibration data, under the constraint that their velocities are equal;

solving for the translation matrix, based on the solved rotation matrix, the visual navigation data, and the calibration data, under the constraint that their positions are equal;

determining, from the solved rotation matrix and translation matrix, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data.

In some implementations, determining the correction corresponding to the visual navigation data from the determined transformation, the visual navigation data, and the calibration data includes:

determining a projection of the visual navigation data into the second coordinate frame from the determined transformation and the visual navigation data;

determining, from the calibration data and the projection, an observation matrix and the residual between the calibration data and the projection;

determining the correction corresponding to the visual navigation data from the observation matrix and the residual.

In some implementations, determining the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data, based on the visual navigation data and the calibration data and under the constraint that they agree in at least one of position and velocity, includes:

when the solution type of the satellite navigation data at the current moment is determined to be a single-point solution, solving for the rotation matrix, based on the visual navigation data and the calibration data, under the constraint that their velocities are equal;

determining, from the solved rotation matrix, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data.

In some implementations, determining the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data, based on the visual navigation data and the calibration data and under the constraint that they agree in at least one of position and velocity, includes:

obtaining visual navigation data and calibration data of the unmanned driving device at each historical moment;

in chronological order, for each moment, determining the transformation between the first coordinate frame of that moment's visual navigation data and the second coordinate frame of the satellite navigation data, based on that moment's visual navigation data and calibration data, under the constraint that they agree in at least one of position and velocity;

updating the transformation determined at the moment preceding that moment with the transformation determined at that moment, until the stability measure of the updated transformation is below a preset stability threshold;

determining, from the updated transformation, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data at the current moment.

In some implementations, updating the transformation determined at the preceding moment with the transformation determined at that moment includes:

judging whether the deviation of the transformation determined at that moment from the transformation determined at the preceding moment exceeds a preset deviation threshold;

if yes, not updating the transformation determined at the preceding moment with the transformation determined at that moment;

if no, updating the transformation determined at the preceding moment with the transformation determined at that moment.
In one aspect, an embodiment of this application provides a data calibration apparatus, including:

a navigation data determination module, configured to determine, from data collected by sensors, visual navigation data and satellite navigation data of an unmanned driving device at a current moment;

a calibration data determination module, configured to determine, according to the solution type of the satellite navigation data at the current moment, at least one kind of calibration data from the satellite navigation data for calibration;

a transformation determination module, configured to determine, based on the visual navigation data and the calibration data, a transformation between a first coordinate frame of the visual navigation data and a second coordinate frame of the satellite navigation data, under the constraint that the visual navigation data and the calibration data agree in at least one of position and velocity;

a data calibration module, configured to determine, from the determined transformation, the visual navigation data, and the calibration data, a correction corresponding to the visual navigation data, and to calibrate the visual navigation data with the correction.

In some implementations, the calibration data determination module is configured to: when the solution type of the satellite navigation data at the current moment is determined to be an RTK fixed solution, determine, from the satellite navigation data, the data characterizing the position and velocity of the unmanned driving device as the calibration data; and when the solution type is determined to be a single-point solution, determine, from the satellite navigation data, the data characterizing the velocity of the unmanned driving device as the calibration data.

In some implementations, the transformation determination module is configured to: when the solution type of the satellite navigation data at the current moment is determined to be an RTK fixed solution, solve for the rotation matrix, based on the visual navigation data and the calibration data, under the constraint that their velocities are equal; solve for the translation matrix, based on the solved rotation matrix, the visual navigation data, and the calibration data, under the constraint that their positions are equal; and determine, from the solved rotation matrix and translation matrix, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data.

In some implementations, the transformation determination module is configured to: when the solution type of the satellite navigation data at the current moment is determined to be a single-point solution, solve for the rotation matrix, based on the visual navigation data and the calibration data, under the constraint that their velocities are equal; and determine, from the solved rotation matrix, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data.

In some implementations, the data calibration module is configured to: determine the projection of the visual navigation data into the second coordinate frame from the determined transformation and the visual navigation data; determine, from the calibration data and the projection, the observation matrix and the residual between the calibration data and the projection; and determine, from the observation matrix and the residual, the correction corresponding to the visual navigation data.

In some implementations, the transformation determination module is configured to: obtain visual navigation data and calibration data of the unmanned driving device at each historical moment; in chronological order, for each moment, determine the transformation between the first coordinate frame of that moment's visual navigation data and the second coordinate frame of the satellite navigation data, based on that moment's visual navigation data and calibration data, under the constraint that they agree in at least one of position and velocity; update the transformation determined at the preceding moment with the transformation determined at that moment, until the stability measure of the updated transformation is below a preset stability threshold; and determine, from the updated transformation, the transformation between the first and second coordinate frames at the current moment.

In some implementations, the transformation determination module is configured to: judge whether the deviation of the transformation determined at a moment from the transformation determined at the preceding moment exceeds a preset deviation threshold; if yes, not update the transformation determined at the preceding moment with the transformation determined at that moment; if no, update the transformation determined at the preceding moment with the transformation determined at that moment.

An embodiment of this application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the above data calibration method.

An embodiment of this application provides an electronic device including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the above data calibration method.

At least one of the above technical solutions adopted in the embodiments of this application can achieve the following beneficial effects:

The data calibration method provided by the embodiments of this application first determines visual navigation data and satellite navigation data of the unmanned driving device at the current moment from the collected data; then determines, according to the solution type of the satellite navigation data at the current moment, at least one kind of calibration data from the satellite navigation data. Next, under the constraint that the visual navigation data and the calibration data agree in at least one of position and velocity, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data is determined. Afterwards, from the determined transformation, the visual navigation data, and the calibration data, a correction corresponding to the visual navigation data is determined, and the visual navigation data is calibrated with the correction. By deciding, according to the solution type of the satellite navigation data at the current moment, which data in the satellite navigation data is used to calibrate the visual navigation data, a good calibration result can be obtained under each solution type.
Brief Description of the Drawings

The drawings described here are provided for further understanding of this application and constitute a part of it; the illustrative embodiments of this application and their description are used to explain this application and do not constitute an undue limitation on it. In the drawings:

FIG. 1 is a schematic flowchart of a data calibration method provided by an embodiment of this application;

FIG. 2 is a schematic flowchart of a data calibration method provided by an embodiment of this application;

FIG. 3 is a schematic diagram of a data calibration apparatus provided by an embodiment of this application;

FIG. 4 is a schematic diagram of an electronic device implementing a data calibration method, provided by an embodiment of this application.

Detailed Description

To make the purposes, technical solutions, and advantages of the embodiments of this application clearer, the technical solutions of this application are described clearly and completely below with reference to the embodiments and the corresponding drawings. Obviously, the described embodiments are only some of the embodiments of this application, not all of them. Based on the embodiments in this application, all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the protection scope of this application.

At present, an unmanned driving device continuously localizes itself during travel in order to navigate. However, because a VIO system accumulates error, the visual navigation data in the VIO system needs to be calibrated with satellite navigation data. For example, satellite navigation data acquired over the same period can be used to determine the difference between the satellite navigation data and the visual navigation data, and the visual navigation data can be calibrated accordingly.

However, the actual driving environment of an unmanned driving device is complex, and in different situations the device may determine its satellite navigation data with different solution types, whose accuracies differ. In particular, the data characterizing the device's position in a single-point solution is not accurate enough; calibrating the visual navigation data against it would introduce large errors, making a good calibration result hard to obtain. That is, depending on where the unmanned driving device is located, the accuracy of the satellite navigation data it acquires differs; when the acquired satellite navigation data is of low accuracy, it is difficult to obtain a good calibration result when calibrating the visual navigation data against it.
The technical solutions provided by the embodiments of this application are described in detail below with reference to the drawings.

FIG. 1 is a schematic flowchart of a data calibration method in an embodiment of this application, including the following steps:

S100: determine, from data collected by sensors, visual navigation data and satellite navigation data of the unmanned driving device at the current moment.

To ensure driving safety, an unmanned driving device localizes itself while traveling, based on the data collected by its sensors, so as to determine a driving strategy suited to the actual situation, or to navigate according to its own position. On this basis, in some embodiments, the unmanned driving device may determine its visual navigation data and satellite navigation data at the current moment from the data collected by the sensors.

The sensors may be an IMU, a camera, a satellite signal receiver, and the like. The unmanned driving device navigates based on a VIO system and a satellite navigation system, so the sensors need to be able to collect data for at least one of the two systems; the embodiments of this application do not limit which sensors are used for data collection. The visual navigation data mentioned above is the data, characterizing the position and velocity of the unmanned driving device, that its VIO system solves from the collected data. The satellite navigation data may be the data, characterizing the position and velocity of the unmanned driving device, that the satellite navigation system solves from the collected data.

Of course, when the unmanned driving device solves the collected data, it generally also obtains quantities such as its state and covariance.

For example, the solved visual navigation data may contain the current state of the unmanned driving device and a covariance matrix P_v. Writing the state as X_v = (P_V, R_V, v_V, b_a, b_g), P_V denotes the position of the device in the first coordinate frame, R_V its attitude in the first coordinate frame, v_V its velocity in the first coordinate frame, b_a the bias of the IMU accelerometer, and b_g the bias of the gyroscope. The covariance information of each of the state quantities P_V, R_V, and v_V is a 3×3 matrix, so P_v is a 15×15 matrix. Since the embodiments of this application calibrate the visual navigation data with a filter, the visual navigation data may be the calibrated visual navigation data obtained at the previous moment.

The solved satellite navigation data may contain the current state of the unmanned driving device, Z_m = (p_x, p_y, p_z, v_x, v_y, v_z), the standard deviation of each state quantity, and a covariance matrix, where p_x, p_y, p_z denote the position of the device in the second coordinate frame, and v_x, v_y, v_z its velocity in the second coordinate frame.

Further, the unmanned driving device may judge, from the accuracy of each state quantity in the satellite navigation data and a preset accuracy condition, whether this frame of satellite navigation data can be used to calibrate the visual navigation data. Illustratively, the device may judge from the standard deviations of the state quantities whether the frame meets the accuracy requirement, and thus whether it is usable. That is, for a frame of satellite navigation data containing state quantities, if the standard deviation of at least one of the state quantities shows that the frame meets the accuracy requirement, the frame is determined to be usable, i.e., usable for calibrating the visual navigation data. The satellite navigation data usable for calibration is then taken as the solved satellite navigation data. The accuracy condition can of course be set as needed and is not limited by the embodiments of this application.

For example, the device may determine whether this frame of satellite navigation data can be used to calibrate the visual navigation data from one or more of the following conditions: whether the standard deviation σ of p_x is less than 0.04, whether the standard deviation σ of p_y is less than 0.05, whether the standard deviations σ of v_x, v_y, and v_z are less than 0.5, and whether the number of satellites used in solving this frame is greater than 20.
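The frame-level quality gate in the example above can be sketched as follows. The dict-based container and field names are illustrative assumptions; the thresholds (standard deviations of position and velocity, minimum satellite count) are the ones given in the text:

```python
# Sketch of the per-frame GNSS usability check described above.
# Thresholds follow the example: sigma(p_x) < 0.04, sigma(p_y) < 0.05,
# sigma(v_x/v_y/v_z) < 0.5, and more than 20 satellites in the solution.
def gnss_frame_usable(stds, num_sats):
    """stds: standard deviations of the state quantities, keyed 'px', 'py', 'vx', 'vy', 'vz'."""
    return (stds["px"] < 0.04
            and stds["py"] < 0.05
            and all(stds[k] < 0.5 for k in ("vx", "vy", "vz"))
            and num_sats > 20)
```

A frame failing this gate is skipped, and the flow restarts from the next moment's data, as step S202 of FIG. 2 describes.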
Of course, the unmanned driving device may also send the collected data to a server, which performs the subsequent steps. For ease of description, the following takes the unmanned driving device as the executing entity. The unmanned driving device mentioned in the embodiments of this application may refer to a device capable of autonomous driving, such as an unmanned passenger vehicle or an unmanned delivery device.
S102: determine, according to the solution type of the satellite navigation data at the current moment, at least one kind of calibration data from the satellite navigation data.

Having determined the visual navigation data and satellite navigation data as above, the unmanned driving device can further decide, according to the accuracy of the solved satellite navigation data, which data in it to use for calibrating the visual navigation data. Therefore, in some embodiments, the device may determine at least one kind of data for calibration from the satellite navigation data according to the solution type of the satellite navigation data of the current moment in step S100.

Illustratively, when the solution type of the satellite navigation data at the current moment is determined to be an RTK fixed solution, the data characterizing the position and velocity of the unmanned driving device is determined from the satellite navigation data as the calibration data. When the solution type is determined to be a single-point solution, the data characterizing the velocity of the unmanned driving device is determined from the satellite navigation data as the calibration data.

An RTK fixed solution refers to the result of positioning with carrier-phase observations after the narrow-lane integer ambiguity of the carrier phase has been fixed. The satellite navigation data of this solution type is highly accurate, down to centimeter or even millimeter level. In this case, the data characterizing the device's position in the satellite navigation data differs very little from its true position, and the data characterizing its velocity differs very little from its true velocity, so the visual navigation data can be fully calibrated against the satellite navigation data. Here, "differs very little" may mean that the difference is less than a first threshold, which can be set empirically or according to actual needs. Therefore, the data characterizing the position and velocity of the unmanned driving device can be determined from the satellite navigation data as the calibration data.

A single-point solution refers to the result obtained without using any differential corrections in the solving process. The data characterizing the device's position in the satellite navigation data of this solution type is not accurate enough, typically with an error of 2 to 5 meters. In this case, the position data differs considerably from the device's true position, so the visual navigation data cannot be calibrated against it. The data characterizing the device's velocity, however, still differs very little from its true velocity. Therefore, the data characterizing the velocity of the unmanned driving device can be determined from the satellite navigation data as the calibration data.

Of course, in the single-point case, although only the data characterizing the velocity is determined as the calibration data, the data characterizing the position can still provide some reference for subsequent processing; for example, the device may lower the confidence of the position data. For instance, the position block of the observation-noise covariance matrix may be multiplied by a very large coefficient, indicating that the observation noise of the position part is very large and its confidence low, so that it has no effect, or only a negligible effect, on the data calibration. Here, "very large" may mean greater than a second threshold, which can be set empirically or according to actual needs and is not limited here.
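The selection rule described in this step can be sketched minimally as follows; the solution-type labels and the returned structure are assumptions for illustration, not names from the source:

```python
# Sketch: choose which GNSS fields serve as calibration data, by solution type.
def select_calibration_data(solution_type, gnss_state):
    """gnss_state: (p_x, p_y, p_z, v_x, v_y, v_z) in the second coordinate frame."""
    pos, vel = gnss_state[:3], gnss_state[3:]
    if solution_type == "rtk_fixed":
        # RTK fixed: cm-level accuracy, position and velocity both usable.
        return {"position": pos, "velocity": vel}
    if solution_type == "single_point":
        # Single-point: metre-level position error, so only velocity is used.
        return {"velocity": vel}
    return None  # other solution types are out of scope here
```

In the single-point branch, the position entries would additionally have their observation-noise variance inflated, as described above, rather than being discarded from the state entirely.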
S104: determine, based on the visual navigation data and the calibration data, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data, under the constraint that the visual navigation data and the calibration data agree in at least one of position and velocity.

Having determined the visual navigation data and the calibration data in the above steps, the unmanned driving device needs to calibrate the visual navigation data against the calibration data according to the difference between the two. That difference can only be determined once the visual navigation data and the calibration data are in the same coordinate frame. On this basis, in some embodiments, the device may determine, from the visual navigation data and the calibration data, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data, under the constraint that they agree in at least one of position and velocity.

The transformation may contain a rotation and a translation. Since both the pitch and roll of the unmanned driving device can be obtained by measurement, once the yaw between the first and second coordinate frames and the relation between their origins are determined, the transformation between the two frames is determined.

Further, in some embodiments, when the solution type of the satellite navigation data at the current moment is determined to be an RTK fixed solution, the device may determine the transformation between the first and second coordinate frames from both the rotation and the translation.

Illustratively, the device may first solve for the rotation matrix, based on the visual navigation data and the calibration data, under the constraint that their velocities are equal. Since the transformation of velocity between the first and second frames involves only the rotation, we have v_G = R · v_V, where R is the rotation matrix to be solved, v_G is the velocity of the calibration data in the second frame, and v_V is the velocity of the visual navigation data in the first frame. Solving for R determines the yaw.

Next, the translation is solved from the solved rotation matrix, the visual navigation data, and the calibration data, under the constraint that their positions are equal. The device may first write the transformation between the first and second frames as P_G = R · P_V + t; from this relation, the translation to be solved is t = P_G − R · P_V. As explained in step S100, P_V denotes the position of the unmanned driving device in the first frame; correspondingly, P_G denotes its position in the second frame.

Afterwards, from the solved rotation matrix and translation matrix, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data is determined.
In addition, in some embodiments, when the solution type of the satellite navigation data at the current moment is determined to be a single-point solution, the device does not calibrate against the position data in the satellite navigation data, since the position data of a single-point solution is not accurate enough. The device may therefore determine the transformation between the first and second coordinate frames from the rotation alone.

Illustratively, the device may first solve for the rotation matrix, based on the visual navigation data and the calibration data, under the constraint that their velocities are equal.

From the solved rotation matrix, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data is determined.

For the parts identical to the RTK-fixed case, refer to the corresponding description above. What is actually determined here is the yaw; the three-dimensional quantity corresponding to the translation can be set to zero, and only the rotation is used subsequently. Here, the translation refers to the translation matrix above, and the rotation to the rotation matrix above, i.e., the yaw.
S106: determine, from the determined transformation, the visual navigation data, and the calibration data, the correction corresponding to the visual navigation data, and calibrate the visual navigation data with the correction.

Having determined the visual navigation data, the calibration data, and the transformation between their coordinate frames, the unmanned driving device can transform the two into the same coordinate frame, determine the difference between them, and calibrate the visual navigation data.

Illustratively, taking transforming the visual navigation data into the frame of the calibration data as an example, the device first determines the projection of the visual navigation data into the second coordinate frame from the determined transformation and the visual navigation data. The visual navigation data includes the device's current state; for the state X_v of step S100, its projection X_G into the second frame can be determined by applying the transformation (the extrinsic relation between the two frames). The visual navigation data also includes a covariance matrix; for the covariance P_v of step S100, its projection into the second frame can be determined as P_G = M · P_v · M^T, where M is the transformation matrix constructed from the frame transformation and M^T is its transpose.

Then, from the calibration data and the projection, the observation matrix and the residual between the calibration data and the projection are determined. The observation Z_m represents the device's current state in the satellite navigation data; see the corresponding description in step S100. The observation matrix H is the derivative of the measurement with respect to the state quantities. The residual here may refer to the difference in position and velocity between the two, i.e., Z_m − (P_G, v_G), where P_G and v_G are the projected position and velocity.

Next, the correction corresponding to the visual navigation data is determined from the observation matrix and the residual: dy = P_G · H^T · (H · P_G · H^T + R)^(−1) · (Z_m − (P_G, v_G)), where dy denotes the update of the filter, H the observation matrix determined above, H^T its transpose, and R the observation-noise covariance matrix, which can be determined as needed and is not limited by the embodiments of this application. For the covariance, the update is fy = −P_G · H^T · (H · P_G · H^T + R)^(−1) · H · P_G^T, where P_G^T is the transpose of the projection P_G; the meanings of the other quantities are as described above and are not repeated here.

Afterwards, the visual navigation data is calibrated with the determined correction. For the state, the corresponding update (i.e., the filter update dy) can be added to the original state as the calibrated state. Likewise, for the covariance, the corresponding update (i.e., the covariance update fy) can be added to the original covariance as the calibrated covariance of the filter. The device can then perform the next calibration based on the calibrated visual navigation data.
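The expressions for dy and fy above are a standard Kalman-style measurement update. A sketch with NumPy, where P, H, R, and the residual follow the notation of the text (dimensions here are illustrative):

```python
import numpy as np

def correction(P, H, R, residual):
    """Kalman-style correction with gain K = P H^T (H P H^T + R)^-1.
    Returns the state update dy = K @ residual and the covariance
    update fy = -K @ H @ P, matching the dy / fy expressions above."""
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # gain
    return K @ residual, -K @ H @ P
```

The calibrated state is then the original state plus dy, and the calibrated covariance the original covariance plus fy, as the text states.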
Based on the data calibration method shown in FIG. 1, visual navigation data and satellite navigation data of the unmanned driving device at the current moment are first determined from the collected data; then at least one kind of calibration data is determined from the satellite navigation data according to the solution type at the current moment. Next, under the constraint that the visual navigation data and the calibration data agree in at least one of position and velocity, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data is determined. From the determined transformation, the visual navigation data, and the calibration data, the correction corresponding to the visual navigation data is determined, and the visual navigation data is calibrated. By deciding, according to the solution type at the current moment, which data in the satellite navigation data to calibrate against, a good calibration result is obtained under each solution type.

In addition, in some embodiments, in step S104, the transformation determined from a single observation may contain error when determining the transformation between the first and second coordinate frames; the unmanned driving device may therefore update the transformation with a filter.

Illustratively, the device may first obtain its visual navigation data and calibration data at each historical moment. Then, in chronological order, for each moment, the transformation between the first coordinate frame of that moment's visual navigation data and the second coordinate frame of the satellite navigation data is determined from that moment's visual navigation data and calibration data, under the constraint that they agree in at least one of position and velocity. Next, the transformation determined at that moment is used to update the transformation determined at the preceding moment, until the stability measure of the updated transformation is below a preset stability threshold. From the updated transformation, the transformation between the first and second coordinate frames of the current moment is determined.

Determining the transformation for each moment is as described in step S104 and is not repeated here. To subsequently update the transformation of the preceding moment with the one determined at this moment, the device may take the transformation determined at the preceding moment, X_e = (yaw_e, x_e, y_e, z_e), as the state, take the transformation determined at this moment, X_m = (yaw_m, x_m, y_m, z_m), as the measurement, determine the observation matrix H, and further determine the update of the state quantities and the corresponding update of the covariance: dx = P · H^T · (H · P · H^T + R)^(−1) · (X_m − X_e), where dx denotes the filter update, H the observation matrix, P the filter's covariance matrix, and R the observation-noise covariance matrix, which can be set as needed. The covariance update is fx = −P · H^T · (H · P · H^T + R)^(−1) · H · P^T. The initial state and covariance are then updated with the corresponding updates; that is, the state X_e is updated with the filter update dx, and the covariance P with the covariance update fx.

The condition that the stability of the updated transformation is below the preset stability threshold judges whether the updated transformation has converged or stabilized. The stability threshold can be set as needed and is not limited by the embodiments of this application. Taking yaw as an example, when the variance of yaw is less than 0.000289, the updated transformation can be determined to have converged. Proceeding in chronological order, the transformation between the first coordinate frame of the current moment's visual navigation data and the second coordinate frame of the satellite navigation data can thus be determined through the updates.
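A scalar sketch of this convergence loop for the yaw component alone, using the 0.000289 variance threshold quoted above; the measurement variance and initial values are assumptions for illustration:

```python
def refine_yaw(measurements, yaw0=0.0, var0=1.0, meas_var=0.01, var_threshold=0.000289):
    """Fuse successive per-moment yaw estimates with a scalar filter,
    stopping once the posterior variance drops below the stability threshold."""
    yaw, var = yaw0, var0
    for z in measurements:
        k = var / (var + meas_var)   # scalar gain
        yaw += k * (z - yaw)         # state update
        var *= (1.0 - k)             # posterior variance shrinks each fusion
        if var < var_threshold:
            break
    return yaw, var
```

The full version would carry the 4-vector (yaw, x, y, z) and a matrix covariance, but the stopping rule on the yaw variance is the same.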
Further, in some embodiments, during the above process of updating the transformation of the preceding moment with the transformation determined at this moment, the deviation between the two may be large. For example, this moment may fall in a location with a poor satellite navigation signal, so the collected data is not accurate enough and the transformation determined from the current satellite navigation data is not accurate enough. The device may therefore decide, based on the degree of deviation between the transformation determined at this moment and that determined at the preceding moment, whether the transformation determined at this moment can be used for the update.

Illustratively, the device may first judge whether the deviation of the transformation determined at this moment from that determined at the preceding moment exceeds a preset deviation threshold. If yes, the transformation of the preceding moment is not updated with this moment's transformation; if no, the transformation of the preceding moment is updated with this moment's transformation.

The degree of deviation can be reflected by, for example, the Euclidean or Mahalanobis distance between the two; the preset deviation threshold can be set as needed and is not limited by the embodiments of this application. Of course, the data may fail the deviation check for an extended period; in that case, the state of the filter can be reset, and the filter state can then continue to be updated based on subsequent observations.
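A sketch of this deviation gate, using the Euclidean distance the text mentions (a Mahalanobis distance would slot into the same place):

```python
# Sketch: accept a newly determined transform only if it stays close to
# the previous estimate; otherwise skip the update (outlier rejection).
def accept_update(new, prev, threshold):
    """new, prev: transform parameter tuples, e.g. (yaw, x, y, z)."""
    dev = sum((a - b) ** 2 for a, b in zip(new, prev)) ** 0.5
    return dev <= threshold
```

A counter of consecutive rejections would trigger the filter reset described above when it grows too large.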
Additionally, in some embodiments, since the actual situation during the travel of the unmanned driving device is complex, the solution type of its satellite navigation data may be an RTK fixed solution at one moment but, with no differential corrections available at the next moment, change to a single-point solution. In this case, before determining the transformation between the first and second coordinate frames, the unmanned driving device can determine, from the solution type of the satellite navigation data at the current moment, which data it will subsequently calibrate the visual navigation data against.

Furthermore, in some embodiments, in step S106, the observation-noise covariance matrix R can be determined according to the application scenario. Illustratively, the device may adjust the measurement variance, and thereby R, according to the accuracy of the satellite navigation data. For example, when an unmanned aerial vehicle rotates rapidly at an angular rate of 0.2 or more, the accuracy of the satellite navigation data is negatively correlated with the angular rate; the measurement variance may then be multiplied by a coefficient determined from the angular rate, e.g., w_scale = (w/0.2)^2. When the device moves slowly with a velocity magnitude below 0.5, the accuracy of the satellite navigation data is positively correlated with the speed; the measurement variance may then be multiplied by a coefficient determined from the speed, e.g., v_scale = (1/v)^2.
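The two scaling rules in this paragraph can be sketched as a single factor applied to the measurement variance. The unit conventions (rad/s for the angular rate, m/s for the speed) are assumptions, since the text gives only bare numbers:

```python
def measurement_variance_scale(angular_rate, speed):
    """Scale factor for the GNSS measurement variance, per the example above:
    fast rotation (>= 0.2) inflates it by (w/0.2)^2, and slow motion
    (0 < speed < 0.5) inflates it by (1/v)^2."""
    scale = 1.0
    if angular_rate >= 0.2:
        scale *= (angular_rate / 0.2) ** 2
    if 0.0 < speed < 0.5:
        scale *= (1.0 / speed) ** 2
    return scale
```

Multiplying R by this factor down-weights GNSS observations exactly when they are least trustworthy, so the filter leans more on the VIO prediction in those regimes.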
Based on the data calibration flow shown in FIG. 1, an embodiment of this application further provides another data calibration method, as shown in FIG. 2.

FIG. 2 is a schematic flowchart of another data calibration method provided by an embodiment of this application, including the following steps:

S200: determine, from data collected by sensors, visual navigation data and satellite navigation data of the unmanned driving device at the current moment.

S202: judge, from the accuracy of the satellite navigation data, whether the satellite navigation data can be used for data calibration; if yes, perform step S204; if no, perform step S200.

For judging whether the satellite navigation data can be used for data calibration, see the corresponding description in step S100. If the current frame of satellite navigation data is too inaccurate to be used for calibrating the visual navigation data, the device can determine the visual navigation data and satellite navigation data of the next moment from the sensor data and restart the calibration flow.

S204: judge whether the solution type of the satellite navigation data at the current moment is a single-point solution; if yes, perform step S206; if no, perform step S214.

S206: judge whether a transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data, already determined for the case where the solution type is a single-point solution, exists; if yes, perform step S212; if no, perform step S208.

S208: judge whether a transformation between the first and second coordinate frames, already determined for the case where the solution type is an RTK fixed solution, exists; if yes, perform step S212; if no, perform step S210.

As for the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data, since it is a transformation between two coordinate systems, it generally does not change over time. So when the unmanned driving device has no such transformation in the early stage of operation, it can first determine the transformation from a certain amount of data, including at least one of the transformation corresponding to the single-point case or the transformation corresponding to the RTK-fixed case, and store it in a storage device. At subsequent moments, if a usable transformation exists, it does not need to be determined again.

Therefore, when the device determines through step S204 that the solution type of the satellite navigation data at the current moment is a single-point solution, it may first judge whether a transformation between the first and second coordinate frames, already determined for the single-point case, exists.

If a transformation corresponding to the single-point case exists at the current moment, the device can directly proceed to the subsequent data calibration steps using the already-determined transformation.

If no transformation corresponding to the single-point case exists at the current moment, the device can further judge whether a transformation between the first and second coordinate frames, already determined for the RTK-fixed case, exists.

If a transformation corresponding to the RTK-fixed case exists at the current moment, the satellite navigation signal has jumped: at a previous moment the solution type of the satellite navigation data was an RTK fixed solution, but at the current moment no differential corrections could be used and the solution type changed to a single-point solution. The device can record this jump. In this case, the device can still complete the data calibration using the transformation corresponding to the fixed-solution case.

Illustratively, as stated in step S104, in the single-point case the transformation between the first and second coordinate frames is determined from the rotation matrix alone; that is, only the yaw is actually determined, and only the rotation is actually used subsequently. Therefore, the three-dimensional quantity corresponding to the translation in the transformation of the fixed-solution case can be set to zero here, and the result used as the transformation corresponding to the single-point case. The device can then proceed to the subsequent data calibration steps using this already-determined transformation.

If no transformation corresponding to the RTK-fixed case exists at the current moment either, then no previously determined transformation exists at all, and the device needs to determine the transformation corresponding to the single-point case in the subsequent step S210 and then perform the data calibration.

It should be noted that, when the device determines through step S204 that the solution type at the current moment is a single-point solution, then regardless of whether a transformation corresponding to the single-point case or a transformation corresponding to the RTK-fixed case exists, the device can perform the subsequent data calibration steps using the existing transformation. Steps S206 and S208 may therefore be performed in either order.
S210: determine the transformation between the first and second coordinate frames from the visual navigation data and the satellite navigation data, under the constraint that the velocities of the visual navigation data and the satellite navigation data are equal.

S212: determine, from the satellite navigation data, the data characterizing the velocity of the unmanned driving device as the calibration data, and perform step S220.

S214: judge whether a transformation between the first and second coordinate frames, already determined for the case where the solution type is an RTK fixed solution, exists; if yes, perform step S218; if no, perform step S216.

When the device determines through step S204 that the solution type of the satellite navigation data at the current moment is an RTK fixed solution, then, as stated above, the transformation corresponding to the RTK-fixed case includes both a rotation and a translation, while the transformation corresponding to the single-point case includes only a rotation. In this case, even if a transformation corresponding to the single-point case exists at the current moment, performing the subsequent calibration steps with that existing transformation would not yield a sufficiently accurate calibration result.

Therefore, the device may only judge whether a transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data, already determined for the RTK-fixed case, exists.

If a transformation corresponding to the RTK-fixed case exists at the current moment, the device can directly proceed to the subsequent data calibration steps using the already-determined transformation.

If no transformation corresponding to the RTK-fixed case exists at the current moment, the device needs to determine the transformation corresponding to the RTK-fixed case in the subsequent step S216 and then perform the data calibration.

S216: determine the transformation between the first and second coordinate frames from the visual navigation data and the satellite navigation data, under the constraint that both the positions and the velocities of the visual navigation data and the satellite navigation data are equal.

S218: determine, from the satellite navigation data, the data characterizing the position and velocity of the unmanned driving device as the calibration data, and perform step S220.

S220: determine, from the determined transformation, the visual navigation data, and the calibration data, the correction corresponding to the visual navigation data, and calibrate the visual navigation data with the correction.

For the parts of the steps of the data calibration method shown in FIG. 2 that are identical to steps S100 to S106, see the corresponding descriptions above; they are not repeated here.

Furthermore, in some embodiments, in step S214, when the device determines through step S204 that the solution type at the current moment is an RTK fixed solution and no transformation corresponding to the RTK-fixed case exists at the current moment, then, in special circumstances, the device may also further judge whether a transformation corresponding to the single-point case exists at the current moment; if yes, perform step S212; if no, perform step S216. The special circumstances may be one or more of the following: the device is low on energy and needs to conserve it, the device's computing capacity is strained, or the device faces an emergency and needs a calibration result in the shortest possible time.
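The branch logic of steps S204 through S216 can be condensed into one decision function; the string labels below are illustrative, not names from the source:

```python
def choose_transform(solution_type, have_single, have_fixed):
    """Decision flow of steps S204-S216: reuse a stored frame transform when
    one is usable, otherwise report which kind must be solved afresh."""
    if solution_type == "single_point":
        if have_single or have_fixed:       # S206 / S208: reuse (translation zeroed if fixed-case)
            return "reuse"
        return "solve_velocity_only"        # S210
    if have_fixed:                          # S214: fixed solution, stored fixed-case transform
        return "reuse"
    return "solve_position_and_velocity"    # S216
```

The special-circumstances fallback described just above (reusing a single-point transform under an RTK fixed solution when resources are scarce) would add one more branch before the final return.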
The data calibration method provided by the embodiments of this application can be applied to scenarios in which navigation is based on a VIO system and a satellite navigation system. The unmanned driving device can determine visual navigation data and satellite navigation data from the data collected by its sensors and, according to the circumstances, adaptively calibrate the visual navigation data against the satellite navigation data. By providing a corresponding calibration method for each satellite navigation solution type, the method provided by the embodiments of this application obtains a good calibration result under every solution type.

It should be noted that all actions of acquiring signals, information, or data in this application are performed in compliance with the data protection laws, regulations, and policies of the relevant jurisdiction, and with authorization from the owner of the corresponding apparatus.

The above is the data calibration method provided by the embodiments of this application. Based on the same idea, an embodiment of this application further provides a corresponding data calibration apparatus, as shown in FIG. 3.

FIG. 3 is a schematic diagram of a data calibration apparatus provided by an embodiment of this application, including:

a navigation data determination module 300, configured to determine, from data collected by sensors, visual navigation data and satellite navigation data of an unmanned driving device at the current moment;

a calibration data determination module 302, configured to determine, according to the solution type of the satellite navigation data at the current moment, at least one kind of calibration data from the satellite navigation data;

a transformation determination module 304, configured to determine, based on the visual navigation data and the calibration data, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data, under the constraint that the visual navigation data and the calibration data agree in at least one of position and velocity;

a data calibration module 306, configured to determine, from the determined transformation, the visual navigation data, and the calibration data, the correction corresponding to the visual navigation data, and to calibrate the visual navigation data with the correction.

Optionally, the calibration data determination module 302 is configured to: when the solution type of the satellite navigation data at the current moment is determined to be an RTK fixed solution, determine, from the satellite navigation data, the data characterizing the position and velocity of the unmanned driving device as the calibration data; and when the solution type is determined to be a single-point solution, determine, from the satellite navigation data, the data characterizing the velocity of the unmanned driving device as the calibration data.

Optionally, the transformation determination module 304 is configured to: when the solution type of the satellite navigation data at the current moment is determined to be an RTK fixed solution, solve for the rotation matrix, based on the visual navigation data and the calibration data, under the constraint that their velocities are equal; solve for the translation matrix, based on the solved rotation matrix, the visual navigation data, and the calibration data, under the constraint that their positions are equal; and determine, from the solved rotation matrix and translation matrix, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data.

Optionally, the data calibration module 306 is configured to: determine the projection of the visual navigation data into the second coordinate frame from the determined transformation and the visual navigation data; determine, from the calibration data and the projection, the observation matrix and the residual between the calibration data and the projection; determine, from the observation matrix and the residual, the correction corresponding to the visual navigation data; and calibrate the visual navigation data with the determined correction.

Optionally, the transformation determination module 304 is configured to: when the solution type of the satellite navigation data at the current moment is determined to be a single-point solution, solve for the rotation matrix, based on the visual navigation data and the calibration data, under the constraint that their velocities are equal; and determine, from the solved rotation matrix, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data.

Optionally, the transformation determination module 304 is configured to: obtain visual navigation data and calibration data of the unmanned driving device at each historical moment; in chronological order, for each moment, determine the transformation between the first coordinate frame of that moment's visual navigation data and the second coordinate frame of the satellite navigation data, based on that moment's visual navigation data and calibration data, under the constraint that they agree in at least one of position and velocity; update the transformation determined at the preceding moment with the transformation determined at that moment, until the stability measure of the updated transformation is below a preset stability threshold; and determine, from the updated transformation, the transformation between the first and second coordinate frames at the current moment.

Optionally, the transformation determination module 304 is configured to: judge whether the deviation of the transformation determined at a moment from the transformation determined at the preceding moment exceeds a preset deviation threshold; if yes, not update the transformation determined at the preceding moment with the transformation determined at that moment; if no, update the transformation determined at the preceding moment with the transformation determined at that moment.

An embodiment of this application further provides a computer-readable storage medium storing a computer program that can be used to perform the data calibration method of FIG. 1 above.

An embodiment of this application further provides a schematic structural diagram of the electronic device shown in FIG. 4. As shown in FIG. 4, at the hardware level the electronic device includes a processor, an internal bus, a network interface, memory, and non-volatile storage, and may of course also include hardware required by other services. The processor reads the corresponding computer program from the non-volatile storage into memory and then runs it, thereby implementing the data calibration method described with reference to FIG. 1.

Of course, besides software implementations, the embodiments of this application do not exclude other implementations, such as logic devices or a combination of software and hardware. That is, the executing entity of the foregoing processing flow is not limited to logic units; it may also be hardware or a logic device.
In the 1990s, an improvement to a technology could be clearly distinguished as a hardware improvement (e.g., an improvement to circuit structures such as diodes, transistors, and switches) or a software improvement (an improvement to a method flow). With the development of technology, however, many of today's improvements to method flows can be regarded as direct improvements to hardware circuit structures. Designers almost always obtain the corresponding hardware circuit structure by programming the improved method flow into hardware circuits. Therefore, it cannot be said that an improvement to a method flow cannot be realized with hardware entity modules. For example, a programmable logic device (PLD), such as a field-programmable gate array (FPGA), is an integrated circuit whose logic functions are determined by the user's programming of the device. Designers themselves program to "integrate" a digital system onto a single PLD, without needing a chip manufacturer to design and fabricate a dedicated integrated circuit chip. Moreover, instead of making integrated circuit chips by hand, this programming is nowadays mostly realized with "logic compiler" software, which is similar to the software compilers used in program development, while the original code to be compiled must be written in a specific programming language called a hardware description language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); the most widely used at present are VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog. A person skilled in the art will also appreciate that a hardware circuit implementing a logical method flow can easily be obtained merely by lightly logic-programming the method flow in one of the above hardware description languages and programming it into an integrated circuit.

The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, or an embedded microcontroller. Examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicone Labs C8051F320. A memory controller may also be implemented as part of the memory's control logic. A person skilled in the art also knows that, besides implementing a controller purely as computer-readable program code, the method steps can be logically programmed so that the controller achieves the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller can therefore be regarded as a hardware component, and the means included within it for realizing various functions can also be regarded as structures within the hardware component. Or even, the means for realizing various functions can be regarded both as software modules implementing the method and as structures within the hardware component.

The systems, apparatuses, modules, or units set forth in the above embodiments may be implemented by a computer chip or entity, or by a product having a certain function. A typical implementation device is a computer. Illustratively, the computer may be, for example, a personal computer, a laptop computer, a cellular phone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an e-mail device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.

For convenience of description, the above apparatus is described with its functions divided into various units. Of course, when implementing the embodiments of this application, the functions of the units may be implemented in one or more pieces of software and/or hardware.
Those skilled in the art will appreciate that embodiments of the present invention may be provided as a method, a system, or a computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.

The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.

These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.

These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, such that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.

In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.

The memory may include non-persistent storage in computer-readable media, random-access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.

Computer-readable media include persistent and non-persistent, removable and non-removable media, and can store information by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.

It should also be noted that the terms "include", "comprise", and any variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a list of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the presence of additional identical elements in the process, method, article, or device that includes the element.

Those skilled in the art will appreciate that embodiments of this application may be provided as a method, a system, or a computer program product. Accordingly, this application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, this application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.

This application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, and the like that perform specific tasks or implement specific abstract data types. This application may also be practiced in distributed computing environments where tasks are performed by remote processing devices connected through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media, including storage devices.

The embodiments in this application are described in a progressive manner; for identical or similar parts between the embodiments, reference may be made to one another, and each embodiment focuses on its differences from the others. In particular, the system embodiment is described relatively simply because it is substantially similar to the method embodiment; for relevant parts, refer to the description of the method embodiment. The above are merely embodiments of this application and are not intended to limit it. For those skilled in the art, this application may have various modifications and variations. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of this application shall be included within the scope of the claims of this application.

Claims (16)

  1. A data calibration method, comprising:
    determining, from data collected by sensors, visual navigation data and satellite navigation data of an unmanned driving device at a current moment;
    determining, according to a solution type of the satellite navigation data at the current moment, at least one kind of calibration data from the satellite navigation data for calibration;
    determining, based on the visual navigation data and the calibration data, a transformation between a first coordinate frame of the visual navigation data and a second coordinate frame of the satellite navigation data, under a constraint that the visual navigation data and the calibration data agree in at least one of position and velocity;
    determining, from the determined transformation, the visual navigation data, and the calibration data, a correction corresponding to the visual navigation data, and calibrating the visual navigation data with the correction.
  2. The method of claim 1, wherein determining, according to the solution type of the satellite navigation data at the current moment, at least one kind of calibration data from the satellite navigation data comprises:
    when the solution type of the satellite navigation data at the current moment is determined to be a real-time kinematic (RTK) fixed solution, determining, from the satellite navigation data, data characterizing the position and velocity of the unmanned driving device as the calibration data;
    when the solution type of the satellite navigation data at the current moment is determined to be a single-point solution, determining, from the satellite navigation data, data characterizing the velocity of the unmanned driving device as the calibration data.
  3. The method of claim 2, wherein determining, based on the visual navigation data and the calibration data, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data, under the constraint that the visual navigation data and the calibration data agree in at least one of position and velocity, comprises:
    when the solution type of the satellite navigation data at the current moment is determined to be an RTK fixed solution, solving for a rotation matrix, based on the visual navigation data and the calibration data, under a constraint that their velocities are equal;
    solving for a translation matrix, based on the solved rotation matrix, the visual navigation data, and the calibration data, under a constraint that their positions are equal;
    determining, from the solved rotation matrix and translation matrix, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data.
  4. The method of claim 2, wherein determining, based on the visual navigation data and the calibration data, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data, under the constraint that the visual navigation data and the calibration data agree in at least one of position and velocity, comprises:
    when the solution type of the satellite navigation data at the current moment is determined to be a single-point solution, solving for a rotation matrix, based on the visual navigation data and the calibration data, under a constraint that their velocities are equal;
    determining, from the solved rotation matrix, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data.
  5. The method of any one of claims 1-4, wherein determining, from the determined transformation, the visual navigation data, and the calibration data, the correction corresponding to the visual navigation data comprises:
    determining a projection of the visual navigation data into the second coordinate frame from the determined transformation and the visual navigation data;
    determining, from the calibration data and the projection, an observation matrix and a residual between the calibration data and the projection;
    determining, from the observation matrix and the residual, the correction corresponding to the visual navigation data.
  6. The method of any one of claims 1-4, wherein determining, based on the visual navigation data and the calibration data, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data, under the constraint that the visual navigation data and the calibration data agree in at least one of position and velocity, comprises:
    obtaining visual navigation data and calibration data of the unmanned driving device at each historical moment;
    in chronological order, for each moment, determining, based on that moment's visual navigation data and calibration data, the transformation between the first coordinate frame of that moment's visual navigation data and the second coordinate frame of the satellite navigation data, under the constraint that they agree in at least one of position and velocity;
    updating the transformation determined at the moment preceding that moment with the transformation determined at that moment, until a stability measure of the updated transformation is below a preset stability threshold;
    determining, from the updated transformation, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data at the current moment.
  7. The method of claim 6, wherein updating the transformation determined at the moment preceding that moment with the transformation determined at that moment comprises:
    judging whether the deviation of the transformation determined at that moment from the transformation determined at the preceding moment exceeds a preset deviation threshold;
    if yes, not updating the transformation determined at the preceding moment with the transformation determined at that moment;
    if no, updating the transformation determined at the preceding moment with the transformation determined at that moment.
  8. A data calibration apparatus, comprising:
    a navigation data determination module, configured to determine, from data collected by sensors, visual navigation data and satellite navigation data of an unmanned driving device at a current moment;
    a calibration data determination module, configured to determine, according to a solution type of the satellite navigation data at the current moment, at least one kind of calibration data from the satellite navigation data for calibration;
    a transformation determination module, configured to determine, based on the visual navigation data and the calibration data, a transformation between a first coordinate frame of the visual navigation data and a second coordinate frame of the satellite navigation data, under a constraint that the visual navigation data and the calibration data agree in at least one of position and velocity;
    a data calibration module, configured to determine, from the determined transformation, the visual navigation data, and the calibration data, a correction corresponding to the visual navigation data, and to calibrate the visual navigation data with the correction.
  9. The apparatus of claim 8, wherein the calibration data determination module is configured to: when the solution type of the satellite navigation data at the current moment is determined to be an RTK fixed solution, determine, from the satellite navigation data, data characterizing the position and velocity of the unmanned driving device as the calibration data; and when the solution type of the satellite navigation data at the current moment is determined to be a single-point solution, determine, from the satellite navigation data, data characterizing the velocity of the unmanned driving device as the calibration data.
  10. The apparatus of claim 9, wherein the transformation determination module is configured to: when the solution type of the satellite navigation data at the current moment is determined to be an RTK fixed solution, solve for a rotation matrix, based on the visual navigation data and the calibration data, under a constraint that their velocities are equal; solve for a translation matrix, based on the solved rotation matrix, the visual navigation data, and the calibration data, under a constraint that their positions are equal; and determine, from the solved rotation matrix and translation matrix, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data.
  11. The apparatus of claim 9, wherein the transformation determination module is configured to: when the solution type of the satellite navigation data at the current moment is determined to be a single-point solution, solve for a rotation matrix, based on the visual navigation data and the calibration data, under a constraint that their velocities are equal; and determine, from the solved rotation matrix, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data.
  12. The apparatus of any one of claims 8-11, wherein the data calibration module is configured to: determine a projection of the visual navigation data into the second coordinate frame from the determined transformation and the visual navigation data; determine, from the calibration data and the projection, an observation matrix and a residual between the calibration data and the projection; and determine, from the observation matrix and the residual, the correction corresponding to the visual navigation data.
  13. The apparatus of any one of claims 8-11, wherein the transformation determination module is configured to: obtain visual navigation data and calibration data of the unmanned driving device at each historical moment; in chronological order, for each moment, determine, based on that moment's visual navigation data and calibration data, the transformation between the first coordinate frame of that moment's visual navigation data and the second coordinate frame of the satellite navigation data, under the constraint that they agree in at least one of position and velocity; update the transformation determined at the preceding moment with the transformation determined at that moment, until a stability measure of the updated transformation is below a preset stability threshold; and determine, from the updated transformation, the transformation between the first coordinate frame of the visual navigation data and the second coordinate frame of the satellite navigation data at the current moment.
  14. The apparatus of claim 13, wherein the transformation determination module is configured to: judge whether the deviation of the transformation determined at a moment from the transformation determined at the preceding moment exceeds a preset deviation threshold; if yes, not update the transformation determined at the preceding moment with the transformation determined at that moment; if no, update the transformation determined at the preceding moment with the transformation determined at that moment.
  15. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the data calibration method of any one of claims 1-7.
  16. An electronic device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the data calibration method of any one of claims 1-7.
PCT/CN2023/071951 2022-04-02 2023-01-12 Data calibration WO2023185215A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210351011.9A CN116929407A (zh) 2022-04-02 2022-04-02 Adaptive data calibration method and apparatus
CN202210351011.9 2022-04-02

Publications (1)

Publication Number Publication Date
WO2023185215A1 true WO2023185215A1 (zh) 2023-10-05

Family

ID=88199044

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/071951 WO2023185215A1 (zh) 2023-01-12 Data calibration

Country Status (2)

Country Link
CN (1) CN116929407A (zh)
WO (1) WO2023185215A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117668575B (zh) * 2024-01-31 2024-05-28 Leyard Smart Technology Group Co., Ltd. Method, apparatus, device, and storage medium for constructing a data model for a light-and-shadow show

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103454650A (zh) * 2013-08-20 2013-12-18 Beihang University Vision-aided satellite integrity monitoring method
US20170031032A1 (en) * 2015-07-27 2017-02-02 Qualcomm Incorporated Visual inertial odometry attitude drift calibration
CN109541656A (zh) * 2018-11-16 2019-03-29 Unicore Communications Technology (Beijing) Co., Ltd. Positioning method and device
CN110100151A (zh) * 2017-01-04 2019-08-06 Qualcomm Incorporated Systems and methods for using global positioning system velocity in visual-inertial odometry
CN111025364A (zh) * 2019-12-17 2020-04-17 Nanjing University of Aeronautics and Astronautics Satellite-assisted machine vision positioning system and method
CN113405545A (zh) * 2021-07-20 2021-09-17 Alibaba Singapore Holding Pte. Ltd. Positioning method and apparatus, electronic device, and computer storage medium
CN113433576A (zh) * 2021-06-28 2021-09-24 National Time Service Center, Chinese Academy of Sciences GNSS and V-SLAM fusion positioning method and system


Also Published As

Publication number Publication date
CN116929407A (zh) 2023-10-24

Similar Documents

Publication Publication Date Title
CN111077555B (zh) Positioning method and device
WO2021169420A1 (zh) Visual positioning based on multiple image frames
WO2023185215A1 (zh) Data calibration
WO2022063120A1 (zh) Integrated navigation system initialization method, apparatus, medium, and electronic device
CN111797906B (zh) Method and device for positioning based on vision and inertial odometry
CN112762965B (zh) Magnetometer calibration method and device
CN114111776B (zh) Positioning method and related device
CN113188505B (zh) Attitude angle measurement method, device, vehicle, and intelligent boom
WO2022218306A1 (zh) Unmanned driving device
CN111998870B (zh) Calibration method and device for a camera-inertial navigation system
CN113674424B (zh) Method and device for drawing an electronic map
WO2022135070A1 (zh) Inertial navigation method and device
CN112461258A (zh) Parameter correction method and device
US11353579B2 (en) Method for indicating obstacle by smart roadside unit
WO2023143132A1 (zh) Calibration of sensor data
CN116222586A (zh) Fusion positioning method and device for autonomous vehicles, and electronic device
CN115655305A (zh) Extrinsic parameter calibration method, apparatus, computing device, storage medium, and vehicle
CN113048989B (zh) Positioning method and positioning device for an unmanned driving device
CN114061573A (zh) Ground unmanned vehicle formation positioning device and method
EP3855117A1 (en) Terrain referenced navigation system with generic terrain sensors for correcting an inertial navigation solution
CN112712561A (zh) Mapping method and device, storage medium, and electronic device
CN116242385A (zh) Visual navigation data calibration method and device
CN113734198B (zh) Method and device for obtaining the relative heading of a target
CN111348223B (zh) Closed-loop guidance method, device, and equipment for controlling trajectory apex height
CN116625372A (zh) Multi-source UAV navigation method and device, storage medium, and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23777594

Country of ref document: EP

Kind code of ref document: A1