CN116734845A - Attitude determination method and device, computer equipment and storage medium - Google Patents

Attitude determination method and device, computer equipment and storage medium

Info

Publication number
CN116734845A
CN116734845A CN202310707771.3A
Authority
CN
China
Prior art keywords
information
vehicle
attitude
determining
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310707771.3A
Other languages
Chinese (zh)
Inventor
王璀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jidu Technology Co Ltd
Original Assignee
Beijing Jidu Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jidu Technology Co Ltd filed Critical Beijing Jidu Technology Co Ltd
Priority to CN202310707771.3A priority Critical patent/CN116734845A/en
Publication of CN116734845A publication Critical patent/CN116734845A/en
Pending legal-status Critical Current

Links

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/10 - Navigation by using measurements of speed or acceleration
    • G01C21/12 - Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 - Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 - Inertial navigation combined with non-inertial navigation instruments
    • G01C21/1656 - Inertial navigation combined with passive imaging devices, e.g. cameras
    • G01C21/18 - Stabilised platforms, e.g. by gyroscope
    • G01C21/20 - Instruments for performing navigational calculations
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements using pattern recognition or machine learning
    • G06V10/74 - Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 - Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Navigation (AREA)

Abstract

The present disclosure provides an attitude determination method and apparatus, a computer device, and a storage medium. The method includes: acquiring motion information of a vehicle at a target time, wherein the motion information includes angular velocity information, acceleration information, and speed information; determining first gravity information of the vehicle under the motion information; determining a first attitude change amount based on a pre-calibrated attitude change estimation function and the motion information; correcting the first gravity information based on the first attitude change amount to obtain corrected second gravity information; and determining target attitude information of the vehicle at the target time based on the corrected second gravity information, preset standard gravity information, and initial attitude information of the vehicle at the time preceding the target time.

Description

Attitude determination method and device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technology, and in particular to an attitude determination method and apparatus, a computer device, and a storage medium.
Background
In the field of automatic driving, the attitude of a vehicle generally needs to be acquired in order to control the vehicle, and it is usually calculated from data collected by sensors. However, sensors such as an inertial measurement unit (IMU), an accelerometer, and a gyroscope are typically installed in the cab of the vehicle and are therefore not rigidly connected to the chassis. Because of inertia effects during acceleration and deceleration, wind resistance, and similar factors, the vehicle body deforms and tilts, so the data measured by the sensors differ from the actual motion data of the chassis, and the calculated vehicle attitude consequently has a large error.
Disclosure of Invention
The embodiments of the present disclosure provide at least an attitude determination method and apparatus, a computer device, and a storage medium.
In a first aspect, an embodiment of the present disclosure provides an attitude determination method, including:
acquiring motion information of a vehicle at a target time, wherein the motion information includes angular velocity information, acceleration information, and speed information;
determining first gravity information of the vehicle under the motion information, and determining a first attitude change amount based on a pre-calibrated attitude change estimation function and the motion information;
correcting the first gravity information based on the first attitude change amount to obtain corrected second gravity information;
and determining target attitude information of the vehicle at the target time based on the corrected second gravity information, preset standard gravity information, and initial attitude information of the vehicle at the time preceding the target time.
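The four steps of the first aspect can be sketched in Python. All function names, the linear form of the estimation function, and the pitch-only gravity correction below are illustrative assumptions rather than the disclosure's exact formulation:

```python
import numpy as np

STANDARD_GRAVITY = np.array([0.0, 0.0, -9.81])  # preset standard gravity (frame assumed)

def estimate_attitude_change(speed, a_long, coeffs):
    """Pre-calibrated attitude change estimation function. A linear form in
    speed and longitudinal acceleration is assumed for illustration; the
    disclosure leaves the functional form open."""
    k_v, k_a, bias = coeffs
    return k_v * speed + k_a * a_long + bias  # body-vs-chassis pitch difference, rad

def determine_target_tilt(accel, speed, a_long, a_lat, coeffs):
    # First gravity information: measured specific force with the vehicle's
    # own longitudinal/lateral accelerations removed (simplified).
    g1 = accel - np.array([a_long, a_lat, 0.0])
    # First attitude change amount from the calibrated function.
    dtheta = estimate_attitude_change(speed, a_long, coeffs)
    # Correction: rotate the first gravity information back through the
    # body-vs-chassis pitch difference (rotation about the lateral axis).
    c, s = np.cos(dtheta), np.sin(dtheta)
    rot_y = np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])
    g2 = rot_y @ g1  # corrected second gravity information
    # The angle between corrected and standard gravity characterizes the
    # chassis tilt relative to level ground.
    cos_ang = g2 @ STANDARD_GRAVITY / (np.linalg.norm(g2) * np.linalg.norm(STANDARD_GRAVITY))
    return np.arccos(np.clip(cos_ang, -1.0, 1.0))
```

For a stationary, level vehicle (accelerometer reading equal to standard gravity, zero longitudinal and lateral acceleration), the returned tilt is zero, as expected.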
In a possible implementation, determining the first gravity information of the vehicle under the motion information includes:
determining a longitudinal acceleration of the vehicle based on the speed information of the vehicle at the time preceding the target time and the speed information at the time following the target time, and determining a lateral acceleration of the vehicle based on the angular velocity information and the speed information of the vehicle at the target time;
and determining the first gravity information of the vehicle under the motion information based on the acceleration information of the vehicle at the target time, the longitudinal acceleration, and the lateral acceleration.
In a possible implementation, the attitude change estimation function characterizes a mapping from acceleration and speed to attitude change amount;
determining the first attitude change amount based on the pre-calibrated attitude change estimation function and the motion information includes:
determining the first attitude change amount based on the pre-calibrated attitude change estimation function, the acceleration information at the target time, and the speed information at the target time.
In a possible implementation, determining the target attitude information of the vehicle at the target time based on the corrected second gravity information, the preset standard gravity information, and the initial attitude information of the vehicle at the time preceding the target time includes:
determining first attitude information based on the corrected second gravity information and the standard gravity information, and determining second attitude information of the vehicle based on the angular velocity information and the initial attitude information;
and determining the target attitude information of the vehicle at the target time based on the first attitude information and the second attitude information;
wherein the first attitude information is chassis attitude information calculated taking into account the non-rigid connection between the sensor and the chassis, and the second attitude information is vehicle-body attitude information calculated without taking that non-rigid connection into account.
In a possible implementation, determining the second attitude information of the vehicle based on the angular velocity information and the initial attitude information includes:
determining a second attitude change amount of the vehicle based on the angular velocity information of the vehicle at the target time;
and determining the second attitude information of the vehicle based on the second attitude change amount and the initial attitude information.
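A minimal sketch of this embodiment under a small-angle, per-axis Euler assumption (the function name and the per-axis treatment are assumptions; a production implementation would typically integrate quaternions):

```python
def second_attitude(initial_attitude, angular_velocity, dt):
    """Second attitude information: the second attitude change amount
    (angular velocity at the target time multiplied by the time step)
    applied to the initial attitude, axis by axis."""
    delta = tuple(w * dt for w in angular_velocity)  # second attitude change amount
    return tuple(a + d for a, d in zip(initial_attitude, delta))
```

For example, with an initial attitude of (1, 1, 1) rad, angular rates (0, 2, 4) rad/s, and a 0.5 s step, the result is (1, 2, 3) rad.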
In a possible implementation, the target attitude information includes a roll angle, a pitch angle, and an azimuth angle;
determining the target attitude information of the vehicle at the target time based on the first attitude information and the second attitude information includes:
taking the azimuth angle in the second attitude information as the azimuth angle in the target attitude information; and
determining the pitch angle and the roll angle of the target attitude information based on the pitch angle and the roll angle in the first attitude information, a first preset weight corresponding to the first attitude information, the pitch angle and the roll angle in the second attitude information, and a second preset weight corresponding to the second attitude information.
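The fusion described in this embodiment can be sketched as follows; the angle ordering (roll, pitch, azimuth) and the normalized weighted average are assumptions, since the disclosure only states that preset weights are used:

```python
def fuse_target_attitude(first, second, w1, w2):
    """Target attitude: the azimuth is taken directly from the second
    attitude information, while pitch and roll are a weighted combination
    of the first and second attitude information."""
    roll = (w1 * first[0] + w2 * second[0]) / (w1 + w2)
    pitch = (w1 * first[1] + w2 * second[1]) / (w1 + w2)
    return (roll, pitch, second[2])  # azimuth comes solely from the second attitude
```

With equal weights, pitch and roll are simply averaged while the azimuth of the second attitude information passes through unchanged.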
In a possible implementation, the method further includes determining the attitude change estimation function as follows:
determining attitude change amounts of the vehicle under a plurality of speeds and a plurality of longitudinal accelerations;
and determining, based on those attitude change amounts, an attitude change estimation function for estimating the attitude change amount from the speed and the longitudinal acceleration of the vehicle.
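The calibration step above can be sketched as a regression over the measured samples. A linear least-squares model is an assumption here; the disclosure does not fix the functional form:

```python
import numpy as np

def calibrate_attitude_change_function(speeds, accels, attitude_changes):
    """Fit an attitude change estimation function from attitude change
    amounts measured at several speeds and longitudinal accelerations,
    using an assumed linear model: change = k_v * v + k_a * a + bias."""
    A = np.column_stack([speeds, accels, np.ones(len(speeds))])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(attitude_changes), rcond=None)
    k_v, k_a, bias = coeffs
    return lambda v, a: k_v * v + k_a * a + bias
```

Given calibration data generated by a known linear relation, the fitted function reproduces the attitude change at new operating points.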
In a second aspect, the disclosed embodiments further provide an in-vehicle apparatus for performing the attitude determination method according to the first aspect, or any one of possible implementation manners of the first aspect.
In a third aspect, embodiments of the present disclosure further provide an attitude determination apparatus, including:
an acquisition module, configured to acquire motion information of a vehicle at a target time, wherein the motion information includes angular velocity information, acceleration information, and speed information;
a first determining module, configured to determine first gravity information of the vehicle under the motion information, and determine a first attitude change amount based on a pre-calibrated attitude change estimation function and the motion information;
a correction module, configured to correct the first gravity information based on the first attitude change amount to obtain corrected second gravity information;
and a second determining module, configured to determine target attitude information of the vehicle at the target time based on the corrected second gravity information, preset standard gravity information, and initial attitude information of the vehicle at the time preceding the target time.
In a possible implementation, the first determining module, when determining the first gravity information of the vehicle under the motion information, is configured to:
determine a longitudinal acceleration of the vehicle based on the speed information of the vehicle at the time preceding the target time and the speed information at the time following the target time, and determine a lateral acceleration of the vehicle based on the angular velocity information and the speed information of the vehicle at the target time;
and determine the first gravity information of the vehicle under the motion information based on the acceleration information of the vehicle at the target time, the longitudinal acceleration, and the lateral acceleration.
In a possible implementation, the attitude change estimation function characterizes a mapping from acceleration and speed to attitude change amount;
the first determining module, when determining the first attitude change amount based on the pre-calibrated attitude change estimation function and the motion information, is configured to:
determine the first attitude change amount based on the pre-calibrated attitude change estimation function, the acceleration information at the target time, and the speed information at the target time.
In a possible implementation, the second determining module, when determining the target attitude information of the vehicle at the target time based on the corrected second gravity information, the preset standard gravity information, and the initial attitude information of the vehicle at the time preceding the target time, is configured to:
determine first attitude information based on the corrected second gravity information and the standard gravity information, and determine second attitude information of the vehicle based on the angular velocity information and the initial attitude information;
and determine the target attitude information of the vehicle at the target time based on the first attitude information and the second attitude information;
wherein the first attitude information is chassis attitude information calculated taking into account the non-rigid connection between the sensor and the chassis, and the second attitude information is vehicle-body attitude information calculated without taking that non-rigid connection into account.
In a possible implementation, the second determining module, when determining the second attitude information of the vehicle based on the angular velocity information and the initial attitude information, is configured to:
determine a second attitude change amount of the vehicle based on the angular velocity information of the vehicle at the target time;
and determine the second attitude information of the vehicle based on the second attitude change amount and the initial attitude information.
In a possible implementation, the target attitude information includes a roll angle, a pitch angle, and an azimuth angle;
the second determining module, when determining the target attitude information of the vehicle at the target time based on the first attitude information and the second attitude information, is configured to:
take the azimuth angle in the second attitude information as the azimuth angle in the target attitude information; and
determine the pitch angle and the roll angle of the target attitude information based on the pitch angle and the roll angle in the first attitude information, a first preset weight corresponding to the first attitude information, the pitch angle and the roll angle in the second attitude information, and a second preset weight corresponding to the second attitude information.
In a possible implementation, the apparatus further includes a third determining module, configured to determine the attitude change estimation function as follows:
determine attitude change amounts of the vehicle under a plurality of speeds and a plurality of longitudinal accelerations;
and determine, based on those attitude change amounts, an attitude change estimation function for estimating the attitude change amount from the speed and the longitudinal acceleration of the vehicle.
In a fourth aspect, embodiments of the present disclosure further provide a computer device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory in communication via the bus when the computer device is running, the machine-readable instructions when executed by the processor performing the steps of the first aspect, or any of the possible implementations of the first aspect.
In a fifth aspect, the presently disclosed embodiments also provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the first aspect, or any of the possible implementations of the first aspect.
The attitude determination method and apparatus, computer device, and storage medium provided by the embodiments of the present disclosure can, after acquiring the motion information of the vehicle at the target time, determine first gravity information of the vehicle under the motion information and determine a first attitude change amount based on a pre-calibrated attitude change estimation function and the motion information; then correct the first gravity information based on the first attitude change amount to obtain corrected second gravity information; and finally determine target attitude information of the vehicle at the target time based on the corrected second gravity information, preset standard gravity information, and initial attitude information of the vehicle at the time preceding the target time. Because the first attitude change amount characterizes the attitude difference between the vehicle body and the chassis, correcting the first gravity information based on it yields the second gravity information of the chassis under the current attitude, and the direction or angle difference between the second gravity information and the standard gravity information can then characterize the true attitude of the chassis.
The foregoing objects, features and advantages of the disclosure will be more readily apparent from the following detailed description of the preferred embodiments taken in conjunction with the accompanying drawings.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required for the embodiments are briefly described below. They are incorporated in and constitute a part of the specification, show embodiments consistent with the present disclosure, and together with the description serve to illustrate the technical solutions of the present disclosure. It should be understood that the following drawings illustrate only certain embodiments of the present disclosure and are therefore not to be considered limiting of its scope; a person of ordinary skill in the art may derive other relevant drawings from them without inventive effort.
FIG. 1 illustrates a flowchart of an attitude determination method provided by an embodiment of the present disclosure;
FIG. 2 illustrates a flowchart for determining target attitude information in the attitude determination method provided by an embodiment of the present disclosure;
FIG. 3 illustrates an overall flowchart of the attitude determination method provided by an embodiment of the present disclosure;
FIG. 4 illustrates an architectural diagram of an attitude determination apparatus provided by an embodiment of the present disclosure;
Fig. 5 shows a schematic structural diagram of a computer device according to an embodiment of the disclosure.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the drawings in the embodiments of the present disclosure, and it is apparent that the described embodiments are only some embodiments of the present disclosure, but not all embodiments. The components of the disclosed embodiments generally described and illustrated herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure is not intended to limit the scope of the disclosure, as claimed, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be made by those skilled in the art based on the embodiments of this disclosure without making any inventive effort, are intended to be within the scope of this disclosure.
Research has shown that, in the related art, the pose of a vehicle is generally determined from a positioning system (such as the Global Positioning System (GPS) or real-time kinematic (RTK) positioning), a radar device, and images acquired by a camera, or is calculated from motion data (such as angular velocity, speed, and acceleration) measured by sensors such as accelerometers and gyroscopes. However, positioning systems usually have no signal in places such as tunnels and underground parking lots, radar devices and cameras have lower accuracy in severe weather and environments, and radar devices are easily interfered with by occluding objects, so attitude estimation often has to rely on the motion sensors.
Sensors such as the inertial measurement unit are usually installed in the cockpit of the vehicle, and because of inertia effects during acceleration or deceleration and body deformation caused by wind resistance and the like, the data measured by the sensors differ from the real chassis motion data of the vehicle, so the calculated attitude has a large error.
Based on the above research, the present disclosure provides an attitude determination method that, after acquiring the motion information of the vehicle at a target time, determines first gravity information of the vehicle under the motion information and determines a first attitude change amount based on a pre-calibrated attitude change estimation function and the motion information; then corrects the first gravity information based on the first attitude change amount to obtain corrected second gravity information; and finally determines target attitude information of the vehicle at the target time based on the corrected second gravity information, preset standard gravity information, and initial attitude information of the vehicle at the time preceding the target time. Because the first attitude change amount characterizes the attitude difference between the vehicle body and the chassis, correcting the first gravity information based on it yields the second gravity information of the chassis under the current attitude, and the direction or angle difference between the second gravity information and the standard gravity information can then characterize the true attitude of the chassis.
Compared with the attitude determination methods in the related art, the corrected second gravity information obtained by the method provided by the present disclosure eliminates the influence of vehicle-body deformation and inertia effects, so the chassis attitude determined from the second gravity information is more accurate, and the method therefore achieves higher attitude determination accuracy.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
The term "and/or" herein merely describes an association relationship and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality; for example, "including at least one of A, B, and C" may mean including any one or more elements selected from the set consisting of A, B, and C.
To facilitate understanding of the present embodiment, an attitude determination method disclosed in the embodiments of the present disclosure is first described in detail. Referring to FIG. 1, a flowchart of the attitude determination method provided by an embodiment of the present disclosure, the method includes steps 101 to 104:
Step 101: acquire motion information of a vehicle at a target time and initial attitude information of the vehicle at the time preceding the target time, wherein the motion information includes angular velocity information, acceleration information, and speed information;
Step 102: determine first gravity information of the vehicle under the motion information, and determine a first attitude change amount based on a pre-calibrated attitude change estimation function and the motion information;
Step 103: correct the first gravity information based on the first attitude change amount to obtain corrected second gravity information;
Step 104: determine target attitude information of the vehicle at the target time based on the corrected second gravity information and preset standard gravity information.
The following is a detailed description of the above steps:
Regarding step 101:
During driving, the vehicle can acquire motion information at each time in real time, and the target time may be any time up to the current time, such as the time immediately preceding the current time. The motion information may be acquired from sensors on the vehicle: the speed information from a speed sensor, the angular velocity information from a gyroscope, and the acceleration information from an accelerometer; alternatively, the angular velocity information and the acceleration information may be acquired from an inertial measurement unit (IMU).
Here, the initial attitude information at the time preceding the target time may be the target attitude information determined according to steps 101 to 104 with that preceding time taken as the target time; when the preceding time is the first time (e.g., when the vehicle has just started), the initial attitude information may be preset attitude information (e.g., zero).
Regarding step 102:
The first gravity information may represent the acceleration, in the vertical direction, of the position where the sensor is located (approximately, the vehicle body) in a first coordinate system. The first coordinate system is a coordinate system established based on the data measured by the sensor and may be approximated as a vehicle body coordinate system, for example, a coordinate system formed by three mutually perpendicular coordinate axes: the forward direction of the vehicle, the vertical direction of the vehicle, and the left-right direction of the vehicle.
In one possible embodiment, when determining the first gravity information of the vehicle under the motion information, the longitudinal acceleration of the vehicle may be determined based on the speed information of the vehicle at a time before the target time and the speed information at a time after the target time; and determining a lateral acceleration of the vehicle based on the angular velocity information of the vehicle at the target time and the velocity information at the target time; first gravity information of the vehicle under the motion information is then determined based on acceleration information of the vehicle at the target time, the longitudinal acceleration, and the lateral acceleration.
Specifically, the longitudinal acceleration of the vehicle may represent the acceleration of the vehicle in the forward or backward direction. When determining the longitudinal acceleration, the speed difference between the speed information at the time before the target time and the speed information at the time after the target time, and a first time difference between those two times, may be calculated; dividing the speed difference by the first time difference yields the longitudinal acceleration, which may be calculated according to the following formula:

$a_i^{\mathrm{lon}} = \dfrac{v_{i+1} - v_{i-1}}{t_{i+1} - t_{i-1}}$

wherein $a_i^{\mathrm{lon}}$ is the longitudinal acceleration, $v_{i-1}$ is the speed information at the time before the target time, $v_{i+1}$ is the speed information at the time after the target time, $t_{i-1}$ is the time before the target time, and $t_{i+1}$ is the time after the target time.
The lateral acceleration of the vehicle may be understood as the centrifugal acceleration of the vehicle, which may be obtained by multiplying the angular velocity information of the vehicle at the target time by the speed information at the target time, that is, it may be calculated according to the following formula:

$a_i^{\mathrm{lat}} = r_i \cdot v_i$

wherein $a_i^{\mathrm{lat}}$ is the lateral acceleration, $r_i$ is the angular velocity information at the target time, and $v_i$ is the speed information at the target time.
The acceleration information at the target time represents the overall acceleration of the vehicle, that is, it is composed of accelerations in multiple directions; the first gravity information can therefore be obtained by subtracting the lateral acceleration and the longitudinal acceleration from the acceleration information, that is, it may be calculated according to the following formula:

$g_i^{(1)} = a_i - a_i^{\mathrm{lon}} - a_i^{\mathrm{lat}}$

wherein $g_i^{(1)}$ is the first gravity information, $a_i$ is the acceleration information, $a_i^{\mathrm{lon}}$ is the longitudinal acceleration, and $a_i^{\mathrm{lat}}$ is the lateral acceleration.
In this way, by determining the first gravity information based on the longitudinal acceleration and the lateral acceleration, the first gravity information of the vehicle body can be accurately determined.
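As a minimal sketch of the computations above, the longitudinal acceleration, lateral acceleration, and first gravity information can be written as follows. The function names and the body-frame convention (x forward, y left, z up) are illustrative assumptions, not part of the disclosure:

```python
def longitudinal_accel(v_prev, v_next, t_prev, t_next):
    # Speed difference over the times before and after the target time,
    # divided by the first time difference.
    return (v_next - v_prev) / (t_next - t_prev)

def lateral_accel(yaw_rate, speed):
    # Centrifugal acceleration: angular velocity times forward speed.
    return yaw_rate * speed

def first_gravity(accel, a_lon, a_lat):
    # Subtract the motion-induced components from the raw accelerometer
    # reading, leaving the gravity component (assumed body frame:
    # x forward, y left, z up).
    ax, ay, az = accel
    return (ax - a_lon, ay - a_lat, az)
```

For example, a vehicle accelerating from 10 m/s to 12 m/s over 0.2 s has a longitudinal acceleration of 10 m/s².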
Also in step 102, the attitude change estimation function may be used to estimate the attitude change of the position where the sensor is located in the vehicle (typically, the vehicle body). In a specific embodiment, the attitude change estimated by the attitude change estimation function may be the attitude change caused by wind resistance and/or the rotational inertia generated by acceleration. Since the chassis of the vehicle is less affected by wind resistance and by the rotational inertia generated by acceleration, the attitude change estimation function may also be understood as a function for estimating the attitude difference between the vehicle body and the chassis of the vehicle, or a function for determining the attitude difference between a first coordinate system of the vehicle body and a second coordinate system of the chassis.
In one possible implementation, the attitude change estimation function may be determined according to the following method: determining attitude change amounts of the vehicle under a plurality of pieces of speed information and a plurality of pieces of longitudinal acceleration respectively; based on the attitude change amounts of the vehicle under the plurality of speed information and the plurality of longitudinal accelerations, respectively, an attitude change estimation function for estimating the attitude change amount from the speed information and the longitudinal acceleration of the vehicle is determined.
Specifically, a plurality of pieces of speed information may be preset, such as 24 speeds spaced 5 km/h apart between 0 km/h and 120 km/h (i.e., 5 km/h, 10 km/h, 15 km/h, 20 km/h, ..., 120 km/h). The plurality of longitudinal accelerations may be produced by controlling the opening degree of the accelerator and the brake (for example, lightly pressing the accelerator gives the vehicle a smaller longitudinal acceleration, while pressing the accelerator to the floor gives it a larger longitudinal acceleration); for example, 12 opening degrees may be set for the accelerator and the brake respectively to obtain 24 different longitudinal accelerations. The method for determining each longitudinal acceleration is the same as that described above and is not repeated here.
Then, the plurality of pieces of speed information and the plurality of longitudinal accelerations are combined pairwise to test the attitude change amount corresponding to each longitudinal acceleration under each piece of speed information. Continuing the above example, the attitude change amounts corresponding to the 24 longitudinal accelerations under each of the 24 pieces of speed information are tested, i.e., the attitude change amounts under 24×24 combinations of speed information and longitudinal acceleration are tested in total.
Here, the attitude change amount may be determined by any suitable method; in a specific example, it may be determined by the method for determining the second attitude change amount described later.
Finally, a transfer function between the speed information, the longitudinal acceleration, and the attitude change amount, that is, the attitude change estimation function, may be determined, and for example, an algorithm (such as a least square method) for fitting a curved surface may be used to determine the attitude change estimation function.
The attitude change estimation function can be represented by the following formula:

$\delta Q_i = f(a_i^{\mathrm{lon}}, v_i)$

wherein $\delta Q_i$ is the attitude change amount, $a_i^{\mathrm{lon}}$ is the longitudinal acceleration, and $v_i$ is the speed information.
Since a greater vehicle speed produces greater wind resistance, and a greater vehicle acceleration exerts a stronger inertial effect on the sensor, this method can accurately determine the attitude change of the vehicle caused by wind resistance and inertia when the vehicle runs at different speeds and accelerations, so that the target attitude information of the vehicle can be accurately calculated based on the calculated attitude change amount in the subsequent steps.
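As a sketch of the calibration step, the grid of (speed, longitudinal acceleration, attitude change) samples can be fitted by least squares. The two-term model below (a wind-resistance term growing with v² and an inertia term growing with acceleration) is an assumption for illustration only; the disclosure merely requires some fitted transfer function:

```python
def fit_attitude_change(samples):
    # samples: list of (v, a_lon, dtheta) triples from the calibration grid.
    # Assumed model: dtheta ~ k1 * v**2 + k2 * a_lon, solved via the
    # 2x2 normal equations of linear least squares.
    s11 = sum(v ** 4 for v, a, d in samples)
    s12 = sum(v ** 2 * a for v, a, d in samples)
    s22 = sum(a ** 2 for v, a, d in samples)
    b1 = sum(v ** 2 * d for v, a, d in samples)
    b2 = sum(a * d for v, a, d in samples)
    det = s11 * s22 - s12 * s12
    k1 = (b1 * s22 - b2 * s12) / det
    k2 = (s11 * b2 - s12 * b1) / det
    # Return the calibrated estimation function f(v, a_lon) -> dtheta.
    return lambda v, a: k1 * v ** 2 + k2 * a
```

Once calibrated offline, the returned function plays the role of the pre-calibrated attitude change estimation function used online in step 102.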
In a possible implementation, in the case where the attitude change estimation function is used to characterize the mapping relationship between acceleration and speed on the one hand and the attitude change amount on the other, when determining the first attitude change amount based on the pre-calibrated attitude change estimation function and the motion information, the first attitude change amount may be determined based on the pre-calibrated attitude change estimation function, the acceleration information at the target time, and the speed information at the target time.
Specifically, the first attitude change amount may be obtained by using acceleration information at the target time and velocity information at the target time as input amounts of the attitude change estimation function.
By this method, since the attitude change estimation function is determined based on a plurality of pieces of speed information and the attitude change amount under the longitudinal acceleration, the first attitude change amount can be determined accurately based on the acceleration information at the target time and the speed information at the target time.
Regarding step 103:
In one possible implementation, when correcting the first gravity information based on the first attitude change amount to obtain the corrected second gravity information, the first attitude change amount may be multiplied by the first gravity information to obtain the second gravity information, that is, the formula for calculating the second gravity information may be as follows:

$g_i^{(2)} = \delta Q_i \cdot g_i^{(1)}$

wherein $g_i^{(2)}$ is the second gravity information, $\delta Q_i$ is the first attitude change amount, and $g_i^{(1)}$ is the first gravity information.
It is understood that the position of the sensor (approximately the vehicle body) has a pose difference compared with the vehicle chassis, and based on the first pose variation (that is, the pose difference between the chassis and the position of the sensor), the acceleration of the chassis in the vertical direction in the second coordinate system, which is a coordinate system formed by taking the advancing direction, the left-right direction, and the vertical direction of the chassis as coordinate axes, can be determined.
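The correction can be illustrated as rotating the first gravity vector by the estimated body-chassis attitude difference. The sketch below assumes, for simplicity, that the first attitude change amount is a pure pitch rotation given as a single angle; the disclosure itself works with a general attitude change:

```python
import math

def correct_gravity(g1, pitch_offset):
    # Rotate the body-frame gravity estimate about the lateral (y) axis
    # by the body-vs-chassis pitch difference, yielding the second
    # gravity information in the chassis frame (assumed convention).
    gx, gy, gz = g1
    c, s = math.cos(pitch_offset), math.sin(pitch_offset)
    return (c * gx + s * gz, gy, -s * gx + c * gz)
```

With a zero offset the gravity vector is unchanged, as expected for a body rigidly aligned with the chassis.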
Regarding step 104:
Here, the standard gravity information may be [0, 0, 1], and the target attitude information may be represented in any attitude representation form such as Euler angles, Lie algebra, or quaternions. If the target attitude information is represented in the form of Euler angles, that is, if the target attitude information includes a roll angle, a pitch angle, and an azimuth angle, the roll angle and pitch angle of the target attitude information may be determined according to the second gravity information and the standard gravity information, and the azimuth angle of the target attitude information may be determined according to the initial attitude information.
In one possible embodiment, when determining the roll angle and the pitch angle of the target attitude information according to the second gravity information and the standard gravity information, an attitude axis angle (i.e., a representation of an attitude by an axis and an angle of rotation around that axis, hereinafter referred to as the attitude axis and the attitude angle, respectively) may first be determined according to the second gravity information and the standard gravity information, and then the roll angle and the pitch angle of the target attitude information may be determined based on the attitude axis angle. Illustratively, the attitude axis angle may be calculated by the following formulas:

$\mathrm{axis}_i = \dfrac{g \times g_i^{(2)}}{\lVert g \times g_i^{(2)} \rVert}, \qquad \theta_i = \arccos\left(\dfrac{g \cdot g_i^{(2)}}{\lVert g \rVert \, \lVert g_i^{(2)} \rVert}\right)$

wherein $\mathrm{axis}_i$ represents the attitude axis, $\theta_i$ represents the attitude angle, $g$ represents the standard gravity information, and $g_i^{(2)}$ represents the second gravity information.
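The axis-angle computation can be sketched directly from the cross and dot products of the two gravity vectors (the function name is illustrative):

```python
import math

def attitude_axis_angle(g_std, g2):
    # Axis: normalised cross product of standard and second gravity;
    # angle: arccos of their normalised dot product.
    cx = g_std[1] * g2[2] - g_std[2] * g2[1]
    cy = g_std[2] * g2[0] - g_std[0] * g2[2]
    cz = g_std[0] * g2[1] - g_std[1] * g2[0]
    cross_norm = math.sqrt(cx * cx + cy * cy + cz * cz)
    dot = sum(a * b for a, b in zip(g_std, g2))
    norms = math.sqrt(sum(a * a for a in g_std)) * math.sqrt(sum(b * b for b in g2))
    angle = math.acos(max(-1.0, min(1.0, dot / norms)))
    if cross_norm == 0.0:
        # Vectors are parallel: the axis is arbitrary, angle is 0 or pi.
        return (1.0, 0.0, 0.0), angle
    return (cx / cross_norm, cy / cross_norm, cz / cross_norm), angle
```

For instance, a measured gravity tilted 45 degrees toward the forward axis yields an attitude angle of pi/4 about the lateral axis.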
In one possible embodiment, when determining the target attitude information of the vehicle at the target time based on the corrected second gravity information, the preset standard gravity information, and the initial attitude information of the vehicle at the time before the target time, as shown in fig. 2, the method includes the following steps 201 to 202:
step 201, determining first posture information based on the corrected second gravity information and the standard gravity information; and determining second posture information of the vehicle based on the angular velocity information and the initial posture information.
Here, the first attitude information is the attitude axis angle described above.
In one possible embodiment, when determining the second posture information of the vehicle based on the angular velocity information and the initial posture information, the second posture change amount of the vehicle may be determined based on the angular velocity information of the vehicle at the target time first; and then determining second posture information of the vehicle based on the second posture change amount and the initial posture information.
Specifically, the second attitude change amount of the vehicle may be determined according to the angular velocity information of the vehicle at the target time and the second time difference between the target time and the time before the target time. Illustratively, the second attitude change amount may be calculated according to the following formula:

$\delta Q_i = Q(r_i, \; t_i - t_{i-1})$

wherein $\delta Q_i$ represents the second attitude change amount, $Q$ represents a preset calculation algorithm for the second attitude change amount, $r_i$ represents the angular velocity information of the vehicle at the target time, $t_i$ represents the target time, and $t_{i-1}$ represents the time before the target time.
And determining a position change amount of the vehicle based on the speed information of the vehicle at the target time and the second time difference value, the position change amount being calculated, for example, according to the following formula:
$\delta p_i = v_i \cdot (t_i - t_{i-1})$

wherein $\delta p_i$ represents the position change amount, $v_i$ represents the speed information of the vehicle at the target time, $t_i$ represents the target time, and $t_{i-1}$ represents the time before the target time.
Then, a pose change amount can be formed based on the second attitude change amount and the position change amount, and the target pose information at the target time can be obtained based on the initial pose information at the time before the target time and the pose change amount, thereby obtaining the second attitude information contained in the target pose information; wherein the initial pose information is formed based on the initial attitude information and initial position information, and the initial position information may be determined based on any positioning technology, such as the Global Positioning System (GPS) or real-time kinematic (RTK) positioning.
Here, the calculation formula of the target pose information may be exemplarily shown as follows:
$T_i = T_{i-1} \cdot \delta T_i$

wherein $T_i$ represents the target pose information at the target time, $T_{i-1}$ represents the initial pose information at the time before the target time, and $\delta T_i$ represents the pose change amount.
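The right-multiplication pose update can be illustrated in the planar (SE(2)) case, where a pose is (x, y, yaw) and the increment is expressed in the body frame at the previous time; this is a simplified sketch, not the full 3-D pose used in the disclosure:

```python
import math

def compose_pose(pose, delta):
    # pose = (x, y, yaw) in the world frame; delta = (dx, dy, dyaw)
    # expressed in the previous body frame. Right-multiplying rotates the
    # translational increment into the world frame before adding it.
    x, y, yaw = pose
    dx, dy, dyaw = delta
    c, s = math.cos(yaw), math.sin(yaw)
    return (x + c * dx - s * dy, y + s * dx + c * dy, yaw + dyaw)
```

For example, a vehicle facing along the world y axis (yaw = pi/2) that moves 1 m forward ends up at (0, 1), not (1, 0), which is why the increment must be rotated before being added.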
Step 202, determining target attitude information of the vehicle at the target time based on the first attitude information and the second attitude information; the first posture information is chassis posture information calculated by considering the non-rigid connection condition of the sensor and the chassis, and the second posture information is vehicle body posture information calculated by not considering the non-rigid connection condition of the sensor and the chassis.
In one possible embodiment, the roll angle and the pitch angle of the first attitude information may be regarded as the roll angle and the pitch angle of the target attitude information, and the azimuth angle of the second attitude information may be regarded as the azimuth angle of the target attitude information.
In another possible implementation, the target attitude information may be determined based on a complementary filtering algorithm (e.g., the Mahony algorithm), the first attitude information, and the second attitude information, that is, calculated according to the following formula:

$Q_i = \mathrm{Mahony}(Q_i^{(2)}, Q_i^{(1)})$, with $Q_i^{(1)} = (\tilde{\theta}_i, \tilde{\phi}_i, \tilde{\psi}_i)$ and $Q_i^{(2)} = (\theta_i, \phi_i, \psi_i)$

wherein $Q_i$ represents the target attitude information, Mahony represents the Mahony algorithm, $Q_i^{(2)}$ represents the second attitude information, $Q_i^{(1)}$ represents the first attitude information, $\tilde{\theta}_i$ represents the roll angle in the first attitude information, $\tilde{\phi}_i$ represents the pitch angle in the first attitude information, $\tilde{\psi}_i$ represents the azimuth angle in the first attitude information (which may be 0 here, since gravity provides no heading), $\theta_i$ represents the roll angle in the second attitude information, $\phi_i$ represents the pitch angle in the second attitude information, and $\psi_i$ represents the azimuth angle in the second attitude information.
More specifically, the azimuth in the second attitude information may be taken as the azimuth in the target attitude information; and determining the pitch angle and the roll angle of the target attitude information based on the pitch angle and the roll angle in the first attitude information, the first preset weight corresponding to the first attitude information, the pitch angle and the roll angle in the second attitude information and the second preset weight corresponding to the second attitude information.
For example, taking the first preset weight as 0.1 and the second preset weight as 0.9, the calculation formula of the target attitude information may be as follows:

$Q_i = (0.1\,\tilde{\theta}_i + 0.9\,\theta_i,\ \ 0.1\,\tilde{\phi}_i + 0.9\,\phi_i,\ \ \psi_i)$

wherein $Q_i$ represents the target attitude information, $\tilde{\theta}_i$ represents the roll angle in the first attitude information, $\tilde{\phi}_i$ represents the pitch angle in the first attitude information, $\theta_i$ represents the roll angle in the second attitude information, $\phi_i$ represents the pitch angle in the second attitude information, and $\psi_i$ represents the azimuth angle in the second attitude information.
By combining the first attitude information and the second attitude information calculated by the two methods to jointly determine the target attitude information, large deviations in the calculated attitude caused by sensor measurement errors can be reduced.
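The weighted fusion of the two attitude estimates can be sketched as follows, using the example weights 0.1 and 0.9 and Euler angles in the order (roll, pitch, azimuth):

```python
def fuse_attitude(att1, att2, w1=0.1, w2=0.9):
    # att1: first attitude information (gravity-based; roll/pitch reliable,
    # no heading). att2: second attitude information (gyro-integrated).
    # Roll and pitch are blended; the azimuth comes from att2 alone.
    roll = w1 * att1[0] + w2 * att2[0]
    pitch = w1 * att1[1] + w2 * att2[1]
    return (roll, pitch, att2[2])
```

The weights here are the example values from the text; in practice they would be tuned to reflect the relative trust placed in the gravity-based and gyro-integrated estimates.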
Finally, the embodiment of the present disclosure further provides an overall flow of the gesture determining method, as shown in fig. 3, including the following steps 301 to 306:
step 301, obtaining motion information of a vehicle at a target moment; wherein the motion information includes angular velocity information, acceleration information, and velocity information;
step 302 of determining a longitudinal acceleration of the vehicle based on the speed information of the vehicle at a time preceding the target time and the speed information at a time following the target time; and determining a lateral acceleration of the vehicle based on the angular velocity information of the vehicle at the target time and the velocity information at the target time; determining first gravity information of the vehicle under the motion information based on acceleration information of the vehicle at the target time, the longitudinal acceleration, and the lateral acceleration; determining a first attitude change amount based on a pre-calibrated attitude change estimation function, acceleration information at the target moment and speed information at the target moment;
Step 303, correcting the first gravity information based on the first posture change amount to obtain corrected second gravity information;
step 304, determining first posture information based on the corrected second gravity information and the standard gravity information;
after the execution of step 304, jump to step 306;
step 305, determining a second attitude change amount of the vehicle based on the angular velocity information of the vehicle at the target time; determining second posture information of the vehicle based on the second posture change amount and initial posture information of the vehicle at a time previous to a target time;
here, there is no fixed execution order between steps 302 to 304 and step 305;
step 306, using the azimuth angle in the second gesture information as the azimuth angle in the target gesture information; and determining the pitch angle and the roll angle of the target attitude information based on the pitch angle and the roll angle in the first attitude information, the first preset weight corresponding to the first attitude information, the pitch angle and the roll angle in the second attitude information and the second preset weight corresponding to the second attitude information.
According to the gesture determining method provided by the embodiment of the disclosure, after the motion information of the vehicle at the target moment is obtained, the first gravity information of the vehicle under the motion information is determined, and the first gesture change amount is determined based on a pre-calibrated gesture change estimation function and the motion information; then correcting the first gravity information based on the first posture change amount to obtain corrected second gravity information; and finally, determining target attitude information of the vehicle at the target time based on the corrected second gravity information, preset standard gravity information and initial attitude information of the vehicle at the time before the target time. Here, since the first posture variation characterizes the posture difference between the vehicle body and the chassis, after the first gravity information is corrected based on the first posture variation, the second gravity information of the vehicle chassis under the current posture can be obtained, and finally, the direction or angle difference between the second gravity information and the standard gravity information can be used for characterizing the real posture of the vehicle chassis.
It will be appreciated by those skilled in the art that in the above-described method of the specific embodiments, the written order of steps is not meant to imply a strict order of execution but rather should be construed according to the function and possibly inherent logic of the steps.
The embodiment of the disclosure also provides an in-vehicle device for executing the gesture determining method described in the above embodiment.
Based on the same inventive concept, the embodiment of the present disclosure further provides an attitude determination apparatus corresponding to the attitude determination method, and since the principle of the apparatus in the embodiment of the present disclosure for solving the problem is similar to that of the attitude determination method in the embodiment of the present disclosure, the implementation of the apparatus may refer to the implementation of the method, and the repetition is omitted.
Referring to fig. 4, an architecture diagram of an attitude determination apparatus according to an embodiment of the present disclosure is provided, where the apparatus includes: an acquisition module 401, a first determination module 402, a correction module 403, and a second determination module 404; wherein:
an obtaining module 401, configured to obtain motion information of a vehicle at a target moment; wherein the motion information includes angular velocity information, acceleration information, and velocity information;
A first determining module 402, configured to determine first gravity information of the vehicle under the motion information; determining a first attitude change amount based on a pre-calibrated attitude change estimation function and the motion information;
the correction module 403 is configured to correct the first gravity information based on the first posture change amount, so as to obtain corrected second gravity information;
a second determining module 404, configured to determine target attitude information of the vehicle at the target time based on the corrected second gravity information, preset standard gravity information, and initial attitude information of the vehicle at a time previous to the target time.
In a possible implementation manner, the first determining module 402 is configured, when determining the first gravity information of the vehicle under the motion information, to:
determining a longitudinal acceleration of the vehicle based on the speed information of the vehicle at a time preceding the target time and the speed information at a time subsequent to the target time; and determining a lateral acceleration of the vehicle based on the angular velocity information of the vehicle at the target time and the velocity information at the target time;
First gravity information of the vehicle under the motion information is determined based on acceleration information of the vehicle at the target time, the longitudinal acceleration, and the lateral acceleration.
In a possible implementation manner, the gesture change pre-estimation function is used for representing the mapping relation between acceleration and speed and gesture change amount;
the first determining module 402 is configured to, when determining the first attitude change amount based on a pre-calibrated attitude change estimation function and the motion information:
and determining the first attitude change amount based on the pre-calibrated attitude change estimation function, the acceleration information at the target moment and the speed information at the target moment.
In a possible implementation manner, the second determining module 404 is configured to, when determining the target pose information of the vehicle at the target moment based on the corrected second gravity information, the preset standard gravity information, and the initial pose information of the vehicle at a time previous to the target moment:
determining first posture information based on the corrected second gravity information and the standard gravity information; and determining second posture information of the vehicle based on the angular velocity information and the initial posture information;
Determining target attitude information of the vehicle at the target time based on the first attitude information and the second attitude information;
the first posture information is chassis posture information calculated by considering the non-rigid connection condition of the sensor and the chassis, and the second posture information is vehicle body posture information calculated by not considering the non-rigid connection condition of the sensor and the chassis.
In a possible implementation manner, the second determining module 404 is configured to, when determining the second pose information of the vehicle based on the angular velocity information and the initial pose information:
determining a second attitude change amount of the vehicle based on angular velocity information of the vehicle at a target time;
second posture information of the vehicle is determined based on the second posture change amount and the initial posture information.
In a possible embodiment, the target attitude information includes roll angle, pitch angle, and azimuth angle;
the second determining module 404 is configured to, when determining, based on the first pose information and the second pose information, target pose information of the vehicle at the target time, to:
taking the azimuth angle in the second attitude information as the azimuth angle in the target attitude information; the method comprises the steps of,
And determining the pitch angle and the roll angle of the target attitude information based on the pitch angle and the roll angle in the first attitude information, the first preset weight corresponding to the first attitude information, the pitch angle and the roll angle in the second attitude information and the second preset weight corresponding to the second attitude information.
In a possible implementation manner, the apparatus further includes a third determining module 405, where the third determining module 405 is configured to determine the gesture change estimation function according to the following method:
determining attitude change amounts of the vehicle under a plurality of pieces of speed information and a plurality of pieces of longitudinal acceleration respectively;
based on the attitude change amounts of the vehicle under the plurality of speed information and the plurality of longitudinal accelerations, respectively, an attitude change estimation function for estimating the attitude change amount from the speed information and the longitudinal acceleration of the vehicle is determined.
Based on the same technical concept, the embodiment of the disclosure also provides computer equipment. Referring to fig. 5, a schematic structural diagram of a computer device 500 according to an embodiment of the disclosure includes a processor 501, a memory 502, and a bus 503. The memory 502 is configured to store execution instructions, including a memory 5021 and an external memory 5022; the memory 5021 is also referred to as an internal memory, and is used for temporarily storing operation data in the processor 501 and data exchanged with an external memory 5022 such as a hard disk, the processor 501 exchanges data with the external memory 5022 through the memory 5021, and when the computer device 500 is running, the processor 501 and the memory 502 communicate through the bus 503, so that the processor 501 executes the following instructions:
Acquiring motion information of a vehicle at a target moment; wherein the motion information includes angular velocity information, acceleration information, and velocity information;
determining first gravity information of the vehicle under the motion information; determining a first attitude change amount based on a pre-calibrated attitude change estimation function and the motion information;
correcting the first gravity information based on the first posture change amount to obtain corrected second gravity information;
and determining target attitude information of the vehicle at the target time based on the corrected second gravity information, preset standard gravity information and initial attitude information of the vehicle at the time before the target time.
In a possible implementation manner, in the instructions executed by the processor 501, the determining the first gravity information of the vehicle under the motion information includes:
determining a longitudinal acceleration of the vehicle based on the speed information of the vehicle at a time preceding the target time and the speed information at a time subsequent to the target time; and determining a lateral acceleration of the vehicle based on the angular velocity information of the vehicle at the target time and the velocity information at the target time;
First gravity information of the vehicle under the motion information is determined based on acceleration information of the vehicle at the target time, the longitudinal acceleration, and the lateral acceleration.
In a possible implementation manner, in the instructions executed by the processor 501, the gesture change pre-estimation function is used to characterize a mapping relationship between acceleration and speed and a gesture change amount;
the determining the first posture change amount based on the pre-calibrated posture change estimation function and the motion information comprises the following steps:
and determining the first attitude change amount based on the pre-calibrated attitude change estimation function, the acceleration information at the target moment and the speed information at the target moment.
In a possible implementation manner, in the instructions executed by the processor 501, the determining, based on the corrected second gravity information, preset standard gravity information, and initial attitude information of the vehicle at a time preceding the target time, target attitude information of the vehicle at the target time includes:
determining first attitude information based on the corrected second gravity information and the standard gravity information; and determining second attitude information of the vehicle based on the angular velocity information and the initial attitude information;
determining target attitude information of the vehicle at the target time based on the first attitude information and the second attitude information;
wherein the first attitude information is chassis attitude information calculated taking the non-rigid connection between the sensor and the chassis into account, and the second attitude information is vehicle body attitude information calculated without taking that non-rigid connection into account.
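The first attitude information can be recovered from the corrected gravity vector alone, since pitch and roll are the angles that rotate the standard gravity vector into the measured one. A minimal sketch, assuming an x-forward, y-left, z-up body frame in which standard gravity reads [0, 0, g] when the vehicle is level (the patent does not specify the axis convention):

```python
import numpy as np

def attitude_from_gravity(g_body, g_std=9.81):
    """Pitch and roll (rad) from gravity expressed in the body frame.

    g_body : corrected gravity vector [gx, gy, gz]
    g_std  : magnitude of the preset standard gravity
    """
    gx, gy, gz = g_body
    # Pitch tilts the forward axis against gravity; clip guards arcsin
    # against small numerical overshoot of the gravity magnitude.
    pitch = np.arcsin(np.clip(-gx / g_std, -1.0, 1.0))
    # Roll is the rotation of the lateral/vertical gravity components.
    roll = np.arctan2(gy, gz)
    return pitch, roll
```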
In a possible implementation manner, in the instructions executed by the processor 501, the determining, based on the angular velocity information and the initial attitude information, second attitude information of the vehicle includes:
determining a second attitude change amount of the vehicle based on angular velocity information of the vehicle at the target time;
determining second attitude information of the vehicle based on the second attitude change amount and the initial attitude information.
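Propagating the initial attitude with the gyroscope is the standard strapdown update; a minimal small-angle sketch (illustrative only — a production implementation would integrate a quaternion or rotation matrix rather than Euler angles directly):

```python
def second_attitude(initial_attitude, angular_velocity, dt):
    """Propagate (roll, pitch, azimuth) by one gyro sample.

    Small-angle approximation: the second attitude change amount is the
    angular rate times the sampling interval, added to the initial
    attitude from the previous time step.
    """
    delta = tuple(w * dt for w in angular_velocity)  # second attitude change amount
    return tuple(a + d for a, d in zip(initial_attitude, delta))
```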
In a possible implementation manner, in the instructions executed by the processor 501, the target attitude information includes a roll angle, a pitch angle and an azimuth angle;
the determining, based on the first attitude information and the second attitude information, target attitude information of the vehicle at the target time includes:
taking the azimuth angle in the second attitude information as the azimuth angle in the target attitude information; and
And determining the pitch angle and the roll angle of the target attitude information based on the pitch angle and the roll angle in the first attitude information, the first preset weight corresponding to the first attitude information, the pitch angle and the roll angle in the second attitude information and the second preset weight corresponding to the second attitude information.
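The fusion in this step can be written directly: azimuth is taken from the gyro-propagated second attitude, while pitch and roll are a preset-weight blend of the two estimates. A sketch with placeholder weight values (the patent does not disclose the weights):

```python
def fuse_attitude(first, second, w1=0.3, w2=0.7):
    """Fuse chassis (first) and body (second) attitude estimates.

    first  : (pitch, roll) from the corrected gravity information
    second : (pitch, roll, azimuth) from gyro propagation
    w1, w2 : first and second preset weights; illustrative values,
             assumed to sum to 1
    """
    pitch = w1 * first[0] + w2 * second[0]
    roll = w1 * first[1] + w2 * second[1]
    azimuth = second[2]  # azimuth comes only from the second attitude
    return pitch, roll, azimuth
```

A gravity vector carries no heading information, which is why only the gyro-propagated estimate can supply the azimuth.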
In a possible implementation manner, in the instructions executed by the processor 501, the method further includes determining the attitude change estimation function as follows:
determining attitude change amounts of the vehicle under a plurality of pieces of speed information and a plurality of pieces of longitudinal acceleration respectively;
based on the attitude change amounts of the vehicle under the plurality of speed information and the plurality of longitudinal accelerations, respectively, an attitude change estimation function for estimating the attitude change amount from the speed information and the longitudinal acceleration of the vehicle is determined.
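Calibration of the estimation function can be read as a regression problem: collect attitude-change observations over a range of speeds and longitudinal accelerations, then fit a parametric map. A least-squares sketch assuming a linear model (the patent does not specify the fitting method or functional form):

```python
import numpy as np

def calibrate_attitude_change(speeds, accels, observed_changes):
    """Fit delta = k_a * a_long + k_v * speed + b by least squares.

    speeds, accels, observed_changes : equal-length 1-D arrays of
    calibration samples (illustrative linear model).
    """
    A = np.column_stack([np.asarray(accels, float),
                         np.asarray(speeds, float),
                         np.ones(len(speeds))])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(observed_changes, float),
                                 rcond=None)
    k_a, k_v, b = coeffs
    # Return the calibrated estimation function itself.
    return lambda a_long, speed: k_a * a_long + k_v * speed + b
```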
The disclosed embodiments also provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the attitude determination method described in the above method embodiments. Wherein the storage medium may be a volatile or nonvolatile computer readable storage medium.
Embodiments of the present disclosure further provide a computer program product carrying program code; the instructions included in the program code may be used to perform the steps of the attitude determination method described in the foregoing method embodiments, to which reference may be made for details not repeated here.
The above-mentioned computer program product may be implemented by hardware, software, or a combination thereof. In an alternative embodiment, the computer program product is embodied as a computer storage medium; in another alternative embodiment, it is embodied as a software product, such as a software development kit (Software Development Kit, SDK).
It will be clear to those skilled in the art that, for convenience and brevity of description, the specific working procedures of the system and apparatus described above may refer to the corresponding procedures in the foregoing method embodiments and are not repeated here. In the several embodiments provided in the present disclosure, it should be understood that the disclosed systems, devices and methods may be implemented in other manners. The apparatus embodiments described above are merely illustrative: the division into units is merely a logical functional division, and other divisions are possible in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some communication interface, device or unit, and may be electrical, mechanical or in other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present disclosure may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
If implemented in the form of software functional units and sold or used as a stand-alone product, the functions may be stored in a non-volatile, processor-executable computer readable storage medium. Based on such an understanding, the technical solution of the present disclosure in essence, the part contributing to the prior art, or a part of the technical solution may be embodied in the form of a software product stored in a storage medium, including several instructions that cause a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present disclosure. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk or an optical disk.
Finally, it should be noted that the foregoing embodiments are merely specific implementations of the present disclosure, intended to illustrate rather than limit its technical solutions, and the protection scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those skilled in the art will appreciate that any person familiar with the art may, within the technical scope disclosed herein, modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or make equivalent substitutions for some of the technical features; such modifications, changes or substitutions do not cause the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present disclosure, and shall all be covered by the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (11)

1. An attitude determination method, characterized by comprising:
acquiring motion information of a vehicle at a target time; wherein the motion information includes angular velocity information, acceleration information, and velocity information;
determining first gravity information of the vehicle under the motion information; determining a first attitude change amount based on a pre-calibrated attitude change estimation function and the motion information;
Correcting the first gravity information based on the first posture change amount to obtain corrected second gravity information;
and determining target attitude information of the vehicle at the target time based on the corrected second gravity information, preset standard gravity information and initial attitude information of the vehicle at a time preceding the target time.
2. The method of claim 1, wherein the determining first gravity information of the vehicle under the motion information comprises:
determining a longitudinal acceleration of the vehicle based on the speed information of the vehicle at a time preceding the target time and the speed information at a time subsequent to the target time; and determining a lateral acceleration of the vehicle based on the angular velocity information of the vehicle at the target time and the velocity information at the target time;
first gravity information of the vehicle under the motion information is determined based on acceleration information of the vehicle at the target time, the longitudinal acceleration, and the lateral acceleration.
3. The method according to claim 1, wherein the attitude change estimation function is used to characterize a mapping relationship between acceleration and speed, and an attitude change amount;
The determining the first posture change amount based on the pre-calibrated posture change estimation function and the motion information comprises the following steps:
and determining the first attitude change amount based on the pre-calibrated attitude change estimation function, the acceleration information at the target time and the speed information at the target time.
4. The method according to claim 1, wherein the determining the target attitude information of the vehicle at the target time based on the corrected second gravity information, the preset standard gravity information, and the initial attitude information of the vehicle at a time preceding the target time includes:
determining first attitude information based on the corrected second gravity information and the standard gravity information; and determining second attitude information of the vehicle based on the angular velocity information and the initial attitude information;
determining target attitude information of the vehicle at the target time based on the first attitude information and the second attitude information;
wherein the first attitude information is chassis attitude information calculated taking the non-rigid connection between the sensor and the chassis into account, and the second attitude information is vehicle body attitude information calculated without taking that non-rigid connection into account.
5. The method of claim 4, wherein the determining second attitude information of the vehicle based on the angular velocity information and the initial attitude information comprises:
determining a second attitude change amount of the vehicle based on angular velocity information of the vehicle at the target time;
determining second attitude information of the vehicle based on the second attitude change amount and the initial attitude information.
6. The method of claim 4, wherein the target attitude information includes roll angle, pitch angle, and azimuth angle;
the determining, based on the first attitude information and the second attitude information, target attitude information of the vehicle at the target time includes:
taking the azimuth angle in the second attitude information as the azimuth angle in the target attitude information; and
and determining the pitch angle and the roll angle of the target attitude information based on the pitch angle and the roll angle in the first attitude information, the first preset weight corresponding to the first attitude information, the pitch angle and the roll angle in the second attitude information and the second preset weight corresponding to the second attitude information.
7. The method of claim 1, further comprising determining the attitude change estimation function as follows:
Determining attitude change amounts of the vehicle under a plurality of pieces of speed information and a plurality of pieces of longitudinal acceleration respectively;
based on the attitude change amounts of the vehicle under the plurality of speed information and the plurality of longitudinal accelerations, respectively, an attitude change estimation function for estimating the attitude change amount from the speed information and the longitudinal acceleration of the vehicle is determined.
8. An in-vehicle apparatus configured to perform the attitude determination method according to any one of claims 1 to 7.
9. An attitude determination apparatus, characterized by comprising:
an acquisition module, configured to acquire motion information of a vehicle at a target time; wherein the motion information includes angular velocity information, acceleration information, and velocity information;
a first determining module, configured to determine first gravity information of the vehicle under the motion information, and determine a first attitude change amount based on a pre-calibrated attitude change estimation function and the motion information;
a correction module, configured to correct the first gravity information based on the first attitude change amount to obtain corrected second gravity information;
and a second determining module, configured to determine target attitude information of the vehicle at the target time based on the corrected second gravity information, preset standard gravity information, and initial attitude information of the vehicle at a time preceding the target time.
10. A computer device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating via the bus when the computer device is running, the machine-readable instructions, when executed by the processor, performing the steps of the attitude determination method according to any one of claims 1 to 7.
11. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when executed by a processor, performs the steps of the attitude determination method according to any one of claims 1 to 7.
CN202310707771.3A 2023-06-14 2023-06-14 Gesture determination method and device, computer equipment and storage medium Pending CN116734845A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310707771.3A CN116734845A (en) 2023-06-14 2023-06-14 Gesture determination method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310707771.3A CN116734845A (en) 2023-06-14 2023-06-14 Gesture determination method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116734845A true CN116734845A (en) 2023-09-12

Family

ID=87902398

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310707771.3A Pending CN116734845A (en) 2023-06-14 2023-06-14 Gesture determination method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116734845A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination