US20220259833A1 - Posture Estimation Method, Posture Estimation Device, And Movable Device - Google Patents
- Publication number
- US20220259833A1 (application US17/670,577)
- Authority
- US
- United States
- Prior art keywords
- posture
- value
- angular velocity
- posture estimation
- bias
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- E—FIXED CONSTRUCTIONS; E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING; E02F—DREDGING; SOIL-SHIFTING
- E02F9/264—Sensors and their calibration for indicating the position of the work tool
- E02F9/265—Sensors and their calibration for indicating the position of the work tool with follow-up actions (e.g. control signals sent to actuate the work tool)
- E02F9/2029—Controlling the position of implements in function of its load, e.g. modifying the attitude of implements in accordance to vehicle speed
- E02F3/434—Control of dipper or bucket position; control of sequence of drive operations for bucket-arms, front-end loaders, dumpers or the like providing automatic sequences of movements, e.g. automatic dumping or loading, automatic return-to-dig
- E02F3/437—Control of dipper or bucket position; control of sequence of drive operations for dipper-arms, backhoes or the like providing automatic sequences of movements, e.g. linear excavation, keeping dipper angle constant
Definitions
- the present disclosure relates to a posture estimation method, a posture estimation device, and a movable device.
- a device and a system are known in which an inertial measurement unit (IMU) is attached to an object and a position and a posture of the object are calculated using an output signal of the inertial measurement unit. Since the output signal of the inertial measurement unit has a bias error and an error also occurs in posture calculation, a method is proposed in which a Kalman filter is used to correct these errors and estimate an accurate posture of the object.
- JP-A-2020-20631 describes a posture estimation method for estimating a posture of an object by correcting predicted posture information on the object based on error information when an angular velocity sensor exceeds an effective measurement range.
- an inertial measurement unit equipped with an angular velocity sensor and an acceleration sensor has different X-axis, Y-axis, and Z-axis directions depending on the mounting position associated with an operation of an object, and bias errors along those directions occur from the initial state.
- as a result, the estimation accuracy of the posture of the object may decrease, and high-precision measurement cannot be performed.
- a posture estimation method includes: measuring a bias value BW and a variance value PWW of an angular velocity sensor and a bias value BA and a variance value PVV of an acceleration sensor in a predetermined operation of an object and storing the measured values in a storage unit; reading the bias value BW, the variance value PWW, the bias value BA, and the variance value PVV from the storage unit and setting the read values as initial setting values during reset; measuring an angular velocity and an acceleration by the angular velocity sensor and the acceleration sensor in a stationary state; updating the bias value BW and the variance value PWW of the angular velocity sensor and the variance value PVV of the acceleration sensor from the initial setting values; and estimating a posture of the object using a Kalman filter from an output of the angular velocity sensor and an output of the acceleration sensor.
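As a rough, hypothetical sketch (function and variable names are not from the patent), the claimed sequence — store per-operation values, reload them on reset, refine them from a stationary measurement, then hand them to the Kalman filter — could look like:

```python
import numpy as np

def initialize_from_stationary(storage, gyro_samples, accel_samples):
    """Sketch of the claimed initialization: read the stored per-operation
    values on reset, then update BW, PWW and PVV from a short stationary
    measurement. The exact update rule here is an assumption; per the
    patent, BA itself is not updated at this point."""
    # Reset: read the stored values as the initial setting values.
    BW, PWW = storage["BW"], storage["PWW"]
    BA, PVV = storage["BA"], storage["PVV"]
    # Stationary measurement: the mean gyro output approximates the bias.
    BW = gyro_samples.mean(axis=0)
    PWW = np.maximum(PWW, gyro_samples.var(axis=0))
    PVV = np.maximum(PVV, accel_samples.var(axis=0))
    # These values then seed the Kalman filter posture estimation.
    return BW, PWW, BA, PVV
```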
- a posture estimation device includes a storage unit that stores a bias value BW and a variance value PWW of an angular velocity sensor and a bias value BA and a variance value PVV of an acceleration sensor in a predetermined operation of an object, and a processing unit that estimates a posture of the object using a Kalman filter from an output of the angular velocity sensor and an output of the acceleration sensor.
- a movable device includes the posture estimation device described above and a control device configured to control the posture of the object based on posture information on the object estimated by the posture estimation device.
- FIG. 1 is a diagram showing a configuration example of a posture estimation device according to a first embodiment.
- FIG. 2 is a diagram showing a sensor coordinate system and a local coordinate system.
- FIG. 3 is a flowchart showing an example of a procedure of a posture estimation method according to the first embodiment.
- FIG. 4 is a flowchart showing an example of a procedure of a posture estimation method according to a second embodiment.
- FIG. 5 is a diagram showing a configuration example of a movable device having a posture estimation device according to a third embodiment.
- FIG. 6 is a block diagram showing a configuration example of the movable device.
- a posture estimation device 1 according to the first embodiment will be described with reference to FIGS. 1 and 2 .
- the posture estimation device 1 of the present embodiment includes a processing unit 20, a ROM 30, a RAM 40, and a recording medium 50, which serve as storage units, and a communication unit 60.
- the posture estimation device 1 estimates a posture of an object based on an output of an inertial measurement unit (IMU) 10 .
- the posture estimation device 1 of the present embodiment may have a part of these components changed or removed or have another component added.
- the posture estimation device 1 is separated from the inertial measurement unit 10 .
- alternatively, the posture estimation device 1 may include the inertial measurement unit 10.
- the inertial measurement unit 10 and the posture estimation device 1 may be accommodated in one housing, and the inertial measurement unit 10 may be separated or separable from a main body accommodating the posture estimation device 1 .
- in the former case, the posture estimation device 1 is mounted on the object; in the latter case, the inertial measurement unit 10 is mounted on the object.
- the inertial measurement unit 10 includes an angular velocity sensor 12 , an acceleration sensor 14 , and a signal processing unit 16 .
- the inertial measurement unit 10 of the present embodiment may have a part of these components changed or removed or have another component added.
- the angular velocity sensor 12 measures angular velocities in the directions of three axes that intersect one another and are, ideally, mutually perpendicular, and outputs analog signals corresponding to the magnitudes and directions of the measured angular velocities on the three axes.
- the acceleration sensor 14 measures accelerations in the directions of the three axes that intersect one another and are, ideally, mutually perpendicular, and outputs analog signals corresponding to the magnitudes and directions of the measured accelerations on the three axes.
- the signal processing unit 16 performs processing of sampling the output signals of the angular velocity sensor 12 at a predetermined sampling interval Δt to convert the output signals into angular velocity data dω having a digital value.
- the signal processing unit 16 performs processing of sampling the output signals of the acceleration sensor 14 at the predetermined sampling interval Δt to convert the output signals into acceleration data da having a digital value.
- the angular velocity sensor 12 and the acceleration sensor 14 are attached to the inertial measurement unit 10 such that the three axes coincide with three axes (x-axis, y-axis, z-axis) of a sensor coordinate system that is an orthogonal coordinate system defined for the inertial measurement unit 10 .
- in practice, however, an error occurs in the mounting angle.
- the signal processing unit 16 performs processing of converting the angular velocity data dω and the acceleration data da into data in the xyz coordinate system, by using a correction parameter that is calculated in advance in accordance with the error in the mounting angle.
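The mounting-angle correction amounts to applying a precomputed rotation to each raw three-axis sample; a minimal sketch (the 1-degree correction matrix below is purely illustrative, not a calibrated value from the patent):

```python
import numpy as np

def align_to_sensor_frame(raw_xyz, correction):
    """Rotate raw three-axis data into the ideal xyz sensor frame using a
    correction matrix calibrated in advance for the mounting-angle error."""
    return correction @ raw_xyz

# Assumed example: compensating a 1-degree mounting error about the z-axis.
theta = np.deg2rad(1.0)
C = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
```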
- the signal processing unit 16 also performs processing of correcting the angular velocity data dω and the acceleration data da for temperature, in accordance with the temperature characteristics of the angular velocity sensor 12 and the acceleration sensor 14.
- a function of A/D conversion or temperature correction may be built into the angular velocity sensor 12 and the acceleration sensor 14.
- the inertial measurement unit 10 outputs the angular velocity data dω and the acceleration data da after the processing by the signal processing unit 16 to the processing unit 20 of the posture estimation device 1.
- the ROM 30 stores programs for the processing unit to perform various types of processing, and various programs or various types of data for implementing application functions.
- the ROM 30 stores a bias value BW and a variance value PWW of the angular velocity sensor 12 and a bias value BA and a variance value PVV of the acceleration sensor 14 in a state in which the inertial measurement unit 10 is placed at a mounting position associated with a predetermined operation of the object.
- the predetermined operation is the most frequently performed of the object's operations.
- the predetermined operation may include a plurality of operations.
- the ROM 30 stores the bias values BW, BA and the variance values PWW, PVV according to the plurality of operations, that is, types of the operations.
- the RAM 40 is a storage unit that is used as a work area of the processing unit 20 , and temporarily stores a program or data read out from the ROM 30 or operation results obtained by the processing unit 20 performing processing in accordance with various programs.
- the recording medium 50 is a non-volatile storage unit that stores data required to be preserved for a long term among data generated by processing of the processing unit 20 .
- the recording medium 50 may store programs for the processing unit 20 to perform various types of processing, and various programs or various types of data for implementing application functions.
- the processing unit 20 performs various types of processing in accordance with a program stored in the ROM 30 or the recording medium 50 or in accordance with a program received from a server via a network and then stored in the RAM 40 or the recording medium 50 .
- the processing unit 20 executes the program to function as a bias removal unit 22, a posture change amount calculation unit 24, a velocity change amount calculation unit 26, and a posture estimation unit 28, and performs a predetermined operation on the angular velocity data dω and the acceleration data da output at the interval Δt by the inertial measurement unit 10 to perform processing of estimating the posture of the object.
- upon receiving a reset instruction from a user, the processing unit 20 reads the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 from the ROM 30 and sets the read values as initial setting values.
- the processing unit 20 reads the bias values BW, BA and the variance values PWW, PVV corresponding to an operation selected by the user from the plurality of operations from the ROM 30 and sets the read values as the initial setting values.
- two coordinate systems are considered: the sensor coordinate system of the inertial measurement unit 10, for example, an xyz coordinate system constituted by an x-axis, a y-axis, and a z-axis that are perpendicular to one another, and the local space coordinate system of the space in which the object is present, for example, an XYZ coordinate system constituted by an X-axis, a Y-axis, and a Z-axis that are perpendicular to one another.
- the processing unit 20 estimates the posture of the object in the local space coordinate system from the angular velocities on the three axes and the accelerations on the three axes in the sensor coordinate system, which are output from the inertial measurement unit 10 mounted on the object.
- the posture of the object may also be referred to as the posture of the inertial measurement unit 10 .
- the bias removal unit 22 reads the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 from the ROM 30 , sets the read values as the initial setting values of a bias error, and then performs processing of calculating the angular velocities on the three axes obtained by removing the bias error from the output of the angular velocity sensor 12 and processing of calculating the accelerations on the three axes obtained by removing the bias error from the output of the acceleration sensor 14 .
- the posture change amount calculation unit 24 calculates a posture change amount of the object based on the output of the angular velocity sensor 12. Specifically, the posture change amount calculation unit 24 performs processing of calculating the posture change amount of the object by approximation with a polynomial expression in which the sampling interval Δt is used as a variable, by using the angular velocities on the three axes in which the bias error is removed by the bias removal unit 22.
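The patent states only that the posture change amount is approximated by a polynomial in Δt; one common second-order sketch (not necessarily the patent's polynomial) expresses the change as a small-angle quaternion:

```python
import numpy as np

def posture_change_quaternion(w, dt):
    """Small-angle posture change as a quaternion (w, x, y, z), built from
    the bias-removed angular velocities w over one sampling interval dt.
    Second-order polynomial approximation; illustrative only."""
    half = 0.5 * np.asarray(w, dtype=float) * dt    # first-order vector part
    w0 = 1.0 - 0.5 * float(np.dot(half, half))      # second-order scalar part
    return np.concatenate(([w0], half))
```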
- the velocity change amount calculation unit 26 calculates a velocity change amount of the object based on the output of the angular velocity sensor 12 and the output of the acceleration sensor 14 . Specifically, the velocity change amount calculation unit 26 performs processing of calculating the velocity change amount of the object by using the angular velocities on the three axes and accelerations on the three axes in which the bias error is removed by the bias removal unit 22 .
- the posture estimation unit 28 functions as an integration calculation unit 101 , a posture information prediction unit 102 , an error information update unit 103 , a correction coefficient calculation unit 104 , a posture information correction unit 105 , a normalization unit 106 , an error information correction unit 107 , a rotational error component removal unit 108 , a bias error limitation unit 109 , and an error information adjustment unit 110 .
- the posture estimation unit 28 performs processing of estimating the posture of the object with the posture change amount calculated by the posture change amount calculation unit 24 and the velocity change amount calculated by the velocity change amount calculation unit 26. In practice, the posture estimation unit 28 performs processing of estimating a state vector x and an error covariance matrix Σx² thereof with an extended Kalman filter.
- the integration calculation unit 101 performs integration processing of integrating the posture change amount calculated by the posture change amount calculation unit 24 with a previous estimated value of the posture that is corrected by the posture information correction unit 105 and normalized by the normalization unit 106 .
- the integration calculation unit 101 performs integration processing of integrating the velocity change amount calculated by the velocity change amount calculation unit 26 with a previous estimated value of the velocity that is corrected by the posture information correction unit 105 and normalized by the normalization unit 106 .
- the posture information prediction unit 102 performs processing of predicting posture quaternion q that is posture information on the object using the posture change amount calculated by the posture change amount calculation unit 24 .
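Predicting the posture quaternion from the posture change amount is conventionally done with a quaternion (Hamilton) product; a minimal sketch under that assumption (the patent does not spell out its prediction formula):

```python
import numpy as np

def quat_mul(p, q):
    """Hamilton product in (w, x, y, z) order. The predicted posture
    quaternion would be quat_mul(q_prev, dq), with dq the posture change
    amount over one sampling interval (a standard formulation, assumed here)."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([
        pw*qw - px*qx - py*qy - pz*qz,
        pw*qx + px*qw + py*qz - pz*qy,
        pw*qy - px*qz + py*qw + pz*qx,
        pw*qz + px*qy - py*qx + pz*qw,
    ])
```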
- the posture information prediction unit 102 also performs processing of predicting a motion velocity vector v that is velocity information on the object based on the velocity change amount calculated by the velocity change amount calculation unit 26 .
- the posture information prediction unit 102 performs processing of predicting the state vector x including the posture quaternion q and the motion velocity vector v as components.
- the error information update unit 103 performs processing of updating the error covariance matrix Σx² that is error information based on the output of the angular velocity sensor 12. Specifically, the error information update unit 103 performs processing of updating a posture error of the object with the angular velocities on the three axes in which the bias error is removed by the bias removal unit 22. In practice, the error information update unit 103 performs processing of updating the error covariance matrix Σx² with the extended Kalman filter.
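The time update of the error covariance follows the standard extended-Kalman-filter form; the patent does not disclose its exact state-transition matrix F or process noise Q, so both are placeholders in this sketch:

```python
import numpy as np

def propagate_covariance(P, F, Q):
    """EKF time update of the error covariance: P' = F P Fᵀ + Q.
    F and Q are assumed placeholders, not the patent's matrices."""
    return F @ P @ F.T + Q
```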
- the rotational error component removal unit 108 performs processing of removing a rotational error component around a reference vector in the error covariance matrix Σx² that is the error information. Specifically, the rotational error component removal unit 108 performs processing of removing an azimuth error component included in the posture error in the error covariance matrix Σx² updated by the error information update unit 103. In practice, the rotational error component removal unit 108 performs processing of generating the error covariance matrix Σx² in which rank limitation of an error covariance matrix Σq² of the posture and removal of the azimuth error component are performed on the error covariance matrix Σx².
- the error information adjustment unit 110 determines whether the output of the angular velocity sensor 12 is within an effective range. When the error information adjustment unit 110 determines that the output of the angular velocity sensor 12 is not within the effective range, the error information adjustment unit 110 increases a posture error component in the error covariance matrix Σx² that is the error information and reduces a correlation component between the posture error component and an error component other than the posture error component in the error covariance matrix Σx²; for example, the correlation component is set to zero. The error information adjustment unit 110 similarly determines whether the output of the acceleration sensor 14 is within an effective range.
- when the error information adjustment unit 110 determines that the output of the angular velocity sensor 12 or the output of the acceleration sensor 14 is not within a corresponding effective range, the error information adjustment unit 110 increases a motion velocity error component in an error covariance matrix Σx,k² and reduces a correlation component between the motion velocity error component and an error component other than the motion velocity error component in the error covariance matrix Σx,k²; for example, the correlation component is set to zero.
- the error information adjustment unit 110 increases the posture error component and the motion velocity error component in the error covariance matrix Σx² generated by the rotational error component removal unit 108 and reduces the correlation component between the posture error component and the error component other than the posture error component and a correlation component between the motion velocity error component and an error component other than the motion velocity error component; for example, the correlation components are set to zero.
- in an acceleration off-scale recovery period, after the error information adjustment unit 110 determines that the output of the angular velocity sensor 12 is within the effective range and the output of the acceleration sensor 14 is not within the effective range, the error information adjustment unit 110 increases the motion velocity error component in the error covariance matrix Σx² generated by the rotational error component removal unit 108 and reduces the correlation component between the motion velocity error component and the error component other than the motion velocity error component; for example, the correlation component is set to zero.
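On a covariance matrix, "increase the error component and zero its correlations" is a simple row/column operation; a sketch under assumed indices (the inflation factor and the state layout are illustrative, not from the patent):

```python
import numpy as np

def adjust_for_offscale(P, error_idx, inflate=100.0):
    """When a sensor output leaves its effective range, grow the affected
    error components (e.g. posture or motion velocity) and zero their
    correlations with the rest of the state."""
    P = P.copy()
    others = [i for i in range(P.shape[0]) if i not in error_idx]
    for i in error_idx:
        P[i, others] = 0.0   # correlation components set to zero
        P[others, i] = 0.0
        P[i, i] *= inflate   # error component itself increased
    return P
```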
- the bias error limitation unit 109 performs processing of limiting a bias error component of an angular velocity around the reference vector in the error covariance matrix Σx² that is the error information. Specifically, the bias error limitation unit 109 performs processing of limiting a vertical component of the bias error of the angular velocity in the error covariance matrix Σx² generated by the error information adjustment unit 110. In practice, the bias error limitation unit 109 determines whether the vertical component of the bias error of the angular velocity exceeds an upper limit value and, when the vertical component exceeds the upper limit value, generates the error covariance matrix Σx² in which limitation is applied such that the vertical component has the upper limit value.
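The limitation step is a clamp on one diagonal entry of the covariance; a minimal sketch (the diagonal index and the upper limit value are assumptions for illustration):

```python
import numpy as np

def limit_vertical_bias_error(P, k, upper):
    """Clamp the vertical (around the gravity reference vector) component
    of the angular velocity bias error, at diagonal index k, to an upper
    limit value."""
    P = P.copy()
    P[k, k] = min(P[k, k], upper)
    return P
```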
- the correction coefficient calculation unit 104 performs processing of calculating a correction coefficient based on the error covariance matrix Σx² that is the error information generated by the bias error limitation unit 109.
- the correction coefficient determines a correction amount of the posture quaternion q that is the posture information on the object, or of the motion velocity vector v that is the velocity information, by the posture information correction unit 105, and a correction amount of the error covariance matrix Σx that is the error information by the error information correction unit 107.
- the correction coefficient calculation unit 104 performs processing of calculating an observation residual Δz, a Kalman coefficient K, and a transformation matrix H.
- the posture information correction unit 105 performs processing of correcting the posture quaternion q that is the posture information on the object predicted by the posture information prediction unit 102 based on the error covariance matrix Σx that is the error information. Specifically, the posture information correction unit 105 performs processing of correcting the posture quaternion q by using the error covariance matrix Σx generated by the bias error limitation unit 109, and the Kalman coefficient K and an observation residual Δza of the gravitational acceleration calculated by the correction coefficient calculation unit 104 based on a gravitational acceleration vector g that is the reference vector and an acceleration vector a obtained from the output of the acceleration sensor 14. In practice, the posture information correction unit 105 performs processing of correcting the state vector x predicted by the posture information prediction unit 102 with the extended Kalman filter.
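The gravity-based correction compares the reference gravity vector with the measured acceleration rotated by the current posture quaternion; a sketch of that residual and the Kalman-style correction (the observation model here is a conventional assumption, since the patent does not disclose its exact form):

```python
import numpy as np

def gravity_residual(q, a_meas, g=9.80665):
    """Observation residual between the gravitational reference vector and
    the measured acceleration rotated by posture quaternion q (w, x, y, z)."""
    w, x, y, z = q
    R = np.array([  # rotation matrix equivalent to q
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    return np.array([0.0, 0.0, g]) - R @ np.asarray(a_meas)

def correct_state(x, K, dz):
    """Measurement update: correct the predicted state with Kalman coefficient K."""
    return x + K @ dz
```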
- the normalization unit 106 performs processing of normalizing the posture quaternion q that is the posture information on the object corrected by the posture information correction unit 105 so that the magnitude thereof does not change. In practice, the normalization unit 106 performs processing of normalizing the state vector x corrected by the posture information correction unit 105.
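The normalization itself is a one-line operation; a minimal sketch:

```python
import numpy as np

def normalize_quaternion(q):
    """Renormalize the corrected posture quaternion to unit magnitude;
    the Kalman correction step does not preserve the unit norm."""
    return np.asarray(q, dtype=float) / np.linalg.norm(q)
```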
- the error information correction unit 107 performs processing of correcting the error covariance matrix Σx that is the error information. Specifically, the error information correction unit 107 performs processing of correcting the error covariance matrix Σx generated by the bias error limitation unit 109 with the extended Kalman filter and the transformation matrix H and the Kalman coefficient K calculated by the correction coefficient calculation unit 104.
- the posture quaternion q that is the posture information on the object estimated by the processing unit 20 may be transmitted to another device via the communication unit 60.
- the posture estimation device 1 of the present embodiment can estimate the posture of the object in a stored predetermined operation of the object from the output of the angular velocity sensor 12 and the output of the acceleration sensor 14 using the Kalman filter based on the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 . Therefore, the bias error in the initial state caused by the difference in the mounting position associated with the predetermined operation of the object can be corrected, and thus the posture of the object can be measured with high accuracy.
- the posture estimation method for the posture estimation device includes a measurement step, a storage step, an initial value setting step, an angular velocity and acceleration measurement step, an initial setting value updating step, and a posture estimation step.
- in step S101, the inertial measurement unit 10 is installed at the mounting position associated with the most frequently performed of the object's operations, which is the predetermined operation of the object, to measure the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14.
- in step S102, the processing unit 20 stores the measured bias value BW and variance value PWW of the angular velocity sensor 12 and the measured bias value BA and variance value PVV of the acceleration sensor 14 in the ROM 30 as the storage unit.
- in step S103, upon receiving the reset instruction from the user, the processing unit 20 reads the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 from the ROM 30 and sets the read values as the initial setting values of the bias error.
- in step S104, the angular velocity and the acceleration are measured by the angular velocity sensor 12 and the acceleration sensor 14 in a stationary state. The measurement time is 200 msec.
- in step S105, the processing unit 20 adds the bias value BW and the variance value PWW of the angular velocity sensor 12 to the measured angular velocity data dω and updates the bias error from the initial setting values. Further, the variance value PVV of the acceleration sensor 14 is added to the acceleration data da, and the bias error is updated from the initial setting values. The bias value BA of the acceleration sensor 14 is not used to update the initial setting values.
- in step S106, the processing unit 20 estimates the posture of the object from the output of the angular velocity sensor 12 and the output of the acceleration sensor 14 using the Kalman filter based on the updated bias value BW, variance value PWW, and variance value PVV and the un-updated bias value BA.
- the posture quaternion q that is the estimated posture information on the object is transmitted to another device via the communication unit 60 .
- the posture estimation method of the present embodiment can estimate the posture of the object from the output of the angular velocity sensor 12 and the output of the acceleration sensor 14 using the Kalman filter based on the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 in the stored predetermined operation of the object. Therefore, the bias error in the initial state caused by the difference in the mounting position associated with the predetermined operation of the object can be corrected, and thus the posture of the object can be measured with high accuracy.
- the posture estimation method of the present embodiment is the same as the posture estimation method of the first embodiment except that the predetermined operation of the object includes a plurality of operations and the corresponding bias values BW, BA and the variance values PWW, PVV are stored in the ROM 30 according to types of the operations. Differences from the first embodiment described above will be mainly described, and the description of similar matters will be omitted.
- the posture estimation method according to the second embodiment will be described with reference to FIG. 4 .
- the posture estimation method includes a measurement step, a storage step, an initial value selection step, an initial value setting step, an angular velocity and acceleration measurement step, an initial setting value updating step, and a posture estimation step.
- In step S 201 , the inertial measurement unit 10 is installed at a mounting position associated with a first operation of the object, and the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 are measured.
- In step S 202 , the measured bias value BW and variance value PWW of the angular velocity sensor 12 and the measured bias value BA and variance value PVV of the acceleration sensor 14 are stored in the ROM 30 , which is the storage unit, as the bias values BW, BA and the variance values PWW, PVV of the first operation.
- Step S 201 and step S 202 are repeated according to the plurality of operations of the object, that is, according to the types of the operations.
- After acquiring the bias values BW, BA and the variance values PWW, PVV of the sensors 12 and 14 corresponding to the plurality of operations of the object, the processing unit 20 receives an operation selection instruction from a user in step S 203 .
- In step S 204 , upon receiving a reset instruction from the user, the processing unit 20 reads, from the ROM 30 , the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 associated with the selected operation, and sets the read values as the initial setting values of the bias error.
- Step S 205 is similar to step S 104 of the first embodiment, and thus the description thereof is omitted.
- Step S 206 is similar to step S 105 of the first embodiment, and thus the description thereof is omitted.
- Step S 207 is similar to step S 106 of the first embodiment, and thus the description thereof is omitted.
- In this way, the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 can be selected according to the types of the operations, and thus the bias error in the initial state can be corrected and the posture of the object can be measured with high accuracy.
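The per-operation storage and selection of steps S 201 to S 204 can be sketched as a simple lookup table. The operation names and all numeric values below are hypothetical placeholders, not measured values from the patent:

```python
import numpy as np

# Hypothetical per-operation calibration table (steps S 201-S 202): each entry
# holds the bias and variance values measured with the IMU mounted for that
# operation.  All values here are illustrative only.
CALIBRATION = {
    "raise_bucket": {"BW": np.array([0.010, -0.002, 0.001]),
                     "PWW": np.full(3, 1e-6),
                     "BA": np.array([0.05, 0.00, -0.03]),
                     "PVV": np.full(3, 1e-4)},
    "lower_bucket": {"BW": np.array([0.008, -0.001, 0.002]),
                     "PWW": np.full(3, 2e-6),
                     "BA": np.array([0.04, 0.01, -0.02]),
                     "PVV": np.full(3, 2e-4)},
}

def select_initial_values(operation):
    """Steps S 203-S 204: on reset, look up the stored values for the
    user-selected operation and return them as initial setting values."""
    entry = CALIBRATION[operation]
    return entry["BW"], entry["PWW"], entry["BA"], entry["PVV"]

bw, pww, ba, pvv = select_initial_values("raise_bucket")
```

Keying the table by operation type is what lets a single stored ROM image serve mounting positions that move very differently during use.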
- a movable device 600 including the posture estimation device 1 according to the third embodiment will be described with reference to FIGS. 5 and 6 .
- FIGS. 5 and 6 show a hydraulic shovel that is an example of the construction machine as the movable device 600 .
- a vehicle body includes a lower running body 612 and an upper revolving body 611 that is mounted to revolve above the lower running body 612 .
- a work mechanism 620 is provided at a front portion of the upper revolving body 611 .
- the work mechanism 620 includes a plurality of members that are pivotable in an up-and-down direction.
- a driver seat (not shown) is provided in the upper revolving body 611 , and an operation device (not shown) for operating the members constituting the work mechanism 620 is provided at the driver seat.
- An inertial measurement unit 10 d functioning as an inclination sensor that detects an inclination angle of the upper revolving body 611 is disposed in the upper revolving body 611 .
- the work mechanism 620 includes a boom 613 , an arm 614 , a bucket link 616 , a bucket 615 , a boom cylinder 617 , an arm cylinder 618 , and a bucket cylinder 619 , as the plurality of members.
- the boom 613 is attached to the front portion of the upper revolving body 611 to move up and down.
- the arm 614 is attached to a top side of the boom 613 to move up and down.
- the bucket link 616 is attached to a top side of the arm 614 to be pivotable.
- the bucket 615 is attached to top sides of the arm 614 and the bucket link 616 to be pivotable.
- the boom cylinder 617 drives the boom 613 .
- the arm cylinder 618 drives the arm 614 .
- the bucket cylinder 619 drives the bucket 615 through the bucket link 616 .
- a base end of the boom 613 is supported by the upper revolving body 611 to be pivotable in the up-and-down direction, and is rotationally driven relative to the upper revolving body 611 by expansion and contraction of the boom cylinder 617 .
- An inertial measurement unit 10 c functioning as an inertial sensor that detects a motion state of the boom 613 is disposed in the boom 613 .
- One end of the arm 614 is supported by the top side of the boom 613 to be rotatable.
- the arm 614 is rotationally driven relative to the boom 613 by expansion and contraction of the arm cylinder 618 .
- An inertial measurement unit 10 b functioning as an inertial sensor that detects a motion state of the arm 614 is disposed in the arm 614 .
- the bucket link 616 and the bucket 615 are supported by the top side of the arm 614 to be pivotable.
- the bucket link 616 is rotationally driven relative to the arm 614 by expansion and contraction of the bucket cylinder 619 , and the bucket 615 is rotationally driven relative to the arm 614 together with the bucket link 616 .
- An inertial measurement unit 10 a functioning as an inertial sensor that detects a motion state of the bucket link 616 is disposed in the bucket link 616 .
- The inertial measurement units 10 a, 10 b, 10 c, and 10 d can detect at least one of an angular velocity and an acceleration acting on the members of the work mechanism 620 or the upper revolving body 611 . According to the mounting positions of the upper revolving body 611 , the boom 613 , the arm 614 , and the bucket 615 that operate differently, the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 are actually measured before mounting. Therefore, the inertial measurement units 10 a, 10 b, 10 c, and 10 d can correct bias errors in an initial state. Further, as shown in FIG. 6 , the inertial measurement units 10 a, 10 b, and 10 c are coupled in series and can transmit detection signals to a calculation device 630 .
- Since the inertial measurement units 10 a, 10 b, and 10 c are coupled in series, it is possible to reduce the number of wires for transmitting the detection signals in a movable region and to obtain a compact wiring structure.
- With the compact wiring structure, it is easy to select a method for laying the wires, and deterioration of or damage to the wires can be reduced.
- the movable device 600 is provided with the calculation device 630 that calculates an inclination angle of the upper revolving body 611 or positions or postures of the boom 613 , the arm 614 , and the bucket 615 constituting the work mechanism 620 .
- the calculation device 630 includes the posture estimation device 1 in the above embodiment and a control device 632 .
- the posture estimation device 1 estimates posture information on the movable device 600 based on output signals of the inertial measurement units 10 a, 10 b, 10 c, and 10 d.
- the control device 632 controls the posture of the movable device 600 based on the posture information on the movable device 600 estimated by the posture estimation device 1 .
- the calculation device 630 receives various detection signals input from the inertial measurement units 10 a, 10 b, 10 c, and 10 d and calculates the positions and postures or posture angles of the boom 613 , the arm 614 , and the bucket 615 or an inclination state of the upper revolving body 611 based on the various detection signals.
- The calculated position-and-posture signal including the posture angles of the boom 613 , the arm 614 , and the bucket 615 , or the inclination signal including the posture angle of the upper revolving body 611 (for example, the position-and-posture signal of the bucket 615 ), is used as feedback information for display on a monitoring device (not shown) at the driver seat or for controlling an operation of the work mechanism 620 or the upper revolving body 611 .
- In addition to the hydraulic shovel (jumbo, back hoe, power shovel) exemplified above, examples of the construction machine include a rough terrain crane (crane car), a bulldozer, an excavator and loader, a wheel loader, and an aerial work vehicle (lift car).
- With the posture estimation device 1 , it is possible to obtain posture information with high accuracy, and thus it is possible to implement appropriate posture control of the movable device 600 .
- The inertial measurement units 10 a, 10 b, 10 c, and 10 d, each having the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 according to its mounting position, are mounted on the upper revolving body 611 , the boom 613 , the arm 614 , and the bucket 615 that operate differently, so that the bias error in the initial state can be corrected and the posture of the object can be measured with high accuracy.
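With one IMU per member, relative joint angles of the work mechanism can be derived from the estimated posture quaternions. A simplified planar sketch (pitch-only; the helper names and the (w, x, y, z) convention are our assumptions, not part of the patent):

```python
import math

def quat_to_pitch(q):
    """Pitch (rotation about the y-axis) of a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    # Clamp guards against tiny numerical overshoot outside [-1, 1].
    return math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))

def relative_pitch(q_link, q_base):
    """Planar angle of one member relative to another, e.g. arm vs. boom."""
    return quat_to_pitch(q_link) - quat_to_pitch(q_base)

# Boom pitched 30 degrees, arm pitched 10 degrees (both about the y-axis).
q_boom = (math.cos(math.radians(15)), 0.0, math.sin(math.radians(15)), 0.0)
q_arm = (math.cos(math.radians(5)), 0.0, math.sin(math.radians(5)), 0.0)
angle = relative_pitch(q_arm, q_boom)
```

For the full 3-D linkage, a proper relative rotation (conjugate-and-multiply of the two quaternions) would replace the pitch subtraction, but this planar form matches the up-and-down pivoting described for the boom, arm, and bucket.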
- The above embodiments describe a four-wheel vehicle, such as an agricultural machine, and the construction machine as examples of the movable device in which the posture estimation device 1 is used.
- Other examples of the movable device include motorcycles, bicycles, trains, airplanes, biped robots, remote-controlled or autonomous aircraft (such as radio-controlled airplanes, radio-controlled helicopters, and drones), rockets, satellites, ships, and automated guided vehicles (AGVs).
- the predetermined operation described above may be an operation of raising the bucket 615 , an operation of lowering the bucket 615 , or revolving or movement.
- the plurality of operations described above may be any two of the operation of raising the bucket 615 , the operation of lowering the bucket 615 , revolving or movement, and the like.
- the predetermined operation may be a predetermined operation of a robot arm or a drone. Further, the predetermined operation may be a walking operation of a biped robot.
- the bias value BW and the variance value PWW of the angular velocity sensor 12 may be different values or the same values in the inertial measurement units 10 a, 10 b, 10 c, 10 d.
- the movable device 600 has four inertial measurement units 10 a, 10 b, 10 c, and 10 d, but may have two, three, or five or more.
- the present disclosure includes a configuration substantially the same as the configuration described in the embodiments, for example, a configuration having the same function, method, and result, or a configuration having the same purpose and effect.
- the present disclosure includes a configuration in which a non-essential portion of the configuration described in the embodiments is replaced.
- the present disclosure includes a configuration having the same function and effect as the configuration described in the embodiments, or a configuration capable of achieving the same purpose.
- the present disclosure includes a configuration in which a known technique is added to the configuration described in the embodiments.
Abstract
A posture estimation method includes: measuring a bias value BW and a variance value PWW of an angular velocity sensor and a bias value BA and a variance value PVV of an acceleration sensor in a predetermined operation of an object and storing the measured values in a storage unit; reading the bias value BW, the variance value PWW, the bias value BA, and the variance value PVV from the storage unit and setting the read values as initial setting values during reset; and estimating a posture of the object using a Kalman filter from an output of the angular velocity sensor and an output of the acceleration sensor.
Description
- The present application is based on, and claims priority from JP Application Serial Number 2021-021507, filed Feb. 15, 2021, the disclosure of which is hereby incorporated by reference herein in its entirety.
- The present disclosure relates to a posture estimation method, a posture estimation device, and a movable device.
- A device and a system are known in which an inertial measurement unit (IMU) is attached to an object and a position and a posture of the object are calculated using an output signal of the inertial measurement unit. Since the output signal of the inertial measurement unit has a bias error and an error also occurs in posture calculation, a method is proposed in which a Kalman filter is used to correct these errors and estimate an accurate posture of the object. For example, JP-A-2020-20631 describes a posture estimation method for estimating a posture of an object by correcting predicted posture information on the object based on error information when an angular velocity sensor exceeds an effective measurement range.
- However, an inertial measurement unit equipped with an angular velocity sensor and an acceleration sensor has different X-axis, Y-axis, and Z-axis directions depending on the mounting position associated with an operation of an object, and a bias error in these directions occurs from an initial state. Thus, the estimation accuracy of the posture of the object may decrease, and high-precision measurement cannot be performed.
- A posture estimation method includes: measuring a bias value BW and a variance value PWW of an angular velocity sensor and a bias value BA and a variance value PVV of an acceleration sensor in a predetermined operation of an object and storing the measured values in a storage unit; reading the bias value BW, the variance value PWW, the bias value BA, and the variance value PVV from the storage unit and setting the read values as initial setting values during reset; measuring an angular velocity and an acceleration by the angular velocity sensor and the acceleration sensor in a stationary state; updating the bias value BW and the variance value PWW of the angular velocity sensor and the variance value PVV of the acceleration sensor from the initial setting values; and estimating a posture of the object using a Kalman filter from an output of the angular velocity sensor and an output of the acceleration sensor.
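The full method runs an extended Kalman filter on a quaternion state, which is beyond a short sketch, but its predict-from-gyro / correct-from-gravity structure can be illustrated with a scalar Kalman filter on a single tilt angle. The 1-D simplification and all names are ours, not the patent's:

```python
import math

def tilt_kalman_step(theta, p, omega, bias, q_var, accel_xz, r_var, dt):
    """One predict/correct cycle of a scalar Kalman filter on one tilt angle:
    predict with the bias-corrected gyro rate, then correct with the tilt
    observed from the gravity direction in the accelerometer."""
    # Prediction: integrate the bias-corrected angular velocity.
    theta_pred = theta + (omega - bias) * dt
    p_pred = p + q_var * dt
    # Observation: tilt angle implied by the measured gravity direction.
    ax, az = accel_xz
    theta_meas = math.atan2(ax, az)
    # Kalman gain and correction.
    k = p_pred / (p_pred + r_var)
    theta_new = theta_pred + k * (theta_meas - theta_pred)
    p_new = (1.0 - k) * p_pred
    return theta_new, p_new

# Stationary at 0.1 rad tilt; the gyro reads only its known bias of 0.01 rad/s.
theta, p = 0.0, 1.0
g = 9.80665
accel = (g * math.sin(0.1), g * math.cos(0.1))
for _ in range(200):
    theta, p = tilt_kalman_step(theta, p, 0.01, 0.01, 1e-6, accel, 1e-4, 0.01)
```

Because the stored bias cancels the gyro reading, the estimate settles on the 0.1 rad tilt implied by gravity; with a wrong initial bias, the same loop would converge more slowly and drift, which is the error the stored BW, PWW, BA, PVV values are meant to prevent.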
- A posture estimation device includes a storage unit that stores a bias value BW and a variance value PWW of an angular velocity sensor and a bias value BA and a variance value PVV of an acceleration sensor in a predetermined operation of an object, and a processing unit that estimates a posture of the object using a Kalman filter from an output of the angular velocity sensor and an output of the acceleration sensor.
- A movable device includes the posture estimation device described above and a control device configured to control the posture of the object based on posture information on the object estimated by the posture estimation device.
- FIG. 1 is a diagram showing a configuration example of a posture estimation device according to a first embodiment.
- FIG. 2 is a diagram showing a sensor coordinate system and a local coordinate system.
- FIG. 3 is a flowchart showing an example of a procedure of a posture estimation method according to the first embodiment.
- FIG. 4 is a flowchart showing an example of a procedure of a posture estimation method according to a second embodiment.
- FIG. 5 is a diagram showing a configuration example of a movable device having a posture estimation device according to a third embodiment.
- FIG. 6 is a block diagram showing a configuration example of the movable device.
- First, a posture estimation device 1 according to the first embodiment will be described with reference to FIGS. 1 and 2.
- As shown in
FIG. 1, the posture estimation device 1 of the present embodiment includes a processing unit 20, a ROM 30, a RAM 40, and a recording medium 50 that are storage units, and a communication unit 60. The posture estimation device 1 estimates a posture of an object based on an output of an inertial measurement unit (IMU) 10. The posture estimation device 1 of the present embodiment may have a part of these components changed or removed, or may have another component added.
- In the present embodiment, as shown in FIG. 1, the posture estimation device 1 is separated from the inertial measurement unit 10. Alternatively, the posture estimation device 1 may include the inertial measurement unit 10. The inertial measurement unit 10 and the posture estimation device 1 may be accommodated in one housing, or the inertial measurement unit 10 may be separated or separable from a main body accommodating the posture estimation device 1. In the former case, the posture estimation device 1 is mounted on the object, and in the latter case, the inertial measurement unit 10 is mounted on the object.
- In the present embodiment, the inertial measurement unit 10 includes an angular velocity sensor 12, an acceleration sensor 14, and a signal processing unit 16. The inertial measurement unit 10 of the present embodiment may have a part of these components changed or removed, or may have another component added.
- The angular velocity sensor 12 measures angular velocities in the directions of three axes that intersect with each other and are ideally perpendicular to each other, and outputs analog signals corresponding to the magnitudes and directions of the measured angular velocities on the three axes.
- The acceleration sensor 14 measures accelerations in the directions of the three axes that intersect with each other and are ideally perpendicular to each other, and outputs analog signals corresponding to the magnitudes and directions of the measured accelerations on the three axes.
- The signal processing unit 16 performs processing of sampling the output signals of the angular velocity sensor 12 at a predetermined sampling interval Δt to convert the output signals into angular velocity data dω having a digital value. The signal processing unit 16 also performs processing of sampling the output signals of the acceleration sensor 14 at the predetermined sampling interval Δt to convert the output signals into acceleration data dα having a digital value.
- Ideally, the angular velocity sensor 12 and the acceleration sensor 14 are attached to the inertial measurement unit 10 such that their three axes coincide with the three axes (x-axis, y-axis, z-axis) of a sensor coordinate system that is an orthogonal coordinate system defined for the inertial measurement unit 10. In practice, however, an error occurs in the mounting angle. Thus, the signal processing unit 16 performs processing of converting the angular velocity data dω and the acceleration data dα into data in the xyz coordinate system, using a correction parameter that is calculated in advance in accordance with the error in the mounting angle. The signal processing unit 16 also performs processing of temperature-correcting the angular velocity data dω and the acceleration data dα in accordance with the temperature characteristics of the angular velocity sensor 12 and the acceleration sensor 14.
- A function of A/D conversion or temperature correction may be built into the angular velocity sensor 12 and the acceleration sensor 14.
- The inertial measurement unit 10 outputs the angular velocity data dω and the acceleration data dα processed by the signal processing unit 16 to the processing unit 20 of the posture estimation device 1.
- The
ROM 30 stores programs for the processing unit 20 to perform various types of processing, and various programs or various types of data for implementing application functions. The ROM 30 stores a bias value BW and a variance value PWW of the angular velocity sensor 12 and a bias value BA and a variance value PVV of the acceleration sensor 14 in a state in which the inertial measurement unit 10 is placed at a mounting position associated with a predetermined operation of the object. The predetermined operation is the most frequently performed of the operations of the object. The predetermined operation may include a plurality of operations. In this case, the ROM 30 stores the bias values BW, BA and the variance values PWW, PVV according to the plurality of operations, that is, according to the types of the operations.
- The RAM 40 is a storage unit that is used as a work area of the processing unit 20, and temporarily stores a program or data read out from the ROM 30, or operation results obtained by the processing unit 20 performing processing in accordance with various programs.
- The recording medium 50 is a non-volatile storage unit that stores data required to be preserved for a long term among the data generated by the processing of the processing unit 20. The recording medium 50 may store programs for the processing unit 20 to perform various types of processing, and various programs or various types of data for implementing application functions.
- The processing unit 20 performs various types of processing in accordance with a program stored in the ROM 30 or the recording medium 50, or in accordance with a program received from a server via a network and then stored in the RAM 40 or the recording medium 50. In particular, in the present embodiment, the processing unit 20 executes the program to function as a bias removal unit 22, a posture change amount calculation unit 24, a velocity change amount calculation unit 26, and a posture estimation unit 28, and performs a predetermined operation on the angular velocity data dω and the acceleration data dα output at the interval Δt by the inertial measurement unit 10 to perform processing of estimating the posture of the object. Upon receiving a reset instruction from a user, the processing unit 20 reads the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 from the ROM 30 and sets the read values as initial setting values. When the predetermined operation includes a plurality of operations, the processing unit 20 reads the bias values BW, BA and the variance values PWW, PVV corresponding to an operation selected by the user from the plurality of operations from the ROM 30 and sets the read values as the initial setting values.
- In the present embodiment, as shown in
FIG. 2, two coordinate systems are considered: the sensor coordinate system that is the coordinate system of the inertial measurement unit 10, for example, an xyz coordinate system constituted by an x-axis, a y-axis, and a z-axis that are perpendicular to each other, and a local space coordinate system that is a coordinate system of the space in which the object is present, for example, an XYZ coordinate system constituted by an X-axis, a Y-axis, and a Z-axis that are perpendicular to each other. The processing unit 20 estimates the posture of the object in the local space coordinate system from the angular velocities on the three axes and the accelerations on the three axes in the sensor coordinate system, which are output from the inertial measurement unit 10 mounted on the object. The posture of the object may also be referred to as the posture of the inertial measurement unit 10.
- The bias removal unit 22 reads the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 from the ROM 30, sets the read values as the initial setting values of a bias error, and then performs processing of calculating the angular velocities on the three axes obtained by removing the bias error from the output of the angular velocity sensor 12 and processing of calculating the accelerations on the three axes obtained by removing the bias error from the output of the acceleration sensor 14.
- The posture change amount calculation unit 24 calculates a posture change amount of the object based on the output of the angular velocity sensor 12. Specifically, the posture change amount calculation unit 24 performs processing of calculating the posture change amount of the object by approximation with a polynomial expression in which the sampling interval Δt is used as a variable, using the angular velocities on the three axes from which the bias error is removed by the bias removal unit 22.
- The velocity change amount calculation unit 26 calculates a velocity change amount of the object based on the output of the angular velocity sensor 12 and the output of the acceleration sensor 14. Specifically, the velocity change amount calculation unit 26 performs processing of calculating the velocity change amount of the object using the angular velocities on the three axes and the accelerations on the three axes from which the bias error is removed by the bias removal unit 22.
- The posture estimation unit 28 functions as an integration calculation unit 101, a posture information prediction unit 102, an error information update unit 103, a correction coefficient calculation unit 104, a posture information correction unit 105, a normalization unit 106, an error information correction unit 107, a rotational error component removal unit 108, a bias error limitation unit 109, and an error information adjustment unit 110. The posture estimation unit 28 performs processing of estimating the posture of the object with the posture change amount calculated by the posture change amount calculation unit 24 and the velocity change amount calculated by the velocity change amount calculation unit 26. In practice, the posture estimation unit 28 performs processing of estimating a state vector x and an error covariance matrix Σx² thereof with an extended Kalman filter.
- The
integration calculation unit 101 performs integration processing of integrating the posture change amount calculated by the posture changeamount calculation unit 24 with a previous estimated value of the posture that is corrected by the postureinformation correction unit 105 and normalized by thenormalization unit 106. Theintegration calculation unit 101 performs integration processing of integrating the velocity change amount calculated by the velocity changeamount calculation unit 26 with a previous estimated value of the velocity that is corrected by the postureinformation correction unit 105 and normalized by thenormalization unit 106. - The posture
information prediction unit 102 performs processing of predicting posture quaternion q that is posture information on the object using the posture change amount calculated by the posture changeamount calculation unit 24. The postureinformation prediction unit 102 also performs processing of predicting a motion velocity vector v that is velocity information on the object based on the velocity change amount calculated by the velocity changeamount calculation unit 26. In practice, the postureinformation prediction unit 102 performs processing of predicting the state vector x including the posture quaternion q and the motion velocity vector v as components. - The error
information update unit 103 performs processing of updating the error covariance matrix Σx 2 that is error information based on the output of theangular velocity sensor 12. Specifically, the errorinformation update unit 103 performs processing of updating a posture error of the object with the angular velocities on the three axes in which the bias error is removed by thebias removal unit 22. In practice, the errorinformation update unit 103 performs processing of updating the error covariance matrix Σx 2 with the extended Kalman filter. - The rotational error
component removal unit 108 performs processing of removing a rotational error component around a reference vector in an error covariance matrix Σ2 that is the error information. Specifically, the rotational errorcomponent removal unit 108 performs processing of removing an azimuth error component included in the posture error in the error covariance matrix Σx 2 updated by the errorinformation update unit 103. In practice, the rotational errorcomponent removal unit 108 performs processing of generating the error covariance matrix Σx 2 in which rank limitation of an error covariance matrix Σq 2 of the posture and removal of the azimuth error component are performed on the error covariance matrix Σx 2. - The error
information adjustment unit 110 determines whether the output of theangular velocity sensor 12 is within an effective range. When the errorinformation adjustment unit 110 determines that the output of theangular velocity sensor 12 is not within the effective range, the errorinformation adjustment unit 110 increases a posture error component in the error covariance matrix Σx 2 that is the error information and reduces a correlation component between the posture error component and an error component other than the posture error component in the error covariance matrix Σx 2, for example, the correlation component is set to zero. The errorinformation adjustment unit 110 determines whether the output of theacceleration sensor 14 is within an effective range. When the errorinformation adjustment unit 110 determines that the output of theangular velocity sensor 12 or the output of theacceleration sensor 14 is not within a corresponding effective range, the errorinformation adjustment unit 110 increases a motion velocity error component in an error covariance matrix Σx, k 2 and reduces a correlation component between the motion velocity error component and an error component other than the motion velocity error component in the error covariance matrix Σx, k 2, for example, the correlation component is set to zero. 
Specifically, in an angular velocity off-scale recovery period after the errorinformation adjustment unit 110 determines that the output of theangular velocity sensor 12 is not within the effective range, the errorinformation adjustment unit 110 increases the posture error component and the motion velocity error component in the error covariance matrix Σx 2 generated by the rotational errorcomponent removal unit 108 and reduces the correlation component between the posture error component and the error component other than the posture error component and a correlation component between the motion velocity error component and an error component other than the motion velocity error component, for example, the correlation components are set to zero. In an acceleration off-scale recovery period after the errorinformation adjustment unit 110 determines that the output of theangular velocity sensor 12 is within the effective range and the output of theacceleration sensor 14 is not within the effective range, the errorinformation adjustment unit 110 increases the motion velocity error component in the error covariance matrix Σx 2 generated by the rotational errorcomponent removal unit 108 and reduces the correlation component between the motion velocity error component and the error component other than the motion velocity error component, for example, the correlation component is set to zero. - The bias
error limitation unit 109 performs processing of limiting a bias error component of an angular velocity around the reference vector in the error covariance matrix Σx 2 that is the error information. Specifically, the biaserror limitation unit 109 performs processing of limiting a vertical component of the bias error of the angular velocity in the error covariance matrix Σx 2 generated by the errorinformation adjustment unit 110. In practice, the biaserror limitation unit 109 performs processing as follows. The biaserror limitation unit 109 determines whether the vertical component of the bias error of the angular velocity exceeds an upper limit value. When the vertical component exceeds the upper limit value, the biaserror limitation unit 109 generates the error covariance matrix Σx 2 in which limitation is applied such that the vertical component has the upper limit value. - The correction
coefficient calculation unit 104 performs processing of calculating a correction coefficient based on the error covariance matrix Σx2 that is the error information generated by the bias error limitation unit 109. The correction coefficient determines a correction amount of the posture quaternion q that is the posture information on the object by the posture information correction unit 105 or the motion velocity vector v that is the velocity information, and a correction amount of an error covariance matrix Σx that is the error information by the error information correction unit 107. In practice, the correction coefficient calculation unit 104 performs processing of calculating an observation residual Δz, a Kalman coefficient K, and a transformation matrix H. - The posture
information correction unit 105 performs processing of correcting the posture quaternion q that is the posture information on the object predicted by the posture information prediction unit 102 based on the error covariance matrix Σx that is the error information. Specifically, the posture information correction unit 105 performs processing of correcting the posture quaternion q by using the error covariance matrix Σx generated by the bias error limitation unit 109, the Kalman coefficient K, and an observation residual Δza of the gravitational acceleration calculated by the correction coefficient calculation unit 104 based on a gravitational acceleration vector g that is the reference vector and an acceleration vector a obtained from the output of the acceleration sensor 14. In practice, the posture information correction unit 105 performs processing of correcting the state vector x predicted by the posture information prediction unit 102 with the extended Kalman filter. - The
normalization unit 106 performs processing of normalizing the posture quaternion q that is the posture information on the object corrected by the posture information correction unit 105 so that the magnitude thereof does not change. In practice, the normalization unit 106 performs processing of normalizing the state vector x corrected by the posture information correction unit 105. - The error
information correction unit 107 performs processing of correcting the error covariance matrix Σx that is the error information. Specifically, the error information correction unit 107 performs processing of correcting the error covariance matrix Σx generated by the bias error limitation unit 109 with the extended Kalman filter, using the transformation matrix H and the Kalman coefficient K calculated by the correction coefficient calculation unit 104. - The posture quaternion q that is the posture information on the object estimated by the processing unit may be transmitted to another device via the
communication unit 60. - As described above, the posture estimation device 1 of the present embodiment can estimate the posture of the object, for the stored predetermined operation of the object, from the output of the
angular velocity sensor 12 and the output of the acceleration sensor 14 using the Kalman filter based on the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14. Therefore, the bias error in the initial state caused by the difference in the mounting position associated with the predetermined operation of the object can be corrected, and thus the posture of the object can be measured with high accuracy. - Next, a posture estimation method for the posture estimation device 1 according to the first embodiment will be described with reference to
FIG. 3. - As shown in
FIG. 3, the posture estimation method for the posture estimation device according to the present embodiment includes a measurement step, a storage step, an initial value setting step, an angular velocity and acceleration measurement step, an initial setting value updating step, and a posture estimation step. - First, in step S101, the
inertial measurement unit 10 is installed at the mounting position associated with the most frequently performed one of the operations of the object, which is the predetermined operation of the object, to measure the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14. - In step S102, the
processing unit 20 stores the measured bias value BW and variance value PWW of the angular velocity sensor 12 and the measured bias value BA and variance value PVV of the acceleration sensor 14 in the ROM 30 as the storage unit. - In step S103, upon receiving the reset instruction from the user, the
processing unit 20 reads the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 from the ROM 30 and sets the read values as the initial setting values of the bias error. - In step S104, the angular velocity and the acceleration are measured by the
angular velocity sensor 12 and the acceleration sensor 14 in a stationary state. The measurement time is 200 msec. - In step S105, the
processing unit 20 adds the bias value BW and the variance value PWW of the angular velocity sensor 12 to the measured angular velocity data dω, and updates the bias error from the initial setting values. Further, the variance value PVV of the acceleration sensor 14 is added to the acceleration data dα, and the bias error is updated from the initial setting values. The bias value BA of the acceleration sensor 14 is not used to update the initial setting values. - In step S106, the
processing unit 20 estimates the posture of the object from the output of the angular velocity sensor 12 and the output of the acceleration sensor 14 using the Kalman filter based on the updated bias value BW, variance value PWW, and variance value PVV and the un-updated bias value BA. The posture quaternion q that is the estimated posture information on the object is transmitted to another device via the communication unit 60. - As described above, the posture estimation method of the present embodiment can estimate the posture of the object from the output of the
angular velocity sensor 12 and the output of the acceleration sensor 14 using the Kalman filter based on the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 stored for the predetermined operation of the object. Therefore, the bias error in the initial state caused by the difference in the mounting position associated with the predetermined operation of the object can be corrected, and thus the posture of the object can be measured with high accuracy. - Next, a posture estimation method according to the second embodiment will be described with reference to
FIG. 4. - The posture estimation method of the present embodiment is the same as that of the first embodiment except that the predetermined operation of the object includes a plurality of operations and the corresponding bias values BW, BA and variance values PWW, PVV are stored in the
ROM 30 according to the types of the operations. Differences from the first embodiment described above will be mainly described, and the description of similar matters will be omitted. - As shown in
FIG. 4, the posture estimation method according to the present embodiment includes a measurement step, a storage step, an initial value selection step, an initial value setting step, an angular velocity and acceleration measurement step, an initial setting value updating step, and a posture estimation step. - First, in step S201, the
inertial measurement unit 10 is installed at a mounting position associated with a first operation of the object, to measure the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14. - In step S202, the measured bias value BW and variance value PWW of the
angular velocity sensor 12 and the measured bias value BA and variance value PVV of the acceleration sensor 14 are stored in the ROM 30 that is the storage unit as the bias values BW, BA and the variance values PWW, PVV of the first operation. - Next, the
inertial measurement unit 10 is installed at a mounting position associated with a second operation of the object, and steps S201 and S202 are repeated. Steps S201 and S202 are repeated for the plurality of operations of the object, that is, according to the types of the operations. - After acquiring the bias values BW, BA and the variance values PWW, PVV of the
sensors, in step S203 the processing unit 20 receives an operation selection instruction from a user. - In step S204, upon receiving a reset instruction from the user, the
processing unit 20 reads the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 associated with the selected operation from the ROM 30 and sets the read values as the initial setting values of the bias error. - Step S205 is similar to step S104 of the first embodiment, and thus the description thereof is omitted.
- Step S206 is similar to step S105 of the first embodiment, and thus the description thereof is omitted.
- Step S207 is similar to step S106 of the first embodiment, and thus the description thereof is omitted.
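The flow of steps S201 to S207 (and, apart from the operation selection, steps S101 to S106 of the first embodiment) can be sketched as follows. The calibration-store layout, the sampling rate, and the exact combination rule are illustrative assumptions; the specification states only that the stored bias and variance values are added to the data measured during the 200 msec stationary period.

```python
import numpy as np

# Hypothetical per-operation calibration store (steps S201-S202): bias and
# variance values measured in advance at each mounting position. The key
# names and numbers are illustrative, not taken from the specification.
CALIBRATIONS = {
    "raise_bucket": {
        "BW": np.array([2e-3, -1e-3, 5e-4]),   # angular velocity bias [rad/s]
        "PWW": np.array([1e-6, 1e-6, 1e-6]),   # angular velocity variance
        "BA": np.array([1e-2, -2e-2, 5e-3]),   # acceleration bias [m/s^2]
        "PVV": np.array([1e-4, 1e-4, 1e-4]),   # acceleration variance
    },
}

def initial_values(operation):
    """Steps S203-S204: read the stored values for the selected operation
    and use them as the initial setting values of the bias error."""
    c = CALIBRATIONS[operation]
    return c["BW"].copy(), c["PWW"].copy(), c["BA"].copy(), c["PVV"].copy()

def update_initial_values(BW, PWW, PVV, gyro, accel):
    """Steps S205-S206 (S104-S105): combine the stored values with a short
    stationary measurement. BA is left unchanged, as in step S105."""
    BW2 = BW + gyro.mean(axis=0)     # stored bias + measured angular velocity
    PWW2 = PWW + gyro.var(axis=0)    # stored variance + measured variance
    PVV2 = PVV + accel.var(axis=0)
    return BW2, PWW2, PVV2

# Usage: 200 ms of stationary samples at an assumed 100 Hz -> 20 samples.
BW, PWW, BA, PVV = initial_values("raise_bucket")
gyro = np.full((20, 3), 1e-3)               # stationary gyro readings
accel = np.tile([0.0, 0.0, 9.81], (20, 1))  # stationary accel readings
BW2, PWW2, PVV2 = update_initial_values(BW, PWW, PVV, gyro, accel)
```

The updated values then seed the Kalman filter of the posture estimation step (S207/S106).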
- With such a configuration, even though the
inertial measurement unit 10 is installed at mounting positions corresponding to a plurality of operations of the object, the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 can be selected according to the types of the operations, and thus the bias error in the initial state can be corrected and the posture of the object can be measured with high accuracy. - Next, a
movable device 600 including the posture estimation device 1 according to the third embodiment will be described with reference to FIGS. 5 and 6. - The posture estimation device 1 of the above embodiment can be effectively used in posture control of a construction machine.
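The correction stage described earlier (units 104 to 107: the observation residual Δza of the gravitational acceleration, the Kalman coefficient K, correction of the posture quaternion q, normalization, and correction of the error covariance matrix) can be sketched as a generic error-state update. The three-dimensional attitude-error state, the small-angle fold-in, the observation matrix, and the noise values here are assumptions for illustration, not the specification's exact formulation.

```python
import numpy as np

def quat_mult(q1, q2):
    """Hamilton product of quaternions given as [w, x, y, z]."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def world_to_body(q, v):
    """Express a world-frame vector v in the body frame of unit quaternion q."""
    w, x, y, z = q
    R = np.array([  # body-to-world rotation matrix of q
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    return R.T @ v

G = np.array([0.0, 0.0, 9.80665])  # gravitational acceleration vector g

def gravity_update(q, P, H, R_meas, a_meas):
    """One accelerometer measurement update: residual Δza, Kalman
    coefficient K, state and covariance correction, and normalization."""
    dz = a_meas - world_to_body(q, G)           # observation residual Δza
    S = H @ P @ H.T + R_meas
    K = P @ H.T @ np.linalg.inv(S)              # Kalman coefficient K
    dx = K @ dz                                 # error-state correction
    dq = np.concatenate(([1.0], 0.5 * dx[:3]))  # small-angle quaternion
    q = quat_mult(q, dq)
    q /= np.linalg.norm(q)                      # normalization (unit 106)
    P = (np.eye(P.shape[0]) - K @ H) @ P        # covariance correction (unit 107)
    return q, P

# Level, stationary sensor: the residual is zero and the posture is unchanged,
# while the covariance shrinks.
q0 = np.array([1.0, 0.0, 0.0, 0.0])
P0 = np.eye(3) * 1e-2
q1, P1 = gravity_update(q0, P0, np.eye(3), np.eye(3) * 0.1, G.copy())
```

The identity observation matrix in the usage line is a placeholder; a full implementation would derive H by linearizing the gravity observation about the predicted state.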
FIGS. 5 and 6 show a hydraulic shovel that is an example of the construction machine as the movable device 600. - As shown in
FIG. 5, in the movable device 600, a vehicle body includes a lower running body 612 and an upper revolving body 611 that is mounted to revolve above the lower running body 612. A work mechanism 620 is provided at a front portion of the upper revolving body 611. The work mechanism 620 includes a plurality of members that are pivotable in an up-and-down direction. A driver seat (not shown) is provided in the upper revolving body 611, and an operation device (not shown) for operating the members constituting the work mechanism 620 is provided at the driver seat. An inertial measurement unit 10d functioning as an inclination sensor that detects an inclination angle of the upper revolving body 611 is disposed in the upper revolving body 611. - The
work mechanism 620 includes a boom 613, an arm 614, a bucket link 616, a bucket 615, a boom cylinder 617, an arm cylinder 618, and a bucket cylinder 619 as the plurality of members. The boom 613 is attached to the front portion of the upper revolving body 611 to move up and down. The arm 614 is attached to a top side of the boom 613 to move up and down. The bucket link 616 is attached to a top side of the arm 614 to be pivotable. The bucket 615 is attached to top sides of the arm 614 and the bucket link 616 to be pivotable. The boom cylinder 617 drives the boom 613. The arm cylinder 618 drives the arm 614. The bucket cylinder 619 drives the bucket 615 through the bucket link 616. - A base end of the
boom 613 is supported by the upper revolving body 611 to be pivotable in the up-and-down direction, and is rotationally driven relative to the upper revolving body 611 by expansion and contraction of the boom cylinder 617. An inertial measurement unit 10c functioning as an inertial sensor that detects a motion state of the boom 613 is disposed in the boom 613. - One end of the
arm 614 is supported by the top side of the boom 613 to be rotatable. The arm 614 is rotationally driven relative to the boom 613 by expansion and contraction of the arm cylinder 618. An inertial measurement unit 10b functioning as an inertial sensor that detects a motion state of the arm 614 is disposed in the arm 614. - The
bucket link 616 and the bucket 615 are supported by the top side of the arm 614 to be pivotable. The bucket link 616 is rotationally driven relative to the arm 614 by expansion and contraction of the bucket cylinder 619, and the bucket 615 is rotationally driven relative to the arm 614 together with the bucket link 616. An inertial measurement unit 10a functioning as an inertial sensor that detects a motion state of the bucket link 616 is disposed in the bucket link 616. - The
inertial measurement units 10a to 10d are mounted on the work mechanism 620 or the upper revolving body 611. According to the mounting positions on the upper revolving body 611, the boom 613, the arm 614, and the bucket 615 that operate differently, the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 are actually measured before mounting. As shown in FIG. 6, the inertial measurement units 10a to 10d are connected to the calculation device 630. - Further, as shown in
FIG. 5, the movable device 600 is provided with the calculation device 630 that calculates an inclination angle of the upper revolving body 611 or positions or postures of the boom 613, the arm 614, and the bucket 615 constituting the work mechanism 620. As shown in FIG. 6, the calculation device 630 includes the posture estimation device 1 in the above embodiment and a control device 632. The posture estimation device 1 estimates posture information on the movable device 600 based on output signals of the inertial measurement units 10a to 10d. The control device 632 controls the posture of the movable device 600 based on the posture information on the movable device 600 estimated by the posture estimation device 1. Specifically, the calculation device 630 receives various detection signals input from the inertial measurement units 10a to 10d, and calculates positions or postures of the boom 613, the arm 614, and the bucket 615 or an inclination state of the upper revolving body 611 based on the various detection signals. The calculated position-and-posture signal including the posture angles of the boom 613, the arm 614, and the bucket 615 or an inclination signal including the posture angle of the upper revolving body 611, for example, the position-and-posture signal of the bucket 615, is used as feedback information for display on a monitoring device (not shown) at the driver seat or for controlling an operation of the work mechanism 620 or the upper revolving body 611.
- According to the present embodiment, with the posture estimation device 1, it is possible to obtain information on a posture with high accuracy, and thus it is possible to implement appropriate posture control of the
movable device 600. With the movable device 600, the inertial measurement units 10a to 10d that store the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 according to the mounting position are mounted on the upper revolving body 611, the boom 613, the arm 614, and the bucket 615 that operate differently, so that the bias error in the initial state can be corrected, and thus the posture of the object can be measured with high accuracy. - In the present embodiment, descriptions are made by using a four-wheel vehicle such as an agricultural machine and the construction machine as an example of the movable device in which the posture estimation device 1 is used. However, the movable device may additionally be a motorcycle, a bicycle, a train, an airplane, a biped robot, a remote-controlled or autonomous aircraft (such as a radio-controlled airplane, a radio-controlled helicopter, or a drone), a rocket, a satellite, a ship, or an automated guided vehicle (AGV). The predetermined operation described above may be an operation of raising the
bucket 615, an operation of lowering the bucket 615, or revolving or movement. The plurality of operations described above may be any two of the operation of raising the bucket 615, the operation of lowering the bucket 615, revolving, movement, and the like. The predetermined operation may be a predetermined operation of a robot arm or a drone. Further, the predetermined operation may be a walking operation of a biped robot. The bias value BW and the variance value PWW of the angular velocity sensor 12 may be different values or the same values among the inertial measurement units 10a to 10d. In the present embodiment, the movable device 600 has four inertial measurement units 10a to 10d.
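As one hypothetical example of post-processing, the inclination angle that the inertial measurement unit 10d provides for the upper revolving body 611 could be derived from the estimated posture quaternion q as the tilt of the body z-axis from the vertical; the specification states only that an inclination angle is calculated from the posture information, so this formula is an assumption.

```python
import numpy as np

def inclination_angle_deg(q):
    """Tilt of the body z-axis from the vertical, in degrees, for a unit
    posture quaternion q = [w, x, y, z]. A hypothetical post-processing
    step, not a formula given in the specification."""
    w, x, y, z = q
    # Body z-axis expressed in the world frame (third column of R(q))
    bz = np.array([2*(x*z + w*y), 2*(y*z - w*x), 1 - 2*(x*x + y*y)])
    return float(np.degrees(np.arccos(np.clip(bz[2], -1.0, 1.0))))

# A 10-degree roll about the body x-axis gives a 10-degree inclination.
half = np.radians(10.0) / 2.0
q_roll10 = np.array([np.cos(half), np.sin(half), 0.0, 0.0])
tilt = inclination_angle_deg(q_roll10)
```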
- The above-described embodiments and modifications are merely examples, and the present disclosure is not limited thereto. For example, it is also possible to appropriately combine embodiments and modifications.
- The present disclosure includes a configuration substantially the same as the configuration described in the embodiments, for example, a configuration having the same function, method, and result, or a configuration having the same purpose and effect. The present disclosure includes a configuration in which a non-essential portion of the configuration described in the embodiments is replaced. The present disclosure includes a configuration having the same function and effect as the configuration described in the embodiments, or a configuration capable of achieving the same purpose. The present disclosure includes a configuration in which a known technique is added to the configuration described in the embodiments.
Claims (10)
1. A posture estimation method comprising:
measuring a bias value BW and a variance value PWW of an angular velocity sensor and a bias value BA and a variance value PVV of an acceleration sensor in a predetermined operation of an object and storing the measured values in a storage unit;
reading the bias value BW, the variance value PWW, the bias value BA, and the variance value PVV from the storage unit and setting the read values as initial setting values during reset;
measuring an angular velocity and an acceleration by the angular velocity sensor and the acceleration sensor in a stationary state of the object;
updating the bias value BW and the variance value PWW of the angular velocity sensor and the variance value PVV of the acceleration sensor from the initial setting values; and
estimating a posture of the object using a Kalman filter from an output of the angular velocity sensor and an output of the acceleration sensor.
2. The posture estimation method according to claim 1 , wherein
the predetermined operation is the most frequently performed one of the operations of the object.
3. The posture estimation method according to claim 1 , wherein
the predetermined operation includes a plurality of operations, and the bias value BW, the variance value PWW, the bias value BA, and the variance value PVV corresponding to one of the plurality of operations according to a type of the plurality of operations are read from the storage unit.
4. The posture estimation method according to claim 1 , wherein
measurement time in the stationary state is 200 msec.
5. A posture estimation device comprising:
a storage unit configured to store a bias value BW and a variance value PWW of an angular velocity sensor and a bias value BA and a variance value PVV of an acceleration sensor in a predetermined operation of an object; and
a processing unit configured to estimate a posture of the object using a Kalman filter from an output of the angular velocity sensor and an output of the acceleration sensor.
6. The posture estimation device according to claim 5 , wherein
the predetermined operation is the most frequently performed one among the operations of the object.
7. The posture estimation device according to claim 5 , wherein
the predetermined operation includes a plurality of operations, and
the storage unit stores the bias value BW, the variance value PWW, the bias value BA, and the variance value PVV corresponding to the plurality of operations.
8. A posture estimation device comprising:
a storage unit configured to store, in association with each of a plurality of movements of the object, a bias value BW and a variance value PWW of an angular velocity sensor and a bias value BA and a variance value PVV of an acceleration sensor; and
a processing unit configured to estimate a posture of the object using a Kalman filter from an output of the angular velocity sensor and an output of the acceleration sensor.
9. A movable device comprising:
the posture estimation device according to claim 5 ; and
a control device configured to control the posture of the object based on posture information on the object estimated by the posture estimation device.
10. A movable device comprising:
the posture estimation device according to claim 8 ; and
a control device configured to control the posture of the object based on posture information on the object estimated by the posture estimation device.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021021507A JP2022123999A (en) | 2021-02-15 | 2021-02-15 | Posture estimation method, posture estimation device, and movable device |
JP2021-021507 | 2021-02-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220259833A1 true US20220259833A1 (en) | 2022-08-18 |
Family
ID=82802034
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/670,577 Pending US20220259833A1 (en) | 2021-02-15 | 2022-02-14 | Posture Estimation Method, Posture Estimation Device, And Movable Device |
Country Status (2)
Country | Link |
---|---|
US (1) | US20220259833A1 (en) |
JP (1) | JP2022123999A (en) |
2021
- 2021-02-15 JP JP2021021507A patent/JP2022123999A/en active Pending
2022
- 2022-02-14 US US17/670,577 patent/US20220259833A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
JP2022123999A (en) | 2022-08-25 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SEIKO EPSON CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YODA, KENTARO;REEL/FRAME:058997/0784 Effective date: 20220113 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |