WO2005062984A2 - Road curvature estimation system - Google Patents
Road curvature estimation system
- Publication number
- WO2005062984A2 (PCT/US2004/043695)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- curvature
- measure
- road
- vehicle
- responsive
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/50—Systems of measurement based on relative movement of target
- G01S13/58—Velocity or trajectory determination systems; Sense-of-movement determination systems
-
- G01S13/66—Radar-tracking systems; Analogous systems
- G01S13/72—Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
- G01S13/723—Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
-
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G01S2013/932—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles using own vehicle data, e.g. ground speed, steering wheel direction
Definitions
- Fig. 1 illustrates a block diagram of hardware associated with a predictive collision sensing system
- Fig. 2 illustrates a coverage pattern of a radar beam used by the predictive collision sensing system
- Fig. 3 depicts a driving scenario for purposes of illustrating the operation of the predictive collision sensing system
- Fig. 4 illustrates a block diagram of the hardware and an associated signal processing algorithm of the predictive collision sensing system
- Fig. 5 illustrates a flow chart of an associated signal processing algorithm of the predictive collision sensing system
- Fig. 6 illustrates a geometry used for determining curvature parameters of a roadway
- Fig. 7 illustrates the geometry of an arc
- Figs. 8a-d illustrate an example of the estimation of target position, lateral velocity, and road curvature parameters for a straight roadway
- Figs. 9a-b illustrate an example of the target state RMS errors from unconstrained and constrained filtering on the straight roadway, corresponding to Figs. 8a-d
- Figs. 10a-d illustrate an example of the estimation of target position, lateral velocity, and road curvature parameters for a curved roadway
- Figs. 11a-b illustrate an example of the target state RMS errors from unconstrained and constrained filtering for the curved roadway, corresponding to Figs. 10a-d
- Figs. 12a-d illustrate an example of the estimation of target position, lateral velocity, and associated RMS errors for a straight roadway involving a lane change
- Figs. 13a-d illustrate an example of the estimation of target position, lateral velocity, and their RMS errors for a curved roadway involving a lane change
- Fig. 14 illustrates a block diagram of hardware associated with another embodiment of a predictive collision sensing system
- Fig. 15 illustrates a free-body diagram of a steered wheel
- Fig. 16a illustrates a geometry of a bicycle model of a vehicle undergoing a turn
- Fig. 16b illustrates a geometry of the steered wheel illustrated in Fig. 16a.
- Fig. 17 illustrates a switching curve
- Fig. 18 illustrates a flow chart of a process associated with the switching curve illustrated in Fig. 17;
- Fig. 19 illustrates a block diagram of a road curvature estimation subsystem for estimating road curvature from host vehicle state estimates;
- Fig. 20 illustrates a curvature filter associated with a first embodiment of a curvature estimator;
- Fig. 21 illustrates a curvature filter associated with a fourth embodiment of a curvature estimator;
- Fig. 22 illustrates various types of roads and associated road models;
- Fig. 23 illustrates a block diagram of a tenth embodiment of a road curvature estimation subsystem;
- Fig. 24 illustrates a flow chart of an interacting multiple model algorithm;
- a predictive collision sensing system 10 incorporated in a host vehicle 12 comprises a radar system 14 for sensing objects external to the host vehicle 12, and a set of sensors, including a yaw rate sensor 16 and a speed sensor 18
- the radar system 14, e.g. a Doppler radar system, comprises an antenna 20 and a radar processor 22, wherein the radar processor 22 generates the RF signal that is transmitted by the antenna 20 and that is reflected by objects in view thereof.
- the radar processor 22 demodulates the associated reflected RF signal that is received by the antenna 20, and detects a signal that is responsive to one or more objects that are irradiated by the RF signal transmitted by the antenna 20.
- the radar system 14 provides target range, range rate and azimuth angle measurements in host vehicle 12 fixed coordinates.
- the antenna 20 is adapted to generate a radar beam 23 of RF energy that is, for example, either electronically or mechanically scanned across an azimuth range, e.g. ±50 degrees, responsive to a beam control element 24. The radar beam 23 has a distance range, e.g. about 100 meters, from the host vehicle 12 that is sufficiently far to enable a target to be detected sufficiently in advance of a prospective collision with the host vehicle 12 to enable a potentially mitigating action to be taken by the host vehicle 12, either to avoid the prospective collision or to mitigate damage or injury resulting therefrom.
- the radar processor 22, yaw rate sensor 16, and speed sensor 18 are operatively connected to a signal processor 26 that operates in accordance with an associated predictive collision sensing algorithm to determine whether or not a collision with an object, e.g. a target vehicle 36 (illustrated in Fig. 3), is likely, and if so, to also determine an action to be taken responsive thereto, for example, one or more of activating an associated warning system 28 or safety system 30 (e.g. frontal air bag system), or using a vehicle control system 32 (e.g. an associated braking or steering system) to take evasive action so as to either avoid the prospective collision or to reduce the consequences thereof.
- the host vehicle 12 is shown moving along a multiple lane roadway 34, either straight or curved, and there is also shown a target vehicle 36 moving in an opposite direction, towards the host vehicle 12.
- target vehicles 36 can either be in the host lane 38 or in a neighboring lane 40 either adjacent to or separated from the host lane 38, but generally parallel thereto.
- the host vehicle 12 moves along the center line 41 of its lane 38 steadily without in-lane wandering, and the road curvatures of all the parallel lanes 38, 40 are the same.
- the predictive collision sensing system 10 uses the measurements of speed U^h and yaw rate ω^h of the host vehicle 12 from the speed sensor 18 and the yaw rate sensor 16 respectively; the measurements of target range r, range rate ṙ, and azimuth angle θ for all target vehicles 36 from the radar system 14 mounted on the host vehicle 12; and the corresponding error covariance matrices of all these measurements, to estimate each target's two-dimensional position, velocity and acceleration [x, ẋ, ẍ, y, ẏ, ÿ] in the host fixed coordinate system at every sampling instance, preferably with an error as small as possible.
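As a sketch of how a radar range/azimuth measurement maps into the host-fixed Cartesian frame used for the target state (the function name and the axis convention, x ahead of the host and y to the side, are illustrative assumptions, not the patent's):

```python
import math

def radar_to_cartesian(r, theta):
    """Convert a radar range r (m) and azimuth theta (rad) into
    host-fixed Cartesian coordinates (axis convention assumed)."""
    x = r * math.cos(theta)  # longitudinal distance ahead of the host
    y = r * math.sin(theta)  # lateral offset from the host heading
    return x, y

# a target 100 m away, 10 degrees off the host heading
x, y = radar_to_cartesian(100.0, math.radians(10.0))
```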
- the predictive collision sensing system 10 comprises 1) a road curvature estimation subsystem 42 for estimating the curvature of the roadway 34 using measurements from the host vehicle motion sensors, i.e. the yaw rate sensor 16 and speed sensor 18; 2) an unconstrained target state estimation subsystem 44 for estimating the state of a target illuminated by the radar beam 23 and detected by the radar processor 22; 3) a constrained target state estimation subsystem 46 for estimating the state of the constraint on the target, assuming that the target is constrained to be on the roadway 34, either in the host lane 38 or in a neighboring lane 40, for each possible lane 38, 40; 4) a target state decision subsystem 48 for determining whether the best estimate of the target state is either the unconstrained target state, or a target state constrained by one of the constraints; and 5) a target state fusion subsystem 50 for fusing the unconstrained target state estimate with the appropriate constraint identified by the target state decision subsystem 48 so as to generate a fused target state.
- the best estimate of target state is then used by a decision or control subsystem for determining whether or not the host vehicle 12 is at risk of collision with the target, and if so, for determining and effecting what the best course of action is to mitigate the consequences thereof, e.g. by action of either the warning system 28, the safety system 30, or the vehicle control system 32, or some combination thereof.
- a decision or control subsystem for determining whether or not the host vehicle 12 is at risk of collision with the target, and if so, for determining and effecting what the best course of action is to mitigate the consequences thereof, e.g. by action of either the warning system 28, the safety system 30, or the vehicle control system 32, or some combination thereof.
- the use of the geometric structure of the roadway 34 as a constraint on the target kinematics provides for a more accurate estimate of the target state, which thereby improves the reliability of any actions taken in response. Referring also to Fig. 5, illustrating a method 500 of detecting the state, i.e.
- the kinematic state variables, of a target in view of the host vehicle 12, the steps of which are, for example, carried out by the signal processor 26: in steps (502) and (504), the speed U^h and yaw rate ω^h of the host vehicle 12 relative to the roadway 34 are read from the speed sensor 18 and the yaw rate sensor 16 respectively.
- in step (506), the curvature parameters of the roadway 34 and the associated covariance thereof are estimated using first 52 and second 54 Kalman filters that respectively estimate the state of the host vehicle 12 and the curvature parameters.
- a well-designed and constructed roadway 34 can be described by a set of parameters, including curvature, wherein the curvature of a segment of the roadway 34 is defined as:
- C = 1/R, where R is the radius of the segment.
- the curvature variation can be described as a function of a distance l along the roadway 34 by a so-called clothoid model, i.e.: C(l) = C0 + C1·l
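The clothoid model, and the small-angle approximation of the lateral road position that the road constraint uses later, can be sketched as follows (the function names, the lane-offset parameter, and the example values are illustrative assumptions):

```python
def clothoid_curvature(l, C0, C1):
    """Curvature at arc length l under the clothoid model C(l) = C0 + C1*l."""
    return C0 + C1 * l

def lane_offset(x, C0, C1, Bj=0.0):
    """Small-angle approximation of the lateral road position at
    longitudinal distance x: y = Bj + C0*x^2/2 + C1*x^3/6, where
    Bj is the lane-center offset (0 for the host lane). Illustrative."""
    return Bj + C0 * x ** 2 / 2.0 + C1 * x ** 3 / 6.0

# a constant-curvature curve of radius 500 m (C0 = 1/500, C1 = 0):
# at x = 100 m the road centerline has drifted 10 m laterally
y = lane_offset(100.0, 1.0 / 500.0, 0.0)
```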
- the road curvature parameters C0 and C1 are estimated using data from motion sensors (yaw rate sensor 16 and speed sensor 18) in the host vehicle 12, based upon the assumption that the host vehicle 12 moves along the center line 41 of the roadway 34 or associated host lane 38.
- the road curvature parameters C0 and C1 can be calculated from the values of ω, ω̇, U and U̇ responsive to measurements of yaw rate ω^h and speed U^h of the host vehicle 12 from the available host vehicle 12 motion sensors. However, generally the measurements of yaw rate ω^h and speed U^h, from the yaw rate sensor 16 and speed sensor 18 respectively, are noisy.
- a host state filter implemented by a first Kalman filter 52 is beneficial to generate estimates of ω, ω̇, U and U̇ from the associated noisy measurements of yaw rate ω^h and speed U^h; after which a curvature filter implemented by a second Kalman filter 54 is used to generate smoothed estimates of the curvature parameters C0 and C1.
- the dynamics of the host vehicle 12 for the host state filter follows a predefined set of kinematic equations (constant velocity in this case), with the associated measurement equation given by: z_k^h = H^h · x_k^h + ε_k, ε_k ~ N(0, R_k^h) (10), where H^h is the measurement matrix and R_k^h is the measurement noise covariance.
- the first Kalman filter 52 is implemented to estimate the host state x^h and its error covariance P^h, as illustrated in Fig. 4.
- the estimate of the host state from the first Kalman filter 52 i.e. the host state filter, is then used to generate a synthetic measurement that is input to the second Kalman filter 54, i.e. curvature coefficient (or parameter) filter, wherein the associated Kalman filters 52, 54 operate in accordance with the Kalman filtering process described more fully in the Appendix hereinbelow.
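A minimal sketch of the predict/update cycle shared by both Kalman filters described above; the matrices in the example are generic placeholders, not the patent's F^h, Q^h, H^h, R^h:

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of a linear Kalman filter."""
    # predict with the process model
    x = F @ x
    P = F @ P @ F.T + Q
    # update with the measurement z
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# scalar example: track a constant value observed with unit-variance noise
F = np.eye(1); Q = 1e-4 * np.eye(1); H = np.eye(1); R = np.eye(1)
x = np.zeros(1); P = 10.0 * np.eye(1)
for z in [1.1, 0.9, 1.0, 1.05]:
    x, P = kalman_step(x, P, np.array([z]), F, Q, H, R)
```

After a few updates the state settles near the measured value and the covariance shrinks, which is the smoothing behavior the host state and curvature filters rely on.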
- Δt is the update time period of the second Kalman filter 54, and the values of the elements of the measurement vector z_k are given by the corresponding values of the state variables, i.e. the clothoid parameters C0 and C1, of the curvature filter.
- the measurement z_k is transformed from the estimated state [U, U̇, ω, ω̇] as follows:
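One standard form of this transformation, assuming C0 = ω/U and C1 obtained by differentiating C = ω/U with respect to arc length (d/dl = (1/U)·d/dt); the patent's exact expression may differ in detail:

```python
def curvature_measurement(U, Udot, w, wdot):
    """Synthetic curvature measurement [C0, C1] from the estimated
    host state [U, Udot, w, wdot] (speed, acceleration, yaw rate,
    yaw acceleration). One standard form, assumed for illustration:
        C0 = w / U
        C1 = (wdot*U - w*Udot) / U**3   # d(w/U)/dt divided by U
    """
    C0 = w / U
    C1 = (wdot * U - w * Udot) / U ** 3
    return C0, C1

# steady 15.5 m/s with a constant 0.031 rad/s yaw rate:
# a constant-curvature road with C0 = 1/500 and C1 = 0
C0, C1 = curvature_measurement(15.5, 0.0, 0.031, 0.0)
```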
- alternative estimates of the curvature parameters of the roadway 34 may be substituted in the road curvature estimation subsystem 42 for those described above.
- the curvature parameters of the roadway may also be estimated from images of the roadway 34 by a vision system, either instead of or in conjunction with the above described system based upon measurements of speed U^h and yaw rate ω^h from associated motion sensors.
- yaw rate can be either measured or determined in a variety of ways, or using a variety of means, for example, but not limited to, using a yaw gyro sensor, a steering angle sensor, a differential wheel speed sensor, or a GPS-based sensor; a combination thereof; or functions of measurements therefrom (e.g. a function of, inter alia, steering angle rate).
- the measurements of target range r, range rate ṙ, and azimuth angle θ are read from the radar processor 22, and are used as inputs to an extended Kalman filter 56, i.e.
- the main filter, which, in step (510), generates estimates of the unconstrained target state, i.e. the kinematic state variables of the target, which estimates are relative values in the local coordinate system of the host vehicle 12 (i.e. the host-fixed coordinate system), which moves therewith.
- the unconstrained target state is transformed to absolute coordinates of the absolute coordinate system fixed on the host vehicle 12 at the current instant of time as illustrated in Fig. 3, so as to be consistent with the absolute coordinate system in which the road constraint equations are derived and for which the associated curvature parameters are assumed to be constant, when used in the associated constraint equations described hereinbelow in order to generate estimates of the constrained target state.
- the absolute coordinate system superimposes the moving coordinate system in space at the current instant of time, so that the transformation in step (512) is realized by adding velocity and acceleration related correction terms, accounting for the motion of the host vehicle 12, to the corresponding target estimates, in both x and y directions.
- the result from the coordinate transformation in step (512) of the output from the extended Kalman filter 56 is then partitioned into the following parts, corresponding respectively to the x and y position of the target vehicle 36 relative to the host vehicle 12, wherein the superscript 1 refers to the unconstrained target state of the target vehicle 36:
- following steps (506) and (512), in steps (514) through (524), described more fully hereinbelow, various constraints on the possible trajectory of the target vehicle 36 are applied and tested to determine if the target vehicle 36 is likely traveling in accordance with one of the possible constraints.
- the constraints are assumed to be from a set of lanes that includes the host lane 38 and possible neighboring lanes 40, and a target vehicle 36 that is likely traveling in accordance with one of the possible constraints would likely be traveling on either the host lane 38 or one of the possible neighboring lanes 40.
- in step (524), the hypothesis that the target vehicle 36 is traveling on either the host lane 38 or one of the possible neighboring lanes 40 is tested for each possible lane.
- in step (526), the state of the target is assumed to be the unconstrained target state, which is then used for subsequent predictive crash sensing analysis and control responsive thereto. Otherwise, from step (524), in step (528), the target state is calculated by the target state fusion subsystem 50 as the fusion of the unconstrained target state with the associated state of the constraint that was identified in step (524) as being most likely.
- prior to discussing the process of steps (514) through (524) for determining whether the target is likely constrained by a constraint, and if so, what the most likely constraint is, the process of fusing the unconstrained target state with the state of a constraint will first be described for the case of a target vehicle 36 moving in the same lane as the host vehicle 12.
- the constraints are applied in the y-direction and are derived from road equations where y- direction state variables are functions of x-direction state variables, consistent with the assumptions that the host vehicle 12 moves along the center line 41 of its lane 38 steadily without in-lane wandering and that the road curvatures of all the parallel lanes 38, 40 are the same, and given that the absolute coordinate system is fixed on the host vehicle 12 at the current instant of time.
- the constraint state variables are then given in terms of the lateral kinematic variable as:
- in step (528), the two y-coordinate estimates, one from the main filter and the other from the road constraint, are then fused as follows:
- in step (530), this composed estimate would then be output as the estimate of the target state if the target vehicle 36 were determined from steps (514) through (524) to be traveling in the host lane 38.
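The fusion step can be sketched as a covariance-weighted combination of the two estimates. This is a scalar sketch under an independence assumption; the patent's equations (29) and (30), which are not reproduced here, may treat cross-correlation differently:

```python
def fuse(y1, P1, y2, P2):
    """Covariance-weighted fusion of two scalar estimates, e.g. the
    unconstrained (main filter) estimate and the road-constraint
    estimate. Assumes the two errors are independent (illustrative)."""
    Pf = 1.0 / (1.0 / P1 + 1.0 / P2)   # fused variance
    yf = Pf * (y1 / P1 + y2 / P2)      # fused estimate
    return yf, Pf

# a noisy unconstrained estimate (variance 4) fused with a tighter
# constrained estimate (variance 1): the result leans toward the latter
yf, Pf = fuse(1.0, 4.0, 0.0, 1.0)
```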
- the knowledge of which lane the target vehicle 36 is currently in is generally not available, especially when the target is moving on a curved roadway 34.
- the constraint jump process is a Markov chain with known transition probabilities. To implement the Markov model, for systems with more than one possible constraint state, it is assumed that at each scan time there is a probability p_ij that the target will make the transition from constraint state i to state j. These probabilities are assumed to be known a priori and can be expressed in a probability transition matrix, wherein each row corresponds to the current constraint state and each column to a possible new state.
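An illustrative transition matrix for a three-lane situation, with made-up probabilities (the patent assumes these are known a priori but does not supply values here), together with the one-scan propagation of the constraint-state probabilities:

```python
# rows = current constraint state (left lane, host lane, right lane),
# columns = next constraint state; each row sums to 1 (illustrative values)
P_TRANS = [
    [0.90, 0.10, 0.00],
    [0.05, 0.90, 0.05],
    [0.00, 0.10, 0.90],
]

def predicted_mode_probs(mu, P=P_TRANS):
    """Propagate constraint-state probabilities one scan ahead:
    mu'_j = sum_i mu_i * p_ij (the Markov prediction step)."""
    n = len(P)
    return [sum(mu[i] * P[i][j] for i in range(n)) for j in range(n)]

# a target known to be in the host lane can jump to a neighbor next scan
mu = predicted_mode_probs([0.0, 1.0, 0.0])
```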
- the constrained target state estimation subsystem 46 provides for determining whether the target state corresponds to a possible constrained state, and if so, then provides for determining the most likely constrained state.
- a multiple constraint (MC) estimation algorithm mixes and updates N^C constraint-conditioned state estimates using the unconstrained state estimate ŷ as a measurement.
- the constrained state estimate output is a composite combination of all of the constraint-conditioned state estimates. If this constrained state estimate is valid, i.e. if the constrained state estimate corresponds to, e.g. matches, the unconstrained state estimate, then the target state is given by fusing the constrained and unconstrained state estimates; otherwise the target state is given by the unconstrained state estimate.
- This embodiment of the multiple constraint (MC) estimation algorithm comprises the following steps:
- in step (514), using the multiple lane road equation (34) to replace the first row in equation (25), the multiple constraint state estimates are given by:
y¹ = B_j + C0·(x¹)²/2 + C1·(x¹)³/6
ẏ¹ = C0·x¹·ẋ¹ + C1·(x¹)²·ẋ¹/2
ÿ¹ = C0·(ẋ¹)² + C0·x¹·ẍ¹ + C1·x¹·(ẋ¹)² + C1·(x¹)²·ẍ¹/2 (41)
for j = 0, 1, ..., N−1
- B_j takes the values 0, ±B, ..., where B is the width of a lane.
- the constraint state estimates correspond to, e.g. match, the y locations of the centerlines of each possible lane in which the target vehicle 36 could be located.
- the updated state estimate and covariances corresponding to constraint j are obtained using the measurement ŷ¹, as follows:
- Likelihood calculation: in step (518), the likelihood function corresponding to constraint j is evaluated at the value ŷ¹ of the unconstrained target state estimate, assuming a Gaussian distribution of the measurement around the constraint-conditioned state estimate for each of the constraints j = 1, ..., N.
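A scalar sketch of this likelihood evaluation under the Gaussian assumption stated above (variable names, the unit innovation variance, and the 3.6 m lane offset in the example are illustrative):

```python
import math

def constraint_likelihood(y_unc, y_con, S):
    """Gaussian likelihood of the unconstrained estimate y_unc under a
    constraint with constraint-conditioned mean y_con and scalar
    innovation variance S (scalar form for illustration)."""
    d2 = (y_unc - y_con) ** 2 / S
    return math.exp(-0.5 * d2) / math.sqrt(2.0 * math.pi * S)

# a target whose unconstrained lateral estimate is 0.2 m is far more
# likely under the host-lane constraint (offset 0) than under a
# neighboring lane 3.6 m away
L_host = constraint_likelihood(0.2, 0.0, 1.0)
L_next = constraint_likelihood(0.2, 3.6, 1.0)
```

Normalizing such likelihoods across the lanes yields the per-lane probabilities used to pick the most likely constraint.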
- the output of the estimator from step (522) in the above algorithm is then used as the constrained estimates in the fusion process described by equations (29) and (30), and the result of equation (52), instead of the result of equation (33), is used in equation (32).
- if the target vehicle 36 is not following the roadway 34 or is changing lanes, imposing the road constraint on the target kinematic state variables will result in incorrect estimates that would be worse than using the associated unconstrained estimates. However, noise related estimation errors might cause a correct road constraint to appear invalid.
- the unconstrained target state estimate plays a useful role in road constraint validation, since it provides independent target state estimates.
- One approach is to test the hypothesis that the unconstrained target state estimate satisfies the road constraint equation, or equivalently, that the constrained estimate and the unconstrained estimate each correspond to the same target. The optimal test would require using all available target state estimates in history through time t_k and is generally not practical.
- a practical approach is sequential hypothesis testing, in which the test is carried out based on the most recent state estimates only.
- the difference between the constrained and unconstrained target state estimates is denoted ỹ_k, as the estimate of y_k − ȳ_k, where y_k is the true target state and ȳ_k is the true state of a target moving along the roadway 34 (or a lane).
- a high threshold corresponds to a low error tolerance value, and vice versa.
- targets in neighboring lanes 40 are usually regarded as passing-by vehicles.
- although constrained filtering may further reduce the false alarm rate, a "changing lane" maneuver of such a target (into the host lane 38) would pose a real threat to the host vehicle 12.
- accordingly, it is beneficial to use a low threshold, i.e. a high error tolerance value, for a target in a neighboring lane if the false alarm rate is already low enough.
- the hypothesis testing scheme efficiently uses different threshold values for targets in different lanes, with the multiple constraint filtering algorithm providing the knowledge of which lane the target is most likely in currently.
- the constrained state estimate used for the hypothesis testing is the most likely of the separate constrained target state estimates (i.e. in accordance with a "winner take all" strategy), rather than a composite combination of all of the constrained target state estimates. If this most likely constrained state estimate is valid, i.e. if it corresponds to, e.g. matches, the unconstrained state estimate, then
- the target state is given by fusing the most likely constrained state estimate and the unconstrained state estimate; otherwise the target state is given by the unconstrained state estimate.
- MC: multiple constraint estimation algorithm
- hypothesis tests are made for each of the constrained state estimates. If none of the hypotheses is accepted, then the target state is given by the unconstrained state estimate. If one of the hypotheses is accepted, then the target state is given by fusing the corresponding constrained state estimate and the unconstrained state estimate. If more than one hypothesis is accepted, then the most likely constrained state may be identified by voting results from a plurality of approaches, or by repeating the hypothesis tests with different associated thresholds. Generally, the number of constraints (i.e.
- the number of roadway lanes) can vary with respect to time, as can parameters associated therewith, for example the width of the lanes of the roadway, so as to accommodate changes in the environment of the host vehicle 12.
- for example, the host vehicle 12 in one trip could travel on a one-lane road, a two-lane road with opposing traffic, a three-lane road with a center turn lane, a four-lane road with two lanes of opposing traffic, or on a multi-lane divided freeway.
- Road vehicle tracking simulations using constrained and unconstrained filtering were carried out for four scenarios. In all scenarios, the host vehicle 12 was moving at 15.5 m/s and a target vehicle 36 was approaching on the same roadway 34 at a speed of 15.5 m/s.
- the initial position of the target was 125 meters away from the host in the x direction, and the lane width for all lanes was assumed to be 3.6 meters.
- the measurement variance of the vehicle speed sensor was 0.02 m/s and the variance of the gyroscope yaw rate measurement was 0.0063 rad/s.
- the variances of radar range, range rate and azimuth angle measurements were 0.5 m, 1 m/s, and 1.5° respectively. Simulation results were then generated from 100 Monte-Carlo runs of the associated tracking filters.
- Figs. 9a-b illustrate the average target vehicle 36 lateral position, velocity and acceleration RMS errors of the unconstrained and constrained filtering schemes.
- the estimation errors from constrained filtering were substantially reduced. Before 48 radar scans, when the target vehicle 36 was farther than 65 meters away from the host vehicle 12, constrained filtering resulted in a more than 40 percent reduction of error in target lateral velocity estimation, and a more than 60 percent reduction of error in lateral acceleration estimation.
- Figs. 10a-d illustrate the target state estimation and curvature estimation results of the unconstrained and constrained filtering schemes.
- Figs. 11a-b illustrate the average target vehicle 36 lateral position, velocity and acceleration RMS errors of the unconstrained and constrained filtering schemes.
- the estimation errors from constrained filtering were substantially reduced after about 48 radar scans, when the target vehicle 36 was less than 65 meters away from the host vehicle 12. Estimation errors were the same for constrained and unconstrained filtering before 20 radar scans, when the target vehicle 36 was about 100 meters away from the host vehicle 12. For the target vehicle 36 located between 100 and 65 meters away from the host vehicle 12, constrained filtering resulted in about a 30 percent reduction in errors of lateral velocity and acceleration estimation, and when the target vehicle 36 was less than 65 meters away from the host vehicle 12, more than 50 percent of lateral position estimation error and more than 90 percent of lateral velocity and acceleration estimation errors were reduced by constrained filtering.
- Figs. 12a-d illustrate the target state estimation results and the lateral position and velocity RMS errors of the unconstrained and constrained filtering schemes.
- the performance of constrained filtering with validation was substantially close to that of unconstrained filtering, producing slightly lower estimation errors before the target vehicle 36 turned away, and exhibiting target state estimation results and RMS errors that were the same as unconstrained filtering after the target vehicle 36 began to turn away from its lane, implying that the road constraints were promptly lifted off after the target vehicle 36 began to diverge from its lane.
- Figs. 13a-d illustrate the target state estimation results and the lateral position and velocity RMS errors of the unconstrained and constrained filtering schemes.
- the error tolerance levels were the same as in the third scenario, and the results and observations were also similar to that of the third scenario.
- Road constraints were promptly lifted off by the proposed constraint validation after the target vehicle 36 began to diverge from its lane.
- the overall improvement by constrained filtering in estimation accuracy of target vehicle 36 lateral kinematics was substantial, given the fact that estimation accuracy of target vehicle 36 lateral kinematics was often limited by poor radar angular resolution.
- simulation results of road vehicle tracking on both straight and curved roadways 34 show that the predictive collision sensing system 10 could substantially reduce the estimation errors in target vehicle 36 lateral kinematics when the target vehicles 36 were in the host lane 38.
- the predictive collision sensing system 10 promptly detects this maneuver and lifts off the road constraint to avoid an otherwise incorrect constrained result.
- the predictive collision sensing system 10 has provided for a substantial improvement in estimation accuracy of target vehicle 36 lateral kinematics, which is beneficial for an early and reliable road vehicle collision prediction. Referring to Fig.
- the predictive collision sensing system 10 further comprises a steer angle sensor 58 which provides a measure indicative of or responsive to the steer angle δ of one or more steered wheels 60.
- the steer angle δ of a particular steered wheel 60, e.g. one of the front wheels, is the angle between the longitudinal axis 62 of the vehicle 12 and the heading direction 64 of the steered wheel 60, wherein the heading direction 64 is the direction in which the steered wheel 60 rolls.
- the associated steer angles δ of different steered wheels 60, i.e. inside and outside relative to a turn, will be different.
- the turn radius R is substantially larger than the wheelbase L of the vehicle 12, in which case the associated slip angles α of the different steered wheels 60 are relatively small so that the difference therebetween for the inside and outside steered wheels 60 can be assumed to be negligible.
- the vehicle 12 can be represented by what is known as a bicycle model 68 with a front 63 and rear 65 wheel, each corresponding to a composite of the associated front or rear wheels of the vehicle 12, wherein different steered wheels 60 at the front or rear of the vehicle are modeled as a single steered wheel 60 that is steered at an associated steer angle δ.
- Each wheel of the bicycle model 68, front 63 and rear 65, is assumed to generate the same lateral force responsive to an associated slip angle α as would all (e.g. both) corresponding wheels of the actual vehicle 12.
- the sum of the lateral forces i.e.
- b and c are the distances from the center of gravity CG to the front 63 and rear 65 wheels respectively, and W r is the weight of the vehicle 12 carried by the rear wheel 65.
- the lateral force F yr at the rear wheel 65 is given by the product of the portion of vehicle mass (W r /g) carried by the rear wheel 65 times the lateral acceleration at the rear wheel 65.
- the factor K = [W f /C αf − W r /C αr ], which provides the sensitivity of steer angle δ to lateral acceleration a y , and which is also referred to as an understeer gradient, consists of two terms, each of which is the ratio of the load on the wheel W f , W r (front or rear) to the corresponding cornering stiffness C αf , C αr of the associated tires.
- the cornering behavior of the vehicle 12 is classified as either neutral steer, understeer or oversteer, depending upon whether K is equal to zero, greater than zero, or less than zero, respectively.
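The bicycle-model relations above can be illustrated with a minimal numerical sketch, assuming the standard form δ = L/R + K·a_y with the lateral acceleration a_y expressed in g's and the understeer gradient K = W_f/C_αf − W_r/C_αr; the function names are illustrative, not from the specification:

```python
def understeer_gradient(W_f, W_r, C_af, C_ar):
    """Understeer gradient K = W_f/C_af - W_r/C_ar: ratio of wheel load
    (front, rear) to the corresponding tire cornering stiffness."""
    return W_f / C_af - W_r / C_ar

def bicycle_steer_angle(L, R, K, U, g=9.81):
    """Bicycle-model steer angle (rad) for turn radius R at speed U:
    delta = L/R + K * a_y, with lateral acceleration a_y = U^2/(R*g) in g's."""
    a_y = U * U / (R * g)
    return L / R + K * a_y

def steer_character(K):
    """Classify cornering behavior by the sign of K."""
    if K > 0:
        return "understeer"
    if K < 0:
        return "oversteer"
    return "neutral steer"
```

For a neutral-steer vehicle (K = 0) the steer angle reduces to the Ackermann angle L/R, independent of speed.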
- the steer angle sensor 58 can be implemented in various ways, including, but not limited to, an angular position sensor, e.g. a shaft encoder, rotary potentiometer or rotary transformer/synchro, adapted to measure the rotation of the steering wheel shaft or the input to a steering box, e.g. a pinion of a rack-and-pinion steering box; or a linear position sensor adapted to measure the position of the rack of the rack-and-pinion steering box.
- the steer angle sensor 58 could be shared with another vehicle control system, e.g.
- the steer angle sensor 58 can be used to supplement a yaw rate sensor 16, and can beneficially provide independent information about vehicle maneuvers. Furthermore, the steer angle δ measurement error is substantially independent of longitudinal speed U, in comparison with a gyroscopic yaw rate sensor 16 for which the associated yaw rate ω measurement error is related to vehicle speed, notwithstanding that a gyroscopic yaw rate sensor 16 is generally more accurate and more sensitive to vehicle maneuvers than a steer angle sensor 58 when each is used to generate a measure of yaw angle.
- the curvature error variance associated with steer angle δ measurements can be compared with that associated with yaw rate ω measurements in order to identify conditions under which one measurement is more accurate than the other.
- the curvature error variance of the yaw rate ω measurement is given by equation (97), described hereinbelow.
- The curvature error variance associated with the steer angle δ measurement is equal to the curvature error variance associated with the yaw rate ω measurement when the condition of equation (95) is satisfied. Equation (95) defines a switching curve, e.g. as illustrated in Fig. 17.
- step (502): the longitudinal speed U of the vehicle 12 is read from the speed sensor 18. Then, in step (1802), if the longitudinal speed U is greater than the speed threshold,
- the road curvature parameters are determined using the yaw rate ω, wherein the speed threshold is given by equation (95) and illustrated in Fig. 17 as a function of the error variance σ δ ² of the steer angle δ measurement, the latter of which is assumed to be constant for a given steer angle sensor 58. More particularly, from step (1802), the yaw rate ω of the vehicle 12 is read from the yaw rate sensor 16, and then in step (506), the road curvature parameters Co and Ci are estimated from (U, ω) as described hereinabove.
- step (1802): if the longitudinal speed U of the vehicle 12 is less than the speed threshold, then in step (1804) the steer angle δ is read from the steer angle sensor 58, and then in step (1806), the road curvature parameters Co and Ci are estimated from (U, δ), wherein the yaw rate ω measurement input to the first Kalman filter 52 (host state filter) can be determined from the steer angle δ and longitudinal speed U using equation (84). If the longitudinal speed U is equal to the speed threshold, then the estimates of the road curvature parameters Co and Ci from steps (506) and (1806) would have the same error variance, and either estimate could be used.
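The speed-threshold switching of steps (1802)-(1806) can be sketched as follows, assuming the standard kinematic relation C0 = ω/U for the yaw-rate path and an assumed bicycle-model form C0 = δ/(L + K·U²) as a stand-in for the relation of equations (84)/(85); all names are illustrative:

```python
def curvature_from_yaw_rate(omega, U):
    """Host-path curvature from yaw rate: C0 = omega / U."""
    return omega / U

def curvature_from_steer_angle(delta, U, L, K):
    """Assumed bicycle-model curvature from steer angle:
    delta ~= L/R + K*U^2/R  =>  C0 = 1/R = delta / (L + K*U^2)."""
    return delta / (L + K * U * U)

def select_curvature_source(U, U_threshold, omega, delta, L, K):
    """Steps (1802)-(1806): use the steer-angle measurement below the
    speed threshold, the yaw-rate measurement at or above it."""
    if U < U_threshold:
        return curvature_from_steer_angle(delta, U, L, K)
    return curvature_from_yaw_rate(omega, U)
```

At the threshold speed itself both branches have the same error variance, so either may be used.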
- the road curvature estimation subsystem 42 can be embodied in a variety of ways.
- the road curvature estimation subsystem 42 comprises a host state filter 52.1 and a curvature filter 54.1, wherein the host state filter 52.1 processes measures responsive to vehicle speed and vehicle yaw to determine the corresponding host state [ω, ω̇, U, U̇] of the host vehicle 12.
- a speed sensor 18 provides the longitudinal speed U of the vehicle 12 as the measure of vehicle speed, and either a yaw rate sensor 16 or a steer angle sensor 58, or both, provide either the yaw rate ω or the steer angle δ respectively, or both, as the measure of vehicle yaw, wherein samples k of the measurements are provided by a sampled data system at corresponding sampling times.
- the measure of vehicle yaw is given by the yaw rate ω from the yaw rate sensor 16, wherein the yaw rate ω and the longitudinal
- the measure of vehicle yaw is given by either the yaw rate from the yaw rate sensor 16 or the steer angle from the steer angle sensor 58, depending upon
- a yaw rate processor 68, e.g. embodied in the signal processor 26, calculates the yaw rate ω from the steer angle δ, e.g. using equation (85)
- the yaw rate ω from the yaw rate sensor 16 is input to the first Kalman filter 52.
- the steer angle sensor 58 and yaw rate processor 68 are used without benefit of the yaw rate sensor 16.
- the steer angle δ from the steer angle sensor 58 is input directly to the first Kalman filter 52.
- the steer angle δ from the steer angle sensor 58 is input directly to the first Kalman filter 52 that is adapted to use steer angle as the associated state variable.
- the yaw rate ω from the yaw rate sensor 16 is input to the first Kalman filter 52 if the longitudinal speed U of the vehicle 12 is less than the speed threshold.
- Kalman filter 52 that is adapted to use yaw rate as the associated state variable.
- the associated state and variance output of the host state filter 52.1 is processed by a curvature estimator 70 so as to provide estimates of the road curvature parameters Co and Ci and the error covariance associated therewith.
- the curvature estimator 70 comprises a first curvature processor 72 which transforms the associated state and covariance output of the host state filter 52.1 to either road curvature parameters Co and Ci, or another related form, comprising a measurement vector Z and an associated covariance matrix R thereof, that is either used directly as the output of the road curvature estimation subsystem 42, or is input to a second Kalman filter 54 of the curvature filter 54.1, the output of which is either used as the output of the road curvature estimation subsystem 42, or is transformed to road curvature parameters Co and Ci and the associated covariance thereof using a second curvature processor 74.
- the first 72 and second 74 curvature processors and the host state filter 52.1 of the curvature estimator 70 can be embodied in the signal processor 26.
- the curvature estimator 70 comprises the first curvature processor 72 and the second Kalman filter 54, wherein the first curvature processor 72 calculates the road curvature parameters Co and Ci from the host state [ω, ω̇, U, U̇]ᵀ for input as a measurement vector Z to the second Kalman filter 54, e.g. in accordance with equation (21).
- the first curvature processor 72 calculates the associated covariance R k of the measurement vector from the covariance P of the host state vector, e.g. in accordance with equations (21) and (22).
- the output of the road curvature estimation subsystem 42 is then given by the output of the second Kalman filter 54.
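The transformation performed by the first curvature processor 72 can be sketched with a minimal example, assuming the standard kinematic mapping C0 = ω/U and C1 = dC/ds = (ω̇·U − ω·U̇)/U³ as a stand-in for the form of equation (21); names are illustrative:

```python
def curvature_measurement(omega, omega_dot, U, U_dot):
    """Map the host state (omega, omega_dot, U, U_dot) to clothoid
    parameters.  C0 = omega/U, and differentiating C = omega/U with
    respect to arc length s (d/ds = (1/U) d/dt) gives
    C1 = (omega_dot*U - omega*U_dot) / U**3."""
    C0 = omega / U
    C1 = (omega_dot * U - omega * U_dot) / U**3
    return C0, C1
```

The associated covariance of (C0, C1) would follow from propagating the host-state covariance P through the Jacobian of this mapping.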
- the first embodiment of the curvature estimator 70 is also illustrated in Fig. 4, wherein the action of the first curvature processor 72 is implicit in the interconnection between the first 52 and second 54 Kalman filters thereof.
- a second embodiment of the curvature estimator 70 is a modification of the first embodiment, wherein the second Kalman filter 54 is adapted to incorporate a sliding window in the associated filtering process.
- the length of the sliding window is adapted so as to avoid excessive delay caused by window processing, and for example, in one embodiment, comprises about 5 samples.
- the associated vectors and matrices, referenced in the Appendix, of the associated second Kalman filter 54 are given in terms of the sliding window length L, with the associated measurements spanning samples k−L through k.
- a third embodiment of the curvature estimator 70 is a modification of the second embodiment, wherein the length L of the sliding window is adaptive.
- This rule provides for a larger window length L, e.g. as large as 25 samples, when both Co and Ci are relatively small, for example, corresponding to a straight section of road.
- the window length L becomes smaller, e.g. as small as 1 sample, when either Co or Ci is large, corresponding to a turn or transition of the road.
- the window length L can be sharply reduced to account for a sudden vehicle maneuver, with a limitation on the subsequent increase in window length L to one sample per step so that the previous samples do not adversely affect the output from the curvature estimator 70 when the maneuver ends.
- As the window length L is changed, the associated F, Q, and R matrices in the second Kalman filter 54 are also changed.
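The adaptive-window behavior described above, i.e. sharp reduction when curvature is large and growth limited to one sample per step afterwards, can be sketched as follows; the scale factors are illustrative assumptions, not the rule of the patent:

```python
def adapt_window_length(C0, C1, L_prev, L_min=1, L_max=25,
                        c0_scale=0.01, c1_scale=1e-4):
    """Sketch of an adaptive sliding-window rule: the window shrinks
    (possibly sharply) when |C0| or |C1| is large (turn or maneuver),
    and grows back by at most one sample per step on straight road.
    c0_scale/c1_scale are illustrative normalizations."""
    magnitude = abs(C0) / c0_scale + abs(C1) / c1_scale
    L_target = max(L_min, L_max - int(magnitude))
    if L_target < L_prev:
        # sharp reduction is allowed immediately
        return L_target
    # growth is limited to one sample per step
    return min(L_prev + 1, L_target, L_max)
```

As the window length changes, the associated F, Q, and R matrices of the filter would be resized accordingly.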
- the associated second Kalman filter 54 is illustrated in Fig. 21, wherein the associated vectors and matrices referenced in the Appendix are given by:
- a fifth embodiment of the curvature estimator 70 is a modification of the fourth embodiment, wherein the second Kalman filter 54 is adapted to incorporate a sliding window in the associated filtering process.
- the length of the sliding window is adapted so as to avoid excessive delay caused by window processing, and for example, in one embodiment, comprises about 5 samples.
- the associated vectors and matrices, referenced in the Appendix, of the associated second Kalman filter 54 are given by:

  F = | 1  dL |,   R k = | mean(σ²_Co(k−L:k))   mean(σ_CoCi(k−L:k)) |   (120)
      | 0   1 |          | mean(σ_CoCi(k−L:k))  mean(σ²_Ci(k−L:k))  |
- a sixth embodiment of the curvature estimator 70 is a modification of the fifth embodiment, wherein the length L of the sliding window is adaptive.
- the window length can be adapted to be responsive to the road curvature parameters Co and Ci, for example, in accordance with the rule of equation (117). This rule provides for a larger window length L, e.g. as large as 25 samples, when both Co and Ci are relatively small, for example, corresponding to a straight section of road.
- the window length L becomes smaller, e.g. as small as 1 sample, when either Co or Ci is large, corresponding to a turn or transition of the road. Furthermore, the window length L can be sharply reduced to account for a sudden vehicle maneuver, with a limitation on the subsequent increase in window length L to one sample per step so that the previous samples do not adversely affect the output from the curvature estimator 70 when the maneuver ends. As the window length L is changed, the associated F, Q, and R matrices in the second Kalman filter 54 are also changed.
- the curvature estimator 70 comprises the first curvature processor 72, which calculates the road curvature parameters Co and Ci from the host state [ω, ω̇, U, U̇]ᵀ as the output of the road curvature estimation subsystem 42, without using a second Kalman filter 54.
- the curvature estimator 70 comprises the first curvature processor 72, which determines the road curvature parameters Co and Ci of the clothoid model by a curve fit of a trajectory of the host state [ω, ω̇, U, U̇]ᵀ from the host state filter 52.1.
- equations (129) through (131) are used with curve fitting to solve for the road curvature parameters Co and Ci, wherein a window of sampled data points, e.g. in one embodiment about 12 sample points, is used to improve the smoothness of the associated curve used for curve fitting.
- a ninth embodiment of the curvature estimator 70 is a modification of the eighth embodiment, wherein the length L of the sliding window is adaptive.
- the window length can be adapted to be responsive to the road curvature parameters Co and Ci, for example, in accordance with an associated rule that bounds the window length L between about 2 and 25 samples responsive to the magnitudes of Co and Ci.
- This rule provides for a larger window length L, e.g. as large as 25 samples, when both Co and Ci are relatively small, for example, corresponding to a straight section of road.
- the window length L becomes smaller, e.g. as small as 2 samples, when either Co or Ci is large, corresponding to a turn or transition of the road.
- the window length L can be sharply reduced to account for a sudden vehicle maneuver, with a limitation on the subsequent increase in window length L to one sample per step so that the previous samples do not adversely affect the output from the curvature estimator 70 when the maneuver ends.
- the above-described first through ninth embodiments of the road curvature estimation subsystem 42 are based upon the clothoid model of road curvature, wherein the road curvature C is characterized as varying linearly with respect to path length along the road, and wherein different types of roads (e.g. straight, circular or generally curved) are represented by different values of the clothoid road curvature parameters Co and Ci.
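Under the clothoid model the curvature at arc length s is C(s) = Co + Ci·s, and for small heading angles the lateral offset of the road at longitudinal distance x is commonly approximated by y ≈ Co·x²/2 + Ci·x³/6. A minimal sketch of both relations (the small-angle offset form is a standard approximation, not quoted from the patent):

```python
def clothoid_curvature(C0, C1, s):
    """Clothoid model: curvature varies linearly with arc length s."""
    return C0 + C1 * s

def clothoid_lateral_offset(C0, C1, x):
    """Small-angle approximation of the lateral offset of a clothoid
    road at longitudinal distance x: y ~= C0*x**2/2 + C1*x**3/6."""
    return C0 * x**2 / 2 + C1 * x**3 / 6
```

A straight road corresponds to Co = Ci = 0, a circular arc to Ci = 0 with Co ≠ 0, and a general curved segment to Ci ≠ 0.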
- Different embodiments of the road curvature estimation subsystem 42 can be used under different conditions.
- the seventh embodiment of the curvature estimator 70 is used when the longitudinal speed U of the host vehicle 12 is less than a threshold, e.g. about 11 meters/second, and the ninth embodiment of the curvature estimator 70 is used at greater speeds.
- the ratio of the mean prediction error to the mean prediction distance can be used to compare and evaluate the various embodiments of the curvature estimator 70.
- high-speed roads can be modeled as a collection of different types of interconnected road segments, each of which is represented by a different road model. For example, Fig.
- different types of roads are represented with different models, rather than using a single all-inclusive clothoid model, thereby providing for improved estimation accuracy for road segments that are characterized by models with fewer degrees of freedom, i.e. models resulting from constraints on one or more coefficients of the clothoid model.
- a particular road segment is characterized by one of a finite number r of models.
- Each curvature estimator 70.1, 70.2, 70.3 is generally structured as illustrated in Fig. 19, and is adapted to process the output of the host state filter 52.1.
- although Fig. 23 illustrates a common host state filter 52.1 for all of the curvature estimators 70.1, 70.2, 70.3, different curvature estimators 70.1, 70.2, 70.3 could utilize different corresponding embodiments of the host state filter 52.1, e.g.
- the first curvature estimator 70.1 embodies a straight road model
- the second curvature estimator 70.2 embodies a circular arc or quadratic road model
- the third curvature estimator 70.3 embodies a clothoid road model suitable for a general high-speed road.
- the multiple model system 82 has both continuous (noise) uncertainties as well as discrete ("model" or "mode") uncertainties, and is thereby referred to as a hybrid system.
- the multiple model system 82 is assumed to be characterized by a base state model, a modal state, and a mode jump process.
- the mode jump process, which governs the transition from one mode to another, is assumed to be characterized by a Markov chain with known transition probabilities, as follows:
- the curvature estimators 70.1, 70.2, 70.3 operate in parallel, and the output therefrom is operatively coupled to a curvature processor 84, e.g. embodied in the signal processor 26, which generates a single estimate of road curvature and associated covariance, in accordance with an interacting multiple model algorithm
- the interacting multiple model algorithm 2400 is useful for tracking both maneuvering and non-maneuvering targets with moderate computational complexity, wherein a maneuver is modeled as a switching of the target state model governed by an underlying Markov chain.
- Different state models can have different structures, and the statistics of the associated process noises of different state models can be different.
- the interacting multiple model algorithm 2400 performs similarly to the exact Bayesian filter, but requires substantially less computational power.
- Each model has a corresponding filter that is processed by the curvature processor 84 in accordance with the interacting multiple model algorithm 2400. Referring to Fig.
- the interacting multiple model algorithm 2400 commences with the initialization in the current cycle of each mode conditioned filter in step (2402), whereby the mode-conditioned state estimates and covariance of the previous cycle are mixed using mixing probabilities.
- Each filter uses a mixed estimate at the beginning of each cycle, wherein the mixing is determined by the probabilities of switching between models.
- the Kalman filter matched to the j-th mode, M j (k), uses the measurement z(k) to provide the associated mode-conditioned state estimate and covariance.
- the Markov model is implemented by assuming that at each scan time there is a probability p ij that the target will make the transition from model state i to state j. These probabilities are assumed to be known a priori and can be expressed in a model probability transition matrix.
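The mixing step of the interacting multiple model algorithm, in which the transition probabilities p ij and the current mode probabilities are combined to form the mixed initial conditions of step (2402), can be sketched as follows (a standard IMM mixing computation, with illustrative names):

```python
def imm_mixing_probabilities(p, mu):
    """IMM mixing step.  p[i][j] = Pr(switch from mode i to mode j),
    mu[i] = current probability of mode i.  Returns (c, w) where
    c[j] = sum_i p[i][j]*mu[i] is the predicted probability of mode j,
    and w[i][j] = p[i][j]*mu[i]/c[j] is the mixing weight
    Pr(mode i at k-1 | mode j at k)."""
    r = len(mu)
    c = [sum(p[i][j] * mu[i] for i in range(r)) for j in range(r)]
    w = [[p[i][j] * mu[i] / c[j] for j in range(r)] for i in range(r)]
    return c, w
```

Each mode-matched Kalman filter then starts its cycle from a state estimate mixed with the weights w, so that mode switches are accounted for before filtering.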
- the predictive collision sensing system 10 further comprises a vehicle navigation system 86 and an associated map system 88, operatively coupled to the signal processor 26.
- the vehicle navigation system 86 could also provide altitude and time information, and/or the associated velocities of the measurements.
- the vehicle position measure z k is used with the map system 88 to determine the position and curvature coordinates of the road relative to the host vehicle 12, wherein the map system 88 incorporates a digital map database in absolute coordinates and an algorithm to convert the vehicle position measure z k, in world absolute coordinates, of the road on which the host vehicle 12 is located, to the host absolute coordinates in the coordinate system used hereinabove for road curvature and target state estimation. Accordingly, the map system 88 provides for an associated transformation, wherein (x, y) are the world absolute coordinates of the vehicle position measure z k from
- Z c and P are vectors containing the coordinates of the center of the road closest to [x, y]
- the map system 88 can either store both the road position coordinates and associated curvature parameter information, or could calculate the curvature parameters from the stored position coordinates of the road centers as the information is required.
- the world absolute coordinates can be transformed to host vehicle absolute coordinates by a combination of translation and rotation, given the position of the host vehicle 12 from the vehicle navigation system 86, and the heading of the host vehicle 12 based upon either information from the host state filter 52.1, or from the vehicle navigation system 86, or both. More particularly, given the vehicle position measure z k, in one embodiment the map system 88 uses an associated map database to determine the road that the host vehicle
- T is a selection threshold
- Z is a point on the road that is closest to the vehicle
- curvature C from the other curvature estimation system 42, is selected as the most likely road from the map database, and the associated curvature at the closest point is then given accordingly. Referring to Fig. 25, the curvature from the map system 88 of the point on the road closest to the location of the host vehicle 12 from the vehicle navigation system 86 is used as a measurement to an associated Kalman filter 54' of a corresponding curvature filter 54.1', which is used to generate a corresponding map-based road curvature estimate. Referring to Fig. 26, in accordance with another embodiment of a road curvature
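The map-matching selection described above, choosing among candidate roads within the selection threshold T the one whose stored curvature best agrees with the curvature estimate from the other curvature estimation subsystem, can be sketched as follows; the road-record format is an illustrative assumption:

```python
import math

def select_most_likely_road(x, y, C_est, roads, T):
    """Sketch of the map-matching step: among candidate roads whose
    closest centre point is within the selection threshold T of the
    vehicle position (x, y), pick the one whose stored curvature best
    agrees with the curvature estimate C_est from the other road
    curvature estimation subsystem.  `roads` is a hypothetical list of
    (xc, yc, curvature) tuples for the closest centre points."""
    best, best_score = None, float("inf")
    for xc, yc, curv in roads:
        d = math.hypot(x - xc, y - yc)
        if d > T:
            continue  # outside the selection threshold
        score = abs(curv - C_est)
        if score < best_score:
            best, best_score = (xc, yc, curv), score
    return best
```

The curvature of the selected road point would then serve as the measurement to the map-based curvature filter 54.1'.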
- the associated trajectory of the target vehicle 36, as measured by the radar sensor 14, can be used to determine an estimate of road curvature of the roadway 34, based upon the premise that under normal driving conditions the target vehicle 36 is assumed to follow the roadway 34.
- the dynamics of each target are assumed to be given by the following constant-acceleration kinematic equations, which are embodied in an extended Kalman filter 90 of an auxiliary filter 90.1.
- T is the sampling period
- the superscript (•) is used to designate a particular target
- the associated measurement function h is non-linear, and given by:
- the extended Kalman filter 90 provides for linearization of the non-linear measurement function h using an associated Jacobian matrix, and provides for an estimate
- the measurement input z k and associated error covariance R k are given by equation (166).
- a predictive collision sensing system 10.1 comprises a plurality of road curvature estimation subsystems 42.1, 42.2, ..., each of which operates in accordance with any one of the various embodiments of road curvature estimation subsystems 42, 42' or 42" described hereinabove, e.g. as illustrated in Figs. 4, 19, 23, 25 or 26.
- the predictive collision sensing system 10.1 incorporates first 42.1 and second 42.2 road curvature estimation subsystems, wherein the first road curvature estimation subsystem 42.1 is a road curvature estimation subsystem 42 responsive to host vehicle measurements, and the second road curvature estimation subsystem 42.2 is a road curvature estimation subsystem 42' responsive to measurements from a vehicle navigation system 86 and an associated map system 88. From the first road curvature estimation subsystem 42.1, the curvature estimate and associated error covariance at a distance l along the roadway 34 from the current location are given respectively by: C(l) = Co + Ci · l   (172)
- R c (l) = R Co + l² · R Ci   (173), where R Co , R Ci and R c are the error covariances of Co, Ci and C respectively.
- C g (l) the corresponding curvature estimate
- R g the corresponding error covariance
- R g is generally a constant scalar.
- R f (l) = (R c (l)⁻¹ + R g ⁻¹)⁻¹   (175)
- G and G g are weights given by:
- the fused curvature C f (l) and associated error covariance R f (l) can be used by other processes, for example for improving the estimation of the locations of the host 12 or target 36 vehicles, or for collision prediction.
- the fused curvature C f (l) and associated error covariance R f (l) are input to a constrained target state estimation subsystem 46 for estimating the constrained target state, wherein the constrained target state estimation subsystem 46 and the associated unconstrained target state estimation subsystem 44, target state decision subsystem 48, and target state fusion subsystem 50 function in accordance with the embodiment illustrated in Fig. 4.
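The fusion of the two curvature estimates can be sketched assuming the standard inverse-covariance (information) weighting consistent with the form of equation (175); names are illustrative:

```python
def fuse_curvature(C_host, R_host, C_map, R_map):
    """Information fusion of two scalar curvature estimates:
    R_f = (1/R_host + 1/R_map)^-1                       (eq. (175) form)
    C_f = R_f * (C_host/R_host + C_map/R_map),
    i.e. each estimate is weighted by G = R_f / R_own."""
    R_f = 1.0 / (1.0 / R_host + 1.0 / R_map)
    C_f = R_f * (C_host / R_host + C_map / R_map)
    return C_f, R_f
```

With equal error covariances the fused curvature is the simple average, and the fused covariance is half that of either input; a more accurate input dominates the result.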
- a Kalman filter is used to estimate, from a set of noisy measurements, the state and associated covariance of a dynamic system subject to noise.
- the associated matrices F k , Q k , H k , R k can vary over time.
- Given a measurement z k at time k, and initial values of the state x k−1|k−1 and associated covariance P k−1|k−1 at time k−1, the Kalman filter is used to estimate the associated state x k|k and associated covariance P k|k at time k.
- the first step in the filtering process is to calculate estimates of the state x k|k−1 and associated covariance P k|k−1 at time k based upon estimates at time k−1, as follows:
- x k|k = x k|k−1 + G k (z k − z k|k−1)   (A-8)
- P k|k = P k|k−1 − G k S k G k ᵀ   (A-9)
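One predict/update cycle of the Kalman filter summarized in the Appendix, matching the form of equations (A-8) and (A-9) with G k the Kalman gain and S k the innovation covariance, can be sketched as:

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle in the (A-8)/(A-9) form:
    predict x- = F x and P- = F P F' + Q; then with innovation
    covariance S = H P- H' + R and gain G = P- H' S^-1, update
    x+ = x- + G (z - H x-)   (A-8)
    P+ = P- - G S G'         (A-9)."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    S = H @ P_pred @ H.T + R
    G = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + G @ (z - H @ x_pred)
    P_new = P_pred - G @ S @ G.T
    return x_new, P_new
```

The same cycle underlies the host state filter 52.1, the curvature filters, and (with a linearized H) the extended Kalman filter 90.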
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006547495A JP4990629B2 (en) | 2003-12-24 | 2004-12-24 | Road curvature estimation system |
EP04815709A EP1714108A4 (en) | 2003-12-24 | 2004-12-24 | Road curvature estimation system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US53234403P | 2003-12-24 | 2003-12-24 | |
US60/532,344 | 2003-12-24 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2005062984A2 true WO2005062984A2 (en) | 2005-07-14 |
WO2005062984A3 WO2005062984A3 (en) | 2007-02-22 |
Family
ID=34738789
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2004/043695 WO2005062984A2 (en) | 2003-12-24 | 2004-12-24 | Road curvature estimation system |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP1714108A4 (en) |
JP (3) | JP4990629B2 (en) |
WO (1) | WO2005062984A2 (en) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1760431A1 (en) | 2005-08-30 | 2007-03-07 | Honeywell International Inc. | Vehicle comprising an inertial navigation system with a plurality of Kalman filters |
DE102008020410A1 (en) | 2008-04-24 | 2009-11-12 | Ford Global Technologies, LLC (n.d.Ges.d. Staates Delaware), Dearborn | Purpose braking method for driving wheel of motor vehicle output shaft, involves obtaining braking torque in closed control loop, and outputting obtained braking torque to multiple braking devices |
US7987038B2 (en) * | 2006-09-29 | 2011-07-26 | Nissan Motor Co., Ltd. | Cruise control |
WO2013037853A1 (en) * | 2011-09-12 | 2013-03-21 | Continental Teves Ag & Co. Ohg | Orientation model for a sensor system |
RU2499278C1 (en) * | 2012-07-19 | 2013-11-20 | Федеральное государственное бюджетное образовательное учреждение высшего профессионального образования Владивостокский государственный университет экономики и сервиса (ВГУЭС) | Method of tracking path of moving ship |
RU2661889C1 (en) * | 2015-12-18 | 2018-07-20 | Акционерное общество "Федеральный научно-производственный центр "Нижегородский научно-исследовательский институт радиотехники" | Radar tracking method of objects and a radar station for its implementation |
CN111429716A (en) * | 2019-01-08 | 2020-07-17 | 威斯通全球技术公司 | Method for determining position of own vehicle |
CN111737633A (en) * | 2020-06-23 | 2020-10-02 | 上海汽车集团股份有限公司 | Method and device for calculating curvature radius of road in front of vehicle |
CN111824163A (en) * | 2019-04-16 | 2020-10-27 | 标致雪铁龙汽车股份有限公司 | Method and device for determining a curve of a road |
CN112406861A (en) * | 2019-08-19 | 2021-02-26 | 通用汽车环球科技运作有限责任公司 | Method and device for Kalman filter parameter selection by using map data |
CN113391287A (en) * | 2021-06-10 | 2021-09-14 | 哈尔滨工业大学 | High-frequency ground wave radar sea state data fusion method based on time sequence |
CN114001976A (en) * | 2021-10-19 | 2022-02-01 | 杭州飞步科技有限公司 | Method, device and equipment for determining control error and storage medium |
CN114578401A (en) * | 2022-04-29 | 2022-06-03 | 泽景(西安)汽车电子有限责任公司 | Method and device for generating lane track points, electronic equipment and storage medium |
CN116946148A (en) * | 2023-09-20 | 2023-10-27 | 广汽埃安新能源汽车股份有限公司 | Vehicle state information and road surface information estimation method and device |
CN117584982A (en) * | 2023-12-28 | 2024-02-23 | 上海保隆汽车科技股份有限公司 | Curve radius estimation method, system, medium, electronic equipment, vehicle machine and vehicle |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4229141B2 (en) | 2006-06-19 | 2009-02-25 | トヨタ自動車株式会社 | Vehicle state quantity estimation device and vehicle steering control device using the device |
EP2380794B1 (en) | 2009-01-22 | 2015-05-06 | Toyota Jidosha Kabushiki Kaisha | Curve radius estimation device |
JP5291610B2 (en) * | 2009-12-21 | 2013-09-18 | 日本電信電話株式会社 | Azimuth estimation apparatus, method, and program |
JP5821288B2 (en) * | 2011-05-31 | 2015-11-24 | 日産自動車株式会社 | Road shape prediction device |
KR101789073B1 (en) | 2011-08-24 | 2017-10-23 | 현대모비스 주식회사 | Method and apparatus for estimating radius of curvature of vehicle |
US9255989B2 (en) * | 2012-07-24 | 2016-02-09 | Toyota Motor Engineering & Manufacturing North America, Inc. | Tracking on-road vehicles with sensors of different modalities |
KR101839978B1 (en) * | 2013-03-18 | 2018-04-26 | 주식회사 만도 | Apparatus and method for determining traveling status of vehicle |
JP6105509B2 (en) * | 2014-04-08 | 2017-03-29 | 株式会社日本自動車部品総合研究所 | Runway estimation device and runway estimation program |
US9463804B2 (en) * | 2014-11-11 | 2016-10-11 | Ford Global Tehnologies, LLC | Vehicle cornering modes |
JP6313198B2 (en) * | 2014-11-28 | 2018-04-18 | 株式会社デンソー | Vehicle control device |
KR102277479B1 (en) * | 2015-02-25 | 2021-07-14 | 현대모비스 주식회사 | Apparatus and method for estimating radius of curvature in vehicle |
DE102016221171B4 (en) * | 2015-11-06 | 2022-10-06 | Ford Global Technologies, Llc | Method and device for determining lane progression data |
JP6428671B2 (en) | 2016-02-17 | 2018-11-28 | 株式会社デンソー | Estimator |
CN106114511B (en) * | 2016-07-21 | 2018-03-06 | 辽宁工业大学 | A kind of automobile cruise system core target identification method |
JP6293213B2 (en) | 2016-08-01 | 2018-03-14 | 三菱電機株式会社 | Lane marking detection correction device, lane marking detection correction method, and automatic driving system |
JP6770393B2 (en) * | 2016-10-04 | 2020-10-14 | 株式会社豊田中央研究所 | Tracking device and program |
KR102375149B1 (en) * | 2017-10-18 | 2022-03-16 | 현대자동차주식회사 | Apparatus and method for estimating radius of curvature of vehicle |
CN110525442B (en) * | 2018-05-23 | 2021-10-26 | 长城汽车股份有限公司 | Gradient detection method and system and vehicle |
CN111047908B (en) * | 2018-10-12 | 2021-11-02 | 富士通株式会社 | Detection device and method for cross-line vehicle and video monitoring equipment |
US20220098009A1 (en) * | 2019-02-14 | 2022-03-31 | Tadano Ltd. | Lifting control device and mobile crane |
US20220098008A1 (en) * | 2019-02-14 | 2022-03-31 | Tadano Ltd. | Dynamic lift-off control device, and crane |
CN112441012B (en) * | 2019-09-05 | 2023-05-12 | 北京地平线机器人技术研发有限公司 | Vehicle driving track prediction method and device |
CN111400931B (en) * | 2020-04-09 | 2022-09-27 | 北京理工大学 | Method and system for determining yaw velocity of vehicle |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3164439B2 (en) * | 1992-10-21 | 2001-05-08 | マツダ株式会社 | Obstacle detection device for vehicles |
DE4408745C2 (en) * | 1993-03-26 | 1997-02-27 | Honda Motor Co Ltd | Driving control device for vehicles |
JP3288566B2 (en) * | 1994-11-10 | 2002-06-04 | 株式会社豊田中央研究所 | Travel lane recognition device |
DE19749086C1 (en) * | 1997-11-06 | 1999-08-12 | Daimler Chrysler Ag | Device for determining data indicating the course of the lane |
JPH11160078A (en) * | 1997-12-02 | 1999-06-18 | Toyota Motor Corp | System for estimating condition of traveling road |
JP2000002535A (en) * | 1998-06-15 | 2000-01-07 | Daihatsu Motor Co Ltd | Method for detecting curvature of curve road and detector used therefor |
JP3319399B2 (en) * | 1998-07-16 | 2002-08-26 | 株式会社豊田中央研究所 | Roadway recognition device |
JP2001048035A (en) * | 1999-08-10 | 2001-02-20 | Nissan Motor Co Ltd | Lane following device |
DE60031868T2 (en) * | 1999-09-15 | 2007-09-13 | Sirf Technology, Inc., San Jose | NAVIGATION SYSTEM AND METHOD FOR FOLLOWING THE POSITION OF AN OBJECT |
JP3391745B2 (en) * | 1999-09-22 | 2003-03-31 | 富士重工業株式会社 | Curve approach control device |
JP4348662B2 (en) * | 1999-09-28 | 2009-10-21 | マツダ株式会社 | Vehicle steering device |
US6618690B1 (en) * | 1999-11-22 | 2003-09-09 | Nokia Mobile Phones Ltd | Generalized positioning system based on use of a statistical filter |
JP3822770B2 (en) * | 1999-12-10 | 2006-09-20 | 三菱電機株式会社 | Vehicle front monitoring device |
JP2001319299A (en) * | 2000-05-12 | 2001-11-16 | Denso Corp | Road curvature estimating device for vehicle and preceding vehicle selecting device |
JP2001328451A (en) * | 2000-05-18 | 2001-11-27 | Denso Corp | Travel route estimating device, preceding vehicle recognizing device and recording medium |
WO2002021156A2 (en) * | 2000-09-08 | 2002-03-14 | Raytheon Company | Path prediction system and method |
JP4606638B2 (en) * | 2001-04-20 | 2011-01-05 | 本田技研工業株式会社 | Vehicle trajectory estimation device and vehicle travel safety device using the same |
JP4668470B2 (en) * | 2001-07-06 | 2011-04-13 | 富士重工業株式会社 | Inter-vehicle distance alarm device |
JP3700625B2 (en) * | 2001-08-28 | 2005-09-28 | 日産自動車株式会社 | Road white line recognition device |
US6751547B2 (en) * | 2001-11-26 | 2004-06-15 | Hrl Laboratories, Llc | Method and apparatus for estimation of forward path geometry of a vehicle based on a two-clothoid road model |
US7009500B2 (en) * | 2002-02-13 | 2006-03-07 | Ford Global Technologies, Llc | Method for operating a pre-crash sensing system in a vehicle having a countermeasure system using stereo cameras |
US6643588B1 (en) * | 2002-04-11 | 2003-11-04 | Visteon Global Technologies, Inc. | Geometric based path prediction method using moving and stop objects |
DE10218924A1 (en) * | 2002-04-27 | 2003-11-06 | Bosch Gmbh Robert | Method and device for course prediction in motor vehicles |
JP3975922B2 (en) * | 2003-01-17 | 2007-09-12 | トヨタ自動車株式会社 | Curve radius estimation device |
JP2004286724A (en) * | 2003-01-27 | 2004-10-14 | Denso Corp | Vehicle behavior detector, on-vehicle processing system, detection information calibrator and on-vehicle processor |
JP2007534041A (en) * | 2003-09-23 | 2007-11-22 | ダイムラークライスラー・アクチェンゲゼルシャフト | Lane change driving recognition method and apparatus for vehicles |
- 2004-12-24 WO PCT/US2004/043695 patent/WO2005062984A2/en not_active Application Discontinuation
- 2004-12-24 JP JP2006547495A patent/JP4990629B2/en not_active Expired - Fee Related
- 2004-12-24 EP EP04815709A patent/EP1714108A4/en not_active Withdrawn
- 2012-01-12 JP JP2012004376A patent/JP5848137B2/en not_active Expired - Fee Related
- 2012-01-12 JP JP2012004375A patent/JP5265787B2/en not_active Expired - Fee Related
Non-Patent Citations (1)
Title |
---|
See references of EP1714108A4 * |
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1760431A1 (en) | 2005-08-30 | 2007-03-07 | Honeywell International Inc. | Vehicle comprising an inertial navigation system with a plurality of Kalman filters |
US7860651B2 (en) | 2005-08-30 | 2010-12-28 | Honeywell International Inc. | Enhanced inertial system performance |
US8185309B2 (en) | 2005-08-30 | 2012-05-22 | Honeywell International Inc. | Enhanced inertial system performance |
US7987038B2 (en) * | 2006-09-29 | 2011-07-26 | Nissan Motor Co., Ltd. | Cruise control |
DE102008020410A1 (en) | 2008-04-24 | 2009-11-12 | Ford Global Technologies, LLC (n.d.Ges.d. Staates Delaware), Dearborn | Purpose braking method for driving wheel of motor vehicle output shaft, involves obtaining braking torque in closed control loop, and outputting obtained braking torque to multiple braking devices |
DE102008020410B4 (en) * | 2008-04-24 | 2016-02-11 | Ford Global Technologies, Llc (N.D.Ges.D. Staates Delaware) | Method for targeted braking of a driven wheel of a drive axle of a motor vehicle |
WO2013037854A1 (en) * | 2011-09-12 | 2013-03-21 | Continental Teves Ag & Co. Ohg | Sensor system comprising a fusion filter for common signal processing |
CN103917417A (en) * | 2011-09-12 | 2014-07-09 | 大陆-特韦斯贸易合伙股份公司及两合公司 | Orientation model for sensor system |
CN103930312A (en) * | 2011-09-12 | 2014-07-16 | 大陆-特韦斯贸易合伙股份公司及两合公司 | Sensor system comprising a fusion filter for common signal processing |
US9183463B2 (en) | 2011-09-12 | 2015-11-10 | Continental Teves Ag & Co., Ohg | Orientation model for a sensor system |
WO2013037853A1 (en) * | 2011-09-12 | 2013-03-21 | Continental Teves Ag & Co. Ohg | Orientation model for a sensor system |
US10360476B2 (en) | 2011-09-12 | 2019-07-23 | Continental Teves Ag & Co. Ohg | Sensor system comprising a fusion filter for common signal processing |
RU2499278C1 (en) * | 2012-07-19 | 2013-11-20 | Федеральное государственное бюджетное образовательное учреждение высшего профессионального образования Владивостокский государственный университет экономики и сервиса (ВГУЭС) | Method of tracking path of moving ship |
RU2661889C1 (en) * | 2015-12-18 | 2018-07-20 | Акционерное общество "Федеральный научно-производственный центр "Нижегородский научно-исследовательский институт радиотехники" | Radar tracking method of objects and a radar station for its implementation |
CN111429716A (en) * | 2019-01-08 | 2020-07-17 | 威斯通全球技术公司 | Method for determining position of own vehicle |
CN111824163A (en) * | 2019-04-16 | 2020-10-27 | 标致雪铁龙汽车股份有限公司 | Method and device for determining a curve of a road |
CN112406861A (en) * | 2019-08-19 | 2021-02-26 | 通用汽车环球科技运作有限责任公司 | Method and device for Kalman filter parameter selection by using map data |
CN112406861B (en) * | 2019-08-19 | 2024-03-22 | 通用汽车环球科技运作有限责任公司 | Method and device for carrying out Kalman filter parameter selection by using map data |
CN111737633A (en) * | 2020-06-23 | 2020-10-02 | 上海汽车集团股份有限公司 | Method and device for calculating curvature radius of road in front of vehicle |
CN113391287B (en) * | 2021-06-10 | 2023-09-01 | 哈尔滨工业大学 | High-frequency ground wave radar sea state data fusion method based on time sequence |
CN113391287A (en) * | 2021-06-10 | 2021-09-14 | 哈尔滨工业大学 | High-frequency ground wave radar sea state data fusion method based on time sequence |
CN114001976A (en) * | 2021-10-19 | 2022-02-01 | 杭州飞步科技有限公司 | Method, device and equipment for determining control error and storage medium |
CN114001976B (en) * | 2021-10-19 | 2024-03-12 | 杭州飞步科技有限公司 | Method, device, equipment and storage medium for determining control error |
CN114578401A (en) * | 2022-04-29 | 2022-06-03 | 泽景(西安)汽车电子有限责任公司 | Method and device for generating lane track points, electronic equipment and storage medium |
CN116946148A (en) * | 2023-09-20 | 2023-10-27 | 广汽埃安新能源汽车股份有限公司 | Vehicle state information and road surface information estimation method and device |
CN116946148B (en) * | 2023-09-20 | 2023-12-12 | 广汽埃安新能源汽车股份有限公司 | Vehicle state information and road surface information estimation method and device |
CN117584982A (en) * | 2023-12-28 | 2024-02-23 | 上海保隆汽车科技股份有限公司 | Curve radius estimation method, system, medium, electronic equipment, vehicle machine and vehicle |
CN117584982B (en) * | 2023-12-28 | 2024-04-23 | 上海保隆汽车科技股份有限公司 | Curve radius estimation method, system, medium, electronic equipment, vehicle machine and vehicle |
Also Published As
Publication number | Publication date |
---|---|
JP5848137B2 (en) | 2016-01-27 |
JP2012131496A (en) | 2012-07-12 |
EP1714108A2 (en) | 2006-10-25 |
JP5265787B2 (en) | 2013-08-14 |
EP1714108A4 (en) | 2010-01-13 |
JP2012131495A (en) | 2012-07-12 |
WO2005062984A3 (en) | 2007-02-22 |
JP2007516906A (en) | 2007-06-28 |
JP4990629B2 (en) | 2012-08-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7522091B2 (en) | Road curvature estimation system | |
WO2005062984A2 (en) | Road curvature estimation system | |
EP1537440B1 (en) | Road curvature estimation and automotive target state estimation system | |
Dickmann et al. | Automotive radar the key technology for autonomous driving: From detection and ranging to environmental understanding | |
CN101793528B (en) | System and method of lane path estimation using sensor fusion | |
Goldbeck et al. | Lane following combining vision and DGPS | |
Jo et al. | Road slope aided vehicle position estimation system based on sensor fusion of GPS and automotive onboard sensors | |
CN110816548A (en) | Sensor fusion | |
US20200072617A1 (en) | Host vehicle position estimation device | |
CN102700548A (en) | Robust vehicular lateral control with front and rear cameras | |
CN102576494A (en) | Collision avoidance system and method for a road vehicle and respective computer program product | |
Heirich et al. | Bayesian train localization method extended by 3D geometric railway track observations from inertial sensors | |
Hammarstrand et al. | Long-range road geometry estimation using moving vehicles and roadside observations | |
García-Fernández et al. | Bayesian road estimation using onboard sensors | |
Li et al. | Constrained unscented Kalman filter based fusion of GPS/INS/digital map for vehicle localization | |
Lundquist et al. | Joint ego-motion and road geometry estimation | |
Lee et al. | Application of W-band FMCW radar for road curvature estimation in poor visibility conditions | |
Heirich et al. | Probabilistic localization method for trains | |
Saadeddin et al. | Performance enhancement of low-cost, high-accuracy, state estimation for vehicle collision prevention system using ANFIS | |
Hossein et al. | Multi-sensor data fusion for autonomous vehicle navigation through adaptive particle filter | |
Tsogas et al. | Using digital maps to enhance lane keeping support systems | |
Polychronopoulos et al. | Extended path prediction using camera and map data for lane keeping support | |
Yoon et al. | High-definition map based motion planning, and control for urban autonomous driving | |
Niknejad et al. | Multi-sensor data fusion for autonomous vehicle navigation and localization through precise map | |
CN114889643A (en) | Three-element autonomous obstacle avoidance method for moving obstacle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states | Kind code of ref document: A2. Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
AL | Designated countries for regional patents | Kind code of ref document: A2. Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
DPEN | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101) | |
WWE | Wipo information: entry into national phase | Ref document number: 2006547495, Country of ref document: JP; Ref document number: 200480038838.4, Country of ref document: CN |
WWE | Wipo information: entry into national phase | Ref document number: 2004815709, Country of ref document: EP |
NENP | Non-entry into the national phase | Ref country code: DE |
WWW | Wipo information: withdrawn in national office | Ref document number: DE |
WWP | Wipo information: published in national office | Ref document number: 2004815709, Country of ref document: EP |