GB2471276A - Terrain sensing apparatus for an autonomous vehicle - Google Patents

Terrain sensing apparatus for an autonomous vehicle

Info

Publication number
GB2471276A
GB2471276A
Authority
GB
United Kingdom
Prior art keywords
vehicle
data
spline
gradient
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0910730A
Other versions
GB0910730D0 (en)
Inventor
Alexander Douglas Stewart
Richard Arthur Brimble
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BAE Systems PLC
Original Assignee
BAE Systems PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BAE Systems PLC filed Critical BAE Systems PLC
Priority to GB0910730A priority Critical patent/GB2471276A/en
Publication of GB0910730D0 publication Critical patent/GB0910730D0/en
Publication of GB2471276A publication Critical patent/GB2471276A/en
Withdrawn legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/06Road conditions
    • B60W40/076Slope angle of the road
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/013Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/06Road conditions
    • B60W40/072Curvature of the road
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C9/00Measuring inclination, e.g. by clinometers, by levels
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/89Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0251Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0257Control of position or course in two dimensions specially adapted to land vehicles using a radar

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Feedback Control In General (AREA)
  • Traffic Control Systems (AREA)

Abstract

A method and apparatus for terrain sensing is disclosed that obtains data from at least one sensor onboard a vehicle. The sensor data is used to generate data representing a surface defined by a set of control points. Gradient information is then generated by computing a gradient of the surface for at least one query point, and the gradient information is converted into a local frame coordinate system for the vehicle. A Kalman filter process may be used to generate a B-spline representation of the surface, wherein the B-spline may be a NURBS (Non Uniform Rational Basis Spline). The at least one sensor may comprise a LIDAR, RADAR, or an image based sensor. The vehicle may be an autonomous vehicle, and the terrain sensing means may assist with guidance of the vehicle, for example, to help avoid areas of terrain that have a gradient where there is a rollover risk.

Description

Terrain Sensing

The present invention relates to terrain sensing.
Autonomous vehicles are intended to be able to navigate without constant human interaction. Certain types of terrain in particular are challenging for such vehicles. For example, if a vehicle is navigated into an area having an excessively steep gradient then there is a danger of it rolling over.
Embodiments of the present invention are intended to assist with guiding an autonomous vehicle, for example to help avoid areas of terrain that have a gradient where there is a rollover risk.
According to a first aspect of the present invention there is provided a method of terrain sensing for use onboard a vehicle, the method including: obtaining data from at least one sensor onboard a vehicle; using the sensor data to generate data representing a surface defined by a set of control points; generating gradient information by computing a gradient of the surface for at least one query point, and converting the gradient information into a local frame coordinate system for the vehicle.
The method may include using a Kalman filter process to generate a B-spline representation of the surface. The B-spline may be a NURBS (Non Uniform Rational Basis Spline).
The step of generating gradient information may include calculating a normal vector to the surface and deriving a transform relative to the z-axis of the vehicle frame. The method may include checking the gradient information against at least one threshold to assess vehicle rollover risk. The step of assessing the vehicle rollover risk may include: obtaining data representing a yaw of the vehicle; obtaining data representing a normal of a location on the surface where the vehicle is positioned, and using the yaw and surface normal data to estimate roll and pitch of the vehicle at its location.
The method may further include comparing the roll and pitch with models to assess whether the vehicle is at a rollover risk.
The method may further include: storing a first data set representing the surface at a first point in time; storing a second data set representing the surface at a second point in time, and comparing the first and the second data sets to detect changes relating to the surface.
The method may further include a step of using the converted gradient information to assist planning a route for the vehicle.
According to yet another aspect of the present invention there is provided a computer program product comprising a computer readable medium having thereon computer program code means which, when the program code is loaded, make the computer execute a method of terrain sensing substantially as described herein.
According to another aspect of the present invention there is provided apparatus configured to sense terrain, the apparatus including: a processor configured to: obtain data from at least one sensor onboard a vehicle; use the sensor data to generate data representing a surface defined by a set of control points; generate gradient information by computing a gradient of the surface for at least one query point, and convert the gradient information into a local frame coordinate system for the vehicle.
The at least one sensor may include LIDAR, RADAR or image-based sensors. The vehicle may be an at least partially autonomous vehicle.
According to yet another aspect of the present invention there is provided a method of assessing the risk of rollover of a vehicle, the method including: obtaining data representing a yaw of the vehicle; obtaining data representing a normal of a location on the surface where the vehicle is positioned, and using the yaw and surface normal data to estimate roll and pitch of the vehicle at its location.
The method may further include comparing the roll and pitch with thresholds to assess whether the vehicle is at a rollover risk.
According to a further aspect of the present invention there is provided a method of detecting change in a surface, the method including: storing a first data set representing the surface at a first point in time; storing a second data set representing the surface at a second point in time, and comparing the first and the second data sets to detect changes relating to the surface.
The data sets may be generated by using a Kalman filter process to generate a B-spline representation of the surfaces.
Computer program products and apparatus configured to execute the methods described herein are also provided.
Whilst the invention has been described above, it extends to any inventive combination of features set out above or in the following description. Although illustrative embodiments of the invention are described in detail herein with reference to the accompanying drawings, it is to be understood that the invention is not limited to these precise embodiments. As such, many modifications and variations will be apparent to practitioners skilled in the art.
Furthermore, it is contemplated that a particular feature described either individually or as part of an embodiment can be combined with other individually described features, or parts of other embodiments, even if the other features and embodiments make no mention of the particular feature. Thus, the invention extends to such specific combinations not already described.
The invention may be performed in various ways, and, by way of example only, embodiments thereof will now be described, reference being made to the accompanying drawings, in which: Figure 1 is a schematic illustration of an example architecture for an autonomous vehicle controller; Figure 2 is a flowchart outlining steps performed by a process executed by the controller; Figure 3 is a diagram illustrating an autonomous vehicle on a gradient; Figure 4 is a flowchart detailing a filter prediction stage of the process; Figure 5 is a flowchart detailing a filter update stage of the process; Figure 6 illustrates graphically data produced by the process; Figure 7 shows a vehicle on an arbitrary 2.5D surface; Figure 8 illustrates graphically rigid motion; Figure 9 is a kinematic model of a simple car; Figure 10 illustrates schematically the separation of B-spline estimation and evaluation; Figure 11 illustrates schematically the origins of information matrix sparsity; Figure 12 illustrates schematically information matrix diagonals, and Figure 13 is a graph showing the benefits of exploiting the sparsity of the information matrix.
Figure 1 shows a vehicle 100 along with a diagram schematically illustrating components of a controller that assists with autonomous driving of the vehicle. Although a full size land vehicle is shown in the example, it will be appreciated that the system and processes described herein can be implemented on other types of vehicles.
The controller receives inputs including terrain maps 102, which provide global terrain data, and signals from at least one terrain sensor 104 that provide local terrain data (i.e. data derived from sensors onboard the vehicle). A non-exhaustive list of such sensors includes LIDAR, stereovision cameras and/or RADAR based sensors. The inputs 102, 104 are received by a terrain perceptions component 106 that processes them to determine characteristics of the present terrain, e.g. whether there are any obstacles in the vicinity; the topography; and the type of terrain (e.g. mud, sand, etc).
The terrain perceptions component can produce one or more outputs, e.g. data describing un-driveable regions, data describing driveable regions and/or data describing terrain characteristics. These outputs are passed to a driveability engine 108, responsible for predicting the vehicle response on uneven terrain. The engine 108 also receives input in the form of vehicle model data (which models characteristics of the vehicle, such as tyre models, tip-over limits and kinematic/dynamic constraints) and vehicle state data 112 (which can include, for example, the planned velocity of the vehicle, the state of its steering mechanism, and pose including position, orientation and associated velocities and accelerations, etc). The engine 108 is configured to receive a driveability query, e.g. generated by a route planning component (not shown) for the vehicle 100, and return an answer to it, e.g. an indication of whether or not a proposed route is driveable.
Figure 2 shows steps performed by a processor configured to perform the part of the terrain perceptions functionality relating to topography. At step 202 a measure of uncertainty, derived from an error function model for the sensor(s) that provide data to the process, is obtained. The position of the vehicle may also be taken into account. This information will be used in calculating a 2.5-dimensional representation of the terrain.
At step 204 observations (e.g. data 104 from the sensor(s)) are received and at step 206 a point cloud describing the terrain based on the sensor data is produced using processes known to the skilled person.
At step 208 a Kalman filter prediction stage is used to generate an estimate of the control points of a B-spline surface based on the sensor measurements. The skilled person will appreciate that alternative filtering techniques could be used to represent multi-level surfaces by means of multi-modal distributions, such as Mixture-of-Gaussians and particle filters. The B-spline surface may be a NURBS (Non Uniform Rational Basis Spline) defined by a set of control points $P_{i,j}$ with weights $w_{i,j}$:

$$S(u,v) = \frac{\sum_{i}\sum_{j} N_{i,p}(u)\,N_{j,q}(v)\,w_{i,j}\,P_{i,j}}{\sum_{i}\sum_{j} N_{i,p}(u)\,N_{j,q}(v)\,w_{i,j}}$$

The control points of the derivative surface with respect to both u and v can be calculated as follows, although the skilled person will appreciate that other known equations can be used:

$$P^{(1,0)}_{i,j} = p\,\frac{P_{i+1,j} - P_{i,j}}{u_{i+p+1} - u_{i+1}}, \qquad P^{(0,1)}_{i,j} = q\,\frac{P_{i,j+1} - P_{i,j}}{v_{j+q+1} - v_{j+1}}$$

At step 210 the value of the gradient in terms of u, v, with associated uncertainty, is computed. This is the gradient at a query point, which will typically be a point state in a proposed route for the vehicle. At step 212 these gradients are transformed into a coordinate frame that is local to the vehicle so that the vehicle controller can use them directly.
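By way of illustration only, the derivative control points above might be computed as in the following Python sketch; the array layout and function name are assumptions made for the example and do not form part of the disclosure:

```python
import numpy as np

def derivative_control_points_u(P, U, p):
    """Control points of the first-derivative surface with respect to u,
    per P^(1,0)_{i,j} = p * (P[i+1,j] - P[i,j]) / (u_{i+p+1} - u_{i+1}).
    P: (n, m, 3) array of control points; U: clamped knot vector in u
    (length n + p + 1, per equation (2) below); p: degree in u."""
    n = P.shape[0]
    Pd = np.empty((n - 1,) + P.shape[1:])
    for i in range(n - 1):
        # Knot span assumed non-zero (no full-multiplicity interior knots).
        span = U[i + p + 1] - U[i + 1]
        Pd[i] = p * (P[i + 1] - P[i]) / span
    return Pd
```

The derivative control points in v follow symmetrically, with the roles of the two parametric directions exchanged.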
At step 214 the data representing the estimated surface and/or the gradient information is output, e.g. to a process onboard the vehicle controller that checks the gradient against at least one threshold in order to confirm whether the vehicle is at risk of rolling over if it were to traverse the gradient. At step 216 a Kalman filter update is performed and control then passes back to step 204.
Figure 3 is a diagram showing vehicle 100 on a surface 302 represented by B-splines and also illustrates the u, v axes and the vehicle's local lateral (y) and longitudinal (x) axes of its local coordinate frame. Step 212 above transforms the gradient from the u, v axes on the surface to the vehicle's local frame coordinate system to assist with autonomous control. This is performed by calculating the normal vector to the gradient surface and deriving the transform relative to the z-axis of the vehicle frame.
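Purely as an illustrative sketch of this transformation (the helper name, and the choice of taking the normal from the cross product of the surface partial derivatives, are assumptions of the example, not a prescribed implementation):

```python
import numpy as np

def local_frame_gradients(S_u, S_v, yaw):
    """Convert surface tangents at a query point into slope components
    along the vehicle's longitudinal (x) and lateral (y) axes.
    S_u, S_v: partial derivatives of S(u, v) in the external frame;
    yaw: vehicle heading in that frame. A non-vertical surface is assumed."""
    n = np.cross(S_u, S_v)                      # normal to the gradient surface
    n = n / np.linalg.norm(n)
    if n[2] < 0:                                # orient the normal upwards
        n = -n
    # World-frame slopes of the surface recovered from the normal.
    dz_dx, dz_dy = -n[0] / n[2], -n[1] / n[2]
    c, s = np.cos(yaw), np.sin(yaw)
    grad_long = c * dz_dx + s * dz_dy           # slope along the vehicle x axis
    grad_lat = -s * dz_dx + c * dz_dy           # slope along the vehicle y axis
    return grad_long, grad_lat
```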
Figure 4 shows the Kalman filter prediction stage and associated steps outlined in Figure 2 in more detail. At step 402 the sample points are received.
At step 404 a question is asked as to whether the timestamp for the Kalman filter (TKF) is equal to the timestamp for the information filter (TIF). The information filter is an exactly equivalent form of the Kalman Filter which operates in information space; in general it reduces the computational complexity of the update stage of the filter at the expense of the prediction stage, and is thus better suited to high-frequency update filters. If the timestamps are not equal then control passes to step 406, where the information filter is converted to Kalman filter form. After this, at step 408, the Kalman filter timestamp is updated so that it is equal to the information filter timestamp value.
At step 410 the B-spline basis function is calculated and at step 412 a sparse B-spline basis function product vector is constructed. At step 414 the z coordinates of the B-spline surface are computed. At step 416 the gradients of the B-spline surface are computed. The gradients are calculated by constructing a gradient surface, differentiating the control points of the NURBS surface to obtain the control points of the derivative surface. At step 418 the uncertainty in the z coordinates of the B-spline surface is computed and the process ends at step 420, which can result in an output representing a mean estimate of the sensed terrain surface with associated uncertainty.
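The basis function calculation at step 410 may, for example, use the well-known Cox-de Boor recursion; the sketch below is a direct (unoptimised) transcription and is illustrative only:

```python
def bspline_basis(i, p, u, U):
    """Cox-de Boor recursion for the basis function N_{i,p}(u) over knot
    vector U. The half-open support test ignores the u = 1 end point,
    a simplification acceptable for a sketch."""
    if p == 0:
        return 1.0 if U[i] <= u < U[i + 1] else 0.0
    left = right = 0.0
    if U[i + p] > U[i]:
        left = (u - U[i]) / (U[i + p] - U[i]) * bspline_basis(i, p - 1, u, U)
    if U[i + p + 1] > U[i + 1]:
        right = (U[i + p + 1] - u) / (U[i + p + 1] - U[i + 1]) \
                * bspline_basis(i + 1, p - 1, u, U)
    return left + right
```

The sparse product vector of step 412 then collects the at most (p + 1)(q + 1) non-zero products N_{i,p}(u)N_{j,q}(v), a property of B-spline surfaces noted later in this description.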
Figure 5 shows the Kalman filter update stage outlined in Figure 2 in more detail. At step 502 the point cloud data is received. At step 504 the B-spline basis function is computed. At step 506 a sparse B-spline basis function product vector is constructed. Next, at step 508, the information state vector and information matrix are updated. At step 510 the information filter timestamp is set to the current time. Steps 508 and 510 can be repeated for all points in the point cloud and the process ends at step 512.
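For illustration, steps 506 to 510 might be realised as below, using the update form (17) derived later in this description; the variable names and the scalar observation variance var_z are assumptions of the sketch:

```python
import numpy as np

def information_update(y, Y, z, var_z, A):
    """Fold one surface observation z (variance var_z) into the information
    state vector y and information matrix Y, per (17). A is the sparse
    B-spline basis function product vector for the observation's {u, v}."""
    y = y + A * (z / var_z)             # y(k|k) = y(k|k-1) + A^T R^-1 z
    Y = Y + np.outer(A, A) / var_z      # Y(k|k) = Y(k|k-1) + A^T R^-1 A
    return y, Y
```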
Figure 6 illustrates graphically a B-spline surface 602 generated by the processes described above. Highlighted on the surface is a proposed path 604 to be traversed by the autonomous vehicle. Graph 606 shows the magnitude of the gradient with respect to the lateral axis of the vehicle's local coordinate frame against time as the vehicle traverses the path. Upper 608A and lower 608B thresholds are also shown on the graph. As can be seen, part of the path crosses the upper threshold, indicating that there is a risk of rollover at this point.
Graph 610 shows the magnitude of the gradient with respect to the longitudinal axis of the vehicle's local coordinate frame against time as the vehicle traverses the path. Upper 612A and lower 612B thresholds are also shown on the graph.
It will be understood that the gradient information can be processed by the autonomous vehicle controller and if, for example, a proposed route includes an area having a dangerous gradient, then the controller can be configured to generate another proposed route that avoids that area.
The process for creating a B-spline surface using a Kalman filter described above can be modified to provide change detection functionality. This can involve analysing the generated surface for changes over a period of time in order to detect changes in terrain geometry and/or changes due to objects appearing, moving or disappearing. It is also possible to set up a process to detect whenever surface measurements are inconsistent with the surface estimate. The skilled person will appreciate that the process can be adapted to operate with inputs from fixed sensors to perform such change detection. This can be useful for several tasks, including detecting whether (part of) a terrain has been disturbed (intentionally or naturally).
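One possible change test, given purely by way of example, compares the control point estimates of two stored surfaces against their combined variances; the per-point treatment and the gate value (roughly three sigma) are assumptions of this sketch, not a test prescribed by the present disclosure:

```python
import numpy as np

def changed_control_points(Pz_a, var_a, Pz_b, var_b, gate=9.0):
    """Flag control points whose z estimates at two points in time differ
    by more than a chi-square-style gate given their combined variances.
    var_a and var_b would typically be the diagonals of the filter
    covariance at each time."""
    d2 = (Pz_a - Pz_b) ** 2 / (var_a + var_b)
    return d2 > gate                    # boolean mask of changed points
```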
One use for the gradient information that has been calculated is to predict whether gravity is likely to cause a land vehicle to fall over onto its side when it is standing on the surface having that gradient. Figure 7 shows a vehicle 700 on an arbitrary 2.5D surface 702. W denotes an external reference frame; S(u, v) denotes a surface parameterised by {u, v}; $n^W$ denotes the surface normal at the vehicle position, and $\alpha$ denotes the vehicle yaw measured in W. For simplicity, $n^W$ is defined as positive when directed into the surface interior.
Static rollover is assumed to occur when the magnitudes of the vehicle's pitch ($\beta$) and roll ($\rho$) exceed defined thresholds, as set out compactly in equation (1') below. Thus, the objective is to estimate the roll and pitch of the vehicle if it were placed at an arbitrary position on the surface at an arbitrary yaw ($\alpha$).
$$|\beta| > \beta_{lim} \;\cup\; |\rho| > \rho_{lim} \quad (1')$$

Considering the general case of rigid motion in $\mathbb{R}^3$ shown in Figure 8, $p_{ab} \in \mathbb{R}^3$ and $R_{ab} \in SO(3)$ denote the position vector and rotation matrix that define the orientation of frame B measured from frame A (see R M Murray, Z Li, S S Sastry, A Mathematical Introduction to Robotic Manipulation, CRC Press, 1994). The following conventions, used in aerospace for rigid motion in $\mathbb{R}^3$ with right-handed orthogonal coordinate systems, are adopted: C1 Construct the homogeneous transformation $G_{ab} \in SE(3): q_a = G_{ab} q_b$ by measuring the position $p_{ab}$ and orientation $R_{ab}$ of the origin frame B in the destination frame A. C2 Define the orientation of the origin frame B in the destination frame A through three consecutive rotations starting from frame A in the following order: first yaw ($\alpha$), then pitch ($\beta$), then roll ($\rho$).
C3 Construct $R_{ab}$, the rotation component of $G_{ab}: p_b \mapsto p_a$, by applying the same rotation angles measured in C2 in the opposite order, i.e. first roll, then pitch, then yaw.
C4 The valid ranges for roll, pitch and yaw are: $\rho \in [-\pi, \pi]$, $\beta \in [-\pi/2, \pi/2]$, $\alpha \in [-\pi, \pi]$.

Elementary rotations about the x-, y- and z-axes can be represented as rotation matrices $\in SO(3)$:

$$R_x(\rho) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\rho & -\sin\rho \\ 0 & \sin\rho & \cos\rho \end{pmatrix}, \quad R_y(\beta) = \begin{pmatrix} \cos\beta & 0 & \sin\beta \\ 0 & 1 & 0 \\ -\sin\beta & 0 & \cos\beta \end{pmatrix}, \quad R_z(\alpha) = \begin{pmatrix} \cos\alpha & -\sin\alpha & 0 \\ \sin\alpha & \cos\alpha & 0 \\ 0 & 0 & 1 \end{pmatrix}$$

Hence it is possible to derive the matrix form of $R_{ab}$ (5') using C3, writing $c_x \equiv \cos x$ and $s_x \equiv \sin x$:

$$R_{ab} = R_z(\alpha)R_y(\beta)R_x(\rho) = \begin{pmatrix} c_\alpha c_\beta & c_\alpha s_\beta s_\rho - s_\alpha c_\rho & c_\alpha s_\beta c_\rho + s_\alpha s_\rho \\ s_\alpha c_\beta & s_\alpha s_\beta s_\rho + c_\alpha c_\rho & s_\alpha s_\beta c_\rho - c_\alpha s_\rho \\ -s_\beta & c_\beta s_\rho & c_\beta c_\rho \end{pmatrix} \quad (5')$$

For the stated problem, the estimation of roll and pitch angles for a vehicle with known yaw at a position on the surface with a known surface normal $n^W = (n_1, n_2, n_3)^T$, frame A is defined to be the external frame (e.g. UTM) and frame B to be the vehicle frame. The origins of the frames can be assumed to be co-located: $p_{ab} = 0$. As $q_a = R_{ab} q_b$, and the surface normal expressed in the vehicle frame has no lateral or longitudinal component, the following relationships between frames A and B hold:

$$(c_\alpha s_\beta s_\rho - s_\alpha c_\rho)\,n_1 + (s_\alpha s_\beta s_\rho + c_\alpha c_\rho)\,n_2 + c_\beta s_\rho\,n_3 = 0 \quad (6')$$

$$c_\alpha c_\beta\,n_1 + s_\alpha c_\beta\,n_2 - s_\beta\,n_3 = 0 \quad (7')$$

Considering (7') and (5') it is possible to derive (8') for $\beta$: as $(n_1 c_\alpha + n_2 s_\alpha)c_\beta = n_3 s_\beta$,

$$\beta = \mathrm{atan2}(n_1 c_\alpha + n_2 s_\alpha,\; n_3) \quad (8')$$

Similarly, as $\beta$ is known from (8'), it is possible to derive (9') from (6') and (5'):

$$\rho = \mathrm{atan2}\big(c_\beta (s_\alpha n_1 - c_\alpha n_2),\; n_3\big) \quad (9')$$

It is noted that although there are multiple alternative formulations for calculating $\beta$ and $\rho$, only those where the final angle is obtained using atan2 should be used, in order to ensure that the result is correctly signed. In addition, (8') and (9') are numerically stable for all $\rho \in [-\pi, \pi]$, $\beta \in [-\pi/2, \pi/2]$, $\alpha \in [-\pi, \pi]$, which is generally not true of alternatives (any alternative formulations should thus be checked to ensure validity over the full ranges of all variables).
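By way of example only, (8') and (9') might be evaluated as in the following Python sketch; the function names and the rollover-test wrapper are illustrative assumptions:

```python
import numpy as np

def roll_pitch_from_normal(n, yaw):
    """Estimate pitch (beta, per (8')) and roll (rho, per (9')) of a vehicle
    with known yaw standing at a point whose surface normal in the external
    frame is n = (n1, n2, n3)."""
    n1, n2, n3 = n
    ca, sa = np.cos(yaw), np.sin(yaw)
    beta = np.arctan2(n1 * ca + n2 * sa, n3)                   # (8')
    rho = np.arctan2(np.cos(beta) * (sa * n1 - ca * n2), n3)   # (9')
    return rho, beta

def static_rollover_risk(rho, beta, rho_lim, beta_lim):
    """Static rollover test per (1'): risk if either magnitude exceeds
    its threshold."""
    return abs(beta) > beta_lim or abs(rho) > rho_lim
```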
In order to utilise the algorithms described above as a driver aid for manned vehicles, the trajectory the driver is planning to follow should ideally be known. As this is not always possible with a human driver, it may be preferable to evaluate $\rho$, $\beta$ for a set of kinematically-feasible candidate trajectories for the simple car shown in Figure 9 (see S M LaValle, "Planning Algorithms", Cambridge University Press, 2006), transformed to the human-driven vehicle's current position.
For completeness, a mathematical description of the terrain sensing process will now be given. B-spline surfaces are a standard representation in the computer-aided design community across a wide variety of fields due to their compact and intuitive representation of potentially complex surfaces. As such they are well understood by the skilled person (see, for example, Y Huang, X Qian, Dynamic B-spline Surface Reconstruction: Closing the sensing-and-modeling loop in 3D digitization, Computer-Aided Design, Vol. 39, Issue 11, November 2007). A brief overview of the key principles of B-spline surfaces is given below.
A B-spline surface of degrees p, q in the $u \in [0,1]$, $v \in [0,1]$ parametric directions along the surface respectively is defined by equation (1) below. The control points of the surface are denoted by $P_{i,j} \in \mathbb{R}^3$; $N_{i,p}(u) \in \mathbb{R}$ and $N_{j,q}(v) \in \mathbb{R}$ are the B-spline basis functions in the u, v parametric directions respectively, and $S(u,v) \in \mathbb{R}^3$ is the position of the surface at the evaluation point (u,v).
$$S(u,v) = \sum_{i}\sum_{j} N_{i,p}(u)\,N_{j,q}(v)\,P_{i,j} \quad (1)$$
For a B-spline surface of degrees p, q with n control points in u and m in v, there are r = n + p + 1 knots in U and s = m + q + 1 knots in V, the knot vectors in u and v respectively, given by equation (2) below:

$$U = \{0_0, \ldots, 0_p,\; u_{p+1}, \ldots, u_{r-p-2},\; 1_{r-p-1}, \ldots, 1_{r-1}\}$$
$$V = \{0_0, \ldots, 0_q,\; v_{q+1}, \ldots, v_{s-q-2},\; 1_{s-q-1}, \ldots, 1_{s-1}\} \quad (2)$$

The following properties of B-spline surfaces can be derived from the reference given above: * In any given rectangle $[u_{i_0}, u_{i_0+1}] \times [v_{j_0}, v_{j_0+1}]$ at most (p + 1)(q + 1) basis functions are non-zero, in particular $N_{i,p}(u)N_{j,q}(v)$ for $i_0 - p \le i \le i_0$ and $j_0 - q \le j \le j_0$.
* The surface interpolates the four corner control points: $S(0,0) = P_{0,0}$, $S(1,0) = P_{n,0}$, $S(0,1) = P_{0,m}$ and $S(1,1) = P_{n,m}$.
* Local modification scheme: if $P_{i,j}$ is moved it affects the surface only in the rectangle $[u_i, u_{i+p+1}] \times [v_j, v_{j+q+1}]$.
* S(u,v) is p − k times differentiable in the u direction at a u knot of multiplicity k; a similar relation holds for v, albeit with p interchanged for q.
In the general discrete linear system given by (3), at time-step k, x(k) is the state, u(k) is the input vector, v(k) is some additive motion noise, B(k) and G(k) are the input and noise transition matrices and F(k) is the state transition matrix. Correspondingly, z(k) is the observation at time-step k, H(k) is the observation matrix (model) and w(k) is some additive observation noise. It is additionally assumed that v(k) and w(k) (the noise terms) are zero-mean and have covariance matrices given by Q(k) and R(k), respectively.
$$x(k) = F(k)x(k-1) + B(k)u(k) + G(k)v(k)$$
$$z(k) = H(k)x(k) + w(k) \quad (3)$$

The prediction and update stages of a Kalman Filter for this system are given in equations (4) and (5), respectively:

$$\hat{x}(k|k-1) = F(k)\hat{x}(k-1|k-1) + B(k)u(k)$$
$$\Sigma(k|k-1) = F(k)\Sigma(k-1|k-1)F^T(k) + G(k)Q(k)G^T(k) \quad (4)$$

$$\hat{x}(k|k) = \hat{x}(k|k-1) + W(k)\big(z(k) - H(k)\hat{x}(k|k-1)\big)$$
$$W(k) = \Sigma(k|k-1)H^T(k)\big(H(k)\Sigma(k|k-1)H^T(k) + R(k)\big)^{-1}$$
$$\Sigma(k|k) = \Sigma(k|k-1) - W(k)H(k)\Sigma(k|k-1) \quad (5)$$

An illustrative code sketch of stages (4) and (5) is given after the assumptions below. In order to represent the local environment from uncertain point cloud observations using a B-spline surface (1), the following simplifying assumptions are made: A1 The operating environment can be accurately represented as a 2.5D surface, therefore $\forall \{x,y\}\; \exists!\, z$. A2 The variance in the x, y components of each point observation, denoted $\Delta x$, $\Delta y$, is negligible and can be assumed to be zero; hence the only source of uncertainty exists in the z component, $\Delta z \neq 0$.
A3 The {x,y} coordinates of $P_{i,j}$ are user-defined and known a priori; this is the easiest mechanism by which to enforce A1.
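For reference, the generic prediction (4) and update (5) stages may be sketched as follows; this is illustrative Python with matrix-valued arguments assumed, and no claim is made that it is the implementation used onboard:

```python
import numpy as np

def kf_predict(x, P, F, B, u, G, Q):
    """Prediction stage (4) for the linear system (3)."""
    x = F @ x + B @ u
    P = F @ P @ F.T + G @ Q @ G.T
    return x, P

def kf_update(x, P, z, H, R):
    """Update stage (5): innovation covariance S, gain W, then the
    corrected state and covariance."""
    S = H @ P @ H.T + R
    W = P @ H.T @ np.linalg.inv(S)
    x = x + W @ (z - H @ x)
    P = P - W @ H @ P
    return x, P
```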
From inspection of (1) it can be seen that the x, y, z components of $S(u,v) \in \mathbb{R}^3$ are linearly independent, as $N_{i,p}(u), N_{j,q}(v) \in \mathbb{R}$ are scalars and $P_{i,j} \in \mathbb{R}^3$.
Combined with A1 and A2, this means that the problem reduces to one of finding the z coordinates of the control points $P_{i,j}$, whose x, y positions are known, from uncertain observations of the z position of the environment (surface) at known x, y locations.
Hence (1) can be rewritten as (6):

$$z = A_{u,v} P_z, \qquad A_{u,v} = \big[\, N_{i,p}(u)\,N_{j,q}(v) \,\big] \quad (6)$$

where $z \in \mathbb{R}$ denotes the z coordinate of an observation of the surface defined by the control points $P_{i,j}$, $A_{u,v} \in \mathbb{R}^{1 \times nm}$ is the B-spline basis function product vector and $P_z \in \mathbb{R}^{nm}$ is the vector of the control point z coordinates obtained by reforming the control point matrix column-wise. It should be noted that the only requirement on the method through which $P_z$ is formed is that $A_{u,v}$ should be formed so as to maintain equivalence between (1) and (6); hence alternative methods could also be used, e.g. row-wise.
A significant issue in B-spline surface fitting is that of parameterisation: {x,y,z} → {u,v} (alternatively known as point inversion). Although the problem is theoretically solvable in closed form for $p,q \le 4$ when the point exists on the surface, in practice the closed-form method for $p,q \ge 3$ often suffers from numerical tolerance problems. For the above problem there is a simplification, based upon the assumptions described below, which simplifies the parameterisation problem.
Comparing (6) with (3) it can be seen that (6) may be posed as a discrete linear system, as shown in (7). It should be noted that the assumption of a static environment is explicitly utilised through the use of the identity matrix as the process model in (7):

$$P_z(k) = I\,P_z(k-1) + 0$$
$$z(k) = A_{u,v}(k)\,P_z(k) + \Delta z \quad (7)$$

Having recast the problem as a discrete linear system, it is possible to rewrite (4) and (5) to construct a Kalman Filter that estimates $P_z$ as its state, as shown in (8) and (9), where $\Lambda_P$, the covariance of the filter, is substituted for $\Sigma$.
$$\hat{P}_z(k|k-1) = I\,\hat{P}_z(k-1|k-1) + 0 = \hat{P}_z(k-1|k-1)$$
$$\Lambda_P(k|k-1) = I\,\Lambda_P(k-1|k-1)\,I^T + 0 = \Lambda_P(k-1|k-1) \quad (8)$$

$$\hat{P}_z(k|k) = \hat{P}_z(k|k-1) + W(k)\big(z(k) - A_{u,v}(k)\hat{P}_z(k|k-1)\big)$$
$$W(k) = \Lambda_P(k|k-1)A_{u,v}^T(k)\big(A_{u,v}(k)\Lambda_P(k|k-1)A_{u,v}^T(k) + \Delta z(k)\big)^{-1}$$
$$\Lambda_P(k|k) = \Lambda_P(k|k-1) - W(k)A_{u,v}(k)\Lambda_P(k|k-1) \quad (9)$$

In order to calculate the uncertainty of a surface represented by $\{\hat{P}_z, \Lambda_P\}$ at an arbitrary position, consider the general Kalman Filter detailed above; the residual $r(k) \triangleq z(k) - H(k)\hat{x}(k|k-1)$ is the difference between the measured value and the best estimate of its value prior to the measurement being taken. $S(k) = \mathrm{cov}(r(k), r^T(k))$ is the covariance of the residual (10), and it is possible to exploit $E\{r(k)\} = 0$ as $W(k)$ is chosen as the optimal Kalman gain.
$$\mathrm{Cov}(X,Y) = E\{XY\} - E\{X\}E\{Y\} = E\{(X - E\{X\})(Y - E\{Y\})\}$$
$$r(k) = z(k) - H(k)\hat{x}(k|k-1) = H(k)x(k) + w(k) - H(k)\hat{x}(k|k-1)$$
$$S(k) = E\{r(k)r^T(k)\} - E\{r(k)\}E\{r^T(k)\} = E\{r(k)r^T(k)\}$$
$$= E\Big\{\big[H(k)(x(k) - \hat{x}(k|k-1)) + w(k)\big]\big[H(k)(x(k) - \hat{x}(k|k-1)) + w(k)\big]^T\Big\}$$
$$= H(k)\,\Sigma(k|k-1)\,H^T(k) + R(k) \quad (10)$$

Equation (10) is specialised for the problem of B-spline surface reconstruction to form (11), where the residual covariance is the sum of the variances of two sources of uncertainty: the filter and the current observation. Hence, for a surface represented by $\{\hat{P}_z, \Lambda_P\}$, (12) defines the variance of the surface at parametric coordinates {u,v}.
$$S(k) = A_{u,v}\,\Lambda_P(k|k-1)\,A_{u,v}^T + \Delta z(k) \quad (11)$$
$$\Lambda_S = A_{u,v}\,\Lambda_P\,A_{u,v}^T \quad (12)$$

The Kalman Filter form of the stated problem given in (8) and (9) can be shown, for $l$ point observations, to have a computational complexity of $O(l \times (n \cdot m)^2)$, where n, m are the numbers of control points in the u, v directions respectively. Hence this approach is not attractive for situations where $l \gg n \cdot m$. An alternative formulation of the problem is presented using an Information Filter (see, for example, H.F. Durrant-Whyte, Multi-Sensor Data Fusion, Australian Centre for Field Robotics, University of Sydney, 2001), which results in a computational complexity with a general worst case of $O(l + (n \cdot m)^3)$. However, for the practical considerations discussed in detail below, this formulation is likely to be preferred in the majority of cases.
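For illustration, (11) and (12) might be evaluated as follows; the names are assumed, with A being the basis product vector for the query's {u,v}:

```python
import numpy as np

def surface_variance(A, Lambda_P, var_z=None):
    """Variance of the surface estimate at parametric coordinates {u, v},
    per (12); if the current observation's variance var_z is supplied,
    the residual covariance (11) is returned instead."""
    s = A @ Lambda_P @ A            # A Lambda_P A^T with A as a 1-D vector
    return s if var_z is None else s + var_z
```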
A conventional Kalman Filter maintains an estimate of the state $\hat{x}(i|j)$ and the associated covariance $\Sigma(i|j)$. An Information Filter instead maintains the information state vector $\hat{y}(i|j)$ and the information matrix $Y(i|j)$, which are defined in (13).
$$\hat{y}(i|j) = \Sigma^{-1}(i|j)\,\hat{x}(i|j), \qquad Y(i|j) = \Sigma^{-1}(i|j) \quad (13)$$

In the case when the additive process noise v(k) = 0, the prediction and update stages of the Information Filter for the linear system in (3) are given by (14) and (15), respectively.
$$\hat{y}(k|k-1) = F^{-T}(k)\,\hat{y}(k-1|k-1) + M(k)B(k)u(k)$$
$$Y(k|k-1) = M(k), \qquad M(k) = F^{-T}(k)\,Y(k-1|k-1)\,F^{-1}(k) \quad (14)$$

$$\hat{y}(k|k) = \hat{y}(k|k-1) + H^T(k)R^{-1}(k)z(k)$$
$$Y(k|k) = Y(k|k-1) + H^T(k)R^{-1}(k)H(k) \quad (15)$$

As before, to obtain the analogues of (8) and (9), (7) is substituted into the general linear system forms (14) and (15) in order to obtain the equations for the Information Filter form of the problem, given in (16) and (17).
$$\hat{y}(k|k-1) = I\,\hat{y}(k-1|k-1) = \hat{y}(k-1|k-1)$$
$$M(k) = I\,Y(k-1|k-1)\,I = Y(k-1|k-1), \qquad Y(k|k-1) = M(k) = Y(k-1|k-1) \quad (16)$$

$$\hat{y}(k|k) = \hat{y}(k|k-1) + A_{u,v}^T(k)\,\Delta z^{-1}(k)\,z(k)$$
$$Y(k|k) = Y(k|k-1) + A_{u,v}^T(k)\,\Delta z^{-1}(k)\,A_{u,v}(k) \quad (17)$$

The parameterisation of observations {x,y,z} → {u,v} is a necessary step in B-spline surface fitting. In the general case, Newton's method can be used to iteratively solve for the parametric coordinates {u,v} on the surface which minimise $\|\{x,y,z\}_P - S(\{u,v\}_S)\|^2$, where $\{x,y,z\}_P$ denotes the point for which the parametric coordinates are desired.
It can be observed, as a consequence of A3 and the observation above that the x, y, z components in (1) are linearly independent, that as $P_{i,j}(x,y)$ is constant, $S(x,y)$ is independent of $P_{i,j}(z)$. Thus, irrespective of $P_{i,j}(z)$, and hence of the shape of the surface, a fixed position {u,v} in parametric coordinates always maps to a fixed column {x,y} (z free) in the external frame. In conjunction with A1 and A2, this allows a parameterisation $\rho_{x,y}: \{x,y\} \mapsto \{u,v\}$ to be built for a 'simple' surface and then used, without loss of generality, on an arbitrary surface (subject to the above-mentioned assumptions). In this context a 'simple' surface is one that simplifies the construction of $\rho_{x,y}$, although the method of construction is still subject to the constraints mentioned above.
The Information Filter form of the B-spline surface reconstruction problem detailed above has a number of practical advantages over the Kalman Filter form. Principally, the computational cost of updating the filter is upper-bounded by $O(l + (n \cdot m)\,([p+1][q+1])^2)$, which simplifies in the case $l \gg (n \cdot m) > ([p+1][q+1])^2$ to $O(l)$, allowing large numbers of observations to be efficiently incorporated into the surface estimate. This efficiency in updating the filter comes at the cost of requiring an inversion, $O((n \cdot m)^3)$, of the information matrix to obtain $\Lambda_P$ when predicting using the filter.
However, for typical usage scenarios where the frequency of observations greatly exceeds the frequency of prediction requests, the conversion from Information Filter form back to Kalman Filter form, and the resulting $O((n \cdot m)^3)$ inversion operation, can be performed at the lower frequency of prediction requests. It is also noted that the bounded sparsity of the Information Filter allows alternative matrix inversion techniques and approximations that reduce the computational cost of inverting. Additionally, if prediction requests are batched then the inversion often need only be performed once per batch. As a result of the sparsity of the information matrix Y, discussed in detail below, it can be compactly transmitted, allowing easy separation of the update and prediction operations across multiple nodes as shown in Figure 10, further lessening the impact of the $O((n \cdot m)^3)$ inversion operation.
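By way of example, the conversion back to Kalman Filter form at prediction time, per (13), might look as follows (illustrative only):

```python
import numpy as np

def information_to_kalman(y, Y):
    """Recover the Kalman-filter quantities {P_z, Lambda_P} from the
    information form {y, Y} per (13); the O((n*m)^3) inversion is paid
    only when a prediction is requested, not per observation."""
    Lambda_P = np.linalg.inv(Y)     # covariance of the control point estimate
    P_z = Lambda_P @ y              # mean estimate of the control point z coords
    return P_z, Lambda_P
```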
Considering the Information Filter form of the B-spline surface fitting problem given above, it can be observed that the only update terms in the update stage (17) are $A_{u,v}^T(k)\Delta z^{-1}(k)z(k)$ and $A_{u,v}^T(k)\Delta z^{-1}(k)A_{u,v}(k)$. At most (p + 1)(q + 1) elements of $A_{u,v}$ are non-zero and hence it is sparse.
At this point, for brevity and ease of implementation, the following additional assumptions are made: A4 $P_z$ is formed column-wise from the control point matrix.
A5 The numbers of control points in the parametric directions u and v are equal: n = m.
A6 The degrees of the B-spline surface in the parametric directions u and v are equal: p = q.
Although similar formulations to those given here can be constructed for cases where these additional assumptions are not made, the form of the constructions is dependent upon the relationships between the B-spline parameters and their usage.
As a consequence of the assumptions above, $A_{u,v}$ consists of (p + 1) groups of (p + 1) non-zero elements separated by n elements. Thus $A_{u,v}$, and the resulting sum over all observations (in this filter update), are sparse, as shown in Figure 11 (where p = 3, n = 15).
In order to determine an upper bound on the number of non-zero elements in $\Delta Y \triangleq A_{u,v}^T\,\Delta z^{-1}\,A_{u,v}$, consider the mapping of P from a matrix to a vector as shown in Figure 12 (shaded elements are counted as non-zero).
Using this simplification, it is possible to derive the number of non-zero elements in $\Delta Y$, resulting in (18). The form of the derivation is to observe that $\Delta Y$ is symmetric, and thus only the diagonal indices $d \ge 0$ need be considered, and that for a matrix $M \in \mathbb{R}^{k \times k}$ and $-k \le d \le k$:

$$\dim(\mathrm{diag}(M, d)) = k - |d| \quad (18)$$

Using (18) it is possible to compare the number of elements of Y updated after exploitation of the sparsity of $A_{u,v}$ relative to the naïve case when all elements of $\Delta Y$ are calculated, $n^2 \times n^2 = n^4$, as shown in Figure 13. The benefits of exploiting the sparsity properties of Y are particularly relevant if the update and prediction stages are distributed over multiple nodes (Figure 10), due to the significant reduction in data that must be communicated between the nodes.
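The comparison of Figure 13 can be reproduced in outline with a small sketch; the bound used for the sparse case, $((p+1)^2)^2$ touched elements per observation under A4-A6, is this example's simplification rather than the exact per-diagonal count of (18):

```python
def updated_elements(n, p):
    """Compare information-matrix elements touched per observation when the
    sparsity of A_{u,v} is exploited (at most ((p+1)^2)^2 under A4-A6)
    against the naive full update of all (n^2)^2 elements of Y."""
    sparse = ((p + 1) ** 2) ** 2
    naive = (n ** 2) ** 2
    return sparse, naive
```

For the Figure 11 values (p = 3, n = 15), this gives 256 elements against 50,625, consistent with the kind of saving that Figure 13 illustrates.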
As detailed above, under the stated assumptions the parameterisation {x,y} → {u,v} can be built for a 'simple' surface and then used for an arbitrary surface. As $\rho_{x,y}$ is dependent upon U, V and p, it is implemented as a lookup table that is built for a flat surface ($P_z = 0$) by iteratively evaluating (1) at {u,v} coordinates and sub-sampling until the required resolution in {x,y} is achieved.
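An illustrative construction of such a lookup table is sketched below; the eval_xy helper, the grid-doubling strategy and the rounding scheme are assumptions of the example, not the prescribed implementation:

```python
import numpy as np

def build_parameterisation_lut(eval_xy, resolution, extent):
    """Build rho_{x,y}: {x,y} -> {u,v} for a flat surface (P_z = 0).
    eval_xy(u, v) is assumed to return the {x, y} position of S(u, v);
    the {u,v} grid is refined until adjacent samples are closer than the
    required resolution over the surface extent in {x, y}."""
    samples = 2
    while extent / samples > resolution:   # sub-sample until fine enough
        samples *= 2
    uv = np.linspace(0.0, 1.0, samples)
    lut = {}
    for u in uv:
        for v in uv:
            x, y = eval_xy(u, v)
            key = (round(x / resolution), round(y / resolution))
            lut[key] = (u, v)              # nearest-cell {u,v} for this {x,y}
    return lut
```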

Claims (15)

  1. A method of terrain sensing for use onboard a vehicle (100), the method including: obtaining data (104) from at least one sensor onboard a vehicle; using the sensor data to generate data representing a surface defined by a set of control points; generating gradient information by computing a gradient of the surface for at least one query point, and converting the gradient information into a local frame coordinate system for the vehicle.
  2. A method according to claim 1, including using a Kalman filter process to generate a B-spline representation of the surface.
  3. A method according to claim 2, wherein the B-spline is a NURBS (Non Uniform Rational Basis Spline).
  4. A method according to claim 3, wherein the step of generating gradient information includes calculating a normal vector to the surface and deriving a transform relative to the z-axis of the vehicle frame.
  5. A method according to any one of the preceding claims, further including checking the gradient information against at least one threshold to assess vehicle rollover risk.
  6. A method according to claim 4, wherein the step of assessing the vehicle rollover risk includes: obtaining data representing a yaw of the vehicle; obtaining data representing a normal of a location on the surface where the vehicle is positioned, and using the yaw and the surface normal data to estimate roll and pitch of the vehicle at its location.
  7. A method according to claim 6, further including comparing the roll and pitch with models to assess whether the vehicle is at risk from a rollover.
  8. A method according to claim 7, further including a step of using the converted gradient information to assist planning a route for the vehicle.
  9. A method according to claim 8, further including creating a route for the vehicle that avoids an area including the rollover risk.
  10. A method according to any one of the preceding claims, further including: storing a first data set representing the surface at a first point in time; storing a second data set representing the surface at a second point in time, and comparing the first and the second data sets to detect changes relating to the surface.
  11. A computer program product comprising a computer readable medium having thereon computer program code means which, when the program code is loaded, make the computer execute a method according to any one of the preceding claims.
  12. Apparatus configured to sense terrain, including a processor (106) configured to: obtain data from at least one sensor (104) onboard a vehicle; use the sensor data to generate data representing a surface defined by a set of control points; generate gradient information by computing a gradient of the surface for at least one query point, and convert the gradient information into a local frame coordinate system for the vehicle.
  13. Apparatus according to claim 12, wherein the at least one sensor includes a LIDAR, RADAR or image-based sensor.
  14. Apparatus according to claim 12 or 13, wherein the vehicle is an at least partially autonomous vehicle.
  15. A vehicle including apparatus according to claim 12 or 13.
GB0910730A 2009-06-22 2009-06-22 Terrain sensing apparatus for an autonomous vehicle Withdrawn GB2471276A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0910730A GB2471276A (en) 2009-06-22 2009-06-22 Terrain sensing apparatus for an autonomous vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0910730A GB2471276A (en) 2009-06-22 2009-06-22 Terrain sensing apparatus for an autonomous vehicle

Publications (2)

Publication Number Publication Date
GB0910730D0 GB0910730D0 (en) 2009-08-05
GB2471276A true GB2471276A (en) 2010-12-29

Family

ID=40972555

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0910730A Withdrawn GB2471276A (en) 2009-06-22 2009-06-22 Terrain sensing apparatus for an autonomous vehicle

Country Status (1)

Country Link
GB (1) GB2471276A (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112946625B (en) * 2021-02-04 2022-07-15 江南大学 B-spline shape-based multi-extended target track tracking and classifying method
CN117433491B (en) * 2023-12-20 2024-03-26 青岛亿联建设集团股份有限公司 Foundation pit engineering safety monitoring method based on unmanned aerial vehicle image

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09210682A (en) * 1996-02-06 1997-08-12 Ishikawajima Harima Heavy Ind Co Ltd Device for detecting inclination of road surface by laser
US6963657B1 (en) * 1999-09-24 2005-11-08 Honda Giken Kogyo Kabushiki Kaisha Object recognition system
JP2003065740A (en) * 2001-08-27 2003-03-05 Nissan Motor Co Ltd Device for determining slope of forward road
WO2006123438A1 (en) * 2005-05-20 2006-11-23 The Circle For The Promotion Of Science And Engineering Method of detecting planar road region and obstruction using stereoscopic image
JP2007318460A (en) * 2006-05-26 2007-12-06 Alpine Electronics Inc Vehicle upper viewpoint image displaying apparatus
JP2008033781A (en) * 2006-07-31 2008-02-14 Toyota Motor Corp Road surface gradient detection device and image display device
US20090041337A1 (en) * 2007-08-07 2009-02-12 Kabushiki Kaisha Toshiba Image processing apparatus and method

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9041589B2 (en) 2012-04-04 2015-05-26 Caterpillar Inc. Systems and methods for determining a radar device coverage region
US9261881B1 (en) * 2013-08-01 2016-02-16 Google Inc. Filtering noisy/high-intensity regions in laser-based lane marker detection
US9440652B1 (en) 2013-08-01 2016-09-13 Google Inc. Filtering noisy/high-intensity regions in laser-based lane marker detection
WO2017174314A1 (en) * 2016-04-05 2017-10-12 Jaguar Land Rover Limited Slope detection system for a vehicle
WO2017174606A1 (en) * 2016-04-05 2017-10-12 Jaguar Land Rover Limited Improvements in vehicle speed control
US11021160B2 (en) 2016-04-05 2021-06-01 Jaguar Land Rover Limited Slope detection system for a vehicle
US10611375B2 (en) 2016-04-05 2020-04-07 Jaguar Land Rover Limited Vehicle speed control
DE102016110461A1 (en) * 2016-06-07 2017-12-07 Connaught Electronics Ltd. Method for detecting an inclination in a roadway for a motor vehicle, driver assistance system and motor vehicle
CN107560599B (en) * 2017-09-04 2020-05-12 清华大学 Road gradient data processing method based on feature point sampling and curve fitting
CN107560599A (en) * 2017-09-04 2018-01-09 清华大学 A kind of road grade data processing method of feature based point sampling and curve matching
EP3757613A4 (en) * 2018-09-28 2021-05-26 Tencent Technology (Shenzhen) Company Limited Method and device for determining road gradient, storage medium, and computer device
US11370445B2 (en) 2018-09-28 2022-06-28 Tencent Technology (Shenzhen) Company Limited Road gradient determining method and apparatus, storage medium, and computer device
EP3734390A1 (en) * 2019-05-03 2020-11-04 Ningbo Geely Automobile Research & Development Co. Ltd. Automatic onboarding for service ramp
GB2591332A (en) * 2019-12-19 2021-07-28 Motional Ad Llc Foreground extraction using surface fitting
US11161525B2 (en) 2019-12-19 2021-11-02 Motional Ad Llc Foreground extraction using surface fitting
GB2591332B (en) * 2019-12-19 2024-02-14 Motional Ad Llc Foreground extraction using surface fitting
US11976936B2 (en) 2019-12-19 2024-05-07 Motional Ad Llc Foreground extraction using surface fitting
US11792644B2 (en) 2021-06-21 2023-10-17 Motional Ad Llc Session key generation for autonomous vehicle operation

Also Published As

Publication number Publication date
GB0910730D0 (en) 2009-08-05

Similar Documents

Publication Publication Date Title
GB2471276A (en) Terrain sensing apparatus for an autonomous vehicle
EP2273334A1 (en) Terrain sensing
US20240011776A9 (en) Vision-aided inertial navigation
Chilian et al. Multisensor data fusion for robust pose estimation of a six-legged walking robot
Pomárico-Franquiz et al. Accurate self-localization in RFID tag information grids using FIR filtering
Helmick et al. Slip-compensated path following for planetary exploration rovers
Cherubini et al. Visual navigation with obstacle avoidance
Li et al. Using consecutive point clouds for pose and motion estimation of tumbling non-cooperative target
Zhu et al. Multisensor fusion using fuzzy inference system for a visual-IMU-wheel odometry
Helmick et al. Slip compensation for a Mars rover
Dang et al. Tightly-coupled data fusion of VINS and odometer based on wheel slip estimation
Nazemipour et al. MEMS gyro bias estimation in accelerated motions using sensor fusion of camera and angular-rate gyroscope
Yu et al. Tightly-coupled fusion of VINS and motion constraint for autonomous vehicle
Barrau et al. Invariant filtering for pose ekf-slam aided by an imu
Verras et al. Vision and inertial sensor fusion for terrain relative navigation
Ye et al. Model-based offline vehicle tracking in automotive applications using a precise 3D model
Adams et al. Velocimeter LIDAR-based multiplicative extended Kalman filter for Terrain relative navigation applications
Wang et al. Information-fusion based robot simultaneous localization and mapping adapted to search and rescue cluttered environment
Fazekas et al. Identification of kinematic vehicle model parameters for localization purposes
Zhai et al. A novel Stereo-IMU system based on accurate pose parameterization of two-wheeled inverted pendulum (TWIP) robot in localization and mapping
Stanislav et al. Sensors data fusion via bayesian filter
Goudar et al. Ultra-wideband aided inertial navigation: Observability analysis and sensor extrinsics calibration
Macias et al. Single landmark feedback-based time optimal navigation for a differential drive robot
Kelly et al. Simultaneous mapping and stereo extrinsic parameter calibration using GPS measurements
Johnson Continuous-time Trajectory Estimation and its Application to Sensor Calibration and Differentially Flat Systems

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)