GB2544153A - Motion compensation for on-board vehicle sensors - Google Patents
Motion compensation for on-board vehicle sensors
- Publication number
- GB2544153A (application GB1614762.1)
- Authority
- GB
- United Kingdom
- Prior art keywords
- vehicle
- motion
- inputs
- sensors
- computer system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/10—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R16/00—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
- B60R16/02—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
- B60R16/023—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
- B60R16/0231—Circuits relating to the driving or the functioning of the vehicle
- B60R16/0232—Circuits relating to the driving or the functioning of the vehicle for measuring vehicle parameters and indicating critical, abnormal or dangerous conditions
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/02—Registering or indicating driving, working, idle, or waiting time only
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C7/00—Tracing profiles
- G01C7/02—Tracing profiles of land surfaces
- G01C7/04—Tracing profiles of land surfaces involving a vehicle which moves along the profile to be traced
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R16/00—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
- B60R16/02—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
- B60R16/023—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
- B60R16/0231—Circuits relating to the driving or the functioning of the vehicle
- B60R16/0232—Circuits relating to the driving or the functioning of the vehicle for measuring vehicle parameters and indicating critical, abnormal or dangerous conditions
- B60R16/0233—Vehicle tilting, overturning or roll over
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/06—Road conditions
Abstract
A method and system 10 for improving the accuracy with which a road profile ahead of a vehicle 18 may be determined. The method includes a computer 12 receiving a plurality of inputs corresponding to a plurality of on-board sensors 16 of a vehicle, the computer estimating motion of the vehicle using the received inputs, and the on-board computer system correcting data from one or more sensors for the motion of the vehicle. The one or more sensors may comprise a sensor for determining the driving environment in front of the vehicle. The one or more inputs received by the computer preferably relate to the attitude of the vehicle body in terms of pitch, yaw, or roll.
Description
MOTION COMPENSATION FOR ON-BOARD VEHICLE SENSORS
BACKGROUND
FIELD OF THE INVENTION
[001] This invention relates to vehicular systems and more particularly to systems and methods for improving the accuracy with which a road profile ahead of the vehicle may be determined.
BACKGROUND OF THE INVENTION
[002] To provide, enable, or support functionality such as driver assistance, controlling vehicle dynamics, and/or autonomous driving, an accurate and clear picture of the environment around (e.g., ahead of) a vehicle is vital. Unfortunately, the motion of the vehicle itself often makes it difficult to extract such a picture from the signal output by on-board sensors. Accordingly, what is needed is a system and method for improving the accuracy with which a road profile ahead of a vehicle may be determined.
SUMMARY OF THE INVENTION
[003] According to a first aspect of the present invention there is provided a method as set forth in claim 1 of the appended claims.
[004] According to a second aspect of the present invention, there is provided a vehicle as set forth in claim 13 of the appended claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[005] In order that the advantages of the invention will be readily understood, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered limiting of its scope, the invention will be described and explained with additional specificity and detail through use of the accompanying drawings, in which:
[006] Figure 1 is a schematic diagram illustrating a vehicle at a first instant in time, the vehicle comprising a system for correcting sensor inputs in accordance with the present invention;
[007] Figure 2 is a schematic diagram illustration of the vehicle of Figure 1 at a second instant in time in which the vehicle is pitched forward;
[008] Figure 3 is a schematic block diagram of one embodiment of software that may be executed by the system of Figure 1;
[009] Figure 4 is a schematic block diagram of one embodiment of a method corresponding to or executed by a system in accordance with the present invention;
[0010] Figure 5 is a schematic diagram illustration of the vehicle of Figure 1 at a third instant in time in which one or more sensors of the system are “viewing” a pothole located ahead of the vehicle;
[0011] Figure 6 is a schematic diagram illustration of the vehicle of Figure 1 at a fourth instant in time in which the vehicle is encountering (e.g., driving over) the pothole;
[0012] Figure 7 is a schematic block diagram of an alternative embodiment of software that may be executed by the system of Figure 1;
[0013] Figure 8 is a schematic block diagram of an alternative embodiment of a method corresponding to or executed by a system in accordance with the present invention; and
[0014] Figure 9 is a schematic block diagram of another alternative embodiment of a method corresponding to or executed by a system in accordance with the present invention.
DETAILED DESCRIPTION
[0015] It will be readily understood that the components of the present invention, as generally described and illustrated in the Figures herein, could be arranged and designed in a wide variety of different configurations. Thus, the following more detailed description of the embodiments of the invention, as represented in the Figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of certain examples of presently contemplated embodiments in accordance with the invention. The presently described embodiments will be best understood by reference to the drawings, wherein like parts are designated by like numerals throughout.
[0016] Referring to Figures 1 and 2, a system 10 in accordance with the present invention may improve the accuracy with which a driving environment may be characterized. A system 10 may do this in any suitable manner. For example, a system 10 may be embodied as hardware, software, or some combination thereof.
[0017] In certain embodiments, a system 10 may include a computer system 12, a data acquisition system 14, and one or more sensors 16. The computer system 12, data acquisition system 14, and sensors 16 may be carried on-board a vehicle 18. Accordingly, those components 12, 14, 16 may each be characterized as “on-board” components. In operation, one or more sensors 16 may output signals, a data acquisition system 14 may convert those signals into inputs processable by an on-board computer system 12, and the on-board computer system 12 may process data corresponding to one or more sensors 16 in order to improve the accuracy thereof.
[0018] In selected embodiments, various sensors 16 may each comprise a transducer that senses or detects some characteristic of an environment and provides a corresponding output (e.g., an electrical or optical signal) that defines that characteristic. For example, one or more sensors 16 of a system 10 may be accelerometers that output an electrical signal characteristic of the proper acceleration being experienced thereby. Such accelerometers may be used to determine the orientation, acceleration, velocity, and/or distance traveled by a vehicle 18. Other sensors 16 of a system 10 may include cameras, laser scanners, lidar scanners, ultrasonic transducers, radar devices, gyroscopes, inertial measurement units, revolution counters or sensors, strain gauges, temperature sensors, or the like or combinations or sub-combinations thereof.
[0019] On-board sensors 16 may monitor the environment of a corresponding vehicle 18. Certain sensors 16 such as cameras, laser scanners, ultrasonic devices, radars, or the like may be used for driver assistance, controlling vehicle dynamics, and/or autonomous driving. For example, the data from such sensors 16 may be used to identify objects (e.g., other vehicles, traffic signs, etc.) or surface anomalies (e.g., bumps, potholes, truck ruts) and their position (e.g., angle, distance) relative to a corresponding vehicle 18.
[0020] If a vehicle 18 rides over a bumpy road, a forward-looking sensor 16 (e.g., a vehicle-mounted camera 16a, laser sensor 16a, or the like monitoring the road surface ahead of the vehicle 18) may register the same portion of road at different angles, depending on the current motion state of the vehicle 18. Thus, a road profile calculated from this noisy sensor information becomes less accurate. Accordingly, in certain embodiments, to improve the usability of information collected from such sensors 16, a system 10 in accordance with the present invention may compensate for motion of the vehicle 18.
[0021] For example, at a first instant 20 in time, a forward-looking sensor 16a may have a range extending a first distance 22. However, at a second instant 24 in time, a vehicle 18 may have pitched forward 26 due to a bump, braking, or the like. Accordingly, in the second instant 24, the forward-looking sensor 16a may have a range extending a second distance 28 that is significantly shorter than the first distance 22. Thus, between the first and second instants 20, 24, the “view” of the sensor 16a may have changed faster or differently than what would have occurred strictly based on the velocity of the vehicle. This may produce unwanted noise or instability in the output of that sensor 16a. Accordingly, a system 10 may work to filter out such noise.
[0022] Referring to Figure 3, in selected embodiments, a data acquisition system 14 may sample signals 30 output by one or more sensors 16 and convert the resulting samples into inputs 32 (e.g., digital numeric values) that can be manipulated by an on-board computer system 12. For example, a data acquisition system 14 may convert signals 30 in the form of analog waveforms into inputs 32 in the form of digital values suitable for processing. In certain embodiments, a data acquisition system 14 may include conditioning circuitry that converts signals 30 output by one or more sensors 16 into forms that can be converted to digital values, as well as analog-to-digital converters to perform such conversion.
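The analog-to-digital conversion step may be illustrated with a simple quantization model. The voltage range and resolution below are assumptions chosen for the sketch, not values from the patent:

```python
# Illustrative sketch of an ADC stage in a data acquisition system 14:
# map an analog voltage onto an unsigned n-bit code, clamping values
# outside the input range. The +/-5 V range and 12-bit resolution are
# hypothetical example parameters.

def adc_convert(voltage, v_min=-5.0, v_max=5.0, bits=12):
    levels = (1 << bits) - 1                 # highest code, e.g. 4095 for 12 bits
    clamped = max(v_min, min(v_max, voltage))  # conditioning: clamp to range
    return round((clamped - v_min) / (v_max - v_min) * levels)
```

A real conditioning front end would also filter and scale the signal before quantization; this sketch models only the mapping from voltage to digital code.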
[0023] In certain embodiments, inputs 32 produced by a data acquisition system 14 may be separated into two categories 34, 36. In a first category 34 may be those inputs 32 that are less adversely affected by motion of a corresponding vehicle 18. In a second category 36 may be those that are more adversely affected by motion of a corresponding vehicle 18.
[0024] In selected embodiments, inputs 32 in the first category 34 may include one or more driver inputs 32a and/or direct motion inputs 32b. Driver inputs 32a may include one or more values characterizing things such as velocity, drive torque, brake actuation, steering input, or the like or combinations or sub-combinations thereof. Direct motion inputs 32b may include one or more values obtained from one or more signals 30 corresponding to one or more inertial measurement units, gyroscopes, accelerometers, or the like or combinations or sub-combinations thereof.
[0025] Inputs 32 in the second category 36 may include one or more inputs 32 for which correction or compensation is needed or desired. Such inputs 32 may include one or more values corresponding to one or more forward-looking sensors 16. For example, inputs 32 in the second category 36 may correspond to one or more cameras, laser sensors or scanners, lidar scanners, ultrasonic transducers, or the like or combinations or sub-combinations thereof.
[0026] An on-board computer system 12 in accordance with the present invention may provide, enable, or support an integrated compensation system utilizing various sensors 16 mounted on a corresponding vehicle 18. In selected embodiments, this may be accomplished at least in part by using inputs 32 in the first category 34 to correct or compensate for inputs 32 in the second category 36.
[0027] In selected embodiments, an on-board computer system 12 may be self-contained and operate independently of any other computer system or hardware that is not carried on-board the corresponding vehicle 18. Alternatively, an on-board computer system 12 may communicate as necessary with at least one remote computer via a communication network (e.g., a cellular network, satellite network, local area wireless network, or the like).
[0028] An on-board computer system 12 may comprise computer hardware and computer software. The computer hardware of an on-board computer system 12 may include one or more processors, memory, a user interface, one or more antennas, other hardware, or the like or a combination or sub-combination thereof. The memory may be operably connected to the one or more processors and store the computer software. This may enable the one or more processors to execute the computer software.
[0029] A user interface of an on-board computer system 12 may enable an engineer, technician, or a driver to interact with, customize, or control various aspects of an on-board computer system 12. In selected embodiments, a user interface of an onboard computer system 12 may include one or more buttons, keys, touch screens, pointing devices, or the like or a combination or sub-combination thereof. In other embodiments, a user interface of an on-board computer system 12 may simply comprise one or more connection ports, pairings, or the like that enable an external computer to interact or communicate with the on-board computer system 12.
[0030] In certain embodiments, an on-board computer system 12 may include an antenna that enables the on-board computer system 12 to communicate with at least one remote computer via a communication network (e.g., a cellular network connected to the Internet), an antenna to receive GPS signals from one or more GPS satellites, or the like or a combination thereof.
[0031] In selected embodiments, the memory of an on-board computer system 12 may store software programmed to use inputs 32 in the first category 34 to correct or compensate for inputs 32 in the second category 36. Such software may have any suitable configuration. In certain embodiments, the software of an on-board computer system 12 may include a motion-estimation module 38 and a motion-compensation module 40.
[0032] A motion-estimation module 38 may use one or more inputs 32 to determine how a corresponding vehicle 18 is moving. In selected embodiments, this may be accomplished by combining inputs 32a characterizing driver controlled parameters such as velocity, drive torque, brake actuation, steering input, or the like with inputs 32b indicative of a current attitude or orientation of a body of the vehicle 18 to obtain motion information 42 articulating the current motion state of the body of the vehicle 18.
[0033] In selected embodiments, a motion-estimation module 38 may use inputs 32a characterizing driver-controlled parameters to obtain or define a vector setting forth a direction of travel and velocity at a particular moment in time. Inputs 32b indicative of a current attitude or orientation of a body of the vehicle 18 may correspond to one or more inertial measurement units, gyroscopes, accelerometers, or the like or combinations or sub-combinations thereof. A motion-estimation module 38 may use such inputs 32b to define one or more parameters such as pitch, roll, and yaw of a body of the vehicle 18. Accordingly, by using both inputs 32a, 32b, a motion-estimation module 38 may output motion information 42 that substantially completely estimates and articulates the motion of the body of the vehicle 18 at a given moment in time.
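The combination of driver inputs 32a and attitude inputs 32b into motion information 42 may be sketched as follows. The data-structure and function names are hypothetical, and the sketch assumes gyro rates integrated over one time step:

```python
# Illustrative sketch of a motion-estimation step: driver-controlled
# parameters (speed, heading) yield a planar velocity vector, while
# gyro rates integrated over dt update the body attitude angles.
# All names are hypothetical, not taken from the patent text.
import math
from dataclasses import dataclass

@dataclass
class MotionState:
    vx: float      # velocity components in the ground plane (m/s)
    vy: float
    pitch: float   # body attitude angles (rad)
    roll: float
    yaw: float

def estimate_motion(speed, heading, pitch_rate, roll_rate, yaw_rate,
                    prev: MotionState, dt):
    return MotionState(
        vx=speed * math.cos(heading),
        vy=speed * math.sin(heading),
        pitch=prev.pitch + pitch_rate * dt,
        roll=prev.roll + roll_rate * dt,
        yaw=prev.yaw + yaw_rate * dt,
    )
```

A production estimator would fuse these sources rather than simply integrate raw rates (gyro drift accumulates), but the sketch shows how both input categories contribute to one motion state.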
[0034] The motion information 42 produced by a motion-estimation module 38 may be used by a motion-compensation module 40 to compensate for the motions which certain sensors 16 (e.g., sensors 16 corresponding to inputs 32 of the second category 36) experience relative to the surroundings they are intended to measure. This compensation may be the product of an algorithm applied by the motion-compensation module 40. Accordingly, a motion-compensation module 40 may output one or more corrected inputs 44 or compensated inputs 44 that are more useful (e.g., more stable, less noisy, etc.).
[0035] Referring to Figure 4, a system 10 may support, enable, or execute a process 46 in accordance with the present invention. In selected embodiments, such a process 46 may begin with receiving 48, by a data acquisition system 14, signals 30 from one or more sensors 16. The signals 30 may be converted 50 to computer inputs 32 by the data acquisition system 14.
[0036] An on-board computer system 12 (e.g., a motion-estimation module 38) may use 52 certain inputs 32 (e.g., inputs 32 in a first category 34) to determine the current motion of a corresponding vehicle 18. The on-board computer system 12 (e.g., a motion-compensation module 40) may further apply 54 a compensation algorithm to correct other inputs 32 (e.g., inputs 32 in a second category 36). In selected embodiments, the compensation algorithm may use information 42 defining the current motion of the corresponding vehicle 18 to reduce the adverse effects of that motion on certain inputs 32 (e.g., inputs 32 in a second category 36). Accordingly, an on-board computer system 12 (e.g., a motion-compensation module 40) may output 56 corrected inputs 44.
[0037] In general, there may be three coordinate systems to consider in devising a compensation algorithm. The first may be a global, inertial coordinate system. The second may be an undisturbed coordinate system of a vehicle 18. This may be the coordinate system of an “undisturbed” version of the vehicle 18, which may be defined as having its “xy” plane parallel to a ground plane (e.g., an estimated ground plane). The third may be a disturbed coordinate system of the vehicle 18. This may be the coordinate system of an actual vehicle performing roll, pitch, heave, and yaw motions which can be driver-induced (e.g., caused by steering, braking, accelerating, or the like) or road-induced (e.g., caused by road irregularities or the like) or due to other disturbances (e.g., side wind or the like).
[0038] In selected embodiments, a compensation algorithm may transform inputs 32 corresponding to signals 30 measured in the third coordinate system into the first coordinate system (e.g., for determining the positions of targets/objects relative to the vehicle 18) and/or the second coordinate system (e.g., for performing collision avoidance functionalities). Transforming the inputs 32 from one coordinate system to another may be performed using transformation matrices.
[0039] For example, any disturbance in the inputs 32 (which inputs 32 may be described as vectors) caused by motion of the vehicle body relative to the first coordinate system (e.g., heave, pitch, roll, yaw) may be seen as a transformation that could be described as a matrix operation. Accordingly, correcting the disturbance may involve undoing the disturbance transformation by performing another matrix transformation. The coupled result of both transformation operations (e.g., disturbance transformation and correction transformation) may be neutral. Thus, in selected embodiments, a compensation algorithm may execute three steps, namely: (1) determine or estimate a disturbance transformation (e.g., disturbance matrix) based on a currently detected motion state; (2) determine a transformation operation (e.g., correction matrix) that would compensate for the estimated disturbance; and (3) perform the compensation transformation on a current, disturbed input 32 (e.g., input vector).
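The three steps above may be sketched for a pure attitude disturbance (pitch, roll, yaw, with heave omitted). The angle values and composition order Rz·Ry·Rx below are assumptions for the sketch; for a rotation, the correction matrix of step (2) is simply the transpose of the disturbance matrix:

```python
# Illustrative sketch of the three-step compensation for an attitude
# disturbance. Angles and names are hypothetical example values.
import math

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(m):
    return [[m[j][i] for j in range(3)] for i in range(3)]

def apply(m, v):
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]

def rotation_matrix(pitch, roll, yaw):
    """Disturbance matrix: body rotation composed as Rz(yaw) Ry(pitch) Rx(roll)."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    rx = [[1, 0, 0], [0, cr, -sr], [0, sr, cr]]
    ry = [[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]]
    rz = [[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]]
    return matmul(rz, matmul(ry, rx))

# Step 1: estimate the disturbance matrix from the detected motion state.
disturbance = rotation_matrix(pitch=0.05, roll=0.0, yaw=0.02)
# Step 2: the correction matrix undoes it (transpose = inverse for rotations).
correction = transpose(disturbance)
# Step 3: apply the correction to a disturbed input vector.
true_point = [20.0, 1.5, 0.0]               # road point in undisturbed axes
disturbed = apply(disturbance, true_point)  # what the pitched sensor reports
corrected = apply(correction, disturbed)    # recovers the true point
```

The coupled result of the two transformations is neutral, as the passage states: applying the correction to the disturbed vector recovers the original coordinates.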
[0040] The steps 48, 50, 52, 54, 56 for obtaining corrected inputs 44 may be substantially continuously repeated. The time interval between each such repeat may be relatively small (e.g., a small fraction of a second). Accordingly, corrected inputs 44 that are up-to-date may be substantially continuously available. Thus, the corrected inputs 44 produced by a system 10 in accordance with the present invention may be suitable for use in processes where a fast reaction time is important.
[0041] Corrected inputs 44 may be employed 58 to control certain operations of a corresponding vehicle 18. For example, in selected embodiments, corrected inputs 44 may be used for driver assistance. This may include collision avoidance or the like. In other embodiments, corrected inputs 44 may be used for controlling vehicle dynamics. This may include modifying the dynamics of a vehicle 18 to better deal with an obstacle. For example, corrected inputs 44 may enable a vehicle 18 to lift a wheel to avoid a pothole or the like. In still other embodiments, corrected inputs 44 may be used for autonomous driving. This may include recognizing and properly negotiating road boundaries, lane markers, obstacles, other vehicles, and the like.
[0042] Referring to Figures 5 and 6, in selected embodiments, corrected inputs 44 may provide a better view of certain physical features in the environment of the corresponding vehicle 18. Accordingly, a system 10 may use that better view to better estimate the motion of the vehicle 18, which better estimation may better correct one or more inputs 32. In certain embodiments, this may allow one or more forward-looking sensors 16a to contribute to and improve the accuracy of the motion information 42.
[0043] For example, one or more forward-looking sensors 16a (e.g., cameras) may detect a physical feature ahead of a corresponding vehicle 18. An on-board computer system 12 may use that physical feature in one or more ways to better estimate the current motion of the vehicle 18. In certain embodiments, the relative position of the physical feature may be tracked over some period of time. This may enable an on-board computer system 12 to better understand how a corresponding vehicle 18 is moving with respect to that physical feature. For example, if the physical feature is a horizon, tracking that horizon using one or more forward-looking sensors 16a may provide information on the pitch, the roll, and potentially the yaw of the vehicle 18.
[0044] Alternatively, or in addition thereto, a physical feature may be detected and analyzed to determine something about the motion of the vehicle at a future time (e.g., at the time the corresponding vehicle 18 encounters or drives over that physical feature). For example, at a first instant 60 in time, one or more forward-looking sensors 16a may detect a physical feature such as a pothole 62, bump, curving center line, or the like. After determining the nature of the physical feature, speed of the vehicle 18, direction of the vehicle 18, and the like, an on-board computer system 12 may predict how the physical feature will affect the motion of the vehicle 18 at a second instant 64 in time (i.e., the time the vehicle 18 encounters the physical feature). Accordingly, an on-board computer system 12 may use that prediction to obtain more accurate motion information 42 suitable for correcting inputs 32 corresponding to signals 30 collected at the time of the second instant 64.
[0045] Referring to Figure 7, in certain embodiments, the software of an on-board computer system 12 may include a motion-estimation module 38, a motion-compensation module 40, and a sensor-evaluation module 66. A sensor-evaluation module 66 may analyze one or more corrected inputs 44 to obtain therefrom data 68 that may be used by a motion-estimation module 38 to better estimate the motion of the corresponding vehicle 18. The data 68 may characterize current motion of a vehicle 18, future motion that a physical feature or set of physical features is likely to produce when the physical feature or set is encountered, or a combination of current motion and future motion.
[0046] In selected embodiments, a sensor-evaluation module 66 may provide, support, or enable an iterative process to produce corrected inputs 44. An iterative process may be particularly useful when the data 68 characterizes the current motion of a vehicle 18. The number of iterations may vary and may be selected to accommodate the available processing capabilities and time. In certain embodiments, to shorten response time, only one iteration may be executed for a given set or batch of inputs 32, which set or batch would correspond to a particular moment in time.
[0047] In embodiments where the data 68 characterizes (e.g., exclusively characterizes) future motion that a physical feature or set of physical features is likely to produce when the physical feature or set is encountered, the data 68 may be taken into account without an iterative process. For example, the data 68 corresponding to a first instant 60 in time may be stored in memory of an on-board computer system 12. Accordingly, the data 68 may be available and ready to use in estimating the motion of the body of the vehicle 18 at the second instant 64 in time.
[0048] In selected embodiments, a motion-estimation module 38 may include a virtual, vehicle-motion model 70. In operation, a vehicle-motion model 70 may be provided with one or more driver inputs 32a and data 68 characterizing a road ahead of the vehicle 18. With this information 32, 68, the vehicle-motion model 70 may predict motion states of the body of the vehicle 18. Thereafter, as the road ahead becomes the road currently being encountered, inputs 32 (e.g., direct motion inputs 32b and/or inputs 32 from forward-looking sensors 16a) characterizing current motion states may be used to correct those earlier predictions. In certain embodiments, this correction of predictions and the resulting production of motion information 42 may be accomplished through the application of a Kalman filter.
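A minimal, one-dimensional sketch of the Kalman-filter correction described above follows: a vehicle-motion model predicts the body pitch for an upcoming instant, and the prediction is corrected by a direct measurement once the road ahead is actually encountered. All variances, values, and names here are illustrative assumptions, not figures from the patent:

```python
# Illustrative scalar Kalman update: blend a model prediction with a
# sensor measurement, weighting each by its variance. A real filter
# would track a multi-dimensional state (pitch, roll, heave, ...).

def kalman_update(predicted, pred_var, measured, meas_var):
    gain = pred_var / (pred_var + meas_var)        # Kalman gain
    estimate = predicted + gain * (measured - predicted)
    variance = (1.0 - gain) * pred_var             # reduced uncertainty
    return estimate, variance

# Model predicts 0.08 rad of pitch over an upcoming bump; the inertial
# measurement later reads 0.06 rad. With equal confidence in both,
# the corrected estimate splits the difference.
est, var = kalman_update(0.08, 1e-4, 0.06, 1e-4)
```

Note how the corrected variance is smaller than either input variance alone, which is the benefit of fusing the prediction with the measurement.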
[0049] The parameters of a virtual, vehicle-motion model 70 may be determined or specified in any suitable manner. In selected embodiments, certain parameters of a vehicle-motion model 70 may be derived from previous knowledge of the mechanical properties (e.g., geometries, inertia, stiffness, damping coefficients, etc.) of the vehicle 18. Alternatively, or in addition thereto, self-learning algorithms may be used. Such algorithms may adapt certain parameters of a virtual, vehicle-motion model 70 to reflect real-world conditions and/or changes due to loading, ageing, temperature effects, or the like or combinations or sub-combinations thereof. Accordingly, the parameters may be selected and/or optimized to closely match predicted and measured motion.
[0050] Referring to Figure 8, a system 10 may support, enable, or execute an alternative process 46a in accordance with the present invention. In selected embodiments, such a process 46a may include a decision 72 to iterate after the corrected inputs 44 have been output 56.
[0051] If an iteration (or an additional iteration) is to be performed, the corrected inputs 44 or data (e.g., data 68) derived therefrom may be fed into a motion-estimation module 38 so that the accuracy of the motion information 42 at a particular moment in time may be improved. If no iteration (or no additional iteration) is to be performed, a system 10 may move on and process inputs 32 corresponding to a later moment in time. The corrected inputs 44 produced with each round may be employed 58 as desired or necessary.
[0052] Referring to Figure 9, a system 10 may support, enable, or execute another alternative process 46b in accordance with the present invention. In selected embodiments, such a process 46b may include using corrected inputs 44 to measure 74 or otherwise characterize a road ahead of the corresponding vehicle 18. This understanding of the road ahead may enable an on-board computer system 12 (e.g., a motion-estimation module 38) to predict 76 future motion of the vehicle 18 (e.g., motion of the vehicle 18 as it drives over that portion of road). The prediction of future motion may be stored 78 as necessary until it is needed or used. In this manner, a system 10 in accordance with the present invention may provide rapid, real-time correction of inputs 32. Accordingly, the corrected inputs 44 produced thereby may be suitable for use in various driver assistance, vehicle dynamics, and/or autonomous driving activities where a short processing and reaction time is important or even vital.
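The store-until-needed step 78 may be illustrated with a small container that files predicted motion states under their expected encounter times and returns them when signals from that instant need correcting. The class, tolerance window, and example values are hypothetical:

```python
# Illustrative sketch of storing predictions 78: predicted motion states
# are keyed by expected encounter time and retrieved when inputs from
# that instant are being corrected. Names and values are hypothetical.
import bisect

class PredictionStore:
    def __init__(self):
        self._times, self._preds = [], []

    def store(self, encounter_time, predicted_motion):
        i = bisect.bisect(self._times, encounter_time)  # keep times sorted
        self._times.insert(i, encounter_time)
        self._preds.insert(i, predicted_motion)

    def retrieve(self, now, tolerance=0.05):
        """Return the stored prediction closest to `now` within the
        tolerance window, or None if no prediction applies."""
        i = bisect.bisect(self._times, now)
        best = None
        for j in (i - 1, i):
            if 0 <= j < len(self._times) and abs(self._times[j] - now) <= tolerance:
                if best is None or abs(self._times[j] - now) < abs(self._times[best] - now):
                    best = j
        return None if best is None else self._preds[best]

# A pothole seen 30 m ahead at 15 m/s will be met 2 s from now; the
# predicted pitch disturbance is stored for that instant.
store = PredictionStore()
store.store(encounter_time=30.0 / 15.0, predicted_motion={"pitch": 0.08})
```

Retrieval is by nearness in time rather than exact match, since the correction loop runs at its own interval and will rarely hit the predicted instant exactly.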
[0053] The flowcharts in Figures 4, 8, and 9 illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to certain embodiments of the present invention. In this regard, each block in the flowcharts may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It will also be noted that each block of the flowchart illustrations, and combinations of blocks in the flowchart illustrations, may be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
[0054] It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the Figures. In certain embodiments, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. Alternatively, certain steps or functions may be omitted if not needed.
Claims (18)
1. A method comprising: receiving, by a computer system, a plurality of inputs corresponding to a plurality of sensors; estimating, by the computer system based on one or more inputs of the plurality of inputs, motion of a vehicle carrying the computer system and plurality of sensors; and correcting, by the computer system, data corresponding to one or more sensors of the plurality of sensors by accounting for the motion of the vehicle.
2. The method of claim 1, wherein at least one of the one or more sensors comprises a forward-looking sensor characterizing a driving environment ahead of the vehicle.
3. The method of claim 2, further comprising producing, by the computer system, information characterizing the driving environment ahead of the vehicle.
4. The method of claim 3, further comprising estimating, by the computer system based at least in part on the information, the motion of the vehicle at a future time when the driving environment ahead of the vehicle becomes a driving environment under the vehicle.
5. The method of claim 3 or 4, wherein each sensor of the one or more sensors is selected from the group consisting of an ultrasonic transducer, a laser scanner, a lidar scanner, and a camera.
6. The method of any preceding claim, wherein the estimating comprises correcting a predicted motion of the vehicle with the one or more inputs.
7. The method of claim 6, wherein correcting the predicted motion comprises applying a Kalman filter.
8. The method of claim 7, wherein the one or more inputs characterize the attitude of the body of the vehicle in at least one of pitch, roll, and yaw.
9. The method of claim 7 or 8, further comprising producing, by the computer system, the predicted motion.
10. The method of claim 9, wherein at least one of the one or more sensors comprises a forward-looking sensor directed to an area in front of the vehicle.
11. The method of claim 10, wherein the producing comprises: receiving, by the computer system before the receiving of the plurality of inputs, one or more previous inputs corresponding to the one or more sensors; using, by the computer system, the one or more previous inputs to profile a portion of road ahead of the vehicle at the time of the receiving of the one or more previous inputs; and using, by the computer system, a virtual, vehicle-motion model and the profile to generate the predicted motion.
12. The method of claim 11, wherein receiving the plurality of inputs, estimating the motion of the vehicle, and correcting the data corresponding to the one or more sensors occur as the vehicle is driving over the portion of road.
13. A vehicle comprising: a plurality of on-board sensors; an on-board computer system; and an on-board data acquisition system converting signals from the plurality of on-board sensors to a plurality of inputs processable by the on-board computer system; the on-board computer system comprising memory and at least one processor operably connected to the memory, the memory storing a motion-estimation module programmed to estimate, based at least in part on one or more inputs of the plurality of inputs, motion of the vehicle, and a motion-compensation module programmed to correct data corresponding to one or more sensors of the plurality of on-board sensors by accounting for the motion of the vehicle.
14. The vehicle of claim 13, wherein at least one of the one or more sensors comprises an on-board, forward-looking sensor characterizing a driving environment ahead of the vehicle.
15. The vehicle of claim 14, wherein the memory further stores a sensor-evaluation module programmed to output information characterizing the driving environment ahead of the vehicle.
16. The vehicle of claim 15, wherein the motion-estimation module is further programmed to estimate, based at least in part on the information, the motion of the vehicle at a future time when the driving environment ahead of the vehicle becomes a driving environment under the vehicle.
17. The vehicle of claim 16, wherein the motion-estimation module is further programmed to apply a Kalman filter to estimate, based at least in part on the information, the motion of the vehicle at the future time when the driving environment ahead of the vehicle becomes the driving environment under the vehicle.
18. The vehicle of claim 17, wherein each sensor of the one or more sensors is selected from the group consisting of an ultrasonic transducer, a laser scanner, a lidar scanner, and a camera.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/842,621 US10235817B2 (en) | 2015-09-01 | 2015-09-01 | Motion compensation for on-board vehicle sensors |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201614762D0 GB201614762D0 (en) | 2016-10-12 |
GB2544153A true GB2544153A (en) | 2017-05-10 |
Family
ID=57119836
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1614762.1A Withdrawn GB2544153A (en) | 2015-09-01 | 2016-08-31 | Motion compensation for on-board vehicle sensors |
Country Status (6)
Country | Link |
---|---|
US (1) | US10235817B2 (en) |
CN (1) | CN106476728B (en) |
DE (1) | DE102016215143A1 (en) |
GB (1) | GB2544153A (en) |
MX (1) | MX2016011013A (en) |
RU (1) | RU2016134734A (en) |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2545713B (en) * | 2015-12-23 | 2019-01-09 | Jaguar Land Rover Ltd | Improvements to vehicle handling |
US20180087907A1 (en) * | 2016-09-29 | 2018-03-29 | The Charles Stark Draper Laboratory, Inc. | Autonomous vehicle: vehicle localization |
DE102016224753A1 (en) * | 2016-12-12 | 2018-06-14 | Deere & Company | Device for influencing the vertical dynamics of an agricultural vehicle |
CN109139893B (en) * | 2017-06-27 | 2020-06-16 | 广西大学 | AGV forklift bumpy road surface identification method |
CN107176098B (en) * | 2017-07-10 | 2023-07-07 | 辽宁工业大学 | Automatic monitoring and early warning device for inner wheel difference blind area and control method |
US10841563B1 (en) * | 2017-10-15 | 2020-11-17 | Banpil Photonics, Inc. | Smart sensor and its system for autonomous system |
FR3072633B1 (en) * | 2017-10-24 | 2019-11-01 | Valeo Schalter Und Sensoren Gmbh | ASSISTING THE DRIVING OF A MOTOR VEHICLE ON THE APPROACH OF A SPEED RETARDER |
JP7077044B2 (en) * | 2018-02-13 | 2022-05-30 | 株式会社トプコン | Data processing equipment, data processing methods and data processing programs |
CN111376850A (en) * | 2018-12-29 | 2020-07-07 | 罗伯特·博世有限公司 | Crosswind detection method and crosswind detection system |
EP3699630A1 (en) * | 2019-02-25 | 2020-08-26 | KNORR-BREMSE Systeme für Nutzfahrzeuge GmbH | System and method for compensating a motion of a vehicle component |
CN110022363B (en) * | 2019-04-03 | 2021-10-29 | 腾讯科技(深圳)有限公司 | Method, device and equipment for correcting motion state of virtual object and storage medium |
CN112477813B (en) * | 2019-09-12 | 2022-06-28 | 长沙智能驾驶研究院有限公司 | Cleaning control method and device of sensing equipment, vehicle and cleaning control system of vehicle |
US20210213955A1 (en) * | 2020-01-15 | 2021-07-15 | GM Global Technology Operations LLC | Method and apparatus for evaluating a vehicle travel surface |
US11673567B2 (en) | 2020-04-14 | 2023-06-13 | Plusai, Inc. | Integrated fiducial marker for simultaneously calibrating sensors of different types |
US11366233B2 (en) | 2020-04-14 | 2022-06-21 | Plusai, Inc. | System and method for GPS based automatic initiation of sensor calibration |
US11635313B2 (en) * | 2020-04-14 | 2023-04-25 | Plusai, Inc. | System and method for simultaneously multiple sensor calibration and transformation matrix computation |
DE102021104781A1 (en) | 2021-03-01 | 2022-09-01 | Schaeffler Technologies AG & Co. KG | Method for optimizing the operation of an automated guided vehicle and automated guided vehicle system |
WO2023075206A1 (en) * | 2021-10-28 | 2023-05-04 | 캐롯손해보험 주식회사 | Method and apparatus for detecting driving behavior without yaw calculation |
DE102023102520A1 (en) * | 2023-02-02 | 2024-08-08 | Daimler Truck AG | Vehicle and method for vibration calibration and vibration compensation of a vehicle sensor system |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070213904A1 (en) * | 2006-03-08 | 2007-09-13 | Yamaha Hatsudoki Kabushiki Kaisha | Acceleration estimation device and vehicle |
JP2010143379A (en) * | 2008-12-18 | 2010-07-01 | Toyota Central R&D Labs Inc | Vehicle attitude angle estimation device and program |
US20120067122A1 (en) * | 2010-09-22 | 2012-03-22 | Takuya Sakamoto | Bank angle detecting device for vehicle |
US20120089297A1 (en) * | 2009-06-03 | 2012-04-12 | Toyota Jidosha Kabushiki Kaisha | Sensor offset amount estimate device |
US20120281881A1 (en) * | 2009-11-25 | 2012-11-08 | Conti Temic Microelectronic Gmbh | Method for Estimating the Roll Angle in a Travelling Vehicle |
Family Cites Families (47)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5964822A (en) * | 1997-08-27 | 1999-10-12 | Delco Electronics Corp. | Automatic sensor azimuth alignment |
DE19738608C1 (en) * | 1997-09-04 | 1998-07-16 | Bosch Gmbh Robert | Running gear regulation arrangement for vehicle ride regulation |
US6161071A (en) * | 1999-03-12 | 2000-12-12 | Navigation Technologies Corporation | Method and system for an in-vehicle computing architecture |
US6898501B2 (en) * | 1999-07-15 | 2005-05-24 | Cnh America Llc | Apparatus for facilitating reduction of vibration in a work vehicle having an active CAB suspension system |
US6233510B1 (en) * | 1999-10-15 | 2001-05-15 | Meritor Heavy Vehicle Technology, Llc | Method and system for predicting road profile |
US6763292B1 (en) * | 2000-06-21 | 2004-07-13 | International Business Machines Corporation | Prediction and compensation for land vehicle dynamics based on feedforward road conditions |
KR100803414B1 (en) * | 2000-08-16 | 2008-02-13 | 레이던 컴퍼니 | Near object detection system |
US6590507B2 (en) * | 2001-03-05 | 2003-07-08 | Hrl Laboratories, Llc | Method and system for providing personalized traffic alerts |
US7130743B2 (en) * | 2001-08-06 | 2006-10-31 | Matsushita Electric Industrial Co., Ltd. | Information providing method and information providing device |
GB0203688D0 (en) * | 2002-02-16 | 2002-11-13 | Bae Systems Combat And Radar S | Ship motion predictor |
US6782315B2 (en) * | 2002-06-19 | 2004-08-24 | Ford Global Technologies, Llc | Method and apparatus for compensating misalignments of a sensor system used in a vehicle dynamic control system |
AU2003275550A1 (en) * | 2002-10-10 | 2004-05-04 | Matsushita Electric Industrial Co., Ltd. | Information acquisition method, information providing method, and information acquisition device |
SE526913C2 (en) * | 2003-01-02 | 2005-11-15 | Arnex Navigation Systems Ab | Procedure in the form of intelligent functions for vehicles and automatic loading machines regarding mapping of terrain and material volumes, obstacle detection and control of vehicles and work tools |
EP1745953A1 (en) * | 2003-02-26 | 2007-01-24 | Ford Global Technologies, LLC | A vehicle dynamic control system and method |
US7068815B2 (en) * | 2003-06-13 | 2006-06-27 | Sarnoff Corporation | Method and apparatus for ground detection and removal in vision systems |
US7831354B2 (en) * | 2004-03-23 | 2010-11-09 | Continental Teves, Inc. | Body state estimation of a vehicle |
US7706978B2 (en) * | 2005-09-02 | 2010-04-27 | Delphi Technologies, Inc. | Method for estimating unknown parameters for a vehicle object detection system |
CN1775601A (en) * | 2005-11-18 | 2006-05-24 | 吉林大学 | Vehicle driving trace predicating and lane deviation evaluating method |
JP4229141B2 (en) | 2006-06-19 | 2009-02-25 | トヨタ自動車株式会社 | Vehicle state quantity estimation device and vehicle steering control device using the device |
DE102006039353A1 (en) * | 2006-08-22 | 2008-03-06 | Daimler Ag | Device and method for influencing the spring force characteristic of an active chassis of a motor vehicle |
US7692552B2 (en) | 2007-01-23 | 2010-04-06 | International Business Machines Corporation | Method and system for improving driver safety and situational awareness |
JP5173854B2 (en) * | 2008-04-21 | 2013-04-03 | 株式会社豊田中央研究所 | Sensor drift estimation device |
US8165742B2 (en) * | 2008-11-14 | 2012-04-24 | Robert Bosch Gmbh | System and method for compensating sensor signals |
DE102010013178A1 (en) * | 2010-03-27 | 2010-12-30 | Daimler Ag | Motor vehicle driving dynamics control method, involves determining dynamic disturbance variable acting on transverse dynamics based on sensor data, and controlling transverse dynamics based on disturbance variable |
US8743219B1 (en) * | 2010-07-13 | 2014-06-03 | Marvell International Ltd. | Image rotation correction and restoration using gyroscope and accelerometer |
IT1403430B1 (en) * | 2010-12-24 | 2013-10-17 | Magneti Marelli Spa | CALIBRATION PROCEDURE OF AN INERTIAL SENSOR ASSEMBLED IN AN ARBITRARY POSITION ON THE VEHICLE OF A VEHICLE, AND A SENSOR SYSTEM OF THE DYNAMICS OF A VEHICLE MOUNTED ON BOARD IN AN ARBITRARY POSITION |
US8655588B2 (en) * | 2011-05-26 | 2014-02-18 | Crown Equipment Limited | Method and apparatus for providing accurate localization for an industrial vehicle |
US11039109B2 (en) * | 2011-08-05 | 2021-06-15 | Fox Sports Productions, Llc | System and method for adjusting an image for a vehicle mounted camera |
US20130046505A1 (en) * | 2011-08-15 | 2013-02-21 | Qualcomm Incorporated | Methods and apparatuses for use in classifying a motion state of a mobile device |
US9533539B2 (en) * | 2011-10-20 | 2017-01-03 | GM Global Technology Operations LLC | Vehicle suspension system and method of using the same |
PL2786347T3 (en) * | 2011-11-28 | 2017-02-28 | Trailertrack Aps | A system for controlling the adjustment of a side rearview device |
US8849555B2 (en) * | 2012-02-29 | 2014-09-30 | Inrix, Inc. | Fuel consumption calculations and warnings |
TWI460668B (en) * | 2012-07-30 | 2014-11-11 | Faraday Tech Corp | Image capture system and image capture method |
DE102012017118A1 (en) * | 2012-08-29 | 2014-05-15 | GM Global Technology Operations, LLC (n.d. Ges. d. Staates Delaware) | Method for optimizing performance of motor vehicle while driving, involves predicting vehicle movement behavior when driving over driving surface by a model, and setting chassis of motor vehicle based on predicted movement behavior |
US9423498B1 (en) * | 2012-09-25 | 2016-08-23 | Google Inc. | Use of motion data in the processing of automotive radar image processing |
GB2509102B (en) * | 2012-12-20 | 2015-09-02 | Thales Holdings Uk Plc | Image processor for processing images received from a plurality of image sensors |
DE112013006388B4 (en) * | 2013-01-10 | 2021-04-29 | Toyota Jidosha Kabushiki Kaisha | Vehicle control device |
DE102013201379B4 (en) * | 2013-01-29 | 2020-12-10 | Robert Bosch Gmbh | Motorcycle with a camera system |
DE102013101639A1 (en) * | 2013-02-19 | 2014-09-04 | Continental Teves Ag & Co. Ohg | Method and device for determining a road condition |
EP3626485B1 (en) | 2013-03-15 | 2024-05-29 | ClearMotion, Inc. | Active vehicle suspension improvements |
DE102013208735A1 (en) * | 2013-05-13 | 2014-11-13 | Robert Bosch Gmbh | Method and device for determining and compensating for a misalignment angle of a radar sensor of a vehicle |
DE102013223014A1 (en) * | 2013-11-12 | 2015-05-13 | Robert Bosch Gmbh | Driver assistance system for a motor vehicle for the anticipatory detection of a road surface |
US10037596B2 (en) * | 2014-11-11 | 2018-07-31 | Raymond Miller Karam | In-vehicle optical image stabilization (OIS) |
US20160137209A1 (en) * | 2014-11-18 | 2016-05-19 | GM Global Technology Operations LLC | Motion-based multi-sensor calibration |
US9886801B2 (en) * | 2015-02-04 | 2018-02-06 | GM Global Technology Operations LLC | Vehicle sensor compensation |
CN104723999A (en) * | 2015-03-18 | 2015-06-24 | 邢恩东 | Vehicle auxiliary safe driving instrument and corresponding early warning method and detection method |
US9794483B1 (en) * | 2016-08-22 | 2017-10-17 | Raytheon Company | Video geolocation |
- 2015-09-01: US application US14/842,621, published as US10235817B2 (active)
- 2016-08-15: DE application DE102016215143.6A, published as DE102016215143A1 (pending)
- 2016-08-24: MX application MX2016011013A, published as MX2016011013A (status unknown)
- 2016-08-25: RU application RU2016134734A, published as RU2016134734A (application discontinued)
- 2016-08-29: CN application CN201610754338.5A, published as CN106476728B (active)
- 2016-08-31: GB application GB1614762.1A, published as GB2544153A (withdrawn)
Also Published As
Publication number | Publication date |
---|---|
CN106476728B (en) | 2021-10-22 |
GB201614762D0 (en) | 2016-10-12 |
US10235817B2 (en) | 2019-03-19 |
MX2016011013A (en) | 2017-02-28 |
DE102016215143A1 (en) | 2017-03-02 |
CN106476728A (en) | 2017-03-08 |
RU2016134734A (en) | 2018-03-01 |
US20170061710A1 (en) | 2017-03-02 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |