GB2576206A - Sensor degradation - Google Patents

Sensor degradation

Info

Publication number
GB2576206A
GB2576206A (application GB1813018.7A)
Authority
GB
United Kingdom
Prior art keywords
sensor
degradation
control system
output
estimated
Prior art date
Legal status
Granted
Application number
GB1813018.7A
Other versions
GB201813018D0 (en)
GB2576206B (en)
Inventor
Hidalgo Emilia
Hussein Adwan Adam
Current Assignee
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB1813018.7A
Publication of GB201813018D0
Publication of GB2576206A
Application granted
Publication of GB2576206B
Legal status: Active


Classifications

    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B60W50/02 Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
    • B60W50/0205 Diagnosing or detecting failures; Failure detection models
    • B60W50/0098 Details of control systems ensuring comfort, safety or stability not otherwise provided for
    • B60W50/04 Monitoring the functioning of the control system
    • B60W2050/0215 Sensor drifts or sensor failures
    • B60W2400/00 Indexing codes relating to detected, measured or calculated conditions or factors
    • B60W2555/20 Ambient conditions, e.g. wind or rain
    • B60W2556/20 Data confidence level
    • G01S13/52 Discriminating between fixed and moving objects or between objects moving at different speeds (radar)
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/862 Combination of radar systems with sonar systems
    • G01S13/865 Combination of radar systems with lidar systems
    • G01S13/867 Combination of radar systems with cameras
    • G01S13/87 Combinations of radar systems, e.g. primary radar and secondary radar
    • G01S13/931 Radar or analogous systems for anti-collision purposes of land vehicles
    • G01S15/52 Discriminating between fixed and moving objects or between objects moving at different speeds (sonar)
    • G01S15/87 Combinations of sonar systems
    • G01S15/931 Sonar systems for anti-collision purposes of land vehicles
    • G01S17/50 Systems of measurement based on relative movement of target (lidar)
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar
    • G01S17/931 Lidar systems for anti-collision purposes of land vehicles
    • G01S7/40 Means for monitoring or calibrating (radar)
    • G01S7/4039 Monitoring of sensor or antenna obstruction, e.g. dirt- or ice-coating (radar)
    • G01S7/497 Means for monitoring or calibrating (lidar)
    • G01S2007/4975 Monitoring of sensor obstruction by, e.g. dirt- or ice-coating (lidar)
    • G01S7/52004 Means for monitoring or calibrating (sonar)
    • G01S2007/52009 Monitoring of sensor obstruction, e.g. dirt- or ice-coating (sonar)
    • G01S2013/9314 Parking operations
    • G01S2013/9315 Monitoring blind spots
    • G01S2013/9323 Alternative operation using light waves
    • G01S2013/93271 Sensor installation details in the front of the vehicle
    • G01S2013/93272 Sensor installation details in the back of the vehicle
    • G01S2013/93274 Sensor installation details on the side of the vehicle
    • G01S2015/937 Sensor installation details (sonar, land vehicles)
    • G05D1/0088 Autonomous decision making process, e.g. artificial intelligence, predefined behaviours

Abstract

A control system, for a vehicle, is configured to receive a sensor signal 42 output by a sensor, e.g. camera, lidar, ultrasonic, radar etc., of the vehicle and receive or generate a sensor degradation signal 44 indicative of a degradation in sensing performance of the sensor. The control system also receives or determines a degradation time 46 at which the sensing performance of the sensor degraded and generates an estimated sensor output 48 in dependence on the sensor signal at or before the degradation time. The control system generates a confidence score 50 associated with the estimated sensor output. Sensing performance may be degraded by, for example, a second vehicle causing a splash in front of the vehicle so that a camera outputs a degraded signal. In addition fog or smoke may degrade the signal. Reference is also made to a method, a vehicle and a non-transitory computer readable medium.

Description

SENSOR DEGRADATION
TECHNICAL FIELD
The present disclosure relates to sensor degradation in vehicle control systems. Aspects of the invention relate to a control system, a method, a vehicle and a non-transitory computer readable medium.
BACKGROUND
It is known that sensor outputs can become unreliable, for example in the event that a sensor is blinded. The absence of reliable sensor data can reduce the effectiveness of vehicle control systems. It is an aim of the present invention to address one or more of the disadvantages associated with the prior art.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide a control system, a method, a vehicle and a non-transitory computer readable medium as claimed in the appended claims.
According to an aspect of the present invention there is provided a control system for a host vehicle, the control system comprising one or more controllers, the control system configured to: receive a sensor signal output by a sensor of the host vehicle; receive or generate a sensor degradation signal indicative of a degradation in sensing performance of the sensor; receive or determine a degradation time at which the sensing performance of the sensor degraded; generate an estimated sensor output in dependence on the sensor signal at or before the degradation time; and generate a confidence score associated with the estimated sensor output. By providing a confidence score associated with sensor data, it is possible to make use of sensor data, even if that data is deemed unreliable. This can be particularly advantageous in multi-sensor fusion algorithms in which many sensor inputs may be provided having different confidence levels.
According to a further aspect of the present invention, there is provided a control system for a host vehicle, the control system comprising one or more controllers, wherein the one or more controllers collectively comprise: at least one electronic processor having an electrical input for receiving signals; and at least one electronic memory device coupled to the at least one electronic processor and having instructions stored therein; and wherein the at least one electronic processor is configured to access the at least one memory device and execute the instructions thereon so as to cause the host vehicle to: receive a sensor signal output by a sensor of the host vehicle; receive or generate a sensor degradation signal indicative of a degradation in sensing performance of the sensor; receive or determine a degradation time at which the sensing performance of the sensor degraded; generate an estimated sensor output in dependence on the sensor signal at or before the degradation time; and generate a confidence score associated with the estimated sensor output.
According to another aspect of the invention, there is provided a method comprising: receiving a sensor signal output by a sensor of a host vehicle; receiving or generating a sensor degradation signal indicative of a degradation in sensing performance of the sensor; receiving or determining a degradation time at which the sensing performance of the sensor degraded; generating an estimated sensor output in dependence on the sensor signal at or before the degradation time; and generating a confidence score associated with the estimated sensor output.
According to yet another aspect of the invention, there is provided a vehicle comprising a control system as set out herein.
According to a further aspect of the invention, there is provided a computer readable medium (such as a non-transitory computer readable medium) comprising computer readable instructions that, when executed by a processor, cause performance of: receiving a sensor signal output by a sensor of a host vehicle; receiving or generating a sensor degradation signal indicative of a degradation in sensing performance of the sensor; receiving or determining a degradation time at which the sensing performance of the sensor degraded; generating an estimated sensor output in dependence on the sensor signal at or before the degradation time; and generating a confidence score associated with the estimated sensor output.
A buffer may be provided having an input for receiving the sensor signal. Thus, the buffer may buffer the sensor signal. The buffer may comprise an output providing the estimated sensor output.
The estimated sensor output may be equal to the sensor signal at or before said degradation time. Alternatively, the estimated sensor output may be an extrapolation of the sensor signal based on the sensor signal at or before said degradation time. Thus, the estimated sensor output may, in some embodiments, be predictive.
The sensor signal may provide an estimate of a current location of an object. For example, the estimated sensor output may be the location of the object at a time before said degradation time. Alternatively, the estimated sensor output may be an extrapolation of the location of the object based on the location of the object at or before said degradation time.
In an embodiment, the confidence score may be dependent on whether, and/or the extent to which, the object can move. Alternatively, or in addition, the confidence score may be dependent on a degree of the degradation in sensing performance.
The sensor degradation signal may be indicative of at least one of a partial and a total reduction in sensing performance.
The host vehicle may be operable in at least one of an autonomous and a non-autonomous mode.
The sensor may be an embedded sensor local to the host vehicle and/or a sensor external to the host vehicle. Alternatively, or in addition, the sensor may be one or more of: a lidar sensor, a radar sensor, an imaging sensor, an ultrasonic sensor, an electromagnetic sensor, a bolometer, an infrared sensor and a temperature sensor.
Optionally, the control system may be configured to output the estimated sensor output and the confidence score. The said output may be output to a sensor fusion algorithm. The sensor fusion algorithm may associate a weight to sensor data dependent on the confidence score.
Optionally, the control system may be configured to use the estimated sensor output and the confidence score.
The control system may be configured to receive one or more additional sensor signals from one or more additional sensors. Further, the control system may be configured to: receive or generate one or more additional sensor degradation signals indicative of a degradation in sensing performance of one or more of the additional sensors; receive or determine one or more additional degradation times at which the sensing performance of said additional sensors degraded; generate one or more additional estimated sensor outputs in dependence on the one or more additional sensor signals at or before the one or more additional degradation times; and generate one or more additional confidence scores associated with the one or more additional estimated sensor outputs.
Any controller or controllers described herein may suitably comprise a control unit or computational device having one or more electronic processors. Thus, the system may comprise a single control unit or electronic controller or alternatively different functions of the controller may be embodied in, or hosted in, different control units or controllers. As used herein the term “controller” or “control unit” will be understood to include both a single control unit or controller and a plurality of control units or controllers collectively operating to provide any stated control functionality. To configure a controller, a suitable set of instructions may be provided which, when executed, cause said control unit or computational device to implement the control techniques specified herein. The set of instructions may suitably be embedded in said one or more electronic processors. Alternatively, the set of instructions may be provided as software saved on one or more memories associated with said controller to be executed on said computational device. A first controller may be implemented in software run on one or more processors. One or more other controllers may be implemented in software run on one or more processors, optionally the same one or more processors as the first controller. Other suitable arrangements may also be used.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying schematic drawings, in which:
Figure 1 shows a schematic representation of an example control system;
Figure 2 shows a schematic representation of a vehicle including a plurality of sensors in accordance with an example embodiment;
Figures 3 to 5 show schematic representations of an example scenario in which principles of the invention are used;
Figure 6 shows a flow chart illustrating an algorithm in accordance with an embodiment of the invention;
Figure 7 shows a schematic block diagram of a control system in accordance with an embodiment of the invention;
Figure 8 shows a schematic block diagram of a control system in accordance with an embodiment of the invention;
Figure 9 shows a neural network that may be used in an example embodiment of the invention; and
Figure 10 shows a vehicle in accordance with an embodiment of the invention.
DETAILED DESCRIPTION
Figure 1 shows a schematic representation, indicated generally by the reference numeral 1, of an example control system. Control system 1 comprises a sensor fusion module 2. Sensor fusion module 2 receives information from a plurality of sensors. The received information comprises sensor signal outputs from the plurality of sensors (inputs from a first sensor, a second sensor, and a third sensor are shown in Figure 1 by way of example only). The sensor fusion module 2 uses the received information for generating one or more outputs. Examples of sensor fusion module usage include determining emergency braking or collision avoidance requirements on the basis of object detection, steering requirements based on driving lane positions, torque control signals in a cruise control mode and localisation by cross-checking and/or fusing sensor data and other data sources (such as HD maps, global positioning system (GPS), inertial measurement unit (IMU) and odometer data). The control system 1 comprises one or more controllers, each having an electronic processor having an electrical input and an electronic memory device electrically coupled to the electronic processor. The electronic memory device has instructions stored therein. The electronic processor is configured to access the memory device and execute the instructions thereon so as to utilise the received information to generate an estimated sensor output and a confidence score as discussed below. The electrical input is for receiving the sensor signal outputs. The electronic processor includes an electrical output for outputting the estimated sensor output and the confidence score. The electrical input and the electrical output may be combined, such as by being formed by an I/O unit or interface unit. For example, the one or more controllers may comprise an interface to a network forming a communication bus of the host vehicle. The communication bus may be an Internet Protocol (IP) based bus such as Ethernet, although embodiments of the invention are not limited in this respect.
Figure 2 shows a schematic representation, indicated generally by the reference numeral 10, of a vehicle 22, including a plurality of sensors in accordance with an example embodiment. Vehicle 22 may comprise a plurality of ultrasonic or electromagnetic sensors (e.g. parking distance control sensors (PDCs)) 12 at the front of the vehicle 22, a plurality of ultrasonic or electromagnetic sensors (e.g. PDCs) 13 at the rear of the vehicle 22, a plurality of imaging sensors including, in this example, a forward facing camera 14, a backward facing camera 15 and a plurality of surrounding cameras 16, a plurality of radar devices 17, and a plurality of lidars 18. The plurality of surrounding cameras 16 may be placed around the vehicle 22 on all sides of the vehicle 22. The plurality of radar devices 17 may include long range radars (illustrated by solid rectangles in the front and back of the vehicle 22), side radars (illustrated by striped circles on the sides of the vehicle 22), front radars, rear radars, and corner radars (illustrated by striped rectangles on the corners of the vehicle 22). The plurality of lidars 18 may include front and rear lidars (illustrated by cross patterned rectangles) and corner lidars (illustrated by dotted rectangles).
It will be apparent to the skilled person that the sensors shown in Figure 2 are examples only. More or fewer of the example sensors may be provided. Moreover, additional sensor types (such as one or more ultrasonic sensors, imaging sensors, bolometers, infrared sensors, and/or temperature sensors) may be provided instead of, or in addition to, some or all of the sensors shown. In example embodiments, vehicle sensors may include one or more embedded sensors local to the vehicle and/or one or more sensors external to the vehicle.
Vehicles, such as cars, are becoming increasingly autonomous. The plurality of sensors shown in Figure 2 may be used for increasing autonomy of the vehicle 22. The vehicle 22 may therefore be an autonomous vehicle (e.g. a driverless autonomous vehicle). In some embodiments, the vehicle 22 may be operable in either an autonomous mode or a non-autonomous mode.
Vehicle autonomy can be described as having a number of different levels. The levels of autonomy may be defined as follows:
• Level 0: driver-only driving.
• Level 1: driver assistance, in which a computer-controlled system may assist with certain tasks, such as acceleration or steering, in specific modes of operation.
• Level 2: partial automation, in which a computer-controlled system controls both steering and acceleration in specific modes of operation (such as automatic parking modes).
• Level 3: high automation, in which a computer-controlled system performs all aspects of driving, with the expectation that a human driver will respond to a request to intervene when required. Thus, the human driver must be ready to intervene at all times.
• Level 4: full automation, in which the computer-controlled system performs all aspects of the driving task in a defined use case (such as highway driving or parking scenarios). The human driver will not be required to intervene during such defined use cases.
• Level 5: autonomous driving, in which the computer-controlled system performs all driving tasks under all conditions. The human driver will not be required to intervene at any time.
As the level of automation increases, the number of sensors is likely to increase. Moreover, the required level of understanding of, and confidence in, the sensor data is also likely to increase.
Figures 3 to 5 show schematic representations, indicated generally by the reference numerals 20a, 20b and 20c respectively, of example scenarios in which principles of the present invention may be used.
In the scenario 20a shown in Figure 3, a first vehicle, such as vehicle 22, is being driven in a first lane of a road and a second vehicle 24 is being driven in a second lane of the road. The vehicle 22 is sometimes referred to herein as the host vehicle.
The vehicle 22 is being controlled in accordance with the principles described herein and may be referred to as the ‘ego’ vehicle. The lane in which the vehicle 22 is travelling is referred to as the ego lane. As shown in Figure 3, an object 26, such as a stationary vehicle or road furniture, is located in the ego lane, in the direction of travel of the vehicle 22. There is a puddle 28 across both lanes of the road between the vehicles and the object 26. When the vehicle 22 is located in the position illustrated in scenario 20a, one or more sensors of the vehicle 22 (such as a forward facing camera) may detect the presence of object 26 in the ego lane.
In the scenario 20b, the vehicle 22 is shown to be approaching the object 26 in the ego lane. The second vehicle 24 is shown as driving over the puddle 28, which causes a splash 30. When there is a splash, such as splash 30, in front of the vehicle 22, one or more sensors of the vehicle 22, such as the forward facing camera 14, may output a degraded signal, as the splash 30 may create a barrier between the forward facing camera and the object 26.
In the scenario 20c, the vehicle 22 has moved closer to the object 26. The vehicle may now be sufficiently close to the object 26 that the vehicle should have slowed down or stopped in order to prevent a possible collision with the object. The impact of the splash 30 on the functionality of at least some of the sensors of the vehicle 22 may make it difficult to provide effective control of the vehicle 22.
The splash 30 described above is provided by way of example only. It will be appreciated that there are many other scenarios in which the functionality of one or more sensors of the vehicle 22 may be degraded. For example, smoke or fog may cause degradation in sensing performance of a camera or a lidar; a metallic bridge, or the like, may cause degradation in sensing performance of a radar device; and a flash of light or sun glare may cause degradation in sensing performance of a camera.
Figure 6 shows a flow chart illustrating an algorithm, indicated generally by the reference numeral 40, in accordance with an embodiment of the invention. The algorithm 40 starts at step 42, where a sensor signal is received. The sensor signal may, for example, be output by a sensor of the vehicle 22 described above with reference to Figure 2.
At step 44, a sensor degradation signal, indicative of a degradation in sensing performance of the sensor, is received or generated. In some embodiments, a signal, such as a “sensor is blind” signal may be received as the sensor degradation signal. In other embodiments, the sensor degradation signal may be generated by considering the sensor signal itself (for example, a rapid change in a sensor signal may be indicative of a degradation in sensing performance). The skilled person will be aware of alternative methods for generating suitable sensor degradation signals.
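As a hedged illustration of the second approach (generating the degradation signal from the sensor signal itself), the sketch below flags degradation when consecutive readings change implausibly fast. The scalar range reading and the plausibility threshold are illustrative assumptions, not details taken from the patent.

```python
# Hypothetical sketch: treat an implausibly rapid change between consecutive
# readings as a hint that sensing performance has degraded.
def detect_degradation(previous_reading: float, current_reading: float,
                       max_plausible_change: float = 5.0) -> bool:
    """Return True if the jump between readings is physically implausible."""
    return abs(current_reading - previous_reading) > max_plausible_change
```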
At step 46, a degradation time is determined. The degradation time indicates a time at which the sensor signal may have become degraded (e.g. the degradation time may be the time at which a sensor degradation signal was received or generated).
At step 48, an estimated sensor output is generated. The estimated sensor output is based on sensor signal(s) received at or before the degradation time identified above. For example, the estimated sensor output may simply be the sensor signal at or before said degradation time. Alternatively, the estimated sensor output may be an extrapolation of the sensor signal based on the sensor signal at or before said degradation time.
At step 50, a confidence score associated with the estimated sensor output is generated. The confidence score may be indicative of how likely the sensor output estimated in step 48 is to be correct. A number of factors may be relevant in the generation of a confidence score, such as the nature of the sensor, the nature of the sensor output (e.g. the rate of change of the sensor output), and the time since the last reliable sensor output. Other factors relevant to the confidence score include the nature and values of other signals (that could, for example, include some or all of GPS, IMU, odometry sensors, wheel sensors, engine control unit etc.) providing information such as localisation, speed and/or acceleration and/or braking jerks of the ego vehicle. The confidence score could, for example, be higher if the vehicle is moving slowly and braking or not accelerating than if the vehicle is moving quickly and/or accelerating.
The sensor output could take many other forms. For example, the sensor output may classify an object as being either a static object or a moving object. An object that has been classified as a static object could lead to a higher confidence score being given at step 50, since it may be easier to extrapolate the position of a static object than a moving object, even if one or more sensors are blind (for example, using HD map inputs, global navigation satellite systems (GNSS), odometry, IMU, wheel sensors etc.). Such other sensor data could also be inputs that are considered when generating the confidence score at step 50, such that the availability of other sensor data is relevant to confidence.
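A minimal sketch of one way the confidence score of step 50 might combine such factors is shown below. The decay rate, the ego-speed threshold and the penalty weights are purely illustrative assumptions and are not values from the patent.

```python
# Hypothetical confidence score: decays with the time the sensor has been
# degraded, and is reduced further when the tracked object can move or the
# ego vehicle is travelling quickly.
def confidence_score(time_since_degradation: float,
                     object_is_static: bool,
                     ego_speed_mps: float,
                     decay_per_second: float = 0.2) -> float:
    score = max(0.0, 1.0 - decay_per_second * time_since_degradation)
    if not object_is_static:
        score *= 0.7      # moving objects are harder to extrapolate reliably
    if ego_speed_mps > 15.0:
        score *= 0.8      # fast ego motion further reduces confidence
    return score
```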
The estimated sensor output (generated in step 48) and the confidence score (generated in step 50) are outputted (at step 52), for example to a sensor fusion module, and the algorithm ends at step 54.
An exemplary use of the algorithm 40 is described in conjunction with Figures 3 to 5.
Assume that the time at scenario 20a is t, the time at scenario 20b is t+1, and the time at scenario 20c is t+2. Assume also that the vehicle 22 provides a signal from a forward facing camera 14 that seeks to identify obstructions, such as the object 26.
At step 42 of the algorithm 40, a sensor signal is received by a control system of the vehicle 22 from the forward facing camera 14 of the vehicle 22. At time t, the degradation signal in step 44 indicates that there is no degradation, and no degradation time is determined in step 46.
A confidence score of 100% may be generated at step 50, and the actual sensor output, together with the 100% confidence score, may be output at step 52. Alternatively, on determining (at step 44) that there is no degradation of the sensor, the algorithm may simply terminate (by moving directly to step 54).
Referring to scenario 20b, at step 42, a sensor signal is received (at time t+1) by the control system of the vehicle from the forward facing camera 14. As a result of the splash 30, the sensor signal is degraded and a degradation signal is received or generated at step 44.
At step 46, the degradation time is determined to be t+1. Next, at step 48, an estimated sensor output is generated, for example based on the output of the forward facing camera 14 at or before the degradation time. The confidence score generated at step 50 is less than 100%, since the sensor output is being estimated. The algorithm 40 then outputs the estimated sensor output and confidence score at step 52.
Referring to scenario 20c, at step 42, a sensor signal is received (at time t+2) by the control system of the vehicle from the forward facing camera 14. As a result of the splash 30, the sensor signal remains degraded and a degradation signal is received or generated at step 44.
At step 46, the degradation time remains t+1. Next, at step 48, an estimated sensor output is generated, for example based on the output of the forward facing camera 14 at or before the degradation time. The confidence score generated at step 50 is less than 100%, since the sensor output is being estimated. Moreover, the confidence score is likely to be lower than the confidence score at the time t+1, since the sensor output has now been unknown for longer. The algorithm 40 then outputs the estimated sensor output and confidence score at step 52.
It will be appreciated that a degradation signal may be received or generated differently based on the type of sensors.
In one example, some sensors may have a blindness detection device, such that the sensors are able to generate a degradation signal using the blindness detection device. For example, in radars, such as radar devices 17, factors for detecting blindness may include frequency and power of a transmitted radar signal, modulation amplitude, or the like. A test mode may be used for comparing the factors of the test measurements with current measurements, and blinding of the sensor may be detected accordingly. A degradation signal may be generated if blindness is detected.
In another example, some sensors, such as lidars, may be able to detect blindness based on measured distance. As lidars use laser light for measuring distances, if the range of a lidar is blocked due to a barrier, the distance measured may be short and constant, which is not the usual nature of measurements. As such, it may be determined that the lidar is fully or partially blocked, and a degradation signal may be generated. A lidar may emit several beams or several layers of beams, such that when one or more of the beams are obstructed by a barrier, such as the splash 30, or when the reflected beams have less energy than expected, it may be determined which beam(s) and which sensors are obstructed. The degradation signal may be generated accordingly.
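The lidar heuristic described above might be sketched as follows; the near-range threshold and the fraction of obstructed beams required are illustrative assumptions only.

```python
# Hypothetical check: treat the lidar as obstructed if most beams return a
# short, nearly constant range, as could happen behind spray such as splash 30.
def lidar_obstructed(ranges_m, near_threshold_m=0.5, fraction_required=0.8):
    """ranges_m: list of per-beam range measurements in metres."""
    if not ranges_m:
        return False
    near = sum(1 for r in ranges_m if r < near_threshold_m)
    return near / len(ranges_m) >= fraction_required
```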
In another example, degradation signals may be received or generated for cameras, such as forward facing camera 14, based on vision algorithms. The algorithms may be based on various techniques, such as fast variation in illumination, contrast, analysis of a background and changes in the background over time, or the like. It will be appreciated that there may be many techniques for determining that a camera’s view is blocked or obstructed fully or partially, and degradation signals may be generated accordingly.
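For example, a very simple vision-based check along these lines might look for contrast collapse or a sudden change in brightness between frames. The thresholds, and the use of plain NumPy greyscale frames, are assumptions made for illustration only.

```python
import numpy as np

# Hypothetical sketch: a frame with very low contrast, or a large jump in mean
# brightness relative to the previous frame, hints at an obstructed camera view.
def camera_degraded(prev_frame: np.ndarray, curr_frame: np.ndarray,
                    contrast_floor: float = 10.0,
                    brightness_jump: float = 60.0) -> bool:
    """Frames are 2-D greyscale arrays (e.g. uint8)."""
    low_contrast = float(curr_frame.std()) < contrast_floor
    sudden_change = abs(float(curr_frame.mean()) - float(prev_frame.mean())) > brightness_jump
    return low_contrast or sudden_change
```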
With regard to the examples above for determining degradation, the degradation signals may be generated by the sensors and received by the control system. Alternatively, or in addition, the degradation signals may not be received from the sensors, and may instead be generated by the control system based on the sensor signals received from the sensors.
In one example, the degradation signal may indicate a partial reduction and/or a total reduction in sensing performance. In another example, a degradation signal may be received from sensors or generated at the control system regardless of whether there may be any degradation in sensing performance of the sensors. The degradation signal may indicate a level of degradation in sensing performance. As such, if there is no degradation in sensing performance, the level of degradation indicated may be low, or zero. If there is degradation in sensing performance, for example, due to any obstruction, the level of degradation indicated may be higher.
Figure 7 shows a schematic block diagram of a control system, indicated generally by the reference numeral 60, in accordance with an embodiment of the invention. The control system 60 comprises a sensor 62 and a control module 64. The control module 64 comprises a buffer 66 and a controller 68. A sensor signal is received by the control module 64. The control system 60 may be used to implement the algorithm 40 described above.
In the example system 60, the sensor signal received in step 42 of the algorithm 40 is the output of the sensor 62. The sensor signal is received at an input of the controller 68 and is also provided as an input to the buffer 66. The buffer 66 may be a rolling buffer or a first-in-first-out (FIFO) buffer, such that the most recent sensor signals are retained by the buffer.
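A minimal sketch of such a rolling buffer is shown below, assuming timestamped samples; the capacity of 100 samples is an arbitrary illustrative choice.

```python
from collections import deque

# Hypothetical rolling (FIFO) buffer: keeps only the most recent samples, so a
# controller can look up the last value recorded at or before the degradation time.
class RollingBuffer:
    def __init__(self, capacity: int = 100):
        self._samples = deque(maxlen=capacity)    # oldest entries drop off automatically

    def push(self, timestamp: float, value) -> None:
        self._samples.append((timestamp, value))

    def latest_before(self, degradation_time: float):
        """Return the most recent (timestamp, value) at or before degradation_time."""
        for timestamp, value in reversed(self._samples):
            if timestamp <= degradation_time:
                return timestamp, value
        return None
```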
At step 44, a sensor degradation signal is received or generated, indicative of a reduction in the performance of the sensor 62. As shown in Figure 7, the sensor degradation signal may be received as an input to the control module 64. In other embodiments, the controller 68 may determine (on the basis of the output of the sensor 62, for example by determining a change in sensor output) that the sensor has become degraded (such that, for example, the degradation signal input may be omitted from the control system 60).
At step 46, a degradation time is determined. The degradation time indicates the time at which the sensor signal may have become degraded.
The received sensor signal and the time of receiving the sensor signal may be stored by buffer 66. Controller 68 may use the sensor signal, information stored in the buffer and/or the degradation signal for generating an estimated sensor output and a confidence score. In one example, the time of receiving the sensor signal may be stored as a degradation time if the degradation signal indicates that the signal from sensor 62 has degraded. Alternatively, the degradation signal may comprise information indicating the degradation time.
At step 48, an estimated sensor output is generated. The buffer 66 may comprise an output providing the estimated sensor output. Alternatively, the estimated sensor output may be generated by the controller 68 (for example, based on an output of the buffer).
The estimated sensor output may be based on sensor signal(s) received at or before the degradation time identified above. For example, the estimated sensor output may be equal to the sensor signal at or before said degradation time. Alternatively, the estimated sensor output may be an extrapolation of the sensor signal based on the sensor signal at or before the degradation time. Such signals may be obtained from the buffer 66 or extrapolated from signals stored in the buffer.
At step 50, a confidence score is generated by the controller 68. The confidence score may be indicative of how likely the sensor output estimated in step 48 is to be correct.
The estimated sensor output and the confidence score are output by the control system 60 (for example to a sensor fusion module, as discussed further below).
In one example embodiment, the buffer 66 has an input for receiving the sensor signal (as shown in Figure 7), and an output for providing the estimated sensor output. As such, the estimated sensor output may be based entirely on the received sensor signal.
Figure 8 shows a schematic block diagram of a control system, indicated generally by the reference numeral 70, in accordance with an embodiment of the invention. Control system 70 comprises a first control module 72, a second control module 74, a third control module 76, and a sensor fusion module 78. The first, second and third control modules 72, 74 and 76 may each be similar to the control module 64. The sensor fusion module 78 may be similar to the sensor fusion module 2 described above, but additionally including confidence inputs.
A first sensor signal and a first degradation signal are received by the first control module 72 from a first sensor. The first control module 72 generates a first estimated sensor output and a first confidence score and provides those outputs to the sensor fusion module 78. A second sensor signal and a second degradation signal are received by the second control module 74 from a second sensor, and the second control module 74 then generates a second estimated sensor output and a second confidence score and provides those outputs to the sensor fusion module 78. A third sensor signal is received by the third control module 76, and the third control module 76 generates a third estimated sensor output and a third confidence score and provides those outputs to the sensor fusion module 78. The sensor fusion module 78 uses at least some of the estimated sensor outputs and confidence scores in the generation of one or more outputs, such as outputs for use in controlling a vehicle.
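Purely by way of illustration, and without limiting the form of the sensor fusion module 78, a confidence-weighted average is one simple way such a module could combine scalar estimated sensor outputs; the function and the numeric values in the usage example are hypothetical:

    def fuse(estimates):
        """Confidence-weighted combination of scalar sensor estimates.

        estimates: iterable of (estimated_value, confidence) pairs with
        confidence in [0, 1]. A weighted average is one simple scheme;
        the embodiments do not prescribe any particular fusion method.
        """
        estimates = list(estimates)
        total = sum(conf for _, conf in estimates)
        if total == 0:
            return None
        return sum(value * conf for value, conf in estimates) / total

    # Hypothetical distances (in metres) to an object, weighted by
    # confidences of 60%, 100% and 80% respectively.
    fused_distance = fuse([(24.6, 0.6), (25.1, 1.0), (24.9, 0.8)])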
As shown in Figure 8, the third control module 76 does not receive a degradation signal. Accordingly, the third control module 76 may generate a third degradation signal based on the third sensor signal and/or information from a buffer of the third control module. Figure 8 shows the first control module 72 and the second control module 74 receiving a degradation signal from the sensors, and the third control module 76 not receiving a degradation signal. However, this is for illustration purposes only; any one or more of the control modules may or may not receive a degradation signal.
The system 70 may be used to provide control signals in the scenarios described above with reference to Figures 3 to 5.
Assume that the first sensor signal of the first control module 72 is received from a forward facing camera, such as forward facing camera 14, the second sensor signal of the second control module 74 is received from a radar device, such as one of radar devices 17, and the third sensor signal of the third control module 76 is received from a lidar, such as one of lidars 18.
At time t, shown in Figure 3, the first, second and third sensor signals of the control system 70 are received without degradation. Accordingly, each sensor output is provided to the sensor fusion module 78 with 100% confidence.
At time t+1, shown in Figure 4, the first control module 72 receives a first degradation signal indicating that the camera output is degraded (caused by the splash 30). The first control module 72 uses the first sensor signal, the degradation signal, and information from the buffer of the first control module 72 to generate a first estimated sensor output and a first confidence score. The information from the buffer may include previous sensor signals, such as a sensor signal received at time t (shown in scenario 20a). The previous sensor signals may indicate the presence of the object 26. As the first estimated sensor output is not entirely based on the first sensor signal, and is based partially on previous sensor signals, the first confidence score is less than 100%. For example, the first confidence score for the first sensor signal may be 60%. At time t+1, the second sensor signal may be unaffected by the splash 30, such that the second sensor output is provided with 100% confidence, and the third sensor signal may be only minimally affected by the splash 30, such that the third sensor output is provided with 80% confidence.
Thus, at time t+1, the sensor fusion module receives a first output having 60% confidence, a second output having 100% confidence and a third output having 80% confidence.
At time t+2, shown in Figure 5, the first control module 72 receives a first degradation signal indicating that the camera output remains degraded. Since the signal has been degraded for longer than at time t+1, the confidence score generated by the first control module 72 is lower. For example, the first confidence score for the first sensor signal at time t+2 may be 40%. At time t+2, the second sensor signal may still be unaffected by the splash 30 (such that the second sensor output is provided with 100% confidence) and the third sensor signal may still be minimally affected by the splash 30 (such that the third sensor output is provided with 70% confidence).
Thus, at time t+2, the sensor fusion module receives a first output having 40% confidence, a second output having 100% confidence and a third output having 70% confidence.
Accordingly, at both times t+1 and t+2, the sensor fusion module 78 can be provided with information regarding the sensor outputs, together with an indication of how reliable that information is. Thus, the sensor fusion module may be able to provide a better response than if the unreliable first and third sensor signals were simply ignored. Moreover, the sensor fusion module is aware that whilst the first and third inputs provide some potentially useful data, these should be treated with caution, particularly the first sensor data.
Thus, in an example use of the system 70, one or more of the first, second and third sensor signals provides an estimate of a current location of an object (such as an obstacle, a lane marking, a footpath, road furniture etc.). Further, in the event of a degradation of sensor performance, one or more of the first, second and third estimated sensor outputs provides a location of the object, or an extrapolation of the location of the object, at or before a degradation time.
In embodiments in which an estimated sensor output relates to a position of an object (such as the object 26 described above), the confidence score may be at least partially based on whether, and/or the extent to which, the object can move. For example, if object 26 is deemed to be a vehicle or another movable object, the confidence scores at time t+1 and t+2 may be lower than if the object 26 is deemed to be a non-moving object (such as permanent road furniture).
The confidence score may be dependent on a degree of the degradation in sensing performance. For example, the confidence score may be high if the degradation signal indicates that the degree of degradation is limited, and the confidence score may be low if the degradation signal indicates that the degree of degradation of the received sensor signal is substantial. The degree of degradation may be based on whether the sensors are obstructed fully or partially, on which sensors are obstructed, and/or on the degradation time.
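As a non-limiting illustration only, the factors described above (time elapsed since the degradation time, degree of degradation, and whether the tracked object can move) might be combined as follows; the formula and tuning constants are assumptions and are not prescribed by the embodiments:

    def confidence_score(elapsed, degree, object_moves):
        """Illustrative combination of the factors discussed above:
        confidence starts at 1.0 when degradation begins and falls faster
        the more severe the degradation and the more readily the tracked
        object can move. Returns a score in [0, 1]."""
        base_decay = 0.2                         # per-time-step decay, hypothetical tuning
        mobility_factor = 2.0 if object_moves else 1.0
        score = 1.0 - base_decay * degree * mobility_factor * elapsed
        return max(0.0, min(1.0, score))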
The system 70 may include additional inputs. By way of example, the sensor fusion module 78 may have an input receiving localisation signals 80 (e.g. GNSS, IMU, wheel sensor signals, HD maps etc.). Alternatively, or in addition, the sensor fusion module 78 may have an input receiving vehicle dynamics signals 82 (e.g. speed, acceleration, decelerations, yaw, roll, pitch etc.).
In the event that a sensor is no longer degraded, the confidence score may rise again. In some embodiments, the confidence score does not immediately return to 100%, but may do so after a period of time. For example, in the example above in relation to the detection of the presence of an object, a sensor (such as a camera) that was previously blinded by a splash may be deemed less than 100% reliable once the sensor is no longer deemed to be blinded. Thus, for example, a sensor that has not been blinded may be deemed more reliable than a sensor that has been blinded in the recent past but is no longer considered to be blinded.
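One hypothetical way of modelling this gradual recovery (the floor value and recovery rate below are assumptions, not features of the embodiments) is:

    def recovered_confidence(time_since_recovery, floor=0.7, recovery_rate=0.1):
        """Sketch of a gradual return to full confidence once the sensor is
        no longer degraded: the score climbs from `floor` back towards 1.0
        rather than returning to 100% immediately."""
        return min(1.0, floor + recovery_rate * time_since_recovery)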
The sensor fusion modules 2 and 78 may take any suitable form and may be implemented using a deterministic algorithm or by machine learning. Alternatively, or in addition, the controller 68 may be implemented using a deterministic algorithm or by machine learning. The use of machine learning may enable degradation/confidence scores to be used even if it is not possible to provide deterministic laws to describe the relevant behaviour.
In the case of sensor fusion modules incorporating one or more machine learning algorithms, the provision of a confidence score can assist in improving the output(s).
Figure 9 shows a neural network, indicated generally by the reference numeral 90, used in an example implementation of a sensor fusion module in accordance with an embodiment of the invention. The neural network 90 is a feedforward neural network including an input layer 92, one or more hidden layers 94 (a plurality of hidden layers is shown in Figure 9) and an output layer 96. The neural network 90 could be used to implement one or more of the modules 2, 68 and 78 described above.
The neural network 90 comprises a plurality of network nodes. The input layer 92 includes a plurality of nodes, each receiving an input. For example, the nodes of the input layer may receive data from a plurality of sensors. Each node of the input layer provides an output to a plurality of nodes of a first hidden layer. After one or more hidden layers (two are shown in Figure 9 by way of example), the outputs of the nodes of the last hidden layer are provided to one or more nodes of an output layer. Each node of the output layer provides an output of the neural network. As is well known in the art, the couplings between the nodes of the neural network 90 include weights that are typically adjustable.
The various nodes of the neural network 90 are trainable by changing the relative weights of each node. As is well known in the art, given a large set of training data, the neural network 90 can be trained to provide suitable outputs given a set of inputs.
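For illustration only, a minimal forward pass of such a feedforward network is sketched below using NumPy; the layer sizes, activation function and the example input (three estimated sensor outputs followed by three confidence scores) are assumptions rather than features of the embodiments:

    import numpy as np

    def feedforward(x, weights, biases):
        """Minimal forward pass for a fully connected network of the kind
        shown in Figure 9: each layer applies a trainable weight matrix,
        a bias and a non-linearity; the output layer is linear."""
        a = x
        for W, b in zip(weights[:-1], biases[:-1]):
            a = np.tanh(W @ a + b)            # hidden layers
        W, b = weights[-1], biases[-1]
        return W @ a + b                      # output layer

    # Hypothetical dimensions: 6 inputs, two hidden layers of 8 nodes,
    # 2 outputs (e.g. control signals). Weights are randomly initialised
    # here; in practice they would be learned from training data.
    rng = np.random.default_rng(0)
    sizes = [6, 8, 8, 2]
    weights = [rng.standard_normal((m, n)) * 0.1 for n, m in zip(sizes[:-1], sizes[1:])]
    biases = [np.zeros(m) for m in sizes[1:]]
    y = feedforward(np.array([25.0, 24.8, 25.1, 0.6, 1.0, 0.8]), weights, biases)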
The neural network 90 is provided by way of example only. Many alternative neural networks or machine-learned algorithms could be used for processing the sensor data to determine the one or more ambient conditions in the environment exterior to the vehicle.
The skilled person will be aware of many suitable implementations for the sensor fusion modules.
A vehicle 100 in accordance with an embodiment of the present invention is described herein with reference to the accompanying Figure 10. Vehicle 100 may be similar to vehicle 22, such that the methods of algorithm 40 may be performed at vehicle 100.
It will be appreciated that various changes and modifications can be made to the present invention without departing from the scope of the present application. Variations and modifications will be apparent to persons skilled in the art. For example, the flow chart of Figure 6 is provided by way of example only and the various steps shown therein may be omitted, reordered and/or combined. In particular, the step 46 may be optional.
Moreover, the present specification should be understood to include any novel features or any novel combination of features either explicitly or implicitly disclosed herein or any generalisation thereof.

Claims (29)

1. A control system for a host vehicle, the control system comprising one or more controllers, the control system configured to:
receive a sensor signal output by a sensor of the host vehicle;
receive or generate a sensor degradation signal indicative of a degradation in sensing performance of the sensor;
receive or determine a degradation time at which the sensing performance of the sensor degraded;
generate an estimated sensor output in dependence on the sensor signal at or before the degradation time; and generate a confidence score associated with the estimated sensor output.
2. A control system according to claim 1, wherein the one or more controllers collectively comprise:
at least one electronic processor having an electrical input for receiving signals; and at least one electronic memory device coupled to the at least one electronic processor and having instructions stored therein;
and wherein the at least one electronic processor is configured to access the at least one memory device and execute the instructions stored therein so as to cause the host vehicle to generate the estimated sensor output and the confidence score.
3. A control system according to claim 1 or claim 2, comprising a buffer having an input for receiving the sensor signal.
4. A control system according to claim 3, wherein the buffer comprises an output providing the estimated sensor output.
5. A control system according to any one of the preceding claims, wherein the estimated sensor output is equal to the sensor signal at or before said degradation time.
6. A control system according to any one of claims 1 to 4, wherein the estimated sensor output is an extrapolation of the sensor signal based on the sensor signal at or before said degradation time.
7. A control system according to any one of the preceding claims, wherein the sensor signal provides an estimate of a current location of an object.
8. A control system according to claim 7, wherein the estimated sensor output is the location of the object at a time before said degradation time.
9. A control system according to claim 7, wherein the estimated sensor output is an extrapolation of the location of the object based on the location of the object at or before said degradation time.
10. A control system according to any one of claims 7 to 9, wherein the confidence score is dependent on whether, and/or the extent to which, the object can move.
11. A control system according to any one of the preceding claims, wherein the confidence score is dependent on a degree of the degradation in sensing performance.
12. A control system according to any one of the preceding claims, wherein the sensor degradation signal is indicative of at least one of a partial and a total reduction in sensing performance.
13. A control system according to any one of the preceding claims, wherein the host vehicle is operable in at least one of an autonomous and a non-autonomous mode.
14. A control system according to any one of the preceding claims, wherein the sensor is an embedded sensor local to the host vehicle and/or a sensor external to the host vehicle.
15. A control system according to any one of the preceding claims, wherein the sensor is one or more of: a lidar sensor, a radar sensor, an imaging sensor, an ultrasonic sensor, an electromagnetic sensor, a bolometer, an infrared sensor and a temperature sensor.
16. A control system according to any one of the preceding claims, configured to output the estimated sensor output and the confidence score.
17. A control system according to any one of the preceding claims, configured to use the estimated sensor output and the confidence score.
18. A control system according to any one of the preceding claims, the control system configured to receive one or more additional sensor signals from one or more additional sensors, the control system configured to:
receive or generate one or more additional sensor degradation signals indicative of a degradation in sensing performance of one or more of the additional sensors;
receive or determine one or more additional degradation times at which the sensing performance of said additional sensors degraded;
generate one or more additional estimated sensor outputs in dependence on the one or more additional sensor signals at or before the one or more additional degradation times; and generate one or more additional confidence scores associated with the one or more additional estimated sensor outputs.
19. A method comprising:
receiving a sensor signal output by a sensor of a host vehicle;
receiving or generating a sensor degradation signal indicative of a degradation in sensing performance of the sensor;
receiving or determining a degradation time at which the sensing performance of the sensor degraded;
generating an estimated sensor output in dependence on the sensor signal at or before the degradation time; and generating a confidence score associated with the estimated sensor output.
20. A method according to claim 19, comprising buffering the sensor signal.
21. A method according to claim 19 or claim 20, wherein the estimated sensor output is equal to the sensor signal at or before said degradation time.
22. A method according to claim 19 or claim 20, comprising generating the estimated sensor output by extrapolating the sensor signal based on the sensor signal at or before said degradation time.
23. A method according to any one of claims 19 to 22, wherein the sensor signal provides an estimate of a current location of an object.
24. A method according to any one of claims 19 to 23, wherein the host vehicle is operable in at least one of an autonomous and a non-autonomous mode.
25. A method according to any one of claims 19 to 24, comprising outputting the estimated sensor output and the confidence score.
26. A method according to claim 25, comprising outputting the estimated sensor output and the confidence score to a sensor fusion algorithm.
27. A method according to any one of claims 19 to 26, comprising:
receiving one or more additional sensor signals from one or more additional sensors;
receiving or generating one or more additional sensor degradation signals indicative of a degradation in sensing performance of one or more of the additional sensors;
receiving or determining one or more additional degradation times at which the sensing performance of said additional sensors degraded;
generating one or more additional estimated sensor outputs in dependence on the one or more additional sensor signals at or before the one or more additional degradation times; and generating one or more additional confidence scores associated with the one or more additional estimated sensor outputs.
28. A vehicle comprising a control system according to any one of claims 1 to 18.
29. A non-transitory computer readable medium comprising computer readable instructions that, when executed by a processor, cause performance of the method of any one of claims 19 to 27.
GB1813018.7A 2018-08-10 2018-08-10 Sensor degradation Active GB2576206B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1813018.7A GB2576206B (en) 2018-08-10 2018-08-10 Sensor degradation

Publications (3)

Publication Number Publication Date
GB201813018D0 GB201813018D0 (en) 2018-09-26
GB2576206A true GB2576206A (en) 2020-02-12
GB2576206B GB2576206B (en) 2021-01-06

Family

ID=63667131

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1813018.7A Active GB2576206B (en) 2018-08-10 2018-08-10 Sensor degradation

Country Status (1)

Country Link
GB (1) GB2576206B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115597585A (en) * 2022-12-16 2023-01-13 青岛通产智能科技股份有限公司(Cn) Robot positioning method, device, equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5394341A (en) * 1993-03-25 1995-02-28 Ford Motor Company Apparatus for detecting the failure of a sensor
US9274525B1 (en) * 2012-09-28 2016-03-01 Google Inc. Detecting sensor degradation by actively controlling an autonomous vehicle
GB2547999A (en) * 2016-01-29 2017-09-06 Ford Global Tech Llc Tracking objects within a dynamic environment for improved localization

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220128682A1 (en) * 2019-07-05 2022-04-28 Denso Corporation Target detection apparatus
EP3901662A1 (en) * 2020-04-22 2021-10-27 Baidu USA LLC Systems and methods to determine risk distribution based on sensor coverages of a sensor system for an autonomous driving vehicle
US11702104B2 (en) 2020-04-22 2023-07-18 Baidu Usa Llc Systems and methods to determine risk distribution based on sensor coverages of a sensor system for an autonomous driving vehicle
EP4258240A4 (en) * 2020-12-04 2024-04-24 Nissan Motor Redundant system
EP4206740A1 (en) * 2021-12-30 2023-07-05 Yandex Self Driving Group Llc Method and a system of determining lidar data degradation degree
WO2024013048A1 (en) * 2022-07-12 2024-01-18 Valeo Schalter Und Sensoren Gmbh Method and driving support system for sequentially processing sequentially provided sets of sensor information

Also Published As

Publication number Publication date
GB201813018D0 (en) 2018-09-26
GB2576206B (en) 2021-01-06

Similar Documents

Publication Publication Date Title
GB2576206A (en) Sensor degradation
CN106240565B (en) Collision mitigation and avoidance
KR102342143B1 (en) Deep learning based self-driving car, deep learning based self-driving control device, and deep learning based self-driving control method
US11498577B2 (en) Behavior prediction device
JP6353525B2 (en) Method for controlling the speed of a host vehicle and system for controlling the speed of a host vehicle
US11377145B2 (en) Vehicle control device and control method for vehicle
RU2703824C1 (en) Vehicle control device
JP2019151185A (en) Driving support device
US20230148202A1 (en) Vehicle control system
CN111216707A (en) Apparatus and method for controlling autonomous driving of vehicle
US11427200B2 (en) Automated driving system and method of autonomously driving a vehicle
JP7013284B2 (en) Mobile behavior predictor
CN112172816A (en) Lane change control apparatus and method for autonomous vehicle
US20210011481A1 (en) Apparatus for controlling behavior of autonomous vehicle and method thereof
US20230271621A1 (en) Driving assistance device, learning device, driving assistance method, medium with driving assistance program, learned model generation method, and medium with learned model generation program
CN114179795A (en) System for predicting collision risk in lane change and method thereof
WO2021089608A1 (en) Adaptive cruise control
JP6854141B2 (en) Vehicle control unit
CN112991817B (en) Adaptive object in-path detection model for automatic or semi-automatic vehicle operation
US20220396287A1 (en) Adaptive trust calibration
US11851088B2 (en) Method for determining capability boundary and associated risk of a safety redundancy autonomous system in real-time
JP6267430B2 (en) Mobile environment map generation control device, mobile body, and mobile environment map generation method
CN113771845A (en) Method, device, vehicle and storage medium for predicting vehicle track
GB2579194A (en) Torque modification request
WO2023032092A1 (en) Vehicle control method and vehicle control device