WO2020106201A1 - Method, Computer Program, Control Unit for Detecting Faults in a Driver-Assistance System and Vehicle - Google Patents

Method, Computer Program, Control Unit for Detecting Faults in a Driver-Assistance System and Vehicle

Info

Publication number
WO2020106201A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
sensor data
driver
data segment
features
Prior art date
Application number
PCT/SE2019/051136
Other languages
French (fr)
Inventor
Paola MAGGINO
Original Assignee
Scania Cv Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Scania CV AB
Publication of WO2020106201A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B23/00 Testing or monitoring of control systems or parts thereof
    • G05B23/02 Electric testing or monitoring
    • G05B23/0205 Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults
    • G05B23/0218 Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults characterised by the fault detection method dealing with either existing or incipient faults
    • G05B23/0243 Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults characterised by the fault detection method dealing with either existing or incipient faults model based detection method, e.g. first-principles knowledge model
    • G05B23/0254 Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults characterised by the fault detection method dealing with either existing or incipient faults model based detection method, e.g. first-principles knowledge model based on a quantitative model, e.g. mathematical relationships between inputs and outputs; functions: observer, Kalman filter, residual calculation, Neural Networks
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/02 Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
    • B60W50/0205 Diagnosing or detecting failures; Failure detection models
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/023 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
    • B60R16/0231 Circuits relating to the driving or the functioning of the vehicle
    • B60R16/0232 Circuits relating to the driving or the functioning of the vehicle for measuring vehicle parameters and indicating critical, abnormal or dangerous conditions
    • B60R16/0234 Circuits relating to the driving or the functioning of the vehicle for measuring vehicle parameters and indicating critical, abnormal or dangerous conditions related to maintenance or repairing of vehicles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095 Predicting travel path or likelihood of collision
    • B60W30/0956 Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/0097 Predicting future conditions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/217 Validation; Performance evaluation; Active pattern learning techniques
    • G06F18/2178 Validation; Performance evaluation; Active pattern learning techniques based on feedback of a supervisor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/254 Fusion techniques of classification results, e.g. of results related to same input data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/64 Three-dimensional objects
    • G06V20/647 Three-dimensional objects by matching two-dimensional images to three-dimensional objects
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/165 Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/02 Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
    • B60W50/0205 Diagnosing or detecting failures; Failure detection models
    • B60W2050/021 Means for detecting failure or malfunction
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403 Image sensing, e.g. optical camera
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/408 Radar; Laser, e.g. lidar
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/54 Audio sensitive means, e.g. ultrasound
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Transportation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Medical Informatics (AREA)
  • Mathematical Physics (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Traffic Control Systems (AREA)

Abstract

Faults are detected in a driver-assistance system of a vehicle by obtaining sensor data (DS) in a processor (112). The sensor data (DS) are divided into data segments (di), wherein each data segment (di) is associated with a particular feature (f1, f2, ..., fn). The processing involves predicting data segment content (d'i) in an error-free operation of the driver-assistance system (120). Each of the data segments (di) is processed in a respective one of a set of artificial neural network models (ANN1, ANN2, ..., ANNn) depending on the feature with which the respective data segment (di) is associated, to obtain a respective predicted data segment content (f1'(di), f2'(dj), ..., fn'(dk)). For each feature (f1, f2, ..., fn), the predicted data segment content (f1'(di), f2'(dj), ..., fn'(dk)) is compared with corresponding data segment content (f1(di), f2(dj), ..., fn(dk)) divided from the obtained sensor data (DS) to derive a respective difference measure (e1, e2, ..., en).

Description

Method, Computer Program, Control Unit for Detecting Faults in a Driver-Assistance System and Vehicle
TECHNICAL FIELD
The invention relates generally to the enhancement of advanced driver-assistance systems. In particular, the present invention concerns a method for detecting faults in a driver-assistance system of a vehicle, a control unit implementing this method and a vehicle containing the control unit. The invention also relates to a computer program product and a non-volatile data carrier.
BACKGROUND
Automatic detection and handling of data concerning obstacles in a driver-assistance system is a highly complex task. Moreover, it is fundamental that a driver-assistance system be robust and reliable. Consequently, any faults in the driver-assistance system, as such, must be detected promptly and accurately. The prior art contains examples of such solutions, some of which are listed below.
US 2008/0161989 describes a system on a moving object for monitoring components or subsystems, which includes sensors for obtaining a value of a measurable characteristic of the component or subsystem and generating a signal indicative or representative of the value, and a processor operatively connected to the sensors for receiving the signal from each sensor and analyzing the value of the measurable characteristic to determine that the component or subsystem has a fault condition, e.g., an actual or potential fault or failure. A communications unit is coupled to the processor and transmits a diagnostic or prognostic message relating to the determination of the fault condition of the component or system to a remote site, upon direction or command by the processor. The processor may be part of a diagnostics module and configured to recognize a predetermined fault condition, using for example pattern recognition technologies.
EP 2 790 165 discloses a method, an apparatus and a computer program product for quality determination in data acquisition. A signal processing apparatus and method for processing at least one detection signal are described. The apparatus contains a receiver configured to receive the at least one detection signal from at least one sensor, a model generator configured to generate a model of signal sequences based on previously received detection signals while taking into consideration a topology of the sensors and/or environmental conditions in the vicinity of the at least one sensor, comparing means configured to compare the at least one detection signal with the model of signal sequences generated by the model generator, and a prediction means configured to predict the quality of the at least one detection signal based on a comparison of the at least one detection signal with the model generated by the model generator.
US 2016/0350194 shows a health management solution, and more particularly a method and system for artificial intelligence based diagnostic and prognostic health management of host systems. In an embodiment, the system includes a memory to store instructions, and a neural network controller coupled to the memory. The neural network controller is configured by the instructions to monitor a plurality of unique patterns generated in real-time. The plurality of system parameters is indicative of a system-level performance of the host system. The neural network controller is configured by the instructions to compare the plurality of unique patterns with a plurality of predetermined patterns corresponding to the plurality of system parameters to detect potential anomalies in the host system and one or more subsystems of the plurality of subsystems, where the one or more subsystems are responsible for contributing to the potential anomalies in the host system.
Thus, different solutions are known for supervising and diagnosing complex technical arrangements, such as driver-assistance systems, inter alia by using artificial intelligence based diagnostic mechanisms. However, there is room for improving these supervision systems, for example in terms of reliability and speed.
SUMMARY
One object of the present invention is therefore to offer an enhanced solution for identifying faults in a driver-assistance system. According to one aspect of the invention, this object is achieved by a method performed in a control unit to detect faults in a driver-assistance system of a vehicle. The method involves obtaining sensor data in at least one processor. The sensor data describe spatio-temporal relationships between the vehicle and obstacles located in a sector relative to the vehicle, for example in a cone in front of the vehicle. Further, the method involves dividing the sensor data into data segments in the at least one processor. Each of the data segments is associated with a particular feature from a set of features containing at least two features. Here, each feature has a probability distribution for detecting obstacles within the sector. A respective artificial neural network model is configured to process the data segments. The processing involves predicting data segment content in an error-free operation of the driver-assistance system. Specifically, each of the data segments is processed in a respective one of the artificial neural network models depending on the feature with which the respective data segment is associated. As a result, respective predicted data segment content is obtained. For each of the features, the predicted data segment content is compared with corresponding data segment content divided off from the obtained sensor data to derive a respective difference measure. Additionally, the method involves generating a fault diagnosis report based on the difference measures. Provided, of course, that the artificial neural network models have been trained properly, this method is advantageous because it enables discovery of rare and critical error scenarios very promptly, i.e. with exceptionally short latency from the occurrence of an error.
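Read as pseudocode, the method above amounts to routing each data segment to the model for its feature, predicting the error-free content and comparing it with the observed content. The following Python sketch illustrates that reading only; the function names, the `predict` interface of the models and the use of a mean absolute deviation as the difference measure are assumptions, not details taken from the patent.

```python
import numpy as np

# Illustrative sketch only; not taken from the patent. `models` is assumed to map
# each feature identifier to a trained neural-network model exposing a `predict`
# method, and the mean absolute deviation is just one possible difference measure.

def detect_faults(feature_segments, models, thresholds):
    """Compare predicted (error-free) segment content with observed content per feature."""
    report = {}
    for feature, observed_segment in feature_segments:
        predicted_segment = models[feature].predict(observed_segment)
        difference = float(np.mean(np.abs(
            np.asarray(predicted_segment) - np.asarray(observed_segment))))
        report[feature] = {
            "difference_measure": difference,
            "fault_suspected": difference > thresholds[feature],
        }
    return report
```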
According to one embodiment of this aspect of the invention, the features reflect a respective longitudinal relative position, a respective latitudinal relative position, a respective longitudinal relative velocity and/or a respective latitudinal relative velocity of the obstacles, if any, that are located in the sector. This means that the vehicle surroundings are modeled in a relevant manner. Preferably, the sector contains at least two zones, and the features further reflect in which of the at least two zones each of the obstacles is located. According to another embodiment of this aspect of the invention, the method involves generating the artificial neural network models by, for each of the at least two zones, training a basic neural-network model with training data in the form of segmented sensor data representing error-free operation of the driver-assistance system. The training involves evaluating a capability for the basic neural-network model to predict a data segment content based on an input data segment. One or more parameters in the basic neural-network model are then adjusted until the capability for the basic neural-network model to predict the data segment content lies within an accuracy threshold. Thereby, the artificial neural network models can be made highly reliable and accurate predictors.
According to yet another embodiment of this aspect of the invention, the method further involves producing the segmented sensor data by: obtaining raw sensor data from the driver-assistance system, which raw sensor data reflect a sequence of events following in succession after one another in time; and preprocessing the raw sensor data by interpolating data samples in the raw sensor data so as to fill out any temporal gaps in the raw sensor data with one or more synthetic data samples whose content is based on a respective content of temporally neighboring data samples in the raw sensor data. Thereby, the training avoids being based on non-representative input, and can thus be made very efficient.
According to still another embodiment of this aspect of the invention, the sensor data contain a radar signal (radar = radio detection and ranging), a lidar signal (lidar = light detection and ranging), a sonar signal (sonar = sound navigation and ranging) and/or a video signal. This provides a relevant basis for registering and keeping track of any obstacles in the vicinity of the vehicle.
According to a further aspect of the invention, the object is achieved by a computer program containing instructions which, when executed on at least one processor, cause the at least one processor to carry out the above-described method.
According to another aspect of the invention, the object is achieved by a non-volatile data carrier containing such a computer program. According to yet another aspect of the invention, the above object is achieved by a control unit adapted to be included in a vehicle for detecting faults in a driver-assistance system of the vehicle. The control unit contains at least one processor configured to obtain sensor data describing spatio-temporal relationships between the vehicle and obstacles located in a sector relative to the vehicle. The at least one processor is further configured to divide the sensor data into data segments, and associate each data segment with a particular feature from a set of features containing at least two features. Each feature has a probability distribution for detecting obstacles within the sector. A respective artificial neural network model implemented in the at least one processor is configured to process the data segments. This processing involves predicting data segment content in an error-free operation of the driver-assistance system. Each of the data segments is processed in a respective one of the artificial neural network models depending on the feature with which the respective data segment is associated. As a result, a respective predicted data segment content is obtained. For each of the features, the at least one processor is further configured to compare the predicted data segment content with corresponding data segment content divided from the obtained sensor data to derive a respective difference measure. Based on the difference measures, in turn, the at least one processor is configured to generate a fault diagnosis report.
The advantages of this control unit, as well as the preferred embodiments thereof, are apparent from the discussion above with reference to the method to detect faults in a driver-assistance system of a vehicle.
According to still another aspect of the invention, the object is achieved by a vehicle including the proposed control unit for detecting faults in a driver-assistance system of the vehicle.
Further advantages, beneficial features and applications of the present invention will be apparent from the following description and the dependent claims.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention is now to be explained more closely by means of preferred embodiments, which are disclosed as examples, and with reference to the attached drawings.
Figure 1 schematically depicts a vehicle in which an embodiment of the invention is implemented;
Figure 2 shows a block diagram of a processor according to one embodiment of the invention;
Figure 3 shows a block diagram of a control unit according to one embodiment of the invention;
Figure 4 illustrates how a monitored sector in the vicinity of a vehicle can be divided into zones according to one embodiment of the invention;
Figure 5 illustrates, schematically, how an artificial neural network may be trained according to one embodiment of the invention; and
Figure 6 illustrates, by means of a flow diagram, the general method according to the invention.
DETAILED DESCRIPTION
Figure 1 schematically depicts a vehicle V in which an embodiment of the invention is implemented in an onboard control unit 110. Figure 2 shows a block diagram of a processor 112 according to one embodiment of the invention. For reasons of clarity, Figure 2 shows only one processor 112. However, according to the invention, the processing functionality may equally well be distributed between two or more processors. Figure 3 shows a block diagram of the control unit 110 according to one embodiment of the invention.
The control unit 110 may contain a processing unit 112 with processing means including at least one processor, such as one or more general-purpose processors. Further, the processing unit 112 is preferably communicatively connected to a data carrier 114 in the form of a computer-readable storage medium, such as a Random Access Memory (RAM), a Flash memory, or the like. The data carrier 114 contains computer-executable instructions, i.e. a computer program 115, for causing the processing unit 112 of the control unit 110 to perform in accordance with the embodiments of the invention as described herein, when the computer-executable instructions are executed on the at least one processor of the processing unit 112.
The control unit 110 is preferably implemented in one or more so-called ECUs (Electronic Control Units) in the vehicle V, which ECUs exchange data and instructions with other units, sensors and actuators in the vehicle V via a CAN (Controller Area Network) bus, or an analogous internal communications network.
The control unit 110 is adapted to detect faults in a driver-assistance system 120 of the vehicle V. To this aim, the control unit 110 contains at least one processor 112, which is configured to obtain sensor data DS describing spatio-temporal relationships between the vehicle V and any obstacles, e.g. another moving vehicle OB1 and a stationary barrel OB2, that are located in a sector S relative to the vehicle V. Thus, the obstacles OB1 and OB2 may be represented by other vehicles travelling in the same direction as the vehicle V or in any other direction, pedestrians and/or stationary objects on or beside the road.
More precisely, the at least one processor 112 is configured to divide the sensor data DS into data segments di, and associate each of these data segments di with a particular feature f1, f2, ..., fn from a set of features {fm}. The set of features {fm} contains at least two features, each of which has a probability distribution for detecting obstacles within the sector S. A respective artificial neural network model ANN1, ANN2, ..., ANNn is configured to process the data segments di. The processing involves predicting data segment content d'i in an error-free operation of the driver-assistance system 120. The at least one processor 112 is configured to process each of the data segments di in a respective one of the artificial neural network models ANN1, ANN2, ..., ANNn depending on the feature of said features with which the respective data segment di is associated, to obtain a respective predicted data segment content f1'(di), f2'(dj) and fn'(dk) respectively. This means that the at least one processor 112 is configured such that each data segment di is processed in an appropriate artificial neural network model.
For each of the features f1, f2, ..., fn, the at least one processor 112 is configured to compare the predicted data segment content f1'(di), f2'(dj) and fn'(dk) with corresponding data segment content f1(di), f2(dj) and fn(dk) respectively, divided off from the obtained sensor data DS, to derive a respective difference measure e1, e2, ..., en. Then, based on the difference measures e1, e2, ..., en, the at least one processor 112 is configured to generate a fault diagnosis report R, for instance in the form of one or more alarm signals or an indication of error-free operation depending on the statuses of the difference measures e1, e2, ..., en.

Preferably, the features f1, f2, ..., fn reflect: a respective longitudinal relative position, a respective latitudinal relative position, a respective longitudinal relative velocity and/or a respective latitudinal relative velocity of the obstacles OB1 and OB2 that are located in the sector S, to describe the spatio-temporal relationships between the vehicle V and the obstacles OB1 and OB2. The sector S, in turn, may be embodied as a cone, for instance as illustrated in Figure 1, in which one or more sensors (not shown) are configured to register the sensor data DS, via for example radar signals, lidar signals, sonar signals and/or video signals.
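As an illustration of the kind of features described above, the following sketch collects the four relative quantities for one obstacle into a feature vector. The class, field names and units are hypothetical; the patent does not prescribe any particular data structure.

```python
from dataclasses import dataclass

# Hypothetical data structure for the spatio-temporal features described above;
# field names and units are assumptions, not taken from the patent.

@dataclass
class ObstacleObservation:
    longitudinal_position: float  # distance ahead of the vehicle, metres
    latitudinal_position: float   # offset from the vehicle centre line, metres
    longitudinal_velocity: float  # relative velocity along the travel direction, m/s
    latitudinal_velocity: float   # relative velocity across the travel direction, m/s

def feature_vector(observation: ObstacleObservation) -> list:
    """Collect the four relative quantities into a single feature vector."""
    return [
        observation.longitudinal_position,
        observation.latitudinal_position,
        observation.longitudinal_velocity,
        observation.latitudinal_velocity,
    ]
```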
According to one embodiment of the invention, the sector S contains, i.e. is subdivided into, two or more zones Z01, Z02, Z03, Z04, Z05, Z06, Z07, Z08, Z09, Z10, Z11, Z12, Z13 and Z14, which zones represent different areas in front of the vehicle V as illustrated in Figure 4. In such a case, the features f1, f2, ..., fn further reflect in which of the at least two zones each of the obstacles OB1 and OB2 respectively is located. In this embodiment of the invention, the artificial neural network models ANN1, ANN2, ..., ANNn have preferably been generated as follows.

Referring further to Figure 5, for each of the at least two zones Z01, Z02, Z03, Z04, Z05, Z06, Z07, Z08, Z09, Z10, Z11, Z12, Z13 and Z14, a basic neural-network model ANN is trained with training data in the form of segmented sensor data that represent error-free operation of the driver-assistance system 120. In Figure 5, this training data is exemplified by segmented sensor data f1(di), f1(di+1) and f1(di+2) respectively. The training involves evaluating a capability for the basic neural-network model ANN to predict a data segment f'(di) based on an input data segment f1(di). For repeated inputs of training data, i.e. error-free segmented sensor data f1(di), f1(di+1) and f1(di+2), the output from the neural-network model ANN is compared to the input, and one or more parameters (pm) in the basic neural-network model ANN are adjusted until the capability for the thus adjusted variant of the basic neural-network model ANN to predict the data segment content f'(di) lies within an accuracy threshold eth, i.e. an acceptable quality measure for the prediction.
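A minimal training-loop sketch of the procedure just described is given below, assuming a PyTorch-style model. The framework, network architecture, loss function and stopping rule are assumptions; the patent only states that parameters are adjusted until the prediction capability lies within an accuracy threshold.

```python
import torch
from torch import nn

# Minimal sketch, assuming a PyTorch-style model. Architecture, loss and stopping
# rule are assumptions made for illustration only.

def train_zone_model(model: nn.Module, error_free_segments, accuracy_threshold: float,
                     learning_rate: float = 1e-3, max_epochs: int = 1000) -> nn.Module:
    """Adjust the model until it reproduces error-free segment content within the threshold."""
    optimiser = torch.optim.Adam(model.parameters(), lr=learning_rate)
    loss_fn = nn.MSELoss()  # one possible quality measure for the prediction
    segments = [torch.as_tensor(s, dtype=torch.float32) for s in error_free_segments]
    for _ in range(max_epochs):
        epoch_loss = 0.0
        for segment in segments:
            optimiser.zero_grad()
            predicted = model(segment)          # predicted segment content f'(di)
            loss = loss_fn(predicted, segment)  # compare the output with the input
            loss.backward()
            optimiser.step()
            epoch_loss += loss.item()
        if epoch_loss / len(segments) < accuracy_threshold:
            break  # prediction capability now lies within the accuracy threshold
    return model
```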
It is preferable if the segmented sensor data f1(di), f1(di+1), f1(di+2) being used for the training have been preprocessed to enhance the data quality. For example, the enhancement may involve obtaining raw sensor data DS from the driver-assistance system 120, which raw sensor data DS reflect a sequence of events following in succession after one another in time. The raw sensor data DS are then preprocessed by interpolating data samples in the raw sensor data so as to fill out any temporal gaps in the raw sensor data with one or more synthetic data samples, where the content of the synthetic data samples is based on respective contents of temporally neighboring data samples in the raw sensor data. In other words, the raw sensor data DS are supplemented with suitable artificial data samples to avoid feeding non-representative data into the basic neural-network model ANN, and its gradually adjusted variants, during the training process.
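The gap-filling step could, for example, be realized with linear interpolation between the temporally neighboring raw samples, as in the sketch below. The sample representation, the nominal sample period and the choice of linear interpolation are assumptions for illustration; the patent does not prescribe an interpolation scheme.

```python
import numpy as np

# Sketch of the gap filling described above, assuming raw samples arrive as
# (timestamp, value) pairs at a nominal sample period.

def fill_temporal_gaps(timestamps, values, sample_period):
    """Insert interpolated synthetic samples wherever neighbouring raw samples
    are further apart than the nominal sample period."""
    timestamps = np.asarray(timestamps, dtype=float)
    values = np.asarray(values, dtype=float)
    filled_t, filled_v = [timestamps[0]], [values[0]]
    for t0, t1, v0, v1 in zip(timestamps[:-1], timestamps[1:], values[:-1], values[1:]):
        missing = int(round((t1 - t0) / sample_period)) - 1
        for k in range(1, missing + 1):
            t_syn = t0 + k * sample_period
            # synthetic sample based on the temporally neighbouring raw samples
            filled_t.append(t_syn)
            filled_v.append(float(np.interp(t_syn, [t0, t1], [v0, v1])))
        filled_t.append(t1)
        filled_v.append(v1)
    return np.array(filled_t), np.array(filled_v)
```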
To sum up, and with reference to the flow diagram in Figure 6, we will now describe the general method according to the invention for detecting faults in a driver-assistance system of a vehicle.
In a first step 610, sensor data are obtained, which describe spatio-temporal relationships between a vehicle and any obstacles located in a sector relative to the vehicle.
A step 620 thereafter checks whether a sufficient amount of sensor data has been received to form a data segment, and if so, a step 630 follows. Otherwise, the procedure loops back and stays in step 620 until enough sensor data have been received to fill up a data segment. In step 630, the data segment formed in the previous step is associated with a particular feature from a set of features containing at least two features. Each of these features, in turn, has a probability distribution for detecting obstacles within the sector.
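Steps 620 and 630 can be pictured as a simple buffering loop, sketched below under the assumption of a fixed segment length and an externally supplied feature-assignment rule; neither is specified in the patent.

```python
# Sketch of steps 620-630: accumulate samples until a segment is full, then
# associate the segment with a feature. Segment length and the feature-assignment
# callback are assumptions made for illustration.

SEGMENT_LENGTH = 50  # assumed number of samples per data segment

def collect_segments(sample_stream, assign_feature):
    """Yield (feature, segment) pairs as soon as enough samples have been received."""
    buffer = []
    for sample in sample_stream:
        buffer.append(sample)                       # step 620: keep accumulating
        if len(buffer) >= SEGMENT_LENGTH:           # enough data to fill up a segment
            segment, buffer = buffer[:SEGMENT_LENGTH], buffer[SEGMENT_LENGTH:]
            yield assign_feature(segment), segment  # step 630: associate with a feature
```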
Subsequently, in a step 640, the data segment is processed in a particular artificial neural network model depending on the feature of said features with which the data segment is associated. The processing involves predicting data segment content in an error-free operation of the driver-assistance system. Thus, the processing results in a predicted data segment content being obtained.
A subsequent step 650 compares, for the features in question, the predicted data segment content with corresponding data segment content divided from the obtained sensor data to derive a difference measure between the predicted data segment content and the data segment content obtained from the sensor data.
Then, in a step 660, a fault diagnosis report is generated based on the difference measure. The fault diagnosis report may be generated gradually as new data segments are divided off from the obtained sensor data, and if the difference measure exceeds a threshold level, an alarm signal may be generated.
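One way to picture the gradual report generation and the optional alarm signal of step 660 is the small helper below; the report layout and the per-feature threshold are illustrative assumptions only.

```python
# Sketch of step 660: the fault diagnosis report grows as new data segments are
# divided off, and an alarm is flagged whenever a difference measure exceeds
# its threshold. The report structure is an assumption for illustration.

def update_fault_report(report, feature, difference, threshold):
    """Extend the report with the latest difference measure for one feature."""
    entry = report.setdefault(feature, {"history": [], "alarm_count": 0})
    entry["history"].append(difference)
    if difference > threshold:
        entry["alarm_count"] += 1  # here an alarm signal may be generated
    return report
```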
After step 660, the procedure loops back to step 610.
All of the process steps, as well as any sub-sequence of steps, described with reference to Figure 6 above may be controlled by means of at least one programmed processor. Moreover, although the embodiments of the invention described above with reference to the drawings comprise a processor and processes performed in at least one processor, the invention thus also extends to computer programs, particularly computer programs on or in a carrier, adapted for putting the invention into practice. The program may be in the form of source code, object code, a code intermediate source and object code such as in partially compiled form, or in any other form suitable for use in the implementation of the process according to the invention. The program may either be a part of an operating system, or be a separate application. The carrier may be any entity or device capable of carrying the program. For example, the carrier may comprise a storage medium, such as a Flash memory, a ROM (Read Only Memory), for example a DVD (Digital Video/Versatile Disk), a CD (Compact Disc) or a semiconductor ROM, an EPROM (Erasable Programmable Read-Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), or a magnetic recording medium, for example a floppy disc or hard disc. Further, the carrier may be a transmissible carrier such as an electrical or optical signal which may be conveyed via electrical or optical cable or by radio or by other means. When the program is embodied in a signal which may be conveyed directly by a cable or other device or means, the carrier may be constituted by such cable or device or means. Alternatively, the carrier may be an integrated circuit in which the program is embedded, the integrated circuit being adapted for performing, or for use in the performance of, the relevant processes.
The term "comprises/comprising" when used in this specification is taken to specify the presence of stated features, integers, steps or components. However, the term does not preclude the presence or addition of one or more additional features, integers, steps or components or groups thereof.
The invention is not restricted to the described embodiments in the figures, but may be varied freely within the scope of the claims.

Claims
1. A method performed in a control unit (110) to detect faults in a driver-assistance system (120) of a vehicle (V), the method comprising:
obtaining, in at least one processor (112), sensor data (DS) describing spatio-temporal relationships between the vehicle (V) and obstacles (OB1, OB2) located in a sector (S) relative to the vehicle (V),
characterized by, in the at least one processor (112):
dividing the sensor data (DS) into data segments (di);
associating each of said data segments (di) with a particular feature (f1, f2, ..., fn) from a set of features ({fm}) containing at least two features each of which has a probability distribution for detecting obstacles within the sector (S), a respective artificial neural network model (ANN1, ANN2, ANNn) being configured to process the data segments (di), the processing involving predicting data segment content (d'i) in an error-free operation of the driver-assistance system (120);
processing each of the data segments (di) in a respective one of the artificial neural network models (ANN1, ANN2, ANNn) depending on the feature of said features with which the respective data segment (di) is associated to obtain a respective predicted data segment content (f1'(di), f2'(dj), fn'(dk));
comparing, for each of said features (f1, f2, ..., fn), the predicted data segment content (f1'(di), f2'(dj), fn'(dk)) with corresponding data segment content (f1(di), f2(dj), fn(dk)) divided from the obtained sensor data (DS) to derive a respective difference measure (e1, e2, ..., en); and
generating a fault diagnosis report (R) based on said difference measures (e1, e2, ..., en).
2. The method according to claim 1, wherein said features (f1, f2, ..., fn) reflect at least one of: a respective longitudinal relative position, a respective latitudinal relative position, a respective longitudinal relative velocity and a respective latitudinal relative velocity of the obstacles (OB1, OB2) located in the sector (S).
3. The method according to claim 2, wherein the sector (S) comprises at least two zones (Z01, Z02, Z03, Z04, Z05, Z06, Z07, Z08, Z09, Z10, Z11, Z12, Z13, Z14), and said features (f1, f2, ..., fn) further reflect in which of said at least two zones each of the obstacles (OB1, OB2) is located.
4. The method according to claim 3, comprising:
generating the artificial neural network models (ANN1, ANN2, ANNn) by, for each of the at least two zones, training a basic neural-network model (ANN) with training data in the form of segmented sensor data (f(di), f(di+1), f(di+2), ...) representing error-free operation of the driver-assistance system (120), the training involving evaluating a capability for the basic neural-network model (ANN) to predict a data segment content (f'(di)) based on an input data segment (f(di)), and adjusting one or more parameters (pm) in the basic neural-network model (ANN) until the capability for the basic neural-network model (ANN) to predict the data segment content (f'(di)) lies within an accuracy threshold (eth).
5. The method according to claim 4, further comprising producing the segmented sensor data (f1(di), f1(di+1), f1(di+2), ...) by:
obtaining raw sensor data (DS) from the driver-assistance system (120), which raw sensor data (DS) reflect a sequence of events following in succession after one another in time; and
preprocessing the raw sensor data (DS) by interpolating data samples in the raw sensor data so as to fill out any temporal gaps in the raw sensor data with one or more synthetic data samples whose content is based on a respective content of temporally neighboring data samples in the raw sensor data.
6. The method according to any one of the preceding claims, wherein the sensor data (DS) comprises at least one of: a radar signal, a lidar signal, a sonar signal and a video signal.
7. A computer program product (115) loadable into a non-volatile data carrier (114) communicatively connected to at least one processing unit (112), the computer program product (115) comprising software for executing the method according to any one of claims 1 to 6 when the computer program product (115) is run on the at least one processing unit (112).
8. A non-volatile data carrier (114) containing the computer program product (115) of claim 7.
9. A control unit (110) adapted to be comprised in a vehicle (V) for detecting faults in a driver-assistance system (120) of the vehicle (V), the control unit (110) comprising at least one processor (112) configured to obtain sensor data (DS) describing spatio-temporal relationships between the vehicle (V) and obstacles (OB1, OB2) located in a sector (S) relative to the vehicle (V), characterized in that the at least one processor (112) is further configured to:
divide the sensor data (DS) into data segments (di);
associate each of said data segments (di) with a particular feature (f1, f2, ..., fn) from a set of features ({fm}) containing at least two features each of which has a probability distribution for detecting obstacles within the sector (S), a respective artificial neural network model (ANN1, ANN2, ANNn) being configured to process the data segments (di), the processing involving predicting data segment content (d'i) in an error-free operation of the driver-assistance system (120);
process each of the data segments (di) in a respective one of the artificial neural network models (ANN1, ANN2, ANNn) depending on the feature of said features with which the respective data segment (di) is associated to obtain a respective predicted data segment content (f1'(di), f2'(dj), fn'(dk));
compare, for each of said features (f1, f2, ..., fn), the predicted data segment content (f1'(di), f2'(dj), fn'(dk)) with corresponding data segment content (f1(di), f2(dj), fn(dk)) divided from the obtained sensor data (DS) to derive a respective difference measure (e1, e2, ..., en); and
generate a fault diagnosis report (R) based on said difference measures (e1, e2, ..., en).
10. The control unit (110) according to claim 9, wherein said features (f1, f2, ..., fn) reflect at least one of: a respective longitudinal relative position, a respective latitudinal relative position, a respective longitudinal relative velocity and a respective latitudinal relative velocity of the obstacles (OB1, OB2) located in the sector (S).
11. The control unit (110) according to claim 10, wherein the sector (S) comprises at least two zones (Z01, Z02, Z03, Z04, Z05, Z06, Z07, Z08, Z09, Z10, Z11, Z12, Z13, Z14), and said features (f1, f2, ..., fn) further reflect in which of said at least two zones each of the obstacles (OB1, OB2) is located.
12. The control unit (110) according to claim 11, wherein the artificial neural network models (ANN1, ANN2, ANNn) have been generated by, for each of the at least two zones, training a basic neural-network model (ANN) with training data in the form of segmented sensor data (f1(di), f1(di+1), f1(di+2), ...) representing error-free operation of the driver-assistance system (120), the training involving evaluating a capability for the basic neural-network model (ANN) to predict a data segment content (f'(di)) based on an input data segment (f1(di)), and adjusting one or more parameters (pm) in the basic neural-network model (ANN) until the capability for the basic neural-network model (ANN) to predict the data segment content (f'(di)) lies within an accuracy threshold (eth).
13. The control unit (110) according to claim 12, wherein the segmented sensor data (f1(di), f1(di+1), f1(di+2), ...) have been produced by:
obtaining raw sensor data (DS) from the driver-assistance system (120), which raw sensor data (DS) reflect a sequence of events following in succession after one another in time; and
preprocessing the raw sensor data (DS) by interpolating data samples in the raw sensor data so as to fill out any temporal gaps in the raw sensor data with one or more synthetic data samples whose content is based on a respective content of temporally neighboring data samples in the raw sensor data.
14. The control unit (110) according to any one of claims 9 to 13, wherein the sensor data (DS) comprises at least one of: a radar signal, a lidar signal, a sonar signal and a video signal.
15. A vehicle (V) comprising the control unit (110) according to any one of claims 9 to 14 for detecting faults in a driver-assistance system (120) of the vehicle (V).
PCT/SE2019/051136 2018-11-23 2019-11-12 Method, Computer Program, Control Unit for Detecting Faults in a Driver-Assistance System and Vehicle WO2020106201A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE1851450-5 2018-11-23
SE1851450A SE1851450A1 (en) 2018-11-23 2018-11-23 Method, Computer Program, Control Unit for Detecting Faults in a Driver-Assistance System and Vehicle

Publications (1)

Publication Number Publication Date
WO2020106201A1 true WO2020106201A1 (en) 2020-05-28

Family

ID=70774102

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2019/051136 WO2020106201A1 (en) 2018-11-23 2019-11-12 Method, Computer Program, Control Unit for Detecting Faults in a Driver-Assistance System and Vehicle

Country Status (2)

Country Link
SE (1) SE1851450A1 (en)
WO (1) WO2020106201A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112348293A (en) * 2021-01-07 2021-02-09 北京三快在线科技有限公司 Method and device for predicting track of obstacle
WO2023169325A1 (en) * 2022-03-07 2023-09-14 长城汽车股份有限公司 Vehicle early warning method and apparatus, electronic device, and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7046822B1 (en) * 1999-06-11 2006-05-16 Daimlerchrysler Ag Method of detecting objects within a wide range of a road vehicle
US20160284212A1 (en) * 2015-03-28 2016-09-29 Igor Tatourian Technologies for detection of anomalies in vehicle traffic patterns
US20170169627A1 (en) * 2015-12-09 2017-06-15 Hyundai Motor Company Apparatus and method for failure diagnosis and calibration of sensors for advanced driver assistance systems
WO2017160201A1 (en) * 2016-03-15 2017-09-21 Scania Cv Ab Method and control unit for vehicle self-diagnosis
DE102017205093A1 (en) * 2017-03-27 2018-09-27 Conti Temic Microelectronic Gmbh Method and system for predicting sensor signals of a vehicle

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
G.N. BIFULCO ET AL.: "Real-time smoothing of car-following data through sensor-fusion techniques", PROCEDIA - SOCIAL AND BEHAVIORAL SCIENCES, vol. 20, pages 524 - 535, XP055711247, DOI: 10.1016/j.sbspro.2011.08.059 *
M. TAIE ET AL.: "Remote Diagnosis, Maintenance and Prognosis for Advanced Driver Assistance Systems Using Machine Learning Algorithms", SAE INTERNATIONAL JOURNAL OF PASSENGER CARS - ELECTRONIC AND ELECTRICAL SYSTEMS, vol. 9, no. 1, 2016, pages 114 - 122, XP055711249, DOI: 10.4271/2016-01-0076 *
Z. SUN ET AL.: "On-Road Vehicle Detection: A Review", IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, vol. 28, no. 5, May 2006 (2006-05-01), pages 694 - 711, XP008092043, DOI: 10.1109/TPAMI.2006.104 *

Also Published As

Publication number Publication date
SE1851450A1 (en) 2020-05-24

Similar Documents

Publication Publication Date Title
CN109784254B (en) Vehicle violation event detection method and device and electronic equipment
CN113272838A (en) Virtual scene generation method and device, computer equipment and storage medium
CN109740609B (en) Track gauge detection method and device
CN111077882B (en) Fault voice broadcasting method and device based on unmanned vehicle
CN104966304A (en) Kalman filtering and nonparametric background model-based multi-target detection tracking method
WO2020106201A1 (en) Method, Computer Program, Control Unit for Detecting Faults in a Driver-Assistance System and Vehicle
CN113269042B (en) Intelligent traffic management method and system based on driving vehicle violation identification
CN113096397A (en) Traffic jam analysis method based on millimeter wave radar and video detection
CN115856872A (en) Vehicle motion track continuous tracking method
CN111814766A (en) Vehicle behavior early warning method and device, computer equipment and storage medium
CN115311617A (en) Method and system for acquiring passenger flow information of urban rail station area
JP3065822B2 (en) Moving object detection device
CN113674317A (en) Vehicle tracking method and device of high-order video
CN113763425A (en) Road area calibration method and electronic equipment
CN112633151A (en) Method, device, equipment and medium for determining zebra crossing in monitored image
CN116990768A (en) Predicted track processing method and device, electronic equipment and readable medium
CN115966084A (en) Holographic intersection millimeter wave radar data processing method and device and computer equipment
CN116052417A (en) Driving prediction method, device, equipment and readable storage medium
CN109800685A (en) The determination method and device of object in a kind of video
CN109800678A (en) The attribute determining method and device of object in a kind of video
CN115294169A (en) Vehicle tracking method and device, electronic equipment and storage medium
CN117523914A (en) Collision early warning method, device, equipment, readable storage medium and program product
CN109740518B (en) Method and device for determining object in video
CN110059591B (en) Method for identifying moving target area
CN117593708B (en) Traffic digital twin method, equipment and storage medium containing vehicle identity information

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 19887740
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 19887740
Country of ref document: EP
Kind code of ref document: A1