SE1851450A1 - Method, Computer Program, Control Unit for Detecting Faults in a Driver-Assistance System and Vehicle - Google Patents
Method, Computer Program, Control Unit for Detecting Faults in a Driver-Assistance System and Vehicle
- Publication number
- SE1851450A1
- Authority
- SE
- Sweden
- Prior art keywords
- data
- sensor data
- driver
- data segment
- vehicle
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract 11
- 238000004590 computer program Methods 0.000 title claims 5
- 238000013528 artificial neural network Methods 0.000 claims abstract 6
- 238000003745 diagnosis Methods 0.000 claims abstract 3
- 238000003062 neural network model Methods 0.000 claims 9
- 230000002123 temporal effect Effects 0.000 claims 2
- 230000000875 corresponding effect Effects 0.000 claims 1
- 238000007781 pre-processing Methods 0.000 claims 1
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B23/00—Testing or monitoring of control systems or parts thereof
- G05B23/02—Electric testing or monitoring
- G05B23/0205—Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults
- G05B23/0218—Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults characterised by the fault detection method dealing with either existing or incipient faults
- G05B23/0243—Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults characterised by the fault detection method dealing with either existing or incipient faults model based detection method, e.g. first-principles knowledge model
- G05B23/0254—Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults characterised by the fault detection method dealing with either existing or incipient faults model based detection method, e.g. first-principles knowledge model based on a quantitative model, e.g. mathematical relationships between inputs and outputs; functions: observer, Kalman filter, residual calculation, Neural Networks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/02—Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
- B60W50/0205—Diagnosing or detecting failures; Failure detection models
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R16/00—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
- B60R16/02—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
- B60R16/023—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
- B60R16/0231—Circuits relating to the driving or the functioning of the vehicle
- B60R16/0232—Circuits relating to the driving or the functioning of the vehicle for measuring vehicle parameters and indicating critical, abnormal or dangerous conditions
- B60R16/0234—Circuits relating to the driving or the functioning of the vehicle for measuring vehicle parameters and indicating critical, abnormal or dangerous conditions related to maintenance or repairing of vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
- B60W30/0956—Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/0097—Predicting future conditions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/217—Validation; Performance evaluation; Active pattern learning techniques
- G06F18/2178—Validation; Performance evaluation; Active pattern learning techniques based on feedback of a supervisor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/254—Fusion techniques of classification results, e.g. of results related to same input data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
- G06V20/647—Three-dimensional objects by matching two-dimensional images to three-dimensional objects
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/165—Anti-collision systems for passive traffic, e.g. including static obstacles, trees
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/02—Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
- B60W50/0205—Diagnosing or detecting failures; Failure detection models
- B60W2050/021—Means for detecting failure or malfunction
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/408—Radar; Laser, e.g. lidar
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/54—Audio sensitive means, e.g. ultrasound
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- Mechanical Engineering (AREA)
- Multimedia (AREA)
- Data Mining & Analysis (AREA)
- Transportation (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Computing Systems (AREA)
- Human Computer Interaction (AREA)
- General Health & Medical Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Databases & Information Systems (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Biology (AREA)
- Medical Informatics (AREA)
- Mathematical Physics (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Computational Linguistics (AREA)
- Molecular Biology (AREA)
- Traffic Control Systems (AREA)
Abstract
Faults are detected in a driver-assistance system of a vehicle by obtaining sensor data (DS) in a processor (112), which sensor data (DS) describe spatio-temporal relationships between the vehicle and obstacles in a sector relative to the vehicle. The sensor data (DS) are divided into data segments (di). Each data segment (di) is associated with a particular feature (f1, f2, ..., fn) from a set of at least two different features ({f}). Each feature has a probability distribution for detecting obstacles within the sector. A respective artificial neural network model (ANN1, ANN2, ANNn) is configured to process the data segments (di). The processing involves predicting data segment content (d′i) in an error-free operation of the driver-assistance system (120). Each of the data segments (di) is processed in a respective one of the artificial neural network models (ANN1, ANN2, ANNn) depending on the feature with which the respective data segment (di) is associated to obtain a respective predicted data segment content (f1′(di), f2′(dj), fn′(dk)). For each feature (f1, f2, ..., fn), the predicted data segment content (f1′(di), f2′(dj), fn′(dk)) is compared with corresponding data segment content (f1(di), f2(dj), fn(dk)) divided from the obtained sensor data (DS) to derive a respective difference measure (e1, e2, ..., en). A fault diagnosis report (R) is generated based on said difference measures (e1, e2, ..., en).
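The residual-based fault check described in the abstract can be sketched as follows. This is an illustrative sketch only: the function name, the dictionary layout, and the use of a mean-absolute difference as the difference measure e_k are assumptions, since the patent does not prescribe a particular residual metric or data structure.

```python
import numpy as np

def detect_faults(segments, models, actuals, thresholds):
    """Residual-based fault check with one predictor per feature.

    segments[k]   -- input data segment d_i for feature k (np.ndarray)
    models[k]     -- callable predicting the expected segment content
                     f_k'(d_i) under error-free operation
    actuals[k]    -- segment content f_k(d_i) actually observed
    thresholds[k] -- maximum tolerated difference measure e_k
    """
    report = {}
    for k, segment in segments.items():
        predicted = models[k](segment)                        # f_k'(d_i)
        e_k = float(np.mean(np.abs(predicted - actuals[k])))  # difference measure e_k
        report[k] = {"residual": e_k, "fault": e_k > thresholds[k]}
    return report  # stands in for the fault diagnosis report (R)
```

A model that reproduces the observed content yields a zero residual and no fault flag, while a mismatch above the threshold flags that feature.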
Claims (15)
1. A method performed in a control unit (110) to detect faults in a driver-assistance system (120) of a vehicle (V), the method comprising: obtaining, in at least one processor (112), sensor data (DS) describing spatio-temporal relationships between the vehicle (V) and obstacles (OB1, OB2) located in a sector (S) relative to the vehicle (V), characterized by, in the at least one processor (112): dividing the sensor data (DS) into data segments (di); associating each of said data segments (di) with a particular feature (f1, f2, ..., fn) from a set of features ({fn}) containing at least two features each of which has a probability distribution for detecting obstacles within the sector (S), a respective artificial neural network model (ANN1, ANN2, ANNn) being configured to process the data segments (di), the processing involving predicting data segment content (d′i) in an error-free operation of the driver-assistance system (120); processing each of the data segments (di) in a respective one of the artificial neural network models (ANN1, ANN2, ANNn) depending on the feature of said features with which the respective data segment (di) is associated to obtain a respective predicted data segment content (f1′(di), f2′(dj), fn′(dk)); comparing, for each of said features (f1, f2, ..., fn), the predicted data segment content (f1′(di), f2′(dj), fn′(dk)) with corresponding data segment content (f1(di), f2(dj), fn(dk)) divided from the obtained sensor data (DS) to derive a respective difference measure (e1, e2, ..., en); and generating a fault diagnosis report (R) based on said difference measures (e1, e2, ..., en).
2. The method according to claim 1, wherein said features (f1, f2, ..., fn) reflect at least one of: a respective longitudinal relative position, a respective latitudinal relative position, a respective longitudinal relative velocity and a respective latitudinal relative velocity of the obstacles (OB1, OB2) located in the sector (S).
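The four feature types of claim 2 can be illustrated with a minimal sketch. The dictionary field names and the simple ego-frame subtraction are assumptions for illustration; the patent does not define a coordinate convention.

```python
def relative_features(ego, obstacle):
    """Longitudinal/latitudinal relative position and velocity of an
    obstacle with respect to the ego vehicle, in the ego frame.

    ego, obstacle -- dicts with 'x', 'y' (m) and 'vx', 'vy' (m/s);
    the field names are illustrative, not taken from the patent.
    """
    return {
        "long_pos": obstacle["x"] - ego["x"],    # longitudinal relative position
        "lat_pos":  obstacle["y"] - ego["y"],    # latitudinal relative position
        "long_vel": obstacle["vx"] - ego["vx"],  # longitudinal relative velocity
        "lat_vel":  obstacle["vy"] - ego["vy"],  # latitudinal relative velocity
    }
```

Each returned value corresponds to one of the features f1 ... fn that selects which neural network model processes the associated data segment.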
3. The method according to claim 2, wherein the sector (S) comprises at least two zones (Z01, Z02, Z03, Z04, Z05, Z06, Z07, Z08, Z09, Z10, Z11, Z12, Z13, Z14), and said features (f1, f2, ..., fn) further reflect in which of said at least two zones each of the obstacles (OB1, OB2) is located.
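Mapping an obstacle's relative position to one of the zones of claim 3 could look like the sketch below. The rectangular grid geometry and zero-based zone index are hypothetical; the patent does not fix a particular zone layout.

```python
import bisect

def zone_of(long_pos, lat_pos, long_edges, lat_edges):
    """Map a relative position to a zone index in a rectangular grid.

    long_edges, lat_edges -- sorted boundary coordinates (m) that split
    the sector into (len(long_edges)+1) x (len(lat_edges)+1) zones.
    The grid layout is an illustrative assumption.
    """
    row = bisect.bisect_right(long_edges, long_pos)  # longitudinal band
    col = bisect.bisect_right(lat_edges, lat_pos)    # latitudinal band
    n_cols = len(lat_edges) + 1
    return row * n_cols + col  # zero-based zone index
```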
4. The method according to claim 3, comprising: generating the artificial neural network models (ANN1, ANN2, ANNn) by, for each of the at least two zones, training a basic neural-network model (ANN) with training data in the form of segmented sensor data (f(dr), f(dr+1), f(dr+2), ...) representing error-free operation of the driver-assistance system (120), the training involving evaluating a capability for the basic neural-network model (ANN) to predict a data segment content (f′(di)) based on an input data segment (f(di)), and adjusting one or more parameters (pm) in the basic neural-network model (ANN) until the capability for the basic neural-network model (ANN) to predict the data segment content (f′(di)) lies within an accuracy threshold (em).
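The train-until-within-threshold loop of claim 4 can be sketched with a trivial stand-in predictor. A linear model replaces the patent's basic neural-network model (ANN) purely to keep the example short; the stopping criterion mirrors the accuracy threshold e_m, while the learning rate and epoch cap are illustrative assumptions.

```python
def train_until_threshold(pairs, e_th, lr=0.05, max_epochs=5000):
    """Adjust parameters p_m until the mean prediction error on
    error-free training pairs lies within the accuracy threshold e_th.

    pairs -- list of (f(d_r), f(d_{r+1})) scalar training examples.
    """
    w, b = 0.0, 0.0                      # stand-in parameters p_m
    for _ in range(max_epochs):
        total_err = 0.0
        for d_in, d_next in pairs:
            pred = w * d_in + b          # f'(d_i): predicted next content
            g = pred - d_next            # prediction error
            w -= lr * g * d_in           # gradient step on the parameters
            b -= lr * g
            total_err += abs(g)
        if total_err / len(pairs) <= e_th:  # capability within threshold
            break
    return w, b
```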
5. The method according to claim 4, further comprising producing the segmented sensor data (f1(dr), f1(dr+1), f1(dr+2), ...) by: obtaining raw sensor data (DS) from the driver-assistance system (120), which raw sensor data (DS) reflect a sequence of events following in succession after one another in time; and preprocessing the raw sensor data (DS) by interpolating data samples in the raw sensor data so as to fill out any temporal gaps in the raw sensor data with one or more synthetic data samples whose content is based on a respective content of temporally neighboring data samples in the raw sensor data.
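The gap-filling preprocessing of claim 5 amounts to resampling onto a regular time grid and synthesizing the missing samples from their temporal neighbours. Linear interpolation and the nominal sampling period `dt` are assumptions here; the patent only requires that synthetic samples be based on neighbouring content.

```python
import numpy as np

def fill_gaps(t, x, dt):
    """Fill temporal gaps in raw sensor samples (t, x) with synthetic
    samples interpolated from temporally neighbouring samples.

    t  -- sample timestamps (s), sorted, possibly with gaps
    x  -- sample values
    dt -- assumed nominal sampling period (s)
    """
    t_full = np.arange(t[0], t[-1] + dt / 2, dt)  # regular time grid
    x_full = np.interp(t_full, t, x)              # synthetic samples in gaps
    return t_full, x_full
```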
6. The method according to any one of the preceding claims, wherein the sensor data (DS) comprises at least one of: a radar signal, a lidar signal, a sonar signal and a video signal.
7. A computer program product (115) loadable into a non-volatile data carrier (114) communicatively connected to at least one processing unit (112), the computer program product (115) comprising software for executing the method according to any one of claims 1 to 6 when the computer program product (115) is run on the at least one processing unit (112).
8. A non-volatile data carrier (114) containing the computer program product (115) of claim 7.
9. A control unit (110) adapted to be comprised in a vehicle (V) for detecting faults in a driver-assistance system (120) of the vehicle (V), the control unit (110) comprising at least one processor (112) configured to obtain sensor data (DS) describing spatio-temporal relationships between the vehicle (V) and obstacles (OB1, OB2) located in a sector (S) relative to the vehicle (V), characterized in that the at least one processor (112) is further configured to: divide the sensor data (DS) into data segments (di); associate each of said data segments (di) with a particular feature (f1, f2, ..., fn) from a set of features ({fn}) containing at least two features each of which has a probability distribution for detecting obstacles within the sector (S), a respective artificial neural network model (ANN1, ANN2, ANNn) being configured to process the data segments (di), the processing involving predicting data segment content (d′i) in an error-free operation of the driver-assistance system (120); process each of the data segments (di) in a respective one of the artificial neural network models (ANN1, ANN2, ANNn) depending on the feature of said features with which the respective data segment (di) is associated to obtain a respective predicted data segment content (f1′(di), f2′(dj), fn′(dk)); compare, for each of said features (f1, f2, ..., fn), the predicted data segment content (f1′(di), f2′(dj), fn′(dk)) with corresponding data segment content (f1(di), f2(dj), fn(dk)) divided from the obtained sensor data (DS) to derive a respective difference measure (e1, e2, ..., en); and generate a fault diagnosis report (R) based on said difference measures (e1, e2, ..., en).
10. The control unit (110) according to claim 9, wherein said features (f1, f2, ..., fn) reflect at least one of: a respective longitudinal relative position, a respective latitudinal relative position, a respective longitudinal relative velocity and a respective latitudinal relative velocity of the obstacles (OB1, OB2) located in the sector (S).
11. The control unit (110) according to claim 10, wherein the sector (S) comprises at least two zones (Z01, Z02, Z03, Z04, Z05, Z06, Z07, Z08, Z09, Z10, Z11, Z12, Z13, Z14), and said features (f1, f2, ..., fn) further reflect in which of said at least two zones each of the obstacles (OB1, OB2) is located.
12. The control unit (110) according to claim 11, wherein the artificial neural network models (ANN1, ANN2, ANNn) have been generated by, for each of the at least two zones, training a basic neural-network model (ANN) with training data in the form of segmented sensor data (f1(dr), f1(dr+1), f1(dr+2), ...) representing error-free operation of the driver-assistance system (120), the training involving evaluating a capability for the basic neural-network model (ANN) to predict a data segment content (f′(di)) based on an input data segment (f1(di)), and adjusting one or more parameters (pm) in the basic neural-network model (ANN) until the capability for the basic neural-network model (ANN) to predict the data segment content (f′(di)) lies within an accuracy threshold (em).
13. The control unit (110) according to claim 12, wherein the segmented sensor data (f1(dr), f1(dr+1), f1(dr+2), ...) have been produced by: obtaining raw sensor data (DS) from the driver-assistance system (120), which raw sensor data (DS) reflect a sequence of events following in succession after one another in time; and preprocessing the raw sensor data (DS) by interpolating data samples in the raw sensor data so as to fill out any temporal gaps in the raw sensor data with one or more synthetic data samples whose content is based on a respective content of temporally neighboring data samples in the raw sensor data.
14. The control unit (110) according to any one of the claims 9to 13, wherein the sensor data (DS) comprises at least one of: aradar signal, a lidar signal, a sonar signal and a video signal.
15. A vehicle (V) comprising the control unit (110) according toany one of claims 9 to 14 for detecting faults in a driver-assis-tance system (120) of the vehicle (V).
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE1851450A SE1851450A1 (en) | 2018-11-23 | 2018-11-23 | Method, Computer Program, Control Unit for Detecting Faults in a Driver-Assistance System and Vehicle |
PCT/SE2019/051136 WO2020106201A1 (en) | 2018-11-23 | 2019-11-12 | Method, Computer Program, Control Unit for Detecting Faults in a Driver-Assistance System and Vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE1851450A SE1851450A1 (en) | 2018-11-23 | 2018-11-23 | Method, Computer Program, Control Unit for Detecting Faults in a Driver-Assistance System and Vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
SE1851450A1 true SE1851450A1 (en) | 2020-05-24 |
Family
ID=70774102
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
SE1851450A SE1851450A1 (en) | 2018-11-23 | 2018-11-23 | Method, Computer Program, Control Unit for Detecting Faults in a Driver-Assistance System and Vehicle |
Country Status (2)
Country | Link |
---|---|
SE (1) | SE1851450A1 (en) |
WO (1) | WO2020106201A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112348293A (en) * | 2021-01-07 | 2021-02-09 | 北京三快在线科技有限公司 | Method and device for predicting track of obstacle |
CN115230723A (en) * | 2022-03-07 | 2022-10-25 | 长城汽车股份有限公司 | Vehicle early warning method and device, electronic equipment and storage medium |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19926559A1 (en) * | 1999-06-11 | 2000-12-21 | Daimler Chrysler Ag | Method and device for detecting objects in the vicinity of a road vehicle up to a great distance |
US11482100B2 (en) * | 2015-03-28 | 2022-10-25 | Intel Corporation | Technologies for detection of anomalies in vehicle traffic patterns |
KR101786237B1 (en) * | 2015-12-09 | 2017-10-17 | 현대자동차주식회사 | Apparatus and method for processing failure detection and calibration of sensor in driver assist system |
SE542087C2 (en) * | 2016-03-15 | 2020-02-25 | Scania Cv Ab | Method and control unit for vehicle diagnosis |
DE102017205093A1 (en) * | 2017-03-27 | 2018-09-27 | Conti Temic Microelectronic Gmbh | Method and system for predicting sensor signals of a vehicle |
-
2018
- 2018-11-23 SE SE1851450A patent/SE1851450A1/en not_active Application Discontinuation
-
2019
- 2019-11-12 WO PCT/SE2019/051136 patent/WO2020106201A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2020106201A1 (en) | 2020-05-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Kim et al. | A capsule network for traffic speed prediction in complex road networks | |
DE112016006692T5 (en) | Method for predicting a movement of an object | |
WO2018053536A3 (en) | Time-series fault detection, fault classification, and transition analysis using a k-nearest-neighbor and logistic regression approach | |
CN103259962B (en) | A kind of target tracking method and relevant apparatus | |
DE102011117585B4 (en) | Systems and methods for tracking objects | |
SE1851450A1 (en) | Method, Computer Program, Control Unit for Detecting Faults in a Driver-Assistance System and Vehicle | |
JP2016075905A5 (en) | ||
DE102013014106A1 (en) | V2V communication based vehicle identification device and identification method for same | |
DE102015209857A1 (en) | Autonomous emergency braking system and method for detecting pedestrians in this | |
CN106199583A (en) | Multi-target Data coupling and the method and system followed the tracks of | |
JP2001084479A (en) | Method and device for forecasting traffic flow data | |
DE112015005364T5 (en) | VEHICLE VELOCITY CONTROL DEVICE, VEHICLE SPEED CONTROL METHOD AND VEHICLE SPEED CONTROL PROGRAM | |
CN104700657A (en) | Artificial neural network-based system for pre-judging behaviors of surrounding vehicles | |
CN107862863B (en) | Method and device for improving traffic data quality | |
DE102018121165A1 (en) | Method for estimating the surroundings of a vehicle | |
JP6920342B2 (en) | Devices and methods for determining object kinematics for movable objects | |
CN109992579B (en) | Data restoration method and system for multisource heterogeneous data of highway infrastructure | |
EP3637311A1 (en) | Device and method for determining the altitude information of an object in an environment of a vehicle | |
CN113705074A (en) | Chemical accident risk prediction method and device | |
DE102018106478A1 (en) | TARGET TRACKING USING REGIONAL COVENANT | |
CN112884801A (en) | High altitude parabolic detection method, device, equipment and storage medium | |
DE102016203472A1 (en) | Method and processing unit for detecting objects based on asynchronous sensor data | |
Lin et al. | An object reconstruction algorithm for moving vehicle detection based on three-frame differencing | |
DE102022000257A1 (en) | Method for detecting the surroundings of a vehicle | |
DE102017212953A1 (en) | Determination of odometric data of a rail vehicle with the aid of stationary sensors |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
NAV | Patent application has lapsed |