EP3855205A1 - Evaluation of the perception performance of a vehicle ADAS or ADS system - Google Patents
Evaluation of the perception performance of a vehicle ADAS or ADS system
- Publication number
- EP3855205A1 (application EP20153214.0A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- vehicle
- perception
- data
- perception data
- discrepancy
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/008—Registering or indicating the working of vehicles communicating information to a remotely located station
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/40—Means for monitoring or calibrating
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
- B60W60/0027—Planning or execution of driving tasks using trajectory prediction for other traffic participants
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/0098—Details of control systems ensuring comfort, safety or stability not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/02—Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
- B60W50/035—Bringing the control units into a predefined state, e.g. giving priority to particular actuators
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0055—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots with safety arrangements
- G05D1/0061—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots with safety arrangements for transition from automatic pilot to manual pilot and vice versa
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/08—Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
- G07C5/0808—Diagnosing performance data
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/08—Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
- G07C5/0841—Registering performance data
- G07C5/085—Registering performance data using electronic data carriers
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
- G08G1/163—Decentralised systems, e.g. inter-vehicle communication involving continuous checking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/029—Location-based management or tracking services
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/40—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/04—Monitoring the functioning of the control system
- B60W50/045—Monitoring control system parameters
- B60W2050/046—Monitoring control system parameters involving external transmission of data to or from the vehicle, e.g. via telemetry, satellite, Global Positioning System [GPS]
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
- B60W2554/404—Characteristics
- B60W2554/4041—Position
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/20—Data confidence level
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/45—External transmission of data to or from the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/45—External transmission of data to or from the vehicle
- B60W2556/65—Data transmitted between vehicles
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/38—Services specially adapted for particular environments, situations or purposes for collecting sensor information
Definitions
- the present disclosure relates to perception performance evaluation of an advanced driver-assistance system, ADAS, or automated driving system, ADS, of a vehicle.
- ADAS: advanced driver-assistance system
- ADS: automated driving system
- An ADAS may for instance be represented by adaptive cruise control, ACC, a collision avoidance system, forward collision warning, etc.
- ADAS are electronic systems that may aid a vehicle driver while driving.
- ADAS may rely on inputs from multiple data sources, such as e.g. automotive imaging, LIDAR, radar, image processing, computer vision, and/or in-car networking.
- An ADS is a complex combination of various components, and can be defined as a system in which perception, decision making, and operation of the vehicle are performed by electronics and machinery instead of a human driver; it represents the introduction of automation into road traffic. This includes handling of the vehicle and the destination, as well as awareness of the surroundings. While the automated system has control over the vehicle, the human operator may leave all responsibilities to the system.
- An ADS commonly combines a variety of sensors to perceive the vehicle's surroundings, such as e.g. radar, LIDAR, sonar, camera, navigation system e.g. GPS, odometer and/or inertial measurement units, upon which advanced control systems may interpret sensory information to identify appropriate navigation paths, as well as obstacles and/or relevant signage.
- An ADAS or ADS as described above must, however, function with high integrity to provide a sufficiently low risk for the vehicle user(s) as well as for other traffic participants. Ensuring that the risk is sufficiently low may require intractable amounts of data for statistical proof; according to one example, acquiring such data would take approximately a hundred vehicles driving continuously for five centuries. There are several approaches to minimizing the risk of the ADAS or ADS before it is launched onto public roads. On top of this, however, it is generally believed that the ADAS or ADS should be monitored once in the field, in order to ensure that it adheres to required safety levels.
- the disclosed subject-matter relates to a method performed by a perception comparing system of a vehicle for perception performance evaluation of an advanced driver-assistance system, ADAS, or automated driving system, ADS, of the vehicle.
- the perception comparing system establishes communication with a secondary vehicle determined and/or estimated to be positioned within a potential range of surrounding detecting sensors on-board the vehicle.
- the perception comparing system further derives perception data from a perception system of the ADAS or ADS adapted to estimate surroundings of the vehicle.
- the perception comparing system receives secondary perception data from a secondary perception system of a secondary ADAS or ADS of said secondary vehicle.
- the perception comparing system further determines, when the secondary vehicle is locatable in the perception data, a discrepancy output based on comparison of at least a portion of the perception data and at least a portion of the secondary perception data. Furthermore, the perception comparing system communicates acknowledgement data when at least a portion of the discrepancy output fulfils discrepancy exceedance criteria, and/or when the vehicle is perceivable in the secondary perception data but the secondary vehicle is not locatable in the perception data.
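The sequence of steps above - comparing the ego-vehicle world view with the received secondary world view, and communicating acknowledgement data on a discrepancy or when the secondary vehicle is not locatable - can be sketched as follows. This is a minimal, hypothetical illustration only: the data layout (dicts of (x, y) position estimates keyed by object identifier), the scalar threshold and all names are assumptions, not the patent's actual representation.

```python
from typing import Optional

def discrepancy(ego_estimate: tuple, secondary_estimate: tuple) -> float:
    """Euclidean distance between two (x, y) position estimates."""
    return ((ego_estimate[0] - secondary_estimate[0]) ** 2 +
            (ego_estimate[1] - secondary_estimate[1]) ** 2) ** 0.5

def evaluate_perception(ego_view: dict, secondary_view: dict,
                        threshold: float) -> Optional[str]:
    """Return an acknowledgement string when a discrepancy exceedance
    criterion is fulfilled, or when the ego-vehicle is perceivable by the
    secondary vehicle but not vice versa; otherwise None."""
    # Ego perceivable in secondary data, secondary not locatable by ego.
    if "secondary_vehicle" not in ego_view and "ego_vehicle" in secondary_view:
        return "ack: secondary not locatable in ego perception data"
    # Compare estimates for objects present in both world views.
    for obj_id in ego_view.keys() & secondary_view.keys():
        if discrepancy(ego_view[obj_id], secondary_view[obj_id]) > threshold:
            return f"ack: discrepancy exceedance for {obj_id}"
    return None
```

A usage example: if both vehicles track a shared object `obj_1` but place it 4 m apart, `evaluate_perception(ego, sec, threshold=1.0)` would yield an acknowledgement, while a generous `threshold=10.0` would not.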
- the disclosed subject-matter further relates to a perception comparing system of a vehicle for - and/or adapted for - perception performance evaluation of an ADAS or ADS of the vehicle.
- the perception comparing system comprises a communication establishing unit for - and/or adapted for - establishing communication with a secondary vehicle determined and/or estimated to be positioned within a potential range of surrounding detecting sensors on-board the vehicle.
- the perception comparing system further comprises a perception data deriving unit for - and/or adapted for - deriving perception data from a perception system of the ADAS or ADS adapted to estimate surroundings of the vehicle.
- the perception comparing system comprises a secondary perception data receiving unit for - and/or adapted for - receiving secondary perception data from a secondary perception system of a secondary ADAS or ADS of the secondary vehicle.
- the perception comparing system further comprises a discrepancy output determining unit for - and/or adapted for - determining, when the secondary vehicle is locatable in the perception data, a discrepancy output based on comparison of at least a portion of the perception data and at least a portion of the secondary perception data.
- the perception comparing system comprises an acknowledgement communicating unit for - and/or adapted for - communicating acknowledgement data when at least a portion of the discrepancy output fulfils discrepancy exceedance criteria, and/or when the vehicle is perceivable in the secondary perception data but the secondary vehicle is not locatable in the perception data.
- the disclosed subject-matter relates to a vehicle comprising an ADAS or ADS, and further comprising a perception comparing system as described herein.
- the disclosed subject-matter relates to a computer program product comprising a computer program containing computer program code means arranged to cause a computer or a processor to execute the steps of the perception comparing system described herein, stored on a computer-readable medium or a carrier wave.
- the disclosed subject-matter further relates to a non-volatile computer readable storage medium having stored thereon said computer program product.
- perception performance of a perception system of an ADAS or ADS may be assessed - such as in the field - to evaluate whether it is failing and/or at risk of failing, and action may be taken when that is the case. That is, since communication is established with a secondary vehicle determined and/or estimated to be positioned within a potential range of surrounding detecting sensors on-board the ego-vehicle, said ego-vehicle may communicate with another vehicle deemed to be within a sensor and/or perception range - or expected sensor and/or perception range - thereof.
- perception data is derived from a perception system of the ADAS or ADS, adapted to estimate surroundings of the vehicle
- a world view is provided for the ego-vehicle, presenting a perceived model of ego-vehicle surroundings.
- secondary perception data is received from a secondary perception system of a secondary ADAS or ADS of the secondary vehicle
- there is provided to the ego-vehicle perception data - such as a world view - produced by the secondary vehicle, comprising a perceived model of secondary vehicle surroundings.
- perception data from the secondary vehicle may potentially at least partly overlap, correspond to and/or cover a geographical region comprised in perception data from the ego-vehicle. Furthermore, since a discrepancy output is determined - when the secondary vehicle is locatable in the perception data - based on comparison of at least a portion of the perception data and at least a portion of the secondary perception data, data relating to overlapping and/or corresponding portions, geographical regions or areas, objects etc. may be compared.
- the discrepancy output may indicate a perception error, failure and/or a risk of failure for the ego-vehicle perception system, and/or potentially the perception system of the secondary vehicle.
- an acknowledgement indicative of discrepancy may be transmitted from the perception comparing system.
- the perception comparing system thus brings attention to, e.g., situations where ego-vehicle perception data deviates from the secondary vehicle perception data to an extent or in a manner fulfilling one or more conditions defined by the discrepancy exceedance criteria.
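One conceivable form of such discrepancy exceedance criteria - offered purely as an assumption for illustration, since the patent leaves the criteria open - is that a deviation must exceed a magnitude limit for a number of consecutive samples before an acknowledgement is warranted:

```python
def exceeds_criteria(deviations: list, limit: float,
                     min_consecutive: int) -> bool:
    """Illustrative exceedance check: True when the per-sample deviation
    stays above `limit` for at least `min_consecutive` samples in a row.
    Both parameters are assumed, predeterminable tuning values."""
    run = 0
    for d in deviations:
        run = run + 1 if d > limit else 0  # reset the run on any compliant sample
        if run >= min_consecutive:
            return True
    return False
```

Requiring persistence rather than a single outlier is a common way to avoid acknowledging transient sensor noise; other criteria (e.g. an accumulated-error budget) would fit the same interface.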
- failure and/or risk of failure of the perception system of the ego-vehicle - and/or potentially the perception system of the secondary vehicle - may be evaluated.
- an approach is thus provided for evaluating, in an improved and/or alternative manner, the perception performance of an ADAS or ADS of a vehicle, such as in the field.
- a schematic view of an exemplifying perception comparing system 1 of a vehicle 2 according to embodiments of the disclosure.
- the exemplifying vehicle 2 - which may be referred to as the ego-vehicle, host vehicle, primary vehicle and/or first vehicle - comprises, and/or is adapted to support, an ADAS and/or ADS 21, and here it further comprises the perception comparing system 1.
- the exemplifying vehicle 2 may refer to any arbitrary manned or unmanned vehicle, for instance an engine-propelled or electrically-powered vehicle such as a car, truck, lorry, van, bus, motorcycle and/or tractor.
- vehicle may refer to "launched vehicle", “road-traffic vehicle” and/or “production vehicle”, and further to "autonomous and/or at least partly autonomous vehicle", “driverless and/or at least partly driverless vehicle”, “automated and/or at least partly automated vehicle” and/or “self-driving and/or at least partly self-driving vehicle”.
- the exemplifying ADAS or ADS may refer to any arbitrary ADAS or ADS e.g.
- perception comparing system may refer to “production vehicle perception comparing system”, “perception failure evaluating system”, “perception performance evaluating system”, “perception performance validation system” and/or “perception assessment system”, whereas “of” a vehicle may refer to “comprised in” a vehicle and/or “on-board” a vehicle.
- “For” perception performance evaluation may refer to “adapted for” perception performance evaluation.
- the phrase “for perception performance evaluation” of an ADAS or ADS may refer to "for perception data performance evaluation” of an ADAS or ADS, “for perception performance monitoring and evaluation” of an ADAS or ADS, “for perception failure and/or failure-risk evaluation” of an ADAS or ADS, “for perception performance assessment” of an ADAS or ADS, “for perception performance validation” of an ADAS or ADS and/or “for perception evaluation” of an ADAS or ADS.
- the phrase “for perception performance evaluation” of an ADAS or ADS may refer to "for perception performance evaluation, and potential further evaluation and/or intervention," of an ADAS or ADS.
- the phrase "perception performance evaluation of an ADAS or ADS of said vehicle” may refer to "perception performance evaluation associated with an ADAS or ADS of said vehicle", and/or “perception performance evaluation of perception estimates on which an ADAS or ADS of said vehicle is pertinent", whereas “an” ADAS or ADS may further refer to "at least a first" ADAS or ADS.
- the phrase “perception performance evaluation” of an ADAS or ADS may refer to "perception performance evaluation of a perception system" of an ADAS or ADS.
- the vehicle 2 and/or the ADAS or ADS 21 may comprise, be provided with and/or have on-board a perception system 22 adapted to estimate surroundings of the vehicle 2, and subsequently adapted to estimate world views of the surroundings e.g. with support from a commonly known digital map such as a high definition, HD, map.
- the perception system 22 may refer to any commonly known system and/or functionality, e.g. comprised in one or more electronic control modules and/or nodes of the vehicle 2 and/or the ADAS or ADS 21, adapted and/or configured to interpret sensory information - relevant for driving of the vehicle 2 - to identify e.g. obstacles, vehicle lanes, relevant signage, appropriate navigation paths etc.
- the exemplifying perception system 22 - which may be adapted to support e.g. sensor fusion, tracking, localization etc. - may thus be adapted to rely on and obtain inputs from multiple data sources, such as automotive imaging, image processing, computer vision, and/or in-car networking, etc., in combination with sensory information.
- Such exemplifying sensory information may for instance be derived from one or more exemplifying surrounding detecting sensors 23 comprised in and/or provided on-board the vehicle 2.
- the surrounding detecting sensors 23 may be represented by any arbitrary sensors adapted to sense and/or perceive the vehicle's 2 surroundings and/or whereabouts, and may e.g. refer to one or a combination of one or more of radar, LIDAR, sonar, camera, navigation or positioning system e.g. GPS, odometer and/or inertial measurement units.
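As a hedged illustration of how inputs from several such surrounding detecting sensors might be combined into a single perceived world view, the sketch below uses a plain confidence-weighted average; the detection tuple format and the weighting scheme are assumptions standing in for the patent's unspecified sensor-fusion functionality:

```python
from collections import defaultdict

def fuse_detections(detections):
    """Fuse detections from multiple sensors into one estimate per object.

    detections: iterable of (object_id, x, y, confidence) tuples, where
    confidence acts as the averaging weight (an illustrative assumption).
    Returns {object_id: (x, y)} with confidence-weighted mean positions.
    """
    sums = defaultdict(lambda: [0.0, 0.0, 0.0])  # weighted x, weighted y, total weight
    for obj_id, x, y, w in detections:
        sums[obj_id][0] += w * x
        sums[obj_id][1] += w * y
        sums[obj_id][2] += w
    return {obj_id: (sx / sw, sy / sw)
            for obj_id, (sx, sy, sw) in sums.items()}
```

For example, a radar and a camera detection of the same object at 10 m and 12 m with equal confidence would fuse to a single estimate at 11 m; a production perception system would of course use far richer state (velocity, extent, class) and tracking over time.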
- the perception comparing system 1 is - e.g. by means of a communication establishing unit 101 (shown in Fig. 4 ) - adapted and/or configured for establishing communication with a secondary vehicle 3 determined and/or estimated to be positioned within a potential range of surrounding detecting sensors 23 on-board the vehicle 2.
- the ego-vehicle 2 may communicate with another vehicle 3 deemed to be within sensor and/or perception range - or expected sensor and/or perception range - of the ego-vehicle 2.
- the communication between the ego-vehicle 2 and the secondary vehicle 3 - and establishment thereof - may be accomplished in any arbitrary manner - e.g. known - with support from e.g. communication systems (not shown) on-board respective vehicle 2, 3.
- the established communication may accordingly for instance involve exchanging security certificates between the ego-vehicle 2 and the secondary vehicle 3.
- Whether to establish communication with the secondary vehicle 3 may be determined in an arbitrary manner deemed suitable and/or feasible, e.g. determined based on at least a first predeterminable trigger.
- Said optional trigger may for instance relate to determination and/or estimation of the secondary vehicle 3 being within a predeterminable distance and/or at a predeterminable angle from the ego-vehicle 2, such as e.g. estimated to be driving in vicinity of the ego-vehicle 2 along the same road.
- said trigger may for instance relate to the secondary vehicle 3 being determined to be of the same make as the ego-vehicle 2 and/or being equipped with a perception system, ADAS and/or ADS of the same brand, similar or identical to the optional perception system 22 and/or ADAS or ADS 21 of the ego-vehicle 2.
- said trigger may for instance relate to fulfillment of a timing criterion, such as e.g. time of day, day of week, time since previous trigger, new driving cycle etc.
- said trigger may for instance relate to a request from the perception comparing system - and/or e.g. a remote centralized system (such as the remote entity 9 shown in Fig. 4 and described in greater detail in conjunction therewith) - to establish communication with the secondary vehicle 3, for instance due to being deemed suitable and/or desirable.
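The optional triggers listed above - proximity, matching make and/or equipment, and a timing criterion - could, for illustration, be combined as sketched below; the distance limit, minimum interval and make comparison are all assumed, predeterminable parameters rather than values taken from the patent:

```python
import math

def should_establish_communication(ego_pos, sec_pos, ego_make, sec_make,
                                   seconds_since_last,
                                   max_distance=500.0, min_interval=60.0):
    """Combine illustrative triggers for establishing V2V communication."""
    distance = math.hypot(sec_pos[0] - ego_pos[0], sec_pos[1] - ego_pos[1])
    within_distance = distance <= max_distance      # proximity trigger
    same_make = ego_make == sec_make                # equipment/make trigger
    timing_ok = seconds_since_last >= min_interval  # timing criterion
    return within_distance and same_make and timing_ok
```

Whether the triggers are combined conjunctively, as here, or so that any single trigger suffices is itself a design choice the description leaves open.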
- Determining and/or estimating that the secondary vehicle 3 is positioned within a potential range of surrounding detecting sensors 23 of the ego-vehicle 2 may be accomplished in any arbitrary - e.g. known - manner. For instance, this may be accomplished by the ego-vehicle 2 itself, with support from a digital map such as a HD map and/or from said surrounding detecting sensors 23.
- this may be accomplished with support from vehicle-to-vehicle communication, and/or with support from the exemplifying remote centralized system discussed above which may be adapted for keeping track of real-time or essentially real-time positions of vehicles such as the ego-vehicle 2 and the secondary vehicle 3.
- the potential range of surrounding detecting sensors 23 may refer to any feasible range - e.g. pertinent to characteristics and/or capacity of said sensors 23, or of surrounding detecting sensors in general - and may for instance refer to a distance such as an exemplifying 10, 100 or 1000 meters from the ego-vehicle 2.
- the secondary vehicle 3 is in an exemplifying manner driving slightly ahead of the ego-vehicle 2, along the same road, in the same direction as said ego-vehicle 2. It may be noted, however, that the secondary vehicle 3 may be positioned arbitrarily in relation to the ego-vehicle 2, such as positioned behind, at another angle, driving in another e.g. opposite direction, etc.
- the exemplifying secondary vehicle 3 may similar to the ego-vehicle 2 refer to any arbitrary manned or unmanned vehicle, for instance an engine-propelled or electrically-powered vehicle such as a car, truck, lorry, van, bus, motorcycle and/or tractor.
- secondary vehicle may similarly refer to "secondary launched vehicle", “secondary road-traffic vehicle” and/or “secondary production vehicle”, and further to “secondary autonomous and/or at least partly autonomous vehicle”, “secondary driverless and/or at least partly driverless vehicle”, “secondary automated and/or at least partly automated vehicle” and/or “secondary self-driving and/or at least partly self-driving vehicle”.
- the secondary vehicle 3 may furthermore have characteristics similar to those of the ego-vehicle 2 in terms of e.g. functionality, features, on-board systems and/or equipment; detailed descriptions of such similar characteristics for the secondary vehicle 3 have therefore been omitted.
- the secondary vehicle 3 may comprise, with functionalities similar to those of ego-vehicle 2, a corresponding secondary ADAS or ADS 31, a corresponding secondary perception system 32, corresponding secondary surrounding detecting sensors 33, and potentially furthermore a corresponding perception comparing system described herein.
- the phrase "establishing communication with a secondary vehicle” may refer to "establishing communication with a nearby secondary vehicle", “establishing communication with a secondary vehicle involving exchanging security certificates and/or handshaking” and/or “establishing encrypted communication with a secondary vehicle”, whereas the term “secondary” throughout may refer to “second", “reference” and/or “other”. "A” secondary vehicle, on the other hand, may refer to "at least one” secondary vehicle.
- the phrase “determined and/or estimated to be positioned within a potential range of surrounding detecting sensors” may refer to "deemed and/or expected to be positioned within a potential range of surrounding detecting sensors", whereas "positioned” may refer to "located”.
- the phrase "determined and/or estimated to be positioned within a potential range” may refer to “determined and/or estimated - with support from a high definition, HD, map - to be positioned within a potential range” and/or “determined and/or estimated - with support from vehicle-to-vehicle communication and/or a remote centralized system knowledgeable of essentially real-time vehicle positions of said vehicle and said secondary vehicle - to be positioned within a potential range”.
- “Potential range” may refer to "expected range” and/or “estimated range”.
- range may refer to "sensor and/or perception range”.
- the phrase "within a potential range of surrounding detecting sensors on-board said vehicle” may refer to "in vicinity of said vehicle", “within a predeterminable distance and/or at a predeterminable angle from said vehicle”, “within a predeterminable distance from said vehicle shorter than a potential range of surrounding detecting sensors on-board said vehicle”.
- Such an optional predeterminable distance may be of any arbitrary feasible extension deemed relevant for the situation at hand, and for instance refer to exemplifying 5, 50 or 500 meters from said vehicle 2.
- surrounding detecting sensors may refer to “one or more surrounding detecting sensors” and/or “surrounding detecting sensors adapted to capture surroundings of said vehicle”, whereas “on-board said vehicle” may refer to “on-board the ego-vehicle”.
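The range criterion above - whether the secondary vehicle is determined and/or estimated to be positioned within a potential range of the surrounding detecting sensors, or within a predeterminable distance such as 5, 50 or 500 meters - can be sketched as follows. This is a minimal illustration under assumed planar map coordinates; the function name and signature are hypothetical, not taken from the patent.

```python
import math

def within_potential_range(ego_pos, secondary_pos, potential_range_m=500.0):
    """Return True if the secondary vehicle is deemed positioned within the
    potential range of the ego-vehicle's surrounding detecting sensors.

    Positions are (x, y) map coordinates in metres; potential_range_m is a
    predeterminable distance (e.g. 5, 50 or 500 m) - an assumed placeholder.
    """
    dx = secondary_pos[0] - ego_pos[0]
    dy = secondary_pos[1] - ego_pos[1]
    return math.hypot(dx, dy) <= potential_range_m

# Example: a secondary vehicle 40 m ahead is within a 50 m potential range.
print(within_potential_range((0.0, 0.0), (0.0, 40.0), potential_range_m=50.0))  # True
```

In practice the check could equally be supported by an HD map, vehicle-to-vehicle communication, or a remote centralized system knowledgeable of real-time vehicle positions, as the text notes.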
- the perception comparing system 1 is - e.g. by means of a perception data deriving unit 102 (shown in Fig. 4 ) - adapted and/or configured for deriving perception data 4 (shown in Figs. 2-3 ) from a perception system 22 of the ADAS or ADS 21, adapted to estimate surroundings of the vehicle 2.
- a world view 4 is provided for the ego-vehicle 2, presenting a perceived model of ego-vehicle surroundings.
- Deriving perception data 4 may be accomplished in any arbitrary - e.g. known - manner, such as electronically and/or digitally.
- the perception data 4 may refer to any arbitrary data of relevance for the ADAS or ADS 21, which data further may be indicative of perceptions of surroundings of the vehicle 2, e.g. comprising - and/or being derived from - sensory data from the one or more surrounding detecting sensors 23, internal ADAS or ADS 21 processing to achieve sensor fusion, tracking etc. and/or a digital map such as a HD map. It may be noted that according to an example, the perception data 4 may be derived from the perception system 22 even should the ego-vehicle 2 during said deriving and/or during production of the perception data 4 have been at least partly controlled by a vehicle driver.
- the phrase “deriving perception data from a perception system” may refer to “retrieving and/or obtaining perception data from a perception system", and according to an example further to “determining perception data of a perception system”.
- “Deriving perception data”, on the other hand may refer to "deriving primary and/or ego-vehicle perception data", “deriving world view data”, “deriving a world view”, “deriving a world view with support from internal processing of said ADAS or ADS and/or a digital map such as HD map” and/or “deriving a world view of global and/or HD map coordinates with support from a HD map”.
- the phrase "from a perception system” may refer to "from a primary and/or ego-vehicle perception system”.
- “perception system adapted to estimate surroundings of said vehicle” may refer to “perception system adapted to estimate a world view for said vehicle” and/or “perception system adapted to estimate a world view of surroundings of said vehicle”, and furthermore to “perception system adapted and/or configured to estimate at least a portion of surroundings of said vehicle", “perception system adapted and/or configured to interpret sensory information relevant for driving of said vehicle”.
- “Surrounding of said vehicle” may refer to “surrounding of the ego, primary and/or host vehicle”.
- the phrase “perception system of said vehicle, adapted to estimate surroundings of said vehicle” may refer to “perception system of said vehicle, adapted to estimate surroundings of said vehicle with input from said surrounding detecting sensors and/or with support from a digital map such as a HD map”.
- the term “perception” data may refer to “surroundings assessment” data and/or “sensory” data, whereas perception “data” may refer to perception "predictions", "information” and/or “estimates”.
- “Perception” system may refer to “vehicle surroundings assessment and/or evaluation” system, whereas perception “system” may refer to perception "function and/or functionality”.
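To make the notion of perception data - a world view comprising object state estimates and surroundings - concrete, the following is a minimal data-structure sketch. The class and field names are illustrative assumptions only; the patent does not prescribe any format.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class ObjectStateEstimate:
    # Minimal object state, cf. the velocity, heading, acceleration and
    # localization mentioned for the object state estimates 41, 42, 51, 52.
    position: Tuple[float, float]  # map coordinates, metres
    velocity: float                # m/s
    heading: float                 # radians

@dataclass
class WorldView:
    # Perceived model of vehicle surroundings (perception data 4 or 5).
    ego_state: ObjectStateEstimate
    objects: Dict[str, ObjectStateEstimate] = field(default_factory=dict)
    free_space: List[Tuple[float, float]] = field(default_factory=list)  # polygon vertices

# A world view with only an ego-state and no surrounding objects yet.
wv = WorldView(ego_state=ObjectStateEstimate(position=(0.0, 0.0), velocity=12.0, heading=0.0))
```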
- the perception comparing system 1 is - e.g. by means of a secondary perception data receiving unit 103 (shown in Fig. 4 ) - adapted and/or configured for receiving secondary perception data 5 (shown in Figs. 2-3 ) from a secondary perception system 32 of a secondary ADAS or ADS 31 of the secondary vehicle 3.
- the ego-vehicle 2 receives perception data 5 - such as a world view - produced by the secondary vehicle 3, comprising a perceived model of secondary vehicle surroundings.
- perception data 5 from the secondary vehicle 3 may potentially at least partly overlap, correspond to and/or cover a geographical region comprised in perception data 4 from the ego-vehicle 2.
- the secondary perception data 5 - which may be indicative of perceptions of surroundings of the secondary vehicle 3 e.g. comprising and/or being derived from sensory data from the one or more secondary surrounding detecting sensors 33 and/or a secondary digital map such as a HD map - may be received from the secondary vehicle 3 in any arbitrary - e.g. known - manner, such as wirelessly, electronically and/or digitally.
- Said secondary perception data 5 may comprise data relevant for a geographical area at least partly overlapping a geographical area covered by the ego-vehicle perception data 4 i.e. the perception data 4 derived from the ego-vehicle 2.
- the secondary perception system 32 may optionally be similar or identical to the perception system 22 of the ego-vehicle 2, such as having similar, corresponding and/or identical configurations, settings, output formats etc.
- the secondary ADAS or ADS 31 may optionally be similar or identical to the ADAS or ADS 21 of the ego-vehicle 2, such as referring to a similar, corresponding and/or identical ADAS or ADS, and/or having similar, corresponding and/or identical configurations, settings, etc. It may be noted that according to an example, the secondary perception data 5 may be received from the secondary vehicle 3 even should the secondary vehicle 3 during said receiving and/or during production of the secondary perception data 5 have been at least partly controlled by a vehicle driver.
- receiving secondary perception data may refer to “obtaining and/or retrieving secondary perception data”
- secondary perception data may refer to merely “perception data”.
- secondary ADAS or ADS may refer to merely “ADAS or ADS”.
- the phrase “receiving secondary perception data from a secondary perception system of a secondary ADAS or ADS of said secondary vehicle” may refer to “receiving secondary perception data from a secondary perception system of a secondary ADAS or ADS of said secondary vehicle, said secondary perception data comprising data relevant for a geographical area at least partly overlapping a geographical area covered by the perception data of said vehicle.”
- the perception comparing system 1 is - e.g. by means of a discrepancy output determining unit 104 (shown in Fig. 4 ) - adapted and/or configured for determining, when the secondary vehicle 3 is locatable in the perception data 4, a discrepancy output based on comparison of at least a portion of the perception data 4 and at least a portion of the secondary perception data 5.
- provided that the secondary vehicle 3 may be found in the perception data 4 of the ego-vehicle 2, data relating to overlapping and/or corresponding portions of the perception data 4 and the secondary perception data 5 may be compared.
- the discrepancy output may indicate a perception error, failure and/or a risk of failure for the ego-vehicle perception system 22, and/or potentially the perception system 32 of the secondary vehicle 3. Determining the discrepancy output may be accomplished by comparing data related to one or more selected e.g. overlapping portions and/or geographical areas of respective ego-vehicle perception data 4 and secondary perception data 5. Moreover, the at least a portion of the perception data 4 may refer to any arbitrary set of data of the ego-vehicle perception data 4 of any arbitrary size, e.g. a predeterminable set of data selected and/or deemed relevant for comparison.
- the at least a portion of the secondary perception data 5 may refer to any arbitrary set of data of the secondary vehicle perception data 5 of any arbitrary size, e.g. a predeterminable set of data such as essentially corresponding to and/or overlapping the ego-vehicle perception data portion.
- the discrepancy output may refer to any arbitrary difference between the ego-vehicle perception data 4 and the perception data 5 from the secondary vehicle 3 which may be deemed relevant to evaluate, e.g. in view of quality aspects and/or safety critical aspects.
- determining a discrepancy output based on comparison of may refer to “calculating a discrepancy output by comparing” and/or “determining - and subsequently storing - a discrepancy output based on comparison of”, whereas “a discrepancy output” may refer to “a discrepancy output measure", “discrepancy data”, “a perception discrepancy” and/or "a perception difference”.
- “Comparison of at least a portion of said perception data and at least a portion of said secondary perception data” may refer to "by analyzing at least a portion of said perception data in view of at least a portion of said secondary perception data", “comparison of at least a portion of said perception data and at least a corresponding, relevant and/or essentially geographically overlapping portion of said secondary perception data” and/or “comparison of at least a portion of the ego-vehicle perception data and at least a portion of said secondary perception data”.
- the phrase "determining a discrepancy output based on comparison of at least a portion of said perception data and at least a portion of said secondary perception data” may refer to “determining a discrepancy output based on comparison of at least a portion of said perception data and at least a portion of said secondary perception data, said discrepancy output indicating a discrepancy between the perception data and the secondary perception data".
- the phrase "when said secondary vehicle is locatable in said perception data” may refer to "provided that said secondary vehicle is locatable in said perception data" and/or "when said secondary vehicle is found, comprised in and/or perceivable in said perception data", and according to an example further to "when said secondary vehicle is identifiable in said perception data”.
- the perception comparing system 1 is - e.g. by means of an acknowledgement communicating unit 105 (shown in Fig. 4 ) - adapted and/or configured for communicating acknowledgement data 6 when at least a portion of the discrepancy output fulfils discrepancy exceedance criteria and/or when the vehicle 2 is perceivable in the secondary perception data 5 but the secondary vehicle 3 is not locatable in the perception data 4.
- an acknowledgement 6 indicative of discrepancy is transmitted from the perception comparing system 1.
- should the outcome of the comparison - i.e. the discrepancy output - fulfil the discrepancy exceedance criteria, an acknowledgment 6 indicative thereof may be transmitted from the perception comparing system 1.
- the perception comparing system 1 brings attention to when ego-vehicle perception data 4 deviates from the secondary vehicle perception data 5 to an extent and/or in a manner fulfilling one or more conditions defined by the discrepancy exceedance criteria and/or when the ego-vehicle 2 is perceivable in the secondary perception data 5 but the secondary vehicle 3 unexpectedly is not locatable in the ego-vehicle perception data 4.
- failure and/or failure risk of the perception system 22 of the ego-vehicle 2, and/or potentially the perception system 32 of the secondary vehicle 3 may be evaluated.
- the discrepancy exceedance criteria may be defined by any one or more arbitrary criterion deemed relevant for the implementation at hand - e.g. in view of characteristics of and/or requirements related to the perception system 22 and/or the ADAS or ADS 21 - such as e.g. a maximum discrepancy level, a maximum number of exceedances of a discrepancy level, and/or a maximum number of exceedances of a discrepancy level within a predeterminable time frame.
- the discrepancy exceedance criteria may be variable, and furthermore predeterminable.
- “Communicating” acknowledgement data may refer to “communicating wirelessly and/or by wire” acknowledgement data and/or “communicating in due time and/or when deemed feasible and/or safe” acknowledgement data.
- “acknowledgement data” may refer to “one or more acknowledgement signals” and/or “an acknowledgement message”, and further to “acknowledgement data indicative of, reflecting and/or comprising at least said discrepancy output and/or an indication of that the secondary vehicle not is locatable in said perception data”.
- the term "when”, on the other hand, may refer to "should" and/or "if".
- the phrase “at least a portion of said discrepancy output” may refer to “at least a predeterminable and/or selected portion of said discrepancy output”
- the phrase “fulfils discrepancy exceedance criteria” may refer to “fulfils predeterminable discrepancy exceedance criteria", “fulfils at least a first exceedance criterion” and/or “exceeds one or more discrepancy exceedance thresholds”.
- but said secondary vehicle not is locatable in said perception data may refer to “but said secondary vehicle not is perceivable and/or comprised in said perception data", and according to an example, further to “but said secondary vehicle not is identifiable in said perception data” and/or “but said secondary vehicle not is locatable and perceivable in said perception data”. Furthermore, the phrase “but said secondary vehicle not is locatable in said perception data” may further refer to “but said secondary vehicle unexpectedly not is locatable in said perception data”. "Unexpectedly" not locatable may here indicate that the secondary vehicle 3 - although expected and/or deemed to be so - is determined not to be locatable in the perception data 4.
- the secondary vehicle 3 may for instance be expected to be locatable if deemed possible to be located, such as e.g. based on deemed to be within sensor range and/or deemed not occluded or obstructed from view of one or more of the surrounding detecting sensors 23.
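The acknowledgement logic described above - communicate when the discrepancy exceedance criteria are fulfilled, or when the ego-vehicle is perceivable by the secondary vehicle while the secondary vehicle is unexpectedly not locatable by the ego-vehicle - can be sketched as the following decision function. Function name, a single-threshold criterion, and the boolean flags are assumptions for illustration; real criteria may be considerably richer.

```python
def should_communicate_acknowledgement(discrepancy_output,
                                       discrepancy_threshold,
                                       ego_perceivable_by_secondary,
                                       secondary_locatable_by_ego,
                                       secondary_expected_locatable):
    """Decide whether acknowledgement data 6 should be communicated.

    Branch 1: the discrepancy output fulfils the exceedance criteria
              (here reduced to a single assumed threshold).
    Branch 2: the ego-vehicle is perceivable in the secondary perception
              data, but the secondary vehicle is unexpectedly not
              locatable in the ego perception data (e.g. it was within
              sensor range and not occluded).
    """
    if discrepancy_output is not None and discrepancy_output > discrepancy_threshold:
        return True
    if (ego_perceivable_by_secondary
            and not secondary_locatable_by_ego
            and secondary_expected_locatable):
        return True
    return False
```

Note that the second branch returns False when the secondary vehicle was deemed occluded or out of range, matching the "unexpectedly" qualifier in the text.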
- the perception comparing system 1 may - e.g. by means of said discrepancy output determining unit 104 - optionally be adapted and/or configured for determining at least a first portion of said discrepancy output based on comparison of perception data 4 comprising object state estimates 41 of the vehicle 2 and secondary perception data 5 comprising secondary object state estimates 51 of the vehicle 2. Additionally or alternatively, the perception comparing system 1 may - e.g. by means of said discrepancy output determining unit 104 - be adapted and/or configured for determining at least a first portion of the discrepancy output based on comparison of perception data 4 comprising object state estimates 42 of the secondary vehicle 3 and secondary perception data 5 comprising secondary object state estimates 52 of the secondary vehicle 3.
- the discrepancy output may at least in part be based on the outcome of comparing object state estimates 41 such as localization of the ego-vehicle 2 from the ego-vehicle perception data 4 with object state estimates 51 such as localization of the ego-vehicle 2 comprised in the perception data 5 produced by - and subsequently received from - the secondary vehicle 3.
- the discrepancy output may at least in part be based on the outcome of comparing object state estimates 42 such as localization of the secondary vehicle 3 comprised in the ego-vehicle perception data 4 with object state estimates 52 such as localization of the secondary vehicle 3 from the perception data 5 produced by - and subsequently received from - the secondary vehicle 3.
- a misalignment i.e. discrepancy between object state estimates 41 of the ego-vehicle 2 in the ego-vehicle perception data 4 as compared to object state estimates 51 of the ego-vehicle 2 in the secondary vehicle perception data 5.
- a misalignment i.e. discrepancy between object state estimates 42 of the secondary vehicle 3 in the ego-vehicle perception data 4 as compared to object state estimates 52 of the secondary vehicle 3 in the secondary vehicle perception data 5.
- acknowledgement data 6 may be communicated by the perception comparing system 1.
- the object state estimates 41, 42, 51, 52 may e.g. comprise the ego- and/or the other vehicle's 2, 3 states, such as the ego- and/or the other vehicle's 2, 3 velocity, heading, acceleration, localization etc.
- the phrase "determining at least a first portion of said discrepancy output based on” may refer to “determining said discrepancy output at least partly based on”.
- the phrase "comparison of perception data comprising object state estimates of said vehicle and secondary perception data comprising secondary object state estimates of said vehicle” may refer to "applying a comparison function on a vector of said perception data comprising object state estimates of said vehicle and a secondary vector of said secondary perception data comprising secondary object states estimates of said vehicle”.
- the phrase "comparison of perception data comprising object state estimates of said secondary vehicle and secondary perception data comprising secondary object state estimates of said secondary vehicle” may refer to "applying a comparison function on a vector of said perception data comprising object state estimates of said secondary vehicle and a secondary vector of said secondary perception data comprising secondary object state estimates of said secondary vehicle”.
- one or more or at least a portion of the object state estimates 41, 42, 51, 52 may be expressed as state vectors, including localization, in an exemplifying manner denoted as x .
- performance checks - i.e. determining one or more potential discrepancy outputs - may be formulated as follows:
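The formulation itself does not survive in this extraction. One consistent reconstruction - under the assumed notation that \(\hat{x}_{i|j}\) denotes the state vector estimate (including localization) of vehicle \(i\) in the perception data of vehicle \(j\), with the ego-vehicle indexed 2 and the secondary vehicle indexed 3 - is:

```latex
% Ego-vehicle check (object state estimates 41 vs 51):
\delta_{2} \;=\; \bigl\lVert \hat{x}_{2|2} - \hat{x}_{2|3} \bigr\rVert
% Secondary-vehicle check (object state estimates 42 vs 52):
\delta_{3} \;=\; \bigl\lVert \hat{x}_{3|2} - \hat{x}_{3|3} \bigr\rVert
% Discrepancy exceedance criteria fulfilled, e.g., if
% \delta_{2} > \epsilon \;\text{and/or}\; \delta_{3} > \epsilon
% for a predeterminable threshold \epsilon.
```

The norm and threshold are placeholders; any comparison function over the two state vectors, as described in the text, fits the same pattern.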
- the perception comparing system 1 may - e.g. by means of said discrepancy output determining unit 104 - when localization estimates 41, 42 of the vehicle 2 and the secondary vehicle 3 in the perception data 4 respectively align with localization estimates 51, 52 of the vehicle 2 and said secondary vehicle 3 respectively in the secondary perception data 5, optionally be adapted and/or configured for determining additionally at least a second portion of the discrepancy output.
- Said at least second portion of the discrepancy output may, as shown in exemplifying Fig. 3a , optionally be based on comparison of perception data 4 comprising object state estimates 43 of at least a first surrounding object and secondary perception data 5 comprising secondary object state estimates 53 of said at least first surrounding object.
- the discrepancy output may at least in part be based on the outcome of comparing object state estimates 43 such as localization of at least a first surrounding object from the ego-vehicle perception data 4 with object state estimates 53 such as localization of said at least first surrounding object comprised in the perception data 5 produced by - and subsequently received from - the secondary vehicle 3.
- a misalignment i.e. discrepancy between object state estimates 43 of the at least first surrounding object in the ego-vehicle perception data 4 as compared to object state estimates 53 of said at least first surrounding object in the secondary vehicle perception data 5.
- acknowledgement data 6 may be communicated by the perception comparing system 1.
- the at least first surrounding object may refer to any arbitrary moving or static object - e.g. a vehicle - deemed feasible and/or suitable for object state estimations 43, 53, and be of any arbitrary dimensions.
- Said at least first object may furthermore be arbitrarily positioned in view of the ego-vehicle 2 and the secondary vehicle 3, respectively.
- said at least second portion of the discrepancy output may, as shown in exemplifying Fig. 3b , optionally be based on comparison of perception data 4 comprising free space estimates 44 of at least a region 45 around the vehicle 2 and secondary perception data 5 comprising secondary free space estimates 54 of at least a secondary region 55 around the secondary vehicle 3, which region 45 at least partly overlaps the secondary region 55.
- the discrepancy output may at least in part be based on the outcome of comparing free space estimates 44 comprised in the ego-vehicle perception data 4 and free space estimates 54 comprised in the perception data 5 produced by - and subsequently received from - the secondary vehicle 3, of an overlapping region 4555.
- a misalignment i.e. discrepancy 7 between free space estimates 44 in the ego-vehicle perception data 4 as compared to free space estimates 54 in the secondary vehicle perception data 5.
- acknowledgement data 6 may be communicated by the perception comparing system 1.
- the free space region 45 may refer to any arbitrarily dimensioned region around the ego-vehicle 2, in any arbitrary direction of said ego-vehicle 2.
- the secondary free space region 55 may refer to any arbitrarily dimensioned region around the secondary vehicle 3, in any arbitrary direction of said secondary vehicle 3, provided that the secondary region 55 overlaps 4555 the ego-vehicle region 45 to an arbitrarily sized and/or dimensioned extent.
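A free-space comparison over the overlapping region 4555 can be sketched as follows, here with the two free-space estimates reduced to axis-aligned rectangles and the discrepancy 7 measured as one minus their intersection-over-union. Both the rectangular simplification and the IoU metric are assumptions for illustration; the patent leaves the region shapes and the comparison arbitrary.

```python
def rect_intersection_area(a, b):
    # a, b: (xmin, ymin, xmax, ymax) axis-aligned free-space rectangles.
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(0.0, w) * max(0.0, h)

def free_space_discrepancy(ego_free, secondary_free):
    """Discrepancy over the overlapping region, as 1 - IoU of the two
    free-space estimates; 0.0 means perfect agreement."""
    inter = rect_intersection_area(ego_free, secondary_free)
    area_a = (ego_free[2] - ego_free[0]) * (ego_free[3] - ego_free[1])
    area_b = (secondary_free[2] - secondary_free[0]) * (secondary_free[3] - secondary_free[1])
    union = area_a + area_b - inter
    return 1.0 - inter / union if union > 0 else 0.0

# Identical free-space estimates yield no discrepancy.
print(free_space_discrepancy((0, 0, 10, 10), (0, 0, 10, 10)))  # 0.0
```

The same pattern applies directly to the drivable-area comparison of Fig. 3c, with the regions oriented in an essentially forward direction.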
- said at least second portion of the discrepancy output may, as shown in exemplifying Fig. 3c , optionally be based on comparison of perception data 4 comprising drivable area estimates 46 in at least a region 47 around the vehicle 2 and secondary perception data 5 comprising secondary drivable area estimates 56 in at least a secondary region 57 around the secondary vehicle 3, which region 47 at least partly overlaps the secondary region 57.
- the discrepancy output may at least in part be based on the outcome of comparing drivable area estimates 46 comprised in the ego-vehicle perception data 4 and drivable area estimates 56 comprised in the perception data 5 produced by - and subsequently received from - the secondary vehicle 3, of an overlapping region 4757.
- a misalignment i.e. discrepancy 8 between drivable area estimates 46 in the ego-vehicle perception data 4 as compared to drivable area estimates 56 in the secondary vehicle perception data 5.
- acknowledgement data 6 may be communicated by the perception comparing system 1.
- the drivable area region 47 may refer to any arbitrarily dimensioned region around the ego-vehicle 2, in an essentially forward direction of said ego-vehicle 2.
- the secondary drivable area region 57 may refer to any arbitrarily dimensioned region around the secondary vehicle 3, in an essentially forward direction of said secondary vehicle 3, provided that the secondary region 57 overlaps 4757 the ego-vehicle region 47 to an arbitrarily sized and/or dimensioned extent.
- the phrase "when localization estimates of said vehicle and said secondary vehicle in said perception data respectively aligns with localization estimates of said vehicle and said secondary vehicle respectively in said secondary perception data” may refer to "when localization estimates of said vehicle and said secondary vehicle in said perception data vector respectively aligns with localization estimates of said vehicle and said secondary vehicle respectively in said secondary perception data vector”.
- the phrase “comparison of perception data” may in this context refer to “applying a comparison function on a vector of said perception data”
- “secondary perception data” in this context may refer to "a secondary vector of said secondary perception data”.
- the perception comparing system 1 may - e.g. by means of the acknowledgement communicating unit 105 (shown in Fig. 4 ) - optionally be adapted and/or configured for transmitting the acknowledgement data 6 wirelessly to a remote entity 9.
- Said acknowledgement data 6 may for instance comprise the discrepancy output. Thereby, data related to e.g. the discrepancy output may be transmitted to the entity 9 for offline processing and/or analysis.
- the acknowledgement data 6 may comprise an indication of that the ego-vehicle 2 is perceivable in the secondary perception data 5 but the secondary vehicle 3 is not locatable in the ego-vehicle perception data 4.
- the timing of transmittal may be arbitrarily selected as deemed feasible, such as e.g. promptly or at a point in time deemed suitable.
- the latter case may e.g. refer to awaiting a high bandwidth connection to the remote entity 9, such as e.g. a Wifi connection.
- the entity 9 may refer to any off-board data storage entity - e.g. known - adapted for and/or configured for off-board and/or offline processing and/or analysis, such as e.g. a cloud and/or automotive cloud, cloud network adapted for cloud-based storage, back-end system, and/or one or more servers.
- "Remote” entity may refer to "off-board” entity and/or “offline” entity, whereas "comprising said discrepancy output” may refer to "comprising essentially said discrepancy output”.
- the perception comparing system 1 may - e.g. by means of the acknowledgement communicating unit 105 (shown in Fig. 4 ) - be adapted and/or configured for transmitting the acknowledgement data 6 to an on-board ADAS/ADS control system 24 adapted to at least in part control the ADAS or ADS 21, which acknowledgement data 6 comprises an indication to at least partly disable the ADAS or ADS 21.
- the ADAS or ADS 21 may be revoked to, at least temporarily, inhibit further activation(s).
- the ADAS/ADS control system 24 may refer to any commonly known system and/or functionality, e.g. comprised in one or more electronic control modules and/or nodes of the ego-vehicle 2, adapted and/or configured to at least in part control the ADAS or ADS 21.
- ADAS/ADS control system may refer to "disablement system” and/or "ADAS or ADS control system”
- ADAS/ADS control system adapted to "control” may refer to ADAS/ADS control system adapted to "at least partly control”.
- “Comprising an indication" to at least partly disable may refer to “prompting” to at least partly disable and/or “comprising instructions prompting” to at least partly disable, whereas “indication to at least partly disable” may refer to “indication to in due time at least partly disable” and/or “indication to at least partly disable when deemed feasible and/or safe”.
- “disable” the ADAS or ADS may refer to “inhibit” and/or “restrict functionality of” the ADAS or ADS.
- the perception comparing system 1 may further - e.g. by means of the acknowledgement communicating unit 105 (shown in Fig. 4 ) - optionally be adapted and/or configured for communicating, when the discrepancy output fulfils further discrepancy exceedance criteria, further acknowledgement data 60, which discrepancy exceedance criteria deviate from the further discrepancy exceedance criteria. Thereby, different acknowledgment data 6, 60 may be communicated depending on which discrepancy exceedance criteria is fulfilled.
- fulfilling discrepancy exceedance criteria - upon which the acknowledgement data 6 is communicated - may initiate a first action
- fulfilling further discrepancy exceedance criteria - upon which the further acknowledgement data 60 is communicated - may initiate a differing second action. That is, the outcome of the comparison and/or evaluation - i.e. the discrepancy output - is considered in view of additional second discrepancy exceedance criteria, and should said second discrepancy exceedance criteria be fulfilled, then a further acknowledgment 60 indicative thereof is transmitted from the perception comparing system 1.
- the perception comparing system 1 may additionally bring attention to when ego-vehicle perception data 4 deviates from secondary vehicle perception data 5 to an extent fulfilling the condition(s) defined by the further discrepancy exceedance criteria, which for instance may reflect greater and/or more frequent discrepancies. Thereby, there may be achieved a further evaluation in view of failure-risk, failure and/or failure rate - and/or intervention - of the ADAS or ADS.
- the further discrepancy exceedance criteria may be defined by any one or more arbitrary criterion deemed relevant for the implementation at hand - e.g. in view of characteristics of and/or requirements related to the perception system 22 and/or the ADAS or ADS 21 - such as e.g. a maximum discrepancy level, a maximum number of exceedances of e.g. a discrepancy level, and/or a maximum number of exceedances of e.g. a discrepancy level within a predeterminable time frame, etc.
- further discrepancy exceedance criteria may be variable, and furthermore predeterminable.
- “Communicating” further acknowledgement data may refer to “communicating wirelessly and/or by wire” further acknowledgement data and/or “communicating in due time and/or when deemed feasible and/or safe” further acknowledgement data.
- further acknowledgement data may refer to “at least second acknowledgement data", “one or more further acknowledgement signals” and/or “a further acknowledgement message”.
- further acknowledgement data may refer to "further acknowledgement data indicative of, reflecting and/or comprising at least said discrepancy output and/or an indication of that said vehicle is perceivable in said secondary perception data but said secondary vehicle not is locatable in said perception data”.
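The two-tier scheme - first criteria triggering acknowledgement data 6 and a first action, further criteria triggering further acknowledgement data 60 and a differing second action - can be sketched as follows. The threshold values and the returned action labels are purely illustrative assumptions.

```python
def evaluate_discrepancy(discrepancy_output,
                         exceedance_threshold=0.1,
                         further_exceedance_threshold=0.5):
    """Two-tier evaluation of a discrepancy output.

    Fulfilling the further (stricter, assumed) criteria triggers further
    acknowledgement data 60, e.g. an indication to at least partly disable
    the ADAS or ADS; fulfilling only the first criteria triggers
    acknowledgement data 6, e.g. transmission to a remote entity for
    offline analysis.
    """
    if discrepancy_output > further_exceedance_threshold:
        return "further_acknowledgement"  # second action, e.g. disable ADAS/ADS
    if discrepancy_output > exceedance_threshold:
        return "acknowledgement"          # first action, e.g. log to remote entity
    return "no_action"
```

This matches the text's point that greater and/or more frequent discrepancies can be routed to a stronger intervention than ordinary logging.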
- the perception comparing system 1 may - e.g. by means of the acknowledgement communicating unit 105 (shown in Fig. 4 ) - optionally be adapted and/or configured for transmitting the further acknowledgement data 60 wirelessly to the remote entity 9.
- the further acknowledgement data 60 may for instance comprise the discrepancy output. Thereby, data related to e.g. the discrepancy output may be transmitted to the entity 9 for offline processing and/or analysis, when further discrepancy exceedance criteria is/are fulfilled.
- the further acknowledgement data 60 may comprise an indication of that the ego-vehicle 2 is perceivable in the secondary perception data 5 but the secondary vehicle 3 is not locatable in the ego-vehicle perception data 4.
- the timing of transmittal may be arbitrarily selected as deemed feasible, such as e.g. promptly or at a point in time deemed suitable.
- the latter case may e.g. refer to awaiting a high bandwidth connection to the remote entity 9, such as e.g. a Wifi connection.
- the perception comparing system 1 may - e.g. by means of the acknowledgement communicating unit 105 - optionally be adapted and/or configured for transmitting the further acknowledgement data 60 to the ADAS/ADS control system 24, which further acknowledgement data 60 comprises an indication to at least partly disable the ADAS or ADS 21.
- the ADAS or ADS 21 may at least to some extent be revoked to, at least temporarily, inhibit further activation(s), when further discrepancy exceedance criteria is/are fulfilled.
- the perception comparing system 1 comprises a communication establishing unit 101, a perception data deriving unit 102, a secondary perception data receiving unit 103, a discrepancy output determining unit 104 and an acknowledgement communicating unit 105, all of which already have been described in greater detail above.
- the embodiments herein for perception performance evaluation of an ADAS or ADS 21 of a vehicle 2 may be implemented through one or more processors, such as a processor 106, here denoted CPU, together with computer program code for performing the functions and actions of the embodiments herein.
- Said program code may also be provided as a computer program product, for instance in the form of a data carrier carrying computer program code for performing the embodiments herein when being loaded into the perception comparing system 1.
- One such carrier may be in the form of a CD-ROM disc and/or a hard drive; other data carriers are, however, also feasible.
- the computer program code may furthermore be provided as pure program code on a server and downloaded to the perception comparing system 1.
- the perception comparing system 1 may further comprise a memory 107 comprising one or more memory units.
- the memory 107 may be arranged to store e.g. information, data, configurations, scheduling, and applications, to perform the methods herein when executed in the perception comparing system 1.
- the computer program code may be implemented in the firmware, stored in FLASH memory 107, of an embedded processor 106, and/or downloaded wirelessly e.g. from an off-board server.
- the communication establishing unit 101, the perception data deriving unit 102, the secondary perception data receiving unit 103, the discrepancy output determining unit 104, the acknowledgement communicating unit 105, the optional processor 106 and/or the optional memory 107 may at least partly be comprised in one or more nodes 108, e.g. ECUs, of the vehicle 2.
- said units 101, 102, 103, 104, 105 described above may refer to a combination of analog and digital circuits, and/or one or more processors configured with software and/or firmware, e.g. stored in a memory such as the memory 107, that when executed by the one or more processors such as the processor 106 perform as described herein, such as in conjunction with Fig. 5 .
- the processors, as well as the other digital hardware, may be included in a single Application-Specific Integrated Circuit, ASIC, or several processors and various digital hardware may be distributed among several separate components, whether individually packaged or assembled into a System-on-a-Chip, SoC.
- Further shown in Fig. 4 are the ADAS or ADS 21, the perception system 22, the optional one or more surrounding detecting sensors 23, the optional ADAS/ADS control system 24, the secondary vehicle 3, the secondary perception data 5, the acknowledgement data 6, the optional further acknowledgement data 60, and the optional remote entity 9, all of which have been discussed in greater detail above.
- Fig. 5 is a flowchart depicting an exemplifying method performed by a perception comparing system 1 according to embodiments of the disclosure. Said method is for perception performance evaluation of an ADAS or ADS 21 of a vehicle 2.
- the exemplifying method which may be continuously repeated, comprises one or more of the following actions discussed with support from Figs. 1-4 . Moreover, the actions may be taken in any suitable order and/or one or more actions may be performed simultaneously and/or in alternate order where applicable. For instance, Action 1002 and Action 1003 may be performed in alternate order and/or simultaneously.
- the perception comparing system 1 establishes - e.g. with support from the communication establishing unit 101 - communication with a secondary vehicle 3 determined and/or estimated to be positioned within a potential range of surrounding detecting sensors 23 on-board the vehicle 2.
- the perception comparing system 1 derives - e.g. with support from the perception data deriving unit 102 - perception data 4 from a perception system 22 of the ADAS or ADS 21, adapted to estimate surroundings of the vehicle 2.
- the perception comparing system 1 receives - e.g. with support from the secondary perception data receiving unit 103 - secondary perception data 5 from a secondary perception system 32 of a secondary ADAS or ADS 31 of the secondary vehicle 3.
- the perception comparing system 1 determines - e.g. with support from the discrepancy output determining unit 104 - when the secondary vehicle 3 is locatable in the perception data 4, a discrepancy output based on comparison of at least a portion of the perception data 4 and at least a portion of the secondary perception data 5.
- Action 1004 of determining a discrepancy output may comprise determining at least a first portion of the discrepancy output based on comparison of perception data 4 comprising object state estimates 41 of the vehicle 2 and secondary perception data 5 comprising secondary object state estimates 51 of the vehicle 2.
- Action 1004 of determining a discrepancy output may comprise determining at least a first portion of the discrepancy output based on comparison of perception data 4 comprising object state estimates 42 of the secondary vehicle 3 and secondary perception data 5 comprising secondary object state estimates 52 of the secondary vehicle 3.
- said Action 1004 of determining a discrepancy output may comprise determining additionally at least a second portion of the discrepancy output. Said second portion of the discrepancy output may then be based on comparison of perception data 4 comprising object state estimates 43 of at least a first surrounding object and secondary perception data 5 comprising secondary object state estimates 53 of the at least first surrounding object.
- said second portion of the discrepancy output may then be based on comparison of perception data 4 comprising free space estimates 44 of at least a region 45 around the vehicle 2 and secondary perception data 5 comprising secondary free space estimates 54 of at least a secondary region 55 around the secondary vehicle 3, wherein the region 45 at least partly overlaps 4555 the secondary region 55.
- said second portion of the discrepancy output may then be based on comparison of perception data 4 comprising drivable area estimates 46 in at least a region 47 around the vehicle 2 and secondary perception data 5 comprising secondary drivable area estimates 56 in at least a secondary region 57 around the secondary vehicle 3, wherein the region 47 at least partly overlaps 4757 the secondary region 57.
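The comparisons above can be illustrated with two minimal sketches: one for a first portion of the discrepancy output based on object state estimates (here reduced to 2D positions), and one for a second portion based on estimates over an overlapping region (here a flattened grid of free-space labels). Both functions and their discrepancy measures are hypothetical assumptions, not the patent's definitions:

```python
import math

def state_discrepancy(est_a, est_b):
    """Hypothetical sketch: portion of a discrepancy output computed as the
    Euclidean distance between two (x, y) position estimates of the same
    object, e.g. the ego-vehicle as estimated by both perception systems."""
    return math.dist(est_a, est_b)

def overlap_discrepancy(free_a, free_b):
    """Hypothetical sketch: portion of a discrepancy output computed as the
    fraction of cells in the overlapping region on which two free-space
    (or drivable-area) estimates disagree; inputs are equally-sized grids
    of 0/1 labels covering the overlap."""
    assert len(free_a) == len(free_b), "grids must cover the same overlap"
    disagree = sum(1 for a, b in zip(free_a, free_b) if a != b)
    return disagree / len(free_a)
```

A large `state_discrepancy` for the ego-vehicle, or a large `overlap_discrepancy` over the overlapping regions 45/55 or 47/57, would then contribute to the discrepancy output compared against the exceedance criteria.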
- the perception comparing system 1 communicates - e.g. with support from the acknowledgement communicating unit 105 - acknowledgement data 6, when at least a portion of the discrepancy output fulfils discrepancy exceedance criteria and/or when the vehicle 2 is perceivable in the secondary perception data 5 but the secondary vehicle 3 is not locatable in the perception data 4.
- Action 1005 of communicating acknowledgement data 6 may comprise transmitting the acknowledgement data 6 wirelessly to a remote entity 9, wherein the acknowledgement data 6 e.g. comprises the discrepancy output.
- Action 1005 of communicating acknowledgement data 6 may comprise transmitting said acknowledgement data 6 to an on-board ADAS/ADS control system 24 adapted to control the ADAS or ADS 21, which acknowledgement data 6 comprises an indication to at least partly disable the ADAS or ADS 21.
- the perception comparing system 1 may communicate - e.g. with support from the acknowledgement communicating unit 105 - further acknowledgement data 60 when the discrepancy output fulfils further discrepancy exceedance criteria, wherein the discrepancy exceedance criteria deviate from the further discrepancy exceedance criteria.
- Action 1006 of communicating further acknowledgement data 60 may comprise transmitting the further acknowledgement data 60 wirelessly to the remote entity 9, which further acknowledgement data 60 e.g. comprises the discrepancy output.
- Action 1006 of communicating further acknowledgement data 60 may comprise transmitting the further acknowledgement data 60 to the ADAS/ADS control system 24, wherein the further acknowledgement data 60 comprises an indication to at least partly disable the ADAS or ADS.
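The flow of Actions 1004-1006 above can be sketched end to end. The sketch assumes communication is already established (Action 1001) and both perception data sets are at hand (Actions 1002-1003); perception data is modelled as a dict of object ids to (x, y) position estimates, and the distance-based discrepancy and all names are illustrative, not taken from the disclosure:

```python
def evaluate_perception(perception, secondary_perception, threshold, further_threshold):
    """Hypothetical sketch of Actions 1004-1006: compare the two perception
    data sets and decide which acknowledgement data to communicate."""
    actions = []
    if "secondary_vehicle" in perception:  # secondary vehicle locatable?
        # Action 1004: discrepancy output as the worst positional
        # disagreement over objects both perception systems report
        shared = perception.keys() & secondary_perception.keys()
        discrepancy = max(
            ((perception[k][0] - secondary_perception[k][0]) ** 2
             + (perception[k][1] - secondary_perception[k][1]) ** 2) ** 0.5
            for k in shared)
        if discrepancy > threshold:             # Action 1005
            actions.append("acknowledgement")
        if discrepancy > further_threshold:     # optional Action 1006
            actions.append("further_acknowledgement")
    elif "ego_vehicle" in secondary_perception:
        # ego-vehicle perceivable by the secondary vehicle, but the
        # secondary vehicle not locatable in the ego perception data
        actions.append("acknowledgement")       # Action 1005
    return actions
```

The two acknowledgement strings stand in for transmitting data 6 and 60, whether to the remote entity 9 or to the on-board ADAS/ADS control system 24.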
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP20153214.0A EP3855205A1 (fr) | 2020-01-22 | 2020-01-22 | Évaluation des performances de perception d'un système adas ou ads de véhicule |
US17/154,202 US11738776B2 (en) | 2020-01-22 | 2021-01-21 | Perception performance evaluation of a vehicle ADAS or ADS |
CN202110082682.5A CN113160452A (zh) | 2020-01-22 | 2021-01-21 | 车辆adas或ads的感知性能评估 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP20153214.0A EP3855205A1 (fr) | 2020-01-22 | 2020-01-22 | Évaluation des performances de perception d'un système adas ou ads de véhicule |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3855205A1 true EP3855205A1 (fr) | 2021-07-28 |
Family
ID=69187716
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP20153214.0A Pending EP3855205A1 (fr) | 2020-01-22 | 2020-01-22 | Évaluation des performances de perception d'un système adas ou ads de véhicule |
Country Status (3)
Country | Link |
---|---|
US (1) | US11738776B2 (fr) |
EP (1) | EP3855205A1 (fr) |
CN (1) | CN113160452A (fr) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102023105997A1 (de) | 2023-03-10 | 2024-09-12 | Bayerische Motoren Werke Aktiengesellschaft | Verfahren und Assistenzsystem zum Unterstützen einer Fahrzeugentwicklung sowie dafür eingerichtetes Kraftfahrzeug |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11927962B2 (en) * | 2020-10-15 | 2024-03-12 | Ford Global Technologies, Llc | System and method for detecting and addressing errors in a vehicle localization |
US20220204043A1 (en) * | 2020-12-29 | 2022-06-30 | Here Global B.V. | Autonomous driving pattern profile |
EP4095746A1 (fr) * | 2021-05-24 | 2022-11-30 | Zenseact AB | Développement de perception ads |
DE102021209789A1 (de) * | 2021-09-06 | 2023-03-09 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zur Erkennung von Fehljustagen eines Sensors |
CN115620520A (zh) * | 2021-12-15 | 2023-01-17 | 阿波罗智联(北京)科技有限公司 | 用于测试感知目标数的方法、装置、设备、介质和产品 |
CN114802284A (zh) * | 2022-02-16 | 2022-07-29 | 武汉路特斯汽车有限公司 | 车辆感知性能评价方法及系统 |
DE102023107012A1 (de) | 2023-03-21 | 2024-09-26 | Dr. Ing. H.C. F. Porsche Aktiengesellschaft | Verfahren, System und Computerprogrammprodukt zur Prädiktion von subjektiven Bewertungen eines ADAS/ADS-Systems |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180113472A1 (en) * | 2016-10-21 | 2018-04-26 | Toyota Jidosha Kabushiki Kaisha | Estimate of geographical position of a vehicle using wireless vehicle data |
DE102016221440A1 (de) * | 2016-11-02 | 2018-05-03 | Robert Bosch Gmbh | Verfahren zur Diagnose von Umfeld-Sensorsystemen in Fahrzeugen |
DE102017207233A1 (de) * | 2017-04-28 | 2018-10-31 | Siemens Aktiengesellschaft | Verfahren zum Durchführen der Kalibrierung eines Fahrzeug-Sensorsystems, insbesondere eines Autonomen Fahrzeugs, sowie Fahrzeug-Sensorsystem, insbesondere Autonomes Fahrzeug |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4548604B2 (ja) * | 2005-06-14 | 2010-09-22 | 三菱自動車工業株式会社 | 車車間通信システム |
EP2922033B1 (fr) * | 2014-03-18 | 2018-11-21 | Volvo Car Corporation | Système et procédé de diagnostic de capteur de véhicule et véhicule comportant un tel système |
JP6485049B2 (ja) * | 2015-01-09 | 2019-03-20 | 株式会社デンソー | 車載機、車載機診断システム |
JP6326004B2 (ja) * | 2015-05-11 | 2018-05-16 | 株式会社Subaru | 他車位置検出装置 |
DE102016218934A1 (de) * | 2016-09-29 | 2018-03-29 | Continental Teves Ag & Co. Ohg | Verfahren zum Datenaustausch und Datenfusionierung von Umfelddaten |
US20180186468A1 (en) * | 2017-01-04 | 2018-07-05 | Honeywell International Inc. | System and methods to evaluate or improve own ship sensor data in connected vehicles |
US10997429B2 (en) * | 2018-04-11 | 2021-05-04 | Micron Technology, Inc. | Determining autonomous vehicle status based on mapping of crowdsourced object data |
US11119478B2 (en) * | 2018-07-13 | 2021-09-14 | Waymo Llc | Vehicle sensor verification and calibration |
US11454525B2 (en) * | 2018-10-19 | 2022-09-27 | Robert Bosch Gmbh | Vehicle sensor field calibration utilizing other vehicles |
- 2020-01-22: EP EP20153214.0A, published as EP3855205A1 (fr), active, Pending
- 2021-01-21: CN CN202110082682.5A, published as CN113160452A (zh), active, Pending
- 2021-01-21: US US17/154,202, published as US11738776B2 (en), active, Active
Also Published As
Publication number | Publication date |
---|---|
US20210221403A1 (en) | 2021-07-22 |
US11738776B2 (en) | 2023-08-29 |
CN113160452A (zh) | 2021-07-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3855205A1 (fr) | Évaluation des performances de perception d'un système adas ou ads de véhicule | |
CN109421738B (zh) | 用于监视自主车辆的方法和装置 | |
EP3644294B1 (fr) | Procédé et dispositif de stockage d'informations de véhicule, et procédé de commande de déplacement de véhicule | |
CN109421742B (zh) | 用于监测自主车辆的方法和设备 | |
US20210118245A1 (en) | Performance monitoring and evaluation of a vehicle adas or autonomous driving feature | |
EP3525192B1 (fr) | Procédé d'évaluation de véhicule, procédé de correction d'itinéraire de déplacement, dispositif d'évaluation de véhicule et dispositif de correction d'itinéraire de déplacement | |
US11618473B2 (en) | Vehicle control system | |
US20180347993A1 (en) | Systems and methods for verifying road curvature map data | |
CN113734193A (zh) | 用于估计接管时间的系统和方法 | |
JP2023085371A (ja) | 走行記憶システム、および走行記憶方法 | |
EP3626570B1 (fr) | Procédé et dispositif d'aide à la conduite | |
US20210394794A1 (en) | Assessment of a vehicle control system | |
US20230132179A1 (en) | Tow management systems and methods for autonomous vehicles | |
US11904856B2 (en) | Detection of a rearward approaching emergency vehicle | |
US20210206392A1 (en) | Method and device for operating an automated vehicle | |
US11904899B2 (en) | Limp home mode for an autonomous vehicle using a secondary autonomous sensor system | |
US11801870B2 (en) | System for guiding an autonomous vehicle by a towing taxi | |
US20220242455A1 (en) | Unforeseen vehicle driving scenarios | |
EP4357213A1 (fr) | Procédé pour déterminer si une manoeuvre de direction d'évitement de collision automatique doit être exécutée ou non | |
EP4186768B1 (fr) | Gestionnaire de mouvement, appareil de conduite autonome et système de commande | |
US20240219201A1 (en) | Method and system of inconsistent map data reconciliation in connected vehicles | |
US20220390937A1 (en) | Remote traveling vehicle, remote traveling system, and meander traveling suppression method | |
CN112009496A (zh) | 用于自主车辆控制的安全架构 |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012
 | STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED
 | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
 | STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE
 | 17P | Request for examination filed | Effective date: 20220121
 | RBV | Designated contracting states (corrected) | Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
 | STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS
 | 17Q | First examination report despatched | Effective date: 20230510