EP3874402A1 - Verfahren zur unterstützung einer kamerabasierten umfelderkennung eines fortbewegungsmittels mittels einer strassennässeinformation eines ersten ultraschallsensors - Google Patents
- Publication number
- EP3874402A1 (application EP19773013.8A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- transportation
- signal
- ultrasonic sensor
- road
- environment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/93—Sonar systems specially adapted for specific applications for anti-collision purposes
- G01S15/931—Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60S—SERVICING, CLEANING, REPAIRING, SUPPORTING, LIFTING, OR MANOEUVRING OF VEHICLES, NOT OTHERWISE PROVIDED FOR
- B60S1/00—Cleaning of vehicles
- B60S1/02—Cleaning windscreens, windows or optical devices
- B60S1/04—Wipers or the like, e.g. scrapers
- B60S1/06—Wipers or the like, e.g. scrapers characterised by the drive
- B60S1/08—Wipers or the like, e.g. scrapers characterised by the drive electrically driven
- B60S1/0818—Wipers or the like, e.g. scrapers characterised by the drive electrically driven including control systems responsive to external conditions, e.g. by detection of moisture, dirt or the like
- B60S1/0822—Wipers or the like, e.g. scrapers characterised by the drive electrically driven including control systems responsive to external conditions, e.g. by detection of moisture, dirt or the like characterized by the arrangement or type of detection means
- B60S1/0855—Ultrasonic rain sensor
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/539—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/182—Network patterns, e.g. roads or rivers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/13—Sensors therefor
- G06V40/1306—Sensors therefor non-optical, e.g. ultrasonic or capacitive sensing
Definitions
- the present invention relates to a method for supporting a camera-based environment detection of a means of transportation by means of road wetness information from a first ultrasonic sensor.
- Means of transportation are known from the prior art which carry out a camera-based environment detection in order to obtain information about objects in the environment of the means of transportation. This information is received and used, for example, by driver assistance systems and/or systems for autonomous control of the means of transportation.
- Such environment detection is based on prior-art algorithms for image analysis and for the classification of objects, which usually use one or more classifiers for specific objects.
- Rain sensors for means of transportation are known which detect existing precipitation. These are usually arranged in an upper area of a windshield of the means of transportation; precipitation detected on the windshield can be used to select suitable classifiers for an environment detection.
- ultrasonic sensors are known from the prior art and are often used in connection with means of transportation for parking assistance systems or similar driver assistance systems.
- ultrasonic sensors are usually arranged on the means of transportation in such a way that their direction of radiation and detection is essentially horizontal to the means of transportation, in order to detect distances of objects in the vicinity of the means of transportation.
- the proposed means of transportation supported by means of road wetness information from a first ultrasonic sensor can be, for example, a road vehicle (e.g. motorcycle, car, van, truck).
- the device can comprise an evaluation unit, which preferably has a data input.
- the evaluation unit can be configured, for example, as an ASIC, FPGA, processor, digital signal processor, microcontroller, or the like, and can be connected to an internal and/or external storage unit in terms of information technology.
- the evaluation unit can also be set up to carry out the method according to the invention in conjunction with a computer program executed by the evaluation unit.
- a first signal representing an environment of the means of transportation is generated by means of the first ultrasonic sensor.
- the first ultrasonic sensor can be an ultrasonic sensor of the means of transportation that is also used for other purposes. Alternatively or additionally, a dedicated ultrasonic sensor can be used for this purpose.
- the ultrasonic sensor of the means of transportation can be, for example, an ultrasonic sensor of a parking assistance system or of another driver assistance system.
- the ultrasonic sensor can be arranged, for example, in a front apron or in the area of a rear of the means of transportation, or also at other positions of the means of transportation, so that both the tire noise and the road ahead or behind and its surroundings can be detected.
- the ultrasonic sensor can be connected directly or indirectly (i.e., for example, via another control device of the means of transportation) in terms of information technology to the data input of the evaluation unit according to the invention.
- the connection can be established, for example, by means of a bus system (e.g. CAN, LIN, MOST, Ethernet, etc.) of an on-board network of the means of transportation.
- the first signal of the first ultrasonic sensor received by the evaluation unit can first be stored in the memory unit connected to the evaluation unit for subsequent processing by the evaluation unit.
- a second signal representing the surroundings of the means of transportation is recorded by means of a camera of the means of transportation.
- the second signal can preferably be acquired at an essentially identical point in time as the first signal, so that it can be ensured that the two signals each contain environmental information corresponding to one another in time. Due to different sensor types and different signal processing and signal transmission chains, there may be a time offset between the two signals.
- the offset between the two signals can be, for example, between a few milliseconds and a few hundred milliseconds, or even in the range of seconds.
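The tolerated offset just described can be handled by pairing samples on their timestamps; a minimal Python sketch, in which the function name, timestamp lists, and tolerance value are illustrative assumptions and not part of the patent:

```python
from bisect import bisect_left

def pair_with_offset(ultra_ts, cam_ts, max_offset_s=0.5):
    """Pair each camera timestamp with the closest ultrasonic timestamp,
    provided their offset stays within the stated tolerance.
    ultra_ts must be sorted ascending; all times in seconds."""
    pairs = []
    for t_cam in cam_ts:
        i = bisect_left(ultra_ts, t_cam)
        # Candidates: the neighbours around the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(ultra_ts)]
        if not candidates:
            continue
        best = min(candidates, key=lambda j: abs(ultra_ts[j] - t_cam))
        if abs(ultra_ts[best] - t_cam) <= max_offset_s:
            pairs.append((t_cam, ultra_ts[best]))
    return pairs
```

A camera frame without an ultrasonic sample inside the tolerance window is simply dropped from the pairing rather than matched to stale data.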
- the camera can be, for example, a 2D or 3D camera with a standard image resolution, an HD or an ultra-HD image resolution or an infrared camera.
- the camera can preferably be arranged and aligned on the means of transportation in such a way that it captures an environment in front of the means of transportation.
- an arrangement and/or alignment of the camera is, however, not limited to this example.
- the camera can be connected directly or indirectly, in terms of information technology, to the evaluation unit according to the invention via the vehicle electrical system.
- the camera can be connected in terms of information technology to an image processing unit of the means of transport, which is set up to receive image signals from the camera and to process them.
- an image processing unit can be part of a driver assistance system or a system for autonomous operation of the means of transportation.
- in this preferred embodiment, the image processing unit can be connected in terms of information technology to the evaluation unit via the vehicle electrical system, so that the evaluation unit can transmit the road wetness information described in more detail below to the image processing unit.
- the image processing unit can be a component of the evaluation unit itself (or vice versa), so that communication between these two components can take place directly and not via the vehicle electrical system.
- This can be implemented, for example, in such a way that a logic to be executed by the evaluation unit according to the invention for implementing the method steps according to the invention is executed in the form of a computer program by means of the image processing unit.
- the evaluation unit can compare a noise level of the first signal with a predefined threshold value for a noise level, which can be stored in the memory unit connected to the evaluation unit.
- the predefined threshold value for a noise level is preferably selected such that its being exceeded by the noise level of the first signal indicates a currently prevailing road wetness.
- the evaluation unit can also estimate a degree of wetness, in that the extent to which the noise level of the first signal exceeds the predefined threshold value is taken into account.
- the plurality of predefined threshold values can correspond to different speed ranges of the means of transportation.
- it can be advantageous to use information on a current speed of the means of transportation provided via the vehicle electrical system.
- the evaluation unit can select a corresponding predefined threshold value from a plurality of predefined threshold values depending on a value of a current speed, since a higher speed is generally associated with a higher noise level of the first signal. In this way it can be prevented that, at a higher speed of the means of transportation, the evaluation unit erroneously detects road wetness even though the road surface is actually dry.
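The speed-dependent threshold selection described above can be sketched as follows; the speed ranges and dB values below are invented for illustration, real thresholds would be calibrated per vehicle and sensor:

```python
# Hypothetical (max_speed_kmh, threshold_db) pairs: a higher speed range
# tolerates a higher noise floor before road wetness is assumed.
THRESHOLDS_DB = [(30.0, 52.0), (80.0, 60.0), (float("inf"), 67.0)]

def threshold_for_speed(speed_kmh):
    """Pick the predefined threshold matching the current speed range."""
    for max_speed, threshold in THRESHOLDS_DB:
        if speed_kmh <= max_speed:
            return threshold

def road_is_wet(noise_level_db, speed_kmh):
    """Road wetness is assumed when the ultrasonic noise level exceeds
    the threshold predefined for the current speed range."""
    return noise_level_db > threshold_for_speed(speed_kmh)
```

With these invented values, a 55 dB noise level reads as wet at 20 km/h but as dry at 60 km/h, which is exactly the false-positive suppression described above.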
- a predefined parameter set is selected from a plurality of predefined parameter sets depending on the road wetness information.
- the plurality of predefined parameter sets can, for example, represent different configurations of a classifier for the environment detection.
- a classifier can preferably be part of a computer program which is executed by the image processing unit and / or the evaluation unit.
- if the evaluation unit and the image processing unit are implemented as separate components, the selection of the predefined parameter set can be made by the image processing unit depending on the road wetness information and possibly further information (e.g. about a speed of the means of transportation), which can be made available by the evaluation unit.
- a goal of using different predefined parameter sets is to provide, for the environment detection, parameter sets adapted to a current environment (i.e. wet or dry).
- a classifier trained for a dry environment can usually deliver only inadequate or unreliable results in a wet environment in the course of the environment detection, for example due to whirled-up water (spray) from means of transportation driving ahead.
- conversely, a classifier trained for a wet environment can often not deliver optimal recognition results in a dry environment.
- depending on the degree of wetness, a predefined parameter set that is matched to the respective degree of wetness can additionally be selected.
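A sketch of such a degree-dependent selection; the set names and the 0.5 cut-off are purely illustrative assumptions:

```python
# Hypothetical parameter-set identifiers; in practice these would be
# trained classifier configurations held in the storage unit.
PARAMETER_SETS = {
    "dry": "classifier_config_dry",
    "damp": "classifier_config_damp",
    "wet": "classifier_config_wet",
}

def select_parameter_set(is_wet, wetness_degree=0.0):
    """Map road wetness information (flag plus estimated degree in [0, 1])
    to one of the predefined parameter sets."""
    if not is_wet:
        return PARAMETER_SETS["dry"]
    # Finer grading: heavy wetness gets its own set.
    return PARAMETER_SETS["wet" if wetness_degree >= 0.5 else "damp"]
```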
- the determination of the road wetness information can additionally depend on further values received via the on-board electrical system of the means of transportation; for example, a current acceleration and/or a current engine speed of the means of transportation can advantageously be taken into account in a similar manner.
- the predefined parameter set can be a configuration of a trained classifier.
- the classifier described above can be implemented on the basis of a self-learning system, such as a neural network (e.g. with a deep learning structure).
- alternatively, other types of self-learning systems can be used. By means of such a self-learning system, training trips of the means of transportation can be carried out under a variety of environmental conditions.
- the selection of the predefined parameter set can alternatively or additionally take place as a function of a change in the noise level and/or a current temperature and/or an amount of water present in the environment of the means of transportation.
- a change in the noise level can, as described above, be caused by different amounts of water on a road surface.
- a change in the noise level can also be caused by a change in a distance to vehicles in front; such a change in distance can be determined by an additional analysis of the second signal, for example from a size change of immediately preceding vehicles in the camera image.
- signals from other environment sensors of the means of transportation can also be used to evaluate a current situation. It can be particularly advantageous here to take into account distance information about vehicles in front from a LIDAR and/or a radar system of the means of transportation.
- the predefined parameter sets generated and used for the abovementioned cases can have the effect that means of transportation partially obscured by a spray cloud are recognized better and/or faster, even if only vague outlines of means of transportation in front can be captured by the camera.
- a probability that roadside snow is present can in particular be high if there is an outside temperature of 0 °C or less and at the same time road wetness is recognized. Based on this information, another suitable predefined parameter set can be selected, so that in the course of the environment detection a lane limitation can be reliably recognized even in the presence of snow.
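The temperature/wetness combination above amounts to a simple rule; sketched here with hypothetical parameter-set names:

```python
def select_set_with_snow(outside_temp_c, is_wet):
    """Prefer a snow-tuned parameter set when a freezing outside
    temperature and detected road wetness coincide (names illustrative)."""
    if is_wet and outside_temp_c <= 0.0:
        return "classifier_config_snow"
    return "classifier_config_wet" if is_wet else "classifier_config_dry"
```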
- the road wetness information can be determined as a function of the interference-free nature of the first signal.
- interference-free is to be understood here to mean the absence of a wide variety of interferences that make reliable detection of road wetness difficult or even impossible, such as buildings on the roadside and/or other means of transportation in the immediate vicinity of the means of transportation.
- such interference can be determined, for example, on the basis of the second signal or on the basis of signals from other environment sensors, such as LIDAR and/or radar sensors. In the event that corresponding interference is present, the evaluation unit can transmit road wetness information to the image processing unit which represents a road wetness condition before the occurrence of the interference.
- this value can preferably continue to be used as road wetness information by the overall system until the interference has disappeared from the environment of the means of transportation.
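Holding the last interference-free value can be sketched as a small state holder; the class name and interface are hypothetical:

```python
class WetnessHold:
    """Returns the last road wetness value captured without interference;
    while interference is present, that value is simply held."""

    def __init__(self, initial=False):
        self._last_good = initial

    def update(self, measured_wet, interference_present):
        # Only interference-free measurements may refresh the stored value.
        if not interference_present:
            self._last_good = measured_wet
        return self._last_good
```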
- the first ultrasonic sensor can be arranged on the means of transportation in such a way that a detection area of the first ultrasonic sensor lies in the direction of travel or counter to the direction of travel of the means of transportation.
- the surroundings of the means of transportation can additionally be detected on the basis of a second ultrasonic sensor, in particular a second ultrasonic sensor which is arranged on the means of transportation in such a way that a detection area of the second ultrasonic sensor lies in the direction of travel or counter to the direction of travel of the means of transportation.
- preferably, the first ultrasonic sensor can be aligned in the direction of travel and the second ultrasonic sensor counter to the direction of travel of the means of transportation. In this way, the road wetness information can be determined on the basis of both ultrasonic sensors, which provides an additional degree of reliability.
- road wetness information can be determined alternately on the basis of the first or on the basis of the second ultrasonic sensor, in each case by evaluating the signal of that ultrasonic sensor which has the lowest proportion of interference at a current point in time.
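Selecting, at each point in time, the sensor with the lowest interference could look like this; the tuple layout and the 60 dB threshold are assumptions for the sketch:

```python
def wetness_from_best_sensor(front, rear, threshold_db=60.0):
    """front/rear: (noise_level_db, interference_fraction) tuples.
    Evaluate only the sensor currently showing the least interference."""
    noise_level, _ = min((front, rear), key=lambda s: s[1])
    return noise_level > threshold_db
```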
- further ultrasonic sensors can be used for the method according to the invention.
- third, fourth and more ultrasonic sensors can be used, which can be combined and used analogously to the configurations described above.
- an arrangement of the first, second, third, fourth and possibly further ultrasonic sensors is explicitly not limited to the front and/or rear of the means of transportation.
- the road wetness information determined from the first signal can be checked for plausibility by road wetness information determined from the second signal. This can take place on the basis of an analysis of reflections from light sources in the camera images, for example by checking whether these light sources lie above or apparently below a road level.
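The reflection check can be reduced to comparing image rows (pixel y grows downward); the geometry here is deliberately simplified and all names are illustrative:

```python
def reflection_suggests_wetness(light_source_rows, road_level_row):
    """A light source whose image row lies below the road-level row is
    read as a mirror image on a wet surface (y grows downward)."""
    return any(row > road_level_row for row in light_source_rows)
```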
- the road wetness information can also be checked for plausibility by further sensors and / or control devices of the means of transportation.
- for this purpose, a rain sensor arranged on a windshield of the means of transportation comes into question, or further sensors of the means of transportation.
- information about objects in the environment of the means of transportation determined by the environment detection can then, among other things, be transmitted to a driver assistance system and/or a system for autonomous control of the means of transportation and used there.
- the device comprises an evaluation unit and a data input.
- the evaluation unit can be configured, for example, as an ASIC, FPGA, processor, digital signal processor, microcontroller, or the like, and can be connected to an internal and / or external storage unit in terms of information technology.
- the evaluation unit can be set up to execute the method according to the invention in conjunction with a computer program executed by the evaluation unit. Furthermore, the evaluation unit is set up, in connection with the data input, to capture a first signal, determined by means of the first ultrasonic sensor of the means of transportation and representing an environment of the means of transportation, and a second signal, determined by means of a camera of the means of transportation and capturing the environment of the means of transportation.
- the ultrasonic sensor can preferably be an already existing ultrasonic sensor of the means of transportation. Furthermore, the ultrasonic sensor can be arranged, for example, in a front apron or in the area of a rear of the means of transportation, or also at further positions of the means of transportation, so that either the road ahead or the road behind and its surroundings can be detected.
- the camera can be, for example, a 2D or 3D camera with a standard image resolution, an HD or an ultra-HD image resolution or an infrared camera.
- the camera can preferably be arranged and aligned on the means of transportation in such a way that the camera can capture an environment in front of the means of transportation.
- the evaluation unit can be connected, directly and/or indirectly by means of an electrical system of the means of transportation, to the ultrasonic sensor and the camera in terms of information technology.
- the evaluation unit is set up to determine road wetness information on the basis of the first signal, to select a predefined parameter set from a plurality of predefined parameter sets as a function of the road wetness information, and to carry out an environment detection on the basis of the second signal in connection with the predefined parameter set.
- Figure 1 is a flow diagram illustrating steps of an embodiment of a method according to the invention,
- Figure 2 is a block diagram of a device according to the invention in conjunction with a means of transportation, and
- Figure 3 is a diagram of a speed-dependent noise level of a first ultrasonic sensor.
- Figure 1 shows a flow diagram illustrating steps of an embodiment of the method according to the invention.
- in a first step, an evaluation unit 10, which is a microcontroller, receives a first signal representing an environment 60 of the means of transportation 80, which is generated by means of a first ultrasonic sensor 30 of the means of transportation 80.
- the first ultrasonic sensor 30 is arranged in a front apron of the means of transportation 80 and aligned in the direction of travel of the means of transportation 80.
- the evaluation unit 10 receives the first signal by means of a data input 12 of the evaluation unit 10 and stores environmental information represented by the first signal in an internal memory unit 20 of the microcontroller.
- a second signal representing the surroundings 60 of the means of transportation 80 is recorded by means of a camera 40 of the means of transportation 80.
- the camera 40 is arranged in an interior of the means of transportation 80 in an upper region of a windshield of the means of transportation 80 and is oriented such that the camera 40 captures an environment 60 in front of the means of transportation 80.
- the second signal of the camera 40 is received by an image processing unit of the means of transportation 80.
- in step 300, the evaluation unit 10, executing a computer program, determines road wetness information based on the first signal.
- for this purpose, the evaluation unit 10 compares a noise level 70 of the first signal with a predefined threshold value 75 for a noise level 70. Exceeding the predefined threshold value 75 by the noise level 70 indicates that there is road wetness in the surroundings 60 of the means of transportation 80.
- the evaluation unit 10 sends a corresponding signal, which includes the current road wetness information, to the image processing unit by means of a vehicle bus.
- in step 400 of the method according to the invention, the image processing unit selects a predefined parameter set from a plurality of predefined parameter sets depending on the received road wetness information. The parameter set selected in this case by the image processing unit represents a configuration of a classifier based on a neural network, which was trained at an earlier point in time (e.g. in a development phase of the means of transportation 80), among other things, for a wet environment.
- in step 500, an environment detection is carried out on the basis of the second signal in conjunction with the predefined parameter set by means of the image processing unit.
- Information about objects in the surroundings 60 of the means of transportation 80 determined by means of the environment detection is then transmitted to a system for autonomously controlling the means of transportation 80 by means of the vehicle electrical system and is used by the latter in the course of the autonomous control of the means of transportation 80.
- FIG. 2 shows a block diagram of a device according to the invention in conjunction with a means of transportation 80.
- the device comprises an evaluation unit 10, which here is a microcontroller and has a data input 12.
- via the data input 12, the evaluation unit 10 is connected in terms of information technology, via an electrical system of the means of transportation 80, to a first ultrasonic sensor 30 oriented in the direction of travel of the means of transportation 80 and to a second ultrasonic sensor 35 oriented counter to the direction of travel. Also via the data input 12, the evaluation unit 10 is connected in terms of information technology, via the vehicle electrical system, to a camera 40 oriented in the direction of travel of the means of transportation 80.
- the evaluation unit 10 is connected in terms of information technology to an external storage unit 20, which is set up to store information received by the evaluation unit 10 for downstream processing by the evaluation unit 10.
- in this way, the evaluation unit 10 is able to detect an environment 60 of the means of transportation 80.
- the evaluation unit 10 is not only configured to determine road wetness information on the basis of first signals from the first ultrasonic sensor 30 and the second ultrasonic sensor 35, but is also set up to select a predefined parameter set corresponding to the road wetness information and, by means of the predefined parameter set, to carry out an environment detection on the basis of a second signal from the camera 40.
- FIG. 3 shows a diagram of a speed-dependent noise level 70 of a first ultrasound sensor 30.
- a means of transportation 80 which uses the first ultrasonic sensor 30 in the sense of the method according to the invention travels, in a first phase P1, at a speed v to which a predefined threshold value 75 corresponds.
- accordingly, a predefined threshold value 75 from a plurality of predefined threshold values 75, which was previously defined for this speed range, is used for a comparison with the noise level 70 of the first signal.
- since the noise level 70 in the first phase P1 lies completely above the predefined threshold value 75 of the first phase P1, an existing road wetness is determined by an evaluation unit 10 according to the invention. From the course of the speed v it can be seen that the speed v of the means of transportation 80 continues to increase over time.
- a predefined threshold value 75, which differs from the predefined threshold value 75 of the first phase P1, is selected for a second phase P2 by means of the evaluation unit 10 on account of the now higher speed v.
- the predefined threshold value 75 of the second phase P2 is adapted to the noise level 70 generated by the higher speed v.
- since the noise level 70 in the second phase P2 does not exceed this predefined threshold value 75, the evaluation unit 10 determines a dry road surface.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Acoustics & Sound (AREA)
- Automation & Control Theory (AREA)
- Mechanical Engineering (AREA)
- Human Computer Interaction (AREA)
- Traffic Control Systems (AREA)
- Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102018218733.9A DE102018218733A1 (de) | 2018-10-31 | 2018-10-31 | Verfahren zur Unterstützung einer kamerabasierten Umfelderkennung eines Fortbewegungsmittels mittels einer Strassennässeinformation eines ersten Ultraschallsensors |
PCT/EP2019/074653 WO2020088829A1 (de) | 2018-10-31 | 2019-09-16 | Verfahren zur unterstützung einer kamerabasierten umfelderkennung eines fortbewegungsmittels mittels einer strassennässeinformation eines ersten ultraschallsensors |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3874402A1 true EP3874402A1 (de) | 2021-09-08 |
Family
ID=67999619
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19773013.8A Pending EP3874402A1 (de) | 2018-10-31 | 2019-09-16 | Verfahren zur unterstützung einer kamerabasierten umfelderkennung eines fortbewegungsmittels mittels einer strassennässeinformation eines ersten ultraschallsensors |
Country Status (7)
Country | Link |
---|---|
US (1) | US11580752B2 (de) |
EP (1) | EP3874402A1 (de) |
JP (1) | JP7069425B2 (de) |
KR (1) | KR20210083303A (de) |
CN (1) | CN112997189A (de) |
DE (1) | DE102018218733A1 (de) |
WO (1) | WO2020088829A1 (de) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102022202036A1 (de) | 2022-02-28 | 2023-08-31 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren und Vorrichtung zum Bereitstellen eines Klassifikationsergebnisses zur Objektidentifikation mithilfe ultraschallbasierter Sensorsysteme in mobilen Einrichtungen |
Family Cites Families (43)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2945230B2 (ja) | 1993-02-25 | 1999-09-06 | 三菱電機株式会社 | 路面状態検知装置 |
JP2000304860A (ja) * | 1999-04-23 | 2000-11-02 | Matsushita Electric Works Ltd | 車載用超音波検知器 |
US7920102B2 (en) * | 1999-12-15 | 2011-04-05 | Automotive Technologies International, Inc. | Vehicular heads-up display system |
US9873442B2 (en) * | 2002-06-04 | 2018-01-23 | General Electric Company | Aerial camera system and method for identifying route-related hazards |
JP4183542B2 (ja) * | 2003-03-28 | 2008-11-19 | 名古屋電機工業株式会社 | 車両用路面状態検出装置、車両用路面状態検出方法および車両用路面状態検出装置の制御プログラム |
DE102004020282A1 (de) * | 2003-04-24 | 2006-07-20 | Roman Koller | Verlustmessung mit Sensor |
JP4296940B2 (ja) * | 2004-01-13 | 2009-07-15 | トヨタ自動車株式会社 | 車両運転支援装置および車両運転支援方法 |
JP3934119B2 (ja) | 2004-06-14 | 2007-06-20 | 本田技研工業株式会社 | 車両周辺監視装置 |
DE102005023696A1 (de) | 2005-05-23 | 2006-11-30 | Robert Bosch Gmbh | Überwachungseinrichtung für ein Fahrzeug |
US20140172727A1 (en) * | 2005-12-23 | 2014-06-19 | Raj V. Abhyanker | Short-term automobile rentals in a geo-spatial environment |
JP2007322231A (ja) | 2006-05-31 | 2007-12-13 | Fujifilm Corp | 路面状況検出装置 |
DE102006037591A1 (de) * | 2006-08-11 | 2008-02-14 | Robert Bosch Gmbh | Vorrichtung zur Erfassung eines bewegten Objektes |
CN101016053A (zh) * | 2007-01-25 | 2007-08-15 | 吉林大学 | 高等级公路上车辆防追尾碰撞预警方法和系统 |
JP4952444B2 (ja) * | 2007-08-29 | 2012-06-13 | 横浜ゴム株式会社 | 車両走行路面状態推定システム |
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5034809B2 (ja) * | 2007-09-18 | 2012-09-26 | Mazda Motor Corporation | Road surface condition estimation device for a vehicle |
DE102010044219A1 (de) * | 2010-11-22 | 2012-05-24 | Robert Bosch Gmbh | Method for detecting the surroundings of a vehicle |
GB2489561B (en) | 2011-03-15 | 2013-10-02 | Land Rover Uk Ltd | Vehicle under-body mounted sensor and control system |
WO2012171739A2 (de) * | 2011-06-17 | 2012-12-20 | Robert Bosch Gmbh | Method and control unit for detecting a weather condition in the surroundings of a vehicle |
DE102011056051A1 (de) * | 2011-12-05 | 2013-06-06 | Conti Temic Microelectronic Gmbh | Method for evaluating image data of a vehicle camera taking into account information about rain |
DE102012221518A1 (de) * | 2012-11-26 | 2014-05-28 | Robert Bosch Gmbh | Method for determining road slipperiness in a vehicle |
JPWO2015045501A1 (ja) * | 2013-09-27 | 2017-03-09 | Hitachi Automotive Systems, Ltd. | External environment recognition device |
DE102013226631A1 (de) * | 2013-12-19 | 2015-06-25 | Continental Teves Ag & Co. Ohg | Method and device for determining local weather conditions and a local road condition |
US20150202770A1 (en) * | 2014-01-17 | 2015-07-23 | Anthony Patron | Sidewalk messaging of an autonomous robot |
US9335178B2 (en) * | 2014-01-28 | 2016-05-10 | GM Global Technology Operations LLC | Method for using street level images to enhance automated driving mode for vehicle |
US9550418B1 (en) * | 2014-06-03 | 2017-01-24 | Twin Harbor Labs, LLC | Travel safety control |
KR101618551B1 (ko) * | 2014-07-02 | 2016-05-09 | LG Electronics Inc. | Vehicle driving assistance device and vehicle equipped with same |
DE102014213536A1 (de) * | 2014-07-11 | 2016-01-14 | Bayerische Motoren Werke Aktiengesellschaft | Combining partial images into an image of the surroundings of a means of transportation |
US9428183B2 (en) * | 2014-07-31 | 2016-08-30 | Toyota Motor Engineering & Manufacturing North America, Inc. | Self-explaining autonomous vehicle |
DE102015202782A1 (de) * | 2015-02-17 | 2016-08-18 | Robert Bosch Gmbh | Method for operating a sensor device, and sensor device |
DE102015106408A1 (de) * | 2015-04-27 | 2016-10-27 | Dr. Ing. H.C. F. Porsche Aktiengesellschaft | Sensor arrangement for detecting a road condition using an ultrasonic sensor, driver assistance system, motor vehicle, and associated method |
DE102015106401A1 (de) * | 2015-04-27 | 2016-10-27 | Valeo Schalter Und Sensoren Gmbh | Sensor arrangement for detecting a road condition using at least two spaced-apart ultrasonic sensors, driver assistance system, motor vehicle, and associated method |
DE102015208429A1 (de) * | 2015-05-06 | 2016-11-10 | Continental Teves Ag & Co. Ohg | Method and device for detecting and evaluating road reflections |
US20160357187A1 (en) | 2015-06-05 | 2016-12-08 | Arafat M.A. ANSARI | Smart vehicle |
JP6468162B2 (ja) * | 2015-10-19 | 2019-02-13 | Denso Corporation | Obstacle notification device |
DE102015015022A1 (de) | 2015-11-20 | 2016-05-25 | Daimler Ag | Device and method for determining wetness on a road |
US9928427B2 (en) * | 2015-12-03 | 2018-03-27 | GM Global Technology Operations LLC | Vision-based wet road surface condition detection using tire rearward splash |
DE102016103251A1 (de) * | 2016-02-24 | 2017-08-24 | Bayerische Motoren Werke Aktiengesellschaft | Method for operating at least one sensor for detecting the surroundings of a vehicle |
DE102016009022A1 (de) | 2016-07-23 | 2017-02-02 | Daimler Ag | Method for detecting wetness on a road |
DE102016218238B3 (de) | 2016-09-22 | 2017-07-06 | Robert Bosch Gmbh | Method and computing unit for detecting a wet or damp road and for object detection |
JP6859073B2 (ja) | 2016-11-01 | 2021-04-14 | Denso Corporation | Anomaly detection device |
DE102016122987A1 (de) * | 2016-11-29 | 2018-05-30 | Valeo Schalter Und Sensoren Gmbh | Method for predictively determining an aquaplaning risk for a motor vehicle, driver assistance system, motor vehicle, and aquaplaning determination system |
CN106600987A (zh) * | 2016-12-28 | 2017-04-26 | Beijing Boyan Zhitong Technology Co., Ltd. | Intersection traffic signal control method and system with multi-dimensional detection function |
DE102017206244A1 (de) * | 2017-04-11 | 2018-10-11 | Continental Teves Ag & Co. Ohg | Method and device for determining a road condition |
2018
- 2018-10-31 DE DE102018218733.9A patent/DE102018218733A1/de active Pending

2019
- 2019-09-16 EP EP19773013.8A patent/EP3874402A1/de active Pending
- 2019-09-16 JP JP2021547885A patent/JP7069425B2/ja active Active
- 2019-09-16 KR KR1020217015945A patent/KR20210083303A/ko active Search and Examination
- 2019-09-16 US US17/252,946 patent/US11580752B2/en active Active
- 2019-09-16 CN CN201980073712.7A patent/CN112997189A/zh active Pending
- 2019-09-16 WO PCT/EP2019/074653 patent/WO2020088829A1/de unknown
Also Published As
Publication number | Publication date |
---|---|
CN112997189A (zh) | 2021-06-18 |
WO2020088829A1 (de) | 2020-05-07 |
JP7069425B2 (ja) | 2022-05-17 |
JP2022509379A (ja) | 2022-01-20 |
US11580752B2 (en) | 2023-02-14 |
US20210124957A1 (en) | 2021-04-29 |
DE102018218733A1 (de) | 2020-04-30 |
KR20210083303A (ko) | 2021-07-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2651696B1 (de) | | Method for detecting a wet road |
WO2019174682A1 (de) | | Method and device for detecting and evaluating road conditions and weather-related environmental influences |
EP2581892B1 (de) | | Distance measuring system and method for measuring the distance, in particular of a vehicle, to its surroundings |
DE102013112916A1 (de) | | Vehicle driving support control device |
DE112017003974B4 (de) | | Speed control device |
DE102012101453A1 (de) | | Vehicle driving support device |
DE102020212799A1 (de) | | Performing object and activity recognition based on data from a camera and a radar sensor |
DE102012111846A1 (de) | | Collision protection method and collision protection system |
EP3622709A1 (de) | | Method and device for spatially resolved detection of an object external to a vehicle using a sensor installed in a vehicle |
DE102015203845A1 (de) | | System and method for detecting a preceding vehicle using a sensor |
DE102014009059A1 (de) | | Side/rear alarm system for vehicles and alarm control method therefor |
WO2018059817A1 (de) | | Method for detecting an object in the surroundings of a motor vehicle taking into account a scatter of distance values of an ultrasonic sensor, control unit, driver assistance system, and motor vehicle |
DE102014106506A1 (de) | | Method for carrying out a diagnosis of a camera system of a motor vehicle, camera system, and motor vehicle |
DE102015009849A1 (de) | | Vehicle-type radar system and method for removing a target of no interest |
WO2018202552A1 (de) | | Method and device for classifying objects in the surroundings of a motor vehicle |
DE102013202915A1 (de) | | Method and device for measuring a parking space for a parking assistance system of a motor vehicle |
WO2015090691A1 (de) | | Method for generating a surroundings model of a motor vehicle, driver assistance system, and motor vehicle |
DE102012024959A1 (de) | | Method for operating a vehicle, and vehicle |
DE102021213900A1 (de) | | Device and method for detecting a blockage of a radar sensor, and radar device |
DE102013022050A1 (de) | | Method for tracking a target vehicle, in particular a motorcycle, by means of a motor vehicle, camera system, and motor vehicle |
EP3874402A1 (de) | | Method for supporting camera-based environment recognition of a means of transportation using road wetness information from a first ultrasonic sensor |
WO2020182812A1 (de) | | Method for determining a wetness-related accident risk for a means of transportation |
DE102017122578A1 (de) | | Method for suppressing false detections, radar system, and driver assistance system |
DE102017118809B4 (de) | | Method for operating a sensor device of a motor vehicle, sensor device, driver assistance system, and motor vehicle |
DE102017115457A1 (de) | | Detection of a misalignment of a distance sensor based on a ratio of detection features |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20210531 |
|
AK | Designated contracting states |
Kind code of ref document: A1 |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
DAV | Request for validation of the european patent (deleted) |
DAX | Request for extension of the european patent (deleted) |
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20230428 |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230509 |