EP1776602A1 - Method and device for operating a sensor system (Verfahren und Vorrichtung zum Betreiben eines Sensorsystems) - Google Patents
- Publication number
- EP1776602A1 (application number EP05768003A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- sensor
- data
- detection
- processing unit
- sensors
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/932—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles using own vehicle data, e.g. ground speed, steering wheel direction
- G01S2013/9321—Velocity regulation, e.g. cruise control
- G01S2013/9322—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles using additional data, e.g. driver condition, road state or weather data
- G01S2013/9327—Sensor installation details
- G01S2013/93271—Sensor installation details in the front of the vehicles
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
Definitions
- The invention relates to a method and an apparatus for operating a sensor system, which preferably comprises at least two mutually different sensors.
- DE 101 33 945 A1 shows a method and a device for operating a sensor system.
- The sensors of the sensor system, which are preferably based on different sensor technologies (radar, video, ultrasound, etc.), communicate with a processing unit (sensor data fusion unit or information platform).
- The sensors transmit sensor data, i.e. the results of their measurement processes, to the processing unit, where they are further processed and forwarded to at least one, preferably several, functionalities.
- Data exchange also takes place from the processing unit to the sensors; in particular, identification data are exchanged, especially sent from the processing unit to at least one sensor.
- By transmitting information from the processing unit to the sensor device that indicates at least one area to be detected (in particular a spatial area) and/or at least one operating mode to be activated in the sensor device concerned, an improvement of the detection performance and detection quality of the sensor device involved is achieved.
- Operating mode and detection scope are optimized for the task to be performed. This also increases the quality of the information provided by the processing unit (information platform).
- A spectrum of information users with very different requirements, such as automatic emergency braking systems, adaptive cruise control systems, etc., can thus be served.
- The information between the sensors is transmitted via a central processing unit (information platform), which additionally assumes the sensor control on the basis of a situation detection and description contained in, or supplied to, the processing unit.
- The information transmitted by the processing unit to the sensors is represented in a sensor-independent form. In the sensor, this information is then converted into sensor-specific data.
- The information transmitted by the processing unit to the sensors comprises at least one of the following: for essential operating states, an indication of the spatial detection range (windowing), for example a limitation of the spatial detection range or a windowing of the object data based on at least one criterion such as speed, position, angle, etc.; an indication representing at least one region of interest to be observed, which is preferably substantially smaller than the above-mentioned window; prioritization and/or identification information for these detection areas; a tracking list with identity markers of objects to be observed; and information regarding objects potentially (re-)detected by the sensor, resulting from predictions of object change (e.g. location of entry into the detection area, speed, size, type, etc.).
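The kinds of control information described above (windowing, regions of interest, identification and prioritization data, tracking lists) could be modeled as sensor-independent data structures. This is only a sketch; all field names and the message layout are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class DetectionWindow:
    """Spatial windowing of the detection range (hypothetical fields)."""
    max_range_m: float           # limitation of the spatial detection range
    max_angle_deg: float
    min_speed_mps: float = 0.0   # object-data windowing criterion

@dataclass
class RegionOfInterest:
    """A region to be observed, typically much smaller than the window."""
    roi_id: int                  # identification number assigned by the platform
    center_xy: tuple             # coordinates of a distinguished point
    extent_xy: tuple             # spatial extent of the region
    priority: int = 0            # lower value = process earlier
    speed_mps: Optional[float] = None

@dataclass
class SensorControlMessage:
    """Sensor-independent control data sent from the platform to a sensor;
    the sensor converts this into sensor-specific data."""
    window: Optional[DetectionWindow] = None
    regions: list = field(default_factory=list)
    tracking_ids: list = field(default_factory=list)  # identity markers

msg = SensorControlMessage(
    window=DetectionWindow(max_range_m=150.0, max_angle_deg=15.0),
    regions=[RegionOfInterest(roi_id=1, center_xy=(40.0, 1.5),
                              extent_xy=(4.0, 2.0), priority=0)],
)
```

Such a message would be serialized onto the bus (e.g. CAN) and interpreted by each sensor in its own coordinate frame.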
- the processing unit transmits to the sensors control data for setting a sensor mode.
- There are sensor-specific and sensor-unspecific options for changing the information content of the information provided by the sensor, for example the sensor cycle time, the quality threshold of the information to be supplied, the type of information, the technological method of gathering information, or a prioritization between various information items.
- FIG. 1 shows an overview image of a sensor system with a processing unit (information platform) controlling the sensor system.
- FIGS. 2 to 4 schematically show the effect of the procedure according to the invention, while FIG. 5 outlines a flowchart of the mode of operation of the information platform.
- Description of exemplary embodiments: FIG. 1 shows an overview image of a sensor system which sends sensor signals to a processing unit and receives information for controlling the sensors.
- The sensor system comprises several, at least two, sensors (S1 to SN).
- The sensors are sensors for detecting the surroundings of a vehicle, for example ultrasound sensors, radar sensors, video sensors, etc. These sensors are connected via preferably bi-directional communication links 10 - 18 with a processing unit (information platform, IP).
- The communication links form a bus system, such as CAN, which interconnects the sensors and the processing unit for mutual data exchange.
- the processing unit is part of a control unit SE.
- the processing device is realized by one or more software modules that run on one or more microcomputers of the control unit SE.
- Other sensor systems 20 - 24 supply data to the information platform via corresponding communication links 26 -, for example operating variables such as the vehicle's own speed, which are not detected by the sensor system S1 to SN. The information platform takes these operating variables into account, if necessary, in the formation of information and/or control variables for the sensor system S1 to SN. Furthermore, the information platform determines quantities for different (at least two) functionalities, which are indicated in FIG. 1 by F1 to FN. These functionalities relate to functions such as automatic emergency braking, an adaptive cruise control, a parking aid, a lane departure warning, etc.
- In the preferred embodiment, the data transmitted from the information platform to the functionalities are object data fused from the sensor data, which are then evaluated by the functionalities according to their function.
- The functions F1 to FN control actuators, for example warning elements, brake systems, etc., as indicated by the example of the function F1 in FIG. 1.
- the data connection between the information platform and the various functions also takes place via a bus system, for example a CAN bus system, in a preferred embodiment.
- The data exchange is bi-directional; for example, data relating to the activation state of a function are transmitted by the functions to the information platform.
- In the following, a sensor system consisting of two sensors which detect object data from the surroundings of a motor vehicle is considered.
- The preferred application example includes two different sensors and two functions whose requirements on the processing unit (information platform) differ.
- The sensors used are a monovideo camera and at least one radar sensor, preferably with common (overlapping) or adjacent detection areas.
- Various functions of the motor vehicle which operate as a function of the detected objects and their characteristics, such as an automatic emergency braking system and an adaptive cruise control, are controlled.
- For at least one of the sensors, the detection range is limited, e.g. by specifying a maximum range and/or maximum angles. Object data outside these limits are discarded.
- The information platform determines these values depending on which function is to be supported: with active adaptive cruise control, a comparatively long range with small angles is specified, while with an active parking aid the values for the detection range can be set in the opposite way.
- the combination of a monovideo sensor with a radar sensor improves in particular the plausibility of the detected objects.
- the plausibility check takes place by means of the information platform.
- The radar sensor supplies, for example, speed, angle and/or distance of the detected object to the information platform, which forms from this a list of detection areas. This list of detection areas is transmitted to the video sensor.
- The data include, for example, coordinates of a distinguished point (e.g. center or center of gravity) and the extent of the detection area and/or the speed of the point.
- In addition, an identification number and possibly a priority classification of the detection areas are transmitted.
- The video sensor receives this list and processes the areas contained in it in the specified priority order.
- the video sensor or its evaluation unit analyzes the recorded image in the transmitted detection areas for object recognition.
- All sensors work with the same coordinate system, or a transformation takes place from the coordinate system of the radar sensor to that of the video sensor, and vice versa, preferably in the information platform. If the image analysis of the video sensor yields one or more objects in the transmitted region(s), the data or information, if appropriate together with the identification number, is transmitted to the information platform, where the object (or objects) determined by the radar sensor is thereby made plausible. If the video sensor detects no object in the transmitted detection region(s), it can be assumed that the object detected by the radar sensor is a false target.
- Objects recognized after the plausibility check are further processed by the information platform, which, e.g., transmits the object data or information derived therefrom to the connected functions.
- the above-outlined plausibility check preferably takes place in the information platform, but can also be part of the software of the video sensor.
- the video sensor uses known approaches.
- The information platform can control the video sensor so that the plausibility check of the radar objects proceeds faster, for example by aborting the processing in the video sensor after a few relevant image areas and/or by weakening the plausibility check.
- The latter takes place, for example, by using only location information of the object for the plausibility check, while the speed information is dispensed with.
- the corresponding signals for controlling the sensor are sent from the information platform to the sensor depending on the operating condition (e.g., emergency brake function active or not).
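The full and weakened plausibility checks just described can be sketched as follows. The position and speed tolerances and the dictionary keys are illustrative assumptions; the patent does not specify concrete values or interfaces.

```python
def plausibilize(radar_obj, video_obj, pos_tol=2.0, speed_tol=2.0, weakened=False):
    """Check whether a video detection confirms a radar object.

    In the weakened mode only location information is compared;
    the speed comparison is dispensed with to save processing time.
    (Tolerances are illustrative, not from the patent.)
    """
    dx = radar_obj["x"] - video_obj["x"]
    dy = radar_obj["y"] - video_obj["y"]
    if (dx * dx + dy * dy) ** 0.5 > pos_tol:
        return False          # locations do not match: no confirmation
    if weakened:
        return True           # location match suffices in weakened mode
    return abs(radar_obj["v"] - video_obj["v"]) <= speed_tol

radar = {"x": 40.0, "y": 1.0, "v": 12.0}
video_near = {"x": 40.5, "y": 1.2, "v": 20.0}  # position matches, speed does not
```

Depending on the operating condition (e.g. emergency brake function active), the platform would select the full or the weakened variant.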
- Another possibility for accelerating the plausibility check of the radar sensor's object detection by the video sensor is achieved by preconditioning the video objects. This is preferably carried out by deriving, on the basis of the object data determined by the radar sensor (such as location, speed, direction of movement, etc.), the location of the presumed entry of the object into the detection area of the video sensor; around this location an image area to be examined is formed, which in turn is transmitted to the video sensor.
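The derivation of the presumed entry location can be illustrated with a simple constant-velocity propagation. The circular field-of-view model and all parameter values are simplifying assumptions for the sketch; a real sensor would use its actual angular coverage.

```python
import math

def predicted_entry(obj_xy, obj_v_xy, fov_range, dt=0.1, horizon=5.0):
    """Propagate a radar object with constant velocity and return the
    first predicted position that falls inside the other sensor's
    coverage, modeled here as a circle of radius fov_range around the
    own vehicle (an assumption). Returns None if the object does not
    enter within the prediction horizon (seconds)."""
    x, y = obj_xy
    vx, vy = obj_v_xy
    t = 0.0
    while t <= horizon:
        if math.hypot(x, y) <= fov_range:
            return (x, y)      # presumed entry location for the ROI
        x += vx * dt
        y += vy * dt
        t += dt
    return None
```

The returned point would become the center of the image area transmitted to the video sensor for preferential examination.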
- sensors are used which, on the one hand, detect information (for example, width, curvature, etc.) about the traffic lane, lane or roadway (roadway sensors) and, on the other hand, detect objects (object-detecting sensors).
- A mode control is performed by the information platform switching off at least a part of the lane detection when a driving condition is reached in which this part is not required (e.g., curb detection is required in the city but not on highways, so this part can be switched off there).
- Furthermore, the information platform transmits to the sensor information representing the expected lane marking, to which the sensor or its evaluation adapts. This saves resources.
- Also, a road type may be transmitted (highway, winding road, etc.) to adapt the model for roadway edge detection, so that the quality of the parameter estimation is improved.
- FIG. 2 shows the windowing of the object data to be detected (definition of the detection area).
- FIG. 3 shows the specification of one or more detection regions (ROI), while FIG. 4 shows the preconditioning of detected objects.
- In the figures, 100 denotes the own vehicle on which the sensors are mounted. In FIGS. 2 and 3, a first environmental sensor 102 and its detection region 104 are shown. In FIG. 4, a second environmental sensor 106 with a wider detection region 108 but a smaller range is shown in addition to the sensor 102 and its detection region 104. According to FIG. 2, a windowing of the object data, i.e. a restriction of the detection area for resource reduction, is undertaken.
- The information communicated to the sensor by the information platform includes data representing a boundary of the sensor's detection range, such as minimum and/or maximum values of the coordinates delimiting the detection range, speed values that limit the object data to be considered, and/or road parameters that specify the width of the detection area (two-lane road, four-lane road, etc.).
- the sensor receives this data and forms from it the adapted detection range 110 shown in FIG. 2.
- The detection range is thereby adapted to the respective active function (for example, parking aid or vehicle speed controller) and to the driving situation (e.g., street type).
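The windowing of object data described above amounts to a filter over the sensor's object list. The sketch below shows one way this could look; the dictionary keys and the example limit values for the two functions are illustrative assumptions.

```python
def window_objects(objects, max_range=None, max_angle_deg=None, min_speed=None):
    """Discard object data outside the configured detection window.
    Each object is a dict with 'range_m', 'angle_deg', 'speed_mps'
    (hypothetical keys). Limits that are None are not applied."""
    kept = []
    for o in objects:
        if max_range is not None and o["range_m"] > max_range:
            continue
        if max_angle_deg is not None and abs(o["angle_deg"]) > max_angle_deg:
            continue
        if min_speed is not None and o["speed_mps"] < min_speed:
            continue
        kept.append(o)
    return kept

objs = [
    {"range_m": 80.0, "angle_deg": 2.0, "speed_mps": 20.0},
    {"range_m": 3.0, "angle_deg": 45.0, "speed_mps": 0.0},
    {"range_m": 120.0, "angle_deg": 30.0, "speed_mps": 15.0},
]
# ACC-like window: comparatively long range, small angles
acc = window_objects(objs, max_range=150.0, max_angle_deg=8.0)
# Parking-aid-like window: short range, wide angles
park = window_objects(objs, max_range=5.0, max_angle_deg=60.0)
```

The same object list thus yields different survivors depending on which function's window is active, mirroring the ACC-versus-parking-aid contrast in the text.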
- Data are transmitted to the sensor or sensors by the information platform with respect to at least one detection area to be given particular consideration.
- These data are derived from the data of an object detected by another sensor, for example a radar sensor, and consist, for example, of the coordinates of the center point (or center of gravity) and the extent of the detection area.
- A unique identification number is linked to each detection area or its data. By limiting the processing to a few regions of interest, the resource demand in the sensor, preferably the video sensor, can be reduced and the most important information can thus be generated in a very short time. This is of particular interest for time-critical functions, for example automatic emergency braking, in which object detection and plausibility checking must be carried out very quickly.
- the identification numbers are assigned by the information platform and passed on to the sensor.
- The results determined by the sensor are sent back to the information platform with the identification number: the sensor checks the plausibility of the detected object, or performs its own object recognition, and sends corresponding information back to the information platform under this identification number, so that the information platform can monitor the processing on the basis of the numbers. Only when the processing is done is the identification number re-assigned by the information platform. On the basis of this feedback, the information platform also recognizes an overload in the processing in the sensor if the processing of a task has not been confirmed within a predetermined time period. In the event of an overload, the information platform indicates this to the active function and/or adapts the prioritization values assigned to the sensor, e.g. by waiving specified tasks as outlined above or by performing them only to a limited extent.
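The identification-number bookkeeping and timeout-based overload recognition described above could be realized along these lines. Class and method names are hypothetical; the timeout value and time representation (seconds as floats passed in explicitly) are assumptions for the sketch.

```python
class TaskMonitor:
    """Track outstanding per-ROI tasks by identification number.
    If a sensor does not confirm a task within `timeout` seconds,
    the platform treats that as an overload indication."""

    def __init__(self, timeout):
        self.timeout = timeout
        self.pending = {}      # roi_id -> time the task was issued
        self.next_id = 1

    def issue(self, now):
        """Assign a fresh identification number and record the task."""
        roi_id = self.next_id
        self.next_id += 1
        self.pending[roi_id] = now
        return roi_id

    def confirm(self, roi_id):
        """The sensor reported back under this id; the task is done."""
        self.pending.pop(roi_id, None)

    def overloaded_ids(self, now):
        """Ids whose processing was not confirmed in time."""
        return [i for i, t0 in self.pending.items() if now - t0 > self.timeout]
```

On overload, the platform would notify the active function or lower the sensor's task load, as the text describes.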
- a tracking list of the object data with object identification number is generated by at least one of the sensors or the information platform.
- The data for the windowing and/or the generation of the detection areas are formed by the information platform and transmitted to the at least one other sensor.
- FIG. 3 shows the solution outlined above.
- The vehicle 100 has a sensor system 102 with at least one sensor covering the detection area 104. Plotted are the detection regions (ROI) 112, 114, 116, which are characterized by quantities such as: a midpoint (e.g. 112a, 114a, 116a), optionally with variance values, and a velocity (not shown), optionally with variance values, where the variance values express the uncertainty of the ROIs.
- The detection areas shown, which are formed on the basis of objects detected by another sensor of the sensor system, are evaluated by the evaluation unit of the relevant sensor particularly frequently, or exclusively, for the presence and/or properties of objects in these areas.
- A third option for attention control of the sensor system is the preconditioning of at least one of the sensors. At least one of the sensors of the sensor system transmits object data to the information platform, which the information platform converts into data for another sensor with a different coverage area; these data represent in particular the location of the expected entry of the object into the detection range of that sensor.
- The data transmitted by the information platform to this sensor relate, in the preferred exemplary embodiment, to the location at which the entry of the object into the detection range of the sensor is to be expected and, optionally, the speed and/or direction of movement of the object. This is done in the preferred embodiment in the context of the transmission of data for a special detection area (ROI).
- On the basis of this information, the tracking initialization or, for example, the angle assignment in the sensor can then be optimally adjusted with regard to the object to be newly detected.
- the vehicle 100 has a sensor system with at least two sensors 102 and 106.
- The detection range of the sensor 102 is designated 104, while the sensor 106 has a wider detection range 108 with a shorter range. This is a typical constellation when using a radar sensor 102 and a video camera 106.
- The object detected by the radar sensor 102 is designated 118.
- The information platform therefore determines a particular detection area 120, which is communicated to the sensor 106 and which represents the location of the expected entry of the object into its detection area. The sensor then observes this detection area preferentially (or, together with the other notified detection areas, exclusively) and can therefore already be prepared before the object enters its detection range.
- prioritization data are transmitted with the detection area data, which predetermine the sequence or frequency of the processing of the areas for the sensor.
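The prioritization data that predetermine the processing sequence can be reduced to a simple ordering rule. The tie-breaking by identification number is an assumption for the sketch; the patent only says the priority values predetermine sequence or frequency.

```python
def processing_order(regions):
    """Return ROI ids in the order the sensor should process them:
    ascending priority value (lower = more urgent), ties broken by
    identification number (an assumed convention)."""
    return [r["id"] for r in sorted(regions, key=lambda r: (r["priority"], r["id"]))]

rois = [
    {"id": 3, "priority": 2},
    {"id": 1, "priority": 0},   # e.g. the object closest to the vehicle
    {"id": 2, "priority": 2},
]
```

A frequency-based variant would instead revisit low-priority areas only every n-th cycle, which the same priority values could drive.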
- In this way, however, the cycle time is extended again. In some operating situations it makes sense to receive incomplete information earlier. This is the case, for example, if the affected sensor only serves to check the plausibility of already detected objects. In this case, in one embodiment, it may be sufficient to provide information as soon as an object has been detected, even if the sensor-internal plausibility check for it is not yet completely finished or the object states are only partially known. Corresponding control information is transmitted from the information platform to the sensor concerned.
- The prioritization between lane detection and object detection of a video sensor can be shifted; for example, in cities the lane detection has a lower priority than the object detection. The object detection is then performed more frequently than the lane detection.
- Corresponding information about the operating state is supplied to the information platform (for example, from the connected functions), which in turn transmits corresponding data to the sensor (s) concerned.
- FIG. 5 outlines a flowchart which illustrates the mode of operation of the information platform using the example of the formation and transmission of detection area data.
- the sketched program is run through at predetermined time intervals.
- First, object data are received from a first sensor, for example a radar sensor.
- These object data include data relating to detected objects such as location of the object (for example angular relationships or coordinates), the relative velocity or absolute velocity of the object, the distance to the object, its direction of movement, etc.
- The corresponding data are transmitted as a list.
- In step 202, detection areas (ROI) are formed on the basis of the detected object data. For example, the location of the detected object is evaluated together with variance values, and in this way a detection area is spanned. If the reference systems of the individual sensors and/or the information platform differ, the data must of course be transformed into the corresponding reference systems.
- Another possibility is to use not only the center of the detected object as the basis for the calculation of the detection area, but also its speed: at a greater object speed, a larger detection area is staked out, which in one embodiment is additionally adjusted according to the direction of movement of the object so that it is larger in the direction of movement than to the side or to the rear.
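The speed- and direction-dependent sizing of a detection area can be captured in a small formula. The linear growth law and the scaling factors are illustrative assumptions; the patent only states that the area grows with speed and is elongated along the direction of movement.

```python
def roi_extent(base_extent, speed, heading_deg, k=0.5):
    """Span a detection area around an object: the faster the object,
    the larger the area, elongated along the direction of movement.
    `k` scales how strongly speed enlarges the area (illustrative).
    Returns (length along motion, width across motion, heading in deg)."""
    along = base_extent + k * speed           # grows with speed
    across = base_extent + 0.25 * k * speed   # grows less to the side/rear
    return (along, across, heading_deg)
```

For a stationary object the area stays at its base size; a fast object gets an area stretched along its heading, reducing the chance it leaves the ROI before the next cycle.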
- In step 204, identification numbers are assigned to the individual detection areas. Further, in one embodiment, each detection area is provided with priority values in step 206. The detection area representing the object closest to the vehicle is treated with the highest priority.
- The data on the detection areas are then transmitted to another sensor, such as the video sensor, which performs object detection in the particular detection areas.
- The processing is reported back to the information platform, if necessary with a time stamp, as is the result, for example whether an object detected by the radar sensor could be made plausible or not.
- In the first case, the information is passed on to the downstream functionalities; in the other case it is discarded.
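The overall cycle sketched in the flowchart (receive radar objects, form and transmit ROIs, collect the video sensor's confirmation, forward or discard) can be condensed into a few lines. The `video_detect` callback is a stand-in for the real sensor interface, and the ROI layout is a simplification; both are assumptions.

```python
def platform_cycle(radar_objects, video_detect, functions):
    """One pass of the sketched program: form detection areas (ROI)
    from radar object data, let the video sensor check each one,
    forward plausibilized objects to the functionalities and discard
    the rest. `video_detect(roi)` returns True if the video sensor
    finds an object in the region (hypothetical interface)."""
    results = []
    for i, obj in enumerate(radar_objects, start=1):
        roi = {"id": i, "center": (obj["x"], obj["y"])}
        if video_detect(roi):
            results.append(obj)   # plausibilized: pass on
        # else: discarded as a probable false target
    for f in functions:
        f(results)                # e.g. ACC, emergency braking
    return results

radar_objects = [{"x": 10.0, "y": 0.0}, {"x": 50.0, "y": 2.0}]
received = []
confirmed = platform_cycle(radar_objects,
                           lambda roi: roi["center"][0] < 30.0,
                           [received.append])
```

In the example, only the near object is confirmed by the stand-in video check, so only it reaches the connected functionality.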
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102004038494A DE102004038494A1 (de) | 2004-08-07 | 2004-08-07 | Verfahren und Vorrichtung zum Betreiben eines Sensorsystems |
PCT/EP2005/052728 WO2006015894A1 (de) | 2004-08-07 | 2005-06-14 | Verfahren und vorrichtung zum betreiben eines sensorsystems |
Publications (1)
Publication Number | Publication Date |
---|---|
EP1776602A1 (de) | 2007-04-25 |
Family
ID=35206616
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP05768003A Ceased EP1776602A1 (de) | 2004-08-07 | 2005-06-14 | Verfahren und vorrichtung zum betreiben eines sensorsystems |
Country Status (5)
Country | Link |
---|---|
US (1) | US8193920B2 (de) |
EP (1) | EP1776602A1 (de) |
JP (1) | JP4814234B2 (de) |
DE (1) | DE102004038494A1 (de) |
WO (1) | WO2006015894A1 (de) |
Families Citing this family (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102004038494A1 (de) | 2004-08-07 | 2006-03-16 | Robert Bosch Gmbh | Verfahren und Vorrichtung zum Betreiben eines Sensorsystems |
EP2095351B1 (de) * | 2006-10-13 | 2014-06-25 | Continental Teves AG & Co. oHG | System zur bestimmung von objekten |
DE102006052779A1 (de) * | 2006-11-09 | 2008-05-15 | Bayerische Motoren Werke Ag | Verfahren zur Erzeugung eines Gesamtbilds der Umgebung eines Kraftfahrzeugs |
DE102007002197A1 (de) * | 2007-01-16 | 2008-07-17 | Siemens Ag | Gemeinsamer Kontroller für verschiedene Fahrerassistenzsysteme |
DE102007018470A1 (de) * | 2007-04-19 | 2008-10-23 | Robert Bosch Gmbh | Fahrerassistenzsystem und Verfahren zur Objektplausibilisierung |
FI120605B (fi) * | 2008-02-28 | 2009-12-15 | Elsi Technologies Oy | Menetelmä ja järjestelmä tapahtumien havaitsemiseen |
AT10236U3 (de) | 2008-07-10 | 2009-09-15 | Avl List Gmbh | Messanordnung und verfahren zur erfassung von messdaten |
US8473171B2 (en) * | 2008-10-09 | 2013-06-25 | GM Global Technology Operations LLC | Apparatus and method for optimizing a vehicle collision preparation response |
DE102009009896B4 (de) * | 2009-02-20 | 2011-02-10 | Eads Deutschland Gmbh | Verfahren und Vorrichtung zur Erfassung von Zielobjekten |
DE102009021785B4 (de) * | 2009-05-18 | 2014-10-09 | Airbus Defence and Space GmbH | Verfahren zur Objekterkennung |
ATE545045T1 (de) | 2009-12-17 | 2012-02-15 | Sick Ag | Optoelektronischer sensor |
AU2011207141B2 (en) * | 2010-01-22 | 2013-10-03 | Erema Engineering Recycling Maschinen Und Anlagen Gesellschaft M.B.H. | Method for preparing and detoxifying |
DE102010063984A1 (de) * | 2010-02-11 | 2011-08-11 | Continental Teves AG & Co. OHG, 60488 | Fahrzeug-Sensor-Knoten |
DE102010015731A1 (de) * | 2010-04-21 | 2011-10-27 | Audi Ag | Verfahren zur Steuerung eines Scheinwerfersystems eines Kraftfahrzeugs und Kraftfahrzeug |
WO2011161177A1 (de) | 2010-06-23 | 2011-12-29 | Continental Teves Ag & Co. Ohg | Verfahren und system zur informationsvalidierung |
FR2973847B1 (fr) | 2011-04-11 | 2015-10-30 | Pellenc Sa | Helice de generateur de flux d'air pulse, en particulier pour souffleur portatif. |
DE102012104742A1 (de) * | 2012-06-01 | 2013-12-05 | Continental Safety Engineering International Gmbh | Verfahren und Vorrichtung zur Objektdetektion |
JP2014006114A (ja) * | 2012-06-22 | 2014-01-16 | Denso Corp | レーダ装置、及び、プログラム |
US9720412B1 (en) * | 2012-09-27 | 2017-08-01 | Waymo Llc | Modifying the behavior of an autonomous vehicle using context based parameter switching |
DE102013201545A1 (de) * | 2013-01-30 | 2014-07-31 | Bayerische Motoren Werke Aktiengesellschaft | Erstellen eines Umfeldmodells für ein Fahrzeug |
JP6212880B2 (ja) | 2013-03-04 | 2017-10-18 | 株式会社デンソー | 物標認識装置 |
US9696420B2 (en) * | 2013-04-09 | 2017-07-04 | Ford Global Technologies, Llc | Active park assist object detection |
DE102013219095A1 (de) * | 2013-09-23 | 2015-03-26 | Hella Kgaa Hueck & Co. | Verfahren zur Steuerung einer Lichtverteilung eines Scheinwerfers und Scheinwerfer hierfür |
US9387867B2 (en) * | 2013-12-19 | 2016-07-12 | Thales Canada Inc | Fusion sensor arrangement for guideway mounted vehicle and method of using the same |
KR101727162B1 (ko) * | 2014-02-27 | 2017-04-14 | 한국전자통신연구원 | Apparatus and method for providing a control service |
DE102014107305A1 (de) * | 2014-05-23 | 2015-11-26 | Valeo Schalter Und Sensoren Gmbh | Parking assistance device for a motor vehicle |
KR101621857B1 (ko) | 2014-06-26 | 2016-05-17 | 주식회사 사람과기술 | Intelligent door anti-pinch system for children's school vehicles and operating method |
DE102014009869A1 (de) * | 2014-07-03 | 2016-01-21 | Audi Ag | Method for operating a radar sensor in a motor vehicle, and motor vehicle |
US10634778B2 (en) * | 2014-10-21 | 2020-04-28 | Texas Instruments Incorporated | Camera assisted tracking of objects in a radar system |
DE102015011022B4 (de) * | 2015-08-22 | 2019-11-28 | Audi Ag | Method for operating radar sensors in a motor vehicle, and motor vehicle |
US11255663B2 (en) | 2016-03-04 | 2022-02-22 | May Patents Ltd. | Method and apparatus for cooperative usage of multiple distance meters |
US20170307743A1 (en) * | 2016-04-22 | 2017-10-26 | Delphi Technologies, Inc. | Prioritized Sensor Data Processing Using Map Information For Automated Vehicles |
DE102017205495A1 (de) * | 2017-03-31 | 2018-10-04 | Conti Temic Microelectronic Gmbh | Device and method for focusing sensors at the driving-dynamics limit for a motor vehicle |
JP2019020351A (ja) * | 2017-07-21 | 2019-02-07 | 株式会社Soken | Target recognition device, vehicle control device equipped with a target recognition device, and target recognition method |
KR102390208B1 (ko) * | 2017-10-17 | 2022-04-25 | 삼성전자주식회사 | Method and device for transmitting multimedia data |
US10848718B2 (en) | 2018-03-08 | 2020-11-24 | Aptiv Technologies Limited | Vehicle sensor configuration based on map data |
DE102018107360A1 (de) * | 2018-03-28 | 2019-10-02 | Connaught Electronics Ltd. | Method for transmitting sensor information to functional units, control device, control device arrangement, driver assistance system, and motor vehicle |
KR20210022570A (ko) * | 2018-06-29 | 2021-03-03 | 소니 세미컨덕터 솔루션즈 가부시키가이샤 | Information processing device, information processing method, imaging device, computer program, information processing system, and mobile device |
US20210302570A1 (en) * | 2018-08-09 | 2021-09-30 | Sony Semiconductor Solutions Corporation | Information processing device and information processing method, computer program, information processing system, and mobile device |
WO2020100569A1 (ja) * | 2018-11-14 | 2020-05-22 | ソニー株式会社 | Control device, control method, and sensor control system |
DE102018130916A1 (de) * | 2018-12-05 | 2020-06-10 | Valeo Schalter Und Sensoren Gmbh | Method for prioritized transmission of information detected by a sensor of a motor vehicle to a control unit of the motor vehicle, sensor, driver assistance system, and motor vehicle |
EP3751294B8 (de) * | 2019-06-14 | 2023-09-13 | Rohde & Schwarz GmbH & Co. KG | Device and method for synchronizing sensors |
DE102019123855A1 (de) * | 2019-09-05 | 2021-03-11 | Valeo Schalter Und Sensoren Gmbh | Distance warning and electronic vehicle system |
CN112578781B (zh) * | 2019-09-29 | 2022-12-30 | 华为技术有限公司 | Data processing method and device, chip system, and medium |
JP2023046417A (ja) * | 2020-02-27 | 2023-04-05 | ソニーセミコンダクタソリューションズ株式会社 | Object recognition system and object recognition method |
WO2021181841A1 (ja) * | 2020-03-11 | 2021-09-16 | パナソニックIpマネジメント株式会社 | Distance measuring device |
CN113715753A (zh) * | 2020-05-25 | 2021-11-30 | 华为技术有限公司 | Method and system for processing vehicle sensor data |
EP4421776A1 (de) * | 2023-02-21 | 2024-08-28 | Aptiv Technologies AG | System and method for data acquisition on a vehicle for road monitoring, and vehicle equipped therewith |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10254806A1 (de) * | 2002-11-22 | 2004-06-17 | Robert Bosch Gmbh | Method for information processing |
WO2004055547A1 (de) * | 2002-12-13 | 2004-07-01 | Robert Bosch Gmbh | Method and device for object detection |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04355390A (ja) * | 1991-06-03 | 1992-12-09 | Nissan Motor Co Ltd | Distance measuring device |
JP3123303B2 (ja) * | 1992-07-21 | 2001-01-09 | 日産自動車株式会社 | Vehicle image processing device |
JP3302848B2 (ja) * | 1994-11-17 | 2002-07-15 | 本田技研工業株式会社 | In-vehicle radar device |
JP3600314B2 (ja) | 1995-05-19 | 2004-12-15 | 本田技研工業株式会社 | Vehicle external environment recognition device |
JPH09231379A (ja) | 1996-02-28 | 1997-09-05 | Hitachi Ltd | Driving lane recognition device using image processing |
JPH09264954A (ja) | 1996-03-29 | 1997-10-07 | Fujitsu Ten Ltd | Image processing system using radar |
JPH1090406A (ja) * | 1996-09-13 | 1998-04-10 | Omron Corp | Alarm device |
JPH10244891A (ja) | 1997-03-07 | 1998-09-14 | Nissan Motor Co Ltd | Parking assist device |
GB2328819A (en) | 1997-08-30 | 1999-03-03 | Ford Motor Co | Antenna cluster for vehicle collision warning system |
US6289332B2 (en) * | 1999-02-26 | 2001-09-11 | Freightliner Corporation | Integrated message display system for a vehicle |
JP4563531B2 (ja) * | 1999-10-13 | 2010-10-13 | 富士重工業株式会社 | Vehicle driving support device |
JP3891537B2 (ja) | 2000-05-09 | 2007-03-14 | 株式会社ホンダエレシス | Vehicle side monitoring device |
JP3671825B2 (ja) * | 2000-09-22 | 2005-07-13 | 日産自動車株式会社 | Inter-vehicle distance estimation device |
DE10114470A1 (de) | 2001-03-24 | 2002-09-26 | Bosch Gmbh Robert | Lane-keeping and driving speed control device for motor vehicles |
GB0115433D0 (en) * | 2001-06-23 | 2001-08-15 | Lucas Industries Ltd | An object location system for a road vehicle |
JP2003248055A (ja) * | 2001-12-18 | 2003-09-05 | Hitachi Ltd | Monopulse radar system |
DE60205711T2 (de) | 2001-12-18 | 2006-05-18 | Hitachi, Ltd. | Monopulse radar with beam-spread adjustment |
JP3891011B2 (ja) * | 2002-03-12 | 2007-03-07 | 株式会社デンソー | Cruise control device and program |
JP2004117071A (ja) * | 2002-09-24 | 2004-04-15 | Fuji Heavy Ind Ltd | Exterior monitoring device and travel control device equipped with this exterior monitoring device |
DE102004038494A1 (de) | 2004-08-07 | 2006-03-16 | Robert Bosch Gmbh | Method and device for operating a sensor system |
- 2004-08-07 DE DE102004038494A patent/DE102004038494A1/de not_active Withdrawn
- 2005-06-14 US US11/658,616 patent/US8193920B2/en not_active Expired - Fee Related
- 2005-06-14 WO PCT/EP2005/052728 patent/WO2006015894A1/de active Application Filing
- 2005-06-14 EP EP05768003A patent/EP1776602A1/de not_active Ceased
- 2005-06-14 JP JP2007525271A patent/JP4814234B2/ja not_active Expired - Fee Related
Non-Patent Citations (1)
Title |
---|
See also references of WO2006015894A1 * |
Also Published As
Publication number | Publication date |
---|---|
JP2008509413A (ja) | 2008-03-27 |
DE102004038494A1 (de) | 2006-03-16 |
WO2006015894A1 (de) | 2006-02-16 |
JP4814234B2 (ja) | 2011-11-16 |
US20100007476A1 (en) | 2010-01-14 |
US8193920B2 (en) | 2012-06-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2006015894A1 (de) | Method and device for operating a sensor system | |
DE102017114495B4 (de) | Autonomous driving system | |
EP2242674B1 (de) | Method and assistance system for detecting objects in the surroundings of a vehicle | |
DE102019104974A1 (de) | Method and system for determining a driving maneuver | |
DE102018220724A1 (de) | Vehicle and control method thereof | |
DE102018218220A1 (de) | Control unit for a vehicle | |
EP4025470A1 (de) | Lateral guidance of a vehicle using environment data captured by other vehicles | |
WO2013007536A2 (de) | Method and communication system for receiving data in wireless vehicle-to-environment communication | |
WO2006037687A1 (de) | Method and device for driver assistance | |
DE102010049091A1 (de) | Method for operating at least one sensor of a vehicle, and vehicle having at least one sensor | |
DE102016218934A1 (de) | Method for data exchange and data fusion of environment data | |
DE102018133457B4 (de) | Method and system for providing environment data | |
WO2003105108A1 (de) | Method and device for driver information and for reaction upon lane departure | |
DE102018210779A1 (de) | Method and system for forming an emergency corridor by a vehicle | |
DE102012207864A1 (de) | Method for reducing a risk of traffic congestion | |
DE102008063033B4 (de) | Device and method for detecting collisions with increased functional safety | |
DE102004057060A1 (de) | Driving assistance device and method for detecting oncoming vehicles in the ego lane | |
DE102018121312A1 (de) | Method for partially automated operation of a vehicle, and driver assistance system | |
DE102010003375B4 (de) | Environment evaluation system in a vehicle with sensor means for detecting objects in the surroundings of the vehicle | |
WO2019223833A1 (de) | Control of a motor vehicle | |
DE102013002284B4 (de) | Method for active collision protection of a non-motorized road user | |
DE102019206870A1 (de) | Method for determining a hazard caused by a road user and for initiating monitoring of the road user, and control unit | |
EP2254104A2 (de) | Method for automatically detecting a change in situation | |
DE102020213588A1 (de) | Method and device for operating an automated vehicle | |
DE102018215136B4 (de) | Method for selecting an image section of a sensor |
Legal Events

Date | Code | Title | Description |
---|---|---|---|
| PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
20070307 | 17P | Request for examination filed | |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): DE FR GB |
20070620 | 17Q | First examination report despatched | |
| STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
| DAX | Request for extension of the European patent (deleted) | |
| RBV | Designated contracting states (corrected) | Designated state(s): DE FR GB |
| APBK | Appeal reference recorded | Free format text: ORIGINAL CODE: EPIDOSNREFNE |
| APBN | Date of receipt of notice of appeal recorded | Free format text: ORIGINAL CODE: EPIDOSNNOA2E |
| APBR | Date of receipt of statement of grounds of appeal recorded | Free format text: ORIGINAL CODE: EPIDOSNNOA3E |
| APAF | Appeal reference modified | Free format text: ORIGINAL CODE: EPIDOSCREFNE |
| APBT | Appeal procedure closed | Free format text: ORIGINAL CODE: EPIDOSNNOA9E |
| REG | Reference to a national code | Ref country code: DE; Ref legal event code: R003 |
| RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: ROBERT BOSCH GMBH |
| STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
20190726 | 18R | Application refused | |