EP3559856A1 - Method and device for evaluating an image and providing the evaluation for a driving assist system of a vehicle - Google Patents
Method and device for evaluating an image and providing the evaluation for a driving assist system of a vehicle
- Publication number
- EP3559856A1 (Application EP17821788.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- vehicle
- evaluation
- evaluating
- assistance system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/582—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of traffic signs
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/285—Selection of pattern recognition techniques, e.g. of classifiers in a multi-classifier system
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/87—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using selection of the recognition techniques, e.g. of a classifier in a multiple classifier system
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/584—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2555/00—Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
- B60W2555/20—Ambient conditions, e.g. wind or rain
Definitions
- the present invention relates to a method and a device for evaluating an image and providing the evaluation for a driver assistance system of a vehicle.
- the present invention relates to a system for evaluating an image and providing the evaluation for a driver assistance system.
- the inventive method for evaluating an image and providing the evaluation for a driving assistance system of a vehicle comprises a step of capturing the image, a step of determining an operating state of the vehicle, a step of evaluating the image by means of at least one image evaluation method, which is selected depending on the operating state of the vehicle from at least two possible image evaluation methods, and a step of providing the evaluation of the image as data values for the driving assistance system.
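The sequence of steps above (capture, determine state, select a method from the state, evaluate, provide) can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation; the function names, the speed threshold, and the shape of the operating-state dictionary are all assumptions.

```python
from typing import Callable, Dict, List

# Illustrative evaluation methods; names and return shapes are assumptions.
def evaluate_detailed(image: List[List[int]]) -> Dict[str, object]:
    # More detailed method, suitable at lower speeds.
    return {"method": "detailed", "objects": []}

def evaluate_coarse(image: List[List[int]]) -> Dict[str, object]:
    # Coarser method that still captures the relevant details at speed.
    return {"method": "coarse", "objects": []}

def select_method(operating_state: Dict[str, float]) -> Callable:
    # Selection step: pick one of at least two evaluation methods
    # depending on the operating state (here: a speed threshold).
    return evaluate_detailed if operating_state["speed_kmh"] <= 60 else evaluate_coarse

def run_pipeline(image, operating_state):
    # 310: capture (image passed in), 320: state already determined,
    # 330: evaluate with the selected method, 340: provide the result
    # as data values for the driving assistance system.
    return select_method(operating_state)(image)
```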
- the operating state of the vehicle may include a movement state of the vehicle, such as a speed and/or an acceleration and/or a yaw, pitch and/or roll angle.
- the operating state may include an ambient condition of the vehicle, such as daytime and/or weather-related lighting conditions, a location (in particular country-specific), traffic-infrastructure characteristics such as the state of the traffic route, the type of traffic route (highway, country road, dirt road, etc.) and/or the width of the traffic route.
- the driver assistance system of the vehicle is, for example, an assistance system for lateral control and/or steering control; other assistance systems not listed here are also possible.
- the image represents an environment of the vehicle and the data values represent a description of at least one object in that environment.
- This offers particularly safety-relevant advantages, since objects in the environment of the vehicle, such as other road users (vehicles, pedestrians, etc.) and/or animals and/or traffic signs and/or, in particular, objects that are not permanently present (lost objects, construction sites, etc.), have a significant impact on the safe operation of the vehicle, which is improved by the description of these objects.
- the image is captured as at least one partial image, wherein the at least one partial image comprises a subset of the image properties of the image.
- Image properties are, for example, hues (red, green, blue, etc.) and/or gray values and/or structural values, in particular gradient values, of features contained in the image. A subset means that it comprises at least one image property fewer than the total number of image properties of the image.
- the image is captured as at least two partial images, each of the at least two partial images comprising a different subset of image properties.
- the evaluations of the at least two partial images can be checked for plausibility against one another.
- a first of the at least two subsets comprises gray values and a second of the at least two subsets comprises a color value.
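A partial image carrying only a subset of the image properties can be illustrated as follows, treating the gray values and a single color channel as the two subsets described above. The representation of the image as nested lists of RGB tuples is an assumption made purely for illustration.

```python
def to_partial_images(rgb):
    """Split an RGB image (nested lists of (r, g, b) tuples) into two
    partial images carrying different subsets of image properties:
    one with gray values only, one with the red channel only."""
    gray = [[(r + g + b) // 3 for (r, g, b) in row] for row in rgb]
    red = [[r for (r, g, b) in row] for row in rgb]
    return gray, red
```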
- the operating state of the vehicle includes at least one of the following states: speed of the vehicle, weather conditions in the vicinity of the vehicle.
- the evaluation is carried out by means of the at least one image evaluation method in such a way that the at least one image evaluation method comprises a first evaluation step and a second evaluation step, wherein at least one feature is classified in the at least one partial image by means of the first evaluation step, using the subset of image properties, and the at least one object is determined from the at least one classified feature by means of the second evaluation step.
- the at least one object is a traffic sign and/or a pedestrian and/or another vehicle and/or an animal.
- the device according to the invention for evaluating an image and providing the evaluation for a driving assistance system of a vehicle comprises first means for capturing the image, second means for determining an operating state of the vehicle, third means for evaluating the image by means of at least one image evaluation method, which is selected depending on the operating state of the vehicle from at least two possible image evaluation methods, and fourth means for providing the evaluation of the image as data values for the driving assistance system.
- the first means and/or the second means and/or the third means and/or the fourth means are adapted to carry out a method according to at least one of the method claims.
- the system according to the invention for operating a vehicle comprises a driver assistance system, in particular a control unit, which is adapted to execute a driving assistance function according to at least one of the examples mentioned here.
- Figure 1 shows, purely by way of example, the device according to the invention.
- Figure 2 shows, purely by way of example, a vehicle comprising the device according to the invention and the system according to the invention.
- FIG. 1 shows a device 110 for evaluating 330 an image and providing 340 the evaluation for a driving assistance system 140 of a vehicle 100, with first means 111 for capturing 310 the image, second means 112 for determining 320 an operating state of the vehicle 100, third means 113 for evaluating 330 the image and fourth means 114 for providing 340 the evaluation.
- the first means 111 for capturing 310 an image are designed such that they can receive an image in the form of data values, for example from a camera system 130, and process it accordingly. For this purpose, the first means 111 comprise, for example, a processor, main memory and a storage device with appropriate programs.
- the first means 111 are designed such that the image is captured as at least one partial image, wherein the at least one partial image comprises a subset of the image properties of the image. This is done, for example, by capturing the image by means of a plurality of input channels, each input channel being adapted to capture a particular image property of the image, and only passing the image properties of one input channel to the third means 113. Additionally and optionally, the first means 111 are designed to capture the image and/or the at least one partial image in such a way that the image and/or the at least one partial image is divided into image areas upon capture 310.
- a single grid has a predetermined size, and this size depends, for example, on the operating state of the vehicle.
- a grid has a size of, for example, 2 x 2 pixels or 4 x 4 pixels.
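The dependence of the grid size on the operating state can be sketched like this; the speed threshold and the flat list of cell anchors are illustrative assumptions, not values from the patent.

```python
def grid_cells(width, height, speed_kmh):
    """Divide a width x height image into grid cells whose size is
    chosen from the operating state: a finer 2x2 grid at low speed,
    a coarser 4x4 grid at higher speed (illustrative threshold)."""
    size = 2 if speed_kmh <= 60 else 4
    return [(x, y, size) for y in range(0, height, size)
                         for x in range(0, width, size)]
```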
- a first input channel is designed, for example, to capture gray values with 8 to 16 bits, and/or a second input channel is designed to capture the colors red and/or green and/or blue with 8, 12 or 16 bits, and/or a third input channel is designed to capture gradient values.
- each input channel corresponds to a filter which is designed to capture a predetermined image property.
- the image properties are thereby assigned, for example, to an image area of the image and/or the at least one partial image when capturing 310.
- Gradient values are captured, for example, in that the image comprises an object (for example a round shape) with a specific color and/or gray-value progression, and the slope of this progression, relative to a predefined coordinate system, is captured in a certain image area of the image by assigning the slope as a value to exactly this image area.
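A minimal stand-in for such gradient capture, assigning a slope value to each image area, might look like this. A simple horizontal difference is assumed in place of whatever gradient operator an actual implementation would use.

```python
def gradient_values(gray):
    """Assign to each pixel of a gray-value image the local horizontal
    slope (difference to the left neighbor), a minimal stand-in for
    the gradient values described in the text."""
    h, w = len(gray), len(gray[0])
    grad = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(1, w):
            grad[y][x] = gray[y][x] - gray[y][x - 1]
    return grad
```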
- the first means 111 are designed such that the image is captured as at least two partial images, wherein each of the at least two partial images comprises a different subset of image properties. This is done, for example, by capturing the image by means of at least two input channels, each of the at least two input channels being designed to capture a different image property than the other input channels.
- the capture of the image properties by means of the input channels takes place, for example, by recording the captured image properties as data values, wherein the image properties are assigned to a specific image area of the image, and by forwarding these data values to the third means 113 so that they can be evaluated by the third means 113.
- the second means 112 are designed to determine an operating state of the vehicle 100. This is done, for example, by the second means 112 being connected to a first sensor system 150, which comprises at least one sensor, wherein the first sensor system is designed to determine at least one movement state of the vehicle 100. Furthermore, the second means 112 are connected, for example, to a second sensor system 160, wherein the second sensor system is configured to detect an ambient condition of the vehicle 100.
- the second sensor system 160 includes, for example, a camera and / or a radar sensor and / or an ultrasound sensor and / or a lidar sensor.
- the second sensor system 160 includes, for example, a transmitting and/or receiving unit, which is designed to request and/or receive, via a radio data connection, weather data and/or lighting conditions (dark, light, etc.) in the vicinity of the vehicle 100.
- the transmitting and/or receiving unit may also be designed such that, for transmitting and/or receiving, it uses a unit already present in the vehicle, for example a navigation system.
- the second sensor system 160 includes, for example, a navigation system; an ambient condition of the vehicle 100 is then detected, for example, as shortly becoming dark, in that an object located directly in front of the vehicle 100 in the direction of travel is detected by means of the navigation system.
- for this purpose, the second means 112 and/or the first sensor system 150 and/or the second sensor system 160 are designed, for example by means of a processor, main memory and a memory device comprising corresponding determination software, to capture the operating state of the vehicle 100, such as a movement state and/or an ambient state, in the form of data values and to forward these data values to the third means 113.
- the third means 113 are adapted to evaluate the image and/or the at least one partial image by means of at least one image evaluation method, which is selected depending on the operating state of the vehicle 100 from at least two possible image evaluation methods.
- the selection is made in such a way that predefined profiles are assigned to a specific operating state, which in turn comprise the image evaluation methods suitable for this operating state. For example, one image evaluation method is suitable at speeds of the vehicle up to a predetermined speed; above it, another image evaluation method is suitable since, for example, it evaluates fewer details overall but still all the relevant details even at higher speeds.
- Another image evaluation method is suitable, for example, for evaluating the image and/or the at least one partial image by evaluating colors. If an ambient condition of the vehicle is determined as dark by means of the second means 112, a different image evaluation method is applied instead, which evaluates gray tones, for example, because, due to the dark environment, the image and/or the at least one partial image comprises too few color values according to predetermined criteria.
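The profile-based selection described above can be sketched as a lookup table. The profile names, the speed threshold and the ambient labels are illustrative assumptions, not values taken from the patent.

```python
# Predefined profiles: each operating condition maps to the image
# evaluation method suited to it (all names are illustrative).
PROFILES = {
    ("low_speed", "daylight"): "color_detailed",
    ("high_speed", "daylight"): "color_coarse",
    ("low_speed", "dark"): "gray_detailed",
    ("high_speed", "dark"): "gray_coarse",
}

def select_profile(speed_kmh, ambient):
    # Discretize the movement state, then look up the suitable profile.
    speed_class = "low_speed" if speed_kmh <= 60 else "high_speed"
    return PROFILES[(speed_class, ambient)]
```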
- the evaluation 330 takes place by means of the at least one image evaluation method such that the at least one image evaluation method comprises a first evaluation step and a second evaluation step, wherein at least one feature is classified in the at least one partial image by means of the first evaluation step, using the subset of image properties, and the at least one object is determined from the at least one classified feature by means of the second evaluation step.
- the first evaluation step comprises, for example, the following first partial steps (all partial steps of the first and/or second evaluation step are carried out in such a way that the corresponding processing of data values takes place by means of suitable software, including the image and/or the at least one partial image and/or versions thereof processed by the first means 111, without this being explicitly mentioned at each step):
- a cell comprises at least one image area, wherein a plurality of image areas can also be combined into one cell;
- results of the aggregation are stored as internal intermediate images by storing all cells.
- the cells are stored by means of a first set of functions, and one or more features are classified. For example, if the at least one object in the vicinity of the vehicle 100 is a traffic sign, the following features are classified: closed and round; red on the outside and white on the inside; black symbols within the white area.
- the classification is carried out, for example, by means of at least one of the following methods:
- ACF: a cell evaluated according to this method is called an ACF cell.
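An ACF-style cell value, i.e. a channel aggregated over one cell, can be sketched as follows. Averaging is assumed here as the aggregation operator; the text does not fix a particular one.

```python
def aggregate_cell(channel, x, y, size):
    """Aggregate (here: average) the values of one channel image over
    a size x size cell anchored at (x, y), yielding a minimal
    ACF-style cell value."""
    total = sum(channel[y + dy][x + dx]
                for dy in range(size) for dx in range(size))
    return total // (size * size)
```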
- the second evaluation step comprises, for example, the following substeps:
- each of the stages evaluates each cell and/or each image area according to predetermined criteria in more detail than the stage before. Once all stages have been evaluated, a possible object exists at the current position in the image and/or in the at least one partial image, i.e. in the current cell and/or in the current image area.
- after evaluating all cells and/or image areas, it is known for each of them whether a possible object is present or not.
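The staged evaluation, where each stage looks at a cell in more detail and only cells that survive every stage are reported as possible objects, can be sketched as a threshold cascade; the numeric thresholds and the scalar cell score are purely illustrative assumptions.

```python
def cascade_evaluate(cell_value, stages):
    """Run one cell score through a cascade of stages; each stage
    applies a stricter threshold than the one before. Only a cell
    that passes every stage is reported as a possible object."""
    for threshold in stages:
        if cell_value < threshold:
            return False  # rejected early, later stages are skipped
    return True  # possible object at this cell / image area
```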
- In this example, based on the exemplary features mentioned for the first evaluation step, a traffic sign according to the German StVO is recognized which represents a speed limit. After carrying out the first and/or second evaluation steps, data values which represent the evaluation of the image are forwarded to the fourth means 114.
- the forwarded data values now represent, for example, the information that in the surroundings of the vehicle 100 there is a speed limit with a certain maximum speed.
- the fourth means 114 are designed to provide the evaluation of the image as data values for the driver assistance system 140.
- the fourth means comprise, for example, a processor, main memory and a memory device, and are designed to rewrite and/or modify the data values received from the third means 113 in such a way that they can be processed by the driving assistance system 140; for example, the data format is adjusted depending on the corresponding driver assistance system 140.
- FIG. 2 shows a vehicle 100, which comprises a system 120 for operating the vehicle 100. This comprises a camera system 130 for capturing at least one image, a device 110 for evaluating 330 the at least one image and providing 340 the evaluation for a driving assistance system 140 of the vehicle 100 and the driving assistance system 140 for executing a driving assistance function for operating the vehicle 100.
- the camera system 130 includes, for example, a monocamera and / or a stereo camera and / or both, and is configured to capture images of an environment of the vehicle 100.
- the camera system 130 can be arranged such that a recording of the environment in any direction, starting from the vehicle 100, is possible.
- the camera system 130 includes a plurality of cameras (mono and / or stereo cameras) such that an image of the environment comprising more than one direction from the vehicle (front, back, left, right) is included.
- the camera system 130 is further configured to forward a captured image in the form of data values to the device 110 for evaluating 330 the image and providing 340 the evaluation for a driving assistance system 140.
- the device 110 for evaluating 330 the image and providing 340 the evaluation for a driving assistance system 140 of the vehicle 100 is designed to capture the image in the form of data values, to perform an evaluation of the image depending on an operating state of the vehicle 100, and to provide the evaluation for the driving assistance system 140.
- the determining 320 of the operating state of the vehicle 100 takes place, for example, by means of a first sensor system 150, which comprises at least one sensor, wherein the first sensor system is designed to determine at least one movement state of the vehicle 100.
- the determination 320 of the operating state takes place, for example, by means of a second sensor system, which is designed to detect an ambient state of the vehicle 100.
- the second sensor system 160 includes, for example, a camera and / or a radar sensor and / or an ultrasound sensor and / or a lidar sensor.
- the second sensor system 160 is configured such that it does not include its own sensors, but instead accesses sensors already included in the vehicle 100 that do not belong to the system 120. These may likewise be, for example, a camera and/or a radar sensor and/or an ultrasound sensor and/or a lidar sensor. In a further embodiment, the second sensor system 160 additionally or optionally includes a transmitting and/or receiving unit, which is designed to request and/or receive data, for example weather data, via a radio data connection.
- the system 120 further includes a driver assistance system 140 configured to receive data values representing information about at least one object in the environment of the vehicle 100.
- the driver assistance system 140 is further configured to operate the vehicle 100 depending on these data values.
- the driver assistance system 140 is configured in such a way that it does not operate the vehicle 100 directly, but rather controls already existing control units in the vehicle 100.
- FIG. 3 shows an embodiment of the inventive method 300 for evaluating an image and providing the evaluation for a driving assistance system 140 of a vehicle 100 in the form of a flowchart.
- in step 310, an image is captured, for example by means of a camera system 130.
- in step 320, an operating state of the vehicle 100 is determined.
- the two steps 310 and 320 can also take place in reverse order, the sequence depending, for example, on the design of the device 110 and/or on a presetting made, for example, by a manufacturer or an operator of the vehicle 100. If step 320 is carried out first, the first means 111 for capturing the image may, for example, be embodied such that the capturing 310 of the image already takes place depending on the operating state of the vehicle 100.
- in step 330, the image is evaluated.
- in step 340, the evaluation of the image is provided as data values for the driving assistance system 140.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102016225876.1A DE102016225876A1 (en) | 2016-12-21 | 2016-12-21 | Method and device for evaluating an image and providing the evaluation for a driver assistance system of a vehicle |
PCT/EP2017/080033 WO2018114181A1 (en) | 2016-12-21 | 2017-11-22 | Method and device for evaluating an image and providing the evaluation for a driving assist system of a vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3559856A1 (en) | 2019-10-30 |
Family
ID=60813793
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17821788.1A Pending EP3559856A1 (en) | 2016-12-21 | 2017-11-22 | Method and device for evaluating an image and providing the evaluation for a driving assist system of a vehicle |
Country Status (6)
Country | Link |
---|---|
US (1) | US11113549B2 (en) |
EP (1) | EP3559856A1 (en) |
JP (1) | JP2020513638A (en) |
CN (1) | CN110100248A (en) |
DE (1) | DE102016225876A1 (en) |
WO (1) | WO2018114181A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7240916B2 (en) | 2019-03-22 | 2023-03-16 | ソニーセミコンダクタソリューションズ株式会社 | Information processing device, information processing method, and information processing program |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4604088B2 (en) * | 2004-05-25 | 2010-12-22 | シーメンス アクチエンゲゼルシヤフト | Automobile monitoring unit and support system |
JP2007329762A (en) | 2006-06-08 | 2007-12-20 | Fujitsu Ten Ltd | Apparatus and method for detecting object candidate area, walker recognition apparatus, and vehicle controller |
JP2010224670A (en) * | 2009-03-19 | 2010-10-07 | Honda Motor Co Ltd | Periphery monitoring device for vehicle |
JP2013529348A (en) * | 2010-06-10 | 2013-07-18 | タタ コンサルタンシー サービシズ リミテッド | Lighting invariant and robust apparatus and method for detecting and recognizing various traffic signs |
JP5708689B2 (en) * | 2013-03-13 | 2015-04-30 | 株式会社デンソー | Object detection device |
CN104008377A (en) * | 2014-06-07 | 2014-08-27 | 北京联合大学 | Ground traffic sign real-time detection and recognition method based on space-time correlation |
-
2016
- 2016-12-21 DE DE102016225876.1A patent/DE102016225876A1/en active Pending
-
2017
- 2017-11-22 JP JP2019552338A patent/JP2020513638A/en active Pending
- 2017-11-22 US US16/470,795 patent/US11113549B2/en active Active
- 2017-11-22 CN CN201780079718.6A patent/CN110100248A/en active Pending
- 2017-11-22 WO PCT/EP2017/080033 patent/WO2018114181A1/en unknown
- 2017-11-22 EP EP17821788.1A patent/EP3559856A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
WO2018114181A1 (en) | 2018-06-28 |
CN110100248A (en) | 2019-08-06 |
US20200089975A1 (en) | 2020-03-19 |
DE102016225876A1 (en) | 2018-06-21 |
JP2020513638A (en) | 2020-05-14 |
US11113549B2 (en) | 2021-09-07 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20190722 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: ROBERT BOSCH GMBH |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20211029 |