EP2989590A1 - Procédé et dispositif de détection d'usagers de la route non motorisés - Google Patents

Procédé et dispositif de détection d'usagers de la route non motorisés

Info

Publication number
EP2989590A1
Authority
EP
European Patent Office
Prior art keywords
motor vehicle
driver
message
images
road user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP14718973.2A
Other languages
German (de)
English (en)
Inventor
Christoph Arndt
Uwe Gussen
Frederic Stefan
Goetz-Philipp Wegner
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ford Global Technologies LLC
Original Assignee
Ford Global Technologies LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ford Global Technologies LLC filed Critical Ford Global Technologies LLC

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107Static hand or arm
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/59Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/12Classification; Matching

Definitions

  • The invention relates to a method for the automatic detection of non-motorized road users in the vicinity of a moving motor vehicle on the basis of images which are recorded by means of at least one camera mounted in or on the motor vehicle and which are analyzed for further information contained therein, according to the preamble of claim 1, and to an apparatus for carrying out the method.
  • Such a method is known from DE 10 2007 052 093 A1.
  • In that method, the images are searched for indicators that point to an imminent change in the movement of a detected road user.
  • Such indicators are, e.g., movements or postures of body parts that indicate an expected course of movement and are directly related to the locomotion of the road user. That is, the search is for indicators from which a future movement behavior can be inferred with certainty, or at least with high probability, because it follows physical laws.
  • Variable indicators are considered, such as changes in the center of gravity, changes in arm or leg movements, and changes in head orientation and viewing direction. Taking the movement of the motor vehicle into account, it is also determined whether a collision with a detected non-motorized road user is likely, and the driver of the motor vehicle is warned if necessary.
  • A generic method is also known from EP 2 023 267 B1. Living beings such as pedestrians, cyclists and animals in the vicinity of the motor vehicle are detected and identified on the basis of movement periodicities. In addition, the direction of movement of recognized creatures is determined. If a risk of collision is detected, the driver of the motor vehicle is warned.
  • US 2006/0187305 A1 discloses a likewise generic method for recognizing and tracking faces, facial orientations and emotions. It proposes observing the driver of a motor vehicle and warning him of his own inattention. It also proposes observing the surrounding traffic and warning the driver in unsafe situations, e.g. of pedestrians or obstacles.
  • US 2010/0185341 A1 discloses a method for detecting gestures of a person in or near a motor vehicle to whom the motor vehicle is to react.
  • The method may also recognize, e.g., threatening gestures of a person close to a motor vehicle as a theft intention and take deterrent measures if necessary.
  • The invention is based on the object of supporting the driver of a moving motor vehicle even further in maintaining traffic safety. This object is achieved by a method according to claim 1 and a device according to claim 9. Advantageous developments of the invention are specified in the dependent claims.
  • The invention makes it possible to make the driver of a motor vehicle aware that a non-motorized road user wants to communicate something to him, to which he should possibly react.
  • Non-motorized road users usually attract attention with gestures when they want to give a message to a specific or all motor vehicle operators in their vicinity. Thus, for example, a cyclist can indicate a turning intention with a hand sign so that subsequent or oncoming motor vehicle drivers adjust to it.
  • Motor vehicle drivers who are concentrating on driving can easily miss such messages, especially in confusing traffic situations or when distracted.
  • It is possible for the motor vehicle driver to be made aware only of the existence of a traffic-related visual message, so that he must himself ascertain what it is; preferably, however, he is also informed about the content of the recognized message, e.g. visually, or acoustically by means of speech synthesis or the like. That is, the message is preferably not only recognized as such but also interpreted.
  • The invention improves not only road safety but also, in general, the communication of non-motorized road users with motor vehicle drivers.
  • The invention may, e.g., make it easier for taxi drivers to become aware of potential customers who are standing by the side of the road and signal by hand that they want a ride. Or, e.g., a traffic policeman who instructs a motor vehicle driver by hand signals to stop can be recognized.
  • The invention may provide driver assistance systems or vehicle safety systems with useful additional information.
  • Non-motorized road users are understood here to mean persons who are on or near the road (i.e. its carriageway) and in the field of vision of the camera(s), such as pedestrians and cyclists, whether they are moving or not.
  • Preferably, only persons whose entire body is visible to the camera(s) are considered. Persons in the immediate vicinity of the motor vehicle, for example at a distance of less than 10 m, are disregarded, because in such cases automatic message recognition would be unreliable and generally unnecessary.
  • The distance below which persons are disregarded can also be varied as a function of speed and/or adapted to the ambient or driving situation.
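The speed-dependent exclusion distance described above can be sketched as follows; the linear relation, parameter values and function name are illustrative assumptions, since the patent only states that the threshold (e.g. 10 m) may vary with speed and situation.

```python
def min_consideration_distance(speed_kmh, base_m=10.0, factor=0.2):
    """Hypothetical rule for the distance below which persons are ignored.

    The patent only states that the threshold (e.g. 10 m) may be varied
    with vehicle speed; the linear relation here is an assumption made
    for illustration.
    """
    return base_m + factor * max(0.0, speed_kmh)
```

At standstill this yields the 10 m mentioned in the text; at higher speeds the exclusion zone grows, reflecting that close-by gestures matter less when they cannot be reacted to in time.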
  • By traffic-related visual messages are meant primarily certain gestures of persons visible in the camera images, indicating that somebody specifically wants to alert the driver of the motor vehicle, or any other road users in general, to something that may require a reaction from the driver or other road users.
  • Preferably, traffic-related messages are not only recognized from gestures; other visual indicators are also taken into account, in particular the type, location, body orientation, head orientation, viewing direction, gesture and/or equipment (i.e. special clothing, headgear and/or objects in the hand) of a gesticulating person. If such indicia are present in one of a number of prestored combinations, where the nature, number and/or strength of the indicia may play a role, it is assumed that there is a traffic-related message relevant to the driver of the motor vehicle.
  • Traffic-related visual messages are something that someone is deliberately doing at that moment to communicate with one or more other road users, and are thus quite different from the more or less involuntary movement changes described in the closest prior art cited above.
  • Preferably, only indicia that are substantially static, i.e. that do not change so quickly that they would qualify as motions, are considered.
  • Preferably, the type, body orientation, head orientation, viewing direction, gesture and/or equipment of a non-motorized road user are identified by comparing their contours with prestored patterns.
  • Depending on the situation, the response to a detected message consists in a notification or warning of the driver and/or a pre-activation or activation of a driver assistance system.
  • The motor vehicle driver and/or a learning algorithm can configure the situations in which a notification, warning or driver assistance is to be given.
  • FIG. 1 is an overview sketch of a system for the automatic recognition of messages from non-motorized road users in a motor vehicle
  • FIGS. 2a-2c show different contours of potentially relevant road users
  • FIG. 3 shows a perspective view of a road with non-motorized road users as seen from a motor vehicle on the road
  • FIGS. 4a-4c show some possible head orientations and viewing directions of a non-motorized road user
  • FIGS. 5a-5c show some possible body orientations of a non-motorized road user
  • FIGS. 6a-6d show some possible arm positions of a non-motorized road user
  • FIG. 7 shows an example of a classification matrix for the indicia evaluation
  • FIG. 8 shows a flow chart of an example of a method for the automatic detection of non-motorized road users in the vicinity of a moving motor vehicle on the basis of camera images.
  • The message recognition system shown in FIG. 1 comprises one or more cameras 1, which are installed in or on a motor vehicle 2 and can visually detect the surroundings of the motor vehicle 2. In particular, surrounding areas are detected in which pedestrians, cyclists and other non-motorized road users 3 may be located. For this purpose, at least one camera 1 takes pictures in the direction of travel of the motor vehicle 2.
  • An image acquisition module 4 performs pre-processing of the captured images by filtering, etc.
  • An image analysis and feature extraction module 5 first performs a preliminary analysis of the preprocessed images to determine whether any non-motorized road users 3 are visible substantially with their entire body, whether they are on or near the road on which the motor vehicle 2 is traveling, whether such a road user 3 gives certain prestored gestures or signs, and whether certain gestures or signs obviously apply or are relevant to the driver 9 of the motor vehicle 2 (this can be recognized from the type and location of the non-motorized road user 3 and from the viewing direction and body or arm orientation). Thereafter, the image analysis and feature extraction module 5 performs a detailed analysis of the images, possibly using further images from the same or another camera 1, in order to refine the analysis and classify the gestures or signs found by the pre-analysis according to their type.
  • A classification module 6 classifies the found gestures or signs according to relevance. For example, a gesture of a traffic policeman indicating that the driver 9 should stop the motor vehicle 2 is very relevant, whereas a person merely waving a greeting is not. For this classification, features of the gesticulating persons are also taken into account, e.g. special clothing, headgear and/or objects in the hand such as a signalling paddle (Winkerkelle).
  • History data and reference data, e.g. reference images and classification trees, are stored in a database 7.
  • A suitable man-machine interface 8 notifies the driver 9 of the motor vehicle 2 by means of audible or visual signals that he should pay attention to a gesture of a non-motorized road user 3; if the system has also recognized the meaning of the gesture, e.g. that a hitchhiker wants a ride, the specific meaning of the gesture can also be communicated to the driver 9.
  • If the motor vehicle 2 is equipped with augmented reality technology, the non-motorized road user 3 may also be visually highlighted, and the nature and/or importance of the gesture may be indicated by certain colors, e.g. red for very important.
  • The classification result of the classification module 6 can be used by a driver assistance system 10 to improve and refine decisions and actions.
  • An example of this is a motor vehicle 2 approaching a school, where a school crossing guard stands at or on the road and waves a signalling paddle.
  • The message contained therein, to drive more slowly and stop, can be recognized by the message recognition system installed in the motor vehicle 2 on the basis of the special clothing of the crossing guard (yellow vest) and the signalling paddle.
  • This important message is communicated to the driver 9 of the motor vehicle 2 via the man-machine interface 8. If the driver 9 does not react immediately, a brake assist is preactivated, and if the driver 9 still does not respond, the brake assist can automatically decelerate the vehicle 2 to avoid a collision.
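The escalation described above, from notification through brake-assist pre-activation to autonomous braking, might be sketched as a simple decision rule; the timing thresholds and all names below are hypothetical, not taken from the patent.

```python
from enum import Enum, auto

class Response(Enum):
    NOTIFY = auto()
    PREACTIVATE_BRAKE = auto()
    AUTO_BRAKE = auto()

def escalate(driver_reacted, seconds_since_warning,
             preactivate_after=1.0, brake_after=2.0):
    """Illustrative escalation: warn first, pre-activate the brake
    assist if the driver does not react, then brake autonomously.
    The timing thresholds are assumptions made for illustration."""
    if driver_reacted:
        return Response.NOTIFY
    if seconds_since_warning >= brake_after:
        return Response.AUTO_BRAKE
    if seconds_since_warning >= preactivate_after:
        return Response.PREACTIVATE_BRAKE
    return Response.NOTIFY
```

The point of the staged design is that the system never jumps straight to autonomous braking: the driver is always given a chance to react first.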
  • The interpretation of visual messages from non-motorized road users 3 to the motor vehicle driver 9 occurs in three steps:
  • The first step is to identify non-motorized road users 3 who probably want to visually communicate something to the motor vehicle driver 9.
  • The second step is to identify and interpret visual messages to the motor vehicle driver 9.
  • The third step is to inform the motor vehicle driver 9 about an identified message.
  • In the first step, the images obtained by means of the camera(s) 1 in the motor vehicle 2 are evaluated using various image processing algorithms, as described below.
  • The relevance of a road user for a motor vehicle driver 9 also depends on the road user's position on or near the road. As shown in FIG. 3, only non-motorized road users located either on the carriageway A in FIG. 3 or on a narrow strip B in FIG. 3 adjoining the carriageway A on the right are considered within the scope of the invention. Strips A and B may be distinguished, e.g., by curbs. In Fig. 3, a cyclist 11 rides ahead of the motor vehicle 2 on strip A, and a pedestrian 12 stands on strip B. Persons farther away from the carriageway A are disregarded, as shown.
  • FIGS. 4a to 4c show various head orientations and viewing directions of a person with respect to a motor vehicle 2 from which the person is filmed. As can be seen, only the person shown in Fig. 4b, who is turned toward the motor vehicle 2 in both head orientation and viewing direction, is obviously paying attention to the motor vehicle 2.
  • FIGS. 5a-5c show various possible body orientations and viewing directions of a pedestrian with respect to the motor vehicle 2, namely in FIG. 5a a pedestrian oriented transversely to the motor vehicle 2, in FIG. 5b a pedestrian oriented frontally toward the motor vehicle 2, and in FIG. 5c a pedestrian partially oriented toward the motor vehicle 2.
  • The pedestrian shown in Fig. 5b most likely wishes to communicate with the motor vehicle driver 9.
  • To determine the body orientation, a plane is generated through the shoulders of each pedestrian; these planes are shown as dashed lines below the pedestrian contours in FIGS. 5a-5c.
  • Then a vector is generated which is perpendicular to the respective shoulder plane and passes through the vertical axis of symmetry of the pedestrian, shown as arrows below the pedestrian contours in Figures 5a-5c. It is then determined whether this vector points toward the motor vehicle 2 or not. If so, the pedestrian is oriented toward the motor vehicle 2 and may wish to communicate with the motor vehicle driver 9. This is most likely for the pedestrian shown in Figure 5b and somewhat less likely for the pedestrian shown in Figure 5c.
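The shoulder-plane orientation test described above can be sketched in 2D ground-plane coordinates; the angular tolerance and the choice of normal direction are assumptions made for illustration, as the patent specifies neither.

```python
import math

def faces_vehicle(shoulder_left, shoulder_right, vehicle_pos,
                  person_pos, max_angle_deg=30.0):
    """Check whether the shoulder-plane normal points toward the vehicle.

    All positions are (x, y) ground-plane coordinates. The tolerance of
    30 degrees and the assumption that the chosen normal is the forward
    one are illustrative, not taken from the patent.
    """
    # Shoulder direction and one of its two perpendiculars
    sx = shoulder_right[0] - shoulder_left[0]
    sy = shoulder_right[1] - shoulder_left[1]
    nx, ny = -sy, sx  # assumed to be the forward-facing normal
    # Direction from the person toward the vehicle
    vx = vehicle_pos[0] - person_pos[0]
    vy = vehicle_pos[1] - person_pos[1]
    dot = nx * vx + ny * vy
    norm = math.hypot(nx, ny) * math.hypot(vx, vy)
    if norm == 0:
        return False
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= max_angle_deg
```

With the tolerance parameter, the graded cases of Figs. 5b and 5c can both be captured by widening or narrowing the accepted angle.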
  • An indicia evaluation then takes place, in which the obtained indications that a non-motorized road user is a candidate messenger are considered in combination with each other.
  • A search is made for predetermined combinations of features of the type, location and orientation of the road user.
  • A suitable learning algorithm, e.g. in the form of a neural network, could improve the decision logic.
  • Some indicators may be weighted more heavily than others. For example, pedestrians on the street can be weighted more heavily than pedestrians on the sidewalk.
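The weighted combination of indicators might look as follows; the indicator names, weight values and threshold are invented for illustration, since the patent only states that some indicators (e.g. pedestrians on the street) can be weighted more heavily than others.

```python
# Hypothetical indicator weights; the patent does not specify values,
# only that, e.g., a pedestrian on the street weighs more than one on
# the sidewalk.
WEIGHTS = {
    "on_carriageway": 3.0,
    "on_sidewalk_near_curb": 1.5,
    "facing_vehicle": 2.0,
    "gesturing": 2.5,
    "special_clothing": 1.0,
}

def relevance_score(indicators):
    """Sum the weights of all indicators observed for one road user."""
    return sum(WEIGHTS.get(name, 0.0) for name in indicators)

def is_potential_messenger(indicators, threshold=4.0):
    """Assumed decision rule: treat the road user as a candidate
    messenger when the weighted score reaches a threshold."""
    return relevance_score(indicators) >= threshold
```

A single weak indicator (e.g. only special clothing) is not enough under this rule; the combination of location and orientation is what pushes a road user over the threshold.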
  • FIG. 7 shows an example of a classification matrix for the indicia evaluation for three scenarios involving non-motorized road users who are relevant to the message recognition system and are candidate messengers because they may want to communicate something visually to the motor vehicle driver 9.
  • Thin short-dashed lines represent a pedestrian who is on the road and looking toward the motor vehicle 2.
  • Thick long-dashed lines represent a pedestrian on the sidewalk near the curb who is looking toward the motor vehicle 2, similar to the pedestrian 12 in FIG. 3.
  • Thick solid lines represent a cyclist riding ahead on the road, like the cyclist 11 in Fig. 3.
  • The classification matrix shown in Fig. 7 is used for the above-mentioned first step, identifying non-motorized road users 3 who probably want to communicate something to the motor vehicle driver 9 at that moment, using only the lines in the first three rows. Regions in FIG. 7 through which no such lines pass can be ignored. Feature combinations in the regions traversed by lines are compared with prestored feature combinations in order to determine the indicative force of these feature combinations, wherein the individual features can be weighted differently.
  • For the identification and interpretation, arm positions are used first, some of which are illustrated in Figures 6a-6d for a pedestrian as seen in Figure 5b: both arms hanging (Figure 6a), one arm slightly raised, i.e. up to 30° (Figure 6b), one arm raised mid-high, i.e. 30° to 60°, one arm raised high, i.e. more than 60° (Figure 6c), and one arm waving (Figure 6d).
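The arm-position classes of Figs. 6a-6d can be sketched as a simple binning function; the handling of the exact class boundaries and the class labels are assumptions for illustration.

```python
def classify_arm_position(angle_deg, waving=False):
    """Map a raised-arm angle to the classes sketched in Figs. 6a-6d:
    hanging, slightly raised (up to 30 deg), mid-high (30-60 deg),
    high (over 60 deg), or waving. Boundary handling is an assumption."""
    if waving:
        return "waving"
    if angle_deg <= 0:
        return "hanging"
    if angle_deg <= 30:
        return "slightly raised"
    if angle_deg <= 60:
        return "mid-high"
    return "high"
```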
  • The identification and interpretation also take into account whether the left or right arm of the non-motorized road user is in motion. For example, a cyclist riding ahead or oncoming with his left arm outstretched wants to indicate that he intends to turn left and that following or oncoming vehicles should be prepared for it.
  • Certain headgear is also considered, such as helmets or visors, for which the image analysis focuses on the head area 13 outlined in Fig. 5b.
  • Certain body features are considered, such as stripes, badges or certain words such as "police", for which the image analysis focuses on the trunk area 14 outlined in Fig. 5b.
  • Certain hand equipment is taken into account, such as a signalling paddle, a sign with a city name, a piece of luggage, a stretcher, etc., for which the image analysis focuses on the hand areas 15 outlined in Fig. 5b.
  • Colors and color patterns may also be taken into account, as they are characteristic of the uniforms of traffic police, firefighters, road workers, etc., possibly also depending on the vehicle location, because such colors and color patterns are country-specific.
  • In this way, a vector of feature combinations results for each scenario, which is compared with prestored feature combinations. Since certain characteristics often belong together, e.g. a traffic policeman wears typical clothing and headgear, or a cyclist often wears a helmet, it can thus be reliably determined whether a non-motorized road user is giving the motor vehicle driver 9 a message, and which one.
  • For example, the pedestrian represented in FIG. 7 by thick long-dashed lines, who is on the sidewalk near the curb, looking at the motor vehicle 2, waving his right hand and possibly carrying a piece of luggage, is a potential customer for a taxi driver.
  • A pedestrian close to the road holding one arm slightly raised and holding a sign with a city name on it is probably a hitchhiker, especially if a piece of luggage is nearby.
  • A pedestrian on the street with no particular arm movement who wears white clothing and is not alone may be a paramedic, especially if a stretcher is nearby.
  • Scenarios that are inconsistent and do not allow unambiguous interpretation are either ignored by the message recognition system, or further analysis is performed, for example as follows: are there any special objects in the vicinity of the gesticulating person (e.g. a broken-down vehicle, a warning triangle, etc.)?
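The matching of observed feature combinations against prestored ones, including the inconclusive case, might be sketched as follows; the stored combinations, message texts and the overlap-based similarity measure are illustrative assumptions, as the patent specifies neither a data format nor a matching algorithm.

```python
# Hypothetical prestored feature combinations with their messages.
PRESTORED = {
    ("yellow_vest", "hand_sign_paddle"): "crossing guard: slow down / stop",
    ("uniform", "arm_high"): "traffic police: stop",
    ("waving", "luggage"): "potential taxi customer",
    ("arm_slightly_raised", "city_sign"): "hitchhiker",
}

def interpret(features):
    """Return the message of the best fully-matched prestored
    combination, scored by the number of required features present
    (an assumed similarity measure), or None if nothing matches
    unambiguously."""
    best_msg, best_overlap = None, 0
    for combo, message in PRESTORED.items():
        overlap = len(set(combo) & set(features))
        if overlap == len(combo) and overlap > best_overlap:
            best_msg, best_overlap = message, overlap
    return best_msg
```

Returning None for an incomplete match corresponds to the inconclusive scenarios above, which are either ignored or subjected to further analysis.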
  • The system can inform the driver in different ways. For example, there can be three different reaction modes:
  • 1. Notification: The driver receives an audible or visual message that a specific message has been detected.
  • The driver can be informed merely about the existence of the message, or also about its content, i.e. who gives which message and why. For example, the driver receives the notification that a hitchhiker wants a ride, or a taxi driver receives an indication of a potential customer.
  • 2. Warning: The driver receives a particularly clear audible or visual message that a particular message has been detected that requires a reaction from the driver. For example, the driver is notified that a road worker is asking for slow driving or that a traffic policeman is instructing him to stop.
  • 3. Driver assistance: If the driver does not respond to a warning, e.g. because he is too distracted, a driver assistance system is pre-activated. In such a case, for example, a brake assistant can bring the brake pads closer to the brake discs.
  • The situations in which a notification, warning or driver assistance is given may be configurable by the driver. Additionally or alternatively, such a configuration may be supplemented or refined by a learning algorithm that observes and analyzes the driver's responses to messages over an extended period of time. Such an algorithm may also supplement classes of situations with situations in which it has detected similar driver behavior, set up new classes of situations, or change default behaviors according to detected driver behavior.
  • In step S1, video images from one or more cameras are read in.
  • In step S2, non-motorized road users are extracted from the images on the basis of their contours. From the contours, the type, location and orientation of the detected road users are determined in the parallel steps S3, S4 and S5, and on this basis the road users are classified in step S6. In step S7, it is determined whether one of the road users is a candidate messenger. If not, the method returns to step S2; if so, the road user's characteristics are stored in step S8, and a possible message is determined from them in step S9. In steps S10 to S15, parallel analyses of arm positions, arm movements, headgear, hand equipment, body equipment and color patterns of the candidate road user are performed, and in step S16 the message is classified. In step S17, it is determined whether everything is conclusive.
  • If so, the way of informing the driver is determined in step S18, the driver is notified of the detected message in step S19, and in step S20 the method returns to step S1. If it is determined in step S17 that not everything is conclusive, the driver's reaction is observed in step S21 and, in step S22, the classification and interpretation modes are supplemented or modified in accordance with the driver's response.
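The flow of Fig. 8 can be sketched as one pass of a recognition loop; the individual analyses are injected as callables because the patent does not prescribe concrete algorithms for them, and all names are illustrative.

```python
def message_recognition_cycle(read_images, extract_road_users,
                              classify_user, analyze_gesture,
                              classify_message, notify_driver):
    """Sketch of one pass through the flow of Fig. 8 (steps S1-S20).

    Each stage is passed in as a callable; this is a structural
    illustration, not the patent's implementation."""
    images = read_images()                      # S1: read camera images
    for user in extract_road_users(images):     # S2-S5: contours, type, location, orientation
        if not classify_user(user):             # S6/S7: candidate messenger?
            continue
        features = analyze_gesture(user)        # S8-S15: arms, equipment, colors
        message = classify_message(features)    # S16: classify the message
        if message is not None:                 # S17: conclusive?
            notify_driver(message)              # S18/S19: inform the driver
```

A usage example with stubbed stages: only the cyclist passes the candidate test, and its outstretched left arm is interpreted as a turn indication.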


Abstract

The invention relates to a method for the automatic detection of non-motorized road users (3) in the surroundings of a moving motor vehicle (2) on the basis of images which are recorded by at least one camera (1) mounted in or on the motor vehicle (2) and which are analyzed for further information contained therein. According to the invention, the images of a non-motorized road user (3) detected in the surroundings of the motor vehicle (2) are analyzed in order to determine whether that road user (3) is visibly giving, at that moment, a traffic-related visual message to the driver (9) of the motor vehicle (2) and/or to another road user. The driver (9) of the motor vehicle (2) is informed when such a message is detected.
EP14718973.2A 2013-04-22 2014-04-22 Procédé et dispositif de détection d'usagers de la route non motorisés Withdrawn EP2989590A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102013207223.6A DE102013207223A1 (de) 2013-04-22 2013-04-22 Verfahren zur Erkennung von nicht motorisierten Verkehrsteilnehmern
PCT/EP2014/058059 WO2014173863A1 (fr) 2013-04-22 2014-04-22 Procédé et dispositif de détection d'usagers de la route non motorisés

Publications (1)

Publication Number Publication Date
EP2989590A1 true EP2989590A1 (fr) 2016-03-02

Family

ID=50543584

Family Applications (1)

Application Number Title Priority Date Filing Date
EP14718973.2A Withdrawn EP2989590A1 (fr) 2013-04-22 2014-04-22 Procédé et dispositif de détection d'usagers de la route non motorisés

Country Status (5)

Country Link
US (1) US20160012301A1 (fr)
EP (1) EP2989590A1 (fr)
CN (1) CN105283883A (fr)
DE (1) DE102013207223A1 (fr)
WO (1) WO2014173863A1 (fr)

Families Citing this family (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102015004605B4 (de) * 2015-04-08 2021-01-14 Audi Ag Verfahren zum Betrieb eines Steuersystems einer Mobileinheit und Mobileinheit
DE102015212364A1 (de) * 2015-07-02 2017-01-05 Conti Temic Microelectronic Gmbh Verfahren zum Lokalisieren eines Zweirades, insbesondere eines Fahrrades
US10272921B2 (en) * 2015-08-25 2019-04-30 International Business Machines Corporation Enriched connected car analysis services
DE102015225082A1 (de) * 2015-12-14 2017-06-14 Bayerische Motoren Werke Aktiengesellschaft Verfahren zum Betreiben eines Kraftfahrzeugs in einem vollautomatisierten Fahrmodus und Kraftfahrzeug mit einem vollautomatisierten Fahrmodus
DE102016215587A1 (de) 2016-08-19 2018-02-22 Audi Ag Verfahren zum Betreiben eines zumindest teilautonom betriebenen Kraftfahrzeugs und Kraftfahrzeug
DE102016217770A1 (de) 2016-09-16 2018-03-22 Audi Ag Verfahren zum Betrieb eines Kraftfahrzeugs
CN110312952B (zh) 2017-02-20 2022-09-27 3M创新有限公司 光学制品和与其交互的系统
DE102017208728A1 (de) 2017-05-23 2018-11-29 Audi Ag Verfahren zur Ermittlung einer Fahranweisung
JP6613265B2 (ja) * 2017-06-01 2019-11-27 本田技研工業株式会社 予測装置、車両、予測方法およびプログラム
JP6796201B2 (ja) * 2017-06-02 2020-12-02 本田技研工業株式会社 予測装置、車両、予測方法およびプログラム
JP2019008519A (ja) * 2017-06-23 2019-01-17 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America 移動体検出方法、移動体学習方法、移動体検出装置、移動体学習装置、移動体検出システム、および、プログラム
DE102017216000A1 (de) * 2017-09-11 2019-03-14 Conti Temic Microelectronic Gmbh Gestensteuerung zur Kommunikation mit einem autonomen Fahrzeug auf Basis einer einfachen 2D Kamera
WO2019058446A1 (fr) 2017-09-20 2019-03-28 本田技研工業株式会社 Dispositif de commande de véhicule, procédé de commande de véhicule et programme
CN111164605A (zh) 2017-09-27 2020-05-15 3M创新有限公司 使用光学图案以用于设备和安全监控的个人防护设备管理系统
US10717412B2 (en) * 2017-11-13 2020-07-21 Nio Usa, Inc. System and method for controlling a vehicle using secondary access methods
DE102017222288A1 (de) 2017-12-08 2019-06-13 Audi Ag Verfahren zur Organisation mehrerer Fahrzeuge einer Fahrzeugflotte zur Personenbeförderung und Servereinrichtung zum Durchführen des Verfahrens
CN109969172B (zh) * 2017-12-26 2020-12-01 华为技术有限公司 车辆控制方法、设备及计算机存储介质
JP6989418B2 (ja) * 2018-03-12 2022-01-05 矢崎総業株式会社 車載システム
CN110443833B (zh) * 2018-05-04 2023-09-26 佳能株式会社 对象跟踪方法和设备
US11545032B2 (en) * 2018-05-25 2023-01-03 Sony Corporation Roadside apparatus and vehicle-side apparatus for road-to-vehicle communication, and road-to-vehicle communication system
US11887477B2 (en) 2018-07-25 2024-01-30 Motorola Solutions, Inc. Device, system and method for controlling autonomous vehicles using a visual notification device
CN109389838A (zh) * 2018-11-26 2019-02-26 爱驰汽车有限公司 无人驾驶路口路径规划方法、系统、设备及存储介质
CN109859527A (zh) * 2019-01-30 2019-06-07 杭州鸿泉物联网技术股份有限公司 一种非机动车转弯预警方法及装置
DE102019204616A1 (de) * 2019-04-01 2020-10-01 Volkswagen Aktiengesellschaft Verfahren zum Betreiben einer Anzeigevorrichtung eines Kraftfahrzeugs, bei welchem Lebewesen-Symbole angezeigt werden, sowie Anzeigevorrichtung
CN112530172A (zh) * 2019-09-17 2021-03-19 奥迪股份公司 预防骑行目标碰撞前车的安全提示系统和安全提示方法
CN113129597B (zh) * 2019-12-31 2022-06-21 深圳云天励飞技术有限公司 一种机动车道违法车辆识别方法及装置
CN112037364A (zh) * 2020-08-19 2020-12-04 泰州德川绿化养护有限公司 一种行车时安全提醒的方法及行车记录装置
CN112435475B (zh) * 2020-11-23 2022-04-29 北京软通智慧科技有限公司 一种交通状态检测方法、装置、设备及存储介质

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8599266B2 (en) 2002-07-01 2013-12-03 The Regents Of The University Of California Digital processing of video images
DE102006008981A1 (de) * 2006-02-23 2007-08-30 Siemens Ag Assistenzsystem zur Unterstützung eines Fahrers
US20090180668A1 (en) * 2007-04-11 2009-07-16 Irobot Corporation System and method for cooperative remote vehicle behavior
JP4470067B2 (ja) * 2007-08-07 2010-06-02 本田技研工業株式会社 対象物種別判定装置、車両
DE102007052093B4 (de) * 2007-10-31 2023-08-10 Bayerische Motoren Werke Aktiengesellschaft Erkennung von spontanen Bewegungsänderungen von Fußgängern
US20100185341A1 (en) 2009-01-16 2010-07-22 Gm Global Technology Operations, Inc. Vehicle mode activation by gesture recognition
US8351651B2 (en) * 2010-04-26 2013-01-08 Microsoft Corporation Hand-location post-process refinement in a tracking system
CN102096803B (zh) * 2010-11-29 2013-11-13 吉林大学 基于机器视觉的行人安全状态识别系统
CN102685516A (zh) * 2011-03-07 2012-09-19 李慧盈 立体视觉主动安全辅助驾驶方法
CN102765365B (zh) * 2011-05-06 2014-07-30 香港生产力促进局 基于机器视觉的行人检测方法及行人防撞预警系统
TW201328340A (zh) * 2011-12-27 2013-07-01 Hon Hai Prec Ind Co Ltd 乘客攔車提示系統及方法
DE102012016240A1 (de) * 2012-08-16 2014-02-20 GM Global Technology Operations LLC (n. d. Gesetzen des Staates Delaware) Assistenzsystem für ein Kraftfahrzeug und Verfahren zur Steuerung eines Kraftfahrzeugs
JP5561396B1 (ja) * 2013-02-19 2014-07-30 日本電気株式会社 運転支援システムおよび運転支援方法
WO2014132747A1 (fr) * 2013-02-27 2014-09-04 日立オートモティブシステムズ株式会社 Dispositif de détection d'objet

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
None *
See also references of WO2014173863A1 *

Also Published As

Publication number Publication date
US20160012301A1 (en) 2016-01-14
DE102013207223A1 (de) 2014-10-23
CN105283883A (zh) 2016-01-27
WO2014173863A1 (fr) 2014-10-30

Similar Documents

Publication Publication Date Title
EP2989590A1 (fr) Method and device for detecting non-motorized road users
DE102013110332B4 (de) Visual guidance system
DE10325762A1 (de) Image processing system for a vehicle
EP3661796B1 (fr) Method for operating a display of a motor vehicle, and motor vehicle
EP2400473A1 (fr) Method and device for assisting a vehicle driver
DE102014201159A1 (de) Method and device for classifying the behavior of a pedestrian crossing a vehicle's roadway, and personal protection system of a vehicle
DE10336638A1 (de) Device for classifying at least one object in the surroundings of a vehicle
DE102019208663B4 (de) Method for supporting a driver with regard to objects relevant to a traffic situation, and motor vehicle
DE102009009473A1 (de) Method for supporting a driver of a vehicle and other road users, and driver assistance system for a vehicle and other road users
EP2710573B1 (fr) Method and device for identifying a potential collision object
DE102013017626A1 (de) Method for a motor vehicle to warn other road users of pedestrians, and motor vehicle
EP2869284B1 (fr) Driver assistance system for vehicles, in particular commercial vehicles
DE102016215115A1 (de) Device and method for detecting road users in the surroundings of an ego vehicle
DE102013225773A1 (de) Rear-area monitoring system and method for monitoring the rear area of a vehicle
DE102020208008A1 (de) Image classification and associated training for safety-relevant classification tasks
EP2555178B1 (fr) Method for detecting objects at the side of a commercial vehicle, and commercial vehicle with a detection system for carrying out the method
DE102017202380A1 (de) Automated activation of a vision support system
DE10103767A1 (de) Method and device for predictive collision detection and avoidance
DE102016120166A1 (de) Controlling a vehicle depending on its environment
DE102019219658A1 (de) System and method for presenting a view that is obscured from a road user by an object
DE102016010818A1 (de) Device for outputting a warning
WO2021047827A1 (fr) Method for collision avoidance in road traffic based on adaptive configuration of occupied zones
DE102019214004A1 (de) Method for controlling a device in the surroundings of an unprotected road user on the basis of a determined attention level of that road user
DE102019206922A1 (de) Method and device for object marking in vehicles
DE102011120878A1 (de) Method and display device for generating a virtual image part on an image display unit

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20151123

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20190425

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20190906