WO2020167110A1 - System and method based on the correlation of interior and exterior visual information of a vehicle to improve safety during manual or semi-automatic driving - Google Patents

System and method based on the correlation of interior and exterior visual information of a vehicle to improve safety during manual or semi-automatic driving

Info

Publication number
WO2020167110A1
WO2020167110A1 (PCT/MX2019/000016)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
driver
controllers
visual
images
Prior art date
Application number
PCT/MX2019/000016
Other languages
English (en)
Spanish (es)
Inventor
Álvaro IZAGUIRRE SERRANO
Luis Enrique GONZÁLEZ JIMÉNEZ
Original Assignee
Instituto Tecnológico Y De Estudios Superiores De Occidente, A.C.
Priority date
Filing date
Publication date
Application filed by Instituto Tecnológico Y De Estudios Superiores De Occidente, A.C. filed Critical Instituto Tecnológico Y De Estudios Superiores De Occidente, A.C.
Publication of WO2020167110A1 publication Critical patent/WO2020167110A1/fr


Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 - Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08 - Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 - Interaction between the driver and the control system
    • B60W50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B7/00 - Signalling systems according to more than one of groups G08B3/00 - G08B6/00; Personal calling systems according to more than one of groups G08B3/00 - G08B6/00
    • G08B7/06 - Signalling systems according to more than one of groups G08B3/00 - G08B6/00; Personal calling systems according to more than one of groups G08B3/00 - G08B6/00 using electric transmission, e.g. involving audible and visible signalling through the use of sound and light sources

Definitions

  • The present invention relates to electronic systems in general, in particular to electronic monitoring and control systems applied to motor vehicles, and more specifically to a system and method based on the correlation of interior and exterior visual information in a vehicle to improve safety in manual or semi-automatic driving.
  • Our invention is directed to a system and method based on the correlation of interior and exterior visual information in a vehicle to improve safety in manual or semi-automatic driving.
  • A search was conducted to determine the closest state of the art, and the following documents were found:
  • Document D1, patent application US20160297365A1 of Axel Nix, dated June 13, 2016, discloses a driver assistance system suitable for use in a vehicle that includes a forward vision camera mounted on the front windshield of the vehicle and an event recorder having an image processor and memory. Based at least in part on the image processor's processing of the image data captured by the camera, a lane departure warning is generated if, when the vehicle's turn signal is not activated, at least one of (i) the vehicle is heading towards a lane marking and (ii) the vehicle is within a certain distance of the edge of a lane line for a specified time. Based at least in part on the image processor's processing of the image data captured by the camera, a forward collision alert is generated if the vehicle is heading towards a given object and the gap between the vehicle and that object is less than a threshold.
  • The document does not include an internal camera, so it does not allow monitoring of the driver's status, and its efficiency is therefore low.
  • Document D1 does not disclose or suggest the association of a pair of cameras, one internal and one external, with an information processing device that includes a processor that analyzes the data from these cameras and can also communicate with the different electronic control units (ECUs) of the car's system in order to alert the driver or take a specific action to avoid harm to the user through a human-machine interface (HMI).
  • ECUs: electronic control units
  • HMI: human-machine interface
  • Document D2, EP3161507A1 by Dooley Damien et al., dated April 29, 2015, discloses a method for tracking a target vehicle (9) approaching a motor vehicle (1) by means of a camera system (2) of the motor vehicle (1).
  • A temporal sequence of images (10) of an environmental region of the motor vehicle (1) is provided by means of at least one camera (3) of the camera system (5).
  • The target vehicle (9) is detected in an image (10) of the sequence by means of an image processing device (5) of the camera system (5) based on a characteristic of a front (11) or rear part of the target vehicle (9), and the target vehicle (9) is then tracked through subsequent images (10) of the sequence based on the detected characteristic.
  • The image processing device (5) detects at least one predetermined characteristic of a side flank (14) of the target vehicle (9) in one of the later images (10) of the sequence, and after the characteristic of the side flank (14) has been detected, the target vehicle (9) is tracked in the images (10) of the sequence by means of that side-flank characteristic (14).
  • A temporal sequence of images of an environmental region of the motor vehicle is provided by means of at least one camera of the camera system.
  • The target vehicle is detected in an image of the sequence by means of an image processing device of the camera system based on a characteristic of a front or rear part of the target vehicle, and the target vehicle is then tracked through subsequent images of the sequence based on the detected characteristic.
  • The invention also relates to a camera system for a motor vehicle designed to carry out said method, as well as to a motor vehicle with said camera system.
  • Document D3, US9308914 by Sun Bo et al., dated January 23, 2015, discloses an advanced driver assistance system of a vehicle for performing one or more driver assistance functions, which includes an environmental sensor, a driver status monitor, an activation switch and a driver assistance controller.
  • The environmental sensor detects an environment around the vehicle.
  • The driver status monitor monitors the driver and determines a driver abnormal condition.
  • The activation switch is operable by the driver to activate or deactivate the driver assistance functions.
  • The driver assistance controller performs one or more driver assistance functions to control the vehicle according to a target behavior, where the target behavior is defined according to the environment surrounding the vehicle.
  • The driver assistance controller activates the driver assistance functions when the driver status monitor detects the abnormal condition while the activation switch is set to deactivate.
  • Document D3 does not disclose or suggest the association of a pair of cameras, one internal and one external, with an information processing device that comprises a processor that analyzes the data from these cameras and can also communicate with the different ECUs of the car's system in order to warn the driver or take a specific action to avoid harm to the user through a human-machine interface (HMI).
  • The main objective of the invention is to make available a system and method based on the correlation of interior and exterior visual information in a vehicle to improve safety in manual or semi-automatic driving, which increases driver safety through visual monitoring of the exterior (other vehicles, signs, people, buildings, etc.) and of the interior (loss of driver attention, driver's state, number and position of occupants, etc.) of the vehicle, and correlates these two sources of information to carry out a preventive action (avoid ignition or acceleration, alert the passenger with light or noise, turn on the turn signals, etc.) or a corrective action (decelerate, activate automatic driving, maintain safety distances, etc.).
  • Another objective of the invention is to make available said system and method based on the correlation of interior and exterior visual information in a vehicle to improve safety in manual or semi-automatic driving, where the proposed visual monitoring covers a wide visual field that is normally not within the driver's reach.
  • Another objective of the invention is to make available said system and method based on the correlation of interior and exterior visual information in a vehicle to improve safety in manual or semi-automatic driving, where the output of the proposed system enables preventive and corrective actions, can be used to reduce traffic accidents, makes the use of traffic more efficient in time and energy, and improves the driving experience of a vehicle.
  • Another objective of the invention is to make available said system and method based on the correlation of interior and exterior visual information in a vehicle to improve safety in manual or semi-automatic driving, which also gives the vehicle the capacity to react automatically to risk situations that it detects.
  • The system and method based on the correlation of interior and exterior visual information in a vehicle to improve safety in manual or semi-automatic driving allow the correlation of images obtained from the interior and exterior of a car and comprise: at least one external camera arranged in a front area of the vehicle looking towards the driver's field of vision in the front direction of the vehicle, and at least one internal camera arranged inside the vehicle cabin focused towards the driver's area; and a main information processing and control device that comprises a processor that analyzes the data (images-video) from said cameras, comprises communication means to communicate with the plurality of electronic control units (ECUs) of the vehicle's electronic control system, and is also configured to receive information from the plurality of sensors of the vehicle, in order to notify the driver or take a specific action to avoid harm to the user through a human-machine interface (HMI) with visual or audible alert and warning alarms.
  • Through said at least one interior camera, the system will determine where the driver is looking and will analyze the maximum visual field based on the position of the driver's eyes and head; said at least one second, exterior camera, mounted externally, preferably on the windshield of the vehicle or in another area that allows viewing the front area of the vehicle, will record the image in front of the automobile and the objects within the visual field.
  • The system will correlate the objects in the frontal image with the visual field detected from the position of the eyes, and will analyze the objects to detect hazards based on a classification and a detection sensitivity adjustment, as well as on information from other sensors, such as, for example, the speed of the car.
  • When the system detects a pedestrian in front and determines that it is not within the driver's field of vision, a serious alert will be issued to the driver; if, instead, a bird is detected, it should be ignored. In addition, the system will be able to detect traffic signs and check whether the driver was able to observe them; otherwise, an alarm of medium severity will be issued.
  • The system will decide whether it is necessary to apply a control primitive (i.e. safe speed profiles, automatic driving support, activation of tasks in other vehicle ECUs, etc.) or whether it is only necessary to notify the driver by means of a visual or audible alarm from the human-machine interface (HMI).
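For illustration only, the following minimal Python sketch shows one way this severity decision could be expressed. The object classes, the speed/sensitivity threshold and the returned directive names are assumptions of this sketch, not values disclosed in the application.

```python
# Illustrative sketch (not from the application): map a detected object that
# lies outside the driver's visual field to an alert severity or a control
# action. Class names, thresholds and directives are assumptions.

SEVERE = {"pedestrian", "cyclist", "vehicle"}   # objects that demand action
MEDIUM = {"traffic_sign"}                        # objects that warrant a warning
IGNORE = {"bird", "dog", "hydrant"}              # objects the system may ignore

def decide(obj_class: str, in_driver_view: bool, speed_kmh: float, sensitivity: float = 0.5):
    """Return (severity, directive) for one detected object.

    severity: 'none' | 'medium' | 'severe'
    directive: None or a control primitive such as 'decelerate'.
    """
    if in_driver_view or obj_class in IGNORE:
        return "none", None
    if obj_class in SEVERE:
        # The faster the car and the higher the sensitivity, the more likely
        # the system escalates from an HMI warning to a control primitive.
        if speed_kmh * sensitivity > 10.0:
            return "severe", "decelerate"
        return "severe", None
    if obj_class in MEDIUM:
        return "medium", None
    return "none", None

print(decide("pedestrian", in_driver_view=False, speed_kmh=40))   # ('severe', 'decelerate')
print(decide("traffic_sign", in_driver_view=False, speed_kmh=40)) # ('medium', None)
print(decide("bird", in_driver_view=False, speed_kmh=40))         # ('none', None)
```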
  • The invention also involves a method to improve safety in manual or semi-automatic driving by a driver, by means of the system of correlation of interior and exterior visual information in a vehicle as claimed in claim 1, characterized in that it comprises the steps of the algorithm described below.
  • Figure 1 shows a schematic block diagram of the system based on the correlation of interior and exterior visual information in a vehicle to improve safety in manual or semi-automatic driving.
  • Figure 2 illustrates a schematic diagram of the interior of a vehicle showing the arrangement of the internal and external cameras of the system based on the correlation of interior and exterior visual information in a vehicle to improve safety in manual or semi-automatic driving.
  • Figure 3 shows a conventional perspective of a vehicle in which the components of the system based on the correlation of interior and exterior visual information in a vehicle to improve safety in manual or semi-automatic driving are schematically illustrated.
  • Figure 4 shows a flow diagram of the operation of the system based on the correlation of interior and exterior visual information in a vehicle to improve safety in manual or semi-automatic driving.
  • The system based on the correlation of interior and exterior visual information in a vehicle to improve safety in manual or semi-automatic driving consists of: at least one external camera (1) arranged in a front area of the vehicle (2) looking towards the visual field of the driver's area (3) in the front direction of the vehicle (2), and at least one internal camera (4) arranged inside the cabin (5) of the vehicle (2) focused towards the visual field of the driver's area (3); and a main information processing and control device (6) comprising a processor that analyzes the data (images-video) from said external and internal cameras (1, 4), comprises communication means to communicate with the plurality of electronic control units (ECUs) and actuators (7) [for example speed controllers, injection controllers, light controllers, ABS brakes] of the vehicle's electronic control system, and is also configured to receive information from the plurality of sensors (8) [such as an RPM speed sensor and an ABS brake sensor, among others] of the vehicle (2), in order to notify the driver (3) or take a specific action to avoid harm to the user through a human-machine interface (HMI) (9) with visual or audible alert and warning alarms.
  • Through said at least one interior camera (4), the system will determine where the driver (3) is looking and will analyze the maximum visual field based on the position of the eyes and head of the driver (3); said at least one second, external camera (1), mounted externally, preferably on the windshield of the vehicle (2) or in another area that allows viewing the front area of the vehicle (2), will record the image in front of the car and the objects within the visual field.
  • The system will correlate the objects in the frontal image with the visual field detected from the position of the eyes of the driver (3), and will analyze the objects for hazards based on a classification and a detection sensitivity setting, as well as on information from the plurality of sensors (8), such as the speed of the car and the brake sensors, among others.
  • When the system detects a pedestrian in front and determines that it is not within the driver's field of vision, a serious alert will be issued to the driver (3); on the other hand, if a bird is detected, it must be ignored. In addition, the system will be able to detect traffic signs and check whether the driver (3) was able to observe them; otherwise, an alarm of medium severity will be issued. Based on this information, the system will decide whether it is necessary to apply a control primitive (i.e. safe speed profiles, automatic driving support, activation of tasks in other vehicle ECUs, etc.), or whether it is only necessary to notify the driver (3) via a visual or audible alarm from the human-machine interface (HMI) (9).
  • The proposed logic system follows the following directives to execute the algorithm: a. It reads the status of the HMI human-machine interface; this reading indicates whether the system functionality is active (a1), since the user is allowed to deactivate it from the graphical interface. In addition, the system allows the sensitivity (a2) for the detection of objects in the images to be modified. By reading the status of the HMI these values can be known.
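As an illustration only, the HMI state read in this step could be modelled as a small structure; the field names and default values below are assumptions of this sketch, since the application only states that the HMI exposes an on/off flag (a1) and a user-set detection sensitivity (a2).

```python
from dataclasses import dataclass

# Illustrative sketch of step (a): the state read from the HMI.
@dataclass
class HMIState:
    active: bool = True        # a1: functionality enabled by the user
    sensitivity: float = 0.5   # a2: object-detection sensitivity in [0, 1]

def read_hmi_state() -> HMIState:
    # Placeholder: in a real system this would query the HMI / infotainment unit.
    return HMIState()
```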
  • The algorithm executes the reading of the external camera (c1) and the reading of the internal camera (c2) to obtain an external image (c1a) of the outside of the vehicle (front) (c1b) and an interior image (c2a) of the inside of the vehicle (c2b), and transmits these data to the main controller.
  • These are represented on the main controller as the internal and external images.
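A minimal sketch of this acquisition step, assuming OpenCV and two cameras exposed as device indices 0 and 1 (an assumption of this sketch; the application does not specify the camera interface):

```python
import cv2

# Illustrative sketch: read one frame from the external (front-facing) camera
# and one from the internal (driver-facing) camera.
cap_ext = cv2.VideoCapture(0)   # external camera (c1)
cap_int = cv2.VideoCapture(1)   # internal camera (c2)

ok_ext, external_image = cap_ext.read()   # image of the scene in front (c1a)
ok_int, interior_image = cap_int.read()   # image of the cabin / driver (c2a)

if not (ok_ext and ok_int):
    raise RuntimeError("could not read from one of the cameras")

# These two frames are what the main controller receives as the
# "external image" and "internal image" referred to in the description.
cap_ext.release()
cap_int.release()
```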
  • The algorithm will read the sensors to know the state of the vehicle, that is, whether the vehicle is moving and whether it is on or not, using the signals from the sensors (d1) to detect the speed (d2) or the revolutions per minute (RPM) (d3).
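A sketch of this vehicle-state check; the sensor-reading functions are hypothetical placeholders, since the application does not name a specific bus or protocol:

```python
# Illustrative sketch: decide whether the vehicle is on and moving from the
# speed (d2) and RPM (d3) signals. read_speed_kmh / read_rpm stand in for
# whatever sensor interface the vehicle actually exposes (e.g. a CAN bus).
def read_speed_kmh() -> float:
    return 0.0          # placeholder value

def read_rpm() -> float:
    return 800.0        # placeholder value (idle)

def vehicle_is_on(rpm: float) -> bool:
    return rpm > 0.0

def vehicle_is_moving(speed_kmh: float, rpm: float) -> bool:
    # Require both signals to agree before treating the car as in motion.
    return vehicle_is_on(rpm) and speed_kmh > 1.0
```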
  • The main controller will analyze the interior image looking for a face within it (e1). Since the interior image (e1) is captured with a camera on the car dashboard, locating a face within this image allows the presence or absence of a driver inside the vehicle to be detected.
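A minimal face-presence sketch using OpenCV's bundled Haar cascade as an example detector; the application does not prescribe a particular face-detection method, so this choice is an assumption:

```python
import cv2

# Illustrative sketch: detect whether a face (driver) is present in the
# interior image (e1). Any face detector would serve equally well.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def find_driver_face(interior_image):
    """Return the bounding box (x, y, w, h) of the largest face, or None."""
    gray = cv2.cvtColor(interior_image, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    return max(faces, key=lambda f: f[2] * f[3])   # keep the largest detection
```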
  • If the algorithm does not detect a driver but detects that the vehicle is moving (g) through the speed (g1) and RPM (g2) sensors, it will send an alert through the HMI (g3) and will also send control directives (g4) to the injection and ABS controllers to stop the vehicle, because if the car is in motion but a driver could not be located, the driver is in a position that does not allow him or her to watch the road, which can cause an accident.
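A sketch of this branch; the HMI and ECU calls are hypothetical stubs standing in for the real communication layer, whose interfaces the application does not define:

```python
# Illustrative sketch of step (g): no face found in the interior image while
# the speed (g1) and RPM (g2) sensors indicate motion.
def hmi_alert(message: str, severity: str) -> None:
    print(f"[HMI:{severity}] {message}")           # placeholder for (g3)

def send_control_directive(target_ecu: str, command: str) -> None:
    print(f"[ECU:{target_ecu}] {command}")         # placeholder for (g4)

def handle_missing_driver(speed_kmh: float, rpm: float) -> None:
    # Vehicle considered in motion if the engine is running and speed > 1 km/h.
    if rpm > 0.0 and speed_kmh > 1.0:
        hmi_alert("No driver detected while the vehicle is moving", "severe")
        send_control_directive("injection", "cut_throttle")
        send_control_directive("abs", "controlled_stop")
```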
  • If the algorithm detects a driver, the interior image (h1) will be analyzed again: since it is known in which part of the image the face is located, this face will be analyzed to detect the position of the eyes (h2) as well as the direction in which the driver is looking (h3).
  • The algorithm will then calculate the visual range (i1) of the driver, that is, the extent to which it is possible for the driver to perceive the scene in front of him, from the position of the eyes (i2) and the direction of the gaze (i3).
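A crude sketch of the eye-position and gaze steps (h2, h3, i1-i3); the linear offset-to-angle mapping and the +/-30 degree half-width are assumptions of this sketch, not values from the application:

```python
import cv2

# Illustrative sketch: estimate a horizontal gaze angle from the position of
# the eyes inside the face region, then derive a visual range (an angular
# window) around it. All numeric constants are illustrative only.
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def estimate_gaze_deg(interior_image, face_box):
    """Very rough horizontal gaze estimate in degrees (0 = straight ahead)."""
    x, y, w, h = face_box
    face_roi = cv2.cvtColor(interior_image[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    eyes = eye_cascade.detectMultiScale(face_roi, 1.1, 5)
    if len(eyes) == 0:
        return None
    # Mean horizontal eye position relative to the face centre, in [-0.5, 0.5].
    centers = [(ex + ew / 2.0) / w - 0.5 for ex, ey, ew, eh in eyes]
    offset = sum(centers) / len(centers)
    return offset * 60.0          # map the offset to roughly +/-30 degrees

def visual_range_deg(gaze_deg, half_width_deg=30.0):
    """Angular window (i1) the driver is assumed to perceive."""
    return gaze_deg - half_width_deg, gaze_deg + half_width_deg
```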
  • The calculated visual range will be correlated with the image from the external camera. This makes it possible to define an image containing the part of the scene within the driver's visual range (j3), as well as an image with the part outside the visual range (j4).
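A sketch of this correlation step, assuming a simple linear mapping between horizontal angle and pixel column and an assumed 90 degree horizontal field of view for the external camera:

```python
import numpy as np

# Illustrative sketch: mask the external image into the part inside the
# driver's visual range (j3) and the part outside it (j4).
def split_by_visual_range(external_image, visual_range, camera_fov_deg=90.0):
    h, w = external_image.shape[:2]
    lo_deg, hi_deg = visual_range
    # Map angles (-fov/2 .. +fov/2) linearly to pixel columns (0 .. w).
    to_col = lambda a: int(np.clip((a / camera_fov_deg + 0.5) * w, 0, w))
    c_lo, c_hi = to_col(lo_deg), to_col(hi_deg)
    in_range = np.zeros_like(external_image)
    out_range = external_image.copy()
    in_range[:, c_lo:c_hi] = external_image[:, c_lo:c_hi]
    out_range[:, c_lo:c_hi] = 0
    return in_range, out_range
```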
  • The algorithm will analyze the image that is outside the visual range (k1) in search of objects; in this analysis, all objects within the image outside the visual range (k2) will be detected.
  • If the algorithm does not detect objects in the image outside the visual range, the execution of the algorithm ends. If one or more objects are detected, the algorithm will check whether the car is moving (L1) through the speed (L2) and RPM (L3) sensors; if the car is stopped, the algorithm will finish its execution.
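A sketch of the object search and the motion check; OpenCV's default HOG people detector is used here only as a stand-in for the fuller object detector the description implies:

```python
import cv2

# Illustrative sketch: detect objects (here, pedestrians only) in the image
# region outside the driver's visual range (k1, k2).
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_out_of_view_objects(out_of_range_image):
    boxes, weights = hog.detectMultiScale(out_of_range_image, winStride=(8, 8))
    return [("pedestrian", tuple(b)) for b in boxes]

def should_continue(objects, speed_kmh, rpm):
    # Steps (L1-L3): if nothing was found, or the car is stopped, end the cycle.
    return bool(objects) and rpm > 0.0 and speed_kmh > 1.0
```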
  • The algorithm will perform a categorization of the objects within the image outside the driver's visual range (m1). This categorization will take into account the value of the sensitivity (m2) for the detection of objects defined by the user.
  • The algorithm will separate objects such as pedestrians, cyclists or other cars from objects such as dogs, birds, fire hydrants, etc.
  • The algorithm will create two categories (warnings (m3) and actions (m4)):
  • Control directives will be sent as actions to prevent accidents, in addition to displaying warnings in the HMI (e.g. detection of pedestrians, cyclists, other cars, etc.).
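Finally, a sketch of the categorization into warnings (m3) and actions (m4); which classes count as critical, and how the user-set sensitivity (m2) filters low-risk classes, are assumptions of this example rather than details given in the application:

```python
# Illustrative sketch of step (m): separate detected objects into warnings (m3)
# and actions (m4) using the user-defined sensitivity (m2).
CRITICAL = {"pedestrian", "cyclist", "car", "truck"}
LOW_RISK = {"dog", "bird", "hydrant"}

def categorize(objects, sensitivity: float):
    """objects: list of (class_name, box). Returns (warnings, actions)."""
    warnings, actions = [], []
    for cls, box in objects:
        if cls in CRITICAL:
            warnings.append((cls, box))   # always warn via the HMI
            actions.append((cls, box))    # and request a control primitive
        elif cls in LOW_RISK and sensitivity > 0.8:
            warnings.append((cls, box))   # only reported at very high sensitivity
    return warnings, actions
```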

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

The invention relates to a system based on the correlation of interior and exterior visual information of a vehicle to improve safety during manual or semi-automatic driving. Said system is characterized in that it comprises at least one external camera arranged in a front area of the vehicle directed towards the driver's visual field in the front direction of the vehicle, and at least one internal camera arranged inside the vehicle cabin focused towards the driver's area; a main information processing and control device that comprises a processor that analyzes the data (images-video) from said external and internal cameras and further comprises communication means to communicate with the plurality of electronic control units (ECUs) and actuators (for example, speed controllers, injection controllers, light controllers, ABS brake controllers) of the vehicle's electronic control system, and is further configured to receive information from the plurality of sensors of the vehicle (such as, for example, an RPM speed sensor and an ABS brake sensor, among others); said main information processing and control device being in communication with a human-machine interface (HMI) with visual or audible alert and warning alarms and being configured to notify the driver or to take a specific action to avoid harm to the user.
PCT/MX2019/000016 2019-02-11 2019-02-15 Système et procédé reposant sur la corrélation d'informations visuelles intérieures et extérieures d'un véhicule pour améliorer la sécurité lors de la conduite manuelle ou semi-automatique WO2020167110A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
MXMX/A/2019/001712 2019-02-11
MX2019001712A MX2019001712A (es) 2019-02-11 2019-02-11 Sistema y metodo basado en la correlación de información visual interior y exterior en un vehículo para mejorar la seguridad en conducción manual o semi-automática.

Publications (1)

Publication Number Publication Date
WO2020167110A1 true WO2020167110A1 (fr) 2020-08-20

Family

ID=72045415

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/MX2019/000016 WO2020167110A1 (fr) 2019-02-11 2019-02-15 Système et procédé reposant sur la corrélation d'informations visuelles intérieures et extérieures d'un véhicule pour améliorer la sécurité lors de la conduite manuelle ou semi-automatique

Country Status (2)

Country Link
MX (1) MX2019001712A (fr)
WO (1) WO2020167110A1 (fr)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090051516A1 (en) * 2006-02-23 2009-02-26 Continental Automotive Gmbh Assistance System for Assisting a Driver
US20180065452A1 (en) * 2016-09-07 2018-03-08 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America Using vision devices and techniques to control sunlight blocking devices for vehicles
EP3372465A1 (fr) * 2017-03-10 2018-09-12 The Hi-Tech Robotic Systemz Ltd Procédé et système d'assistance avancée au conducteur à base d'état de véhicule
US20180304814A1 (en) * 2017-04-24 2018-10-25 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America Rear view mirror-like perspective change system and method

Also Published As

Publication number Publication date
MX2019001712A (es) 2020-08-12

Similar Documents

Publication Publication Date Title
CN108068698B (zh) Vehicle system and vehicle
US9352683B2 (en) Traffic density sensitivity selector
US10166918B2 (en) Drive assisting apparatus
US6496117B2 (en) System for monitoring a driver's attention to driving
KR102051142B1 (ko) Driver risk index management system for a vehicle and method thereof
EP2513882B1 (fr) Predictive human-machine interface using eye tracking technology, blind spot indicators and driver experience
CN104175954A (zh) Vehicle blind-zone monitoring and alarm system
JP4415856B2 (ja) Method for detecting the forward surroundings of a road vehicle by means of a surroundings sensing system
US10745025B2 (en) Method and device for supporting a vehicle occupant in a vehicle
US10759334B2 (en) System for exchanging information between vehicles and control method thereof
JP7471021B2 (ja) System, program and the like
CN111038499B (zh) Blind-zone monitoring system and method capable of autonomously controlling the vehicle according to the driver's state
JP2024097875A (ja) System, program and the like
JP4751894B2 (ja) System for detecting an obstacle in front of a motor vehicle
CN207773004U (zh) Steering early-warning system and vehicle
KR20110118882A (ko) Side collision prevention alarm system replacing a vehicle's side mirrors
WO2020167110A1 (fr) System and method based on the correlation of interior and exterior visual information of a vehicle to improve safety during manual or semi-automatic driving
EP4005889A1 (fr) Driving assistance method with dynamic blind-spot detection and closed-loop driver state detection, driving assistance system, vehicle and computer program
KR20180039838A (ko) Apparatus and method for controlling a rear-lateral warning of a vehicle
CN112550282A (zh) Warning method and warning system for a motor vehicle
WO2014090957A1 (fr) Method for switching a camera system into a support mode, camera system and motor vehicle
JP7194890B2 (ja) Mirror device for a motor vehicle
JP2009146141A (ja) Driving support device
GB2524894A (en) Traffic density sensitivity selector
Mutthalkar et al. Advanced Driver Assist System: Case Study

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19914857

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19914857

Country of ref document: EP

Kind code of ref document: A1