CN106205057A - Driver assistance system and method based on ZigBee technology - Google Patents

Driver assistance system and method based on ZigBee technology

Info

Publication number
CN106205057A
Authority
CN
China
Prior art keywords
processor
blink
driver
ccd camera
frequency
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610837205.4A
Other languages
Chinese (zh)
Inventor
谢敏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Chuanghui Keda Technology Co Ltd
Original Assignee
Chengdu Chuanghui Keda Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Chuanghui Keda Technology Co Ltd filed Critical Chengdu Chuanghui Keda Technology Co Ltd
Priority to CN201610837205.4A priority Critical patent/CN106205057A/en
Publication of CN106205057A publication Critical patent/CN106205057A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 - Alarms for ensuring the safety of persons
    • G08B21/06 - Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 - Detection; Localisation; Normalisation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 - Feature extraction; Face representation
    • G06V40/171 - Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Traffic Control Systems (AREA)
  • Emergency Alarm Devices (AREA)

Abstract

The invention provides a driver assistance system and method based on ZigBee technology, and relates to the field of automobiles. The system comprises an infrared CCD camera for acquiring original image information; the infrared CCD camera is in signal connection with a DSP processor; the DSP processor is in signal connection with a second processor, which sends the processing results of the DSP processor to a high-performance processor; the second processor is in signal connection with the high-performance processor, which processes the data information sent by each processor; and the high-performance processor is in signal connection with an alarm for issuing an alarm signal. The system further comprises a common CCD camera for acquiring image information of the driver's travel route; the common CCD camera is in signal connection with a first processor, which acquires the data information sent by the common CCD camera together with the turn-signal indication of the automobile and sends the acquired signals to the high-performance processor.

Description

Driver assistance system and method based on ZigBee technology
Technical Field
The invention relates to the field of automobiles, and in particular to a driver assistance system and method based on ZigBee technology.
Background
Among the many vehicle accessories, those related to reversing safety attract particular attention, and models equipped with a reversing aid system are often regarded as a mark of a high-end vehicle configuration.
According to statistics, traffic accidents caused by the rear blind zone of a vehicle account for about 30% of accidents in China and about 20% in the United States. Traffic-control departments recommend that vehicle owners install multi-curvature wide-view rearview mirrors to reduce the rear blind zone and improve vehicle safety, but such accidents have still not been effectively reduced or controlled. The hidden danger of the rear blind zone often causes great loss of life and property and serious mental distress, and for a novice or inexperienced driver every reversing maneuver can be a hesitant, nerve-racking exercise.
Existing reversing-aid products, distinguished by manual versus automatic operation, fall roughly into two types: manual (represented by conventional reversing systems) and automatic (represented by intelligent reversing systems). Conventional reversing systems, typified by reversing radar and reversing cameras, remind the driver of the situation behind the vehicle with a warning sound or a rear view so that the driver can take avoiding action and reduce accidental injury. Such products leave most of the initiative to the driver: although they can largely prevent the vehicle from injuring pedestrians, they do not help the driver park smoothly and effectively, so scratches and collisions remain very likely.
Disclosure of Invention
In view of the above, the invention provides a driver assistance system and method based on ZigBee technology, which offer road-departure monitoring, fatigue-driving monitoring, accurate monitoring and high operating efficiency.
The technical scheme adopted by the invention is as follows:
A driver assistance system based on ZigBee technology, characterized in that the system comprises: an infrared CCD camera for acquiring original image information; the infrared CCD camera is in signal connection with a DSP processor; the DSP processor is in signal connection with a second processor, which sends the processing results of the DSP processor to a high-performance processor; the second processor is in signal connection with the high-performance processor, which processes the data information sent by each processor; the high-performance processor is in signal connection with an alarm for issuing an alarm signal. The system further comprises: a common CCD camera for acquiring image information of the driver's travel route; the common CCD camera is in signal connection with a first processor, which acquires the data information sent by the common CCD camera together with the turn-signal indication of the automobile and sends the acquired signals to the high-performance processor.
The DSP processor comprises: a blink frequency statistics unit for counting the driver's blink frequency; a blink closure time monitoring unit for monitoring the driver's blink closure time; and a yawning frequency monitoring unit for counting the driver's yawning frequency. The blink frequency statistics unit, the blink closure time monitoring unit and the yawning frequency monitoring unit are each in signal connection with the second processor.
The blink frequency statistics unit comprises: a region verification module for screening out the driver's eye region; the region verification module is in signal connection with an image segmentation module for segmenting the verified region; the image segmentation module is in signal connection with a statistics module for performing blink statistics on the segmented images.
The driver blink closure time monitoring unit comprises: a blink region verification module for screening out the driver's eye region; the blink region verification module is in signal connection with a blink image segmentation module for segmenting the verified region; the blink image segmentation module is in signal connection with a closure time statistics unit for counting blink closure time on the segmented images.
The driver yawning frequency monitoring unit comprises: a mouth region verification module for screening out the driver's mouth region; the mouth region verification module is in signal connection with a mouth image segmentation module for segmenting the verified region; the mouth image segmentation module is in signal connection with a yawning frequency statistics unit for counting yawning frequency on the segmented images.
A driver assistance method based on ZigBee technology, characterized by comprising the following steps:
Step 1: start and initialize the system;
Step 2: the infrared CCD camera begins to collect image information of the driver; the face portion is screened out of the collected original image information and sent to the DSP processor;
Step 3: the DSP processor performs blink frequency statistics, blink closure time monitoring and yawning frequency monitoring on the received image information, and sends the processed results to the high-performance processor;
Step 4: the common CCD camera begins to acquire the conditions of the automobile while it is traveling and sends the acquired condition information to the first processor; the first processor judges whether the automobile has left the road according to the received condition information and the latest turn-signal indication, and sends the judgment result to the high-performance processor;
Step 5: the high-performance processor judges, from the data information sent by the second processor and the first processor, whether the alarm should be triggered, and issues an alarm signal if so (a decision sketch follows these steps).
The method by which the face portion is screened out of the original image collected by the CCD camera comprises the following steps (a sketch of this pipeline is given after the steps):
Step 1: the image collected from the camera is in RGB format; the conversion from the RGB color space to the HSV color space is realized with the following formula:
Step 2: the driver's skin-color region is detected with the following formula:
Step 3: the face is then projected horizontally and vertically to determine its boundary region; the boundary determination formula is as follows:
wherein,
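The patent's own conversion, skin-color and boundary formulas are given as images and are not reproduced above. As a rough stand-in, the sketch below uses the standard RGB-to-HSV conversion together with assumed HSV skin-color thresholds and a simple projection cut-off; the values of h_max, s_min, s_max and the 0.2 projection fraction are my assumptions, not the patent's.

```python
# Illustrative sketch only: standard RGB-to-HSV conversion plus assumed HSV
# skin-color thresholds; not the patent's (unreproduced) formulas.
import numpy as np

def rgb_to_hsv(img: np.ndarray) -> np.ndarray:
    """Convert an RGB image (H x W x 3, floats in [0, 1]) to HSV (hue in degrees)."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    v = img.max(axis=-1)                          # value = max(R, G, B)
    c = v - img.min(axis=-1)                      # chroma
    s = np.where(v > 0, c / np.maximum(v, 1e-12), 0.0)
    h = np.zeros_like(v)
    nz = c > 0
    rmax = nz & (v == r)
    gmax = nz & (v == g) & ~rmax
    bmax = nz & ~rmax & ~gmax
    h[rmax] = ((g - b)[rmax] / c[rmax]) % 6.0
    h[gmax] = (b - r)[gmax] / c[gmax] + 2.0
    h[bmax] = (r - g)[bmax] / c[bmax] + 4.0
    return np.stack([h * 60.0, s, v], axis=-1)

def face_bounding_box(img, h_max=40.0, s_min=0.2, s_max=0.7, cut=0.2):
    """Skin-color mask, then horizontal/vertical projections give the face boundary."""
    hsv = rgb_to_hsv(img)
    skin = (hsv[..., 0] <= h_max) & (hsv[..., 1] >= s_min) & (hsv[..., 1] <= s_max)
    rows, cols = skin.sum(axis=1), skin.sum(axis=0)   # projections onto y and x axes
    if rows.max() == 0:
        return None                                   # no skin-colored pixels found
    ys = np.where(rows > rows.max() * cut)[0]
    xs = np.where(cols > cols.max() * cut)[0]
    return ys[0], ys[-1], xs[0], xs[-1]               # top, bottom, left, right
```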
The method for determining the eye region of interest comprises the following steps (see the sketch after these steps):
Step 1: assume that the length (height) of the detected face region is HF and its width is WF;
Step 2: in the vertical direction, the eyes lie in a band of height HF/5 located above the mid-line of the face and below the top of the head;
Step 3: in the horizontal direction, the eye region extends from WF/8 inside the left face boundary to WF/8 inside the right face boundary.
By adopting the technical scheme, the invention has the following beneficial effects:
1. Reliable data transmission: in this system, a radio-frequency transmitting unit, a memory and a microprocessor are integrated in both the first processor and the second processor. Besides forwarding signals, they can also store them temporarily, so that if anything goes wrong during transmission the stored data can be retrieved and resent, guaranteeing data reliability (a store-and-resend sketch is given after this list).
2. Versatile functions: the invention can monitor whether the automobile leaves the road while traveling and issue an alarm signal according to the monitoring result; it also checks the driver's physical state in an all-round way, ensuring the accuracy of the early warning.
3. Accurate monitoring: the image recognition algorithm screens the face scientifically and recognizes only the screened face region, avoiding interference from other content in the image and ensuring the accuracy of the monitoring result.
4. High operating efficiency: the DSP processor is divided into three units that process in parallel, improving the processing and operating efficiency of the system.
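Benefit 1 describes store-and-resend behaviour in the first and second processors but gives no protocol details. The sketch below shows one way such temporary buffering and retransmission could look; the buffer size, retry count and send() interface are assumptions, not the patent's design.

```python
# Illustrative sketch only: buffer size, retry count and the send() callable are
# assumptions; the patent only states that signals are buffered and resent on failure.
from collections import deque

class StoreAndForward:
    """Temporary storage plus retransmission, as attributed to the first/second processors."""

    def __init__(self, send, max_buffer: int = 64, max_retries: int = 3):
        self.send = send                      # callable(packet) -> bool, True on success
        self.buffer = deque(maxlen=max_buffer)
        self.max_retries = max_retries

    def forward(self, packet) -> bool:
        self.buffer.append(packet)            # keep a temporary copy before sending
        for _ in range(self.max_retries):
            if self.send(packet):
                self.buffer.remove(packet)    # discard the copy once delivery succeeds
                return True
        return False                          # copy stays buffered for a later resend

    def resend_pending(self) -> None:
        """Retry everything still held in the temporary buffer."""
        for packet in list(self.buffer):
            if self.send(packet):
                self.buffer.remove(packet)

# Example: a flaky link that succeeds on the second attempt.
attempts = {"n": 0}
def flaky_send(packet) -> bool:
    attempts["n"] += 1
    return attempts["n"] >= 2

relay = StoreAndForward(flaky_send)
print(relay.forward({"blink_rate": 12}))  # True: first try fails, the retry succeeds
```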
Drawings
Fig. 1 is a schematic structural diagram of the driver assistance system based on ZigBee technology.
Detailed Description
All of the features disclosed in this specification, or all of the steps in any method or process so disclosed, may be combined in any combination, except combinations of features and/or steps that are mutually exclusive.
Any feature disclosed in this specification (including any accompanying claims, abstract) may be replaced by alternative features serving equivalent or similar purposes, unless expressly stated otherwise. That is, unless expressly stated otherwise, each feature is only an example of a generic series of equivalent or similar features.
Embodiment 1 of the invention provides a driver assistance system based on ZigBee technology; the system structure is shown in Fig. 1:
A driver assistance system based on ZigBee technology, characterized in that the system comprises: an infrared CCD camera for acquiring original image information; the infrared CCD camera is in signal connection with a DSP processor; the DSP processor is in signal connection with a second processor, which sends the processing results of the DSP processor to a high-performance processor; the second processor is in signal connection with the high-performance processor, which processes the data information sent by each processor; the high-performance processor is in signal connection with an alarm for issuing an alarm signal. The system further comprises: a common CCD camera for acquiring image information of the driver's travel route; the common CCD camera is in signal connection with a first processor, which acquires the data information sent by the common CCD camera together with the turn-signal indication of the automobile and sends the acquired signals to the high-performance processor.
The DSP processor comprises: a blink frequency statistics unit for counting the driver's blink frequency; a blink closure time monitoring unit for monitoring the driver's blink closure time; and a yawning frequency monitoring unit for counting the driver's yawning frequency. The blink frequency statistics unit, the blink closure time monitoring unit and the yawning frequency monitoring unit are each in signal connection with the second processor.
The blink frequency statistics unit comprises: a region verification module for screening out the driver's eye region; the region verification module is in signal connection with an image segmentation module for segmenting the verified region; the image segmentation module is in signal connection with a statistics module for performing blink statistics on the segmented images.
The driver blink closure time monitoring unit comprises: a blink region verification module for screening out the driver's eye region; the blink region verification module is in signal connection with a blink image segmentation module for segmenting the verified region; the blink image segmentation module is in signal connection with a closure time statistics unit for counting blink closure time on the segmented images.
The driver yawning frequency monitoring unit comprises: a mouth region verification module for screening out the driver's mouth region; the mouth region verification module is in signal connection with a mouth image segmentation module for segmenting the verified region; the mouth image segmentation module is in signal connection with a yawning frequency statistics unit for counting yawning frequency on the segmented images.
Embodiment 2 of the invention provides a driver assistance method based on ZigBee technology:
A driver assistance method based on ZigBee technology, characterized by comprising the following steps:
Step 1: start and initialize the system;
Step 2: the infrared CCD camera begins to collect image information of the driver; the face portion is screened out of the collected original image information and sent to the DSP processor;
Step 3: the DSP processor performs blink frequency statistics, blink closure time monitoring and yawning frequency monitoring on the received image information, and sends the processed results to the high-performance processor;
Step 4: the common CCD camera begins to acquire the conditions of the automobile while it is traveling and sends the acquired condition information to the first processor; the first processor judges whether the automobile has left the road according to the received condition information and the latest turn-signal indication, and sends the judgment result to the high-performance processor;
Step 5: the high-performance processor judges, from the data information sent by the second processor and the first processor, whether the alarm should be triggered, and issues an alarm signal if so.
The method by which the face portion is screened out of the original image collected by the CCD camera comprises the following steps:
Step 1: the image collected from the camera is in RGB format; the conversion from the RGB color space to the HSV color space is realized with the following formula:
Step 2: the driver's skin-color region is detected with the following formula:
Step 3: the face is then projected horizontally and vertically to determine its boundary region; the boundary determination formula is as follows:
wherein,
The method for determining the eye region of interest comprises the following steps:
Step 1: assume that the length (height) of the detected face region is HF and its width is WF;
Step 2: in the vertical direction, the eyes lie in a band of height HF/5 located above the mid-line of the face and below the top of the head;
Step 3: in the horizontal direction, the eye region extends from WF/8 inside the left face boundary to WF/8 inside the right face boundary.
Embodiment 3 of the invention provides a driver assistance system and method based on ZigBee technology; the system structure is shown in Fig. 1:
A driver assistance system based on ZigBee technology, characterized in that the system comprises: an infrared CCD camera for acquiring original image information; the infrared CCD camera is in signal connection with a DSP processor; the DSP processor is in signal connection with a second processor, which sends the processing results of the DSP processor to a high-performance processor; the second processor is in signal connection with the high-performance processor, which processes the data information sent by each processor; the high-performance processor is in signal connection with an alarm for issuing an alarm signal. The system further comprises: a common CCD camera for acquiring image information of the driver's travel route; the common CCD camera is in signal connection with a first processor, which acquires the data information sent by the common CCD camera together with the turn-signal indication of the automobile and sends the acquired signals to the high-performance processor.
The DSP processor comprises: a blink frequency statistics unit for counting the driver's blink frequency; a blink closure time monitoring unit for monitoring the driver's blink closure time; and a yawning frequency monitoring unit for counting the driver's yawning frequency. The blink frequency statistics unit, the blink closure time monitoring unit and the yawning frequency monitoring unit are each in signal connection with the second processor.
The blink frequency statistics unit comprises: a region verification module for screening out the driver's eye region; the region verification module is in signal connection with an image segmentation module for segmenting the verified region; the image segmentation module is in signal connection with a statistics module for performing blink statistics on the segmented images.
The driver blink closure time monitoring unit comprises: a blink region verification module for screening out the driver's eye region; the blink region verification module is in signal connection with a blink image segmentation module for segmenting the verified region; the blink image segmentation module is in signal connection with a closure time statistics unit for counting blink closure time on the segmented images.
The driver yawning frequency monitoring unit comprises: a mouth region verification module for screening out the driver's mouth region; the mouth region verification module is in signal connection with a mouth image segmentation module for segmenting the verified region; the mouth image segmentation module is in signal connection with a yawning frequency statistics unit for counting yawning frequency on the segmented images.
A driver assistance method based on ZigBee technology, characterized by comprising the following steps:
Step 1: start and initialize the system;
Step 2: the infrared CCD camera begins to collect image information of the driver; the face portion is screened out of the collected original image information and sent to the DSP processor;
Step 3: the DSP processor performs blink frequency statistics, blink closure time monitoring and yawning frequency monitoring on the received image information, and sends the processed results to the high-performance processor;
Step 4: the common CCD camera begins to acquire the conditions of the automobile while it is traveling and sends the acquired condition information to the first processor; the first processor judges whether the automobile has left the road according to the received condition information and the latest turn-signal indication, and sends the judgment result to the high-performance processor;
Step 5: the high-performance processor judges, from the data information sent by the second processor and the first processor, whether the alarm should be triggered, and issues an alarm signal if so.
The method by which the face portion is screened out of the original image collected by the CCD camera comprises the following steps:
Step 1: the image collected from the camera is in RGB format; the conversion from the RGB color space to the HSV color space is realized with the following formula:
Step 2: the driver's skin-color region is detected with the following formula:
Step 3: the face is then projected horizontally and vertically to determine its boundary region; the boundary determination formula is as follows:
wherein,
The method for determining the eye region of interest comprises the following steps:
Step 1: assume that the length (height) of the detected face region is HF and its width is WF;
Step 2: in the vertical direction, the eyes lie in a band of height HF/5 located above the mid-line of the face and below the top of the head;
Step 3: in the horizontal direction, the eye region extends from WF/8 inside the left face boundary to WF/8 inside the right face boundary.
The invention is not limited to the foregoing embodiments; it extends to any novel feature, or any novel combination of features, disclosed in this specification, and to any novel step, or any novel combination of steps, of any method or process so disclosed.

Claims (7)

1. A driver assistance system based on ZigBee technology, characterized in that the system comprises: an infrared CCD camera for acquiring original image information; the infrared CCD camera is in signal connection with a DSP processor; the DSP processor is in signal connection with a second processor, which sends the processing results of the DSP processor to a high-performance processor; the second processor is in signal connection with the high-performance processor, which processes the data information sent by each processor; the high-performance processor is in signal connection with an alarm for issuing an alarm signal. The system further comprises: a common CCD camera for acquiring image information of the driver's travel route; the common CCD camera is in signal connection with a first processor, which acquires the data information sent by the common CCD camera together with the turn-signal indication of the automobile and sends the acquired signals to the high-performance processor.
2. The driver assistance system based on ZigBee technology of claim 1, wherein the DSP processor comprises: a blink frequency statistics unit for counting the driver's blink frequency; a blink closure time monitoring unit for monitoring the driver's blink closure time; and a yawning frequency monitoring unit for counting the driver's yawning frequency; the blink frequency statistics unit, the blink closure time monitoring unit and the yawning frequency monitoring unit are each in signal connection with the second processor.
3. The driver assistance system based on ZigBee technology of claim 2, wherein the blink frequency statistics unit comprises: a region verification module for screening out the driver's eye region; the region verification module is in signal connection with an image segmentation module for segmenting the verified region; the image segmentation module is in signal connection with a statistics module for performing blink statistics on the segmented images.
4. The driver assistance system based on ZigBee technology of claim 3, wherein the driver blink closure time monitoring unit comprises: a blink region verification module for screening out the driver's eye region; the blink region verification module is in signal connection with a blink image segmentation module for segmenting the verified region; the blink image segmentation module is in signal connection with a closure time statistics unit for counting blink closure time on the segmented images.
5. The driver assistance system based on ZigBee technology of claim 4, wherein the driver yawning frequency monitoring unit comprises: a mouth region verification module for screening out the driver's mouth region; the mouth region verification module is in signal connection with a mouth image segmentation module for segmenting the verified region; the mouth image segmentation module is in signal connection with a yawning frequency statistics unit for counting yawning frequency on the segmented images.
6. A method of the driver assistance system based on ZigBee technology according to any one of claims 1 to 5, characterized in that it comprises the following steps:
Step 1: start and initialize the system;
Step 2: the infrared CCD camera begins to collect image information of the driver; the face portion is screened out of the collected original image information and sent to the DSP processor;
Step 3: the DSP processor performs blink frequency statistics, blink closure time monitoring and yawning frequency monitoring on the received image information, and sends the processed results to the high-performance processor;
Step 4: the common CCD camera begins to acquire the conditions of the automobile while it is traveling and sends the acquired condition information to the first processor; the first processor judges whether the automobile has left the road according to the received condition information and the latest turn-signal indication, and sends the judgment result to the high-performance processor;
Step 5: the high-performance processor judges, from the data information sent by the second processor and the first processor, whether the alarm should be triggered, and issues an alarm signal if so.
7. The method of the driver assistance system based on ZigBee technology of claim 6, wherein the face portion is screened out of the original image collected by the CCD camera by the following steps:
Step 1: the image collected from the camera is in RGB format; the conversion from the RGB color space to the HSV color space is realized with the following formula:
Step 2: the driver's skin-color region is detected with the following formula:
Step 3: the face is then projected horizontally and vertically to determine its boundary region; the boundary determination formula is as follows:
wherein,
the method of the driving assistance system based on the ZigBee technology as claimed in claim 7, wherein the method of the eye region of interest determination comprises the steps of:
step 1: assuming that the length of the detected face region is HF; width is WF;
step 2: in the vertical direction, the eyes are located in an HF/5 area above one half of the face and below the top of the head;
and step 3: in the horizontal direction, the eye boundary region is located at a distance from the face left boundary WF/8 to the eye right boundary WF/8.
CN201610837205.4A 2016-09-21 2016-09-21 A kind of DAS (Driver Assistant System) based on ZigBee technology and method Pending CN106205057A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610837205.4A CN106205057A (en) 2016-09-21 2016-09-21 A kind of DAS (Driver Assistant System) based on ZigBee technology and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610837205.4A CN106205057A (en) 2016-09-21 2016-09-21 A kind of DAS (Driver Assistant System) based on ZigBee technology and method

Publications (1)

Publication Number Publication Date
CN106205057A true CN106205057A (en) 2016-12-07

Family

ID=58067891

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610837205.4A Pending CN106205057A (en) 2016-09-21 2016-09-21 A kind of DAS (Driver Assistant System) based on ZigBee technology and method

Country Status (1)

Country Link
CN (1) CN106205057A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7236650B1 (en) * 1999-06-04 2007-06-26 Sony Corporation Signal processing apparatus, method of the same, an image processing apparatus and method of same
CN102568158A (en) * 2010-12-15 2012-07-11 淄博高新区联创科技服务中心 Alarming system for automobile vehicle fatigue driving
EP2747056A1 (en) * 2011-12-27 2014-06-25 Honda Motor Co., Ltd. Driving assistance system
CN104504856A (en) * 2014-12-30 2015-04-08 天津大学 Fatigue driving detection method based on Kinect and face recognition
CN104751149A (en) * 2015-04-16 2015-07-01 张小磊 Personnel fatigue degree judging platform based on electronic detection
CN105844252A (en) * 2016-04-01 2016-08-10 南昌大学 Face key part fatigue detection method
CN105894735A (en) * 2016-05-31 2016-08-24 成都九十度工业产品设计有限公司 Intelligent vehicle-mounted fatigue monitoring system and method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20161207