CN107640149B - Vehicle auxiliary driving method and system - Google Patents

Vehicle auxiliary driving method and system Download PDF

Info

Publication number
CN107640149B
CN107640149B CN201610586675.8A
Authority
CN
China
Prior art keywords
vehicle
data
headlight
oncoming vehicle
oncoming
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610586675.8A
Other languages
Chinese (zh)
Other versions
CN107640149A (en)
Inventor
唐帅
吕尤
孙铎
张海强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Audi AG
Original Assignee
Audi AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi AG filed Critical Audi AG
Priority to CN201610586675.8A priority Critical patent/CN107640149B/en
Publication of CN107640149A publication Critical patent/CN107640149A/en
Application granted granted Critical
Publication of CN107640149B publication Critical patent/CN107640149B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Traffic Control Systems (AREA)

Abstract

The disclosure provides a vehicle driving assistance method and system. The method operates in a vehicle in a dark driving mode and comprises: detecting profile data of an oncoming vehicle; capturing an image of the oncoming vehicle; calibrating the profile data and the image to the same coordinate system; determining the headlight position areas of the oncoming vehicle based on a stored correspondence between each type of vehicle profile data and headlight position data, together with the detected profile data of the oncoming vehicle; extracting the image pixels within each headlight position area; calculating an average brightness value for each headlight position area from the extracted pixels; and comparing each average brightness value with a predetermined brightness threshold corresponding to the headlight position area data of the profile data of the oncoming vehicle, to determine whether the corresponding headlight is on. The method and system improve the safety of night driving without increasing the driver's burden.

Description

Vehicle auxiliary driving method and system
Technical Field
The application relates to a vehicle driving assisting method and system. In particular, the present application relates to a vehicle driving assist method and system that determines a state of a headlight of an oncoming vehicle in a dark driving mode.
Background
For a vehicle traveling in an environment with low visibility, such as at night, in a tunnel, in heavy fog, or in rain and snow (hereinafter referred to as "dark driving"), the vehicle lamps serve largely to warn other vehicles in addition to providing illumination. If the lamps are not turned on, whether due to a malfunction or the driver's carelessness, it is very dangerous for the host vehicle as well as for other vehicles. Suppose only one headlight of an oncoming vehicle ahead of the host vehicle is on and the other is off (particularly when the headlight on the side close to the host vehicle is off while the lamp on the other side is on). In low visibility, the driver of the host vehicle will often take the oncoming vehicle for a motorcycle or an electric bicycle and misjudge its width, which can easily cause scrapes, collisions, and even serious accidents. Therefore, during dark driving it is necessary to accurately determine the state of the headlights of oncoming vehicles and, when necessary, warn the driver of the host vehicle, thereby reducing accidents.
Disclosure of Invention
According to one aspect of the present application, a vehicle driving assistance method is provided. When it is detected that the vehicle is in a dark driving mode, the method comprises: detecting profile data of an oncoming vehicle with a detection device; capturing an image of the oncoming vehicle with an image acquisition device; and performing, with a data processing device, the steps of: calibrating the profile data and the image of the oncoming vehicle to the same coordinate system; determining the headlight position areas of the oncoming vehicle based on the correspondence between each type of vehicle profile data and headlight position data stored in advance in a storage device, together with the profile data of the oncoming vehicle detected by the detection device; extracting the pixels of the image within the determined headlight position areas of the oncoming vehicle; calculating an average luminance value for each determined headlight position area based on the extracted pixels; and comparing the average luminance value of each headlight position area with a predetermined luminance threshold, stored in advance in the storage device, that corresponds to the headlight position area data of the profile data of the oncoming vehicle, to determine whether the corresponding headlight of that area is on.
According to another aspect of the present application, a vehicle driving assistance system is provided. The system comprises: a dark driving monitoring device for monitoring that the vehicle is driving in the dark and enabling a dark driving mode; a detection device for detecting profile data of an oncoming vehicle; an image acquisition device for capturing an image of the oncoming vehicle; a storage device for storing in advance the correspondence between each type of vehicle profile data and headlight position data, as well as a predetermined luminance threshold corresponding to the headlight position area data of the profile data of the oncoming vehicle; and a data processing device for performing the following operations: calibrating the profile data and the image of the oncoming vehicle to the same coordinate system; determining the headlight position areas of the oncoming vehicle based on the correspondence stored in advance in the storage device, together with the profile data of the oncoming vehicle detected by the detection device; extracting the pixels of the image within the determined headlight position areas of the oncoming vehicle; calculating an average luminance value for each determined headlight position area based on the extracted pixels; and comparing the average luminance value of each headlight position area with the predetermined luminance threshold, stored in advance in the storage device, that corresponds to the headlight position area data of the profile data of the oncoming vehicle, to determine whether the corresponding headlight of that area is on.
According to the vehicle driving assisting method and the vehicle driving assisting system, the state of headlights of oncoming vehicles can be accurately judged, the drivers of the vehicles can be warned when needed, and adjustment can be made in time, so that the safety of night driving is greatly improved on the premise of not increasing the burden of the drivers.
Drawings
Features, advantages and technical effects of exemplary embodiments of the present invention will be described below with reference to the accompanying drawings, in which like reference numerals represent like elements, and wherein:
FIG. 1 depicts an example scenario in which the vehicle assisted driving methods and systems provided by embodiments of the present application may be used;
FIG. 2 shows a simplified schematic diagram of a vehicle assisted driving system according to an embodiment of the present application; and
FIG. 3 shows a simplified flowchart of a vehicle assisted driving method according to an embodiment of the application.
Detailed Description
Features and exemplary embodiments of various aspects of the present invention will be described in detail below. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without some of these specific details. The following description of the embodiments is merely intended to provide a better understanding of the present invention by illustrating examples of the present invention. The present invention is in no way limited to any specific configuration and algorithm set forth below, but rather covers any modification, replacement or improvement of elements, components or algorithms without departing from the spirit of the invention.
Fig. 1 depicts an example scenario 100 in which vehicle-assisted driving methods and systems provided by embodiments of the present application may be used. The scene 100 shows vehicles A and B traveling toward each other. It is to be noted that, although vehicles A and B are shown as automobiles in fig. 1 for convenience of explanation, vehicles A and B may each be any of various types of four-wheeled vehicles, such as automobiles, off-road vehicles, sports cars, trucks, and the like. When driving in the dark, each vehicle should turn on its lamps to illuminate the road and warn other vehicles. As can be seen from fig. 1, at least one of the headlights of vehicle B is not lit due to a malfunction or the driver's inattention. In particular, when during night driving the headlight of vehicle B on the side close to vehicle A is off while the headlight on the other side is on, the driver of the oncoming vehicle A cannot recognize the contour of vehicle B; vehicle B then appears to be a two-wheeled vehicle such as a motorcycle, and its width is estimated erroneously. This is highly likely to cause accidents. If vehicle A is equipped with the vehicle driving assistance system, the system can accurately judge the state of the headlights of the oncoming vehicle B, warn the driver of vehicle A when necessary, and take reasonable action in time, greatly improving the safety of night driving without increasing the driver's burden.
Turning now to fig. 2, fig. 2 shows a simplified schematic diagram of a vehicle assisted driving system 200 according to an embodiment of the present application. The vehicle driving assistance system 200 according to the embodiment of the present invention may be applied to an automobile, and may also be applied to other motor vehicles, for example, various vehicles powered by an internal combustion engine or an electric motor, as well as new electric vehicles. As shown in fig. 2, the driving assistance system 200 includes a dark driving monitoring device 201, a detection device 202, an image acquisition device 203, a storage device 204, and a data processing device 205, and optionally includes a warning device 206, an automatic control device 207, and other necessary devices. The various devices may be interconnected.
The dark driving monitoring device 201 monitors whether the vehicle is driving in the dark and enables the dark driving mode. It may be the vehicle's own clock and/or a brightness sensor. Alternatively, it may be a switch such as a button, so that the driver can manually start the dark driving mode as needed.
The storage device 204 stores in advance the correspondence between each type of vehicle profile data and headlight position data, as well as a predetermined luminance threshold corresponding to the headlight position area data of each type of vehicle profile. The storage device 204 may also store other required data or information. It may be any machine-readable medium capable of storing or transmitting information, such as Random Access Memory (RAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), Read Only Memory (ROM), Programmable Read Only Memory (PROM), Erasable Programmable Read Only Memory (EPROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory, and the like. The storage device 204 may also include memory using paper, magnetic, and/or optical media, such as paper tape, hard disks, magnetic tape, floppy disks, magneto-optical disks (MO), CDs, DVDs, Blu-ray discs, and the like.
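The stored correspondence can be pictured as a simple lookup table keyed by vehicle profile class. The sketch below is only illustrative: the class names, headlight offsets, and thresholds are invented placeholders, not data from this disclosure.

```python
# Hypothetical lookup: profile class -> headlight offsets (metres from the
# contour centre) and the per-class brightness threshold. All values are
# illustrative placeholders.
HEADLIGHT_TABLE = {
    "sedan": {"offsets": [(-0.75, 0.65), (0.75, 0.65)], "threshold": 55.0},
    "suv":   {"offsets": [(-0.85, 0.80), (0.85, 0.80)], "threshold": 55.0},
    "truck": {"offsets": [(-1.10, 1.00), (1.10, 1.00)], "threshold": 60.0},
}

def lookup(profile_class):
    """Return headlight position data and the brightness threshold
    associated with a detected vehicle profile class."""
    entry = HEADLIGHT_TABLE.get(profile_class)
    if entry is None:
        raise KeyError(f"no headlight data for profile class {profile_class!r}")
    return entry
```

In practice the detection device would first classify the oncoming vehicle's contour into one of the stored profile classes, then use the returned offsets to locate the headlight regions in the calibrated image.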
After the dark driving mode is started, the detection means 202, the image acquisition means 203, the data processing means 205, the warning means 206, and the automatic control means 207 start to operate.
In one example, the detection device 202 is used to detect profile data of an oncoming vehicle. In another example, the detection device 202 may also detect parameters such as structural data or operational data of oncoming vehicles. The detection device 202 may comprise different types of sensors, such as laser scanners, radar sensors, infrared cameras, velocity sensors, accelerometers, a global positioning system (GPS) receiver, and so forth. In one example, a laser scanner or radar sensor may detect an oncoming vehicle from shape features and output information such as the position and shape of the oncoming vehicle. In another example, the infrared camera may detect an oncoming vehicle based on shape and texture information and output information such as its position and shape. In embodiments, sensors such as laser scanners, radar sensors, infrared cameras, velocity sensors, accelerometers, GPS receivers, and the like may be used alone or in combination.
In some embodiments, the image acquisition device 203 is used to capture images of oncoming vehicles. The image acquisition device 203 may be, for example, an RGB camera, whose images clearly preserve color and brightness information.
In some embodiments, after obtaining the profile data of the oncoming vehicle and the image of the oncoming vehicle, the data processing device 205 calibrates the profile data and the image of the oncoming vehicle to the same coordinate system, e.g., according to a predefined mapping function or other correspondence. In some embodiments, the data processing device 205 determines the headlight position region of the oncoming vehicle, for example, based on the correspondence relationship between each vehicle profile data and headlight position data stored in advance in the storage device 204, and the profile data of the oncoming vehicle detected by the detection device 202. In some embodiments, the data processing device 205 extracts pixels of the image at the determined headlight location area of the oncoming vehicle. In some embodiments, the data processing device 205 calculates an average luminance value for each headlight location area of the determined oncoming vehicle based on the extracted pixels. In some embodiments, the data processing device 205 compares the average brightness value of each headlight location region with a predetermined brightness threshold value stored in advance in the storage device 204 corresponding to the headlight location region data of the profile data of the oncoming vehicle to determine whether the corresponding headlight of the corresponding headlight location region is on.
Alternatively, in calculating the average luminance value of each headlight location region of an oncoming vehicle, the data processing device 205 may calculate, for each headlight location region, the luminance value of each pixel within the region, for example, based on the Lab color space, and then average the luminance values of all pixels within the region to obtain the average luminance value of the corresponding headlight location region. The Lab color space is a color opponent space based on a non-linear compression (e.g., CIE XYZ color space) coordinate system, in which a luminance dimension is represented by L and color opponent dimensions are represented by a and b. Since the Lab color space is a technique well known to those skilled in the related art, it will not be described herein.
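As a rough sketch of this calculation (not the patented implementation), the CIE L* lightness of each sRGB pixel can be computed via the standard sRGB linearization and then averaged over a headlight region. The function names are illustrative, and a D65 reference white with normalized luminance is assumed.

```python
def srgb_to_lab_l(r, g, b):
    """Convert one 8-bit sRGB pixel to CIE L* (the Lab luminance dimension)."""
    def linearize(c):
        # Undo the sRGB transfer function to get linear-light values in [0, 1].
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    # Relative luminance Y from linear RGB (sRGB/Rec.709 primaries, D65).
    y = 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)
    # CIE L* from Y (reference white Yn = 1 for normalized values).
    if y > (6.0 / 29.0) ** 3:
        f = y ** (1.0 / 3.0)
    else:
        f = y / (3.0 * (6.0 / 29.0) ** 2) + 4.0 / 29.0
    return 116.0 * f - 16.0

def region_average_l(pixels):
    """Average L* over a headlight-region pixel list [(r, g, b), ...]."""
    values = [srgb_to_lab_l(*p) for p in pixels]
    return sum(values) / len(values)
```

For example, a pure white pixel maps to L* of about 100 and a black pixel to about 0, so a region dominated by a lit headlight yields a high average L*.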
In some embodiments, the warning device 206 is configured to issue a warning if the data processing device 205 determines that the headlights of the oncoming vehicle on the side close to the driver are not lit and the headlights of the other side are lit.
Take the scenario of fig. 1 as an example and assume that vehicle A is equipped with the vehicle driving assistance system 200. The data processing device 205 calculates that the average luminance value of the headlight position region of vehicle B on the side close to the driver is L_ave_1, and that of the headlight position region on the other side is L_ave_2. The predetermined luminance threshold corresponding to the headlight position region data of the profile data of vehicle B, stored in advance in the storage device 204, is T_B. If L_ave_2 is greater than T_B and L_ave_1 is less than T_B, the data processing device 205 determines that the headlight of vehicle B on the side close to the driver is not lit while the headlight on the other side is lit. The warning device 206 then issues a warning to the driver of vehicle A based on this determination.
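The comparison described above reduces to a simple decision rule. The following sketch is illustrative; the parameter names are assumptions rather than identifiers from the disclosure.

```python
def headlight_states(l_ave_near, l_ave_far, threshold):
    """Classify each headlight region as on/off by comparing its average
    luminance (L_ave_1, L_ave_2 in the text) with the threshold T_B."""
    return {
        "near_side_on": l_ave_near > threshold,
        "far_side_on": l_ave_far > threshold,
    }

def should_warn(states):
    """Warn exactly in the dangerous case described in the text: the
    headlight near the host driver is off while the far one is on."""
    return states["far_side_on"] and not states["near_side_on"]
```

With L_ave_1 = 10, L_ave_2 = 80, and T_B = 55, for instance, `should_warn` returns `True`; when both regions exceed the threshold, no warning is issued.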
In various embodiments, the alert may take a variety of forms, for example visual, audible, or tactile. For example, the alert may include displaying an infrared image of the oncoming vehicle. In another example, alerting may include highlighting the oncoming vehicle in various views (e.g., brightening it, bolding its outline, or marking it with a red box). In another example, the alert may include an audible prompt. In other embodiments, the seat may tap the driver's back and/or buttocks at different frequencies and intensities to draw attention to the condition, or the steering wheel may inflate or micro-vibrate at different frequencies and intensities so that the driver's hands feel the signal.
In some embodiments, the detection device 202 may further include various sensors, such as laser scanners, radar sensors, infrared cameras, positioning devices, speed sensors, distance sensors, and acceleration sensors, to detect structural data and operational data of oncoming vehicles. The data processing device 205 is optionally configured to determine whether the trajectory of the oncoming vehicle and the trajectory of the own vehicle will overlap, based on the structural data and operational data of the oncoming vehicle and of the own vehicle. In some embodiments, when the data processing device 205 determines that the two trajectories will overlap, the automatic control device 207 may automatically steer the own vehicle away from the oncoming vehicle or automatically brake. In other embodiments, the automatic control device 207 may also perform other automatic control operations, such as light control; horn control; actuation control such as braking, acceleration, deceleration, and steering; adaptive cruise control (ACC); and so on.
Fig. 3 shows a simplified flowchart of a vehicle assisted driving method 300 according to an embodiment of the application. The vehicle assistant driving method 300 is executed when it is detected that the vehicle is in a "dark driving mode" (modes enabled when driving in an environment with low visibility such as at night, a tunnel, heavy fog, rain, and snow are collectively referred to as "dark driving modes").
In step 301, profile data of an oncoming vehicle is detected using a detection device (e.g., detection device 202 of fig. 2). At step 302, an image of an oncoming vehicle is captured with an image capture device (e.g., image capture device 203 of fig. 2).
After acquiring the profile data and images of the oncoming vehicle, the profile data and images of the oncoming vehicle are calibrated to the same coordinate system, e.g., according to a predefined mapping function or other correspondence, using a data processing device (e.g., data processing device 205 of fig. 2), in step 303.
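One common way to realize such a mapping, sketched here under the assumption of a calibrated pinhole camera (with any extrinsic rotation/translation from the detector frame into the camera frame already applied), projects a 3-D point onto image pixel coordinates. The intrinsic parameter values in the usage note are made up for illustration.

```python
def project_to_image(point_cam, fx, fy, cx, cy):
    """Project a 3-D point (x, y, z) expressed in the camera frame to pixel
    coordinates (u, v) with a pinhole model. fx, fy are focal lengths in
    pixels; (cx, cy) is the principal point. z must be positive, i.e. the
    point must lie in front of the camera."""
    x, y, z = point_cam
    if z <= 0:
        raise ValueError("point is behind the camera")
    return (fx * x / z + cx, fy * y / z + cy)
```

For example, with fx = fy = 800 and principal point (320, 240), a detected contour point 10 m straight ahead, (0, 0, 10), lands exactly on the principal point, so a lidar-detected headlight region can be located in the camera image.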
In step 304, a headlight position area of the oncoming vehicle is determined by a data processing device (e.g., the data processing device 205 of fig. 2), for example, based on a correspondence relationship between each type of vehicle contour data and headlight position data stored in advance in a storage device (e.g., the storage device 204 of fig. 2), and the contour data of the oncoming vehicle detected by a detection device (e.g., the detection device 202 of fig. 2).
At step 305, pixels of the image at the determined headlight location area of the oncoming vehicle are extracted using a data processing device (e.g., data processing device 205 of fig. 2).
At step 306, an average luminance value for each headlight location area of the determined oncoming vehicle is calculated based on the extracted pixels using a data processing device (e.g., data processing device 205 of fig. 2).
At step 307, the average brightness value of each headlight location region is compared with a predetermined brightness threshold value corresponding to the headlight location region data of the contour data of the oncoming vehicle, which is stored in advance in a storage device (e.g., the storage device 204 of fig. 2), using a data processing device (e.g., the data processing device 205 of fig. 2) to determine whether the corresponding headlight of the corresponding headlight location region is on.
In some embodiments, the step 306 may include the following sub-steps: at step 306-1 (not shown in fig. 3), calculating, with a data processing device (e.g., data processing device 205 of fig. 2), for each headlight location region, a luminance value for each pixel within the region, e.g., based on a Lab color space; at step 306-2 (not shown in fig. 3), the luminance values of all pixels within each headlight location area are averaged using a data processing device (e.g., data processing device 205 of fig. 2) to derive an average luminance value for the corresponding headlight location area. In other embodiments, the average luminance value of each headlight position region may be calculated in other manners.
The vehicle assisted driving method 300 further includes issuing a warning with a warning device (e.g., the warning device 206 of fig. 2) (step 308) when the data processing device (e.g., the data processing device 205 of fig. 2) determines that the headlight of the oncoming vehicle on the side close to the driver is not lit (step 307-1) and the headlight on the other side is lit (step 307-2). In various embodiments, the alert may take a variety of forms, for example visual, audible, or tactile. For example, the alert may include displaying an infrared image of the oncoming vehicle. In another example, alerting may include highlighting the oncoming vehicle in various views (e.g., brightening it, bolding its outline, or marking it with a red box). In another example, the alert may include an audible prompt. In other embodiments, the seat may tap the driver's back and/or buttocks at different frequencies and intensities to draw attention to the condition, or the steering wheel may inflate or micro-vibrate at different frequencies and intensities so that the driver's hands feel the signal.
Alternatively, when the data processing device (e.g., the data processing device 205 of fig. 2) determines that the headlight of the oncoming vehicle on the side close to the driver is not lit (step 307-1) and the headlight on the other side is lit (step 307-2), a warning need not be issued directly with the warning device (e.g., the warning device 206 of fig. 2). In some embodiments, various sensors, such as laser scanners, radar sensors, positioning devices, infrared sensors, speed sensors, distance sensors, and acceleration sensors, may be used to detect structural data and operational data of the oncoming vehicle. The data processing device (e.g., the data processing device 205 of fig. 2) then determines whether the trajectory of the oncoming vehicle and the trajectory of the own vehicle will overlap, based on the structural data and operational data of the oncoming vehicle and of the own vehicle. If the data processing device (e.g., the data processing device 205 of fig. 2) determines that the two trajectories will overlap, the automatic control device (e.g., the automatic control device 207 of fig. 2) may automatically steer the own vehicle away from the oncoming vehicle or automatically brake. In other embodiments, the automatic control device 207 may also perform other automatic control operations, such as light control; horn control; actuation control such as braking, acceleration, deceleration, and steering; adaptive cruise control (ACC); and so on.
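A minimal constant-velocity sketch of the trajectory-overlap check described above; the prediction horizon, time step, and minimum-gap values are assumptions for illustration, not values from the disclosure.

```python
def trajectories_overlap(p_self, v_self, p_other, v_other,
                         horizon=5.0, step=0.1, min_gap=2.0):
    """Predict both vehicles forward under a constant-velocity model and
    report whether their predicted positions come closer than min_gap
    metres at any time within the horizon (seconds).

    p_* are (x, y) positions in metres; v_* are (vx, vy) in m/s."""
    t = 0.0
    while t <= horizon:
        sx, sy = p_self[0] + v_self[0] * t, p_self[1] + v_self[1] * t
        ox, oy = p_other[0] + v_other[0] * t, p_other[1] + v_other[1] * t
        if ((sx - ox) ** 2 + (sy - oy) ** 2) ** 0.5 < min_gap:
            return True
        t += step
    return False
```

For example, two vehicles closing head-on in the same lane at 10 m/s each from 50 m apart are flagged as overlapping, while the same approach in lanes 4 m apart is not.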
While the invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the construction and methods of the embodiments described above. On the contrary, the invention is intended to cover various modifications and equivalent arrangements. In addition, while the various elements and method steps of the disclosed invention are shown in various example combinations and configurations, other combinations, including more, less or all, of the elements or methods are also within the scope of the invention.

Claims (8)

1. A vehicle assisted driving method that operates in a vehicle in a dark driving mode and includes the steps of:
detecting profile data of an oncoming vehicle with a detection device, the detection device comprising one or more of a laser scanner, a radar sensor, and an infrared camera;
acquiring an image of the oncoming vehicle by using an image acquisition device;
performing, with a data processing apparatus, the steps of:
calibrating the profile data and the image of the oncoming vehicle to the same coordinate system;
determining a headlight position area of the oncoming vehicle based on a corresponding relationship between each type of vehicle profile data and headlight position data stored in a storage device in advance and the profile data of the oncoming vehicle detected by the detection device;
extracting pixels of the image at the determined headlight location area of the oncoming vehicle;
calculating an average luminance value of each headlight location area of the determined oncoming vehicle based on the extracted pixels; and
comparing the average brightness value of each headlight position region with a predetermined brightness threshold value corresponding to headlight position region data of the contour data of the oncoming vehicle, which is stored in advance in the storage device, to determine whether the corresponding headlight of the corresponding headlight position region is on; and
issuing a warning with a warning device if the headlight of the oncoming vehicle on the side close to the driver is not lit and the headlight on the other side is lit.
2. The vehicle assistant driving method according to claim 1, wherein the step of calculating the average luminance value of each headlight location area of the determined oncoming vehicle based on the extracted pixels includes:
calculating a brightness value of each pixel within each headlight position region based on the Lab color space; and
the average luminance value of each headlight position region is calculated by averaging the luminance values of all the pixels in the respective headlight position region.
3. The vehicle assisted driving method according to claim 1 or 2, wherein the image acquisition device is an RGB camera.
4. The vehicle assistant driving method according to claim 1, further comprising:
detecting structural data and operational data of the oncoming vehicle by using the detection device;
judging whether the driving tracks of the oncoming vehicles and the driving tracks of the vehicle are overlapped or not by utilizing the data processing device based on the structural data and the operation data of the oncoming vehicles and the structural data and the operation data of the vehicle; and
in the case where the running track of the oncoming vehicle and the running track of the vehicle will overlap, automatically steering the vehicle away from the oncoming vehicle or automatically braking by using an automatic control device.
5. A vehicle assisted driving system, the system comprising:
a dark driving monitoring device for monitoring driving of the vehicle in the dark and enabling a dark driving mode;
a detection device for detecting profile data of an oncoming vehicle, the detection device comprising one or more of a laser scanner, a radar sensor, and an infrared camera;
the image acquisition device is used for acquiring an image of the oncoming vehicle;
a storage means for storing in advance a correspondence relationship of each vehicle profile data and headlight position data and a predetermined luminance threshold value corresponding to headlight position area data of the profile data of each vehicle;
a data processing apparatus for performing the following operations:
calibrating the profile data and the image of the oncoming vehicle to the same coordinate system;
determining a headlight position area of the oncoming vehicle based on a corresponding relationship between each type of vehicle contour data and headlight position data stored in advance in the storage device and the contour data of the oncoming vehicle detected by the detection device;
extracting pixels of the image at the determined headlight location area of the oncoming vehicle;
calculating an average luminance value of each headlight location area of the determined oncoming vehicle based on the extracted pixels; and
comparing the average brightness value of each headlight location area with a predetermined brightness threshold value corresponding to headlight location area data of the profile data of the oncoming vehicle, which is stored in advance in the storage device, to determine whether the corresponding headlight of the corresponding headlight location area is on; and
a warning device for issuing a warning when the data processing device determines that the headlight of the oncoming vehicle on the side close to the driver is not lit and the headlight on the other side is lit.
6. The vehicle driving assistance system according to claim 5, wherein the data processing device is configured to:
calculate a brightness value of each pixel within each headlight position area based on the Lab color space; and
calculate the average brightness value of each headlight position area by averaging the brightness values of all pixels in that headlight position area.
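Claim 6 specifies brightness via the Lab color space; a minimal reading is to take each pixel's L* channel and average it over the region. The sketch below uses the standard sRGB-to-L* conversion (Rec. 709 primaries, D65 white point) as an assumption; the patent does not state which RGB-to-Lab transform is used.

```python
# Illustrative per-pixel Lab brightness (L*) and region averaging.
def srgb_to_lab_l(r, g, b):
    """Return the Lab L* value (0..100) of one 8-bit sRGB pixel."""
    def linearize(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    # Relative luminance Y from linear RGB (Rec. 709 coefficients).
    y = 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)
    # CIE L* from Y/Yn with Yn = 1 (D65 reference white).
    if y > (6.0 / 29.0) ** 3:
        f = y ** (1.0 / 3.0)
    else:
        f = y * (29.0 / 6.0) ** 2 / 3.0 + 4.0 / 29.0
    return 116.0 * f - 16.0

def region_average_l(pixels):
    """Average L* over an iterable of (r, g, b) pixels from one region."""
    values = [srgb_to_lab_l(r, g, b) for r, g, b in pixels]
    return sum(values) / len(values)
```

Pure white maps to L* = 100 and pure black to L* = 0, so the region average stays on a fixed 0–100 scale regardless of exposure encoding, which is convenient for a per-profile threshold.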
7. The vehicle driving assistance system according to claim 5 or 6, wherein the image acquisition device is an RGB camera.
8. The vehicle driving assist system according to claim 5, wherein:
the detection device is further configured to detect structural data and operational data of the oncoming vehicle;
the data processing device is further configured to judge, based on the structural data and operational data of the oncoming vehicle and those of the own vehicle, whether the travel trajectory of the oncoming vehicle and that of the own vehicle will overlap; and
the vehicle driving assistance system further comprises an automatic control device for automatically steering the own vehicle away from the oncoming vehicle or automatically braking when the travel trajectory of the oncoming vehicle and that of the own vehicle will overlap.
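The trajectory-overlap judgment of claim 8 can be illustrated with a constant-velocity projection: both vehicles are rolled forward from their current state and an overlap is flagged if the predicted positions ever come closer than the sum of the vehicles' half-widths. The motion model, horizon, and step size are assumptions for illustration; the patent leaves the prediction method unspecified.

```python
import math

def will_overlap(own, oncoming, horizon=5.0, step=0.1):
    """own/oncoming: dicts with x, y (m), heading (rad), speed (m/s),
    and width (m). Returns True if the straight-line projections come
    within the combined half-widths inside `horizon` seconds."""
    min_gap = (own["width"] + oncoming["width"]) / 2.0
    t = 0.0
    while t <= horizon:
        # Project both vehicles forward under constant speed and heading.
        ox = own["x"] + own["speed"] * math.cos(own["heading"]) * t
        oy = own["y"] + own["speed"] * math.sin(own["heading"]) * t
        cx = oncoming["x"] + oncoming["speed"] * math.cos(oncoming["heading"]) * t
        cy = oncoming["y"] + oncoming["speed"] * math.sin(oncoming["heading"]) * t
        if math.hypot(ox - cx, oy - cy) < min_gap:
            return True
        t += step
    return False
```

A head-on pair in the same lane is flagged, while an oncoming vehicle offset by a full lane width is not; in the claimed system a positive result would trigger the automatic steering or braking of the control device.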
CN201610586675.8A 2016-07-22 2016-07-22 Vehicle auxiliary driving method and system Active CN107640149B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610586675.8A CN107640149B (en) 2016-07-22 2016-07-22 Vehicle auxiliary driving method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610586675.8A CN107640149B (en) 2016-07-22 2016-07-22 Vehicle auxiliary driving method and system

Publications (2)

Publication Number Publication Date
CN107640149A CN107640149A (en) 2018-01-30
CN107640149B true CN107640149B (en) 2020-09-08

Family

ID=61109562

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610586675.8A Active CN107640149B (en) 2016-07-22 2016-07-22 Vehicle auxiliary driving method and system

Country Status (1)

Country Link
CN (1) CN107640149B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110550050A (en) * 2019-10-09 2019-12-10 马鞍山问鼎网络科技有限公司 Intelligent navigation system for safe driving of vehicle
CN112101230B (en) * 2020-09-16 2024-05-14 招商局重庆公路工程检测中心有限公司 Method and system for detecting opening of head lamp of road tunnel passing vehicle
CN112697066A (en) * 2020-12-02 2021-04-23 王刚 Vehicle part positioning method and device and computer storage medium

Citations (5)

Publication number Priority date Publication date Assignee Title
CN201293683Y (en) * 2008-11-27 2009-08-19 凯迈(洛阳)测控有限公司 Portable vehicle-mounted thermal infrared imager
KR20100035943A (en) * 2008-09-29 2010-04-07 (주)인펙비전 Tracking method for image brightness
KR20150011424A (en) * 2013-07-22 2015-02-02 (주)에이치엠씨 Method for detecting the failure of rear combination lamp of vehicle by using camera image and system thereof
CN104760537A (en) * 2015-04-21 2015-07-08 重庆大学 Novel vehicle-mounted safe driving assistance system
CN105549023A (en) * 2014-10-23 2016-05-04 现代摩比斯株式会社 Object detecting apparatus, and method of operating the same

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US20120287276A1 (en) * 2011-05-12 2012-11-15 Delphi Technologies, Inc. Vision based night-time rear collision warning system, controller, and method of operating the same
JP6095605B2 (en) * 2014-04-24 2017-03-15 本田技研工業株式会社 Vehicle recognition device

Also Published As

Publication number Publication date
CN107640149A (en) 2018-01-30

Similar Documents

Publication Publication Date Title
US11914381B1 (en) Methods for communicating state, intent, and context of an autonomous vehicle
JP7096150B2 (en) Brake light detection of vehicles traveling in front to adapt the activation of active safety mechanisms
KR101816423B1 (en) Displaying apparatus replacing side mirror and method for controlling output brightness thereof
US9589464B2 (en) Vehicular headlight warning system
US9346401B2 (en) Motor vehicle and method for operating a motor vehicle
US7394356B2 (en) Information presenting apparatus and method
JP4730267B2 (en) Visibility state determination device for vehicle
EP2879912B1 (en) System and method for controlling exterior vehicle lights responsive to detection of a semi-truck
CN107640149B (en) Vehicle auxiliary driving method and system
JP2016101797A (en) Safety control device for vehicle start time
JP2008189148A (en) Traveling state detection device
JP4751894B2 (en) A system to detect obstacles in front of a car
CN109435839B (en) Device and method for detecting vehicle steering lamp close to lane
JP6136564B2 (en) Vehicle display device
CN109878535B (en) Driving assistance system and method
US11479299B2 (en) Display device for a vehicle
CN113525359A (en) Curve speed control module and method and engine control unit comprising same
US20220292686A1 (en) Image processing apparatus, image processing method, and computer-readable storage medium storing program
KR20230075413A (en) Information processing device, information processing method, program and projection device
US20140153782A1 (en) Imaging system and method for detecting a winding road
CN108569290B (en) Electronic control device and method for vehicle and vehicle comprising same
CN110816401A (en) Method and device for warning vehicle sliding of front vehicle, vehicle and storage medium
US11951981B2 (en) Systems and methods for detecting turn indicator light signals
KR102559534B1 (en) Method and apparatus for supporting driving lines of vehicle in bad weather
KR20230074481A (en) Information processing device, information processing method, program and projection device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant