CN112149560A - Lane departure detection method - Google Patents

Lane departure detection method

Info

Publication number
CN112149560A
CN112149560A (application CN202011002556.6A)
Authority
CN
China
Prior art keywords
detection
lane
detection area
area
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011002556.6A
Other languages
Chinese (zh)
Other versions
CN112149560B (en)
Inventor
贺徵延
程自清
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Dacheng Technology Development Co ltd
Original Assignee
Shenzhen Dacheng Technology Development Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Dacheng Technology Development Co ltd filed Critical Shenzhen Dacheng Technology Development Co ltd
Priority to CN202011002556.6A
Publication of CN112149560A
Application granted
Publication of CN112149560B
Legal status: Active
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588: Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/20: Image preprocessing
    • G06V10/30: Noise filtering
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems
    • G08G1/167: Driving aids for lane monitoring, lane changing, e.g. blind spot detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a lane departure detection method comprising the following steps: arranging target detection units respectively in a first detection area, a second detection area and a third detection area; acquiring first-view-angle target image information using the target detection unit in the first detection area and, in combination with panoramic image information, obtaining a safe driving route; the detection units of the second and third detection areas respectively acquiring, in real time, second-view-angle and third-view-angle target image information of the safe driving route; and providing real-time lane monitoring information to the driver through assisted-driving and big-data technology, according to the safe driving route and the second-view-angle and third-view-angle target image information. By integrating lane departure detection, a blind-zone detection system, driving-behavior analysis and a big-data platform, the method unifies panoramic imaging, assisted driving, Internet-of-Vehicles and big-data application technologies and achieves a lane departure recognition rate of 99%.

Description

Lane departure detection method
Technical Field
The invention relates to the technical field of driving assistance systems, in particular to a lane departure detection method.
Background
A Lane Departure Warning System (LDWS) recognizes the lane by capturing the forward driving image of a vehicle, uses that image to detect the vehicle's position within the lane, and warns the driver when an unintended lane departure is detected. LDWS algorithms were developed precisely to issue such a warning as soon as a deviation of the vehicle from its lane is recognized.
Specifically, forward image capture uses a camera (image sensor) mounted in the vehicle and facing forward to read the road image ahead. In typical use, such as highway driving, the white lane-line images on both sides of the lane are read; at installation, the user can set the lane starting point, the lane vanishing point and the left/right offset sensitivity of the vehicle as recognition parameters. The captured image is fed to an Electronic Control Unit (ECU) inside the LDWS; the ECU consists mainly of a Digital Signal Processor (DSP) working with a memory unit such as Random Access Memory (RAM) to perform the image-recognition work. The ECU is also connected to the vehicle's left and right turn signals, speedometer and similar sources, so that the vehicle body signals can be used to judge whether the driving behavior is dangerous. For example, suppose the vehicle is traveling on an expressway at more than 60 km/h and the body signals show that the driver has not operated the left or right turn signal (no lane change intended), yet the LDWS recognizes that the vehicle's heading is pressing on the lane lines or weaving across them rather than running stably between the two lines. If this is the case, a signal is sent to a warning unit, such as a visual and/or audible warning unit, which is activated to alert the driver, thereby fulfilling the function of the LDWS.
Lane departure detection involves panoramic imaging, assisted driving (ADAS), Internet of Vehicles and big-data application technologies. At present, the systems on the market mostly exist as independent functional units; there is no lane departure detection method for a panoramic assisted-driving system that integrates these functions.
Disclosure of Invention
The technical problem to be solved by the invention is that, in the prior art, lane departure detection involves panoramic imaging, assisted driving (ADAS), Internet of Vehicles and big-data application technologies, yet the systems on the market currently exist as independent functional units, and no lane departure detection method integrates these functions.
To address this problem, the invention integrates panoramic imaging, assisted driving (ADAS), Internet of Vehicles and big-data application technologies and provides a lane departure detection method for a panoramic assisted-driving system. The method is realized through in-depth work on digital signal processing, visual analysis, complex-system integration, intelligent sensing control and Internet-of-Vehicles technology, integrating lane departure detection, a blind-zone detection system, driving-behavior analysis and a big-data platform. During lane detection, a gray-level binarization operation removes noise from the grayscale image and yields clear lane-line contour information. Compared with the single-function standalone products currently on the market, the cost-effectiveness of the system is greatly improved, driving stability and safety are enhanced, and the lane departure recognition rate reaches 99%.
A lane departure detection method, comprising the steps of:
arranging target detection units respectively in a first detection area, a second detection area and a third detection area;
acquiring first-view-angle target image information using the target detection unit in the first detection area and, in combination with the panoramic image information, obtaining a safe driving route;
the detection units of the second detection area and the third detection area respectively acquiring, in real time, second-view-angle and third-view-angle target image information of the safe driving route; and
providing real-time lane monitoring information to the driver through assisted driving (ADAS) and big-data technology, according to the safe driving route and the second-view-angle and third-view-angle target image information.
In a first possible implementation of the present invention, the step of arranging target detection units respectively in a first, a second and a third detection area comprises:
the target detection unit adopting a single smart camera, one arranged in each of the first, second and third detection areas; and
setting the first, second and third detection areas as the central area at the front of the vehicle head, the vehicle blind zone, and the areas on both sides of the vehicle body, respectively.
With reference to the first possible implementation, in a second possible implementation, the step of acquiring first-view-angle target image information using the target detection unit in the first detection area and obtaining a safe driving route in combination with the panoramic image information comprises:
according to the panoramic image and the image information detected by the first-detection-area target detection unit, acquiring lane-line information and performing image simulation using a boundary operator and a data calibration algorithm; and
predicting lane extension-line information from the lane information and outputting extension-line image simulation information.
With reference to the second possible implementation, in a third possible implementation, the step of acquiring lane-line information and performing image simulation using a boundary operator and a data calibration algorithm, according to the panoramic image and the image information detected by the first-detection-area target detection unit, comprises:
judging, from the panoramic image and the image information detected by the first-detection-area target detection unit, whether a lane line exists in the first-view-angle area;
if a lane line exists, determining the possible area range of the lane line to be detected by the first-detection-area target detection unit; and
detecting the lane line, performing straight-line calibration, and performing simulated imaging of the calibrated lane line.
With reference to the third possible implementation, in a fourth possible implementation, the step of acquiring first-view-angle target image information using the target detection unit in the first detection area and obtaining a safe driving route in combination with the panoramic image information further comprises:
acquiring a reference value of the calibration straight line in the image detected by the target detection unit in the first detection area; and
the first-detection-area target detection unit acquiring the lane-line information during driving as dot-matrix data and comparing the data with the reference value to obtain the safe driving route.
With reference to the third possible implementation, in a fifth possible implementation, the step of detecting the lane line, performing straight-line calibration and performing simulated imaging of the calibrated lane line comprises:
performing a binarization operation on the gray values of the region of interest outside the vehicle image to obtain the lane contour; and
continuing the gray-level binarization operation on the lane contour and selecting the characteristic lane according to the lane contour features after binarization.
With reference to the fifth possible implementation, in a sixth possible implementation, the step of detecting the lane line, performing straight-line calibration and performing simulated imaging of the calibrated lane line further comprises:
predicting the lane-line features of the current frame from the lane-line information of the previous frame.
In a seventh possible implementation of the present invention, the step of the detection units of the second and third detection areas respectively acquiring, in real time, the second-view-angle and third-view-angle target image information of the safe driving route comprises:
detecting and identifying the vehicle head and the vehicle body in the vehicle blind zone using a neural network; and
calculating the advancing attitude of the vehicle from the head and body detection results to obtain the lateral and longitudinal distances of the vehicle at the blind-zone view angle.
In an eighth possible implementation of the present invention, the step of the detection units of the second and third detection areas respectively acquiring, in real time, the second-view-angle and third-view-angle target image information of the safe driving route further comprises:
identifying the wheels of vehicles at the view angle of the areas on both sides of the vehicle body; and
obtaining the vehicle wheel distance according to the panoramic image and the identification result.
In a ninth possible implementation of the present invention, the detection method further comprises:
detecting traffic-signal information or pedestrian information using the target detection unit in the first detection area; and
when signal-lamp stop information or pedestrian information is detected, issuing an alarm signal and braking the vehicle.
By implementing this lane departure detection method, a lane departure detection method for a panoramic assisted-driving system that unifies panoramic imaging, assisted driving (ADAS), Internet of Vehicles and big-data application is realized through in-depth work on digital signal processing, visual analysis, complex-system integration, intelligent sensing control and Internet-of-Vehicles technology, integrating lane departure detection, a blind-zone detection system, driving-behavior analysis and a big-data platform. During lane detection, a gray-level binarization operation removes noise from the grayscale image and yields clear lane-line contour information. Compared with the single-function standalone products currently on the market, the cost-effectiveness of the system is greatly improved, driving stability and safety are enhanced, and the lane departure recognition rate reaches 99%.
Drawings
In order to illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The following drawings show only some embodiments of the invention; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of a lane departure detection method according to a first embodiment of the present invention;
Fig. 2 is a schematic diagram of a lane departure detection method according to a second embodiment of the present invention;
Fig. 3 is a schematic diagram of a lane departure detection method according to a third embodiment of the present invention;
Fig. 4 is a schematic diagram of a lane departure detection method according to a fourth embodiment of the present invention;
Fig. 5 is a schematic diagram of a lane departure detection method according to a fifth embodiment of the present invention;
Fig. 6 is a schematic diagram of a lane departure detection method according to a sixth embodiment of the present invention;
Fig. 7 is a schematic diagram of a lane departure detection method according to a seventh embodiment of the present invention;
Fig. 8 is a schematic diagram of a lane departure detection method according to an eighth embodiment of the present invention;
Fig. 9 is a schematic diagram of a lane departure detection method according to a ninth embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The technical problem to be solved by the invention is that, in the prior art, lane departure detection involves panoramic imaging, assisted driving (ADAS), Internet of Vehicles and big-data application technologies, yet the systems on the market currently exist as independent functional units, and no lane departure detection method integrates these functions.
To solve this problem, the invention integrates panoramic imaging, assisted driving (ADAS), Internet of Vehicles and big-data application technologies and provides a lane departure detection method for a panoramic assisted-driving system.
A lane departure detection method, comprising the steps of:
s1, the target detection means is set in the first detection area, the second detection area, and the third detection area, respectively.
Preferably, step S1 comprises the steps of: S11, the target detection unit adopts a single smart camera, one arranged in each of the first, second and third detection areas; and S12, the first, second and third detection areas are set as the central area at the front of the vehicle head, the vehicle blind zone, and a specific area on the side of the vehicle body, respectively.
The smart camera in the central area at the front of the vehicle head mainly collects the lane image ahead of the vehicle. A binarization operation on the grayscale version of this image yields the contour features of the lane; these are combined with the panoramic image in an image simulation presented to the driver, and the collected lane images also allow the lane line to be predicted.
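The binarization step described in the paragraph above can be sketched in a few lines of Python. This is an illustrative toy, not the patent's actual implementation; the mean-intensity threshold used here is an assumption made for demonstration:

```python
import numpy as np

def binarize_lane_region(gray, threshold=None):
    """Binarize a grayscale road image so that bright lane markings
    become 1 and the darker road surface becomes 0. If no threshold is
    given, fall back to the mean intensity (a simple global heuristic,
    assumed here for illustration)."""
    if threshold is None:
        threshold = gray.mean()
    return (gray > threshold).astype(np.uint8)

# Toy 4x6 "road": dark asphalt (30) with two bright lane stripes (220).
road = np.full((4, 6), 30, dtype=np.uint8)
road[:, 1] = 220   # left lane stripe
road[:, 4] = 220   # right lane stripe

mask = binarize_lane_region(road)
print(int(mask.sum()))  # 8 pixels belong to the two stripes
```

In a real system the threshold would be chosen adaptively (e.g. per region of interest), but the separation of bright lane pixels from the road surface works the same way.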
S2, first-view-angle target image information is acquired using the target detection unit in the first detection area and, in combination with the panoramic image information, the safe driving route is obtained.
The first view angle mainly refers to the view-angle range in front of the moving vehicle, and the first-view-angle target image mainly refers to the lane features ahead of it. The target image is obtained from the grayscale image using a boundary operator and a data calibration algorithm, yielding the lane-line features ahead; the panoramic image is used to simulate each view angle of the driving route for the driver's reference, and an early warning is given when the driving direction deviates markedly.
Preferably, step S2 includes:
s21, detecting image information according to the panoramic image and the first detection area target detection unit, and acquiring lane line information and performing image simulation by using a boundary operator and a data calibration algorithm; and S22, predicting the lane extension line information according to the lane information and outputting the extension line image simulation information.
Preferably, the step S21 includes the steps of:
s211, judging whether a lane line exists in a first visual angle area or not according to the panoramic image and the graphic information detected by the first detection area target detection unit; s212, if a lane line exists, determining a possible area range of the lane line detected by the first detection area target detection unit; and S213, detecting the lane line, performing linear calibration, and performing analog imaging on the calibrated lane line.
The lane-line image acquired by the camera is arc-shaped, while the lane line in the panoramic image is always straight; converting the actually captured image to a straight line therefore requires calibration, which yields the straight-line image in the panoramic view. The two correspond one to one, and the straight lane lines in the panoramic image must be compared with the lane lines of the captured image to obtain the safe driving route in the panoramic image.
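The one-to-one correspondence above amounts to a calibration map from captured-image points to panoramic points. The toy lookup table below is purely illustrative; the real calibration procedure (built at installation time) is not disclosed in the patent:

```python
def to_panorama(points_cam, calib):
    """Map captured-image lane points to the panoramic (top-down) view
    via a per-point calibration table, reflecting the one-to-one
    correspondence described above. `calib` is a toy lookup; a real
    system would interpolate a dense calibration."""
    return [calib[p] for p in points_cam]

# Toy calibration: three arc-shaped captured points all map onto the
# same panoramic column, i.e. onto a straight lane line.
calib = {(12, 0): (10, 0), (11, 10): (10, 10), (12, 20): (10, 20)}
pano = to_panorama([(12, 0), (11, 10), (12, 20)], calib)
cols = {x for x, _ in pano}
print(cols)  # the straightened lane line occupies a single column
```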
Preferably, step S213 includes the steps of:
s2131, carrying out binarization operation on the gray value of the region of interest outside the vehicle image to obtain a lane contour; and S2132, continuously carrying out gray level binarization operation on the lane contour, selecting a characteristic lane according to the lane contour characteristics after gray level binarization, and predicting the lane characteristics of the current frame according to the lane line information of the previous frame.
Image binarization clearly separates the lane-line features from the non-lane-line features.
The region of interest is in fact the area in front of the vehicle; in the panoramic image it is the area ahead of the vehicle icon, whose position in the panoramic image is fixed. While the vehicle travels, the lane lines generally lie on both sides of the vehicle unless the vehicle presses on a lane line.
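The frame-to-frame prediction mentioned in step S2132 can be sketched with a constant-velocity update of the lane-line parameters. The prediction model here is an assumption; the patent only states that the current frame is predicted from the previous one:

```python
def predict_lane(prev_params, prev_prev_params=None):
    """Predict the current frame's lane-line parameters (slope, intercept)
    from past frames, assuming lane geometry changes slowly between
    frames. With two past frames, a constant-velocity step is applied;
    with only one, the previous estimate is carried over."""
    if prev_prev_params is None:
        return prev_params
    # Extrapolate each parameter: p_t = p_{t-1} + (p_{t-1} - p_{t-2}).
    return tuple(2 * p - q for p, q in zip(prev_params, prev_prev_params))

print(predict_lane((0.5, 10.0)))               # carry-over: (0.5, 10.0)
print(predict_lane((0.6, 12.0), (0.5, 10.0)))  # extrapolated step
```

The prediction narrows the search window for the current frame's lane detection, which is what makes the tracking robust to momentary occlusions.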
Preferably, the step S2 further includes the steps of:
s23, acquiring a reference value of the lane line calibration straight line affected by detection in the first detection area target detection unit; and S24, the first detection area target detection unit acquires data of lane line information in the driving process in a dot matrix form, and compares the data with the reference value to obtain a safe driving route.
The actually captured lane-line images and the straight panoramic lane lines are in a one-to-one mapping relationship; the straight lane lines in the panoramic image must be compared with the lane lines of the captured images to obtain the safe driving route in the panoramic image.
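The comparison of the dot-matrix lane samples against the reference value (steps S23/S24) can be sketched as a mean lateral-offset check. The comparison rule and the tolerance value are assumptions for illustration; the patent specifies neither:

```python
def departure_from_reference(lane_dots, reference_x, tolerance):
    """Compare lane-line dot-matrix samples (x, y) against the calibrated
    reference column and report whether the mean lateral offset exceeds
    the tolerance. A sketch of the S23/S24 comparison."""
    offsets = [x - reference_x for x, _y in lane_dots]
    mean_offset = sum(offsets) / len(offsets)
    return abs(mean_offset) > tolerance, mean_offset

# Samples drifting to the right of the calibrated reference at x = 10.
dots = [(13, 0), (14, 10), (15, 20)]
departing, offset = departure_from_reference(dots, reference_x=10, tolerance=3)
print(departing, offset)  # → True 4.0
```

When the offset stays within tolerance, the sampled route is accepted as the safe driving route; a sustained excess would feed the departure warning.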
S3, the detection units of the second detection area and the third detection area respectively acquire, in real time, the second-view-angle and third-view-angle target image information of the safe driving route.
Preferably, step S3 includes:
s31, respectively detecting and identifying the vehicle head and the vehicle body in the vehicle blind area by using a neural network;
and S32, calculating the vehicle advancing posture according to the detection and recognition results of the vehicle head and the vehicle body, and obtaining the transverse and longitudinal distances of the vehicle at the blind area view angle.
The second and third view angles correspond, respectively, to the vehicle information in the blind zone and the vehicle information on both sides of the ego vehicle.
For vehicle monitoring within the blind-zone view angle, a neural network identifies the vehicle head and vehicle body separately. The vehicle's attitude is calculated from the head and body recognition results, giving the lateral and longitudinal distances of vehicles behind and beside the ego vehicle. A boundary-detection operator yields the boundary image features of a side vehicle; the forward distance is obtained from the front bottom of that vehicle by reference to the calibration data, and the lateral distance is obtained from the vehicle's side boundary by reference to the boundary data and the calibration data.
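Once a neural-network detector has produced a bounding box for the blind-zone vehicle, the lateral and longitudinal distances reduce to calibrated pixel measurements. The box coordinates and pixels-per-metre values below are illustrative assumptions; the detector itself is out of scope here:

```python
def blind_zone_distances(bbox, ppm_x, ppm_y, ego_x, ego_y):
    """Estimate lateral and longitudinal distances to a detected vehicle
    from its bounding box (x1, y1, x2, y2) in the blind-zone view, using
    a pixels-per-metre calibration. A sketch of steps S31/S32."""
    x1, y1, x2, y2 = bbox
    cx = (x1 + x2) / 2            # detected vehicle centre column
    front_y = y1                  # nearest (front) edge of the target
    lateral = abs(cx - ego_x) / ppm_x
    longitudinal = abs(front_y - ego_y) / ppm_y
    return lateral, longitudinal

lat, lon = blind_zone_distances((200, 300, 280, 380),
                                ppm_x=40, ppm_y=40, ego_x=160, ego_y=100)
print(lat, lon)  # → 2.0 5.0
```

A real blind-zone view is not uniformly scaled, so the per-axis constants would be replaced by the panoramic calibration tables the description refers to.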
Preferably, step S3 further includes:
s33, identifying the wheels of the vehicle in the view angle of the side area of the vehicle body;
and S34, obtaining the vehicle wheel distance according to the panoramic image and the identification result.
For vehicle monitoring on both sides of the ego vehicle, the distance between an identified target and the ego vehicle is calculated from the wheel recognition result and the panoramic calibration data: the distance of vehicles on either side is computed from the bottom of the tire by reference to the calibration data.
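The tire-bottom distance lookup just described can be sketched as a calibration table from image row to metres. The table values are illustrative; the patent relies on pre-recorded calibration data without giving numbers:

```python
def side_vehicle_distance(tire_bottom_row, row_to_metres):
    """Estimate the distance to a side vehicle from the image row of its
    tire bottom, via a calibration table (row -> metres) built for the
    panoramic view. Falls back to the nearest calibrated row at or above
    the measurement; a real system would interpolate between entries."""
    best = max(r for r in sorted(row_to_metres) if r <= tire_bottom_row)
    return row_to_metres[best]

# Rows higher in the image correspond to vehicles farther away.
calib = {100: 5.0, 200: 3.0, 300: 1.5}
print(side_vehicle_distance(250, calib))  # → 3.0
```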
S4, real-time lane monitoring information is provided to the driver through assisted driving (ADAS) and big-data technology, according to the safe driving route and the second-view-angle and third-view-angle target image information.
Preferably, the detection method further comprises the steps of:
s5, detecting signal light information or pedestrian information by using the target detection unit in the first detection area;
and S6, when the signal lamp stop information or the pedestrian information is detected, sending an alarm signal and braking the vehicle.
By implementing the disclosed lane departure detection method, through in-depth work on digital signal processing, visual analysis, complex-system integration, intelligent sensing control and Internet-of-Vehicles technology, and by integrating lane departure and blind-zone detection systems, driving-behavior analysis and a big-data platform, a lane departure detection method for a panoramic assisted-driving system that unifies panoramic imaging, assisted driving (ADAS), Internet of Vehicles and big-data application is realized. During lane detection, a binarization operation removes noise from the grayscale image and yields clear lane-line contour information; compared with the single-function standalone products currently on the market, the cost-effectiveness of the system is greatly improved and driving stability and safety are enhanced.
While the present invention has been described with reference to the embodiments shown in the drawings, the invention is not limited to those embodiments, which are illustrative rather than restrictive. Those skilled in the art may make various changes and modifications without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A lane departure detection method, characterized by comprising the steps of:
arranging target detection units respectively in a first detection area, a second detection area and a third detection area;
acquiring first-view-angle target image information using the target detection unit in the first detection area and, in combination with the panoramic image information, obtaining a safe driving route;
the target detection units of the second detection area and the third detection area respectively acquiring, in real time, second-view-angle and third-view-angle target image information of the safe driving route; and
providing real-time lane monitoring information to the driver through assisted-driving and big-data technology, according to the safe driving route and the second-view-angle and third-view-angle target image information.
2. The lane departure detection method according to claim 1, characterized in that the step of arranging target detection units respectively in the first, second and third detection areas comprises:
the target detection unit adopting a single smart camera, one arranged in each of the first, second and third detection areas; and
setting the first, second and third detection areas as the central area at the front of the vehicle head, the vehicle blind zone, and the areas on both sides of the vehicle body, respectively.
3. The lane departure detection method according to claim 2, characterized in that the step of acquiring first-view-angle target image information using the target detection unit in the first detection area and obtaining a safe driving route in combination with the panoramic image information comprises:
according to the panoramic image and the image information detected by the first-detection-area target detection unit, acquiring lane-line information and performing image simulation using a boundary operator and a data calibration algorithm; and
predicting lane extension-line information from the lane information and outputting extension-line image simulation information.
4. The lane departure detection method according to claim 3, characterized in that the step of acquiring lane-line information and performing image simulation using a boundary operator and a data calibration algorithm, according to the panoramic image and the image information detected by the first-detection-area target detection unit, comprises:
judging, from the panoramic image and the image information detected by the first-detection-area target detection unit, whether a lane line exists in the first-view-angle area;
if a lane line exists, determining the possible area range of the lane line to be detected by the first-detection-area target detection unit; and
detecting the lane line, performing straight-line calibration, and performing simulated imaging of the calibrated lane line.
5. The lane departure detection method according to claim 4, characterized in that the step of acquiring first-view-angle target image information using the target detection unit in the first detection area and obtaining a safe driving route in combination with the panoramic image information further comprises:
acquiring a reference value of the calibration straight line in the image detected by the target detection unit in the first detection area; and
the first-detection-area target detection unit acquiring the lane-line information during driving as dot-matrix data and comparing the data with the reference value to obtain the safe driving route.
6. The lane departure detection method according to claim 5, characterized in that the step of detecting the lane line, performing straight-line calibration and performing simulated imaging of the calibrated lane line comprises:
performing a binarization operation on the gray values of the region of interest outside the vehicle image to obtain the lane contour; and
continuing the gray-level binarization operation on the lane contour and selecting the characteristic lane according to the lane contour features after binarization.
7. The lane departure detection method according to claim 6, wherein the step of detecting the lane line, performing linear calibration, and generating a simulated image of the calibrated lane line further comprises:
predicting the lane line of the current frame from the lane line information of the previous frame.
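The frame-to-frame prediction in claim 7 can be sketched with a straight-line model x = a*y + b per frame: a constant-velocity extrapolation of the parameters gives a search window for the current frame, and a detection is accepted only if it falls near the prediction. The parameter names and tolerances are illustrative assumptions:

```python
def predict_params(prev, prev_prev):
    """Extrapolate (a, b) assuming the change between frames stays constant."""
    return tuple(p + (p - pp) for p, pp in zip(prev, prev_prev))

def within_window(candidate, predicted, tol_a=0.05, tol_b=10.0):
    """Accept a detected line only if it lies near the predicted one."""
    da = abs(candidate[0] - predicted[0])
    db = abs(candidate[1] - predicted[1])
    return da <= tol_a and db <= tol_b

# The line drifted by (0.02, 1.0) between the last two frames, so the
# prediction continues that drift for the current frame.
predicted = predict_params(prev=(0.52, 101.0), prev_prev=(0.50, 100.0))
```

A Kalman filter over (a, b) would play the same role more robustly, but the idea is the same: the previous frame constrains where the current frame's lane line can plausibly be.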
8. The lane departure detection method according to any one of claims 1 to 7, wherein the step of acquiring, in real time with the detection units of the second detection area and the third detection area, target image information of the second and third view angles along the safe driving route comprises:
detecting and identifying, with a neural network, the vehicle head and the vehicle body within the blind areas of the vehicle; and
calculating the advancing attitude of the vehicle from the detection and identification results for the vehicle head and body, to obtain the lateral and longitudinal distances of the vehicle at the blind-area view angles.
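A hedged sketch of the distance computation in claim 8: given the bounding box of a part (head or body) detected by the neural network in the blind-zone view, pixel offsets are converted to metres with an assumed pixels-per-metre calibration. The scale values and function name are made up for illustration and would come from camera calibration in practice:

```python
def blind_zone_distances(box, image_w, image_h, px_per_m_x, px_per_m_y):
    """Return (lateral_m, longitudinal_m) for a box (x1, y1, x2, y2).

    Lateral: signed offset of the box centre from the image centre line.
    Longitudinal: distance of the box's bottom edge from the image bottom,
    i.e. from the near edge of the blind-zone camera's footprint.
    """
    x1, y1, x2, y2 = box
    center_x = (x1 + x2) / 2
    lateral_m = (center_x - image_w / 2) / px_per_m_x
    longitudinal_m = (image_h - y2) / px_per_m_y
    return lateral_m, longitudinal_m

lat, lon = blind_zone_distances((300, 200, 420, 440),
                                image_w=640, image_h=480,
                                px_per_m_x=100, px_per_m_y=80)
```

A linear pixels-per-metre scale only holds for a roughly top-down (bird's-eye) blind-zone view; a forward-facing camera would need a full perspective (homography) correction first.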
9. The lane departure detection method according to claim 8, wherein the step of acquiring, in real time with the detection units of the second detection area and the third detection area, target image information of the second and third view angles along the safe driving route further comprises:
identifying the wheels of the vehicle at the view angle of the side areas of the vehicle body; and
obtaining the distance between the wheels of the vehicle from the panoramic image and the identification result.
10. The lane departure detection method according to claim 9, further comprising the steps of:
detecting signal light information or pedestrian information with the target detection unit of the first detection area; and
when signal-light stop information or pedestrian information is detected, issuing an alarm signal and braking the vehicle.
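The decision rule in claim 10 can be sketched as follows: when the first-detection-area unit reports a stop signal or a pedestrian with sufficient confidence, both the alarm and the brake are triggered. The detection labels and the confidence threshold are assumptions for illustration:

```python
def decide(detections, min_conf=0.6):
    """detections: list of (label, confidence) pairs from the target detection unit.

    Returns which actions to take; alarm and braking are triggered together,
    as in claim 10.
    """
    trigger = any(label in ("red_light", "pedestrian") and conf >= min_conf
                  for label, conf in detections)
    return {"alarm": trigger, "brake": trigger}

action = decide([("vehicle", 0.9), ("pedestrian", 0.8)])  # pedestrian above threshold
clear = decide([("vehicle", 0.9), ("pedestrian", 0.3)])   # below threshold: no action
```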
CN202011002556.6A 2020-09-22 2020-09-22 Lane departure detection method Active CN112149560B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011002556.6A CN112149560B (en) 2020-09-22 2020-09-22 Lane departure detection method

Publications (2)

Publication Number Publication Date
CN112149560A true CN112149560A (en) 2020-12-29
CN112149560B CN112149560B (en) 2024-03-01

Family

ID=73896129

Country Status (1)

Country Link
CN (1) CN112149560B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112885087A (en) * 2021-01-22 2021-06-01 北京嘀嘀无限科技发展有限公司 Method, apparatus, device and medium for determining road condition information and program product
CN116331220A (en) * 2023-05-12 2023-06-27 禾多科技(北京)有限公司 Lane departure early warning method and early warning system for automatic driving vehicle

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005205983A (en) * 2004-01-21 2005-08-04 Aisan Ind Co Ltd Apparatus for visually recognizing surrounding of own vehicle
KR20140055726A (en) * 2012-11-01 2014-05-09 (주)언맨드솔루션 Lane wideh measuring system by using cammera
CN106167045A (en) * 2015-05-21 2016-11-30 Lg电子株式会社 Human pilot auxiliary device and control method thereof
CN110525360A (en) * 2019-08-26 2019-12-03 广汽蔚来新能源汽车科技有限公司 Auxiliary driving method, device, system and storage medium based on car-mounted terminal
CN110641366A (en) * 2019-10-12 2020-01-03 爱驰汽车有限公司 Obstacle tracking method and system during driving, electronic device and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WU, BIN et al.: "Fast lane line detection algorithm and its application", Transducer and Microsystem Technologies, pages 157-160 *

Similar Documents

Publication Publication Date Title
US8766816B2 (en) System for monitoring the area around a vehicle
CN106485233B (en) Method and device for detecting travelable area and electronic equipment
US8311283B2 (en) Method for detecting lane departure and apparatus thereof
CN102765365B (en) Pedestrian detection method based on machine vision and pedestrian anti-collision warning system based on machine vision
US8180561B2 (en) Vehicle-installation obstacle detection apparatus
KR101891460B1 (en) Method and apparatus for detecting and assessing road reflections
CN108263279A (en) The pedestrian detection and pedestrian impact avoiding device and method integrated based on sensor
EP1251032A2 (en) Apparatus and method of recognizing vehicle travelling behind
EP3252712B1 (en) Vision system and method for a motor vehicle
CN110588623B (en) Large automobile safe driving method and system based on neural network
CN101101333A (en) Apparatus and method for producing assistant information of driving vehicle for driver
JP6073358B2 (en) Active warning and / or navigation assistance method for avoiding collision between vehicle body and / or wheel and object
CN101131321A (en) Real-time safe interval measurement method and device used for vehicle anti-collision warning
KR20160051993A (en) Apparatus for driver assistance
US10839263B2 (en) System and method for evaluating a trained vehicle data set familiarity of a driver assitance system
EP2224743B1 (en) Vehicle periphery monitoring device, vehicle, and vehicle periphery monitoring program
CN110682907B (en) Automobile rear-end collision prevention control system and method
US8160300B2 (en) Pedestrian detecting apparatus
CN112149560B (en) Lane departure detection method
JP4116643B2 (en) Device for classifying at least one object around a vehicle
CN109389060A (en) A kind of vehicle week anti-collision warning method of view-based access control model
KR20150096924A (en) System and method for selecting far forward collision vehicle using lane expansion
JP4901287B2 (en) Driving support device
EP4177833A1 (en) Vision system and method for a motor vehicle
JP2000011298A (en) Read and side monitoring device for vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant