CN114495063A - Method for detecting lane departure degree and readable storage medium - Google Patents
Method for detecting lane departure degree and readable storage medium
- Publication number
- CN114495063A (application CN202210096246.8A)
- Authority
- CN
- China
- Prior art keywords
- lane
- lane line
- line
- image
- degree
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/143—Alarm means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
Abstract
A method for detecting the degree of lane departure, and a readable storage medium, wherein the method comprises: acquiring a road surface image in front of a vehicle; inputting the road surface image into a pre-trained lane line detection model to obtain a lane line image; detecting lane line straight lines in the lane line image; computing, for each lane line straight line, the distance from its intersection with the lower boundary of the lane line image to the central axis of the image, and taking the line with the smallest distance as the first lane line of the current lane; and judging whether a second lane line of the current lane can be obtained according to whether there exists a lane line straight line whose horizontal included angle is opposite in sign to that of the first lane line. The cases in which two lane lines can be obtained and in which only one can be obtained are thereby distinguished, and the lane departure degree is calculated in a different way for each, so the method adapts to departure calculation when only one lane line is available, widening the applicable scenarios and improving the accuracy of the calculation.
Description
Technical Field
The invention relates to the technical field of driver assistance, and in particular to a lane departure degree detection method and a readable storage medium.
Background
With social and economic development, market demand for automotive driver assistance systems keeps growing and their feature sets keep expanding, and lane departure warning is one of the important functions. Existing lane departure warning solutions differ in their details, but most judge the degree of lane departure on the premise that both the left and right lane lines of the current lane can be accurately detected; their ability to handle the case where only one lane line can be detected is limited.
Disclosure of Invention
The invention mainly addresses the technical problem that existing lane departure degree detection methods have limited ability to handle the case where only one lane line can be detected.
According to a first aspect, an embodiment provides a method of detecting a degree of lane departure, comprising:
acquiring a road surface image in front of a vehicle;
inputting the road surface image in front of the vehicle into a pre-trained lane line detection model to obtain a lane line image, the lane line image representing the lane line regions in the road surface image; if no lane line region is detected, judging that the lane departure degree is 0 and not executing the subsequent steps; otherwise continuing with the following steps;
performing straight line detection on the lane line image to obtain lane line straight lines; if no lane line straight line is detected, judging that the lane departure degree is 0 and not executing the subsequent steps; otherwise continuing with the following steps;
calculating the distance from the intersection of each lane line straight line with the lower boundary of the lane line image to the central axis of the lane line image, and taking the straight line with the smallest distance as the first lane line of the current lane;
calculating the horizontal included angle θ1 of the first lane line and the horizontal included angles of the other lane line straight lines, wherein a horizontal included angle is the angle between a lane line straight line and the lower boundary of the lane line image, with a value range of (-90°, 90°); searching for a lane line straight line whose horizontal included angle is opposite in sign to θ1; if such lines exist, taking, among them, the line whose intersection with the lower boundary of the lane line image is closest to the central axis as the second lane line of the current lane; if not, judging that only one lane line of the current lane can be obtained;
when two lane lines of the current lane are obtained, calculating the lane departure degree according to the distances from the two lane lines' intersections with the lower boundary of the lane line image to the central axis of the image and/or the horizontal included angles of the two lane lines; when only one lane line of the current lane can be obtained, calculating the lane departure degree according to the position of the intersection of the first lane line with the lower boundary of the lane line image and the horizontal included angle of that lane line;
and outputting an alarm signal when the lane departure degree is greater than a preset departure degree threshold value.
In one embodiment, the lane line detection model is trained on a road surface sample image set labeled with the position information of lane line regions and the corresponding lane line type information, and the lane line image also represents the lane line type of each lane line region, where the lane line types are solid, dashed, and unknown;
the detection method further comprises: obtaining the lane line type of the first lane line from the lane line region in which it lies, wherein the output alarm signal is a solid-line-pressing alarm signal when the lane line type of the first lane line is a solid line.
In one embodiment, performing straight line detection on the lane line image to obtain lane line straight lines, and judging that the lane departure degree is 0 if no lane line straight line is detected, comprises:
performing straight line detection on the lane line image with the Hough line detection algorithm, or performing least-squares line fitting within each connected domain of the lane line image, to obtain preliminary lane line straight lines; if no preliminary lane line straight line is detected, judging that the lane departure degree is 0 and not executing the subsequent steps; otherwise continuing with the following steps;
calculating the slope of each preliminary lane line straight line and the coordinate of its intersection with the lower boundary of the lane line image; if the slopes of two preliminary lane line straight lines differ by less than a preset slope-difference threshold, or their intersection coordinates differ by less than a preset coordinate-difference threshold, retaining only one of the two as a final lane line straight line, and otherwise retaining both as final lane line straight lines.
In one embodiment, when the lane departure degree is calculated from the distances of the two lane lines' intersections with the lower boundary of the lane line image to the central axis of the lane line image, the formula for calculating the lane departure degree is as follows:
Deviate=1-d1/d2,
where Deviate denotes a lane departure degree, d1 denotes a distance from an intersection of the first lane line with the lower boundary of the lane line image to the central axis of the lane line image, and d2 denotes a distance from an intersection of the second lane line with the lower boundary of the lane line image to the central axis of the lane line image.
In one embodiment, when the lane departure degree is calculated from the horizontal included angles of the two lane lines, the formula for calculating the lane departure degree is as follows:
Deviate=1-min(abs(θ1),abs(θ2))/max(abs(θ1),abs(θ2)),
where Deviate denotes the lane departure degree, θ2 denotes the horizontal included angle of the second lane line, abs() denotes the absolute value, min(abs(θ1), abs(θ2)) denotes the smaller of abs(θ1) and abs(θ2), and max(abs(θ1), abs(θ2)) the larger of the two.
In one embodiment, calculating the lane departure degree according to the position of the intersection of the lane line with the lower boundary of the lane line image and the horizontal included angle of the lane line comprises:
when the intersection of the first lane line with the lower boundary of the lane line image lies outside the image range, judging that the lane departure degree is 0; when it lies within the image range, calculating the slope of the first lane line from the horizontal included angle θ1, and inputting the absolute value of the slope into a preset lane departure degree mapping function to obtain the lane departure degree.
In one embodiment, the lane departure degree mapping function is:
y=2*(1/(1+e^(-x))-0.5),
where y represents the degree of lane departure and x represents the absolute value of the slope of the first lane line.
In one embodiment, the lane line detection model is trained by:
obtaining a basic lane line detection model through deep learning training by using a road surface sample image set only marked with the position information of the lane line region;
and further training the basic lane line detection model with a road surface sample image set labeled with the position information of lane line regions and the corresponding lane line type information, to obtain the lane line detection model.
In one embodiment, the method for detecting lane departure and solid-line pressing further comprises: storing the road surface image in front of the vehicle together with the corresponding lane departure degree and solid-line-pressing detection result, for training a neural network model that can obtain the lane departure degree and solid-line-pressing detection result directly from the road surface image in front of the vehicle.
According to a second aspect, an embodiment provides a computer-readable storage medium having a program stored thereon, the program being executable by a processor to implement the method of detecting a degree of lane departure as set forth in the first aspect above.
According to the lane departure degree detection method of the embodiments, lane line straight lines are detected in the road surface image in front of the vehicle; the distance from each line's intersection with the lower boundary of the lane line image to the central axis of the image is computed, and the line with the smallest distance is taken as the first lane line of the current lane; whether a second lane line of the current lane can be obtained is judged by whether there exists a line whose horizontal included angle is opposite in sign to that of the first lane line. The cases in which two lane lines can be obtained and in which only one can be obtained are thereby distinguished, and the departure degree is calculated in a different way for each, so the method adapts to departure calculation when only one lane line is obtained, widening the applicable scenarios and improving the accuracy of the calculation.
Drawings
Fig. 1 is a flowchart of a lane departure degree detection method of an embodiment;
FIG. 2 is a schematic diagram of a lane line image from an image of a road surface in front of a vehicle;
FIG. 3 is a diagram illustrating a situation where no lane line region is detected;
FIG. 4 is a flow diagram of training a lane line detection model in one embodiment;
FIG. 5 is a flow chart illustrating line detection of a lane line image according to an embodiment;
FIG. 6 is a schematic of line detection or line fitting;
FIG. 7 is a schematic view of merging similar straight lines;
FIG. 8 is a schematic diagram of the intersection and angle of the lane line straight line and the lower image boundary;
FIG. 9 is a road surface image in front of a vehicle acquired when the camera device is mounted according to specification;
fig. 10 is a road surface image in front of the vehicle acquired when the camera device is tilted to the right or left;
FIG. 11 is a diagram illustrating an example of a suitable scenario for a manner of calculating a degree of lane departure in one embodiment;
FIG. 12 is an image of a lane departure degree mapping function in one embodiment;
fig. 13 is a schematic diagram of a case where a line-pressing risk is detected.
Detailed Description
The present invention is described in further detail below with reference to the detailed description and the accompanying drawings, in which like elements in different embodiments bear like reference numerals. In the following description, numerous details are set forth to provide a better understanding of the present application; however, those skilled in the art will readily recognize that in different instances some of these features may be omitted, or replaced by other elements, materials, or methods. Certain operations related to the present application are not shown or described in detail in order to avoid obscuring its core with excessive description; a detailed account of these operations is unnecessary, since they can be fully understood from the specification and the general knowledge in the art.
Furthermore, the features, operations, or characteristics described in the specification may be combined in any suitable manner to form various embodiments, and the steps or actions in the method descriptions may be swapped or reordered in ways that will be apparent to those skilled in the art. The various sequences in the specification and drawings therefore serve only to describe particular embodiments and do not imply a required order unless such an order is explicitly stated.
The numbering of components, e.g. "first" and "second", is used herein only to distinguish the objects described and carries no ordinal or technical meaning. The terms "connected" and "coupled", as used in this application and unless otherwise indicated, include both direct and indirect connections (couplings).
Referring to fig. 1, a method for detecting the degree of lane departure in an embodiment of the present application includes steps 110 to 190, which are described in detail below.
Step 110: an image of the road surface ahead of the vehicle is acquired. The image may be captured by a camera device mounted at the front of the vehicle, for example a driving recorder mounted at the center of the vehicle's front end.
Step 120: the road surface image in front of the vehicle is input into a pre-trained lane line detection model to obtain a lane line image, which represents the lane line regions in the road surface image. If no lane line region is detected, the lane departure degree is judged to be 0 and the subsequent steps are not executed; otherwise execution continues with step 130.
Existing lane line detection algorithms are implemented with traditional image processing techniques such as edge detection and straight line detection, but such methods have narrow applicable scenarios and poor detection performance, so in some embodiments of the application lane line detection is implemented with a deep learning algorithm. The lane line detection model is obtained by deep learning training on a road surface sample image set labeled with the position information of lane line regions, so that it can distinguish lane line regions from non-lane-line regions in the road surface image and render them in different colors to produce the lane line image. The lane line detection model may be an image segmentation model such as ERFNet. The road surface sample image set may be an open-source lane line detection dataset or a self-labeled set of road surface images.
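As a concrete illustration, the following is a minimal inference sketch, not the patented implementation: it assumes a PyTorch segmentation network in the ERFNet style whose output classes encode background and the lane line types, and the function and parameter names are hypothetical.

```python
# Minimal inference sketch (assumption: a PyTorch ERFNet-style segmentation
# network whose class ids encode background / solid / dashed / unknown).
import cv2
import numpy as np
import torch

def lane_line_image(model, bgr_frame, size=(512, 256)):
    """Run the segmentation model on a front-view road image and return a
    single-channel lane line image (0 = background, other ids = lane type)."""
    img = cv2.resize(bgr_frame, size)
    x = torch.from_numpy(img).permute(2, 0, 1).float().unsqueeze(0) / 255.0
    with torch.no_grad():
        logits = model(x)                      # (1, C, H, W) class scores
    return logits.argmax(dim=1).squeeze(0).byte().cpu().numpy()
```

If the returned image contains no non-zero pixel, the lane departure degree is judged to be 0, matching step 120.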
In addition, the prior art does not consider the type of the lane line: it does not distinguish whether a lane line is solid or dashed and therefore cannot handle the case of pressing a solid line. For example, patent document No. 201611253350.4 proposes a lane departure warning method in which a lane departure detector is obtained by convolutional neural network training and used to obtain the departure result; because that method labels the drivable lane area during training, however, it cannot obtain lane line type information and thus cannot perform solid-line-pressing detection. In an embodiment of the application, the lane line types of the road surface sample images are also labeled, yielding a sample set annotated with both the position information of lane line regions and the corresponding lane line type information, so that the trained lane line detection model can distinguish lane line types; the types are solid, dashed, and unknown. The types can be distinguished by rendering the lane line regions with different colors, brightness, and so on. Taking brightness as an example, as shown in fig. 2, the left side of the figure is the original road surface image in front of the vehicle and the right side is the lane line image: non-lane-line regions are shown in black and lane line regions in white, with the lowest brightness denoting a solid line, the middle brightness a dashed line, and the highest brightness an unknown lane line type. When no lane line region is detected, as shown in fig. 3, the lane line image is entirely black and the lane departure degree can be judged to be 0. Directly labeling the lane line type on every road surface sample image in the training set would consume considerable manpower and material resources, so the application proposes a two-step training method; referring to fig. 4, it includes steps 121 and 122, described below.
Step 121: a basic lane line detection model is obtained by deep learning training on a road surface sample image set labeled only with the position information of lane line regions, for example an open-source lane line detection dataset. Because this sample set carries no lane line type labels, the basic model obtained in this step can only identify lane line regions and cannot detect lane line types.
Step 122: the basic lane line detection model is further trained with a road surface sample image set labeled with both the position information of lane line regions and the corresponding lane line type information, yielding a lane line detection model that detects lane line regions together with their types. The sample set used in this step may be an open-source lane line detection dataset to which lane line type labels have been added, or a self-labeled road surface image set in which both the lane line regions and the corresponding types are annotated.
The applicant has found through experiments that, when training proceeds in this way, the basic lane line detection model can be fine-tuned in step 122 with only a small number of type-labeled samples and still yield a lane line detection model of the same quality, greatly saving manpower and material resources; in practice, the ratio of the number of samples used in step 121 to the number used in step 122 may range from 100:1 to 50:1.
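A sketch of this two-step schedule follows, under stated assumptions: model, base_set, and typed_set are hypothetical names, and the segmentation head is assumed to keep the same classes in both stages, with the stage-one masks mapping every lane pixel to the unknown type class so that only region information is learned first.

```python
# Two-stage training sketch. Assumptions: `base_set` yields (image, mask)
# pairs with region-only masks (lane pixels all labeled with the "unknown"
# type class); `typed_set` yields masks with solid/dashed/unknown labels.
import torch

def train(model, loader, epochs, lr):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = torch.nn.CrossEntropyLoss()
    for _ in range(epochs):
        for images, labels in loader:   # labels: (N, H, W) long class ids
            opt.zero_grad()
            loss = loss_fn(model(images), labels)  # per-pixel classification
            loss.backward()
            opt.step()

train(model, base_set, epochs=50, lr=1e-3)   # step 121: learn lane regions
train(model, typed_set, epochs=10, lr=1e-4)  # step 122: fine-tune on types
```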
Step 130: straight line detection is performed on the lane line image to obtain lane line straight lines. If no lane line straight line is detected, the lane departure degree is judged to be 0 and the subsequent steps are not executed; otherwise execution continues with step 140. Referring to fig. 5, the straight line detection flow in one embodiment includes steps 131 and 132, which are described below.
Step 131: straight line detection or straight line fitting. The Hough line detection algorithm may be applied to the lane line image, or least-squares line fitting may be performed within each connected domain of the lane line image, to obtain preliminary lane line straight lines; as shown in fig. 6, the left side of the figure is the lane line image and the right side is the result of line detection or line fitting. If no preliminary lane line straight line is detected, the lane departure degree is judged to be 0 and the subsequent steps are not executed; otherwise execution continues with step 132.
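A minimal OpenCV sketch of the Hough branch of step 131; the parameter values are illustrative assumptions, not values from the patent.

```python
# Probabilistic Hough transform over the binarized lane line image.
import cv2
import numpy as np

def detect_lines(lane_mask):
    binary = (lane_mask > 0).astype(np.uint8) * 255
    segs = cv2.HoughLinesP(binary, rho=1, theta=np.pi / 180, threshold=50,
                           minLineLength=40, maxLineGap=20)
    return [] if segs is None else [tuple(s[0]) for s in segs]  # (x1,y1,x2,y2)
```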
Step 132: merge similar straight lines. Step 131 may yield many straight lines, some of which lie close to one another and can be merged using prior knowledge of how lane lines are imaged. Specifically, the slope of each preliminary lane line straight line and the coordinate of its intersection with the lower boundary of the lane line image are calculated; if the slopes of two preliminary lane line straight lines differ by less than a preset slope-difference threshold, or the coordinates of their intersections with the lower boundary differ by less than a preset coordinate-difference threshold, only one of the two is retained as a final lane line straight line, and otherwise both are retained. The result of merging similar straight lines is shown in fig. 7.
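A sketch of this merging rule, consuming the segments from the previous block; the two thresholds are illustrative assumptions.

```python
# Keep one representative per group of near-duplicate lines: a candidate is
# dropped if its slope OR its bottom-boundary x-intersection nearly matches
# an already-kept line.
def merge_lines(segments, img_h, slope_eps=0.15, x_eps=30.0):
    kept = []  # (slope, bottom_x) per final lane line straight line
    for x1, y1, x2, y2 in segments:
        if x2 == x1:
            slope, xb = 1e6, float(x1)          # vertical: treat slope as huge
        else:
            slope = (y2 - y1) / (x2 - x1)
            if slope == 0:
                continue                        # horizontal segments ignored
            xb = x1 + (img_h - 1 - y1) / slope  # x at the lower boundary
        if all(abs(slope - s) >= slope_eps and abs(xb - x) >= x_eps
               for s, x in kept):
            kept.append((slope, xb))
    return kept
```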
Note that the intersection may lie outside the image as well as inside it: when a lane line straight line does not intersect the lower boundary within the image, the straight line and the lower boundary can each be extended so that the intersection is obtained outside the image, as for the lane line straight line L4 in fig. 8.
Step 140: and calculating the distance from the intersection point of each lane line straight line and the lower boundary of the lane line image to the central axis of the lane line image, and taking the lane line straight line with the minimum distance as a first lane line of the current lane, wherein the current lane refers to the lane where the vehicle is currently located.
To reduce the amount of calculation, only a lower portion of the image may be used, for example the bottom 40% of the image (the portion below the dotted line in fig. 8). In fig. 8, L1, L2, L3, and L4 denote lane line straight lines, and d1, d2, d3, and d4 denote the distances from their intersections with the lower image boundary to the central axis of the image; since d1 is the smallest, L1 is taken as the first lane line of the current lane.
Step 150: the horizontal included angle θ1 of the first lane line and the horizontal included angles of the other lane line straight lines are calculated, and a lane line straight line whose horizontal included angle is opposite in sign to θ1 is searched for; if one exists, step 160 is executed, otherwise it is judged that only one lane line of the current lane can be obtained and step 170 is executed. A horizontal included angle is the angle between a lane line straight line and the lower boundary of the lane line image, with a value range of (-90°, 90°). As shown in fig. 8, θ1, θ2, θ3, and θ4 denote the horizontal included angles of L1, L2, L3, and L4; θ2 and θ4 are opposite in sign to θ1, so L2 and L4 are candidate straight lines, and since d2 is smaller than d4, L2 is the other lane line of the current lane.
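A sketch combining steps 140 and 150, consuming the (slope, bottom-x) pairs from the merging sketch; the sign of the angle here follows image coordinates, and only the relative signs of the two lines matter.

```python
# Pick the first lane line (bottom intersection nearest the central axis),
# then search for a second line whose horizontal angle has the opposite sign.
import math

def pick_lane_lines(lines, img_w):
    axis = img_w / 2.0
    scored = sorted((abs(xb - axis), math.degrees(math.atan(s)), s, xb)
                    for s, xb in lines)           # angle in (-90, 90) degrees
    first = scored[0]
    opposite = [t for t in scored[1:] if t[1] * first[1] < 0]
    second = min(opposite) if opposite else None  # closest opposite-sign line
    return first, second  # each is (distance, angle, slope, bottom_x)
```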
Step 160: among the lane line straight lines whose horizontal included angle is opposite in sign to θ1, the one whose intersection with the lower boundary of the lane line image is closest to the central axis is taken as the second lane line of the current lane, and the lane departure degree is calculated according to the distances from the two lane lines' intersections with the lower boundary of the lane line image to the central axis of the image and/or the horizontal included angles of the two lane lines.
In one embodiment, when the lane departure degree is calculated from the distances of the two lane lines' intersections with the lower boundary of the lane line image to the central axis of the image, the following formula may be used:
Deviate=1-d1/d2,
where Deviate denotes the degree of lane departure.
In one embodiment, when the lane departure degree is calculated from the horizontal included angles of the two lane lines, the following formula may be used:
Deviate=1-min(abs(θ1),abs(θ2))/max(abs(θ1),abs(θ2)),
where Deviate denotes the lane departure degree, abs() denotes the absolute value, min(abs(θ1), abs(θ2)) denotes the smaller of abs(θ1) and abs(θ2), and max(abs(θ1), abs(θ2)) the larger of the two.
The calculated lane departure degree lies in the interval (0, 1).
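Both formulas transcribe directly into code; a small sketch, with d1 and θ1 belonging to the first lane line (the one closest to the central axis), so that the result lies in [0, 1).

```python
# Deviate = 1 - d1/d2 (distance form) or 1 - min|θ|/max|θ| (angle form).
def deviate_by_distance(d1, d2):
    return 1.0 - d1 / d2          # d1 <= d2 by construction

def deviate_by_angle(theta1, theta2):
    a1, a2 = abs(theta1), abs(theta2)
    return 1.0 - min(a1, a2) / max(a1, a2)
```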
If the camera device is mounted according to specification, as shown in fig. 9, both calculation methods yield an accurate lane departure degree. In a real scene, however, the device may be mounted irregularly, tilting to the left or right in the horizontal direction as shown in fig. 10, which shifts the left-right positions of the lane lines considerably; calculating from the distance between the lower-boundary intersection and the central axis then deviates greatly, whereas the horizontal included angles of the lane lines are little affected even when the device is tilted, so the calculation based on the horizontal included angles is more adaptable. This wider applicability also shows in cases such as congestion and interference, as illustrated in fig. 11.
Step 170: the lane departure degree is calculated according to the position of the intersection of the first lane line with the lower boundary of the lane line image and the horizontal included angle of the lane line. In one embodiment, the intersection of the first lane line with the lower boundary is obtained; when it lies outside the image range, as for L4 in fig. 8, the lane departure degree is judged to be 0. When it lies within the image range, the slope of the first lane line is calculated from the horizontal included angle θ1, and the absolute value of the slope is input into a preset lane departure degree mapping function to obtain the lane departure degree. Since the absolute value of the slope of the first lane line is positively correlated with the lane departure degree, the mapping function should reflect this relationship; in one embodiment it may be set as:
y=2*(1/(1+e^(-x))-0.5),
where y denotes the lane departure degree and x denotes the absolute value of the slope of the first lane line. As shown in fig. 12, x ranges over (0, +∞) while the lane departure degree y ranges over (0, 1); the mapping function reflects the actual degree of lane departure, and its judgment criterion is consistent with the lane departure degree calculated from two lane lines.
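The mapping function is a sigmoid rescaled to (0, 1); a direct transcription:

```python
# y = 2*(1/(1+e^(-x)) - 0.5): maps the absolute slope x in (0, +inf)
# to a lane departure degree in (0, 1), increasing as the line steepens.
import math

def deviate_single_line(abs_slope):
    return 2.0 * (1.0 / (1.0 + math.exp(-abs_slope)) - 0.5)
```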
Step 180: the lane departure degree is compared with a preset departure degree threshold value; if it is greater than the threshold, step 190 is executed.
Step 190: an alarm signal is output. A lane departure degree greater than the preset departure degree threshold indicates a risk of pressing the line, so an alarm signal is issued to alert the driver.
Whether one or two lane lines of the current lane were obtained in step 150, the first lane line is the lane line closest to the central axis of the image, so the direction of lane departure is toward the first lane line, and if there is a line-pressing risk, the type of the pressed lane line is the type of the first lane line. Therefore, when a lane line detection model capable of detecting lane line types is used, the lane line type of the first lane line can be obtained from the lane line region in which it lies, and when that type is a solid line, the output alarm signal is a solid-line-pressing alarm signal. Fig. 13 shows a case in which a line-pressing risk is detected; the pressed lane line there is a dashed line.
In one embodiment, after each detection, the road surface image in front of the vehicle and the corresponding lane departure degree and solid-line-pressing detection result can be stored and later used to train a neural network model that obtains the lane departure degree and the solid-line-pressing detection result directly from the road surface image, realizing end-to-end detection. Specifically, the road surface images serve as training samples and the corresponding lane departure degrees and solid-line-pressing detection results as label information; training yields an end-to-end detection model based entirely on a neural network, so that no complex judgment rules need to be designed by hand, which benefits maintenance and the continuous improvement of the product's detection performance.
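A data-logging sketch for this embodiment, assuming a simple image-plus-JSON layout; the paths, keys, and names are illustrative, not from the patent.

```python
# Store each frame with its computed labels for later end-to-end training.
import json
import os
import time
import cv2

def log_sample(frame, deviate, pressing_solid, out_dir="samples"):
    os.makedirs(out_dir, exist_ok=True)
    stamp = int(time.time() * 1000)
    cv2.imwrite(os.path.join(out_dir, f"{stamp}.jpg"), frame)
    with open(os.path.join(out_dir, f"{stamp}.json"), "w") as f:
        json.dump({"deviate": deviate, "pressing_solid": pressing_solid}, f)
```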
The lane departure degree detection method above distinguishes the cases in which two lane lines can be obtained and in which only one can be obtained, and calculates the lane departure degree in a different way for each, so it adapts to departure calculation when only one lane line is obtained, widening the applicable scenarios and improving the accuracy of the calculation. Moreover, the method requires no calibration of the camera device or the road: it can run on an existing driving recorder without additional image acquisition equipment, so its deployment cost is low, its applicable scenarios are broad, and it is convenient to popularize on a large scale.
Those skilled in the art will appreciate that all or part of the functions of the methods in the above embodiments may be implemented by hardware or by computer programs. When all or part of the functions are implemented by a computer program, the program may be stored in a computer-readable storage medium, such as a read-only memory, a random access memory, a magnetic disk, an optical disc, or a hard disk, and executed by a computer to realize those functions. For example, the program may be stored in the memory of a device and, when executed by a processor, implement all or part of the functions described above. The program may also be stored on a server, another computer, a magnetic disk, an optical disc, a flash drive, or a removable hard disk, and downloaded or copied into the memory of a local device, or used to update the local device's system; when the program in that memory is executed by a processor, all or part of the functions in the above embodiments are realized.
The invention has been described above with reference to specific examples, which are intended only to aid understanding of the invention and not to limit it. Those skilled in the art to which the invention pertains may make several simple deductions, modifications, or substitutions according to the idea of the invention.
Claims (10)
1. A method of detecting a degree of lane departure, comprising:
acquiring a road surface image in front of a vehicle;
inputting the road surface image in front of the vehicle into a pre-trained lane line detection model to obtain a lane line image, the lane line image representing the lane line regions in the road surface image; if no lane line region is detected, judging that the lane departure degree is 0 and not executing the subsequent steps; otherwise continuing with the following steps;
performing straight line detection on the lane line image to obtain lane line straight lines; if no lane line straight line is detected, judging that the lane departure degree is 0 and not executing the subsequent steps; otherwise continuing with the following steps;
calculating the distance from the intersection of each lane line straight line with the lower boundary of the lane line image to the central axis of the lane line image, and taking the straight line with the smallest distance as the first lane line of the current lane;
calculating the horizontal included angle θ1 of the first lane line and the horizontal included angles of the other lane line straight lines, wherein a horizontal included angle is the angle between a lane line straight line and the lower boundary of the lane line image, with a value range of (-90°, 90°); searching for a lane line straight line whose horizontal included angle is opposite in sign to θ1; if such lines exist, taking, among them, the line whose intersection with the lower boundary of the lane line image is closest to the central axis as the second lane line of the current lane; if not, judging that only one lane line of the current lane can be obtained;
when two lane lines of the current lane are obtained, calculating the lane departure degree according to the distances from the two lane lines' intersections with the lower boundary of the lane line image to the central axis of the image and/or the horizontal included angles of the two lane lines; when only one lane line of the current lane can be obtained, calculating the lane departure degree according to the position of the intersection of the first lane line with the lower boundary of the lane line image and the horizontal included angle of that lane line;
and outputting an alarm signal when the lane departure degree is greater than a preset departure degree threshold value.
2. The detection method according to claim 1, wherein the lane line detection model is trained on a road surface sample image set labeled with the position information of lane line regions and the corresponding lane line type information, and the lane line image also represents the lane line type of each lane line region, wherein the lane line types include solid, dashed, and unknown;
the detection method further comprises: obtaining the lane line type of the first lane line from the lane line region in which it lies, wherein the output alarm signal is a solid-line-pressing alarm signal when the lane line type of the first lane line is a solid line.
3. The detection method according to claim 1, wherein performing straight line detection on the lane line image to obtain lane line straight lines, and judging that the degree of lane departure is 0 if no lane line straight line is detected, comprises:
performing straight line detection on the lane line image with the Hough line detection algorithm, or performing least-squares line fitting within each connected domain of the lane line image, to obtain preliminary lane line straight lines; if no preliminary lane line straight line is detected, judging that the lane departure degree is 0 and not executing the subsequent steps; otherwise continuing with the following steps;
calculating the slope of each preliminary lane line straight line and the coordinate of its intersection with the lower boundary of the lane line image; if the slopes of two preliminary lane line straight lines differ by less than a preset slope-difference threshold, or their intersection coordinates differ by less than a preset coordinate-difference threshold, retaining only one of the two as a final lane line straight line, and otherwise retaining both as final lane line straight lines.
4. The detection method according to claim 1, wherein when the degree of lane departure is calculated from the distances of the two lane lines' intersections with the lower boundary of the lane line image to the central axis of the lane line image, the formula for calculating the degree of lane departure is as follows:
Deviate=1-d1/d2,
where Deviate denotes a lane departure degree, d1 denotes a distance from an intersection of the first lane line with the lower boundary of the lane line image to the central axis of the lane line image, and d2 denotes a distance from an intersection of the second lane line with the lower boundary of the lane line image to the central axis of the lane line image.
5. The detection method according to claim 1, wherein when the degree of lane departure is calculated from the horizontal included angles of the two lane lines, the formula for calculating the degree of lane departure is as follows:
Deviate=1-min(abs(θ1),abs(θ2))/max(abs(θ1),abs(θ2)),
where Deviate denotes the degree of lane departure, θ2 denotes the horizontal included angle of the second lane line, abs() denotes the absolute value, min(abs(θ1), abs(θ2)) denotes the smaller of abs(θ1) and abs(θ2), and max(abs(θ1), abs(θ2)) the larger of the two.
6. The detection method according to claim 1, wherein calculating the degree of lane departure according to the position of the intersection of the lane line with the lower boundary of the lane line image and the horizontal included angle of the lane line comprises:
when the intersection of the first lane line with the lower boundary of the lane line image lies outside the image range, judging that the lane departure degree is 0; when it lies within the image range, calculating the slope of the first lane line from the horizontal included angle θ1, and inputting the absolute value of the slope into a preset lane departure degree mapping function to obtain the lane departure degree.
7. The detection method as claimed in claim 6, wherein the lane departure degree mapping function is:
y=2*(1/(1+e^(-x))-0.5),
where y represents the degree of lane departure and x represents the absolute value of the slope of the first lane line.
8. The detection method of claim 2, wherein the lane line detection model is trained by:
obtaining a basic lane line detection model through deep learning training by using a road surface sample image set only marked with the position information of the lane line region;
and further training the basic lane line detection model with a road surface sample image set labeled with the position information of lane line regions and the corresponding lane line type information, to obtain the lane line detection model.
9. The detection method of claim 2, further comprising: storing the road surface image in front of the vehicle together with the corresponding lane departure degree and solid-line-pressing detection result, for training a neural network model that can obtain the lane departure degree and the solid-line-pressing detection result directly from the road surface image in front of the vehicle.
10. A computer-readable storage medium, characterized in that the medium has stored thereon a program executable by a processor to implement the detection method according to any one of claims 1-9.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202210096246.8A | 2022-01-26 | 2022-01-26 | Lane departure degree detection method and readable storage medium |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN114495063A | 2022-05-13 |
| CN114495063B | 2024-09-10 |
Family

ID=81476379

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202210096246.8A | Lane departure degree detection method and readable storage medium | 2022-01-26 | 2022-01-26 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN114495063B |
Cited By (3)

| Publication Number | Priority Date | Publication Date | Assignee | Title |
|---|---|---|---|---|
| WO2024027284A1 | 2022-08-02 | 2024-02-08 | 上海三一重机股份有限公司 | Travelling deviation prediction method and apparatus, and operation machine |
| CN117719515A | 2024-01-05 | 2024-03-19 | 深圳技术大学 | Lane departure early warning method and system |
| WO2024202034A1 | 2023-03-31 | 2024-10-03 | 本田技研工業株式会社 | Dashboard camera and control method therefor |
Citations (4)

| Publication Number | Priority Date | Publication Date | Assignee | Title |
|---|---|---|---|---|
| CN104517111A | 2013-09-27 | 2015-04-15 | 比亚迪股份有限公司 | Lane line detection method and system, and lane deviation early warning method and system |
| US20150125039A1 | 2013-11-04 | 2015-05-07 | Sarmo Technology Inc. | Lane departure warning system and method |
| CN106803066A | 2016-12-29 | 2017-06-06 | 广州大学 | A kind of vehicle yaw angle based on Hough transform determines method |
| CN110263713A | 2019-06-20 | 2019-09-20 | 百度在线网络技术(北京)有限公司 | Method for detecting lane lines, device, electronic equipment and storage medium |
Legal Events

| Code | Title |
|---|---|
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| GR01 | Patent grant |