CN111191487A - Lane line detection and driving control method and device and electronic equipment - Google Patents

Lane line detection and driving control method and device and electronic equipment

Info

Publication number
CN111191487A
Authority
CN
China
Prior art keywords
lane line
lane
road map
prediction
probability
Prior art date
Legal status
Pending
Application number
CN201811355223.4A
Other languages
Chinese (zh)
Inventor
庄佩烨
程光亮
石建萍
Current Assignee
Beijing Sensetime Technology Development Co Ltd
Original Assignee
Beijing Sensetime Technology Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Sensetime Technology Development Co Ltd filed Critical Beijing Sensetime Technology Development Co Ltd
Priority to CN201811355223.4A (published as CN111191487A)
Priority to JP2021525695A (published as JP2022507226A)
Priority to PCT/CN2019/118097 (published as WO2020098708A1)
Priority to KR1020217015078A (published as KR20210079339A)
Publication of CN111191487A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/50 - Context or environment of the image
    • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/588 - Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/08 - Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention discloses a lane line detection and driving control method and device, and electronic equipment. The lane line detection method comprises the following steps: acquiring a road map, and performing lane line prediction on the road map to obtain prediction results of at least two lane lines; determining a vanishing point of the at least two lane lines according to the prediction results of the at least two lane lines; and outputting a lane line detection result of the road map according to the prediction results of the at least two lane lines and the vanishing point. In the embodiment of the application, the vanishing point of the lane lines is acquired and the prediction results of the lane lines are corrected based on the vanishing point, so that incomplete lane lines can be supplemented to generate more complete lane lines and points whose predictions deviate greatly from the actual lane lines can be removed. The accuracy of lane line detection is thereby improved, a technical foundation is laid for lane line deviation systems, vehicle lane changing systems and the like, and the safety and reliability of intelligent driving are improved.

Description

Lane line detection and driving control method and device and electronic equipment
Technical Field
The embodiment of the invention relates to the technical field of computer vision, in particular to a method and a device for detecting lane lines and controlling driving, and electronic equipment.
Background
Lane line detection is one of the key technologies for realizing intelligent driving such as assisted driving and automatic driving. It is mainly used in visual navigation systems to find the positions of the lane lines in a road map captured from the road scene.
The main task of lane line detection is fitting the lane lines; the accuracy of this fitting directly influences the accuracy of lane line detection and thereby determines the safety of intelligent driving.
Disclosure of Invention
The embodiment of the invention provides a technical scheme for detecting a lane line.
In a first aspect, an embodiment of the present invention provides a method for detecting a lane line, including:
acquiring a road map, wherein the road map comprises at least two lane lines;
predicting lane lines of the road map to obtain a prediction result of at least two lane lines;
determining vanishing points of the at least two lane lines according to the prediction results of the at least two lane lines;
and outputting a lane line detection result of the road map according to the prediction result of the at least two lane lines and the vanishing point.
In a possible implementation manner of the first aspect, the performing lane line prediction on the road map to obtain a prediction result of at least two lane lines includes:
inputting the road map into a neural network to output a first lane line probability map of the at least two lane lines through the neural network;
and determining a first prediction fitting curve of the at least two lane lines according to at least part of pixel points of which the probability values in the first lane line probability graph are greater than a set threshold value.
In another possible implementation manner of the first aspect, the determining a first prediction fit curve of the at least two lane lines according to at least some pixel points in the first lane line probability map, where the probability is greater than a set threshold, includes:
sampling probability points of which the probability values are greater than a preset value in the first lane line probability map, and determining sampling points of the at least two first lane line probability maps;
and performing curve fitting on pixel points corresponding to the sampling points of the at least two first lane line probability maps to determine a first prediction fitting curve of the at least two lane lines.
In another possible implementation manner of the first aspect, the sampling probability points of the at least two first lane line probability maps, where the probability values of the at least two first lane line probability maps are greater than a preset value, and determining sampling points of the at least two first lane line probability maps includes:
and performing Gaussian sampling on each probability point with the probability value larger than a preset value in the first lane line probability map, and determining sampling points of the at least two first lane line probability maps.
In another possible implementation manner of the first aspect, the determining a vanishing point of the at least two lane lines according to the prediction result of the at least two lane lines includes:
and determining a common intersection point of the first prediction fitting curves of the at least two lane lines as a vanishing point of the at least two lane lines.
In another possible implementation manner of the first aspect, the outputting a lane line detection result of the road map according to the prediction result of the at least two lane lines and the vanishing point includes:
and performing curve fitting according to at least part of pixel points with the probability value larger than a set threshold value in the first lane line probability graph and the vanishing points, and determining and outputting a first detection fitting curve of the lane line.
In another possible implementation manner of the first aspect, the acquiring the road map includes:
and acquiring a road map in a scene where the vehicle is located through the vehicle-mounted camera.
In another possible implementation manner of the first aspect, the road map is a road training map with lane marking information, and before the inputting the road map into the neural network, the method further includes:
training the neural network based on the road training image and vanishing points of lane lines in the road training image.
In another possible implementation manner of the first aspect, the training the neural network based on the road training diagram and vanishing points of lane lines in the road training diagram includes:
inputting the road training diagram into a neural network to output a second lane line probability diagram of the at least two lane lines through the neural network;
determining a second prediction fitting curve of the at least two lane lines according to at least part of pixel points with probability values larger than a set threshold value in the second lane line probability graph;
determining vanishing points of the at least two lane lines according to a second prediction fitting curve of the at least two lane lines;
performing curve fitting according to at least part of pixel points with probability values larger than a set threshold value in the second lane line probability graph and the vanishing points, and determining a second detection fitting curve of the at least two lane lines;
and adjusting the network parameters of the neural network according to the first difference between the second predicted fitted curve of each lane line and the true value of each lane line and the second difference between the second detected fitted curve of each lane line and the true value of each lane line.
In another possible implementation manner of the first aspect, the adjusting network parameters of the neural network according to a first difference between the second prediction fit curve of each lane line and a true value of each lane line and a second difference between the second detection fit curve of each lane line and a true value of each lane line includes:
for each lane line of the at least two lane lines, determining a first difference between a second predicted fit curve of the lane line and a true value of the lane line and a second difference between a second detected fit curve of the lane line and a true value of the lane line;
determining the detection loss of the neural network according to the first difference and the second difference of each lane line;
adjusting network parameters of the neural network based on the detected loss.
In another possible implementation manner of the first aspect, the determining a first difference between the second prediction fit curve of the lane line and a true value of the lane line includes:
and taking the least square operation result between the second prediction fitting curve of the lane line and the truth value of the lane line as the first difference of the lane line.
In another possible implementation manner of the first aspect, the determining a second difference between a second detection fitting curve of the lane line and a true value of the lane line includes:
and taking the cross entropy between the second detection fitting curve of the lane line and the truth value of the lane line as the second difference of the lane line.
In another possible implementation manner of the first aspect, the determining a detection loss of the neural network according to the first difference and the second difference of each lane line includes:
and taking the sum of the first difference and the second difference of each lane line as the detection loss of the neural network.
In a second aspect, an embodiment of the present invention provides a lane line detection apparatus, including:
the system comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring a road map, and the road map comprises at least two lane lines;
the prediction module is used for predicting the lane lines of the road map to obtain the prediction results of at least two lane lines;
the determining module is used for determining the vanishing points of the at least two lane lines according to the prediction results of the at least two lane lines;
and the output module is used for outputting the lane line detection result of the road map according to the prediction result of the at least two lane lines and the vanishing point.
In one possible implementation manner of the second aspect, the prediction module includes:
a first prediction unit, configured to input the road map into a neural network, so as to output a first lane line probability map of the at least two lane lines via the neural network;
and the first fitting unit is used for determining a first prediction fitting curve of the at least two lane lines according to at least part of pixel points of which the probability values in the first lane line probability graph are greater than a set threshold value.
In another possible implementation manner of the second aspect, the first fitting unit is specifically configured to sample each probability point in the first lane line probability map, where the probability value is greater than a preset value, and determine sampling points of the at least two first lane line probability maps; and performing curve fitting on pixel points corresponding to the sampling points of the at least two first lane line probability maps to determine a first prediction fitting curve of the at least two lane lines.
In another possible implementation manner of the second aspect, the first fitting unit is specifically configured to perform gaussian sampling on each probability point of the first lane line probability map, where the probability value is greater than a preset value, and determine sampling points of the at least two first lane line probability maps.
In another possible implementation manner of the second aspect, the first fitting unit is specifically configured to determine that a common intersection point of first predictive fitting curves of the at least two lane lines is a vanishing point of the at least two lane lines.
In another possible implementation manner of the second aspect, the first fitting unit is specifically configured to perform curve fitting according to at least part of the pixel points and the vanishing points in the first lane line probability map, where the probability value is greater than a set threshold, and determine and output a first detection fitting curve of the lane line.
In another possible implementation manner of the second aspect, the obtaining module is specifically configured to collect a road map in a scene where a vehicle is located through a vehicle-mounted camera.
In another possible implementation manner of the second aspect, the road map is a road training map with lane marking information, and the apparatus further includes a training module:
the training module is used for training the neural network based on the road training image and the vanishing points of the lane lines in the road training image.
In another possible implementation manner of the second aspect, the training module includes:
a second prediction unit, configured to input the road training map into a neural network, so as to output a second lane line probability map of the at least two lane lines via the neural network;
the second fitting unit is used for determining a second prediction fitting curve of the at least two lane lines according to at least part of pixel points with probability values larger than a set threshold value in the second lane line probability graph;
a vanishing point determining unit, configured to determine vanishing points of the at least two lane lines according to the second prediction fitting curve of the at least two lane lines;
the second fitting unit is further configured to perform curve fitting according to at least part of the pixel points and the vanishing points in the second lane line probability map, where the probability value is greater than a set threshold, and determine a second detection fitting curve of the at least two lane lines;
and the adjusting unit is used for adjusting the network parameters of the neural network according to the first difference between the second predicted fitting curve of each lane line and the true value of each lane line and the second difference between the second detected fitting curve of each lane line and the true value of each lane line.
In another possible implementation manner of the second aspect, the adjusting unit includes:
the difference subunit is configured to determine, for each of the at least two lane lines, a first difference between a second prediction fit curve of the lane line and a true value of the lane line, and a second difference between a second detection fit curve of the lane line and a true value of the lane line;
a loss determining subunit, configured to determine a detection loss of the neural network according to the first difference and the second difference of each lane line;
and the adjusting subunit is used for adjusting the network parameters of the neural network according to the detection loss.
In another possible implementation manner of the second aspect, the difference subunit is specifically configured to use a least squares operation result between the second prediction fit curve of the lane line and a true value of the lane line as the first difference of the lane line.
In another possible implementation manner of the second aspect, the difference subunit is specifically configured to use a cross entropy between a second detection fitting curve of the lane line and a true value of the lane line as the second difference of the lane line.
In another possible implementation manner of the second aspect, the loss determining subunit is configured to use a sum of the first difference and the second difference of each lane line as the detection loss of the neural network.
In a third aspect, an embodiment of the present invention provides a driving control method, including:
the driving control device acquires a lane line detection result of a road map, wherein the lane line detection result of the road map is obtained by adopting the lane line detection method of any one of the first aspect;
and the driving control device outputs prompt information and/or carries out driving control on the vehicle according to the lane line detection result.
In a fourth aspect, an embodiment of the present invention provides a driving control apparatus, including:
the system comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring a lane line detection result of a road map, and the lane line detection result of the road map is obtained by adopting the lane line detection method of any one of the first aspect;
and the driving control module is used for outputting prompt information and/or carrying out driving control on the vehicle according to the lane line detection result.
In a fifth aspect, an embodiment of the present invention provides an electronic device, including:
a memory for storing a computer program;
a processor configured to execute the computer program to implement the lane line detection method according to the first aspect.
In a sixth aspect, an embodiment of the present invention provides an electronic device, including:
the camera is used for acquiring a road map, wherein the road map comprises at least two lane lines;
a memory for storing a computer program;
a processor configured to execute the computer program to implement the lane line detection method according to the first aspect.
In a seventh aspect, an embodiment of the present invention provides an intelligent driving system, including: a communicatively connected camera for acquiring a road map, an electronic device according to the fifth or sixth aspect, and a driving control apparatus according to the fourth aspect.
In an eighth aspect, an embodiment of the present invention provides a computer storage medium, where a computer program is stored in the storage medium, and the computer program, when executed, implements the lane line detection method according to the first aspect.
According to the lane line detection and driving control method, device and electronic equipment provided by the embodiment of the invention, a road map is acquired and lane line prediction is performed on it to obtain prediction results of at least two lane lines; a vanishing point of the at least two lane lines is determined according to the prediction results of the at least two lane lines; and a lane line detection result of the road map is output according to the prediction results of the at least two lane lines and the vanishing point. In the embodiment of the application, the vanishing point of the lane lines is acquired and the prediction results of the lane lines are corrected based on the vanishing point, so that incomplete lane lines can be supplemented to generate more complete lane lines and points whose predictions deviate greatly from the actual lane lines can be removed. The accuracy of lane line detection is thereby improved, a technical foundation is laid for lane line deviation systems, vehicle lane changing systems and the like, and the safety and reliability of intelligent driving are improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a flowchart of a lane line detection method according to an embodiment of the present invention;
fig. 2 is a flowchart of a lane line detection method according to a second embodiment of the present invention;
FIG. 3 is a schematic diagram of a neural network model according to this embodiment;
FIG. 4 is a road map according to the present embodiment;
FIG. 5 is a probability map corresponding to the road map shown in FIG. 4;
FIG. 6A is a schematic diagram showing the intersection of prediction fit curves according to the present embodiment;
FIG. 6B is a schematic diagram of a predictive fit curve according to this embodiment;
FIG. 6C is a schematic diagram of a detection fitting curve according to the present embodiment;
fig. 7 is a flowchart of a lane line detection method according to a third embodiment of the present invention;
fig. 8 is a schematic structural diagram of a lane line detection device according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of a lane line detection device according to a second embodiment of the present invention;
fig. 10 is a schematic structural diagram of a lane line detection device according to a third embodiment of the present invention;
fig. 11 is a schematic structural diagram of a lane line detection apparatus according to a fourth embodiment of the present invention;
fig. 12 is a schematic structural diagram of a lane line detection apparatus according to a fifth embodiment of the present invention;
fig. 13 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
fig. 15 is a flowchart illustrating a driving control method according to an embodiment of the present invention;
fig. 16 is a schematic structural diagram of a driving control device according to an embodiment of the present invention;
fig. 17 is a schematic diagram of an intelligent driving system provided in an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The method provided by the embodiment of the invention is suitable for the fields of computer vision, intelligent driving and the like which need to obtain the fitted curve of the lane line.
The technical solution of the present invention will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 1 is a flowchart of a lane line detection method according to an embodiment of the present invention. As shown in fig. 1, the method of this embodiment may include:
and S101, acquiring a road map.
The execution subject of the embodiment is an electronic device, which may be, but is not limited to, a smart phone, a computer, an in-vehicle system, and the like. The execution subject of this embodiment is specifically a processor in the electronic device.
Optionally, the electronic device of this embodiment may further have a camera or may be connected to the camera, and the road map of a scene ahead (or around) the vehicle may be captured by the camera, and the processor of the electronic device processes the road map. The road map may be a single frame image or a frame image in a captured video stream.
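As an illustration of the acquisition step just described, the following sketch reads a single frame from a vehicle-mounted camera. It assumes an OpenCV-style capture interface; the camera index is a placeholder and not part of this disclosure.

```python
# Minimal sketch of road-map acquisition (OpenCV assumed; the camera index is a placeholder).
import cv2

def acquire_road_map(device_index=0):
    """Grab one frame from a vehicle-mounted camera to serve as the road map."""
    cap = cv2.VideoCapture(device_index)
    ok, frame = cap.read()   # the frame may equally be taken from a captured video stream
    cap.release()
    if not ok:
        raise RuntimeError("failed to read a frame from the camera")
    return frame             # image expected to contain at least two lane lines
```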
Optionally, the road map may be preset, for example, a user inputs the road map for testing the lane line detection function of the electronic device.
Optionally, the road map may also be a road training map, and the road training map has lane marking information, and may be used to train accuracy of lane detection of the electronic device.
The embodiment does not limit the specific way for the electronic device to obtain the road map.
The road map of the present embodiment includes at least two lane lines.
S102, performing lane line prediction on the road map to obtain a prediction result of at least two lane lines.
Optionally, in this embodiment, an edge detection method may be adopted to perform lane line prediction on the road map, so as to obtain a prediction result of at least two lane lines.
Optionally, in this embodiment, a support vector machine method may be adopted to perform lane line prediction on the road map, so as to obtain a prediction result of at least two lane lines.
Optionally, in this embodiment, other lane line detection methods may also be adopted to predict the lane lines of the road map, so as to obtain the prediction results of at least two lane lines.
S103, determining vanishing points of the at least two lane lines according to the prediction results of the at least two lane lines.
Lane lines are parallel in the 3D space of the real world, but in a two-dimensional camera image they eventually intersect at a point, which is called the vanishing point of the lane lines.
In this way, the vanishing point of the lane line can be obtained based on the above-described prediction results of the at least two lane lines.
And S104, outputting a lane line detection result of the road map according to the prediction result of the at least two lane lines and the vanishing point.
Based on the above steps, the prediction results and the vanishing point of the lane lines are obtained, and the prediction results are corrected using the vanishing point, so that incomplete lane lines can be supplemented to generate more complete lane lines and points whose predictions deviate greatly from the actual lane lines can be removed.
Optionally, the predicted lane line is connected to the vanishing point, and the connected lane line is used as a lane line detection result of the road map.
Optionally, the predicted lane line and the vanishing point may be fit again, and the fitted lane line is used as a lane line detection result of the road map.
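To make the flow of S101 to S104 concrete, the outline below chains prediction, vanishing-point estimation, and re-fitting. It is an illustrative sketch only: the helper names predict_probability_maps, common_intersection, and refit_with_vanishing_point are hypothetical (the latter two are sketched later in this description), and the linear fit and 0.8 threshold are example choices rather than values fixed by the text.

```python
import numpy as np

def detect_lane_lines(road_map, predict_probability_maps, prob_threshold=0.8):
    """Illustrative outline of S101-S104; all helper names are hypothetical."""
    # S102: one probability map per lane line, e.g. produced by a neural network
    prob_maps = predict_probability_maps(road_map)

    # Fit a prediction curve per lane line from its high-probability pixels
    pred_curves = []
    for prob in prob_maps:
        ys, xs = np.nonzero(prob > prob_threshold)
        pred_curves.append(np.polyfit(xs, ys, 1))      # linear fit as the simplest case

    # S103: the vanishing point is the common intersection of the prediction curves
    vanishing_point = common_intersection(pred_curves)

    # S104: re-fit each lane line using the vanishing point as an extra fitting point
    return [refit_with_vanishing_point(prob, vanishing_point, prob_threshold)
            for prob in prob_maps]
```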
According to the lane line detection method provided by the embodiment of the invention, a road map is acquired and lane line prediction is performed on it to obtain prediction results of at least two lane lines; a vanishing point of the at least two lane lines is determined according to the prediction results; and a lane line detection result of the road map is output according to the prediction results of the at least two lane lines and the vanishing point. In the embodiment of the application, the vanishing point of the lane lines is acquired and the prediction results of the lane lines are corrected based on the vanishing point, so that incomplete lane lines can be supplemented to generate more complete lane lines and points whose predictions deviate greatly from the actual lane lines can be removed. The accuracy of lane line detection is thereby improved, a technical foundation is laid for lane line deviation systems, vehicle lane changing systems and the like, and the safety and reliability of intelligent driving are improved.
Fig. 2 is a flowchart of a lane line detection method according to a second embodiment of the present invention, where on the basis of the above-mentioned embodiment, a specific process of lane line detection according to this embodiment is shown in fig. 2, and the method according to this embodiment may include:
s201, acquiring a road map.
The road map of the embodiment may be a real-time road map of a vehicle running environment, for example, a road map in a scene where a vehicle is located is collected by a vehicle-mounted camera.
Based on the road map, the lane lines in front of the running vehicle are detected in real time, providing a reference for lane line deviation systems and vehicle lane changing systems.
S202, inputting the road map into a neural network so as to output a first lane line probability map of the at least two lane lines through the neural network.
The neural network preset in this embodiment may be an FCN (Fully Convolutional Network), a ResNet (Residual Network), another convolutional neural network, or the like.
Optionally, as shown in fig. 3, the neural network of this embodiment includes 7 convolutional layers, which are respectively: the parameters of the first convolution layer are 145 × 169 × 16, the parameters of the second convolution layer are 73 × 85 × 32, the parameters of the third convolution layer are 37 × 43 × 64, the parameters of the fourth convolution layer are 19 × 22 × 128, the parameters of the fifth convolution layer are 73 × 85 × 32, the parameters of the sixth convolution layer are 145 × 169 × 16, and the parameters of the seventh convolution layer are 289 × 337 × 5.
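A rough PyTorch sketch of an encoder-decoder with this shape is given below. Only the feature-map sizes listed above come from the text; the kernel sizes, strides, activation functions, use of interpolation for upsampling, and the reading of the final 5 channels as per-lane probability maps are all assumptions made for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LaneNetSketch(nn.Module):
    """Illustrative 7-convolution encoder-decoder; everything beyond the channel counts is assumed."""
    def __init__(self, in_channels=3, out_channels=5):
        super().__init__()
        # Encoder: 16 -> 32 -> 64 -> 128 channels, spatial size roughly halved each time
        self.enc1 = nn.Conv2d(in_channels, 16, kernel_size=3, stride=2, padding=1)
        self.enc2 = nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1)
        self.enc3 = nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1)
        self.enc4 = nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1)
        # Decoder: 32 -> 16 -> out_channels, with upsampling between convolutions
        self.dec1 = nn.Conv2d(128, 32, kernel_size=3, padding=1)
        self.dec2 = nn.Conv2d(32, 16, kernel_size=3, padding=1)
        self.dec3 = nn.Conv2d(16, out_channels, kernel_size=3, padding=1)

    def forward(self, x):
        x = F.relu(self.enc1(x))
        x = F.relu(self.enc2(x))
        x = F.relu(self.enc3(x))
        x = F.relu(self.enc4(x))
        x = F.relu(self.dec1(F.interpolate(x, scale_factor=4)))
        x = F.relu(self.dec2(F.interpolate(x, scale_factor=2)))
        x = self.dec3(F.interpolate(x, scale_factor=2))
        return torch.sigmoid(x)    # per-pixel lane-line probability maps
```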
As shown in fig. 3, the neural network of this embodiment may be trained in advance, and when the road map shown in fig. 4 is input to the neural network, the neural network outputs a lane line probability map of each lane line in the road map, which is recorded as a first lane line probability map, as shown in fig. 5.
S203, determining a first prediction fitting curve of the at least two lane lines according to at least part of pixel points of the first lane line probability graph with the probability value larger than a set threshold value.
The lane line probability map of each lane line includes a number of probability points, each corresponding one-to-one to a pixel point in the road map. The value of each probability point is the probability that the pixel point at the corresponding position in the road map belongs to the lane line.
Taking the rightmost lane line in fig. 5 as an example, the value of each probability point represents the probability that the pixel point at the corresponding position in the road map belongs to the lane line; as shown in fig. 5, for example, the probability value of a white probability point is 1 and the probability value of a black probability point is 0.
Then, based on the first lane line probability map shown in fig. 5, probability points with probability values greater than a preset value in fig. 5 are obtained, and curve fitting is performed on pixel points corresponding to the probability points to generate a first prediction fitting curve of the lane line.
The preset value is the criterion used to decide whether the pixel point corresponding to a probability point lies on the lane line, and it can be determined according to actual needs.
For example, the preset value is 0.8, so that the points with the probability value greater than 0.8 in fig. 5, that is, the white probability points in fig. 5, can be selected, and curve fitting is performed on the pixel points corresponding to the white probability points, so that a first prediction fitting curve of the lane line can be obtained.
Optionally, when performing curve fitting, the present embodiment may use linear function curve fitting, quadratic function curve fitting, cubic function curve fitting, or high-order function curve fitting. The fitting manner of the first prediction fitting curve is not limited in this embodiment, and is specifically determined according to actual needs.
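Assuming the first lane line probability map is a 2-D array with values in [0, 1], the selection and fitting just described can be sketched as follows; the 0.8 threshold and the cubic degree are example choices, not values fixed by the text.

```python
import numpy as np

def fit_prediction_curve(prob_map, threshold=0.8, degree=3):
    """Fit y = f(x) through the pixel points whose lane-line probability exceeds the threshold."""
    ys, xs = np.nonzero(prob_map > threshold)    # coordinates of likely lane-line pixels
    if xs.size <= degree:
        raise ValueError("not enough high-probability pixels to fit the curve")
    coeffs = np.polyfit(xs, ys, degree)          # linear / quadratic / cubic / higher-order fit
    return np.poly1d(coeffs)                     # callable first prediction fitting curve
```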
In an example, the S203 may include: sampling probability points of which the probability values are greater than a preset value in the first lane line probability map, and determining sampling points of the at least two first lane line probability maps; and performing curve fitting on pixel points corresponding to the sampling points of the at least two first lane line probability maps to determine a first prediction fitting curve of the at least two lane lines.
The number of pixel points corresponding to the lane lines in the road map is large; performing the fitting operation on every such pixel point involves a large amount of computation and slows down the fitting.
In order to solve the above problem, this embodiment screens the pixel points corresponding to the lane lines, and selects a part of the pixel points satisfying the condition to perform curve fitting.
Specifically, at least some probability points with probability values larger than a preset value are selected from the first lane line probability map of each lane line. The preset value is the criterion for deciding whether a pixel point belongs to a lane line: when the probability value of a pixel point is larger than the preset value, the pixel point is regarded as a point on the lane line and is retained; when the probability value is smaller than the preset value, the pixel point is not a point on the lane line and is discarded.
For convenience of explanation, probability points with probability values larger than preset values in the lane line probability graph of each lane line are recorded as sampling points, and pixel points corresponding to the sampling points are all points on the lane lines.
Alternatively, the sampling may be performed by using a Markov chain Monte Carlo (MCMC) sampling method, Gibbs sampling, or the like.
Optionally, in this embodiment, Gaussian sampling is performed on each probability point in the first lane line probability map whose probability value is greater than the preset value, so as to determine the sampling points of the at least two first lane line probability maps.
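The text does not spell out the Gaussian sampling procedure. One plausible reading, sketched below purely as an assumption, is to draw a subset of the above-threshold probability points with sampling weights given by a Gaussian function of their probability values, so that points with probabilities closer to 1 are more likely to be kept; the sample size and sigma are arbitrary illustration values.

```python
import numpy as np

def gaussian_sample_points(prob_map, threshold=0.8, num_samples=200, sigma=0.1, rng=None):
    """Hypothetical reading of 'Gaussian sampling': keep a weighted random subset of the
    above-threshold probability points, weighting each by a Gaussian of (1 - probability)."""
    rng = np.random.default_rng() if rng is None else rng
    ys, xs = np.nonzero(prob_map > threshold)
    probs = prob_map[ys, xs]
    weights = np.exp(-((1.0 - probs) ** 2) / (2.0 * sigma ** 2))
    weights = weights / weights.sum()
    idx = rng.choice(xs.size, size=min(num_samples, xs.size), replace=False, p=weights)
    return xs[idx], ys[idx]    # sampled fitting points used for the curve fit
```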
And S204, determining a common intersection point of the first prediction fitting curves of the at least two lane lines as a vanishing point of the at least two lane lines.
According to the above steps, the first prediction fitting curve of each lane line in the road map is obtained. As noted above, because of the shooting angle, the extension lines of the lane lines in the same road map intersect at a point A, and point A is taken as the common intersection point of the first prediction fitting curves.
For example, as shown in fig. 6A, assuming that the road map includes 3 lane lines, according to the preset neural network the first prediction fitting curve of the first lane line may be obtained as y = f1(x), the first prediction fitting curve of the second lane line as y = f2(x), and the first prediction fitting curve of the third lane line as y = f3(x).
Letting f1(x) = f2(x) = f3(x), the common intersection point of the 3 first prediction fitting curves is obtained; for example, the common intersection point A is (723, 607).
That is, the first prediction fit curve of each lane line passes through the vanishing point, so that the prediction fit curves can be corrected by using the vanishing point, for example, the incomplete first prediction fit curve is completed, and the points with large deviation between prediction and actual can be removed.
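For the straight-line case used in the example above, the common intersection point can be computed as the least-squares solution of the stacked line equations, as in the following sketch; this is a simplification, since the first prediction fitting curves need not be linear in general.

```python
import numpy as np

def common_intersection(line_coeffs):
    """Least-squares intersection of lines y = k*x + b, given as (k, b) coefficient pairs.
    The returned (x, y) is the point closest to all lines, used as the vanishing point."""
    # Each line y = k*x + b is rewritten as k*x - y = -b and stacked into A @ [x, y] = rhs.
    A = np.array([[k, -1.0] for k, _ in line_coeffs])
    rhs = np.array([-b for _, b in line_coeffs])
    (x, y), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return float(x), float(y)
```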
S205, performing curve fitting according to at least part of pixel points with the probability value larger than a set threshold value in the first lane line probability map and the vanishing points, and determining and outputting a first detection fitting curve of the lane line.
In this embodiment, the curve of the lane line is re-fitted according to the first lane line probability map of the lane line, the first prediction fitting curve, and the vanishing point obtained in the above steps, so as to generate a first detection fitting curve of the lane line.
Specifically, taking one lane line as an example, the probability points with probability values larger than the preset value are obtained from the first lane line probability map of the lane line, an existing curve fitting method is applied with the pixel points corresponding to those probability points and the vanishing point as fitting points, and curve fitting is performed to generate the first detection fitting curve of the lane line.
In practical applications, the vanishing point of the first prediction fitting curves lies on the lane lines. In this embodiment a known vanishing point is added when the curve is fitted, so an incomplete first prediction fitting curve can be completed and points whose predictions deviate greatly from the actual lane line can be removed, making the fitting result more accurate.
Optionally, when curve fitting is performed, the fitting grade (weight) of the vanishing point is raised so that the fitted curve must pass through the vanishing point; fitted points far from the real situation can thus be filtered out, further improving the accuracy of the curve fitting.
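One way to realize this raised fitting grade of the vanishing point, sketched here as an assumption rather than the prescribed implementation, is to append the vanishing point to the fitting points and give it a large weight in a weighted polynomial fit, so that the fitted curve passes (almost) exactly through it.

```python
import numpy as np

def refit_with_vanishing_point(prob_map, vanishing_point, threshold=0.8,
                               degree=3, vp_weight=100.0):
    """Re-fit a lane line from its high-probability pixels plus the vanishing point,
    which is weighted heavily so the first detection fitting curve passes through it."""
    ys, xs = np.nonzero(prob_map > threshold)
    vx, vy = vanishing_point
    xs = np.append(xs.astype(float), vx)
    ys = np.append(ys.astype(float), vy)
    weights = np.ones_like(xs)
    weights[-1] = vp_weight                  # emphasize the vanishing point
    coeffs = np.polyfit(xs, ys, degree, w=weights)
    return np.poly1d(coeffs)                 # first detection fitting curve
```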
In this embodiment, the subsequent processing is performed on the prediction result of the lane line, specifically, the first prediction fitting curve is re-fitted according to the predicted vanishing points of the first prediction fitting curves to obtain the first detection fitting curve, so that the accuracy of lane line detection is improved.
Predicting the probability map with the neural network takes a relatively long time, but correcting the first prediction fitting curve reuses the probability map the neural network generated the first time, so there is no need to run the neural network again to predict the probability map. The fitting of the first detection fitting curve therefore takes little time, and the speed of lane line detection is preserved while its accuracy is improved.
Fig. 6B is a schematic diagram of a first predicted fit curve obtained based on a preset neural network, and fig. 6C is a schematic diagram of a first detected fit curve obtained by using the lane line detection method of the present embodiment.
Further, as shown in fig. 6B and 6C, in the process of fitting the lane line, the first detection fitting curve generated by fitting in this embodiment may complement the lane line, generate a more complete lane line, and may remove points where the prediction has a large deviation from the actual.
According to the lane line detection method provided by the embodiment of the invention, the road map is input into a neural network, so that a first lane line probability map of at least two lane lines is output through the neural network; a first prediction fitting curve of the at least two lane lines is determined according to at least part of the pixel points whose probability values in the first lane line probability map are greater than a set threshold; the common intersection point of the first prediction fitting curves of the at least two lane lines is determined as the vanishing point of the at least two lane lines; and curve fitting is performed according to at least part of the pixel points whose probability values in the first lane line probability map are greater than the set threshold together with the vanishing point, and a first detection fitting curve of each lane line is determined and output. In other words, the first detection fitting curve of each lane line is fitted from the vanishing point together with the pixel points indicated by the lane line probability map. In this way incomplete lane lines can be supplemented to generate more complete lane lines, points whose predictions deviate greatly from the actual lane lines can be removed, the detection accuracy of the lane lines is improved, and the fitting process takes little time.
In this embodiment, as shown in fig. 2, the neural network is trained before a road map is input into it for lane line detection; that is, the neural network is trained based on a road training map and the vanishing point of the lane lines in the road training map, as shown specifically in fig. 7.
Fig. 7 is a flowchart of a lane line detection method according to a third embodiment of the present invention. On the basis of the above embodiments, the present embodiment relates to a specific process of training a neural network. As shown in fig. 7, the method of this embodiment may include:
s301, inputting the road training graph into a neural network so as to output a second lane line probability graph of the at least two lane lines through the neural network.
S302, determining a second prediction fitting curve of the at least two lane lines according to at least part of the pixel points whose probability values in the second lane line probability map are larger than a set threshold value.
The road training graph is provided with lane line marking information, the road training graph is input into a neural network, and a second lane line probability graph of at least two lane lines is output. And for each lane line, determining at least partial probability points with probability values larger than a set threshold value from a second lane line probability graph of the lane line, and performing curve fitting on pixel points corresponding to the partial probability points to generate a second prediction fitting curve of the lane line. The specific process may refer to the determination process of the first lane line probability map and the first prediction fitting curve, which is not described herein again.
S303, determining vanishing points of the at least two lane lines according to the second prediction fitting curve of the at least two lane lines.
Optionally, a common intersection of the second predictive fitting curves of the at least two lane lines may be used as a vanishing point of the at least two lane lines.
S304, performing curve fitting according to at least part of pixel points with probability values larger than a set threshold value in the second lane line probability graph and the vanishing points, and determining a second detection fitting curve of the lane line.
For each lane line, the probability points with probability values larger than the preset threshold are determined from the second lane line probability map of the lane line, curve fitting is performed with the pixel points corresponding to those probability points and the vanishing point of S303 as fitting points, and a second detection fitting curve of the lane line is generated.
S305, adjusting the network parameters of the neural network according to a first difference between the second prediction fitting curve and the true value of each lane line and a second difference between the second detection fitting curve and the true value of each lane line.
The true value of the lane line described in this embodiment may be an objectively existing lane line, a marked lane line, or a lane line fitted from the marking information; it is used as supervision information during neural network training to correct the predicted or detected lane lines.
The second predictive fit curve for each lane line is compared to the true value for that lane line, the difference (i.e., the deviation) between the predictive fit curve and the true value for each lane line is determined, and the difference is recorded as the first difference.
And comparing the second detection fitting curve of each lane line with the true value of the lane line, determining the difference (namely deviation) between the second detection fitting curve of each lane line and the true value, and marking the difference as a second difference.
And adjusting network parameters of the neural network according to the first difference and the second difference, such as network parameters of a convolution kernel parameter, a matrix weight and the like of the neural network.
In one example, the S305 may include: for each lane line of the at least two lane lines, determining a first difference between a second predicted fit curve of the lane line and a true value of the lane line and a second difference between a second detected fit curve of the lane line and a true value of the lane line; determining the detection loss of the neural network according to the first difference and the second difference of each lane line; adjusting network parameters of the neural network based on the detected loss.
The embodiment does not limit the specific method for determining the first difference between the second prediction fitted curve of each lane line and the true value of each lane line.
For example, taking lane line j as an example, the first difference $\varepsilon_{1j}$ between the second prediction fitting curve $f_{1j}(x_i)$ of lane line j and the true value $f_{0j}(x_i)$ of lane line j may be determined using any one of equations (1) to (4), where i denotes a fitting point, i = 0, 1, …, m, and j denotes a lane line, j = 0, 1, …, n:
[Equations (1) to (4): rendered as images in the original publication]
That is, the present embodiment may determine the first difference between the second prediction fitting curve of each lane line and the true value of each lane line using any of the above formulas. It is understood that the above formulas are only exemplary, and the first difference may be determined by using a modification of them, another formula different from them, or other manners, which is not limited by the embodiment of the present application.
In one example, the determining a first difference between the second predictive fit curve for each of the lane lines and the true value for each of the lane lines includes:
and taking a least square operation result between the second prediction fitting curve of each lane line and a true value of each lane line as the first difference.
Specifically, the first difference is obtained by using a least square operation result between the second prediction fitted curve of each lane line and a true value of each lane line according to the following formula (5).
$$\varepsilon_{1j}=\sum_{i=0}^{m}\left(f_{1j}(x_i)-f_{0j}(x_i)\right)^{2}\tag{5}$$
where $x_i$ denotes the abscissa of fitting point i, $f_{1j}(x_i)$ denotes the ordinate of fitting point i on the second prediction fitting curve of lane line j, and $f_{0j}(x_i)$ denotes the ordinate of fitting point i on the true value of lane line j.
Since the operation process of the least square operation is simple and the overall error distribution is considered, the first difference between the second prediction fit curve and the true value of each lane line can be determined quickly and accurately according to the formula (5). It is understood that the above formula (5) is also only exemplary, and the first difference may be determined by using a modification of the above formula (5) or another formula different from the above formula (5) or other manners, which is not limited by the embodiment of the present application.
The embodiment does not limit the specific method for determining the second difference between the second detection fitting curve of each lane line and the true value of each lane line.
For example, again taking lane line j as an example, the second difference $\varepsilon_{2j}$ between the second detection fitting curve $f_{2j}(x_i)$ of lane line j and the true value $f_{0j}(x_i)$ of lane line j may be determined using any one of the following formulas (6) to (9), where i = 0, 1, …, m and j = 0, 1, …, n:
[Equations (6) to (9): rendered as images in the original publication]
That is, the present embodiment may use any one of the above methods for determining a difference to determine the second difference between the second detection fitting curve of the lane line and the true value of the lane line. It is to be understood that equations (6) to (9) are only exemplary, and the second difference may be determined by using a modification of them or other manners different from them, which is not limited by the embodiment of the present application.
In one example, the determining the second difference between the second detection fitting curve of each of the lane lines and the true value of each of the lane lines includes:
and taking the cross entropy between the second detection fitting curve of each lane line and the truth value of each lane line as the second difference.
Specifically, the cross entropy between the second detection fitting curve of each lane line and the true value of each lane line is used as the second difference according to the following formula (10).
$$\varepsilon_{2j}=-\sum_{i=0}^{m}f_{0j}(x_i)\log f_{2j}(x_i)\tag{10}$$
where $f_{2j}(x_i)$ denotes the ordinate of fitting point i on the second detection fitting curve of lane line j.
Thus, the second difference between the second detection fitting curve and the true value of each lane line can be accurately determined according to the formula (10). It is understood that the above formula (10) is also only exemplary, and the second difference may be determined by using a modification of the above formula (10) or another formula different from the above formula (10) or other manners, which is not limited by the embodiment of the present application.
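As a numeric illustration of the two differences, the sketch below evaluates the least-squares form of equation (5) and one cross-entropy reading of equation (10) on per-point values; treating the compared values as probabilities and clipping them before the logarithm are implementation assumptions, not details given in the text.

```python
import numpy as np

def first_difference(pred_values, true_values):
    """Least-squares difference between a second prediction fitting curve and the truth (eq. (5))."""
    pred = np.asarray(pred_values, dtype=float)
    true = np.asarray(true_values, dtype=float)
    return float(np.sum((pred - true) ** 2))

def second_difference(det_values, true_values, eps=1e-7):
    """Cross-entropy style difference between a second detection fitting curve and the truth,
    one possible reading of eq. (10); eps keeps the logarithm finite."""
    det = np.clip(np.asarray(det_values, dtype=float), eps, 1.0)
    true = np.asarray(true_values, dtype=float)
    return float(-np.sum(true * np.log(det)))
```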
In another example, the cross entropy between the second prediction fit curve of each lane line and the true value of each lane line may be used as the first difference, and the cross entropy between the second detection fit curve of each lane line and the true value of each lane line may be used as the second difference.
Optionally, a least square operation result between the second prediction fitting curve of each lane line and a true value of each lane line may be used as the first difference, and a least square operation result between the second detection fitting curve of each lane line and a true value of each lane line may be used as the second difference.
Optionally, the cross entropy between the second prediction fitting curve of each lane line and the true value of each lane line may be used as the first difference, and the least square operation result between the second detection fitting curve of each lane line and the true value of each lane line may be used as the second difference.
Optionally, the method for solving the first difference and the second difference may be the same or different in this embodiment, and this embodiment does not limit this.
Then, the detection loss of the neural network is determined from the first difference and the second difference of each lane line.
Specifically, the first difference and the second difference of each lane line determined above are substituted into the loss function of the neural network, and the detection loss of the neural network is determined.
In one example, the present embodiment may use a weighted sum of the first difference and the second difference for each lane line as a loss function of the neural network model.
For example, if the weight of each first difference $\varepsilon_{1j}$ is a and the weight of each second difference $\varepsilon_{2j}$ is b, the value loss of the loss function of the neural network can be determined as:
$$loss=\sum_{j=0}^{n}\left(a\,\varepsilon_{1j}+b\,\varepsilon_{2j}\right)$$
wherein n represents the total number of lane lines in the road map.
In an example, the present embodiment may further use a least squares operation result of the first difference and the second difference of each lane line as a detection loss of the neural network.
In another example, the present embodiment may further use a sum of the first difference and the second difference of each lane line as a detection loss of the neural network.
Specifically, the sum of each first difference $\varepsilon_{1j}$ and each second difference $\varepsilon_{2j}$ is taken as the value loss of the detection loss function of the neural network, that is, the loss of the neural network is:
$$loss=\sum_{j=0}^{n}\left(\varepsilon_{1j}+\varepsilon_{2j}\right)$$
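Combining the per-lane differences into the detection loss, covering both the weighted-sum and the plain-sum variants described above, can be sketched as follows (with a = b = 1 giving the plain sum):

```python
def detection_loss(first_diffs, second_diffs, a=1.0, b=1.0):
    """Detection loss of the neural network: sum over all lane lines of the weighted
    first and second differences; with a = b = 1 this is the plain sum variant."""
    return sum(a * e1 + b * e2 for e1, e2 in zip(first_diffs, second_diffs))
```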
The network parameters of the neural network are then adjusted according to the detection loss, for example by comparing the detection loss of the neural network with a preset loss and adjusting the network parameters through back-propagation of the gradients.
Then, based on the adjusted network parameters, lane line detection continues: a new road training map is input into the adjusted neural network, the above steps are executed, and the detection loss of the neural network is determined. Whether the detection loss satisfies a convergence condition is then judged, for example whether the detection loss is less than the preset loss; if so, the neural network training is finished, and the trained neural network is used to predict lane lines. If the detection loss does not satisfy the convergence condition, the network parameters of the neural network continue to be adjusted, and new road training maps are used to train the adjusted neural network until the detection loss satisfies the convergence condition.
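A schematic training loop along these lines is sketched below; it assumes a PyTorch-style network and a loss routine like the one above, and the optimizer choice, learning rate, convergence threshold, and iteration cap are all placeholders.

```python
import torch

def train_lane_network(network, training_batches, compute_detection_loss,
                       lr=1e-3, loss_convergence=0.05, max_iterations=100000):
    """Schematic training loop: back-propagate the detection loss and adjust the network
    parameters until the loss falls below a preset value (the convergence condition)."""
    optimizer = torch.optim.SGD(network.parameters(), lr=lr)
    for step, (road_training_map, lane_line_truths) in enumerate(training_batches):
        prob_maps = network(road_training_map)                      # second lane line probability maps
        loss = compute_detection_loss(prob_maps, lane_line_truths)  # fitting + differences as above
        optimizer.zero_grad()
        loss.backward()                                             # inverse gradient propagation
        optimizer.step()
        if loss.item() < loss_convergence or step + 1 >= max_iterations:
            break
    return network
```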
Therefore, after a large amount of iterative training, the prediction precision of the neural network can be effectively improved, and thus, in the detection process of the actual lane line, the high-precision neural network can accurately predict the second prediction fitting curve of the lane line. And then, based on the vanishing points of the second prediction fitting curves, correcting the second prediction fitting curves to generate a detection fitting curve with higher precision, further improving the accurate detection of the lane line and providing guarantee for the popularization of intelligent driving.
According to the lane line detection method provided by the embodiment of the present invention, the road training map is input into the neural network, so that a second lane line probability map of the at least two lane lines is output via the neural network; a second prediction fitting curve of the at least two lane lines is determined according to at least some pixel points whose probability values in the second lane line probability map are greater than a set threshold; vanishing points of the at least two lane lines are determined according to the second prediction fitting curves of the at least two lane lines; curve fitting is performed according to the at least some pixel points whose probability values in the second lane line probability map are greater than the set threshold and the vanishing point, so as to determine a second detection fitting curve of each lane line; and the network parameters of the neural network are adjusted according to the first difference between the second prediction fitting curve and the true value of each lane line and the second difference between the second detection fitting curve and the true value of each lane line, so as to train the neural network and further improve the lane line prediction accuracy of the neural network.
Fig. 8 is a schematic structural diagram of a lane line detection device according to an embodiment of the present invention. As shown in fig. 8, the lane line detection apparatus 100 of the present embodiment may include:
the acquiring module 110 is configured to acquire a road map, where the road map includes at least two lane lines;
the prediction module 120 is configured to perform lane line prediction on the road map to obtain prediction results of at least two lane lines;
a determining module 130, configured to determine vanishing points of the at least two lane lines according to the prediction results of the at least two lane lines;
and an output module 140, configured to output a lane line detection result of the road map according to the prediction result of the at least two lane lines and the vanishing point.
The lane line detection device of the embodiment of the present invention may be used to implement the technical solutions of the above-described method embodiments, and the implementation principles and technical effects thereof are similar and will not be described herein again.
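Purely to illustrate how the four modules of Fig. 8 cooperate, a minimal sketch is given below; the class and callable names are hypothetical, and the concrete network, fitting, and vanishing-point logic are assumed to be supplied elsewhere.

```python
class LaneLineDetector:
    """Minimal composition of the four modules described above (Fig. 8)."""

    def __init__(self, acquire_fn, predict_fn, vanishing_point_fn, output_fn):
        self.acquire = acquire_fn                        # obtaining module 110
        self.predict = predict_fn                        # prediction module 120
        self.find_vanishing_point = vanishing_point_fn   # determining module 130
        self.render = output_fn                          # output module 140

    def detect(self):
        road_map = self.acquire()                        # road map with >= 2 lane lines
        predictions = self.predict(road_map)             # per-lane prediction results
        vp = self.find_vanishing_point(predictions)      # vanishing point of the lanes
        return self.render(predictions, vp)              # lane line detection result
```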
Fig. 9 is a schematic structural diagram of a lane line detection device according to a second embodiment of the present invention. As shown in fig. 9, the prediction module 120 includes: a first prediction unit 121 and a first fitting unit 122;
a first prediction unit 121, configured to input the road map into a neural network, so as to output a first lane line probability map of the at least two lane lines through the neural network;
the first fitting unit 122 is configured to determine a first predicted fitting curve of the at least two lane lines according to at least some pixel points in the first lane line probability map, where the probability value is greater than a set threshold.
In a possible implementation manner of this embodiment, the first fitting unit 122 is specifically configured to sample each probability point in the first lane line probability map, where a probability value is greater than a preset value, and determine sampling points of the at least two first lane line probability maps; and performing curve fitting on pixel points corresponding to the sampling points of the at least two first lane line probability maps to determine a first prediction fitting curve of the at least two lane lines.
In another possible implementation manner of this embodiment, the first fitting unit 122 is specifically configured to perform gaussian sampling on each probability point in the first lane line probability map, where the probability value is greater than a preset value, and determine sampling points of the at least two first lane line probability maps.
In another possible implementation manner of this embodiment, the first fitting unit 122 is specifically configured to determine that a common intersection of the first predictive fitting curves of the at least two lane lines is a vanishing point of the at least two lane lines.
In another possible implementation manner of this embodiment, the first fitting unit 122 is specifically configured to perform curve fitting according to at least some pixel points in the first lane line probability map, where the probability value is greater than a set threshold, and the vanishing point, and determine and output a first detection fitting curve of the lane line.
In another possible implementation manner of this embodiment, the obtaining module 110 is specifically configured to collect a road map in a scene where a vehicle is located through a vehicle-mounted camera.
The lane line detection device of the embodiment of the present invention may be used to implement the technical solutions of the above-described method embodiments, and the implementation principles and technical effects thereof are similar and will not be described herein again.
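As a rough sketch of the sampling-and-fitting behaviour of the first fitting unit described above, the code below uses probability-weighted random sampling as a stand-in for the Gaussian sampling, a quadratic x = f(y) fit, and an arbitrary 1000-pixel search range for the common intersection; all of these choices are assumptions of this illustration.

```python
import numpy as np

def fit_lane_from_prob_map(prob_map, threshold=0.5, degree=2, n_samples=200, seed=0):
    """Sample high-probability pixels and fit x = f(y) with a low-order polynomial."""
    ys, xs = np.nonzero(prob_map > threshold)
    if xs.size == 0:
        return None
    weights = prob_map[ys, xs].astype(np.float64)
    rng = np.random.default_rng(seed)
    idx = rng.choice(xs.size, size=min(n_samples, xs.size), replace=False,
                     p=weights / weights.sum())
    return np.polyfit(ys[idx], xs[idx], deg=degree)   # coefficients of the fitted curve

def common_intersection(coeffs_a, coeffs_b, y_max=1000, n=2001):
    """Approximate the common intersection (vanishing point) of two fitted
    curves x = f(y) by the row where their horizontal gap is smallest."""
    ys = np.linspace(0.0, float(y_max), n)
    gap = np.abs(np.polyval(coeffs_a, ys) - np.polyval(coeffs_b, ys))
    y_star = ys[int(np.argmin(gap))]
    return float(np.polyval(coeffs_a, y_star)), float(y_star)
```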
Fig. 10 is a schematic structural diagram of a lane line detection device according to a third embodiment of the present invention. As shown in fig. 10, the apparatus further includes: a training module 150;
the training module 150 is configured to train the neural network based on the road training diagram and vanishing points of lane lines in the road training diagram.
Fig. 11 is a schematic structural diagram of a lane line detection device according to a fourth embodiment of the present invention. On the basis of the foregoing embodiment, the training module 150 includes: a second prediction unit 151, a second fitting unit 152, a vanishing point determination unit 153, and an adjustment unit 154;
a second prediction unit 151, configured to input the road training map into a neural network, so as to output a second lane line probability map of the at least two lane lines via the neural network;
a second fitting unit 152, configured to determine a second predicted fitting curve of the at least two lane lines according to at least some pixel points in the second lane line probability map, where the probability value is greater than a set threshold;
a vanishing point determining unit 153, configured to determine vanishing points of the at least two lane lines according to the second prediction fitting curves of the at least two lane lines;
the second fitting unit 152 is further configured to perform curve fitting according to at least some pixel points in the second lane line probability map, where the probability value is greater than a set threshold, and the vanishing point, and determine a second detection fitting curve of the at least two lane lines;
an adjusting unit 154, configured to adjust the network parameters of the neural network according to a first difference between the second predicted fitted curve of each lane line and the true value of each lane line, and a second difference between the second detected fitted curve of each lane line and the true value of each lane line.
The lane line detection device of the embodiment of the present invention may be used to implement the technical solutions of the above-described method embodiments, and the implementation principles and technical effects thereof are similar and will not be described herein again.
Fig. 12 is a schematic structural diagram of a lane line detection device according to a fifth embodiment of the present invention, in which the adjusting unit 154 includes: a difference subunit 1541, a loss determination subunit 1542 and a tuning subunit 1543;
the difference subunit 1541 is configured to determine, for each of the at least two lane lines, a first difference between a second predicted fit curve of the lane line and a true value of the lane line, and a second difference between a second detected fit curve of the lane line and a true value of the lane line;
a loss determination subunit 1542, configured to determine a detection loss of the neural network according to the first difference and the second difference of each lane line;
an adjusting subunit 1543, configured to adjust a network parameter of the neural network according to the detected loss.
In one possible implementation, the difference subunit 1541 is specifically configured to use a least squares operation result between the second prediction fit curve of the lane line and a true value of the lane line as the first difference of the lane line.
In another possible implementation manner, the difference subunit 1541 is specifically configured to use a cross entropy between a second detected fitted curve of the lane line and a true value of the lane line as the second difference of the lane line.
In another possible implementation manner, the loss determining subunit 1542 is specifically configured to use a sum of the first difference and the second difference of each lane line as the detection loss of the neural network.
The lane line detection device of the embodiment of the present invention may be used to implement the technical solutions of the above-described method embodiments, and the implementation principles and technical effects thereof are similar and will not be described herein again.
Fig. 13 is a schematic structural diagram of an electronic device according to an embodiment of the present invention, and as shown in fig. 13, an electronic device 30 of the embodiment includes:
a memory 31 for storing a computer program;
the processor 32 is configured to execute the computer program to implement the above-mentioned lane line detection method, which has similar implementation principles and technical effects and is not described herein again.
Fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present invention, and as shown in fig. 14, an electronic device 40 of the embodiment includes:
the camera 41 is configured to obtain a road map, where the road map includes at least two lane lines;
a memory 42 for storing a computer program;
the processor 43 is configured to execute the computer program to implement the above-mentioned lane line detection method, which has similar implementation principles and technical effects, and is not described herein again.
Fig. 15 is a schematic flow chart of a driving control method according to an embodiment of the present invention, and on the basis of the foregoing embodiment, an embodiment of the present invention further provides a driving control method, including:
s401, the driving control device obtains a lane line detection result of the road map.
S402, the driving control device outputs prompt information and/or carries out intelligent driving control on the vehicle according to the lane line detection result.
The execution body of this embodiment is the driving control device. The driving control device of this embodiment and the electronic device described in the above embodiments may be located in the same device, or may be located separately in different devices. The driving control device of this embodiment is communicatively connected to the electronic device.
The lane line detection result of the road map is obtained by the lane line detection method according to the above embodiment, and the specific process refers to the description of the above embodiment and is not described herein again.
Specifically, the electronic device executes the above-mentioned lane line detection method, obtains a lane line detection result of the road map, and outputs the lane line detection result of the road map. The driving control device obtains the lane line detection result of the road map, and outputs prompt information and/or carries out intelligent driving control on the vehicle according to the lane line detection result of the road map.
The prompt information may include a lane departure warning prompt, or a lane keeping prompt, and the like.
The intelligent driving of this embodiment includes assisted driving and/or automatic driving.
The above-mentioned intelligent driving control may include: braking, changing the driving speed, changing the driving direction, lane keeping, changing the state of the lights, switching the driving mode, and the like, where the driving mode switching may be a switch between assisted driving and automatic driving, for example, switching from assisted driving to automatic driving.
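The mapping from a lane line detection result to prompts and controls is not specified in detail; the following sketch shows one plausible lane-departure rule, where the dictionary keys, the 0.8 warning ratio, and the command strings are all hypothetical.

```python
def driving_control(lane_result, vehicle_center_x, warn_ratio=0.8):
    """Map a lane line detection result to prompt information and control commands.
    `lane_result` is assumed to expose the left/right boundary x-positions at the
    vehicle's reference row."""
    left_x, right_x = lane_result["left_x"], lane_result["right_x"]
    lane_center = 0.5 * (left_x + right_x)
    half_width = 0.5 * (right_x - left_x)
    offset = vehicle_center_x - lane_center

    prompts, commands = [], []
    if abs(offset) > warn_ratio * half_width:
        prompts.append("lane departure warning")       # prompt information
        commands.append("steer toward lane center")    # lane keeping control
    else:
        prompts.append("lane keeping")
    return prompts, commands
```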
According to the vehicle driving method provided by the embodiment, the driving control device outputs the prompt information and/or performs intelligent driving control on the vehicle according to the lane line detection result of the road map by acquiring the lane line detection result of the road map, so that the safety and reliability of intelligent driving are improved.
Fig. 16 is a schematic structural diagram of a driving control apparatus according to an embodiment of the present invention, and based on the above embodiment, the driving control apparatus 200 according to the embodiment of the present application includes:
the obtaining module 210 is configured to obtain a lane line detection result of a road map, where the lane line detection result of the road map is obtained by using the above lane line detection method;
and the driving control module 220 is used for outputting prompt information and/or performing intelligent driving control on the vehicle according to the lane line detection result.
The driving control device according to the embodiment of the present invention may be used to implement the technical solutions of the above-mentioned embodiments of the methods, and the implementation principles and technical effects are similar, which are not described herein again.
Fig. 17 is a schematic diagram of an intelligent driving system according to an embodiment of the present invention, and as shown in fig. 17, an intelligent driving system 50 according to the present embodiment includes: the driving control device comprises a camera 51, an electronic device 30 and a driving control device 200 which are in communication connection, wherein the electronic device 30 is shown in figure 13 or 14, the driving control device 200 is shown in figure 16, and the camera 51 is used for shooting a road map.
Specifically, as shown in fig. 17, in actual use, the camera 51 captures a road map and transmits the road map to the electronic device 30, and after receiving the road map, the electronic device 30 processes the road map according to the above-mentioned method for detecting a lane line, and obtains a lane line detection result of the road map. Then, the electronic device 30 transmits the obtained lane line detection result of the road map to the driving control apparatus 200, and the driving control apparatus 200 outputs the prompt information and/or performs the intelligent driving control on the vehicle according to the lane line detection result of the road map.
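A minimal sketch of this camera → electronic device → driving control flow is shown below; the interfaces capture, detect_lane_lines, decide, show, and apply are assumed names used only for illustration.

```python
def run_intelligent_driving_loop(camera, electronic_device, driving_controller, vehicle):
    """Wire up the flow of Fig. 17: capture a road map, detect lane lines,
    then prompt the driver and/or control the vehicle."""
    while vehicle.is_running():
        road_map = camera.capture()                                   # camera 51
        detection = electronic_device.detect_lane_lines(road_map)     # electronic device 30
        prompts, commands = driving_controller.decide(detection)      # driving control device 200
        vehicle.show(prompts)
        vehicle.apply(commands)
```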
Further, when at least some of the functions of the lane line detection method and/or the driving control method in the embodiments of the present invention are implemented by software, the embodiments of the present invention further provide a computer storage medium for storing computer software instructions for the lane line detection described above; when these instructions are run on a computer, the computer is enabled to execute the various possible lane line detection methods and/or driving control methods in the above method embodiments. When the computer-executable instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the present invention are generated in whole or in part. The computer instructions may be stored in a computer storage medium or transmitted from one computer storage medium to another, for example, from one website, computer, server, or data center to another via wireless means (e.g., cellular, infrared, short-range wireless, microwave, etc.). The computer storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., an SSD), among others.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A method for detecting a lane line, comprising:
acquiring a road map;
predicting lane lines of the road map to obtain a prediction result of at least two lane lines;
determining vanishing points of the at least two lane lines according to the prediction results of the at least two lane lines;
and outputting a lane line detection result of the road map according to the prediction result of the at least two lane lines and the vanishing point.
2. The method of claim 1, wherein the performing lane line prediction on the road map to obtain a prediction result of at least two lane lines comprises:
inputting the road map into a neural network to output a first lane line probability map of the at least two lane lines through the neural network;
and determining a first prediction fitting curve of the at least two lane lines according to at least part of pixel points of which the probability values in the first lane line probability graph are greater than a set threshold value.
3. The method of claim 2, wherein determining a first predictive fit curve for the at least two lane lines based on at least some of the pixels in the first lane line probability map having a probability greater than a predetermined threshold comprises:
sampling probability points of which the probability values are greater than a preset value in the first lane line probability map, and determining sampling points of the at least two first lane line probability maps;
and performing curve fitting on pixel points corresponding to the sampling points of the at least two first lane line probability maps to determine a first prediction fitting curve of the at least two lane lines.
4. A lane line detection device, comprising:
the acquisition module is used for acquiring a road map;
the prediction module is used for predicting the lane lines of the road map to obtain the prediction results of at least two lane lines;
the determining module is used for determining the vanishing points of the at least two lane lines according to the prediction results of the at least two lane lines;
and the output module is used for outputting the lane line detection result of the road map according to the prediction result of the at least two lane lines and the vanishing point.
5. A driving control method characterized by comprising:
the driving control device acquires a lane line detection result of a road map, wherein the lane line detection result of the road map is obtained by adopting the lane line detection method according to any one of claims 1 to 4;
and the driving control device outputs prompt information and/or carries out intelligent driving control on the vehicle according to the lane line detection result.
6. A driving control apparatus, characterized by comprising:
an obtaining module, configured to obtain a lane line detection result of a road map, where the lane line detection result of the road map is obtained by using the lane line detection method according to any one of claims 1 to 3;
and the driving control module is used for outputting prompt information and/or carrying out intelligent driving control on the vehicle according to the lane line detection result.
7. An electronic device, comprising:
a memory for storing a computer program;
a processor for executing the computer program to implement the lane line detection method according to any one of claims 1 to 3.
8. An electronic device, comprising: the device comprises a camera, a memory and a processor;
the camera is used for acquiring a road map, wherein the road map comprises at least two lane lines;
a memory for storing a computer program;
a processor for executing the computer program to implement the lane line detection method according to any one of claims 1 to 3.
9. An intelligent driving system, comprising: a camera configured to acquire a road map, an electronic device according to claim 7 or 8, and a driving control apparatus according to claim 6, which are communicatively connected.
10. A computer storage medium, characterized in that the storage medium has stored therein a computer program that, when executed, implements the lane line detection method according to any one of claims 1 to 3, and implements the driving control method according to claim 5.
CN201811355223.4A 2018-11-14 2018-11-14 Lane line detection and driving control method and device and electronic equipment Pending CN111191487A (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201811355223.4A CN111191487A (en) 2018-11-14 2018-11-14 Lane line detection and driving control method and device and electronic equipment
JP2021525695A JP2022507226A (en) 2018-11-14 2019-11-13 Compartment line detection methods, devices, and operation control methods, devices and electronic devices
PCT/CN2019/118097 WO2020098708A1 (en) 2018-11-14 2019-11-13 Lane line detection method and apparatus, driving control method and apparatus, and electronic device
KR1020217015078A KR20210079339A (en) 2018-11-14 2019-11-13 Lane detection and driving control method, apparatus and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811355223.4A CN111191487A (en) 2018-11-14 2018-11-14 Lane line detection and driving control method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN111191487A true CN111191487A (en) 2020-05-22

Family

ID=70709100

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811355223.4A Pending CN111191487A (en) 2018-11-14 2018-11-14 Lane line detection and driving control method and device and electronic equipment

Country Status (4)

Country Link
JP (1) JP2022507226A (en)
KR (1) KR20210079339A (en)
CN (1) CN111191487A (en)
WO (1) WO2020098708A1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112199999A (en) * 2020-09-09 2021-01-08 浙江大华技术股份有限公司 Road detection method, road detection device, storage medium and electronic equipment
CN112132109A (en) * 2020-10-10 2020-12-25 北京百度网讯科技有限公司 Lane line processing and lane positioning method, device, equipment and storage medium
CN112101321B (en) * 2020-11-18 2021-02-02 蘑菇车联信息科技有限公司 Vanishing point extraction method and device, electronic equipment and storage medium
CN114743178B (en) * 2021-12-29 2024-03-08 北京百度网讯科技有限公司 Road edge line generation method, device, equipment and storage medium
CN115440048A (en) * 2022-09-20 2022-12-06 澳克诺(上海)汽车科技有限公司 Method, apparatus and medium for predicting vehicle travel track
CN116091648B (en) * 2023-02-09 2023-12-01 禾多科技(北京)有限公司 Lane line generation method and device, storage medium and electronic device
CN117649635A (en) * 2024-01-30 2024-03-05 湖北经济学院 Method, system and storage medium for detecting shadow eliminating point of narrow water channel scene
CN118012630A (en) * 2024-04-08 2024-05-10 腾讯科技(深圳)有限公司 Lane line data processing method and related equipment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3729141B2 (en) * 2002-02-27 2005-12-21 日産自動車株式会社 Road white line recognition device
JP2012048289A (en) * 2010-08-24 2012-03-08 Isuzu Motors Ltd Straight line detection device
JP6160252B2 (en) * 2013-05-29 2017-07-12 日産自動車株式会社 Image processing apparatus and image processing method
JP2018164199A (en) * 2017-03-27 2018-10-18 ソニーセミコンダクタソリューションズ株式会社 Image processing device and image processing method
CN108229354A (en) * 2017-12-22 2018-06-29 温州大学激光与光电智能制造研究院 The method of lane detection

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100002911A1 (en) * 2008-07-06 2010-01-07 Jui-Hung Wu Method for detecting lane departure and apparatus thereof
CN105138955A (en) * 2015-07-10 2015-12-09 深圳市中天安驰有限责任公司 Detection method of road disappearance points
CN105893949A (en) * 2016-03-29 2016-08-24 西南交通大学 Lane line detection method under complex road condition scene
CN108216229A (en) * 2017-09-08 2018-06-29 北京市商汤科技开发有限公司 The vehicles, road detection and driving control method and device
CN108629292A (en) * 2018-04-16 2018-10-09 海信集团有限公司 It is bent method for detecting lane lines, device and terminal

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111652952A (en) * 2020-06-05 2020-09-11 腾讯科技(深圳)有限公司 Lane line generation method, lane line generation device, computer device, and storage medium
CN111814651A (en) * 2020-07-02 2020-10-23 北京百度网讯科技有限公司 Method, device and equipment for generating lane line
CN111814651B (en) * 2020-07-02 2024-01-12 阿波罗智能技术(北京)有限公司 Lane line generation method, device and equipment
CN111539401A (en) * 2020-07-13 2020-08-14 平安国际智慧城市科技股份有限公司 Lane line detection method, device, terminal and storage medium based on artificial intelligence
CN111539401B (en) * 2020-07-13 2020-10-23 平安国际智慧城市科技股份有限公司 Lane line detection method, device, terminal and storage medium based on artificial intelligence
CN112465925A (en) * 2020-11-20 2021-03-09 北京赛目科技有限公司 Method and device for processing lane line for simulation test
CN112146620A (en) * 2020-11-25 2020-12-29 腾讯科技(深圳)有限公司 Target object ranging method and device
CN112215214A (en) * 2020-12-11 2021-01-12 智道网联科技(北京)有限公司 Method and system for adjusting camera offset of intelligent vehicle-mounted terminal
CN112734139A (en) * 2021-01-28 2021-04-30 腾讯科技(深圳)有限公司 Passage time length prediction method and device, storage medium and electronic equipment
CN112734139B (en) * 2021-01-28 2023-09-29 腾讯科技(深圳)有限公司 Method and device for predicting passage duration, storage medium and electronic equipment
CN113011285A (en) * 2021-03-02 2021-06-22 北京三快在线科技有限公司 Lane line detection method and device, automatic driving vehicle and readable storage medium
CN113869293A (en) * 2021-12-03 2021-12-31 禾多科技(北京)有限公司 Lane line recognition method and device, electronic equipment and computer readable medium
WO2023131065A1 (en) * 2022-01-07 2023-07-13 华为技术有限公司 Image processing method, lane line detection method and related device
CN115470830A (en) * 2022-10-28 2022-12-13 电子科技大学 Multi-source domain adaptation-based electroencephalogram signal cross-user alertness monitoring method
CN115470830B (en) * 2022-10-28 2023-04-07 电子科技大学 Multi-source-domain-adaptation-based electroencephalogram signal cross-user alertness monitoring method

Also Published As

Publication number Publication date
JP2022507226A (en) 2022-01-18
KR20210079339A (en) 2021-06-29
WO2020098708A1 (en) 2020-05-22

Similar Documents

Publication Publication Date Title
CN111191487A (en) Lane line detection and driving control method and device and electronic equipment
CN107481292B (en) Attitude error estimation method and device for vehicle-mounted camera
KR101854554B1 (en) Method, device and storage medium for calculating building height
KR20190090393A (en) Lane determining method, device and storage medium
CN113052966B (en) Automatic driving crowdsourcing high-precision map updating method, system and medium
CN111209780A (en) Lane line attribute detection method and device, electronic device and readable storage medium
CN110175507B (en) Model evaluation method, device, computer equipment and storage medium
CN111091023B (en) Vehicle detection method and device and electronic equipment
CN113554643B (en) Target detection method and device, electronic equipment and storage medium
CN111080682B (en) Registration method and device for point cloud data
CN112052807B (en) Vehicle position detection method, device, electronic equipment and storage medium
CN112801047B (en) Defect detection method and device, electronic equipment and readable storage medium
CN111539484A (en) Method and device for training neural network
CN108596032B (en) Detection method, device, equipment and medium for fighting behavior in video
CN110659658A (en) Target detection method and device
CN110751040B (en) Three-dimensional object detection method and device, electronic equipment and storage medium
CN113223064A (en) Method and device for estimating scale of visual inertial odometer
Pei et al. Detecting potholes in asphalt pavement under small-sample conditions based on improved faster region-based convolution neural networks
CN111476062A (en) Lane line detection method and device, electronic equipment and driving system
CN112434753A (en) Model training method, target detection method, device, equipment and storage medium
CN113704276A (en) Map updating method and device, electronic equipment and computer readable storage medium
CN116524382A (en) Bridge swivel closure accuracy inspection method system and equipment
CN116363619A (en) Attention mechanism lane line detection method, system, equipment and medium
CN113744361A (en) Three-dimensional high-precision map construction method and device based on trinocular vision
CN113393494A (en) Model training and target tracking method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200522