WO2020098708A1 - Lane line detection method and apparatus, driving control method and apparatus, and device


Info

Publication number
WO2020098708A1
Authority
WO
WIPO (PCT)
Prior art keywords
lane line
lane
probability
map
lines
Prior art date
Application number
PCT/CN2019/118097
Other languages
English (en)
Chinese (zh)
Inventor
庄佩烨
程光亮
石建萍
Original Assignee
北京市商汤科技开发有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京市商汤科技开发有限公司 filed Critical 北京市商汤科技开发有限公司
Priority to KR1020217015078A (published as KR20210079339A)
Priority to JP2021525695A (published as JP2022507226A)
Publication of WO2020098708A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588: Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/04: Architecture, e.g. interconnection topology
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/08: Learning methods

Definitions

  • Embodiments of the present disclosure relate to the field of computer vision technology, and in particular to a lane line detection and driving control method, device, and electronic equipment.
  • Lane line detection technology is one of the key technologies for realizing intelligent driving such as assisted driving and automatic driving.
  • Lane line detection is mainly used in visual navigation systems to find the positions of lane lines in road maps from captured road images.
  • The main work of lane line detection is the fitting of lane lines.
  • The accuracy of lane line fitting directly affects the accuracy of lane line detection results, which in turn determines the safety of intelligent driving.
  • Embodiments of the present disclosure provide a technical solution for detecting lane lines.
  • An aspect of an embodiment of the present disclosure provides a lane line detection method, including:
  • acquiring a road map, wherein the road map includes at least two lane lines;
  • performing lane line prediction on the road map to obtain prediction results of the at least two lane lines;
  • determining the vanishing point of the at least two lane lines according to the prediction results of the at least two lane lines; and
  • outputting the lane line detection result of the road map according to the prediction results of the at least two lane lines and the vanishing point.
  • Another aspect of an embodiment of the present disclosure provides a lane line detection device, including:
  • a prediction module configured to predict lane lines on the road map and obtain prediction results of at least two lane lines;
  • a determining module configured to determine the vanishing point of the at least two lane lines according to the prediction results of the at least two lane lines; and
  • an output module configured to output the lane line detection result of the road map according to the prediction results of the at least two lane lines and the vanishing point.
  • Another aspect of an embodiment of the present disclosure provides a driving control method, including:
  • obtaining, by a driving control device, the lane line detection result of a road map, where the lane line detection result is obtained by the lane line detection method according to any one of the first aspect; and
  • outputting, by the driving control device, prompt information and/or performing driving control on the vehicle according to the lane line detection result.
  • Another aspect of an embodiment of the present disclosure provides a driving control device, including:
  • an obtaining module configured to obtain a lane line detection result of a road map, where the result is obtained by the lane line detection method described in any of the above embodiments of the present disclosure; and
  • a driving control module configured to output prompt information and/or perform driving control of the vehicle according to the lane line detection result.
  • Another aspect of an embodiment of the present disclosure provides an electronic device, including:
  • a memory configured to store a computer program; and
  • a processor configured to execute the computer program to implement the lane line detection method described in the first aspect.
  • Another aspect of an embodiment of the present disclosure provides an electronic device, including:
  • a camera configured to obtain a road map, wherein the road map includes at least two lane lines;
  • a memory configured to store a computer program; and
  • a processor configured to execute the computer program to implement the lane line detection method described in the first aspect.
  • Another aspect of an embodiment of the present disclosure provides an intelligent driving system, including a camera, the electronic device according to any one of the above embodiments of the present disclosure, and the driving control device according to any one of the above embodiments, connected in communication; the camera is used to obtain a road map.
  • An embodiment of the present disclosure provides a computer storage medium storing a computer program; when the computer program is executed, the lane line detection method according to any of the foregoing embodiments of the present disclosure is implemented.
  • The lane line detection and driving control method, device, and electronic equipment of the embodiments of the present disclosure obtain the prediction results of at least two lane lines by acquiring a road map and predicting the lane lines on the road map; determine the vanishing point of the at least two lane lines according to the prediction results of the at least two lane lines; and output the lane line detection result of the road map based on the prediction results of the at least two lane lines and the vanishing point.
  • FIG. 1 is a flowchart of a lane line detection method according to an embodiment of the present disclosure
  • FIG. 2 is a flowchart of a lane line detection method according to another embodiment of the present disclosure.
  • FIG. 3 is a schematic structural diagram of a neural network model involved in an embodiment of the present disclosure.
  • FIG. 5 is a probability map corresponding to the road map shown in FIG. 4;
  • FIG. 6A is a schematic diagram of the intersection of predicted fitting curves involved in an embodiment of the present disclosure.
  • FIG. 6B is a schematic diagram of a predicted fitting curve involved in an embodiment of the present disclosure.
  • FIG. 6C is a schematic diagram of a detection fitting curve involved in an embodiment of the present disclosure.
  • FIG. 7 is a flowchart of a lane line detection method according to another embodiment of the present disclosure.
  • FIG. 8 is a schematic structural diagram of a lane line detection device according to an embodiment of the present disclosure.
  • FIG. 9 is a schematic structural diagram of a lane line detection device according to another embodiment of the present disclosure.
  • FIG. 10 is a schematic structural diagram of a lane line detection device according to another embodiment of the present disclosure.
  • FIG. 11 is a schematic structural diagram of a lane line detection device according to yet another embodiment of the present disclosure.
  • FIG. 12 is a schematic structural diagram of a lane line detection device according to still another embodiment of the present disclosure.
  • FIG. 13 is a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure.
  • FIG. 14 is a schematic structural diagram of an electronic device provided by another embodiment of the present disclosure.
  • FIG. 15 is a schematic structural diagram of an application example of an electronic device of the present disclosure.
  • FIG. 16 is a schematic flowchart of a driving control method provided by an embodiment of the present disclosure.
  • FIG. 17 is a schematic structural diagram of a driving control device according to an embodiment of the present disclosure.
  • FIG. 18 is a schematic diagram of an intelligent driving system provided by an embodiment of the present disclosure.
  • In the present disclosure, "a plurality" may refer to two or more, and "at least one" may refer to one, two, or more than two.
  • The terms "first" and "second" in the embodiments of the present disclosure are only used to distinguish different steps, devices, or modules, and represent neither any specific technical meaning nor a necessary logical order between them.
  • The term "and/or" describes an association relationship between objects and indicates that three kinds of relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist at the same time, or B exists alone.
  • The character "/" in the present disclosure generally indicates that the objects before and after it are in an "or" relationship.
  • The embodiments of the present disclosure can be applied to electronic devices such as terminal devices, computer systems, servers, and vehicle-mounted devices, which can operate together with many other general-purpose or special-purpose computing system environments or configurations.
  • Examples of well-known terminal devices, computing systems, environments, and/or configurations suitable for use with terminal devices, computer systems, servers, in-vehicle devices, and other electronic devices include, but are not limited to: personal computer systems, server computer systems, thin clients, thick clients, handheld or laptop devices, microprocessor-based systems, set-top boxes, programmable consumer electronics, network personal computers, small computer systems, mainframe computer systems, in-vehicle equipment, and distributed cloud computing environments including any of the above systems, and the like.
  • Electronic devices such as terminal devices, computer systems, servers, in-vehicle devices, etc. may be described in the general context of computer system executable instructions (such as program modules) executed by the computer system.
  • program modules may include routines, programs, target programs, components, logic, data structures, etc., which perform specific tasks or implement specific abstract data types.
  • the computer system / server can be implemented in a distributed cloud computing environment, where tasks are performed by remote processing devices linked through a communication network.
  • program modules may be located on local or remote computing system storage media including storage devices.
  • the method provided by the embodiment of the present disclosure is applicable to the fields of computer vision, intelligent driving, etc., which need to obtain the fitting curve of the lane line.
  • The neural networks in the embodiments of the present disclosure may be multi-layer neural networks (i.e., deep neural networks), for example multi-layer convolutional neural networks such as LeNet, AlexNet, GoogLeNet, VGG, ResNet, or any other neural network model.
  • Each neural network may use a neural network of the same type and structure, or a neural network of a different type and / or structure. The embodiments of the present disclosure do not limit this.
  • FIG. 1 is a flowchart of a lane line detection method according to an embodiment of the present disclosure. As shown in FIG. 1, the method of this embodiment may include:
  • The execution body of this embodiment is an electronic device, and the electronic device may include, but is not limited to, a smart phone, a computer, an in-vehicle system, and the like.
  • Optionally, the execution body of this embodiment may be the processor in the above electronic device.
  • The electronic device of this embodiment may also have a camera or be connected to a camera; the road map of the scene in front of (or around) the running vehicle may be photographed through the camera, and the road map may be processed by the processor of the electronic device.
  • the road map may be a single-frame image or a frame image in the captured video stream.
  • the above road map may also be preset, for example, the user inputs the road map to test the lane detection function of the electronic device.
  • the above road map may also be a road training map with lane line labeling information, which may be used to train the accuracy of lane line detection of electronic devices.
  • This embodiment does not limit the way in which the electronic device obtains the road map.
  • the road map of this embodiment includes at least two lane lines.
  • this step S101 may be executed by the processor invoking the corresponding instruction stored in the memory, or may be executed by the acquisition module executed by the processor.
  • S102. Perform lane line prediction on the road map to obtain prediction results of at least two lane lines.
  • an edge detection method may be used to predict the lane lines of the road map to obtain prediction results of at least two lane lines.
  • a support vector machine method may be used to predict lane lines on the road map to obtain prediction results of at least two lane lines.
  • other lane line detection methods may be used to predict the lane line on the road map to obtain prediction results of at least two lane lines.
  • this step S102 may be executed by the processor invoking the corresponding instruction stored in the memory, or by the prediction module executed by the processor.
  • Lane lines are parallel in real-world 3D space, but in a two-dimensional camera image they eventually intersect at a point; this intersection point is called the vanishing point of the lane lines.
  • the vanishing point of the lane line can be obtained.
  • this step S103 may be executed by the processor invoking the corresponding instruction stored in the memory, or may be executed by the determination module executed by the processor.
  • the predicted lane line is connected with the vanishing point, and the connected lane line is used as the lane line detection result of the road map.
  • the predicted lane line and the vanishing point can be re-fitted, and the fitted lane line can be used as the lane line detection result of the road map.
  • this step S104 may be executed by the processor invoking the corresponding instruction stored in the memory, or may be executed by the output module executed by the processor.
  • The lane line detection method of an embodiment of the present disclosure obtains the prediction results of at least two lane lines by acquiring a road map and predicting the lane lines on the road map; determines the vanishing point of the at least two lane lines according to the prediction results of the at least two lane lines; and outputs the lane line detection result of the road map based on the prediction results of the at least two lane lines and the vanishing point.
  • Optionally, the above S102 may include: inputting the road map into a neural network to output a first lane line probability map of the at least two lane lines via the neural network; and determining the first predicted fitting curve of the at least two lane lines according to at least a part of the pixels in the first lane line probability map whose probability value is greater than the set threshold.
  • the above S103 may include: determining that the common intersection point of the first predicted fitting curves of the at least two lane lines is the vanishing point of the at least two lane lines.
  • Optionally, the above S104 may include: performing curve fitting according to at least a part of the pixels with a probability value greater than the set threshold in the first lane line probability map and the vanishing point, and determining and outputting the first detection fitting curve of the lane line.
  • FIG. 2 is a flowchart of a lane line detection method according to another embodiment of the present disclosure. Based on the foregoing embodiment, the optional process of lane line detection involved in this embodiment is shown in FIG. 2.
  • the methods can include:
  • The road map in this embodiment may be a real-time road map of the vehicle operating environment, for example, a road map of the scene where the vehicle is located, collected via an on-board camera.
  • this step S101 may be executed by the processor invoking the corresponding instruction stored in the memory, or may be executed by the acquisition module executed by the processor.
  • The preset neural network in this embodiment may be an FCN (fully convolutional network), a ResNet (residual network), or another convolutional neural network.
  • The neural network of this embodiment includes seven convolutional layers whose parameters are, respectively: 145×169×16 for the first convolutional layer, 73×85×32 for the second, 37×43×64 for the third, 19×22×128 for the fourth, 73×85×32 for the fifth, 145×169×16 for the sixth, and 289×337×5 for the seventh.
  • the neural network in this embodiment may be pre-trained.
  • When the road map shown in FIG. 4 is input to the neural network, the neural network outputs a lane line probability map for each lane line in the road map, recorded as the first lane line probability map, as shown in FIG. 5.
  • this step S202 may be executed by the processor invoking the corresponding instruction stored in the memory, or may be executed by the prediction module or the first prediction unit executed by the processor.
  • S203. Determine the first predicted fitting curve of the at least two lane lines according to at least a part of pixels where the probability value in the first lane line probability map is greater than a set threshold.
  • the lane line probability map of each lane line includes multiple probability points, and each probability point corresponds one-to-one with the pixel points in the road map.
  • the value of each probability point is the probability value of the corresponding pixel in the road map as the lane line.
  • Each probability point represents the probability value that the pixel at the corresponding position in the road map belongs to a lane line; as shown in FIG. 5, for example, the probability value of a white probability point is 1, and the probability value of a black probability point is 0.
  • The preset value is the criterion for deciding whether the pixel corresponding to a probability point lies on a lane line, and the preset value can be determined according to actual needs.
  • For example, with a preset value of 0.8, points with a probability value greater than 0.8 in FIG. 5 can be selected, that is, the white probability points in FIG. 5, and curve fitting is performed on the pixels corresponding to these white probability points to obtain the first predicted fitting curve.
  • Linear, quadratic, cubic, or higher-order function curve fitting may be used.
  • This embodiment does not limit the fitting method of the first predicted fitting curve, and can be determined according to actual needs.
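The threshold-then-fit step described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the disclosed embodiment: the synthetic probability map, the threshold of 0.8, the quadratic degree, and the function name `fit_lane_from_prob_map` are all assumptions for the example.

```python
import numpy as np

def fit_lane_from_prob_map(prob_map, threshold=0.8, degree=2):
    """Select pixels whose lane-line probability exceeds the threshold
    and fit a polynomial curve x = f(y) through them."""
    ys, xs = np.nonzero(prob_map > threshold)  # rows, cols of retained pixels
    # Fit x as a function of y, since lane lines run roughly top-to-bottom
    # in a forward-facing road image.
    return np.poly1d(np.polyfit(ys, xs, degree))

# Synthetic probability map with one straight "lane line" x = 0.5*y + 10.
prob = np.zeros((100, 100))
for y in range(100):
    prob[y, int(0.5 * y + 10)] = 1.0

curve = fit_lane_from_prob_map(prob)
```

In practice one such curve would be fitted per lane line, using that lane line's own probability channel.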
  • Optionally, the above S203 may include: sampling the probability points with a probability value greater than the preset value in the first lane line probability map to determine the sampling points of the at least two first lane line probability maps; and performing curve fitting on the pixels corresponding to the sampling points of the at least two first lane line probability maps to determine the first predicted fitting curve of the at least two lane lines.
  • The number of pixels corresponding to a lane line in the road map is relatively large; performing the fitting operation on every pixel of the lane line involves a large amount of calculation and a slow fitting speed.
  • Therefore, the pixels corresponding to the lane line are screened, and a part of the pixels that meet the conditions are selected for curve fitting.
  • The preset value is the reference for deciding whether a pixel belongs to the lane line: when the probability value of the pixel is greater than the preset value, the pixel is a point on the lane line and can be retained; when the probability value of the pixel is less than the preset value, the pixel is not a point on the lane line and can be discarded.
  • Each probability point with a probability value greater than the preset value in the lane line probability map of each lane line is recorded as a sampling point, and the pixels corresponding to the sampling points are all points on the lane line.
  • Sampling methods such as Markov chain Monte Carlo or Gibbs sampling can be used.
  • Optionally, Gaussian sampling is performed on the probability points with a probability value greater than the preset value in the first lane line probability map to determine the sampling points of the at least two first lane line probability maps.
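As a rough illustration of the screening-and-subsampling idea, the sketch below keeps only above-threshold points and subsamples them with probability-proportional weights. This is a simple stand-in for the Gaussian/MCMC sampling named in the text, not a reproduction of it; the function name and all parameter values are assumptions.

```python
import numpy as np

def sample_prob_points(prob_map, threshold=0.8, n_samples=20, seed=0):
    """Keep probability points above the threshold, then subsample them,
    weighting each candidate by its probability value."""
    rng = np.random.default_rng(seed)
    ys, xs = np.nonzero(prob_map > threshold)
    w = prob_map[ys, xs]
    idx = rng.choice(len(ys), size=min(n_samples, len(ys)),
                     replace=False, p=w / w.sum())
    return ys[idx], xs[idx]

prob = np.zeros((100, 100))
prob[10:90, 50] = 0.95          # 80 candidate points on a vertical lane line
ys, xs = sample_prob_points(prob)
```

The fitting step then runs on the reduced point set, which is what makes the per-lane fit cheap.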
  • this step S203 may be executed by the processor invoking the corresponding instruction stored in the memory, or may be executed by the prediction module or the first fitting unit executed by the processor.
  • Through the above steps, the first predicted fitting curve of each lane line in the road map is obtained. As described above, due to the shooting angle, the extension lines of the lane lines in the same road map intersect at a point A, which is recorded as the common intersection point of the first predicted fitting curves.
  • The first predicted fitting curve of each lane line passes through the vanishing point, so the vanishing point can be used to correct the foregoing predicted fitting curves: for example, an incomplete first predicted fitting curve can be completed, and points whose prediction deviates greatly from the actual lane line can be removed.
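For the linear case, the common intersection point of several fitted lane lines can be recovered as a least-squares solution: each line x = a·y + b contributes one linear equation in the unknown point (x, y). This is only an illustrative sketch under the assumption of straight-line fits; the embodiment's curves may be higher-order.

```python
import numpy as np

def vanishing_point(lines):
    """Least-squares common intersection of lines written as x = a*y + b.
    Each line contributes the equation x - a*y = b in the unknowns (x, y)."""
    A = np.array([[1.0, -a] for a, _ in lines])
    rhs = np.array([b for _, b in lines])
    sol, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return sol  # (x, y) of the common intersection

# Two converging "lane lines": x = 0.5*y + 10 and x = -0.5*y + 50.
vp_x, vp_y = vanishing_point([(0.5, 10.0), (-0.5, 50.0)])
```

With more than two lane lines the same least-squares formulation averages out small disagreements between the individual fits.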
  • step S204 may be executed by the processor invoking the corresponding instruction stored in the memory, or may be executed by the prediction module or the first fitting unit executed by the processor.
  • S205. Perform curve fitting according to at least a part of the pixels with a probability value greater than the set threshold in the first lane line probability map and the vanishing point, and determine and output the first detection fitting curve of the lane line.
  • the curve of the lane line is refitted to generate the first detection fitting curve of the lane line.
  • Taking a lane line as an example: obtain the probability points whose probability value in the first lane line probability map of the lane line is greater than the preset value, take the pixels corresponding to these probability points together with the vanishing point as fitting points, and perform curve fitting using an existing curve fitting method to generate the first detection fitting curve of the lane line.
  • the vanishing point of the above first predicted fitting curve is on the lane line.
  • In this way, a known vanishing point is added during curve fitting, so that an incomplete first predicted fitting curve can be completed and points whose prediction deviates greatly from the actual lane line can be removed, making the fitting result more accurate.
  • Optionally, the weight of the vanishing point in the fitting is increased.
  • The fitting curve then must pass through the vanishing point, which can filter out fitting points that are far from the real situation and further improve the accuracy of curve fitting.
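One way to realize "increase the weight of the vanishing point" is to append the vanishing point to the fitting points with a large weight, e.g. via the `w` argument of `np.polyfit`. This is a sketch of that idea under assumed names and values (the weight 100, the synthetic noise, and the linear degree are illustrative, not from the disclosure):

```python
import numpy as np

def refit_with_vanishing_point(ys, xs, vp, vp_weight=100.0, degree=1):
    """Refit the lane curve with the vanishing point appended as a heavily
    weighted fitting point, pulling the fitted curve through it."""
    vp_y, vp_x = vp
    ys_all = np.append(ys, vp_y)
    xs_all = np.append(xs, vp_x)
    w = np.append(np.ones(len(ys)), vp_weight)  # boost the vanishing point
    return np.poly1d(np.polyfit(ys_all, xs_all, degree, w=w))

# Noisy points roughly on x = y + 5; assumed vanishing point at (y=0, x=5).
rng = np.random.default_rng(0)
ys = np.arange(20.0, 80.0)
xs = ys + 5.0 + rng.normal(0.0, 1.0, size=ys.size)
curve = refit_with_vanishing_point(ys, xs, vp=(0.0, 5.0))
```

Because the vanishing point's residual is weighted far above the ordinary points, the refitted curve is anchored at it, which matches the text's "the fitting curve must pass through the vanishing point".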
  • This embodiment performs subsequent processing on the predicted result of the lane line.
  • That is, the first predicted fitting curve is refitted according to the predicted vanishing point of each first predicted fitting curve to obtain the first detection fitting curve, improving the accuracy of lane line detection.
  • Optionally, this step S205 may be executed by the processor invoking the corresponding instruction stored in the memory, or may be executed by the prediction module or the first fitting unit executed by the processor. Because the process of predicting the probability map by the neural network takes a long time, and the process of correcting the first predicted fitting curve in this embodiment reuses the probability map generated by the neural network the first time, there is no need to return to the neural network model to predict the probability map again; the fitting process of the first detection fitting curve therefore takes a short time, improving the accuracy of lane line detection while ensuring the speed of lane line detection.
  • FIG. 6B is a schematic diagram of a first predicted fitting curve obtained based on a preset neural network, and FIG. 6C is a schematic diagram of a first detection fitting curve obtained using the lane line detection method of an embodiment of the present disclosure.
  • Compared with the first predicted fitting curve, the first detection fitting curve generated by the fitting of the embodiment of the present disclosure completes the lane line to generate a more complete lane line, and removes points where the prediction deviates greatly from the actual lane line.
  • The lane line detection method provided by an embodiment of the present disclosure inputs the road map into a neural network to output a first lane line probability map of the at least two lane lines via the neural network; determines the first predicted fitting curve of the at least two lane lines according to at least a part of the pixels with a probability value greater than the set threshold in the probability map; determines the common intersection point of the first predicted fitting curves of the at least two lane lines as the vanishing point of the at least two lane lines; and performs curve fitting according to at least a part of the pixels with a probability value greater than the set threshold in the first lane line probability map and the vanishing point, determining and outputting the first detection fitting curve of the lane line.
  • That is, the first detection fitting curve of each lane line is generated by fitting, so that the lane line can be completed to generate a more complete lane line, points whose prediction deviates greatly from the actual lane line can be removed, the detection accuracy of lane lines is improved, and the fitting process takes a short time.
  • In some embodiments, the neural network is also trained, that is, the neural network is trained based on the road training map and the vanishing point of the lane lines in the road training map, as shown in FIG. 7.
  • FIG. 7 is a flowchart of a lane line detection method according to another embodiment of the present disclosure. Based on the above embodiment, this embodiment relates to an optional process for training a neural network. As shown in FIG. 7, the method of this embodiment may include:
  • S301. Input the road training map into a neural network to output a second lane line probability map of the at least two lane lines through the neural network.
  • this step S301 may be executed by the processor invoking the corresponding instruction stored in the memory, or may be executed by the training module or the second prediction unit run by the processor.
  • S302. Determine a second predicted fitting curve of the at least two lane lines according to at least a part of pixels with a probability value greater than a set threshold in the second lane line probability map.
  • The road training map has lane line labeling information; the road training map is input into the neural network, and a second lane line probability map of the at least two lane lines is output.
  • For each lane line, determine at least a part of the probability points whose probability value is greater than the set threshold from the second lane line probability map of the lane line, and perform curve fitting on the pixels corresponding to these probability points to generate the second predicted fitting curve of the lane line.
  • step S302 may be executed by the processor invoking the corresponding instruction stored in the memory, or may be executed by the training module or the second fitting unit executed by the processor.
  • the common intersection of the second predicted fitting curves of the at least two lane lines may be used as the vanishing point of the at least two lane lines.
  • this step S303 may be executed by the processor invoking the corresponding instruction stored in the memory, or may be executed by the vanishing point determination unit executed by the processor.
  • S304. Perform curve fitting according to at least a part of the pixels with a probability value greater than the set threshold in the second lane line probability map and the vanishing point, and determine the second detection fitting curve of the lane line.
  • For each lane line, determine the probability points whose probability value is greater than the preset threshold from the second lane line probability map obtained for the lane line above, take the pixels corresponding to these probability points and the vanishing point determined in S303 as fitting points, and perform curve fitting to generate the second detection fitting curve of the lane line.
  • this step S304 may be executed by the processor invoking the corresponding instruction stored in the memory, or may be executed by the second fitting unit executed by the processor.
  • The true value of the lane line described in this embodiment may be an objective lane line, a marked lane line, or a lane line fitted based on the marked information; it is used as supervision information during neural network training to correct the predicted lane line or the detected lane line.
  • Adjust the network parameters of the neural network; for example, adjust network parameters such as the convolution kernel parameters and matrix weights of the neural network.
  • Optionally, the above S305 may include: for each lane line of the at least two lane lines, determining a first difference between the second predicted fitting curve of the lane line and the true value of the lane line, and a second difference between the second detection fitting curve of the lane line and the true value of the lane line; determining the detection loss of the neural network according to the first difference and the second difference of each lane line; and adjusting the network parameters of the neural network according to the detection loss.
  • this step S305 may be executed by the processor invoking the corresponding instruction stored in the memory, or may be executed by the adjustment unit executed by the processor. This embodiment does not limit the method for determining the first difference between the second predicted fitting curve of each lane line and the true value of each lane line.
  • Optionally, this embodiment may use any of the above formulas to determine the first difference between the second predicted fitting curve of each lane line and the true value of each lane line. It can be understood that the above formulas are only exemplary, and the first difference may be determined by a modification of the above formulas, by other formulas, or in other ways, which is not limited in the embodiments of the present disclosure.
  • the above determination of the first difference between the second predicted fitting curve of each of the lane lines and the true value of each of the lane lines includes:
  • the least square operation result between the second predicted fitting curve of each lane line and the true value of each lane line is used as the first difference.
  • x_i represents the abscissa of fitting point i;
  • f_1j(x_i) represents the ordinate of fitting point i on the second predicted fitting curve corresponding to lane line j;
  • f_0j(x_i) represents the ordinate of fitting point i on the true value corresponding to lane line j.
  • this embodiment can quickly and accurately determine, according to the above formula (5), the first difference between the second predicted fitting curve and the true value of each lane line.
  • it can be understood that the above formula (5) is only exemplary, and the first difference may also be determined by a modification of formula (5) or by other formulas or approaches, which is not limited in this embodiment.
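As an illustration, the least-squares first difference can be sketched as the sum of squared ordinate gaps f_1j(x_i) − f_0j(x_i) over the fitting points. The polynomial coefficients and fitting points below are hypothetical stand-ins, not the patent's actual curves:

```python
import numpy as np

def first_difference(pred_curve, truth_curve, xs):
    """Least-squares difference between a predicted fitting curve and the
    lane-line true value: the sum over fitting points x_i of the squared
    gap between f_1j(x_i) and f_0j(x_i)."""
    pred_y = np.polyval(pred_curve, xs)   # f_1j(x_i)
    true_y = np.polyval(truth_curve, xs)  # f_0j(x_i)
    return float(np.sum((pred_y - true_y) ** 2))

# Hypothetical cubic lane-line curves evaluated at 50 fitting points.
xs = np.linspace(0.0, 10.0, 50)
pred = [0.01, -0.2, 1.5, 3.0]   # coefficients of the predicted curve
true = [0.01, -0.2, 1.4, 3.2]   # coefficients of the true-value curve
delta_1 = first_difference(pred, true, xs)
```

A perfect prediction gives a first difference of zero; the value grows as the predicted curve departs from the true value.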
  • This embodiment does not limit the method for determining the second difference between the second detection fitting curve of each lane line and the true value of each lane line.
  • this embodiment may use any of the above-mentioned difference determination methods to determine the second difference between the second detection fitting curve of the lane line and the true value of the lane line. It can be understood that the above formulas (6) to (9) are also only exemplary, and the second difference may also be determined by a modification of them or by other formulas or approaches, which is not limited in this embodiment.
  • the above-mentioned determination of the second difference between the second detection fitting curve of each of the lane lines and the true value of each of the lane lines includes:
  • the cross entropy between the second detection fitting curve of each lane line and the true value of each lane line is used as the second difference.
  • f_2j(x_i) represents the ordinate of fitting point i on the second detection fitting curve corresponding to lane line j.
  • the second difference between the second detection fitting curve and the true value of each lane line can be accurately determined according to the above formula (10).
  • the above formula (10) is also only exemplary, and the second difference may be determined by a modification of formula (10) or by other formulas or approaches, which is not limited in this embodiment of the present disclosure.
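Formula (10) itself is not reproduced in this excerpt, so the following is only a generic cross-entropy sketch over per-point probabilities, one plausible reading of the cross entropy between the second detection fitting curve and the true value:

```python
import numpy as np

def second_difference(det_prob, true_prob, eps=1e-12):
    """Generic binary cross entropy between per-point probabilities of the
    detection fitting curve and the true value, evaluated at the fitting
    points; values in det_prob are clipped away from 0 and 1 so the
    logarithms stay finite."""
    p = np.clip(np.asarray(det_prob), eps, 1.0 - eps)
    q = np.asarray(true_prob)
    return float(-np.sum(q * np.log(p) + (1.0 - q) * np.log(1.0 - p)))
```

A detection that closely matches the true value yields a cross entropy near zero, while a poor match yields a larger value.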
  • the cross entropy between the second predicted fitting curve of each lane line and the true value of each lane line may be used as the first difference.
  • the cross entropy between the second detection fitting curve of each lane line and the true value of each lane line serves as the second difference.
  • the result of the least square operation between the second predicted fitting curve of each of the lane lines and the true value of each of the lane lines may be used as the first difference, and each of the The least square operation result between the second detection fitting curve of the lane line and the true value of each lane line is used as the second difference.
  • the cross entropy between the second predicted fitting curve of each lane line and the true value of each lane line may also be used as the first difference, and the result of the least square operation between the second detection fitting curve of each lane line and the true value of each lane line may be used as the second difference.
  • the method for solving the first difference and the second difference in this embodiment may be the same or different, and this embodiment does not limit this.
  • according to the first difference and the second difference of each lane line, the detection loss of the neural network is determined.
  • the first difference and the second difference of each lane line determined above are substituted into the loss function of the neural network to determine the detection loss of the neural network.
  • this embodiment may use the weighted sum of the first difference and the second difference of each lane line as the loss function of the neural network model.
  • the weight of each first difference Δ_1j is a;
  • the weight of each second difference Δ_2j is b;
  • n is the total number of lane lines in the road map.
  • this embodiment may also use the least squares operation result of the first difference and the second difference of each lane line as the detection loss of the neural network.
  • this embodiment may also use the sum of the first difference and the second difference of each lane line as the detection loss of the neural network.
  • the sum of each second difference Δ_2j and each first difference Δ_1j is used as the detection loss of the neural network, that is, as the value of the loss function of the neural network.
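The weighted-sum option above can be sketched as loss = Σ_j (a·Δ_1j + b·Δ_2j); with a = b = 1 it reduces to the plain sum of the differences. The weights and difference values below are illustrative only:

```python
def detection_loss(first_diffs, second_diffs, a=1.0, b=1.0):
    """Weighted sum of per-lane first differences Δ_1j and second
    differences Δ_2j as the detection loss of the network:
    loss = sum_j (a * Δ_1j + b * Δ_2j)."""
    assert len(first_diffs) == len(second_diffs)
    return sum(a * d1 + b * d2 for d1, d2 in zip(first_diffs, second_diffs))
```

For two lane lines with differences (1, 3) and (2, 4) and unit weights, the loss is simply 1 + 3 + 2 + 4 = 10.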
  • adjusting the network parameters of the neural network according to the detection loss may include, for example, comparing the detection loss of the neural network with a preset loss and adjusting the network parameters of the neural network by reverse gradient propagation.
  • according to the detection loss of the neural network, determine whether the detection loss reaches the convergence condition, for example, determine whether the detection loss is less than the preset loss; if so, determine that the neural network training is completed, and use the trained neural network to predict lane lines. If the detection loss does not reach the convergence condition, continue to adjust the network parameters of the neural network, and use new road training maps to continue training the adjusted neural network until the detection loss of the neural network meets the convergence condition.
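The adjust-until-convergence loop can be illustrated on a toy linear predictor y = w·x with a squared-error loss; the model, learning rate, and preset loss here are illustrative stand-ins for the neural network, not the patent's implementation:

```python
import numpy as np

def train_step(w, x, y_true, lr=0.1):
    """One update of the parameter w by reverse gradient propagation for
    the toy predictor y = w * x with mean squared-error loss."""
    y_pred = w * x
    loss = np.mean((y_pred - y_true) ** 2)
    grad = np.mean(2.0 * (y_pred - y_true) * x)   # dLoss/dw
    return w - lr * grad, loss

def train(w, x, y_true, preset_loss=1e-6, max_steps=1000):
    """Keep adjusting the parameter until the detection loss falls below
    the preset loss (the convergence condition) or the step budget runs
    out, mirroring the training procedure described above."""
    loss = float("inf")
    for _ in range(max_steps):
        w, loss = train_step(w, x, y_true)
        if loss < preset_loss:   # convergence condition reached
            break
    return w, loss
```

Starting from w = 0 on data generated by w = 2, the loop converges to the true parameter in a handful of steps.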
  • in this way, the prediction accuracy of the neural network can be effectively improved, so that in the actual lane line detection process, the high-precision neural network can accurately predict the second predicted fitting curve of each lane line. Then, based on the vanishing point of the second predicted fitting curves, the second predicted fitting curve is corrected to generate a higher-precision detection fitting curve, which further improves the accuracy of lane line detection and provides a guarantee for the advancement of intelligent driving.
  • in the lane line detection method provided by an embodiment of the present disclosure, the road training map is input into a neural network, which outputs a second lane line probability map of the at least two lane lines; the second predicted fitting curves of the at least two lane lines are determined according to at least a portion of the pixels whose probability values in the second lane line probability map are greater than a set threshold, and the vanishing point of the at least two lane lines is determined based on these predicted fitting curves; curve fitting is then performed on at least a portion of the pixels whose probability values in the second lane line probability map are greater than the set threshold, together with the vanishing point, to determine the second detection fitting curve of each lane line; finally, the network parameters of the neural network are adjusted according to the first difference between the second predicted fitting curve and the true value of the lane line and the second difference between the second detection fitting curve and the true value of the lane line, so as to train the neural network and thereby improve its prediction accuracy.
  • An obtaining module 110 configured to obtain a road map, wherein the road map includes at least two lane lines;
  • the prediction module 120 is configured to perform lane line prediction on the road map to obtain prediction results of at least two lane lines;
  • the determining module 130 is configured to determine the vanishing point of the at least two lane lines according to the prediction results of the at least two lane lines;
  • the output module 140 is configured to output the lane line detection result of the road map according to the prediction results of the at least two lane lines and the vanishing point.
  • the lane line detection device of the embodiment of the present disclosure may be used to execute the technical solution of the above lane line detection method embodiment, and its implementation principles and technical effects are similar, and will not be repeated here.
  • the above prediction module 120 includes: a first prediction unit 121 and a first fitting unit 122;
  • the first prediction unit 121 is configured to input the road map into a neural network to output the first lane line probability map of the at least two lane lines via the neural network;
  • the first fitting unit 122 is configured to determine the first predicted fitting curve of the at least two lane lines according to at least a part of pixels in which the probability value in the first lane line probability map is greater than a set threshold.
  • the first fitting unit 122 may be used to sample each probability point whose probability value in the first lane line probability map is greater than a preset value, to determine the sampling points of the at least two lane lines in the first lane line probability map, and to perform curve fitting on the pixels corresponding to the sampling points of the at least two lane lines to determine the first predicted fitting curves of the at least two lane lines.
  • the first fitting unit 122 may be used to perform Gaussian sampling on each probability point whose probability value in the first lane line probability map is greater than a preset value, to determine the sampling points of the at least two lane lines.
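The threshold-then-Gaussian-sample-then-fit step can be sketched as follows. The synthetic probability map, the polynomial degree, and the noise scale are assumptions for illustration, not the patent's parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_lane(prob_map, threshold=0.5, sigma=1.0, deg=3):
    """Keep probability points above the threshold, jitter each kept point
    with Gaussian sampling, then fit a polynomial (here a cubic) through
    the samples; returns the coefficients of the predicted fitting curve."""
    ys, xs = np.nonzero(prob_map > threshold)        # pixels above threshold
    xs = xs + rng.normal(0.0, sigma, size=xs.shape)  # Gaussian sampling
    return np.polyfit(xs, ys, deg)

# Synthetic probability map with one diagonal lane line.
prob_map = np.zeros((20, 20))
np.fill_diagonal(prob_map, 0.9)
coeffs = fit_lane(prob_map)
```

The returned coefficients define one predicted fitting curve; one curve would be fitted per lane line in the probability map.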
  • the first fitting unit 122 may be used to determine the common intersection point of the first predicted fitting curves of the at least two lane lines as the vanishing point of the at least two lane lines.
  • the first fitting unit 122 may be used to perform curve fitting on at least a portion of the pixels whose probability value in the first lane line probability map is greater than a set threshold, together with the vanishing point, to determine and output the first detection fitting curve of each lane line.
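Finding the common intersection of the predicted fitting curves can be sketched, for the simplified case of straight lines y = m·x + c, as a joint least-squares solve; the slope/intercept representation is an assumption made for illustration:

```python
import numpy as np

def vanishing_point(lines):
    """Least-squares common intersection of lines given as (slope m,
    intercept c) pairs: each line contributes the equation m*x - y = -c,
    and the stacked system is solved for the single point (x, y) closest
    to every line, i.e. the vanishing point."""
    m = np.array([ln[0] for ln in lines])
    c = np.array([ln[1] for ln in lines])
    A = np.stack([m, -np.ones_like(m)], axis=1)  # rows [m, -1]
    (x, y), *_ = np.linalg.lstsq(A, -c, rcond=None)
    return float(x), float(y)
```

For the two lines y = x and y = -x + 4, the computed vanishing point is (2, 2), their exact intersection; with more than two nearly-concurrent lane lines, the solver returns the best-fit common point.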
  • the acquisition module 110 may be used to collect a road map in a scene where the vehicle is located via an on-board camera.
  • FIG. 10 is a schematic structural diagram of a lane line detection device according to another embodiment of the present disclosure. As shown in FIG. 10, the device further includes: a training module 150;
  • the training module 150 is configured to train the neural network based on the road training map and vanishing points of lane lines in the road training map.
  • FIG. 11 is a schematic structural diagram of a lane line detection device according to yet another embodiment of the present disclosure.
  • the training module 150 includes: a second prediction unit 151, a second fitting unit 152, and a vanishing point determination Unit 153 and adjustment unit 154;
  • the second prediction unit 151 is configured to input the road training map into a neural network to output a second lane line probability map of the at least two lane lines via the neural network;
  • the second fitting unit 152 is configured to determine a second predicted fitting curve of the at least two lane lines according to at least a part of pixels with probability values greater than a set threshold in the second lane line probability map;
  • the vanishing point determining unit 153 is configured to determine the vanishing point of the at least two lane lines based on the second predicted fitting curves of the at least two lane lines;
  • the second fitting unit 152 is further configured to perform curve fitting based on at least a portion of the pixels whose probability value in the second lane line probability map is greater than a set threshold, together with the vanishing point, to determine the second detection fitting curves of the at least two lane lines;
  • the adjusting unit 154 is configured to adjust the network parameters of the neural network according to the first difference between the second predicted fitting curve of each lane line and the true value of each lane line, and the second difference between the second detection fitting curve of each lane line and the true value of each lane line.
  • FIG. 12 is a schematic structural diagram of a lane line detection device according to still another embodiment of the present disclosure.
  • the adjustment unit 154 includes a difference subunit 1541, a loss determination subunit 1542, and an adjustment subunit 1543;
  • the difference subunit 1541 is configured to determine, for each of the at least two lane lines, a first difference between the second predicted fitting curve of the lane line and the true value of the lane line And the second difference between the second detection fitting curve of the lane line and the true value of the lane line;
  • the loss determination subunit 1542 is used to determine the detection loss of the neural network according to the first difference and the second difference of each lane line;
  • the adjusting subunit 1543 is used to adjust the network parameters of the neural network according to the detected loss.
  • the above-mentioned difference subunit 1541 may be used to take the result of the least square operation between the second predicted fitting curve of the lane line and the true value of the lane line as the first difference.
  • the above-mentioned difference subunit 1541 may be used to take the cross entropy between the second detection fitting curve of the lane line and the true value of the lane line as the second difference.
  • the above-mentioned loss determination subunit 1542 may be used to take the sum of the first difference and the second difference of each lane line as the detection loss of the neural network.
  • FIG. 13 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. As shown in FIG. 13, the electronic device 30 of this embodiment includes:
  • the memory 31 is used to store a computer program
  • the processor 32 is configured to execute the computer program to implement the lane line detection method of the above embodiment of the present disclosure, and its implementation principles and technical effects are similar and will not be repeated here.
  • FIG. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. As shown in FIG. 14, the electronic device 40 of this embodiment includes:
  • the camera 41 is used to obtain a road map, wherein the road map includes at least two lane lines;
  • the memory 42 is used to store a computer program;
  • the processor 43 is configured to execute the computer program to implement the above-mentioned lane line detection method embodiment, and its implementation principle and technical effect are similar, and will not be repeated here.
  • the electronic device includes one or more processors, a communication section, etc.
  • the processors include, for example, one or more central processing units (CPUs) and/or one or more graphics processing units (GPUs), etc.
  • the processor can perform various appropriate actions and processes according to the executable instructions stored in the read-only memory (ROM) or the executable instructions loaded from the storage section into the random access memory (RAM).
  • the communication unit may include, but is not limited to, a network card.
  • the network card may include, but is not limited to, an IB (Infiniband) network card.
  • the processor may communicate with the read-only memory and/or the random access memory to execute executable instructions, and may communicate with other target devices via the communication unit to complete operations corresponding to any of the methods provided in the embodiments of the present disclosure, for example: obtain a road map; perform lane line prediction on the road map to obtain prediction results of at least two lane lines; determine the vanishing point of the at least two lane lines according to the prediction results of the at least two lane lines; and output the lane line detection result of the road map based on the prediction results of the at least two lane lines and the vanishing point.
  • the driving control device obtains the lane line detection result of the road map, where the lane line detection result is obtained by using the lane line detection method described in the above embodiments; according to the lane line detection result, the driving control device outputs prompt information and/or performs intelligent driving control of the vehicle.
  • various programs and data necessary for the operation of the device can be stored in the RAM.
  • the CPU, ROM, and RAM are connected to each other through a bus.
  • ROM is an optional module.
  • the RAM stores executable instructions, or writes executable instructions to the ROM at runtime, and the executable instructions cause the processor to perform operations corresponding to any of the above methods of the present disclosure.
  • the input / output (I / O) interface is also connected to the bus.
  • the communication part may be integrated, or may be provided with multiple sub-modules (for example, multiple IB network cards) linked on the bus.
  • the following components are connected to the I/O interface: an input part including a keyboard, a mouse, etc.; an output part including a cathode ray tube (CRT), a liquid crystal display (LCD), etc., and a speaker; a storage part including a hard disk, etc.; and a communication part including network interface cards such as LAN cards and modems.
  • the communication section performs communication processing via a network such as the Internet.
  • the drive is also connected to the I / O interface as needed.
  • Removable media such as magnetic disks, optical disks, magneto-optical disks, semiconductor memories, etc., are installed on the drive as needed, so that the computer program read out therefrom is installed into the storage portion as needed.
  • the architecture shown in FIG. 15 is only an optional implementation.
  • the number and types of the components in FIG. 15 may be selected, deleted, added, or replaced according to actual needs; separate or integrated settings may also be adopted for different functional components.
  • GPU and CPU can be set separately or the GPU can be integrated on the CPU.
  • the communication part can be provided separately, or can be integrated on the CPU or GPU, and so on.
  • the process described above with reference to the flowchart may be implemented as a computer software program.
  • the embodiments of the present disclosure include a computer program product, which includes a computer program tangibly contained on a machine-readable medium; the computer program contains program code for performing the method shown in the flowchart, and the program code may include instructions corresponding to the method steps provided by any embodiment of the present disclosure.
  • the computer program may be downloaded and installed from the network through the communication section, and / or installed from a removable medium.
  • FIG. 16 is a schematic flowchart of a driving control method provided by an embodiment of the present disclosure. As shown in FIG. 16, the driving control method of this embodiment includes:
  • the driving control device obtains the lane line detection result of the road map.
  • this step S401 may be executed by the processor invoking the corresponding instruction stored in the memory, or may be executed by the acquisition module executed by the processor.
  • the driving control device outputs prompt information according to the detection result of the lane line and / or performs intelligent driving control on the vehicle.
  • this step S402 may be executed by the processor invoking the corresponding instruction stored in the memory, or by the driving control module executed by the processor.
  • the execution subject of this embodiment is a driving control device.
  • the driving control device of this embodiment and the electronic device described in the foregoing embodiments may be located in the same device, or may be separately located in different devices, where the driving control device of this embodiment is communicatively connected with the above-mentioned electronic device.
  • the detection result of the lane line of the road map is obtained by the detection method of the lane line of the above embodiment, and the specific process refers to the description of the above embodiment, which will not be repeated here.
  • the electronic device executes the above lane line detection method, obtains the lane line detection result of the road map, and outputs the lane line detection result of the road map.
  • the driving control device obtains the lane line detection result of the road map, and outputs prompt information according to the lane line detection result of the road map and / or performs intelligent driving control on the vehicle.
  • the prompt information may include a lane line departure warning, or a lane line keeping reminder.
  • the intelligent driving in this embodiment includes assisted driving and / or automatic driving.
  • the above-mentioned intelligent driving control may include: braking, changing the driving speed, changing the driving direction, keeping to the lane lines, changing the state of the lights, driving mode switching, etc., wherein the driving mode switching may be switching between assisted driving and automatic driving, for example, switching from assisted driving to automatic driving.
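The mapping from a detection result to a prompt and/or a control action can be sketched as follows; the `offset` and `lane_width` fields and the threshold are hypothetical, introduced only for illustration:

```python
def control_action(detection):
    """Toy dispatch from a lane line detection result to prompt
    information and/or an intelligent driving control action."""
    actions = []
    if abs(detection["offset"]) > 0.5 * detection["lane_width"]:
        actions.append("lane departure warning")    # prompt information
        actions.append("change driving direction")  # intelligent control
    else:
        actions.append("keep lane")
    return actions
```

A vehicle offset beyond half the lane width triggers both a warning and a corrective action; otherwise the lane is simply kept.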
  • the driving control device obtains the lane line detection result of the road map, and outputs prompt information and / or performs intelligent driving control on the vehicle according to the lane line detection result of the road map, thereby improving the intelligent driving Safety and reliability.
  • Any of the lane line detection methods or driving control methods provided by the embodiments of the present disclosure may be executed by any appropriate device with data processing capabilities, including but not limited to: terminal devices and servers.
  • any of the lane line detection methods or driving control methods provided by the embodiments of the present disclosure may be executed by a processor, for example, the processor executes any of the lane lines mentioned in the embodiments of the present disclosure by calling corresponding instructions stored in the memory Detection method or driving control method. I will not repeat them below.
  • FIG. 17 is a schematic structural diagram of a driving control device provided by an embodiment of the present disclosure. Based on the foregoing embodiment, the driving control device 200 of the embodiment of the present disclosure includes:
  • the obtaining module 210 is used to obtain the detection result of the lane line of the road map, and the detection result of the lane line of the road map is obtained by the detection method of the lane line as described above;
  • the driving control module 220 is configured to output prompt information according to the lane line detection result and / or perform intelligent driving control on the vehicle.
  • the driving control device of the embodiment of the present disclosure may be used to execute the technical solutions of the above driving control method embodiments, and its implementation principles and technical effects are similar and will not be repeated here.
  • FIG. 18 is a schematic diagram of an intelligent driving system provided by an embodiment of the present disclosure.
  • the intelligent driving system 50 of this embodiment includes: a communication-connected camera 51, an electronic device 30, and a driving control device 200, wherein the electronic device 30 is shown in FIG. 13 or 14, the driving control device 200 is shown in FIG. 17, and the camera 51 is used to take a road map.
  • the camera 51 photographs the road map and sends the road map to the electronic device 30.
  • the electronic device 30 processes the road map according to the above lane line detection method to obtain the lane line detection result of the road map.
  • the electronic device 30 transmits the obtained lane line detection result of the road map to the driving control device 200, and the driving control device 200 outputs prompt information and / or performs intelligent driving control on the vehicle according to the lane line detection result of the road map.
  • an embodiment of the present disclosure also provides a computer storage medium for storing the computer software instructions for the above lane line detection; when run on a computer, these instructions enable the computer to execute the various possible lane line detection methods and/or driving control methods in the above method embodiments.
  • when the computer instructions are loaded and executed, the processes or functions according to the embodiments of the present disclosure may be produced in whole or in part.
  • the computer instructions may be stored in a computer storage medium, or transmitted from one computer storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another via wireless means (such as cellular communication, infrared, short-range wireless, microwave, etc.).
  • the computer storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that integrates one or more available media.
  • the usable medium may be a magnetic medium (eg, floppy disk, hard disk, magnetic tape), optical medium (eg, DVD), or semiconductor medium (eg, SSD), or the like.
  • the method and apparatus of the present disclosure may be implemented in many ways.
  • the method and apparatus of the present disclosure may be implemented by software, hardware, firmware, or any combination of software, hardware, and firmware.
  • the above-mentioned sequence of steps for the method is for illustration only, and the steps of the method of the present disclosure are not limited to the sequence described above unless specifically stated otherwise.
  • the present disclosure may also be implemented as programs recorded in a recording medium, the programs including machine-readable instructions for implementing the method according to the present disclosure.
  • the present disclosure also covers the recording medium storing the program for executing the method according to the present disclosure.
  • the description of the present disclosure is given for the sake of example and description; it is not exhaustive, nor does it limit the present disclosure to the disclosed forms.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a lane line detection method and apparatus, a driving control method and apparatus, and a device. The lane line detection method comprises: obtaining a road map; performing lane line prediction on the road map to obtain a prediction result for at least two lane lines; determining a vanishing point of the at least two lane lines according to the prediction result for the at least two lane lines; and outputting a lane line detection result of the road map according to the prediction result for the at least two lane lines and the vanishing point.
PCT/CN2019/118097 2018-11-14 2019-11-13 Procédé et appareil de détection de ligne de voie, procédé et appareil de commande de conduite, et dispositif WO2020098708A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020217015078A KR20210079339A (ko) 2018-11-14 2019-11-13 차선의 검출 및 주행 제어 방법, 장치 및 전자 디바이스
JP2021525695A JP2022507226A (ja) 2018-11-14 2019-11-13 区画線検出方法、装置、及び運転制御方法、装置並びに電子機器

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811355223.4 2018-11-14
CN201811355223.4A CN111191487A (zh) 2018-11-14 2018-11-14 车道线的检测及驾驶控制方法、装置和电子设备

Publications (1)

Publication Number Publication Date
WO2020098708A1 true WO2020098708A1 (fr) 2020-05-22

Family

ID=70709100

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/118097 WO2020098708A1 (fr) 2018-11-14 2019-11-13 Procédé et appareil de détection de ligne de voie, procédé et appareil de commande de conduite, et dispositif

Country Status (4)

Country Link
JP (1) JP2022507226A (fr)
KR (1) KR20210079339A (fr)
CN (1) CN111191487A (fr)
WO (1) WO2020098708A1 (fr)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112101321A (zh) * 2020-11-18 2020-12-18 蘑菇车联信息科技有限公司 灭点提取方法、装置、电子设备及存储介质
CN112132109A (zh) * 2020-10-10 2020-12-25 北京百度网讯科技有限公司 车道线处理和车道定位方法、装置、设备及存储介质
CN112199999A (zh) * 2020-09-09 2021-01-08 浙江大华技术股份有限公司 道路检测方法、装置、存储介质和电子设备
CN115440048A (zh) * 2022-09-20 2022-12-06 澳克诺(上海)汽车科技有限公司 预测车辆行驶轨迹的方法、装置及介质
US20230021027A1 (en) * 2021-12-29 2023-01-19 Beijing Baidu Netcom Science Technology Co., Ltd. Method and apparatus for generating a road edge line
CN116091648A (zh) * 2023-02-09 2023-05-09 禾多科技(北京)有限公司 车道线的生成方法及装置、存储介质及电子装置
CN117649635A (zh) * 2024-01-30 2024-03-05 湖北经济学院 狭窄水道场景影消点检测方法、系统及存储介质

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111652952B (zh) * 2020-06-05 2022-03-18 腾讯科技(深圳)有限公司 车道线生成方法、装置、计算机设备和存储介质
CN111814651B (zh) * 2020-07-02 2024-01-12 阿波罗智能技术(北京)有限公司 车道线的生成方法、装置和设备
CN111539401B (zh) * 2020-07-13 2020-10-23 平安国际智慧城市科技股份有限公司 基于人工智能的车道线检测方法、装置、终端及存储介质
CN112465925A (zh) * 2020-11-20 2021-03-09 北京赛目科技有限公司 一种用于仿真测试的车道线的处理方法及装置
CN112146620B (zh) * 2020-11-25 2021-03-16 腾讯科技(深圳)有限公司 目标物体的测距方法及装置
CN112215214A (zh) * 2020-12-11 2021-01-12 智道网联科技(北京)有限公司 调整智能车载终端的摄像头偏移的方法及系统
CN112734139B (zh) * 2021-01-28 2023-09-29 腾讯科技(深圳)有限公司 通行时长预测方法和装置、存储介质及电子设备
CN113011285B (zh) * 2021-03-02 2023-04-07 北京三快在线科技有限公司 车道线检测方法、装置、自动驾驶车辆及可读存储介质
CN113869293B (zh) * 2021-12-03 2022-03-11 禾多科技(北京)有限公司 车道线识别方法、装置、电子设备和计算机可读介质
CN114494158A (zh) * 2022-01-07 2022-05-13 华为技术有限公司 一种图像处理方法、一种车道线检测方法及相关设备
CN115470830B (zh) * 2022-10-28 2023-04-07 电子科技大学 一种基于多源域适应的脑电信号跨用户警觉性监测方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105138955A (zh) * 2015-07-10 2015-12-09 深圳市中天安驰有限责任公司 Road vanishing point detection method
CN105893949A (zh) * 2016-03-29 2016-08-24 西南交通大学 Lane line detection method for complex road-condition scenes
CN108229354A (zh) * 2017-12-22 2018-06-29 温州大学激光与光电智能制造研究院 Lane line detection method
CN108216229A (zh) * 2017-09-08 2018-06-29 北京市商汤科技开发有限公司 Vehicle, road line detection and driving control method and apparatus

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3729141B2 (ja) * 2002-02-27 2005-12-21 日産自動車株式会社 Road white line recognition device
US8311283B2 (en) * 2008-07-06 2012-11-13 Automotive Research & Testing Center Method for detecting lane departure and apparatus thereof
JP2012048289A (ja) * 2010-08-24 2012-03-08 Isuzu Motors Ltd Straight line detection device
JP6160252B2 (ja) * 2013-05-29 2017-07-12 日産自動車株式会社 Image processing device and image processing method
JP2018164199A (ja) * 2017-03-27 2018-10-18 ソニーセミコンダクタソリューションズ株式会社 Image processing device and image processing method
CN108629292B (zh) * 2018-04-16 2022-02-18 海信集团有限公司 Curved lane line detection method, apparatus, and terminal

Also Published As

Publication number Publication date
CN111191487A (zh) 2020-05-22
KR20210079339A (ko) 2021-06-29
JP2022507226A (ja) 2022-01-18

Similar Documents

Publication Publication Date Title
WO2020098708A1 (fr) Lane line detection method and apparatus, driving control method and apparatus, and device
CN108304775B (zh) Remote sensing image recognition method and apparatus, storage medium, and electronic device
US11321593B2 (en) Method and apparatus for detecting object, method and apparatus for training neural network, and electronic device
CN108229479B (zh) Training method and apparatus for a semantic segmentation model, electronic device, and storage medium
CN108122234B (zh) Convolutional neural network training and video processing method, apparatus, and electronic device
US11443445B2 (en) Method and apparatus for depth estimation of monocular image, and storage medium
CN109087510B (zh) Traffic monitoring method and apparatus
CN112966587B (zh) Training method for an object detection model, object detection method, and related devices
TWI721510B (zh) Depth estimation method for binocular images, device, and storage medium
CN112733820B (zh) Obstacle information generation method and apparatus, electronic device, and computer-readable medium
WO2023193401A1 (fr) Point cloud detection model training method and apparatus, electronic device, and storage medium
CN116848555A (zh) Rendering novel images of a scene using a geometry-aware neural network conditioned on latent variables
CN113112542A (zh) Visual positioning method and apparatus, electronic device, and storage medium
CN112258565B (zh) Image processing method and apparatus
CN109242882B (zh) Visual tracking method, apparatus, medium, and device
CN106778822B (zh) Image straight-line detection method based on a funnel transform
CN117115900A (zh) Image segmentation method, apparatus, device, and storage medium
CN112651351B (zh) Data processing method and apparatus
CN111765892B (zh) Positioning method and apparatus, electronic device, and computer-readable storage medium
CN114913500A (zh) Pose determination method and apparatus, computer device, and storage medium
CN116295466A (зh) Map generation method and apparatus, electronic device, storage medium, and vehicle
CN112489450A (зh) Vehicle flow control method at a traffic intersection, roadside device, and cloud control platform
CN112861940A (зh) Binocular disparity estimation method, model training method, and related devices
CN113469025B (зh) Object detection method and apparatus for vehicle-road cooperation, roadside device, and vehicle
CN113379591B (зh) Speed determination method, speed determination apparatus, electronic device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19885778

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021525695

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 20217015078

Country of ref document: KR

Kind code of ref document: A

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 18/08/2021)

122 Ep: pct application non-entry in european phase

Ref document number: 19885778

Country of ref document: EP

Kind code of ref document: A1