CN112132109A - Lane line processing and lane positioning method, device, equipment and storage medium - Google Patents


Info

Publication number
CN112132109A
CN112132109A (application CN202011080260.6A)
Authority
CN
China
Prior art keywords
lane
lane line
line
lines
road
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011080260.6A
Other languages
Chinese (zh)
Inventor
黄生辉
李映辉
何刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apollo Zhilian Beijing Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority claimed from application CN202011080260.6A
Publication of CN112132109A
Current legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W 40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W 40/06 Road conditions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses a lane line processing and lane positioning method, device, equipment and storage medium, relating to artificial intelligence technology, and in particular to the technical fields of computer vision, deep learning and intelligent transportation. The specific implementation scheme is as follows: identifying a road image acquired by a vehicle to obtain lane lines and road edge lines; determining the misrecognized lane lines and the missing lane lines among the lane lines according to the road edge lines; and correcting the identified lane lines according to the misrecognized lane lines and the missing lane lines. This improves the accuracy of lane line identification processing and provides a new approach to lane line processing.

Description

Lane line processing and lane positioning method, device, equipment and storage medium
Technical Field
The application relates to the technical field of image processing, in particular to an artificial intelligence technology, and further relates to the technical field of computer vision, deep learning and intelligent transportation. In particular to a method, a device, equipment and a storage medium for lane line processing and lane positioning.
Background
With the development of intelligent transportation technology, lane line identification is used more and more widely, for example, to determine the lane in which a vehicle is located. At present, lane line recognition collects a road image through a camera mounted on the vehicle and recognizes the lane lines in the road image with a straight-line detection algorithm. However, because the road environment is complex, lane lines are often missed or misrecognized, which seriously affects the accuracy of applications that depend on lane lines (such as lane positioning), and improvement is needed.
Disclosure of Invention
The disclosure provides a lane line processing and lane positioning method, device, equipment and storage medium.
According to a first aspect of the present disclosure, there is provided a lane line processing method including:
identifying a road image acquired by a vehicle to obtain lane lines and road edge lines;
determining the misrecognized lane lines and the missing lane lines among the lane lines according to the road edge lines;
and correcting the identified lane lines according to the misrecognized lane lines and the missing lane lines.
According to a second aspect of the present disclosure, there is provided a lane positioning method including:
acquiring a lane positioning request;
responding to the lane positioning request, and determining lane positioning information of the vehicle in real time according to the corrected lane lines; wherein the corrected lane lines are determined by: identifying a road image acquired by a vehicle to obtain lane lines and road edge lines; determining the misrecognized lane lines and the missing lane lines among the lane lines according to the road edge lines; and correcting the identified lane lines according to the misrecognized and missing lane lines to obtain corrected lane lines.
According to a third aspect of the present disclosure, there is provided a lane line processing apparatus including:
the image identification module is used for identifying road images acquired by vehicles to obtain lane lines and road edge lines;
the lane line analysis module is used for determining the misrecognized lane lines and the missing lane lines among the lane lines according to the road edge lines;
and the lane line correction module is used for correcting the identified lane lines according to the misrecognized lane lines and the missing lane lines.
According to a fourth aspect of the present disclosure, there is provided a lane positioning device comprising:
the request acquisition module is used for acquiring a lane positioning request;
the lane positioning module is used for responding to the lane positioning request and determining lane positioning information of the vehicle in real time according to the corrected lane lines; wherein the corrected lane lines are determined by: identifying a road image acquired by a vehicle to obtain lane lines and road edge lines; determining the misrecognized lane lines and the missing lane lines among the lane lines according to the road edge lines; and correcting the identified lane lines according to the misrecognized and missing lane lines to obtain corrected lane lines.
According to a fifth aspect of the present disclosure, there is provided an electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein:
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a lane line processing method or a lane locating method of any of the embodiments of the present application.
According to a sixth aspect of the present disclosure, a non-transitory computer readable storage medium having computer instructions stored thereon is provided. The computer instructions are for causing a computer to perform a lane line processing method or a lane positioning method of any embodiment of the present application.
According to the technology of the application, the accuracy of lane line identification processing is improved, and a new approach is provided for lane line processing.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
fig. 1A is a flowchart of a lane line processing method according to an embodiment of the present application;
FIG. 1B is a schematic diagram of a road image provided according to an embodiment of the application;
fig. 2 is a flowchart of another lane line processing method provided in an embodiment of the present application;
fig. 3 is a flowchart of another lane line processing method provided in an embodiment of the present application;
fig. 4 is a flowchart of another lane line processing method provided in an embodiment of the present application;
FIG. 5 is a flow chart of a lane locating method according to an embodiment of the present application;
FIG. 6 is a flow chart of another lane locating method provided in accordance with an embodiment of the present application;
FIG. 7 is a flow chart of another lane locating method provided in accordance with an embodiment of the present application;
FIG. 8 is a flow chart of another lane locating method provided in accordance with an embodiment of the present application;
fig. 9 is a schematic structural diagram of a lane line processing apparatus according to an embodiment of the present application;
FIG. 10 is a schematic structural diagram of a lane positioning device provided in an embodiment of the present application;
fig. 11 is a block diagram of an electronic device for implementing a lane line processing method or a lane positioning method according to an embodiment of the present application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of the embodiments to aid understanding, and these are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present application. Also, descriptions of well-known functions and constructions are omitted below for clarity and conciseness.
Fig. 1A is a flowchart of a lane line processing method according to an embodiment of the present application, and fig. 1B is a schematic diagram of a road image according to an embodiment of the present application. This embodiment is applicable to performing lane line identification processing on collected road images. It may be performed by a lane line processing apparatus configured in an electronic device and implemented in software and/or hardware. The electronic device may be a vehicle-mounted terminal device or a back-end server device. As shown in fig. 1A-1B, the method includes:
and S101, identifying the road image acquired by the vehicle to obtain a lane line and a road edge line.
The road image may be an image of the road environment in front of the vehicle, acquired in real time by an image acquisition device (such as a camera) mounted on the vehicle. For example, fig. 1B is a schematic diagram of a road image, which includes a road surface marked with lane lines, vehicles on the road surface, road edges on both sides of the road, railings on the road edges, buildings on both sides of the road, and the like. Optionally, the road image of the embodiment of the present application may be a forward road image acquired while the vehicle is driving. The lane lines are the lines marked on the road surface to delimit lanes for motor-vehicle travel, as shown at S1-S4 in fig. 1B. The road edge lines are the intersections between the curbs on both sides of the road and the road surface, as shown at S5 and S6 in fig. 1B.
Optionally, if the electronic device implementing the embodiment of the application is a vehicle-mounted terminal device, the image acquisition device on the vehicle may transmit the acquired road image to the vehicle-mounted terminal device, which identifies the lane lines and road edge lines in the received road image; if the electronic device is a back-end server device, the image acquisition device on the vehicle may transmit the acquired image to the back-end server device, which identifies the lane lines and road edge lines in the received road image. Optionally, when the road image is transmitted to the back-end server, it may be transmitted directly by the image acquisition device or relayed through the vehicle-mounted terminal, which is not limited in this embodiment.
Optionally, in the embodiment of the present application, the identification of the road image is divided into two stages: one stage performs lane line identification on the road image, and the other performs road edge line identification. Optionally, the lane lines and road edge lines may be identified with separate identification algorithms. Taking lane line identification as an example, straight lines in the road image can be detected with the Hough line transform algorithm, and the detected results taken as preliminarily identified lane lines. The lane lines and road edge lines can also be identified with a deep neural network. For example, a deep learning model for lane line recognition and a deep learning model for road edge line recognition can be trained separately in advance; alternatively, a single deep learning model that identifies both lane lines and road edge lines can be trained in advance. The collected road image is input into the trained deep learning model, and running the model yields the identified lane lines and road edge lines.
Optionally, in this embodiment of the application, the lane line recognition and road edge line recognition operations on the road image may be executed in parallel to improve recognition efficiency. Alternatively, lane line recognition may be performed on the road image first; if lane lines are recognized, road edge line recognition is then performed on the image, and if no lane lines are recognized, recognition of that image stops and the next frame of road image is awaited. This avoids road edge line recognition on roads without lane lines and reduces the power consumption of the electronic equipment.
It should be noted that, because the number of lane lines on different roads varies and the features of lane lines are relatively simple, the lane lines obtained by identifying the road image are not necessarily accurate; there may be misrecognition or missed recognition. For example, when lane line identification is performed on the road image shown in fig. 1B, the railings S7 and S9 on the two sides of the road and the edge line S8 of a building may be mistakenly identified as lane lines; and the road-edge lane lines S1 and S4, which lie near the road edges and have poor definition, may fail to be recognized, so the identified lane lines may also be missing some lines.
And S102, determining the misrecognized lane lines and the missing lane lines among the lane lines according to the road edge lines.
Optionally, in this embodiment of the application, the lane lines identified in S101 may be analyzed with the road edge lines as a reference to find the misrecognized lane lines among them and to determine whether any lane lines are missing. There are many methods for determining the misrecognized and missing lane lines, and this embodiment is not limited to a particular one.
In a first implementation, whether there are misrecognized and missing lane lines among the lane lines recognized in S101 may be determined from the positional relationship between each recognized lane line and the recognized road edge lines, combined with a preset determination rule. For example, misrecognized lane lines mainly come from misrecognizing obstacles whose edges on the two sides of the road are straight lines, so a lane line to the left of the left road edge line or to the right of the right road edge line can be regarded as misrecognized. Missing lane lines may be determined by analyzing the intervals between the lane lines that remain after the misrecognized ones are removed.
In a second implementation, the misrecognized and missing lane lines are determined with a deep neural network. For example, a deep learning model that determines misrecognized and missing lane lines based on the road edge lines may be trained in advance on a large number of training samples. The data describing the lane lines and road edge lines identified in S101 (such as their pixel coordinates in the road image) is input into the trained model, and running the model yields the data describing the misrecognized and missing lane lines.
And S103, correcting the identified lane lines according to the misrecognized lane lines and the missing lane lines.
Optionally, after the preliminarily recognized lane lines are analyzed in S102 to determine the misrecognized lane lines among them and the missing lane lines, the lane lines preliminarily recognized in S101 may be corrected based on the analysis result of S102. Specifically, the misrecognized lane lines may be removed from the preliminarily identified lane lines, and the missing lane lines then supplemented, completing the correction and yielding the accurate lane lines contained in the road image.
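The removal-then-supplement correction described above can be sketched as follows. This is an illustration only: the lane-line identifiers and the flat-list representation are hypothetical stand-ins, not the application's data structures.

```python
def correct_lane_lines(recognized, misrecognized, missing):
    """Correct a preliminarily recognized lane line set: drop the
    misrecognized lines, then append the supplemented missing ones."""
    drop = set(misrecognized)
    kept = [line for line in recognized if line not in drop]
    return kept + [line for line in missing if line not in kept]

# Using the S1-S9 labels of fig. 1B as stand-ins for detected lines:
corrected = correct_lane_lines(
    recognized=["S2", "S3", "S4", "S7", "S8", "S9"],
    misrecognized=["S7", "S8", "S9"],   # railings and building edge
    missing=["S1"],                     # faint road-edge lane line
)
```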
According to the technical scheme of the embodiment of the application, lane lines and road edge lines are identified in the road image collected by the vehicle, the misrecognized and missing lane lines among the identified lane lines are determined based on the identified road edge lines, and the identified lane lines are then corrected based on that determination. The scheme corrects both misrecognized and missing lane lines based on the identified road edge lines, greatly improving the accuracy of the final lane lines and, in turn, the accuracy of applications that depend on them, such as lane positioning. It provides a new approach to lane line processing.
Optionally, in applications involving lane lines, it is sometimes only necessary to ensure that the lane lines beside one road edge are accurate. For example, in a lane positioning scenario, only the lane lines beside one road edge need to be accurately determined, and the lane in which the vehicle is located can be determined relative to that road edge. To improve lane line processing efficiency in such scenarios, one road edge line may be selected, and the misrecognized and missing lane lines among the preliminarily recognized lane lines determined based on that line alone. Optionally, considering that complex parts of the road environment, such as merge intersections, bus stops and temporary parking spaces, are usually on the right side of the road, so that the right-side recognition result is relatively less accurate than the left, when at least two road edge lines are identified, this embodiment of the present application preferably uses the left road edge line as the reference for determining the misrecognized and missing lane lines. Optionally, when at least two road edge lines are identified, the target road edge line may be chosen from them according to their confidence degrees, and the misrecognized and missing lane lines determined according to the target road edge line. Specifically, when the road edge lines are identified in the road image, the confidence degree of each identified road edge line may also be determined.
For example, if a deep learning model is used to identify the road edge lines, the model outputs both the identified lines and the probability (i.e., confidence) that each is a true road edge line. The confidence degrees of the identified road edge lines are then compared, the line with the highest confidence is taken as the target road edge line, and the misrecognized and missing lane lines among the identified lane lines are determined with that line as the reference, in the manner of S102. The benefit of this arrangement is that determining the misrecognized and missing lane lines against a single road edge line greatly improves processing efficiency without affecting the accuracy of the lane line application scenario. In addition, selecting the road edge line according to confidence improves the accuracy of the selection and safeguards the subsequent lane line processing.
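Choosing the target road edge line by confidence, as described above, reduces to a maximum over the identified edge lines. A minimal sketch; the `(line_id, confidence)` tuple representation is an assumption, not the model's actual output format.

```python
def select_target_edge_line(edge_lines):
    """edge_lines: iterable of (line_id, confidence) pairs, one per road edge
    line reported by the recognition model; returns the id of the
    highest-confidence line, to be used as the reference in S102."""
    return max(edge_lines, key=lambda line: line[1])[0]

# e.g. the model reports S5 (left edge) with higher confidence than S6:
target = select_target_edge_line([("S5", 0.92), ("S6", 0.81)])
```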
Fig. 2 is a flowchart of another lane line processing method provided in an embodiment of the present application. On the basis of the above embodiment, this embodiment specifically describes determining the misrecognized and missing lane lines among the lane lines according to the road edge line. As shown in fig. 2, the method includes:
And S201, identifying the road image collected by the vehicle to obtain lane lines and road edge lines.
And S202, taking each lane line whose separation distance from the preset point on the road edge line satisfies the misrecognition threshold as a misrecognized lane line.
The preset point on the road edge line is a point selected in advance on that line; it may be, for example, the start point, middle point or end point of the road edge line. Considering that the start and end points of a road edge line in the road image are prone to feature loss, the embodiment of the present application may select the middle point of the road edge line as the preset point.
The misrecognition threshold is the criterion for judging, from the distance between a lane line and the preset point on the road edge line, whether that lane line is misrecognized. Optionally, in this embodiment of the application, the road edge lines on the left and right sides of the road correspond to different misrecognition thresholds. Specifically, the misrecognition threshold for the left road edge line is: the separation distance between a lane line and the preset point on the left road edge line is smaller than a first value (such as 300 mm). The misrecognition threshold for the right road edge line is: the separation distance between a lane line and the preset point on the right road edge line is greater than a second value (such as -300 mm). The first value is greater than or equal to 0, the second value is less than or equal to 0, and the two may be a pair of opposite numbers; the specific values can be set according to the distance between each road edge line and the nearest road-edge lane line.
Optionally, in this embodiment of the application, for each identified lane line, the separation distance between the lane line and the preset point on the road edge line is calculated. For the left road edge line, whether each such distance satisfies the corresponding misrecognition threshold is then judged, and the lane lines that satisfy it are taken as misrecognized. The process of determining misrecognized lane lines from the right road edge line is similar and is not repeated here.
For example, taking the road image shown in fig. 1B, suppose the lane lines identified in S201 are S2-S4 and S7-S9, and the identified road edge lines are S5 and S6. The misrecognition threshold for the left road edge line S5 is: the separation distance between a lane line and the midpoint of S5 is less than 300 mm; the threshold for the right road edge line S6 is: the separation distance between a lane line and the midpoint of S6 is greater than -300 mm. With the left road edge line S5 as the reference, the distances between each of the lane lines S2-S4 and S7-S9 and the midpoint of S5 are calculated, where the distances of the lane lines S7 and S8, located to the left of S5, are negative, and the distances of S2-S4 and S9, located to its right, are positive. Since the distance between S2, the lane line closest to S5, and the preset point of S5 is greater than 300 mm, only the lane lines S7 and S8 on the left of S5 satisfy the misrecognition threshold of S5. S7 and S8 are therefore taken as misrecognized lane lines: S7 is actually a railing on the left side of the road, and S8 is actually the top edge line of a building on the left side of the road. In a similar manner, the right road edge line S6 identifies the misrecognized lane line S9.
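The threshold test can be sketched in a few lines. The signed distances below (in mm, negative to the left of the edge line's midpoint) are invented for illustration; only the 300 mm / -300 mm thresholds come from the text.

```python
def misrecognized_lane_lines(dists_mm, right_edge=False,
                             left_thresh_mm=300.0, right_thresh_mm=-300.0):
    """Classify lane lines against one road edge line's misrecognition threshold.

    dists_mm: {lane_id: signed distance to the edge line's midpoint}.
    Left edge line:  misrecognized if distance < left_thresh_mm.
    Right edge line: misrecognized if distance > right_thresh_mm.
    """
    if right_edge:
        return sorted(lid for lid, d in dists_mm.items() if d > right_thresh_mm)
    return sorted(lid for lid, d in dists_mm.items() if d < left_thresh_mm)

# Hypothetical distances from the midpoint of the left road edge line S5:
dists = {"S7": -2600.0, "S8": -1200.0, "S2": 700.0, "S3": 4200.0, "S4": 7700.0}
flagged = misrecognized_lane_lines(dists)  # S7 and S8 fall below 300 mm
```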
And S203, if the separation distances of the lane lines remaining after the misrecognized lane lines are removed satisfy the missing threshold, taking the road-edge lane line as a lane line missing from the identified lane lines.
The missing threshold is the criterion for judging, from the distance between a lane line and the preset point on the road edge line, whether an identified lane line is missing. Optionally, in this embodiment of the application, the road edge lines on the left and right sides of the road correspond to different missing thresholds. Optionally, the missing threshold for the left road edge line is: the separation distance between the first lane line on the left and the preset point on the left road edge line is greater than a third value (such as 3500 mm); the missing threshold for the right road edge line is: the separation distance between the first lane line on the right and the preset point on the right road edge line is less than a fourth value (such as -3500 mm). The third value is positive and the fourth value is negative; their specific values can be determined from the spacing between adjacent lane lines on the road.
Optionally, in general, the lane lines in the middle of the road are usually distinct, while the road-edge lane lines (i.e., the first lane lines near the two road edges) are affected by roadside dust, fallen leaves, accumulated water, lighting and the like, and have poor definition in the acquired road image. As shown in fig. 1B, the definition of the left road-edge lane line S1 and the right road-edge lane line S4 is obviously lower than that of the middle lane lines S2 and S3. Therefore, when lane line recognition is performed on the road image, a missing lane line is usually a road-edge lane line beside a road edge, and the missing lane lines determined in the embodiments of the present application are mainly road-edge lane lines.
Optionally, the misrecognized lane lines determined in S202 may be removed from the lane lines identified in S201, and it may then be determined whether a lane line is missing from the remaining lane lines. Specifically, for the left road edge line, it may be determined whether the spacing distance between the first lane line on the left side among the remaining lane lines and the preset point on the left road edge line is greater than the third value; if so, the edge lane line beside the left road edge line is missing from the identified lane lines; otherwise, no lane line is missing on the left side among the lane lines identified in S201. For the right road edge line, the process of determining a missing lane line is similar and is not described herein again.
For example, taking the road image shown in fig. 1B as an example, assume that after the misrecognized lane lines are removed, the remaining lane lines are S2-S4, and the missing threshold corresponding to the left road edge line S5 is: the spacing distance between the first lane line on the left side and the preset point on the left road edge line S5 is greater than 3500 mm. Since the first lane line on the left among the lane lines S2-S4 is S2, it may be determined whether the spacing distance between the lane line S2 and the preset point on the left road edge line S5 is greater than 3500 mm; if so, the edge lane line beside the left road edge line S5, namely the lane line S1 in fig. 1B, is missing from the lane lines S2-S4.
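The missing-lane-line check in the example above can be sketched as follows. This is an illustrative simplification, not the disclosed implementation: it assumes signed lateral distances (in mm) from each remaining lane line to the preset point on the left road edge line have already been computed, and all function and constant names are hypothetical.

```python
# Example threshold values from the text: third value 3500 mm (left edge),
# fourth value -3500 mm (right edge).
LEFT_MISSING_THRESHOLD_MM = 3500
RIGHT_MISSING_THRESHOLD_MM = -3500

def left_edge_lane_missing(distances_to_left_edge):
    """True if the first lane line on the left is farther than the missing
    threshold from the preset point on the left road edge line."""
    if not distances_to_left_edge:
        return True
    # The first lane line on the left is the one closest to the left edge,
    # i.e. the one with the smallest signed distance.
    first_left = min(distances_to_left_edge)
    return first_left > LEFT_MISSING_THRESHOLD_MM

# Remaining lane lines S2-S4 at 4200, 7800, 11400 mm from S5: S1 is missing.
print(left_edge_lane_missing([4200, 7800, 11400]))  # True
print(left_edge_lane_missing([1500, 5100, 8700]))   # False
```

The right-edge check would mirror this with the negative fourth value, per the sign convention stated above.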
S204, correcting the identified lane lines according to the misrecognized lane lines and the missing lane lines.
It should be noted that, in the embodiment of the present application, whether to perform the operations of S202-S203 on the basis of one road edge line or on the basis of both road edge lines may be selected according to the application scenario that follows the lane line processing. For example, if the application scenario after lane line processing is lane positioning, the operations of S202-S203 may be performed based on a single road edge line (e.g., the left road edge line, or the road edge line with the higher confidence).
According to the technical solution of this embodiment, lane lines and road edge lines are identified in the road image collected by the vehicle, and the misrecognized lane lines are determined according to the misrecognition threshold and the spacing distance between each identified lane line and the preset point on the identified road edge line. The misrecognized lane lines are then removed, and if the spacing distance between a remaining lane line and the preset point on the road edge line satisfies the missing threshold, the edge lane line at that road edge is taken as a missing lane line. The embodiment of the present application thus provides a method for determining misrecognized and missing lane lines based on the spacing distance between the lane lines and the road edge line together with the misrecognition threshold and the missing threshold, which greatly improves the accuracy of determining misrecognized and missing lane lines and ensures the authenticity of the processed lane lines.
Fig. 3 is a flowchart of another lane line processing method according to an embodiment of the present disclosure. On the basis of the above embodiments, this embodiment specifically describes how to determine the spacing distance between a lane line and the preset point on the road edge line. As shown in fig. 3, the method includes:
S301, identifying the road image collected by the vehicle to obtain lane lines and road edge lines.
S302, performing curve fitting on each lane line to obtain a curve equation of the lane line.
Optionally, in this embodiment of the application, identifying the road image in S301 may yield data information of the lane lines and road edge lines, for example, the coordinates of the pixel points forming each lane line and each road edge line. Curve fitting can then be performed according to the data information of each identified lane line and a preset hyperbolic model to obtain the curve equation of that lane line. Specifically, the curve equation adopted in the embodiment of the application is:
u − u0 = A(v − v0) + B/(v − v0)   (1)
where (u0, v0) is the vanishing point, A is the slope, B is the curvature, and (u, v) is a point on the curve. A curve equation may be fitted to each lane line based on equation (1).
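Since equation (1) is linear in A and B once the vanishing point (u0, v0) is known, the fit can be done by ordinary linear least squares. The sketch below is a minimal illustration under that assumption; the function name and the synthetic data are not part of the disclosed embodiment.

```python
def fit_lane_curve(points, u0, v0):
    """Fit A (slope) and B (curvature) of u - u0 = A(v - v0) + B/(v - v0)
    to (u, v) pixel coordinates by solving the 2x2 normal equations."""
    s11 = s12 = s22 = t1 = t2 = 0.0
    for u, v in points:
        x1 = v - v0           # linear basis term
        x2 = 1.0 / (v - v0)   # hyperbolic basis term
        y = u - u0
        s11 += x1 * x1; s12 += x1 * x2; s22 += x2 * x2
        t1 += x1 * y;  t2 += x2 * y
    det = s11 * s22 - s12 * s12
    A = (t1 * s22 - t2 * s12) / det
    B = (s11 * t2 - s12 * t1) / det
    return A, B

# Recover known parameters from synthetic points lying on the curve.
u0, v0, A_true, B_true = 640.0, 200.0, 0.8, 5000.0
pts = [(u0 + A_true * (v - v0) + B_true / (v - v0), v) for v in range(260, 700, 20)]
A, B = fit_lane_curve(pts, u0, v0)
print(round(A, 3), round(B, 1))
```

With noise-free points the fitted A and B match the generating values; on real pixel data the same solve gives the least-squares estimate.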
Optionally, a curve equation may also be fitted to each identified road edge line based on the same curve model.
S303, determining the position coordinates of the preset point on the road edge line.
Optionally, the preset point on the road edge line may be set in advance according to actual requirements, and may be any one of the start point, midpoint, end point, and the like of the road edge line. Considering that the start point and end point of the road edge line in the road image are prone to feature loss, the midpoint of the road edge line may be selected as the preset point in the embodiment of the present application. If the preset point is the start point or end point of the road edge line, its coordinates can be obtained directly from the road image as the position coordinates of the preset point. If the preset point is the midpoint of the road edge line, the coordinates of the start point and end point are first obtained from the road image and averaged to obtain the midpoint coordinates as the position coordinates of the preset point. Alternatively, the position coordinates of all pixel points of the identified road edge line can be averaged, and the result used as the position coordinates of the preset point. This embodiment is not limited in this regard.
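The averaging variant described above (mean of all road-edge pixel coordinates) can be sketched in a few lines; the function name and the sample pixels are illustrative only.

```python
def preset_point(edge_pixels):
    """Mean of the road edge line's (u, v) pixel coordinates, used as the
    position coordinates of the preset point."""
    us = [u for u, v in edge_pixels]
    vs = [v for u, v in edge_pixels]
    return sum(us) / len(us), sum(vs) / len(vs)

print(preset_point([(10.0, 100.0), (20.0, 200.0), (30.0, 300.0)]))  # (20.0, 200.0)
```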
S304, determining the spacing distance between each lane line and the preset point on the road edge line according to the curve equation of the lane line and the position coordinates of the preset point.
Optionally, in the embodiment of the present application, given the curve equation of a lane line and the position coordinates of the preset point on the road edge line, the spacing distance between the preset point and the lane line may be calculated based on a point-to-curve distance formula.
S305, taking each lane line whose spacing distance from the preset point on the road edge line satisfies the misrecognition threshold as a misrecognized lane line.
S306, if, among the remaining lane lines other than the misrecognized lane lines, the spacing distance satisfies the missing threshold, determining that the edge lane line at the corresponding road edge is a missing lane line.
S307, correcting the identified lane lines according to the misrecognized lane lines and the missing lane lines.
According to the technical solution of this embodiment, lane lines and road edge lines are identified in the road image acquired by the vehicle, a curve equation is fitted for each identified lane line, the position coordinates of the preset point are determined for the identified road edge line, and the spacing distance between each identified lane line and the preset point on the road edge line is then determined based on the curve equation and those position coordinates. The misrecognized and missing lane lines are determined by combining the misrecognition threshold and the missing threshold, and the identified lane lines are corrected based on the determination result. The solution of this embodiment thus provides an accurate way to determine the spacing distance between the road edge line and the lane lines; an accurate spacing distance provides a guarantee for the subsequent determination of misrecognized and missing lane lines, which in turn ensures the accuracy of the correction of the identified lane lines.
Fig. 4 is a flowchart of another lane line processing method according to an embodiment of the present application. On the basis of the above embodiments, this embodiment specifically describes how the road image captured by the vehicle is identified. As shown in fig. 4, the method includes:
S401, start.
S402, carrying out lane line identification on the road image collected by the vehicle, judging whether a lane line is identified, if not, executing S403, and if so, executing S404.
Optionally, in the embodiment of the present application, lane lines may be identified in the road image acquired by the vehicle; the specific identification method is introduced in the above embodiments and is not described herein again. Since not every frame of road image necessarily contains lane lines, or clear lane lines, performing the lane line identification operation on a road image will not necessarily identify any lane line. It therefore needs to be judged whether this step identified a lane line: if so, the operation of S404 can be executed to further judge whether the road edge detection condition is met; if no lane line is identified, the operation of S403 may be executed to clear the current image identification data, end the lane line processing of this road image, and wait to process the next frame of road image collected by the vehicle.
S403, the lane line processing of the road image is completed.
S404, if a lane line is identified, judging whether the road edge detection condition is satisfied; if so, executing S405, and if not, executing S403.
Here, the road edge detection condition may refer to a judgment condition that specifies whether the road edge identification operation can be performed on the road image. There may be many road edge detection conditions in the embodiment of the present application, and the embodiment is not limited in this regard; for example, at least one of the following conditions may be included. Condition one: the current moment reaches a preset recognition period; the preset recognition period may be determined by combining the acquisition frequency of the image acquisition device, the vehicle running speed, the lane line application scenario, and other factors, and may be, for example, 2 s. Condition two: the confidence of the lane lines is greater than a confidence threshold (such as 70%). Condition three: the numbers of lane lines identified in two adjacent road images are inconsistent. The road edge detection condition in the embodiment of the application can be any one or a combination of the above, selected according to the actual situation, which provides strong flexibility; a combination of the above conditions can more accurately locate the frames in which the lane lines change, so that the lane lines are processed accurately and in real time.
Optionally, in order to reduce the power consumption of lane line processing, in the embodiment of the present application it is not necessary to perform road edge line identification on every frame of road image acquired by the vehicle. Instead, when a lane line has been identified in the road image and the road edge detection condition is currently met, S405 is executed to perform road edge line identification on the road image acquired by the vehicle. Otherwise, the operation of S403 is executed to clear the image identification data, end the lane line processing of this road image, and wait to process the next frame of road image acquired by the vehicle.
Specifically, if the road edge detection condition is condition one, this step may judge whether a preset recognition period, for example 2 s, has elapsed since the last road edge identification; if so, road edge detection condition one is satisfied. If the road edge detection condition is condition two, the confidence, i.e. the credibility of each identified lane line, may be calculated when the lane lines in the road image are identified in S402; it is then judged whether the confidence of the identified lane lines meets the preset confidence threshold (such as 70%), which may, for example, be required of all or only some of the lane lines. If so, road edge detection condition two is satisfied. If the road edge detection condition is condition three, the number of lane lines identified in the current frame of road image and the number identified in the previous frame are obtained and compared; if they are inconsistent, road edge detection condition three is satisfied.
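The three-condition check can be sketched as a single gate function. This is an illustrative sketch, not the disclosed implementation: the 2 s period and 70% threshold are the example values from the text, condition two is applied here to all identified lane lines, and every name is hypothetical.

```python
RECOGNITION_PERIOD_S = 2.0      # example preset recognition period
CONFIDENCE_THRESHOLD = 0.70     # example confidence threshold

def road_edge_detection_needed(now_s, last_edge_detect_s,
                               lane_confidences, prev_lane_count):
    """True if any of the three road edge detection conditions is met."""
    # Condition one: the preset recognition period has elapsed.
    if now_s - last_edge_detect_s >= RECOGNITION_PERIOD_S:
        return True
    # Condition two: every identified lane line is sufficiently confident.
    if lane_confidences and all(c > CONFIDENCE_THRESHOLD for c in lane_confidences):
        return True
    # Condition three: the lane count changed between adjacent frames.
    if len(lane_confidences) != prev_lane_count:
        return True
    return False

print(road_edge_detection_needed(10.0, 9.5, [0.9, 0.8, 0.6], 3))  # False
print(road_edge_detection_needed(10.0, 7.0, [0.9, 0.8, 0.6], 3))  # True
```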
S405, if a lane line is identified and the road edge detection condition is met, performing road edge line identification on the road image to obtain the road edge line.
Optionally, in this embodiment of the application, only when a lane line is identified in the road image and the current road edge detection condition is also met is the road edge line identification operation further performed on the road image acquired by the vehicle; the specific method for identifying the road edge line in the road image has been described in the above embodiments and is not repeated herein.
S406, determining the misrecognized lane lines and the missing lane lines among the lane lines according to the road edge lines.
S407, correcting the identified lane lines according to the misrecognized lane lines and the missing lane lines.
According to the technical solution of this embodiment, after the road image collected by the vehicle is obtained, lane line identification is performed on it; road edge line identification is then performed only when a lane line is identified and the current road edge detection condition is met; the misrecognized and missing lane lines among the identified lane lines are determined based on the identified road edge lines, and the identified lane lines are corrected based on the determination result. The solution of this embodiment does not require lane lines and road edge lines to be identified in every frame of road image acquired by the vehicle; through the lane line identification condition and the road edge detection condition, road edge line identification is targeted at the road images in which the lane lines may have changed, which reduces the power consumption of lane line processing while guaranteeing its timeliness and accuracy.
Fig. 5 is a flowchart of a lane positioning method according to an embodiment of the present application. This embodiment is applicable to the case of positioning, in real time, the lane of the vehicle that acquires the road image by using the lane lines corrected by the lane line processing method of any embodiment of the present application. This embodiment may be performed by a lane positioning device configured in an electronic device, which may be implemented in software and/or hardware. The electronic device may be a vehicle-mounted terminal device or a back-end server device. As shown in fig. 5, the method includes:
S501, obtaining a lane positioning request.
The lane positioning request is used for instructing the electronic device that receives the request to position the lane in which the vehicle is located. The lane positioning request may be generated by the vehicle-mounted terminal and then acquired and responded to locally, or the vehicle-mounted terminal may transmit the generated lane positioning request to a back-end server, which acquires and responds to it. Likewise, the lane positioning request may be generated by the back-end server and then acquired and responded to locally by the back-end server, or the back-end server may transmit the generated lane positioning request to the vehicle-mounted terminal, which acquires and responds to it. Optionally, in this embodiment of the application, if the electronic device that obtains the lane positioning request is a back-end server, the request may include the identification information of the vehicle that needs to be lane-positioned.
S502, responding to the lane positioning request, and determining the lane positioning information of the vehicle in real time according to the corrected lane lines; wherein the corrected lane lines are determined by: identifying the road image acquired by the vehicle to obtain lane lines and road edge lines; determining the misrecognized lane lines and the missing lane lines among the lane lines according to the road edge lines; and correcting the identified lane lines according to the misrecognized lane lines and the missing lane lines to obtain the corrected lane lines.
The corrected lane lines required for positioning the lane of the vehicle in the embodiment of the present application may be determined by the lane line processing method described in any of the above embodiments; the specific determination process is not described in detail again. Optionally, if the lane line processing and the lane positioning are executed by the same electronic device, the electronic device may obtain the corrected lane lines locally; if they are executed by different electronic devices, the lane positioning request may include the corrected lane lines, or the corrected lane lines may be acquired through interaction between the electronic devices.
Optionally, in the embodiment of the present application, there are many ways of determining the lane positioning information of the vehicle in real time according to the corrected lane lines in response to the lane positioning request, which are not limited herein. In a first manner, the lane positioning information of the vehicle is determined according to the relationship between the slopes of the corrected lane lines; the specific determination method will be described in detail in the following embodiments. In a second manner, the positions of the corrected lane lines in the road image are converted into the world coordinate system, the target lane lines on both sides of the current vehicle are determined from the corrected lane lines by combining the actual positioning information of the vehicle, and the lane number associated with the target lane lines is taken as the lane number of the current vehicle, namely the lane positioning information of the vehicle. In a third manner, the corrected lane lines are input into a pre-trained deep learning model for vehicle positioning, and the model is run to obtain the lane positioning information of the vehicle.
It should be noted that, in the embodiment of the present application, the lane positioning information corresponding to the vehicle in each frame of road image may be determined based on the lane lines corrected for that frame. If the road images are collected in real time while the vehicle is driving and the corrected lane lines are obtained by the lane line processing method described in the above embodiments, this embodiment can determine and update the lane positioning information of the vehicle in real time during driving based on the lane lines corrected in real time.
According to the technical solution of this embodiment, after the lane positioning request is obtained, the misrecognized and missing lane lines are corrected based on the road edge lines in the road image, and the accurate lane lines thus obtained are used to determine the lane positioning information of the vehicle in real time. Because the corrected lane lines adopted when determining the lane positioning information contain no misrecognized or missing lane lines, they are more accurate, which further improves the accuracy of lane positioning.
Fig. 6 is a flowchart of another lane positioning method provided according to an embodiment of the present application. On the basis of the above embodiments, this embodiment specifically describes how the lane positioning information of the vehicle is determined in real time according to the corrected lane lines. As shown in fig. 6, the method includes:
S601, obtaining a lane positioning request.
S602, responding to the lane positioning request, and determining the slopes of the corrected lane lines.
Wherein the corrected lane lines are determined by: identifying the road image acquired by the vehicle to obtain lane lines and road edge lines; determining the misrecognized lane lines and the missing lane lines among the lane lines according to the road edge lines; and correcting the identified lane lines according to the misrecognized lane lines and the missing lane lines to obtain the corrected lane lines.
Optionally, in the embodiment of the present application, when the slopes of the corrected lane lines are determined in response to the lane positioning request, the slope of each lane line may be determined based on the curve equation fitted to it in the lane line processing stage. For example, if the curve equation fitted to a certain corrected lane line is u − u0 = A(v − v0) + B/(v − v0), the value of the parameter A can be obtained as the slope of that lane line.
S603, taking two adjacent corrected lane lines whose slopes are opposite numbers as the target lane lines.
The target lane lines may refer to the lane lines on both sides of the lane in which the vehicle is located. For example, if the lane number of the vehicle is lane 1 from the left, the target lane lines are the first and second lane lines on the left side of the road.
Optionally, for a road image acquired while the vehicle is driving, the lane lines on both sides of the lane in which the vehicle is actually located are generally symmetrical about the driving direction. Therefore, after the corrected lane lines are determined, this embodiment may judge in turn whether the slopes of each pair of adjacent lane lines are opposite numbers, such as 2 and −2, and take the two adjacent corrected lane lines whose slopes are a pair of opposite numbers as the target lane lines. For example, taking the road image shown in fig. 1B as an example, if the corrected lane lines are S1-S4, analyzing the slopes of the four corrected lane lines shows that the slopes of the lane lines S2 and S3 should be a pair of opposite numbers, as should those of S1 and S4; since S2 and S3 are the two adjacent lane lines, the lane lines S2 and S3 are taken as the target lane lines.
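The adjacent opposite-slope search in S603 can be sketched as a linear scan. This is an illustrative sketch: fitted slopes are rarely exactly opposite, so a small tolerance is assumed, and the function name and sample slopes are hypothetical.

```python
def find_target_lane_lines(slopes, tol=1e-6):
    """Scan lane line slopes ordered left to right; return the index pair
    (i, i+1) whose slopes sum to approximately zero, else None."""
    for i in range(len(slopes) - 1):
        if abs(slopes[i] + slopes[i + 1]) <= tol:
            return i, i + 1
    return None

# Example slopes for S1-S4 in the fig. 1B scenario: S2 and S3 are the
# adjacent pair of opposite numbers, so indices (1, 2) are returned.
print(find_target_lane_lines([3.1, 2.0, -2.0, -3.1]))  # (1, 2)
```

In practice the tolerance would be tuned to the noise of the slope estimates.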
S604, determining the lane positioning information of the vehicle according to the target lane lines.
Optionally, in the embodiment of the present application, after the target lane lines are obtained, the lane between the target lane lines may be taken as the lane in which the vehicle is currently located, namely the lane positioning information of the vehicle. For example, taking the road image shown in fig. 1B as an example, if the target lane lines are the lane lines S2 and S3, lane 2 between them may be taken as the lane positioning information of the vehicle.
In the embodiment of the present application, if the corrected lane lines were corrected based on the left road edge line, the determined lane positioning information is the lane number counted from the left, for example lane 1, lane 2, lane 3, and the like. If the corrected lane lines were corrected based on the right road edge line, the determined lane positioning information is the lane number counted from the right, for example lane −1, lane −2, lane −3. If the corrected lane lines were corrected based on both road edge lines, the lane number may be counted from either side.
According to the technical solution of this embodiment, after the lane positioning request is obtained, two adjacent corrected lane lines whose slopes are opposite numbers are taken as the target lane lines based on the slopes of the corrected lane lines, and the lane positioning information of the vehicle is then determined in real time according to the target lane lines. In the solution of this embodiment, the corrected lane lines adopted when determining the lane positioning information contain no misrecognized or missing lane lines and are therefore more accurate; moreover, the positioning information of the lane is determined from the slopes of the lane lines alone, with no other information required, which provides a new idea for lane positioning.
Fig. 7 is a flowchart of another lane positioning method provided according to an embodiment of the present application. On the basis of the above embodiments, this embodiment specifically describes a particular case of determining the lane positioning information of the vehicle in real time according to the corrected lane lines. As shown in fig. 7, the method includes:
S701, acquiring a lane positioning request.
S702, responding to the lane positioning request; when the numbers of corrected lane lines in two adjacent frames of road image are consistent, determining whether the vehicle has a lane change event in the current frame of road image; if so, executing S703, and if not, executing S704.
A lane change event means that, during driving, the vehicle moves from one lane to an adjacent lane on its left or right, for example a shift to the right from lane 1 to lane 2. Optionally, the lane change event includes the lane change direction.
Optionally, in the embodiment of the present application, in response to the acquired lane positioning request, it is first judged whether the number of corrected lane lines in the current frame of road image acquired by the vehicle is consistent with the number of corrected lane lines in the previous frame. If they are inconsistent, the corrected lane lines have changed between the two frames (that is, the lane lines have actually changed, or misrecognition or missed recognition has occurred); in that case, for example, two adjacent corrected lane lines in the current frame whose slopes are opposite numbers are taken as the target lane lines, and the lane positioning information of the vehicle is determined according to the target lane lines. If they are consistent, the corrected lane lines have not changed between the current and previous frames; in order to further determine whether the positioning information of the vehicle has changed, it may be judged whether a lane change event exists in the current frame compared with the previous frame. Specifically, whether the vehicle has a lane change event in the current frame of road image may be judged according to the change in the slopes of the corrected lane lines between the current and previous frames; the specific judgment method will be described in the following embodiments.
Optionally, in this embodiment of the application, if it is determined that the vehicle has a lane change event in the current frame of road image, the lane positioning information of the vehicle in the current frame has necessarily changed compared with the previous frame, and S703 is executed to re-determine the lane positioning information of the vehicle in the current frame. If it is determined that the vehicle has no lane change event in the current frame, the lane positioning information of the vehicle in the current frame remains unchanged compared with the previous frame, and S704 is executed to take the lane positioning information of the previous frame of road image as that of the current frame.
S703, if a lane change event exists, determining the lane positioning information of the vehicle in the current frame of road image according to the lane change direction.
Optionally, if the vehicle has a lane change event in the current frame of road image, the lane change direction may be determined from the lane change event; for example, if the lane change event includes the lane change direction, the direction may be obtained from it directly. After the lane change direction of the vehicle in the current frame is determined, the lane positioning information of the vehicle in the current frame may be determined in combination with the lane positioning information of the vehicle in the previous frame. Specifically, since traffic regulations stipulate that a vehicle may only change one lane at a time and may not change lanes continuously, if the lane change direction is to the left, one can be subtracted from the lane number in the previous frame of road image to obtain the lane positioning information of the vehicle in the current frame; if the lane change direction is to the right, one can be added to the lane number in the previous frame to obtain the lane positioning information of the vehicle in the current frame.
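Under the one-lane-at-a-time assumption stated above, the update in S703 reduces to adjusting the previous frame's lane number by the lane change direction. The sketch below assumes left-counted lane numbers and illustrative direction strings; none of the names come from the disclosure.

```python
def update_lane_number(prev_lane, direction):
    """Update the lane number from the previous frame by the lane change
    direction; lanes are numbered from the left as in the text."""
    if direction == "left":
        return prev_lane - 1
    if direction == "right":
        return prev_lane + 1
    return prev_lane  # no lane change event: keep the previous lane

print(update_lane_number(2, "left"))   # 1
print(update_lane_number(2, "right"))  # 3
```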
S704, taking the lane positioning information of the previous frame of road image as the lane positioning information of the current frame of road image.
According to the technical solution of this embodiment, the lane positioning request is obtained and responded to, and if the numbers of corrected lane lines in two adjacent frames of road image are consistent and the vehicle has a lane change event in the current frame, the lane positioning information of the vehicle in the current frame can be determined according to the lane change direction and the lane positioning information of the vehicle in the previous frame. If the numbers of corrected lane lines in the two adjacent frames are consistent but no lane change event exists in the current frame, the lane positioning information of the current frame is consistent with that of the previous frame. By introducing the lane change event, the solution of this embodiment can, when the numbers of corrected lane lines in two adjacent frames are consistent, rapidly determine the lane positioning information of the vehicle in the current frame simply according to whether a lane change event exists. This improves the efficiency and accuracy of lane positioning and provides another new idea for lane positioning.
Fig. 8 is a flowchart of another lane positioning method provided according to an embodiment of the present application. The present embodiment is described in detail based on the above embodiments, and as shown in fig. 8, the method includes:
S801, acquiring a lane positioning request.
S802, responding to the lane positioning request, and under the condition that the number of the lane lines after the correction of the two adjacent frames of road images is consistent, determining a target lane line related to the lane positioning information according to the lane positioning information of the vehicle in the previous frame of road image.
Optionally, in this embodiment of the application, when it is determined, in response to the lane positioning request, that the numbers of corrected lane lines in two adjacent frames of road images are consistent, the lane positioning information determined for the previous frame road image, such as the lane number of the vehicle in the previous frame, may be obtained first. After the lane positioning information of the vehicle in the previous frame road image is acquired, the nearest lane lines on the left and right sides of the lane indicated by that lane number may be used as the target lane lines associated with the lane positioning information. For example, assuming the lane positioning information of the vehicle in the previous frame road image is lane 1 counted from the left, the first and second lane lines from the left, which bound lane 1, may be taken as the target lane lines.
And S803, determining a first proportion according to the slope of the target lane line in the previous frame of road image.
Optionally, since there are two target lane lines, the slopes of the two target lane lines in the previous frame road image may be obtained in this embodiment of the application. Specifically, the slope of a target lane line may be determined based on the curve equation fitted to that lane line during the lane line processing stage of the previous frame road image. For example, if the curve equation fitted to one of the target lane lines is: u - u0 = A(v - v0) + B/(v - v0), the value of the parameter A may be taken as the slope of that target lane line.
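As an illustration of the fitting step, the parameters A and B of the curve model u - u0 = A(v - v0) + B/(v - v0) can be recovered by linear least squares once u0 and v0 are fixed. How u0 and v0 are chosen is not specified in the source, so they are treated here as known inputs (e.g., a horizon reference point); this is a sketch under that assumption:

```python
import numpy as np

def fit_lane_curve(u, v, u0, v0):
    """Least-squares fit of u - u0 = A*(v - v0) + B/(v - v0).

    u, v: pixel coordinates of points on one lane line.
    u0, v0: reference point, assumed known (not specified in the source).
    Returns (A, B); A serves as the lane-line slope.
    """
    dv = np.asarray(v, dtype=float) - v0
    du = np.asarray(u, dtype=float) - u0
    # Linear model in the unknowns A and B: du = A*dv + B*(1/dv)
    X = np.column_stack([dv, 1.0 / dv])
    (A, B), *_ = np.linalg.lstsq(X, du, rcond=None)
    return A, B
```

On noise-free synthetic points the fit recovers A and B essentially exactly; with real detections the residual of the least-squares solve indicates fit quality.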
When the first ratio is determined according to the slope of the target lane line in the previous frame of road image in the embodiment of the application, the first ratio may be determined according to the following formula (2):
d1 = (al_1 + ar_1) / (al_1 - ar_1);  (2)
wherein d1 is a first ratio; al _1 is the slope of the target lane line on the left side in the previous frame of road image; ar _1 is the slope of the target lane line located on the right side in the previous road image.
S804, determining a second proportion according to the slope of the target lane line in the current frame road image.
Optionally, in the embodiment of the present application, when determining the second proportion according to the slope of the target lane line in the current frame road image, the second proportion may be determined according to the following formula (3):
d2 = (al_2 + ar_2) / (al_2 - ar_2);  (3)
wherein d2 is a second ratio; al _2 is the slope of the target lane line positioned on the left side in the current frame road image; and ar _2 is the slope of the target lane line located on the right side in the current frame road image.
It should be noted that the method for determining the slope of the target lane line in the current frame road image is similar to the method for determining the slope of the target lane line in the previous frame road image, and details are not repeated here.
S805, determining whether the vehicle has a lane change event in the current frame road image according to the change relationship between the first proportion and the second proportion, if so, executing S806, and if not, executing S807.
Optionally, in this embodiment of the application, when the vehicle changes lanes to the left, the slope ratio of the target lane lines changes from 1 to -1; when the vehicle changes lanes to the right, the slope ratio changes from -1 to 1. Accordingly, whether the change from the first proportion to the second proportion matches the rule for changing lanes to the left or to the right is determined based on this rule. If it matches, the vehicle has a lane change event in the current frame road image, and S806 is executed to determine the lane positioning information of the vehicle in the current frame road image; if not, the vehicle has no lane change event in the current frame road image, and S807 is executed to use the lane positioning information of the previous frame road image as that of the current frame.
Optionally, when determining whether the change from the first proportion to the second proportion matches the rule for changing lanes to the left or to the right, this embodiment of the application may compare the first proportion d1 and the second proportion d2 calculated by formulas (2) and (3): if d1 - d2 < a first proportion threshold (e.g., -1.5), the vehicle has a lane change event to the right in the current frame road image; if d1 - d2 > a second proportion threshold (e.g., 1.5), the vehicle has a lane change event to the left in the current frame road image.
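A minimal sketch of this detection rule, using formulas (2) and (3) and the example thresholds ±1.5 from the text (the threshold values are the illustrative examples given above, not fixed constants):

```python
def slope_ratio(a_left: float, a_right: float) -> float:
    """Formulas (2)/(3): d = (al + ar) / (al - ar)."""
    return (a_left + a_right) / (a_left - a_right)

def detect_lane_change(al_1, ar_1, al_2, ar_2,
                       right_thresh=-1.5, left_thresh=1.5):
    """Return 'left', 'right', or None from the slopes of the two target
    lane lines in the previous frame (al_1, ar_1) and the current frame
    (al_2, ar_2), following the d1 - d2 comparison in the text."""
    d1 = slope_ratio(al_1, ar_1)
    d2 = slope_ratio(al_2, ar_2)
    diff = d1 - d2
    if diff < right_thresh:   # ratio jumps from about -1 to about 1
        return "right"
    if diff > left_thresh:    # ratio jumps from about 1 to about -1
        return "left"
    return None
```

For instance, if the target-line slopes swing from (0.5, 0.01) to (-0.01, -0.5), d1 - d2 is about 2.08 > 1.5, which the rule reads as a lane change to the left.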
And S806, if the lane change event exists, determining lane positioning information of the vehicle in the current road image according to the lane change direction.
S807, the lane positioning information of the previous road image is used as the lane positioning information of the current road image.
According to the technical solution of this embodiment of the application, a vehicle positioning request is obtained and responded to. If the numbers of corrected lane lines in two adjacent frames of road images are consistent, a target lane line is determined according to the positioning information of the vehicle in the previous frame road image; a first proportion is determined according to the slope of the target lane line in the previous frame road image; a second proportion is determined according to the slope of the target lane line in the current frame road image; and whether a lane change event exists is determined according to the change relationship between the first proportion and the second proportion. If a lane change event exists, the lane positioning information of the vehicle in the current frame road image may be determined according to the lane change direction and the lane positioning information of the vehicle in the previous frame road image; otherwise, the lane positioning information is the same as that of the previous frame road image. This solution provides a new approach of determining whether the vehicle has a lane change event based on the change of lane line slopes between adjacent frames, and thereby supports quickly determining the lane positioning information of the vehicle.
Fig. 9 is a schematic structural diagram of a lane line processing device according to an embodiment of the present application. The embodiment is suitable for the condition of carrying out lane line identification processing on the collected road image. The device can realize the lane line processing method in any embodiment of the application. The apparatus 900 specifically includes the following:
the image identification module 901 is used for identifying a road image acquired by a vehicle to obtain a lane line and a road edge line;
a lane line analysis module 902, configured to determine, according to the road edge line, the misrecognized lane lines and the missing lane lines among the identified lane lines;
and a lane line correction module 903, configured to correct the identified lane lines according to the misrecognized lane lines and the missing lane lines.
According to the technical solution of this embodiment of the application, lane lines and road edge lines are identified in the road images collected by the vehicle, the misrecognized and missing lane lines among the identified lane lines are determined based on the identified road edge lines, and the identified lane lines are then corrected based on the determination result. Because the misrecognized and missing lane lines can be corrected based on the identified road edge lines, the accuracy of the final lane lines is greatly improved, which in turn improves the accuracy of related applications involving lane lines, such as lane positioning, and provides a new approach to lane line processing.
Further, the lane line analysis module 902 includes:
a misrecognition analysis unit, configured to take a lane line whose spacing distance from preset points on the road edge line meets a misrecognition threshold as a misrecognized lane line;
and a missing-line analysis unit, configured to, if a lane line whose spacing distance meets a missing threshold exists among the remaining lane lines other than the misrecognized lane lines, take the lane line at the road edge as the missing lane line among the lane lines.
Further, the lane line analysis module 902 further includes a distance determination unit, configured to:
performing curve fitting on the lane line to obtain a curve equation of the lane line;
determining the position coordinates of preset points on the road edge line;
and determining the spacing distance between the lane line and the preset points on the road edge line according to the curve equation of the lane line and the position coordinates of the preset points.
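A possible reading of the distance determination unit is sketched below: the fitted lane-line curve is evaluated at the row coordinate of each preset road-edge point, and the horizontal offsets are aggregated. Averaging the offsets is an assumption; the source only states that a spacing distance is computed from the curve equation and the point coordinates:

```python
import numpy as np

def lane_to_edge_distance(A, B, u0, v0, edge_points):
    """Mean horizontal distance between a fitted lane line and preset
    points on the road edge line.

    The lane line follows u = u0 + A*(v - v0) + B/(v - v0) (the curve
    model used elsewhere in the description); edge_points is an iterable
    of (u, v) image coordinates. Averaging is an illustrative choice.
    """
    pts = np.asarray(edge_points, dtype=float)   # rows of (u, v)
    v = pts[:, 1]
    u_curve = u0 + A * (v - v0) + B / (v - v0)   # lane line's u at each row
    return float(np.mean(np.abs(u_curve - pts[:, 0])))
```

The resulting distance would then be compared against the misrecognition or missing thresholds described above.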
Further, the lane line analysis module 902 is specifically configured to:
determining, in the case that there are at least two road edge lines, a target road edge line from the at least two road edge lines according to their confidence degrees;
and determining the misrecognized lane lines and the missing lane lines among the lane lines according to the target road edge line.
Further, the image recognition module 901 includes:
the lane line identification unit is used for identifying lane lines of the road images acquired by the vehicles;
and the road edge line identification unit is used for identifying the road edge line of the road image if the lane line is identified and the road edge detection condition is met.
Further, the road edge detection condition includes at least one of: the current moment reaches a preset identification period; the confidence of the lane line is greater than a confidence threshold; the number of the identified lane lines in the two adjacent frames of road images is inconsistent.
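Since the condition is "at least one of", the trigger can be sketched as a simple disjunction (all parameter names here are illustrative, not from the source):

```python
def should_detect_road_edge(now, last_run, period,
                            lane_confidence, conf_threshold,
                            prev_lane_count, curr_lane_count):
    """True if any listed road edge detection condition holds:
    the identification period has elapsed, the lane line confidence
    exceeds its threshold, or the lane line counts of two adjacent
    frames disagree."""
    return (now - last_run >= period
            or lane_confidence > conf_threshold
            or prev_lane_count != curr_lane_count)
```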
Fig. 10 is a schematic structural diagram of a lane positioning device according to an embodiment of the present application. This embodiment is applicable to the situation where the lane lines corrected by the lane line processing method of any embodiment of the application are used to position, in real time, the lane of the vehicle that acquires the road images. The device can implement the lane positioning method of any embodiment of the application. The apparatus 1000 specifically comprises the following:
a request obtaining module 1001, configured to obtain a lane positioning request;
the lane positioning module 1002 is configured to respond to the lane positioning request and determine lane positioning information of a vehicle in real time according to the corrected lane lines; wherein the corrected lane lines are determined by: identifying a road image collected by the vehicle to obtain lane lines and a road edge line; determining the misrecognized lane lines and the missing lane lines among the lane lines according to the road edge line; and correcting the identified lane lines according to the misrecognized lane lines and the missing lane lines to obtain the corrected lane lines.
According to the technical solution of this embodiment of the application, after the vehicle positioning request is obtained, the misrecognized lane lines and the missing lane lines are corrected based on the road edge line in the road image, and accurate lane lines are obtained for determining the lane positioning information of the vehicle in real time. Since the corrected lane lines used for determining the lane positioning information contain no misrecognized or missing lane lines, they are more accurate, which further improves the accuracy of lane positioning.
Further, the lane positioning module 1002 includes:
a lane line slope determination unit for determining a slope of the corrected lane line;
a target lane line determining unit, configured to take two adjacent corrected lane lines whose slopes have opposite signs as the target lane lines;
and the lane positioning unit is used for determining lane positioning information of the vehicle according to the target lane line.
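One plausible reading of the target lane line determining unit: among corrected lane lines ordered left to right, the ego lane is bounded by the first adjacent pair whose image-space slopes have opposite signs (the ego lane's left boundary leans one way in the image, its right boundary the other). A sketch under that assumption:

```python
def find_ego_lane_lines(slopes):
    """Return the indices of the first adjacent lane-line pair with
    opposite-sign slopes, or None if no such pair exists.

    slopes: slopes of the corrected lane lines, ordered left to right.
    This pairing rule is an interpretation of the text, not a verbatim
    algorithm from the source.
    """
    for i in range(len(slopes) - 1):
        if slopes[i] * slopes[i + 1] < 0:  # opposite signs
            return (i, i + 1)
    return None
```

The returned pair then serves as the target lane lines from which the lane positioning unit derives the vehicle's lane number.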
Further, the lane positioning module 1002 further comprises:
the lane change event determining unit is used for determining whether a lane change event exists in the current frame road image or not under the condition that the corrected lane line number of the two adjacent frame road images is consistent;
and the lane positioning unit is used for determining lane positioning information of the vehicle in the current frame road image according to the lane changing direction if a lane changing event exists.
Further, the lane change event determining unit is specifically configured to:
determining a target lane line associated with the lane positioning information according to the lane positioning information of the vehicle in the previous frame of road image;
determining a first proportion according to the slope of the target lane line in the previous frame road image;
determining a second proportion according to the slope of the target lane line in the current frame road image;
and determining whether the vehicle has a lane change event in the current frame road image according to the change relation between the first proportion and the second proportion.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
Fig. 11 is a block diagram of an electronic device for implementing the lane line processing method or the lane positioning method according to the embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the present application that are described and/or claimed herein.
As shown in fig. 11, the electronic apparatus includes: one or more processors 1101, a memory 1102, and interfaces for connecting the various components, including a high speed interface and a low speed interface. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to the interface). In other embodiments, multiple processors and/or multiple buses may be used, along with multiple memories, as desired. Also, multiple electronic devices may be connected, with each device providing a portion of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). In fig. 11, one processor 1101 is taken as an example.
The memory 1102 is a non-transitory computer readable storage medium as provided herein. Wherein the memory stores instructions executable by at least one processor to cause the at least one processor to perform a lane line processing method or a lane locating method provided herein. A non-transitory computer-readable storage medium of the present application stores computer instructions for causing a computer to execute a lane line processing method or a lane positioning method provided by the present application.
The memory 1102 may be used as a non-transitory computer readable storage medium for storing non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules corresponding to the lane line processing method or the lane locating method in the embodiment of the present application (for example, the image recognition module 901, the lane line analysis module 902, and the lane line correction module 903 shown in fig. 9, or the request acquisition module 1001 and the lane locating module 1002 shown in fig. 10). The processor 1101 executes various functional applications of the server and data processing, i.e., implements the lane line processing method or the lane locating method in the above-described method embodiments, by running non-transitory software programs, instructions, and modules stored in the memory 1102.
The memory 1102 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of an electronic device implementing a lane line processing method or a lane positioning method, or the like. Further, the memory 1102 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 1102 may optionally include a memory remotely located from the processor 1101, which may be connected over a network to an electronic device implementing a lane line processing method or a lane locating method. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device implementing the lane line processing method or the lane positioning method may further include: an input device 1103 and an output device 1104. The processor 1101, the memory 1102, the input device 1103 and the output device 1104 may be connected by a bus or other means, and are exemplified by being connected by a bus in fig. 11.
The input device 1103 may receive input numeric or character information and generate key signal inputs related to user settings and function control of an electronic apparatus implementing a lane line processing method or a lane positioning method, such as an input device of a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointing stick, one or more mouse buttons, a track ball, a joystick, or the like. The output devices 1104 may include a display device, auxiliary lighting devices (e.g., LEDs), tactile feedback devices (e.g., vibrating motors), and the like. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device can be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application specific ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the defects of difficult management and weak service scalability found in traditional physical hosts and VPS services.
According to the technical solution of this embodiment of the application, lane lines and road edge lines are identified in the road images acquired by the vehicle, the misrecognized and missing lane lines among the identified lane lines are determined based on the identified road edge lines, and the identified lane lines are then corrected based on the determination result. Because the misrecognized and missing lane lines can be corrected based on the identified road edge lines, the accuracy of the final lane lines is greatly improved, which in turn improves the accuracy of related applications involving lane lines, such as lane positioning, and provides a new approach to lane line processing.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, and the present invention is not limited thereto as long as the desired results of the technical solutions disclosed in the present application can be achieved.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (22)

1. A lane line processing method, comprising:
identifying a road image acquired by a vehicle to obtain a lane line and a road edge line;
determining, according to the road edge line, the misrecognized lane lines and the missing lane lines among the lane lines;
and correcting the identified lane lines according to the misrecognized lane lines and the missing lane lines.
2. The method of claim 1, wherein the determining of the misrecognized lane lines and the missing lane lines according to the road edge line comprises:
taking a lane line whose spacing distance from preset points on the road edge line meets a misrecognition threshold as a misrecognized lane line;
and if a lane line whose spacing distance meets a missing threshold exists among the remaining lane lines other than the misrecognized lane lines, taking the lane line at the road edge as the missing lane line among the lane lines.
3. The method of claim 2, further comprising:
performing curve fitting on the lane line to obtain a curve equation of the lane line;
determining the position coordinates of preset points on the road edge line;
and determining the spacing distance between the lane line and the preset points on the road edge line according to the curve equation of the lane line and the position coordinates of the preset points.
4. The method of any one of claims 1-3, wherein the determining, according to the road edge line, of the misrecognized lane lines and the missing lane lines comprises:
determining, in the case that there are at least two road edge lines, a target road edge line from the at least two road edge lines according to their confidence degrees;
and determining the misrecognized lane lines and the missing lane lines among the lane lines according to the target road edge line.
5. The method of claim 1, wherein the identifying a road image captured by a vehicle comprises:
carrying out lane line identification on a road image acquired by a vehicle;
and if the lane line is identified and the road edge detection condition is met, performing road edge line identification on the road image.
6. The method of claim 5, wherein the road edge detection condition comprises at least one of:
the current moment reaches a preset identification period;
the confidence of the lane line is greater than a confidence threshold;
the number of the identified lane lines in the two adjacent frames of road images is inconsistent.
7. A lane locating method, comprising:
acquiring a lane positioning request;
responding to the lane positioning request, and determining lane positioning information of the vehicle in real time according to the corrected lane lines; wherein the corrected lane lines are determined by: identifying a road image collected by the vehicle to obtain lane lines and a road edge line; determining the misrecognized lane lines and the missing lane lines among the lane lines according to the road edge line; and correcting the identified lane lines according to the misrecognized lane lines and the missing lane lines to obtain the corrected lane lines.
8. The method of claim 7, wherein the determining lane locating information of the vehicle in real time according to the corrected lane line comprises:
determining the slope of the corrected lane line;
taking two adjacent corrected lane lines whose slopes have opposite signs as the target lane lines;
and determining lane positioning information of the vehicle according to the target lane line.
9. The method of claim 7, wherein the determining lane locating information of the vehicle in real time according to the corrected lane line comprises:
determining whether a lane change event exists in the current frame road image by the vehicle under the condition that the corrected lane lines of the two adjacent frames of road images are consistent in number;
and if the lane change event exists, determining lane positioning information of the vehicle in the current frame road image according to the lane change direction.
10. The method of claim 9, wherein the determining whether the vehicle has a lane change event in the current frame road image comprises:
determining a target lane line associated with the lane positioning information according to the lane positioning information of the vehicle in the previous frame of road image;
determining a first proportion according to the slope of the target lane line in the previous frame road image;
determining a second proportion according to the slope of the target lane line in the current frame road image;
and determining whether the vehicle has a lane change event in the current frame road image according to the change relation between the first proportion and the second proportion.
11. A lane line processing apparatus comprising:
the image identification module is used for identifying road images acquired by vehicles to obtain lane lines and road edge lines;
a lane line analysis module, configured to determine, according to the road edge line, the misrecognized lane lines and the missing lane lines among the lane lines;
and the lane line correction module is used for correcting the identified lane line according to the mistakenly identified lane line and the missing lane line.
12. The apparatus of claim 11, wherein the lane line analysis module comprises:
a misrecognition analysis unit, configured to take a lane line whose spacing distance from preset points on the road edge line meets a misrecognition threshold as a misrecognized lane line;
and a missing-line analysis unit, configured to, if a lane line whose spacing distance meets a missing threshold exists among the remaining lane lines other than the misrecognized lane lines, take the lane line at the road edge as the missing lane line among the lane lines.
13. The apparatus of claim 12, wherein the lane line analysis module further comprises a distance determination unit to:
performing curve fitting on the lane line to obtain a curve equation of the lane line;
determining the position coordinates of preset points on the road edge line;
and determining the spacing distance between the lane line and the preset points on the road edge line according to the curve equation of the lane line and the position coordinates of the preset points.
14. The apparatus according to any one of claims 11-13, wherein the lane line analysis module is specifically configured to:
determining, in a case where there are at least two road edge lines, a target road edge line from the at least two road edge lines according to the confidences of the at least two road edge lines; and
determining the misrecognized lane line and the missing lane line among the lane lines according to the target road edge line.
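A plausible reading of claim 14's selection step is simply keeping the highest-confidence road edge line; the dictionary schema below is an assumption for illustration:

```python
def select_target_edge_line(edge_lines):
    """When several road edge lines are detected, keep the most confident.

    edge_lines: list of dicts with a 'confidence' field (hypothetical schema).
    """
    return max(edge_lines, key=lambda line: line["confidence"])
```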
15. The apparatus of claim 11, wherein the image recognition module comprises:
a lane line recognition unit, configured to perform lane line recognition on the road image acquired by the vehicle; and
a road edge line recognition unit, configured to perform road edge line recognition on the road image if a lane line is recognized and a road edge detection condition is satisfied.
16. The apparatus of claim 15, wherein the road edge detection condition comprises at least one of:
the current time reaches a preset recognition period;
the confidence of a recognized lane line is greater than a confidence threshold; or
the numbers of lane lines recognized in two adjacent frames of road images are inconsistent.
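The three trigger conditions of claim 16 combine with a logical OR; the sketch below assumes that reading (the claim only says "at least one of"), with illustrative parameter names:

```python
def should_detect_road_edge(now, last_period_start, period,
                            lane_confidence, confidence_threshold,
                            prev_lane_count, curr_lane_count):
    """Return True when any of claim 16's trigger conditions holds:
    the preset recognition period has elapsed, the lane-line confidence
    exceeds its threshold, or the lane-line count changed between two
    adjacent frames."""
    return (now - last_period_start >= period
            or lane_confidence > confidence_threshold
            or prev_lane_count != curr_lane_count)
```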
17. A lane positioning apparatus comprising:
a request acquisition module, configured to acquire a lane positioning request; and
a lane positioning module, configured to determine, in response to the lane positioning request, lane positioning information of a vehicle in real time according to corrected lane lines, wherein the corrected lane lines are determined by: recognizing a road image acquired by the vehicle to obtain lane lines and a road edge line; determining a misrecognized lane line and a missing lane line among the lane lines according to the road edge line; and correcting the recognized lane lines according to the misrecognized lane line and the missing lane line to obtain the corrected lane lines.
18. The apparatus of claim 17, wherein the lane positioning module comprises:
a lane line slope determination unit, configured to determine slopes of the corrected lane lines;
a target lane line determination unit, configured to take two adjacent corrected lane lines whose slopes have opposite signs as target lane lines; and
a lane positioning unit, configured to determine the lane positioning information of the vehicle according to the target lane lines.
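Claim 18's opposite-slope rule exploits perspective: in a forward-facing image the ego lane's left and right boundaries slope toward each other. A minimal sketch, assuming slopes are supplied ordered left to right:

```python
def find_target_lane_lines(slopes):
    """Find the two adjacent corrected lane lines bounding the ego lane.

    slopes: image-space slopes of the corrected lane lines, ordered left
    to right (assumed). The first adjacent pair with opposite-sign slopes
    is taken as the target lane lines; returns their indices, or None if
    no such pair exists.
    """
    for i in range(len(slopes) - 1):
        if slopes[i] * slopes[i + 1] < 0:
            return i, i + 1
    return None
```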
19. The apparatus of claim 17, wherein the lane positioning module further comprises:
a lane change event determination unit, configured to determine whether a lane change event exists in the current frame of road image in a case where the numbers of corrected lane lines in two adjacent frames of road images are consistent; and
a lane positioning unit, configured to determine the lane positioning information of the vehicle in the current frame of road image according to the lane change direction if a lane change event exists.
20. The apparatus of claim 19, wherein the lane change event determination unit is specifically configured to:
determining, according to lane positioning information of the vehicle in the previous frame of road image, the target lane lines associated with the lane positioning information;
determining a first proportion according to the slopes of the target lane lines in the previous frame of road image;
determining a second proportion according to the slopes of the target lane lines in the current frame of road image; and
determining whether the vehicle has a lane change event in the current frame of road image according to the change relationship between the first proportion and the second proportion.
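Claim 20 does not define the proportion formula, the swing threshold, or the direction convention, so all three below are assumptions made only to show the shape of the comparison between the two frames' proportions:

```python
def detect_lane_change(prev_slopes, curr_slopes, swing=0.25):
    """Flag a lane change from how the slope proportion of the target
    lane lines shifts between two adjacent frames.

    prev_slopes / curr_slopes: (left, right) slopes of the target lane
    lines in the previous and current frames. The proportion is taken
    here as the left slope magnitude's share of the total; the swing
    threshold and direction labels are illustrative, not from the patent.
    """
    def proportion(left, right):
        return abs(left) / (abs(left) + abs(right))

    first = proportion(*prev_slopes)
    second = proportion(*curr_slopes)
    delta = second - first
    if delta > swing:
        return "right"  # left boundary steepens relative to the right one
    if delta < -swing:
        return "left"
    return None  # no lane change detected
```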
21. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the lane line processing method of any one of claims 1-6 or the lane positioning method of any one of claims 7-10.
22. A non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the lane line processing method of any one of claims 1-6 or the lane positioning method of any one of claims 7-10.
CN202011080260.6A 2020-10-10 2020-10-10 Lane line processing and lane positioning method, device, equipment and storage medium Pending CN112132109A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011080260.6A CN112132109A (en) 2020-10-10 2020-10-10 Lane line processing and lane positioning method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011080260.6A CN112132109A (en) 2020-10-10 2020-10-10 Lane line processing and lane positioning method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112132109A true CN112132109A (en) 2020-12-25

Family

ID=73844290

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011080260.6A Pending CN112132109A (en) 2020-10-10 2020-10-10 Lane line processing and lane positioning method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112132109A (en)


Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100002911A1 (en) * 2008-07-06 2010-01-07 Jui-Hung Wu Method for detecting lane departure and apparatus thereof
US20130266175A1 (en) * 2012-04-09 2013-10-10 GM Global Technology Operations LLC Road structure detection and tracking
CN104742912A (en) * 2013-12-27 2015-07-01 比亚迪股份有限公司 Lane deviation detection method and device
US20160012300A1 (en) * 2014-07-11 2016-01-14 Denso Corporation Lane boundary line recognition device
CN106570446A (en) * 2015-10-12 2017-04-19 腾讯科技(深圳)有限公司 Lane line extraction method and device
CN106647776A (en) * 2017-02-24 2017-05-10 驭势科技(北京)有限公司 Judgment method and device for lane changing trend of vehicle and computer storage medium
CN107860391A (en) * 2017-02-13 2018-03-30 问众智能信息科技(北京)有限公司 Automobile accurate navigation method and device
US20180149488A1 (en) * 2016-11-29 2018-05-31 Alpine Electronics, Inc. Guide route setting apparatus and guide route setting method
US20180181817A1 (en) * 2015-09-10 2018-06-28 Baidu Online Network Technology (Beijing) Co., Ltd. Vehicular lane line data processing method, apparatus, storage medium, and device
WO2018177026A1 (en) * 2017-03-29 2018-10-04 蔚来汽车有限公司 Device and method for determining road edge
CN108830165A * 2018-05-22 2018-11-16 南通职业大学 Lane line detection method considering preceding-vehicle interference
CN109271857A * 2018-08-10 2019-01-25 广州小鹏汽车科技有限公司 Pseudo lane line elimination method and device
CN109460739A * 2018-11-13 2019-03-12 广州小鹏汽车科技有限公司 Lane line detection method and device
CN109583416A * 2018-12-11 2019-04-05 广州小鹏汽车科技有限公司 Pseudo lane line detection method and system
CN109583280A (en) * 2017-09-29 2019-04-05 比亚迪股份有限公司 Lane detection method, apparatus, equipment and storage medium
KR20190041150A (en) * 2017-10-12 2019-04-22 현대모비스 주식회사 Calibration method and apparatus thereof
CN109886122A (en) * 2019-01-23 2019-06-14 珠海市杰理科技股份有限公司 Method for detecting lane lines, device, computer equipment and storage medium
CN110136447A * 2019-05-23 2019-08-16 杭州诚道科技股份有限公司 Driving lane change detection and illegal lane change recognition method
CN110263713A (en) * 2019-06-20 2019-09-20 百度在线网络技术(北京)有限公司 Method for detecting lane lines, device, electronic equipment and storage medium
CN111178150A (en) * 2019-12-09 2020-05-19 安徽奇点智能新能源汽车有限公司 Lane line detection method, system and storage medium
WO2020098708A1 (en) * 2018-11-14 2020-05-22 北京市商汤科技开发有限公司 Lane line detection method and apparatus, driving control method and apparatus, and electronic device
CN111209780A (en) * 2018-11-21 2020-05-29 北京市商汤科技开发有限公司 Lane line attribute detection method and device, electronic device and readable storage medium
CN111247525A (en) * 2019-01-14 2020-06-05 深圳市大疆创新科技有限公司 Lane detection method and device, lane detection equipment and mobile platform
CN111383464A (en) * 2018-12-28 2020-07-07 沈阳美行科技有限公司 Vehicle lane change recognition method and device, electronic equipment and medium
CN111563463A (en) * 2020-05-11 2020-08-21 上海眼控科技股份有限公司 Method and device for identifying road lane lines, electronic equipment and storage medium
CN111582201A (en) * 2020-05-12 2020-08-25 重庆理工大学 Lane line detection system based on geometric attention perception
US20200302189A1 (en) * 2018-03-09 2020-09-24 Tencent Technology (Shenzhen) Company Limited Lane line data processing method and apparatus, computer device, and storage medium
CN111738057A (en) * 2020-04-30 2020-10-02 上海智目科技有限公司 Lane line correction method and device based on roadside features


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ISLAM GAMAL et al.: "A robust, real-time and calibration-free lane departure warning system", Microprocessors and Microsystems, vol. 71, pages 1-10 *
YU Houyun; ZHANG Weigong: "Lane line tracking and lane departure detection under a straight-line model", Process Automation Instrumentation, no. 11, 20 November 2009 (2009-11-20), pages 5-7 *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112712040A (en) * 2020-12-31 2021-04-27 潍柴动力股份有限公司 Method, device and equipment for calibrating lane line information based on radar and storage medium
CN112712040B (en) * 2020-12-31 2023-08-22 潍柴动力股份有限公司 Method, device, equipment and storage medium for calibrating lane line information based on radar
CN112823377A (en) * 2021-01-14 2021-05-18 深圳市锐明技术股份有限公司 Road edge segmentation method and device, terminal equipment and readable storage medium
WO2022151147A1 (en) * 2021-01-14 2022-07-21 深圳市锐明技术股份有限公司 Curb segmentation method and apparatus, and terminal device and readable storage medium
CN112823377B (en) * 2021-01-14 2024-02-09 深圳市锐明技术股份有限公司 Road edge segmentation method and device, terminal equipment and readable storage medium
CN112721926A (en) * 2021-02-25 2021-04-30 北京信达五域科技有限公司 Block chain-based lane keeping control method and system for automatic driving automobile
CN112862845A (en) * 2021-02-26 2021-05-28 长沙慧联智能科技有限公司 Lane line reconstruction method and device based on confidence evaluation
CN112862845B (en) * 2021-02-26 2023-08-22 长沙慧联智能科技有限公司 Lane line reconstruction method and device based on confidence evaluation
CN113562465A (en) * 2021-09-26 2021-10-29 成都新西旺自动化科技有限公司 Visual guiding method and system for sheet placement
CN113642533A (en) * 2021-10-13 2021-11-12 宁波均联智行科技股份有限公司 Lane level positioning method and electronic equipment
CN115049997A (en) * 2022-06-07 2022-09-13 北京百度网讯科技有限公司 Method and device for generating edge lane line, electronic device and storage medium

Similar Documents

Publication Publication Date Title
CN112132109A (en) Lane line processing and lane positioning method, device, equipment and storage medium
Yoo et al. A robust lane detection method based on vanishing point estimation using the relevance of line segments
CN111311925B (en) Parking space detection method and device, electronic equipment, vehicle and storage medium
CN111859778B (en) Parking model generation method and device, electronic device and storage medium
CN111292531B (en) Tracking method, device and equipment of traffic signal lamp and storage medium
CN111797187A (en) Map data updating method and device, electronic equipment and storage medium
US20200200545A1 (en) Method and System for Determining Landmarks in an Environment of a Vehicle
CN110595490B (en) Preprocessing method, device, equipment and medium for lane line perception data
CN112541475B (en) Sensing data detection method and device
CN113688935A (en) High-precision map detection method, device, equipment and storage medium
WO2023273344A1 (en) Vehicle line crossing recognition method and apparatus, electronic device, and storage medium
CN114037966A (en) High-precision map feature extraction method, device, medium and electronic equipment
JP7461399B2 (en) Method and device for assisting the running operation of a motor vehicle, and motor vehicle
CN111950345A (en) Camera identification method and device, electronic equipment and storage medium
CN111640301B (en) Fault vehicle detection method and fault vehicle detection system comprising road side unit
CN110458815A Foggy scene detection method and device
CN115973190A (en) Decision-making method and device for automatically driving vehicle and electronic equipment
CN113276888B (en) Riding method, device, equipment and storage medium based on automatic driving
CN113469045B (en) Visual positioning method and system for unmanned integrated card, electronic equipment and storage medium
CN116264038A (en) Signal lamp control method and device, electronic equipment and storage medium
CN114429631A (en) Three-dimensional object detection method, device, equipment and storage medium
CN114689061A (en) Navigation route processing method and device of automatic driving equipment and electronic equipment
CN114495049A (en) Method and device for identifying lane line
CN114049615B (en) Traffic object fusion association method and device in driving environment and edge computing equipment
CN111753765B (en) Sensing device detection method, sensing device detection apparatus, sensing device detection device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20211022

Address after: 100176 101, floor 1, building 1, yard 7, Ruihe West 2nd Road, Beijing Economic and Technological Development Zone, Daxing District, Beijing

Applicant after: Apollo Zhilian (Beijing) Technology Co.,Ltd.

Address before: 2 / F, baidu building, 10 Shangdi 10th Street, Haidian District, Beijing 100085

Applicant before: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co.,Ltd.
