CN109636841B - Lane line generation method and device - Google Patents
- Publication number: CN109636841B (application CN201811290730.4A)
- Authority
- CN
- China
- Prior art keywords
- lane
- point
- lane line
- base map
- reflection value
- Legal status: Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/20—Drawing from basic elements, e.g. lines or circles
- G06T11/203—Drawing of straight lines or curves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/06—Topological mapping of higher dimensional structures onto lower dimensional surfaces
- G06T3/067—Reshaping or unfolding 3D tree structures onto 2D planes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30256—Lane; Road marking
Abstract
The invention provides a lane line generation method and device. The method comprises the following steps: determining a reflection value base map of a lane line, wherein the reflection value base map represents reflection information of the lane line; determining a plurality of lane points corresponding to the lane line in the reflection value base map; inputting the reflection value base map into a pre-trained neural network model and sequentially determining attribute information of each lane point; and generating the lane line according to the attribute information of each lane point. The invention realizes a continuous and complete lane line generation process, so that an unmanned vehicle can drive safely along the generated lane line.
Description
Technical Field
The invention relates to the technical field of intelligent traffic, in particular to a lane line generation method and a lane line generation device.
Background
With the growing intelligence of automotive technology, unmanned vehicles have emerged. An unmanned vehicle achieves driverless operation by means of an on-board intelligent driving system. While the unmanned vehicle is driving, it must be provided with lane line information.
In the prior art, lane lines are generated by a semantic segmentation method. However, lane lines on real roads may be blurred or broken because of occlusion, wear, and the like. The lane lines obtained by the prior art are therefore incomplete, with interruptions and missing segments, so the unmanned vehicle cannot drive safely along them.
Disclosure of Invention
The invention provides a lane line generation method and device, which realize the lane line generation process for an unmanned vehicle and ensure that the unmanned vehicle can drive safely along a continuous and complete lane line.
In a first aspect, the present invention provides a lane line generating method, including:
determining a reflection value base map of a lane line, wherein the reflection value base map is used for representing reflection information of the lane line;
determining a plurality of lane points corresponding to lane lines in the reflection value base map;
inputting the reflection value base map into a pre-trained neural network model, and sequentially determining attribute information of each lane point;
and generating a lane line according to the attribute information of each lane point.
Optionally, the inputting the reflection value base map into a pre-trained neural network model, and sequentially determining attribute information of each lane point includes:
and inputting the reflection value base map into a pre-trained neural network model, and determining whether a first lane point is a stopping point, wherein the first lane point is any one of all lane points.
Optionally, the generating a lane line according to the attribute information of each lane point includes:
and if the first lane point is a stopping point, determining that the lane line is interrupted at the first lane point.
Optionally, the generating a lane line according to the attribute information of each lane point includes:
if the first lane point is not the stopping point, determining the advancing direction and the advancing step length of a second lane point, wherein the second lane point is the next lane point of the first lane point;
and generating the lane line according to the advancing direction and the advancing step length of the second lane point.
Optionally, the generating the lane line according to the advancing direction and the advancing step length of the second lane point includes:
determining the position of the second lane point according to the advancing direction and the advancing step length of the second lane point;
and generating the lane line according to the positions of the first lane point and the second lane point.
Optionally, the determining a reflection value base map of the lane line includes:
acquiring point cloud data containing lane lines;
and mapping the point cloud data to a two-dimensional space to obtain a reflection value base map of the lane line.
Optionally, the determining a plurality of lane points corresponding to lane lines in the reflection value base map includes:
and sampling the lane line to obtain a plurality of lane points corresponding to the lane line.
In a second aspect, the present invention provides a lane line generation apparatus, including:
the device comprises a determining module, a judging module and a judging module, wherein the determining module is used for determining a reflection value base map of a lane line, and the reflection value base map is used for representing reflection information of the lane line;
the determining module is further configured to determine a plurality of lane points corresponding to lane lines in the reflection value base map;
the determining module is further configured to input the reflection value base map into a pre-trained neural network model, and sequentially determine attribute information of each lane point;
and the generating module is used for generating the lane line according to the attribute information of each lane point.
Optionally, the determining module is specifically configured to input the reflection value base map into a pre-trained neural network model, and determine whether a first lane point is a stopping point, where the first lane point is any one of all lane points.
Optionally, the determining module is configured to determine that the lane line is interrupted at the first lane point when the first lane point is determined to be the stopping point.
Optionally, the determining module is configured to determine, when it is determined that the first lane point is not the stopping point, a forward direction and a forward step length of a second lane point, where the second lane point is a next lane point of the first lane point;
the generating module is configured to generate the lane line according to the advancing direction and the advancing step length of the second lane point.
Optionally, the determining module is configured to determine the position of the second lane point according to the advancing direction and the advancing step length of the second lane point;
the generating module is configured to generate the lane line according to the positions of the first lane point and the second lane point.
Optionally, the apparatus further comprises:
the acquisition module is used for acquiring point cloud data containing lane lines;
and the determining module is used for mapping the point cloud data to a two-dimensional space to obtain a reflection value base map of the lane line.
Optionally, the determining module is configured to perform sampling processing on the lane line to obtain a plurality of lane points corresponding to the lane line.
In a third aspect, the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the lane line generation method of the first aspect.
In a fourth aspect, the present invention provides an electronic device comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the lane line generation method of the first aspect via execution of the executable instructions.
The invention provides a lane line generation method and device. A plurality of lane points corresponding to a lane line are determined in a reflection value base map, where the reflection value base map represents reflection information of the lane line. Because the pre-trained neural network model can predict the attribute information of each lane point corresponding to the lane line in the reflection value base map, inputting the base map into the model allows the attribute information of each lane point to be determined in turn, and a continuous and complete lane line is then generated from that attribute information. The invention thus realizes a continuous and complete lane line generation process, so that the unmanned vehicle can drive safely along the lane line. It solves the prior art problem that lane lines become unclear or broken because of occlusion, wear, and the like, which prevents safe driving; it avoids traffic accidents caused by an unmanned vehicle driving along a damaged or missing lane line; and it improves the safety of the unmanned vehicle, ensuring safe travel for the user.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a schematic flow chart of a lane line generation method provided by the present invention;
FIG. 2 is a schematic flow chart of a lane line generation method according to the present invention;
FIG. 3 is a schematic flow chart of a lane line generation method according to the present invention;
fig. 4 is a schematic structural diagram of a lane line generating device according to the present invention;
fig. 5 is a schematic structural diagram of a lane line generating device according to the present invention;
fig. 6 is a schematic diagram of a hardware structure of the electronic device provided by the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The lane line generation method and device provided by the embodiment are suitable for various application scenes such as various unmanned vehicles. In this embodiment, by obtaining the reflection value base map representing the reflection information of the real lane line and inputting the reflection value base map of the lane line into the pre-trained neural network model, the attribute information of each lane point on the lane line can be obtained, and then, according to the attribute information of each lane point on the lane line, a complete and continuous lane line can be generated, so that the unmanned vehicle can safely drive according to the lane line, and the safety performance of the unmanned vehicle is improved.
Next, a specific implementation procedure of the lane line generation method according to the present embodiment will be described in detail with a server as an execution subject. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 1 is a schematic flow chart of a lane line generation method provided by the present invention, and as shown in fig. 1, the lane line generation method of this embodiment may include:
S101, determining a reflection value base map of the lane line, wherein the reflection value base map is used for representing reflection information of the lane line.
S102, determining a plurality of lane points corresponding to the lane lines in the reflection value base map.
S103, inputting the reflection value base map into a pre-trained neural network model, and sequentially determining attribute information of each lane point.
And S104, generating a lane line according to the attribute information of each lane point.
Specifically, the server may collect and store reflection value base maps of lane lines with various kinds of damage, missing segments, and the like, where a reflection value base map represents the reflection information of a lane line. A lane line may be a marking that separates lanes in a road, or a marking that indicates the driving direction on the road; this embodiment does not limit the specific form of the lane line.
Further, this embodiment does not limit the specific form of the reflection value base map. The server can determine the reflection value base map of each lane line by acquiring an actual image of each lane line and applying the prior art, among other approaches. Optionally, the server acquires point cloud data containing the lane line, and maps the point cloud data into a two-dimensional space to obtain the reflection value base map of the lane line.
Specifically, the server may obtain point cloud data of the lane line in the current scene. In the prior art, laser scanning is mostly used to acquire point cloud data of the environment: when a laser beam strikes the surface of an object, the reflected beam carries information such as direction and distance. As the beam is scanned along a trajectory, the reflected laser point information is recorded; because the scanning is extremely fine, a large number of laser points are obtained, forming the laser point cloud data of the object. Point cloud data is thus a collection of a large number of points on the surface of the target. A point cloud acquired by laser measurement contains three-dimensional coordinates (XYZ) and laser reflection information; a point cloud acquired by photogrammetry contains only three-dimensional coordinates (XYZ); a point cloud acquired by combining both principles contains three-dimensional coordinates (XYZ) and laser reflection information. The server can then map the point cloud data into a two-dimensional space, converting the three-dimensional coordinates into two-dimensional coordinates and rendering each point according to its reflection information, to obtain the reflection value base map corresponding to the point cloud data.
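The projection described above can be sketched as follows. This is an illustrative implementation only, not code from the patent: the function name, grid size, resolution, and the choice of keeping the maximum intensity per cell are all assumptions.

```python
import numpy as np

def build_reflection_base_map(points, resolution=0.1, grid_size=512):
    """Project LiDAR points onto a 2D reflection value base map.

    `points` is an (N, 4) array of x, y, z, reflection intensity.
    The z coordinate is dropped; each grid cell keeps the maximum
    reflection intensity that falls into it (an assumed aggregation).
    """
    base_map = np.zeros((grid_size, grid_size), dtype=np.float32)
    # Discretize x/y coordinates into cell indices.
    cols = np.floor(points[:, 0] / resolution).astype(int)
    rows = np.floor(points[:, 1] / resolution).astype(int)
    # Keep only points that land inside the grid.
    valid = (0 <= cols) & (cols < grid_size) & (0 <= rows) & (rows < grid_size)
    for r, c, intensity in zip(rows[valid], cols[valid], points[valid, 3]):
        base_map[r, c] = max(base_map[r, c], intensity)
    return base_map
```

A per-cell maximum is one common choice for intensity rasterization; averaging the intensities per cell would be an equally plausible variant.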
The embodiment does not limit the specific implementation manner of converting the three-dimensional coordinate data into the two-dimensional coordinate data by the server.
Further, since the continuous lane line corresponding to each type of damaged or missing lane line is known, the server can train on the lane points in the reflection value base maps of the damaged lane lines together with the base maps of the corresponding continuous lane lines, learning to predict the attribute information of the lane points for each type of lane line, and thereby obtain the pre-trained neural network model.
The attribute information of each lane point may include, but is not limited to, whether the lane point is a stopping point, the forward direction of the lane point, and the forward step length of the lane point. The pre-trained neural network model may adopt a deep convolutional neural network structure, including but not limited to image segmentation models such as R-CNN (Regions with CNN features), SSD (Single Shot MultiBox Detector), and Mask R-CNN.
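The three attributes listed above can be represented as a simple per-point record. The field names, units, and angle convention below are illustrative choices, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class LanePointAttributes:
    """Per-lane-point output of the trained model (illustrative names)."""
    is_stop: bool          # whether this lane point is a stopping point
    direction_rad: float   # forward direction, here as an angle in radians
    step_m: float          # forward step length, here in meters
```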
It should be noted that: in this embodiment, reference may be made to the prior art for a specific implementation process of obtaining a pre-trained neural network model by a server based on a principle of a deep convolutional neural network structure, which is not described herein again.
Further, the server may obtain a reflection value base map of the lane line in the current scene, and then determine each corresponding lane point on the lane line in the reflection value base map. Optionally, the server may perform sampling processing on the lane line to obtain a plurality of lane points corresponding to the lane line. The number of lane points corresponding to the lane line may be determined according to an actual situation, which is not limited in this embodiment.
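The patent leaves the sampling scheme and the number of lane points open. One plausible choice, sketched below under that assumption, is to resample the lane line polyline at a fixed arc-length spacing:

```python
import numpy as np

def sample_lane_points(polyline, spacing=1.0):
    """Resample a lane line polyline at fixed arc-length spacing.

    `polyline` is an (N, 2) sequence of 2D vertices in base-map
    coordinates. Returns an (M, 2) array of sampled lane points.
    Fixed-interval sampling is an assumed scheme, not the patent's.
    """
    polyline = np.asarray(polyline, dtype=float)
    # Cumulative arc length at each vertex.
    seg = np.diff(polyline, axis=0)
    seg_len = np.hypot(seg[:, 0], seg[:, 1])
    arc = np.concatenate([[0.0], np.cumsum(seg_len)])
    # Target arc lengths at the requested spacing (endpoint included).
    targets = np.arange(0.0, arc[-1] + 1e-9, spacing)
    xs = np.interp(targets, arc, polyline[:, 0])
    ys = np.interp(targets, arc, polyline[:, 1])
    return np.stack([xs, ys], axis=1)
```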
Further, since the pre-trained neural network model can simulate the attribute information of each lane point corresponding to the lane line in the reflection value base map, the server can input the reflection value base map into the pre-trained neural network model, and the attribute information of each lane point can be sequentially determined. Further, the server can generate a continuous and complete lane line according to the attribute information of each lane point.
In the lane line generation method provided in this embodiment, a plurality of lane points corresponding to a lane line are determined in a reflection value base map, where the base map represents the reflection information of the lane line. Because the pre-trained neural network model can predict the attribute information of each lane point corresponding to the lane line in the base map, inputting the base map into the model allows the attribute information of each lane point to be determined in turn, and a continuous and complete lane line is generated from that attribute information. This embodiment realizes a continuous and complete lane line generation process, so that the unmanned vehicle can drive safely along the lane line; it solves the prior art problem that lane lines become unclear or broken because of occlusion, wear, and the like; it avoids traffic accidents caused by driving along a damaged or missing lane line; and it improves the safety of the unmanned vehicle, ensuring safe travel for the user.
On the basis of the above embodiment of fig. 1, a detailed description will be given, with reference to fig. 2, of a specific implementation process of inputting the reflection value base map into the pre-trained neural network model and sequentially determining the attribute information of each lane point in the embodiment S103 shown in fig. 1.
Fig. 2 is a schematic flow chart of the lane line generation method provided in the present invention, and as shown in fig. 2, the lane line generation method of this embodiment may include:
S201, determining a reflection value base map of the lane line, wherein the reflection value base map is used for representing reflection information of the lane line.
S202, determining a plurality of lane points corresponding to the lane lines in the reflection value base map.
S201 and S202 are similar to the implementation manners of S101 and S102 in the embodiment of fig. 1, and are not described herein again.
S203, inputting the reflection value base map into a pre-trained neural network model, and determining whether a first lane point is a stopping point, wherein the first lane point is any one of all lane points.
Specifically, the pre-trained neural network model can simulate the attribute information of each lane point corresponding to the lane line in the reflection value base map, so that the server can input the reflection value base map into the pre-trained neural network model, and can determine whether any one lane point is a stopping point, that is, whether the first lane point is a stopping point.
The stopping point refers to an end point of any one lane line, such as two ends of a direction line on a road, two ends of a sign line isolating each lane in the road, and the like, which is not limited in this embodiment.
Further, because the server can determine whether any lane point is a stopping point, it can locate the two ends of a complete lane line; it then connects the remaining lane points according to their attribute information to obtain a complete, continuous lane line along which the unmanned vehicle can drive safely.
And S204, generating a lane line according to the attribute information of each lane point.
S204 is similar to the implementation manner of S104 in the embodiment of fig. 1, and details of this embodiment are not repeated here.
In the lane line generation method provided in this embodiment, a plurality of lane points corresponding to a lane line are determined in a reflection value base map, where the base map represents the reflection information of the lane line. Because the pre-trained neural network model can predict, for each lane point of the lane line in the base map, whether that lane point is a stopping point, the base map is input into the model and each lane point is checked in turn; the lane points can then be joined to generate a continuous and complete lane line. This embodiment realizes a continuous and complete lane line generation process, so that the unmanned vehicle can drive safely along the lane line; it solves the prior art problem that lane lines become unclear or broken because of occlusion, wear, and the like; it avoids traffic accidents caused by driving along a damaged or missing lane line; and it improves the safety of the unmanned vehicle, ensuring safe travel for the user.
On the basis of the above-mentioned embodiment of fig. 2, a detailed description will be given, with reference to fig. 3, of a specific implementation process of generating a lane line according to the attribute information of each lane point in the embodiment S104 shown in fig. 2.
Fig. 3 is a schematic flow chart of the lane line generation method provided in the present invention, and as shown in fig. 3, the lane line generation method of this embodiment may include:
S301, determining a reflection value base map of the lane line, wherein the reflection value base map is used for representing reflection information of the lane line.
S302, determining a plurality of lane points corresponding to the lane lines in the reflection value base map.
S303, inputting the reflection value base map into a pre-trained neural network model, and determining whether a first lane point is a stopping point, wherein the first lane point is any one of all lane points.
S301, S302, and S303 are similar to the implementation manners of S201, S202, and S203 in the embodiment of fig. 2, and are not described herein again.
S304, judging whether the first lane point is a stopping point: if so, execute S305; otherwise, execute S306-S307.
Specifically, the server may determine whether the first lane point is a stopping point, so as to determine both ends and a middle portion of the lane line. When the first lane point is the stopping point, the server may perform S305. When the first lane point is not the stopping point, the server may perform S306-S307.
S305, determining that the lane line is interrupted at the first lane point.
S306, determining the advancing direction and the advancing step length of a second lane point, wherein the second lane point is the next lane point of the first lane point.
And S307, generating a lane line according to the advancing direction and the advancing step length of the second lane point.
Specifically, when the first lane point is determined to be a stopping point, the server can determine that the lane line is interrupted at the first lane point, that is, the first lane point lies at an end of the lane line. When the first lane point is determined not to be a stopping point, the server can determine that the first lane point lies in the middle of the lane line; in that case the server determines the forward direction and forward step length of the next lane point, that is, of the second lane point, from those of the first lane point. The server can then determine the position of the second lane point from its forward direction and forward step length, and generate the lane line according to the positions of the first and second lane points.
The first lane point and the second lane point are both lane points on the current damaged lane line. The advancing direction and the specific size of the advancing step length of the lane point may be set in advance in a pre-trained neural network model, which is not limited in this embodiment.
By analogy, the server can determine the advancing direction and the advancing step length of each lane point, and thus the server can determine the positions of all the lane points, so that the server can accurately determine the two ends and the middle part of a lane line, so that the lane points are connected to generate a complete and continuous lane line, an unmanned vehicle can safely run according to the lane line, and the passing safety of a user is guaranteed.
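Steps S304–S307, applied lane point by lane point as described above, amount to the following tracing loop. This is an illustrative sketch: `predict` stands in for the trained neural network, and the function names, the radian-angle convention, and the iteration cap are all assumptions.

```python
import math

def trace_lane_line(start, predict, max_points=1000):
    """Trace one lane line from `start` using per-point predictions.

    `predict(x, y)` returns (is_stop, direction_rad, step) for the
    lane point at (x, y), standing in for the trained network.
    Mirrors S304-S307: stop at a stopping point, otherwise advance
    by the predicted forward direction and forward step length.
    """
    points = [tuple(start)]
    x, y = start
    for _ in range(max_points):  # hard cap guards against non-termination
        is_stop, direction, step = predict(x, y)
        if is_stop:  # S305: the lane line is interrupted at this point
            break
        # S306-S307: the next lane point follows from direction and step.
        x += step * math.cos(direction)
        y += step * math.sin(direction)
        points.append((x, y))
    return points
```

Connecting the returned points in order yields one continuous lane line segment between two stopping points.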
In the lane line generation method provided in this embodiment, a plurality of lane points corresponding to a lane line are determined in a reflection value base map, where the base map represents the reflection information of the lane line. Because the pre-trained neural network model can predict the attribute information of each lane point of the lane line in the base map, the base map is input into the model and each lane point is checked in turn to determine whether it is a stopping point. This embodiment realizes a continuous and complete lane line generation process, so that the unmanned vehicle can drive safely along the lane line; it solves the prior art problem that lane lines become unclear or broken because of occlusion, wear, and the like; it avoids traffic accidents caused by driving along a damaged or missing lane line; and it improves the safety of the unmanned vehicle, ensuring safe travel for the user.
Fig. 4 is a schematic structural diagram of the lane line generating device provided in the present invention, and as shown in fig. 4, the lane line generating device 40 of this embodiment may include:
the determining module 41 is configured to determine a reflection value base map of the lane line, where the reflection value base map is used to represent reflection information of the lane line;
the determining module 41 is further configured to determine a plurality of lane points corresponding to lane lines in the reflection value base map;
the determining module 41 is further configured to input the reflection value base map into a pre-trained neural network model, and sequentially determine attribute information of each lane point;
and the generating module 42 is configured to generate a lane line according to the attribute information of each lane point.
Optionally, the determining module 41 is specifically configured to input the reflection value base map into a pre-trained neural network model, and determine whether a first lane point is a stopping point, where the first lane point is any one of all lane points.
Optionally, the determining module 41 is configured to determine that the lane line is interrupted at the first lane point when the first lane point is determined to be the stopping point.
Optionally, the determining module 41 is configured to determine, when it is determined that the first lane point is not the stop point, a forward direction and a forward step length of a second lane point, where the second lane point is a next lane point of the first lane point;
and the generating module 42 is configured to generate a lane line according to the advancing direction and the advancing step length of the second lane point.
Optionally, the determining module 41 is specifically configured to determine the position of the second lane point according to the advancing direction and the advancing step length of the second lane point;
the generating module 42 is specifically configured to generate a lane line according to the positions of the first lane point and the second lane point.
The lane line generating apparatus of this embodiment may be used to implement the technical solutions of the method embodiments shown in fig. 1 to fig. 3, and the implementation principles and technical effects thereof are similar and will not be described herein again.
Fig. 5 is a schematic structural diagram of the lane line generating device provided in the present invention, and as shown in fig. 5, the lane line generating device of this embodiment further includes, on the basis of the device structure shown in fig. 4:
an obtaining module 43, configured to obtain point cloud data including a lane line;
and the determining module 41 is configured to map the point cloud data into a two-dimensional space to obtain a reflection value base map of the lane line.
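One common way to realize this mapping — projecting each LiDAR return onto a ground-plane grid and accumulating its reflectance (intensity) per cell — is sketched below. The 0.1 m cell resolution, the grid size, and the per-cell averaging are illustrative assumptions; the embodiment only specifies that the point cloud is mapped into a two-dimensional space.

```python
import numpy as np

def build_reflection_base_map(points, resolution=0.1, grid_shape=(400, 400)):
    """Project a point cloud (N, 4 array of [x, y, z, intensity]) onto a
    2-D grid, averaging intensity per cell -> reflection value base map."""
    h, w = grid_shape
    base_map = np.zeros((h, w), dtype=np.float32)
    counts = np.zeros((h, w), dtype=np.int32)
    # discard height (z); bin x/y coordinates into grid cells
    cols = (points[:, 0] / resolution).astype(int)
    rows = (points[:, 1] / resolution).astype(int)
    valid = (rows >= 0) & (rows < h) & (cols >= 0) & (cols < w)
    for r, c, i in zip(rows[valid], cols[valid], points[valid, 3]):
        base_map[r, c] += i
        counts[r, c] += 1
    nonzero = counts > 0
    base_map[nonzero] /= counts[nonzero]  # mean reflectance per cell
    return base_map
```

Lane markings are painted with highly retroreflective material, so cells covering a lane line show markedly higher mean intensity than the surrounding asphalt in the resulting base map.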
Optionally, the determining module 41 is configured to perform sampling processing on the lane line to obtain a plurality of lane points corresponding to the lane line.
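One plausible reading of this sampling step — resampling a detected lane-line polyline at a fixed arc-length interval to obtain discrete lane points — could look like the sketch below. The fixed-interval strategy is an assumption for illustration; the embodiment does not prescribe a particular sampling scheme.

```python
import math

def sample_lane_points(polyline, interval=1.0):
    """Resample a lane-line polyline (list of (x, y) vertices) at a
    fixed arc-length interval, returning the discrete lane points."""
    samples = [polyline[0]]
    carried = 0.0  # arc length already covered toward the next sample
    for (x0, y0), (x1, y1) in zip(polyline, polyline[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        d = interval - carried
        while d <= seg:
            t = d / seg  # linear interpolation along the segment
            samples.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
            d += interval
        carried = (carried + seg) % interval
    return samples
```

The resulting evenly spaced points are the per-point units whose attribute information (stopping point, advancing direction, advancing step length) the neural network model then predicts.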
The lane line generating apparatus of this embodiment may be used to implement the technical solutions of the method embodiments shown in fig. 1 to fig. 3, and the implementation principles and technical effects thereof are similar and will not be described herein again.
In the present invention, the lane line generation device may be divided into functional modules according to the above method: for example, each function may be assigned its own module, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of the modules in the embodiments of the present invention is schematic and is only a logical function division; other division manners are possible in actual implementation.
Fig. 6 is a schematic diagram of a hardware structure of the electronic device provided by the present invention. As shown in fig. 6, the electronic device 60 is configured to implement the operation corresponding to the server in any of the method embodiments described above, where the electronic device 60 of this embodiment may include: a memory 61 and a processor 62;
a memory 61 for storing a computer program;
a processor 62 for executing the computer program stored in the memory to implement the lane line generation method in the above-described embodiments. Reference may be made in particular to the description relating to the method embodiments described above.
Alternatively, the memory 61 may be separate or integrated with the processor 62.
When the memory 61 is a device separate from the processor 62, the electronic device 60 may further include:
a bus 63 for connecting the memory 61 and the processor 62.
Optionally, this embodiment further includes: a communication interface 64, where the communication interface 64 may be connected to the processor 62 via the bus 63. The processor 62 may control the communication interface 64 to implement the above-described receiving and transmitting functions of the electronic device 60.
The electronic device provided in this embodiment may be used to execute the lane line generation method, and the implementation manner and the technical effect thereof are similar, and this embodiment is not described herein again.
The present invention also provides a computer-readable storage medium including a computer program for implementing the lane line generation method in the above embodiment.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of modules is only one logical division, and other divisions may be realized in practice, for example, a plurality of modules may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be in an electrical, mechanical or other form.
Modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present invention may be integrated into one processing unit, or each module may exist alone physically, or two or more modules are integrated into one unit. The unit formed by the modules can be realized in a hardware form, and can also be realized in a form of hardware and a software functional unit.
The integrated module implemented in the form of a software functional module may be stored in a computer-readable storage medium. The software functional module is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to execute some steps of the methods according to the embodiments of the present application.
It should be understood that the Processor may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), etc. A general purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of a method disclosed in connection with the present invention may be embodied directly in a hardware processor, or executed by a combination of hardware and software modules within the processor.
The memory may comprise a high-speed RAM memory and may further comprise a non-volatile memory (NVM), such as at least one disk memory; it may also be a USB flash drive, a removable hard disk, a read-only memory, a magnetic disk, an optical disk, etc.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, the buses in the figures of the present application are not limited to only one bus or one type of bus.
The computer-readable storage medium may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks. A storage media may be any available media that can be accessed by a general purpose or special purpose computer.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.
Claims (10)
1. A lane line generation method, comprising:
determining a reflection value base map of a lane line, wherein the reflection value base map is used for representing reflection information of the lane line;
determining a plurality of lane points corresponding to lane lines in the reflection value base map;
inputting the reflection value base map into a pre-trained neural network model, and sequentially determining attribute information of each lane point, wherein the attribute information of a lane point comprises whether the lane point is a stopping point, the advancing direction of the lane point, and the advancing step length of the lane point;
and generating a lane line according to the attribute information of each lane point.
2. The method of claim 1, wherein inputting the reflection value base map into a pre-trained neural network model, and sequentially determining attribute information of each lane point comprises:
and inputting the reflection value base map into a pre-trained neural network model, and determining whether a first lane point is a stopping point, wherein the first lane point is any one of all lane points.
3. The method according to claim 2, wherein the generating a lane line according to the attribute information of each lane point comprises:
and if the first lane point is a stopping point, determining that the lane line is interrupted at the first lane point.
4. The method according to claim 2, wherein the generating a lane line according to the attribute information of each lane point comprises:
if the first lane point is not the stopping point, determining the advancing direction and the advancing step length of a second lane point, wherein the second lane point is the next lane point of the first lane point;
and generating the lane line according to the advancing direction and the advancing step length of the second lane point.
5. The method of claim 4, wherein the generating the lane line according to the heading and the heading step of the second lane point comprises:
determining the position of the second lane point according to the advancing direction and the advancing step length of the second lane point;
and generating the lane line according to the positions of the first lane point and the second lane point.
6. The method of any one of claims 1-5, wherein determining the base map of reflection values for the lane lines comprises:
acquiring point cloud data containing lane lines;
and mapping the point cloud data to a two-dimensional space to obtain a reflection value base map of the lane line.
7. The method according to any one of claims 1-5, wherein the determining a plurality of lane points corresponding to lane lines in the reflection value base map comprises:
and sampling the lane line to obtain a plurality of lane points corresponding to the lane line.
8. A lane line generation device, comprising:
the device comprises a determining module, a judging module and a judging module, wherein the determining module is used for determining a reflection value base map of a lane line, and the reflection value base map is used for representing reflection information of the lane line;
the determining module is further configured to determine a plurality of lane points corresponding to lane lines in the reflection value base map;
the determining module is further configured to input the reflection value base map into a pre-trained neural network model, and sequentially determine attribute information of each lane point, where the attribute information of a lane point comprises whether the lane point is a stopping point, the advancing direction of the lane point, and the advancing step length of the lane point;
and the generating module is used for generating the lane line according to the attribute information of each lane point.
9. A computer-readable storage medium on which a computer program is stored, the computer program, when being executed by a processor, implementing the lane line generation method according to any one of claims 1 to 7.
10. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the lane line generation method of any of claims 1-7 via execution of the executable instructions.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811290730.4A CN109636841B (en) | 2018-10-31 | 2018-10-31 | Lane line generation method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811290730.4A CN109636841B (en) | 2018-10-31 | 2018-10-31 | Lane line generation method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109636841A CN109636841A (en) | 2019-04-16 |
CN109636841B true CN109636841B (en) | 2021-06-01 |
Family
ID=66066982
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811290730.4A Active CN109636841B (en) | 2018-10-31 | 2018-10-31 | Lane line generation method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109636841B (en) |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106494406A (en) * | 2015-09-08 | 2017-03-15 | 星克跃尔株式会社 | Bend guidance method, bend guider, electronic installation and program |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102521589B (en) * | 2011-11-18 | 2013-06-12 | 深圳市宝捷信科技有限公司 | Method and system for detecting lane marked lines |
CN104766058B (en) * | 2015-03-31 | 2018-04-27 | 百度在线网络技术(北京)有限公司 | A kind of method and apparatus for obtaining lane line |
CN111542860A (en) * | 2016-12-30 | 2020-08-14 | 迪普迈普有限公司 | Sign and lane creation for high definition maps for autonomous vehicles |
CN108470159B (en) * | 2018-03-09 | 2019-12-20 | 腾讯科技(深圳)有限公司 | Lane line data processing method and device, computer device and storage medium |
- 2018-10-31 CN CN201811290730.4A patent/CN109636841B/en active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106494406A (en) * | 2015-09-08 | 2017-03-15 | 星克跃尔株式会社 | Bend guidance method, bend guider, electronic installation and program |
Also Published As
Publication number | Publication date |
---|---|
CN109636841A (en) | 2019-04-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109032103B (en) | Method, device and equipment for testing unmanned vehicle and storage medium | |
CN109711242B (en) | Lane line correction method, lane line correction device, and storage medium | |
CN112199991B (en) | Simulation point cloud filtering method and system applied to vehicle-road cooperation road side perception | |
JP6850324B2 (en) | Obstacle distribution simulation method, device, terminal and program based on multi-model | |
CN113761999B (en) | Target detection method and device, electronic equipment and storage medium | |
CN109509236B (en) | Vehicle bounding box generation method and device in unmanned scene and storage medium | |
CN112154448A (en) | Target detection method and device and movable platform | |
EP4345773A1 (en) | Lane line extraction method and apparatus, vehicle and storage medium | |
CN115273039B (en) | Small obstacle detection method based on camera | |
CN113392793A (en) | Method, device, equipment, storage medium and unmanned vehicle for identifying lane line | |
CN113297958A (en) | Automatic labeling method and device, electronic equipment and storage medium | |
CN109300322B (en) | Guideline drawing method, apparatus, device, and medium | |
CN109598199B (en) | Lane line generation method and device | |
JP2020042792A (en) | Obstacle position simulation method, device, and terminal based on statistics | |
CN112639822B (en) | Data processing method and device | |
CN109636841B (en) | Lane line generation method and device | |
CN111380529B (en) | Mobile device positioning method, device and system and mobile device | |
US20210207964A1 (en) | Verification Method And Device For Modeling Route, Unmanned Vehicle, And Storage Medium | |
CN113160406B (en) | Road three-dimensional reconstruction method and device, storage medium and electronic equipment | |
CN114779210A (en) | Method for generating barrier oriented bounding box based on unmanned vehicle | |
US11205289B2 (en) | Method, device and terminal for data augmentation | |
CN113619606A (en) | Obstacle determination method, apparatus, device and storage medium | |
CN118043864A (en) | Obstacle recognition method and device, storage medium and electronic equipment | |
CN112184605A (en) | Method, equipment and system for enhancing vehicle driving visual field | |
CN114445648A (en) | Obstacle recognition method, apparatus and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||