CN112163475A - Method and device for determining lane line direction - Google Patents
Method and device for determining lane line direction
- Publication number: CN112163475A
- Application number: CN202010967447.1A
- Authority
- CN
- China
- Prior art keywords
- lane line
- driving
- road
- track
- determining
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
Abstract
This specification discloses a method and a device for determining lane line directions. Road point cloud data collected by an acquisition device, together with the driving track along which the acquisition device collected that data, are acquired; a road image is then generated from the point cloud data, and each lane line contained in the road image is determined. For each determined lane line, the part of the driving track that matches the lane line may be determined as a target track. The driving direction of the road where the lane line is located can then be determined from the driving direction of the acquisition device on the target track, taken as the lane line direction corresponding to the lane line, and updated in a preset electronic map. Compared with the prior art, this improves the efficiency of determining lane line directions.
Description
Technical Field
The present disclosure relates to the field of unmanned driving technologies, and in particular, to a method and an apparatus for determining a lane line direction.
Background
At present, an unmanned device can use a high-precision map to navigate itself while driving, so as to ensure its own driving safety.
In practical applications, the lane line direction of each lane line in a road needs to be marked in the high-precision map so as to better navigate the unmanned device. In the prior art, lane line directions in the high-precision map can be marked manually, but this not only incurs labor cost; the efficiency of marking lane line directions is also low.
Therefore, how to improve the efficiency of determining lane line directions in a high-precision map while reducing the cost of doing so is an urgent problem to be solved.
Disclosure of Invention
The present disclosure provides a method and an apparatus for determining a lane line direction, which partially solve the above problems in the prior art.
The technical scheme adopted by the specification is as follows:
the present specification provides a method of determining lane line direction, comprising:
acquiring road point cloud data collected by an acquisition device and the driving track along which the acquisition device collected the road point cloud data;
generating a road image according to the road point cloud data, and determining each lane line contained in the road image;
for each determined lane line, determining a part of the driving track that matches the lane line as a target track;
and determining the driving direction of the road where the lane line is located according to the driving direction of the acquisition equipment on the target track, taking the driving direction as the lane line direction corresponding to the lane line, and updating the lane line direction in a preset electronic map.
Optionally, determining a part of the driving track matched with the lane line in the driving track, as a target track, specifically includes:
determining each track point in the driving track;
for each image coordinate point of the lane line, determining the track point closest to that image coordinate point among the track points as the track point corresponding to the image coordinate point;
and taking a driving track formed by track points corresponding to the image coordinate points of the lane line as a target track.
Optionally, determining, according to the driving direction of the acquisition device on the target track, a driving direction of a road where the lane line is located, as a lane line direction corresponding to the lane line, specifically including:
sorting image coordinate points of the lane line in the road image according to a preset sorting mode to obtain a sorting result, and determining the reference direction of the lane line according to the sorting result;
and determining the driving direction of the lane line on the road according to the driving direction of the acquisition equipment on the target track and the determined reference direction, and taking the driving direction as the lane line direction corresponding to the lane line.
Optionally, determining, according to the driving direction corresponding to the position of the acquisition device on the target track and the determined reference direction, a driving direction of a road on which the lane line is located, as a lane line direction corresponding to the lane line, specifically includes:
judging whether the reference direction is matched with the driving direction;
and if the reference direction is determined to be matched with the driving direction, taking the reference direction as the lane line direction corresponding to the lane line, otherwise, adjusting the reference direction according to the driving direction, and determining the driving direction of the road where the lane line is located according to the adjusted reference direction as the lane line direction corresponding to the lane line.
Optionally, determining each lane line included in the road image specifically includes:
and inputting the road image into a pre-trained lane line recognition model to determine each lane line contained in the road image.
Optionally, training the lane line recognition model specifically includes:
acquiring each sample image;
for each sample image, inputting the sample image into the lane line recognition model to obtain a recognized lane line;
and training the lane line recognition model with minimizing the difference between the recognized lane line and the lane line marked in the sample image as the optimization objective.
Optionally, the method further comprises:
determining, for each lane line, the lane line direction of its adjacent lane line, wherein the lane line and the adjacent lane line are lane lines on the same side of the road;
judging whether the lane line direction of the adjacent lane line is consistent with the lane line direction of the lane line;
and if the lane line direction of the adjacent lane line is determined to be inconsistent with the lane line direction of the lane line, correcting the lane line direction of the lane line according to the lane line direction of the adjacent lane line.
The present specification provides an apparatus for determining a lane line direction, comprising:
the acquisition module is used for acquiring road point cloud data collected by an acquisition device and the driving track along which the acquisition device collected the road point cloud data;
the lane line determining module is used for generating a road image according to the road point cloud data and determining each lane line contained in the road image;
the track determining module is used for determining, for each determined lane line, a part of the driving track that matches the lane line as a target track;
and the direction determining module is used for determining the driving direction of the road where the lane line is located according to the driving direction of the acquisition equipment on the target track, taking the driving direction as the direction of the lane line corresponding to the lane line, and updating the direction of the lane line in a preset electronic map.
The present specification provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the above-described method of determining a lane line direction.
The present specification provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the above method of determining lane line direction when executing the program.
The technical scheme adopted by the specification can achieve the following beneficial effects:
in the method for determining the lane line direction provided in this specification, road point cloud data collected by an acquisition device and the driving track along which the acquisition device collected the road point cloud data are acquired, a road image is generated from the road point cloud data, and each lane line contained in the road image is determined. For each determined lane line, the part of the driving track that matches the lane line may be determined as a target track. The driving direction of the road where the lane line is located is then determined from the driving direction of the acquisition device on the target track, taken as the lane line direction corresponding to the lane line, and updated in a preset electronic map.
In this method, a road image containing lane lines can be obtained entirely from the road point cloud data collected by the acquisition device, and once the road image is obtained, the direction of each lane line in it can be determined from the direction of the acquisition device's driving track. Compared with the prior art, in which lane line directions must be determined manually, the method determines them automatically, which improves the efficiency of determining lane line directions and reduces cost.
Drawings
The accompanying drawings, which are included to provide a further understanding of the specification and are incorporated in and constitute a part of it, illustrate embodiments of the specification and, together with the description, serve to explain the specification without limiting it. In the drawings:
FIG. 1 is a schematic flow chart of a method for determining lane line direction in the present specification;
FIG. 2 is a schematic view of a road image provided herein;
FIG. 3 is a schematic illustration of a lane line in a determined road image provided herein;
FIG. 4 is a schematic view of a driving direction and a reference direction provided herein;
fig. 5A and 5B are schematic diagrams illustrating an error in a determined lane line direction according to the present disclosure;
FIG. 6 is a schematic view of an adjacent lane line provided herein;
FIG. 7 is a schematic view of an apparatus for determining lane line direction provided herein;
fig. 8 is a schematic diagram of an electronic device corresponding to fig. 1 provided in the present specification.
Detailed Description
To make the objects, technical solutions, and advantages of this specification clearer, the technical solutions are described clearly and completely below with reference to specific embodiments of this specification and the accompanying drawings. The described embodiments are only a subset of the embodiments of this specification, not all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in this specification without creative effort fall within its protection scope.
The technical solutions provided by the embodiments of the present description are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic flow chart of a method for determining a lane line direction in this specification, which specifically includes the following steps:
s101: acquiring road point cloud data acquired by acquisition equipment and a driving track on which the road point cloud data is acquired by the acquisition equipment.
In practical applications, the service platform needs to determine the lane line direction of each lane line in the electronic map. Lane line directions determined in the electronic map not only help an unmanned device or ordinary vehicle navigate itself while driving, but also allow the service platform to divide the map into roads and lanes, determine lane directions, determine the connectivity relationships between lanes, and so on, according to the lane line directions.
Accordingly, the service platform can acquire the road point cloud data collected by the acquisition device and the driving track along which it was collected. Road point cloud data here refers to point cloud data collected by the acquisition device while driving on the road, and the acquisition device may be an ordinary vehicle, an unmanned aerial vehicle, or the like used to collect point cloud data. The service platform may obtain the driving track through a positioning device installed on the acquisition device; the positioning device may use a conventional positioning method such as the Global Positioning System (GPS) or base-station positioning, which is not limited here.
The above-mentioned unmanned device may be an unmanned vehicle, a robot, an automatic delivery device, or any other device capable of automatic driving. The method can accordingly be applied to determining lane line directions in an electronic map that the unmanned device uses for navigation during automatic driving, ensuring that it can carry out delivery tasks through the electronic map, for example in service scenarios such as express delivery, logistics, and takeaway delivery.
The execution subject of the method for determining the lane line direction provided by this specification may be a large-scale service platform or server, or a notebook or desktop computer. For convenience of description, the method is described below with the service platform as the execution subject.
S102: and generating a road image according to the road point cloud data, and determining each lane line contained in the road image.
After the service platform acquires the road point cloud data, it can generate a road image from it. The service platform may use only the points below a set height as the point cloud data for determining the road image; the set height can be chosen according to actual needs. Filtering by height screens out, to a certain extent, useless points belonging to vehicles, trees, and the like in the road, leaving the points useful for obtaining a lane line image. The service platform then projects the point cloud data into an image according to the laser reflectivity of each point, obtaining a road image as shown in fig. 2.
Fig. 2 is a schematic view of a road image provided in the present specification.
As can be seen from fig. 2, the grid part of the road image is a median strip, the white straight lines are lane lines, and the white arrows are guide arrows. The road image is formed by converting the laser reflectivity of each point into a gray value: the higher the reflectivity, the higher the gray value. Since the laser reflectivity of a lane line is higher than that of the road surface, a region with higher gray values, i.e., a whiter region of the road image, is more likely to be a lane line. The road image referred to in this specification is thus an image converted from collected road point cloud data according to laser reflectivity, rather than an image captured directly by a camera.
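The projection described above can be sketched as follows, assuming NumPy; the function name `point_cloud_to_road_image` and the concrete `height_limit` and `resolution` values are illustrative assumptions, since the patent only specifies "a set height chosen according to actual needs".

```python
import numpy as np

def point_cloud_to_road_image(points, reflectivity, height_limit=0.5,
                              resolution=0.1):
    """Project a road point cloud onto a top-down grayscale road image.

    points: (N, 3) array of x, y, z in metres.
    reflectivity: (N,) laser reflectivity values in [0, 1].
    """
    # Keep only points below the set height, screening out vehicles, trees, etc.
    mask = points[:, 2] < height_limit
    pts, refl = points[mask], reflectivity[mask]

    # Map world x/y onto pixel columns/rows at the chosen resolution.
    cols = ((pts[:, 0] - pts[:, 0].min()) / resolution).astype(int)
    rows = ((pts[:, 1] - pts[:, 1].min()) / resolution).astype(int)

    # Higher laser reflectivity -> higher gray value, so lane paint shows white.
    image = np.zeros((rows.max() + 1, cols.max() + 1), dtype=np.uint8)
    np.maximum.at(image, (rows, cols), (refl * 255).astype(np.uint8))
    return image
```

A cell hit by several points keeps the brightest return here; averaging the returns per cell would be an equally plausible reading of the patent.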
In this specification, after determining the road image, the service platform may determine the lane lines in it by inputting the road image into a pre-trained lane line recognition model; the determined lane lines may be as shown in fig. 3.
Fig. 3 is a schematic diagram of a determined lane line in a road image provided by the present specification.
Fig. 3 corresponds to fig. 2: when the service platform recognizes the lane lines in the road image of fig. 2 through the lane line recognition model, the lane lines shown in fig. 3 are obtained. As can be seen from fig. 3, the model recognizes the lane lines in the road image of fig. 2 and presents them in line form.
In this specification, the lane line recognition model may be trained in advance by supervised learning. The service platform obtains sample images for training, each with marked lane lines. For each sample image, the service platform inputs it into the lane line recognition model; after the model recognizes the lane lines in the sample image, the service platform trains it with minimizing the difference between the recognized lane lines and the marked lane lines as the training objective. The algorithm used by the model may vary, e.g., a neural network or a Support Vector Machine (SVM), and is not limited here.
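The patent leaves the recognition model open (a neural network, an SVM, etc.), so the following is only a toy stand-in illustrating the supervised setup it describes: a per-pixel logistic regression on gray values, trained to minimize the difference between predicted and marked lane pixels. The function name, model choice, and hyperparameters are all assumptions, not the patent's method.

```python
import numpy as np

def train_pixel_classifier(sample_images, label_masks, lr=0.5, epochs=200):
    """Toy supervised training loop: logistic regression on gray values.

    sample_images: list of uint8 grayscale road images.
    label_masks: list of 0/1 arrays marking the labelled lane line pixels.
    Returns the weight and bias of a per-pixel lane/background classifier.
    """
    x = np.concatenate([img.reshape(-1) for img in sample_images]) / 255.0
    y = np.concatenate([m.reshape(-1) for m in label_masks]).astype(float)
    w, b = 0.0, 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(w * x + b)))  # predicted lane probability
        grad = p - y                            # cross-entropy gradient
        w -= lr * np.mean(grad * x)             # minimise difference between
        b -= lr * np.mean(grad)                 # recognised and marked pixels
    return w, b
```

Because lane paint reflects more strongly than asphalt, the learned weight on gray value comes out positive: brighter pixels are classified as lane.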
S103: and determining a part of the driving track matched with each lane line in the driving track as a target track aiming at each determined lane line.
After the service platform determines the lane lines in the road image, it can determine, for each determined lane line, the part of the driving track that matches the lane line as the target track. That is, for each lane line, the service platform takes the part of the driving track closer to that lane line as its matched target track; in the subsequent process, the lane line direction is determined from the driving direction corresponding to the target track.
When determining the target track of a lane line, the service platform can determine each track point in the driving track and, for each image coordinate point of the lane line, take the track point closest to that image coordinate point as its corresponding track point. The driving track formed by the track points corresponding to the lane line's image coordinate points then serves as the target track.
That is, this approach determines, for each image coordinate point of the lane line, the closest track point in one-to-one correspondence, and these track points form the target track matched with the lane line. The image coordinate points represent the coordinates of the lane line in the image, while the track points represent the geographic position of the acquisition device in the real world. Therefore, for each image coordinate point, the service platform can first convert it into a coordinate point in the world coordinate system and then determine the closest track point based on that world coordinate.
Besides the above manner, the service platform may determine the target track matched with a lane line in various ways. For example, it may determine, for each track point, the distance between the track point and the lane line, sort the track points by that distance, take the track points ranked before a set rank, and use the driving track formed by those points as the target track. This target track is thus composed of the track points closer to the lane line. The set rank can be chosen according to actual needs.
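The nearest-track-point matching of S103 can be sketched as follows, assuming NumPy; `match_target_track` is a hypothetical name, and the lane line points are assumed to be already converted to world coordinates as described above.

```python
import numpy as np

def match_target_track(lane_points, track_points):
    """Pick, for each lane line coordinate point, the nearest track point.

    lane_points: (M, 2) lane line points in world coordinates, in lane
    line order.
    track_points: (K, 2) positions of the acquisition device.
    Returns the (M, 2) target track formed by the matched track points.
    """
    lane = np.asarray(lane_points, dtype=float)
    track = np.asarray(track_points, dtype=float)
    # Pairwise distance matrix of shape (M, K), then one nearest
    # track point per lane line point.
    dists = np.linalg.norm(lane[:, None, :] - track[None, :, :], axis=2)
    return track[dists.argmin(axis=1)]
```

The alternative described above (keeping the top-ranked track points by distance to the whole lane line) differs only in sorting the rows of the same distance matrix instead of taking a per-point minimum.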
S104: and determining the driving direction of the road where the lane line is located according to the driving direction of the acquisition equipment on the target track, taking the driving direction as the lane line direction corresponding to the lane line, and updating the lane line direction in a preset electronic map.
After the service platform determines the target track according to the lane line, the service platform can determine the driving direction of the road where the lane line is located according to the driving direction of the acquisition device on the target track, and the driving direction is used as the lane line direction corresponding to the lane line. After the service platform determines the lane line direction corresponding to the lane line, the lane line direction can be updated in a preset electronic map, wherein the electronic map can be in various forms, such as a high-precision map, a navigation map and the like.
Since the target track is close to the lane line and the acquisition device drives on the road according to traffic rules, the driving direction of the acquisition device on the target track can indicate the driving direction of the road where the lane line is located, so the service platform can determine the latter from the former. The lane line direction corresponding to a lane line should be the same as the driving direction of the road where it is located, so the service platform can take the determined driving direction of that road as the lane line direction corresponding to the lane line.
It should be noted that if the road where the lane line is located is a one-way road, the driving direction of that road is simply the road's direction; if the road contains lanes in both directions, the driving direction is that of the lane where the lane line is located.
Further, in this specification, the service platform may first determine a reference direction for the lane line. The reference direction is an initial direction of the lane line in the road image; the service platform can subsequently determine the actual lane line direction on this basis, combined with the driving direction of the acquisition device on the target track.
The service platform can sort the image coordinate points of the lane line in the road image in a preset sorting manner and determine the reference direction of the lane line from the sorting result. The preset sorting manner may vary. For example, the service platform may sort the image coordinate points in ascending order of ordinate, or in descending order of abscissa. Specifically, a suitable sorting manner may be selected according to the inclination of the lane line in the road image: if the lane line is nearly vertical in the road image, its image coordinate points may be sorted by ordinate; if the angle between the lane line and the horizontal axis is small, they may be sorted by abscissa.
As can be seen from the above process, the reference direction is only an initial direction determined along the lane line's image coordinate points; the actual lane line direction may be either consistent with or opposite to it. The service platform therefore judges whether the reference direction matches the driving direction of the acquisition device on the target track. If they match, the reference direction is taken as the lane line direction; otherwise, the service platform adjusts the reference direction according to the driving direction and determines the driving direction of the road where the lane line is located from the adjusted reference direction, as the lane line direction corresponding to the lane line. Matching here may mean that the reference direction is consistent with the driving direction, or that the included angle between them does not exceed 90°.
For example, assume the service platform determines the target track by the method set forth in S103, so that the track points in the target track correspond one-to-one with the image coordinate points of the lane line. The service platform can then determine the driving direction corresponding to the target track from the order of its track points. Since the image coordinate points have already been sorted and the reference direction obtained from the sorting result, the driving direction is either opposite to the reference direction (an included angle greater than 90°) or the same as it (an included angle less than 90°), as shown in fig. 4.
Fig. 4 is a schematic view of a driving direction and a reference direction provided in the present specification.
As can be seen from fig. 4, lane line A includes image coordinate points 1 to 5; image coordinate point 1 corresponds to track point 1, image coordinate point 2 to track point 2, and so on up to image coordinate point 5 and track point 5, with track points 1 to 5 forming the target track corresponding to lane line A. From the acquisition time of each track point, the service platform determines that the driving direction of the target track runs from track points 5, 4, 3, 2 to track point 1; having sorted the image coordinate points by abscissa from small to large when determining the reference direction, the reference direction runs from image coordinate points 1, 2, 3, 4 to image coordinate point 5. The reference direction is thus opposite to the driving direction, so the service platform adjusts the reference direction to its opposite (i.e., to the same direction as the driving direction of the target track) and takes the adjusted reference direction as the lane line direction of lane line A.
For another example, the service platform may determine the included angle between the driving direction corresponding to the target track and the reference direction of the lane line: if the angle is smaller than 90°, the lane line direction is the reference direction; if it is larger than 90°, the reference direction is adjusted to its opposite, and the adjusted reference direction is taken as the lane line direction.
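The keep-or-flip decision described above reduces to a sign test on a dot product (negative dot product ⇔ included angle greater than 90°). A minimal sketch, assuming NumPy and 2D world coordinates; the function name and the use of simple endpoint differences for the two directions are illustrative simplifications.

```python
import numpy as np

def lane_line_direction(sorted_lane_points, target_track):
    """Reference direction plus keep-or-flip test against the driving direction.

    sorted_lane_points: (M, 2) lane line points, already sorted in the
    preset sorting manner, so first -> last gives the reference direction.
    target_track: (K, 2) track points in acquisition-time order, so
    first -> last gives the driving direction.
    Returns a unit vector for the lane line direction.
    """
    lane = np.asarray(sorted_lane_points, dtype=float)
    track = np.asarray(target_track, dtype=float)
    reference = lane[-1] - lane[0]     # initial direction along sorted points
    driving = track[-1] - track[0]     # direction the acquisition device drove
    # Negative dot product <=> included angle greater than 90 degrees,
    # so the reference direction is adjusted to its opposite.
    if np.dot(reference, driving) < 0:
        reference = -reference
    return reference / np.linalg.norm(reference)
```

Using only the endpoints is a simplification; on strongly curved lanes, averaging per-segment headings along both polylines would be a more robust reading of the same test.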
It should be noted that, in practical applications, an error in the positioning result of the positioning device of the acquisition device may cause a large deviation between the driving track obtained by the service platform and the actual driving track, or the driving direction corresponding to the determined target track may be inconsistent with the actual lane line direction of the lane line. When such situations occur, an error may appear in the determined lane line direction.
The following illustrates a case where an error may occur in the lane line direction, as shown in fig. 5A and 5B.
Fig. 5A and 5B are schematic diagrams illustrating an error in a determined lane line direction provided in this specification.
In fig. 5A, the solid lines represent the determined lane lines and the dotted lines represent the driving tracks obtained by the service platform. As can be seen from fig. 5A, lane line a should in practice match driving track 1. However, because the acquisition device drove outside the road when collecting the road point cloud data along driving track 1, driving track 1 lies far from lane line a, so when the service platform determines the target track of lane line a, it selects driving track 2 instead. Driving track 2 is actually the track of an acquisition device traveling in the lane opposite to lane line a. The lane line direction of lane line a is therefore erroneously recognized as the direction consistent with driving track 2, whereas it should be the direction consistent with driving track 1, so the determined lane line direction of lane line a is erroneous.
In fig. 5B, the driving tracks are likewise indicated by dotted lines and the lane lines by solid lines. Because of a positioning error in the acquisition device, part of driving track 3 deviates greatly from the actual driving track of the acquisition device. As a result, when determining the target track of lane line B, the service platform takes part of driving track 4 as the target track of lane line B, so the lane line direction of lane line B is erroneously recognized as the direction consistent with driving track 4.
Therefore, the service platform can verify and correct the determined lane line direction. Specifically, for each lane line, the service platform may determine the lane line direction of the adjacent lane line of that lane line, where the lane line and its adjacent lane line are same-side lane lines of the same lane. The adjacent lane line and the lane line should therefore be lane lines in the same direction on the actual road, or segments of the same continuous lane line, as shown in fig. 6.
Fig. 6 is a schematic diagram of adjacent lane lines provided in the present specification.
Lane lines a and b in fig. 6 both belong to lane Z, and both are located on the same side of lane Z, so the adjacent lane line of lane line a is lane line b. After the service platform determines the lane line direction of each lane line, the result may take the general form marked in fig. 6. It can be seen that the lane line direction of lane line a differs from that of the other lane lines on the same lane, so an error may have occurred in the lane line direction of lane line a.
After determining the lane line direction of the adjacent lane line, the service platform can judge whether it is consistent with the lane line direction of the lane line. If the two are inconsistent, the lane line direction of the lane line can be corrected according to the lane line direction of the adjacent lane line.
Specifically, when the service platform determines for a lane line that its lane line direction differs from that of the adjacent lane line, the lane line direction of the lane line may be wrong, although the lane line direction of the adjacent lane line may of course also be wrong. The service platform can therefore further verify the lane line direction of the lane line before deciding whether to correct it, using the lane line directions of all lane lines of the lane to which the lane line belongs. If fewer than a set proportion of the lane lines in that lane have a lane line direction consistent with that of the lane line, the lane line direction of the lane line is judged erroneous, and the service platform can correct it according to the lane line direction of the adjacent lane line so that the two become the same. The set proportion can be configured according to actual requirements.
If no fewer than the set proportion of the lane lines in the lane have a lane line direction consistent with that of the lane line, the lane line direction of the lane line is not erroneous, and the service platform need not correct it according to the adjacent lane line. In that case the service platform can instead correct the lane line direction of the adjacent lane line according to the lane line direction of the majority of lane lines in the lane.
Still taking fig. 6 as an example, the service platform may first verify the lane line direction of lane line a. Since the lane line direction of lane line b is inconsistent with that of lane line a, the service platform further verifies lane line a against the lane line directions of the lane lines belonging to lane Z. Because the lane line direction of most lane lines in lane Z is inconsistent with that of lane line a, the service platform corrects the lane line direction of lane line a according to the lane line direction of lane line b, thereby obtaining the correct lane line direction of lane line a.
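The proportion-based verification and correction described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation; the function name, the +1/-1 direction encoding, and the default proportion of 0.5 are assumptions:

```python
def verify_and_correct(lane_dir, neighbor_dir, lane_dirs_in_lane, set_proportion=0.5):
    """Verify a lane line's direction against its lane and correct it if needed.

    lane_dir / neighbor_dir are direction labels (+1 or -1 along the road).
    lane_dirs_in_lane holds the directions of all lane lines of the lane the
    lane line belongs to. If fewer than set_proportion of them agree with
    lane_dir, the direction is judged erroneous and replaced by neighbor_dir.
    """
    if lane_dir == neighbor_dir:
        return lane_dir  # consistent with the adjacent lane line: no correction
    agreeing = sum(1 for d in lane_dirs_in_lane if d == lane_dir)
    if agreeing / len(lane_dirs_in_lane) < set_proportion:
        return neighbor_dir  # most of the lane disagrees: correct the direction
    return lane_dir

# As in fig. 6: lane line a (-1) disagrees with lane line b (+1) and with
# most lane lines of lane Z, so its direction is corrected to +1.
corrected = verify_and_correct(-1, +1, [+1, +1, +1, -1])
```

Note that when the proportion check passes, the adjacent lane line (rather than this one) becomes the candidate for correction, mirroring the second case described above.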
According to the above method, a road image containing lane lines can be obtained entirely from the road point cloud data collected by the acquisition device, and once the road image is obtained, the lane line directions in it can be determined according to the direction of the driving track collected by the acquisition device. In addition, after the lane line direction of a lane line is determined, it can be verified and corrected according to the lane line direction of the adjacent lane line. Compared with manually determining the lane line direction as in the prior art, this method therefore improves the efficiency of determining the lane line direction and reduces cost while ensuring the accuracy of the determined direction.
It should be noted that the service platform may also obtain the driving tracks of multiple acquisition devices that passed through the lane to which the lane line belongs, or driving tracks located within a neighborhood of that lane, and determine, for each driving track, the part of it that matches the lane line as a target track. After determining each target track, the service platform can determine the distance between each target track and the lane line, and from that distance determine the degree of influence of each target track on the lane line direction of the lane line: the closer a target track is to the lane line, the higher its degree of influence. The service platform can then determine the lane line direction of the lane line according to the degree of influence of each target track and the driving direction corresponding to each target track.
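The distance-weighted influence described here can be sketched as a weighted vote over the target tracks. The patent does not fix a concrete weighting formula, so the 1/distance weight and the +1/-1 direction encoding below are illustrative assumptions:

```python
def weighted_lane_direction(target_tracks):
    """Combine the driving directions of several target tracks into one
    lane line direction.

    target_tracks is a list of (distance_to_lane_line, direction) pairs,
    with direction encoded as +1 or -1. Closer tracks get a larger weight
    (here 1 / distance), so they influence the result more; the sign of
    the weighted sum gives the lane line direction.
    """
    score = sum(direction / max(distance, 1e-6)  # guard against zero distance
                for distance, direction in target_tracks)
    return 1 if score >= 0 else -1

# A very close track pointing one way outweighs two distant opposing tracks.
direction = weighted_lane_direction([(0.5, +1), (10.0, -1), (12.0, -1)])
```

Any monotonically decreasing weight in the distance would serve the same purpose; the inverse-distance choice simply makes the "closer means more influence" rule explicit.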
In this specification, after the service platform determines the lane line direction of each lane line and updates these directions in the electronic map, an unmanned device can use the lane line directions in the electronic map to plan its driving route, thereby ensuring its driving safety on the road.
Based on the same idea, the present specification further provides a corresponding apparatus for determining the lane line direction, as shown in fig. 7.
Fig. 7 is a schematic diagram of an apparatus for determining a lane line direction provided in this specification, which specifically includes:
an obtaining module 701, configured to obtain road point cloud data collected by a collection device and a driving track on which the collection device collects the road point cloud data;
a lane line determining module 702, configured to generate a road image according to the road point cloud data, and determine each lane line included in the road image;
a track determining module 703, configured to determine, for each determined lane line, a partial driving track, which is matched with the lane line, in the driving track as a target track;
and a direction determining module 704, configured to determine, according to the driving direction of the acquisition device on the target track, a driving direction of a road where the lane line is located, as a lane line direction corresponding to the lane line, and update the lane line direction in a preset electronic map.
Optionally, the track determining module 703 is specifically configured to determine track points in the driving track; determining a track point which is closest to the image coordinate point in the track points as a track point corresponding to the image coordinate point aiming at each image coordinate point of the lane line; and taking a driving track formed by track points corresponding to the image coordinate points of the lane line as a target track.
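The matching performed by the track determining module can be sketched as a nearest-neighbor search from each image coordinate point of the lane line to the track points of the driving track. This is an illustrative sketch with hypothetical names, using 2D point tuples and squared Euclidean distance:

```python
def match_target_track(image_points, track_points):
    """Build a target track by matching each image coordinate point of a
    lane line to its nearest track point of the driving track, preserving
    the order of the image coordinate points.
    """
    def nearest(p):
        # Squared Euclidean distance suffices for choosing the minimum.
        return min(track_points,
                   key=lambda t: (t[0] - p[0]) ** 2 + (t[1] - p[1]) ** 2)
    return [nearest(p) for p in image_points]

# Each lane-line point picks the closest trajectory point; together the
# picked points form the target track.
target = match_target_track([(0, 0), (5, 0)], [(0.2, 1.0), (4.9, 0.8), (9.0, 2.0)])
```

For long tracks a spatial index (e.g. a k-d tree) would replace the linear scan, but the linear form matches the module's description most directly.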
Optionally, the direction determining module 704 is specifically configured to sort image coordinate points of the lane line in the road image according to a preset sorting manner to obtain a sorting result, and determine a reference direction of the lane line according to the sorting result; and determining the driving direction of the road where the lane line is located according to the driving direction of the acquisition equipment on the target track and the determined reference direction, and taking the driving direction as the lane line direction corresponding to the lane line.
Optionally, the direction determining module 704 is specifically configured to determine whether the reference direction matches the driving direction; and if the reference direction is determined to be matched with the driving direction, taking the reference direction as the lane line direction corresponding to the lane line, otherwise, adjusting the reference direction according to the driving direction, and determining the driving direction of the road where the lane line is located according to the adjusted reference direction as the lane line direction corresponding to the lane line.
Optionally, the lane line determining module 702 is specifically configured to input the road image into a pre-trained lane line recognition model to determine each lane line included in the road image.
Optionally, the apparatus further comprises:
a training module 705, configured to obtain each sample image; inputting the sample image into the lane line identification model aiming at each sample image to obtain an identified lane line; training the lane line recognition model by taking the minimum difference between the recognized lane line and the lane line marked in the sample image as an optimization target;
a correction module 706, configured to determine, for each lane line, a lane line direction of an adjacent lane line of the lane line, where the lane line and the adjacent lane line belong to a same-side lane line of a same lane; judging whether the lane line direction of the adjacent lane line is consistent with the lane line direction of the lane line; and if the lane line direction of the adjacent lane line is determined to be inconsistent with the lane line direction of the lane line, correcting the lane line direction of the lane line according to the lane line direction of the adjacent lane line.
The present specification also provides a computer-readable storage medium storing a computer program, which can be used to execute the method for determining the lane line direction shown in fig. 1.
This specification also provides a schematic block diagram of the electronic device shown in fig. 8. As shown in fig. 8, at the hardware level, the electronic device includes a processor, an internal bus, a network interface, a memory, and a non-volatile memory, and may also include hardware required for other services. The processor reads the corresponding computer program from the non-volatile memory into the memory and then runs it to implement the method for determining the lane line direction described in fig. 1. Of course, besides a software implementation, this specification does not exclude other implementations, such as logic devices or a combination of software and hardware; that is, the execution subject of the processing flow below is not limited to logic units and may also be hardware or logic devices.
In the 1990s, an improvement in a technology could be clearly distinguished as an improvement in hardware (e.g., an improvement in a circuit structure such as a diode, transistor, or switch) or an improvement in software (an improvement in a method flow). With the development of technology, however, many of today's improvements in method flows can be regarded as direct improvements in hardware circuit structures. Designers almost always obtain a corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Thus, it cannot be said that an improvement in a method flow cannot be realized by a hardware entity module. For example, a Programmable Logic Device (PLD), such as a Field Programmable Gate Array (FPGA), is an integrated circuit whose logic functions are determined by the user's programming of the device. A designer "integrates" a digital system onto a single PLD by programming it, without asking a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, instead of manually making integrated circuit chips, this programming is now mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development, while the source code to be compiled is written in a specific programming language called a Hardware Description Language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM and RHDL (Ruby Hardware Description Language), with VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog being the most commonly used at present.
It will also be apparent to those skilled in the art that a hardware circuit implementing a given logical method flow can readily be obtained merely by briefly programming the method flow into an integrated circuit using one of the above hardware description languages.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an Application Specific Integrated Circuit (ASIC), a programmable logic controller, or an embedded microcontroller; examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320. A memory controller may also be implemented as part of the control logic of the memory. Those skilled in the art will also appreciate that, in addition to implementing the controller as pure computer-readable program code, the same functionality can be implemented by logically programming the method steps so that the controller takes the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Such a controller may thus be considered a hardware component, and the means included in it for performing various functions may also be considered structures within the hardware component. Or even the means for performing the functions may be regarded both as software modules for performing the method and as structures within the hardware component.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being divided into various units by function, and are described separately. Of course, the functions of the various elements may be implemented in the same one or more software and/or hardware implementations of the present description.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the description may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
This description may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only an example of the present specification, and is not intended to limit the present specification. Various modifications and alterations to this description will become apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present specification should be included in the scope of the claims of the present specification.
Claims (10)
1. A method of determining lane line direction, comprising:
acquiring road point cloud data acquired by acquisition equipment and a driving track on which the road point cloud data is acquired by the acquisition equipment;
generating a road image according to the road point cloud data, and determining each lane line contained in the road image;
determining a part of driving track matched with each lane line in the driving track as a target track aiming at each determined lane line;
and determining the driving direction of the road where the lane line is located according to the driving direction of the acquisition equipment on the target track, taking the driving direction as the lane line direction corresponding to the lane line, and updating the lane line direction in a preset electronic map.
2. The method according to claim 1, wherein determining a part of the driving trajectory matching the lane line as a target trajectory specifically comprises:
determining each track point in the driving track;
determining a track point which is closest to the image coordinate point in the track points as a track point corresponding to the image coordinate point aiming at each image coordinate point of the lane line;
and taking a driving track formed by track points corresponding to the image coordinate points of the lane line as a target track.
3. The method according to claim 1, wherein determining a driving direction of a road where the lane line is located according to a driving direction of the acquisition device located on the target track, as a lane line direction corresponding to the lane line, specifically comprises:
sorting image coordinate points of the lane line in the road image according to a preset sorting mode to obtain a sorting result, and determining the reference direction of the lane line according to the sorting result;
and determining the driving direction of the road where the lane line is located according to the driving direction of the acquisition equipment on the target track and the determined reference direction, and taking the driving direction as the lane line direction corresponding to the lane line.
4. The method according to claim 3, wherein determining the driving direction of the road on which the lane line is located according to the driving direction of the acquisition device on the target track and the determined reference direction, as the lane line direction corresponding to the lane line, specifically comprises:
judging whether the reference direction is matched with the driving direction;
and if the reference direction is determined to be matched with the driving direction, taking the reference direction as the lane line direction corresponding to the lane line, otherwise, adjusting the reference direction according to the driving direction, and determining the driving direction of the road where the lane line is located according to the adjusted reference direction as the lane line direction corresponding to the lane line.
5. The method of claim 1, wherein determining each lane line in the road image includes:
and inputting the road image into a pre-trained lane line recognition model to determine each lane line contained in the road image.
6. The method of claim 5, wherein training the lane line identification model specifically comprises:
acquiring each sample image;
inputting the sample image into the lane line identification model aiming at each sample image to obtain an identified lane line;
and training the lane line recognition model by taking the minimum difference between the recognized lane line and the lane line marked in the sample image as an optimization target.
7. The method of claim 1, wherein the method further comprises:
determining, for each lane line, the lane line direction of an adjacent lane line of the lane line, wherein the lane line and the adjacent lane line belong to same-side lane lines of the same lane;
judging whether the lane line direction of the adjacent lane line is consistent with the lane line direction of the lane line;
and if the lane line direction of the adjacent lane line is determined to be inconsistent with the lane line direction of the lane line, correcting the lane line direction of the lane line according to the lane line direction of the adjacent lane line.
8. An apparatus for determining a lane line direction, comprising:
the acquisition module is used for acquiring road point cloud data acquired by acquisition equipment and a driving track on which the acquisition equipment acquires the road point cloud data;
the lane line determining module is used for generating a road image according to the road point cloud data and determining each lane line contained in the road image;
the track determining module is used for determining a part of driving track matched with each lane line in the driving track as a target track aiming at each determined lane line;
and the direction determining module is used for determining the driving direction of the road where the lane line is located according to the driving direction of the acquisition equipment on the target track, taking the driving direction as the direction of the lane line corresponding to the lane line, and updating the direction of the lane line in a preset electronic map.
9. A computer-readable storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the method of any of the preceding claims 1 to 7.
10. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any of claims 1 to 7 when executing the program.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010967447.1A CN112163475B (en) | 2020-09-15 | 2020-09-15 | Method and device for determining lane line direction |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112163475A true CN112163475A (en) | 2021-01-01 |
CN112163475B CN112163475B (en) | 2024-07-26 |
Family
ID=73857416
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010967447.1A Active CN112163475B (en) | 2020-09-15 | 2020-09-15 | Method and device for determining lane line direction |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112163475B (en) |
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180181817A1 (en) * | 2015-09-10 | 2018-06-28 | Baidu Online Network Technology (Beijing) Co., Ltd. | Vehicular lane line data processing method, apparatus, storage medium, and device |
CN105701449A (en) * | 2015-12-31 | 2016-06-22 | 百度在线网络技术(北京)有限公司 | Method and device for detecting lane lines on road surface |
CN109143259A (en) * | 2018-08-20 | 2019-01-04 | 北京主线科技有限公司 | High-precision cartography method towards the unmanned truck in harbour |
WO2020038091A1 (en) * | 2018-08-22 | 2020-02-27 | 北京市商汤科技开发有限公司 | Intelligent driving control method and apparatus, electronic device, program and medium |
WO2020048487A1 (en) * | 2018-09-05 | 2020-03-12 | 北京嘀嘀无限科技发展有限公司 | Image data processing method and system |
CN110097025A (en) * | 2019-05-13 | 2019-08-06 | 奇瑞汽车股份有限公司 | Detection method, device and the storage medium of lane line |
CN110163930A (en) * | 2019-05-27 | 2019-08-23 | 北京百度网讯科技有限公司 | Lane line generation method, device, equipment, system and readable storage medium |
CN111652112A (en) * | 2020-05-29 | 2020-09-11 | 北京百度网讯科技有限公司 | Lane flow direction identification method and device, electronic equipment and storage medium |
Non-Patent Citations (1)
Title |
---|
WU YANWEN; ZHANG NAN; ZHOU TAO; YAN WEI: "Research on Lane Line Detection and Tracking Method Based on Multi-Sensor Fusion", Application Research of Computers, no. 02, 15 March 2017 (2017-03-15) * |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112418193A (en) * | 2021-01-21 | 2021-02-26 | 武汉中海庭数据技术有限公司 | Lane line identification method and system |
CN112990087A (en) * | 2021-04-08 | 2021-06-18 | 济南博观智能科技有限公司 | Lane line detection method, device, equipment and readable storage medium |
CN112990087B (en) * | 2021-04-08 | 2022-08-19 | 济南博观智能科技有限公司 | Lane line detection method, device, equipment and readable storage medium |
CN113345251A (en) * | 2021-07-08 | 2021-09-03 | 北京紫光展锐通信技术有限公司 | Vehicle reverse running detection method and related device |
CN115683130A (en) * | 2021-07-27 | 2023-02-03 | 北京三快在线科技有限公司 | Method and device for generating target area entrance and exit area of map based on lane line |
CN114295119A (en) * | 2021-12-31 | 2022-04-08 | 北京三快在线科技有限公司 | Map construction method and device |
CN114295119B (en) * | 2021-12-31 | 2024-02-23 | 北京三快在线科技有限公司 | Map construction method and device |
CN115762152A (en) * | 2022-11-11 | 2023-03-07 | 北京紫光展锐通信技术有限公司 | Vehicle retrograde motion detection method, device, system, vehicle and medium |
CN115727834A (en) * | 2022-11-16 | 2023-03-03 | 新石器慧通(北京)科技有限公司 | Reverse check processing method for boundary line, map making method, map making apparatus, and map making medium |
CN115953752A (en) * | 2023-03-07 | 2023-04-11 | 中汽创智科技有限公司 | Lane reference line extraction method and device, electronic equipment and storage medium |
CN118552656A (en) * | 2024-07-30 | 2024-08-27 | 苏州魔视智能科技有限公司 | Lane line fitting optimization method, lane line fitting optimization device, computer equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN112163475B (en) | 2024-07-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112163475B (en) | Method and device for determining lane line direction | |
CN112801229B (en) | Training method and device for recognition model | |
CN110674723B (en) | Method and device for determining driving track of unmanned vehicle | |
CN111508258B (en) | Positioning method and device | |
CN111639682A (en) | Ground segmentation method and device based on point cloud data | |
CN111208838B (en) | Control method and device of unmanned equipment | |
CN112766468A (en) | Trajectory prediction method and device, storage medium and electronic equipment | |
CN111062372B (en) | Method and device for predicting obstacle track | |
CN112013853B (en) | Method and device for verifying track points of unmanned equipment | |
CN112327864A (en) | Control method and control device of unmanned equipment | |
CN111797722A (en) | Method and device for drawing lane line | |
CN112949756B (en) | Method and device for model training and trajectory planning | |
CN111797698A (en) | Target object identification method and identification device | |
CN112033421A (en) | Method and device for detecting lane in electronic map | |
CN114295119A (en) | Map construction method and device | |
CN112861831A (en) | Target object identification method and device, storage medium and electronic equipment | |
CN112902987B (en) | Pose correction method and device | |
CN112883871A (en) | Model training and unmanned vehicle motion strategy determining method and device | |
CN113642616B (en) | Training sample generation method and device based on environment data | |
CN112987754B (en) | Unmanned equipment control method and device, storage medium and electronic equipment | |
CN112712595A (en) | Method and device for generating simulation environment | |
CN112393723A (en) | Positioning method, device, medium and unmanned device | |
CN114297326A (en) | Address verification method and device | |
CN114299147A (en) | Positioning method, positioning device, storage medium and electronic equipment | |
CN113706552A (en) | Method and device for generating semantic segmentation marking data of laser reflectivity base map |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||