CN109977776A - Lane line detection method, device, and vehicle-mounted device - Google Patents
Lane line detection method, device, and vehicle-mounted device
- Publication number: CN109977776A
- Application number: CN201910139130.6A
- Authority: CN (China)
- Prior art keywords: lane line, pixel, lane, image, fitting
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
Abstract
Embodiments of the present invention relate to a lane line detection method, a device, and a vehicle-mounted device. The method includes: obtaining an image captured by an image sensor; extracting the lane line pixels from the image; determining the lane line pixels that belong to the same lane line; screening the lane line pixels belonging to the same lane line to obtain pixels to be fitted; and fitting the pixels to be fitted to obtain a lane line. The embodiments of the present invention exclude falsely detected lane line pixels by screening, which improves the accuracy of lane line detection. In addition, because no deep neural network is required, the demands on the computing platform are reduced, making large-scale automotive-grade deployment easy.
Description
Technical field
Embodiments of the present invention relate to the technical field of autonomous driving, and in particular to a lane line detection method, a device, and a vehicle-mounted device.
Background art
Lane line detection is one of the important perception functions of an autonomous vehicle. Existing lane line detection methods mainly fall into two categories: image-processing-based methods and deep-learning-based methods.
Image-processing-based lane line detection methods rely on computationally expensive image-processing techniques such as inverse perspective mapping or the Hough transform, and their robustness under complex conditions such as occlusion, blur, and curves is low, so missed detections easily occur.
Deep-learning-based lane line detection methods use a deep neural network to detect lane lines. Although their robustness is better, they place high demands on the computing platform and require specific chips to support the deployment of the neural network, which makes large-scale automotive-grade deployment difficult.
The above description of the process of discovering the problem is provided only to help in understanding the technical solution of the present invention, and does not constitute an admission that the above content is prior art.
Summary of the invention
To solve at least one of the problems in the prior art, at least one embodiment of the present invention provides a lane line detection method, a device, and a vehicle-mounted device.
In a first aspect, an embodiment of the present invention proposes a lane line detection method, comprising:
obtaining an image captured by an image sensor;
extracting the lane line pixels from the image;
determining the lane line pixels that belong to the same lane line;
screening the lane line pixels belonging to the same lane line to obtain pixels to be fitted; and
fitting the pixels to be fitted to obtain a lane line.
Based on the first aspect, in a first embodiment of the first aspect, the method further comprises:
updating the tracked lane line based on the fitted lane line and vehicle information.
Based on the first embodiment of the first aspect, in a second embodiment of the first aspect, updating the tracked lane line based on the fitted lane line and the vehicle information comprises:
predicting the lane line of the image based on the tracked lane line and the vehicle information;
matching the fitted lane line with the predicted lane line; and
if they match, splicing the fitted lane line with the tracked lane line.
Based on the second embodiment of the first aspect, in a third embodiment of the first aspect, the vehicle information comprises: speed and angular velocity;
and predicting the lane line of the image based on the tracked lane line and the vehicle information comprises:
predicting the parameters of the curve corresponding to the lane line of the image based on the speed, the angular velocity, the time interval between the two frames, and the parameters of the curve corresponding to the lane line of the previous frame.
Based on the third embodiment of the first aspect, in a fourth embodiment of the first aspect, matching the fitted lane line with the predicted lane line comprises:
performing similarity matching between the parameters of the curve corresponding to the fitted lane line and the parameters of the curve corresponding to the predicted lane line.
Based on the third embodiment of the first aspect, in a fifth embodiment of the first aspect, predicting the parameters of the curve corresponding to the lane line of the image based on the speed, the angular velocity, the time interval between the two frames, and the parameters of the curve corresponding to the lane line of the previous frame comprises:
predicting the constant term and first-order term of the curve corresponding to the lane line of the image based on the speed, the angular velocity, the time interval between the two frames, and the constant term and first-order term of the curve corresponding to the lane line of the previous frame.
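The constant-term and first-order-term prediction above can be sketched as follows. The patent text does not give the exact update formulas, so the kinematic relations below (lateral offset shifted by forward travel along the line's slope, slope rotated by the heading change) are an illustrative assumption, as are the function and parameter names:

```python
def predict_curve_params(c0, c1, speed, yaw_rate, dt):
    """Predict the constant term c0 and first-order term c1 of the lane line
    curve x(y) = c0 + c1*y + ... for the current frame from the previous
    frame's parameters. The update rule is a hypothetical kinematic sketch,
    not the patent's exact formula."""
    dtheta = yaw_rate * dt            # vehicle heading change over the frame interval
    c1_pred = c1 - dtheta             # small-angle rotation of the line's slope
    c0_pred = c0 - speed * dt * c1    # lateral offset change while advancing along the line
    return c0_pred, c1_pred
```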
Based on the first aspect, in a sixth embodiment of the first aspect, extracting the lane line pixels from the image comprises:
setting a region of interest in the image;
performing edge detection on the region of interest; and
extracting the pixels obtained by edge detection at alternate rows to obtain the lane line pixels.
Based on the first aspect, in a seventh embodiment of the first aspect, determining the lane line pixels that belong to the same lane line comprises:
determining the world coordinates of the lane line pixels; and
clustering the world coordinates to obtain the lane line pixels belonging to the same lane line.
Based on the seventh embodiment of the first aspect, in an eighth embodiment of the first aspect, clustering the world coordinates to obtain the lane line pixels belonging to the same lane line comprises:
constructing a KD tree based on the world coordinates; and
performing density clustering on all the abscissas in the KD tree to obtain the lane line pixels belonging to the same lane line.
Based on the first aspect, in a ninth embodiment of the first aspect, screening the lane line pixels belonging to the same lane line to obtain the pixels to be fitted comprises:
rejecting the outer-side pixels among the lane line pixels belonging to the same lane line; and
performing a fitting judgment on the lane line pixels remaining after the outer-side pixels are rejected, to obtain the pixels to be fitted.
Based on the ninth embodiment of the first aspect, in a tenth embodiment of the first aspect, performing the fitting judgment on the lane line pixels remaining after the outer-side pixels are rejected, to obtain the pixels to be fitted, comprises:
if the minimum ordinate of the remaining lane line pixels is less than or equal to a preset distance, and the difference between the minimum ordinate and the maximum ordinate is greater than or equal to a first preset spacing, determining that the remaining lane line pixels are the pixels to be fitted.
Based on the tenth embodiment of the first aspect, in an eleventh embodiment of the first aspect, fitting the pixels to be fitted to obtain the lane line comprises:
if the difference between the minimum ordinate and the maximum ordinate is greater than a second preset spacing, fitting the pixels to be fitted with a cubic curve to obtain the lane line; and
if the difference between the minimum ordinate and the maximum ordinate is less than or equal to the second preset spacing, fitting the pixels to be fitted with a quadratic curve to obtain the lane line.
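The span-based choice between a cubic and a quadratic fit can be sketched as below; `numpy.polyfit` stands in for whatever least-squares routine an implementation would use, and the `second_spacing` default is an illustrative placeholder, not a value from the patent:

```python
import numpy as np

def fit_lane_line(xs, ys, second_spacing=30.0):
    """Fit x = f(y) for one lane line: cubic if the longitudinal span of the
    points exceeds the second preset spacing, quadratic otherwise."""
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    span = ys.max() - ys.min()
    degree = 3 if span > second_spacing else 2
    coeffs = np.polyfit(ys, xs, degree)   # highest-order coefficient first
    return degree, coeffs
```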
In a second aspect, an embodiment of the present invention also proposes a lane line detection device, comprising:
a first obtaining unit, configured to obtain an image captured by an image sensor;
an extraction unit, configured to extract the lane line pixels from the image;
a determination unit, configured to determine the lane line pixels belonging to the same lane line;
a screening unit, configured to screen the lane line pixels belonging to the same lane line to obtain pixels to be fitted; and
a fitting unit, configured to fit the pixels to be fitted to obtain a lane line.
Based on the second aspect, in a first embodiment of the second aspect, the device further comprises:
an updating unit, configured to update the tracked lane line based on the fitted lane line and vehicle information.
Based on the first embodiment of the second aspect, in a second embodiment of the second aspect, the updating unit comprises:
a first subunit, configured to predict the lane line of the image based on the tracked lane line and the vehicle information;
a second subunit, configured to match the fitted lane line with the predicted lane line; and
a third subunit, configured to splice the fitted lane line with the tracked lane line if they match.
Based on the second embodiment of the second aspect, in a third embodiment of the second aspect, the vehicle information comprises: speed and angular velocity;
and the first subunit is configured to predict the parameters of the curve corresponding to the lane line of the image based on the speed, the angular velocity, the time interval between the two frames, and the parameters of the curve corresponding to the lane line of the previous frame.
Based on the third embodiment of the second aspect, in a fourth embodiment of the second aspect, the second subunit is configured to perform similarity matching between the parameters of the curve corresponding to the fitted lane line and the parameters of the curve corresponding to the predicted lane line.
Based on the third embodiment of the second aspect, in a fifth embodiment of the second aspect, the first subunit is configured to predict the constant term and first-order term of the curve corresponding to the lane line of the image based on the speed, the angular velocity, the time interval between the two frames, and the constant term and first-order term of the curve corresponding to the lane line of the previous frame.
Based on the second aspect, in a sixth embodiment of the second aspect, the extraction unit is configured to:
set a region of interest in the image;
perform edge detection on the region of interest; and
extract the pixels obtained by edge detection at alternate rows to obtain the lane line pixels.
Based on the second aspect, in a seventh embodiment of the second aspect, the determination unit comprises:
a first subunit, configured to determine the world coordinates of the lane line pixels; and
a second subunit, configured to cluster the world coordinates to obtain the lane line pixels belonging to the same lane line.
Based on the seventh embodiment of the second aspect, in an eighth embodiment of the second aspect, the second subunit is configured to:
construct a KD tree based on the world coordinates; and
perform density clustering on all the abscissas in the KD tree to obtain the lane line pixels belonging to the same lane line.
Based on the second aspect, in a ninth embodiment of the second aspect, the screening unit comprises:
a first subunit, configured to reject the outer-side pixels among the lane line pixels belonging to the same lane line; and
a second subunit, configured to perform a fitting judgment on the lane line pixels remaining after the outer-side pixels are rejected, to obtain the pixels to be fitted.
Based on the ninth embodiment of the second aspect, in a tenth embodiment of the second aspect, the second subunit is configured to determine that the remaining lane line pixels are the pixels to be fitted if the minimum ordinate of the remaining lane line pixels is less than or equal to a preset distance and the difference between the minimum ordinate and the maximum ordinate is greater than or equal to the first preset spacing.
Based on the tenth embodiment of the second aspect, in an eleventh embodiment of the second aspect, the fitting unit is configured to:
fit the pixels to be fitted with a cubic curve to obtain the lane line if the difference between the minimum ordinate and the maximum ordinate is greater than the second preset spacing; and
fit the pixels to be fitted with a quadratic curve to obtain the lane line if the difference between the minimum ordinate and the maximum ordinate is less than or equal to the second preset spacing.
In a third aspect, an embodiment of the present invention also proposes a vehicle-mounted device, comprising:
a processor and a memory, the processor and the memory being coupled by a bus system;
wherein the processor is configured to perform the steps of the method according to the first aspect by calling a program or instructions stored in the memory.
As it can be seen that excluding the lane line pixel of erroneous detection by screening at least one embodiment of the embodiment of the present invention, mentioning
The high accuracy of lane detection.In addition, the requirement drop due to without deep neural network, to computing platform
It is low, it is easy to large-scale vehicle rule grade deployment and uses.
Brief description of the drawings
To describe the technical solutions of the embodiments of the present invention more clearly, the accompanying drawings required in the embodiments or the description of the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention; those of ordinary skill in the art can also obtain other drawings from these drawings.
Fig. 1 is a schematic structural diagram of a vehicle-mounted device according to an embodiment of the present invention;
Fig. 2 is a flowchart of a lane line detection method according to an embodiment of the present invention;
Fig. 3 is a block diagram of a lane line detection device according to an embodiment of the present invention;
Fig. 4 is a sample image used for lane line detection according to an embodiment of the present invention;
Fig. 5 is the lane line binary image obtained after lane line detection is performed on the sample image shown in Fig. 4.
Specific embodiments
To make the objects, features, and advantages of the present invention easier to understand, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the described embodiments are only some, not all, of the embodiments of the present invention. The specific embodiments described here are used only to explain the present invention, not to limit it. Based on the described embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art fall within the protection scope of the present invention.
It should be noted that, in this document, relational terms such as "first" and "second" are used only to distinguish one entity or operation from another, and do not necessarily require or imply any actual relationship or order between these entities or operations.
Fig. 1 is a schematic structural diagram of a vehicle-mounted device according to an embodiment of the present invention.
The vehicle-mounted device shown in Fig. 1 includes: at least one processor 101 and at least one memory 102. The various components of the vehicle-mounted device are coupled by a bus system 103. It can be understood that the bus system 103 is used to implement connection and communication between these components. In addition to a data bus, the bus system 103 also includes a power bus, a control bus, and a status signal bus. However, for the sake of clarity, the various buses are all designated as the bus system 103 in Fig. 1.
It can be understood that the memory 102 in this embodiment can be a volatile memory or a non-volatile memory, or can include both volatile and non-volatile memory. The non-volatile memory can be a read-only memory (ROM), a programmable read-only memory (Programmable ROM, PROM), an erasable programmable read-only memory (Erasable PROM, EPROM), an electrically erasable programmable read-only memory (Electrically EPROM, EEPROM), or flash memory. The volatile memory can be a random access memory (RAM), used as an external cache. By way of example but not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM). The memory 102 described here is intended to include, but is not limited to, these and any other suitable types of memory.
In some embodiments, the memory 102 stores the following elements, executable units or data structures, or a subset or superset of them: an operating system 1021 and application programs 1022.
The operating system 1021 contains various system programs, such as a framework layer, a core library layer, and a driver layer, for implementing various basic services and processing hardware-based tasks. The application programs 1022 contain various application programs, such as a media player and a browser, for implementing various application services. A program implementing the method of an embodiment of the present invention may be contained in the application programs 1022.
In the embodiments of the present invention, by calling a program or instructions stored in the memory 102 (specifically, a program or instructions stored in the application programs 1022), the processor 101 is configured to perform the steps of each embodiment of the lane line detection method, which may include, for example, the following steps one to five:
Step 1: obtain an image captured by an image sensor.
Step 2: extract the lane line pixels from the image.
Step 3: determine the lane line pixels belonging to the same lane line.
Step 4: screen the lane line pixels belonging to the same lane line to obtain pixels to be fitted.
Step 5: fit the pixels to be fitted to obtain a lane line.
In the embodiments of the present invention, falsely detected lane line pixels are excluded by screening, which improves the accuracy of lane line detection. In addition, because no deep neural network is required, the demands on the computing platform are reduced, making large-scale automotive-grade deployment easy.
The methods disclosed in the above embodiments of the present invention can be applied in, or implemented by, the processor 101. The processor 101 may be an integrated circuit chip with signal processing capability. During implementation, each step of the above method can be completed by an integrated logic circuit of hardware in the processor 101 or by instructions in the form of software. The processor 101 can be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and can implement or execute the methods, steps, and logic diagrams disclosed in the embodiments of the present invention. The general-purpose processor can be a microprocessor, or any conventional processor. The steps of the methods disclosed in the embodiments of the present invention can be executed and completed directly by a hardware decoding processor, or by a combination of hardware and software units in a decoding processor. The software unit can be located in a storage medium mature in this field, such as a random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or register. The storage medium is located in the memory 102; the processor 101 reads the information in the memory 102 and completes the steps of the above method in combination with its hardware.
It can be understood that the embodiments described herein can be implemented in hardware, software, firmware, middleware, microcode, or a combination thereof. For a hardware implementation, the processing unit may be implemented in one or more application-specific integrated circuits (ASIC), digital signal processors (DSP), digital signal processing devices (DSPD), programmable logic devices (PLD), field-programmable gate arrays (FPGA), general-purpose processors, controllers, microcontrollers, microprocessors, other electronic units for performing the functions described herein, or combinations thereof.
For a software implementation, the techniques described herein can be implemented by units that perform the functions described herein. The software code can be stored in a memory and executed by a processor. The memory can be implemented in the processor or outside the processor.
It can be understood that the units and algorithm steps described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are executed in hardware or software depends on the specific application and the design constraints of the technical solution. A person skilled in the art can use different methods to implement the described functions for each specific application, but such implementations should not be considered as going beyond the scope of the present invention.
It is apparent to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system, device, and units described above can refer to the corresponding processes in the foregoing method embodiments, and details are not repeated here.
In the embodiments provided in this application, it should be understood that, unless there is a clear order between the steps of a method embodiment, the execution order can be adjusted arbitrarily. The disclosed devices and methods may be implemented in other ways. For example, the device embodiments described above are merely illustrative; the division into units is only a logical function division, and there may be other divisions in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place, or may be distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the various embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as an independent product, they can be stored in a computer-readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present invention, in essence, or the part contributing to the prior art, or part of the technical solutions, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which can be a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
Fig. 2 is a flowchart of a lane line detection method according to an embodiment of the present invention. The method is executed by a vehicle-mounted device.
As shown in Fig. 2, the lane line detection method disclosed in this embodiment may include the following steps 201 to 205:
201: obtain an image captured by an image sensor.
202: extract the lane line pixels from the image.
203: determine the lane line pixels belonging to the same lane line.
204: screen the lane line pixels belonging to the same lane line to obtain pixels to be fitted.
205: fit the pixels to be fitted to obtain a lane line.
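Steps 201 to 205 can be sketched as a small pipeline. The stage callables (`extract`, `group`, `screen`, `fit`) are hypothetical names standing in for the concrete implementations described later in this document:

```python
def detect_lane_lines(image, extract, group, screen, fit):
    """Skeleton of steps 201-205: each stage is passed in as a callable so
    the sketch stays independent of any particular implementation."""
    pixels = extract(image)              # 202: lane line pixel extraction
    groups = group(pixels)               # 203: group pixels per lane line
    lanes = []
    for pts in groups:
        to_fit = screen(pts)             # 204: screen out false detections
        if to_fit:
            lanes.append(fit(to_fit))    # 205: fit a curve per lane line
    return lanes
```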
In this embodiment, the image sensor is, for example, a camera mounted on the autonomous vehicle that captures images in the driving direction of the autonomous vehicle; it is also called a forward-facing camera.
The vehicle-mounted device obtains the image captured by the image sensor in a passive-receiving manner; that is, the forward-facing camera transmits the captured image to the vehicle-mounted device, and the vehicle-mounted device receives the image.
In this embodiment, to reduce missed detections, the image is not subjected to computationally expensive processing such as perspective transformation or the Hough transform; instead, the lane line pixels in the image are extracted based on edge detection. Other methods can also be used to extract the lane line pixels.
Extracting the lane line pixels from the image based on edge detection reduces image-processing complexity, speeds up image processing, improves robustness under complex conditions, and reduces missed detections.
For an autonomous vehicle driving in a lane, the lane lines include: the lane line on the left side of the vehicle, called the left lane line for short, and the lane line on the right side of the vehicle, called the right lane line for short.
In this embodiment, determining the lane line pixels that belong to the same lane line means classifying the lane line pixels: pixels of the same class lie on the same lane line, the same lane line being either the left lane line or the right lane line.
In addition, because the lane line pixels are extracted based on edge detection, which reduces image-processing complexity and speeds up image processing, falsely detected lane line pixels are inevitably increased.
To improve the accuracy of lane line detection, in this embodiment, after the lane line pixels belonging to the same lane line are determined, the falsely detected pixels are excluded from them by screening, which improves the accuracy of lane line detection.
After the lane line pixels belonging to the same lane line are screened, the valid pixels of the lane line, called the pixels to be fitted, are obtained. By fitting the pixels to be fitted, the lane line can be obtained. The fitting method can follow the prior art and is not repeated here.
The lane line detection method disclosed in this embodiment can be applied to the development of autonomous driving functions such as lane keeping assist (LKA, Lane Keeping Assist) and lane departure warning (LDW, Lane Departure Warning).
As it can be seen that excluding the lane line pixel of erroneous detection by screening in the present embodiment, improving the accurate of lane detection
Property.Since without deep neural network, the requirement to computing platform is reduced, being easy to large-scale vehicle rule grade deployment makes
With.
In addition, in the present embodiment, by extracting the lane line pixel in image based on edge detection, without to image into
The complex process such as row perspective transform or Hough transformation, reduce image procossing complexity, accelerate image processing speed, improve
The robustness of complex working condition reduces missing inspection.
In some embodiments, extracting the lane line pixels from the image may include the following steps (1) to (3):
(1) set a region of interest in the image;
(2) perform edge detection on the region of interest;
(3) extract the pixels obtained by edge detection at alternate rows to obtain the lane line pixels.
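Steps (1) to (3) can be sketched as follows. A rectangular ROI and a plain horizontal intensity gradient stand in for the dynamic trapezoidal ROI and the real edge detector, neither of which is fully specified here; the threshold value is illustrative:

```python
import numpy as np

def extract_lane_pixels(image, roi, threshold=50):
    """Crop a rectangular ROI, apply a minimal horizontal-gradient edge
    response, and keep only every other row of the detected edge pixels
    (the alternate-row extraction of step (3))."""
    top, bottom, left, right = roi
    region = image[top:bottom, left:right].astype(int)
    grad = np.abs(np.diff(region, axis=1))        # |I(x+1) - I(x)| per row
    edge_rows, edge_cols = np.nonzero(grad >= threshold)
    keep = edge_rows % 2 == 0                     # alternate-row (interlaced) pick
    # map back to full-image pixel coordinates
    return [(r + top, c + left) for r, c in zip(edge_rows[keep], edge_cols[keep])]
```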
In this embodiment, setting a region of interest (ROI, Region Of Interest) in the image reduces the amount of computation, increases processing speed, and improves robustness under complex conditions. Specifically, a dynamic ROI can be set based on the position, within the image sensor's field of view, of the lane line detected in the previous frame.
The boundary of the dynamic ROI can be adjusted dynamically as the autonomous vehicle moves. For example, the initial ROI, or the ROI while the autonomous vehicle is driving within a lane, is a trapezoidal area of a preset size; if the vehicle changes lanes to the left, the relative position of the lane line within the image sensor's field of view rotates clockwise, so the boundary of the ROI should also rotate clockwise.
In addition, the size of the ROI can be adjusted based on the value of the fitted first-order term corresponding to the lane line detected in the previous frame. In some applications, the fitted first-order term and the size of the ROI are positively correlated; that is, the larger the fitted first-order term, the larger the ROI.
In this embodiment, to improve robustness under complex conditions and reduce missed detections, a lower detection threshold can be set when edge detection is performed on the region of interest. This reduces the probability of missed detections, but falsely detected lane line pixels inevitably appear.
To reduce the falsely detected lane line pixels and further increase processing speed, the pixels obtained by edge detection can be extracted at alternate rows to obtain the lane line pixels.
In some embodiments, determining the lane line pixels that belong to the same lane line may include the following steps (1) and (2):
(1) determine the world coordinates of the lane line pixels;
(2) cluster the world coordinates to obtain the lane line pixels belonging to the same lane line.
In this embodiment, to determine the lane line pixels belonging to the same lane line, the world coordinates of the lane line pixels must be determined.
The image captured by the image sensor is a projection of the three-dimensional scene, in the world coordinate system, onto the two-dimensional image plane. Its projection principle approximates the pinhole imaging principle and is mathematically a perspective mapping process, whose inverse process is called inverse perspective mapping (IPM). Therefore, IPM can be used to determine the world coordinates of the lane line pixels, including the longitudinal coordinate and the lateral coordinate. Converting two-dimensional coordinates to three-dimensional coordinates using IPM is a mature technique in this field and is not described here.
After the world coordinates of the lane line pixels are determined, the lane line pixels can be classified by clustering, obtaining the lane line pixels belonging to the same lane line.
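As an illustrative sketch of the IPM step described above, a ground-plane homography can map each pixel to world coordinates; the matrix H_inv here is a hypothetical product of camera calibration, not something given in the disclosure:

```python
import numpy as np

def pixel_to_world(u, v, H_inv):
    """Inverse perspective mapping: project image pixel (u, v) onto the
    ground plane, returning (lateral, longitudinal) world coordinates.
    H_inv is the inverse of the ground-to-image homography; in a real
    system it comes from camera calibration (here it is an assumed input)."""
    p = H_inv @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]  # dehomogenize
```

With the identity as a trivial homography the pixel maps to itself, which is a quick sanity check of the dehomogenization.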
There are many ways to cluster the world coordinates; for example, K-means clustering and density-based clustering may be used. K-means clustering requires the number of clusters to be specified and is relatively sensitive to outliers. Density-based clustering is insensitive to outliers, but its time complexity is high.
In some embodiments, clustering the world coordinates using density-based clustering to obtain the lane line pixels belonging to the same lane line specifically includes the following steps (1) and (2):
(1) constructing a KD tree based on the world coordinates;
(2) performing density-based clustering on all the abscissas in the KD tree to obtain the lane line pixels belonging to the same lane line.
In this embodiment, constructing a KD tree over the world coordinates increases the clustering speed. Specifically, the KD tree is constructed based on the abscissas of the world coordinates of all the lane line pixels belonging to the same lane line.
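Because the clustering described above runs on abscissas only (a single dimension), the density grouping can be illustrated compactly by sorting and splitting at gaps wider than a neighborhood radius; in practice a KD tree (as in scikit-learn's DBSCAN) accelerates the neighborhood queries. The eps value below is an assumption:

```python
def cluster_abscissas(xs, eps=0.5):
    """Density clustering in one dimension: sort the lateral (abscissa)
    values and start a new cluster wherever the gap between neighbours
    exceeds the neighbourhood radius eps. Returns index lists, one list
    per lane line."""
    if not xs:
        return []
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    clusters, current = [], [order[0]]
    for prev, idx in zip(order, order[1:]):
        if xs[idx] - xs[prev] > eps:
            clusters.append(current)
            current = []
        current.append(idx)
    clusters.append(current)
    return clusters
```

Two groups of abscissas roughly 3 meters apart, as for two lane lines, come back as two clusters.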
In some embodiments, screening the lane line pixels belonging to the same lane line to obtain the pixels to be fitted may include the following steps (1) and (2):
(1) rejecting the outer pixels among the lane line pixels belonging to the same lane line;
(2) performing a fitting judgment on the lane line pixels remaining after the outer pixels are rejected, obtaining the pixels to be fitted.
Whether for the left-hand lane line or the right-hand lane line, a lane line has a certain width. After the lane line pixels in the image are extracted based on edge detection, each lane line consists of an outer line and an inner line: the outer line is composed of outer pixels, and the inner line is composed of inner pixels. The outer line can be understood as the line farther from the inside of the lane, and the inner line as the line closer to the inside of the lane.
To eliminate the influence of lane line width on subsequent fitting results, the outer pixels among the lane line pixels belonging to the same lane line need to be rejected, which also excludes some falsely detected lane line pixels.
To eliminate the false detections that extracting lane line pixels based on edge detection may cause, and to improve fitting precision, a fitting judgment is performed on the lane line pixels remaining after the outer pixels are rejected, obtaining the pixels to be fitted.
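One possible realization of the outer-pixel rejection (an assumption, not the disclosed implementation) keeps, for each longitudinal value, only the pixel nearer the lane interior: the larger lateral coordinate for a left-hand line, the smaller for a right-hand line:

```python
def reject_outer_pixels(points, left_line=True):
    """points: (longitudinal, lateral) pairs of one lane line.
    For each longitudinal value keep only the inner pixel, i.e. the one
    closer to the lane interior, so that line width does not bias the
    subsequent curve fit."""
    keep = max if left_line else min
    inner = {}
    for y, x in points:
        inner[y] = x if y not in inner else keep(inner[y], x)
    return sorted(inner.items())
```

For a left-hand line, two pixels at the same longitudinal position collapse to the one with the larger lateral coordinate.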
In some embodiments, performing the fitting judgment on the lane line pixels remaining after the outer pixels are rejected, to obtain the pixels to be fitted, is specifically:
if the minimum ordinate of the remaining lane line pixels is less than or equal to a preset distance, and the difference between the minimum ordinate and the maximum ordinate is greater than or equal to a first preset spacing, determining that the remaining lane line pixels are the pixels to be fitted.
In this embodiment, IPM can be used to determine the world coordinates of the lane line pixels, comprising a longitudinal coordinate and a lateral coordinate.
In this embodiment, if the minimum ordinate of the remaining lane line pixels is greater than the preset distance, the lane line pixel closest to the autonomous vehicle is still far away and will not affect the travel of the autonomous vehicle, so there is no need to perform fitting.
The preset distance is, for example, 15 meters; this is by way of example only, and this embodiment does not limit the specific value of the preset distance.
In this embodiment, if the difference between the minimum ordinate and the maximum ordinate is less than the first preset spacing, the detected lane line pixels may belong to a road marking, such as a turn arrow or another ground marker, rather than to a lane line, so there is no need to perform fitting.
The first preset spacing is, for example, 4 meters; this is by way of example only, and this embodiment does not limit the specific value of the first preset spacing.
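The two checks above can be combined into a single fitting-judgment predicate; the 15 m and 4 m defaults below are the example values given in the text, used here for illustration only:

```python
def should_fit(longitudinals, preset_distance=15.0, first_spacing=4.0):
    """Fitting judgment: fit only when the nearest pixel is close enough
    to matter (minimum ordinate <= preset distance) and the pixels span
    enough road (>= first preset spacing) to be a lane line rather than
    a ground marker such as a turn arrow."""
    y_min, y_max = min(longitudinals), max(longitudinals)
    return y_min <= preset_distance and (y_max - y_min) >= first_spacing
```

Pixels starting at 5 m and spanning 15 m pass; pixels starting beyond 15 m, or spanning under 4 m, are skipped.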
In some embodiments, fitting the pixels to be fitted to obtain a lane line may include the following (1) and (2):
(1) if the difference between the minimum ordinate and the maximum ordinate is greater than a second preset spacing, fitting the pixels to be fitted with a cubic curve to obtain the lane line;
(2) if the difference between the minimum ordinate and the maximum ordinate is less than or equal to the second preset spacing, fitting the pixels to be fitted with a quadratic curve to obtain the lane line.
In this embodiment, if the difference between the minimum ordinate and the maximum ordinate is greater than the second preset spacing, the lane line extends farther and contains more lane line pixels, so a cubic curve fit is more accurate.
The second preset spacing is, for example, 20 meters; this is by way of example only, and this embodiment does not limit the specific value of the second preset spacing.
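A minimal sketch of the span-dependent fit, using numpy's least-squares polyfit as a stand-in for whatever fitter the device actually employs; the 20 m default is the example value above:

```python
import numpy as np

def fit_lane_line(ys, xs, second_spacing=20.0):
    """Fit lateral = f(longitudinal). A long span means the line runs far
    and supplies many pixels, so a cubic captures its curvature more
    accurately; otherwise a quadratic suffices. Returns numpy polynomial
    coefficients, highest order first."""
    degree = 3 if (max(ys) - min(ys)) > second_spacing else 2
    return np.polyfit(ys, xs, degree)
```

A 30 m span triggers the cubic (four coefficients); a 10 m span falls back to the quadratic (three coefficients).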
In some embodiments, the lane line detection method may further include the step of updating the tracked lane lines based on the fitted lane line and vehicle information; specifically, this may include the following steps (1) to (3):
(1) predicting the lane lines of the image based on the tracked lane lines and the vehicle information;
(2) matching the fitted lane line with the predicted lane lines;
(3) if they match, splicing the fitted lane line with the tracked lane line.
In this embodiment, the image currently being processed can be understood as the current frame image. Based on the tracked lane lines and the vehicle information, the lane lines of the current frame image can be predicted.
In this embodiment, the vehicle information is the motion information of the autonomous vehicle while driving, for example including but not limited to speed and angular speed. The vehicle information is acquired by a sensor group mounted on the autonomous vehicle; the sensor group includes but is not limited to a speed sensor and an angular speed sensor. The sensor group transmits the acquired data to the vehicle-mounted device in real time.
The lane line fitted from the current frame image can be matched against the predicted lane lines. If they match, the two belong to the same class, for example both are the left-hand lane line or both are the right-hand lane line. If they do not match, the fitted lane line is a newly appearing line that belongs neither to the left-hand lane line nor to the right-hand lane line; it should be a road marking, such as a turn arrow or another ground marker, and is not a lane line.
A tracked lane line can be understood as a lane line that has already been detected from historical images. Each time the vehicle-mounted device acquires a frame image from the image sensor, it can detect the lane line and splice the detected lane line with the tracked lane line, completing the update of the tracked lane line.
In some embodiments, predicting the lane lines of the image based on the tracked lane lines and the vehicle information may include: predicting the parameters of the curve corresponding to the lane line of the image based on the speed, the angular speed, the time interval between the two frame images, and the parameters of the curve corresponding to the lane line of the previous frame image.
In this embodiment, the image currently being processed can be understood as the current frame image, and the previous frame image can be understood as the frame immediately preceding the current frame image.
Since a lane line is composed of multiple lane line pixels, and specifically a lane line is the curve obtained by fitting multiple lane line pixels, lane lines and curves correspond one to one.
Correspondingly, matching the fitted lane line with the predicted lane line is specifically: performing similarity matching between the parameters of the curve corresponding to the fitted lane line and the parameters of the curve corresponding to the predicted lane line.
In some embodiments, the similarity matching may be performed on the constant terms and first-order terms of the fitted curves of the fitted lane line and the predicted lane line.
The similarity matching uses, for example, Euclidean distance: specifically, if the Euclidean distance is less than a preset threshold, the two are classified into the same class. For example, if the fitted lane line and the predicted left-hand lane line satisfy that the Euclidean distance is less than the preset threshold, the fitted lane line is the left-hand lane line.
If the fitted lane line satisfies, with respect to both the predicted left-hand lane line and the predicted right-hand lane line, that the Euclidean distance is greater than the preset threshold, the fitted lane line is a newly appearing line that belongs neither to the left-hand lane line nor to the right-hand lane line; it should be a road marking, such as a turn arrow or another ground marker, and is not a lane line.
It can be seen that matching the fitted lane line against the predicted lane lines can further reduce the probability of false detection.
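A sketch of the similarity matching on the (constant term, first-order term) pairs described above; the distance threshold and the dictionary of predicted lines are illustrative assumptions:

```python
import math

def match_lane_line(fitted, predicted, threshold=0.5):
    """Similarity matching on (constant term, first-order term) pairs.
    Returns the label of the closest predicted lane line whose Euclidean
    distance to the fitted curve is below threshold; None means the
    fitted line matches nothing and is likely a ground marker."""
    best_label, best_dist = None, threshold
    for label, (a0, a1) in predicted.items():
        d = math.hypot(fitted[0] - a0, fitted[1] - a1)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label
```

A fitted curve close to the predicted left line is labeled accordingly; one near the lane center matches neither side and is rejected.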
In some embodiments, predicting the parameters of the curve corresponding to the lane line of the image based on the speed, the angular speed, the time interval between the two frame images, and the parameters of the curve corresponding to the lane line of the previous frame image includes:
predicting the constant term and first-order term of the curve corresponding to the lane line of the image based on the speed, the angular speed, the time interval between the two frame images, and the constant term and first-order term of the curve corresponding to the lane line of the previous frame image.
In this embodiment, the image currently being processed can be understood as the current frame image, and the previous frame image can be understood as the frame immediately preceding the current frame image.
In this embodiment, the lane line of the current frame image is predicted by a formula over the following quantities: Δt, the time interval between the two frame images; v_k, the speed corresponding to the previous frame image; w_k, the angular speed corresponding to the previous frame image; a_{0,k} and a_{1,k}, the constant term and first-order term of the curve corresponding to the lane line in the previous frame image; and a_{0,k+1} and a_{1,k+1}, the constant term and first-order term of the curve corresponding to the lane line in the current frame image.
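As a sketch only: one plausible small-angle form of the prediction step, consistent with the variables defined above, is that the constant term drifts with the first-order term times the distance traveled, while the first-order term rotates with the yaw rate. This exact update is an assumption, since the published formula is given only as a figure:

```python
def predict_lane_params(a0, a1, v, w, dt):
    """Predict the current frame's curve parameters from the previous
    frame's constant term a0, first-order term a1, speed v, angular
    speed w and frame interval dt (small-angle model; the exact update
    used by the patent is published only as a figure)."""
    a0_next = a0 + a1 * v * dt  # lateral offset drifts with heading * distance
    a1_next = a1 - w * dt       # relative heading changes with yaw rate
    return a0_next, a1_next
```

Driving straight (w = 0) shifts only the offset; yawing with no heading term changes only the slope.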
Based on the lane line detection methods disclosed in the above embodiments, after the sample image for lane line detection shown in Fig. 4 is processed, the lane line binary image shown in Fig. 5 can be obtained.
As shown in Fig. 3, this embodiment discloses a lane line detection apparatus, which may include the following units: a first acquisition unit 31, an extraction unit 32, a determination unit 33, a screening unit 34 and a fitting unit 35, described as follows:
the first acquisition unit 31 is configured to obtain an image acquired by an image sensor;
the extraction unit 32 is configured to extract the lane line pixels in the image;
the determination unit 33 is configured to determine the lane line pixels belonging to the same lane line;
the screening unit 34 is configured to screen the lane line pixels belonging to the same lane line to obtain the pixels to be fitted;
the fitting unit 35 is configured to fit the pixels to be fitted to obtain a lane line.
In some embodiments, the lane line detection apparatus further includes:
an updating unit, configured to update the tracked lane lines based on the fitted lane line and vehicle information.
In some embodiments, the updating unit includes:
a first subunit, configured to predict the lane lines of the image based on the tracked lane lines and the vehicle information;
a second subunit, configured to match the fitted lane line with the predicted lane lines;
a third subunit, configured to splice the fitted lane line with the tracked lane line if they match.
In some embodiments, the vehicle information includes speed and angular speed;
the first subunit is configured to predict the parameters of the curve corresponding to the lane line of the image based on the speed, the angular speed, the time interval between the two frame images, and the parameters of the curve corresponding to the lane line of the previous frame image.
In some embodiments, the second subunit is configured to perform similarity matching between the parameters of the curve corresponding to the fitted lane line and the parameters of the curve corresponding to the predicted lane line.
In some embodiments, the first subunit is configured to predict the constant term and first-order term of the curve corresponding to the lane line of the image based on the speed, the angular speed, the time interval between the two frame images, and the constant term and first-order term of the curve corresponding to the lane line of the previous frame image.
In some embodiments, the extraction unit 32 is configured to:
set a region of interest in the image;
perform edge detection on the region of interest;
extract, every other row, the pixels obtained by the edge detection, to obtain the lane line pixels.
In some embodiments, the determination unit 33 includes:
a first subunit, configured to determine the world coordinates of the lane line pixels;
a second subunit, configured to cluster the world coordinates to obtain the lane line pixels belonging to the same lane line.
In some embodiments, the second subunit is configured to:
construct a KD tree based on the world coordinates;
perform density-based clustering on all the abscissas in the KD tree to obtain the lane line pixels belonging to the same lane line.
In some embodiments, the screening unit 34 includes:
a first subunit, configured to reject the outer pixels among the lane line pixels belonging to the same lane line;
a second subunit, configured to perform a fitting judgment on the lane line pixels remaining after the outer pixels are rejected, to obtain the pixels to be fitted.
In some embodiments, the second subunit is configured to determine that the remaining lane line pixels are the pixels to be fitted if the minimum ordinate of the remaining lane line pixels is less than or equal to the preset distance and the difference between the minimum ordinate and the maximum ordinate is greater than or equal to the first preset spacing.
In some embodiments, the fitting unit 35 is configured to:
fit the pixels to be fitted with a cubic curve to obtain the lane line if the difference between the minimum ordinate and the maximum ordinate is greater than the second preset spacing;
fit the pixels to be fitted with a quadratic curve to obtain the lane line if the difference between the minimum ordinate and the maximum ordinate is less than or equal to the second preset spacing.
The lane line detection apparatus disclosed in the above embodiments can implement the processes of the lane line detection methods disclosed in the above method embodiments; to avoid repetition, details are not described here again.
It should be noted that, in this document, the terms "include", "comprise" or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or apparatus that includes a series of elements includes not only those elements but also other elements not explicitly listed, or also includes elements inherent to such a process, method, article or apparatus. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article or apparatus that includes the element.
Those skilled in the art will appreciate that, although some embodiments described herein include certain features included in other embodiments rather than other features, combinations of features of different embodiments are meant to be within the scope of the present invention and form different embodiments.
Although the embodiments of the present invention have been described with reference to the accompanying drawings, those skilled in the art can make various modifications and variations without departing from the spirit and scope of the invention, and such modifications and variations all fall within the scope defined by the appended claims.
Claims (10)
1. A lane line detection method, characterized by comprising:
obtaining an image acquired by an image sensor;
extracting lane line pixels in the image;
determining the lane line pixels belonging to the same lane line;
screening the lane line pixels belonging to the same lane line to obtain pixels to be fitted;
fitting the pixels to be fitted to obtain a lane line.
2. The method according to claim 1, characterized by further comprising:
updating tracked lane lines based on the fitted lane line and vehicle information.
3. The method according to claim 2, characterized in that updating the tracked lane lines based on the fitted lane line and the vehicle information comprises:
predicting lane lines of the image based on the tracked lane lines and the vehicle information;
matching the fitted lane line with the predicted lane lines;
if they match, splicing the fitted lane line with the tracked lane line.
4. The method according to claim 3, characterized in that:
the vehicle information comprises speed and angular speed;
predicting the lane lines of the image based on the tracked lane lines and the vehicle information comprises:
predicting parameters of a curve corresponding to the lane line of the image based on the speed, the angular speed, a time interval between two frame images, and parameters of a curve corresponding to the lane line of a previous frame image.
5. The method according to claim 4, characterized in that matching the fitted lane line with the predicted lane line comprises:
performing similarity matching between the parameters of the curve corresponding to the fitted lane line and the parameters of the curve corresponding to the predicted lane line.
6. The method according to claim 4, characterized in that predicting the parameters of the curve corresponding to the lane line of the image based on the speed, the angular speed, the time interval between the two frame images and the parameters of the curve corresponding to the lane line of the previous frame image comprises:
predicting a constant term and a first-order term of the curve corresponding to the lane line of the image based on the speed, the angular speed, the time interval between the two frame images, and a constant term and a first-order term of the curve corresponding to the lane line of the previous frame image.
7. The method according to claim 1, characterized in that extracting the lane line pixels in the image comprises:
setting a region of interest in the image;
performing edge detection on the region of interest;
extracting, every other row, pixels obtained by the edge detection, to obtain the lane line pixels.
8. The method according to claim 1, characterized in that determining the lane line pixels belonging to the same lane line comprises:
determining world coordinates of the lane line pixels;
clustering the world coordinates to obtain the lane line pixels belonging to the same lane line.
9. A lane line detection apparatus, characterized by comprising:
a first acquisition unit, configured to obtain an image acquired by an image sensor;
an extraction unit, configured to extract lane line pixels in the image;
a determination unit, configured to determine the lane line pixels belonging to the same lane line;
a screening unit, configured to screen the lane line pixels belonging to the same lane line to obtain pixels to be fitted;
a fitting unit, configured to fit the pixels to be fitted to obtain a lane line.
10. A vehicle-mounted device, characterized by comprising:
a processor and a memory;
the processor and the memory being coupled via a bus system;
the processor being configured to execute the steps of the method according to any one of claims 1 to 8 by invoking a program or instructions stored in the memory.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910139130.6A CN109977776B (en) | 2019-02-25 | 2019-02-25 | Lane line detection method and device and vehicle-mounted equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910139130.6A CN109977776B (en) | 2019-02-25 | 2019-02-25 | Lane line detection method and device and vehicle-mounted equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109977776A true CN109977776A (en) | 2019-07-05 |
CN109977776B CN109977776B (en) | 2023-06-23 |
Family
ID=67077335
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910139130.6A Active CN109977776B (en) | 2019-02-25 | 2019-02-25 | Lane line detection method and device and vehicle-mounted equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109977776B (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110345952A (en) * | 2019-07-09 | 2019-10-18 | 同济人工智能研究院(苏州)有限公司 | A kind of serializing lane line map constructing method and building system |
CN110929655A (en) * | 2019-11-27 | 2020-03-27 | 厦门金龙联合汽车工业有限公司 | Lane line identification method in driving process, terminal device and storage medium |
CN111141306A (en) * | 2020-01-07 | 2020-05-12 | 深圳南方德尔汽车电子有限公司 | A-star-based global path planning method and device, computer equipment and storage medium |
CN112528859A (en) * | 2020-12-11 | 2021-03-19 | 中国第一汽车股份有限公司 | Lane line detection method, device, equipment and storage medium |
CN112912895A (en) * | 2021-01-29 | 2021-06-04 | 华为技术有限公司 | Detection method and device and vehicle |
CN112989991A (en) * | 2021-03-09 | 2021-06-18 | 贵州京邦达供应链科技有限公司 | Lane line detection method and apparatus, electronic device, and computer-readable storage medium |
CN113657174A (en) * | 2021-07-21 | 2021-11-16 | 北京中科慧眼科技有限公司 | Vehicle pseudo-3D information detection method and device and automatic driving system |
CN114724119A (en) * | 2022-06-09 | 2022-07-08 | 天津所托瑞安汽车科技有限公司 | Lane line extraction method, lane line detection apparatus, and storage medium |
EP4089648A1 (en) * | 2021-05-10 | 2022-11-16 | Nio Technology (Anhui) Co., Ltd | Lane edge extraction method and apparatus, autonomous driving system, vehicle, and storage medium |
CN117575920A (en) * | 2023-12-01 | 2024-02-20 | 昆易电子科技(上海)有限公司 | Lane line optimization method, lane line optimization device and storage medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08297023A (en) * | 1995-04-26 | 1996-11-12 | Hitachi Ltd | Apparatus for monitoring running road face by image processing |
CN104008645A (en) * | 2014-06-12 | 2014-08-27 | 湖南大学 | Lane line predicating and early warning method suitable for city road |
CN104318258A (en) * | 2014-09-29 | 2015-01-28 | 南京邮电大学 | Time domain fuzzy and kalman filter-based lane detection method |
CN105320927A (en) * | 2015-03-25 | 2016-02-10 | 中科院微电子研究所昆山分所 | Lane line detection method and system |
CN106326822A (en) * | 2015-07-07 | 2017-01-11 | 北京易车互联信息技术有限公司 | Method and device for detecting lane line |
CN107097794A (en) * | 2016-12-15 | 2017-08-29 | 财团法人车辆研究测试中心 | The detecting system and its method of terrain vehicle diatom |
US20180181817A1 (en) * | 2015-09-10 | 2018-06-28 | Baidu Online Network Technology (Beijing) Co., Ltd. | Vehicular lane line data processing method, apparatus, storage medium, and device |
-
2019
- 2019-02-25 CN CN201910139130.6A patent/CN109977776B/en active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08297023A (en) * | 1995-04-26 | 1996-11-12 | Hitachi Ltd | Apparatus for monitoring running road face by image processing |
CN104008645A (en) * | 2014-06-12 | 2014-08-27 | 湖南大学 | Lane line predicating and early warning method suitable for city road |
CN104318258A (en) * | 2014-09-29 | 2015-01-28 | 南京邮电大学 | Time domain fuzzy and kalman filter-based lane detection method |
CN105320927A (en) * | 2015-03-25 | 2016-02-10 | 中科院微电子研究所昆山分所 | Lane line detection method and system |
CN106326822A (en) * | 2015-07-07 | 2017-01-11 | 北京易车互联信息技术有限公司 | Method and device for detecting lane line |
US20180181817A1 (en) * | 2015-09-10 | 2018-06-28 | Baidu Online Network Technology (Beijing) Co., Ltd. | Vehicular lane line data processing method, apparatus, storage medium, and device |
CN107097794A (en) * | 2016-12-15 | 2017-08-29 | 财团法人车辆研究测试中心 | The detecting system and its method of terrain vehicle diatom |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110345952A (en) * | 2019-07-09 | 2019-10-18 | 同济人工智能研究院(苏州)有限公司 | A kind of serializing lane line map constructing method and building system |
CN110929655A (en) * | 2019-11-27 | 2020-03-27 | 厦门金龙联合汽车工业有限公司 | Lane line identification method in driving process, terminal device and storage medium |
CN110929655B (en) * | 2019-11-27 | 2023-04-14 | 厦门金龙联合汽车工业有限公司 | Lane line identification method in driving process, terminal device and storage medium |
CN111141306A (en) * | 2020-01-07 | 2020-05-12 | 深圳南方德尔汽车电子有限公司 | A-star-based global path planning method and device, computer equipment and storage medium |
CN112528859A (en) * | 2020-12-11 | 2021-03-19 | 中国第一汽车股份有限公司 | Lane line detection method, device, equipment and storage medium |
CN112912895A (en) * | 2021-01-29 | 2021-06-04 | 华为技术有限公司 | Detection method and device and vehicle |
CN112912895B (en) * | 2021-01-29 | 2022-07-22 | 华为技术有限公司 | Detection method and device and vehicle |
CN112989991A (en) * | 2021-03-09 | 2021-06-18 | 贵州京邦达供应链科技有限公司 | Lane line detection method and apparatus, electronic device, and computer-readable storage medium |
EP4089648A1 (en) * | 2021-05-10 | 2022-11-16 | Nio Technology (Anhui) Co., Ltd | Lane edge extraction method and apparatus, autonomous driving system, vehicle, and storage medium |
CN113657174A (en) * | 2021-07-21 | 2021-11-16 | 北京中科慧眼科技有限公司 | Vehicle pseudo-3D information detection method and device and automatic driving system |
CN114724119A (en) * | 2022-06-09 | 2022-07-08 | 天津所托瑞安汽车科技有限公司 | Lane line extraction method, lane line detection apparatus, and storage medium |
CN117575920A (en) * | 2023-12-01 | 2024-02-20 | 昆易电子科技(上海)有限公司 | Lane line optimization method, lane line optimization device and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN109977776B (en) | 2023-06-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109977776A (en) | A kind of method for detecting lane lines, device and mobile unit | |
CN110084292B (en) | Target detection method based on DenseNet and multi-scale feature fusion | |
CN110414507B (en) | License plate recognition method and device, computer equipment and storage medium | |
CN110348297B (en) | Detection method, system, terminal and storage medium for identifying stereo garage | |
CN109117825B (en) | Lane line processing method and device | |
CN109099929B (en) | Intelligent vehicle positioning device and method based on scene fingerprints | |
CN108960211B (en) | Multi-target human body posture detection method and system | |
CN110378297B (en) | Remote sensing image target detection method and device based on deep learning and storage medium | |
US11386674B2 (en) | Class labeling system for autonomous driving | |
US20170124415A1 (en) | Subcategory-aware convolutional neural networks for object detection | |
EP3844669A1 (en) | Method and system for facilitating recognition of vehicle parts based on a neural network | |
US20120147189A1 (en) | Adaptation for clear path detection using reliable local model updating | |
CN110879950A (en) | Multi-stage target classification and traffic sign detection method and device, equipment and medium | |
CN110659545B (en) | Training method of vehicle identification model, vehicle identification method, device and vehicle | |
Mahaur et al. | Road object detection: a comparative study of deep learning-based algorithms | |
WO2020258077A1 (en) | Pedestrian detection method and device | |
CN112613344B (en) | Vehicle track occupation detection method, device, computer equipment and readable storage medium | |
CN108898057B (en) | Method, device, computer equipment and storage medium for tracking target detection | |
CN111461145A (en) | Method for detecting target based on convolutional neural network | |
EP3726421A2 (en) | Recognition method and apparatus for false detection of an abandoned object and image processing device | |
CN114913498A (en) | Parallel multi-scale feature aggregation lane line detection method based on key point estimation | |
Gu et al. | Embedded and real-time vehicle detection system for challenging on-road scenes | |
Al Mamun et al. | Efficient lane marking detection using deep learning technique with differential and cross-entropy loss. | |
CN112163521A (en) | Vehicle driving behavior identification method, device and equipment | |
CN114898306B (en) | Method and device for detecting target orientation and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |