CN110163047A - Method and device for detecting lane lines - Google Patents

Method and device for detecting lane lines

Info

Publication number
CN110163047A
CN110163047A
Authority
CN
China
Prior art keywords
point cloud
point cloud image
vehicle
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810732366.6A
Other languages
Chinese (zh)
Other versions
CN110163047B (en)
Inventor
陈仁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Tencent Dadi Tongtu Beijing Technology Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Tencent Dadi Tongtu Beijing Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd and Tencent Dadi Tongtu Beijing Technology Co Ltd
Priority to CN201810732366.6A
Publication of CN110163047A
Application granted
Publication of CN110163047B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformation in the plane of the image
    • G06T3/40 Scaling the whole image or part thereof
    • G06T3/4038 Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256 Lane; Road marking
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Abstract

Embodiments of the invention provide a method and device for detecting lane lines, relating to the technical field of lane line detection. The method comprises: obtaining N frames of point cloud images determined by a vehicle-mounted laser device through emitting laser signals; stitching the N frames of point cloud images into a target point cloud image; determining, according to the target point cloud image, a point cloud intensity image corresponding to the target point cloud image; and then determining lane lines from the point cloud intensity image corresponding to the target point cloud image. Because laser signals are only slightly affected by illumination, using the laser signals emitted by the vehicle-mounted laser device to obtain point cloud images for lane line detection improves the real-time performance and stability of detection. Because the stitched point cloud image contains more information, performing lane line detection on the stitched point cloud image improves detection precision. Detecting lane lines in the point cloud intensity image corresponding to the point cloud image avoids analyzing the point cloud image directly in three-dimensional space and improves detection efficiency.

Description

Method and device for detecting lane lines
Technical field
Embodiments of the present invention relate to the technical field of lane line detection, and in particular to a method and device for detecting lane lines.
Background technique
A core technical component of autonomous driving is vehicle localization: only after the relative position of the vehicle on the road is known can an autonomous vehicle determine its location in the environment and on the map, and then carry out subsequent tasks such as computing the relative positions of other vehicles and objects with respect to the ego vehicle. Currently, the vehicle can be localized by detecting lane lines. When detecting lane lines, images are first captured with the vehicle's front or rear camera, and the captured images are then processed to determine the lane lines. Because a camera is a passive-light-source sensing system, the captured images are blurry under poor illumination conditions such as at night, so the precision of lane line detection is low. Besides the needs of autonomous driving, lane line detection technology may also be used in other fields.
Summary of the invention
Because in the prior art the precision of detecting lane lines from road images captured by a camera is strongly affected by illumination, embodiments of the invention provide a method and device for detecting lane lines, so as to improve the precision of lane line detection.
In a first aspect, an embodiment of the invention provides a method for detecting lane lines. The method comprises: obtaining N frames of point cloud images determined by a vehicle-mounted laser device through emitting laser signals, N being a positive integer; stitching the N frames of point cloud images into a target point cloud image; determining, according to the target point cloud image, a point cloud intensity image corresponding to the target point cloud image; and then determining lane lines from the point cloud intensity image corresponding to the target point cloud image. Because the process by which the vehicle-mounted laser device obtains point cloud images through emitting laser signals is only slightly affected by external interference and adapts well to factors such as light sources and weather, when the point cloud images obtained from the emitted laser signals are processed to determine lane lines, the detection precision is also only slightly affected by lighting and weather, which improves the real-time performance and stability of lane line detection. Second, the stitched point cloud image contains more lane information and is effectively an enhanced point cloud image; therefore, when detecting lane lines from point cloud images, multiple frames are first stitched and lane line detection is performed on the stitched image, which improves detection precision. In addition, when detecting lane lines from the stitched point cloud image, the point cloud image is converted into a point cloud intensity image, which avoids analyzing the point cloud image directly in three-dimensional space; this simplifies the lane line detection process on the one hand and improves detection efficiency on the other.
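For orientation, a minimal Python sketch of these four steps follows. It assumes a representation the patent does not prescribe: each frame is an (M, 4) NumPy array of x, y, z and reflection intensity, and each frame comes with a rotation and translation into the first frame's coordinate system; the grid size and threshold are likewise illustrative, not the claimed implementation.

```python
import numpy as np

def detect_lane_lines(frames, transforms, cell=0.1, grid=(500, 200), thresh=0.6):
    """Illustrative end-to-end flow: stitch N frames, build the point cloud
    intensity image, and segment it by pixel gray value. All parameters assumed."""
    # Steps 1-2: bring every frame into a common coordinate system and merge
    # the points, giving the stitched target point cloud image.
    parts = []
    for pts, (R, t) in zip(frames, transforms):
        xyz = pts[:, :3] @ R.T + t                  # rotate, then translate
        parts.append(np.hstack([xyz, pts[:, 3:4]]))
    cloud = np.vstack(parts)

    # Step 3: top-down projection keeping reflection intensity as the gray value.
    img = np.zeros(grid)
    ix = np.clip((cloud[:, 0] / cell).astype(int), 0, grid[0] - 1)
    iy = np.clip((cloud[:, 1] / cell).astype(int) + grid[1] // 2, 0, grid[1] - 1)
    np.maximum.at(img, (ix, iy), cloud[:, 3])

    # Step 4: bright (high-reflectance) pixels are lane-line candidates.
    return img > thresh * img.max()
```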
Optionally, stitching the N frames of point cloud images into the target point cloud image comprises:
determining relative position information between the N frames of point cloud images according to position information of the vehicle-mounted laser device when it emits the laser signals;
stitching the N frames of point cloud images in sequence according to the relative position information between them, to determine the target point cloud image. Because the point cloud image determined by the vehicle-mounted laser device through emitting a laser signal differs when the device is at a different position, before the N frames are stitched the relative position information between them is first determined from the position information of the device when it emitted the laser signals, and the frames are then stitched according to that relative position information to determine the target point cloud image, which improves the accuracy of the target point cloud image.
Optionally, determining the relative position information between the N frames of point cloud images according to the position information of the vehicle-mounted laser device when it emits the laser signals comprises:
for any two frames of point cloud images, when the vehicle-mounted laser device emits a first laser signal corresponding to one of the two frames, determining first position information of the vehicle-mounted laser device through a vehicle-mounted inertial navigation device;
when the vehicle-mounted laser device emits a second laser signal corresponding to the other frame, determining second position information of the vehicle-mounted laser device through the vehicle-mounted inertial navigation device;
determining, according to the first position information and the second position information, the relative position information of the vehicle-mounted laser device between emitting the first laser signal and emitting the second laser signal;
determining the relative position information of the vehicle-mounted laser device between emitting the first laser signal and emitting the second laser signal as the relative position information between the two frames of point cloud images.
Because the point cloud image determined from a laser signal differs when the vehicle-mounted laser device emits that signal from a different position, the relative position information of the device determines the relative position information between the point cloud images. Since the vehicle-mounted inertial navigation device can determine the position of the laser device in real time, the position of the device at the moment each laser signal is emitted is determined through the inertial navigation device, the relative position between those moments is then derived, and this relative position is used as the relative position information between the corresponding point cloud images, which facilitates the subsequent stitching of the point cloud images.
Optionally, stitching the N frames of point cloud images in sequence according to the relative position information between them to determine the target point cloud image comprises:
unifying the N frames of point cloud images into the same coordinate system according to the relative position information between them;
stitching the N frames of point cloud images after the coordinate systems are unified, to determine the target point cloud image.
Optionally, stitching the N frames of point cloud images after the coordinate systems are unified to determine the target point cloud image comprises:
stitching the first frame and the second frame to determine a first sub-stitched point cloud image;
stitching the M-th sub-stitched point cloud image with the (M+2)-th frame in sequence, until the (M+2)-th frame is the N-th frame and the target point cloud image is determined, where M is an integer greater than or equal to 1 and N is an integer greater than or equal to 3.
Because the N frames are stitched with the first frame as the reference, with the other frames stitched onto it in turn, only the relative position information between the first frame and each of the other N-1 frames needs to be determined before the frames are stitched, which simplifies the image stitching process.
Optionally, stitching the N frames of point cloud images after the coordinate systems are unified to determine the target point cloud image comprises:
a first stitching step: stitching the M-th frame and the (M+1)-th frame to determine N/2 first-level stitched point cloud images, where 1 ≤ M < N and M is odd;
a second stitching step: stitching the L-th first-level stitched point cloud image and the (L+1)-th first-level stitched point cloud image to determine N/4 second-level stitched point cloud images, where 1 ≤ L < N/2, L is odd, and N is an even number greater than or equal to 4;
executing the first stitching step and the second stitching step in turn until only one stitched point cloud image remains, which forms the target point cloud image.
Because the similarity between two adjacent frames is higher than the similarity between non-adjacent frames, stitching adjacent frames gives a smaller stitching error than stitching non-adjacent frames. Therefore, when stitching the N frames, adjacent frames are first stitched according to the relative position information between them to determine multiple sub-stitched point cloud images, and adjacent sub-stitched images are then stitched in turn until the target point cloud image is determined, which reduces the stitching error and improves the accuracy of the target point cloud image.
Optionally, determining lane lines from the point cloud intensity image corresponding to the target point cloud image comprises:
segmenting the point cloud intensity image according to its pixel gray values to determine the lane lines. Because lane lines are generally white or yellow while the road surface is black or gray, the reflection intensity of the lane lines differs markedly from that of the road surface in the point cloud intensity image acquired from the laser signals emitted by the vehicle-mounted laser device, so the pixel gray values of the lane lines also differ markedly from those of the road surface. Segmenting the point cloud intensity image according to its pixel gray values therefore detects the lane lines effectively and improves detection precision.
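A minimal sketch of this gray-value segmentation, assuming the point cloud intensity image is already a 2-D array of pixel gray values; the statistical threshold used here is an illustrative heuristic, not the segmentation prescribed by the patent.

```python
import numpy as np

def segment_lane_pixels(intensity_img, thresh=None):
    """Mark as lane-line pixels those whose gray value stands well above the
    road surface; intensity_img is assumed to be a 2-D array of gray values."""
    img = intensity_img.astype(float)
    if thresh is None:
        # Assumed heuristic: road pixels dominate the image, so a threshold a
        # couple of standard deviations above the mean of the occupied pixels
        # separates the brighter lane markings.
        occupied = img[img > 0]
        thresh = occupied.mean() + 2.0 * occupied.std() if occupied.size else np.inf
    return img >= thresh
```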
In a second aspect, an embodiment of the invention provides a device for detecting lane lines, comprising:
an obtaining module, configured to obtain N frames of point cloud images determined by a vehicle-mounted laser device through emitting laser signals, N being a positive integer;
a stitching module, configured to stitch the N frames of point cloud images into a target point cloud image;
a processing module, configured to determine a point cloud intensity image corresponding to the target point cloud image;
a detection module, configured to determine lane lines from the point cloud intensity image corresponding to the target point cloud image.
Optionally, the stitching module is specifically configured to:
determine relative position information between the N frames of point cloud images according to position information of the vehicle-mounted laser device when it emits the laser signals;
stitch the N frames of point cloud images in sequence according to the relative position information between them, to determine the target point cloud image.
Optionally, the stitching module is specifically configured to:
for any two frames of point cloud images, when the vehicle-mounted laser device emits a first laser signal corresponding to one of the two frames, determine first position information of the vehicle-mounted laser device through a vehicle-mounted inertial navigation device;
when the vehicle-mounted laser device emits a second laser signal corresponding to the other frame, determine second position information of the vehicle-mounted laser device through the vehicle-mounted inertial navigation device;
determine, according to the first position information and the second position information, the relative position information of the vehicle-mounted laser device between emitting the first laser signal and emitting the second laser signal;
determine the relative position information of the vehicle-mounted laser device between emitting the first laser signal and emitting the second laser signal as the relative position information between the two frames of point cloud images.
Optionally, the stitching module is specifically configured to:
unify the N frames of point cloud images into the same coordinate system according to the relative position information between them;
stitch the N frames of point cloud images after the coordinate systems are unified, to determine the target point cloud image.
Optionally, the stitching module is specifically configured to:
stitch the first frame and the second frame to determine a first sub-stitched point cloud image;
stitch the M-th sub-stitched point cloud image with the (M+2)-th frame in sequence, until the (M+2)-th frame is the N-th frame and the target point cloud image is determined, where M is an integer greater than or equal to 1 and N is an integer greater than or equal to 3.
Optionally, the stitching module is specifically configured to:
perform a first stitching step: stitch the M-th frame and the (M+1)-th frame to determine N/2 first-level stitched point cloud images, where 1 ≤ M < N and M is odd;
perform a second stitching step: stitch the L-th first-level stitched point cloud image and the (L+1)-th first-level stitched point cloud image to determine N/4 second-level stitched point cloud images, where 1 ≤ L < N/2 and L is odd;
execute the first stitching step and the second stitching step in turn until only one stitched point cloud image remains, which forms the target point cloud image, where N is an even number greater than or equal to 4.
Optionally, the detection module is specifically configured to:
segment the point cloud intensity image according to its pixel gray values to determine the lane lines.
In a third aspect, an embodiment of the invention provides an automatic driving control method, comprising: detecting lane lines and controlling the travel route according to the lane lines, where the lane lines are detected using the method of any one of the above embodiments.
In a fourth aspect, an embodiment of the invention provides an automatic driving control system, comprising at least one processing unit and at least one storage unit, where the storage unit stores a computer program that, when executed by the processing unit, causes the processing unit to execute the steps of any one of the above methods.
In a fifth aspect, an embodiment of the invention provides a computer-readable medium storing a computer program that, when run, executes the steps of any one of the above methods.
In the embodiments of the invention, because the process by which the vehicle-mounted laser device obtains point cloud images through emitting laser signals adapts well to factors such as light sources and weather, the laser signals emitted by the vehicle-mounted laser device are used to obtain point cloud images, and when these images are processed to determine lane lines, the detection precision is only slightly affected by lighting and weather, which improves the real-time performance and stability of lane line detection. Second, the stitched point cloud image contains more lane information and is effectively an enhanced point cloud image, so multiple frames are first stitched and lane line detection is performed on the stitched image, which improves detection precision. When stitching the point cloud images, because the point cloud image determined by the vehicle-mounted laser device through emitting a laser signal differs when the device is at a different position, the relative position information between the N frames is first determined from the position information of the device when it emitted the laser signals, and the frames are then stitched according to that relative position information to determine the target point cloud image, which improves its accuracy. In addition, when detecting lane lines from the stitched point cloud image, the point cloud image is converted into a point cloud intensity image, which avoids analyzing the point cloud image directly in three-dimensional space; this simplifies the detection process on the one hand and improves detection efficiency on the other.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Apparently, the drawings described below are only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is an application scenario diagram according to an embodiment of the present invention;
Fig. 2 is a schematic structural diagram of a device for detecting lane lines according to an embodiment of the present invention;
Fig. 3 is a schematic flowchart of a method for detecting lane lines according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of lane lines according to an embodiment of the present invention;
Fig. 5 is a schematic flowchart of determining the relative position information of point cloud images according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of position coordinates according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of angle coordinates according to an embodiment of the present invention;
Fig. 8 is a schematic diagram of point cloud image stitching according to an embodiment of the present invention;
Fig. 9 is a schematic diagram of point cloud image stitching according to an embodiment of the present invention;
Fig. 10 is a schematic diagram of point cloud image stitching according to an embodiment of the present invention;
Fig. 11a is a schematic diagram of a point cloud intensity image according to an embodiment of the present invention;
Fig. 11b is a schematic diagram of a point cloud intensity image according to an embodiment of the present invention;
Fig. 12 is a schematic diagram of lane lines according to an embodiment of the present invention;
Fig. 13 is a schematic structural diagram of a device for detecting lane lines according to an embodiment of the present invention;
Fig. 14 is a schematic structural diagram of a driving control system according to an embodiment of the present invention.
Specific embodiments
To make the objectives, technical solutions, and beneficial effects of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are only used to explain the present invention and are not intended to limit it.
For ease of understanding, terms involved in the embodiments of the present invention are explained below.
Lidar: a radar system that emits laser beams to detect characteristic quantities such as the position and speed of a target. By emitting a detection signal (a laser beam) at the target and then comparing the received echo signal reflected from the target (the target echo) with the emitted signal, after appropriate processing, information about the target can be obtained, such as target distance, azimuth, height, speed, attitude, and even shape, so that the target can be detected, tracked, and identified.
Laser point cloud: the set of points sampled on the surface of an object by a measuring instrument is called a point cloud. When a laser beam illuminates the surface of an object, the reflected laser carries information such as direction and distance. If the laser beam is scanned along a certain trajectory, the reflected laser point information is recorded during scanning; because the scanning is extremely fine, a large number of laser points are obtained, which form a laser point cloud.
Laser point cloud image stitching: at any particular moment, the point cloud image acquired by the lidar is an instantaneous image. A positioning device, such as a vehicle-mounted inertial navigation device, is usually provided in the vehicle or in the vehicle-mounted laser device; using the position information output by the vehicle-mounted inertial navigation device, the point cloud images of adjacent moments can be stitched together to obtain a point cloud image covering several moments. The more frames that are stitched, the richer the point cloud image and the more obvious its features, which provides abundant data support for environment perception.
Point cloud intensity information: the reflection intensity information collected after the vehicle-mounted laser device emits a laser signal. The reflection intensity is related to the surface material and roughness of the target, the angle of incidence, the emission energy of the laser device, and the laser wavelength.
In practice, the inventors found that during the driving of an autonomous vehicle or a driver-assisted vehicle, the position of the vehicle can be determined according to the positions of the lane lines. Existing methods mainly capture images with the vehicle's front or rear camera and then detect lane lines from the captured images. However, under poor illumination conditions such as at night, the images captured by the camera are very blurry, and sometimes lane lines cannot be identified from them, which affects the precision and real-time performance of lane line detection. A method for detecting lane lines that is more robust to the external environment is therefore needed to improve the precision and real-time performance of detection.
To this end, the inventors considered that the laser signal emitted by a lidar is only slightly affected by external interference and adapts well to factors such as light sources and weather. A lidar can emit a laser signal at a target and then compare the echo signal reflected from the target with the emitted signal to obtain information about the target. Therefore, in the embodiments of the invention, point cloud images are determined by the vehicle-mounted laser device emitting laser signals, and lane lines are then detected from the point cloud images, which improves the real-time performance of lane line detection. Second, each point in the point cloud obtained by the vehicle-mounted laser device through emitting laser signals includes at least three-dimensional coordinates and a reflection intensity, and when a vehicle-mounted laser device (such as a Velodyne 64-line lidar) scans one frame of a point cloud image, the number of points can reach 100,000 or more, so the amount of data is very large. If the raw point cloud data determined by the vehicle-mounted laser device were processed directly, it would be difficult to meet real-time requirements. For this reason, the inventors considered that lane lines are generally white or yellow while the road surface is black or gray, so the reflection intensity of the lane lines for the laser signal differs from that of the road surface, and the lane lines and the road surface also differ clearly in the point cloud intensity image corresponding to the point cloud image. Therefore, the point cloud image is directly converted into a point cloud intensity image, and lane lines are then detected from the point cloud intensity image, which reduces the amount of data to be processed during lane line detection and improves its real-time performance. Because the pixel gray values of the lane lines in the point cloud intensity image differ significantly from those of the road surface, in the embodiments of the invention lane lines are detected by segmenting the point cloud intensity image according to its pixel gray values, which improves detection precision. In addition, the three-dimensional coordinates and reflection intensities of the points in the point cloud image correspond to each other, so after lane lines are detected from the point cloud intensity image, the three-dimensional coordinates of the points corresponding to the lane lines can be determined from this correspondence, and the positions of the lane lines are then determined from those three-dimensional coordinates. Furthermore, the inventors found that when lane lines are detected from a single-frame point cloud intensity image, the amount of information it contains is small and the detection distance is limited, so it is impossible to judge from a single frame whether the currently detected lane line is a solid line or a dashed line; in addition, the boundary between the lane line and the ground in a single-frame point cloud intensity image is not obvious, which affects detection precision. Therefore, in the embodiments of the invention, multiple frames of point cloud images are determined by the vehicle-mounted laser device emitting laser signals, the multiple frames are stitched to determine a target point cloud image, and lane lines are then detected from the point cloud intensity image corresponding to the target point cloud image. Because the target point cloud image determined after stitching multiple frames contains more information and the detection distance is longer, it can be judged reliably whether a lane line is solid or dashed, and the precision of segmenting the point cloud intensity image is also improved, which further improves the precision of lane line detection.
The method for detecting lane lines in the embodiments of the present invention can be applied to the application scenario shown in Fig. 1, which includes a vehicle 101 and a vehicle-mounted laser device 102.
The vehicle 101 may be an autonomous vehicle or a driver-assisted intelligent vehicle. The vehicle 101 includes an automatic driving control system. In an autonomous vehicle, the automatic driving control system perceives the surrounding environment with on-board sensors and, according to the road, vehicle position, and obstacle information obtained through perception, controls the steering and speed of the vehicle so that it can travel on the road reliably and safely. In a driver-assisted intelligent vehicle, the automatic driving control system perceives the surrounding environment with on-board sensors and, according to the road, vehicle position, and obstacle information obtained through perception, takes measures such as emergency braking when a collision is unavoidable, so as to protect the driver.
The vehicle-mounted laser device 102 is a lidar mounted on the vehicle and is a mobile three-dimensional laser scanning system. The vehicle-mounted laser device 102 may be located on the roof, at the front, or at the rear of the vehicle 101. The vehicle-mounted laser device 102 consists of a laser transmitter, an optical receiver, a turntable, an information processing system, and so on. The laser transmitter converts an electrical signal into a laser signal and emits it; the optical receiver converts the laser signal reflected from the target back into an electrical signal and sends it to the information processing system. While the laser transmitter emits laser signals, the turntable rotates to perform laser scanning. The information processing system compares the laser signal reflected from the target and received by the optical receiver with the laser signal emitted by the laser transmitter and, after appropriate processing, obtains a point cloud image, which includes the three-dimensional coordinates, reflection intensities, and other information of the points.
The vehicle 101 also includes positioning devices such as a Global Positioning System (GPS) and a vehicle-mounted inertial navigation device; the positioning device may be inside the vehicle 101 or in the vehicle-mounted laser device 102. An inertial navigation device is an autonomous navigation system that neither depends on external information nor radiates energy to the outside; its operating environment includes the air and the ground as well as under water. The basic working principle of inertial navigation is based on Newton's laws of mechanics: by measuring the acceleration of the carrier in the inertial reference frame, integrating it over time, and transforming it into the navigation coordinate system, information such as the speed, yaw angle, and position in the navigation coordinate system can be obtained.
Further, in the application scenario shown in Fig. 1, the automatic driving control system in the vehicle 101 includes a device for detecting lane lines, whose structure is shown in Fig. 2. The device includes an obtaining module 1011, a stitching module 1012, a processing module 1013, and a detection module 1014. The obtaining module 1011 obtains N frames of point cloud images from the vehicle-mounted laser device 102 and determines, through the positioning device, the position information of the vehicle-mounted laser device when it emitted the laser signals corresponding to the N frames. The stitching module 1012 determines the relative position information between the N frames according to this position information, stitches the N frames into a target point cloud image according to the relative position information, and then sends the target point cloud image to the processing module 1013, which converts it into a point cloud intensity image. The detection module 1014 segments the point cloud intensity image according to its pixel gray values to determine the lane lines.
Based on the application scenario shown in Fig. 1 and the structure of the device for detecting lane lines shown in Fig. 2, an embodiment of the present invention provides a method for detecting lane lines. The flow of the method can be executed by the device for detecting lane lines and, as shown in Fig. 3, comprises the following steps:
Step S301: obtain N frames of point cloud images determined by the vehicle-mounted laser device through emitting laser signals, N being a positive integer.
Lane lines indicate the direction in which a vehicle should travel when entering a road section at an intersection. Such markings are generally painted at busy intersections to clarify the direction of travel, keep vehicles in their own lanes, and relieve traffic pressure. Lane lines include white dashed lines, white solid lines, guide lines, deceleration markings, and so on. For example, as shown in Fig. 4, the lane lines in the figure are white solid lines and guide lines.
The vehicle-mounted laser device emits laser signals in a scanning manner, and the time interval between emissions is set according to the actual situation, for example every 0.1 s or 0.2 s. After emitting a laser signal, the vehicle-mounted laser device receives the echo signal returned from the road surface and determines a point cloud image from the returned echo signal. When detecting lane lines, N frames determined consecutively by the vehicle-mounted laser device may be obtained, or N frames determined non-consecutively may be obtained. For example, suppose the vehicle-mounted laser device emits a laser signal every 0.1 s and determines 10 frames of point cloud images within 1 s, and N is set to 5. When detecting lane lines, the first 5 frames determined by the device may be obtained, i.e., the first frame to the fifth frame; alternatively, every other frame may be chosen from the 10 frames, i.e., the first, third, fifth, seventh, and ninth frames. Other ways of selecting point cloud images are also possible and are not enumerated here.
A point cloud image consists of multiple points, and each point includes information such as three-dimensional coordinates and reflection intensity.
Step S302: stitch the N frames of point cloud images into a target point cloud image.
Specifically, the relative position information between the N frames may be determined according to the position information of the vehicle-mounted laser device when it emitted the laser signals corresponding to the N frames, or according to the position information of the device when it received the echo signals corresponding to the N frames. The N frames are then stitched in sequence according to the relative position information between them to determine the target point cloud image.
The process of determining the relative position information between the N frames is illustrated below. For any two of the N frames, let one frame be the reference point cloud image and the other be the point cloud image to be stitched. As shown in Fig. 5, the process comprises the following steps:
Step S501: the vehicle-mounted laser device emits the first laser signal corresponding to the reference point cloud image.
Step S502: the vehicle-mounted inertial navigation device determines the first position information of the vehicle-mounted laser device when it emits the first laser signal.
Step S503: the vehicle-mounted laser device emits the second laser signal corresponding to the point cloud image to be stitched.
Step S504: the vehicle-mounted inertial navigation device determines the second position information of the vehicle-mounted laser device when it emits the second laser signal.
The position information includes position coordinates and angle coordinates. The vehicle-mounted inertial navigation device detects the acceleration of the vehicle while it is moving, integrates the acceleration to determine the speed, and integrates the speed further to determine the position coordinates of the vehicle. Because the vehicle-mounted laser device is fixed on the vehicle, the position coordinates of the vehicle can be used as the position coordinates of the vehicle-mounted laser device. For example, as shown in Fig. 6, a navigation coordinate system is established with the longitudinal direction of the road surface as the X axis, the lateral direction of the road surface as the Y axis, and the direction perpendicular to the road surface as the Z axis. The origin of the navigation coordinate system is set according to the actual situation; for example, the origin may be the position coordinates of the vehicle-mounted laser device when it emits the laser signal corresponding to the reference point cloud image, i.e., the first position coordinates, or the position coordinates of the device when it emits the laser signal corresponding to the point cloud image to be stitched, i.e., the second position coordinates, or other coordinates besides the first and second position coordinates. Taking the first position coordinates as the origin of the navigation coordinate system as an example: the vehicle sets out from the position given by the first position coordinates (0, 0, 0) and after 0.1 s reaches the position given by the second position coordinates, where the vehicle-mounted laser device emits a laser signal. The vehicle-mounted inertial navigation device obtains the acceleration of the vehicle along the X axis during this 0.1 s and, after integrating the acceleration twice, obtains a vehicle position coordinate of 1 along the X axis. Similarly, the vehicle-mounted inertial navigation device obtains the acceleration along the Y axis during this 0.1 s and, after integrating it twice, obtains a position coordinate of 1 along the Y axis. Because the acceleration of the vehicle along the Z axis is 0, its position coordinate along the Z axis is 0. From the position coordinates of the vehicle along the X, Y, and Z axes, the second position coordinates are determined to be (1, 1, 0).
The angle coordinates are the angles by which the vehicle-mounted laser device has rotated about the X, Y, and Z axes relative to a reference angle. Because the vehicle-mounted laser device is fixed on the vehicle, the angle coordinates of the vehicle can be used as the angle coordinates of the vehicle-mounted laser device. Before the angle coordinates of the vehicle-mounted laser device are determined, a reference angle coordinate needs to be set; for example, the reference angle coordinate may be the angle coordinate of the vehicle-mounted laser device when it emits the laser signal corresponding to the reference point cloud image, i.e., the first angle coordinate, or the angle coordinate of the device when it emits the laser signal corresponding to the point cloud image to be stitched, i.e., the second angle coordinate, or another angle coordinate besides the first and second angle coordinates. For example, as shown in Fig. 7, the reference angle coordinate is set to the first angle coordinate (0°, 0°, 0°). After 0.1 s the vehicle-mounted laser device emits the laser signal corresponding to the point cloud image to be stitched; at this moment the vehicle has rotated 30 degrees clockwise about the Z axis compared with the first angle coordinate, so the second angle coordinate of the vehicle-mounted laser device is a 30-degree clockwise rotation about the Z axis, i.e., (0°, 0°, 30°).
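The two examples above amount to dead reckoning: integrating acceleration twice for the position coordinates and integrating the yaw rate for the angle coordinate. A rough sketch under those assumptions (a constant sampling interval and measurements already expressed in the navigation coordinate system) follows; it is not the inertial navigation device's actual algorithm.

```python
import numpy as np

def dead_reckon(accels, yaw_rates, dt=0.01):
    """Coarse Euler integration of inertial measurements between two laser
    emissions: acceleration -> speed -> position, and yaw rate -> angle about Z.

    accels:    (K, 3) accelerations in the navigation frame (assumed available).
    yaw_rates: (K,) rotation rates about the Z axis in degrees per second.
    """
    vel = np.zeros(3)
    pos = np.zeros(3)
    yaw = 0.0
    for a, w in zip(np.asarray(accels, dtype=float), np.asarray(yaw_rates, dtype=float)):
        vel += a * dt      # first integration: acceleration gives speed
        pos += vel * dt    # second integration: speed gives position coordinates
        yaw += w * dt      # integrating the yaw rate gives the angle about the Z axis
    return pos, np.array([0.0, 0.0, yaw])
```

Feeding in the samples recorded during the 0.1 s between the two emissions yields the second position coordinates and second angle coordinates relative to the first emission, in the spirit of the (1, 1, 0) and (0°, 0°, 30°) examples above.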
It should be noted that when determining the position information of the vehicle-mounted laser device, not only the vehicle-mounted inertial navigation device but also the vehicle's GPS may be used.
Step S505: determine, according to the first position information and the second position information, the relative position information of the vehicle-mounted laser device between emitting the first laser signal and emitting the second laser signal.
The relative position information includes a position offset and an angle offset. For example, if the first position coordinates are (0, 0, 0) and the second position coordinates are (1, 1, 0), the position offset between them is an offset of 1 along the X axis and 1 along the Y axis. As another example, if the first angle coordinate is a rotation of 0 degrees clockwise about each of the X, Y, and Z axes, i.e., (0°, 0°, 0°), and the second angle coordinate is a 30-degree clockwise rotation about the Z axis, i.e., (0°, 0°, 30°), then the angle offset between them is a 30-degree clockwise rotation about the Z axis.
Step S506: determine the relative position information of the vehicle-mounted laser device between emitting the first laser signal and emitting the second laser signal as the relative position information between the reference point cloud image and the point cloud image to be stitched.
Optionally, after the relative position information between the N frames of point cloud images is determined, the N frames are unified into the same coordinate system according to the relative position information between them, and the N frames after unification of the coordinate systems are then stitched to determine the target point cloud image.
Unifying the N frames into the same coordinate system is illustrated below. Suppose the first position coordinates are (0, 0, 0) and the second position coordinates are (1, 1, 0), so the position offset between the first and second position coordinates is an offset of 1 along the X axis and 1 along the Y axis; then the point cloud image to be stitched is offset by 1 along the X axis and 1 along the Y axis compared with the reference point cloud image. To convert the position coordinates of the points in the point cloud image to be stitched into the position coordinates of the reference point cloud image, 1 is added to the original X coordinate and 1 is added to the original Y coordinate of each point in the point cloud image to be stitched. For example, if the position coordinates of point A in the point cloud image to be stitched are (0, 1, 0), then after conversion into the coordinate system of the reference point cloud image its position coordinates are (1, 2, 0).
As another example, suppose the angle offset between the first angle coordinate and the second angle coordinate is a 30-degree clockwise rotation about the Z axis; then the point cloud image to be stitched is rotated 30 degrees clockwise about the Z axis compared with the reference point cloud image. To convert the angle coordinates of the points in the point cloud image to be stitched into the coordinate system of the reference point cloud image, each point in the point cloud image to be stitched is rotated a further 30 degrees clockwise about the Z axis from its original angle coordinate. For example, if the angle coordinate of point A in the point cloud image to be stitched is (0°, 0°, 30°), then after conversion into the coordinate system of the reference point cloud image its angle coordinate is (0°, 0°, 60°).
Because the position information of each point in a point cloud image is determined with the position of the vehicle-mounted laser device at the moment it emitted the laser signal as the reference, when the position of the vehicle-mounted laser device changes, the coordinate system of the point cloud image changes with it, so the point cloud images are not in the same coordinate system and cannot be stitched directly. In the embodiment of the present invention, the relative position information of the vehicle-mounted laser device between emitting the first laser signal and emitting the second laser signal is first determined according to the first position information and the second position information; this relative position information is then taken as the relative position information between the reference point cloud image and the point cloud image to be stitched, the two images are unified under one coordinate system, and they are then stitched, which improves the accuracy of the stitched target point cloud image.
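A compact sketch of this unification step: every point of the frame to be stitched is rotated about the Z axis by the angle offset and shifted by the position offset, after which the two frames share one coordinate system. The rotate-then-translate order and the sign conventions are assumptions; the printed line reproduces the translation-only (0, 1, 0) → (1, 2, 0) example above.

```python
import numpy as np

def unify_to_reference(points, position_offset, yaw_offset_deg):
    """Express the points of the frame to be stitched in the coordinate system
    of the reference point cloud image.

    points:          (M, 3) x, y, z coordinates in the to-be-stitched frame.
    position_offset: (3,) offset of the laser device between the two emissions.
    yaw_offset_deg:  angle offset about the Z axis between the two emissions.
    """
    a = np.deg2rad(yaw_offset_deg)
    R = np.array([[np.cos(a), -np.sin(a), 0.0],   # rotation about the Z axis
                  [np.sin(a),  np.cos(a), 0.0],
                  [0.0,        0.0,       1.0]])
    return points @ R.T + np.asarray(position_offset)

# Translation-only example from the description: point A = (0, 1, 0) with
# position offset (1, 1, 0) and zero angle offset maps to (1, 2, 0).
print(unify_to_reference(np.array([[0.0, 1.0, 0.0]]), (1.0, 1.0, 0.0), 0.0))
```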
The process of stitching the N frames of point cloud images after the coordinate systems are unified to determine the target point cloud image includes at least the following stitching methods.
In one possible stitching method, the first frame and the second frame are stitched to determine a first sub-stitched point cloud image; the M-th sub-stitched point cloud image is then stitched with the (M+2)-th frame in sequence, and the target point cloud image is determined when the (M+2)-th frame is the N-th frame, where M is an integer greater than or equal to 1 and N is an integer greater than or equal to 3.
For example, as shown in Fig. 8, suppose N equals 4. The first frame and the second frame are stitched to determine the first sub-stitched point cloud image; the first sub-stitched point cloud image is stitched with the third frame to determine the second sub-stitched point cloud image; and the second sub-stitched point cloud image is stitched with the fourth frame to determine the target point cloud image. Because the N frames are stitched with the first frame as the reference, with the other frames stitched onto it in turn, only the relative position information between the first frame and each of the other N-1 frames needs to be determined, which simplifies the stitching process.
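Treating each stitch as the concatenation of point arrays that have already been unified into the first frame's coordinate system (a simplification assumed here), this sequential method can be sketched as:

```python
import numpy as np

def stitch_sequentially(frames):
    """frames: list of N >= 3 point arrays, each (M_i, 4) with x, y, z,
    intensity, already unified into the first frame's coordinate system."""
    target = np.vstack([frames[0], frames[1]])   # first sub-stitched image
    for nxt in frames[2:]:                       # then add frame M+2 each round
        target = np.vstack([target, nxt])
    return target

# With N = 4 as in Fig. 8: stitch frames 1 and 2, then add frame 3, then frame 4.
```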
In another possible stitching method, a first stitching step stitches the M-th frame and the (M+1)-th frame to determine N/2 first-level stitched point cloud images, where 1 ≤ M < N and M is odd; a second stitching step stitches the L-th first-level stitched point cloud image and the (L+1)-th first-level stitched point cloud image to determine N/4 second-level stitched point cloud images, where 1 ≤ L < N/2 and L is odd. The first stitching step and the second stitching step are executed in turn until only one stitched point cloud image remains, which forms the target point cloud image, where N is an even number greater than or equal to 2.
For example, as shown in Fig. 9, suppose N is 4. The first frame and the second frame are stitched to determine first-level stitched point cloud image 1; the third frame and the fourth frame are stitched to determine first-level stitched point cloud image 2; and first-level stitched point cloud image 1 is stitched with first-level stitched point cloud image 2 to determine the target point cloud image.
In another possible stitching method, the M-th frame and the (M+1)-th frame are stitched in sequence to determine (N-1)/2 first-level stitched point cloud images, where 1 ≤ M < N, M is odd, and N is an odd number greater than or equal to 3. The L-th first-level stitched point cloud image and the (L+1)-th first-level stitched point cloud image are then stitched in sequence to determine (N-1)/4 second-level stitched point cloud images, where 1 ≤ L < (N-1)/2 and L is odd, and so on, until only one stitched point cloud image remains, which forms the target point cloud image. Optionally, because N is odd, the N-th frame is not stitched with another frame to determine a first-level stitched point cloud image. Therefore, when stitching the point cloud images, the N-th frame may simply be discarded, or it may be treated as a first-level stitched point cloud image and stitched directly with the other first-level stitched point cloud images, or treated as a second-level stitched point cloud image and stitched directly with the second-level stitched point cloud images, and so on; this is not specifically limited in the embodiment of the present invention.
For example, as shown in Fig. 10, suppose N is 5. The first frame and the second frame are stitched to determine first-level stitched point cloud image 1; the third frame and the fourth frame are stitched to determine first-level stitched point cloud image 2; first-level stitched point cloud image 1 and first-level stitched point cloud image 2 are stitched to determine second-level stitched point cloud image 1; and second-level stitched point cloud image 1 is stitched with the fifth frame to determine the target point cloud image. Because the similarity between two adjacent frames is higher than the similarity between non-adjacent frames, stitching adjacent frames gives a smaller stitching error than stitching non-adjacent frames. Therefore, when stitching the N frames, adjacent frames are first stitched according to the relative position information between them to determine multiple sub-stitched point cloud images, and adjacent sub-stitched images are then stitched in turn until the target point cloud image is determined, which reduces the stitching error and improves the accuracy of the target point cloud image.
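The level-by-level pairing of Figs. 9 and 10 can be sketched the same way; carrying an unpaired frame forward to the next level, rather than discarding it, is one of the options the description leaves open and is the choice made in this sketch.

```python
import numpy as np

def stitch_hierarchically(frames):
    """frames: list of N >= 2 point arrays already unified into one coordinate
    system. Adjacent images are stitched pairwise, level by level, until a
    single image - the target point cloud image - remains."""
    level = list(frames)
    while len(level) > 1:
        nxt = [np.vstack([level[i], level[i + 1]])     # stitch adjacent pairs
               for i in range(0, len(level) - 1, 2)]
        if len(level) % 2 == 1:                        # unpaired image (odd count):
            nxt.append(level[-1])                      # carry it to the next level
        level = nxt
    return level[0]
```

With N = 4 this reproduces the order of Fig. 9, and with N = 5 the fifth frame is carried forward and stitched last, matching Fig. 10.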
It should be noted that the above stitching methods are only examples; the stitching methods for point cloud images in the embodiments of the present invention are not limited to them.
Step S303: determine the point cloud intensity image corresponding to the target point cloud image.
Each point in a point cloud image, determined by the vehicle-mounted laser device emitting laser signals, includes at least a three-dimensional coordinate and a reflected intensity, and there is a mapping relationship between the three-dimensional coordinate and the reflected intensity. Because each frame point cloud image contains a large amount of point cloud data, detecting lane lines directly on the point cloud image is inefficient. To improve the efficiency of detecting lane lines from the target point cloud image, the target point cloud image may be mapped to a corresponding point cloud intensity image, and the lane lines are then detected in that intensity image, which removes the need to process the three-dimensional coordinates of each point.
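A minimal sketch of this mapping is shown below, projecting the stitched point cloud (x, y, z, reflectance) onto a top-down grid; the grid resolution and extent are illustrative assumptions, not values specified in the patent.

```python
import numpy as np

def to_intensity_image(points: np.ndarray, resolution: float = 0.1,
                       x_range=(-50.0, 50.0), y_range=(-50.0, 50.0)) -> np.ndarray:
    """Project points onto an x-y grid, keeping the maximum reflectance per cell."""
    width = int((x_range[1] - x_range[0]) / resolution)
    height = int((y_range[1] - y_range[0]) / resolution)
    image = np.zeros((height, width), dtype=np.float32)

    # Keep only points inside the chosen extent
    mask = ((points[:, 0] >= x_range[0]) & (points[:, 0] < x_range[1]) &
            (points[:, 1] >= y_range[0]) & (points[:, 1] < y_range[1]))
    pts = points[mask]

    cols = ((pts[:, 0] - x_range[0]) / resolution).astype(int)
    rows = ((pts[:, 1] - y_range[0]) / resolution).astype(int)
    # np.maximum.at resolves collisions by keeping the strongest reflection
    np.maximum.at(image, (rows, cols), pts[:, 3])
    return image

cloud = np.random.rand(10000, 4) * [80, 80, 2, 1] - [40, 40, 1, 0]
intensity = to_intensity_image(cloud)
print(intensity.shape)  # (1000, 1000)
```

Keeping the source (row, column) of each cell alongside the reflectance would preserve the mapping back to the three-dimensional coordinates mentioned above.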
Illustratively, a single frame point cloud image mapped to a point cloud intensity image is shown in Figure 11a, where the black vertical lines are the lane lines. Because the range of the laser signal emitted by the vehicle-mounted laser device is limited, a single frame point cloud intensity image contains relatively little information. When it must be determined whether a detected lane line is dashed or solid, the limited detection range means that a single frame point cloud intensity image cannot tell which of the two the currently detected lane line is. For example, in the point cloud intensity image shown in Figure 11a, a lane line is detected, but it cannot be determined whether the line is solid or dashed, so the vehicle cannot be further controlled to change lanes. In addition, because a single frame point cloud intensity image contains relatively little information, the detection accuracy of the lane lines in the intensity image is also affected. To solve these problems of detecting lane lines from a single frame point cloud image, in the embodiments of the present invention the N frame point cloud images are stitched into a target point cloud image, which is then converted into a point cloud intensity image.
Illustratively, suppose three frame point cloud images are acquired successively by the vehicle-mounted laser device emitting laser signals: point cloud image A, point cloud image B, and point cloud image C. The location information of the vehicle-mounted laser device when it emitted the laser signals corresponding to these three frames is obtained: first location information, second location information, and third location information, respectively.
Taking the coordinate system of point cloud image A as the reference, point cloud image A and point cloud image B are stitched as follows: the relative position information between point cloud image A and point cloud image B is determined from the first location information and the second location information. The coordinate system of point cloud image B is then transformed into the coordinate system of point cloud image A according to this relative position information, after which point cloud image A and point cloud image B are stitched to obtain the first sub-stitched point cloud image.
The first sub-stitched point cloud image and point cloud image C are then stitched as follows: because the coordinate system of point cloud image A was taken as the reference and point cloud image B was transformed into that coordinate system before stitching, the coordinate system of the first sub-stitched point cloud image is the coordinate system of point cloud image A. The relative position information between the first sub-stitched point cloud image and point cloud image C can therefore be determined from the first location information and the third location information. The coordinate system of point cloud image C is then transformed into the coordinate system of the first sub-stitched point cloud image according to this relative position information, after which the first sub-stitched point cloud image and point cloud image C are stitched to obtain the target point cloud image. The point cloud intensity image corresponding to the target point cloud image is shown in Figure 11b, where the black vertical lines are lane lines. As can be seen from Figure 11b, the detected lane line is dashed, and the automatic driving control system can control the vehicle to change lanes accordingly. After multiple frame point cloud images are stitched, more information is available and a longer distance is covered, so it can be determined whether a lane line is solid or dashed, which improves the detection accuracy of the lane lines. In addition, because the reflected intensity in the point cloud intensity image has a mapping relationship with the three-dimensional coordinates of the points, once a lane line is detected from the intensity image its location can be determined directly from this mapping relationship, without a complicated and unstable coordinate transformation, which improves the precision with which the lane lines are used for positioning an autonomous vehicle.
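A minimal sketch of the stitching step just described is shown below, under the simplifying assumption that the relative pose between two frames reduces to a 2D translation plus a yaw rotation obtained from the vehicle-mounted inertial navigation readings; the names and the pose format are illustrative.

```python
import numpy as np

def relative_pose(pose_a, pose_b):
    """pose = (x, y, yaw). Return the rotation/translation taking frame B points into frame A."""
    dx, dy = pose_b[0] - pose_a[0], pose_b[1] - pose_a[1]
    dyaw = pose_b[2] - pose_a[2]
    # Express the translation in frame A's axes
    ca, sa = np.cos(pose_a[2]), np.sin(pose_a[2])
    t = np.array([ca * dx + sa * dy, -sa * dx + ca * dy])
    return dyaw, t

def stitch_into_reference(points_a, points_b, pose_a, pose_b):
    """Transform points_b (x, y, z, reflectance) into frame A and concatenate."""
    dyaw, t = relative_pose(pose_a, pose_b)
    c, s = np.cos(dyaw), np.sin(dyaw)
    rot = np.array([[c, -s], [s, c]])
    xy_in_a = points_b[:, :2] @ rot.T + t
    points_b_in_a = np.hstack([xy_in_a, points_b[:, 2:]])
    return np.vstack([points_a, points_b_in_a])

# Frames A and B with slightly different vehicle poses
pose_a, pose_b = (0.0, 0.0, 0.0), (1.5, 0.1, 0.02)
cloud_a = np.random.rand(100, 4)
cloud_b = np.random.rand(100, 4)
sub_stitched = stitch_into_reference(cloud_a, cloud_b, pose_a, pose_b)
print(sub_stitched.shape)  # (200, 4)
```

Stitching the result with frame C in the same way, using the first and third location information, would give the target point cloud image of the example.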
Step S304: determine the lane lines from the point cloud intensity image corresponding to the target point cloud image.
Optionally, because the color of a lane line differs significantly from the color of the road surface, the reflected intensity of a lane line and the reflected intensity of the road surface also differ clearly in a point cloud intensity image acquired by the vehicle-mounted laser device emitting laser signals. The point cloud intensity image can therefore be segmented according to its pixel gray values to determine the lane lines.
When the point cloud intensity image is segmented, at least the following embodiments are possible:
In one possible embodiment, the point cloud intensity image is segmented by thresholding to determine the lane lines. The specific process is as follows: a threshold T is set first, where T may be a global threshold, an adaptive threshold, an optimal threshold, and so on. The threshold T is then compared one by one with the gray value of each pixel in the intensity image. Pixels whose gray value is greater than the threshold are set to 1, and pixels whose gray value is less than the threshold are set to 0. The region formed by the pixels set to 1 is then taken as the region of the lane lines in the point cloud intensity image.
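A minimal sketch of this thresholding embodiment follows, assuming the intensity image has been normalised to floating-point gray values in [0, 1]; the fixed threshold value is an illustrative choice, not one taken from the patent.

```python
import numpy as np

def threshold_segment(intensity: np.ndarray, t: float = 0.6) -> np.ndarray:
    """Return a binary mask: 1 where the gray value exceeds the threshold, else 0."""
    return (intensity > t).astype(np.uint8)

intensity = np.random.rand(200, 200)
lane_mask = threshold_segment(intensity)
print(lane_mask.sum(), "pixels labelled as candidate lane-line points")
```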
In one possible embodiment, the point cloud intensity image is segmented by region growing to determine the lane lines. The specific process is as follows: a seed pixel is first selected from the point cloud intensity image; the seed may be a single pixel in a lane-line region of the intensity image, or a small region containing several pixels. A similarity rule is then determined according to the pixel gray values of the lane-line region in the intensity image, and a condition or criterion for stopping the growth is formulated. Pixels in the neighborhood of the seed whose gray values are the same as or similar to that of the seed are merged into the region containing the seed. These newly added pixels are treated as new seeds and the above process is repeated; when no further pixels satisfy the condition, the region growing stops. The region containing the seed pixels after the growing stops is the lane line in the point cloud intensity image.
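The following is a minimal region-growing sketch for the intensity image, assuming a single seed pixel and a simple gray-value tolerance as the similarity rule; both are illustrative assumptions rather than values specified in the patent.

```python
import numpy as np
from collections import deque

def region_grow(intensity: np.ndarray, seed: tuple[int, int],
                tolerance: float = 0.1) -> np.ndarray:
    """Grow a region of pixels whose gray value stays within `tolerance` of the
    seed's gray value, using 4-connected neighbours."""
    h, w = intensity.shape
    seed_value = intensity[seed]
    visited = np.zeros((h, w), dtype=bool)
    region = np.zeros((h, w), dtype=np.uint8)
    queue = deque([seed])
    visited[seed] = True
    while queue:
        r, c = queue.popleft()
        if abs(intensity[r, c] - seed_value) <= tolerance:
            region[r, c] = 1
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if 0 <= nr < h and 0 <= nc < w and not visited[nr, nc]:
                    visited[nr, nc] = True
                    queue.append((nr, nc))
    return region

intensity = np.random.rand(100, 100)
mask = region_grow(intensity, seed=(50, 50))
print(mask.sum(), "pixels in the grown region")
```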
In one possible embodiment, the point cloud intensity image is segmented by edge detection to determine the lane lines. The specific process is as follows: edge detection is performed on the point cloud intensity image; the edges of the lane lines in the intensity image can be determined through edge detection, and the positions of the lane lines are then determined from these edges. Specifically, edge detection finds places where the gray level or structure changes abruptly, indicating where one region ends and another begins. The gray values of edge pixels in the image are discontinuous, and this discontinuity can be detected by differentiation. For a step edge, the position corresponds to an extreme of the first derivative and to a zero crossing of the second derivative. Differential operators can therefore be used for edge detection; common first-order differential operators include the Roberts, Prewitt, and Sobel operators, and second-order differential operators include the Laplacian operator, the Kirsch operator, and so on.
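A minimal sketch of first-order edge detection with the Sobel operator is shown below, implemented directly with NumPy so the example stays self-contained; the gradient threshold is an illustrative assumption.

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float32)
SOBEL_Y = SOBEL_X.T

def convolve2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Naive 'valid'-mode 2D correlation used for the gradient filters."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1), dtype=np.float32)
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(image[r:r + kh, c:c + kw] * kernel)
    return out

def sobel_edges(intensity: np.ndarray, threshold: float = 1.0) -> np.ndarray:
    """Gradient magnitude from Sobel filters, thresholded to a binary edge map."""
    gx = convolve2d(intensity, SOBEL_X)
    gy = convolve2d(intensity, SOBEL_Y)
    magnitude = np.hypot(gx, gy)
    return (magnitude > threshold).astype(np.uint8)

intensity = np.random.rand(64, 64)
edges = sobel_edges(intensity)
print(edges.shape)  # (62, 62)
```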
In one possible embodiment, point cloud intensity images with manually labeled lane lines are used in advance as training samples to train a convolutional neural network, yielding a convolutional neural network model that can detect lane lines from point cloud intensity images. During real-time detection, the point cloud intensity image corresponding to the target point cloud image is fed into the convolutional neural network for feature extraction, and the lane lines in the intensity image are determined. It should be noted that the segmentation methods for the point cloud intensity image are not limited to the above four; histogram methods, level set methods, and the like may also be used, and the embodiments of the present invention do not specifically limit this.
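The following is a minimal sketch of the convolutional-network embodiment, assuming PyTorch is available; the tiny fully convolutional architecture, loss, and random training batch are purely illustrative, since the patent does not specify a network design or training schedule.

```python
import torch
import torch.nn as nn

class LaneSegNet(nn.Module):
    """Predict a per-pixel lane-line probability from a 1-channel intensity image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.net(x))

model = LaneSegNet()
criterion = nn.BCELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One illustrative training step on a random batch standing in for the
# manually labelled intensity images mentioned above
images = torch.rand(4, 1, 256, 256)
labels = (torch.rand(4, 1, 256, 256) > 0.9).float()
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(float(loss))
```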
In the embodiments of the present invention, because the process in which the vehicle-mounted laser device acquires point cloud images by emitting laser signals is relatively insensitive to external interference and adapts well to factors such as light and weather, acquiring point cloud images with the vehicle-mounted laser device and then processing them to determine the lane lines means that the detection accuracy of the lane lines is only slightly affected by light and weather, which improves the real-time performance and stability of lane line detection. Secondly, because the stitched point cloud image contains more lane information and the point cloud data is enhanced, the multiple frame point cloud images are stitched first and the lane lines are then detected from the stitched point cloud image, which improves the detection accuracy of the lane lines. In addition, when the lane lines are detected from the stitched point cloud image, the point cloud image is converted into a point cloud intensity image, which avoids analyzing the point cloud image directly in three-dimensional space; this simplifies the lane line detection process on the one hand and improves its efficiency on the other.
The embodiments of the present invention also provide an automatic driving control method that includes the following steps: the lane line detection device obtains the N frame point cloud images determined by the vehicle-mounted laser device emitting laser signals, stitches the N frame point cloud images into a target point cloud image, determines the point cloud intensity image corresponding to the target point cloud image, and then determines the lane lines from that point cloud intensity image. After the lane line detection device detects the lane lines, the automatic driving control system can determine the position of the vehicle relative to the lane lines from the positions of the lane lines and the current position of the vehicle, and then adjust the position of the vehicle according to this relative position. Illustratively, suppose the detected lane lines and the vehicle position are as shown in Figure 12. As shown in Figure 12, the vehicle is currently close to the left lane line, so while controlling the vehicle to drive forward the automatic driving control system also steers the vehicle toward the right lane line until the vehicle is centered between the two lane lines. After the lane line detection device detects the lane lines, the automatic driving control system can also adjust the driving lane of the vehicle according to the type of the lane lines. Illustratively, as shown in Figure 12, the lane lines on both sides of the current vehicle position are dashed, so the automatic driving control system can control the vehicle to change from the current lane to the left or right lane. Detecting lane lines with the vehicle-mounted laser device provides high-precision lane line position information for automatic driving, which effectively improves the positioning accuracy of automatic driving.
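The patent does not give a control law; the following is a minimal proportional lane-centering sketch under assumed inputs, where the detected left and right lane-line positions are expressed as lateral distances (in metres) from the vehicle, and lane changes are only permitted across dashed lines. The gain and the interface are illustrative.

```python
def lateral_correction(left_offset: float, right_offset: float,
                       gain: float = 0.5) -> float:
    """Proportional correction toward the lane centre.

    left_offset / right_offset are distances (m) from the vehicle to the left
    and right lane lines; a positive result means 'move to the right'."""
    lane_centre_offset = (right_offset - left_offset) / 2.0
    return gain * lane_centre_offset

def may_change_lane(left_type: str, right_type: str) -> dict:
    """Lane changes are only allowed across dashed lines."""
    return {"left": left_type == "dashed", "right": right_type == "dashed"}

# Vehicle sits 1.0 m from the left line and 2.4 m from the right line
print(lateral_correction(1.0, 2.4))          # 0.35 -> steer slightly right
print(may_change_lane("dashed", "solid"))    # {'left': True, 'right': False}
```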
Based on the same technical concept, an embodiment of the present invention provides a device for detecting lane lines. As shown in Figure 13, the device 1300 includes an obtaining module 1301, a stitching module 1302, a processing module 1303, and a detection module 1304.
The obtaining module 1301 is configured to obtain the N frame point cloud images determined by the vehicle-mounted laser device through emitting laser signals, where N is a positive integer;
the stitching module 1302 is configured to stitch the N frame point cloud images into a target point cloud image;
the processing module 1303 is configured to determine the point cloud intensity image corresponding to the target point cloud image;
the detection module 1304 is configured to determine the lane lines from the point cloud intensity image corresponding to the target point cloud image.
Optionally, the stitching module 1302 is specifically configured to:
determine the relative position information between the N frame point cloud images according to the location information of the vehicle-mounted laser device when it emits the laser signals;
stitch the N frame point cloud images in turn according to the relative position information between the N frame point cloud images to determine the target point cloud image.
Optionally, the stitching module 1302 is specifically configured to:
for any two frame point cloud images, when the vehicle-mounted laser device emits a first laser signal corresponding to one of the two frames, determine first location information of the vehicle-mounted laser device through a vehicle-mounted inertial navigation device;
when the vehicle-mounted laser device emits a second laser signal corresponding to the other frame, determine second location information of the vehicle-mounted laser device through the vehicle-mounted inertial navigation device;
determine, according to the first location information and the second location information, relative position information of the vehicle-mounted laser device between when it emits the first laser signal and when it emits the second laser signal;
determine the relative position information of the vehicle-mounted laser device between when it emits the first laser signal and when it emits the second laser signal as the relative position information between the two frame point cloud images.
Optionally, the stitching module 1302 is specifically configured to:
unify the N frame point cloud images into the same coordinate system according to the relative position information between the N frame point cloud images;
stitch the N frame point cloud images after the coordinate systems are unified to determine the target point cloud image.
Optionally, the stitching module 1302 is specifically configured to:
stitch the first frame point cloud image and the second frame point cloud image to determine a first sub-stitched point cloud image;
stitch the M-th sub-stitched point cloud image with the (M+2)-th frame point cloud image in turn, and determine the target point cloud image when the (M+2)-th frame point cloud image is the N-th frame point cloud image, where M is an integer greater than or equal to 1 and N is an integer greater than or equal to 3.
Optionally, the stitching module 1302 is specifically configured to perform:
a first stitching step: stitch the M-th frame point cloud image and the (M+1)-th frame point cloud image to determine N/2 first-level stitched point cloud images, where 1≤M<N and M is odd;
a second stitching step: stitch the L-th first-level stitched point cloud image and the (L+1)-th first-level stitched point cloud image to determine N/4 second-level stitched point cloud images, where 1≤L<N/2 and L is odd;
execute the first stitching step and the second stitching step in turn until a single stitched point cloud image remains, which forms the target point cloud image, where N is an even number greater than or equal to 4.
Optionally, the detection module 1304 is specifically configured to:
segment the point cloud intensity image according to the pixel gray values of the point cloud intensity image to determine the lane lines.
Based on the same technical concept, an embodiment of the present invention provides an automatic driving control system. As shown in Figure 14, the system includes at least one processor 1401 and a memory 1402 connected to the at least one processor. The embodiment of the present invention does not limit the specific connection medium between the processor 1401 and the memory 1402; in Figure 14 they are connected by a bus as an example. The bus can be divided into an address bus, a data bus, a control bus, and so on.
In the embodiments of the present invention, the memory 1402 stores instructions executable by the at least one processor 1401, and by executing the instructions stored in the memory 1402 the at least one processor 1401 can perform the steps included in the above method for detecting lane lines.
The processor 1401 is the control center of the automatic driving control system; it connects the various parts of the automatic driving control system through various interfaces and lines, and detects lane lines by running or executing the instructions stored in the memory 1402 and calling the data stored in the memory 1402. Optionally, the processor 1401 may include one or more processing units, and the processor 1401 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, the application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 1401. In some embodiments, the processor 1401 and the memory 1402 may be implemented on the same chip; in other embodiments, they may be implemented on separate chips.
The processor 1401 may be a general-purpose processor such as a central processing unit (CPU), a digital signal processor, an application specific integrated circuit (ASIC), a field programmable gate array or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and it can implement or execute the methods, steps, and logic diagrams disclosed in the embodiments of the present invention. The general-purpose processor may be a microprocessor, any conventional processor, or the like. The steps of the methods disclosed in the embodiments of the present invention may be executed directly by a hardware processor, or by a combination of hardware and software modules in the processor.
The memory 1402, as a non-volatile computer-readable storage medium, can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules. The memory 1402 may include at least one type of storage medium, for example a flash memory, a hard disk, a multimedia card, a card-type memory, a random access memory (RAM), a static random access memory (SRAM), a programmable read-only memory (PROM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a magnetic memory, a magnetic disk, an optical disc, and so on. The memory 1402 may be any medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto. The memory 1402 in the embodiments of the present invention may also be a circuit or any other device capable of implementing a storage function, and is used to store program instructions and/or data.
Based on the same inventive concept, an embodiment of the present invention provides a computer-readable medium storing a computer program which, when run, performs the steps of the aforementioned method for detecting lane lines.
Those skilled in the art should understand that the embodiments of the present invention may be provided as a method or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, and the like) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of the methods, devices (systems), and computer program products according to the embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or the other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or another programmable data processing device to operate in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus which implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operation steps are performed on the computer or the other programmable device to produce computer-implemented processing, and the instructions executed on the computer or the other programmable device thus provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Although preferred embodiments of the present invention have been described, those skilled in the art can make additional changes and modifications to these embodiments once they learn of the basic inventive concept. Therefore, the appended claims are intended to be interpreted as covering the preferred embodiments and all changes and modifications that fall within the scope of the present invention.
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from the spirit and scope of the present invention. Thus, if these modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalent technologies, the present invention is also intended to include these modifications and variations.

Claims (15)

1. A method for detecting lane lines, characterized by comprising:
obtaining N frame point cloud images determined by a vehicle-mounted laser device through emitting laser signals, where N is a positive integer;
stitching the N frame point cloud images into a target point cloud image;
determining a point cloud intensity image corresponding to the target point cloud image;
determining lane lines from the point cloud intensity image corresponding to the target point cloud image.
2. The method according to claim 1, characterized in that the stitching the N frame point cloud images into a target point cloud image comprises:
determining relative position information between the N frame point cloud images according to location information of the vehicle-mounted laser device when emitting the laser signals;
stitching the N frame point cloud images in turn according to the relative position information between the N frame point cloud images to determine the target point cloud image.
3. The method according to claim 2, characterized in that the determining relative position information between the N frame point cloud images according to location information of the vehicle-mounted laser device when emitting the laser signals comprises:
for any two frame point cloud images, when the vehicle-mounted laser device emits a first laser signal corresponding to one of the two frames, determining first location information of the vehicle-mounted laser device through a vehicle-mounted inertial navigation device;
when the vehicle-mounted laser device emits a second laser signal corresponding to the other frame, determining second location information of the vehicle-mounted laser device through the vehicle-mounted inertial navigation device;
determining, according to the first location information and the second location information, relative position information of the vehicle-mounted laser device between when it emits the first laser signal and when it emits the second laser signal;
determining the relative position information of the vehicle-mounted laser device between when it emits the first laser signal and when it emits the second laser signal as the relative position information between the two frame point cloud images.
4. The method according to claim 3, characterized in that the stitching the N frame point cloud images in turn according to the relative position information between the N frame point cloud images to determine the target point cloud image comprises:
unifying the N frame point cloud images into the same coordinate system according to the relative position information between the N frame point cloud images;
stitching the N frame point cloud images after the coordinate systems are unified to determine the target point cloud image.
5. The method according to claim 4, characterized in that the stitching the N frame point cloud images after the coordinate systems are unified to determine the target point cloud image comprises:
stitching a first frame point cloud image and a second frame point cloud image to determine a first sub-stitched point cloud image;
stitching an M-th sub-stitched point cloud image with an (M+2)-th frame point cloud image in turn, and determining the target point cloud image when the (M+2)-th frame point cloud image is the N-th frame point cloud image, where M is an integer greater than or equal to 1 and N is an integer greater than or equal to 3.
6. The method according to claim 4, characterized in that the stitching the N frame point cloud images after the coordinate systems are unified to determine the target point cloud image comprises:
a first stitching step: stitching an M-th frame point cloud image and an (M+1)-th frame point cloud image to determine N/2 first-level stitched point cloud images, where 1≤M<N and M is odd;
a second stitching step: stitching an L-th first-level stitched point cloud image and an (L+1)-th first-level stitched point cloud image to determine N/4 second-level stitched point cloud images, where 1≤L<N/2, L is odd, and N is an even number greater than or equal to 4;
executing the first stitching step and the second stitching step in turn until a single stitched point cloud image remains, forming the target point cloud image.
7. The method according to any one of claims 1 to 6, characterized in that the determining lane lines from the point cloud intensity image corresponding to the target point cloud image comprises:
segmenting the point cloud intensity image according to pixel gray values of the point cloud intensity image to determine the lane lines.
8. A device for detecting lane lines, characterized by comprising:
an obtaining module, configured to obtain N frame point cloud images determined by a vehicle-mounted laser device through emitting laser signals, where N is a positive integer;
a stitching module, configured to stitch the N frame point cloud images into a target point cloud image;
a processing module, configured to determine a point cloud intensity image corresponding to the target point cloud image;
a detection module, configured to determine lane lines from the point cloud intensity image corresponding to the target point cloud image.
9. The device according to claim 8, characterized in that the stitching module is specifically configured to:
determine relative position information between the N frame point cloud images according to location information of the vehicle-mounted laser device when emitting the laser signals;
stitch the N frame point cloud images in turn according to the relative position information between the N frame point cloud images to determine the target point cloud image.
10. The device according to claim 9, characterized in that the stitching module is specifically configured to:
for any two frame point cloud images, when the vehicle-mounted laser device emits a first laser signal corresponding to one of the two frames, determine first location information of the vehicle-mounted laser device through a vehicle-mounted inertial navigation device;
when the vehicle-mounted laser device emits a second laser signal corresponding to the other frame, determine second location information of the vehicle-mounted laser device through the vehicle-mounted inertial navigation device;
determine, according to the first location information and the second location information, relative position information of the vehicle-mounted laser device between when it emits the first laser signal and when it emits the second laser signal;
determine the relative position information of the vehicle-mounted laser device between when it emits the first laser signal and when it emits the second laser signal as the relative position information between the two frame point cloud images.
11. The device according to claim 10, characterized in that the stitching module is specifically configured to:
unify the N frame point cloud images into the same coordinate system according to the relative position information between the N frame point cloud images;
stitch the N frame point cloud images after the coordinate systems are unified to determine the target point cloud image.
12. The device according to any one of claims 8 to 11, characterized in that the detection module is specifically configured to:
segment the point cloud intensity image according to pixel gray values of the point cloud intensity image to determine the lane lines.
13. An automatic driving control method, comprising detecting lane lines and controlling a driving route according to the lane lines, characterized in that the lane lines are detected using the method of any one of claims 1 to 7.
14. An automatic driving control system, characterized by comprising at least one processing unit and at least one storage unit, wherein the storage unit stores a computer program which, when executed by the processing unit, causes the processing unit to perform the steps of the method of any one of claims 1 to 7.
15. A computer-readable medium, characterized in that it stores a computer program which, when run, performs the steps of the method of any one of claims 1 to 7.
CN201810732366.6A 2018-07-05 2018-07-05 Method and device for detecting lane line Active CN110163047B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810732366.6A CN110163047B (en) 2018-07-05 2018-07-05 Method and device for detecting lane line

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810732366.6A CN110163047B (en) 2018-07-05 2018-07-05 Method and device for detecting lane line

Publications (2)

Publication Number Publication Date
CN110163047A true CN110163047A (en) 2019-08-23
CN110163047B CN110163047B (en) 2023-04-07

Family

ID=67645038

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810732366.6A Active CN110163047B (en) 2018-07-05 2018-07-05 Method and device for detecting lane line

Country Status (1)

Country Link
CN (1) CN110163047B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105115976A (en) * 2015-06-24 2015-12-02 上海图甲信息科技有限公司 Detection system and method for wear defects of track
CN106570446A (en) * 2015-10-12 2017-04-19 腾讯科技(深圳)有限公司 Lane line extraction method and device
CN105701449A (en) * 2015-12-31 2016-06-22 百度在线网络技术(北京)有限公司 Method and device for detecting lane lines on road surface
CN107463918A (en) * 2017-08-17 2017-12-12 武汉大学 Lane line extracting method based on laser point cloud and image data fusion
CN107796397A (en) * 2017-09-14 2018-03-13 杭州迦智科技有限公司 A kind of Robot Binocular Vision localization method, device and storage medium
CN108230379A (en) * 2017-12-29 2018-06-29 百度在线网络技术(北京)有限公司 For merging the method and apparatus of point cloud data

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Deng Chao: "Research on Digital Image Processing and Pattern Recognition", 15 June 2018, Geological Publishing House *
Lu Guodong et al.: "Digital Conversion and Intelligent Understanding of Engineering Drawings", 1 May 2001, China Machine Press *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111175788A (en) * 2020-01-20 2020-05-19 北京主线科技有限公司 Transverse positioning method and positioning system for automatic driving vehicle
CN111507340A (en) * 2020-04-16 2020-08-07 北京深测科技有限公司 Target point cloud data extraction method based on three-dimensional point cloud data
CN111507340B (en) * 2020-04-16 2023-09-01 北京深测科技有限公司 Target point cloud data extraction method based on three-dimensional point cloud data
WO2021227797A1 (en) * 2020-05-13 2021-11-18 长沙智能驾驶研究院有限公司 Road boundary detection method and apparatus, computer device and storage medium
CN112712023A (en) * 2020-12-30 2021-04-27 武汉万集信息技术有限公司 Vehicle type identification method and system and electronic equipment
CN112712023B (en) * 2020-12-30 2024-04-05 武汉万集光电技术有限公司 Vehicle type recognition method and system and electronic equipment
CN113378636A (en) * 2021-04-28 2021-09-10 杭州电子科技大学 Vehicle and pedestrian detection method based on depth map matching

Also Published As

Publication number Publication date
CN110163047B (en) 2023-04-07

Similar Documents

Publication Publication Date Title
CN110163047A (en) A kind of method and device detecting lane line
CN107235044B (en) A kind of restoring method realized based on more sensing datas to road traffic scene and driver driving behavior
KR102653953B1 (en) Method and system for generating and using location reference data
CN108801276B (en) High-precision map generation method and device
KR102404155B1 (en) Methods and systems for generating and using localization reference data
US10445928B2 (en) Method and system for generating multidimensional maps of a scene using a plurality of sensors of various types
JP7074438B2 (en) Vehicle position estimation device
CN105984464B (en) Controller of vehicle
CN104272345B (en) Display apparatus and vehicle display packing
US10642278B2 (en) Autonomous driving device
CN108303103A (en) The determination method and apparatus in target track
CN109870689A (en) Millimetre-wave radar and the matched lane grade localization method of high-precision map vector and system
GB2566523A (en) System and method for vehicle convoys
CN103158607A (en) Method and device for controlling a light emission of a headlight of a vehicle
CN114442101B (en) Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar
CN107657825A (en) Park method and device
CN109583312A (en) Lane detection method, apparatus, equipment and storage medium
CN109115232A (en) The method and apparatus of navigation
CN115718304A (en) Target object detection method, target object detection device, vehicle and storage medium
JP6848847B2 (en) Stationary object map information generator
TW202248963A (en) Compound eyes system, the vehicle using the compound eyes systme and the image processing method for the compound eyes system
CN115661794A (en) Stereoscopic vision perception method, device, equipment, medium and unmanned harvester
TWM618998U (en) Compound eye camera system and vehicle using the same
KR20230073601A (en) Method of detection crosswalk using lidar sensor and crosswalk detection device performing method
KR20240032142A (en) How to determine the position of an object relative to road line markings on a road

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant