US20130208945A1 - Method for the detection and tracking of lane markings - Google Patents

Method for the detection and tracking of lane markings

Info

Publication number
US20130208945A1
US20130208945A1 (application US13/650,182)
Authority
US
United States
Prior art keywords
lane marking
parameter
image
detected
detected lane
Prior art date
Legal status
Granted
Application number
US13/650,182
Other versions
US9047518B2
Inventor
Christian Nunn
Mirko Meuter
Dennis Mueller
Steffen Goermer
Current Assignee
Aptiv Technologies Ag
Original Assignee
Delphi Technologies Inc
Priority date
Filing date
Publication date
Application filed by Delphi Technologies Inc filed Critical Delphi Technologies Inc
Assigned to DELPHI TECHNOLOGIES, INC. reassignment DELPHI TECHNOLOGIES, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MEUTER, MIRKO, MUELLER, DENNIS, NUNN, CHRISTIAN, Goermer, Steffen
Publication of US20130208945A1
Application granted
Publication of US9047518B2
Assigned to APTIV TECHNOLOGIES LIMITED reassignment APTIV TECHNOLOGIES LIMITED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DELPHI TECHNOLOGIES INC.
Assigned to APTIV TECHNOLOGIES (2) S.À R.L. reassignment APTIV TECHNOLOGIES (2) S.À R.L. ENTITY CONVERSION Assignors: APTIV TECHNOLOGIES LIMITED
Assigned to APTIV MANUFACTURING MANAGEMENT SERVICES S.À R.L. reassignment APTIV MANUFACTURING MANAGEMENT SERVICES S.À R.L. MERGER Assignors: APTIV TECHNOLOGIES (2) S.À R.L.
Assigned to Aptiv Technologies AG reassignment Aptiv Technologies AG ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: APTIV MANUFACTURING MANAGEMENT SERVICES S.À R.L.
Legal status: Active (expiration adjusted)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588: Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/40: Extraction of image or video features
    • G06V10/62: Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking

Definitions

  • the classification method can use a neural network and/or a support vector machine. This enables “teaching” of the system, so that a plurality of lane marking categories can be managed while a plurality of parameters and/or curve measures is processed.
  • Another embodiment provides that respective parameters of several test zones of a detected lane marking at different distances from the vehicle, and/or respective parameters of a single test zone in successive images, are compared with each other and combined into a common parameter with the aid of the result of the comparison. Preferably, a degree of statistical confidence is also calculated from the result of the comparison and assigned to the detected lane marking.
  • Such an amalgamation allows checking of consistency, as a result of which the robustness of the method can be increased appreciably.
  • the teachings presented herein also relate to a computer program having program code means for carrying out a method as described above, when the computer program is run on a computer or a corresponding calculating unit.
  • the teachings presented herein also relate to a computer program product having program code means that are stored on a computer-readable data carrier, for carrying out a method as described above, when the computer program is run on a computer or a corresponding calculating unit.
  • the teachings presented herein also relate to a device for the detection and tracking of lane markings from a motor vehicle, having an image capture device mounted on the vehicle for taking a picture, and a data processing device that is designed to carry out the above-mentioned method.
  • FIG. 1 is a top view of a traffic space occupied by a motor vehicle equipped with an image capture device for the detection and tracking of lane markings in accordance with one embodiment
  • FIG. 2 is a first picture taken by the image capture device mounted on the vehicle as in FIG. 1 in accordance with one embodiment
  • FIG. 3 is a second picture taken by an image capture device mounted on the vehicle as in FIG. 1 in accordance with one embodiment
  • FIG. 4 is a lane marking at three successive points in time during the travel of the vehicle as in FIG. 1 in accordance with one embodiment.
  • a motor vehicle 10 is moving forwards in a direction of travel F in the lane 11 of a road.
  • the lane 11 is defined by a left lane marking 12 a in the form of an unbroken line and by a right lane marking, not shown.
  • a camera 14 is mounted on the vehicle 10 at, for example, a front region of the vehicle, such as a roof lining of the vehicle interior.
  • the camera 14 is generally configured to capture continuously an image (i.e. a sequence or series of image frames) of the space located in front of the vehicle 10 , for example, the area between sight rays 15 .
  • the area viewed by the camera 14 may be characterized according to a world coordinate system having x and y distance values.
  • the camera 14 is coupled to an image processing computer (not shown) to form an image processing system that is generally configured to process the images provided by the camera 14 .
  • the image processing computer may be housed within the camera 14 , or may be located elsewhere in the vehicle 10 .
  • FIGS. 2 and 3 illustrate two non-limiting, simplified examples of images 20, 20′ of the space located in front of the vehicle 10 as captured by the camera 14.
  • the camera 14 and the associated image processing computer form part of a driver assistance system such as, for example, a lane keeping support system or lane departure warning system (LDW).
  • all potential lane markings 12 a , 12 b , 12 c , 16 , 18 in a captured image 20 , 20 ′ are extracted or identified in the images.
  • Image regions of the images that meet a predetermined detection criterion for lane markings are identified as detected lane markings.
  • the lane markings detected in this way may be, for example, unbroken lines 12 a , narrow broken lines 12 b , wide broken lines 12 c , crash barriers 16 , or curbs 18 .
  • a verification and classification module (i.e. software or an algorithm associated with the image processing computer) ensures that only valid lane markings 12a, 12b, 12c are tracked, and that the type of lane marking is taken into consideration during tracking.
  • the verification and classification module defines for each of the detected lane markings (i.e. for each ‘marking candidate’) a set of twenty test zones 22 which in each case correspond to the position of a tracked lane marking 12 a , 12 b , 12 c in the plane of the roadway at a certain distance from the vehicle 10 .
  • the position of the test zones 22 within the image 20, 20′ is in each case determined by taking the position of a tracked lane marking 12a, 12b, 12c in the plane of the roadway, expressed in world coordinates x, y at predetermined distances from the vehicle 10, and converting it to image coordinates by means of suitable projection equations.
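The projection equations themselves are not given in the text. A minimal sketch, assuming a flat roadway and an ideal forward-looking pinhole camera (the focal length, mounting height and principal point below are illustrative values, not taken from the patent):

```python
def world_to_image(x_lat, y_dist, focal_px=800.0, cam_height=1.4,
                   cx=640.0, cy=360.0):
    """Project a point on a flat roadway to pixel coordinates.

    x_lat:  lateral offset of the marking point in metres (world x)
    y_dist: forward distance from the camera in metres (world y)
    Assumes an ideal pinhole camera mounted at height cam_height with
    its optical axis parallel to the road; all intrinsics illustrative.
    """
    u = cx + focal_px * x_lat / y_dist        # horizontal pixel coordinate
    v = cy + focal_px * cam_height / y_dist   # vertical: below the horizon line
    return u, v
```

A test zone would then be a small pixel neighbourhood centred on (u, v); points further from the vehicle project closer to the horizon, so distant test zones cover fewer pixels.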
  • Each test zone 22 comprises several pixels of the image 20, 20′ and extends over a certain region within the lane marking 12a, 12b, 12c and, where applicable, into the surrounding area.
  • for each test zone 22, a set of various statistical parameters is determined from the greyscale values of the pixels concerned, for example an average greyscale value, a maximum greyscale value, a minimum greyscale value and/or different moments of the greyscale value distribution. These statistical parameters form characteristics or descriptors that are assigned to the respective lane marking candidate.
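A sketch of such a descriptor computation in plain Python, restricted to the statistics named above (the central-moment definitions are a simplifying assumption):

```python
def zone_descriptors(patch):
    """Simple statistical parameters (descriptors) computed from the
    greyscale values of one test zone; `patch` is a 2-D list of pixel
    intensities."""
    vals = [v for row in patch for v in row]
    n = len(vals)
    mean = sum(vals) / n
    return {
        "mean": mean,
        "total": sum(vals),
        "max": max(vals),
        "min": min(vals),
        "variance": sum((v - mean) ** 2 for v in vals) / n,    # 2nd central moment
        "skew_moment": sum((v - mean) ** 3 for v in vals) / n, # 3rd central moment
    }
```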
  • Each of the parameters is determined for all of the images 20 , 20 ′ succeeding each other in time and processed as a function of time.
  • the functions of time can be converted to functions of the distance covered.
  • the functions of distance covered that are obtained constitute the curve of the parameters in space in the plane of the roadway.
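The conversion from a per-frame (time) curve to a curve over distance covered can be sketched as follows; the frame period and the per-frame speed signal are assumed inputs:

```python
def time_to_distance_curve(samples, speeds, dt):
    """Convert a parameter sampled once per frame into a curve over the
    distance covered by the vehicle.

    samples: parameter value per frame
    speeds:  vehicle speed per frame in m/s
    dt:      frame period in seconds
    Returns a list of (distance_m, value) pairs.
    """
    out, dist = [], 0.0
    for value, speed in zip(samples, speeds):
        out.append((dist, value))
        dist += speed * dt  # ego-motion: distance advanced during this frame
    return out
```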
  • a set of statistical curve measures is determined, such as for example, average or standard deviation.
  • the curve of the parameters in time is subjected to a Haar transform, wherein the Haar coefficients obtained in the process are also assigned to the set of curve measures.
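A minimal, unnormalised Haar decomposition of such a parameter curve; the pairwise average/difference convention is an assumption, since the patent does not fix a normalisation:

```python
def haar_step(signal):
    """One level of the (unnormalised) Haar transform: pairwise averages
    (approximation) and pairwise differences (detail). Length must be even."""
    avg = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    det = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return avg, det

def haar_coeffs(signal):
    """Full Haar decomposition of a length-2^k parameter curve; returns all
    detail coefficients followed by the final average, usable as curve
    measures.  Step-like (broken-line) curves yield few large details."""
    coeffs = []
    while len(signal) > 1:
        signal, det = haar_step(signal)
        coeffs.extend(det)
    coeffs.append(signal[0])
    return coeffs
```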
  • a classifier now selects certain characteristics from the plurality of characteristics, using a neural network and/or a support vector machine, and assigns the detected lane marking to one of several lane marking categories with the aid of the values of the selected characteristics. Specifically, the classifier decides whether the detected lane marking is an invalid image object such as a crash barrier 16 or a curb 18, an unbroken line 12a, a narrow broken line 12b, or a wide broken line 12c. It should be pointed out that the method described makes it possible to distinguish between many further common types of marking such as, for example, double broken lines, combined broken and unbroken lines, and lines with a surface profile (Botts' dots).
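The text names a neural network or a support vector machine as the classifier; as a stand-in with the same interface (train on labelled descriptor vectors, then assign a candidate to a category), a nearest-centroid classifier can be sketched. The category names and feature values below are hypothetical:

```python
import math

def train_centroids(samples):
    """samples: {category: [feature vectors]}.  Returns the mean feature
    vector per category; a nearest-centroid stand-in for the trained
    SVM / neural network."""
    centroids = {}
    for cat, vecs in samples.items():
        n = len(vecs)
        centroids[cat] = [sum(v[i] for v in vecs) / n
                          for i in range(len(vecs[0]))]
    return centroids

def classify(centroids, features):
    """Assign a marking candidate's feature vector to the category whose
    centroid is nearest in Euclidean distance."""
    return min(centroids, key=lambda c: math.dist(centroids[c], features))
```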
  • an amalgamation can be made by comparing respective parameters of several test zones 22 of a detected lane marking at different distances from the vehicle 10 and respective parameters of a single test zone 22 in successive images 20 , 20 ′ with each other, and with the aid of the result of comparison combining them into a common parameter. If, for example, nineteen parameters indicate a curb for a lane marking and one parameter indicates a broken line, the last-mentioned value is rejected as an error and the lane marking is classed as a curb. Further, a degree of statistical confidence is calculated and also assigned to the detected lane marking.
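The amalgamation in the example above (nineteen votes for a curb outvoting one for a broken line, together with a degree of confidence) can be sketched as a majority vote:

```python
from collections import Counter

def amalgamate(zone_votes):
    """Combine per-test-zone category decisions into one common decision
    plus a degree of statistical confidence (fraction of agreeing zones)."""
    counts = Counter(zone_votes)
    category, hits = counts.most_common(1)[0]
    return category, hits / len(zone_votes)
```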
  • a particular advantage of the invention lies in that the classifier can be taught.
  • a time signal is formed by means of the classification results for a fixed particular location in world coordinates x, y.
  • the results of classification are then shifted by means of compensation of the vehicle's own movement, and the results are amalgamated for a certain point in the world. This is preferably carried out recursively.
  • All detected lane markings which are classified by the classifier as valid lane markings 12 a , 12 b , 12 c are, as lane markings to be tracked, subjected to a tracking process in which the variation of the course of the lane markings 12 a , 12 b , 12 c in time in the plane of the roadway is tracked by means of a status estimator.
  • the detected lane markings which are classified as “invalid lane markings” 16 , 18 by the classifier are, on the other hand, rejected, that is, not tracked.
  • Actual tracking of the variation of the course of the lane markings 12 a , 12 b , 12 c in time is preferably carried out by an independent module and takes place by means of a status estimator that is based on an extended Kalman filter.
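The extended Kalman filter itself is not reproduced in the patent text; the predictor-corrector structure it shares with simpler status estimators can be sketched for a single scalar state, e.g. the lateral offset of a tracked marking (the noise values are illustrative assumptions):

```python
def predict_correct(state, variance, measurement,
                    process_noise=0.01, meas_noise=0.25):
    """One predictor-corrector cycle of a scalar status estimator
    (a 1-D Kalman filter with a constant-state model)."""
    # predict: the state carries over, uncertainty grows by the process noise
    pred, pred_var = state, variance + process_noise
    # correct: blend prediction and measurement weighted by the Kalman gain
    gain = pred_var / (pred_var + meas_noise)
    new_state = pred + gain * (measurement - pred)
    new_var = (1.0 - gain) * pred_var
    return new_state, new_var
```

Repeated cycles with consistent measurements drive the estimate towards the measured value while the variance shrinks, which is the behaviour the tracking module relies on from frame to frame.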
  • the invention enables robust lane detection even in complex traffic situations such as the town traffic situation shown in FIG. 2 or the building site situation shown in FIG. 3 , wherein several lanes 11 can be handled safely as well.

Abstract

In a method for the detection and tracking of lane markings from a motor vehicle, an image of a space located in front of the vehicle is captured by means of an image capture device at regular intervals. The picture elements that meet a predetermined detection criterion are identified as detected lane markings in the captured image. At least one detected lane marking as a lane marking to be tracked is subjected to a tracking process. At least one test zone is defined for each detected lane marking. With the aid of intensity values of the picture elements associated with the test zone, at least one parameter is determined. The detected lane marking is assigned to one of several lane marking categories, depending on the parameter.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims the benefit under 35 U.S.C. §119(a) of European Patent Application EP 12000978.2, filed Feb. 15, 2012, the entire disclosure of which is hereby incorporated herein by reference.
  • TECHNICAL FIELD OF INVENTION
  • This disclosure generally relates to detecting roadway lane markings, and more particularly relates to a method for the detection and tracking of lane markings from a motor vehicle equipped with an image capture device.
  • BACKGROUND OF INVENTION
  • The tracking of lane markings is important for various kinds of driver assistance systems in modern motor vehicles. For example, a lane departure warning (LDW) system can use the tracking of lane markings to determine the position of the vehicle within the lane and emit a warning signal if the vehicle gets too close to the edge of the lane. Lane keeping support (LKS) systems are also known; such systems support the driver in keeping to the present lane by continuous steering, braking, and/or drive-train intervention.
  • Status estimators are used to project a system status approximately into the future, and are known in the field. Specifically, a status estimator used with the method according to the invention has a predictor-corrector structure.
  • In particular, in the case of simultaneous monitoring of a plurality of lanes, it is difficult to correctly detect all lane markings which are present in the image. This is because there are image objects which meet the detection criterion, but do not constitute valid lane markings. For example, crash barriers or tar seams can be detected as lane markings and falsely tracked. Erroneous tracking of this kind can considerably impair the performance of a driver assistance system.
  • SUMMARY OF THE INVENTION
  • What is needed is a method of the kind mentioned hereinbefore, that is more robust with respect to the detection of lane markings, and delivers reliable results in complex traffic situations.
  • In accordance with one embodiment, a method for the detection and tracking of lane markings from a motor vehicle on which an image capture device is mounted is provided. The image capture device is configured to capture an image of a space located in front of the vehicle. The image includes a plurality of picture elements captured by the image capture device at regular intervals. Picture elements of the image that meet a predetermined detection criterion are identified as detected lane markings in the captured image by an image processing system. At least one detected lane marking is identified as a lane marking to be tracked, and is subjected to a tracking process in which variation over time of the course of the lane marking in the plane of the roadway is tracked by means of a status estimator.
  • In another embodiment, at least one test zone comprising a plurality of picture elements of the image is defined for each detected lane marking. With the aid of intensity values of the picture elements associated with the test zone, at least one parameter is determined. The detected lane marking is assigned to one of a plurality of lane marking categories, depending on the parameter.
  • By division of the detected lane markings into various categories, unclear situations can be overcome as well. The parameter used as the assignment criterion can be any image characteristic that is determined based on the pixels of the test zone by means of simple computing operations or by means of complex image processing methods. It should be pointed out that the parameter is preferably independent of the detection criterion, that is, it serves for additional checking, particularly extended checking, of a previously detected lane marking. Different kinds of lane markings can therefore be distinguished from each other in a simple manner. In particular, the actual valid markings can be filtered out from all potential lane markings or marking candidates. The performance of the tracking system can therefore be increased appreciably.
  • According to another embodiment of the invention, each detected lane marking is subjected to the tracking process as a lane marking to be tracked or rejected as an invalid lane marking, depending on the lane marking category to which it is assigned. That is to say, verification of the detected lane markings takes place. As a result, “false candidates” such as crash barriers and tar seams may be excluded from tracking, which leads to greater robustness of the system.
  • The detected lane markings in each case may be assigned as a whole to the corresponding lane marking category. Alternatively, however, they may be assigned in sections. That is to say, the lane marking is divided into a plurality of sections along its course, each section on its own being assigned to a lane marking category. In this case, a section of a lane marking located close to the vehicle may be assigned to a different category from a section of the same lane marking further away from the vehicle. As a result, the robustness of the system may be further increased in cases of restricted view or partially obscured markings: for example, a lane marking is acknowledged as a lane marking to be tracked only in the region close to the vehicle, whereas in the region further away it is rejected and so characterized as invalid.
  • Preferably, the position of the test zone within the image is determined by converting the position of a tracked lane marking in the plane of the roadway at a predetermined distance from the vehicle to image coordinates by a projection method. A detected lane marking is in this way matched with lane markings already tracked, i.e. subjected to the tracking process.
  • According to another embodiment, from the intensity values of the picture elements associated with the test zone at least one statistical parameter is determined. Preferably, a plurality of statistical parameters is determined, for example an average intensity value, an intensity value total, a maximum intensity value, a minimum intensity value and/or at least one moment of intensity value distribution of the picture elements. The intensity values are typically the digital greyscale values or color values of the individual pixels of the image sensor. By statistical analysis of the extended test zone, reliable checking of the appearance of the detected lane marking and hence reliable division into categories are possible. For statistical analysis, Hu's moments, inclination, minimum or maximum gradient, histograms, quantiles of histograms and/or histograms of gradients may further also be used. It is recognized that characteristic parameters such as the width of a lane marking may also be used as parameters.
  • According to an embodiment of the invention, the parameter for a plurality of successive images is determined and the detected lane marking is assigned to the lane marking category in addition with the aid of the curve of the parameter in time. This allows further refined division into categories, as the time response of the parameter contains important information with respect to the appearance of the detected lane marking. For example, unbroken lane markings and broken lane markings may be distinguished particularly well with the aid of time analysis.
  • Preferably, the current speed of the vehicle is determined, wherein with the aid of the speed the curve of the parameter in time is converted to a curve of the parameter in space in the plane of the roadway, and wherein the detected lane marking is assigned to the lane marking category with the aid of the curve of the parameter in space. The parameter is therefore regarded as a function of the distance covered by the vehicle, which is more favorable with respect to assessment of the appearance of the lane marking. By looking at the parameter in the location, compensation of the vehicle's own movement may moreover be carried out particularly easily.
  • According to another embodiment, a statistical curve measure is determined for characterization of the curve of the parameter in time. In particular, the statistical curve measure may be based on an average, a standard deviation, or a variance. The detected lane marking is assigned to the lane marking category based in part on the statistical curve measure. Division of the detected lane markings into categories can thus be based on, for example, space statistics provided by the parameter and time statistics provided by the curve measure. This allows again extended analysis of the appearance of the lane marking to be checked. Characterization of the curve of the parameter in time can also be based on a sliding average and/or a filter.
  • The curve of the parameter over time can further be subjected to a time-frequency transform, in particular a Fourier transform or a Haar transform. The detected lane marking is assigned to the lane marking category with the aid of the transform coefficients determined within the framework of the time-frequency transform. For example, the corresponding Fourier coefficients or Haar coefficients can serve as the statistical curve measures in the characterization of the curve of the parameter over time described above. An advantage of the Haar transform is that the associated calculations are quick and simple. The Haar transform is also particularly well suited to rectangular (square-wave) functions, which accommodates the detection of broken lane markings.
  • The time-frequency transform can be carried out iteratively. That is to say, a moving time window is observed, where only new values are added and older values are weighted correspondingly. As a result, in particular the computing costs can be reduced.
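One possible realization of such an iterative evaluation over a moving time window is an exponential forgetting scheme, in which each new value is added and all older values are down-weighted; the function name and the forgetting factor are illustrative assumptions:

```python
def iterative_update(prev_measure, new_value, forgetting=0.9):
    """Recursively update a curve measure over a moving time window:
    the newest value is blended in and older values decay geometrically."""
    return forgetting * prev_measure + (1.0 - forgetting) * new_value
```

Because only the previous value of the measure is stored, the computing costs per image are constant, independent of the window length.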
  • Preferably, for each detected lane marking there is defined a set of several test zones that in each case correspond to the position of a tracked lane marking in the plane of the roadway at different distances from the vehicle. The lane marking is thus assessed not just in isolation at one point, but along its course. When defining the set of test zones, in particular a uniform distance between the positions of the tracked lane marking in the plane of the roadway can be selected.
  • According to another embodiment, for each detected lane marking at least 5 and preferably at least 15 test zones are defined. As a result, reliable checking of the detected lane marking along its course is ensured. The detected lane marking may be assigned to a lane marking category from a group of lane marking categories which includes the categories of “invalid image object”, “single unbroken line”, “double unbroken line”, “single broken line”, “double broken line”, “broken and unbroken line”, “wide broken line” and “line with surface profile”. By this means, not only can actual lane markings be distinguished from artifacts, but an associated driver assistance system can also manage several lanes and in the process e.g. distinguish normal lanes from motorway exit lanes. In addition, haptic markings (Botts' dots), gaps between broken markings and similar objects that are difficult to detect can be handled.
  • According to another embodiment, at least 5 and preferably at least 10 different parameters are determined for the or each test zone, wherein in particular the detected lane markings are assigned to a lane marking category with the aid of a subgroup of parameters which is selected by a classification method. For instance, a classifier module can decide, on the basis of several characteristics and as a function of probability, to which category the detected lane marking is to be assigned. This also allows division into categories with a similar curve pattern, such as “single unbroken line” and “double unbroken line”.
  • The classification method can use a neural network and/or a support vector machine. This enables “training” of the system, so that a plurality of lane marking categories can be managed while a plurality of parameters and/or curve measures is processed.
  • Another embodiment provides that respective parameters of several test zones of a detected lane marking at different distances from the vehicle and/or respective parameters of a single test zone in successive images are compared with each other and, with the aid of the result of the comparison, combined into a common parameter. Preferably, a degree of statistical confidence is calculated from the result of the comparison and assigned to the detected lane marking. Such an amalgamation allows a consistency check, as a result of which the robustness of the method can be increased appreciably.
  • The teachings presented herein also relate to a computer program having program code means for carrying out a method as described above, when the computer program is run on a computer or a corresponding calculating unit.
  • Furthermore, the teachings presented herein also relate to a computer program product having program code means that are stored on a computer-readable data carrier, for carrying out a method as described above, when the computer program is run on a computer or a corresponding calculating unit.
  • The teachings presented herein also relate to a device for the detection and tracking of lane markings from a motor vehicle, having an image capture device mounted on the vehicle for taking a picture, and a data processing device that is designed to carry out the above-mentioned method.
  • Further features and advantages will appear more clearly on a reading of the following detailed description of the preferred embodiment, which is given by way of non-limiting example only and with reference to the accompanying drawings.
  • BRIEF DESCRIPTION OF DRAWINGS
  • The present invention will now be described, by way of example with reference to the accompanying drawings, in which:
  • FIG. 1 is a top view of a traffic space occupied by a motor vehicle equipped with an image capture device for the detection and tracking of lane markings in accordance with one embodiment;
  • FIG. 2 is a first picture taken by the image capture device mounted on the vehicle as in FIG. 1 in accordance with one embodiment;
  • FIG. 3 is a second picture taken by an image capture device mounted on the vehicle as in FIG. 1 in accordance with one embodiment; and
  • FIG. 4 is a lane marking at three successive points in time during the travel of the vehicle as in FIG. 1 in accordance with one embodiment.
  • DETAILED DESCRIPTION
  • According to FIG. 1, a motor vehicle 10 is moving forwards in a direction of travel F in the lane 11 of a road. The lane 11 is defined by a left lane marking 12 a in the form of an unbroken line and by a right lane marking, not shown. A camera 14 is mounted on the vehicle 10 at, for example, a front region of the vehicle, such as the roof lining of the vehicle interior. The camera 14 is generally configured to continuously capture images (i.e. a sequence or series of image frames) of the space located in front of the vehicle 10, for example, the area between sight rays 15. The area viewed by the camera 14 may be characterized according to a world coordinate system having x and y distance values. Furthermore, the camera 14 is coupled to an image processing computer (not shown) to form an image processing system that is generally configured to process the images provided by the camera 14. The image processing computer may be housed within the camera 14, or may be located elsewhere in the vehicle 10.
  • FIGS. 2 and 3 illustrate two non-limiting and simplified examples of images 20, 20′ of the space located in front of the vehicle 10 as captured by the camera 14.
  • The camera 14 and the associated image processing computer form part of a driver assistance system such as, for example, a lane keeping support system or lane departure warning system (LDW). This system detects and tracks lane markings with the aid of the images captured by the camera 14 at regular intervals, as stated in more detail below.
  • First, by means of suitable image processing algorithms known in the field, all potential lane markings 12 a, 12 b, 12 c, 16, 18 are extracted from a captured image 20, 20′. Image regions that meet a predetermined detection criterion for lane markings are identified as detected lane markings. The lane markings detected in this way may be, for example, unbroken lines 12 a, narrow broken lines 12 b, wide broken lines 12 c, crash barriers 16, or curbs 18. A verification and classification module (i.e. software or an algorithm) associated with the image processing computer ensures that only valid lane markings 12 a, 12 b, 12 c are tracked, and that the type of lane marking is taken into consideration during tracking.
  • For this purpose, the verification and classification module defines for each of the detected lane markings (i.e. for each ‘marking candidate’) a set of twenty test zones 22 which in each case correspond to the position of a tracked lane marking 12 a, 12 b, 12 c in the plane of the roadway at a certain distance from the vehicle 10. The position of each test zone 22 within the image 20, 20′ is determined by converting the position of a tracked lane marking 12 a, 12 b, 12 c in the plane of the roadway, given in world coordinates x, y at predetermined distances from the vehicle 10, to image coordinates by means of suitable projection equations. Each test zone 22 comprises several pixels of the image 20, 20′ and extends over a certain region within the lane marking 12 a, 12 b, 12 c and, where appropriate, into the surrounding area. For each test zone 22, a set of various statistical parameters is determined from the greyscale values of the pixels concerned, for example an average greyscale value, a maximum greyscale value, a minimum greyscale value and/or different moments of the greyscale value distribution. These statistical parameters form characteristics or descriptors that are assigned to the respective lane marking candidate.
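The per-zone statistics described above can be sketched as follows; the dictionary keys and the particular central moments computed are illustrative assumptions:

```python
import numpy as np

def zone_parameters(pixels):
    """Statistical parameters of the greyscale values inside one test zone."""
    g = np.asarray(pixels, dtype=float).ravel()
    mean = g.mean()
    centred = g - mean
    return {
        "mean": mean,                        # average greyscale value
        "total": g.sum(),                    # greyscale value total
        "max": g.max(),                      # maximum greyscale value
        "min": g.min(),                      # minimum greyscale value
        "moment2": np.mean(centred ** 2),    # 2nd central moment (variance)
        "moment3": np.mean(centred ** 3),    # 3rd central moment (asymmetry)
    }
```

Each test zone thus yields a fixed-length descriptor vector for the marking candidate, regardless of how many pixels the zone covers.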
  • Each of the parameters is determined for all of the images 20, 20′ succeeding each other in time and is processed as a function of time. Since the current speed of the vehicle 10 is determined, the functions of time can be converted to functions of the distance covered. These functions of the distance covered constitute the curve of the parameters in space in the plane of the roadway. To describe the functions, a set of statistical curve measures, such as the average or the standard deviation, is subsequently determined.
  • In addition the curve of the parameters in time is subjected to a Haar transform, wherein the Haar coefficients obtained in the process are also assigned to the set of curve measures.
  • For each test zone 22 there is now a plurality of extracted characteristics in the form of parameters and curve measures, which in their entirety characterize relatively precisely the optical appearance of the lane marking concerned and in particular its variation over time. A classifier now selects certain characteristics from this plurality, using a neural network and/or a support vector machine, and assigns the detected lane marking to one of several lane marking categories with the aid of the values of the selected characteristics. Specifically, the classifier decides whether the detected lane marking is an invalid image object such as a crash barrier 16 or a curb 18, an unbroken line 12 a, a narrow broken line 12 b, or a wide broken line 12 c. It should be pointed out that the method described makes it possible to distinguish between many further common types of marking such as, for example, double broken lines, combined broken and unbroken lines, and lines with a surface profile (Botts' dots).
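To illustrate the classification step, the sketch below uses a simple nearest-centroid classifier over a selected subgroup of the extracted characteristics. This is a deliberately simplified stand-in: the description proposes a neural network or a support vector machine, and the class name, feature indices, and category labels here are assumptions for illustration only.

```python
import numpy as np

class MarkingClassifier:
    """Toy stand-in for the trained classifier: assigns a marking candidate
    to the category whose training centroid is nearest in feature space."""

    def __init__(self, feature_idx):
        self.feature_idx = list(feature_idx)  # selected subgroup of parameters
        self.centroids = {}

    def fit(self, X, labels):
        X = np.asarray(X, dtype=float)[:, self.feature_idx]
        for lab in set(labels):
            rows = [x for x, l in zip(X, labels) if l == lab]
            self.centroids[lab] = np.mean(rows, axis=0)
        return self

    def predict(self, x):
        x = np.asarray(x, dtype=float)[self.feature_idx]
        return min(self.centroids,
                   key=lambda lab: np.linalg.norm(x - self.centroids[lab]))
```

In the real system the decision would additionally be probabilistic, so that categories with similar curve patterns can still be separated.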
  • To make the classification more robust, an amalgamation can be made by comparing respective parameters of several test zones 22 of a detected lane marking at different distances from the vehicle 10, as well as respective parameters of a single test zone 22 in successive images 20, 20′, and combining them into a common parameter with the aid of the result of the comparison. If, for example, nineteen parameters indicate a curb for a lane marking and one parameter indicates a broken line, the last-mentioned value is rejected as an error and the lane marking is classed as a curb. Further, a degree of statistical confidence is calculated and likewise assigned to the detected lane marking. A particular advantage of the invention is that the classifier can be trained.
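The majority decision in the nineteen-versus-one example above can be sketched as a simple vote over the per-zone results, with the fraction of agreeing zones serving as an (assumed) degree of statistical confidence:

```python
from collections import Counter

def amalgamate(zone_votes):
    """Combine per-test-zone classification results into one common
    category and a degree of statistical confidence (agreement fraction)."""
    counts = Counter(zone_votes)
    category, n = counts.most_common(1)[0]   # majority category
    return category, n / len(zone_votes)     # confidence = agreeing share
```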
  • Within the scope of the amalgamation described above, the results of the previous time step are shifted by the distance covered and merged with the current positions. Thus, compensation of the vehicle's own movement is carried out during the amalgamation. The robustness gained by this method is particularly noticeable in the regions located close to the vehicle 10, as these positions are observed repeatedly. The amalgamation in combination with compensation of the vehicle's own movement is illustrated in FIG. 4, wherein corresponding test zones 22 are connected by lines 23.
  • Preferably a time signal is formed by means of the classification results for a fixed particular location in world coordinates x, y. The results of classification are then shifted by means of compensation of the vehicle's own movement, and the results are amalgamated for a certain point in the world. This is preferably carried out recursively.
  • All detected lane markings which are classified by the classifier as valid lane markings 12 a, 12 b, 12 c are, as lane markings to be tracked, subjected to a tracking process in which the variation of the course of the lane markings 12 a, 12 b, 12 c in time in the plane of the roadway is tracked by means of a status estimator. The detected lane markings which are classified as “invalid lane markings” 16, 18 by the classifier are, on the other hand, rejected, that is, not tracked.
  • Actual tracking of the variation of the course of the lane markings 12 a, 12 b, 12 c in time is preferably carried out by an independent module and takes place by means of a status estimator that is based on an extended Kalman filter.
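The description specifies an extended Kalman filter as the status estimator but does not give its state or measurement model. The sketch below therefore assumes a simple polynomial lane model, y(d) = offset + heading·d + ½·curvature·d², with the lateral marking position measured at distance d; for this linear measurement model the extended Kalman filter update reduces to the ordinary Kalman update shown (the prediction step and all noise levels are likewise assumptions):

```python
import numpy as np

def kf_update(x, P, d, y_meas, r=0.25):
    """One measurement update of the lane-state estimate.

    x : state [lateral offset, heading angle, curvature]
    P : state covariance (3x3)
    d : distance ahead of the vehicle at which y_meas was observed
    """
    H = np.array([[1.0, d, 0.5 * d * d]])  # measurement Jacobian of y(d)
    S = H @ P @ H.T + r                    # innovation covariance
    K = P @ H.T / S                        # Kalman gain
    x = x + (K * (y_meas - H @ x)).ravel() # correct state with innovation
    P = (np.eye(3) - K @ H) @ P            # reduce uncertainty accordingly
    return x, P
```

Each tracked lane marking would carry such a state, predicted forward with the vehicle's motion and corrected with the marking positions measured in every new image.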
  • The invention enables robust lane detection even in complex traffic situations such as the town traffic situation shown in FIG. 2 or the building site situation shown in FIG. 3, wherein several lanes 11 can be handled safely as well.
  • While this invention has been described in terms of the preferred embodiments thereof, it is not intended to be so limited, but rather only to the extent set forth in the claims that follow.

Claims (18)

We claim:
1. A method for the detection and tracking of lane markings (12 a, 12 b, 12 c) from a motor vehicle (10) on which an image capture device (14) is mounted, wherein
an image (20, 20′) of a space located in front of the vehicle (10), comprising a plurality of picture elements, is captured by the image capture device at regular intervals;
the picture elements that meet a predetermined detection criterion are identified as detected lane markings in the captured image by an image processing system;
at least one detected lane marking identified as a lane marking to be tracked is subjected to a tracking process in which the variation of the course of the lane marking in time in the plane of the roadway is tracked by means of a status estimator;
at least one test zone (22) comprising a plurality of picture elements of the image is defined for each detected lane marking;
with the aid of intensity values of the picture elements associated with the test zone, at least one parameter is determined; and
the detected lane marking is assigned to one of a plurality of lane marking categories, depending on the parameter.
2. The method according to claim 1, wherein each detected lane marking is subjected to the tracking process as a lane marking to be tracked or rejected as an invalid lane marking, depending on the lane marking category to which it is assigned.
3. The method according to claim 1, wherein the position of the test zone within the image is determined by converting the position of a tracked lane marking in the plane of the roadway at a predetermined distance from the vehicle to image coordinates by a projection method.
4. The method according to claim 1, wherein from the intensity values of the picture elements associated with the test zone at least one statistical parameter is determined.
5. The method according to claim 4, wherein the statistical parameters include an average intensity value, an intensity value total, a maximum intensity value, a minimum intensity value, and a moment of intensity value distribution of the picture elements.
6. The method according to claim 1, wherein the parameter for a plurality of successive images is determined, and the detected lane marking is assigned to the lane marking category in addition with the aid of the curve of the parameter in time.
7. The method according to claim 6, wherein the current speed of the vehicle is determined and with the aid of the speed the curve of the parameter in time is converted to a curve of the parameter in space in the plane of the roadway, and wherein the detected lane marking is assigned to the lane marking category with the aid of the curve of the parameter in space.
8. The method according to claim 6, wherein for characterization of the curve of the parameter in time, a statistical curve measure is determined, wherein the detected lane marking is assigned to the lane marking category with the aid of the statistical curve measure.
9. The method according to claim 6, wherein the curve of the parameter in time is subjected to a time-frequency transform, wherein the detected lane marking is assigned to the lane marking category with the aid of the transform coefficients determined within the framework of the time-frequency transform.
10. The method according to claim 9, wherein the time-frequency transform is carried out iteratively.
11. The method according to claim 1, wherein for each detected lane marking there is defined a set of a plurality of test zones which in each case correspond to the position of a tracked lane marking in the plane of the roadway at different distances from the vehicle.
12. The method according to claim 11, wherein for each detected lane marking at least five test zones are defined.
13. The method according to claim 1, wherein the detected lane marking is assigned to a lane marking category from a group of lane marking categories that includes the categories of “invalid image object”, “single unbroken line”, “double unbroken line”, “single broken line”, “double broken line”, “broken and unbroken line”, “wide broken line” and “line with surface profile”.
14. The method according to claim 1, wherein at least five different parameters are determined for the or each test zone, wherein in particular the detected lane markings are assigned to a lane marking category with the aid of a subgroup of parameters which is selected by a classification method.
15. The method according to claim 14, wherein the classification method uses a neural network and/or a support vector machine.
16. The method according to claim 1, wherein respective parameters of a plurality of test zones of a detected lane marking at different distances from the vehicle and/or respective parameters of a single test zone in successive images are compared with each other and, with the aid of the result of comparison, combined into a common parameter.
17. The method according to claim 16, wherein with the aid of the result of comparison a degree of statistical confidence is calculated and assigned to the detected lane marking.
18. A method for the detection and tracking of lane markings by an image capture device mounted on a motor vehicle, said method comprising:
capturing an image of a space located in front of the vehicle with the image capture device, wherein said image comprises a plurality of picture elements captured by the image capture device at regular time intervals;
identifying picture elements that meet a predetermined detection criterion as detected lane markings in the captured image by means of an image processing system;
indicating at least one detected lane marking as a lane marking to be tracked according to a tracking process in which the variation of the course of the lane marking over time in the plane of the roadway is tracked by means of a status estimator of the image processing system;
defining for each detected lane marking a test zone comprising a plurality of picture elements;
determining at least one parameter based on an intensity value of the picture elements associated with the test zone; and
assigning the detected lane marking to one of a plurality of lane marking categories based on the parameter.
US13/650,182 2012-02-15 2012-10-12 Method for the detection and tracking of lane markings Active 2033-10-08 US9047518B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP12000978.2A EP2629243A1 (en) 2012-02-15 2012-02-15 Method for detecting and tracking lane markings
EP12000978.2 2012-02-15
EP12000978 2012-02-15

Publications (2)

Publication Number Publication Date
US20130208945A1 true US20130208945A1 (en) 2013-08-15
US9047518B2 US9047518B2 (en) 2015-06-02

Family

ID=45655022

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/650,182 Active 2033-10-08 US9047518B2 (en) 2012-02-15 2012-10-12 Method for the detection and tracking of lane markings

Country Status (2)

Country Link
US (1) US9047518B2 (en)
EP (1) EP2629243A1 (en)

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130202155A1 (en) * 2012-02-03 2013-08-08 Gopal Gudhur Karanam Low-cost lane marker detection
US20150139520A1 (en) * 2013-03-06 2015-05-21 Koninklijke Philips N.V. Scan region determining apparatus
US9047518B2 (en) * 2012-02-15 2015-06-02 Delphi Technologies, Inc. Method for the detection and tracking of lane markings
US9120486B1 (en) 2014-04-22 2015-09-01 Fca Us Llc Vehicle lane keeping techniques
US20150354976A1 (en) * 2014-06-10 2015-12-10 Mobileye Vision Technologies Ltd. Top-down refinement in lane marking navigation
US20160176341A1 (en) * 2014-12-22 2016-06-23 Volkswagen Ag Early detection of exit only and shared lanes using perception technology
US9470537B2 (en) * 2014-10-22 2016-10-18 Volkswagen Ag Accurate position determination near exit lanes
CN106897680A (en) * 2017-02-14 2017-06-27 中国科学院自动化研究所 The detection method and device of a kind of lane line
US9840253B1 (en) * 2016-06-14 2017-12-12 Delphi Technologies, Inc. Lane keeping system for autonomous vehicle during camera drop-outs
US9846823B2 (en) 2014-06-13 2017-12-19 Fujitsu Limited Traffic lane boundary line extraction apparatus and method of extracting traffic lane boundary line
WO2018081807A3 (en) * 2016-10-31 2018-06-28 Mobileye Vision Technologies Ltd. Systems and methods for navigating lane merges and lane splits
CN108427931A (en) * 2018-03-21 2018-08-21 合肥工业大学 The detection method of barrier before a kind of mine locomotive based on machine vision
WO2018167688A1 (en) * 2017-03-15 2018-09-20 3M Innovative Properties Company Pavement marking system for lane identification
US10150473B2 (en) 2014-08-18 2018-12-11 Mobileye Vision Technologies Ltd. Recognition and prediction of lane constraints and construction areas in navigation
US10589743B2 (en) 2016-08-04 2020-03-17 Toyota Jidosha Kabushiki Kaisha Vehicle travel control apparatus
US20200192379A1 (en) * 2018-07-13 2020-06-18 Kache.AI System and method for automatically following a lane while in a vehicle's autonomous driving mode
US10803355B2 (en) * 2018-12-19 2020-10-13 Industrial Technology Research Institute Method for training image generator
JP2021508901A * 2018-05-31 2021-03-11 Shanghai Sensetime Intelligent Technology Co., Ltd. Intelligent drive control methods and devices based on lane markings, as well as electronics
US11256781B2 (en) * 2019-02-20 2022-02-22 Rohde & Schwarz Gmbh & Co. Kg Measurement system as well as method of providing statistical information
RU2767508C2 * 2014-05-21 2022-03-17 Universal City Studios LLC System and method for tracking vehicles in multi-level parking areas and at intersections
DE112015004885B4 (en) 2014-10-28 2023-10-12 Trw Automotive U.S. Llc IMPROVED ROAD DETECTION USING KINEMATIC DATA

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012068331A1 (en) 2010-11-19 2012-05-24 Magna Electronics Inc. Lane keeping system and lane centering system
US9824300B2 (en) 2016-03-01 2017-11-21 Ford Global Technologies, Llc Vehicle lane learning
DE102016205780A1 (en) 2016-04-07 2017-10-12 Volkswagen Aktiengesellschaft Method and device for adjusting a controller of a vehicle and control system for a vehicle
CN106354135A (en) * 2016-09-19 2017-01-25 武汉依迅电子信息技术有限公司 Lane keeping system and method based on Beidou high-precision positioning
EP3576008B1 (en) 2018-05-30 2023-12-27 Aptiv Technologies Limited Image based lane marking classification
DE102021005740A1 (en) 2021-11-19 2022-01-20 Daimler Ag Method of detecting and tracking lane markings

Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6819779B1 (en) * 2000-11-22 2004-11-16 Cognex Corporation Lane detection system and apparatus
US20050069207A1 (en) * 2002-05-20 2005-03-31 Zakrzewski Radoslaw Romuald Method for detection and recognition of fog presence within an aircraft compartment using video images
US20050135658A1 (en) * 2003-12-17 2005-06-23 Mitsubishi Denki Kabushiki Kaisha Lane recognizing image processing system
US20050256636A1 (en) * 2004-05-11 2005-11-17 Toyota Jidosha Kabushiki Kaisha Driving lane recognizer and driving lane recognizing method
US20050273264A1 (en) * 2004-06-02 2005-12-08 Daimlerchrysler Ag Method and device for warning a driver of lane departure
US20050278098A1 (en) * 1994-05-23 2005-12-15 Automotive Technologies International, Inc. Vehicular impact reactive system and method
US20060015252A1 (en) * 2004-07-15 2006-01-19 Mitsubishi Denki Kabushiki Kaisha Lane recognition image processing apparatus
US20060078205A1 (en) * 2004-10-08 2006-04-13 Porikli Fatih M Detecting roads in aerial images using feature-based classifiers
US20060233424A1 (en) * 2005-01-28 2006-10-19 Aisin Aw Co., Ltd. Vehicle position recognizing device and vehicle position recognizing method
US20080040004A1 (en) * 1994-05-23 2008-02-14 Automotive Technologies International, Inc. System and Method for Preventing Vehicular Accidents
US20080061952A1 (en) * 2004-08-19 2008-03-13 Robert Bosch Gmbh Method And Device For Driver Information
US20080208460A1 (en) * 2007-02-13 2008-08-28 Aisin Aw Co., Ltd. Lane determining device, method, and program
US20080291276A1 (en) * 2003-10-24 2008-11-27 Martin Randler Method for Driver Assistance and Driver Assistance Device on the Basis of Lane Information
US20090067675A1 (en) * 2007-09-07 2009-03-12 Yi Tan Radar guided vision system for vehicle validation and vehicle motion characterization
US20090080704A1 (en) * 2005-06-27 2009-03-26 Naoki Mori Vehicle and lane recognition device
US20090088966A1 (en) * 2007-09-27 2009-04-02 Hitachi, Ltd. Driving support system
US20090169055A1 (en) * 2007-12-27 2009-07-02 Aisin Aw Co., Ltd. Feature information collecting apparatus and feature information collecting program, and own vehicle position recognition apparatus and navigation apparatus
US20090167864A1 (en) * 2005-12-28 2009-07-02 Honda Motor Co., Ltd. Vehicle and Lane Mark Detection Device
US20100121569A1 (en) * 2007-05-25 2010-05-13 Aisin Aw Co., Ltd Lane determining device, lane determining method and navigation apparatus using the same
US20100172542A1 (en) * 2007-12-06 2010-07-08 Gideon Stein Bundling of driver assistance systems
US20110196608A1 (en) * 2010-02-06 2011-08-11 Bayerische Motoren Werke Aktiengesellschaft Method for Position Determination for a Motor Vehicle
US20120050074A1 (en) * 2010-02-26 2012-03-01 Bechtel Jon H Automatic vehicle equipment monitoring, warning, and control system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4314870B2 (en) * 2003-04-22 2009-08-19 日産自動車株式会社 Lane detection device
EP2629243A1 (en) * 2012-02-15 2013-08-21 Delphi Technologies, Inc. Method for detecting and tracking lane markings


Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130202155A1 (en) * 2012-02-03 2013-08-08 Gopal Gudhur Karanam Low-cost lane marker detection
US9047518B2 (en) * 2012-02-15 2015-06-02 Delphi Technologies, Inc. Method for the detection and tracking of lane markings
US9858667B2 (en) * 2013-03-06 2018-01-02 Koninklijke Philips N.V. Scan region determining apparatus
US20150139520A1 (en) * 2013-03-06 2015-05-21 Koninklijke Philips N.V. Scan region determining apparatus
US9120486B1 (en) 2014-04-22 2015-09-01 Fca Us Llc Vehicle lane keeping techniques
RU2767508C2 (en) * 2014-05-21 2022-03-17 ЮНИВЕРСАЛ СИТИ СТЬЮДИОС ЭлЭлСи System and method for tracking vehicles in multi-level parking areas and at intersections
US20150354976A1 (en) * 2014-06-10 2015-12-10 Mobileye Vision Technologies Ltd. Top-down refinement in lane marking navigation
WO2015189847A1 (en) * 2014-06-10 2015-12-17 Mobileye Vision Technologies Ltd. Top-down refinement in lane marking navigation
US10317231B2 (en) * 2014-06-10 2019-06-11 Mobileye Vision Technologies Ltd. Top-down refinement in lane marking navigation
US10753758B2 (en) * 2014-06-10 2020-08-25 Mobileye Vision Technologies Ltd. Top-down refinement in lane marking navigation
US20200003573A1 (en) * 2014-06-10 2020-01-02 Mobileye Vision Technologies Ltd. Top-down refinement in lane marking navigation
US9846823B2 (en) 2014-06-13 2017-12-19 Fujitsu Limited Traffic lane boundary line extraction apparatus and method of extracting traffic lane boundary line
US10926763B2 (en) 2014-08-18 2021-02-23 Mobileye Vision Technologies Ltd. Recognition and prediction of lane constraints and construction areas in navigation
US11834040B2 (en) 2014-08-18 2023-12-05 Mobileye Vision Technologies Ltd. Recognition and prediction of lane constraints and construction areas in navigation
US10150473B2 (en) 2014-08-18 2018-12-11 Mobileye Vision Technologies Ltd. Recognition and prediction of lane constraints and construction areas in navigation
US9470537B2 (en) * 2014-10-22 2016-10-18 Volkswagen Ag Accurate position determination near exit lanes
DE112015004885B4 (en) 2014-10-28 2023-10-12 Trw Automotive U.S. Llc IMPROVED ROAD DETECTION USING KINEMATIC DATA
US10025996B2 (en) * 2014-12-22 2018-07-17 Volkswagen Ag Early detection of exit only and shared lanes using perception technology
US10152638B2 (en) * 2014-12-22 2018-12-11 Volkswagen Ag Early detection of exit only and shared lanes using perception technology
US20160176341A1 (en) * 2014-12-22 2016-06-23 Volkswagen Ag Early detection of exit only and shared lanes using perception technology
US9840253B1 (en) * 2016-06-14 2017-12-12 Delphi Technologies, Inc. Lane keeping system for autonomous vehicle during camera drop-outs
US20170355366A1 (en) * 2016-06-14 2017-12-14 Delphi Technologies, Inc. Lane keeping system for autonomous vehicle during camera drop-outs
US10589743B2 (en) 2016-08-04 2020-03-17 Toyota Jidosha Kabushiki Kaisha Vehicle travel control apparatus
US11392135B2 (en) * 2016-10-31 2022-07-19 Mobileye Vision Technologies Ltd. Systems and methods for navigating lane merges and lane splits
US10739782B2 (en) * 2016-10-31 2020-08-11 Mobileye Vision Technologies Ltd. Systems and methods for navigating lane merges and lane splits
US11960293B2 (en) * 2016-10-31 2024-04-16 Mobileye Vision Technologies Ltd. Systems and methods for navigating lane merges and lane splits
US20220283591A1 (en) * 2016-10-31 2022-09-08 Mobileye Vision Technologies Ltd. Systems and methods for navigating lane merges and lane splits
WO2018081807A3 (en) * 2016-10-31 2018-06-28 Mobileye Vision Technologies Ltd. Systems and methods for navigating lane merges and lane splits
CN106897680A (en) * 2017-02-14 2017-06-27 中国科学院自动化研究所 The detection method and device of a kind of lane line
WO2018167688A1 (en) * 2017-03-15 2018-09-20 3M Innovative Properties Company Pavement marking system for lane identification
US11124933B2 (en) 2017-03-15 2021-09-21 3M Innovative Properties Company Pavement marking system for lane identification
CN108427931A (en) * 2018-03-21 2018-08-21 合肥工业大学 The detection method of barrier before a kind of mine locomotive based on machine vision
JP2021508901A (en) * 2018-05-31 Shanghai Sensetime Intelligent Technology Co., Ltd. Intelligent drive control methods and devices based on lane markings, as well as electronic devices
US11314973B2 (en) 2018-05-31 2022-04-26 Shanghai Sensetime Intelligent Technology Co., Ltd. Lane line-based intelligent driving control method and apparatus, and electronic device
JP7024115B2 (en) 2018-05-31 Shanghai Sensetime Intelligent Technology Co., Ltd. Intelligent drive control methods and devices based on lane markings, as well as electronic devices
US20200192379A1 (en) * 2018-07-13 2020-06-18 Kache.AI System and method for automatically following a lane while in a vehicle's autonomous driving mode
US10803355B2 (en) * 2018-12-19 2020-10-13 Industrial Technology Research Institute Method for training image generator
US11256781B2 (en) * 2019-02-20 2022-02-22 Rohde & Schwarz Gmbh & Co. Kg Measurement system as well as method of providing statistical information

Also Published As

Publication number Publication date
US9047518B2 (en) 2015-06-02
EP2629243A1 (en) 2013-08-21

Similar Documents

Publication Publication Date Title
US9047518B2 (en) Method for the detection and tracking of lane markings
US8890951B2 (en) Clear path detection with patch smoothing approach
US8670592B2 (en) Clear path detection using segmentation-based method
US10699567B2 (en) Method of controlling a traffic surveillance system
JP6458734B2 (en) Passenger number measuring device, passenger number measuring method, and passenger number measuring program
EP3576008B1 (en) Image based lane marking classification
US8634593B2 (en) Pixel-based texture-less clear path detection
WO2016129403A1 (en) Object detection device
US11064177B2 (en) Image processing apparatus, imaging apparatus, mobile device control system, image processing method, and recording medium
JP6358552B2 (en) Image recognition apparatus and image recognition method
JP2008123462A (en) Object detector
JP2013232091A (en) Approaching object detection device, approaching object detection method and approaching object detection computer program
JP6139088B2 (en) Vehicle detection device
US10984263B2 (en) Detection and validation of objects from sequential images of a camera by using homographies
US10984264B2 (en) Detection and validation of objects from sequential images of a camera
JP2013057992A (en) Inter-vehicle distance calculation device and vehicle control system using the same
JP6818626B2 (en) Vehicle type discrimination device, vehicle type discrimination method, and vehicle type discrimination system
JP5997962B2 (en) In-vehicle lane marker recognition device
CN107255470A (en) Obstacle detector
US20130070098A1 (en) Apparatus for monitoring surroundings of a vehicle
JP2014067320A (en) Stereo camera device
JP2004348645A (en) Infrared image recognition apparatus and alarm equipment using the same
US20240001933A1 (en) Occupant temperature estimating device, occupant state detection device, occupant temperature estimating method, and occupant temperature estimating system
KR101352662B1 (en) Apparatus and method for detecting passing vehicle using lane recognition
KR102039814B1 (en) Method and apparatus for blind spot detection

Legal Events

Date Code Title Description
AS Assignment

Owner name: DELPHI TECHNOLOGIES, INC., MICHIGAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NUNN, CHRISTIAN;MEUTER, MIRKO;MUELLER, DENNIS;AND OTHERS;SIGNING DATES FROM 20121010 TO 20121011;REEL/FRAME:029117/0572

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: APTIV TECHNOLOGIES LIMITED, BARBADOS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DELPHI TECHNOLOGIES INC.;REEL/FRAME:047143/0874

Effective date: 20180101

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

AS Assignment

Owner name: APTIV TECHNOLOGIES (2) S.À R.L., LUXEMBOURG

Free format text: ENTITY CONVERSION;ASSIGNOR:APTIV TECHNOLOGIES LIMITED;REEL/FRAME:066746/0001

Effective date: 20230818

Owner name: APTIV MANUFACTURING MANAGEMENT SERVICES S.A R.L., LUXEMBOURG

Free format text: MERGER;ASSIGNOR:APTIV TECHNOLOGIES (2) S.A R.L.;REEL/FRAME:066566/0173

Effective date: 20231005

Owner name: APTIV TECHNOLOGIES AG, SWITZERLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:APTIV MANUFACTURING MANAGEMENT SERVICES S.A R.L.;REEL/FRAME:066551/0219

Effective date: 20231006