WO2001080068A1 - Generating a model of the path of a roadway from an image recorded by a camera - Google Patents


Info

Publication number
WO2001080068A1
WO2001080068A1 PCT/US2001/012527
Authority
WO
WIPO (PCT)
Prior art keywords
image
roadway
value
skeleton
processor
Prior art date
Application number
PCT/US2001/012527
Other languages
English (en)
French (fr)
Inventor
Gideon Stein
Amnon Shashua
Original Assignee
Mobileye, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mobileye, Inc. filed Critical Mobileye, Inc.
Priority to AU2001253619A priority Critical patent/AU2001253619A1/en
Publication of WO2001080068A1 publication Critical patent/WO2001080068A1/en

Classifications

    • G06T3/153
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road

Definitions

  • the invention relates generally to the field of systems and methods for generating an estimate as to the structure of a roadway from a vehicle and more specifically to systems and methods for generating an estimate using an image recorded from the vehicle.
  • Accurate estimation of the structure of a roadway ahead of a vehicle is an important component in autonomous driving and computer vision-based driving assistance.
  • Using computer vision techniques to provide assistance while driving, instead of mechanical sensors, allows the information that is recorded for estimating vehicle movement to also be used for estimating ego-motion, identifying lanes, and the like, without the need for calibration between sensors as would be necessary with mechanical sensors. This reduces cost and maintenance.
  • roads have few feature points, if any.
  • the most obvious features in a road, such as lane markings, are often difficult to detect and have a generally linear structure, whereas background image structures, such as those associated with other vehicles, buildings, trees, and the like, will typically have many feature points. This will make image- or optical-flow-based estimation difficult in practice.
  • typically images that are recorded for roadway structure estimation will contain a large amount of "outlier" information that is either not useful in estimating roadway structure, or that may result in poor estimation.
  • images of objects such as other vehicles will contribute false information for the road structure estimation.
  • conditions that degrade image quality such as raindrops and glare, will also make accurate road structure estimation difficult.
  • the invention provides new and improved systems and methods for generating an estimate of the structure of a roadway using an image recorded from the vehicle.
  • the invention provides a road skeleton estimation system for generating an estimate as to a skeleton of at least a portion of a roadway ahead of a vehicle.
  • the road skeleton estimation system includes an image receiver and a processor.
  • the image receiver is configured to receive image information relating to at least one image recorded ahead of the vehicle.
  • the processor is configured to process the image information received by the image receiver to generate an estimate of the skeleton of at least a portion of the roadway ahead of the vehicle.
  • FIG. 1 schematically depicts a vehicle moving on a roadway and including a roadway skeleton estimation system constructed in accordance with the invention
  • FIG. 2 depicts a graph that schematically depicts a model of a roadway skeleton, useful in understanding one embodiment of the invention
  • FIG. 3 schematically depicts an image of a portion of a roadway, useful in understanding operations performed by the roadway skeleton estimation system
  • FIGS. 4 and 5 depict flow charts depicting operations performed by the roadway skeleton estimation system in estimating the skeleton of the roadway, FIG. 4 depicting operations in connection with a model in which the roadway is modeled as a circular arc and FIG. 5 depicting operations in connection with a model in which the roadway is modeled as a parabolic arc.
  • FIG. 1 schematically depicts a vehicle 10 moving on a roadway 11 and including a roadway skeleton estimation system 12 constructed in accordance with the invention.
  • the vehicle 10 may be any kind of vehicle 10 that may move on the roadway 11, including, but not limited to automobiles, trucks, buses and the like.
  • the roadway skeleton estimation system 12 includes a camera 13 and a processor 14.
  • the camera 13 is mounted on the vehicle 10 and is preferably pointed in a forward direction, that is, in the direction in which the vehicle would normally move, to record successive images as the vehicle moves over the roadway. Preferably as the camera 13 records each image, it will provide the image to the processor 14.
  • the processor 14 will process information that it obtains from the successive images, possibly along with other information, such as information from the vehicle's speedometer (not separately shown) to estimate a roadway skeleton representing a portion of the roadway 11 ahead of the vehicle 10.
  • the processor 14 may also be mounted in or on the vehicle 10 and may form part thereof.
  • the roadway skeleton estimates generated by the processor 14 may be used for a number of things, including, but not limited to, autonomous driving by the vehicle, providing assistance in collision avoidance, and the like. Operations performed by the processor 14 in estimating the roadway skeleton will be described in connection with the flow charts depicted in FIGS. 4 and 5.
  • the roadway is modeled as a circular arc parallel to the XZ plane in three-dimensional space.
  • the X (horizontal) and Y (vertical) axes of three-dimensional space correspond to the "x" and "y" axes of the image plane of the images recorded by the camera 13, and the Z axis is orthogonal to the image plane.
  • the image plane will be the plane of the image after the image has been rectified to provide that the Z axis is parallel to the plane of the roadway 11;
  • the Stein I patent application describes a methodology for rectifying the images to provide that the image plane will have a suitable orientation.
  • the roadway is modeled as a circular arc, and, with reference to FIG. 2, the circular arc in the XZ plane representing the roadway, which will be referred to as the roadway's skeleton, is identified by reference numeral S.
  • Two lane markings 20L and 20R are also shown as dashed lines on opposite sides of the skeleton S.
  • the distances R1 and R2 between the lane markings 20L and 20R and the skeleton S may be the same, or they may differ.
  • the center of the circular arc need not be along the X axis and, indeed, will generally be at an angle thereto.
  • the circular arc is parameterized by three components, namely
  • Z1 and Z2 are illustratively three meters and thirty meters, respectively.
  • the value X1 represents the horizontal distance between the Z axis, specifically the point with coordinates (0, Z1), and the point on skeleton S with coordinates (X1, Z1).
  • the value X2 represents the horizontal distance between the points on line L1 and the circular arc comprising skeleton S at coordinate Z2.
  • R² = a² + b²   (1).
  • the value for the radius R can be determined using equation (1).
  • using equations (4) through (6) and the coordinates (x, y) of each point in the image Ψ, the coordinates (X, Y, Z) of the points in three-dimensional space that are projected onto the image can be determined.
  • the image Ψ can be warped from rectangular (x, y) coordinates to (R, θ) space using equations (2) and (3).
  • the range is determined as follows. As noted above, warping the image of the roadway to (R, θ) space effectively provides a warped image Ψ′ in which the roadway is straight, not curved.
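As an illustrative sketch only (the patent's actual equations (2) and (3) are not reproduced in this excerpt), a warp of this kind can measure each overhead road-plane point's radius and angle about a hypothesized arc center, so that points lying on the true arc map to a constant radius, i.e. a straight line:

```python
import numpy as np

def to_r_theta(X, Z, a, b):
    """Map overhead road-plane points (X, Z) to (R, theta) coordinates
    about a hypothesized arc center (a, b). On the true circular arc,
    R is constant, so the curved skeleton becomes a straight line."""
    R = np.hypot(X - a, Z - b)        # distance of each point from the center
    theta = np.arctan2(Z - b, X - a)  # angle of each point about the center
    return R, theta

# Points sampled from a circular arc of radius 30 centered at (a, b):
a, b = -30.0, 0.0
angles = np.linspace(0.1, 0.5, 5)
X = a + 30.0 * np.cos(angles)
Z = b + 30.0 * np.sin(angles)
R, theta = to_r_theta(X, Z, a, b)
# Every point on the arc maps to the same R: a straight line in (R, theta).
```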
  • An illustrative image is depicted in FIG. 3. As shown in FIG. 3, roadway 30 is divided into two regions, including a near region 31 and a more distant region 32.
  • the vertical coordinates of the picture elements, or "pixels," that are subtended by the roadway in FIG. 3 extend from -120 to -10, and the vertical coordinates of the near region 31 extend from -120 to -20.
  • the width of the image, with the horizontal coordinates extending from -160 to 160 pixels, is 320 pixels
  • Z_max is the maximum value of Z for the roadway
  • the operations proceed in two phases.
  • the processor 14 performs a rough alignment using a straight road model.
  • the areas of the image ⁇ ' are detected that appear to belong to road direction indicators, such as lane markings, tire marks, edges and so forth.
  • the rough alignment generated during the first series of steps is used, along with information in the image Ψ′, including both regions 31 and 32, to determine a higher-order model.
  • only features found in the distant region 32 that are extensions of features found in the near region 31 are utilized, which will ensure that non-roadway features, such as automobiles, that may be found in the distant region 32 will be ignored.
  • the processor 14 initially receives an image Ψ from the camera 13 (step 200).
  • the image Ψ is a projection of points in rectangular three-dimensional coordinates (X, Y, Z).
  • after receiving the image Ψ, the processor 14 performs the first series of steps in connection with the near region 31 (FIG. 3) of the roadway to generate a first-order model of the roadway.
  • for each selected value of parameter X1, the processor 14 will generate a warped image Ψ′ and a value for a cost function for that warped image Ψ′.
  • the value of the cost function will represent a measure of the number of vertical features that are present in the warped image Ψ′. Since the number of vertical features present will be largest for the warped image Ψ′ in which the roadway appears straightest, the processor 14 can select the value of the parameter X1 that was used in generating the warped image Ψ′ associated with the largest cost function value as the appropriate value for parameter X1.
  • after the processor 14 receives the image Ψ, it will select one, the "i-th," of the selected values for parameter X1 (step 201) and, using the selected value and the predetermined values for parameters X2 and R, generate a warped image Ψ′_i in which the image Ψ is warped to the (R, θ) space (step 202). Thereafter, the processor 14 will generate the cost function as follows. The processor 14 initially generates a derivative image dΨ′ along the horizontal ("x") coordinate of the image Ψ′, the derivative representing the rate of change of image brightness, or intensity, as a function of the horizontal coordinate (step 203).
  • the processor 14 applies a non-linear function, such as a binary threshold or sigmoid function, to the derivative image dΨ′, thereby to generate a normalized derivative image n(dΨ′) (step 204).
  • the processor 14, for each column of pixels in the normalized derivative image n(dΨ′) generated in step 204, generates a sum of the pixel values of the pixels in the respective column (step 205), generates, for each column, the square of the sum generated for the column (step 206), and forms a sum of the squares generated in step 206 as the value of the cost function for the value of the parameter X1 and the warped image Ψ′ generated therewith (step 207).
  • the processor 14 determines, for each column, whether the value of the square exceeds a selected threshold (step 208).
  • following step 208, the processor 14 will determine whether it has performed steps 202 through 208 in connection with all of the selected values for parameter X1 (step 209). If the processor 14 makes a negative determination in step 209, it will return to step 201 to select another value for parameter X1 and perform steps 202 through 209 in connection therewith.
  • the processor 14 will perform steps 202 through 209 in connection with each of the selected values for parameter X1 to generate cost values therefor.
  • when the processor 14 determines in step 209 that it has performed steps 202 through 209 in connection with all of the selected values for parameter X1, it will sequence to step 210 to identify, among the cost function values generated in step 207, the maximum cost function value.
  • the processor 14 will determine the value of the parameter X1 and the warped image Ψ′ associated with the maximum cost function value identified in step 210 (step 211). At this point the processor 14 will have completed the first phase, with the warped image Ψ′ comprising the rough alignment.
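The first phase (steps 201 through 211) can be sketched as follows. This is an illustration only: `warp` is a placeholder for the (R, θ) warp of equations (2) through (6), which are not reproduced in this excerpt, and the binary threshold of 0.1 is an arbitrary illustrative choice for the non-linear function of step 204.

```python
import numpy as np

def cost(warped):
    """Steps 203-207: horizontal derivative, binary threshold, per-column
    sums, square of each column sum, then the sum of the squares. Larger
    when the warped image contains strong vertical features, i.e. when
    the roadway has been straightened."""
    d = np.diff(warped.astype(float), axis=1)   # step 203: derivative along x
    n = (np.abs(d) > 0.1).astype(float)         # step 204: non-linear function
    col_sums = n.sum(axis=0)                    # step 205: per-column sums
    return float((col_sums ** 2).sum())         # steps 206-207: sum of squares

def best_x1(image, candidates, warp):
    """Steps 201-211: evaluate the cost for each candidate value of X1 and
    keep the candidate whose warped image maximizes it."""
    costs = [cost(warp(image, x1)) for x1 in candidates]
    return candidates[int(np.argmax(costs))]
```

In this toy setting, a diagonal stripe plays the role of a curved lane marking; the candidate that shears it vertical wins the cost comparison.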
  • after completing the first phase, the processor 14 begins the second phase, in which it performs operations similar to those described above in the first phase, except that
  • the road skeleton S can also be modeled as a parabolic arc.
  • the major axis of the parabolic arc is along the X axis in three-dimensional space, which, as noted above, corresponds to the horizontal, or x, axis of the image ⁇ .
  • the road skeleton will conform to the equation X = aZ² + bZ + c (16)
  • the image Ψ can be warped to an image Ψ′, in which the skeleton is straight, as follows. If (x, y) are the coordinates of a point in image Ψ that is a projection of a point with coordinates (X, Y, Z) in three-dimensional space
  • a point in the image is a projection of a point on the roadway
  • Y_h is the height of the camera off the roadway and "d" is the pitch angle of the camera 13 relative to the horizontal axis Z.
  • a cost function is defined for the view (C,Z) whose value will be a maximum for the correct values of "a,” "b” and “d.”
  • searching for the maximum of the cost function's value is performed in two general phases, with a near region being used in the first phase, and a more extended region being used in the second phase.
  • the value of the cost function is determined as follows.
  • image Ψ is warped to an image Ψ′.
  • the image Ψ is an image of the roadway (and possibly other objects and features) in which the skeleton of the roadway is, in an overhead (X, Z) view, a parabolic arc.
  • image Ψ′ is an image of the roadway (and possibly other objects and features) in which the skeleton of the roadway may, in an overhead (C, Z) view, be a straight line, depending on the values for "a," "b" and "d" that were used in the warping.
  • the warped image is projected onto the warped image's horizontal "x" axis by summing the pixel values in the respective columns.
  • the derivative of the sums across the warped image's horizontal "x” axis is then determined, and a sum of the absolute values of the derivative across the warped image's horizontal "x” axis is generated, with that sum being the value of the cost function.
  • the value of the cost function will be larger if the projection onto the warped image's horizontal axis is sharper, which will occur if there are more vertical features in the warped image Ψ′, and it would be possible to select the warped image Ψ′ for which the cost function value is greatest as the warped image Ψ′ in which the skeleton S of the roadway appears as a straight line. If there were no objects in the image Ψ, and hence the warped image Ψ′, other than the roadway 11, this would be correct. However, if there is clutter, such as other objects with strong edges, in the image Ψ, and hence in the warped image Ψ′, the value of the cost function tends to be dominated by the clutter, which may result in an improper warped image Ψ′ being selected.
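A minimal sketch of this cost function, assuming the warped image is a simple 2-D intensity array: project onto the horizontal axis by summing each column, differentiate the resulting profile along x, and sum the absolute values.

```python
import numpy as np

def projection_cost(warped):
    """Project the warped image onto its horizontal axis by summing each
    pixel column, take the derivative of the 1-D profile along x, and sum
    the absolute values. A straightened roadway concentrates its markings
    into a few columns, giving a sharper profile and a larger cost."""
    profile = warped.sum(axis=0)                  # projection onto the x axis
    return float(np.abs(np.diff(profile)).sum())  # sum of |d(profile)/dx|
```

A vertical stripe (a straightened lane marking) scores much higher than the same stripe drawn diagonally, which is what makes the maximum of this cost a useful straightness criterion.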
  • the two-phase methodology reduces the likelihood of this occurring.
  • the image Ψ is warped to provide warped images Ψ′ using a straight road model, that is, with the value of coefficient "a" set to zero; the values of "b" and "d" may be zero or non-zero.
  • areas that appear to belong to road direction indicators, such as lane markings and tire tracks, as opposed to lines associated with other objects, such as cars, are then identified.
  • a preliminary value for coefficient "b" and a value for pitch "d" are determined.
  • image Ψ is again warped, to generate warped images Ψ″_i, using selected values "a_i" for coefficient "a" along with the preliminary value for coefficient "b" and the value for pitch "d" that were determined during the first phase.
  • the cost function values are generated, and an assessment is made as to the warped image Ψ″_i for which the skeleton S most approximates a straight line.
  • the value "a_i" that was used in generating the warped image Ψ″_i for which the skeleton S most approximates a straight line is selected as the value for coefficient "a" for the model represented by equation (16).
  • the value of pitch "d" identified in the first phase is selected as the pitch "d."
  • the value of coefficient "b" for the model represented by equation (16) is the value of "b" that, with the selected value "a_i" as the coefficient of the quadratic term, makes the parabolic model most closely approximate the straight-line model developed during the first phase using the preliminary value for "b."
  • after the processor 14 receives an image Ψ (step 250), it will, in the first phase, select an initial guess for the values for pitch "d" and coefficient "b" (step 251) and, using those values, warp the image Ψ (step 252), thereby generating a warped image Ψ′.
  • the processor 14 then, for each column of pixels in the warped image Ψ′, sums the pixel values for the pixels in the near region 31 in the column (step 253), generates values representing the derivative in the horizontal direction (step 254), and generates values that correspond to the absolute value of the derivative (step 255).
  • portions of the warped image Ψ′ that contain, for example, edges or lane markings will produce peaks in the absolute values generated in step 255, since the edges and lane markings will correspond to fast transitions from dark-to-light or light-to-dark regions of the images.
  • the processor 14 then thresholds the absolute values generated in step 255 (step 256), that is, for any absolute value that is less than a predetermined threshold value, the processor 14 sets the absolute value to zero. For the absolute values that are above the threshold, the processor 14 then finds local maxima (step 257).
  • the peaks "p_i" typically correspond to lane markings in the warped image Ψ′, although some peaks p_i may be caused by a strong edge that is not part of the roadway, such as an automobile a short distance ahead of the vehicle 10.
  • the set of points (d_j, b_k) in the (d, b) plane that are associated with each peak p_i forms a line l_i.
  • the lines l_i that are associated with features along the roadway, such as lane markings and tire tracks, will intersect at or near a point (d_x, b_x) in the (d, b) plane, and the processor 14 will identify that point and determine the appropriate values for coefficient "b" and pitch "d" as b_x and d_x, respectively.
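The search for that intersection point can be sketched as a Hough-style vote over a discretized (d, b) grid. The slope/intercept form used for each peak's line l_i below is an assumed parameterization for illustration; this excerpt does not reproduce how a peak maps to its line.

```python
import numpy as np

def vote_db(lines, d_vals, b_vals, tol):
    """Accumulate one vote per line per d-column in a (d, b) grid; the
    cell where the roadway lines l_i intersect collects the most votes
    and is returned as the estimate (d_x, b_x)."""
    acc = np.zeros((len(d_vals), len(b_vals)), dtype=int)
    for m, c in lines:                        # assumed form: b = m * d + c
        for i, dv in enumerate(d_vals):
            b_on_line = m * dv + c
            j = int(np.argmin(np.abs(b_vals - b_on_line)))
            if abs(b_vals[j] - b_on_line) <= tol:
                acc[i, j] += 1                # vote for the nearest grid cell
    i, j = np.unravel_index(int(np.argmax(acc)), acc.shape)
    return d_vals[i], b_vals[j]
```

Lines from true road features pile their votes onto one cell, while a clutter line contributes only isolated single votes, so the argmax is robust to a few outlier peaks.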
  • the processor 14 determines a value for coefficient "a" using regions of the image including both the near region 31 and the far region 32, generally emphasizing columns that are associated with or near the subset of peaks p_i that it identified as being associated with features of the roadway.
  • for each selected value "a_i," a corresponding value of "b" is determined for the model X = a_iZ² + bZ + c, with the value of "c" being set to zero (reference equation (16)), over a range of "Z" as determined in the first phase (step 261).
  • the processor 14 selects a value "a_i" (step 262) and warps the original image Ψ to generate a warped image Ψ″_i, using the values "a_i" and "b_x" as coefficients "a" and "b," and the value d_x as the pitch "d" (step 263).
  • the processor 14 then projects a portion of the warped image Ψ″_i, specifically the portion in the far region 32, onto the warped image's horizontal axis by summing the pixel values of the pixels in each column in that region (step 264), and then generates the derivative of the sums along the horizontal axis (step 265).
  • the processor 14 will generate the absolute value of the derivative (step 266) and generate a value corresponding to the sum of the absolute values (step 267).
  • by performing steps 266 and 267 in connection only with columns that were determined in the first phase to be associated with features of the roadway 11, the processor 14 will minimize contributions due to features that are not associated with the roadway, such as automobiles, which might otherwise unduly influence the result.
  • after performing steps 262 through 267 for the value "a_i" that was selected in step 262, the processor 14 will determine whether it has selected all of the values "a_i" that were selected in step 260 (step 268) and, if not, return to step 262 to select another value "a_i" and perform steps 263 through 267 in connection therewith.
  • the processor will perform steps 262 through 267 through a plurality of iterations until it has generated the sum of the absolute value of the derivative for all of the values "a_i" that were selected in step 260.
  • when the processor 14 determines in step 268 that it has selected all of the values "a_i" that were selected in step 260, it will have generated a sum of the absolute value of the derivative for all of the values "a_i," in which case it will sequence to step 269.
  • the processor identifies the value "a_i" for which the sum is the largest (step 269).
  • the correct parameters for the skeleton S are: as the value of coefficient "a," the value "a_i" identified in step 269; as the value of coefficient "b," the value of "b" that was generated for that value "a_i" in step 261; and, as the pitch "d," the value generated in the first phase.
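The second phase (steps 262 through 269) can be sketched as follows. Here `warp` again stands in for the parabolic warp (not reproduced in this excerpt), and `road_cols` is the set of profile-derivative columns that the first phase associated with roadway features; both names are illustrative assumptions, and for simplicity the sketch projects the full warped image rather than only the far region 32.

```python
import numpy as np

def phase2_best_a(image, a_candidates, b_x, d_x, warp, road_cols):
    """Steps 262-269: for each candidate a_i, warp the image with
    (a_i, b_x, d_x), project onto the x axis, take |d/dx| of the profile,
    sum it only over the roadway columns, and return the candidate with
    the largest sum."""
    scores = []
    for a in a_candidates:
        w = warp(image, a, b_x, d_x)                  # step 263: warp with (a_i, b_x, d_x)
        profile = w.sum(axis=0)                       # step 264: project columns onto x
        deriv = np.abs(np.diff(profile))              # steps 265-266: |derivative|
        scores.append(float(deriv[road_cols].sum()))  # step 267: roadway columns only
    return a_candidates[int(np.argmax(scores))]
```

Restricting the sum to `road_cols` is what keeps a strong-edged vehicle in the far region from steering the choice of "a," mirroring the point made above about steps 266 and 267.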
  • the invention provides a number of advantages.
  • the invention provides a system for estimating the skeleton S of a roadway 11 for some distance ahead of a vehicle.
  • Two specific methodologies are described: one using a model in which the roadway is modeled as a circular arc, and the other using a model in which the roadway is modeled as a parabolic arc. Both methodologies require only one image, although it will be appreciated that the system can make use of a combination of these methodologies and/or other methodologies. Determining the skeleton of a roadway for some distance ahead of a vehicle can be useful in connection with autonomous or assisted driving of the vehicle.
  • a system in accordance with the invention can be constructed in whole or in part from special purpose hardware or a general purpose computer system, or any combination thereof, any portion of which may be controlled by a suitable program.
  • Any program may in whole or in part comprise part of or be stored on the system in a conventional manner, or it may in whole or in part be provided to the system over a network or other mechanism for transferring information in a conventional manner.
  • the system may be operated and/or otherwise controlled by means of information provided by an operator using operator input elements (not shown) which may be connected directly to the system or which may transfer the information to the system over a network or other mechanism for transferring information in a conventional manner.
PCT/US2001/012527 2000-04-14 2001-04-14 Generating a model of the path of a roadway from an image recorded by a camera WO2001080068A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2001253619A AU2001253619A1 (en) 2000-04-14 2001-04-14 Generating a model of the path of a roadway from an image recorded by a camera

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US19739300P 2000-04-14 2000-04-14
US60/197,393 2000-04-14

Publications (1)

Publication Number Publication Date
WO2001080068A1 2001-10-25

Family

ID=22729225

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/012527 WO2001080068A1 (en) 2000-04-14 2001-04-14 Generating a model of the path of a roadway from an image recorded by a camera

Country Status (3)

Country Link
US (1) US7151996B2
AU (1) AU2001253619A1
WO (1) WO2001080068A1

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8917169B2 (en) 1993-02-26 2014-12-23 Magna Electronics Inc. Vehicular vision system
US8993951B2 (en) 1996-03-25 2015-03-31 Magna Electronics Inc. Driver assistance system for a vehicle
US9008369B2 (en) 2004-04-15 2015-04-14 Magna Electronics Inc. Vision system for vehicle
WO2015189847A1 (en) * 2014-06-10 2015-12-17 Mobileye Vision Technologies Ltd. Top-down refinement in lane marking navigation
US9436880B2 (en) 1999-08-12 2016-09-06 Magna Electronics Inc. Vehicle vision system
US9555803B2 (en) 2002-05-03 2017-01-31 Magna Electronics Inc. Driver assistance system for vehicle
US10071676B2 (en) 2006-08-11 2018-09-11 Magna Electronics Inc. Vision system for vehicle
US10150473B2 (en) 2014-08-18 2018-12-11 Mobileye Vision Technologies Ltd. Recognition and prediction of lane constraints and construction areas in navigation

Families Citing this family (78)

Publication number Priority date Publication date Assignee Title
US6891563B2 (en) 1996-05-22 2005-05-10 Donnelly Corporation Vehicular vision system
US7697027B2 (en) 2001-07-31 2010-04-13 Donnelly Corporation Vehicular video system
DE10138641A1 (de) * 2001-08-07 2003-02-20 Ibeo Automobile Sensor Gmbh Method for determining a model roadway
DE10253510A1 (de) * 2002-11-16 2004-05-27 Robert Bosch Gmbh Device and method for improving visibility in a motor vehicle
US8032659B2 (en) * 2003-01-21 2011-10-04 Nextio Inc. Method and apparatus for a shared I/O network interface controller
JP4703136B2 (ja) * 2004-06-02 2011-06-15 Toyota Motor Corp Line-drawing processing device
US7881496B2 (en) 2004-09-30 2011-02-01 Donnelly Corporation Vision system for vehicle
US7720580B2 (en) 2004-12-23 2010-05-18 Donnelly Corporation Object detection system for vehicle
US8164628B2 (en) 2006-01-04 2012-04-24 Mobileye Technologies Ltd. Estimating distance to an object using a sequence of images recorded by a monocular camera
US7786898B2 (en) * 2006-05-31 2010-08-31 Mobileye Technologies Ltd. Fusion of far infrared and visible images in enhanced obstacle detection in automotive applications
US20080043099A1 (en) * 2006-08-10 2008-02-21 Mobileye Technologies Ltd. Symmetric filter patterns for enhanced performance of single and concurrent driver assistance applications
EP2383713B1 (en) 2006-12-06 2013-05-29 Mobileye Technologies Limited Detecting and recognizing traffic signs
US8017898B2 (en) 2007-08-17 2011-09-13 Magna Electronics Inc. Vehicular imaging system in an automatic headlamp control system
US7594441B2 (en) * 2007-09-27 2009-09-29 Caterpillar Inc. Automated lost load response system
US9176006B2 (en) 2008-01-15 2015-11-03 Mobileye Vision Technologies Ltd. Detection and classification of light sources using a diffraction grating
US8351684B2 (en) * 2008-02-13 2013-01-08 Caterpillar Inc. Terrain map updating system
US8194927B2 (en) 2008-07-18 2012-06-05 GM Global Technology Operations LLC Road-lane marker detection using light-based sensing technology
US8099213B2 (en) 2008-07-18 2012-01-17 GM Global Technology Operations LLC Road-edge detection
US8204277B2 (en) * 2008-07-18 2012-06-19 GM Global Technology Operations LLC Apparatus and method for camera-based lane marker detection
US8532927B2 (en) * 2008-11-07 2013-09-10 Intellectual Ventures Fund 83 Llc Generating photogenic routes from starting to destination locations
US20100157061A1 (en) * 2008-12-24 2010-06-24 Igor Katsman Device and method for handheld device based vehicle monitoring and driver assistance
US9280711B2 (en) 2010-09-21 2016-03-08 Mobileye Vision Technologies Ltd. Barrier and guardrail detection using a single camera
US10776635B2 (en) 2010-09-21 2020-09-15 Mobileye Vision Technologies Ltd. Monocular cued detection of three-dimensional structures from depth images
US9959595B2 (en) 2010-09-21 2018-05-01 Mobileye Vision Technologies Ltd. Dense structure from motion
WO2012075250A1 (en) 2010-12-01 2012-06-07 Magna Electronics Inc. System and method of establishing a multi-camera image using pixel remapping
GB201101237D0 (en) * 2011-01-25 2011-03-09 Trw Ltd Method of processing images and apparatus
WO2012145819A1 (en) 2011-04-25 2012-11-01 Magna International Inc. Image processing method for detecting objects using relative motion
US9834153B2 (en) 2011-04-25 2017-12-05 Magna Electronics Inc. Method and system for dynamically calibrating vehicular cameras
WO2012145818A1 (en) 2011-04-25 2012-11-01 Magna International Inc. Method and system for dynamically calibrating vehicular cameras
US10793067B2 (en) 2011-07-26 2020-10-06 Magna Electronics Inc. Imaging system for vehicle
WO2013019707A1 (en) 2011-08-01 2013-02-07 Magna Electronics Inc. Vehicle camera alignment system
US20140218535A1 (en) 2011-09-21 2014-08-07 Magna Electronics Inc. Vehicle vision system using image data transmission and power supply via a coaxial cable
EP2574958B1 (en) 2011-09-28 2017-02-22 Honda Research Institute Europe GmbH Road-terrain detection method and system for driver assistance systems
US9491451B2 (en) 2011-11-15 2016-11-08 Magna Electronics Inc. Calibration system and method for vehicular surround vision system
US10071687B2 (en) 2011-11-28 2018-09-11 Magna Electronics Inc. Vision system for vehicle
WO2013086249A2 (en) 2011-12-09 2013-06-13 Magna Electronics, Inc. Vehicle vision system with customized display
WO2013126715A2 (en) 2012-02-22 2013-08-29 Magna Electronics, Inc. Vehicle camera system with image manipulation
US10457209B2 (en) 2012-02-22 2019-10-29 Magna Electronics Inc. Vehicle vision system with multi-paned view
US9319637B2 (en) 2012-03-27 2016-04-19 Magna Electronics Inc. Vehicle vision system with lens pollution detection
US9707892B2 (en) 2012-04-25 2017-07-18 Gentex Corporation Multi-focus optical system
US9550455B2 (en) 2012-04-25 2017-01-24 Gentex Corporation Multi-focus optical system
US9482296B2 (en) 2012-06-05 2016-11-01 Apple Inc. Rendering road signs during navigation
US8965696B2 (en) 2012-06-05 2015-02-24 Apple Inc. Providing navigation instructions while operating navigation application in background
US10156455B2 (en) 2012-06-05 2018-12-18 Apple Inc. Context-aware voice guidance
US8983778B2 (en) * 2012-06-05 2015-03-17 Apple Inc. Generation of intersection information by a mapping service
US10176633B2 (en) 2012-06-05 2019-01-08 Apple Inc. Integrated mapping and navigation application
US9418672B2 (en) 2012-06-05 2016-08-16 Apple Inc. Navigation application with adaptive instruction text
US9997069B2 (en) 2012-06-05 2018-06-12 Apple Inc. Context-aware voice guidance
US9886794B2 (en) 2012-06-05 2018-02-06 Apple Inc. Problem reporting in maps
US9723272B2 (en) 2012-10-05 2017-08-01 Magna Electronics Inc. Multi-camera image stitching calibration system
US9707896B2 (en) 2012-10-15 2017-07-18 Magna Electronics Inc. Vehicle camera lens dirt protection via air flow
US9445057B2 (en) 2013-02-20 2016-09-13 Magna Electronics Inc. Vehicle vision system with dirt detection
US10179543B2 (en) 2013-02-27 2019-01-15 Magna Electronics Inc. Multi-camera dynamic top view vision system
US9688200B2 (en) 2013-03-04 2017-06-27 Magna Electronics Inc. Calibration system and method for multi-camera vision system
US9863928B1 (en) 2013-03-20 2018-01-09 United Parcel Service Of America, Inc. Road condition detection system
US9508014B2 (en) 2013-05-06 2016-11-29 Magna Electronics Inc. Vehicular multi-camera vision system
US9563951B2 (en) 2013-05-21 2017-02-07 Magna Electronics Inc. Vehicle vision system with targetless camera calibration
US9205776B2 (en) 2013-05-21 2015-12-08 Magna Electronics Inc. Vehicle vision system using kinematic model of vehicle motion
US9747507B2 (en) * 2013-12-19 2017-08-29 Texas Instruments Incorporated Ground plane detection
EP2899669A1 (en) 2014-01-22 2015-07-29 Honda Research Institute Europe GmbH Lane relative position estimation method and system for driver assistance systems
US9487235B2 (en) 2014-04-10 2016-11-08 Magna Electronics Inc. Vehicle control system with adaptive wheel angle correction
JP6449627B2 (ja) * 2014-11-25 2019-01-09 株式会社Soken Travel lane marking recognition device
US9916660B2 (en) 2015-01-16 2018-03-13 Magna Electronics Inc. Vehicle vision system with calibration algorithm
US9819841B1 (en) * 2015-04-17 2017-11-14 Altera Corporation Integrated circuits with optical flow computation circuitry
US10946799B2 (en) 2015-04-21 2021-03-16 Magna Electronics Inc. Vehicle vision system with overlay calibration
US9483700B1 (en) 2015-05-13 2016-11-01 Honda Motor Co., Ltd. System and method for lane vehicle localization with lane marking detection and likelihood scoring
US9566986B1 (en) 2015-09-25 2017-02-14 International Business Machines Corporation Controlling driving modes of self-driving vehicles
US11228700B2 (en) 2015-10-07 2022-01-18 Magna Electronics Inc. Vehicle vision system camera with adaptive field of view
US10187590B2 (en) 2015-10-27 2019-01-22 Magna Electronics Inc. Multi-camera vehicle vision system with image gap fill
US11277558B2 (en) 2016-02-01 2022-03-15 Magna Electronics Inc. Vehicle vision system with master-slave camera configuration
US11433809B2 (en) 2016-02-02 2022-09-06 Magna Electronics Inc. Vehicle vision system with smart camera video output
US9969389B2 (en) * 2016-05-03 2018-05-15 Ford Global Technologies, Llc Enhanced vehicle operation
US10300859B2 (en) 2016-06-10 2019-05-28 Magna Electronics Inc. Multi-sensor interior mirror device with image adjustment
US10750119B2 (en) 2016-10-17 2020-08-18 Magna Electronics Inc. Vehicle camera LVDS repeater
EP3532801B1 (en) 2016-10-31 2020-12-16 Mobileye Vision Technologies Ltd. Systems and methods for navigating lane merges and lane splits
US10452076B2 (en) 2017-01-04 2019-10-22 Magna Electronics Inc. Vehicle vision system with adjustable computation and data compression
KR102296520B1 (ko) * 2020-12-18 2021-09-01 주식회사 카비 Method for detecting curved lanes by path estimation using a monocular camera
FR3120690B1 (fr) * 2021-03-15 2023-02-10 Psa Automobiles Sa Method and device for determining the reliability of a low-definition map

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5473364A (en) * 1994-06-03 1995-12-05 David Sarnoff Research Center, Inc. Video technique for indicating moving objects from a movable platform

Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4910786A (en) * 1985-09-30 1990-03-20 Eichel Paul H Method of detecting intensity edge paths
US4819169A (en) * 1986-09-24 1989-04-04 Nissan Motor Company, Limited System and method for calculating movement direction and position of an unmanned vehicle
FR2665597B1 (fr) * 1990-07-31 1995-11-17 Thomson Trt Defense Method and device for real-time localization of rectilinear contours in a digitized image, in particular for shape recognition in scene-analysis processing
US5189710A (en) * 1990-09-17 1993-02-23 Teknekron Communications Systems, Inc. Method and an apparatus for generating a video binary signal for a video image having a matrix of pixels
US5245422A (en) * 1991-06-28 1993-09-14 Zexel Corporation System and method for automatically steering a vehicle within a lane in a road
WO1993019441A1 (en) 1992-03-20 1993-09-30 Commonwealth Scientific And Industrial Research Organisation An object monitoring system
US5515448A (en) 1992-07-28 1996-05-07 Yazaki Corporation Distance measuring apparatus of a target tracking type
US5521633A (en) 1992-09-25 1996-05-28 Yazaki Corporation Motor vehicle obstacle monitoring system using optical flow processing
JP2863381B2 (ja) 1992-09-25 1999-03-03 矢崎総業株式会社 Vehicle monitoring method
IL107603A (en) * 1992-12-21 1997-01-10 Johnson & Johnson Vision Prod Ophthalmic lens inspection method and apparatus
US5446549A (en) * 1993-01-14 1995-08-29 The United States Of America As Represented By The Secretary Of The Navy Method and apparatus for noncontact surface contour measurement
US5529138A (en) 1993-01-22 1996-06-25 Shaw; David C. H. Vehicle collision avoidance system
US5809322A (en) * 1993-12-12 1998-09-15 Associative Computing Ltd. Apparatus and method for signal processing
JP3287117B2 (ja) 1994-07-05 2002-05-27 株式会社日立製作所 Environment recognition device for vehicles using an imaging device
DE59509929D1 (de) 1994-07-06 2002-01-24 Volkswagen Ag Method for determining the visibility range, in particular for the movement of a motor vehicle
US5642093A (en) 1995-01-27 1997-06-24 Fuji Jukogyo Kabushiki Kaisha Warning system for vehicle
KR960032262A (ko) 1995-02-09 1996-09-17 배순훈 Vehicle driving safety system
WO1996034363A1 (fr) * 1995-04-26 1996-10-31 Hitachi, Ltd. Image processor for vehicles
KR100423379B1 (ko) * 1995-05-12 2004-07-23 소니 가부시끼 가이샤 Key signal generation device and image synthesis device, and key signal generation method and image synthesis method
US7110880B2 (en) * 1997-10-22 2006-09-19 Intelligent Technologies International, Inc. Communication method and arrangement
US7085637B2 (en) * 1997-10-22 2006-08-01 Intelligent Technologies International, Inc. Method and system for controlling a vehicle
US6526352B1 (en) * 2001-07-19 2003-02-25 Intelligent Technologies International, Inc. Method and arrangement for mapping a road
FR2735598B1 (fr) * 1995-06-16 1997-07-11 Alsthom Cge Alcatel Method of contour extraction using a mixed active-contour and seed/guidance approach
JP3574235B2 (ja) 1995-08-31 2004-10-06 本田技研工業株式会社 Vehicle steering force correction device
US6047080A (en) * 1996-06-19 2000-04-04 Arch Development Corporation Method and apparatus for three-dimensional reconstruction of coronary vessels from angiographic images
US6097839A (en) * 1997-03-10 2000-08-01 Intermec Ip Corporation Method and apparatus for automatic discriminating and locating patterns such as finder patterns, or portions thereof, in machine-readable symbols
US6313840B1 (en) * 1997-04-18 2001-11-06 Adobe Systems Incorporated Smooth shading of objects on display devices
US6130706A (en) * 1998-03-25 2000-10-10 Lucent Technologies Inc. Process for determining vehicle dynamics
US6424430B1 (en) * 1998-04-06 2002-07-23 Adobe Systems Incorporated Rendering of objects on graphical rendering devices as clipped images
JPH11353565A (ja) 1998-06-09 1999-12-24 Yazaki Corp 車両用衝突警報方法及び装置
DE19842176A1 (de) * 1998-09-15 2000-03-16 Bosch Gmbh Robert Verfahren und Vorrichtung zur Verkehrszeichenerkennung und Navigation
US6161071A (en) * 1999-03-12 2000-12-12 Navigation Technologies Corporation Method and system for an in-vehicle computing architecture
JP3853542B2 (ja) 1999-07-26 2006-12-06 パイオニア株式会社 Image processing device, image processing method, and navigation device
CN1307067C (zh) * 2000-06-20 2007-03-28 株式会社日立制作所 Vehicle travel control device
JP2002367059A (ja) * 2001-06-13 2002-12-20 Mbr:Kk Transfer-type security system
JP3922173B2 (ja) * 2002-12-18 2007-05-30 トヨタ自動車株式会社 Driving assistance system and apparatus

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
ESTABLE S. ET AL.: "A real-time traffic sign recognition system", PROCEEDINGS OF THE INTELLIGENT VEHICLES '94 SYMPOSIUM, 1994, pages 213 - 218, XP002944994 *
FROHN H. ET AL.: "VISOCAR: An autonomous industrial transport vehicle guided by visual navigation", 1989 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, May 1989 (1989-05-01), pages 1155 - 1159, XP002944993 *
GEHRIG S.K. ET AL.: "A trajectory-based approach for the lateral control of car following systems", 1998 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS, vol. 4, 1998, pages 3596 - 3601, XP002944992 *
GILLNER W.J.: "Motion based vehicle detection on motorways", PROCEEDINGS OF THE INTELLIGENT VEHICLES '95 SYMPOSIUM, 1995, pages 483 - 487, XP002944996 *
GREGOR R. ET AL.: "EMS-vision: A perceptual system for autonomous vehicles", PROCEEDINGS OF THE IEEE INTELLIGENT VEHICLES SYMPOSIUM, 2000, pages 52 - 57, XP002944989 *
JANSSEN R. ET AL.: "Hybrid approach for traffic sign recognition", INTELLIGENT VEHICLES '93 SYMPOSIUM, 1993, pages 390 - 395, XP002944995 *
MAGANTO A.L. ET AL.: "A monocular vision system for autonomous vehicle guidance", INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, 2000, pages 236 - 239, XP002944990 *
STEIN G.P. ET AL.: "A robust method for computing vehicle ego-motion", PROCEEDINGS OF THE IEEE INTELLIGENT VEHICLES SYMPOSIUM, 2000, pages 362 - 368, XP002944991 *

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8917169B2 (en) 1993-02-26 2014-12-23 Magna Electronics Inc. Vehicular vision system
US8993951B2 (en) 1996-03-25 2015-03-31 Magna Electronics Inc. Driver assistance system for a vehicle
US9436880B2 (en) 1999-08-12 2016-09-06 Magna Electronics Inc. Vehicle vision system
US9834216B2 (en) 2002-05-03 2017-12-05 Magna Electronics Inc. Vehicular control system using cameras and radar sensor
US11203340B2 (en) 2002-05-03 2021-12-21 Magna Electronics Inc. Vehicular vision system using side-viewing camera
US10683008B2 (en) 2002-05-03 2020-06-16 Magna Electronics Inc. Vehicular driving assist system using forward-viewing camera
US10351135B2 (en) 2002-05-03 2019-07-16 Magna Electronics Inc. Vehicular control system using cameras and radar sensor
US10118618B2 (en) 2002-05-03 2018-11-06 Magna Electronics Inc. Vehicular control system using cameras and radar sensor
US9555803B2 (en) 2002-05-03 2017-01-31 Magna Electronics Inc. Driver assistance system for vehicle
US9643605B2 (en) 2002-05-03 2017-05-09 Magna Electronics Inc. Vision system for vehicle
US9609289B2 (en) 2004-04-15 2017-03-28 Magna Electronics Inc. Vision system for vehicle
US11503253B2 (en) 2004-04-15 2022-11-15 Magna Electronics Inc. Vehicular control system with traffic lane detection
US9948904B2 (en) 2004-04-15 2018-04-17 Magna Electronics Inc. Vision system for vehicle
US10015452B1 (en) 2004-04-15 2018-07-03 Magna Electronics Inc. Vehicular control system
US11847836B2 (en) 2004-04-15 2023-12-19 Magna Electronics Inc. Vehicular control system with road curvature determination
US10110860B1 (en) 2004-04-15 2018-10-23 Magna Electronics Inc. Vehicular control system
US9428192B2 (en) 2004-04-15 2016-08-30 Magna Electronics Inc. Vision system for vehicle
US10735695B2 (en) 2004-04-15 2020-08-04 Magna Electronics Inc. Vehicular control system with traffic lane detection
US10187615B1 (en) 2004-04-15 2019-01-22 Magna Electronics Inc. Vehicular control system
US10306190B1 (en) 2004-04-15 2019-05-28 Magna Electronics Inc. Vehicular control system
US9736435B2 (en) 2004-04-15 2017-08-15 Magna Electronics Inc. Vision system for vehicle
US9008369B2 (en) 2004-04-15 2015-04-14 Magna Electronics Inc. Vision system for vehicle
US10462426B2 (en) 2004-04-15 2019-10-29 Magna Electronics Inc. Vehicular control system
US9191634B2 (en) 2004-04-15 2015-11-17 Magna Electronics Inc. Vision system for vehicle
US11396257B2 (en) 2006-08-11 2022-07-26 Magna Electronics Inc. Vehicular forward viewing image capture system
US10787116B2 (en) 2006-08-11 2020-09-29 Magna Electronics Inc. Adaptive forward lighting system for vehicle comprising a control that adjusts the headlamp beam in response to processing of image data captured by a camera
US11148583B2 (en) 2006-08-11 2021-10-19 Magna Electronics Inc. Vehicular forward viewing image capture system
US11623559B2 (en) 2006-08-11 2023-04-11 Magna Electronics Inc. Vehicular forward viewing image capture system
US10071676B2 (en) 2006-08-11 2018-09-11 Magna Electronics Inc. Vision system for vehicle
US11951900B2 (en) 2006-08-11 2024-04-09 Magna Electronics Inc. Vehicular forward viewing image capture system
WO2015189847A1 (en) * 2014-06-10 2015-12-17 Mobileye Vision Technologies Ltd. Top-down refinement in lane marking navigation
US10317231B2 (en) 2014-06-10 2019-06-11 Mobileye Vision Technologies Ltd. Top-down refinement in lane marking navigation
US10926763B2 (en) 2014-08-18 2021-02-23 Mobileye Vision Technologies Ltd. Recognition and prediction of lane constraints and construction areas in navigation
US10150473B2 (en) 2014-08-18 2018-12-11 Mobileye Vision Technologies Ltd. Recognition and prediction of lane constraints and construction areas in navigation
US11834040B2 (en) 2014-08-18 2023-12-05 Mobileye Vision Technologies Ltd. Recognition and prediction of lane constraints and construction areas in navigation

Also Published As

Publication number Publication date
AU2001253619A1 (en) 2001-10-30
US20030040864A1 (en) 2003-02-27
US7151996B2 (en) 2006-12-19

Similar Documents

Publication Publication Date Title
US7151996B2 (en) System and method for generating a model of the path of a roadway from an image recorded by a camera
Yu et al. Lane boundary detection using a multiresolution hough transform
US8180100B2 (en) Plane detector and detecting method
US8102427B2 (en) Camera egomotion estimation from an infra-red image sequence for night vision
US7612800B2 (en) Image processing apparatus and method
US8331653B2 (en) Object detector
JP3868876B2 (ja) 障害物検出装置及び方法
Yamaguchi et al. Vehicle ego-motion estimation and moving object detection using a monocular camera
US7889887B2 (en) Lane recognition apparatus
US6704621B1 (en) System and method for estimating ego-motion of a moving vehicle using successive images recorded along the vehicle's path of motion
EP3735675A1 (en) Image annotation
EP3193306B1 (en) A method and a device for estimating an orientation of a camera relative to a road surface
US8259998B2 (en) Image processing device for vehicle
US8885049B2 (en) Method and device for determining calibration parameters of a camera
JP3367170B2 (ja) 障害物検出装置
Broggi et al. Self-calibration of a stereo vision system for automotive applications
Yamaguchi et al. Road region estimation using a sequence of monocular images
Schwarzinger et al. Vision-based car-following: detection, tracking, and identification
Baehring et al. Detection of close cut-in and overtaking vehicles for driver assistance based on planar parallax
CA2392578A1 (en) System and method for detecting obstacles to vehicle motion
Takahashi et al. A robust lane detection using real-time voting processor
US10982967B2 (en) Method and device for fast detection of repetitive structures in the image of a road scene
JPH1055446A (ja) 物体認識装置
JP7134780B2 (ja) ステレオカメラ装置
Image processing technology for rear view camera (1): Development of lane detection system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: The EPO has been informed by WIPO that EP was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: PCT application non-entry in the European phase
NENP Non-entry into the national phase

Ref country code: JP