EP0434455A2 - Method of determining the configuration of a path for motor vehicles
- Publication number: EP0434455A2
- Application number: EP19900314123
- Authority
- EP
- European Patent Office
- Prior art keywords
- straight line
- coordinate
- line
- point
- straight
- Legal status: Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
- G06V10/457—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by analysing connectivity, e.g. edge linking, connected component analysis or slices
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
Description
- The present invention relates to a method of determining or recognizing the configuration of a path for motor vehicles through image processing, and more particularly to a method of recognizing boundaries of a path irrespective of the configuration of the path.
- Recognition of the configuration of a path for motor vehicles requires that boundaries of the path at an end thereof be determined. To determine the boundaries of a path at an end thereof, it is necessary to process image data of the path at the end thereof, which are produced by a television camera or the like, and to extract line segments from the processed image data at the end of the path.
- One image processing method employs the Hough transformation as disclosed in Japanese Laid-Open Patent Publications Nos. 62(1987)-24310 and 62(1987)-7096. According to the disclosed process, feature points on an image produced by the image data are subjected to the Hough transformation, thereby producing a group of straight lines corresponding to the distribution of the feature points.
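- For illustration, the Hough transformation step can be sketched in Python as follows. This is a minimal sketch, not the procedure of the cited publications: the function name, array sizing and quantization are assumptions.

```python
import numpy as np

def hough_accumulate(feature_points, width, height, theta_steps=180):
    """Vote each (x, y) feature point into (rho, theta) space; peaks of the
    accumulator correspond to lines rho = x*cos(theta) + y*sin(theta)
    passing through many feature points."""
    thetas = np.linspace(0.0, np.pi, theta_steps, endpoint=False)
    n = int(np.ceil(np.hypot(width, height)))        # largest possible |rho|
    accumulator = np.zeros((2 * n + 1, theta_steps), dtype=np.int32)
    for x, y in feature_points:
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        accumulator[np.round(rhos).astype(int) + n, np.arange(theta_steps)] += 1
    return accumulator, thetas
```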
- If the path to be recognized is straight, then the above conventional method based on the image processing can directly utilize the detected straight lines produced by way of the Hough transformation. If the path is curved or branched, however, the detected straight lines produced by way of the Hough transformation cannot be utilized as they are. In such a case, it has been customary to divide an image produced by a television camera into a plurality of small processing areas, and repeatedly detect line segments in the processing areas successively from a closest small area in the image, so that information accurately representing the boundaries of an actual path can be obtained.
- With the conventional method, therefore, sequential steps are required to determine the configuration of a path, and the overall process is time-consuming. Accordingly, it has been difficult to apply the conventional method to the control of running of an automobile along paths, since the automobile running control or guidance process requires quick data processing.
- It is a major object of the present invention to provide a method of determining the configuration of a path for motor vehicles, the method being capable of easily and quickly determining boundaries of the path.
- According to the present invention, there is provided a method of determining the configuration of a path for motor vehicles, comprising the steps of producing original image data of the path, determining feature points contained in the original image data, determining a group of straight lines approximating the array of the feature points, extracting straight lines, effective to determine boundaries of the path, from the group of straight lines, dividing the extracted straight lines into a plurality of line segments by points of intersection between the straight lines, and checking the line segments against the feature points of the original image data to determine whether the line segments correspond to the boundaries of the path.
- The step of determining feature points preferably comprises the step of producing edge data by differentiating the original image data.
- The step of determining a group of straight lines comprises the step of effecting the Hough transformation on the edge data.
- The step of extracting straight lines preferably comprises the steps of determining whether the X coordinate of the starting point of one straight line is larger than the X coordinate of the starting point of another straight line, and whether the X coordinate of the ending point of the one straight line is larger than the X coordinate of the ending point of the other straight line, and determining a straight line which interconnects the starting point of the one straight line and the ending point of the other straight line, to be an effective straight line if the X coordinate of the starting point of the one straight line is larger than the X coordinate of the starting point of the other straight line and the X coordinate of the ending point of the one straight line is larger than the X coordinate of the ending point of the other straight line.
- The step of extracting straight lines preferably comprises the steps of determining whether the Y coordinate of the starting point of one straight line is larger than the Y coordinate of the starting point of another straight line, and whether the Y coordinate of the ending point of the one straight line is larger than the Y coordinate of the ending point of the other straight line, and determining a straight line which interconnects the starting end of the one straight line and the ending point of the other straight line, to be an effective straight line if the Y coordinate of the starting point of the one straight line is larger than the Y coordinate of the starting point of the other straight line and the Y coordinate of the ending point of the one straight line is larger than the Y coordinate of the ending point of the other straight line.
- The step of extracting straight lines preferably comprises the steps of determining whether the Y coordinate of the starting point of one straight line is smaller than the Y coordinate of the starting point of another straight line, and whether the Y coordinate of the ending point of the one straight line is smaller than the Y coordinate of the ending point of the other straight line, and determining a straight line which interconnects the ending point of the one straight line and the starting point of the other straight line, to be an effective straight line if the Y coordinate of the starting point of the one straight line is smaller than the Y coordinate of the starting point of the other straight line and the Y coordinate of the ending point of the one straight line is smaller than the Y coordinate of the ending point of the other straight line.
- The step of dividing the extracted straight lines preferably comprises the steps of defining one straight line as a master line segment and another straight line as a slave line segment, determining whether the master and slave line segments are parallel to each other, determining a point of intersection between the master and slave line segments if the master and slave line segments are not parallel to each other, determining whether the point of intersection is positioned between the starting and ending points of the master and slave line segments, and dividing the master and slave line segments by the point of intersection if the point of intersection is positioned between the starting and ending points of the master and slave line segments.
- The step of checking the line segments preferably comprises the steps of scanning an edge image over a predetermined width across a line segment, comparing the intensity of the edge point of each dot which is scanned, with a predetermined threshold, counting edge points whose intensity is higher than the predetermined threshold, calculating the ratio of the number of the counted edge points to the length of the line segment, and determining whether the line segment corresponds to a boundary of the path based on the calculated ratio.
- According to the present invention, there is also provided a method of determining the configuration of a path for motor vehicles, comprising the steps of producing original image data of the path, dividing the original image data into a plurality of areas, determining feature points contained in each of the areas, determining a group of straight lines approximating the array of the feature points in each of the areas, extracting straight lines, effective to determine boundaries of the path, from the group of straight lines in each of the areas, uniting the extracted straight lines in each of the areas into a single combination of image data representing a single straight line, dividing the single straight line into a plurality of line segments, and checking the line segments against the feature points of the original image data to determine whether the line segments correspond to the boundaries of the path.
- The step of determining feature points preferably comprises the step of producing edge data by differentiating the original image data.
- The step of determining a group of straight lines comprises the step of effecting the Hough transformation on the edge data.
- The step of extracting straight lines preferably comprises the steps of determining whether the X coordinate of the starting point of one straight line is larger than the X coordinate of the starting point of another straight line, and whether the X coordinate of the ending point of the one straight line is larger than the X coordinate of the ending point of the other straight line, and determining a straight line which interconnects the starting point of the one straight line and the ending point of the other straight line, to be an effective straight line if the X coordinate of the starting point of the one straight line is larger than the X coordinate of the starting point of the other straight line and the X coordinate of the ending point of the one straight line is larger than the X coordinate of the ending point of the other straight line.
- The step of extracting straight lines preferably comprises the steps of determining whether the Y coordinate of the starting point of one straight line is larger than the Y coordinate of the starting point of another straight line, and whether the Y coordinate of the ending point of the one straight line is larger than the Y coordinate of the ending point of the other straight line, and determining a straight line which interconnects the starting end of the one straight line and the ending point of the other straight line, to be an effective straight line if the Y coordinate of the starting point of the one straight line is larger than the Y coordinate of the starting point of the other straight line and the Y coordinate of the ending point of the one straight line is larger than the Y coordinate of the ending point of the other straight line.
- The step of extracting straight lines preferably comprises the steps of determining whether the Y coordinate of the starting point of one straight line is smaller than the Y coordinate of the starting point of another straight line, and whether the Y coordinate of the ending point of the one straight line is smaller than the Y coordinate of the ending point of the other straight line, and determining a straight line which interconnects the ending point of the one straight line and the starting point of the other straight line, to be an effective straight line if the Y coordinate of the starting point of the one straight line is smaller than the Y coordinate of the starting point of the other straight line and the Y coordinate of the ending point of the one straight line is smaller than the Y coordinate of the ending point of the other straight line.
- The step of dividing the single straight line preferably comprises the steps of defining one straight line as a master line segment and another straight line as a slave line segment, determining whether the master and slave line segments are parallel to each other, determining a point of intersection between the master and slave line segments if the master and slave line segments are not parallel to each other, determining whether the point of intersection is positioned between the starting and ending points of the master and slave line segments, and dividing the master and slave line segments by the point of intersection if the point of intersection is positioned between the starting and ending points of the master and slave line segments.
- The step of checking the line segments preferably comprises the steps of scanning an edge image over a predetermined width across a line segment, comparing the intensity of the edge point of each dot which is scanned, with a predetermined threshold, counting edge points whose intensity is higher than the predetermined threshold, calculating the ratio of the number of the counted edge points to the length of the line segment, and determining whether the line segment corresponds to a boundary of the path based on the calculated ratio.
- The above and other objects, features and advantages of the present invention will become more apparent from the following description when taken in conjunction with the accompanying drawings in which a preferred embodiment of the present invention is shown by way of illustrative example.
- FIGS. 1(a) through 1(g) are diagrams illustrative of the basic concept of a method according to the present invention;
- FIG. 2 is a graph showing the expression of a representative line segment used in the method of the present invention;
- FIG. 3 is a flowchart of a procedure of uniting representative line segments together;
- FIG. 4 is a graph illustrative of a threshold ϑTH used in the flowchart shown in FIG. 3;
- FIG. 5 is a flowchart of a uniting process A;
- FIGS. 6(a) through 6(c) are graphs showing the states of line segments in the uniting process A;
- FIG. 7 is a flowchart of a uniting process B;
- FIGS. 8(a) through 8(c) are graphs showing the states of line segments in the uniting process B;
- FIGS. 9(a) and 9(b) are flowcharts of a uniting process C;
- FIGS. 10(a) through 10(f) are graphs showing the states of line segments in the uniting process C;
- FIG. 11 is a flowchart of a process of dividing representative line segments;
- FIGS. 12(a) and 12(b) are graphs showing the states of line segments in the dividing process;
- FIGS. 13(a) through 13(d) are graphs illustrative of the separation of divided line segments;
- FIGS. 14(a) and 14(b) are graphs illustrative of checking divided line segments against edge data;
- FIG. 15 is a flowchart of a process of determining the degree of agreement between the divided line segments and the edge data and also determining the lengths of line segments;
- FIGS. 16(a) through 16(d) are graphs showing the classification of the states of the divided line segments at the time they are checked against the edge data;
- FIGS. 17(a) and 17(b) are graphs showing line segments excluded from data selected to draw boundaries in the flowchart shown in FIG. 15; and
- FIGS. 18(a) through 18(d) are graphs illustrative of the manner in which the divided line segments are joined.
- FIGS. 1(a) through 1(g) show the preferred basic concept of the present invention as it is applied to a method of determining or recognizing the configuration of a path for motor vehicles.
- It is assumed that a scene as shown in FIG. 1(a) is imaged by a television camera installed in a motor vehicle running along a path 1.
- As shown in FIG. 1(a), the path 1 has boundaries 2a, 2b adjoining a sidewalk 4, boundaries 3a, 3b adjoining a sidewalk 5, and a boundary 6 adjoining a sidewalk 7 located beyond the boundary 6.
- The scene also includes a horizon 8 with a system 9 of mountains located therebeyond.
- The imaged scene is referred to as an original image, and the original image is divided into two images by a central vertical line. Each of the two divided images is converted into edge data produced by differentiation in each area therein. Using the edge data, the original image is expressed by dots (not shown).
- In each area, the edge data are subjected to the Hough transformation, thereby producing a group of straight lines approximating the array or distribution of feature points.
- The straight lines in each area are then statistically classified into groups by clustering, and representative straight lines in the respective groups are available as straight lines effective to determine the boundaries of the path.
- The original image is divided into two images because the straight lines can be detected highly accurately by way of the Hough transformation.
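- The text does not specify the clustering procedure. As one hedged possibility, a simple greedy grouping of the detected (ρ, ϑ) pairs could yield the representative straight lines; the thresholds and the use of group means below are assumptions.

```python
def cluster_lines(lines, d_rho=10.0, d_theta=0.1):
    """Greedily group close (rho, theta) pairs and return one representative
    line (the mean of each group). Thresholds are illustrative only."""
    groups = []
    for rho, theta in lines:
        for group in groups:
            g_rho, g_theta = group[0]          # compare against the group seed
            if abs(rho - g_rho) < d_rho and abs(theta - g_theta) < d_theta:
                group.append((rho, theta))
                break
        else:
            groups.append([(rho, theta)])
    return [(sum(r for r, _ in g) / len(g), sum(t for _, t in g) / len(g))
            for g in groups]
```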
- The representative straight lines are shown in FIGS. 1(b) and 1(c).
- In FIG. 1(b), the representative straight lines, denoted at L1, L2, L3, correspond to the boundaries 2a, 2b, 6, respectively.
- In FIG. 1(c), the representative straight lines, denoted at L4, L5, L6, correspond to the boundaries 3a, 3b, 6, respectively.
- The straight lines L1 through L6 are then united together, as described later on, into a single combination of image data as shown in FIG. 1(d). Specifically, the straight lines L2, L5 are united into a straight line L7, and the straight lines L3, L6 are united into a straight line L8.
- Then, points of intersection between the straight lines are determined, as described later on, and the straight lines are divided into line segments at the points of intersection. More specifically, as shown in FIG. 1(e), the straight line L1 is divided into line segments ℓ1a, ℓ1b, ℓ1c, the straight line L4 into line segments ℓ4a, ℓ4b, ℓ4c, the straight line L7 into line segments ℓ7a, ℓ7b, ℓ7c, and the straight line L8 into line segments ℓ8a, ℓ8b, ℓ8c.
- The line segments are checked against edge data from the original image, as described later on, to measure the degree of agreement between the line segments and the edge data. The result of the measurement is shown in FIG. 1(f).
- In FIG. 1(f), the line segments ℓ1b, ℓ1c, ℓ4b, ℓ4c, ℓ7b shown in FIG. 1(e) are removed because the degree of agreement between these line segments and the edge data is low.
- However, the line segments ℓ1a, ℓ7a corresponding to the boundaries 2a, 2b (FIG. 1(a)), the line segments ℓ4a, ℓ7c corresponding to the boundaries 3a, 3b, and the line segments ℓ8a, ℓ8b, ℓ8c corresponding to the boundary 6 remain unremoved because the degree of agreement between these line segments and the edge data is high.
- The manner in which the remaining line segments are joined is then determined, as described later on, thus producing information on the end of the path which is accurately representative of the actual path boundaries, as shown in FIG. 1(g).
- The main arithmetic operations employed in the above basic concept will now be described below.
- The two divided images shown in FIGS. 1(b) and 1(c) are united into the single combination of image data shown in FIG. 1(d) as follows: First, it is determined whether the representative line segments in the lefthand area (FIG. 1(b)) and the representative line segments in the righthand area (FIG. 1(c)), both obtained by clustering, can be united or not. If these representative line segments can be united, then they are united. As shown in FIG. 2, each of these representative line segments is expressed by the length ρ of a line extending from the origin of an X-Y coordinate system or Hough-transformed coordinate system perpendicularly to the representative line segment, and an angle ϑ between the X-axis and the line whose length is ρ.
- The possibility of uniting the representative line segments together is determined by checking if the lengths ρ and the angles ϑ in the respective areas are of close values or not. If they are of close values, then the line segments can be united together, and are united depending on the gradients of the line segments. Whether a representative line segment (ρL, ϑL) in the left area and a representative line segment (ρR, ϑR) in the right area can be united together is determined according to the flowchart shown in FIG. 3.
- If the difference between the lengths ρL, ρR of the representative line segments in the lefthand and righthand areas falls within Δρ, then it is determined that the representative line segments can be united together as to the lengths ρL, ρR. Thus, it is determined whether the lengths ρL, ρR of the representative line segments satisfy the following relationships in a step 301: ρL - Δρ < ρR and ρR < ρL + Δρ.
- If the above inequalities are satisfied, then it is determined whether the angles ϑL, ϑR with respect to the representative line segments satisfy the following relationships in a step 302: ϑL - Δϑ < ϑR and ϑR < ϑL + Δϑ. If the difference between the angles ϑL, ϑR falls within Δϑ, then it is determined that the representative line segments can be united together as to the angles ϑL, ϑR. If the lengths ρL, ρR and the angles ϑL, ϑR of the representative line segments do not satisfy the inequalities in the steps 301, 302, then the process shown in FIG. 3 is finished.
- If the angles ϑL, ϑR with respect to the representative line segments satisfy the inequalities in the step 302, then it is determined whether the angle ϑL of the representative line segment in the lefthand area is close to the right angle or not in a step 304 by comparing the angle ϑL with a predetermined reference or threshold ϑTH and also comparing the angle ϑL with (π - ϑTH) according to the following inequalities: ϑTH < ϑL and ϑL < (π - ϑTH), where ϑTH and (π - ϑTH) are indicated in the X-Y coordinate system shown in FIG. 4.
- If the angle ϑL with respect to the representative line segment in the lefthand area satisfies the above inequalities, then it is determined whether the angle ϑL is smaller than 90° (ϑL < 90°) or not in a step 305. If the angle ϑL is smaller than 90°, then a uniting process A is carried out in a step 306. If the angle ϑL is larger than 90°, then a uniting process B is carried out in a step 307. If the angle ϑL does not satisfy the inequalities in the step 304, then a uniting process C is carried out in a step 308.
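- In code, the decision logic of FIG. 3 might be summarized as follows; this is a sketch, and Δρ, Δϑ and ϑTH are tuning parameters whose values the text leaves open.

```python
import math

def choose_uniting_process(rho_l, theta_l, rho_r, theta_r,
                           d_rho, d_theta, theta_th):
    """FIG. 3 logic: return "A", "B" or "C", or None if the left and right
    representative line segments cannot be united at all."""
    # Steps 301 and 302: rho and theta of the two lines must be close.
    if not (rho_l - d_rho < rho_r < rho_l + d_rho):
        return None
    if not (theta_l - d_theta < theta_r < theta_l + d_theta):
        return None
    # Step 304: is theta_L close to the right angle (away from 0 and pi)?
    if theta_th < theta_l < math.pi - theta_th:
        # Step 305: dispatch on the gradient of the lefthand line segment.
        return "A" if theta_l < math.pi / 2 else "B"    # steps 306 and 307
    return "C"                                          # step 308
```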
- The uniting process A will now be described below with reference to FIGS. 5 and 6(a) through 6(c). If the angle ϑL with respect to the representative line segment in the lefthand area is in the range of ϑTH < ϑL < 90°, as shown in FIG. 6(a), then the uniting process A is carried out. It is assumed that a representative line segment L in the lefthand area and a representative line segment R in the righthand area are relatively positioned as shown in FIG. 6(b), and that the representative line segment L has a starting point XSL in terms of an X coordinate and an ending point XEL in terms of an X coordinate, and the representative line segment R has a starting point XSR in terms of an X coordinate and an ending point XER in terms of an X coordinate.
- First, as shown in FIG. 5, a step 601 determines whether the ending point XER of the line segment R is smaller than the starting point XSL of the line segment L, thereby checking if these line segments have any overlapping portion. If XER < XSL and hence the line segments R, L overlap each other, then a step 602 determines whether the following inequalities are satisfied: XSR > XSL and XER < XEL.
- If the above inequalities are satisfied, then the X and Y coordinates of the starting point of the line segment L are replaced with the X and Y coordinates of the starting point of the line segment R in a step 603.
- Then, the representative line segment R in the righthand area is removed in a step 604. As a result, the line segments L, R are united together into a line segment as shown in FIG. 6(c). The starting point of the united line segment corresponds to the starting point of the line segment R, and the ending point of the united line segment corresponds to the ending point of the line segment L. If the inequalities in the steps 601, 602 are not satisfied, then the uniting process A is immediately finished.
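- A sketch of the uniting process A in code, with segments held as (XS, YS, XE, YE) tuples; the tests follow the steps 601 and 602 as stated, while the tuple representation and the function name are assumptions. The uniting process B described next mirrors this logic with the roles of the starting and ending points exchanged.

```python
def unite_process_a(seg_l, seg_r):
    """Uniting process A (FIG. 5, steps 601-604). Segments are
    (xs, ys, xe, ye) tuples; returns the united segment or None."""
    xsl, ysl, xel, yel = seg_l
    xsr, ysr, xer, yer = seg_r
    # Step 601: overlap test; step 602: XSR > XSL and XER < XEL.
    if xer < xsl and xsr > xsl and xer < xel:
        # Steps 603-604: L takes R's starting point and R is removed,
        # so the united segment runs from R's start to L's end.
        return (xsr, ysr, xel, yel)
    return None
```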
- The uniting process B will now be described below with reference to FIGS. 7 and 8(a) through 8(c). If the angle ϑL with respect to the representative line segment in the lefthand area is in the range of 90° ≦ ϑL < (180° - ϑTH), as shown in FIG. 8(a), then the uniting process B is carried out. It is assumed that a representative line segment L in the lefthand area and a representative line segment R in the righthand area are relatively positioned as shown in FIG. 8(b). The representative line segments L, R have starting and ending points expressed in terms of X coordinates in the same manner as shown in FIG. 6(b).
- First, as shown in FIG. 7, a step 801 determines whether the ending point XEL of the line segment L is smaller than the starting point XSR of the line segment R, thereby checking if these line segments have any overlapping portion. If XEL < XSR and hence the line segments L, R overlap each other, then a step 802 determines whether the following inequalities are satisfied: XER > XEL and XSR < XSL.
- If the above inequalities are satisfied, then the X and Y coordinates of the ending point of the line segment L are replaced with the X and Y coordinates of the ending point of the line segment R in a step 803.
- Then, the representative line segment R in the righthand area is removed in a step 804. As a result, the line segments L, R are united together into a line segment as shown in FIG. 8(c). The starting point of the united line segment corresponds to the starting point of the line segment L, and the ending point of the united line segment corresponds to the ending point of the line segment R. If the inequalities in the steps 801, 802 are not satisfied, then the uniting process B is immediately finished.
- The uniting process C will now be described below with reference to FIGS. 9(a) and 9(b) and 10(a) through 10(f). If the angle ϑL does not satisfy the inequalities in the step 304, then the uniting process C is carried out.
- The representative line segments L, R have starting and ending points expressed in terms of X coordinates in the same manner as shown in FIG. 6(b). It is also assumed that the representative line segment L has a starting point YSL in terms of a Y coordinate and an ending point YEL in terms of a Y coordinate, and the representative line segment R has a starting point YSR in terms of a Y coordinate and an ending point YER in terms of a Y coordinate.
- A step 1001 determines whether the starting point YSL of the line segment L is larger than the starting point YSR of the line segment R. If YSL > YSR (as shown in FIG. 10(c)), then the starting point of the line segment L is replaced with the starting point of the line segment R in a step 1002. Specifically, the starting point XSL in terms of an X coordinate of the line segment L is replaced with the starting point XSR in terms of an X coordinate of the line segment R, and the starting point YSL in terms of a Y coordinate of the line segment L is replaced with the starting point YSR in terms of a Y coordinate of the line segment R. Then, the line segment R is removed in a step 1003.
- The line segments L, R are now united into a line segment as shown in FIG. 10(d). The starting point of the united line segment corresponds to the starting point of the line segment R, and the ending point of the united line segment corresponds to the ending point of the line segment L. If YSL is not larger than YSR in the step 1001, then the starting point of the line segment L is employed as the starting point of the united segment in a step 1004, and then the line segment R is removed in the step 1003, so that the line segments are united.
- A step 1005 determines whether the ending point YEL of the line segment L is smaller than the ending point YER of the line segment R. If YEL < YER (as shown in FIG. 10(e)), then the ending point of the line segment L is replaced with the ending point of the line segment R in a step 1006. Specifically, the ending point XEL in terms of an X coordinate of the line segment L is replaced with the ending point XER in terms of an X coordinate of the line segment R, and the ending point YEL in terms of a Y coordinate of the line segment L is replaced with the ending point YER in terms of a Y coordinate of the line segment R. Then, the line segment R is removed in a step 1007.
- The line segments L, R are now united into a line segment as shown in FIG. 10(f). The starting point of the united line segment corresponds to the starting point of the line segment L, and the ending point of the united line segment corresponds to the ending point of the line segment R. If YEL is not smaller than YER in the step 1005, then the ending point of the line segment L is employed as the ending point of the united segment in a step 1008, and then the line segment R is removed in the step 1007, so that the line segments are united.
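- Taken together, the steps 1001 through 1008 extend the left segment by the right one using Y coordinates; a sketch, assuming the same tuple representation as above:

```python
def unite_process_c(seg_l, seg_r):
    """Uniting process C (FIGS. 9(a) and 9(b)): take the smaller starting Y
    and the larger ending Y of the two segments, then discard R."""
    xsl, ysl, xel, yel = seg_l
    xsr, ysr, xer, yer = seg_r
    # Steps 1001-1004: choose the starting point of the united segment.
    xs, ys = (xsr, ysr) if ysl > ysr else (xsl, ysl)
    # Steps 1005-1008: choose the ending point of the united segment.
    xe, ye = (xer, yer) if yel < yer else (xel, yel)
    return (xs, ys, xe, ye)
```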
- The process of dividing the representative line segments will now be described below with reference to FIG. 11. The representative line segment L1 is defined as a master line segment, and the other representative line segment L2 as a slave line segment in a step 1201. Then, a step 1202 determines whether the master line segment L1 and the slave line segment L2 are parallel to each other or not.
- If the master and slave line segments L1, L2 are not parallel to each other, then a step 1204 determines the X and Y coordinates (XC, YC) of the point C of intersection. Then, it is determined in a step 1205 whether the point C of intersection is positioned between the starting and ending points of the master line segment L1 and between the starting and ending points of the slave line segment L2.
- If the point C of intersection is not positioned between the starting and ending points of the master and slave line segments, then the line division process is brought to an end. If the point C of intersection is positioned on the master and slave line segments L1, L2, then, as shown in FIG. 12(b), the representative line segment L1 is divided into line segments ℓ1, ℓ2 by the point C of intersection, and the representative line segment L2 is divided into line segments ℓ3, ℓ4 in a step 1206. In this manner, points of intersection between various representative line segments are determined, and data on the determined points of intersection are stored. The above process is effected with respect to the representative line segments which are displayed as shown in FIG. 1(e), thus collecting the data on the points of intersection with respect to the respective representative line segments.
- Then, each of the representative line segments is separated into a plurality of line segments. For example, it is assumed that a representative line segment L has points C1, C2 of intersection as shown in FIG. 13(a).
- The representative line segment L is severed into three line segments La, Lb, Lc by the points C1, C2 of intersection, as shown in FIGS. 13(b), 13(c), and 13(d).
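- A sketch of the master/slave division (steps 1201 through 1206), using the standard parametric segment-intersection formula; the tuple representation and the tolerance eps are assumptions.

```python
def divide_at_intersection(master, slave, eps=1e-9):
    """Return the divided pieces of both segments, or None if they are
    parallel or do not cross within their extents (steps 1202 and 1205)."""
    (x1, y1, x2, y2), (x3, y3, x4, y4) = master, slave
    denom = (x2 - x1) * (y4 - y3) - (y2 - y1) * (x4 - x3)
    if abs(denom) < eps:                        # step 1202: parallel
        return None
    # Parameters t, u locate the crossing along the master and the slave.
    t = ((x3 - x1) * (y4 - y3) - (y3 - y1) * (x4 - x3)) / denom
    u = ((x3 - x1) * (y2 - y1) - (y3 - y1) * (x2 - x1)) / denom
    if not (0.0 < t < 1.0 and 0.0 < u < 1.0):   # step 1205: C not interior
        return None
    xc, yc = x1 + t * (x2 - x1), y1 + t * (y2 - y1)   # step 1204
    # Step 1206: divide both segments by the point C of intersection.
    return [(x1, y1, xc, yc), (xc, yc, x2, y2),
            (x3, y3, xc, yc), (xc, yc, x4, y4)]
```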
- It is assumed that a divided line segment d as shown in FIG. 14(a) has a starting point S having X and Y coordinates (SX, SY), an ending point E having X and Y coordinates (EX, EY), a gradient A, an intercept B, and is inclined at an angle ϑ with respect to the X-axis.
- The divided line segment d is checked against the edge data of the original image by scanning an edge image over a constant width W of dots across the line segment d in the direction indicated by the arrows in FIG. 14(b). More specifically, the intensity of the edge point of each dot scanned is compared with a predetermined threshold, and the edge points whose intensity is higher than the predetermined threshold are counted. If the ratio of the count to the length ℓ of the line segment d is greater than a predetermined ratio, then it is determined that the line segment d corresponds to a boundary of the actual path.
- The degree of agreement between the divided line segment d and the edge data is expressed as follows: degree of agreement = (number of counted edge points)/ℓ ... (1), where ℓ is the length of the line segment d.
- The length ℓ of the line segment d may be determined from the number of scanning lines across the width W. Alternatively, the length ℓ may be represented by |SX - EX| or |SY - EY|, whichever is larger, or may be given by: ℓ = √[(SX - EX)² + (SY - EY)²] ... (2).
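- In code, the two length measures might read as follows (a sketch; the choice between them is left to the implementation):

```python
import math

def segment_length(sx, sy, ex, ey, euclidean=True):
    """Length of a divided line segment d: the equation (2), or the larger
    of the X and Y extents (roughly the number of scanning lines)."""
    if euclidean:
        return math.hypot(sx - ex, sy - ey)   # equation (2)
    return max(abs(sx - ex), abs(sy - ey))    # alternative measure
```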
- The degree of agreement is calculated with respect to different line segments to be measured whose states are classified as shown in FIGS. 16(a) through 16(d).
- The degree of agreement is calculated according to the flowchart shown in FIG. 15. It is assumed that the edge point of each line segment whose degree of agreement is to be measured has X and Y coordinates (RX, RY).
- First, a counter for counting edge points whose intensity is higher than a predetermined threshold is cleared in a step 1601.
- A step 1602 determines whether the angle ϑ of inclination of the line segment d to be checked with respect to the X-axis is 90° or not. If the angle ϑ is 90° and hence the line segment d is vertical, then the line segment d is in a state as shown in FIG. 16(a).
- In this case, the Y coordinate RY of the edge point to be measured varies from EY to SY in a step 1603, and the X coordinate RX varies by the width W across a constant value RXO in a step 1604.
- A step 1605 then determines whether the scanned area lies in the range of the image. If the scanned area lies outside of the image, then the process is finished. If the scanned area lies within the image, the intensity of the edge point at the coordinates (RX, RY) is read in. If the intensity is higher than a predetermined threshold, then the counter for counting edge points is counted up in a step 1607. The step 1607 is carried out for every edge point existing in the scanned area, so that the number of all edge points whose intensity is higher than the threshold is measured along the line segment d.
- If the angle ϑ is not 90°, then a step 1608 determines whether the angle ϑ is smaller than 45° (ϑ < 45°) or larger than 135° (ϑ > 135°), thus determining whether the line segment d has a vertical tendency or not. If the line segment d has a vertical tendency, then the line segment d is in a state as shown in FIG. 16(b), and the Y coordinate RY of the edge point to be checked varies from EY to SY in a step 1609.
- At the same time, the X coordinate RX of the edge point varies by the width W across the value RXO in a step 1610. Then, the step 1605 determines whether the scanned area lies in the range of the image. If the scanned area lies within the image, the intensity of the edge point at the coordinates (RX, RY) is read in, and if the intensity is higher than a predetermined threshold, then the counter for counting edge points is counted up in the step 1607.
- The same scan is performed for a line segment in the state shown in FIG. 16(c): the step 1605 determines whether the scanned area lies in the range of the image. If the scanned area lies within the image, the intensity of the edge point at the coordinates (RX, RY) is read in, and if the intensity is higher than a predetermined threshold, then the counter for counting edge points is counted up in the step 1607.
- Otherwise, the line segment d is in a state as shown in FIG. 16(d), and the X coordinate RX of the edge point to be checked varies from EX to SX in a step 1614. Then, the step 1605 determines whether the scanned area lies in the range of the image. If the scanned area lies within the image, the intensity of the edge point at the coordinates (RX, RY) is read in, and if the intensity is higher than a predetermined threshold, then the counter for counting edge points is counted up in the step 1607.
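- A sketch of the counting loop of FIG. 15 on a NumPy edge image; the four-way classification of FIGS. 16(a) through 16(d) is collapsed here into two cases (scan along Y for segments with a vertical tendency, along X otherwise), and the width, threshold, and integer line-stepping are assumptions.

```python
import numpy as np

def degree_of_agreement(edge_image, seg, width=5, threshold=100):
    """Count edge points above `threshold` in a band of `width` dots across
    the segment and divide by the segment length (equation (1)).
    `seg` is an (sx, sy, ex, ey) tuple of integer pixel coordinates."""
    sx, sy, ex, ey = seg
    h, w = edge_image.shape
    count = 0
    if abs(ey - sy) >= abs(ex - sx):      # vertical tendency: scan along Y
        for ry in range(min(sy, ey), max(sy, ey) + 1):
            # X position on the segment at this scan line (RXO in the text).
            rx0 = sx if ey == sy else sx + (ex - sx) * (ry - sy) // (ey - sy)
            for rx in range(rx0 - width // 2, rx0 + width // 2 + 1):
                if 0 <= rx < w and 0 <= ry < h and edge_image[ry, rx] > threshold:
                    count += 1            # step 1607: count a strong edge point
    else:                                 # horizontal tendency: scan along X
        for rx in range(min(sx, ex), max(sx, ex) + 1):
            ry0 = sy if ex == sx else sy + (ey - sy) * (rx - sx) // (ex - sx)
            for ry in range(ry0 - width // 2, ry0 + width // 2 + 1):
                if 0 <= rx < w and 0 <= ry < h and edge_image[ry, rx] > threshold:
                    count += 1
    return count / max(abs(sx - ex), abs(sy - ey), 1)
```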
- After the counting, the length ℓ of the line segment d is determined according to the equation (2) or one of the alternative processes in a step 1616. Then, a step 1617 determines whether the length ℓ is smaller than a predetermined minimum value or not. If the length ℓ is larger than the minimum value, then the process proceeds to a step 1622, described below. If the length ℓ is smaller than the minimum value, then a step 1619 determines whether either of the opposite ends of the line segment d has no point of connection to another line segment. If both of the opposite ends of the line segment d have points of connection, then the line segment d is determined to be effective to extract a boundary of the path in a step 1620.
- Otherwise, the line segment d is determined to be ineffective to extract a path boundary, and is excluded from the group of line segments to be selected, in a step 1621.
- Line segments to be excluded are shown by way of example in FIGS. 17(a) and 17(b). Specifically, a line segment d1 shown in FIG. 17(a) is excluded from the group of line segments to be selected because one of the opposite ends thereof has no point of connection. A line segment d1 shown in FIG. 17(b) is also excluded from the group of line segments to be selected because both of the opposite ends thereof have no point of connection.
- The ratio of the number of the edge points whose intensity is higher than the predetermined threshold to the length ℓ of the line segment d is then calculated according to the equation (1), thereby determining the degree of agreement in a step 1622. Based on the determined degree of agreement, the effectiveness of the line segment d is ascertained, i.e., it is determined whether the line segment d corresponds to a boundary of the actual path.
- The selected line segments may be joined to each other in different patterns, as shown in FIGS. 18(a) through 18(d).
- FIG. 18(a) shows a pattern in which selected line segments La, Lb originally come from one representative line segment. In this case, the line segments La, Lb are joined into one line segment.
- FIG. 18(b) shows a pattern in which selected line segments La, Lb originate from different representative line segments. In this case, the relationship in which the line segments La, Lb are joined is converted into data, which are stored.
- In the pattern of FIG. 18(c), selected line segments La, Lb, which derive from one representative line segment, are joined to each other at a junction to which another selected line segment Lc is joined.
- FIG. 18(d) illustrates a pattern in which selected line segments La, Lb are originally not joined but separated. The relationship in which these line segments La, Lb are joined is also converted into data, which are stored. In the pattern of FIG. 18(d), the number of points of connection between line segments is zero.
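- As a hedged sketch, the joining relationships could be recorded as follows, merging segments that share an endpoint and a parent representative line (FIG. 18(a)) and storing other junctions as data (FIG. 18(b)); the dictionary representation is an assumption.

```python
def join_segments(segments):
    """Merge selected segments that share an endpoint and come from the same
    representative line; record other junctions as stored data.

    Each segment is a dict: {"start": (x, y), "end": (x, y), "parent": id}.
    Returns the merged segment list and the list of stored junctions."""
    merged, junctions = [], []
    for seg in segments:
        joined = False
        for other in merged:
            if other["end"] == seg["start"]:
                if other["parent"] == seg["parent"]:
                    other["end"] = seg["end"]        # FIG. 18(a): fuse into one
                    joined = True
                else:
                    junctions.append(seg["start"])   # FIG. 18(b): store the joint
                break
        if not joined:
            merged.append(dict(seg))
    return merged, junctions
```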
- In the illustrated embodiment, the original image is divided into two image areas, and the data from the two image areas are processed.
- However, the original image may not be divided; instead, the array or distribution of feature points may be determined directly from the original image, and a group of straight lines approximating the determined array of feature points may be determined for the recognition of the path.
- In that case, the step of dividing the original image into a plurality of image areas and the step of combining the divided image areas into a single image are omitted.
- The modified arrangement also offers the same advantages as those of the illustrated embodiment.
- The method of the present embodiment does not require sequential steps which would otherwise be needed to determine the configuration of the path, but can easily and quickly determine the boundaries of the path.
- The method of determining the configuration of a path for motor vehicles according to the present embodiment is thus well applicable to the control of running or guidance of motor vehicles such as automobiles, which requires data processing in a short period of time.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
Description
- The present invention relates to a method of determining or recognizing the configuration of a path for motor vehicles through image processing, and more particularly to a method of recognizing boundaries of a path irrespective of the configuration of the path.
- Recognition of the configuration of a path for motor vehicles requires that boundaries of the path at an end thereof be determined. To determine the boundaries of a path at an end thereof, it is necessary to process image data of the path at the end thereof, which are produced by a television camera or the like, and to extract line segments from the processed image data at the end of the path. One image processing method employs the Hough transformation as disclosed in Japanese Laid-Open Patent Publications Nos. 62(1987)-24310 and 62(1987)-7096. According to the disclosed process, feature points on an image produced by the image data are subjected to the Hough transformation, thereby producing a group of straight lines corresponding to the distribution of the feature points.
- If the path to be recognized is straight, then the above conventional method based on the image processing can directly utilize the detected straight lines produced by way of the Hough transformation. If the path is curved or branched, however, the detected straight lines produced by way of the Hough transformation cannot be utilized as they are. In such a case, it has been customary to divide an image produced by a television camera into a plurality of small processing areas, and repeatedly detect line segments in the processing areas successively from a closest small area in the image, so that information accurately representing the boundaries of an actual path can be obtained.
- With the conventional method, therefore, sequential steps are required to determine the configuratlon of a path, and the overall process is time-consuming. Accordingly, it has been difficult to apply the conventional method to the control of running of an automobile along paths since the automobile running control or guidance process requires quick data processing.
- It is a major object of the present invention to provide a method of determining the configuration of a path for motor vehicles, the method belng capable of easily and quickly determining boundaries of the path.
- According to the present invention, there is provided a method of determining the configuration of a path for motor vehicles, comprising the steps of producing original image data of the path, determining feature points contained in the original image data, determining a group of straight lines approximating the array of the feature points, extracting straight lines, effective to determine boundaries of the path, from the group of straight lines, dividing the extracted straight lines into a plurality of line segments by points of intersection between the straight lines, and checking the line segments against the feature points of the original image data to determine whether the line segments correspond to the boundaries of the path.
- The step of determining feature points preferably comprises the step of producing edge data by differentiating the original image data. The step of determining a group of straight lines comprises the step of effecting the Hough transformation on the edge data.
- The step of extracting straight lines preferably comprises the steps of determining whether the X coordinate of the starting point of one straight line is larger than the X coordinate of the starting point of another straight line, and whether the X coordinate of the ending point of the one straight line is larger than the X coordinate of the ending point of the other straight line, and determining a straight line which interconnects the starting point of the one straight line and the ending point of the other straight line, to be an effective straight line if the X coordinate of the starting point of the one straight line is larger than the X coordinate of the starting point of the other straight line and the X coordinate of the ending point of the one straight line is larger than the X coordinate of the ending point of the other straight line.
- The step of extracting straight lines preferably comprises the steps of determining whether the Y coordinate of the starting point of one straight line is larger than the Y coordinate of the starting point of another straight line, and whether the Y coordinate of the ending point of the one straight line is larger than the Y coordinate of the ending point of the other straight line, and determining a straight line which interconnects the starting end of the one straight line and the ending point of the other straight line, to be an effective straight line if the Y coordinate of the starting point of the one straight line is larger than the Y coordinate of the starting point of the other straight line and the Y coordinate of the ending point of the one straight line is larger than the Y coordinate of the ending point of the other straight line.
- The step of extracting straight lines preferably comprises the steps of determining whether the Y coordinate of the starting point of one straight line is smaller than the Y coordinate of the starting point of another straight line, and whether the Y coordinate of the ending point of the one straight line is smaller than the Y coordinate of the ending point of the other straight line, and determining a straight line which interconnects the ending point of the one straight line and the starting point of the other straight line, to be an effective straight line if the Y coordinate of the starting point of the one straight line is smaller than the Y coordinate of the starting point of the other straight line and the Y coordinate of the ending point of the one straight line is smaller than the Y coordinate of the ending point of the other straight line.
- The step of dividing the extracted straight lines preferably comprises the steps of defining one straight line as a master line segment and another straight line as a slave line segment, determining whether the master and slave line segments are parallel to each other, determining a point of intersection between the master and slave line segments if the master and slave line segments are not parallel to each other, determining whether the point of intersection is positioned between the starting and ending points of the master and slave line segments, and dividing the master and slave line segments by the point of intersection if the point of intersection is positioned between the starting and ending points of the master and slave line segments.
- The step of checking the line segments preferably comprises the steps of scanning an edge image over a predetermined width across a line segment, comparing the intensity of the edge point of each dot which is scanned, with a predetermined threshold, counting edge points whose intensity is higher than the predetermined threshold, calculating the ratio of the number of the counted edge points to the length of the line segment, and determining whether the line segment corresponds to a boundary of the path based on the calculated ratio.
- According to the present invention, there is also provided a method of determining the configuration of a path for motor vehicles, comprising the steps of producing original image data of the path, dividing the original image data into a plurality of areas, determining feature points contained in each of the areas, determining a group of straight lines approximating the array of the feature points in each of the areas, extracting straight lines, effective to determine boundaries of the path, from the group of straight lines in each of the areas, uniting the extracted straight lines in each of the areas into a single combination of image data representing a single straight line, dividing the single straight line into a plurality of line segments, and checking the line segments against the feature points of the original image data to determine whether the line segments correspond to the boundaries of the path.
- The step of determining feature points preferably comprises the step of producing edge data by differentiating the original image data. The step of determining a group of straight lines comprises the step of effecting the Hough transformation on the edge data.
- the step of extracting straight lines preferably comprises the steps of determining whether the X coordinate of the starting point of one straight line is larger than the X coordinate of the starting point of another straight line, and whether the X coordinate of the ending point of the one straight line is larger than the X coordinate of the ending point of the other straight line, and determining a straight line which interconnects the starting point of the one straight line and the ending point of the other straight line, to be an effective straight line if the X coordinate of the starting point of the one straight line is larger than the X coordinate of the starting point of the other straight line and the X coordinate of the ending point of the one straight line is larger than the X coordinate of the ending point of the other straight line.
- The step of extracting straight lines preferably comprises the steps of determining whether the Y coordinate of the starting point of one straight line is larger than the Y coordinate of the starting point of another straight line, and whether the Y coordinate of the ending point of the one straight line is larger than the Y coordinate of the ending point of the other straight line, and determining a straight line which interconnects the starting end of the one straight line and the ending point of the other straight line, to be an effective straight line if the Y coordinate of the starting point of the one straight line is larger than the X coordinate of the starting point of the other straight line and the Y coordinate of the ending point of the one straight line is larger than the Y coordinate of the ending point of the other straight line.
- The step of extracting straight lines preferably comprises the steps of determining whether the Y coordinate of the starting point of one straight line is smaller than the Y coordinate of the starting point of another straight line, and whether the Y coordinate of the ending point of the one straight line is smaller than the Y coordinate of the ending point of the other straight line, and determining a straight line which interconnects the ending point of the one straight line and the starting point of the other straight line, to be an effective straight line if the Y coordinate of the starting point of the one straight line is smaller than the Y coordinate of the starting point of the other straight line and the Y coordinate of the ending point of the one straight line is smaller than the Y coordinate of the ending point of the other straight line.
- The step of dividing the single straight line preferably comprises the steps of defining one straight line as a master line segment and another straight line as a slave line segment, determining whether the master and slave line segments are parallel to each other, determining a point of intersection between the master and slave line segments if the master and slave line segments are not parallel to each other, determining whether the point of intersection is positioned between the starting and ending points of the master and slave line segments, and dividing the master and slave line segments by the point of intersection if the point of intersection is positioned between the starting and ending points of the master and slave line segments.
- The step of checking the line segments preferably comprises the steps of scanning an edge image over a predetermined width across a line segment, comparing the intensity of the edge point of each dot which is scanned, with a predetermined threshold, counting edge points whose intensity is higher than the predetermined threshold, calculating the ratio of the number of the counted edge points to the length of the line segment, and determining whether the line segment corresponds to a boundary of the path based on the calculated ratio.
- The above and other objects, features and advantages of the present invention will become more apparent from the following description when taken in conjunction with the accompanying drawings in which a preferred embodiment of the present invention is shown by way of illustrative example.
- FIGS. 1(a) through 1(g) are diagrams illustrative of the basic concept of a method according to the present invention;
- FIG. 2 is a graph showing the expression of a representative line segment used in the method of the present invention;
- FIG. 3 is a flowchart of a procedure of uniting representative line segments together;
- FIG. 4 is a graph illustrative of a threshold ϑTH used in the flowchart shown in FIG. 3;
- FIG. 5 is a flowchart of a uniting process A;
- FIGS. 6(a) through 6(c) are graphs showing the states of line segments in the uniting process A;
- FIG. 7 is a flowchart of a uniting process B;
- FIGS 8(a) through 8(c) are graphs showing the states of line segments in the uniting process B;
- FIGS. 9(a) and 9(b) are flowcharts of a uniting process C;
- FIGS. 10(a) through 10(f) are graphs showing the states of line segments in the uniting process C;
- FIG. 11 is a flowchart of a process of dividing representative line segments;
- FIGS. 12(a) and 12(b) are graphs showing the states of line segments in the dividing process;
- FIGS. 13(a) through 13(d) are graphs illustrative of the separation of divided line segments;
- FIGS. 14(a) through 14(b) are graphs illustrative of checking divided line segments against edge data;
- FIG. 15 is a flowchart of a process of determining the degree of agreement between the divided line segments and the edge data and also determining the lengths of line segments;
- FIGS. 16(a) through 16(d) are graphs showing the classification of the states of the divided line segments at the time they are checked against the edge data;
- FIGS. 17(a) and 17(b) are graphs showing line segments excluded from data selected to draw boundaries in the flowchart shown in FIG. 15; and
- FIGS. 18(a) through 18(d) are graphs illustrative of the manner in which the divided line segments are joined.
- FIGS. 1(a) through 1(g) show the preferred basic concept of the present invention as it is applied to a method of determining or recognizing the configuration of a path for motor vehicles.
- It is assumed that a scene as shown in FIG. 1(a) is imaged by a television camera installed in a motor vehicle running along a
path 1. As shown in FIG. 1(a), thepath 1 hasboundaries 2a, 2b adjoining a sidewalk 4,boundaries sidewalk 7 located beyond the boundary 6. The scene also includes a horizon 8 with asystem 9 of mountains located therebeyond. The imaged scene is referred to as an original image, and the original image is divided into two images by a central vertical line. Each of the two divided images is converted into edge data produced by differentiation in each area therein. Using the edge data, the original image is expressed by dot (not shown). In each area, the edge data are subjected to the Hough transformation, thereby producing a group of straight lines approximating the array or distribution of feature points. The straight lines in each area are then statistically classified into groups by clustering, and representative straight lines in the respective groups are available as straight lines effective to determine the boundaries of the path. The original image is divided into two images because the straight lines can be detected highly accurately by way of the Hough transformation. - The representative straight lines are shown in FIGS. 1(b) and 1(c). In FIG. 1(b), the representative straight lines, denoted at L1, L2, L3, correspond respectively to the
boundaries 2a, 2b, 6, respectively. In FIG. 1(c), the representative straight lines, denoted at L4, L5, L6, correspond respectively to theboundaries - Then, points of intersection between the straight lines are determined, as described later on, and the straight lines are divided into line segments at the points of intersection. More specifically, as shown in FIG. 1(e), the straight line L1 is divided into line segments ℓ1a, ℓ1b, ℓ1c, the straight line L4 line segments ℓ4a, ℓ4b, ℓ4c, the straight line L7 line segments ℓ7a, ℓ7b, ℓ7c, and the straight line L8 line segments ℓ8a, ℓ8b, ℓ8c. The line segments are checked against edge data from the original image, as described later on, to measure the degree of agreement between the line segments and the edge data. The result of the measurement is shown in FIG. 1(f). In FIG. 1(f), the line segments ℓ1b, ℓ1c, ℓ4b, ℓ4c, ℓ7b shown in FIG. 1(e) are removed because the degree of agreement between these line segments and the edge data is low. However, the line segments ℓ1a, ℓ7a corresponding to the
boundaries 2a, 2b (FIG. 1(a)), the line segments ℓ4a, ℓ7c corresponding to theboundaries - The manner in which the remaining line segments are joined is then determined, as described later on, thus producing information on the end of the path, which is accurately representative of the actual path boundaries, as shown in FIG. 1(g).
- The main arithmetic operations employed in the above basic concept will now be described below.
- The two divided images shown in FIGS. 1(b) and 1(c) are united into the single combination of image data shown in FIG. 1(d) as follows: First, it is determined whether the representative line segments in the lefthand area (FIG. 1(b) and the representative line segments in the righthand area (FIG. 1(c)), both obtained by clustering, can be united or not. If these representative line segments can be united, then they are united. As shown in FIG. 2, each of these representative line segments is expressed by the lenght ρ of a line extending from the origin of an X-Y coordinate system or Hough-transformed coordinate system perpendicularly to the representative line segment, and an angle ϑ between the X-axis and the line whose length is ρ.
- The possibility of uniting the representative line segments together is determined by checking if the lengths ρ and the angles ϑ in the respective areas are of close values or not. If they are of close values, then the line segments can be united together, and are united depending on the gradients of the line segments. Whether a representative line segment (ρL, ϑL) in the left area and a representative line segment (ρR, ϑR) in the right area can be united together is determined according to the flowchart shown in FIG. 3.
- If the difference between the lengths ρL, ρR of the representative line segments in the lefthand and righthand areas falls within Δρ, then it is determined that the representative line segments can be united together as to the lengths ρL, ρR. Thus, it is determined whether the lengths ρL, ρR of the representative line segments satisfy the following relationships in a step 301:
- ρL - Δρ < ρR and ρR < ρL + Δρ.
- If the above inequalities are satisfied, then it is determined whether the angles ϑL, ϑR with respect to the representative line segments satisfy the following relationships in a step 302:
- ϑL - Δϑ < ϑR and ϑR < ϑL + Δϑ.
If the difference between the angles ϑL, ϑR with respect to the representative line segments in the lefthand and righthand areas falls within Δϑ, then it is determined that the representative line segments can be united together as to the angles ϑL, ϑR. - If the lengths ρL, ρR and the angles ϑL, ϑR of the representative line segments do not satisfy the inequalities in the
steps step 302, then it is determined whether the angle ϑL of the representative line segment in the lefthand area is close to the right angle or not in astep 304 by comparing the angle ϑL with a predetermined reference or threshold ϑTH and also comparing the angle ϑL with (pai - ϑTH) according to the following inequalities: - ϑTH < ϑL and ϑL < (pai - ϑTH)
where ϑTH and (pai - ϑTH) are indicated in the X-Y coordinate system shown in FIG. 4. - If the angle ϑL with respect to the representative line segment in the lefthand area satisfies the above inequalities, then it is determined whether the angle ϑL is smaller than 90° (ϑL < 90°) or not in a
step 305. If the angle ϑL is smaller than 90°, then a uniting process A is carried out in a step 306. If the angle ϑL is 90° or larger, then a uniting process B is carried out in a step 307. If the angle ϑL does not satisfy the inequalities in the step 304, then a uniting process C is carried out in a step 308.
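- Before the individual processes are detailed, the decision flow of FIG. 3 (steps 301 through 308) may be sketched as follows, assuming each representative line segment is reduced to its (ρ, ϑ) pair with ϑ in degrees; the function and parameter names are illustrative, not the patent's:

```python
def select_uniting_process(seg_l, seg_r, d_rho, d_theta, theta_th):
    """Decide whether two representative line segments, given as
    (rho, theta) pairs, can be united, and which uniting process applies.
    Returns 'A', 'B', 'C' or None (cannot be united)."""
    rho_l, theta_l = seg_l
    rho_r, theta_r = seg_r
    # Steps 301, 302: rho and theta must agree within d_rho and d_theta.
    if not (rho_l - d_rho < rho_r < rho_l + d_rho):
        return None
    if not (theta_l - d_theta < theta_r < theta_l + d_theta):
        return None
    # Step 304: theta_L away from 0 and 180 degrees.
    if theta_th < theta_l < 180.0 - theta_th:
        # Step 305: below the right angle picks A, otherwise B.
        return 'A' if theta_l < 90.0 else 'B'
    return 'C'  # step 308: theta_L near 0 or 180 degrees
```

Under these assumptions, select_uniting_process((120.0, 60.0), (122.0, 61.0), 5.0, 5.0, 30.0) would return 'A'.
- The uniting process A will now be described below with reference to FIGS. 5 and 6(a) through 6(c).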
- If the angle ϑL with respect to the representative line segment in the lefthand area is in the range of ϑTH < ϑL < 90°, as shown in FIG. 6(a), then the uniting process A is carried out. It is assumed that a representative line segment L in the lefthand area and a representative line segment R in the righthand area are relatively positioned as shown in FIG. 6(b), and that the representative line segment L has a starting point XSL in terms of an X coordinate and an ending point XEL in terms of an X coordinate, and the representative line segment R has a starting point XSR in terms of an X coordinate and an ending point XER in terms of an X coordinate.
- First, as shown in FIG. 5, a
step 601 determines whether the ending point XER of the line segment R is smaller than the starting point XSL of the line segment L, thereby checking if these line segments have any overlapping portion. If XER < XSL and hence the line segments R, L overlap each other, then a step 602 determines whether the following inequalities are satisfied: - XSR > XSL and XER < XEL.
- If the above inequalities are satisfied, then the X and Y coordinates of the starting point of the line segment L are replaced with the X and Y coordinates of the starting point of the line segment R in a
step 603. Then, the representative line segment R in the righthand area is removed in a step 604. As a result, the line segments L, R are united together into a line segment as shown in FIG. 6(c). The starting point of the united line segment corresponds to the starting point of the line segment R, and the ending point of the united line segment corresponds to the ending point of the line segment L. If the inequalities in the steps 601, 602 are not satisfied, then the line segments L, R are not united.
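- Coded directly from the steps just described, uniting process A might look as follows (a sketch; the dict-based segment representation with keys xs, ys, xe, ye is an assumption of this illustration, not the patent's data layout):

```python
def unite_process_A(seg_l, seg_r):
    """Uniting process A (steps 601-604 of FIG. 5). Returns the united
    segment, or None when the segments stay separate."""
    # Step 601: the ending point of R lies before the starting point of L.
    if not (seg_r['xe'] < seg_l['xs']):
        return None
    # Step 602: R starts after L starts and ends before L ends (in X).
    if not (seg_r['xs'] > seg_l['xs'] and seg_r['xe'] < seg_l['xe']):
        return None
    # Steps 603-604: L's starting point becomes R's starting point and R
    # is discarded; the united segment runs from R's start to L's end.
    united = dict(seg_l)
    united['xs'], united['ys'] = seg_r['xs'], seg_r['ys']
    return united
```

- The uniting process B will now be described below with reference to FIGS. 7 and 8(a) through 8(c).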
- If the angle ϑL with respect to the representative line segment in the lefthand area is in the range of 90° ≦ ϑL < (180° - ϑTH), as shown in FIG. 8(a), then the uniting process B is carried out. It is assumed that a representative line segment L in the lefthand area and a representative line segment R in the righthand area are relatively positioned as shown in FIG. 8(b). The representative line segments L, R have starting and ending points expressed in terms of X coordinates in the same manner as shown in FIG. 6(b).
- First, as shown in FIG. 7, a
step 801 determines whether the ending point XEL of the line segment L is smaller than the starting point XSR of the line segment R, thereby checking if these line segments have any overlapping portion. If XEL < XSR and hence the line segments L, R overlap each other, then a step 802 determines whether the following inequalities are satisfied: - XER > XEL and XSR < XSL.
- If the above inequalities are satisfied, then the X and Y coordinates of the ending point of the line segment L are replaced with the X and Y coordinates of the ending point of the line segment R in a
step 803. Then, the representative line segment R in the righthand area is removed in a step 804. As a result, the line segments L, R are united together into a line segment as shown in FIG. 8(c). The starting point of the united line segment corresponds to the starting point of the line segment L, and the ending point of the united line segment corresponds to the ending point of the line segment R. If the inequalities in the steps 801, 802 are not satisfied, then the line segments L, R are not united.
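- Process B mirrors process A with the roles of the starting and ending points exchanged; a sketch under the same assumed segment representation:

```python
def unite_process_B(seg_l, seg_r):
    """Uniting process B (steps 801-804 of FIG. 7). Returns the united
    segment, or None when the segments stay separate."""
    # Step 801: the ending point of L lies before the starting point of R.
    if not (seg_l['xe'] < seg_r['xs']):
        return None
    # Step 802: R ends after L ends and starts before L starts (in X).
    if not (seg_r['xe'] > seg_l['xe'] and seg_r['xs'] < seg_l['xs']):
        return None
    # Steps 803-804: L's ending point becomes R's ending point, R removed.
    united = dict(seg_l)
    united['xe'], united['ye'] = seg_r['xe'], seg_r['ye']
    return united
```

- The uniting process C will now be described below with reference to FIGS. 9(a), 9(b) and 10(a) through 10(f).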
- If the angle ϑL with respect to the representative line segment in the lefthand area is in the range of 0 ≦ ϑL < ϑTH or (180° - ϑTH) ≦ ϑL < 180°, as shown in FIG. 10(a), then the uniting process C is carried out. The representative line segments L, R have starting and ending points expressed in terms of X coordinates in the same manner as shown in FIG. 6(b). It is also assumed that the representative line segment L has a starting point YSL in terms of a Y coordinate and an ending point YEL in terms of a Y coordinate, and the representative line segment R has a starting point YSR in terms of a Y coordinate and an ending point YER in terms of a Y coordinate.
- It is also assumed that the representative line segments are positioned as shown in FIG. 10(b), e.g., the representative line segment L in the lefthand area and the representative line segment R in the righthand area are relatively positioned as shown in FIG. 10(c). In this case, a uniting process C1 as shown in FIG. 9(a) is carried out.
- First, a
step 1001 determines whether the starting point YSL of the line segment L is larger than the starting point YSR of the line segment R. If YSL > YSR (as shown in FIG. 10(c)), then the starting point of the line segment L is replaced with the starting point of the line segment R in a step 1002. Specifically, the starting point XSL in terms of an X coordinate of the line segment L is replaced with the starting point XSR in terms of an X coordinate of the line segment R, and the starting point YSL in terms of a Y coordinate of the line segment L is replaced with the starting point YSR in terms of a Y coordinate of the line segment R. Then, the line segment R is removed in a step 1003. The line segments L, R are now united into a line segment as shown in FIG. 10(d). The starting point of the united line segment corresponds to the starting point of the line segment R, and the ending point of the united line segment corresponds to the ending point of the line segment L. If YSL is not larger than YSR in the step 1001, then the starting point of the line segment L is employed as the starting point of the united segment in a step 1004, and then the line segment R is removed in the step 1003, so that the line segments are united. - It is assumed that the representative line segments are positioned as shown in FIG. 10(a), e.g., the representative line segment L in the lefthand area and the representative line segment R in the righthand area are relatively positioned as shown in FIG. 10(e). In this case, a uniting process C2 as shown in FIG. 9(b) is carried out.
- First, a
step 1005 determines whether the ending point YEL of the line segment L is smaller than the ending point YER of the line segment R. If YEL < YER (as shown in FIG. 10(e)), then the ending point of the line segment L is replaced with the ending point of the line segment R in a step 1006. Specifically, the ending point XEL in terms of an X coordinate of the line segment L is replaced with the ending point XER in terms of an X coordinate of the line segment R, and the ending point YEL in terms of a Y coordinate of the line segment L is replaced with the ending point YER in terms of a Y coordinate of the line segment R. Then, the line segment R is removed in a step 1007. The line segments L, R are now united into a line segment as shown in FIG. 10(f). The starting point of the united line segment corresponds to the starting point of the line segment L, and the ending point of the united line segment corresponds to the ending point of the line segment R. If YEL is not smaller than YER in the step 1005, then the ending point of the line segment L is employed as the ending point of the united segment in a step 1008, and then the line segment R is removed in the step 1007, so that the line segments are united.
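- The two cases C1 (steps 1001 through 1004) and C2 (steps 1005 through 1008) can be folded into a single sketch that keeps the starting point with the smaller Y coordinate and the ending point with the larger Y coordinate, again using the assumed segment dicts:

```python
def unite_process_C(seg_l, seg_r):
    """Uniting processes C1 and C2 of FIGS. 9(a) and 9(b) folded together;
    R is then removed (steps 1003, 1007)."""
    united = dict(seg_l)
    if seg_l['ys'] > seg_r['ys']:      # step 1001: C1 case of FIG. 10(c)
        united['xs'], united['ys'] = seg_r['xs'], seg_r['ys']
    if seg_l['ye'] < seg_r['ye']:      # step 1005: C2 case of FIG. 10(e)
        united['xe'], united['ye'] = seg_r['xe'], seg_r['ye']
    return united
```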
- The division of the straight lines into the line segments as shown in FIG. 1(e) will be described below with reference to FIGS. 11, 12(a) and 12(b). It is assumed that representative line segments L1, L2 as shown in FIG. 12(a) are obtained as a result of one of the uniting processes described above. In this case, a line division process as shown in FIG. 11 is carried out. The representative line segment L1 is defined as a master line segment, and the other representative line segment L2 as a slave line segment in a step 1201. Then, a step 1202 determines whether the master line segment L1 and the slave line segment L2 are parallel to each other or not. If the master and slave line segments L1, L2 are parallel to each other, then, since there is no point of intersection between these line segments, the line division process is finished. Because the master and slave line segments L1, L2 shown in FIG. 12(a) are not parallel to each other, they intersect with each other at a point C. A step 1204 determines the X and Y coordinates (XC, YC) of the point C of intersection. Then, it is determined in a step 1205 whether the point C of intersection is positioned between the starting and ending points of the master line segment L1 and between the starting and ending points of the slave line segment L2. - If the point C of intersection is not positioned on the master and slave line segments L1, L2, then the line division process is brought to an end. If the point C of intersection is positioned on the master and slave line segments L1, L2, then, as shown in FIG. 12(b), the representative line segment L1 is divided into line segments ℓ1, ℓ2 by the point C of intersection, and the representative line segment L2 is divided into line segments ℓ3, ℓ4 in a
step 1206. In this manner, points of intersection between various representative line segments are determined, and data on the determined points of intersection are stored. The above process is effected with respect to the representative line segments which are displayed as shown in FIG. 1(e), thus collecting the data on the points of intersection with respect to the respective representative line segments. - Based on the points of intersection thus determined, each of the representative line segments is separated into a plurality of line segments. For example, it is assumed that a representative line segment L has points C1, C2 of intersection as shown in FIG. 13(a). The representative line segment L is severed into three line segments La, Lb, Lc by the points C1, C2 of intersection as shown in FIGS. 13(b), 13(c), and 13(d).
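- The geometric core of the steps 1202 through 1206 is ordinary segment intersection; the following self-contained sketch (the tuple-based segment layout is an assumption of this illustration) returns the four sub-segments when the master and slave cross between their endpoints:

```python
def split_at_intersection(m, s):
    """Line division process of FIG. 11: m and s are ((xs, ys), (xe, ye)).
    Returns the four sub-segments, or None if there is no crossing."""
    (x1, y1), (x2, y2) = m
    (x3, y3), (x4, y4) = s
    # Step 1202: parallel segments have no point of intersection.
    denom = (x2 - x1) * (y4 - y3) - (y2 - y1) * (x4 - x3)
    if denom == 0:
        return None
    # Step 1204: parametric intersection of the two supporting lines.
    t = ((x3 - x1) * (y4 - y3) - (y3 - y1) * (x4 - x3)) / denom
    u = ((x3 - x1) * (y2 - y1) - (y3 - y1) * (x2 - x1)) / denom
    # Step 1205: the point C must lie on both segments.
    if not (0.0 <= t <= 1.0 and 0.0 <= u <= 1.0):
        return None
    xc, yc = x1 + t * (x2 - x1), y1 + t * (y2 - y1)
    # Step 1206: divide each segment at the point C of intersection.
    return [((x1, y1), (xc, yc)), ((xc, yc), (x2, y2)),
            ((x3, y3), (xc, yc)), ((xc, yc), (x4, y4))]
```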
- The divided line segments are then checked against the edge data produced from the original image (FIG. 1(a)) as follows: It is assumed that a divided line segment d as shown in FIG. 14(a) has a starting point S having X and Y coordinates (SX, SY), an ending point E having X and Y coordinates (EX, EY), a gradient A, an intercept B, and is inclined at an angle ϑ with respect to the X-axis.
- The divided line segment d is checked against the edge data of the original image by scanning an edge image over a constant width W of dots across the line segment d in the direction indicated by the arrows in FIG. 14(b). More specifically, the intensity of the edge point of each dot scanned is compared with a predetermined threshold, and the edge points whose intensity is higher than the predetermined threshold are counted. If the ratio of the count to the length ℓ of the line segment d is greater than a predetermined ratio, then it is determined that the line segment d corresponds to a boundary of the actual path. The degree of agreement between the divided line segment d and the edge data is expressed as follows:
- Degree of agreement = [number of edge points whose intensity is higher than the threshold] / [length ℓ of the line segment d] ... (1)
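- Read as code, equation (1) amounts to the following sketch, in which the band of width W is scanned across the segment and the length ℓ is approximated by the number of centre dots sampled along it; the names and the simplified horizontal scan are assumptions of this illustration (the orientation-dependent scanning of FIGS. 15 and 16 is detailed next):

```python
def degree_of_agreement(edge_image, centre_dots, width, threshold):
    """Count the edge points brighter than `threshold` in a band `width`
    dots wide across the segment and divide by the segment length, here
    approximated by the number of centre dots. edge_image is a 2-D list
    of intensities; centre_dots are (x, y) pairs along the segment."""
    h, w = len(edge_image), len(edge_image[0])
    count = 0
    for x, y in centre_dots:
        for dx in range(-(width // 2), width // 2 + 1):
            rx, ry = int(round(x + dx)), int(round(y))
            # Step 1605: only dots inside the image are examined.
            if 0 <= rx < w and 0 <= ry < h and edge_image[ry][rx] > threshold:
                count += 1
    return count / max(len(centre_dots), 1)
```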
- The degree of agreement is calculated with respect to different line segments to be measured whose states are classified as shown in FIGS. 16(a) through 16(d). The degree of agreement is calculated according to the flowchart shown in FIG. 15. It is assumed that the edge point of each line segment whose degree of agreement is to be measured has X and Y coordinates (RX, RY).
- A counter for counting edge points whose intensity is higher than a predetermined threshold is cleared in a
step 1601. Then, a step 1602 determines whether the angle ϑ of inclination of the line segment d to be checked with respect to the X-axis is 90° or not. If the angle ϑ is 90° and hence the line segment d is vertical, then the line segment d is in a state as shown in FIG. 16(a). In this case, the Y coordinate RY of the edge point to be measured varies from EY to SY in a step 1603, and the X coordinate RX varies by the width W across a constant value RXO in a step 1604. Then, a step 1605 determines whether the scanned area lies in the range of the image. If the scanned area lies outside of the image, then the process is finished. If the scanned area lies within the image, the intensity of the edge point at the coordinates (RX, RY) is read in. If the intensity is higher than a predetermined threshold, then the counter for counting edge points is counted up in a step 1607. The step 1607 is carried out for every edge point existing in the scanned area, so that the number of all edge points whose intensity is higher than the threshold is measured along the line segment d. - If the angle ϑ of the line segment d is not 90° in the
step 1602, then a step 1608 determines whether the angle ϑ is smaller than 45° (ϑ < 45°) or larger than 135° (ϑ > 135°), thus determining whether the line segment d has a vertical tendency or not. If the line segment d has a vertical tendency, then the line segment d is in a state as shown in FIG. 16(b) and the Y coordinate RY of the edge point to be checked varies from EY to SY in a step 1609. A value D is produced by dividing the intercept B of the line segment d by the gradient A (D = B/A), and a value RXO is determined according to RXO = RY/A - D. The X coordinate RX of the edge point varies by the width W across the value RXO in a step 1610. Then, the step 1605 determines whether the scanned area lies in the range of the image. If the scanned area lies within the image, the intensity of the edge point at the coordinates (RX, RY) is read in, and if the intensity is higher than a predetermined threshold, then the counter for counting edge points is counted up in the step 1607.
step 1608, and hence the line segment d does not have a vertical tendency, then a step 1611 determines whether the X coordinate SX of the starting point of the line segment d is smaller than the X coordinate EX of the ending point E thereof or not (SX < EX). If the X coordinate SX of the starting point S is smaller than the X coordinate EX of the ending point E, then the line segment d is in a state as shown in FIG. 16(c). In this case, the X coordinate RX of the edge point to be checked varies from SX to EX in a step 1612. The Y coordinate RY varies by the width W across a value RYO = RX·A + B in a step 1613. Then, the step 1605 determines whether the scanned area lies in the range of the image. If the scanned area lies within the image, the intensity of the edge point at the coordinates (RX, RY) is read in, and if the intensity is higher than a predetermined threshold, then the counter for counting edge points is counted up in the step 1607. - If the X coordinate SX of the starting point S is larger than the X coordinate EX of the ending point E in the
step 1611, the line segment d is in a state as shown in FIG. 16(d). In this case, the X coordinate RX of the edge point to be checked varies from EX to SX in a step 1614. The Y coordinate RY varies by the width W across a value RYO = RX·A + B in a step 1615. Then, the step 1605 determines whether the scanned area lies in the range of the image. If the scanned area lies within the image, the intensity of the edge point at the coordinates (RX, RY) is read in, and if the intensity is higher than a predetermined threshold, then the counter for counting edge points is counted up in the step 1607.
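- The four orientation cases of FIGS. 16(a) through 16(d) reduce to the choice of a scan variable; a sketch, assuming the segment carries integer endpoint coordinates, the gradient A, the intercept B and the angle ϑ in degrees, with EY ≦ SY whenever the scan runs along Y:

```python
def scan_centres(seg):
    """Yield the centre dot of each scan line for the four states of
    FIGS. 16(a)-(d); seg carries sx, sy, ex, ey, a (gradient A),
    b (intercept B) and theta (degrees)."""
    if seg['theta'] == 90.0:
        # FIG. 16(a): vertical segment, RX held at the constant RXO.
        for ry in range(seg['ey'], seg['sy'] + 1):
            yield seg['sx'], ry
    elif seg['theta'] < 45.0 or seg['theta'] > 135.0:
        # FIG. 16(b): vertical tendency; walk along Y and invert
        # y = A*x + B, i.e. RXO = RY/A - D with D = B/A.
        for ry in range(seg['ey'], seg['sy'] + 1):
            yield (ry - seg['b']) / seg['a'], ry
    elif seg['sx'] < seg['ex']:
        # FIG. 16(c): walk along X from SX to EX, RYO = RX*A + B.
        for rx in range(seg['sx'], seg['ex'] + 1):
            yield rx, rx * seg['a'] + seg['b']
    else:
        # FIG. 16(d): starting point lies to the right; walk EX to SX.
        for rx in range(seg['ex'], seg['sx'] + 1):
            yield rx, rx * seg['a'] + seg['b']
```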
- After the edge points whose intensity is higher than the predetermined threshold have been counted by the counter, the length ℓ of the line segment d is determined according to the equation (2) or one of the alternative processes in a step 1616. Then, a step 1617 determines whether the length ℓ is smaller than a predetermined minimum value or not. If the length ℓ is larger than the minimum value, then the process is finished. If the length ℓ is smaller than the minimum value, then a step 1619 determines whether neither of the opposite ends of the line segment d has a point of connection to another line segment. If either of the opposite ends of the line segment d has a point of connection, then the line segment d is determined to be effective to extract a boundary of the path in a step 1620. - If neither of the opposite ends of the line segment d has a point of connection to another line segment, then the line segment d is determined to be ineffective to extract a path boundary, and is excluded from the group of line segments to be selected, in a
step 1621. Line segments to be excluded are shown by way of example in FIGS. 17(a) and 17(b). Specifically, a line segment d1 shown in FIG. 17(a) is excluded from the group of line segments to be selected because one of the opposite ends thereof has no point of connection. A line segment d1 shown in FIG. 17(b) is also excluded from the group of line segments to be selected because both of the opposite ends thereof have no point of connection. - Then, the ratio of the number of the edge points whose intensity is higher than the predetermined threshold to the length ℓ of the line segment d is calculated according to the equation (1), thereby determining the degree of agreement in a
step 1622. Based on the determined degree of agreement, the effectiveness of the line segment d is ascertained, i.e., it is determined whether the line segment d corresponds to a boundary of the actual path. - The manner in which the divided line segments are joined, as shown in FIG. 1(g), is determined as described below.
- The selected line segments may be joined to each other in different patterns as shown in FIGS. 18(a) through 18(d). FIG. 18(a) shows a pattern in which selected line segments La, Lb originally come from one representative line segment. The line segments La, Lb shown in FIG. 18(a) are joined into one line segment. FIG. 18(b) shows a pattern in which selected line segments La, Lb originate from different representative line segments. In this case, the relationship in which the line segments La, Lb are joined is converted into data, which are stored. According to the pattern shown in FIG. 18(c), selected line segments La, Lb, which derive from one representative line segment, are joined to each other at a junction to which another selected line segment Lc is joined. With this pattern, the relationship in which the line segments La, Lb, Lc are joined is converted into data, which are stored. FIG. 18(d) illustrates a pattern in which selected line segments La, Lb are originally not joined but separated. The relationship in which these line segments La, Lb are joined is also converted into data, which are stored. In the pattern of FIG. 18(d), the number of points of connection between line segments is zero.
- In the above embodiment, the original image is divided into two image areas, and the data from the two image areas are processed. However, the original image need not be divided; instead, the array or distribution of feature points may be determined directly from the original image, and a group of straight lines approximating the determined array of feature points may be determined for the recognition of the path. According to such a modification, the step of dividing the original image into a plurality of image areas and the step of combining the divided image areas into a single image are omitted. The modified arrangement also offers the same advantages as those of the illustrated embodiment.
- With the present embodiment, as described above, since the divided line segments are checked against the feature points of the original image data, the boundaries of the path can easily and quickly be recognized irrespective of the configuration of the path.
- Therefore, even if the path is curved or branched, the method of the present embodiment does not require sequential steps which would otherwise be needed to determine the configuration of the path, but can easily and quickly determine the boundaries of the path. The method of determining the configuration of a path for motor vehicles according to the present embodiment is thus well applicable to the control of running or guidance of motor vehicles such as automobiles, which requires data processing in a short period of time.
- Although a certain preferred embodiment has been shown and described, it should be understood that many changes and modifications may be made therein without departing from the scope of the appended claims.
Claims (16)
- A method of determining the configuration of a path for motor vehicles, comprising the steps of: producing original image data of the path; determining feature points contained in said original image data; determining a group of straight lines approximating the array of the feature points; extracting straight lines, effective to determine boundaries of the path, from said group of straight lines; dividing the extracted straight lines into a plurality of line segments by points of intersection between the straight lines; and checking said line segments against the feature points of said original image data to determine whether the line segments correspond to the boundaries of the path.
- A method according to claim 1, wherein said step of determining feature points comprises the step of producing edge data by differentiating said original image data.
- A method according to claim 2, wherein said step of determining a group of straight lines comprises the step of effecting the Hough transformation on said edge data.
- A method according to claim 1, wherein said step of extracting straight lines comprises the steps of determining whether the X coordinate of the starting point of one straight line is larger than the X coordinate of the starting point of another straight line, and whether the X coordinate of the ending point of said one straight line is larger than the X coordinate of the ending point of said other straight line, and determining a straight line which interconnects the starting point of said one straight line and the ending point of said other straight line, to be an effective straight line if said X coordinate of the starting point of said one straight line is larger than said X coordinate of the starting point of said other straight line and said X coordinate of the ending point of said one straight line is larger than said X coordinate of the ending point of said other straight line.
- A method according to claim 1, wherein said step of extracting straight lines comprises the steps of determining whether the Y coordinate of the starting point of one straight line is larger than the Y coordinate of the starting point of another straight line, and whether the Y coordinate of the ending point of said one straight line is larger than the Y coordinate of the ending point of said other straight line, and determining a straight line which interconnects the starting end of said one straight line and the ending point of said other straight line, to be an effective straight line if said Y coordinate of the starting point of said one straight line is larger than said Y coordinate of the starting point of said other straight line and said Y coordinate of the ending point of said one straight line is larger than said Y coordinate of the ending point of said other straight line.
- A method according to claim 1, wherein said step of extracting straight lines comprises the steps of determining whether the Y coordinate of the starting point of one straight line is smaller than the Y coordinate of the starting point of another straight line, and whether the Y coordinate of the ending point of said one straight line is smaller than the Y coordinate of the ending point of said other straight line, and determining a straight line which interconnects the ending point of said one straight line and the starting point of said other straight line, to be an effective straight line if said Y coordinate of the starting point of said one straight line is smaller than said Y coordinate of the starting point of said other straight line and said Y coordinate of the ending point of said one straight line is smaller than said Y coordinate of the ending point of said other straight line.
- A method according to claim 1, wherein said step of dividing the extracted straight lines comprises the steps of defining one straight line as a master line segment and another straight line as a slave line segment, determining whether said master and slave line segments are parallel to each other, determining a point of intersection between said master and slave line segments if the master and slave line segments are not parallel to each other, determining whether the point of intersection is positioned between the starting and ending points of said master and slave line segments, and dividing said master and slave line segments by the point of intersection if the point of intersection is positioned between the starting and ending points of the master and slave line segments.
- A method according to claim 1, wherein said step of checking said line segments comprises the steps of scanning an edge image over a predetermined width across a line segment, comparing the intensity of the edge point of each dot which is scanned, with a predetermined threshold, counting edge points whose intensity is higher than said predetermined threshold, calculating the ratio of the number of the counted edge points to the length of the line segment, and determining whether the line segment corresponds to a boundary of the path based on the calculated ratio.
- A method of determining the configuration of a path for motor vehicles, comprising the steps of: producing original image data of the path; dividing said original image data into a plurality of areas; determining feature points contained in each of said areas; determining a group of straight lines approximating the array of the feature points in each of said areas; extracting straight lines, effective to determine boundaries of the path, from said group of straight lines in each of said areas; uniting the extracted straight lines in each of said areas into a single combination of image data representing a single straight line; dividing said single straight line into a plurality of line segments; and checking said line segments against the feature points of said original image data to determine whether the line segments correspond to the boundaries of the path.
- A method according to claim 9, wherein said step of determining feature points comprises the step of producing edge data by differentiating said original image data.
- A method according to claim 10, wherein said step of determining a group of straight lines comprises the step of effecting the Hough transformation on said edge data.
- A method according to claim 9, wherein said step of extracting straight lines comprises the steps of determining whether the X coordinate of the starting point of one straight line is larger than the X coordinate of the starting point of another straight line, and whether the X coordinate of the ending point of said one straight line is larger than the X coordinate of the ending point of said other straight line, and determining a straight line which interconnects the starting point of said one straight line and the ending point of said other straight line, to be an effective straight line if said X coordinate of the starting point of said one straight line is larger than said X coordinate of the starting point of said other straight line and said X coordinate of the ending point of said one straight line is larger than said X coordinate of the ending point of said other straight line.
- A method according to claim 9, wherein said step of extracting straight lines comprises the steps of determining whether the Y coordinate of the starting point of one straight line is larger than the Y coordinate of the starting point of another straight line, and whether the Y coordinate of the ending point of said one straight line is larger than the Y coordinate of the ending point of said other straight line, and determining a straight line which interconnects the starting end of said one straight line and the ending point of said other straight line, to be an effective straight line if said Y coordinate of the starting point of said one straight line is larger than said Y coordinate of the starting point of said other straight line and said Y coordinate of the ending point of said one straight line is larger than said Y coordinate of the ending point of said other straight line.
- A method according to claim 9, wherein said step of extracting straight lines comprises the steps of determining whether the Y coordinate of the starting point of one straight line is smaller than the Y coordinate of the starting point of another straight line, and whether the Y coordinate of the ending point of said one straight line is smaller than the Y coordinate of the ending point of said other straight line, and determining a straight line which interconnects the ending point of said one straight line and the starting point of said other straight line, to be an effective straight line if said Y coordinate of the starting point of said one straight line is smaller than said Y coordinate of the starting point of said other straight line and said Y coordinate of the ending point of said one straight line is smaller than said Y coordinate of the ending point of said other straight line.
- A method according to claim 9, wherein said step of dividing said single straight line comprises the steps of defining one straight line as a master line segment and another straight line as a slave line segment, determining whether said master and slave line segments are parallel to each other, determining a point of intersection between said master and slave line segments if the master and slave line segments are not parallel to each other, determining whether the point of intersection is positioned between the starting and ending points of said master and slave line segments, and dividing said master and slave line segments by the point of intersection if the point of intersection is positioned between the starting and ending points of the master and slave line segments.
- A method according to claim 9, wherein said step of checking said line segments comprises the steps of scanning an edge image over a predetermined width across a line segment, comparing the intensity of the edge point of each dot which is scanned, with a predetermined threshold, counting edge points whose intensity is higher than said predetermined threshold, calculating the ratio of the number of the counted edge points to the length of the line segment, and determining whether the line segment corresponds to a boundary of the path based on the calculated ratio.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP1334127A JP2843079B2 (en) | 1989-12-22 | 1989-12-22 | Driving path determination method |
JP334127/89 | 1989-12-22 |
Publications (3)
Publication Number | Publication Date |
---|---|
EP0434455A2 true EP0434455A2 (en) | 1991-06-26 |
EP0434455A3 EP0434455A3 (en) | 1993-02-03 |
EP0434455B1 EP0434455B1 (en) | 1997-10-15 |
Family
ID=18273824
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP90314123A Expired - Lifetime EP0434455B1 (en) | 1989-12-22 | 1990-12-21 | Method of determining the configuration of a path for motor vehicles |
Country Status (4)
Country | Link |
---|---|
US (1) | US5341437A (en) |
EP (1) | EP0434455B1 (en) |
JP (1) | JP2843079B2 (en) |
DE (1) | DE69031589T2 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3079099A1 (en) * | 2015-04-01 | 2016-10-12 | Ricoh Company, Ltd. | Method and device for detecting road dividing object |
EP3594852A1 (en) * | 2018-07-12 | 2020-01-15 | HERE Global B.V. | Method, apparatus, and system for constructing a polyline from line segments |
Families Citing this family (132)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06213660A (en) * | 1993-01-19 | 1994-08-05 | Aisin Seiki Co Ltd | Detecting method for approximate straight line of image |
JP3374570B2 (en) * | 1995-01-31 | 2003-02-04 | いすゞ自動車株式会社 | Lane departure warning device |
US5910854A (en) | 1993-02-26 | 1999-06-08 | Donnelly Corporation | Electrochromic polymeric solid films, manufacturing electrochromic devices using such solid films, and processes for making such solid films and devices |
US5877897A (en) | 1993-02-26 | 1999-03-02 | Donnelly Corporation | Automatic rearview mirror, vehicle lighting control and vehicle interior monitoring system using a photosensor array |
US6822563B2 (en) * | 1997-09-22 | 2004-11-23 | Donnelly Corporation | Vehicle imaging system with accessory control |
US5670935A (en) | 1993-02-26 | 1997-09-23 | Donnelly Corporation | Rearview vision system for vehicle including panoramic view |
US6498620B2 (en) | 1993-02-26 | 2002-12-24 | Donnelly Corporation | Vision system for a vehicle including an image capture device and a display system having a long focal length |
US5631982A (en) * | 1993-06-10 | 1997-05-20 | International Business Machines Corporation | System using parallel coordinates for automated line detection in noisy images |
JP3169483B2 (en) * | 1993-06-25 | 2001-05-28 | 富士通株式会社 | Road environment recognition device |
JP3431962B2 (en) * | 1993-09-17 | 2003-07-28 | 本田技研工業株式会社 | Automatic traveling vehicle equipped with a lane marking recognition device |
US5668663A (en) | 1994-05-05 | 1997-09-16 | Donnelly Corporation | Electrochromic mirrors and devices |
JP3357749B2 (en) * | 1994-07-12 | 2002-12-16 | 本田技研工業株式会社 | Vehicle road image processing device |
US5509486A (en) * | 1994-08-12 | 1996-04-23 | Loral Corporation | Method of steering an agricultural vehicle |
US5621645A (en) * | 1995-01-24 | 1997-04-15 | Minnesota Mining And Manufacturing Company | Automated lane definition for machine vision traffic detector |
EP0740163B1 (en) * | 1995-04-25 | 2005-12-14 | Matsushita Electric Industrial Co., Ltd. | Local positioning apparatus for detecting a local position of an automobile on a road |
DE19680415C2 (en) * | 1995-04-26 | 2003-02-06 | Hitachi Ltd | Image processing device for vehicles |
US6891563B2 (en) | 1996-05-22 | 2005-05-10 | Donnelly Corporation | Vehicular vision system |
US5675489A (en) * | 1995-07-06 | 1997-10-07 | Carnegie Mellon University | System and method for estimating lateral position |
US7655894B2 (en) | 1996-03-25 | 2010-02-02 | Donnelly Corporation | Vehicular image sensing system |
JPH10302050A (en) * | 1997-04-30 | 1998-11-13 | Fujitsu Ltd | Data transformation processing circuit |
US8294975B2 (en) | 1997-08-25 | 2012-10-23 | Donnelly Corporation | Automotive rearview mirror assembly |
US6124886A (en) | 1997-08-25 | 2000-09-26 | Donnelly Corporation | Modular rearview mirror assembly |
US6172613B1 (en) | 1998-02-18 | 2001-01-09 | Donnelly Corporation | Rearview mirror assembly incorporating vehicle information display |
US6326613B1 (en) | 1998-01-07 | 2001-12-04 | Donnelly Corporation | Vehicle interior mirror assembly adapted for containing a rain sensor |
DE19738764A1 (en) * | 1997-09-04 | 1999-03-11 | Bayerische Motoren Werke Ag | Graphical display in motor vehicle for road map |
US8288711B2 (en) | 1998-01-07 | 2012-10-16 | Donnelly Corporation | Interior rearview mirror system with forwardly-viewing camera and a control |
US6445287B1 (en) | 2000-02-28 | 2002-09-03 | Donnelly Corporation | Tire inflation assistance monitoring system |
US6329925B1 (en) | 1999-11-24 | 2001-12-11 | Donnelly Corporation | Rearview mirror assembly with added feature modular display |
US6477464B2 (en) | 2000-03-09 | 2002-11-05 | Donnelly Corporation | Complete mirror-based global-positioning system (GPS) navigation solution |
US6693517B2 (en) | 2000-04-21 | 2004-02-17 | Donnelly Corporation | Vehicle mirror assembly communicating wirelessly with vehicle accessories and occupants |
JP2000161915A (en) * | 1998-11-26 | 2000-06-16 | Matsushita Electric Ind Co Ltd | On-vehicle single-camera stereoscopic vision system |
WO2001064481A2 (en) | 2000-03-02 | 2001-09-07 | Donnelly Corporation | Video mirror systems incorporating an accessory module |
WO2007053710A2 (en) | 2005-11-01 | 2007-05-10 | Donnelly Corporation | Interior rearview mirror with display |
US7370983B2 (en) | 2000-03-02 | 2008-05-13 | Donnelly Corporation | Interior mirror assembly with display |
US7167796B2 (en) | 2000-03-09 | 2007-01-23 | Donnelly Corporation | Vehicle navigation system for use with a telematics system |
US6396408B2 (en) | 2000-03-31 | 2002-05-28 | Donnelly Corporation | Digital electrochromic circuit with a vehicle network |
JP3904840B2 (en) * | 2000-08-15 | 2007-04-11 | 富士通株式会社 | Ruled line extraction device for extracting ruled lines from multi-valued images |
ES2287266T3 (en) | 2001-01-23 | 2007-12-16 | Donnelly Corporation | IMPROVED VEHICLE LIGHTING SYSTEM. |
US7255451B2 (en) | 2002-09-20 | 2007-08-14 | Donnelly Corporation | Electro-optic mirror cell |
US7581859B2 (en) | 2005-09-14 | 2009-09-01 | Donnelly Corp. | Display device for exterior rearview mirror |
US7697027B2 (en) | 2001-07-31 | 2010-04-13 | Donnelly Corporation | Vehicular video system |
US6882287B2 (en) | 2001-07-31 | 2005-04-19 | Donnelly Corporation | Automotive lane change aid |
FR2837813B1 (en) | 2002-03-29 | 2004-06-11 | Omnium Traitement Valorisa | CIRCULAR PLANT FOR THE BIOLOGICAL TREATMENT OF WASTEWATER |
ES2391556T3 (en) | 2002-05-03 | 2012-11-27 | Donnelly Corporation | Object detection system for vehicles |
US6918674B2 (en) | 2002-05-03 | 2005-07-19 | Donnelly Corporation | Vehicle rearview mirror system |
US7329013B2 (en) | 2002-06-06 | 2008-02-12 | Donnelly Corporation | Interior rearview mirror system with compass |
WO2003105099A1 (en) | 2002-06-06 | 2003-12-18 | Donnelly Corporation | Interior rearview mirror system with compass |
US20060061008A1 (en) | 2004-09-14 | 2006-03-23 | Lee Karner | Mounting assembly for vehicle interior mirror |
US10144353B2 (en) | 2002-08-21 | 2018-12-04 | Magna Electronics Inc. | Multi-camera vision system for a vehicle |
US7310177B2 (en) | 2002-09-20 | 2007-12-18 | Donnelly Corporation | Electro-optic reflective element assembly |
WO2004026633A2 (en) | 2002-09-20 | 2004-04-01 | Donnelly Corporation | Mirror reflective element assembly |
WO2004103772A2 (en) | 2003-05-19 | 2004-12-02 | Donnelly Corporation | Mirror assembly for vehicle |
US7103222B2 (en) * | 2002-11-01 | 2006-09-05 | Mitsubishi Electric Research Laboratories, Inc. | Pattern discovery in multi-dimensional time series using multi-resolution matching |
US7446924B2 (en) | 2003-10-02 | 2008-11-04 | Donnelly Corporation | Mirror reflective element assembly including electronic component |
US7308341B2 (en) | 2003-10-14 | 2007-12-11 | Donnelly Corporation | Vehicle communication system |
JP4162618B2 (en) | 2004-03-12 | 2008-10-08 | 株式会社豊田中央研究所 | Lane boundary judgment device |
US7526103B2 (en) | 2004-04-15 | 2009-04-28 | Donnelly Corporation | Imaging system for vehicle |
JP4703136B2 (en) * | 2004-06-02 | 2011-06-15 | トヨタ自動車株式会社 | Line drawing processing equipment |
US7881496B2 (en) | 2004-09-30 | 2011-02-01 | Donnelly Corporation | Vision system for vehicle |
US7720580B2 (en) | 2004-12-23 | 2010-05-18 | Donnelly Corporation | Object detection system for vehicle |
US7626749B2 (en) | 2005-05-16 | 2009-12-01 | Donnelly Corporation | Vehicle mirror assembly with indicia at reflective element |
JP4232794B2 (en) * | 2006-05-31 | 2009-03-04 | アイシン・エィ・ダブリュ株式会社 | Driving support method and driving support device |
US7972045B2 (en) | 2006-08-11 | 2011-07-05 | Donnelly Corporation | Automatic headlamp control system |
WO2008127752A2 (en) | 2007-01-25 | 2008-10-23 | Magna Electronics | Radar sensing system for vehicle |
US7914187B2 (en) | 2007-07-12 | 2011-03-29 | Magna Electronics Inc. | Automatic lighting system with adaptive alignment function |
US8017898B2 (en) | 2007-08-17 | 2011-09-13 | Magna Electronics Inc. | Vehicular imaging system in an automatic headlamp control system |
WO2009036176A1 (en) | 2007-09-11 | 2009-03-19 | Magna Electronics | Imaging system for vehicle |
WO2009046268A1 (en) | 2007-10-04 | 2009-04-09 | Magna Electronics | Combined rgb and ir imaging sensor |
US8154418B2 (en) | 2008-03-31 | 2012-04-10 | Magna Mirrors Of America, Inc. | Interior rearview mirror system |
US20100020170A1 (en) | 2008-07-24 | 2010-01-28 | Higgins-Luthman Michael J | Vehicle Imaging System |
US9487144B2 (en) | 2008-10-16 | 2016-11-08 | Magna Mirrors Of America, Inc. | Interior mirror assembly with display |
EP2401176B1 (en) | 2009-02-27 | 2019-05-08 | Magna Electronics | Alert system for vehicle |
US8376595B2 (en) | 2009-05-15 | 2013-02-19 | Magna Electronics, Inc. | Automatic headlamp control |
WO2011014482A1 (en) | 2009-07-27 | 2011-02-03 | Magna Electronics Inc. | Parking assist system |
WO2011014497A1 (en) | 2009-07-27 | 2011-02-03 | Magna Electronics Inc. | Vehicular camera with on-board microcontroller |
ES2538827T3 (en) | 2009-09-01 | 2015-06-24 | Magna Mirrors Of America, Inc. | Imaging and display system for a vehicle |
US8890955B2 (en) | 2010-02-10 | 2014-11-18 | Magna Mirrors Of America, Inc. | Adaptable wireless vehicle vision system based on wireless communication error |
US9117123B2 (en) | 2010-07-05 | 2015-08-25 | Magna Electronics Inc. | Vehicular rear view camera display system with lifecheck function |
US9180908B2 (en) | 2010-11-19 | 2015-11-10 | Magna Electronics Inc. | Lane keeping system and lane centering system |
US9900522B2 (en) | 2010-12-01 | 2018-02-20 | Magna Electronics Inc. | System and method of establishing a multi-camera image using pixel remapping |
US9264672B2 (en) | 2010-12-22 | 2016-02-16 | Magna Mirrors Of America, Inc. | Vision display system for vehicle |
WO2012103193A1 (en) | 2011-01-26 | 2012-08-02 | Magna Electronics Inc. | Rear vision system with trailer angle detection |
US9194943B2 (en) | 2011-04-12 | 2015-11-24 | Magna Electronics Inc. | Step filter for estimating distance in a time-of-flight ranging system |
WO2012145819A1 (en) | 2011-04-25 | 2012-11-01 | Magna International Inc. | Image processing method for detecting objects using relative motion |
WO2013016409A1 (en) | 2011-07-26 | 2013-01-31 | Magna Electronics Inc. | Vision system for vehicle |
WO2013043661A1 (en) | 2011-09-21 | 2013-03-28 | Magna Electronics, Inc. | Vehicle vision system using image data transmission and power supply via a coaxial cable |
US9681062B2 (en) | 2011-09-26 | 2017-06-13 | Magna Electronics Inc. | Vehicle camera image quality improvement in poor visibility conditions by contrast amplification |
US9146898B2 (en) | 2011-10-27 | 2015-09-29 | Magna Electronics Inc. | Driver assist system with algorithm switching |
US10099614B2 (en) | 2011-11-28 | 2018-10-16 | Magna Electronics Inc. | Vision system for vehicle |
JP5601332B2 (en) * | 2012-02-08 | 2014-10-08 | 村田機械株式会社 | Transport vehicle |
US8694224B2 (en) | 2012-03-01 | 2014-04-08 | Magna Electronics Inc. | Vehicle yaw rate correction |
US10609335B2 (en) | 2012-03-23 | 2020-03-31 | Magna Electronics Inc. | Vehicle vision system with accelerated object confirmation |
WO2013158592A2 (en) | 2012-04-16 | 2013-10-24 | Magna Electronics, Inc. | Vehicle vision system with reduced image color data processing by use of dithering |
US10089537B2 (en) | 2012-05-18 | 2018-10-02 | Magna Electronics Inc. | Vehicle vision system with front and rear camera integration |
US9340227B2 (en) | 2012-08-14 | 2016-05-17 | Magna Electronics Inc. | Vehicle lane keep assist system |
DE102013217430A1 (en) | 2012-09-04 | 2014-03-06 | Magna Electronics, Inc. | Driver assistance system for a motor vehicle |
US9446713B2 (en) | 2012-09-26 | 2016-09-20 | Magna Electronics Inc. | Trailer angle detection system |
US9558409B2 (en) | 2012-09-26 | 2017-01-31 | Magna Electronics Inc. | Vehicle vision system with trailer angle detection |
US9090234B2 (en) | 2012-11-19 | 2015-07-28 | Magna Electronics Inc. | Braking control system for vehicle |
US9743002B2 (en) | 2012-11-19 | 2017-08-22 | Magna Electronics Inc. | Vehicle vision system with enhanced display functions |
US10025994B2 (en) | 2012-12-04 | 2018-07-17 | Magna Electronics Inc. | Vehicle vision system utilizing corner detection |
US9481301B2 (en) | 2012-12-05 | 2016-11-01 | Magna Electronics Inc. | Vehicle vision system utilizing camera synchronization |
US20140218529A1 (en) | 2013-02-04 | 2014-08-07 | Magna Electronics Inc. | Vehicle data recording system |
US9092986B2 (en) | 2013-02-04 | 2015-07-28 | Magna Electronics Inc. | Vehicular vision system |
US10027930B2 (en) | 2013-03-29 | 2018-07-17 | Magna Electronics Inc. | Spectral filtering for vehicular driver assistance systems |
US9327693B2 (en) | 2013-04-10 | 2016-05-03 | Magna Electronics Inc. | Rear collision avoidance system for vehicle |
US10232797B2 (en) | 2013-04-29 | 2019-03-19 | Magna Electronics Inc. | Rear vision system for vehicle with dual purpose signal lines |
US9508014B2 (en) | 2013-05-06 | 2016-11-29 | Magna Electronics Inc. | Vehicular multi-camera vision system |
US10567705B2 (en) | 2013-06-10 | 2020-02-18 | Magna Electronics Inc. | Coaxial cable with bidirectional data transmission |
US9260095B2 (en) | 2013-06-19 | 2016-02-16 | Magna Electronics Inc. | Vehicle vision system with collision mitigation |
US20140375476A1 (en) | 2013-06-24 | 2014-12-25 | Magna Electronics Inc. | Vehicle alert system |
US9619716B2 (en) | 2013-08-12 | 2017-04-11 | Magna Electronics Inc. | Vehicle vision system with image classification |
US10326969B2 (en) | 2013-08-12 | 2019-06-18 | Magna Electronics Inc. | Vehicle vision system with reduction of temporal noise in images |
US9988047B2 (en) | 2013-12-12 | 2018-06-05 | Magna Electronics Inc. | Vehicle control system with traffic driving control |
US10160382B2 (en) | 2014-02-04 | 2018-12-25 | Magna Electronics Inc. | Trailer backup assist system |
US9623878B2 (en) | 2014-04-02 | 2017-04-18 | Magna Electronics Inc. | Personalized driver assistance system for vehicle |
US9487235B2 (en) | 2014-04-10 | 2016-11-08 | Magna Electronics Inc. | Vehicle control system with adaptive wheel angle correction |
JP6539958B2 (en) | 2014-08-28 | 2019-07-10 | 村田機械株式会社 | Carrier |
US10286855B2 (en) | 2015-03-23 | 2019-05-14 | Magna Electronics Inc. | Vehicle vision system with video compression |
US10819943B2 (en) | 2015-05-07 | 2020-10-27 | Magna Electronics Inc. | Vehicle vision system with incident recording function |
US10078789B2 (en) | 2015-07-17 | 2018-09-18 | Magna Electronics Inc. | Vehicle parking assist system with vision-based parking space detection |
US10086870B2 (en) | 2015-08-18 | 2018-10-02 | Magna Electronics Inc. | Trailer parking assist system for vehicle |
US10875403B2 (en) | 2015-10-27 | 2020-12-29 | Magna Electronics Inc. | Vehicle vision system with enhanced night vision |
US11285878B2 (en) | 2015-12-17 | 2022-03-29 | Magna Electronics Inc. | Vehicle vision system with camera line power filter |
US11277558B2 (en) | 2016-02-01 | 2022-03-15 | Magna Electronics Inc. | Vehicle vision system with master-slave camera configuration |
US11433809B2 (en) | 2016-02-02 | 2022-09-06 | Magna Electronics Inc. | Vehicle vision system with smart camera video output |
US10132971B2 (en) | 2016-03-04 | 2018-11-20 | Magna Electronics Inc. | Vehicle camera with multiple spectral filters |
US10055651B2 (en) | 2016-03-08 | 2018-08-21 | Magna Electronics Inc. | Vehicle vision system with enhanced lane tracking |
US10908614B2 (en) | 2017-12-19 | 2021-02-02 | Here Global B.V. | Method and apparatus for providing unknown moving object detection |
JP2020135776A (en) * | 2019-02-26 | 2020-08-31 | プラス株式会社 | Imaging device |
CN113785302A (en) * | 2019-04-26 | 2021-12-10 | 辉达公司 | Intersection attitude detection in autonomous machine applications |
US11968639B2 (en) | 2020-11-11 | 2024-04-23 | Magna Electronics Inc. | Vehicular control system with synchronized communication between control units |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE3709207A1 (en) * | 1987-02-28 | 1988-09-08 | Standard Elektrik Lorenz Ag | CIRCUIT ARRANGEMENT FOR CONVERTING DIGITAL TONE SIGNAL VALUES TO ANALOG TONE |
US4970653A (en) * | 1989-04-06 | 1990-11-13 | General Motors Corporation | Vision method of detecting lane boundaries and obstacles |
1989
- 1989-12-22 JP JP1334127A patent/JP2843079B2/en not_active Expired - Lifetime
1990
- 1990-12-21 DE DE69031589T patent/DE69031589T2/en not_active Expired - Fee Related
- 1990-12-21 EP EP90314123A patent/EP0434455B1/en not_active Expired - Lifetime
1993
- 1993-02-09 US US08/016,656 patent/US5341437A/en not_active Expired - Lifetime
Non-Patent Citations (3)
Title |
---|
IEEE JOURNAL OF ROBOTICS AND AUTOMATION vol. RA-3, no. 2, April 1987, NEW YORK US pages 124 - 141 A. M. WAXMAN ET AL. 'A visual navigation system for autonomous land vehicles' *
OPTICAL ENGINEERING vol. 28, no. 9, September 1989, BELLINGHAM US pages 949 - 954 Z. ZHU 'Algorithm for automatic road recognition on digitized images' * |
PROC. CVPR '88 5 June 1988, ANN ARBOR, MICHIGAN pages 905 - 910 K. C. COX 'Rapid search for spherical objects in aerial photographs' * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3079099A1 (en) * | 2015-04-01 | 2016-10-12 | Ricoh Company, Ltd. | Method and device for detecting road dividing object |
CN106157283A (en) * | 2015-04-01 | 2016-11-23 | 株式会社理光 | The detection method of lane segmentation thing and device |
EP3594852A1 (en) * | 2018-07-12 | 2020-01-15 | HERE Global B.V. | Method, apparatus, and system for constructing a polyline from line segments |
US11087469B2 (en) | 2018-07-12 | 2021-08-10 | Here Global B.V. | Method, apparatus, and system for constructing a polyline from line segments |
Also Published As
Publication number | Publication date |
---|---|
EP0434455B1 (en) | 1997-10-15 |
JPH03194670A (en) | 1991-08-26 |
JP2843079B2 (en) | 1999-01-06 |
EP0434455A3 (en) | 1993-02-03 |
DE69031589T2 (en) | 1998-02-12 |
US5341437A (en) | 1994-08-23 |
DE69031589D1 (en) | 1997-11-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP0434455A2 (en) | Method of determining the configuration of a path for motor vehicles | |
EP0361914B1 (en) | A driving way judging device and method | |
US6430303B1 (en) | Image processing apparatus | |
EP0700017A2 (en) | Method and apparatus for directional counting of moving objects | |
US6141435A (en) | Image processing apparatus | |
US7151996B2 (en) | System and method for generating a model of the path of a roadway from an image recorded by a camera | |
US6888953B2 (en) | Vehicle surroundings monitoring apparatus | |
EP1640937B1 (en) | Collision time estimation apparatus and method for vehicles | |
EP0827127B1 (en) | Local positioning apparatus, and method therefor | |
CN100452093C (en) | Device for detecting road traveling lane | |
EP0128820B1 (en) | Pattern matching method and apparatus | |
Zhang et al. | Texture-based segmentation of road images | |
EP0567059B1 (en) | Object recognition system using image processing | |
US6829388B1 (en) | System of detecting road white line, method for detecting road white line and storage medium storing program for detecting road white line | |
US7889887B2 (en) | Lane recognition apparatus | |
US6556692B1 (en) | Image-processing method and apparatus for recognizing objects in traffic | |
JP2000357233A (en) | Body recognition device | |
US20030103650A1 (en) | Lane marker recognition method | |
JP2001004368A (en) | Object recognizing apparatus | |
CN107021103A (en) | Information computing device | |
US6205234B1 (en) | Image processing system | |
EP0696733A1 (en) | Surface inspection system | |
US5222207A (en) | Method and system for determining segment types in figure represented by straight short vectors | |
Beucher et al. | Road recognition in complex traffic situations | |
JP3032060B2 (en) | Roadway recognition device for mobile vehicles |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): DE FR GB |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: NAKAYAMA,SHIGETO, C/O K.K.HONDA GIJYUTSU KENKYUSHO |
|
PUAL | Search report despatched |
Free format text: ORIGINAL CODE: 0009013 |
|
AK | Designated contracting states |
Kind code of ref document: A3 Designated state(s): DE FR GB |
|
17P | Request for examination filed |
Effective date: 19930507 |
|
17Q | First examination report despatched |
Effective date: 19950724 |
|
GRAG | Despatch of communication of intention to grant |
Free format text: ORIGINAL CODE: EPIDOS AGRA |
|
GRAH | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOS IGRA |
|
GRAH | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOS IGRA |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): DE FR GB |
|
REF | Corresponds to: |
Ref document number: 69031589 Country of ref document: DE Date of ref document: 19971120 |
|
ET | Fr: translation filed | ||
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
26N | No opposition filed | ||
REG | Reference to a national code |
Ref country code: GB Ref legal event code: IF02 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR Payment date: 20061208 Year of fee payment: 17 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20061214 Year of fee payment: 17 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20061220 Year of fee payment: 17 |
|
GBPC | Gb: european patent ceased through non-payment of renewal fee |
Effective date: 20071221 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20080701 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: ST Effective date: 20081020 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GB Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20071221 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20071231 |