US6697146B2 - Range finder for finding range by image realization - Google Patents


Info

Publication number
US6697146B2
Authority
US
United States
Prior art keywords
range
image
pattern
section
finding
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US10/177,633
Other versions
US20020196423A1 (en)
Inventor
Nobukazu Shima
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Denso Ten Ltd
Original Assignee
Denso Ten Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Denso Ten Ltd filed Critical Denso Ten Ltd
Assigned to FUJITSU TEN LIMITED reassignment FUJITSU TEN LIMITED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SHIMA, NOBUKAZU
Publication of US20020196423A1 publication Critical patent/US20020196423A1/en
Application granted granted Critical
Publication of US6697146B2 publication Critical patent/US6697146B2/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/97 Determining parameters from multiple pictures
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60T VEHICLE BRAKE CONTROL SYSTEMS OR PARTS THEREOF; BRAKE CONTROL SYSTEMS OR PARTS THEREOF, IN GENERAL; ARRANGEMENT OF BRAKING ELEMENTS ON VEHICLES IN GENERAL; PORTABLE DEVICES FOR PREVENTING UNWANTED MOVEMENT OF VEHICLES; VEHICLE MODIFICATIONS TO FACILITATE COOLING OF BRAKES
    • B60T 7/00 Brake-action initiating means
    • B60T 7/12 Brake-action initiating means for automatic initiation; for initiation not subject to will of driver or passenger
    • B60T 7/22 Brake-action initiating means for automatic initiation; for initiation not subject to will of driver or passenger initiated by contact of vehicle, e.g. bumper, with an external object, e.g. another vehicle, or by means of contactless obstacle detectors mounted on the vehicle
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02 Details
    • G01C 3/06 Use of electric means to obtain final indication
    • G01C 3/08 Use of electric radiation detectors
    • G01C 3/085 Use of electric radiation detectors with electronic parallax measurement
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 11/00 Systems for determining distance or velocity not using reflection or reradiation
    • G01S 11/12 Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D 1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D 1/0251 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 Processing image signals
    • H04N 13/111 Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • G06T 2207/10012 Stereo images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/189 Recording image signals; Reproducing recorded image signals

Definitions

  • the present invention relates to a range finder used for supporting a driver when the driver drives an automobile. More particularly, the present invention relates to a range finder for finding a range to a target, which is running in front, by image realization, when a plurality of cameras or image pickup elements are used.
  • Range finding, that is, finding the range to a vehicle running in front, is one of the functions to be provided.
  • Until now, a range to a vehicle running in front has been detected by a range finding system in which the range is found by image realization with compound-eye cameras.
  • A range to a vehicle running in front is found as follows.
  • The vehicle running in front is photographed by two cameras, which are mounted at a predetermined interval on the vehicle running behind, or by image pickup elements such as image sensors.
  • The parallax of the same object (the vehicle running in front) on the two images thus obtained is utilized, and the range to the vehicle running in front is found from this parallax by the principle of triangulation.
  • In the above conventional range finding method, the entire image photographed by each camera must be corrected, so it becomes necessary to provide a large-scale distortion correcting circuit and a memory to be incorporated into the distortion correcting circuit. Further, since the entire image must be corrected, the quantity of processing necessary for correction increases, which causes a drop in the processing speed. Furthermore, even if the distortion is corrected, fluctuations remain in the pixel values of the images photographed by the right and left cameras, so that it is difficult to accurately calculate the parallax. The above conventional range finding is also disadvantageous in that the road surface (shadows, white lines and characters on the road) becomes an object of range finding in some cases.
  • The present invention has been accomplished to solve the above problems of the conventional range finding method for finding a range to a target by image realization. It is an object of the present invention to provide a range finder capable of finding a range to a target at high speed, without range-finding errors caused by image distortion and without providing a large-scale correcting circuit and memory.
  • The present invention provides a range finder for finding a range to a target by image realization comprising: a first and a second imaging device arranged at a predetermined interval; a pattern extracting section for extracting a first pattern, having a predetermined size and first positional information, from a first image of the target which has been made by the first imaging device; a correlation processing section for detecting a second pattern having second positional information, which is most correlated with the first pattern, from a plurality of horizontal or vertical lines located at positions corresponding to the first positional information in the second image of the target which has been made by the second imaging device; and a parallax calculating section for finding parallax from the first and the second positional information.
  • Because a correlation is found over a plurality of upward and downward lines, it is possible to accurately find the parallax without correcting for distortion and axial misalignment.
  • The correlation processing section finds a correlation with the first pattern for each of a plurality of horizontal or vertical lines, and the second pattern having the second positional information, which is most correlated with the first pattern, is detected according to the plurality of correlation values thus found.
  • A correlation is found over a plurality of upward and downward lines, and by judging how far the correlating position on each line departs from a vertical line, a pattern can be accurately realized even if other confusing patterns exist.
  • The first image is divided into a proximity region, which can be easily correlated, and a background region, which is difficult to correlate; the correlation processing section finds a correlation with the first pattern for each of a plurality of horizontal or vertical lines only when the first pattern exists in the background region.
  • A range finder of the present invention further comprises an image correcting section for detecting a state of misalignment of the first or the second image according to the correlation of the first pattern with the second pattern, found by the correlation processing section for each of a plurality of horizontal or vertical lines, and for correcting the first or the second image according to the detected state of misalignment.
  • A range finder of the present invention further comprises an alarm generating section for comparing the value of correlation, obtained when the correlation processing section detects the second pattern, with a correlation reference value, and for generating an alarm when the value of correlation is not more than the correlation reference value.
  • A range finder for finding a range to a target by image realization of the present invention comprises: a first and a second imaging device arranged at a predetermined interval; a density difference detecting section for finding a density difference between the first image of the target, which has been made by the first imaging device, and the second image of the target, which has been made by the second imaging device; an image density correcting section for correcting the density of the first or the second image according to the density difference between the first and the second image; a pattern extracting section for extracting a first pattern having a predetermined size and first positional information from the first image; a correlation processing section for detecting a second pattern having second positional information, which is most correlated with the first pattern, in the second image; and a parallax calculating section for finding parallax from the first and the second positional information.
  • A range finder for finding a range to a target by image realization of the present invention comprises: a first and a second imaging device arranged at a predetermined interval; a density difference detecting section for finding a density difference between the first image of the target, which has been made by the first imaging device, and the second image of the target, which has been made by the second imaging device; a pattern extracting section for extracting a first pattern having a predetermined size and first positional information from the first image; an image density correcting section for correcting the density of the first pattern according to the density difference between the first and the second image; a correlation processing section for detecting a second pattern having second positional information which is most correlated with the corrected first pattern in the second image; and a parallax calculating section for finding parallax from the first and the second positional information.
  • A range finder of the present invention further comprises a parameter setting section for setting a parameter necessary for processing conducted by the correlation processing section according to the density difference between the first and the second image.
  • A range finder for finding a range to a target by image realization of the present invention comprises: a first and a second imaging device arranged at a predetermined interval; a pattern extracting section for extracting a first pattern, having first positional information and containing a range finding target, from the first image of the target which has been made by the first imaging device; a correlation processing section for detecting a second pattern having second positional information, which is most correlated with the first pattern, in the second image of the target which has been made by the second imaging device; a parallax calculating section for finding parallax from the first and the second positional information; a range finding section for finding a range to the range finding target from the parallax; a realizing section for realizing a height of the range finding target according to the position of the range finding target in the first or the second image and according to the result of range finding conducted by the range finding section; and a nullifying section for nullifying the result of range finding conducted by the range finding section in the case where the realized height is not more than a predetermined reference value.
  • The height of the range finding target is realized from its position in the image, the range of which was found, and from the result of range finding.
  • When the realized height corresponds to the road surface, the target of range finding is realized as characters or shadows on the road, and the result of range finding is nullified.
  • A range finder of the present invention further comprises a road surface position correcting section for detecting a white line in the first or the second image, for finding a range to a forward end portion of the white line, and for correcting the road surface position, which becomes a reference for realizing the height of the range finding target, from the range finding value of the forward end portion of the white line.
  • A range finder of the present invention further comprises a road surface correcting section for detecting a third pattern having the same characteristic as that of the white line from the first or the second image, for finding a range to the forward end portion of the third pattern, and for correcting the road surface position, which becomes a reference for realizing the height of the range finding target, from the range finding value of the forward end portion of the third pattern.
  • Another pattern, such as a character on the road surface having the same characteristic as a white line on the road surface, is extracted from the image; its forward end portion is used as a reference, and the position of the ground in the image is corrected. Due to the foregoing, it becomes possible to judge a height agreeing with the road environment.
  • The reference value correcting section corrects the road surface position by utilizing only the range finding values within a predetermined range among the plurality of range finding values of the forward end portion of the third pattern. By not using data outside the predetermined range, the reference value is corrected more accurately.
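The reference-value correction described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the band limits and the averaging rule are assumptions, and the function names are hypothetical.

```python
# Sketch of the reference value correction: only forward-end range
# values inside a predetermined band are trusted; values outside the
# band are discarded before the road surface reference is updated.
# The band limits (lo, hi) and the averaging rule are assumptions.

def correct_road_reference(current_ref, end_ranges, lo=5.0, hi=50.0):
    """Update the road surface reference using trusted range values only."""
    trusted = [r for r in end_ranges if lo <= r <= hi]
    if not trusted:
        return current_ref  # nothing reliable: keep the old reference
    return sum(trusted) / len(trusted)

# Two plausible white-line end ranges and two outliers; the outliers
# (2.5 m and 80.0 m) fall outside the band and are ignored.
print(correct_road_reference(20.0, [18.0, 22.0, 2.5, 80.0]))  # -> 20.0
```

Rejecting out-of-band values before averaging keeps one spurious white-line detection from dragging the road surface reference away from its true position.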
  • A range finder for finding a range to a target by image realization of the present invention comprises: a first and a second imaging device arranged at a predetermined interval; a pattern extracting section for extracting a first pattern, having a predetermined size and first positional information and containing the range finding target, from the first image of the target which has been made by the first imaging device; a correlation processing section for detecting a second pattern having second positional information, which is most correlated with the first pattern, in the second image of the target which has been made by the second imaging device; a parallax calculating section for finding parallax from the first and the second positional information; a range finding section for finding a range to the range finding target from the parallax; a judging section for judging whether or not the range finding target exists in a range finding objective region according to the result of range finding and according to the position of the range finding target in the first or the second image; and a nullifying section for nullifying the result of range finding conducted by the range finding section in the case where the range finding target exists outside the range finding objective region.
  • A range finder of the present invention further comprises a height realizing section for realizing a height of the range finding target according to the position of the range finding target in the first or the second image, wherein the nullifying section does not nullify the result of range finding conducted by the range finding section in the case where the height is larger than a predetermined reference value, even if the range finding target exists outside the objective region of range finding.
  • When the height exceeds the predetermined value, the target is kept as a range finding object even though it lies outside the range finding objective region, so that the range finding target can be realized more accurately.
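The height realization and nullifying steps above can be sketched with a simple pinhole model over a level road. The camera height H, focal length in pixels fp, the row convention (v measured downward from the optical centre), and the height threshold are all illustrative assumptions, not parameters from the patent.

```python
# Illustrative sketch of height realization and nullification,
# assuming a pinhole camera at height H above a level road.
# H, fp, v convention, and the threshold are assumptions.

def realized_height(v, distance, H=1.2, fp=700.0):
    """Height of the target above the road, from its image row and range.

    v: image row measured downward from the optical centre (pixels).
    distance: range to the target (m), as found by the range finder.
    """
    # A point at the horizon (v = 0) lies at camera height H; each row
    # below the horizon lowers the point by distance * v / fp.
    return H - distance * v / fp

def nullify_if_on_road(v, distance, threshold=0.3):
    """Drop range results whose realized height is below the threshold."""
    h = realized_height(v, distance)
    return None if h < threshold else distance

# A pattern low in the image at 14 m resolves to road paint or a shadow,
# while the same range higher in the image remains a valid obstacle.
print(nullify_if_on_road(v=60, distance=14.0))  # -> None
print(nullify_if_on_road(v=10, distance=14.0))  # -> 14.0
```

This is the mechanism by which characters, white lines and shadows on the road are rejected: their realized height lands at the road surface, so their range results are nullified.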
  • FIG. 1A is a view for explaining the principle of a conventional range finding system in which compound-eye cameras are used;
  • FIG. 1B is a view showing images photographed by a right and a left camera shown in FIG. 1A for explaining the principle of parallax range finding;
  • FIG. 1C is a view showing an image containing distortion made by a camera and also showing an image obtained when the image distortion has been corrected by a distortion correcting circuit;
  • FIG. 2A is a block diagram showing an outline of the constitution of a range finder of an embodiment of the present invention.
  • FIG. 2B is a block diagram showing an outline of the constitution of a range finder of another embodiment of the present invention.
  • FIG. 3A is a view showing a left input image photographed by a left camera and also showing a first pattern in the image;
  • FIG. 3B is a view showing a right input image photographed by a right camera, a graph of correlated values and a second pattern which is most correlated with the first pattern in FIG. 3A extracted by the correlation processing;
  • FIG. 5 is a view showing an actual input image and a relation in which a road surface position of the input image is shown, wherein the actual input image and the relation are shown standing side by side;
  • FIG. 6 is a view for explaining a correction of a relation showing a road surface position, wherein four correcting positions in an input image and a relation showing a road surface position of the input image after correction are shown standing side by side;
  • FIG. 7 is a flow chart for explaining a correction procedure of a relation showing a road surface position.
  • Range finding is conducted by using compound-eye cameras.
  • Referring to FIG. 1A, an explanation will be made of a case in which the distance from a vehicle running behind to a vehicle 100 running in front is found by the conventional range finding system using compound-eye cameras.
  • The optical axis of the left camera 801 and that of the right camera 800 are horizontally arranged. In this case, the same object in the image photographed by the left camera 801 and that in the image photographed by the right camera 800 are horizontally shifted from each other.
  • This misalignment between the position of the object in the image photographed by the left camera 801 and that in the image photographed by the right camera 800 is referred to as parallax.
  • The distance from the vehicle running behind to the vehicle running in front can be found by the principle of triangulation, in which this parallax is utilized.
  • The image 901 shown in FIG. 1B is an example of the image photographed by the left camera, and the image 902 shown in FIG. 1B is an example of the image photographed by the right camera.
  • Distance D (m) to a vehicle running in front can be found by the following expression: D = (f × B) / ((xb − xa) × F).
  • (xb − xa) expresses the parallax 903.
  • The left end of each image photographed by each camera is used as a reference.
  • The lateral coordinate, relative to the reference position, of the object in the image 901 photographed by the left camera is xb.
  • The lateral coordinate, relative to the reference position, of the object in the image 902 photographed by the right camera is xa.
  • The focal distance of each camera is f.
  • The pixel pitch in each image is F.
  • The length of the base line is B.
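The standard stereo triangulation relation, D = (f × B) / ((xb − xa) × F), can be sketched as follows. The camera parameters used here (focal distance, pixel pitch, base line) are illustrative values, not taken from the patent.

```python
# Sketch of stereo triangulation: distance from pixel parallax.
# f, F and B below are illustrative values, not the patent's.

def range_from_parallax(xb, xa, f=0.006, F=0.00001, B=0.12):
    """Distance D (m) = (f * B) / ((xb - xa) * F).

    xb, xa: lateral pixel coordinates of the same object in the left
    (901) and right (902) images, measured from the left edge.
    f: focal distance (m), F: pixel pitch (m/pixel), B: base line (m).
    """
    parallax = xb - xa  # parallax 903, in pixels
    if parallax <= 0:
        raise ValueError("object must show positive parallax")
    return (f * B) / (parallax * F)

# With these parameters a 10-pixel parallax gives D = 7.2 m.
print(range_from_parallax(330, 320))
```

Note how the distance is inversely proportional to the parallax: halving the pixel disparity doubles the computed range, which is why accurate sub-pixel matching matters most for distant targets.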
  • The above conventional method of range finding has the following disadvantages. As the entire image photographed by each camera must be corrected, it is necessary to provide a large-scale distortion correcting circuit and a memory used for the distortion correcting circuit. Since the entire image must be corrected, the quantity of image data processing increases, which causes a drop in the processing speed. Further, even if the distortion is corrected, fluctuations remain in the pixel values of the images photographed by the right and left cameras. Accordingly, it is difficult to accurately calculate the parallax. Furthermore, when range finding is conducted, the road surface, on which shadows, white lines and characters exist, becomes an object of range finding.
  • FIG. 2A is a block diagram showing an outline of the constitution of an embodiment of the range finder 10 of the present invention.
  • The range finder 10, which is mounted on a vehicle, includes a left camera 11, a right camera 12 and an image processing section 13.
  • Various signals are inputted from the range finder 10 into the driver support device 40 mounted on the vehicle.
  • The driver support device 40 realizes a target such as a vehicle running in front, or another object existing ahead. According to this realization, the driver support device 40 helps the driver evade danger and warns the driver.
  • The image processing section 13 includes: a left image memory 21, a right image memory 22, a density difference detecting section 23, a parameter setting section 24, an image correcting section 25, a pattern extracting section 26, a correlation processing section 27, a parallax calculating section 28, a range finding section 29 and a diagnosis section 30.
  • The left camera 11 and the right camera 12 are arranged on a vehicle (referred to as the self-vehicle hereinafter) on which the range finder 10 is mounted, at the same height from the ground and at a predetermined interval, so that parallax is generated with respect to an object in the field of view of each camera.
  • The left camera 11 takes a photograph of the space in its field of view.
  • Data of the left image photographed by the left camera 11 are stored in the left image memory 21.
  • The right camera 12 takes a photograph of the space in its field of view.
  • Data of the right image photographed by the right camera 12 are stored in the right image memory 22.
  • The left image 300 shown in FIG. 3A is an example of the left image photographed by the left camera 11.
  • The right image 310 shown in FIG. 3B is an example of the right image photographed by the right camera 12.
  • Reference numeral 100 is a vehicle running in front.
  • Reference numeral 101 is a white line on the road.
  • Each image is composed of 640 pixels in the direction of the x-axis and 480 pixels in the direction of the y-axis in the drawing. Each pixel has a density of 256 gradations.
  • The density difference calculating section 23 calculates the average image density of the inputted right and left images. Further, it calculates the difference in density between the right and the left image. The calculated density difference data are sent to the image correcting section 25 and the diagnosis section 30.
  • The image correcting section 25 receives the density difference data from the density difference calculating section 23 and corrects the density of the entire left image according to the density difference data. That is, if a density difference exists between the images inputted from the right and left cameras, the pattern matching described later cannot be conducted accurately; as a result, the parallax, and in turn the range, may not be found accurately. For the above reasons, the density balance between the right and left images is corrected. In this connection, instead of correcting the density of the entire left image, the density may be corrected only in a specific region of the image which has been previously determined or detected.
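The density-balance correction can be sketched as follows. The array shapes and 256-gradation range follow the text; the offset-style correction and the function name are assumptions for illustration, not the patent's implementation.

```python
import numpy as np

# Sketch of the density difference correction: compute each image's
# average density, then shift the left image so the two averages
# match. The offset-style rule is an assumption for illustration.

def correct_density(left_img, right_img):
    """Return the left image shifted so its mean density matches the right."""
    diff = left_img.mean() - right_img.mean()  # density difference data
    corrected = left_img.astype(np.int32) - int(round(diff))
    # clamp back into the 256-gradation range
    return np.clip(corrected, 0, 255).astype(np.uint8)

# 640 x 480 images of 256 gradations, as described in the text.
left = np.full((480, 640), 120, dtype=np.uint8)
right = np.full((480, 640), 100, dtype=np.uint8)
balanced = correct_density(left, right)
print(balanced.mean())  # now equals the right image's average density
```

Balancing the densities first keeps a global exposure difference between the two cameras from inflating every correlation score during the later pattern matching.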
  • When the density difference is excessively large, the diagnosis section 30 judges that one or both of the right and left cameras are defective. The diagnosis section 30 then sends an alarm signal to the driver support device 40.
  • The pattern extracting section 26 extracts the first pattern 301, having the coordinates (x1, y1), from the left image 300. It is preferable that the first pattern 301 is composed of 3 × 3 or 4 × 4 pixels, and that it is extracted from an edge section of the range finding target (the vehicle 100 in the case of the left image 300) by means of edge detection.
  • The correlation processing section 27 detects the second pattern, which is most correlated with the first pattern 301 extracted from the left image 300 by the pattern extracting section 26, from the right image 310 by means of pattern matching, so that the coordinates of the second pattern can be detected.
  • Pattern matching of the first pattern 301 with the right image 310 is conducted on the five upward and downward lines (the lines 311 to 315 in FIG. 3B) around the y-coordinate (y1) of the first pattern 301, that is, over five pixels in the direction of the y-axis. It is preferable that pattern matching is conducted on five upward and downward lines; however, other numbers of lines may be adopted if necessary.
  • The range on the x-axis over which pattern matching is conducted can be previously set as the range from xm to xn around the x-coordinate (x1) of the first pattern 301.
  • In this example, pattern matching is conducted on a plurality of upward and downward lines in the direction of the y-axis; however, pattern matching may instead be conducted on a plurality of upward and downward lines in the direction of the x-axis.
  • No distortion correction is conducted; instead, pattern matching is conducted on a plurality of upward and downward lines in the direction of the y-axis or the x-axis.
  • Pattern matching is conducted as follows. First, on each line on the y-axis, a window of the same size as that of the first pattern 301 is set for each pixel on the x-axis, and a value of correlation of each window with the first pattern 301 is calculated. Further, the value of correlation of each window is plotted on a graph, and a window, the value of correlation of which is highest, is realized as the second pattern which is most correlated with the first pattern 301 .
  • A value of correlation is found by calculation in which a well-known cross correlation function is used. One example is the method of SSDA (Sequential Similarity Detecting Algorithm), in which the sum of the absolute differences between the pixel values of the window and those of the first pattern is calculated, and the window giving the smallest sum is taken as the most correlated.
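The window scan and SSDA scoring described above can be sketched as follows. This is a minimal illustration under stated assumptions: the variable names are hypothetical, and the scan simply reports the best (lowest-score) window position per line, as in the text.

```python
import numpy as np

# Minimal SSDA (sum of absolute differences) matcher: a window of the
# first pattern's size is scanned along several upward and downward
# lines of the right image, and the x position with the lowest score
# on each line is reported. Names and parameters are illustrative.

def ssda(window, pattern):
    """SSDA score: sum of absolute pixel differences (lower = better)."""
    return np.abs(window.astype(int) - pattern.astype(int)).sum()

def match_on_lines(right_img, pattern, y1, xm, xn, n_lines=5):
    """Return {y: (best_x, best_score)} for n_lines centred on y1."""
    h, w = pattern.shape
    half = n_lines // 2
    results = {}
    for y in range(y1 - half, y1 + half + 1):  # e.g. lines 311..315
        best_x, best = None, None
        for x in range(xm, xn - w + 1):        # preset search range on x
            score = ssda(right_img[y:y + h, x:x + w], pattern)
            if best is None or score < best:
                best_x, best = x, score
        results[y] = (best_x, best)
    return results
```

Running this on a right image containing the first pattern returns, for each of the five lines, the x coordinate where the SSDA score bottoms out; the per-line minima correspond to the peaks of the graphs 321 to 325 in FIG. 3B.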
  • The graph 320 in FIG. 3B shows the values of correlation.
  • Graphs 321 to 325 shown in the graph 320 of values of correlation respectively correspond to the lines 311 to 315 of the right image 310 .
  • From the graph 320, it can be judged that the highest correlation is obtained at the coordinate x2 on the x-axis in the graph 323.
  • Therefore, the second pattern, which is most correlated with the first pattern 301 in the right image 310, is the pattern indicated by reference numeral 331 in the right image 330 after the completion of correlation processing, and its coordinates are (x2, y2).
  • Ideally, all the graphs have a peak value at the coordinate x2.
  • When, for example, the graph 325 has a peak value at a different coordinate x3, coordinates other than x2 can be excluded. That is, it is possible to prevent range finding errors by examining the state of the plurality of graphs (the state of the distribution of values of correlation).
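The peak-consistency check across the plurality of lines can be sketched as follows. The majority rule and the tolerance parameter are assumptions for illustration; the patent only states that an outlying peak (such as x3 on graph 325) can be excluded.

```python
from collections import Counter

# Sketch of the peak-consistency check: the peak x-coordinate found
# on each upward/downward line is compared, and outlying peaks are
# excluded before the match is accepted. The majority rule and the
# tolerance are assumed parameters.

def consistent_peak(peaks_per_line, tolerance=1):
    """Return the majority peak x if enough lines agree, else None.

    peaks_per_line: list of peak x-coordinates, one per line.
    """
    counts = Counter(peaks_per_line)
    x_major, _ = counts.most_common(1)[0]
    agreeing = sum(1 for x in peaks_per_line if abs(x - x_major) <= tolerance)
    # require a majority of the lines to agree on (or near) one peak
    return x_major if agreeing > len(peaks_per_line) // 2 else None

print(consistent_peak([120, 120, 120, 120, 133]))  # one outlier -> 120
print(consistent_peak([100, 120, 133, 90, 57]))    # no agreement -> None
```

A confusing repetitive texture tends to scatter the per-line peaks, so requiring agreement across lines is what lets the matcher reject such patterns rather than report a spurious parallax.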
  • The parameter setting section 24 sets various parameters relating to the correlation processing in the correlation processing section 27 and to the pattern extracting section 26, according to the density difference between the right and the left image calculated by the density difference calculating section 23.
  • For example, a threshold value for extracting an edge, used in the pattern extracting section 26, and a threshold value for judging correlation coincidence, used in the correlation processing section 27, are set according to the density difference.
  • The parallax calculating section 28 finds the parallax of the range finding target (vehicle) 100 from the coordinates (x1, y1) of the first pattern 301 in the left image 300 shown in FIG. 3A and the coordinates (x2, y2) of the second pattern 331 in the right image 330 shown in FIG. 3B.
  • The parallax can be expressed by the following expression: parallax = x1 − x2.
  • The range finding section 29 finds the range between the range finding target 100 and the self-vehicle according to the parallax calculated by the parallax calculating section 28 and sends the thus obtained range data to the driver support device 40.
  • In this way, data of the distance to the range finding target can be sent to the driver support device 40.
  • In the above explanation, range finding data are found for one portion of the image; however, range finding data may be found simultaneously for a plurality of portions of the image.
  • Further, range finding data may be found for a plurality of portions of the range finding target and an average value may be found from the plurality of data, so that the average value can be used as the range finding data of the range finding target 100.
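Averaging the range data of several portions, as described above, might be sketched like this; representing an invalid portion as None is an assumption for illustration:

```python
def fuse_ranges(ranges):
    """Average the valid (non-None) range measurements taken at several
    portions of the same target; return None if no portion produced a
    valid measurement."""
    valid = [r for r in ranges if r is not None]
    return sum(valid) / len(valid) if valid else None
```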
  • The left image memory 21 and the right image memory 22 can be respectively realized by frame memories.
  • The density difference calculating section 23, parameter setting section 24, image correcting section 25, pattern extracting section 26, correlation processing section 27, parallax calculating section 28, range finding section 29 and diagnosis section 30 may each be realized by a separate processing circuit. Alternatively, they may be realized when programs for conducting the calculations of the processing sections are successively executed in a computer having a CPU and various memories.
  • The y-coordinate of the second pattern 331, which is most correlated in the right image 330 in FIG. 3B, is sampled once every several seconds, so that misalignment of the y-coordinate can be detected.
  • The positions of the five upward and downward lines, on which correlation processing is conducted, may be corrected by utilizing the thus detected misalignment.
  • The axial misalignment of the right and left images can be compensated by this correction.
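The periodic y-coordinate sampling reduces to a one-line update of the vertical offset applied when placing the five candidate lines; this is an illustrative sketch, not the patent's implementation:

```python
def update_line_offset(y_first, y_best_match, current_offset):
    """Called once every several seconds: the vertical difference between
    the first pattern's line and the line of its best match in the other
    image reveals axial misalignment, which is folded into the offset
    used to position the upward and downward correlation lines."""
    return current_offset + (y_best_match - y_first)
```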
  • FIG. 2B is a block diagram showing an outline of another range finder 10 of the present invention. Like reference characters are used to indicate like parts in FIGS. 2A and 2B. The points at which the range finder shown in FIG. 2B differs from that shown in FIG. 2A are as follows. In the range finder shown in FIG. 2B, after the first pattern 301 has been extracted by the pattern extracting section 26 from the input image photographed by the left camera 11, only the thus extracted first pattern 301 is corrected in the pattern correcting section 31 by using the density difference data calculated by the density difference calculating section 23. In the range finder shown in FIG. 2B, the entire image, or the image in a specific region, is not corrected as in the range finder shown in FIG. 2A.
  • The correlation processing section 27 divides an inputted image into two regions, the background region 401 and the proximity region 402, by the boundary line 400, and the range finding method is changed for each region. Only when the first pattern extracted by the pattern extracting section 26 exists in the background region 401 is correlation processing conducted on the five upward and downward lines. When the first pattern exists in the proximity region 402, another, simpler correlation processing is conducted. The reason is that, in general, correlation processing can be easily conducted in the proximity region 402, whereas it cannot be easily conducted in the background region 401. In this connection, the position of the boundary line 400 is appropriately determined according to the system. As the simple correlation processing, it is possible, for example, to conduct pattern matching on the same y-coordinate as that of the first pattern.
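The region-dependent choice of correlation lines can be sketched as follows; the assumption that smaller y-coordinates (the upper part of the image) belong to the background region, and the names used, are illustrative only:

```python
def candidate_lines(y, boundary_y, spread=2):
    """Return the image lines on which correlation is attempted: in the
    background region (above the boundary line, i.e. smaller y) the
    pattern's own line plus `spread` lines above and below it (five lines
    for spread=2); in the proximity region, only the single line with the
    same y-coordinate, i.e. the simple matching."""
    if y < boundary_y:                       # background region 401
        return list(range(y - spread, y + spread + 1))
    return [y]                               # proximity region 402
```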
  • Reference numeral 500 shown in FIG. 5 is a graph showing a relation between the y-coordinate in the inputted image and the distance D from the right camera 12 and the left camera 11.
  • The curve shown by reference numeral 502 is a relation showing the road surface position in the inputted image.
  • The curve shown by reference numeral 501 is a relation showing a predetermined height (a reference value, for example 20 cm) from the road surface.
  • The line 511 shows the road surface position corresponding to the relation 501.
  • The region 503 corresponds to an object the height of which is not less than the reference value of height from the road surface position.
  • From the graph 500 shown in FIG. 5, it is possible to judge whether or not the height of the range finding target is not less than the reference value of height, from the position (y-coordinate) of the range finding target in the inputted image and from the result of range finding (D).
  • The realization of the height of the range finding target may be conducted by a height realizing means independently provided in the image processing section 13, or by the range finding section 29.
  • When the range finding target corresponds to the region 503, the result of range finding of the target is made valid.
  • When the range finding target corresponds to the other regions, it is judged to be a road surface, a white line, a road surface character or the like, which is not a proper range finding target, and the result of range finding is made invalid.
  • The above invalidation of the result of range finding may be conducted by an invalidating means independently arranged in the image processing section 13.
  • Alternatively, the above invalidation of the result of range finding may be conducted by the range finding section 29.
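The validity judgment based on the graph 500 can be sketched as below; `ref_distance` stands for the reference-height relation (curve 501) giving the distance at which an object of the reference height appears at image row y, and is an assumed interface:

```python
def validate_range(y, d, ref_distance):
    """Keep the range result only when the measured distance D places the
    target at or beyond the reference-height relation at image row y,
    i.e. the target is at least the reference height above the road;
    otherwise nullify it as a road-surface feature (shadow, white line,
    character on the road)."""
    return d if d >= ref_distance(y) else None
```

In practice `ref_distance` would be derived from the stored road-surface relation after the correction procedure of FIG. 7.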
  • The graph 600 shown in FIG. 6 shows the relation between the position (y-coordinate) in the inputted image and the distance D from the left camera 11 and the right camera 12.
  • The curve shown by reference numeral 601 is a relation showing a previously set road surface position.
  • The road surface position changes variously according to the vibration of the vehicle and the environmental conditions of the road. Therefore, it is preferable that the relation showing the road surface position is corrected by using the inputted image.
  • A procedure for correcting the relation showing the road surface position will be explained as follows.
  • First, a position of the white line (the white line 101 of the image 610 in FIG. 6) on the road surface is realized from the input image, and its coordinates are extracted.
  • In step 702, range finding is conducted on one portion (for example, the portion shown by reference numeral 611 in the image 610) of the forward end portion of the realized white line.
  • In step 703, it is judged whether or not the position of the range finding result is too close compared with a previously set range.
  • In step 704, it is judged whether or not the position of the range finding result is too distant compared with the previously set range.
  • When the position is neither too close nor too distant, the program proceeds to step 705, and the range finding result is judged to be valid.
  • Otherwise, in step 706, the result of range finding is judged to be invalid. Therefore, the result of range finding is not used for correction.
  • In the next step 707, it is judged whether or not the number of portions which have been judged to be valid is larger than P, which has been set in step 701.
  • When the number is not larger than P, the procedures in steps 702 to 706 are repeated. For example, in the image 610 shown in FIG. 6, range finding is conducted on the four portions of the points 611 to 614.
  • When the judgment in step 707 is affirmative, the program proceeds to step 708.
  • In step 708, a correction value for correcting the relation 601 showing the road surface position is calculated.
  • In step 709, the relation 601 showing the road surface position is corrected.
  • For example, the distance (A) is found from the y-coordinate of each point and the relation 601 in the graph 600 shown in FIG. 6: the distance (A) is 20 m at the point 611, 5 m at the point 612, 20 m at the point 613 and 5 m at the point 614.
  • The relation 605, showing a position higher than the reference value of height from the road surface position, is corrected according to the correction of the relation 602, and the range 604 in which correlation processing is conducted is changed in step 710. In this way, the series of procedures is completed.
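Steps 702 to 710 can be condensed into the following sketch; `relation_a` plays the role of the stored relation 601, and the numeric limits (the too-close/too-far window, the minimum count P, and the 0.8 to 1.2 band of the graph 603) are placeholders, not values taken from the patent:

```python
def correct_road_relation(points, relation_a, p_min=3,
                          too_close=2.0, too_far=80.0,
                          ratio_lo=0.8, ratio_hi=1.2):
    """`points` is a list of (y, measured_B) pairs taken at white-line
    end portions; relation_a(y) gives the distance A predicted by the
    stored road-surface relation. Measurements outside the
    too-close/too-far window are discarded (steps 703 to 706); once at
    least p_min valid points remain (step 707), the mean ratio B/A,
    kept within [ratio_lo, ratio_hi], is returned as the correction
    factor (steps 708 and 709). Returns None when too few points are
    valid, in which case more measurements would be taken."""
    ratios = []
    for y, b in points:
        if too_close < b < too_far:            # valid measurement
            ratios.append(b / relation_a(y))
    if len(ratios) < p_min:                    # not enough valid portions
        return None
    r = sum(ratios) / len(ratios)
    return min(max(r, ratio_lo), ratio_hi)     # stay within graph 603's band
```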
  • The above procedure shown in FIG. 7 may be conducted by a road surface correcting means independently provided in the image processing section 13.
  • Alternatively, the above procedure shown in FIG. 7 may be conducted by the range finding section 29.
  • In the above explanation, the road surface position is corrected by using a white line on the road surface.
  • However, it is also possible to correct the relation showing the road surface position by utilizing any pattern in the inputted image which is white and whose length in the direction of the y-axis is somewhat large.
  • An example of such a pattern is characters drawn on the road surface.
  • It is preferable that the correction of the relation showing the road surface position is kept within a predetermined range (the range of the graph 603 in the graph 600 shown in FIG. 6, for example, a range of ratios (B/A) from 0.8 to 1.2).
  • According to the present invention, it is unnecessary to provide a large-scale correction circuit and memory for conducting distortion correction of an inputted image. Accordingly, the size and cost of the range finder can be reduced.

Abstract

A range finder, capable of finding a range by image realization, having two imaging devices arranged at a distance from each other. A first pattern having a predetermined size and first positional information is extracted by a pattern extracting section from a first image of a target made by the first imaging device. In a second image of the same target, made by the second imaging device, a second pattern having second positional information, which is most correlated with the first pattern, is detected by a correlation processing section on a plurality of upward and downward horizontal or vertical lines corresponding to the first positional information. Parallax is found from the first and the second positional information by a parallax calculating section. A distance to the target is found from this parallax by using the principle of triangulation.

Description

CROSS-REFERENCE TO RELATED APPLICATION
This application claims priority of Japanese Patent Application No. 2001-186779, filed on Jun. 20, 2001.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a range finder used for supporting a driver when the driver drives an automobile. More particularly, the present invention relates to a range finder for finding a range to a target, which is running in front, by image realization, when a plurality of cameras or image pickup elements are used.
2. Description of the Related Art
In order to enhance the convenience and safety of driving an automobile, driver support systems have recently been put into practical use. In such a driver support system, range finding, to find a range to a vehicle running in front, is one of the functions to be provided. Until now, a range to a vehicle running in front has been detected by a range finding system in which the range is found by image realization with compound-eye cameras.
In the above conventional range finding system in which compound-eye cameras are used, a range to a vehicle running in front is found as follows. A vehicle running in front is photographed by two cameras, which are mounted at a predetermined interval on a vehicle running behind, or by image pickup elements such as image sensors. The parallax of the same object (the vehicle running in front) on the thus obtained two images is utilized, and a range to the vehicle running in front is found from this parallax by the principle of triangulation.
On the other hand, in some cases, distortion is caused in the image obtained by each camera when the optical system is distorted. When the obtained image is distorted, it is impossible to accurately calculate the parallax, and range finding is erroneously conducted. In order to solve this problem, a distorted image photographed by each camera is corrected by a distortion correcting circuit so that it can be processed into an image having no distortion, and then a range to a target is found.
However, in the above conventional range finding method, as the entire image photographed by each camera must be corrected, it becomes necessary to provide a large-scale distortion correcting circuit and a memory to be incorporated into the distortion correcting circuit. Further, since the entire image must be corrected, a quantity of processing necessary for correction is increased, which causes a drop in the processing speed. Furthermore, even if the distortion is corrected, a fluctuation is caused in the pixel values of the images photographed by the right and left cameras. Therefore, it is difficult to accurately calculate the parallax. Furthermore, the above conventional range finding is disadvantageous in that a road surface (shadows, white lines and characters on the road) becomes an object of range finding in some cases.
SUMMARY OF THE INVENTION
The present invention has been accomplished to solve the above problems of the conventional range finding method for finding a range to a target by image realization. It is an object of the present invention to provide a range finder capable of finding a range to a target at high speed, without range-finding errors caused by distortion of an image, and without providing a large-scale correcting circuit and memory.
It is another object of the present invention to provide a range finder capable of accurately finding a range to a target by correcting a fluctuation caused between images photographed by a right and a left camera.
It is still another object of the present invention to provide a range finder capable of accurately finding a range to a target without having an error in range-finding caused by a road surface.
In order to solve the above problem, the present invention provides a range finder for finding a range to a target by image realization comprising: a first and a second imaging device arranged at a predetermined interval; a pattern extracting section for extracting a first pattern having a predetermined size and the first positional information from a first image of the target which has been made by the first imaging device; a correlation processing section for detecting a second pattern having the second positional information, which is most correlated with the first pattern, from a plurality of horizontal or vertical lines located at positions corresponding to the first positional information in the second image of the target which has been made by the second imaging device; and a parallax calculating section for finding parallax from the first and the second positional information. When a correlation is found by a plurality of upward and downward lines, it is possible to accurately find parallax without conducting correction with respect to the distortion and axial misalignment.
In a range finder of the present invention, the correlation processing section finds a correlation with the first pattern for each of a plurality of horizontal or vertical lines, and the second pattern having the second positional information, which is most correlated with the first pattern, is detected according to the plurality of correlation values which have been found. A correlation is found on a plurality of upward and downward lines, and when it is judged how far the correlating position on each line departs from a vertical line, a pattern can be accurately realized even if other confusing patterns exist.
In a range finder of the present invention, it is preferable that the first image is divided into a proximity region, in which correlation can be easily found, and a background region, in which correlation is difficult, and that the correlation processing section finds a correlation with the first pattern for each of a plurality of horizontal or vertical lines only when the first pattern exists in the background region. In this constitution, only when the pattern is realized in the background region, in which it is difficult to correlate, is a correlation found on a plurality of upward and downward lines.
It is preferable that a range finder of the present invention further comprises an image correcting section for detecting a state of misalignment of the first or the second image according to the correlation of the first pattern with the second pattern, found by the correlation processing section for each of the plurality of horizontal or vertical lines, and for correcting the first or the second image according to the detected state of misalignment.
Further, it is preferable that a range finder of the present invention further comprises an alarm generating section for comparing the value of correlation, obtained when the correlation processing section detects the second pattern, with a correlation reference value, and for generating an alarm when the value of correlation is not more than the correlation reference value. When the distortion of an image plane or the misalignment of an axis is large and it is impossible to accurately conduct range finding, an alarm is generated.
In order to solve the above problems, a range finder for finding a range to a target by image realization of the present invention comprises: a first and a second imaging device arranged at a predetermined interval; a density difference detecting section for finding a density difference between the first image and the second image from the first image of the target, which has been made by the first imaging device, and the second image of the target which has been made by the second imaging device; an image density correcting section for correcting density of the first or the second image according to the density difference between the first and the second image; a pattern extracting section for extracting a first pattern having a predetermined size and first positional information from the first image; a correlation processing section for detecting a second pattern having the second positional information, which is most correlated with the first pattern, in the second image of the target which has been made by the second imaging device; and a parallax calculating section for finding parallax from the first and the second positional information. When a difference of density between the right and the left image is found and one of the images is corrected by utilizing the difference in density, it becomes possible to accurately conduct range finding.
In order to solve the above problems, a range finder for finding a range to a target by image realization of the present invention comprises: a first and a second imaging device arranged at a predetermined interval; a density difference detecting section for finding a density difference between the first image and the second image from the first image of the target, which has been made by the first imaging device, and the second image of the target which has been made by the second imaging device; a pattern extracting section for extracting a first pattern having a predetermined size and first positional information from a first image; an image density correcting section for correcting density of the first pattern according to the density difference between the first and the second image; a correlation processing section for detecting a second pattern having the second positional information which is most correlated with the corrected first pattern in the second image; and a parallax calculating section for finding parallax from the first and the second positional information. When a difference of density between the right and the left image is found and only the pattern is corrected by utilizing the difference of density, it becomes possible to accurately conduct range finding.
It is preferable that a range finder of the present invention further comprises a parameter setting section for setting a parameter necessary for processing conducted by the correlation processing section according to a density difference between the first and the second image. When the parameter necessary for correlation processing is changed by the difference of density found from the right and the left image, range finding can be more accurately conducted.
In order to solve the above problems, a range finder for finding a range to a target by image realization of the present invention comprises: a first and a second imaging device arranged at a predetermined interval; a pattern extracting section for extracting a first pattern having the first positional information containing a range finding target from the first image of the target which has been made by the first imaging device; a correlation processing section for detecting a second pattern having the second positional information, which is most correlated with the first pattern in the second image of the target which has been made by the second imaging device; a parallax calculating section for finding parallax from the first and the second positional information; a range finding section for finding a range to the range finding target by the parallax; a realizing section for realizing a height of the range finding target according to the position in the first or the second image of the range finding target and according to the result of range finding conducted by the range finding section; and a nullifying section for nullifying the result of range finding conducted by the range finding section in the case where the height of the range finding target is smaller than the reference height. The height of the range finding target is realized by the position in the image, the range of which was found, and by the result of range finding. When the height is not more than a predetermined height from the ground, the target of range finding is realized as characters and shadows on a road, and the result of range finding is nullified.
It is preferable that a range finder of the present invention further comprises a road surface position correcting section for detecting a white line in the first or the second image and for finding a range to a forward end portion of the white line and for correcting a road face position, which becomes a reference to realize the height of the range finding target, from the value of range finding of the forward end portion of the white line. When the position of the ground in the image is corrected by the result of realization of a white line on a road, it becomes possible to judge a height agreeing with the road environment.
A range finder of the present invention further comprises a road surface correcting section for detecting a third pattern having the same characteristic as that of the white line from the first or the second image and for finding a range to the forward end portion of the third pattern and for correcting a road face position, which becomes a reference to realize the height of the range finding target, from the value of range finding of the forward end portion of the third pattern. Another pattern such as a character on a road surface having the same characteristic as that of a white line on the road surface is extracted from an image, and its forward end portion is used as a reference and the position of the ground in the image is corrected. Due to the foregoing, it becomes possible to judge a height agreeing with the road environment.
In a range finder of the present invention, it is preferable that the reference value correcting section corrects a reference value according to a plurality of range finding values of the forward end portion of the third pattern. The reference value is corrected according to data sent from a plurality of positions.
Further, in a range finder of the present invention, it is preferable that the reference value correcting section corrects the road surface position by utilizing only a range finding value in a predetermined range of values in the plurality of range finding values of the forward end portion of the third pattern. Without using data outside the predetermined range of values, the reference value is more accurately corrected.
In order to solve the above problems, a range finder for finding a range to a target by image realization of the present invention comprises: a first and a second imaging device arranged at a predetermined interval; a pattern extracting section for extracting a first pattern having a predetermined size and the first positional information containing the range finding target from the first image of the target which has been made by the first imaging device; a correlation processing section for detecting a second pattern having the second positional information, which is most correlated with the first pattern in the second image of the target which has been made by the second imaging device; a parallax calculating section for finding parallax from the first and the second positional information; a range finding section for finding a range to the range finding target by the parallax; a judging section for judging whether or not the range finding target exists in a range finding objective region according to the result of range finding to find a range to the target and also according to the position of the range finding target in the first or the second image; and a nullifying section for nullifying the result of range finding conducted by the range finding section in the case where the range finding target exists outside the range finding objective region. When range finding is conducted only in a predetermined range, the processing time of range finding can be reduced.
A range finder further comprises a height realizing section for realizing a height of the range finding target according to the position of the range finding target in the first or the second image, wherein the nullifying section does not nullify the result of range finding conducted by the range finding section in the case where the height is larger than a predetermined reference value of the height even if the range finding target exists outside the objective region of range finding. In the case where the height is a predetermined value although it is outside the range finding objective region, it becomes possible to more accurately realize a range finding target when it is made to be a range finding object.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will be more clearly understood from the description as set forth below with reference to the accompanying drawings, wherein
FIG. 1A is a view for explaining the principle of a conventional range finding system in which compound-eye cameras are used;
FIG. 1B is a view showing images photographed by a right and a left camera shown in FIG. 1A for explaining the principle of parallax range finding;
FIG. 1C is a view showing an image containing distortion made by a camera and also showing an image obtained when the image distortion has been corrected by a distortion correcting circuit;
FIG. 2A is a block diagram showing an outline of the constitution of a range finder of an embodiment of the present invention;
FIG. 2B is a block diagram showing an outline of the constitution of a range finder of another embodiment of the present invention;
FIG. 3A is a view showing a left input image photographed by a left camera and also showing a first pattern in the image;
FIG. 3B is a view showing a right input image photographed by a right camera, a graph of correlated values and a second pattern which is most correlated with the first pattern in FIG. 3A extracted by the correlation processing;
FIG. 4 is a view for explaining the classification of an input image into a background region and proximity region;
FIG. 5 is a view showing an actual input image and a relation in which a road surface position of the input image is shown, wherein the actual input image and the relation are shown standing side by side;
FIG. 6 is a view for explaining a correction of a relation showing a road surface position, wherein four correcting positions in an input image and a relation showing a road surface position of the input image after correction are shown standing side by side; and
FIG. 7 is a flow chart for explaining a correction procedure of a relation showing a road surface position.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
Before describing the preferred embodiments, an explanation will be given to the conventional range finder for finding a range by image realization shown in FIGS. 1A to 1C.
In the conventional range finding system for finding a range by image realization, range finding is conducted by using compound-eye cameras. Referring to FIG. 1A, an explanation will be given of a case in which a distance from a vehicle, which is running behind, to a vehicle 100, which is running in front, is found by the conventional range finding system using compound-eye cameras. As shown in FIG. 1A, the left camera 801 and the right camera 800 are horizontally arranged in the vehicle, which is running behind, at a predetermined interval (=base line length) B. The optical axis of the left camera 801 and that of the right camera 800 are horizontally arranged. In this case, the same object in the image photographed by the left camera 801 and that in the image photographed by the right camera 800 are shifted from each other horizontally. This misalignment between the position of the object in the image photographed by the left camera 801 and that in the image photographed by the right camera 800 is referred to as parallax. The distance from the vehicle which is running behind to the vehicle which is running in front can be found by the principle of triangulation in which this parallax is utilized.
The image 901 shown in FIG. 1B is an example of the image photographed by the left camera, and the image 902 shown in FIG. 1B is an example of the image photographed by the right camera. Distance D (m) to a vehicle running in front can be found by the following expression.
D=f·B/{F·(xb−xa)}
In this connection, (xb−xa) expresses parallax 903. In the above expression, the left end of each image photographed by each camera is used as a reference, the lateral coordinate to a reference position of the image 901 photographed by the left camera is xb, the lateral coordinate to a reference position of the image 902 photographed by the right camera is xa, the focal distance of each camera is f, the pixel pitch in each image is F, and the length of a base line is B.
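The expression above translates directly into code; the numeric defaults for f, B and F below are illustrative values only, not taken from the patent:

```python
def range_from_parallax(xb, xa, f=0.008, B=0.35, F=1.0e-5):
    """Distance D = f*B / (F*(xb - xa)): f is the focal distance and F
    the pixel pitch, both in metres, B is the base line length in metres,
    and (xb - xa) is the parallax in pixels between the left-camera and
    right-camera image coordinates of the same object."""
    parallax = xb - xa
    return f * B / (F * parallax)
```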
In some cases, there is a possibility that distortion is caused in the image photographed by each camera due to distortion in the optical system. Distortion is high especially when a wide angle lens is used for the camera. When distortion is caused in an image, it becomes impossible to accurately calculate parallax, which causes an error in range finding. In order to solve the above problems, as shown in FIG. 1C, the entire image 1001 photographed by each camera containing distortion is processed into the image 1002 having no distortion by the distortion correcting circuit 1003, and then the aforementioned parallax (parallax 903 shown in FIG. 1B) is calculated.
However, the above conventional method of range finding has the following disadvantages. According to the above conventional method of range finding, as the entire image photographed by each camera must be corrected, it is necessary to provide a large-scale distortion correcting circuit and a memory used for the distortion correcting circuit. Since the entire image must be corrected, a quantity of processing of image data is increased, which causes a drop of the processing speed. Further, even if the distortion is corrected, fluctuation is caused in the pixel values of the images photographed by the right and the left camera. Accordingly, it is difficult to accurately calculate parallax. Furthermore, when range finding is conducted, a road face, on which shadows, white lines and characters exist, becomes an object of range finding.
FIG. 2A is a block diagram showing an outline of the constitution of an embodiment of the range finder 10 of the present invention. The range finder 10, which is mounted on a vehicle, includes a left camera 11, right camera 12 and image processing section 13. Various signals are inputted from the range finder 10 into the driver support device 40 mounted on the vehicle. According to these signals, the driver support device 40 realizes a target such as a vehicle running in front or another object existing ahead and, according to the realization of the target, helps evade danger and warns the driver.
The image processing section 13 includes: a left image memory 21, right image memory 22, density difference detecting section 23, parameter setting section 24, image correcting section 25, pattern extracting section 26, correlation processing section 27, parallax calculating section 28, range finding section 29 and diagnosis section 30.
The left camera 11 and the right camera 12 are arranged in the vehicle on which the range finder 10 is mounted (referred to as the self-vehicle hereinafter), at the same height from the ground and at a predetermined interval, so that parallax is generated between the visual fields of the cameras with respect to an object in the field of view. The left camera 11 photographs the space in its field of view, and the data of the left image are stored in the left image memory 21. In the same manner, the right camera 12 photographs the space in its field of view, and the data of the right image are stored in the right image memory 22.
The left image 300 shown in FIG. 3A is an example of the left image photographed by the left camera 11, and the right image 310 shown in FIG. 3B is an example of the right image photographed by the right camera 12. In the left image 300 and the right image 310, reference numeral 100 is a vehicle running in front, and reference numeral 101 is a white line on the road. In this embodiment, each image is composed of 640 pixels in the direction of the x-axis and 480 pixels in the direction of the y-axis in the drawing, and each pixel has a density of 256 gradations.
The density difference detecting section 23 calculates the average image density of the inputted right and left images and further calculates the difference in density between the right and the left image. The calculated density difference data are sent to the image correcting section 25 and the diagnosis section 30.
The image correcting section 25 receives the density difference data from the density difference detecting section 23 and corrects the density of the entire left image according to the density difference data. That is, if a density difference exists between the images inputted from the right and left cameras, the pattern matching described later cannot be conducted accurately, so that there is a possibility that parallax, and in turn range finding, cannot be calculated accurately. For these reasons, the density balance between the right and left images is corrected. In this connection, the density of the entire left image need not be corrected; density may be corrected only in a specific region of the image which has been previously determined or detected.
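As a minimal sketch of this density balancing, the left image may be shifted so that its average density matches the right image's. This is an illustrative form of the correction, not the patent's exact circuit.

```python
import numpy as np

def balance_density(left, right):
    """Shift the left image's grey levels so its average density matches
    the right image's; returns the corrected image and the density
    difference data that would be sent to sections 25 and 30."""
    diff = float(right.mean()) - float(left.mean())   # density difference data
    corrected = np.clip(left.astype(np.float64) + diff, 0, 255)
    return corrected.astype(np.uint8), diff
```

The same difference value can also serve the diagnosis section: a difference above a threshold suggests a defective camera.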
In the case where the density difference received by the diagnosis section 30 is higher than a predetermined value, the diagnosis section 30 judges that one or both of the right and left cameras are defective and sends an alarm signal to the driver support device 40.
The pattern extracting section 26 extracts the first pattern 301, which has the coordinate data (x1, y1), from the left image 300. It is preferable that the first pattern 301 is composed of 3×3 pixels or 4×4 pixels and that it is extracted, by means of edge detection, from an edge section of the range finding target (the vehicle 100 in the case of the left image 300).
The correlation processing section 27 detects, by means of pattern matching, the second pattern most correlated with the first pattern 301 extracted from the left image 300 by the pattern extracting section 26, from the right image 310, so that the coordinate data of the second pattern can be detected. Pattern matching of the first pattern 301 with the right image 310 is conducted on the five upward and downward lines (the lines 311 to 315 in FIG. 3B) around the y-coordinate (y1) of the first pattern 301, that is, among five pixels in the direction of the y-axis. It is preferable that pattern matching is conducted among five upward and downward lines; however, other numbers of lines may be adopted if necessary. Further, the range on the x-axis over which pattern matching is conducted can be previously set as the range from xm to xn around the x-coordinate (x1) of the first pattern 301. In this connection, in this embodiment pattern matching is conducted on a plurality of upward and downward lines in the direction of the y-axis; however, it may instead be conducted on a plurality of lines in the direction of the x-axis.
According to the conventional method, pattern matching is conducted after distortion of the right and left images or misalignment of the axes has been corrected. In this embodiment, however, no such correction is conducted; instead, pattern matching is conducted on a plurality of upward and downward lines in the direction of the y-axis or the x-axis. As described above, even if the inputted right and left images are not corrected for distortion, parallax can be found accurately when a plurality of upward and downward lines are correlated. Accordingly, a large-scale correcting circuit and memory for correcting distortion over the entire image plane become unnecessary, and the second pattern can be detected without the processing time required for such a correction.
Pattern matching is conducted as follows. First, on each line on the y-axis, a window of the same size as the first pattern 301 is set for each pixel on the x-axis, and a value of correlation of each window with the first pattern 301 is calculated. The value of correlation of each window is plotted on a graph, and the window whose value of correlation is highest is realized as the second pattern most correlated with the first pattern 301. In this connection, the value of correlation is found by calculation using a well known cross correlation function. One example is the SSDA (Sequential Similarity Detection Algorithm), in which ∫∫|f−t| dxdy is used as a measure of matching.
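A discrete version of this search can be sketched as follows, with a sum of absolute differences in place of the SSDA integral (so a lower score means a higher correlation). The window size and search ranges are illustrative assumptions.

```python
import numpy as np

def match_pattern(first_pattern, right_image, y1, xm, xn):
    """Slide a window of the first pattern's size along the five lines
    y1-2 .. y1+2 of the right image, score each window by the SSDA-style
    sum of absolute differences, and return the best (x, y, score)."""
    h, w = first_pattern.shape
    tmpl = first_pattern.astype(np.int32)
    best = (-1, -1, np.inf)
    for y in range(y1 - 2, y1 + 3):          # the five upward/downward lines
        for x in range(xm, xn - w + 1):      # preset range xm..xn on the x-axis
            window = right_image[y:y + h, x:x + w].astype(np.int32)
            score = int(np.abs(window - tmpl).sum())
            if score < best[2]:
                best = (x, y, score)
    return best
```

Because the search covers several lines, a vertical offset of a pixel or two between the cameras does not prevent the correct window from being found.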
FIG. 3B also shows a graph 320 of values of correlation. The curves 321 to 325 in the graph 320 respectively correspond to the lines 311 to 315 of the right image 310. According to the graph 320, the highest correlation is obtained at the coordinate x2 on the x-axis in the curve 323. Accordingly, the second pattern most correlated with the first pattern 301 in the right image 310 is the pattern indicated by reference numeral 331 in the right image 330 after the completion of correlation processing, and its coordinate is (x2, y2). In this connection, in the graph 320 the other curves have their peak value at the coordinate x2; therefore, even if the curve 325 has a peak value at the coordinate x3, coordinates other than x2 can be excluded. That is, the occurrence of an error of range finding can be prevented by considering the state of the plurality of curves (the distribution of values of correlation).
In this connection, the parameter setting section 24 can set various parameters relating to the correlation processing in the correlation processing section 27 and to the pattern extracting section 26 according to the density difference between the right and the left image calculated by the density difference detecting section 23. For example, it is preferable that the threshold value for edge extraction used in the pattern extracting section 26 and the threshold value for judging a correlation coincidence used in the correlation processing section 27 are set according to the density difference. Since the value of correlation becomes low when the density difference is large, lowering the threshold value of the value of correlation according to the density difference makes it possible to judge a coincidence of correlation accurately.
The parallax calculating section 28 finds the parallax of the range finding target (vehicle) 100 from the coordinate (x1, y1) of the first pattern 301 in the left image 300 shown in FIG. 3A and the coordinate (x2, y2) of the second pattern 331 in the right image 330 shown in FIG. 3B. In this case, the parallax can be expressed by the following expression.
Parallax=((x2−x1)^2+(y2−y1)^2)^(1/2)
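The expression is simply the Euclidean distance between the two pattern positions and can be sketched as:

```python
import math

def parallax(first_pos, second_pos):
    """Parallax = ((x2 - x1)**2 + (y2 - y1)**2) ** 0.5, the Euclidean
    distance between the first and second pattern coordinates."""
    (x1, y1), (x2, y2) = first_pos, second_pos
    return math.hypot(x2 - x1, y2 - y1)
```

For a purely horizontal shift the expression reduces to |x2 − x1|, the classical stereo disparity.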
The range finding section 29 finds the range between the range finding target 100 and the self-vehicle according to the parallax calculated by the parallax calculating section 28 and sends the obtained range data to the driver support device 40. In this way, data of the distance to the range finding target can be sent to the driver support device 40. In the above example, range finding data are found at one portion of the image; however, they may be found simultaneously at a plurality of portions. Alternatively, range finding data may be found at a plurality of portions of the range finding target and averaged, so that the average value can be used as the range finding data of the range finding target 100.
In FIG. 2A, the left image memory 21 and the right image memory 22 can each be realized by a frame memory. The density difference detecting section 23, parameter setting section 24, image correcting section 25, pattern extracting section 26, correlation processing section 27, parallax calculating section 28, range finding section 29 and diagnosis section 30 may each be realized by a separate processing circuit. Alternatively, they may be realized by successively executing programs for the calculations of the processing sections in a computer having a CPU and various memories.
In the range finder 10 shown in FIG. 2A, for example, the y-coordinate of the second pattern 331, which is most correlated in the right image 330 in FIG. 3B, is sampled once every several seconds, so that a misalignment of the y-coordinate can be detected. The positions of the five upward and downward lines on which correlation processing is conducted may then be corrected by utilizing the detected misalignment; the misalignment of the axes of the right and left images is thereby corrected.
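This periodic correction might be sketched as follows; the averaging over samples and the rounding to whole lines are assumptions for illustration.

```python
def line_offset(y_expected, y_samples):
    """Average the misalignment between the expected y-coordinate and the
    sampled y-coordinates of the best-matched second pattern; the integer
    result shifts the five lines used for correlation processing."""
    mean = sum(y - y_expected for y in y_samples) / len(y_samples)
    return round(mean)
```

Applying the returned offset recenters the search band so the true match stays within the five lines despite axis drift.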
FIG. 2B is a block diagram showing an outline of another range finder 10 of the present invention. Like reference characters are used to indicate like parts in FIGS. 2A and 2B. The points on which the range finder shown in FIG. 2B differs from that shown in FIG. 2A are as follows. In the range finder shown in FIG. 2B, after the first pattern 301 has been extracted by the pattern extracting section 26 from the input image photographed by the left camera 11, only the extracted first pattern 301 is corrected, in the pattern correcting section 31, by using the density difference data calculated by the density difference detecting section 23. Unlike the range finder shown in FIG. 2A, the entire image or a specific region of the image is not corrected; only the density of the extracted pattern is corrected, so the processing time can be shortened. The other points of operation are the same as those of the range finder 10 shown in FIG. 2A, and further detailed explanations are omitted here.
Next, referring to FIG. 4, operation of the correlation processing section 27 will be explained. As shown in FIG. 4, the correlation processing section 27 divides an inputted image by the boundary line 400 into two regions, the background region 401 and the proximity region 402, and the range finding method is changed for each region. Only when the first pattern extracted by the pattern extracting section 26 exists in the background region 401 is correlation processing conducted on the five upward and downward lines. When the first pattern exists in the proximity region 402, a simpler correlation processing is conducted instead. The reason is that, in general, correlation processing can be conducted easily in the proximity region 402 but not in the background region 401. In this connection, the position of the boundary line 400 is appropriately determined according to the system. As the simple correlation processing, one possibility is to conduct pattern matching only on the line at the same y-coordinate as the first pattern.
Next, a procedure of realizing the height of the range finding target 100 will be explained below referring to FIG. 5. Reference numeral 500 in FIG. 5 is a graph showing the relation between the y-coordinate in the inputted image and the distance D from the right camera 12 and the left camera 11. The curve shown by reference numeral 502 is a relation showing the road surface position in the inputted image, and the curve shown by reference numeral 501 is a relation showing a predetermined height (a reference value, for example 20 cm) above the road face. In the input image 510 shown in FIG. 5, the line 511 shows the road surface position corresponding to the relation 501. In the graph 500 shown in FIG. 5, the region 503 then corresponds to an object whose height above the road surface position is not less than the reference value.
Accordingly, when the graph 500 shown in FIG. 5 is utilized, it is possible to judge whether or not the height of the range finding target is at least the reference value from the position (y-coordinate) of the range finding target in the inputted image and the result of range finding (D). The realization of the height of the range finding target may be conducted by a height realizing means independently provided in the image processing section 13, or by the range finding section 29.
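A minimal sketch of this judgment follows. It assumes, consistent with the graph, that at a given image row a point above the road surface corresponds to a larger distance than a point on the road, so a measured range at or beyond curve 501 falls in region 503; the example curve is hypothetical.

```python
def height_at_least_reference(y, D, height_relation):
    """height_relation(y) returns the distance of a point at the reference
    height (e.g. 20 cm) above the road at image row y (curve 501).  A
    measured range D not less than this distance falls in region 503,
    i.e. the target stands at least the reference height above the road."""
    return D >= height_relation(y)
```

Targets failing this test can be treated as road-surface features (shadows, white lines, characters) and their range finding results invalidated.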
Only when the height of the range finding target is not less than the reference value (the case corresponding to the region 503) is the result made valid. When the range finding target corresponds to the other regions, it is judged to be not a true range finding target but a road face, white line, road face character or the like, and the result of range finding is made invalid. This invalidating processing of the result of range finding may be conducted by an invalidating means independently arranged in the image processing section 13, or by the range finding section 29.
Next, referring to FIGS. 6 and 7, correction of the relation showing the road surface position will be explained below. In the same manner as the graph 500 shown in FIG. 5, the graph 600 shown in FIG. 6 shows the relation between the position (y-coordinate) in the inputted image and the distance D from the left camera 11 and the right camera 12. The curve shown by reference numeral 601 is a relation showing a previously set road surface position. However, the road surface position changes variously according to the vibration of the vehicle and the environmental condition of the road. Therefore, it is preferable that the relation showing the road surface position is corrected by the inputted image.
Referring to FIG. 7, a procedure of correcting the relation showing the road surface position will be explained as follows. In step 701, first, the position of the white line (the white line 101 of the image 610 in FIG. 6) on the road face is realized from the input image and its coordinates are extracted. At this time, the number of range finding portions P is simultaneously designated; for example, P=4 portions is preferable.
Next, in step 702, range finding is conducted at one portion (for example, the portion shown by reference numeral 611 in the image 610) of the forward end portion of the realized white line. In step 703, it is judged whether or not the position of the range finding result is excessively close compared with a previously set range; if it is not, it is judged in step 704 whether or not the position is excessively distant. When the position is neither excessively close nor excessively distant, the program proceeds to step 705 and the range finding result is judged to be valid. On the other hand, when the position of the range finding result is judged excessively close in step 703, or excessively distant in step 704, it is judged in step 706 that the result of range finding is invalid, and the result is not used for correction.
In the next step 707, it is judged whether or not the number of portions judged to be valid has reached P, which was set in step 701. While the number of valid portions has not reached P, the procedures in steps 702 to 706 are repeated. For example, in the image 610 shown in FIG. 6, range finding is conducted at the four portions of the points 611 to 614.
On the other hand, in the case where it is judged in step 707 that the number of valid portions has reached P, the program proceeds to step 708. By utilizing the results of range finding conducted at those range finding portions, a correction value for correcting the relation 601 showing the road surface position is calculated, and in the successive step 709 the relation 601 is corrected. By utilizing the results of range finding conducted at a plurality of portions, the relation 601 can be corrected to the relation 602, which shows the road surface position more accurately. Specifically, it is preferable that the correction is conducted as follows: the average of the ratios (B/A) of the range finding value (B) obtained at the four points 611 to 614 to the distance (A) before correction at those points is found, and the relation 601 showing the road surface position is corrected to the relation 602 showing the new road surface position by this average.
For example, the distances (A) before correction at the four points are found from the graph 600 shown in FIG. 6 by the y-coordinate of each point and the relation 601: the distance (A) is 20 m at the point 611, 5 m at the point 612, 20 m at the point 613 and 5 m at the point 614. When the range finding values (B) at the four points are 22 m at the point 611, 6 m at the point 612, 22 m at the point 613 and 6 m at the point 614, the average of the ratios (B/A) is (1.1+1.2+1.1+1.2)/4=1.15. Therefore, a new relation 602 in which the correction is conducted by the factor 1.15 can be found.
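The averaging step can be sketched directly; note that the result is a dimensionless scale factor, not a length. With the four (A, B) pairs of the example the mean ratio works out to 1.15.

```python
def road_correction_factor(points):
    """points: (A, B) pairs, where A is the distance given by relation 601
    at the point's y-coordinate and B is the measured range finding value.
    The mean of the ratios B/A scales relation 601 into relation 602."""
    ratios = [B / A for A, B in points]
    return sum(ratios) / len(ratios)
```

Clamping the factor to a predetermined band (e.g. 0.8 to 1.2, as suggested below for the graph 603) would guard the correction against outlier measurements.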
Since the relation 602 showing the road surface position has been corrected, the relation 605, showing a position higher than the reference value of height from the road surface position, is corrected according to the correction of the relation 602, and the range 604 in which correlation processing is conducted is changed in step 710. In this way, the series of procedures is completed.
The above procedure shown in FIG. 7 may be conducted by a road surface correcting means independently provided in the image processing section 13, or by the range finding section 29. In the above example, the road surface position is corrected by using a white line on the road surface. Alternatively, the relation showing the road surface position can be corrected by utilizing a pattern on the inputted image plane which is white and somewhat long in the direction of the y-axis; an example of such a pattern is characters drawn on the road face.
It is preferable that the correction of the relation showing the road surface position is restricted to a predetermined range (the range of the graph 603 in the graph 600 shown in FIG. 6, for example, a range of ratios (B/A) from 0.8 to 1.2).
When the range 604 in which correlation is conducted, as changed in step 710, is utilized and correlation processing is conducted only on the range finding target corresponding to that range, the time spent on correlation processing can be reduced. Therefore, the range finding time can be greatly shortened and the occurrence of errors in range finding can be prevented.
As described above, according to the present invention, it is unnecessary to provide a large-scale correction circuit and memory for conducting distortion correction of an inputted image. Accordingly, the size and cost of the range finder can be reduced.
According to the present invention, it is possible to accurately conduct range finding of a target without conducting distortion correction on an inputted image.
Further, when a road surface position is corrected according to the present invention, it becomes unnecessary to conduct range finding on an unnecessary target. Therefore, range finding can be accurately conducted at high speed.

Claims (13)

What is claimed is:
1. A range finder for finding a range to a target by image realization comprising:
a first and a second imaging device arranged at a predetermined interval;
a pattern extracting section for extracting a first pattern having a predetermined size and first positional information from a first image of the target which has been made by the first imaging device, wherein the first image is divided into a proximity region, which can be easily correlated, and a background region which is difficult to be correlated;
a correlation processing section for detecting a second pattern having second positional information, which is most correlated with the first pattern, from a plurality of horizontal or vertical lines located at positions corresponding to the first positional information in the second imaging device wherein the second pattern having the second positional information, which is most correlated with the first pattern, is detected according to the plurality of pieces of correlation which have been found and wherein the correlation processing section finds a correlation with the first pattern for the plurality of horizontal or vertical lines only when the first pattern exists in the background region; and
a parallax calculating section for finding parallax from the first and the second positional information.
2. A range finder according to claim 1, further comprising:
an image correcting section for detecting a state of a misalignment of the first or the second image according to the correlation of the first pattern with the second pattern for a plurality of horizontal or vertical lines which has been found by the correlation processing section, and for correcting the first or the second image according to the state of detected misalignment.
3. A range finder according to claim 1, further comprising:
an alarm generating section for generating an alarm when a value of correlation, which is obtained in the case where the correlation processing section detects the second pattern, is compared with a correlation reference value and when the value of correlation is not more than a correlation reference value.
4. A range finder for finding a range to a target by image realization comprising:
a first and a second imaging device arranged at a predetermined interval;
a density difference detecting section for finding a density difference between a first image of the target made by the first imaging device and a second image of the target made by the second imaging device;
an image density correcting section for correcting density of the first or the second image according to the density difference between the first and the second images;
a pattern extracting section for extracting a first pattern having a predetermined size and first positional information from the first image;
a correlation processing section for detecting a second pattern having second positional information, which is most correlated with the first pattern, in the second image of the target; and
a parallax calculating section for finding parallax from the first and the second positional information.
5. A range finder according to claim 4, further comprising: a parameter setting section for setting a parameter necessary for processing conducted by the correlation processing section according to a density difference between the first and the second image.
6. A range finder for finding a range to a target by image realization comprising:
a first and a second imaging device arranged at a predetermined interval;
a density difference detecting section for finding a density difference between a first image of the target made by the first imaging device and a second image of the target made by the second imaging device;
a pattern extracting section for extracting a first pattern having a predetermined size and first positional information from the first image;
an image density correcting section for correcting density of the first pattern according to a density difference between the first and the second images;
a correlation processing section for detecting a second pattern having second positional information which is most correlated with the corrected first pattern in the second image; and
a parallax calculating section for finding parallax from the first and the second positional information.
7. A range finder for finding a range to a target by image realization comprising:
a first and a second imaging device arranged at a predetermined interval;
a pattern extracting section for extracting a first pattern having first positional information containing a range finding target from a first image of the target which has been made by the first imaging device;
a correlation processing section for detecting a second pattern having second positional information, which is most correlated with the first pattern in a second image of the target which has been made by the second imaging device;
a parallax calculating section for finding parallax from the first and the second positional information;
a range finding section for finding a range to the range finding target by the parallax;
a realizing section for realizing a height of the range finding target according to the position in the first or the second image of the range finding target and according to result of range finding conducted by the range finding section; and
a nullifying section for nullifying the result of range finding conducted by the range finding section in the case where the height of the range finding target is smaller than the reference height.
8. A range finder according to claim 7, further comprising: a road surface position correcting section for detecting a white line in the first or the second image and for finding a range to a forward end portion of the white line and for correcting a road face position, which becomes a reference to realize the height of the range finding target, from the value of range finding of the forward end portion of the white line.
9. A range finder according to claim 7, further comprising: a road surface correcting section for detecting a third pattern having same characteristic as that of a white line from the first or the second image and for finding a range to a forward end portion of the third pattern and for correcting a road face position, which becomes a reference to realize the height of the range finding target, from the value of range finding of the forward end portion of the third pattern.
10. A range finder according to claim 9, wherein the reference value correcting section corrects the reference value according to a plurality of range finding values of the forward end portion of the third pattern.
11. A range finder according to claim 10, wherein the reference value correcting section corrects the road surface position by utilizing only a range finding value in a predetermined range of values in the plurality of range finding values of the forward end portion of the third pattern.
12. A range finder for finding a range to a target by image realization comprising:
a first and a second imaging device arranged at a predetermined interval;
a pattern extracting section for extracting a first pattern having a predetermined size and first positional information containing the range finding target from a first image of the target which has been made by the first imaging device;
a correlation processing section for detecting a second pattern having second positional information, which is most correlated with the first pattern, in a second image of the target which has been made by the second imaging device;
a parallax calculating section for finding parallax from the first and the second positional information;
a range finding section for finding a range to the range finding target by the parallax;
a judging section for judging whether or not the range finding target exists in a range finding objective region according to the result of range finding to find a range to the target and also according to the position of the range finding target in the first or the second image; and
a nullifying section for nullifying the result of range finding conducted by the range finding section when the range finding target exists outside the range finding objective region.
13. A range finder according to claim 12, further comprising: a height realizing section for realizing a height of the range finding target, wherein the nullifying section does not nullify the result of range finding conducted by the range finding section when the height is larger than a predetermined reference value of the height even if the range finding target exists outside the objective region of range finding.
US10/177,633 2001-06-20 2002-06-18 Range finder for finding range by image realization Expired - Lifetime US6697146B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2001-186779 2001-06-20
JP2001186779A JP2003004442A (en) 2001-06-20 2001-06-20 Distance-measuring apparatus

Publications (2)

Publication Number Publication Date
US20020196423A1 US20020196423A1 (en) 2002-12-26
US6697146B2 true US6697146B2 (en) 2004-02-24

Family

ID=19026177

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/177,633 Expired - Lifetime US6697146B2 (en) 2001-06-20 2002-06-18 Range finder for finding range by image realization

Country Status (2)

Country Link
US (1) US6697146B2 (en)
JP (1) JP2003004442A (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7208720B2 (en) * 1999-07-06 2007-04-24 Larry C. Hardin Intrusion detection system
JP4055998B2 (en) * 2003-04-15 2008-03-05 本田技研工業株式会社 Distance detection device, distance detection method, and distance detection program
JP4095491B2 (en) * 2003-05-19 2008-06-04 本田技研工業株式会社 Distance measuring device, distance measuring method, and distance measuring program
JP4521235B2 (en) 2004-08-25 2010-08-11 日立ソフトウエアエンジニアリング株式会社 Apparatus and method for extracting change of photographed image
US8456515B2 (en) * 2006-07-25 2013-06-04 Qualcomm Incorporated Stereo image and video directional mapping of offset
CN100495274C (en) * 2007-07-19 2009-06-03 上海港机重工有限公司 Control method for automatic drive of large engineering vehicle and system thereof
US8588600B2 (en) * 2010-07-27 2013-11-19 Texas Instruments Incorporated Stereoscopic auto-focus based on coordinated lens positions
JP5481337B2 (en) * 2010-09-24 2014-04-23 株式会社東芝 Image processing device
KR101737085B1 (en) * 2010-11-05 2017-05-17 삼성전자주식회사 3D camera
JP5792662B2 (en) 2011-03-23 2015-10-14 シャープ株式会社 Parallax calculation device, distance calculation device, and parallax calculation method
US9677897B2 (en) * 2013-11-13 2017-06-13 Elwha Llc Dead reckoning system for vehicles
JP6855759B2 (en) * 2016-11-11 2021-04-07 トヨタ自動車株式会社 Self-driving vehicle control system
CN114608520B (en) * 2021-04-29 2023-06-02 北京石头创新科技有限公司 Ranging method, ranging device, robot and storage medium
CN114608608B (en) * 2022-01-21 2024-04-05 东莞奥优光电有限公司 Calibration method based on infrared thermal imaging belt range finder module

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
JP2979619B2 (en) * 1990-09-28 1999-11-15 日本電気株式会社 Stereoscopic device
JPH0771916A (en) * 1993-09-06 1995-03-17 Fuji Film Micro Device Kk On-vehicle distance measuring device
JPH0894353A (en) * 1994-09-26 1996-04-12 Yazaki Corp Around vehicle monitoring device
JPH09311998A (en) * 1996-05-21 1997-12-02 Omron Corp Object detection device and parking lot management system using it
JP3211732B2 (en) * 1996-07-17 2001-09-25 富士電機株式会社 Distance measuring device
JPH10232111A (en) * 1997-02-19 1998-09-02 Matsushita Electric Ind Co Ltd Finder
JPH11325890A (en) * 1998-05-14 1999-11-26 Fuji Heavy Ind Ltd Picture correcting apparatus for stereoscopic camera
JP3923181B2 (en) * 1998-06-01 2007-05-30 本田技研工業株式会社 Vehicle distance measuring device
JP2000003448A (en) * 1998-06-16 2000-01-07 Sony Corp Luminance value correction circuit, method therefor and distance image generating device

Patent Citations (5)

Publication number Priority date Publication date Assignee Title
US5255064A (en) * 1990-09-03 1993-10-19 Mitsubishi Denki Kabushiki Kaisha Distance-measuring equipment
US5303019A (en) * 1991-12-09 1994-04-12 Mitsubishi Denki Kabushiki Kaisha Inter-vehicle distance measuring device
US6370262B1 (en) * 1994-11-08 2002-04-09 Canon Kabushiki Kaisha Information processing apparatus and remote apparatus for object, using distance measuring apparatus
US5940634A (en) * 1996-11-27 1999-08-17 Minolta Co., Ltd. Range finding device
US6047136A (en) * 1997-07-03 2000-04-04 Minolta Co., Ltd. Distance meter and an apparatus having a distance meter

Non-Patent Citations (2)

Title
Patent Abstract of Japan JP 11-55691 to Sony Corp, dated Feb. 26, 1999.
Patent Abstract of Japan JP 7-250319 to Yazaki Corp, dated Sep. 26, 1995.

Cited By (5)

Publication number Priority date Publication date Assignee Title
US20070291992A1 (en) * 2001-09-25 2007-12-20 Fujitsu Ten Limited Ranging device utilizing image processing
US7729516B2 (en) * 2001-09-25 2010-06-01 Fujitsu Ten Limited Ranging device utilizing image processing
US20060178830A1 (en) * 2005-02-10 2006-08-10 Rini Sherony Vehicle collision warning system
US7130745B2 (en) 2005-02-10 2006-10-31 Toyota Technical Center Usa, Inc. Vehicle collision warning system
US11909942B2 (en) 2019-11-11 2024-02-20 Canon Kabushiki Kaisha Parallax detecting apparatus, image capturing apparatus, parallax detecting method, and storage medium

Also Published As

Publication number Publication date
JP2003004442A (en) 2003-01-08
US20020196423A1 (en) 2002-12-26

Similar Documents

Publication Publication Date Title
US6697146B2 (en) Range finder for finding range by image realization
US10659762B2 (en) Stereo camera
US6658150B2 (en) Image recognition system
US6531959B1 (en) Position detecting device
US6381360B1 (en) Apparatus and method for stereoscopic image processing
US20080013789A1 (en) Apparatus and System for Recognizing Environment Surrounding Vehicle
JP2006252473A (en) Obstacle detector, calibration device, calibration method and calibration program
JP2008039491A (en) Stereo image processing apparatus
JP2006053890A (en) Obstacle detection apparatus and method therefor
US20100013908A1 (en) Asynchronous photography automobile-detecting apparatus
US8044998B2 (en) Sensing apparatus and method for vehicles
JP3666348B2 (en) Distance recognition device
US11346663B2 (en) Stereo camera
JP2013257244A (en) Distance measurement device, distance measurement method, and distance measurement program
JPH1144533A (en) Preceding vehicle detector
JPH10267618A (en) Distance measuring instrument
JP2004126905A (en) Image processor
JP2004354256A (en) Calibration slippage detector, and stereo camera and stereo camera system equipped with the detector
US20180182078A1 (en) Image processing apparatus and image processing method
US7158665B2 (en) Image processing device for stereo image processing
JPH11167636A (en) Line detecting device for vehicle
EP3896387B1 (en) Image processing device
WO2023068034A1 (en) Image processing device
JP7083014B2 (en) Stereo camera
US20230169678A1 (en) Apparatus and method for processing an image of a vehicle

Legal Events

Date Code Title Description
AS Assignment

Owner name: FUJITSU TEN LIMITED, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SHIMA, NOBUKAZU;REEL/FRAME:013049/0615

Effective date: 20020606

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12