JP4803927B2 - Distance correction apparatus and distance correction method for monitoring system - Google Patents

Distance correction apparatus and distance correction method for monitoring system

Info

Publication number
JP4803927B2
JP4803927B2 (granted from application JP2001277998A)
Authority
JP
Japan
Prior art keywords
vanishing point
distance
calculating
parallax
captured image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2001277998A
Other languages
Japanese (ja)
Other versions
JP2003083742A (en)
Inventor
Keiji Hanawa (塙 圭二)
Original Assignee
Fuji Jukogyo Kabushiki Kaisha (Fuji Heavy Industries Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuji Jukogyo Kabushiki Kaisha (Fuji Heavy Industries Ltd.)
Priority to JP2001277998A
Publication of JP2003083742A
Application granted
Publication of JP4803927B2
Legal status: Active

Description

[0001]
BACKGROUND OF THE INVENTION
The present invention relates to a distance correction apparatus and a distance correction method for a monitoring system that correct distance information including an error caused by a positional shift of a stereo camera.
[0002]
[Prior art]
In recent years, stereo-type vehicle exterior monitoring devices using a pair of in-vehicle cameras (a stereo camera) incorporating solid-state image sensors such as CCDs have attracted attention. In the stereo method, one of the three-dimensional measurement techniques, a region correlated with a given pixel block in one image is located in the other image. Then, based on the parallax of the pixel block, that is, its relative shift between the two images (the stereo images), the distance to the object is calculated using the principle of triangulation. Therefore, to increase the accuracy of stereo matching — in other words, to obtain highly reliable distance information — it is desirable that no positional deviation other than parallax exist in the stereo images. In practice, however, positional shifts (for example, horizontal, vertical, or rotational shifts) arise from the mechanical mounting accuracy of the stereo camera. Among these, a horizontal translational shift in particular (hereinafter referred to as a "horizontal shift") appears as a parallax error in the stereo images, so the distance calculated from the shifted parallax differs from the true value.
[0003]
For example, Japanese Patent Laid-Open No. 2001-160137 discloses a technique, for a monitoring system using a stereo camera, that corrects a calculated distance containing an error caused by a horizontal shift of the stereo camera by using a vanishing point parallax DP. In this distance correction method, a vanishing point JV2D is calculated from the intersection of the left and right lanes projected on one captured image (reference image) plane, and a road surface inclination angle a is detected from the vanishing point JV2D by image analysis. Separately, the inclination angle a' of the road surface in three-dimensional space is detected using the distance image (a two-dimensional array of calculated distances) produced by stereo processing. The difference between the inclination angles a and a' is then obtained, and the vanishing point parallax DP is corrected so that the two angles match.
[0004]
Japanese Patent Laid-Open No. 6-341837 discloses an inter-vehicle distance measuring device that reduces the influence of the horizontal shift described above. In this measuring device, for each of a pair of captured images obtained by photographing the area in front of the host vehicle, the vanishing point calculated from the intersection of the left and right lanes and the central axis (axis of symmetry) of the image of the preceding vehicle are obtained. Next, the parallax between the vanishing point and the center line is calculated on one captured image, and likewise on the other captured image. The distance to the preceding vehicle is then calculated by summing the two parallaxes.
[0005]
[Problems to be solved by the invention]
An object of the present invention is to provide a novel distance correction method for correcting a parallax including an error caused by a positional shift of a stereo camera, particularly a horizontal shift.
[0006]
Another object of the present invention is to improve the accuracy of distance measurement by calculating the distance to the object using the corrected parallax.
[0007]
[Means for Solving the Problems]
In order to solve these problems, the first invention provides a distance correction apparatus for a monitoring system comprising: a stereo imaging unit that obtains a pair of captured images; a parallax calculation unit that calculates parallax by stereo matching based on the pair of captured images obtained by the stereo imaging unit; a distance calculation unit that calculates the distance to an object based on the parallax calculated by the parallax calculation unit and a vanishing point parallax; vanishing point calculating means that calculates, on one captured image plane, a plurality of approximate straight lines that extend in the distance direction and are spatially parallel to each other and calculates a first vanishing point from their intersection, and likewise calculates, on the other captured image plane, a plurality of approximate straight lines that extend in the distance direction and are parallel to each other and calculates a second vanishing point from their intersection; and correction means that updates the vanishing point parallax based on the amount of deviation between the first vanishing point and the second vanishing point calculated by the vanishing point calculating means.
[0008]
Here, in the first aspect of the invention, a plurality of parallel reference objects extending in the distance direction are detected from the scenery projected in the captured image, and the position of the reference object on the captured image plane is specified. Detection means for performing this may be further provided. In this case, the vanishing point calculating means preferably calculates an approximate straight line in the captured image plane for each of the detected reference objects when a plurality of reference objects are detected by the detecting means.
[0009]
The reference object may be a left and right lane on the road that is projected in the captured image, or may be a left and right boundary that indicates the boundary between the wall and the floor that is projected in the captured image. Moreover, you may use the rail on either side of the track projected on the captured image as a reference | standard object.
[0010]
The second invention provides a distance correction method for a monitoring system comprising the steps of: calculating parallax by stereo matching based on a pair of captured images obtained by capturing the same scene at the same time; calculating the distance to an object based on the parallax and a vanishing point parallax; calculating a plurality of spatially parallel approximate straight lines extending in the distance direction on one captured image plane and calculating a first vanishing point from their intersection; calculating a plurality of spatially parallel approximate straight lines extending in the distance direction on the other captured image plane and calculating a second vanishing point from their intersection; and updating the vanishing point parallax based on the amount of deviation between the first vanishing point and the second vanishing point.
[0011]
DETAILED DESCRIPTION OF THE INVENTION
FIG. 1 is a block diagram showing the configuration of a stereo-type vehicle exterior monitoring apparatus according to the present embodiment. A stereo camera attached near the rearview mirror captures the scenery (the same scenery) including the road ahead of the host vehicle, a preceding car, and the like at the same moment during normal traveling. The stereo camera is composed of a pair of cameras 1 and 2 incorporating image sensors such as CCD or CMOS sensors (infrared cameras may also be used), installed with a predetermined camera baseline length in the vehicle width direction. The main camera 1, which outputs the reference image signal, is mounted on the right side in the traveling direction of the vehicle, while the sub camera 2, which outputs the comparison image signal, is mounted on the left side. The cameras 1 and 2 are synchronized with each other, capture the forward scene at the same timing, and output two analog image signals. These analog image signals are adjusted in the analog interface 3 to match the input range of the subsequent circuit, and the brightness balance of the images is adjusted by a gain control amplifier (GCA) 3a in the analog interface 3.
[0012]
The analog image signals adjusted in the analog interface 3 are converted by the A/D converter 4 into digital image data of a predetermined luminance gradation (for example, 256-level gray scale). The digitized image data is subjected to an affine transformation in the correction circuit 5. The positional shift of the cameras 1 and 2, and the resulting shift of the stereo images, are equivalently corrected by applying the affine transformation to the images. Here, "affine transformation" is a general term for geometric coordinate transformations that rotate, translate, enlarge, or reduce an image. The correction circuit 5 applies to the original image the linear transformation represented by the following expression, using the four affine parameters K, θ, SHFTI, and SHFTJ.
[Expression 1]
i' = K · (i·cosθ − j·sinθ) + SHFTI
j' = K · (i·sinθ + j·cosθ) + SHFTJ
[0013]
In this equation, (i, j) are the coordinates of the original image and (i', j') are the coordinates after conversion. The affine parameters SHFTI and SHFTJ represent translation in the i direction (horizontal direction of the image) and the j direction (vertical direction of the image), respectively. The affine parameters θ and K specify a rotation by θ and a K-fold enlargement (a reduction when |K| < 1). Applying the affine transformation to the stereo images guarantees "horizontal line matching between the stereo images," which is important for ensuring the accuracy of stereo matching. The detailed hardware configuration of the correction circuit 5 is described in Japanese Patent Laid-Open No. 10-307352; refer to it if necessary.
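As a rough sketch of the transformation described above (in Python; the function and parameter names are illustrative, not from the patent, and the rotate-scale-translate order is assumed):

```python
import math

def affine_transform(i, j, K=1.0, theta=0.0, shift_i=0.0, shift_j=0.0):
    # Rotate (i, j) by theta radians, scale K-fold, then translate by
    # (SHFTI, SHFTJ) -- one candidate reading of Expression 1.
    ip = K * (i * math.cos(theta) - j * math.sin(theta)) + shift_i
    jp = K * (i * math.sin(theta) + j * math.cos(theta)) + shift_j
    return ip, jp
```

With K = 1 and θ = 0 this reduces to a pure translation, which is the case used to compensate a simple horizontal or vertical mounting shift.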
[0014]
Through such image processing, the luminance value of each pixel in an image region of, for example, 512 horizontal by 200 vertical pixels is obtained as reference image data from the output signal of the main camera 1. Comparison image data with the same vertical size as the reference image and a larger horizontal size (for example, 640 pixels horizontally and 200 pixels vertically) is obtained from the output signal of the sub camera 2. In the i-j coordinate system of the two-dimensional image plane, the lower-left corner of the image is the origin, the horizontal direction is the i coordinate axis, and the vertical direction is the j coordinate axis (the unit is pixels). The reference image data and the comparison image data are stored in the image data memory 7.
[0015]
The stereo arithmetic circuit 6 calculates the parallax d based on the reference image data and the comparison image data. Since one parallax d is calculated for each pixel block of 4 × 4 pixels in the reference image, for example, a maximum of 128 × 50 parallaxes can be calculated for the entire reference image for one frame. When calculating the parallax di of one pixel block (hereinafter referred to as “target pixel block”) in the reference image, first, an area having a correlation with the luminance characteristic of the target pixel block is specified in the comparison image. As is well known, the distance to the object projected on the stereo image appears as a parallax in the stereo image, that is, a horizontal shift amount of the pixel block between the reference image and the comparison image. Therefore, when searching for a pixel block in the comparison image (hereinafter referred to as “comparison pixel block”), it is sufficient to search on the same horizontal line (epipolar line) as the j coordinate of the target pixel block. The stereo calculation circuit 6 evaluates the correlation with the target pixel block for each comparison pixel block while shifting the pixel on the epipolar line one by one (stereo matching).
[0016]
The correlation between two pixel blocks can be evaluated using, for example, the city block distance, a well-known correlation evaluation measure. The stereo arithmetic circuit 6 obtains the city block distance for each region on the epipolar line (of the same size as the target pixel block) and, basically, identifies the region where the city block distance is minimal as the correlation destination of the target pixel block. The amount of horizontal displacement between the target pixel block and the pixel block identified as the correlation destination is the parallax di. The hardware configuration for calculating the city block distance and the detailed method for determining the correlation destination are disclosed in Japanese Patent Laid-Open No. 5-1114099; refer to it if necessary. The parallax d calculated by the stereo arithmetic circuit 6 is stored in the distance data memory 8.
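The epipolar search described above can be sketched as follows (Python; rows are flattened to 1-D lists for brevity, whereas the patent uses 4×4 blocks, and the function names are illustrative):

```python
def city_block_distance(block_a, block_b):
    # Sum of absolute luminance differences over two equally sized blocks.
    return sum(abs(a - b) for a, b in zip(block_a, block_b))

def find_parallax(ref_row, cmp_row, i0, block=4, max_d=128):
    # Slide a block-wide window along the comparison row (the epipolar
    # line) and return the shift d whose city block distance to the
    # reference block starting at i0 is smallest.
    target = ref_row[i0:i0 + block]
    best_d, best_cost = 0, float("inf")
    for d in range(max_d):
        if i0 + d + block > len(cmp_row):
            break
        cost = city_block_distance(target, cmp_row[i0 + d:i0 + d + block])
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d
```

The search direction (toward increasing i in the comparison image) is an assumption based on the main camera being the right camera.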
[0017]
The microcomputer 9 (whose functional blocks include the recognition unit 10) reads the reference image data from the image data memory 7 and recognizes objects projected in the reference image (for example, a preceding vehicle) using well-known image recognition techniques. The recognition unit 10 calculates the distance Z to the object based on the following equation, using the parallax d read from the distance data memory 8 as the basic parameter.
[Expression 2]
Z = KZH / (d − DP)
[0018]
In this equation, KZH is a predetermined constant (camera baseline length / horizontal viewing angle), and DP is the vanishing point parallax. In the present embodiment, the vanishing point parallax DP is a parallax correction value (variable), and the value is calculated by the correction calculation unit 13 described later.
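A minimal sketch of this distance calculation (Python; assuming the standard stereo form Z = KZH / (d − DP), with illustrative numeric values):

```python
def distance_from_parallax(d, dp, kzh):
    # Distance Z from raw parallax d, vanishing point parallax DP, and the
    # constant KZH (camera baseline length / horizontal viewing angle).
    return kzh / (d - dp)
```

Note how the correction matters: with kzh = 80, a raw parallax of 10 gives Z = 8.0 if DP = 0 but Z = 10.0 if DP = 2, so an uncorrected horizontal shift directly biases every measured distance.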
[0019]
Further, the recognition unit 10 performs "road shape recognition." Here, "road shape recognition" means expressing the three-dimensional road shape by functions related to the left and right lane markings (white lines, no-overtaking lines, and so on), and setting each parameter of these functions to values that match the actual road shape (straight-line or curve curvature, undulation, and so on). In the following description, a white line, a typical lane marking, is used as the example, but the present invention is also applicable to various other lane markings, including no-overtaking lines. The white line model calculation method of the present embodiment is described below with reference to FIG. 2.
[0020]
First, a white line edge Pedge in the reference image, that is, a horizontal luminance edge (a portion where the luminance change between adjacent pixels is large) caused by a white line, is identified. White line edges Pedge are searched separately on the left and right sides of the traveling path, yielding a plurality of left white line edges Pedge1 and a plurality of right white line edges Pedge2. Specifically, a luminance edge satisfying the following three conditions is recognized as a white line edge Pedge.
[0021]
(Three conditions of a white line edge)
1. A luminance edge whose luminance change amount is equal to or greater than a predetermined value, where the pixel on the outer side of the edge (image edge side) is brighter than the pixel on the inner side (image center side).
In other words, the white line edge Pedge caused by the left or right white line of the traveling road is the luminance edge at the inner boundary of the white line (the boundary between the white line and the paved road), as shown in FIG. 2.
[0022]
2. For a candidate white line edge Pedge satisfying condition 1, a further luminance edge exists on the outer side on the same horizontal line, and the pixel on the inner side of that edge is brighter than the pixel on its outer side.
Since the white line has a predetermined width, a boundary also exists on the outer side of the white line edge Pedge. This condition reflects that characteristic of white lines.
[0023]
3. The parallax d has been calculated for the pixel block containing the white line edge Pedge satisfying the above conditions.
If no parallax d has been calculated at the position of a white line edge Pedge, that edge provides no useful information for recognizing the road shape.
[0024]
For each identified white line edge Pedge, the recognition unit 10 substitutes its coordinates (i, j) and its parallax d into the well-known coordinate conversion formulas shown in Equations 3 and 4 below, thereby calculating its coordinates (X, Y, Z) in real space.
[Equation 3]
X = r/2 + Z · PWH · (i − IV)
[Expression 4]
Y = CAH + Z · PWV · (j − JV)
[0025]
Here, the constant CAH is the mounting height of the stereo cameras 1 and 2, the constant r is their mounting interval, and the constants PWV and PWH are the vertical and horizontal viewing angles per pixel, respectively. The constants IV and JV are the preset i-coordinate and j-coordinate of the vanishing point V. The real-space coordinate system, defined relative to the vehicle, takes as its origin the point on the road surface directly below the center of the stereo cameras 1 and 2, with the vehicle width direction as the X axis, the vehicle height direction as the Y axis, and the vehicle length direction (distance direction) as the Z axis. Once the image-plane coordinates (i, j) and the parallax d of an object projected in the captured image (a preceding vehicle, a three-dimensional object, the road, and so on) are specified, its real-space coordinates (X, Y, Z) can be uniquely determined from the conversion formulas of Expressions 2 to 4.
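The full image-to-space conversion can be sketched as below (Python; the exact sign conventions of Equations 3 and 4 are not reproduced in this text, so the X and Y forms here are reconstructions, and all numeric constants are illustrative):

```python
def image_to_space(i, j, d, DP, KZH, r, CAH, PWH, PWV, IV, JV):
    # Expressions 2-4: pixel (i, j) with parallax d -> real-space (X, Y, Z).
    Z = KZH / (d - DP)                # distance (Expression 2)
    X = r / 2 + Z * PWH * (i - IV)    # vehicle width direction (assumed form)
    Y = CAH + Z * PWV * (j - JV)      # vehicle height direction (assumed form)
    return X, Y, Z
```

With this form, a pixel on the vanishing point row (j = JV) maps to the camera height CAH for any Z, which is consistent with the horizon geometry the text describes.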
[0026]
A white line model is then specified from the real-space coordinates (X, Y, Z) of the white line edges Pedge identified in this way. The white line model is obtained by calculating an approximate straight line per predetermined section for each of the left and right white line edges Pedge1 and Pedge2 within the recognition range (for example, from the camera position to 84 m ahead of the vehicle) and connecting the segments into a polyline. In the example white line model of FIG. 3, the recognition range is divided into seven sections, and in each section the left and right white line edges Pedge1 and Pedge2 are each approximated by a straight line of the following form using the least squares method.
[Equation 5]
(Left white line model L)
X = aL · Z + bL
Y = cL · Z + dL
(Right white line model R)
X = aR · Z + bR
Y = cR · Z + dR
[0027]
These white line models L and R are composed of a curve function (X = f(Z)) expressing the curve curvature of the road and a gradient function (Y = f(Z)) expressing the road's slope and undulation. The three-dimensional variation of the road in real space can therefore be grasped from the left and right white line models L and R. The white line edges Pedge and the white line models L and R calculated by the recognition unit 10 are transmitted to the correction calculation unit 13.
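The per-section least squares fitting that builds the X = f(Z) polyline can be sketched as follows (Python; function names and section boundaries are illustrative):

```python
def fit_section(zs, xs):
    # Least squares fit of X = a*Z + b over one section of edge points.
    n = len(zs)
    mz, mx = sum(zs) / n, sum(xs) / n
    a = sum((z - mz) * (x - mx) for z, x in zip(zs, xs)) / \
        sum((z - mz) ** 2 for z in zs)
    return a, mx - a * mz

def white_line_model(points, boundaries):
    # Fit one straight segment per section; boundaries = [Z0, Z1, ..., Zn]
    # splits the recognition range, mirroring the seven sections of FIG. 3.
    model = []
    for lo, hi in zip(boundaries, boundaries[1:]):
        section = [(z, x) for z, x in points if lo <= z < hi]
        if section:
            zs, xs = zip(*section)
            model.append(fit_section(zs, xs))
    return model
```

The same fit applied to (Z, Y) pairs yields the gradient function of Equation 5.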
[0028]
When the recognition unit 10 determines that an alarm is necessary based on the recognition result regarding the preceding vehicle or the road shape, the recognition unit 10 activates the alarm device 11 such as a monitor or a speaker to alert the driver. Further, by controlling the control device 12 as necessary, vehicle control such as AT (automatic transmission) shift down, engine output suppression, or brake operation is executed.
[0029]
Next, the details of the parallax correction according to the present embodiment are described with reference to the flowcharts shown in FIGS. 4 and 5. The correction calculation unit 13 updates the value of the vanishing point parallax DP according to the series of steps shown in these flowcharts and feeds the value back to the recognition unit 10. The flowchart is executed repeatedly at a predetermined interval.
[0030]
First, in step 1, the correction calculation unit 13 reads the white line edge Pedge and the white line models L and R for each of the pair of captured images (reference image and comparison image) calculated by the recognition unit 10. Next, in step 2, it is determined whether left and right white lines exist in the reference image. This can be determined by examining whether the white line models L and R on the left and right are calculated in the recognition unit 10. Alternatively, the determination may be made by examining whether the left white line edge Pedge1 and the right white line edge Pedge2 are calculated.
[0031]
If a negative determination is made in Step 2, that is, if there are no white lines on both the left and right sides, the vanishing points cannot be calculated because the line segments parallel to each other cannot be extracted. Therefore, in order to achieve control stability, the process proceeds to return without changing the current value of the vanishing point parallax DP, and the execution of this flowchart in the current cycle is terminated. On the other hand, if an affirmative determination is made in step 2, the process proceeds to step 3.
[0032]
In step 3, the reliability of the left and right white lines is evaluated. Specifically, the following two points are evaluated. Only when it is determined in step 4 that the white line is reliable, the process proceeds to step 5. On the other hand, if it is determined that the white line is not reliable, the process proceeds to return without changing the value of the vanishing point parallax DP.
[0033]
(Reliability evaluation of white line)
1. When the deviation between the white line position in the previous cycle and the white line position in the current cycle is larger than a predetermined value, it is determined that the reliability as the white line is low. Specifically, when the position of the white line edge Pedge detected in the previous cycle and the position of the white line edge Pedge detected in the current cycle are greatly shifted, it is determined that the reliability as the white line is low.
[0034]
2. Verify how far the white line is visible. The white line has at least a certain length. Therefore, considering the transition of the white line between frames, if the white line edge Pedge does not extend beyond a certain length in the depth direction, it is determined that the reliability as the white line is low.
[0035]
In step 5, the linearity of the white line is evaluated based on the white line models R and L. In order to calculate an accurate vanishing point, it is necessary that the left and right white lines as the calculation base extend linearly, and an accurate vanishing point cannot be calculated from the curved white line. Therefore, the process proceeds to step 7 only when it is determined in step 6 that the white line is a straight line, and otherwise, the process proceeds to return without changing the value of the vanishing point parallax DP.
[0036]
The linearity of the white line can be evaluated based on, for example, the white line model (curve function X = f(Z)) calculated by the recognition unit 10. Referring to FIG. 3, first, the slope A1 of the curve function over a near distance range (for example, 0 to Z2) in the Z-X plane is calculated as the average of the slopes aL and aR of the left and right white line models L and R; concretely, the average of the slope a1 in the first section and the slope a2 in the second section is used. Next, the slope A2 of the curve function over a farther distance range (for example, Z2 to Z4) is calculated as the average of the slope a3 in the third section and the slope a4 in the fourth section. The difference (absolute value) between the slopes A1 and A2 is then obtained, and if this difference is equal to or smaller than a predetermined threshold value, the white line is determined to be straight.
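This straightness test reduces to a small comparison (Python; the threshold value is illustrative, as the patent does not state one):

```python
def is_straight(a1, a2, a3, a4, threshold=0.02):
    # Compare the average near-range slope (sections 1-2) with the average
    # far-range slope (sections 3-4); a small difference means the white
    # line runs straight enough to base a vanishing point on.
    A1 = (a1 + a2) / 2
    A2 = (a3 + a4) / 2
    return abs(A1 - A2) <= threshold
```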
[0037]
The procedure after step 7 shown in FIG. 5 relates to the update process of the vanishing point parallax DP. First, in step 7, an approximate straight line L1m of a plurality of left white line edges Pedge1 existing within a predetermined distance range (for example, 0 to Z2) in the reference image is calculated by the least square method (see FIG. 6). Similarly, an approximate straight line L2m of a plurality of right white line edges Pedge2 existing within the distance range is also calculated by the method of least squares. In step 8, the intersection point of the approximate lines L1m and L2m is specified as shown in FIG. 6 to calculate the i-coordinate value IVm of the vanishing point Vm (IVm, JVm) in the reference image.
[0038]
In subsequent step 9, an approximate straight line L1s of the plurality of left white line edges Pedge1 existing within a predetermined distance range (for example, 0 to Z2) in the comparison image is calculated by the least squares method, in the same way as for the reference image. Likewise, an approximate straight line L2s of the plurality of right white line edges Pedge2 within that distance range is calculated by least squares. In step 10, the i-coordinate value IVs of the vanishing point Vs (IVs, JVs) in the comparison image is calculated by specifying the intersection of the approximate lines L1s and L2s.
[0039]
In step 11, the parallax correction value, that is, the vanishing point parallax DP, is updated based on these vanishing point coordinates IVm and IVs. Basically, the amount of deviation between the i-coordinate value IVm of the vanishing point Vm calculated on the reference image side and the i-coordinate value IVs of the vanishing point Vs calculated on the comparison image side becomes the vanishing point parallax DP. The calculated vanishing point parallax DP is output to the recognition unit 10, and the processing of this flowchart for the current cycle ends. For control stability, the vanishing point parallax DP calculated in step 11 may instead be stored over 1 to n processing cycles, and the average of those values applied as the vanishing point parallax used for distance correction.
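Steps 7 through 11 can be condensed into a short sketch (Python; each fitted line is represented by its slope and intercept in the image plane, an illustrative parameterization not taken from the patent):

```python
def vanishing_point_i(left, right):
    # Each line is (slope, intercept) of j = slope*i + intercept on the
    # image plane; return the i coordinate of their intersection.
    (a1, b1), (a2, b2) = left, right
    return (b2 - b1) / (a1 - a2)

def update_dp(ref_lines, cmp_lines):
    # DP = shift between the reference-image vanishing point IVm and the
    # comparison-image vanishing point IVs (steps 8, 10, and 11).
    return vanishing_point_i(*ref_lines) - vanishing_point_i(*cmp_lines)
```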
[0040]
According to the present embodiment, the feedback adjustment related to the vanishing point parallax DP is performed in parallel with the monitoring control, so that a highly accurate distance can always be calculated even when a horizontal shift of the stereo camera occurs. Therefore, even when the stereo camera mounting position changes from the initial setting state due to secular change, impact, or the like, highly reliable distance information can be obtained stably. Then, by performing monitoring control based on such a calculated distance, it is possible to improve the reliability of monitoring outside the vehicle. In particular, according to the present embodiment, since the vanishing point parallax DP is directly detected using a pair of captured images, even when the vanishing point parallax is greatly deviated, it can be detected stably.
[0041]
In the above description, the vanishing point parallax may be updated by proportional control, statistical control, or the like. For example, a histogram of 1000 samples of the vanishing point parallax DP may be taken and the mode value may be used.
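The mode-of-histogram variant mentioned here might look like the following (Python; the bin width is an illustrative choice, and the sample count would be the 1000 frames suggested above):

```python
from collections import Counter

def dp_by_mode(samples, bin_width=0.25):
    # Bin the accumulated DP samples and return the center value of the
    # most populated bin (the mode), which resists occasional outlier
    # frames better than a plain average.
    bins = Counter(round(s / bin_width) for s in samples)
    best_bin, _ = bins.most_common(1)[0]
    return best_bin * bin_width
```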
[0042]
(Application to various monitoring systems using captured images)
In the above-described embodiment, the method of calculating the vanishing point parallax DP uses the left and right lane markings (white lines) on the road projected in the captured images. This exploits a general tendency: in forward monitoring from an automobile, lane markings extending in the distance direction (Z direction) usually exist on the left and right sides of the road, and they are often spatially parallel to each other. In the present specification, a linear object, such as a lane marking, that extends parallel to another in the distance direction and serves as the basis for vanishing point calculation is referred to as a "reference object." The present invention can be widely applied to various monitoring systems using captured images in which such "reference objects" are projected.
[0043]
As an example, when applied to an indoor robot that recognizes a surrounding situation based on a captured image, two boundary lines between a wall and a floor can be used as a “reference object”. FIG. 7 is an example of a captured image in the indoor robot. In addition, the straight line L1 (or L2) shown in the figure is used as a generic term for the straight line L1m (or L2m) on the reference image side and the straight line L1s (or L2s) on the comparison image side. Usually, the boundary line between the left wall and the floor and the boundary line between the right wall and the floor often extend in parallel to each other in the distance direction (depth direction). Therefore, vanishing point correction and distance correction can be performed using the left and right boundary lines. The outline of the vanishing point adjustment procedure using the boundary line will be described below.
[0044]
First, the straight lines L1m and L2m are detected from the reference image. As with the white line edge conditions above, conditions on the luminance edge and parallax at the boundary between wall and floor are set in advance; portions of the captured image meeting these conditions are recognized as boundary lines, their linearity is evaluated as appropriate, and the approximate straight lines L1m and L2m are then calculated. Alternatively, the straight lines L1m and L2m serving as "reference objects" may be calculated by using a well-known Hough transform or the like to extract the points (edge pixels at the boundary portions) that form straight lines in the captured image.
[0045]
Next, it is determined based on the distance image whether the straight lines L1m and L2m are substantially parallel in space. As described above, the real-space position of each region constituting the straight lines L1m and L2m can be specified from the distance image. Therefore, when the two straight lines L1m and L2m are detected, their spatial parallelism is determined using a known method.
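One such known parallelism test can be sketched as follows (Python; the cross-product criterion and tolerance are an assumed choice of "known method," not specified by the patent):

```python
def direction(p0, p1):
    # Direction vector of a 3-D line through real-space points p0 and p1.
    return tuple(b - a for a, b in zip(p0, p1))

def are_parallel(u, v, tol=0.05):
    # |u x v| / (|u||v|) equals sin(angle between the lines); a value
    # near zero means the two lines are nearly parallel in space.
    cx = u[1] * v[2] - u[2] * v[1]
    cy = u[2] * v[0] - u[0] * v[2]
    cz = u[0] * v[1] - u[1] * v[0]
    norm = lambda w: sum(c * c for c in w) ** 0.5
    return norm((cx, cy, cz)) / (norm(u) * norm(v)) <= tol
```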
[0046]
When the straight lines L1m and L2m are spatially parallel, the vanishing point Vm is calculated from these intersection points in the reference image. In the same manner, the straight lines L1s and L2s in the comparison image are detected, and the vanishing point Vs is calculated from these intersection points in the comparison image. Then, the vanishing point parallax DP is calculated from the difference between the i coordinate values of these vanishing points Vm and Vs.
[0047]
As another example, when the invention is applied to a system for monitoring the situation ahead of a railway vehicle, the left and right rails can be used as "reference objects". FIG. 8 shows an example of an image captured ahead of the railway vehicle. The left and right rails extend parallel to each other in the distance direction. Because the two parallel straight lines L1 and L2 can therefore be specified from the left and right rails, the vanishing point parallax DP can be adjusted by the method described above.
[0048]
【The invention's effect】
As described above, in the present invention, the vanishing point parallax used in calculating three-dimensional information such as distance information is corrected based on the shift between the vanishing points, each calculated as the intersection of approximate straight lines that are parallel in space on the corresponding captured image plane. Therefore, even when the stereo camera is misaligned, a vanishing point parallax value that cancels the resulting error is calculated automatically, so that accurate three-dimensional information (distance information) can be obtained stably.
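How the updated vanishing point parallax enters the distance calculation can be sketched as follows, assuming the usual stereo triangulation relation Z = B·f / (d − DP), where d is the measured parallax in pixels, B the baseline, and f the focal length in pixels. The relation and the function name are illustrative assumptions consistent with the claims (distance calculated from the parallax and the vanishing point parallax), not a formula quoted from the embodiment.

```python
def distance_with_corrected_dp(parallax_px, dp_px, baseline_m, focal_px):
    """Distance from the corrected parallax.  The measured parallax
    contains the offset caused by camera misalignment; subtracting the
    updated vanishing point parallax DP removes it before
    triangulation:  Z = B * f / (d - DP).
    """
    effective = parallax_px - dp_px
    if effective <= 0:
        raise ValueError("corrected parallax must be positive")
    return baseline_m * focal_px / effective
```

With a hypothetical 0.35 m baseline and 1000-pixel focal length, a measured parallax of 12 pixels and DP of 2 pixels yield a distance of 35 m; without the DP correction the same measurement would be misread as about 29 m.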
[Brief description of the drawings]
FIG. 1 is a block diagram showing the configuration of a stereo-type vehicle exterior monitoring device
FIG. 2 is an explanatory diagram of a white line edge of a reference image
FIG. 3 is a diagram showing a white line model
FIG. 4 is a flowchart showing a part of the parallax correction procedure
FIG. 5 is a flowchart showing the procedure following FIG. 4
FIG. 6 is an explanatory diagram for calculating a vanishing point on a captured image plane
FIG. 7 is a diagram illustrating an example of a captured image of an indoor robot
FIG. 8 is a diagram showing an example of a captured image in front of a railway vehicle
[Explanation of symbols]
1 Main camera
2 Sub camera
3 Analog interface
3a Gain control amplifier
4 A / D converter
5 Correction circuit
6 Stereo operation circuit
7 Image data memory
8 Distance data memory
9 Microcomputer
10 Recognition part
11 Alarm device
12 Control device
13 Correction calculation section

Claims (6)

  1. In the distance correction device of the monitoring system,
    Stereo imaging means for obtaining a pair of captured images;
    Parallax calculating means for calculating parallax by stereo matching based on a pair of captured images obtained by the stereo imaging means;
    Distance calculating means for calculating the distance to the object based on the parallax calculated by the parallax calculating means and the vanishing point parallax;
    Vanishing point calculating means for calculating, in one captured image plane, a plurality of approximate straight lines that extend in the distance direction and are spatially parallel to each other and calculating a first vanishing point from the intersection of the approximate straight lines, and for calculating, in the other captured image plane, a plurality of approximate straight lines that extend in the distance direction and are spatially parallel to each other and calculating a second vanishing point from the intersection of the approximate straight lines; and
    Correction means for updating the vanishing point parallax based on the amount of shift between the first vanishing point and the second vanishing point calculated by the vanishing point calculating means.
  2. The distance correction apparatus for a monitoring system according to claim 1, further comprising detection means for detecting, from a scene projected in the captured images, a plurality of reference objects that extend in the distance direction and are spatially parallel to each other, and for specifying the positions of the reference objects on the captured image planes,
    wherein the vanishing point calculating means calculates, when a plurality of reference objects are detected by the detection means, an approximate straight line in the captured image plane for each of the reference objects.
  3.   The distance correction apparatus for a monitoring system according to claim 2, wherein the reference objects are the left and right lanes on a road projected in the captured image.
  4.   The distance correction apparatus for a monitoring system according to claim 2, wherein the reference objects are the left and right boundary lines between walls and a floor projected in the captured image.
  5.   The distance correction apparatus for a monitoring system according to claim 2, wherein the reference objects are the left and right rails of a track projected in the captured image.
  6. In the distance correction method of the monitoring system,
    Calculating parallax by stereo matching based on a pair of captured images of the same scene captured at the same time;
    Calculating a distance to the object based on the parallax and the vanishing point parallax;
    Calculating a plurality of spatially parallel approximate lines extending in the distance direction on one captured image plane, and calculating a first vanishing point from the intersection of the approximate lines;
    Calculating a plurality of spatially parallel approximate lines extending in the distance direction on the other captured image plane, and calculating a second vanishing point from the intersection of the approximate lines;
    A distance correction method for a monitoring system, comprising: updating the vanishing point parallax based on a shift amount between the first vanishing point and the second vanishing point.
JP2001277998A 2001-09-13 2001-09-13 Distance correction apparatus and distance correction method for monitoring system Active JP4803927B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2001277998A JP4803927B2 (en) 2001-09-13 2001-09-13 Distance correction apparatus and distance correction method for monitoring system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2001277998A JP4803927B2 (en) 2001-09-13 2001-09-13 Distance correction apparatus and distance correction method for monitoring system

Publications (2)

Publication Number Publication Date
JP2003083742A JP2003083742A (en) 2003-03-19
JP4803927B2 true JP4803927B2 (en) 2011-10-26

Family

ID=19102438

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2001277998A Active JP4803927B2 (en) 2001-09-13 2001-09-13 Distance correction apparatus and distance correction method for monitoring system

Country Status (1)

Country Link
JP (1) JP4803927B2 (en)

Families Citing this family (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6822563B2 (en) 1997-09-22 2004-11-23 Donnelly Corporation Vehicle imaging system with accessory control
US5877897A (en) 1993-02-26 1999-03-02 Donnelly Corporation Automatic rearview mirror, vehicle lighting control and vehicle interior monitoring system using a photosensor array
US7655894B2 (en) 1996-03-25 2010-02-02 Donnelly Corporation Vehicular image sensing system
US6891563B2 (en) 1996-05-22 2005-05-10 Donnelly Corporation Vehicular vision system
US6882287B2 (en) 2001-07-31 2005-04-19 Donnelly Corporation Automotive lane change aid
US7697027B2 (en) 2001-07-31 2010-04-13 Donnelly Corporation Vehicular video system
ES2391556T3 (en) 2002-05-03 2012-11-27 Donnelly Corporation Object detection system for vehicles
US7526103B2 (en) 2004-04-15 2009-04-28 Donnelly Corporation Imaging system for vehicle
JP2007263563A (en) 2004-06-03 2007-10-11 Matsushita Electric Ind Co Ltd Camera module
FR2874300B1 (en) * 2004-08-11 2006-11-24 Renault Sas Automatic calibration method of a stereovision system
US7881496B2 (en) 2004-09-30 2011-02-01 Donnelly Corporation Vision system for vehicle
US7720580B2 (en) 2004-12-23 2010-05-18 Donnelly Corporation Object detection system for vehicle
JP4622637B2 (en) * 2005-04-04 2011-02-02 トヨタ自動車株式会社 In-vehicle camera attitude correction device and in-vehicle camera attitude correction method
US7972045B2 (en) 2006-08-11 2011-07-05 Donnelly Corporation Automatic headlamp control system
JP5234894B2 (en) * 2007-06-28 2013-07-10 富士重工業株式会社 Stereo image processing device
US8017898B2 (en) * 2007-08-17 2011-09-13 Magna Electronics Inc. Vehicular imaging system in an automatic headlamp control system
WO2009036176A1 (en) 2007-09-11 2009-03-19 Magna Electronics Imaging system for vehicle
US20090080876A1 (en) * 2007-09-25 2009-03-26 Mikhail Brusnitsyn Method For Distance Estimation Using AutoFocus Image Sensors And An Image Capture Device Employing The Same
WO2009046268A1 (en) 2007-10-04 2009-04-09 Magna Electronics Combined rgb and ir imaging sensor
WO2011014497A1 (en) 2009-07-27 2011-02-03 Magna Electronics Inc. Vehicular camera with on-board microcontroller
WO2011028686A1 (en) 2009-09-01 2011-03-10 Magna Mirrors Of America, Inc. Imaging and display system for vehicle
DE102009055776A1 (en) * 2009-11-25 2011-05-26 Conti Temic Microelectronic Gmbh Method for estimating the roll angle in a moving vehicle
WO2012075250A1 (en) 2010-12-01 2012-06-07 Magna Electronics Inc. System and method of establishing a multi-camera image using pixel remapping
US9264672B2 (en) 2010-12-22 2016-02-16 Magna Mirrors Of America, Inc. Vision display system for vehicle
US9085261B2 (en) 2011-01-26 2015-07-21 Magna Electronics Inc. Rear vision system with trailer angle detection
WO2013016409A1 (en) 2011-07-26 2013-01-31 Magna Electronics Inc. Vision system for vehicle
WO2013030932A1 (en) * 2011-08-29 2013-03-07 パイオニア株式会社 Navigation device, image display device, server, adjustment device, and forward image display method
US9146898B2 (en) 2011-10-27 2015-09-29 Magna Electronics Inc. Driver assist system with algorithm switching
US10457209B2 (en) 2012-02-22 2019-10-29 Magna Electronics Inc. Vehicle vision system with multi-paned view
US9558409B2 (en) 2012-09-26 2017-01-31 Magna Electronics Inc. Vehicle vision system with trailer angle detection
US9446713B2 (en) 2012-09-26 2016-09-20 Magna Electronics Inc. Trailer angle detection system
US9743002B2 (en) 2012-11-19 2017-08-22 Magna Electronics Inc. Vehicle vision system with enhanced display functions
US9508014B2 (en) 2013-05-06 2016-11-29 Magna Electronics Inc. Vehicular multi-camera vision system
JP6251099B2 (en) * 2014-03-24 2017-12-20 国立大学法人東京工業大学 Distance calculation device
JP5906272B2 (en) * 2014-03-28 2016-04-20 富士重工業株式会社 Stereo image processing apparatus for vehicle
KR101574020B1 (en) * 2014-06-24 2015-12-03 국민대학교산학협력단 Method and Apparatus for stereo rectification of left and right view images captured from stereo cameras
US9852502B2 (en) 2014-10-27 2017-12-26 Denso Corporation Image processing apparatus
WO2016203989A1 (en) * 2015-06-17 2016-12-22 ソニー株式会社 Image processing device and image processing method
EP3358295B1 (en) * 2015-09-28 2020-10-07 Kyocera Corporation Image processing device, stereo camera device, vehicle, and image processing method
JP2018159683A (en) 2017-03-24 2018-10-11 日立オートモティブシステムズ株式会社 Stereo image processing device
WO2019163493A1 (en) * 2018-02-20 2019-08-29 日立オートモティブシステムズ株式会社 Image capturing device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06341837A (en) * 1993-06-01 1994-12-13 Matsushita Electric Ind Co Ltd Distance-between-cars measuring apparatus, camera-position correction device and collision warning device
JPH09126759A (en) * 1995-11-06 1997-05-16 Mitsubishi Motors Corp Range finding method by on-vehicle camera and range finder using the camera
JP3617709B2 (en) * 1995-11-10 2005-02-09 株式会社デンソー Distance measuring device
JP4573977B2 (en) * 1999-09-22 2010-11-04 富士重工業株式会社 Distance correction device for monitoring system and vanishing point correction device for monitoring system

Also Published As

Publication number Publication date
JP2003083742A (en) 2003-03-19

Similar Documents

Publication Publication Date Title
JP6285958B2 (en) Stereo support with rolling shutter
US9066085B2 (en) Stereoscopic camera object detection system and method of aligning the same
US9378553B2 (en) Stereo image processing device for vehicle
US9586455B2 (en) Road surface condition estimating apparatus
US7362881B2 (en) Obstacle detection system and method therefor
DE10164346B4 (en) Road surveillance method for a vehicle and system therefor
US8259174B2 (en) Camera auto-calibration by horizon estimation
JP6151150B2 (en) Object detection device and vehicle using the same
JP4861574B2 (en) Driving assistance device
JP3895238B2 (en) Obstacle detection apparatus and method
US8184857B2 (en) Moving object recognizing apparatus
JP3596314B2 (en) Object edge position measuring device and moving object traffic judging device
JP3759429B2 (en) Obstacle detection apparatus and method
JP4406381B2 (en) Obstacle detection apparatus and method
JP4433887B2 (en) Vehicle external recognition device
WO2016135736A2 (en) Road vertical contour detection using a stabilized coordinate frame
DE60316226T2 (en) Method and apparatus for determining whether an object detected by a plurality of sensors is identical and method and apparatus for position correction in a multi-sensor system
KR101241651B1 (en) Image recognizing apparatus and method, and position determining apparatus, vehicle controlling apparatus and navigation apparatus using the image recognizing apparatus or method
KR100784307B1 (en) Device for detecting road traveling lane
US6873912B2 (en) Vehicle tracking system
JP4676373B2 (en) Peripheral recognition device, peripheral recognition method, and program
JP4109077B2 (en) Stereo camera adjustment device and stereo camera adjustment method
US10713506B2 (en) Vehicle vision system with 3D registration for distance estimation
JP5440461B2 (en) Calibration apparatus, distance measurement system, calibration method, and calibration program
US8174563B2 (en) Object detecting system

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20080822

RD02 Notification of acceptance of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7422

Effective date: 20101019

RD04 Notification of resignation of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7424

Effective date: 20101116

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20110214

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110222

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110407

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20110809


A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20110809

R150 Certificate of patent or registration of utility model

Ref document number: 4803927

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140819

Year of fee payment: 3

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

S531 Written request for registration of change of domicile

Free format text: JAPANESE INTERMEDIATE CODE: R313531

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350


S533 Written request for registration of change of name

Free format text: JAPANESE INTERMEDIATE CODE: R313533

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350
