JP4032843B2 - Monitoring system and monitoring method, distance correction device and distance correction method in the monitoring system


Info

Publication number: JP4032843B2
Application number: JP2002184018A
Authority: JP (Japan)
Prior art keywords: distance, data, calculated, image data, monitoring
Legal status: Active
Other languages: Japanese (ja)
Other versions: JP2004028727A
Inventor: Keiji Hanawa (塙 圭二)
Original Assignee: Fuji Heavy Industries Ltd. (富士重工業株式会社)

Description

[0001]
BACKGROUND OF THE INVENTION
The present invention relates to a monitoring system and a monitoring method for monitoring a situation in a monitoring area, and also relates to a distance correction device and a distance correction method in the monitoring system.
[0002]
[Prior art]
In recent years, monitoring devices that monitor the situation in a predetermined monitoring area have attracted attention and been put to practical use. Such a monitoring device is either mounted on a moving body such as a vehicle or an aircraft, or fixed to a stationary body such as a support. A stereo-type vehicle exterior monitoring device, an example of the former, captures a scene in a predetermined monitoring area with a stereo camera mounted on a vehicle and recognizes the traveling state based on the information obtained thereby. In a stereo monitoring device, the positional deviation amount (that is, the parallax) of the same object projected on a pair of captured images (stereo images) is calculated by stereo matching. The real-space position (three-dimensional position) of an object projected on the image can then be calculated by a well-known coordinate conversion formula, based on the parallax calculated for the object and its position on the image plane.
[0003]
Furthermore, for the purpose of further improving the monitoring accuracy, monitoring devices using a stereo camera and a laser radar in combination have also been proposed. For example, Japanese Patent Laid-Open Nos. 7-125567 and 7-320199 disclose techniques in which the position of an object in the monitoring area is detected by a laser radar, and a processing area is set in the region of the image plane, on which the scene in the monitoring area is imaged, corresponding to the detected position. The distance to the target object is then detected by performing stereo matching only on the processing area set on the image plane.
[0004]
[Problems to be solved by the invention]
However, in the prior art disclosed in the above publications, it is necessary to set a processing area based on the data from the laser radar and then perform stereo matching with this processing area as the processing target. This requires complicated processing and complicates the system configuration. Further, although a distance sensor such as a laser radar is excellent in distance calculation accuracy, its detection accuracy is not very high for objects other than the preceding vehicle and the like. A typical example of such an object is a pedestrian: pedestrians are relatively easy to detect by image processing but difficult to detect by laser radar.
[0005]
The present invention has been made in view of such circumstances, and an object thereof is to further improve the accuracy of object recognition by integrating distance image data obtained by a stereo camera with distance measurement data obtained by a laser radar.
[0006]
Another object of the present invention is, when three-dimensional information about an object is obtained using a preset vanishing point, to further improve the accuracy of that three-dimensional information by correcting the set vanishing point.
[0007]
[Means for Solving the Problems]
In order to solve such problems, a first invention provides a monitoring system for monitoring the situation in a monitoring area, comprising a stereo camera, a stereo image processing unit, a laser radar, a recognition unit, and a data correction unit. Here, the stereo camera captures a scene including the monitoring area and outputs a pair of image data. The stereo image processing unit calculates parallax by stereo matching based on the pair of image data, and outputs distance image data in which a parallax group related to image data corresponding to one frame is associated with positions on the image plane defined by the image data. The laser radar measures distances in the monitoring area and outputs a two-dimensional distribution of the distances in the monitoring area as distance measurement data. The recognition unit divides the two-dimensional plane defined by the distance image data into a plurality of sections, calculates a distance in the monitoring area based on the parallax group existing in each section, and thereby calculates distance calculation data in which each of the divided sections is associated with its calculated distance. The data correction unit specifies, as the first distance, the distance to a three-dimensional object in the monitoring area at a predetermined position on the distance measurement data. Then, by specifying the section indicating the distance to the three-dimensional object in the distance calculation data, it corrects the distance calculated for the specified section to the first distance. The recognition unit then recognizes the three-dimensional object in the monitoring area based on the distance calculation data corrected by the data correction unit.
[0008]
According to a second aspect of the present invention, there is provided a monitoring system that monitors the situation in a monitoring area, comprising a stereo camera, a stereo image processing unit, a laser radar, a recognition unit, and a data correction unit. Here, the stereo camera captures a scene including the monitoring area and outputs a pair of image data. The stereo image processing unit calculates parallax by stereo matching based on the pair of image data, and outputs distance image data in which a parallax group related to image data corresponding to one frame is associated with coordinate positions on the image plane defined by the image data. The laser radar measures distances in the monitoring area and outputs a two-dimensional distribution of the distances in the monitoring area as distance measurement data. The recognition unit divides the two-dimensional plane defined by the distance image data into a plurality of sections, calculates a distance in the monitoring area based on the parallax group existing in each section, and thereby calculates distance calculation data in which each of the divided sections is associated with its calculated distance. The data correction unit specifies, as the first distance, the distance to a three-dimensional object in the monitoring area at a predetermined position on the distance measurement data. Further, by specifying the section indicating the distance to the three-dimensional object in the distance calculation data, it specifies the distance calculated for the specified section as the second distance. Then, by comparing the first distance and the second distance, it determines a third distance as a correction value for correcting the distance calculated for the specified section. The recognition unit then recognizes the three-dimensional object in the monitoring area based on the corrected distance calculation data.
[0009]
Here, in the second invention, it is preferable that, when the data correction unit determines that a measurement value corresponding to the distance to the three-dimensional object exists at the predetermined position on the distance measurement data, it determines this measurement value as the first distance.
[0010]
In this case, it is preferable that the data correction unit determines the first distance as the third distance when the difference between the first distance and the second distance is within a predetermined threshold, and determines the second distance as the third distance when the difference between the first distance and the second distance is larger than the predetermined threshold.
[0011]
Alternatively, in this case, it is preferable that the data correction unit determines the average value of the first distance and the second distance as the third distance when the difference between the first distance and the second distance is within a predetermined threshold, and determines the second distance as the third distance when the difference is larger than the predetermined threshold.
[0012]
According to a third aspect of the present invention, there is provided a distance correction apparatus for a monitoring system, comprising a stereo camera, a stereo image processing unit, a laser radar, a recognition unit, and a parallax correction unit. Here, the stereo camera captures a scene including the monitoring area and outputs a pair of image data. The stereo image processing unit calculates parallax by stereo matching based on the pair of image data, and outputs distance image data in which a parallax group related to image data corresponding to one frame is associated with coordinate positions on the image plane defined by the image data. The laser radar measures distances in the monitoring area and outputs a two-dimensional distribution of the distances in the monitoring area as distance measurement data. The recognition unit divides the two-dimensional plane defined by the distance image data into a plurality of sections, calculates a distance in the monitoring area based on the parallax group existing in each section, thereby calculates distance calculation data in which each of the divided sections is associated with its calculated distance, and recognizes a three-dimensional object based on the distance calculation data. The parallax correction unit specifies, as the first distance, the distance to a three-dimensional object in the monitoring area at a predetermined position on the distance measurement data. Then, by specifying the section indicating the distance to the three-dimensional object in the distance calculation data, it specifies the distance calculated for that section as the second distance, and calculates a parameter based on the first distance and the second distance.
[0013]
The fourth invention provides a monitoring method for monitoring the situation in a monitoring area using distance image data and distance measurement data. Here, the distance image data associates a parallax group, calculated by stereo matching based on a pair of image data obtained by capturing a scene including the monitoring area and related to image data corresponding to one frame, with positions on the image plane defined by the image data. The distance measurement data is calculated as a two-dimensional distribution of distances by measuring the distances in the monitoring area with a laser radar. In this monitoring method, the first step divides the two-dimensional plane defined by the distance image data into a plurality of sections, calculates a distance in the monitoring area based on the parallax group existing in each section, and calculates distance calculation data in which each of the divided sections is associated with its calculated distance. The second step specifies, as the first distance, the distance to a three-dimensional object in the monitoring area at a predetermined position on the distance measurement data. The third step specifies, in the distance calculation data, the section indicating the distance to the three-dimensional object. The fourth step corrects the distance calculated for the specified section in the distance calculation data to the first distance. The fifth step recognizes the three-dimensional object in the monitoring area based on the distance calculation data corrected in the fourth step.
[0014]
The fifth invention provides a monitoring method for monitoring the situation in a monitoring area using distance image data and distance measurement data. Here, the distance image data associates a parallax group, calculated by stereo matching based on a pair of image data obtained by capturing a scene including the monitoring area and related to image data corresponding to one frame, with positions on the image plane defined by the image data. The distance measurement data is calculated as a two-dimensional distribution of distances by measuring the distances in the monitoring area with a laser radar. In this monitoring method, the first step divides the two-dimensional plane defined by the distance image data into a plurality of sections, calculates a distance in the monitoring area based on the parallax group existing in each section, and calculates distance calculation data in which each of the divided sections is associated with its calculated distance. The second step specifies, as the first distance, the distance to a three-dimensional object in the monitoring area at a predetermined position on the distance measurement data. The third step specifies, in the distance calculation data, the section indicating the distance to the three-dimensional object. The fourth step specifies, as the second distance, the distance calculated for the specified section in the distance calculation data. The fifth step determines a third distance as a correction value for correcting the distance calculated for the specified section by comparing the first distance and the second distance. The sixth step recognizes the three-dimensional object in the monitoring area based on the corrected distance calculation data.
[0015]
Here, in the fifth invention, when the second step determines that a measurement value corresponding to the distance to the three-dimensional object exists at the predetermined position on the distance measurement data, it determines this measurement value as the first distance.
[0016]
In this case, it is preferable that the fifth step determines the first distance as the third distance when the difference between the first distance and the second distance is within a predetermined threshold, and determines the second distance as the third distance when the difference is larger than the predetermined threshold.
[0017]
Alternatively, in this case, it is preferable that the fifth step determines the average value of the first distance and the second distance as the third distance when the difference between the first distance and the second distance is within a predetermined threshold, and determines the second distance as the third distance when the difference is larger than the predetermined threshold.
[0018]
Furthermore, the sixth invention provides a distance correction method for a monitoring system. In this distance correction method, the first step captures a scene including the monitoring area and outputs a pair of image data. The second step calculates parallax by stereo matching based on the pair of image data, and outputs distance image data in which a parallax group related to image data corresponding to one frame is associated with coordinate positions on the image plane defined by the image data. The third step measures distances in the monitoring area and outputs a two-dimensional distribution of the distances in the monitoring area as distance measurement data. The fourth step divides the two-dimensional plane defined by the distance image data into a plurality of sections, calculates a distance in the monitoring area based on the parallax group existing in each section, and calculates distance calculation data in which each of the divided sections is associated with its calculated distance. The fifth step specifies, as the first distance, the distance to a three-dimensional object in the monitoring area at a predetermined position on the distance measurement data. The sixth step specifies, in the distance calculation data, the section indicating the distance to the three-dimensional object in the monitoring area. The seventh step specifies, as the second distance, the distance in the monitoring area associated with the specified section. The eighth step calculates a parameter based on the first distance and the second distance. The ninth step recognizes the three-dimensional object based on the calculated distance calculation data.
[0019]
DETAILED DESCRIPTION OF THE INVENTION
(First embodiment)
FIG. 1 is a block diagram of the stereo monitoring system according to the first embodiment. This stereo-type monitoring system 1 is mounted on a vehicle such as an automobile and monitors the traveling situation ahead of the vehicle. A stereo camera that captures the scene in the monitoring area in front of the vehicle is attached in the vicinity of the rearview mirror. The stereo camera is composed of a pair of cameras 2 and 3, each of which includes an image sensor (for example, a CCD or CMOS sensor). The main camera 2 captures the reference image (right image) necessary for stereo image processing, and the sub camera 3 captures the comparison image (left image). While the two cameras are synchronized with each other, the analog images output from the cameras 2 and 3 are converted by the A/D converters 4 and 5 into digital images with a predetermined luminance gradation (for example, a 256-gradation gray scale).
[0020]
The pair of digitized image data is subjected to brightness correction, geometric image conversion, and the like in the image correction unit 6. Usually, the mounting positions of the pair of cameras 2 and 3 contain errors, albeit differing in degree, and a shift caused by these errors occurs between the left and right images. In order to correct this shift, geometric transformation such as image rotation or translation is performed using affine transformation or the like.
[0021]
Through such image processing, reference image data is obtained from the main camera 2 and comparison image data is obtained from the sub camera 3. These image data (stereo image data) are sets of luminance values (0 to 255) of the pixels. Here, the image plane defined by the image data is expressed in the ij coordinate system, with the lower left corner of the image as the origin, the horizontal direction as the i coordinate axis, and the vertical direction as the j coordinate axis. The stereo image data corresponding to one frame (one image display unit) is output to the stereo image processing unit 7 in the subsequent stage and stored in the image data memory 8.
[0022]
The stereo image processing unit 7 calculates distance image data Dp for the captured image corresponding to one frame based on the reference image data and the comparison image data. Here, "distance image data" is the set of parallaxes d calculated for each small area in the image plane defined by the image data, each parallax d being associated with a position (i, j) on the image plane. The unit for calculating one parallax d is a pixel block of predetermined area (for example, 4 × 4 pixels) constituting a part of the reference image; one parallax is calculated per pixel block.
[0023]
FIG. 2 is an explanatory diagram of the pixel blocks set in the reference image. For example, when the reference image is composed of 200 × 512 pixels, a parallax group corresponding to the number of pixel blocks PBij (50 × 128) can be calculated from a captured image corresponding to one frame. As is well known, the parallax d is the amount of horizontal displacement of the pixel block PBij that is the unit of calculation, and it correlates strongly with the distance to the object projected in that pixel block. That is, the closer the object projected in the pixel block PBij is to the cameras 2 and 3, the larger the parallax d of that block; the farther the object, the smaller the parallax d (for an object at infinity, the parallax d becomes 0).
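This inverse relation between parallax and distance is what the well-known coordinate conversion formula (Equation 6, introduced with the second embodiment below) expresses. The following is a minimal sketch, assuming a hypothetical camera constant KZH and a vanishing point parallax dp of 0:

```python
def parallax_to_distance(d: float, KZH: float = 250.0, dp: float = 0.0) -> float:
    """Distance z from parallax d via z = KZH / (d - dp): the larger the
    parallax, the closer the object; d approaching dp corresponds to an
    object at infinity (parallax 0 when dp = 0)."""
    return KZH / (d - dp) if d > dp else float("inf")
```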
[0024]
When calculating the parallax d for a certain pixel block PBij (the correlation source), an area having a correlation with the luminance characteristics of that pixel block (the correlation destination) is identified in the comparison image. As described above, the distance from the cameras 2 and 3 to the object appears as the amount of horizontal shift between the reference image and the comparison image. Therefore, when searching for the correlation destination in the comparison image, it suffices to search on the same horizontal line (the epipolar line) as the j coordinate of the correlation-source pixel block PBij. The stereo image processing unit 7 sequentially evaluates the correlation between the correlation source and each correlation-destination candidate while shifting one pixel at a time along the epipolar line, within a predetermined search range set based on the i coordinate of the correlation source (stereo matching). In principle, the amount of horizontal shift of the correlation destination judged to have the highest correlation among the candidates is taken as the parallax d of the pixel block PBij.
[0025]
The correlation between two pixel blocks can be evaluated, for example, by calculating the city block distance CB. Equation 1 shows the basic form of the city block distance CB. In the equation, p1ij is the luminance value of the ij-th pixel of one pixel block, and p2ij is the luminance value of the ij-th pixel of the other pixel block. The city block distance CB is the sum over the whole pixel block of the differences (absolute values) between the mutually corresponding luminance values p1ij and p2ij; the smaller this sum, the greater the correlation between the two pixel blocks.
[Equation 1]
CB = Σ | p1ij − p2ij |
[0026]
Basically, among the city block distances CB calculated for the pixel blocks existing on the epipolar line, the pixel block with the smallest value is judged to be the correlation destination, and the amount of shift between the correlation destination thus specified and the correlation source becomes the parallax d. Note that a hardware configuration of the stereo image processing unit 7 for calculating the city block distance CB is disclosed in Japanese Patent Laid-Open No. 5-1114099, which may be referred to as necessary. The distance image data Dp calculated through such processing, that is, the set of parallaxes d associated with positions (i, j) on the image, is stored in the distance data memory 9.
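As an illustration of this search, here is a minimal Python sketch of block matching with the city block distance of Equation 1. The 4 × 4 block size follows the description above, while the function names and the search range max_d are assumptions:

```python
import numpy as np

def city_block_distance(block1: np.ndarray, block2: np.ndarray) -> int:
    """Equation 1: CB = sum of |p1ij - p2ij| over the whole pixel block."""
    return int(np.abs(block1.astype(int) - block2.astype(int)).sum())

def match_block(ref: np.ndarray, cmp_img: np.ndarray, i: int, j: int,
                block: int = 4, max_d: int = 64) -> int:
    """Parallax d of the 4x4 pixel block at (i, j) in the reference image,
    found by searching along the same epipolar line (row j) of the
    comparison image."""
    src = ref[j:j + block, i:i + block]
    best_d, best_cb = 0, float("inf")
    # Shift one pixel at a time within the search range and keep the
    # candidate whose city block distance is smallest (highest correlation).
    for d in range(max_d):
        if i + d + block > cmp_img.shape[1]:
            break
        cb = city_block_distance(src, cmp_img[j:j + block, i + d:i + d + block])
        if cb < best_cb:
            best_cb, best_d = cb, d
    return best_d
```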
[0027]
In the stereo monitoring system 1, a laser radar 10 that measures distances in the monitoring area is attached to the front of the vehicle. As shown in FIG. 3, the laser radar 10 emits a vertically long laser beam (for example, ±2° in the vertical direction and 0.03° in the horizontal direction) and detects the distance to three-dimensional objects in the monitoring area located above the ground. As shown in FIG. 4, the laser radar 10 sequentially detects distances by scanning horizontally within a predetermined scanning range while emitting and receiving laser beams at regular intervals. From one series of detection operations within the scanning range, a set of a predetermined number of measurement values corresponding to the scanning positions is calculated as the distance measurement data Dm. In other words, the distance measurement data Dm is a two-dimensional distribution of the distances within the monitoring area. This detection operation is performed in synchronization with the imaging operations of the cameras 2 and 3, and the distance measurement data Dm is output to the microcomputer 11 (specifically, to the data correction unit 13) in correspondence with the output of the image data for one frame.
[0028]
The microcomputer 11 includes a CPU, a ROM, a RAM, an input/output interface, and the like. Viewed functionally, the microcomputer 11 comprises a recognition unit 12, a data correction unit 13, and a control unit 14.
[0029]
The recognition unit 12 identifies the road shape in the monitoring area based on the image data or the distance image data Dp. It then divides the two-dimensional plane defined by the distance image data Dp into a plurality of sections and calculates a distance in the monitoring area based on the identified road shape and the parallax group existing in each section, thereby calculating distance calculation data Dc in which each of the divided sections is associated with its calculated distance. In the distance calculation data Dc, the distance associated with each section is a correctable value and is corrected by the data correction unit 13. The recognition unit 12 recognizes three-dimensional objects in the monitoring area based on the distance calculation data Dc′ corrected by the data correction unit 13.
[0030]
The data correction unit 13 corrects the distance calculation data Dc, the output data of the recognition unit 12, based on the distance measurement data Dm, the output data of the laser radar 10, and outputs the result to the recognition unit 12 as distance calculation data Dc′.
[0031]
Based on the recognition result of the recognition unit 12, the control unit 14 controls an alarm device or a control device (not shown) as necessary. For example, in a driving situation where the distance to the preceding vehicle has shortened and a warning to the driver is required, an alarm device such as a monitor or a speaker is operated to alert the driver. Further, to decelerate the vehicle in such a situation, brake operation, automatic-transmission downshifting, engine output reduction, and the like are performed.
[0032]
FIG. 5 is a flowchart showing the monitoring procedure according to the first embodiment. This routine is called at predetermined intervals and executed by the microcomputer 11. First, in step 1, the recognition unit 12 reads a pair of image data (hereinafter simply "image data") from the image data memory 8, and also reads the distance image data Dp corresponding to that image data from the distance data memory 9.
[0033]
In step 2, the recognition unit 12 identifies the road shape in the monitoring area based on the image data or the distance image data Dp. In this process, the road shape is identified by correcting and changing the parameters of a road model so that it corresponds to the actual road shape. The road model is specified by linear equations in the horizontal and vertical directions in the real-space coordinate system. These linear equations are calculated by dividing the lane on the road into multiple sections according to set distances, approximating the left and right white lines and the like with a three-dimensional linear equation for each section, and connecting them in a broken-line shape. Details of the road model are disclosed in Japanese Patent Application Laid-Open No. 2001-160137, previously filed by the applicant of the present invention.
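A minimal sketch of evaluating such a broken-line road model, assuming each section is represented by a hypothetical tuple (z_start, z_end, a, b) with road height y = a·z + b on that interval:

```python
def road_height_at(z: float, segments: list) -> float:
    """Road surface height at distance z from the broken-line road model.
    segments: list of (z_start, z_end, a, b), one straight line per section."""
    for z0, z1, a, b in segments:
        if z0 <= z < z1:
            return a * z + b
    # Beyond the last section, extrapolate along the final straight line.
    _, _, a, b = segments[-1]
    return a * z + b
```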
[0034]
In step 3, the recognition unit 12 calculates the distance calculation data Dc. As described below, the distance calculation data Dc is a set of representative distances, calculated one per section by dividing the two-dimensional plane defined by the distance image data Dp into a plurality of sections. The representative distance for a given section is uniquely calculated from the parallax group existing in that section.
[0035]
Specifically, as shown in FIG. 6, the ij plane defined by the read distance image data Dp is first divided in a grid pattern at a predetermined interval (for example, every four pixels in the horizontal direction), defining a plurality of sections (vertical strips). Then, one of the sections arranged in the horizontal direction (for example, the leftmost section) is selected.
Next, within the selected section, arbitrary data (for example, the data whose (i, j) position is closest to the origin) is selected from all the data existing in the section. Here, "data" refers to a parallax d associated with a position (i, j). For the selected data, the three-dimensional position (x, y, z) is calculated based on a well-known coordinate conversion formula. The real-space coordinate system, set based on the position of the host vehicle, takes the road surface directly below the center of the main camera 2 as the origin, the vehicle width direction as the x axis, the vehicle height direction as the y axis, and the vehicle length direction (the distance direction) as the z axis. The height of the road surface at this distance z is then calculated using the linear equations of the road model described above and compared with the height of the coordinate-converted data. If the data lies above the road surface, it is extracted as three-dimensional object data.
[0036]
At this time, data whose height differs little from the road surface (for example, by about 0.1 m or less) is regarded as data relating to white lines, dirt, shadows, and the like on the road, and the recognition unit 12 does not handle it in the subsequent processing. Likewise, data located higher than the height of the host vehicle is regarded as data relating to pedestrian bridges, signs, and the like, and is also excluded. As a result, only the data presumed to represent three-dimensional objects on the road, that is, data higher than the road surface and lower than the vehicle height of the host vehicle, is selected as three-dimensional object data.
[0037]
Thereafter, the same processing is repeatedly executed for all the data in the section, and the three-dimensional object data in the section are extracted one after another. For example, with the i coordinate fixed, data are selected while the j coordinate is shifted sequentially from 0 to 199; the process then moves to the next i coordinate (i + 1) and data are selected in the same manner while shifting the j coordinate.
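A minimal sketch of this per-section extraction; the coordinate conversion to_xyz is an assumed placeholder, road_height_at is the road model sketch above, and the height margins are assumptions consistent with the description:

```python
def extract_object_data(section_data, to_xyz, road_height_at, segments,
                        min_margin: float = 0.1, vehicle_height: float = 1.5):
    """Extract three-dimensional object data from one vertical section.
    section_data: iterable of (i, j, d) parallax data in the section."""
    object_data = []
    for (i, j, d) in section_data:
        x, y, z = to_xyz(i, j, d)            # well-known coordinate conversion
        road_y = road_height_at(z, segments)
        # Skip data near the road surface (white lines, dirt, shadows) and
        # data above the host vehicle (pedestrian bridges, signs).
        if road_y + min_margin < y < road_y + vehicle_height:
            object_data.append((x, y, z))
    return object_data
```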
[0038]
When the above three-dimensional object estimation processing is completed for one section, a histogram (distance frequency distribution) is created for the extracted three-dimensional object data. This distance frequency distribution is obtained by counting the number of three-dimensional object data falling in intervals of a preset distance (for example, 15 m), and represents the frequency of the three-dimensional object data with the distance z on the horizontal axis. Once this distance frequency distribution has been created, the interval whose frequency is equal to or greater than a predetermined threshold and which contains the most frequent distance is detected. If an interval satisfying this condition exists, the recognition unit 12 judges that a three-dimensional object exists in that interval, and calculates the distance to the three-dimensional object as, for example, the average or a representative value of the distances belonging to the interval, or the midpoint of the interval.
[0039]
In general, the distance image data Dp calculated by stereo matching often contains erroneous data due to mismatching and the like. One form of such erroneous data is that data with a value at a certain position is calculated even though no three-dimensional object exists at the corresponding position in real space. Therefore, in the created distance frequency distribution, it is preferable to judge that a three-dimensional object exists in an interval only when that interval has the maximum frequency and the frequency is equal to or greater than a preset threshold, and to judge that no object exists when the maximum frequency is at or below the judgment value. This is because, if a solid object of a certain size exists, the frequency of the corresponding interval tends to be large, whereas if no object exists, the frequency arising from erroneous data alone tends to be small. As a result, even when some noise is included in the data, a three-dimensional object can be detected while minimizing its influence.
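A sketch of the distance frequency distribution and the noise check just described, assuming a hypothetical 15 m interval width and a minimum-frequency judgment value min_count:

```python
import numpy as np

def section_distance(zs, interval: float = 15.0, min_count: int = 3):
    """Representative distance of one section from its distance frequency
    distribution, or None when the most frequent interval stays at or
    below the judgment value (treated as noise / no object)."""
    if len(zs) == 0:
        return None
    zs = np.asarray(zs, dtype=float)
    bins = (zs // interval).astype(int)      # interval of preset distance
    counts = np.bincount(bins)
    peak = counts.argmax()
    if counts[peak] <= min_count:            # frequency too small: no object
        return None
    # Average of the distances falling in the most frequent interval.
    return float(zs[bins == peak].mean())
```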
[0040]
The data of the next section (for example, the second section from the left end) is then selected, and the extraction of data above the road surface, the creation of the distance frequency distribution, the judgment of whether a three-dimensional object exists in the section, and the calculation of the distance to the three-dimensional object are performed in the same way. This processing is performed for all sections of the divided distance image data Dp. As a result, distance calculation data Dc is calculated, in which each of the plurality of divided sections is associated with the distance in the monitoring area specified based on the parallax group in that section.
[0041]
FIG. 7 illustrates the distance to the object detected for each section of the distance calculation data Dc. In the figure, the position in real space corresponding to the distance of each section is indicated by solid lines and points. Referring again to FIG. 4, it can be seen that the distance measurement data Dm based on the output from the laser radar 10 is similar in character to the output from the stereo camera, that is, to the image-based distance calculation data Dc. Both data Dc and Dm are obtained by radially measuring the distances from the main camera 2 or the laser radar 10 to three-dimensional objects, and the distance at each scanning position of the laser radar 10 corresponds to the distance of a section of the distance calculation data Dc shown in FIG. 7. Therefore, in the first embodiment, as described below, the data Dc and Dm are integrated, thereby integrating the distance image data Dp, which is the original form of the distance calculation data Dc, with the distance measurement data Dm.
[0042]
In step 4 following step 3, the data correction unit 13 corrects the distance calculation data Dc calculated on an image basis. FIG. 8 is a detailed flowchart showing the correction procedure of step 4. As a premise, it is assumed that the data correction unit 13 has acquired the distance calculation data Dc output from the recognition unit 12, and has also acquired the distance measurement data Dm output from the laser radar 10 that corresponds to the distance image data Dp from which the distance calculation data Dc was calculated (more precisely, to the pair of image data that is the source of that distance image data Dp). Here, the number of measurement values in the radar-based distance measurement data Dm is defined as m. Furthermore, in order to specify the position of each measurement value in the distance measurement data Dm, the scanning positions are associated with the measurement values from left to right across the scanning range and numbered 1 to m in ascending order.
[0043]
First, in step 40, 1 is set as the variable n that specifies the position to be processed on the radar-based distance measurement data Dm. At this time, the data correction unit 13 determines the number m of measurements in the scanning range for the acquired radar-based distance measurement data Dm, and sets m as the maximum number Mmax of measurement values to be processed.
[0044]
In step 41, it is determined whether a measurement value exists at the n-th position on the radar-based distance measurement data Dm. If the determination in step 41 is affirmative, that is, if a measurement value (data) is determined to exist at position n, the data correction unit 13 proceeds to step 42 and specifies the data corresponding to position n as the first distance L1. In other words, the distance to the three-dimensional object in the monitoring area at the predetermined position on the distance measurement data Dm is specified as the first distance L1. On the other hand, if the determination is negative, that is, if no measurement value is determined to exist at position n, the data correction unit 13 proceeds to step 47, described later.
[0045]
In step 43 following step 42, the section on the image-based distance calculation data Dc corresponding to the position n on the radar-based distance measurement data Dm is specified. In other words, in this step the section of the image-based distance calculation data Dc indicating the distance to the three-dimensional object in the monitoring area (more precisely, to the three-dimensional object corresponding to the distance L1 specified from position n on the radar-based distance measurement data Dm) is specified. There is a correlation between the position n on the radar-based distance measurement data Dm and the section on the image-based distance calculation data Dc, and the section is uniquely identified from Equation 4 based on Equations 2 and 3 below.
[Equation 2]
zo = L1 × cos θs + dz
xo = L1 × sin θs + dx
[0046]
Here, as shown in FIG. 9, Equation 2 gives the coordinate position (xo, zo) in real space of the three-dimensional object O corresponding to the position n on the radar-based distance measurement data Dm. Note that θs is the inclination from the laser radar 10 to the three-dimensional object O corresponding to position n, measured with reference to the scanning direction of the laser radar 10 parallel to the z axis (an inclination to the left is +θ, an inclination to the right is −θ). The angle θc from the main camera 2 to the three-dimensional object O with respect to the z-axis direction is then expressed by Equation 3.
[Equation 3]
θc = tan⁻¹(xo / zo)
[0047]
[Equation 4]
N = Ncen + θc / dθc
[0048]
Here, N denotes the section number when the leftmost section of the image-based distance calculation data Dc is numbered 1 and adjacent sections are numbered in ascending order. Ncen denotes the number of the section on the distance calculation data Dc corresponding to the extending direction of the z axis, and dθc denotes the angular interval of the sections (for example, 0.2° for a 4-pixel width). Then, based on the section N calculated by Equation 4, the distance l to the three-dimensional object associated with the section N is specified as the second distance L2.
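A sketch of this position-to-section mapping, implementing Equations 2 to 4 directly; the installation offsets dz and dx, Ncen, and dθc are assumed values:

```python
import math

def radar_to_section(L1: float, theta_s: float, dz: float = 0.0,
                     dx: float = 0.0, Ncen: int = 64,
                     dtheta_c: float = math.radians(0.2)) -> int:
    """Section number N on the image-based distance calculation data Dc
    corresponding to a radar measurement (distance L1 at scan angle
    theta_s, positive to the left)."""
    zo = L1 * math.cos(theta_s) + dz              # Equation 2
    xo = L1 * math.sin(theta_s) + dx
    theta_c = math.atan2(xo, zo)                  # Equation 3: tan^-1(xo / zo)
    return int(round(Ncen + theta_c / dtheta_c))  # Equation 4
```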
[0049]
In step 44, the first distance L1 and the second distance L2 are compared, and according to the comparison result, a third distance L3 is determined as a correction value for correcting the distance l in the monitoring area associated with the section N on the image-based distance calculation data Dc. In the first embodiment, the value of the third distance L3 is determined according to whether the difference between the first distance L1 and the second distance L2 (specifically, the absolute value |L1 − L2|) is equal to or less than a predetermined threshold Lth.
[0050]
When an affirmative determination is made in step 44, the data correction unit 13 judges that the first distance L1 and the second distance L2 are approximately equal values. In step 45, L1 (the first distance), based on the output from the laser radar 10, is set as the third distance L3. Accordingly, in the image-based distance calculation data Dc, the distance l in the monitoring area associated with the specified section N is corrected to the first distance L1.
[0051]
In general, the image-based distance image data Dp, and the distance calculation data Dc derived from it, are advantageous in terms of detection accuracy because various objects (particularly small objects) can be detected over a wide range. On the other hand, the radar-based distance measurement data Dm, although inferior to the stereo camera in detection accuracy, is advantageous in terms of the reliability of the measured distances (ranging accuracy). Therefore, in the first embodiment, the radar-based first distance L1 is given priority, and the reliability of the monitoring accuracy is improved by trusting this value.
[0052]
On the other hand, if a negative determination is made in step 44, the data correction unit 13 judges that the first distance L1 and the second distance L2 are different values. In step 46, L2 (the second distance), based on the output from the stereo camera, is set as the third distance L3. Thus, in the image-based distance calculation data Dc, the distance l in the monitoring area associated with the specified section N is corrected to the second distance L2 (that is, the distance l is maintained as it is).
[0053]
In general, as long as the radar-based distance L1 and the image-based distance L2 indicate the distance of the same object existing in a given section, the two values rarely differ greatly. However, the distances L1 and L2 may differ when they indicate the distances of separate objects, or when the difference between the two ranging principles shows up strongly. In such cases, priority is given to the image-based distance calculation data Dc, which can detect various objects with high accuracy, and the image-based distance L2 is adopted. This improves the reliability of the monitoring accuracy.
[0054]
In step 47, it is determined whether n matches Mmax. When an affirmative determination is made, the data correction unit 13 judges that the image-based distance calculation data Dc and the radar-based distance measurement data Dm have been integrated at all positions n, and exits this routine. When a negative determination is made, the process proceeds to step 48, where n + 1 is set as the variable n, and the above processing is repeated until n matches Mmax.
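Putting steps 40 to 48 together, a minimal sketch of the correction loop; Dm maps each position n to a measurement (or None), Dc maps each section N to its distance l, and the threshold Lth is an assumed value:

```python
def correct_distances(Dm: dict, Dc: dict, Lth: float = 2.0) -> dict:
    """Integrate radar-based Dm into image-based Dc, following Fig. 8.
    Uses radar_to_section() from the sketch above."""
    Dc_corrected = dict(Dc)
    for n, measurement in Dm.items():        # steps 40, 47, 48: all positions
        if measurement is None:              # step 41: no data at position n
            continue
        L1, theta_s = measurement            # step 42: first distance L1
        N = radar_to_section(L1, theta_s)    # step 43: matching section
        L2 = Dc_corrected.get(N)             # second distance L2
        if L2 is None:
            continue
        # Steps 44-46: adopt the radar distance when the two agree within
        # Lth, otherwise keep the image-based distance.
        Dc_corrected[N] = L1 if abs(L1 - L2) <= Lth else L2
    return Dc_corrected
```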
[0055]
In step 5 following step 4 in FIG. 5, the recognition unit 12 recognizes three-dimensional objects based on the distance calculation data Dc′ (that is, the distance calculation data Dc corrected by the data correction unit 13). Specifically, the distances of the sections are compared sequentially from the left to the right of the image, and sections whose distances are close in the front-rear and lateral directions are collected into groups. The data arrangement direction is then checked for each group, each group is divided where the arrangement direction changes greatly, and from the overall arrangement direction of its data each group is classified as a three-dimensional object or a wall portion. Here, a three-dimensional object refers to a preceding vehicle, a person, an obstacle, or the like, and a wall portion refers to a side wall along the road such as a guardrail. For a group classified as a three-dimensional object, the average distance and the positions of the left and right end portions are calculated as parameters from the data in the group; for a group classified as a side wall, the arrangement direction and the positions of the front and rear end portions are calculated as parameters. The control unit 14 then warns the driver or performs vehicle control based on the recognized distances to the preceding vehicle, persons, or obstacles and on the road conditions.
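A much-simplified sketch of the left-to-right grouping, assuming a hypothetical proximity threshold gap in metres and omitting the lateral-direction test and the arrangement-direction split for brevity:

```python
def group_sections(Dc_corrected: dict, gap: float = 2.0) -> list:
    """Collect adjacent sections with mutually close distances into groups."""
    groups, current, prev = [], [], None
    for N in sorted(Dc_corrected):
        z = Dc_corrected[N]
        if z is None:
            continue
        if prev is not None and abs(z - prev) > gap:
            groups.append(current)           # distance jump: start a new group
            current = []
        current.append((N, z))
        prev = z
    if current:
        groups.append(current)
    return groups
```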
[0056]
When a three-dimensional object or a wall is recognized and the group parameters are calculated, the sections constituting the group may include distances that have been corrected to the first distance L1 in the corrected distance calculation data Dc′. In this case, it is preferable to calculate the group distance using only the corrected distances L1: as described above, the laser radar 10 has high ranging accuracy, so the ranging accuracy can be improved by using only those values. In addition, when a plurality of corrected distances exist among the sections constituting the group, it is even better to use their average value.
[0057]
As described above, according to the stereo monitoring system 1 of the first embodiment, the image-based distance calculation data Dc and the radar-based distance measurement data Dm can be integrated at the data level. As a result, the advantage of stereo image recognition, which can widely detect objects large and small, and the advantage of the laser radar 10, which excels in the reliability of measured distances, can both be obtained, so the monitoring accuracy can be improved. Furthermore, according to the stereo monitoring system 1 of the first embodiment, complicated operations such as setting a processing area based on the data from the laser radar 10 and performing stereo matching only on that processing area, as in the prior art, are not required, so the devices and parts for that purpose can be omitted.
[0058]
In the first embodiment, when the difference between the first distance L1 and the second distance L2 is equal to or less than the threshold, the average value of the two distances L1 and L2 may instead be determined as the third distance L3. Also, although the first embodiment compares the first distance L1 with the second distance L2, it is also possible to trust the ranging accuracy of the radar-based distance measurement data Dm and always adopt its value; in other words, the distance l in the monitoring area associated with the specified section N may always be corrected to the first distance L1.
[0059]
(Second Embodiment)
FIG. 10 is a flowchart illustrating the detailed procedure of the monitoring process according to the second embodiment. The monitoring procedure according to the second embodiment differs from that of the first embodiment in the processing from step 6 onward shown in FIG. 10; description of the steps in common with the first embodiment is omitted.
[0060]
Specifically, the difference between the second embodiment and the first embodiment is that parallax correction is performed. As long as the radar-based distance measurement data Dm and the image-based distance calculation data Dc indicate the distance of the same object existing in a given section, it is undesirable for the two distance values to differ. Therefore, in the second embodiment, as shown in FIG. 11, the parallax correction unit 15 of the stereo monitoring system 1a corrects the distance error so that the image-based distance calculation data Dc matches the radar-based distance measurement data Dm.
[0061]
The parallax correction unit 15 calculates the vanishing point parallax dp, a parameter serving as a distance correction value, based on the distance measurement data Dm output from the laser radar 10 and the distance calculation data Dc output from the recognition unit 12.
The calculated vanishing point parallax dp (more precisely, the vanishing point parallax dp′, the average of the plurality of vanishing point parallaxes dp) is then reflected in the recognition unit 12. The recognition unit 12 can thereby newly calculate the image-based distance calculation data Dc′ based on the vanishing point parallax dp′ and recognize three-dimensional objects in the monitoring area.
[0062]
FIG. 12 is a flowchart showing the detailed procedure of step 6 shown in FIG. 10. Since steps 60 to 63 are the same as steps 40 to 43 described above, their detailed description is omitted here.
[0063]
In step 64, the parallax correction unit 15 calculates the vanishing point parallax dp as the parameter described above, based on the first distance L1 and the second distance L2. Specifically, based on Equation 5 below, the vanishing point parallax dp is calculated so that the second distance L2 in the image-based distance calculation data Dc coincides with the first distance L1 in the radar-based distance measurement data Dm.
[Equation 5]
dp = dpx−KZH / z
[0064]
Here, Equation 5 is derived from the well-known coordinate conversion formula shown in Equation 6.
[Equation 6]
z = KZH / (dpx − dp)
[0065]
Here, KZH is a predetermined constant (camera base length / horizontal viewing angle), and dpx is the parallax d corresponding to the second distance L2 based on the output from the stereo camera. The measured value based on the output from the laser radar 10, that is, the first distance L1, is substituted for z.
[0066]
At the time the distance calculation data Dc is calculated in step 3, it is preferable to use 0 or a predetermined value as the initial setting for this dp.
[0067]
In step 65, it is determined whether n matches Mmax.
If an affirmative determination is made in step 65, the parallax correction unit 15 judges that the vanishing point parallax dp has been calculated for all applicable sections of the distance calculation data Dc, and the process proceeds to step 67. If a negative determination is made, the process proceeds to step 66, where n is set to n + 1, and the processing is repeated until n matches Mmax.
[0068]
Then, in step 67 following step 65, the average value dp′ of all the calculated vanishing point parallaxes dp is calculated, and this routine is exited.
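A sketch of steps 64 to 67, applying Equation 5 to every matched pair and averaging; pairs is an assumed list of (dpx, L1) values gathered over the scanning range:

```python
def vanishing_point_parallax(pairs, KZH: float) -> float:
    """Average vanishing point parallax dp' via Equation 5,
    dp = dpx - KZH / z with z = L1 (the radar-based first distance)."""
    dps = [dpx - KZH / L1 for (dpx, L1) in pairs if L1 > 0]
    return sum(dps) / len(dps) if dps else 0.0   # 0 as the initial setting
```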
[0069]
In step 7 following step 6, the recognition unit 12 again divides the distance image data Dp into a plurality of sections. Then, based on the parallax group existing in each section and the vanishing point parallax dp′, the three-dimensional positions are calculated using the well-known coordinate conversion formulas including Equation 6. Distance calculation data Dc′, in which each of the divided sections is associated with its calculated distance, is then calculated by the same procedure as in step 3 described above. Since the radar-based distance measurement data Dm is integrated at the time of this coordinate conversion, this distance calculation data Dc′ is calculated as substantially the same data as the corrected distance calculation data Dc′ described in the first embodiment.
[0070]
In step 8, the recognition unit 12 recognizes three-dimensional objects based on the corrected distance calculation data Dc′. The control unit 14 then warns the driver or performs vehicle control based on the recognized distance to the preceding vehicle and the road conditions.
[0071]
Once the vanishing point parallax dp′ has been calculated in step 6, this value can be used continuously in the subsequent processing, so the processing of steps 3 to 6 can be skipped for the next frame to be processed. However, it is preferable to recalculate this vanishing point parallax dp (or vanishing point parallax dp′) at a predetermined frame interval so that the optimum vanishing point parallax is always adopted.
[0072]
As described above, according to the stereo monitoring system 1a of the second embodiment, the accuracy of the three-dimensional information about an object can be improved by correcting the set vanishing point.
[0073]
In the present embodiments, monitoring outside a vehicle has been described as an example, but the present invention is not limited to this. The present invention can be applied to various other uses that combine stereo image processing and laser radar, such as level crossing monitoring, terrain recognition, or altitude measurement.
[0074]
【The invention's effect】
Thus, according to the present invention, the distance at a predetermined position in the distance measurement data, the output data of the laser radar, is used to correct the distance of the corresponding section in the distance calculation data, the output data of the recognition unit. Since three-dimensional objects can then be recognized using data in which the distance measurement data and the distance calculation data are integrated, the monitoring accuracy can be improved.
[Brief description of the drawings]
FIG. 1 is a block diagram showing a stereo monitoring system according to a first embodiment.
FIG. 2 is an explanatory diagram of a pixel block set in a reference image
FIG. 3 is an explanatory diagram showing laser radar distance detection.
FIG. 4 is an explanatory diagram showing laser radar distance detection.
FIG. 5 is a flowchart showing a monitoring procedure according to the first embodiment.
FIG. 6 is an explanatory diagram of distance image data divided in a grid pattern.
FIG. 7 is an explanatory diagram of a distance to an object detected for each category related to distance calculation data.
FIG. 8 is a detailed flowchart showing the correction procedure of step 4 shown in FIG. 5.
FIG. 9 is an explanatory diagram showing the correlation between distance measurement data and distance calculation data.
FIG. 10 is a flowchart showing a detailed procedure of monitoring processing according to the second embodiment.
FIG. 11 is a block diagram showing a stereo monitoring system according to a second embodiment.
FIG. 12 is a flowchart showing the detailed procedure of step 6 shown in FIG. 10.
[Explanation of symbols]
1,1a Stereo monitoring system
2 Main camera
3 Sub camera
4 A / D converter
5 A / D converter
6 Image correction unit
7 Stereo image processing unit
8 Image data memory
9 Distance data memory
10 Laser radar
11 Microcomputer
12 Recognition unit
13 Data correction unit
14 Control unit
15 Parallax correction unit

Claims (4)

  1. A monitoring system for monitoring a situation in a monitoring area, comprising:
    a stereo camera that captures a scene including the monitoring area and outputs a pair of image data;
    a stereo image processing unit that calculates parallax by stereo matching based on the pair of image data and outputs distance image data in which a parallax group related to image data corresponding to one frame is associated with coordinate positions on an image plane defined by the image data;
    a laser radar that measures distance in the monitoring area and outputs a two-dimensional distribution of the distance in the monitoring area as distance measurement data;
    a recognition unit that divides a two-dimensional plane defined by the distance image data into a plurality of sections, calculates a distance in the monitoring area based on the parallax group existing in each section, and calculates distance calculation data in which each of the divided sections is associated with the calculated distance; and
    a data correction unit that, when it is determined that a measurement value corresponding to the distance to a three-dimensional object in the monitoring area exists at a predetermined position in the distance measurement data, specifies the measurement value as a first distance, specifies the section in the distance calculation data that indicates the distance to the three-dimensional object, specifies the distance calculated for the specified section as a second distance, and determines, by comparing the first distance and the second distance, a third distance as a correction value for correcting the distance calculated for the specified section,
    wherein the recognition unit recognizes the three-dimensional object in the monitoring area based on the corrected distance calculation data, and
    wherein the data correction unit determines the first distance as the third distance when the difference between the first distance and the second distance is within a predetermined threshold, and determines the second distance as the third distance when the difference between the first distance and the second distance is larger than the predetermined threshold.
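A minimal sketch of the decision rule recited in claim 1, assuming a hypothetical threshold of 2 m; the function name and the threshold value are illustrative only and are not taken from the patent:

    def third_distance_claim1(first: float, second: float,
                              threshold: float = 2.0) -> float:
        """Adopt the radar (first) distance when the two sensors agree to
        within the threshold, otherwise keep the stereo (second) distance."""
        if abs(first - second) <= threshold:
            return first   # difference within threshold: first distance
        return second      # difference exceeds threshold: second distance

For example, third_distance_claim1(30.0, 30.5) returns 30.0 (the radar value is adopted), while third_distance_claim1(30.0, 35.0) returns 35.0 (the stereo value is retained).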
  2. A monitoring system for monitoring a situation in a monitoring area, comprising:
    a stereo camera that captures a scene including the monitoring area and outputs a pair of image data;
    a stereo image processing unit that calculates parallax by stereo matching based on the pair of image data and outputs distance image data in which a parallax group related to image data corresponding to one frame is associated with coordinate positions on an image plane defined by the image data;
    a laser radar that measures distance in the monitoring area and outputs a two-dimensional distribution of the distance in the monitoring area as distance measurement data;
    a recognition unit that divides a two-dimensional plane defined by the distance image data into a plurality of sections, calculates a distance in the monitoring area based on the parallax group existing in each section, and calculates distance calculation data in which each of the divided sections is associated with the calculated distance; and
    a data correction unit that, when it is determined that a measurement value corresponding to the distance to a three-dimensional object in the monitoring area exists at a predetermined position in the distance measurement data, specifies the measurement value as a first distance, specifies the section in the distance calculation data that indicates the distance to the three-dimensional object, specifies the distance calculated for the specified section as a second distance, and determines, by comparing the first distance and the second distance, a third distance as a correction value for correcting the distance calculated for the specified section,
    wherein the recognition unit recognizes the three-dimensional object in the monitoring area based on the corrected distance calculation data, and
    wherein the data correction unit determines the average value of the first distance and the second distance as the third distance when the difference between the first distance and the second distance is within a predetermined threshold, and determines the second distance as the third distance when the difference between the first distance and the second distance is larger than the predetermined threshold.
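Claim 2 differs from claim 1 only in averaging the two distances when they agree; a corresponding sketch under the same illustrative assumptions:

    def third_distance_claim2(first: float, second: float,
                              threshold: float = 2.0) -> float:
        """Average the two distances when they agree to within the
        threshold, otherwise keep the stereo (second) distance."""
        if abs(first - second) <= threshold:
            return (first + second) / 2.0  # within threshold: average value
        return second                      # otherwise: second distance

For example, third_distance_claim2(30.0, 31.2) returns 30.6, the average of the two measurements.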
  3. A monitoring method for monitoring a situation in a monitoring area using distance image data, in which a parallax group related to image data corresponding to one frame, calculated by stereo matching based on a pair of image data obtained by capturing a scene including the monitoring area, is associated with positions on an image plane defined by the image data, and distance measurement data, calculated as a two-dimensional distribution of distance by measuring the distance in the monitoring area with a laser radar, the method comprising:
    a first step of dividing a two-dimensional plane defined by the distance image data into a plurality of sections, calculating a distance in the monitoring area based on the parallax group existing in each section, and calculating distance calculation data in which each of the divided sections is associated with the calculated distance;
    a second step of specifying, when it is determined that a measurement value corresponding to the distance to a three-dimensional object in the monitoring area exists at a predetermined position in the distance measurement data, the measurement value as a first distance;
    a third step of specifying, in the distance calculation data, the section indicating the distance to the three-dimensional object;
    a fourth step of specifying, in the distance calculation data, the distance calculated for the specified section as a second distance;
    a fifth step of determining, by comparing the first distance and the second distance, a third distance as a correction value for correcting the distance calculated for the specified section; and
    a sixth step of recognizing the three-dimensional object in the monitoring area based on the corrected distance calculation data,
    wherein, in the fifth step, the first distance is determined as the third distance when the difference between the first distance and the second distance is within a predetermined threshold, and the second distance is determined as the third distance when the difference between the first distance and the second distance is larger than the predetermined threshold.
  4. A monitoring method for monitoring a situation in a monitoring area using distance image data, in which a parallax group related to image data corresponding to one frame, calculated by stereo matching based on a pair of image data obtained by capturing a scene including the monitoring area, is associated with positions on an image plane defined by the image data, and distance measurement data, calculated as a two-dimensional distribution of distance by measuring the distance in the monitoring area with a laser radar, the method comprising:
    a first step of dividing a two-dimensional plane defined by the distance image data into a plurality of sections, calculating a distance in the monitoring area based on the parallax group existing in each section, and calculating distance calculation data in which each of the divided sections is associated with the calculated distance;
    a second step of specifying, when it is determined that a measurement value corresponding to the distance to a three-dimensional object in the monitoring area exists at a predetermined position in the distance measurement data, the measurement value as a first distance;
    a third step of specifying, in the distance calculation data, the section indicating the distance to the three-dimensional object;
    a fourth step of specifying, in the distance calculation data, the distance calculated for the specified section as a second distance;
    a fifth step of determining, by comparing the first distance and the second distance, a third distance as a correction value for correcting the distance calculated for the specified section; and
    a sixth step of recognizing the three-dimensional object in the monitoring area based on the corrected distance calculation data,
    wherein, in the fifth step, the average value of the first distance and the second distance is determined as the third distance when the difference between the first distance and the second distance is within a predetermined threshold, and the second distance is determined as the third distance when the difference between the first distance and the second distance is larger than the predetermined threshold.
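Tying the sketches above together, a hypothetical end-to-end run (all values invented for illustration) corrects a small grid of stereo-derived distances with two radar measurements before the recognition step would consume the result:

    stereo = {(0, 0): 24.8, (0, 1): 25.1, (1, 1): 40.3}  # second distances per section
    radar = {(0, 1): 25.6, (1, 1): 48.0}                 # first distances per section
    print(correct_distance_data(stereo, radar, third_distance_claim1))
    # -> {(0, 0): 24.8, (0, 1): 25.6, (1, 1): 40.3}
    # (0, 1): difference 0.5 m is within the threshold, so the radar value is adopted;
    # (1, 1): difference 7.7 m exceeds it, so the stereo value is retained.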
JP2002184018A 2002-06-25 2002-06-25 Monitoring system and monitoring method, distance correction device and distance correction method in the monitoring system Active JP4032843B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2002184018A JP4032843B2 (en) 2002-06-25 2002-06-25 Monitoring system and monitoring method, distance correction device and distance correction method in the monitoring system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2002184018A JP4032843B2 (en) 2002-06-25 2002-06-25 Monitoring system and monitoring method, distance correction device and distance correction method in the monitoring system

Publications (2)

Publication Number Publication Date
JP2004028727A JP2004028727A (en) 2004-01-29
JP4032843B2 true JP4032843B2 (en) 2008-01-16

Family

ID=31180026

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2002184018A Active JP4032843B2 (en) 2002-06-25 2002-06-25 Monitoring system and monitoring method, distance correction device and distance correction method in the monitoring system

Country Status (1)

Country Link
JP (1) JP4032843B2 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4926437B2 (en) * 2005-09-28 2012-05-09 富士重工業株式会社 Vehicle driving support device
JP4899402B2 (en) * 2005-10-05 2012-03-21 株式会社日立製作所 Imaging device
JP2007240276A (en) * 2006-03-07 2007-09-20 Olympus Corp Distance measuring device/imaging device, distance measuring method/imaging method, distance measuring program/imaging program, and storage medium
JP5145585B2 (en) * 2007-06-08 2013-02-20 国立大学法人 熊本大学 Target detection device
JP2010091426A (en) * 2008-10-08 2010-04-22 Toyota Central R&D Labs Inc Distance measuring device and program
JP5208193B2 (en) * 2010-12-28 2013-06-12 ヤフー株式会社 Related word graph creation device, related word graph creation method, related word providing device, related word providing method, and program
EP2722646A4 (en) * 2011-06-14 2015-02-25 Nissan Motor Distance measurement device and environment map generation apparatus
CZ2012586A3 (en) * 2012-08-29 2014-03-12 Beistar3D Limited Method of describing points of object space objects and circuit arrangement for making the same
JP6540009B2 (en) 2013-12-27 2019-07-10 株式会社リコー Image processing apparatus, image processing method, program, image processing system
JP2015179077A (en) * 2014-02-25 2015-10-08 株式会社リコー Parallax calculation system, information processing device, information processing method, and program
EP3223034A1 (en) 2016-03-16 2017-09-27 Ricoh Company, Ltd. Object detection apparatus and moveable apparatus
JP2019219180A (en) * 2018-06-15 2019-12-26 日立オートモティブシステムズ株式会社 Object detection device for vehicles

Also Published As

Publication number Publication date
JP2004028727A (en) 2004-01-29

Similar Documents

Publication Publication Date Title
JP3587506B2 (en) Stereo camera adjustment device
US6956469B2 (en) Method and apparatus for pedestrian detection
JP3711405B2 (en) Method and system for extracting vehicle road information using a camera
JP4328692B2 (en) Object detection device
US7362881B2 (en) Obstacle detection system and method therefor
JP5188452B2 (en) Road shape recognition device
CA2174590C (en) Method of matching stereo images and method of measuring disparity between these images
JP4650079B2 (en) Object detection apparatus and method
JP4861574B2 (en) Driving assistance device
US6734787B2 (en) Apparatus and method of recognizing vehicle travelling behind
US6999896B2 (en) Identical object determination method and apparatus and displacement correction method and apparatus
EP1087336A2 (en) Apparatus and method for stereoscopic image processing
US8867790B2 (en) Object detection device, object detection method, and program
JP5689907B2 (en) Method for improving the detection of a moving object in a vehicle
JP3671825B2 (en) Inter-vehicle distance estimation device
JP2004032460A (en) Image processing apparatus and method therefor
US8548226B2 (en) Stereo image processing device and method
JP3352655B2 (en) Lane recognition device
JP3630100B2 (en) Lane detection device
US6873912B2 (en) Vehicle tracking system
EP2713309A2 (en) Method and device for detecting drivable region of road
EP1394761A2 (en) Obstacle detection device and method therefor
KR101411668B1 (en) A calibration apparatus, a distance measurement system, a calibration method, and a computer readable medium recording a calibration program
US6985619B1 (en) Distance correcting apparatus of surroundings monitoring system and vanishing point correcting apparatus thereof
US8244027B2 (en) Vehicle environment recognition system

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20050602

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20070302

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20070417

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20070605

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20071010

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20071015

R150 Certificate of patent or registration of utility model

Ref document number: 4032843

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20101102

Year of fee payment: 3

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20111102

Year of fee payment: 4

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20121102

Year of fee payment: 5

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20131102

Year of fee payment: 6

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

S531 Written request for registration of change of domicile

Free format text: JAPANESE INTERMEDIATE CODE: R313531

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

S533 Written request for registration of change of name

Free format text: JAPANESE INTERMEDIATE CODE: R313533

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250