JP2013239078A - Image analysis device, image analysis method and image analysis program - Google Patents


Info

Publication number
JP2013239078A
JP2013239078A (application number JP2012112553A)
Authority
JP
Japan
Prior art keywords
target pixel
parallax
pixel
index value
image analysis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2012112553A
Other languages
Japanese (ja)
Inventor
Kashu Takemae
嘉修 竹前
Fumiya Ichino
史也 一野
Akihiro Watanabe
章弘 渡邉
Original Assignee
Toyota Motor Corp
トヨタ自動車株式会社
Toyota Central R&D Labs Inc
株式会社豊田中央研究所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toyota Motor Corp (トヨタ自動車株式会社) and Toyota Central R&D Labs Inc (株式会社豊田中央研究所)


Abstract

To perform processing with higher accuracy.
An image analysis apparatus comprising: parallax calculation means that calculates, between a target pixel selected from a first image captured by a first imaging unit and each of a plurality of comparison target pixels in a second image captured by a second imaging unit, an index value indicating the degree of difference between pixel values including those of peripheral pixels, extracts the comparison target pixel with the minimum index value, calculates the coordinate difference between the extracted comparison target pixel and the target pixel as a parallax, and associates that parallax with the target pixel; and selection means that selects, from the combinations of target pixel and parallax associated by the parallax calculation means, combinations in which the slope of the index value in the vicinity of the comparison target pixel with the minimum index value is equal to or greater than a reference value.
[Selection] Figure 1

Description

  The present invention relates to an image analysis apparatus, an image analysis method, and an image analysis program that perform processing using stereo camera technology.

  Conventionally, a technique is known in which the same object is imaged from different angles and the distance to the object is calculated from the positional relationship between corresponding points in the plurality of images. Such a technique is called stereo camera technology. Stereo camera technology is mounted on vehicles and is used, for example, to calculate the distance to an obstacle ahead of the vehicle.

  An index value such as the SAD (Sum of Absolute Differences) is used to find corresponding points in a plurality of images, that is, points (pixels) estimated to capture the same imaging target. The SAD is obtained as the sum of the absolute values of the differences between the pixel values (for example, luminance values) of the pixels in a reference area in one image and the pixel values of the pixels in a reference area in the other image. Patent Document 1 describes a stereo image processing apparatus in which the point at which the SAD takes its minimum value is treated as the position on the sub-image with the highest degree of correlation to a point of interest in the main image.

JP 2005-250994 A

  However, in stereo camera technology, including the device described in Patent Document 1, searching for corresponding points using numerical values such as pixel values can end up associating points that do not actually capture the same imaging target.

  Such mismatches can be significant when, for example, trying to find corresponding points on the road surface from a vehicle. This is because the road surface, apart from painted white lines and the like, often consists of continuous low-contrast asphalt or concrete, so large features rarely occur in the pixel values. As a result, the accuracy of calculating the distance to the imaging target may be degraded.

  An object of the present invention is to provide an image analysis device, an image analysis method, and an image analysis program that can perform processing with higher accuracy.

In order to achieve the above object, one aspect of the present invention is:
an image analysis apparatus that analyzes images captured by first and second imaging means that capture an at least partially common imaging range from different imaging angles, the apparatus comprising:
parallax calculating means that calculates, between a target pixel selected from a first image captured by the first imaging means and each of a plurality of comparison target pixels in a second image captured by the second imaging means, an index value indicating the degree of difference between pixel values including those of peripheral pixels, extracts the comparison target pixel with the minimum index value, calculates the coordinate difference between the extracted comparison target pixel and the target pixel as a parallax, and associates that parallax with the target pixel, for each of a plurality of target pixels; and
selection means that selects, from the combinations of target pixel and parallax associated by the parallax calculating means, combinations in which the slope of the index value in the vicinity of the comparison target pixel with the minimum index value is equal to or greater than a reference value.

  According to this aspect of the present invention, by selecting the combinations in which the slope of the index value in the vicinity of the comparison target pixel with the minimum index value is equal to or greater than the reference value, significant parallaxes can be selected, that is, parallaxes calculated from combinations of a target pixel and a comparison target pixel that are likely to capture the same imaging target, and processing using them can be performed with high accuracy.

In one embodiment of the present invention,
the apparatus may further comprise road surface structure estimating means that estimates the road surface structure using the combinations of target pixels and parallaxes selected by the selection means.

In this case,
the apparatus may further comprise road surface determining means that extracts, from the combinations of target pixels and parallaxes selected by the selection means, the target pixels and parallaxes estimated to capture the road surface, based on the degree of variation of the parallax in the depth direction of the image, and
the road surface structure estimating means may estimate the road surface structure using the combinations of target pixels and parallaxes extracted by the road surface determining means as capturing the road surface.

Another aspect of the present invention is:
an image analysis apparatus that analyzes images captured by first and second imaging means that capture an at least partially common imaging range from different imaging angles, the apparatus comprising:
parallax calculating means that calculates, between a target pixel selected from a first image captured by the first imaging means and each of a plurality of comparison target pixels in a second image captured by the second imaging means, an index value indicating the degree of coincidence of pixel values including those of peripheral pixels, extracts the comparison target pixel with the maximum index value, calculates the coordinate difference between the extracted comparison target pixel and the target pixel as a parallax, and associates that parallax with the target pixel, for each of a plurality of target pixels; and
selection means that selects, from the combinations of target pixel and parallax associated by the parallax calculating means, combinations in which the slope of the index value in the vicinity of the comparison target pixel with the maximum index value is equal to or greater than a reference value.

  According to this aspect of the present invention, by selecting the combinations in which the slope of the index value, indicating the degree of coincidence of pixel values including those of peripheral pixels, in the vicinity of the comparison target pixel with the maximum index value is equal to or greater than the reference value, significant parallaxes can be selected, that is, parallaxes calculated from combinations of a target pixel and a comparison target pixel that are likely to capture the same imaging target, and processing using them can be performed with high accuracy.

Another aspect of the present invention is:
an image analysis method for analyzing images captured by first and second imaging units that capture an at least partially common imaging range from different imaging angles, the method comprising:
calculating, between a target pixel selected from a first image captured by the first imaging unit and each of a plurality of comparison target pixels in a second image captured by the second imaging unit, an index value indicating the degree of difference between pixel values including those of peripheral pixels, and extracting the comparison target pixel with the minimum index value;
calculating the coordinate difference between the extracted comparison target pixel and the target pixel as a parallax and associating it with the target pixel, for each of a plurality of target pixels; and
selecting, from the associated combinations of target pixel and parallax, combinations in which the slope of the index value in the vicinity of the comparison target pixel with the minimum index value is equal to or greater than a reference value.

Another aspect of the present invention is:
an image analysis program that causes a computer to analyze images captured by first and second imaging units that capture an at least partially common imaging range from different imaging angles, the program comprising:
calculating, between a target pixel selected from a first image captured by the first imaging unit and each of a plurality of comparison target pixels in a second image captured by the second imaging unit, an index value indicating the degree of difference between pixel values including those of peripheral pixels, and extracting the comparison target pixel with the minimum index value;
calculating the coordinate difference between the extracted comparison target pixel and the target pixel as a parallax and associating it with the target pixel; and
selecting, from the associated combinations of target pixel and parallax, combinations in which the slope of the index value in the vicinity of the comparison target pixel with the minimum index value is equal to or greater than a reference value.

  According to one aspect of the present invention, it is possible to provide an image analysis apparatus, an image analysis method, and an image analysis program that can perform processing with higher accuracy.

FIG. 1 is a system configuration example of an image analysis apparatus 1 according to an embodiment of the present invention.
FIG. 2 is a diagram illustrating the imaging ranges of the cameras 100 and 200.
FIG. 3 is a diagram illustrating an image captured by the camera 100 and an image captured by the camera 200.
FIG. 4 is a diagram showing the positions Pl and Pr of a certain imaging target P(X, Y, Z) in the left image and the right image.
FIG. 5 is a diagram showing how a right edge image and a left edge image are generated from the right image and the left image.
FIG. 6 is an example of a flowchart showing the flow of processing executed by a parallax calculation unit 14.
FIG. 7 is a diagram schematically showing how the parallax calculation unit 14 scans the right edge image and the left edge image, selects a target pixel and a comparison target pixel, and sets the SAD window * and the SAD window **.
FIG. 8 is a diagram illustrating a waveform of SAD values (hereinafter referred to as a SAD waveform) around the local and global minimum (a gentle case).
FIG. 9 is a diagram illustrating a SAD waveform around the local and global minimum (a steep case).
FIG. 10 is a diagram schematically showing the principle of obtaining sub-pixel parallax.
FIG. 11 is a diagram schematically showing how a road surface determination unit 20 scans the right edge image.
FIG. 12 is an explanatory diagram for explaining the coordinate axes according to Expression (10).

  DESCRIPTION OF EMBODIMENTS

  Hereinafter, an image analysis apparatus, an image analysis method, and an image analysis program according to an embodiment of the present invention will be described with reference to the accompanying drawings.

[Configuration]
FIG. 1 is a system configuration example of an image analysis apparatus 1 according to an embodiment of the present invention. The image analysis apparatus 1 is used by being connected to cameras 100 and 200.

  The cameras 100 and 200 are, for example, CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) cameras.

  The cameras 100 and 200 are attached to, for example, the left and right ends of the rear-view mirror portion of the vehicle, and capture images ahead of and obliquely below the vehicle. The cameras 100 and 200 are attached so that their vertical optical axis angles (depression angles) coincide. The cameras 100 and 200 transmit images captured at a predetermined cycle to the image analysis apparatus 1. FIG. 2 is a diagram illustrating the imaging ranges of the cameras 100 and 200. As shown in the figure, the cameras 100 and 200 capture an at least partially common imaging range from different imaging angles. FIG. 3 is a diagram illustrating an image captured by the camera 100 (hereinafter referred to as the left image) and an image captured by the camera 200 (hereinafter referred to as the right image).

  The image analysis apparatus 1 is, for example, a microcomputer in which a ROM (Read Only Memory), a RAM (Random Access Memory), and the like are connected to each other via a bus, centered on a CPU (Central Processing Unit). It further includes storage devices such as an HDD (Hard Disc Drive), a DVD-R (Digital Versatile Disk-Recordable) drive, a CD-R (Compact Disc-Recordable) drive, and an EEPROM (Electronically Erasable and Programmable Read Only Memory), as well as I/O ports, timers, counters, and the like.

  The image analysis apparatus 1 includes, as functional blocks that function when the CPU executes a program stored in a storage device, an image input unit 10, a distortion correction/parallelization processing unit 12, a parallax calculation unit 14, a selection unit 16, a sub-pixel parallax calculation unit 18, a road surface determination unit 20, a white-line-detection-based parallax calculation unit 22, and a road surface structure estimation unit 24.

  The program executed by the CPU is installed in the storage device by, for example, mounting a storage medium storing the program in a drive device. The program may also be downloaded from another computer via a network through an interface device such as an in-vehicle Internet facility and installed in the storage device, or stored in the storage device, ROM, or the like in advance when the image analysis apparatus 1 is shipped.

  The image input unit 10 stores image data input from the cameras 100 and 200 in a RAM or the like.

  The distortion correction/parallelization processing unit 12 performs distortion correction processing on the image data sets captured by the cameras 100 and 200 at the same timing, using internal and external parameters of the cameras 100 and 200, and also performs parallelization processing so that the same imaging target appears at the same position with respect to the vertical direction, that is, the direction of distance. Since the cameras 100 and 200 are mounted at the same height and with the same depression angle, the same imaging target should in principle be captured at the same vertical position; however, depending on the roll motion of the vehicle, vibration of the rear-view mirror, and so on, the above parallelization processing may be required.

  The distortion correction processing may be performed using, for example, a correction conversion table based on the design values of the lenses of the cameras 100 and 200, or by parameter estimation using a radial distortion aberration model. Further, the parallelization processing for the mounting error of the cameras 100 and 200 can be performed by an experiment carried out in advance, in which a lattice pattern is placed in the common imaging range of the cameras 100 and 200 and the relative relationship between the cameras is calculated from the associated lattice points.

[Parallax calculation]
FIG. 4 is a diagram showing the positions Pl and Pr of a certain imaging target P(X, Y, Z) in the left image and the right image. The positions Pl and Pr have been corrected by the distortion correction/parallelization processing unit 12 so that their coordinates in the vertical direction of the image (hereinafter, the y direction) are the same, but their coordinates in the horizontal direction (hereinafter, the x direction) differ according to the distance between the cameras 100 and 200. The distance to the imaging target P can be obtained by calculating the shift amount between the x coordinates of Pl and Pr as the parallax, and performing a calculation that reflects the y coordinates of Pl and Pr, the parallax, and the installation parameters of the cameras 100 and 200.

  When calculating the parallax, it is necessary to associate pixels estimated to capture the same imaging target between the left image and the right image. First, the parallax calculation unit 14 applies a Sobel filter or the like to the left and right images corrected by the distortion correction/parallelization processing unit 12 to generate edge images with emphasized edges. FIG. 5 is a diagram showing how a right edge image and a left edge image are generated from the right image and the left image. Note that the parallax calculation unit 14 may generate the edge images using another method, such as a Prewitt filter, instead of the Sobel filter.
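This edge-emphasis step can be sketched as follows. The code below is an illustrative NumPy implementation of a horizontal 3×3 Sobel kernel, not code from the patent; the function and variable names are made up:

```python
import numpy as np

# 3x3 Sobel kernel emphasizing horizontal intensity changes (vertical edges).
SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

def sobel_edge_image(img):
    """Return an edge image: |Sobel-x response| at each interior pixel."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2]
            out[y, x] = abs(np.sum(patch * SOBEL_X))
    return out

# A vertical step edge produces strong responses along the boundary.
img = np.zeros((5, 6))
img[:, 3:] = 100.0
edges = sobel_edge_image(img)
```

In this toy image, the column just left of the step gets a large response while flat regions stay at zero, which is exactly the contrast the subsequent SAD matching relies on.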

  Then, the parallax calculation unit 14 calculates the parallax at each pixel of the right edge image by performing the following processing, for example, using the right edge image as a standard image and the left edge image as a reference image. FIG. 6 is an example of a flowchart showing a flow of processing executed by the parallax calculation unit 14. This flowchart is repeatedly executed, for example, at a predetermined cycle.

  First, the parallax calculation unit 14 selects one pixel in the right edge image (S300). Hereinafter, the pixel selected in S300 is referred to as the target pixel. For example, the parallax calculation unit 14 sequentially selects target pixels by starting at the pixel in the upper-left corner of the right edge image and shifting one pixel to the right at a time; on reaching the right end, it continues from the leftmost pixel of the next lower row.

  Next, the parallax calculation unit 14 sets an SAD window * centered on the target pixel (S302). The SAD window * is set as a rectangular range extending, for example, several pixels above and below and several pixels to the left and right of the target pixel (the vertical and horizontal extents may differ).

  Next, the parallax calculation unit 14 selects one pixel in the left edge image having the same y coordinate as the target pixel (S304). Hereinafter, the pixel selected in S304 is referred to as the comparison target pixel. For example, the parallax calculation unit 14 sequentially selects comparison target pixels in the row of the left edge image with that y coordinate, starting from the leftmost pixel and shifting one pixel to the right at a time.

  Next, the parallax calculation unit 14 sets an SAD window ** centered on the comparison target pixel (S306). The SAD window ** is set with the same size and shape as the SAD window *.

  Then, the parallax calculation unit 14 calculates the SAD value between the SAD window * set in S302 and the SAD window ** set in S306 (S308). The SAD value is the sum of the absolute values of the differences between the pixel values of pixels at the same position (for example, luminance values, though color may also be taken into consideration), and is defined by Expression (1). In the expression, N is the vertical size of the SAD window, M is the horizontal size of the SAD window, I is the pixel value at the coordinates (i, j) in the SAD window *, and T is the pixel value at the coordinates (i, j) in the SAD window **. The SAD value is an example of "an index value indicating the degree of difference between the pixel values, including those of peripheral pixels, of the target pixel and of the comparison target pixel".

SAD = Σj=1..N Σi=1..M |I(i, j) − T(i, j)| (1)

Instead of the SAD value, other index values having similar properties, such as the SSD (Sum of Squared Differences) value, may be used. The SSD value is the sum of the squares of the differences between the pixel values of pixels at the same position, and is defined by Expression (2).

SSD = Σj=1..N Σi=1..M (I(i, j) − T(i, j))^2 (2)
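Both index values can be written in a few lines. The sketch below is illustrative (the function names are not from the patent) and assumes the two windows have already been extracted with the same size:

```python
import numpy as np

def sad(I, T):
    """Expression (1): sum of absolute differences between two equal-size windows."""
    return float(np.sum(np.abs(np.asarray(I, float) - np.asarray(T, float))))

def ssd(I, T):
    """Expression (2): sum of squared differences between two equal-size windows."""
    d = np.asarray(I, float) - np.asarray(T, float)
    return float(np.sum(d * d))

a = np.array([[10, 20], [30, 40]])
b = np.array([[12, 18], [30, 44]])
# SAD: |10-12| + |20-18| + |30-30| + |40-44| = 2 + 2 + 0 + 4 = 8
```

Identical windows give a SAD (or SSD) of zero, which is why the matching step looks for the minimum of these values.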

Next, the parallax calculation unit 14 determines whether all pixels having the same y coordinate as the target pixel have been selected from the left edge image in S304 (S310). If not, the parallax calculation unit 14 returns to S304 and selects the pixel to the right as the next comparison target pixel.

  On the other hand, when all pixels having the same y coordinate as the target pixel have been selected from the left edge image, the parallax calculation unit 14 extracts, from the SAD values calculated in the loop of S304 to S310, the SAD value that is both a local minimum and the global minimum, and calculates the coordinate difference (distance in the x direction) between the comparison target pixel for which that SAD value was obtained and the target pixel as the parallax for the target pixel (S312).

  The series of index values, such as the SAD values corresponding to the target pixel, and the data on the comparison target pixel at which the SAD value is the local and global minimum, are stored in a storage device such as the RAM so that the selection unit 16 can use them in later processing.

  Note that the parallax calculation unit 14 does not calculate a parallax for the target pixel when there is no SAD value that is both a local and global minimum. This occurs, for example, when the SAD value is smallest at the edge of the left image or when the SAD value is constant for all comparison target pixels; in such cases no significant parallax can be obtained.

  Next, the parallax calculation unit 14 determines whether all pixels have been selected from the right edge image in S300 (S314). When not all pixels have been selected, the parallax calculation unit 14 returns to S300 and selects the pixel to the right (or, on reaching the right end, the leftmost pixel of the next lower row) as the next target pixel. On the other hand, when all pixels have been selected from the right edge image, the parallax calculation unit 14 ends one routine of this flowchart.

  FIG. 7 is a diagram schematically showing how the parallax calculation unit 14 scans the right edge image and the left edge image, selects a target pixel and a comparison target pixel, and sets the SAD window * and the SAD window **.
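The loop of S300 to S312 can be sketched for a single target pixel as follows. This is an illustrative simplification, not the patent's implementation: the window is square, the local/global-minimum check described above is omitted, and the function names are made up:

```python
import numpy as np

def sad(w1, w2):
    return float(np.sum(np.abs(w1.astype(float) - w2.astype(float))))

def parallax_for_pixel(base, ref, x, y, half=1):
    """For target pixel (x, y) in the base (right) edge image, scan every
    pixel with the same y coordinate in the reference (left) edge image,
    compute the SAD between the two windows, and return
    (parallax, list of SAD values)."""
    h, w = base.shape
    win_b = base[y - half:y + half + 1, x - half:x + half + 1]
    sads = []
    for xc in range(half, w - half):  # comparison target pixels on row y
        win_r = ref[y - half:y + half + 1, xc - half:xc + half + 1]
        sads.append(sad(win_b, win_r))
    best_xc = int(np.argmin(sads)) + half
    return best_xc - x, sads

# The "left image" here is the same content shifted 2 pixels to the right,
# so the expected parallax is 2.
base = np.arange(7 * 12, dtype=float).reshape(7, 12)
ref = np.roll(base, 2, axis=1)
d, series = parallax_for_pixel(base, ref, x=4, y=3)
```

The SAD series dips to zero exactly where the shifted windows line up, and the x-coordinate difference at that dip is the parallax.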

[Selecting significant parallax (pixel)]
The parallax associated with each target pixel calculated by the parallax calculation unit 14 may be used as-is for distance calculation and the like, but when higher accuracy is required, such as when estimating the structure of the road surface, it is preferable to narrow the results down to significant parallaxes, that is, parallaxes calculated from combinations of a target pixel and a comparison target pixel that are likely to capture the same imaging target.

FIG. 8 and FIG. 9 illustrate the waveform of the SAD values (hereinafter referred to as the SAD waveform) at the comparison target pixel Q whose SAD value is the local and global minimum, the comparison target pixel Q−1 to its left, and the comparison target pixel Q+1 to its right. Some SAD waveforms are gentle, as shown in FIG. 8, and some are steep, as shown in FIG. 9. A significant parallax can be calculated from a combination of target pixel and comparison target pixel that yields a steep SAD waveform, as in FIG. 9.

  Therefore, for each target pixel for which a parallax has been calculated, the selection unit 16 calculates, for example by Expression (3), an index value α indicating the slope of the SAD values in the vicinity of the comparison target pixel Q at which the SAD value is the local and global minimum, and narrows the results down to the target pixels whose index value α is equal to or greater than a reference value and their corresponding parallaxes. The reference value may be a fixed value or a statistical value (for example, the average value of α).

α = |(SAD value at comparison target pixel Q−1) − (SAD value at comparison target pixel Q)| + |(SAD value at comparison target pixel Q+1) − (SAD value at comparison target pixel Q)| (3)

  Expression (3) is one example of deriving a value α reflecting the slope of the SAD values in the vicinity of the comparison target pixel whose SAD value is the local and global minimum; various modifications are allowed, such as replacing the adjacent pixels Q−1 and Q+1 with pixels several pixels away.
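The selection step can be sketched as follows. The data layout (a dict mapping a pixel label to its parallax and the three SAD values around Q) is made up for illustration; only Expression (3) itself is taken from the text:

```python
def slope_index(sad_left, sad_q, sad_right):
    """Expression (3): alpha from the SAD values at Q-1, Q and Q+1."""
    return abs(sad_left - sad_q) + abs(sad_right - sad_q)

def select_significant(candidates, reference):
    """Keep only (target pixel, parallax) pairs whose SAD waveform around
    the minimum is steep: alpha >= reference value."""
    return {
        px: parallax
        for px, (parallax, (left, q, right)) in candidates.items()
        if slope_index(left, q, right) >= reference
    }

candidates = {
    "p1": (5, (90.0, 10.0, 85.0)),  # steep waveform -> significant
    "p2": (7, (14.0, 10.0, 13.0)),  # gentle waveform -> discarded
}
selected = select_significant(candidates, reference=50.0)
```

Here "p1" survives (α = 80 + 75 = 155) while "p2" is dropped (α = 4 + 3 = 7), mirroring the FIG. 9 versus FIG. 8 distinction.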

  By performing such processing, combinations of a significant parallax and its target pixel can be selected. As a result, the image analysis apparatus 1 according to the present embodiment can perform the road surface structure estimation processing and the like described later with higher accuracy.

[Calculation of sub-pixel parallax]
The sub-pixel parallax calculation unit 18 converts the parallaxes selected by the selection unit 16 into sub-pixel parallaxes to improve the resolution. In the following description, since each parallax is associated with a target pixel, "selecting a combination of a target pixel and a parallax" is simply referred to as "selecting a parallax".
FIG. 10 is a diagram schematically showing the principle of obtaining the sub-pixel parallax. As shown in the figure, let the SAD value at the comparison target pixel Q whose SAD value is the local and global minimum be c, the SAD value at the comparison target pixel Q−1 to its left be a, and the SAD value at the comparison target pixel Q+1 to its right be b. Then the difference SP between the x coordinate of a virtual comparison target pixel Q*, at which the SAD value would be truly minimal, and the x coordinate of the comparison target pixel Q is expressed by Expression (4).

SP = (a − b) / (2(a − c)) (a > b)
SP = (a − b) / (2(b − c)) (a ≤ b) (4)

  The sub-pixel parallax calculation unit 18 obtains a more accurate sub-pixel parallax by adding the calculated difference SP to the parallax.
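Expression (4) and the final addition can be sketched directly; the function names below are illustrative, not from the patent:

```python
def subpixel_offset(a, c, b):
    """Expression (4): offset SP of the true minimum from Q, from the SAD
    values a (at Q-1), c (at Q) and b (at Q+1)."""
    if a > b:
        return (a - b) / (2.0 * (a - c))
    return (a - b) / (2.0 * (b - c))

def subpixel_parallax(parallax, a, c, b):
    """Add SP to the pixel-level parallax, as the unit 18 does."""
    return parallax + subpixel_offset(a, c, b)
```

For a symmetric waveform (a = b) the offset is zero, and when the left neighbor is higher than the right (a > b) the true minimum lies to the right of Q, giving a positive SP.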

Note that the processing of the selection unit 16 may be performed after the sub-pixel parallax is calculated. In this case, if the virtual comparison target pixel Q* lies between Q−1 and Q, the selection unit 16 may calculate the index value α* by Expression (5); if Q* lies between Q and Q+1, it may calculate α* by Expression (6).

α* = |(SAD value at Q−1) − (SAD value at Q*)| + |(SAD value at Q) − (SAD value at Q*)| (5)
α* = |(SAD value at Q) − (SAD value at Q*)| + |(SAD value at Q+1) − (SAD value at Q*)| (6)

[Road surface determination]
The road surface determination unit 20 scans the right edge image in, for example, the depth direction of the image and, when the parallaxes associated with a predetermined number of vertically consecutive target pixels fall within a threshold of one another (for example, within one pixel), determines that those consecutive target pixels capture a three-dimensional object. The road surface determination unit 20 then determines that target pixels other than those determined to capture a three-dimensional object are pixels capturing the road surface. FIG. 11 is a diagram schematically showing how the road surface determination unit 20 scans the right edge image. That the parallaxes associated with target pixels arranged in the depth direction fall within a threshold of one another indicates that the distance from the cameras 100 and 200 is substantially constant; such a series of target pixels is therefore considered to capture a three-dimensional object (for example, the rear end surface of a preceding vehicle or a roadside wall), and the other target pixels are considered to capture the road surface. Note that the depth direction substantially coincides with the vertical direction at the center of the image, but toward the left and right ends of the image it converges toward the center as it goes from the near side to the far side.
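As a sketch of this determination, the function below labels one vertical run of parallaxes, marking any window of consecutive values that stay within a tolerance as a three-dimensional object and everything else as road surface. The run length, tolerance, and labels are illustrative choices, not values from the patent:

```python
def classify_column(parallaxes, run_length=4, tolerance=1.0):
    """Label each target pixel in a depth-direction scan as 'object' when it
    belongs to a run of `run_length` consecutive parallaxes that stay within
    `tolerance` of each other, else 'road'."""
    n = len(parallaxes)
    labels = ['road'] * n
    for i in range(n - run_length + 1):
        window = parallaxes[i:i + run_length]
        if max(window) - min(window) <= tolerance:
            for j in range(i, i + run_length):
                labels[j] = 'object'
    return labels

# Road parallax shrinks steadily with depth; a near-constant stretch in the
# middle suggests a vertical surface such as a preceding vehicle's rear end.
column = [10.0, 8.0, 6.0, 4.0, 4.2, 4.1, 3.9, 2.0]
labels = classify_column(column)
```

Only the near-constant stretch (indices 3 to 6) is labeled as an object; the steadily changing head and tail of the column are kept as road-surface candidates.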

[Parallax calculation based on white line detection]
Meanwhile, the white-line-detection-based parallax calculation unit 22 extracts, from the left and right images corrected by the distortion correction/parallelization processing unit 12, pixels whose luminance difference from an adjacent pixel is equal to or greater than a threshold, and narrows them down to pixels arranged along a straight line or, in a curve section, a curved line, thereby extracting the pixels corresponding to the contour of a white line. Hereinafter, the extracted pixels are referred to as white line edges. Points arranged along a straight or curved line can be extracted by, for example, the Hough transform or the least squares method.
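A minimal sketch of this extraction follows: a luminance-difference threshold against the horizontally adjacent pixel, then a least-squares straight-line fit standing in for the Hough transform mentioned above. The threshold value and function names are illustrative assumptions:

```python
import numpy as np

def white_line_edge_points(gray, threshold=50.0):
    """Return (xs, ys) of pixels whose luminance difference with the
    horizontally adjacent pixel is at least `threshold`."""
    diff = np.abs(np.diff(gray.astype(float), axis=1))
    ys, xs = np.nonzero(diff >= threshold)
    return xs, ys

def fit_line(xs, ys):
    """Least-squares straight line x = m*y + k through the candidates
    (a simple stand-in for the Hough transform)."""
    m, k = np.polyfit(ys, xs, 1)
    return m, k

# A synthetic image with one bright vertical line at column 4: the contour
# candidates sit on both sides of the line.
gray = np.zeros((10, 10))
gray[:, 4] = 255.0
xs, ys = white_line_edge_points(gray)
m, k = fit_line(xs, ys)
```

The fitted line is vertical (slope ≈ 0 in x-over-y form) and centered between the two contour columns.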

  Then, similarly to the parallax calculation unit 14, the white-line-detection-based parallax calculation unit 22 obtains the parallax between a white line edge (target pixel) in the right image and the corresponding white line edge (comparison target pixel) in the left image, thereby calculating the parallax associated with each white line edge (target pixel).

[Road surface structure estimation]
The road surface structure estimation unit 24 estimates the road surface structure by reflecting both the sub-pixel parallaxes obtained by the parallax calculation unit 14, the selection unit 16, and the sub-pixel parallax calculation unit 18, together with the determinations of the road surface determination unit 20, and the parallaxes calculated by the white-line-detection-based parallax calculation unit 22.

  The road surface structure estimation unit 24 selects parallaxes that satisfy the following conditions and estimates the road surface structure using the selected parallaxes. When different parallaxes are calculated for the same target pixel, parallax satisfying condition 1 may be given precedence over parallax satisfying conditions 2 and 3, and parallax satisfying condition 2 precedence over parallax satisfying condition 3.

Condition 1: Parallax calculated by the parallax calculation unit 22 based on white line detection and selected by the selection unit 16.
Condition 2: Parallax calculated by the parallax calculation unit 22 based on white line detection.
Condition 3: Parallax selected by the selection unit 16 and associated with a target pixel determined by the road surface determination unit 20 to be a pixel depicting the road surface.

  When the parallaxes have been selected as described above, the road surface structure estimation unit 24 first converts each parallax into three-dimensional information (distance D, lateral position X, height Y) according to equations (7) to (9). In the equations, f is the focal length of the cameras 100 and 200, B is the baseline length between them, Δd is the parallax, and x and y are the x coordinate and y coordinate of the target pixel.

D = fB / Δd (7)
X = xD / f (8)
Y = yD / f (9)
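Equations (7) to (9) translate directly into code; this small helper is only an illustration of the conversion, and the function and parameter names are assumptions (the patent defines only the symbols D, X, Y, f, B, Δd, x, y).

```python
def disparity_to_3d(x, y, delta_d, f, B):
    """Convert a target pixel's image coordinates (x, y) and parallax
    delta_d into three-dimensional information per equations (7)-(9):
    f is the focal length in pixels, B the camera baseline length."""
    D = f * B / delta_d      # eq. (7): distance
    X = x * D / f            # eq. (8): lateral position
    Y = y * D / f            # eq. (9): height
    return D, X, Y
```

For example, with f = 1000 px, B = 0.5 m, and a parallax of 10 px, a target pixel at (100, 20) maps to a point 50 m ahead.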

  Further, since the road surface structure is represented by equation (10), the road surface structure estimation unit 24 calculates the parameters of equation (10) by applying the least squares method or the like to the plurality of three-dimensional road surface points obtained from equations (7) to (9). In the equation, a is the longitudinal curvature, b is the pitch angle of the cameras 100 and 200, and c is the height of the cameras 100 and 200. FIG. 12 is an explanatory diagram of the coordinate axes in equation (10).

Y = a × Z² + b × Z + c (10)
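The least squares fit of equation (10) can be sketched with an ordinary polynomial fit; `numpy.polyfit` is one concrete choice of "the least squares method or the like", and the function name here is an assumption.

```python
import numpy as np

def estimate_road_surface(Z, Y):
    """Fit the road surface model of equation (10), Y = a*Z**2 + b*Z + c,
    to the three-dimensional road points by ordinary least squares,
    recovering the longitudinal curvature a, the pitch-angle term b,
    and the camera height c."""
    a, b, c = np.polyfit(Z, Y, 2)
    return a, b, c
```

Given noise-free points generated from known parameters, the fit recovers them exactly (up to floating-point error); with real stereo points, outlier-robust variants would typically be layered on top.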

[Summary]
According to the image analysis apparatus 1 of the present embodiment described above, the index value α indicating the slope of the SAD values in the vicinity of the comparison target pixel whose SAD value is the minimum and a local minimum is calculated, and the target pixels and their associated parallaxes are narrowed down to those whose index value α is equal to or greater than the reference value, so that only combinations of a target pixel and a significant parallax are selected. This makes it possible to perform subsequent processing, such as road surface structure estimation, with higher accuracy.
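The narrowing-down step can be sketched as follows. The patent does not give the exact formula for α; the average absolute difference to the two neighbouring SAD values used here is one plausible reading of "the slope of the index value in the vicinity" of the minimum, and the names `select_significant_disparity` and `alpha_ref` are assumptions.

```python
import numpy as np

def select_significant_disparity(sad_curve, alpha_ref=4.0):
    """Given the SAD values of one target pixel over candidate disparities,
    take the minimum and compute an index alpha for the steepness of the
    SAD curve around it. The disparity index is kept only if
    alpha >= alpha_ref, rejecting flat, ambiguous minima such as those
    produced by texture-poor road surfaces."""
    i = int(np.argmin(sad_curve))
    if i == 0 or i == len(sad_curve) - 1:
        return None                      # minimum on the border: unreliable
    alpha = (abs(sad_curve[i - 1] - sad_curve[i])
             + abs(sad_curve[i + 1] - sad_curve[i])) / 2.0
    return i if alpha >= alpha_ref else None
```

A sharply peaked SAD curve yields a disparity; a nearly flat one is discarded, which is the significance filtering the summary describes.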

  The best mode for carrying out the present invention has been described above with reference to the embodiment. However, the present invention is not limited to this embodiment, and various modifications and substitutions can be made without departing from the scope of the present invention.

  For example, although the parallax calculation unit 14 selects the comparison target pixel whose SAD value is both the minimum and a local minimum, it may instead simply select the comparison target pixel having the minimum SAD value.

  Further, instead of selecting the comparison target pixel for which the SAD value, an index value indicating the degree of difference between the pixel values including the peripheral pixels of the target pixel and those of the comparison target pixel, is the minimum and a local minimum, the apparatus may conversely select the comparison target pixel for which an index value indicating the degree of coincidence between those pixel values is the maximum and a local maximum (or simply the maximum).
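The patent leaves the concrete coincidence measure open; zero-mean normalized cross-correlation (ZNCC) is one standard choice where, unlike SAD, the best match is the comparison target pixel with the maximum score. The function name below is an assumption, and this is only an illustrative sketch of such a measure.

```python
import numpy as np

def zncc(patch_a, patch_b):
    """Zero-mean normalized cross-correlation between two equally sized
    patches: 1.0 for identical (up to gain/offset) patches, lower values
    for dissimilar ones, 0.0 when either patch has no variation."""
    a = patch_a.astype(float) - patch_a.mean()
    b = patch_b.astype(float) - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0
```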

DESCRIPTION OF SYMBOLS: 1 Image analysis apparatus, 10 Image input unit, 12 Distortion correction/parallelization processing unit, 14 Parallax calculation unit, 16 Selection unit, 18 Sub-pixel parallax calculation unit, 20 Road surface determination unit, 22 Parallax calculation unit based on white line detection, 24 Road surface structure estimation unit, 100, 200 Camera

Claims (6)

  1. An image analysis apparatus that analyzes images captured by first and second imaging means that capture at least a part of a common imaging range from different imaging angles, the apparatus comprising:
    parallax calculating means for calculating, between a target pixel selected from a first image captured by the first imaging means and each of a plurality of comparison target pixels in a second image captured by the second imaging means, an index value indicating a degree of difference between pixel values including peripheral pixels, extracting the comparison target pixel having the minimum index value, calculating a coordinate difference between the extracted comparison target pixel and the target pixel as a parallax, and associating the parallax with the target pixel, for a plurality of target pixels; and
    selecting means for selecting, from the combinations of target pixel and parallax associated by the parallax calculating means, combinations in which the slope of the index value in the vicinity of the comparison target pixel having the minimum index value is equal to or greater than a reference value.
  2. The image analysis apparatus according to claim 1, further comprising:
    road surface structure estimating means for estimating a road surface structure using the combinations of target pixel and parallax selected by the selecting means.
  3. The image analysis apparatus according to claim 2, further comprising:
    road surface determining means for extracting, from the combinations of target pixel and parallax selected by the selecting means, target pixels and parallaxes estimated to depict the road surface, based on the degree of variation of the parallax in the depth direction of the image,
    wherein the road surface structure estimating means estimates the road surface structure using the combinations of target pixel and parallax that the road surface determining means has estimated to depict the road surface.
  4. An image analysis apparatus that analyzes images captured by first and second imaging means that capture at least a part of a common imaging range from different imaging angles, the apparatus comprising:
    parallax calculating means for calculating, between a target pixel selected from a first image captured by the first imaging means and each of a plurality of comparison target pixels in a second image captured by the second imaging means, an index value indicating a degree of coincidence of pixel values including peripheral pixels, extracting the comparison target pixel having the maximum index value, calculating a coordinate difference between the extracted comparison target pixel and the target pixel as a parallax, and associating the parallax with the target pixel, for a plurality of target pixels; and
    selecting means for selecting, from the combinations of target pixel and parallax associated by the parallax calculating means, combinations in which the slope of the index value in the vicinity of the comparison target pixel having the maximum index value is equal to or greater than a reference value.
  5. An image analysis method performed by an image analysis apparatus that analyzes images captured by first and second imaging units that capture at least a part of a common imaging range from different imaging angles, the method comprising:
    calculating, between a target pixel selected from a first image captured by the first imaging unit and each of a plurality of comparison target pixels in a second image captured by the second imaging unit, an index value indicating a degree of difference between pixel values including peripheral pixels, and extracting the comparison target pixel having the minimum index value;
    calculating a coordinate difference between the extracted comparison target pixel and the target pixel as a parallax and associating it with the target pixel, for a plurality of target pixels; and
    selecting, from the associated combinations of target pixel and parallax, combinations in which the slope of the index value in the vicinity of the comparison target pixel having the minimum index value is equal to or greater than a reference value.
  6. An image analysis program that causes an image analysis apparatus that analyzes images captured by first and second imaging units that capture at least a part of a common imaging range from different imaging angles to execute:
    calculating, between a target pixel selected from a first image captured by the first imaging unit and each of a plurality of comparison target pixels in a second image captured by the second imaging unit, an index value indicating a degree of difference between pixel values including peripheral pixels, and extracting the comparison target pixel having the minimum index value;
    calculating a coordinate difference between the extracted comparison target pixel and the target pixel as a parallax and associating it with the target pixel; and
    selecting, from the associated combinations of target pixel and parallax, combinations in which the slope of the index value in the vicinity of the comparison target pixel having the minimum index value is equal to or greater than a reference value.
JP2012112553A 2012-05-16 2012-05-16 Image analysis device, image analysis method and image analysis program Pending JP2013239078A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2012112553A JP2013239078A (en) 2012-05-16 2012-05-16 Image analysis device, image analysis method and image analysis program

Publications (1)

Publication Number Publication Date
JP2013239078A true JP2013239078A (en) 2013-11-28

Family

ID=49764039

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2012112553A Pending JP2013239078A (en) 2012-05-16 2012-05-16 Image analysis device, image analysis method and image analysis program

Country Status (1)

Country Link
JP (1) JP2013239078A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006047252A (en) * 2004-08-09 2006-02-16 Fuji Heavy Ind Ltd Image processing unit
JP2008033750A (en) * 2006-07-31 2008-02-14 Fuji Heavy Ind Ltd Object inclination detector
JP2009041972A (en) * 2007-08-07 2009-02-26 Toshiba Corp Image processing device and method therefor
WO2012017650A1 (en) * 2010-08-03 2012-02-09 パナソニック株式会社 Object detection device, object detection method, and program

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20140508

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20150130

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20150203

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20151006