JP2004032244A - Stereo image processing apparatus and method therefor - Google Patents

Stereo image processing apparatus and method therefor

Info

Publication number
JP2004032244A
JP2004032244A (application JP2002184013A)
Authority
JP
Japan
Prior art keywords
timing
image
imaging
captured image
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2002184013A
Other languages
Japanese (ja)
Other versions
JP3958638B2 (en)
Inventor
Takayuki Sogawa
十川 能之
Original Assignee
Fuji Heavy Ind Ltd
富士重工業株式会社
Priority date
Filing date
Publication date
Application filed by Fuji Heavy Ind Ltd, 富士重工業株式会社 filed Critical Fuji Heavy Ind Ltd
Priority to JP2002184013A priority Critical patent/JP3958638B2/en
Publication of JP2004032244A publication Critical patent/JP2004032244A/en
Application granted granted Critical
Publication of JP3958638B2 publication Critical patent/JP3958638B2/en
Status: Active

Abstract

PROBLEM TO BE SOLVED: To calculate parallax accurately, even when the relative displacement between a pair of captured images is corrected, when the parallax is calculated from image data in which one frame is composed of a plurality of pixel groups captured in time series.

SOLUTION: Cameras 1 and 2 each output a captured image in which one frame is composed of a plurality of pixel groups captured in time series. A first timing control unit 3 controls the imaging timing of camera 2 in synchronization with the imaging timing of the other camera 1. Image conversion units 11 and 12 geometrically convert, with a conversion value, the relative displacement between the images output from cameras 1 and 2. A first timing correction unit 5 determines, according to the conversion value, a relative time difference between the imaging timings of cameras 1 and 2, and on the basis of that time difference outputs to unit 3 a correction amount for correcting the imaging timing of camera 2.

COPYRIGHT: (C)2004,JPO

Description

[0001]
TECHNICAL FIELD OF THE INVENTION
The present invention relates to a stereo image processing apparatus and method that, based on a pair of captured images, calculates the parallax of a reference pixel region by specifying, in the other captured image, the correlation destination of a reference pixel region set in one captured image.
[0002]
[Prior art]
In recent years, stereo image processing apparatuses that use a pair of captured images to calculate the distance to an object (that is, its parallax) in order to recognize objects appearing in the images have become known. Such an apparatus calculates the positional shift amount, that is, the parallax, of the same object projected onto the pair of captured images acquired from a stereo camera. The processing for this parallax calculation uses so-called stereo matching, which specifies, in the other captured image (the comparison image), the correlation destination of a reference pixel region in one captured image (the reference image).
[0003]
In this stereo matching, the parallax appears as a horizontal shift amount between the reference image and the comparison image. To obtain highly accurate parallax, it is therefore desirable that no positional shift other than the parallax exist between the pair of captured images. In practice, however, a displacement of the mounting position of the stereo camera (displacement in the horizontal, vertical, or rotational direction) or an optical displacement of a camera (distortion of the lens or tilt of the light-receiving surface of the CCD) can cause a shift between the images. Conventionally, to correct this shift, a geometric transformation such as rotation or translation of the image is performed using an affine transformation or the like.
[0004]
CCD sensors have commonly been used as the image sensors built into stereo cameras, but recently CMOS sensors, which allow smaller size and lower power consumption, have come into frequent use. Whereas a CCD sensor images all the pixels forming one frame at the same time, a CMOS sensor images the pixels forming one frame in time series, scanning line by scanning line (or pixel by pixel, or by predetermined pixel group).
[0005]
[Problems to be solved by the invention]
However, when a captured image whose imaging time differs for each scanning line (or pixel, or predetermined pixel group) within one frame is used, performing a geometric conversion to correct the shift between the images may cause the imaging times on the same horizontal line in the two images to differ. As a result, the imaging time of the reference pixel region and the imaging time of the pixel region specified as its correlation destination may differ.
[0006]
Such a difference does not matter as long as the imaged object is stationary: the imaging time of the correlation-destination pixel region may then differ from that of the reference pixel region without consequence. However, when the object is moving, like a vehicle (or when the stereo camera itself is mounted on a moving body), the scene in front of the camera changes from moment to moment across the per-scanning-line imaging times of a one-frame image. If the imaging time of the correlation-destination pixel region differs from that of the reference pixel region, a displacement other than the parallax, namely a displacement of the object due to this dynamic element, is included between the pair of images. As a result, when captured images in which one frame is formed by a plurality of pixel groups captured in time series are used, the calculation accuracy of the parallax may be reduced.
[0007]
The present invention has been made in view of such circumstances, and its object is to calculate parallax accurately, even when the relative displacement between the captured images is corrected, when the parallax is calculated from a pair of captured images in which one frame is composed of a plurality of pixel groups captured in time series.
[0008]
[Means for Solving the Problems]
To solve this problem, a first invention provides a stereo image processing apparatus that, based on a pair of captured images, calculates the parallax of a reference pixel region by specifying, in the other captured image, the correlation destination of a reference pixel region in one captured image. The apparatus has a pair of cameras, a first timing control unit, an image conversion unit, and a first timing correction unit. The pair of cameras each output a captured image in which one frame is composed of a plurality of pixel groups captured in time series. The first timing control unit controls the imaging timing of the other camera in synchronization with the imaging timing of one camera. The image conversion unit, taking the pair of captured images output from the pair of cameras as its processing target, geometrically converts the relative displacement of the other captured image with respect to one captured image using a predetermined conversion value. The first timing correction unit determines, according to the conversion value, a relative time difference between the imaging timing of one camera and the imaging timing of the other camera, and on the basis of the determined relative time difference outputs to the first timing control unit a correction amount for correcting the imaging timing of the other camera.
[0009]
Here, in the first invention, it is preferable that the first timing control unit change the imaging timing of the other camera, according to the correction amount, in units of the time-series imaging interval of the plurality of pixel groups.
[0010]
Further, in the first invention, it is preferable that the imaging time of the reference pixel region coincide with the imaging time of the pixel region specified as the correlation destination in the other captured image.
[0011]
Further, in the first invention, it is preferable to further include a memory storing a table in which the conversion value and the relative time difference are associated with each other. In this case, the first timing correction unit determines the relative time difference corresponding to the conversion value by reading it from the memory.
[0012]
Further, in the first invention, it is preferable to further include a second timing control unit and a second timing correction unit. Here, the second timing control unit controls the imaging timing of one camera. In addition, the second timing correction unit determines a relative time difference between the imaging timing of one camera and the imaging timing of the other camera according to the conversion value. Then, a correction amount for correcting the imaging timing of one camera based on the determined relative time difference is output to the second timing control unit.
[0013]
Further, in the first invention, it is preferable to further include a correction amount calculation unit that, taking the pair of captured images output from the image conversion unit as its processing target, calculates a conversion value correction amount for correcting the conversion value in accordance with the relative displacement of one captured image with respect to the other. In this case, the correction amount calculation unit feeds back the calculated conversion value correction amount to both the image conversion unit and the first timing correction unit.
[0014]
Further, in the first invention, it is preferable that the pixel group form a horizontal pixel row having a width of one or more pixels on the captured image plane. Alternatively, the pixel group may form a vertical pixel column having a width of one or more pixels on the captured image plane.
[0015]
Further, in the first invention, it is preferable that the first timing control unit be able to correct the relative displacement of the other captured image with respect to one captured image by controlling the imaging timing of the other camera.
[0016]
Further, a second invention provides a stereo image processing method that, based on a pair of captured images in which one frame is composed of a plurality of pixel groups captured in time series, calculates the parallax of a reference pixel region by specifying, in the other captured image, the correlation destination of a reference pixel region in one captured image. In this method, a first step controls the imaging timing of the other captured image in synchronization with the imaging timing of one captured image. A second step, taking the pair of captured images as its processing target, geometrically converts the relative displacement of the other captured image with respect to one captured image using a predetermined conversion value. A third step determines, according to the conversion value, a relative time difference between the imaging timing of one captured image and the imaging timing of the other, and on the basis of the determined relative time difference feedback-corrects the imaging timing of the other captured image.
[0017]
Here, in the second invention, it is preferable that the imaging timing of the other captured image subjected to the feedback correction be changed in units of the time-series imaging interval of the plurality of pixel groups.
[0018]
Further, in the second invention, it is preferable that the imaging time of the reference pixel region coincide with the imaging time of the pixel region specified as the correlation destination in the other captured image.
[0019]
In the second invention, it is preferable that the method further include the following steps: a step of controlling the imaging timing of one captured image; a step of determining, according to the conversion value, a relative time difference between the imaging timing of one captured image and the imaging timing of the other; and a step of feedback-correcting the imaging timing of one captured image on the basis of the determined relative time difference.
[0020]
In the second invention, it is preferable that the method further include the following steps: a step of specifying, for the pair of image-converted captured images taken as the processing target, the relative displacement of one captured image with respect to the other, and calculating, in accordance with the specified relative displacement, a conversion value correction amount for correcting the conversion value; a step of feedback-correcting the conversion value on the basis of the calculated conversion value correction amount; and a step of further feedback-correcting, on the basis of the calculated conversion value correction amount, the imaging timing of the other captured image that is subject to feedback correction.
[0021]
Further, in the second invention, it is preferable that the method further include a step of correcting the relative displacement of the other captured image with respect to one captured image by controlling the imaging timing of the other captured image.
[0022]
BEST MODE FOR CARRYING OUT THE INVENTION
(1st Embodiment)
FIG. 1 is a block diagram of a stereo image processing apparatus according to the present embodiment. This stereo image processing apparatus functions, for example, as a vehicle-exterior monitoring device: it performs stereo image processing using a pair of captured images and monitors the situation ahead of the host vehicle based on the processed information.
[0023]
A stereo camera that captures the scene ahead of the host vehicle is mounted near the rear-view mirror and comprises a pair of cameras 1 and 2. Each of the cameras 1 and 2 has a built-in image sensor that outputs a captured image (hereinafter simply called an "image") in which one frame, the display unit of an image, is formed by a plurality of pixel groups captured in time series. An example of such an image sensor is a CMOS sensor. Taking a horizontal pixel row of one or more pixels in width (hereinafter simply a "scanning line") as the imaging unit for a one-frame image, the CMOS sensor images the scanning lines in time series, moving downward from the top of the image on the basis of a reference signal (and outputs them accordingly). That is, the scanning line serving as the imaging unit forms a horizontal pixel row at least one pixel wide on the image plane.
[0024]
The main camera 1 captures the reference image (right image) required for stereo image processing, and the sub camera 2 captures the comparison image (left image). Of the pair of cameras 1 and 2, the imaging timing of the sub camera 2 (that is, the output timing of its reference signal) is controlled by a first timing control unit 3, and the imaging timing of the main camera 1 is controlled by a second timing control unit 4. The cameras 2 and 1, whose imaging timings are controlled by the first and second timing control units 3 and 4, acquire their captured images with their imaging timings synchronized with each other. In the present embodiment, the first timing control unit 3 (on the sub camera 2 side) can change the imaging timing of the sub camera 2, in accordance with the correction amount output from the subsequent timing correction unit (first timing correction unit) 5, in units of the time-series imaging interval of the pixel groups (scanning lines in the present embodiment). For example, if the time-series imaging interval per scanning line is 1 second, the first timing control unit 3 can shift the imaging timing of the sub camera 2 relative to that of the main camera 1 in steps of at least 1 second, that is, by 1 × n seconds.
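The stepwise timing adjustment described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the names `LINE_INTERVAL_S` and `quantize_timing_offset`, and the 1-second interval, are assumptions taken from the example in the text.

```python
# Time-series imaging interval per scanning line (the text's example: 1 second).
LINE_INTERVAL_S = 1.0

def quantize_timing_offset(requested_offset_s: float) -> float:
    """Round a requested imaging-timing shift for the sub camera to a whole
    multiple n of the per-scanning-line interval, i.e. 1 * n seconds."""
    n = round(requested_offset_s / LINE_INTERVAL_S)
    return n * LINE_INTERVAL_S
```

Because the sensor exposes one scanning line per interval, only shifts that are whole multiples of that interval keep the line-by-line time structure of the two frames aligned.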
[0025]
The timing correction unit 5 determines a relative time difference between the imaging timing of the main camera 1 and the imaging timing of the sub camera 2 according to the conversion values handled by first and second image conversion units 11 and 12 at the subsequent stage. In the present embodiment, the timing correction unit 5 includes a correction memory (ROM) 6 that stores a correction table associating the conversion value with the relative time difference, and determines the relative time difference corresponding to the conversion value by reading it from the correction memory 6. Then, based on the determined relative time difference, it outputs to the first timing control unit 3 a correction amount for correcting the imaging timing of the sub camera 2. The control of this imaging timing is the main feature of the present embodiment, and its details will be described later.
[0026]
The analog images output from the cameras 1 and 2 are converted by the A / D converters 7 and 8 into digital images having a predetermined luminance gradation (for example, 256 gray scales). The digitized pair of image data (stereo image data) is stored in frame memories 9 and 10 each having a capacity capable of storing one frame of digital data.
[0027]
The mounting positions of the cameras 1 and 2 inevitably vary to some degree; there is an error in the mounting accuracy, and a shift due to this mounting error arises between the left and right images. Therefore, the first and second image conversion units 11 and 12 perform the following conversion processing on the pair of images output from the cameras 1 and 2. First, the images stored in the frame memories 9 and 10 are read by the first and second image conversion units 11 and 12, respectively. Each image is geometrically transformed with a preset conversion value, so that the relative displacement of one image (for example, the reference image) with respect to the other (for example, the comparison image) is corrected. This conversion includes linear shape correction such as an affine transformation, as well as nonlinear shape correction, applied to the stereo images. The converted image data is equivalent to image data output with the cameras in the optical positions a stereo camera should have. As correction methods, for example, those described in JP-A-10-307352 or JP-A-11-325889, filed by the present applicant, can be used.
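As one example of the linear (affine) shape correction mentioned above, the mapping of a single pixel coordinate can be sketched as below. This is an illustrative sketch only; the function name and parameters are assumptions, and real calibration would use conversion values measured per device, not these placeholders.

```python
import math

def affine_map(x: float, y: float,
               angle_rad: float = 0.0, tx: float = 0.0, ty: float = 0.0):
    """Map a pixel coordinate (x, y) by rotating angle_rad about the origin
    and then translating by (tx, ty) - a minimal affine correction."""
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x - s * y + tx, s * x + c * y + ty)
```

Applying such a mapping to every pixel of the comparison image (with interpolation, in practice) corrects rotational and translational mounting error; nonlinear terms would be needed for lens distortion.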
[0028]
In the present embodiment, the first and second image converters 11 and 12 have conversion memories (ROM and the like) 13 and 14 for storing a predetermined conversion table, respectively. The image conversion units 11 and 12 read out the conversion values from the conversion memories 13 and 14, respectively, to convert the image for each pixel. Then, the converted pair of image data is output to the stereo image processing unit 15 and the recognition unit 16, respectively.
[0029]
The stereo image processing unit 15 calculates distance data for an image corresponding to one frame based on the reference image and the comparison image. Here, "distance data" is the set of parallaxes d calculated for each small area in the image plane defined by the image data; each parallax d is associated with a position (i, j) on the image plane. One parallax d is calculated for each pixel block of a predetermined area (for example, 4 × 4 pixels) forming part of the reference image.
[0030]
FIG. 2 is an explanatory diagram of the pixel blocks set in the reference image. For example, when the reference image is composed of 200 × 512 pixels, a group of parallaxes corresponding to the number of pixel blocks PBij (50 × 128) can be calculated from an image corresponding to one frame. As is well known, the parallax d is the horizontal displacement amount for the pixel block PBij that is the unit of calculation, and correlates strongly with the distance to the object projected onto that block: the closer the object shown in pixel block PBij is to the cameras 1 and 2, the larger the parallax d of that block, and the farther away the object is, the smaller d becomes (for an infinitely distant object, d becomes 0).
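The block-count arithmetic above can be checked with a short sketch (function name is illustrative, not from the patent):

```python
def block_grid(height_px: int, width_px: int, block: int = 4):
    """Number of pixel blocks PBij (rows, cols) when a height_px x width_px
    reference image is tiled by block x block pixel blocks."""
    return height_px // block, width_px // block
```

For the 200 × 512 reference image of the text, 4 × 4 blocks give the stated 50 × 128 grid, hence 6400 parallax values per frame.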
[0031]
When calculating the parallax d for a certain pixel block PBij (the correlation source), an area having a correlation with the luminance characteristics of this pixel block (the correlation destination) is specified in the comparison image. As described above, the distance from the cameras 1 and 2 to the object appears as a horizontal shift amount between the reference image and the comparison image. Therefore, when searching for the correlation destination in the comparison image, it suffices to search on the same horizontal line (the epipolar line) as the j coordinate of the correlation-source pixel block PBij. The stereo image processing unit 15 evaluates the correlation between the correlation source and each correlation-destination candidate (stereo matching), shifting one pixel at a time along the epipolar line within a predetermined search range set with reference to the i coordinate of the correlation source. In principle, the horizontal shift amount of the correlation-destination candidate judged to have the highest correlation is taken as the parallax d of the pixel block PBij.
[0032]
The correlation between two pixel blocks can be evaluated, for example, by calculating the city block distance CB. Equation 1 shows the basic form of the city block distance CB, where p1ij is the luminance value of the ij-th pixel of one pixel block and p2ij is that of the other. The city block distance CB is the sum, over the whole pixel block, of the absolute differences between the luminance values p1ij and p2ij at corresponding positions; the smaller the sum, the greater the correlation between the two pixel blocks.
(Equation 1)
CB = Σ | p1ij−p2ij |
[0033]
Basically, among the city block distances CB calculated for the pixel blocks existing on the epipolar line, the pixel block giving the minimum value is determined to be the correlation destination, and the shift amount between the correlation destination specified in this way and the correlation source is the parallax d. A hardware configuration of the stereo image processing unit 15 for calculating the city block distance CB is disclosed in Japanese Patent Application Laid-Open No. H5-114099. The distance data calculated through this processing, that is, the set of parallaxes d associated with positions (i, j) on the image, is output to the recognition unit 16.
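The matching scheme of Equation 1 and the minimum-CB selection above can be sketched in software as follows. This is a simplified illustration of the technique, not the hardware implementation the patent cites; blocks are plain 2D lists of luminance values, and the function names are assumptions.

```python
def city_block_distance(block1, block2):
    """Equation 1: CB = sum of |p1ij - p2ij| over corresponding pixels."""
    return sum(abs(p1 - p2)
               for row1, row2 in zip(block1, block2)
               for p1, p2 in zip(row1, row2))

def match_disparity(ref_block, candidate_blocks):
    """Return the horizontal shift (parallax d) of the candidate on the
    epipolar line whose city block distance to ref_block is minimal."""
    distances = [city_block_distance(ref_block, cand)
                 for cand in candidate_blocks]
    return distances.index(min(distances))
```

In the apparatus the candidates are the pixel blocks obtained by shifting one pixel at a time along the epipolar line within the search range; the index of the best match is the parallax d.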
[0034]
The recognizing unit 16 recognizes a running situation ahead of the vehicle based on the distance data output from the stereo image processing unit 15. However, the recognizing unit 16 can also use image data as appropriate in recognizing the situation in the monitoring area. Then, based on the recognition result, the recognizing unit 16 operates an alarm device such as a monitor or a speaker when it is determined that the driver needs to be alerted. In addition, the recognition unit 16 may control actuators as necessary to perform vehicle control such as downshifting and brake control.
[0035]
Hereinafter, the timing correction will be described. FIG. 3 is an explanatory diagram showing the correspondence between imaging time and image position. As a prerequisite for the stereo image processing described above, the j coordinate of a pixel block PBij (the correlation source) selected on the reference image must match the j coordinate of the area specified on the comparison image (the correlation destination). Here, assume that the imaging timing of the sub camera 2 is controlled by the first timing control unit 3 so as to coincide with the imaging timing of the main camera 1. In this case, as shown in FIG. 3A, the one-frame reference image output from the main camera 1 is composed of a plurality of scanning lines arranged in time series with imaging times t1, t2, t3, and so on. Likewise, the one-frame comparison image output from the sub camera 2 is composed of a plurality of scanning lines arranged in time series with the imaging times t1, t2, t3, and so on.
[0036]
As described above, a displacement arises between the images owing to the displacement of the cameras 1 and 2. Here, assume that this relative displacement appears as a shift of two scanning lines in the vertical direction. In this case, as shown in FIG. 3A, the object (subject A) projected onto the reference image is projected in the comparison image with a shift of two scanning lines (for example, downward). This relative displacement between the images is geometrically converted by the first and second image conversion units 11 and 12 described above, based on a predetermined conversion value, thereby ensuring the positional match between the images; for example, a conversion is performed that shifts the comparison image upward by two scanning lines relative to the reference image. As a result, as shown in FIG. 3B, the objects projected in both images lie on corresponding j coordinates in the reference image and the comparison image.
[0037]
It should be noted here that when an image sensor whose imaging time differs for each scanning line (or pixel, or pixel group), such as a CMOS sensor, is used, a one-frame image contains portions captured at different imaging times t. Therefore, when corresponding positions in the two converted images (that is, positions on the same j coordinate) are compared by imaging time, the imaging times t1, t2, t3, ... in the reference image correspond to the imaging times t3, t4, t5, ... in the comparison image (see FIG. 3B).
[0038]
When the object is a moving body such as a car, it moves by the amount of the shift in imaging time t. Therefore, unless corresponding positions on the same j coordinate of the reference image and the comparison image are captured at the same imaging time t, the error included in the parallax increases when stereo matching is performed. That is, as a prerequisite for stereo image processing using such images, not only the positional match between the reference image and the comparison image but also the temporal match of the imaging times at corresponding positions must be ensured.
[0039]
In the illustrated example, the image is shifted by two scanning lines to achieve the positional correspondence between the images. If, for example, the cameras 1 and 2 capture the scene ahead scanning line by scanning line at an imaging interval of 1 second, the imaging time two scanning lines below imaging time t1 is imaging time t3, captured 2 seconds after t1. Therefore, this problem should be solvable by changing the imaging timing of one camera (camera 1 or camera 2), in accordance with the shift amount of two scanning lines, so that the imaging times at corresponding positions after the conversion coincide.
[0040]
Suppose, then, that a predetermined correction amount is added to the original imaging timing of the sub camera 2 so that its imaging timing is advanced by two imaging times t (for example, 2 seconds) relative to the imaging timing of the main camera 1. The one-frame comparison image output from the sub camera 2 is then composed of a plurality of scanning lines arranged in time series with imaging times t-1, t0, t1, and so on. In this case, when corresponding positions of the converted reference image and comparison image are compared by imaging time, the imaging times t1, t2, t3, ... in the reference image correspond to the imaging times t1, t2, t3, ... in the comparison image (see FIG. 4). Therefore, by changing the imaging timing in accordance with the positional shift amount between the reference image and the comparison image, that is, in accordance with the conversion value used to correct this shift, corresponding positions of the reference image and the comparison image can be matched in time.
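The alignment argument above can be worked through numerically. This is a sketch under the text's own assumptions (a 1-second per-line interval, a two-line upward shift of the comparison image); the function name is illustrative.

```python
def line_imaging_times(t_start: float, n_lines: int, interval: float):
    """Imaging time of each scanning line of a frame whose first line
    is captured at t_start, at one line per interval."""
    return [t_start + i * interval for i in range(n_lines)]

# Reference frame: first line at t1 = 1.0 s, 1-second line interval.
# Sub camera advanced by 2 * interval = 2 s, so its frame starts at -1.0 s.
# The image conversion shifts the comparison image up by two scanning lines,
# so converted comparison line j comes from raw sub-camera line j + 2.
```

With these numbers, reference line j (time 1 + j) and raw sub-camera line j + 2 (time -1 + (j + 2) = 1 + j) are captured at the same instant, which is exactly the temporal match the correction is designed to produce.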
[0041]
Therefore, in the present embodiment, the timing correction unit 5 is provided for the first timing control unit 3 on the sub camera 2 side. The timing correction unit 5 determines a relative time difference according to the conversion value used to convert the images, and outputs a correction amount corresponding to that relative time difference to the first timing control unit 3. The imaging timing of the sub camera 2 controlled by the first timing control unit 3 is changed according to this correction amount; in other words, the imaging timing of the sub camera 2 is feedback-corrected. A desired relative time difference is thus secured between the imaging timing of the main camera 1 and that of the sub camera 2. As a result, in the present embodiment, the imaging time of a reference pixel area (pixel block p1ij) on the reference image can be made to coincide with the imaging time of the pixel area (pixel block p2ij) specified as its correlation destination in the comparison image.
[0042]
Here, the conversion value associated with the relative time difference is the conversion value for correcting the relative shift between the reference image and the comparison image. In the present embodiment it is therefore the difference between the conversion values handled by the first and second image conversion units 11 and 12, respectively. However, in a configuration in which image conversion is performed on only one image (for example, the comparison image), the conversion value for that image is associated with the relative time difference.
[0043]
For example, there is a linear relationship between the conversion value and the relative time difference, as shown in FIG. 5, and the relative time difference is uniquely specified by the conversion value. From the illustrated relationship it can be seen that the larger the shift amount of the comparison image with respect to the reference image, the earlier the imaging timing of the comparison image should be. The illustrated relationship is only an example, however, and should be determined so that both the positional match produced by the conversion value and the temporal match at corresponding positions after the conversion can be achieved between the reference image and the comparison image.
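A linear relationship of the kind FIG. 5 illustrates could be expressed as below. The slope (one line interval of timing advance per scanning line of vertical shift) follows from the text's example, but the function name, sign convention, and the 1-second interval are assumptions for illustration.

```python
LINE_INTERVAL_S = 1.0  # assumed per-scanning-line imaging interval

def relative_time_difference(vertical_shift_lines: float) -> float:
    """Linear conversion-value-to-time-difference mapping (illustrative):
    the larger the vertical shift of the comparison image, the earlier
    the sub camera's imaging timing must be, by one interval per line."""
    return vertical_shift_lines * LINE_INTERVAL_S
```

A correction table stored in ROM, as in the embodiment, would simply tabulate such a mapping for the conversion values measured for the individual device.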
[0044]
The correction table satisfying this relationship, stored in the correction memory 6 of the timing correction unit 5, is determined in advance for each device at the factory before shipment. Specifically, the stereo camera is attached to the vehicle body at shipment adjustment, and a test chart is then imaged to measure the displacement amount due to the mounting error between the two cameras 1 and 2; a conversion value is thereby set for each point coordinate of the individual stereo camera. Conversion tables giving the conversion values for these shift amounts are stored in the memories 13 and 14 of the image conversion units 11 and 12, respectively. Then, for example, from the difference between the conversion amounts at the image center points of the pair of images, the relative time difference for which the center points of the coordinate-converted pair of images are captured at the same time is obtained, and the correction table expressing this correspondence is determined and stored. Whether the timing correction unit 5 has the correction memory 6 storing the correction table is optional, however; the relative time difference may instead be calculated each time from the conversion value.
[0045]
As described above, according to the present embodiment, the above-described timing correction secures a desired relative time difference between the imaging timing of the main camera 1 and that of the sub camera 2. Owing to this time difference, even when image conversion is performed to correct the positional displacement of each image, temporal matching at the corresponding positions in the images can be achieved. As a result, high-quality distance data (that is, parallax) can be calculated even when an image in which one frame is formed by a plurality of pixel groups arranged in time series is used, and the recognition accuracy when recognizing the situation ahead from the distance data can be improved.
[0046]
Note that the timing correction unit 5 may instead be provided in the second timing control unit 4 on the main camera 1 side. In that case, in the above-described example, the timing correction unit 5 outputs a correction amount that delays the imaging timing of the main camera 1 by two imaging times t (for example, 2 seconds) with respect to the imaging timing of the sub camera 2. Alternatively, timing correction units may be provided in both timing control units 3 and 4. In that case, in accordance with the time difference, control is performed such that, for example, the imaging timing of the main camera 1 is delayed by one second while the imaging timing of the sub camera 2 is advanced by one second (that is, the two imaging timings are offset by a relative time difference of 2 seconds). Even with such a configuration, the same effects as in the first embodiment can be obtained.
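The variant in which both timing control units share the correction can be sketched as follows; `split_offset` and the half-and-half split are illustrative assumptions (the specification only requires that the combined offset equal the required relative time difference).

```python
# Illustrative sketch: splitting a required relative time difference between
# the two timing control units, delaying the main camera by half of it and
# advancing the sub camera by the other half.

def split_offset(required_diff: float) -> tuple[float, float]:
    """Return (main-camera delay, sub-camera offset); a negative sub-camera
    offset means its imaging timing is advanced."""
    return required_diff / 2, -required_diff / 2

main_delay, sub_offset = split_offset(2.0)
# Delaying main by 1 s and advancing sub by 1 s yields the 2 s relative difference.
assert main_delay - sub_offset == 2.0
```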
[0047]
(Second embodiment)
FIG. 6 is a block diagram showing a stereo image processing apparatus according to the second embodiment. In the second embodiment, the same components as those described in the first embodiment shown in FIG. 1 are denoted by the same reference numerals, and description thereof is omitted.
[0048]
The second embodiment differs from the first in that it further takes into account the shift between the images caused by temporal drift of the cameras 1 and 2. Specifically, in addition to the configuration of the first embodiment, a second timing correction unit 17, a correction amount calculation unit 19, a correction amount addition unit 20 provided on the second image conversion unit 12 side, and a correction amount addition unit 21 provided on the timing correction unit 5 side are provided. The above-described timing correction unit 5 on the first timing control unit 3 side is hereinafter referred to as the "first timing correction unit 5".
[0049]
The second timing correction unit 17 determines a relative time difference between the imaging timing of the main camera 1 and that of the sub camera 2 according to the conversion values handled by the first and second image conversion units 11 and 12. The second timing correction unit 17 has a correction memory (ROM) 18 storing a correction table in which conversion values and relative time differences are associated, and determines the relative time difference corresponding to a conversion value by reading it from the correction memory 18. A correction amount for correcting the imaging timing of the main camera 1 is then output to the second timing control unit 4 based on the determined time difference.
[0050]
When the second timing correction unit 17 is provided, the two timing correction units 5 and 17 cooperate to adjust the imaging timings of the main camera 1 and the sub camera 2 by the two correction amounts specified from the conversion values, thereby securing the desired relative time difference. Owing to this time difference, temporal matching at the corresponding positions in both images can be achieved even when image conversion is performed.
[0051]
The correction amount calculation unit 19 processes the pair of images converted by the image conversion units 11 and 12 and, when a positional shift remains between them, calculates in real time a conversion value correction amount for geometrically converting the images. A possible cause of such a positional shift is, for example, temporal drift of the cameras 1 and 2 (for example, a shift of the stereo camera arising after shipment). Specifically, the correction amount calculation unit 19 obtains the relative shift amount of the comparison image with respect to the reference image and calculates the conversion value correction amount for one image (the comparison image in the present embodiment) accordingly. Details of the correction amount calculation unit 19 are described in JP-A-2001-82955, which should be referred to as necessary. The calculated conversion value correction amount is output to the correction amount addition unit 20 on the second image conversion unit 12 side and to the correction amount addition unit 21 on the first timing correction unit 5 side. The correction amount calculation unit 19 thus feeds back the calculated conversion value correction amount to both the second image conversion unit 12 and the first timing correction unit 5.
[0052]
The correction amount addition unit 20 of the second image conversion unit 12 adds the conversion value correction amount calculated by the correction amount calculation unit 19 to the conversion value read from the conversion memory 14 of the second image conversion unit 12, and outputs the result to the second image conversion unit 12. The preset conversion value (stored in the conversion memory 14) is thereby feedback-corrected based on the calculated conversion value correction amount. As a result, when image conversion is performed by the second image conversion unit 12, the correction amount for a newly arising image shift is also applied, so that the shift between the images caused by the camera drift is corrected.
[0053]
The correction amount addition unit 21 of the first timing correction unit 5 specifies a relative time difference from the conversion value correction amount calculated by the correction amount calculation unit 19, for example based on the relationship shown in FIG. 5. The specified relative time difference is then added to the time difference read from the correction memory 6 and output to the first timing correction unit 5. A time difference corresponding to the correction amount for the newly arising image shift is thereby further applied to the imaging timing of the sub camera 2. In other words, based on the calculated conversion value correction amount, the imaging timing of the comparison image, which is already feedback-corrected with the correction amount corresponding to the time difference read from the correction memory 6, is further feedback-corrected. Thus, even when the second image conversion unit 12 newly performs image conversion (to follow a change over time), the imaging times at the corresponding positions in the images can be made to coincide.
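The two feedback additions (units 20 and 21) can be sketched together as follows, under an assumed linear conversion-value-to-time-difference relationship like that of FIG. 5. All names, the sign convention, and the numeric values are illustrative, not taken from the specification.

```python
# Illustrative sketch of the feedback path through the correction amount
# addition units 20 and 21: a conversion value correction amount computed
# at run time (e.g., camera drift after shipment) is added both to the
# stored geometric conversion value and, converted to a time difference,
# to the stored imaging-timing correction.

LINE_TIME = 1.0  # assumed imaging time per scanning line

def apply_feedback(stored_conversion: float, stored_time_diff: float,
                   conversion_correction: float) -> tuple[float, float]:
    # Unit 20: correct the conversion value held in conversion memory 14.
    corrected_conversion = stored_conversion + conversion_correction
    # Unit 21: correct the time difference held in correction memory 6 by
    # the time equivalent of the same correction amount (larger shift ->
    # earlier timing, hence the negative sign).
    corrected_time_diff = stored_time_diff - conversion_correction * LINE_TIME
    return corrected_conversion, corrected_time_diff

conv, tdiff = apply_feedback(stored_conversion=2.0, stored_time_diff=-2.0,
                             conversion_correction=0.5)
assert (conv, tdiff) == (2.5, -2.5)
```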
[0054]
As described above, the second embodiment naturally provides the same operation and effects as the configuration of the first embodiment, and additionally takes into account the shift between the images caused by temporal drift of the cameras. Since the image position shift and the imaging timings of the cameras 1 and 2 can thus be corrected in real time, the distance data can be calculated accurately.
[0055]
In the second embodiment, the first and second timing correction units 5 and 17 are provided in the first and second timing control units 3 and 4, respectively, but the present invention is not limited to this. As in the first embodiment, the second timing correction unit 17 may be omitted. However, making the imaging timings of both cameras 1 and 2 adjustable, as in the present embodiment, is preferable because it allows a larger variable width (that is, time width) for the imaging timings.
[0056]
Further, in the first and second embodiments described above, a plurality of pixel groups arranged in time series was described using a horizontal pixel row (scanning line) having a width of one pixel or more as an example, but a vertical pixel column (vertical scanning line) having a width of one pixel or more may be used instead. In addition, the CMOS sensor is only an example, and the cameras 1 and 2 may include any image sensor in which each pixel or pixel group serves as an imaging unit.
[0057]
(Third embodiment)
The above-described CMOS sensor performs imaging and output for each scanning line based on the reference signal, and the target scanning line moves sequentially from the top to the bottom of the image. By exploiting this, a part of the image conversion processing for the reference image and the comparison image can be realized by controlling the imaging timing. Specifically, when the reference image and the comparison image are captured into the frame memories 9 and 10, the same address operation is performed for the reference-image and comparison-image lines captured at the same timing. Thus, changing the imaging timing of one image (for example, the comparison image) produces the same result as if the vertical coordinates of the comparison image relative to the reference image had been converted when the images were stored in the frame memories 9 and 10. At this time, the same addresses in the frame memories 9 and 10 hold reference-image and comparison-image data with the same imaging time but different positions (that is, positions shifted by the amount corresponding to the imaging timing difference), which is the state shown in FIG. 4. In this way, changing the imaging timing yields the same effect as the image conversion, and temporal correspondence can be achieved at the corresponding positions of the images.
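The equivalence between a timing change and a vertical image shift for a line-sequential sensor can be sketched as follows. The line time, image height, and 2-line offset are illustrative assumptions; lines are modeled simply by their imaging start times.

```python
# Illustrative sketch of the third embodiment's idea: with a line-sequential
# (rolling shutter) sensor, starting the comparison image k line-times
# earlier while writing both images with the same address operation is
# equivalent to shifting the comparison image vertically by k lines in the
# frame memory.

def line_capture_times(start_time: float, num_lines: int, line_time: float):
    """Imaging time of each scanning line, top to bottom."""
    return [start_time + i * line_time for i in range(num_lines)]

LINE_TIME = 1.0
ref = line_capture_times(start_time=0.0, num_lines=6, line_time=LINE_TIME)
# Comparison image started 2 line-times earlier:
cmp_ = line_capture_times(start_time=-2.0, num_lines=6, line_time=LINE_TIME)

# The same memory address (same line index) now holds data whose imaging
# times differ by exactly the 2-line offset...
assert all(r - c == 2.0 for r, c in zip(ref, cmp_))
# ...and reference line 0 was captured at the same instant as comparison
# line 2: the timing change acts like a vertical shift of 2 lines.
assert ref[0] == cmp_[2]
```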
[0058]
In the present embodiment, monitoring outside a vehicle has been described as an example, but this method can be applied to various other uses, such as level crossing monitoring, terrain recognition, and altitude measurement. In the above-described embodiments, recognition processing is performed based on the calculated distance data; needless to say, however, the apparatus may simply function as a stereo image processing device that calculates parallax based on a pair of image data.
[0059]
In addition, the present invention should be understood to include not only the stereo image processing apparatuses that perform the timing correction shown in FIGS. 1 and 6, but also a stereo image processing method that performs the timing correction by the above-described procedure.
[0060]
【The invention's effect】
As described above, according to the present invention, by changing the imaging timing of the cameras so as to produce the relative time difference corresponding to the conversion value of the image, a desired relative time difference is secured between the imaging timings of the pair of cameras. By securing this time difference, temporal matching at the corresponding positions in the images can be achieved even when image conversion is performed. Accordingly, parallax can be calculated with high accuracy even when an image in which one frame is composed of a plurality of pixel groups arranged in time series is used.
[Brief description of the drawings]
FIG. 1 is a block diagram showing a stereo image processing apparatus according to a first embodiment.
FIG. 2 is an explanatory diagram of a pixel block set in a reference image.
FIG. 3 is an explanatory diagram showing a correspondence between an imaging time and an image position.
FIG. 4 is an explanatory diagram showing a correspondence relationship between an imaging time and an image position according to the embodiment.
FIG. 5 is an explanatory diagram showing a correspondence between a conversion value and a time difference.
FIG. 6 is a block diagram showing a stereo image processing apparatus according to a second embodiment.
[Explanation of symbols]
1 Main camera
2 Sub camera
3 First timing control unit
4 Second timing control unit
5 Timing correction unit (first timing correction unit)
6 Correction memory
7 A / D converter
8 A / D converter
9 Frame memory
10 frame memory
11 First image conversion unit
12 Second image conversion unit
13 Conversion memory
14 Conversion memory
15 Stereo image processing unit
16 Recognition unit
17 Second timing correction unit
18 Correction memory
19 Correction amount calculation unit
20 Conversion correction amount adder
21 Timing correction amount adder

Claims (15)

  1. A stereo image processing apparatus that calculates a parallax for a reference pixel region by specifying, based on a pair of captured images, a correlation destination in the other captured image of the reference pixel region in one captured image, the apparatus comprising:
    a pair of cameras each outputting a captured image in which one frame is composed of a plurality of pixel groups arranged in time series;
    a first timing control unit that controls the imaging timing of the other camera in synchronization with the imaging timing of the one camera;
    an image conversion unit that processes the pair of captured images output from the pair of cameras and geometrically converts, with a preset conversion value, a relative shift of the other captured image with respect to the one captured image; and
    a first timing correction unit that determines, according to the conversion value, a relative time difference between the imaging timing of the one camera and the imaging timing of the other camera, and outputs, based on the determined relative time difference, a correction amount for correcting the imaging timing of the other camera to the first timing control unit.
  2. The stereo image processing apparatus according to claim 1, wherein the first timing control unit changes the imaging timing of the other camera, in accordance with the correction amount, by an amount corresponding to the time-series imaging interval of the plurality of pixel groups.
  3. The stereo image processing apparatus according to claim 1, wherein an imaging time of the reference pixel region matches an imaging time of the pixel region corresponding to the specified correlation destination in the other captured image.
  4. The stereo image processing apparatus according to any one of claims 1 to 3, further comprising a memory storing a table in which the conversion value and the relative time difference are associated, wherein the first timing correction unit determines the relative time difference corresponding to the conversion value by reading it from the memory.
  5. The stereo image processing apparatus according to claim 1, further comprising:
    a second timing control unit that controls an imaging timing of the one camera; and
    a second timing correction unit that determines, according to the conversion value, a relative time difference between the imaging timing of the one camera and the imaging timing of the other camera, and outputs, based on the determined relative time difference, a correction amount for correcting the imaging timing of the one camera to the second timing control unit.
  6. The stereo image processing apparatus according to any one of claims 1 to 5, further comprising a correction amount calculation unit that processes the pair of captured images output from the image conversion unit and calculates a conversion value correction amount for correcting the conversion value according to a relative displacement of the other captured image with respect to the one captured image,
    wherein the correction amount calculation unit feeds back the calculated conversion value correction amount to the image conversion unit and to the first timing correction unit.
  7. The stereo image processing device according to claim 1, wherein the pixel group forms a horizontal pixel row having a width of one pixel or more in a captured image plane.
  8. The stereo image processing apparatus according to claim 1, wherein the pixel group forms a vertical pixel row having a width of one pixel or more in a captured image plane.
  9. The stereo image processing apparatus according to any one of claims 1 to 8, wherein the first timing control unit corrects a relative shift of the other captured image with respect to the one captured image by controlling the imaging timing of the other camera.
  10. A stereo image processing method that calculates a parallax for a reference pixel region by specifying, based on a pair of captured images in each of which one frame is composed of a plurality of pixel groups arranged in time series, a correlation destination in the other captured image of the reference pixel region in one captured image, the method comprising:
    A first step of controlling the imaging timing of the other captured image in synchronization with the imaging timing of the one captured image;
    a second step of processing the pair of captured images and geometrically converting, with a preset conversion value, the relative shift of the other captured image with respect to the one captured image; and
    a third step of determining, according to the conversion value, a relative time difference between the imaging timing of the one captured image and the imaging timing of the other captured image, and feedback-correcting the imaging timing of the other captured image based on the determined relative time difference.
  11. The stereo image processing method according to claim 10, wherein an imaging timing of the other captured image subjected to the feedback correction is changed by an amount corresponding to the time-series imaging interval of the plurality of pixel groups.
  12. The stereo image processing method according to claim 10, wherein an imaging time of the reference pixel region matches an imaging time of the pixel region corresponding to the specified correlation destination in the other captured image.
  13. The stereo image processing method according to claim 10, further comprising:
    a step of controlling an imaging timing of the one captured image; and
    a step of determining, according to the conversion value, a relative time difference between the imaging timing of the one captured image and the imaging timing of the other captured image, and feedback-correcting the imaging timing of the one captured image based on the determined relative time difference.
  14. The stereo image processing method according to any one of claims 10 to 13, further comprising:
    a step of processing the pair of captured images subjected to the image conversion, specifying a relative shift of the other captured image with respect to the one captured image, and calculating a conversion value correction amount for correcting the conversion value in accordance with the specified relative shift;
    a step of feedback-correcting the conversion value based on the calculated conversion value correction amount; and
    a step of further feedback-correcting, based on the calculated conversion value correction amount, the imaging timing of the other captured image subjected to the feedback correction.
  15. The stereo image processing method according to any one of claims 10 to 14, wherein a relative displacement of the other captured image with respect to the one captured image is corrected by controlling an imaging timing of the other captured image.
JP2002184013A 2002-06-25 2002-06-25 Stereo image processing apparatus and stereo image processing method Active JP3958638B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2002184013A JP3958638B2 (en) 2002-06-25 2002-06-25 Stereo image processing apparatus and stereo image processing method

Publications (2)

Publication Number Publication Date
JP2004032244A true JP2004032244A (en) 2004-01-29
JP3958638B2 JP3958638B2 (en) 2007-08-15

Family

ID=31180021

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2002184013A Active JP3958638B2 (en) 2002-06-25 2002-06-25 Stereo image processing apparatus and stereo image processing method

Country Status (1)

Country Link
JP (1) JP3958638B2 (en)

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006170993A (en) * 2004-12-10 2006-06-29 Microsoft Corp Matching of asynchronous image parts
KR101132099B1 (en) 2007-02-23 2012-04-04 도요타지도샤가부시키가이샤 Vehicle environment monitoring device and car environment monitoring method
JP2008211373A (en) * 2007-02-23 2008-09-11 Toyota Motor Corp Device and method for monitoring surroundings around vehicle
JP2010165339A (en) * 2008-10-06 2010-07-29 Nvidia Corp Media capture system, method, and computer program product for assessing processing capability utilizing cascaded memory
US20120038748A1 (en) * 2009-05-19 2012-02-16 Autoliv Development Ab Vision System and Method for a Motor Vehicle
CN102860016A (en) * 2010-04-19 2013-01-02 松下电器产业株式会社 Three-dimensional imaging device and three-dimensional imaging method
WO2011132364A1 (en) * 2010-04-19 2011-10-27 パナソニック株式会社 Three-dimensional imaging device and three-dimensional imaging method
US9304388B2 (en) 2010-04-19 2016-04-05 Panasonic Intellectual Property Management Co., Ltd. Three-dimensional imaging device and three-dimensional imaging method
JP5683025B2 (en) * 2010-04-19 2015-03-11 パナソニックIpマネジメント株式会社 Stereoscopic image capturing apparatus and stereoscopic image capturing method
US8922626B2 (en) 2011-03-18 2014-12-30 Ricoh Company, Ltd. Stereo camera apparatus and method of obtaining image
JP2012198075A (en) * 2011-03-18 2012-10-18 Ricoh Co Ltd Stereoscopic camera device and image adjusting method
RU2456763C1 (en) * 2011-05-17 2012-07-20 Борис Иванович Волков Stereoscopic television system
JP2013070177A (en) * 2011-09-21 2013-04-18 Toshiba Alpine Automotive Technology Corp On-vehicle camera
US10200638B2 (en) 2013-01-15 2019-02-05 Mobileye Vision Technologies Ltd. Stereo assist with rolling shutters
JP2016514246A (en) * 2013-01-15 2016-05-19 モービルアイ ビジョン テクノロジーズ リミテッド Stereo support with rolling shutter
JP2015033047A (en) * 2013-08-05 2015-02-16 Kddi株式会社 Depth estimation device employing plural cameras
JP2015082192A (en) * 2013-10-22 2015-04-27 富士通株式会社 Image processing apparatus, image processing method, and image processing program
CN106233722A (en) * 2014-03-20 2016-12-14 高途乐公司 The automatic alignment of the imageing sensor in multicamera system
US9792667B2 (en) 2014-03-20 2017-10-17 Gopro, Inc. Target-less auto-alignment of image sensors in a multi-camera system
CN106233722B (en) * 2014-03-20 2018-05-15 高途乐公司 The automatic alignment of imaging sensor in multicamera system
US10055816B2 (en) 2014-03-20 2018-08-21 Gopro, Inc. Target-less auto-alignment of image sensors in a multi-camera system
EP3120542A4 (en) * 2014-03-20 2017-03-01 GoPro, Inc. Auto-alignment of image sensors in a multi-camera system
US10389993B2 (en) 2014-03-20 2019-08-20 Gopro, Inc. Auto-alignment of image sensors in a multi-camera system
WO2019058760A1 (en) * 2017-09-25 2019-03-28 日立オートモティブシステムズ株式会社 Stereo image processing device

Also Published As

Publication number Publication date
JP3958638B2 (en) 2007-08-15

Similar Documents

Publication Publication Date Title
US10115024B2 (en) Road vertical contour detection using a stabilized coordinate frame
US20190356830A1 (en) Image distortion correction of a camera with a rolling shutter
US9854185B2 (en) Stereo assist with rolling shutters
US9329035B2 (en) Method to compensate for errors in time-of-flight range cameras caused by multiple reflections
JP5843751B2 (en) Information processing apparatus, information processing system, and information processing method
TWI432870B (en) Image processing system and automatic focusing method
JP2017531976A (en) System and method for dynamically calibrating an array camera
EP2382792B1 (en) Imaging apparatus, image correction method, and computer-readable recording medium
JP5683025B2 (en) Stereoscopic image capturing apparatus and stereoscopic image capturing method
JP5615441B2 (en) Image processing apparatus and image processing method
DE102006055641B4 (en) Arrangement and method for recording and reproducing images of a scene and / or an object
KR101221449B1 (en) Apparatus and method for calibrating image between cameras
US6823080B2 (en) Three-dimensional information processing apparatus and method
US9892493B2 (en) Method, apparatus and system for performing geometric calibration for surround view camera solution
CN100442141C (en) Image projection method and device
JP3280001B2 (en) Stereo image misalignment adjustment device
TWI383666B (en) An advanced dynamic stitching method for multi-lens camera system
JP5359783B2 (en) Image processing apparatus and method, and program
JP4699995B2 (en) Compound eye imaging apparatus and imaging method
EP2500748A2 (en) Stereo camera apparatus and method of obtaining image
EP0701721B1 (en) Stabilizing estimate of location of target region inferred from tracked multiple landmark regions of a video image
US7082209B2 (en) Object detecting method and object detecting apparatus and intruding object monitoring apparatus employing the object detecting method
US8792039B2 (en) Obstacle detection display device
JP4488804B2 (en) Stereo image association method and three-dimensional data creation apparatus
CN101782675B (en) Lens control apparatus, optical apparatus and lens control method

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20050602

TRDD Decision of grant or rejection written
A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20070417

A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20070424

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20070510

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110518

Year of fee payment: 4

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120518

Year of fee payment: 5

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130518

Year of fee payment: 6

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140518

Year of fee payment: 7

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

S531 Written request for registration of change of domicile

Free format text: JAPANESE INTERMEDIATE CODE: R313531

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

S533 Written request for registration of change of name

Free format text: JAPANESE INTERMEDIATE CODE: R313533

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250