WO2025126535A1 - Stereo image processing device and stereo image calibration method - Google Patents

Publication number
WO2025126535A1
Authority
WO
WIPO (PCT)
Prior art keywords
calibration
area
stereo
cameras
image
Prior art date
Application number
PCT/JP2024/024431
Other languages
French (fr)
Japanese (ja)
Inventor
和良 山崎
Original Assignee
Astemo株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Astemo株式会社 filed Critical Astemo株式会社
Publication of WO2025126535A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G01C3/06 Use of electric means to obtain final indication
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00 Stereoscopic photography
    • G03B35/08 Stereoscopic photography by simultaneous recording
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/246 Calibration of cameras

Definitions

  • the present invention relates to a stereo image processing device mounted on, for example, a vehicle (i.e., a vehicle-mounted device), and to a stereo image calibration method.
  • Stereo cameras are known as devices for recognizing objects three-dimensionally. Stereo cameras use the differences in how images are captured by multiple cameras placed in different positions to detect parallax between the multiple cameras based on trigonometry, and use this parallax to detect the depth and position of an object, making it possible to accurately detect the position of the measurement target.
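The triangulation described above can be sketched numerically: for a rectified stereo pair with focal length f (in pixels) and baseline B, depth follows Z = f·B/d for disparity d. The function name and all numbers below are invented for illustration, not taken from the patent.

```python
import numpy as np

# Hedged sketch of stereo triangulation: Z = f * B / d.
# Illustrative values only; zero disparity maps to a point at infinity.
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    d = np.asarray(disparity_px, dtype=float)
    return np.where(d > 0, focal_px * baseline_m / np.maximum(d, 1e-12), np.inf)

# The same 1 px disparity error costs far more depth accuracy at long range,
# which is why parallax-shift calibration matters for wide, long-range sensing:
print(depth_from_disparity([100.0, 10.0, 1.0], focal_px=1000.0, baseline_m=0.2))
```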
  • Such stereo cameras are mounted on automobiles and other vehicles and are used in technology (in-vehicle sensing technology) to detect the position of obstacles, etc.
  • in-vehicle sensing technology is required to detect obstacles, etc. with a wide angle of view, as well as to detect obstacles at greater distances (wider angle of view and longer range).
  • In-vehicle stereo cameras are generally installed inside the vehicle to avoid the effects of dirt and other factors, but as the angle of view becomes wider, the effects of the vehicle's windshield cannot be ignored.
  • a calibration process for the stereo camera, known as aiming, has been carried out during vehicle manufacturing and inspection to correct for misalignment of the stereo camera, but further measures are required to widen the angle of view.
  • Patent Document 1 is known as a method for calibrating a stereo camera that takes into account the influence of a windshield.
  • Patent Document 1 describes the problem as "to provide a stereo camera calibration device and calibration method that can properly calibrate a stereo camera using a target board even in a narrow space," and describes the solution as "a stereo camera calibration device that measures the distance to an object based on parallax, comprising a stereo camera 3 having a pair of cameras 4, 6 spaced apart by a predetermined base line length B, a target board 8 that is arranged parallel to the arrangement direction of the pair of cameras 4, 6 and includes at least two targets 8a, 8b spaced apart by the same distance as the base line length B, and a calculation device that assumes that the target board 8 is located at an infinite distance relative to the stereo camera 3, treats the two targets 8a, 8b included in the images of the target board 8 photographed by each of the pair of cameras 4, 6 as the same target, and calibrates the parallax shift of the pair of cameras 4, 6."
  • Patent Document 2 also describes a similar approach.
  • Patent Document 2 states that the problem is to "variably set the search range when performing stereo matching according to the position on the image," and as a solution, it states that "the system has a comparison image line memory 7, an address generation circuit 10, and a stereo matching circuit 8.
  • the line memory 7 stores image data within a reference pixel area in one captured image and image data on a horizontal line corresponding to the vertical position of the reference pixel area in the other captured image.
  • the address generation circuit 10 sets the search range when performing stereo matching, and instructs the line memory 7 to read image data within the set search range and image data within the reference image area.
  • the stereo matching circuit 8 identifies the correlation destination of the reference pixel area by stereo matching based on the image data within the search range read from the line memory 7 and the image data of the reference pixel area.
  • the address generation circuit 10 calibrates the position of the search range for the reference pixel area based on the degree of deviation of the infinitely distant corresponding point based on the horizontal position of the reference pixel area.”
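The mechanism attributed to Patent Document 2 above can be sketched roughly as follows. The linear deviation model, the function name, and all numbers are invented for illustration; the patent's actual calibration data would be measured, not modeled.

```python
# Hedged sketch: shift the stereo-matching search range per horizontal image
# position by the deviation of the infinitely distant corresponding point.
def search_range_for_column(x, width, base_lo, base_hi, max_deviation_px):
    # Assumed model: the infinity-point deviation grows linearly toward the
    # image edge (e.g. due to windshield refraction at wide angles of view).
    deviation = max_deviation_px * abs(x - width / 2) / (width / 2)
    offset = round(deviation)
    return base_lo + offset, base_hi + offset

print(search_range_for_column(640, 1280, 0, 128, 4))  # centre column: (0, 128)
print(search_range_for_column(0, 1280, 0, 128, 4))    # edge column: (4, 132)
```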
  • the present invention has been made in consideration of the above problems, and provides a stereo image processing device and stereo image calibration method that can calibrate a wide-angle stereo camera with high accuracy by placing a chart near the stereo camera, without increasing the size of the chart.
  • One aspect of the stereo image processing device includes a stereo matching unit that performs stereo matching of multiple images captured by multiple cameras that capture an image of a subject through a light-transmitting member to determine parallax, and a calibration processing unit that determines calibration parameters for calibrating at least the parallax shift caused by the light-transmitting member based on the results of stereo matching of multiple images obtained by the stereo matching unit capturing an image of a calibration chart with the multiple cameras, the calibration chart including a pattern, the pattern being a repeating pattern that is larger than the baseline length of the multiple cameras, and the calibration processing unit determines the calibration parameters based on the results of stereo matching assuming that different patterns are the same point.
  • the stereo image processing device and stereo image calibration method of the present invention can thus calibrate a wide-angle camera with high accuracy without increasing the size of the chart.
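The calibration principle summarized above can be sketched as follows: the chart's repeating pattern has a period D larger than the baseline B, a pattern copy seen by the right camera is matched against a different (neighbouring) copy seen by the left camera as if they were the same point, and the residual disparity of that match becomes the calibration value. The coordinates below are invented for illustration.

```python
# Hedged sketch: residual disparity after matching *different* pattern copies.
def residual_disparity(x_right_px, x_left_px):
    """Left-minus-right image coordinate after matching different copies."""
    return x_left_px - x_right_px

# With an ideal flat window the matched copies land on identical coordinates
# (residual 0); a windshield-induced shift of the right image leaves a residual:
x_left = 412.3            # neighbouring pattern copy, left camera [px] (invented)
x_right = 412.3 + 0.3     # matched copy, right camera, shifted by refraction [px]
shift = residual_disparity(x_right, x_left)
print(shift)              # this residual becomes the calibration parameter
```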
  • FIG. 1 is a block diagram illustrating a configuration of a stereo camera image processing device according to a first embodiment.
  • FIG. 2 is a diagram for explaining a problem of a conventional method according to the first embodiment.
  • FIG. 3 is another diagram for explaining the problem of the conventional method according to the first embodiment.
  • FIG. 4 shows the shape of the chart areas according to the first embodiment.
  • FIG. 5 shows the patterns in each area of the chart according to the first embodiment.
  • FIG. 6 shows an optical path according to the first embodiment.
  • FIG. 7 shows the parallax displacement in a conventional method according to the first embodiment.
  • FIG. 8 shows the distance measurement in the conventional method according to the first embodiment.
  • FIG. 9 shows the parallax displacement according to the first embodiment.
  • FIG. 10 shows the distance measurement according to the first embodiment.
  • FIG. 11 shows the distance D according to the first embodiment.
  • FIG. 15 is a block diagram illustrating a configuration for creating calibration data in a stereo camera image processing device according to a second embodiment.
  • Example 1: The configuration of a stereo camera image processing device 10 (hereinafter referred to as "image processing device 10" or "stereo image processing device 10") according to the first embodiment will be described with reference to FIG. 1.
  • the image processing device 10 is mounted on a vehicle such as an automobile, and is used to detect the distance from the vehicle to a three-dimensional object (other automobiles, buildings, pedestrians, etc.) around the vehicle.
  • a case where the image processing device 10 is mounted on a vehicle will be described as an example, but the present invention is not limited to this.
  • FIG. 1 is a block diagram showing an example of the configuration of an image processing device 10 according to a first embodiment.
  • This image processing device 10 is configured to detect surrounding three-dimensional objects based on images obtained by a right camera 50 and a left camera 60, and to issue an alarm if necessary.
  • the right camera 50 and the left camera 60 form a stereo camera. Note that the number of cameras that form the stereo camera is not limited to two, a left and a right camera.
  • the image processing device 10 is configured to include an image processing unit 100, a stereo parallax image generation unit 200, a stereo parallax image calibration unit 99, a road surface cross-sectional shape estimation unit 400, a stereo vision three-dimensional object detection unit 500, and an alarm control unit 700.
  • a stereo parallax image is generated in the stereo parallax image generation unit 200 by utilizing the parallax between the right camera 50 and the left camera 60. Then, the stereo vision three-dimensional object detection unit 500 measures the distance from the vehicle to the three-dimensional object according to the parallax.
  • the right camera 50 and the left camera 60 each have a lens and an image sensor, not shown.
  • the right camera 50 and the left camera 60 each capture (take) an image of an object using an image sensor via a lens.
  • the right camera 50 and the left camera 60 are mounted inside the vehicle and capture images of subjects around the vehicle through a light-transmitting member such as the windshield.
  • the image processing device 10 captures an image P1 (first image) from the right camera 50 and an image P2 (second image) from the left camera 60.
  • the image processing unit 100 is configured to include, as an example, affine processing means 20a, 20b, luminance correction means 21a, 21b, pixel interpolation means 22a, 22b, and luminance information generation means 23a, 23b.
  • the image processing unit 100 applies a predetermined image processing to the images P1 and P2 obtained by the right camera 50 and the left camera 60, and supplies them to the stereo parallax image generation unit 200.
  • the affine processing means 20a applies affine processing to image P1 from the right camera 50.
  • the affine processing is, for example, a linear coordinate transformation process, but may also include non-linear calculations.
  • the affine processing means 20a obtains image P3 (third image).
  • the affine processing means 20b applies affine processing to image P2 from the left camera 60 to obtain image P4 (fourth image).
  • the affine processing means 20a and 20b may also perform distortion transformation processing other than affine processing.
  • f sinθ, which is the projection formula of the fisheye lens, is remapped into a coordinate system of (f tanθx, f tanθy).
  • f is the focal length of the fisheye lens.
  • θ is the angle of view incident on the fisheye lens.
  • θx and θy are the horizontal and vertical components of the angle of view incident on the fisheye lens.
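The distortion transform just described can be sketched with a simple radial model: a point imaged under the fisheye projection r = f·sinθ is remapped to perspective (f·tan) coordinates. The patent's actual remap into separate (f·tanθx, f·tanθy) components may differ in detail; this is only an assumed illustration.

```python
import math

# Hedged sketch: remap a fisheye (r = f*sin(theta)) point to perspective
# (f*tan(theta)) coordinates using a radial model (an assumption, not the
# patent's exact per-axis formulation).
def fisheye_to_perspective(u, v, f):
    r = math.hypot(u, v)                # radial distance from the image centre
    if r == 0.0:
        return 0.0, 0.0
    theta = math.asin(min(r / f, 1.0))  # invert r = f*sin(theta)
    scale = f * math.tan(theta) / r     # new radius divided by old radius
    return u * scale, v * scale

# On-axis points are unchanged; off-axis points move outward (tan > sin):
print(fisheye_to_perspective(100.0, 0.0, 1000.0))
```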
  • the vertical pixel displacement caused by the influence of the windshield is calibrated by the affine processing means 20a and 20b.
  • the luminance correction means 21a corrects the luminance of each pixel of the image P3. For example, the luminance of each pixel of the image P3 is corrected based on the gain of the right camera 50, the difference in gain of each pixel in the image P3, etc. Similarly, the luminance correction means 21b corrects the luminance of each pixel of the image P4.
  • the luminance information generating means 23a generates luminance information for image P3. For example, it converts information representing a color image into luminance information for generating a parallax image. Similarly, the luminance information generating means 23b generates luminance information for image P4.
  • the stereo parallax image generation unit 200 uses the image of the stereo viewing area (the viewing area common to both cameras) from the obtained images P3 and P4 to generate a stereo parallax image of the stereo viewing area.
  • the stereo parallax image generating unit 200 includes an exposure adjustment unit 210 and a sensitivity correction unit 220, and can execute feedback control to the right camera 50 and the left camera 60 regarding the exposure amount, sensitivity, etc. of the right camera 50 and the left camera 60.
  • the stereo parallax image generating unit 200 further includes a geometric correction unit 230 that performs geometric correction of the left and right images, and a matching unit 240 that performs matching processing of the left and right images.
  • the matching unit 240 performs stereo matching of the left and right images to obtain parallax (stereo parallax image).
  • the stereo parallax image calibration unit 99 calibrates the parallax shift caused by a light-transmitting member such as the windshield, based on data previously acquired from the stereo parallax calibration data recording unit 98.
  • the road surface cross-sectional shape estimation unit 400 estimates the cross-sectional shape of the road surface of the road along which the vehicle equipped with the image processing device 10 is scheduled to travel.
  • this embodiment is characterized by creating data in the stereo parallax calibration data recording unit 98.
  • refraction at the windshield has two effects on a light ray: the first is an inclination (tilt) shift of the light beam, and the second is a position (lateral) shift of the light beam.
  • the position shift of the light beam has a characteristic that the influence is large for a chart or a measurement object placed nearby, but can be ignored for a chart or a measurement object placed far away.
  • Prior Art 1 and Prior Art 2 do not take into account the effect of the misalignment of light rays due to refraction by the windshield. As described above, this misalignment of light rays is detected when measuring the distance to a nearby object, but can be ignored when detecting a distant object. Normally, parallax shift is calibrated to optimize distance measurement. On the other hand, if parallax shift is calibrated using a nearby chart, parallax shift occurs when a distant object is detected due to the misalignment of light rays due to refraction by the windshield. For this reason, Prior Art 1 and Prior Art 2 have the problem that parallax shift cannot be completely calibrated when a chart is placed nearby.
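The problem stated above can be illustrated numerically: a lateral (position) shift s of a ray at the windshield acts like a change in the effective baseline, so calibrating with a nearby chart absorbs the term f·s/L_chart, which then remains as an error for distant objects. All numbers here are invented for illustration.

```python
# Hedged numeric illustration of near-chart calibration leaving a residual
# disparity error at long range. Values are invented, not from the patent.
f_px, B, s = 1000.0, 0.200, 0.002  # focal length [px], baseline [m], ray shift [m]

def disparity(L, baseline):
    return f_px * baseline / L      # ideal pinhole disparity [px]

L_chart, L_far = 2.0, 200.0
# Parallax-shift offset learned from the near chart (shifted vs. ideal baseline):
learned_offset = disparity(L_chart, B + s) - disparity(L_chart, B)
# Residual disparity error when that same offset is applied at long range:
residual = (disparity(L_far, B + s) - disparity(L_far, B)) - learned_offset
print(round(learned_offset, 3), round(residual, 3))
```

With these invented numbers, the near chart bakes in about a 1 px offset, almost all of which is wrong at 200 m, matching the text's point that the position shift matters close up but is negligible at distance.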
  • Figure 2 is a diagram explaining the problem.
  • This figure shows a cross section connecting the lens pupil positions of the right camera 50 and the left camera 60.
  • the amount of light ray displacement by the windshield 1 is shown larger than it actually is.
  • the light rays at a horizontal angle of view of 0 degrees in the right camera 50 and the left camera 60 are respectively light ray R51 and light ray R61.
  • the distance between the light ray R61 and light ray R51 that are incident on the windshield 1 (light ray R61 and light ray R51 on the opposite side of the stereo camera image processing device 10 with respect to the windshield 1) is set to distance D1.
  • the light rays on the horizontal wide angle side in the right camera 50 and the left camera 60 are set to light ray R52 and light ray R62.
  • the distance between the light ray R62 and light ray R52 that are incident on the windshield 1 is set to distance D2.
  • the line extending the light ray R51 between the windshield 1 and the right camera 50 is defined as axis K51
  • the line extending the light ray R61 between the windshield 1 and the left camera 60 is defined as axis K61
  • the line extending the light ray R52 between the windshield 1 and the right camera 50 is defined as axis K52
  • the line extending the light ray R62 between the windshield 1 and the left camera 60 is defined as axis K62.
  • the distance between the axis K51 and the axis K61 and the distance between the axis K52 and the axis K62 are equal to the baseline length B.
  • FIG. 3 is a diagram for explaining the influence of differences in baseline length B and interval D (interval D1, interval D2). Like FIG. 2, this diagram shows a cross section connecting the lens pupil positions of the right camera 50 and the left camera 60. Here, for ease of explanation, the amount of light displacement caused by the windshield 1 is shown larger than it actually is.
  • the charts described in Prior Art 1 and Prior Art 2 use a pattern whose period is the same as the baseline length. In this diagram, a chart G10 with a repeating pattern whose period equals the baseline length is arranged accordingly.
  • the axes of a predetermined angle of view from the pupil positions of the right camera 50 and the left camera 60 are axis K53 and axis K63.
  • the points where axis K53 and axis K63 intersect the chart G10 are positions Q1 and Q2, respectively. Since the inclinations of axis K53 and axis K63 are the same, the interval between position Q1 and position Q2 is the baseline length B. This configuration is the same as that of Prior Art 1 and Prior Art 2.
  • the light rays that detect the images at positions Q1 and Q2 by the right camera 50 and the left camera 60 are defined as light rays R53 and R63.
  • the lines extending light rays R53 and R63 between the pupil positions of the right camera 50 and the left camera 60 and the windshield 1 are defined as lines S53 and S63, respectively, and the intersection of lines S53 and S63 is defined as position TP1.
  • Figure 4 shows the shapes of the areas of chart G20 in this embodiment. Each area has a similar pattern (a square shape in the illustrated example). In this embodiment, each area (pattern) of chart G20 is characterized by having a different width in the horizontal direction (baseline direction). For example, this shows that the horizontal width da1 of area C33 (first area) of chart G20 is different from the horizontal width da2 of area C35 (second area). Width da2 is larger than width da1.
  • the widths da1 and da2 are both larger than the base line length B, and the width da1 is closer to the base line length B than the width da2 (i.e., the width da2 is larger than the width da1).
  • FIG. 5 shows the patterns in each area of chart G20 in FIG. 4.
  • the pattern in area C33 on chart G20 is shown.
  • the horizontal width of the areas is different.
  • the horizontal width db1 of (0,0) is different from the horizontal width db2 of (2,0).
  • Width db2 is larger than width db1.
  • the detected light amount of each area is set to be different, so that incorrect matching does not occur.
  • the amount of deviation between corresponding points in an image in which area C34 of chart G20 is detected by the right camera 50 and an image in which area C33 is detected by the left camera 60 is detected.
  • the amount of deviation can be calculated using a detection method such as block matching or feature point extraction used in stereo cameras.
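The block-matching option mentioned above can be sketched in one dimension: the horizontal offset between a patch of the right image and the corresponding strip of the left image is found by minimising the sum of absolute differences (SAD). The rows below are synthetic; a real implementation would operate on rectified camera images.

```python
import numpy as np

# Hedged 1-D SAD block-matching sketch for measuring horizontal deviation.
def horizontal_deviation(right_row, left_row, x, block=8, search=16):
    patch = right_row[x:x + block].astype(float)
    best_d, best_cost = 0, float("inf")
    for d in range(-search, search + 1):
        xs = x + d
        if xs < 0 or xs + block > len(left_row):
            continue  # candidate window falls outside the image row
        cost = np.abs(left_row[xs:xs + block].astype(float) - patch).sum()  # SAD
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d

row = np.zeros(64)
row[20:28] = 1.0                                         # synthetic chart feature
print(horizontal_deviation(row, np.roll(row, 5), x=18))  # recovers the 5 px shift
```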
  • Figure 6 shows the light path when the positional deviation of the light rays due to the windshield is taken into consideration.
  • the light rays of the left camera 60 are the same as those of Figure 3.
  • the angle of the light ray R54 is changed so that the angle of the light ray R63 incident on the left camera 60 matches the angle of the light ray R54 incident on the right camera 50 (lines S54 and S63 are parallel, and the distance between them is the base line length B).
  • the period of the horizontal repeating pattern on the chart G20 (in other words, the distance between positions Q3 and Q2, the points on the chart G20 detected by the right camera 50 and the left camera 60 via light rays R54 and R63) is set to distance D.
  • the parallax shift caused by the inclination shift of the light beam due to refraction at the windshield can be obtained using the same processing as in FIG. 1. That is, the stereo parallax image calibration unit 99 obtains a calibration parameter for calibrating at least the parallax shift caused by (refraction at) the windshield, which is a light-transmitting member, based on the result of stereo matching by the matching unit 240 of the left and right images (images P1 and P2) obtained by capturing the chart G20 with the left and right cameras (right camera 50 and left camera 60).
  • the stereo parallax image calibration unit 99 obtains the above-mentioned calibration parameter based on the result of stereo matching assuming that different patterns are the same point.
  • the result of (the calibration data of) the stereo parallax image calibration unit 99 becomes this parallax shift.
  • the data is processed to correct this parallax shift, recorded in the stereo parallax calibration data recording unit 98, and calibration is performed by the stereo parallax image calibration unit 99.
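The correction step described above can be sketched as a per-pixel subtraction: the parallax-shift map recorded in the stereo parallax calibration data recording unit 98 is subtracted from the raw disparity produced by stereo matching. The shapes and values below are invented for illustration.

```python
import numpy as np

# Hedged sketch of applying recorded calibration data to raw disparity.
def calibrate_disparity(raw_disparity, parallax_shift_map):
    assert raw_disparity.shape == parallax_shift_map.shape
    return raw_disparity - parallax_shift_map

raw = np.array([[10.2, 8.1], [6.4, 5.0]])   # disparity from the matching unit [px]
shift = np.array([[0.2, 0.1], [0.4, 0.0]])  # windshield-induced parallax shift [px]
print(calibrate_disparity(raw, shift))
```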
  • high accuracy is achieved for a distant measurement object, but for a nearby measurement object, a parallax shift caused by the position shift of light rays due to refraction at the windshield occurs.
  • the tolerance for parallax shift is greater in close proximity than in distant locations, so the impact is small.
  • Figure 7 shows the horizontal angle of view dependency of parallax shift.
  • the vertical axis shows parallax shift
  • the horizontal axis shows horizontal angle of view.
  • This figure shows the results for two conditions: with and without a windshield.
  • the difference between the parallax shift detected using the proximity chart and the parallax shift when the distance between the stereo camera and the measurement object is set to 1000 m was calculated.
  • this shows the state in which the effect of the inclination shift of the light beam due to refraction by the windshield has been removed. Therefore, the parallax shift on the vertical axis is the effect of the position shift of the light beam due to the refraction by the windshield described above.
  • in Prior Art 1 and Prior Art 2, the parallax shift caused by the inclination shift of the light beam due to the windshield and the parallax shift caused by the position shift are detected simultaneously, so when the calibration described in Prior Art 1 and Prior Art 2 is performed, this latter parallax shift remains.
  • Figure 8 shows the dependency of the measured distance on the horizontal angle of view.
  • the vertical axis shows the measured distance of the stereo camera
  • the horizontal axis shows the horizontal angle of view.
  • the measured distance of the stereo camera is calculated using the parallax shift shown in Figure 7.
  • the presence of a windshield results in larger distance measurement errors.
  • distance measurement errors occur even in areas with small horizontal angles of view (for example, horizontal angle of view 0 degrees), the distance measurement errors are larger on the wide-angle side.
  • Figure 9 shows the horizontal angle of view dependency of parallax shift when this embodiment is applied.
  • the vertical axis shows parallax shift
  • the horizontal axis shows horizontal angle of view.
  • this is a state where the effects of distortion due to the windshield have been removed. This shows that there is almost no parallax shift when this embodiment is applied. Therefore, it can be seen that this embodiment can calculate the parallax shift caused by distortion due to the windshield with high accuracy.
  • Figure 10 shows the dependency of the distance measurement on the horizontal angle of view when this embodiment is applied.
  • the vertical axis shows the distance measurement of the stereo camera, and the horizontal axis shows the horizontal angle of view.
  • highly accurate distance measurement is obtained at the measurement distance of 50 m.
  • Figure 11 shows the horizontal angle of view dependency of the optimal chart spacing D ( Figure 6) (spacing of periodic or repeating patterns).
  • the vertical axis shows spacing D
  • the horizontal axis shows horizontal angle of view.
  • the parallax shift and the measured distance of the stereo camera are calculated using this spacing D.
  • spacing D is set longer than the baseline length (200 mm) in all horizontal angle ranges. This allows for highly accurate results as shown in Figures 9 and 10.
  • this embodiment compares different areas (performs stereo matching by assuming that different patterns are the same point) and detects the amount of deviation.
  • the difference between this embodiment and prior art 1 and prior art 2 is that, as shown in FIG. 6, the distance D of the horizontal periodic pattern on chart G20 is different from the baseline length B (more specifically, is greater than the baseline length B).
  • this embodiment can improve distance measurement accuracy by making the pattern's length in the baseline (horizontal) direction of the two cameras different at least between the first and second areas.
  • the amount of change in the horizontal (baseline) width may be changed with respect to the vertical angle of view.
  • the horizontal (baseline) width may be changed between area C33 (first area) of chart G20 and areas C13, C23, C43, and C53 (third areas), which lie in the direction perpendicular to the baseline direction of the camera's sensor surface relative to area C33 (first area).
  • the optimal interval D in the horizontal (baseline) direction changes with respect to the vertical angle of view.
  • Each area may be connected in stages (step-like) as in FIG. 12, or each area may be connected smoothly as in FIG. 13. It goes without saying that the effect can be obtained by using two intervals D, for example, the horizontal narrow angle part and the horizontal wide angle part.
  • a rectangular chart pattern as shown in FIG. 5 is used, but a circular chart pattern as shown in FIG. 14 may also be used.
  • the influence of the windshield is calibrated using the stereo parallax image calibration unit 99 and the stereo parallax calibration data recording unit 98, but the configuration is not limited to this.
  • calibration can also be performed by sending the calibration data of the stereo parallax calibration data recording unit 98 to the affine processing means 20a or affine processing means 20b.
  • Example 2 A calibration method for the image processing device 10 according to the second embodiment will be described with reference to Fig. 15.
  • the calibration method in the first embodiment uses a calculation method similar to the calibration methods of Prior Art 1 and Prior Art 2, but the method is not limited to this.
  • a calibration method will be described in which the influence of the shift in position of light caused by refraction of the windshield when a chart is placed nearby is taken into consideration.
  • FIG. 15 shows an example of the configuration of the image processing device 10 including the flow of estimating the influence of the windshield.
  • this embodiment is characterized by using the charts shown in FIG. 4, FIG. 12, and FIG. 13.
  • this embodiment is characterized by obtaining the vertical deviation (Y deviation) image of two images in the Y deviation image generating unit 201.
  • the ΔY deviation image generating unit 80 calculates the difference (the ΔY deviation image) between the Y deviation image and the Y deviation image under reference conditions obtained by calculation or from an actual device.
  • the Δparallax deviation calculation processing unit 85 generates a Δparallax deviation image by multiplying the ΔY deviation image by a predetermined coefficient.
  • the parallax deviation calculation processing unit 90 adds the Δparallax deviation image to the parallax deviation image under reference conditions to obtain the parallax deviation caused by refraction at the windshield.
  • here, the reference conditions for the parallax deviation and the Y deviation are conditions such as the design value or the center value of the product.
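The flow just described can be sketched directly: the difference between the measured Y-deviation image and its reference-condition value (the ΔY image) is multiplied by a predetermined coefficient to give a Δparallax image, which is added to the reference-condition parallax deviation. The coefficient k and all array values below are invented.

```python
import numpy as np

# Hedged sketch of the second embodiment's estimation flow.
def parallax_deviation(y_dev, y_dev_ref, parallax_dev_ref, k):
    delta_y = y_dev - y_dev_ref      # delta-Y deviation image
    delta_parallax = k * delta_y     # delta-parallax via predetermined coefficient
    return parallax_dev_ref + delta_parallax

y = np.array([1.2, 0.8])             # measured Y deviation [px] (invented)
y_ref = np.array([1.0, 1.0])         # Y deviation under reference conditions
p_ref = np.array([0.5, 0.5])         # parallax deviation under reference conditions
print(parallax_deviation(y, y_ref, p_ref, k=2.0))
```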
  • the reason why the parallax shift can be calibrated in this embodiment will be explained. Because the windshield has radii of curvature in the horizontal and vertical directions as well as a tilt angle, deviation factors such as variations in the windshield's radius of curvature, the mounting position of the camera relative to the windshield, and variations in the windshield's thickness cause not only a change in the horizontal direction (Δparallax shift) but also a change in the vertical direction (ΔY shift).
  • since the ΔY shift and the Δparallax shift arise from the same deviation factors, the Δparallax shift can be estimated from the ΔY shift, and the parallax shift can then be obtained from the Δparallax shift and the parallax shift under the reference conditions.
  • the parallax shift data obtained in this manner is recorded in the stereo parallax calibration data recording unit 98 shown in FIG. 1, and calibration is performed in the stereo parallax image calibration unit 99.
  • the stereo image processing device 10 of this embodiment includes a stereo matching unit (matching unit 240) that performs stereo matching of a plurality of images captured by a plurality of cameras (right camera 50, left camera 60) that capture images of a subject through a light-transmitting member (windshield 1) to determine a parallax, and a calibration processing unit (stereo parallax image calibration unit 99) that determines calibration parameters for calibrating at least the parallax shift caused by the light-transmitting member (windshield 1) based on a result of stereo matching performed by the stereo matching unit (matching unit 240) on a plurality of images obtained by capturing images of a calibration chart with the plurality of cameras, the calibration chart including a pattern, and the pattern being a repeated pattern that is larger than the baseline length of the plurality of cameras (FIG. 11); the calibration processing unit (stereo parallax image calibration unit 99) determines the calibration parameters based on a result of stereo matching assuming that different patterns are the same point.
  • the stereo image processing device 10 of this embodiment also includes a stereo matching unit (matching unit 240) that performs stereo matching of multiple images captured by multiple cameras (right camera 50, left camera 60) that capture an image of a subject through a light-transmitting member (windshield 1) to determine parallax, and a calibration processing unit (stereo parallax image calibration unit 99) that determines calibration parameters for calibrating at least the parallax shift caused by the light-transmitting member (windshield 1) based on the results of the stereo matching performed by the stereo matching unit (matching unit 240) on multiple images obtained by capturing an image of a calibration chart with the multiple cameras.
  • the calibration chart includes a pattern, and the lengths of the pattern in the baseline direction of the multiple cameras differ between at least a first region and a second region (in other words, the pattern has a first region and a second region whose lengths in the baseline direction of the multiple cameras are different) (da2>da1 in FIG. 4, FIG. 11, etc.), and the calibration processing unit (stereo parallax image calibration unit 99) determines the calibration parameters based on the results of stereo matching in the first region and the second region of the calibration chart, performed assuming that different patterns are the same point.
  • when the first region is the area, on a plane including the lens optical axes of the multiple cameras, that includes the position on the calibration chart where the perpendicular bisector of the baseline of the multiple cameras intersects the chart (and the second region is an area whose position in the baseline direction of the multiple cameras differs from the first region), the second region has a longer length in the baseline direction of the multiple cameras than the first region (in other words, the farther a region is from the first region in the baseline direction of the multiple cameras, the longer its length in that direction) (da2>da1 in FIG. 4, FIG. 11, etc.).
  • in the stereo image processing device 10 of this embodiment, the calibration chart has a third area located, relative to the first area, in the direction perpendicular to the baseline direction on the sensor surface of the camera, and the lengths in the baseline direction of the multiple cameras differ between the first area and the third area ( FIGS. 12 and 13).
  • the stereo image calibration method of this embodiment includes a stereo matching process (matching unit 240) that performs stereo matching of multiple images captured by multiple cameras (right camera 50, left camera 60) that capture an image of a subject through a light-transmitting member (windshield 1) to determine parallax, and a calibration process (stereo parallax image calibration unit 99) that determines calibration parameters for calibrating at least the parallax shift caused by the light-transmitting member (windshield 1) based on the results of stereo matching of multiple images obtained by capturing images of a calibration chart with the multiple cameras in the stereo matching process (matching unit 240).
  • the calibration chart includes a pattern, and the pattern is a repeating pattern that is larger than the baseline length of the multiple cameras ( Figure 11).
  • the calibration parameters are determined based on the results of stereo matching performed assuming that different patterns are the same point.
  • the stereo image calibration method of this embodiment likewise includes a stereo matching process (matching unit 240) for determining parallax by performing stereo matching on a plurality of images captured by a plurality of cameras (right camera 50, left camera 60) that capture an image of a subject through a light-transmitting member (windshield 1), and a calibration process (stereo parallax image calibration unit 99) for determining calibration parameters for calibrating at least the parallax shift caused by the light-transmitting member (windshield 1) based on the results of the stereo matching, in the stereo matching process (matching unit 240), of a plurality of images obtained by capturing an image of a calibration chart with the plurality of cameras.
  • the calibration chart includes a pattern, and the lengths of the pattern in the baseline direction of the multiple cameras differ between at least a first region and a second region (in other words, the pattern has a first region and a second region whose lengths in the baseline direction of the multiple cameras are different) (da2>da1 in FIG. 4, FIG. 11, etc.), and in the calibration process (stereo parallax image calibration unit 99), the calibration parameters are calculated based on the results of stereo matching in the first region and the second region of the calibration chart, where different patterns are considered to be the same point.
  • the stereo image processing device 10 and stereo image calibration method of this embodiment change the pattern size of the calibration chart G20 from the center toward the outside, thereby suppressing positional deviations caused by refraction at light-transmitting members such as a windshield.
  • thus, this embodiment can provide a stereo image processing device 10 and a stereo image calibration method that can calibrate a wide-angle camera with high accuracy without increasing the size of the chart.
  • the present invention is not limited to the above-mentioned embodiment, but includes various other variations.
  • the above-mentioned embodiment has been described in detail to clearly explain the present invention, and is not necessarily limited to those having all of the configurations described.
  • the above-mentioned configurations, functions, processing units, processing means, etc. may be realized in hardware, in part or in whole, for example by designing them as integrated circuits. Further, the above-mentioned configurations, functions, etc. may be realized in software by a processor interpreting and executing a program that realizes each function. Information on the programs, tables, files, etc. that realize each function can be stored in a memory, a storage device such as a hard disk or SSD (Solid State Drive), or a recording medium such as an IC card, SD card, or DVD.
  • control lines and information lines shown are those considered necessary for the explanation, and do not necessarily show all control lines and information lines on the product. In reality, it can be assumed that almost all components are interconnected.
  • 10: stereo camera image processing device (stereo image processing device), 50: right camera, 60: left camera, 20a: affine processing means, 20b: affine processing means, 98: stereo parallax calibration data recording unit, 99: stereo parallax image calibration unit (calibration processing unit), 200: stereo parallax image generation unit, 201: Y-shift image generation unit, 400: road surface cross-sectional shape estimation unit, 500: stereo vision three-dimensional object detection unit
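
The "different patterns are the same point" device in the claims above can be illustrated numerically. For a rectified pinhole pair with baseline B, a chart at distance L, and a repeating pattern of pitch p, a matcher that locks onto the neighbouring repeat measures a disparity of f·(B − p)/L, which is exactly zero — emulating a target at infinity — when the pitch equals the baseline. The following is a minimal sketch; the function name and numeric values are illustrative, not taken from the patent:

```python
def apparent_disparity(f_px, baseline_m, pitch_m, chart_dist_m):
    """Disparity (pixels) measured when stereo matching locks onto the
    neighbouring repeat of a periodic chart pattern: d = f * (B - p) / L.
    With pitch p == baseline B the match behaves like a point at infinity."""
    return f_px * (baseline_m - pitch_m) / chart_dist_m

# Pitch equal to the baseline: zero disparity, a virtual infinity target.
print(apparent_disparity(1400.0, 0.35, 0.35, 2.0))  # 0.0
# A pitch slightly larger than the baseline (as in the off-axis regions,
# da2 > da1) shifts the matched disparity, absorbing the windshield offset.
print(apparent_disparity(1400.0, 0.35, 0.36, 2.0))
```

Enlarging the pitch off-axis, as the claims describe, lets the near-field chart compensate the extra ray offset introduced by the windshield while still emulating an infinitely distant target.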

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Measurement Of Optical Distance (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Stereoscopic And Panoramic Photography (AREA)

Abstract

The objective of the present invention is to provide a stereo image processing device and a stereo image calibration method capable of calibrating a wide-angle camera mounted in a vehicle with a high degree of accuracy without increasing the size of a chart. A calibration chart includes patterns, wherein: the lengths of the patterns in the baseline direction of a plurality of cameras differ at least between a first region and a second region; and a calibration processing unit obtains a calibration parameter on the basis of a result of performing stereo matching in each of the first region and the second region of the calibration chart, assuming that the different patterns are the same point.

Description

Stereo image processing device and stereo image calibration method

The present invention relates to a stereo image processing device mounted on, for example, a vehicle (vehicle-mounted) and to a stereo image calibration method.

Stereo cameras are known as devices for recognizing objects three-dimensionally. A stereo camera uses the differences in how images are captured by multiple cameras placed at different positions to detect the parallax between the cameras based on triangulation, and uses this parallax to detect the depth and position of an object, making it possible to accurately detect the position of the measurement target.
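
The triangulation described above can be written directly: with focal length f (in pixels), baseline B, and measured disparity d, the distance is Z = f·B/d. A minimal sketch with illustrative numbers:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Stereo triangulation for a rectified camera pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite distance")
    return focal_px * baseline_m / disparity_px

# e.g. f = 1000 px, B = 0.35 m, d = 35 px  ->  Z = 10 m
print(depth_from_disparity(35.0, 1000.0, 0.35))
```

Because Z is inversely proportional to d, a small disparity error translates into a large range error at long distances, which is why the parallax-shift calibration discussed later matters most for distant targets.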

Such stereo cameras are mounted on automobiles and other vehicles and are applied to technology for detecting the positions of obstacles and the like (in-vehicle sensing technology). To accommodate many use cases, in-vehicle sensing technology is required to detect obstacles over a wide angle of view and at greater distances (wider angle of view and longer range).

In-vehicle stereo cameras are generally installed inside the vehicle cabin to avoid the effects of dirt and other factors, but as the angle of view becomes wider, the influence of the vehicle's windshield can no longer be ignored. Conventionally, a stereo camera calibration procedure known as aiming is carried out during vehicle manufacturing or inspection to correct mounting misalignment of the stereo camera, but widening the angle of view calls for further measures.

Patent Document 1 is known as disclosing a stereo camera calibration method that takes the influence of the windshield into account. Patent Document 1 states the problem as "to provide a stereo camera calibration device and calibration method that can properly calibrate a stereo camera using a target board even in a narrow space," and the solution as "a stereo camera calibration device that measures the distance to an object based on parallax, comprising a stereo camera 3 having a pair of cameras 4, 6 spaced apart by a predetermined baseline length B; a target board 8 that is arranged parallel to the arrangement direction of the pair of cameras 4, 6 and includes at least two targets 8a, 8b spaced apart by the same distance as the baseline length B; and a calculation device that, assuming the target board 8 is located at infinity relative to the stereo camera 3, treats the two targets 8a, 8b included in the images of the target board 8 photographed by each of the pair of cameras 4, 6 as the same target and calibrates the parallax shift of the pair of cameras 4, 6." Patent Document 2 describes a similar calibration technology. Patent Document 2 states the problem as "to variably set the search range for stereo matching according to the position on the image," and the solution as "the system has a comparison image line memory 7, an address generation circuit 10, and a stereo matching circuit 8. The line memory 7 stores image data within a reference pixel area in one captured image and image data on a horizontal line corresponding to the vertical position of the reference pixel area in the other captured image. The address generation circuit 10 sets the search range for stereo matching and instructs the line memory 7 to read the image data within the set search range and the image data within the reference pixel area. The stereo matching circuit 8 then identifies the correlation destination of the reference pixel area by stereo matching, based on the image data within the search range read from the line memory 7 and the image data of the reference pixel area. Here, the address generation circuit 10 corrects the position of the search range for the reference pixel area based on the degree of deviation of the corresponding point at infinity relative to the horizontal position of the reference pixel area."

JP 2017-62150 A; JP 2001-92968 A

Patent Documents 1 and 2 state that by placing a chart near the stereo camera, matching the period of the chart pattern to the spacing between the two cameras (the baseline length), and virtually treating the chart as being at infinity, the effect of the tilt deviation of light rays due to refraction at the windshield can be calibrated. However, these technologies do not take into account the effect of the positional shift of light rays due to refraction at the windshield, and thus have the problem that complete calibration cannot be achieved when a chart is actually placed near the stereo camera.

The present invention has been made in view of the above problems, and provides a stereo image processing device and a stereo image calibration method that can calibrate a wide-angle stereo camera with high accuracy by placing a chart near the stereo camera, without increasing the size of the chart.

One aspect of the stereo image processing device according to the present invention includes a stereo matching unit that performs stereo matching of multiple images captured by multiple cameras that capture an image of a subject through a light-transmitting member to determine parallax, and a calibration processing unit that determines calibration parameters for calibrating at least the parallax shift caused by the light-transmitting member based on the results of stereo matching performed by the stereo matching unit on multiple images obtained by capturing a calibration chart with the multiple cameras, in which the calibration chart includes a pattern, the pattern is a repeating pattern that is larger than the baseline length of the multiple cameras, and the calibration processing unit determines the calibration parameters based on the results of stereo matching performed assuming that different patterns are the same point.

The stereo image processing device and stereo image calibration method according to the present invention make it possible to calibrate a wide-angle camera with high accuracy without increasing the size of the chart.

Problems, configurations, and advantages other than those mentioned above will become clear from the following description of the embodiments.

FIG. 1 is a block diagram illustrating the configuration of a stereo camera image processing device according to the first embodiment.
FIG. 2 is a diagram explaining a problem of the conventional method, in relation to the first embodiment.
FIG. 3 is another diagram explaining the problem of the conventional method, in relation to the first embodiment.
FIG. 4 shows the shape of the chart areas according to the first embodiment.
FIG. 5 shows the patterns in each area of the chart according to the first embodiment.
FIG. 6 shows optical paths according to the first embodiment.
FIG. 7 shows the parallax shift of the conventional method, in relation to the first embodiment.
FIG. 8 shows the measured distance of the conventional method, in relation to the first embodiment.
FIG. 9 shows the parallax shift according to the first embodiment.
FIG. 10 shows the measured distance according to the first embodiment.
FIG. 11 shows the interval D according to the first embodiment.
FIG. 12 shows another shape of the chart areas according to the first embodiment.
FIG. 13 shows another shape of the chart areas according to the first embodiment.
FIG. 14 shows other patterns for each area of the chart according to the first embodiment.
FIG. 15 is a block diagram illustrating a configuration for creating calibration data in a stereo camera image processing device according to a second embodiment.

The present embodiment will now be described with reference to the attached drawings. In the attached drawings, functionally identical elements may be indicated by the same reference numbers. The attached drawings show embodiments in accordance with the principles of the present disclosure, but they are provided for the purpose of understanding the present disclosure and are in no way to be used to interpret the present disclosure restrictively. The descriptions in this specification are merely typical examples and do not limit the scope of the claims or the applications of the present disclosure in any way.

In this specification, the disclosure is described in sufficient detail for those skilled in the art to implement it, but it should be understood that other forms are possible, and that changes to the configuration and structure and substitution of various elements are possible without departing from the scope and spirit of the technical ideas of the disclosure. The following description should therefore not be interpreted as limiting.

[Example 1]
The configuration of a stereo camera image processing device 10 (hereinafter, "image processing device 10" or "stereo image processing device 10") according to the first embodiment will be described with reference to FIG. 1. The image processing device 10 is mounted on a vehicle such as an automobile and is used to detect the distance from the vehicle to three-dimensional objects around it (other automobiles, buildings, pedestrians, etc.). In the following, the case where the image processing device 10 is mounted on a vehicle is described as an example, but the invention is not limited to this.

FIG. 1 is a block diagram showing an example configuration of the image processing device 10 according to the first embodiment. The image processing device 10 is configured to detect surrounding three-dimensional objects based on images obtained by a right camera 50 and a left camera 60, and to issue a warning if necessary. The right camera 50 and the left camera 60 form a stereo camera. Note that the number of cameras forming the stereo camera is not limited to two (left and right).

As an example, the image processing device 10 comprises an image processing unit 100, a stereo parallax image generation unit 200, a stereo parallax image calibration unit 99, a road surface cross-sectional shape estimation unit 400, a stereo vision three-dimensional object detection unit 500, and a warning control unit 700.

In the image processing device 10, for the area that can be imaged by both the right camera 50 and the left camera 60 (hereinafter, the "stereo vision area"), a stereo parallax image is generated in the stereo parallax image generation unit 200 using the parallax between the right camera 50 and the left camera 60. The stereo vision three-dimensional object detection unit 500 then measures the distance from the vehicle to a three-dimensional object according to the parallax.

The right camera 50 and the left camera 60 each comprise a lens and an image sensor (not shown), and each captures an image of an object with the image sensor via the lens. In this embodiment, the right camera 50 and the left camera 60 are mounted inside the vehicle cabin and capture images of subjects around the vehicle through a light-transmitting member such as the windshield. The image processing device 10 acquires an image P1 (first image) from the right camera 50 and an image P2 (second image) from the left camera 60.

As an example, the image processing unit 100 comprises affine processing means 20a and 20b, luminance correction means 21a and 21b, pixel interpolation means 22a and 22b, and luminance information generation means 23a and 23b. The image processing unit 100 applies predetermined image processing to the images P1 and P2 obtained by the right camera 50 and the left camera 60, and supplies the results to the stereo parallax image generation unit 200.

The affine processing means 20a applies affine processing to the image P1 from the right camera 50. The affine processing is, for example, a linear coordinate transformation, but may also include non-linear operations. As a result of this affine processing, the affine processing means 20a obtains an image P3 (third image). Similarly, the affine processing means 20b applies affine processing to the image P2 from the left camera 60 to obtain an image P4 (fourth image).

The affine processing means 20a and 20b may also perform distortion transformation processing other than affine processing. In this embodiment, the f·sinθ projection of the fisheye lens is transformed into the coordinate system (f·tanθx, f·tanθy), where f is the focal length of the fisheye lens, θ is the angle of incidence on the fisheye lens, and θx, θy are the horizontal and vertical components of that angle. Furthermore, in this embodiment, the vertical pixel displacement caused by the influence of the windshield is corrected by the affine processing means 20a and 20b.
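
The projection conversion above can be sketched as follows, treating the two axes independently as the (f·tanθx, f·tanθy) notation suggests. The per-axis separation is a simplification of a full two-dimensional remap, and the focal length value is illustrative only:

```python
import math

def fsin_to_ftan(x_px, y_px, f_px):
    """Map a coordinate from an f*sin(theta) fisheye image into the
    (f*tan(theta_x), f*tan(theta_y)) coordinate system, axis by axis."""
    tx = math.asin(max(-1.0, min(1.0, x_px / f_px)))  # recover theta_x from f*sin
    ty = math.asin(max(-1.0, min(1.0, y_px / f_px)))  # recover theta_y from f*sin
    return f_px * math.tan(tx), f_px * math.tan(ty)

# A point at half the focal length sits at theta = 30 degrees; its
# perspective (f*tan) coordinate lies farther out than its fisheye one.
print(fsin_to_ftan(500.0, 0.0, 1000.0))
```

In practice such a transform would be baked into the same remap table as the affine and windshield corrections, so each pixel is resampled only once.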

The luminance correction means 21a corrects the luminance of each pixel of the image P3, for example based on the gain of the right camera 50 and the gain differences between pixels in the image P3. Similarly, the luminance correction means 21b corrects the luminance of each pixel of the image P4.

The pixel interpolation means 22a performs demosaicing on the image P3, for example converting a RAW image into a color image. Similarly, the pixel interpolation means 22b performs demosaicing on the image P4.

The luminance information generation means 23a generates luminance information for the image P3, for example converting information representing a color image into luminance information for generating a parallax image. Similarly, the luminance information generation means 23b generates luminance information for the image P4.

The stereo parallax image generation unit 200 uses the portions of the obtained images P3 and P4 in the aforementioned stereo vision area (common field of view) to generate a stereo parallax image of the stereo vision area.

The stereo parallax image generation unit 200 comprises an exposure adjustment unit 210 and a sensitivity correction unit 220, and can execute feedback control of the right camera 50 and the left camera 60 with respect to their exposure, sensitivity, and the like. It further comprises a geometric correction unit 230 that performs geometric correction of the left and right images, and a matching unit 240 that performs matching processing of the left and right images. The matching unit 240 performs stereo matching of the left and right images to obtain the parallax (stereo parallax image).
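
The matching algorithm used by the matching unit 240 is not detailed in the text; a common choice for rectified images is block matching along each scanline, for example by minimising the sum of absolute differences (SAD). The following one-scanline sketch is illustrative only; the window size and search range are arbitrary, not values from the patent:

```python
import numpy as np

def match_row_sad(left_row, right_row, block=9, max_disp=64):
    """1-D SAD block matching on one rectified scanline: for each pixel of
    the left row, find the disparity d minimising the sum of absolute
    differences against the right row shifted by d."""
    half = block // 2
    w = len(left_row)
    disp = np.zeros(w, dtype=int)
    for x in range(half + max_disp, w - half):
        ref = left_row[x - half:x + half + 1]
        costs = [np.abs(ref - right_row[x - d - half:x - d + half + 1]).sum()
                 for d in range(max_disp)]
        disp[x] = int(np.argmin(costs))  # winner-takes-all disparity
    return disp
```

Production implementations add sub-pixel interpolation, uniqueness checks, and cost aggregation, but the principle is the same.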

The stereo parallax image calibration unit 99 corrects the parallax shift caused by a light-transmitting member such as the windshield, based on data previously acquired and stored in the stereo parallax calibration data recording unit 98.
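
One plausible way to apply such recorded data, sketched here as an assumption since the patent does not specify the data layout, is to hold a per-pixel parallax-shift map (the windshield contribution measured at calibration time) and subtract it from the raw matched disparity:

```python
import numpy as np

def calibrate_disparity(raw_disparity, shift_map):
    """Correct a matched disparity image with a recorded per-pixel
    parallax-shift map: corrected = raw - shift."""
    raw = np.asarray(raw_disparity, dtype=float)
    shift = np.asarray(shift_map, dtype=float)
    if raw.shape != shift.shape:
        raise ValueError("disparity image and shift map must share a shape")
    return raw - shift

print(calibrate_disparity([[5.0, 6.0]], [[0.5, -0.2]]))
```

A coarse calibration grid could equally be stored and interpolated to full resolution at startup; the subtraction step itself is unchanged.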

The road surface cross-sectional shape estimation unit 400 estimates the cross-sectional shape of the road surface along which the vehicle equipped with the image processing device 10 is scheduled to travel.

The stereo vision three-dimensional object detection unit 500 detects three-dimensional objects in the stereo vision area according to the stereo parallax image generated by the stereo parallax image generation unit 200. It also applies stereo matching to each detected three-dimensional object to detect its parallax, and identifies the type of the object (pedestrian, bicycle, vehicle, building, etc.). By detecting a three-dimensional object and identifying its type, the type relevant for preventive safety is further specified. When a vehicle is detected, the detection result can be used for follow-up control of a preceding vehicle or for emergency braking control. When the detected object is a pedestrian or bicycle, emergency braking control and warning control can be executed. Compared with stationary objects, objects moving out toward the vehicle require warnings and control over a wide field of view. By measuring the distance to the detected objects and estimating the moving speed of objects tracked over time, the warning control unit 700 can execute more appropriate warnings and control.

Within FIG. 1, this embodiment is characterized by how the data in the stereo parallax calibration data recording unit 98 is created.

Next, the effects of this embodiment will be described, starting with a detailed explanation of the problems of the conventional technology. As described above, Patent Document 1 (hereinafter, "Conventional Technology 1") and Patent Document 2 (hereinafter, "Conventional Technology 2") place a chart near the stereo camera, match the period of the chart pattern to the spacing between the two cameras (the baseline length), and virtually treat the chart as being at infinity, thereby claiming to calibrate the effect of the tilt deviation of light rays caused by refraction at the windshield. Refraction at the windshield, however, has two main effects. The first is image distortion due to the tilt deviation of the light rays; this distortion occurs in the same way whether the chart is placed nearby or far away. The second is the positional shift of the light rays; this shift has a large influence for a chart or measurement target placed nearby, but is negligible for one placed far away.

 Of these two factors, Prior Art 1 and Prior Art 2 do not take into account the positional deviation of the rays caused by refraction at the windshield. As described above, this positional deviation is detected when measuring the distance to a nearby target but is negligible for a distant one. Normally, the parallax shift is calibrated so that distant ranging is most accurate. If, however, the parallax shift is calibrated using a nearby chart, a parallax shift equal to the ray positional deviation caused by the windshield refraction remains when a distant target is detected. For this reason, Prior Art 1 and Prior Art 2 cannot completely calibrate the parallax shift when the chart is placed nearby.

 FIG. 2 illustrates this problem. The figure shows a cross section through the lens pupil positions of the right camera 50 and the left camera 60. For ease of explanation, the ray displacement caused by the windshield 1 is drawn larger than it actually is. In the figure, the rays at a horizontal angle of view of 0 degrees for the right camera 50 and the left camera 60 are ray R51 and ray R61, respectively. The interval between ray R61 and ray R51 where they are incident on the windshield 1 (i.e., on the side of the windshield 1 opposite the stereo camera image processing device 10) is interval D1. The rays on the horizontal wide-angle side for the right camera 50 and the left camera 60 are ray R52 and ray R62, and, similarly, the interval between ray R62 and ray R52 incident on the windshield 1 is interval D2. The extension of ray R51 between the windshield 1 and the right camera 50 is axis K51, and the extension of ray R61 between the windshield 1 and the left camera 60 is axis K61. Similarly, the extension of ray R52 between the windshield 1 and the right camera 50 is axis K52, and the extension of ray R62 between the windshield 1 and the left camera 60 is axis K62. The interval between axis K51 and axis K61 and the interval between axis K52 and axis K62 are both equal to the baseline length B.

 First, consider a horizontal angle of view of 0 degrees. Ray R51 and axis K51 are almost, but not exactly, coincident, and the same holds for ray R61 and axis K61. Consequently, the baseline length B and the interval D1 do not coincide. The wide-angle interval D2 also differs from the baseline length B. What is characteristic here is that, due to the influence of the windshield 1, the interval D1 and the interval D2 differ from each other. Prior Art 1 and Prior Art 2 estimate the distortion caused by the windshield on the assumption that interval D1, interval D2, and baseline length B coincide; in reality, they do not.

 FIG. 3 illustrates the consequence of the baseline length B differing from the intervals D (interval D1, interval D2). Like FIG. 2, the figure shows a cross section through the lens pupil positions of the right camera 50 and the left camera 60, with the ray displacement caused by the windshield 1 exaggerated for clarity. The charts described in Prior Art 1 and Prior Art 2 use a pattern whose period equals the baseline length, and a chart G10 with the same pattern period is placed here as well. In the figure, the axes at a given angle of view from the pupil positions of the right camera 50 and the left camera 60 are axis K53 and axis K63, and their intersections with the chart G10 are position Q1 and position Q2, respectively. Since axis K53 and axis K63 have equal inclinations, the interval between position Q1 and position Q2 equals the baseline length B. This configuration is the same as in Prior Art 1 and Prior Art 2. The rays by which the right camera 50 and the left camera 60 detect the images at position Q1 and position Q2 are ray R53 and ray R63. The extensions of ray R53 and ray R63 between the pupil positions of the two cameras and the windshield 1 are line S53 and line S63, respectively, and the intersection of line S53 and line S63 is position TP1.

 Prior Art 1 and Prior Art 2 claim that giving the chart G10 a pattern with the same period as the baseline length makes it possible to virtually detect infinity. For this to hold, the two axes, such as axis K53 and axis K63, must be parallel. However, after passing through the windshield 1, the angle of ray R53 entering the right camera 50 differs from the angle of ray R63 entering the left camera 60. A stereo camera adjusted with this chart therefore produces a large error: in the figure, what should appear at infinity is detected as if the chart were at position TP1, and the camera is calibrated for the distance L instead, which is a serious problem.
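The scale of this calibration error can be sketched numerically (an illustrative aside, not part of the embodiment; the 0.1 mrad angular mismatch is an assumed value): two rays that should be parallel but differ by a small angle δθ intersect at a finite distance of roughly L ≈ B/δθ, so a "virtual infinity" chart is in effect calibrated to that distance.

```python
import math

def apparent_distance(baseline_m: float, ray_angle_diff_rad: float) -> float:
    """Distance at which two camera rays intersect.

    If the rays entering the two cameras were parallel, the target would lie
    at infinity; a small angular difference delta_theta makes them converge
    at a finite distance L ~= B / delta_theta (small-angle approximation).
    """
    if ray_angle_diff_rad == 0.0:
        return math.inf
    return baseline_m / ray_angle_diff_rad

B = 0.2  # baseline length B = 200 mm, matching the simulation parameters
# An assumed ray-angle mismatch of 0.1 mrad caused by windshield refraction:
L = apparent_distance(B, 1e-4)
print(L)  # -> 2000.0 : "infinity" is calibrated as if it were 2 km away
```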

 This embodiment takes this into consideration. FIG. 4 shows the shapes of the areas of the chart G20 of this embodiment; each area carries a similar pattern (square in the illustrated example). In this embodiment, the areas (patterns) of the chart G20 have different widths in the horizontal direction (the baseline direction). For example, the horizontal width da1 of area C33 (first area) of the chart G20 differs from the horizontal width da2 of area C35 (second area), with da2 larger than da1. When area C33 is taken to be the area containing the position on the chart G20 where the perpendicular bisector of the two cameras of the stereo camera intersects the chart, on the plane containing the lens optical axes of the two cameras (and area C35 is an area whose horizontal (baseline-direction) position differs from that of area C33), the widths da1 and da2 are both larger than the baseline length B, and da1 is closer to the baseline length B than da2 (that is, da2 is larger than da1). Note that the smaller the radius of curvature of the windshield, the further the widths da1 and da2 depart from the baseline length B (see FIG. 11), whereas the larger the radius of curvature, the closer they approach it (see FIG. 11). Since the radius of curvature of the windshield differs from vehicle to vehicle, the chart spacing must be set accordingly. Likewise, the smaller the distance between the stereo camera and the chart, the further da1 and da2 depart from the baseline length B, and the larger that distance, the closer they approach it.
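To give a feel for why the chart widths must exceed the baseline length, the following sketch computes the lateral displacement of a ray through a flat parallel plate, s = t·sin i·(1 − cos i/√(n² − sin² i)). This is a deliberate simplification: it ignores the windshield curvature and rake geometry discussed in the text, and the values (5.0 mm thickness, refractive index 1.52, 45-degree incidence) are only illustrative.

```python
import math

def lateral_shift_mm(thickness_mm: float, incidence_deg: float, n: float) -> float:
    """Lateral displacement of a ray passing through a flat parallel plate
    of given thickness and refractive index (standard plate formula)."""
    i = math.radians(incidence_deg)
    return thickness_mm * math.sin(i) * (
        1.0 - math.cos(i) / math.sqrt(n * n - math.sin(i) ** 2)
    )

# Windshield-like plate: 5.0 mm thick, n = 1.52, rays incident at 45 degrees.
s = lateral_shift_mm(5.0, 45.0, 1.52)
print(round(s, 2))  # -> 1.68 (mm per ray)
# Each camera's ray is shifted sideways by this amount; because the two
# cameras see the glass at slightly different angles, the pattern spacing on
# a nearby chart that corresponds to parallel exit rays ends up larger than
# the baseline length B.
```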

 FIG. 5 shows the pattern within each area of the chart G20 of FIG. 4, for example the pattern within area C33. Here, too, the horizontal widths differ: for example, the horizontal width db1 of (0,0) differs from the horizontal width db2 of (2,0), with db2 larger than db1. The detected light amount is set differently for each area so that incorrect matching does not occur. With this configuration, for example, the shift between corresponding points of the image in which the right camera 50 detects area C34 of the chart G20 and the image in which the left camera 60 detects area C33 is detected. The shift can be obtained with a detection method used in stereo cameras, such as block matching or feature-point extraction.
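The block matching mentioned above can be sketched in a few lines of generic sum-of-absolute-differences (SAD) search; this is not the device's actual implementation, and the images and shift below are synthetic.

```python
def sad_match(left_block, right_rows, max_shift):
    """Return the horizontal shift (in pixels) that minimizes the sum of
    absolute differences between the left-image block and the right image."""
    h = len(left_block)
    w = len(left_block[0])
    best_shift, best_cost = 0, float("inf")
    for s in range(max_shift + 1):
        if s + w > len(right_rows[0]):
            break
        cost = sum(
            abs(left_block[y][x] - right_rows[y][s + x])
            for y in range(h)
            for x in range(w)
        )
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift

# Synthetic example: the "right image" is the left pattern shifted by 3 px.
left = [[255 if 10 <= x < 14 else 0 for x in range(32)] for _ in range(8)]
right = [[row[(x - 3) % 32] for x in range(32)] for row in left]
block = [row[8:16] for row in left]   # 8x8 block around the bright square
search = [row[8:] for row in right]   # search window in the right image
print(sad_match(block, search, max_shift=6))  # -> 3
```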

 FIG. 6 shows the optical paths when the positional deviation of the rays caused by the windshield is taken into account. The rays of the left camera 60 are the same as in FIG. 3. The angle of ray R54 is changed so that the angle of ray R63 entering the left camera 60 matches the angle of ray R54 entering the right camera 50 (line S54 and line S63 are parallel, and the interval between them equals the baseline length B). In this case, the horizontal periodic pattern on the chart G20 (in other words, the interval between positions Q3 and Q2 of the images on the chart G20 detected by the right camera 50 and the left camera 60 via rays R54 and R63) is spaced at the interval D.

 The parallax shift caused by the ray-tilt deviation due to windshield refraction in this embodiment can be obtained with processing equivalent to that of FIG. 1. That is, in the stereo parallax image calibration unit 99, calibration parameters that calibrate at least the parallax shift caused by (refraction at) the windshield, which is a light-transmitting member, are obtained based on the result of the matching unit 240 performing stereo matching on the left and right images (image P1, image P2) obtained by imaging the chart G20 with the left and right cameras (right camera 50, left camera 60). The stereo parallax image calibration unit 99 obtains these calibration parameters based on the result of stereo matching performed on the assumption that different patterns are the same point, and the result (the calibration data) of the stereo parallax image calibration unit 99 is this parallax shift. The data is processed so as to correct this parallax shift, recorded in the stereo parallax calibration data recording unit 98, and used for calibration by the stereo parallax image calibration unit 99. Such calibration achieves high accuracy for distant targets, while for nearby targets a parallax shift caused by the ray positional deviation due to windshield refraction remains. However, since the tolerance for parallax shift is larger at close range than at long range, the influence is small.
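How the recorded calibration data might be applied can be sketched as follows; the per-column shift map and the numeric values are hypothetical, used only to show the correction step (subtracting the recorded windshield-induced parallax shift from the raw matched disparity).

```python
def apply_parallax_calibration(raw_disparity, shift_map):
    """Subtract the per-column parallax-shift calibration value (e.g. data
    of the kind recorded in a stereo parallax calibration data recording
    unit) from the raw disparity measured by stereo matching."""
    return [d - s for d, s in zip(raw_disparity, shift_map)]

raw = [4.27, 4.31, 4.40]    # raw disparities across the image (px), hypothetical
shift = [0.00, 0.04, 0.13]  # calibration data: windshield-induced shift (px)
corrected = apply_parallax_calibration(raw, shift)
print([round(v, 2) for v in corrected])  # -> [4.27, 4.27, 4.27]
```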

 FIG. 7 shows the simulation results for Prior Art 1 and Prior Art 2. The calculation was performed with the following parameters.
 <Windshield>
 ・Radius of curvature
  - Horizontal: 5.0 m
  - Vertical: 5.0 m
 ・Windshield rake angle: 45 degrees
 ・Windshield thickness: 5.0 mm
 ・Windshield refractive index: 1.52
 ・Stereo camera to windshield distance: 50 mm
 <Stereo camera>
 ・Baseline length: 200 mm
 ・Focal length: 4.0 mm
 ・Sensor pixel pitch: 0.00375 mm
 ・Angle of view: -72 to +72 degrees (horizontal), 0 degrees (vertical)
 <Chart>
 ・Camera to chart distance: 0.5 m
 <Measurement target>
 ・Stereo camera to target distance: 50 m
 ・(when estimating the windshield influence) Stereo camera to target distance: 1000 m
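As an illustrative aside, the listed parameters fix the nominal stereo geometry through the relation d = f·B/(Z·p), where p is the pixel pitch. The sketch below, using only the values above, shows the disparity expected for the 50 m target and how a residual parallax shift perturbs the measured distance; the 0.2 px shift is an assumed value, not a result from the figures.

```python
def disparity_px(focal_mm, baseline_mm, pixel_pitch_mm, distance_mm):
    """Ideal stereo disparity d = f * B / (Z * p) for a fronto-parallel target."""
    return focal_mm * baseline_mm / (distance_mm * pixel_pitch_mm)

def distance_m(focal_mm, baseline_mm, pixel_pitch_mm, d_px):
    """Inverse relation: Z = f * B / (d * p), converted from mm to m."""
    return focal_mm * baseline_mm / (d_px * pixel_pitch_mm) / 1000.0

f, B, p = 4.0, 200.0, 0.00375          # parameters from the simulation above
d50 = disparity_px(f, B, p, 50_000.0)  # target at 50 m
print(round(d50, 3))                   # -> 4.267 px

# An assumed residual parallax shift of 0.2 px turns the 50 m target into:
print(round(distance_m(f, B, p, d50 + 0.2), 1))  # -> 47.8 m
```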

 FIG. 7 shows the dependence of the parallax shift on the horizontal angle of view; the vertical axis is the parallax shift and the horizontal axis is the horizontal angle of view. The figure shows results for two conditions, with and without the windshield. For the case with the windshield, the difference between the parallax shift detected with the nearby chart and the parallax shift when the stereo camera to target distance is 1000 m was calculated; that is, the influence of the ray-tilt deviation due to windshield refraction has been removed. The parallax shift on the vertical axis is therefore the effect of the ray positional deviation caused by windshield refraction described above. In actual calibration, the tilt deviation and the positional deviation of the rays caused by the windshield are detected simultaneously, so this parallax shift remains after the calibration described in Prior Art 1 and Prior Art 2 is performed.

 FIG. 8 shows the dependence of the measured distance on the horizontal angle of view; the vertical axis is the distance measured by the stereo camera and the horizontal axis is the horizontal angle of view. Here, the measured distance is calculated from the parallax shift shown in FIG. 7. Without the windshield, accurate ranging (50 m) is achieved, whereas with the windshield the ranging error increases. A ranging error occurs even where the horizontal angle of view is small (for example, 0 degrees), but the error becomes larger on the wide-angle side.

 FIG. 9 shows the dependence of the parallax shift on the horizontal angle of view when this embodiment is applied; the vertical axis is the parallax shift and the horizontal axis is the horizontal angle of view. As in FIG. 7, the influence of the distortion caused by the windshield has been removed. The result shows that almost no parallax shift remains when this embodiment is applied, which means this embodiment can obtain the parallax shift caused by windshield distortion with high accuracy.

 FIG. 10 shows the dependence of the measured distance on the horizontal angle of view when this embodiment is applied; the vertical axis is the distance measured by the stereo camera and the horizontal axis is the horizontal angle of view. It can be seen that applying this embodiment achieves highly accurate ranging (50 m).

 FIG. 11 shows the dependence of the optimal chart interval D (FIG. 6) (the interval of the periodic or repeating pattern) on the horizontal angle of view; the vertical axis is the interval D and the horizontal axis is the horizontal angle of view. The parallax shift and measured distance in FIGS. 9 and 10 are calculated with this interval D. As shown in FIG. 11, the interval D is set longer than the baseline length (200 mm) over the entire horizontal angle range, which yields the highly accurate results shown in FIGS. 9 and 10.

 As described above, this embodiment, like Prior Art 1 and Prior Art 2, compares different areas (performs stereo matching on the assumption that different patterns are the same point) and detects the shift between them. The difference from Prior Art 1 and Prior Art 2 is that, as shown in FIG. 6, the interval D of the horizontal periodic pattern on the chart G20 differs from the baseline length B (specifically, it is larger than the baseline length B). Furthermore, by making the length of the pattern in the baseline direction of the two cameras differ between at least the first area and the second area, this embodiment can improve the ranging accuracy.

 In this embodiment, as in FIG. 4, the widths do not vary with the vertical angle of view, but the variation of the width in the horizontal direction (baseline direction) may be changed with the vertical angle of view, as in FIG. 12 for example. For instance, the horizontal (baseline-direction) width may differ between area C33 (first area) of the chart G20 and areas C13, C23, C43, and C53 (third areas), which lie in the direction perpendicular to the baseline direction of the camera sensor plane with respect to area C33. Because the windshield is raked, the optimal horizontal (baseline-direction) interval D changes with the vertical angle of view, and varying the width accordingly improves the ranging accuracy over the entire image. The areas may be connected stepwise, as in FIG. 12, or smoothly, as in FIG. 13. It goes without saying that an effect is also obtained, for example, with just two intervals D, one for the horizontal narrow-angle portion and one for the horizontal wide-angle portion.
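The stepwise (FIG. 12 style) and smooth (FIG. 13 style) joining of areas can be sketched as two ways of evaluating the baseline-direction width as a function of the vertical angle of view; the anchor angles and widths below are hypothetical illustration values, not taken from the figures.

```python
def chart_width(vertical_angle_deg, anchors, smooth=True):
    """Baseline-direction width of the chart pattern as a function of the
    vertical angle of view. `anchors` maps a few vertical angles to widths;
    areas are joined either stepwise (hold the lower anchor) or by linear
    interpolation (smooth)."""
    angles = sorted(anchors)
    if vertical_angle_deg <= angles[0]:
        return anchors[angles[0]]
    if vertical_angle_deg >= angles[-1]:
        return anchors[angles[-1]]
    for a0, a1 in zip(angles, angles[1:]):
        if a0 <= vertical_angle_deg <= a1:
            if not smooth:
                return anchors[a0]  # stepwise join, FIG. 12 style
            t = (vertical_angle_deg - a0) / (a1 - a0)
            return anchors[a0] + t * (anchors[a1] - anchors[a0])

# Hypothetical widths (mm) at three vertical angles of view:
widths = {-10.0: 210.0, 0.0: 205.0, 10.0: 212.0}
print(chart_width(5.0, widths, smooth=True))   # -> 208.5 (smooth, FIG. 13 style)
print(chart_width(5.0, widths, smooth=False))  # -> 205.0 (stepwise, FIG. 12 style)
```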

 Furthermore, although this embodiment uses a rectangular chart pattern as in FIG. 5, a circular chart pattern as in FIG. 14 may also be used.

 In FIG. 1, the influence of the windshield is calibrated using the stereo parallax image calibration unit 99 and the stereo parallax calibration data recording unit 98, but the calibration is not limited to this. For example, the calibration data of the stereo parallax calibration data recording unit 98 may be sent to the affine processing means 20a or the affine processing means 20b for calibration.

[Example 2]
 A calibration method of the image processing device 10 according to Example 2 will be described with reference to FIG. 15. The calibration method of Example 1 uses a calculation method similar to those of Prior Art 1 and Prior Art 2, but the invention is not limited to this. Example 2 describes a calibration method that takes into account the positional deviation of the rays caused by windshield refraction when the chart is placed nearby.

 FIG. 15 shows a configuration example of the image processing device 10 including the flow for estimating the influence of the windshield. As in Example 1, this example uses the charts shown in FIGS. 4, 12, and 13. As shown in FIG. 15, in this example the Y-shift image generation unit 201 obtains an image of the vertical shift (Y shift) between the two images. The ΔY-shift image generation unit 80 then computes the difference (ΔY-shift image) between this Y-shift image and the Y-shift image under reference conditions, obtained by calculation or from an actual device. The Δ-parallax-shift calculation processing unit 85 generates a Δ-parallax-shift image by multiplying the ΔY-shift image by a predetermined coefficient. Finally, the parallax-shift calculation processing unit 90 adds the Δ-parallax-shift image to the parallax-shift image under reference conditions to obtain the parallax shift caused by the ray-tilt deviation due to windshield refraction. Here, the Δ parallax shift and the ΔY shift are defined relative to the parallax shift and Y shift under reference conditions, such as design values or product center values. The Δ values are used to accommodate variation in the radius of curvature of the windshield, the mounting position of the camera relative to the windshield, deviation in windshield thickness, and the like.
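The FIG. 15 flow can be sketched as follows; the proportionality coefficient and the shift maps are hypothetical placeholders for the predetermined coefficient and reference-condition data described above, used only to show the order of operations (measured Y shift minus reference Y shift, scaled to a Δ parallax shift, then added to the reference parallax shift).

```python
def estimate_parallax_shift(y_shift_map, ref_y_shift_map, ref_parallax_shift_map, coeff):
    """Sketch of the FIG. 15 flow: the difference between the measured and
    reference Y-shift maps gives the delta-Y shift; a proportionality
    coefficient maps it to a delta parallax shift, which is added back onto
    the reference-condition parallax shift."""
    delta_y = [y - r for y, r in zip(y_shift_map, ref_y_shift_map)]
    delta_parallax = [coeff * dy for dy in delta_y]
    return [ref + dp for ref, dp in zip(ref_parallax_shift_map, delta_parallax)]

measured_y  = [0.30, 0.42, 0.55]   # Y shift measured from the chart (px), hypothetical
reference_y = [0.25, 0.35, 0.45]   # Y shift under reference conditions (px)
reference_p = [0.02, 0.05, 0.09]   # parallax shift under reference conditions (px)
k = 0.8                            # hypothetical delta-Y -> delta-parallax coefficient
out = estimate_parallax_shift(measured_y, reference_y, reference_p, k)
print([round(v, 3) for v in out])  # -> [0.06, 0.106, 0.17]
```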

 Next, the reason why the parallax shift can be calibrated in this example will be described. Because the windshield has radii of curvature in the horizontal and vertical directions and a rake angle, deviation factors such as variation in the radius of curvature, the mounting position of the camera relative to the windshield, and deviation in windshield thickness produce not only a horizontal change (Δ parallax shift) but also a vertical change (ΔY shift). Since the Δ parallax shift and the ΔY shift are correlated, detecting the ΔY shift makes it possible to obtain the Δ parallax shift, and the parallax shift can then be obtained from the Δ parallax shift and the parallax shift under reference conditions. In this example, the parallax-shift data obtained in this way is recorded in the stereo parallax calibration data recording unit 98 shown in FIG. 1, and calibration is performed by the stereo parallax image calibration unit 99.

 As described above, by taking into account the positional deviation of the rays caused by windshield refraction when the chart is placed nearby, the influence of the ray-tilt deviation due to windshield refraction can be estimated with high accuracy. Although this example uses the shift in the Y direction, it is not limited to this; it goes without saying that a similar effect is obtained with any method that calibrates the influence of the windshield using a chart. When the influence of variation factors such as the windshield and the camera mounting position is small, the calibration method of Example 1 can be used; when their influence is large, a calibration method such as that of this example is preferable.

[Summary]
 As described above, the stereo image processing device 10 of this embodiment includes: a stereo matching unit (matching unit 240) that performs stereo matching of a plurality of images captured by a plurality of cameras (right camera 50, left camera 60) that image a subject through a light-transmitting member (windshield 1) to obtain a parallax; and a calibration processing unit (stereo parallax image calibration unit 99) that obtains calibration parameters for calibrating at least the parallax shift caused by the light-transmitting member (windshield 1), based on the result of the stereo matching unit (matching unit 240) performing stereo matching of a plurality of images obtained by imaging a calibration chart with the plurality of cameras. The calibration chart includes a pattern, the pattern is a repeating pattern larger than the baseline length of the plurality of cameras (FIG. 11), and the calibration processing unit (stereo parallax image calibration unit 99) obtains the calibration parameters based on the result of stereo matching performed on the assumption that different patterns are the same point.

 The stereo image processing device 10 of this embodiment also includes: a stereo matching unit (matching unit 240) that performs stereo matching of a plurality of images captured by a plurality of cameras (right camera 50, left camera 60) that image a subject through a light-transmitting member (windshield 1) to obtain a parallax; and a calibration processing unit (stereo parallax image calibration unit 99) that obtains calibration parameters for calibrating at least the parallax shift caused by the light-transmitting member (windshield 1), based on the result of the stereo matching unit (matching unit 240) performing stereo matching of a plurality of images obtained by imaging a calibration chart with the plurality of cameras. The calibration chart includes a pattern whose length in the baseline direction of the plurality of cameras differs between at least a first area and a second area (in other words, the pattern has a first area and a second area with different lengths in the baseline direction of the plurality of cameras) (da2 > da1 in FIG. 4, FIG. 11, etc.), and the calibration processing unit (stereo parallax image calibration unit 99) obtains the calibration parameters based on the result of stereo matching performed in each of the first area and the second area of the calibration chart on the assumption that different patterns are the same point.

 Further, in the stereo image processing device 10 of this embodiment, when the first area is the area containing the position on the calibration chart where the perpendicular bisector of the plurality of cameras intersects the chart, on the plane containing the lens optical axes of the plurality of cameras (and the second area is an area whose position in the baseline direction of the plurality of cameras differs from that of the first area), the second area is longer in the baseline direction of the plurality of cameras than the first area (in other words, the length in the baseline direction of the plurality of cameras increases relative to the first area as the position in the baseline direction moves away from the first area) (da2 > da1 in FIG. 4, FIG. 11, etc.).

 In the stereo image processing device 10 of this embodiment, a third area of the calibration chart lies in the direction perpendicular to the baseline direction of the camera sensor surfaces relative to the first area of the calibration chart, and the lengths in the baseline direction of the multiple cameras differ between the first area and the third area (Figs. 12 and 13).

 The stereo image calibration method of this embodiment includes a stereo matching process (matching unit 240) that performs stereo matching of multiple images captured by multiple cameras (right camera 50, left camera 60) that image a subject through a light-transmitting member (windshield 1) to obtain parallax, and a calibration process (stereo parallax image calibration unit 99) that obtains calibration parameters for calibrating at least the parallax shift caused by the light-transmitting member (windshield 1), based on the result of performing, in the stereo matching process (matching unit 240), stereo matching of multiple images obtained by imaging a calibration chart with the multiple cameras. The calibration chart includes a pattern, and the pattern is a repeating pattern larger than the baseline length of the multiple cameras (Fig. 11). In the calibration process (stereo parallax image calibration unit 99), the calibration parameters are obtained based on the result of performing stereo matching on the assumption that different patterns are the same point.
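The repeating-pattern idea can be sketched as follows: under a pinhole model, a chart at a known distance yields a known expected disparity, and if the matcher pairs a point with a pattern instance k repeat periods away (different patterns treated as the same point), the expected disparity shifts by an amount that is still predictable from the chart geometry; the residual is the parallax shift to be calibrated. The function names and parameters below are illustrative, not from this application.

```python
def expected_disparity_px(f_px: float, baseline_m: float, z_m: float,
                          period_m: float = 0.0, k: int = 0) -> float:
    """Expected disparity in pixels for a calibration chart at distance z_m
    (pinhole model, focal length f_px in pixels).  If stereo matching pairs
    a point with a pattern instance k repeat periods (period_m, physical
    length on the chart) away, the expected disparity shifts by
    k * f_px * period_m / z_m, which remains predictable because the
    chart geometry is known."""
    return f_px * baseline_m / z_m - k * f_px * period_m / z_m


def disparity_calibration_offset(measured_px: float, f_px: float,
                                 baseline_m: float, z_m: float,
                                 period_m: float = 0.0, k: int = 0) -> float:
    """Measured minus expected disparity: the residual attributed (at least
    in part) to refraction through the light-transmitting member."""
    return measured_px - expected_disparity_px(f_px, baseline_m, z_m,
                                               period_m, k)
```

With f_px = 1000, a 0.35 m baseline, and a chart at 3.5 m, the expected disparity is 100 px; a measured 101.2 px would give a 1.2 px calibration offset.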

 The stereo image calibration method of this embodiment likewise includes a stereo matching process (matching unit 240) that performs stereo matching of multiple images captured by multiple cameras (right camera 50, left camera 60) that image a subject through a light-transmitting member (windshield 1) to obtain parallax, and a calibration process (stereo parallax image calibration unit 99) that obtains calibration parameters for calibrating at least the parallax shift caused by the light-transmitting member (windshield 1), based on the result of performing, in the stereo matching process (matching unit 240), stereo matching of multiple images obtained by imaging a calibration chart with the multiple cameras. The calibration chart includes a pattern, and the pattern differs in length in the baseline direction of the multiple cameras at least between a first area and a second area (in other words, the pattern has a first area and a second area whose lengths in the baseline direction of the multiple cameras differ) (da2 > da1 in Fig. 4, Fig. 11, etc.). In the calibration process (stereo parallax image calibration unit 99), the calibration parameters are obtained based on the result of performing stereo matching in each of the first area and the second area of the calibration chart on the assumption that different patterns are the same point.

 In other words, in the stereo image processing device 10 and stereo image calibration method of this embodiment, the calibration chart G20 varies its pattern size from the center outward, thereby suppressing the positional shift caused by refraction through a light-transmitting member such as a windshield.
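A minimal sketch of such a chart: stripe boundaries are generated from the center outward, with each stripe widened by a growth law so that its apparent size stays roughly constant toward the periphery. The growth law and all dimensions here are assumptions for illustration, not the actual chart geometry of this application.

```python
def stripe_edges(w0: float, z: float, half_extent: float) -> list[float]:
    """Stripe boundary positions (in chart units) from the chart center
    outward, for one half of a symmetric chart at distance z.

    Each stripe is widened as it moves off-axis (width grows with
    1 + (x/z)**2), so the pattern size increases from the center toward
    the edge of the chart, mirroring da2 > da1 in the embodiment."""
    edges = [0.0]
    x = 0.0
    while x < half_extent:
        x += w0 * (1.0 + (x / z) ** 2)  # off-axis stripes get wider
        edges.append(x)
    return edges
```

A full chart would mirror these edges about the center; the same idea can be repeated in the perpendicular direction for a third area with a different length.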

 The stereo image processing device 10 and stereo image calibration method according to this embodiment thus make it possible to calibrate a wide-angle camera with high accuracy without enlarging the chart.

 Although the embodiments of the present invention have been described in detail above with reference to the drawings, the specific configuration is not limited to these embodiments, and design changes and the like within a scope not departing from the gist of the present invention are also included in the present invention.

 The present invention is not limited to the embodiments described above and includes various other modifications. For example, the embodiments described above have been explained in detail in order to describe the present invention clearly, and the invention is not necessarily limited to one having all of the configurations described.

 Each of the configurations, functions, processing units, processing means, and the like described above may be realized partly or entirely in hardware, for example by designing them as integrated circuits. Each of the configurations, functions, and the like described above may also be realized in software by a processor interpreting and executing a program that implements each function. Information such as the programs, tables, and files that implement each function can be stored in a memory, a storage device such as a hard disk or SSD (Solid State Drive), or a recording medium such as an IC card, SD card, or DVD.

 The control lines and information lines shown are those considered necessary for the explanation, and not all control lines and information lines in a product are necessarily shown. In practice, almost all components may be considered to be interconnected.

1…Windshield
10…Stereo camera image processing device (stereo image processing device)
50…Right camera
60…Left camera
20a…Affine processing means
20b…Affine processing means
98…Stereo parallax calibration data recording unit
99…Stereo parallax image calibration unit (calibration processing unit)
200…Stereo parallax image generation unit
201…Y-shift image generation unit
400…Road surface cross-sectional shape estimation unit
500…Stereo vision three-dimensional object detection unit

Claims (8)

 1. A stereo image processing device comprising:
 a stereo matching unit that performs stereo matching of a plurality of images captured by a plurality of cameras that image a subject through a light-transmitting member to obtain parallax; and
 a calibration processing unit that obtains a calibration parameter for calibrating at least a parallax shift caused by the light-transmitting member, based on a result of the stereo matching unit performing stereo matching of a plurality of images obtained by imaging a calibration chart with the plurality of cameras,
 wherein the calibration chart includes a pattern,
 the pattern is a repeating pattern larger than a baseline length of the plurality of cameras, and
 the calibration processing unit obtains the calibration parameter based on a result of performing stereo matching on an assumption that different patterns are the same point.
 2. A stereo image processing device comprising:
 a stereo matching unit that performs stereo matching of a plurality of images captured by a plurality of cameras that image a subject through a light-transmitting member to obtain parallax; and
 a calibration processing unit that obtains a calibration parameter for calibrating at least a parallax shift caused by the light-transmitting member, based on a result of the stereo matching unit performing stereo matching of a plurality of images obtained by imaging a calibration chart with the plurality of cameras,
 wherein the calibration chart includes a pattern,
 the pattern differs in length in a baseline direction of the plurality of cameras at least between a first area and a second area, and
 the calibration processing unit obtains the calibration parameter based on a result of performing stereo matching in each of the first area and the second area of the calibration chart on an assumption that different patterns are the same point.
 3. The stereo image processing device according to claim 2, wherein,
 when the first area is defined as an area including a position on the calibration chart that lies on a plane including lens optical axes of the plurality of cameras and where a perpendicular bisector of the plurality of cameras intersects the calibration chart,
 the second area is longer in the baseline direction of the plurality of cameras than the first area.
 4. The stereo image processing device according to claim 2, wherein
 a third area of the calibration chart lies in a direction perpendicular to the baseline direction of sensor surfaces of the cameras relative to the first area of the calibration chart, and
 lengths in the baseline direction of the plurality of cameras differ between the first area and the third area.
 5. A stereo image calibration method comprising:
 a stereo matching process of performing stereo matching of a plurality of images captured by a plurality of cameras that image a subject through a light-transmitting member to obtain parallax; and
 a calibration process of obtaining a calibration parameter for calibrating at least a parallax shift caused by the light-transmitting member, based on a result of performing, in the stereo matching process, stereo matching of a plurality of images obtained by imaging a calibration chart with the plurality of cameras,
 wherein the calibration chart includes a pattern,
 the pattern is a repeating pattern larger than a baseline length of the plurality of cameras, and
 in the calibration process, the calibration parameter is obtained based on a result of performing stereo matching on an assumption that different patterns are the same point.
 6. A stereo image calibration method comprising:
 a stereo matching process of performing stereo matching of a plurality of images captured by a plurality of cameras that image a subject through a light-transmitting member to obtain parallax; and
 a calibration process of obtaining a calibration parameter for calibrating at least a parallax shift caused by the light-transmitting member, based on a result of performing, in the stereo matching process, stereo matching of a plurality of images obtained by imaging a calibration chart with the plurality of cameras,
 wherein the calibration chart includes a pattern,
 the pattern differs in length in a baseline direction of the plurality of cameras at least between a first area and a second area, and
 in the calibration process, the calibration parameter is obtained based on a result of performing stereo matching in each of the first area and the second area of the calibration chart on an assumption that different patterns are the same point.
 7. The stereo image calibration method according to claim 6, wherein,
 when the first area is defined as an area including a position on the calibration chart that lies on a plane including lens optical axes of the plurality of cameras and where a perpendicular bisector of the plurality of cameras intersects the calibration chart,
 the second area is longer in the baseline direction of the plurality of cameras than the first area.
 8. The stereo image calibration method according to claim 6, wherein
 a third area of the calibration chart lies in a direction perpendicular to the baseline direction of sensor surfaces of the cameras relative to the first area of the calibration chart, and
 lengths in the baseline direction of the plurality of cameras differ between the first area and the third area.
PCT/JP2024/024431 2023-12-15 2024-07-05 Stereo image processing device and stereo image calibration method WO2025126535A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2023-211807 2023-12-15
JP2023211807A JP2025095656A (en) 2023-12-15 2023-12-15 STEREO IMAGE PROCESSING APPARATUS AND STEREO IMAGE CALIBRATION METHOD

Publications (1)

Publication Number Publication Date
WO2025126535A1 true WO2025126535A1 (en) 2025-06-19

Family

ID=96057080

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2024/024431 WO2025126535A1 (en) 2023-12-15 2024-07-05 Stereo image processing device and stereo image calibration method

Country Status (2)

Country Link
JP (1) JP2025095656A (en)
WO (1) WO2025126535A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001092968A (en) * 1999-09-22 2001-04-06 Fuji Heavy Ind Ltd Stereo matching device, stereo matching method, and method of identifying corresponding point at infinity
JP2004132870A (en) * 2002-10-11 2004-04-30 Keiji Saneyoshi Stereo camera adjustment device and stereo camera adjustment method
JP2017062150A (en) * 2015-09-24 2017-03-30 マツダ株式会社 Device for calibrating stereo camera and method for calibrating stereo camera
JP2019090755A (en) * 2017-11-16 2019-06-13 京セラ株式会社 Calibration method and calibration device
WO2021024612A1 (en) * 2019-08-07 2021-02-11 日立オートモティブシステムズ株式会社 Method for correcting stereocamera and device for correcting stereocamera

Also Published As

Publication number Publication date
JP2025095656A (en) 2025-06-26

Similar Documents

Publication Publication Date Title
JP6427900B2 (en) Calibration method, calibration system, program, and moving object
JP5906272B2 (en) Stereo image processing apparatus for vehicle
US9348111B2 (en) Automatic detection of lens deviations
CN101539422B (en) Monocular vision real time distance measuring method
JP6602982B2 (en) In-vehicle camera, in-vehicle camera adjustment method, in-vehicle camera system
JP6317456B2 (en) Method and control device for detecting relative yaw angle changes in a vehicle stereo / video system
JP6209833B2 (en) Inspection tool, inspection method, stereo camera production method and system
JP6515650B2 (en) Calibration apparatus, distance measuring apparatus and calibration method
JP2001091245A (en) Stereo image processing device
JP2008039491A (en) Stereo image processing device
JP6209648B1 (en) Stereo camera installation parameter calibration method
JP6838225B2 (en) Stereo camera
US20180276844A1 (en) Position or orientation estimation apparatus, position or orientation estimation method, and driving assist device
KR101597163B1 (en) Method and camera apparatus for calibration of stereo camera
JP6186431B2 (en) Calibration apparatus, calibration system, and imaging apparatus
US20040071316A1 (en) Method for image recognition in motor vehicles
JP2023547515A (en) Method for correcting deviations of images and/or image points, camera-based systems and vehicles
JP6996582B2 (en) Calibration method, calibration equipment and program
WO2025126535A1 (en) Stereo image processing device and stereo image calibration method
JP6680335B2 (en) Stereo camera, vehicle, calculation method and program
JP7232005B2 (en) VEHICLE DRIVING ENVIRONMENT DETECTION DEVICE AND DRIVING CONTROL SYSTEM
US20250008070A1 (en) Imaging device and parallax deviation correction method
CN116718109A (en) Target capturing method based on binocular camera
JP7436400B2 (en) Stereo image processing device and image correction means
JP2016061567A (en) Measuring apparatus and method for adjusting the same