WO2019203001A1 - Stereo camera device - Google Patents

Stereo camera device Download PDF

Info

Publication number
WO2019203001A1
WO2019203001A1 (PCT/JP2019/014916; JP2019014916W)
Authority
WO
WIPO (PCT)
Prior art keywords
parallax
parallax offset
feature point
correction amount
offset correction
Prior art date
Application number
PCT/JP2019/014916
Other languages
English (en)
Japanese (ja)
Inventor
琢馬 大里
青木 利幸
裕史 大塚
永崎 健
Original Assignee
日立オートモティブシステムズ株式会社 (Hitachi Automotive Systems, Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日立オートモティブシステムズ株式会社 (Hitachi Automotive Systems, Ltd.)
Priority to CN201980025213.0A (CN111989541B)
Publication of WO2019203001A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 - Measuring distances in line of sight; optical rangefinders
    • G01C 3/02 - Details
    • G01C 3/06 - Use of electric means to obtain final indication

Definitions

  • The present invention relates to a stereo camera device.
  • Stereo camera systems are known as means for estimating the three-dimensional position of an object.
  • In a stereo camera system, the same object is imaged from a plurality of different viewpoints by cameras arranged at a plurality of positions, and the distance to the object is calculated based on the shift in appearance (so-called parallax) between the plurality of images obtained by the respective cameras.
  • Here, an offset error that occurs in the parallax, called the parallax offset, must be considered. The parallax offset changes over time due to distortions of devices such as the cameras caused by vibration and temperature changes, physical loads generated during screw tightening, and the like, so in order to calculate the distance to the object accurately, the parallax offset needs to be calibrated.
  • A calibration method is known in which the change in the distance between the vehicle and a stationary object is calculated based on the parallax obtained from images capturing the stationary object with a plurality of cameras, this distance change is compared with the movement amount of the moving body, and the parallax offset is calculated so that the distance change and the movement amount become equal.
  • With this method, however, the parallax offset can be calculated only when an object that can be determined to be stationary is imaged, so sufficient opportunities to calibrate the parallax offset cannot be ensured.
  • Patent Document 1 discloses a stereo camera device that detects the distance to an object, the device including: two cameras installed apart from each other by a base line length; and calculation means for calculating, from the images acquired by the two cameras, the distance to an object on the images. The calculation means includes: an image processing unit that searches for corresponding points in the images acquired by the two cameras and calculates two parallaxes from the difference in the position coordinates of the corresponding points on the imaging surface; offset value calculation means for calculating a parallax offset value over the whole image from the two parallaxes determined by the image processing unit at at least two times; and statistical processing means for determining the optimum value of the parallax offset to be used as a calibration parameter by statistically analyzing the distribution of the parallax offset values.
  • When the movement distance of the moving body is calculated using the parallax of feature points detected on images captured by a plurality of cameras, the opportunities to calibrate the parallax offset increase. However, if parallax is acquired by tracking feature points over the entire image, the processing load becomes very high, and considering the calculation cost, parallax cannot be acquired for a sufficient number of feature points. In addition, since feature points on an image cannot always be detected accurately, an incorrect parallax offset may be calculated by mistakenly tracking similar points on the image as the feature points.
  • The present invention has been made in view of the above, and an object of the present invention is to provide a stereo camera device that can reduce the calculation cost while ensuring the calculation accuracy of the parallax offset.
  • The present application includes a plurality of means for solving the above problems. One example is a stereo camera device that includes an imaging device having a plurality of imaging units arranged in a vehicle, and an image processing device that corrects the parallax offset between the plurality of imaging units. The image processing device includes: a parallax image generation unit that generates a parallax image based on a plurality of images respectively captured by the plurality of imaging units; a distance calculation unit that calculates the relative position of a target object with respect to the plurality of imaging units based on the parallax image; a feature point detection unit that detects feature points in the plurality of images respectively captured by the plurality of imaging units; a feature point tracking unit that tracks the feature points between images captured at different times by each of the plurality of imaging units; a tracking range determination unit that limits the tracking of the feature points by the feature point tracking unit to a tracking range that is a part of the images; and a parallax offset correction amount calculation unit that calculates a parallax offset correction amount for correcting the parallax offset between the plurality of imaging units based on the movement distance of the vehicle between the different times and the parallax of the feature points tracked by the feature point tracking unit.
  • FIG. 1 is a diagram showing a hardware configuration of the stereo camera device according to the present embodiment.
  • As shown in FIG. 1, a stereo camera device 100 includes: a stereo camera 110 (imaging device) having a pair of left and right cameras (imaging units), namely a left camera 111 and a right camera 112; an image processing device 120 including a memory 130, such as a read-only memory (ROM) and a random access memory (RAM) serving as storage devices, and a central processing unit (CPU) 140 serving as a processor; and an interface 150 that connects the stereo camera device 100 with an external device 160 such as an external sensor or another device. The stereo camera 110, the image processing device 120, and the interface 150 are connected by a bus line 100a, which is a communication path for exchanging information with one another.
  • The stereo camera device 100 is an in-vehicle stereo camera device mounted on a vehicle 1 such as an automobile (see FIG. 3 and others), and the external device 160 includes sensors for the accelerator opening (that is, the throttle opening), the brake operation amount (that is, the brake pedal operation amount), the steering angle, the vehicle speed, the acceleration, the temperature, the humidity, and the like, as well as an ECU (Electronic Control Unit) that controls various operations of the vehicle 1.
  • The various information acquired via the interface 150 may be information flowing on the CAN communication path (CAN bus), such as the engine speed, vehicle speed, wheel speed, steering angle, and throttle opening. For example, the ECU may calculate the vehicle speed from information obtained by a sensor that detects the rotational speed of the wheels, and place the result on the CAN communication path.
  • The image processing device 120 is provided with an element temperature sensor that detects the temperature of the elements disposed on its substrate, such as the memory 130 and the CPU 140. The memory 130 is a recording medium that stores the control program used for the various processes of the image processing device and various types of information. The CPU 140 performs predetermined arithmetic processing on the signals taken in via the interface 150 in accordance with the control program stored in the memory 130, detects three-dimensional objects and the road surface, and calculates the position of the object 2 (see FIG. 3), that is, the distance and direction to the object 2.
  • The calculation results of the CPU 140 are output to the external device 160 (the ECU or the like) via the interface 150 and used for the determination and control of various operations of the vehicle 1, such as the accelerator, the brake, and the steering. Although semiconductor memories such as ROM and RAM have been exemplified as the memory 130 (storage device) of the image processing device 120, any other storage device may be used, for example, a magnetic storage device such as a hard disk drive.
  • The plurality of (two in this embodiment) cameras of the stereo camera 110 (the left camera 111 and the right camera 112) are arranged at a predetermined optical axis interval (base line length) so that their optical axes are parallel to each other, and each is constituted by an image sensor such as a CCD or CMOS, an optical lens, and the like. The stereo camera 110 is an in-vehicle stereo camera, and the left camera 111 and the right camera 112 are arranged so as to image the area in front of the vehicle 1.
  • FIG. 2 is a functional block diagram showing extracted processing functions of the image processing apparatus according to the present embodiment together with related components.
  • The image processing device 120 includes: a parallax offset correction unit 121 that calculates and outputs a correction amount (parallax offset correction amount) for correcting the error in parallax between the left camera 111 and the right camera 112 of the stereo camera 110 (hereinafter referred to as the parallax offset); a parallax image generation unit 122 that generates a parallax image (an image with parallax as the pixel value) based on the parallax offset correction amount and the plurality of images respectively captured by the left camera 111 and the right camera 112; and a distance calculation unit 123 that calculates the relative position of the object 2 with respect to the stereo camera 110 (the distance and direction to the object 2) based on the parallax image generated by the parallax image generation unit 122.
  • The relative position of the object 2 (the distance and direction to the object 2) calculated by the distance calculation unit 123 is sent to the external device 160 via the interface 150 and is also stored in the memory 130 of the image processing device 120. Here, the parallax offset is an offset error that occurs in the parallax. The value of the parallax offset is initially determined by the installation positions of the left camera 111 and the right camera 112 constituting the stereo camera 110, but it changes over time as the device is distorted by vibration, temperature changes, the physical load generated during screw tightening, and the like. In the present embodiment, a parallax offset correction amount for appropriately correcting the parallax offset value is calculated in accordance with this temporal change of the parallax offset.
  • The parallax image generation unit 122 generates a parallax image by correcting the parallax obtained from the images of the left camera 111 and the right camera 112 using the parallax offset correction amount calculated by the parallax offset correction unit 121, thereby improving the accuracy of the parallax image. Because the accuracy of the parallax image is improved, errors when the distance calculation unit 123 calculates distances from the parallax image are suppressed, and the three-dimensional position of the object can be estimated accurately.
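  • As an illustrative sketch of this flow (not the patented implementation; the function names and the focal length, pixel pitch, and base line values below are assumptions), the parallax offset correction amount can be applied to a raw parallax image before the standard stereo distance relation is evaluated:

```python
import numpy as np

def correct_parallax(raw_parallax_px: np.ndarray, offset_correction_px: float) -> np.ndarray:
    """Apply the parallax offset correction amount to a raw parallax image."""
    return raw_parallax_px - offset_correction_px

def distance_mm(parallax_px: np.ndarray, f_mm: float, wi_mm_per_px: float,
                baseline_mm: float) -> np.ndarray:
    """Per-pixel distance from corrected parallax (standard stereo relation)."""
    return (baseline_mm * f_mm) / (wi_mm_per_px * parallax_px)

# Example with hypothetical camera constants: correct the parallax, then range.
raw = np.array([[10.2, 5.1]])                       # raw parallax [px]
z = distance_mm(correct_parallax(raw, 0.2),
                f_mm=6.0, wi_mm_per_px=0.00375, baseline_mm=350.0)
```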
  • Next, the basic principle of the process of calculating the parallax offset correction amount in the present embodiment (hereinafter referred to as the parallax offset correction amount calculation process) will be described with reference to FIG. 3.
  • FIG. 3 is a diagram for explaining the basic principle of the parallax offset correction amount calculation processing.
  • Let L1 be the distance between the object 2 (a stationary object) and the position s1 at time t1 of the vehicle 1 on which the stereo camera device 100 of the present embodiment is mounted, and let L2 be the distance between the object and the position s2 of the vehicle 1 at time t2. The distance between the position s1 of the vehicle 1 at time t1 and the position s2 of the vehicle 1 at time t2 is the movement amount dz. Since the relative position (attachment position) of the stereo camera device 100 with respect to the vehicle 1 is known, the distance from the stereo camera 110 to the object 2 and the distance from the vehicle 1 to the object 2 can be regarded as substantially the same.
  • The distance Z [mm] from the stereo camera 110 (or the vehicle 1) to the object is represented by the following (Formula 1), where f [mm] is the focal length of the left camera 111 and the right camera 112, wi [mm/px] is the pixel pitch, B [mm] is the distance between the optical axes of the left camera 111 and the right camera 112 (that is, the base line length), d [px] is the parallax, and ε [px] is the parallax offset.
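  • The equation itself is not reproduced in this publication text. Under the variable definitions above, (Formula 1) is presumably the standard stereo ranging relation with the parallax offset removed from the measured parallax (a reconstruction, not a verbatim quote):

    Z = \frac{B f}{w_i\,(d - \varepsilon)} \qquad \text{(Formula 1, reconstructed)}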
  • Here, the stereo camera device 100 mounted on a moving body such as the vehicle 1 calculates the distance from the moving vehicle 1 to the stationary object 2 based on the parallax of the object. The change in this distance over a predetermined time, that is, the movement distance of the vehicle 1 derived from the parallax of the object, can be compared with the movement amount of the vehicle 1 (the movement distance of the vehicle 1 obtained from information other than the parallax). The parallax offset of the stereo camera 110 can therefore be estimated by selecting the parallax offset so that the change in the distance to the object and the movement amount of the vehicle 1 become equal.
  • By calculating the parallax offset correction amount so as to cancel the estimated parallax offset, the accuracy of the parallax image generated by the parallax image generation unit 122 is improved, and the occurrence of errors when the distance calculation unit 123 calculates distances can be suppressed.
  • When the parallaxes at a plurality of (two in the present embodiment) different times (time t1 and time t2) shown in FIG. 3 are d1 [px] and d2 [px], respectively, the change in the distance from the vehicle 1 to the object 2 calculated from the parallax and the movement amount of the vehicle 1 satisfy the following (Equation 2).
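  • The equation is likewise not reproduced here. Combining the reconstructed (Formula 1) with the geometry of FIG. 3 (distances L1 and L2 to the stationary object and movement amount dz), a plausible form of (Equation 2) is:

    \frac{B f}{w_i\,(d_1 - \varepsilon)} - \frac{B f}{w_i\,(d_2 - \varepsilon)} = dz \qquad \text{(Equation 2, reconstructed)}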
  • FIG. 4 is a functional block diagram showing processing functions of the parallax offset correction unit.
  • As shown in FIG. 4, the parallax offset correction unit 121 includes: a feature point detection unit 220 that detects feature points in the plurality of images captured by the left camera 111 and the right camera 112 of the stereo camera 110 (here, the two images captured by the left camera 111 and the right camera 112 at the same time); a feature point tracking unit 240 that tracks the feature points between the plurality of images captured at different times by each of the left camera 111 and the right camera 112; a tracking range determination unit 230 that limits the tracking to a part of the images; a feature point detection region determination unit 210 that limits the area in which the feature point detection unit 220 detects feature points to a part of the image (that is, determines the feature point detection region); and a parallax offset correction amount calculation unit 250 that calculates the parallax offset correction amount.
  • FIG. 5 is a functional block diagram showing the processing function of the feature point detection area determination unit.
  • As shown in FIG. 5, the feature point detection area determination unit 210 includes, for each of the plurality of images captured by the left camera 111 and the right camera 112 of the stereo camera 110: a parallax non-calculated region exclusion unit 211 that excludes areas where parallax is not obtained; a parallax unstable region exclusion unit 212 that excludes areas with low parallax accuracy; and a specific object region exclusion unit 213 that excludes predetermined specific object areas.
  • In this way, the feature point detection area determination unit 210 excludes areas where feature points should not be detected; in other words, it restricts the area in which feature points are detected (the feature point detection area) in order to reduce processing time and improve accuracy.
  • In the present embodiment, the value of the parallax offset is estimated using the time-series parallax change of three-dimensionally stationary regions. It is therefore unnecessary to detect feature points in areas where there is no texture and parallax cannot be calculated, in areas with specific patterns where parallax accuracy is low, or in areas where moving objects are imaged. By determining the remaining region as the region for detecting feature points (the feature point detection region), a reduction in processing time and an improvement in accuracy can be expected.
  • FIG. 6 to 9 are diagrams for explaining an example of an area in the image to be excluded by the feature point detection area determination unit.
  • FIG. 6 is a diagram illustrating an example of an image handled by the feature point detection region determination unit.
  • FIG. 7 shows an example of a region to be excluded by the parallax non-calculated region exclusion unit in the image shown in FIG. 6, FIG. 8 shows an example of a region to be excluded by the parallax unstable region exclusion unit in the image shown in FIG. 6, and FIG. 9 shows an example of a region to be excluded by the specific object region exclusion unit in the image shown in FIG. 6.
  • As shown in FIG. 7, the parallax non-calculated region exclusion unit 211 excludes areas in the image 300 where parallax is not calculated (parallax non-calculated regions 311 to 313). The exclusion target areas exemplified by the regions 311 to 313 are areas in the image 300 where, for example, there is no texture and parallax cannot be calculated. As such areas to be excluded, the region 311 in which the sky is captured in the image 300 and the regions 312 and 313 in which roads and the like are captured are conceivable.
  • As a method of selecting the regions where parallax cannot be calculated, for example, regions where parallax was not obtained in a parallax image generated at a past time point may be selected. Alternatively, using a numerical value such as the matching cost, which indicates how well the left and right images match when the parallax is calculated, points (pixels) with a low degree of matching may be excluded even in cases where a parallax value was obtained.
  • As shown in FIG. 8, the parallax unstable region exclusion unit 212 excludes areas in the image 300 where the parallax accuracy may be low (parallax unstable region 321). The exclusion target exemplified by the region 321 is an area with a fine repetitive pattern, such as a fence, in the image 300, for which an erroneous parallax may be calculated in the process of generating the parallax image.
  • As shown in FIG. 9, the specific object region exclusion unit 213 excludes areas in the image 300 where specific objects are imaged (specific object regions 331 to 333). The specific objects here are objects that are highly likely not to be stationary; as exclusion target areas, the regions 331 and 332, in which vehicles that are likely to move are imaged in the image 300, and the region 333, in which a pedestrian is imaged, are conceivable.
  • The extraction of the regions 331 to 333, that is, the extraction of specific objects such as vehicles and pedestrians, can be performed by a method such as pattern matching, which makes the determination based on the matching rate with patterns held in advance. Alternatively, an object that has been determined to be moving as a result of separate detection and tracking of specific objects may be excluded.
  • The above determinations need not be made on contiguous areas; the presence or absence and the accuracy of the parallax, the presence or absence of a specific object, and the like may also be determined with each single pixel regarded as a region, to decide whether it belongs to any of the parallax non-calculated regions 311 to 313, the parallax unstable region 321, or the specific object regions 331 to 333.
  • FIG. 5 illustrates a case where the processing of the feature point detection region determination unit 210 is performed in the order of the parallax non-calculated region exclusion unit 211, the parallax unstable region exclusion unit 212, and the specific object region exclusion unit 213; however, the present invention is not limited to this, and the feature point detection region determination unit 210 may be configured with these processing functions in a different order. A sketch of how the exclusion steps can be combined is shown below.
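  • As an illustration only (the function name, array layout, and cost threshold below are assumptions, not the patent's implementation), the three exclusion steps can be combined into a single boolean detection mask:

```python
import numpy as np

def feature_detection_mask(parallax_valid: np.ndarray,
                           matching_cost: np.ndarray,
                           object_boxes: list,
                           cost_threshold: float = 0.5) -> np.ndarray:
    """Combine the three exclusion steps into one boolean detection mask.

    parallax_valid : True where parallax was obtained (criterion of unit 211).
    matching_cost  : per-pixel left/right matching cost; a high cost means an
                     unreliable match and is excluded (criterion of unit 212).
    object_boxes   : (x0, y0, x1, y1) boxes of detected vehicles/pedestrians
                     (criterion of unit 213).
    """
    mask = parallax_valid & (matching_cost < cost_threshold)
    for x0, y0, x1, y1 in object_boxes:
        mask[y0:y1, x0:x1] = False   # exclude specific object regions
    return mask
```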
  • The feature point detection unit 220 detects one or more feature points within the feature point detection region determined by the feature point detection region determination unit 210 (that is, for example, in the image 300, the range other than the areas excluded as the parallax non-calculated regions 311 to 313, the parallax unstable region 321, and the specific object regions 331 to 333).
  • As the feature point detection method, for example, FAST (Features from Accelerated Segment Test) can be used. FAST quickly determines whether a certain point (pixel) has a luminance different from that of its surrounding pixels, and is often used as a method for finding characteristic points at high speed.
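  • OpenCV provides a FAST detector; a minimal usage sketch might look as follows (the input file name is hypothetical, and the mask is assumed to come from the exclusion steps above):

```python
import cv2
import numpy as np

gray = cv2.imread("left_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input image
# Convert a boolean detection mask (see the previous sketch) to the uint8
# mask format expected by OpenCV: nonzero pixels are searched, zeros skipped.
mask = (np.ones_like(gray) * 255).astype(np.uint8)

fast = cv2.FastFeatureDetector_create(threshold=25, nonmaxSuppression=True)
keypoints = fast.detect(gray, mask)
points = np.array([kp.pt for kp in keypoints], dtype=np.float32)  # (x, y) pixels
```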
  • FIG. 10 is a functional block diagram showing processing functions of the tracking range determination unit.
  • FIG. 11 and FIG. 12 are diagrams showing how feature points are tracked by the feature point tracking unit.
  • As shown in FIGS. 11 and 12, the feature point tracking unit 240 searches the plurality of feature points detected on the image 420 at a time t2 after a certain time t1 for the points corresponding to the one or more feature points captured in the image 410 at time t1 (here, the feature point p1 is illustrated), and tracks the feature point by associating the feature point p2 with the feature point p1. Note that, in the present embodiment, only the feature point p1 among the one or more feature points will be described as an example.
  • If corresponding feature points are associated incorrectly, the parallax offset correction unit 121 will calculate an erroneous parallax offset; the corresponding feature points p1 and p2 therefore need to be tracked accurately between the images 410 and 420, whose imaging times t1 and t2 differ. To this end, the tracking range determination unit 230 limits the tracking range of the feature points tracked by the feature point tracking unit 240 to a part of the image (determines the tracking range).
  • As shown in FIG. 10, the tracking range determination unit 230 includes: a predicted imaging position calculation unit 231 that calculates the position (predicted imaging position p11) at which a certain feature point p1 on the image 410 captured at time t1 is predicted to be captured on the image 420 captured at time t2; and a predicted imaging region calculation unit 232 that calculates, as the search range, the range (predicted imaging region 411) in which the feature point p2 corresponding to the feature point p1 is predicted to be captured on the image 420 captured at time t2, based on the predicted imaging position p11 calculated by the predicted imaging position calculation unit 231.
  • The predicted imaging position calculation unit 231 calculates the position (predicted imaging position p11) at which a certain feature point p1 on the image 410 captured at time t1 is predicted to be captured on the image 420 captured at time t2. Since the feature point detection region determination unit 210 restricts feature point detection to three-dimensionally stationary objects, the change in the imaging position of a feature point between the image 410 (time t1) and the image 420 (time t2) can be attributed to the movement of the vehicle 1 (own vehicle).
  • When the three-dimensional position of a certain feature point p1 in the camera coordinate system (for example, a coordinate system based on the stereo camera device 100, with the origin set at the center of the stereo camera device 100, the y axis pointing forward, the x axis pointing rightward, and the z axis pointing upward) is (X1, Y1, Z1), its predicted imaging position (i2, j2) at time t2 in the image coordinate system (for example, with the origin at the lower left corner of the images 410 and 420) is expressed by the following (Formula 3) and (Formula 4). Here, f [mm] is the focal length, wi [mm/px] is the pixel pitch, and v [mm/s] is the vehicle speed of the vehicle 1 (own vehicle) obtained from the vehicle speed sensor included in the external device 160. In the present embodiment, for simplicity of explanation, the case where the vehicle 1 (own vehicle) is moving forward at a constant speed v is considered.
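  • (Formula 3) and (Formula 4) are not reproduced in this text either. Assuming a pinhole model with the y axis as the forward (depth) direction and the vehicle advancing v·(t2 − t1), a plausible reconstruction is:

    i_2 = \frac{f}{w_i}\cdot\frac{X_1}{Y_1 - v\,(t_2 - t_1)} \quad \text{(Formula 3, reconstructed)} \qquad j_2 = \frac{f}{w_i}\cdot\frac{Z_1}{Y_1 - v\,(t_2 - t_1)} \quad \text{(Formula 4, reconstructed)}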
  • The predicted imaging region calculation unit 232 calculates, as the search range, the range (predicted imaging region 411) in which the feature point p2 corresponding to the feature point p1 is predicted to be captured on the image 420 captured at time t2, based on the predicted imaging position p11 calculated by the predicted imaging position calculation unit 231. The search range calculated by the predicted imaging region calculation unit 232 is output from the tracking range determination unit 230 to the feature point tracking unit 240 as the tracking range.
  • This allows the feature point tracking unit 240 to perform tracking with high accuracy. Even if many feature points are detected in the image 420 captured at time t2, setting the tracking range excludes unnecessary feature points outside the range from tracking, without incurring calculation cost. Setting the tracking range is particularly effective for repetitive patterns such as stripes (for example, repetitive patterns that are not removal targets of the parallax unstable region exclusion unit 212); even when a plurality of similar feature points appear on the image, only the points that move like a stationary object can be selected and associated.
  • FIG. 13 and FIG. 14 are diagrams schematically illustrating an example of a method for calculating a predicted imaging region (search range) in the predicted imaging region calculation unit.
  • The predicted imaging region 411 is set, for example, as a range around the predicted imaging position p11 calculated by the predicted imaging position calculation unit 231 in the image 410 that takes into account the imaging position shifts caused by the parallax error and other errors. Since the parallax error occurs along the line segment that passes from the center of the image 410 through the predicted imaging position p11 (that is, in the radial direction from the center of the image 410), a range in which the parallax error can occur is set along this line segment, extending from the predicted imaging position p11 both toward the center of the image 410 and in the direction opposite to the center.
  • In addition, since pitching of the vehicle shifts the image in the vertical direction of the image 410, a range for the error that can be caused by pitching is set in the vertical direction around the predicted imaging position p11. A range that takes into account both the error caused by the parallax error and the error caused by pitching is then set as the predicted imaging region 411. In other words, the predicted imaging region 411 becomes an area bounded by two line segments extending from the center of the image 410 through and around the predicted imaging position p11 and two line segments extending in the vertical direction of the image 410. In this way, the predicted imaging region 411 can be set appropriately. The predicted imaging region 411 may also be set in consideration of the imaging position (i2, j2) at time t2. A sketch of this computation follows.
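  • A sketch of the two-step computation (coordinate conventions and margin values are assumptions, consistent with the reconstructed (Formula 3) and (Formula 4)):

```python
import numpy as np

def predicted_position(X1, Y1, Z1, f_mm, wi_mm_per_px, v_mm_s, dt_s):
    """Predicted pixel position of a stationary point after the vehicle
    advances v*dt (forward axis assumed to be y, per the reconstruction above)."""
    depth = Y1 - v_mm_s * dt_s
    scale = f_mm / (wi_mm_per_px * depth)
    return scale * X1, scale * Z1            # (i2, j2), relative to the image center

def tracking_range(i2, j2, cx, cy, radial_margin_px, pitch_margin_px):
    """Axis of the search region: along the ray from the image center through
    (i2, j2) for the parallax error, plus a vertical band for pitching."""
    ray = np.array([i2 - cx, j2 - cy], dtype=float)
    ray /= (np.linalg.norm(ray) + 1e-9)      # unit vector in the radial direction
    inner = (i2 - radial_margin_px * ray[0], j2 - radial_margin_px * ray[1])
    outer = (i2 + radial_margin_px * ray[0], j2 + radial_margin_px * ray[1])
    return inner, outer, pitch_margin_px     # ray endpoints plus vertical half-height
```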
  • The feature point tracking unit 240 searches, within the tracking range set by the tracking range determination unit 230, for the point corresponding to each of the one or more feature points detected by the feature point detection unit 220, between the images captured at different times. That is, the feature point tracking unit 240 calculates time-series parallax information for the feature points associated across the plurality of images captured at different times.
  • The parallax offset correction amount calculation unit 250 performs processing to calculate the parallax offset correction amount using the time-series parallax information of the feature points obtained by the feature point tracking unit 240 and the movement amount (movement distance) of the vehicle 1 (own vehicle). In the present embodiment, the case where the parallax offset correction amount is calculated using parallax information obtained at two different times has been described as an example; however, stability can be increased by using more time-series data. Note that when parallax information obtained at three or more different times is used, the equations for calculating the parallax offset become redundant and cannot be solved uniquely, so the parallax offset is calculated by an optimization calculation, as sketched below. It is also conceivable to calculate the parallax offset using information from a plurality of feature points.
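  • When parallaxes are available at three or more times together with the movement amounts between consecutive times, the single unknown parallax offset is over-determined; one way to handle this, consistent with the optimization mentioned above but not necessarily the patent's algorithm, is a bounded one-dimensional least-squares fit under the reconstructed (Equation 2):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def estimate_parallax_offset(d_px, dz_mm, B_mm, f_mm, wi_mm_per_px):
    """d_px: parallaxes of one tracked feature point at times t_0..t_n;
    dz_mm: vehicle movement between consecutive times (e.g. from the speed sensor)."""
    d = np.asarray(d_px, dtype=float)
    dz = np.asarray(dz_mm, dtype=float)

    def residual(eps):
        Z = (B_mm * f_mm) / (wi_mm_per_px * (d - eps))  # reconstructed (Formula 1)
        return np.sum((Z[:-1] - Z[1:] - dz) ** 2)       # reconstructed (Equation 2)

    # The search window for the offset [px] is an assumption.
    return minimize_scalar(residual, bounds=(-5.0, 5.0), method="bounded").x
```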
  • As described above, when the movement distance of the moving body is calculated using the parallax of feature points detected on images captured by a plurality of cameras in order to increase the calibration opportunities for the parallax offset, tracking feature points over the entire image makes the processing load very high, and considering the calculation cost, parallax cannot be acquired for a sufficient number of feature points. In addition, since feature points on an image cannot always be detected accurately, an incorrect parallax offset may be calculated by mistakenly tracking similar points on the image as the feature points.
  • In contrast, the present embodiment is configured as a stereo camera device 100 including the stereo camera 110 having the left camera 111 and the right camera 112 arranged in the vehicle 1, and the image processing device 120 having the parallax image generation unit 122 that generates a parallax image based on the images respectively captured by the left camera 111 and the right camera 112 and the distance calculation unit 123 that calculates the relative position of the target object based on the parallax image, in which, to correct the parallax offset between the left camera 111 and the right camera 112, the image processing device 120 includes: the feature point detection unit 220 that detects feature points in the images respectively captured by the left camera 111 and the right camera 112; the feature point tracking unit 240 that tracks the feature points between images captured at different times; the tracking range determination unit 230 that limits the tracking of the feature points by the feature point tracking unit 240 to a tracking range that is a part of the images; and the parallax offset correction amount calculation unit 250 that calculates the parallax offset correction amount for correcting the parallax offset between the left camera 111 and the right camera 112 based on the movement distance of the vehicle 1 between the different times and the parallax of the feature points tracked by the feature point tracking unit 240. Therefore, the calculation cost can be reduced while ensuring the calculation accuracy of the parallax offset.
  • That is, since the tracking range determination unit 230 limits the tracking of feature points by the feature point tracking unit 240 to a tracking range (search range) that is a part of the images, the possibility that a tracked feature point is erroneously associated with another feature point is suppressed, which ensures the calculation accuracy of the parallax offset, and limiting the tracking range to a part of the images reduces the amount of processing for the feature point correspondence determination, which reduces the calculation cost.
  • In addition, since the tracking range determination unit 230 determines the tracking range based on the parallax offset, which is considered to have a particularly large error, and on the pitching amount of the vehicle, the tracking range can be determined with higher accuracy, and the calculation accuracy of the parallax offset can be ensured. Furthermore, since the feature point detection region determination unit 210 limits the area in which the feature point detection unit 220 detects feature points to a part of the image (determines the feature point detection region), regions where feature points should not be detected are eliminated, which shortens the processing time and improves accuracy.
  • In addition, since the feature point detection region determination unit 210 is configured so that the parallax non-calculated region exclusion unit 211 excludes, from each of the plurality of images captured by the left camera 111 and the right camera 112, regions where parallax is not obtained, areas without texture where parallax cannot be calculated are eliminated, shortening processing time and improving accuracy. Since the parallax unstable region exclusion unit 212 excludes regions with low parallax accuracy from each of the plurality of images, specific pattern regions with low parallax accuracy are eliminated, again shortening processing time and improving accuracy. And since the specific object region exclusion unit 213 excludes predetermined specific object regions from each of the plurality of images, areas in which moving objects are imaged are eliminated, likewise shortening processing time and improving accuracy.
  • In the second embodiment, a case is shown in which a parallax offset correction amount calculation determination unit 610 is provided that determines whether to perform the calculation of the parallax offset correction amount (the parallax offset correction amount calculation process) based on whether the parallax offset correction amount can be calculated accurately.
  • To calculate the parallax offset correction amount accurately, it is important to satisfy two conditions: the movement amount of the vehicle and the parallax, which are the parameters used in the calculation, must be obtained with high accuracy, and only the parallax of points that are stationary as assumed must be used.
  • In scenes where these conditions cannot be satisfied, the parallax offset correction amount calculation determination unit 610 simply determines that the parallax offset correction amount calculation process is not to be performed, thereby ensuring the accuracy of the parallax offset correction amount.
  • FIG. 15 is a functional block diagram showing processing functions of the parallax offset correction unit according to the present embodiment.
  • As shown in FIG. 15, the parallax offset correction unit 121A of the present embodiment includes, in addition to the configuration of the first embodiment, a parallax offset correction amount calculation determination unit 610 that determines whether to calculate the parallax offset based on predetermined conditions; the determination result of the parallax offset correction amount calculation determination unit 610 is passed to the feature point detection region determination unit 210, which limits the region in which the feature point detection unit 220 detects feature points to a part of the image (that is, determines the feature point detection region).
  • FIG. 16 is a functional block diagram illustrating the processing functions of the parallax offset correction amount calculation determination unit.
  • As shown in FIG. 16, the parallax offset correction amount calculation determination unit 610 includes: a speed determination unit 611 that determines whether to calculate the parallax offset correction amount based on the speed of the vehicle 1 (own vehicle); an acceleration determination unit 612 that determines whether to calculate the parallax offset correction amount based on the acceleration of the vehicle 1 (own vehicle); a straight-ahead determination unit 613 that determines whether to calculate the parallax offset correction amount based on whether the vehicle 1 (own vehicle) is traveling straight; and a scene determination unit 614 that determines whether to calculate the parallax offset correction amount based on whether the scene is one in which the parallax offset correction amount may be calculated. When it is determined that the parallax offset correction amount calculation process should not be executed, that determination result is output from the parallax offset correction amount calculation determination unit 610 to the feature point detection region determination unit 210, and the series of parallax offset calculation processes beginning with the processing of the feature point detection region determination unit 210 is not executed.
  • The speed determination unit 611 determines whether to calculate the parallax offset correction amount according to the speed of the vehicle 1 (own vehicle). In general, a speed sensor attached to a vehicle is known to have a large error in the low speed range, so in the low speed range the error in the movement amount of the vehicle 1 (own vehicle) calculated using the detection value of the vehicle speed sensor included in the external device 160 is considered to become large. Therefore, when, for example, the detection value of the vehicle speed sensor included in the external device 160 is 20 km/h or less, the speed determination unit 611 determines that the error occurring in the movement amount of the vehicle 1 (own vehicle) exceeds the allowable range and that the parallax offset correction amount calculation process is not to be executed.
  • The acceleration determination unit 612 determines whether to calculate the parallax offset correction amount according to the acceleration of the vehicle 1 (own vehicle). Since the values obtained from the various sensors included in the external device 160 are subject to a time delay between detection and reception by the image processing device 120, the imaging time of the stereo camera 110 and the speed may be incorrectly associated during acceleration or deceleration. Therefore, when the detection value of the acceleration sensor included in the external device 160 is equal to or greater than a predetermined threshold, that is, when acceleration or deceleration exceeding the allowable range is occurring, the acceleration determination unit 612 determines that the possibility of incorrectly associating the imaging time and the speed exceeds the allowable range and that the parallax offset correction amount calculation process is not to be executed. The acceleration determination unit 612 also makes it possible to avoid executing the parallax offset correction amount calculation process when the vehicle 1 is in a slip state. Note that the acceleration may instead be determined by observing the change per unit time (difference value) of the detection value of the speed sensor, or from the output of a gyro sensor included in the external device 160.
  • The straight traveling determination unit 613 determines whether to calculate the parallax offset correction amount based on whether the vehicle 1 (own vehicle) is traveling straight. When the vehicle 1 travels on a curved road, the error in the estimated movement amount of the vehicle 1 increases, so the straight traveling determination unit 613 determines that the parallax offset correction amount calculation process is not to be executed unless it can be determined that the vehicle 1 is traveling straight. Whether the vehicle 1 is traveling straight (or traveling on a curve) can be determined from GPS information and map data from a navigation system or the like included in the external device 160, and from the detection values of vehicle sensors that detect the steering angle, the yaw rate, and the like. That is, for example, when the detection value of the steering angle sensor included in the external device 160 is outside a predetermined range, it is determined that the parallax offset correction amount calculation process is not to be executed.
  • The scene determination unit 614 determines whether to calculate the parallax offset correction amount based on whether the scene is one in which the parallax offset correction amount may be calculated. For example, when the vehicle 1 travels on a congested road, objects moving at low speed are imaged in most of the field of view of the stereo camera 110, making it difficult to distinguish stationary objects from moving objects; detecting feature points on such objects increases the possibility of calculating an incorrect parallax offset correction value. Therefore, the scene determination unit 614 obtains, for example, information on whether the vehicle is traveling on a congested road from a navigation system or the like included in the external device 160, and, if the vehicle is traveling on a congested road, determines that the scene is not one in which the parallax offset correction amount may be calculated and that the parallax offset correction amount calculation process is not to be performed. For the same reason, when traveling on a busy road such as a shopping street, it is likewise determined that the scene is not one in which the parallax offset correction amount may be calculated and that the parallax offset correction amount calculation process is not to be performed.
  • FIG. 17 is a diagram showing a list of conditions when it is determined that the parallax offset correction amount calculation determination unit does not execute the parallax offset correction amount calculation processing.
  • In this way, since the parallax offset correction amount calculation determination unit 610 determines whether to execute the parallax offset correction amount calculation process based on the input values from the various sensors included in the external device 160 and on information from the navigation system, the processing can be completed quickly, and calculation resources can be conserved in scenes where the parallax offset correction amount calculation process should not be executed.
  • The determination processes in the speed determination unit 611, the acceleration determination unit 612, the straight travel determination unit 613, and the scene determination unit 614 can be performed independently of one another, and only some of the determination units 611 to 614 may be used as necessary. For example, if the vehicle speed can be obtained with high accuracy even in the low speed range, the speed determination unit 611 can be omitted, and if the movement amount during curve traveling can be modeled with high accuracy, the straight travel determination unit 613 can be omitted. That is, the parallax offset correction amount calculation determination unit 610 may be configured to perform only the determinations required by the components at the time of implementation; a sketch of the combined gating logic follows.
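  • A sketch of the combined gating logic of the determination units 611 to 614 (only the 20 km/h speed threshold appears in the text; the other thresholds are illustrative assumptions):

```python
def should_skip_calculation(speed_kmh: float, accel_ms2: float,
                            steering_deg: float, congested: bool) -> bool:
    """True when the parallax offset correction amount calculation process
    should NOT be executed (determination units 611 to 614)."""
    if speed_kmh <= 20.0:         # unit 611: low-speed sensor error (20 km/h from the text)
        return True
    if abs(accel_ms2) >= 2.0:     # unit 612: acceleration threshold, an assumption
        return True
    if abs(steering_deg) >= 5.0:  # unit 613: straight-travel range, an assumption
        return True
    if congested:                 # unit 614: congestion info from the navigation system
        return True
    return False
```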
  • According to the present embodiment configured as described above, the parallax offset correction amount calculation process is not executed in situations unsuitable for calculating the parallax offset correction amount; processing such as feature point detection need not be executed in a state where no suitable data exist, so calculation resources can be allocated to other processing. In addition, the possibility of calculating an incorrect parallax offset correction amount can be reduced.
  • The third embodiment provides, instead of the parallax offset correction amount calculation determination unit 610 of the parallax offset correction unit 121A of the second embodiment, which determines when the parallax offset correction amount calculation process is not to be executed, a parallax offset correction amount calculation determination unit 610A that determines when the parallax offset correction amount calculation process should be executed. That is, in the present embodiment, when the parallax offset correction amount calculation determination unit 610A determines that the parallax offset correction amount calculation process should be executed, the parallax offset correction unit 121A executes the parallax offset calculation process.
  • This prevents the case where the parallax offset correction amount calculation process is not performed even though the parallax offset correction amount should be updated, so the parallax offset correction amount can be updated at the proper timing.
  • FIG. 18 is a functional block diagram illustrating processing functions of the parallax offset correction amount calculation determination unit according to the present embodiment.
  • As shown in FIG. 18, the parallax offset correction amount calculation determination unit 610A according to the present embodiment includes: an environment change determination unit 615 that determines whether to calculate the parallax offset correction amount based on changes in the surrounding environment of the stereo camera 110; and an elapsed time determination unit 616 that determines whether to calculate the parallax offset correction amount based on the elapsed time since the previous execution of the parallax offset correction amount calculation process.
  • When the parallax offset correction amount calculation determination unit 610A determines that the parallax offset correction amount calculation process should be performed, the determination result is output to the feature point detection region determination unit 210, and the series of parallax offset calculation processes beginning with the processing of the feature point detection region determination unit 210 is executed.
  • The environment change determination unit 615 determines whether to calculate the parallax offset correction amount based on whether the surrounding environment of the stereo camera 110 has changed significantly compared with the environment at the time of the previous parallax offset correction amount calculation process. The surrounding environment of the stereo camera 110 includes, for example, temperature and humidity. When it is considered highly likely that the parallax offset has changed due to deformation of the housing caused by changes in temperature or humidity, the parallax offset needs to be corrected urgently. Therefore, when the change in the detection value of the temperature sensor or the humidity sensor included in the external device 160 since the previous execution of the parallax offset correction amount calculation process is larger than a predetermined threshold, the environment change determination unit 615 determines that the surrounding environment of the stereo camera 110 has changed significantly and that the parallax offset correction amount calculation process should be executed.
  • The elapsed time determination unit 616 determines whether to calculate the parallax offset correction amount based on the elapsed time since the previous execution of the parallax offset correction amount calculation process. The parallax offset of the stereo camera 110 may change due to factors such as aging. Therefore, if the elapsed time since the previous execution of the parallax offset correction amount calculation process is equal to or greater than a predetermined threshold, the elapsed time determination unit 616 determines that there is a high possibility that the parallax offset has changed and that the parallax offset correction amount calculation process should be executed. When dealing with aging degradation, for example, a period of several months may be set as the threshold; when dealing with impacts such as sudden braking or a person's hand striking the camera, setting a period of several days to several weeks as the threshold can prevent the vehicle from continuing to travel with a shifted parallax offset. A sketch of this determination follows.
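  • The third embodiment's determination can be sketched symmetrically (all threshold values are illustrative assumptions in line with the examples above):

```python
def should_run_calculation(temp_change_c: float, humidity_change_pct: float,
                           days_since_last_run: float) -> bool:
    """True when the parallax offset correction amount calculation process
    SHOULD be executed (determination units 615 and 616)."""
    if temp_change_c >= 10.0 or humidity_change_pct >= 20.0:  # unit 615, assumed thresholds
        return True
    if days_since_last_run >= 14.0:                           # unit 616, assumed threshold
        return True
    return False
```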
  • 100 ... stereo camera device, 100a ... bus line, 110 ... stereo camera, 111 ... left camera, 112 ... right camera, 120 ... image processing device, 121, 121A ... parallax offset correction unit, 122 ... parallax image generation unit, 123 ... distance calculation unit, 130 ... memory, 140 ... CPU, 150 ... interface, 160 ... external device, 210 ... feature point detection region determination unit, 211 ... parallax non-calculated region exclusion unit, 212 ... parallax unstable region exclusion unit, 213 ... specific object region exclusion unit, 220 ... feature point detection unit, 230 ... tracking range determination unit, 231 ... predicted imaging position calculation unit, 232 ... predicted imaging region calculation unit, 240 ... feature point tracking unit, 250 ... parallax offset correction amount calculation unit, 300, 410, 420 ... image, 311 to 313 ... parallax non-calculated region, 321 ... parallax unstable region, 331 to 333 ... specific object region, 411 ... predicted imaging region, 610, 610A ... parallax offset correction amount calculation determination unit, 611 ... speed determination unit, 612 ... acceleration determination unit, 613 ... straight travel determination unit, 614 ... scene determination unit, 615 ... environment change determination unit, 616 ... elapsed time determination unit

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Measurement Of Optical Distance (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a stereo camera device that can reduce calculation costs while maintaining accuracy when calculating the parallax offset. The present invention comprises: a feature point detection unit (220) that detects a feature point in a plurality of images captured respectively by a left camera (111) and a right camera (112); a feature point tracking unit (240) that tracks the feature point among the plurality of images captured at different times by the left camera (111) and the right camera (112); a tracking range determination unit (230) that limits the tracking of the feature point in the plurality of images by the feature point tracking unit (240) to a tracking range consisting of a part of the plurality of images; and a parallax offset correction amount calculation unit (250) that calculates a parallax offset correction amount for correcting the parallax offset between the left camera (111) and the right camera (112) on the basis of the distance traveled by a vehicle (1) between the different times and the parallax of the feature point tracked by the feature point tracking unit (240).
PCT/JP2019/014916 2018-04-18 2019-04-04 Stereo camera device WO2019203001A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201980025213.0A CN111989541B (zh) 2018-04-18 2019-04-04 立体摄像机装置 (Stereo camera device)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-080074 2018-04-18
JP2018080074A JP7118717B2 (ja) 2018-04-18 画像処理装置およびステレオカメラ装置 (Image processing device and stereo camera device)

Publications (1)

Publication Number Publication Date
WO2019203001A1 (fr)

Family

ID=68239556

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/014916 WO2019203001A1 (fr) 2018-04-18 2019-04-04 Stereo camera device

Country Status (3)

Country Link
JP (1) JP7118717B2 (fr)
CN (1) CN111989541B (fr)
WO (1) WO2019203001A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7493433B2 (ja) 2020-10-28 2024-05-31 日立Astemo株式会社 移動量算出装置 (Movement amount calculation device)
CN115248017B (zh) * 2022-09-21 2022-11-29 江苏新恒基特种装备股份有限公司 一种支管管件几何尺寸快速检测方法及检测系统 (Method and system for rapidly detecting the geometric dimensions of branch pipe fittings)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01188909A (ja) * 1988-01-22 1989-07-28 Toyota Autom Loom Works Ltd 画像式無人車の走行経路決定処理方法 (Travel route determination processing method for an image-based unmanned vehicle)
JP2001169310A (ja) * 1999-12-06 2001-06-22 Honda Motor Co Ltd 距離検出装置 (Distance detection device)
JP2003329439A (ja) * 2002-05-15 2003-11-19 Honda Motor Co Ltd 距離検出装置 (Distance detection device)
JP2005202761A (ja) * 2004-01-16 2005-07-28 Toyota Motor Corp 車両周辺監視装置 (Vehicle periphery monitoring device)
JP2007336228A (ja) * 2006-06-14 2007-12-27 Canon Inc 撮像装置及びその制御方法及びプログラム及び記憶媒体 (Imaging device, control method therefor, program, and storage medium)
JP2009110173A (ja) * 2007-10-29 2009-05-21 Fuji Heavy Ind Ltd 物体検出装置 (Object detection device)
WO2015049717A1 (fr) * 2013-10-01 2015-04-09 株式会社日立製作所 Device for estimating the position of a mobile body and method for estimating the position of a mobile body

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015015542A1 (fr) * 2013-07-29 2015-02-05 株式会社日立製作所 Vehicle-mounted stereo camera system and method for calibrating the same

Also Published As

Publication number Publication date
JP2019190847A (ja) 2019-10-31
JP7118717B2 (ja) 2022-08-16
CN111989541B (zh) 2022-06-07
CN111989541A (zh) 2020-11-24

Similar Documents

Publication Publication Date Title
US9912933B2 (en) Road surface detection device and road surface detection system
CN110023951B (zh) 信息处理设备、成像设备、装置控制系统、信息处理方法和计算机可读记录介质
US10795370B2 (en) Travel assist apparatus
JP2007263669A (ja) 3次元座標取得装置
JP6602982B2 (ja) 車載カメラ、車載カメラの調整方法、車載カメラシステム
JP6936098B2 (ja) 対象物推定装置
JP6209648B1 (ja) ステレオカメラの設置パラメータ校正方法
JP6592991B2 (ja) 物体検出装置、物体検出方法及びプログラム
JP2018044880A (ja) 車両姿勢推定装置
JP2004108980A (ja) 画像処理方法
WO2019203001A1 (fr) Stereo camera device
JP6455164B2 (ja) 視差値導出装置、機器制御システム、移動体、ロボット、視差値導出方法、およびプログラム
JP6543935B2 (ja) 視差値導出装置、機器制御システム、移動体、ロボット、視差値導出方法、およびプログラム
JP6564127B2 (ja) 自動車用視覚システム及び視覚システムを制御する方法
JP2019212015A (ja) 車両データの時刻同期装置及び方法
JP5425500B2 (ja) キャリブレーション装置およびキャリブレーション方法
JP2018036225A (ja) 状態推定装置
EP3051494B1 (fr) Procédé pour déterminer une valeur de profondeur d'image en fonction d'une région d'image, système de caméra et véhicule à moteur
JP7064400B2 (ja) 物体検知装置
JP5193148B2 (ja) 車両用撮像装置
JP7269130B2 (ja) 画像処理装置
WO2020036039A1 (fr) Stereo camera device
JP2020042716A (ja) 異常検出装置および異常検出方法
JP2021018661A (ja) 車線データ生成装置、位置特定装置、車線データ生成方法および位置特定方法
JP7131327B2 (ja) 物標検知装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19788446; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 19788446; Country of ref document: EP; Kind code of ref document: A1)