WO2021176819A1 - Processing Device - Google Patents

Processing Device

Info

Publication number
WO2021176819A1
Authority
WO
WIPO (PCT)
Prior art keywords
parallax
image
control target
value
pair
Prior art date
Application number
PCT/JP2020/048694
Other languages
English (en)
Japanese (ja)
Inventor
真穂 堀永
直也 多田
裕史 大塚
Original Assignee
日立Astemo株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日立Astemo株式会社 filed Critical 日立Astemo株式会社
Priority to DE112020005059.9T (DE112020005059T5)
Priority to JP2022504989A (JP7250211B2)
Publication of WO2021176819A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02 Details
    • G01C 3/06 Use of electric means to obtain final indication
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/55 Depth or shape recovery from multiple images
    • G06T 7/593 Depth or shape recovery from multiple images from stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • G06T 2207/10021 Stereoscopic video; Stereoscopic image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20076 Probabilistic image processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T 2207/30256 Lane; Road marking

Definitions

  • The present invention relates to, for example, a processing device that processes a captured image captured by a stereo camera mounted on a vehicle.
  • Patent Document 1 discloses a technique for estimating which part of a road marking is imaged from a plurality of images captured as the vehicle moves.
  • When a relative vertical deviation occurs between the optical axis of the left camera and the optical axis of the right camera, and the stereo camera images road surface paint whose lines repeat obliquely toward the depth direction, such as a flow guide zone (zebra zone), the area on which the road surface paint is drawn may be erroneously detected as a three-dimensional object. If the area with the road surface paint is falsely detected as a three-dimensional object, a safety system provided by the stereo camera, such as AEB or ACC, may be activated and unnecessary warnings or brakes may be applied, giving the occupants a sense of discomfort.
  • The above-mentioned conventional method corrects the vertical deviation between the left and right cameras, and the correction process takes time. If the vehicle travels on a road surface with a flow guide zone before the correction of the vertical deviation is completed, the flow guide zone may be erroneously detected as a three-dimensional object. Therefore, during the vertical deviation correction process, a mechanism such as stopping the system provided by the stereo camera is required.
  • However, if the system is stopped, the safety system may not operate in situations where it is actually needed, which may lead to a decrease in safety performance.
  • An object of the present invention is to provide a processing device that differentiates three-dimensional objects from non-three-dimensional objects regardless of whether or not a vertical optical axis shift occurs between the left and right cameras of a stereo camera, and that determines whether or not an object the stereo camera recognizes as a three-dimensional object should be controlled.
  • The processing device of the present invention that solves the above problems is a processing device that processes a pair of captured images captured by a pair of cameras. It includes a first parallax image generation unit that generates a first parallax image from the pair of captured images, a control target candidate recognition unit that recognizes a control target candidate from the first parallax image, a first parallax value acquisition unit that acquires a first parallax value within the candidate region where the control target candidate exists, a second parallax image generation unit that generates a second parallax image by shifting the relative vertical positions of the pair of captured images, a second parallax value acquisition unit that acquires a second parallax value within the corresponding region of the second parallax image that corresponds to the candidate region of the first parallax image, and a control target determination unit that uses the first parallax value and the second parallax value to determine whether or not to recognize the control target candidate as a control target.
  • FIG. 1 is a block diagram showing the overall configuration of the imaging device in the first embodiment.
  • FIG. 2 is a flowchart illustrating the control target determination process of the imaging device in the first embodiment.
  • FIG. 3 is a diagram schematically showing the left and right captured images and the parallax image.
  • FIG. 4 is a diagram showing an example of the recognition area of a control target candidate.
  • FIG. 5 is a diagram showing the difference between distant parallax and near parallax in the left and right captured images.
  • FIG. 6 is a diagram explaining why the parallax error differs depending on the inclination angle of a line captured in the image.
  • FIG. 7 is a diagram explaining the parallax when a flow guide zone is imaged by a stereo camera whose left and right optical axes are relatively displaced in the vertical direction.
  • FIG. 8 is a diagram showing the relationship between the shift amount and the parallax average value.
  • FIG. 9 is a diagram illustrating an example of determining the necessity of a control target based on statistical results.
  • FIG. 3 is a diagram schematically showing the left and right captured images and the parallax image. FIG. 3(a) is a schematic view of a captured image of the area in front of the vehicle captured by the left camera, FIG. 3(b) is a schematic view of the captured image captured by the right camera at the same time as the left camera, and FIG. 3(c) is the parallax image created using the captured images of FIGS. 3(a) and 3(b). An intermediate region 322 is also shown in the parallax image.
  • The images captured by the left and right cameras capture a state in which a preceding vehicle 311 is traveling in front of the own vehicle.
  • The own vehicle is traveling in a traveling lane 312, and flow guide zones 313 are provided on both sides of the traveling lane 312.
  • The stereo camera detects a three-dimensional object from the parallax image as shown in FIG. 3(c) and recognizes it as a control target candidate.
  • In the absence of a three-dimensional object, the parallax gradually decreases from near to far, so in the parallax image it also gradually decreases from bottom to top in the vertical direction.
  • When there is a three-dimensional object, the parallax is the same near and far, so the parallax image shows the same value along the vertical direction. Therefore, in the parallax image, it is determined that a three-dimensional object exists in a region having the same parallax along the vertical direction.
  • In FIG. 3(c), the preceding vehicle 311, which actually is a three-dimensional object, is detected as the three-dimensional object 331, but a part of the flow guide zone 313, which is not actually a three-dimensional object, is also erroneously detected as the three-dimensional object 332.
  • FIG. 4 is a diagram showing an example of a recognition area of the control target candidate.
  • The candidate area recognized as the control target candidate is shown as a rectangular frame 402.
  • The position of the rectangular frame 402 indicating the candidate region is defined by the xy coordinates of the parallax image 401.
  • FIG. 5 is a diagram showing the difference between the distant parallax and the near parallax in the left and right captured images.
  • Two preceding vehicles 511 and 521 are imaged in both the image 501 captured by the left camera and the image 502 captured by the right camera. Due to the characteristics of the stereo camera, as shown in FIG. 5, the parallax with respect to the distant preceding vehicle 521 is small, and the parallax with respect to the nearby preceding vehicle 511 is large.
  • FIG. 6 is a diagram for explaining the reason why the parallax error differs depending on the inclination angle of the line captured in the image.
  • FIG. 6(a) is an image of a line extending straight along the traveling direction of the own vehicle, in which a vertical line 611 extending straight along the vertical direction of the image is imaged.
  • FIGS. 6(b) and 6(c) are images of lines extending diagonally with respect to the traveling direction of the own vehicle: in the image of FIG. 6(b), a diagonal line 612 with a large inclination with respect to the lateral direction of the image is imaged, and in the image of FIG. 6(c), a diagonal line 613 with a small inclination with respect to the lateral direction of the image is imaged.
  • For the vertical line 611 of the image 601 shown in FIG. 6(a), the position P1 when imaged by the left camera and the position P2 when imaged by the right camera coincide in the left-right direction, so the deviation error is zero.
  • For the diagonal line 612 of the image 602 shown in FIG. 6(b), an error ε1 occurs in the left-right direction between the position P3 when imaged by the left camera and the position P4 when imaged by the right camera.
  • The diagonal line 613 of the image 603 shown in FIG. 6(c) has a smaller inclination with respect to the lateral direction of the image than the diagonal line 612 of the image 602 shown in FIG. 6(b), so the resulting error ε2 is larger (ε1 < ε2). That is, when the optical axis of the left camera and the optical axis of the right camera are relatively displaced in the vertical direction, the error increases as the inclination of the imaged line with respect to the lateral direction of the image decreases.
  • FIG. 7 is a diagram for explaining the parallax when the flow guide zone is imaged by a stereo camera in which the optical axes of the left and right cameras are relatively displaced in the vertical direction.
  • In FIG. 7, the white lines of the flow guide zone 702 are imaged such that their inclination is large in the vicinity and gradually decreases with distance. Therefore, when the optical axes of the left and right cameras are relatively displaced in the vertical direction, the left-right parallax errors 721 to 723 of the white lines gradually increase from near to far, as described with reference to FIG. 6.
  • On the other hand, the correct parallaxes 711 to 713 of the white lines gradually decrease from near to far, as described with reference to FIG. 5, regardless of the optical axis deviation.
  • As a result, the measured parallax in the vicinity is the sum of the relatively large correct parallax 711 and the relatively small parallax error 721, the measured parallax in the distance is the sum of the relatively small correct parallax 713 and the relatively large parallax error 723, and the measured parallax in between is the sum of the medium correct parallax 712 and the medium parallax error 722.
  • Consequently, the measured parallax 703 becomes equal in the vicinity, in the distance, and in between, so a region in which the parallax is the same along the vertical direction appears in the parallax image, and there is a risk of erroneously detecting that a three-dimensional object exists.
  • Therefore, the imaging device of the present embodiment performs a process of differentiating three-dimensional objects from non-three-dimensional objects regardless of whether or not a vertical optical axis shift occurs between the left and right cameras of the stereo camera, and of determining whether or not an object the stereo camera recognizes as a three-dimensional object should be treated as a control target.
  • Specifically, the imaging device of the present embodiment calculates the parallax average value when the pair of captured images captured by the pair of imaging units is not shifted in the vertical direction and the parallax average values when the pair is relatively shifted in the vertical direction, and judges whether control of the control target is necessary from the distribution shape of the shift amounts and the parallax average values.
  • FIG. 1 is a block diagram showing the overall configuration of the image pickup apparatus according to the first embodiment.
  • The imaging device 100 is mounted on a vehicle such as an automobile; it is a stereo camera that images the area in front of the vehicle with a pair of left and right cameras and performs processing such as determining the presence or absence of a three-dimensional object based on parallax and identifying the type of the three-dimensional object.
  • The imaging device 100 includes a first imaging unit 11 and a second imaging unit 12, which are a pair of cameras, and a processing device 1 that processes the two captured images captured by the first imaging unit 11 and the second imaging unit 12.
  • In the present embodiment, an example in which the processing device 1 is configured inside the imaging device 100, which is a stereo camera, will be described; however, the place where the processing device 1 is configured is not limited to the inside of the imaging device 100, and it may instead be configured in an ECU or the like provided separately from the imaging device 100.
  • The first imaging unit 11 and the second imaging unit 12 are arranged at positions separated from each other in the vehicle width direction in the vehicle interior, and image an overlapping region in front of the vehicle through the windshield.
  • The first imaging unit 11 and the second imaging unit 12 are each composed of an assembly combining, for example, optical components such as a lens with an imaging element such as a CCD or CMOS.
  • The first imaging unit 11 and the second imaging unit 12 are adjusted so that their optical axes are parallel to each other and at the same height.
  • The captured images captured by the first imaging unit 11 and the second imaging unit 12 are input to the processing device 1.
  • The processing device 1 is composed of hardware including, for example, a CPU and memory, and software installed and executed on the hardware.
  • The processing device 1 includes, as internal functions, a first image acquisition unit 13, a second image acquisition unit 14, a first parallax image generation unit 15, a control target candidate recognition unit 16, a first parallax value acquisition unit 17, a second parallax image generation unit 18, a second parallax value acquisition unit 19, and a control target determination unit 20.
  • The first image acquisition unit 13 and the second image acquisition unit 14 acquire the captured images that are simultaneously and periodically captured by the first imaging unit 11 and the second imaging unit 12.
  • The first image acquisition unit 13 and the second image acquisition unit 14 cut out images of predetermined regions that overlap each other from the pair of captured images simultaneously captured by the first imaging unit 11 and the second imaging unit 12, and acquire them as a first image and a second image, respectively.
  • The first parallax image generation unit 15 generates the first parallax image using the pair of captured images acquired by the first image acquisition unit 13 and the second image acquisition unit 14.
  • As the method for generating the first parallax image, a conventionally known method can be used. For example, so-called stereo matching is performed, in which, with the first image as the reference, a pixel array of the second image at the same vertical height as in the first image is scanned in the horizontal direction to find the matching point with the first image, and the amount of lateral deviation between the first image and the second image is calculated as the parallax.
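A minimal Python/NumPy sketch of this kind of row-wise block matching (an illustration of the conventionally known method, not this publication's implementation; the function name, block size, and search range are all assumptions):

    import numpy as np

    def stereo_matching_row_sad(first, second, block=5, max_disp=64):
        # For each pixel of the first (reference) image, scan the same row
        # of the second image horizontally and take the lateral shift with
        # the smallest sum of absolute differences (SAD) as the parallax.
        h, w = first.shape
        half = block // 2
        disparity = np.zeros((h, w), dtype=np.float32)
        for y in range(half, h - half):
            for x in range(half + max_disp, w - half):
                ref = first[y - half:y + half + 1,
                            x - half:x + half + 1].astype(np.int32)
                costs = [np.abs(ref - second[y - half:y + half + 1,
                                             x - d - half:x - d + half + 1]
                                .astype(np.int32)).sum()
                         for d in range(max_disp)]
                disparity[y, x] = np.argmin(costs)
        return disparity

This nested-loop form is deliberately simple and slow; a production implementation would vectorize the cost computation or use dedicated hardware.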
  • The control target candidate recognition unit 16 detects a three-dimensional object from the first parallax image generated by the first parallax image generation unit 15 and recognizes the three-dimensional object as a control target candidate. For example, it judges that a three-dimensional object exists in a region where the amount of change in parallax along the vertical direction of the image is small compared with the amount of change in parallax along the vertical direction of an image in which a flat surface continues without a three-dimensional object.
  • When a plurality of three-dimensional objects are detected, the control target candidate recognition unit 16 recognizes each of them as a control target candidate.
  • The control target candidate recognition unit 16 acquires the coordinate information of the candidate region in which the control target candidate is recognized in the first parallax image (for example, the regions 331 and 332 in FIG. 3 and the region 402 in FIG. 4).
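A minimal sketch of such a criterion, assuming a rectified disparity map; the function name, the road-gradient parameter, and the 0.3 ratio are assumptions, not values from this publication:

    import numpy as np

    def find_candidate_columns(disparity, road_gradient_px, ratio=0.3):
        # Mean absolute vertical change of disparity per row step, column-wise.
        vertical_change = np.abs(np.diff(disparity, axis=0)).mean(axis=0)
        # A flat road surface decreases disparity by roughly road_gradient_px
        # per row from bottom to top; a three-dimensional object keeps the
        # disparity nearly constant along the vertical direction.
        return vertical_change < ratio * road_gradient_px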
  • The first parallax value acquisition unit 17 acquires the first parallax value, which is the parallax calculation value in the candidate region where the control target candidate exists in the first parallax image.
  • The first parallax value acquisition unit 17 executes the process of acquiring the first parallax value only when a control target candidate is recognized by the control target candidate recognition unit 16.
  • The first parallax value acquisition unit 17 does not perform the process of acquiring the first parallax value when no control target candidate is recognized by the control target candidate recognition unit 16.
  • As the first parallax value, the parallax average value in the candidate region can be used.
  • The parallax average value in the candidate region is calculated from the parallax values and the number of parallaxes in the candidate region.
  • The calculated parallax value is not limited to the parallax average value: the dispersion value of the parallax, the maximum value of the parallax, the minimum value of the parallax, the difference between the maximum and minimum values of the parallax, the ratio of the maximum and minimum values of the parallax, the average of the maximum values of the parallax, or the average of the minimum values of the parallax can also be used.
  • The second parallax image generation unit 18 generates a plurality of second parallax images using the pair of captured images used when the first parallax image was generated.
  • The second parallax image generation unit 18 generates a second parallax image using a pair of shift images cut out by shifting the relative vertical positions of the first image and the second image, for each vertical coordinate or for each equally divided section.
  • The second parallax image generation unit 18 cuts out a pair of shift images by shifting the position of at least one of the first image and the second image only upward, only downward, or in both the upward and downward directions.
  • The second parallax image generation unit 18 changes the vertical shift amount to generate a plurality of pairs of shift images, and uses them to generate a plurality of second parallax images. That is, the second parallax image generation unit 18 shifts the position of at least one of the first image and the second image a plurality of times to generate a plurality of pairs of shift images, and generates a second parallax image from each pair of shift images.
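A sketch of how the pairs of shift images and the resulting second parallax images might be produced, reusing stereo_matching_row_sad from the sketch above (the shift amounts here are hypothetical):

    def second_parallax_images(first, second, shifts=(-2, -1, 1, 2)):
        # For each vertical shift amount (pixels), cut out a pair of shift
        # images whose relative vertical positions differ by that amount,
        # then run the same stereo matching to obtain one second parallax
        # image per shift amount. Returns {shift: parallax_image}.
        h = first.shape[0]
        results = {}
        for s in shifts:
            if s >= 0:   # second image shifted downward relative to the first
                pair = first[s:, :], second[:h - s, :]
            else:        # second image shifted upward relative to the first
                pair = first[:h + s, :], second[-s:, :]
            results[s] = stereo_matching_row_sad(*pair)
        return results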
  • The second parallax value acquisition unit 19 acquires the second parallax value, which is the parallax calculation value in the corresponding region, from the corresponding region of the second parallax image.
  • The corresponding region of the second parallax image is set at the same position as the candidate region of the first parallax image. Since a plurality of second parallax images are generated, a corresponding region is set at the same position as the candidate region of the first parallax image for each second parallax image, and the second parallax value is obtained from each corresponding region. Further, when there are a plurality of candidate regions in the first parallax image, corresponding regions of the second parallax image are set for each of them, and the second parallax value of each corresponding region is obtained.
  • As the second parallax value, the parallax average value in the corresponding region can be used, in the same manner as for the first parallax value.
  • The parallax average value in the corresponding region is calculated from the parallax values and the number of parallaxes in the corresponding region.
  • As with the first parallax value, the dispersion value of the parallax, the maximum value of the parallax, the minimum value of the parallax, the difference between the maximum and minimum values of the parallax, the ratio of the maximum and minimum values of the parallax, the average of the maximum values of the parallax, or the average of the minimum values of the parallax can also be used.
  • The second parallax value is calculated for each of the plurality of second parallax images.
  • The control target determination unit 20 uses the first parallax image generated by the first parallax image generation unit 15 and the plurality of second parallax images generated by the second parallax image generation unit 18 to determine the necessity of control, that is, whether or not the control target candidate detected as a three-dimensional object is recognized as a control target.
  • The control target determination unit 20 determines the necessity of the control target based on the parallax calculation values calculated for each of the first parallax image and the plurality of second parallax images.
  • FIG. 2 is a flowchart illustrating the control target determination process of the imaging device according to the first embodiment.
  • In step S201, image acquisition is performed.
  • Here, the pair of captured images simultaneously captured by the first imaging unit 11 and the second imaging unit 12 is acquired by the first image acquisition unit 13 and the second image acquisition unit 14.
  • In step S202, the first parallax image is generated.
  • The first parallax image is generated by the first parallax image generation unit 15 using the pair of captured images acquired by the first image acquisition unit 13 and the second image acquisition unit 14.
  • In step S203, it is determined whether or not a control target candidate exists. Whether or not there is a control target candidate is determined by whether or not a three-dimensional object is detected from the first parallax image.
  • When a three-dimensional object is detected from the first parallax image, it is determined that a control target candidate exists in front of the own vehicle (YES in step S203), and the process proceeds to step S204.
  • When no three-dimensional object is detected, it is determined that there is no control target candidate in front of the own vehicle (NO in step S203), and the process returns to step S201. That is, in step S203, three-dimensional objects and non-three-dimensional objects are differentiated, and only an object recognized as a three-dimensional object is subjected to the process of determining whether or not it is a control target in step S204 and later.
  • In step S204, a process of shifting the images in the vertical direction is performed.
  • Here, a process of acquiring a pair of shift images by shifting the relative vertical positions of the first image and the second image, for each vertical coordinate or for each equally divided section, is performed.
  • In step S205, the parallax is calculated by stereo matching using the pair of shift images, and a second parallax image is generated.
  • In step S206, it is determined whether or not a plurality of second parallax images exist. When the number of second parallax images is less than a preset number of two or more, it is determined that a plurality of parallax images do not yet exist (NO in step S206), and the process returns to step S204. Then, in steps S204 and S205, the vertical shift amount of the pair of images is changed to generate another pair of shift images with a different shift amount, and a further second parallax image is generated from the pair of images whose shift amount has been changed.
  • The shift amount can be, for example, one pixel or a plurality of pixels.
  • The shift direction may be, for example, at least one of upward and downward of the second image with respect to the first image. In the present embodiment, the second image is shifted both upward and downward with respect to the first image, a pair of shift images is generated for each shift, and a second parallax image is generated from each of the plurality of pairs of shift images.
  • In step S207, a distribution of parallax average values is created.
  • Here, the first parallax value of the candidate region in the first parallax image and the second parallax values of the corresponding regions in the plurality of second parallax images are calculated, and an approximate straight line is obtained from the distribution of these parallax values (a code sketch of steps S207 to S209 is given after the discussion below).
  • FIG. 8 is a diagram showing the relationship between the shift amount and the parallax average value.
  • FIG. 8(a) shows a state in which the parallax average value is substantially constant regardless of the shift amount and the slope of the approximate straight line of the distribution is zero; FIG. 8(b) shows a state in which the parallax average value changes according to the shift amount and the slope of the approximate straight line of the distribution is equal to or greater than a threshold value.
  • In step S208, the slope of the approximate straight line of the distribution obtained in step S207 is compared with a preset threshold value.
  • If the control target candidate is a correctly recognized three-dimensional object, the parallax value in the recognition area of the control target candidate does not change with the vertical shift, so the distribution forms an approximate straight line with zero slope.
  • When the slope of the approximate straight line of the distribution is larger than the threshold value, it is determined that the parallax value of the corresponding region has changed due to the relative vertical shift between the first image and the plurality of second images.
  • That is, when the control target candidate recognized as a three-dimensional object in the first parallax image by the control target candidate recognition unit 16 is a true three-dimensional object such as a preceding vehicle, a bicycle, or a pedestrian, the parallax value does not change between the candidate region and the corresponding regions regardless of the presence or absence of a vertical shift.
  • On the other hand, when the control target candidate recognized as a three-dimensional object in the first parallax image by the control target candidate recognition unit 16 is a non-three-dimensional object such as a flow guide zone (zebra zone) painted on the road surface, the parallax value of the corresponding region changes relative to the parallax value of the candidate region depending on the presence or absence of a vertical shift.
  • In step S209, a control target candidate for which the slope of the approximate straight line of the distribution is determined in step S208 to be larger than the threshold value is regarded as a target not requiring control, and a process of excluding it from the control targets is executed. A control target candidate that remains without being excluded is determined to be a control target requiring control, and control target information is output.
  • In other words, when the parallax average value is substantially constant, that is, when the parallax calculation value of the candidate region does not depend on the relative vertical shift amount of the first image and the second image, the control target candidate is recognized as a control target.
  • When the parallax average values differ from each other, that is, when the parallax calculation value of the corresponding region changes from that of the candidate region according to the relative vertical shift amount of the first image and the second image, it is determined that the control target candidate is not a three-dimensional object but a flow guide zone erroneously detected as a three-dimensional object, and a process of excluding it from the control targets is performed.
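A sketch of the step S207 line fitting and the step S208/S209 threshold decision under these definitions (the 0.5 threshold is a hypothetical value, and np.polyfit stands in for whatever line-fitting method an actual implementation uses):

    import numpy as np

    def slope_of_parallax_distribution(parallax_means):
        # parallax_means maps each vertical shift amount in pixels
        # (0 = the unshifted first parallax image) to the parallax average
        # of the candidate/corresponding region for that shift.
        keys = sorted(parallax_means)
        shifts = np.array(keys, dtype=float)
        means = np.array([parallax_means[k] for k in keys], dtype=float)
        slope, _intercept = np.polyfit(shifts, means, 1)  # least-squares line
        return slope

    def requires_control(parallax_means, slope_threshold=0.5):
        # A true three-dimensional object keeps its parallax average nearly
        # constant across shifts (slope ~ 0); a flow guide zone does not,
        # and a slope above the threshold excludes the candidate (step S209).
        return abs(slope_of_parallax_distribution(parallax_means)) <= slope_threshold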
  • In steps S207 and S208, an example of creating a distribution of parallax average values and determining whether to exclude the control target candidate from the control targets based on the slope of the approximate straight line of the distribution has been described, but the present invention is not limited to this.
  • Instead of the slope of the approximate straight line of the distribution, the dispersion value of the parallax, the maximum value of the parallax, the minimum value of the parallax, the difference between the maximum and minimum values of the parallax, the ratio of the maximum and minimum values of the parallax, the average of the maximum values of the parallax, or the average of the minimum values of the parallax can be used.
  • FIG. 9 is a diagram illustrating an example of determining the necessity of a control target based on statistical results.
  • FIG. 9 shows, for a positive detection and a false detection, the parallax average values of shifts 1 to 6, which are six kinds of shift images, together with the average value (ave), dispersion value (sigma), minimum value (min), maximum value (max), difference between the maximum and minimum values (max-min), minimum value relative to the average value (min/ave), and maximum value relative to the average value (max/ave) of these parallax average values; the necessity of the control target can also be judged by comparing these values with threshold values.
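A sketch computing the statistics listed in FIG. 9 from the parallax averages of the shifted pairs; which statistic to compare with which threshold is a design choice the text leaves open:

    import numpy as np

    def shift_statistics(parallax_averages):
        # parallax_averages: one parallax average value per shift image
        # (shifts 1 to 6 in FIG. 9).
        v = np.asarray(parallax_averages, dtype=float)
        return {
            "ave": v.mean(),
            "sigma": v.var(),            # dispersion value
            "min": v.min(),
            "max": v.max(),
            "max-min": v.max() - v.min(),
            "min/ave": v.min() / v.mean(),
            "max/ave": v.max() / v.mean(),
        }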
  • As described above, the imaging device 100 of the present embodiment acquires a first image and a second image from a pair of captured images, generates a first parallax image using the first image and the second image, detects a three-dimensional object from the first parallax image and recognizes it as a control target candidate, and acquires the parallax value in the candidate region where the control target candidate exists as the first parallax value. Then, from the first image and the second image obtained from the pair of captured images, a pair of shift images in which the relative vertical positional relationship between the first image and the second image is shifted is acquired, a second parallax image is generated from the pair of shift images, a corresponding region corresponding to the candidate region of the first parallax image is set in the second parallax image, and the parallax value in the corresponding region is acquired as the second parallax value.
  • Further, a pair of shift images with a changed shift amount is acquired, another second parallax image is generated, a corresponding region is set, and the second parallax value in the corresponding region is acquired, so that a plurality of second parallax values with different shift amounts are obtained.
  • Then, it is determined from the distribution of the first parallax value and the plurality of second parallax values whether or not the control target candidate is a flow guide zone erroneously detected as a three-dimensional object, and when it is determined to be an erroneous detection, a process of excluding it from the control targets is performed.
  • When the plurality of second parallax values do not change with respect to the first parallax value, the detection is judged to be positive; when the plurality of second parallax values change according to the shift amount with respect to the first parallax value, the detection is judged to be false.
  • According to the imaging device 100 of the present embodiment, it is possible to determine whether or not control of the control target is necessary even when there is an optical axis deviation between the left and right cameras. Therefore, the flow guide zone can be excluded from the control targets instead of being treated as a three-dimensional object, the safety system such as AEB or ACC can be prevented from operating and applying unnecessary alarms or brakes, and the occupants can be prevented from feeling a sense of discomfort.
  • It is also conceivable that a three-dimensional object having a repeating pattern, such as a fence, is recognized as a control target. Even for such a three-dimensional object with a repeating pattern, the parallax value in the corresponding region of the second parallax image obtained in step S205 has the characteristic of changing when a vertical deviation of the pair of cameras occurs.
  • Therefore, when it is determined in step S203 that a control target candidate exists, it is determined in step S210 whether or not the own vehicle speed is greater than a threshold value. A fence is recognized as a control target when it is located on the traveling path of the own vehicle, and in that case it can be inferred that the own vehicle is traveling at low speed. Therefore, only when the own vehicle speed is greater than the preset vehicle speed threshold value (YES in step S210) does the process proceed to step S204; when the speed of the vehicle on which the imaging device 100 is mounted is equal to or less than the vehicle speed threshold value (NO in step S210), the second parallax image is not generated.
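A minimal sketch of this step S210 gate (the function name and the 15 km/h threshold are hypothetical; the publication does not give a numeric value):

    def should_generate_second_parallax(candidate_exists, own_speed_kmh,
                                        speed_threshold_kmh=15.0):
        # At low speed the candidate may be a fence on the own vehicle's
        # traveling path, whose repeating pattern also makes the second
        # parallax value change with the shift amount, so the second
        # parallax check (steps S204 onward) is skipped.
        return candidate_exists and own_speed_kmh > speed_threshold_kmh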
  • As a result, when the flow guide zone is erroneously detected as a three-dimensional object due to a vertical optical axis deviation between the left and right cameras, it can be determined that the flow guide zone is not a three-dimensional object, and the safety system can be prevented from operating unnecessarily.
  • The present invention is not limited to the above-described embodiment, and various design changes can be made without departing from the spirit of the present invention described in the claims.
  • For example, the above-described embodiment has been described in detail in order to explain the present invention in an easy-to-understand manner, and the present invention is not necessarily limited to one having all of the described configurations.
  • 100 Imaging device, 11 First imaging unit, 12 Second imaging unit, 13 First image acquisition unit, 14 Second image acquisition unit, 15 First parallax image generation unit, 16 Control target candidate recognition unit, 17 First parallax value acquisition unit, 18 Second parallax image generation unit, 19 Second parallax value acquisition unit, 20 Control target determination unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Manufacturing & Machinery (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The present invention provides a processing device capable of determining the necessity of controlling a control target even when the optical axis of a stereo camera shifts. This processing device, which processes a pair of images captured with a pair of cameras, performs processing to: generate a first parallax image from the pair of captured images; recognize a control target candidate from the first parallax image; acquire a first parallax value within the candidate region where the control target candidate is present in the first parallax image; shift the relative vertical position of the pair of captured images to generate a second parallax image; acquire a second parallax value within the corresponding region of the second parallax image that corresponds to the candidate region of the first parallax image; and use the first parallax value and the second parallax value to determine whether or not to recognize the control target candidate as a control target.
PCT/JP2020/048694 2020-03-06 2020-12-25 Processing device WO2021176819A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
DE112020005059.9T DE112020005059T5 (de) 2020-03-06 2020-12-25 Verarbeitungsvorrichtung
JP2022504989A JP7250211B2 (ja) 2020-03-06 2020-12-25 処理装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020038552 2020-03-06
JP2020-038552 2020-03-06

Publications (1)

Publication Number Publication Date
WO2021176819A1 (fr)

Family

ID=77613225

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/048694 WO2021176819A1 (fr) 2020-03-06 2020-12-25 Dispositif de traitement

Country Status (3)

Country Link
JP (1) JP7250211B2 (fr)
DE (1) DE112020005059T5 (fr)
WO (1) WO2021176819A1 (fr)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5097681B2 (ja) 2008-10-31 2012-12-12 日立オートモティブシステムズ株式会社 Feature position recognition device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012017650A1 (fr) * 2010-08-03 2012-02-09 パナソニック株式会社 Dispositif de détection d'objet, procédé de détection d'objet et programme associé
WO2013062087A1 (fr) * 2011-10-28 2013-05-02 富士フイルム株式会社 Dispositif de capture d'image pour effectuer une mesure tridimensionnelle, dispositif de mesure tridimensionnelle et programme de mesure
JP2015190921A (ja) * 2014-03-28 2015-11-02 富士重工業株式会社 車両用ステレオ画像処理装置
WO2019003771A1 (fr) * 2017-06-26 2019-01-03 日立オートモティブシステムズ株式会社 Dispositif d'imagerie

Also Published As

Publication number Publication date
JP7250211B2 (ja) 2023-03-31
DE112020005059T5 (de) 2022-07-21
JPWO2021176819A1 (fr) 2021-09-10

Similar Documents

Publication Publication Date Title
JP6013884B2 (ja) Object detection device and object detection method
EP2422320B1 (fr) Object detection device
JP4956452B2 (ja) Environment recognition device for vehicle
WO2016002405A1 (fr) Parking space recognition device
JP6274557B2 (ja) Moving surface information detection device, mobile equipment control system using the same, and moving surface information detection program
KR100941271B1 (ko) Lane departure prevention method for automobiles
US7542835B2 (en) Vehicle image processing device
JP5371725B2 (ja) Object detection device
JP2009110172A (ja) Object detection device
US8730325B2 (en) Traveling lane detector
JP6722084B2 (ja) Object detection device
JP6592991B2 (ja) Object detection device, object detection method, and program
US10984258B2 (en) Vehicle traveling environment detecting apparatus and vehicle traveling controlling system
CN109522779B (zh) Image processing device
JP2009169847A (ja) Vehicle exterior monitoring device
JP5073700B2 (ja) Object detection device
JP2008309519A (ja) Object detection device using image processing
WO2011016257A1 (fr) Distance calculation device for a vehicle
JP2007299045A (ja) Lane recognition device
WO2021176819A1 (fr) Processing device
WO2017154305A1 (fr) Image processing device, apparatus control system, imaging device, image processing method, and program
JP7232005B2 (ja) Vehicle traveling environment detection device and traveling control system
WO2023053477A1 (fr) Image processing device
WO2024150471A1 (fr) Image processing device and image processing method
JP5822866B2 (ja) Image processing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20922789

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022504989

Country of ref document: JP

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 20922789

Country of ref document: EP

Kind code of ref document: A1