WO2020054429A1 - Image processing device - Google Patents

Image processing device

Info

Publication number
WO2020054429A1
Authority
WO
WIPO (PCT)
Prior art keywords
parallax
image processing
stereo camera
image
unit
Prior art date
Application number
PCT/JP2019/033762
Other languages
English (en)
Japanese (ja)
Inventor
裕介 内田 (Yusuke Uchida)
圭介 稲田 (Keisuke Inada)
進一 野中 (Shinichi Nonaka)
Original Assignee
日立オートモティブシステムズ株式会社 (Hitachi Automotive Systems, Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日立オートモティブシステムズ株式会社 (Hitachi Automotive Systems, Ltd.)
Priority to CN201980049733.5A (CN112513572B)
Publication of WO2020054429A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02 Details
    • G01C 3/06 Use of electric means to obtain final indication
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/55 Depth or shape recovery from multiple images
    • G06T 7/593 Depth or shape recovery from multiple images from stereo images

Definitions

  • the present invention relates to an image processing device.
  • There is known a position measuring device that measures the distance to an object in the camera field of view by the principle of triangulation, based on a stereo image captured by a stereo camera having a pair of cameras.
  • the principle of triangulation calculates the distance from the cameras to an object using the displacement (parallax) between the images of the same object captured by the left and right cameras.
  • parallax is derived by identifying where the image of an object in one image appears in the other image.
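  • as a hedged illustration of this principle: for a rectified stereo pair with focal length f (in pixels) and baseline B (the distance between the two optical centers), the distance Z to an object observed with parallax d pixels is Z = f * B / d. The function name and the numeric values in the sketch below are illustrative, not taken from the patent.

```python
def distance_from_parallax(parallax_px, focal_px, baseline_m):
    """Triangulation for a rectified stereo pair: Z = f * B / d.

    parallax_px -- displacement d of the same object between the left and
                   right images, in pixels (must be positive)
    focal_px    -- focal length f expressed in pixels
    baseline_m  -- baseline B between the two cameras, in meters
    """
    if parallax_px <= 0:
        raise ValueError("parallax must be positive for a finite distance")
    return focal_px * baseline_m / parallax_px

# A near object produces a large parallax, a far one a small parallax:
# with f = 1000 px and B = 0.35 m, d = 50 px gives Z = 7 m and d = 5 px gives 70 m.
```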
  • Patent Literature 1 discloses an obstacle measurement method that measures the distance to an obstacle by parallax calculation using a stereo image, in which the parallax in a high-resolution image is measured based on distance information obtained from a low-resolution image.
  • in this method, the sampling interval of the pixels subjected to the correlation operation is set to be large at short distances and small at long distances.
  • in Patent Literature 1, by setting an appropriate sampling interval, high-accuracy distance information is obtained only in the necessary area and unnecessary calculation is suppressed, which makes it possible to reduce the operation time and the circuit scale of the device. However, since parallax must be calculated from both the low-resolution image and the high-resolution image, the overall amount of calculation cannot be reduced much, and there is room for further improvement in suppressing it.
  • An image processing apparatus according to the present invention calculates parallax based on a stereo camera image captured by a stereo camera, and includes: a condition setting unit that sets a condition for determining calculation target pixels in the stereo camera image; and a parallax calculation unit that determines the calculation target pixels in the stereo camera image based on the condition set by the condition setting unit and calculates the parallax using information on the calculation target pixels.
  • According to the present invention, the amount of calculation required for calculating parallax can be suppressed.
  • FIG. 1 is a block diagram illustrating the basic configuration of an image processing apparatus according to an embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating the configuration of an image processing apparatus according to a first embodiment of the present invention.
  • FIG. 3 is a flowchart illustrating the parallax calculation algorithm in the image processing apparatus according to the first embodiment of the present invention.
  • FIG. 4 is a block diagram illustrating the configuration of an image processing apparatus according to a second embodiment of the present invention.
  • FIG. 5 is a flowchart illustrating the parallax calculation algorithm in the image processing apparatus according to the second embodiment of the present invention.
  • FIG. 6 is a block diagram illustrating the configuration of an image processing apparatus according to a third embodiment of the present invention.
  • FIG. 7 is a diagram showing an example of determining the number of scan lines in the image processing apparatus according to the third embodiment of the present invention.
  • FIG. 8 is a block diagram illustrating the configuration of an image processing apparatus according to a fourth embodiment of the present invention.
  • FIG. 9 is a block diagram illustrating the configuration of an image processing apparatus according to a fifth embodiment of the present invention.
  • FIG. 10 is an explanatory diagram of the pixel thinning used for the smoothing calculation in the image processing apparatus according to the fifth embodiment of the present invention.
  • FIG. 11 is an explanatory diagram of the pixel arrangement used for the smoothing calculation in the image processing apparatus according to the fifth embodiment of the present invention.
  • FIG. 12 is a block diagram illustrating the configuration of an image processing apparatus according to a sixth embodiment of the present invention.
  • FIG. 1 is a block diagram illustrating a basic configuration of an image processing apparatus according to an embodiment of the present invention.
  • the image processing apparatus 10 illustrated in FIG. 1 is connected to the camera 20 and the processor 30, and includes a condition setting unit 101 and a parallax calculation unit 102.
  • the image processing device 10 is mounted on a vehicle, for example, and is used for detecting and recognizing an object such as another vehicle or an obstacle existing around the vehicle.
  • the camera 20 is a stereo camera installed at predetermined intervals on the left and right sides, captures a pair of left and right stereo camera images, and outputs the captured data to the image processing apparatus 10.
  • the image processing apparatus 10 calculates parallax in a stereo camera image using the parallax calculation unit 102 based on the shooting data input from the camera 20.
  • the parallax calculation unit 102 determines the calculation target pixels in the stereo camera image based on the condition set by the condition setting unit 101, and calculates the parallax using information on those pixels, for example by a calculation method called SGM (Semi-Global Matching).
  • the calculation of the parallax may be performed using a calculation method other than the SGM.
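  • SGM aggregates pixel-wise matching costs along several scan lines and is considerably more elaborate; as a hedged, minimal sketch of only the underlying matching idea, the following block matching routine finds, for one left-image pixel, the horizontal shift that minimizes a sum-of-absolute-differences (SAD) cost against the right image. The function names, window size, and disparity range are illustrative, not part of the patent.

```python
def sad_cost(left, right, y, x, d, w=1):
    """SAD matching cost between the (2w+1)x(2w+1) window around (y, x) in
    the left image and the window shifted d pixels leftward in the right image."""
    cost = 0
    for dy in range(-w, w + 1):
        for dx in range(-w, w + 1):
            cost += abs(left[y + dy][x + dx] - right[y + dy][x + dx - d])
    return cost

def block_matching_disparity(left, right, y, x, max_d=16, w=1):
    """Return the shift d in [0, max_d] that minimizes the SAD cost."""
    best_d, best_cost = 0, float("inf")
    for d in range(0, min(max_d, x - w) + 1):  # stay inside the right image
        c = sad_cost(left, right, y, x, d, w)
        if c < best_cost:
            best_d, best_cost = d, c
    return best_d
```

For a right image that is the left image shifted by a true disparity of 3 pixels, the routine recovers d = 3 at a matching cost of zero.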
  • the calculation result of the parallax by the parallax calculation unit 102 is output from the image processing device 10 to the processor 30.
  • the processor 30 performs processing such as object detection and object recognition using the information on the parallax input from the image processing apparatus 10.
  • FIG. 2 is a block diagram illustrating a configuration of the image processing apparatus according to the first embodiment of the present invention.
  • the image processing apparatus 10 includes an FPGA 11 and a memory 12.
  • the FPGA 11 is an arithmetic processing device configured by combining a large number of logic circuits, and provides, as its functions, the condition setting unit 101 and the parallax calculation unit 102 described with reference to FIG. 1. Note that the condition setting unit 101 and the parallax calculation unit 102 may be implemented in the image processing apparatus 10 by means other than the FPGA 11, for example by another logic circuit such as an ASIC, or by software executed by a CPU.
  • the memory 12 is a storage device configured using a readable and writable recording medium such as a RAM, an HDD, and a flash memory, and has a stereo camera captured image storage unit 103 and a parallax image storage unit 104 as its functions.
  • the image data representing the stereo camera image input from the camera 20 to the image processing apparatus 10 is stored in the stereo camera image storage unit 103 in the memory 12 on a frame basis.
  • the parallax calculation unit 102 in the FPGA 11 reads the pixel data of the stereo camera image of each frame from the stereo camera captured image storage unit 103 and performs the parallax calculation using the read data, thereby generating, for each frame, a parallax image representing the parallax in pixel units.
  • the parallax image generated by the parallax calculation unit 102 is output to the processor 30 and stored in the parallax image storage unit 104 in the memory 12.
  • the parallax calculation unit 102 performs smoothing calculation and dissimilarity calculation in the calculation of parallax using a stereo camera image.
  • the smoothing calculation is arithmetic processing that smooths (averages) the pixel value of each pixel in a stereo camera image together with the pixel values of a plurality of surrounding pixels, thereby smoothing out pixel-to-pixel fluctuations in pixel value.
  • the dissimilarity calculation is arithmetic processing that obtains the dissimilarity, that is, the likelihood that the object shown in one of the stereo camera images and the object shown in the other image are not the same thing, indicating how far apart the two are.
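  • a minimal sketch of these two operations, assuming grayscale images stored as nested lists; the function names and the window radius are illustrative, not specified by the patent:

```python
def smooth_pixel(image, y, x, radius=1):
    """Smoothing calculation: average the pixel at (y, x) with its
    surrounding pixels; positions outside the image are ignored."""
    total, count = 0, 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            yy, xx = y + dy, x + dx
            if 0 <= yy < len(image) and 0 <= xx < len(image[0]):
                total += image[yy][xx]
                count += 1
    return total / count

def dissimilarity(a, b):
    """Dissimilarity between two pixel (or smoothed-pixel) values:
    zero when identical, growing as the two values move apart."""
    return abs(a - b)
```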
  • the condition setting unit 101 has the line number determination unit 111.
  • the line number determination unit 111 reads the parallax image that was generated by the parallax calculation unit 102 from the stereo camera image of the previous frame and stored in the parallax image storage unit 104, and determines, based on that parallax image, the number of scan lines used in the smoothing calculation executed by the parallax calculation unit 102. That is, the line number determination unit 111 of the present embodiment determines the number of scan lines for scanning the stereo camera image of the current frame based on the parallax calculated in the past by the parallax calculation unit 102, and then notifies the parallax calculation unit 102 of the determined number of scan lines.
  • the condition setting unit 101 in the present embodiment performs the condition setting for the parallax calculation unit 102 by notifying the number of scan lines determined by the line number determination unit 111 in this manner.
  • the parallax calculation unit 102 sets, for each pixel of the stereo camera image of the current frame read from the stereo camera captured image storage unit 103, the notified number of scan lines. Then, the smoothing calculation is performed using the pixels of the stereo camera image on each set scan line as calculation target pixels. Thereafter, the dissimilarity is calculated for each pixel of the smoothed stereo camera image, and the parallax of each pixel is calculated based on the calculation result.
  • the parallax calculation unit 102 outputs the parallax calculation result thus obtained in pixel units to the processor 30 as a parallax image, and stores it in the parallax image storage unit 104.
  • FIG. 3 is a flowchart showing an algorithm for calculating parallax in the image processing apparatus according to the first embodiment of the present invention.
  • the parallax calculation shown in the flowchart of FIG. 3 is performed in the FPGA 11 when the stereo camera image newly captured by the camera 20 is stored in the stereo camera captured image storage unit 103 as the stereo camera image of the current frame.
  • in step S10, the parallax calculation unit 102 reads the stereo camera image of the current frame from the stereo camera captured image storage unit 103 and sequentially selects pixels, starting from the pixel at the screen edge.
  • in step S20, the line number determination unit 111 reads, from the parallax image storage unit 104, the parallax of the pixel selected in step S10 in the parallax image calculated by the parallax calculation unit 102 for the previous frame, and refers to the read parallax.
  • in step S30, the line number determination unit 111 determines whether the parallax referred to in step S20 is larger than a predetermined threshold dth.
  • when the parallax is larger than the threshold dth, the process proceeds to step S40, and when the parallax is equal to or smaller than the threshold dth, the process proceeds to step S50.
  • in step S40, the line number determination unit 111 determines that the pixel selected in step S10 corresponds to a short-distance image, and sets a small number of scan lines for the pixel.
  • in this case, the number of scan lines is determined to be four.
  • in step S50, the line number determination unit 111 determines that the pixel selected in step S10 corresponds to a long-distance image, and sets a large number of scan lines for the pixel.
  • in this case, the number of scan lines is determined to be eight.
  • in step S60, the parallax calculation unit 102 calculates the parallax of the pixel selected in step S10 based on the determined number of scan lines.
  • that is, the smoothing calculation based on the determined number of scan lines is performed on the pixel, and the dissimilarity is optimized to calculate the parallax. The parallax in the stereo camera image of the current frame is thereby calculated for each pixel.
  • in step S70, the parallax calculation unit 102 determines whether the calculation has been completed for all pixels. If there is a pixel for which the calculation has not yet been performed, the process returns to step S10 to select the next pixel, and the above processing is repeated. When the calculation for all pixels is completed, the obtained parallax image of the current frame is stored in the parallax image storage unit 104, and the parallax calculation shown in the flowchart of FIG. 3 ends.
  • in the above, the numbers of scan lines set in steps S40 and S50 have been exemplified as four and eight, respectively, but these are only examples and do not limit the present invention.
  • a plurality of values may be set as the threshold dth used in the determination in step S30 and compared with the parallax obtained in the previous frame, so that three or more scan-line counts can be set. In any case, it is preferable to judge that the smaller the value of the parallax obtained in the previous frame, the longer the distance to the imaged object, and to increase the number of scan lines accordingly.
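  • the per-pixel decision of steps S20 to S50 can be sketched as follows; the threshold value is a hypothetical example, the line counts of four and eight are the examples given above, and the names are illustrative:

```python
DTH = 10          # parallax threshold dth (illustrative example value)
LINES_NEAR = 4    # step S40: short-distance pixel -> few scan lines
LINES_FAR = 8     # step S50: long-distance pixel -> many scan lines

def scan_lines_for_pixel(prev_parallax):
    """Steps S20-S50: choose the scan-line count for one pixel of the
    current frame from its parallax in the previous frame's parallax image."""
    if prev_parallax > DTH:   # step S30: large parallax means the object is near
        return LINES_NEAR
    return LINES_FAR          # parallax <= dth: treat the pixel as far

def scan_line_counts(prev_parallax_image):
    """Steps S10 and S70: visit every pixel of the previous parallax image."""
    return [[scan_lines_for_pixel(d) for d in row] for row in prev_parallax_image]
```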
  • FIG. 4 is a block diagram showing a configuration of an image processing device according to the second embodiment of the present invention.
  • the image processing apparatus 10 according to the present embodiment differs from the first embodiment described with reference to FIG. 2 in that the memory 12 does not include the parallax image storage unit 104, and in that the area information output from the processor 30 is input to the line number determination unit 111.
  • this area information is set for each object based on the result of the object detection performed by the processor 30, and indicates which area in the stereo camera image the object corresponds to and whether its distance is classified into the short-distance range or the long-distance range.
  • the line number determination unit 111 receives the area information output from the processor 30 in accordance with the parallax calculated from the stereo camera image of the previous frame, and determines the number of scan lines used in the smoothing calculation performed by the parallax calculation unit 102 based on the distance-range information for each area. That is, the line number determination unit 111 of the present embodiment acquires information on the distance range determined based on the parallax calculated in the past by the parallax calculation unit 102, and determines the number of scan lines based on the acquired distance-range information. The other points are the same as in the first embodiment.
  • FIG. 5 is a flowchart illustrating an algorithm for calculating parallax in the image processing apparatus according to the second embodiment of the present invention.
  • the parallax calculation shown in the flowchart of FIG. 5 is performed in the FPGA 11 when the stereo camera image newly photographed by the camera 20 is stored in the stereo camera photographed image storage unit 103 as the stereo camera image of the current frame.
  • in step S10A, the line number determination unit 111 reads from the processor 30 the determination result of the distance range for each region, determined by the processor 30 based on the parallax for the stereo camera image of the previous frame.
  • that is, the determination result indicating whether the distance range of each area corresponding to each object detected from the stereo camera image of the previous frame corresponds to the short distance or the long distance is read.
  • in step S20A, the parallax calculation unit 102 reads the stereo camera image of the current frame from the stereo camera captured image storage unit 103 and sequentially selects pixels, starting from the pixel at the screen edge.
  • in step S30A, the line number determination unit 111 determines, from the determination result of the distance range of the region corresponding to the pixel selected in step S20A among the determination results of the distance ranges in the previous frame read in step S10A, whether the object detected in the previous frame is within the short-distance area. If the determination result of the distance range is the short distance, it is determined that the object was in the short-distance area in the previous frame, and the process proceeds to step S40. If the determination result of the distance range is the long distance, it is determined that the object was not in the short-distance area in the previous frame, and the process proceeds to step S50.
  • in step S40, the line number determination unit 111 determines that the pixel selected in step S20A corresponds to a short-distance image, and sets a small number of scan lines for the pixel.
  • the number of scan lines is determined to be four.
  • in step S50, the line number determination unit 111 determines that the pixel selected in step S20A corresponds to a long-distance image, and sets a large number of scan lines for the pixel.
  • the number of scan lines is determined to be eight as in the first embodiment.
  • subsequently, the parallax calculation unit 102 performs the same processing from step S60 onward as in the flowchart of FIG. 3 described in the first embodiment. That is, in step S60, the parallax of the pixel selected in step S20A is calculated based on the determined number of scan lines, and in step S70, it is determined whether the calculation has been completed for all pixels. If there is a pixel that has not yet been calculated, the process returns to step S20A to select the next pixel, and the above processing is repeated. When the calculation for all pixels is completed, the parallax calculation shown in the flowchart of FIG. 5 ends.
  • the numbers of scan lines set in steps S40 and S50 are not limited to four and eight. Further, by setting three or more distance ranges for the areas represented by the area information output from the processor 30, three or more scan-line counts may be used. In any case, it is preferable to increase the number of scan lines set for the pixels corresponding to an object as that object exists at a longer distance.
  • in the above, an example has been described in which the area information obtained from the stereo camera image of the immediately preceding frame is used to determine the number of scan lines, but this does not limit the present invention.
  • in the first embodiment, the number of scan lines is determined based on parallax information in units of pixels or of pixel groups around a pixel, whereas in the present embodiment the number of scan lines is determined in units of areas obtained by object detection. Therefore, the number of scan lines can be determined based on more global information than in the first embodiment.
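  • a hedged sketch of this area-based determination, assuming the processor's area information is a list of rectangular regions each tagged with a distance range; the data layout, region coordinates, and line counts are illustrative, not specified by the patent:

```python
def scan_lines_from_area_info(y, x, area_info, near_lines=4, far_lines=8):
    """Choose the scan-line count for pixel (y, x) from area information.

    area_info -- list of (y0, x0, y1, x1, distance_range) tuples, one per
                 detected object, where distance_range is "near" or "far"
                 (a hypothetical encoding of the processor's output).
    Pixels not covered by any detected region are treated as far."""
    for y0, x0, y1, x1, distance_range in area_info:
        if y0 <= y < y1 and x0 <= x < x1:
            return near_lines if distance_range == "near" else far_lines
    return far_lines

# e.g. one near object in the top-left 10x10 block, one far object beside it:
areas = [(0, 0, 10, 10, "near"), (0, 10, 10, 20, "far")]
```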
  • FIG. 6 is a block diagram illustrating a configuration of an image processing apparatus according to the third embodiment of the present invention.
  • the image processing apparatus 10 according to the present embodiment differs from the second embodiment described with reference to FIG. 4 in that the information on the object detection result output from the processor 30 is input to the line number determination unit 111.
  • this information indicates the result of the object detection performed by the processor 30, and includes information indicating in which region of the stereo camera image each object has been detected.
  • the line number determination unit 111 receives the object detection result output from the processor 30 in accordance with the parallax calculated from the stereo camera image of the previous frame, and determines the number of scan lines used in the smoothing calculation performed by the parallax calculation unit 102 based on that information. For example, a small number of lines is adopted for pixels determined to belong to an already detected object, and a large number of lines is adopted for pixels in which no object has been detected.
  • that is, the line number determination unit 111 of the present embodiment acquires information on objects detected based on the parallax calculated in the past by the parallax calculation unit 102, and determines the number of scan lines based on the acquired object information.
  • the other points are the same as the first and second embodiments.
  • FIG. 7 is a diagram showing an example of determining the number of scan lines in the image processing device according to the third embodiment of the present invention.
  • in the example of FIG. 7, a small number of lines 911 is set for an already detected object even at a long distance, while a large number of lines 912 is set where no object has been detected, so that a more accurate parallax calculation is performed there, and a smaller number of lines 913 is set for the road surface 903 on which an object has been detected.
  • FIG. 8 is a block diagram showing a configuration of an image processing apparatus according to the fourth embodiment of the present invention.
  • the image processing apparatus 10 according to the present embodiment differs from the second and third embodiments described with reference to FIGS. 4 and 6 in that distance information output from a LIDAR (Light Detection And Ranging) 40, an example of a sensor capable of acquiring distance information, is input to the line number determination unit 111.
  • the LIDAR 40 in FIG. 8 is an example of another sensor capable of acquiring distance information, and the present invention is not limited to this.
  • distance information output from, for example, an infrared depth sensor, an ultrasonic sensor, a millimeter wave radar, or another camera image may be used.
  • the line number determination unit 111 receives the distance information output from the LIDAR 40 for the range of the stereo camera image captured by the camera 20, and determines the number of scan lines used in the smoothing calculation performed by the parallax calculation unit 102 based on that distance information, in the same manner as in the first embodiment.
  • that is, the distance of each pixel represented by the distance information is compared with a predetermined threshold: a small number of scan lines is set for pixels whose distance is smaller than the threshold, and a large number of scan lines is set for pixels whose distance is larger than the threshold.
  • the other points are the same as the first to third embodiments.
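  • a minimal sketch of this distance-based variant; unlike the first embodiment it thresholds a measured distance rather than a past parallax value. The threshold and line counts are illustrative assumptions:

```python
def scan_lines_from_distance(distance_m, threshold_m=20.0):
    """Choose the scan-line count directly from a measured distance
    (e.g. from the LIDAR 40) instead of from a past parallax value."""
    if distance_m < threshold_m:
        return 4   # near pixel: a small number of scan lines suffices
    return 8       # far pixel: more scan lines for accuracy
```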
  • FIG. 9 is a block diagram showing a configuration of an image processing device according to the fifth embodiment of the present invention.
  • the image processing apparatus 10 according to the present embodiment differs from the first embodiment described with reference to FIG. 2 in that the condition setting unit 101 of the FPGA 11 includes a sparse density determination unit 112 in place of the line number determination unit 111.
  • the sparse density determination unit 112 reads the parallax image that was generated by the parallax calculation unit 102 from the stereo camera image of the previous frame and stored in the parallax image storage unit 104, and determines, based on that parallax image, the sparse density used in the smoothing calculation executed by the parallax calculation unit 102.
  • that is, the sparse density determination unit 112 of the present embodiment determines the sparse density based on the parallax calculated in the past by the parallax calculation unit 102, and then notifies the parallax calculation unit 102 of the determined sparse density.
  • the condition setting unit 101 in the present embodiment sets the condition for the parallax calculation unit 102 by notifying the sparse density determined by the sparse density determination unit 112 in this manner.
  • the parallax calculation unit 102 thins out pixels, at a ratio corresponding to the notified sparse density, from the stereo camera image of the current frame read from the stereo camera captured image storage unit 103. Then, the smoothing calculation is performed using the remaining, non-thinned pixels of the stereo camera image as calculation target pixels. Thereafter, as in the first embodiment, the dissimilarity calculation is performed for each pixel of the smoothed stereo camera image, and the parallax of each pixel is calculated based on the calculation result.
  • the parallax calculation unit 102 outputs the parallax calculation result obtained in pixel units as a parallax image to the processor 30 and stores the parallax calculation result in the parallax image storage unit 104.
  • FIG. 10 is an explanatory diagram of pixel thinning used in the smoothing calculation in the image processing device according to the fifth embodiment of the present invention.
  • FIG. 10A is a diagram illustrating an example of pixels used in a smoothing calculation of a pixel corresponding to a long-distance image.
  • for a pixel corresponding to a long-distance image, the pixels on each scan line are not thinned out, as shown in FIG. 10A, and a dense smoothing calculation is performed over a narrow range of the image.
  • FIG. 10B is a diagram illustrating an example of a pixel used in a smoothing calculation of a pixel corresponding to an image at a short distance.
  • the sparse density determination unit 112 of the present embodiment determines the sparse density such that the larger the value of the parallax obtained in the previous frame, the shorter the distance to the imaged object is judged to be, and the greater the proportion of pixels thinned out from the stereo camera image.
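  • a hedged sketch of thinning by sparse density: a larger previous-frame parallax selects a coarser sampling step along each scan line. The threshold and step values are illustrative assumptions, not from the patent:

```python
def thinning_step_from_parallax(prev_parallax, dth=10):
    """Larger previous-frame parallax (nearer object) -> sparser sampling:
    keep only every n-th pixel along a scan line."""
    return 2 if prev_parallax > dth else 1  # near: every 2nd pixel; far: all

def thin_scan_line(pixels, step):
    """Keep every `step`-th pixel of a scan line as calculation targets."""
    return pixels[::step]
```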
  • FIG. 11 is an explanatory diagram of an arrangement of pixels used for smoothing calculation in the image processing device according to the fifth embodiment of the present invention.
  • FIG. 11A shows an example in which pixels arranged side by side on eight scan lines are used for smoothing calculation.
  • FIG. 11B shows an example in which pixels arranged not along scan lines but according to a predetermined arrangement pattern are used for the smoothing calculation.
  • the pixel arrangement shown in FIG. 11 is an example, and the pixel arrangement used for the smoothing calculation is not limited to this.
  • in the above, the sparse density determination unit 112 determines the sparse density based on the parallax calculated in the past by the parallax calculation unit 102, in the same manner as the determination of the number of scan lines by the line number determination unit 111 described in the first embodiment.
  • however, the present invention is not limited to this.
  • similarly to the determination of the number of scan lines in the second embodiment, information on the distance range determined based on the parallax calculated in the past by the parallax calculation unit 102 may be acquired from the processor 30, and the sparse density may be determined based on the acquired distance-range information. Further, similarly to the determination of the number of scan lines in the third embodiment, information on an object detected based on the parallax calculated in the past by the parallax calculation unit 102 may be acquired from the processor 30, and the sparse density may be determined based on the acquired object information. Alternatively, similarly to the determination of the number of scan lines in the fourth embodiment, distance information corresponding to the range of the stereo camera image measured by another sensor may be acquired, and the sparse density may be determined based on the acquired distance information. The sparse density can also be determined by any other method.
  • FIG. 12 is a block diagram showing a configuration of an image processing apparatus according to the sixth embodiment of the present invention.
  • the image processing apparatus 10 according to the present embodiment differs from the first embodiment described with reference to FIG. 2 in that the condition setting unit 101 of the FPGA 11 further includes a sparse density determination unit 112 in addition to the line number determination unit 111.
  • This sparse density determination unit 112 is the same as that described in the fifth embodiment.
  • although FIG. 12 illustrates an example in which the memory 12 includes the parallax image storage unit 104, when the line number determination unit 111 and the sparse density determination unit 112 use the same methods as in the second to fourth embodiments, the parallax image storage unit 104 may be omitted from the memory 12. Further, as described in the fourth embodiment, when the number of scan lines or the sparse density is determined using distance information measured by another sensor, the distance information from that sensor is input to the line number determination unit 111 or the sparse density determination unit 112.
  • the image processing apparatus 10 calculates parallax based on a stereo camera image captured by the camera 20.
  • the image processing apparatus 10 includes: a condition setting unit 101 that sets a condition for determining calculation target pixels in a stereo camera image; and a parallax calculation unit 102 that determines the calculation target pixels in the stereo camera image based on the condition set by the condition setting unit 101 and calculates the parallax using information on the calculation target pixels. With this configuration, the amount of calculation required for calculating the parallax can be suppressed.
  • the condition setting unit 101 has a line number determination unit 111 that determines the number of scan lines for scanning a stereo camera image.
  • the parallax calculation unit 102 sets the number of scan lines determined by the line number determination unit 111 for the stereo camera image, and determines pixels of the stereo camera image on the scan line as calculation target pixels. With this configuration, it is possible to appropriately determine the calculation target pixel according to the condition.
  • the line number determination unit 111 determines the number of scan lines based on the parallax calculated by the parallax calculation unit 102 in the past. At this time, the line number determination unit 111 increases the number of scan lines as the value of the parallax is smaller. With this configuration, it is possible to determine the optimal number of scan lines according to the distance for each pixel from past disparity information.
  • The line number determination unit 111 acquires information on a distance range determined based on the parallax previously calculated by the parallax calculation unit 102, and determines the number of scan lines based on the acquired distance range information. With this configuration, the optimal number of scan lines according to the distance of each object can be determined from past parallax information.
  • The line number determination unit 111 acquires information on an object detected based on the parallax previously calculated by the parallax calculation unit 102, and determines the number of scan lines based on the acquired object information. With this configuration, the optimal number of scan lines for each object can be determined from past parallax information.
  • The line number determination unit 111 acquires distance information for a range of the stereo camera image, and determines the number of scan lines based on the acquired distance information. Thus, even when parallax cannot be calculated from a past stereo camera image, the optimal number of scan lines according to the per-pixel distance can be determined.
  • The condition setting unit 101 includes a sparse density determination unit 112 that determines a sparse density representing the proportion of pixels to be thinned out from the stereo camera image.
  • The parallax calculation unit 102 determines the pixels remaining after thinning out the stereo camera image according to the sparse density determined by the sparse density determination unit 112 as the calculation target pixels. With this configuration, the calculation target pixels can be determined appropriately according to the condition.
  • The sparse density determination unit 112 can determine the sparse density based on the parallax previously calculated by the parallax calculation unit 102. At this time, the sparse density determination unit 112 determines the sparse density such that the larger the parallax value, the greater the proportion of pixels thinned out from the stereo camera image. With this configuration, the optimal sparse density according to the per-pixel distance can be determined from past parallax information.
  • The sparse density determination unit 112 can also acquire information on a distance range determined based on the parallax previously calculated by the parallax calculation unit 102, and determine the sparse density based on the acquired distance range information.
  • The sparse density determination unit 112 may also acquire information on an object detected based on the parallax previously calculated by the parallax calculation unit 102, and determine the sparse density based on the acquired object information.
  • DESCRIPTION OF SYMBOLS: 10: image processing apparatus; 11: FPGA; 12: memory; 20: camera; 30: processor; 40: LIDAR; 101: condition setting unit; 102: parallax calculation unit (SGM calculation); 103: stereo camera image storage unit; 104: parallax image storage unit; 111: line number determination unit; 112: sparse density determination unit
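As an illustration of the scan-line scheme summarized in the bullets above — only pixels lying on a limited set of scan lines become calculation target pixels — the following is a minimal sketch. The function name `scan_line_rows` and the even-spacing policy are assumptions for illustration, not the patented implementation.

```python
def scan_line_rows(image_height, n_lines):
    """Return evenly spaced row indices to use as scan lines.

    Pixels on these rows become the calculation target pixels;
    rows between scan lines are skipped, reducing the matching workload.
    """
    if n_lines <= 1:
        return [image_height // 2]  # degenerate case: one central scan line
    step = (image_height - 1) / (n_lines - 1)
    return [round(i * step) for i in range(n_lines)]
```

For example, `scan_line_rows(480, 4)` selects rows 0, 160, 319, and 479 of a 480-row image as scan lines.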
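The inverse relation described above — more scan lines where past parallax is small (distant scene), fewer where it is large (nearby scene) — can be sketched as follows. The constants `d_near`, `min_lines`, and `max_lines` are illustrative tunables, not values from the disclosure.

```python
def scan_line_count(mean_disparity_px, d_near=64.0, min_lines=8, max_lines=64):
    """Map a past mean disparity (pixels) to a scan-line count.

    Small disparity (far scene) -> many scan lines; large disparity
    (near scene) -> few scan lines, per the inverse relation above.
    """
    ratio = min(max(mean_disparity_px / d_near, 0.0), 1.0)  # 0 = far, 1 = near
    return round(max_lines - ratio * (max_lines - min_lines))
```

A region whose past disparity averaged zero would get the full 64 scan lines, while a region at or beyond `d_near` pixels of disparity would get only 8.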
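The sparse-density rule above — the larger the past parallax, the greater the proportion of pixels thinned out — can likewise be sketched as a pixel stride. All names and constants here are assumptions for illustration only.

```python
def thinning_stride(mean_disparity_px, d_near=64.0, max_stride=8):
    """Map a past mean disparity to a thinning stride.

    Larger disparity (nearer scene) -> larger stride, i.e. a greater
    proportion of pixels thinned out from the stereo camera image.
    """
    ratio = min(max(mean_disparity_px / d_near, 0.0), 1.0)
    return 1 + round(ratio * (max_stride - 1))

def calculation_target_pixels(row_pixels, stride):
    """Keep every `stride`-th pixel; the kept pixels are the calculation targets."""
    return row_pixels[::stride]
```

With a stride of 4, only every fourth pixel of a row is matched, cutting the per-row cost roughly fourfold for nearby regions where coarse sampling still resolves distance adequately.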

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of Optical Distance (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

Provided is an image processing device that calculates parallax based on a stereo camera image captured by a stereo camera, comprising: a condition setting unit that sets a condition for determining a calculation target pixel in the stereo camera image; and a parallax calculation unit that determines the calculation target pixel in the stereo camera image based on the condition set by the condition setting unit, and calculates the parallax using information on the calculation target pixel.
PCT/JP2019/033762 2018-09-10 2019-08-28 Image processing device WO2020054429A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201980049733.5A CN112513572B (zh) Image processing device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-168862 2018-09-10
JP2018168862A JP7066580B2 (ja) Image processing device

Publications (1)

Publication Number Publication Date
WO2020054429A1 (fr)

Family

ID=69777549

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/033762 WO2020054429A1 (fr) Image processing device

Country Status (3)

Country Link
JP (1) JP7066580B2 (fr)
CN (1) CN112513572B (fr)
WO (1) WO2020054429A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001266128A * 2000-03-21 2001-09-28 Nippon Telegr & Teleph Corp <Ntt> Depth information acquisition method and device, and recording medium storing a depth information acquisition program
JP2004279031A * 2003-03-12 2004-10-07 Toyota Central Res & Dev Lab Inc Distance distribution detection device and distance distribution detection method
JP2008241273A * 2007-03-26 2008-10-09 Ihi Corp Laser radar device and control method thereof

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004309868A * 2003-04-08 2004-11-04 Sony Corp Imaging device and stereoscopic video generation device
WO2012111755A1 * 2011-02-18 2012-08-23 Sony Corporation Image processing device and image processing method
WO2014097734A1 * 2012-12-19 2014-06-26 FUJIFILM Corporation Image processing device, imaging device, image processing method, and image processing program
CN104537668B * 2014-12-29 2017-08-15 Zhejiang Uniview Technologies Co., Ltd. Fast parallax image calculation method and device
CN109643437B * 2016-08-23 2023-01-10 Hitachi, Ltd. Image processing device, stereo camera device, and image processing method


Also Published As

Publication number Publication date
JP7066580B2 (ja) 2022-05-13
CN112513572B (zh) 2022-10-18
JP2020041891A (ja) 2020-03-19
CN112513572A (zh) 2021-03-16

Similar Documents

Publication Publication Date Title
RU2529594C1 (ru) Calibration device, distance measurement system, calibration method, and calibration program
JP5404263B2 (ja) Parallax calculation method and parallax calculation device
JP6589926B2 (ja) Object detection device
US9898669B2 (en) Traveling road surface detection device and traveling road surface detection method
JP5752618B2 (ja) Stereo parallax calculation device
KR102015706B1 (ko) Apparatus and method for measuring cracks in concrete surfaces
CN105335955A (zh) Object detection method and object detection device
JP6566768B2 (ja) Information processing device, information processing method, and program
JP6515650B2 (ja) Calibration device, distance measurement device, and calibration method
JP2006090896A (ja) Stereo image processing device
JP2019207456A (ja) Geometric transformation matrix estimation device, geometric transformation matrix estimation method, and program
JP2013174494A (ja) Image processing device, image processing method, and vehicle
JP2008309637A (ja) Obstacle measurement method, obstacle measurement device, and obstacle measurement system
TWI571099B (zh) Depth estimation device and method
JP2015230703A (ja) Object detection device and object detection method
JP2009092551A (ja) Obstacle measurement method, obstacle measurement device, and obstacle measurement system
JP7136721B2 (ja) Arithmetic device and parallax calculation method
JP2018092547A (ja) Image processing device, image processing method, and program
JP2021051347A (ja) Distance image generation device and distance image generation method
WO2020054429A1 (fr) Image processing device
JPWO2017090097A1 (ja) External environment recognition device for vehicles
JPH1096607A (ja) Object detection device and plane estimation method
WO2020095549A1 (fr) Imaging device
JP6936557B2 (ja) Search processing device and stereo camera device
JP5579297B2 (ja) Parallax calculation method and parallax calculation device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19859413

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19859413

Country of ref document: EP

Kind code of ref document: A1