CN112513572B - Image processing apparatus - Google Patents

Image processing apparatus

Info

Publication number: CN112513572B
Application number: CN201980049733.5A
Authority: CN (China)
Prior art keywords: parallax, pixel, calculation, processing apparatus, stereo camera
Legal status: Active
Other versions: CN112513572A
Other languages: Chinese (zh)
Inventors: 内田裕介, 稻田圭介, 野中进一
Original and current assignee: Hitachi Astemo Ltd

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00: Measuring distances in line of sight; optical rangefinders
    • G01C3/02: Details
    • G01C3/06: Use of electric means to obtain final indication
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/50: Depth or shape recovery
    • G06T7/55: Depth or shape recovery from multiple images
    • G06T7/593: Depth or shape recovery from multiple images from stereo images

Abstract

The present invention provides an image processing apparatus for calculating parallax based on a stereo camera image captured by a stereo camera, including: a condition setting unit that sets a condition for determining a pixel to be calculated in the stereo camera image; and a parallax calculation unit that determines the calculation target pixel in the stereo camera image according to the condition set by the condition setting unit, and calculates the parallax using information of the calculation target pixel.

Description

Image processing apparatus
Technical Field
The present invention relates to an image processing apparatus.
Background
A position measuring device is known which measures the distance to an object in a camera's field of view by the principle of triangulation, using a stereo image captured by a stereo camera having a pair of cameras. Triangulation calculates the distance from the cameras to an object using the deviation in the positions (the parallax) of the images of that object as captured by the left and right cameras. Deriving the parallax amounts to determining where the image of an object that appears in one image is located in the other image.
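The triangulation relation described above can be sketched numerically. The following is a minimal illustration only, not from the patent: the function name, `f_px` (focal length in pixels), and `baseline_m` (separation between the two cameras, in meters) are all assumptions for the example.

```python
# Illustrative sketch of the triangulation principle described above.
# All names and numbers here are assumptions for the example, not values
# from the patent: f_px is the focal length in pixels and baseline_m the
# distance between the left and right cameras in meters.

def distance_from_disparity(disparity_px: float, f_px: float, baseline_m: float) -> float:
    """Z = f * B / d: the larger the parallax d, the closer the object."""
    if disparity_px <= 0:
        raise ValueError("parallax must be positive for a finite distance")
    return f_px * baseline_m / disparity_px

# A near object produces a large parallax, a far object a small one.
near_m = distance_from_disparity(64.0, f_px=1000.0, baseline_m=0.35)  # 5.46875 m
far_m = distance_from_disparity(4.0, f_px=1000.0, baseline_m=0.35)    # 87.5 m
```

This inverse relation between parallax and distance is what the embodiments below rely on when they treat a large previous-frame parallax as indicating a nearby image.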
Various methods have been proposed for deriving the parallax. A well-known conventional method is block matching, in which, for a region composed of a plurality of pixels in one image, the region with the lowest degree of dissimilarity is searched for in the other image. As a method that reduces mismatching, and thereby improves accuracy, by incorporating into the matching process not only local information but information from the entire image, a method called SGM (Semi-Global Matching), for example, has been proposed. Such a method can be expected to achieve high accuracy, but has the problem that the amount of calculation increases compared with the conventional method.
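The conventional block-matching search mentioned above can be sketched as follows. This is a hedged illustration under assumptions not in the patent: grayscale images as nested lists, a sum-of-absolute-differences (SAD) dissimilarity, and rectified images so the search runs horizontally only. SGM itself additionally aggregates matching costs along paths through the whole image and is not reproduced here.

```python
# Sketch of conventional block matching with a SAD dissimilarity.
# Images are nested lists of grayscale values; names are illustrative.

def sad(left, right, x_l, x_r, y, block=3):
    """Sum of absolute differences between two block x block regions."""
    total = 0
    for dy in range(block):
        for dx in range(block):
            total += abs(left[y + dy][x_l + dx] - right[y + dy][x_r + dx])
    return total

def match_disparity(left, right, x, y, max_disp, block=3):
    """Return the disparity whose region in the other image is least dissimilar."""
    best_d, best_cost = 0, float("inf")
    for d in range(min(max_disp, x) + 1):
        cost = sad(left, right, x, x - d, y, block)
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d
```

Because the search considers only the local block, repetitive texture can produce mismatches; that is the weakness that global methods such as SGM address at the cost of extra computation.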
For example, patent document 1 discloses a technique for suppressing the amount of calculation in parallax calculation using a stereo camera. Patent document 1 discloses an obstacle measurement method that calculates a parallax from stereoscopic images and measures the distance to an obstacle from it. In this method, based on distance information obtained from a low-resolution image, the sampling interval of the pixels subjected to the correlation operation for calculating the parallax in a high-resolution image is set large at short distances and small at long distances.
Documents of the prior art
Patent document
Patent document 1: japanese patent application laid-open No. 2008-298533
Disclosure of Invention
Problems to be solved by the invention
With the method of patent document 1, by setting an appropriate sampling interval, highly accurate distance information can be acquired only for the regions that need it, unnecessary computation can be suppressed, and a reduction in computation time and in the circuit scale of the device can be achieved. However, since the parallax must be calculated from both the low-resolution image and the high-resolution image, the total amount of calculation cannot be reduced, and there is room for further improvement in suppressing the amount of calculation.
Means for solving the problems
An image processing apparatus according to the present invention is an image processing apparatus that calculates parallax based on a stereo camera image captured by a stereo camera, and includes: a condition setting unit that sets a condition for determining a pixel to be calculated in the stereo camera image; and a parallax calculation unit that determines the calculation target pixel in the stereo camera image according to the condition set by the condition setting unit, and calculates the parallax using information of the calculation target pixel.
Effects of the invention
According to the present invention, the amount of calculation required to calculate the parallax can be suppressed.
Drawings
Fig. 1 is a block diagram showing a basic configuration of an image processing apparatus according to an embodiment of the present invention.
Fig. 2 is a block diagram showing a configuration of an image processing apparatus according to a first embodiment of the present invention.
Fig. 3 is a flowchart showing an algorithm of parallax calculation in the image processing apparatus according to the first embodiment of the present invention.
Fig. 4 is a block diagram showing a configuration of an image processing apparatus according to a second embodiment of the present invention.
Fig. 5 is a flowchart showing an algorithm of parallax calculation in the image processing apparatus according to the second embodiment of the present invention.
Fig. 6 is a block diagram showing a configuration of an image processing apparatus according to a third embodiment of the present invention.
Fig. 7 is a diagram showing an example of determination of the number of scanning lines in the image processing apparatus according to the third embodiment of the present invention.
Fig. 8 is a block diagram showing a configuration of an image processing apparatus according to a fourth embodiment of the present invention.
Fig. 9 is a block diagram showing a configuration of an image processing apparatus according to a fifth embodiment of the present invention.
Fig. 10 is an explanatory diagram of thinning out of pixels for smoothing calculation in the image processing apparatus according to the fifth embodiment of the present invention.
Fig. 11 is an explanatory diagram of the arrangement of pixels used for smoothing calculation in the image processing apparatus according to the fifth embodiment of the present invention.
Fig. 12 is a block diagram showing a configuration of an image processing apparatus according to a sixth embodiment of the present invention.
Detailed Description
(basic structure)
First, a basic structure of an embodiment of the present invention will be described. Fig. 1 is a block diagram showing a basic configuration of an image processing apparatus according to an embodiment of the present invention. The image processing apparatus 10 shown in fig. 1 is connected to the camera 20 and the processor 30, and includes a condition setting unit 101 and a parallax calculation unit 102. The image processing apparatus 10 is mounted in a vehicle, for example, and is used to detect and recognize objects such as other vehicles and obstacles present around the vehicle.
The camera 20 is a stereo camera having left and right cameras arranged at a predetermined interval; it captures a pair of left and right stereo camera images and outputs the captured data to the image processing apparatus 10. The image processing apparatus 10 calculates the parallax in the stereo camera images with the parallax calculation unit 102 based on the image data input from the camera 20. At this time, the parallax calculation unit 102 determines calculation target pixels in the stereo camera images according to the condition set by the condition setting unit 101, and calculates the parallax using the information of those calculation target pixels, for example by the arithmetic method called SGM. The parallax may also be calculated by an arithmetic method other than SGM.
The calculation result of the parallax obtained by the parallax calculation unit 102 is output from the image processing apparatus 10 to the processor 30. The processor 30 executes processes such as object detection and object recognition using information of parallax input from the image processing apparatus 10.
Next, embodiments of the present invention using the above-described basic configuration will be described.
(first embodiment)
Fig. 2 is a block diagram showing a configuration of an image processing apparatus according to a first embodiment of the present invention. As shown in fig. 2, the image processing apparatus 10 of the present embodiment includes an FPGA11 and a memory 12. The FPGA11 is an arithmetic processing device configured by combining a large number of logic circuits, and has a condition setting unit 101 and a parallax calculation unit 102 described in fig. 1 as functions thereof. The condition setting unit 101 and the parallax calculation unit 102 may be implemented in the image processing apparatus 10 by using a logic circuit other than the FPGA11, for example, an ASIC, or software executed by a CPU. The memory 12 is a storage device configured using a readable and writable recording medium such as a RAM, HDD, or flash memory, and includes, as functions thereof, a stereo camera captured image storage unit 103 and a parallax image storage unit 104.
The captured image data representing the stereo camera images input from the camera 20 to the image processing apparatus 10 is stored frame by frame in the stereo camera captured image storage unit 103 in the memory 12. The parallax calculation unit 102 in the FPGA11 reads the pixel data of each frame's stereo camera images from the stereo camera captured image storage unit 103 and performs the parallax calculation on it, generating for each frame a parallax image that indicates the parallax in pixel units. The parallax image generated by the parallax calculation unit 102 is output to the processor 30 and stored in the parallax image storage unit 104 in the memory 12.
The parallax calculation unit 102 performs a smoothing calculation and a dissimilarity calculation in the parallax calculation using the stereo camera images. The smoothing calculation is an arithmetic process that averages the pixel value of each pixel in a stereo camera image with the pixel values of a plurality of surrounding pixels, thereby smoothing out the variation in pixel values between pixels. The dissimilarity calculation is an arithmetic process that obtains the degree of dissimilarity, that is, a measure indicating how much a region shown in one of the stereo camera images differs from the corresponding region shown in the other image.
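As an illustration only, the smoothing calculation described above might look like the following sketch, which averages a target pixel with pixels lying on scanning lines (directions) radiating from it. The direction set, the line length, and the function names are assumptions for the example; the patent does not specify them.

```python
# Hedged sketch of a scanning-line smoothing calculation: the target pixel
# is averaged with pixels along up to 8 assumed directions. Direction set
# and line length are illustrative assumptions, not taken from the patent.

DIRECTIONS_8 = [(-1, -1), (0, -1), (1, -1), (-1, 0), (1, 0), (-1, 1), (0, 1), (1, 1)]

def smooth_pixel(img, x, y, num_lines=8, length=2):
    """Average img[y][x] with pixels along the first num_lines directions."""
    h, w = len(img), len(img[0])
    total, count = img[y][x], 1
    for dx, dy in DIRECTIONS_8[:num_lines]:
        for step in range(1, length + 1):
            nx, ny = x + dx * step, y + dy * step
            if 0 <= nx < w and 0 <= ny < h:  # ignore pixels outside the image
                total += img[ny][nx]
                count += 1
    return total / count
```

Reducing `num_lines` for some pixels directly reduces the number of calculation target pixels per smoothing operation, which is the lever the condition setting unit uses in the embodiments below.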
In the present embodiment, the condition setting unit 101 includes a line number determination unit 111. The line number determination unit 111 reads the parallax image generated by the parallax calculation unit 102 from the stereoscopic camera image of the previous frame and stored in the parallax image storage unit 104, and determines the number of scanning lines used for the smoothing process performed by the parallax calculation unit 102 based on the parallax image. That is, the line count determination unit 111 of the present embodiment determines the number of scanning lines for scanning the stereoscopic camera image of the current frame based on the parallax calculated by the parallax calculation unit 102 in the past. Then, the parallax calculating unit 102 is notified of the determined number of scanning lines. The condition setting unit 101 in the present embodiment performs condition setting in the parallax calculation unit 102 by notifying the number of scanning lines determined by the line number determination unit 111 in this manner.
When notified of the number of scanning lines by the line number determination unit 111, the parallax calculation unit 102 sets the notified number of scanning lines for each pixel of the current frame's stereo camera image read from the stereo camera captured image storage unit 103. It then performs the smoothing calculation using the pixels of the stereo camera image located on each of the set scanning lines as calculation target pixels. Next, the degree of dissimilarity is calculated for each pixel of the stereo camera image after the smoothing calculation, and the parallax of each pixel is calculated based on the calculation result. The parallax calculation unit 102 outputs the calculation result of the parallax in pixel units obtained in this way to the processor 30 as a parallax image, and stores the parallax image in the parallax image storage unit 104.
Fig. 3 is a flowchart showing an algorithm of parallax calculation in the image processing apparatus according to the first embodiment of the present invention. The parallax calculation shown in the flowchart of fig. 3 is performed in the FPGA11 when the stereo camera image newly captured by the camera 20 is stored in the stereo camera captured image storage unit 103 as the stereo camera image of the current frame.
In step S10, the parallax calculation unit 102 reads the stereo camera image of the current frame from the stereo camera captured image storage unit 103, and selects pixels sequentially, starting from a pixel at the edge of the image.
In step S20, the line number determination unit 111 refers to the parallax of the pixel selected in step S10, reading it from the parallax image that the parallax calculation unit 102 calculated from the previous frame and that is stored in the parallax image storage unit 104.
In step S30, the line count determination unit 111 determines whether or not the parallax referred to in step S20 is larger than a predetermined threshold value dth. If the parallax is larger than the threshold value dth, the process proceeds to step S40, and if the parallax is not larger than the threshold value dth, the process proceeds to step S50.
In step S40, the line count determination unit 111 determines that the pixel selected in step S10 corresponds to a short-distance image, and sets the number of scanning lines corresponding to the pixel to be small. In the present embodiment, the number of scanning lines is determined to be 4, for example.
In step S50, the line number determination unit 111 determines that the pixel selected in step S10 corresponds to a distant image, and sets the number of scanning lines corresponding to the pixel to be large. In the present embodiment, the number of scanning lines is determined to be 8, for example.
After the number of scanning lines is determined by the line number determination unit 111 in step S40 or S50, the parallax calculation unit 102 performs, in step S60, the parallax calculation on the pixel selected in step S10 based on the determined number of scanning lines. Here, the smoothing calculation based on the determined number of scanning lines is performed on the pixel, and the dissimilarity is then calculated and optimized to obtain the parallax. In this way, the parallax in the stereo camera image of the current frame is calculated in pixel units.
In step S70, the parallax calculation unit 102 determines whether or not the calculation for all the pixels has been completed. If there is a pixel for which no operation has been performed, the process returns to step S10 to select the next pixel, and the above-described process is repeated. When the calculation for all the pixels is completed, the obtained parallax image of the current frame is stored in the parallax image storage unit 104, and then the parallax calculation shown in the flowchart of fig. 3 is completed.
In steps S40 and S50 above, the set numbers of scanning lines were given as 4 and 8, respectively, but these are examples and do not limit the present invention. Further, a plurality of values may be set for the threshold value dth used for the determination in step S30, and three or more different numbers of scanning lines may be set by comparing these values with the parallax obtained in the previous frame. In either case, it is preferable that the smaller the parallax value obtained in the previous frame, that is, the longer the distance to the imaged object, the larger the number of scanning lines.
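The per-pixel decision of steps S30 through S50 can be sketched as below. The threshold value used here is an assumed placeholder for dth; the line counts 4 and 8 follow the example in the text, while the three-level variant at the end is a hypothetical illustration of the multi-threshold extension.

```python
# Sketch of the first embodiment's decision: compare a pixel's previous-frame
# parallax with threshold dth and choose fewer scanning lines for near pixels
# (large parallax), more for far pixels (small parallax). The numeric
# threshold values below are illustrative assumptions.

def decide_num_lines(prev_parallax: float, dth: float = 16.0) -> int:
    """Steps S30-S50: 4 lines for a near pixel, 8 for a far one."""
    return 4 if prev_parallax > dth else 8

def decide_num_lines_multi(prev_parallax: float,
                           thresholds=(32.0, 16.0),
                           lines=(4, 6, 8)) -> int:
    """Hypothetical 3-level variant: the smaller the parallax, the more lines."""
    for t, n in zip(thresholds, lines):
        if prev_parallax > t:
            return n
    return lines[-1]
```

Fewer scanning lines for nearby pixels is the source of the computation saving: near objects produce large parallaxes that remain usable even with coarser smoothing support.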
(second embodiment)
Next, a second embodiment of the present invention will be described. In the present embodiment, an example will be described in which the distance range for each region obtained based on the result of the object detection performed by the processor 30 is used when the number of scanning lines is determined by the line number determination unit 111.
Fig. 4 is a block diagram showing a configuration of an image processing apparatus according to a second embodiment of the present invention. As shown in fig. 4, the image processing apparatus 10 according to the present embodiment differs from the first embodiment described with reference to fig. 2 in that the parallax image storage unit 104 is not provided in the memory 12, and in that region information output from the processor 30 is input to the line number determination unit 111. The region information is information set for each object based on the result of the object detection performed in the processor 30, and includes information indicating which region within the stereo camera image the object corresponds to, and information indicating into which distance range, short distance or long distance, the distance to the object is classified.
In the present embodiment, the line number determination unit 111 receives the region information that the processor 30 outputs based on the parallax calculated from the previous frame's stereo camera image, and determines the number of scanning lines used for the smoothing calculation performed by the parallax calculation unit 102 based on the distance-range information for each region contained in the region information. That is, the line number determination unit 111 of the present embodiment acquires information on distance ranges determined based on the parallax previously calculated by the parallax calculation unit 102, and determines the number of scanning lines based on the acquired distance-range information. The other points are the same as in the first embodiment.
Fig. 5 is a flowchart showing an algorithm of parallax calculation in the image processing apparatus according to the second embodiment of the present invention. The parallax calculation shown in the flowchart of fig. 5 is performed in the FPGA11 when the stereo camera image newly captured by the camera 20 is stored in the stereo camera captured image storage unit 103 as the stereo camera image of the current frame.
In step S10A, the line number determination unit 111 reads from the processor 30 the determination result of the distance range for each region, as determined by the processor 30 based on the parallax corresponding to the previous frame's stereo camera image. Here, by acquiring the above-described region information output from the processor 30, it reads information indicating whether the distance range of each region, corresponding to each object detected from the previous frame's stereo camera image, is a short distance or a long distance.
In step S20A, the parallax calculation unit 102 reads the stereo camera image of the current frame from the stereo camera captured image storage unit 103, and selects pixels sequentially, starting from a pixel at the edge of the image.
In step S30A, the line number determination unit 111 determines, from among the previous-frame distance-range determination results read in step S10A, whether the object detected in the previous frame at the position of the pixel selected in step S20A lies within a short-distance region. If the determination result of the distance range is a short distance, it is determined that the object was located in a short-distance region in the previous frame, and the process proceeds to step S40; if the determination result is a long distance, it is determined that the object was not located in a short-distance region in the previous frame, and the process proceeds to step S50.
In step S40, the line count determination unit 111 determines that the pixel selected in step S20A corresponds to a short-distance image, and sets the number of scanning lines corresponding to the pixel to be small. In the present embodiment, as in the first embodiment, the number of scanning lines is determined to be 4, for example.
In step S50, the line count determination unit 111 determines that the pixel selected in step S20A corresponds to a distant image, and sets the number of scanning lines corresponding to the pixel to be large. In the present embodiment, as in the first embodiment, the number of scanning lines is determined to be 8, for example.
After the number of scanning lines is determined by the line number determination unit 111 in step S40 or S50, the parallax calculation unit 102 performs the same processing from step S60 onward as in the flowchart of fig. 3 described in the first embodiment. That is, in step S60, the parallax calculation based on the determined number of scanning lines is performed on the pixel selected in step S20A, and in step S70 it is determined whether the calculation has been completed for all pixels. If there is a pixel for which the calculation has not yet been performed, the process returns to step S20A to select the next pixel, and the processing is repeated. When the calculation for all pixels is completed, the parallax calculation shown in the flowchart of fig. 5 ends.
In the present embodiment, as in the first embodiment, the numbers of scanning lines set in steps S40 and S50 are not limited to 4 and 8. Further, the distance range of each region indicated by the region information output from the processor 30 may have three or more categories, so that three or more different numbers of scanning lines can be set. In either case, it is preferable that the farther away an object exists, the larger the number of scanning lines set for the pixels corresponding to that object.
In the above description, an example was described in which the region information obtained from the previous frame's stereo camera image is used as the information for determining the number of scanning lines, but the present invention is not limited to this. For example, the region information of the current frame may be predicted from the region information of the previous frame and of the frame before it, and then used. Furthermore, the region information predicted for the current frame may be corrected based on traveling information of the vehicle in which the image processing apparatus 10 is mounted, for example information on turning.
Whereas in the first embodiment the number of scanning lines is determined based on parallax information in units of a pixel or of the pixel group around it, in the present embodiment the number of scanning lines is determined using distance-range information in units of the regions obtained by object detection. Therefore, compared with the first embodiment, there is the advantage that the number of scanning lines can be determined based on information from a wider area.
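A sketch of how the second embodiment's region information might drive the line-count choice follows. The data layout is an assumption for illustration: each region is represented as a bounding box with a near/far flag, and treating pixels outside every detected region as distant (more lines) is likewise an assumed default, not stated in the text.

```python
# Sketch of region-based line-count selection. The (x0, y0, x1, y1, is_near)
# record layout and the fallback for uncovered pixels are illustrative
# assumptions about the region information supplied by the processor.

def lines_for_pixel(x, y, regions, near_lines=4, far_lines=8):
    """Pixels inside a near region get fewer scanning lines."""
    for (x0, y0, x1, y1, is_near) in regions:
        if x0 <= x < x1 and y0 <= y < y1:
            return near_lines if is_near else far_lines
    return far_lines  # assumed default: no detected object, use more lines

regions = [(0, 0, 100, 80, True),      # e.g. a nearby vehicle
           (200, 40, 260, 90, False)]  # e.g. a distant vehicle
```

Because one region covers many pixels, a single near/far decision from the object detector amortizes over the whole bounding box, which is the wider-area benefit noted above.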
(third embodiment)
Next, a third embodiment of the present invention will be described. In the present embodiment, an example in which the result of the object detection performed by the processor 30 is used when the number of scanning lines is determined by the line number determination unit 111 will be described.
Fig. 6 is a block diagram showing a configuration of an image processing apparatus according to a third embodiment of the present invention. As shown in fig. 6, the image processing apparatus 10 of the present embodiment differs from the second embodiment described with reference to fig. 4 in that information of the object detection result output from the processor 30 is input to the line count determining unit 111. This information is information indicating the result of object detection in the processor 30, and includes information indicating for which region within the stereo camera image an object has been detected.
In the present embodiment, the line number determination unit 111 receives the information on the object detection result that the processor 30 outputs based on the parallax calculated from the previous frame's stereo camera image, and determines the number of scanning lines used for the smoothing calculation performed by the parallax calculation unit 102 based on that information. For example, a smaller number of lines is used for pixels belonging to the image of an object that has already been detected, and a larger number of lines for pixels where no object has been detected. That is, the line number determination unit 111 of the present embodiment acquires information on objects detected based on the parallax previously calculated by the parallax calculation unit 102, and determines the number of scanning lines based on the acquired object information. The other points are the same as in the first and second embodiments.
Fig. 7 is a diagram showing an example of determination of the number of scanning lines in the image processing apparatus according to the third embodiment of the present invention. As shown in fig. 7, in the parallax calculation for the pixels in the region of the image 901 of a vehicle for which an object was already detected in the previous frame, the number of lines 911 is set small even though the vehicle is at a long distance. On the other hand, for the image 902 of a vehicle for which no object was detected in the previous frame, the number of lines 912 is set large, so that a more accurate parallax calculation is performed. In addition, a smaller number of lines 913 is set for the road surface 903, for which an object has already been detected.
In the present embodiment, by setting the number of scanning lines using the result of object detection in the previous frame, it is possible to perform parallax calculation with higher accuracy for an image for which distance information has not been obtained in the past.
(fourth embodiment)
Next, a fourth embodiment of the present invention will be described. In the present embodiment, an example will be described in which distance information measured by another sensor is used when the number of scanning lines is determined in the line number determination unit 111.
Fig. 8 is a block diagram showing a configuration of an image processing apparatus according to a fourth embodiment of the present invention. As shown in fig. 8, the image processing apparatus 10 of the present embodiment differs from the second and third embodiments described with reference to figs. 4 and 6, respectively, in that distance information output from a LIDAR (laser radar) 40, an example of a sensor capable of acquiring distance information, is input to the line number determination unit 111. The LIDAR40 shown in fig. 8 is only one example of another sensor capable of acquiring distance information, and the present invention is not limited to it. For example, distance information output from an infrared depth sensor, an ultrasonic sensor, a millimeter wave radar, another camera image, or the like may also be used.
In the present embodiment, the line number determination unit 111 receives the distance information output from the LIDAR40 for the range of the stereo camera image captured by the camera 20, and, as in the first embodiment, determines the number of scanning lines used for the smoothing calculation performed by the parallax calculation unit 102 based on that distance information. That is, the distance of each pixel indicated by the distance information is compared with a predetermined threshold value, and the number of scanning lines is set large for pixels whose distance is greater than the threshold and small for pixels whose distance is less than or equal to the threshold. The other points are the same as in the first to third embodiments.
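The fourth embodiment's rule might be sketched as below, assuming, as in the first embodiment, that distant pixels receive more scanning lines. The threshold value and line counts are illustrative assumptions, and the distance map is a plain nested list standing in for the per-pixel sensor output.

```python
# Sketch of distance-driven line-count selection from an external sensor.
# z_th, near_lines, and far_lines are assumed example values; dist_map is a
# nested list of per-pixel distances standing in for the LIDAR output.

def lines_from_distance_map(dist_map, z_th=20.0, near_lines=4, far_lines=8):
    """Return a per-pixel line-count map: far pixels (> z_th) get more lines."""
    return [[far_lines if z > z_th else near_lines for z in row]
            for row in dist_map]
```

Using an external sensor removes the dependence on a previous parallax result, so the condition can be set even for the very first frame.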
(fifth embodiment)
Next, a fifth embodiment of the present invention will be described. In the present embodiment, an example of an image processing apparatus will be described that includes, instead of the line number determination unit 111 described in the first to fourth embodiments, a density determination unit 112 that determines a density indicating the proportion of pixels to be thinned out in the smoothing calculation.
Fig. 9 is a block diagram showing a configuration of an image processing apparatus according to a fifth embodiment of the present invention. As shown in fig. 9, the image processing apparatus 10 of the present embodiment differs from the first embodiment described with reference to fig. 2 in that the condition setting unit 101 of the FPGA11 includes a density determining unit 112 instead of the line number determining unit 111. The density determining unit 112 reads out the parallax image generated by the parallax calculating unit 102 from the stereo camera image of the previous frame and stored in the parallax image storage unit 104, and determines the density in the smoothing process performed by the parallax calculating unit 102 based on the parallax image. That is, the density determination unit 112 of the present embodiment determines the density based on the parallax calculated by the parallax calculation unit 102 in the past. Then, the disparity calculating unit 102 is notified of the determined density. The condition setting unit 101 in the present embodiment performs condition setting in the parallax calculation unit 102 by notifying the density determined by the density determination unit 112 in this manner.
When notified of the density by the density determination unit 112, the parallax calculation unit 102 thins out pixels of the current frame's stereo camera image, read from the stereo camera captured image storage unit 103, at a rate corresponding to the notified density. It then performs the smoothing calculation using the remaining pixels of the stereo camera image, those not thinned out, as calculation target pixels. Then, as in the first embodiment, the degree of dissimilarity is calculated for each pixel of the stereo camera image after the smoothing calculation, and the parallax of each pixel is calculated based on the calculation result. The parallax calculation unit 102 outputs the calculation result of the parallax in pixel units obtained in this way to the processor 30 as a parallax image, and stores the parallax image in the parallax image storage unit 104.
Fig. 10 is an explanatory diagram of thinning out pixels for smoothing calculation in the image processing apparatus according to the fifth embodiment of the present invention. Fig. 10 (a) is a diagram showing an example of the pixels used for smoothing calculation of a pixel corresponding to a distant part of the image. As shown in fig. 10 (a), for a pixel determined to correspond to a distant part of the image because the parallax appearing in the parallax image of the previous frame is small, the pixels on each scanning line are not thinned out, and a dense smoothing operation is performed over a narrow range of the image. On the other hand, fig. 10 (b) is a diagram showing an example of the pixels used for smoothing calculation of a pixel corresponding to a nearby part of the image. As shown in fig. 10 (b), for a pixel determined to correspond to a nearby part of the image because the parallax appearing in the parallax image of the previous frame is large, the pixels on each scanning line are thinned out, and the smoothing operation is performed using a small number of pixels over a wide range of the image. As described above, the density determination unit 112 of the present embodiment determines the density so that the larger the parallax value obtained in the previous frame, that is, the closer the corresponding part of the image, the larger the proportion of pixels thinned out from the stereo camera image.
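One way to realize this parallax-to-density mapping is a simple linear interpolation between a full density for distant pixels and a sparse density for nearby ones. The threshold and endpoint values below are illustrative assumptions; the patent does not specify concrete numbers:

```python
def density_for_pixel(prev_parallax, d_near=32.0, full=1.0, sparse=0.25):
    """Map the previous frame's parallax to a smoothing density.

    Large parallax  -> near object    -> thin out more (lower density).
    Small parallax  -> distant object -> keep all pixels (density `full`).
    `d_near` is the (assumed) parallax at which the sparsest density applies.
    """
    ratio = min(prev_parallax / d_near, 1.0)
    return full - (full - sparse) * ratio
```

The mapping is monotone: as the past parallax grows, the density falls, so the proportion of thinned-out pixels rises, matching the behavior of fig. 10 (a) versus fig. 10 (b).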
Fig. 10 illustrates an example in which the pixels used for the smoothing operation are arranged on scanning lines, but they need not be arranged on lines. Fig. 11 is an explanatory diagram of the arrangement of pixels used for smoothing calculation in the image processing apparatus according to the fifth embodiment of the present invention. Fig. 11 (a) shows an example in which the pixels used for smoothing calculation are arranged in an array on 8 scanning lines. On the other hand, fig. 11 (b) shows an example in which the pixels are arranged not on scanning lines but in a predetermined arrangement pattern. In both arrangement examples of fig. 11 (a) and 11 (b), the pixel thinning operation described above is performed based on the density determined by the density determination unit 112, and the smoothing operation is then performed; which pixels are thinned out at this time can be determined in advance. However, the arrangements shown in fig. 11 are examples, and the arrangement of pixels used for smoothing calculation is not limited to these.
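The two kinds of layout, on-line and predetermined non-line, can be expressed as precomputed offset sets relative to the target pixel. This is a hypothetical sketch (the diamond-ring pattern is an invented stand-in for fig. 11 (b), which the text does not describe in detail):

```python
def pattern_offsets(mode, radius=4, step=2):
    """Return (dy, dx) offsets of calculation target pixels around a centre pixel.

    'lines'  -- pixels lying on horizontal scanning lines through the window,
                thinned along each line by `step` (cf. fig. 11 (a)).
    'spread' -- a fixed non-line pattern, here a diamond ring (cf. fig. 11 (b));
                the actual predetermined pattern is an assumption.
    """
    if mode == "lines":
        return [(dy, dx) for dy in range(-radius, radius + 1)
                for dx in range(-radius, radius + 1, step)]
    return [(dy, dx) for dy in range(-radius, radius + 1)
            for dx in range(-radius, radius + 1)
            if abs(dy) + abs(dx) == radius]
```

Because the offsets are fixed per density, deciding in advance which pixels are thinned out, the FPGA can address them with a static schedule rather than computing the pattern per frame.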
In the present embodiment, an example has been described in which the density determination unit 112 determines the density based on the parallax calculated by the parallax calculation unit 102 in the past, in the same manner as the determination of the number of scanning lines by the line number determination unit 111 described in the first embodiment, but the present invention is not limited to this. For example, similarly to the determination of the number of scanning lines in the second embodiment, information of the distance range determined based on the parallax calculated by the parallax calculation unit 102 in the past may be acquired from the processor 30, and the density may be determined based on the acquired information of the distance range. In the same manner as the determination of the number of scanning lines in the third embodiment, information of an object detected based on the parallax calculated by the parallax calculation unit 102 in the past may be acquired from the processor 30, and the density may be determined based on the acquired information of the object. Alternatively, similarly to the determination of the number of scanning lines in the fourth embodiment, distance information corresponding to the range of the stereo camera image measured by another sensor may be acquired, and the density may be determined based on the acquired distance information. In addition, the density can be determined by any method.
(sixth embodiment)
Next, a sixth embodiment of the present invention will be described. In this embodiment, an example of an image processing apparatus in which the condition setting unit 101 includes the line number determination unit 111 described in the first to fourth embodiments and the density determination unit 112 described in the fifth embodiment will be described.
Fig. 12 is a block diagram showing a configuration of an image processing apparatus according to a sixth embodiment of the present invention. As shown in fig. 12, the image processing apparatus 10 of the present embodiment differs from the first embodiment described with reference to fig. 2 in that the condition setting unit 101 of the FPGA11 includes a density determination unit 112 in addition to the line number determination unit 111. The density determination unit 112 is the same as that described in the fifth embodiment. Although fig. 12 shows an example in which the memory 12 includes the parallax image storage unit 104, when the line number determination unit 111 and the density determination unit 112 determine the number of scanning lines and the density in the same manner as in the second to fourth embodiments, the parallax image storage unit 104 may be omitted from the memory 12. As described in the fourth embodiment, when the number of scanning lines and the density are determined using distance information measured by another sensor, the distance information from that sensor may be input to the line number determination unit 111 and the density determination unit 112.
According to the embodiments of the present invention described above, the following operational effects can be obtained.
(1) The image processing apparatus 10 calculates the parallax based on the stereoscopic camera image captured by the camera 20. The image processing apparatus 10 includes: a condition setting unit 101 for setting a condition for determining a pixel to be calculated in a stereo camera image; and a parallax calculation unit 102 that determines a calculation target pixel in the stereo camera image according to the condition set by the condition setting unit 101, and calculates a parallax using information of the calculation target pixel. Thus, the amount of calculation required to calculate the parallax can be suppressed.
(2) In each of the first to fourth and sixth embodiments, the condition setting unit 101 includes a line number determination unit 111 that determines the number of scanning lines for scanning the stereo camera image. The parallax calculation unit 102 sets the number of scan lines determined by the line number determination unit 111 for the stereo camera image, and determines pixels of the stereo camera image located on the scan lines as calculation target pixels. In this way, the calculation target pixel corresponding to the condition can be appropriately determined.
(3) In the first embodiment, the line number determination unit 111 determines the number of scanning lines based on the parallax calculated by the parallax calculation unit 102 in the past. In this case, the line number determination unit 111 increases the number of scanning lines as the parallax value decreases. In this way, the number of optimal scanning lines corresponding to the distance per pixel can be determined from the past parallax information.
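Effect (3) can be illustrated with a linear mapping from the past parallax to a scan-line count; the endpoint values below are assumptions for illustration, not values from the patent:

```python
def line_count_for_parallax(parallax, max_lines=8, min_lines=2, d_max=64.0):
    """Illustrative mapping: the smaller the past parallax (the farther the
    scene), the more scanning lines are set for the stereo camera image.
    `max_lines`, `min_lines`, and `d_max` are assumed parameters."""
    ratio = min(parallax / d_max, 1.0)
    return max(min_lines, round(max_lines - (max_lines - min_lines) * ratio))
```

Distant pixels (small parallax) thus receive the full complement of scanning lines for precise smoothing, while nearby pixels (large parallax) are scanned with fewer lines, reducing the calculation amount where coarse sampling suffices.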
(4) In the second embodiment, the line count determination unit 111 acquires information of a distance range determined based on the parallax calculated by the parallax calculation unit 102 in the past, and determines the number of scanning lines based on the acquired information of the distance range. In this way, the number of optimal scan lines corresponding to the distance of each object can be determined from the past parallax information.
(5) In the third embodiment, the line count determination unit 111 acquires information on objects detected based on the parallax calculated by the parallax calculation unit 102 in the past, and determines the number of scanning lines based on the acquired information on the objects. In this way, the optimum number of scanning lines can be determined for each object based on the past parallax information.
(6) In the fourth embodiment, the line count determination unit 111 acquires distance information corresponding to the range of the stereo camera image, and determines the number of scanning lines based on the acquired distance information. In this way, even when the parallax cannot be calculated from the previous stereo camera image, the number of optimal scan lines corresponding to the distance per pixel can be determined.
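When distance information comes from another sensor such as the LIDAR 40 rather than from past parallax, it can be converted into an expected parallax through the standard stereo relation d = f·B / Z, after which the same line-count or density decision applies. The focal length and baseline below are assumed example values:

```python
def expected_parallax(distance_m, focal_px=1000.0, baseline_m=0.3):
    """Stereo geometry: parallax d = f * B / Z, in pixels.

    `focal_px` (focal length in pixels) and `baseline_m` (camera baseline)
    are illustrative camera parameters.  With a sensor-measured distance Z,
    the condition can be set even before any parallax has been computed
    from a stereo camera image."""
    return focal_px * baseline_m / distance_m
```

For example, with the assumed parameters an object measured at 30 m corresponds to an expected parallax of about 10 pixels, which can then be fed to the scan-line-count or density decision.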
(7) In the fifth and sixth embodiments, the condition setting unit 101 includes a density determination unit 112 that determines a density indicating a ratio of pixels to be thinned out from the stereo camera image. The parallax calculation unit 102 determines, as a calculation target pixel, a pixel remaining after thinning out from the stereo camera image based on the density determined by the density determination unit 112. In this way, the calculation target pixel corresponding to the condition can be appropriately determined.
(8) In the fifth and sixth embodiments, the density determination unit 112 can determine the density based on the parallax calculated by the parallax calculation unit 102 in the past. In this case, the density determination unit 112 determines the density so that the larger the parallax value, the larger the proportion of pixels to be thinned out from the stereo camera image. In this way, the optimal density according to the distance of each pixel can be determined from the past parallax information.
(9) In the fifth and sixth embodiments, the density determination unit 112 may acquire information of a distance range determined based on the parallax calculated by the parallax calculation unit 102 in the past, and may determine the density based on the acquired information of the distance range. Further, information of an object detected based on the parallax calculated by the parallax calculation unit 102 in the past may be acquired, and the density may be determined based on the acquired information of the object. Further, distance information corresponding to the range of the stereo camera image may be acquired, and the density may be determined based on the acquired distance information. In this way, the optimum density can be determined according to the situation.
The embodiments of the present invention have been described above, but the present invention is not limited to the above embodiments, and various modifications can be made without departing from the scope of the claims. For example, although the image processing apparatus 10 mounted in a vehicle is described in each of the above embodiments, the present invention can be applied to an image processing apparatus used in another system.
The embodiment and the modification described above are merely examples, and the present invention is not limited to these contents as long as the features of the present invention are not impaired. While the various embodiments and modifications have been described above, the present invention is not limited to these. Other ways that can be considered within the scope of the technical idea of the present invention are also included within the scope of the present invention.
The disclosure of the following priority parent is incorporated herein by reference.
Japanese patent application 2018-168862 (filed in 2018 in 9 months and 10 days)
Description of the reference numerals
10 … image processing apparatus, 11 … FPGA, 12 … memory, 20 … camera, 30 … processor, 40 … LIDAR, 101 … condition setting unit, 102 … parallax calculation unit (SGM operation), 103 … stereo camera captured image storage unit, 104 … parallax image storage unit, 111 … line number determination unit, 112 … density determination unit.

Claims (13)

1. An image processing apparatus that calculates parallax based on a stereoscopic camera image captured by a stereoscopic camera, comprising:
a condition setting unit that sets, for each pixel of the stereo camera image, a condition for determining a plurality of calculation target pixels corresponding to the pixel; and
a parallax calculation unit that determines the plurality of calculation target pixels for each pixel of the stereo camera image according to the condition set by the condition setting unit, and calculates the parallax for each pixel of the stereo camera image using information of the plurality of calculation target pixels,
the parallax calculation unit performs smoothing calculation for averaging a pixel value of each pixel of the stereo camera image and each pixel value of the plurality of calculation target pixels corresponding to the pixel, performs calculation of a degree of dissimilarity for each pixel of the stereo camera image on which the smoothing calculation is performed, and calculates the parallax based on a calculation result of the degree of dissimilarity.
2. The image processing apparatus according to claim 1, characterized in that:
the condition setting unit includes a line number determination unit that determines the number of scan lines for scanning the stereo camera image in the smoothing calculation,
the parallax calculation unit sets the number of scanning lines determined by the line number determination unit for each pixel in the stereo camera image, and determines each pixel of the stereo camera image located on the scanning lines as the calculation target pixel.
3. The image processing apparatus according to claim 2, characterized in that:
the line number determination unit determines the number of scanning lines based on the parallax calculated by the parallax calculation unit in the past.
4. The image processing apparatus according to claim 3, characterized in that:
the line number determination unit increases the number of scanning lines as the parallax value decreases.
5. The image processing apparatus according to claim 2, characterized in that:
the line number determination unit acquires information of a distance range determined based on the parallax calculated by the parallax calculation unit in the past, and determines the number of scanning lines based on the acquired information of the distance range.
6. The image processing apparatus according to claim 2, characterized in that:
the line number determination unit acquires information on an object detected based on the parallax calculated by the parallax calculation unit in the past, and determines the number of scanning lines based on the acquired information on the object.
7. The image processing apparatus according to claim 2, characterized in that:
the line number determination unit acquires distance information corresponding to a range of the stereo camera image, and determines the number of scanning lines based on the acquired distance information.
8. The image processing apparatus according to any one of claims 1 to 7, characterized in that:
the condition setting unit includes a density determination unit configured to determine a density indicating a ratio of pixels to be thinned out from the stereo camera image in the smoothing calculation,
the parallax calculation unit determines, for each pixel of the stereo camera image, a pixel to be thinned out of pixels located around the pixel, based on the density determined by the density determination unit, and determines, as the calculation target pixel, each of the pixels remaining after the thinning-out target pixel is thinned out.
9. The image processing apparatus according to claim 8, characterized in that:
the density determination unit determines the density based on the parallax calculated by the parallax calculation unit in the past.
10. The image processing apparatus according to claim 9, characterized in that:
the density determination unit determines the density so that the larger the parallax value is, the larger the proportion of pixels to be thinned out from the stereo camera image is.
11. The image processing apparatus according to claim 8, characterized in that:
the density determination unit acquires information of a distance range determined based on the parallax calculated by the parallax calculation unit in the past, and determines the density based on the acquired information of the distance range.
12. The image processing apparatus according to claim 8, characterized in that:
the density determining unit acquires information on an object detected based on the parallax calculated by the parallax calculating unit in the past, and determines the density based on the acquired information on the object.
13. The image processing apparatus according to claim 8, characterized in that:
the density determining unit acquires distance information corresponding to a range of the stereo camera image, and determines the density based on the acquired distance information.
