WO2023139675A1 - Imaging device, parallax displacement correction method, and parallax displacement correction program - Google Patents
Imaging device, parallax displacement correction method, and parallax displacement correction program
- Publication number
- WO2023139675A1 (PCT/JP2022/001734)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- parallax
- captured image
- region
- value
- imaging
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
Definitions
- the present disclosure relates to an imaging device, a parallax deviation correction method, and a parallax deviation correction program.
- a stereo camera is known as a device for recognizing objects in three dimensions.
- a stereo camera detects the parallax between multiple cameras based on trigonometry by using differences in how images are captured by multiple cameras placed at different positions, and uses the parallax to detect the distance and position from the cameras to an object.
- a stereo camera is mounted on a vehicle, for example, and used as a camera that constitutes a driving support system.
- Patent Literature 1 discloses a technique of calculating the parallax at each of a road surface position near the vehicle (near road surface) and a road surface position far from the vehicle (far road surface position), and correcting the parallax deviation.
- the device disclosed in Patent Literature 1 presets the road surface positions used to calculate the parallax deviation, so the environments in which the parallax deviation can be corrected are limited. Also, since the texture of the road surface is monotonous, the parallax calculation can be unreliable. In particular, the reliability of the parallax calculation may be lowered on an indistinct road surface without white lines or other markings. Calculating the amount of parallax deviation from parallax with low reliability may lead to a decrease in the accuracy of the parallax deviation correction and, in turn, a decrease in the accuracy of distance measurement.
- the present disclosure proposes an imaging device, a parallax deviation correction method, and a parallax deviation correction program that can detect parallax deviation with high accuracy.
- An imaging device includes: a first imaging unit that acquires a first captured image; a second imaging unit that is arranged at a certain distance from the first imaging unit and acquires a second captured image; a parallax calculation unit that calculates the parallax of each pixel of the first captured image and the second captured image; an area selection unit that selects, from the first captured image or the second captured image, areas whose parallax value has a reliability equal to or higher than a threshold; a parallax shift amount calculation unit that compares the parallax value in each area selected by the area selection unit with an ideal parallax value that should be obtained in that area, and obtains a parallax shift amount for each area; and a parallax shift correction unit that corrects the parallax shift of the first imaging unit and the second imaging unit using a correction amount determined by statistically processing the parallax shift amounts of the respective areas.
- according to the present disclosure, it is possible to provide an imaging device, a parallax deviation correction method, and a parallax deviation correction program that can detect parallax deviation with high accuracy.
- FIG. 1A is a schematic diagram illustrating the overall configuration of a stereo camera 100 as an imaging device according to a first embodiment.
- FIG. 1B is a block diagram illustrating an example of the detailed configuration of the image processing device 2 of FIG. 1A.
- FIG. 2A is an explanatory diagram illustrating a method of calculating the parallax D in the parallax calculation unit 13.
- FIG. 2B is an explanatory diagram illustrating the similarity SM used for calculating the parallax D and the protrusion amount k.
- FIG. 2C is an explanatory diagram illustrating a method of selecting areas in the area selection unit 14.
- FIGS. 3A and 3B are explanatory diagrams illustrating another method of calculating the protrusion amount k using a second peak P2.
- FIGS. 4A and 4B are explanatory diagrams illustrating a method of calculating the parallax D in the parallax calculation unit 13 and a method of calculating the parallax shift amount ΔD in the parallax shift amount calculation unit 15.
- FIG. 5 is a flowchart illustrating a specific example of the procedure for calculating the parallax shift correction amount ΔC in the stereo camera 100 of the first embodiment.
- FIGS. 6A and 6C are explanatory diagrams for explaining the operation of the area selection unit 14 in the stereo camera 100 of a second embodiment.
- FIG. 6B is a flowchart illustrating a specific example of the procedure for calculating the parallax shift correction amount ΔC in the stereo camera 100 of the second embodiment.
- FIGS. 7 and 8 are explanatory diagrams explaining the operation of a third embodiment.
- This stereo camera 100 is an in-vehicle stereo camera device mounted in a vehicle such as an automobile, and includes a stereo camera main body 1 , an image processing device 2 and an interface 5 .
- the stereo camera body 1, the image processing device 2, and the interface 5 are connected via a communication line such as a bus BU.
- the stereo camera main body 1 is configured by arranging a plurality of, for example, two cameras, a first imaging unit 11 (left camera) and a second imaging unit 12 (right camera), spaced apart by a predetermined base line length B.
- the first image capturing unit 11 and the second image capturing unit 12 capture images in front of the vehicle from different positions by the base line length B to obtain images.
- the image processing device 2 can include, for example, a CPU 3 as an arithmetic device and a memory 4 as a storage device.
- Memory 4 may be RAM, ROM, hard disk drive (HDD), or a combination thereof.
- Stereo camera 100 can be connected to external device 200 via interface 5 .
- the external device 200 may include various sensors for detecting the opening of the vehicle's accelerator (that is, throttle opening), brake operation amount (brake pedal operation amount), steering angle, vehicle speed, acceleration, temperature, humidity, etc., as well as an ECU (Electronic Control Unit) that controls various operations of the vehicle.
- the memory 4 is a recording medium in which programs used for various processes in the image processing device 2 (parallax deviation correction program, etc.) and various types of information are stored.
- the CPU 3 performs predetermined arithmetic processing on signals received via an interface (I/F) 5 according to a control program stored in the memory 4, detects three-dimensional objects and road surfaces, and calculates the position, distance, direction, etc. of objects.
- the calculation result by the CPU 3 is output to the external device 200 via the interface 5 and used for judgment and control of various operations of the vehicle such as acceleration, braking and steering.
- the image processing device 2 includes, for example, a parallax calculation unit 13, a region selection unit 14, a parallax shift amount calculation unit 15, and a parallax shift correction unit 16 realized by a program.
- the image processing device 2 calculates the distance Z from the stereo camera 100 to the object using the parallax D calculated by the parallax calculator 13 and corrected by the parallax shift corrector 16 , and supplies the calculation result to the external device 200 .
- the parallax calculation unit 13 calculates the parallax D based on the image captured by the first imaging unit 11 and the image captured by the second imaging unit 12. Specifically, the parallax calculation unit 13 sets, for example, the image captured by the first imaging unit 11 as a standard image, and extracts feature points having a change in gradation in the standard image. Next, the image captured by the other, second imaging unit 12 is used as a reference image, and the reference image is searched for the position of the reference point at which the same subject as the feature point extracted in the standard image appears. Template matching such as SAD (Sum of Absolute Differences) can be used for the search. The parallax D is calculated as the difference between the position of the extracted feature point in the standard image and the position of the reference point in the reference image.
- the parallax D is calculated for each region containing an extracted feature point, and is temporarily stored in the memory 4 in association with the position of that region. That is, the parallax calculation unit 13 is configured to extract a plurality of feature points from a pair consisting of a standard image and a reference image, and to calculate the parallax D in each of a plurality of regions containing the extracted feature points.
- the area selection unit 14 selects, from among the plurality of areas for which parallax D data has been obtained in the captured image, areas in which a highly reliable parallax D has been obtained. Since areas with a highly reliable parallax D are selected and the parallax shift amount ΔD is calculated using the parallax D in the selected areas, the parallax shift correction can be performed with high accuracy. A method for determining whether the parallax D is highly reliable will be described later. Note that the number of areas to be selected is not particularly limited.
- the parallax shift amount calculation unit 15 uses the values of the parallax D in the multiple regions selected by the region selection unit 14 to calculate the parallax shift amount ⁇ D in each region.
- the parallax shift amount ⁇ D is calculated for each region as the difference (D ⁇ Di) between the parallax D calculated for a certain region in the parallax calculation unit 13 and the parallax ideal value Di to be obtained in the region.
- the “ideal parallax value Di” is determined by the height of the first imaging unit 11 and the second imaging unit 12 above the object and by the angle of view of the camera that captures the area. A specific procedure for calculating the ideal parallax value Di will be described later.
- the parallax shift correction unit 16 determines a parallax shift correction value ⁇ C in the captured image from a plurality of parallax shift amounts ⁇ D calculated for each of the plurality of selected regions.
- as shown in FIG. 2A, for example, an image of a road including a white line WL is captured by the first imaging unit 11 and the second imaging unit 12 to obtain a first captured image IM1 and a second captured image IM2.
- the first captured image IM1 captured by the first imaging unit 11 is used as the standard image
- the second captured image IM2 captured by the second imaging unit 12 is used as a reference image.
- the parallax calculator 13 extracts a feature point 21 in the first captured image IM1, and extracts a reference point 21' having the same feature as the feature point 21 in the second captured image IM2. Then, the parallax calculator 13 performs SAD template matching between the feature point 21 and the reference point 21', and the similarity SM is calculated, for example, as shown in the graph of FIG. 2B.
- the horizontal axis of the graph in FIG. 2B indicates the horizontal coordinate of the second captured image IM2, and the vertical axis indicates the SAD value (similarity SM).
- if the similarity SM is calculated at the position of the reference point 21' in the second captured image IM2 (reference image) where the same object as the one appearing at the feature point 21 is captured, the similarity SM takes a small value.
- if the similarity SM is calculated at a position in the second captured image IM2 (reference image) where a different object is captured (a position other than the reference point), the similarity SM takes a high value.
- the parallax D of the feature point 21 is the coordinate value on the horizontal axis of the position (first peak P1) where the similarity SM becomes the minimum value.
- when the amount of protrusion of the first peak P1 (the amount of decrease in the similarity SM at the first peak P1) is denoted k, the larger the value of the protrusion amount k, the larger the feature amount (for example, the luminance difference) of the feature point 21 with respect to the adjacent area.
- the average value Av of the similarity SM is calculated in a region excluding the vicinity of the first peak P1 (the region indicated by the dashed line in the figure) and used as a reference value, and the difference between this average value Av (reference value) and the minimum value of the similarity SM at the first peak P1 is calculated as the protrusion amount k.
- the value of the protrusion amount k is calculated for a plurality of regions, and the region selection unit 14 selects a plurality of regions with high values of k. For example, as shown in FIG. 2C, the protrusion amount k is obtained for each of a plurality of areas (each given a unique area ID), the areas are sorted in descending order of the protrusion amount k, and a predetermined number of areas are selected by the area selection unit 14 in descending order of the protrusion amount k.
- a threshold THk can be set for the protrusion amount k, and the parallax shift correction operation can be stopped when the number of regions in which the protrusion amount k exceeds the threshold THk does not reach the minimum value Nreg.
- the protrusion amount k is calculated above from the difference between the average value Av of the similarity SM (excluding values near the first peak P1) and the lower limit value of the first peak P1, but as shown in FIGS. 3A and 3B, a second peak P2 different from the first peak P1 (for example, the peak with the next largest protrusion after the first peak P1) may be identified, and the difference between the lower limit value of the first peak P1 and the lower limit value of the second peak P2 may be calculated as the protrusion amount k.
- let f (mm) be the focal length of the lenses L of the first imaging unit 11 and the second imaging unit 12, δ (mm/pixel) the pixel pitch of their imaging elements (e.g., CMOS sensors), J (pixel) the position at which a point of the road surface at a horizontal distance Z (mm) from the lens L is projected onto the imaging surface, and h (mm) the height of the lens L above the road surface; these quantities satisfy equation (1): Z = (f × h) / (J × δ).
- the parallax D (pixel) at the distance Z (mm) satisfies equation (2): D = (B × f) / (Z × δ), where B (mm) is the distance (base line length) between the first imaging unit 11 and the second imaging unit 12.
- substituting equation (1) into equation (2), the parallax D (pixel) of the road surface can be calculated as D = B × J / h (equation (3)) from the base line length B (mm) and height h (mm) of the stereo camera 100 and the coordinate J (pixel) of the road surface on the imaging plane. Since the base line length B and the height h are values that can be specified when the first imaging unit 11 and the second imaging unit 12 are mounted on the vehicle, the parallax D (pixel) can be uniquely specified by the coordinate J (pixel) of the road surface.
- the ideal value Di of the parallax D of the road surface can be calculated based on the value of the base line length B and the value of the height h when the stereo camera 100 is mounted on the vehicle.
- the ideal parallax value Di may be obtained for an object Ob (for example, a curbstone, a guardrail, etc.) whose height from the road surface is known.
- the road surface is a feature that always exists in a general driving environment, in principle it is preferable to obtain the ideal parallax value Di on the road surface.
- in principle, the ideal parallax value Di is obtained on the road surface, but in special cases it can also be obtained on an object other than the road surface.
- the parallax shift amount calculator 15 calculates the parallax shift amount ⁇ D for each of the plurality of road surface areas selected by the area selector 14 .
- in step S101, the road surface that serves as the basis for calculating the parallax deviation correction amount ΔC is identified. Then, in the manner described above, the parallax D on the road surface is calculated by the parallax calculation unit 13 for each region including a feature point 21 (step S102), and the value of the protrusion amount k is calculated for each region (step S103).
- the area selection unit 14 selects an area where the protrusion amount k is greater than the threshold THk (step S104). If the number of regions Nreg selected in step S104 is equal to or greater than the threshold THreg, the process proceeds to step S106, but if it is smaller than the threshold THreg, calculation of the parallax deviation correction amount ⁇ C and parallax deviation correction processing are canceled (step S105). This is because the parallax shift amount cannot be calculated with high accuracy if a certain number of regions in which the value of the protrusion amount k is equal to or greater than the threshold value THk cannot be obtained.
- a region with a high protrusion amount k is selected, and the parallax shift amount ⁇ D is calculated based on the selected region. Therefore, it is possible to provide an imaging device that can detect parallax shift with high accuracy.
- a stereo camera 100 according to a second embodiment will be described with reference to FIGS. 6A to 6C. Since the overall configuration of the stereo camera 100 of the second embodiment is the same as that of the first embodiment (FIGS. 1A and 1B), redundant description will be omitted.
- the area selection unit 14 calculates the degree of protrusion k (see FIG. 2B) of the similarity SM, and selects a plurality of areas to be used for calculating the parallax shift amount ⁇ D according to the degree of protrusion k.
- in the second embodiment, instead of obtaining a graph of the similarity SM, the area selection unit 14 determines the magnitude of the luminance difference between an attention area 41 and a corresponding area, for example the adjacent area 42 adjacent to the attention area 41, as shown in FIG. 6A, and selects areas having highly reliable parallax based on the magnitude of that luminance difference.
- a road surface is identified as a basis for calculation of the parallax shift correction amount ⁇ C (step S201).
- the area selection unit 14 calculates the luminance difference between the specified attention area 41 on the road surface and the adjacent area 42 adjacent to the attention area 41 (step S202).
- the region selection unit 14 selects a region having a luminance difference greater than the threshold THin (step S203).
- in step S205, it is determined whether or not the number Nreg of regions selected in step S203 is greater than or equal to the threshold THreg'. If the determination is Yes, the process proceeds to step S206; if the determination is No (smaller than the threshold THreg'), the computation of the parallax deviation correction amount ΔC and the parallax deviation correction process are stopped.
- it is also possible to calculate the luminance difference between the region of interest 41 and a neighboring region 42' that is not adjacent to it but slightly separated from it.
- the adjacent region 42 is a region located with no gap from the region of interest 41, whereas the neighboring region 42' is a region separated from the region of interest 41 by, for example, one region. If the edge portion of the attention area 41 (for example, the edge between the road surface and a white line) is blurred owing to the resolution of the camera, the luminance difference between the attention area 41 and the adjacent area 42 may not be correctly calculated; by using the slightly separated neighboring region 42' for the luminance-difference calculation, the luminance difference can be calculated more accurately.
- the area used for calculating the parallax shift amount is selected according to the magnitude of the luminance difference, and the parallax shift amount ⁇ D is calculated based on the selected area. Therefore, it is possible to provide an imaging device that can detect the parallax shift with high precision, as in the first embodiment.
- a stereo camera 100 according to a third embodiment will be described with reference to FIGS. 7 and 8. Since the overall configuration of the stereo camera 100 of the third embodiment is the same as that of the first embodiment (FIGS. 1A and 1B), redundant description will be omitted.
- the third embodiment differs from the above-described embodiments in the method of calculating the amount of parallax deviation in the parallax deviation correction unit 16 .
- the parallax shift correction unit 16 calculates a representative value (for example, the median value Median(J)) of the parallax shift amounts ΔD of a plurality of areas for each vertical coordinate J (1, 2, ..., m, ...), as shown in FIG. 7.
- the average value of the calculated median values Median(1) to (m) is calculated as the average value ⁇ Dav of the parallax deviation amounts ⁇ D.
- the parallax shift correction amount ⁇ C is calculated in the parallax shift correction unit 16 .
- if the standard deviation SD of the median values Median(J) used to calculate the average value ΔDav is larger than a threshold, it is preferable not to correct the parallax shift in that captured image. This is because, when the standard deviation SD is larger than the threshold, the reliability of the regions containing those coordinates is judged to be low.
- one median value Median(J) is calculated for each coordinate J, but one median value may instead be calculated collectively for a plurality of coordinates k to k+x.
- alternatively, instead of calculating a median for each coordinate, the median of the parallax deviation amounts ΔD of all selected regions may be calculated, determined as the parallax deviation amount ΔD of the captured image, and used as the parallax deviation correction amount ΔC.
- the present invention is not limited to the above-described embodiments, and includes various modifications.
- the above-described embodiments have been described in detail in order to explain the present invention in an easy-to-understand manner, and are not necessarily limited to those having all the configurations described.
- part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Measurement Of Optical Distance (AREA)
- Image Processing (AREA)
Abstract
Provided is an imaging device that can detect road surface positions with high reliability and can thereby detect parallax displacement with high accuracy. The imaging device is provided with: a first imaging unit that acquires a first captured image; and a second imaging unit that is disposed at a position a fixed distance away from the first imaging unit and acquires a second captured image. A parallax calculation unit calculates the parallax in each region of the first captured image and the second captured image. A region selection unit selects, from the first captured image or the second captured image, regions in which the reliability of the parallax value is equal to or higher than a threshold value. A parallax displacement amount calculation unit compares the parallax value in each region selected by the region selection unit with the ideal parallax value to be acquired in that region, and obtains the parallax displacement amount of each region. A parallax displacement correction unit corrects the parallax displacement of the first imaging unit and the second imaging unit using a correction amount determined in accordance with the parallax displacement amounts of the regions.
Description
The present disclosure relates to an imaging device, a parallax deviation correction method, and a parallax deviation correction program.
A stereo camera is known as a device for recognizing objects in three dimensions. A stereo camera detects the parallax between multiple cameras based on trigonometry by using differences in how images are captured by multiple cameras placed at different positions, and uses the parallax to detect the distance and position from the cameras to an object. A stereo camera is mounted on a vehicle, for example, and used as a camera that constitutes a driving support system.
However, when the optical axis of the stereo camera deviates due to deterioration over time, it becomes impossible to calculate accurate parallax. As a result, the distance from the camera to the object cannot be measured correctly, increasing the possibility that the driving support system will not operate correctly. For this reason, various techniques for calibrating the optical axis (in other words, correcting the parallax deviation) have been proposed in order to calculate the parallax accurately (see, for example, Patent Literature 1). Patent Literature 1 discloses a technique of calculating the parallax at each of a road surface position near the vehicle (near road surface) and a road surface position far from the vehicle (far road surface position), and correcting the parallax deviation.
However, the device disclosed in Patent Literature 1 presets the road surface positions used to calculate the parallax deviation, so the environments in which the parallax deviation can be corrected are limited. Also, since the texture of the road surface is monotonous, the parallax calculation can be unreliable. In particular, the reliability of the parallax calculation may be lowered on an indistinct road surface without white lines or other markings. Calculating the amount of parallax deviation from parallax with low reliability may lead to a decrease in the accuracy of the parallax deviation correction and, in turn, a decrease in the accuracy of distance measurement.
The present disclosure proposes an imaging device, a parallax deviation correction method, and a parallax deviation correction program that can detect parallax deviation with high accuracy.
An imaging device according to one aspect of the present disclosure includes: a first imaging unit that acquires a first captured image; a second imaging unit that is arranged at a certain distance from the first imaging unit and acquires a second captured image; a parallax calculation unit that calculates the parallax of each pixel of the first captured image and the second captured image; an area selection unit that selects, from the first captured image or the second captured image, areas whose parallax value has a reliability equal to or higher than a threshold; a parallax shift amount calculation unit that compares the parallax value in each area selected by the area selection unit with an ideal parallax value that should be obtained in that area, and obtains a parallax shift amount for each area; and a parallax shift correction unit that corrects the parallax shift of the first imaging unit and the second imaging unit using a correction amount determined by statistically processing the parallax shift amounts of the respective areas.
According to the present invention, it is possible to provide an imaging device, a parallax deviation correction method, and a parallax deviation correction program that can detect parallax deviation with high accuracy.
The present embodiment will be described below with reference to the accompanying drawings. In the accompanying drawings, functionally identical elements may be labeled with the same numbers. Although the attached drawings show embodiments and implementation examples in accordance with the principles of the present disclosure, they are for the purpose of understanding the present disclosure and are not used to interpret the present disclosure in a limited way. The description herein is merely exemplary and is not intended to limit the scope or application of this disclosure in any way.
In the present embodiment, the description is given in sufficient detail for those skilled in the art to implement the present disclosure, but it is necessary to understand that other implementations and forms are possible, and that changes in configuration and structure and replacement of various elements are possible without departing from the scope and spirit of the technical idea of the present disclosure. Therefore, the following description should not be construed as being limited to this.
[First embodiment]
The overall configuration of a stereo camera 100 as an imaging device according to the first embodiment will be described with reference to the schematic diagram of FIG. 1A. This stereo camera 100 is an in-vehicle stereo camera device mounted in a vehicle such as an automobile, and includes a stereo camera main body 1, an image processing device 2, and an interface 5. The stereo camera main body 1, the image processing device 2, and the interface 5 are connected via a communication line such as a bus BU.
The stereo camera main body 1 is configured by arranging a plurality of cameras, for example two cameras, a first imaging unit 11 (left camera) and a second imaging unit 12 (right camera), spaced apart by a predetermined base line length B. The first imaging unit 11 and the second imaging unit 12 each capture an image of the area in front of the vehicle from positions that differ by the base line length B. The image processing device 2 can include, for example, a CPU 3 as an arithmetic device and a memory 4 as a storage device. The memory 4 may be a RAM, a ROM, a hard disk drive (HDD), or a combination thereof. The stereo camera 100 can be connected to an external device 200 via the interface 5.
The external device 200 may include various sensors for detecting the opening of the vehicle's accelerator (that is, throttle opening), brake operation amount (brake pedal operation amount), steering angle, vehicle speed, acceleration, temperature, humidity, etc., as well as an ECU (Electronic Control Unit) that controls various operations of the vehicle.
The memory 4 is a recording medium in which programs used for various processes in the image processing device 2 (parallax deviation correction program, etc.) and various types of information are stored. The CPU 3 performs predetermined arithmetic processing on signals received via an interface (I/F) 5 according to a control program stored in the memory 4, detects three-dimensional objects and road surfaces, and calculates the position, distance, direction, etc. of objects. The calculation result by the CPU 3 is output to the external device 200 via the interface 5 and used for judgment and control of various operations of the vehicle such as acceleration, braking and steering.
An example of the detailed configuration of the image processing device 2 will be described with reference to the block diagram of FIG. 1B. The image processing device 2 includes, for example, a parallax calculation unit 13, a region selection unit 14, a parallax shift amount calculation unit 15, and a parallax shift correction unit 16 realized by a program. The image processing device 2 calculates the distance Z from the stereo camera 100 to the object using the parallax D calculated by the parallax calculator 13 and corrected by the parallax shift corrector 16 , and supplies the calculation result to the external device 200 .
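For reference, the distance output mentioned here normally follows the standard stereo relation given later as equation (2). The following minimal Python sketch illustrates the idea; the function name, parameter names, and the sign convention of the correction are assumptions introduced for illustration and are not taken from the publication.

```python
def distance_from_parallax(parallax_D_px, correction_dC_px,
                           baseline_B_mm, focal_f_mm, pixel_pitch_mm):
    """Z = (B * f) / (D_corrected * delta) -- equation (2), applied after the
    measured parallax has been adjusted by the parallax shift correction amount.
    Whether the correction is subtracted or added depends on how dC is defined."""
    d_corrected = parallax_D_px - correction_dC_px  # sign convention assumed
    return (baseline_B_mm * focal_f_mm) / (d_corrected * pixel_pitch_mm)
```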
The parallax calculation unit 13 calculates the parallax D based on the image captured by the first imaging unit 11 and the image captured by the second imaging unit 12. Specifically, the parallax calculation unit 13 sets, for example, the image captured by the first imaging unit 11 as a standard image, and extracts feature points having a change in gradation in the standard image. Next, the image captured by the other, second imaging unit 12 is used as a reference image, and the reference image is searched for the position of the reference point at which the same subject as the feature point extracted in the standard image appears. Template matching such as SAD (Sum of Absolute Differences) can be used for the search. The parallax D is calculated as the difference between the position of the extracted feature point in the standard image and the position of the reference point in the reference image. The parallax D is calculated for each region containing an extracted feature point, and is temporarily stored in the memory 4 in association with the position of that region. That is, the parallax calculation unit 13 is configured to extract a plurality of feature points from a pair consisting of a standard image and a reference image, and to calculate the parallax D in each of a plurality of regions containing the extracted feature points.
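As a rough illustration of this kind of per-feature SAD search (not taken from the publication; the window size, search range, search direction, and the assumption of a rectified stereo pair are all hypothetical), a minimal sketch in Python might look like the following.

```python
import numpy as np

def sad_parallax(standard_img, reference_img, row, col, window=7, max_disp=64):
    """Estimate the parallax D (in pixels) of one feature point at (row, col)
    in the standard image by SAD template matching along the same row of the
    reference image.  Assumes the feature point is away from the image border."""
    half = window // 2
    template = standard_img[row - half:row + half + 1,
                            col - half:col + half + 1].astype(np.float32)
    sad_curve = []
    for d in range(max_disp):
        c = col - d  # search direction is an assumption; it depends on camera layout
        if c - half < 0:
            break
        patch = reference_img[row - half:row + half + 1,
                              c - half:c + half + 1].astype(np.float32)
        sad_curve.append(float(np.abs(template - patch).sum()))
    sad_curve = np.asarray(sad_curve)
    parallax_D = int(np.argmin(sad_curve))  # position of the first peak P1 (minimum SAD)
    return parallax_D, sad_curve
```

In practice such a search would be repeated for every extracted feature point, and each parallax D stored together with the position of its region, as described above.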
The area selection unit 14 selects, from among the plurality of areas for which parallax D data has been obtained in the captured image, areas in which a highly reliable parallax D has been obtained. Since areas with a highly reliable parallax D are selected and the parallax shift amount ΔD is calculated using the parallax D in the selected areas, the parallax shift correction can be performed with high accuracy. A method for determining whether the parallax D is highly reliable will be described later. Note that the number of areas to be selected is not particularly limited.
The parallax shift amount calculation unit 15 uses the values of the parallax D in the multiple regions selected by the region selection unit 14 to calculate the parallax shift amount ΔD in each region. The parallax shift amount ΔD is calculated for each region as the difference (D − Di) between the parallax D calculated for that region in the parallax calculation unit 13 and the ideal parallax value Di that should be obtained in the region. Here, the "ideal parallax value Di" is determined by the height of the first imaging unit 11 and the second imaging unit 12 above the object and by the angle of view of the camera that captures the region. A specific procedure for calculating the ideal parallax value Di will be described later.
The parallax shift correction unit 16 determines a parallax shift correction value ΔC in the captured image from a plurality of parallax shift amounts ΔD calculated for each of the plurality of selected regions.
Next, with reference to FIGS. 2A to 2C, the method of calculating the parallax D in the parallax calculation unit 13 and the method of selecting regions having a highly reliable parallax D in the region selection unit 14 will be described. As shown in FIG. 2A, for example, an image of a road including a white line WL is captured by the first imaging unit 11 and the second imaging unit 12 to obtain a first captured image IM1 and a second captured image IM2. Here, the first captured image IM1 captured by the first imaging unit 11 is used as the standard image, and the second captured image IM2 captured by the second imaging unit 12 is used as the reference image. The parallax calculation unit 13 extracts a feature point 21 in the first captured image IM1, and extracts a reference point 21' having the same feature as the feature point 21 in the second captured image IM2. Then, the parallax calculation unit 13 performs SAD template matching between the feature point 21 and the reference point 21', and the similarity SM is calculated, for example, as shown in the graph of FIG. 2B. The horizontal axis of the graph in FIG. 2B indicates the horizontal coordinate of the second captured image IM2, and the vertical axis indicates the SAD value (similarity SM).
Regarding the feature point 21 in the first captured image IM1 (standard image), if the similarity SM is calculated at the position of the reference point 21' in the second captured image IM2 (reference image) where the same object as the one appearing at the feature point 21 is captured, the similarity SM takes a small value. On the other hand, if the similarity SM is calculated at a position in the second captured image IM2 (reference image) where a different object is captured (a position other than the reference point), the similarity SM takes a high value. Here, the parallax D of the feature point 21 is the coordinate value on the horizontal axis of the position (first peak P1) at which the similarity SM takes its minimum value.
When the amount of protrusion of the first peak P1 (the amount of decrease in the similarity SM at the first peak P1) is denoted k, the larger the value of the protrusion amount k, the larger the feature amount (for example, the luminance difference) of the feature point 21 with respect to the adjacent area. The larger the luminance difference or the like with respect to the adjacent area, the larger the calculated protrusion amount k, and the higher the reliability of the parallax D calculated in that area. Therefore, the average value Av of the similarity SM is calculated in a region excluding the vicinity of the first peak P1 (the region indicated by the dashed line in the figure) and used as a reference value, and the difference between this average value Av (reference value) and the minimum value of the similarity SM at the first peak P1 is calculated as the protrusion amount k. The value of the protrusion amount k is calculated for a plurality of regions, and the region selection unit 14 selects a plurality of regions with high values of k. For example, as shown in FIG. 2C, the protrusion amount k is obtained for each of a plurality of areas (each given a unique area ID), the areas are sorted in descending order of the protrusion amount k, and a predetermined number of areas are selected by the area selection unit 14 in descending order of the protrusion amount k. As will be described later, a threshold THk can be set for the protrusion amount k, and the parallax shift correction operation can be stopped when the number of regions in which the protrusion amount k exceeds the threshold THk does not reach the minimum value Nreg.
Note that in FIGS. 2A and 2B the protrusion amount k is calculated from the difference between the average value Av of the similarity SM (excluding values near the first peak P1) and the lower limit value of the first peak P1; however, as shown in FIGS. 3A and 3B, a second peak P2 different from the first peak P1 (for example, the peak with the next largest protrusion after the first peak P1) may be identified, and the difference between the lower limit value of the first peak P1 and the lower limit value of the second peak P2 may be calculated as the protrusion amount k.
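A hedged sketch of how the protrusion amount k and the subsequent region selection could be realized is shown below; the exclusion margin around the first peak, the threshold THk, the minimum region count, and the region data structure are illustrative assumptions, not values or structures from the publication.

```python
import numpy as np

def protrusion_amount(sad_curve, exclude_margin=3):
    """k = (average similarity Av outside the vicinity of the first peak P1)
           - (minimum similarity at P1).  A second-peak based definition of k,
    as in FIGS. 3A and 3B, could be substituted here."""
    p1 = int(np.argmin(sad_curve))
    mask = np.ones(len(sad_curve), dtype=bool)
    mask[max(0, p1 - exclude_margin):p1 + exclude_margin + 1] = False
    reference_value = float(sad_curve[mask].mean())  # average value Av
    return reference_value - float(sad_curve[p1])

def select_reliable_regions(regions, th_k=50.0, min_regions=20):
    """'regions' is assumed to be a list of dicts, each holding the protrusion
    amount under the key 'k'.  Regions are sorted in descending order of k and
    kept only while k exceeds THk; if too few remain, correction is skipped."""
    ranked = sorted(regions, key=lambda r: r["k"], reverse=True)
    selected = [r for r in ranked if r["k"] > th_k]
    return selected if len(selected) >= min_regions else None
```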
Next, the method of calculating the parallax D in the parallax calculation unit 13 and the method of calculating the parallax shift amount ΔD in the parallax shift amount calculation unit 15 will be described with reference to FIG. 4A. Let f (mm) be the focal length of the lenses L of the first imaging unit 11 and the second imaging unit 12, δ (mm/pixel) the pixel pitch of their imaging elements (e.g., CMOS sensors), J (pixel) the position at which a point of the road surface at a horizontal distance Z (mm) from the lens L is projected onto the imaging surface, and h (mm) the height of the lens L above the road surface. Then the relationship expressed by the following equation (1) holds.
Z = (f × h) / (J × δ) … (1)
In addition, the parallax D (pixel) at the distance Z (mm) satisfies the following equation (2), where B (mm) is the distance (base line length) between the first imaging unit 11 and the second imaging unit 12.
D = (B × f) / (Z × δ) … (2)
Substituting equation (1) into equation (2), the focal length f and the pixel pitch δ cancel, giving the following equation (3).
D = B × J / h … (3)
In this way, the parallax D (pixel) of the road surface can be calculated from the base line length B (mm) and the height h (mm) of the stereo camera 100 and the coordinate J (pixel) of the road surface on the imaging plane. Since the base line length B and the height h are values that can be specified when the first imaging unit 11 and the second imaging unit 12 are mounted on the vehicle, the parallax D (pixel) can be uniquely specified by the coordinate J (pixel) of the road surface.
The ideal value Di of the parallax D of the road surface can be calculated based on the value of the base line length B and the value of the height h when the stereo camera 100 is mounted on the vehicle. Here, instead of obtaining the ideal parallax value Di on the road surface, as shown in FIG. 4B, for example, the ideal parallax value Di may be obtained for an object Ob (for example, a curbstone, a guardrail, etc.) whose height from the road surface is known. However, since the road surface is a feature that always exists in a general driving environment, in principle it is preferable to obtain the ideal parallax value Di on the road surface. In principle, the ideal parallax value Di is obtained on the road surface, but in special cases, it is also possible to obtain the ideal parallax value Di on an object other than the road surface.
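Applying the relations above, the ideal road-surface parallax Di and the corresponding road distance could be computed as in the short sketch below; the numeric mounting values are illustrative only and are not taken from the publication.

```python
def ideal_road_parallax(baseline_B_mm, height_h_mm, row_J_px):
    """Equation (3): D = B * J / h, giving the ideal parallax Di in pixels
    for a flat road surface imaged at vertical coordinate J."""
    return baseline_B_mm * row_J_px / height_h_mm

def road_distance(focal_f_mm, height_h_mm, row_J_px, pixel_pitch_mm):
    """Equation (1): Z = (f * h) / (J * delta), in millimetres."""
    return (focal_f_mm * height_h_mm) / (row_J_px * pixel_pitch_mm)

# Illustrative (assumed) mounting values:
Di = ideal_road_parallax(baseline_B_mm=350.0, height_h_mm=1300.0, row_J_px=40)
Z = road_distance(focal_f_mm=6.0, height_h_mm=1300.0, row_J_px=40, pixel_pitch_mm=0.00375)
print(f"ideal parallax Di = {Di:.2f} px at road distance Z = {Z / 1000:.1f} m")
```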
The parallax shift amount calculation unit 15 calculates, as the parallax shift amount ΔD = D − Di, the difference between the parallax D actually calculated in the parallax calculation unit 13 according to the base line length B and the height h at that time, and the ideal parallax value Di calculated according to the base line length B and the height h at the time of design. The parallax shift amount calculation unit 15 calculates the parallax shift amount ΔD for each of the plurality of road surface areas selected by the area selection unit 14.
Next, a specific example of the procedure for calculating the parallax shift correction amount ΔC in the stereo camera 100 of the first embodiment will be described with reference to the flowchart of FIG. 5.
First, the road surface on which the calculation of the parallax deviation correction amount ΔC is based is specified (step S101). Then, in the manner described above, the parallax D on the road surface is calculated by the parallax calculator 13 for each region including the feature point 21 (step S102), and the value of the amount of protrusion k is calculated for each region (step S103).
After the protrusion amount k is calculated for each area, the area selection unit 14 selects the areas in which the protrusion amount k is greater than the threshold THk (step S104). If the number of regions Nreg selected in step S104 is equal to or greater than the threshold THreg, the process proceeds to step S106; if it is smaller than the threshold THreg, the calculation of the parallax deviation correction amount ΔC and the parallax deviation correction processing are canceled (step S105). This is because the parallax shift amount cannot be calculated with high accuracy unless a certain number of regions whose protrusion amount k is equal to or greater than the threshold THk are obtained.
In step S106, the parallax shift amount calculator 15 calculates the ideal parallax value Di in each of the plurality of selected regions in the manner described above, and the difference between this ideal value Di and the actual parallax D is calculated as the parallax shift amount ΔD=D−Di (step S107). Then, the parallax shift correction unit 16 statistically processes the parallax shift amounts ΔD obtained in the plurality of regions, calculates the parallax shift amount ΔDfix for the entire image, and calculates the parallax shift correction amount ΔC according to the parallax shift amount ΔDfix (step S108).
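The flow of steps S101 to S108 could be outlined as in the following sketch. It assumes, purely for illustration, that the "statistical processing" of step S108 is a plain average of the per-region parallax shifts and that the correction amount ΔC is taken directly from the resulting ΔDfix; the helper names, the region data structure, and the threshold values are not taken from the publication.

```python
import numpy as np

def parallax_shift_correction(regions, baseline_B_mm, height_h_mm,
                              th_k=50.0, th_reg=20):
    """Each region dict is assumed to carry the measured parallax 'D' (px),
    the vertical image coordinate 'J' (px) of the road surface it contains,
    and its protrusion amount 'k'."""
    # S104: keep only regions whose protrusion amount exceeds THk
    selected = [r for r in regions if r["k"] > th_k]
    # S105: abort the correction if too few reliable regions were found
    if len(selected) < th_reg:
        return None
    # S106/S107: per-region parallax shift dD = D - Di, with Di = B * J / h
    shifts = [r["D"] - baseline_B_mm * r["J"] / height_h_mm for r in selected]
    # S108: statistical processing (a simple mean is assumed here) -> dDfix,
    # from which the correction amount dC is derived (1:1 mapping assumed)
    d_fix = float(np.mean(shifts))
    return d_fix
```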
As described above, according to the stereo camera 100 of the first embodiment, a region with a high protrusion amount k is selected, and the parallax shift amount ΔD is calculated based on the selected region. Therefore, it is possible to provide an imaging device that can detect parallax shift with high accuracy.
[Second embodiment]
Next, a stereo camera 100 according to a second embodiment will be described with reference to FIGS. 6A to 6C. Since the overall configuration of the stereo camera 100 of the second embodiment is the same as that of the first embodiment (FIGS. 1A and 1B), redundant description will be omitted.
In the above-described first embodiment, the area selection unit 14 calculates the degree of protrusion k of the similarity SM (see FIG. 2B), and selects a plurality of areas to be used for calculating the parallax shift amount ΔD according to the magnitude of the degree of protrusion k. In contrast, in the second embodiment, instead of obtaining a graph of the similarity SM, the area selection unit 14 determines the magnitude of the luminance difference between an attention area 41 and a corresponding area, for example the adjacent area 42 adjacent to the attention area 41, as shown in FIG. 6A, and selects areas having highly reliable parallax based on the magnitude of that luminance difference.
A specific example of the procedure for calculating the parallax shift correction amount ΔC in the stereo camera 100 of the second embodiment will be described with reference to FIG. 6B. First, a road surface is identified as a basis for calculation of the parallax shift correction amount ΔC (step S201). Then, the area selection unit 14 calculates the luminance difference between the specified attention area 41 on the road surface and the adjacent area 42 adjacent to the attention area 41 (step S202). After calculating the luminance difference for each region, the region selection unit 14 selects a region having a luminance difference greater than the threshold THin (step S203).
Subsequently, it is determined whether or not the number Nreg of regions selected in step S203 is greater than or equal to the threshold THreg' (step S205). If the determination is Yes, the process proceeds to step S206; if the determination is No (smaller than the threshold THreg'), the computation of the parallax deviation correction amount ΔC and the parallax deviation correction process are stopped.
In step S206, the ideal value Di of parallax in each of the plurality of selected regions is calculated in the manner described above, and the difference between this ideal value Di and the actual parallax D is calculated by the parallax shift amount calculator 15 as the parallax shift amount ΔD=D−Di (step S207). Then, the parallax shift amount ΔD obtained in a plurality of regions is statistically processed to calculate the parallax shift amount ΔDfix for the entire image, and the parallax shift correction amount ΔC is calculated according to the parallax shift amount ΔDfix (step S208).
It should be noted that, as shown in FIG. 6C, it is also possible to calculate the luminance difference between the region of interest 41 and a neighboring region 42' that is not adjacent to it but slightly separated from it. The adjacent region 42 is a region located with no gap from the region of interest 41, while the neighboring region 42' is a region separated from the region of interest 41 by, for example, one region. If the edge portion of the attention area 41 (for example, the edge between the road surface and a white line) is blurred owing to the resolution of the camera, the luminance difference between the attention area 41 and the adjacent area 42 may not be correctly calculated; by using the slightly separated neighboring region 42' for the luminance-difference calculation, the luminance difference can be calculated more accurately.
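A minimal sketch of this luminance-difference criterion is given below, with the gap to the compared block exposed as a parameter so that either the adjacent area 42 (gap = 0) or the slightly separated neighboring area 42' (gap > 0) can be used; the block size, the gap, and the threshold THin are assumed values introduced for illustration.

```python
import numpy as np

def luminance_difference(image, row, col, block=8, gap=0):
    """Difference between the mean luminance of the attention area 41 at
    (row, col) and that of a comparison area to its right: gap=0 compares with
    the adjacent area 42, gap=block skips one block (neighboring area 42')."""
    attention = image[row:row + block, col:col + block].astype(np.float32)
    c2 = col + block + gap
    neighbor = image[row:row + block, c2:c2 + block].astype(np.float32)
    return float(abs(attention.mean() - neighbor.mean()))

def select_by_luminance(image, candidate_areas, th_in=20.0, block=8, gap=0):
    """Keep the candidate road-surface areas (given as (row, col) tuples)
    whose luminance difference exceeds the threshold THin (steps S202-S203)."""
    return [(r, c) for (r, c) in candidate_areas
            if luminance_difference(image, r, c, block, gap) > th_in]
```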
As described above, according to the stereo camera 100 of the second embodiment, the area used for calculating the parallax shift amount is selected according to the magnitude of the luminance difference, and the parallax shift amount ΔD is calculated based on the selected area. Therefore, it is possible to provide an imaging device that can detect the parallax shift with high precision, as in the first embodiment.
[Third Embodiment]
Next, a stereo camera 100 according to a third embodiment will be described with reference to FIGS. 7 and 8. Since the overall configuration of the stereo camera 100 of the third embodiment is the same as that of the first embodiment (FIGS. 1A and 1B), redundant description is omitted. The third embodiment differs from the above-described embodiments in how the parallax shift correction unit 16 calculates the parallax shift amount. Specifically, after area selection is performed by the area selection unit 14 in the same manner as in the above-described embodiments, the parallax shift correction unit 16 calculates a representative value (for example, the median Median(J)) of the parallax shift amounts ΔD of the plurality of areas for each vertical coordinate J (1, 2, ..., m, ...), as shown in FIG. 7.
Next, the average of the calculated median values Median(1) to Median(m) is calculated as the average parallax shift amount ΔDav. Using this ΔDav as the input, the parallax shift correction unit 16 calculates the parallax shift correction amount ΔC. However, if the standard deviation SD of the median values Median(J) used to calculate the average ΔDav is larger than a threshold, it is preferable not to perform parallax shift correction for that captured image, because a large standard deviation SD indicates that the reliability of the areas containing those coordinates is low.
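A minimal sketch of this per-row aggregation, assuming the parallax shift amounts are supplied as (J, ΔD) pairs for the selected areas and that the standard-deviation threshold is a free parameter; the names and default value are illustrative.

```python
import numpy as np
from collections import defaultdict

def aggregate_per_row(samples, sd_threshold=0.5):
    """Third embodiment: median of dD per vertical coordinate J, then the mean.

    samples: iterable of (J, dD) pairs for the selected areas
    Returns dDav, or None when the spread of the row medians is too large and
    correction should be skipped for this captured image.
    """
    per_row = defaultdict(list)
    for j, dd in samples:
        per_row[j].append(dd)

    medians = np.array([np.median(v) for v in per_row.values()])   # Median(J)
    if medians.std() > sd_threshold:      # reliability check on the row medians
        return None                       # skip correction for this image
    return float(medians.mean())          # dDav, the input for computing dC
```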
In the example shown in FIG. 8, one median value Median(J) is calculated for each coordinate J, but one median value Median(k to k+x) may instead be calculated collectively for a plurality of coordinates k to k+x. In particular, for the road surface far from the vehicle (the road surface appearing in the upper part of the image), it is preferable to calculate one median value over a plurality of vertical coordinates. The reason is that, compared with the nearby road surface (appearing in the lower part of the image), the distant road surface is captured by the first imaging unit 11 and the second imaging unit 12 with less information per pixel, which lowers the accuracy of parallax shift correction. Calculating one median value over a plurality of coordinates for the distant road surface therefore prevents this loss of accuracy.
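Continuing the sketch above, one possible way to bin several distant rows into a single median is shown below; the boundary between near and far rows, the bin width, and the assumption that the row index J increases downward are illustrative choices not fixed by this description.

```python
import numpy as np
from collections import defaultdict

def aggregate_with_row_binning(samples, far_row_limit=120, bin_width=4):
    """One median per group of rows for the distant road surface.

    Assumes the row index J increases downward, so rows with J < far_row_limit
    show the distant road surface (upper part of the image) and are grouped
    into bins of bin_width coordinates; nearby rows keep one median per J.
    """
    per_key = defaultdict(list)
    for j, dd in samples:
        key = ("far", j // bin_width) if j < far_row_limit else ("near", j)
        per_key[key].append(dd)
    return {k: float(np.median(v)) for k, v in per_key.items()}
```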
Alternatively, as shown in FIG. 9, instead of computing a median for each coordinate, the median of the parallax shift amounts ΔD of all the selected areas may be calculated; that median is then taken as the parallax shift amount ΔD for the captured image and can also be used as the parallax shift correction amount ΔC.
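A sketch of this whole-image variant; the only assumption is that the per-area shift amounts ΔD are available as an array.

```python
import numpy as np

def global_median_shift(delta_d):
    """FIG. 9 variant: one median over the dD values of all selected areas,
    used as the parallax shift amount (and correction amount) for the image."""
    return float(np.median(np.asarray(delta_d, dtype=float)))
```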
Although various embodiments of the present invention have been described above, the present invention is not limited to these embodiments and includes various modifications. For example, the embodiments have been described in detail for ease of understanding, and the invention is not necessarily limited to configurations having all of the described elements. Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment. It is also possible to add, delete, or replace part of the configuration of each embodiment with another configuration.
100... stereo camera, 200... external device, 1... stereo camera body, 2... image processing device, 3... CPU, 4... memory, 5... interface, BU... bus, 11... first imaging unit, 12... second imaging unit, 13... parallax calculation unit, 14... area selection unit, 15... parallax shift amount calculation unit, 16... parallax shift correction unit.
Claims (15)
- An imaging device comprising: a first imaging unit that acquires a first captured image; a second imaging unit that is arranged at a position a fixed distance from the first imaging unit and acquires a second captured image; a parallax calculation unit that calculates a parallax in each area of the first captured image and the second captured image; an area selection unit that selects, from the first captured image or the second captured image, an area whose parallax value has a reliability equal to or higher than a threshold; a parallax shift amount calculation unit that compares the parallax value in each area selected by the area selection unit with an ideal parallax value to be obtained in that area and obtains a parallax shift amount for each area; and a parallax shift correction unit that corrects the parallax shift of the first imaging unit and the second imaging unit by a correction amount determined according to the parallax shift amount of each area.
- The imaging device according to claim 1, wherein the parallax calculation unit calculates the parallax by performing template matching using the first captured image as a base image and the second captured image as a reference image, and the area selection unit selects the area based on a degree of protrusion, which is the difference between the minimum value of a peak of the similarity graph obtained in the template matching and a reference value.
- The imaging device according to claim 2, wherein the reference value is the average value of the similarity in a region excluding the vicinity of the peak.
- The imaging device according to claim 2, wherein the reference value is the minimum value of a peak other than the peak.
- The imaging device according to claim 1, wherein the parallax shift correction by the parallax shift correction unit is stopped when the number of areas selected by the area selection unit is smaller than a predetermined number.
- The imaging device according to claim 1, wherein the parallax shift correction unit calculates a representative value of the parallax shift amounts calculated for the respective areas and calculates the parallax shift correction amount according to the representative value.
- The imaging device according to claim 1, wherein the area selection unit selects the area based on the magnitude of a luminance difference between an attention area in the first captured image or the second captured image and an area corresponding to the attention area.
- A parallax shift correction method comprising: acquiring a first captured image captured by a first imaging unit and a second captured image captured by a second imaging unit arranged at a position a fixed distance from the first imaging unit; calculating a parallax in each area of the first captured image and the second captured image; selecting, from the first captured image or the second captured image, an area whose parallax value has a reliability equal to or higher than a threshold; comparing the parallax value in each selected area with an ideal parallax value to be obtained in that area to obtain a parallax shift amount for each area; and correcting the parallax shift of the first imaging unit and the second imaging unit by a correction amount determined according to the parallax shift amount of each area.
- The parallax shift correction method according to claim 8, wherein the step of calculating the parallax calculates the parallax by performing template matching using the first captured image as a base image and the second captured image as a reference image, and the step of selecting the area selects the area based on a degree of protrusion, which is the difference between the minimum value of a peak of the similarity graph obtained in the template matching and a reference value.
- The parallax shift correction method according to claim 9, wherein the reference value is the average value of the similarity in a region excluding the vicinity of the peak.
- The parallax shift correction method according to claim 9, wherein the reference value is the minimum value of a peak other than the peak.
- The parallax shift correction method according to claim 8, wherein the parallax shift correction is stopped when the number of areas selected in the step of selecting the area is smaller than a predetermined number.
- The parallax shift correction method according to claim 8, wherein the correction of the parallax shift calculates a representative value of the parallax shift amounts calculated for the respective areas and calculates the parallax shift correction amount according to the representative value.
- The parallax shift correction method according to claim 8, wherein the step of selecting the area selects the area based on the magnitude of a luminance difference between an attention area in the first captured image or the second captured image and an area corresponding to the attention area.
- A parallax shift correction program configured to cause a computer to execute: acquiring a first captured image captured by a first imaging unit and a second captured image captured by a second imaging unit arranged at a position a fixed distance from the first imaging unit; calculating a parallax in each area of the first captured image and the second captured image; selecting, from the first captured image or the second captured image, an area whose parallax value has a reliability equal to or higher than a threshold; comparing the parallax value in each selected area with an ideal parallax value to be obtained in that area to obtain a parallax shift amount for each area; and correcting the parallax shift of the first imaging unit and the second imaging unit by a correction amount determined according to the parallax shift amount of each area.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023574929A JPWO2023139675A1 (en) | 2022-01-19 | 2022-01-19 | |
DE112022005074.8T DE112022005074T5 (en) | 2022-01-19 | 2022-01-19 | Imaging device, parallax shift correction method and parallax shift correction program |
PCT/JP2022/001734 WO2023139675A1 (en) | 2022-01-19 | 2022-01-19 | Imaging device, parallax displacement correction method, and parallax displacement correction program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2022/001734 WO2023139675A1 (en) | 2022-01-19 | 2022-01-19 | Imaging device, parallax displacement correction method, and parallax displacement correction program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023139675A1 true WO2023139675A1 (en) | 2023-07-27 |
Family
ID=87348230
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/001734 WO2023139675A1 (en) | 2022-01-19 | 2022-01-19 | Imaging device, parallax displacement correction method, and parallax displacement correction program |
Country Status (3)
Country | Link |
---|---|
JP (1) | JPWO2023139675A1 (en) |
DE (1) | DE112022005074T5 (en) |
WO (1) | WO2023139675A1 (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012026895A (en) * | 2010-07-23 | 2012-02-09 | Canon Inc | Position attitude measurement device, position attitude measurement method, and program |
WO2012098862A1 (en) * | 2011-01-17 | 2012-07-26 | パナソニック株式会社 | Image file generation device, image file reproduction device, and image file generation method |
JP2019061303A (en) * | 2017-09-22 | 2019-04-18 | 株式会社デンソー | Periphery monitoring apparatus and periphery monitoring method for vehicle |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6521796B2 (en) | 2015-08-26 | 2019-05-29 | 株式会社Subaru | Stereo image processing device |
Also Published As
Publication number | Publication date |
---|---|
DE112022005074T5 (en) | 2024-08-29 |
JPWO2023139675A1 (en) | 2023-07-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6707022B2 (en) | Stereo camera | |
WO2018142900A1 (en) | Information processing device, data management device, data management system, method, and program | |
JP6182866B2 (en) | Calibration device, distance measuring device, and vehicle | |
US6381360B1 (en) | Apparatus and method for stereoscopic image processing | |
JP4856611B2 (en) | Object detection device | |
JP5404263B2 (en) | Parallax calculation method and parallax calculation device | |
JP5180126B2 (en) | Road recognition device | |
JP6592991B2 (en) | Object detection apparatus, object detection method, and program | |
WO2014002692A1 (en) | Stereo camera | |
JPWO2018042954A1 (en) | In-vehicle camera, adjustment method of in-vehicle camera, in-vehicle camera system | |
WO2012029658A1 (en) | Imaging device, image-processing device, image-processing method, and image-processing program | |
WO2018038257A1 (en) | Object detecting method and device therefor | |
CN111989541B (en) | Stereo camera device | |
CN108389228B (en) | Ground detection method, device and equipment | |
JP7152506B2 (en) | Imaging device | |
JP7380443B2 (en) | Partial image generation device and computer program for partial image generation | |
WO2023139675A1 (en) | Imaging device, parallax displacement correction method, and parallax displacement correction program | |
JP6739367B2 (en) | Camera device | |
JP7269130B2 (en) | Image processing device | |
JP7382601B2 (en) | distance measuring device | |
JP6680335B2 (en) | Stereo camera, vehicle, calculation method and program | |
WO2020031980A1 (en) | Method for correcting lens marker image, correcting device, program, and recording medium | |
JP6365744B2 (en) | Program, recording medium, and calibration method | |
JPH11190611A (en) | Three-dimensional measuring method and three-dimensional measuring processor using this method | |
JP2018116620A (en) | Image processing device, image processing system, and image processing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 112022005074; Country of ref document: DE |
| ENP | Entry into the national phase | Ref document number: 2023574929; Country of ref document: JP; Kind code of ref document: A |