WO2019030995A1 - ステレオ画像処理装置 - Google Patents
ステレオ画像処理装置
- Publication number
- WO2019030995A1 (application PCT/JP2018/017424)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- lens
- sensor
- camera
- center
- image processing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
Images
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/10—Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument
- G01C3/14—Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument with binocular observation at a single point, e.g. stereoscopic type
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/08—Stereoscopic photography by simultaneous recording
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/25—Image signal generators using stereoscopic image cameras using two or more image sensors with different characteristics other than in their location or field of view, e.g. having different resolutions or colour pickup characteristics; using image signals from one sensor to control the characteristics of another sensor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/296—Synchronisation thereof; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/45—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
Definitions
- the present invention relates to a stereo image processing apparatus.
- Stereo camera technology is known as a 3D object recognition technology.
- This technology detects parallax by triangulation, using the difference between the images captured by two cameras placed at different positions, and detects the depth and position of an object from the parallax.
- The position of the observation target can therefore be detected accurately.
- On the other hand, stereo camera technology has the problem that the effective field of view is narrow.
- Patent Document 1 below aims to "provide an imaging device capable of high-speed processing with a reduced amount of arithmetic processing while securing a sufficient field of view".
- It describes a technique in which "the left and right cameras 4 and 6 offset the imaging center, i.e. the center of the imaging element, by the same amount in opposite directions with respect to the optical center, i.e. the optical axis of the optical lens.
- In the corresponding-point search, the corresponding point R confirmed in the image of the right camera 6 is set as the reference starting point for searching for the corresponding point L in the image of the left camera 4.
- Since the parallax search can be shortened compared with using the point corresponding to infinity as the reference point, the amount of arithmetic processing can be reduced and high-speed processing can be performed" (see the abstract).
- Patent Document 2 below describes a technique that "obtains a wide effective field of view by using the imaging region of non-stereo vision while acquiring accurate distance information from the imaging region of stereo vision.
- The optical axes of the cameras 1a and 1b of the stereo camera 1 are arranged non-parallel in the same plane, and the effective field of view consists of the stereo-vision region RM and the non-stereo-vision regions RS.
- A pivot shaft 3 is provided at the center position of the stay 2, and the entire camera system is rotated by a drive motor 4 to enable wide-area scanning.
- When applied to an intruder monitoring apparatus, an intruder can be detected if an object enters the field of view of either camera, so the detection range is wider than that of an apparatus using a stereo camera whose optical axes are arranged in parallel.
- In addition, reliable detection by the stereo method can be performed, and erroneous detection can be prevented to improve reliability" (see the abstract).
- Patent Document 3 below aims to "provide a stereo camera device capable of detecting objects over a wider spatial range with high accuracy in distance calculation and object detection, and a vehicle equipped with the stereo camera device".
- It discloses a stereo camera device (1) comprising a first imaging unit (2), a second imaging unit (3) that captures a range shifted from the range captured by the first imaging unit (2) in a direction different from the direction along the base length, and a control unit (4) that calculates the distance to a subject using the overlapping region of the first image captured by the first imaging unit (2) and the second image captured by the second imaging unit (3) (see the abstract).
- the imaging device described in Patent Document 1 includes two sensors that receive light.
- the left sensor receives light from the left lens
- the right sensor receives light from the right lens.
- The center of the left sensor is shifted away from the right lens with respect to the optical axis of the left lens, and the center of the right sensor is shifted away from the left lens with respect to the optical axis of the right lens.
- Patent Document 1 aims to widen the stereo field of view with this configuration.
- However, since the field of view that a sensor can detect depends on the sensor size, further enlarging the effective field of view is considered to require measures such as increasing the device size.
- In Patent Document 2, a stereo field of view and a non-stereo (monocular) field of view are created by tilting the two cameras, thereby enlarging the effective field of view.
- However, when the peripheral portion of a captured image is distorted, the distortions of the images captured by the two cameras do not match, so there is a problem with the accuracy of detecting the depth or position of an object.
- In particular, since the peripheral portion is distorted relative to the central portion, the detection accuracy is significantly reduced.
- The configuration of Patent Document 2 is therefore considered useful only when a lens with small distortion is used.
- In Patent Document 3, the sensors are shifted in the vertical direction. This configuration is considered to gain little field-of-view enlargement from a distorted lens, and its effect of enlarging the field of view in the horizontal direction is also poor.
- The present invention has been made in view of the above problems, and provides a stereo image processing apparatus capable of expanding the effective field of view by using a distorted lens that realizes a wide field of view.
- In the stereo image processing apparatus according to the present invention, the two sensors are arranged shifted away from each other with respect to the optical axes of the lenses, and the lenses have larger distortion than an fθ lens in the region where the viewing angle is large.
- According to the stereo image processing apparatus of the present invention, the effective field of view can be expanded by using a distorted lens that realizes a wide field of view.
- FIG. 1 is a configuration diagram of a stereo camera 1 according to Embodiment 1.
- FIG. 2 is a schematic view of the positions of the two sensors with respect to the optical axes of the stereo camera 1.
- FIG. 3 is a functional block diagram of a processing unit 2 provided in the stereo image processing apparatus according to Embodiment 1.
- FIG. 4 is a graph showing the dependence of the distance d on the distance L.
- FIG. 5 is a graph showing the relationship between the sensor shift amount and the increase in the effective field of view.
- FIG. 6 is a diagram explaining the difference in viewing angle.
- FIG. 7 is a configuration diagram of a light shielding cover according to Embodiment 1.
- FIG. 8 is a configuration diagram of a stereo camera 1 according to Embodiment 2.
- FIG. 9 is a configuration diagram of a stereo camera 1 according to Embodiment 3.
- FIG. 10 is a schematic view of the positions of the two sensors with respect to the optical axes of the stereo camera 1 according to Embodiment 3.
- FIG. 1 is a block diagram of a stereo camera 1 according to Embodiment 1 of the present invention.
- The stereo camera 1 includes a camera CR on the right side, a camera CL on the left side, and a processing unit 2 that processes their outputs (the processing unit is described later).
- First, the right camera CR will be described.
- The light 100R reflected by or transmitted through the target object 100 passes through the center of the lens LR of the right camera CR and enters the sensor CSR.
- At this time, the light 100R that has passed through the center of the lens LR is incident not near the center MR of the sensor CSR but on the left side of the sensor CSR.
- A plan view of the sensor CSR is shown at the lower right of FIG. 1.
- The sensor CSR is disposed shifted away from the camera CL with respect to the optical axis OR of the lens LR.
- The light 100R is therefore incident on the left side of the sensor center MR.
- Next, the left camera CL will be described.
- The light 100L reflected by or transmitted through the target object 100 passes through the center of the lens LL of the left camera CL and enters the sensor CSL.
- At this time, the light 100L that has passed through the center of the lens LL is incident not near the center ML of the sensor CSL but on the right side of the sensor CSL.
- A plan view of the sensor CSL is shown at the lower left of FIG. 1.
- The sensor CSL is disposed shifted away from the camera CR with respect to the optical axis OL of the lens LL. The light 100L is therefore incident on the right side of the sensor center ML.
- Let the distance between the target object 100 and the lenses be L, the distance between the two cameras (the base length) be D, and the distance between lens and sensor (between the lens LR and the sensor CSR, and between the lens LL and the sensor CSL) be Lc.
- Let the distance between the incident position of the light 100R on the sensor CSR and the axis OR (and likewise between the incident position of the light 100L on the sensor CSL and the axis OL) be d.
- Let the focal length of the lenses be f.
- From the lens formula, Lc = fL / (L − f).
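- A compact restatement of these relations is given below (a sketch only: the expression for d assumes, for illustration, a target object located midway between the two cameras at distance L, which is not stated explicitly in the text):

$$
\frac{1}{f} = \frac{1}{L} + \frac{1}{L_c} \;\Rightarrow\; L_c = \frac{fL}{L-f},
\qquad
\frac{d}{L_c} = \frac{D/2}{L} \;\Rightarrow\; d = \frac{D\,L_c}{2L} = \frac{Df}{2(L-f)}
$$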
- In Patent Document 1, assuming that the maximum distance at which the stereo camera 1 can detect the distance of the target object 100 is Lmax, the sensor CSR and the sensor CSL are shifted away from each other by DfLmax / {2L(L − f)}. This is intended to widen the common field of view (stereo field of view) that can be detected by the two sensors.
- The maximum distance Lmax here is the maximum distance at which an image accurate enough to detect the distance and position of the target object 100 can be acquired. For example, when the target object 100 is extremely far away, the images acquired by the sensor CSR and the sensor CSL are blurred, so the distance and position of the target object 100 cannot be detected accurately (the detection accuracy falls below the allowable range).
- The maximum distance Lmax is the maximum distance at which such a failure does not occur.
- In contrast, in Embodiment 1, the sensor CSR and the sensor CSL are arranged shifted away from each other by more than DfLmax / {2L(L − f)}.
- As a result, a common field of view S detectable by both sensors (the filled area in FIG. 1), a field of view T1 detectable only by the sensor CSR (shaded area), and a field of view T2 detectable only by the sensor CSL (shaded area) are generated.
- By using the fields of view T1 and T2, the viewing angle detectable by the stereo camera 1 can be widened, although these are monocular fields of view.
- In the lenses LR and LL of Embodiment 1, the peripheral portion is distorted with respect to the viewing angle θ more strongly than in equidistant projection (fθ, where f is the focal length and θ is the viewing angle).
- An fθ lens has the characteristic that the image height increases in proportion to the viewing angle θ.
- The lenses LR and LL of Embodiment 1 have almost the same characteristic as an fθ lens in the region where the viewing angle θ is small (that is, the image height is proportional to θ), but in the region where θ is large (that is, in the peripheral portion) the image height is smaller than that of an fθ lens.
- In other words, the lenses LR and LL of Embodiment 1 have the characteristic that, in the peripheral portion, the increment of the image height gradually decreases as θ increases.
- An example of such a lens is an orthographic-projection (f·sinθ) lens.
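- A small numerical sketch of how the image height behaves under the three projection models mentioned in this text (equidistant fθ, orthographic f·sinθ, and central f·tanθ); the focal length of 3.5 mm is taken from the example given later, and everything else is illustrative:

```python
import math

f = 3.5  # focal length in mm (value used in the examples later in this text)

# Image height r as a function of viewing angle theta for three projection models.
projections = {
    "equidistant  f*theta     ": lambda th: f * th,
    "orthographic f*sin(theta)": lambda th: f * math.sin(th),
    "central      f*tan(theta)": lambda th: f * math.tan(th),
}

for deg in (10, 30, 50, 70):
    th = math.radians(deg)
    heights = ", ".join(f"{name}: {fn(th):.2f} mm" for name, fn in projections.items())
    print(f"theta = {deg:>2} deg -> {heights}")

# At small theta the three models nearly coincide; at large theta the orthographic
# model's image height grows more slowly than f*theta, i.e. each peripheral pixel
# covers a wider angular field, which is the property exploited in Embodiment 1.
```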
- The resolution of the sensor corresponds to the field of view per pixel; in other words, reducing the resolution corresponds to enlarging the field of view per pixel. In Embodiment 1, therefore, a distorted lens is used to widen the field of view per pixel in the peripheral portion. In this case, the resolution of the peripheral portion is lower than that of the central portion (the region where θ is small). However, for an on-vehicle camera or a drone camera, for example, the resolution in the traveling direction must be high, while the resolution in the periphery may be lower than in the center. In such applications, the benefit of widening the viewing angle of the peripheral portion with the configuration of Embodiment 1 is considered to be greater.
- The distance L is, for example, 50 m to 250 m (or more) for an on-vehicle camera or a drone, 2 m to 5 m when mounted on an inspection apparatus or the like, and 10 m to 20 m when mounted on a robot or the like.
- The distance Lc between the lens and the sensor is adjusted, according to the application, so that the target object 100 is seen most clearly.
- FIG. 2 is a schematic view of the positions of the two sensors with respect to the optical axis of the stereo camera 1.
- the vertical axis indicates the vertical direction
- the horizontal axis indicates the horizontal direction.
- the origin indicates the positions of the light 100L and the light 100R.
- the dotted line indicates the sensor CSR of the camera CR.
- the one-dot chain line indicates the sensor CSL of the camera CL.
- As shown, the ranges detected by the left and right sensors differ from each other.
- As a result, a common range Sa (solid area), a range Ta1 detected only by the sensor CSR (shaded area), and a range Ta2 detected only by the sensor CSL (hatched area) are generated.
- FIG. 3 is a functional block diagram of the processing unit 2 provided in the stereo image processing apparatus according to the first embodiment.
- the A / D converters 200R and 200L respectively convert image signals detected by the sensors CSR and CSL into digital signals.
- The correction circuits 201R and 201L convert the distorted images into a predetermined projection system, for example from orthographic projection to central projection (f·tanθ).
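- The text does not describe how this conversion is implemented; the following is a minimal sketch of one possible radial remapping from an orthographic-projection image to a central-projection image using OpenCV. The function and variable names are illustrative and not taken from the patent.

```python
import numpy as np
import cv2

def ortho_to_central(img: np.ndarray, f_px: float, cx: float, cy: float) -> np.ndarray:
    """Resample an orthographic-projection image (r_in = f*sin(theta)) onto a
    central-projection grid (r_out = f*tan(theta)). f_px is the focal length in
    pixels and (cx, cy) is the principal point of the source image."""
    h, w = img.shape[:2]
    ys, xs = np.indices((h, w), dtype=np.float32)
    dx, dy = xs - cx, ys - cy
    r_out = np.hypot(dx, dy)                 # radius in the central-projection output
    theta = np.arctan(r_out / f_px)          # viewing angle of each output pixel
    r_in = f_px * np.sin(theta)              # corresponding radius in the source image
    scale = np.divide(r_in, r_out, out=np.ones_like(r_out), where=r_out > 0)
    map_x = (cx + dx * scale).astype(np.float32)
    map_y = (cy + dy * scale).astype(np.float32)
    return cv2.remap(img, map_x, map_y, interpolation=cv2.INTER_LINEAR)
```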
- The stereo matching circuit 202 converts the images in the common field of view (field of view Sa in FIG. 2) into distance image data and outputs it to the image recognition circuit 203. The images in the monocular fields of view (fields of view Ta1 and Ta2 in FIG. 2) are also output to the image recognition circuit 203 at the same time.
- The image recognition circuit 203 recognizes objects contained in the acquired image data, generates information such as the presence or absence of cars, people, and obstacles, distance information, and signs and signals, and outputs this information to the control circuit 204.
- the control circuit 204 controls the vehicle according to the information acquired from the image recognition circuit 203.
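- As a rough illustration of the stereo-matching step (a sketch, not the circuit described in the patent), the following computes a disparity map over the common field of view with OpenCV's block matcher and converts it to depth using Z = f·D/disparity. The matcher parameters and the example focal-length/baseline values are placeholders, and the inputs are assumed to be rectified 8-bit grayscale images.

```python
import numpy as np
import cv2

def depth_from_rectified_pair(left_gray: np.ndarray, right_gray: np.ndarray,
                              f_px: float, baseline_m: float) -> np.ndarray:
    """Disparity -> depth for a rectified pair covering the common (stereo) field of view.
    f_px: focal length in pixels, baseline_m: base length D in metres."""
    matcher = cv2.StereoBM_create(numDisparities=128, blockSize=15)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0  # fixed point -> px
    depth = np.full(disparity.shape, np.inf, dtype=np.float32)
    valid = disparity > 0
    depth[valid] = f_px * baseline_m / disparity[valid]   # Z = f * D / disparity
    return depth

# Example usage with placeholder values (3.5 mm lens on 2.75 um pixels, D = 0.35 m):
# f_px = 3.5e-3 / 2.75e-6  # ~1273 px
# depth = depth_from_rectified_pair(left_common, right_common, f_px=1273.0, baseline_m=0.35)
```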
- A stereo camera mounted on an inspection apparatus or a robot is required to detect the depth and position of objects with high accuracy over the entire field of view, so it may be required to secure as much stereo vision as possible.
- In contrast, a stereo camera mounted on a car or a drone needs to detect the depth and position of objects with high accuracy only within a predetermined angular range in the traveling direction; at other angles this is not necessary, and even monocular detection can satisfy the performance requirements.
- Therefore, in Embodiment 1, the effective field of view is expanded by intentionally reducing the stereo field of view and increasing the monocular field of view.
- FIG. 4 is a graph showing the dependence of the distance d on the distance L.
- the vertical axis indicates the distance d
- the horizontal axis indicates the distance L.
- Each line in FIG. 4A corresponds to a base length D, and each line in FIG. 4B corresponds to a lens focal length f.
- In FIG. 4A, the lens focal length f is 3.5 mm and the base length D is varied from 0.05 m to 1.00 m; in FIG. 4B, the base length D is 0.35 m and the lens focal length f is varied from 1 mm to 10 mm.
- For example, when the focal length f is 3.5 mm, the base length D is 0.35 m, and the distance L is 100 m, the distance d is about 0.006 mm. This corresponds to two pixels if the pixel size of the sensor is 0.003 mm.
- This situation does not change greatly even if the base length D or the focal length f is changed.
- Even when the focal length f is 10 mm, the base length D is 0.35 m, and the distance L is 50 m, the distance d is less than 0.040 mm, which corresponds to about a dozen pixels.
- In Patent Document 1, the sensor is shifted by this amount, i.e. from two to a dozen or so pixels, to maximize the stereo field of view.
- In contrast, in Embodiment 1, the effective field of view is expanded by shifting each sensor by, for example, 0.1 mm to several mm (depending on the sensor size). In other words, the sensor shift amount in Embodiment 1 is much larger than in Patent Document 1.
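- As an illustrative check of the figures quoted above (a minimal sketch; the closed-form d = Df / (2(L − f)) is an assumption derived from the lens formula and similar triangles, not an equation stated in the original text):

```python
# Rough numerical check of the distance d for the parameter sets quoted above.
# Assumption: d = D*f / (2*(L - f)), i.e. the image-side offset of a target
# object located midway between the two cameras at distance L.

def sensor_offset_mm(D_m: float, f_mm: float, L_m: float) -> float:
    """Image-side offset d [mm] for base length D [m], focal length f [mm], distance L [m]."""
    D = D_m * 1000.0   # convert to mm
    L = L_m * 1000.0
    return D * f_mm / (2.0 * (L - f_mm))

PIXEL_MM = 0.003  # assumed pixel size of 0.003 mm, as in the text

for D_m, f_mm, L_m in [(0.35, 3.5, 100.0), (0.35, 10.0, 50.0)]:
    d = sensor_offset_mm(D_m, f_mm, L_m)
    print(f"D={D_m} m, f={f_mm} mm, L={L_m} m -> d ~ {d:.4f} mm ~ {d / PIXEL_MM:.1f} px")
    # Expected: ~0.006 mm (~2 px) and ~0.035 mm (~12 px), matching the description.
```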
- FIG. 5 is a graph showing the relationship between the sensor shift amount and the effective visual field increase amount.
- The optical conditions are as follows: (a) projection method: orthographic projection; (b) lens focal length f: 3.5 mm; (c) sensor pixel size: 2.75 μm (horizontal) × 2.75 μm (vertical); (d) number of sensor pixels: 1956 (horizontal) × 1266 (vertical).
- In Patent Document 3, the effective field of view is expanded by shifting the sensors of the two cameras in opposite directions perpendicular to the line connecting the lenses (i.e., vertically). While this method can expand the effective field of view by the sensor shift, it is difficult to obtain the benefit of a distorted lens that realizes a wide field of view.
- The dotted line in FIG. 5 also represents this.
- In contrast, the effective field of view can be increased further by shifting the sensors in the horizontal direction as in Embodiment 1. The reason is considered to be as follows.
- By using a distorted lens, the field of view per pixel in the peripheral portion is increased. Since a stereo camera is usually required to recognize objects in the horizontal direction, the sensor size in the horizontal direction is large, and the effect of distortion is therefore easy to obtain in the left-right direction. In the vertical direction, on the other hand, the sensor size is smaller than in the horizontal direction, so almost no distortion effect is obtained. Because horizontal information is important for an on-vehicle sensor, the horizontal sensor size is expected to be even larger than that of a normal imaging sensor, and the increase in the effective field of view in the horizontal direction relative to the vertical direction is therefore considered to become even larger.
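- The following sketch reproduces the kind of relationship plotted in FIG. 5 under the optical conditions listed above. The formula used (effective half-angle = asin((half sensor width + shift)/f) for an orthographic lens, with the two sensors shifted in opposite horizontal directions) is an assumption for illustration, not the patent's own calculation.

```python
import math

f_mm = 3.5                 # lens focal length (condition (b) above)
pitch_mm = 0.00275         # horizontal pixel size (condition (c))
n_px_h = 1956              # number of horizontal pixels (condition (d))
half_sensor = n_px_h * pitch_mm / 2.0   # ~2.69 mm

def effective_hfov_deg(shift_mm: float) -> float:
    """Combined horizontal field of view of the two cameras when each sensor is
    shifted outward by shift_mm, assuming an orthographic (f*sin(theta)) lens."""
    r = min(half_sensor + shift_mm, f_mm)           # image height cannot exceed f for f*sin(theta)
    return 2.0 * math.degrees(math.asin(r / f_mm))  # each camera covers one outer half

for s in (0.0, 0.2, 0.4, 0.6, 0.8):
    print(f"shift = {s:.1f} mm -> effective horizontal FOV ~ {effective_hfov_deg(s):.1f} deg")
```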
- FIG. 6 is a diagram for explaining the difference in viewing angle.
- FIG. 6(a) shows the configuration of the stereo camera 1 according to Embodiment 1, and FIG. 6(b) shows the configuration of the stereo camera of Patent Document 1.
- Points P1 and P2 in FIG. 6 indicate the positions of target objects that can be viewed in stereo.
- In Embodiment 1, the camera CR detects the left side of the field of view, and the camera CL detects the right side of the field of view. It is therefore possible to detect objects closer to the cameras than in Patent Document 1, in which the two cameras detect the same field of view, and Embodiment 1 is also more advantageous than Patent Document 1 from the viewpoint of detecting nearby objects.
- Since the stereo camera 1 of Embodiment 1 is focused on a target object 100 at a distance L of 50 m to 250 m, nearby objects are somewhat blurred. However, when a nearby object is detected, the positional deviation of the target object 100 between the two images is very large, so position detection is not a problem even if the images are somewhat blurred.
- the configuration of the first embodiment also contributes to downsizing of the stereo camera.
- a cover is disposed between the lens and the windshield so that light outside the field of view does not enter the camera.
- The dashed-dotted lines in FIG. 6 represent the windshield.
- A cover larger than the distance J1 or J2 is required to secure the field of view. Since the distance J1 is smaller than the distance J2 as shown in FIG. 6, the cover of the stereo camera 1 of Embodiment 1 can be made smaller than that of Patent Document 1, and as a result the stereo camera 1 can be miniaturized.
- the light shielding cover needs to be attached at the same angle as or more than the viewing angle so as not to block the field of view.
- the light shielding cover can be attached at an inclination smaller than the effective viewing angle of the two cameras.
- FIG. 7 shows an example of the light shielding cover.
- FIG. 7(a) shows the light shielding cover of the configuration of Embodiment 1, and FIG. 7(b) shows a light shielding cover of the conventional configuration of Patent Document 1.
- Here, the case where the target object is far away (for example, the distance L is 100 m) is shown.
- (a) and (b) show light shielding covers that realize the same field of view.
- (c) and (d) show the light shielding covers of (a) and (b), respectively, unified into a single cover.
- In (a), a light shielding cover 31R is disposed along the right detection limit 1R and the left detection limit 3R of the right camera's sensor, and a light shielding cover 31L is disposed along the right detection limit 1L and the left detection limit 3L of the left camera's sensor.
- In (b), a light shielding cover 42R is disposed along the right detection limit 2R and the left detection limit 4R of the right camera's sensor, and a light shielding cover 42L is disposed along the right detection limit 4L and the left detection limit 2L of the left camera's sensor.
- In (a), on the outside of the stereo camera, the light shielding covers have a smaller angle (the same angle as the right detection limit 1R and the left detection limit 3L) than the maximum viewing angle detected by the right camera's sensor (left detection limit 3R) and the maximum viewing angle detected by the left camera's sensor (right detection limit 1L).
- In (b), on the other hand, on the outside of the stereo camera, the light shielding covers have the same angle as the maximum viewing angle detected by the right camera's sensor (right detection limit 2R) and the maximum viewing angle detected by the left camera's sensor (right detection limit 4L).
- The light shielding covers therefore become large, and as a result the overall size of the stereo camera inevitably increases.
- In the configuration of Embodiment 1 shown in (a), the light shielding covers do not spread far outward relative to the casing of the stereo camera 1, so the overall size of the stereo camera can be reduced.
- Since a stereo camera is required to be compact from the viewpoint of being disposed in front of the driver's seat, the configuration of Embodiment 1 is extremely advantageous.
- In (c) and (d), the light shielding covers are integrated into a single cover.
- Here, the inclination of the light shielding cover is determined based on the detection limits, but the present invention is not limited to this; the same effect can be obtained even if a slight angular margin is given to the inclination of the light shielding cover.
- Furthermore, in Embodiment 1, the movable range of the wiper can be smaller than in Patent Document 1.
- If the required movable range of the wiper is large, the conditions of the camera CR and the camera CL with respect to rain differ from each other, and highly accurate detection cannot be performed; for example, if one camera shoots through an area where rain has been wiped away and the other camera shoots through an area where it has not, the detection accuracy decreases.
- As described above, the stereo camera 1 of Embodiment 1 can detect the target object 100 with a small device size, high accuracy, and a wide field of view. Such an effect cannot be obtained at all with a configuration in which the sensors are shifted in the vertical direction as in Patent Document 3.
- The detection accuracy could be reduced by a difference in lens distortion between the camera CR and the camera CL, as in Patent Document 2, but in Embodiment 1 this reduction can be said to be small.
- As described above, the stereo image processing apparatus according to Embodiment 1 can enlarge the effective field of view by intentionally reducing the stereo field of view and increasing the monocular field of view. Furthermore, the viewing angle can be expanded further by using a lens whose peripheral portion is distorted.
- FIG. 8 is a configuration diagram of a stereo camera 1 according to Embodiment 2 of the present invention.
- the two sensors are shifted in directions approaching each other with respect to the optical axis.
- the other configuration is the same as that of the first embodiment.
- the right camera CR will be described.
- The light 100R reflected by or transmitted through the target object 100 passes through the center of the lens LR and enters the sensor CSR. At this time, the light 100R that has passed through the center of the lens LR is incident not near the center MR of the sensor CSR but on its right side.
- the left camera CL will be described.
- The light 100L reflected by or transmitted through the target object 100 passes through the center of the lens LL and enters the sensor CSL.
- At this time, the light 100L that has passed through the center of the lens LL is incident not near the center ML of the sensor CSL but on its left side.
- In other words, the center MR of the sensor CSR is shifted to the left with respect to the axis OR passing through the center of the lens LR, and the center ML of the sensor CSL is shifted to the right with respect to the axis OL passing through the center of the lens LL.
- In Embodiment 2, an object in the monocular field of view can be recognized earlier by an amount corresponding to the base length D.
- Specifically, the left-side monocular field of view is detected by the camera CL in Embodiment 2, whereas it is detected by the camera CR in Embodiment 1.
- The camera CL can therefore detect a target object 100 on the left side earlier by the base length D.
- For example, in the case of an ordinary passenger vehicle, a target object 100 on the left side can be detected about 2.5 m earlier. This is advantageous when recognizing a fast-moving target object 100 such as a bicycle.
- As described above, the stereo image processing apparatus according to Embodiment 2 can enlarge the effective field of view by intentionally reducing the stereo field of view and increasing the monocular field of view, as in Embodiment 1. Furthermore, by shifting the sensors CSR and CSL toward each other with respect to the optical axes, a target object 100 in the monocular field of view can be recognized more quickly.
- FIG. 9 is a block diagram of a stereo camera 1 according to Embodiment 3 of the present invention.
- In the stereo camera 1 according to Embodiment 3, in addition to being shifted away from each other as in Embodiment 1, the sensors CSR and CSL are also shifted to opposite sides in the vertical direction.
- the sensor CSR shifts vertically downward
- the sensor CSL shifts vertically upward.
- the other configuration is the same as that of the first embodiment.
- the right camera CR will be described.
- the light 100R reflected or transmitted through the target object 100 passes through the center of the lens LR of the camera CR on the right side, and enters the sensor CSR.
- the light 100R transmitted through the center of the lens LR is incident on the upper left side of the sensor CSR rather than near the center MR on the sensor. That is, the center MR of the sensor CSR is shifted to the lower right side with respect to an axis OR passing through the center of the lens LR.
- the left camera CL will be described.
- the light 100L reflected or transmitted through the target object 100 passes through the center of the lens LL of the camera CL on the left side, and enters the sensor CSL.
- At this time, the light 100L that has passed through the center of the lens LL is incident not near the center ML of the sensor but on the lower right side of the sensor CSL. That is, the center ML of the sensor CSL is shifted to the upper left side with respect to an axis OL passing through the center of the lens LL.
- FIG. 10 is a schematic view of the positions of the two sensors with respect to the optical axis of the stereo camera 1.
- the vertical axis indicates the vertical direction
- the horizontal axis indicates the horizontal direction.
- the origin indicates the positions of the light 100L and the light 100R.
- the dotted line indicates the sensor CSR of the camera CR.
- the one-dot chain line indicates the sensor CSL of the camera CL.
- In Embodiment 3, the range Tb1 and the range Tb2 can additionally be detected. For example, when starting a vehicle from a stopped state, it is necessary to detect signs, signboards, white lines, and the like located diagonally nearby. According to Embodiment 3, these can be detected in the range Tb1 and the range Tb2.
- By shifting the center MR of the sensor CSR to the lower right with respect to the axis OR passing through the center of the lens LR, as in Embodiment 3, the right camera CR can detect nearby signs and signboards.
- Likewise, by shifting the center ML of the sensor CSL to the upper left with respect to the axis OL passing through the center of the lens LL, the left camera CL can detect a nearby white line on the lower right side.
- Conversely, the right camera CR can detect a nearby white line on the lower left side by shifting the center MR of the sensor CSR to the upper right with respect to the axis OR passing through the center of the lens LR.
- The left camera CL can detect nearby signs, signboards, and the like on the upper right side by shifting the center ML of the sensor CSL to the lower left with respect to the axis OL passing through the center of the lens LL.
- As described above, the stereo image processing apparatus according to Embodiment 3 can obtain new fields of view (the fields of view corresponding to the range Tb1 and the range Tb2) by shifting the two sensors in oblique directions.
- In Embodiment 3, the two sensors are shifted away from each other in the vertical direction, but a configuration in which the two sensors are shifted toward each other in the vertical direction can obtain the same effect as Embodiment 3.
- Furthermore, in Embodiment 3 the sensors CSR and CSL are shifted in oblique directions. Considering that the sensor dimension is larger in the oblique direction than in the horizontal or vertical direction, this has the advantage that the increase in the effective field of view described with reference to FIG. 5 can be made even larger.
- the present invention is not limited to the embodiments described above, but includes various modifications.
- the above-described embodiments are described in detail to explain the present invention in an easy-to-understand manner, and are not necessarily limited to those having all the configurations described.
- part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
- the two sensors of the stereo camera 1 do not necessarily have to be shifted by the same amount, and the same effect can be obtained as long as stereo vision and monocular vision can be realized simultaneously.
- Each functional unit included in the processing unit 2 can be configured using hardware, such as a circuit device in which these functions are implemented, or by executing software implementing these functions on an arithmetic device (processor).
- the control circuit 204 can also be provided outside the stereo image processing apparatus.
- Although the correction circuits 201R and 201L convert the projection method, this conversion is not necessarily required, and the images detected by the cameras may be used as they are.
- In addition, the depth and position of an object may be detected from the temporal change and size of the object using part of the image data of the cameras CR and CL (the fields of view Ta1 and Ta2 in FIG. 2).
- In Embodiment 3, the difference between the traffic lanes of Japan and the United States was described as an example, but this configuration is only an example, and any target object may be detected using the new fields of view (the fields of view corresponding to the range Tb1 and the range Tb2).
- 1: Stereo camera, 2: Processing unit, 100: Target object, 200R, 200L: A/D converter, 201R, 201L: Correction circuit, 202: Stereo matching circuit, 203: Image recognition circuit, 204: Control circuit, CR, CL: Camera, LR, LL: Lens, CSR, CSL: Sensor
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Human Computer Interaction (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Measurement Of Optical Distance (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Stereoscopic And Panoramic Photography (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP18843364.3A EP3667413B1 (en) | 2017-08-07 | 2018-05-01 | Stereo image processing device |
| US16/634,699 US10992920B2 (en) | 2017-08-07 | 2018-05-01 | Stereo image processing device |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2017152626A JP6793608B2 (ja) | 2017-08-07 | 2017-08-07 | ステレオ画像処理装置 |
| JP2017-152626 | 2017-08-07 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019030995A1 true WO2019030995A1 (ja) | 2019-02-14 |
Family
ID=65272132
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2018/017424 Ceased WO2019030995A1 (ja) | 2017-08-07 | 2018-05-01 | ステレオ画像処理装置 |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US10992920B2 (en) |
| EP (1) | EP3667413B1 (en) |
| JP (1) | JP6793608B2 (en) |
| WO (1) | WO2019030995A1 (en) |
Families Citing this family (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2021176821A1 (ja) * | 2020-03-05 | 2021-09-10 | 日立Astemo株式会社 | 撮像装置 |
| JP2021148668A (ja) * | 2020-03-19 | 2021-09-27 | 株式会社リコー | ステレオカメラ装置 |
| WO2022039404A1 (ko) * | 2020-08-20 | 2022-02-24 | (주)아고스비전 | 광시야각의 스테레오 카메라 장치 및 이를 이용한 깊이 영상 처리 방법 |
| JP7436400B2 (ja) | 2021-01-15 | 2024-02-21 | 日立Astemo株式会社 | ステレオ画像処理装置及び画像補正手段 |
| KR20220132270A (ko) | 2021-03-23 | 2022-09-30 | 삼성전자주식회사 | 카메라 모듈을 포함하는 전자 장치 및 그 전자 장치의 동작 방법 |
| WO2022230287A1 (ja) * | 2021-04-28 | 2022-11-03 | 日立Astemo株式会社 | 車載カメラ装置 |
| KR20230057649A (ko) * | 2021-10-22 | 2023-05-02 | 주식회사 한화 | 로봇 안전 감시 시스템 |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2003510666A (ja) * | 1999-09-30 | 2003-03-18 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | トラッキングカメラ |
| JP2005024463A (ja) | 2003-07-04 | 2005-01-27 | Fuji Heavy Ind Ltd | ステレオ広視野画像処理装置 |
| JP2006333120A (ja) * | 2005-05-26 | 2006-12-07 | Denso Corp | 撮像モジュール |
| JP2014140594A (ja) * | 2013-01-25 | 2014-08-07 | Fujifilm Corp | 立体内視鏡装置 |
| JP2014238558A (ja) | 2013-06-10 | 2014-12-18 | 株式会社リコー | 撮像装置及び視差検出方法 |
| WO2015182147A1 (ja) | 2014-05-28 | 2015-12-03 | 京セラ株式会社 | ステレオカメラ装置およびステレオカメラ装置を設置した車両 |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3214364B2 (ja) * | 1996-08-14 | 2001-10-02 | 富士電機株式会社 | 車間距離測定装置 |
| EP1408703A3 (en) * | 2002-10-10 | 2004-10-13 | Fuji Photo Optical Co., Ltd. | Electronic stereoscopic imaging system |
| JP5432365B2 (ja) * | 2010-03-05 | 2014-03-05 | パナソニック株式会社 | 立体撮像装置および立体撮像方法 |
| JP6124655B2 (ja) * | 2013-04-05 | 2017-05-10 | オリンパス株式会社 | 撮像装置、撮像装置の制御方法及びプログラム |
| EP3214474B1 (en) * | 2014-10-29 | 2019-07-24 | Hitachi Automotive Systems, Ltd. | Optical system, image capturing device and distance measuring system |
| EP3252514A4 (en) * | 2015-01-26 | 2018-08-22 | Hitachi Automotive Systems, Ltd. | Imaging lens, imaging device using same, and distance measuring system |
- 2017-08-07: JP JP2017152626A patent/JP6793608B2/ja active Active
- 2018-05-01: WO PCT/JP2018/017424 patent/WO2019030995A1/ja not_active Ceased
- 2018-05-01: US US16/634,699 patent/US10992920B2/en active Active
- 2018-05-01: EP EP18843364.3A patent/EP3667413B1/en active Active
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2003510666A (ja) * | 1999-09-30 | 2003-03-18 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | トラッキングカメラ |
| JP2005024463A (ja) | 2003-07-04 | 2005-01-27 | Fuji Heavy Ind Ltd | ステレオ広視野画像処理装置 |
| JP2006333120A (ja) * | 2005-05-26 | 2006-12-07 | Denso Corp | 撮像モジュール |
| JP2014140594A (ja) * | 2013-01-25 | 2014-08-07 | Fujifilm Corp | 立体内視鏡装置 |
| JP2014238558A (ja) | 2013-06-10 | 2014-12-18 | 株式会社リコー | 撮像装置及び視差検出方法 |
| WO2015182147A1 (ja) | 2014-05-28 | 2015-12-03 | 京セラ株式会社 | ステレオカメラ装置およびステレオカメラ装置を設置した車両 |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP3667413A4 |
Also Published As
| Publication number | Publication date |
|---|---|
| US20200213574A1 (en) | 2020-07-02 |
| EP3667413B1 (en) | 2023-12-27 |
| EP3667413A4 (en) | 2021-03-17 |
| US10992920B2 (en) | 2021-04-27 |
| EP3667413A1 (en) | 2020-06-17 |
| JP6793608B2 (ja) | 2020-12-02 |
| JP2019032409A (ja) | 2019-02-28 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2019030995A1 (ja) | ステレオ画像処理装置 | |
| JP5273356B2 (ja) | 複眼画像入力装置及びそれを用いた距離測定装置 | |
| CN107122770B (zh) | 多目相机系统、智能驾驶系统、汽车、方法和存储介质 | |
| US20160379066A1 (en) | Method and Camera System for Distance Determination of Objects from a Vehicle | |
| US10869002B2 (en) | Vehicle camera device for capturing the surroundings of a motor vehicle and driver assistance device for detecting objects with such a vehicle camera device | |
| JP6510551B2 (ja) | 撮像光学系、撮像装置および距離測定システム | |
| US20180137629A1 (en) | Processing apparatus, imaging apparatus and automatic control system | |
| US20180338095A1 (en) | Imaging system and moving body control system | |
| US20180276844A1 (en) | Position or orientation estimation apparatus, position or orientation estimation method, and driving assist device | |
| US11012621B2 (en) | Imaging device having capability of increasing resolution of a predetermined imaging area using a free-form lens | |
| CN113646607A (zh) | 物体位置检测装置、行驶控制系统及行驶控制方法 | |
| JP2015230703A (ja) | 物体検出装置及び物体検出方法 | |
| JP5234490B2 (ja) | ステレオ画像形成装置 | |
| JP6983740B2 (ja) | ステレオカメラシステム、及び測距方法 | |
| WO2023166813A1 (ja) | ステレオ画像処理装置 | |
| JP7134925B2 (ja) | ステレオカメラ | |
| JPWO2017028848A5 (en) | |
| CN114667729B (zh) | 多孔变焦数码摄像头及其使用方法 | |
| EP3974771A1 (en) | Stereo camera system and distance measurement method | |
| US20240295646A1 (en) | Distance measuring device, imaging device, and distance measuring method | |
| WO2023166546A1 (ja) | ステレオ画像処理装置 | |
| JP7652739B2 (ja) | 画像処理装置及び画像処理方法 | |
| KR20240130623A (ko) | 거리 계측 디바이스, 이동가능 장치, 및 제어 방법 | |
| JP2005271693A (ja) | 物体検出装置 | |
| JP2008040115A (ja) | ステレオカメラ |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18843364 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| ENP | Entry into the national phase |
Ref document number: 2018843364 Country of ref document: EP Effective date: 20200309 |