GB2541101A - Method and camera system for determining the distance of objects in relation to a vehicle - Google Patents

Method and camera system for determining the distance of objects in relation to a vehicle

Info

Publication number
GB2541101A
Authority
GB
United Kingdom
Prior art keywords
cameras
distance
determination
relation
consideration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1610870.6A
Other versions
GB201610870D0 (en)
Inventor
Reiche Martin
Heroldt Julia
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from DE102016206493.2A (DE102016206493A1)
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of GB201610870D0
Publication of GB2541101A

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/02Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/026Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring distance between sensor and object
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00Measuring distances in line of sight; Optical rangefinders
    • G01C3/10Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument
    • G01C3/14Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument with binocular observation at a single point, e.g. stereoscopic type
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/14Measuring arrangements characterised by the use of optical techniques for measuring distance or clearance between spaced objects or spaced apertures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04Interpretation of pictures
    • G01C11/06Interpretation of pictures by comparison of two or more pictures of the same area
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/593Depth or shape recovery from multiple images from stereo images
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/12Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Measurement Of Optical Distance (AREA)
  • Studio Devices (AREA)

Abstract

The distance of an object in relation to a vehicle is determined using two cameras 101, 102 that acquire differing but overlapping fields of view (FOV), wherein the image angles 103, 104 of the cameras differ from one another (i.e. the cameras have different focal lengths). The images produced of the object in the overlapping region of the cameras' fields of view are used to calculate the distance to the object. The imaging scales or distortions of the cameras may differ. The cameras may each be mono cameras. The distance may be determined in consideration of at least one of the image angles or by comparing the images of the object. Alternatively, the distance may be determined in consideration of the orientation or positioning of the cameras, which may further involve considering the angle between the optical axes of the cameras. Alternatively, the distance may be determined through back-calculation of the image distortion of the cameras. The cameras may have parallel, convergent or divergent optical axes. The cameras may have differently configured optics, e.g. one may have a wide-angle lens and the other a telephoto lens.

Description

Title
Method and camera system for determining the distance of objects in relation to a vehicle
The present invention relates to a method and a camera system for determining the distance of objects in relation to a vehicle by means of at least two cameras.
Prior art
Stereo camera systems that determine the distance in relation to an object in their overlap region on the basis of two identically configured optical paths, and integrate this into driver assistance systems, are already known from the publication Führer, D. I. T., Heger, I. T., & Heckel, M. S. J. (2014). Stereo-Videokamera als Basis für Assistenzfunktionen. ATZ-Automobiltechnische Zeitschrift, 116(2), 22-27. The stereo video camera generates so-called stereoscopic disparity information, i.e., from the comparison between the left and right images it generates a precise 3-D map of the surroundings of the vehicle. The resultant depth map comprises a highly accurate distance calculation for all points in the overlap region of the camera images.
Furthermore, the publication DE112012003685T5 discloses a system, comprising an image processing device, which is composed of a first and a second image acquisition unit, having wide-angle lenses that can acquire at least partially overlapping images. Furthermore, the system is composed of a distance measuring unit, which calculates the distance from the local vehicle to an object on the basis of a multiplicity of images acquired by the first and second image acquisition units. Inter alia, the calculation of the distance may be performed on the basis of the angle of incidence determined by a unit for determining angle of incidence.
If the cameras used ideally have parallel optical axes, the focal lengths f of the cameras are equal, and the base distance b of the cameras is known, the disparity D of a feature on the images acquired by the cameras, or image sensors, can be used to determine the distance g in relation to the object associated with the feature:
g = f*b/D
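As a minimal sketch (not part of the patent), this range equation can be evaluated directly; the focal length, base distance and disparity below are assumed illustrative values:

def stereo_distance(f, b, d):
    """Object distance g = f*b/D for parallel axes and equal focal lengths."""
    if d <= 0:
        raise ValueError("disparity must be positive for a finite distance")
    return f * b / d

# Assumed values: f = 6 mm, b = 0.20 m, disparity D = 60 um on the sensor
print(stereo_distance(6e-3, 0.20, 60e-6))  # -> 20.0 m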
This relationship is so simple here because the optical paths of both cameras are approximately identical. JP H11-39596 A discloses a camera system composed of two stereo video cameras. The stereo video cameras in that case have differing image angles. Determination of the distance of objects is possible on the basis of the acquired images of respectively one stereo video camera, the images of the respectively other stereo video camera not being included in the determination of distance. For objects present in an overlap region of the visual ranges of the two stereo video cameras, the distances determined by each of the individual stereo video cameras are compared with each other.
Disclosure of the invention
The present invention relates to a camera system for determining the distance of objects in relation to a vehicle by means of at least two cameras that acquire differing, but at least partially overlapping, image regions. The core of the invention consists in that the configurations of the imagings of at least two cameras differ from one another, and that the images of an object in the overlap region of the fields of view of the cameras that are acquired by at least two cameras are used, in an evaluation unit, to determine the distance of the object in relation to the vehicle.
The invention makes it possible to determine the distance of objects by means of at least two cameras having differently configured optics. Unlike a stereo video camera, the cameras used need not be of the same design, operating according to the same imaging laws. It is consequently possible to construct a system that can cover differing image regions with differently configured cameras, for example with a wide-angle and a telephoto lens, and that, at the same time, can effect a determination of the distance of the objects present in the overlap region. The disclosed camera system enables this coverage of the image regions to be effected by means of only two mono cameras. This results in a significant cost saving in comparison with already known camera systems, which use two stereo video cameras to cover the same image region.
The proposed invention could be applied, in particular, for driver assistance or safety functions. Systems such as, for example, emergency braking systems, lane-keeping assistance systems, lane-change assistance systems, traffic sign recognition, systems for distance control, and convenience systems such as congestion assistance systems, roadworks assistance systems and comparable systems would be conceivable.
The advantage of the invention becomes particularly clear here, since a plurality of assistance functions are conceivable with one camera system. Some of these systems must image a plurality of vehicle lanes in the surroundings very close to the vehicle, which can be realized, preferably, by means of cameras having a very wide field of view, in particular by means of wide-angle objective lenses. Other systems, such as traffic sign recognition or an assistance function that has to identify objects and/or vehicles in the far distance, preferably require for this purpose a camera system that uses, for example, a telephoto lens, by means of which a sharp image of objects in the far distance can be formed.
The proposed invention thus makes it possible, by means of a camera system, to fulfil the requirements of differing driver assistance systems and/or functions for autonomous driving, for which at least two conventional/known camera systems would be required. As a result, costs can be reduced or, alternatively, a greater number of assistance or safety functions can be realized with only one camera system.
Configuration of the imagings may be understood to mean, for example, the fields of view and/or the image angles and/or the light sensitivity and/or the resolving power and/or the pixel resolution and/or the colour filter pattern of the imager of the cameras used.
In the case of the disclosed camera system, the fields of view and/or the image angles of the cameras may additionally differ from each other in any way. The fields of view denote the image regions acquired by the cameras, frequently also abbreviated to FOV (field of view), the delimitation of which is defined by the image angles of the cameras.
The arrangements of the cameras may differ from each other in any way, for example in that their position in relation to each other varies and/or the orientations differ from each other or, more precisely, the directions and orientations of the optical axes in relation to each other. The cameras in this case may have optical axes that are parallel to each other and/or divergent and/or convergent.
Furthermore, the at least two cameras may be arranged in one housing, in which an evaluation unit may also optionally be installed, which evaluation unit, however, may also be fitted at any other location in the vehicle. Moreover, the cameras may be accommodated in at least two completely separate housings, at any locations in the vehicle. The evaluation unit in this case may likewise be located at any position in the vehicle or, alternatively, accommodated in one of the camera housings.
The camera objective lenses that are used, the optical imagings of which may be described, for example, by values such as fields of view, image angles, focal lengths and/or distances of the image sensors, may differ from each other in any way, such that, for example, at least one wide-angle lens is used and the optics of at least one further camera has, for example, a telephoto lens.
Additionally presented according to the invention is a method for determining the distance of objects in relation to a vehicle by means of at least two cameras that acquire differing, but at least partially overlapping, image regions, characterized in that the configurations of the imagings of at least two cameras differ from each other, and that the images of an object in the overlap region that are acquired by at least two cameras are used, in an evaluation unit, to determine the distance of the object in relation to the vehicle.
The cameras used for the application of the method according to the invention may be characterized in that the configurations of the imagings differ from each other in that the cameras have differing fields of view and/or image angles, and/or the imaging scales and/or distortions of the imagings differ from each other.
For the purpose of determining the distance of the object in relation to the vehicle, in a further step at least one of the fields of view and/or at least one of the image angles may be taken into consideration. Furthermore, for the purpose of calculating the distance, the acquired images of the cameras may be used, and/or consideration of the orientation of the cameras in relation to each other may be included, in particular the orientation of the optical axes. The calculation may furthermore include the positioning of the cameras in relation to each other, in particular the base distance of the cameras, consideration of the correction of the images, acquired by the cameras, by back-calculation of the distortion, and/or consideration of the determined corrected positions of an object in the image of at least two cameras by which the object was acquired.
In the determination of the distance of the object in relation to the vehicle, the determined angle difference of the object angles of at least two cameras may additionally be included. The object angle of a camera in this case describes the angle between the optical axis of the camera and the notional line from the camera to the detected object. The camera can therefore be used as a reference point, since the distance between the camera and the image sensor is very small in comparison with the distance between the camera and a detected object. Alternatively, the reference point may also be defined differently: for example, the mid-point of the foremost lens, with respect to the object, or the front or rear focal point, or the image sensor, or the camera housing may be used. All of the points mentioned in each case lie on the optical axis of the corresponding camera. Once the reference point has been clearly defined, conversion of the object angle to any other reference point can be performed at any time. Use of a more precisely specified reference point results in alternative descriptions of the object angle, such as:
• The object angle in this case describes the angle between the optical axis and a notional line to the object from the point of intersection of the foremost lens, with respect to the object, with the optical axis of the camera.
• The object angle in this case describes the angle between the optical axis and a notional line to the object from the foremost focal point, with respect to the object, on the optical axis of the camera.
• As described, it does not matter whether one uses the point of intersection of the optical axis with the vertex of the first lens or, for example, the entrance pupil, since the object distances are very large relative to this increment from entrance pupil to lens vertex.
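A minimal sketch of how the angle difference enters the determination of distance, assuming parallel optical axes, signed object angles as defined above, and a known base distance; the function name and numeric values are illustrative assumptions, not taken from the patent:

import math

def distance_from_object_angles(omega1, omega2, base):
    """Distance along the optical axes from the signed object angles (rad)
    of the two cameras and their base distance: with parallel axes,
    tan(omega_i) = x_i / g and x_1 - x_2 = base, hence
    g = base / (tan(omega_1) - tan(omega_2))."""
    denom = math.tan(omega1) - math.tan(omega2)
    if abs(denom) < 1e-12:
        raise ValueError("equal object angles: object effectively at infinity")
    return base / denom

# Assumed example: base 0.20 m, object angles 5.00 deg and 4.43 deg
print(distance_from_object_angles(math.radians(5.0), math.radians(4.43), 0.20))  # ~20 m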
Further details, features, feature combinations, advantages and effects on the basis of the invention are given by the dependent claims, the following description of the preferred exemplary embodiments of the invention and the drawings. The latter show, in a schematic representation:
Fig. 1 shows an exemplary camera system, consisting of two cameras having differing fields of view and image angles.
Fig. 2 shows a diagram representing the applied method.
Fig. 3 shows an exemplary characteristic of the image height over the object angle of two cameras.
Fig. 4 shows an exemplary optical imaging, for the definition of some terms.
Fig. 5 shows the distortions of the two cameras, plotted over the image height.
Fig. 6 shows the distortions of the two cameras, plotted over the object angle.
Figure 1 shows, exemplarily, the structure of a camera system consisting of two cameras 101, 102, which are disposed at a certain distance 112 from each other. In the example given, the optical axes 105, 106 of the cameras 101, 102 are parallel to each other, but alternatively they may also be disposed so as to be convergent or divergent.
The cameras 101, 102 are accommodated in a common housing 115, this being possible as an option, but not being a prerequisite. In addition, the camera system is connected to an evaluation unit 116, which may optionally be installed in the same housing 115 or be located outside, at any other position.
The two cameras 101, 102 have differing fields of view 113, 114, or differing image angles 103, 104. The fields of view 113, 114 overlap in a region 107, which is therefore covered by both cameras. An object 108 is present in the overlap region 107. The object 108 is picked up, at a certain object angle 110, 111, by both cameras 101, 102.
The distance 109 from the object 108 to the vehicle is determined by means of the method according to the invention.
In Figure 2, the disclosed method for determining distance is represented schematically by a sequence diagram. At the start of the method 201, the technical data of the cameras 101, 102 are known, these including, for example, the position of the cameras 101, 102, their distance 112, the orientation of the optical axes 105, 106, the image angles 103, 104 and the fields of view 113, 114.
These items of information have been and/or are noted 202 in the system before read-in of the image data 203 is effected. The sequence of processing is irrelevant at this point, however, such that step 202 may be placed at any point between 201 and 207.
After the data have been read in 203, de-distortion of the images is effected in step 204, i.e. a back-calculation of the distortion of the cameras 101, 102; see the explanation with reference to the figures that follow. Then, in step 205, the images can be standardized to a common system, such that, in step 206, the object angles 110, 111 of a common object 108 in the region 107 covered jointly by the two cameras 101, 102 can be determined. The steps 204, 205, 206 may be interchanged in any manner, without thereby altering the outcome of the method.
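A sketch of steps 204 to 206 for a single image feature, under the simplifying assumption (used again in the embodiment below) of an approximately linear imaging law h = f*Ω per camera; a real system would substitute each camera's calibrated distortion model in step 204. All names, pixel positions and parameter values are illustrative assumptions:

import math

def image_height_mm(px, py, cx, cy, pixel_pitch_mm):
    """Radial distance of a pixel from the principal point (cx, cy), in mm."""
    return math.hypot(px - cx, py - cy) * pixel_pitch_mm

def object_angle(h_mm, f_mm):
    """Invert the (here linear) imaging law h = f*omega, so that both cameras
    deliver the same standardized quantity: the object angle (steps 204-206)."""
    return h_mm / f_mm

# Same object 108, seen by two differently configured cameras 101, 102:
h1 = image_height_mm(1615, 1000, 1500, 1000, 0.003)   # camera 1, assumed 3 um pixels
h2 = image_height_mm(1547, 1000, 1500, 1000, 0.003)   # camera 2
omega1 = object_angle(h1, 6.88)                       # camera 1: longer focal length
omega2 = object_angle(h2, 3.44)                       # camera 2: shorter focal length
print(math.degrees(omega1), math.degrees(omega2))     # comparable object angles 110, 111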
In the following step 207, the distance 109 is calculated, in consideration of the already known technical data of the cameras 101, 102 and of the determined object angles 110, 111. Instead of the object angles 110, 111, the positions of the object 108 on the corrected images acquired by the cameras 101, 102 may also be used for the calculation.
These positions can likewise be used to determine the disparity, as is possible by means of the object angles 110, 111. This means that the determination of the distance 109 may also be effected in consideration of the determined distance between the images of the object 108 that are acquired by the cameras 101, 102.
The method gives, as a result 208, the distance 109 of the object 108 in relation to the vehicle. The exact reference point to which the distance 109 is measured may be defined on the basis of the requirements for the camera system. After the distance has been output and/or relayed 208, the determination of distance is ended, and may be performed over again with the same or any other object. These determinations of distance are not effected sequentially; instead, the simultaneously acquired total images of both cameras are searched, in their overlap region, for corresponding image contents. Following back-calculation of the images to a common imaging law and/or a common scale, a depth map can be determined, over the object space, from the disparity.
The determination of the distance 109 is not limited to one object 108; the distances of all objects in the overlap region can be determined simultaneously, and a precise 3-D map of the surroundings of the vehicle acquired by the cameras 101, 102 can thus be produced. For this purpose, the steps 201 to 209 may be repeated as often as required. The determination of the distance 109 in relation to any object 108 in the overlap region may be repeated over time, thereby making it possible to determine the change in the distance in relation to the object 108 over time. This, again, may be effected with any number of objects in the overlap region, as long as the objects are acquired by both cameras. This makes it possible, for example, to measure the speed of the vehicle.
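The extension from one object to a depth map can be sketched as follows; matches stands in for the output of any stereo correspondence search over the overlap region, as pairs of standardized object angles, and the base distance is an assumed value:

import math

BASE_M = 0.20  # assumed base distance 112

def depth_map(matches):
    """One distance per corresponding feature pair (omega1, omega2) in radians."""
    depths = []
    for omega1, omega2 in matches:
        denom = math.tan(omega1) - math.tan(omega2)
        depths.append(BASE_M / denom if abs(denom) > 1e-12 else float("inf"))
    return depths

# Two assumed correspondences from the overlap region 107:
print(depth_map([(math.radians(5.0), math.radians(4.43)),
                 (math.radians(12.0), math.radians(11.4))]))  # ~20.0 m and ~18.3 m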
In Figure 3, the image height h 401 of two cameras 101, 102 is plotted, exemplarily, over the object angle 110, 111.
In Figure 4, the image height 401 is illustrated exemplarily on an imaging by a lens optics system. The object angle 402 in relation to the object 403 and the focal length 404 of the lens are likewise shown in Figure 4.
In the following exemplary embodiment of the camera system, the optical axes of both cameras 101, 102 are ideally parallel, the base distance 112 of the two cameras 101, 102 being given. In a first approximation, the imagings of the cameras 101, 102 are rotationally symmetrical with respect to their optical axes 105, 106, being describable as image height h 401 over the object angle Ω 402. In the example, a maximum image height of 3 mm of the image sensor is assumed, this corresponding approximately to half the diagonal of the optically active rectangle of the image sensor.
In this example, the imaging laws of the two cameras 101, 102 are assumed to be approximately linear, with the slope of the focal length f 404. Camera 1 101 in this case has a maximum object angle 110 of 25°, camera 2 102 having a maximum object angle of 50°. The differing image heights, plotted over the angle 302, 304, of the two cameras 101, 102 are represented in Figure 3. In this case, the image height 302 corresponds to the image height of camera 1 101 and, accordingly, the image height 304 corresponds to camera 2 102. Additionally represented, in broken lines 301, 303, are the corresponding image heights of an ideal imaging according to a pinhole camera, h_s = f*tan(Ω), wherein h_s represents the image height of the pinhole-camera imaging, f represents the focal length, and Ω represents the object angle 402. These ideal imaging curves 301, 303 constitute the reference for the so-called distortion.
In Figure 5, the distortions 501, 502 of both cameras 101, 102 are plotted over the image height 401, resulting in two completely different curve characteristics. Here, distortion 501 corresponds to the distortion of camera 1 101 and, accordingly, the distortion 502 corresponds to the distortion of camera 2 102. If both distortions 601, 602 are plotted over the object angle 402, equal distortions are obtained for the two optical paths of the cameras 101, 102. Here, distortion 601 corresponds to the distortion of camera 1 101 and, accordingly, the distortion 602 corresponds to the distortion of camera 2 102.
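A numerical check of this behaviour under the embodiment's assumptions (linear law h = f*Ω, maximum image height 3 mm, maximum object angles 25° and 50°): relative to the pinhole reference h_s = f*tan(Ω), the focal length cancels out of the relative distortion, which is why the curves 601, 602 coincide when plotted over the object angle. The numeric results are ours, derived from those assumptions:

import math

H_MAX_MM = 3.0
f1 = H_MAX_MM / math.radians(25.0)   # camera 1
f2 = H_MAX_MM / math.radians(50.0)   # camera 2
print(round(f1, 2), round(f2, 2))    # 6.88 and 3.44 mm, implied focal lengths

def distortion_percent(omega_rad):
    """Relative distortion 100*(h - h_s)/h_s of a linear (f-theta) imaging;
    h = f*omega and h_s = f*tan(omega), so f cancels out."""
    return 100.0 * (omega_rad - math.tan(omega_rad)) / math.tan(omega_rad)

for deg in (10, 25, 50):
    print(deg, round(distortion_percent(math.radians(deg)), 2))
# -> 10 -1.02, 25 -6.43, 50 -26.77: identical for both cameras over the object angle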
Based on these relationships, a procedure can be described by means of which the distance 109 of an object 108 in the overlap region 107 of the fields of view 113, 114 of the two cameras 101, 102 can be determined:
• The acquired images are de-distorted according to the pixel raster of camera 1 101 and camera 2 102, i.e. the distortion is back-calculated.
• Following the de-distortion, the corrected positions of the object 108 are determined in the imagings of the cameras 101, 102, i.e. in the acquired images of the cameras 101, 102.
• This is followed by determination of the difference of the object angles 110, 111 of the cameras 101, 102.
• The object distance is then determined from the angle difference and the base distance 112, as in the sketch following this list.
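The four steps can be put together into one end-to-end sketch for a single object, under the same illustrative assumptions as before (parallel optical axes, linear imaging law per camera so that de-distortion is trivial, known base distance); all names and numbers are ours, not the patent's:

import math

BASE_M = 0.20           # assumed base distance 112
PIXEL_PITCH_MM = 0.003  # assumed 3 um pixel pitch for both imagers

def corrected_height_mm(px, py, cx, cy):
    """Steps 1 and 2: de-distortion and corrected object position; with the
    assumed ideal linear law there is no residual distortion to remove, so
    this reduces to the radial pixel distance converted to millimetres."""
    return math.hypot(px - cx, py - cy) * PIXEL_PITCH_MM

def object_distance_m(p1, p2, f1_mm, f2_mm, c1, c2):
    """Steps 3 and 4: object angles from the imaging law, then the distance
    from the angle difference and the base distance. Assumes the object lies
    on the same side of both optical axes, camera 1 seeing the larger angle."""
    omega1 = corrected_height_mm(*p1, *c1) / f1_mm
    omega2 = corrected_height_mm(*p2, *c2) / f2_mm
    return BASE_M / (math.tan(omega1) - math.tan(omega2))

# Assumed pixel positions of object 108 in both corrected images:
g = object_distance_m((1615, 1000), (1547, 1000), 6.88, 3.44,
                      (1500, 1000), (1500, 1000))
print(round(g, 1))  # ~21.8 m under these assumed values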

Claims (17)

Claims
1. Method for determining the distance of an object (108) in relation to a vehicle by means of at least two cameras (101, 102) that acquire differing (113, 114), but at least partially overlapping, fields of view (107), characterized in that
• the image angles (103, 104) of the two cameras (101, 102) differ from each other, and
• a determination of the distance (109) of the object (108) in relation to the vehicle is effected in an evaluation unit (116) by means of images of an object (108) in an overlap region (107) of the fields of view (113, 114) of the cameras (101, 102) that are acquired by the two cameras (101, 102).
2. Method according to Claim 1, characterized in that the imaging scales and/or the distortions of the cameras differ.
3. Method according to either one of the preceding claims, characterized in that the determination of the distance (109) is effected in consideration of at least one of the image angles (103, 104).
4. Method according to any one of the preceding claims, characterized in that the cameras (101, 102) are mono cameras.
5. Method according to any one of the preceding claims, characterized in that the determination of the distance (109) is effected in consideration of a comparison of the images of the object (108) acquired by the cameras (101, 102).
6. Method according to any one of the preceding claims, characterized in that the determination of the distance (109) is effected in consideration of the orientation of the cameras (101, 102), in particular the orientation of the optical axes (105, 106) in relation to each other.
7. Method according to any one of the preceding claims, characterized in that the determination of the distance (109) is effected in consideration of the positioning of the cameras (101, 102) in relation to each other, it being provided, in particular, that the determination of the distance (109) is effected in consideration of the base distance (112) of the cameras (101, 102).
8. Method according to any one of the preceding claims, characterized in that the determination of the distance (109) is effected in consideration of a correction of the images acquired by the cameras (101, 102), by back-calculation of the distortion.
9. Method according to any one of the preceding claims, characterized in that the determination of the distance (109) is effected in consideration of the determination of the corrected positions of an object (108), detected by at least two cameras (101, 102), in the images acquired by the cameras (101, 102).
10. Method according to any one of the preceding claims, characterized in that the determination of the distance (109) is effected in consideration of the determined angle difference of the object angles (110, 111), i.e. the angle between the optical axes (105, 106) of the cameras (101, 102) and the notional lines from the cameras (101, 102) to the object (108).
11. Camera system for determining the distance of an object (108) in relation to a vehicle by means of at least two cameras (101, 102) that acquire differing (113, 114), but at least partially overlapping, fields of view (107), characterized in that
• the image angles (103, 104) of the at least two cameras (101, 102) differ from each other, and
• an evaluation unit (116) is provided for determining the distance (109) of the object (108) in relation to the vehicle, being designed to determine the distance (109) by means of the images of an object (108) in the overlap region (107) of the fields of view (113, 114) that are acquired by the at least two cameras (101, 102).
12. Camera system according to Claim 11, characterized in that the cameras (101, 102) are mono cameras.
13. Camera system according to Claim 11 or 12, characterized in that the cameras differ by differing distortions.
14. Camera system according to any one of Claims 11 to 13, characterized in that the differing cameras (101, 102) are accommodated in a common housing (115).
15. Camera system according to any one of Claims 11 to 14, characterized in that the differing cameras (101, 102) have parallel or convergent or divergent optical axes (105, 106).
16. Method for determining the distance of an object, substantially as herein described with reference to and as shown in the accompanying drawings.
17. Camera system for determining the distance of an object, substantially as herein described with reference to and as shown in the accompanying drawings.
GB1610870.6A 2015-06-23 2016-06-22 Method and camera system for determining the distance of objects in relation to a vehicle Withdrawn GB2541101A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102015211574 2015-06-23
DE102016206493.2A DE102016206493A1 (en) 2015-06-23 2016-04-18 Method and camera system for determining the distance of objects to a vehicle

Publications (2)

Publication Number Publication Date
GB201610870D0 GB201610870D0 (en) 2016-08-03
GB2541101A true GB2541101A (en) 2017-02-08

Family

ID=56895305

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1610870.6A Withdrawn GB2541101A (en) 2015-06-23 2016-06-22 Method and camera system for determining the distance of objects in relation to a vehicle

Country Status (1)

Country Link
GB (1) GB2541101A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6847392B1 (en) * 1996-10-31 2005-01-25 Nec Corporation Three-dimensional structure estimation apparatus
JP2004364212A (en) * 2003-06-09 2004-12-24 Fujitsu Ltd Object photographing apparatus, object photographing method and object photographing program
US20120293633A1 (en) * 2010-02-02 2012-11-22 Hiroshi Yamato Stereo camera
US20110211068A1 (en) * 2010-03-01 2011-09-01 Soichiro Yokota Image pickup apparatus and rangefinder
US20130188022A1 (en) * 2012-01-23 2013-07-25 Microsoft Corporation 3d zoom imager
WO2014054753A1 (en) * 2012-10-04 2014-04-10 アルプス電気株式会社 Image processing device and device for monitoring area in front of vehicle

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)