WO2006052024A1 - Stereo camera - Google Patents
Stereo camera
- Publication number
- WO2006052024A1 (PCT/JP2005/021194)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image sensor
- lens
- image
- stage
- imaging
- Prior art date
Links
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R11/04—Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S11/00—Systems for determining distance or velocity not using reflection or reradiation
- G01S11/12—Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B11/00—Filters or other obturators specially adapted for photographic purposes
- G03B11/04—Hoods or caps for eliminating unwanted light from lenses, viewfinders or focusing aids
- G03B11/045—Lens hoods or shields
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/08—Stereoscopic photography by simultaneous recording
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R2011/0001—Arrangements for holding or mounting articles, not otherwise provided for characterised by position
- B60R2011/0003—Arrangements for holding or mounting articles, not otherwise provided for characterised by position inside the vehicle
Definitions
- The present invention relates to a stereo camera in which a plurality of image sensors are mounted.
- A stereo camera device calculates the distance to an object using a pair of images captured by two imaging means and recognizes the object accordingly. Such devices have begun to be applied to monitoring systems that detect intruders and abnormalities.
- The stereo image processing used in these systems obtains distance by applying triangulation to a pair of images captured at a distance from each other. A typical system has a pair of imaging means and a stereo image processing LSI that performs triangulation processing on the pair of captured images output by these imaging means. The stereo image processing LSI detects, from the pixel information contained in the pair of images, the pixel position of a feature point common to both images, and realizes the triangulation processing by obtaining the number of pixels by which the feature point is shifted between the two images.
- The above prior art preserves the positional relationship after mounting, on the premise that the pair of image sensors has been mounted with an accurate relative positional relationship.
- No particular contrivance has been made, however, for mounting the left and right image sensors accurately in their relative positions.
- Conventionally, the positional relationship between the lens and the image sensor is first secured, the image sensor is adjusted and fixed so that it is not displaced in the rotational direction relative to the surface on which the lens and image sensor are mounted, and this camera unit is then attached to a reference member. With this method, however, while there is no rotational shift with respect to a given surface within each camera unit, the tolerance of the joint between the camera unit and the reference member still remains.
- The present invention is characterized in that an image sensor installation surface for attaching the left and right image sensors is provided on a stage, and the left and right image sensors are positioned directly on the stage to form a stereo camera. This reduces the accumulation of tolerances between components and improves the mounting accuracy of the positional relationship between the image sensors.
- FIG. 1 is a schematic diagram of a stereo camera in one embodiment of the present invention.
- FIG. 2 is a system block diagram of a stereo camera in one embodiment of the present invention.
- FIG. 3 is a diagram showing attachment of an image sensor in one embodiment of the present invention.
- FIG. 4 is a diagram showing attachment of the image sensor in one embodiment of the present invention.
- FIG. 5 is a schematic diagram showing the stage projection structure in one embodiment of the present invention.
- FIG. 6 is a diagram showing an example of a specific structure for fixing the image sensor to the stage.
- FIG. 7 is a diagram showing an example of a specific structure for fixing the image sensor to the stage.
- FIG. 8 is a diagram showing an example of a specific structure for fixing the image sensor to the stage.
- FIG. 9 is a schematic view of a structure for attaching a lens holder to a stage in one embodiment of the present invention.
- FIG. 10 is a flowchart for calculating the distance to the object with the conventional configuration and a flowchart for calculating the distance to the object with the configuration of the present invention.
- FIG. 11 is a schematic diagram showing the structure of the stage projection when mounting an image sensor in each of the first and second embodiments of the present invention.
- FIG. 12 is a schematic diagram showing a method for positioning an image sensor in one embodiment of the present invention.
- FIG. 13 is a diagram showing the structure of a lens hood in one embodiment of the present invention.
- FIG. 14 is a diagram showing a lens hood structure in the prior art.
- FIG. 15 is a view showing the structure of a lens hood in another embodiment of the present invention.
- Fig. 16 is a schematic diagram showing the principle of a stereo camera.
- Fig. 17 is a schematic diagram showing the mounting position of a general stereo camera when it is attached to a vehicle.
- FIG. 18 is a schematic diagram for explaining the improvement of the processing speed of the image processing LSI when the image sensor according to the present invention is attached.
- FIG. 19 is an image diagram showing the left and right camera images when the left and right image sensors are in an ideal mounting state.
- FIG. 20 is an image diagram showing the left and right camera images when the mounting positions of the left and right image sensors are shifted in the left/right direction.
- FIG. 21 is an image diagram showing the left and right camera images when the mounting positions of the left and right image sensors are shifted in the vertical direction.
- FIG. 22 is an image diagram showing the left and right camera images when the mounting positions of the left and right image sensors are shifted in the rotational direction about an axis perpendicular to the plane on which the image sensors are installed.
- Figure 16 (a) shows the principle of the stereo camera device.
- δ represents the parallax, Z the measurement distance, f the focal length, and b the baseline length. The relationship shown in the following equation holds between them:

  Z = b × f / δ
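As a concrete illustration of this triangulation relation, the following sketch computes Z = b·f/δ. The baseline, focal length, and parallax values are illustrative only and are not taken from the patent.

```python
def disparity_to_distance(b_m, f_px, delta_px):
    """Distance Z from baseline b, focal length f, and parallax delta.

    Z = b * f / delta  (pinhole stereo model). With b in metres and
    f and delta both in pixels, Z comes out in metres.
    """
    if delta_px <= 0:
        raise ValueError("parallax must be positive")
    return b_m * f_px / delta_px

# Illustrative values: 0.35 m baseline, 800 px focal length,
# 10 px parallax -> a distance of about 28 m.
print(disparity_to_distance(0.35, 800, 10))
```

Note how the distance is inversely proportional to the parallax: halving δ doubles Z, which is why small parallax errors matter most at long range.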
- When the sensors are mounted ideally, a point at the height of the horizon is photographed at the same pixel position on both screens as shown in Fig. 19, and a nearer object is photographed shifted by the parallax. Therefore, as shown in Fig. 19 (c), the slight shift of the feature points is converted into distance, and the size of and distance to the three-dimensional object are calculated.
- If an image sensor is mounted shifted in the left/right direction, the feature point is imaged at a horizontal position different from the pixel position where it should originally appear, as shown in Fig. 20. Then, as shown in Fig. 20 (c), the parallax becomes larger than it should be, and a large three-dimensional object is erroneously recognized.
- If the image sensors are mounted shifted, the images are also shifted in the vertical direction, as shown in Fig. 21 and Fig. 22. In that case, it should suffice to search a single line for corresponding points, but multiple lines must be searched instead, which increases the processing time and the required memory capacity.
- Equation 6 briefly indicates how much the distance measurement accuracy deteriorates in such cases.
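The equation itself is not reproduced in this extract, but the standard first-order error propagation for Z = b·f/δ gives |ΔZ| ≈ (Z²/(b·f))·|Δδ|, i.e. the range error grows with the square of the distance. A sketch, with all numbers illustrative:

```python
def range_error(b_m, f_px, delta_px, delta_err_px):
    """First-order distance error caused by a parallax error.

    From Z = b * f / delta, dZ/ddelta = -b * f / delta**2, so
    |dZ| ~= (Z**2 / (b * f)) * |ddelta|.
    """
    z = b_m * f_px / delta_px
    return (z ** 2) / (b_m * f_px) * delta_err_px

# Illustrative: 0.35 m baseline, 800 px focal length. A 1 px parallax
# error at 10 px parallax (about 28 m) already costs about 2.8 m of range.
print(range_error(0.35, 800, 10, 1.0))
```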
- CCD (Charge Coupled Device)
- the positional relationship between the image sensors 12a and 12b is important for performing stereo processing with high accuracy and high efficiency.
- Next, the configuration of the stereo camera of the present invention, which can improve the mounting accuracy of the positional relationship between the image sensors 12a and 12b simply and inexpensively, will be described.
- FIG. 1 shows an example of the structure of a stereo camera according to the present invention.
- Fig. 1 (a) is an overhead view seen from the front
- Fig. 1 (b) is an overhead view seen from the rear.
- The stereo camera of this embodiment comprises two image sensors 12a and 12b (not shown); a stage 11 integrating the stage projections 18a and 18b, which are the parts on which the sensors are mounted; two lenses 13a and 13b; lens holders 19a and 19b for holding the positional relationship between each lens and its image sensor; image sensor substrates 16a and 16b that capture the image information formed on the image sensors 12a and 12b into the image processing LSI; and a processing board 1a9 on which the image processing LSI that performs stereo processing using the captured images is mounted.
- Lenses 13a and 13b image visual information from the outside world onto image sensors 12a and 12b.
- The stage 11 is a member for fixing the camera to the vehicle side at the stage reference plane 14 and for holding the image sensors 12, the image sensor substrates 16, and the processing board 1a9.
- The image sensors 12a and 12b are mounted directly on the stage projections 18a and 18b, and the image sensor substrates 16a and 16b are mounted from the rear of the stage projections 18a and 18b.
- The stage projections 18a and 18b are provided with image sensor terminal holes 17 so that the terminals of the image sensors 12a and 12b can be joined to the image sensor substrates 16a and 16b.
- The stage projections 18a and 18b are provided with image sensor installation surfaces 44a and 44b that serve as reference surfaces for mounting the image sensors 12a and 12b.
- Thus the image sensors 12a and 12b can be installed on a predetermined plane.
- That is, the image sensors 12a and 12b are arranged on substantially the same plane or on parallel planes.
- The stage projections 18a and 18b are also provided with a reference surface 41 that serves as the reference for the mounting angle of the image sensors 12a and 12b, and the sensors are mounted according to this reference surface 41.
- As a result, the imaging angle shift between the image sensors 12a and 12b can be made substantially negligible.
- These configurations reduce the number of joints that include tolerances by one compared with the conventional technique, in which two cameras are assembled separately and then mounted on the stage, so the mounting accuracy can be improved. That is, in the prior art there are two tolerances, [image sensor - camera body] and [camera body - stage], whereas with the configuration of this embodiment only the [image sensor - stage] tolerance at the stage projections 18a and 18b remains. Details of the structure of the stage projections 18a and 18b will be described later.
- Holes 43 are opened in the image sensor substrate 16, as shown in (d), so that the terminals 32 of the image sensor 12 can be inserted into the substrate.
- The image sensor substrate 16 is bonded to the image sensor 12 and carries the circuit that transmits the image information captured by the image sensor 12 to the processing board 1a9. The left and right image sensor substrates 16a and 16b capture the image information formed on the image sensors 12a and 12b into the image processing LSI 1a7 provided on the processing board 1a9.
- The processing board 1a9 is a board on which a circuit is provided that uses the image processing LSI to extract an object from the image information sent from the substrates 16a and 16b and to calculate the distance to and size of the object.
- The processing board 1a9 is fixed to the stage 11 by a plurality of screws 1b1, and the image sensor substrates 16 at the left and right ends are fixed to the stage 11 by a plurality of screws 1a3.
- the left and right imaging element substrates 16a, 16b and the processing substrate 1a9 are connected by a harness 1b0.
- The processing board 1a9 carries an image processing LSI 1a7 that performs stereo processing using the captured images and calculates the distance to and size of the object.
- It is also provided with an arithmetic processing unit 1a6 that runs applications (for example, vehicle detection or pedestrian detection) and an interface microcomputer 1a8 for exchanging information with the outside of the stereo camera.
- The interface microcomputer 1a8 also has a function of monitoring the arithmetic processing unit 1a6.
- The processing board 1a9 carries the arithmetic processing unit 1a6, the image processing LSI, the interface microcomputer 1a8, and a power connector 1a4 that supplies power to these and the other microcomputers and ICs of the stereo camera.
- A video connector 1a5 for outputting the stereo-processed image and the like to the outside is also provided.
- A cover 1a1 that protects the processing board 1a9 and the image sensor substrates 16a and 16b is attached to the back side of the stage 11 by screws 1a2.
- Figure 2 shows the system block diagram.
- The images captured by the left and right image sensors 12a and 12b are taken in by the image sensor boards 16a and 16b and sent through the harness 1b0 to the processing board 1a9.
- As preprocessing for the parallax-based stereo matching, the image processing LSI 1a7 executes shading processing on the video data to correct the brightness differences within the image plane caused by the gain characteristics of the image sensors 12a and 12b and by the peripheral light attenuation of the lenses 13a and 13b, and stores the result in SDRAM1.
- Stereo matching processing, including affine transformation, is then performed using the data in SDRAM1, and the result is stored in SDRAM2, a memory shared with the arithmetic processing unit 1a6.
- The arithmetic processing unit 1a6 uses the stereo processing data stored in SDRAM2 to perform various three-dimensional object detection processes, such as for vehicles and pedestrians, and the final calculation results are output to the outside through the interface microcomputer 1a8.
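The internal algorithm of the image processing LSI 1a7 is not detailed in this extract. Purely as an illustration of the kind of correspondence search stereo matching involves, the following is a minimal sum-of-absolute-differences (SAD) sketch, assuming the two sensors are aligned so the search stays on a single scan line (all function names and parameters are hypothetical, not the patent's actual algorithm):

```python
def sad_disparity(left, right, x, y, window=2, max_disp=10):
    """Disparity at pixel (x, y) by minimising the sum of absolute
    differences (SAD) between a small left-image patch and candidate
    right-image patches shifted along the same scan line."""
    best_d, best_cost = 0, None
    for d in range(max_disp + 1):
        if x - window - d < 0:  # candidate patch would leave the image
            break
        cost = 0
        for dy in range(-window, window + 1):
            for dx in range(-window, window + 1):
                cost += abs(left[y + dy][x + dx] - right[y + dy][x + dx - d])
        if best_cost is None or cost < best_cost:
            best_cost, best_d = cost, d
    return best_d

# Synthetic test scene: the right image is the left image shifted by
# 4 pixels, so the recovered disparity should be 4.
W, H, d_true = 40, 12, 4
left = [[(7 * x + 13 * y) % 31 for x in range(W)] for y in range(H)]
right = [[left[y][min(x + d_true, W - 1)] for x in range(W)] for y in range(H)]
print(sad_disparity(left, right, x=20, y=6))  # -> 4
```

If the sensors were vertically misaligned, the inner loop would also have to scan several values of dy, multiplying the cost; that is the processing-time penalty the direct-mounting structure is meant to avoid.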
- The video is output from the video output connector 1a5 via the video output IC.
- the power from the vehicle is sent to the power supply circuit via the power connector 1a4, converted to the required voltage, and then supplied to each microcomputer and each IC.
- Fig. 3 (a) is a side view of CCD
- Fig. 3 (b) is a front view
- Fig. 3 (c) is an overhead view.
- The CCD has an imaging surface 33 installed in a recess 39 of the package 35; wires 37 and metal pads 36 that distribute the image signal received by the imaging surface 33 to the terminals; terminals 32 connected to the metal pads 36; and a cover glass 34 that protects the imaging surface.
- The image is formed on the imaging surface 33 by the lens 13, and the image information thus formed is transmitted to the image sensor substrate 16 via the wires 37, pads 36, and terminals 32.
- The optical axis of the lens 13 and the imaging surface 33 must be perpendicular to each other. If they are not, part of the image will be blurred and stereo matching will be impossible. In addition, when performing the stereo matching process, any deviation other than parallax between the image acquired by the right image sensor and the image acquired by the left image sensor is undesirable. For this reason, the height direction, rotation direction, and focus of the left and right imaging surfaces must be matched.
- The image sensor installation surfaces 44a and 44b are provided on the stage projections 18a and 18b of the stage 11 so that they lie on substantially the same plane 90 (Fig. 4 (a)) or on parallel planes. Normally, the imaging surface 33 and the package bottom surface 38 are manufactured so that their parallelism stays within a certain standard. For this reason, as shown in Fig. 4 (a), the imaging surfaces 33 of the left and right image sensors 12a and 12b installed on the installation surfaces of the stage 11 are kept well in parallel. The degree of focus can thus be maintained and a focused image free of blur can be obtained, so the detection accuracy of the stereo camera is improved.
- The image sensor 12 is provided with three reference surfaces 31 on the side surfaces of the package 35, and the stage projections 18 are provided with image sensor positioning surfaces 41 that come into contact with these reference surfaces 31.
- The image sensor positioning surface 41 can be formed as an integral structure as shown in Fig. 5 (b); alternatively, as shown in Fig. 5 (c), image sensor positioning pins 42 may be set up on the stage projection 18 and the heads of the pins 42 used as the image sensor positioning surface 41.
- The shape and number of the image sensor positioning surfaces 41 depend on the reference surfaces 31 of the image sensor 12; there may be, for example, four or more.
- Figure 5 (a) will be described later.
- FIG. 6 is an external view of the fixed structure.
- An image sensor installation surface 44a on which the image sensor is installed is formed on the stage projection 18a provided at the end of the stage 11, and an image sensor positioning surface 41 is formed at the position that faces the image sensor reference surface 31 when the image sensor 12a is installed.
- The image sensor 12a is placed on the image sensor installation surface 44a, the image sensor pressing plate 182 is placed on top of it, and the fixing screws 183 are tightened.
- The image sensor pressing plate 182 has screw holes 185 through which the fixing screws 183 pass; the fixing screws 183 are screwed into the screw holes 181 through the screw holes 185.
- Fig. 8 shows a top view, a front view, and a side view of the state in which the image sensor pressing plate 182 is attached to the stage projection 18.
- The image sensor pressing plate 182 is provided with claws 184, 186, and 187.
- When the image sensor pressing plate 182 is fixed to the stage projection 18, the claw 184 is deformed by being pushed and bent by the image sensor 12a as shown in Fig. 7, and generates an urging force that presses the sensor against the image sensor installation surface 44.
- The claw 184 may initially lie in the same plane as the pressing plate 182, as shown in Fig. 7 (a), or may be bent in advance.
- Similarly, the claws 186 and 187 are deformed by being pushed and bent by the image sensor 12a as shown in Fig. 8, generating an urging force that presses the image sensor 12a against the image sensor positioning surfaces 41.
- The claws 186 and 187 are inclined in advance, from the direction perpendicular to the image sensor pressing plate 182, toward the side that contacts the image sensor 12a. As a result, with the image sensor pressing plate 182 fixed to the stage projection 18a, the claws 186 and 187 are bent away from the image sensor pressing plate 182 as shown in Fig. 8.
- In this way the image sensor 12a is held in the horizontal direction (X-axis direction) by the claws 186, in the vertical direction (Y-axis direction) by the claws 187, and in the depth direction of the imaging surface (Z-axis direction) by the claw 184.
- The image sensor 12a is thus pressed and fixed against the image sensor installation surface 44a and the image sensor positioning surfaces 41a, producing an image free from one-sided blur, left/right optical axis misalignment, and rotational misalignment.
- It is desirable to provide three or more image sensor positioning surfaces 41a for each image sensor.
- In this embodiment the image sensor positioning surfaces 41a comprise two points in the horizontal direction and one in the vertical direction, matched by two horizontal claws 186 and one vertical claw 187; alternatively, the positioning surfaces 41a may comprise one horizontal point and two vertical points, with one horizontal claw 186 and two vertical claws 187.
- In this embodiment, the image sensor positioning surfaces 41 are located on the upper side when the stereo camera is attached to the vehicle, and the sensor is pressed against them from below by the claws of the image sensor pressing plate 182.
- Conversely, the image sensor positioning surfaces 41 may be located on the lower side when the stereo camera is attached to the vehicle, with the sensor pressed against them from above by the claws.
- In that case gravity acts in the direction that presses the image sensor 12 against the image sensor positioning surfaces 41a, which improves resistance to vibration and the like.
- It is also possible to press the image sensor 12a against the image sensor installation surface 44a and the image sensor positioning surfaces 41a with the image sensor pressing plate 182 and fix it in place with screws.
- The image sensor terminals 32 are longer than the thickness of the image sensor installation surface 44 of the stage projection 18a.
- The image sensor terminals 32 therefore pass through the terminal holes 17 provided in the stage projection 18a and reach the holes 43 of the image sensor substrate 16a.
- The image sensor terminals 32 and the image sensor substrate 16a are then soldered to complete the electrical connection.
- The image sensor substrate 16a, on which the circuit for driving the image sensor is mounted, is fixed from the rear to the stage projection 18a, which is integral with the stage 11.
- The positions of the lenses are adjusted together with the lens holders 19a and 19b so that the centers of the imaging surfaces 33a and 33b of the left and right image sensors 12a and 12b are aligned with the centers of the lenses 13a and 13b, and then fixed.
- Alternatively, lens holder positioning pins 71 may be provided in advance on the stage projections 18a, 18b or on the lens holders 19a, 19b, and the lens holders 19a and 19b attached and fixed in alignment with the pins.
- Alternatively, the lens holders 19a and 19b may be fixed to the stage projections in advance and the lenses 13a and 13b themselves positioned relative to the image sensors; the lenses 13a and 13b are then fixed to the lens holders 19a and 19b with an adhesive or the like.
- The recess 72 of the lens holder 19 into which the lens 13 is installed has a larger diameter than the lens 13, and the lens is fixed with an adhesive after the positional relationship between the lens 13 and the image sensor 12 has been adjusted.
- In this way the positional relationship between the two image sensors 12a and 12b is determined and the optical axes are fixed in the appropriate directions.
- the positional relationship between the image sensors can be determined with high accuracy and simplicity by mounting the image sensors directly on the stage.
- the stereo image is processed after correcting the shift of the image captured from the left and right image sensors, and a distance image to the object is generated.
- With the configuration of the present invention, it is also possible to generate a distance image by stereo processing the left and right images as they are, without correcting the deviation, as shown in Fig. 10 (b). For example, if there is a rotational shift between the mounted image sensors, a feature point on the image obtained by one image sensor must be searched for, on the image captured by the other image sensor, along the search direction 104 as shown in Fig. 18. In this case, for example, with an image 512 pixels wide, a pixel shift of up to ±30 pixels must be covered in the search directions 102 and 104.
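The growth in search effort described above can be quantified with a generic back-of-the-envelope estimate (the formula and the 30-pixel figure used below are illustrative, not the patent's numbers):

```python
def search_candidates(max_disp, vertical_slack_px):
    """Correspondence candidates examined per reference pixel.

    With ideally aligned sensors the epipolar search covers a single
    scan line (vertical_slack_px = 0). Each pixel of possible vertical
    offset adds two more lines (one above, one below) to the window.
    """
    lines = 2 * vertical_slack_px + 1
    return (max_disp + 1) * lines

aligned = search_candidates(30, 0)      # single line: 31 candidates
misaligned = search_candidates(30, 30)  # +/-30 px slack: 31 * 61 = 1891
print(aligned, misaligned)
```

The misaligned case examines roughly sixty times as many candidates per pixel, which is why mounting accuracy translates directly into processing time and memory.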
- In the present invention, the image sensors themselves are mounted directly on one member (the stage) after their positional relationship in the vertical, horizontal, and rotational directions has been accurately adjusted. This reduces the tolerances included in the positional relationship between the image sensors compared with the conventional technique, and there is no need to correct the positional relationship between the image sensors after mounting two conventional camera units on a stage. As a result, the processing time can be shortened and the accuracy of distance calculation by stereo processing can be improved.
- A second embodiment, in which the image sensor 12 is mounted on the stage 11 with higher accuracy than in the first embodiment, will be described with reference to Fig. 11.
- In reality, the imaging surface 33 of the image sensor has a slight shift in the rotational direction, and the package 35 of the image sensor and the imaging surface are not in an exactly parallel positional relationship; strictly speaking, deviations in the horizontal, vertical, and rotational directions occur.
- Such mounting deviations of the imaging surface 33 are defined as tolerances with respect to the reference surfaces 31 of the package.
- For example, if the tolerance in the rotational direction is ±1 degree, the rotational angle deviation arising between the two image sensors is at most 2 degrees.
- Likewise, if the vertical and horizontal tolerances are ±200 μm, the maximum vertical and horizontal displacement between the two image sensors is 400 μm.
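To relate such an angular tolerance to pixels: for a rotation θ about the optical axis, a pixel at half the image width w/2 from the centre shifts vertically by roughly (w/2)·tan θ. The 512-pixel width below is borrowed from the search example elsewhere in this document; the calculation itself is generic:

```python
import math

def edge_shift_px(rotation_deg, half_width_px):
    """Approximate vertical pixel shift at the image edge caused by
    rotating one sensor about the optical axis."""
    return half_width_px * math.tan(math.radians(rotation_deg))

# A 2-degree relative rotation on a 512-pixel-wide image moves the
# edge pixels vertically by roughly 9 pixels.
print(edge_shift_px(2.0, 256))
```

A shift of several pixels at the image edge is far larger than the sub-pixel accuracy stereo matching needs, which motivates the adjustment procedure described next.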
- Such deviations of the imaging surfaces therefore remain even when the packages of the image sensors 12a and 12b are mounted precisely on the stage as shown in Fig. 11 (a).
- the structure shown in Fig. 11 (c) can be considered as a structure for correcting these deviations.
- A television camera 81 for photographing the imaging surface 33 of the image sensor 12 itself is prepared.
- The TV camera 81 photographs the outline of the imaging surface 33 of the image sensor 12 and the stage projection reference surface 15 at the same time, and adjustment is made so that their angles match.
- The stage projection reference surface 15 is a mark near the image sensor installation surface 44 that can be recognized by the TV camera 81; steps or lines that can easily be extracted as edges in the image data are used.
- In this embodiment, a step with respect to the image sensor installation surface 44 provided on the stage projection 18 is used as the stage projection reference surface 15. Adjustment is carried out with a robot arm that moves the position of the image sensor 12, or with a micrometer for manual adjustment by a human.
- The surface used as the reference when adjusting the position of the imaging surface 33 may also be the stage reference surface 14.
- By observing the positions of the imaging surfaces 33 and adjusting them to each other in this way, rather than mounting with reference to the reference surfaces 31 provided on the package of the image sensor 12 as shown in Fig. 11 (b), the positional relationship between the image sensors 12a and 12b can be established with higher accuracy than with that configuration (Fig. 11 (b)).
- Three or more positioning surfaces 41 may be provided, but as shown in Fig. 5 (a), the positioning surfaces 41 may also be omitted: only the depth direction (Z direction) is positioned using the image sensor installation surface 44, and the X and Y directions are adjusted using the above-mentioned robot arm or micrometer. An adhesive is used to fix the image sensor 12 to the stage projection 18.
- Alternatively, two positioning surfaces 41 may be provided; rough position adjustment is performed by pressing against these two positioning surfaces 41, and fine adjustment is made using the camera and the robot arm or micrometer. When rough positioning is performed against the positioning surfaces 41 and the robot arm is used only for fine adjustment in this way, the structure is better suited to automatic manufacturing by a machine tool.
- Next, the lenses 13a and 13b are attached to the lens holders 19a and 19b, respectively.
- Here, the optical axis means the virtual straight line created by the positional relationship between the lens 13 fixed to the lens holder 19 and the image sensor 12 fixed to the stage 11.
- When capturing the situation outside the vehicle with an in-vehicle camera, a lens hood is effective for preventing saturation of the image sensor 12 by direct sunlight or strong light from unnecessary directions.
- Here, a configuration is shown in which the positional relationship between the lenses 13a, 13b and the image sensors 12a, 12b does not change even when the lens hood receives an impact.
- The configuration is described for one image sensor 12b and lens 13b; the image sensor 12a and lens 13a on the opposite side have the same configuration.
- As shown in Fig. 13, by forming the lens hood 151b into a bag shape that wraps around the lens holder 19b, the lens hood 151b is fixed directly to the stage 11 rather than to the lens holder 19b.
- Fig. 13 (a) is an external view of the lens hood and Fig. 13 (b) is a sectional view.
- The bag portion of the lens hood 151b has a clearance (gap) 92 so that it does not come into contact with the lens holder 19b; even if some impact is applied to the lens hood 151b, this structure prevents the lens holder 19b from being displaced.
- Even if the position of the lens hood 151b relative to the stage 11 changes slightly due to an impact, the position of the lens holder 19b does not change, and the high-precision positional relationship between the lens 13b and the image sensor 12b that is indispensable for stereo processing is maintained as normal.
- The lens hood 151a on the opposite side has the same configuration and has the same effect.
- disturbance light entering through the gap of the lens holder can be blocked more reliably, so that a clearer image can be captured.
- a lens hood 171b having a structure with a further expanded cover area may be used.
- the lens hood is fixed to the stage using screws.
- Fig. 15 (a) is an external view, and Fig. 15 (b) is a sectional view. In this way, the entire lens holder is enclosed from the upper surface of the stage 11, so that no screw hole for fixing the lens hood 171b is needed on the stage projection 18b, and the degree of freedom in designing the size, shape, and mounting structure of the stage projection 18b and the lens holder is increased.
- the clearance (gap) 93 between the lens holder 19b and the lens hood 171b can be made wider than in the embodiment shown in Fig. 13, so that the positional relationship is better protected from changing due to an impact. Also, behind the mounting surface of the lens holder 19b and the stage projection 18b
- the lens hood 171a on the opposite side has the same configuration and the same effect.
Industrial Applicability
- since one joint portion, with its associated tolerance, can be eliminated, the productivity of the stereo camera is improved.
- since correction processing for position and angle deviation between the image sensors can be partly omitted, the processing speed and measurement accuracy of the stereo camera can be improved with a simple configuration.
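Why mechanical alignment allows the correction (rectification) step to be partly skipped can be sketched as follows: if the two image sensors are held so that their rows coincide, the correspondence search reduces to a one-dimensional scan along the same scanline. A minimal row-wise SAD (sum of absolute differences) matcher, written as an illustration of the principle rather than as the patent's actual processing:

```python
from typing import List

def row_disparity(left: List[int], right: List[int],
                  x: int, win: int, max_d: int) -> int:
    """Find the disparity at column x of a single (already aligned) scanline
    by minimizing the sum of absolute differences over a small window.
    Assumes the sensors are mechanically aligned, so no rectification step."""
    best_d, best_cost = 0, float("inf")
    for d in range(0, max_d + 1):
        if x - d - win < 0 or x + win >= len(left):
            continue  # window would fall off the image; skip this candidate
        cost = sum(abs(left[x + k] - right[x - d + k])
                   for k in range(-win, win + 1))
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d

# Synthetic scanlines: the same bright feature appears shifted by 3 pixels
# between the left and right views (disparity 3).
left = [0, 0, 10, 50, 90, 50, 10, 0, 0, 0, 0, 0]
right = left[3:] + [0, 0, 0]
print(row_disparity(left, right, x=4, win=1, max_d=5))  # prints 3
```

If the sensors were vertically misaligned by even one row, the matcher would have to search a 2-D neighborhood (or the images would first need software rectification), which is exactly the processing cost the mechanically precise mounting avoids.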
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Electromagnetism (AREA)
- Mechanical Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Measurement Of Optical Distance (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Stereoscopic And Panoramic Photography (AREA)
- Studio Devices (AREA)
- Camera Bodies And Camera Details Or Accessories (AREA)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP05806640A EP1816514B1 (en) | 2004-11-15 | 2005-11-14 | Stereo camera |
JP2006545151A JP4691508B2 (ja) | 2004-11-15 | 2005-11-14 | ステレオカメラ |
US11/666,340 US20080001727A1 (en) | 2004-11-15 | 2005-11-14 | Stereo Camera |
US14/161,360 US9456199B2 (en) | 2004-11-15 | 2014-01-22 | Stereo camera |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004329940 | 2004-11-15 | ||
JP2004-329940 | 2004-11-15 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/666,340 A-371-Of-International US20080001727A1 (en) | 2004-11-15 | 2005-11-14 | Stereo Camera |
US14/161,360 Continuation US9456199B2 (en) | 2004-11-15 | 2014-01-22 | Stereo camera |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006052024A1 true WO2006052024A1 (ja) | 2006-05-18 |
Family
ID=36336678
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/021194 WO2006052024A1 (ja) | 2004-11-15 | 2005-11-14 | ステレオカメラ |
Country Status (4)
Country | Link |
---|---|
US (2) | US20080001727A1 (ja) |
EP (2) | EP1816514B1 (ja) |
JP (2) | JP4691508B2 (ja) |
WO (1) | WO2006052024A1 (ja) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008012173A1 (de) * | 2006-07-26 | 2008-01-31 | Robert Bosch Gmbh | Optische messeinrichtung mit zwei kameraeinheiten |
WO2009057436A1 (ja) * | 2007-11-01 | 2009-05-07 | Konica Minolta Holdings, Inc. | 撮像装置 |
JP2009265412A (ja) * | 2008-04-25 | 2009-11-12 | Fuji Heavy Ind Ltd | ステレオカメラユニット |
JP2010195192A (ja) * | 2009-02-25 | 2010-09-09 | Alps Electric Co Ltd | 回転コネクタの取付構造 |
JP2012085290A (ja) * | 2010-10-08 | 2012-04-26 | Lg Innotek Co Ltd | 3次元撮像装置及びその製造方法 |
WO2012091151A1 (en) * | 2010-12-28 | 2012-07-05 | Ricoh Company, Ltd. | Ranging apparatus |
WO2014015867A1 (de) * | 2012-07-27 | 2014-01-30 | Conti Temic Microelectronic Gmbh | Verfahren zur ausrichtung zweier bildaufnahmeelemente eines stereokamerasystems |
JP2015198224A (ja) * | 2014-04-03 | 2015-11-09 | 日立オートモティブシステムズ株式会社 | 基板の保持構造 |
WO2016103928A1 (ja) * | 2014-12-24 | 2016-06-30 | 株式会社東海理化電機製作所 | 撮像装置 |
JP2016177257A (ja) * | 2015-03-18 | 2016-10-06 | 株式会社リコー | 撮像ユニット、車両制御ユニットおよび撮像ユニットの伝熱方法 |
JP2017191265A (ja) * | 2016-04-15 | 2017-10-19 | 日立オートモティブシステムズ株式会社 | 多眼光学装置 |
JP2020094858A (ja) * | 2018-12-11 | 2020-06-18 | 日立オートモティブシステムズ株式会社 | ステレオカメラ装置 |
Families Citing this family (53)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4527140B2 (ja) * | 2007-08-07 | 2010-08-18 | 日立オートモティブシステムズ株式会社 | 車載カメラ装置 |
JP2011221506A (ja) * | 2010-03-26 | 2011-11-04 | Panasonic Corp | 撮像装置 |
US9485495B2 (en) | 2010-08-09 | 2016-11-01 | Qualcomm Incorporated | Autofocus for stereo images |
US9438889B2 (en) | 2011-09-21 | 2016-09-06 | Qualcomm Incorporated | System and method for improving methods of manufacturing stereoscopic image sensors |
JP5793122B2 (ja) * | 2012-07-31 | 2015-10-14 | 日立オートモティブシステムズ株式会社 | 車載画像処理装置 |
JP5961506B2 (ja) * | 2012-09-27 | 2016-08-02 | 日立オートモティブシステムズ株式会社 | ステレオカメラ装置 |
US9398264B2 (en) | 2012-10-19 | 2016-07-19 | Qualcomm Incorporated | Multi-camera system using folded optics |
EP2972478B1 (en) * | 2013-03-15 | 2020-12-16 | Uatc, Llc | Methods, systems, and apparatus for multi-sensory stereo vision for robotics |
DE102013102820A1 (de) | 2013-03-19 | 2014-09-25 | Conti Temic Microelectronic Gmbh | Stereokameramodul sowie Verfahren zur Herstellung |
JP6114617B2 (ja) * | 2013-04-09 | 2017-04-12 | 日立オートモティブシステムズ株式会社 | 車載ステレオカメラ |
JP6097644B2 (ja) * | 2013-06-19 | 2017-03-15 | 株式会社フジクラ | 撮像モジュール、測距モジュール、絶縁チューブ付き撮像モジュール、レンズ付き撮像モジュール、および内視鏡 |
CN106458103A (zh) | 2013-07-31 | 2017-02-22 | 感知驾驶员技术有限责任公司 | 车用便携式平视显示器 |
EP3035099B1 (en) * | 2013-08-14 | 2020-01-22 | Hitachi Automotive Systems, Ltd. | Imaging module, stereo camera for vehicle, and light shielding member for imaging module |
US10178373B2 (en) | 2013-08-16 | 2019-01-08 | Qualcomm Incorporated | Stereo yaw correction using autofocus feedback |
US9965856B2 (en) | 2013-10-22 | 2018-05-08 | Seegrid Corporation | Ranging cameras using a common substrate |
US10247944B2 (en) | 2013-12-20 | 2019-04-02 | Sensedriver Technologies, Llc | Method and apparatus for in-vehicular communications |
US9383550B2 (en) | 2014-04-04 | 2016-07-05 | Qualcomm Incorporated | Auto-focus in low-profile folded optics multi-camera system |
US9374516B2 (en) | 2014-04-04 | 2016-06-21 | Qualcomm Incorporated | Auto-focus in low-profile folded optics multi-camera system |
EP2942939A1 (en) * | 2014-05-07 | 2015-11-11 | Autoliv Development AB | Imaging system for a motor vehicle and method of mounting an imaging system |
DE102014208487A1 (de) * | 2014-05-07 | 2015-11-12 | Conti Temic Microelectronic Gmbh | Kamera eines Assistenzsystems eines Kraftfahrzeugs sowie Verfahren zur Herstellung eines derartigen Assistenzsystems |
US10013764B2 (en) | 2014-06-19 | 2018-07-03 | Qualcomm Incorporated | Local adaptive histogram equalization |
US9819863B2 (en) | 2014-06-20 | 2017-11-14 | Qualcomm Incorporated | Wide field of view array camera for hemispheric and spherical imaging |
US9549107B2 (en) | 2014-06-20 | 2017-01-17 | Qualcomm Incorporated | Autofocus for folded optic array cameras |
US9541740B2 (en) | 2014-06-20 | 2017-01-10 | Qualcomm Incorporated | Folded optic array camera using refractive prisms |
US9386222B2 (en) | 2014-06-20 | 2016-07-05 | Qualcomm Incorporated | Multi-camera system using folded optics free from parallax artifacts |
US9294672B2 (en) | 2014-06-20 | 2016-03-22 | Qualcomm Incorporated | Multi-camera system using folded optics free from parallax and tilt artifacts |
US9832381B2 (en) | 2014-10-31 | 2017-11-28 | Qualcomm Incorporated | Optical image stabilization for thin cameras |
WO2016123248A1 (en) | 2015-01-27 | 2016-08-04 | Sensedriver Technologies, Llc | An image projection medium and display projection system using same |
TWI558200B (zh) * | 2015-02-26 | 2016-11-11 | 晶睿通訊股份有限公司 | 攝像模組及攝影裝置 |
KR102301832B1 (ko) * | 2015-02-26 | 2021-09-13 | 엘지이노텍 주식회사 | 스테레오 카메라 |
US10412274B2 (en) | 2015-03-18 | 2019-09-10 | Ricoh Company, Ltd. | Imaging unit, vehicle control unit and heat transfer method for imaging unit |
JP6158258B2 (ja) * | 2015-08-07 | 2017-07-05 | 日立オートモティブシステムズ株式会社 | 車載画像処理装置 |
JP6831792B2 (ja) | 2015-11-16 | 2021-02-17 | ソニーセミコンダクタソリューションズ株式会社 | 撮像装置、および、撮像システム |
KR102130112B1 (ko) * | 2015-12-09 | 2020-07-03 | 타이탄 메디칼 아이엔씨. | 입체 영상화 센서 장치 및 입체 영상화에 사용되는 영상 센서의 쌍들을 제작하는 방법 |
US10338225B2 (en) | 2015-12-15 | 2019-07-02 | Uber Technologies, Inc. | Dynamic LIDAR sensor controller |
US10548683B2 (en) | 2016-02-18 | 2020-02-04 | Kic Ventures, Llc | Surgical procedure handheld electronic display device and method of using same |
US10281923B2 (en) | 2016-03-03 | 2019-05-07 | Uber Technologies, Inc. | Planar-beam, light detection and ranging system |
US10077007B2 (en) | 2016-03-14 | 2018-09-18 | Uber Technologies, Inc. | Sidepod stereo camera system for an autonomous vehicle |
US9952317B2 (en) | 2016-05-27 | 2018-04-24 | Uber Technologies, Inc. | Vehicle sensor calibration system |
WO2018094045A2 (en) * | 2016-11-16 | 2018-05-24 | World View Holdings, Llc | Multi-camera scene representation including stereo video for vr display |
US10479376B2 (en) | 2017-03-23 | 2019-11-19 | Uatc, Llc | Dynamic sensor selection for self-driving vehicles |
JP6390763B1 (ja) * | 2017-06-28 | 2018-09-19 | Smk株式会社 | カメラモジュール及びカメラモジュールの製造方法 |
USD929487S1 (en) * | 2017-07-10 | 2021-08-31 | Tusimple, Inc. | Holder for multiple cameras |
US10775488B2 (en) | 2017-08-17 | 2020-09-15 | Uatc, Llc | Calibration for an autonomous vehicle LIDAR module |
US10746858B2 (en) | 2017-08-17 | 2020-08-18 | Uatc, Llc | Calibration for an autonomous vehicle LIDAR module |
JP6939353B2 (ja) | 2017-09-29 | 2021-09-22 | 株式会社デンソー | 車載カメラ装置 |
JP7020844B2 (ja) | 2017-09-29 | 2022-02-16 | 株式会社デンソー | 車載カメラ装置 |
US11474254B2 (en) | 2017-11-07 | 2022-10-18 | Piaggio Fast Forward Inc. | Multi-axes scanning system from single-axis scanner |
US10967862B2 (en) | 2017-11-07 | 2021-04-06 | Uatc, Llc | Road anomaly detection for autonomous vehicle |
CN107830818A (zh) * | 2017-12-14 | 2018-03-23 | 苏州西博三维科技有限公司 | 三维形貌测量仪 |
CN108036741A (zh) * | 2017-12-14 | 2018-05-15 | 苏州西博三维科技有限公司 | 三维形貌测量仪 |
CN108050954A (zh) * | 2017-12-14 | 2018-05-18 | 苏州西博三维科技有限公司 | 三维形貌测量仪 |
US10914820B2 (en) | 2018-01-31 | 2021-02-09 | Uatc, Llc | Sensor assembly for vehicles |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4879596A (en) | 1988-01-08 | 1989-11-07 | Kabushiki Kaisha Toshiba | Stereoscopic camera apparatus which incorporates cameras and can commonly adjust directions of the cameras |
JPH08113082A (ja) * | 1994-10-18 | 1996-05-07 | Hino Motors Ltd | 車両用ビデオカメラの汚れ防止装置 |
US5778268A (en) | 1996-07-12 | 1998-07-07 | Inaba; Minoru | Stereo camera |
JPH11301365A (ja) | 1998-04-17 | 1999-11-02 | Kyocera Corp | 車載用ステレオカメラの支持装置 |
EP1087257A2 (en) | 1999-09-22 | 2001-03-28 | Fuji Jukogyo Kabushiki Kaisha | Mounting stereo camera system of a vehicle |
JP2001088611A (ja) * | 1999-09-22 | 2001-04-03 | Fuji Heavy Ind Ltd | 車載カメラ |
JP2001242521A (ja) * | 2000-02-28 | 2001-09-07 | Fuji Heavy Ind Ltd | カメラの組み立て構造、カメラの調整方法、および調整用治具 |
WO2003005455A1 (de) * | 2001-06-29 | 2003-01-16 | Siemens Aktiengesellschaft | Bilderzeugungsvorrichtung und verfahren zur herstellung einer bilderzeugungsvorrichtung |
JP2003084357A (ja) * | 2001-09-14 | 2003-03-19 | Kyocera Corp | 大型ステレオカメラの取付構造 |
JP2003312376A (ja) * | 2002-04-22 | 2003-11-06 | Fujitsu Ten Ltd | 車載用周辺監視センサの設置調整方法 |
JP2003335180A (ja) | 2002-05-17 | 2003-11-25 | Fuji Heavy Ind Ltd | センサの支持構造およびその組付方法 |
JP2005022496A (ja) * | 2003-07-01 | 2005-01-27 | Auto Network Gijutsu Kenkyusho:Kk | 監視部品取付部構造 |
JP2006033282A (ja) * | 2004-07-14 | 2006-02-02 | Olympus Corp | 画像生成装置およびその方法 |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7663502B2 (en) * | 1992-05-05 | 2010-02-16 | Intelligent Technologies International, Inc. | Asset system control arrangement and method |
IT8253775V0 (it) | 1982-10-11 | 1982-10-11 | Sacef Dei F Lli Avignone S N C | Erpice combinato a denti vibranti e rulli sminuzzatori |
US4669843A (en) * | 1984-12-21 | 1987-06-02 | Agip, S.P.A. | Rotatable heliborne beam for supporting metric photo-cameras suitable to industrial stereophotogrammetric surveys |
US5233382A (en) * | 1991-04-03 | 1993-08-03 | Fuji Photo Film Company, Ltd. | Range finding device unaffected by environmental conditions |
JPH0772560A (ja) * | 1993-09-01 | 1995-03-17 | Off Morioka:Kk | ステレオ写真撮影装置 |
JPH0836229A (ja) * | 1994-07-21 | 1996-02-06 | Canon Inc | ステレオアダプター |
JP2746171B2 (ja) * | 1995-02-21 | 1998-04-28 | 日本電気株式会社 | 固体撮像装置及びその製造方法 |
JP3034209B2 (ja) | 1996-12-11 | 2000-04-17 | 稔 稲葉 | ステレオカメラのレンズ調節装置 |
JPH11192207A (ja) * | 1997-11-07 | 1999-07-21 | Matsushita Electric Ind Co Ltd | ビデオスコープおよび携帯収納ケース |
JPH11281351A (ja) * | 1998-01-28 | 1999-10-15 | Fuji Electric Co Ltd | 測距装置 |
JP4172555B2 (ja) * | 1998-03-12 | 2008-10-29 | 富士重工業株式会社 | ステレオカメラの調整装置 |
US6296360B1 (en) * | 1999-09-27 | 2001-10-02 | Minoru Inaba | Stereo slide mount, stereo slide viewer and collimation pattern mask |
JP3565749B2 (ja) * | 1999-09-22 | 2004-09-15 | 富士重工業株式会社 | 車載カメラの撮像方向の検査方法およびその検査装置 |
JP3255360B2 (ja) * | 1999-09-22 | 2002-02-12 | 富士重工業株式会社 | 距離データの検査方法およびその検査装置 |
JP3479006B2 (ja) * | 1999-09-22 | 2003-12-15 | 富士重工業株式会社 | 車載カメラの検査方法ならびに装置 |
EP1263626A2 (en) * | 2000-03-02 | 2002-12-11 | Donnelly Corporation | Video mirror systems incorporating an accessory module |
JP2001305681A (ja) * | 2000-04-19 | 2001-11-02 | Nippon Signal Co Ltd:The | ステレオカメラ装置 |
GB0019449D0 (en) * | 2000-08-09 | 2000-09-27 | Connell Michelle C O | Electric socket/plug cover |
JP3821652B2 (ja) * | 2001-02-26 | 2006-09-13 | 三菱電機株式会社 | 撮像装置 |
JP2002369060A (ja) * | 2001-06-04 | 2002-12-20 | Clarion Co Ltd | パノラマカメラの配置方法及びその構築方法 |
DE10162652A1 (de) * | 2001-12-20 | 2003-07-03 | Bosch Gmbh Robert | Stereo-Kamera-Anordnung in einem Kraftfahrzeug |
DE10209616A1 (de) * | 2002-03-05 | 2003-09-18 | Bosch Gmbh Robert | Verfahren und Vorrichtung zur Befestigung eines Sensors |
DE20209101U1 (de) * | 2002-06-12 | 2003-03-13 | Grosmann Hans Dieter | Die digitale Sehhilfe (Brille) |
KR100964057B1 (ko) * | 2002-08-30 | 2010-06-16 | 테루모 가부시키가이샤 | 튜브 접합 장치 및 튜브 접합 방법 |
JP4322092B2 (ja) * | 2002-11-13 | 2009-08-26 | 富士機械製造株式会社 | 電子部品実装装置における校正方法および装置 |
JP3709879B2 (ja) * | 2003-05-01 | 2005-10-26 | 日産自動車株式会社 | ステレオ画像処理装置 |
JP4073371B2 (ja) * | 2003-06-30 | 2008-04-09 | テルモ株式会社 | チューブクランプ装置及びチューブ接合装置 |
JP4238115B2 (ja) * | 2003-11-07 | 2009-03-11 | ポップリベット・ファスナー株式会社 | ワイヤハーネス等の長尺部材の留め具 |
JP2005350010A (ja) * | 2004-06-14 | 2005-12-22 | Fuji Heavy Ind Ltd | ステレオ式車外監視装置 |
JP4848795B2 (ja) * | 2006-02-27 | 2011-12-28 | パナソニック株式会社 | ステレオカメラ |
2005
- 2005-11-14 EP EP05806640A patent/EP1816514B1/en active Active
- 2005-11-14 JP JP2006545151A patent/JP4691508B2/ja active Active
- 2005-11-14 EP EP11159214A patent/EP2357527B1/en active Active
- 2005-11-14 US US11/666,340 patent/US20080001727A1/en not_active Abandoned
- 2005-11-14 WO PCT/JP2005/021194 patent/WO2006052024A1/ja active Application Filing
2010
- 2010-12-28 JP JP2010291539A patent/JP2011123078A/ja not_active Abandoned
2014
- 2014-01-22 US US14/161,360 patent/US9456199B2/en active Active
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4879596A (en) | 1988-01-08 | 1989-11-07 | Kabushiki Kaisha Toshiba | Stereoscopic camera apparatus which incorporates cameras and can commonly adjust directions of the cameras |
JPH08113082A (ja) * | 1994-10-18 | 1996-05-07 | Hino Motors Ltd | 車両用ビデオカメラの汚れ防止装置 |
US5778268A (en) | 1996-07-12 | 1998-07-07 | Inaba; Minoru | Stereo camera |
JPH11301365A (ja) | 1998-04-17 | 1999-11-02 | Kyocera Corp | 車載用ステレオカメラの支持装置 |
EP1087257A2 (en) | 1999-09-22 | 2001-03-28 | Fuji Jukogyo Kabushiki Kaisha | Mounting stereo camera system of a vehicle |
JP2001088611A (ja) * | 1999-09-22 | 2001-04-03 | Fuji Heavy Ind Ltd | 車載カメラ |
JP2001242521A (ja) * | 2000-02-28 | 2001-09-07 | Fuji Heavy Ind Ltd | カメラの組み立て構造、カメラの調整方法、および調整用治具 |
WO2003005455A1 (de) * | 2001-06-29 | 2003-01-16 | Siemens Aktiengesellschaft | Bilderzeugungsvorrichtung und verfahren zur herstellung einer bilderzeugungsvorrichtung |
JP2003084357A (ja) * | 2001-09-14 | 2003-03-19 | Kyocera Corp | 大型ステレオカメラの取付構造 |
JP2003312376A (ja) * | 2002-04-22 | 2003-11-06 | Fujitsu Ten Ltd | 車載用周辺監視センサの設置調整方法 |
JP2003335180A (ja) | 2002-05-17 | 2003-11-25 | Fuji Heavy Ind Ltd | センサの支持構造およびその組付方法 |
JP2005022496A (ja) * | 2003-07-01 | 2005-01-27 | Auto Network Gijutsu Kenkyusho:Kk | 監視部品取付部構造 |
JP2006033282A (ja) * | 2004-07-14 | 2006-02-02 | Olympus Corp | 画像生成装置およびその方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1816514A4 |
Cited By (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101496414A (zh) * | 2006-07-26 | 2009-07-29 | 罗伯特·博世有限公司 | 具有两个摄像机单元的光学测量装置 |
WO2008012173A1 (de) * | 2006-07-26 | 2008-01-31 | Robert Bosch Gmbh | Optische messeinrichtung mit zwei kameraeinheiten |
JP5224142B2 (ja) * | 2007-11-01 | 2013-07-03 | コニカミノルタホールディングス株式会社 | 撮像装置 |
WO2009057436A1 (ja) * | 2007-11-01 | 2009-05-07 | Konica Minolta Holdings, Inc. | 撮像装置 |
JPWO2009057436A1 (ja) * | 2007-11-01 | 2011-03-10 | コニカミノルタホールディングス株式会社 | 撮像装置 |
JP2009265412A (ja) * | 2008-04-25 | 2009-11-12 | Fuji Heavy Ind Ltd | ステレオカメラユニット |
JP2010195192A (ja) * | 2009-02-25 | 2010-09-09 | Alps Electric Co Ltd | 回転コネクタの取付構造 |
JP2012085290A (ja) * | 2010-10-08 | 2012-04-26 | Lg Innotek Co Ltd | 3次元撮像装置及びその製造方法 |
CN107065423A (zh) * | 2010-10-08 | 2017-08-18 | Lg伊诺特有限公司 | 三维图像采集装置及其制造方法 |
CN107065423B (zh) * | 2010-10-08 | 2020-06-02 | Lg伊诺特有限公司 | 三维图像采集装置及其制造方法 |
US9883166B2 (en) | 2010-10-08 | 2018-01-30 | Lg Innotek Co., Ltd. | Three dimensional image pick-up device and manufacturing method thereof |
WO2012091151A1 (en) * | 2010-12-28 | 2012-07-05 | Ricoh Company, Ltd. | Ranging apparatus |
JP2012150093A (ja) * | 2010-12-28 | 2012-08-09 | Ricoh Co Ltd | 測距装置 |
US9429423B2 (en) | 2010-12-28 | 2016-08-30 | Ricoh Company, Ltd. | Ranging apparatus |
WO2014015867A1 (de) * | 2012-07-27 | 2014-01-30 | Conti Temic Microelectronic Gmbh | Verfahren zur ausrichtung zweier bildaufnahmeelemente eines stereokamerasystems |
JP2015528133A (ja) * | 2012-07-27 | 2015-09-24 | コンティ テミック マイクロエレクトロニック ゲゼルシャフト ミットベシュレンクテル ハフツングConti Temic microelectronic GmbH | ステレオ・カメラの二つの画像撮影エレメントを揃えるための方法 |
US10547827B2 (en) | 2012-07-27 | 2020-01-28 | Conti Temic Microelectronic Gmbh | Method for aligning two image recording elements of a stereo camera system |
JP2015198224A (ja) * | 2014-04-03 | 2015-11-09 | 日立オートモティブシステムズ株式会社 | 基板の保持構造 |
US10150427B2 (en) | 2014-12-24 | 2018-12-11 | Kabushiki Kaisha Tokai-Rika-Denki-Seisakusho | Imaging unit support structure |
JP2016120756A (ja) * | 2014-12-24 | 2016-07-07 | 株式会社東海理化電機製作所 | 撮像装置 |
WO2016103928A1 (ja) * | 2014-12-24 | 2016-06-30 | 株式会社東海理化電機製作所 | 撮像装置 |
JP2016177257A (ja) * | 2015-03-18 | 2016-10-06 | 株式会社リコー | 撮像ユニット、車両制御ユニットおよび撮像ユニットの伝熱方法 |
JP2017191265A (ja) * | 2016-04-15 | 2017-10-19 | 日立オートモティブシステムズ株式会社 | 多眼光学装置 |
JP2020094858A (ja) * | 2018-12-11 | 2020-06-18 | 日立オートモティブシステムズ株式会社 | ステレオカメラ装置 |
WO2020121746A1 (ja) * | 2018-12-11 | 2020-06-18 | 日立オートモティブシステムズ株式会社 | ステレオカメラ装置 |
JP7154120B2 (ja) | 2018-12-11 | 2022-10-17 | 日立Astemo株式会社 | ステレオカメラ装置 |
Also Published As
Publication number | Publication date |
---|---|
EP1816514A1 (en) | 2007-08-08 |
EP1816514B1 (en) | 2012-06-20 |
JP2011123078A (ja) | 2011-06-23 |
US20080001727A1 (en) | 2008-01-03 |
US20140132739A1 (en) | 2014-05-15 |
JPWO2006052024A1 (ja) | 2008-05-29 |
EP2357527B1 (en) | 2012-10-17 |
EP2357527A1 (en) | 2011-08-17 |
JP4691508B2 (ja) | 2011-06-01 |
US9456199B2 (en) | 2016-09-27 |
EP1816514A4 (en) | 2010-11-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4691508B2 (ja) | ステレオカメラ | |
US10676041B2 (en) | Vehicular camera with pliable connection of PCBS | |
US9854225B2 (en) | Imaging unit including a chassis and heat transfer member | |
CN105163006B (zh) | 汽车成像系统及安装成像系统的方法 | |
US20160264065A1 (en) | Vehicle-mounted camera, method of manufacturing vehicle-mounted camera, and method of manufacturing vehicle body | |
JP2009265412A (ja) | ステレオカメラユニット | |
US20150358605A1 (en) | Stereo camera module and method for the production thereof | |
JP5961506B2 (ja) | ステレオカメラ装置 | |
JP6608278B2 (ja) | ステレオ・カメラの二つの画像撮影エレメントを揃えるための方法 | |
CN106476715A (zh) | 车载摄像头的安装方法 | |
US11841604B2 (en) | Vehicular camera with dual focusing feature | |
JP2008054030A (ja) | 車両周辺監視装置 | |
JP7436391B2 (ja) | 車載カメラ、及び車載カメラシステム | |
US20240121493A1 (en) | Vehicular camera assembly process | |
JP2007507138A (ja) | 画像センサを有する光学モジュールおよび画像センサの感応面上で支持されているレンズユニット | |
TWM568376U (zh) | 多攝像頭系統及多攝像頭模組 | |
US20160280150A1 (en) | camera system, particularly for a vehicle, and a method for the manufacture thereof | |
JP4755476B2 (ja) | 撮像装置 | |
JP6838264B2 (ja) | 車載カメラ装置 | |
JP2006276129A (ja) | 光学装置 | |
JP2010041373A (ja) | カメラモジュールおよびステレオカメラ | |
JP2022189609A (ja) | 撮像装置及び撮像装置の組み立て方法 | |
JP2023132605A (ja) | 測距装置、および移動体 | |
JP2009267755A (ja) | カメラ装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KN KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2006545151 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11666340 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2005806640 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWP | Wipo information: published in national office |
Ref document number: 2005806640 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 11666340 Country of ref document: US |