WO2004106858A1 - Stereo camera system and stereo optical module - Google Patents
- Publication number
- WO2004106858A1 (PCT/JP2004/007759)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- stereo
- image data
- image
- data
- optical module
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/133—Equalising the characteristics of different image components, e.g. their average brightness or colour balance
Definitions
- The present invention relates to a stereo camera system and a stereo optical module that can be mounted on a moving object such as a vehicle, a ship, an aircraft, or a robot, and on a non-moving object such as a monitoring camera system.
- Japanese Patent Application Laid-Open No. H11-111469 proposes a stereo image processing system comprising a stereo imaging unit (stereo optical module) for capturing a stereo image using a pair of cameras and an image processing unit (central control unit) for processing the captured stereo image.
- In the stereo imaging unit, a non-volatile memory for storing camera characteristic data is provided in order to correct variations between the characteristics of the pair of cameras.
- The correction data stored in the non-volatile memory can be output to the stereo image processing unit via a communication line.
- The stereo image data captured by the cameras in the stereo imaging unit is output as an analog video signal to the stereo image processing unit via a communication line.
- Inside the stereo image processing unit, a correction operation is performed using the analog video signal and the correction data input from the imaging unit.
- Japanese Patent Application Laid-Open No. H11-1322843 proposes an imaging device in which a single system of analog image signal cable connects an image pickup device (stereo optical module) and an image processing unit (central control unit), and in which each of the stereo images is selectively sent using this image signal cable.
- Japanese Patent Laid-Open Publication No. 2001-88609 proposes an in-vehicle camera in which a vehicle-mounted camera (stereo optical module) and an image recognition device (central control unit) are connected by a video signal line and a serial communication line, and in which a temperature sensor is mounted on the in-vehicle camera so that the output from this temperature sensor can be sent to the image recognition device via the serial communication line.
- Japanese Patent Application Laid-Open No. 2000-2596966 proposes a system comprising a plurality of on-board cameras (stereo optical modules) and a surroundings recognition device (central control unit).
- a stereo image obtained inside the stereo optical module is used, for example, to determine a distance to a subject.
- For this purpose, a correction calculation for correcting various imbalances arising between the optical systems of the pair of cameras is required.
- In the prior art described above, the correction data is stored in the stereo optical module and transmitted to the central control unit using a communication line.
- The various imbalances arising between the above-described optical systems vary uniquely for each stereo optical module, owing, for example, to positional shifts of the optical system inside the stereo optical module or to deviation due to aging. For this reason, in a method such as that of the above-described publication, an arithmetic unit for correcting the variations inherent in the stereo optical module must be provided on the central control unit side, which complicates the configuration on the central control unit side.
- Further, in the prior art, stereo image data is transmitted between the stereo optical module and the central control unit using an analog communication line.
- In addition, a serial communication line for transmitting and receiving digital data and a video signal line for transmitting analog image data are provided separately, so that two signal lines are used.
- An object of the present invention is to provide a stereo camera system and a stereo optical module capable of performing, on the stereo optical module side, a correction operation for correcting imbalance in capturing a stereo image, or other image processing operations associated with the stereo optical module.
- Another object of the present invention is to provide a stereo camera system and a stereo optical module that can simplify the structure of the central control unit performing central control in the stereo camera system while correcting imbalance during stereo image capturing.
- In order to achieve the above objects, a stereo camera system according to one aspect of the present invention is a stereo camera system that captures subject images from a plurality of viewpoints of the same subject and obtains the distance to the subject, comprising: a stereo optical module for acquiring a stereo image; a central control unit for evaluating a target subject based on the stereo image obtained by the stereo optical module; and a communication line for inputting the stereo image data obtained by the stereo optical module to the central control unit.
- The stereo optical module includes: an imaging element for generating stereo image data of the same subject viewed from the plurality of viewpoints; an A/D conversion unit for A/D converting the stereo image data generated by the imaging element to generate digital stereo image data; an image output processing circuit that performs predetermined image processing on the digital stereo image data generated by the A/D conversion unit; and a module-side communication interface that outputs the stereo image data image-processed by the image output processing circuit.
- The central control unit includes: a unit-side communication interface for inputting the stereo image data output via the module-side communication interface; and a distance image evaluation device for calculating the distance to the target subject based on the stereo image data input via the unit-side communication interface.
- A stereo optical module according to another aspect of the present invention is a stereo optical module for use in a stereo camera system that captures subject images viewed from a plurality of viewpoints and determines the distance to the target subject, the system comprising the stereo optical module for obtaining a stereo image, a central control unit for evaluating a target subject based on the stereo image obtained by the stereo optical module, and a communication line for inputting the stereo image data output from the stereo optical module to the central control unit. This stereo optical module includes: an imaging element for generating stereo image data of the same subject viewed from a plurality of viewpoints; an A/D conversion unit for A/D converting the stereo image data generated by the imaging element to generate digital stereo image data; an image output processing circuit for performing predetermined image processing on the digital stereo image data generated by the A/D conversion unit; and a module-side communication interface for outputting the stereo image data processed by the image output processing circuit to the communication line.
- In such a stereo camera system and stereo optical module, a predetermined image processing operation is performed on the stereo image data obtained in the stereo optical module. That is, by performing the image processing operation in the stereo optical module, the structure of the central control unit can be simplified.
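As an illustration only (not part of the disclosed embodiments), the division of labor described above can be sketched as follows. The "predetermined image processing" performed by the module-side image output processing circuit might, for example, equalise a brightness imbalance between the two cameras before the data ever reaches the central control unit. The function name and the linear gain/offset model below are assumptions made for this sketch, not the patent's specific method.

```python
import numpy as np

def correct_pair(left, right, gain=(1.0, 1.0), offset=(0.0, 0.0)):
    """Equalise a stereo pair with an assumed per-camera gain/offset model."""
    l = np.clip(left * gain[0] + offset[0], 0, 255)
    r = np.clip(right * gain[1] + offset[1], 0, 255)
    return l.astype(np.uint8), r.astype(np.uint8)

left = np.full((4, 4), 100, dtype=np.float64)
right = np.full((4, 4), 80, dtype=np.float64)   # right camera reads darker
l, r = correct_pair(left, right, gain=(1.0, 1.25))
print(l[0, 0], r[0, 0])  # both cameras now agree: 100 100
```

Because this correction runs inside the module, the central control unit receives already-balanced image data and needs no per-module correction circuitry of its own.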
- A stereo camera system according to another aspect of the present invention is a stereo camera system that captures subject images from a plurality of viewpoints of the same subject and determines the distance to the target subject, comprising: a stereo optical module for obtaining a stereo image; a central control unit for evaluating a target subject based on the stereo image obtained by the stereo optical module; and a communication line for inputting the stereo image data acquired by the stereo optical module to the central control unit.
- The stereo optical module includes: a stereo optical system for generating subject images of the same subject viewed from a plurality of viewpoints; an imaging element that generates stereo image data based on the subject images generated by the stereo optical system; an A/D conversion unit that A/D converts the stereo image data generated by the imaging element to generate digital stereo image data; a correction information memory for storing calibration data of the stereo camera system; an image output processing circuit for performing predetermined image processing based on the digital stereo image data generated by the A/D conversion unit and the calibration data stored in the correction information memory; and a module-side communication interface for outputting the stereo image data image-processed by the image output processing circuit to the communication line.
- The central control unit includes: a unit-side communication interface for inputting, to the central control unit, the stereo image data output to the communication line via the module-side communication interface; and a distance image evaluation device for determining the distance to the target subject based on the stereo image data input via the unit-side communication interface.
- A stereo optical module according to another aspect of the present invention is a stereo optical module for use in a stereo camera system that captures subject images viewed from a plurality of viewpoints and calculates the distance to the target subject, the system comprising the stereo optical module for obtaining a stereo image, a central control unit for evaluating a target subject based on the stereo image obtained by the stereo optical module, and a communication line. This stereo optical module includes: a stereo optical system for generating subject images; an imaging element for generating stereo image data based on the subject images generated by the stereo optical system; an A/D conversion unit for A/D converting the stereo image data generated by the imaging element to generate digital stereo image data; a correction information memory for storing calibration data of the stereo camera system; an image output processing circuit for performing predetermined image processing based on the digital stereo image data generated by the A/D conversion unit and the calibration data stored in the correction information memory; and a module-side communication interface for outputting the stereo image data processed by the image output processing circuit to the communication line.
- With such a stereo camera system and stereo optical module, predetermined image processing can be performed in the stereo optical module using the calibration data stored there.
- A stereo camera system according to another aspect of the present invention is a stereo camera system for photographing the same subject from a plurality of viewpoints and determining the distance to the subject, comprising: a stereo optical module for acquiring a stereo image; a central control unit for evaluating a target subject based on the stereo image obtained by the stereo optical module; and a communication line for inputting the stereo image data acquired by the stereo optical module to the central control unit.
- The stereo optical module includes: a stereo optical system for generating subject images viewed from a plurality of viewpoints for one subject; an imaging element for generating stereo image data based on the subject images generated by the stereo optical system; an A/D conversion unit for A/D converting the stereo image data generated by the imaging element to generate digital stereo image data; a correction information memory for storing first calibration data and second calibration data, which are the calibration data of the stereo camera system; an image output processing circuit for performing predetermined image processing based on the digital stereo image data generated by the A/D conversion unit and the first calibration data stored in the correction information memory; and a module-side communication interface for outputting the stereo image data processed by the image output processing circuit and the second calibration data stored in the correction information memory to the communication line.
- The central control unit includes: a unit-side communication interface for inputting, to the central control unit, the stereo image data output to the communication line via the module-side communication interface; and a distance image evaluation device for obtaining the distance to the target subject based on the stereo image data and the second calibration data input via the unit-side communication interface.
- A stereo optical module according to another aspect of the present invention is a stereo optical module for use in a stereo camera system that captures subject images from a plurality of viewpoints and determines the distance to the subject, the system comprising the stereo optical module for obtaining a stereo image, a central control unit for evaluating a target subject based on the obtained stereo image, and a communication line for inputting the stereo image data output from the stereo optical module to the central control unit. This stereo optical module includes: a stereo optical system for generating subject images; an imaging element for generating stereo image data based on the subject images generated by the stereo optical system; an A/D conversion unit for A/D converting the stereo image data generated by the imaging element to generate digital stereo image data; a correction information memory for storing first calibration data and second calibration data of the stereo camera system; an image output processing circuit for performing predetermined image processing based on the digital stereo image data generated by the A/D conversion unit and the first calibration data stored in the correction information memory; and a module-side communication interface for outputting the stereo image data processed by the image output processing circuit and the second calibration data to the communication line.
- In such a stereo camera system and stereo optical module, the correction information memory storing the first calibration data and the second calibration data is provided in the stereo optical module. The first calibration data is used when the stereo optical module performs the predetermined image processing operation, and the second calibration data is used when the distance calculation is performed on the central control unit side. That is, the central control unit can be simplified at least with respect to the predetermined image processing operation that uses the first calibration data.
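Purely as an illustration of the two calibration sets (all names and values below are assumptions for this sketch, not values or methods from the patent): the first calibration data might hold a module-side rectification parameter, while the second calibration data might hold the focal length and baseline that the central control unit needs for triangulation.

```python
import numpy as np

# Assumed contents of the two calibration sets (illustrative only).
first_cal = {"right_row_shift": 2}                    # used inside the module
second_cal = {"focal_px": 800.0, "baseline_m": 0.12}  # forwarded to the unit

def module_process(right_img, cal):
    """Module side: shift the right image rows by the calibrated offset."""
    return np.roll(right_img, -cal["right_row_shift"], axis=0)

def unit_distance(disparity_px, cal):
    """Central unit side: pinhole triangulation Z = f * B / d."""
    return cal["focal_px"] * cal["baseline_m"] / disparity_px

img = np.arange(16).reshape(4, 4)
rectified = module_process(img, first_cal)
print(unit_distance(disparity_px=48.0, cal=second_cal))  # 2.0 (metres)
```

The point of the split is visible in the sketch: the module never needs the triangulation constants, and the central unit never needs the alignment offset, so each side stores and computes only what it uses.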
- A stereo camera system according to another aspect of the present invention is a stereo camera system that captures subject images from a plurality of viewpoints of the same subject and determines the distance to the target subject, comprising: a stereo optical module for obtaining a stereo image; a central control unit for evaluating a target subject based on the stereo image obtained by the stereo optical module; and a communication line for inputting data relating to the stereo image acquired by the stereo optical module to the central control unit.
- The stereo optical module includes: an imaging element that generates stereo image data based on subject images of the same subject viewed from a plurality of viewpoints; an A/D conversion unit that A/D converts the stereo image data generated by the imaging element to generate digital stereo image data; an image output processing circuit that performs predetermined image processing on the digital stereo image data generated by the A/D conversion unit; a distance image calculation unit that generates distance image data based on the stereo image data image-processed by the image output processing circuit; and a module-side communication interface that outputs the distance image data generated by the distance image calculation unit to the communication line.
- The central control unit includes: a unit-side communication interface for inputting the distance image data output via the module-side communication interface; and a distance image evaluation device for evaluating the subject based on the distance image data input via the unit-side communication interface.
- A stereo optical module according to another aspect of the present invention is a stereo optical module for use in a stereo camera system that captures subject images viewed from a plurality of viewpoints and determines the distance to the target subject, the system comprising the stereo optical module for obtaining a stereo image, a central control unit for evaluating the subject based on the stereo image obtained by the stereo optical module, and a communication line for inputting data relating to the stereo image output from the stereo optical module to the central control unit. This stereo optical module includes: an imaging element that generates stereo image data of the same subject viewed from a plurality of viewpoints; an A/D conversion unit that A/D converts the stereo image data generated by the imaging element to generate digital stereo image data; an image output processing circuit that performs predetermined image processing on the digital stereo image data generated by the A/D conversion unit; a distance image calculation unit that generates distance image data based on the stereo image data image-processed by the image output processing circuit; and a module-side communication interface that outputs the distance image data generated by the distance image calculation unit to the communication line.
- In such a stereo camera system and stereo optical module, predetermined image processing is performed by the image output processing circuit on the stereo image data obtained in the stereo optical module, and distance image data is generated in the distance image calculation unit. That is, by providing in the stereo optical module both the image output processing circuit that performs the image processing operation and the distance image calculation unit that generates the distance image data, the structure of the central control unit can be simplified.
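For orientation only, the kind of computation a distance image calculation unit performs can be sketched with generic block matching: for each position along a left-image row, search the same row of the right image for the best-matching block and record the horizontal shift as the disparity (from which distance follows by triangulation). This is a textbook sketch, not the patent's specific algorithm; all names and the SAD cost are assumptions.

```python
import numpy as np

def disparity_row(left_row, right_row, block=3, max_disp=4):
    """Sum-of-absolute-differences block matching along one image row."""
    w = len(left_row)
    disp = np.zeros(w - block, dtype=int)
    for x in range(max_disp, w - block):
        patch = left_row[x:x + block]
        costs = [np.abs(patch - right_row[x - d:x - d + block]).sum()
                 for d in range(max_disp + 1)]
        disp[x] = int(np.argmin(costs))  # shift with the lowest SAD cost
    return disp

# Right row is the left row shifted 2 pixels to the left (true disparity = 2).
left = np.array([0, 0, 9, 1, 7, 3, 8, 2, 6, 0, 0, 0], dtype=float)
right = np.roll(left, -2)
d = disparity_row(left, right)
print(d[6])  # 2 at a well-textured pixel
```

Running this search for every row yields a disparity (distance) image, which is what the module would transmit in place of the raw stereo pair.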
- A stereo camera system according to another aspect of the present invention is a stereo camera system that captures subject images viewed from a plurality of viewpoints of the same subject and determines the distance to the target subject. The system comprises a stereo optical module including: a stereo optical system for generating subject images viewed from a plurality of viewpoints of the same subject; an imaging element that generates stereo image data based on the subject images generated by the stereo optical system; an A/D conversion unit that A/D converts the stereo image data generated by the imaging element to generate digital stereo image data; an image output processing circuit for performing predetermined image processing on the digital stereo image data generated by the A/D conversion unit; a distance image calculation unit for generating distance image data based on the stereo image data processed by the image output processing circuit; and a module-side communication interface for outputting at least one of the distance image data generated by the distance image calculation unit and the stereo image data.
- The system further comprises a central control unit including a unit-side communication interface for inputting at least one of the distance image data and the stereo image data, and a distance image evaluation device that evaluates the subject based on at least one of the distance image data and the stereo image data input via the unit-side communication interface; and a communication line that connects the module-side communication interface and the unit-side communication interface and is configured to communicate data between the stereo optical module and the central control unit.
- A stereo optical module according to another aspect of the present invention includes: a stereo optical system for generating subject images of the same subject viewed from a plurality of viewpoints; an imaging element that generates stereo image data based on the subject images generated by the stereo optical system; an A/D conversion unit that A/D converts the stereo image data generated by the imaging element to generate digital stereo image data; an image output processing circuit for performing predetermined image processing on the digital stereo image data generated by the A/D conversion unit; a distance image calculation unit for generating distance image data based on the stereo image data processed by the image output processing circuit; and a module-side communication interface that outputs at least one of the distance image data generated by the distance image calculation unit and the stereo image data generated by the imaging element to a communication line.
- A stereo camera system according to another aspect of the present invention is a stereo camera system that captures subject images viewed from a plurality of viewpoints of the same subject and obtains the distance to the target subject, comprising: a stereo optical module for acquiring a stereo image; a central control unit for evaluating a target subject based on the stereo image obtained by the stereo optical module; and a communication line for inputting data relating to the stereo image acquired by the stereo optical module to the central control unit.
- The stereo optical module includes: a stereo optical system for generating images of the same subject viewed from a plurality of viewpoints; an imaging element for generating stereo image data based on the subject images; an A/D conversion unit for A/D converting the stereo image data generated by the imaging element to generate digital stereo image data; a distance image calculation unit that generates distance image data based on the digital stereo image data generated by the A/D conversion unit; and a module-side communication interface for outputting the generated distance image data to the communication line. The central control unit includes: a unit-side communication interface to which the distance image data output via the module-side communication interface is input; and a distance image evaluation device for evaluating the subject based on the distance image data input via the unit-side communication interface.
- A stereo optical module according to another aspect of the present invention is a stereo optical module for use in a stereo camera system that captures images of a subject viewed from a plurality of viewpoints and determines the distance to the target subject, the system comprising the stereo optical module for obtaining a stereo image, a central control unit for evaluating the subject based on the stereo image obtained by the stereo optical module, and a communication line for inputting data relating to the stereo image output from the stereo optical module to the central control unit. This stereo optical module includes: a stereo optical system for generating subject images; an imaging element that generates stereo image data based on the subject images generated by the stereo optical system; an A/D conversion unit that A/D converts the stereo image data generated by the imaging element to generate digital stereo image data; a distance image calculation unit for generating distance image data based on the digital stereo image data generated by the A/D conversion unit; and a module-side communication interface for outputting the distance image data generated by the distance image calculation unit to the communication line.
- A stereo camera system according to another aspect of the present invention is a stereo camera system that captures subject images viewed from a plurality of viewpoints of the same subject and obtains the distance to the target subject, comprising: a stereo optical module for obtaining a stereo image; a central control unit for evaluating a target subject based on data relating to the stereo image obtained by the stereo optical module; and a communication line over which the stereo optical module and the central control unit communicate by digital signals.
- The stereo optical module includes: a stereo optical system for generating subject images of the same subject viewed from a plurality of viewpoints; an imaging element that generates stereo image data based on the subject images generated by the stereo optical system; an A/D conversion unit that A/D converts the stereo image data generated by the imaging element to generate digital stereo image data; and a module-side communication interface that outputs information relating to the stereo image acquired by the stereo optical module to the communication line when a control command is received from the central control unit. The central control unit includes: a unit-side communication interface for inputting the information relating to the stereo image output via the module-side communication interface and for outputting control commands from the central control unit to the stereo optical module; and a distance image evaluation device that evaluates the subject based on the information relating to the stereo image input via the unit-side communication interface.
- A stereo optical module according to another aspect of the present invention includes: a stereo optical system for generating subject images of the same subject viewed from a plurality of viewpoints; an imaging element that generates stereo image data based on the subject images generated by the stereo optical system; an A/D conversion unit that performs A/D conversion on the stereo image data generated by the imaging element to generate digital stereo image data; and a module-side communication interface that outputs information relating to the stereo image to a communication line in response to a control command input from the outside.
- In such a stereo camera system and stereo optical module, the number of signal lines can be reduced by performing communication between the stereo optical module and the central control unit using only digital signals.
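To see why a single digital line suffices for both directions of traffic, consider tagged, length-prefixed frames: control commands from the central control unit and image data from the module can share one physical link, distinguished only by a type field. The frame layout below is invented for illustration and is not a format disclosed in the patent.

```python
import struct

# Illustrative message types (assumed, not from the patent).
TYPE_CMD, TYPE_IMAGE = 0x01, 0x02

def pack_frame(msg_type, payload):
    """Prefix the payload with a 1-byte type and a 4-byte big-endian length."""
    return struct.pack(">BI", msg_type, len(payload)) + payload

def unpack_frame(frame):
    """Recover (type, payload) from a framed message."""
    msg_type, length = struct.unpack(">BI", frame[:5])
    return msg_type, frame[5:5 + length]

frame = pack_frame(TYPE_CMD, b"SEND_STEREO")
t, body = unpack_frame(frame)
print(t == TYPE_CMD, body)  # True b'SEND_STEREO'
```

With such framing, an analog video line and a separate serial command line collapse into one digital channel, which is the wiring reduction the passage above describes.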
- A stereo camera system according to a fifteenth aspect of the present invention is a stereo camera system that captures subject images from a plurality of viewpoints of the same subject and determines the distance to the target subject, comprising: at least one stereo optical module for acquiring a stereo image; at least one optical module, different from the stereo optical module, for acquiring an image; a central control unit for performing sequence control of the stereo optical module and the optical module; and a communication line for communicably connecting the stereo optical module, the optical module, and the central control unit.
- With such a stereo camera system, a plurality of modules can be connected to one communication line, and the number of signal lines can be reduced.
- FIG. 1 is a block diagram showing a basic configuration of a stereo camera system according to a first embodiment of the present invention.
- FIG. 2 is a block diagram showing the circuit configuration of the stereo optical module according to the first embodiment.
- FIG. 3A is a top view showing the configuration of the stereo optical system of the stereo optical module,
- FIG. 3B is an external view of the stereo optical module
- FIG. 4A is a perspective view showing the structure of the stereo optical module,
- FIG. 4B is a top view for explaining the circuit arrangement in the stereo optical module,
- FIG. 5 is a diagram showing an example of a stereo image.
- FIG. 6 is a block diagram showing the circuit configuration of the central control unit according to the first embodiment of the present invention.
- FIG. 7 is a diagram showing an example in which the stereo camera system of the first embodiment is mounted on a vehicle.
- FIG. 8A is a diagram showing a first modification of the visual field mask
- FIG. 8B is a diagram showing a second modification of the visual field mask
- FIG. 9 is a block diagram showing a modification of the central control unit,
- FIG. 10 is a block diagram showing a circuit configuration of the stereo optical module according to the second embodiment of the present invention.
- FIG. 11 is a block diagram showing a circuit configuration of a central control unit according to the second embodiment of the present invention.
- FIG. 12 is a block diagram showing a circuit configuration of the stereo optical module according to the third embodiment of the present invention.
- FIG. 13 is a diagram showing an example of communication contents using a video format such as the DV format or the NTSC standard.
- FIG. 14 is a flowchart of the main operation sequence of the stereo camera system according to the fourth embodiment of the present invention
- FIG. 15 is a flowchart of the sensor check sequence in the fourth embodiment.
- FIG. 16 is a flowchart of the calibration data correction sequence
- FIG. 17 is a flowchart of the driver assistance sequence,
- FIG. 18 shows a display example of the inter-vehicle distance warning,
- FIG. 19 is a display example of a road surface recognition result
- FIG. 20 shows a display example of the avoidance warning,
- FIG. 21 shows a display example of lane recognition results,
- FIG. 22 is a flowchart showing an example of a change in the flowchart of the main operation sequence.
- FIG. 23 is a flow chart showing a first modified example of the sensor check sequence.
- FIG. 24 is a flowchart showing a second modified example of the sensor check sequence,
- FIG. 25 is a block diagram showing a configuration of the stereo camera system according to the fifth embodiment of the present invention.
- FIG. 26 is a configuration diagram of a conventional raindrop sensor,
- FIG. 27 is a diagram illustrating an example of a stereo image at the time of detecting raindrops according to the sixth embodiment of the present invention.
- FIG. 28 is a diagram for explaining the overlap region.
- FIG. 29 is a diagram schematically showing a conventional corresponding point search
- FIG. 30 is a diagram schematically showing a corresponding point search at the time of detecting raindrops in the sixth embodiment.
- FIG. 31 shows an example of a stereo image when raindrops adhere to an area outside the overlap area,
- FIG. 32 is a diagram for explaining a method for detecting raindrop images in an area outside the overlap area,
- FIG. 33 is a diagram for explaining raindrop detection using a light beam projector.
- FIG. 34 is a diagram illustrating an example of the raindrop detection coating.

BEST MODE FOR CARRYING OUT THE INVENTION
- FIG. 1 is a block diagram showing a basic configuration of a stereo camera system according to a first embodiment of the present invention.
- This stereo camera system is assumed to be a system that can be mounted on moving objects such as vehicles, ships, aircraft, and robots. In the following, an example of mounting on a vehicle will be described with reference to the figures. As shown in FIG. 1, the stereo camera system basically includes a stereo optical module 1, a communication line 2, and a central control unit 3.
- the stereo optical module 1 captures a subject to acquire a stereo image and corrects the acquired stereo image.
- the communication line 2 is a communication line between the stereo optical module 1 and the central control unit 3.
- The central control unit 3 evaluates the subject photographed by the stereo optical module 1 based on the stereo image input from the stereo optical module 1 via the communication line 2. Specifically, the central control unit 3 obtains a distance image, described later, based on the input stereo image, and evaluates the obtained distance image.
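As a concrete (and purely illustrative) example of "evaluating the obtained distance image", the central control unit 3 might report the nearest distance inside a region of interest ahead of the vehicle. The region, the function name, and the sample values below are invented for this sketch.

```python
import numpy as np

def nearest_in_roi(distance_img, roi):
    """Return the smallest distance inside roi = (row0, row1, col0, col1)."""
    r0, r1, c0, c1 = roi
    return float(distance_img[r0:r1, c0:c1].min())

dist = np.full((6, 8), 30.0)    # metres; mostly far background
dist[3, 4] = 7.5                # a close object ahead of the vehicle
print(nearest_in_roi(dist, (2, 5, 2, 6)))  # 7.5
```

An evaluation of this kind is what later feeds warnings such as the inter-vehicle distance warning shown in FIG. 18.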
- FIG. 2 is a block diagram showing the inside of the stereo optical module in detail.
- the stereo optical module 1 includes a stereo optical system 4, an imaging device 5, an imaging device driving circuit 6, and a correction information memory 7, as shown in FIG.
- The stereo optical system 4 is composed of visual field masks (right and left) 19a and 19b, front lens units (right and left) 20a and 20b, primary deflection mirrors (right and left) 21a and 21b, secondary deflection mirrors (right and left) 22a and 22b, a rear lens unit 23, and a low-pass filter 24, and forms an image of a subject (not shown) on the image sensor 5.
- the external appearance of the stereo optical module 1 is covered by a casing member 26 provided with field mask openings 25a and 25b as shown in FIG. 3B.
- The casing member 26 is a cover member that protects the internal mechanism of the stereo optical module from the outside, and has a function of blocking external light, a dustproof function, a function of supporting the internal mechanism, and the like.
- Of the light flux from the subject (not shown) that is incident on the stereo optical system 4 through the field mask apertures 25a and 25b, a part is shielded by the visual field masks 19a and 19b, and the rest enters the right and left front lens units 20a and 20b.
- The visual field masks 19a and 19b function as apertures for narrowing the visual field of the front lens units 20a and 20b.
- The visual field mask 19a blocks the upper half area of the front lens unit 20a, and the visual field mask 19b blocks the lower half area of the front lens unit 20b.
- The effective optical axis (hereinafter simply referred to as the optical axis) of each of the front lens units 20a and 20b having the visual field masks 19a and 19b does not coincide with the lens center, that is, with the central axis that would apply if the visual field masks 19a and 19b were absent. Also, as shown in FIG. 4A, the optical axes of the right and left front lens units 20a and 20b are not coplanar with each other and are in a skewed relationship.
- The luminous flux from the subject that has entered through the front lens units 20a and 20b is reflected by the primary deflection mirrors 21a and 21b.
- These primary deflection mirrors 21a and 21b have such a size that they can reflect the light beams transmitted through the front lens units 20a and 20b.
- That is, the primary deflection mirrors 21a and 21b are almost the same size as, or slightly larger than, the unshielded areas of the front lens units 20a and 20b.
- The primary deflection mirrors 21a and 21b are arranged at an angle of about 45 degrees in the horizontal direction and tilted several degrees in the vertical direction toward the image sensor side. With such an arrangement, the light beams reflected by the primary deflection mirrors 21a and 21b are incident on the secondary deflection mirrors 22a and 22b.
- The secondary deflection mirrors 22a and 22b are arranged so as to be substantially orthogonal to the primary deflection mirrors 21a and 21b in the horizontal direction, and tilted several degrees toward the image sensor in the vertical direction. With this arrangement, the light beams reflected by the secondary deflection mirrors 22a and 22b enter the rear lens unit 23. As shown in FIG. 4B, the secondary deflection mirrors 22a and 22b are arranged so as to intersect each other when viewed from above. That is, the light beam incident on the primary deflection mirror 21a is reflected by the secondary deflection mirror 22a and deflected so as to enter the rear lens unit 23 obliquely downward. On the other hand, the light beam coming from the primary deflection mirror 21b is reflected by the secondary deflection mirror 22b and deflected so as to enter the rear lens unit 23 obliquely upward.
- The light beams thus deflected pass through the rear lens unit 23 and enter the low-pass filter 24, where high-frequency noise components are removed. Thereafter, the light beam incident via the right front lens unit 20a forms an image in the lower half area of the image sensor 5, and the light beam incident via the left front lens unit 20b forms an image in the upper half area of the image sensor 5. That is, by limiting the visual field with the visual field masks 19a and 19b, the upper and lower images are formed on the image sensor without overlapping. As a result, stereo images having left and right parallax, as shown in FIG. 5, are formed vertically on the image sensor. This makes it possible to acquire a stereo image using only one stereo optical module.
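Because the two views are imaged onto the upper and lower halves of a single sensor, extracting the stereo pair amounts to splitting the frame horizontally. A minimal sketch (the function name and list-of-rows image representation are illustrative, not from the patent):

```python
def split_stacked_stereo(frame):
    """Split a vertically stacked stereo frame (a list of pixel rows) into
    (left_image, right_image). Per the described optics, the light from the
    left front lens unit lands on the upper half of the sensor and the light
    from the right front lens unit lands on the lower half."""
    h = len(frame)
    assert h % 2 == 0, "stacked frame height must be even"
    upper = frame[: h // 2]   # image formed via the left front lens unit
    lower = frame[h // 2:]    # image formed via the right front lens unit
    return upper, lower       # (left image, right image)
```

A real implementation would crop a sensor readout buffer the same way; only the half-height boundary matters.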
- Note that the image pickup device 5 is preferably disposed obliquely, as shown in FIG. 4A.
- The stereo images formed on the image sensor 5 in this manner are sequentially output as analog signals to the image output processing circuit 8 by the operation of the image sensor drive circuit 6 shown in the block diagram of FIG. 2.
- The image output processing circuit 8 comprises an A/D converter 12, a C-Y converter 13, a compression processing section 14, a rectification processing section 15, a γ conversion processing section 16, a shading correction processing section 17, and a frame buffer memory 18.
- The image output processing circuit 8, the raindrop sensor 10, and the temperature sensor 11 are installed in circuit layout spaces 27a and 27b, which are empty spaces located behind the visual field masks 19a and 19b, on the back side of the secondary deflection mirrors 22a and 22b.
- For example, the right optical system has the circuit layout space 27a above the primary deflection mirror 21a and the secondary deflection mirror 22a. Since the regions behind the visual field masks 19a and 19b are masked, no optical members are arranged there, and these regions are free spaces.
- Circuits that can be provided in these empty spaces include, for example, an electric signal output circuit that outputs an electric signal, a digital memory (the correction information memory) that stores correction data, a temperature sensor, a raindrop sensor, an illuminance sensor, a GPS antenna, and an ETC card system. A power supply circuit (not shown) of the stereo optical module may also be provided.
- A mechanism member may also be provided; for example, an adjustment mechanism for mechanically adjusting the stereo optical system may be provided as the mechanism member.
- Further, an ND filter (not shown) for adjusting the amount of light incident on the stereo optical system may be made detachable, with the above-mentioned empty space used as a retreat space for the ND filter.
- The A/D converter 12 converts the stereo image, which is an analog signal, into digital stereo image data.
- An optical color filter (not shown) is provided on the front surface of the image sensor 5, so that the image sensor 5 outputs stereo image data corresponding to each of the RGB colors.
- the C—Y converter 13 converts these RGB primary color signals into a luminance signal and two color difference signals. This is preparation processing for conversion to a motion JPEG (Motion-JPEG) signal by the compression processing unit 14. Thereafter, these signals are stored in the frame buffer memory 18.
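The C-Y conversion named here is the standard conversion of RGB primary-color signals into one luminance signal and two color-difference signals, as used ahead of JPEG-family compression. A sketch using the ITU-R BT.601 coefficients (the exact coefficients used by the module are not stated in the patent):

```python
def rgb_to_ycbcr(r, g, b):
    """Convert one RGB pixel (components 0-255) to a luminance signal Y and
    two color-difference signals Cb and Cr, full-range BT.601 form.
    Motion-JPEG compression operates on Y/Cb/Cr, so this conversion is the
    preparatory step performed by the C-Y converter."""
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128.0
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128.0
    return y, cb, cr
```

Gray pixels (r = g = b) map to Cb = Cr = 128, i.e. zero color difference, which is why the two chroma channels compress so well.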
- the stereo image data stored in the frame buffer memory 18 is input to the shading correction processing unit 17.
- the shading correction processing section 17 performs shading correction processing based on the shading correction data stored in the correction information memory 7.
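Shading correction of this kind typically compensates for peripheral light fall-off of the optics by applying a per-pixel gain; the stored shading correction data can then be modelled as a gain map the same size as the image. A hypothetical sketch (the gain-map representation is an assumption, not taken from the patent):

```python
def shading_correct(image, gain_map):
    """Apply shading correction: multiply each pixel of `image` (a 2-D list)
    by the corresponding per-pixel gain in `gain_map`, which would be the
    data held in the correction information memory. Gains > 1 brighten the
    vignetted periphery; a gain of 1.0 leaves the center untouched."""
    return [[px * g for px, g in zip(row, gain_row)]
            for row, gain_row in zip(image, gain_map)]
```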
- Next, the γ conversion processing section 16 performs γ conversion processing to make the image input/output characteristics appropriate, and the rectification processing section 15 then performs rectification processing based on the calibration data stored in the correction information memory 7.
- The rectification process is a process of correcting, by image processing, the displacement of the epipolar lines of the left and right images caused by mechanical displacement of the optical members and the like inside the stereo optical system 4, as well as image distortion caused by lens distortion.
- Specifically, a point corresponding to a feature point selected in the left image is detected in the right image, the amount by which the corresponding point is displaced from the straight line (epipolar line) on which it should originally lie is detected, and image deformation is performed according to the detected deviation.
- The left and right images thus subjected to rectification processing (hereinafter referred to as rectification images) are normalized to prescribed coordinates, and are then written back to the frame buffer memory 18 in the same vertically arranged format as shown in FIG. 5.
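The deviation measurement and deformation described above can be illustrated as follows: after rectification, corresponding points must lie on the same scanline, so any residual row offset of a matched feature is the error to remove. This sketch uses a whole-image integer row shift as the simplest possible deformation (a real implementation warps per-pixel using the calibration data; the function names are illustrative):

```python
def vertical_epipolar_error(pt_left, pt_right):
    """For a rectified pair, a feature and its match must share a scanline
    (horizontal epipolar line). Return the residual row difference between
    the match in the right image and the feature in the left image."""
    (_, y_left), (_, y_right) = pt_left, pt_right
    return y_right - y_left

def shift_rows(image, dy):
    """Crude image deformation: shift every row of `image` (a 2-D list) up
    by the integer offset `dy`, zero-padding rows that fall outside, so the
    matched point is brought back onto its epipolar line."""
    h, w = len(image), len(image[0])
    blank = [0] * w
    return [list(image[y + dy]) if 0 <= y + dy < h else list(blank)
            for y in range(h)]
```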
- The calibration data stored in the correction information memory 7 includes a plurality of calibration data sets in order to take thermal deformation of the stereo optical module into account. That is, the temperature value detected by the temperature sensor 11 is input to the image output processing circuit 8 via the M-side communication interface 9, and based on the input temperature, the rectification processing section 15 selects the appropriate calibration data stored in the correction information memory 7, realizing rectification processing that is robust against temperature changes.
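The temperature-dependent selection can be sketched as a nearest-temperature lookup over the stored calibration sets (the table layout is an assumption; the patent only says that a set is selected based on the sensed temperature):

```python
def select_calibration(calib_table, temperature_c):
    """Pick, from a table of (nominal_temperature_c, calibration_data)
    pairs held in the correction information memory, the entry whose
    nominal temperature is closest to the value reported by the
    temperature sensor 11."""
    nominal_temp, data = min(calib_table,
                             key=lambda entry: abs(entry[0] - temperature_c))
    return data
```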
- The digital stereo image data written back to the frame buffer memory 18 is input to the compression processing section 14, and after data compression using a data compression method such as motion JPEG, the data is output to the communication line 2 via the M-side communication interface 9.
- a raindrop sensor 10 and a temperature sensor 11 are also connected to the M-side communication interface 9, and the outputs of these sensors can be output to the communication line 2 and the image output processing circuit 8.
- The communication line 2 has a plurality of USB communication lines for transmitting the rectification image data and the data signals for controlling the stereo optical module. These communication lines are connected to the M-side communication interface 9 of the stereo optical module 1 and the unit (u)-side communication interface 28 of the central control unit 3, respectively.
- The rectification image is transmitted over the USB signal line in a form compressed in the motion JPEG format. Environmental information, such as the raindrop sensor information detected by the raindrop sensor 10 and the temperature sensor information detected by the temperature sensor 11, as well as information such as photometric information of the subject, is also transmitted to the central control unit 3 via the USB communication line.
- Since the stereo image data is compressed by the compression processing section 14 as described above, the amount of information during communication can be reduced, and the amount of data other than the image data that can be transmitted and received can be increased.
- As described above, the communication line 2, that is, the USB communication line, transmits the rectification image data, the raindrop sensor information, the temperature sensor information, and the like.
- the stereo optical module 1 is provided with data generating devices such as a temperature sensor, a raindrop sensor, and a correction information memory that generate predetermined data in response to a control command from the central control unit 3.
- The communication line 2 is a bidirectional digital communication line, and various control commands are sent from the central control unit 3 to the stereo optical module 1 via the communication line 2. Although these control commands are not described in detail in the first embodiment, examples include a power control command for the stereo optical module 1 (a command for switching between an operation mode and a power saving mode), a request signal for reading out the temperature sensor information and raindrop sensor information to the central control unit 3, and a request signal for reading out the information stored in the correction information memory 7 to the central control unit 3.
- In this way, mutual data communication can be performed efficiently using a single communication line.
- Alternatively, the output of the image sensor 5 may be converted by the M-side communication interface 9 on the module side into an analog image signal conforming to a format such as NTSC, and then output to the communication line 2.
- In this case, the compression processing section 14 in the stereo optical module becomes unnecessary.
- On the other hand, the communication line 2 then needs two systems, an analog communication line and a digital communication line, so there are disadvantages such as the extra space required for routing the communication lines and for the communication connectors; however, there is the advantage that image data can be transmitted at higher speed.
- In this case, control data other than the image data may be transmitted using a character transmission area in which information is superimposed on the blanking signal.
- The central control unit 3 includes a u-side communication interface 28, an image input device 29, a frame memory 30, an EPROM 31, a distance image calculation unit 32, and a peripheral environment judgment unit and sequence controller 33 (hereinafter referred to as the surrounding environment judgment unit) serving as a range image evaluation device that performs object recognition and the like based on range images. Furthermore, the central control unit 3 also has an image output device 34, a pointing input device 36, an audio output device 37, an audio input device 40, and an indicator light output device 41. The above circuits and devices are connected via a bus, and can transfer data to each other.
- the image output device 34 and the pointing input device 36 are connected to the display input device 35.
- The audio output device 37 is connected to the speaker 38, and the audio input device 40 is connected to the microphone 39. Further, the indicator light output device 41 is connected to the indicator light 42.
- the rectification image input to the central control unit 3 via the communication line 2 is temporarily stored in the frame memory 30 via the u-side communication interface 28 and the image input device 29. Then, it is input to the distance image calculation unit 32.
- Note that the rectification image input to the central control unit 3 has been data-compressed according to the motion JPEG standard, and is therefore decompressed and restored by the image input device 29 described above.
- The distance image calculation unit 32 has a stereo image cutout unit 43 for extracting the left and right images from the stereo image, and a window matching unit 46. That is, the rectification image input to the distance image calculation unit 32 is cut out by the stereo image cutout unit 43 into left and right images 44 and 45. Window matching is then performed on the cut-out left and right images 44 and 45 in the window matching unit 46 to detect the displacement of each corresponding point, that is, the parallax amount. As a result, a parallax image 47 is generated. Thereafter, the parallax amount is converted into distance information in a distance image generation unit 48, and a distance image is generated based on the distance information and the calibration data 49 stored in the EPROM 31.
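Window matching of this kind slides a small window along the epipolar line (the same scanline, thanks to rectification) and keeps the horizontal shift with the lowest matching cost. A minimal sketch using the sum of absolute differences as the cost (the patent does not name a specific cost function, so SAD is an assumption):

```python
def sad(left, right, x_left, x_right, y, half):
    """Sum of absolute differences between two (2*half+1)-square windows."""
    total = 0
    for dy in range(-half, half + 1):
        for dx in range(-half, half + 1):
            total += abs(left[y + dy][x_left + dx] - right[y + dy][x_right + dx])
    return total

def disparity_at(left, right, x, y, max_disp, half=1):
    """Window matching along the epipolar line: for pixel (x, y) of the
    left image, return the shift d in [0, max_disp] minimising the SAD
    cost against the window centred at (x - d, y) in the right image."""
    best_d, best_cost = 0, float("inf")
    for d in range(max_disp + 1):
        if x - d - half < 0:       # window would leave the right image
            break
        cost = sad(left, right, x, x - d, y, half)
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d
```

Repeating this for every pixel yields the parallax image; real systems add sub-pixel interpolation and consistency checks.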
- the term “distance image” refers to an image having distance information for each pixel of a captured subject image.
- the calibration data 49 is data for distance information correction selected according to the raindrop sensor information and the temperature sensor information input via the communication line 2. After that, the generated distance image is sent to the surrounding environment judgment unit 33.
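Converting the parallax amount into distance information, as the distance image generation unit 48 does, follows the standard pinhole-stereo relation Z = f·B/d, where f is the focal length in pixels, B the stereo baseline, and d the disparity; the calibration data supplies f and B. A sketch (the parameter values in the usage note are illustrative):

```python
def disparity_to_distance(d_pixels, focal_px, baseline_m):
    """Pinhole-stereo triangulation: Z = f * B / d.
    d_pixels  -- disparity from window matching, in pixels
    focal_px  -- focal length expressed in pixels (from calibration data)
    baseline_m -- distance between the two effective optical axes, metres
    A zero disparity corresponds to a point at infinity."""
    if d_pixels <= 0:
        return float("inf")
    return focal_px * baseline_m / d_pixels
```

For example, with a 500-pixel focal length and a 0.2 m baseline, a 10-pixel disparity corresponds to a point 10 m away.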
- Like a general personal computer or a microcomputer-based control device, the peripheral environment judgment unit 33 includes a central processing unit (CPU) 50, a main memory (RAM) 51, a digital signal processing section (DSP) 52, a hard disk (HDD) 53, and the like connected to a communication bus, and is configured to implement a sequence that evaluates range images and recognizes objects, as well as various other sequence controls.
- The CPU 50 loads the object recognition processing program stored in the HDD 53 into the RAM 51 and executes it, and, based on the distance image obtained via the stereo optical module, performs detection of obstacles (for example, pedestrians, vehicles, and falling objects) and recognition of characteristic environmental information (for example, white lines on roads, guard rails, and signs).
- Based on the result of the recognition processing, the CPU 50 performs controls related to driving assistance, such as controlling the image output device 34 to display the recognition result on the display input device 35, controlling the audio output device 37 to issue, for example, a warning from the speaker 38 that the vehicle is approaching an obstacle, and controlling the indicator light output device 41 to light the indicator light 42 so as to call the driver's attention.
- Instructions can be input via manual operation members such as the display input device 35 and via audio input devices such as the microphone 39.
- In addition to the stereo optical module 1 and the central control unit 3, as shown in FIG. 7, the communication line 2 is connected to a radar module 54 on which radar equipment is mounted, a vehicle speed sensor module 55 for detecting the speed of the vehicle, a car navigation module 56 for notifying the driver of the vehicle position, an engine control module 57 for controlling the engine, a side camera module 58 for monitoring the sides of the vehicle, an indoor monitoring module 59 for monitoring the inside of the vehicle, a brake sensor module 61 for controlling the brakes, a rear camera module 62 for monitoring the rear of the vehicle, a load detection module 63 for detecting the load in the vehicle, a rear ultrasonic radar module 64 on which ultrasonic radar equipment is mounted, and the like, so that mutual coordination between the various in-vehicle modules can be achieved.
- Note that wireless communication technologies such as Bluetooth, IEEE 802.11, and IEEE 802.1x may also be used to connect these modules.
- Alternatively, wired networks such as USB, IEEE 1394, and Ethernet may be used, together with audio data transmission protocols such as voice over IP (VoIP) and real-time messaging functions (RTC/SIP client), or an in-vehicle LAN may be built with an IP address assigned to each device using IPv6 or the like.
- Further, a communication bus line compliant with an open standard such as the in-vehicle optical network standard "MOST" or "CAN", which is resistant to in-vehicle noise and the like, may be used.
- the first embodiment has the following unique effects.
- In the stereo optical module, the image processing required of the stereo optical module, such as rectification, γ correction, shading correction, and separation into a luminance signal and color difference signals, is performed.
- the circuit load on the central control unit can be reduced, the specifications on the central control unit can be simplified, and the versatility of the system can be improved.
- Also, digital image data can be transmitted at a frame rate similar to that of analog communication.
- Since the stereo optical module 1 is provided with the temperature sensor 11, temperature correction during rectification processing is performed appropriately. For this reason, the window-matching search performed by the central control unit 3 only needs to be carried out along the epipolar line, and higher-speed processing can be realized.
- each configuration of the first embodiment can of course be variously modified and changed.
- For example, a plurality of the stereo optical modules 1 described above may be used. In that case, it may be made possible to clearly identify from which stereo optical module each rectification image was obtained.
- Since the stereo optical module 1 includes processing circuits for γ conversion processing, C-Y conversion processing, compression processing, and the like, image processing equivalent to that of a video camera or digital still camera can be performed.
- The correction information memory 7 of the stereo optical module 1 may store not only the correction information for the rectification processing and the shading correction processing, but also correction information that takes into account how each stereo optical module is mounted on the vehicle.
- In this case, the EPROM 31 in the central control unit can be omitted or its storage capacity can be reduced.
- Also, information obtained by calibration in the central control unit 3 may be transmitted to the stereo optical module 1 or the like by communication, and this information may be stored in the stereo optical module 1.
- Further, the calibration data used in the distance image calculation unit 32 of the central control unit 3 may also be stored in the correction information memory 7 of the stereo optical module 1, and read out from the stereo optical module 1 by the central control unit 3 when the distance image is calculated.
- That is, a part of the correction information (calibration data) stored in the correction information memory 7 may be used for the image processing calculations in the stereo optical module, while other correction information is transmitted to the central control unit 3 via the communication line 2 so that the central control unit can use it for corrections such as the distance calculation.
- The various internal processing circuits of the central control unit 3 may be realized as hardware using dedicated application-specific integrated circuits (ASICs), or, of course, may be built with a dynamically rewritable reconfigurable processor.
- The communication line 2 between the stereo optical module 1 and the central control unit 3 may be insulated so that it is not affected by various spike noises applied to the harness.
- As the format of the video sent to the communication line 2, in the analog case, the PAL format or the SECAM format may be used besides the NTSC format.
- In the digital case, transmission may be performed according to a standard such as the JPEG system for compressing still images, the MPEG system for compressing moving images, or DV (digital video).
- A GUI or voice interface built with a user interface design tool for in-vehicle devices, such as the Automotive UI Toolkit of Microsoft (registered trademark), may be used.
- The field masks 19a and 19b are drawn so as to cover half of the front lens units 20a and 20b; however, in order to regulate the field of view more effectively, a field mask 19 having a rectangular opening 25 may be arranged at a position offset from the optical axis of the front lens unit 20.
- Alternatively, the lens itself may be D-cut on four sides to obtain the same effect as a visual field mask.
- The central control unit 3 may be configured to have a stereo optical system calibration device 65 for carrying out calibration of the stereo optical system, a vehicle-based calibration device 67 for carrying out vehicle-based calibration in which the positional relationship between the vehicle and the stereo optical module 1 is measured, and the like.
- The correction information stored in the correction information memory 7 is assumed to be the shading correction data and the calibration data for the rectification processing; of course, calibration data concerning the positional relationship between the vehicle and the module may also be added. Further, the correction information memory 7 may be arranged on the central control unit 3 side, and a configuration may be adopted in which it can be read and written as appropriate via the communication line 2.
- In the second embodiment, the stereo optical module 1 does not carry the processing circuits such as the rectification processing section 15, the γ conversion processing section 16, and the shading correction processing section 17, or the correction information memory 7; instead, these are mounted on the central control unit 3 as shown in FIG. 11, so that the circuit scale of the electronic circuits in the stereo optical module can be reduced. That is, heat generation in the stereo optical module can be suppressed, and fluctuation of the calibration data due to thermal deformation of the optical system and the like in the stereo optical module can be suppressed. Also, since cooling of the image sensor 5 can be performed efficiently by reducing the number of other heat-generating elements, generation of thermal noise can be suppressed, and a better stereo image can be output to the communication line 2.
- each configuration of the second embodiment can be variously modified and changed in the same manner as the first embodiment.
- The central control unit in the third embodiment is obtained by removing the distance image calculation unit from the central control unit of the first embodiment shown in FIG. 1; instead, the distance image calculation unit 32 is built into the stereo optical module 1, and a two-dimensional image and a range image used for object recognition are output to the communication line 2.
- The third embodiment also differs from the first embodiment in that image data is sent out from the M-side communication interface 9 in the stereo optical module in an analog system conforming to NTSC.
- the two-dimensional image and the distance image are rearranged side by side in, for example, one NTSC-based screen, and output as a video signal.
- Information such as whether the left or right image was used as the reference image, the temperature detected by the temperature sensor 11, and the information detected by the raindrop sensor 10 can also be encoded and transmitted in an unused signal area, enabling communication synchronized with the image.
- The central control unit 3 realizes various recognition processes by decoding the image at the u-side communication interface 28 and processing it as a two-dimensional image, a distance image, and sensor information. More specifically, in the central control unit 3, the peripheral environment judgment unit 33, which is a distance image evaluation device, evaluates the photographed subject based on the distance image data input via the u-side communication interface 28.
- It is also possible to use a plurality of these stereo optical modules.
- In that case, a code indicating the camera ID of each stereo optical module may be embedded in the upper and lower images on the screen, and the range image may be transmitted to the central control unit 3 in synchronization with each image.
- The various internal processing circuits may be realized by hardware connections using dedicated ASICs, or, of course, may be built with a dynamically rewritable reconfigurable processor.
- In the third embodiment, the two-dimensional image and the range image used for object recognition are output using an analog signal line of the NTSC system; however, they may instead be output as digital data to a digital communication line. In this case, data compression may be performed using a data compression method such as motion JPEG or MPEG before output to the communication line 2.
- The stereo optical module 1, the central control unit 3, driving condition detection modules, environment monitoring modules, and the like are connected to the communication line 2, as shown in FIG. 7.
- Examples of the driving condition detection modules include the vehicle speed sensor module 55, the engine control module 57, and the brake sensor module 61 shown in FIG. 7.
- Examples of the environment monitoring modules include the radar module 54, the car navigation module 56, the side camera module 58, the indoor monitoring module 59, the rear camera module 62, the load detection module 63, and the rear ultrasonic radar module 64 shown in FIG. 7.
- To connect these modules, wireless communication technologies such as Bluetooth, IEEE 802.11, and IEEE 802.1x may also be used.
- Alternatively, wired networks such as USB, IEEE 1394, and Ethernet, voice data transmission protocols such as VoIP (voice over IP) and real-time messaging functions (RTC/SIP client), or an in-vehicle LAN with an IP address assigned to each device using IPv6 or the like may be used.
- a communication bus line compliant with the open standard such as the in-vehicle optical network standard “MOST” or “CAN” that is resistant to in-vehicle noise and the like may be used.
- For input and display, a GUI or voice interface built with a user interface design tool for in-vehicle equipment, such as the Automotive UI Toolkit of Microsoft (registered trademark), may be used.
- As described above, the stereo image data obtained in the stereo optical module is subjected to rectification, γ correction, shading correction, separation into a luminance signal and color difference signals, and the like, and the calculation of the distance image is also performed in the stereo optical module. Therefore, compared with the first embodiment, the circuit load on the central control unit can be further reduced, the specifications of the central control unit can be further simplified, and the versatility of the central control unit can be increased.
- In the fourth embodiment, an operation sequence of the stereo camera system for performing better distance image calculation will be described.
- the trigger signal in the case of in-vehicle use is taken as an example.
- The fourth embodiment shows the operation sequence for the stereo camera system of the first embodiment; needless to say, it is also applicable to configurations like those of the second or third embodiment.
- First, the central control unit 3 in the sleep state is resumed (step S1) and activated (step S2).
- the central control unit 3 activates the image output processing circuit 8 and the imaging device 5 of the stereo optical module 1 at the same time as the activation (steps S3, S4).
- Specifically, the central control unit 3 sends a "startup command" to the stereo optical module 1 via the communication line 2 to activate the stereo optical module 1.
- Upon receiving this command, the stereo optical module 1 switches the internal power supply circuit (not shown) from the power saving mode to the operation mode, subsequently starts the image output processing circuit 8 and the image sensor 5, and imaging is started.
- Next, the central control unit 3 starts the sensor check sequence shown in FIG. 15 (step S5). The details of this sensor check sequence will be described later.
- Rectification processing is then performed (step S7), and the resulting rectification image is sent to the central control unit 3 via the communication line 2.
- In the central control unit 3, for example in the surrounding environment judgment unit 33, it is checked from the rectification image stored in the frame memory 30 whether or not the calibration data 101 used for the rectification processing is appropriate (step S8).
- If the calibration is determined not to be appropriate, the rectification image at this time is deleted from the frame memory 30 so that it is not used for the subsequent processing.
- Then, calibration data correction processing described later is performed (step S9), the calibration data corrected by this processing is sent to the stereo optical module 1 via the communication line 2, and the calibration data 101 stored therein is updated (step S10).
- In the rectification processing of step S7, the rectification processing is performed using the updated calibration data 101. As a result, the rectification processing is always executed with the latest, correct calibration data.
- On the other hand, if the calibration is appropriate, the distance image calculation unit 32 in the central control unit 3 performs window matching processing on the input rectification image (step S12), generates a distance image, and outputs the generated distance image to the surrounding environment judgment unit 33 (step S13). Using this distance image, the surrounding environment judgment unit 33 of the central control unit 3 performs driving support described later (step S14).
- If step S15 branches to "yes", the current calibration data is recorded (step S16), the central control unit 3 enters the sleep state, and the stereo optical module 1 also stops operating.
- At this time, an "operation stop command" for stopping the operation of the stereo optical module 1 is transmitted from the central control unit 3 to the stereo optical module 1 via the communication line 2, and the stereo optical module 1 enters the power saving mode after performing a predetermined operation in response to this command.
- Next, the sensor check sequence of step S5 in FIG. 14 will be described with reference to FIG. 15.
- The description assumes that the stereo optical module 1 is mounted on the rear-view mirror in a vehicle and images the front of the vehicle body through the front window glass.
- First, the central control unit 3 detects cloudiness and dew condensation on the front window glass by a known method (step S21) and determines whether there is no cloudiness or condensation on the vehicle front window glass (step S22). If it is determined that there is cloudiness, the central control unit 3 turns on a defroster (not shown) to start removing it (step S23) and also displays a cloudiness warning on the display input device 35 (step S24). When the cloudiness is removed by the effect of the defroster, the central control unit 3 determines in step S22 that there is no cloudiness and turns off the cloudiness warning display (step S25).
- Next, the central control unit 3 detects raindrops based on the output of the raindrop sensor 10 sent via the communication line 2 (step S26) and determines whether there are no raindrops on the front window glass (step S27). If it is determined that raindrops are present, the central control unit 3 operates a wiper (not shown) to start removing them (step S28) and displays a rain warning on the display input device 35 (step S29). When the raindrops are eliminated by the effect of the wiper, the central control unit 3 determines in step S27 that there are no raindrops and turns off the rain warning display (step S30).
- Next, the central control unit 3 detects the temperature around the stereo optical module 1 with the temperature sensor 11 (step S31) and determines whether the temperature is within a range suitable for the operation of the stereo optical module 1 (step S32). If the detected temperature is not within the appropriate range, the central control unit 3 operates an air conditioner (not shown) to perform temperature control (step S33) and displays a temperature warning on the display input device 35 (step S34). When the temperature falls within the appropriate range through the effect of the air conditioner, the central control unit 3 determines in step S32 that the temperature is appropriate and turns off the temperature warning display (step S35).
- In step S36, it is determined whether or not a stop signal has been input.
- If not, the detected environmental information data 100, such as cloudiness information, raindrop sensor information, and temperature sensor information, is output via the communication line 2 to the rectification processing section 15 of the stereo optical module 1 (step S37), and the flow returns to step S21 to continue the sensor check.
- The above environmental information data 100 is output only when the cloudiness, dew condensation, raindrop, and temperature conditions satisfy the predetermined criteria. If they do not, the rectification processing using the environmental information data 100 is not started even when a stereo image is captured by the image sensor 5. In such a case, it is preferable to show a warning on the display input device 35 informing the driver that the system is not operating because the specified conditions have not been met, rather than because of a failure.
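One pass of this sensor check sequence (steps S21 to S37) can be sketched as follows. The sensor and actuator interfaces (`env`, `actuators`, `warnings` dictionaries and their key names) are hypothetical stand-ins for the hardware described above; the sketch only preserves the rule that the environmental information data 100 is withheld while any condition is bad.

```python
def sensor_check_step(env, actuators, warnings):
    """One pass of the sensor check: for each bad condition, start its
    countermeasure and show a warning; clear the warning otherwise.
    Return the environmental information only when every check passes."""
    checks = [
        ("cloud", "defroster"),                    # cloudiness -> defroster
        ("rain", "wiper"),                         # raindrops  -> wiper
        ("temp_out_of_range", "air_conditioner"),  # temperature -> A/C
    ]
    all_ok = True
    for condition, actuator in checks:
        if env[condition]:
            actuators[actuator] = True   # start the countermeasure
            warnings[condition] = True   # warning on the display input device
            all_ok = False
        else:
            warnings[condition] = False  # turn the warning off
    if not all_ok:
        return None                      # data 100 is not output
    return {"temperature": env["temperature"]}   # data 100 sent to module
```

In the actual system this pass repeats until a stop signal arrives, with the countermeasures eventually clearing the bad conditions.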
- FIG. 16 is a flowchart showing the sequence of the calibration data correction in step S9 of FIG. 14. As shown in FIG. 16, the surrounding environment determination unit 33 first determines whether environmental conditions, such as the temperature information detected by the temperature sensor 11 and obtained via the communication line 2, have changed (step S41). For example, when the calibration deviates owing to thermal expansion caused by a temperature change or the like, the calibration data 101 corresponding to the current temperature is read out from a temperature-indexed calibration data table 102 prepared in advance in the EEPROM 31 or the HDD 53 and written to the correction information memory 7 of the stereo optical module 1 via the communication line 2 (step S47), thereby updating the calibration data 101.
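The lookup in the temperature-indexed calibration data table 102 can be sketched as below. The table contents and the nearest-temperature selection rule are illustrative assumptions; the patent only specifies that calibration data 101 corresponding to the measured temperature is read from a table prepared in advance.

```python
def select_calibration(table, temperature_c):
    """Pick the calibration data whose reference temperature is closest
    to the measured one (a sketch of reading data 101 from table 102)."""
    ref = min(table, key=lambda t: abs(t - temperature_c))
    return table[ref]

# Hypothetical table 102: reference temperature (deg C) -> calibration
# data 101 (dummy focal-length values for illustration only).
table_102 = {-20: {"fx": 801.2}, 25: {"fx": 800.0}, 60: {"fx": 798.9}}
```

The selected entry would then be written to the correction information memory 7 so that subsequent rectification uses it.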
- If the environmental conditions have not changed (step S41), an image of a known shape, here a stereo image, is acquired from the stereo optical module 1 via the communication line 2 (step S42), and it is determined whether the mounting conditions, such as the mounting position of the stereo optical module 1, have changed (step S43).
- If the mounting conditions have changed, the calibration data is corrected according to the change (step S44), and the corrected calibration data 103 is added to the above calibration data table 102 and written to the correction information memory 7 of the stereo optical module 1 via the communication line 2, updating the calibration data used for the rectification processing.
- FIG. 17 is a flowchart showing an example of the driving support sequence in step S14 of FIG. 14.
- Entering the driving support sequence, the surrounding environment judging unit 33 takes as input a normal image 104 (one of the images cut out by the stereo image cutting unit 43) and a distance image 105, performs segmentation processing based on the distance distribution (step S52), and then performs preceding vehicle recognition (steps S54 to S57), road surface recognition (steps S58 to S59), obstacle recognition (steps S61 to S70), and lane recognition (steps S71 to S73).
- In the preceding vehicle recognition, the preceding vehicle detected by the radar module 54, which uses a laser radar or the like to sense the laser beam reflected from the vehicle, is extracted from the normal image 104, and the area of the preceding vehicle is recognized by matching it against the segmentation result of the distance image 105 (step S54). Information including the vehicle width of the preceding vehicle is then extracted to obtain the inter-vehicle distance to the preceding vehicle (step S55).
- Next, the distance to the preceding vehicle is checked (step S56). If the distance is not appropriate (for example, too close), an image display such as that shown in FIG. 18 is presented on the display input device 35 via the image output device 34, and a following-distance warning is executed (step S57).
- In the road surface recognition, a plane recognizable from the distance image 105 is found at the position where the road surface is assumed to be (step S58), and the range continuous with that plane is recognized as the road surface. This road plane is then set as the reference plane (step S59).
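Finding the road plane from 3-D points of the distance image can be sketched with a least-squares plane fit. The coordinate convention (x lateral, y height, z forward), the plane model y = a·x + b·z + c, and the membership tolerance are assumptions for illustration; the specification does not prescribe a fitting method.

```python
import numpy as np

def fit_road_plane(points):
    """Least-squares fit of a plane y = a*x + b*z + c to 3-D points taken
    where the road surface is assumed to be (step S58). Returns (a, b, c)."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 2], np.ones(len(pts))])
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 1], rcond=None)
    return coeffs

def on_plane(point, coeffs, tol=0.05):
    """Points within tol of the fitted plane are labelled road surface;
    the plane itself becomes the reference plane (step S59)."""
    a, b, c = coeffs
    x, y, z = point
    return abs(y - (a * x + b * z + c)) < tol
```

Points that rise clearly above this reference plane are the candidates handed to the obstacle recognition that follows.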
- By supplying this reference plane data 106 to each recognition process, it can be used for object recognition, preceding vehicle recognition, lane recognition, and so on. In this case, as shown in FIG. 19, the recognized plane may be displayed on the display input device 35 via the image output device 34.
- In the obstacle recognition, an object rising from the plane on which the road surface was detected is recognized by the segmentation process of the distance image 105 (step S61) and labeled as an obstacle with its distance and size. The distance of the obstacle is extracted (step S62), and it is determined whether the vehicle is on a collision course (step S63). If it is, an avoidance warning urging the operator to take evasive action is issued by the display input device 35 via the image output device 34 (step S64).
- In step S65, it is determined whether the distance extracted in step S62 allows a safe stop by the operator's, that is, the driver's, voluntary braking operation. If it does, a warning prompting the operator to press the brake is issued by the display input device 35 via the image output device 34 (step S66). If it does not, a braking instruction is given by the display input device 35 via the image output device 34 and the accelerator is automatically released (step S67).
- It is then determined whether an avoidance operation can still be performed (step S68). If it can, forcible safety measures for emergency driving, such as prompting a steering-wheel operation on the display input device 35 via the image output device 34, are taken (step S69). If it cannot, passive safety devices such as an airbag are prepared, and an accident report, such as the current location or a report of the accident occurrence, is made automatically (step S70).
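The escalation of steps S63 to S70 can be condensed into a decision cascade like the following. All numeric thresholds (reaction time, deceleration, the 1.5× and 0.5× margins) are hypothetical; the patent specifies the ordering of interventions, not these values, and this sketch folds the steering/passive-safety branch into a single final stage.

```python
def collision_response(distance_m, speed_mps, reaction_s=1.0, decel=6.0):
    """Compare obstacle distance with the distance needed for the driver's
    voluntary stop and escalate: warn -> prompt brake -> release the
    accelerator -> emergency / passive-safety measures."""
    # Distance covered during driver reaction plus braking distance.
    stopping = speed_mps * reaction_s + speed_mps ** 2 / (2 * decel)
    if distance_m > stopping * 1.5:
        return "avoidance_warning"        # S64: urge avoidance
    if distance_m > stopping:
        return "brake_warning"            # S66: prompt voluntary braking
    if distance_m > stopping * 0.5:
        return "release_accelerator"      # S67: automatic accelerator release
    return "passive_safety"               # S69/S70: emergency measures
```

At 20 m/s the assumed stopping distance is about 53 m, so the cascade hands back progressively stronger interventions as the obstacle distance shrinks below about 80 m.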
- The lane recognition is based on recognizing a white line on the reference plane from the luminance values of the normal image 104 and on recognizing a guardrail projecting at a certain height from the reference plane from the distance image 105; from these, the lane is recognized (step S71).
- Next, the driving range is recognized, and it is checked that the vehicle has not deviated from the lane (step S72). If it has, a lane warning is displayed on the display input device 35 via the image output device 34 (step S73).
- Each of the above warnings is displayed on the screen of the display input device 35 connected to the central control unit 3 shown in FIG. 6, and may also be given by means such as an audible warning from the speaker 38, lighting of the indicator lamp 42, or vibration of the operator's seat (not shown).
- In other words, the warning may be given in any form that calls the operator's attention. This is particularly effective when urgency is high, as in steps S64, S66, and S67 described above, and it is undesirable for the operator to look away from the road ahead.
- The correction of the calibration data may also be performed by selecting from a plurality of calibration data tables prepared in advance according to temperature, etc.
- The means for responding to a detected problem are not limited to the defroster, the wiper, and the air conditioner described in the fourth embodiment; other means may be substituted. For example, an anti-fog coating or a hot-wire heater may be used instead of the defroster, a super-water-repellent glass coating or a blower instead of the wiper, and a Peltier element or an air-cooling fan instead of the air conditioner.
- As shown in FIG. 22, when the condition of the environmental information data 100 is inappropriate (step S87: no) and the rectification processing cannot be performed, the sensor check sequence is started, the output processing of the distance image is stopped, and a processing-unavailable warning is displayed on the display input device 35 via the image output device 34 (step S88) so that the operator, such as the driver, is informed.
- As shown in FIG. 23, the defroster, wiper, and air conditioner may of course be turned off (steps S105, S111, S117) once they are no longer needed. Environmental information data (temperature sensor information, etc.) may also be output periodically.
- Alternatively, each environmental condition may be checked in parallel (steps S122, S128, S134) and operations for eliminating the inability to measure (steps S123, S129, S135) may be performed.
- The system may perform the operation for eliminating the measurement inability automatically, or the operator may simply be notified by a warning indicating that the processing cannot be performed.
- Cloudiness, raindrops, and temperature are presented here as typical environmental information, but information such as fog, image sensor trouble, day/night switching, illuminance, window glass stained or shielded by dead leaves, bird droppings, or a caught mascot, or the relative position of an oncoming vehicle may also be used as environmental information.
- the fourth embodiment can also be used in the case of a stereo camera using a plurality of image sensors.
- This fifth embodiment will be described with reference to a system having a plurality of optical modules.
- the configuration of each optical module conforms to the above-described first to fourth embodiments.
- A plurality of stereo optical modules 1a and 1b and monocular optical modules 71a and 71b are connected to the communication line 2 as shown in FIG.
- The central control unit 3 is connected to the communication line 2, and a driving condition detection module 72, an environment monitoring module 60, and the like are also connected.
- Examples of the monocular optical module include a side camera module 58 and a rear camera module 62 shown in FIG.
- the driving condition detection module 72 includes an engine control module 57, a brake sensor module 61, a steering angle detection module (not shown), a transmission monitoring module, and the like, as shown in FIG.
- Examples of the environment monitoring module 60, as shown in FIG. 7, include a radar module 54, a vehicle speed sensor module 55, a car navigation module 56, an indoor monitoring module 59, a load detection module 63, a rear ultrasonic radar module 64, and the like.
- The image information obtained by the optical modules is transmitted and received together with information such as the camera ID of each optical module and its calibration information. By having the central control unit 3 identify these, it can determine which optical module's information should be processed with which calibration data.
- The communication line 2 is a single shared line, and the central control unit 3 and each unit, including the stereo optical module 1a, can communicate efficiently over it. In this communication, the central control unit 3 supervises the sequence control of the whole system: the central control unit 3 sends various control commands, such as data request commands, to each unit, and each unit returns responses to these commands as well as the requested data to the central control unit 3.
- Sharing the communication line 2 in this way reduces the in-vehicle wiring harness, yielding benefits such as reduced weight and reduced susceptibility to noise interference.
- the sixth embodiment is an embodiment of the raindrop sensor 10 mounted on the stereo optical module.
- In a conventional raindrop sensor, light from an infrared or visible light-emitting diode (LED) 75 is projected into a glass block 74 held in close contact with the window glass 73. This light is condensed by a light-projecting lens 76, and the reflection of the condensed beam at the outer glass surface is guided by a reflection mirror 77 or the like to a light-receiving lens 78, which collects it onto a light-receiving sensor 79 such as a photodiode.
- The conventional raindrop sensor therefore had to be provided in close contact with the glass surface, which imposed constraints on space and cost. It also obstructs the driver's field of view, and if a stereo optical module is installed to detect obstacles ahead of the vehicle in addition to the raindrop sensor, the raindrop sensor may block the field of view of the stereo optical module, so its position relative to the stereo optical module 1 becomes a problem. The raindrop sensor according to the sixth embodiment therefore detects raindrops using an image captured by the stereo optical module 1.
- Using a stereo optical system in which the horizontally long images of the left and right parallax are divided into upper and lower halves and formed on the image sensor, as shown in FIG. 5, a horizontally long stereo image such as that shown in FIG. 27 is obtained.
- Objects at a long distance have small parallax, while objects at a short distance have large parallax. In the optical system described in the above embodiments, the measurable parallax range can be widened, so that within a certain region of the stereo image it is possible to cover everything from a very short distance (for example, with a baseline length of more than ten centimeters, from ten to several tens of centimeters) to infinity.
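The distance range covered by a given disparity search window follows directly from the triangulation relation Z = f·B/d. The sketch below makes that relation explicit; the focal length and baseline values in the usage are illustrative, not taken from the specification.

```python
def coverage(focal_px, baseline_m, d_min, d_max):
    """Distance range (far, near) in metres covered by a disparity search
    window [d_min, d_max] in pixels, from Z = f * B / d. A disparity of
    zero corresponds to infinity, a large disparity to a very short range."""
    far = float("inf") if d_min == 0 else focal_px * baseline_m / d_min
    near = focal_px * baseline_m / d_max
    return far, near
```

With, say, an 800-pixel focal length and a 0.12 m baseline, searching disparities 1 to 480 pixels spans roughly 0.2 m to 96 m, which is why one image region can serve both raindrop-range and road-range measurement.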
- FIG. 28 shows the angles of view of the stereo images of the two left and right viewpoints of the stereo optical module 1 used in the sixth embodiment. In the sixth embodiment, the stereo optical module 1 is arranged so that at least the windshield region of the vehicle in which raindrops are to be detected falls within the overlap region 83 where the angles of view of the left and right viewpoints overlap.
- The search range for raindrop detection in the example of FIG. 27 is the distance region where the window glass exists (the shaded area in the figure) within the overlap region 83 of FIG. 28.
- Normally, the search width for corresponding points on the epipolar line is shortened in order to speed up the processing. If the measurable parallax range is increased, however, the search width widens, so that searching for corresponding points takes more time and power.
- Therefore, the mode is switched between raindrop detection and detection of obstacles and vehicles. In the raindrop detection mode, as shown in FIG. 29, window matching is performed within the range of the reference image 80 corresponding to short distances. In the mode for detecting obstacles and the like, the search range is instead limited to the parallax amounts corresponding to, for example, 3 m to 100 m.
- Alternatively, the search range on the search image 81 side may be limited to a narrow distance region including the surface of the window glass 73, as shown in FIG. 30 (for example, a narrow band along the window shape, the shaded area in the figure). That is, the search range may be offset so that only the ranges for detecting raindrops at the left end of the right image and the right end of the left image shown in FIG. 27 are searched. Limiting the search range in this way enables faster raindrop detection by window matching.
- Further, the search range may be inclined in accordance with the inclination of the window glass 73: since the upper end of the screen corresponds to a shorter distance and the lower end to a longer distance, window matching may be performed while reducing the offset amount of the search range from the search at the top edge of the screen toward the search at the bottom edge, as shown in FIG. 30.
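This row-dependent offsetting of the disparity search band can be sketched as a linear interpolation between the glass disparity at the top and bottom of the screen. The disparity values (60 and 40 pixels) and the band half-width are hypothetical; only the shape of the scheme (narrow band around the glass surface, offset shrinking from top to bottom) follows the text.

```python
def glass_disparity_offset(row, n_rows, d_top, d_bottom):
    """Disparity offset of the inclined windshield as a function of image
    row: the top of the screen is nearer (larger disparity) than the
    bottom, interpolated linearly between the two."""
    t = row / (n_rows - 1)
    return round(d_top + t * (d_bottom - d_top))

def raindrop_search_range(row, n_rows, d_top=60, d_bottom=40, half_width=5):
    """Narrow disparity band around the glass surface for the raindrop
    detection mode, instead of a full near-to-far disparity search."""
    centre = glass_disparity_offset(row, n_rows, d_top, d_bottom)
    return centre - half_width, centre + half_width
```

Restricting window matching to this band keeps the per-row search a few pixels wide, which is what makes the raindrop mode fast.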
- As described above, in the sixth embodiment the function of a raindrop sensor is realized by window matching limited to short distances, so that raindrops, stains on the glass, and the like can be detected without adding new equipment to the central control unit 3 or the stereo optical module 1.
- By performing ordinary obstacle distance measurement by matching in the small-parallax range and raindrop detection by matching in the large-parallax range, either in parallel or in a time-divided manner, the raindrop detection function can be realized while the distance to obstacles is being measured.
- In the sixth embodiment, an example in which a captured image is processed passively has been described, but an active method that detects raindrops from scattered light may be used in combination. In that case, it is sufficient to switch between raindrop detection and distance measurement in a time-series manner and project the light into the right or left visual field of the stereo optical module.
- Images outside the overlap region shown in FIGS. 31 and 32 are captured as two-dimensional images, and as shown in FIG. 33, a light beam such as infrared light may be projected from the light beam projector 82 onto the region outside the overlap region to detect raindrops.
- In the region outside the overlap region, the viewpoints do not overlap; the convergence angle may therefore be opened outward when setting the viewpoints.
- Since the region outside the overlap region is not used for stereo measurement, raindrops can be detected by projecting infrared light only into this viewing range.
- A coating 84 for detecting raindrops may also be applied to the surface of the window glass 73 corresponding to an area outside the overlap region.
- For example, a diffusing surface may be provided so that the glass becomes frosted and the transmissivity of a portion to which raindrops adhere increases.
- A water-repellent sheet or a hydrophilic sheet may also be applied to the window so that the presence or absence of raindrops can be determined more clearly.
- In this case, the focus may be set at a short distance, at least on the window glass surface.
- The detection of raindrops need not be limited to distance measurement by the stereo camera system; raindrops may also be detected from a two-dimensional image by capturing infrared light, extracting shapes, detecting colors, and the like. If these detections are performed in an area that is not used for distance measurement, raindrop detection can be realized with only a single image sensor without adversely affecting the distance measurement.
- In this case, by removing in advance the infrared cut filter over the portion of the image sensor that captures the area outside the overlap region, or by substituting a band-pass filter, more effective measurement can be realized.
- A means for performing the above-mentioned raindrop detection on the portion of the window glass corresponding to the periphery of the screen may also be provided.
- A so-called pan-focus stereo optical system may also be configured, having a depth of field such that a subject can be photographed from the vicinity of the window glass to a predetermined distance ahead (for example, 50 to 300 m) without providing a focusing mechanism. In this case, it is possible to focus roughly both on raindrops or matter attached to the window glass and on obstacles ahead, without a focusing mechanism.
- Although the present invention has been described based on the embodiments, the present invention is not limited to the above-described embodiments, and it goes without saying that various modifications and applications are possible within the scope of the present invention.
- In the above embodiments, a stereo camera system mounted on a vehicle is described, but the present invention can also be used, for example, for distance measurement on a moving object such as a robot.
- The above-described embodiments include inventions at various stages, and various inventions can be extracted by appropriately combining the disclosed constituent features. For example, even if some components are deleted from all the components shown in an embodiment, as long as the problem described in the section on the problem to be solved by the invention can be solved and the effect described in the section on the effect of the invention can be obtained, a configuration from which those components are deleted can be extracted as an invention.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP04735383A EP1637837A1 (en) | 2003-05-29 | 2004-05-28 | Stereo camera system and stereo optical module |
JP2005506574A JPWO2004106858A1 (en) | 2003-05-29 | 2004-05-28 | Stereo camera system and stereo optical module |
US11/288,647 US7386226B2 (en) | 2003-05-29 | 2005-11-29 | Stereo camera system and stereo optical module |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003153452 | 2003-05-29 | ||
JP2003-153452 | 2003-05-29 | ||
JP2003153454 | 2003-05-29 | ||
JP2003-153454 | 2003-05-29 | ||
JP2003153453 | 2003-05-29 | ||
JP2003-153453 | 2003-05-29 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/288,647 Continuation US7386226B2 (en) | 2003-05-29 | 2005-11-29 | Stereo camera system and stereo optical module |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004106858A1 true WO2004106858A1 (en) | 2004-12-09 |
Family
ID=33493928
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/007759 WO2004106858A1 (en) | 2003-05-29 | 2004-05-28 | Stereo camera system and stereo optical module |
Country Status (4)
Country | Link |
---|---|
US (1) | US7386226B2 (en) |
EP (1) | EP1637837A1 (en) |
JP (1) | JPWO2004106858A1 (en) |
WO (1) | WO2004106858A1 (en) |
US10782691B2 (en) | 2018-08-10 | 2020-09-22 | Buffalo Automation Group Inc. | Deep learning and intelligent sensing system integration |
KR102613098B1 (en) * | 2019-01-23 | 2023-12-12 | 한화비전 주식회사 | Image Sensor module |
JP7207038B2 (en) | 2019-03-14 | 2023-01-18 | 株式会社リコー | IMAGING DEVICE, IMAGING OPTICAL SYSTEM AND MOVING OBJECT |
US11006093B1 (en) * | 2020-01-22 | 2021-05-11 | Photonic Medical Inc. | Open view, multi-modal, calibrated digital loupe with depth sensing |
US11066023B1 (en) | 2021-04-09 | 2021-07-20 | INVISIONit LLC | Camera system for particulate material trailer |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06273171A (en) * | 1993-03-18 | 1994-09-30 | Fuji Heavy Ind Ltd | Distance detection device for vehicle |
JPH1026528A (en) * | 1996-07-11 | 1998-01-27 | Topcon Corp | Shape-measuring apparatus and image-input device used therefor |
JPH11146425A (en) * | 1997-11-11 | 1999-05-28 | Canon Inc | Double eye camera, double eye camera system and double eye camera control method |
JPH11211469A (en) * | 1998-01-29 | 1999-08-06 | Fuji Heavy Ind Ltd | Stereo image processing system |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3468428B2 (en) * | 1993-03-24 | 2003-11-17 | 富士重工業株式会社 | Vehicle distance detection device |
JPH10336705A (en) * | 1997-06-02 | 1998-12-18 | Canon Inc | Compound eye camera |
JPH11328413A (en) | 1998-05-14 | 1999-11-30 | Fuji Heavy Ind Ltd | Stereo image processing system |
JP3462812B2 (en) | 1999-09-22 | 2003-11-05 | 富士重工業株式会社 | Power supply control method and device for vehicle-mounted camera |
JP2002071309A (en) * | 2000-08-24 | 2002-03-08 | Asahi Optical Co Ltd | Three-dimensional image-detecting device |
JP2002259966A (en) | 2001-03-05 | 2002-09-13 | Toyota Motor Corp | Periphery recognition device |
JP2002319091A (en) * | 2001-04-20 | 2002-10-31 | Fuji Heavy Ind Ltd | Device for recognizing following vehicle |
JP3868876B2 (en) * | 2002-09-25 | 2007-01-17 | 株式会社東芝 | Obstacle detection apparatus and method |
US7184073B2 (en) * | 2003-04-11 | 2007-02-27 | Satyam Computer Services Limited Of Mayfair Centre | System and method for warning drivers based on road curvature |
JPWO2004106857A1 (en) * | 2003-05-29 | 2006-07-20 | オリンパス株式会社 | Stereo optical module and stereo camera |
JP4402400B2 (en) * | 2003-08-28 | 2010-01-20 | オリンパス株式会社 | Object recognition device |
2004
- 2004-05-28 EP EP04735383A patent/EP1637837A1/en not_active Withdrawn
- 2004-05-28 WO PCT/JP2004/007759 patent/WO2004106858A1/en active Application Filing
- 2004-05-28 JP JP2005506574A patent/JPWO2004106858A1/en active Pending

2005
- 2005-11-29 US US11/288,647 patent/US7386226B2/en not_active Expired - Fee Related
Cited By (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102074007B (en) * | 2005-05-20 | 2012-07-18 | 丰田自动车株式会社 | Image processor for vehicles |
JP2007004689A (en) * | 2005-06-27 | 2007-01-11 | Fujitsu Ten Ltd | Image processor and image processing method |
JP2007028445A (en) * | 2005-07-20 | 2007-02-01 | Auto Network Gijutsu Kenkyusho:Kk | Image display system |
JP2007174113A (en) * | 2005-12-20 | 2007-07-05 | Sumitomo Electric Ind Ltd | Obstacle detection system and obstacle detection method |
JP2007325120A (en) * | 2006-06-02 | 2007-12-13 | Sumitomo Electric Ind Ltd | Far infrared imaging system and far infrared imaging method |
JP2008022454A (en) * | 2006-07-14 | 2008-01-31 | Sumitomo Electric Ind Ltd | Obstacle detection system and obstacle detection method |
US9413983B2 (en) | 2006-11-22 | 2016-08-09 | Sony Corporation | Image display system, display device and display method |
US8395693B2 (en) | 2007-06-28 | 2013-03-12 | Panasonic Corporation | Image pickup apparatus and semiconductor circuit element |
JP4510930B2 (en) * | 2008-07-23 | 2010-07-28 | パナソニック株式会社 | Imaging device and semiconductor circuit element |
WO2010010707A1 (en) * | 2008-07-23 | 2010-01-28 | パナソニック株式会社 | Image pickup device and semiconductor circuit element |
JPWO2010010707A1 (en) * | 2008-07-23 | 2012-01-05 | パナソニック株式会社 | Imaging device and semiconductor circuit element |
US8390703B2 (en) | 2008-07-23 | 2013-03-05 | Panasonic Corporation | Image pickup apparatus and semiconductor circuit element |
JP2010146284A (en) * | 2008-12-18 | 2010-07-01 | Mazda Motor Corp | Object detection device for vehicle, and driving support system for vehicle |
US8466960B2 (en) | 2009-02-16 | 2013-06-18 | Ricoh Company, Ltd. | Liquid droplet recognition apparatus, raindrop recognition apparatus, and on-vehicle monitoring apparatus |
JP2012527803A (en) * | 2009-05-19 | 2012-11-08 | オートリブ ディベロップメント エービー | Vision systems and vision methods for automobiles |
WO2011090053A1 (en) * | 2010-01-21 | 2011-07-28 | クラリオン株式会社 | Obstacle detection warning device |
JP2011149810A (en) * | 2010-01-21 | 2011-08-04 | Clarion Co Ltd | Obstacle detection alarm device |
JP2011247965A (en) * | 2010-05-24 | 2011-12-08 | Olympus Imaging Corp | Stereo photographing adaptive interchangeable lens, imaging apparatus body, and imaging apparatus |
JP2012008724A (en) * | 2010-06-23 | 2012-01-12 | Secom Co Ltd | Monitoring sensor |
JP2012038078A (en) * | 2010-08-06 | 2012-02-23 | Secom Co Ltd | Monitoring sensor |
JP2013190416A (en) * | 2012-02-13 | 2013-09-26 | Ricoh Co Ltd | Deposit detection device and in-vehicle equipment controller including the same |
JP2017044599A (en) * | 2015-08-27 | 2017-03-02 | ルネサスエレクトロニクス株式会社 | Control system |
JP2017090380A (en) * | 2015-11-16 | 2017-05-25 | 株式会社デンソーウェーブ | Laser radar device, window member for laser radar device, and control program for laser radar device |
WO2019216229A1 (en) * | 2018-05-09 | 2019-11-14 | ソニー株式会社 | Data processing device, data processing method, and program |
US11438570B2 (en) | 2018-05-09 | 2022-09-06 | Sony Corporation | Data processing apparatus and data processing method for generation of calibration data for performing image processing |
JP2021069069A (en) * | 2019-10-28 | 2021-04-30 | トヨタ自動車株式会社 | Vehicle control device |
JP7201568B2 (en) | 2019-10-28 | 2023-01-10 | トヨタ自動車株式会社 | vehicle controller |
CN116908217A (en) * | 2023-09-11 | 2023-10-20 | 中北大学 | Deep hole measurement and three-dimensional reconstruction system and application method thereof |
CN116908217B (en) * | 2023-09-11 | 2023-11-17 | 中北大学 | Deep hole measurement and three-dimensional reconstruction system and application method thereof |
Also Published As
Publication number | Publication date |
---|---|
US20060077543A1 (en) | 2006-04-13 |
JPWO2004106858A1 (en) | 2006-07-20 |
EP1637837A1 (en) | 2006-03-22 |
US7386226B2 (en) | 2008-06-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2004106858A1 (en) | Stereo camera system and stereo optical module | |
US7437066B2 (en) | Stereo optical module and stereo camera | |
CN113271400B (en) | Imaging device and electronic equipment | |
US20200344421A1 (en) | Image pickup apparatus, image pickup control method, and program | |
EP2058762B1 (en) | Method and apparatus for generating bird's-eye image | |
WO2004106856A1 (en) | Device and method of supporting stereo camera, device and method of detecting calibration, and stereo camera system | |
KR102392221B1 (en) | An image processing apparatus, and an imaging apparatus, and an image processing system | |
JP6723079B2 (en) | Object distance detection device | |
WO2010070920A1 (en) | Device for generating image of surroundings of vehicle | |
JP2007172540A (en) | Moving object discrimination system, moving object discrimination method, and computer program | |
JP2004258266A (en) | Stereoscopic adapter and distance image input device using the same | |
US11718243B2 (en) | Vehicular camera system with forward viewing camera | |
CN111038383A (en) | Vehicle and control method thereof | |
WO2014054752A1 (en) | Image processing device and device for monitoring area in front of vehicle | |
CN100554878C (en) | Stereo optical module and stereo camera | |
JP2000293693A (en) | Obstacle detecting method and device | |
JP2004257837A (en) | Stereo adapter imaging system | |
WO2019111529A1 (en) | Image processing device and image processing method | |
JP7278846B2 (en) | OBJECT POSITION DETECTION DEVICE, TRIP CONTROL SYSTEM, AND TRIP CONTROL METHOD | |
JP2013250694A (en) | Image processing apparatus | |
CN117495972A (en) | Adaptive projection error-based external parameter calibration for 4D millimeter wave radar and camera | |
WO2019058755A1 (en) | Object distance detection device | |
JP7227112B2 (en) | OBJECT DETECTION DEVICE, TRIP CONTROL SYSTEM, AND TRIP CONTROL METHOD | |
CN1798957A (en) | Stereo camera system and stereo optical module | |
JP2006317193A (en) | Image processing apparatus, imaging processing method and program for processing image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2005506574 Country of ref document: JP |
WWE | Wipo information: entry into national phase |
Ref document number: 20048149063 Country of ref document: CN Ref document number: 11288647 Country of ref document: US |
WWE | Wipo information: entry into national phase |
Ref document number: 2004735383 Country of ref document: EP |
WWP | Wipo information: published in national office |
Ref document number: 2004735383 Country of ref document: EP |
WWP | Wipo information: published in national office |
Ref document number: 11288647 Country of ref document: US |