WO2016189878A1 - Arithmetic device, camera device, vehicle, and calibration method - Google Patents
Arithmetic device, camera device, vehicle, and calibration method
- Publication number
- WO2016189878A1 PCT/JP2016/002560 JP2016002560W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- image
- calibration
- lines
- parallel
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims description 38
- 239000000284 extract Substances 0.000 claims abstract description 14
- 230000003287 optical effect Effects 0.000 claims description 33
- 238000013507 mapping Methods 0.000 abstract description 24
- 230000015654 memory Effects 0.000 description 22
- 238000010586 diagram Methods 0.000 description 18
- 238000003860 storage Methods 0.000 description 17
- 238000012545 processing Methods 0.000 description 16
- 238000000605 extraction Methods 0.000 description 15
- 230000008569 process Effects 0.000 description 15
- 238000004891 communication Methods 0.000 description 14
- 238000003384 imaging method Methods 0.000 description 14
- 238000006243 chemical reaction Methods 0.000 description 10
- 238000005259 measurement Methods 0.000 description 5
- 230000005540 biological transmission Effects 0.000 description 4
- 230000008859 change Effects 0.000 description 4
- 238000006073 displacement reaction Methods 0.000 description 4
- 238000010276 construction Methods 0.000 description 3
- 230000006870 function Effects 0.000 description 3
- 238000001514 detection method Methods 0.000 description 2
- 238000004519 manufacturing process Methods 0.000 description 2
- 230000007246 mechanism Effects 0.000 description 2
- UFHFLCQGNIYNRP-UHFFFAOYSA-N Hydrogen Chemical compound [H][H] UFHFLCQGNIYNRP-UHFFFAOYSA-N 0.000 description 1
- 239000011230 binding agent Substances 0.000 description 1
- 238000002485 combustion reaction Methods 0.000 description 1
- 230000000295 complement effect Effects 0.000 description 1
- 230000000593 degrading effect Effects 0.000 description 1
- 238000009826 distribution Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000001914 filtration Methods 0.000 description 1
- 229910052739 hydrogen Inorganic materials 0.000 description 1
- 239000001257 hydrogen Substances 0.000 description 1
- 238000009434 installation Methods 0.000 description 1
- 230000001678 irradiating effect Effects 0.000 description 1
- 230000003137 locomotive effect Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000012544 monitoring process Methods 0.000 description 1
- 230000008054 signal transmission Effects 0.000 description 1
- 230000009466 transformation Effects 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T7/85—Stereo camera calibration
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/10—Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument
- G01C3/14—Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument with binocular observation at a single point, e.g. stereoscopic type
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/246—Calibration of cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30256—Lane; Road marking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/165—Anti-collision systems for passive traffic, e.g. including static obstacles, trees
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
Definitions
- the present disclosure relates to an arithmetic device, a camera device, a vehicle, and a calibration method.
- Stereo camera devices that use multiple cameras to calculate the distance to a subject are known. For example, in the automobile field, a stereo camera device is used to calculate the distance to a preceding vehicle or an obstacle. The calculated distance is used for driving assistance, including warning the driver for collision avoidance and controlling the accelerator or brake for auto cruise control.
- The stereo camera device calculates the distance from the positional difference (parallax) between the images captured by the two cameras.
- An alignment error between the two cameras causes a large error in the calculated distance, so calibration must be performed frequently; however, it is difficult to perform such high-precision calibration by mechanical adjustment each time.
- An electronic calibration method has been proposed in which an alignment error is automatically determined based on an image captured by a camera, and a distance is corrected according to the determined alignment error (see, for example, Patent Document 1).
- In this electronic calibration method, two or more straight lines corresponding to marking lines drawn parallel to each other at the boundary of the traveling road are detected from the right image and the left image captured by the stereo camera.
- The coordinates of the vanishing point where these two straight lines intersect are then determined for each of the left and right images.
- the coordinates of the vanishing point are compared between the left image and the right image to determine the alignment error.
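In homogeneous image coordinates, the vanishing point used by this conventional method is simply the intersection of the two detected lines, which can be computed with cross products. A minimal sketch (the endpoint coordinates are illustrative, not taken from the patent):

```python
import numpy as np

def line_through(p, q):
    """Homogeneous line through two image points (cross product)."""
    return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

def vanishing_point(line_a, line_b):
    """Intersection of two homogeneous lines as (u, v) pixel coordinates."""
    x = np.cross(line_a, line_b)
    return x[:2] / x[2]  # dehomogenize

# Two lane-marking lines converging toward a common point (illustrative)
left_lane = line_through((100, 480), (300, 240))
right_lane = line_through((540, 480), (340, 240))
vp = vanishing_point(left_lane, right_lane)  # (320.0, 216.0)
```

Comparing `vp` between the left and right images then yields the alignment error, as the bullet above describes.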
- The computing device of the present disclosure includes a controller that extracts, from an image of an object to be imaged, a plurality of lines corresponding to straight lines that are parallel to each other on the object, and calculates first calibration data such that, when each of the extracted lines is mapped into a three-dimensional coordinate space, the mapped lines are parallel to each other.
- The camera device of the present disclosure includes an arithmetic device having a controller that extracts, from an image of an object to be imaged, a plurality of lines corresponding to straight lines that are parallel to each other on the object, and calculates first calibration data such that the mapped lines are parallel to each other when each of the extracted lines is mapped into a three-dimensional coordinate space, together with one or more cameras that capture the image of the object.
- The vehicle of the present disclosure includes an arithmetic device having a controller that extracts, from an image of an object to be imaged, a plurality of lines corresponding to straight lines that are parallel to each other on the object, and calculates first calibration data such that the mapped lines are parallel to each other when each of the extracted lines is mapped into a three-dimensional coordinate space, together with one or more cameras that capture the image of the object.
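The parallelism criterion above can be sketched as a one-parameter search: back-project each extracted line onto a flat ground plane under a candidate camera depression angle, and keep the angle that makes the mapped lines parallel. Everything below (pinhole model, focal length in pixels, principal point, camera height, lane-marking endpoints) is an illustrative assumption, not the patent's actual procedure:

```python
import math

def pixel_to_ground(u, v, f, cu, cv, cam_h, pitch):
    """Back-project a pixel (below the horizon) onto the flat ground plane
    Y = 0, for a pinhole camera at height cam_h whose optical axis is
    depressed by `pitch` radians below horizontal. Returns (X, Z)."""
    dx, dy = u - cu, v - cv
    t = cam_h / (dy * math.cos(pitch) + f * math.sin(pitch))
    return (t * dx, t * (f * math.cos(pitch) - dy * math.sin(pitch)))

def heading(p1, p2):
    """Direction (radians) of the ground-plane segment p1 -> p2."""
    return math.atan2(p2[0] - p1[0], p2[1] - p1[1])

def residual(lines, f, cu, cv, cam_h, pitch):
    """Spread of the mapped-line headings: zero when they are parallel."""
    hs = [heading(pixel_to_ground(*a, f, cu, cv, cam_h, pitch),
                  pixel_to_ground(*b, f, cu, cv, cam_h, pitch))
          for a, b in lines]
    return max(hs) - min(hs)

# Hypothetical endpoints ((u1, v1), (u2, v2)) of two lane markings that are
# parallel in the real world, plus hypothetical camera parameters
lines = [((250, 470), (285, 325)), ((390, 470), (355, 325))]
f, cu, cv, cam_h = 700.0, 320.0, 240.0, 1.3
# Coarse scan over candidate depression angles (0.1-degree steps)
best = min((math.radians(a / 10) for a in range(10, 200)),
           key=lambda p: residual(lines, f, cu, cv, cam_h, p))
```

For these synthetic endpoints the image lines meet at (320, 180), so the scan settles near atan(60/700), about 4.9 degrees: the pitch at which the back-projected lines become parallel, with no vanishing-point computation needed in general.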
- FIG. 1 is a diagram schematically illustrating a vehicle equipped with a camera device that travels on a road.
- FIG. 2 is a block diagram illustrating a schematic configuration of a camera apparatus according to one of a plurality of embodiments.
- FIG. 3 is a flowchart showing a procedure for updating the calibration value in the calibration calculation unit.
- FIG. 4 is a diagram illustrating an example of extracting parallel straight lines in the real space from the camera image.
- FIG. 5 is a conceptual diagram showing a pair of straight lines corresponding to a pair of parallel straight lines included in the object to be extracted, extracted from the left image and the right image.
- FIG. 6 is a flowchart showing a procedure for mapping a straight line on the image space to the three-dimensional coordinate space.
- FIG. 7 is a diagram illustrating the relationship between the pixel value in the v direction and the image height y in the camera image.
- FIG. 8 is a diagram for explaining the correspondence between the image height y in the image space and the coordinates in the Z direction in the three-dimensional coordinate space.
- FIG. 9 is a diagram showing the relationship between the pixel value in the u direction and the image height x in the camera image.
- FIG. 10 is a diagram showing the relationship between the X coordinate position of the object to be imaged in the three-dimensional coordinate space and the incident angle θ to the camera.
- FIG. 11 is a conceptual diagram showing straight lines extracted from the left image and the right image and mapped to the three-dimensional coordinate space.
- FIG. 12 is a conceptual diagram showing parallel straight lines mapped to the three-dimensional coordinate space after calibration of the depression angle of the optical axis of the camera.
- FIG. 13 is a conceptual diagram showing parallel straight lines mapped to the three-dimensional coordinate space after the calibration of the parallax deviation amount Δu.
- FIG. 14 is a block diagram illustrating a schematic configuration of a camera apparatus according to one of a plurality of embodiments.
- FIG. 15 is a block diagram illustrating a schematic configuration of a camera apparatus according to one of a plurality of embodiments.
- FIG. 16 is a diagram showing a simplified appearance of a vehicle equipped with a camera device according to one of a plurality of embodiments.
- FIG. 17 is a block diagram illustrating a schematic configuration of a camera apparatus according to one of a plurality of embodiments.
- In the calibration method described in Patent Document 1, the vanishing point must be calculated from the parallel lines included in the captured image. With such a calibration method, an accurate vanishing point cannot be obtained when the road surface is inclined. According to the present disclosure, calibration can be performed without calculating the vanishing point of a straight line.
- the traveling direction (upward in the figure) of the vehicle 1 is the Z direction
- the vehicle width direction (left and right in the figure) of the vehicle 1 is the X direction
- the height direction of the vehicle 1 is defined as the Y direction.
- the traveling direction of the vehicle 1 (upward in the figure) is the positive direction of the Z direction
- the direction from left to right is the positive direction of the X direction
- the direction from the ground to the sky is the positive direction of the Y direction.
- the “vehicle” in the present disclosure includes, but is not limited to, automobiles, railway vehicles, industrial vehicles, and vehicles for daily living.
- the vehicle may include an airplane traveling on a runway.
- the automobile includes, but is not limited to, a passenger car, a truck, a bus, a two-wheeled vehicle, a trolley bus, and the like, and may include other vehicles that travel on the road.
- Rail vehicles include, but are not limited to, locomotives, freight cars, passenger cars, trams, guided railroads, ropeways, cable cars, linear motor cars, and monorails, and may include other vehicles that travel along the track.
- Industrial vehicles include industrial vehicles for agriculture and construction.
- Industrial vehicles include but are not limited to forklifts and golf carts.
- Industrial vehicles for agriculture include, but are not limited to, tractors, tillers, transplanters, binders, combines, and lawn mowers.
- Industrial vehicles for construction include, but are not limited to, bulldozers, scrapers, excavators, cranes, dump trucks, and road rollers.
- Vehicles for daily living include, but are not limited to, bicycles, wheelchairs, baby carriages, wheelbarrows, and electric two-wheelers.
- Vehicle power sources include, but are not limited to, internal combustion engines (including diesel, gasoline, and hydrogen engines) and electric motors. Vehicles also include those that travel by human power.
- the vehicle classification is not limited to the above. For example, an automobile may include an industrial vehicle capable of traveling on a road, and the same vehicle may be included in a plurality of classifications.
- the camera device 10 includes a first camera 11, a second camera 12, and an arithmetic device 14.
- the two cameras of the first camera 11 and the second camera 12 are arranged facing the outside of the vehicle 1 and operate as a stereo camera in cooperation with each other.
- the arithmetic device 14 is electrically connected to the first camera 11 and the second camera 12.
- the arithmetic device 14 includes an image processing device.
- the “stereo camera” is a plurality of cameras having parallax and cooperating with each other.
- the stereo camera includes at least two cameras.
- Stereo cameras include those capable of simultaneously imaging a target by cooperating a plurality of cameras.
- “Simultaneous” imaging is not limited to exactly the same time. For example, the following are all included in “simultaneous” imaging in the present disclosure: (1) a plurality of cameras capture images at the same moment; (2) a plurality of cameras capture images in response to the same signal; (3) a plurality of cameras capture images at the same time according to their respective internal clocks.
- the imaging time reference includes an imaging start time, an imaging end time, a captured image data transmission time, and a time at which the counterpart device receives the image data.
- the stereo camera may be a device in which a plurality of cameras are included in one housing.
- the stereo camera may be a device including two or more cameras which are independent from each other and located apart from each other.
- the stereo camera is not limited to a plurality of cameras independent of each other.
- a camera having an optical mechanism that guides light incident on two distant locations to one light receiving element can be adopted as a stereo camera.
- two units, a first camera 11 and a second camera 12 that are independent from each other, are arranged.
- a plurality of images obtained by capturing the same subject from different viewpoints may be referred to as “stereo images”.
- the first camera 11 and the second camera 12 include a solid-state image sensor.
- the solid-state imaging device includes a CCD image sensor (Charge-Coupled Device image sensor) and a CMOS image sensor (Complementary MOS image sensor).
- the first camera 11 and the second camera 12 may include a lens mechanism.
- the optical axes of the first camera 11 and the second camera 12 face the direction in which the same subject can be imaged.
- the first camera 11 and the second camera 12 have different optical axes.
- the first camera 11 and the second camera 12 have their optical axes and positions determined so that at least the same subject to be captured is included in the captured image.
- the optical axes of the first camera 11 and the second camera 12 are directed to be parallel to each other. This parallelism is not limited to strict parallelism, but allows assembly deviations, mounting deviations, and deviations over time.
- the optical axes of the first camera 11 and the second camera 12 are not limited to being parallel but may be in different directions.
- the first camera 11 and the second camera 12 are fixed with respect to the vehicle body of the vehicle 1 so that changes in position and orientation with respect to the vehicle 1 are reduced. Even if the first camera 11 and the second camera 12 are fixed, their positions and orientations may change with respect to the vehicle 1.
- the optical axes of the first camera 11 and the second camera 12 face the front (Z direction) of the vehicle 1.
- the camera device 10 can image various objects to be imaged such as white lines 101 and 102 (division lines) on the road surface 100, preceding vehicles, and obstacles while traveling.
- the optical axes of the first camera 11 and the second camera 12 are inclined toward the road surface 100 from the Z direction.
- the optical axes of the first camera 11 and the second camera 12 may be directed in the Z direction, and may be inclined from the Z direction to the sky side.
- the directions of the optical axes of the first camera 11 and the second camera 12 are appropriately changed according to the application.
- the first camera 11 and the second camera 12 are located away from each other in the direction intersecting the respective optical axes.
- the first camera 11 and the second camera 12 are positioned along the vehicle width direction (X direction) of the vehicle 1.
- the first camera 11 is located on the left side of the second camera 12 when facing forward
- the second camera 12 is located on the right side of the first camera 11 when facing forward. Because of this difference in position, the positions of corresponding objects differ between the two images captured by the cameras 11 and 12.
- the left image output from the first camera 11 and the right image output from the second camera 12 are stereo images captured from different viewpoints.
- the positions of the first camera 11 and the second camera 12 are not limited to this; in another embodiment, the first camera 11 and the second camera 12 may be positioned along the vertical direction (Y direction) or along a diagonal direction in the XY plane. In that case, the images output from the first camera 11 and the second camera 12 are stereo images having parallax in the vertical or diagonal direction, respectively.
- the first camera 11 and the second camera 12 are fixed to the front side of the vehicle 1 with the optical axis directed in front of the vehicle 1 (Z direction). In one of the embodiments, the first camera 11 and the second camera 12 can image the outside of the vehicle 1 via the windshield of the vehicle 1. In a plurality of embodiments, the first camera 11 and the second camera 12 may be fixed to any of the front bumper, fender grille, side fender, light module, and bonnet of the vehicle 1.
- the first camera 11 and the second camera 12 each output a captured image to the arithmetic device 14 as digital data.
- the computing device 14 can perform various processes on each of the left image output from the first camera 11 and the right image output from the second camera 12.
- the processing that can be performed by the computing device 14 includes processing for calibrating the deviation of the first camera 11 and the second camera 12 from the standard based on the image, and processing for detecting an object from the image.
- the computing device 14 calculates the distance to the detected object from both the left image and the right image.
- the computing device 14 may calculate the distance to the object by a known technique including the principle of triangulation.
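For a rectified stereo pair, the triangulation mentioned here reduces to Z = f·B/d, where f is the focal length in pixels, B is the baseline between the two cameras, and d is the disparity in pixels. A minimal sketch with illustrative parameter values:

```python
def stereo_distance(focal_px, baseline_m, disparity_px):
    """Triangulated distance for a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Illustrative values: 700 px focal length, 0.35 m baseline, 10 px disparity
z = stereo_distance(700.0, 0.35, 10.0)  # 24.5 m
```

The inverse relationship between distance and disparity is why a small alignment error in d produces a large error in Z at long range, motivating the frequent calibration discussed above.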
- the computing device 14 performs a process of calibrating image differences due to manufacturing variations of the first camera 11 and the second camera 12 and differences from the planned standard including mounting displacement.
- the computing device 14 may perform at least one of a process of calibrating after developing image data into an image and a process of calibrating before developing image data into an image.
- the computing device 14 may perform a process of updating a calibration value used for image calibration in order to calibrate a change over time with respect to a planned standard.
- the computing device 14 may periodically perform a process of updating the calibration value.
- the “calibration value” is a parameter used for calibrating input image data or an image developed from the image data.
- the calibration value may be used when calibrating the deviation from the standard camera position and orientation.
- the calibration value and the calibration data are synonymous. Hereinafter, the procedure for updating the calibration value will be described in more detail.
- the camera device 10 includes a first camera 11, a second camera 12, and an arithmetic device 14.
- the computing device 14 includes an input unit 15, a control unit 16 as a controller, an output unit 17, and a storage unit 18.
- the computing device 14 may include a memory or the like for temporarily storing image data input to the input unit 15 separately from the storage unit 18.
- the input unit 15 is an input interface for inputting image data to the arithmetic device 14.
- the input unit 15 can employ a physical connector and a wireless communication device.
- the physical connector includes an electrical connector that supports transmission using an electrical signal, an optical connector that supports transmission using an optical signal, and an electromagnetic connector that supports transmission using electromagnetic waves.
- Electrical connectors include connectors conforming to IEC 60603, connectors conforming to the USB standard, connectors corresponding to RCA terminals, connectors corresponding to the S terminal defined in EIAJ CP-1211A, connectors corresponding to the D terminal defined in EIAJ RC-5237, connectors conforming to the HDMI (registered trademark) standard, and connectors corresponding to coaxial cables including BNC.
- the optical connector includes various connectors conforming to IEC 61754.
- the wireless communication device includes a wireless communication device that complies with each standard including Bluetooth (registered trademark) and IEEE802.11.
- the wireless communication device includes at least one antenna.
- the input unit 15 receives image data of images captured by the first camera 11 and the second camera 12.
- the input unit 15 delivers the input image data to the control unit 16.
- the input to the input unit 15 includes a signal input via a wired cable and a signal input via a wireless connection.
- the input unit 15 may correspond to the imaging signal transmission method of the first camera 11 and the second camera 12.
- the control unit 16 includes one or a plurality of processors.
- the control unit 16 or the processor may include one or a plurality of memories that store programs for various processes and information being calculated.
- the memory includes volatile memory and nonvolatile memory.
- the memory includes a memory independent of the processor and a built-in memory of the processor.
- the processor includes a general-purpose processor that reads a specific program and executes a specific function, and a dedicated processor specialized for a specific process.
- the dedicated processor includes an application-specific integrated circuit (ASIC).
- the processor includes a programmable logic device (PLD).
- PLD includes FPGA (Field-Programmable Gate Array).
- the control unit 16 may be one of SoC (System-on-a-Chip) and SiP (System-In-a-Package) in which one or a plurality of processors cooperate.
- the control unit 16 has a normal mode and a calculation mode.
- the control unit 16 may detect an object in the image based on the input image data in the normal mode.
- the control unit 16 calculates the distance to the detected object in the normal mode.
- In the calculation mode, the control unit 16 calculates a calibration value for calibrating the input image signal.
- the control unit 16 is not limited to a method that operates in any of a plurality of different modes.
- the controller 16 may execute either detection of an object or calculation of a distance to the detected object while calculating a calibration value for calibrating the input image signal.
- the output unit 17 is an output interface that outputs data from the arithmetic device 14.
- the output unit 17 can employ a physical connector and a wireless communication device.
- the output unit 17 is connected to a network of the vehicle 1 such as a CAN (Controller Area Network).
- the computing device 14 is connected to the control device of the vehicle 1, an alarm device, and the like via the CAN.
- the arithmetic device 14 outputs information via the output unit 17 to the control device, the alarm device, and the like. Such information is used as appropriate by each of the control device and the alarm device.
- the output unit 17 is separated from the input unit 15, but is not limited thereto.
- the input unit 15 and the output unit 17 may be one communication unit.
- This communication unit is a communication interface of the camera device 10.
- the communication unit can employ a physical connector and a wireless communication device.
- the storage unit 18 stores a calibration value.
- the calibration value includes a first calibration value (first calibration data) and a second calibration value (second calibration data).
- the storage unit 18 includes a rewritable memory.
- the storage unit 18 may be a non-volatile memory such as a flash memory, a magnetic memory (MRAM: Magnetoresistive Random Access Memory), or a ferroelectric memory (FeRAM: Ferroelectric Random Access Memory).
- the control unit 16 includes a calibration unit 19, a stereo calculation unit 20, and a calibration calculation unit 21.
- the operation of each part will be described below.
- Each of the calibration unit 19, the stereo calculation unit 20, and the calibration calculation unit 21 may be a hardware module or a software module.
- the control unit 16 can execute operations that each of the calibration unit 19, the stereo calculation unit 20, and the calibration calculation unit 21 can perform.
- the control unit 16 is not limited to a configuration including the calibration unit 19, the stereo calculation unit 20, and the calibration calculation unit 21; one or more of these units may be omitted.
- control unit 16 may execute all the operations of the calibration unit 19, the stereo calculation unit 20, and the calibration calculation unit 21. Operations performed by the calibration unit 19, the stereo calculation unit 20, and the calibration calculation unit 21 may be rephrased as operations performed by the control unit 16. The process performed by the control unit 16 using any one of the calibration unit 19, the stereo calculation unit 20, and the calibration calculation unit 21 may be executed by the control unit 16 itself.
- the calibration unit 19 calibrates the image with reference to the calibration value stored in the storage unit 18.
- the calibration unit 19 calibrates images received from each of the first camera 11 and the second camera 12.
- the calibration unit 19 electronically calibrates the deviation of each of the first camera 11 and the second camera 12 from the standard by image data conversion.
- the calibration unit 19 converts the left side image and the right side image into parallel equidistant images.
- the calibration unit 19 refers to the first calibration value when calibrating each of the image data received from the first camera 11 and the second camera 12.
- the calibration unit 19 refers to the second calibration value when the left image and the right image are calibrated as a stereo image.
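One common way to realize such image-level calibration is to store the calibration value as a 3x3 homography and inverse-warp each camera image with it. The following is only a sketch of that general technique (nearest-neighbor sampling, hypothetical identity/shift calibration values), not the patent's actual conversion:

```python
import numpy as np

def apply_homography(image, H_inv):
    """Inverse-warp a grayscale image: each output pixel (u, v) samples the
    input at H_inv @ (u, v, 1), nearest-neighbor; unmapped pixels stay 0."""
    h, w = image.shape
    out = np.zeros_like(image)
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    src = H_inv @ pts
    su = np.rint(src[0] / src[2]).astype(int)
    sv = np.rint(src[1] / src[2]).astype(int)
    ok = (su >= 0) & (su < w) & (sv >= 0) & (sv < h)
    out.ravel()[ok] = image[sv[ok], su[ok]]
    return out

# An identity calibration value leaves the image unchanged; a real one
# would encode the small rotation/shift that rectifies the stereo pair
img = np.arange(20, dtype=np.uint8).reshape(4, 5)
same = apply_homography(img, np.eye(3))
```

Inverse warping (iterating over output pixels and looking up the source) avoids holes in the calibrated image, which is why it is the usual choice for this kind of conversion.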
- the stereo calculation unit 20 obtains the parallax between the left image and the right image calibrated by the calibration unit 19.
- the stereo calculation unit 20 divides one image of the left image and the right image into a plurality of regions.
- the stereo calculation unit 20 matches each of the plurality of divided areas with the other image.
- the stereo calculation unit 20 calculates the distance between the two regions matched in the left image and the right image based on the difference in coordinates in the left-right direction.
- the stereo calculation unit 20 generates a distance image indicating the calculated distance distribution.
- the stereo calculation unit 20 identifies an object by detecting a portion where regions at the same distance form a contiguous cluster, and determines that an object is present at that position.
- the stereo calculation unit 20 identifies the distance to the identified object from the distance of the area where the object is identified.
- the objects identified by the stereo calculation unit 20 include obstacles.
- the obstacle includes at least one of a human, a vehicle, a road sign, a building, and a plant.
- the stereo calculation unit 20 associates the identified object with the distance image.
- the stereo calculation unit 20 outputs information including at least one of the distance image, the identified object, and the distance to the object via the output unit 17.
- the stereo calculation unit 20 performs processing in real time.
- the stereo calculation unit 20 may recognize the road shape based on the parallax image.
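The parallax computation and distance identification described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: `block_disparity` is a hypothetical helper that matches a small block along the same row of the other image by sum of absolute differences, and `disparity_to_distance` applies standard stereo triangulation Z = B·f/(d·p), where B is the baseline, f the focal length, and p the pixel pitch (all parameter names are assumptions).

```python
import numpy as np

def block_disparity(left, right, u, v, size=3, max_d=16):
    """Find the disparity of the block centered at (u, v) in the left image
    by sum-of-absolute-differences matching along the same row of the right
    image (the same object appears further left in the right image)."""
    h = size // 2
    tpl = left[v - h:v + h + 1, u - h:u + h + 1].astype(float)
    best_d, best_cost = 0, float("inf")
    for d in range(max_d + 1):
        if u - d - h < 0:
            break
        cand = right[v - h:v + h + 1, u - d - h:u - d + h + 1].astype(float)
        cost = np.abs(tpl - cand).sum()
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d

def disparity_to_distance(d_px, baseline_m, focal_mm, pitch_mm):
    """Standard stereo triangulation: Z = B * f / (d * p)."""
    return baseline_m * focal_mm / (d_px * pitch_mm)
```

With a 0.35 m baseline, a 4 mm lens, and a 4 µm pixel pitch (hypothetical values), a 4-pixel disparity corresponds to a distance of 87.5 m.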
- the calibration calculation unit 21 calculates a first calibration value for calibrating the image data received from the camera.
- the calibration calculator 21 calculates a first calibration value for calibrating the left image and the right image received from the first camera 11 and the second camera 12.
- the calibration calculator 21 calculates a second calibration value when the left image and the right image are calibrated as a stereo image based on the calibrated left image and right image.
- the calibration calculation unit 21 updates the calibration value stored in the storage unit 18.
- the calibration calculation unit 21 acquires the left image by the first camera 11 and the right image by the second camera 12 calibrated from the calibration unit 19 with the current calibration value, respectively (step S101).
- the calibration calculator 21 performs differential filtering on the acquired left image and right image (step S102).
- in each image, portions where the luminance value changes sharply are emphasized.
- the calibration calculation unit 21 performs binarization processing on the image that has been subjected to the differential filter processing (step S103). By performing the binarization process, the edge of each image is sharpened.
- the calibration calculation unit 21 extracts edges from an image with sharpened edges.
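The differential filtering and binarization of steps S102 and S103 can be sketched as below. This is a minimal sketch under assumptions: a simple horizontal difference filter stands in for whatever differential filter the patent intends, and `edge_binarize` and its threshold are hypothetical.

```python
import numpy as np

def edge_binarize(img, thresh=30):
    """Emphasize portions where the luminance value changes sharply with a
    horizontal difference filter (cf. step S102), then binarize the result
    to sharpen the edges (cf. step S103)."""
    img = img.astype(float)
    grad = np.zeros_like(img)
    grad[:, 1:] = np.abs(img[:, 1:] - img[:, :-1])  # differential filter
    return (grad >= thresh).astype(np.uint8)        # binarization
```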
- the calibration calculation unit 21 extracts, from each of the left image and the right image with sharpened edges, a plurality of points on lines in the image that correspond to straight lines parallel to each other in the real-space subject.
- the calibration calculation unit 21 extracts points from lines corresponding to the same straight line in the real space from each of the left image and the right image.
- the number of points extracted by the calibration calculation unit 21 is two or more.
- for example, the calibration calculation unit 21 extracts 100 points.
- the plurality of points extracted by the calibration calculation unit 21 may be referred to as “point group”.
- by extracting the point group, the calibration calculation unit 21 effectively extracts the line on which the point group substantially lies.
- the object extracted by the calibration calculation unit 21 is not limited to a point, and a line may be extracted.
- in the case of a vehicle traveling on a road, the calibration calculation unit 21 extracts point groups corresponding to parallel lines such as lane markings including white lines on the road surface, the boundary line between the road and the sidewalk, and the boundary line between the road and the median strip.
- the calibration calculation unit 21 may also extract point groups corresponding to tracks that are parallel to each other.
- the “three-dimensional coordinate space” in this specification means a virtual space that is three-dimensionalized based on an image captured by a camera.
- straight lines parallel to each other can be extracted from straight lines located on the same plane.
- the same plane may be a road surface (XZ plane) that is a ground portion on which the vehicle 1 travels.
- on the actual road surface 100, there are sections where a white line 101 between the traveling lane and the roadside strip and a white line 102 between traveling lanes are marked.
- the calibration calculation unit 21 extracts points from the straight line 103 and the straight line 104 that are the right edges of the white lines 101 and 102, respectively.
- in FIG. 5 and the subsequent FIGS. 11, 12, and 13, for ease of explanation, the set of points extracted from the left image and their mapping lines are shown by solid lines, and the set of points extracted from the right image and their mapping lines are shown by broken lines.
- the white line 101 and the white line 102 extend linearly far ahead of the vehicle 1, but each white line only needs to be partially straight.
- the calibration calculation unit 21 may extract the straight line from the white line including the curved section caused by the road curve. Subsequently, the calibration calculation unit 21 performs other processing.
- a set of points extracted from each of the left image and the right image is indicated by a solid line and a broken line.
- These solid lines and broken lines correspond to, for example, a pair of straight lines parallel to each other in the real space to be imaged.
- the linear point groups 103L and 104L extracted from the left image I L correspond to the straight lines 103 and 104.
- the linear point groups 103R and 104R extracted from the right image I R likewise correspond to the straight lines 103 and 104.
- the origin (0, 0) of each image is at the upper left corner, the rightward direction of each image is the u direction, and the downward direction is the v direction.
- the coordinates (u, v) are expressed as pixel values measured from the origin (0, 0).
- such a two-dimensional coordinate space is referred to as an image space.
- the calibration calculation unit 21 maps the point groups 103L and 104L of the left image I L and the point groups 103R and 104R of the right image I R, which form straight lines in the two-dimensional coordinate space, into the three-dimensional coordinate space (step S105).
- FIG. 6 is a flowchart showing a procedure for mapping the point group on the image space to the three-dimensional coordinate space.
- “mapping” means that an element, including a point or a line, in the two-dimensional image space is associated with coordinates in the real space by coordinate conversion into the three-dimensional coordinate space.
- the calibration calculation unit 21 transforms the point groups extracted from the left image I L and the right image I R into an (X, Y, Z) coordinate system equivalent to that of the real space, using the current calibration values.
- the Y coordinate of the road surface is set to zero.
- the white lines 101 and 102 on the road surface 100 have a Y coordinate of 0.
- for each extraction point of the point groups 103L and 104L of the left image I L and of the point groups 103R and 104R of the right image I R, the following steps are performed sequentially (step S201).
- in steps S202 and S203, only the coordinate conversion for the left image is described, but the same processing is performed for the right image.
- the calibration calculation unit 21 converts the v coordinate of the extraction point Pi into the Z coordinate of the three-dimensional coordinate space (step S202).
- the principle will be described below.
- (u 0 , v 0 ) is a point on the optical axis of the first camera 11 and indicates the center of the image space.
- when the image height y (mm) of the extraction point in the image space is obtained from the coordinate value v (pixel value) of the extraction point in the v direction of the image space, the following equation is obtained.
- p is the pixel pitch of the first camera 11 corresponding to the extraction point.
- the image height y in the image space is converted into an incident angle ⁇ in the YZ plane when light from the extraction point enters the first camera 11.
- the incident angle ⁇ is expressed by the following equation.
- here, θ C is the depression angle of the camera optical axis, which is the installation angle of the camera (the angle formed between the camera optical axis and the road surface (XZ plane); it is a downward angle relative to the horizontal direction). In other words, it is the angle formed between the plane containing the straight lines 103 and 104 extracted in the real space (that is, the road surface 100) and the optical axis of the first camera 11.
- the first camera 11 and the second camera 12 are calibrated with high accuracy using a chart pattern or the like at the time of shipment.
- a calibration value at the time of factory shipment and a calibration value updated by processing in the previous calculation mode are stored in the storage unit 18.
- step S202 the calibration calculation unit 21 converts the v coordinate of the extraction point Pi into the Z coordinate of the three-dimensional coordinate space according to Equation (4).
- the calibration calculation unit 21 performs conversion from the u coordinate of the extraction point and the Z coordinate in the three-dimensional coordinate space to the X coordinate in the three-dimensional coordinate space (Ste S203).
- the principle will be described below.
- (u 0 , v 0 ) indicates the center of the image space and is a corresponding point on the optical axis of the first camera 11.
- when the image height x (mm) of the extraction point in the u direction of the image space is obtained from the coordinate value u (pixel value) of the extraction point Pi in the image space, the following equation is obtained.
- the image height x in the image space is converted into an incident angle ⁇ in the XZ plane of the light from the extraction point Pi to the first camera 11.
- the incident angle ⁇ is expressed by the following equation.
- the coordinate Z and the incident angle ⁇ in the three-dimensional coordinate space and the coordinate X in the three-dimensional coordinate space have the following geometric relationship.
- a conversion formula from the u coordinate in the image space and the Z coordinate in the three-dimensional coordinate space to the X coordinate in the three-dimensional coordinate space is obtained as follows.
- the calibration calculation unit 21 performs the conversion from the u coordinate of the extraction point and the Z coordinate in the three-dimensional coordinate space to the X coordinate in the three-dimensional coordinate space according to Equation (8).
- the calibration calculation unit 21 can map the point (u, v) in the image space to the point (X, 0, Z) in the three-dimensional coordinate space by using Equations (4) and (8).
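A sketch of the mapping of steps S202 and S203 is given below. The patent's Equations (4) and (8) are not reproduced in this text, so the formulas here are an assumed pinhole-model reconstruction from the surrounding definitions (depression angle θ C, pixel pitch p, focal length f, camera height above the road); `image_to_road` and its parameter names are hypothetical.

```python
import math

def image_to_road(u, v, u0, v0, p, f, theta_c, cam_h):
    """Map an image point (u, v) lying on the road surface (Y = 0) to
    (X, Z) in the three-dimensional coordinate space.
    theta_c:  depression angle of the camera optical axis [rad]
    cam_h:    camera height above the road surface
    p, f:     pixel pitch and focal length [mm]
    (u0, v0): image center on the optical axis."""
    # v -> Z (cf. step S202): incident angle in the YZ plane; points below
    # the image center (v > v0) lie below the optical axis.
    theta = math.atan((v - v0) * p / f)
    Z = cam_h / math.tan(theta_c + theta)
    # u -> X (cf. step S203): incident angle in the XZ plane.
    phi = math.atan((u - u0) * p / f)
    X = Z * math.tan(phi)
    return X, Z
```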
- the calibration calculation unit 21 executes Step S202 and Step S203 for all extraction points of the left image and the right image (Steps S201 and S204).
- the calibration calculation unit 21 calibrates the displacement of the same object in the image space due to the difference in the left and right positions of the first camera 11 and the second camera 12 by mapping to the three-dimensional coordinate space.
- when the same straight line included in the images captured by both the first camera 11 and the second camera 12 is mapped from the image space to the three-dimensional coordinate space, it is ideally mapped to the same position.
- FIG. 11 is a conceptual diagram showing the approximate straight lines 105L, 105R, 106L, and 106R, each calculated from the plurality of mapping points obtained by mapping the point groups 103L, 103R, 104L, and 104R extracted from the left image I L and the right image I R onto the three-dimensional coordinate space.
- the approximate straight lines 105L, 105R, 106L, and 106R are mapping lines obtained by mapping the lines of the image space, extracted as the point groups 103L, 103R, 104L, and 104R, respectively, into the three-dimensional coordinate space.
- the calibration calculation unit 21 determines the approximate straight lines 105L, 105R, 106L, and 106R on which the mapping points gather, using a least-squares method or the like. In FIG. 11, the deviation and inclination between the straight lines are exaggerated for ease of explanation.
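The least-squares determination of an approximate straight line can be sketched as follows; `fit_line` is a hypothetical helper that fits X = a·Z + b to the mapped points.

```python
import numpy as np

def fit_line(points):
    """Least-squares fit of X = a * Z + b to mapped points given as
    (X, Z) pairs; a is the line's slope in the X-Z plane."""
    X = np.array([pt[0] for pt in points])
    Z = np.array([pt[1] for pt in points])
    a, b = np.polyfit(Z, X, 1)
    return a, b
```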
- a pair of the approximate straight lines 105L and 106L and a pair of the approximate straight lines 105R and 106R correspond, respectively, to the straight line 103, which is the edge of the white line 101, and the straight line 104, which is the edge of the white line 102, parallel to each other on the road surface 100.
- the pair of the approximate straight lines 105L and 105R and the pair of the approximate straight lines 106L and 106R, each corresponding to the same straight line, do not overlap each other. This is because a new deviation has arisen from the state assumed by the calibration value stored in the storage unit 18, so the images can no longer be fully calibrated with that calibration value.
- the calibration calculation unit 21 adjusts the mapping into the three-dimensional coordinate space so that the pair of the approximate straight lines 105L and 106L and the pair of the approximate straight lines 105R and 106R each become parallel.
- a new calibration value is calculated by further adjusting so that the pair of the approximate straight lines 105L and 105R and the pair of the approximate straight lines 106L and 106R overlap each other.
- the calibration calculator 21 updates the stored calibration value with the newly calculated calibration value.
- the calibration calculator 21 calculates the slopes of the approximate straight line 105L and the approximate straight line 106L (step S106). The slopes of the approximate straight line 105R and the approximate straight line 106R are similarly calculated (step S106).
- the calibration calculation unit 21 adjusts the depression angle of the camera optical axis, which is a variable parameter of the first camera 11 in Equation (4) and Equation (8), so that the approximate straight line 105L and the approximate straight line 106L become parallel to each other, and determines the calibration value θ CL (step S107).
- that is, the calibration calculation unit 21 adjusts the depression angle so that the approximate straight lines 105L and 106L become parallel to the Z axis, and determines the calibration value θ CL.
- similarly, the calibration value θ CR for the depression angle of the optical axis of the second camera 12 is determined (step S107).
- the storage unit 18 may store, as reference values for the depression angles of the camera optical axes, the mounting angles of the first camera 11 and the second camera 12 defined in advance and set at the time of factory shipment.
- the calibration calculator 21 can determine the calibration values ⁇ CL and ⁇ CR by adjusting the depression angle of the optical axis of the camera based on this reference value.
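Steps S106 and S107 amount to searching for the depression angle that makes the mapped lines parallel. The sketch below is an assumed implementation: it reuses the pinhole-model mapping hypothesized earlier (the patent's exact equations are not reproduced in this text) and recovers θ C by a simple grid search around the reference value; all function and parameter names are hypothetical.

```python
import math

def map_point(u, v, u0, v0, p, f, theta_c, cam_h):
    # Assumed pinhole-model mapping of a road-surface image point to (X, Z).
    theta = math.atan((v - v0) * p / f)
    Z = cam_h / math.tan(theta_c + theta)
    return Z * math.tan(math.atan((u - u0) * p / f)), Z

def slope(points):
    # Least-squares slope of X over Z for (X, Z) pairs.
    n = len(points)
    mx = sum(x for x, _ in points) / n
    mz = sum(z for _, z in points) / n
    num = sum((z - mz) * (x - mx) for x, z in points)
    den = sum((z - mz) ** 2 for _, z in points)
    return num / den

def calibrate_depression(lines_uv, u0, v0, p, f, cam_h, lo, hi, steps=2000):
    """Grid-search the depression angle theta_c in [lo, hi] so that the
    mapped lines become parallel (equal slopes in the X-Z plane)."""
    best, best_err = lo, float("inf")
    for i in range(steps + 1):
        th = lo + (hi - lo) * i / steps
        slopes = [slope([map_point(u, v, u0, v0, p, f, th, cam_h)
                         for u, v in line]) for line in lines_uv]
        err = max(slopes) - min(slopes)
        if err < best_err:
            best_err, best = err, th
    return best
```

At the correct depression angle, points on two road lines of constant X map to lines of equal (zero) slope, so the slope spread is minimized there.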
- the depression angle calibration values ⁇ CL and ⁇ CR are first calibration values that are referred to when each of the image data received from the first camera 11 and the second camera 12 is calibrated.
- when the calibration values θ CL and θ CR for the depression angles of the optical axes of the first camera 11 and the second camera 12 are determined, the calibration calculation unit 21 again maps the point groups extracted from the left image I L and the right image I R onto the three-dimensional coordinate space, this time using the new calibration values θ CL and θ CR (step S108).
- the mapping to the three-dimensional coordinate space is executed according to the flowchart shown in FIG. 6 as in step S105.
- the approximate straight lines 107L, 107R, 108L, and 108R are mapping lines obtained by mapping the lines of the image space, extracted as the point groups 103L, 103R, 104L, and 104R, respectively, into the three-dimensional coordinate space.
- the approximate straight lines 107L and 108L, which correspond respectively to the straight line 103, the edge of the white line 101, and the straight line 104, the edge of the white line 102 (straight lines parallel to each other), are parallel to each other.
- the approximate straight line 107R and the approximate straight line 108R are parallel to each other.
- however, the approximate straight line 107L and the approximate straight line 107R, which correspond to the same straight line of the imaged subject, are mapped to positions parallel to each other but shifted in the X direction.
- similarly, the approximate straight line 108L and the approximate straight line 108R, which correspond to the same straight line, that is, the straight line 104 that is the edge of the white line 102, are mapped to positions parallel to each other but shifted in the X direction.
- after step S108, the calibration calculation unit 21 compares the positions of the approximate straight lines 107L and 108L mapped from the left image into the three-dimensional coordinate space with the positions of the approximate straight lines 107R and 108R mapped from the right image into the three-dimensional coordinate space (step S109).
- the calibration calculation unit 21 calculates the parallax deviation amount Δu (calibration data) so that the pair of the mapped approximate straight lines 107L and 107R and the pair of the approximate straight lines 108L and 108R coincide with each other.
- the parallax deviation amount ⁇ u is a second calibration value that is referred to when the image data received from the first camera 11 and the second camera 12 is calibrated as a stereo image.
- the parallax deviation amount Δu corresponds to the relative positional deviation that arises over time between the first camera 11 and the second camera 12.
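The adjustment that makes the pairs of mapped lines coincide (the basis of the parallax deviation amount Δu) can be sketched as a one-dimensional search over a u-shift applied to the right image's points before remapping. This is an assumed, simplified reading of the step, not the patent's exact computation: `parallax_deviation` and `map_line` are hypothetical names, and the left and right point lists are assumed to correspond index-by-index (same v per index).

```python
def parallax_deviation(map_line, line_uv_left, line_uv_right, candidates):
    """Search for the u-shift (a parallax deviation candidate) that makes a
    pair of mapped lines coincide: shift the right image's u coordinates,
    remap, and keep the shift minimizing the mean X gap to the left line.
    map_line(u, v) -> (X, Z) is any image-to-road mapping."""
    left = [map_line(u, v) for u, v in line_uv_left]
    best, best_gap = 0.0, float("inf")
    for du in candidates:
        right = [map_line(u + du, v) for u, v in line_uv_right]
        gap = sum(abs(xl - xr)
                  for (xl, _), (xr, _) in zip(left, right)) / len(left)
        if gap < best_gap:
            best_gap, best = gap, du
    return best
```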
- the approximate straight lines 109L, 109R, 110L, and 110R are mapping lines obtained by mapping the lines of the image space, extracted as the point groups 103L, 103R, 104L, and 104R, respectively, into the three-dimensional coordinate space.
- the approximate straight line 109L and the approximate straight line 109R corresponding to the straight line 103 of the edge of the same white line 101 to be imaged can be matched.
- the approximate straight line 110L and the approximate straight line 110R corresponding to the straight line 104 of the edge of the same white line 102 to be imaged can be made to coincide with each other.
- the calibration calculation unit 21 stores the updated calibration values θ CL and θ CR for the optical-axis depression angles of the first camera 11 and the second camera 12, and the calibration value Δu for the parallax deviation amount between the left image and the right image, in the storage unit 18 (step S111), thereby updating the calibration values, and ends the calibration value update process.
- instead of the calibration value Δu for the parallax deviation amount, the coordinates u 0 on the optical axes of the left image and the right image may be relatively shifted by an amount corresponding to the parallax deviation amount Δu, and the resulting values u 0L and u 0R may be saved as the calibration values.
- until the next calibration value update process, the calibration unit 19 calibrates the left image and the right image (stereo images) input from the first camera 11 and the second camera 12 using the updated calibration values θ CL, θ CR, and Δu, and outputs the calibrated left image and right image to the stereo calculation unit 20.
- the control unit 16 may cause the calibration calculation unit 21 to update the calibration value automatically at regular time intervals, or at least once when the arithmetic device 14 starts or stops.
- the control unit 16 may update the calibration value based on a manual operation or a signal input from another device.
- calibration can be performed without obtaining the vanishing point of the parallel straight lines.
- the camera apparatus 10 converts lines in the image space obtained by imaging straight lines parallel to each other in the three-dimensional coordinate space so as to be parallel lines.
- since the depression angles θ C of the optical axes of the first camera 11 and the second camera 12 are calibrated with the road surface inclination taken into account, good calibration can be performed without degrading the calibration accuracy even when traveling on a slope or other road whose surface is inclined.
- the camera device 10 can be calibrated in its usage environment. In one of several embodiments, the camera device 10 can be calibrated electronically. In one of the embodiments, the camera device 10 places a low burden on the user. In one of the embodiments, since the camera device 10 can update the calibration value in the usage environment, it is easy to maintain accuracy.
- the camera device 10 calculates the parallax deviation amount Δu (calibration data) and calibrates the relative positional deviation between the imaging elements of the two cameras and the optical axes of the lenses. Even a small positional deviation between an image sensor and the lens optical axis causes a large error in distance measurement. Since the camera device 10 can update the calibration value by a simple calculation, it is easy to maintain distance measurement accuracy.
- the camera device 30 of the embodiment shown in FIG. 14 includes a first camera 31, a second camera 32, and an arithmetic device 37.
- the first camera 31 and the second camera 32 include imaging units 34a and 34b, primary calibration units 35a and 35b, and primary calibration value memories 36a and 36b, respectively.
- the first camera 31 and the second camera 32 are accurately calibrated by using a chart pattern or the like at the time of factory shipment, and the calibration values are stored in the primary calibration value memories 36a and 36b as primary calibration values.
- as the primary calibration value, a calibration value similar to that of the embodiment shown in FIGS. 1 to 13 can be used.
- the primary calibration units 35 a and 35 b calibrate the images captured by the imaging units 34 a and 34 b based on the primary calibration values of the primary calibration value memories 36 a and 36 b and transmit the images to the input unit 15 of the arithmetic device 37.
- the primary calibration value is basically a fixed value and is not rewritten.
- the computing device 37 includes an input unit 15, a control unit 38, a secondary storage unit 41, and an output unit 17.
- the control unit 38 includes a secondary calibration unit 39 and a calibration calculation unit 40 in place of the calibration unit 19 and the calibration calculation unit 21 of the embodiment described above. The other components are the same as those of that embodiment.
- the secondary calibration unit 39 performs secondary calibration on the images received from the first camera 31 and the second camera 32.
- the secondary calibration corrects a deviation caused by a change with time after shipment from the factory, a vibration collision, or the like with respect to an image after the primary calibration.
- the calibration calculation unit 40 performs a calibration value update process similar to that of the embodiment shown in FIGS. 1 to 13 on the left and right images after the primary calibration. Accordingly, the calibration value for additional calibration stored in the secondary storage unit 41 can be updated as the secondary calibration value.
- in the camera device 30 of this embodiment as well, the calibration can be performed satisfactorily, as in the embodiment shown in FIGS. 1 to 13, and it is not necessary to calculate the vanishing point.
- the camera device 50 of the embodiment shown in FIG. 15 includes a first camera 51, a second camera 52, and an arithmetic device 57.
- the first camera 51 and the second camera 52 are similar to the first camera 31 and the second camera 32 of the embodiment shown in FIG. 14, and include imaging units 54a and 54b, calibration units 55a and 55b, and calibration value memories 56a and 56b, respectively.
- the calibration value memories 56a and 56b can be rewritten by signals from the arithmetic unit 57, unlike the primary calibration value memories 36a and 36b of the embodiment shown in FIG.
- the computing device 57 includes a communication unit 58, a control unit 59, and an output unit 17.
- the communication unit 58 receives image signals from the first camera 51 and the second camera 52 in the same manner as the input unit 15 of the embodiment shown in FIGS. 1 to 13, and can output rewrite signals to the calibration value memories 56a and 56b of the first camera 51 and the second camera 52.
- the control unit 59 includes a calibration calculation unit 60 and a stereo calculation unit 20.
- the output unit 17 and the stereo calculation unit 20 operate in the same manner as the output unit 17 and the stereo calculation unit 20 of the embodiment shown in FIGS. 1 to 13.
- the calibration calculation unit 60 can calculate the calibration value from the left image and the right image in the same manner as the calibration calculation unit 21 of the embodiment shown in FIGS. 1 to 13.
- the calibration value calculated by the calibration calculation unit 60 is transmitted to the calibration value memories 56a and 56b of the first camera 51 and the second camera 52 via the communication unit 58, and the calibration value is updated.
- the first camera 51 and the second camera 52 transmit calibrated images to the computing device 57.
- the stereo computing unit 20 executes processing such as detection of the measurement object in the image and distance calculation, and outputs the result.
- the camera device performs calibration by detecting parallel lines such as white lines on the road surface.
- this calibration method cannot be used when no parallel straight lines are present in the scene.
- there are also situations where the white lines on the road surface are not parallel to each other because of gradient changes such as slopes and road curves, and this calibration method cannot be used.
- the camera device according to one of the embodiments may project two or more straight lines parallel to each other on the road surface during traveling. The camera device may be calibrated based on the projected straight line.
- the camera device 70 includes projection devices 71a and 71b.
- the projection devices 71a and 71b are devices that project straight lines parallel to each other on the road surface.
- the projection devices 71a and 71b are provided near the left and right headlights on the front bumper of the vehicle, and emit light beams that are wide in the vertical direction and narrow in the horizontal direction toward the area ahead of the traveling vehicle. By irradiating these light beams toward the road surface, two straight lines parallel to each other are projected onto the road surface.
- the projection devices 71a and 71b are not limited to projecting two parallel lines, and may project other specific patterns including parallel lines.
- the light emitted from the projection devices 71a and 71b may be any of visible light such as white light, infrared light, and ultraviolet light.
- the camera device 70 images two parallel lines projected on the road surface with the first camera 11 and the second camera 12.
- the projection devices 71a and 71b do not need to be divided into two units, and may be a single projection device, for example.
- the projection device may project two or more parallel lines on the road surface from the center of the front bumper of the vehicle.
- the projection device is not limited to the bumper of the vehicle, and may be fixed to any position of the vehicle interior, the fender grille, the side fender, the light module, and the bonnet.
- the camera device 70 is configured in the same manner as the stereo camera device 10 shown in FIG. 2 except for the projection devices 71a and 71b.
- the projection devices 71a and 71b cooperate with the control unit 16 of the arithmetic device 14 and project two straight lines parallel to each other on the road surface in the calculation mode.
- the control unit 16 instructs the projection devices 71a and 71b to project two parallel lines.
- the calibration calculation unit 21 performs calibration according to the same flowchart-based procedure described above, and calculates and updates the calibration value.
- the camera device 70 can calibrate the deviation that occurs over time in the position and orientation of the stereo camera.
- the camera device 70 can calibrate the stereo camera at any time.
- the camera device 70 is less likely to be erroneously recognized and has high calibration accuracy even when there are construction marks or building shadows on the road surface.
- the present disclosure is not limited to the above-described embodiment, and various modifications or changes are possible.
- the number of cameras mounted on the camera device is not limited to two, and three or more cameras can be used.
- the processing contents and function sharing between the first and second cameras and the arithmetic device of each embodiment are examples, and processing and function sharing in other modes is also possible.
- in the above embodiments, the calibration value is always updated after the mapping step to the three-dimensional coordinate space (step S105). However, if the straight lines captured by the first camera and the second camera are, when mapped into the three-dimensional coordinate space, substantially parallel and substantially coincident with each other, it may be determined that updating the calibration value is unnecessary, and the processing from step S106 onward may be skipped.
- the distance measurement and object recognition methods by the stereo calculation unit are not limited to those described above, and various methods can be applied.
- the present disclosure is not limited to the use of vehicles such as automobiles, and can be applied to, for example, surveillance cameras, camera devices for monitoring production facilities, remote control robots, unmanned aircraft including drones, and the like.
- for example, the present disclosure can also be applied to an aircraft flying at low altitude.
- parallel straight lines can be extracted from various things such as two parallel sides of a box-shaped object such as a building, and a linearly extending passage and wall boundary.
- descriptions of “image”, “image data”, and “image information” can be understood by changing to other descriptions depending on the situation.
Description
Hereinafter, a plurality of embodiments of the present disclosure will be described with reference to the drawings.
10, 30, 50, 70: camera device
11, 31, 51: first camera
12, 32, 52: second camera
14, 37, 57: arithmetic device
15: input unit
16, 38, 59: control unit (controller)
17: output unit
18: storage unit
19: calibration unit
20: stereo calculation unit
21, 40, 60: calibration calculation unit
34a, 34b, 54a, 54b: imaging unit
35a, 35b: primary calibration unit
36a, 36b: primary calibration value memory
39: secondary calibration unit
41: secondary storage unit
58: communication unit
55a, 55b: calibration unit
71a, 71b: projection device
100: road surface
101, 102: white line
103, 104: straight line; 103L, 103R, 104L, 104R: point group
105L, 105R, 106L, 106R: approximate straight line
107L, 107R, 108L, 108R: approximate straight line
109L, 109R, 110L, 110R: approximate straight line
Claims (13)
- A calculation device comprising a controller that extracts, from an image capturing a subject, a plurality of lines corresponding to straight lines that are mutually parallel on the subject, and calculates first calibration data such that the mapped lines obtained when each of the extracted plurality of lines is mapped into a three-dimensional coordinate space are mutually parallel.
- The calculation device according to claim 1, wherein, in calculating the first calibration data, the controller maps each of the plurality of lines into the three-dimensional coordinate space and calculates the first calibration data at which the plurality of mapped lines are mutually parallel.
- The calculation device according to claim 1 or 2, wherein the controller extracts, from the image, the plurality of lines corresponding to the mutually parallel straight lines located on a single plane of the subject.
- The calculation device according to claim 3, wherein the controller calibrates, based on the first calibration data, the angle formed between the optical axis of the camera that captures the image and the plane.
- The calculation device according to any one of claims 1 to 4, wherein the controller calculates the first calibration data for each of a plurality of images capturing the same subject with mutual parallax, and calculates second calibration data by comparing the mapped lines that correspond to one another across the plurality of images.
- The calculation device according to claim 5, wherein the controller calibrates, based on the second calibration data, the amount of parallax deviation between the plurality of cameras that capture the plurality of images.
- A camera device comprising: a calculation device having a controller that extracts, from an image capturing a subject, a plurality of lines corresponding to straight lines that are mutually parallel on the subject, and calculates first calibration data such that the mapped lines obtained when each of the extracted plurality of lines is mapped into a three-dimensional coordinate space are mutually parallel; and one or more cameras that capture the image of the subject.
- A vehicle comprising: a calculation device having a controller that extracts, from an image capturing a subject, a plurality of lines corresponding to straight lines that are mutually parallel on the subject, and calculates first calibration data such that the mapped lines obtained when each of the extracted plurality of lines is mapped into a three-dimensional coordinate space are mutually parallel; and one or more cameras that capture the image of the subject.
- The vehicle according to claim 8, wherein the one or more cameras face the exterior of the vehicle.
- The vehicle according to claim 8 or 9, wherein the controller calibrates the mounting depression angles of the plurality of cameras, taken as a reference value, based on the first calibration data.
- The vehicle according to any one of claims 8 to 10, wherein the controller extracts the plurality of straight lines from the ground portion of the image.
- A calibration method comprising: a step of extracting, from an image capturing a subject, a plurality of lines corresponding to straight lines that are mutually parallel on the subject; and a step of calculating first calibration data such that the mapped lines obtained when each of the extracted plurality of lines is mapped into a three-dimensional coordinate space are mutually parallel.
- A calibration method comprising: a step of calculating the first calibration data for each of a plurality of images having mutual parallax in accordance with the calibration method according to claim 12; and a step of calculating second calibration data by comparing the mapped lines that correspond to one another across the plurality of images.
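The first calibration step of claims 1 and 12 can be illustrated with a rough sketch. The following is a hypothetical minimal implementation (the patent does not specify one), assuming a pinhole camera, a flat road plane, and known intrinsics: it searches for the camera pitch angle (one possible first calibration value) at which two lane lines, known to be parallel on the road, map to parallel lines on the ground plane.

```python
import numpy as np

def project_to_ground(u, v, pitch, f=800.0, cu=640.0, cv=360.0, h=1.2):
    """Map an image pixel (u, v) onto the ground plane, given the camera
    pitch angle (rad), focal length f (px), principal point (cu, cv),
    and camera height h (m). Pinhole model, flat-road assumption."""
    ray = np.array([(u - cu) / f, (v - cv) / f, 1.0])  # camera-frame ray (y down)
    c, s = np.cos(pitch), np.sin(pitch)
    R = np.array([[1, 0, 0], [0, c, -s], [0, s, c]])   # pitch about the x-axis
    r = R @ ray
    t = h / r[1]                        # intersect with the plane y = h
    return np.array([r[0], r[2]]) * t   # ground-plane coordinates (x, z)

def line_direction(pix_a, pix_b, pitch):
    """Unit direction of the mapped line through two projected pixels."""
    a = project_to_ground(*pix_a, pitch)
    b = project_to_ground(*pix_b, pitch)
    d = b - a
    return d / np.linalg.norm(d)

def calibrate_pitch(line1, line2, pitches=np.linspace(-0.1, 0.1, 2001)):
    """Search for the pitch at which the two mapped lines are most parallel.
    line1/line2 are pairs of pixel coordinates sampled on two lane markings
    that are known to be parallel on the road."""
    def cost(p):
        d1 = line_direction(*line1, p)
        d2 = line_direction(*line2, p)
        return abs(d1[0] * d2[1] - d1[1] * d2[0])  # 2D cross: zero when parallel
    return min(pitches, key=cost)
```

A second step in the spirit of claim 13 would then run this per camera, compare the mapped lines that correspond to one another across the left and right images, and derive second calibration data (e.g. a parallax offset) from the residual displacement between them.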
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP16799589.3A EP3306267B1 (en) | 2015-05-27 | 2016-05-26 | Arithmetic logic device, camera device, vehicle and calibration method |
US15/576,781 US10373338B2 (en) | 2015-05-27 | 2016-05-26 | Calculation device, camera device, vehicle, and calibration method |
JP2017507018A JP6159905B2 (ja) | 2015-05-27 | 2016-05-26 | 演算装置、カメラ装置、車両及びキャリブレーション方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015107969 | 2015-05-27 | ||
JP2015-107969 | 2015-05-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016189878A1 true WO2016189878A1 (ja) | 2016-12-01 |
Family
ID=57392682
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/002560 WO2016189878A1 (ja) | 2015-05-27 | 2016-05-26 | 演算装置、カメラ装置、車両及びキャリブレーション方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US10373338B2 (ja) |
EP (1) | EP3306267B1 (ja) |
JP (1) | JP6159905B2 (ja) |
WO (1) | WO2016189878A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106851201A (zh) * | 2017-02-09 | 2017-06-13 | 苏州慧景光电科技有限公司 | 基于光纤传像技术的车载全景影像系统及其标定方法 |
CN107563326A (zh) * | 2017-08-31 | 2018-01-09 | 京东方科技集团股份有限公司 | 一种行车辅助方法、行车辅助装置和车辆 |
EP3383038A1 (en) * | 2017-03-29 | 2018-10-03 | Imagination Technologies Limited | Camera calibration |
CN108734741A (zh) * | 2017-04-18 | 2018-11-02 | 松下知识产权经营株式会社 | 摄像头校正方法、摄像头校正程序以及摄像头校正装置 |
JP2020012735A (ja) * | 2018-07-18 | 2020-01-23 | 日立オートモティブシステムズ株式会社 | 車載環境認識装置 |
WO2020105499A1 (ja) * | 2018-11-20 | 2020-05-28 | ソニーセミコンダクタソリューションズ株式会社 | 画像処理装置、および画像処理方法、並びにプログラム |
JP2021522507A (ja) * | 2018-05-04 | 2021-08-30 | ヴァレオ・シャルター・ウント・ゼンゾーレン・ゲーエムベーハー | 光電センサの角度位置を測定するための方法、および試験台 |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9916703B2 (en) * | 2015-11-04 | 2018-03-13 | Zoox, Inc. | Calibration for autonomous vehicle operation |
CN108348134B (zh) * | 2016-02-10 | 2020-05-19 | 奥林巴斯株式会社 | 内窥镜系统 |
CN108496138B (zh) * | 2017-05-25 | 2022-04-22 | 深圳市大疆创新科技有限公司 | 一种跟踪方法及装置 |
US11089288B2 (en) * | 2017-09-11 | 2021-08-10 | Tusimple, Inc. | Corner point extraction system and method for image guided stereo camera optical axes alignment |
US11158088B2 (en) | 2017-09-11 | 2021-10-26 | Tusimple, Inc. | Vanishing point computation and online alignment system and method for image guided stereo camera optical axes alignment |
US10529128B1 (en) * | 2018-04-27 | 2020-01-07 | Facebook Technologies, Llc | Apparatus, system, and method for mapping a 3D environment |
CN108647638B (zh) * | 2018-05-09 | 2021-10-12 | 东软睿驰汽车技术(上海)有限公司 | 一种车辆位置检测方法及装置 |
CN109084804B (zh) * | 2018-08-21 | 2020-11-10 | 北京云迹科技有限公司 | 机器人定位精准度判定处理方法及装置 |
CN109345589A (zh) * | 2018-09-11 | 2019-02-15 | 百度在线网络技术(北京)有限公司 | 基于自动驾驶车辆的位置检测方法、装置、设备及介质 |
CN109389650B (zh) * | 2018-09-30 | 2021-01-12 | 京东方科技集团股份有限公司 | 一种车载相机的标定方法、装置、车辆和存储介质 |
CN109712196B (zh) * | 2018-12-17 | 2021-03-30 | 北京百度网讯科技有限公司 | 摄像头标定处理方法、装置、车辆控制设备及存储介质 |
DE102019207982A1 (de) * | 2019-05-31 | 2020-12-03 | Deere & Company | Sensoranordnung für ein landwirtschaftliches Fahrzeug |
CN110675016B (zh) * | 2019-08-08 | 2020-04-07 | 北京航空航天大学 | 一种基于端边云架构的矿车无人驾驶运输系统云智能调度系统及充电方法 |
CN111141311B (zh) * | 2019-12-31 | 2022-04-08 | 武汉中海庭数据技术有限公司 | 一种高精度地图定位模块的评估方法及系统 |
JP7457574B2 (ja) * | 2020-05-21 | 2024-03-28 | 株式会社Subaru | 画像処理装置 |
TWI782709B (zh) * | 2021-09-16 | 2022-11-01 | 財團法人金屬工業研究發展中心 | 手術機械臂控制系統以及手術機械臂控制方法 |
US11967112B2 (en) * | 2022-09-07 | 2024-04-23 | AitronX Inc. | Method and apparatus for detecting calibration requirement for image sensors in vehicles |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001272210A (ja) * | 2000-03-27 | 2001-10-05 | Toyoda Mach Works Ltd | 距離認識装置 |
JP3600378B2 (ja) * | 1996-07-24 | 2004-12-15 | 本田技研工業株式会社 | 車両の外界認識装置 |
JP2012198075A (ja) * | 2011-03-18 | 2012-10-18 | Ricoh Co Ltd | ステレオカメラ装置、画像補整方法 |
JP2015190921A (ja) * | 2014-03-28 | 2015-11-02 | 富士重工業株式会社 | 車両用ステレオ画像処理装置 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4573977B2 (ja) * | 1999-09-22 | 2010-11-04 | 富士重工業株式会社 | 監視システムの距離補正装置、および監視システムの消失点補正装置 |
JP3352655B2 (ja) * | 1999-09-22 | 2002-12-03 | 富士重工業株式会社 | 車線認識装置 |
AU2003248269A1 (en) * | 2002-07-12 | 2004-02-02 | Iwane Laboratories, Ltd. | Road and other flat object video plan-view developing image processing method, reverse developing image conversion processing method, plan-view developing image processing device, and reverse developing image conversion processing device |
WO2004106856A1 (ja) * | 2003-05-29 | 2004-12-09 | Olympus Corporation | ステレオカメラ支持装置、ステレオカメラ支持方法及びキャリブレーション検出装置及びキャリブレーション補正装置並びにステレオカメラシステム |
FR2874300B1 (fr) | 2004-08-11 | 2006-11-24 | Renault Sas | Procede de calibration automatique d'un systeme de stereovision |
JP4363295B2 (ja) * | 2004-10-01 | 2009-11-11 | オムロン株式会社 | ステレオ画像による平面推定方法 |
EP1901225A1 (en) * | 2005-05-10 | 2008-03-19 | Olympus Corporation | Image processing device, image processing method, and image processing program |
CN101894366B (zh) * | 2009-05-21 | 2014-01-29 | 北京中星微电子有限公司 | 一种获取标定参数的方法、装置及一种视频监控系统 |
JP5188452B2 (ja) * | 2009-05-22 | 2013-04-24 | 富士重工業株式会社 | 道路形状認識装置 |
US20110242355A1 (en) * | 2010-04-05 | 2011-10-06 | Qualcomm Incorporated | Combining data from multiple image sensors |
US9245345B2 (en) * | 2011-06-29 | 2016-01-26 | Nec Solution Innovators, Ltd. | Device for generating three dimensional feature data, method for generating three-dimensional feature data, and recording medium on which program for generating three-dimensional feature data is recorded |
KR20150087619A (ko) * | 2014-01-22 | 2015-07-30 | 한국전자통신연구원 | 증강 현실 기반의 차로 변경 안내 장치 및 방법 |
2016
- 2016-05-26 WO PCT/JP2016/002560 patent/WO2016189878A1/ja active Application Filing
- 2016-05-26 JP JP2017507018A patent/JP6159905B2/ja active Active
- 2016-05-26 EP EP16799589.3A patent/EP3306267B1/en active Active
- 2016-05-26 US US15/576,781 patent/US10373338B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3600378B2 (ja) * | 1996-07-24 | 2004-12-15 | 本田技研工業株式会社 | 車両の外界認識装置 |
JP2001272210A (ja) * | 2000-03-27 | 2001-10-05 | Toyoda Mach Works Ltd | 距離認識装置 |
JP2012198075A (ja) * | 2011-03-18 | 2012-10-18 | Ricoh Co Ltd | ステレオカメラ装置、画像補整方法 |
JP2015190921A (ja) * | 2014-03-28 | 2015-11-02 | 富士重工業株式会社 | 車両用ステレオ画像処理装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3306267A4 * |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106851201A (zh) * | 2017-02-09 | 2017-06-13 | 苏州慧景光电科技有限公司 | 基于光纤传像技术的车载全景影像系统及其标定方法 |
US10609281B2 (en) | 2017-03-29 | 2020-03-31 | Imagination Technologies Limited | Camera calibration |
EP3383038A1 (en) * | 2017-03-29 | 2018-10-03 | Imagination Technologies Limited | Camera calibration |
CN108696745A (zh) * | 2017-03-29 | 2018-10-23 | 畅想科技有限公司 | 相机校准 |
CN108696745B (zh) * | 2017-03-29 | 2021-08-17 | 畅想科技有限公司 | 相机校准 |
CN108734741A (zh) * | 2017-04-18 | 2018-11-02 | 松下知识产权经营株式会社 | 摄像头校正方法、摄像头校正程序以及摄像头校正装置 |
CN107563326A (zh) * | 2017-08-31 | 2018-01-09 | 京东方科技集团股份有限公司 | 一种行车辅助方法、行车辅助装置和车辆 |
JP7130059B2 (ja) | 2018-05-04 | 2022-09-02 | ヴァレオ・シャルター・ウント・ゼンゾーレン・ゲーエムベーハー | 光電センサの角度位置を測定するための方法、および試験台 |
JP2021522507A (ja) * | 2018-05-04 | 2021-08-30 | ヴァレオ・シャルター・ウント・ゼンゾーレン・ゲーエムベーハー | 光電センサの角度位置を測定するための方法、および試験台 |
WO2020017334A1 (ja) * | 2018-07-18 | 2020-01-23 | 日立オートモティブシステムズ株式会社 | 車載環境認識装置 |
JP2020012735A (ja) * | 2018-07-18 | 2020-01-23 | 日立オートモティブシステムズ株式会社 | 車載環境認識装置 |
JP7219561B2 (ja) | 2018-07-18 | 2023-02-08 | 日立Astemo株式会社 | 車載環境認識装置 |
WO2020105499A1 (ja) * | 2018-11-20 | 2020-05-28 | ソニーセミコンダクタソリューションズ株式会社 | 画像処理装置、および画像処理方法、並びにプログラム |
JPWO2020105499A1 (ja) * | 2018-11-20 | 2021-10-14 | ソニーセミコンダクタソリューションズ株式会社 | 画像処理装置、および画像処理方法、並びにプログラム |
JP7308227B2 (ja) | 2018-11-20 | 2023-07-13 | ソニーセミコンダクタソリューションズ株式会社 | 画像処理装置、および画像処理方法、並びにプログラム |
US11836945B2 (en) | 2018-11-20 | 2023-12-05 | Sony Semiconductor Solutions Corporation | Image processing apparatus, image processing method, and program |
Also Published As
Publication number | Publication date |
---|---|
JP6159905B2 (ja) | 2017-07-05 |
EP3306267A1 (en) | 2018-04-11 |
US20180165833A1 (en) | 2018-06-14 |
EP3306267B1 (en) | 2020-04-01 |
JPWO2016189878A1 (ja) | 2017-08-24 |
EP3306267A4 (en) | 2019-01-23 |
US10373338B2 (en) | 2019-08-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6159905B2 (ja) | 演算装置、カメラ装置、車両及びキャリブレーション方法 | |
US10558867B2 (en) | Image processing apparatus, stereo camera apparatus, vehicle, and image processing method | |
US11143514B2 (en) | System and method for correcting high-definition map images | |
US10916035B1 (en) | Camera calibration using dense depth maps | |
US11555903B1 (en) | Sensor calibration using dense depth maps | |
JP6141562B1 (ja) | 視差算出装置、ステレオカメラ装置、車両及び視差算出方法 | |
JP6456499B2 (ja) | 立体物検出装置、ステレオカメラ装置、車両及び立体物検出方法 | |
JP6121641B1 (ja) | 画像処理装置、ステレオカメラ装置、車両、及び画像処理方法 | |
WO2017068792A1 (ja) | 視差算出装置、ステレオカメラ装置、車両および視差算出方法 | |
JP2018088092A (ja) | 画像処理装置、車載システム、車載カメラ、移動体、および画像処理方法 | |
US20240046491A1 (en) | System and Method of Automatic Image View Alignment for Camera-Based Road Condition Detection on a Vehicle | |
US20230215026A1 (en) | On-vehicle spatial monitoring system | |
US20230136871A1 (en) | Camera calibration | |
CN112912944B (zh) | 图像处理装置、摄像头、移动体以及图像处理方法 | |
US20210358159A1 (en) | Image processing apparatus, imaging apparatus, mobile body, and image processing method | |
JP2013161188A (ja) | 物体認識装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16799589 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2017507018 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 15576781 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
WWE | Wipo information: entry into national phase |
Ref document number: 2016799589 Country of ref document: EP |