WO2016208200A1 - Image Processing Device, Stereo Camera Device, Vehicle, and Image Processing Method
- Publication number
- WO2016208200A1 · PCT/JP2016/003058 · JP2016003058W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- captured image
- feature points
- parallax
- image processing
- control unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/10—Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument
- G01C3/14—Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument with binocular observation at a single point, e.g. stereoscopic type
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/08—Stereoscopic photography by simultaneous recording
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T7/85—Stereo camera calibration
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/246—Calibration of cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
Definitions
- the present disclosure relates to an image processing device, a stereo camera device, a vehicle, and an image processing method.
- a stereo camera in which an object such as an object or a person is simultaneously imaged by a plurality of cameras, and the distance to the object is measured based on the principle of triangulation using the captured image.
- safe driving is supported by notifying the driver of the presence of an object existing around the vehicle.
- the image processing apparatus is an image processing apparatus that calibrates a stereo camera including a plurality of imaging units, and includes an input unit and a control unit.
- the input unit acquires a first captured image and a second captured image respectively captured by the plurality of imaging units.
- the control unit calculates a parallax by performing one-dimensional matching based on a pixel value of the first captured image and a pixel value of the second captured image.
- the control unit extracts one or more first feature points from a region of the first captured image in which pixels having parallax within a predetermined range are continuous, and extracts a second feature point corresponding to each of the first feature points by two-dimensionally matching the first feature points with the pixels of the second captured image.
- the control unit calibrates the imaging unit based on the position of the first feature point and the position of the second feature point.
- the stereo camera device of the present disclosure includes a plurality of imaging units and an image processing device that calibrates the stereo camera including the plurality of imaging units.
- the image processing apparatus includes an input unit and a control unit.
- the input unit receives an input of a first captured image and a second captured image respectively captured by the plurality of imaging units.
- the control unit calculates a parallax by performing one-dimensional matching based on a pixel value of the first captured image and a pixel value of the second captured image.
- the control unit extracts one or more first feature points from a region of the first captured image in which pixels having parallax within a predetermined range are continuous, and extracts a second feature point corresponding to each of the first feature points by two-dimensionally matching the first feature points with the pixels of the second captured image.
- the control unit calibrates the imaging unit based on the position of the first feature point and the position of the second feature point.
- the vehicle of the present disclosure includes a stereo camera device.
- the stereo camera device includes a plurality of imaging units and an image processing device that calibrates the stereo camera including the plurality of imaging units.
- the image processing apparatus includes an input unit and a control unit.
- the input unit receives an input of a first captured image and a second captured image respectively captured by the plurality of imaging units.
- the control unit calculates a parallax by performing one-dimensional matching based on a pixel value of the first captured image and a pixel value of the second captured image.
- the control unit extracts one or more first feature points from a region of the first captured image in which pixels having parallax within a predetermined range are continuous, and extracts a second feature point corresponding to each of the first feature points by two-dimensionally matching the first feature points with the pixels of the second captured image.
- the control unit calibrates the imaging unit based on the position of the first feature point and the position of the second feature point.
- the image processing method of the present disclosure is an image processing method of an image processing apparatus that calibrates a stereo camera including a plurality of imaging units.
- the image processing method includes a step in which the control unit of the image processing apparatus calculates the parallax by performing one-dimensional matching based on the pixel values of a first captured image captured by a first imaging unit among the plurality of imaging units and the pixel values of a second captured image captured by a second imaging unit, different from the first imaging unit, among the plurality of imaging units.
- in the image processing method, one or more first feature points are extracted from a region of the first captured image in which pixels having parallax within a predetermined range are continuous, and a second feature point corresponding to each of the one or more first feature points is extracted by two-dimensionally matching the first feature points with the pixels of the second captured image.
- the image processing method includes calibrating the imaging unit based on the position of the first feature point and the position of the second feature point.
- FIG. 1 is a perspective view of a stereo camera device including the image processing device of this embodiment.
- FIG. 2 is a side view of a vehicle in which the stereo camera device shown in FIG. 1 is installed.
- FIG. 3 is a block diagram of the stereo camera device shown in FIG. 1.
- FIG. 4 is a diagram showing the first and second captured images respectively captured by the first and second imaging units shown in FIG. 1, together with distributions of one-dimensional pixel values.
- FIG. 5 is a conceptual diagram showing the parallax of the pixels constituting the subject area.
- a stereo camera must accurately measure the distance from its plurality of cameras to the subject captured by each of them in order to accurately notify the user of the distance to an object existing around the stereo camera.
- if the positional relationship between the cameras deviates from the assumed relationship, the distance to the object cannot be measured accurately.
- in a known technique, it is determined whether a reference area is suitable for matching, and the reference area is then used for the matching.
- in the present disclosure, the parallax is calculated by one-dimensional matching, a first feature point is extracted on the first captured image captured by the stereo camera, a second feature point on the second captured image corresponding to the first feature point is extracted by two-dimensional matching, and the imaging unit is thereby calibrated.
- the stereo camera device 1 includes a plurality of imaging units 11L and 11R whose optical axes do not coincide with each other, and an image processing unit 10.
- the image processing unit 10 is also referred to as an image processing device. Because the optical axes do not coincide, the plurality of images simultaneously captured by the imaging units 11L and 11R have parallax d with respect to each other.
- references for the imaging time include the imaging start time, the imaging end time, the transmission time of the captured image data, and the time at which the counterpart device receives the image data.
- the stereo camera device 1 may be a device in which a plurality of cameras are included in one housing.
- the stereo camera device 1 may be a device including two or more cameras which are independent from each other and located apart from each other.
- the stereo camera device 1 is not limited to a plurality of mutually independent cameras. In the present disclosure, for example, a camera having an optical mechanism that guides light incident at two distant locations to one light-receiving element can also be employed as the stereo camera device 1.
- the first imaging unit 11L and the second imaging unit 11R include a solid-state imaging device.
- the solid-state imaging device includes a CCD image sensor (Charge-Coupled Device Image Sensor) and a CMOS image sensor (Complementary MOS Image Sensor).
- the first imaging unit 11L and the second imaging unit 11R may include a lens mechanism.
- the first imaging unit 11L and the second imaging unit 11R capture a real space, and generate a first captured image 14L and a second captured image 14R, respectively.
- the stereo camera device 1 can accurately measure the distance from the stereo camera device 1 to the subject.
- the imaging unit installed on the left side when the stereo camera device 1 is viewed from the side opposite the subject is called the first imaging unit 11L, and the imaging unit installed on the right side is called the second imaging unit 11R.
- the first imaging unit 11L and the second imaging unit 11R can image the outside of the vehicle 15 via the windshield of the vehicle 15.
- the first imaging unit 11L and the second imaging unit 11R may be fixed to any of the front bumper, fender grille, side fender, light module, and bonnet of the vehicle 15.
- parallel is not limited to strict parallel.
- Parallel includes, for example, a state in which the optical axes of the imaging units 11L and 11R deviate from being exactly parallel but can be regarded as substantially parallel.
- horizontal is not limited to strict horizontal.
- Horizontal includes a state in which the direction of the baseline length deviates from being exactly parallel to the surface of the ground plane but can be regarded as substantially horizontal.
- the “vehicle” in the present disclosure includes, but is not limited to, an automobile, a railway vehicle, an industrial vehicle, and a living vehicle.
- vehicle may include an airplane traveling on a runway.
- the automobile includes, but is not limited to, a passenger car, a truck, a bus, a two-wheeled vehicle, a trolley bus, and the like, and may include other vehicles that travel on the road.
- Rail vehicles include, but are not limited to, locomotives, freight cars, passenger cars, trams, guided railroads, ropeways, cable cars, linear motor cars, and monorails, and may include other vehicles that travel along the track.
- Industrial vehicles include industrial vehicles for agriculture and construction. Industrial vehicles include but are not limited to forklifts and golf carts.
- Industrial vehicles for agriculture include, but are not limited to, tractors, tillers, transplanters, binders, combines, and lawn mowers.
- Industrial vehicles for construction include, but are not limited to, bulldozers, scrapers, excavators, cranes, dump trucks, and road rollers.
- Living vehicles include, but are not limited to, bicycles, wheelchairs, baby carriages, wheelbarrows, and electric two-wheelers.
- Vehicle power engines include, but are not limited to, internal combustion engines including diesel engines, gasoline engines, and hydrogen engines, and electrical engines including motors.
- Vehicle includes a human-powered vehicle. The classification of “vehicle” is not limited to the above. For example, an automobile may include an industrial vehicle capable of traveling on a road, and the same vehicle may be included in a plurality of classifications.
- the image processing unit 10 includes an input unit 12 and a control unit 13.
- the input unit 12 is an input interface for inputting image data to the image processing unit 10.
- the input unit 12 can employ a physical connector and a wireless communication device.
- the physical connector includes an electrical connector that supports transmission using an electrical signal, an optical connector that supports transmission using an optical signal, and an electromagnetic connector that supports transmission using electromagnetic waves.
- Electrical connectors include connectors conforming to IEC 60603, connectors conforming to the USB standard, connectors corresponding to RCA terminals, connectors corresponding to the S terminal defined in EIAJ CP-1211A, connectors corresponding to the D terminal defined in EIAJ RC-5237, connectors conforming to the HDMI (registered trademark) standard, and connectors corresponding to coaxial cables including BNC connectors.
- the optical connector includes various connectors conforming to IEC 61754.
- the wireless communication device includes a wireless communication device that complies with each standard including Bluetooth (registered trademark) and IEEE802.11.
- the wireless communication device includes at least one antenna.
- the input unit 12 receives image data of images captured by the first imaging unit 11L and the second imaging unit 11R.
- the input unit 12 delivers the input image data to the control unit 13.
- the input to the input unit 12 includes a signal input via a wired cable and a signal input via a wireless connection.
- the input unit 12 may correspond to the imaging signal transmission method of the stereo camera device 1.
- the control unit 13 includes one or a plurality of processors.
- the control unit 13 or the processor may include one or a plurality of memories that store programs for various processes and information being calculated.
- the memory includes volatile memory and nonvolatile memory.
- the memory includes a memory independent of the processor and a built-in memory of the processor.
- the processor includes a general-purpose processor that reads a specific program and executes a specific function, and a dedicated processor specialized for a specific process.
- the dedicated processor includes an application-specific IC (ASIC; Application Specific Integrated Circuit).
- the processor includes a programmable logic device (PLD).
- PLD includes an FPGA (Field-Programmable Gate Array).
- the control unit 13 may be one of SoC (System-on-a-Chip) and SiP (System-In-a-Package) in which one or a plurality of processors cooperate.
- the control unit 13 measures the real-space distance from the stereo camera device 1 to the subject captured in the first captured image 14L and the second captured image 14R input to the input unit 12.
- the control unit 13 calculates the distance from the stereo camera device 1 to the subject.
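The distance computation from the parallax follows the principle of triangulation mentioned above. A minimal sketch (the function name and the numeric values are illustrative assumptions, not taken from the patent), with focal length f in pixels, baseline length B in metres, and parallax d in pixels:

```python
def distance_from_parallax(focal_px: float, baseline_m: float, parallax_px: float) -> float:
    """Distance Z to a subject from stereo parallax: Z = f * B / d.

    focal_px: focal length in pixels; baseline_m: baseline length in metres;
    parallax_px: parallax d in pixels. Illustrative names, not from the patent.
    """
    if parallax_px <= 0:
        raise ValueError("parallax must be positive")
    return focal_px * baseline_m / parallax_px

# Example: f = 1400 px, B = 0.35 m, d = 80 px gives Z = 6.125 m
```

A larger parallax therefore corresponds to a nearer subject, which is why an erroneous parallax directly corrupts the measured distance.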
- the spatial coordinate system has an arbitrary point as its origin, the baseline-length direction as the X axis, and two directions perpendicular to the baseline length and perpendicular to each other as the Y axis and the Z axis, respectively.
- the optical axes of the first imaging unit 11L and the second imaging unit 11R are parallel to the Z axis, the row direction of the imaging surface is parallel to the X axis, and the column direction of the imaging surface is parallel to the Y axis.
- a rotation angle around the X axis is a pitch angle θ.
- a rotation angle around the Z axis is a roll angle φ.
- since both optical axes are parallel to the Z axis and the column direction of the imaging surface is parallel to the Y axis, which is perpendicular to the direction of the baseline length, the positions of spot images of the same subject in the first captured image 14L and the second captured image 14R differ only in the row direction. Therefore, in the stereo camera device 1, in order to calculate the distance at a high speed of, for example, 30 fps, the matching that associates spot images of the same subject in the first captured image 14L and the second captured image 14R is performed as one-dimensional matching along the direction parallel to the baseline length, that is, the X-axis direction.
- the accuracy of the above-described one-dimensional matching of spot images decreases as the displacement ΔY of the first imaging unit 11L along the Y-axis direction with respect to the second imaging unit 11R, among the exterior orientation elements, increases.
- the accuracy of spot-image association by one-dimensional matching likewise decreases as the deviation Δθ of the pitch angle θ of the optical axis increases. Therefore, as described below, the stereo camera device 1 calibrates the position Y and the pitch angle θ of the first imaging unit 11L, with the second imaging unit 11R as a reference, based on the first captured image 14L and the second captured image 14R.
- the control unit 13 calculates the parallax d of the position of each pixel in the first captured image 14L and the second captured image 14R.
- the position of each pixel of the first captured image 14L and the second captured image 14R is expressed in a (u, v) image coordinate system having a U axis parallel to the row direction of the imaging surface and a V axis parallel to the column direction of the imaging surface.
- the control unit 13 calculates the parallax d by one-dimensional matching along the row direction of the imaging surface. Specifically, the control unit 13 compares the distribution of one-dimensional pixel values in the U-axis direction at each v coordinate in the first captured image 14L with the distribution of one-dimensional pixel values in the U-axis direction at the same v coordinate in the second captured image 14R. The control unit 13 calculates, as the parallax d, the difference between the positions of two pixels having corresponding pixel values in the two distributions.
- the control unit 13 determines a v coordinate at which the parallax d is calculated. For example, the control unit 13 calculates the parallax d at the v coordinate v1.
- the distributions of pixel values extracted from the first captured image 14L and the second captured image 14R are shown, for example, in FIG. 4 (3) and (4).
- based on the two extracted pixel value distributions, the control unit 13 associates the pixels of the first captured image 14L with the pixels of the second captured image 14R by one-dimensional matching. That is, the control unit 13 extracts the most likely pixel on the second captured image 14R representing the spot image represented by a pixel of the first captured image 14L, and associates it with that pixel of the first captured image 14L.
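The one-dimensional matching described above can be sketched as block matching along a single scanline. This is an illustrative reading only: the sum-of-absolute-differences cost, the window size, and the search range are assumptions, since the patent does not prescribe a specific cost function. The search direction follows the patent's convention that the corresponding pixel in the second image lies at u + d.

```python
import numpy as np

def parallax_1d(row_left: np.ndarray, row_right: np.ndarray,
                u: int, window: int = 3, max_d: int = 100) -> int:
    """Parallax d of pixel u on one scanline by 1-D block matching.

    row_left / row_right: pixel values at one v coordinate of the first and
    second captured images. Cost is SAD over a small window; the candidate
    in the right row is taken at u + d, following the convention above.
    """
    half = window // 2
    ref = row_left[u - half:u + half + 1].astype(np.int32)
    best_d, best_cost = 0, None
    for d in range(0, max_d + 1):
        j = u + d  # candidate position in the second image, shifted along U
        if j - half < 0 or j + half + 1 > len(row_right):
            break
        cand = row_right[j - half:j + half + 1].astype(np.int32)
        cost = int(np.abs(ref - cand).sum())
        if best_cost is None or cost < best_cost:
            best_d, best_cost = d, cost
    return best_d
```

Repeating this for every u at every v coordinate yields the parallax image discussed next.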
- in the parallax image, the control unit 13 specifies, as a parallax approximate region, a continuous region of pixels whose parallax d differs from the parallax d of adjacent pixels by no more than a predetermined range.
- a parallax image is an image that indicates the amount of pixel shift that represents the same spot image in two different captured images that are simultaneously captured.
- in the present embodiment, the parallax image is an image showing, for each pixel of the first captured image 14L, the amount of deviation in the U-axis direction from the corresponding pixel of the same spot image on the second captured image 14R.
- FIG. 5 is a conceptual diagram showing the parallax d in the subject area 16 shown in FIG.
- each of a plurality of squares surrounding each number corresponds to each pixel in the subject area 16.
- the parallax d of each pixel is indicated at the position of the corresponding pixel.
- the thick line in FIG. 5 encloses the pixels constituting the optical image of the other vehicle.
- the parallax d relating to the pixels surrounded by the bold line is approximately in the range of 79 to 82.
- the parallax d varies depending on the distance from the stereo camera device 1.
- the subjects at substantially the same distance from the stereo camera device 1 have substantially the same parallax d.
- a subject having a certain size on the first captured image 14L and the second captured image 14R is at substantially the same distance from the stereo camera device 1, and the parallax d of the pixels representing the optical image of the subject falls within a certain range. In other words, a region in which the parallax d falls within the predetermined range corresponds to a single connected object.
- the control unit 13 specifies a region where pixels having a parallax d within a predetermined range are continuous as a parallax approximate region.
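The identification of a parallax approximate region can be sketched as region growing on the parallax image. The 4-connectivity, the seed-based formulation, and the tolerance value are illustrative assumptions; the patent only requires that pixels with parallax within a predetermined range be continuous.

```python
from collections import deque
import numpy as np

def parallax_region(disp: np.ndarray, seed: tuple, tol: int = 3) -> set:
    """Grow a parallax approximate region from `seed` = (v, u).

    A pixel joins the region if it is 4-connected to a region pixel and its
    parallax differs from that neighbour's by at most `tol`.
    """
    h, w = disp.shape
    region = {seed}
    queue = deque([seed])
    while queue:
        v, u = queue.popleft()
        for dv, du in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nv, nu = v + dv, u + du
            if (0 <= nv < h and 0 <= nu < w and (nv, nu) not in region
                    and abs(int(disp[nv, nu]) - int(disp[v, u])) <= tol):
                region.add((nv, nu))
                queue.append((nv, nu))
    return region
```

On the FIG. 5 example, a region grown from any pixel with parallax 79 to 82 would collect exactly the pixels inside the thick line, separating the other vehicle from the background.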
- the control unit 13 extracts one or more first feature points P1 from the parallax approximate region.
- the first feature point P1 is a point that is characteristic on the first captured image 14L, that is, a point whose pixel-related feature amount satisfies a predetermined requirement.
- the first feature point P1 is, for example, a corner of an edge at which the derivative of the luminance value is equal to or greater than a predetermined value.
- the control unit 13 determines whether or not the first feature point P1 extracted from the first captured image 14L is suitable for use in the calibration process. For example, when the first feature point P1 is at least part of a linear edge or a repetitive pattern, matching of the second feature point P2 corresponding to the first feature point P1 is prone to errors. Therefore, the control unit 13 determines whether or not the region including the first feature point P1 and its vicinity contains a linear edge or a repetitive pattern.
- the second feature point P2 is a feature point on the second captured image 14R having a feature amount within a predetermined range of the feature amount of the first feature point P1.
- if the first feature point P1 is determined to be at least part of a linear edge or a repetitive pattern, it is not used in the subsequent processing, and a different first feature point P1 is used instead.
- the control unit 13 searches the second captured image 14R for the second feature point P2 corresponding to the first feature point P1 using a conventionally known two-dimensional pattern matching method, and extracts it.
- the control unit 13 performs two-dimensional pattern matching with sub-pixel accuracy by using a method such as interpolation.
- the starting point PS for searching for the second feature point P2 in the second captured image 14R is set to the position (uL2 + d, vL2), that is, the position (uL2, vL2) of the first feature point P1 shifted by the parallax d in the U-axis direction.
- the control unit 13 searches for the second feature point P2 within a range of a predetermined size centered on the position (uL2 + d, vL2).
- the range of the predetermined size is, for example, a range of 1 to 2 pixels from the starting point PS in the U-axis direction and 1 to 2 pixels from the starting point PS in the V-axis direction.
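The two-dimensional search around the starting point PS, with sub-pixel refinement, might look as follows. The SAD cost and the parabola interpolation along the U axis are illustrative choices; the patent only says "a method such as interpolation", so treat this as a sketch under those assumptions.

```python
import numpy as np

def search_second_feature(img_r: np.ndarray, template: np.ndarray,
                          start: tuple, radius: int = 2) -> tuple:
    """Search img_r around `start` = (uL2 + d, vL2) for the best match to
    `template` (a patch centred on the first feature point), then refine
    the U coordinate to sub-pixel accuracy by fitting a parabola to the
    cost of the best match and its two horizontal neighbours.
    """
    th, tw = template.shape
    su, sv = start
    costs = {}
    for dv in range(-radius, radius + 1):
        for du in range(-radius, radius + 1):
            v0, u0 = sv + dv - th // 2, su + du - tw // 2
            patch = img_r[v0:v0 + th, u0:u0 + tw]
            costs[(du, dv)] = float(np.abs(patch.astype(np.int32)
                                           - template.astype(np.int32)).sum())
    bu, bv = min(costs, key=costs.get)
    u_sub = float(su + bu)
    if (bu - 1, bv) in costs and (bu + 1, bv) in costs:
        c0, c1, c2 = costs[(bu - 1, bv)], costs[(bu, bv)], costs[(bu + 1, bv)]
        denom = c0 - 2 * c1 + c2
        if denom > 0:
            u_sub += 0.5 * (c0 - c2) / denom  # vertex of the fitted parabola
    return u_sub, float(sv + bv)
```

Starting the search at (uL2 + d, vL2) keeps the candidate window small, which is what makes the two-dimensional step fast despite its higher cost per pixel.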
- the control unit 13 calibrates the first imaging unit 11L, with the second imaging unit 11R as a reference, based on the position of the extracted first feature point P1 and the position of the extracted second feature point P2.
- the exterior orientation element to be updated is determined to be the element corresponding to the translation of the first imaging unit 11L in the direction perpendicular to the baseline length and the optical axis.
- the element corresponding to the translation in the direction perpendicular to the baseline length and the optical axis is the Y-axis direction in the spatial coordinate system shown in FIG. 1.
- the X axis and Y axis in the spatial coordinate system are parallel to the U axis and V axis in the image coordinate system, respectively.
- the position of the first feature point P1 extracted as described above is represented by the image coordinates (uL2, vL2), and the position of the second feature point P2 by the image coordinates (uR2, vR2).
- the control unit 13 recognizes that the first imaging unit 11L is displaced, with the second imaging unit 11R as a reference, by the displacement amount ΔYL in the spatial coordinate system corresponding to the displacement amount Δv in the image coordinate system.
- the position YL of the first imaging unit 11L in the Y-axis direction is updated using ΔYL. The control unit 13 performs the same processing even when the exterior orientation element to be updated is the pitch angle θ, the rotation angle around the X axis.
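Under a pinhole model, the mapping from the image displacement Δv to the spatial displacement ΔYL can be derived: for a point at depth Z, Δv = f·ΔY/Z, and since Z = f·B/d (triangulation), this reduces to ΔY = Δv·B/d. This derivation is a plausible reading of the patent's correspondence between Δv and ΔYL, not quoted from it:

```python
def y_translation_update(dv_px: float, baseline_m: float, parallax_px: float) -> float:
    """Spatial Y displacement dY corresponding to an image displacement dv.

    dv = f * dY / Z and Z = f * B / d together give dY = dv * B / d,
    so the focal length cancels out. Illustrative function name.
    """
    return dv_px * baseline_m / parallax_px

# e.g. dv = 0.8 px, B = 0.35 m, d = 80 px gives dY = 0.0035 m
```

Note that the update depends on the parallax d of the feature point, which is why feature points are taken from a parallax approximate region with a reliable d.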
- the control unit 13 starts processing when receiving at least one of a start command, a stop command, and an effective command for calibration control of the stereo camera device 1.
- the input unit 12 receives an input of the first captured image 14L and the second captured image 14R generated by the first imaging unit 11L and the second imaging unit 11R, respectively (step S1).
- the control unit 13 performs one-dimensional matching based on the pixel values of the first captured image 14L and the pixel values of the second captured image 14R input to the input unit 12, and calculates the parallax d of each pixel of the first captured image 14L with respect to the corresponding pixel of the second captured image 14R (step S2).
- the control unit 13 specifies a parallax approximate region in the first captured image 14L (step S3). The control unit 13 then searches the parallax approximate region for one or more first feature points P1 and extracts them (step S4). The control unit 13 determines whether or not each extracted first feature point P1 is suitable for use in the calibration process (step S5); a first feature point P1 is suitable when it is not at least part of a linear edge or a repetitive pattern. Next, the control unit 13 searches the second captured image 14R by the two-dimensional pattern matching method for the second feature point P2 corresponding to each first feature point P1 determined to be suitable, and extracts it (step S6).
- based on the first feature point P1 and the second feature point P2, the control unit 13 calibrates the first imaging unit 11L with the second imaging unit 11R as a reference (step S7).
- the control unit 13 calculates the parallax d by one-dimensional matching, and extracts the first feature point P1 from a continuous region of pixels in which the variation of the parallax d is within a predetermined range. Therefore, compared with searching for the first feature point P1 over the entire first captured image 14L by two-dimensional matching, the control unit 13 can calibrate the first imaging unit 11L, with the second imaging unit 11R as a reference, at a higher speed. Since the pixels constituting the parallax approximate region are unlikely to contain noise, the control unit 13 can extract the first feature point P1 accurately, and can therefore perform the calibration accurately.
- the second feature point P2 is extracted based on a first feature point P1 that is not at least part of a linear edge or a repetitive pattern. Since the first feature point P1 is neither part of a linear edge nor part of a repetitive pattern, feature points similar to the first feature point P1 are rarely included in the first captured image 14L. Likewise, the second captured image 14R corresponding to the first captured image 14L rarely includes feature points similar to the first feature point P1 other than the second feature point P2. Accordingly, the second feature point P2 corresponding to the first feature point P1 can be determined accurately, without being confused with other feature points similar to the first feature point P1.
- The control unit 13 determines the starting point PS of the two-dimensional matching based on the position of the first feature point P1 and the parallax d at that position, so that the second feature point P2 is searched for within a range in which it is highly likely to exist. The control unit 13 can therefore extract the second feature point P2 at high speed.
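The use of the parallax to seed the two-dimensional search can be sketched as follows; the search-window margin is an illustrative assumption.

```python
def matching_start_point(p1, disparity):
    """For a rectified pair, the first feature point P1 at (x, y) in the
    first captured image is expected near (x - d, y) in the second
    captured image; using that as the starting point PS confines the
    two-dimensional matching to a small neighbourhood."""
    x, y = p1
    return (x - disparity, y)

def search_window(start, margin=3):
    """Column and row ranges of a small 2D search window centred on the
    starting point PS (the margin value is an assumption)."""
    sx, sy = start
    return (range(sx - margin, sx + margin + 1),
            range(sy - margin, sy + margin + 1))
```

Restricting the template search to this window, instead of scanning the entire second captured image, is what allows the second feature point P2 to be extracted at high speed.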
- Since the control unit 13 performs the two-dimensional matching with sub-pixel accuracy, calibration can be performed even when the first imaging unit 11L deviates from the second imaging unit 11R by less than one pixel. That is, the control unit 13 can calibrate the first imaging unit 11L with the second imaging unit 11R as a reference with high accuracy.
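A common way to obtain sub-pixel accuracy is to fit a parabola through the matching costs at the best integer position and its two neighbours; this is a standard refinement sketch, not necessarily the exact method of the patent.

```python
def subpixel_peak(c_minus, c0, c_plus):
    """Given matching costs at integer offsets -1, 0, +1 around the best
    integer match, fit a parabola and return the fractional offset of
    its minimum, in the range (-0.5, +0.5)."""
    denom = c_minus - 2.0 * c0 + c_plus
    if denom == 0:
        return 0.0  # flat cost profile: no refinement possible
    return 0.5 * (c_minus - c_plus) / denom
```

For example, sampling the cost function (x - 0.3)^2 at x = -1, 0, +1 gives costs 1.69, 0.09, 0.49, and the refinement recovers the true minimum at 0.3, a deviation well below one pixel.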
- The control unit 13 may extract a plurality of first feature points P1 and, using them, extract a plurality of second feature points P2 each corresponding to one of the first feature points P1.
- Calibration can then be performed with high accuracy even when the roll angle θL of the first imaging unit 11L deviates from the roll angle θR of the second imaging unit 11R.
- In the above embodiment, the control unit 13 calibrates the first imaging unit 11L with the second imaging unit 11R as a reference; however, the control unit 13 may instead calibrate the second imaging unit 11R with the first imaging unit 11L as a reference.
- Likewise, the control unit 13 extracts the second feature points P2 based on the first feature points P1 extracted from the first captured image 14L; however, the control unit 13 may instead extract the first feature points P1 based on second feature points P2 extracted from the second captured image 14R.
- In the above embodiment, the stereo camera device 1 includes the image processing unit 10; however, another device may include the image processing unit 10 and be connected to the control unit 13 of the stereo camera device 1 via a communication network or the like, performing control for calibration based on the first captured image 14L and the second captured image 14R input to the input unit 12.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- General Engineering & Computer Science (AREA)
- Electromagnetism (AREA)
- Manufacturing & Machinery (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Human Computer Interaction (AREA)
- Measurement Of Optical Distance (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
Description
10 Image processing unit
11L First imaging unit
11R Second imaging unit
12 Input unit
13 Control unit
14L First captured image
14R Second captured image
15 Vehicle
16 Subject region
Claims (9)
- An image processing device comprising: an input unit that receives input of a first captured image and a second captured image respectively captured by a plurality of imaging units of a stereo camera; and a control unit that calculates parallax by performing one-dimensional matching based on pixel values of the first captured image and pixel values of the second captured image, extracts one or more first feature points from a region of the first captured image in which pixels are contiguous whose parallax differs from the parallax of an adjacent pixel by no more than a predetermined range, extracts a second feature point corresponding to each of the first feature points by two-dimensionally matching the one or more first feature points with pixels of the second captured image, and calibrates the imaging units based on the positions of the first feature points and the positions of the second feature points.
- The image processing device according to claim 1, wherein the control unit performs the two-dimensional matching based on, among the one or more first feature points, a first feature point different from any first feature point that is at least part of either a linear edge or a repetitive pattern.
- The image processing device according to claim 1 or 2, wherein the control unit determines a starting point of the two-dimensional matching based on the position of the first feature point and the parallax at that position.
- The image processing device according to any one of claims 1 to 3, wherein the control unit performs the two-dimensional matching with sub-pixel accuracy.
- The image processing device according to any one of claims 1 to 4, wherein the control unit extracts a plurality of the first feature points from the first captured image, extracts from the second captured image a second feature point corresponding to each of the first feature points, and calibrates the imaging units based on the positions of the plurality of first feature points and the positions of the second feature points to which the respective first feature points correspond.
- The image processing device according to claim 5, wherein the control unit calibrates the attitude of the stereo camera based on the positions of the plurality of first feature points and the second feature points.
- A stereo camera device comprising: a plurality of imaging units; and an image processing device having an input unit that receives input of a first captured image and a second captured image respectively captured by the plurality of imaging units, and a control unit that calculates parallax by performing one-dimensional matching based on pixel values of the first captured image and pixel values of the second captured image, extracts one or more first feature points from a region of the first captured image in which pixels whose parallax is within a predetermined range are contiguous, extracts a second feature point corresponding to each of the first feature points by two-dimensionally matching the one or more first feature points with pixels of the second captured image, and calibrates the imaging units based on the positions of the first feature points and the positions of the second feature points.
- A vehicle comprising a stereo camera device that includes: a plurality of imaging units; an input unit that receives input of a first captured image and a second captured image respectively captured by the plurality of imaging units; and a control unit that calculates parallax by performing one-dimensional matching based on pixel values of the first captured image and pixel values of the second captured image, extracts one or more first feature points from a region of the first captured image in which pixels whose parallax is within a predetermined range are contiguous, extracts a second feature point corresponding to each of the first feature points by two-dimensionally matching the one or more first feature points with pixels of the second captured image, and calibrates the imaging units based on the positions of the first feature points and the positions of the second feature points.
- An image processing method in which a control unit of an image processing device performs:
a step of calculating parallax by performing one-dimensional matching based on pixel values of a first captured image captured by a first imaging unit among a plurality of imaging units and pixel values of a second captured image captured by a second imaging unit, different from the first imaging unit, among the plurality of imaging units;
a step of extracting one or more first feature points from a region of the first captured image in which pixels whose parallax is within a predetermined range are contiguous;
a step of extracting a second feature point corresponding to each of the first feature points by two-dimensionally matching the one or more first feature points with pixels of the second captured image; and
a step of calibrating the imaging units based on the positions of the first feature points and the positions of the second feature points.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017509077A JP6121641B1 (ja) | 2015-06-24 | 2016-06-24 | 画像処理装置、ステレオカメラ装置、車両、及び画像処理方法 |
EP16813972.3A EP3315905B1 (en) | 2015-06-24 | 2016-06-24 | Image processing device, stereo camera device, vehicle, and image processing method |
US15/738,593 US10360462B2 (en) | 2015-06-24 | 2016-06-24 | Image processing device, stereo camera device, vehicle, and image processing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-126817 | 2015-06-24 | ||
JP2015126817 | 2015-06-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016208200A1 true WO2016208200A1 (ja) | 2016-12-29 |
Family
ID=57584803
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/003058 WO2016208200A1 (ja) | 2015-06-24 | 2016-06-24 | 画像処理装置、ステレオカメラ装置、車両、及び画像処理方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US10360462B2 (ja) |
EP (1) | EP3315905B1 (ja) |
JP (1) | JP6121641B1 (ja) |
WO (1) | WO2016208200A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110998241A (zh) * | 2018-01-23 | 2020-04-10 | 深圳市大疆创新科技有限公司 | 用于校准可移动对象的光学系统的系统和方法 |
WO2020095747A1 (ja) | 2018-11-05 | 2020-05-14 | 京セラ株式会社 | コントローラ、位置判定装置、位置判定システム、表示システム、プログラム、および記録媒体。 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11182914B2 (en) | 2018-05-21 | 2021-11-23 | Facebook Technologies, Llc | Dynamic structured light for depth sensing systems based on contrast in a local area |
CN109559305B (zh) * | 2018-11-26 | 2023-06-30 | 易思维(杭州)科技有限公司 | 一种基于soc-fpga的线结构光图像快速处理系统 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5383013A (en) * | 1992-09-18 | 1995-01-17 | Nec Research Institute, Inc. | Stereoscopic computer vision system |
JPH07103734A (ja) * | 1993-10-01 | 1995-04-18 | Sharp Corp | ステレオ対応探索装置 |
JP2001266144A (ja) * | 2000-03-17 | 2001-09-28 | Glory Ltd | 画像照合装置、画像照合方法、およびその方法をコンピュータに実行させるプログラムを記録したコンピュータ読み取り可能な記録媒体 |
JP2004053407A (ja) * | 2002-07-19 | 2004-02-19 | Fuji Heavy Ind Ltd | ステレオ画像処理装置およびステレオ画像処理方法 |
JP2012198075A (ja) * | 2011-03-18 | 2012-10-18 | Ricoh Co Ltd | ステレオカメラ装置、画像補整方法 |
JP2013239170A (ja) * | 2009-03-31 | 2013-11-28 | Panasonic Corp | ステレオ画像処理装置およびステレオ画像処理方法 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3280001B2 (ja) | 1999-09-16 | 2002-04-30 | 富士重工業株式会社 | ステレオ画像の位置ずれ調整装置 |
JP5588812B2 (ja) * | 2010-09-30 | 2014-09-10 | 日立オートモティブシステムズ株式会社 | 画像処理装置及びそれを用いた撮像装置 |
JP6202367B2 (ja) * | 2013-05-14 | 2017-09-27 | 株式会社リコー | 画像処理装置、距離測定装置、移動体機器制御システム、移動体及び画像処理用プログラム |
-
2016
- 2016-06-24 WO PCT/JP2016/003058 patent/WO2016208200A1/ja active Application Filing
- 2016-06-24 EP EP16813972.3A patent/EP3315905B1/en active Active
- 2016-06-24 US US15/738,593 patent/US10360462B2/en active Active
- 2016-06-24 JP JP2017509077A patent/JP6121641B1/ja active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5383013A (en) * | 1992-09-18 | 1995-01-17 | Nec Research Institute, Inc. | Stereoscopic computer vision system |
JPH07103734A (ja) * | 1993-10-01 | 1995-04-18 | Sharp Corp | ステレオ対応探索装置 |
JP2001266144A (ja) * | 2000-03-17 | 2001-09-28 | Glory Ltd | 画像照合装置、画像照合方法、およびその方法をコンピュータに実行させるプログラムを記録したコンピュータ読み取り可能な記録媒体 |
JP2004053407A (ja) * | 2002-07-19 | 2004-02-19 | Fuji Heavy Ind Ltd | ステレオ画像処理装置およびステレオ画像処理方法 |
JP2013239170A (ja) * | 2009-03-31 | 2013-11-28 | Panasonic Corp | ステレオ画像処理装置およびステレオ画像処理方法 |
JP2012198075A (ja) * | 2011-03-18 | 2012-10-18 | Ricoh Co Ltd | ステレオカメラ装置、画像補整方法 |
Non-Patent Citations (2)
Title |
---|
See also references of EP3315905A4 * |
SHI WANLI ET AL.: "An Approach for Stereo Matching Using Pair-wise Sequence Alignment Algorithm Based on Dynamic Programming", 2010 INTERNATIONAL CONFERENCE ON CHALLENGES IN ENVIRONMENTAL SCIENCE AND COMPUTER ENGINEERING, 7 March 2010 (2010-03-07), pages 511 - 514, XP031695542 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110998241A (zh) * | 2018-01-23 | 2020-04-10 | 深圳市大疆创新科技有限公司 | 用于校准可移动对象的光学系统的系统和方法 |
WO2020095747A1 (ja) | 2018-11-05 | 2020-05-14 | 京セラ株式会社 | コントローラ、位置判定装置、位置判定システム、表示システム、プログラム、および記録媒体。 |
US11551375B2 (en) | 2018-11-05 | 2023-01-10 | Kyocera Corporation | Controller, position determination device, position determination system, and display system for determining a position of an object point in a real space based on cornea images of a first eye and a second eye of a user in captured image |
Also Published As
Publication number | Publication date |
---|---|
US20180165528A1 (en) | 2018-06-14 |
JP6121641B1 (ja) | 2017-04-26 |
EP3315905A4 (en) | 2019-02-27 |
JPWO2016208200A1 (ja) | 2017-06-29 |
EP3315905A1 (en) | 2018-05-02 |
EP3315905B1 (en) | 2020-04-22 |
US10360462B2 (en) | 2019-07-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3358295B1 (en) | Image processing device, stereo camera device, vehicle, and image processing method | |
JP6159905B2 (ja) | 演算装置、カメラ装置、車両及びキャリブレーション方法 | |
US10614585B2 (en) | Parallax calculation apparatus, stereo camera apparatus, vehicle, and parallax calculation method | |
US11703326B2 (en) | Stereo camera apparatus, vehicle, and parallax calculation method | |
JP6456499B2 (ja) | 立体物検出装置、ステレオカメラ装置、車両及び立体物検出方法 | |
JP6121641B1 (ja) | 画像処理装置、ステレオカメラ装置、車両、及び画像処理方法 | |
JP6855325B2 (ja) | 画像処理装置、ステレオカメラシステム、移動体、路面形状検出方法およびプログラム | |
US10769804B2 (en) | Parallax calculation apparatus, stereo camera apparatus, vehicle, and parallax calculation method | |
JP7569173B2 (ja) | 画像処理装置、ステレオカメラ装置、移動体及び画像処理方法 | |
JP7516204B2 (ja) | 画像処理装置、ステレオカメラ装置、移動体及び画像処理方法 | |
JP2018201167A (ja) | 画像処理装置、周辺監視システム、および移動体 | |
CN114402357A (zh) | 路面检测装置、物体检测装置、物体检测系统、移动体和物体检测方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16813972 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017509077 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15738593 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2016813972 Country of ref document: EP |