WO2011115142A1 - 画像処理装置、方法、プログラム及び記録媒体 - Google Patents
- Publication number: WO2011115142A1 (PCT/JP2011/056118)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- difference
- parallax
- along
- dimensional object
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
- G01C11/06—Interpretation of pictures by comparison of two or more pictures of the same area
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20224—Image subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- Spatial understanding, such as detecting a three-dimensional object from images, is performed by photographing the same space with two photographing devices to obtain a pair of images having a difference corresponding to parallax, extracting the parallax from the pair of images, and associating the extracted parallax with spatial information.
- a block matching method is used to extract parallax from a pair of images having a difference corresponding to parallax.
- In the block matching method, corresponding points are searched for between the two images, and the difference in position of the corresponding points in the pair of images having a difference corresponding to parallax corresponds to the parallax.
- One of the pair of images is treated as a reference image and the other as a comparison image, and an evaluation area of, for example, about 8 × 4 pixels is set around the pixel of interest in the reference image.
- An evaluation area of the same size is set in the comparison image, the difference between the pair of evaluation areas is calculated for each pixel, and the evaluation function Σ(Li − Ri)² that accumulates these values is computed repeatedly while moving the position of the evaluation area in the comparison image (raster scan), thereby finding the position of the evaluation area on the comparison image that maximizes the degree of coincidence (minimizes the value of the evaluation function).
- This position is the position of the corresponding point on the comparison image for the pixel of interest in the reference image, and the difference in position between the pixel of interest on the reference image and the corresponding point on the comparison image corresponds to the parallax.
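The raster-scan search described above can be sketched as follows. This is a minimal illustrative implementation, not the patent's own code: the function names, the 8 × 4 window default, and the search range `max_d` are assumptions. It evaluates Σ(Li − Ri)² for each candidate offset and keeps the offset that minimizes it.

```python
def ssd(ref_block, cmp_block):
    # Evaluation function: accumulate the squared per-pixel differences (Li - Ri)^2.
    return sum((l - r) ** 2
               for row_l, row_r in zip(ref_block, cmp_block)
               for l, r in zip(row_l, row_r))

def block(img, x, y, w, h):
    # Extract an h-row by w-column evaluation area whose top-left corner is (x, y).
    return [row[x:x + w] for row in img[y:y + h]]

def find_disparity(ref, cmp_img, x, y, win_w=8, win_h=4, max_d=16):
    # Raster-scan the evaluation area along the row of the comparison image
    # and keep the horizontal offset that minimizes the evaluation function.
    ref_block = block(ref, x, y, win_w, win_h)
    best_d, best_score = 0, float("inf")
    for d in range(0, max_d + 1):
        if x - d < 0:
            break
        score = ssd(ref_block, block(cmp_img, x - d, y, win_w, win_h))
        if score < best_score:
            best_d, best_score = d, score
    return best_d
```

The positional difference returned by `find_disparity` is the per-pixel parallax that the later aspects correct for geometrically rather than searching for at every pixel.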
- Patent Document 1: Japanese Patent Laid-Open No. 2001-92968
- Patent Document 1 discloses a technique in which image data of a reference pixel area in one captured image, and the image data on the horizontal line of the other captured image corresponding to its vertical position, are stored in a line memory; the image data of the reference pixel area and the image data of a set search range are read from the line memory and stereo matching (block matching) is performed to specify the correlation destination of the reference pixel area, while the position of the search range for the reference pixel area is corrected based on the degree of shift of the infinite point corresponding to the horizontal position of the reference pixel area.
- Japanese Patent Laid-Open No. 2007-235642 (hereinafter referred to as Patent Document 2) discloses a technique in which a top view obtained by projective transformation of a first image (an oblique overhead view) including a road surface captured by a camera mounted on a vehicle, and a top view obtained by projective transformation of a second image captured by the same camera at a timing different from the first image, are created, and the two top views are aligned by pattern matching based on characteristic shapes on the road surface (e.g., white lines, boundaries between the road surface and solid objects, road surface texture, tire stops, etc.).
- Japanese Patent Application Laid-Open No. 2000-293893 (hereinafter referred to as Patent Document 3) discloses an obstacle detection device that accumulates left and right images taken by two TV cameras photographing a road plane, extracts multiple lines appearing in the accumulated left and right images, finds corresponding points between the left image and the right image based on the extracted lines, calculates from those corresponding points the parameters of a relational expression established between the projection positions of an arbitrary point on the road plane in the left image and in the right image, and detects an area whose height differs from the road plane as an obstacle area based on the relational expression determined by the calculated parameters.
- However, the technique described in Patent Document 2 aligns the pair of images by pattern matching.
- Since it is not guaranteed that a particular characteristic shape usable for pattern matching (for example, a white line) is always present in the image, various characteristic shapes usable for pattern matching are searched for sequentially in the image, and pattern matching is performed using whichever characteristic shapes are detected.
- As a result, the processing time varies depending on the type of characteristic shape present in the image and can increase greatly for some types. For this reason, the technique described in Patent Document 2 is not suitable for applications that require high-speed operation, such as detecting a three-dimensional object in real time from a captured moving image.
- The technique described in Patent Document 3 determines, based on the relational expression established between the projection positions in the left image and the right image, the corresponding point P′(u′, v′) on the right image under the assumption that an arbitrary point P(u, v) on the left image exists on the road plane, and judges whether the point P corresponds to an obstacle based on the luminance difference between the point P and the corresponding point P′. The relational expression therefore requires high accuracy, with an allowable error of less than one pixel. For this reason, the technique described in Patent Document 3 is easily affected by vehicle vibration, road inclination, and the like, and the parameter h of the relational expression must be recalculated frequently.
- As described above, calculating the parameter h of the relational expression requires solving eight simultaneous equations for eight unknown parameters, so the calculation load is very high. The technique described in Patent Document 3 therefore has difficulty achieving both high-speed operation and determination accuracy.
- The present invention has been made in view of the above facts, and an object thereof is to obtain an image processing apparatus, an image processing method, and an image processing program capable of extracting, in a short time and with a simple configuration and simple processing, information corresponding to a three-dimensional object from images photographed by a plurality of photographing devices.
- The image processing apparatus according to the first aspect of the present invention includes: storage means for storing, for each position on the image along a second direction corresponding to the vertical direction, deviation amount information representing the amount of deviation, corresponding to parallax, along a first direction on the image corresponding to the horizontal direction between a first image captured by a first photographing device and a second image captured by a second photographing device whose horizontal position differs from that of the first photographing device; acquisition means for acquiring the first image photographed by the first photographing device and the second image photographed by the second photographing device; processing means for performing parallax correction, for each pixel row along the first direction in the image, by relatively moving the position of the pixel row along the first direction between the first image and the second image according to the deviation amount represented by the deviation amount information corresponding to the position of the pixel row along the second direction; and generation means for generating difference information representing the difference between the first image and the second image that have undergone the parallax correction by the processing means.
- In the first aspect of the present invention, the first image is taken by the first photographing device and the second image is taken by the second photographing device, whose horizontal position differs from that of the first photographing device. Since the two photographing devices differ in horizontal position, a difference corresponding to parallax arises between the first image and the second image.
- In the first aspect of the present invention, the deviation amount information representing the amount of deviation, corresponding to parallax, along the first direction on the image (corresponding to the horizontal direction) between the first image and the second image is stored in the storage means for each position on the image along the second direction (corresponding to the vertical direction).
- The acquisition means acquires the first image captured by the first photographing device and the second image captured by the second photographing device, and the processing means relatively moves, between the first image and the second image, the position along the first direction of each pixel row according to the deviation amount represented by the deviation amount information corresponding to the position of that pixel row along the second direction.
- This parallax correction is performed for each pixel row along the first direction in the image.
- The difference information generated by the generation means may contain differences caused by something other than a three-dimensional object, that is, noise, due to various factors such as noise superimposed on the first image and the second image, changes in the relative position of the first photographing device and the second photographing device, and the accuracy of the deviation amount information stored in the storage means.
- In view of this, in the second aspect of the present invention, in the first aspect, the generation means generates, as the difference information, a difference image representing the difference between the first image and the second image for each pixel.
- The image processing apparatus further includes removal means for removing noise from the difference image by performing contraction processing on the difference pixels present in the difference image generated by the generation means, that is, the pixels indicating that there is a difference between the first image and the second image (a difference pixel may be a pixel whose difference is greater than 0, or a pixel whose difference is equal to or greater than a threshold value; the same applies hereinafter).
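The per-pixel difference image and the contraction-based noise removal can be sketched as follows. The function names and the 3 × 3 neighbourhood used for the contraction are illustrative assumptions, not taken from the patent.

```python
def difference_image(img1, img2, threshold=0):
    # Mark a difference pixel (1) wherever the per-pixel difference
    # exceeds the threshold; 0 elsewhere.
    return [[1 if abs(a - b) > threshold else 0
             for a, b in zip(r1, r2)]
            for r1, r2 in zip(img1, img2)]

def contract(diff, n=1):
    # Contraction (erosion): a difference pixel survives a pass only if
    # every pixel in its 3x3 neighbourhood is also a difference pixel, so
    # isolated noise pixels disappear while solid regions shrink by one.
    h, w = len(diff), len(diff[0])
    for _ in range(n):
        out = [[0] * w for _ in range(h)]
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                out[y][x] = int(all(diff[y + dy][x + dx]
                                    for dy in (-1, 0, 1)
                                    for dx in (-1, 0, 1)))
        diff = out
    return diff
```

A single stray difference pixel vanishes after one contraction pass, while a genuine difference region caused by a three-dimensional object merely shrinks.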
- In the third aspect of the present invention, the deviation amount represented by the deviation amount information stored in the storage means is expressed as a number of pixels and is derived in advance by the calculations of the following equations (1) to (3):
- D ≈ b × f / L_cam × (w / w_img) (1)
- L_cam ≈ √(h_cam² + L²) × cos(tan⁻¹(L / h_cam) − θ) (2)
- L = tan(tan⁻¹((P − (h/2)) × (h_img/2) / (h/2) / f) + θ) × h_cam (3)
- Here, D is the deviation amount, h_cam is the height of the photographing optical axis from the ground at the installation position of the first and second photographing devices, θ is the inclination angle of the photographing optical axis with respect to the vertical direction, f is the focal distance of the optical system, w and h are the numbers of pixels along the first and second directions of the image, w_img and h_img are the imaging sizes along the first and second directions, P is the position (in pixels) on the image along the second direction, b is the baseline length, that is, the distance between the photographing optical axes of the first and second photographing devices, L_cam is the straight-line distance between the first or second photographing device and an object located on the ground and imaged at position P on the image along the second direction, and L is the distance along the ground between the object and the first or second photographing device.
- The deviation amount information thus includes the deviation amount, in number of pixels, for each position on the image along the second direction.
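Under these definitions, the per-row deviation table can be precomputed as sketched below. Equations (2) and (3) follow the text; the pixel-conversion form used for the deviation D, the standard pinhole-stereo relation b·f/L_cam scaled by w/w_img, is an assumption here, as are the function name and all parameter values used for illustration.

```python
import math

def deviation_table(h_cam, theta, f, w, h, w_img, h_img, b):
    # Precompute, for every vertical pixel position P, the horizontal
    # deviation D (in pixels) of a ground point imaged at that row.
    table = []
    for P in range(h):
        # Eq. (3): distance along the ground to the point imaged at row P.
        L = math.tan(math.atan((P - h / 2) * (h_img / 2) / (h / 2) / f)
                     + theta) * h_cam
        if L <= 0:
            table.append(0)   # this row images at or above the horizon
            continue
        # Eq. (2): straight-line distance from the camera to that ground point.
        L_cam = math.sqrt(h_cam ** 2 + L ** 2) * math.cos(
            math.atan(L / h_cam) - theta)
        # Eq. (1) (assumed standard pinhole form): deviation in pixels.
        D = b * f / L_cam * (w / w_img)
        table.append(int(round(D)))
    return table
```

With, for example, h_cam = 1 m, θ = 80° (10° below horizontal), an 8 mm lens on a 6.4 × 4.8 mm imager at 640 × 480 pixels, and b = 7 cm, the deviation grows from the image centre toward the nearer ground rows, as expected.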
- In the fourth aspect of the present invention, the processing means includes delay means for delaying the output of the first image or the second image in units of pixels, and when outputting the first image and the second image to the generation means in parallel in units of pixels, performs the parallax correction by relatively delaying the output of the first image or the second image with the delay means by the number of pixels represented by the deviation amount corresponding to the position, along the second direction, of the pixel row being output, switching the number of delayed pixels as that position changes.
- Compared with an aspect in which the parallax correction is performed by relatively moving storage positions in units of pixel rows along the first direction, the parallax correction can thus be completed in a short time.
- In the fifth aspect of the present invention, the delay means comprises a plurality of serially connected delay units, each of which delays the output of the first image or the second image by one pixel.
- The processing means outputs, as the first image or the second image, the data that has passed through the number of delay units corresponding to the number of pixels represented by the deviation amount for the pixel row being output.
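In software, the effect of the delay means can be modelled as a per-row horizontal shift, as in the following sketch (hypothetical function name; padding vacated pixels with 0 is an assumption): delaying a row's output by D pixels moves its content by D along the first direction.

```python
def parallax_correct(image, deviation):
    # Emulate the delay-line behaviour: the pixel row at index y (its
    # position along the second direction) is output delayed by
    # deviation[y] pixels, shifting its content along the first direction.
    corrected = []
    for y, row in enumerate(image):
        d = deviation[y]
        # A delay of d pixels moves the row content right by d;
        # the vacated leading positions are padded with 0.
        corrected.append([0] * d + row[:len(row) - d])
    return corrected
```

Subtracting the corrected second image from the first image then leaves mainly the differences caused by objects that do not lie on the ground plane.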
- In the sixth aspect of the present invention, the processing means excludes from the parallax correction the pixel rows located in the range vertically above the position on the image, along the second direction, corresponding to a preset horizon line.
- In the seventh aspect of the present invention, the image processing apparatus further includes correction means for correcting, on the first image and the second image acquired by the acquisition means, at least one of a difference in photographing range along the horizontal direction between the first photographing device and the second photographing device, a difference in photographing magnification, a difference in rotation angle around the photographing optical axis, and a difference in luminance, and the processing means performs the parallax correction on the first image and the second image that have been corrected by the correction means.
- As a result, difference information is obtained from which the differences caused by at least one of the difference in photographing range along the horizontal direction, the difference in photographing magnification, the difference in rotation angle around the photographing optical axis, and the difference in luminance have been removed.
- The difference in photographing range along the horizontal direction between the first photographing device and the second photographing device may also be corrected at the same time as the parallax correction by the processing means, instead of being corrected by the correction means.
- In the eighth aspect of the present invention, the apparatus further includes three-dimensional object detection means for detecting, based on the difference information generated by the generation means, a three-dimensional object present in the photographing ranges of the first photographing device and the second photographing device.
- Since the difference information generated by the generation means is information corresponding to a three-dimensional object (information indicating the difference caused by a three-dimensional object), detecting the three-dimensional object using this difference information makes it possible to detect, by simple processing, at least the presence or absence of a three-dimensional object within the photographing ranges of the first photographing device and the second photographing device.
- The ninth aspect of the present invention, in the eighth aspect, further includes output means for outputting the detection result of the three-dimensional object by the three-dimensional object detection means.
- The output form of the detection result by the output means may be a form in which characters, images, figures, or the like representing the detection result of the three-dimensional object are displayed on display means, or a form in which the detection result of the three-dimensional object is output by sound.
- In the tenth aspect of the present invention, the position on the image of the image area corresponding to the three-dimensional object is detected.
- Thus, the position of the three-dimensional object (more specifically, the position on the image of the image area corresponding to the three-dimensional object) can be detected by simple processing.
- When the generation means generates, as the difference information, a difference image representing the difference between the first image and the second image for each pixel, the difference pixels representing the difference caused by a three-dimensional object form a difference area on the difference image.
- This difference area is often a linear area along the edge of the three-dimensional object.
- The width of the difference area changes in accordance with the distance between the photographing device and the three-dimensional object: the width of the linear difference area decreases as the distance between the photographing device and the three-dimensional object increases.
- In view of this, in the eleventh aspect of the present invention, the generation means generates, as the difference information, a difference image representing the difference between the first image and the second image for each pixel. The three-dimensional object detection means then extracts image areas corresponding to three-dimensional objects from the difference image generated by the generation means and detects the distance to the three-dimensional object corresponding to each extracted image area based on the width of the linear difference area, composed of difference pixels indicating that there is a difference between the first image and the second image, present in that image area. Thereby, detection of the distance to the three-dimensional object corresponding to each image area can be realized by simple processing.
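The width-based distance detection just described can be sketched as follows. The run-length width measurement and the inverse-proportional calibration constant k are hypothetical assumptions for illustration, not the patent's own formulation.

```python
def region_width(diff, y0, y1, x0, x1):
    # Width of a linear difference area: the longest horizontal run of
    # difference pixels inside the extracted image area [y0:y1, x0:x1].
    best = 0
    for row in diff[y0:y1]:
        run = 0
        for v in row[x0:x1]:
            run = run + 1 if v else 0
            best = max(best, run)
    return best

def distance_from_width(width_px, k=100.0):
    # Assumed inverse-proportional model: a wider difference area means a
    # closer object. The constant k stands in for a calibration value.
    return k / width_px if width_px else float("inf")
```

In practice k would be fixed by photographing an object at a known distance and measuring the resulting width.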
- In the twelfth aspect of the present invention, the generation means generates, as the difference information, a difference image representing the difference between the first image and the second image for each pixel.
- The three-dimensional object detection means performs contraction processing a plurality of times on the difference pixels present in the difference image that indicate a difference between the first image and the second image, and determines whether the linear difference areas composed of those difference pixels have disappeared from the difference image. The output means switches the presence or absence of output of a three-dimensional object detection signal, or the type of three-dimensional object detection signal to be output, according to whether the number of contraction processes performed when the linear difference areas disappeared from the difference image is less than a threshold value, or whether the linear difference areas have disappeared after the contraction processing has been executed a predetermined number of times.
- This aspect is suited not to detecting the distance to each individual three-dimensional object captured in the image, but to detecting the distance range (approximate distance) to the closest of the three-dimensional objects captured in the image.
- The distance range (approximate distance) to the closest three-dimensional object captured in the image can thus be detected by simple processing, and the result can be notified through the presence or absence of output of the three-dimensional object detection signal or the type of three-dimensional object detection signal that is output.
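The disappearance-count approach can be sketched as follows, assuming a 3 × 3 contraction kernel and illustrative function and signal names: a wide (near) difference area survives more contraction passes than a thin (far) one, so the pass count at which everything disappears indicates the distance range.

```python
def erosions_until_gone(diff, max_iter=10):
    # Apply contraction repeatedly and return how many passes it takes
    # for every difference pixel to disappear.
    h, w = len(diff), len(diff[0])
    for i in range(1, max_iter + 1):
        out = [[0] * w for _ in range(h)]
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                out[y][x] = int(all(diff[y + dy][x + dx]
                                    for dy in (-1, 0, 1)
                                    for dx in (-1, 0, 1)))
        diff = out
        if not any(any(row) for row in diff):
            return i
    return max_iter

def detection_signal(diff, threshold=2):
    # Switch the signal according to whether the difference areas
    # disappeared in fewer passes than the threshold (far) or not (near).
    return "near" if erosions_until_gone(diff) >= threshold else "far"
```

Because only repeated erosions and an emptiness check are needed, this runs in fixed time per frame regardless of scene content.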
- The pixel number counting means counts the number of difference pixels present in an area of the difference image designated via the designation means. Accordingly, the relative position of the first photographing device and the second photographing device, and the parameters of the correction processing by the correction means, can be adjusted with an arbitrary area of the difference image as the evaluation target.
- In the sixteenth aspect of the present invention, the generation means generates, for each pixel, a difference image representing the difference between the first image and the second image that have undergone the parallax correction by the processing means, and the generated difference image is configured to be a difference image representing the difference between the first image and the second image in a state in which the geometric parallax has been corrected by the parallax correction by the processing means and the images have been relatively shifted along the first direction by a preset shift amount.
- This configuration can be realized by, for example, any one of the seventeenth to nineteenth aspects described below.
- In the seventeenth aspect of the present invention, in the sixteenth aspect, the deviation amount represented by the deviation amount information stored in the storage means is a deviation amount corresponding to the geometric parallax, and the processing means relatively shifts the first image and the second image along the first direction by the preset shift amount before or after performing the parallax correction on them.
- In the seventeenth aspect, the shift amount can be changed and set according to the application and the like, which has the advantage of being easy to realize.
- In the eighteenth aspect of the present invention, either the deviation amount represented by the deviation amount information stored in the storage means is preset to the deviation amount corresponding to the geometric parallax with a deviation amount corresponding to the shift amount added to it, or the stored deviation amount corresponds to the geometric parallax and a deviation amount corresponding to the shift amount is added to it before it is used for the parallax correction by the processing means.
- A difference image representing the difference between the first image and the second image in the shifted state is thereby generated.
- In the eighteenth aspect, since the shift between the first image and the second image is included in the deviation amount information used for the parallax correction by the processing means, the shift is performed at the same time as the parallax correction, and there is the advantage that the configuration of the generation means need not be changed.
- In the nineteenth aspect of the present invention, the deviation amount represented by the deviation amount information stored in the storage means is a deviation amount corresponding to the geometric parallax, and the orientations of the first photographing device and the second photographing device are adjusted so that the distance between their photographing optical axes increases with the distance from the devices, whereby the first image and the second image are relatively shifted along the first direction by the preset shift amount.
- A difference image representing the difference between the first image and the second image, in a state in which the geometric parallax has been corrected by the parallax correction by the processing means and the images have been relatively shifted along the first direction by the preset shift amount, is thereby generated.
- In the nineteenth aspect, since the first image and the second image are relatively shifted along the first direction by adjusting the orientations of the first and second photographing devices, it is difficult to change and set the shift amount according to the application and the like, but there is the advantage that the configuration of the generation means need not be changed, as in the eighteenth aspect of the present invention.
- The twentieth aspect of the present invention further includes contraction processing means that, for the difference pixels present in the difference image generated by the generation means and indicating a difference between the first image and the second image, executes contraction processing along the second direction a first number of times and executes contraction processing along the first direction a second number of times that is greater than the first number.
- Since the difference image generated by the generation means represents the difference between the first image and the second image in a state in which the geometric parallax has been corrected by the parallax correction by the processing means and the images have been relatively shifted along the first direction by the preset shift amount, the difference components caused by a three-dimensional object are emphasized along the first direction compared with the state in which the first image and the second image are not shifted.
- The number of executions of the contraction processing along the first direction is therefore made larger than the number of executions of the contraction processing along the second direction, so that noise can be removed without erasing the difference components caused by a three-dimensional object.
- Preferably, the shift amount is set to no more than 1/2 of the maximum deviation amount corresponding to the geometric parallax.
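The direction-dependent contraction can be sketched as follows, assuming a three-pixel line kernel per direction and illustrative default pass counts (one vertical pass, three horizontal passes); the function names are hypothetical.

```python
def contract_1d(diff, axis, n):
    # Contraction along one direction only: a difference pixel survives a
    # pass when both of its neighbours along that axis are also difference
    # pixels. axis=0 erodes along the second (vertical) direction,
    # axis=1 along the first (horizontal) direction.
    h, w = len(diff), len(diff[0])
    dy, dx = (1, 0) if axis == 0 else (0, 1)
    for _ in range(n):
        out = [[0] * w for _ in range(h)]
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                out[y][x] = int(diff[y][x] and diff[y - dy][x - dx]
                                and diff[y + dy][x + dx])
        diff = out
    return diff

def directional_noise_removal(diff, n_vertical=1, n_horizontal=3):
    # Per the aspect above: fewer passes along the second direction, more
    # along the first direction, because the shift emphasises the
    # solid-object difference components along the first direction.
    diff = contract_1d(diff, axis=0, n=n_vertical)
    return contract_1d(diff, axis=1, n=n_horizontal)
```

A difference region that is wide along the first direction survives the asymmetric erosion, while isolated noise pixels and thin vertical streaks are removed.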
- In the image processing method according to the present invention, the storage means stores, for each position on the image along the second direction corresponding to the vertical direction, deviation amount information representing the amount of deviation, corresponding to parallax, along the first direction on the image corresponding to the horizontal direction between the first image captured by the first photographing device and the second image captured by the second photographing device, whose horizontal position differs from that of the first photographing device.
- The acquisition means acquires the first image photographed by the first photographing device and the second image photographed by the second photographing device, the processing means performs, for each pixel row along the first direction in the image, the parallax correction of relatively moving the position of the pixel row according to the deviation amount represented by the deviation amount information corresponding to the position of the pixel row along the second direction, and the generation means generates difference information representing the difference between the first image and the second image that have undergone the parallax correction by the processing means. Therefore, as in the first aspect of the present invention, extraction of information corresponding to a three-dimensional object from images photographed by a plurality of photographing devices can be realized in a short time with a simple configuration and simple processing.
- The image processing program according to the present invention causes a computer, connected to the storage means described above, to function as: acquisition means for acquiring the first image captured by the first photographing device and the second image captured by the second photographing device; processing means for performing, for each pixel row along the first direction in the image, the parallax correction of relatively moving the position of the pixel row along the first direction between the first image and the second image according to the deviation amount represented by the deviation amount information corresponding to the position of the pixel row along the second direction; and generation means for generating difference information representing the difference between the first image and the second image that have undergone the parallax correction.
- The recording medium according to the twenty-fifth aspect of the present invention records an image processing program for causing a computer, connected to storage means that stores, for each position on the image along the second direction corresponding to the vertical direction, deviation amount information representing the amount of deviation, corresponding to parallax, along the first direction on the image corresponding to the horizontal direction between the first image taken by the first photographing device and the second image taken by the second photographing device, whose horizontal position differs from that of the first photographing device, to function as: acquisition means for acquiring the first image and the second image; processing means for performing the parallax correction for each pixel row along the first direction in the image; and generation means for generating difference information representing the difference between the first image and the second image that have undergone the parallax correction by the processing means.
- FIG. 9C: a schematic block diagram showing an example of the difference image generation unit described in the second embodiment, and an image diagram showing an example of the corresponding image.
- FIG. 1 shows imaging devices 10L and 10R according to the present embodiment and a three-dimensional object detection device 12 that is connected to the imaging devices 10L and 10R and detects a three-dimensional object in the images taken by them.
- The imaging devices 10L and 10R can capture moving images and have the same optical characteristics; they are installed, facing the direction in which the three-dimensional object to be detected by the three-dimensional object detection device 12 can be imaged, at a location from which that object can be captured.
- the specific installation location of the imaging devices 10L and 10R is determined according to the use of the three-dimensional object detection device 12.
- When mounted on a vehicle, for example, the three-dimensional object detection device 12 detects a three-dimensional object (obstacle) existing in the space ahead of the vehicle, and the photographing devices 10L and 10R are installed at a position from which the space ahead of the vehicle can be imaged, facing that space.
- The photographing devices 10L and 10R are installed so that their heights from the ground (the height h_cam of the photographing optical axis at the installation position) are equal (the straight line connecting the photographing devices 10L and 10R is horizontal), and their relative positions and orientations are adjusted so that the photographing optical axes are parallel with a predetermined interval (baseline length b: about 7 cm, as with the two human eyes). Calibration work is also performed to roughly correct optical-axis twist and magnification differences.
- the imaging devices 10L and 10R that have undergone the above calibration work operate in synchronization, and image data (moving image data) obtained by imaging by the imaging devices 10L and 10R is sequentially output to the three-dimensional object detection device 12.
- The three-dimensional object detection device 12 includes a front image processing unit 14, a parallax correction unit 16, a difference image generation unit 18, a rear image processing unit 20, a difference image analysis unit 22, and an analysis result output unit 24, connected in this order.
- The parallax correction unit 16 is an example of a processing unit according to the present invention.
- The difference image generation unit 18 is an example of a generation unit according to the present invention.
- The post-image processing unit 20 is an example of a removal unit according to the second aspect of the present invention.
- The previous image processing unit 14 is an example of a correction unit according to the seventh aspect of the present invention.
- The difference image analysis unit 22 is an example of a three-dimensional object detection unit according to the eighth aspect of the present invention.
- The analysis result output unit 24 is an example of the output means according to the ninth and related aspects of the present invention.
- The signal lines connecting the imaging devices 10L and 10R to the three-dimensional object detection device 12 are an example of the acquisition means according to the present invention.
- the three-dimensional object detection device 12 includes a device control unit 26 that incorporates a nonvolatile storage unit 28.
- The apparatus control unit 26 is connected to the front image processing unit 14, the parallax correction unit 16, the difference image generation unit 18, the rear image processing unit 20, the difference image analysis unit 22, and the analysis result output unit 24, and controls their operations.
- The parallax correction unit 16 receives the image data that has been input from the imaging devices 10L and 10R to the three-dimensional object detection device 12 and subjected to the correction processing (described later) by the previous image processing unit 14; the data is input from the previous image processing unit 14 in raster scan order in units of one pixel.
- The parallax correction unit 16 includes latch groups 34 and 36, each formed by connecting in series a plurality of latches 32 that can hold data for one pixel; the image data input from the photographing device 10L is fed to the latch group 34, and the image data input from the photographing device 10R is fed to the latch group 36.
- A pixel clock is input to each individual latch 32; at timings synchronized with the pixel clock, each latch 32 outputs the held data for one pixel to the subsequent latch 32 and holds the new data for one pixel input from the preceding latch 32, repeating this operation.
- the latch groups 34 and 36 are an example of delay means according to the fourth aspect of the present invention, and each latch 32 is an example of a delay unit according to the fifth aspect of the present invention.
- each latch 32 of the latch group 34 is connected to the selector 38, and each latch 32 of the latch group 36 is connected to the selector 40.
- The selectors 38 and 40 have their data output ends connected to the difference image generation unit 18 and their control signal input ends connected to the parallax correction control unit 42. The parallax correction control unit 42 is connected to the parallax information storage unit 44 and to the previous image processing unit 14.
- the parallax information storage unit 44 is connected to the device control unit 26, and parallax amount information is written by the device control unit 26.
- The parallax amount information is an example of the bias amount information according to the present invention, and the parallax information storage unit 44 that stores the parallax amount information written by the device control unit 26 is an example of a storage unit according to the present invention.
- Based on the parallax amount information stored in the parallax information storage unit 44, the parallax correction control unit 42 outputs to the selector 38 a selection signal designating which of the data input from the individual latches 32 of the latch group 34 is to be output, and likewise to the selector 40 for the latch group 36; the selection signals output to the selectors 38 and 40 are switched at timings synchronized with the line synchronization signal input from the previous image processing unit 14.
- To the image data input to the three-dimensional object detection device 12 from the photographing devices 10L and 10R, the previous image processing unit 14 applies offset correction processing for correcting the difference in the photographing ranges of the photographing devices 10L and 10R along the horizontal direction (the left-right direction on the image), magnification correction processing for correcting the difference in their imaging magnifications, and twist correction processing for correcting the difference in twist (rotation angle) around their photographing optical axes. These processes correct the residual errors not removed by the adjustment in the calibration work, for example by shifting one of the images input from the photographing devices 10L and 10R in the left-right direction (X direction).
- The offset correction process may be included in (performed collectively with) the parallax correction performed by the parallax correction unit 16 instead of being performed by the previous image processing unit 14. Further, the device control unit 26 may select which of the above correction processes the previous image processing unit 14 executes, according to the accuracy required for three-dimensional object detection by the three-dimensional object detection device 12.
- FIG. 3A shows an example of an image obtained by photographing with a monocular photographing device, and FIGS. 3B and 3C show an example of images (after the offset correction performed by the previous image processing unit 14) obtained by photographing the same range with the photographing devices 10L and 10R according to the first embodiment.
- In the following, the height of the photographing optical axis from the ground at the installation position is h_cam, the inclination angle of the photographing optical axis with respect to the vertical direction is θ, the focal length of the optical system of the photographing devices 10L and 10R is f, the number of pixels along the X direction of the image is w, the number of pixels along the Y direction is h, and the image size is w_img × h_img; these parameters are determined by the installation and specifications of the photographing devices 10L and 10R. L_cam in equation (1) is the straight-line distance between the photographing devices 10L and 10R and an object that is located on the ground and forms an image at position P along the Y direction of the image. As a numerical example: focal length f = 35 mm, number of pixels along the X direction w = 640, number of pixels along the Y direction h = 480, image width w_img = 36 mm.
- The parallax amount information is information indicating the parallax amount D at each position along the Y direction of the image (within the range at or below the Y coordinate value of the vanishing point). Based on the parameters (h_cam, θ, f, w, h, w_img, h_img, b) stored in advance in the storage unit 28, the device control unit 26 performs the calculations of equations (1) to (3) for each position along the Y direction of the image (within the range at or below the Y coordinate value of the vanishing point), generates the parallax amount information in advance by associating the parallax amount D obtained by the calculation with the corresponding Y coordinate value, and writes the information in advance into the parallax information storage unit 44 of the parallax correction unit 16.
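Since equations (1) to (3) themselves are not reproduced in this text, the generation of the parallax amount information can only be sketched. The following Python illustration uses standard pinhole stereo geometry under the simplifying assumptions of flat ground and an optical axis tilted by θ from the vertical; the function name and the ground-distance formula are assumptions, not the patent's exact equations:

```python
import math

def build_parallax_table(h_cam, theta, f, w, h, w_img, h_img, b, y_vanish):
    """Build a lookup table mapping each Y coordinate below the vanishing
    point to a parallax amount D in pixels, as the device control unit 26
    would write it into the parallax information storage unit 44.
    The ground-distance formula is a simplifying assumption (flat ground,
    pinhole optics), not the patent's equations (1)-(3)."""
    pitch = h_img / h              # physical height of one pixel row on the sensor
    table = {}
    for y in range(y_vanish + 1, h):
        # Angle of the ray through row y, measured from the optical axis.
        alpha = math.atan((y - y_vanish) * pitch / f)
        # Straight-line distance L_cam to the ground point imaged at row y.
        l_cam = h_cam / math.cos(theta - alpha)
        # Standard stereo parallax in pixels: D = b * f / (L_cam * pixel width).
        d = b * f / (l_cam * (w_img / w))
        table[y] = int(round(d))
    return table

table = build_parallax_table(h_cam=1.2, theta=math.radians(80), f=0.035,
                             w=640, h=480, w_img=0.036, h_img=0.027,
                             b=0.07, y_vanish=240)
```

The parallax amount D grows as the Y coordinate approaches the bottom of the image, i.e. for ground points nearer to the cameras, which matches the behavior described in the text.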
- When image data that has undergone the correction processing by the previous image processing unit 14 is input from the previous image processing unit 14 in raster scan order in units of one pixel, the parallax correction control unit 42 of the parallax correction unit 16 recognizes, based on the line synchronization signal input synchronously from the previous image processing unit 14, the Y coordinate value on the image of the one line (one pixel column along the X direction of the image) currently being input, and reads out the parallax amount D associated with the recognized Y coordinate value from the parallax amount information stored in the parallax information storage unit 44.
- The parallax correction control unit 42 then outputs selection signals to the selectors 38 and 40 so that, of the image captured by the imaging device 10L (hereinafter, the left image) and the image captured by the imaging device 10R (hereinafter, the right image), the output of the image data in which the image of the same object located nearer than the point at infinity is shifted toward the upstream side in the raster scan direction due to parallax is relatively delayed by the read parallax amount D. The parallax correction control unit 42 performs the recognition of the Y coordinate value, the reading of the corresponding parallax amount D, and the output of the selection signals to the selectors 38 and 40 every time the line synchronization signal is input from the previous image processing unit 14.
- For example (see FIG. 4C), the image shown in FIG. 3C is converted into the image shown in FIG. 3E through the parallax correction shown in FIG. 3D.
- The parallax correction described above moves, for one of the left image and the right image, the position on the image of each pixel column along the X direction in the X direction, based on the parallax amount information stored in the parallax information storage unit 44. Since this can be realized by the extremely simple process of switching the amount of movement (the number of delayed pixels) according to the Y coordinate value of each pixel column, no high-performance arithmetic unit or the like is required, and the processing can be performed on a moving image in real time, including the generation of the difference image and the analysis of the difference image described later.
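In software terms, the per-line delay realized by the latch groups 34 and 36 and the selectors 38 and 40 amounts to shifting each pixel row of one image in the X direction by the parallax amount D registered for its Y coordinate. A minimal sketch, in which the function name, the zero fill at the vacated edge, and the shift direction are illustrative assumptions:

```python
def parallax_correct(rows, parallax_by_y):
    """Shift each pixel row of an image to the right by the parallax amount D
    registered for its Y coordinate, mimicking the delayed output chosen by
    the selectors 38 and 40. Rows with no registered amount (above the
    vanishing point) pass through unshifted; vacated pixels are zero-filled."""
    corrected = []
    for y, row in enumerate(rows):
        d = parallax_by_y.get(y, 0)
        # Delaying the output by d pixels moves the row content d pixels
        # downstream in raster-scan order.
        corrected.append([0] * d + list(row[:len(row) - d]) if d else list(row))
    return corrected

right = [[1, 2, 3, 4, 5],
         [1, 2, 3, 4, 5]]
shifted = parallax_correct(right, {1: 2})   # row 1 delayed by D = 2 pixels
```

Because the only per-row state is the delay count, this maps directly onto the latch-and-selector hardware described above.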
- Offset correction may be performed simultaneously with the above-described parallax correction. This can be realized either by configuring the parallax correction control unit 42 to apply, as the number of delayed pixels, a value obtained by adding the offset correction amount to the parallax amount D read from the parallax information storage unit 44, or by configuring the device control unit 26 to write into the parallax information storage unit 44 information in which the offset correction amount has been uniformly added to the individual parallax amounts D set in the parallax amount information.
- It is also desirable that the parallax amount information be updated according to the installation status of the photographing devices 10L and 10R, since the installation status will change due to vibration and the like, and changes in parameters such as the inclination angle θ are accordingly expected. For example, an element for measuring the inclination, such as a gyro sensor, may be provided, and parallax amount information for each value of the inclination measured by the element may be calculated and stored in advance.
- Since the parallax amount D changes according to the optical conditions of photographing by the photographing devices 10L and 10R, when optical conditions such as the inclination angle θ of the photographing optical axis or the height h_cam of the photographing optical axis are changed, the apparatus control unit 26 detects the changed optical conditions, recalculates the parallax amount D and regenerates the parallax amount information based on them, and overwrites the parallax amount information stored in the parallax information storage unit 44 with the newly generated information. Thereby, parallax correction according to the changed optical conditions is performed by the parallax correction unit 16.
- The changed optical conditions can be made known to the device control unit 26, for example, by an operator who changed them inputting and setting the changed conditions in the device control unit 26, but the present invention is not limited to this; for the inclination angle θ of the photographing optical axis, for instance, an angle sensor for detecting the inclination angle θ may be provided and the detection performed based on the output of the angle sensor.
- Alternatively, the calculation of the parallax amount D and the generation of the parallax amount information for various optical conditions may be performed in advance and stored in the storage unit 28, and when the optical conditions are changed, the device control unit 26 may read the parallax amount information corresponding to the changed optical conditions from the storage unit 28 and write it into the parallax information storage unit 44.
- The image data of the left image and the right image that have undergone the parallax correction are input in parallel, in units of one pixel, to the difference image generation unit 18. Every time data for one pixel of the left image and the right image is input, the difference image generation unit 18 calculates the absolute value of the difference between the data L for one pixel of the left image and the data R for one pixel of the right image (|L − R|), and sets this absolute value as the luminance value of the corresponding pixel of the difference image.
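The operation of the difference image generation unit 18 reduces to a per-pixel absolute difference; a minimal sketch (the function name is assumed) on images represented as nested lists:

```python
def difference_image(left, right):
    """Per-pixel |L - R| between two parallax-corrected images of the same
    size; the absolute difference becomes the luminance of the difference
    image, so identical (two-dimensional) content yields luminance 0."""
    return [[abs(l - r) for l, r in zip(lrow, rrow)]
            for lrow, rrow in zip(left, right)]

diff = difference_image([[10, 20], [30, 40]],
                        [[10, 25], [35, 40]])
```

In the hardware described above, this runs one pixel at a time as the two corrected streams arrive in step, so no frame buffer is needed.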
- As can be seen from FIG. 3E, which shows the result of performing the parallax correction on the image illustrated in FIG. 3C, the image portion corresponding to an upright three-dimensional object is inclined on the image after the parallax correction. Consequently, in the difference image, pixels corresponding to two-dimensional elements are erased as pixels with luminance 0, and a difference image is obtained in which sets of difference pixels (difference areas) with large difference values (luminance values) are formed mainly at pixels corresponding to three-dimensional objects in the image (particularly pixels corresponding to the edges of three-dimensional objects).
- FIG. 5C shows an example of a difference image generated by the difference image generation unit 18 from the left image shown in FIG. 5A and the right image shown in FIG. 5B, through the correction processing by the previous image processing unit 14 and the parallax correction by the parallax correction unit 16. It can be confirmed that corresponding difference areas are generated on the difference image shown in FIG. 5C by the three-dimensional objects, such as the person and the vehicle, existing as subjects in the left and right images shown in FIGS. 5A and 5B.
- Since the parallax increases as the distance between the imaging devices and a three-dimensional object decreases, that distance corresponds to the width of the corresponding linear difference area: the difference area corresponding to the person, who is closer to the imaging devices, is wider than the linear difference area corresponding to the vehicle. The difference areas corresponding to the background buildings and the white lines of the parking lot are errors (noise) caused by the positional deviation and characteristic differences of the imaging devices 10L and 10R.
- The post-image processing unit 20 performs noise removal processing on the difference image input from the difference image generation unit 18. Filtering such as a known smoothing filter may be applied as the noise removal processing, but in the present embodiment the post-image processing unit 20 performs contraction processing as the noise removal processing.
- the contraction process is an image process in which the minimum value of all pixels in a peripheral region (for example, a region of 3 pixels ⁇ 3 pixels) centered on the target pixel is set as the value of the target pixel.
- By this contraction process, isolated difference pixels in the difference image are removed, and the outermost pixels of each difference area in the difference image are stripped away to a width of one pixel, so that noise superimposed on the difference image input from the difference image generation unit 18 (for example, the difference areas corresponding to the background buildings and the white lines of the parking lot in the example of FIG. 6) is removed.
- The contraction process in the post-image processing unit 20 may be combined with binarization: the difference image input from the difference image generation unit 18 may be binarized first and then contracted, or, if the difference image analysis unit 22 in the subsequent stage processes a binarized difference image, binarization may be performed after the contraction process. As will be described in detail later, when the difference image analysis unit 22 performs the process of detecting the distance range to a three-dimensional object, the contraction process is repeatedly performed on the difference image, so the noise removal (contraction) processing in the post-image processing unit 20 may be omitted in that case.
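The contraction process described above (a minimum filter over a 3 × 3 neighbourhood) can be sketched as follows; clamping the neighbourhood at the image border is an implementation assumption:

```python
def contract(img):
    """One contraction pass: replace each pixel by the minimum over its 3x3
    neighbourhood (clamped at the image border), so isolated difference
    pixels vanish and each difference area loses a one-pixel-wide outer ring."""
    h, w = len(img), len(img[0])
    return [[min(img[ny][nx]
                 for ny in range(max(0, y - 1), min(h, y + 2))
                 for nx in range(max(0, x - 1), min(w, x + 2)))
             for x in range(w)]
            for y in range(h)]

# An isolated difference pixel disappears after a single pass.
noisy = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
cleaned = contract(noisy)
```

Because the filter takes a minimum rather than an average, it works identically on binarized and on grey-valued difference images, which is why binarization can be applied either before or after it.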
- the processing content in the analysis of the difference image by the difference image analysis unit 22 differs depending on the purpose of the three-dimensional object detection by the three-dimensional object detection device 12, the type of information to be provided to the user, and the like.
- the difference image analysis unit 22 performs position detection processing for detecting the position and range of the image region corresponding to the three-dimensional object based on the difference image as the difference image analysis processing. This position detection process is also executed as a pre-process even when a distance range detection process, which will be described later, is performed by the difference image analysis unit 22.
- From the difference image, an image region corresponding to the three-dimensional object is extracted by a known data analysis method such as the projection method. For example, extraction by the projection method can be realized by counting, for each pixel row along the X direction of the binarized difference image, the number of difference pixels in that row to obtain a histogram of difference pixels along the Y direction, and counting, for each pixel column along the Y direction, the number of difference pixels in that column to obtain a histogram of difference pixels along the X direction; the range of X coordinate values in which a peak of the difference pixel count occurs in the X-direction histogram is then extracted as the X coordinate range of the image area corresponding to the three-dimensional object, and the range of Y coordinate values in which a peak occurs in the Y-direction histogram is extracted as the Y coordinate range of that image area.
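The projection method can be sketched as follows; the fixed peak threshold and the function name are illustrative assumptions (a real implementation would pick peaks more robustly than with a single fixed cutoff):

```python
def project_region(binary):
    """Projection method: count difference pixels per pixel row (histogram
    over Y) and per pixel column (histogram over X), then take the coordinate
    ranges where the counts reach a peak threshold as the region of the
    three-dimensional object. Returns ((x_min, x_max), (y_min, y_max))."""
    h, w = len(binary), len(binary[0])
    hist_y = [sum(row) for row in binary]                       # per row along X
    hist_x = [sum(binary[y][x] for y in range(h)) for x in range(w)]
    thresh = 1  # assumption: any nonzero count belongs to the peak
    ys = [y for y, c in enumerate(hist_y) if c >= thresh]
    xs = [x for x, c in enumerate(hist_x) if c >= thresh]
    return ((min(xs), max(xs)), (min(ys), max(ys))) if xs and ys else None

binary = [[0, 0, 0, 0],
          [0, 1, 1, 0],
          [0, 1, 1, 0],
          [0, 0, 0, 0]]
region = project_region(binary)
```

For the small example above the extracted region is the 2 × 2 block of difference pixels.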
- The above process is an example of the processing by the three-dimensional object detection means according to the tenth aspect of the present invention.
- When notifying the user of the position of a three-dimensional object, the difference image analysis unit 22 outputs the ranges of X and Y coordinate values of the image region corresponding to the three-dimensional object to the analysis result output unit 24 as the analysis result. As the process of outputting the analysis result, the analysis result output unit 24 can be configured, for example, to generate, from the image captured by the imaging device 10L or the imaging device 10R, an image to which a frame-shaped figure clearly indicating the range of the image area corresponding to the three-dimensional object has been added, as shown in FIG. 7A, and to display the generated image on a display or the like. Thereby, the user can be notified of the position of a three-dimensional object existing within the imaging range of the imaging devices 10L and 10R.
- When also notifying the user of the approximate distance to the three-dimensional object (a distance range: for example, large / medium / small), the contraction process is repeatedly performed to detect the distance range to the existing three-dimensional object. As described above, the width of the linear difference area corresponding to a three-dimensional object changes according to the distance between the imaging devices and the object: the smaller the distance, the wider the linear difference area. The repeated contraction process makes use of this: while the above-described contraction process is repeated a plurality of times on the difference image, it is determined whether the difference area in each image area extracted by the position detection process has disappeared (alternatively, it may be determined whether the number of difference pixels in each image area has become equal to or less than a predetermined value).
- In the example of FIG. 7, an image region corresponding to a person and an image region corresponding to a vehicle are extracted as image regions corresponding to three-dimensional objects, while the trees and white lines in the image are also extracted as difference areas due to the influence of the positional deviation, characteristic differences, and the like of the photographing devices 10L and 10R. When the contraction process is repeatedly performed on this difference image, after n1 repetitions, as shown in FIG. 7B, the linear difference area corresponding to the person and the linear difference area corresponding to the vehicle retain their width, while the difference areas corresponding to the trees and white lines disappear. A difference area that disappears by n1 repetitions of the contraction process is therefore regarded as noise; a three-dimensional object corresponding to a difference area that disappears by n2 repetitions (the vehicle in the example of FIG. 7) is determined as "distance range from the photographing devices: large"; a three-dimensional object corresponding to a difference area that disappears by n3 repetitions (the person in the example of FIG. 7) is determined as "distance range from the photographing devices: medium"; and if a difference area remains even after n3 repetitions, the corresponding three-dimensional object can be determined as "distance range from the photographing devices: small".
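The distance-range determination by repeated contraction can be sketched as follows; the pass counts n1 < n2 < n3, the clamped-boundary erosion, and the returned labels are illustrative assumptions:

```python
def _contract(img):
    # 3x3 minimum filter: one contraction pass (border clamped).
    h, w = len(img), len(img[0])
    return [[min(img[ny][nx]
                 for ny in range(max(0, y - 1), min(h, y + 2))
                 for nx in range(max(0, x - 1), min(w, x + 2)))
             for x in range(w)] for y in range(h)]

def distance_range(region, n1=1, n2=2, n3=3):
    """Repeat the contraction until the difference area in the extracted
    region disappears; the number of passes needed classifies the distance
    range (the illustrative values n1 < n2 < n3 stand in for the patent's
    repetition counts)."""
    img, passes = region, 0
    while any(any(row) for row in img) and passes < n3:
        img = _contract(img)
        passes += 1
    if any(any(row) for row in img):
        return "distance range: small"      # still present after n3 passes
    if passes <= n1:
        return "noise"                      # gone by n1 passes
    return "distance range: large" if passes <= n2 else "distance range: medium"
```

A one-pixel speck is classified as noise, while a wider difference area, needing more passes to erode away, is classified as a nearer object.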
- The above process is an example of the processing by the three-dimensional object detection means according to the eleventh and twelfth aspects of the present invention.
- When the detection of the distance range is performed, the difference image analysis unit 22 outputs to the analysis result output unit 24, as the analysis result, the determination result of the distance range of the three-dimensional object in addition to the ranges of X and Y coordinate values of the corresponding image region. As the process of outputting the analysis result, the analysis result output unit 24 can be configured, for example, to generate, from the image captured by the imaging device 10L or the imaging device 10R, an image to which a frame-shaped figure clearly indicating the range of the image area corresponding to the three-dimensional object has been added, with the display color of the frame switched according to the determination result of the distance range, and to display the generated image on a display or the like. Thereby, the user can be notified of the position and distance range of a three-dimensional object existing within the imaging range of the imaging devices 10L and 10R. Instead of switching the display color of the frame, the distance range may be displayed with characters or the like.
- The search for the image area corresponding to the three-dimensional object may also be performed on a difference image that has undergone a number of repetitions of the contraction process. For example, an image region corresponding to a person may be extracted from the difference image after n2 repetitions of the contraction process (see FIG. 7C), and this image region may then be searched for in the difference image after n1 repetitions (see FIG. 7B). In this case, since the image area corresponding to the three-dimensional object is searched for in a difference image from which the difference areas that can be regarded as noise have been removed, improved accuracy and reduced processing time in searching for the image area corresponding to the three-dimensional object can be realized.
- When simply determining whether a three-dimensional object exists, the difference image analysis unit 22 performs, as the difference image analysis process, for example, a process of counting the number of difference pixels in the difference image input from the post-image processing unit 20, or the number of difference pixels whose difference value is equal to or greater than a predetermined value, and determining that "a three-dimensional object exists" when the counting result is equal to or greater than a threshold value. Alternatively, the differences represented by the difference pixels can be accumulated over all the difference pixels, and "a three-dimensional object exists" can be determined when the accumulation result is equal to or greater than a threshold value.
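Both variants of the presence determination can be sketched as follows; all three threshold values are illustrative assumptions:

```python
def solid_object_present(diff, pixel_min=10, count_thresh=4):
    """Decide 'a three-dimensional object exists' when the number of
    difference pixels whose value is at least pixel_min reaches
    count_thresh (both thresholds are illustrative)."""
    count = sum(1 for row in diff for v in row if v >= pixel_min)
    return count >= count_thresh

def solid_object_present_sum(diff, sum_thresh=100):
    """Alternative variant: accumulate the difference values over all pixels
    and compare the total against a threshold."""
    return sum(v for row in diff for v in row) >= sum_thresh

diff = [[0, 50], [60, 70], [80, 0]]
```

The counting variant ignores the magnitude of each difference, while the accumulation variant weights strong edges more heavily; which is preferable depends on the noise characteristics of the difference image.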
- The analysis result output unit 24 can be configured to notify the analysis result by voice, for example by outputting a warning sound or a voice message such as "an obstacle has been detected" when the analysis result by the difference image analysis unit 22 is "a three-dimensional object exists".
- FIG. 8A shows a difference image generated from the combination of the left image L and the right image R′, and FIG. 8C shows a difference image generated from the combination of the left image L and the right image R″. FIG. 8B shows a binarized difference image obtained by binarizing the difference image shown in FIG. 8A with a certain threshold value, and FIG. 8D shows a binarized difference image obtained by binarizing the difference image shown in FIG. 8C with a certain threshold value.
- The right image R′ and the right image R″ are images captured by the imaging device 10R in states where the positional relationship with the imaging device 10L differs.
- Whether or not the positional relationship between the imaging devices 10L and 10R is appropriate can be determined by counting the number of difference pixels in the difference image. Since the number of difference pixels corresponding to a three-dimensional object changes only gradually with changes in the positional relationship of the imaging devices 10L and 10R, it is desirable that, for example, as indicated by the white frames in FIGS. 8B and 8D, an operator designates as an evaluation area a range of the difference image excluding the image areas corresponding to three-dimensional objects, and that only the difference pixels within the designated evaluation area be counted.
- In the images, a first person is present in the left region at a position relatively distant from the imaging devices 10L and 10R, and the contour line corresponding to the first person does not appear clearly in the difference image; for this reason, the first person may not be detected as a three-dimensional object. A second person is also present in the right region of the image, likewise at a position relatively far from the imaging devices 10L and 10R.
- In the difference images of FIGS. 9E and 9F, which correspond to the images of FIGS. 9B and 9C, the outline corresponding to the second person is interrupted, so there is a possibility that the distance to the second person is erroneously detected, or that the second person is erroneously detected as a plurality of three-dimensional objects.
- When the imaging devices 10L and 10R are integrated as a single imaging module, an increase in the baseline length b enlarges the imaging module, which leads to increased weight and increased cost of the entire system in order to secure the rigidity of the module, and also causes problems such as difficulty in securing an installation location for the imaging module and a reduced degree of freedom in design.
- in the second embodiment, the parallax correction unit 16 is configured to perform the parallax correction after the left image and the right image have been relatively shifted along the left-right direction (X direction) of the image by a preset shift amount. That is, as shown in FIG. 10, the parallax correction unit 16 according to the second embodiment includes a delay unit 50 arranged in the stage preceding the latch group 36 that transfers the image data of the right image (between the previous image processing unit 14 and the latch group 36).
- a pixel clock is input to the delay unit 50, and shift amount information representing, as a number of pixels, the relative shift amount S between the left image and the right image is input from the device control unit 26.
- the delay unit 50 delays the image data of the right image, which is input pixel by pixel from the previous image processing unit 14, by the number of pixels represented by the shift amount information input from the device control unit 26, and then outputs it sequentially to the latch group 36. The left image and the right image are thereby shifted relative to each other in the left-right direction (X direction) of the image.
- FIGS. 11A to 11C show the same images as FIGS. 3A to 3C.
- the image shift described above is applied by the delay unit 50 of the parallax correction unit 16 to the right image shown in FIG. 11C. As a result, a right image shifted in the left-right direction (X direction) of the image with respect to the left image shown in FIG. 11B (FIG. 11D shows an example) is input to the latch group 36. The right image shown in FIG. 11D is then converted into the image shown in FIG. 11F through the parallax correction shown in FIG. 11E.
- in the figures referred to so far, the lines corresponding to the outline of a three-dimensional object on the difference image have been drawn with a fixed thickness, but the actual thickness of these lines varies, depending among other things on the distance L. When a difference image is generated by performing the image shift in addition to the parallax correction, the thickness of the lines extending in the vertical direction (Y direction) of the image, among the lines corresponding to the outline of the three-dimensional object, increases, as shown in FIG. 12A.
- this suppresses the difference component due to the three-dimensional object from being buried in noise on the difference image, part of the line corresponding to the outline of the three-dimensional object from being interrupted, and the difference component due to the three-dimensional object from failing to appear clearly on the difference image.
- compared with FIG. 12B, which shows the case where the image shift is not performed, the lines corresponding to the outline of the three-dimensional object, in particular those extending in the vertical direction (Y direction) of the image, are clearly thicker.
- as a result, the difference component due to the three-dimensional object on the difference image is not easily lost even when the contraction process is performed repeatedly, so the occurrence of detection failures and erroneous detections of three-dimensional objects can be suppressed without increasing the baseline length b.
- the above aspect is an example of the sixteenth aspect of the present invention, more specifically, the seventeenth aspect of the present invention.
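The robustness just described, namely that the shift thickens the vertical contour lines so they survive the contraction (erosion) process, can be illustrated with a toy model. The binary threshold and the wrap-around neighbourhood are assumptions for the sketch, not details of the embodiment:

```python
import numpy as np

def difference_image(left, right, threshold=10):
    """Binary difference image: 1 where the absolute luminance
    difference exceeds the threshold (threshold value is an assumption)."""
    diff = np.abs(left.astype(int) - right.astype(int))
    return (diff > threshold).astype(np.uint8)

def erode_x(img):
    """One contraction pass along X: a difference pixel survives only
    if both of its horizontal neighbours are also difference pixels."""
    return img & np.roll(img, 1, axis=1) & np.roll(img, -1, axis=1)

# A 1-pixel-wide vertical line (no shift) vanishes after one contraction,
# while a 3-pixel-wide line (thickened by the image shift) survives.
thin = np.array([[0, 0, 0, 1, 0, 0, 0, 0]], dtype=np.uint8)
thick = np.array([[0, 0, 1, 1, 1, 0, 0, 0]], dtype=np.uint8)
print(erode_x(thin).sum(), erode_x(thick).sum())  # 0 1
```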
- since the difference image generated in the second embodiment is an image in which the difference between the left and right images in the left-right direction (X direction) is emphasized, it is preferable to divide the contraction process into contraction along the left-right direction (X direction) of the image and contraction along the vertical direction (Y direction) of the image, and to make the number of executions of the contraction along the vertical direction (Y direction) smaller than the number of executions of the contraction along the left-right direction (X direction). This prevents the lines on the difference image corresponding to the outline of the three-dimensional object that extend in the left-right direction (X direction) from disappearing as the contraction process is executed repeatedly.
- the above item is an example of the twentieth aspect of the present invention.
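The asymmetric schedule above, with fewer contraction passes along Y than along X, can be sketched as follows. This is a software analogue with illustrative pass counts, not the embodiment's circuit:

```python
import numpy as np

def erode_1d(img, axis):
    """One contraction pass along the given axis (0 = Y, 1 = X): a
    pixel survives only if both neighbours along that axis are set."""
    return img & np.roll(img, 1, axis) & np.roll(img, -1, axis)

def asymmetric_contract(img, nx, ny):
    """Run nx contraction passes along X and ny (fewer) along Y, so
    that lines extending in the X direction are not erased."""
    out = img.copy()
    for _ in range(nx):
        out = erode_1d(out, axis=1)
    for _ in range(ny):
        out = erode_1d(out, axis=0)
    return out

# A 1-pixel-tall horizontal contour line spanning the image width:
img = np.zeros((5, 8), dtype=np.uint8)
img[2, :] = 1
print(asymmetric_contract(img, nx=2, ny=0).sum())  # 8  (line preserved)
print(asymmetric_contract(img, nx=2, ny=1).sum())  # 0  (line erased)
```

A single Y-direction pass is enough to erase a 1-pixel-tall horizontal line, which is exactly why the Y-direction count is kept small.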
- based on an empirical rule obtained through experiments by the inventors of the present application, the shift amount S in the second embodiment is desirably no more than 1/2 of the maximum value of the correction amount (the geometric parallax, i.e. the parallax amount D) in the parallax correction.
- for example, when the maximum value of the correction amount (parallax amount D) in the parallax correction is 50 pixels, the shift amount S is set to 25 pixels or less.
- the shift amount S is defined by the shift amount information input from the device control unit 26 to the delay unit 50 of the parallax correction unit 16, so it can easily be changed and set (for example, according to the distance range to be detected) within the range of no more than 1/2 of the maximum value of the correction amount (parallax amount D) in the parallax correction.
- in the second embodiment described above, the delay unit 50 is provided in the stage preceding the latch group 36, so that the image shift (delaying the output of the image data) is performed before the parallax correction, but the present invention is not limited to this. For example, the delay unit 50 may be provided in the stage following the selector 40 (between the selector 40 and the difference image generation unit 18), so that the image shift is performed after the parallax correction.
- in the second embodiment, the delay unit 50 applies the image shift (delaying the output of the image data) to the right image, but the present invention is not limited to this; the left image may be shifted instead.
- Each of the above aspects is also an example of the sixteenth aspect of the present invention, more specifically, the seventeenth aspect of the present invention.
- in the second embodiment, the image shift is realized by having the delay unit 50 delay the output of the image data, but the present invention is not limited to this. The parallax correction amount (number of delayed pixels) in the parallax correction performed by the parallax correction unit 16 may instead be set to a value obtained by adding the correction amount corresponding to the shift amount S to the correction amount corresponding to the geometric parallax (parallax amount D), so that the correction of the geometric parallax (parallax amount D) and the image shift are performed simultaneously by the parallax correction performed by the parallax correction unit 16.
- the correction amount characteristic d shown in FIG. 14 represents the correction amount (number of delayed pixels) corresponding to the geometric parallax (parallax amount D). By adding the shift amount S to the correction amount represented by the correction amount characteristic d, the parallax correction amount (number of delayed pixels) in the parallax correction performed by the parallax correction unit 16 becomes a value obtained by adding the correction amount corresponding to the shift amount S to the correction amount corresponding to the geometric parallax (parallax amount D).
- this can be realized, for example, by having the device control unit 26 calculate the parallax amount D, add the shift amount S to the calculated parallax amount D, associate the resulting sum with the Y coordinate value to generate parallax amount information, and write the generated parallax amount information into the parallax information storage unit 44 of the parallax correction unit 16.
- alternatively, it can be realized by having the parallax correction control unit 42 of the parallax correction unit 16 add the shift amount S to the parallax amount D read from the parallax information storage unit 44 and output the result to the selectors 38 and 40 as a selection signal.
- the above aspect is an example of the sixteenth aspect of the present invention, more specifically, the eighteenth aspect of the present invention.
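The variant just described, folding the shift amount S into the per-row parallax correction amount, can be sketched as a lookup table keyed by the Y coordinate value. The linear model of D(y) below is an illustrative assumption; in the embodiment D is derived from the camera geometry:

```python
def build_delay_table(height, max_d, s):
    """Per-row delay (in pixels) = geometric parallax D(y) + shift S.
    D(y) is modelled here as growing linearly from 0 at the top row to
    max_d at the bottom row (an illustrative assumption)."""
    return [round(max_d * y / (height - 1)) + s for y in range(height)]

# 5 image rows, maximum parallax 8 pixels, shift amount S = 2
table = build_delay_table(height=5, max_d=8, s=2)
print(table)  # [2, 4, 6, 8, 10]
```

With such a table, the selectors 38 and 40 can be driven by a single per-row value that already contains both the geometric correction and the shift.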
- relatively shifting the left image and the right image along the left-right direction (X direction) of the image can also be realized by arranging the imaging optical axes of the imaging device 10L and the imaging device 10R so that the distance between the two axes increases with distance from the imaging devices 10L and 10R.
- FIG. 15 shows a mode in which the imaging optical axis of the imaging device 10R is angled outward from parallel with respect to the imaging optical axis of the imaging device 10L; the baseline length b is also indicated in FIG. 15. Instead, the imaging optical axis of the imaging device 10L may be angled outward from parallel, or the imaging optical axes of both imaging devices 10L and 10R may be angled outward.
- in this arrangement, the parallax amount d3 is a component that does not depend on the distance L, and the orientation of the optical axes is adjusted in advance so that d3 corresponds to the shift amount S.
- also in this aspect, since the parallax correction unit 16 performs the parallax correction that corrects the geometric parallax (parallax amount D), the difference image generation unit 18 generates a difference image in which the difference between the left image and the right image in the left-right direction (X direction) of the image is emphasized.
- in each of the above embodiments, no dedicated imaging light source is installed, and the imaging devices 10L and 10R are assumed to perform imaging using natural light as the illumination light, but the present invention is not limited to this. For example, a light source that emits near-infrared light may be installed as an imaging light source, and the imaging devices 10L and 10R may be configured to perform imaging using the light emitted from this light source and reflected by the subject. The illumination light may also be light in a wavelength range other than near-infrared light.
- in each of the above embodiments, difference information (a difference image) representing the difference in luminance (or in density, or in brightness V in the HSV color space, where H is the hue and S is the saturation) between the first image and the second image is generated, but the present invention is not limited to this. When the first image and the second image are color images, difference information (a difference image) representing the difference in hue or in saturation between the first image and the second image may be generated instead.
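For the color-image variant, the per-pixel difference can be based on hue rather than luminance. A minimal sketch using the HSV conversion from the Python standard library; the wrap-around handling is an assumption about how a circular hue difference would be measured:

```python
import colorsys

def hue_difference(rgb1, rgb2):
    """Hue difference between two RGB pixels (components in [0, 1]),
    wrapped so that the result lies in [0, 0.5], since hue is circular."""
    h1 = colorsys.rgb_to_hsv(*rgb1)[0]
    h2 = colorsys.rgb_to_hsv(*rgb2)[0]
    d = abs(h1 - h2)
    return min(d, 1.0 - d)

# Pure red vs pure green differ by one third of the hue circle
print(round(hue_difference((1.0, 0.0, 0.0), (0.0, 1.0, 0.0)), 4))  # 0.3333
```

Thresholding this value per pixel would yield a hue-based difference image analogous to the luminance-based one.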
- although the contraction process has been described above as being used on its own, it may also be combined with other filter processes such as an expansion process; for example, the contraction process and the expansion process may be executed alternately according to their respective numbers of executions, such as performing one expansion every time the contraction process has been performed several times. This prevents the difference image from becoming fragmented.
- other image processing, such as a thinning process, may also be applied.
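The interleaving described above, an occasional expansion pass among the contraction passes so that difference regions are not fragmented, can be sketched as follows. The schedule of one expansion per two contractions is illustrative, not prescribed by the embodiment:

```python
import numpy as np

def erode_x(img):
    """Contraction along X: keep a pixel only if both X-neighbours are set."""
    return img & np.roll(img, 1, axis=1) & np.roll(img, -1, axis=1)

def dilate_x(img):
    """Expansion along X: set a pixel if it or either X-neighbour is set."""
    return img | np.roll(img, 1, axis=1) | np.roll(img, -1, axis=1)

def contract_with_expansion(img, n_contract, expand_every=2):
    """After every `expand_every` contraction passes, run one expansion
    pass, which restores some of the eroded width of difference regions."""
    out = img.copy()
    for i in range(1, n_contract + 1):
        out = erode_x(out)
        if i % expand_every == 0:
            out = dilate_x(out)
    return out

# A 5-pixel-wide difference region: two plain contractions leave 1 pixel,
# whereas contraction interleaved with expansion retains 3 pixels.
stripe = np.array([[0, 1, 1, 1, 1, 1, 0, 0]], dtype=np.uint8)
print(contract_with_expansion(stripe, 2).sum())  # 3
print(erode_x(erode_x(stripe)).sum())            # 1
```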
- in each of the above embodiments, the imaging devices 10L and 10R and the three-dimensional object detection device 12 are mounted on a vehicle, and the detection of a three-dimensional object (obstacle) present in the space ahead of the vehicle has been described as an example. However, the use of the image processing apparatus is not limited to this; it can of course be applied to various other uses, such as the detection of obstacles by a self-propelled robot.
- all of the processing by the three-dimensional object detection device 12 according to the present embodiment can also be replaced with processing by a computer, and the above processing may be performed by a computer.
- in this case, the program for causing the computer to perform the above processing is an example of the image processing program according to the twenty-fourth aspect of the present invention, and when the computer executes the program, the computer functions as the image processing apparatus according to the first aspect of the present invention.
- the above program can also be provided in a form recorded on a recording medium such as a CD-ROM or DVD-ROM. The recording medium in this aspect is an example of the recording medium according to the twenty-fifth aspect of the present invention.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Multimedia (AREA)
- Remote Sensing (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Measurement Of Optical Distance (AREA)
Abstract
Description
L=b×f/D …(1)
By performing the above processing for every pixel of the reference image, the distance to each object captured in the image can be detected. In FIG. 16, reference numeral 106 denotes the imaging surface, and reference numeral 108 denotes a corresponding point (corresponding pixel) on the comparison image.
D=b×f/L_cam×(w/2)/(w_img/2) …(1)
where L_cam is the straight-line distance between the first imaging device or the second imaging device and an object that is located on the ground and is imaged at the position (number of pixels) P on the image along the second direction, and L is the distance between the object and the first imaging device or the second imaging device:
L_cam=√(h_cam²+L²)×cos(tan⁻¹(L/h_cam)−θ) …(2)
L=tan(tan⁻¹((P−(h/2))×(h_img/2)/(h/2)/f)+θ)×h_cam …(3)
D is derived in advance by computing equations (1) to (3) above.
FIG. 1 shows the imaging devices 10L and 10R according to the present embodiment and a three-dimensional object detection device 12 that is connected to the imaging devices 10L and 10R and detects three-dimensional objects in the images captured by the imaging devices 10L and 10R.
D=b×f/L_cam×(w/2)/(w_img/2) …(1)
Here, L_cam in equation (1) above is the straight-line distance between the imaging devices 10L, 10R and an object that is located on the ground and is imaged at the position P on the image along the Y direction, and is obtained by
L_cam=√(h_cam²+L²)×cos(tan⁻¹(L/h_cam)−θ) …(2)
Further, L in equation (2) above is the distance between the object and the imaging devices 10L, 10R. Since the intersection of the imaging optical axis with the ground (the position marked with a star in FIGS. 4A and 4B) has a Y coordinate value of "h/2" on the image (see FIG. 4B), L is obtained by
L=tan(tan⁻¹((P−(h/2))×(h_img/2)/(h/2)/f)+θ)×h_cam …(3)
from which L is obtained.
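Equations (1) to (3) can be evaluated per image row to build the parallax amount table used by the parallax correction. The following sketch evaluates them directly; the numeric camera parameters are illustrative assumptions (all lengths in one common unit, θ in radians), not values from the embodiment:

```python
import math

def parallax_amount(P, b, f, theta, h_cam, w, h, w_img, h_img):
    """Geometric parallax D (in pixels) for a ground point imaged at
    vertical image position P, following Eqs. (1)-(3)."""
    # Eq. (3): distance L between the ground point and the cameras
    L = math.tan(math.atan((P - h / 2) * (h_img / 2) / (h / 2) / f) + theta) * h_cam
    # Eq. (2): straight-line distance from a camera to the ground point
    L_cam = math.sqrt(h_cam ** 2 + L ** 2) * math.cos(math.atan(L / h_cam) - theta)
    # Eq. (1): parallax expressed in pixels
    return b * f / L_cam * (w / 2) / (w_img / 2)

# Illustrative parameters: 100 mm baseline, 8 mm focal length, optical
# axis tilted 1.3 rad from vertical, camera 1500 mm above the ground,
# a 640x480 image on a 6.4x4.8 mm sensor.
params = dict(b=100, f=8, theta=1.3, h_cam=1500, w=640, h=480, w_img=6.4, h_img=4.8)
print(round(parallax_amount(300, **params), 2))
print(round(parallax_amount(460, **params), 2))
```

As expected from Eq. (1), rows that map to nearer ground points yield a larger parallax amount D than rows that map to distant ones.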
Next, a second embodiment of the present invention will be described. Parts identical to those of the first embodiment are denoted by the same reference numerals, and their description is omitted.
Claims (25)
1. An image processing apparatus comprising:
storage means for storing, for each position on the image along a second direction on the image corresponding to the vertical direction, deviation amount information representing a deviation amount, corresponding to parallax, along a first direction on the image corresponding to the horizontal direction, between a first image captured by a first imaging device and a second image captured by a second imaging device whose position in the horizontal direction differs from that of the first imaging device;
acquisition means for acquiring the first image captured by the first imaging device and the second image captured by the second imaging device;
processing means for performing, on each pixel row along the first direction in the image, parallax correction in which the position on the image along the first direction of the pixel row is relatively moved between the first image and the second image in accordance with the deviation amount represented by the deviation amount information corresponding to the position on the image along the second direction of the pixel row; and
generation means for generating difference information representing a difference between the first image and the second image that have undergone the parallax correction by the processing means.
2. The image processing apparatus according to claim 1, wherein the generation means generates, as the difference information, a difference image representing the difference between the first image and the second image pixel by pixel, and the apparatus further comprises removal means for removing noise from the difference image by performing a contraction process on difference pixels, present in the difference image generated by the generation means, that indicate the presence of a difference between the first image and the second image.
3. The image processing apparatus according to claim 1 or claim 2, wherein the deviation amount represented by the deviation amount information stored in the storage means is a deviation amount D expressed as a number of pixels, and the deviation amount D is derived in advance by computing equations (1) to (3) below, where h_cam is the height of the imaging optical axis above the ground at the installation position of the first imaging device and the second imaging device, θ is the inclination angle of the imaging optical axis with respect to the vertical direction, f is the focal length of the optical system, w is the number of pixels of the image along the first direction, h is the number of pixels along the second direction, w_img is the imaging size of the image along the first direction, h_img is the imaging size along the second direction, P is the position (number of pixels) on the image along the second direction, and b is the baseline length, i.e. the distance between the imaging optical axes of the first imaging device and the second imaging device:
D=b×f/L_cam×(w/2)/(w_img/2) …(1)
where L_cam is the straight-line distance between the first imaging device or the second imaging device and an object that is located on the ground and is imaged at the position (number of pixels) P on the image along the second direction, and L is the distance between the object and the first imaging device or the second imaging device:
L_cam=√(h_cam²+L²)×cos(tan⁻¹(L/h_cam)−θ) …(2)
L=tan(tan⁻¹((P−(h/2))×(h_img/2)/(h/2)/f)+θ)×h_cam …(3)
4. The image processing apparatus according to any one of claims 1 to 3, wherein the deviation amount information is information representing the deviation amount, as a number of pixels, at each position on the image along the first direction, and the processing means comprises delay means for delaying the output of the first image or the second image pixel by pixel and, when outputting the first image and the second image pixel by pixel in parallel to the generation means, performs the parallax correction, in which the output of the first image or the output of the second image is relatively delayed by the delay means by the number of pixels represented by the deviation amount corresponding to the position on the image along the first direction of the pixel row to be output, while switching the number of delay pixels of the delay means in accordance with changes in the position on the image along the first direction of the pixel row to be output.
5. The image processing apparatus according to claim 4, wherein the delay means is configured by connecting in series a plurality of delay sections each of which delays the output of the first image or the second image by one pixel, and the processing means delays the data to be output as the first image or the second image by the number of pixels by selecting, as the data to be output as the first image or the second image, data that has passed through a number of the delay sections corresponding to the number of pixels represented by the deviation amount corresponding to the position on the image along the first direction of the pixel row to be output.
6. The image processing apparatus according to any one of claims 1 to 5, wherein the processing means excludes from the parallax correction any pixel row whose position on the image along the first direction lies within a range corresponding to the vertically upper side of a position corresponding to a preset horizon line.
7. The image processing apparatus according to any one of claims 1 to 6, further comprising correction means for correcting, for the first image and the second image acquired by the acquisition means, at least one of a difference in imaging range along the horizontal direction between the first imaging device and the second imaging device, a difference in imaging magnification, a difference in rotation angle about the imaging optical axis, and a difference in luminance, wherein the processing means performs the parallax correction on the first image and the second image that have undergone the correction by the correction means.
8. The image processing apparatus according to any one of claims 1 to 7, further comprising three-dimensional object detection means for detecting, based on the difference information generated by the generation means, a three-dimensional object present within the imaging ranges of the first imaging device and the second imaging device.
9. The image processing apparatus according to claim 8, further comprising output means for outputting a result of detection of the three-dimensional object by the three-dimensional object detection means.
10. The image processing apparatus according to claim 8 or claim 9, wherein the generation means generates, as the difference information, a difference image representing the difference between the first image and the second image pixel by pixel, and the three-dimensional object detection means detects the position on the image of an image region corresponding to the three-dimensional object by extracting the image region corresponding to the three-dimensional object from the difference image based on the distribution, on the difference image, of difference pixels that indicate the presence of a difference between the first image and the second image.
11. The image processing apparatus according to claim 8 or claim 9, wherein the generation means generates, as the difference information, a difference image representing the difference between the first image and the second image pixel by pixel, and the three-dimensional object detection means extracts image regions corresponding to the three-dimensional object from the difference image generated by the generation means and detects the distance to the three-dimensional object corresponding to each image region based on the width of a linear difference region, composed of difference pixels indicating the presence of a difference between the first image and the second image, present within each extracted image region.
12. The image processing apparatus according to any one of claims 8, 9 and 11, wherein the generation means generates, as the difference information, a difference image representing the difference between the first image and the second image pixel by pixel, and the three-dimensional object detection means extracts image regions corresponding to the three-dimensional object from the difference image generated by the generation means, repeatedly performs the contraction process on the difference pixels, present in the difference image, that indicate the presence of a difference between the first image and the second image while judging, for each extracted image region, whether the linear difference region composed of the difference pixels present in that image region has disappeared, and detects the distance to the three-dimensional object corresponding to each image region based on the number of executions of the contraction process at the time the linear difference region disappeared from that image region.
13. The image processing apparatus according to claim 9, wherein the generation means generates, as the difference information, a difference image representing the difference between the first image and the second image pixel by pixel, the three-dimensional object detection means performs the contraction process a plurality of times on the difference pixels, present in the difference image, that indicate the presence of a difference between the first image and the second image, and judges whether a linear difference region composed of the difference pixels has disappeared from the difference image, and the output means switches whether to output a three-dimensional object detection signal, or the type of three-dimensional object detection signal to be output, depending on whether the number of executions of the contraction process at the time the linear difference region disappeared from the difference image is less than a threshold value, or on whether the linear difference region has disappeared from the difference image at the time the contraction process has been executed a predetermined number of times.
14. The image processing apparatus according to any one of claims 1 to 13, wherein the generation means generates, as the difference information, a difference image representing the difference between the first image and the second image pixel by pixel, and the apparatus further comprises pixel number counting means for counting the number of difference pixels, present in the difference image generated by the generation means, that indicate the presence of a difference between the first image and the second image, and for outputting the counting result.
15. The image processing apparatus according to claim 14, wherein the pixel number counting means counts the number of the difference pixels present within a region of the difference image designated via designation means.
16. The image processing apparatus according to any one of claims 1 to 15, wherein the generation means generates a difference image representing, pixel by pixel, the difference between the first image and the second image that have undergone the parallax correction by the processing means, and the difference image generated by the generation means is configured to be a difference image representing the difference between the first image and the second image in a state in which geometric parallax has been corrected by the parallax correction of the processing means and the first image and the second image have been relatively shifted along the first direction by a preset shift amount.
17. The image processing apparatus according to claim 16, wherein the deviation amount represented by the deviation amount information stored in the storage means is a deviation amount corresponding to the geometric parallax, and the processing means relatively shifts the first image and the second image along the first direction by the preset shift amount before or after performing the parallax correction on the first image and the second image.
18. The image processing apparatus according to claim 16, wherein the deviation amount represented by the deviation amount information stored in the storage means is either a deviation amount obtained by adding a deviation amount corresponding to the preset shift amount to the deviation amount corresponding to the geometric parallax, or a deviation amount corresponding to the geometric parallax to which a deviation amount corresponding to the shift amount is added before it is used for the parallax correction by the processing means.
19. The image processing apparatus according to claim 16, wherein the deviation amount represented by the deviation amount information stored in the storage means is a deviation amount corresponding to the geometric parallax, and the first imaging device and the second imaging device are oriented such that the distance between the imaging optical axis of the first imaging device and the imaging optical axis of the second imaging device increases with distance from the first imaging device and the second imaging device, the orientation being adjusted such that the first image and the second image are relatively shifted along the first direction by a preset shift amount.
20. The image processing apparatus according to any one of claims 16 to 19, further comprising contraction processing means for executing, on the difference pixels, present in the difference image generated by the generation means, that indicate the presence of a difference between the first image and the second image, the contraction process along the second direction a first number of times and the contraction process along the first direction a second number of times that is greater than the first number.
21. The image processing apparatus according to any one of claims 16 to 19, wherein the shift amount is set to a magnitude of no more than 1/2 of the maximum value of the deviation amount corresponding to the geometric parallax.
22. The image processing apparatus according to claim 1, wherein the first imaging device and the second imaging device are oriented such that the distance between the imaging optical axis of the first imaging device and the imaging optical axis of the second imaging device increases with distance from the first imaging device and the second imaging device.
23. An image processing method in which: storage means stores, for each position on the image along a second direction on the image corresponding to the vertical direction, deviation amount information representing a deviation amount, corresponding to parallax, along a first direction on the image corresponding to the horizontal direction, between a first image captured by a first imaging device and a second image captured by a second imaging device whose position in the horizontal direction differs from that of the first imaging device; acquisition means acquires the first image captured by the first imaging device and the second image captured by the second imaging device; processing means performs, on each pixel row along the first direction in the image, parallax correction in which the position on the image along the first direction of the pixel row is relatively moved between the first image and the second image in accordance with the deviation amount represented by the deviation amount information corresponding to the position on the image along the second direction of the pixel row; and generation means generates difference information representing a difference between the first image and the second image that have undergone the parallax correction by the processing means.
24. An image processing program for causing a computer, connected to storage means that stores, for each position on the image along a second direction on the image corresponding to the vertical direction, deviation amount information representing a deviation amount, corresponding to parallax, along a first direction on the image corresponding to the horizontal direction, between a first image captured by a first imaging device and a second image captured by a second imaging device whose position in the horizontal direction differs from that of the first imaging device, to function as: acquisition means for acquiring the first image captured by the first imaging device and the second image captured by the second imaging device; processing means for performing, on each pixel row along the first direction in the image, parallax correction in which the position on the image along the first direction of the pixel row is relatively moved between the first image and the second image in accordance with the deviation amount represented by the deviation amount information corresponding to the position on the image along the second direction of the pixel row; and generation means for generating difference information representing a difference between the first image and the second image that have undergone the parallax correction by the processing means.
25. A recording medium on which is recorded an image processing program for causing a computer, connected to storage means that stores, for each position on the image along a second direction on the image corresponding to the vertical direction, deviation amount information representing a deviation amount, corresponding to parallax, along a first direction on the image corresponding to the horizontal direction, between a first image captured by a first imaging device and a second image captured by a second imaging device whose position in the horizontal direction differs from that of the first imaging device, to function as: acquisition means for acquiring the first image captured by the first imaging device and the second image captured by the second imaging device; processing means for performing, on each pixel row along the first direction in the image, parallax correction in which the position on the image along the first direction of the pixel row is relatively moved between the first image and the second image in accordance with the deviation amount represented by the deviation amount information corresponding to the position on the image along the second direction of the pixel row; and generation means for generating difference information representing a difference between the first image and the second image that have undergone the parallax correction by the processing means.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201180014515.1A CN102792333B (zh) | 2010-03-19 | 2011-03-15 | 图像处理装置、方法、程序以及记录介质 |
US13/132,976 US8917929B2 (en) | 2010-03-19 | 2011-03-15 | Image processing apparatus, method, program, and recording medium |
JP2012505712A JP5792157B2 (ja) | 2010-03-19 | 2011-03-15 | 画像処理装置、方法、プログラム及び記録媒体 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-065166 | 2010-03-19 | ||
JP2010065166 | 2010-03-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011115142A1 true WO2011115142A1 (ja) | 2011-09-22 |
Family
ID=44649226
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/056118 WO2011115142A1 (ja) | 2010-03-19 | 2011-03-15 | 画像処理装置、方法、プログラム及び記録媒体 |
Country Status (4)
Country | Link |
---|---|
US (1) | US8917929B2 (ja) |
JP (1) | JP5792157B2 (ja) |
CN (1) | CN102792333B (ja) |
WO (1) | WO2011115142A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014089548A (ja) * | 2012-10-30 | 2014-05-15 | Sharp Corp | 路面段差検出方法、路面段差検出装置、路面段差検出装置を備えた車両 |
CN103843333A (zh) * | 2011-09-29 | 2014-06-04 | 富士胶片株式会社 | 控制立体图像的显示的方法、控制立体图像显示的装置以及成像装置 |
JP7015078B1 (ja) | 2020-12-01 | 2022-02-02 | 勇祐 鈴木 | 検査方法および検査システム |
Families Citing this family (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011115142A1 (ja) * | 2010-03-19 | 2011-09-22 | Okiセミコンダクタ株式会社 | 画像処理装置、方法、プログラム及び記録媒体 |
DE102011011929A1 (de) * | 2011-02-18 | 2012-08-23 | Hella Kgaa Hueck & Co. | Verfahren zum Detektieren von Zielobjekten in einem Überwachungsbereich |
JP5367749B2 (ja) * | 2011-03-25 | 2013-12-11 | 株式会社東芝 | サーバ装置、通信方法およびプログラム |
KR101850386B1 (ko) * | 2011-04-19 | 2018-04-19 | 엘지전자 주식회사 | 로봇 청소기 및 이의 제어 방법 |
DE102012002321B4 (de) * | 2012-02-06 | 2022-04-28 | Airbus Defence and Space GmbH | Verfahren zur Erkennung eines vorgegebenen Musters in einem Bilddatensatz |
JP5773944B2 (ja) * | 2012-05-22 | 2015-09-02 | 株式会社ソニー・コンピュータエンタテインメント | 情報処理装置および情報処理方法 |
US9544574B2 (en) * | 2013-12-06 | 2017-01-10 | Google Inc. | Selecting camera pairs for stereoscopic imaging |
US9565416B1 (en) | 2013-09-30 | 2017-02-07 | Google Inc. | Depth-assisted focus in multi-camera systems |
JP6151150B2 (ja) * | 2013-10-07 | 2017-06-21 | 日立オートモティブシステムズ株式会社 | 物体検出装置及びそれを用いた車両 |
CN104506842A (zh) | 2015-01-15 | 2015-04-08 | 京东方科技集团股份有限公司 | 三维摄像头模组、终端设备以及测距方法 |
JP6452585B2 (ja) | 2015-10-01 | 2019-01-16 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置および位置情報取得方法 |
CN108541305A (zh) | 2015-12-21 | 2018-09-14 | 株式会社小糸制作所 | 车辆用图像获取装置和包括该装置的车辆 |
EP3396410A4 (en) | 2015-12-21 | 2019-08-21 | Koito Manufacturing Co., Ltd. | IMAGE ACQUISITION DEVICE FOR VEHICLES, CONTROL DEVICE, VEHICLE HAVING IMAGE ACQUISITION DEVICE FOR VEHICLES AND CONTROL DEVICE, AND IMAGE ACQUISITION METHOD FOR VEHICLES |
US11194023B2 (en) | 2015-12-21 | 2021-12-07 | Koito Manufacturing Co., Ltd. | Image acquiring apparatus for vehicle, control device, vehicle having image acquiring apparatus for vehicle or control device, and image acquiring method for vehicle |
EP3396414A4 (en) | 2015-12-21 | 2019-08-21 | Koito Manufacturing Co., Ltd. | IMAGING APPARATUS FOR USE BY A VEHICLE, CONTROL DEVICE, VEHICLE WITH THE CONTROL DEVICE OR IMAGING DEVICE FOR USE BY A VEHICLE AND IMAGE RECORDING FOR USE BY A VEHICLE |
KR102565485B1 (ko) * | 2016-01-11 | 2023-08-14 | 한국전자통신연구원 | 도시 거리 검색 서비스 제공 서버 및 방법 |
JP6780983B2 (ja) * | 2016-08-18 | 2020-11-04 | マクセル株式会社 | 画像処理システム |
US11361380B2 (en) * | 2016-09-21 | 2022-06-14 | Allstate Insurance Company | Enhanced image capture and analysis of damaged tangible objects |
JP6794243B2 (ja) * | 2016-12-19 | 2020-12-02 | 日立オートモティブシステムズ株式会社 | 物体検出装置 |
US10984555B2 (en) * | 2017-03-30 | 2021-04-20 | Mitsubishi Electric Corporation | Object detection device and vehicle |
JP6953247B2 (ja) * | 2017-09-08 | 2021-10-27 | ラピスセミコンダクタ株式会社 | ゴーグル型表示装置、視線検出方法及び視線検出システム |
US11158088B2 (en) * | 2017-09-11 | 2021-10-26 | Tusimple, Inc. | Vanishing point computation and online alignment system and method for image guided stereo camera optical axes alignment |
US11089288B2 (en) | 2017-09-11 | 2021-08-10 | Tusimple, Inc. | Corner point extraction system and method for image guided stereo camera optical axes alignment |
JP2019145571A (ja) * | 2018-02-16 | 2019-08-29 | 株式会社東芝 | 半導体装置 |
WO2020263498A1 (en) * | 2019-06-25 | 2020-12-30 | Snap Inc. | Vanishing point stereoscopic image correction |
CN111914764B (zh) * | 2020-08-05 | 2023-09-15 | 杭州睿琪软件有限公司 | 图像处理方法、图像处理装置、电子设备和存储介质 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07306037A (ja) * | 1994-05-11 | 1995-11-21 | Nippon Soken Inc | 立体物領域検出装置及び立体物領域迄の距離測定装置及びそれらの検出、測定方法 |
JP2009088840A (ja) * | 2007-09-28 | 2009-04-23 | Saxa Inc | ステレオ画像処理装置及び同画像処理用プログラム |
Family Cites Families (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5963664A (en) * | 1995-06-22 | 1999-10-05 | Sarnoff Corporation | Method and system for image combination using a parallax-based technique |
US5825915A (en) * | 1995-09-12 | 1998-10-20 | Matsushita Electric Industrial Co., Ltd. | Object detecting apparatus in which the position of a planar object is estimated by using hough transform |
US6108440A (en) * | 1996-06-28 | 2000-08-22 | Sony Corporation | Image data converting method |
WO1998004087A1 (fr) * | 1996-07-18 | 1998-01-29 | Sanyo Electric Co., Ltd. | Dispositif et procede pour convertir des signaux video bidimensionnels en signaux video tridimensionnels |
JPH10178564A (ja) * | 1996-10-17 | 1998-06-30 | Sharp Corp | パノラマ画像作成装置及び記録媒体 |
US6512857B1 (en) * | 1997-05-09 | 2003-01-28 | Sarnoff Corporation | Method and apparatus for performing geo-spatial registration |
US6597818B2 (en) * | 1997-05-09 | 2003-07-22 | Sarnoff Corporation | Method and apparatus for performing geo-spatial registration of imagery |
US6269175B1 (en) * | 1998-08-28 | 2001-07-31 | Sarnoff Corporation | Method and apparatus for enhancing regions of aligned images using flow estimation |
JP3798161B2 (ja) * | 1998-10-29 | 2006-07-19 | 株式会社ニデック | 眼底計測装置及び眼底計測プログラムを記録した記録媒体 |
JP2000293693A (ja) | 1999-03-31 | 2000-10-20 | Toshiba Corp | 障害物検出方法および装置 |
JP3867883B2 (ja) * | 1999-06-01 | 2007-01-17 | 株式会社リコー | 画像合成処理方法、画像合成処理装置及び記録媒体 |
JP3263931B2 (ja) | 1999-09-22 | 2002-03-11 | 富士重工業株式会社 | ステレオマッチング装置 |
US6829383B1 (en) * | 2000-04-28 | 2004-12-07 | Canon Kabushiki Kaisha | Stochastic adjustment of differently-illuminated images |
US6714672B1 (en) * | 1999-10-27 | 2004-03-30 | Canon Kabushiki Kaisha | Automated stereo fundus evaluation |
US6862364B1 (en) * | 1999-10-27 | 2005-03-01 | Canon Kabushiki Kaisha | Stereo image processing for radiography |
ATE278298T1 (de) * | 1999-11-26 | 2004-10-15 | Sanyo Electric Co | Verfahren zur 2d/3d videoumwandlung |
JP4610799B2 (ja) * | 2001-06-25 | 2011-01-12 | オリンパス株式会社 | 立体観察システム、及び内視鏡装置 |
CN1675937B (zh) * | 2002-08-20 | 2011-08-24 | 江良一成 | 生成立体图像的方法和装置 |
JP4095491B2 (ja) * | 2003-05-19 | 2008-06-04 | 本田技研工業株式会社 | 距離測定装置、距離測定方法、及び距離測定プログラム |
JP2004363758A (ja) * | 2003-06-03 | 2004-12-24 | Konica Minolta Photo Imaging Inc | 画像処理方法、撮像装置、画像処理装置及び画像記録装置 |
US7593597B2 (en) * | 2003-08-06 | 2009-09-22 | Eastman Kodak Company | Alignment of lens array images using autocorrelation |
JP4069855B2 (ja) * | 2003-11-27 | 2008-04-02 | ソニー株式会社 | 画像処理装置及び方法 |
US7376250B2 (en) * | 2004-01-05 | 2008-05-20 | Honda Motor Co., Ltd. | Apparatus, method and program for moving object detection |
WO2005081178A1 (en) * | 2004-02-17 | 2005-09-01 | Yeda Research & Development Co., Ltd. | Method and apparatus for matching portions of input images |
US7617022B1 (en) * | 2004-07-01 | 2009-11-10 | Rockwell Collins, Inc. | Dual wavelength enhanced vision system optimized for visual landing light alignment |
US7742634B2 (en) * | 2005-03-15 | 2010-06-22 | Omron Corporation | Image processing method, three-dimensional position measuring method and image processing apparatus |
JP2007235642A (ja) | 2006-03-02 | 2007-09-13 | Hitachi Ltd | 障害物検知システム |
JP5234894B2 (ja) * | 2007-06-28 | 2013-07-10 | 富士重工業株式会社 | ステレオ画像処理装置 |
JP4856611B2 (ja) * | 2007-10-29 | 2012-01-18 | 富士重工業株式会社 | 物体検出装置 |
JP4956452B2 (ja) * | 2008-01-25 | 2012-06-20 | 富士重工業株式会社 | 車両用環境認識装置 |
JP4886716B2 (ja) * | 2008-02-26 | 2012-02-29 | 富士フイルム株式会社 | 画像処理装置および方法並びにプログラム |
JP2010011441A (ja) * | 2008-05-26 | 2010-01-14 | Sanyo Electric Co Ltd | 撮像装置及び画像再生装置 |
US8600193B2 (en) * | 2008-07-16 | 2013-12-03 | Varian Medical Systems, Inc. | Image stitching and related method therefor |
WO2010035443A1 (ja) * | 2008-09-26 | 2010-04-01 | パナソニック株式会社 | 映像信号処理装置及び映像信号処理方法 |
JP5402715B2 (ja) * | 2009-06-29 | 2014-01-29 | ソニー株式会社 | 立体画像データ送信装置、立体画像データ送信方法、立体画像データ受信装置および立体画像データ受信方法 |
WO2011115142A1 (ja) * | 2010-03-19 | 2011-09-22 | Okiセミコンダクタ株式会社 | 画像処理装置、方法、プログラム及び記録媒体 |
US8878950B2 (en) * | 2010-12-14 | 2014-11-04 | Pelican Imaging Corporation | Systems and methods for synthesizing high resolution images using super-resolution processes |
JP2012186781A (ja) * | 2011-02-18 | 2012-09-27 | Sony Corp | 画像処理装置および画像処理方法 |
JP2012198075A (ja) * | 2011-03-18 | 2012-10-18 | Ricoh Co Ltd | ステレオカメラ装置、画像補整方法 |
JP2013114517A (ja) * | 2011-11-29 | 2013-06-10 | Sony Corp | 画像処理装置、および画像処理方法、並びにプログラム |
-
2011
- 2011-03-15 WO PCT/JP2011/056118 patent/WO2011115142A1/ja active Application Filing
- 2011-03-15 CN CN201180014515.1A patent/CN102792333B/zh not_active Expired - Fee Related
- 2011-03-15 JP JP2012505712A patent/JP5792157B2/ja not_active Expired - Fee Related
- 2011-03-15 US US13/132,976 patent/US8917929B2/en not_active Expired - Fee Related
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103843333A (zh) * | 2011-09-29 | 2014-06-04 | 富士胶片株式会社 | 控制立体图像的显示的方法、控制立体图像显示的装置以及成像装置 |
JP2014089548A (ja) * | 2012-10-30 | 2014-05-15 | Sharp Corp | 路面段差検出方法、路面段差検出装置、路面段差検出装置を備えた車両 |
JP7015078B1 (ja) | 2020-12-01 | 2022-02-02 | 勇祐 鈴木 | 検査方法および検査システム |
JP2022087382A (ja) * | 2020-12-01 | 2022-06-13 | 勇祐 鈴木 | 検査方法および検査システム |
Also Published As
Publication number | Publication date |
---|---|
US8917929B2 (en) | 2014-12-23 |
US20110311130A1 (en) | 2011-12-22 |
CN102792333A (zh) | 2012-11-21 |
JPWO2011115142A1 (ja) | 2013-06-27 |
CN102792333B (zh) | 2016-11-23 |
JP5792157B2 (ja) | 2015-10-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5792157B2 (ja) | 画像処理装置、方法、プログラム及び記録媒体 | |
KR101121034B1 (ko) | 복수의 이미지들로부터 카메라 파라미터를 얻기 위한 시스템과 방법 및 이들의 컴퓨터 프로그램 제품 | |
CN111566437B (zh) | 三维测量系统以及三维测量方法 | |
US11039121B2 (en) | Calibration apparatus, chart for calibration, chart pattern generation apparatus, and calibration method | |
KR101666959B1 (ko) | 카메라로부터 획득한 영상에 대한 자동보정기능을 구비한 영상처리장치 및 그 방법 | |
WO2014061372A1 (ja) | 画像処理装置、画像処理方法および画像処理プログラム | |
CN111462503B (zh) | 车辆测速方法、装置及计算机可读存储介质 | |
WO2014002849A1 (ja) | 3次元測定方法、装置及びシステム、並びに画像処理装置 | |
US20200068186A1 (en) | Stereo camera | |
JP2006252473A (ja) | 障害物検出装置、キャリブレーション装置、キャリブレーション方法およびキャリブレーションプログラム | |
JP2004117078A (ja) | 障害物検出装置及び方法 | |
JP5156601B2 (ja) | 形状測定装置およびプログラム | |
JP2008082870A (ja) | 画像処理プログラム及びこれを用いた路面状態計測システム | |
JP4055998B2 (ja) | 距離検出装置、距離検出方法、及び距離検出プログラム | |
US9530240B2 (en) | Method and system for rendering virtual views | |
JP4701848B2 (ja) | 画像マッチング装置、画像マッチング方法および画像マッチング用プログラム | |
CN112184793B (zh) | 深度数据的处理方法、装置及可读存储介质 | |
KR101453143B1 (ko) | 스테레오 매칭 처리 시스템, 스테레오 매칭 처리 방법, 및 기록 매체 | |
JP2012185712A (ja) | 画像照合装置または画像照合方法 | |
Kruger et al. | In-factory calibration of multiocular camera systems | |
JP6429483B2 (ja) | 情報処理装置、撮像装置、情報処理システム、情報処理方法およびプログラム | |
JP2008224323A (ja) | ステレオ写真計測装置、ステレオ写真計測方法及びステレオ写真計測用プログラム | |
CN106303501B (zh) | 基于图像稀疏特征匹配的立体图像重构方法及装置 | |
JP5891751B2 (ja) | 画像間差分装置および画像間差分方法 | |
JPH07306037A (ja) | 立体物領域検出装置及び立体物領域迄の距離測定装置及びそれらの検出、測定方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180014515.1 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13132976 Country of ref document: US |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11756326 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012505712 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 11756326 Country of ref document: EP Kind code of ref document: A1 |