WO2012066774A1 - Imaging Device and Distance Measurement Method - Google Patents
- Publication number
- WO2012066774A1 (PCT/JP2011/006373; priority application JP2011006373W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- imaging
- images
- unit
- distance
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/32—Measuring distances in line of sight; Optical rangefinders by focusing the object, e.g. on a ground glass screen
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/46—Indirect determination of position data
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/36—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
- G02B7/38—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals measured at different points on the optical axis, e.g. focussing on two or more planes and comparing image data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/571—Depth or shape recovery from multiple images from focus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
- H04N13/236—Image signal generators using stereoscopic image cameras using a single 2D image sensor using varifocal lenses or mirrors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/271—Image signal generators wherein the generated image signals comprise depth maps or disparity maps
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/958—Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging
- H04N23/959—Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B26/00—Optical devices or arrangements for the control of light using movable or deformable optical elements
- G02B26/08—Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
- G02B26/0816—Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light by means of one or more reflecting elements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/10—Beam splitting or combining systems
- G02B27/16—Beam splitting or combining systems used as aids for focusing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10141—Special mode during image acquisition
- G06T2207/10148—Varying focus
Definitions
- the present invention relates to an imaging apparatus and a distance measurement method, and more particularly to an imaging apparatus and a distance measurement method for measuring the distance of a subject using a plurality of captured images.
- DFD: Depth from Defocus
- DFD is a method for measuring distance using image blur.
- there is a problem that it is very difficult to determine, from the photographed image alone, whether the blur present in the image was caused by lens defocus, or whether the original image (the state with no lens blur) inherently contained a blurred texture.
- by comparing the ratio of the spatial frequency spectra between a plurality of captured images with the ratio of the blur spatial frequency spectra corresponding to the depth of the scene, measurement that does not depend on the spectral components of the original image becomes possible.
- the present invention has been made in view of the above-described problems, and an object of the present invention is to provide an imaging apparatus and a distance measuring method capable of realizing stable distance measurement from a small number of photographed images.
- an imaging apparatus according to one aspect of the present invention includes: an imaging unit that generates an image by imaging a subject; a focus range control unit that causes the imaging unit to capture n images (n being an integer of 2 or more), each having a mutually different focus range, while changing the focus position of the imaging unit; and a reference image generation unit that generates, using the n images, a reference image serving as a reference for blur.
- the imaging apparatus can obtain an image with a wider focus range than usual, without stopping down the aperture, by capturing each image while changing the focus position. Accordingly, the imaging device according to one embodiment of the present invention can obtain a reference image from a small number of images. In addition, since the focus ranges of the images are independent of each other, an image having a substantially uniform blur with respect to the subject distance can be generated using the plurality of images. Therefore, the imaging device according to one embodiment of the present invention can obtain a highly accurate reference image by a simple method, and can thus realize stable distance measurement from a small number of captured images.
- the imaging unit includes an imaging device and a lens that collects light on the imaging device, and the focusing range control unit may change the in-focus position by changing the distance between the imaging device and the lens at a constant speed.
- the exposure time of the n images may be the same.
- the noise included in the n images with different focus ranges can be made comparable, so that the accuracy of distance calculation can be improved.
- the focusing range control unit may change the distance between the imaging element and the lens at a constant speed throughout the period from when the imaging unit starts capturing the n images to when it finishes.
- the imaging unit may include a lens, n imaging elements arranged so that their optical path lengths from the lens differ, and a beam splitter that divides the light from the lens among the n imaging elements.
- the focusing range control unit may then cause the n images to be captured simultaneously on the n imaging elements while changing the focusing positions of the n imaging elements together in the same period.
- n images having different in-focus ranges can be taken simultaneously, so that the time required for the entire process can be shortened.
- the depth of field of each image is continuously expanded, a wide range of distance measurement can be performed with a small number of images.
- the imaging unit may include a lens, n imaging elements, and a selection unit that selectively causes light from the lens to enter any one of the n imaging elements; the focusing range control unit may then sequentially select the n imaging elements and use the selection unit to direct light onto the selected element, so that each of the n images is captured by one of the n imaging elements.
- n images having different in-focus ranges can be captured within a single focus sweep, so that the time required for the entire process can be shortened.
- in addition, since each image has a discontinuous focus range, the blur takes a shape that facilitates distance measurement by the DFD algorithm.
- the reference image generation unit may generate an average image of the n images and generate the reference image using the average image.
- the reference image generation unit may generate the reference image by performing a deconvolution operation on the average image using one type of point spread function.
- the present invention can be realized not only as such an imaging apparatus, but also as a distance measurement method having the characteristic means of the imaging apparatus as steps, as a control method of the imaging apparatus, or as a program that causes a computer to execute those characteristic steps. Needless to say, such a program can be distributed via a non-transitory computer-readable recording medium such as a CD-ROM, or via a transmission medium such as the Internet.
- the present invention can be realized as a semiconductor integrated circuit (LSI) that realizes part or all of the functions of such an imaging apparatus.
- LSI: semiconductor integrated circuit (Large Scale Integration)
- the present invention can provide an imaging apparatus and a distance measuring method capable of realizing stable distance measurement from a small number of captured images.
- FIG. 1 is a block diagram showing a configuration of an imaging apparatus according to an embodiment of the present invention.
- FIG. 2 is a diagram schematically showing the state of light collection when the in-focus position according to the embodiment of the present invention is changed.
- FIG. 3 is a flowchart showing a flow of operations of the imaging apparatus according to the embodiment of the present invention.
- FIG. 4 is a diagram showing the change of the in-focus position according to the embodiment of the present invention.
- FIG. 5 is a diagram showing the blur amounts of the first image and the second image according to the embodiment of the present invention.
- FIG. 6A is a diagram showing a texture of a scene according to the embodiment of the present invention.
- FIG. 6B is a diagram showing the subject distance of the scene according to the embodiment of the present invention.
- FIG. 7A is a diagram illustrating the result of distance measurement performed from two images whose focusing ranges are contiguous, according to the embodiment of the present invention.
- FIG. 7B is a diagram illustrating the result of distance measurement performed from two images whose in-focus ranges are separated by an interval, according to the embodiment of the present invention.
- FIG. 8A is a table showing the relationship between the interval of the focusing range and the RMS of the distance measurement result according to the embodiment of the present invention.
- FIG. 8B is a graph showing the relationship between the interval of the focus range and the RMS of the distance measurement result according to the embodiment of the present invention.
- FIG. 9A is a diagram illustrating a configuration of an imaging unit according to Modification 1 of the embodiment of the present invention.
- FIG. 9B is a diagram showing a change in the in-focus position according to the first modification of the embodiment of the present invention.
- FIG. 10 is a diagram illustrating a configuration of an imaging unit according to Modification 2 of the embodiment of the present invention.
- FIG. 11A is a diagram illustrating an operation of the imaging unit according to the second modification of the embodiment of the present invention.
- FIG. 11B is a diagram illustrating an operation of the imaging unit according to Modification 2 of the embodiment of the present invention.
- FIG. 11C is a diagram illustrating an operation of the imaging unit according to Modification 2 of the embodiment of the present invention.
- FIG. 12 is a diagram showing a change in the focus position according to the second modification of the embodiment of the present invention.
- the imaging apparatus captures a plurality of images having independent focusing ranges. Further, an interval is provided between a plurality of focus ranges of a plurality of images. Thereby, the imaging apparatus can generate an image having a substantially uniform blur with respect to the subject distance using the plurality of images. As described above, the imaging apparatus according to the embodiment of the present invention can realize stable distance measurement from a small number of captured images.
- FIG. 1 is a diagram illustrating a configuration of an imaging apparatus 10 according to an embodiment of the present invention.
- the imaging device 10 illustrated in FIG. 1 captures an image and uses the image to measure the distance between the imaging device 10 and a subject included in the image.
- the imaging device 10 includes an imaging unit 11, a focusing range control unit 12, a reference image generation unit 13, and a distance measurement unit 14.
- the imaging unit 11 includes a lens unit in which a lens 21 that collects light rays is incorporated, and an imaging element 22 such as a CCD or a CMOS.
- the imaging unit 11 has a function of generating an image by capturing an image of a subject.
- the focus range control unit 12 has a function of controlling the lens unit included in the imaging unit 11 and controlling the focus position and the depth of field. Specifically, the focus range control unit 12 changes the focus position by changing the distance between the lens 21 and the image sensor 22. More specifically, the focusing range control unit 12 changes the distance between the lens 21 and the image sensor 22 by moving one or both of the lens 21 and the image sensor 22.
- the focusing range control unit 12 operates an autofocus mechanism incorporated in the lens unit with a specific pattern or switches a specific optical element.
- the lens unit may include a plurality of lenses. In this case, the focusing range control unit 12 may move one or more of the plurality of lenses.
- the distance between the lens and the image sensor here means, for example, the distance between the image sensor and the lens being moved, or between the image sensor and the principal point of the lens group when a plurality of lenses is used.
- the focus range control unit 12 changes the focus position of the imaging unit 11 and causes the imaging unit 11 to capture a plurality of images having different focus ranges and depths of field (a first image 31a and a second image 31b).
- the focusing ranges of the plurality of images are independent from each other, and a non-focusing interval is provided between the focusing ranges.
- the reference image generation unit 13 uses the first image 31a and the second image 31b, which have different focus positions and depths of field and are generated by the operation of the focus range control unit 12, to generate a reference image 32 that serves as a reference for blur.
- the reference image 32 is an image in which a state where there is no blur due to the lens 21 is estimated.
- the distance measurement unit 14 performs distance measurement based on the DFD technique using the first image 31a and the second image 31b and the reference image 32. That is, the distance measurement unit 14 measures the distance to the subject from the difference in the degree of blur between the first image 31a and the second image 31b and the reference image 32.
- the hyperfocal distance is the distance at which, when the in-focus position (focus) is set to that distance, everything from that distance to infinity is judged to be in focus.
- the focal length of the lens is f
- the F number of the lens is F
- the size of the allowable circle of confusion representing the smallest detectable blur is c
- the depth of field when the in-focus position is set to the distance s is expressed by the following (Expression 2) and (Expression 3), where DN is the depth of field in front of s and Df is the depth of field behind s.
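(Expression 1) through (Expression 3) were drawn as images in the original publication and do not survive in this extracted text. The standard thin-lens forms consistent with the definitions above (a reconstruction, not a verbatim copy of the patent's expressions; depending on convention, D_N and D_f may denote the near and far limits of the in-focus range rather than its extents) are:

```latex
H = \frac{f^{2}}{F\,c} + f
\qquad
D_N = \frac{s\,(H - f)}{H + s - 2f}
\qquad
D_f = \frac{s\,(H - f)}{H - s}
```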
- EDOF: extended depth of field
- the simplest EDOF method is a method of taking a plurality of images while gradually shifting the in-focus position, and extracting and synthesizing in-focus portions from these images. This technique is also used in Patent Document 2.
- Non-Patent Document 1 discloses a technique for realizing the same effect as combining a large number of images by changing the focus position during exposure.
- FIG. 2 is a diagram schematically showing the state of light collection when the in-focus position is changed.
- the position (image plane position) of the image sensor 22 when focusing on the subject position a1 is b1
- the position of the image sensor 22 when focusing on the subject position a2 is b2.
- the position of the image sensor 22 in the case of focusing on an arbitrary subject position between a1 and a2 is always between b1 and b2.
- as the image sensor 22 moves from b1 to b2 during exposure, the image gradually blurs from the state of being focused on a1, and this blur is integrated and superimposed in the captured image.
- conversely, with respect to a2, the image starts out heavily blurred and gradually comes into focus, and this blur is likewise superimposed in the image.
- f is the focal length of the optical system
- a is the diameter of the aperture of the optical system
- u is the subject distance
- Δv is the displacement of the image plane position from v
- R is the distance from the blur center
- g is a constant.
- the image plane position is the position of the image sensor with respect to the lens. In other words, the image plane position corresponds to the distance between the lens and the image sensor.
- (Equation 6)
- Non-Patent Document 1 shows that the PSF obtained from (Equation 7) has an almost invariant blur shape regardless of subject distance, for subjects whose image plane positions lie between V(0) and V(T). That is, by changing the start point v + V(0) and the end point v + V(T) of the image plane sweep, the range of subject distances over which the blur is constant can be changed.
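This near-invariance can be checked numerically. The sketch below is an illustration, not the patent's model: it approximates the instantaneous defocus blur at each image plane position v as a 1-D Gaussian whose width grows linearly with |v - b| (b being the in-focus image plane position of the subject, and k an assumed proportionality constant), then integrates the blur over a constant-speed sweep.

```python
import numpy as np

def gaussian(x, sigma):
    """Normalized 1-D Gaussian kernel sampled at positions x."""
    g = np.exp(-x ** 2 / (2 * sigma ** 2))
    return g / g.sum()

def swept_psf(b, v_start=0.0, v_end=1.0, steps=200, k=6.0):
    """Integrated PSF of a constant-speed focal sweep: average the
    instantaneous defocus blur (Gaussian of width 0.05 + k*|v - b|)
    over image plane positions v in [v_start, v_end]."""
    x = np.arange(-40, 41, dtype=float)
    vs = np.linspace(v_start, v_end, steps)
    return sum(gaussian(x, 0.05 + k * abs(v - b)) for v in vs) / steps

# Subjects focused at different image plane positions within the sweep.
psf_a = swept_psf(0.3)
psf_b = swept_psf(0.7)  # mirror position: identical distance profile
psf_c = swept_psf(0.5)
```

By symmetry, positions equidistant from the sweep endpoints yield identical integrated PSFs, and intermediate positions yield closely similar ones; this is the property that makes a single deconvolution PSF usable for the whole swept range.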
- h represents the PSF at the position (x, y)
- d (x, y) represents the subject distance at the position (x, y).
- * in the formula represents a convolution operation.
- since the PSF differs depending on the subject distance, when subjects are present at a plurality of different distances, the captured image is one in which a different PSF has been convolved at each image position.
- the PSFs corresponding to the subject distances d1, d2, ..., dn are denoted h(x, y, d1), h(x, y, d2), ..., h(x, y, dn).
- if the subject at position (x, y) is at the distance d1, the captured image I(x, y) equals the image obtained by convolving R(x, y) with h(x, y, d1).
- d (x, y) is obtained by the following (formula 9).
- the image is divided into blocks, the sum of errors in the block is obtained, and the distance that minimizes the error is set as the distance of the entire block.
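The block-wise minimization described above can be sketched as follows. This is hypothetical illustration code, not the patent's implementation: circular Gaussian blurs of increasing width stand in for the depth-dependent PSFs h(x, y, d), and the block size and candidate set are arbitrary.

```python
import numpy as np

def gblur(img, sigma):
    """Circular Gaussian blur via the FFT (stand-in for h(x, y, d))."""
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    Hf = np.exp(-2 * np.pi ** 2 * sigma ** 2 * (fx ** 2 + fy ** 2))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * Hf))

def estimate_depth_blocks(captured, reference, sigmas, block=16):
    """Block-wise DFD in the spirit of (Equation 9): blur the reference
    image with each candidate PSF, sum the squared error against the
    captured image inside each block, and pick the candidate index
    that minimizes the block error."""
    h, w = captured.shape
    errs = np.stack([(captured - gblur(reference, s)) ** 2 for s in sigmas])
    bh, bw = h // block, w // block
    errs = errs[:, :bh * block, :bw * block]
    errs = errs.reshape(len(sigmas), bh, block, bw, block).sum(axis=(2, 4))
    return np.argmin(errs, axis=0)  # one depth index per block

# Synthetic check: a scene blurred with sigma = 2 should be assigned
# the index of the sigma = 2 candidate in every block.
rng = np.random.default_rng(0)
reference = rng.random((64, 64))
captured = gblur(reference, 2.0)
depth_idx = estimate_depth_blocks(captured, reference, [0.5, 1.0, 2.0, 4.0])
```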
- the imaging unit 11 captures the first image 31a and the second image 31b having different focus ranges. Specifically, during the exposure of each of the first image 31a and the second image 31b, the focusing range control unit 12 moves the image plane position at a constant speed as shown in FIG. 4. At this time, the focusing range control unit 12 first causes the imaging unit 11 to capture the first image 31a while moving the image plane position from v1 to v3 at a constant speed (S101). Next, the focus range control unit 12 moves the focus position by a predetermined distance (S102). Specifically, the focusing range control unit 12 moves the image plane position from v3 to v4 at a constant speed during the non-exposure period.
- the focusing range control unit 12 causes the imaging unit 11 to capture the second image 31b while moving the image plane position from v4 to v2 at a constant speed (S103).
- the first image 31a photographed in this way has uniform blur within its focusing range (image plane positions v1 to v3), and at positions outside v1 to v3 it has a blur corresponding to the distance from that focusing range.
- likewise, the second image 31b has uniform blur within its focusing range (image plane positions v4 to v2), and at positions outside v4 to v2 it has a blur corresponding to the distance from its focusing range.
- the blur that is uniform with respect to the subject distance is used for generating the reference image 32, and the blur corresponding to the subject distance is used for distance measurement by DFD.
- although v3 and v4 can be set to arbitrary positions between v1 and v2, in order to make the exposure times of the first image 31a and the second image 31b equal, the distance from v1 to v3 and the distance from v4 to v2 are preferably set to be equal.
- the noise included in the first image and the second image can be made comparable, so that the accuracy of distance calculation can be improved.
- the focus range control unit 12 moves the focus position at a constant speed during the period including the exposure period of the first image 31a, the non-exposure period, and the exposure period of the second image 31b.
- the focusing range control unit 12 changes the image plane position at a constant speed during a period from when the imaging unit 11 starts imaging the first image 31a and the second image 31b to when it ends. Thereby, control which changes an in-focus position can be performed easily.
- the moving speed of the in-focus position during the exposure period of the first image 31a may be different from the moving speed of the in-focus position during the exposure period of the second image 31b.
- the reference image generation unit 13 generates a reference image 32 using the first image 31a and the second image 31b (S104).
- the reference image 32 is obtained by deconvolution of the PSF with respect to the captured image.
- the subject distance d(x, y) needs to be known in order to correctly obtain the reference image 32; however, since the first image 31a and the second image 31b each have an extended depth of field, the PSF is constant over a range of subject distances.
- the reference image generation unit 13 generates an average image by taking the average of the first image 31a and the second image 31b.
- the average image has substantially the same depth of field over the range from v1 to v2, except for the section from v3 to v4. That is, the average image is almost equivalent to an image having uniform blur over the entire range from v1 to v2. Therefore, the reference image generation unit 13 generates a reference image 32 focused over the entire range from v1 to v2 by performing a deconvolution operation on the average image using one type of PSF corresponding to that uniform blur.
- a known method such as a Wiener filter can be used for the deconvolution algorithm.
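A minimal numpy sketch of such a Wiener deconvolution (an illustration assuming circular convolution and a known, shift-invariant PSF; the constant k plays the role of the regularization term):

```python
import numpy as np

def wiener_deconvolve(blurred, psf, k=1e-3):
    """Estimate a sharp (reference) image from a uniformly blurred one
    with an FFT-domain Wiener filter: R = conj(H)*G / (|H|^2 + k)."""
    pad = np.zeros(blurred.shape)
    ph, pw = psf.shape
    pad[:ph, :pw] = psf
    # Shift the PSF center to the origin so the filter adds no translation.
    pad = np.roll(pad, (-(ph // 2), -(pw // 2)), axis=(0, 1))
    H = np.fft.fft2(pad)
    G = np.fft.fft2(blurred)
    R = np.conj(H) * G / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft2(R))

# Demo: circularly blur a smooth test image with a Gaussian PSF, then recover it.
n = 64
yy, xx = np.mgrid[0:n, 0:n]
img = np.sin(2 * np.pi * 3 * xx / n) * np.cos(2 * np.pi * 2 * yy / n) + 0.5
g = np.exp(-(np.arange(7) - 3) ** 2 / 2.0)
psf = np.outer(g, g) / np.outer(g, g).sum()
pad = np.zeros_like(img)
pad[:7, :7] = psf
pad = np.roll(pad, (-3, -3), axis=(0, 1))
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(pad)))
rec = wiener_deconvolve(blurred, psf, k=1e-8)
```

With a small k and a PSF whose spectrum has no zeros near the image's frequencies, the recovery is essentially exact; larger k trades sharpness for noise robustness.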
- the distance measurement unit 14 calculates the distance map d(x, y) according to (Equation 9), using the captured first image 31a and second image 31b and the reference image 32 generated in step S104 (S105). Specifically, the distance measurement unit 14 calculates a distance map from the first image 31a and a distance map from the second image 31b according to (Equation 9). Next, the distance measuring unit 14 integrates these into one distance map (S106). For this integration, a simple average of the distance maps calculated from each image can be used.
- the distance measurement may be performed by performing a Fourier transform on (Equation 8) and (Equation 9) and comparing them in the frequency domain.
- the convolution operation in (Equation 8) and (Equation 9) is converted into multiplication, so distance measurement can be performed at higher speed.
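The speedup comes from the convolution theorem: circular convolution in the spatial domain equals element-wise multiplication of spectra in the frequency domain. A quick 1-D check (illustrative only):

```python
import numpy as np

def circ_conv_direct(f, h):
    """Direct circular convolution, the '*' in (Equation 8): O(n^2)."""
    n = len(f)
    return np.array([sum(f[m] * h[(k - m) % n] for m in range(n))
                     for k in range(n)])

def circ_conv_fft(f, h):
    """The same operation via FFTs: O(n log n)."""
    return np.real(np.fft.ifft(np.fft.fft(f) * np.fft.fft(h)))

rng = np.random.default_rng(1)
f, h = rng.random(64), rng.random(64)
```

Both routines produce the same result, so the error terms of (Equation 8) and (Equation 9) can be evaluated by spectral multiplication instead of repeated convolution.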
- the imaging apparatus 10 can generate the reference image 32 using only the first image 31a and the second image 31b.
- the imaging device 10 can perform distance measurement with high accuracy with a small number of images. Further, since the change of the focus position during exposure can be realized by diverting a standard autofocus mechanism, a special mechanism is not required.
- the reference image is also calculated from the average image of all the captured images.
- an image generated by performing a deconvolution operation using the PSF on the average image is used as the reference image 32.
- since the image has a substantially uniform blur over the entire range from v1 to v2, it can be used as the reference image 32.
- the average image or the added image may be used as the reference image 32.
- the lens performance assumed in the simulation is a focal length of 9 mm and an F number of 1.4.
- a 20-digit number represents whether or not each of the 20 steps of the subject distance is within the in-focus range. For example, [1, 0, 1, 1, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] indicates that, counting from the far side, the 1st, 3rd, 4th, 5th, and 10th steps are in focus.
- FIG. 7A shows the result of distance measurement by DFD from two images whose focusing ranges are contiguous (no interval is provided). Specifically, these two images have the focusing ranges [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] (1st to 10th steps) and [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1] (11th to 20th steps). At this time, the subject distance is obtained in the form shown by the following (Expression 10) and (Expression 11).
- F1 and F2 represent frequency components of the first and second images
- H1 and H2 represent PSF frequency components corresponding to the focusing ranges of the first and second images
- H1* and H2* represent the complex conjugates of H1 and H2
- ε represents a minute value for preventing division by zero
- F^-1 represents the inverse Fourier transform
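(Expression 10) and (Expression 11) themselves were images in the original publication and do not survive in this text. Given the symbol definitions above, a standard regularized form consistent with them (an assumption, not the patent's verbatim expression) estimates the reference spectrum as:

```latex
\hat{R} = \mathcal{F}^{-1}\!\left[\frac{H_1^{*}\,F_1 + H_2^{*}\,F_2}{\lvert H_1\rvert^{2} + \lvert H_2\rvert^{2} + \epsilon}\right]
```

with the subject distance then selected, as in (Equation 9), as the candidate whose PSF pair (H1, H2) best explains the observed spectra F1 and F2.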
- FIG. 7B shows the result of distance measurement performed in the same manner from two images with an interval in the focusing range.
- these two images have the focusing ranges [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] (1st to 10th steps) and [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1] (12th to 20th steps). That is, the 11th step lies outside the focusing range of either image.
- in FIG. 7A, the measurement result deviates considerably from the correct answer data.
- in FIG. 7B, the measurement result is improved compared to FIG. 7A.
- FIGS. 8A and 8B are diagrams showing the difference between the measurement result and the correct answer data with respect to the interval of the focus range.
- the difference between the measurement result and the correct answer data is evaluated as the RMS (Root Mean Square) error. The closer the RMS value is to 0, the higher the measurement accuracy.
- FIG. 8A is a table showing the RMS of the distance measurement result with respect to the interval of the focus range.
- FIG. 8B is a diagram showing RMS as a bar graph.
- the combination with the smallest RMS was the case where the three steps from the 10th to the 12th were out of focus. Further, the measurement accuracy is higher when an interval is provided in the focusing range than when it is not. However, when the interval is too wide, the measurement accuracy decreases. That is, the interval is preferably 4 steps or less, and more preferably 2 to 4 steps.
- one stage corresponds to about 50 ⁇ m of the image plane position.
- the image plane position corresponding to the interval is preferably 200 ⁇ m or less (corresponding to 4 steps), more preferably 100 ⁇ m or more (corresponding to 2 steps) and 200 ⁇ m or less (corresponding to 4 steps).
- this interval may not be optimal in all cases.
- FIG. 9A is a diagram illustrating a configuration of the imaging unit 11 according to the first modification.
- FIG. 9B is a diagram illustrating a change in the focus position in the first modification.
- the imaging unit 11 includes a lens 21, two imaging elements 22a and 22b, a beam splitter 23 for splitting a light beam, and a mirror 24 for bending an optical path.
- the two image sensors 22a and 22b are arranged so that the optical path lengths from the lens 21 are different.
- the optical path length to the image sensor 22b is longer than that to the image sensor 22a by Δv.
- the beam splitter 23 divides the light from the lens 21 into two imaging elements 22a and 22b.
- the focusing range control unit 12 changes the focusing positions of the two imaging elements 22a and 22b simultaneously in the same period, while the first image 31a is captured by the imaging element 22a and the second image 31b by the imaging element 22b.
- as shown in FIG. 9B, the image sensor 22a and the image sensor 22b therefore obtain images whose in-focus positions are always shifted from each other by Δv.
- by setting Δv equal to the distance from v1 to v4, two images that are not focused in the range from v3 to v4 can be captured simultaneously.
- a prism may be used instead of a mirror as means for bending the optical path.
- a configuration without a mirror or prism, such as the configuration shown in FIG. 10 described later, may also be used.
- FIG. 10 is a diagram illustrating a configuration of the imaging unit 11 according to the second modification.
- FIGS. 11A to 11C are diagrams illustrating the operation of the imaging unit 11.
- FIG. 12 is a diagram illustrating a change in the in-focus position in the second modification.
- the imaging unit 11 shown in FIG. 10 includes a lens 21, two imaging elements 22a and 22b, a diaphragm 25, and a movable mirror 26.
- the movable mirror 26 is disposed in the middle of the optical path between the lens 21 and the two imaging elements 22a and 22b, and is an optical axis changing unit that changes the optical axis direction of the light from the lens 21.
- the movable mirror 26 functions as a selection unit that selectively causes the light from the lens 21 to enter one of the two imaging elements 22a and 22b.
- optical path lengths from the lens 21 to the two image sensors 22a and 22b are equal.
- the movable mirror 26 is, for example, a galvano mirror or a MEMS mirror.
- the movable mirror 26 has a function of guiding a light beam to one of the two image pickup devices 22a and 22b as shown in FIGS. 11A and 11C.
- the diaphragm 25 blocks the light beam during the operation in which the movable mirror 26 switches the image sensor to which the light beam reaches.
- The focusing range control unit 12 sequentially selects the two imaging elements 22a and 22b and, using the movable mirror 26, selectively directs light into the selected element, thereby causing each of the two imaging elements to capture one of the two images.
- The focusing position is changed at a constant speed during exposure while the movable mirror 26 distributes the light beam between the imaging elements 22a and 22b.
- The light beam reaches the imaging element 22a while the in-focus position moves from v1 to v3, and reaches the imaging element 22b while it moves from v4 to v2. Therefore, as shown in FIG. 12, the first image 31a and the second image 31b cover discontinuous focus ranges.
- Because the data of the first image 31a is read out before the second image 31b is captured, a readout period is required; however, the non-exposure period can be shortened.
- The optical path in Modification 2 is not limited to the configuration shown in FIG. 10; any configuration may be used as long as the optical path lengths to the plurality of imaging elements are equal.
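The time-division sweep of Modification 2 can be sketched numerically. The values for v1, v3, v4 and v2 below are made up; only their ordering v1 < v3 < v4 < v2 comes from the text.

```python
# Sketch of the time-division focus sweep in Modification 2 (hypothetical
# in-focus positions; the patent only fixes the ordering v1 < v3 < v4 < v2).
v1, v3, v4, v2 = 10.0, 14.0, 16.0, 20.0

def sensor_for_focus_position(v):
    """Return which imaging element receives the beam at focus position v.

    The movable mirror 26 directs light to element 22a while the in-focus
    position moves from v1 to v3, and to element 22b from v4 to v2.  In the
    interval (v3, v4) the diaphragm 25 blocks the beam while the mirror is
    switching, so neither element is exposed.
    """
    if v1 <= v <= v3:
        return "22a"
    if v4 <= v <= v2:
        return "22b"
    return None  # mirror in transit; beam blocked by the diaphragm

# Sample the constant-speed sweep and collect each element's focus range.
n_steps = 1000
ranges = {"22a": [], "22b": []}
for i in range(n_steps + 1):
    v = v1 + (v2 - v1) * i / n_steps
    s = sensor_for_focus_position(v)
    if s is not None:
        ranges[s].append(v)

print(min(ranges["22a"]), max(ranges["22a"]))  # ~ (v1, v3)
print(min(ranges["22b"]), max(ranges["22b"]))  # ~ (v4, v2)
```

The two collected ranges are discontinuous, matching the first image 31a and second image 31b of FIG. 12.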
- Part of the functions of the imaging apparatus may be implemented as a computer system including a microprocessor, a ROM, a RAM, a hard disk unit, a display unit, a keyboard, a mouse, and the like.
- A computer program is stored in the RAM or the hard disk unit.
- The imaging apparatus achieves its functions through the microprocessor operating according to the computer program.
- The computer program is composed of a combination of instruction codes indicating instructions to the computer in order to achieve a predetermined function.
- Part or all of the constituent elements of the above-described imaging apparatus may be configured as a single system LSI (Large Scale Integration).
- A system LSI is a highly multifunctional LSI manufactured by integrating a plurality of components on a single chip; specifically, it is a computer system including a microprocessor, a ROM, a RAM, and the like.
- A computer program is stored in the RAM.
- The system LSI achieves its functions through the microprocessor operating according to the computer program.
- Part or all of the constituent elements of the imaging apparatus may be configured as an IC card or a single module that can be attached to and detached from the imaging apparatus.
- The IC card or the module is a computer system including a microprocessor, a ROM, a RAM, and the like.
- The IC card or the module may include the highly multifunctional LSI described above.
- The IC card or the module achieves its functions through the microprocessor operating according to the computer program. The IC card or the module may be tamper-resistant.
- The present invention may be the methods described above. It may also be a computer program that realizes these methods on a computer, or a digital signal composed of the computer program.
- The present invention may also be the computer program or the digital signal recorded on a computer-readable recording medium such as a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a DVD-ROM, a DVD-RAM, a BD (Blu-ray Disc), or a semiconductor memory.
- The computer program or the digital signal may be transmitted via an electric communication line, a wireless or wired communication line, a network represented by the Internet, data broadcasting, or the like.
- The present invention may be a computer system including a microprocessor and a memory, in which the memory stores the computer program and the microprocessor operates according to the computer program.
- The program or the digital signal may be recorded on the recording medium and transferred, or transferred via the network or the like, and executed by another independent computer system.
- The division of functional blocks in the block diagram is merely an example; a plurality of functional blocks may be realized as a single functional block, a single functional block may be divided into several, or some functions may be moved to other functional blocks.
- The functions of a plurality of functional blocks having similar functions may be processed in parallel or in a time-division manner by single hardware or software.
- The present invention is applicable to imaging apparatuses having a lens system, and in particular to monocular imaging apparatuses.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computing Systems (AREA)
- Optics & Photonics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Studio Devices (AREA)
- Measurement Of Optical Distance (AREA)
Abstract
Description
The advantage of providing a gap between the focusing ranges is described below. Let D be the correct distance value at position (x, y) and D' an incorrect distance value. From (Equation 9), the more the result of convolving the reference image R with h(x, y, D) differs from the result of convolving it with h(x, y, D'), the easier it is to judge whether a candidate distance is correct. This is verified by simulation below.
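A one-dimensional toy version of the convolution check described above (comparing R convolved with h(·, D) against R convolved with h(·, D')) can be sketched as follows. The box-blur PSF model and all numbers are illustrative assumptions, not the patent's (Equation 9) or its 2-D simulation.

```python
# Convolve the reference image R with the PSFs for a correct depth D and an
# incorrect candidate D', and measure how far apart the results are.

def box_psf(width):
    """Normalized box PSF; its width stands in for the amount of defocus."""
    return [1.0 / width] * width

def convolve(signal, kernel):
    """Full discrete convolution of two 1-D sequences."""
    out = [0.0] * (len(signal) + len(kernel) - 1)
    for i, s in enumerate(signal):
        for j, k in enumerate(kernel):
            out[i + j] += s * k
    return out

def blur_difference(reference, d_correct, d_wrong):
    """Sum of squared differences between R * h(D) and R * h(D')."""
    a = convolve(reference, box_psf(d_correct))
    b = convolve(reference, box_psf(d_wrong))
    n = max(len(a), len(b))
    a += [0.0] * (n - len(a))
    b += [0.0] * (n - len(b))
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Reference signal with a sharp edge (blur differences show up at edges).
R = [0.0] * 20 + [1.0] * 20

# The farther the wrong depth is from the correct one (the more the two PSFs
# differ), the larger the difference, and the easier the decision becomes.
print(blur_difference(R, 3, 4))   # nearby depths: small difference
print(blur_difference(R, 3, 11))  # distant depths: large difference
```

This is the intuition behind the gap between focusing ranges: it keeps the PSFs of competing depth hypotheses well separated.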
FIG. 9A is a diagram illustrating the configuration of the imaging unit 11 according to Modification 1, and FIG. 9B is a diagram illustrating the change in the in-focus position in Modification 1.
FIG. 10 is a diagram illustrating the configuration of the imaging unit 11 according to Modification 2. FIGS. 11A and 11B are diagrams illustrating the operation of the imaging unit 11. FIG. 12 is a diagram illustrating the change in the in-focus position in Modification 2.
Although the present invention has been described based on the above embodiment, the present invention is of course not limited to that embodiment. The following cases are also included in the present invention.
11 Imaging unit
12 Focusing range control unit
13 Reference image generation unit
14 Distance measurement unit
21 Lens
22, 22a, 22b Imaging element
23 Beam splitter
24 Mirror
25 Diaphragm
26 Movable mirror
31a First image
31b Second image
32 Reference image
Claims (10)
- An imaging apparatus comprising:
an imaging unit that generates an image by imaging a subject;
a focusing range control unit that, while changing the focusing position of the imaging unit, causes the imaging unit to capture n images (n is an integer of 2 or more) each having a different focusing range;
a reference image generation unit that generates, using the n images, a reference image serving as a blur reference; and
a distance measurement unit that measures the distance to the subject from the difference in degree of blur between the n images and the reference image,
wherein the focusing ranges of the n images are mutually independent, and an out-of-focus gap is provided between the focusing ranges.
- The imaging apparatus according to claim 1, wherein the imaging unit includes an imaging element and a lens that focuses light onto the imaging element, and the focusing range control unit changes the focusing position by changing the distance between the imaging element and the lens at a constant speed.
- The imaging apparatus according to claim 2, wherein the exposure times of the n images are the same.
- The imaging apparatus according to claim 2 or 3, wherein the focusing range control unit changes the distance between the imaging element and the lens at a constant speed throughout the period from when the imaging unit starts capturing the n images until it finishes.
- The imaging apparatus according to claim 1, wherein the imaging unit includes: a lens; n imaging elements arranged so that their optical path lengths from the lens differ from one another; and a beam splitter that splits the light from the lens to each of the n imaging elements, and the focusing range control unit causes the n imaging elements to capture the n images simultaneously while changing the focusing positions of the n imaging elements simultaneously in the same period.
- The imaging apparatus according to claim 1, wherein the imaging unit includes: a lens; n imaging elements; and a selection unit that selectively causes the light from the lens to enter any one of the n imaging elements, and the focusing range control unit sequentially selects the n imaging elements and causes the selection unit to direct light selectively into the selected imaging element, thereby causing each of the n imaging elements to capture one of the n images.
- The imaging apparatus according to any one of claims 1 to 6, wherein the reference image generation unit generates an average image of the n images and generates the reference image using the average image.
- The imaging apparatus according to claim 7, wherein the reference image generation unit generates the reference image by performing a deconvolution operation on the average image with a single type of point spread function.
- A distance measurement method for an imaging apparatus including an imaging unit that generates an image by imaging a subject, the method comprising:
an imaging step of, while changing the focusing position of the imaging unit, causing the imaging unit to capture n images (n is an integer of 2 or more) each having a different focusing range;
a reference image generation step of generating, using the n images, a reference image serving as a blur reference; and
a distance measurement step of measuring the distance to the subject from the difference in degree of blur between the n images and the reference image,
wherein the focusing ranges of the n images are mutually independent, and an out-of-focus gap is provided between the focusing ranges.
- A program for causing a computer to execute the distance measurement method according to claim 9.
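Why a single point spread function suffices for the reference image (claims 7 and 8) can be illustrated with a small sketch: convolution is linear, so the average of the n captured images equals the sharp scene convolved with the *average* of the n PSFs; one deconvolution with that single averaged PSF therefore recovers an approximation of the all-in-focus reference image. The 1-D signals and box PSFs below are illustrative assumptions (the deconvolution step itself is not implemented, only the identity that justifies it), and the toy kernels are aligned at index 0 rather than centered.

```python
def convolve(signal, kernel):
    """Full 1-D discrete convolution."""
    out = [0.0] * (len(signal) + len(kernel) - 1)
    for i, s in enumerate(signal):
        for j, k in enumerate(kernel):
            out[i + j] += s * k
    return out

def average(seqs):
    """Element-wise average of sequences, zero-padded to equal length."""
    n = len(seqs)
    length = max(len(s) for s in seqs)
    padded = [s + [0.0] * (length - len(s)) for s in seqs]
    return [sum(col) / n for col in zip(*padded)]

scene = [0.0, 0.0, 1.0, 3.0, 2.0, 0.0, 0.0]   # sharp scene (toy)
psfs = [[1 / 3] * 3, [1 / 5] * 5]             # one PSF per focus range (n = 2)

images = [convolve(scene, h) for h in psfs]   # the n captured images
avg_image = average(images)                   # claim 7: average image
avg_psf = average(psfs)                       # the single PSF of claim 8

# Linearity: the average image equals the scene convolved with the single
# averaged PSF, so one deconvolution with avg_psf yields the reference image.
check = convolve(scene, avg_psf)
print(max(abs(a - b) for a, b in zip(avg_image, check)))  # ~0
```

Under this toy model the two quantities agree to floating-point precision, which is the property that lets claim 8 use one deconvolution instead of n.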
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012512728A JP5832424B2 (ja) | 2010-11-17 | 2011-11-16 | 撮像装置及び距離計測方法 |
EP11842267.4A EP2642245B1 (en) | 2010-11-17 | 2011-11-16 | Image pickup device and distance measuring method |
CN201180006220.XA CN102713512B (zh) | 2010-11-17 | 2011-11-16 | 摄像装置以及距离测量方法 |
US13/522,057 US8698943B2 (en) | 2010-11-17 | 2011-11-16 | Imaging apparatus and distance measurement method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010257227 | 2010-11-17 | ||
JP2010-257227 | 2010-11-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012066774A1 true WO2012066774A1 (ja) | 2012-05-24 |
Family
ID=46083722
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/006373 WO2012066774A1 (ja) | 2010-11-17 | 2011-11-16 | 撮像装置及び距離計測方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US8698943B2 (ja) |
EP (1) | EP2642245B1 (ja) |
JP (1) | JP5832424B2 (ja) |
CN (1) | CN102713512B (ja) |
WO (1) | WO2012066774A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013171954A1 (ja) * | 2012-05-17 | 2013-11-21 | パナソニック株式会社 | 撮像装置、半導体集積回路および撮像方法 |
WO2014021238A1 (en) | 2012-07-31 | 2014-02-06 | Canon Kabushiki Kaisha | Image pickup apparatus, depth information acquisition method and program |
JP2014044117A (ja) * | 2012-08-27 | 2014-03-13 | Canon Inc | 距離情報取得装置、撮像装置、距離情報取得方法、及び、プログラム |
JP2015034732A (ja) * | 2013-08-08 | 2015-02-19 | キヤノン株式会社 | 距離算出装置、撮像装置および距離算出方法 |
KR20160111570A (ko) * | 2015-03-16 | 2016-09-27 | (주)이더블유비엠 | 다른 선명도를 갖는 두 개의 이미지를 캡쳐하는 단일센서를 이용한 거리 정보 (depth)추출장치에서 다단계 검색에 의한 최대유사도 연산량 감축방법 |
JP6413170B1 (ja) * | 2017-06-22 | 2018-10-31 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | 決定装置、撮像装置、撮像システム、移動体、決定方法、及びプログラム |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5966535B2 (ja) * | 2012-04-05 | 2016-08-10 | ソニー株式会社 | 情報処理装置、プログラム及び情報処理方法 |
JP2014154907A (ja) * | 2013-02-05 | 2014-08-25 | Canon Inc | 立体撮像装置 |
US9077891B1 (en) * | 2013-03-06 | 2015-07-07 | Amazon Technologies, Inc. | Depth determination using camera focus |
JP6236908B2 (ja) * | 2013-06-21 | 2017-11-29 | 株式会社リコー | 撮像装置、撮像システムおよび撮像方法 |
JP2015036632A (ja) * | 2013-08-12 | 2015-02-23 | キヤノン株式会社 | 距離計測装置、撮像装置、距離計測方法 |
JP6173156B2 (ja) * | 2013-10-02 | 2017-08-02 | キヤノン株式会社 | 画像処理装置、撮像装置及び画像処理方法 |
WO2015053113A1 (ja) * | 2013-10-08 | 2015-04-16 | オリンパス株式会社 | 撮像装置及び電子機器 |
TWI521255B (zh) * | 2013-11-29 | 2016-02-11 | 光寶科技股份有限公司 | 自動對焦方法、其自動對焦裝置和其影像擷取裝置 |
JP5895270B2 (ja) * | 2014-03-28 | 2016-03-30 | パナソニックIpマネジメント株式会社 | 撮像装置 |
CN104759037B (zh) * | 2015-03-25 | 2017-09-22 | 深圳市医诺智能科技发展有限公司 | 放疗剂量对比显示方法及系统 |
DE102015112380A1 (de) * | 2015-07-29 | 2017-02-02 | Connaught Electronics Ltd. | Verfahren zum Bestimmen einer Entfernung von einer Kamera zu einem Objekt, Fahrerassistenzsystem und Kraftfahrzeug |
EP3438699A1 (de) * | 2017-07-31 | 2019-02-06 | Hexagon Technology Center GmbH | Distanzmesser mit spad-anordnung zur berücksichtigung von mehrfachzielen |
JP7123884B2 (ja) | 2019-09-12 | 2022-08-23 | 株式会社東芝 | 撮像装置、方法及びプログラム |
DE102020201097B4 (de) | 2020-01-30 | 2023-02-16 | Carl Zeiss Industrielle Messtechnik Gmbh | Anordnung und Verfahren zur optischen Objektkoordinatenermittlung |
CN112748113B (zh) * | 2020-12-21 | 2022-08-05 | 杭州电子科技大学 | 一种集成激光测量与超声探伤的测头装置及其测量方法 |
DE102022129697A1 (de) | 2022-11-10 | 2024-05-16 | Valeo Schalter Und Sensoren Gmbh | Verfahren zum Bestimmen einer Distanz eines Objektes in einer Fahrzeugumgebung zu einem Fahrzeug auf Basis von Bildern mit unterschiedlichen Schärfen |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0560528A (ja) * | 1991-09-03 | 1993-03-09 | Hitachi Ltd | 立体情報入力装置 |
JPH10508107A (ja) * | 1995-06-07 | 1998-08-04 | ザ トラスティース オブ コロンビア ユニヴァーシティー イン ザ シティー オブ ニューヨーク | 能動型照明及びデフォーカスに起因する画像中の相対的なぼけを用いる物体の3次元形状を決定する装置及び方法 |
JPH11337313A (ja) | 1998-05-25 | 1999-12-10 | Univ Kyoto | 距離計測装置及び方法並びに画像復元装置及び方法 |
JP2001074422A (ja) * | 1999-08-31 | 2001-03-23 | Hitachi Ltd | 立体形状検出装置及びハンダ付検査装置並びにそれらの方法 |
US20070019883A1 (en) | 2005-07-19 | 2007-01-25 | Wong Earl Q | Method for creating a depth map for auto focus using an all-in-focus picture and two-dimensional scale space matching |
JP2007533977A (ja) * | 2004-03-11 | 2007-11-22 | アイコス・ビジョン・システムズ・ナムローゼ・フェンノートシャップ | 波面操作および改良3d測定方法および装置 |
JP2010016743A (ja) * | 2008-07-07 | 2010-01-21 | Olympus Corp | 測距装置、測距方法、測距プログラム又は撮像装置 |
JP2010194296A (ja) * | 2009-01-28 | 2010-09-09 | Panasonic Corp | 口腔内測定装置及び口腔内測定システム |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FI97085C (fi) * | 1995-03-29 | 1996-10-10 | Valtion Teknillinen | Menetelmä ja kuvauslaitteisto etäisyyden määrittämiseen ja sen käyttö |
US6320979B1 (en) * | 1998-10-06 | 2001-11-20 | Canon Kabushiki Kaisha | Depth of field enhancement |
DE102005034597A1 (de) * | 2005-07-25 | 2007-02-08 | Robert Bosch Gmbh | Verfahren und Anordnung zur Erzeugung einer Tiefenkarte |
US7929801B2 (en) * | 2005-08-15 | 2011-04-19 | Sony Corporation | Depth information for auto focus using two pictures and two-dimensional Gaussian scale space theory |
JP4943695B2 (ja) * | 2005-11-21 | 2012-05-30 | 富士フイルム株式会社 | 多焦点カメラの撮影光学系 |
US20070189750A1 (en) * | 2006-02-16 | 2007-08-16 | Sony Corporation | Method of and apparatus for simultaneously capturing and generating multiple blurred images |
US7711201B2 (en) * | 2006-06-22 | 2010-05-04 | Sony Corporation | Method of and apparatus for generating a depth map utilized in autofocusing |
US7720371B2 (en) * | 2007-01-18 | 2010-05-18 | Nikon Corporation | Depth layer extraction and image synthesis from focus varied multiple images |
JP2009110137A (ja) * | 2007-10-29 | 2009-05-21 | Ricoh Co Ltd | 画像処理装置、画像処理方法及び画像処理プログラム |
JP2009188697A (ja) * | 2008-02-06 | 2009-08-20 | Fujifilm Corp | 多焦点カメラ装置、それに用いられる画像処理方法およびプログラム |
US8280194B2 (en) * | 2008-04-29 | 2012-10-02 | Sony Corporation | Reduced hardware implementation for a two-picture depth map algorithm |
US8194995B2 (en) * | 2008-09-30 | 2012-06-05 | Sony Corporation | Fast camera auto-focus |
US8405742B2 (en) * | 2008-12-30 | 2013-03-26 | Massachusetts Institute Of Technology | Processing images having different focus |
US8199248B2 (en) * | 2009-01-30 | 2012-06-12 | Sony Corporation | Two-dimensional polynomial model for depth estimation based on two-picture matching |
US8542313B2 (en) * | 2010-01-27 | 2013-09-24 | Csr Technology Inc. | Depth from defocus calibration |
US8436912B2 (en) * | 2010-04-30 | 2013-05-07 | Intellectual Ventures Fund 83 Llc | Range measurement using multiple coded apertures |
JP2013030895A (ja) * | 2011-07-27 | 2013-02-07 | Sony Corp | 信号処理装置、撮像装置、信号処理方法およびプログラム |
US8705801B2 (en) * | 2010-06-17 | 2014-04-22 | Panasonic Corporation | Distance estimation device, distance estimation method, integrated circuit, and computer program |
CN102472620B (zh) * | 2010-06-17 | 2016-03-02 | 松下电器产业株式会社 | 图像处理装置及图像处理方法 |
KR20120023431A (ko) * | 2010-09-03 | 2012-03-13 | 삼성전자주식회사 | 깊이 조정이 가능한 2차원/3차원 영상 변환 방법 및 그 장치 |
US9098147B2 (en) * | 2011-12-29 | 2015-08-04 | Industrial Technology Research Institute | Ranging apparatus, ranging method, and interactive display system |
2011
- 2011-11-16 WO PCT/JP2011/006373 patent/WO2012066774A1/ja active Application Filing
- 2011-11-16 EP EP11842267.4A patent/EP2642245B1/en active Active
- 2011-11-16 US US13/522,057 patent/US8698943B2/en active Active
- 2011-11-16 CN CN201180006220.XA patent/CN102713512B/zh active Active
- 2011-11-16 JP JP2012512728A patent/JP5832424B2/ja active Active
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0560528A (ja) * | 1991-09-03 | 1993-03-09 | Hitachi Ltd | 立体情報入力装置 |
JPH10508107A (ja) * | 1995-06-07 | 1998-08-04 | ザ トラスティース オブ コロンビア ユニヴァーシティー イン ザ シティー オブ ニューヨーク | 能動型照明及びデフォーカスに起因する画像中の相対的なぼけを用いる物体の3次元形状を決定する装置及び方法 |
JPH11337313A (ja) | 1998-05-25 | 1999-12-10 | Univ Kyoto | 距離計測装置及び方法並びに画像復元装置及び方法 |
JP2001074422A (ja) * | 1999-08-31 | 2001-03-23 | Hitachi Ltd | 立体形状検出装置及びハンダ付検査装置並びにそれらの方法 |
JP2007533977A (ja) * | 2004-03-11 | 2007-11-22 | アイコス・ビジョン・システムズ・ナムローゼ・フェンノートシャップ | 波面操作および改良3d測定方法および装置 |
US20070019883A1 (en) | 2005-07-19 | 2007-01-25 | Wong Earl Q | Method for creating a depth map for auto focus using an all-in-focus picture and two-dimensional scale space matching |
JP2010016743A (ja) * | 2008-07-07 | 2010-01-21 | Olympus Corp | 測距装置、測距方法、測距プログラム又は撮像装置 |
JP2010194296A (ja) * | 2009-01-28 | 2010-09-09 | Panasonic Corp | 口腔内測定装置及び口腔内測定システム |
Non-Patent Citations (2)
Title |
---|
H. NAGAHARA; S. KUTHIRUMMAL; C. ZHOU; S. K. NAYAR: "Flexible Depth of Field Photography", EUROPEAN CONFERENCE ON COMPUTER VISION, October 2008 (2008-10-01)
See also references of EP2642245A4 |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013171954A1 (ja) * | 2012-05-17 | 2013-11-21 | パナソニック株式会社 | 撮像装置、半導体集積回路および撮像方法 |
US8890996B2 (en) | 2012-05-17 | 2014-11-18 | Panasonic Corporation | Imaging device, semiconductor integrated circuit and imaging method |
WO2014021238A1 (en) | 2012-07-31 | 2014-02-06 | Canon Kabushiki Kaisha | Image pickup apparatus, depth information acquisition method and program |
JP2014044408A (ja) * | 2012-07-31 | 2014-03-13 | Canon Inc | 撮像装置、距離情報取得方法およびプログラム |
EP2880399A4 (en) * | 2012-07-31 | 2016-05-11 | Canon Kk | PICTURE RECORDING DEVICE, DEEP INFORMATION DETECTION METHOD AND PROGRAM |
US9762788B2 (en) | 2012-07-31 | 2017-09-12 | Canon Kabushiki Kaisha | Image pickup apparatus, depth information acquisition method and program |
JP2014044117A (ja) * | 2012-08-27 | 2014-03-13 | Canon Inc | 距離情報取得装置、撮像装置、距離情報取得方法、及び、プログラム |
JP2015034732A (ja) * | 2013-08-08 | 2015-02-19 | キヤノン株式会社 | 距離算出装置、撮像装置および距離算出方法 |
KR20160111570A (ko) * | 2015-03-16 | 2016-09-27 | (주)이더블유비엠 | 다른 선명도를 갖는 두 개의 이미지를 캡쳐하는 단일센서를 이용한 거리 정보 (depth)추출장치에서 다단계 검색에 의한 최대유사도 연산량 감축방법 |
KR101711927B1 (ko) | 2015-03-16 | 2017-03-06 | (주)이더블유비엠 | 다른 선명도를 갖는 두 개의 이미지를 캡쳐하는 단일센서를 이용한 거리 정보 (depth)추출장치에서 다단계 검색에 의한 최대유사도 연산량 감축방법 |
JP6413170B1 (ja) * | 2017-06-22 | 2018-10-31 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | 決定装置、撮像装置、撮像システム、移動体、決定方法、及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
US20120300114A1 (en) | 2012-11-29 |
US8698943B2 (en) | 2014-04-15 |
CN102713512B (zh) | 2015-06-03 |
EP2642245A4 (en) | 2014-05-28 |
EP2642245A1 (en) | 2013-09-25 |
EP2642245B1 (en) | 2020-11-11 |
JP5832424B2 (ja) | 2015-12-16 |
JPWO2012066774A1 (ja) | 2014-05-12 |
CN102713512A (zh) | 2012-10-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5832424B2 (ja) | 撮像装置及び距離計測方法 | |
JP5869883B2 (ja) | 画像処理装置 | |
JP5824364B2 (ja) | 距離推定装置、距離推定方法、集積回路、コンピュータプログラム | |
JP5173665B2 (ja) | 画像撮影装置およびその距離演算方法と合焦画像取得方法 | |
US10491799B2 (en) | Focus detection apparatus, focus control apparatus, image capturing apparatus, focus detection method, and storage medium | |
JP5868183B2 (ja) | 撮像装置及び撮像方法 | |
CN109255810B (zh) | 图像处理装置及图像处理方法 | |
JP2008026790A (ja) | 撮像装置及びフォーカス制御方法 | |
JP2012005056A (ja) | 画像処理装置、画像処理方法及びプログラム | |
JP2013044844A (ja) | 画像処理装置および画像処理方法 | |
US10999491B2 (en) | Control apparatus, image capturing apparatus, control method, and storage medium | |
JP2012256118A (ja) | 画像復元装置およびその方法 | |
JP4752733B2 (ja) | 撮像装置および撮像方法、並びに撮像装置の設計方法 | |
JP5338112B2 (ja) | 相関演算装置、焦点検出装置および撮像装置 | |
JP5409588B2 (ja) | 焦点調節方法、焦点調節プログラムおよび撮像装置 | |
JP2016066995A (ja) | 像ズレ量算出装置、撮像装置、および像ズレ量算出方法 | |
JP4509576B2 (ja) | 焦点検出装置 | |
JP2008211678A (ja) | 撮像装置およびその方法 | |
JP6590639B2 (ja) | 距離検出装置、撮像装置、および距離検出方法 | |
JP5581177B2 (ja) | 撮像位置調整装置および撮像装置 | |
JP6598550B2 (ja) | 画像処理装置、撮像装置、画像処理方法およびプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180006220.X Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012512728 Country of ref document: JP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11842267 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13522057 Country of ref document: US Ref document number: 2011842267 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |