WO2012070208A1 - Imaging Device, Imaging Method, Program, and Integrated Circuit - Google Patents
Imaging Device, Imaging Method, Program, and Integrated Circuit
- Publication number
- WO2012070208A1 (PCT/JP2011/006420)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- subject
- optical
- distance
- imaging
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
- G01C3/08—Use of electric radiation detectors
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
- G01B11/026—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring distance between sensor and object
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/36—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
- G02B7/365—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals by analysis of the spatial frequency components of the image
Definitions
- the present invention relates to an imaging apparatus that measures the depth of a scene using a plurality of images taken from a single viewpoint.
- "subject distance": the distance from an imaging device to each subject
- active methods calculate the subject distance from, for example, the time until a reflected wave returns or the angle of the reflected wave
- passive methods calculate the subject distance from the captured images themselves
- passive methods, which do not require a device for irradiating infrared rays or the like, are widely used in imaging devices such as cameras
- DFD (Depth from Defocus)
- in DFD, a captured image containing blur is modeled as an all-in-focus (omnifocal) image, representing the state with no blur due to the lens, convolved with a point spread function (PSF), which is a function of the subject distance. Since the point spread function is a function of the subject distance, DFD can obtain the subject distance by detecting the blur in the blurred image. At this point, however, both the all-in-focus image and the subject distance are unknown. Since only one equation relating the blurred image, the all-in-focus image, and the subject distance holds for each blurred image, a new blurred image with a different focus position is captured to obtain an additional equation.
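As a rough numerical sketch of this model (illustrative only: a Gaussian kernel stands in for the real lens PSF, and its width `sigma` stands in for the dependence on subject distance; neither appears in the patent itself):

```python
import numpy as np

def gaussian_psf(sigma, size=15):
    """Isotropic Gaussian kernel as a stand-in for the lens PSF h(d);
    in DFD the width would be determined by the subject distance d."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

def blur(all_in_focus, psf):
    """Blurred image = PSF(d) convolved with the all-in-focus image
    (zero-padded 'same' convolution; the kernel is symmetric, so
    correlation and convolution coincide)."""
    kh, kw = psf.shape
    padded = np.pad(all_in_focus, ((kh // 2,) * 2, (kw // 2,) * 2))
    out = np.zeros_like(all_in_focus, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += psf[i, j] * padded[i:i + all_in_focus.shape[0],
                                      j:j + all_in_focus.shape[1]]
    return out
```

Capturing a second image at a different focus position corresponds to applying a second PSF to the same all-in-focus image, which supplies the additional equation mentioned above.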
- PSF (Point Spread Function)
- DFD is a method of obtaining the subject distance from the point spread function of the blur contained in a blurred image.
- because the shape of the point spread function is similar before and after the image point corresponding to the subject distance, it cannot be determined whether an observed blur arose before or after the image point
- moreover, the point spread function becomes ambiguous due to noise contained in the image, making this discrimination even more difficult
- in Non-Patent Document 1, by using an aperture whose overall shape is not point-symmetric, the ambiguity in the shape of the point spread function before and after the image point corresponding to the subject distance can be eliminated.
- An object of the present invention is to provide an imaging apparatus that eliminates this ambiguity and estimates the subject distance from a small number of captured images.
- an imaging apparatus includes an imaging device that captures an image, an optical system that forms a subject image on the imaging device, and an optical device having a birefringence effect.
- a distance measurement unit measures the distance from the imaging element to the subject using the optical element, the captured image, and the point spread function, whose shape the optical element makes differ before and after the image point corresponding to the subject distance of the subject.
- the shape of the point spread function before and after the image point corresponding to the subject distance of the subject can be made different by the action of the optical element having the effect of birefringence.
- since the birefringent material does not need to block light, unlike a method using a non-point-symmetric aperture, a decrease in the amount of light can be suppressed.
- optical elements having a birefringence effect mainly affect only astigmatism (especially if the birefringent material is a parallel plate and the optical system is telecentric). For this reason, even if the shape of the point spread function before and after the image point corresponding to the subject distance is made to differ, the influence on other aberrations is small, and there is no need to redesign the optical system. That is, the invention can be realized simply by inserting the optical element into an existing device and adding a unit that processes the point spread function.
- the direction of the optical axis of the optical element is not parallel to the optical axis of the optical system.
- the optical element is disposed between the imaging element and the optical system on the optical axis of the optical system, and it is preferable that the plane of the optical element intersecting the optical axis of the optical system be perpendicular to that axis.
- it is preferable that the distance measurement unit measures the distance from the image sensor to the subject using an image captured by the image sensor without the birefringence effect of the optical element and an image captured with the optical element on the optical axis of the optical system.
- the optical element can electrically or magnetically turn on and off the effect of birefringence
- the distance measurement unit measures the distance to the subject using an image captured by the imaging element without the birefringence effect of the optical element and an image captured with the optical element on the optical axis of the optical system.
- it is preferable that the apparatus further includes a reference image generation unit that generates a reference image from an image captured by the imaging element without the birefringence effect of the optical element, and that the distance measurement unit estimates the point spread function using the image captured through the optical element and the reference image, thereby measuring the distance to the subject.
- the reference image generation unit generates an omnifocal image as the reference image from an image picked up by the image pickup device without the effect of the birefringence by the optical element.
- the optical system has image side telecentric optical characteristics.
- a light beam separation unit that separates a light beam into a plurality of optical paths is further provided; a plurality of the imaging elements are provided, each capturing the subject image on one of the optical paths separated by the light beam separation unit.
- the optical element may be disposed on at least one of a plurality of optical paths separated by the light beam separation unit.
- the present invention can be realized not only as such an image pickup apparatus but also as an image pickup method in which operations of characteristic components included in the image pickup apparatus are used as steps. It can also be realized as a program for causing a computer to execute an imaging method. Such a program can also be distributed via a recording medium such as a CD-ROM or a transmission medium such as the Internet. The present invention can also be realized as an integrated circuit that performs processing of each processing unit.
- the subject distance can be obtained stably and with high accuracy by calculating the shape of the point spread function contained in the image from at least two images.
- FIG. 1 is a block diagram showing a configuration of an imaging apparatus according to Embodiment 1 of the present invention.
- FIG. 2 is a diagram showing the state of light rays that pass through the birefringent material.
- FIG. 3 is a diagram showing the arrangement of components of the imaging apparatus according to Embodiment 1 of the present invention.
- FIG. 4 is a diagram showing how the shape of the point spread function changes before and after the image point corresponding to the subject distance of the subject due to the birefringent material.
- FIG. 5 (a-1) is a diagram showing the shape of the point spread function of the extraordinary ray at position (a) in FIG. 4 when the birefringent material is used, (b-1) shows that at position (b) in FIG. 4 when the birefringent material is used, (a-2) shows the point spread function at position (a) in FIG. 4 when no birefringent material is used, and (b-2) shows that at position (b) in FIG. 4 when no birefringent material is used.
- FIG. 6 is a diagram showing point spread functions corresponding to different subject positions when a birefringent material is used.
- FIG. 7 is a diagram showing point spread functions corresponding to different subject positions when no birefringent material is used.
- FIG. 8 is a diagram showing the shape of the cubic phase mask.
- FIG. 9 is a diagram showing a flow of operations of the imaging apparatus according to Embodiment 1 of the present invention.
- FIG. 10 is a block diagram showing a configuration of the imaging apparatus according to Embodiment 2 of the present invention.
- FIG. 11 is a diagram illustrating an arrangement of components of the imaging apparatus according to Embodiment 2 of the present invention.
- FIG. 1 is a block diagram showing a configuration of an imaging apparatus according to Embodiment 1 of the present invention.
- the imaging apparatus 10 includes an optical system 11, a birefringent material 12, an actuator 13, a focusing range control unit 14, an imaging element 15, an image acquisition unit 16, a reference image generation unit 17, and a distance measurement unit 18.
- the optical system 11 forms a subject image on the image sensor 15.
- a birefringent material 12 that is an optical element having a birefringence effect is installed on the optical path between the imaging element 15 and the optical system 11.
- the shape of the point spread function of the extraordinary ray in the light ray transmitted through the birefringent material 12 is changed, and the shape of the point spread function before and after the image point corresponding to the subject distance of the subject is changed.
- the actuator 13 inserts and retracts the birefringent material 12 with respect to the optical path.
- since the actuator 13 inserts and retracts the birefringent material 12 with respect to the optical path, the imaging device 10 can obtain both an image of the subject formed through the birefringent material 12 and an image formed without passing through it.
- the focusing range control unit 14 moves at least one of the optical system 11 and the image sensor 15 to control the focus position and the depth of field. Specifically, this control is performed by operating the optical system 11 in a specific pattern or by switching a specific optical element.
- the imaging element 15 is composed of a CCD, a CMOS, or the like, and converts the light received on the imaging surface into an electrical signal for each pixel and outputs it.
- the image acquisition unit 16 acquires a plurality of images from the image sensor 15 and holds each image.
- the reference image generation unit 17 generates a reference image (an omnifocal image) by estimating the state without blur due to the optical system from a plurality of images with different focus positions and depths of field, obtained under the control of the focusing range control unit 14.
- the distance measurement unit 18 performs distance measurement based on a DFD technique using a blurred image focused on an arbitrary distance and a reference image obtained from the reference image generation unit 17.
- the birefringent material 12 is a material having optical anisotropy, and has a property of separating light into ordinary light and extraordinary light according to the polarization direction of the light that has entered the material.
- the ordinary ray and the extraordinary ray are determined by the direction of the optical axis unique to the birefringent material 12.
- An ordinary ray is a ray having an electric field that oscillates perpendicularly to a plane formed by an optical axis and an incident ray
- an extraordinary ray is a ray having an electric field that oscillates in the plane.
- the direction and the number of optical axes vary depending on the type of substance: a material with one optical axis is called uniaxial, and a material with two optical axes is called biaxial.
- in this embodiment, calcite, which is a uniaxial crystal, is used as the birefringent material 12.
- the difference between the ordinary ray and the extraordinary ray when passing through the birefringent material 12 is that the ordinary ray has a constant light velocity regardless of the propagation direction, whereas the velocity of the extraordinary ray depends on its propagation direction. Furthermore, the refractive index no for ordinary rays differs from the refractive index ne for extraordinary rays. Because of this difference between no and ne, and because the velocity of the extraordinary ray varies with propagation direction, when rays enter the birefringent material 12 as shown in FIG. 2, the ordinary ray and the extraordinary ray travel in different directions.
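The direction dependence for a uniaxial crystal can be sketched with the standard index-ellipsoid relation (the numeric indices below are textbook values for calcite near 589 nm, assumed for illustration, not taken from the patent):

```python
import math

# Textbook ordinary/extraordinary indices for calcite near 589 nm (assumed).
N_O, N_E = 1.658, 1.486

def extraordinary_index(theta):
    """Effective refractive index seen by the extraordinary ray propagating
    at angle theta to the crystal's optical axis:
        1/n(theta)^2 = cos(theta)^2 / n_o^2 + sin(theta)^2 / n_e^2
    The ordinary ray sees n_o regardless of direction."""
    inv_n_sq = (math.cos(theta) ** 2) / N_O ** 2 + (math.sin(theta) ** 2) / N_E ** 2
    return 1.0 / math.sqrt(inv_n_sq)
```

At theta = 0 (propagation along the optical axis) the extraordinary ray sees n_o and no double refraction occurs; at theta = 90° it sees n_e. This direction dependence is what displaces the extraordinary-ray image point relative to the ordinary-ray one.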
- extraordinary rays are used in particular to make the shape of the point spread function before and after the image point corresponding to the subject distance of the subject different.
- the positional relationship among the optical system 11, the birefringent material 12, and the image sensor 15 is such that the birefringent material 12 is disposed between the optical system 11 (lens) and the image sensor 15 as shown in FIG. 3. That is, the three members are arranged on the optical axis in the order of the optical system 11, the birefringent material 12, and the image sensor 15.
- the birefringent material 12 is shaped as a parallel plate and arranged so that the planes of the birefringent material intersecting the optical axis are perpendicular to the optical axis. Note that "perpendicular" here need not be strictly perpendicular.
- the birefringent material 12 is uniaxial, and the direction of the optical axis is the y direction in FIG.
- the birefringent material 12 is preferably a parallel plate. If it is a parallel plate, it can be given the property of mainly affecting only astigmatism.
- the parallel plate described here is a substance in which a first surface on which light is incident and a second surface on which light is emitted are parallel to each other. That is, the angles and shapes of the surfaces other than the first surface and the second surface are not limited.
- FIG. 4 is a diagram showing the behavior of ordinary rays and extraordinary rays in the yz plane and the xz plane in the configuration of FIG.
- the shape of the extraordinary-ray point spread function is larger in the y direction than in the x direction at position (a) in front of the image point corresponding to the subject distance in FIG. 4; as shown in FIG. 5 (a-1), the shape is elongated in the y direction.
- the shape of the extraordinary-ray point spread function is larger in the x direction than in the y direction at position (b) behind the image point corresponding to the subject distance; as shown in FIG. 5 (b-1), the shape is elongated in the x direction.
- FIGS. 5 (a-2) and 5 (b-2) show the point spread functions at positions (a) and (b) in FIG. 4, respectively, in the absence of the birefringent material 12. It can be confirmed that without the birefringent material 12 the shape is similar (here, circular) before and after the image point corresponding to the subject distance of the subject.
- FIG. 6 is a diagram showing point spread functions corresponding to different subject positions when the birefringent material 12 is used.
- FIG. 7 is a diagram showing point spread functions corresponding to different subject positions when the birefringent material 12 is not used.
- the "subject distance" here was defined above as the distance from the imaging device to the subject, but it may instead be the distance from the optical system 11 to the subject, or the distance from the imaging element 15 to the subject.
- considering the point spread functions for subject positions (a) and (b): when the birefringent material 12 is present, as shown in FIG. 6, the shapes of the point spread function before and after the image point corresponding to the subject distance differ, so the shape of the point spread function formed on the image sensor 15 for position (a) differs from that for position (b). That is, the subject distance can be uniquely estimated from the point spread function observed at the image sensor 15. In contrast, in FIG. 7, without the birefringent material 12, the point spread functions for positions (a) and (b) have similar shapes, so the subject distance cannot be uniquely determined from the observed point spread function.
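The effect can be illustrated numerically: below, anisotropic Gaussians are assumed stand-ins for the extraordinary-ray PSFs sketched in FIG. 5 (the true shapes depend on the optics and are not given in the patent).

```python
import numpy as np

def aniso_gauss(sx, sy, size=15):
    """Anisotropic Gaussian with standard deviations sx (x) and sy (y)."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx**2 / (2 * sx**2) + yy**2 / (2 * sy**2)))
    return g / g.sum()

# Without birefringence: PSFs before/after the image point are similar (isotropic).
before_iso = aniso_gauss(2.0, 2.0)
after_iso  = aniso_gauss(2.0, 2.0)

# With birefringence: elongated along y before the image point,
# along x after it (cf. FIG. 5 (a-1) and (b-1)).
before_bir = aniso_gauss(1.0, 3.0)
after_bir  = aniso_gauss(3.0, 1.0)
```

The isotropic pair is identical, so positions (a) and (b) cannot be told apart; the birefringent pair differ (one is the transpose of the other), so the observed PSF shape pins down on which side of the image point the sensor lies.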
- a reference image that is not blurred by the optical system 11 is used.
- An image that is not blurred by the optical system 11 can also be referred to as an image having a deep depth of field.
- increasing the depth of field can easily be achieved by reducing the aperture of the optical system; however, this reduces the amount of light received by the image sensor 15.
- EDoF (Extended Depth of Field)
- the simplest EDoF method is a method of capturing a plurality of images while gradually shifting the in-focus position, and extracting and synthesizing in-focus portions from these images.
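A minimal sketch of this simplest EDoF approach, assuming a Laplacian-magnitude focus measure (the patent does not specify one):

```python
import numpy as np

def local_sharpness(img):
    """Absolute Laplacian response as a simple per-pixel focus measure.
    np.roll wraps at the borders, which is acceptable for a sketch."""
    lap = (-4.0 * img
           + np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1))
    return np.abs(lap)

def focus_stack(images):
    """Naive EDoF: at each pixel keep the value from the stack slice
    (focus position) with the highest focus measure."""
    stack = np.stack(images)                              # (n, H, W)
    sharp = np.stack([local_sharpness(im) for im in images])
    best = np.argmax(sharp, axis=0)                       # (H, W)
    return np.take_along_axis(stack, best[None], axis=0)[0]
```

In practice the focus measure would be smoothed over a neighborhood to avoid the per-pixel noise this naive version exhibits.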
- Non-Patent Document 2 discloses a method of generating an image without blur by changing the focus position during exposure.
- the point spread function becomes almost constant regardless of the subject distance, and a uniform blurred image can be obtained. If deconvolution is performed on the obtained blurred image using an invariant point spread function that is not affected by the subject distance, an image that is not blurred in the entire image can be obtained.
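The deconvolution step can be sketched with a Wiener filter, a common regularized form of deconvolution (the constant `k` and the filter itself are assumptions for illustration, not taken from Non-Patent Document 2):

```python
import numpy as np

def pad_psf(psf, shape):
    """Embed a small PSF kernel in a full-size array with its center moved
    to index (0, 0), matching the circular convolution implied by the FFT."""
    big = np.zeros(shape)
    kh, kw = psf.shape
    big[:kh, :kw] = psf
    return np.roll(big, (-(kh // 2), -(kw // 2)), axis=(0, 1))

def wiener_deconvolve(blurred, psf, k=1e-4):
    """Invert a distance-invariant blur: F = conj(H) * G / (|H|^2 + k),
    where k regularizes frequencies at which |H| is small (noise)."""
    H = np.fft.fft2(pad_psf(psf, blurred.shape))
    G = np.fft.fft2(blurred)
    return np.real(np.fft.ifft2(np.conj(H) * G / (np.abs(H) ** 2 + k)))
```

Because the EDoF PSF is (approximately) the same at every subject distance, a single such deconvolution restores the whole image at once.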
- an EDoF technique using a special optical element has also been proposed.
- this is a method using an optical element called a cubic phase mask (Cubic Phase Mask).
- cubic phase mask (Cubic Phase Mask)
- the shape is shown in FIG.
- if an optical element having such a shape is incorporated in the vicinity of the aperture of the optical system, an image having substantially constant blur is obtained regardless of the subject distance.
- as in Non-Patent Document 1, when deconvolution is performed using an invariant point spread function unaffected by the subject distance, a blur-free image can be obtained over the entire image.
- a method using a multifocal lens can be used.
- FIG. 9 is a flowchart illustrating an example of the process flow for calculating the subject distance. This process calculates, from among n predetermined subject distances d1, d2, ..., dn, the one closest to the distance of the imaged subject.
- first, a subject image I transmitted through the birefringent material and a reference image I′ are captured and acquired (steps S101 and S102). Note that the order of steps S101 and S102 may be reversed.
- the reference image acquired here is an image obtained by capturing a subject image that does not pass through the birefringent material 12.
- the relationship represented by the following Equation 1 is established between the image I and the reference image I′:
I(x, y) = h(x, y, d(x, y)) * I′(x, y) ... (Equation 1)
- h represents a point spread function at a position (x, y) in the image
- d (x, y) represents a subject distance at the position (x, y).
- * in the formula represents a convolution operation. Since the point spread function varies with the subject distance, when subjects lie at several different distances, the image I is obtained by convolving, at each image position, the point spread function for that position's subject distance into the blur-free image.
- an initial value of 1 is substituted into the counter i (step S103), and an error function C(x, y, di) for the i-th subject distance is calculated for each pixel of the image (step S104).
- the error function is expressed by Equation 2 below:
C(x, y, di) = |I(x, y) − h(x, y, di) * I′(x, y)| ... (Equation 2)
- Equation 2 corresponds to taking the difference between the actual captured image I and the image obtained by convolving the point spread function h(x, y, di) corresponding to the i-th subject distance di into the blur-free reference image I′. When the imaged subject actually lies at the i-th subject distance, the error function C(x, y, di), which is this difference, is minimized. The error function can be based on any form that expresses the difference, such as the absolute value of the difference or the L2 norm.
- after calculating the error function, it is determined whether the value of the counter i has reached n (step S105). If not, the value of the counter i is incremented by 1 (step S106), and the process is repeated until i reaches n.
- the subject distance is calculated (step S107).
- the subject distance d(x, y) at the position (x, y) is expressed by Equation 3 below:
d(x, y) = argmin over di of C(x, y, di) ... (Equation 3)
- alternatively, the image may be divided into a plurality of blocks, the sum of the error functions taken within each block, and the subject distance that minimizes the summed error adopted for the entire block.
- the subject distance can be estimated uniquely.
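The loop of steps S103–S107 and Equations 1–3 can be sketched as follows (a minimal NumPy sketch: the PSF bank mapping each candidate distance di to its kernel h(di) is assumed to come from calibration of the actual optical system, and each PSF is treated as shift-invariant for simplicity):

```python
import numpy as np

def convolve_same(img, kernel):
    """Zero-padded 'same' 2-D convolution (the kernels here are symmetric,
    so correlation and convolution coincide)."""
    kh, kw = kernel.shape
    padded = np.pad(img, ((kh // 2,) * 2, (kw // 2,) * 2))
    out = np.zeros_like(img, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * padded[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def estimate_distance(I, I_ref, psf_bank):
    """For each candidate distance di, compute the error function
    C(x, y, di) = |I - h(di) * I'| (Equation 2), then take the
    per-pixel argmin over di (Equation 3)."""
    distances = sorted(psf_bank)
    C = np.stack([np.abs(I - convolve_same(I_ref, psf_bank[d]))
                  for d in distances])          # shape (n, H, W)
    best = np.argmin(C, axis=0)                 # index of minimizing di
    return np.asarray(distances)[best]          # d(x, y)
```

For the block-wise variant described above, C would be summed over each block before taking the argmin.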
- the direction of the optical axis of the birefringent material 12 is upward in FIG. 3, but the direction of the optical axis is not limited to the upward direction, and may be any direction.
- whatever the direction of the optical axis, the shape of the point spread function before and after the image point corresponding to the subject distance can be made to differ.
- if the direction of the optical axis is changed, the shape of the obtained point spread function also changes; regardless of the direction, however, the image point corresponding to the subject distance differs between the ordinary ray and the extraordinary ray.
- the birefringent material 12 is described as using calcite, which is a uniaxial crystal, but other materials having birefringence effects may be used.
- for the optical axis, not only its direction but also the number of axes can serve as an element for controlling the shape of the point spread function; not only uniaxial but also biaxial birefringent materials can be effective.
- a plurality of birefringent materials of uniaxial, biaxial, or both can be arranged to widen the range of change. Further, depending on the thickness and type of the birefringent material, the shape of the obtained point spread function can be changed before and after the image point corresponding to the subject distance of the subject.
- in the above, acquisition of the subject image transmitted through the birefringent material 12 and the subject image not transmitted through it is realized by moving the birefringent material with the actuator; in general, it can be realized either by physically moving the birefringent material itself into and out of the optical path, or by using an optical element whose birefringence effect can be controlled.
- examples of the former include linearly moving the birefringent material plate with an actuator, or rotating it while keeping it perpendicular to the optical axis, so as to create states in which the birefringent material is present or absent on the optical path.
- examples of the latter include elements that can be controlled electrically, for instance via the electro-optic effect, and elements that can be controlled magnetically.
- the presence or absence of the birefringence effect can be controlled by switching the presence or absence of application of voltage or magnetic field.
- for example, a liquid crystal, whose birefringence effect can be controlled electrically or magnetically, may be employed as the birefringent material.
- the position of the birefringent material is not limited to that shown in FIG. 3; the effect of making the shape of the point spread function differ before and after the image point corresponding to the subject distance can be obtained at any position, but it is preferable to place it immediately in front of the image sensor as shown in FIG. 3.
- the optical system 11 is preferably an optical system in which the shape of the point spread function is the same at all image heights, and is particularly preferably an image side telecentric optical system.
- the image-side telecentric optical system is an optical system in which the principal ray and the optical axis are parallel at all angles of view on the image side.
- even when the birefringent material 12 is arranged on the optical path, the shape of the point spread function remains the same at all image heights. That is, if the optical system has the property that the shape of the point spread function is the same at all image heights, that property is preserved when the birefringent material 12 is placed on the optical path, so the optical system need not be redesigned to include the birefringent material. Moreover, if the shape of the point spread function is the same at all image heights, only one point spread function is needed for the ranging calculation, reducing computation cost.
- FIG. 10 is a block diagram illustrating the configuration of the imaging device 19 according to Embodiment 2 of the present invention. In FIG. 10, the same reference numerals are used for components identical to those of the imaging device 10 in FIG. 1, and part of their description is omitted.
- the imaging device 19 includes an optical system 11, a light beam separation unit 20, a birefringent material 12, a focusing range control unit 14, an imaging element A21, an imaging element B22, an image acquisition unit A23, an image acquisition unit B24, a reference image generation unit 17, And a distance measuring unit 18.
- the optical system 11 forms a subject image on the image sensor A21 and the image sensor B22.
- the light beam separation unit 20 spatially separates light beams with an arbitrary light amount ratio.
- the image pickup element A21 and the image pickup element B22 are composed of a CCD, a CMOS, or the like, and convert the light received by the image pickup surface into an electric signal for each pixel and output it.
- one of the light beams separated by the light beam separation unit 20 has its shape of the point spread function changed by the birefringent material 12, and is received by the image sensor A21.
- the image sensor B22 receives the other light beam separated by the light beam separation unit 20, which does not pass through the birefringent material 12 and is therefore unaffected by it.
- the image acquisition unit A23 and the image acquisition unit B24 acquire images from the image sensor A21 and the image sensor B22, respectively, and store the acquired images.
- with the configuration shown in FIG. 11, a blurred image whose point spread function differs in shape before and after the image point corresponding to the subject distance, owing to the birefringent material 12, is obtained from the image sensor A21.
- the light rays that do not pass through the birefringent material 12 are imaged by the imaging element B22 while the focus position and the depth of field are controlled by the focusing range control unit 14, as in Embodiment 1.
- the reference image generation unit 17 generates a reference image based on the image acquired by the imaging element B22.
- the blurred image obtained by the image sensor A21 and the reference image generated from the image captured by the image sensor B22 are used to calculate the subject distance by the same processing as in FIG. 9.
- the blurred image obtained from the image sensor A21 and the reference image obtained from the reference image generation unit 17 correspond to the image I and the reference image I ′ in FIG. 9, respectively. Furthermore, the subject distance can be calculated by the same calculation as Expressions 1 to 3.
- Examples of the light beam separation unit 20 used for light beam separation include a non-polarization beam splitter and a polarization beam splitter.
- the obtained image I is an image including both an extraordinary ray and an ordinary ray as in the first embodiment.
- with a polarizing beam splitter, the direction of the separated polarization can be matched to the optical axis of the birefringent material so that the light contained in the image I consists only of extraordinary rays. Since only extraordinary rays are included in the image I, an image free of noise due to ordinary rays can be captured, yielding a more accurate image for deriving the subject distance.
- alternatively, a birefringent material can be disposed between the polarizing beam splitter and the optical system; in that case, the polarization direction must be selected so that only ordinary rays reach the image sensor B22.
- an image containing only extraordinary rays can also be obtained by using an optical element, such as a polarizer, that transmits only a specific polarization, although this reduces the amount of light.
- since the image I and the reference image I′ can be acquired at the same time, no difference other than blur occurs between the two images, and the subject distance can be obtained more accurately.
- when the two images are captured at different times, by contrast, the relative position of the subject with respect to the imaging apparatus may change due to movement of the subject or of the imaging apparatus itself; a difference other than blur then arises between the two images, and the accuracy of distance measurement tends to decrease.
- the signal-to-noise (S/N) ratio
- it can be said that in the first embodiment the image I and the reference image I′ are acquired by time division, whereas in the second embodiment they are acquired by spatial division.
- dividing the light beam reduces the amount of light for each of the image I and the reference image I′. However, when the light amounts of the two images are combined, no light is lost or wasted; if the time required to acquire both images is the same, the total light amount is the same in the first embodiment and the second embodiment.
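The light-budget argument above can be checked with trivial arithmetic (the numbers below are illustrative only, not from the patent): time division gives each image the full flux for half the time, while spatial division gives each sensor half the flux for the full time, so the totals match.

```python
# Toy check of the light-budget argument (illustrative numbers only).
exposure = 1.0          # total acquisition time, arbitrary units
incoming_flux = 100.0   # light per unit time entering the optical system

# Embodiment 1 (time division): each image gets the full flux for half the time.
time_div = [incoming_flux * (exposure / 2), incoming_flux * (exposure / 2)]

# Embodiment 2 (spatial division): each sensor gets half the flux for the full time.
space_div = [(incoming_flux / 2) * exposure, (incoming_flux / 2) * exposure]

print(sum(time_div), sum(space_div))  # prints: 100.0 100.0
```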
- an omnifocal (all-in-focus) image is used as the reference image to obtain the subject distance of the subject.
- however, the present invention is not limited to this; an image with uniform blur may also be used as the reference image to derive the subject distance of the subject.
- the control unit of the actuator 13 serving as the birefringence effect providing unit, the image acquisition unit 16 serving as the imaging unit, and the distance measurement unit 18 in the block diagrams of the first and second embodiments (FIG. 1, FIG. 10, etc.) are typically realized as an LSI, which is an integrated circuit. These may each be made into a single chip, or a single chip may include some or all of them. For example, the functional blocks other than the memory may be integrated into a single chip.
- the term LSI is used here, but depending on the degree of integration, it may also be called an IC, a system LSI, a super LSI, or an ultra LSI.
- the method of circuit integration is not limited to LSI, and implementation with a dedicated circuit or a general-purpose processor is also possible.
- an FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may be used.
- only the means for storing the data to be processed may be configured separately instead of being integrated into one chip.
- since the imaging apparatus can perform distance measurement based on an image captured from a single viewpoint, it can be applied to all types of imaging equipment.
- 10 Imaging device
- 11 Optical system
- 12 Birefringent substance
- 13 Actuator
- 14 Focusing range control unit
- 15 Imaging element
- 16 Image acquisition unit
- 17 Reference image generation unit
- 18 Distance measurement unit
- 19 Imaging apparatus
- 20 Light beam separation unit
- 21 Imaging element A
- 22 Imaging element B
- 23 Image acquisition unit A
- 24 Image acquisition unit B
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Electromagnetism (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Optics & Photonics (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Automatic Focus Adjustment (AREA)
- Measurement Of Optical Distance (AREA)
- Studio Devices (AREA)
Abstract
Description
FIG. 1 is a block diagram showing the configuration of the imaging apparatus according to Embodiment 1 of the present invention.
The imaging apparatus 19 according to Embodiment 2 of the present invention is configured to separate ordinary rays and extraordinary rays and to acquire an image of each type of ray alone. FIG. 10 is a block diagram showing the configuration of the imaging apparatus 19 according to Embodiment 2 of the present invention. In FIG. 10, the same reference numerals are used for the same components as those of the imaging apparatus 10 of FIG. 1, and some descriptions are omitted. The imaging apparatus 19 includes an optical system 11, a light beam separation unit 20, a birefringent substance 12, a focusing range control unit 14, an imaging element A21, an imaging element B22, an image acquisition unit A23, an image acquisition unit B24, a reference image generation unit 17, and a distance measurement unit 18.
11 Optical system
12 Birefringent substance
13 Actuator
14 Focusing range control unit
15 Imaging element
16 Image acquisition unit
17 Reference image generation unit
18 Distance measurement unit
19 Imaging apparatus
20 Light beam separation unit
21 Imaging element A
22 Imaging element B
23 Image acquisition unit A
24 Image acquisition unit B
Claims (13)
- An imaging apparatus comprising:
an imaging element that captures an image;
an optical system for forming a subject image on the imaging element;
an optical element having a birefringence effect; and
a distance measurement unit that measures a distance from the imaging element to a subject using the captured image and a point spread function that is changed by the optical element before and after an image point corresponding to a subject distance of the subject.
- The imaging apparatus according to Claim 1, wherein a direction of an optical axis of the optical element is non-parallel to an optical axis of the optical system.
- The imaging apparatus according to Claim 1 or Claim 2, wherein the optical element is disposed, on the optical axis of the optical system, between the imaging element and the optical system, and a plane of the optical element intersecting the optical axis of the optical system is perpendicular to the optical axis of the optical system.
- The imaging apparatus according to any one of Claims 1 to 3, further comprising an optical element moving unit that turns the birefringence effect on and off on the optical axis of the optical system by inserting the optical element into, or retracting it from, the optical axis of the optical system, wherein the distance measurement unit measures the distance from the imaging element to the subject using an image captured by the imaging element in a state without the birefringence effect of the optical element and an image captured in a state where the optical element is on the optical axis of the optical system.
- The imaging apparatus according to any one of Claims 1 to 3, wherein the optical element can turn the birefringence effect on and off electrically or magnetically, and the distance measurement unit measures the distance to the subject using an image captured by the imaging element in a state without the birefringence effect of the optical element and an image captured in a state where the optical element is on the optical axis of the optical system.
- The imaging apparatus according to any one of Claims 1 to 5, further comprising a reference image generation unit that generates a reference image from an image captured by the imaging element in a state without the birefringence effect of the optical element, wherein the distance measurement unit estimates the point spread function using an image captured through the optical element and the reference image, and measures the distance to the subject.
- The imaging apparatus according to Claim 6, wherein the reference image generation unit generates, as the reference image, an omnifocal image from an image captured by the imaging element in a state without the birefringence effect of the optical element.
- The imaging apparatus according to any one of Claims 1 to 7, wherein the optical system has an image-side telecentric optical characteristic.
- The imaging apparatus according to any one of Claims 1 to 8, further comprising a light beam separation unit that separates a light beam into a plurality of optical paths, wherein a plurality of the imaging elements are provided, each capturing the imaging target via a corresponding one of the plurality of optical paths separated by the light beam separation unit, and the optical element is disposed on at least one of the plurality of optical paths separated by the light beam separation unit.
- The imaging apparatus according to any one of Claims 1 to 9, wherein a plurality of the optical elements are provided.
- An imaging method for an imaging apparatus having an imaging element that captures an image and an optical system for forming a subject image on the imaging element, the method comprising:
a birefringence effect providing step of applying, on an optical axis of the optical system, a birefringence effect that causes a shape of a point spread function determined by the optical system to differ between positions before and after an image point corresponding to a subject distance of a subject;
an imaging step of capturing an image with the imaging element in a state where the birefringence effect is applied on the optical axis of the optical system; and
a distance measurement step of measuring a distance to the subject using the image captured by the imaging element and the point spread function.
- A program for causing a computer to execute the imaging method according to Claim 11.
- An integrated circuit for an imaging apparatus having an imaging element that captures an image and an optical system for forming a subject image on the imaging element, the integrated circuit comprising:
a birefringence effect providing unit that applies, on an optical axis of the optical system, a birefringence effect that causes a shape of a point spread function determined by the optical system to differ between positions before and after an image point corresponding to a subject distance of a subject;
an imaging unit that causes the imaging element to capture an image in a state where the birefringence effect is applied on the optical axis of the optical system; and
a distance measurement unit that measures a distance to the subject using the image captured by the imaging element and the point spread function.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/574,079 US20120314061A1 (en) | 2010-11-24 | 2011-11-18 | Imaging apparatus, imaging method, program, and integrated circuit |
JP2012510474A JP5873430B2 (ja) | 2010-11-24 | 2011-11-18 | 撮像装置、撮像方法、プログラムおよび集積回路 |
CN201180006620.0A CN102713513B (zh) | 2010-11-24 | 2011-11-18 | 摄像装置、摄像方法、程序以及集成电路 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-260859 | 2010-11-24 | ||
JP2010260859 | 2010-11-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012070208A1 true WO2012070208A1 (ja) | 2012-05-31 |
Family
ID=46145579
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/006420 WO2012070208A1 (ja) | 2010-11-24 | 2011-11-18 | 撮像装置、撮像方法、プログラムおよび集積回路 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20120314061A1 (ja) |
JP (1) | JP5873430B2 (ja) |
CN (1) | CN102713513B (ja) |
WO (1) | WO2012070208A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013205516A (ja) * | 2012-03-27 | 2013-10-07 | Nippon Hoso Kyokai <Nhk> | 多重フォーカスカメラ |
WO2014021238A1 (en) * | 2012-07-31 | 2014-02-06 | Canon Kabushiki Kaisha | Image pickup apparatus, depth information acquisition method and program |
JP2016529473A (ja) * | 2013-06-13 | 2016-09-23 | ビーエーエスエフ ソシエタス・ヨーロピアBasf Se | 少なくとも1つの物体を光学的に検出する検出器 |
WO2020196027A1 (ja) * | 2019-03-28 | 2020-10-01 | ソニー株式会社 | 光学系、内視鏡、および医療用画像処理システム |
US10955936B2 (en) | 2015-07-17 | 2021-03-23 | Trinamix Gmbh | Detector for optically detecting at least one object |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6112862B2 (ja) * | 2012-12-28 | 2017-04-12 | キヤノン株式会社 | 撮像装置 |
CN104102068B (zh) * | 2013-04-11 | 2017-06-30 | 聚晶半导体股份有限公司 | 自动对焦方法及自动对焦装置 |
JP2015046777A (ja) * | 2013-08-28 | 2015-03-12 | キヤノン株式会社 | 撮像装置および撮像装置の制御方法 |
CN108107571B (zh) * | 2013-10-30 | 2021-06-01 | 株式会社摩如富 | 图像处理装置及方法及非暂时性计算机可读记录介质 |
US9404742B2 (en) * | 2013-12-10 | 2016-08-02 | GM Global Technology Operations LLC | Distance determination system for a vehicle using holographic techniques |
JP6699898B2 (ja) * | 2016-11-11 | 2020-05-27 | 株式会社東芝 | 処理装置、撮像装置、及び自動制御システム |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0814887A (ja) * | 1994-06-27 | 1996-01-19 | Matsushita Electric Works Ltd | 光学式変位計 |
JP2963990B1 (ja) * | 1998-05-25 | 1999-10-18 | 京都大学長 | 距離計測装置及び方法並びに画像復元装置及び方法 |
JP2001074422A (ja) * | 1999-08-31 | 2001-03-23 | Hitachi Ltd | 立体形状検出装置及びハンダ付検査装置並びにそれらの方法 |
JP2005077391A (ja) * | 2003-09-04 | 2005-03-24 | Aoi Electronics Co Ltd | 位置姿勢計測装置および位置と姿勢の計測方法 |
JP2007533977A (ja) * | 2004-03-11 | 2007-11-22 | アイコス・ビジョン・システムズ・ナムローゼ・フェンノートシャップ | 波面操作および改良3d測定方法および装置 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2016456A4 (en) * | 2006-04-20 | 2010-08-25 | Xceed Imaging Ltd | THROUGH OPTICAL SYSTEM AND METHOD FOR PROVIDING AN ENLARGED SHARPNESS DURING THE FIGURE |
CN101952762B (zh) * | 2008-01-02 | 2012-11-28 | 加利福尼亚大学董事会 | 高数值孔径远程显微镜设备 |
US8305485B2 (en) * | 2010-04-30 | 2012-11-06 | Eastman Kodak Company | Digital camera with coded aperture rangefinder |
-
2011
- 2011-11-18 US US13/574,079 patent/US20120314061A1/en not_active Abandoned
- 2011-11-18 CN CN201180006620.0A patent/CN102713513B/zh not_active Expired - Fee Related
- 2011-11-18 WO PCT/JP2011/006420 patent/WO2012070208A1/ja active Application Filing
- 2011-11-18 JP JP2012510474A patent/JP5873430B2/ja not_active Expired - Fee Related
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0814887A (ja) * | 1994-06-27 | 1996-01-19 | Matsushita Electric Works Ltd | 光学式変位計 |
JP2963990B1 (ja) * | 1998-05-25 | 1999-10-18 | 京都大学長 | 距離計測装置及び方法並びに画像復元装置及び方法 |
JP2001074422A (ja) * | 1999-08-31 | 2001-03-23 | Hitachi Ltd | 立体形状検出装置及びハンダ付検査装置並びにそれらの方法 |
JP2005077391A (ja) * | 2003-09-04 | 2005-03-24 | Aoi Electronics Co Ltd | 位置姿勢計測装置および位置と姿勢の計測方法 |
JP2007533977A (ja) * | 2004-03-11 | 2007-11-22 | アイコス・ビジョン・システムズ・ナムローゼ・フェンノートシャップ | 波面操作および改良3d測定方法および装置 |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013205516A (ja) * | 2012-03-27 | 2013-10-07 | Nippon Hoso Kyokai <Nhk> | 多重フォーカスカメラ |
WO2014021238A1 (en) * | 2012-07-31 | 2014-02-06 | Canon Kabushiki Kaisha | Image pickup apparatus, depth information acquisition method and program |
US9762788B2 (en) | 2012-07-31 | 2017-09-12 | Canon Kabushiki Kaisha | Image pickup apparatus, depth information acquisition method and program |
JP2016529473A (ja) * | 2013-06-13 | 2016-09-23 | ビーエーエスエフ ソシエタス・ヨーロピアBasf Se | 少なくとも1つの物体を光学的に検出する検出器 |
US10823818B2 (en) | 2013-06-13 | 2020-11-03 | Basf Se | Detector for optically detecting at least one object |
US10845459B2 (en) | 2013-06-13 | 2020-11-24 | Basf Se | Detector for optically detecting at least one object |
US10955936B2 (en) | 2015-07-17 | 2021-03-23 | Trinamix Gmbh | Detector for optically detecting at least one object |
WO2020196027A1 (ja) * | 2019-03-28 | 2020-10-01 | ソニー株式会社 | 光学系、内視鏡、および医療用画像処理システム |
Also Published As
Publication number | Publication date |
---|---|
US20120314061A1 (en) | 2012-12-13 |
JP5873430B2 (ja) | 2016-03-01 |
JPWO2012070208A1 (ja) | 2014-05-19 |
CN102713513B (zh) | 2015-08-12 |
CN102713513A (zh) | 2012-10-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5873430B2 (ja) | 撮像装置、撮像方法、プログラムおよび集積回路 | |
US20230362344A1 (en) | System and Methods for Calibration of an Array Camera | |
US8488872B2 (en) | Stereo image processing apparatus, stereo image processing method and program | |
WO2011158498A1 (ja) | 撮像装置及び撮像方法 | |
US11499824B2 (en) | Distance measuring camera | |
JP2016128816A (ja) | プレノプティック・カメラを使った表面属性の推定 | |
KR20120066043A (ko) | 거리 측정용 카메라 장치 | |
US11032533B2 (en) | Image processing apparatus, image capturing apparatus, image processing method, and storage medium | |
JP2007158825A (ja) | 画像入力装置 | |
WO2009016256A1 (en) | Ultra-compact aperture controlled depth from defocus range sensor | |
WO2019125427A1 (en) | System and method for hybrid depth estimation | |
US11410321B2 (en) | Distance measuring camera | |
JP2022128517A (ja) | 測距カメラ | |
KR20190050859A (ko) | 웨이퍼의 3차원 맵핑 | |
US20200410707A1 (en) | Distance measuring camera | |
WO2013069279A1 (ja) | 撮像装置 | |
EP3350770A1 (en) | An apparatus and a method for generating data representing a pixel beam | |
JP2007205767A (ja) | 三次元座標計測装置および方法 | |
JP7328589B2 (ja) | 測距カメラ | |
US11842507B2 (en) | Distance measuring camera | |
EP2500690B1 (en) | Range finder and imaging device | |
JP2024021700A (ja) | 画像処理装置、撮像装置、画像処理方法、及びコンピュータプログラム | |
Taketomi et al. | Depth estimation based on defocus blur using a single image taken by a tilted lens optics camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180006620.0 Country of ref document: CN |
WWE | Wipo information: entry into national phase |
Ref document number: 2012510474 Country of ref document: JP |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11843382 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 13574079 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 11843382 Country of ref document: EP Kind code of ref document: A1 |