US9270948B2 - Image pick-up device, method, and system utilizing a lens having plural regions each with different focal characteristics
- Publication number: US9270948B2
- Application number: US14/001,978 (US201214001978A)
- Authority
- US
- United States
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related, expires
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/32—Means for focusing
- G03B13/34—Power focusing
- G03B13/36—Autofocus systems
- H04N5/2254—
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/36—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
- G02B7/38—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals measured at different points on the optical axis, e.g. focussing on two or more planes and comparing image data
Description
- the present invention relates to an image pickup apparatus such as a camera, and an image pickup method using the image pickup apparatus.
- Distance measurement apparatuses that measure the distance to an object (the object to which the distance is measured) based on the parallax between a plurality of image pickup optical systems have been used for following-distance measurement for automobiles, autofocus systems for cameras, and three-dimensional shape measurement systems.
- a pair of image pickup optical systems arranged in the left-right direction or in the vertical direction form images on the respective image pickup areas, and the distance to the object is detected through triangulation using the parallax between those images.
- The DFD (Depth From Defocus) method is known as a scheme for measuring the distance to an object using a single image pickup optical system. The DFD method calculates the distance by analyzing the amount of blur in the captured image; however, with a single image it is not possible to determine whether a feature is a pattern of the object itself or blur caused by the object distance, and therefore methods that estimate the distance from a plurality of images have been used (Patent Document 1, Non-Patent Document 1).
- Patent Document 1 Japanese Patent No. 3110095
- Patent Document 2 Japanese Laid-Open Patent Publication No. 2010-39162
- Non-Patent Document 1 Xue Tu, Youn-sik Kang and Murali Subbarao, “Two- and Three-Dimensional Methods for Inspection and Metrology V.”, Edited by Huang, Peisen S. Proceedings of the SPIE, Volume 6762, pp. 676203 (2007).
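The ambiguity described above, and why a second differently-focused image resolves it, can be illustrated with a toy sketch (hypothetical 1-D signals; a moving-average filter stands in for defocus blur, and all widths and values are invented for illustration):

```python
import numpy as np

def box_blur(signal, width):
    """Blur a 1-D signal with a moving-average kernel of the given width."""
    kernel = np.ones(width) / width
    return np.convolve(signal, kernel, mode="same")

def sharpness(signal):
    """Degree of sharpness: sum of squared brightness differences between
    adjacent samples (large when edges are steep, small when blurred)."""
    return float(np.sum(np.diff(signal) ** 2))

# A test pattern (a step edge); the blur width stands in for the amount of
# defocus at some object distance.
pattern = np.zeros(64)
pattern[32:] = 1.0

# Two hypothetical optical paths with different focal characteristics blur
# the same scene by different amounts.
img_a = box_blur(pattern, 3)
img_b = box_blur(pattern, 7)

# The ratio of the two sharpness values depends on the blur (hence on the
# object distance) but not on the pattern's own contrast.
ratio = sharpness(img_a) / sharpness(img_b)
print(ratio > 1)  # True: path A is the better-focused one here
```

Scaling the pattern's contrast scales both sharpness values equally, so the ratio is unchanged; this is why two differently-focused observations can disambiguate what a single image cannot.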
- Configurations using a plurality of image pickup optical systems increase the size and cost of the image pickup apparatus. Moreover, it is necessary to provide a plurality of image pickup optical systems of uniform characteristics and to ensure that their optical axes are parallel to one another with high precision, which makes manufacture more difficult; in addition, a calibration process for determining camera parameters is needed, requiring a large number of steps.
- Patent Document 1 discloses an image pickup apparatus in which the optical path is divided by a prism, and an image is captured by two image pickup surfaces of varied back focuses, thereby making it possible to measure the distance to an object in a single iteration of image capture.
- Such a method requires two image pickup surfaces, thereby increasing the size of the image pickup apparatus and significantly increasing the cost.
- the present invention has been made in order to solve such problems as described above, and a primary object thereof is to provide an image pickup apparatus and an image pickup method capable of obtaining brightness information with which it is possible to calculate the object distance using a single image pickup optical system.
- An image pickup apparatus of the present invention includes: a lens optical system having a plurality of regions including six regions having such optical characteristics that focal characteristics are made different from one another; an image pickup device having a plurality of pixels on which light beams having passed through the lens optical system are incident; and an arrayed optical device arranged between the lens optical system and the image pickup device for making light beams having passed through the six regions incident respectively on different pixels on the image pickup device.
- An image pickup system of the present invention includes: an image pickup apparatus of the present invention; and a signal processing device for calculating a distance to an object using brightness information of a plurality of pixels obtained respectively from six different pixels on which light beams having passed through six regions of the image pickup apparatus are incident.
- An image pickup method of the present invention uses image pickup apparatus including: a lens optical system having a plurality of regions including six regions having such optical characteristics that focal characteristics are made different from one another; an image pickup device having a plurality of pixels on which light beams having passed through the lens optical system are incident; and an arrayed optical device arranged between the lens optical system and the image pickup device, the method including: making light beams having passed through the six regions incident respectively on different pixels on the image pickup device by means of the arrayed optical device; and calculating a distance to an object using brightness information of a plurality of pixels obtained respectively from six different pixels on which light beams having passed through six regions are incident.
- Another image pickup apparatus of the present invention includes: a lens optical system having a plurality of regions including three regions having such optical characteristics that focal characteristics are made different from one another; an image pickup device having a plurality of pixels on which light beams having passed through the lens optical system are incident and of which center points of the pixels are located at apices of a regular hexagon; and an arrayed optical device arranged between the lens optical system and the image pickup device for making light beams having passed through the three regions incident on different pixels on the image pickup device.
- Still another image pickup apparatus of the present invention includes: a lens optical system having a plurality of regions including four regions having such optical characteristics that focal characteristics are made different from one another; an image pickup device including a plurality of pixels on which light beams having passed through the lens optical system are incident and of which positions of center points in a row direction are shifted from one row to another by half a pixel arrangement pitch; and an arrayed optical device arranged between the lens optical system and the image pickup device for making light beams having passed through the four regions incident on different pixels on the image pickup device.
- According to the present invention, it is possible to obtain brightness information from which the object distance can be calculated through image capture using a single image pickup system.
- Where a movie is captured using an image pickup apparatus of the present invention, it is possible to accurately measure the distance to an object even if the position of the object varies over time.
- FIG. 1 A schematic diagram showing Embodiment 1 of an image pickup apparatus A according to the present invention.
- FIG. 2 A front view of an optical device L 1 according to Embodiment 1 of the present invention, as viewed from the object side.
- FIG. 3 A perspective view of an arrayed optical device K according to Embodiment 1 of the present invention.
- FIG. 4 (a) is a diagram showing, on an enlarged scale, the arrayed optical device K and the image pickup device N shown in FIG. 1, and (b) is a diagram showing the positional relationship between the arrayed optical device K and pixels on the image pickup device N.
- FIG. 5 A cross-sectional view showing the image pickup apparatus A according to the present invention.
- FIG. 6 A graph showing the relationship between the object distance and the degree of sharpness (the sharpness of the image) according to Embodiment 1 of the present invention.
- FIG. 7 (a) to (c) are diagrams each showing the brightness distribution of an image block having a size of 16×16, and (d) to (f) are diagrams showing the frequency spectra obtained by performing a two-dimensional Fourier transform on the image blocks shown in (a) to (c), respectively.
- FIG. 8 A front view of the optical device L 1 according to Embodiment 1 of the present invention, as viewed from the object side.
- FIG. 9 A diagram showing the positional relationship between the arrayed optical device K and pixels on the image pickup device N according to Embodiment 1 of the present invention.
- FIG. 10 A front view of the optical device L 1 according to Embodiment 1 of the present invention, as viewed from the object side.
- FIG. 11 A diagram showing the positional relationship between the arrayed optical device K and pixels on the image pickup device N according to Embodiment 1 of the present invention.
- FIG. 12 A perspective view of the arrayed optical device K according to Embodiment 1 of the present invention.
- FIG. 13 A diagram showing the positional relationship between the arrayed optical device K and pixels on the image pickup device N according to Embodiment 1 of the present invention.
- FIG. 14 A schematic diagram showing Embodiment 2 of the image pickup apparatus A according to the present invention.
- FIG. 15 A front view of the optical device L 1 according to Embodiment 2 of the present invention, as viewed from the object side.
- FIG. 16 (a) is a diagram showing, on an enlarged scale, the arrayed optical device K and the image pickup device N shown in FIG. 14, and (b) is a diagram showing the positional relationship between the arrayed optical device K and pixels on the image pickup device N.
- FIG. 17 A graph showing the relationship between the object distance and the degree of sharpness (the sharpness of the image) according to Embodiment 2 of the present invention.
- FIG. 18 A front view of the optical device L 1 according to Embodiment 2 of the present invention, as viewed from the object side.
- FIG. 19 A diagram showing the positional relationship between the arrayed optical device K and pixels on the image pickup device N according to Embodiment 2 of the present invention.
- FIG. 20 (a1) is a perspective view showing a microlens array having a rotationally asymmetric shape with respect to the optical axis; (a2) shows the contour lines of the microlens array shown in (a1); (a3) shows the results of a light beam tracking simulation where the microlens shown in (a1) and (a2) is applied to the arrayed optical device of the present invention; (b1) is a perspective view showing a microlens array having a rotationally symmetric shape with respect to the optical axis; (b2) shows the contour lines of the microlens array shown in (b1); and (b3) shows the results of a light beam tracking simulation where the microlens shown in (b1) and (b2) is applied to the arrayed optical device according to an embodiment of the present invention.
- FIG. 1 is a schematic diagram showing an image pickup apparatus A of Embodiment 1.
- the image pickup apparatus A of the present embodiment includes a lens optical system L whose optical axis is V, an arrayed optical device K arranged in the vicinity of the focal point of the lens optical system L, an image pickup device N, a first signal processing section C 1 , a second signal processing section C 2 , and a storage section Me.
- the lens optical system L has six optical regions D 1 , D 2 , D 3 , D 4 , D 5 and D 6 ( FIG. 1 shows a cross section passing through D 2 and D 5 ) having such optical characteristics that focal characteristics are made different from one another, and is composed of an optical device L 1 on which light beams B 1 , B 2 , B 3 , B 4 , B 5 and B 6 ( FIG. 1 shows a cross section passing through B 2 and B 5 ) from an object (not shown) are incident, a stop S on which light having passed through the optical device L 1 is incident, and a lens L 2 on which light having passed through the stop S is incident.
- the optical device L 1 is preferably arranged in the vicinity of the stop S.
- light beams having passed through the six optical regions D 1 , D 2 , D 3 , D 4 , D 5 and D 6 pass through the lens L 2 and then are incident on the arrayed optical device K.
- the arrayed optical device K causes the light beams having passed through the six optical regions D 1 , D 2 , D 3 , D 4 , D 5 and D 6 to be incident on six pixel groups P 1 , P 2 , P 3 , P 4 , P 5 and P 6 of the image pickup device N, respectively.
- A plurality of pixels belong to each of the six pixel groups P 1 , P 2 , P 3 , P 4 , P 5 and P 6 . For example, in FIG. 1 , pixels p 1 , p 2 , p 3 , p 4 , p 5 and p 6 are pixels belonging to the pixel groups P 1 , P 2 , P 3 , P 4 , P 5 and P 6 , respectively.
- the first signal processing section C 1 outputs images I 1 , I 2 , I 3 , I 4 , I 5 and I 6 obtained from the pixel groups P 1 , P 2 , P 3 , P 4 , P 5 and P 6 , respectively. Since the optical characteristics of the six optical regions D 1 , D 2 , D 3 , D 4 , D 5 and D 6 are different from one another, the degrees of sharpness (values calculated by using the brightness) of the images I 1 , I 2 , I 3 , I 4 , I 5 and I 6 are different from one another depending on the object distance.
- the storage section Me stores the correlation between the degree of sharpness and the object distance for each of the light beams having passed through the optical regions D 1 , D 2 , D 3 , D 4 , D 5 and D 6 .
- With the second signal processing section C 2 , it is possible to obtain the distance to the object based on the degrees of sharpness of the images I 1 , I 2 , I 3 , I 4 , I 5 and I 6 and the correlations.
- FIG. 2 is a front view of the optical device L 1 as viewed from the object side.
- the optical device L 1 is divided into six portions, the optical regions D 1 , D 2 , D 3 , D 4 , D 5 and D 6 , in a plane perpendicular to the optical axis V, with the optical axis V being the boundary center.
- the broken line s denotes the position of the stop S.
- the light beam B 2 in FIG. 1 is a light beam passing through the optical region D 2 on the optical device L 1
- the light beam B 5 is a light beam passing through the optical region D 5 on the optical device L 1 .
- the light beams B 1 , B 2 , B 3 , B 4 , B 5 and B 6 pass through the optical device L 1 , the stop S, the lens L 2 and the arrayed optical device K in this order to arrive at an image pickup surface Ni on the image pickup device N (shown in FIG. 4 , etc.).
- FIG. 3 is a perspective view of the arrayed optical device K.
- optical elements M 1 are arranged in a hexagonal close-packed pattern in a plane perpendicular to the optical axis V.
- the cross section (the longitudinal cross section) of each optical element M 1 has a curved shape protruding toward the image pickup device N.
- the arrayed optical device K has a structure of a microlens array.
- the arrayed optical device K is arranged in the vicinity of the focal point of the lens optical system L, and is arranged at a position away from the image pickup surface Ni by a predetermined distance.
- the position at which the arrayed optical device K is arranged may be determined based on, for example, the focal point of the lens L 2 .
- the “focal characteristics being different” as used in the present embodiment refers to difference in at least one of characteristics that contribute to light condensing in the optical system, and specifically to difference in the focal length, the distance to an object in focus, the distance range where the degree of sharpness is greater than or equal to a certain value, etc.
- By varying the optical characteristics, such as the radius of curvature of the surface, the aspherical coefficient, or the refractive index, between the optical regions D 1 , D 2 , D 3 , D 4 , D 5 and D 6 , it is possible to vary the focal characteristics for light beams having passed through the different regions.
- FIG. 4( a ) is a diagram showing, on an enlarged scale, the arrayed optical device K and the image pickup device N shown in FIG. 1
- FIG. 4( b ) is a diagram showing the positional relationship between the arrayed optical device K and pixels on the image pickup device N.
- the arrayed optical device K is arranged so that the surface thereof on which the optical elements M 1 are formed is facing the image pickup surface Ni.
- Pixels P having a geometric shape are arranged on the image pickup surface Ni so that the center point of each pixel P is at an apex of a regular hexagon. Specifically, honeycomb-array pixels described in Patent Document 2 may be used.
- a plurality of pixels P provided on the image pickup surface can each be classified as a pixel belonging to one of the pixel groups P 1 , P 2 , P 3 , P 4 , P 5 and P 6 .
- the arrayed optical device K is arranged so that one optical element M 1 thereof corresponds to six pixels p 1 , p 2 , p 3 , p 4 , p 5 and p 6 included in the pixel groups P 1 , P 2 , P 3 , P 4 , P 5 and P 6 , respectively.
- the center points of the six pixels p 1 , p 2 , p 3 , p 4 , p 5 and p 6 included in the first to sixth pixel groups P 1 , P 2 , P 3 , P 4 , P 5 and P 6 , respectively, are located at the apices of a regular hexagon.
- Microlenses Ms (the optical elements M 1 ) are provided over the image pickup surface Ni so that each covers the six pixels p 1 , p 2 , p 3 , p 4 , p 5 and p 6 included in the pixel groups P 1 , P 2 , P 3 , P 4 , P 5 and P 6 , respectively.
- optical elements M 1 are preferably arranged in a hexagonal close-packed pattern so that pixels arranged to be at the apices of a regular hexagon can be covered efficiently.
- the arrayed optical device is designed so that the majority of the light beams B 1 , B 2 , B 3 , B 4 , B 5 and B 6 having passed through the optical regions D 1 , D 2 , D 3 , D 4 , D 5 and D 6 on the optical device L 1 arrives at the pixel groups P 1 , P 2 , P 3 , P 4 , P 5 and P 6 on the image pickup surface Ni, respectively.
- this configuration can be realized by appropriately setting parameters, such as the refractive index of the arrayed optical device K, the distance from the image pickup surface Ni, and the radius of curvature of the surface of the optical element M 1 .
- the first signal processing section C 1 shown in FIG. 1 outputs the first image I 1 formed only by the pixel group P 1 .
- Similarly, the images I 2 to I 6 formed only by the pixel groups P 2 to P 6 , respectively, are output.
- the second signal processing section C 2 performs a distance measurement calculation using the brightness information represented by differences in brightness value between adjacent pixels (the degree of sharpness) in the images I 1 , I 2 , I 3 , I 4 , I 5 and I 6 .
- the images I 1 , I 2 , I 3 , I 4 , I 5 and I 6 are images obtained by the light beams B 1 , B 2 , B 3 , B 4 , B 5 and B 6 having passed through the optical regions D 1 , D 2 , D 3 , D 4 , D 5 and D 6 having such optical characteristics that focal characteristics are made different from one another.
- the second signal processing section C 2 calculates the distance to the object by using the degree of sharpness (brightness information) of a plurality of images obtained for a plurality of pixel groups among the first to sixth pixel groups P 1 to P 6 .
- By using the images I 1 , I 2 , I 3 , I 4 , I 5 and I 6 , it is possible to precisely obtain the distance to an object at a short distance, as compared with a method where the number of divisions of optical regions is smaller. That is, it is possible to precisely obtain the distance to the object through (e.g., a single iteration of) image capture using a single image pickup optical system (the lens optical system L).
- the stop S is a region through which light beams of all field angles pass. Therefore, by inserting a plane having optical characteristics for controlling focal characteristics in the vicinity of the stop S, it is possible to similarly control focal characteristics of light beams of all field angles. That is, in the present embodiment, it is preferred that the optical device L 1 is provided in the vicinity of the stop S. As the optical regions D 1 , D 2 , D 3 , D 4 , D 5 and D 6 having such optical characteristics that focal characteristics are made different from one another are arranged in the vicinity of the stop S, the light beams can be given focal characteristics according to the number of divisions of regions.
- the optical device L 1 is provided at a position such that light having passed through the optical device L 1 is incident on the stop S directly (with no other optical members interposed therebetween).
- the optical device L 1 may be provided closer to the image pickup device N than the stop S. In such a case, it is preferred that the optical device L 1 is provided between the stop S and the lens L 2 and light having passed through the stop S is incident on the optical device L 1 directly (with no other optical members interposed therebetween).
- the angle of incidence of the light beam at the focal point of the optical system is uniquely determined based on the position of the light beam passing through the stop S and the field angle.
- the arrayed optical device K has the function of varying the outgoing direction based on the angle of incidence of the light beam. Therefore, it is possible to distribute light beams among pixels on the image pickup surface Ni so as to correspond to the optical regions D 1 , D 2 , D 3 , D 4 , D 5 and D 6 divided in the vicinity of the stop S.
- FIG. 5 is a cross-sectional view showing the image pickup apparatus A of Embodiment 1.
- like components to those of FIG. 1 are denoted by like reference numerals to those of FIG. 1 .
- the region H of FIG. 5 in practice includes the arrayed optical device K.
- The region H has the configuration shown in FIG. 4( a ). FIG. 6 shows the relationship between the object distance and the degree of sharpness obtained using design data of such an optical system; in FIG. 6 , profiles G 1 , G 2 . . . G 6 denote the degrees of sharpness of predetermined regions of the image produced only by the respective pixel groups P 1 , P 2 , P 3 , P 4 , P 5 and P 6 .
- the degree of sharpness can be obtained based on the difference in brightness value between adjacent pixels in an image block of a predetermined size.
- Alternatively, the degree of sharpness can be obtained based on the frequency spectrum obtained by Fourier-transforming the brightness distribution of an image block of a predetermined size.
- The degree of sharpness E within a block of a predetermined size can be calculated, for example, as E = Σ_i Σ_j √((Δx_{i,j})² + (k·Δy_{i,j})²), where Δx_{i,j} is the difference between the brightness value of the pixel at a coordinate point (i, j) within the image block and the brightness value of the adjacent pixel in the x direction, Δy_{i,j} is the difference between the brightness value of the pixel at the coordinate point (i, j) and the brightness value of the adjacent pixel in the y direction, and k is a coefficient. It is preferred that Δy_{i,j} be multiplied by the predetermined coefficient k, for example to compensate for the vertical pixel pitch differing from the horizontal pitch.
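As a concrete illustration of this adjacent-difference measure, the sketch below computes such a degree of sharpness for a 16×16 block (the stripe pattern, the 3-pixel blur, and k = 1 are arbitrary choices for the example, not values from the patent):

```python
import numpy as np

def block_sharpness(block, k=1.0):
    """Degree of sharpness E of an image block: sum over the block of
    sqrt(dx^2 + (k*dy)^2), where dx and dy are brightness differences to
    the adjacent pixel in the x and y directions and k compensates for a
    vertical pixel pitch that differs from the horizontal one."""
    b = block.astype(float)
    dx = np.diff(b, axis=1)[:-1, :]   # horizontal brightness differences
    dy = np.diff(b, axis=0)[:, :-1]   # vertical brightness differences
    return float(np.sum(np.sqrt(dx ** 2 + (k * dy) ** 2)))

# A sharp 16x16 block of vertical stripes, and a blurred version of it
# (defocus is stood in for by a 3-pixel moving average along each row).
stripes = np.tile([0.0, 1.0], 8)[None, :].repeat(16, axis=0)
blurred = (np.roll(stripes, 1, axis=1) + stripes + np.roll(stripes, -1, axis=1)) / 3

print(block_sharpness(stripes) > block_sharpness(blurred))  # True
```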
- FIGS. 7( a ) to 7 ( c ) each show the brightness distribution of an image block having a size of 16×16. The degree of sharpness decreases in the order of FIGS. 7( a ), 7 ( b ) and 7 ( c ).
- FIGS. 7( d ) to 7 ( f ) show the frequency spectra obtained by performing a two-dimensional Fourier transform on the image blocks of FIGS. 7( a ) to 7 ( c ), respectively.
- In FIGS. 7( d ) to 7 ( f ), for ease of understanding, the intensity of each frequency spectrum is shown after being logarithmically transformed; a frequency spectrum of higher intensity appears brighter.
- In each frequency spectrum, the brightest position at the center is the DC component, and the frequency increases toward the periphery.
- From FIGS. 7( d ) to 7 ( f ), it can be seen that more of the higher-frequency spectrum values are missing the lower the degree of sharpness of the image. Therefore, the degree of sharpness can be obtained from these frequency spectra by extracting the whole or a part of the spectrum, for example.
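A minimal sketch of this Fourier-based alternative, estimating sharpness as the spectral energy outside a small neighborhood of the DC component (the block size, test pattern, and frequency cutoff are arbitrary choices here):

```python
import numpy as np

def fft_sharpness(block, cutoff=4):
    """Estimate sharpness as the energy of the 2-D frequency spectrum
    outside a low-frequency neighborhood of the DC component (which sits
    at the center after fftshift)."""
    spectrum = np.fft.fftshift(np.fft.fft2(block.astype(float)))
    h, w = spectrum.shape
    cy, cx = h // 2, w // 2
    yy, xx = np.ogrid[:h, :w]
    high = (np.abs(yy - cy) > cutoff) | (np.abs(xx - cx) > cutoff)
    return float(np.sum(np.abs(spectrum[high]) ** 2))

rng = np.random.default_rng(0)
texture = rng.random((16, 16))  # sharp block with high-frequency content
# Blurring (3-pixel moving average along rows) removes high frequencies.
smooth = (np.roll(texture, 1, 1) + texture + np.roll(texture, -1, 1)) / 3

print(fft_sharpness(texture) > fft_sharpness(smooth))  # True
```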
- the range of Z in FIG. 6 represents an area over which at least one of the degrees of sharpness G 1 , G 2 , G 3 , G 4 , G 5 and G 6 is changing.
- the object distance can be obtained by using such a relationship.
- the object distance has a correlation with the ratio between the degrees of sharpness G 1 and G 2 in the range of Z, the ratio between the degrees of sharpness G 2 and G 3 in the range of Z 2 , the ratio between the degrees of sharpness G 3 and G 4 in the range of Z 3 , the ratio between the degrees of sharpness G 4 and G 5 in the range of Z 4 , and the ratio between the degrees of sharpness G 5 and G 6 in the range of Z 5 .
- the value of the ratio between the degrees of sharpness of any two of six images formed by light beams incident on the six optical regions D 1 to D 6 has a correlation with the object distance.
- the correlations between these degrees of sharpness and the object distances are stored in advance in the storage section Me.
- When the image pickup apparatus is used, the ratio between the degrees of sharpness of the images I 1 , I 2 , I 3 , I 4 , I 5 and I 6 produced for the respective pixel groups P 1 , P 2 , P 3 , P 4 , P 5 and P 6 is obtained for each arithmetic block from the data obtained as a result of a single iteration of image capture. Then, the object distance can be obtained by using the correlations stored in the storage section Me (the correlations between the object distance and the ratio between the degrees of sharpness of any two images).
- Specifically, for each arithmetic block, the ratio between degrees of sharpness in the stored correlation is compared with the value of the ratio between the degrees of sharpness of the images I 1 , I 2 , I 3 , I 4 , I 5 and I 6 . The object distance corresponding to the value at which the two match is used as the distance to the object at the time of the image-capturing operation.
- In order to uniquely obtain the object distance from the ratios between the degrees of sharpness, the ratios need to be all different from one another over a predetermined object distance range.
- In the present embodiment, the configuration is such that the degree of sharpness is high for at least one of the optical regions in the range of Z and the ratios between the degrees of sharpness are all different from one another, thus making it possible to uniquely obtain the object distance. Since the ratio cannot be obtained if the value of the degree of sharpness is too low, it is preferred that the value of the degree of sharpness be greater than or equal to a certain value.
- the relationship between the object distance and the degree of sharpness is dictated by the radius of curvature of the surface of the optical regions D 1 , D 2 , D 3 , D 4 , D 5 and D 6 , the spherical aberration characteristics, and the refractive index. That is, the optical regions D 1 , D 2 , D 3 , D 4 , D 5 and D 6 need to have such optical characteristics that the ratios between the degrees of sharpness of the images I 1 , I 2 , I 3 , I 4 , I 5 and I 6 are all different from one another over a predetermined distance range.
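The table-lookup step described above can be sketched as follows; the calibration grid of distances and sharpness ratios is entirely invented for illustration (in practice it would come from measurement or simulation of the actual optical system):

```python
import numpy as np

# Hypothetical calibration: for a grid of object distances (mm), the stored
# sharpness ratio G_a/G_b between two of the six images. These values are
# invented for illustration; the real correlation is stored in Me.
distances_mm = np.array([300.0, 400.0, 500.0, 600.0, 700.0])
stored_ratio = np.array([2.1, 1.6, 1.2, 0.9, 0.7])  # monotonic over this range

def distance_from_ratio(measured_ratio):
    """Invert the stored correlation: return the calibrated distance whose
    stored sharpness ratio is closest to the measured one."""
    i = int(np.argmin(np.abs(stored_ratio - measured_ratio)))
    return distances_mm[i]

print(distance_from_ratio(1.25))  # 500.0
```

A nearest-neighbour match is used here for simplicity; because the stored ratio is monotonic over the range, interpolation between the two closest calibration points would give a finer estimate.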
- the object distance may be obtained by using a value other than the degree of sharpness, e.g., the contrast, as long as it is a value calculated using the brightness (brightness information).
- the contrast can be obtained, for example, from the ratio between the maximum brightness value and the minimum brightness value within a predetermined arithmetic block. While the degree of sharpness is a difference between brightness values, the contrast is a ratio between brightness values. The contrast may be obtained from the ratio between a point of the maximum brightness value and another point of the minimum brightness value, or the contrast may be obtained from the ratio between the average value among some higher brightness values and the average value among some lower brightness values, for example.
- In this case, correlations between object distances and contrast ratios are stored in advance in the storage section Me.
- By obtaining the contrast ratio between the images I 1 , I 2 , I 3 , I 4 , I 5 and I 6 for each block, it is possible to obtain the object distance using the correlations.
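A sketch of such a contrast measure, using the averages of the higher and lower brightness values as suggested above (the block values and the fraction used are arbitrary examples):

```python
import numpy as np

def block_contrast(block, frac=0.1):
    """Contrast of a block: ratio of the average of the highest brightness
    values to the average of the lowest ones. Using averages rather than a
    single max/min pair makes the measure more robust to noise; frac sets
    how many values count as 'highest'/'lowest'."""
    v = np.sort(block.astype(float), axis=None)
    n = max(1, int(frac * v.size))
    return float(np.mean(v[-n:]) / np.mean(v[:n]))

block = np.array([[10.0, 50.0],
                  [90.0, 200.0]])
print(block_contrast(block, frac=0.5))  # (200+90)/2 divided by (10+50)/2
```

With frac small enough that n = 1, this reduces to the ratio between the single maximum and minimum brightness values mentioned in the text.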
- the present embodiment may employ either one of the method of obtaining the degree of sharpness from the difference between brightness values of adjacent pixels, and the method of obtaining the degree of sharpness through Fourier transform.
- Since the degree of sharpness is a relative value, the value obtained by the former method and the value obtained by the latter method are different. Therefore, the method of obtaining the degree of sharpness used when obtaining the correlations (the correlations between object distances and degrees of sharpness stored in advance) and the method used at the time of image capture need to be matched with each other.
- the optical system of the image pickup apparatus may use an image-side telecentric optical system.
- the main beam incident angle at the arrayed optical device K is then close to 0 degrees, and it is therefore possible to reduce the crosstalk between light beams arriving at the respective pixel groups P 1 , P 2 , P 3 , P 4 , P 5 and P 6 over the entire image pickup area.
- an image-side non-telecentric optical system may be used as the lens optical system L.
- in that case, the magnifications of the obtained images I 1 , I 2 , I 3 , I 4 , I 5 and I 6 differ from one region to another.
- the areas of the optical regions D 1 , D 2 , D 3 , D 4 , D 5 and D 6 are made equal to one another (generally equal area).
- the exposure time can be made equal for the pixel groups P 1 , P 2 , P 3 , P 4 , P 5 and P 6 .
- where the areas of the optical regions D 1 , D 2 , D 3 , D 4 , D 5 and D 6 are different from one another, it is preferred that the exposure time be varied among the pixel groups P 1 , P 2 , P 3 , P 4 , P 5 and P 6 , or that a brightness adjustment be performed after image capture.
- correlations between object distances and ratios between degrees of sharpness (or the contrasts) of images obtained from the six optical regions D 1 , D 2 , D 3 , D 4 , D 5 and D 6 of the optical device L 1 are stored in advance, and the distance to an object can be obtained from the ratios between the degrees of sharpness (or the contrasts) of the images I 1 , I 2 , I 3 , I 4 , I 5 and I 6 and the stored correlations. That is, by performing a single iteration of image capture, for example, using an image pickup apparatus of the present embodiment, it is possible to obtain brightness information with which the object distance can be measured. Then, the object distance can be calculated using the brightness information.
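As a hedged sketch of the table lookup described above (the calibration values below are invented for illustration; an actual apparatus would store correlations measured for its real optical regions):

```python
import numpy as np

# Hypothetical stored correlation (the storage section Me): sharpness
# ratio between two of the images, recorded at known object distances.
stored_distances = np.array([0.5, 1.0, 2.0, 4.0, 8.0])  # metres (assumed)
stored_ratios    = np.array([3.0, 2.2, 1.5, 1.0, 0.7])  # assumed monotonic

def object_distance(g1, g2):
    """Object distance for one block from the ratio of two degrees of
    sharpness, by interpolating the stored correlation."""
    ratio = g1 / g2
    # np.interp requires increasing x values, hence the reversal
    return float(np.interp(ratio, stored_ratios[::-1], stored_distances[::-1]))

print(object_distance(1.5, 1.0))  # lies on a stored point -> 2.0
```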
- since it is possible to obtain the distance to an object through (e.g., a single iteration of) image capture using a single image pickup optical system (the lens optical system L), it is not necessary to make uniform the characteristics and positions of a plurality of image pickup optical systems, as would be required for an image pickup apparatus using a plurality of image pickup optical systems. Moreover, where a movie is captured using an image pickup apparatus of the present embodiment, it is possible to measure the distance to an object accurately even if the position of the object varies with the passage of time.
- the number of kinds of optical characteristics of the optical regions D 1 , D 2 , D 3 , D 4 , D 5 and D 6 may be three instead of six. That is, as shown in FIG. 8 , two of the divided six regions that are located in point symmetry with each other with respect to the optical axis may be provided with the same optical characteristics, thereby resulting in a configuration where there are three optical regions (D 1 , D 2 , D 3 ) such that focal characteristics are made different from one another. Then, as shown in FIG. 9 , an arrangement is used such that the center points of pixels are at the apices of a regular hexagon on the image pickup device N.
- Light beams having passed through the three optical regions D 1 , D 2 and D 3 are incident on the pixel groups P 1 , P 2 and P 3 , respectively.
- Two pixels p 1 included in the pixel group P 1 are located in point symmetry with each other with respect to the central axis of the optical element M 1 .
- similarly, the two pixels p 2 and the two pixels p 3 included in the pixel groups P 2 and P 3 , respectively, are each located in point symmetry with each other with respect to the central axis of the optical element M 1 .
- alternatively, the region may be divided into six by dividing it in two in the lateral direction, on a plane including the optical axis, and in three in the longitudinal direction, thereby forming regions (D 1 , D 2 , D 3 , D 4 , D 5 and D 6 ) having optical characteristics different from one another. One may then combine a microlens array having a grid array of microlenses with rectangular pixels as shown in FIG. 11 . Similar advantageous effects are obtained also by arranging a microlens array including microlenses (the optical elements M 1 ) each having a rectangular outer shape as shown in FIG. 12 so that six square pixels correspond to one microlens (the optical element M 1 ) as shown in FIG. 13 .
- Embodiment 2 is different from Embodiment 1 in that the region of the optical device L 1 is divided in seven. Contents similar to those of Embodiment 1 will not be described here in detail.
- FIG. 14 is a schematic diagram showing Embodiment 2 of the image pickup apparatus A according to the present invention.
- the image pickup apparatus A of the present embodiment includes a lens optical system L whose optical axis is V, an arrayed optical device K arranged in the vicinity of the focal point of the lens optical system L, an image pickup device N, a first signal processing section C 1 , a second signal processing section C 2 , and a storage section Me.
- the lens optical system L has seven optical regions D 1 , D 2 , D 3 , D 4 , D 5 , D 6 and D 7 ( FIG. 14 shows a cross section passing through D 1 , D 2 and D 5 ) having such optical characteristics that focal characteristics are made different from one another, and is composed of an optical device L 1 on which light beams B 1 , B 2 , B 3 , B 4 , B 5 , B 6 and B 7 ( FIG. 14 shows a cross section passing through B 1 , B 2 and B 5 ) from an object (not shown) are incident, a stop S on which light having passed through the optical device L 1 is incident, and a lens L 2 on which light having passed through the stop S is incident.
- the stop S is installed in the vicinity of the lens optical system L, and has a single opening.
- light beams having passed through the seven optical regions D 1 , D 2 , D 3 , D 4 , D 5 , D 6 and D 7 pass through the lens L 2 and then are incident on the arrayed optical device K.
- the arrayed optical device K causes the light beams having passed through the seven optical regions D 1 , D 2 , D 3 , D 4 , D 5 , D 6 and D 7 to be incident on the pixel groups P 1 , P 2 , P 3 , P 4 , P 5 , P 6 and P 7 (shown in FIG. 16 , etc.) of the image pickup device N, respectively.
- the first signal processing section C 1 outputs images I 1 , I 2 , I 3 , I 4 , I 5 , I 6 and I 7 obtained from the pixel groups P 1 , P 2 , P 3 , P 4 , P 5 , P 6 and P 7 , respectively. Since the optical characteristics of the seven optical regions D 1 , D 2 , D 3 , D 4 , D 5 , D 6 and D 7 are different from one another, the degrees of sharpness (values calculated by using the brightness) of the images I 1 , I 2 , I 3 , I 4 , I 5 , I 6 and I 7 are different from one another depending on the object distance.
- the storage section Me stores the correlation between the degree of sharpness and the object distance for each of the light beams having passed through the optical regions D 1 , D 2 , D 3 , D 4 , D 5 , D 6 and D 7 .
- with the second signal processing section C 2 , it is possible to obtain the distance to the object based on the degrees of sharpness of the images I 1 , I 2 , I 3 , I 4 , I 5 , I 6 and I 7 and the correlations.
- FIG. 15 is a front view of the optical device L 1 as viewed from the object side.
- the optical region includes one central region D 1 located at the optical axis of the lens optical system, and six surrounding regions D 2 , D 3 , D 4 , D 5 , D 6 and D 7 located around the central region D 1 .
- while the optical region D 1 has a different shape from the optical regions D 2 , D 3 , D 4 , D 5 , D 6 and D 7 in Embodiment 2, the optical regions D 1 , D 2 , D 3 , D 4 , D 5 , D 6 and D 7 all have an equal area.
- the exposure time can be made equal between the pixel groups P 1 , P 2 , P 3 , P 4 , P 5 , P 6 and P 7 on which light beams from the optical regions are incident. Note that where the optical regions have different areas, it is preferred that the exposure time is made different between pixels depending on their areas, or the brightness is adjusted in the image generation process.
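The post-capture brightness adjustment mentioned above can be sketched as follows (editorial illustration; the assumption is that the light collected by a pixel group is proportional to the area of its optical region, and the area values are arbitrary units):

```python
import numpy as np

def equalize_brightness(images, areas):
    """Scale each region image to a common brightness reference.

    The light collected by a pixel group is taken to be proportional to
    the area of its optical region (an assumption), so dividing by the
    relative area compensates for unequal regions after capture.
    """
    ref = max(areas)
    return [np.asarray(im, dtype=float) * (ref / a)
            for im, a in zip(images, areas)]

i1 = np.full((2, 2), 100.0)   # image from a region with half the area
i2 = np.full((2, 2), 200.0)   # image from the reference region
out = equalize_brightness([i1, i2], areas=[1.0, 2.0])
print(out[0][0, 0], out[1][0, 0])  # 200.0 200.0
```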
- the broken line s denotes the position of the stop S.
- the configuration of the arrayed optical device K is similar to that of Embodiment 1, and the perspective view of the arrayed optical device K of the present embodiment is similar to that of FIG. 3 .
- FIG. 16( a ) is a diagram showing, on an enlarged scale, the arrayed optical device K and the image pickup device N shown in FIG. 14
- FIG. 16( b ) is a diagram showing the positional relationship between the arrayed optical device K and pixels on the image pickup device N.
- the arrayed optical device K is arranged so that the surface thereof on which the optical elements M 4 are formed is facing the image pickup surface Ni.
- a plurality of pixels P are arranged in n rows (n is an integer greater than or equal to 2), for example. As shown in FIG. 16( b ), they are arranged while shifting the positions of the center points of the pixels in the row direction (lateral direction) from one row to another by half the arrangement pitch.
- a plurality of pixels P can each be classified as one of pixels p 1 , p 2 , p 3 , p 4 , p 5 , p 6 and p 7 belonging to one of the pixel groups P 1 , P 2 , P 3 , P 4 , P 5 , P 6 and P 7 .
- the six pixels p 2 , p 3 , p 4 , p 5 , p 6 and p 7 included in the pixel groups P 2 , P 3 , P 4 , P 5 , P 6 and P 7 , respectively, are arranged at the apices of a hexagon, with the pixel p 1 included in the pixel group P 1 being arranged at the center of the hexagon.
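The offset pixel arrangement above can be sketched as coordinate generation (editorial illustration; the row spacing of pitch x sqrt(3)/2 is an assumption chosen so that the six surrounding pixels sit at the apices of a regular hexagon, matching the arrangement described in the text):

```python
import math

def pixel_centers(rows, cols, pitch=1.0):
    """Centre points of pixels where every other row is shifted laterally
    by half the arrangement pitch, as in FIG. 16(b).

    The row spacing of pitch * sqrt(3) / 2 is an assumption that makes
    the six pixels around an interior pixel equidistant, i.e. at the
    apices of a regular hexagon around it.
    """
    dy = pitch * math.sqrt(3) / 2
    return [((c + 0.5 * (r % 2)) * pitch, r * dy)
            for r in range(rows) for c in range(cols)]

pts = pixel_centers(5, 5)
```

With this spacing, an interior pixel (playing the role of p 1 ) has exactly six equidistant neighbours, matching the hexagonal grouping of p 2 through p 7 .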
- the arrayed optical device K is arranged in the vicinity of the focal point of the lens optical system L, and is arranged at a position away from the image pickup surface Ni by a predetermined distance.
- the microlenses Ms are provided so as to cover the surfaces of seven pixels p 1 , p 2 , p 3 , p 4 , p 5 , p 6 and p 7 included in the pixel groups P 1 , P 2 , P 3 , P 4 , P 5 , P 6 and P 7 , respectively.
- the arrayed optical device K is configured so that one optical element M 4 corresponds to seven pixels p 1 , p 2 , p 3 , p 4 , p 5 , p 6 and p 7 included in the pixel groups P 1 , P 2 , P 3 , P 4 , P 5 , P 6 and P 7 , respectively.
- the arrayed optical device is designed so that the majority of the light beams B 1 , B 2 , B 3 , B 4 , B 5 , B 6 and B 7 having passed through the optical regions D 1 , D 2 , D 3 , D 4 , D 5 , D 6 and D 7 on the optical device L 1 arrives at the pixel groups P 1 , P 2 , P 3 , P 4 , P 5 , P 6 and P 7 on the image pickup surface Ni, respectively.
- this configuration can be realized by appropriately setting parameters, such as the refractive index of the arrayed optical device K, the distance from the image pickup surface Ni, and the radius of curvature of the surface of the optical element M 4 .
- the first signal processing section C 1 shown in FIG. 14 outputs the first image I 1 formed only by the pixel group P 1 .
- the images I 2 , I 3 , I 4 , I 5 , I 6 and I 7 formed only by the pixel groups P 2 , P 3 , P 4 , P 5 , P 6 and P 7 , respectively, are output.
- the second signal processing section C 2 performs a distance measurement calculation using the brightness information represented by differences in brightness value between adjacent pixels (the degree of sharpness) in the images I 1 , I 2 , I 3 , I 4 , I 5 , I 6 and I 7 .
- in Embodiment 2, the relationship between the object distance and the degree of sharpness is as shown in FIG. 17 , and the object distance can be obtained over the range Z.
- the present embodiment is configured so that seven different images can be obtained simultaneously by seven regions having such optical characteristics that focal characteristics are made different from one another, and it is therefore possible to obtain the distance to an object through (e.g., a single iteration of) image capture using a single image pickup optical system.
- the number of kinds of optical characteristics of the optical regions D 1 , D 2 , D 3 , D 4 , D 5 , D 6 and D 7 may be four instead of seven. That is, as shown in FIG. 18 , of the seven regions including one central region located at the optical axis of the lens optical system and six surrounding regions located around it, each pair of surrounding regions located in point symmetry with each other with respect to the optical axis may be given the same optical characteristics, resulting in four optical regions (D 1 , D 2 , D 3 and D 4 ) having such optical characteristics that focal characteristics are made different from one another.
- pixels included in the pixel group P 1 , on which light beams having passed through the optical region D 1 are incident, are located on the central axes of the optical elements M 4 .
- Light beams having passed through the optical regions D 2 , D 3 and D 4 each including two regions located in point symmetry with each other with respect to the optical axis are incident on the pixel groups P 2 , P 3 and P 4 , respectively.
- Two pixels p 2 included in the pixel group P 2 are located in point symmetry with each other with respect to the central axis of the optical element M 4 .
- similarly, the two pixels p 3 and the two pixels p 4 included in the pixel groups P 3 and P 4 , respectively, are each located in point symmetry with each other with respect to the central axis of the optical element M 4 .
- no parallax occurs between images obtained in the pixel groups P 1 , P 2 , P 3 and P 4 , on which light beams having passed through the optical regions D 1 , D 2 , D 3 and D 4 , respectively, are incident. This allows for precise distance measurement.
- while Embodiments 1 and 2 are examples where curved surface configurations, etc., for making focal characteristics different from one another are arranged on the object-side surface of the optical device L 1 , such curved surface configurations, etc., may instead be arranged on the image-side surface of the optical device L 1 .
- while the lens L 2 has a single-lens configuration here, it may be a lens configured with a plurality of groups of lenses or a plurality of lenses.
- a plurality of optical regions may be formed on the optical surface of the lens L 2 arranged in the vicinity of the stop.
- while the optical device L 1 is arranged on the object side with respect to the position of the stop, it may instead be arranged on the image side with respect to the position of the stop.
- Embodiments 1 and 2 are directed to an image pickup apparatus including the first signal processing section C 1 , the second signal processing section C 2 , and the storage section Me (shown in FIG. 1 , etc.).
- the image pickup apparatus of the present invention does not have to include the signal processing section and the storage section.
- processes performed by the first signal processing section C 1 and the second signal processing section C 2 may be performed by using a PC, or the like, external to the image pickup apparatus. That is, the present invention may be implemented by a system including an image pickup apparatus, which includes the lens optical system L, the arrayed optical device K and the image pickup device N, and an external signal processing device.
- with the image pickup apparatus of this embodiment, it is possible to obtain brightness information with which the object distance can be measured by performing (e.g., a single iteration of) image capture using a single image pickup optical system.
- the object distance can be obtained through a process performed by an external signal processing section using the correlations between the brightness information and the degree of sharpness (or the contrast) stored in the external storage section.
- the object distance may be obtained by substituting the obtained degree of sharpness or contrast into an expression representing the relationship between the degree of sharpness or the contrast and the object distance.
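The text does not give the expression relating the degree of sharpness (or contrast) to the object distance; as an assumed stand-in, a low-order polynomial fitted once to hypothetical calibration samples can play that role:

```python
import numpy as np

# Hypothetical calibration samples (invented for illustration): ratio of
# degrees of sharpness between two images, and the known object distance.
ratios    = np.array([0.7, 1.0, 1.5, 2.2, 3.0])
distances = np.array([8.0, 4.0, 2.0, 1.0, 0.5])   # metres (assumed)

# Fit log(distance) as a quadratic in the ratio; the model form is an
# assumption standing in for whatever expression a real system would use.
coeffs = np.polyfit(ratios, np.log(distances), 2)

def distance_from_ratio(g_ratio):
    """Evaluate the fitted expression instead of looking up a stored table."""
    return float(np.exp(np.polyval(coeffs, g_ratio)))
```

Replacing the stored table with a fitted expression trades a little accuracy for constant-size storage and smooth interpolation between calibration points.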
- the optical elements (microlenses) of the microlens array of Embodiments 1 and 2 are in a rotationally symmetric shape with respect to the optical axis within a range of a predetermined radius of each optical element.
- below, a comparison is made with microlenses having a rotationally asymmetric shape with respect to the optical axis.
- FIG. 20( a 1 ) is a perspective view showing a microlens array having a rotationally asymmetric shape with respect to the optical axis.
- such a microlens array is formed by patterning a quadrangular prism-shaped resist on the array and performing a heat treatment to round the corner portions of the resist.
- FIG. 20( a 2 ) shows the contour lines of the microlens shown in FIG. 20( a 1 ).
- the radius of curvature in the longitudinal and lateral directions differs from that in the diagonal direction (the diagonal direction across the bottom surface of the microlens).
- FIG. 20( a 3 ) is a diagram showing the results of a light beam tracking simulation in a case where the microlens shown in FIGS. 20( a 1 ) and 20 ( a 2 ) is applied to the arrayed optical device of the present invention.
- while FIG. 20( a 3 ) shows only the light beams passing through one optical region, of all the light beams passing through the arrayed optical device K, it can be seen that crosstalk occurs, with light leaking to adjacent pixels.
- FIG. 20( b 1 ) is a perspective view showing a microlens array having a rotationally symmetric shape with respect to the optical axis.
- a microlens having such a rotationally symmetric shape can be formed on a glass plate, or the like, through a thermal imprinting or UV imprint process.
- FIG. 20( b 2 ) shows the contour lines of the microlens having a rotationally symmetric shape.
- the radius of curvature in the longitudinal and lateral directions is equal to that in the diagonal direction.
- FIG. 20( b 3 ) is a diagram showing the results of a light beam tracking simulation in a case where the microlens shown in FIGS. 20( b 1 ) and 20 ( b 2 ) is applied to the arrayed optical device of the present invention. While FIG. 20( b 3 ) shows only the light beams passing through one optical region, of all the light beams passing through the arrayed optical device K, it can be seen that there is no such crosstalk as that shown in FIG. 20( a 3 ). Thus, by providing a microlens having a rotationally symmetric shape, it is possible to reduce the crosstalk, and thus to suppress the deterioration of precision in the distance measurement calculation.
- An image pickup apparatus is useful as an image pickup apparatus such as a digital still camera or a digital video camera. It is also applicable to a distance measurement apparatus for monitoring the surroundings of an automobile and a person in an automobile, or a distance measurement apparatus for a three-dimensional information input for a game device, a PC, a portable terminal, and the like.
- A image pickup apparatus
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Automatic Focus Adjustment (AREA)
- Studio Devices (AREA)
- Measurement Of Optical Distance (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
Description
Claims (21)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011099165 | 2011-04-27 | ||
JP2011-099165 | 2011-04-27 | ||
PCT/JP2012/000728 WO2012147245A1 (en) | 2011-04-27 | 2012-02-03 | Image pick-up device, image pick-up system equipped with image pick-up device, and image pick-up method |
Publications (2)
Publication Number | Publication Date |
---|---|
US20130329042A1 US20130329042A1 (en) | 2013-12-12 |
US9270948B2 true US9270948B2 (en) | 2016-02-23 |
Family
ID=47071781
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/001,978 Expired - Fee Related US9270948B2 (en) | 2011-04-27 | 2012-02-03 | Image pick-up device, method, and system utilizing a lens having plural regions each with different focal characteristics |
Country Status (3)
Country | Link |
---|---|
US (1) | US9270948B2 (en) |
JP (1) | JP5548310B2 (en) |
WO (1) | WO2012147245A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11272147B2 (en) * | 2017-03-17 | 2022-03-08 | Panasonic Intellectual Property Management Co., Ltd. | Projector and projector system |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013114888A1 (en) * | 2012-02-02 | 2013-08-08 | パナソニック株式会社 | Imaging device |
CN104204941B (en) * | 2012-03-21 | 2016-01-06 | 富士胶片株式会社 | Camera head |
CN108337419A (en) * | 2012-07-12 | 2018-07-27 | 株式会社尼康 | Image processing apparatus |
JP5953270B2 (en) * | 2013-07-04 | 2016-07-20 | オリンパス株式会社 | Imaging device |
JP6077967B2 (en) * | 2013-08-27 | 2017-02-08 | 富士フイルム株式会社 | Imaging device |
JP6536877B2 (en) | 2014-07-31 | 2019-07-03 | パナソニックIpマネジメント株式会社 | Imaging device and imaging system |
US9661193B2 (en) * | 2014-08-01 | 2017-05-23 | Panasonic Intellectual Property Management Co., Ltd. | Imaging apparatus and analyzing apparatus |
WO2021030034A1 (en) | 2019-08-15 | 2021-02-18 | Apple Inc. | Depth mapping using spatial multiplexing of illumination phase |
WO2021034409A1 (en) * | 2019-08-20 | 2021-02-25 | Apple Inc. | Depth sensor with interlaced sampling structure |
US11763472B1 (en) | 2020-04-02 | 2023-09-19 | Apple Inc. | Depth mapping with MPI mitigation using reference illumination pattern |
US11558569B2 (en) | 2020-06-11 | 2023-01-17 | Apple Inc. | Global-shutter image sensor with time-of-flight sensing capability |
Citations (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4560863A (en) | 1982-08-30 | 1985-12-24 | Canon Kabushiki Kaisha | Focus detecting device |
JPS6176310U (en) | 1984-10-26 | 1986-05-22 | ||
JPH05302831A (en) | 1992-04-27 | 1993-11-16 | Olympus Optical Co Ltd | Distance measuring apparatus |
JPH0735545A (en) | 1993-07-22 | 1995-02-07 | Nissan Motor Co Ltd | Optical range finder |
JPH0760211B2 (en) | 1986-04-21 | 1995-06-28 | ソニー株式会社 | Autofocus control device |
US5576975A (en) | 1991-09-20 | 1996-11-19 | Fujitsu Limited | Distance measuring method and a distance measuring apparatus |
JP2000152281A (en) | 1998-11-09 | 2000-05-30 | Sony Corp | Image pickup device |
US20010015763A1 (en) | 2000-02-15 | 2001-08-23 | Michio Miwa | Object monitoring apparatus |
US20040125230A1 (en) | 2002-12-13 | 2004-07-01 | Yasuo Suda | Image sensing apparatus |
JP2006184065A (en) | 2004-12-27 | 2006-07-13 | Matsushita Electric Ind Co Ltd | Object detector |
JP2006184844A (en) | 2004-12-03 | 2006-07-13 | Tochigi Nikon Corp | Image forming optical system and imaging apparatus using the same |
US20070017993A1 (en) * | 2005-07-20 | 2007-01-25 | Ulrich Sander | Optical Device With Increased Depth Of Field |
US20070279618A1 (en) * | 2004-10-15 | 2007-12-06 | Matsushita Electric Industrial Co., Ltd. | Imaging Apparatus And Image Improving Method |
JP2008051894A (en) | 2006-08-22 | 2008-03-06 | Matsushita Electric Ind Co Ltd | Imaging apparatus |
US20080277566A1 (en) | 2005-05-30 | 2008-11-13 | Ken Utagawa | Image Forming State Detection Device |
JP2009198376A (en) | 2008-02-22 | 2009-09-03 | Aisin Seiki Co Ltd | Surface shape measuring device |
JP2010039162A (en) | 2008-08-05 | 2010-02-18 | Nikon Corp | Focus-detecting device and image pickup device |
US20100171854A1 (en) * | 2009-01-08 | 2010-07-08 | Sony Corporation | Solid-state imaging device |
US20110085050A1 (en) * | 2007-08-04 | 2011-04-14 | Omnivision Cdm Optics, Inc. | Multi-Region Imaging Systems |
US20120033105A1 (en) | 2010-08-04 | 2012-02-09 | Olympus Corporation | Image processing apparatus, image processing method, imaging apparatus, and information storage medium |
-
2012
- 2012-02-03 US US14/001,978 patent/US9270948B2/en not_active Expired - Fee Related
- 2012-02-03 WO PCT/JP2012/000728 patent/WO2012147245A1/en active Application Filing
- 2012-02-03 JP JP2013511875A patent/JP5548310B2/en not_active Expired - Fee Related
Patent Citations (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4560863A (en) | 1982-08-30 | 1985-12-24 | Canon Kabushiki Kaisha | Focus detecting device |
JPS6176310U (en) | 1984-10-26 | 1986-05-22 | ||
JPH0760211B2 (en) | 1986-04-21 | 1995-06-28 | ソニー株式会社 | Autofocus control device |
JP3110095B2 (en) | 1991-09-20 | 2000-11-20 | 富士通株式会社 | Distance measuring method and distance measuring device |
US5576975A (en) | 1991-09-20 | 1996-11-19 | Fujitsu Limited | Distance measuring method and a distance measuring apparatus |
JPH05302831A (en) | 1992-04-27 | 1993-11-16 | Olympus Optical Co Ltd | Distance measuring apparatus |
JPH0735545A (en) | 1993-07-22 | 1995-02-07 | Nissan Motor Co Ltd | Optical range finder |
JP2000152281A (en) | 1998-11-09 | 2000-05-30 | Sony Corp | Image pickup device |
US20010015763A1 (en) | 2000-02-15 | 2001-08-23 | Michio Miwa | Object monitoring apparatus |
JP2001227914A (en) | 2000-02-15 | 2001-08-24 | Matsushita Electric Ind Co Ltd | Object monitoring device |
US20040125230A1 (en) | 2002-12-13 | 2004-07-01 | Yasuo Suda | Image sensing apparatus |
JP2004191893A (en) | 2002-12-13 | 2004-07-08 | Canon Inc | Imaging apparatus |
US20070279618A1 (en) * | 2004-10-15 | 2007-12-06 | Matsushita Electric Industrial Co., Ltd. | Imaging Apparatus And Image Improving Method |
JP2006184844A (en) | 2004-12-03 | 2006-07-13 | Tochigi Nikon Corp | Image forming optical system and imaging apparatus using the same |
JP2006184065A (en) | 2004-12-27 | 2006-07-13 | Matsushita Electric Ind Co Ltd | Object detector |
US20080277566A1 (en) | 2005-05-30 | 2008-11-13 | Ken Utagawa | Image Forming State Detection Device |
US20070017993A1 (en) * | 2005-07-20 | 2007-01-25 | Ulrich Sander | Optical Device With Increased Depth Of Field |
JP2008051894A (en) | 2006-08-22 | 2008-03-06 | Matsushita Electric Ind Co Ltd | Imaging apparatus |
US20110085050A1 (en) * | 2007-08-04 | 2011-04-14 | Omnivision Cdm Optics, Inc. | Multi-Region Imaging Systems |
JP2009198376A (en) | 2008-02-22 | 2009-09-03 | Aisin Seiki Co Ltd | Surface shape measuring device |
JP2010039162A (en) | 2008-08-05 | 2010-02-18 | Nikon Corp | Focus-detecting device and image pickup device |
US20100171854A1 (en) * | 2009-01-08 | 2010-07-08 | Sony Corporation | Solid-state imaging device |
US20120033105A1 (en) | 2010-08-04 | 2012-02-09 | Olympus Corporation | Image processing apparatus, image processing method, imaging apparatus, and information storage medium |
JP2012039255A (en) | 2010-08-04 | 2012-02-23 | Olympus Corp | Image processing apparatus, image processing method, imaging apparatus and program |
Non-Patent Citations (3)
Title |
---|
International Search Report for corresponding International Application No. PCT/JP2012/000728 mailed Apr. 17, 2012. |
Preliminary Report on Patentability for corresponding International Application No. PCT/JP2012/000728 dated Jul. 4, 2013 and partial English translation. |
Tu et al., "Two- and Three-Dimensional Methods for Inspection and Metrology V", Edited by Huang, Peisen S. Proceedings of the SPIE, vol. 6762, pp. 676203 (2007), entitled "Depth and Focused Image Recovery from Defocused Images for Cameras Operating in Macro Mode". |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11272147B2 (en) * | 2017-03-17 | 2022-03-08 | Panasonic Intellectual Property Management Co., Ltd. | Projector and projector system |
Also Published As
Publication number | Publication date |
---|---|
US20130329042A1 (en) | 2013-12-12 |
JP5548310B2 (en) | 2014-07-16 |
JPWO2012147245A1 (en) | 2014-07-28 |
WO2012147245A1 (en) | 2012-11-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9270948B2 (en) | Image pick-up device, method, and system utilizing a lens having plural regions each with different focal characteristics | |
US8711215B2 (en) | Imaging device and imaging method | |
US9383199B2 (en) | Imaging apparatus | |
US9142582B2 (en) | Imaging device and imaging system | |
US9182602B2 (en) | Image pickup device and rangefinder device | |
US9531963B2 (en) | Image capturing device and image capturing system | |
US8836825B2 (en) | Imaging apparatus | |
US8260074B2 (en) | Apparatus and method for measuring depth and method for computing image defocus and blur status | |
US20130242161A1 (en) | Solid-state imaging device and portable information terminal | |
US9134126B2 (en) | Image processing device, and image processing method | |
CN107424195B (en) | Light field distance estimation method | |
US20050206874A1 (en) | Apparatus and method for determining the range of remote point light sources | |
US9438778B2 (en) | Image pickup device and light field image pickup lens | |
EP2982946A1 (en) | Light spot centroid position acquisition method for wavefront sensor | |
US8767068B2 (en) | Distance measuring apparatus based on parallax with conditions on curvature and focus length | |
CN108731808B (en) | Method and device for calibrating sub-aperture center position of IMS (IP multimedia subsystem) snapshot imaging spectrometer | |
JP2016066995A (en) | Image deviation amount calculation device, imaging device and image deviation amount calculation method | |
CN107870522B (en) | Imaging optical path device and detection control method of imaging optical path device | |
CN109257595A (en) | The system of Photo-Response Non-Uniformity in a kind of testing image sensor pixel | |
KR101088777B1 (en) | apparatus for measuring the three dimensional shape | |
TW201606420A (en) | Image pickup device and light field image pickup lens |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: PANASONIC CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MURATA, AKIKO;IMAMURA, NORIHIRO;SIGNING DATES FROM 20130716 TO 20130717;REEL/FRAME:031359/0085 |
|
AS | Assignment |
Owner name: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PANASONIC CORPORATION;REEL/FRAME:034194/0143 Effective date: 20141110 Owner name: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LT Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PANASONIC CORPORATION;REEL/FRAME:034194/0143 Effective date: 20141110 |
|
ZAAA | Notice of allowance and fees due |
Free format text: ORIGINAL CODE: NOA |
|
ZAAB | Notice of allowance mailed |
Free format text: ORIGINAL CODE: MN/=. |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
AS | Assignment |
Owner name: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD., JAPAN Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ERRONEOUSLY FILED APPLICATION NUMBERS 13/384239, 13/498734, 14/116681 AND 14/301144 PREVIOUSLY RECORDED ON REEL 034194 FRAME 0143. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:PANASONIC CORPORATION;REEL/FRAME:056788/0362 Effective date: 20141110 |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20240223 |