WO2013105383A1 - Image generation method, image generation apparatus, program, and storage medium - Google Patents

Image generation method, image generation apparatus, program, and storage medium

Info

Publication number
WO2013105383A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
sub
pixel
optical system
imaging optical
Prior art date
Application number
PCT/JP2012/082116
Other languages
English (en)
French (fr)
Inventor
Koichi Fukuda
Masafumi Kimura
Koshi Hatakeyama
Norihito Hiasa
Shohei Tsutsumi
Tomohiro Nishiyama
Kazuhiro Yahata
Original Assignee
Canon Kabushiki Kaisha
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Kabushiki Kaisha filed Critical Canon Kabushiki Kaisha
Priority to US14/370,417 priority Critical patent/US9344624B2/en
Priority to CN201280066884.XA priority patent/CN104041006B/zh
Publication of WO2013105383A1 publication Critical patent/WO2013105383A1/en
Priority to US15/130,367 priority patent/US9609208B2/en


Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/34Systems for automatic generation of focusing signals using different areas in a pupil plane
    • G02B7/346Systems for automatic generation of focusing signals using different areas in a pupil plane using horizontal and vertical areas in the pupil plane, i.e. wide area autofocusing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems
    • H04N23/951Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • H01L27/14601Structural or functional details thereof
    • H01L27/1462Coatings
    • H01L27/14621Colour filter arrangements
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • H01L27/14601Structural or functional details thereof
    • H01L27/14625Optical elements or arrangements associated with the device
    • H01L27/14627Microlenses
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/69Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/843Demosaicing, e.g. interpolating colour pixel values
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/134Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements

Definitions

  • The present invention relates to an image generation method and an image generation apparatus.
  • The exit pupil of the imaging lens is divided into multiple pupil areas, and multiple parallax images corresponding to the divided pupil areas can be captured at the same time.
  • US Patent No. 4410804 discloses an image capture apparatus that uses a two-dimensional image sensor in which one microlens and multiple divided photo-electric converters are formed with respect to one pixel.
  • the divided photo-electric converters are configured so as to receive light from different pupil sub-areas of the exit pupil of the imaging lens via one microlens, and thus pupil division is performed.
  • the multiple parallax images that are captured are equivalent to light field (LF) data, which is information on a spatial distribution of light intensity and an angle distribution.
  • An image generation method for generating an output image from an input image acquired by an image sensor that has an array of a plurality of pixels, each of which has arranged therein a plurality of sub-pixels that each receive a light beam that passes through a different pupil sub-area of an imaging optical system, the method including: a step of generating a plurality of parallax images that respectively correspond to the different pupil sub-areas based on the input image; a step of generating a plurality of pixel shifted images by performing different non-integral shifting for each of the plurality of parallax images according to a virtual image forming plane of the imaging optical system that is different from an image sensing plane at which the image sensor is arranged; and a step of generating an output image that has a higher resolution than each of the parallax images from the plurality of pixel shifted images through composition processing.
  • A program causes a computer to execute the steps of the above-described image generation method.
  • a computer-readable storage medium stores a program for causing a computer to execute the steps of the above-described image generation method.
  • an image generation apparatus comprises an image sensor configured to acquire an input image, wherein the image sensor has an array of a plurality of pixels, each of which has arranged therein a plurality of sub-pixels that each receive a light beam that passes through a different pupil sub-area of an imaging optical system; a first generation means configured to generate a plurality of parallax images that respectively correspond to the different pupil sub-areas based on the input image; a second generation means configured to generate a plurality of pixel shifted images by performing different shifting for each of the plurality of parallax images according to a virtual image forming plane of the imaging optical system that is different from an image sensing plane at which the image sensor is arranged; and a composition means configured to generate an output image that has a higher resolution than the resolution of the parallax images from the plurality of pixel shifted images through composition processing.
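Purely as an illustration of how the three claimed means could fit together, here is a minimal Python sketch. It assumes a hypothetical NumPy layout in which each pixel is stored as an Nθ×Nθ block of sub-pixels; the function names, the array layout, and the integer shifts and simple interleaving used for composition are all assumptions made for brevity, not the patent's implementation.

```python
import numpy as np


def generate_parallax_images(input_image: np.ndarray, n_theta: int) -> np.ndarray:
    """First generation means: one parallax image per pupil sub-area.

    Assumes input_image has shape (H*n_theta, W*n_theta), each pixel being an
    n_theta x n_theta block of sub-pixel signals (hypothetical layout)."""
    h = input_image.shape[0] // n_theta
    w = input_image.shape[1] // n_theta
    blocks = input_image.reshape(h, n_theta, w, n_theta)
    return blocks.transpose(1, 3, 0, 2)          # shape (n_theta, n_theta, H, W)


def shift_parallax_images(parallax: np.ndarray, shifts) -> np.ndarray:
    """Second generation means: a different shift per parallax image toward a
    virtual image forming plane (integer shifts only, for brevity)."""
    shifted = np.empty_like(parallax)
    for a in range(parallax.shape[0]):
        for b in range(parallax.shape[1]):
            dy, dx = shifts[a][b]
            shifted[a, b] = np.roll(parallax[a, b], (dy, dx), axis=(0, 1))
    return shifted


def compose(shifted: np.ndarray, scale: int) -> np.ndarray:
    """Composition means: interleave the shifted images onto a finer grid
    (a crude stand-in for the super-resolution processing described later)."""
    n_t, _, h, w = shifted.shape
    out = np.zeros((h * scale, w * scale))
    count = np.zeros_like(out)
    for a in range(n_t):
        for b in range(n_t):
            out[a % scale::scale, b % scale::scale] += shifted[a, b]
            count[a % scale::scale, b % scale::scale] += 1
    return out / np.maximum(count, 1)


if __name__ == "__main__":
    n_theta = 4
    sensor = np.random.rand(32 * n_theta, 32 * n_theta)   # placeholder input image
    parallax = generate_parallax_images(sensor, n_theta)
    shifts = [[(a % 2, b % 2) for b in range(n_theta)] for a in range(n_theta)]
    output = compose(shift_parallax_images(parallax, shifts), scale=2)
    print(output.shape)                                    # (64, 64)
```

The embodiments described below replace the integer shifts and the naive interleaving with non-integral shifting and super-resolution composition.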
  • FIG. 1 is a schematic configuration diagram of an image capture apparatus according to embodiments of the present invention.
  • FIG. 2 is a schematic diagram of a pixel array according to embodiments of the present invention.
  • FIGS. 3A and 3B are a schematic plan view and a schematic cross-sectional view of a pixel according to embodiments of the present invention.
  • FIG. 4 is a schematic diagram for describing the correspondence between pupil division and the photo-electric converters (sub-pixels) according to embodiments of the present invention.
  • FIG. 5 is a schematic diagram for describing the correspondence between the image sensor and pupil division according to embodiments of the present invention.
  • FIG. 6 is a schematic diagram of a relationship between sub-pixels and angles of incidence according to embodiments of the present invention.
  • FIG. 7 is a diagram for describing refocus processing according to embodiments of the present invention.
  • FIG. 8 is a flowchart of image generation according to a first embodiment of the present invention.
  • FIG. 9 is a diagram for describing pixel shift in parallax images according to embodiments of the present invention.
  • FIG. 10 shows an example of a relational expression between an output image and a pixel shifted image according to embodiments of the present invention.
  • FIG. 11 is a flowchart of image generation according to a second embodiment of the present invention.
  • FIG. 1 is a configuration diagram of a camera as an image capture apparatus that has an image sensor according to a first embodiment of the present invention.
  • Reference numeral 101 denotes a first lens group arranged at the leading end of an imaging optical system, and this lens group is held so as to be capable of moving forward and backward in the optical axis direction.
  • Reference numeral 102 denotes an aperture/shutter that not only adjusts the amount of light in image capturing by performing opening diameter adjustment, but also functions as a shutter for controlling the exposure time in still image capturing.
  • Reference numeral 103 denotes a second lens group.
  • the second lens group 103 can move forward and backward in the optical axis direction together with the aperture/shutter 102, and realizes a magnification effect (zoom function) by moving in conjunction with forward/backward movement of the first lens group 101.
  • Reference numeral 105 denotes a third lens group that adjusts the focal point by moving forward and backward in the optical axis direction.
  • Reference numeral 106 denotes an optical low-pass filter, which is an optical element for reducing false coloring and moire that appear in captured images.
  • Reference numeral 107 denotes an image sensor configured by a two-dimensional CMOS photosensor and peripheral circuitry, and this image sensor is arranged at the image forming plane of the imaging optical system.
  • Reference numeral 111 denotes a zoom actuator that performs a magnification operation by driving elements from the first lens group 101 to the third lens group 105 in the optical axis direction by rotating a cam barrel (not shown) .
  • Reference numeral 112 denotes an aperture/shutter actuator that adjusts the amount of captured light by controlling the opening diameter of the aperture/shutter 102, and also controls the exposure time in still image capturing.
  • Reference numeral 114 denotes a focus actuator that adjusts the focal point by driving the third lens group 105 forward and backward in the optical axis direction.
  • Reference numeral 115 denotes an electronic flash for subject illumination in image capturing, and is preferably a flash illumination apparatus that uses a xenon tube, but may be an illumination apparatus that includes a continuous-emission LED.
  • Reference numeral 116 denotes an AF auxiliary light apparatus that projects a mask image having a predetermined pattern of openings into the subject field via a projection lens so as to improve focus detection capability with respect to dark subjects and low-contrast subjects.
  • Reference numeral 121 denotes a CPU in the camera that performs various types of control with respect to the camera body, has an arithmetic portion, a ROM, a RAM, an A/D converter, a D/A converter, a communication interface circuit, and the like, and drives various circuits in the camera based on a predetermined program stored in the ROM.
  • This CPU also executes a series of operations such as AF, image capturing, image generation, and recording.
  • the CPU 121 is an image generation means, a parallax image generation means, a pixel shifted image generation means, and a super-resolution processing means of the present invention.
  • Reference numeral 122 denotes an electronic flash control circuit that performs control for lighting the electronic flash 115 in synchronization with an image capturing operation.
  • Reference numeral 123 denotes an auxiliary light driver circuit that performs control for lighting the AF auxiliary light apparatus 116 in synchronization with a focus detection operation.
  • Reference numeral 124 denotes an image sensor driver circuit that controls image capturing operations of the image sensor 107, and also subjects an acquired image signal to A/D conversion and transmits it to the CPU 121.
  • Reference numeral 125 denotes an image processing circuit that performs processing such as gamma conversion, color interpolation, and JPEG compression on an image that was acquired by the image sensor 107.
  • Reference numeral 126 denotes a focus driver circuit that adjusts the focal point by driving the focus actuator 114 so as to move the third lens group 105 forward and backward in the optical axis direction.
  • Reference numeral 128 denotes an aperture/shutter driver circuit that controls the opening diameter of the aperture/shutter 102 by driving the aperture/shutter actuator 112.
  • Reference numeral 129 denotes a zoom driver circuit that drives the zoom actuator 111 in accordance with a zoom operation that was performed by a photographer.
  • Reference numeral 131 denotes a display apparatus such as an LCD that displays information regarding the camera shooting mode, a preview image before image capturing, an image for checking after image capturing, an image indicating the focus state in focus detection, and the like.
  • Reference numeral 132 denotes an operation switch group that is configured by a power switch, a release (shooting trigger) switch, a zoom operation switch, a shooting mode selection switch, and the like.
  • Reference numeral 133 denotes a detachable flash memory that records captured images.
  • FIG. 2 is a schematic diagram of a pixel array and a sub-pixel array in the image sensor of the present embodiment.
  • FIG. 2 shows a 4×4 pixel array range in the pixel array and a 16×16 sub-pixel array range in the sub-pixel array of the two-dimensional CMOS sensor of the present embodiment.
  • The present embodiment describes an image sensor that has a pixel period Δ of 20 μm and an effective pixel count NLF of approximately 2.2 million pixels.
  • pixels 200G that have G (green) spectral sensitivity are arranged as pixels at diagonal positions, and a pixel 200R that has R (red) spectral sensitivity and a pixel 200B that has B (blue) spectral sensitivity are arranged as the other two pixels.
  • Sub-pixels 201 to 216 are arranged in a two-dimensional Nθ×Nθ (here 4×4) array in each pixel.
  • FIG. 3A is a plan view of one of the pixels 200G shown in FIG. 2.
  • FIG. 3B is a cross-sectional view of the same pixel taken along a-a in FIG. 3A and viewed from the -y side.
  • The pixel 200G of the present embodiment is provided with a microlens 305 for focusing incident light on the light-receiving side of the pixel, and with photo-electric converters 301 to 316 that are divided into Nθ×Nθ (4×4) areas.
  • The photo-electric converters 301 to 316 respectively correspond to the sub-pixels 201 to 216 in FIG. 2.
  • the photo-electric converters 301 to 316 may be pin-structure photodiodes in which an intrinsic layer is sandwiched between a p layer and an n layer, or, as necessary, may be pn-junction photodiodes in which the intrinsic layer is omitted.
  • A color filter 306 is formed between the microlens 305 and the photo-electric converters 301 to 316. Also, for each sub-pixel, the spectral transmittance of the color filter may be changed, or the color filter may be omitted, as necessary.
  • Light that enters the pixel 200G shown in FIGS. 3A and 3B is focused by the microlens 305, filtered by the color filter 306, and then received by the photo-electric converters 301 to 316.
  • In each photo-electric converter, electron-hole pairs are generated according to the amount of received light and separated by a depletion layer; the negatively charged electrons are accumulated in the n layer (not shown), whereas the holes are discharged outside the image sensor via the p layer, which is connected to a constant voltage source (not shown).
  • FIG. 4 is a schematic diagram showing the correspondence relationship between pupil division and the photo-electric converters 301 to 316 (sub-pixels 201 to 216) .
  • FIG. 4 is a cross-sectional view of the pixel 200G shown in FIG. 3A taken along a-a and viewed from the +y side, and shows the exit pupil face of the imaging optical system.
  • the x axis and the y axis in the cross-sectional view are the opposite of those in FIGS. 3A and 3B in order to correspond to the coordinate axes of the exit pupil face.
  • The image sensor is arranged in the vicinity of the image forming plane of the imaging optical system.
  • The pupil sub-areas 501 to 516 are in an approximately conjugate relationship, via the microlens, with the light-receiving faces of the photo-electric converters 301 to 316 (sub-pixels 201 to 216).
  • The exit pupil 400 of the imaging optical system is divided into Np pupil sub-areas.
  • A pupil area 500 is the pupil area from which the entire pixel 200G can receive light when all of the photo-electric converters 301 to 316 (sub-pixels 201 to 216) that are divided into Nθ×Nθ (4×4) areas are combined.
  • FIG. 5 is a schematic diagram showing the correspondence between the photo-electric converters 301 to 316 in each pixel of the image sensor and the pupil sub-areas.
  • An input image is acquired by the image sensor that has an array of multiple pixels, each of which has arranged therein multiple sub-pixels that each receive a light beam that passes through a different pupil sub-area of the imaging optical system.
  • A parallax image that corresponds to a specified pupil sub-area among the pupil sub-areas 501 to 516 of the imaging optical system can be obtained by, for each pixel, selecting a signal from a specified sub-pixel among the sub-pixels 201 to 216 (photo-electric converters 301 to 316).
  • For example, a parallax image that corresponds to the pupil sub-area 509 of the imaging optical system can be obtained by selecting the signal from the sub-pixel 209 (photo-electric converter 309) for each pixel. The same follows for the other sub-pixels as well. Based on the input image acquired by the image sensor of the present embodiment, multiple (pupil division count Np) parallax images that respectively correspond to the different pupil sub-areas and have a resolution equal to the effective pixel count can be generated.
  • a captured image with a resolution equal to the effective pixel count can be generated by adding together all of the signals from the sub-pixels 201 to 216 for each pixel.
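As a concrete (and hypothetical) illustration of the two read-outs just described, assume the raw light-field data is held as a NumPy array of shape (H, W, 16), one entry per sub-pixel 201 to 216. The array layout and the sub-pixel ordering are assumptions for illustration, not the sensor's actual data format.

```python
import numpy as np

raw = np.random.rand(120, 160, 16)   # placeholder LF data: H x W pixels, 16 sub-pixels each

# Parallax image for one pupil sub-area, e.g. the one paired with sub-pixel 209
# (index 8 under the assumed ordering 201..216):
parallax_509 = raw[:, :, 8]

# All Np = 16 parallax images, each with a resolution equal to the effective pixel count:
parallax_images = [raw[:, :, a] for a in range(raw.shape[-1])]

# Conventional captured image: the sum of all sub-pixel signals in each pixel.
captured = raw.sum(axis=-1)
```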
  • FIG. 6 is a schematic diagram of the angles of incidence on the sub-pixels arrayed one-dimensionally in the x direction.
  • Let Δθ be the angular resolution of the pupil division, determined by the exit pupil of the imaging optical system.
  • The sub-pixels 212 to 209 receive light beams that have angles of incidence θ0 to θ3 respectively. Light beams whose angles of incidence fall within a range of the angular resolution Δθ enter each sub-pixel.
  • FIG. 7 is a schematic diagram for describing refocus processing.
  • At the i-th pixel Xi arranged on the image sensing plane, the light beams that entered at the angles θa are received by the sub-pixels and acquired as the sub-pixel signals Li,a.
  • LF data, which is information on a spatial distribution of light intensity and an angle distribution, can thereby be acquired; as described above, the LF data is constituted by multiple parallax images respectively corresponding to the different pupil sub-areas.
  • A refocused image can be generated at a virtual image forming plane by translating all of the sub-pixel signals Li,a along the respective angles θa from the image sensing plane to the virtual image forming plane, distributing the signals to virtual pixels in the virtual image forming plane, and then performing weighted addition.
  • the coefficient used in the weighted addition is determined such that all of the values are positive and have a sum of 1.
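The refocusing step can be sketched as a shift-and-add over the parallax images. The linear shift law d·tan(θ) and the uniform weights below are illustrative assumptions, not the patent's formulas; scipy.ndimage.shift is used only because it performs the generally non-integral translation by interpolation.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift


def refocus(parallax_images, angles_x, angles_y, d):
    """Shift-and-add refocus sketch.

    parallax_images: list of 2-D arrays L_a (one per pupil sub-area)
    angles_x, angles_y: per-image incidence angles in radians (assumed known)
    d: signed distance from the image sensing plane to the virtual image
       forming plane, in pixel-pitch units (illustrative shift law d*tan(theta))
    """
    n = len(parallax_images)
    weights = np.full(n, 1.0 / n)          # positive coefficients that sum to 1
    out = np.zeros_like(parallax_images[0], dtype=float)
    for w, img, tx, ty in zip(weights, parallax_images, angles_x, angles_y):
        # Non-integral translation by interpolation, then weighted addition.
        out += w * nd_shift(img, (d * np.tan(ty), d * np.tan(tx)), order=1)
    return out


# Made-up usage with a 2x2 pupil division:
imgs = [np.random.rand(64, 64) for _ in range(4)]
refocused = refocus(imgs, angles_x=[-0.01, 0.01, -0.01, 0.01],
                    angles_y=[-0.01, -0.01, 0.01, 0.01], d=5.0)
```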
  • The exit pupil of the imaging optical system having the aperture value F decreases in area upon being divided into Nθ×Nθ pupil sub-areas, and the effective aperture value of each pupil sub-area increases to NθF.
  • The third member in Expression (1) shows that refocusing can be performed in the range in which the effective aperture value NθF of the pupil sub-areas increases and the focal depth increases.
  • In step S100, an input image is acquired by the image sensor that has an array of multiple pixels, each of which has arranged therein multiple sub-pixels (the sub-pixels 201 to 216) that each receive a light beam that passes through a different pupil sub-area of the imaging optical system. It is also possible to use an input image that was captured by the image sensor having the above configuration in advance and stored in a recording medium.
  • In step S200, a parallax image that corresponds to a specified pupil sub-area among the pupil sub-areas 501 to 516 of the imaging optical system is generated by, for each pixel, selecting a signal from a specified sub-pixel among the sub-pixels 201 to 216 of the input image. Based on the input image, multiple parallax images that respectively correspond to the different pupil sub-areas and have a resolution equal to the effective pixel count are generated.
  • In step S300, multiple pixel shifted images are generated by performing different non-integral shifting for each of the parallax images generated in step S200, according to a virtual image forming plane of the imaging optical system that is different from the image sensing plane at which the image sensor is arranged.
  • FIG. 9 is a diagram for describing pixel shift in parallax images. Although the following describes only the x direction for the sake of simplicity, the same applies to the y direction as well.
  • An image is generated by, for each pixel, selecting the sub-pixel signal Li,a that entered at the a-th angle θa, and that image is the parallax image that corresponds to the a-th pupil sub-area.
  • Multiple pixel shifted images are generated by performing translation along the angle θa for each of the parallax images to a virtual image forming plane that is different from the image sensing plane.
  • a distance d between the image sensing plane and the virtual image forming plane is set such that the amount of shift in the horizontal direction is a non-integer.
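A minimal sketch of this step, under one concrete assumption: the virtual image forming plane is chosen so that the a-th parallax image is translated in the x direction by a/Nθ of the pixel period, i.e. by a non-integral number of pixels. The shift law and interpolation order are illustrative, not the patent's formulas.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift

N_THETA = 4                                                      # 4x4 pupil division, as in FIG. 2
parallax_x = [np.random.rand(64, 64) for _ in range(N_THETA)]    # placeholder parallax images

# Pixel shifted images: the a-th parallax image is translated in x by the
# non-integral amount a/N_THETA of a pixel (assumed shift law, linear interpolation).
pixel_shifted = [
    nd_shift(img, (0.0, a / N_THETA), order=1)
    for a, img in enumerate(parallax_x)
]
```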
  • In step S400, super-resolution processing is performed such that an output image whose resolution is higher than the resolution of each of the parallax images is generated from the multiple pixel shifted images that were generated in step S300.
  • FIG. 9 shows this relationship in the x direction.
  • Expression (2) is a relational expression between the super-resolution pixel signal and the sub-pixel signals arrayed one-dimensionally.
  • In step S400, an output image (the super-resolution pixel signal) is generated through super-resolution processing in which the inverse of the matrix in Expression (3) is obtained and compositing is performed using the relational expression of Expression (4).
  • The inverse matrix may be obtained in advance as necessary.
  • A configuration is also possible in which, as necessary, the super-resolution pixel signal, the inverse matrix, and the sub-pixel signals in Expressions (3) and (4) are respectively subjected to Fourier transformation, super-resolution processing is performed in the frequency space, and then inverse Fourier transformation is performed.
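Since the matrix of Expressions (2) to (4) is not reproduced in this text, the following 1-D toy stands in for the composition step: each pixel shifted image is modeled as point-sampling a high-resolution signal at its own sub-pixel offset, the samples are stacked into a linear system, and the high-resolution signal is recovered by solving it. The forward model is an assumption for illustration only; as the text notes, the inverse could equally be precomputed, or the whole solve moved to the Fourier domain.

```python
import numpy as np

N_THETA, W = 4, 32                     # pupil divisions and low-resolution width (toy sizes)
HI = W * N_THETA                       # high-resolution width

# Hypothetical forward model: the a-th pixel shifted image point-samples the
# high-resolution signal at offset a/N_THETA (in low-resolution pixel units).
A = np.zeros((N_THETA * W, HI))
for a in range(N_THETA):
    for i in range(W):
        A[a * W + i, i * N_THETA + a] = 1.0

truth = np.sin(np.linspace(0.0, 6.0 * np.pi, HI))   # stand-in high-resolution scene
observations = A @ truth                            # stacked pixel shifted images
recovered = np.linalg.solve(A, observations)        # composition by inverting the system

print(np.allclose(recovered, truth))                # True: the fine grid is recovered
```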
  • As necessary, dark correction, shading correction, demosaicing processing, and the like may be performed on one or a combination of the input image, the parallax images, the pixel shifted images, and the output image.
  • The present embodiment is one example of an image capture apparatus that has an image generation means for performing the above-described image generation method.
  • The present embodiment is one example of a display apparatus that has an image generation means for performing the above-described image generation method.
  • a captured image that has a high spatial resolution can be generated from multiple parallax images.
  • In the second embodiment, the processing up to step S200, in which multiple parallax images that respectively correspond to the different pupil sub-areas and have a resolution equal to the effective pixel count are generated based on the input image, is similar to that in the first embodiment.
  • In step S310, translation along the angle θa is performed, in the x direction only, for each parallax image, and multiple x-direction pixel shifted images are generated by performing x-direction non-integral shifting (shifting by the non-integral factor 1/Nθ of the pixel period Δ).
  • In step S410, multiple x-direction super-resolution images are generated by solving the relational expression of Expression (5).
  • Expression (5) can be explicitly described as the recurrence formulas in Expressions (6a) to (6d).
  • The recurrence formulas in Expressions (6a) to (6d) can be sequentially solved for the super-resolution pixel signal, and there is no need to obtain the inverse of the matrix in Expression (5), thus making it possible to simplify the arithmetic processing. In this way, x-direction super-resolution processing is performed through steps S310 and S410.
  • In step S320, translation along the angle θb is performed, in the y direction only, for each x-direction pixel shifted image, and multiple y-direction pixel shifted images are generated by performing y-direction non-integral shifting (shifting by the non-integral factor 1/Nθ of the pixel period Δ).
  • In step S420, the recurrence formulas expressing the relationship between the y-direction pixel shifted images and the super-resolution pixel signal are sequentially solved, and thus an output image (the super-resolution pixel signal) is generated.
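The recurrence formulas (5) and (6a) to (6d) are not reproduced in this text, so the sketch below only illustrates the stated computational point: when the relation between the pixel shifted signals and the super-resolution signal can be ordered into a triangular system, it is solved sequentially by forward substitution with no explicit inverse matrix, and doing this separably (first along x, then along y) keeps each solve one-dimensional. The toy matrix below is an assumption, not the patent's system.

```python
import numpy as np
from scipy.linalg import solve_triangular

n = 8
A = np.tril(np.random.rand(n, n)) + n * np.eye(n)   # toy lower-triangular system A x = b
x_true = np.random.rand(n)
b = A @ x_true

# Sequential (recurrence-style) solve: each unknown follows from the ones already found.
x = np.zeros(n)
for k in range(n):
    x[k] = (b[k] - A[k, :k] @ x[:k]) / A[k, k]

assert np.allclose(x, x_true)
assert np.allclose(x, solve_triangular(A, b, lower=True))
```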
  • a captured image that has a high spatial resolution can be generated from multiple parallax images.
  • Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiments, and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiments.
  • The program is provided to the computer, for example, via a network or from a recording medium of various types serving as the memory device (e.g., a computer-readable medium).

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Power Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Electromagnetism (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • Computer Hardware Design (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Optics & Photonics (AREA)
  • Theoretical Computer Science (AREA)
  • Computing Systems (AREA)
  • Studio Devices (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Automatic Focus Adjustment (AREA)
PCT/JP2012/082116 2012-01-13 2012-12-05 Image generation method, image generation apparatus, program, and storage medium WO2013105383A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US14/370,417 US9344624B2 (en) 2012-01-13 2012-12-05 Parallax image generation method, image generation apparatus, program, and storage medium
CN201280066884.XA CN104041006B (zh) 2012-01-13 2012-12-05 图像生成方法以及图像生成设备
US15/130,367 US9609208B2 (en) 2012-01-13 2016-04-15 Image generation method, image generation apparatus, program, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012005661A JP5818697B2 (ja) 2012-01-13 2012-01-13 画像生成方法、撮像装置および表示装置、プログラム、記憶媒体
JP2012-005661 2012-01-13

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US14/370,417 A-371-Of-International US9344624B2 (en) 2012-01-13 2012-12-05 Parallax image generation method, image generation apparatus, program, and storage medium
US15/130,367 Continuation US9609208B2 (en) 2012-01-13 2016-04-15 Image generation method, image generation apparatus, program, and storage medium

Publications (1)

Publication Number Publication Date
WO2013105383A1 true WO2013105383A1 (en) 2013-07-18

Family

ID=48781343

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/082116 WO2013105383A1 (en) 2012-01-13 2012-12-05 Image generation method, image generation apparatus, program, and storage medium

Country Status (4)

Country Link
US (2) US9344624B2 (ja)
JP (1) JP5818697B2 (ja)
CN (1) CN104041006B (ja)
WO (1) WO2013105383A1 (ja)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016139895A1 (en) 2015-03-03 2016-09-09 Canon Kabushiki Kaisha Image processing apparatus, image pickup apparatus, image processing method, program, and storage medium

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5584270B2 (ja) 2012-11-08 2014-09-03 オリンパス株式会社 撮像装置
US9667846B2 (en) * 2012-11-27 2017-05-30 Nokia Technologies Oy Plenoptic camera apparatus, a method and a computer program
JP6188531B2 (ja) * 2013-10-22 2017-08-30 キヤノン株式会社 撮像装置、その制御方法およびプログラム
KR102195407B1 (ko) 2015-03-16 2020-12-29 삼성전자주식회사 이미지 신호 프로세서와 이를 포함하는 장치들
JP6509031B2 (ja) * 2015-05-14 2019-05-08 キヤノン株式会社 画像処理装置、画像処理方法及び撮像装置
JP6579859B2 (ja) * 2015-08-11 2019-09-25 キヤノン株式会社 撮像装置、画像処理装置、画像処理方法およびプログラム
JP6541503B2 (ja) * 2015-08-11 2019-07-10 キヤノン株式会社 撮像装置、画像処理装置、画像処理方法およびプログラム
JP6723709B2 (ja) * 2015-09-11 2020-07-15 キヤノン株式会社 撮像装置、画像処理装置及びそれらの制御方法
JP6628617B2 (ja) * 2016-01-25 2020-01-15 キヤノン株式会社 画像処理装置及び画像処理方法
US20170302844A1 (en) * 2016-04-13 2017-10-19 Canon Kabushiki Kaisha Image capturing apparatus, control method therefor, and storage medium
EP3261328B1 (en) 2016-06-03 2021-10-13 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and computer-readable storage medium
CN106488148B (zh) * 2016-11-01 2019-09-17 首都师范大学 一种超分辨率图像传感器及其构造方法
CN106713707B (zh) * 2016-11-18 2019-08-09 成都微晶景泰科技有限公司 透镜阵列成像方法及装置
KR102637105B1 (ko) * 2018-07-13 2024-02-15 삼성전자주식회사 영상 데이터를 처리하는 방법 및 장치
CN115272449A (zh) * 2021-04-29 2022-11-01 华为技术有限公司 图像处理方法及相关设备

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007004471A (ja) * 2005-06-23 2007-01-11 Nikon Corp 画像合成方法及び撮像装置
JP2008294741A (ja) * 2007-05-24 2008-12-04 Olympus Corp 撮像システム
JP2013042443A (ja) * 2011-08-19 2013-02-28 Canon Inc 画像処理方法、撮像装置、画像処理装置、および、画像処理プログラム

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4410804A (en) 1981-07-13 1983-10-18 Honeywell Inc. Two dimensional image panel with range measurement capability
JP3774597B2 (ja) 1999-09-13 2006-05-17 キヤノン株式会社 撮像装置
WO2007060847A1 (ja) * 2005-11-22 2007-05-31 Matsushita Electric Industrial Co., Ltd. 撮像装置
US7962033B2 (en) * 2008-01-23 2011-06-14 Adobe Systems Incorporated Methods and apparatus for full-resolution light-field capture and rendering
KR101608970B1 (ko) * 2009-11-27 2016-04-05 삼성전자주식회사 광 필드 데이터를 이용한 영상 처리 장치 및 방법
US8749620B1 (en) * 2010-02-20 2014-06-10 Lytro, Inc. 3D light field cameras, images and files, and methods of using, operating, processing and viewing same

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007004471A (ja) * 2005-06-23 2007-01-11 Nikon Corp 画像合成方法及び撮像装置
JP2008294741A (ja) * 2007-05-24 2008-12-04 Olympus Corp 撮像システム
JP2013042443A (ja) * 2011-08-19 2013-02-28 Canon Inc 画像処理方法、撮像装置、画像処理装置、および、画像処理プログラム

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016139895A1 (en) 2015-03-03 2016-09-09 Canon Kabushiki Kaisha Image processing apparatus, image pickup apparatus, image processing method, program, and storage medium
CN107431755A (zh) * 2015-03-03 2017-12-01 佳能株式会社 图像处理设备、摄像设备、图像处理方法、程序和存储介质
EP3266195A4 (en) * 2015-03-03 2018-10-03 C/o Canon Kabushiki Kaisha Image processing apparatus, image pickup apparatus, image processing method, program, and storage medium
US10171732B2 (en) 2015-03-03 2019-01-01 Canon Kabushiki Kaisha Image processing apparatus, image pickup apparatus, image processing method, and non-transitory computer-readable storage medium for generating an image based on plurality of parallax images
US10674074B2 (en) 2015-03-03 2020-06-02 Canon Kabushiki Kaisha Image processing apparatus, image pickup apparatus, image processing method, and non-transitory computer-readable storage medium for generating an image based on a plurality of parallax images
CN107431755B (zh) * 2015-03-03 2020-10-30 佳能株式会社 图像处理设备、摄像设备、图像处理方法和存储介质

Also Published As

Publication number Publication date
CN104041006A (zh) 2014-09-10
JP5818697B2 (ja) 2015-11-18
JP2013145979A (ja) 2013-07-25
US20160234435A1 (en) 2016-08-11
US9344624B2 (en) 2016-05-17
US20140368690A1 (en) 2014-12-18
CN104041006B (zh) 2017-10-10
US9609208B2 (en) 2017-03-28

Similar Documents

Publication Publication Date Title
US9609208B2 (en) Image generation method, image generation apparatus, program, and storage medium
US9490281B2 (en) Image sensor and image capturing apparatus
JP6071374B2 (ja) Image processing apparatus, image processing method and program, and imaging apparatus including the image processing apparatus
US9742984B2 (en) Image capturing apparatus and method of controlling the same
KR101362241B1 (ko) Imaging apparatus
WO2013046973A1 (ja) Solid-state image sensor, imaging apparatus, and focus control method
JP6249825B2 (ja) Imaging apparatus, control method therefor, and control program
JP6174940B2 (ja) Image sensor and imaging apparatus
JP2012059845A (ja) Image sensor and imaging apparatus
JP2009217074A (ja) Focus detection apparatus
CN107431755B (zh) Image processing apparatus, image pickup apparatus, image processing method, and storage medium
US20140071322A1 (en) Image pickup apparatus with image pickup device and control method for image pickup apparatus
US20170257583A1 (en) Image processing device and control method thereof
JP2016111678A (ja) Image sensor, imaging apparatus, focus detection apparatus, image processing apparatus, and control method thereof
US10992859B2 (en) Image capture apparatus and control method thereof capable of reducing an image data amount
JP5961208B2 (ja) Image sensor and imaging apparatus
CN113596431B (zh) Image processing apparatus, image capturing apparatus, image processing method, and storage medium
JP2020127179A (ja) Image processing apparatus
JP2019092215A (ja) Image processing apparatus, imaging apparatus, image processing method, program, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12864986

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 14370417

Country of ref document: US

122 Ep: pct application non-entry in european phase

Ref document number: 12864986

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE