WO2015075303A1 - Plenoptic image processing for an extended depth of field - Google Patents
- Publication number
- WO2015075303A1 (PCT/FI2014/050836)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- plenoptic
- image
- plenoptic image
- recorded
- depth
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0075—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for altering, e.g. increasing, the depth of field or depth of focus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/73—Deblurring; Sharpening
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/957—Light-field or plenoptic cameras or camera modules
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/21—Indexing scheme for image data processing or generation, in general involving computational photography
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10052—Images from lightfield camera
Definitions
- Embodiments of the present invention relate to image processing. In particular, they relate to processing plenoptic images.
- a plenoptic (or light field) camera simultaneously captures an image of a scene, through each one of multiple optics.
- the multiple optics may be provided, for example, as an array of micro-lenses, apertures or masks.
- a method comprising: performing digital signal processing on a recorded plenoptic image of a scene using a depth-invariant point spread function, which distorts the recorded plenoptic image, to produce and record a modified plenoptic image of the scene, wherein the modified plenoptic image of the scene when convolved with the depth-invariant point spread function reproduces the recorded plenoptic image.
- an apparatus comprising: processing circuitry; and at least one memory storing computer program code configured, working with the processing circuitry, to cause the method as claimed in one or more of claims 1 to 10 to be performed.
- an apparatus comprising: camera optics; an array of plenoptic camera optics; one or more image sensors comprising a plurality of sensels for capturing the recorded plenoptic image; a filter that has the depth-invariant point spread function; and circuitry configured to perform digital signal processing on a recorded plenoptic image of a scene using the depth-invariant point spread function to produce and record a modified plenoptic image of the scene.
- Fig. 1 illustrates an example of a plenoptic camera comprising a filter with a depth-invariant point spread function;
- Fig. 2 illustrates an example of the plenoptic camera of Fig. 1, configured as a focused plenoptic camera;
- Fig. 3 illustrates processing circuitry;
- Fig. 4 illustrates an apparatus comprising a plenoptic camera, for example as illustrated in Fig. 1 or 2, and processing circuitry, for example as illustrated in Fig. 3; and
- Fig. 5 illustrates an example of a method.
- Embodiments of the invention relate to processing a plenoptic image to obtain an output image with a large depth of field.
- a large depth of field may be obtained by filtering a light field in a plenoptic camera using a filter 100 with a depth-invariant point spread function before recording the light field as a recorded plenoptic image, and then performing digital signal processing on the recorded plenoptic image using the depth-invariant point spread function, which distorts the recorded plenoptic image, to produce and record a modified plenoptic image.
- the modified plenoptic image of the scene when convolved with the depth-invariant point spread function produces the recorded plenoptic image.
- the figures illustrate an apparatus 10 comprising: processing circuitry 12; and at least one memory 14 storing computer program code 18 configured, working with the processing circuitry 12, to cause at least the following to be performed: performing digital signal processing on the recorded plenoptic image using the depth-invariant point spread function that distorts the recorded plenoptic image to produce and record a modified plenoptic image, wherein the modified plenoptic image of the scene when convolved with the depth-invariant point spread function produces the recorded plenoptic image.
- Fig 1 illustrates an example of a plenoptic camera apparatus 2 that images a scene comprising at least an object O.
- the plenoptic camera apparatus 2 may be an imaging device. It may, for example be a camera or a multi-functional device with a plenoptic camera as one of its functions.
- the apparatus 2 may be a portable device, that is, a device that is configured to be carried by a user.
- the apparatus 2 may be a hand-portable device, that is, a device that is sized to be carried in a palm of a user and capable of fitting in an inside jacket pocket. If the plenoptic camera apparatus 2 is a hand-portable multi-functional device, such as a mobile cellular telephone, then it may be desirable for an external aperture in a housing 6 for the plenoptic camera to be small.
- This example of a plenoptic camera apparatus 2 comprises, within a housing 6, camera optics 3, an array 24 of plenoptic camera optics 4 and an image sensor 26 comprising a plurality of sensels 5.
- the plenoptic camera apparatus 2 comprises a filter 100 that has a depth-invariant point spread function.
- the camera optics 3, the array 24 of plenoptic camera optics 4, the filter 100 and the image sensor 26 are arranged, in series, along an optical axis of the plenoptic camera apparatus 2.
- the camera optics 3 comprises an aperture and/or one or more lenses.
- an optical plane 7 of the camera optics 3 is normal to the optical axis of the plenoptic camera apparatus 2.
- the array 24 of plenoptic camera optics 4 occupies an optical plane 8 normal to the optical axis of the plenoptic camera apparatus 2 and parallel to the optical plane 7 of the camera optics 3.
- Each plenoptic camera optic 4 comprises an aperture, a mask or a lens.
- the array 24 may be an array of micro-lenses, apertures or masks.
- each of the plenoptic camera optics is the same, for example, each micro-lens in the array 24 of plenoptic micro-lenses may have the same focal length.
- the image sensor 26 comprises an array of sensels 5 in an imaging plane 9 normal to the optical axis of the plenoptic camera apparatus 2 and parallel to the optical plane 7 of the camera optics 3 and the optical plane 8 of the array 24 of plenoptic camera optics 4.
- although a single image sensor 26 is illustrated, a plurality of separate image sensors may be used instead of a single image sensor 26.
- An image sensel 5 is a sensor element. It is the sensing equivalent to a pixel (picture element). The data recorded by a sensel 5 when reproduced as an image corresponds to a pixel.
- a light field between the camera optics 3 and the image sensor 26 is composed of a plurality of different light rays originating from different parts of the scene. Each such scene part is at a distance (depth) from the plenoptic camera apparatus 2.
- Each sensel records the light rays incident on it. The collection of recordings made by the plurality of sensels 5 represents the radiance of the scene.
- the filter 100 has a depth-invariant point spread function (PSF).
- PSF point spread function
- the filter 100 is in the optical path to the recorded plenoptic image.
- the filter 100 creates a distortion in an incident light ray that is independent of the depth of the object O from which the light ray originates.
- the distortion spreads or blurs the light ray in a manner determined by the filter's point spread function.
- the light ray exiting the filter 100 is equivalent to the light ray entering the filter 100 convolved with the point spread function (PSF).
- the PSF may be considered as the impulse response of the filter 100.
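As a minimal numerical sketch (the kernel values are invented for illustration, not taken from the patent): because the PSF is depth-invariant, a point of light is blurred identically whether it originates from a near or a far object, which is what later allows the distortion to be undone without per-pixel depth estimates.

```python
import numpy as np

# A made-up depth-invariant PSF: the filter's impulse response.
psf = np.array([0.25, 0.5, 0.25])

# Point sources originating at different scene depths. With a depth-invariant
# PSF the blur they receive is the same, so depth need not be known to undo it.
near_point = np.array([0.0, 0.0, 1.0, 0.0, 0.0])
far_point = np.array([0.0, 0.0, 1.0, 0.0, 0.0])

# Light exiting the filter = light entering the filter convolved with the PSF.
blur_near = np.convolve(near_point, psf, mode="same")
blur_far = np.convolve(far_point, psf, mode="same")

assert np.allclose(blur_near, blur_far)  # identical blur at every depth
```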
- while the PSF is invariant to the depth of an object, it is not necessarily invariant to the focus settings of the plenoptic camera 2: if the focus of the camera optics 3 changes, then the PSF may change to a different depth-invariant PSF.
- the filter 100 occupies a filter plane 101 normal to the optical axis of the plenoptic camera apparatus 2 and parallel to the optical plane 8 of the array 24 of plenoptic camera optics 4.
- the filter 100 may be implemented in a number of different ways.
- the filter 100 may be an optical phase modulator (for example a phase plate or mask) that introduces a phase change to an incident light ray that depends upon where on the filter 100 the light ray is incident.
- the phase variation across the phase modulator may be determined by a mathematical nonlinear function that has been determined to provide a depth-invariant PSF.
- a phase modulator may introduce a phase change that is dependent upon the third power of the distance in the filter plane 101 from the optical axis.
- phase variation across the phase modulator may be random or pseudorandom.
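As an illustrative sketch (not taken from the patent): a cubic phase mask of the kind used in wavefront coding gives a phase delay that grows with the third power of the in-plane coordinates. The constant `alpha` and the grid size are assumed design values.

```python
import numpy as np

alpha = 20.0                      # assumed design constant of the phase mask
n = 64                            # assumed sampling of the filter plane 101
x = np.linspace(-1.0, 1.0, n)     # normalized coordinates in the filter plane
X, Y = np.meshgrid(x, x)

# Phase delay varying with the third power of the in-plane coordinates.
phase = alpha * (X**3 + Y**3)

# The mask only modulates phase, so it absorbs no light: every sample of the
# pupil function has unit magnitude.
pupil = np.exp(1j * phase)
assert np.allclose(np.abs(pupil), 1.0)
```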
- the phase modulator may modulate the light rays falling on each micro-lens (i.e., each plenoptic camera optic 4) such that the image formed under each micro-lens (referred to as a micro-image, elemental image or sub-image) is depth invariant, that is, the PSF of each micro-lens image is depth invariant.
- the microlenses 4 could have spherical aberrations introduced such that they have a PSF which is depth invariant.
- while the aim would be to have the same PSF for each micro-lens, this is not compulsory: one can store the PSF for each micro-lens and use it to deconvolve the captured image to obtain all-in-focus micro-images of the plenoptic image.
- the filter 100 introduces a depth-invariant blur which will later be removed by processing the plenoptic image recorded by the sensels 5 of the image sensor 26.
- the incident light X that is incident on an optical system is modified at each stage n of the optical system to produce output light Y.
- the impact of each stage n on the light incident at that stage is modeled as an optical filter that can transform an incident light ray to a different exiting light ray using a transform function T_n.
- the convolution of the light ray X_n incident to that stage with the transform function T_n for that stage produces the light ray Y_n exiting that stage, which is also the light ray X_n+1 incident to the next stage.
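The staged model can be sketched as a chain of convolutions. The stage kernels below are invented examples; because convolution is associative, the cascade behaves like a single filter whose response is the convolution of all stage responses.

```python
import numpy as np

# Made-up stage responses T_n (not from the patent).
stages = [
    np.array([0.5, 0.5]),          # T_1, e.g. camera optics
    np.array([0.25, 0.5, 0.25]),   # T_2, e.g. the filter 100
]

x = np.zeros(9)
x[4] = 1.0                         # X_1: a single incident light ray

y = x
for t_n in stages:                 # Y_n = X_n * T_n, then X_n+1 = Y_n
    y = np.convolve(y, t_n, mode="same")

# The whole system is equivalent to one filter: the convolution of the stages.
combined = np.convolve(stages[0], stages[1])
assert np.allclose(y, np.convolve(x, combined, mode="same"))
```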
- the order of the stages may be changed. Therefore the position of the filter 100 may be changed, with appropriate changes to the filter if required. Furthermore, if the effect of the transform function T_f for the filter 100 is known, its effect may be removed by digital signal processing, as discussed later, to convert a recorded plenoptic image to a modified plenoptic image.
- the purpose of the filter 100 is to introduce depth-invariance distortion to the optical system in a controlled manner so that the effect of the depth-invariant distortion may be removed after capturing the plenoptic image using post-capture processing.
- the filter 100 may therefore be positioned between the array 24 of plenoptic camera optics 4 and the image sensor 26 or may be positioned between the array 24 of plenoptic camera optics 4 and the camera optics 3 or it may be integrated with the array 24 of plenoptic camera optics 4 or the camera optics 3.
- although the filter 100 illustrated in Fig. 1 is a single filter, in other examples the filter 100 may comprise a plurality of sub-filters, each of which is positioned at a different position along the optical axis.
- Fig. 2 illustrates an example of the plenoptic camera 2 of Fig. 1, configured as a focused plenoptic camera (also known as Plenoptic Camera 2.0).
- the focused plenoptic camera comprises a main lens 22 as the camera optics 3 and comprises micro-lenses in the array 24 of plenoptic camera optics 4.
- the main lens 22 images a real-life scene.
- the array 24 of micro-lenses is focused on the (real) image plane 23 of the main lens 22.
- Each micro-lens conveys a portion of the (real) image produced by the main lens 22 onto an image sensor 26, effectively acting as a relaying system.
- Each micro-lens satisfies the lens equation 1/a + 1/b = 1/f, where a is the distance from the micro-lens to the plane it images, b is the distance from the micro-lens to the image sensor 26 and f is the focal length of the micro-lens.
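As a numerical sketch of the thin-lens relation 1/a + 1/b = 1/f that each micro-lens satisfies (a: distance from the micro-lens to the plane it relays, b: distance from the micro-lens to the sensor, f: focal length). The values below are illustrative, not taken from the patent.

```python
f = 0.5  # assumed micro-lens focal length (mm)
a = 2.0  # assumed distance to the main lens's image plane (mm)

# Solve the lens equation 1/a + 1/b = 1/f for the sensor-side distance b.
b = 1.0 / (1.0 / f - 1.0 / a)

assert abs(1.0 / a + 1.0 / b - 1.0 / f) < 1e-12

# Magnification of the relayed sub-image for a relaying micro-lens.
m = b / a
```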
- the focused plenoptic camera 2 illustrated in Fig. 2 is a Keplerian focused plenoptic camera in which the real image plane 23 of the main lens 22 is positioned between the main lens 22 and the micro-lens array 24.
- the array 24 of micro-lenses is focused on the real image plane 23 of the main lens 22.
- Each micro-lens conveys a portion of the real image produced by the main lens 22 onto an image sensor 26.
- the focused plenoptic camera 2 may alternatively be configured as a Galilean focused plenoptic camera.
- the micro-lens array 24 is placed between the main lens 22 and the image plane 23 of the main lens 22.
- the array 24 of micro-lenses is focused on the virtual image plane 23 of the main lens 22.
- Each micro-lens conveys a portion of the virtual image produced by the main lens 22 onto an image sensor 26.
- Each micro-lens 4 forms a sub-image on the image sensor 26.
- the sub-images collectively form a plenoptic image (otherwise known as a "light-field image").
- because each micro-lens has a different position from the others, a disparity exists when comparing the location of a particular scene point in a sub-image formed by one micro-lens with the location of the same scene point in another sub-image formed by another micro-lens. That is, there will be an offset in the location of a particular scene point in one sub-image relative to the location of the same scene point in another sub-image.
- because each micro-lens conveys only part of the image formed by the main lens 22 onto the image sensor 26, individual points in the scene will be imaged by some micro-lenses and not others. This means that each point in the scene will be present in only a subset of the sub-images.
- Fig. 3 illustrates a first apparatus 10 comprising processing circuitry 12 and a memory 14.
- the apparatus 10 may, for example, be a chip or a chipset.
- the processing circuitry 12 is configured to read from and write to the memory 14.
- the processing circuitry 12 may comprise an output interface via which data and/or commands are output by the processing circuitry 12 and an input interface via which data and/or commands are input to the processing circuitry 12.
- the processing circuitry 12 may be or comprise one or more processors.
- the processing circuitry 12 may include an analog to digital converter.
- the memory 14 stores a computer program 17 comprising computer program instructions/code 18 that control the operation of the apparatus 10 when loaded into the processing circuitry 12.
- the computer program code 18 provides the logic and routines that enable the apparatus 10 to perform the methods illustrated in Fig 5.
- the processing circuitry 12, by reading the memory 14, is able to load and execute the computer program 17.
- although the memory 14 is illustrated as a single component, it may be implemented as one or more separate components, some or all of which may be integrated/removable and/or may provide permanent/semi-permanent/dynamic/cached storage.
- although the processing circuitry 12 is illustrated as a single component, it may be implemented as one or more separate components.
- the computer program 17 may arrive at the apparatus 10 via any suitable delivery mechanism 30.
- the delivery mechanism 30 may be, for example, a non-transitory computer-readable storage medium such as a compact disc read-only memory (CD-ROM) or digital versatile disc (DVD).
- the delivery mechanism 30 may be a signal configured to reliably transfer the computer program 17.
- the apparatus 10 may cause the propagation or transmission of the computer program 17 as a computer data signal.
- Fig. 4 illustrates a second apparatus 20.
- the second apparatus 20 is a plenoptic camera 2 as described with reference to Fig 1 or 2.
- the second apparatus 20 includes a housing 6, the first apparatus 10 illustrated in Fig. 3 and the optical system of a plenoptic camera as illustrated in Fig. 1 or 2.
- the housing 6 houses the processing circuitry 12, the memory 14, the camera optics 3, the array 24 of plenoptic camera optics 4, the filter 100 and the image sensor 26.
- the apparatus 20 may also comprise a display.
- the memory 14 is illustrated in Fig. 4 as storing a recorded plenoptic image 44, a modified plenoptic image 33 and information 35 defining a depth-invariant point spread function (PSF).
- PSF point spread function
- the image sensor 26 may be any type of image sensor, including a charge-coupled device (CCD) sensor and a complementary metal-oxide-semiconductor (CMOS) sensor.
- CCD charge-coupled device
- CMOS complementary metal-oxide-semiconductor
- the array 24 of micro-lenses may include any number of micro-lenses.
- the elements 12, 14, 3, 24, 100, 26 are operationally coupled and any number or combination of intervening elements can exist between them (including no intervening elements).
- An aperture 27 is present in the housing 6 that enables light to enter the housing 6.
- the arrow labeled with the reference numeral 40 in Fig. 4 illustrates light entering the housing 6.
- the arrow labeled with the reference numeral 41 illustrates light being conveyed from the camera optics 3 to the array 24 of plenoptic camera optics 4.
- the arrow labeled with the reference numeral 42 illustrates light being conveyed from the array 24 of plenoptic camera optics 4 to the depth-invariant PSF filter 100.
- the arrow labeled with the reference numeral 43 illustrates light being conveyed from the depth-invariant PSF filter 100 to the image sensor 26, which records electronic image data as a recorded plenoptic image 44.
- the electronic image data is converted from an analogue domain to a digital domain before it is recorded.
- the processing circuitry 12 is configured to store the digital recorded plenoptic image 44 in the memory 14.
- the processing circuitry 12 performs digital signal processing on the recorded plenoptic image 44 using a depth-invariant point spread function PSF to produce a modified plenoptic image 33.
- Information 35 defining the PSF may be read from memory 14.
- the produced modified plenoptic image 33 may be stored by the processing circuitry 12 in the memory 14.
- the purpose of the digital signal processing is to reverse an effect of the filter 100 having the depth-invariant point spread function on the recorded plenoptic image 44.
- the produced modified plenoptic image 33 when convolved with the depth-invariant point spread function PSF reproduces the recorded plenoptic image 44.
- the recorded plenoptic image 44 and the modified plenoptic image 33 are images of the same scene. There may, for example, be exact correspondence between the images.
- the recorded plenoptic image 44 comprises pixels and the modified plenoptic image 33 comprises pixels and there may be a one to one mapping between the pixels of the recorded plenoptic image 44 and the pixels of the modified plenoptic image 33.
- the processing circuitry 12 may perform the same digital signal processing on each of the pixels of the recorded plenoptic image 44 to produce the respective pixels of the modified plenoptic image 33.
- the digital signal processing may occur without using any depth estimates of pixels in the recorded plenoptic image 44.
- Fig 5 illustrates an example of a digital signal processing method 50 that may be performed by the circuitry 12 to reverse an effect of the filter 100 having the depth-invariant point spread function on the recorded plenoptic image 44.
- the processing circuitry 12 performs a deconvolution operation on the recorded plenoptic image 44 using the depth-invariant point spread function PSF to reverse an effect of the depth-invariant point spread function PSF in the optical path of the recorded plenoptic image 44.
- the same deconvolution operation is performed on each of the pixels of the recorded plenoptic image 44 to produce the modified plenoptic image 33. There is consequently a one to one mapping between the pixels of the recorded plenoptic image and the pixels of the modified plenoptic image.
- the deconvolution operation may be performed in a variety of different ways.
- the processing circuitry 12 performs a deconvolution operation on the recorded plenoptic image 44 using the depth-invariant point spread function PSF by performing an inverse Fourier transform on the result of dividing a Fourier transform of the recorded plenoptic image by a Fourier transform of the depth-invariant point spread function. That is, I_M = F^-1( F(I_R) / F(P) ), where
- I M is the modified plenoptic image 33
- I R is the recorded plenoptic image 44
- F is the Fourier transform operator
- F^-1 is the inverse Fourier transform operator
- P is the depth-invariant point spread function.
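A sketch of that deconvolution with NumPy FFTs. The small `eps` regularizer and the synthetic test image are my assumptions; the patent text does not specify how near-zero spectral values of the PSF are handled.

```python
import numpy as np

def fourier_deconvolve(recorded, psf, eps=1e-8):
    # I_M = F^-1( F(I_R) / F(P) ), with a tiny eps to avoid division by
    # near-zero spectral values (an assumption, not from the patent).
    F_R = np.fft.fft2(recorded)
    F_P = np.fft.fft2(psf, s=recorded.shape)   # zero-pad PSF to image size
    return np.real(np.fft.ifft2(F_R / (F_P + eps)))

# Round trip: blur a synthetic "modified" image with a known PSF (circular
# convolution via the FFT), then recover it by deconvolution.
rng = np.random.default_rng(0)
modified = rng.random((16, 16))
psf = np.zeros((16, 16))
psf[0, 0], psf[0, 1], psf[1, 0] = 0.6, 0.25, 0.15   # invented PSF weights
recorded = np.real(np.fft.ifft2(np.fft.fft2(modified) * np.fft.fft2(psf)))

restored = fourier_deconvolve(recorded, psf)
assert np.allclose(restored, modified, atol=1e-4)
```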
- the processing circuitry 12 performs a deconvolution operation on the recorded plenoptic image 44 using the depth-invariant point spread function PSF by: (i) convolving a putative plenoptic image of the scene with the depth-invariant point spread function to produce a reference plenoptic image; (ii) comparing the reference plenoptic image with the recorded plenoptic image 44; and (iii) if the comparison I_R ⊖ I_ref is below the threshold T, accepting the putative plenoptic image as the modified plenoptic image 33, otherwise updating the putative plenoptic image and repeating, where
- I_M is the modified plenoptic image 33
- I_R is the recorded plenoptic image 44
- I_P is the putative plenoptic image
- I_ref is the reference plenoptic image
- ⁠⊖ is a comparison operator
- T is a threshold value.
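A sketch of that iterative scheme: propose a putative image I_P, convolve it with the PSF to get a reference image, compare the reference against the recorded image, and refine until the mismatch falls below the threshold T. The Landweber-style update and the circular (FFT) convolution are my assumptions; the patent does not prescribe a particular update rule.

```python
import numpy as np

def iterative_deconvolve(recorded, psf, step=1.0, T=1e-6, max_iter=500):
    F_P = np.fft.fft2(psf, s=recorded.shape)
    putative = recorded.copy()                       # initial guess I_P
    for _ in range(max_iter):
        # (i) convolve the putative image with the PSF -> reference image
        reference = np.real(np.fft.ifft2(np.fft.fft2(putative) * F_P))
        # (ii) compare the reference with the recorded image
        residual = recorded - reference
        if np.max(np.abs(residual)) < T:             # mismatch below threshold
            break
        # (iii) otherwise update the putative image to reduce the mismatch
        update = np.real(np.fft.ifft2(np.fft.fft2(residual) * np.conj(F_P)))
        putative = putative + step * update
    return putative                                  # accepted as I_M

# Round trip on a synthetic image: blur with a known PSF, then recover.
rng = np.random.default_rng(1)
modified = rng.random((8, 8))
psf = np.zeros((8, 8))
psf[0, 0], psf[0, 1], psf[1, 0] = 0.6, 0.25, 0.15   # invented PSF weights
recorded = np.real(np.fft.ifft2(np.fft.fft2(modified) * np.fft.fft2(psf)))

result = iterative_deconvolve(recorded, psf)
assert np.allclose(result, modified, atol=1e-3)
```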
- the modified plenoptic image 33 undergoes computational plenoptic processing to produce one or more output images 36 of the scene.
- the resolution (number of pixels) of the output image 36 is less than the number of pixels of the modified plenoptic image 33 (the number of sensels 5).
- the modified plenoptic image 33 records the light field associated with the optical system of the plenoptic camera without the filter 100.
- the incident light X that is incident on an optical system is modified at each stage n of the optical system to produce the modified plenoptic image.
- the impact of each stage n on the light incident at that stage is modeled as an optical filter that can transform incident light rays to different exiting light rays using a transform function T_n.
- the convolution of the light rays X_n incident to that stage with the transform function T_n for that stage produces the light rays Y_n exiting that stage, which are also the light rays X_n+1 incident to the next stage.
- I_M is the modified plenoptic image 33 and S is the original light from the imaged scene.
- in the micro-lens array 24 one can also have constant focal-length micro-lenses 4 without any filter 100 on top of them, interleaved between the micro-lenses 4 with the filter 100. This may help with near-distance focus.
- by processing the modified plenoptic image 33 it is possible to computationally change the optics or apply different optics to produce one or more output images 36 of the scene
- the modified plenoptic image 33 may be processed using a first transform function to obtain a first image of the scene captured from a first perspective, and using a second transform function to obtain a second image of the scene captured from a second, different perspective.
- This change in perspective may be used to create a depth map for features in the imaged scene.
- the first image of the scene comprises a first sub-set of the pixels of the modified plenoptic image 33 and the second image of the scene comprises a second sub-set of the pixels of the modified plenoptic image 33.
- the second sub-set of pixels have a constant offset from the first sub-set of pixels.
- the modified plenoptic image 33 may be processed using a first transform function to obtain a first image of the scene as if captured using a lens of a first focal length, and using a second transform function to obtain a second image of the scene as if captured using a lens of a second, different focal length.
- This change in lens focal length may be used to refocus.
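A toy sketch of both transforms on a synthetic plenoptic image laid out as a grid of sub-images, one per micro-lens. The layout and the fixed-offset sampling are simplifying assumptions: picking the pixel at a fixed offset inside every sub-image yields one view, different offsets give different perspectives, and shifted averages of views refocus.

```python
import numpy as np

def view(plenoptic, n_micro, u, v):
    """Take the pixel at offset (u, v) inside every sub-image: one perspective."""
    s = plenoptic.shape[0] // n_micro        # sub-image size in pixels
    return plenoptic[u::s, v::s][:n_micro, :n_micro]

n_micro, s = 4, 3                            # 4x4 micro-lenses, 3x3 sub-images
plen = np.arange((n_micro * s) ** 2, dtype=float).reshape(n_micro * s, -1)

view_a = view(plen, n_micro, 0, 0)           # first perspective
view_b = view(plen, n_micro, 1, 1)           # second, different perspective

# Refocusing: average views after shifting them against each other; the chosen
# shift (here: none) selects which depth plane ends up in focus.
refocused = 0.5 * (view_a + view_b)
assert view_a.shape == (n_micro, n_micro)
```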
- references to 'computer-readable storage medium', 'processing circuitry', 'processor' etc. should be understood to encompass not only computers having different architectures such as single/multi-processor architectures and sequential (Von Neumann)/parallel architectures but also specialized circuits such as field-programmable gate arrays (FPGA), application specific circuits (ASIC), signal processing devices and other processing circuitry.
- References to computer program, instructions, code etc. should be understood to encompass software for a programmable processor or firmware such as, for example, the programmable content of a hardware device whether instructions for a processor, or configuration settings for a fixed- function device, gate array or programmable logic device etc.
- the term 'circuitry' refers to all of the following:
- circuits such as a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation, even if the software or firmware is not physically present.
- circuitry would also cover an implementation of merely a processor (or multiple processors) or portion of a processor and its (or their) accompanying software and/or firmware.
- circuitry would also cover, for example and if applicable to the particular claim element, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, or other network device.
- the blocks illustrated in Fig. 5 may represent blocks in a method and/or sections of code in the computer program 17. The illustration of a particular order to the blocks does not necessarily imply that there is a required or preferred order for the blocks, and the order and arrangement of the blocks may be varied. Furthermore, it may be possible for some blocks to be omitted.
- the apparatus 10 illustrated in Fig. 3 may form part of a computer rather than a plenoptic camera such as that illustrated in Fig. 4.
- the apparatus that performs the image processing to produce an output image 36 having a greater depth of field need not be or form part of the apparatus that was used to capture the original plenoptic image 44.
- an apparatus 10 comprising: processing circuitry 12; and at least one memory 14 storing computer program code 18 configured, working with the processing circuitry 12, to cause: performing digital signal processing on a recorded plenoptic image 44 of a scene using a depth-invariant point spread function, which distorts the recorded plenoptic image 44, to produce and record a modified plenoptic image 33 of the scene, wherein the modified plenoptic image 33 of the scene when convolved with the depth-invariant point spread function reproduces the recorded plenoptic image 44.
- an apparatus 10 comprising: means 12 for performing digital signal processing on a recorded plenoptic image 44 of a scene using a depth-invariant point spread function, which distorts the recorded plenoptic image 44, to produce and record a modified plenoptic image 33 of the scene, wherein the modified plenoptic image 33 of the scene when convolved with the depth-invariant point spread function reproduces the recorded plenoptic image 44.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computing Systems (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Optics & Photonics (AREA)
- Image Processing (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IN5382/CHE/2013 | 2013-11-21 | ||
IN5382CH2013 IN2013CH05382A | 2013-11-21 | 2014-11-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015075303A1 true WO2015075303A1 (en) | 2015-05-28 |
Family
ID=53179018
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/FI2014/050836 WO2015075303A1 (en) | 2013-11-21 | 2014-11-06 | Plenoptic image processing for an extended depth of field |
Country Status (2)
Country | Link |
---|---|
IN (1) | IN2013CH05382A |
WO (1) | WO2015075303A1 |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8345144B1 (en) * | 2009-07-15 | 2013-01-01 | Adobe Systems Incorporated | Methods and apparatus for rich image capture with focused plenoptic cameras |
-
2014
- 2014-11-06 IN IN5382CH2013 patent/IN2013CH05382A/en unknown
- 2014-11-06 WO PCT/FI2014/050836 patent/WO2015075303A1/en active Application Filing
Non-Patent Citations (2)
Title |
---|
COSSAIRT, O. et al.: "Diffusion Coded Photography for Extended Depth of Field", ACM Trans. Graph., vol. 29, 31 July 2010, XP058041048, DOI: 10.1145/1778765.1778768, Retrieved from the Internet <URL:http://doi.acm.org/10.1145/1833349.1778768> * |
WETZSTEIN, G. et al.: "State of the Art in Computational Plenoptic Imaging", Eurographics State of the Art Report, 2011, Retrieved from the Internet <URL:https://www.cs.ubc.ca/labs/imager/tr/2011/ComputationalPlenopticlmaging/ComputationalPlenopticlmaging-STAR.pdf> * |
Also Published As
Publication number | Publication date |
---|---|
IN2013CH05382A | 2015-05-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Georgiev et al. | Lytro camera technology: theory, algorithms, performance analysis | |
JP6802372B2 (ja) | Photographing method for a terminal, and terminal | |
JP4981124B2 (ja) | Improved plenoptic camera | |
TWI554106B (zh) | Method for generating image bokeh effect and image capturing device | |
TWI538512B (zh) | Method for adjusting focus position and electronic device | |
CN108335323B (zh) | Image background blurring method and mobile terminal | |
WO2017016258A1 (zh) | Terminal and photographing method for the terminal | |
EP2786556B1 (en) | Controlling image capture and/or controlling image processing | |
CN102422632A (zh) | Imaging device and image restoration method | |
CN103916574B (zh) | Imaging device | |
JP2013145982A (ja) | Imaging device, image processing device and method | |
CN103262523B (zh) | Imaging device, imaging system, imaging method and image processing method | |
Alzayer et al. | DC2: Dual-camera defocus control by learning to refocus | |
JP2012195797A (ja) | Pan-focus image generation device | |
Chen et al. | Light field based digital refocusing using a DSLR camera with a pinhole array mask | |
CN105450943B (zh) | Method for generating image bokeh effect and image acquisition device | |
US9866809B2 (en) | Image processing system with aliasing detection mechanism and method of operation thereof | |
Georgiev et al. | Using focused plenoptic cameras for rich image capture | |
US9667846B2 (en) | Plenoptic camera apparatus, a method and a computer program | |
WO2015075303A1 (en) | Plenoptic image processing for an extended depth of field | |
Oberdörster et al. | Digital focusing and refocusing with thin multi-aperture cameras | |
US20190149750A1 (en) | High frame rate motion field estimation for light field sensor, method, corresponding computer program product, computer readable carrier medium and device | |
Georgiev | Plenoptic camera resolution | |
WO2014122506A1 (en) | Image processing of sub -images of a plenoptic image | |
Sahin et al. | Light L16 computational camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14863790 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14863790 Country of ref document: EP Kind code of ref document: A1 |