WO2014083439A1 - Plenoptic camera apparatus, method, and computer program - Google Patents


Info

Publication number
WO2014083439A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera optics
array
image sensor
relative
physical movement
Prior art date
Application number
PCT/IB2013/053319
Other languages
English (en)
Inventor
Mithun Uliyar
Basavaraja Sv
Gururaj PUTRAYA
Rajeswari Kannan
Original Assignee
Nokia Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Corporation filed Critical Nokia Corporation
Priority to US14/443,735 (US9667846B2)
Publication of WO2014083439A1

Links

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00

Definitions

  • Embodiments of the present invention relate to a plenoptic camera apparatus, a method and a computer program.
  • a plenoptic (or light field) camera simultaneously captures an image of a scene through each one of multiple optics.
  • the multiple optics may be provided, for example, as an array of micro-lenses, apertures or masks.
  • an apparatus comprising: camera optics; an array of plenoptic camera optics; an image sensor comprising a plurality of sensels; and a driver configured to cause relative physical movement of at least the camera optics and the array of plenoptic camera optics.
  • a method comprising: causing physical movement of camera optics relative to an image sensor; and causing contemporaneous physical movement of an array of plenoptic camera optics relative to the image sensor.
  • a computer program product comprising computer program instructions that, when loaded into a processor, enable: causing physical movement of camera optics relative to an image sensor; and causing contemporaneous physical movement of an array of plenoptic camera optics relative to the image sensor.
  • Fig 1 illustrates an example of a plenoptic camera apparatus
  • Fig 2 schematically illustrates an example in which there is relative physical movement between camera optics, the array of plenoptic camera optics and the image sensor;
  • Fig 3 illustrates an example of the plenoptic camera apparatus
  • Fig 4 illustrates a method which may be carried out at the plenoptic camera apparatus
  • Fig 5 illustrates a method that includes image processing
  • Figs 6A, 6B and 6C illustrate aspects of use of an example of a plenoptic camera apparatus.
  • the Figures illustrate an apparatus 2 comprising: camera optics 4; an array 6 of plenoptic camera optics 8; an image sensor 10 comprising a plurality of sensels 12; and a driver 20 configured to cause relative physical movement of at least the camera optics and the array of plenoptic camera optics.
  • Fig 1 illustrates an example of a plenoptic camera apparatus 2.
  • the plenoptic camera apparatus 2 may be an imaging device. It may, for example, be a camera or a multi-functional device with a plenoptic camera as one of its functions.
  • the apparatus 2 may be a portable device, that is, a device that is configured to be carried by a user.
  • the apparatus 2 may be a hand-portable device, that is, a device that is sized to be carried in a palm of a user and capable of fitting in an inside jacket pocket. If the plenoptic camera apparatus 2 is a hand-portable multi-functional device, such as a mobile cellular telephone, then it may be desirable for an external aperture in a housing 40 for the plenoptic camera to be small.
  • This example of a plenoptic camera apparatus 2 comprises, within a housing 40, camera optics 4, an array 6 of plenoptic camera optics 8 and an image sensor 10 comprising a plurality of sensels 12.
  • the camera optics 4, the array 6 of plenoptic camera optics 8 and the image sensor 10 are arranged, in series, along an optical axis of the plenoptic camera apparatus 2.
  • the camera optics 4 comprises an aperture and/or one or more lenses.
  • An optical plane 30 of the camera optics 4 is normal to the optical axis of the plenoptic camera apparatus 2.
  • the array 6 of plenoptic camera optics 8 occupies an optical plane 32 normal to the optical axis of the plenoptic camera apparatus 2 and parallel to the optical plane 30 of the camera optics 4.
  • Each plenoptic camera optic 8 comprises an aperture, a mask or a lens.
  • the array 6 may be an array of micro-lenses, apertures or masks.
  • the image sensor 10 comprises an array of sensels 12 in an imaging plane 34 normal to the optical axis of the plenoptic camera apparatus 2 and parallel to the optical plane 30 of the camera optics 4 and the optical plane 32 of the array 6 of plenoptic camera optics 8.
  • An image sensel 12 is a sensor element. It is the sensing equivalent of a pixel (picture element).
  • the data recorded by a sensel 12 when reproduced as an image corresponds to a pixel.
  • This example of the plenoptic camera apparatus 2 also comprises a driver 20 configured to cause relative physical movement between the camera optics 4, the array 6 of plenoptic camera optics 8 and the image sensor 10.
  • the driver 20 is configured to operate an actuator 22A associated with the camera optics 4.
  • the actuator 22A is configured to physically move the camera optics 4 relative to the housing 40 in the optical plane 30 parallel to the imaging plane 34 of the image sensor 10.
  • the driver 20 is configured to operate an actuator 22B associated with the array 6 of plenoptic camera optics 8.
  • the actuator 22B is configured to physically move the array 6 relative to the housing 40 in the optical plane 32 parallel to the imaging plane 34 of the image sensor 10.
  • the driver 20 is configured to operate an actuator 22C associated with the image sensor 10.
  • the actuator 22C is configured to physically move the image sensor 10 relative to the housing 40 parallel to the imaging plane 34 of the image sensor 10.
  • Fig 2 schematically illustrates an example in which there is relative physical movement between the camera optics 4, the array 6 of plenoptic camera optics 8 and the image sensor 10.
  • Fig 2 comprises a right-angled triangle.
  • the base of the triangle represents a focused optical path from an object O through the camera optics 4 and the array 6 of plenoptic camera optics 8 to the image sensor 10, at time t1 before relative physical movement between the camera optics 4, the array 6 of plenoptic camera optics 8 and the image sensor 10.
  • the hypotenuse of the triangle represents a focused optical path from the object O through the camera optics 4 and the array 6 of plenoptic camera optics 8 to the image sensor 10, at time t2 after relative physical movement between the camera optics 4, the array 6 of plenoptic camera optics 8 and the image sensor 10.
  • the camera optics 4 physically moves upwards a displacement c relative to the housing 40 in the optical plane 30;
  • the array 6 of plenoptic camera optics 8 moves upwards a displacement p relative to the housing 40 in the optical plane 32;
  • the image sensor 10 physically moves upwards a displacement s relative to the housing 40 in the imaging plane 34.
  • This relative physical movement maintains focus of a reference point O in a scene captured by the image sensor 10 and maintains a position of the reference point at a particular sensel of the image sensor 10.
  • the displacements c, p, s of the camera optics 4, the array 6 of plenoptic camera optics 8 and the image sensor 10 within the respective parallel planes have a linear relationship.
  • the physical movement of the image sensor 10 relative to the housing 40 of the apparatus is greater than the physical movement of the array 6 of plenoptic camera optics 8 relative to the housing 40 of the apparatus.
  • the physical movement of the array 6 of plenoptic camera optics 8 relative to the housing 40 of the apparatus is greater than the physical movement of the camera optics 4 relative to the housing 40.
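The linear relationship and the ordering s > p > c can be sketched from the similar triangles of Fig 2: each element's in-plane displacement is proportional to its axial distance from the reference point O. The following is a minimal illustration under that assumption; the function name and all distance values are hypothetical, not taken from the patent.

```python
# Sketch of the linear relationship between the displacements c, p and s.
# From the similar triangles of Fig 2, each element's in-plane displacement
# is proportional to its axial distance from the reference point O.
# All distances below are hypothetical illustration values.

def displacements(z_optics, z_array, z_sensor, s):
    """Given axial distances (from the reference point O) of the camera
    optics, the plenoptic array and the image sensor, and the sensor
    displacement s, return the displacements (c, p) that keep the reference
    point focused on the same sensel."""
    c = s * z_optics / z_sensor
    p = s * z_array / z_sensor
    return c, p

# optics at 100 mm, array at 106 mm, sensor at 108 mm from O; s = 0.54 mm
c, p = displacements(100.0, 106.0, 108.0, 0.54)
# c = 0.5 mm and p = 0.53 mm, so s > p > c, matching the ordering described
```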
  • the camera optics 4 has a physical aperture diameter D and a focal length f. Its real F-number FN is therefore f/D.
  • the camera optics 4 is moved, in its optical plane 30, by the driver 20 a distance c.
  • the effective aperture diameter D' of the moving camera optics 4 is increased. In this example, it is D+c.
  • the effective F-number FN' of the moving camera optics 4 has therefore decreased. In this example, it is f/D'.
  • the optical design of the plenoptic camera apparatus 2 may be based upon the effective F-number FN' of the camera optics 4, rather than the real F-number FN.
  • the effective F-number FN' is less than the real F-number FN.
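The aperture and F-number relationships above reduce to simple arithmetic, sketched below. The function name and all numeric values are hypothetical illustrations; the patent quotes no specific numbers at this point.

```python
# Sketch of the effective aperture / F-number arithmetic described above.
# f, D and c are hypothetical values in millimetres.

def effective_f_number(f, D, c):
    """Real F-number f/D, and effective F-number f/(D + c) when the camera
    optics are swept a distance c in their optical plane during exposure."""
    FN = f / D          # real F-number
    D_eff = D + c       # effective aperture diameter D'
    FN_eff = f / D_eff  # effective F-number FN'
    return FN, FN_eff

FN, FN_eff = effective_f_number(f=8.0, D=3.0, c=1.0)
# FN ~ 2.67 and FN_eff = 2.0: the effective F-number is less than the real one
```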
  • a plenoptic camera optic 8 has a physical aperture diameter X and a focal length f. Its real F-number FNp is therefore f/X.
  • the array 6 of plenoptic camera optics 8 is moved, in its optical plane 32, by the driver 20 a distance p.
  • the effective aperture diameter X' of the moving plenoptic camera optic 8 is increased. In this example, it is X+p.
  • the effective F-number FNp' of the moving plenoptic camera optic 8 has therefore decreased. In this example, it is f/X'.
  • the optical design of the plenoptic camera apparatus 2 may be based upon the effective F-number FNp' of the plenoptic camera optics 8, rather than the real F-number FNp.
  • the effective F-number FNp' is less than the real F-number FNp.
  • In a plenoptic camera apparatus 2 it may be desirable to optically match the camera optics 4 and the plenoptic camera optics 8. It may, for example, be desirable for the F-number of the plenoptic camera optics 8 to match the F-number of the camera optics 4. In this scenario, the effective F-number FNp' of the plenoptic camera optics 8 matches the effective F-number FN' of the camera optics 4. The camera optics 4 therefore has a real F-number FN greater than a real F-number of the plenoptic camera optics 8.
  • the physical aperture diameter D of the camera optics 4 may be small.
  • the real F-number of the camera optics 4 may be greater than 2.5, with a focal length of between 6 mm and 10 mm.
  • the physical aperture diameter D of the camera optics 4 may, in this example, range between 2 mm and 4 mm.
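As a quick arithmetic check of the quoted range, D = f / FN evaluated with FN = 2.5 (the stated lower bound on the real F-number) at the two focal-length extremes gives:

```python
# Quick arithmetic check of the quoted aperture range: D = f / FN,
# with FN = 2.5 and the two stated focal-length extremes (in mm).
diameters = {f_mm: f_mm / 2.5 for f_mm in (6.0, 10.0)}
# f = 6 mm gives D = 2.4 mm; f = 10 mm gives D = 4.0 mm, broadly consistent
# with the 2 mm to 4 mm range stated above
```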
  • Fig 3 illustrates an example of the plenoptic camera apparatus 2.
  • the figure illustrates an example of a driver 20 which controls actuators 22.
  • the actuators 22 are used to move, relative to each other, the camera optics 4, the array 6 of plenoptic camera optics 8 and the image sensor 10.
  • the actuators physically move the camera optics 4, the array 6 of plenoptic camera optics 8 and the image sensor 10.
  • the actuators physically move only the camera optics 4, and the array 6 of plenoptic camera optics 8.
  • the image sensor 10 is not physically moved but is virtually moved by processing an output 11 from the image sensor 10. There may therefore be no actuator 22C, as illustrated in Fig 1, as the image sensor 10 does not need to physically move.
  • the value s illustrated in Fig 2 does not represent a physical distance moved by the image sensor 10 but instead represents a shift in an origin of a captured image within the sensels 12 of the image sensor 10.
  • the driver 20 is provided by processing circuitry 27. It provides control signals 21 to the actuators 22.
  • the processing circuitry 27 that is used to provide the driver 20 is also used to provide an image processor 23 that processes an output 11 from the image sensor 10.
  • the processing circuitry 27 may be configured such that the relative physical movement caused by the driver 20 maintains focus of a reference point in a scene captured by the image sensor 10.
  • the processing circuitry 27 may be configured such that the relative physical movement caused by the driver 20 maintains focus of a reference point in a scene captured by the image sensor and also maintains a position of the reference point at a particular sensel 12 of the image sensor.
  • the processing circuitry 27 may be configured such that the relative physical movement caused by the driver 20 maintains focus of a reference point in a scene captured by the image sensor and tracks a position of the reference point across the sensels 12 of the image sensor 10. This tracking is used by the image processor 23 to compensate the output 11 of the image sensor 10 to effect virtual movement of the image sensor 10.
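The "virtual movement" of the image sensor can be pictured as a shift of the captured image's origin within the sensel array. The sketch below is an illustrative interpretation only; the patent does not specify an implementation, and the function name, numpy usage and values are assumptions.

```python
# Sketch of "virtual movement" of the image sensor: rather than physically
# translating the sensor, the origin of the captured image is shifted in
# post-processing by the tracked displacement (here a whole number of
# sensels). Cropping rather than wrapping means no sensel data is invented.
import numpy as np

def virtually_move(image, shift_rows, shift_cols):
    """Shift the image origin by (shift_rows, shift_cols) sensels."""
    h, w = image.shape[:2]
    return image[shift_rows:h, shift_cols:w]

frame = np.arange(36).reshape(6, 6)    # a toy 6x6 sensel readout
shifted = virtually_move(frame, 2, 1)  # origin moved down 2, right 1
# shifted[0, 0] now reads the sensel that was at frame[2, 1]
```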
  • the processing circuitry 27 in this example comprises a processor 24 and a memory 26.
  • the memory 26 stores computer program instructions that enable the driver 20 and image processor 23.
  • Fig 4 illustrates a method 50 which may be carried out at the plenoptic camera apparatus 2, for example, by the processing circuitry 27 of Fig 3.
  • the driver 20 of the apparatus 2 causes physical movement of the camera optics 4 relative to an image sensor 10.
  • the driver 20 of the apparatus 2 causes contemporaneous physical movement of an array 6 of plenoptic camera optics 8 relative to the image sensor 10.
  • the order of blocks 52 and 54 does not imply an order to the method but merely separates distinct activities within the method 50 that occur at the same time during an image exposure.
  • the apparatus 2 causes movement of the image sensor.
  • the movement of the image sensor 10 is a physical movement caused by the driver 20.
  • the physical movement is contemporaneous with blocks 52 and 54.
  • the order of blocks 52, 54 and 56 does not imply an order to the method but merely separates distinct activities within the method 50 that occur at the same time during an image exposure.
  • the movement of the image sensor 10 is a virtual movement applied in post-processing of the output 11 of the sensels 12 of the image sensor 10 by the image processor 23.
  • In Figs 6A, 6B and 6C, one of many examples of how the plenoptic camera apparatus 2 may be used is illustrated.
  • Fig 6A is similar to Fig 1. It illustrates the camera optics 4 of Fig 1, the array 6 of plenoptic camera optics 8 of Fig 1 and the image sensor 10 of Fig 1. However, for the purposes of clarity it does not illustrate the housing 40, the actuators 22 or the driver 20.
  • the apparatus 2 has N different perspectives P1, P2, P3 ... PN of a single object A, each sampled by different operational apertures A1, A2, A3 ... AN of the camera optics 4.
  • a set 70 of sensels 12_1, 12_2, 12_3 ... 12_N is used by the image sensor 10.
  • a group 72_m of one or more of those sensels is associated with each perspective Pm of the respective perspectives P1, P2, P3 ... PN.
  • Each group 72_m of sensels comprises a sensel 12_m associated with each perspective Pm of the respective perspectives P1, P2, P3.
  • Fig 5 illustrates a method 60.
  • the method 60 is a general method but will be described in the following paragraphs with respect to the particular example illustrated in Figs 6A, 6B and 6C of the apparatus illustrated in Fig 1.
  • the camera optics 4 are, at block 61, positioned such that a first sub-set of the N perspectives (e.g. {P1, P2}) are, at block 62, sampled at a corresponding first sub-set of sensels (e.g. {12_1, 12_2}).
  • the driver 20 is configured to cause a first displacement of at least the camera optics 4 and the array 6 of plenoptic camera optics 8 during the first exposure period as previously described.
  • c, p, s may be zero.
  • the camera optics 4 are displaced (e.g. by c) so that they occupy a first position that covers a first sub-set of the N different operational apertures (e.g. {A1, A2}) associated with the first sub-set of the N perspectives (e.g. {P1, P2}) but does not cover the other operational apertures (e.g. A3).
  • the first sub-set of the N perspectives (e.g. {P1, P2}) are sampled at a corresponding first sub-set of sensels (e.g. {12_1, 12_2} but not 12_3).
  • the camera optics 4 are, at block 63, positioned such that a second sub-set of the N perspectives (e.g. {P2, P3}) are, at block 64, sampled at a corresponding second sub-set of sensels (e.g. {12_2, 12_3}).
  • the driver 20 is configured to cause a second displacement of at least the camera optics 4 and the array 6 of plenoptic camera optics 8 during the second exposure period as previously described.
  • the camera optics 4 are displaced (e.g. by c) so that they occupy a second position that covers a second sub-set of the N different operational apertures (e.g. {A2, A3}) associated with the second sub-set of the N perspectives (e.g. {P2, P3}).
  • the second sub-set of the N perspectives (e.g. {P2, P3}) are sampled at a corresponding second sub-set of sensels (e.g. {12_2, 12_3} but not 12_1).
  • the first sub-set of sensels 12 and the second sub-set of sensels 12 may be relatively displaced by an integral number of sensels 12.
  • the second exposure period immediately follows the first exposure period and both occur during a single continuous image exposure. In other examples, there may be a delay between the first exposure period and the second exposure period.
  • processing circuitry 27 that receives the output 11 from the image sensor 10, e.g. processor 24 in Fig 3, is configured to process the first image and the second image taking into consideration parallax arising from the different first and second perspectives P1, P2.
  • the image processor 23 may use computer vision techniques to identify interest points within the images, it may determine shifts of interest points within the images captured from different perspectives, it may then use known angular off-sets between the different perspectives and the determined shifts in interest points from the different perspectives to estimate a distance, from the apparatus 2, to the objects corresponding to the interest points using trigonometric equations or look-up tables.
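The distance estimation described above is, in essence, triangulation from disparity. A minimal sketch under the usual stereo model follows; the function name and all values are hypothetical, since the patent leaves the exact method open (trigonometric equations or look-up tables).

```python
# Sketch of the distance estimation described above: an interest point seen
# from two perspectives shifts between the captured images, and with a known
# separation (baseline) between the perspectives, triangulation gives the
# distance Z = baseline * focal_length / shift.

def depth_from_shift(baseline_mm, focal_px, shift_px):
    """Classic triangulation: distance = baseline * focal length / disparity."""
    return baseline_mm * focal_px / shift_px

Z = depth_from_shift(baseline_mm=2.0, focal_px=1000.0, shift_px=4.0)
# Z = 500.0 mm: a point shifting 4 sensels between perspectives 2 mm apart
# lies about half a metre from the apparatus
```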
  • the first and second sub-sets of the N perspectives overlap, the first and second sub-sets of the N different operational apertures overlap, and the first and second sub-sets of sensels overlap.
  • The driver 20 and image processor 23 may be implemented as a single entity or as separate entities. They may be implemented in hardware alone (a circuit, a processor etc.), have certain aspects in software (including firmware) alone, or can be a combination of hardware and software (including firmware). They may be implemented using instructions that enable hardware functionality.
  • In Fig 3 they are implemented using a processor 24 which is configured to read from and write to a memory 26.
  • the memory 26 comprises a computer program 28.
  • the processor 24 may be configured by the computer program 28 to perform the function of the driver 20. It may, for example, in the first embodiment cause execution of blocks 52, 54 and 56 of method 50. It may, for example, in the second embodiment cause execution of blocks 52 and 54. It may, for example, in the first and second embodiments cause execution of the blocks 61, 63 in the method 60.
  • the processor 24 may be configured by the computer program 28 to perform the function of the image processor 23. It may, for example, in the second embodiment execute block 56 and cause virtual movement of the image sensor by processing the output 11 of the image sensor 10.
  • the processor 24 may be configured by the computer program 28 to perform the function of the image processor 23. It may, for example, in the first and second embodiments, perform the function of block 65 in the method 60.
  • the processor 24 is configured to read from and write to the memory 26.
  • the processor 24 may also comprise an output interface via which data and/or commands are output by the processor 24 and an input interface via which data and/or commands are input to the processor 24.
  • the memory 26 stores the computer program 28 comprising computer program instructions that control the operation of the apparatus 2 when loaded into the processor 24.
  • the computer program instructions provide the logic and routines that enable the apparatus to perform the data processing and control methods described.
  • the processor 24 by reading the memory 26 is able to load and execute the computer program 28.
  • the computer program may arrive at the apparatus 2 via any suitable delivery mechanism.
  • the delivery mechanism may be, for example, a non-transitory computer-readable storage medium, a computer program product, a memory device, etc.
  • The term 'module' refers to a unit or apparatus that excludes certain parts/components that would be added by an end manufacturer or a user.
  • the combination of camera optics 4, array 6 of plenoptic camera optics 8 and image sensor 10, with their respective actuators 22 (if any), may be provided as a module.
  • the combination of camera optics 4, array 6 of plenoptic camera optics 8 and image sensor 10, with their respective actuators 22 (if any) and driver 20, may be provided as a module.
  • the combination of camera optics 4, array 6 of plenoptic camera optics 8 and image sensor 10, with their respective actuators 22 (if any), processor 24 and memory 26 (with or without the computer program 28), may be provided as a module.
  • Some or all of the blocks illustrated in the Fig 4 may represent steps in a method and/or sections of code in the computer program 28.
  • the illustration of a particular order to the blocks does not necessarily imply that there is a required or preferred order for the blocks, and the order and arrangement of the blocks may be varied. Furthermore, it may be possible for some blocks to be omitted.
  • Some or all of the blocks illustrated in the Fig 5 may represent steps in a method and/or sections of code in the computer program 28.
  • the illustration of a particular order to the blocks does not necessarily imply that there is a required or preferred order for the blocks, and the order and arrangement of the blocks may be varied. Furthermore, it may be possible for some blocks to be omitted.

Abstract

The invention relates to an apparatus comprising: camera optics; an array 6 of plenoptic camera optics 8; an image sensor 10 comprising a plurality of sensels 12; and a driver configured to cause relative physical movement of the camera optics and the array of plenoptic camera optics.
PCT/IB2013/053319 2012-11-27 2013-04-26 Plenoptic camera apparatus, method, and computer program WO2014083439A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/443,735 US9667846B2 (en) 2012-11-27 2013-04-26 Plenoptic camera apparatus, a method and a computer program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN4928/CHE/2012 2012-11-27
INCH49282012 2012-11-27

Publications (1)

Publication Number Publication Date
WO2014083439A1 true WO2014083439A1 (fr) 2014-06-05

Family

ID=48614085

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2013/053319 WO2014083439A1 (fr) 2012-11-27 2013-04-26 Plenoptic camera apparatus, method, and computer program

Country Status (1)

Country Link
WO (1) WO2014083439A1 (fr)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090185801A1 (en) * 2008-01-23 2009-07-23 Georgiev Todor G Methods and Apparatus for Full-Resolution Light-Field Capture and Rendering



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13728530

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 14443735

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13728530

Country of ref document: EP

Kind code of ref document: A1