WO2014191613A1 - Light field imaging

Info

Publication number
WO2014191613A1
Authority
WO
WIPO (PCT)
Prior art keywords
lens
image
array
lenslets
light field
Prior art date
Application number
PCT/FI2014/050353
Other languages
French (fr)
Inventor
Basavaraja S V
Mithun Uliyar
Gururaj Gopal Putraya
Martin Schrader
Rajeswari Kannan
Original Assignee
Nokia Corporation
Priority date
2013-05-27
Filing date
2014-05-13
Publication date
2014-12-04
Application filed by Nokia Corporation
Publication of WO2014191613A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images

Abstract

An apparatus, electronic device, system and methods for light field imaging. The apparatus comprises a first lens; a relay lens package comprising at least a second lens configured to form an intermediate image plane; an array of convex lenslets positioned optically between the first lens and the relay lens package; wherein each lenslet of the array of lenslets is configured to form on the intermediate image plane a real image of any object in the field of view of the first lens.

Description

LIGHT FIELD IMAGING
TECHNICAL FIELD
The present application generally relates to light field imaging.
BACKGROUND
In a digital camera, light rays fall on an image sensor through the optics of the camera. The image sensor detects and records the intensity of light rays incident on each pixel of the image sensor. From the intensity data, an image or a photograph is created. As the image sensor records the intensity of all the light rays incident on each pixel, the direction of the light rays is not taken into account. The camera produces an image, the focus of which is dependent on the optical settings at the time of exposure.
In light field photography, also called plenoptic imaging, the intensity and direction of each incident light ray are recorded, and with this information on the four-dimensional light field it is feasible to reconstruct any desired image within the field of view of the camera; for example, the focus of an image can be adjusted after the exposure. Light field photography has been realized with arrangements of several cameras, and with dedicated light field camera arrangements comprising a mask or an array of lenslets between the lens of the camera and the imaging sensor. The recent success and growth of digital and mobile imaging call for simple solutions for light field imaging which do not sacrifice image quality, do not require tedious hardware modifications and do not disturb the comfortable user experience of digital and mobile imaging. Furthermore, as mobile users increasingly edit and share their images after capture, images allowing an enhanced range of post-processing possibilities are desired.
SUMMARY
Various aspects of examples of the invention are set out in the claims. According to a first example aspect of the invention, there is provided an apparatus, comprising:
a first lens;
a relay lens package comprising at least a second lens configured to form an intermediate image plane; and
an array of convex lenslets positioned optically between the first lens and the relay lens package; wherein
each lenslet of the array of lenslets is configured to form on the intermediate image plane a real image of any object in the field of view of the first lens.
The relay lens package may further comprise a third lens positioned optically after the second lens. The apparatus may further be configured to be attached in front of a lens of a camera unit comprising a camera lens and an image sensor.
The third lens may be configured together with the camera lens to project the real image formed by the array of lenslets onto the image sensor.
Each lens may comprise a lens package comprising a combination of a plurality of lens elements.
Each lens and the array of lenslets may be in such a position that a light ray emanating at a point in the field of view of the first lens is portrayed by at least 2x2 lenslets of the array of lenslets.
At least one of the lenses and/or the array of lenslets may comprise switchable optics elements.
According to a second example aspect of the invention, there is provided an electronic device, comprising:
a housing;
a display; a memory;
a processor; and
a camera unit comprising an image sensor and a camera lens; wherein
the processor is configured to cause the image sensor to record a primary image comprising light field information from a real intermediate image received on the sensor via the camera lens from an intermediate image plane, the intermediate image being formed by an array of convex lenslets; and
the processor is configured to cause retrieving light field information from a primary image obtained from the image sensor.
The processor may be further configured to cause forming a secondary image from the primary image.
According to a third example aspect of the invention, there is provided a system, comprising the apparatus of the first example aspect and the electronic device of the second example aspect. According to a fourth example aspect of the invention, there is provided a method, comprising
relaying with a first lens light rays from an object into an array of convex lenslets;
forming with the array of convex lenslets a real image on an intermediate image plane, the intermediate image plane being formed by a second lens of a relay lens package; wherein
the array of convex lenslets is positioned between the first lens and the relay lens package. The method may further comprise projecting the intermediate image onto an image sensor via a third lens of the relay lens package.
The method may further comprise adjusting at least one of the lenses and/or the array of lenslets with switchable optics elements. According to a fifth example aspect of the invention, there is provided a method, comprising
setting a camera unit of an electronic device to light field capture mode, so that the camera unit is focused on an intermediate image plane;
projecting with a camera lens an intermediate image formed by an array of convex lenslets on the intermediate image plane onto an image sensor;
recording a primary image comprising light field information projected on the sensor; and
retrieving light field information from the primary image obtained from the image sensor.
The method may further comprise forming a secondary image from the primary image.
According to a sixth example aspect of the invention, there is provided a method, comprising
receiving user input on using light field imaging;
in response to the user input, attaching, in front of a lens of a camera unit of an electronic device, an apparatus comprising a first lens, a relay lens package comprising at least a second lens, and an array of convex lenslets positioned between the first lens and the relay lens package;
setting the camera unit to light field capture mode, so that the camera unit is focused on an intermediate image plane;
relaying with a first lens light rays from an object into an array of convex lenslets;
forming with the array of convex lenslets a real image on an intermediate image plane, the intermediate image plane being formed by the second lens;
projecting with a camera lens the image formed on the intermediate image plane onto an image sensor;
recording a primary image comprising light field information projected on the sensor; and retrieving light field information from the primary image obtained from the image sensor.
The method may further comprise forming a secondary image from the primary image.
According to a seventh example aspect of the invention, there is provided a computer program, comprising:
code for performing a method of an example aspect of the invention, when the computer program is run on a processor.
According to an eighth example aspect of the invention, there is provided a memory medium comprising the computer program of the seventh example aspect of the invention.
Different non-binding example aspects and example embodiments of the present invention have been illustrated in the foregoing. The foregoing example embodiments are used merely to explain selected aspects or steps that may be utilized in implementations of the present invention. Some example embodiments may be presented only with reference to certain example aspects of the invention. It should be appreciated that corresponding example embodiments may apply to other example aspects as well.
BRIEF DESCRIPTION OF THE DRAWINGS
For a more complete understanding of example embodiments of the present invention, reference is now made to the following descriptions taken in connection with the accompanying drawings in which:
Fig. 1 shows a schematic system for use as a reference with which some example embodiments can be explained;
Fig. 2 shows a block diagram of an apparatus of an example embodiment;
Fig. 3 shows a block diagram of a camera unit of an example embodiment;
Fig. 4 shows a schematic representation of an apparatus according to an example embodiment with a camera unit of an example embodiment;
Fig. 5 shows a schematic representation of an apparatus according to an example embodiment with a camera unit of an example embodiment;
Fig. 6 shows a schematic representation of an apparatus according to an example embodiment with a camera unit of an example embodiment;
Fig. 7 shows an example primary image captured with an apparatus of an example embodiment;
Figs. 8 and 9 show example secondary images created with apparatuses of an example embodiment; and
Fig. 10 shows a flow diagram of a method of an example embodiment.
DETAILED DESCRIPTION OF THE DRAWINGS
Fig. 1 shows a schematic system 100 for use as a reference with which some example embodiments can be explained. The system 100 comprises an electronic device 110 such as a camera phone, camera, smartphone, gaming device, personal digital assistant or a tablet computer having a camera unit 120 that is capable of capturing images with a field of view 130. The camera unit is configured to be used with or without a further device 125 according to an example embodiment attached thereto as an add-on or an accessory. The device 125 is configured for providing light field recording. The device 110 further comprises a display 140. Fig. 1 also shows image objects 150, 160 and 170 at different distances from the camera unit that are being imaged by the camera unit 120.
Fig. 2 shows a block diagram of an apparatus 200 of an example embodiment. The apparatus 200 is suited for operating as the device 110. In an example embodiment, the apparatus 200 comprises a communication interface 220, a host processor 210 coupled to the communication interface module 220, and a memory 240 coupled to the host processor 210. The memory 240 comprises a work memory and a non-volatile memory such as a read-only memory, flash memory, optical or magnetic memory. In the memory 240, typically at least initially in the non-volatile memory, there is stored software 250 operable to be loaded into and executed by the host processor 210. The software 250 may comprise one or more software modules and can be in the form of a computer program product that is software stored in a memory medium. The apparatus 200 further comprises a camera unit 260 and a viewfinder 270, each coupled to the host processor 210. The camera unit 260 and the processor 210 are connected via a camera interface 280. The camera unit is configured to operate with or without the device 125, i.e. either as a conventional digital camera or as a light field photography camera. In an example embodiment, the camera unit further comprises an arrangement for attaching the device 125 to the camera unit.
The term host processor refers to a processor in the apparatus 200 as distinct from one or more processors in the camera unit 260, referred to as camera processor(s) 330 in Fig. 3. Depending on implementation, different example embodiments share the processing of image and/or light field information and the control of the camera unit 260 differently between the camera unit and one or more processors outside the camera unit. Also, the processing is performed on the fly in an example embodiment and with buffering in another example embodiment. It is also possible that a given amount of images or image information is processed on the fly and thereafter a buffered operation mode is used, as in one example embodiment.
It shall be understood that any coupling in this document refers to functional or operational coupling; there may be intervening components or circuitries in between coupled elements unless expressly otherwise described.
The communication interface module 220 is configured to provide local communications over one or more local links. The links may be wired and/or wireless links. The communication interface 220 may further or alternatively implement telecommunication links suited for establishing links with other users or for data transfer, e.g. using the Internet. Such telecommunication links may be links using any of: wireless local area network links, Bluetooth, ultra-wideband, cellular or satellite communication links. The communication interface 220 may be integrated into the apparatus 200 or into an adapter, such as a card that may be inserted into a suitable slot or port of the apparatus 200. While Fig. 2 shows one communication interface 220, the apparatus may comprise a plurality of communication interfaces 220. The host processor 210 is, for instance, a central processing unit (CPU), a microprocessor, a digital signal processor (DSP), a graphics processing unit, an application specific integrated circuit (ASIC), a field programmable gate array, a microcontroller or a combination of such elements. Figure 2 shows one host processor 210, but the apparatus 200 may comprise a plurality of host processors.
As mentioned in the foregoing, the memory 240 may comprise volatile and non-volatile memory, such as a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), a random-access memory (RAM), a flash memory, a data disk, an optical storage, a magnetic storage, or a smart card. In some example embodiments, only volatile or non-volatile memory is present in the apparatus 200. Moreover, in some example embodiments, the apparatus comprises a plurality of memories. In some example embodiments, various elements are integrated. For instance, the memory 240 can be constructed as a part of the apparatus 200 or inserted into a slot or a port. Further still, the memory 240 may serve the sole purpose of storing data, or it may be constructed as a part of an apparatus serving other purposes, such as processing data. Similar options are conceivable also for various other elements.
A skilled person appreciates that in addition to the elements shown in Fig. 2, the apparatus 200 may comprise other elements, such as microphones, displays, as well as additional circuitry such as further input/output (I/O) circuitries, memory chips, application-specific integrated circuits (ASIC), and processing circuitry for specific purposes such as source coding/decoding circuitry, channel coding/decoding circuitry, or ciphering/deciphering circuitry. Additionally, the apparatus 200 may comprise a housing and a disposable or rechargeable battery (not shown) for powering the apparatus if an external power supply is not available.

It is also useful to realize that the term apparatus is used in this document with varying scope. In some of the broader claims and examples, the apparatus may refer to only a subset of the features presented in Fig. 2 or even be implemented without any one of the features of Fig. 2. In an example embodiment, the term apparatus refers to the processor 210, an input of the processor 210 configured to receive information from the camera unit and an output of the processor 210 configured to provide information to the viewfinder. In one example embodiment, the apparatus refers to a device that receives image information from the image sensor via a first input and produces sub-images to a second input of an image processor, which image processor is any circuitry that makes use of the produced sub-images. For instance, the image processor may comprise the processor 210 and the device in question may comprise the camera processor 330 and the camera interface 280 shown in Fig. 3.

Fig. 3 shows a block diagram of a camera unit 260 of an example embodiment. The camera unit 260 comprises optics such as an objective 310, an image sensor 320, a camera processor 330, and a memory 340 comprising data 344 and software 342 with which the camera processor 330 can manage operations of the camera unit 260. The camera processor 330 operates as an image and light field information processing circuitry of an example embodiment. An input/output or camera interface 280 is also provided to enable exchange of information between the camera unit 260 and the host processor 210. Furthermore, in an example embodiment, the camera unit has a light sensitive film medium instead of an image sensor 320. In an example embodiment, the software 342 stored in the memory comprises applications or programs or instructions for operating the camera unit in a conventional camera mode and in a light field photography mode using the device 125, attached to or integrated with the objective 310. In an example embodiment, the data 344 stored in the memory 340 comprises parameters for use in conventional and light field photography.
The image sensor 320 is, for instance, a charge-coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) unit. In the case of a CMOS unit, the image sensor 320 can also contain built-in analog-to-digital conversion implemented on a common silicon chip with the image sensor 320. In an alternative example embodiment, a separate analog-to-digital (A/D) conversion is provided between the image sensor 320 and the camera processor 330. In addition to the conventional image processing and the calculations or operations needed in light field recording, the camera processor 330 takes care in example embodiments of one or more of the following functions: pixel color interpolation; white balance correction; edge enhancement; anti-aliasing of images; vignetting correction; combining of subsequent images for high dynamic range imaging; Bayer reconstruction filtering; chromatic aberration correction; dust effect compensation; image stabilization.
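The description lists these functions without detailing them. As a hedged illustration only, one simple form of white balance correction that such a camera processor might apply is the gray-world method sketched below in Python; the function name and the gray-world heuristic are assumptions for illustration, not the patent's method.

```python
import numpy as np

def gray_world_white_balance(rgb):
    """Illustrative white balance: scale each channel so the image average is neutral gray.

    rgb -- H x W x 3 uint8 array in RGB order.
    """
    img = rgb.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)   # per-channel average
    gains = channel_means.mean() / channel_means      # push the averages toward gray
    return np.clip(img * gains, 0, 255).astype(np.uint8)
```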
In an example embodiment, the apparatus 200 further comprises a user interface (UI) 230. The user interface comprises one or more elements with which the user operates the apparatus 200 and the camera unit 260. Said elements comprise for example a shutter button, menu buttons and a touch screen. The shutter button and the menu buttons may be hardware buttons or, for example, buttons displayed on a touch screen.
In a further example embodiment, the apparatus 200 or the camera unit 260 comprises an image stabilizer (not shown). In an example embodiment the image stabilizer is an optical image stabilizer configured to move a lens or several lenses. Alternatively, the image stabilizer is configured to move the image sensor 320 or a mirror. In a further example embodiment the image stabilizer is implemented with a software image stabilization method. It is also possible to use more than one image stabilization technique, and in one example embodiment two or more of the mentioned image stabilization techniques are combined. A skilled person appreciates that in a further example embodiment, the apparatus 200 and/or the camera unit 260 comprises further elements not shown in the figure.
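The software image stabilization method is not specified. As a hedged illustration of one common building block of such methods, the frame-to-frame translation can be estimated by phase correlation, sketched below with NumPy; the function name and the integer-pixel simplification are assumptions, not the patent's method.

```python
import numpy as np

def estimate_shift(reference, frame):
    """Estimate the integer-pixel translation of `frame` relative to `reference`
    using phase correlation; a software stabilizer could then translate the
    frame by the negative of this shift to align it with the reference."""
    f_ref = np.fft.fft2(reference.astype(np.float64))
    f_frm = np.fft.fft2(frame.astype(np.float64))
    cross_power = f_frm * np.conj(f_ref)
    cross_power /= np.abs(cross_power) + 1e-12        # keep phase information only
    corr = np.fft.ifft2(cross_power).real
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
    dims = np.array(corr.shape, dtype=float)
    peak[peak > dims / 2] -= dims[peak > dims / 2]    # undo FFT wrap-around
    return peak                                       # (row shift, column shift)
```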
Fig. 3 further shows a light field photography module 400, suited for operating as the device 125 and hereinafter described in further detail with reference to Figs. 4 and 5. The module is configured to be placed in front of the camera module 300, or camera optics, when light field information is to be recorded; for example, the light field photography module is comprised in a casing that can be attached in front of the objective of a camera unit. The light field photography module 400 provides a conventional digital camera unit with light field recording capability without any changes to the camera unit itself. In an example embodiment, the light field photography module is configured to be attached to any conventional electronic device with a camera unit and configured to adapt to the camera unit to which it is attached either automatically or by manual adjustment. In a further example embodiment, the apparatus 200 and/or the camera unit is configured to automatically recognize the light field photography module 400, e.g. through mechanical, electric, optical or wireless means such as Bluetooth or near field communication (NFC), or by optically detecting lens markers. In a further example embodiment, the light field photography module 400 in front of the camera module 300 is detected with image processing methods using a viewfinder system of the camera.
Fig. 4 shows a schematic representation of an apparatus according to an example embodiment with a camera unit of an example embodiment. Fig. 4 shows schematically the camera optics 300 comprising a main camera lens 410 and the image sensor 320, and an apparatus according to an embodiment of the invention, i.e. a light field photography module 400, configured to be placed in front of the camera module when light field information is to be recorded. The light field photography module comprises a main lens, or first lens, 430 and an array of lenslets, or microlenses, 440. The light field photography module further comprises a relay lens package 445 comprising, in an example embodiment, a field lens, or second lens, 450. The array of lenslets 440 comprises a plurality of convex lenslets configured to form an intermediate image and is positioned optically between the main lens 430 and the field lens 450. The convex lenslets form a real intermediate image, i.e. an image that is formed as the light rays converge, on the intermediate image plane on the field lens 450. When the light field photography module is used, the camera optics 300 are set to light field capture mode, e.g. macro mode, in such a way that the main camera lens 410 is focused on the intermediate image plane formed on the field lens 450. Fig. 4 further shows schematically the route of light rays originating at the point source 420. The main lens 430 of the light field photography module 400 with the lenslet array 440 is configured to form a light field image on the intermediate image plane on the field lens 450. The intermediate image is relayed to the sensor 320 by the field lens 450 and the main camera lens 410. In an example embodiment, the light field photography module 400 is configured to make the light rays entering the aperture of the camera module 300 parallel. A skilled person appreciates that if a user of an apparatus with a camera wishes to capture a conventional digital image, the light field photography module 400 is not attached or is not used, and the camera unit functions as a conventional digital camera. Accordingly, the image quality of conventional imaging also remains undisturbed.
Fig. 5 shows a schematic representation of an apparatus according to an example embodiment with a camera unit of an example embodiment. Fig. 5 shows schematically the camera optics 300 comprising a main camera lens 410 and the image sensor 320, and an apparatus according to an embodiment of the invention, i.e. a light field photography module 400, configured to be placed in front of the camera module when light field information is to be recorded. The light field photography module comprises a main lens, or first lens, 430 and an array of lenslets 440. The light field photography module further comprises a relay lens package 445 comprising, in an example embodiment, a field lens, or second lens, 450 and a front lens, or third lens, 460. The front lens 460 is configured to expand the field of view of the system formed by the light field photography module 400 and the camera optics 300. The array of lenslets 440 comprises a plurality of convex lenslets configured to form an intermediate image and is positioned optically between the main lens 430 and the field lens 450. The convex lenslets form a real intermediate image on the intermediate image plane on the field lens 450. When the light field photography module is used, the camera optics 300 are set to light field capture mode, e.g. macro mode, in such a way that the main camera lens 410 is focused on the intermediate image plane formed on the field lens 450. Fig. 5 further shows schematically the route of light rays originating at the point source 420. The main lens 430 of the light field photography module 400 with the lenslet array 440 is configured to form a light field image on the intermediate image plane on the field lens 450. The intermediate image is relayed to the sensor 320 by the field lens 450, the front lens 460 and the main camera lens 410. In an example embodiment, the light field photography module 400 is configured to make the light rays entering the aperture of the camera module 300 parallel.

The distance D between the main lens 430 and the array of lenslets 440, the distance B between the array of lenslets 440 and the field lens 450 of the relay lens package 445, and the distance f between the field lens 450 of the relay lens package 445 and the main camera lens 410, or the combination of the front lens 460 of the relay lens package and the main camera lens 410, are in an example embodiment chosen in accordance with the specifications desired and the specifications of the camera module 300. In order to record the light field information, each object point, i.e. each ray of light emanating from an object, is projected by at least 2x2 lenslets. Based on this requirement, the distances between the lenses and the focal lengths as well as the apertures of the lenses are calculated, bearing in mind that the optics 300 of the camera unit 260 are as a rule not changed, as the light field photography module 400 is used as an add-on to the conventional camera. In an example embodiment, for a camera unit having a focal length of 8 mm and an aperture of 3.3 mm, the focal length of the first lens 430 is 20 mm and the aperture of the first lens 430 is 15 mm. Furthermore, the focal length of the lenslets of the array of lenslets 440 is 1 mm and the pitch thereof is 0.5 mm. Furthermore, the distance D between the first lens 430 and the array of lenslets is 19 mm, and the distance B between the second lens 450 of the relay lens package 445 and the array of lenslets is 0.55 mm.
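The quoted example dimensions can be sanity-checked with the thin-lens equation 1/f = 1/d_o + 1/d_i. The Python sketch below is an illustrative check only, assuming ideal thin lenses and a scene point roughly 2 m from the module; it is not part of the disclosed design procedure.

```python
def image_distance(f_mm, object_distance_mm):
    """Thin-lens image distance from 1/f = 1/d_o + 1/d_i (all distances in mm).

    A negative object distance denotes a virtual object, i.e. rays that are
    already converging towards a point behind the lens.
    """
    return 1.0 / (1.0 / f_mm - 1.0 / object_distance_mm)

# First lens of the module: f = 20 mm; a scene point assumed ~2 m away
# images about 20.2 mm behind it.
d_img_first = image_distance(20.0, 2000.0)     # ~20.2 mm

# The lenslet array sits D = 19 mm behind the first lens, so each lenslet
# sees a virtual object roughly 1.2 mm behind itself.
d_virtual = -(d_img_first - 19.0)              # ~-1.2 mm

# With a lenslet focal length of 1 mm, the real intermediate image forms
# about 0.55 mm behind the lenslet array, consistent with the quoted
# distance B = 0.55 mm to the field lens.
d_intermediate = image_distance(1.0, d_virtual)
print(round(d_img_first, 2), round(d_virtual, 2), round(d_intermediate, 2))
```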
It is appreciated that Figs. 4 and 5 illustrate the optical components as ideal components, but the depicted components, in an example embodiment, consist of several components such as doublet or triplet lenses, i.e. the components comprise several lens elements. Furthermore, the light field photography module 400 may comprise further elements (not shown), such as further optical components or electronic components, e.g. a processor or processors. In a further example embodiment, the elements of the light field photography module comprise switchable optics, i.e. optical elements that are controlled by electric signals, so as to accommodate different imaging situations and save space and cost.

Fig. 6 shows a schematic representation of an apparatus according to an example embodiment with a camera unit of an example embodiment. For illustrating light rays emanating from a point source 420 being projected onto multiple locations of the image sensor 320, the camera main lens 410, a combination 540 of the lenslet array 440 and the relay lens package 445 (not shown), and the main lens 430 are depicted. As the camera optics, i.e. the camera main lens 410, or the combination of the camera main lens 410 and the front lens 460 of the relay lens package (not shown) of the light field photography module, is used as a macro lens and focused on the image plane formed by the field lens 450, the image sensor 320 sees the image formed by the lenslets on the field lens 450 (not shown). An example of the image seen by the image sensor 320 is shown in Fig. 7. The image shows each object, i.e. each point source of light, portrayed through at least three lenslets of the array of lenslets 440 and relayed onto the sensor. Accordingly, this primary, or first, image seen by the sensor contains the light field information and allows different images to be reconstructed through image processing.
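The description does not spell out how the light field information is retrieved from such a primary image. One common approach for lenslet-based captures, given here only as a hedged sketch assuming an idealized, rectified primary image with a whole number of sensor pixels behind each lenslet, is to rearrange the pixels into sub-aperture views, one view per pixel position under the lenslets; the function name and array layout are illustrative assumptions.

```python
import numpy as np

def to_subaperture_views(primary, lenslet_px):
    """Rearrange a lenslet ("primary") image into sub-aperture views.

    primary    -- 2-D grayscale array whose lenslet images lie on an aligned
                  grid of lenslet_px x lenslet_px pixel cells
    lenslet_px -- number of sensor pixels behind each lenslet in each direction
    Returns an array of shape (lenslet_px, lenslet_px, rows, cols) where
    views[u, v] is the scene as seen through pixel (u, v) of every lenslet.
    """
    h, w = primary.shape
    rows, cols = h // lenslet_px, w // lenslet_px
    cropped = primary[: rows * lenslet_px, : cols * lenslet_px]
    grid = cropped.reshape(rows, lenslet_px, cols, lenslet_px)
    return grid.transpose(1, 3, 0, 2)   # index order: (u, v, row, col)
```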
Figs. 8 and 9 show example secondary images created from a primary image captured with the apparatuses according to an example embodiment. Fig. 8 shows an example secondary image in which the background of the image is in focus; Fig. 9 shows an example secondary image in which the foreground is in focus.
Fig. 10 illustrates a flow chart of a method according to an example embodiment. At step 1010 a decision is made, typically by a user of the apparatus, whether a conventional digital image is to be captured or whether a light field image, i.e. a primary image comprising light field information, is to be captured. If a conventional image is to be captured, the shutter is released at 1045 and a final image is captured at 1040.
When the user wishes to use light field imaging, a light field module according to an example embodiment is attached in front of the camera unit at 1015. The attachment of the light field module 400 is effected by conventional means, e.g. by magnetic or slip-on attachment, or by a lens mount of the camera unit. At step 1020 the camera unit, or the settings thereof, is set to light field capture mode, e.g. macro mode, i.e. focused on the intermediate image plane formed at the field lens as hereinbefore described. The shutter is released at 1045 and a primary image, i.e. an image from which the light field information is retrieved, is captured. At step 1035 the primary image is processed to obtain at 1040 a secondary, or final, image with the desired parameters. The image processing is, in an example embodiment, carried out using, for example, the camera processor 330, or alternatively or in addition using external processing, such as cloud-based processing. In an example embodiment, the image processing is carried out automatically, i.e. without user interaction, in accordance with predefined parameters, so that for example the secondary image is always a full-focus image. In a further example embodiment, the image processing is based on user input, i.e. the user chooses the desired parameters for the image through a user interface.
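The refocusing computation itself is left open by the description. One standard technique that such post-capture processing could use is shift-and-add synthetic refocusing over sub-aperture views, sketched below under the assumption that views have been extracted as in the earlier to_subaperture_views sketch; the function names and the slope parameter are illustrative, not terminology from the patent.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift

def refocus(views, slope):
    """Shift-and-add synthetic refocusing.

    views -- array of shape (U, V, H, W) of sub-aperture views
    slope -- pixels of shift per unit of viewpoint offset; sweeping this value
             moves the synthetic focal plane through the scene
    Returns a 2-D refocused image as float64.
    """
    U, V, H, W = views.shape
    cu, cv = (U - 1) / 2.0, (V - 1) / 2.0
    acc = np.zeros((H, W), dtype=np.float64)
    for u in range(U):
        for v in range(V):
            # Shift each view in proportion to its offset from the central view,
            # then average: points at the chosen depth line up and appear sharp.
            acc += nd_shift(views[u, v].astype(np.float64),
                            (slope * (u - cu), slope * (v - cv)),
                            order=1, mode="nearest")
    return acc / (U * V)
```

Sweeping slope over a range of values moves the synthetic focal plane through the scene, which is the kind of adjustment illustrated by the secondary images of Figs. 8 and 9.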
Some use cases relating to given example embodiments of light field imaging are presented in the following. In a first use case, a user of a device 110 with a camera unit wishes to capture an image with objects 150, 160, 170 at different distances. Using the light field photography add-on device 125, i.e. the light field photography module 400, the device is configured to record light field information from the field of view 130 of the device 110. As the light field information is retrievable from a primary image captured using the device 125, for example the focus of the captured image can be adjusted after capture, and in an example embodiment each object 150, 160, 170 in the field of view of the device 110 is in focus in a final image.
In a second use case, a user wishes to capture a video with the camera unit of the device 110. The video is captured using the light field add-on and can be refocused later with video processing for enhanced clarity and artistic effect, as the light field information is available in each single frame.
In a third use case, a user wishes to produce parallax, or depth-of-field, effects on a captured image. As the light field information is retrievable from the primary image, such effects are created with image processing from a single capture.
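As a hedged sketch of how such an effect might be derived from the sub-aperture views (again assuming the layout produced by the earlier to_subaperture_views sketch), selecting a horizontal sweep of viewpoints already yields a simple parallax sequence from a single capture; the function name is illustrative.

```python
def parallax_sweep(views, row=None):
    """Return a list of frames seen from horizontally shifted viewpoints.

    views -- array of shape (U, V, H, W) of sub-aperture views
    row   -- which row of viewpoints to sweep; defaults to the middle row
    Played in order, the frames show the scene from slightly different
    viewpoints, i.e. a parallax effect from a single capture.
    """
    if row is None:
        row = views.shape[0] // 2
    return [views[row, v] for v in range(views.shape[1])]
```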
In a fourth use case, the light field information retrievable from the primary image is used to create a three-dimensional image without having to resort to use of several cameras.
Without in any way limiting the scope, interpretation, or application of the claims appearing below, a technical effect of one or more of the example embodiments disclosed herein is to enable both light field and conventional imaging without compromising the conventional image quality. Another technical effect of one or more of the example embodiments disclosed herein is to easily enable light field imaging with any camera unit. Another technical effect of one or more of the example embodiments disclosed herein is to enhance the user experience by providing light field information enabling vast possibilities of post processing. Still a further technical effect is to provide a simple and cost effective light field camera by using the light field imaging accessory.
If desired, the different functions discussed herein may be performed in a different order and/or concurrently with each other. Furthermore, if desired, one or more of the above-described functions may be optional or may be combined.
Although various aspects of the invention are set out in the independent claims, other aspects of the invention comprise other combinations of features from the described embodiments and/or the dependent claims with the features of the independent claims, and not solely the combinations explicitly set out in the claims.
It is also noted herein that while example embodiments of the invention have been described hereinbefore, these descriptions should not be viewed in a limiting sense. Rather, there are several variations and modifications which may be made without departing from the scope of the present invention as defined in the appended claims.

Claims

1. An apparatus, comprising:
a first lens;
a relay lens package comprising at least a second lens configured to form an intermediate image plane; and
an array of convex lenslets positioned optically between the first lens and the relay lens package; wherein
each lenslet of the array of lenslets is configured to form on the intermediate image plane a real image of any object in the field of view of the first lens.
2. The apparatus of claim 1, wherein the relay lens package further comprises a third lens positioned optically after the second lens.
3. The apparatus of claim 1 or 2, wherein the apparatus is further configured to be attached in front of a lens of a camera unit comprising a camera lens and an image sensor.
4. The apparatus of claim 2 or 3, wherein the third lens is configured together with the camera lens to project the real image formed by the array of lenslets onto the image sensor.
5. The apparatus of any preceding claim, wherein each lens comprises a lens package comprising a combination of a plurality of lens elements.
6. The apparatus of any preceding claim, wherein each lens and the array of lenslets is in such a position that a light ray emanating at a point in the field of view of the first lens is portrayed by at least 2x2 lenslets of the array of lenslets.
7. The apparatus of any preceding claim, wherein at least one of the lenses and/or the array of lenslets comprises switchable optics elements.
8. An electronic device, comprising:
a housing;
a display;
a memory;
a processor; and
a camera unit comprising an image sensor and a camera lens; wherein the processor is configured to cause the image sensor to record a primary image comprising light field information from a real intermediate image received on the sensor via the camera lens from an intermediate image plane, the intermediate image being formed by an array of convex lenslets; and
the processor is configured to cause retrieving light field information from a primary image obtained from the image sensor.
9. The electronic device of claim 8, wherein the processor is further configured to cause forming a secondary image from the primary image.
10. A system comprising
the apparatus of any of claims 1-7; and
the electronic device of any of claims 8-9.
11. A method comprising
relaying with a first lens light rays from an object into an array of convex lenslets;
forming with the array of convex lenslets a real image on an intermediate image plane, the intermediate image plane being formed by a second lens of a relay lens package; wherein
the array of convex lenslets is positioned between the first lens and the relay lens package.
12. The method of claim 11, further comprising projecting the intermediate image onto an image sensor via a third lens of the relay lens package.
13. The method of claim 11 or 12, further comprising adjusting at least one of the lenses and/or the array of lenslets with switchable optics elements.
14. A method comprising
setting a camera unit of an electronic device to light field capture mode, so that the camera unit is focused on an intermediate image plane;
projecting with a camera lens an intermediate image formed by an array of convex lenslets on the intermediate image plane onto an image sensor;
recording a primary image comprising light field information projected on the sensor; and
retrieving light field information from the primary image obtained from the image sensor.
15. The method of claim 14, further comprising forming a secondary image from the primary image.
16. A method comprising:
receiving user input on using light field imaging;
in response to the user input, attaching, in front of a lens of a camera unit of an electronic device, an apparatus comprising a first lens, a relay lens package comprising at least a second lens, and an array of convex lenslets positioned between the first lens and the relay lens package;
setting the camera unit to light field capture mode, so that the camera unit is focused on an intermediate image plane;
relaying with a first lens light rays from an object into an array of convex lenslets;
forming with the array of convex lenslets a real image on an intermediate image plane, the intermediate image plane being formed by the second lens;
projecting with a camera lens the image formed on the intermediate image plane onto an image sensor;
recording a primary image comprising light field information projected on the sensor; and
retrieving light field information from the primary image obtained from the image sensor.
17. The method of claim 16, further comprising forming a secondary image from the primary image.
18. A computer program, comprising:
code for performing a method of any of claims 11 to 17,
when the computer program is run on a processor.
19. A memory medium comprising the computer program of claim 18.
PCT/FI2014/050353 2013-05-27 2014-05-13 Light field imaging WO2014191613A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN2304CH2013 2013-05-27
IN2304/CHE/2013 2013-05-27

Publications (1)

Publication Number Publication Date
WO2014191613A1 (en) 2014-12-04

Family

ID=51988068

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2014/050353 WO2014191613A1 (en) 2013-05-27 2014-05-13 Light field imaging

Country Status (1)

Country Link
WO (1) WO2014191613A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8290358B1 (en) * 2007-06-25 2012-10-16 Adobe Systems Incorporated Methods and apparatus for light-field imaging
US20090262182A1 (en) * 2007-10-15 2009-10-22 The University Of Connecticut Three-dimensional imaging apparatus
US20100128145A1 (en) * 2008-11-25 2010-05-27 Colvin Pitts System of and Method for Video Refocusing
US20130128087A1 (en) * 2010-08-27 2013-05-23 Todor G. Georgiev Methods and Apparatus for Super-Resolution in Integral Photography
WO2012138324A1 (en) * 2011-04-04 2012-10-11 Pixar Super light-field lens with focus control and non- spherical lenslet arrays

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016163279A (en) * 2015-03-04 2016-09-05 キヤノン株式会社 Image processing device, image processing method and imaging apparatus
CN108801752A (en) * 2018-08-02 2018-11-13 佛山科学技术学院 A kind of sample loading attachment and sample driving device
CN108801752B (en) * 2018-08-02 2023-11-28 佛山科学技术学院 Sample loading device and sample driving device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 14804914; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 14804914; Country of ref document: EP; Kind code of ref document: A1