WO2015164025A1 - Generation and use of a 3d radon image - Google Patents

Generation and use of a 3D Radon image

Info

Publication number
WO2015164025A1
WO2015164025A1 (PCT/US2015/022804)
Authority
WO
WIPO (PCT)
Prior art keywords
image
radon
plenoptic
plane
microlens
Prior art date
Application number
PCT/US2015/022804
Other languages
French (fr)
Inventor
Todor GEORGIEV
Salil TAMBE
Original Assignee
Qualcomm Incorporated
Priority date
Filing date
Publication date
Application filed by Qualcomm Incorporated filed Critical Qualcomm Incorporated
Priority to CN201580020920.2A priority Critical patent/CN106233329A/en
Priority to EP15721369.5A priority patent/EP3134868B1/en
Priority to BR112016024707A priority patent/BR112016024707A2/en
Priority to KR1020167031948A priority patent/KR20170005009A/en
Priority to JP2016563926A priority patent/JP2017520944A/en
Publication of WO2015164025A1 publication Critical patent/WO2015164025A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/271Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/128Adjusting depth or disparity
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/21Indexing scheme for image data processing or generation, in general involving computational photography
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10052Images from lightfield camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20048Transform domain processing
    • G06T2207/20061Hough transform
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/111Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074Stereoscopic image analysis
    • H04N2013/0081Depth or disparity estimation from stereoscopic image signals

Definitions

  • the systems and methods disclosed herein are directed to image data, and, more particularly, to capture of and rendering from plenoptic data.
  • Plenoptic photography comes closer to rendering the full variety of view angles and multiple focal points offered by the direct observation of objects by capturing the light field, also referred to as radiance, of a target image scene.
  • a microlens array can be mounted in front of an image sensor, effectively capturing many images of a target scene, with each image capturing a portion of the target scene from a slightly different viewpoint. As such, multiple light rays are captured from varying viewpoints for each pixel of a synthesized final image of the scene.
  • This provides raw sensor data containing four-dimensional radiance data about each pixel point in a potential final image of the target scene: two dimensions of the spatial position of a light ray and two dimensions of the orientation of a light ray at the pixel.
  • this data can be synthesized into a final 2D or 3D image from any of the vantage points or focus points represented by the radiance data, extending the capabilities of digital photography and affording greater flexibility for photographers to alter vantage point or focus after image capture.
  • Plenoptic camera technology offers several imaging capabilities that may be desirable in mobile devices, including but not limited to full 3D imaging, refocusability, and High Dynamic Range (HDR) imaging.
  • processing the plenoptic sensor data is computationally expensive, typically requiring parallel processing on central processing units (CPUs) or intensive processing on a graphics processing unit (GPU).
  • plenoptic photography requires prohibitively large amounts of data storage and processing for implementation in mobile photography, given the relatively limited GPU memory of conventional mobile devices.
  • aspects of the invention relate to techniques for efficiently recording appropriately transformed captured plenoptic image data (for example in a plenoptic camera) and a technique for rendering images from such transformed captured plenoptic data, referred to herein as "Radon photography.”
  • the structure of captured plenoptic data can be changed by using a Radon transform so that the dimensionality is reduced from four dimensions to three dimensions, thereby generating image data (referred to herein as a "Radon image”) having a smaller size than traditional plenoptic images.
  • a final image can be rendered from the Radon image using inverse Radon transform or other computed tomography techniques to recover the radiance or luminous density of the plenoptic data.
  • this can reduce the amount of data by about an order of magnitude, making it possible to perform rendering using much less GPU memory than existing plenoptic rendering techniques, for example on a mobile device.
  • one aspect relates to a system in an electronic device for rendering a final image, the system comprising a plenoptic camera including a microlens array and an image sensor, the plenoptic camera configured to capture plenoptic image data of an image scene; and one or more processors in data communication with the plenoptic camera and configured to at least receive the captured plenoptic image data, the captured plenoptic image data including a plurality of microimages each formed by one of a plurality of microlenses in the microlens array focusing light from the image scene onto the image sensor, determine a plurality of lines of integration located within a plane bisecting the microlens array, each of the plurality of lines of integration corresponding to a plane extending through three dimensional space of the image scene, and generate a Radon image based at least partly on a plurality of pixels of the plenoptic image data corresponding to the plurality of lines of integration.
  • Another aspect relates to a system in an electronic device for rendering a final image, the system comprising a first module configured to at least access data representing a Radon image, the Radon image representing, for each sampled plane of a plurality of sampled planes of an image scene, a sum of light energy in the sampled plane; a second module configured to at least determine a luminous density of the image scene based at least partly on integrating the sum of light energy from each sampled plane of the plurality of sampled planes of the Radon image; and a third module configured to at least project the luminous density onto an image plane to produce the final image.
  • Another aspect relates to a method for compressing plenoptic image data, the method comprising receiving at least a portion of data representing a plenoptic image of a three-dimensional image space from a plenoptic camera having a microlens array and an image sensor; identifying a plane of integration intersecting a microlens plane, the microlens plane bisecting the microlens array; determining an image line intersecting the plane of integration and an image plane located a first distance from the microlens plane; determining a microlens line intersecting the plane of integration and the microlens plane; mapping the image line to a microimage based at least partly on intersection of microlens line and the microlens, the microimage formed on the image sensor by a microlens of the microlens array; and summing pixel values for each of a plurality of pixels in the microimage, the plurality of pixels located along the image line mapped to the microimage.
  • Another aspect relates to a non-transitory, computer-readable medium storing instructions that, when executed, cause one or more computing devices to perform operations comprising receiving Radon image data of an image scene, the Radon image data representing a luminous density of the image scene as summed values of light energy in each of a plurality of planes of the image scene; computing an intermediate function of the Radon image using a back-projection of the Radon image; recovering the luminous density of the image scene based at least partly on the intermediate function of the Radon image; and projecting the luminous density onto an image plane to produce a dynamically refocusable rendered image of the image scene.
  • FIG. 1A illustrates an embodiment of a conventional plenoptic camera.
  • FIG. 1B illustrates an example focused plenoptic camera (Keplerian telescopic case), according to some embodiments.
  • FIG. 1C illustrates an example focused plenoptic camera (Galilean telescopic case), according to some embodiments.
  • FIG. 1D illustrates an example thin plenoptic camera based on microspheres, according to some embodiments.
  • FIG. 2 illustrates a high-level schematic block diagram of an embodiment of an image capture device having Radon photography capabilities.
  • FIG. 3 illustrates an embodiment of a Radon photography process.
  • FIG. 4A illustrates an embodiment of a technique of using plenoptic camera planes for Radon image generation.
  • FIG. 4B illustrates an embodiment of a set of pixels or microimages that can be used in Radon image generation.
  • FIG. 4C illustrates an embodiment of a density of lines of integration that can be used in a method of 3D Radon image generation.
  • FIG. 5 illustrates an embodiment of a process for generating a Radon image from input light field data.
  • FIG. 6 illustrates an embodiment of a process for generating plenoptic image data from a Radon image.
  • Embodiments of the disclosure relate to systems and techniques for capturing plenoptic image data, processing the plenoptic image data to increase efficiency, and rendering from the processed plenoptic image data.
  • the Radon photography techniques described herein can change the structure of captured plenoptic data to define the radiance of a target image scene in terms of energy contained within two-dimensional planes rather than within one-dimensional rays, which can effectively reduce the amount of image data by about an order of magnitude. For example, this can be accomplished in some implementations by applying a Radon transform to the plenoptic image data, generating a Radon image.
  • a dynamically refocusable image can be rendered from the Radon image by applying the inverse Radon transform to recover the radiance of the image scene from the Radon image.
  • a Radon image can be approximately 1 megabyte (MB) of data while a typical plenoptic image can be approximately 50 MB of data. This can enable plenoptic imaging on mobile devices with limited GPU capacity, for example smart phones and tablet computers.
  • the Radon photography technique described herein can record captured plenoptic data using a transform related to the Radon transform to reduce the quantity of captured plenoptic data.
  • the information captured by plenoptic cameras may be referred to as the light field, the plenoptic function, or radiance.
  • a light field (which may also be referred to as radiance, luminous density, or the plenoptic function) is a four-dimensional record of all one-dimensional light rays in the three-dimensional space of the target image scene. Radiance describes both spatial and angular information, and is defined as density of energy per unit of area per unit of stereo angle (in radians).
  • plenoptic camera captures radiance in plenoptic images (also referred to as flat images, or flats).
  • plenoptic images may be digitally refocused, noise may be reduced, viewpoints may be changed, and other plenoptic effects may be achieved.
  • plenoptic cameras may also be referred to as light field cameras, and plenoptic images may also be referred to as light field images.
  • the four-dimensional radiance data of the image scene can be represented as an alternative plenoptic function depending on only three variables, which is the Radon image of the density of virtual light sources.
  • Radon photography techniques can capture or represent the energy density of a three-dimensional body in the image scene by cutting it with multiple thin virtual planes and integrating over each plane. The energy in each plane can be measured to construct the Radon image. Accordingly, the resulting Radon image can represent the summed values of energy over each plane between the image plane and the sensor plane, rather than traditional plenoptic image data, which represents the values of energy of each light ray between the image plane and the sensor plane.
  • The Radon image, being a three-dimensional representation of the four-dimensional radiance of the image scene, is therefore much more economical in terms of memory and thus faster to render as compared to previous plenoptic imaging methods.
  • the original three-dimensional luminous density of the scene can be recovered, for example, by performing an inverse Radon transform, and images from different views and/or having different depths of focus can be rendered from the luminous density.
  • the Radon photography technique can effectively integrate rays that fall within a plane and can evaluate the Radon transform over all planes passing through a point to perform back-projection. Back-projection can be the first step in deriving the inverse Radon transform at that point.
  • Specific lines of integration can be chosen to sample each microlens in the array with a sufficient step size between adjacent lines of integration to avoid sampling the same set of pixels, which would lead to redundant information.
  • the Radon photography technique can determine the luminous density in the scene from the Radon image. Projecting the luminous density onto an image plane can produce a final image, wherein the final image can be adjusted to have different views or different depths of focus of the captured image scene.
  • FIGS. 1A-1D illustrate various embodiments of plenoptic cameras that can be used to capture Radon image data according to the Radon photography technique.
  • the plenoptic imaging systems of FIGS. 1A-1D can be implemented in a variety of imaging applications, for example still photography, videography, stereoscopic photography, stereoscopic videography, multispectral photography, and multispectral videography.
  • Devices such as handheld cameras, tablet computers, mobile phones, gaming consoles, laptops, personal computers, augmented reality devices such as heads-up display systems, security cameras, and the like can incorporate the plenoptic imaging systems of FIGS. 1A-1D.
  • FIG. 1A illustrates a conventional plenoptic camera 100a.
  • a conventional plenoptic camera can include a main lens 105 and a microlens array 110 placed at distance f in front of a photo sensor 115.
  • a charge-coupled device (CCD) or a CMOS imaging sensor can be used as the photo sensor 115.
  • the microlenses 110 can have aperture d and focal length f, and are assumed to be equally spaced at interval d.
  • the main lens 105 can be focused at the plane formed by the center of the microlens array 110 ("microlens plane"), and the microlenses 110 can be focused at optical infinity (equivalently, at the main lens 105).
  • each "micro camera” can be focused at the main camera lens aperture, and not on the object being photographed.
  • Each microlens image can thus be completely defocused relative to that object, and represents only the angular distribution of the radiance. As such, these microimages can look blurry and do not represent a human-recognizable image. Since each microlens image can sample a given location depending on its position and spans the same angular range as the other microlens images, rendering an output image from a conventional plenoptic camera radiance image can be accomplished by integrating all of the pixels under each microlens. Integrating a fixed portion of the pixels under each microlens can generate an image of one certain view. In some embodiments, each microlens contributes to a single pixel in the final generated image.
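  • The rendering described above can be illustrated with a short sketch. The following Python snippet is illustrative only; the array layout, function name, and parameters are assumptions rather than part of the patent. It renders one view from a conventional plenoptic "flat" by averaging a fixed patch of pixels under each microlens, so each microlens contributes a single output pixel, and choosing a different offset selects a different view.

```python
import numpy as np

def render_view(flat, mlens_px, view_offset=(0, 0), patch=3):
    """Render a single view from a conventional plenoptic image ("flat").

    Assumes the microimages tile the sensor on a square grid of
    mlens_px x mlens_px pixels; the fixed offset inside each microimage
    selects the view, and each microlens yields one output pixel.
    """
    rows, cols = flat.shape[0] // mlens_px, flat.shape[1] // mlens_px
    out = np.zeros((rows, cols), dtype=np.float64)
    cy = mlens_px // 2 + view_offset[0]  # fixed row offset inside every microimage
    cx = mlens_px // 2 + view_offset[1]  # fixed column offset inside every microimage
    half = patch // 2
    for i in range(rows):
        for j in range(cols):
            micro = flat[i * mlens_px:(i + 1) * mlens_px,
                         j * mlens_px:(j + 1) * mlens_px]
            out[i, j] = micro[cy - half:cy + half + 1, cx - half:cx + half + 1].mean()
    return out
```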
  • FIGS. 1B and 1C illustrate example focused plenoptic cameras 100b and 100c, respectively, according to some embodiments.
  • the components shown in FIGS. 1B and 1C are not necessarily to scale relative to each other, nor are the distances between the components necessarily to scale, nor are the sizes of the components necessarily to scale.
  • the focused plenoptic cameras 100b, 100c may include at least a main lens 105, a microlens array 110, and a photo sensor 115.
  • the microlens array 110 of a focused plenoptic camera 100b, 100c can be focused on an image plane 120 of the main camera lens instead of at infinity.
  • each microlens can reimage the main lens image onto the photo sensor 115.
  • the microlenses 110 can form an array of true images of the main lens image as a relay system, thereby each forming a microimage on the photo sensor 115.
  • FIG. 1B illustrates an example of a Keplerian telescopic system 100b wherein the image plane 120 being imaged is in front of the microlenses 110.
  • the microlenses 110 can be described by the lens equation:
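  • The lens equation itself is not reproduced in this text. For the focused (relay) configuration described here it is presumably the standard thin-lens relation, shown below with a denoting the microlens-to-image-plane distance, b the microlens-to-sensor distance, and f the microlens focal length (these variable names are assumptions, not the patent's notation):

```latex
\frac{1}{a} + \frac{1}{b} = \frac{1}{f}, \qquad M = \frac{b}{a}
```

  • Here M is the magnification with which each microlens reimages the main-lens image onto the sensor.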
  • the spatial resolution of the radiance captured by the plenoptic camera is a function of the resolution of the microlens images and the amount of overlap in rendering, and not of the number of microlenses. This decoupling of resolution and number of microlenses distinguishes the focused plenoptic camera 100b, 100c from the conventional plenoptic camera 100a.
  • Another difference between the conventional plenoptic camera 100a and the focused plenoptic camera 100b, 100c is in the nature of the information that is captured by each microlens.
  • each microlens images one position in the scene, capturing all of the angular information there.
  • In the focused plenoptic camera 100b, 100c, different microlenses capture the same position; angular information is spread across microlenses.
  • the rendering algorithm can integrate across microlens images, rather than within a single microlens image. That is, assuming that the task is "imaging the image” that is in focus, the rendering algorithm integrates the points in the microlenses that correspond to the same position in the image by overlapping them at a fixed pitch.
  • FIG. 1D illustrates an example thin plenoptic camera system 100d based on microspheres, according to some embodiments.
  • This example thin plenoptic camera 100d may be similar to the focused plenoptic camera 100c illustrated in FIG. 1C; however, the microlens array is replaced by microspheres 130.
  • the microspheres 130 may be attached or fixed to the surface of the photo sensor 115 by any of several techniques; for example, a thin layer (e.g., a few nanometers thick) of a transparent adhesive material may be deposited on the pixel surface of the photo sensor 115, a layer of molten glass or a similar substance may be deposited on the pixel surface of the photo sensor 115, or the microspheres 130 may be embedded in a substance that is very flat on top while covering the microspheres 130.
  • a thin plenoptic camera as illustrated in FIG. 1D may be approximately 5 mm (millimeters) thin, or even thinner, and thus suitable for use in thin mobile devices.
  • FIGS. 1A-1D are intended to illustrate example plenoptic camera systems 100a-100d that can carry out the Radon photography techniques described herein.
  • any plenoptic or light field camera capable of capturing the radiance of an image scene can implement Radon photography techniques in order to reduce the amount of captured image data.
  • a camera can directly capture a 3D Radon image instead of capturing 4D radiance data and generating the 3D Radon image from the 4D radiance data.
  • FIG. 2 illustrates a high-level schematic block diagram of an embodiment of an image capture device 200 having Radon photography capabilities, the device 200 having a set of components including an image processor 220 linked to a camera assembly 201.
  • the image processor 220 is also in communication with a working memory 265, memory 230, and device processor 255, which in turn is in communication with storage 270 and an optional electronic display 260.
  • Device 200 may be a cell phone, digital camera, tablet computer, personal digital assistant, or the like. There are many portable computing devices in which rendering from a reduced quantity of plenoptic image data such as is described herein would provide advantages. Device 200 may also be a stationary computing device or any device in which the Radon photography technique would be advantageous. A plurality of applications may be available to the user on device 200. These applications may include traditional photographic and video applications, high dynamic range imaging, panoramic photo and video, multispectral photo and video, stereoscopic imaging such as 3D images or 3D video, and plenoptic photo and video.
  • the image capture device 200 includes plenoptic camera assembly 201 for capturing external images.
  • the camera 201 can be any of the plenoptic camera configurations 100a-100d described above with respect to FIGS. 1A-1D, in some embodiments.
  • The illustrated camera 201 includes a main lens 205, microlens A 210a through microlens N 210n, and a single image sensor 215.
  • embodiments of camera 201 other than the example illustration can have any combination of some or all of the main lens 205, microlens array 210a-210n, and sensor 215 capable of capturing the radiance data of a target image scene.
  • N microlenses can be used, where N ≥ 2.
  • the plenoptic camera assembly 201 can have additional components, for example additional lens assemblies and corresponding additional image sensors for capture of stereoscopic or multispectral plenoptic image data.
  • device 200 can include additional camera assemblies, for example a traditional (non- plenoptic) camera assembly in addition to the plenoptic camera assembly 201.
  • the plenoptic camera assembly 201 can be coupled to the image processor 220 to transmit captured images to the image processor 220.
  • the image processor 220 may be configured to perform various processing operations on received image data comprising N microimages corresponding to the N microlenses in order to execute the Radon photography technique.
  • Processor 220 may be a general purpose processing unit or a processor specially designed for imaging applications. Examples of image processing operations include cropping, scaling (e.g., to a different resolution), image stitching, image format conversion, color interpolation, color processing, image filtering (e.g., spatial image filtering), lens artifact or defect correction, etc.
  • Processor 220 may, in some embodiments, comprise a plurality of processors.
  • Processor 220 may be one or more dedicated image signal processors (ISPs) or a software implementation of a processor.
  • the image processor 220 is connected to a memory 230 and a working memory 265.
  • the memory 230 stores capture control module 235, Radon photography module 240, and operating system 250.
  • the Radon photography module 240 includes sub-modules: Radon image generator 242, luminous density calculator 244, and flat rendering module 246.
  • the modules of the memory 230 include instructions that configure the image processor 220 or device processor 255 to perform various image processing and device management tasks.
  • Working memory 265 may be used by image processor 220 to store a working set of processor instructions contained in the modules of memory 230.
  • working memory 265 may also be used by image processor 220 to store dynamic data created during the operation of device 200.
  • the image processor 220 is configured by several modules stored in the memories.
  • the capture control module 235 may include instructions that configure the image processor 220 to adjust the focus position of plenoptic camera assembly 201.
  • Capture control module 235 may further include instructions that control the overall image capture functions of the device 200.
  • capture control module 235 may include instructions that call subroutines to configure the image processor 220 to capture raw plenoptic image data of a target image scene using the plenoptic camera assembly 201.
  • capture control module 235 may then call the Radon photography module 240 to reduce the size of the captured plenoptic image data and output the reduced size image data to the imaging processor 220.
  • capture control module 235 may then call the Radon photography module 240 to perform a rendering operation on the raw plenoptic data in order to output a flat, refocusable image to imaging processor 220.
  • Capture control module 235 may also call the Radon photography module 240 to perform a rendering operation on raw plenoptic image data in order to output a preview image of a scene to be captured, and to update the preview image at certain time intervals, when the scene in the raw image data changes, or when a user changes the focus of the preview image.
  • Radon photography module 240 can call sub-modules Radon image generator 242, luminous density calculator 244, and flat rendering module 246 to perform different portions of the plenoptic data processing and image rendering operations.
  • the Radon image generator 242 can include instructions that configure the image processor 220 to generate a Radon image, the Radon image being a relatively smaller quantity of data compared to the raw plenoptic image data.
  • Radon image generator 242 can include instructions that configure the image processor 220 to apply a Radon transform to plenoptic image data.
  • Some embodiments of the Radon image generator 242 can operate on stored plenoptic image data, while other embodiments of the Radon image generator 242 can change the structure of plenoptic image data as it is captured prior to storage.
  • the pixels or microimages of a photo sensor can be scanned in rows or lines corresponding to intersection lines with a plane of integration, and the values of the light rays incident on those pixels or microimages can be integrated, as will be discussed in more detail below.
  • the Radon photography module 240 can transmit the Radon image to the image processor 220 for storage in the storage module 270.
  • the Radon photography module 240 can transmit the Radon image to the luminous density calculator 244.
  • the luminous density calculator 244 can include instructions that configure the processor 220 to perform processing operations on the Radon image to generate plenoptic image data from the Radon image.
  • the luminous density calculator 244 can include instructions that configure the processor 220 to apply an inverse Radon transform to the Radon image to recover the original luminous density captured by the plenoptic camera assembly 201, or to recover an approximation of the original luminous density.
  • the flat rendering module 246 can include instructions that configure the image processor 220 to perform a rendering operation on the output of the luminous density calculator 244.
  • the flat rendering module 246 can include instructions that configure the image processor 220 to output an image (sometimes referred to as a "flat" in plenoptic photography) by projecting the luminous density onto an image plane, which can produce a flat at different viewpoints or different depths of focus.
  • the processor 220 can store the image or output the image for display to a user, wherein the user can dynamically refocus the image through a range of focus depths captured by the plenoptic image data, and can dynamically adjust a viewpoint of the image through a range of viewpoints captured by the plenoptic image data.
  • the flat rendering module 246 can include instructions that cause the image processor 220 to respond to the user commands and render an updated image.
  • the flat rendering module 246 can include instructions that configure the image processor 220 to output a three-dimensional or stereoscopic image and to update the image based on user input.
  • Operating system module 250 configures the image processor 220 to manage the working memory 265 and the processing resources of device 200.
  • operating system module 250 may include device drivers to manage hardware resources such as the camera assembly 201. Therefore, in some embodiments, instructions contained in the image processing modules discussed above may not interact with these hardware resources directly, but instead interact through standard subroutines or APIs located in operating system component 250. Instructions within operating system 250 may then interact directly with these hardware components.
  • Operating system module 250 may further configure the image processor 220 to share information with device processor 255.
  • Device processor 255 may be configured to control the display 260 to display the captured image, or a preview of the captured image, to a user. The display 260 may be external to the imaging device 200 or may be part of the imaging device 200.
  • the display 260 may also be configured to provide a view finder displaying a preview image for a user prior to capturing an image, or may be configured to display a captured image stored in memory or recently captured by the user.
  • the display 260 may comprise an LCD or LED screen, and may implement touch sensitive technologies.
  • Device processor 255 may write data to storage module 270, for example data representing captured images. While storage module 270 is represented graphically as a traditional disk device, those with skill in the art would understand that the storage module 270 may be configured as any storage media device.
  • the storage module 270 may include a disk drive, such as a floppy disk drive, hard disk drive, optical disk drive or magneto-optical disk drive, or a solid state memory such as a FLASH memory, RAM, ROM, and/or EEPROM.
  • the storage module 270 can also include multiple memory units, and any one of the memory units may be configured to be within the image capture device 200, or may be external to the image capture device 200.
  • the storage module 270 may include a ROM memory containing system program instructions stored within the image capture device 200.
  • the storage module 270 may also include memory cards or high speed memories configured to store captured images which may be removable from the camera.
  • the storage module 270 can also be external to device 200, and in one example device 200 may wirelessly transmit data to the storage module 270, for example over a network connection.
  • Figure 2 depicts a device having separate components, including a processor, imaging sensor, and memory components.
  • the memory components may be combined with processor components, for example to save cost and/or to improve performance.
  • Figure 2 illustrates two memory components, including memory component 230 comprising several modules and a separate memory 265 comprising a working memory
  • a design may utilize ROM or static RAM memory for the storage of processor instructions implementing the modules contained in memory 230.
  • the processor instructions may be loaded into RAM to facilitate execution by the image processor 220.
  • working memory 265 may comprise RAM memory, with instructions loaded into working memory 265 before execution by the processor 220.
  • FIG. 3 illustrates an embodiment of a Radon photography process 300.
  • the process 300 is described as being executed by the Radon photography module 240 and its subcomponents, described above with respect to FIG. 2. However, this is for illustrative purposes and is not meant to limit the process 300, which can be executed by any device or system having Radon photography capabilities.
  • the Radon photography module 240 can receive plenoptic image data, for example from the plenoptic camera assembly 201, which can be any of the plenoptic cameras 100a-100d represented by FIGS. 1A-1D or any other camera capable of capturing plenoptic or light field image data.
  • the Radon photography module 240 may receive a complete set of plenoptic image data after image capture in some embodiments. In other embodiments, the Radon photography module 240 may receive plenoptic image data during image capture. For example, the Radon photography module 240 may receive the plenoptic image data in portions corresponding to predetermined planes of integration in one example.
  • the Radon image generator 242 of the Radon photography module 240 can generate a Radon image from the input plenoptic image data.
  • the Radon image generator 242 can apply the Radon transform to calculate the integral of the value of light rays in one or more planes of integration.
  • the Radon image f_{θ,φ}(r) can be computed by integrating the function value over all possible planes in one embodiment.
  • a subset of all possible planes can be identified for computing the Radon image f_{θ,φ}(r).
  • the plane(s) of integration can be planes normal to and intersecting the sensor plane of the plenoptic camera, and the position and location of the plane(s) of integration can be determined based on the configuration of the sensor such that each distinct pixel of the sensor (or each distinct microimage formed on the sensor by the microlens array) is sampled by at least one plane of integration.
  • a parameterization of a plane given by the parameters θ, φ, and r can be beneficial in some embodiments for computing the inverse Radon transform used to render images from the Radon image.
  • An example of the Radon photography technique can effectively integrate all rays that fall within the plane and then apply the inverse Radon transform (i.e., the inverse of Equations (3) or (4)), as will be explained in more detail with respect to block 315.
  • the luminous density calculator 244 of the Radon photography module 240 can sum over all of the planes of the Radon image passing through a point. This can calculate a back-projection of the Radon image.
  • the luminous density calculator 244 can apply an inverse Radon transform to the Radon image as described below in Equation (5).
  • Luminous density calculator 244 can take the input Radon image f_{θ,φ}(r) and reproduce the luminous density f(x, y, z).
  • Equation (5) has the same form as the Poisson equation. Accordingly, the back-projection of the Radon image can be expressed as:
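  • The back-projection equation referenced here is likewise not reproduced. A standard form of three-dimensional back-projection over all planes through a point x = (x, y, z), with n(θ, φ) the unit normal of a plane, is shown below; this is a general formula consistent with the surrounding definitions, not necessarily the patent's exact equation:

```latex
f^{1}(\mathbf{x}) = \int_{0}^{2\pi}\!\!\int_{0}^{\pi}
  f_{\theta,\varphi}\big(\mathbf{x}\cdot\mathbf{n}(\theta,\varphi)\big)\,
  \sin\theta \, d\theta \, d\varphi,
\qquad
\mathbf{n}(\theta,\varphi) = (\sin\theta\cos\varphi,\ \sin\theta\sin\varphi,\ \cos\theta)
```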
  • f^1 represents the back-projected Radon image.
  • f^1 can look like a blurred version of the actual function f(x, y, z).
  • the luminous density calculator 244 can apply a three-dimensional Laplacian of Gaussian to the back-projection f^1 in order to recover the original luminous density of the image scene, or an approximation thereof. In some embodiments this can produce a noisy version of the original function f(x, y, z), which after denoising can provide the density of the virtual light sources. Accordingly, in some embodiments the luminous density calculator 244 can perform denoising on the back-projection f^1 after applying the three-dimensional Laplacian of Gaussian.
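  • As an illustration of this filtering step, the following sketch (using SciPy; the function name, the sigma value, and the choice of a median filter for denoising are assumptions, since the patent does not prescribe a specific denoising method) applies a three-dimensional Laplacian of Gaussian to a back-projection computed on a regular (x, y, z) grid and then denoises the result.

```python
import numpy as np
from scipy import ndimage

def recover_luminous_density(backprojection, sigma=1.5, denoise_size=3):
    """Sharpen a blurry back-projection f^1 toward the luminous density f(x, y, z).

    Assumes the 3D back-projection has already been computed on a regular
    (x, y, z) grid: apply a three-dimensional Laplacian of Gaussian, then
    denoise with a simple median filter.
    """
    # 3D Laplacian of Gaussian; the sign/scale convention here is an assumption.
    sharpened = -ndimage.gaussian_laplace(backprojection, sigma=sigma)
    # Simple denoising of the amplified high-frequency noise.
    return ndimage.median_filter(sharpened, size=denoise_size)
```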
  • the Radon photography module 240 can output the luminous density of the plenoptic data, for example in one embodiment for rendering an image with dynamic focus and vantage point. Projecting the luminous density onto an image plane can produce images of the target image scene at different depths of focus or different vantage points.
  • the flat rendering module 246 can perform rendering of images from the luminous density.
  • the Radon photography module 240 can output the luminous density for storage.
  • the flat rendering module 246 can project the luminous density onto an image plane to produce a dynamically refocusable rendered image of the image scene.
  • the rendered image is referred to as "dynamically refocusable" due to the fact that, based on the luminous density, images from different viewpoints and/or having different depths of focus can be rendered of the same image scene.
  • the flat rendering module 246 can adjust the focus and/or vantage point of the rendered image based on user input. For example, flat rendering module 246 can receive an indication of user input from device processor 255 indicating that the user has selected an option to adjust one or both of the vantage point from which the rendered image appears to have been taken or the depth of focus of the rendered image.
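  • A minimal sketch of such a projection follows (illustrative only; the defocus model, function name, and parameters are assumptions rather than the patent's projection). It treats the recovered luminous density as a stack of depth slices, shears each slice to approximate a change of vantage point, and blurs each slice in proportion to its distance from the chosen focal depth before summing along z.

```python
import numpy as np
from scipy import ndimage

def render_flat(density, z_values, focus_z, view_shift=(0.0, 0.0), blur_per_unit=1.0):
    """Project a recovered luminous density f(x, y, z) onto an image plane.

    `density` is a (Z, Y, X) grid of virtual light sources at depths
    `z_values`. A shifted viewpoint is approximated by shearing each depth
    slice, and defocus is approximated by blurring each slice in proportion
    to its distance from the chosen focal depth before summing along z.
    """
    image = np.zeros(density.shape[1:], dtype=np.float64)
    for k, z in enumerate(z_values):
        # Shear proportional to depth approximates a change of vantage point.
        shifted = ndimage.shift(density[k],
                                (view_shift[0] * z, view_shift[1] * z),
                                order=1, mode="nearest")
        # Out-of-focus depths contribute a blurred version of their slice.
        sigma = blur_per_unit * abs(z - focus_z)
        image += ndimage.gaussian_filter(shifted, sigma=sigma) if sigma > 0 else shifted
    return image
```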
  • FIG. 4A illustrates an embodiment of plenoptic camera planes 400 that can be used in Radon image generation.
  • the planes 400 can include an image plane 405, a microlens plane 430, a plane of integration 415, and a sensor plane 435.
  • the image plane 405 can represent the plane of the image formed by the main lens, for example a plane passing through the point of focus of the main lens of a plenoptic camera.
  • the microlens plane 430 can represent a plane formed through a microlens array of a plenoptic camera, for example a plane bisecting each of the lenses of the microlens array.
  • the plane of integration 415 can represent a plane intersecting both the image plane 405 and the microlens plane 430.
  • the plane of integration 415 is depicted in a certain location relative to the other planes, in some embodiments the location of the plane of integration 415 can vary based on the parameters of the photo sensor, and in addition multiple planes of integration can be used to generate a Radon image, as will be discussed in more detail below.
  • the sensor plane 435 can represent a plane formed by the surface of a photo sensor of a plenoptic camera, and can include a plurality of regions 440 corresponding to microimages formed by the microlenses within the microlens array.
  • A 2-plane parameterization can be used for representing light fields, where the first two parameters (a first coordinate) denote the location of the start point of the ray 420 under consideration on the image plane 405 and the last two parameters (a second coordinate) denote the location of the end point of the ray 420 on the microlens plane 430, where the image plane 405 and the microlens plane 430 can be separated by a distance A.
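  • For concreteness, this two-plane parameterization can be held in a small record like the following (the class name, field names, and helper method are illustrative assumptions, not the patent's notation):

```python
from dataclasses import dataclass

@dataclass
class Ray:
    """Two-plane parameterization of a light ray.

    (x, y) -- where the ray starts on the image plane 405
    (u, v) -- where the ray ends on the microlens plane 430
    The two planes are separated by a distance A, so the ray's 3D direction
    is proportional to (u - x, v - y, A).
    """
    x: float
    y: float
    u: float
    v: float

    def direction(self, A: float):
        # Direction vector implied by the two-plane coordinates and separation A.
        return (self.u - self.x, self.v - self.y, A)
```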
  • the Radon photography technique can determine the pixels onto which image line 410 is mapped in the corresponding microimages 440 of the microlenses lying along the microlens line 425.
  • each microlens can correspond to one microimage
  • each microimage can correspond to one pixel of the image sensor.
  • each microimage can correspond to a plurality of pixels of the image sensor.
  • the image line 410 can represent a vector passing through the intersection of the image plane 405 and the plane of integration 415.
  • the microlens line 425 can represent a vector passing through the intersection of the plane of integration 415 and the microlens plane 430.
  • the Radon photography technique can compute the focal length of a microlens and compute the magnification of the microlens using Equation (1) above. This can be computed, in some embodiments, for each microlens in the array that intersects with the microlens line, allowing the Radon photography technique to locate the pixels corresponding to the rays 420 lying in the plane of integration.
  • An example of a simple model for the plenoptic camera is illustrated in FIG. 4A, where the light rays, after entering the camera through the main lens, form a real image at plane P1 405. This image is re-imaged by each of the microlenses located at plane P2 430 onto the sensor plane 435.
  • the Radon photography technique can determine the sum over all rays 420 that lie in an arbitrary plane P 415.
  • the Radon photography technique can determine the image of line l1 410 as produced by an arbitrary microlens placed at (x0, y0).
  • the equation of line l1 can be given by:
  • the image of a point (x, y) by the microlens placed at (x0, y0) can be given by:
  • In Equation (11), the origin is fixed at the intersection of the optical axis of the microlens with the sensor plane 435, M is the microlens magnification, and m equals the slope of the microlens line 425.
  • the Radon image f_{θ,φ}(r) can be obtained by summation over all pixels that lie on the line represented by Equation (11), for each microlens intersected by plane P 415. These pixels are represented in FIG. 4A by line of intersection 445.
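  • The per-plane summation can be sketched as follows. Because Equations (8) through (11) are not reproduced in this text, the snippet assumes a simplified relay-imaging model in which each microlens reimages the line about its own center with magnification M; the function name, arguments, and that model are assumptions, not the patent's exact mapping.

```python
import numpy as np

def radon_plane_sum(flat, mlens_centers, mlens_px, m, c, M=1.0):
    """Sum pixel values along the image of a microlens line in each microimage.

    Sketch assumptions: the microlens line on plane P2 is y = m*x + c in
    sensor pixel coordinates, `mlens_centers` holds the (x0, y0) centers of
    the microimages, `mlens_px` is the microimage width in pixels, and each
    microlens reimages the line about its own center with magnification M.
    """
    total = 0.0
    half = mlens_px // 2
    norm = np.hypot(m, 1.0)
    for (x0, y0) in mlens_centers:
        # Skip microimages whose center is farther than half a microimage from the line.
        if abs(m * x0 - y0 + c) / norm > half:
            continue
        # Foot of the perpendicular from the microlens center onto the line ...
        fx = (x0 + m * (y0 - c)) / (1.0 + m * m)
        fy = m * fx + c
        # ... reimaged about the microlens center with magnification M.
        ix, iy = x0 + M * (fx - x0), y0 + M * (fy - y0)
        # Step along the direction of the imaged line and accumulate pixel values.
        dx, dy = 1.0 / norm, m / norm
        for t in range(-half, half + 1):
            px, py = int(round(ix + t * dx)), int(round(iy + t * dy))
            if 0 <= py < flat.shape[0] and 0 <= px < flat.shape[1]:
                total += flat[py, px]
    return total
```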
  • FIG. 4B illustrates an embodiment of a set of pixels or microimages 440 that can be used in Radon image generation.
  • FIG. 4B illustrates one example of how a set of microimages or pixels 440 can be selected from the plurality of microimages or pixels 440 of the plenoptic camera photo sensor once an intersection line 445 is determined according to Equations (8) through (11) above.
  • each microimage/pixel 440 that the line 445 passes through can be selected for computing the Radon transform of the corresponding plane of integration 415, as represented by the shaded microimages 450.
  • the Radon transform can be calculated using pixel values of the selected microimages 450.
  • FIG. 4B illustrates one example of integrating over the pixels 450 in a given microimage, which can be done by adding up pixel intensity values.
  • multiple planes of integration are used, as discussed above with respect to block 315 of FIG. 3.
  • FIG. 4C illustrates an embodiment of lines of integration 460 that can be used in Radon image generation.
  • the lines of integration 460 can correspond to a plurality of intersection lines (that is, the mapping of image lines/microlens lines onto the sensor plane 435), and therefore represent edges of a plurality of planes of integration used to generate a Radon image f.
  • a plurality of lines of integration 460 sample distinct microimages 440 corresponding to the microlenses of the plenoptic camera when the steps Δφ 455 by which φ is varied are large enough.
  • One way to ensure that each microimage 440 is sampled by at least one line of integration 460 is to generate lines from a starting point on the sensor plane 435 (here illustrated as a corner of the sensor plane 435) to each of the microimages 440 along at least one border of the sensor plane 435 (here illustrated as ending approximately in the center of the outer edge of each microimage 440, the outer edge being a border of the sensor plane 435).
  • the angles formed by each pair of adjacent lines can be measured, and the smallest angle can be selected as the step size Δφ 455. This represents one example of determining a step size for φ.
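  • A short sketch of that construction follows (dimensions in pixels; the function name and the choice of the far border are assumptions):

```python
import numpy as np

def phi_step(sensor_w, sensor_h, mlens_px):
    """Pick a step size (in radians) for the angle phi between lines of integration.

    Sketch: from one corner of the sensor (taken as the origin), draw a line
    to the midpoint of the outer edge of each microimage along the opposite
    border, measure the angles between adjacent lines, and return the
    smallest such angle, so that finer angular sampling is not wasted on
    lines that revisit the same microimages.
    """
    # Midpoints of microimage outer edges along the far border (y = sensor_h).
    xs = np.arange(mlens_px / 2.0, sensor_w, mlens_px)
    # Angle of each corner-to-midpoint line, sorted so neighbors are adjacent lines.
    angles = np.sort(np.arctan2(float(sensor_h), xs))
    return float(np.min(np.diff(angles)))
```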
  • the choices for the start and end values and the step size for the parameters φ, θ, and r in the equations above can determine the sampling of the light field space.
  • the parameter Δφ can represent the step size between lines of integration over a plenoptic sensor.
  • One embodiment for determining the step size Δφ is described above with respect to FIG. 4C.
  • φ can be varied in steps of one degree from zero to 360 degrees. This sampling can be sufficient in one example for images captured using a plenoptic sensor having a 75x75 area. Denser sampling (e.g., a smaller step size Δφ) may not provide additional information due to adjacent lines sampling the same set of pixels, leading to redundant information, larger data size for the Radon image, greater processor usage, and longer run times.
  • M_closest and M_farthest are the minimum and maximum values of magnification for the scene under consideration with respect to the microlens array, and B represents the distance between the microlens plane 430 and the sensor plane 435.
  • One example of computing the approximate magnification of a given point with respect to a microlens array is given by the ratio between the number of times the point is repeated in a row and the number of horizontal pixels in a microimage.
  • the parameter r can represent a vector passing through the origin which is also perpendicular to the plane of integration.
  • the maximum value that r can have in one embodiment (r_max) is the length of the diagonal of an image formed on the sensor, determined from the number of pixels in the vertical and horizontal directions.
  • r can be varied in steps of one between 1 and r_max, as a smaller step size may not significantly improve reconstruction of the plenoptic function from the Radon image due to the sensor tolerance of one pixel. A smaller step size may, however, contribute to a significant increase in run time due to denser sampling.
  • the parameter θ can represent an angle between the vector r and the z-axis in an (x, y, z) coordinate system mapped to the three-dimensional image scene.
  • the interval over which θ is varied substantially contributes to refocusing due to its association with depth values along the z-axis in the data representing the three-dimensional target image scene.
  • θ can be varied between 80 and 100 degrees with a step size of one degree.
  • FIG. 5 illustrates an embodiment of a process 500 for generating a Radon image from input light field data.
  • the process 500 is described as being executed by the Radon photography module 240 and its subcomponents, described above with respect to FIG. 2. However, this is for illustrative purposes and is not meant to limit the process 500, which can be executed by any device or system having Radon photography capabilities.
  • the Radon photography module 240 can receive plenoptic image data, for example from the plenoptic camera assembly 201, which can be any of the plenoptic cameras 100a-100d represented by FIGS. 1A-1D or any other camera capable of capturing plenoptic or light field image data.
  • the Radon photography module 240 may receive a complete set of plenoptic image data after image capture in some embodiments. In other embodiments, the Radon photography module 240 may receive plenoptic image data during image capture. For example, the Radon photography module 240 may receive the plenoptic image data in portions corresponding to predetermined planes of integration in one example.
  • the Radon image generator 242 can identify a plane of integration.
  • the Radon image generator 242 can receive data indicating sensor parameters such as pixel size, and can determine one or more planes of integration that collectively operate to sample each distinct pixel of, or microimage formed on, the sensor as is described above with respect to FIG. 4C.
  • the Radon image generator 242 can receive data indicating the number and location of the planes of integration, for example from image processor 220 retrieving such information from data store 270.
  • the plane or planes of integration needed for generating a Radon image can be predetermined for a sensor.
  • block 510 may be optional.
  • Predetermined planes of integration can be used to scan only those pixels, or pluralities of pixels corresponding to microimages, that are predetermined to intersect with the plane of integration, and this partial plenoptic image data can be sent to the Radon photography module 240 for application of the Radon transform.
  • the Radon image generator 242 can determine a microlens line representing the intersection of the determined plane of integration and the microlens array of the plenoptic camera. For example, a microlens line 425 is illustrated above with respect to FIG. 4A.
  • the Radon image generator 242 can map the microlens line to specific pixels and/or microimages on the sensor. This can be accomplished, in one example, by Equation (11) described above.
  • planes of integration and their corresponding microlens lines and pixels/microimages mapped to the microlens lines can be computed in advance of image capture, and this data can be retrieved by the Radon image generator 242 during application of the Radon transform to input plenoptic image data.
  • the pre-computed data indicating pixels mapped to predetermined microlens lines can be used to segment data as it is captured by the sensor, sending only pixel values relevant to the current plane of integration to the Radon image generator, and blocks 515 and 520 can be optional.
  • the Radon image generator 242 can calculate the sum of pixel values that are mapped to the microlens line. This can be done individually by microimage for each pixel within the microimage that is mapped to the microlens line in one embodiment. In one example, the Radon image generator 242 can calculate the sum of the pixel values using either of Equations (3) and (4) above.
  • Some embodiments of the process 500 can be implemented after image capture to change the structure of existing or stored plenoptic image data. Other embodiments of the process 500 can be implemented during image capture to change the structure of plenoptic image data as it is captured.
  • intersection lines can be predetermined prior to image capture such that the intersection lines sample each distinct pixel or microimage, as illustrated by FIG. 4C. Pixels corresponding to the intersection lines can be determined according to Equation (11) above.
  • the pixels of the photo sensor can be scanned in rows or lines according to the corresponding intersection lines.
  • Some embodiments may scan the pixels sequentially for each intersection line, and accordingly the integral value of each plane of integration corresponding to the intersection lines can be sequentially calculated, for example based at least partly on the pixel intensity values.
  • the values for the planes of integration can be computed in parallel.
  • the values of the light rays (represented by pixel values) incident on those pixels or microimages can be summed or integrated according to the Radon transform of the plane expressed in Equation (4).
  • the resulting values, representing the sum of light rays over the planes of integration can be stored as a Radon image.
  • the Radon image data can be represented as an array of values with one value for each sampled plane, wherein the values in the array are the sum of light rays over that plane.
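  • Putting these pieces together, the Radon image can be assembled as a three-dimensional array holding one summed value per sampled plane, as sketched below (the helper `plane_sum` stands in for any per-plane summation routine such as the one sketched earlier; all names and the example parameter ranges are illustrative assumptions):

```python
import numpy as np

def build_radon_image(plane_sum, thetas, phis, rs):
    """Store one summed value per sampled plane in a 3D array indexed by (theta, phi, r).

    `plane_sum(theta, phi, r)` is any routine returning the sum of pixel
    values along the intersection line of the plane parameterized by
    (theta, phi, r) with the sensor.
    """
    radon = np.zeros((len(thetas), len(phis), len(rs)), dtype=np.float64)
    for i, theta in enumerate(thetas):
        for j, phi in enumerate(phis):
            for k, r in enumerate(rs):
                radon[i, j, k] = plane_sum(theta, phi, r)
    return radon

# Example parameter ranges following the sampling discussed above (illustrative):
# phis   = np.deg2rad(np.arange(0, 360, 1))    # phi in one-degree steps
# thetas = np.deg2rad(np.arange(80, 101, 1))   # theta varied over 80..100 degrees
# rs     = np.arange(1, r_max + 1)             # r_max = sensor diagonal in pixels
```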
  • FIG. 6 illustrates an embodiment of a process 600 for generating plenoptic image data from a Radon image.
  • the process 600 is described as being executed by the Radon photography module 240 and its subcomponents, described above with respect to FIG. 2. However, this is for illustrative purposes and is not meant to limit the process 600, which can be executed by any device or system having Radon photography capabilities.
  • the luminous density calculator 244 can receive Radon image data, for example from the Radon image generator 242 or from the image processor 220 retrieving a Radon image stored in the storage 270.
  • the Radon image can be an expression of a plenoptic function in terms of summed energy over planes intersecting the image sensor of the capturing plenoptic camera rather than the traditional expression of plenoptic image data in terms of summed energy of rays incident on the image sensor.
  • the luminous density calculator 244 can calculate an intermediate function through back-projection of the Radon image. In one embodiment, this can be accomplished through Equation (7) defined above.
  • Back-projection can take the Radon image function, defined on each plane in the three-dimensional image space, and project the planes over the image space to reproduce the original luminous density.
  • the back-projection can be a blurry version of the original plenoptic function in some embodiments.
  • the luminous density calculator 244 can apply the Laplacian of Gaussian operator to the blurry back-projection in order to sharpen the back-projection and bring it closer to the original plenoptic function.
  • application of the Laplacian of Gaussian operator can add undesirable noise to the back-projection in some embodiments.
  • the luminous density calculator 244 can denoise the back-projection after application of the Laplacian of Gaussian operator to recover the original plenoptic function, or to substantially recover the original plenoptic function.
  • Implementations disclosed herein provide systems, methods and apparatus for capturing and rendering from plenoptic image data.
  • One skilled in the art will recognize that these embodiments may be implemented in hardware, software, firmware, or any combination thereof.
  • the circuits, processes, and systems discussed above may be utilized in a wireless communication device.
  • the wireless communication device may be a kind of electronic device used to wirelessly communicate with other electronic devices. Examples of wireless communication devices include cellular telephones, smart phones, Personal Digital Assistants (PDAs), e-readers, gaming systems, music players, netbooks, wireless modems, laptop computers, tablet devices, etc.
  • the wireless communication device may include one or more image sensors, two or more image signal processors, and a memory including instructions or modules for carrying out the processes discussed above.
  • the device may also have data, a processor loading instructions and/or data from memory, one or more communication interfaces, one or more input devices, one or more output devices such as a display device and a power source/interface.
  • the wireless communication device may additionally include a transmitter and a receiver.
  • the transmitter and receiver may be jointly referred to as a transceiver.
  • the transceiver may be coupled to one or more antennas for transmitting and/or receiving wireless signals.
  • the wireless communication device may wirelessly connect to another electronic device (e.g., base station).
  • a wireless communication device may alternatively be referred to as a mobile device, a mobile station, a subscriber station, a user equipment (UE), a remote station, an access terminal, a mobile terminal, a terminal, a user terminal, a subscriber unit, etc.
  • Examples of wireless communication devices include laptop or desktop computers, cellular phones, smart phones, wireless modems, e-readers, tablet devices, gaming systems, etc.
  • Wireless communication devices may operate in accordance with one or more industry standards such as the 3rd Generation Partnership Project (3GPP).
  • the general term "wireless communication device” may include wireless communication devices described with varying nomenclatures according to industry standards (e.g., access terminal, user equipment (UE), remote terminal, etc.).
  • the functions described herein may be stored as one or more instructions on a processor-readable or computer-readable medium.
  • computer-readable medium refers to any available medium that can be accessed by a computer or processor.
  • a medium may comprise RAM, ROM, EEPROM, flash memory, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer.
  • Disk and disc include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray® disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
  • a computer-readable medium may be tangible and non-transitory.
  • computer-program product refers to a computing device or processor in combination with code or instructions (e.g., a "program”) that may be executed, processed or computed by the computing device or processor.
  • code may refer to software, instructions, code or data that is/are executable by a computing device or processor.
  • Software or instructions may also be transmitted over a transmission medium.
  • For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of transmission medium.
  • the methods disclosed herein comprise one or more steps or actions for achieving the described method.
  • the method steps and/or actions may be interchanged with one another without departing from the scope of the claims.
  • the order and/or use of specific steps and/or actions may be modified without departing from the scope of the claims.
  • Couple may indicate either an indirect connection or a direct connection.
  • a first component may be either indirectly connected to a second component or directly connected to the second component.
  • plurality denotes two or more. For example, a plurality of components indicates two or more components.
  • determining encompasses a wide variety of actions and, therefore, “determining” can include calculating, computing, processing, deriving, investigating, looking up (e.g., looking up in a table, a database or another data structure), ascertaining and the like. Also, “determining” can include receiving (e.g., receiving information), accessing (e.g., accessing data in a memory) and the like. Also, “determining” can include resolving, selecting, choosing, establishing and the like.
  • the examples may be described as a process, which is depicted as a flowchart, a flow diagram, a finite state diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel, or concurrently, and the process can be repeated. In addition, the order of the operations may be re-arranged.
  • a process is terminated when its operations are completed.
  • a process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a software function, its termination corresponds to a return of the function to the calling function or the main function.

Abstract

Certain aspects relate to systems and techniques for efficiently recording captured plenoptic image data and for rendering images from the captured plenoptic data. The plenoptic image data can be captured by a plenoptic or other light field camera. In some implementations, four dimensional radiance data can be transformed into three dimensional data by performing a Radon transform to define the image by planes instead of rays. A resulting Radon image can represent the summed values of energy over each plane. The original three-dimensional luminous density of the scene can be recovered, for example, by performing an inverse Radon transform. Images from different views and/or having different focus can be rendered from the luminous density.

Description

GENERATION AND USE OF A 3D RADON IMAGE
TECHNICAL FIELD
[0001] The systems and methods disclosed herein are directed to image data, and, more particularly, to capture of and rendering from plenoptic data.
BACKGROUND
[0002] The field of photography continues to develop, offering more options to a photographer. With analog photography, developed film revealed the final result of the exposure, color balance, focus, and other image capture factors as they had occurred at the time of image capture. Until recently, digital photography has been about digitizing what used to be an analog process. Digital photography affords more options to photographers than analog photography after an image is captured, as processing applications can be used to enhance and clean up the image respecting certain qualities like exposure, color balance, saturation, and others. However, if the focus of the captured digital image is off, or if the photographer wishes to shift to a different viewpoint, existing processing applications cannot correct for these factors.
[0003] Plenoptic photography comes closer to rendering the full variety of view angles and multiple focal points offered by the direct observation of objects by capturing the light field, also referred to as radiance, of a target image scene. To capture a plenoptic image, a microlens array can be mounted in front of an image sensor, effectively capturing many images of a target scene, with each image capturing a portion of the target scene from a slightly different viewpoint. As such, multiple light rays are captured from varying viewpoints for each pixel of a synthesized final image of the scene. This provides raw sensor data containing four-dimensional radiance data about each pixel point in a potential final image of the target scene: two dimensions of the spatial position of a light ray and two dimensions of the orientation of a light ray at the pixel. With software, this data can be synthesized into a final 2D or 3D image from any of the vantage points or focus points represented by the radiance data, extending the capabilities of digital photography and affording greater flexibility for photographers to alter vantage point or focus after image capture.
[0004] Plenoptic camera technology offers several imaging capabilities that may be desirable in mobile devices, including but not limited to full 3D imaging, refocusabilty, and High Dynamic Range (HDR) imaging. However, processing the plenoptic sensor data is computationally expensive, typically requiring parallel processing on central processing units (CPUs) or intensive processing on a graphics processing unit (GPU). Accordingly, plenoptic photography requires prohibitively large amounts of data storage and processing for implementation in mobile photography, given the relatively limited GPU memory of conventional mobile devices.
SUMMARY
[0005] Aspects of the invention relate to techniques for efficiently recording appropriately transformed captured plenoptic image data (for example in a plenoptic camera) and a technique for rendering images from such transformed captured plenoptic data, referred to herein as "Radon photography." For example, the structure of captured plenoptic data can be changed by using a Radon transform so that the dimensionality is reduced from four dimensions to three dimensions, thereby generating image data (referred to herein as a "Radon image") having a smaller size than traditional plenoptic images. A final image can be rendered from the Radon image using inverse Radon transform or other computed tomography techniques to recover the radiance or luminous density of the plenoptic data. Compared with existing methods, this can reduce the amount of data by about an order of magnitude, making it possible to perform rendering using much less GPU memory than existing plenoptic rendering techniques, for example on a mobile device.
[0006] Accordingly, one aspect relates to a system in an electronic device for rendering a final image, the system comprising a plenoptic camera including a microlens array and an image sensor, the plenoptic camera configured to capture plenoptic image data of an image scene; and one or more processors in data communication with the plenoptic camera and configured to at least receive the captured plenoptic image data, the captured plenoptic image data including a plurality of microimages each formed by one of a plurality of microlenses in the microlens array focusing light from the image scene onto the image sensor, determine a plurality of lines of integration located within a plane bisecting the microlens array, each of the plurality of lines of integration corresponding to a plane extending through three dimensional space of the image scene, and generate a Radon image based at least partly on a plurality of pixels of the plenoptic image data corresponding to the plurality of lines of integration. [0007] Another aspect relates to a system in an electronic device for rendering a final image, the system comprising a first module configured to at least access data representing a Radon image, the Radon image representing, for each sampled plane of a plurality of sampled planes of an image scene, a sum of light energy in the sampled plane; a second module configured to at least determine a luminous density of the image scene based at least partly on integrating the sum of light energy from each sampled plane of the plurality of sampled planes of the Radon image; and a third module configured to at least project the luminous density onto an image plane to produce the final image.
[0008] Another aspect relates to a method for compressing plenoptic image data, the method comprising receiving at least a portion of data representing a plenoptic image of a three-dimensional image space from a plenoptic camera having a microlens array and an image sensor; identifying a plane of integration intersecting a microlens plane, the microlens plane bisecting the microlens array; determining an image line intersecting the plane of integration and an image plane located a first distance from the microlens plane; determining a microlens line intersecting the plane of integration and the microlens plane; mapping the image line to a microimage based at least partly on intersection of microlens line and the microlens, the microimage formed on the image sensor by a microlens of the microlens array; and summing pixel values for each of a plurality of pixels in the microimage, the plurality of pixels located along the image line mapped to the microimage.
[0009] Another aspect relates to a non-transitory, computer-readable medium storing instructions that, when executed, cause one or more computing devices to perform operations comprising receiving Radon image data of an image scene, the Radon image data representing a luminous density of the image scene as summed values of light energy in each of a plurality of planes of the image scene; computing an intermediate function of the Radon image using a back-projection of the Radon image; recovering the luminous density of the image scene based at least partly on the intermediate function of the Radon image; and projecting the luminous density onto an image plane to produce a dynamically refocusable rendered image of the image scene. BRIEF DESCRIPTION OF THE DRAWINGS
[0010] The disclosed aspects will hereinafter be described in conjunction with the appended drawings and appendices, provided to illustrate and not to limit the disclosed aspects, wherein like designations denote like elements.
[0011] FIG. 1A illustrates an embodiment of a conventional plenoptic camera.
[0012] FIG. 1B illustrates an example focused plenoptic camera (Keplerian telescopic case), according to some embodiments.
[0013] FIG. 1C illustrates an example focused plenoptic camera (Galilean telescopic case), according to some embodiments.
[0014] FIG. 1D illustrates an example thin plenoptic camera based on microspheres, according to some embodiments.
[0015] FIG. 2 illustrates a high-level schematic block diagram of an embodiment of an image capture device having Radon photography capabilities.
[0016] FIG. 3 illustrates an embodiment of a Radon photography process.
[0017] FIG. 4A illustrates an embodiment of a technique of using plenoptic camera planes for Radon image generation.
[0018] FIG. 4B illustrates an embodiment of a set of pixels or microimages that can be used in Radon image generation.
[0019] FIG. 4C illustrates an embodiment of a density of lines of integration that can be used in a method of 3D Radon image generation.
[0020] FIG. 5 illustrates an embodiment of a process for generating a Radon image from input light field data.
[0021] FIG. 6 illustrates an embodiment of a process for generating plenoptic image data from a Radon image.
DETAILED DESCRIPTION
Introduction
[0022] Embodiments of the disclosure relate to systems and techniques for capturing plenoptic image data, processing the plenoptic image data to increase efficiency, and rendering from the processed plenoptic image data. The Radon photography techniques described herein can change the structure of captured plenoptic data to define the radiance of a target image scene in terms of energy contained within two-dimensional planes rather than within one-dimensional rays, which can effectively reduce the amount of image data by about an order of magnitude. For example, this can be accomplished in some implementations by applying a Radon transform to the plenoptic image data, generating a Radon image. A dynamically refocusable image can be rendered from the Radon image by applying the inverse Radon transform to recover the radiance of the image scene from the Radon image. In one example, a Radon image can be approximately 1 megabyte (MB) of data while a typical plenoptic image can be approximately 50 MB of data. This can enable plenoptic imaging on mobile devices with limited GPU capacity, for example smart phones and tablet computers.
[0023] The Radon photography technique described herein can record captured plenoptic data using a transform related to the Radon transform to reduce the quantity of captured plenoptic data. The information captured by plenoptic cameras may be referred to as the light field, the plenoptic function, or radiance. In computational photography, a light field (which may also be referred to as radiance, luminous density, or the plenoptic function) is a four-dimensional record of all one-dimensional light rays in the three-dimensional space of the target image scene. Radiance describes both spatial and angular information, and is defined as density of energy per unit of area per unit of stereo angle (in radians). A plenoptic camera captures radiance in plenoptic images (also referred to as flat images, or flats). When processed, plenoptic images may be digitally refocused, noise may be reduced, viewpoints may be changed, and other plenoptic effects may be achieved. Note that, in the literature, plenoptic cameras may also be referred to as light field cameras, and plenoptic images may also be referred to as light field images.
[0024] Using the Radon transform, the four-dimensional radiance data of the image scene can be represented as an alternative plenoptic function depending on only three variables, which is the Radon image of the density of virtual light sources. Radon photography techniques can capture or represent the energy density of a three-dimensional body in the image scene by cutting it with multiple thin virtual planes and integrating over each plane. The energy in each plane can be measured to construct the Radon image. Accordingly, the resulting Radon image can represent the summed values of energy over each plane between the image plane and the sensor plane rather than traditional plenoptic image data which represents the values of energy of each light ray between the image plane and the sensor plane. The Radon image, being a three-dimensional representation of the four-dimensional radiance of the image scene, is therefore much more economical in terms of memory and thus faster to render as compared to previous plenoptic imaging methods.
[0025] The original three-dimensional luminous density of the scene can be recovered, for example, by performing an inverse Radon transform, and images from different views and/or having different depths of focus can be rendered from the luminous density. In one implementation, the Radon photography technique can effectively integrate rays that fall within a plane and can evaluate the Radon transform over all planes passing through a point to perform back projection. Back projection can be the first step done in order to derive the inverse Radon transform at that point. Specific lines of integration can be chosen to sample each microlens in the array with a sufficient step size between adjacent lines of integration to avoid sampling the same set of pixels, which would lead to redundant information. Through back-projection, application of the Laplacian of Gaussian operator, and denoising, the Radon photography technique can determine the luminous density in the scene from the Radon image. Projecting the luminous density onto an image plane can produce a final image, wherein the final image can be adjusted to have different views or different depths of focus of the captured image scene.
Overview of Example Plenoptic Cameras
[0026] FIGS. 1A-1D illustrate various embodiments of plenoptic cameras that can be used to capture Radon image data according to the Radon photography technique. The plenoptic imaging systems of FIGS. 1A-1D can be implemented in a variety of imaging applications, for example still photography, videography, stereoscopic photography, stereoscopic videography, multispectral photography, and multispectral videography. Devices such as handheld cameras, tablet computers, mobile phones, gaming consoles, laptops, personal computers, augmented reality devices such as heads-up display systems, security cameras, and the like can incorporate the plenoptic imaging systems of FIGS. 1A-1D.
[0027] FIG. 1A illustrates a conventional plenoptic camera 100a. The components shown in this Figure are not necessarily to scale relative to each other. A conventional plenoptic camera can include a main lens 105 and a microlens array 110 placed at distance f in front of a photo sensor 115. In some implementations, a charge-coupled device (CCD) can be used as the photo sensor 115. In other implementations, a CMOS imaging sensor can be used as the photo sensor 115. The microlenses 110 can have aperture d and focal length f, and are assumed to be equally spaced at interval d. The main lens 105 can be focused at the plane formed by the center of the microlens array 110 ("microlens plane"), and the microlenses 110 can be focused at optical infinity (equivalently, at the main lens 105).
[0028] Considering that the focal length of the main camera lens 105 is much greater than the focal length of the microlenses 110, each "micro camera" can be focused at the main camera lens aperture, and not on the object being photographed. Each microlens image can thus be completely defocused relative to that object, and represents only the angular distribution of the radiance. As such, these microimages can look blurry and do not represent a human-recognizable image. Since each microlens image can sample a given location depending on its position and spans the same angular range as the other microlens images, rendering an output image from a conventional plenoptic camera radiance image can be accomplished by integrating all of the pixels under each microlens. Integrating a fixed portion of the pixels under each microlens can generate an image of one certain view. In some embodiments, each microlens contributes to a single pixel in the final generated image.
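For illustration only (not part of the patent disclosure), the following Python sketch shows this style of single-view rendering from a conventional plenoptic capture. The raw layout, function name, and the (u, v) offset parameters are hypothetical assumptions, chosen so that selecting a fixed portion of the pixels under every microlens yields one view.

```python
import numpy as np

def render_view_conventional(raw, microlens_px, u=0, v=0):
    """Render one view from a conventional plenoptic capture (illustrative sketch).

    `raw` is assumed to be a 2D array whose microimages tile the sensor in
    square blocks of `microlens_px` x `microlens_px` pixels; (u, v) selects the
    same offset under every microlens, i.e. a fixed portion of its angular samples.
    """
    rows, cols = raw.shape[0] // microlens_px, raw.shape[1] // microlens_px
    view = np.empty((rows, cols), dtype=raw.dtype)
    for j in range(rows):
        for i in range(cols):
            # each microlens contributes a single pixel to the rendered view
            view[j, i] = raw[j * microlens_px + u, i * microlens_px + v]
    return view
```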
[0029] FIGS. 1B and 1C illustrate example focused plenoptic cameras 100b and 100c, respectively, according to some embodiments. The components shown in FIGS. 1B and 1C are not necessarily to scale relative to each other, nor are the distances between the components necessarily to scale, nor are the sizes of the components necessarily to scale. The focused plenoptic cameras 100b, 100c may include at least a main lens 105, a microlens array 110, and a photo sensor 115. In contrast to the conventional plenoptic camera system 100a of FIG. 1A, the microlens array 110 of a focused plenoptic camera 100b, 100c can be focused on an image plane 120 of the main camera lens instead of at infinity. With a focused plenoptic camera 100b, 100c, each microlens can reimage the main lens image onto the photo sensor 115. The microlenses 110 can form an array of true images of the main lens image as a relay system, thereby each forming a microimage on the photo sensor 115.
[0030] An array of micro cameras (formed by the projection of the microlenses 110 onto the photo sensor 115) observe the "object" in front of them. This "object" is the aerial 3D image of the scene, formed behind the main camera lens 105, represented as a shaded ovoid in FIGS. 1B and 1C. Accordingly, the ovoid shaded area 125 in FIGS. 1B and 1C represents the three-dimensional (3D) image formed inside the camera by the main camera lens. As illustrated, this 3D image 125 may extend behind the microlenses 110.
[0031] FIG. 1B illustrates an example of a Keplerian telescopic system 100b wherein the image plane 120 being imaged is in front of the microlenses 110. If the main lens 105 forms an image behind the microlenses 110, it is still possible to focus the microlenses 110 onto that virtual image so that they form a real image on the photo sensor 115, such as in the example Galilean telescopic camera 100c of FIG. 1C. In both the Keplerian telescopic camera 100b and the Galilean telescopic camera 100c, the microlens imaging can be described by the lens equation:
1/a + 1/b = 1/f   (1)

with, respectively, positive a (Keplerian telescopic system 100b) or negative a (Galilean telescopic system 100c). When remapped onto the photo sensor, the image of the main lens can be reduced in size. This reduction may be denoted as:

m = a/b   (2)
[0032] As a result of this scaling, the spatial resolution of the radiance captured by the plenoptic camera is a function of the resolution of the microlens images and the amount of overlap in rendering, and not of the number of microlenses. This decoupling of resolution and number of microlenses distinguishes the focused plenoptic camera 100b, 100c from the conventional plenoptic camera 100a.
[0033] Another difference between the conventional plenoptic camera 100a and the focused plenoptic camera 100b, 100c is in the nature of the information that is captured by each microlens. In the conventional plenoptic camera 100a, each microlens images one position in the scene, capturing all of the angular information there. In the focused plenoptic camera 100b, 100c, different microlenses capture the same position; angular information is spread across microlenses. Accordingly, to render flats captured with the focused plenoptic camera 100b, 100c, the rendering algorithm can integrate across microlens images, rather than within a single microlens image. That is, assuming that the task is "imaging the image" that is in focus, the rendering algorithm integrates the points in the microlenses that correspond to the same position in the image by overlapping them at a fixed pitch.
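As an illustration (not taken from the patent), the sketch below shows one simple way such "integration across microimages" can be realized: a centered patch is cropped from every microimage and the patches are abutted at a fixed pitch. The layout assumptions and names are hypothetical, and the fixed patch size corresponds to a single choice of focal plane.

```python
import numpy as np

def render_focused(raw, microlens_px, patch_px):
    """Tile a centered patch from every microimage at a fixed pitch (sketch).

    In a focused plenoptic capture, the same scene point is repeated across
    neighboring microimages, so rendering integrates across microimages rather
    than within one; cropping a patch_px x patch_px patch from each microimage
    and abutting the patches is the simplest version of that idea.
    """
    rows, cols = raw.shape[0] // microlens_px, raw.shape[1] // microlens_px
    off = (microlens_px - patch_px) // 2
    out = np.empty((rows * patch_px, cols * patch_px), dtype=raw.dtype)
    for j in range(rows):
        for i in range(cols):
            micro = raw[j * microlens_px:(j + 1) * microlens_px,
                        i * microlens_px:(i + 1) * microlens_px]
            out[j * patch_px:(j + 1) * patch_px,
                i * patch_px:(i + 1) * patch_px] = micro[off:off + patch_px,
                                                         off:off + patch_px]
    return out
```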
[0034] FIG. 1D illustrates an example thin plenoptic camera system 100d based on microspheres, according to some embodiments. This example thin plenoptic camera 100d may be similar to the focused plenoptic camera 100c illustrated in FIG. 1C; however, the microlens array is replaced by microspheres 130. The microspheres 130 may be attached or fixed to the surface of the photo sensor 115 by any of several techniques; for example a thin layer (e.g., a few nanometers thick) of a transparent adhesive material may be deposited on the pixel surface of the photo sensor 115, a layer of molten glass or a similar substance may be deposited on the pixel surface of the photo sensor 115, or the microspheres 130 may be embedded in a substance that is very flat on top, while covering the microspheres 130. With appropriately selected and arranged components, a thin plenoptic camera as illustrated in FIG. 1D may be approximately 5 mm (millimeters) thin, or even thinner, and thus suitable for use in thin mobile devices.
[0035] FIGS. 1A-1D are intended to illustrate example plenoptic camera systems 100a-100d that can carry out the Radon photography techniques described herein. However, it will be appreciated that any plenoptic or light field camera capable of capturing the radiance of an image scene can implement Radon photography techniques in order to reduce the amount of captured image data. In some embodiments, a camera can directly capture a 3D Radon image instead of capturing 4D radiance data and generating the 3D Radon image from the 4D radiance data.
Overview of Example System
[0036] FIG. 2 illustrates a high-level schematic block diagram of an embodiment of an image capture device 200 having Radon photography capabilities, the device 200 having a set of components including an image processor 220 linked to a camera assembly 201. The image processor 220 is also in communication with a working memory 265, memory 230, and device processor 255, which in turn is in communication with storage 270 and an optional electronic display 260.
[0037] Device 200 may be a cell phone, digital camera, tablet computer, personal digital assistant, or the like. There are many portable computing devices in which rendering from a reduced quantity of plenoptic image data such as is described herein would provide advantages. Device 200 may also be a stationary computing device or any device in which the Radon photography technique would be advantageous. A plurality of applications may be available to the user on device 200. These applications may include traditional photographic and video applications, high dynamic range imaging, panoramic photo and video, multispectral photo and video, stereoscopic imaging such as 3D images or 3D video, and plenoptic photo and video.
[0038] The image capture device 200 includes plenoptic camera assembly 201 for capturing external images. The camera 201 can be any of the plenoptic camera configurations 100a-100d described above with respect to FIGS. 1A-1D, in some embodiments. Although depicted with a main lens 205, microlens A 210a through microlens N 210n, and a single image sensor 215, embodiments of camera 201 other than the example illustration can have any combination of some or all of the main lens 205, microlens array 210a-210n, and sensor 215 capable of capturing the radiance data of a target image scene. In general, N microlenses can be used, where N ≥ 2. Some embodiments of the plenoptic camera assembly 201 can have additional components, for example additional lens assemblies and corresponding additional image sensors for capture of stereoscopic or multispectral plenoptic image data. In some embodiments, device 200 can include additional camera assemblies, for example a traditional (non-plenoptic) camera assembly in addition to the plenoptic camera assembly 201. The plenoptic camera assembly 201 can be coupled to the image processor 220 to transmit captured images to the image processor 220.
[0039] The image processor 220 may be configured to perform various processing operations on received image data comprising N microimages corresponding to the N microlenses in order to execute the Radon photography technique. Processor 220 may be a general purpose processing unit or a processor specially designed for imaging applications. Examples of image processing operations include cropping, scaling (e.g., to a different resolution), image stitching, image format conversion, color interpolation, color processing, image filtering (e.g., spatial image filtering), lens artifact or defect correction, etc. Processor 220 may, in some embodiments, comprise a plurality of processors. Processor 220 may be one or more dedicated image signal processors (ISPs) or a software implementation of a processor.
[0040] As shown, the image processor 220 is connected to a memory 230 and a working memory 265. In the illustrated embodiment, the memory 230 stores capture control module 235, Radon photography module 240, and operating system 250. The Radon photography module 240 includes sub-modules: Radon image generator 242, luminous density calculator 244, and flat rendering module 246. The modules of the memory 230 include instructions that configure the image processor 220 or device processor 255 to perform various image processing and device management tasks. Working memory 265 may be used by image processor 220 to store a working set of processor instructions contained in the modules of memory 230. Alternatively, working memory 265 may also be used by image processor 220 to store dynamic data created during the operation of device 200.
[0041] As mentioned above, the image processor 220 is configured by several modules stored in the memories. The capture control module 235 may include instructions that configure the image processor 220 to adjust the focus position of plenoptic camera assembly 201. Capture control module 235 may further include instructions that control the overall image capture functions of the device 200. For example, capture control module 235 may include instructions that call subroutines to configure the image processor 220 to capture raw plenoptic image data of a target image scene using the plenoptic camera assembly 201. In one embodiment, capture control module 235 may then call the Radon photography module 240 to reduce the size of the captured plenoptic image data and output the reduced size image data to the imaging processor 220. In another embodiment, capture control module 235 may then call the Radon photography module 240 to perform a rendering operation on the raw plenoptic data in order to output a flat, refocusable image to imaging processor 220. Capture control module 235 may also call the Radon photography module 240 to perform a rendering operation on raw plenoptic image data in order to output a preview image of a scene to be captured, and to update the preview image at certain time intervals, when the scene in the raw image data changes, or when a user changes the focus of the preview image.
[0042] Radon photography module 240 can call sub-modules Radon image generator 242, luminous density calculator 244, and flat rendering module 246 to perform different portions of the plenoptic data processing and image rendering operations. The Radon image generator 242 can include instructions that configure the image processor 220 to generate a Radon image, the Radon image being a relatively smaller quantity of data compared to the raw plenoptic image data. For example, Radon image generator 242 can include instructions that configure the image processor 220 to apply a Radon transform to plenoptic image data. Some embodiments of the Radon image generator 242 can operate on stored plenoptic image data, while other embodiments of the Radon image generator 242 can change the structure of plenoptic image data as it is captured prior to storage. For example, the pixels or microimages of a photo sensor can be scanned in rows or lines corresponding to intersection lines with a plane of integration, and the values of the light rays incident on those pixels or microimages can be integrated, as will be discussed in more detail below. In some embodiments, the Radon photography module 240 can transmit the Radon image to the image processor 220 for storage in the storage module 270. In some embodiments, the Radon photography module 240 can transmit the Radon image to the luminous density calculator 244.
[0043] The luminous density calculator 244 can include instructions that configure the processor 220 to perform processing operations on the Radon image to generate plenoptic image data from the Radon image. For example, the luminous density calculator 244 can include instructions that configure the processor 220 to apply an inverse Radon transform to the Radon image to recover the original luminous density captured by the plenoptic camera assembly 201, or to recover an approximation of the original luminous density.
[0044] The flat rendering module 246 can include instructions that configure the image processor 220 to perform a rendering operation on the output of the luminous density calculator 244. For example, the flat rendering module 246 can include instructions that configure the image processor 220 to output an image (sometimes referred to as a "flat" in plenoptic photography) by projecting the luminous density onto an image plane, which can produce a flat at different viewpoints or different depths of focus. The processor 220 can store the image or output the image for display to a user, wherein the user can dynamically refocus the image through a range of focus depths captured by the plenoptic image data, and can dynamically adjust a viewpoint of the image through a range of viewpoints captured by the plenoptic image data. As the user inputs commands to adjust the focus and/or viewpoint of the image, the flat rendering module 246 can include instructions that cause the image processor 220 to respond to the user commands and render an updated image. In some embodiments, the flat rendering module 246 can include instructions that configure the image processor 220 to output a three-dimensional or stereoscopic image and to update the image based on user input.
[0045] Operating system module 250 configures the image processor 220 to manage the working memory 265 and the processing resources of device 200. For example, operating system module 250 may include device drivers to manage hardware resources such as the camera assembly 201. Therefore, in some embodiments, instructions contained in the image processing modules discussed above may not interact with these hardware resources directly, but instead interact through standard subroutines or APIs located in operating system component 250. Instructions within operating system 250 may then interact directly with these hardware components. Operating system module 250 may further configure the image processor 220 to share information with device processor 255. [0046] Device processor 255 may be configured to control the display 260 to display the captured image, or a preview of the captured image, to a user. The display 260 may be external to the imaging device 200 or may be part of the imaging device 200. The display 260 may also be configured to provide a view finder displaying a preview image for a use prior to capturing an image, or may be configured to display a captured image stored in memory or recently captured by the user. The display 260 may comprise an LCD or LED screen, and may implement touch sensitive technologies.
[0047] Device processor 255 may write data to storage module 270, for example data representing captured images. While storage module 270 is represented graphically as a traditional disk device, those with skill in the art would understand that the storage module 270 may be configured as any storage media device. For example, the storage module 270 may include a disk drive, such as a floppy disk drive, hard disk drive, optical disk drive or magneto-optical disk drive, or a solid state memory such as a FLASH memory, RAM, ROM, and/or EEPROM. The storage module 270 can also include multiple memory units, and any one of the memory units may be configured to be within the image capture device 200, or may be external to the image capture device 200. For example, the storage module 270 may include a ROM memory containing system program instructions stored within the image capture device 200. The storage module 270 may also include memory cards or high speed memories configured to store captured images which may be removable from the camera. The storage module 270 can also be external to device 200, and in one example device 200 may wirelessly transmit data to the storage module 270, for example over a network connection.
[0048] Although Figure 2 depicts a device having separate components to include a processor, imaging sensor, and memory, one skilled in the art would recognize that these separate components may be combined in a variety of ways to achieve particular design objectives. For example, in an alternative embodiment, the memory components may be combined with processor components, for example to save cost and/or to improve performance.
[0049] Additionally, although Figure 2 illustrates two memory components, including memory component 230 comprising several modules and a separate memory 265 comprising a working memory, one with skill in the art would recognize several embodiments utilizing different memory architectures. For example, a design may utilize ROM or static RAM memory for the storage of processor instructions implementing the modules contained in memory 230. The processor instructions may be loaded into RAM to facilitate execution by the image processor 220. For example, working memory 265 may comprise RAM memory, with instructions loaded into working memory 265 before execution by the processor 220.
Overview of Example Radon Photography Process
[0050] FIG. 3 illustrates an embodiment of a Radon photography process 300. The process 300 is described as being executed by the Radon photography module 240 and its subcomponents, described above with respect to FIG. 2. However, this is for illustrative purposes and is not meant to limit the process 300, which can be executed by any device or system having Radon photography capabilities.
[0051] At block 305, the Radon photography module 240 can receive plenoptic image data, for example from the plenoptic camera assembly 201, which can be any of the plenoptic cameras 100a-100d represented by FIGS. 1A-1D or any other camera capable of capturing plenoptic or light field image data. The Radon photography module 240 may receive a complete set of plenoptic image data after image capture in some embodiments. In other embodiments, the Radon photography module 240 may receive plenoptic image data during image capture. For example, the Radon photography module 240 may receive the plenoptic image data in portions corresponding to predetermined planes of integration.
[0052] At block 310, the Radon image generator 242 of the Radon photography module 240 can generate a Radon image from the input plenoptic image data. For example, the Radon image generator 242 can apply the Radon transform to calculate the integral of the value of light rays in one or more planes of integration.
[0053] Given a luminous density f(x, y, z), the Radon image fθ,φ(r) can be computed by integrating the function value over all possible planes in one embodiment. In another embodiment, a subset of all possible planes can be identified for computing the Radon image fθ,φ(r). The plane(s) of integration can be planes normal to and intersecting the sensor plane of the plenoptic camera, and the position and location of the plane(s) of integration can be determined based on the configuration of the sensor such that each distinct pixel of the sensor (or each distinct microimage formed on the sensor by the microlens array) is sampled by at least one plane of integration.
[0054] The Radon transform in three dimensions can be expressed as:

fa,b,c(r) = ∫∫∫ f(x, y, z) δ(ax + by + cz - r) dx dy dz   (3)
[0055] A parameterization of a plane, given by the parameters φ, θ, r can be beneficial in some embodiments for computing the inverse Radon transform used to render images from the Radon image. Accordingly, the Radon transform for a plane defined in polar coordinates can be expressed as Equation (4), below:

fθ,φ(r) = ∫∫∫ f(x, y, z) δ(x cos φ sin θ + y sin φ sin θ + z cos θ - r) dx dy dz   (4)
[0056] An example of the Radon photography technique can effectively integrate all rays that fall within the plane and then apply the inverse Radon transform (i.e., the inverse of Equations (3) or (4)), as will be explained in more detail with respect to block 315.
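For illustration only, the sketch below numerically approximates the forward transform of Equation (4) on a voxelized density; the discretized delta (a distance tolerance around each plane), the function name, and the grid conventions are assumptions made here, not part of the disclosure.

```python
import numpy as np

def radon_3d(density, thetas, phis, rs, tol=0.5):
    """Discrete 3D Radon transform of a voxel volume (illustrative only).

    density: 3D array f(x, y, z); thetas and phis in radians, rs in voxel units.
    Each output entry sums f over voxels within `tol` voxels of the plane
    x cos(phi) sin(theta) + y sin(phi) sin(theta) + z cos(theta) = r,
    a crude stand-in for the delta function in Equations (3) and (4).
    """
    x, y, z = np.indices(density.shape)
    radon = np.zeros((len(thetas), len(phis), len(rs)))
    for i, th in enumerate(thetas):
        for j, ph in enumerate(phis):
            # signed distance of every voxel center from the plane through the origin
            proj = (x * np.cos(ph) * np.sin(th)
                    + y * np.sin(ph) * np.sin(th)
                    + z * np.cos(th))
            for k, r in enumerate(rs):
                radon[i, j, k] = density[np.abs(proj - r) < tol].sum()
    return radon
```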
[0057] At block 315, the luminous density calculator 244 of the Radon photography module 240 can sum over all of the planes of the Radon image passing through a point. This can calculate a back-projection of the Radon image. For example, the luminous density calculator 244 can apply an inverse Radon transform to the Radon image as described below in Equation (5). Luminous density calculator 244 can take the input Radon image fθ,φ(r) and reproduce the luminous density f(x, y, z). By applying the inverse Fourier transform and Fourier Slice Theorem, the expression for determining the inverse Radon transform can be:

f(x, y, z) = -(1/(8π²)) Δx,y,z ( ∫φ=0 to 2π ∫θ=0 to π fθ,φ(r) sin θ dθ dφ )   (5)

where r is given by:

r = x cos φ sin θ + y sin φ sin θ + z cos θ   (6)

Notably, Equation (5) has the same form as the Poisson equation. Accordingly, the back-projection of the Radon image can be expressed as:

f1(x, y, z) = ∫φ=0 to 2π ∫θ=0 to π fθ,φ(r) sin θ dθ dφ   (7)

where r is given by Equation (6) and f1 represents the back-projected Radon image. In some embodiments, f1 can look like a blurred version of the actual function f(x, y, z).
[0058] At block 320, the luminous density calculator 244 can apply a three-dimensional Laplacian of Gaussian to the back-projection f1 in order to recover the original luminous density of the image scene, or an approximation thereof. In some embodiments this can produce a noisy version of the original function f(x, y, z), which after denoising can provide the density of the virtual light sources. Accordingly, in some embodiments the luminous density calculator 244 can perform denoising on the back-projection f1 after applying the three-dimensional Laplacian of Gaussian.
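As an illustration of blocks 315-320 (not the patented implementation), the following sketch back-projects a sampled Radon image per Equation (7), sharpens it with a 3D Laplacian of Gaussian, and applies a median filter as a stand-in for the denoising step. The nearest-r lookup, the unit grid spacing, and the scaling constants are simplifying assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace, median_filter

def invert_radon_3d(radon, thetas, phis, rs, shape, sigma=1.0):
    """Approximate recovery of the luminous density from plane sums (sketch).

    radon[i, j, k] holds the plane sums f_theta,phi(r). Back-projection smears
    each plane sum over the voxels near its plane (Equation (7)); the Laplacian
    of Gaussian then sharpens the blurred volume per Equation (5), and a median
    filter is used here as one possible denoising step.
    """
    x, y, z = np.indices(shape)
    backproj = np.zeros(shape)
    d_theta = thetas[1] - thetas[0]
    d_phi = phis[1] - phis[0]
    for i, th in enumerate(thetas):
        for j, ph in enumerate(phis):
            proj = (x * np.cos(ph) * np.sin(th)
                    + y * np.sin(ph) * np.sin(th)
                    + z * np.cos(th))
            # nearest sampled r for every voxel (assumes rs are spaced one unit apart)
            k = np.clip(np.round(proj - rs[0]).astype(int), 0, len(rs) - 1)
            backproj += radon[i, j, k] * np.sin(th) * d_theta * d_phi
    sharpened = -gaussian_laplace(backproj, sigma=sigma) / (8.0 * np.pi ** 2)
    return median_filter(sharpened, size=3)
```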
[0059] At block 325, the Radon photography module 240 can output the luminous density of the plenoptic data, for example in one embodiment for rendering an image with dynamic focus and vantage point. Projecting the luminous density onto an image plane can produce images of the target image scene at different depths of focus or different vantage points. In one embodiment, the flat rendering module 246 can perform rendering of images from the luminous density. In another embodiment, the Radon photography module 240 can output the luminous density for storage.
[0060] At block 330, the flat rendering module 246 can project the luminous density onto an image plane to produce a dynamically refocusable rendered image of the image scene. The rendered image is referred to as "dynamically refocusable" due to the fact that, based on the luminous density, images from different viewpoints and/or having different depths of focus can be rendered of the same image scene.
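For illustration, one very simple projection of a recovered density volume onto an image plane is sketched below; the slab integration, parameter names, and the orthographic treatment are assumptions used only to show how changing the depth parameter refocuses the rendered image.

```python
import numpy as np

def render_slice(density, z_index, depth_of_field=3):
    """Render a simple refocused view from a recovered luminous density (sketch).

    Treats the density volume as virtual light sources and projects it
    orthographically onto an image plane by integrating a slab of slices
    centered on `z_index`; shifting `z_index` refocuses the result, and shearing
    the volume before projection (not shown) would change the vantage point.
    """
    lo = max(0, z_index - depth_of_field // 2)
    hi = min(density.shape[2], z_index + depth_of_field // 2 + 1)
    return density[:, :, lo:hi].sum(axis=2)
```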
[0061] At optional block 335, the flat rendering module 246 can adjust the focus and/or vantage point of the rendered image based on user input. For example, flat rendering module 246 can receive an indication of user input from device processor 255 indicating that the user has selected an option to adjust one or both of the vantage point from which the rendered image appears to have been taken or the depth of focus of the rendered image.
Overview of Example Radon Image Generation Process
[0062] FIG. 4A illustrates an embodiment of plenoptic camera planes 400 that can be used in Radon image generation. The planes 400 can include an image plane 405, a microlens plane 430, a plane of integration 415, and a sensor plane 435. The image plane 405 can represent a plane formed by the image plane, for example a plane passing through the point of focus of a main lens of a plenoptic camera. The microlens plane 430 can represent a plane formed through a microlens array of a plenoptic camera, for example a plane bisecting each of the lenses of the microarray. The plane of integration 415 can represent a plane intersecting both the image plane 405 and the microlens plane 430. Although the plane of integration 415 is depicted in a certain location relative to the other planes, in some embodiments the location of the plane of integration 415 can vary based on the parameters of the photo sensor, and in addition multiple planes of integration can be used to generate a Radon image, as will be discussed in more detail below. The sensor plane 435 can represent a plane formed by the surface of a photo sensor of a plenoptic camera, and can include a plurality of regions 440 corresponding to microimages formed by the microlenses within the microlens array.
[0063] Correct parameterization of the objects involved can aid in efficient computing of the Radon image. In one embodiment, 2-plane parameterization can be used for representing light fields where the first two parameters (a first coordinate) denote the location of the start point of the ray 420 under consideration on the image plane 405 and the last two parameters (a second coordinate) denote the location of the end point of the ray 420 on the microlens plane 430, where the image plane 405 and the microlens plane 430 can be separated by a distance A.
[0064] To compute the integral over light rays 420 lying in the plane of integration 415, the Radon photography technique can determine the pixels onto which image line 410 is mapped in the corresponding microimages 440 of the microlenses lying along the microlens line 425. In one embodiment, each microlens can correspond to one microimage, and each microimage can correspond to one pixel of the image sensor. In another embodiment, each microimage can correspond to a plurality of pixels of the image sensor. The image line 410 can represent a vector passing through the intersection of the image plane 405 and the plane of integration 415. The microlens line 425 can represent a vector passing through the intersection of the plane of integration 415 and the microlens plane 430.
[0065] In order to locate the pixels/microimages lying along the microlens line 425, the Radon photography technique can compute the focal length of a microlens and compute the magnification of the microlens using Equation (1) above. This can be computed, in some embodiments, for each microlens in the array that intersects with the microlens line, allowing the Radon photography technique to locate the pixels corresponding to the rays 420 lying in the plane of integration.
[0066] An example of a simple model for the plenoptic camera is illustrated in FIG. 4A where the light rays, after entering the camera through the main lens, form a real image at plane P1 405. This image is re-imaged by each of the microlenses located at plane P2 430 onto the sensor plane 435. The Radon photography technique can determine the sum over all rays 420 that lie in an arbitrary plane P 415.
[0067] To compute the integral of rays lying in plane P 415, one needs to determine the corresponding pixels on the sensor plane 435 where the rays are mapped by each of the microlenses. For this purpose, the Radon photography technique can determine the image of line l1 410 as produced by an arbitrary microlens, placed at (x0, y0). The equation of line l1 can be given by:
ax + by = r (8)
[0068] Similarly, the equation of line l2 can be given by:
ax + by = r - cA (9)
[0069] The image of a point (x, y) by the microlens placed at (x0, y0) can be given by:
x1 = -M(x - x0), y1 = -M(y - y0) (10) where M is the magnification of the microlens. Substituting the values of x and y from the above equations into the equation of line l1 410 written in the form of y = mx + d, we get:
y1 = mx1 - M(mx0 + d - y0) (11)
[0070] Note that for Equation (11), the origin is fixed at the intersection of the optical axis of the microlens with the sensor plane 435, M is the microlens magnification, m equals the slope of the microlens line 425, and
m = -a/b = -cot φ   (12)

d = r/b = r/(sin φ sin θ)   (13)

The Radon image fθ,φ(r) can be obtained by summation over all pixels that lie on the line represented by Equation (11), for each microlens intersected by plane P 415. These pixels are represented in FIG. 4A by line of intersection 445.
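For illustration only, the sketch below evaluates this mapping for one microlens using Equations (11)-(13) as reconstructed above; the function name, units, and origin conventions are simplifying assumptions rather than part of the disclosure.

```python
import numpy as np

def microimage_line(theta, phi, r, x0, y0, M):
    """Slope and intercept of the line a plane of integration traces in one microimage.

    The plane (theta, phi, r) meets the image plane in the line y = m*x + d
    (Equations (12) and (13) as reconstructed here), and the microlens at
    (x0, y0) with magnification M images that line onto its microimage as
    y1 = m*x1 - M*(m*x0 + d - y0), per Equation (11).
    """
    m = -1.0 / np.tan(phi)                    # Equation (12): m = -a/b = -cot(phi)
    d = r / (np.sin(phi) * np.sin(theta))     # Equation (13): d = r/b
    slope = m
    intercept = -M * (m * x0 + d - y0)        # Equation (11)
    return slope, intercept
```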
[0071] FIG. 4B illustrates an embodiment of a set of pixels or microimages 440 that can be used in Radon image generation. FIG. 4B illustrates one example of how a set of microimages or pixels 440 can be selected from the plurality of microimages or pixels 440 of the plenoptic camera photo sensor once an intersection line 445 is determined according to Equations (8) through (11) above. In the illustrated example, each microimage/pixel 440 that the line 445 passes through (depicted using the orange color) can be selected for computing the Radon transform of the corresponding plane of integration 415, as represented by the shaded microimages 450. The Radon transform can be calculated using pixel values of the selected microimages 450. FIG. 4B illustrates one example of integrating over the pixels 450 in a given microimage, which can be done by adding up pixel intensity values. In order to compute the Radon image fθ,φ(r), multiple planes of integration are used, as discussed above with respect to block 315 of FIG. 3.
[0072] FIG. 4C illustrates an embodiment of lines of integration 460 that can be used in Radon image generation. The lines of integration 460 can correspond to a plurality of intersection lines (that is, the mapping of image lines/microlens lines onto the sensor plane 435) and therefore represent edges of a plurality of planes of integration used to generate a Radon image fθ,φ(r).
[0073] As shown in FIG. 4C, a plurality of lines of integration 460 sample distinct microimages 440 corresponding to the microlenses of the plenoptic camera when the steps by which φ 455 is varied are large enough. One way to ensure that each microimage 440 is sampled by at least one line of integration 460 is to generate lines from a starting point on the sensor plane 435 (here illustrated as a corner of the sensor plane 435) to each of the microimages 440 along at least one border of the sensor plane 435 (here illustrated as ending approximately in the center of the outer edge of each microimage 440, the outer edge being a border of the sensor plane 435). The angles formed by each pair of adjacent lines can be measured, and the smallest angle can be selected as the step size φ 455. This represents one example of determining a step size for φ.
[0074] The choices for the start and end values and the step size for the parameters φ, θ, and r in the equations above can determine the sampling of the light field space. The parameter φ can represent the step size between lines of integration over a plenoptic sensor. One embodiment for determining the step size for φ is described above with respect to FIG. 4C. In another embodiment, φ can be varied in steps of one degree from zero to 360 degrees. This sampling can be sufficient in one example for images captured using a plenoptic sensor having a 75x75 area. Denser sampling (e.g., smaller step size for φ) may not provide additional information due to adjacent lines sampling the same set of pixels, leading to redundant information, larger data size for the Radon image, greater processor usage, and longer run times.
[0075] The choice of location for the origin, which is the center of the sphere used for parameterizing planes, can also have an impact on the computing required for the Radon transform. Locating the origin at the center of the point cloud of the scene as imaged by the main lens may provide for marginally better quality sampling; however, computing the center accurately can be costly and is not required. Accordingly, some embodiments can compute the center, while other embodiments can approximate the center by using the following formula:

center = B (1/Mclosest + 1/Mfarthest) / 2   (14)

where Mclosest and Mfarthest are the minimum and maximum values of magnification for the scene under consideration with respect to the microlens array and B represents the distance between the microlens plane 430 and the sensor plane 435. One example of computing the approximate magnification of a given point with respect to a microlens array is given by the ratio between the number of times the point is repeated in a row and the number of horizontal pixels in a microimage.
[0076] The parameter r can represent a vector passing through the origin which is also perpendicular to the plane of integration. The maximum value that r can have in one embodiment (rmax) is the length of the diagonal of an image formed on the sensor, determined from the number of pixels in the vertical and horizontal directions. In one embodiment, r can be varied in steps of one between 1 and rmax, as a smaller step size may not significantly improve reconstruction of the plenoptic function from the Radon image due to the sensor tolerance of one pixel. Smaller step size may, however, contribute to a significant increase in run time due to denser sampling.
[0077] The parameter θ can represent an angle between the vector r and the z-axis in an (x,y,z) coordinate system mapped to the three-dimensional image scene. In the above equations, the interval over which θ is varied substantially contributes to refocusing due to its association with depth values along the z-axis in the data representing the three-dimensional target image scene. In one embodiment, θ can be varied between 80 and 100 degrees with a step size of one degree.
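For illustration, the example ranges above can be collected into a sampling grid as sketched below; the function name and the assumption that the sensor diagonal is computed from its pixel dimensions are editorial, not part of the disclosure.

```python
import numpy as np

def sampling_grid(sensor_width_px, sensor_height_px):
    """Build a (phi, theta, r) sampling for a Radon image (illustrative sketch).

    Uses the example choices from the text: phi from 0 to 360 degrees in
    1-degree steps, theta from 80 to 100 degrees in 1-degree steps, and r from
    1 to the sensor diagonal in 1-pixel steps. Angles are returned in radians.
    """
    phis = np.deg2rad(np.arange(0.0, 360.0, 1.0))
    thetas = np.deg2rad(np.arange(80.0, 101.0, 1.0))
    r_max = int(np.hypot(sensor_width_px, sensor_height_px))
    rs = np.arange(1, r_max + 1, dtype=float)
    return phis, thetas, rs
```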
[0078] FIG. 5 illustrates an embodiment of a process 500 for generating a Radon image from input light field data. The process 500 is described as being executed by the Radon photography module 240 and its subcomponents, described above with respect to FIG. 2. However, this is for illustrative purposes and is not meant to limit the process 500, which can be executed by any device or system having Radon photography capabilities.
[0079] At block 505, the Radon photography module 240 can receive plenoptic image data, for example from the plenoptic camera assembly 201, which can be any of the plenoptic cameras 100a-100d represented by FIGS. 1A-1D or any other camera capable of capturing plenoptic or light field image data. The Radon photography module 240 may receive a complete set of plenoptic image data after image capture in some embodiments. In other embodiments, the Radon photography module 240 may receive plenoptic image data during image capture. For example, the Radon photography module 240 may receive the plenoptic image data in portions corresponding to predetermined planes of integration.
[0080] At block 510, the Radon image generator 242 can identify a plane of integration. In some embodiments, the Radon image generator 242 can receive data indicating sensor parameters such as pixel size, and can determine one or more planes of integration that collectively operate to sample each distinct pixel of, or microimage formed on, the sensor as is described above with respect to FIG. 4C. In other embodiments, the Radon image generator 242 can receive data indicating the number and location of the planes of integration, for example from image processor 220 retrieving such information from data store 270. For example, the plane or planes of integration needed for generating a Radon image can be predetermined for a sensor. In some embodiments, block 510 may be optional. Predetermined planes of integration can be used to scan only pixels, or a plurality of pixels corresponding to microimages, for pixels or microimages predetermined to intersect with the plane of integration, and this partial plenoptic image data can be sent to the Radon photography module 240 for application of the Radon transform.
[0081] At block 515, the Radon image generator 242 can determine a microlens line representing the intersection of the determined plane of integration and the microlens array of the plenoptic camera. For example, a microlens line 425 is illustrated above with respect to FIG. 4A. At block 520, the Radon image generator 242 can map the microlens line to specific pixels and/or microimages on the sensor. This can be accomplished, in one example, by Equation (11) described above. In some embodiments, planes of integration and their corresponding microlens lines and pixels/microimages mapped to the microlens lines can be computed in advance of image capture, and this data can be retrieved by the Radon image generator 242 during application of the Radon transform to input plenoptic image data. In further embodiments, the pre-computed data indicating pixels mapped to predetermined microlens lines can be used to segment data as it is captured by the sensor, sending only pixel values relevant to the current plane of integration to the Radon image generator, and blocks 515 and 520 can be optional.
[0082] At block 525, the Radon image generator 242 can calculate the sum of the pixel values that are mapped to the microlens line. In one embodiment, this summation can be performed microimage by microimage, over each pixel within the microimage that is mapped to the microlens line. In one example, the Radon image generator 242 can calculate the sum of the pixel values using either of Equations (3) and (4) above.
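A minimal sketch of the summation at block 525 follows; the (row, column) pixel coordinates for a given microlens line are assumed to have been obtained separately (for example via Equations (3), (4), or (11)), and the sensor data shown is synthetic.

```python
import numpy as np

def plane_integral(sensor, pixel_coords):
    """Sum the sensor values at the pixel coordinates mapped to one microlens
    line (block 525).  pixel_coords is an (N, 2) array of (row, col) indices;
    how it is derived from the plane of integration is not shown here."""
    rows, cols = pixel_coords[:, 0], pixel_coords[:, 1]
    return float(sensor[rows, cols].sum())

# Usage with assumed data: a random sensor and one column of pixels.
sensor = np.random.rand(512, 512)
coords = np.stack([np.arange(512), np.full(512, 100)], axis=1)
value = plane_integral(sensor, coords)
```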
[0083] The process 500 as detailed above can be repeated, in some embodiments, for each plane of integration determined to be needed for computing the Radon image.
[0084] Some embodiments of the process 500 can be implemented after image capture to change the structure of existing or stored plenoptic image data. Other embodiments of the process 500 can be implemented during image capture to change the structure of plenoptic image data as it is captured. For example, intersection lines can be predetermined prior to image capture such that the intersection lines sample each distinct pixel or microimage, as illustrated by FIG. 4C. Pixels corresponding to the intersection lines can be determined according to Equation (11) above. During image capture, the pixels of the photosensor can be scanned in rows or lines according to the corresponding intersection lines. Some embodiments may scan the pixels sequentially for each intersection line, and accordingly the integral value of each plane of integration corresponding to the intersection lines can be sequentially calculated, for example based at least partly on the pixel intensity values. In other embodiments, the values for the planes of integration can be computed in parallel. The values of the light rays (represented by pixel values) incident on those pixels or microimages can be summed or integrated according to the Radon transform of the plane expressed in Equation (4). The resulting values, representing the sums of light rays over the planes of integration, can be stored as a Radon image. In one embodiment, the Radon image data can be represented as an array of values with one value for each sampled plane, wherein each value in the array is the sum of light rays over that plane.
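The assembly of the Radon image as an array with one summed value per sampled plane, using precomputed per-plane pixel lookups as described above, might be sketched as follows; the lookup structure, plane parameterization, and data are assumptions for illustration.

```python
import numpy as np

def build_radon_image(sensor, planes, pixels_for_plane):
    """Sketch of [0084]: one summed value per plane of integration, stored as
    a flat array.  planes lists (theta, rho) pairs and pixels_for_plane is an
    assumed lookup, precomputed before capture, from each plane to the
    (rows, cols) sensor pixels that sample it."""
    radon = np.empty(len(planes))
    for k, plane in enumerate(planes):
        rows, cols = pixels_for_plane[plane]
        radon[k] = sensor[rows, cols].sum()   # sum of light rays over the plane
    return radon

# Usage with a toy sensor and two assumed planes sampling two pixel columns.
sensor = np.random.rand(8, 8)
planes = [(0.0, 0), (0.0, 1)]
pixels_for_plane = {
    (0.0, 0): (np.arange(8), np.zeros(8, dtype=int)),
    (0.0, 1): (np.arange(8), np.ones(8, dtype=int)),
}
radon_image = build_radon_image(sensor, planes, pixels_for_plane)
```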
Overview of Example Process for Integration of a Radon Image
[0085] FIG. 6 illustrates an embodiment of a process 600 for generating plenoptic image data from a Radon image. The process 600 is described as being executed by the Radon photography module 240 and its subcomponents, described above with respect to FIG. 2. However, this is for illustrative purposes and is not meant to limit the process 600, which can be executed by any device or system having Radon photography capabilities.
[0086] At block 605, the luminous density calculator 244 can receive Radon image data, for example from the Radon image generator 242 or from the image processor 220 retrieving a Radon image stored in the data store 270. The Radon image can be an expression of a plenoptic function in terms of summed energy over planes intersecting the image sensor of the capturing plenoptic camera, rather than the traditional expression of plenoptic image data in terms of summed energy of rays incident on the image sensor.
[0087] At block 610, the luminous density calculator 244 can calculate an intermediate function through back-projection of the Radon image. In one embodiment, this can be accomplished through Equation (7) defined above. Back-projection takes the Radon image function, defined on each plane in the three-dimensional image space, and projects the planes back over the image space to reproduce the original luminous density. The back-projection can be a blurry version of the original plenoptic function in some embodiments.
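A back-projection of this kind might be sketched on a discrete voxel grid as follows; the plane parameterization, grid, and tolerance are assumptions, and the patent's Equation (7) is not reproduced here.

```python
import numpy as np

def back_project(radon, planes, voxels, tol):
    """Sketch of block 610: each plane's Radon value is smeared back over the
    voxels lying (approximately) on that plane, accumulating a blurry estimate
    of the luminous density.  voxels is an (N, 3) array of voxel centers; each
    plane is an assumed (unit normal, offset) pair."""
    density = np.zeros(len(voxels))
    for value, (normal, offset) in zip(radon, planes):
        on_plane = np.abs(voxels @ normal - offset) <= tol
        density[on_plane] += value
    return density

# Usage on a small assumed grid with a single horizontal plane z = 0.
xs = np.linspace(-1.0, 1.0, 16)
voxels = np.stack(np.meshgrid(xs, xs, xs, indexing="ij"), axis=-1).reshape(-1, 3)
planes = [(np.array([0.0, 0.0, 1.0]), 0.0)]
density = back_project(np.array([1.0]), planes, voxels, tol=xs[1] - xs[0])
```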
[0088] At block 615, the luminous density calculator 244 can apply the Laplacian of Gaussian operator to the blurry back-projection in order to sharpen the back-projection and bring it closer to the original plenoptic function. However, application of the Laplacian of Gaussian operator can add undesirable noise to the back-projection in some embodiments.
[0089] Accordingly, at block 620, the luminous density calculator 244 can denoise the back-projection after application of the Laplacian of Gaussian operator to recover the original plenoptic function, or to substantially recover the original plenoptic function.
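Blocks 615 and 620 together might be sketched as follows; the Laplacian of Gaussian operator follows the text, while the use of SciPy's ndimage module, the choice of a median filter as the denoiser, and the parameter values are assumptions.

```python
import numpy as np
from scipy import ndimage

def sharpen_and_denoise(back_projection, log_sigma=1.0, denoise_size=3):
    """Sketch of blocks 615-620: sharpen the blurry back-projection with a
    Laplacian of Gaussian, then suppress the noise that the operator tends to
    amplify.  The median filter is one possible denoiser; the sigma and filter
    size are assumed values."""
    sharpened = ndimage.gaussian_laplace(back_projection, sigma=log_sigma)
    return ndimage.median_filter(sharpened, size=denoise_size)

# Usage on an assumed stand-in volume for a blurry back-projection.
blurry = np.random.rand(16, 16, 16)
recovered = sharpen_and_denoise(blurry)
```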
Implementing Systems and Terminology
[0090] Implementations disclosed herein provide systems, methods and apparatus for capturing and rendering from plenoptic image data. One skilled in the art will recognize that these embodiments may be implemented in hardware, software, firmware, or any combination thereof.
[0091] In some embodiments, the circuits, processes, and systems discussed above may be utilized in a wireless communication device. The wireless communication device may be a kind of electronic device used to wirelessly communicate with other electronic devices. Examples of wireless communication devices include cellular telephones, smart phones, Personal Digital Assistants (PDAs), e-readers, gaming systems, music players, netbooks, wireless modems, laptop computers, tablet devices, etc.

[0092] The wireless communication device may include one or more image sensors, two or more image signal processors, and a memory including instructions or modules for carrying out the CNR process discussed above. The device may also have data, a processor loading instructions and/or data from memory, one or more communication interfaces, one or more input devices, and one or more output devices such as a display device and a power source/interface. The wireless communication device may additionally include a transmitter and a receiver. The transmitter and receiver may be jointly referred to as a transceiver. The transceiver may be coupled to one or more antennas for transmitting and/or receiving wireless signals.
[0093] The wireless communication device may wirelessly connect to another electronic device (e.g., base station). A wireless communication device may alternatively be referred to as a mobile device, a mobile station, a subscriber station, a user equipment (UE), a remote station, an access terminal, a mobile terminal, a terminal, a user terminal, a subscriber unit, etc. Examples of wireless communication devices include laptop or desktop computers, cellular phones, smart phones, wireless modems, e-readers, tablet devices, gaming systems, etc. Wireless communication devices may operate in accordance with one or more industry standards such as the 3rd Generation Partnership Project (3GPP). Thus, the general term "wireless communication device" may include wireless communication devices described with varying nomenclatures according to industry standards (e.g., access terminal, user equipment (UE), remote terminal, etc.).
[0094] The functions described herein may be stored as one or more instructions on a processor-readable or computer-readable medium. The term "computer-readable medium" refers to any available medium that can be accessed by a computer or processor. By way of example, and not limitation, such a medium may comprise RAM, ROM, EEPROM, flash memory, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Disk and disc, as used herein, includes compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray® disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. It should be noted that a computer-readable medium may be tangible and non-transitory. The term "computer-program product" refers to a computing device or processor in combination with code or instructions (e.g., a "program") that may be executed, processed or computed by the computing device or processor. As used herein, the term "code" may refer to software, instructions, code or data that is/are executable by a computing device or processor.
[0095] Software or instructions may also be transmitted over a transmission medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of transmission medium.
[0096] The methods disclosed herein comprise one or more steps or actions for achieving the described method. The method steps and/or actions may be interchanged with one another without departing from the scope of the claims. In other words, unless a specific order of steps or actions is required for proper operation of the method that is being described, the order and/or use of specific steps and/or actions may be modified without departing from the scope of the claims.
[0097] It should be noted that the terms "couple," "coupling," "coupled" or other variations of the word couple as used herein may indicate either an indirect connection or a direct connection. For example, if a first component is "coupled" to a second component, the first component may be either indirectly connected to the second component or directly connected to the second component. As used herein, the term "plurality" denotes two or more. For example, a plurality of components indicates two or more components.
[0098] The term "determining" encompasses a wide variety of actions and, therefore, "determining" can include calculating, computing, processing, deriving, investigating, looking up (e.g., looking up in a table, a database or another data structure), ascertaining and the like. Also, "determining" can include receiving (e.g., receiving information), accessing (e.g., accessing data in a memory) and the like. Also, "determining" can include resolving, selecting, choosing, establishing and the like.
[0099] The phrase "based on" does not mean "based only on," unless expressly specified otherwise. In other words, the phrase "based on" describes both "based only on" and "based at least on."
[0100] In the foregoing description, specific details are given to provide a thorough understanding of the examples. However, it will be understood by one of ordinary skill in the art that the examples may be practiced without these specific details. For example, electrical components/devices may be shown in block diagrams in order not to obscure the examples in unnecessary detail. In other instances, such components, other structures and techniques may be shown in detail to further explain the examples.
[0101] Headings are included herein for reference and to aid in locating various sections. These headings are not intended to limit the scope of the concepts described with respect thereto. Such concepts may have applicability throughout the entire specification.
[0102] It is also noted that the examples may be described as a process, which is depicted as a flowchart, a flow diagram, a finite state diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel, or concurrently, and the process can be repeated. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a software function, its termination corresponds to a return of the function to the calling function or the main function.
[0103] The previous description of the disclosed implementations is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these implementations will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other implementations without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the implementations shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims

WHAT IS CLAIMED IS:
1. A system in an electronic device for rendering a final image, the system comprising:
a plenoptic camera including a microlens array and an image sensor, the plenoptic camera configured to capture plenoptic image data of an image scene; and
one or more processors in data communication with the plenoptic camera and configured to at least:
receive the captured plenoptic image data, the captured plenoptic image data including a plurality of microimages each formed by one of a plurality of microlenses in the microlens array focusing light from the image scene onto the image sensor,
determine a plurality of lines of integration located within a plane bisecting the microlens array, each of the plurality of lines of integration corresponding to a plane extending through three dimensional space of the image scene, and
generate a Radon image based at least partly on a plurality of pixels of the plenoptic image data corresponding to the plurality of lines of integration.
2. The system of claim 1, wherein the one or more processors are further configured to, for each of the plurality of lines of integration, determine a corresponding image line by mapping one line of the plurality of lines of integration to at least one corresponding microimage of the plurality of microimages.
3. The system of claim 2, wherein the one or more processors are further configured to identify the plurality of pixels based at least partly on intersection of the corresponding image line with the plurality of pixels.
4. The system of claim 3, wherein the one or more processors are further configured to generate a plurality of summed pixel values by, for each of the plurality of lines of integration, determining a sum of pixel values associated with each of the plurality of pixels intersecting the corresponding image line.
5. The system of claim 4, wherein the one or more processors are further configured to generate the Radon image as an array of the plurality of summed pixel values.
6. The system of claim 4, wherein the pixel values are intensity values.
7. A system in an electronic device for rendering a final image, the system comprising:
a first module configured to at least access data representing a Radon image, the Radon image representing, for each sampled plane of a plurality of sampled planes of an image scene, a sum of light energy in the sampled plane;
a second module configured to at least determine a luminous density of the image scene based at least partly on integrating the sum of light energy from each sampled plane of the plurality of sampled planes of the Radon image; and
a third module configured to at least project the luminous density onto an image plane to produce the final image.
8. The system of claim 7, wherein the final image can be adjusted to have different viewpoints or different depths of focus of the captured image scene.
9. The system of claim 7, wherein the Radon image is generated from captured plenoptic image data of the image scene.
10. The system of claim 9, wherein the captured plenoptic image data expresses a light field of the image scene in terms of intensity of a plurality of rays of light incident on an image sensor from objects in the image scene.
11. The system of claim 10, wherein the Radon image expresses the light field in terms of energy contained in planes located in the image scene, the planes intersecting the image sensor.
12. The system of claim 9, wherein the first module is further configured to receive the captured plenoptic image data and generate the Radon image from the captured plenoptic image data.
13. The system of claim 9, further comprising a plenoptic camera configured for capture of plenoptic image data of the image scene.
14. The system of claim 13, wherein the plenoptic camera comprises a main lens, a microlens array including a plurality of microlenses, and an image sensor.
15. The system of claim 14, wherein the plurality of microlenses each comprise a microsphere.
16. The system of claim 7, wherein the first module is configured to access the data representing the Radon image by retrieving the Radon image from a data store.
17. The system of claim 7, wherein the second module is configured to determine the luminous density of the image scene based at least partly on calculating a back-projection of the Radon image.
18. The system of claim 17, wherein the second module is further configured to at least apply a Laplacian of Gaussian operator to the back-projection.
19. The system of claim 18, wherein the second module is further configured to at least denoise the back-projection after applying the Laplacian of Gaussian operator.
20. A method for compressing plenoptic image data, the method comprising:
receiving at least a portion of data representing a plenoptic image of a three-dimensional image space from a plenoptic camera having a microlens array and an image sensor;
identifying a plane of integration intersecting a microlens plane, the microlens plane bisecting the microlens array;
determining an image line intersecting the plane of integration and an image plane located a first distance from the microlens plane;
determining a microlens line intersecting the plane of integration and the microlens plane;
mapping the image line to a microimage based at least partly on intersection of the microlens line and the microlens, the microimage formed on the image sensor by a microlens of the microlens array; and
summing pixel values for each of a plurality of pixels in the microimage, the plurality of pixels located along the image line mapped to the microimage.
21. The method of claim 20, further comprising generating a Radon image based at least partly on the summed pixel values.
22. The method of claim 20, wherein each of a plurality of microlenses of the microlens array forms a distinct microimage on the image sensor.
23. The method of claim 20, further comprising parameterizing the three-dimensional image space using a polar coordinate system.
24. The method of claim 20, further comprising identifying a plurality of planes of integration intersecting the microlens plane.
25. The method of claim 24, wherein the plurality of planes of integration are positioned such that as a group the plurality of planes of integration sample each of a plurality of sensor pixels of the image sensor.
26. The method of claim 25, wherein a step size between adjacent lines formed by intersection of two adjacent planes of the plurality of planes of integration is selected to minimize redundant sampling of microimages on the image sensor.
27. A non-transitory, computer-readable medium storing instructions that, when executed, cause one or more computing devices to perform operations comprising:
receiving Radon image data of an image scene, the Radon image data representing a luminous density of the image scene as summed values of light energy in each of a plurality of planes of the image scene;
computing an intermediate function of the Radon image using a back-projection of the Radon image;
recovering the luminous density of the image scene based at least partly on the intermediate function of the Radon image; and
projecting the luminous density onto an image plane to produce a dynamically refocusable rendered image of the image scene.
28. The non-transitory, computer-readable medium of claim 27, the operations further comprising applying a Laplacian operator to the back-projection of the Radon image to compute the intermediate function.
29. The non-transitory, computer-readable medium of claim 28, the operations further comprising denoising the back-projection of the Radon image to compute the intermediate function.
30. The non-transitory, computer-readable medium of claim 27, the operations further comprising adjusting a focus depth of the dynamically refocusable rendered image in response to user input.