EP2845167A1 - Camera modules patterned with pi filter groups

Camera modules patterned with pi filter groups

Info

Publication number
EP2845167A1
Authority
EP
European Patent Office
Prior art keywords
camera
array
cameras
color
camera module
Prior art date
Legal status
Withdrawn
Application number
EP13785220.8A
Other languages
German (de)
English (en)
Other versions
EP2845167A4 (fr)
Inventor
Semyon Nisenzon
Kartik Venkataraman
Current Assignee
Pelican Imaging Corp
Original Assignee
Pelican Imaging Corp
Priority date
Filing date
Publication date
Application filed by Pelican Imaging Corp
Publication of EP2845167A1
Publication of EP2845167A4


Classifications

    • G06T 3/4015: Image demosaicing, e.g. colour filter arrays [CFA] or Bayer patterns
    • H04N 13/239: Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H04N 13/271: Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • H04N 23/12: Cameras or camera modules comprising electronic image sensors, for generating image signals from different wavelengths with one sensor only
    • H04N 23/13: Cameras or camera modules comprising electronic image sensors, for generating image signals from different wavelengths with multiple sensors
    • H04N 23/45: Cameras or camera modules for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N 25/134: Arrangement of colour filter arrays [CFA] characterised by the spectral characteristics of the filter elements, based on three different wavelength filter elements
    • H04N 2209/048: Picture signal generators using solid-state devices having several pick-up sensors

Definitions

  • the present invention relates generally to digital cameras and more specifically to filter patterns utilized in camera modules of array cameras.
  • Conventional digital cameras typically include a single focal plane with a lens stack.
  • the focal plane includes an array of light sensitive pixels and is part of a sensor.
  • the lens stack creates an optical channel that forms an image of a scene upon the array of light sensitive pixels in the focal plane.
  • Each light sensitive pixel can generate image data based upon the light incident upon the pixel.
  • an array of color filters is typically applied to the pixels in the focal plane of the camera's sensor.
  • Typical color filters can include red, green and blue color filters.
  • a demosaicing algorithm can be used to interpolate a set of complete red, green and blue values for each pixel of image data captured by the focal plane given a specific color filter pattern.
  • One example of a camera color filter pattern is the Bayer filter pattern.
  • the Bayer filter pattern describes a specific pattern of red, green and blue color filters that results in 50% of the pixels in a focal plane capturing green light, 25% capturing red light and 25% capturing blue light.
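For illustration only (this is not patent text), the sketch below constructs the RGGB Bayer mosaic described above and applies a naive neighbor-averaging demosaic; the function names and the averaging kernel are assumptions made for the example.

    # Illustrative sketch (not from the patent): a Bayer color filter array
    # assigns one of R, G, B to each pixel, and demosaicing interpolates the
    # two missing color values at every pixel location.
    import numpy as np

    def bayer_masks(h, w):
        """Boolean masks (r, g, b) for an RGGB Bayer pattern: 50% green,
        25% red, 25% blue, matching the proportions described above."""
        r = np.zeros((h, w), bool); r[0::2, 0::2] = True  # red on even rows/cols
        b = np.zeros((h, w), bool); b[1::2, 1::2] = True  # blue on odd rows/cols
        g = ~(r | b)                                      # green elsewhere
        return r, g, b

    def naive_demosaic(raw, mask):
        """Estimate a full channel by averaging the sampled values in each
        pixel's immediate (4-connected) neighborhood, including itself."""
        neighborhood_sum = lambda a: (a + np.roll(a, 1, 0) + np.roll(a, -1, 0)
                                        + np.roll(a, 1, 1) + np.roll(a, -1, 1))
        sampled = np.where(mask, raw, 0.0)
        counts = neighborhood_sum(mask.astype(float))
        return neighborhood_sum(sampled) / np.maximum(counts, 1e-9)

    raw = np.random.rand(8, 8)  # stand-in for a captured single-channel mosaic
    r, g, b = bayer_masks(*raw.shape)
    assert g.mean() == 0.5 and r.mean() == 0.25 and b.mean() == 0.25
    rgb = np.dstack([naive_demosaic(raw, m) for m in (r, g, b)])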
  • binocular disparity, or parallax, provides information that can be used to calculate depth in the visual scene, providing a major means of depth perception.
  • the impression of depth associated with stereoscopic depth perception can also be obtained under other conditions, such as when an observer views a scene with only one eye while moving.
  • the observed parallax can be utilized to obtain depth information for objects in the scene. Similar principles in machine vision can be used to gather depth information.
  • two cameras separated by a distance can take pictures of the same scene and the captured images can be compared by shifting the pixels of two or more images to find parts of the images that match.
  • the amount an object shifts between different camera views is called the disparity, which is inversely proportional to the distance to the object.
  • a disparity search that detects the shift of an object in multiple images can be used to calculate the distance to the object based upon the baseline distance between the cameras and the focal length of the cameras involved.
  • the approach of using two or more cameras to generate stereoscopic three-dimensional images is commonly referred to as multi-view stereo.
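To make the relationship in the preceding bullets concrete, the sketch below applies the standard pinhole-stereo relation Z = f * B / d, i.e. depth inversely proportional to disparity; the function name and example values are illustrative assumptions, not patent text.

    # Sketch of the relation described above for two parallel pinhole cameras:
    # depth Z = f * B / d, where f is the focal length in pixels, B is the
    # baseline between the cameras, and d is the observed disparity in pixels.

    def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
        """Depth is inversely proportional to disparity."""
        if disparity_px <= 0:
            return float("inf")  # zero disparity corresponds to a point at infinity
        return focal_px * baseline_m / disparity_px

    # Example: a 10 px shift with f = 1000 px and a 2 cm baseline puts the
    # object roughly 2 m from the cameras.
    print(depth_from_disparity(10.0, 1000.0, 0.02))  # -> 2.0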
  • a pixel that captures image data concerning a portion of a scene, which is not visible in images captured of the scene from other viewpoints, can be referred to as an occluded pixel.
  • FIGS. 1A and 1B illustrate the principles of parallax and occlusion.
  • FIG. 1A depicts the image 100 captured by a first camera having a first field of view
  • FIG. 1B depicts the image 102 captured by a second adjacent camera having a second field of view.
  • a foreground object 104 appears slightly to the right of the background object 106.
  • the foreground object 104 appears shifted to the left hand side of the background object 106.
  • the disparity introduced by the different fields of view of the two cameras is equal to the difference between the location of the foreground object 104 in the image captured by the first camera (indicated in the image captured by the second camera by ghost lines 108) and its location in the image captured by the second camera.
  • the distance from the two cameras to the foreground object can be obtained by determining the disparity of the foreground object in the two captured images, and this is described in U.S. Patent Application Serial No. 61/780,906, entitled “Systems and Methods for Parallax Detection and Correction in Images Captured Using Array Cameras.”
  • the disclosure of U.S. Patent Application Serial No. 61/780,906 is incorporated by reference herein in its entirety.
  • the pixels contained within the ghost lines 108 in the image 102 can be considered to be occluded pixels (i.e. the pixels capture image data from a portion of the scene that is visible in the image 102 captured by the second camera and is not visible in the image 100 captured by the first camera).
  • the pixels of the foreground object 104 can be referred to as occluding pixels as they capture portions of the scene that occlude the pixels contained within the ghost lines 108 in the image 102.
  • an array camera module includes: an M x N imager array including a plurality of focal planes, each focal plane including an array of light sensitive pixels; an M x N optic array of lens stacks, where each lens stack corresponds to a focal plane, and where each lens stack forms an image of a scene on its corresponding focal plane; where each pairing of a lens stack and its corresponding focal plane thereby defines a camera; where at least one row in the M x N array of cameras includes at least one red color camera, at least one green color camera, and at least one blue color camera; and where at least one column in the M x N array of cameras includes at least one red color camera, at least one green color camera, and at least one blue color camera.
  • M and N are each greater than two and at least one of M and N is even; color filters are implemented within the cameras in the array camera module such that the array camera module is patterned with at least one π filter group including: a 3 x 3 array of cameras including: a reference camera at the center of the 3 x 3 array of cameras; two red color cameras located on opposite sides of the 3 x 3 array of cameras; two blue color cameras located on opposite sides of the 3 x 3 array of cameras; and four green color cameras surrounding the reference camera.
  • each of the four green color cameras surrounding the reference camera is disposed at a corner location of the 3 x 3 array of cameras.
  • the reference camera is a green color camera.
  • the reference camera is one of: a camera that incorporates a Bayer filter, a camera that is configured to capture infrared light, and a camera that is configured to capture ultraviolet light.
  • each of the two red color cameras is located at a corner location of the 3 x 3 array of cameras, and each of the two blue color cameras is located at a corner location of the 3 x 3 array of cameras.
  • At least one color filter is implemented on the imager array.
  • At least one color filter is implemented on a lens stack.
  • a 3 x 3 array camera module includes: a 3 x 3 imager array including a 3 x 3 arrangement of focal planes, each focal plane including an array of light sensitive pixels; a 3 x 3 optic array of lens stacks, where each lens stack corresponds to a focal plane, and where each lens stack forms an image of a scene on its corresponding focal plane; where each pairing of a lens stack and its corresponding focal plane thereby defines a camera; where the 3 x 3 array of cameras includes: a reference camera at the center of the 3 x 3 array of cameras; two red color cameras located on opposite sides of the 3 x 3 array of cameras; two blue color cameras located on opposite sides of the 3 x 3 array of cameras; and four green color cameras, each located at a corner location of the 3 x 3 array of cameras; where each of the color cameras is achieved using a color filter.
  • At least one color filter is implemented on the imager array to achieve a color camera.
  • At least one color filter is implemented within a lens stack to achieve a color camera.
  • the reference camera is a green color camera.
  • the reference camera is one of: a camera that incorporates a Bayer filter, a camera that is configured to capture infrared light, and a camera that is configured to capture ultraviolet light.
  • a method of patterning an array camera module with at least one π filter group includes: evaluating whether an imager array of M x N focal planes, where each focal plane comprises an array of light sensitive pixels, includes any defective focal planes; assembling an M x N array camera module using: the imager array of M x N focal planes; an M x N optic array of lens stacks, where each lens stack corresponds with a focal plane, where the M x N array camera module is assembled so that: each lens stack and its corresponding focal plane define a camera; color filters are implemented within the array camera module such that the array camera module is patterned with at least one π filter group including: a 3 x 3 array of cameras including: a reference camera at the center of the 3 x 3 array of cameras; two red color cameras located on opposite sides of the 3 x 3 array of cameras; two blue color cameras located on opposite sides of the 3 x 3 array of cameras; and four green color cameras surrounding the reference camera; and where the array camera module is patterned with the at least one π filter group including:
  • At least one color filter is implemented on the imager array.
  • At least one color filter is implemented within a lens stack.
  • the reference camera is a green color camera.
  • an array camera module includes: an imager array comprising M x N focal planes, where each focal plane comprises a plurality of rows of pixels that also form a plurality of columns of pixels and each active focal plane is contained within a region of the imager array that does not contain pixels from another focal plane; an optic array of M x N lens stacks, where an image is formed on each focal plane by a separate lens stack in the optic array of lens stacks; wherein the imager array and the optic array of lens stacks form an M x N array of cameras that are configured to independently capture an image of a scene; where at least one row in the M x N array of cameras comprises at least one red color camera, at least one green color camera, and at least one blue color camera; and where at least one column in the M x N array of cameras comprises at least one red color camera, at least one green color camera, and at least one blue color camera.
  • the red color camera is a camera that captures image data including electromagnetic waves having a wavelength within the range of 620 nm to 750 nm;
  • the green color camera is a camera that captures image data including electromagnetic waves having a wavelength within the range of 495 nm to 570 nm;
  • the blue color camera is a camera that captures image data including electromagnetic waves having a wavelength within the range of 450 nm to 495 nm.
  • the optics of each camera within the array camera module are configured so that each camera has a field of view of a scene that is shifted with respect to the fields of view of the other cameras so that each shift of the field of view of each camera with respect to the fields of view of the other cameras is configured to include a unique sub-pixel shifted view of the scene.
  • M and N are each greater than two and at least one of M and N is even; color filters are implemented within the cameras in the array camera module such that the array camera module is patterned with at least one π filter group including: a 3 x 3 array of cameras including: a reference camera at the center of the 3 x 3 array of cameras; two red color cameras located on opposite sides of the 3 x 3 array of cameras; two blue color cameras located on opposite sides of the 3 x 3 array of cameras; and four green color cameras surrounding the reference camera.
  • each of the four green color cameras surrounding the reference camera is disposed at a corner location of the 3 x 3 array of cameras.
  • M is four; N is four; the first row of cameras of the 4 x 4 array camera module includes, in the following order, a green color camera, a blue color camera, a green color camera, and a red color camera; the second row of cameras of the 4 x 4 array camera module includes, in the following order, a red color camera, a green color camera, a red color camera, and a green color camera; the third row of cameras of the 4 x 4 array camera module includes, in the following order, a green color camera, a blue color camera, a green color camera, and a blue color camera; and the fourth row of cameras of the 4 x 4 array camera module includes, in the following order, a blue color camera, a green color camera, a red color camera, and a green color camera.
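The 4 x 4 layout recited in the bullet above can be written out directly; the sketch below (an illustration, not patent text) checks the row/column property from the earlier claim language, namely that at least one row and at least one column contain red, green, and blue color cameras.

    # The 4 x 4 array camera module layout recited above, row by row
    # ("R", "G", "B" denote red, green, and blue color cameras).
    LAYOUT_4X4 = [
        ["G", "B", "G", "R"],
        ["R", "G", "R", "G"],
        ["G", "B", "G", "B"],
        ["B", "G", "R", "G"],
    ]

    def rgb_complete(line):
        """True if a row or column contains at least one R, one G, and one B."""
        return {"R", "G", "B"} <= set(line)

    assert any(rgb_complete(row) for row in LAYOUT_4X4)        # e.g. the first row
    assert any(rgb_complete(col) for col in zip(*LAYOUT_4X4))  # e.g. the first column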
  • the reference camera within the at least one π filter group is a green color camera.
  • the reference camera within the at least one π filter group is a camera that incorporates a Bayer filter.
  • the reference camera is one of: a camera that incorporates a Bayer filter, a camera that is configured to capture infrared light, and a camera that is configured to capture ultraviolet light.
  • each of the two red color cameras is located at a corner location of the 3 x 3 array of cameras, and wherein each of the two blue color cameras is located at a corner location of the 3 x 3 array of cameras.
  • At least one color filter is implemented on the imager array.
  • a 3 x 3 array camera module includes: a 3 x 3 imager array including a 3 x 3 arrangement of focal planes, where each focal plane comprises a plurality of rows of pixels that also form a plurality of columns of pixels and each active focal plane is contained within a region of the imager array that does not contain pixels from another focal plane; a 3 x 3 optic array of lens stacks, where an image is formed on each focal plane by a separate lens stack in the optic array of lens stacks; where the imager array and the optic array of lens stacks form a 3 x 3 array of cameras that are configured to independently capture an image of a scene; where the 3 x 3 array of cameras includes: a reference camera at the center of the 3 x 3 array of cameras; two red color cameras located on opposite sides of the 3 x 3 array of cameras; two blue color cameras located on opposite sides of the 3 x 3 array of cameras; and four green color cameras.
  • At least one color filter is implemented on the imager array to achieve a color camera.
  • At least one color filter is implemented within a lens stack to achieve a color camera.
  • the reference camera is a green color camera.
  • the reference camera is one of: a camera that incorporates a Bayer filter, a camera that is configured to capture infrared light, and a camera that is configured to capture ultraviolet light.
  • an array camera module includes: an imager array comprising M x N focal planes, where each focal plane comprises a plurality of rows of pixels that also form a plurality of columns of pixels and each active focal plane is contained within a region of the imager array that does not contain pixels from another focal plane; an optic array of M x N lens stacks, where an image is formed on each focal plane by a separate lens stack in the optic array of lens stacks; wherein the imager array and the optic array of lens stacks form an M x N array of cameras that are configured to independently capture an image of a scene; and wherein at least either one row or one column in the M x N array of cameras comprises at least one red color camera, at least one green color camera, and at least one blue color camera.
  • an array camera includes: an array camera module, including: an imager array comprising M x N focal planes, where each focal plane comprises a plurality of rows of pixels that also form a plurality of columns of pixels and each active focal plane is contained within a region of the imager array that does not contain pixels from another focal plane; an optic array of M x N lens stacks, where an image is formed on each focal plane by a separate lens stack in the optic array of lens stacks; where the imager array and the optic array of lens stacks form an M x N array of cameras that are configured to independently capture an image of a scene; where at least one row in the M x N array of cameras comprises at least one red color camera, at least one green color camera, and at least one blue color camera; and where at least one column in the M x N array of cameras comprises at least one red color camera, at least one green color camera, and at least one blue color camera; and a processor that includes an image processing pipeline, the image processing pipeline including: a parallax detection module
  • FIGS. 1A and 1B illustrate the principles of parallax and occlusion as they pertain to image capture, and which can be addressed in accordance with embodiments of the invention.
  • FIG. 2 illustrates an array camera with a camera module and processor in accordance with an embodiment of the invention.
  • FIG. 3 illustrates a camera module with an optic array and imager array in accordance with an embodiment of the invention.
  • FIG. 4 illustrates an image processing pipeline in accordance with an embodiment of the invention.
  • FIG. 5A conceptually illustrates a 3 x 3 camera module patterned with a π filter group where red cameras are arranged horizontally and blue cameras are arranged vertically in accordance with an embodiment of the invention.
  • FIG. 5B conceptually illustrates a 3 x 3 camera module patterned with a π filter group where red cameras are arranged vertically and blue cameras are arranged horizontally in accordance with an embodiment of the invention.
  • FIG. 5C conceptually illustrates a 3 x 3 camera module patterned with a π filter group where red cameras and blue cameras are arranged at the corner locations of the 3 x 3 camera module in accordance with an embodiment of the invention.
  • FIGS. 5D and 5E conceptually illustrate a number of 3 x 3 camera modules patterned with a π filter group.
  • FIG. 6 conceptually illustrates a 4 x 4 camera module patterned with two π filter groups in accordance with an embodiment of the invention.
  • FIG. 7 conceptually illustrates a 4 x 4 camera module patterned with two π filter groups with two cameras that could each act as a reference camera in accordance with an embodiment of the invention.
  • FIG. 8A illustrates a process for testing an imager array for defective focal planes to create a camera module that reduces the effect of any defective focal plane in accordance with an embodiment of the invention.
  • FIG. 8B conceptually illustrates a 4 x 4 camera module patterned with two π filter groups where a faulty focal plane causes a loss of red coverage around possible reference cameras.
  • FIG. 8C conceptually illustrates the 4 x 4 camera module patterned with a different arrangement of π filter groups relative to FIG. 8B, where the faulty focal plane does not result in a loss of red coverage around possible reference cameras in accordance with an embodiment of the invention.
  • FIG. 9A conceptually illustrates use of a subset of cameras to produce a left virtual viewpoint for an array camera operating in 3D mode on a 4 x 4 camera module patterned with π filter groups in accordance with an embodiment of the invention.
  • FIG. 9B conceptually illustrates use of a subset of cameras to produce a right virtual viewpoint for an array camera operating in 3D mode on a 4 x 4 camera module patterned with π filter groups in accordance with an embodiment of the invention.
  • FIGS. 9C and 9D conceptually illustrate array camera modules that employ π filter groups to capture stereoscopic images with viewpoints that correspond to the viewpoints of reference cameras within the camera array.
  • FIG. 10 conceptually illustrates a 4 x 4 camera module patterned with π filter groups where nine cameras are utilized to capture image data used to synthesize frames of video in accordance with an embodiment of the invention.
  • FIG. 11 is a flow chart illustrating a process for generating color filter patterns including π filter groups in accordance with embodiments of the invention.
  • FIGS. 12A - 12D illustrate a process for generating a color filter pattern including π filter groups for a 5 x 5 array of cameras in accordance with embodiments of the invention.
  • FIGS. 13A - 13D illustrate a process for generating a color filter pattern including π filter groups for a 4 x 5 array of cameras in accordance with embodiments of the invention.
  • FIG. 14 illustrates a 7 x 7 array of cameras patterned using π filter groups in accordance with embodiments of the invention.
  • camera modules of an array camera are patterned with one or more π filter groups.
  • the term patterned here refers to the use of specific color filters in individual cameras within the camera module so that the cameras form a pattern of color channels within the array camera.
  • the term color channel or color camera can be used to refer to a camera that captures image data within a specific portion of the spectrum and is not necessarily limited to image data with respect to a specific color.
  • a 'red color camera' is a camera that captures image data that corresponds with electromagnetic waves (i.e., within the electromagnetic spectrum) that humans conventionally perceive as red, and similarly for 'blue color cameras', 'green color cameras', etc.
  • a red color camera may capture image data corresponding with electromagnetic waves having wavelengths between approximately 620 nm and 750 nm
  • a green color camera may capture image data corresponding with electromagnetic waves having wavelengths between approximately 495 nm and approximately 570 nm
  • a blue color camera may capture image data corresponding with electromagnetic waves having wavelengths between approximately 450 nm and 495 nm.
  • the portions of the visible light spectrum that are captured by blue color cameras, green color cameras and red color cameras can depend upon the requirements of a specific application.
  • the term Bayer camera can be used to refer to a camera that captures image data using the Bayer filter pattern on the image plane.
  • a color channel can include a camera that captures infrared light, ultraviolet light, extended color and any other portion of the visible spectrum appropriate to a specific application.
  • a π filter group refers to a 3 x 3 group of cameras including a central camera and color cameras distributed around the central camera to reduce occlusion zones in each color channel.
  • the central camera of a π filter group can be used as a reference camera when synthesizing an image using image data captured by an imager array.
  • a camera is a reference camera when its viewpoint is used as the viewpoint of the synthesized image.
  • the central camera of a π filter group is surrounded by color cameras in a way that minimizes occlusion zones for each color camera when the central camera is used as a reference camera.
  • Occlusion zones are areas surrounding foreground objects not visible to cameras that are spatially offset from the reference camera due to the effects of parallax.
  • a similar decrease in the likelihood that a portion of the scene visible from the reference viewpoint will be occluded in every other image captured within a specific color channel can be achieved using two cameras in the same color channel that are located on opposite sides of a reference camera, or three cameras in each color channel that are distributed in three sectors around the reference camera. In other embodiments, cameras can be distributed in more than four sectors around the reference camera.
  • the central camera of a π filter group is a green camera, while in other embodiments the central camera captures image data from any appropriate portion of the spectrum.
  • the central camera is a Bayer camera (i.e. a camera that utilizes a Bayer filter pattern to capture a color image).
  • a π filter group is a 3 x 3 array of cameras with a green color camera at each corner and a green color camera at the center, which can serve as the reference camera, with a symmetrical distribution of red and blue cameras around the central green camera.
  • the symmetrical distribution can include arrangements where either red color cameras are directly above and below the center green reference camera with blue color cameras directly to the left and right, or blue color cameras directly above and below the green center reference camera with red color cameras directly to the left and right.
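For illustration (not patent text), the two symmetrical arrangements just described can be written down and checked structurally; the helper name below is an assumption.

    # The two canonical 3 x 3 pi filter groups described above: green cameras
    # at the corners and center, with the red and blue pairs swapped between
    # the horizontal and vertical positions around the central reference camera.
    PI_GROUP_RED_HORIZONTAL = [  # cf. FIG. 5A: red left/right, blue above/below
        ["G", "B", "G"],
        ["R", "G", "R"],
        ["G", "B", "G"],
    ]
    PI_GROUP_RED_VERTICAL = [    # cf. FIG. 5B: red above/below, blue left/right
        ["G", "R", "G"],
        ["B", "G", "B"],
        ["G", "R", "G"],
    ]

    def is_pi_group(grid):
        """Green corners and center, and red/blue each occupying a pair of
        opposite side positions around the central reference camera."""
        corners = [grid[0][0], grid[0][2], grid[2][0], grid[2][2]]
        top, bottom = grid[0][1], grid[2][1]
        left, right = grid[1][0], grid[1][2]
        return (grid[1][1] == "G" and all(c == "G" for c in corners)
                and top == bottom and left == right and top != left)

    assert is_pi_group(PI_GROUP_RED_HORIZONTAL)
    assert is_pi_group(PI_GROUP_RED_VERTICAL)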
  • Camera modules of dimensions greater than a 3 x 3 array of cameras can be patterned with π filter groups in accordance with many embodiments of the invention.
  • patterning a camera module with π filter groups enables an efficient distribution of cameras around a reference camera that reduces occlusion zones.
  • patterns of π filter groups can overlap with each other such that two overlapping π filter groups on a camera module share common cameras.
  • cameras that are not part of a π filter group can be assigned a color to reduce occlusion zones in the resulting camera array by distributing cameras in each color channel within each of a predetermined number of sectors surrounding a reference camera and/or multiple cameras that can act as reference cameras within the camera array.
  • camera modules can be patterned with π filter groups such that either at least one row in the camera module or at least one column in the camera module includes at least one red color camera, at least one green color camera, and at least one blue color camera.
  • at least one row and at least one column of the array camera module include at least one red color camera, at least one green color camera, and at least one blue color camera.
  • At least one row and at least one column of the array camera module include at least one cyan color camera, at least one magenta color camera, and at least one yellow color camera (e.g. color cameras that correspond with the CMYK color model).
  • at least one row and at least one column of the array camera module include at least one red color camera, at least one yellow color camera, and at least one blue color camera (e.g. color cameras that correspond with the RYB color model).
  • camera modules of an M x N dimension may also be patterned with π filter groups in accordance with many embodiments of the invention.
  • These camera modules are distinct from M x N camera modules where both M and N are odd numbers, insofar as when at least one of M and N is even, none of the constituent cameras aligns with the center of the camera array.
  • When M and N are both odd, there is a central camera that corresponds with the center of the camera array. Cameras that align with the center of the camera array are typically selected as the reference camera of the camera module.
  • any suitable camera may be utilized as the reference camera of the camera module.
  • color cameras surrounding the reference camera need not be uniformly distributed but need only be distributed in a way to minimize or reduce occlusion zones of each color from the perspective of the reference camera. Utilization of a reference camera in a π filter group to synthesize an image from captured image data can be significantly less computationally intensive than synthesizing an image using the same image data from a virtual viewpoint.
  • High quality images or video can be captured by an array camera including a camera module patterned with π filter groups utilizing a subset of cameras within the camera module (i.e. not requiring that all cameras on a camera module be utilized). Similar techniques can also be used for efficient generation of stereoscopic 3D images utilizing image data captured by subsets of the cameras within the camera module.
  • Patterning camera modules with π filter groups also enables robust fault tolerance in camera modules with multiple π filter groups, as multiple possible reference cameras can be utilized if a reference camera begins to perform sub-optimally. Patterning camera modules with π filter groups also allows for yield improvement in manufacturing camera modules, as the impact of a defective focal plane on a focal plane array can be minimized by simply changing the pattern of the color lens stacks in an optic array. Various π filter groups and the patterning of camera modules with π filter groups in accordance with embodiments of the invention are discussed further below.
  • an array camera includes a camera module and a processor.
  • An array camera with a camera module patterned with π filter groups in accordance with an embodiment of the invention is illustrated in FIG. 2.
  • the array camera 200 includes a camera module 202 as an array of individual cameras 204 where each camera 204 includes a focal plane with a corresponding lens stack.
  • An array of individual cameras refers to a plurality of cameras in a particular arrangement, such as (but not limited to) the square arrangement utilized in the illustrated embodiment.
  • the camera module 202 is connected 206 to a processor 208.
  • a camera 204 labeled "R" refers to a red camera with a red filtered color channel, "G" refers to a green camera with a green filtered color channel, and "B" refers to a blue camera with a blue filtered color channel.
  • Array camera modules in accordance with embodiments of the invention can be constructed from an imager array or sensor including an array of focal planes and an optic array including a lens stack for each focal plane in the imager array. Sensors including multiple focal planes are discussed in U.S. Patent Application Serial No. 13/106,797 entitled “Architectures for System on Chip Array Cameras", to Pain et al., the disclosure of which is incorporated herein by reference in its entirety.
  • Light filters can be used within each optical channel formed by the lens stacks in the optic array to enable different cameras within an array camera module to capture image data with respect to different portions of the electromagnetic spectrum.
  • the camera module 300 includes an imager array 330 including an array of focal planes 340 along with a corresponding optic array 310 including an array of lens stacks 320.
  • each lens stack 320 creates an optical channel that forms an image of a scene on an array of light sensitive pixels within a corresponding focal plane 340.
  • Each pairing of a lens stack 320 and focal plane 340 forms a single camera 204 within the camera module, and thereby an image is formed on each focal plane by a separate lens stack in the optic array of lens stacks.
  • Each pixel within a focal plane 340 of a camera 204 generates image data that can be sent from the camera 204 to the processor 208.
  • the lens stack within each optical channel is configured so that pixels of each focal plane 340 sample the same object space or region within the scene.
  • the lens stacks are configured so that the pixels that sample the same object space do so with sub-pixel offsets to provide sampling diversity that can be utilized to recover increased resolution through the use of super-resolution processes.
  • the optics of each camera module can be configured so that each camera within the camera module has a field of view of a scene that is shifted with respect to the fields of view of the other cameras within the camera module so that each shift of the field of view of each camera with respect to the fields of view of the other cameras is configured to include a unique sub-pixel shifted view of the scene.
  • the focal planes are configured in a 5 x 5 array.
  • Each focal plane 340 on the sensor is capable of capturing an image of the scene.
  • each focal plane includes a plurality of rows of pixels that also forms a plurality of columns of pixels, and each focal plane is contained within a region of the imager that does not contain pixels from another focal plane.
  • image data capture and readout of each focal plane can be independently controlled.
  • the imager array and the optic array of lens stacks form an array of cameras that can be configured to independently capture an image of a scene.
  • image capture settings including (but not limited to) the exposure times and analog gains of pixels within a focal plane can be determined independently to enable image capture settings to be tailored based upon factors including (but not limited to) a specific color channel and/or a specific portion of the scene dynamic range.
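A small sketch (assumed names, not patent text) of the per-focal-plane control described above, where exposure and gain are set independently for each camera, for example per color channel:

    # Each focal plane can carry its own capture settings, e.g. so that one
    # color channel can use a longer exposure or higher gain than another.
    from dataclasses import dataclass

    @dataclass
    class CaptureSettings:
        exposure_ms: float  # exposure time for the pixels of one focal plane
        analog_gain: float  # analog gain for the pixels of one focal plane

    # One independent setting per camera in a 4 x 4 module, keyed by (row, col).
    settings = {(r, c): CaptureSettings(exposure_ms=8.0, analog_gain=1.0)
                for r in range(4) for c in range(4)}
    settings[(0, 1)] = CaptureSettings(exposure_ms=16.0, analog_gain=2.0)  # e.g. a blue camera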
  • the sensor elements utilized in the focal planes can be individual light sensing elements such as, but not limited to, traditional CIS (CMOS Image Sensor) pixels, CCD (charge-coupled device) pixels, high dynamic range sensor elements, multispectral sensor elements and/or any other structure configured to generate an electrical signal indicative of light incident on the structure.
  • the sensor elements of each focal plane have similar physical properties and receive light via the same optical channel and color filter (where present).
  • the sensor elements have different characteristics and, in many instances, the characteristics of the sensor elements are related to the color filter applied to each sensor element.
  • color filters in individual cameras can be used to pattern the camera module with π filter groups. These cameras can be used to capture data with respect to different colors, or a specific portion of the spectrum.
  • color filters in many embodiments of the invention are included in the lens stack.
  • a green color camera can include a lens stack with a green light filter that allows green light to pass through the optical channel.
  • the pixels in each focal plane are the same and the light information captured by the pixels is differentiated by the color filters in the corresponding lens stack for each focal plane.
  • camera modules including π filter groups can be implemented in a variety of ways including (but not limited to) by applying color filters to the pixels of the focal planes of the camera module similar to the manner in which color filters are applied to the pixels of a conventional color camera.
  • at least one of the cameras in the camera module can include uniform color filters applied to the pixels in its focal plane.
  • a Bayer filter pattern is applied to the pixels of one of the cameras in a camera module.
  • camera modules are constructed in which color filters are utilized in both the lens stacks and on the pixels of the imager array.
  • an array camera generates image data from the multiple focal planes and uses a processor to synthesize one or more images of a scene.
  • the image data captured by a single focal plane in the sensor array can constitute a low resolution image, or an "LR image" (the term low resolution here is used only to contrast with higher resolution or super-resolved images, alternatively an "HR image" or "SR image"), which the processor can use in combination with other low resolution image data captured by the camera module to construct a higher resolution image through Super Resolution processing.
  • Super Resolution processes that can be used to synthesize high resolution images using low resolution images captured by an array camera are discussed in U.S. Patent Application No. 12/967,807 entitled "Systems and Methods for Synthesizing High Resolution Images Using Super-Resolution Processes".
  • any of a variety of regular or irregular layouts of imagers including imagers that sense visible light, portions of the visible light spectrum, near-IR light, other portions of the spectrum and/or combinations of different portions of the spectrum can be utilized to capture LR images that provide one or more channels of information for use in SR processes in accordance with embodiments of the invention.
  • the processing of captured LR images is discussed further below.
  • the processing of LR images to obtain an SR image in accordance with embodiments of the invention typically occurs in an array camera's image processing pipeline.
  • the image processing pipeline performs processes that register the LR images prior to performing SR processes on the LR images.
  • the image processing pipeline also performs processes that eliminate problem pixels and compensate for parallax.
  • FIG. 4 An image processing pipeline incorporating a SR module for fusing information from LR images to obtain a synthesized HR image in accordance with an embodiment of the invention is illustrated in FIG. 4.
  • pixel information is read out from focal planes 340 and is provided to a photometric conversion module 402 for photometric normalization.
  • the photometric conversion module can perform any of a variety of photometric image processing processes including but not limited to one or more of photometric normalization, Black Level calculation and adjustments, vignetting correction, and lateral color correction.
  • the photometric conversion module also performs temperature normalization.
  • the inputs of the photometric normalization module are photometric calibration data 401 and the captured LR images.
  • the photometric calibration data is typically captured during an offline calibration process.
  • the output of the photometric conversion module 402 is a set of photometrically normalized LR images. These photometrically normalized images are provided to a parallax detection module 404 and to a super-resolution module 406.
  • Prior to performing SR processing, the image processing pipeline detects parallax, which becomes more apparent as objects in the scene captured by the imager array approach the imager array.
  • parallax (or disparity) detection is performed using the parallax detection module 404.
  • the parallax detection module 404 generates an occlusion map for the occlusion zones around foreground objects.
  • the occlusion maps are binary maps created for pairs of LR imagers.
  • occlusion maps are generated to illustrate whether a point in the scene is visible in the field of view of a reference LR imager and whether points in the scene visible within the field of view of the reference imager are visible in the field of view of other imagers.
  • the use of π filter groups can increase the likelihood that a pixel visible in a reference LR image is visible (i.e. not occluded) in at least one other LR image.
  • the parallax detection module 404 performs scene independent geometric corrections to the photometrically normalized LR images using geometric calibration data 408 obtained via an address conversion module 410.
  • parallax detection module 404 can then compare the geometrically and photometrically corrected LR images to detect the presence of scene dependent geometric displacements between LR images.
  • Information concerning these scene dependent geometric displacements can be referred to as parallax information and can be provided to the super-resolution module 406 in the form of scene dependent parallax corrections and occlusion maps.
  • parallax information can also include generated depth maps which can also be provided to the super-resolution module 406.
  • Geometric calibration (or scene-independent geometric correction) data 408 can be generated using an off line calibration process or a subsequent recalibration process.
  • the scene- independent correction information along with the scene-dependent geometric correction information (parallax) and occlusion maps, form the geometric correction information for the LR images.
  • the parallax information and the photometrically normalized LR images are provided to the super-resolution module 406 for use in the synthesis of one or more HR images 420.
  • the super-resolution module 406 performs scene independent and scene dependent geometric corrections (i.e. geometric corrections) using the parallax information and geometric calibration data 408 obtained via the address conversion module 410.
  • the photometrically normalized and geometrically registered LR images are then utilized in the synthesis of an HR image.
  • the synthesized HR image may then be fed to a downstream color processing module 412, which can be implemented using any standard color processing module configured to perform color correction and/or chroma level adjustment.
  • the color processing module performs operations including but not limited to one or more of white balance, color correction, gamma correction, and RGB to YUV correction.
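The pipeline stages described above (photometric normalization, parallax detection, super-resolution, color processing) can be summarized as a skeleton. This is a structural sketch with placeholder bodies and assumed function names, since the text describes behavior rather than an implementation.

    # Skeleton of the image processing pipeline of FIG. 4, as described above.
    # Stage names follow the text; bodies are placeholders.

    def photometric_conversion(lr_images, photometric_calibration):
        """Normalize captured LR images (black level, vignetting correction,
        lateral color correction, photometric normalization)."""
        return lr_images  # placeholder: photometrically normalized LR images

    def parallax_detection(normalized_images, geometric_calibration):
        """Detect scene-dependent geometric displacements between LR images;
        returns parallax corrections, occlusion maps and (in several
        embodiments) depth maps for the super-resolution module."""
        return {"corrections": None, "occlusion_maps": None, "depth_map": None}

    def super_resolution(normalized_images, parallax_info, geometric_calibration):
        """Apply scene-independent and scene-dependent geometric corrections,
        then fuse the registered LR images into a synthesized HR image."""
        return normalized_images[0]  # placeholder: synthesized HR image

    def color_processing(hr_image):
        """White balance, color correction, gamma correction, RGB to YUV."""
        return hr_image

    def image_processing_pipeline(lr_images, photometric_cal, geometric_cal):
        normalized = photometric_conversion(lr_images, photometric_cal)
        parallax_info = parallax_detection(normalized, geometric_cal)
        hr_image = super_resolution(normalized, parallax_info, geometric_cal)
        return color_processing(hr_image)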
  • image processing pipelines in accordance with embodiments of the invention include a dynamic refocus module.
  • the dynamic refocus module enables the user to specify a focal plane within a scene for use when synthesizing an HR image.
  • the dynamic refocus module builds an estimated HR depth map for the scene.
  • the dynamic refocus module can use the HR depth map to blur the synthesized image so that portions of the scene that do not lie on the focal plane appear out of focus.
  • the SR processing is limited to pixels lying on the focal plane and within a specified Z-range around the focal plane.
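As a hedged sketch of the dynamic refocus idea just described: the blur model below (a depth-weighted blend with a box blur) is an assumption, since the text only states that off-focal-plane portions are made to appear out of focus.

    # Depth-guided synthetic refocus: pixels whose estimated depth departs
    # from the user-selected focal plane are blended toward a blurred copy.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def dynamic_refocus(image, depth_map, focal_depth, max_blur=9):
        """Blend sharp and blurred copies of a 2D image according to each
        pixel's distance from the chosen focal plane."""
        blurred = uniform_filter(image, size=max_blur)
        weight = np.clip(np.abs(depth_map - focal_depth) / depth_map.max(), 0.0, 1.0)
        return (1.0 - weight) * image + weight * blurred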
  • the synthesized high resolution image 420 is encoded using any of a variety of standards-based or proprietary encoding processes including but not limited to encoding the image in accordance with the JPEG standard developed by the Joint Photographic Experts Group.
  • the encoded image can then be stored in accordance with a file format appropriate to the encoding technique used including but not limited to the JPEG Interchange Format (JIF), the JPEG File Interchange Format (JFIF), or the Exchangeable image file format (Exif).
  • parallax information can be used to generate depth maps as well as occlusion maps, and this is discussed below.
  • Array cameras in accordance with many embodiments of the invention use disparity observed in images captured by the array cameras to generate a depth map.
  • a depth map is typically regarded as being a layer of meta-data concerning an image (often a reference image captured by a reference camera) that describes the distance from the camera to specific pixels or groups of pixels within the image (depending upon the resolution of the depth map relative to the resolution of the original input images).
  • Array cameras in accordance with a number of embodiments of the invention use depth maps for a variety of purposes including (but not limited to) generating scene dependent geometric shifts during the synthesis of a high resolution image and/or performing dynamic refocusing of a synthesized image.
  • the process of determining the depth of a portion of scene based upon pixel disparity is theoretically straightforward.
  • the viewpoint of a specific camera in the array camera is chosen as a reference viewpoint
  • the distance to a portion of the scene visible from the reference viewpoint can be determined using the disparity between the corresponding pixels in some or all of the other images captured by the camera array (often referred to as alternate view images).
  • a pixel corresponding to a pixel in the reference image captured from the reference viewpoint will be located in each alternate view image along an epipolar line (i.e. a line parallel to the baseline vector between the two cameras).
  • the distance along the epipolar line of the disparity corresponds to the distance between the camera and the portion of the scene captured by the pixels. Therefore, by comparing the pixels in the captured reference image and alternate view image(s) that are expected to correspond at a specific depth, a search can be conducted for the depth that yields the pixels having the highest degree of similarity. The depth at which the corresponding pixels in the reference image and the alternate view image(s) have the highest degree of similarity can be selected as the most likely distance between the camera and the portion of the scene captured by the pixel.
  • Many challenges exist, however, in determining an accurate depth map using the method outlined above. In several embodiments, the cameras in an array camera are similar but not the same.
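The search described above can be sketched for the simplified case of horizontally rectified views, where the epipolar line is a pixel row; the patch size, cost metric (sum of absolute differences), and function name are assumptions for illustration.

    # Disparity search along a (horizontal) epipolar line: for each candidate
    # disparity, compare a reference patch with the correspondingly shifted
    # patch in an alternate view, and keep the most similar (lowest-cost) one.
    import numpy as np

    def best_disparity(reference, alternate, y, x, max_disp=16, patch=3):
        half = patch // 2
        ref_patch = reference[y - half:y + half + 1, x - half:x + half + 1]
        costs = []
        for d in range(max_disp):
            xs = x - d  # candidate corresponding column in the alternate view
            if xs - half < 0:
                costs.append(np.inf)  # shifted patch falls outside the image
                continue
            alt_patch = alternate[y - half:y + half + 1, xs - half:xs + half + 1]
            costs.append(np.abs(ref_patch - alt_patch).sum())  # SAD cost
        return int(np.argmin(costs))  # disparity with the highest similarity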
  • image characteristics including (but not limited to) optical characteristics, different sensor characteristics (such as variations in sensor response due to offsets, different transmission or gain responses, non-linear characteristics of pixel response), noise in the captured images, and/or warps or distortions related to manufacturing tolerances related to the assembly process can vary between the images reducing the similarity of corresponding pixels in different images.
  • super-resolution processes rely on sampling diversity in the images captured by an imager array in order to synthesize higher resolution images.
  • increasing sampling diversity can also involve decreasing similarity between corresponding pixels in captured images in a light field. Given that the process for determining depth outlined above relies upon the similarity of pixels, the presence of photometric differences and sampling diversity between the captured images can reduce the accuracy with which a depth map can be determined.
  • occlusions occur when a pixel that is visible from the reference viewpoint is not visible in one or more of the captured images.
  • the effect of an occlusion is that at the correct depth, the pixel location that would otherwise be occupied by a corresponding pixel is occupied by a pixel sampling another portion of the scene (typically an object closer to the camera).
  • the occluding pixel is often very different to the occluded pixel. Therefore, a comparison of the similarity of the pixels at the correct depth is less likely to result in a significantly higher degree of similarity than at other depths.
  • the occluding pixel acts as a strong outlier masking the similarity of those pixels which in fact correspond at the correct depth. Accordingly, the presence of occlusions can introduce a strong source of error into a depth map. Furthermore, use of π filter groups to increase the likelihood that a pixel visible in an image captured by a reference camera is visible in alternate view images captured by other cameras within the array can reduce error in a depth map generated in the manner described above.
  • Processes for generating depth maps in accordance with many embodiments of the invention attempt to reduce sources of error that can be introduced into a depth map by sources including (but not limited to) those outlined above.
  • U.S. Patent Application Serial No. 61/780,906 entitled “Systems and Methods for Parallax Detection and Correction in Images Captured Using Array Cameras” discloses such processes.
  • U.S. Patent Application Serial No. 61/780,906 is incorporated by reference herein in its entirety.
  • use of π filter groups can significantly decrease the likelihood that a pixel visible from the viewpoint of a reference camera is occluded within all cameras within a color channel.
  • Many different array cameras are capable of utilizing π filter groups in accordance with embodiments of the invention. Camera modules utilizing π filter groups in accordance with embodiments of the invention are described in further detail below.
  • Camera modules can be patterned with π filter groups in accordance with embodiments of the invention.
  • π filter groups utilized as part of a camera module can each include a central camera that can function as a reference camera surrounded by color cameras in a way that reduces occlusion zones for each color.
  • the camera module is arranged in a rectangular format utilizing the RGB color model where a reference camera is a green camera surrounded by red, green and blue cameras.
  • a number of green cameras that is twice the number of red cameras and twice the number of blue cameras surround the reference camera.
  • red color cameras and blue color cameras are located in opposite positions on the 3 x 3 array of cameras.
  • any set of colors from any color model can be utilized to detect a useful range of colors in addition to the RGB color model, such as the cyan, magenta, yellow and key (CMYK) color model or red, yellow and blue (RYB) color model.
  • two π filter groups can be utilized in the patterning of a camera module when the RGB color model is used.
  • One π filter group is illustrated in FIG. 5A and the other π filter group is illustrated in FIG. 5B. Either of these π filter groups can be used to pattern any camera module with dimensions greater than a 3 x 3 array of cameras.
  • patterning of the camera module with a π filter group includes only a single π filter group.
  • a π filter group on a 3 x 3 camera module in accordance with an embodiment of the invention is illustrated in FIG. 5A.
  • the π filter group 500 includes a green camera at each corner, a green reference camera in the center notated within a box 502, blue cameras above and below the reference camera, and red cameras to the left and right sides of the reference camera.
  • the number of green cameras surrounding the central reference camera is twice the number of red cameras and twice the number of blue cameras.
  • red cameras are located in opposite locations relative to the center of the 3 x 3 array of cameras to reduce occlusions.
  • FIG. 5B An alternative to the ⁇ filter group described in FIG. 5A is illustrated in FIG. 5B in accordance with an embodiment of the invention.
  • This ⁇ filter group also includes green cameras at the corners with a green reference camera 552 at the center, as denoted with a box.
  • the red cameras shown in FIG. 5B are above and below, and the blue cameras are to the left and right side of the reference camera.
  • the π filter group in FIG. 5B includes a central reference camera surrounded by a number of green cameras that is twice the number of red cameras and twice the number of blue cameras.
  • the reference camera need not be a green camera.
  • the configurations in FIGS. 5A and 5B can be modified to include a central camera that employs a Bayer color filter.
  • the central camera is an infrared camera, an extended color camera, and/or any other type of camera appropriate to a specific application, for example a UV camera.
  • any of a variety of color cameras can be distributed around the reference camera in opposite locations in the 3 x 3 array relative to the reference camera in a manner that reduces occlusion zones with respect to each color channel.
  • FIG. 5C depicts an embodiment where green color cameras are located above, below, to the left, and to the right of the central camera, while red and blue color cameras are disposed at the corner locations of the π filter group.
  • the first and third rows and columns each have a red, green, and blue color filter, and this arrangement can reduce instances of occlusions.
  • the configuration shown in FIG. 5C can include slightly larger occlusion zones in the red and blue color channels compared with the embodiments illustrated in FIGS. 5A and 5B, because the red and blue color cameras are slightly further away from the central reference camera.
  • FIGS. 5D and 5E depict embodiments where color cameras surround a central green camera such that the cameras in each color channel are located in opposite positions in a 3 x 3 array relative to the central reference camera.
  • the blue or red color channel in which the cameras are in the corners of the 3 x 3 array is likely to have slightly larger occlusion zones than the blue or red color channel in which the cameras are located closer to the central reference camera (i.e. the cameras are not located in the corners).
  • the central reference camera can be any suitable camera, e.g. not just a green camera, in accordance with embodiments of the invention.
  • many embodiments are similar to those seen in FIGS. 5D and 5E, except they utilize an arrangement that is the mirror image of those seen in FIGS. 5D and 5E.
  • numerous embodiments are similar to those seen in FIGS. 5D and 5E, except they utilize an arrangement that is rotated with respect to those seen in FIGS. 5D and 5E. A code sketch of representative π filter group arrangements follows this item.
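
The two π filter group arrangements of FIGS. 5A and 5B can be expressed compactly in code. The following is a minimal Python sketch (the grid encoding is an assumption, not part of the patent text) that encodes both 3 x 3 arrangements and checks the properties described above: twice as many green cameras as red cameras and as blue cameras around the central reference camera, with each color channel occupying opposite positions relative to the center.

    # Assumed grid encoding: 'G', 'R', 'B' name the color filter of each camera.
    PI_5A = [["G", "B", "G"],
             ["R", "G", "R"],   # blue above/below, red left/right of center
             ["G", "B", "G"]]

    PI_5B = [["G", "R", "G"],
             ["B", "G", "B"],   # red above/below, blue left/right of center
             ["G", "R", "G"]]

    def is_pi_filter_group(grid):
        # Check the two properties described above for a 3 x 3 arrangement.
        assert len(grid) == 3 and all(len(row) == 3 for row in grid)
        surround = [grid[r][c] for r in range(3) for c in range(3) if (r, c) != (1, 1)]
        counts = {color: surround.count(color) for color in "RGB"}
        # Twice as many green cameras as red and as blue around the reference.
        ratios_ok = counts["G"] == 2 * counts["R"] == 2 * counts["B"]
        # Cameras of each color sit in opposite positions about the center,
        # which reduces occlusion zones in the corresponding color channel.
        opposite_ok = all(grid[r][c] == grid[2 - r][2 - c]
                          for r in range(3) for c in range(3))
        return ratios_ok and opposite_ok

    assert is_pi_filter_group(PI_5A) and is_pi_filter_group(PI_5B)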
  • Any camera module with dimensions at and above 3 x 3 cameras can be patterned with one or more π filter groups, where cameras not within a π filter group are assigned a color that reduces or minimizes the likelihood of occlusion zones within the camera module given color filter assignments of the π filter groups.
  • a 4 x 4 camera module patterned with two π filter groups in accordance with an embodiment of the invention is illustrated in FIG. 6.
  • the camera module 600 includes a first π filter group 602 of nine cameras centered on a reference green camera 604.
  • a second π filter group 610 is diagonally located one camera shift to the lower right of the first π filter group.
  • the second π filter group shares the four center cameras 612 of the camera module 600 with the first π filter group.
  • the shared cameras serve different roles in the two groups (i.e. different cameras act as reference cameras in the two π filter groups).
  • the two cameras at the corners 606 and 608 of the camera module are not included in the two π filter groups, 602 and 610.
  • the color filters utilized within these cameras are determined based upon minimization of occlusion zones given the color filter assignments of the cameras that are part of the two π filter groups, 602 and 610. Due to the patterning of the π filter groups, there is an even distribution of blue color cameras around the reference camera, but there is no red color camera above the reference camera.
  • selecting the upper right corner camera 606 to be red provides red image data from a viewpoint above the reference camera, minimizing the likelihood of occlusion zones above and to the right of foreground objects in a scene for both the reference camera 604 and the center camera of the second π filter group.
  • selecting the lower left corner camera 608 to be blue provides blue image data from a viewpoint below and to the left of the reference camera, minimizing the likelihood of occlusion zones below and to the left of foreground objects in a scene for both the reference camera 604 and the center camera of the second π filter group.
  • a camera module with dimensions greater than 3 x 3 can be patterned with π filter groups, with colors assigned to cameras not included in any π filter group to reduce and/or minimize occlusion zones as discussed above.
  • the camera array includes at least one row and at least one column that contain a blue color camera, a green color camera, and a red color camera. A sketch reconstructing the 4 x 4 pattern of FIG. 6 follows this item.
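
The following sketch gives a plausible reconstruction of the 4 x 4 pattern of FIG. 6, consistent with the description above; the exact layout is an assumption inferred from the text. Both overlapping 3 x 3 windows are valid π filter groups, and the two free corners carry the red and blue assignments discussed above.

    # Assumed reconstruction of the 4 x 4 module of FIG. 6: the first pi
    # filter group occupies the upper-left 3 x 3 window, the second is shifted
    # one camera down and to the right, and the two free corners are assigned
    # red (upper right) and blue (lower left) to reduce occlusion zones.
    MODULE_4x4 = [["G", "B", "G", "R"],
                  ["R", "G", "R", "G"],
                  ["G", "B", "G", "B"],
                  ["B", "G", "R", "G"]]

    def window(grid, top, left):
        # Extract the 3 x 3 sub-grid whose upper-left corner is (top, left).
        return [row[left:left + 3] for row in grid[top:top + 3]]

    # Both candidate reference cameras sit at the center of a pi filter group
    # (is_pi_filter_group() is reused from the sketch above).
    assert is_pi_filter_group(window(MODULE_4x4, 0, 0))
    assert is_pi_filter_group(window(MODULE_4x4, 1, 1))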
  • A 4 x 4 camera module with two π filter groups in accordance with an embodiment of the invention is illustrated in FIG. 7.
  • the camera module 700 includes two π filter groups 702, 706, where the central camera of each π filter group 704, 708 can act as a reference camera. Irrespective of the reference camera that is selected, the distribution of cameras around the reference camera is equivalent due to the use of π filter groups.
  • if a camera module 700 detects a defect in reference camera 704, the camera module 700 can switch to using the camera at the center of another π filter group as a reference camera 708 to avoid the defects of the first reference camera 704. Furthermore, patterning with π filter groups does not require that the reference camera or a virtual viewpoint be at the center of a camera module, but rather that the reference camera is surrounded by color cameras in a way that reduces occlusion zones for each color. Although a specific camera module is discussed above, camera modules of any number of different dimensions can be utilized to create multiple reference camera options in accordance with embodiments of the invention. A sketch of this fallback selection follows this item.
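
A minimal sketch of the reference camera fallback described above; the helper name and position indexing are illustrative assumptions, not from the patent text.

    # Fall back to the center of another pi filter group when the default
    # reference camera's focal plane is found to be defective.
    def select_reference(pi_group_centers, faulty_positions):
        # pi_group_centers: (row, col) centers of pi filter groups, in order
        # of preference; faulty_positions: set of (row, col) defective cameras.
        for center in pi_group_centers:
            if center not in faulty_positions:
                return center
        raise RuntimeError("no operational reference camera available")

    # For a module like FIG. 7, with the first reference camera defective:
    print(select_reference([(1, 1), (2, 2)], {(1, 1)}))  # -> (2, 2)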
  • Manufacturing processes inherently involve variations that can result in defects.
  • the manufacturing defects may be severe enough to render an entire focal plane within an imager array inoperable. If the failure of the focal plane results in the discarding of the imager array, then the cost to manufacture array cameras is increased.
  • Patterning camera modules with π filter groups can provide high manufacturing yield because the allocation of color filters in the optical channels of the optic array can be used to reduce the impact that a faulty focal plane has with respect to the creation of occlusion zones in the images synthesized using the image data captured by the array camera.
  • the light sensed by the pixels in a focal plane of an imager array is determined by a color filter included in the optical channel that focuses light onto the focal plane.
  • the color filter pattern of the optical channels in the optic array can be determined so that the defective focal plane does not result in an increase in the size of occlusion zones.
  • A process for detecting faulty focal planes before combining an optic array and an imager array to create a camera module in accordance with embodiments of the invention is illustrated in FIG. 8A.
  • the color filter patterns are patterned on the optic array and not on the pixels of the imager array.
  • a process can systematically choose a specific optic array that pairs the faulty focal plane with a color filter of a particular color to ensure that the size of the occlusion zones in a given color channel is reduced and/or minimized.
  • the process 800 includes testing (802) an imager array for faulty focal planes.
  • a decision (804) is made as to whether a faulty focal plane is detected on the imager array. If a faulty focal plane is detected, then an optic array is selected based upon the location of the faulty focal plane (806). In many embodiments, an optic array is selected that reduces the effect of the faulty focal plane by assigning color filters to the operational focal planes in a way that minimizes the impact of the faulty focal plane on the creation of occlusion zones within images synthesized using image data captured by the imager array. Further discussion of selecting different optic arrays that reduce occlusion zones when there is a faulty focal plane is provided below with reference to FIGS. 8B and 8C.
  • the selected optic array is combined (808) with the imager array to create a camera module. If a faulty focal plane is not detected, then any of a variety of optic arrays including filter patterns based on π filter groups can be combined (808) with the tested imager array to create a camera module.
  • a typical process can use a default optic array including a first filter pattern based on π filter groups, and a second filter pattern based on π filter groups can be utilized when specific defects are detected that would result in the faulty focal plane reducing the number of color cameras (or even specific color cameras, such as color cameras around the outside of the camera module) in the camera module when the first filter pattern is used. A sketch of this selection step follows this item.
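
A sketch of selection step (806) under the assumption that a small set of candidate filter patterns is available. The cost function is a stand-in: it simply counts how many red or blue cameras land on faulty focal planes, since losing one of the less numerous red or blue cameras is more likely to enlarge occlusion zones than losing one of the many green cameras.

    def occlusion_cost(pattern, faulty_positions):
        # Stand-in metric: red/blue cameras paired with faulty focal planes.
        return sum(1 for (r, c) in faulty_positions if pattern[r][c] in ("R", "B"))

    def select_optic_array(candidate_patterns, faulty_positions):
        # (806) choose the candidate filter pattern minimizing defect impact.
        return min(candidate_patterns,
                   key=lambda pattern: occlusion_cost(pattern, faulty_positions))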
  • The manner in which modifying color filter assignments can reduce the impact of a faulty focal plane is illustrated in FIGS. 8B and 8C.
  • a camera module with a faulty red camera is illustrated in FIG. 8B.
  • the camera module 820 includes a first π filter group 828 with a possible reference camera 822 at the center, a second π filter group 832 with a possible reference camera 830 at the center, and a faulty red camera 824 below both π filter groups 828 and 832.
  • with an optic array including the filter pattern illustrated in FIG. 8B, the defective red camera prevents the capture of red color information below any reference camera, increasing the likelihood of occlusion zones below foreground objects.
  • an optic array patterned using π filter groups in different locations can result in all of the blue and red color filters being assigned to cameras that are active. In this way, the faulty focal plane only impacts the number of green cameras and does so in a way that reduces the likelihood of occlusion zones in an image synthesized using the image data captured by the resulting camera module.
  • yield can be improved under certain circumstances by combining the imager array that includes the faulty focal plane with an optic array that assigns the color filters of the active cameras based on π filter groups, so that color information is captured around the reference camera in a way that minimizes the likelihood of occlusion zones given the location of the faulty focal plane.
  • A camera module with the faulty focal plane of FIG. 8B, but with an optic array patterned with π filter groups in such a way that the faulty focal plane does not reduce the capture of red or blue image data around the reference camera, is illustrated in FIG. 8C.
  • Relative to the pattern of the optic array of FIG. 8B, the optic array of FIG. 8C is flipped along the central vertical bisecting axis 826 of the optic array and includes two π filter groups 828' and 832'.
  • in FIG. 8C, the lens stack associated with the faulty focal plane is green 854, as opposed to red 824 in FIG. 8B.
  • as there are multiple green cameras below all possible reference cameras 852, 856 in FIG. 8C, the loss of a green camera 854 is less impactful than the loss of the red camera 824 in FIG. 8B. Therefore, the impact of faulty focal planes on an imager array can be reduced by combining the faulty imager array with an optic array specifically selected to assign color filters to the focal planes in the imager array in a manner that reduces the likelihood that the faulty focal plane will create an occlusion zone in any of the color channels captured by the resulting camera module.
  • although the example above discusses reducing red occlusion zones, the impact of a defective focal plane in any of the locations in an imager array can be similarly minimized by appropriate selection of a filter pattern based on π filter groups.
  • any of a variety of alternative color filter patterns including π filter groups can be utilized to increase manufacturing yield in accordance with embodiments of the invention. A sketch of the pattern-flipping example follows this item.
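
The FIG. 8B / FIG. 8C example can be reproduced with the reconstructed 4 x 4 pattern from the earlier sketch; the exact layout and the position of the faulty focal plane are assumptions. Flipping the pattern about its central vertical axis pairs the faulty focal plane with a green filter instead of a red one.

    def flip_horizontal(pattern):
        # Mirror the filter pattern about its central vertical bisecting axis.
        return [list(reversed(row)) for row in pattern]

    PATTERN_8B = [["G", "B", "G", "R"],
                  ["R", "G", "R", "G"],
                  ["G", "B", "G", "B"],
                  ["B", "G", "R", "G"]]

    FAULTY = (3, 2)  # assumed position of the faulty focal plane (red 824)

    PATTERN_8C = flip_horizontal(PATTERN_8B)
    print(PATTERN_8B[3][2], "->", PATTERN_8C[3][2])  # R -> G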
  • Super Resolution processes can be used to synthesize high resolution images using low resolution images captured by an array camera, including pairs of stereoscopic 3D images, as disclosed in U.S. Patent Application No. 12/967,807, entitled "Systems and Methods for Synthesizing High Resolution Images Using Super-Resolution Processes", filed December 14, 2010, the disclosure of which is incorporated by reference above.
  • Stereoscopic 3D image pairs are two images of a scene from spatially offset viewpoints that can be combined to create a 3D representation of the scene.
  • the use of a filter pattern including π filter groups can enable the synthesis of stereoscopic 3D images in a computationally efficient manner.
  • Image data captured by less than all of the cameras in the array camera can be used to synthesize each of the images that form the stereoscopic 3D image pair.
  • Patterning with π filter groups enables an efficient distribution of cameras around a reference camera that reduces occlusion zones and reduces the amount of image data captured by the camera module that is utilized to synthesize each of the images in a stereoscopic 3D image pair.
  • different subsets of the cameras are used to capture each of the images that form the stereoscopic 3D image pair, and each of the subsets includes a π filter group.
  • each of the images that form the stereoscopic 3D image pair is captured from a virtual viewpoint that is slightly offset from the camera at the center of a π filter group.
  • the central camera of a π filter group is surrounded by color cameras in a way that minimizes occlusion zones for each color camera when the central camera is used as a reference camera.
  • when the virtual viewpoint is proximate the center of a π filter group, the benefits of the distribution of color cameras around the virtual viewpoint are similar.
  • A left virtual viewpoint for a stereoscopic 3D image pair captured using a camera module patterned using π filter groups is illustrated in FIG. 9A.
  • the left virtual viewpoint 904 is taken from image data from the 12 circled cameras G1 - G3, G5 - G7, B1 - B2, B4, and R2 - R4 that form a 3 x 4 array.
  • the virtual viewpoint is offset relative to the green camera G3, which is the center of a π filter group 906.
  • A right virtual viewpoint used to capture the second image in the stereoscopic pair using the camera module shown in FIG. 7 is illustrated in FIG. 9B.
  • the right virtual viewpoint 954 is taken from image data from the 12 circled cameras B1 - B3, G2 - G4, G6 - G8, R1, and R3 - R4 that form a 3 x 4 array.
  • the virtual viewpoint is offset relative to the green camera G6, which is the center of a π filter group 956. Therefore, a single array camera can capture 3D images of a scene using image data from a subset of the cameras to synthesize each of the images that form the stereoscopic pair.
  • the computational complexity of generating the stereoscopic 3D image pair is reduced.
  • the location of the viewpoints of each of the images proximate a camera that is the center of a π filter group reduces the likelihood of occlusion zones in the synthesized images.
  • the viewpoints need not be virtual viewpoints.
  • array camera modules can be constructed using π filter groups so that the viewpoints from which stereoscopic images are captured are reference viewpoints obtained from reference cameras within the camera array.
  • a 3 x 5 camera module is provided that includes two overlapping π filter groups.
  • a 3 x 5 camera module that includes two overlapping π filter groups centered on each of two reference green color cameras is illustrated in FIG. 9C.
  • the camera module 960 includes two overlapping π filter groups 962 and 964, each centered on one of two reference green color cameras 966 and 968 respectively. The two reference cameras 966 and 968 are used to provide the two reference viewpoints.
  • an array camera module is configured to capture stereoscopic images using non-overlapping π filter groups.
  • a 3 x 6 array camera module that includes two non-overlapping π filter groups, which can be used to capture stereoscopic images, is illustrated in FIG. 9D.
  • the array camera module 970 is similar to that seen in FIG. 9C, except that the two π filter groups 972 and 974 do not overlap.
  • the two π filter groups 972 and 974 are each centered on one of two green color cameras 976 and 978 respectively.
  • the two reference cameras 976 and 978 are used to provide the two reference viewpoints.
  • FIG. 9D demonstrates that π filter groups having different arrangements of cameras within each π filter group can be utilized to pattern an array camera module in accordance with embodiments of the invention.
  • the two π filter groups 972 and 974 use different 3 x 3 camera arrangements.
  • π filter groups incorporating different 3 x 3 arrangements of cameras can be utilized to construct any of a variety of camera arrays of different dimensions.
  • stereoscopic image pairs can be generated using subsets of cameras in any of a variety of camera modules in accordance with embodiments of the invention. A sketch of selecting such subsets follows this item.
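
The subset selection of FIGS. 9A and 9B can be sketched as follows; the indexing is an assumption, with the left view taken from columns 0-2 and the right view from columns 1-3 of a 4 x 4 module, so that each 12-camera subset forms a 3 x 4 array containing a complete π filter group.

    def stereo_subsets(rows=4, cols=4):
        # Left view: the three left-most columns; right view: the three
        # right-most columns. Each subset contains a full pi filter group.
        left = [(r, c) for r in range(rows) for c in range(3)]
        right = [(r, c) for r in range(rows) for c in range(cols - 3, cols)]
        return left, right

    left, right = stereo_subsets()
    assert len(left) == len(right) == 12  # 12 cameras per synthesized view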
  • Array cameras with camera modules patterned with π filter groups can utilize less than all of the available cameras in operation in accordance with many embodiments of the invention. In several embodiments, using fewer cameras can minimize the computational complexity of generating an image using an array camera and can reduce the power consumption of the array camera. Reducing the number of cameras used to capture image data can be useful for applications such as video, where frames of video can be synthesized using less than all of the image data that can be captured by a camera module. In a number of embodiments, a single π filter group can be utilized to capture an image. In many embodiments, image data captured by a single π filter group is utilized to capture a preview image prior to capturing image data with a larger number of cameras.
  • the cameras in a single π filter group capture video image data.
  • image data can be captured using additional cameras to increase resolution and/or provide additional color information and reduce occlusions.
  • A π filter group within a camera module that is utilized to capture image data that can be utilized to synthesize an image is illustrated in FIG. 10.
  • the reference camera is boxed and utilized cameras are encompassed in a dotted line.
  • the camera module 1000 includes a π filter group of cameras generating image data G1 - G2, G5 - G6, B1 - B2 and R2 - R3, with reference camera G3.
  • Image data can be acquired using additional cameras for increased resolution and to provide additional color information in occlusion zones. Accordingly, any number and arrangement of cameras can be utilized to capture image data using a camera module in accordance with many different embodiments of the invention. A sketch of restricting capture to a single π filter group follows this item.
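
Restricting capture to the cameras of a single π filter group, as in FIG. 10, reduces the image data read out and the power consumed. The following sketch assumes the reference camera sits at position (1, 1) of the module, consistent with the earlier reconstructed layout.

    def pi_group_positions(center):
        # The pi filter group is the 3 x 3 window around the reference camera.
        r0, c0 = center
        return [(r0 + dr, c0 + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)]

    # Nine cameras stay active; the remaining cameras of the module can
    # remain powered down until full-resolution capture is requested.
    active = pi_group_positions((1, 1))
    assert len(active) == 9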
  • Color filter patterns for any array of cameras having dimensions greater than 3 x 3 can be constructed in accordance with embodiments of the invention.
  • processes for constructing color filter patterns typically involve assigning color filters to the cameras in a camera module to maximize the number of overlapping π filter groups.
  • color filters can be assigned to the cameras based upon minimizing occlusions around the camera that is to be used as the reference camera for the purposes of synthesizing high-resolution images.
  • A process for assigning color filters to cameras in a camera module in accordance with an embodiment of the invention is illustrated in FIG. 11.
  • the process 1100 includes selecting (1102) a corner of the array and assigning (1104) a π filter group to the selected corner.
  • the π filter group occupies a 3 x 3 grid.
  • Color filters can be assigned (1106) to the remaining cameras in such a way as to maximize the number of overlapping π filter groups within the array.
  • where cameras remain that are not part of any π filter group, the cameras are assigned (1108) color filters that reduce the likelihood of occlusion zones in images synthesized from the viewpoint of a camera selected as the reference camera for the array, at which point all of the cameras in the array are assigned color filters.
  • The process of generating a simple filter pattern for a 5 x 5 array using π filter groups is illustrated in FIGS. 12A - 12D. The process starts with the selection of the top left corner of the array. A π filter group is assigned to the 3 x 3 group of cameras in the top left corner (cameras G1-G5, B1-B2, and R1-R2).
  • a second overlapping π filter group is created by adding three green cameras, a blue camera, and a red camera (G6-G8, B3, and R3).
  • a third overlapping π filter group is created by adding another three green cameras, a blue camera, and a red camera (G9-G11, B4, and R4).
  • fourth and fifth π filter groups are created by twice adding a single green camera, blue camera, and red camera (G12, B5, R5 and G13, B6, R6). In the event that the central camera (G6) fails, a camera at the center of another π filter group can be utilized as the reference camera (e.g. G3).
  • A similar process for generating a simple filter pattern for a 4 x 5 array using π filter groups is illustrated in FIGS. 13A - 13D.
  • the process is very similar, with the exception that two cameras are not included in π filter groups. Due to the fact that there are no blue cameras below the camera G6 (which is the center of a π filter group), the cameras that do not form part of a π filter group are assigned as blue cameras (B5 and B6).
  • similar processes can be applied to any array larger than a 3 x 3 array to generate a color filter pattern incorporating π filter groups in accordance with embodiments of the invention.
  • the process outlined above can be utilized to construct larger arrays, including the 7 x 7 array of cameras illustrated in FIG. 14.
  • the same process can also be utilized to construct even larger arrays of any dimensions, including square arrays where the number of cameras in each of the dimensions of the array is odd. Accordingly, the processes discussed herein can be utilized to construct a camera module and/or an array camera including a camera array having any dimensions appropriate to the requirements of a specific application in accordance with embodiments of the invention. A sketch of this construction process follows this item.
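
The construction described above admits a compact formulation. In the following sketch, the parity rule is an assumption inferred from the examples of FIGS. 12A - 12D, and the final occlusion-driven re-assignment of leftover cameras (step 1108) is omitted; every interior green camera of the resulting pattern is the center of a π filter group.

    def pattern_with_pi_groups(rows, cols):
        # Green where row + column is even; blue where the row is even and
        # the column odd; red where the row is odd and the column even.
        def color(r, c):
            if (r + c) % 2 == 0:
                return "G"
            return "B" if r % 2 == 0 else "R"
        return [[color(r, c) for c in range(cols)] for r in range(rows)]

    for row in pattern_with_pi_groups(5, 5):
        print(" ".join(row))
    # G B G B G
    # R G R G R
    # G B G B G
    # R G R G R
    # G B G B G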

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Color Television Image Signal Generators (AREA)
  • Image Processing (AREA)

Abstract

Systems and methods for patterning array camera modules using π filter groups in accordance with embodiments of the invention are disclosed. In one embodiment, an array camera module includes: an M x N imager array including a plurality of focal planes, where each focal plane includes an array of pixels; and an M x N optic array of lens stacks, where each lens stack corresponds to a focal plane and forms an image of a scene on its corresponding focal plane. Each pairing of a lens stack and a focal plane thus forms a camera. At least one row of the M x N array of cameras includes one or more red cameras, one or more green cameras, and one or more blue cameras. At least one column of the M x N array of cameras includes one or more red cameras, one or more green cameras, and one or more blue cameras.
EP13785220.8A 2012-05-01 2013-05-01 Modules d'appareils de prise de vues constitués de groupes de filtres pi Withdrawn EP2845167A4 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201261641165P 2012-05-01 2012-05-01
US201261691666P 2012-08-21 2012-08-21
US201361780906P 2013-03-13 2013-03-13
PCT/US2013/039155 WO2013166215A1 (fr) 2012-05-01 2013-05-01 Modules d'appareils de prise de vues constitués de groupes de filtres pi

Publications (2)

Publication Number Publication Date
EP2845167A1 true EP2845167A1 (fr) 2015-03-11
EP2845167A4 EP2845167A4 (fr) 2016-01-13

Family

ID=49514873

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13785220.8A Withdrawn EP2845167A4 (fr) 2012-05-01 2013-05-01 Modules d'appareils de prise de vues constitués de groupes de filtres pi

Country Status (4)

Country Link
EP (1) EP2845167A4 (fr)
JP (1) JP2015521411A (fr)
CN (1) CN104335246B (fr)
WO (1) WO2013166215A1 (fr)

Families Citing this family (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8866920B2 (en) 2008-05-20 2014-10-21 Pelican Imaging Corporation Capturing and processing of images using monolithic camera array with heterogeneous imagers
US11792538B2 (en) 2008-05-20 2023-10-17 Adeia Imaging Llc Capturing and processing of images including occlusions focused on an image sensor by a lens stack array
EP2502115A4 (fr) 2009-11-20 2013-11-06 Pelican Imaging Corp Capture et traitement d'images au moyen d'un réseau de caméras monolithique équipé d'imageurs hétérogènes
US8878950B2 (en) 2010-12-14 2014-11-04 Pelican Imaging Corporation Systems and methods for synthesizing high resolution images using super-resolution processes
EP2761534B1 (fr) 2011-09-28 2020-11-18 FotoNation Limited Systèmes de codage de fichiers d'image de champ lumineux
EP2817955B1 (fr) 2012-02-21 2018-04-11 FotoNation Cayman Limited Systèmes et procédés pour la manipulation de données d'image de champ lumineux capturé
JP2015534734A (ja) 2012-06-28 2015-12-03 ペリカン イメージング コーポレイション 欠陥のあるカメラアレイ、光学アレイ、およびセンサを検出するためのシステムおよび方法
US20140002674A1 (en) 2012-06-30 2014-01-02 Pelican Imaging Corporation Systems and Methods for Manufacturing Camera Modules Using Active Alignment of Lens Stack Arrays and Sensors
EP3869797B1 (fr) 2012-08-21 2023-07-19 Adeia Imaging LLC Procédé pour détection de profondeur dans des images capturées à l'aide de caméras en réseau
US20140055632A1 (en) 2012-08-23 2014-02-27 Pelican Imaging Corporation Feature based high resolution motion estimation from low resolution images captured using an array source
EP2901671A4 (fr) 2012-09-28 2016-08-24 Pelican Imaging Corp Création d'images à partir de champs de lumière en utilisant des points de vue virtuels
US8866912B2 (en) 2013-03-10 2014-10-21 Pelican Imaging Corporation System and methods for calibration of an array camera using a single captured image
WO2014164550A2 (fr) 2013-03-13 2014-10-09 Pelican Imaging Corporation Systèmes et procédés de calibrage d'une caméra réseau
WO2014159779A1 (fr) 2013-03-14 2014-10-02 Pelican Imaging Corporation Systèmes et procédés de réduction du flou cinétique dans des images ou une vidéo par luminosité ultra faible avec des caméras en réseau
US9445003B1 (en) 2013-03-15 2016-09-13 Pelican Imaging Corporation Systems and methods for synthesizing high resolution images using image deconvolution based on motion and depth information
US9438888B2 (en) 2013-03-15 2016-09-06 Pelican Imaging Corporation Systems and methods for stereo imaging with camera arrays
US9497429B2 (en) 2013-03-15 2016-11-15 Pelican Imaging Corporation Extended color processing on pelican array cameras
US10122993B2 (en) 2013-03-15 2018-11-06 Fotonation Limited Autofocus system for a conventional camera that uses depth information from an array camera
WO2015048694A2 (fr) 2013-09-27 2015-04-02 Pelican Imaging Corporation Systèmes et procédés destinés à la correction de la distorsion de la perspective utilisant la profondeur
US10119808B2 (en) 2013-11-18 2018-11-06 Fotonation Limited Systems and methods for estimating depth from projected texture using camera arrays
WO2015081279A1 (fr) 2013-11-26 2015-06-04 Pelican Imaging Corporation Configurations de caméras en réseau comprenant de multiples caméras en réseau constitutives
CN104735360B (zh) * 2013-12-18 2017-12-22 华为技术有限公司 光场图像处理方法和装置
US9807372B2 (en) 2014-02-12 2017-10-31 Htc Corporation Focused image generation single depth information from multiple images from multiple sensors
WO2015134996A1 (fr) 2014-03-07 2015-09-11 Pelican Imaging Corporation Système et procédés pour une régularisation de profondeur et un matage interactif semi-automatique à l'aide d'images rvb-d
US20170272733A1 (en) * 2014-06-03 2017-09-21 Hitachi Medical Corporation Image processing apparatus and stereoscopic display method
US20150381972A1 (en) * 2014-06-30 2015-12-31 Microsoft Corporation Depth estimation using multi-view stereo and a calibrated projector
WO2016003253A1 (fr) 2014-07-04 2016-01-07 Samsung Electronics Co., Ltd. Procédé et appareil pour une capture d'image et une extraction de profondeur simultanées
RU2595759C2 (ru) * 2014-07-04 2016-08-27 Самсунг Электроникс Ко., Лтд. Способ и устройство для захвата изображения и одновременного извлечения глубины
EP3201877B1 (fr) 2014-09-29 2018-12-19 Fotonation Cayman Limited Systèmes et procédés d'étalonnage dynamique de caméras en réseau
US9762893B2 (en) 2015-12-07 2017-09-12 Google Inc. Systems and methods for multiscopic noise reduction and high-dynamic range
CN108702498A (zh) * 2016-03-10 2018-10-23 索尼公司 信息处理器和信息处理方法
MX2022003020A (es) 2019-09-17 2022-06-14 Boston Polarimetrics Inc Sistemas y metodos para modelado de superficie usando se?ales de polarizacion.
KR20230004423A (ko) 2019-10-07 2023-01-06 보스턴 폴라리메트릭스, 인크. 편광을 사용한 표면 법선 감지 시스템 및 방법
WO2021108002A1 (fr) 2019-11-30 2021-06-03 Boston Polarimetrics, Inc. Systèmes et procédés de segmentation d'objets transparents au moyen de files d'attentes de polarisation
US11195303B2 (en) 2020-01-29 2021-12-07 Boston Polarimetrics, Inc. Systems and methods for characterizing object pose detection and measurement systems
KR20220133973A (ko) 2020-01-30 2022-10-05 인트린식 이노베이션 엘엘씨 편광된 이미지들을 포함하는 상이한 이미징 양식들에 대해 통계적 모델들을 훈련하기 위해 데이터를 합성하기 위한 시스템들 및 방법들
WO2021243088A1 (fr) 2020-05-27 2021-12-02 Boston Polarimetrics, Inc. Systèmes optiques de polarisation à ouvertures multiples utilisant des diviseurs de faisceau
US12069227B2 (en) 2021-03-10 2024-08-20 Intrinsic Innovation Llc Multi-modal and multi-spectral stereo camera arrays
US12020455B2 (en) 2021-03-10 2024-06-25 Intrinsic Innovation Llc Systems and methods for high dynamic range image reconstruction
US11954886B2 (en) 2021-04-15 2024-04-09 Intrinsic Innovation Llc Systems and methods for six-degree of freedom pose estimation of deformable objects
US11290658B1 (en) 2021-04-15 2022-03-29 Boston Polarimetrics, Inc. Systems and methods for camera exposure control
US12067746B2 (en) 2021-05-07 2024-08-20 Intrinsic Innovation Llc Systems and methods for using computer vision to pick up small objects
US11689813B2 (en) 2021-07-01 2023-06-27 Intrinsic Innovation Llc Systems and methods for high dynamic range imaging using crossed polarizers

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3728160B2 (ja) * 1999-12-06 2005-12-21 キヤノン株式会社 奥行き画像計測装置及び方法、並びに複合現実感提示システム
EP1812968B1 (fr) * 2004-08-25 2019-01-16 Callahan Cellular L.L.C. Appareil pour plusieurs dispositifs photographiques et procédé de fonctionnement associé
KR101733443B1 (ko) * 2008-05-20 2017-05-10 펠리칸 이매징 코포레이션 이종 이미저를 구비한 모놀리식 카메라 어레이를 이용한 이미지의 캡처링 및 처리
US8866920B2 (en) * 2008-05-20 2014-10-21 Pelican Imaging Corporation Capturing and processing of images using monolithic camera array with heterogeneous imagers
JP2011044801A (ja) * 2009-08-19 2011-03-03 Toshiba Corp 画像処理装置
EP2502115A4 (fr) * 2009-11-20 2013-11-06 Pelican Imaging Corp Capture et traitement d'images au moyen d'un réseau de caméras monolithique équipé d'imageurs hétérogènes
US20120012748A1 (en) * 2010-05-12 2012-01-19 Pelican Imaging Corporation Architectures for imager arrays and array cameras

Also Published As

Publication number Publication date
EP2845167A4 (fr) 2016-01-13
CN104335246B (zh) 2018-09-04
CN104335246A (zh) 2015-02-04
JP2015521411A (ja) 2015-07-27
WO2013166215A1 (fr) 2013-11-07

Similar Documents

Publication Publication Date Title
EP2845167A1 (fr) Modules d'appareils de prise de vues constitués de groupes de filtres pi
US9706132B2 (en) Camera modules patterned with pi filter groups
CN204697179U (zh) 具有像素阵列的图像传感器
JP5589146B2 (ja) 撮像素子及び撮像装置
CN105872525B (zh) 图像处理装置和图像处理方法
CN105306786B (zh) 用于具有相位检测像素的图像传感器的图像处理方法
JP5472584B2 (ja) 撮像装置
EP2133726B1 (fr) Système de capture multi-images avec résolution de profondeur d'image améliorée
CN103688536B (zh) 图像处理装置、图像处理方法
JP6131546B2 (ja) 画像処理装置、撮像装置および画像処理プログラム
KR20140094395A (ko) 복수 개의 마이크로렌즈를 사용하여 촬영하는 촬영 장치 및 그 촬영 방법
JP5597777B2 (ja) カラー撮像素子及び撮像装置
US9118879B2 (en) Camera array system
JP2008011532A (ja) イメージ復元方法及び装置
CN103597811A (zh) 拍摄立体移动图像和平面移动图像的图像拍摄元件以及装配有其的图像拍摄装置
JP5634614B2 (ja) 撮像素子及び撮像装置
WO2012153504A1 (fr) Dispositif d'imagerie et programme de commande de dispositif d'imagerie
TW202437770A (zh) 影像感測器
CN118588719A (zh) 图像传感器

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20141028

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
RA4 Supplementary search report drawn up and despatched (corrected)

Effective date: 20151210

RIC1 Information provided on ipc code assigned before grant

Ipc: H04N 9/09 20060101ALI20151204BHEP

Ipc: G06T 7/00 20060101ALI20151204BHEP

Ipc: H04N 5/225 20060101ALI20151204BHEP

Ipc: H04N 9/04 20060101AFI20151204BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20160719