WO2005057278A1 - Method and device for capturing multiple images - Google Patents

Method and device for capturing multiple images

Info

Publication number
WO2005057278A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
capturing apparatus
image capturing
images
colour
Prior art date
Application number
PCT/FI2003/000944
Other languages
English (en)
Inventor
Timo Kolehmainen
Markku Rytivaara
Timo Tokkonen
Jakke Mäkelä
Kai Ojala
Original Assignee
Nokia Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Corporation filed Critical Nokia Corporation
Priority to AU2003285380A priority Critical patent/AU2003285380A1/en
Priority to PCT/FI2003/000944 priority patent/WO2005057278A1/fr
Publication of WO2005057278A1 publication Critical patent/WO2005057278A1/fr

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/741Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/40Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
    • H04N25/41Extracting pixel data from a plurality of image sensors simultaneously picking up an image, e.g. for increasing the field of view by combining the outputs of a plurality of sensors

Definitions

  • the invention relates to an imaging device and a method of creating an image file. Especially the invention relates to digital imaging devices comprising more than one image capturing apparatus.
  • An object of the invention is to provide an improved solution for creating images. Another object of the invention is to enhance the dynamic range of images.
  • an imaging device comprising at least two image capturing apparatus, each apparatus being arranged to produce an image.
  • the apparatus is configured to utilize at least a portion of the images produced with different image capturing apparatus with each other to produce an image with an enhanced image quality.
  • a method of creating an image file in an imaging device comprising producing images with at least two image capturing apparatus, and utilising at least a portion of the images produced with different image capturing apparatus with each other to produce an image with enhanced image quality.
  • At least one image capturing apparatus has different light capturing properties compared to the other apparatus.
  • the image produced by the apparatus is used for enhancing the dynamic range of the image produced with the other of the image capturing apparatus.
  • at least one image capturing apparatus has a small aperture.
  • the image produced by the apparatus has fewer aberrations, as a smaller aperture produces a sharper image.
  • the information in the image may be utilised and combined with the images produced by other apparatus.
  • at least one image capturing apparatus has a larger aperture than the other apparatus. Thus, the apparatus gathers more light and is able to capture more detail from dark areas of the photographed scene.
  • the imaging device comprises a lenslet array with at least four lenses and a sensor array.
  • the four image capturing apparatus each use one lens from the lenslet array, and a portion of the sensor array.
  • Three image capturing apparatus each comprise a unique colour filter from a group of RGB or CMY filters or another system of colour filters, and thus the three apparatus are required for producing a colour image.
  • the fourth image capturing apparatus may be manufactured with different light capturing properties compared to other apparatus and used for enhancing the image quality produced with the three apparatus.
  • Figure 1 illustrates an example of an imaging device of an embodiment
  • Figures 2A and 2B illustrate an example of an image sensing arrangement
  • Figure 2C illustrates an example of colour image combining
  • Figures 3A and 3B illustrate embodiments of the invention
  • Figure 4 illustrates a method of an embodiment with a flowchart
  • Figure 5 illustrates an embodiment where a polarization filter is used.
  • Figure 1 illustrates a generalised digital imaging device which may be utilised in some embodiments of the invention. It should be noted that embodiments of the invention may also be utilised in other kinds of digital cameras than the apparatus of Figure 1, which is just an example of a possible structure.
  • the apparatus of Figure 1 comprises an image sensing arrangement 100.
  • the image sensing arrangement comprises a lens assembly and an image sensor.
  • the structure of the arrangement 100 will be discussed in more detail later.
  • the image sensing arrangement captures an image and converts the captured image into an electrical form.
  • the electric signal produced by the apparatus 100 is led to an A/D converter 102 which converts the analogue signal into a digital form. From the converter the digitised signal is taken to a signal processor 104.
  • the image data is processed in the signal processor to create an image file.
  • the output signal of the image sensing arrangement 100 contains raw image data which needs post processing, such as white balancing and colour processing.
  • the signal processor is also responsible for giving exposure control commands 106 to image sensing arrangement 100.
  • the apparatus may further comprise an image memory 108 where the signal processor may store finished images, a work memory 110 for data and program storage, a display 112 and a user interface 114, which typically comprises a keyboard or corresponding means for the user to give input to the apparatus.
  • Figure 2A illustrates an example of image sensing arrangement 100.
  • the image sensing arrangement comprises in this example a lens assembly 200 which comprises a lenslet array with four lenses.
  • the arrangement further comprises an image sensor 202, an aperture plate 204, a colour filter arrangement 206 and an infra-red filter 208.
  • Figure 2B illustrates the structure of the image sensing arrangement from another point of view.
  • the lens assembly 200 comprises four separate lenses 210 - 216 in a lenslet array.
  • the aperture plate 204 comprises a fixed aperture 218 - 224 for each lens.
  • the aperture plate controls the amount of light that is passed to the lens. It should be noted that the structure of the aperture plate is not relevant to the embodiments, i.e. the aperture value of each lens need not be the same.
  • the number of lenses is not limited to four, either.
  • the colour filter arrangement 206 of the image sensing arrangement comprises in this example three colour filters, i.e. red 226, green 228 and blue 230 in front of lenses 210 - 214, respectively.
  • the sensor array 202 is in this example divided into four sections 234 to 239.
  • the image sensing arrangement comprises in this example four image capturing apparatus 240 - 246.
  • the image capturing apparatus 240 comprises the colour filter 226, the aperture 218, the lens 210 and the section 234 of the sensor array.
  • the image capturing apparatus 242 comprises the colour filter 228, the aperture 220, the lens 212 and the section 236 of the sensor array and the image capturing apparatus 244 comprises the colour filter 230, the aperture 222, the lens 214 and the section 238 of the sensor array.
  • the fourth image capturing apparatus 246 comprises the aperture 224, the lens 216 and a section 239 of the sensor array.
  • the fourth apparatus 246 does not in this example comprise a colour filter.
  • the image sensing arrangement of Figures 2A and 2B is thus able to form four separate images on the image sensor 202.
  • the image sensor 202 is typically, but not necessarily, a single solid-state sensor, such as a CCD (Charged Coupled Device) or CMOS (Complementary Metal-oxide Semiconductor) sensor known to one skilled in the art.
  • the image sensor 202 may be divided between lenses, as described above.
  • the image sensor 202 may also comprise four different sensors, one for each lens.
  • the image sensor 202 converts light into an electric current.
  • the sensor 202 comprises a given number of pixels.
  • the number of pixels in the sensor determines the resolution of the sensor. Each pixel produces an electric signal in response to light.
  • the number of pixels in the sensor of an imaging apparatus is a design parameter. Typically in low cost imaging apparatus the number of pixels may be 640x480 along the long and short sides of the sensor. A sensor of this resolution is often called a VGA sensor. In general, the higher the number of pixels in a sensor, the more detailed image can be produced by the sensor.
  • the image sensor 202 is thus sensitive to light and produces an electric signal when exposed to light.
  • the sensor is not able to differentiate different colours from each other.
  • the sensor as such produces only black and white images.
  • a number of solutions have been proposed to enable a digital imaging apparatus to produce colour images. It is well known to one skilled in the art that a full colour image can be produced using only three basic colours in the image capturing phase.
  • One generally used combination of three suitable colours is red, green and blue (RGB).
  • Another widely used combination is cyan, magenta and yellow (CMY).
  • Although all colours can be synthesised using three colours, other solutions are also available, such as RGBE, where emerald is used as a fourth colour.
  • One solution used in single lens digital image capturing apparatus is to provide a colour filter array in front of the image sensor, the filter consisting of a three-colour pattern of RGB or CMY colours.
  • Such a solution is often called a Bayer matrix.
  • In an RGB Bayer matrix filter each pixel is typically covered by a filter of a single colour in such a way that in the horizontal direction every other pixel is covered with a green filter, while the remaining pixels are covered by a red filter on every other line and by a blue filter on every other line.
  • a single colour filter passes through to the sensor pixel under the filter only the light whose wavelength corresponds to the wavelength of that colour.
  • the signal processor interpolates the image signal received from the sensor in such a way that all pixels receive a colour value for all three colours.
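  • As an illustration of this interpolation step, the following Python sketch (numpy and scipy, with the green/red/blue layout described above) fills in the missing colour values of a Bayer mosaic by averaging neighbouring samples of the same colour. It is a minimal bilinear-style demosaic, one of several possible interpolation methods, and is not taken from the patent itself.

      import numpy as np
      from scipy.signal import convolve2d

      def demosaic_bilinear(mosaic):
          # mosaic: 2D Bayer pattern with green on one checkerboard,
          # red on even rows / odd columns, blue on odd rows / even columns.
          h, w = mosaic.shape
          rows, cols = np.indices((h, w))
          g_mask = ((rows % 2) == (cols % 2)).astype(float)
          r_mask = ((rows % 2 == 0) & (cols % 2 == 1)).astype(float)
          b_mask = ((rows % 2 == 1) & (cols % 2 == 0)).astype(float)

          def fill(mask):
              # Weighted average of the available samples of one colour in
              # each 3x3 neighbourhood; measured samples are kept as they are.
              kernel = np.array([[1., 2., 1.], [2., 4., 2.], [1., 2., 1.]])
              values = convolve2d(mosaic * mask, kernel, mode='same', boundary='symm')
              weights = convolve2d(mask, kernel, mode='same', boundary='symm')
              estimate = values / weights
              return np.where(mask > 0, mosaic, estimate)

          return np.dstack([fill(r_mask), fill(g_mask), fill(b_mask)])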
  • the image sensing arrangement comprises a colour filter arrangement 206 in front of the lens assembly 200.
  • the filter arrangement may be located also in a different part of the arrangement, for example between the lenses and the sensor.
  • the colour filter arrangement 206 comprises three filters, one of each of the three RGB colours, each filter being in front of a lens. Alternatively, CMY colours or other colour spaces may be used as well.
  • the lens 210 is associated with a red filter, the lens 212 with a green filter and the lens 214 with a blue filter. Thus one lens 216 has no colour filter.
  • the lens assembly may in an embodiment comprise an infra-red filter 208 associated with the lenses.
  • the infra-red filter does not necessarily cover all lenses, as it may also be situated elsewhere, for example between the lenses and the sensor.
  • Each lens of the lens assembly 200 thus produces a separate image to the sensor 202.
  • the sensor is divided between the lenses in such a way that the images produced by the lenses do not overlap.
  • the area of the sensor allocated to each lens may be equal, or the areas may be of different sizes, depending on the embodiment.
  • In this example the sensor 202 is a VGA imaging sensor and the sections 234 - 239 allocated for each lens are of Quarter VGA (QVGA) resolution (320x240).
  • the electric signal produced by the sensor 202 is digitised and taken to the signal processor 104.
  • the signal processor processes the signals from the sensor in such a way that three separate subimages are produced from the signals of lenses 210 - 214, each filtered with a single colour.
  • the signal processor further processes the subimages and combines a VGA resolution image from the subimages.
  • Figure 2C illustrates one possible embodiment to combine the final image from the subimages. This example assumes that each lens of the lenslet array comprises a colour filter, in such a way that there are two green filters, one blue and one red.
  • Figure 2C shows the top left corner of the combined image 250, and four subimages, a green one 252, a red one 254, a blue one 256 and a green one 258.
  • Each of the subimages thus comprises a 320x240 pixel array.
  • the top left pixels of the subimages correspond to each other and differ only in that the colour filter used in producing the pixel information is different.
  • the subimages are first registered. Registering means that any two image points are identified as corresponding to the same physical point.
  • the top left pixel R1C1 of the combined image is taken from the green1 image 252, the pixel R1C2 is taken from the red image 254, the pixel R2C1 is taken from the blue image 256 and the pixel R2C2 is taken from the green2 image 258.
  • the final image corresponds in total resolution with the image produced with a single lens system with a VGA sensor array and a corresponding Bayer colour matrix.
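  • A minimal Python sketch of this combination step is shown below (numpy; the function and variable names are illustrative, not from the patent): the four QVGA subimages are interleaved into a single VGA Bayer-pattern image with the pixel placement just described.

      import numpy as np

      def combine_to_bayer(green1, red, blue, green2):
          # Each input is a 240x320 (QVGA) subimage from one image
          # capturing apparatus; the result is a 480x640 (VGA) mosaic.
          h, w = green1.shape
          mosaic = np.zeros((2 * h, 2 * w), dtype=green1.dtype)
          mosaic[0::2, 0::2] = green1   # R1C1, R1C3, ... from the green1 image
          mosaic[0::2, 1::2] = red      # R1C2, R1C4, ... from the red image
          mosaic[1::2, 0::2] = blue     # R2C1, R2C3, ... from the blue image
          mosaic[1::2, 1::2] = green2   # R2C2, R2C4, ... from the green2 image
          return mosaic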
  • the signal processor 104 may take into account the parallax error arising from the distances of the lenses 210 - 214 from each other.
  • the electric signal produced by the sensor 202 is digitised and taken to the signal processor 104.
  • the signal processor processes the signals from the sensor in such a way that three separate subimages are produced from the signals of lenses 210 - 214, each being filtered with a single colour.
  • the signal processor further processes the subimages and combines a VGA resolution image from the subimages.
  • Each of the subimages thus comprises a 320x240 pixel array.
  • the top left pixels of the subimages correspond to each other and differ only in that the colour filter used in producing the pixel information is different. Due to the parallax error the same pixels of the subimages do not necessarily correspond to each other. The parallax error is compensated by an algorithm.
  • the final image formation may be described as comprising several steps: first the three subimages are registered (also called matching); registering means that any two image points are identified as corresponding to the same physical point. Then the subimages are interpolated and the interpolated subimages are fused into an RGB colour image.
  • Interpolation and fusion may also be performed in the opposite order.
  • the final image corresponds in total resolution with the image produced with a single lens system with a VGA sensor array and a corresponding Bayer colour matrix.
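  • The registration and parallax compensation steps are not specified in detail; as one possible realisation, the Python sketch below estimates a purely translational offset between two subimages with phase correlation and shifts one subimage accordingly. Real parallax depends on scene depth, so this is only a simplified, assumed approach.

      import numpy as np

      def estimate_shift(reference, target):
          # Phase correlation: the location of the correlation peak gives
          # the translation between two greyscale subimages.
          f_ref = np.fft.fft2(reference)
          f_tgt = np.fft.fft2(target)
          cross = f_ref * np.conj(f_tgt)
          cross /= np.abs(cross) + 1e-12
          corr = np.fft.ifft2(cross).real
          dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
          h, w = reference.shape
          if dy > h // 2:
              dy -= h          # interpret wrap-around as a negative shift
          if dx > w // 2:
              dx -= w
          return dy, dx

      def register(reference, target):
          # Shift the target subimage so that it is aligned with the reference.
          dy, dx = estimate_shift(reference, target)
          return np.roll(target, shift=(dy, dx), axis=(0, 1))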
  • the subimages produced by the three image capturing apparatus 240 - 244 are used to produce a colour image.
  • the fourth image capturing apparatus 246 may have different properties compared with the other apparatus.
  • the aperture plate 204 may comprise an aperture 224 of a different size for the fourth image capturing apparatus 246 compared to the three other image capturing apparatus.
  • the signal processor 104 is configured to combine at least a portion of the subimage produced with the fourth image capturing apparatus with the subimages produced with the three image capturing apparatus 240 - 244 to produce a colour image with an enhanced image quality.
  • the signal processor 104 is configured to analyse the images produced with the image capturing apparatus and to determine which portions of the images to combine.
  • the fourth image capturing apparatus has a small aperture 224 compared to the apertures 218 - 222 of the rest of the image capturing apparatus. This is illustrated in Figure 3A. When the aperture is small there are fewer aberrations in the resulting image, because a small aperture produces a sharper image.
  • a subimage taken with a small aperture adds information to the final image in bright areas which would otherwise be overexposed.
  • Apertures are usually denoted with so-called F-numbers. They denote the size of the aperture hole through which the light passes to the lens. An F-number expresses the aperture diameter as a fraction of the focal length of the lens. Thus, the smaller the F-number, the more light is passed to the lens. For example, if the focal length of a lens is 50 mm, an F-number of 2.8 means that the aperture is 1/2.8 of 50 mm, i.e. approximately 18 mm. A small aperture in this embodiment corresponds to an F-number of 4 or greater.
  • the fourth image capturing apparatus has a larger aperture 224 than the apertures 218 - 222 of the rest of the apparatus. This is illustrated in Figure 3B.
  • the large aperture enables the apparatus to have better light sensitivity compared to other apparatus.
  • the difference between the apertures is preferably fairly great.
  • the final image has a lower noise level because it is averaged using many images.
  • the dynamic range is wider.
  • the final image will have more detail in areas of the image where the light intensity is low; these areas would remain dark without the dynamic range enhancement.
  • the image produced with the fourth image capturing apparatus may be a black and white image. In such a case the colour filter arrangement 206 does not have a colour filter for the fourth lens 216. In an embodiment the colour filter arrangement 206 may comprise a separate Bayer matrix 232 or a corresponding colour matrix filter structure for the fourth lens 216. Thus the fourth lens can be used to enhance a colour image.
  • the subimage or portions of the subimage produced with the fourth image capturing apparatus and the subimages produced with the three image capturing apparatus 240 - 244 may be combined by the signal processor 104 using several different methods. In an embodiment the combining is made using an averaging method for each pixel to be combined: PVfinal_R = (PV_R + PV_4) / 2, and correspondingly for the green and blue components.
  • PV_R, PV_G and PV_B are the pixel values of the red, green and blue filtered apparatus (in the example of Figure 2B, the pixel values from the subimages produced by the apparatus 240 - 244) and PV_4 is the pixel value of the fourth apparatus 246.
  • Since the fourth apparatus produces black and white images, the colour saturation must also be increased for the combined pixels.
  • the above algorithm is for the situation where the aperture of the fourth apparatus 246 is larger than in the other apparatus. In the weighted mean method, information for the final image is taken mainly from the three RGB apparatus. Information produced by the fourth apparatus with the larger aperture can be utilised, for example, in the darkest areas of the image. The above algorithm automatically takes this condition into account. In the embodiment where the aperture of the fourth apparatus is smaller, and the image thus sharper than in the other apparatus, the images may be combined with an averaging method or a more advanced method, where the images are compared and the sharpest areas of both images are combined into the final image.
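  • The Python sketch below shows the plain averaging of the formula above, and one possible reading of the weighted mean method in which dark areas draw more heavily on the larger-aperture fourth subimage. The particular weighting function (and the saturation boost mentioned above, which is omitted here) are assumptions, not taken from the patent.

      import numpy as np

      def average_combine(pv_r, pv_g, pv_b, pv_4):
          # Straight pixelwise averaging: PVfinal_R = (PV_R + PV_4) / 2, etc.
          return np.stack([(pv_r + pv_4) / 2.0,
                           (pv_g + pv_4) / 2.0,
                           (pv_b + pv_4) / 2.0], axis=-1)

      def weighted_combine(pv_r, pv_g, pv_b, pv_4, max_value=255.0):
          # Assumed weighted mean: the darker the RGB pixel, the more weight
          # is given to the fourth, larger-aperture subimage.
          luminance = (pv_r + pv_g + pv_b) / (3.0 * max_value)  # 0 = dark, 1 = bright
          w4 = 1.0 - luminance                                  # weight of the fourth subimage
          return np.stack([(1.0 - w4) * pv_r + w4 * pv_4,
                           (1.0 - w4) * pv_g + w4 * pv_4,
                           (1.0 - w4) * pv_b + w4 * pv_4], axis=-1)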
  • the amount of information in each image can be measured by computing the standard deviation over small areas of the images.
  • the amount of information corresponds to sharpness.
  • the flowchart of Figure 4 illustrates the method.
  • In phase 400, the standard deviation over a small area of the image produced with the three RGB apparatus is calculated.
  • In phase 402, the standard deviation over the corresponding area of the image produced with the fourth apparatus is calculated.
  • In phase 404, these deviations are compared with each other.
  • the area which has the greater deviation is assumed to be sharper and it is emphasised when producing the final image.
  • the attention is then moved to the next area.
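  • A Python sketch of this comparison is given below: the image is processed in small blocks, the standard deviation of each block is computed for both images, and the data of the sharper (higher-deviation) image is emphasised in that block. The block size and the equal-weight blend used as the "emphasis" are assumptions, not taken from the patent.

      import numpy as np

      def sharpness_merge(colour, mono, block=16):
          # colour: HxWx3 image combined from the three RGB apparatus
          # mono:   HxW image from the small-aperture fourth apparatus
          out = colour.astype(float).copy()
          luma = out.mean(axis=2)
          h, w = luma.shape
          for y in range(0, h, block):
              for x in range(0, w, block):
                  a = luma[y:y + block, x:x + block]
                  b = mono[y:y + block, x:x + block].astype(float)
                  if np.std(b) > np.std(a):
                      # The fourth-apparatus data is sharper here: emphasise it.
                      out[y:y + block, x:x + block] = (
                          0.5 * out[y:y + block, x:x + block] + 0.5 * b[:, :, None])
          return out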
  • the fourth apparatus is configured to use a different exposure time compared to the other apparatus. This enables the apparatus to have a different light sensitivity compared to the other apparatus.
  • the fourth apparatus produces infra-red images. This is achieved by at least partially removing the infra-red filter 208 in front of the lens 216. Thus near-IR light reaches the sensor. In this case the colour filter arrangement 206 does not have a colour filter for the fourth lens 216.
  • the infra-red filter may be a partially leaky infra-red filter, in which case it passes both visible light and infra-red light to the sensor via the lens 216.
  • the fourth apparatus may act as an apparatus to be used for imaging in darkness. Imaging is possible when the scene is lit by an IR-light source.
  • the fourth apparatus may also be used to produce a black and white (B/W) reference image, which is taken without the infra-red filter.
  • The B/W image can also be used for document imaging.
  • the lack of a colour filter array enhances the spatial resolution of the image compared to a colour image.
  • the reference B/W image may also be useful when the three colour filtered images are registered. The registration process is enhanced when a common reference image is available.
  • Figure 5 illustrates an embodiment of the invention.
  • Figure 5 shows the lens assembly 200, the image sensor 202, the aperture plate 204 and the colour filter arrangement 206 in a more compact form.
  • the fourth apparatus comprises a polarization filter 500.
  • a polarization filter blocks light waves which are polarized perpendicular to the polarization direction of the filter.
  • a vertically polarized filter does not allow any horizontally polarized waves to pass through.
  • the most common use of polarized filters is to block reflected light.
  • In sunshine, horizontal surfaces such as roads and water reflect horizontally polarized light.
  • the fourth apparatus comprises a vertically polarized filter which allows non-polarized light to pass through but blocks reflected light.
  • the fourth apparatus comprises a polarization filter which can be rotated by the user. The polarization filter may also be used with the other embodiments described above.
  • the lens with the polarization filter is similar in optical and light-gathering properties to the other apparatus, in order to simplify calculations.
  • the default image produced by the non-polarized apparatus is defined to be the "normal image" NI. This is the image that is transmitted to the viewfinder for the user to view and stored in memory as the main image.
  • the polarized image PI is stored separately.
  • the user is able to decide whether or not to use the information contained in PI to manipulate NI to form a "corrected image" CI. For example, when viewing images, the user can be presented with a simple menu which allows the "glare correction" to be chosen, if desired.
  • the correction is made automatically and the corrected image is shown on the viewfinder and stored.
  • the user does not need to be aware that any correction has even been made. This is simple for the user, but taking the image requires more processing and is more difficult to realize in real time.
  • the image taken by the other apparatus and the polarized image taken by the fourth apparatus are reformatted into the same colour space in which there is only the intensity component (i.e. they are reformatted into greyscale images, for example).
  • These reformatted images may be called NY (for the normal image) and PY (for the polarized image).
  • the pixel values X_i,j in the matrix X are equal to k, but where the polarizing filter has blocked a significant amount of light from a given location, the pixel values X_i,j are much smaller.
  • the matrix X is thus essentially a "map" of the areas with reflected light: where there is significant reflection, the map is dark (close to zero), while it has a constant non-zero value in other areas.
  • the "glare matrix" GM is defined to be a greyscale image with the same dimen- sions as PY and NY.
  • GM is not uniquely defined, but is related to X in that it is a measure of the "excess light" which is to be removed from the image.
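  • The extract does not give the exact definitions of X and GM; the Python sketch below is one possible realisation, assuming that X is the pixelwise ratio PY/NY (roughly constant and equal to k where nothing is reflected, much smaller where glare was blocked) and that GM is subtracted from NI to form the corrected image CI. All of these choices are assumptions.

      import numpy as np

      def to_intensity(image):
          # Reformat an RGB image into an intensity-only (greyscale) image.
          return image.astype(float).mean(axis=2)

      def glare_correct(normal_image, polarized_image, k=None, max_value=255.0):
          ny = to_intensity(normal_image)       # NY
          py = to_intensity(polarized_image)    # PY
          x = py / np.maximum(ny, 1e-6)         # assumed definition of X
          if k is None:
              k = np.median(x)                  # typical value in non-reflecting areas
          # Glare matrix GM: a measure of the "excess light" to be removed.
          gm = np.clip(ny - py / k, 0.0, None)
          # Corrected image CI: remove the excess light from each channel of NI.
          ci = normal_image.astype(float) - gm[:, :, None]
          return np.clip(ci, 0.0, max_value)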
  • the image sensor is a temperature sensitive unit and generates a small electric current, which depends on the temperature of the sensor. This current is called a dark current, because it occurs also when the sensor is not exposed to light.
  • one apparatus is shielded from light and thus produces an image based on the dark current only. Information from this image may be used to suppress at least part of the dark current present in the other apparatus used for producing the actual image. For example, the dark current image may be subtracted from the images of other apparatus.
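  • As a simple illustration of this dark current suppression (function and variable names are illustrative), the dark image from the shielded apparatus can be subtracted pixelwise from the other subimages:

      import numpy as np

      def subtract_dark_frame(subimage, dark_frame):
          # Remove the dark-current contribution measured by the shielded
          # apparatus from a subimage of another apparatus, clamping at zero.
          return np.clip(subimage.astype(float) - dark_frame.astype(float), 0.0, None)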
  • at least one image capturing apparatus is used for measuring white balance or measuring exposure parameters. Usually digital cameras measure white balance and exposure parameters using one or more captured images and calculating parameters for white balance and exposure adjustments by averaging pixel values over the image or over the images.
  • the calculation requires computing resources and increases current consumption in a digital camera. In such conventional solutions the same lens that creates the image is also used for these measuring purposes.
  • the imaging apparatus has a dedicated image capturing apparatus with a lens arrangement and image sensor area for these measuring purposes.
  • the required software and required algorithms may be designed better as the image capturing and the measuring functions are separated to different apparatus. Thus measuring can be made faster and more accurately than in conventional solutions.
  • the associated image capturing apparatus detects spectral information by capturing light intensity in many spectrum bands by means of diode detectors with corresponding colour filters (for example, red, green, blue and near-IR bands are used). These parameters are used by the processor of the imaging device for estimating parameters needed for white balance and exposure adjustment.
  • the benefit is a processing time much reduced compared to the case of calculating these parameters by averaging over a full image.
  • the white balance and exposure parameters may also be calculated by taking a normal colour image with the image capturing apparatus and averaging pixels over the image in a fashion suitable for white balance and exposure adjustment.
  • the image may be saved and used for later image post-processing on computer, for example.
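  • The estimation itself is not specified in the patent; the Python sketch below shows one conventional way the measured band intensities could be turned into white balance gains (grey-world assumption) and an exposure scaling factor. The mid-grey target value is an assumption.

      def grey_world_gains(red_level, green_level, blue_level):
          # White-balance gains from the average intensities measured in the
          # R, G and B bands, scaling the red and blue channels to green.
          return (green_level / max(red_level, 1e-6),
                  1.0,
                  green_level / max(blue_level, 1e-6))

      def exposure_scale(mean_level, target_level=118.0):
          # Scaling factor that brings the average measured intensity to a
          # mid-grey target (8-bit scale assumed).
          return target_level / max(mean_level, 1e-6)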
  • each image capturing apparatus has a different aperture size.
  • Each image capturing apparatus produces a colour image.
  • Each image capturing apparatus comprises a colour filter. Large aperture variations enable high dynamic range imaging. Images of two or more image capturing apparatus may be used to compose a dynamically enhanced colour image. The images may be registered and averaged pixelwise to achieve a high dynamic range colour image.
  • Weighted averaging may also be used as an advanced method to combine images.
  • the weight coefficient can be taken from the best exposure image or derived from all sub-images.
  • the weight value indicates which subimages to use as the source of information when calculating a pixel value in the final image. When the weight value is high, the information is taken from the small-aperture cameras, and vice versa. Typically the sensitivity of the camera sensor depends on wavelength.
  • each image capturing apparatus comprises a different aperture size and each image capturing apparatus is dedicated to its own spectral band (for instance: R, G, B, Clear).
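  • A Python sketch of such a weighted, pixelwise merge is given below: registered subimages taken with different aperture sizes are scaled to a common exposure and averaged with weights that favour well-exposed pixels. The hat-shaped weight function is a common choice and an assumption here, not mandated by the patent.

      import numpy as np

      def hdr_merge(subimages, relative_exposures, max_value=255.0):
          # subimages: list of registered images (same shape), one per
          #            image capturing apparatus with its own aperture size
          # relative_exposures: relative amount of light gathered by each apparatus
          acc = np.zeros_like(subimages[0], dtype=float)
          weight_sum = np.zeros_like(acc)
          for image, exposure in zip(subimages, relative_exposures):
              image = image.astype(float)
              # Low weight for nearly black or nearly saturated pixels.
              weight = 1.0 - 2.0 * np.abs(image / max_value - 0.5)
              acc += weight * image / exposure       # scale to a common exposure
              weight_sum += weight
          return acc / np.maximum(weight_sum, 1e-6)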

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Color Television Image Signal Generators (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to a method of creating an image file in an imaging device, and to an imaging device comprising at least two image capturing apparatus, each apparatus being arranged to produce an image. The apparatus is configured to utilise at least a portion of the images produced with the different image capturing apparatus to produce an image with enhanced image quality.
PCT/FI2003/000944 2003-12-11 2003-12-11 Procede et dispositif permettant de saisir des images multiples WO2005057278A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2003285380A AU2003285380A1 (en) 2003-12-11 2003-12-11 Method and device for capturing multiple images
PCT/FI2003/000944 WO2005057278A1 (fr) 2003-12-11 2003-12-11 Procede et dispositif permettant de saisir des images multiples

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/FI2003/000944 WO2005057278A1 (fr) 2003-12-11 2003-12-11 Procede et dispositif permettant de saisir des images multiples

Publications (1)

Publication Number Publication Date
WO2005057278A1 true WO2005057278A1 (fr) 2005-06-23

Family

ID=34673805

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2003/000944 WO2005057278A1 (fr) 2003-12-11 2003-12-11 Procede et dispositif permettant de saisir des images multiples

Country Status (2)

Country Link
AU (1) AU2003285380A1 (fr)
WO (1) WO2005057278A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3971065A (en) * 1975-03-05 1976-07-20 Eastman Kodak Company Color imaging array
EP0858208A1 (fr) * 1997-02-07 1998-08-12 Eastman Kodak Company Procédé de fabrication d'images numériques à charactéristique de performance améliorée
EP0930770A2 (fr) * 1998-01-14 1999-07-21 Mitsubishi Denki Kabushiki Kaisha Télephone cellulaire portable avec fonction de caméra
US6611289B1 (en) * 1999-01-15 2003-08-26 Yanbin Yu Digital cameras using multiple sensors with multiple lenses
US20030117501A1 (en) * 2001-12-21 2003-06-26 Nec Corporation Camera device for portable equipment

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1874034A2 (fr) * 2006-06-26 2008-01-02 Samsung Electro-Mechanics Co., Ltd. Appareil et procédé de récupération d'image à pixels nombreux
EP1874034A3 (fr) * 2006-06-26 2011-12-21 Samsung Electro-Mechanics Co., Ltd. Appareil et procédé de récupération d'image à pixels nombreux
WO2008112053A3 (fr) * 2007-03-09 2008-11-13 Eastman Kodak Co Procédé de fonctionnement d'appareil photo à deux objectifs pour augmenter des images
US7859588B2 (en) 2007-03-09 2010-12-28 Eastman Kodak Company Method and apparatus for operating a dual lens camera to augment an image
CN102348093A (zh) * 2011-08-23 2012-02-08 太原理工大学 Android手机视频聊天智能底座
TWI781085B (zh) * 2015-11-24 2022-10-21 日商索尼半導體解決方案公司 複眼透鏡模組及複眼相機模組
JP2018119856A (ja) * 2017-01-25 2018-08-02 京セラ株式会社 撮像部材および撮像装置

Also Published As

Publication number Publication date
AU2003285380A1 (en) 2005-06-29

Similar Documents

Publication Publication Date Title
US20070177004A1 (en) Image creating method and imaging device
USRE47458E1 (en) Pattern conversion for interpolation
US9077886B2 (en) Image pickup apparatus and image processing apparatus
US6801719B1 (en) Camera using beam splitter with micro-lens image amplification
US7688368B2 (en) Image sensor with improved light sensitivity
TWI488144B (zh) 使用由相同影像擷取裝置所擷取之場景之低解析度影像及至少一個高解析度影像來提供經改良高解析度影像的方法
TWI428006B (zh) 處理像素陣列及處理影像的方法
TWI462055B (zh) 具有合成全色影像之彩色濾光器陣列影像
TWI495336B (zh) 利用彩色濾光片陣列影像產生全彩影像
US8179445B2 (en) Providing improved high resolution image
JP5825817B2 (ja) 固体撮像素子及び撮像装置
US8049801B2 (en) Image sensor and imaging apparatus
EP2664153B1 (fr) Système de réalisation d'image utilisant une unité d'objectif présentant des aberrations chromatiques longitudinales et procédé de fonctionnement
US6813046B1 (en) Method and apparatus for exposure control for a sparsely sampled extended dynamic range image sensing device
US20050128509A1 (en) Image creating method and imaging device
US10630920B2 (en) Image processing apparatus
US20100253833A1 (en) Exposing pixel groups in producing digital images
US20090051984A1 (en) Image sensor having checkerboard pattern
CN101449575A (zh) 具有改进的光灵敏度的图像传感器
WO2005057278A1 (fr) Procede et dispositif permettant de saisir des images multiples
JP4649734B2 (ja) 映像信号処理装置および映像信号処理プログラムを記録した記録媒体
Allen et al. Digital cameras and scanners
JP2002185976A (ja) 映像信号処理装置および映像信号処理プログラムを記録した記録媒体

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP