EP1856907A2 - Method for controlling an action, in particular a sharpness modification, from a digital colour image - Google Patents

Method for controlling an action, in particular a sharpness modification, from a digital colour image

Info

Publication number
EP1856907A2
Authority
EP
European Patent Office
Prior art keywords
image
sharpness
color
digital image
capture apparatus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP06726221A
Other languages
German (de)
English (en)
French (fr)
Inventor
Laurent Chanas
Imène TARCHOUNA
Frédéric Guichard
Bruno Liege
Jérôme MENIERE
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dxo Labs SA
Original Assignee
Dxo Labs SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from FR0550601A (FR2880958B1)
Application filed by Dxo Labs SA
Publication of EP1856907A2

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems
    • H04N23/958Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging
    • H04N23/959Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N25/61Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
    • H04N25/615Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4" involving a transfer function modelling the optical system, e.g. optical transfer function [OTF], phase transfer function [PhTF] or modulation transfer function [MTF]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N25/61Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
    • H04N25/615Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4" involving a transfer function modelling the optical system, e.g. optical transfer function [OTF], phase transfer function [PhTF] or modulation transfer function [MTF]
    • H04N25/6153Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4" involving a transfer function modelling the optical system, e.g. optical transfer function [OTF], phase transfer function [PhTF] or modulation transfer function [MTF] for colour signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/134Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements

Definitions

  • the invention relates to a method for controlling an action, especially a sharpness modification, from a digital color image. It relates more particularly, but not exclusively, to improving the sharpness of at least one color of a digital image.
  • the invention also relates to a system implementing such a method and an image generated by such a method.
  • the invention also relates to a method for producing an image capture and / or restitution apparatus which comprises an optical system for capturing and / or restoring images, a sensor and / or image generator, and / or a servo system, the image being processed, for improvement, by digital image processing means.
  • the invention also relates to an apparatus obtained by such an embodiment method.
  • an optical focusing device can be used which displaces optical elements making it possible to vary the range of distances for which the image is sharp.
  • Such a device manual or motorized, often includes a servo system for choosing the displacement according to the distances of the objects in the scene.
  • the applications of such a process include still and video cameras. It has the disadvantages of a limited depth of field, especially at large aperture, and of a cost and bulk that are difficult to adapt to small devices such as telephones.
  • a solution of the wavefront coding type can be used, which adds a specific optical element in the optical system to allow computational reconstruction of the sharpness with a large depth of field.
  • the applications of this solution have the disadvantage of being a specific industrial process, to have a cost and a bulk of the optical element, to require a hardware modification.
  • the known software solutions are: deblurring algorithms on the luminance or on a color, increasing the sharpness by a "sharpen" filter or by another computational method.
  • the applications of such a method (all cameras) have the disadvantages of a limited increase in sharpness and therefore a very small increase in depth of field.
  • the known techniques for designing or producing such image capture and / or reproduction devices consist in first selecting the properties of the hardware elements of the apparatus, including the optical system, the sensor and the servo system. Then, if necessary, digital image processing means are provided to correct the defects of at least one of the hardware elements of the apparatus.
  • to design an optical system, a specification is first established, defining the space requirement, the ranges of focal length and aperture, the field covered, the performance (expressed either as the size of the image spot or as MTF (modulation transfer function) values), and the cost.
  • This development of the optical system is done iteratively.
  • an optical system is designed to have the best quality in the center of the image and usually the quality at the edges of the image is of a lower level.
  • the optical system is designed to achieve a certain level of distortion, vignetting, blur and depth of field, so that the optical system can be compared to other optical systems.
  • the characteristics of the sensor are also specified, namely: pixel quality, pixel area, number of pixels, microlens matrix, anti-alias filters, pixel geometry, and the arrangement of the pixels.
  • the usual technique is to select the sensor of an image capture apparatus independently of the other elements of the apparatus and, in particular, the image processing system.
  • the capture apparatus and / or image generators also usually include one or more servo systems such as an exposure system and / or a point system (auto focus or "autofocus") and / or a flash control system.
  • the measurement modes are determined; in particular, the areas of the image on which the exposure will be measured are defined, as well as the weight assigned to each zone.
  • the number and position of the areas of the image that will be used to focus will be determined. For example, a motor displacement instruction is also specified.
  • the invention results from the combination of the following findings, which are specific to it: i) image capture and / or processing apparatuses produce on these images a variable sharpness which is a function of the color considered, as described below with the aid of Figures 1a and 1b.
  • In Figure 1a is shown the converging lens 1 of an optical device (not shown) provided with a sensor 2 located at a focusing point 3.2 associated with a wavelength λ2.
  • the color defined by this wavelength λ2 is sharp on an image formed by this lens when the image represents an object at a very great distance.
  • the focusing point 3.2 of the lens is specific to the color defined by this wavelength λ2, such that a focusing point 3.1 specific to another color defined by a wavelength λ1 lies upstream of the sensor.
  • the image formed by this second color (λ1) at the sensor is less sharp than the image formed by the first color (λ2), which reduces the sharpness of the overall image formed by the sensor.
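As a rough illustration of this axial chromatic shift, a thin-lens sketch with a Cauchy dispersion model shows that shorter wavelengths focus closer to the lens. The coefficients and radii below are illustrative (BK7-like glass), not values from the patent:

```python
def cauchy_index(wavelength_um, A=1.5046, B=0.0042):
    # simple Cauchy dispersion model: the index rises at short wavelengths
    return A + B / wavelength_um ** 2

def thin_lens_focal_mm(wavelength_um, R1=50.0, R2=-50.0):
    # lensmaker's equation for a thin biconvex lens: 1/f = (n - 1)(1/R1 - 1/R2)
    n = cauchy_index(wavelength_um)
    return 1.0 / ((n - 1.0) * (1.0 / R1 - 1.0 / R2))

f_blue = thin_lens_focal_mm(0.450)  # shorter wavelength, like λ1
f_red = thin_lens_focal_mm(0.600)   # longer wavelength, like λ2
# f_blue < f_red: the blue focal point lies upstream, as in Figure 1a
```

The few-percent focal difference between channels is exactly the wavelength-dependent sharpness the invention exploits.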
  • the focusing point of the lens for a wavelength is variable depending on the distance to which the object 4 is represented in the image.
  • Figure 1b shows the new locations 4.1 and 4.2 of the focusing points associated, respectively, with the wavelengths λ1 and λ2 when the represented object has moved from a very great distance (Figure 1a) to a closer distance (Figure 1b).
  • the sensor is then located at the focusing point of the color (λ1) which previously did not form a sharp image.
  • the focusing point of the lens for a wavelength and an object distance is variable depending on the position in the image of the object represented.
  • Figure 2 shows an example of the spectral distribution of an image along the wavelength axis 6.1: images are generally composed of several colors whose intensities (ordinate axis 6.2) can be close.
  • the blue 5.1 (wavelength around 450 nm), green 5.2 (wavelength around 550 nm) and red (wavelength near 600 nm) components are represented, but it is clear that the invention applies to an image regardless of its distribution of colors and of the wavelengths considered (e.g. infrared or ultraviolet).
  • the invention generally relates to a method for improving the sharpness of at least one color of a digital image, comprising the step of choosing, from among the colors of the image, at least one color called the net color,
  • the invention also relates to a method of producing a capture apparatus which comprises a capture optical system, and a sensor, and / or a servo system, the image being processed, for its improvement, by digital image processing means; a method in which the parameters of the optical system and / or the sensor and / or the servo system are determined or selected on the basis of the capabilities of the digital image processing means, so as to minimize the costs of implementation and / or optimize the performance of the capture apparatus.
  • the method further comprises the step of decomposing the digital image into regions; said choice of the net color being made for each region.
  • said choice of the net color consists in choosing the sharpest color according to a predetermined rule. In one embodiment, said choice of the net color is predetermined.
  • said digital image is derived from a capture apparatus and said choice of the net color is a function of the distance between the capture apparatus and at least one object of the captured scene to obtain said digital image.
  • said image capture apparatus having a macro mode, said choice of the net color is a function of the activation of the macro mode.
  • said digital image being derived from a capture apparatus, the method further comprises the step of determining the distance between the capture apparatus and at least one object of the captured scene from the sharpness of at least two colors in an image region of said object. In one embodiment, the method further comprises the step of reducing the sharpness of at least one color in at least one image region.
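A sketch of how distance could be recovered from the sharpness of two colors: a per-lens calibration table maps the measured blue/red relative sharpness to object distance. The calibration values below are hypothetical, for illustration only:

```python
import numpy as np

# hypothetical calibration of one lens: relative sharpness (blue / red)
# measured on test charts at known distances; closer objects -> blue sharper
CALIB_RATIO = np.array([0.4, 0.7, 1.0, 1.5, 2.2])   # increasing ratio
CALIB_DIST_M = np.array([5.0, 2.0, 1.0, 0.5, 0.2])  # corresponding distances

def estimate_distance_m(ratio):
    # piecewise-linear lookup in the calibration table
    return float(np.interp(ratio, CALIB_RATIO, CALIB_DIST_M))
```

Because the ratio is computed per region, the same lookup yields a distance estimate at many image points simultaneously, as the text notes below.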
  • the method further comprises the step of determining a servo instruction for said capture apparatus from the sharpness of at least two colors, so that focusing is done in fewer steps and is accelerated.
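A hedged sketch of such a servo instruction: since the two colors focus at different depths, the ratio of their sharpness gives both the direction and, roughly, the amplitude of the required focus displacement. The gain and the sign convention (blue sharper than red meaning the object is nearer than the current focus plane) are assumptions for illustration:

```python
import math

def focus_step(sharp_ratio_blue_over_red, target_ratio=1.0, gain=100):
    # positive step -> move focus closer; negative -> move farther
    # (sign convention and gain are illustrative, not from the patent)
    return int(gain * math.log(sharp_ratio_blue_over_red / target_ratio))
```

A single measurement thus yields a signed displacement instead of the blind search of a contrast-only autofocus, which is why focusing "is done in fewer steps".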
  • said method further comprises the step of selecting an optic from a set of predetermined optics; said optic having characteristics such that the images of an object at at least two predetermined distances have distinct net colors; so that the depth of field is improved and / or the cost of the optic is decreased.
  • said method further comprises the step of designing an optic taking the method according to the invention into account; said optic having characteristics such that the images of an object at at least two predetermined distances have distinct net colors, so that the depth of field and / or the aperture and / or any other optical characteristic is improved and / or the cost of the optic is decreased.
  • the invention also relates to a method for producing a device (20) for capturing and / or restoring images which comprises an optical system (22, 22 ') for capturing and / or restoring images, a sensor (24) and / or generator (24 ') of images, and / or a servo system (26), the image being processed, for its improvement, by digital means (28, 28') of image processing,
  • the method being such that the parameters of the optical system and / or the sensor and / or the image generator and / or the servo system are determined or selected based on the capabilities of the digital image processing means, and in particular on the improvement of the sharpness of one color according to the sharpness of another color according to a process according to one of the preceding claims,
  • the invention also relates to an apparatus for capturing and / or restoring images using a color improvement method according to one of the preceding embodiments and / or obtained by a production method according to the preceding embodiment.
  • the invention also relates to a digital image obtained by a method according to one of the preceding embodiments or from an apparatus according to the preceding embodiment.
  • the invention also relates to a digital image processing device implementing a method according to one of the preceding embodiments.
  • Digital image means an image in digital form.
  • the image may be from an image capture apparatus.
  • the digital image may be represented by a set of numerical values, hereinafter called gray level, each numerical value being associated with a color sensitivity and a relative geometric position on a surface or a volume.
  • color refers to the set of numerical values associated with the same color sensitivity.
  • the digital image is preferably the raw image of the sensor ("raw" format) before the demosaicing operation.
  • the digital image can also have undergone a treatment, for example a demosaicing, a white balance.
  • the digital image has not undergone subsampling.
  • the image capture apparatus includes a sensor with sensitive elements.
  • sensitive element is meant a sensor element for converting a flow of energy into an electrical signal.
  • the energy flow can take the form of a luminous flux, X-rays, a magnetic field, an electromagnetic field or sound waves.
  • the sensitive elements may be, depending on the case, juxtaposed on a surface and / or superimposed in a volume.
  • the sensitive elements may be arranged in a rectangular matrix, a hexagonal matrix or other geometry.
  • the invention applies to sensors comprising sensitive elements of at least two different types, each type having a color sensitivity, each color sensitivity corresponding to the portion of the energy flow converted into an electrical signal by the sensitive element of the sensor.
  • In the case of a visible image sensor, the sensors generally have a sensitivity in 3 colors and the digital image has 3 colors: red 5.1, green 5.2 and blue 5.3, represented in FIG. 2, which shows on the vertical axis 6.2 the amount of energy converted and on the horizontal axis 6.1 the wavelength. Some sensors have a sensitivity in 4 colors: red, green, emerald, blue.
  • color also refers to a combination, in particular linear, of the signals delivered by the sensor.
  • the sharpness of a color may correspond to the measurement of a value called BXU, which is a measure of the area of the blur spot, as described in the article "Uniqueness of Blur Measure" by Jérôme BUZZI and Frédéric GUICHARD, published in the Proceedings of the IEEE International Conference on Image Processing, Singapore, 2004.
  • BXU is the variance of the impulse response (that is, its average surface area). Processing capabilities can be limited to a maximum value of BXU.
  • the sharpness of a color is obtained by calculating a gradient.
  • for example, the sharpness of a color can be obtained by a gradient computed over 9 gray levels taken at neighboring geometric positions in the color considered.
  • the invention refers to the sharpness of at least two colors. According to one embodiment, the sharpness of at least two colors is considered only relative to each other. For this embodiment, a gradient makes it possible to simply calculate a relative sharpness between two colors independently of the content of the image.
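A minimal numpy sketch of such a content-independent measure: the mean squared gradient of each color plane depends on the scene, but taking the ratio between two colors of the same region largely cancels the content and leaves the relative sharpness. This is an illustration of the principle, not the patent's exact formula:

```python
import numpy as np

def gradient_energy(channel):
    # mean squared gradient of one color plane
    gy, gx = np.gradient(channel.astype(float))
    return float(np.mean(gx ** 2 + gy ** 2))

def relative_sharpness(color_a, color_b, eps=1e-12):
    # > 1 means color_a is the sharper of the two in this region
    return gradient_energy(color_a) / (gradient_energy(color_b) + eps)
```

Applied per region, the sharpest ("net") color is simply the one with the largest gradient energy.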
  • the invention mentions choosing, from among the colors, at least one color called the "net color". In one embodiment, this choice can be made by determining which of at least two colors is the sharpest. For this embodiment, a gradient makes it possible to simply determine the sharpest color among at least two colors.
  • An image capture apparatus is, for example, a disposable camera, a digital camera, a reflex camera (digital or not), a scanner, a fax machine, an endoscope, a film camera, a camcorder, a video surveillance camera, a toy, a camera integrated into or connected to a telephone, personal assistant or computer, a thermal camera, an ultrasound machine, an MRI (magnetic resonance imaging) machine, or an X-ray machine.
  • the invention relates to such apparatuses when processing images having at least two colors.
  • Optical image capture system means optical means for rendering images on a sensor.
  • Image sensor means mechanical, chemical, or electronic means for capturing and / or recording an image.
  • servo system means mechanical, chemical, electronic or computer means allowing elements or parameters of the apparatus to follow a setpoint. These include the autofocus system, automatic white balance control, automatic exposure control, control of the optical elements (for example to maintain a consistent image quality), an image stabilization system, an optical and / or digital zoom factor control system, a saturation control system, or a contrast control system.
  • the digital image processing means can take various forms depending on the application.
  • the digital image processing means may be integrated wholly or partly into the apparatus, as in the following examples:
  • An image capture apparatus which produces modified images, for example a digital camera which incorporates image processing means.
  • An image rendering apparatus which displays or prints modified images, for example a video projector or a printer including image processing means.
  • a mixed apparatus that corrects the defects of its elements, for example a scanner / printer / fax including image processing means.
  • a professional image capture apparatus that produces modified images, for example an endoscope including image processing means.
  • the digital image processing means comprise means for improving the image quality by acting on at least one of the parameters of the group comprising: the geometric distortions of the optical system, the chromatic aberrations of the optical system, the compensation of parallax, the depth of field, the vignetting of the optical system and / or of the sensor and / or of the image generator, the lack of sharpness of the optical system and / or of the sensor and / or of the image generator, noise, moiré phenomena, and / or contrast,
  • the determined or selected parameters of the optical system are chosen from the group comprising: the number of optical elements of the system, the nature of the materials composing the optical elements, the cost of the optical system materials, the treatment of the optical surfaces, the assembly tolerances, the parallax value as a function of focal length, the aperture characteristics, the aperture mechanisms, the range of possible focal lengths, the focusing characteristics, the focusing mechanisms, the anti-alias filters, the bulk, the depth of field, the focal length and focus characteristics, the geometric distortions, the chromatic aberrations, the decentering, the vignetting, the sharpness characteristics; and / or the determined or selected parameters of the sensor and / or image generator are selected from the group consisting of: pixel quality, pixel area, the number of pixels, the microlens matrix, the anti-alias filters, the geometry of the pixels, the arrangement of the pixels; and / or the determined or selected parameters of the servo system are selected from the group consisting of: focus measurement, exposure metering, white balance measurement, focus setpoint, exposure-time setpoint, the
  • the focusing can be carried out in various ways, in particular by controlling the position of moving elements of the optical system or by controlling the geometry of deformable optical elements.
  • the performance of a capture apparatus comprises, in particular, its cost, its size, the minimum amount of light that it can receive or transmit, the quality of the image (in particular its sharpness), the technical characteristics of the optics, the sensor and the servo systems, as well as its depth of field.
  • the depth of field can be defined as the range of distances in which an object generates a sharp image, that is to say an image whose sharpness is greater than a given threshold for a color, usually green, or alternatively as the distance between the nearest object plane and the farthest object plane for which the blur spot does not exceed predetermined dimensions.
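The second definition (blur spot not exceeding a diameter c) leads to the classical thin-lens depth-of-field limits. A sketch with illustrative phone-camera values (f = 4 mm, N = 2.8, c = 5 µm; these are assumptions, not the patent's figures):

```python
def depth_of_field_mm(s, f=4.0, N=2.8, c=0.005):
    """Near and far sharp limits for a focus distance s (all lengths in mm)."""
    H = f * f / (N * c) + f                      # hyperfocal distance
    near = H * s / (H + s - f)
    far = H * s / (H - s + f) if s < H else float("inf")
    return near, far

near, far = depth_of_field_mm(500.0)             # focused at 0.5 m
# everything between `near` and `far` yields an acceptably small blur spot
```

Exploiting the sharpness of several colors effectively shrinks the usable c per channel and thus extends this range, which is the performance gain the patent targets.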
  • the invention also relates to an apparatus obtained by the production method as defined above.
  • the invention relates to a method for controlling an action from a measurement made on at least one digital image, having at least two colors, from an image capture apparatus, in which: - the relative sharpness is measured between at least two colors on at least one region R of the image, and
  • At least one action is controlled as a function of the measured relative sharpness.
  • By region is meant part or all of the image.
  • a region has one or more pixels, contiguous or not.
  • the action is particularly adapted to the distance between the imaged object and the capture apparatus or is adapted to the relative depth between two imaged objects.
  • Relative sharpness can be measured in different ways, for example (without the list being limiting):
  • the relative sharpness in a region can be expressed as a single numerical value, for example reflecting the average relative sharpness in the region, or by several numerical values that account for the relative sharpness in different parts of the region.
  • At least one action is controlled according to the measured relative sharpness.
  • This action is notably (without the list being limiting):
  • a detection of an object, in particular a face and / or the main subject(s), and / or
  • an object recognition and / or authentication for example a face, and / or
  • the action implements: the digital image, and / or
  • the treatment may consist (without the list being limiting) in one of the following actions:
  • the use of the relative sharpness measured to control the action thus makes it possible, in particular, to adapt the action to the distance between at least part of an image object and the measuring device, and / or to the geometry at least a portion of an object and / or the position and / or size of at least a portion of the object, and / or the direction of at least a portion of the object.
  • the known methods do not make it possible to control this type of action from a measurement of relative sharpness of at least one region of the image, but require the use of a particular device in addition to the image sensor to estimate a distance.
  • the known methods allow a measurement of distance in only one point or a limited number of points while the invention makes it possible to measure the distance in a large number of points simultaneously.
  • the controlled action is included in the group comprising:
  • the controlled action comprises a processing on at least one zone Z' of the digital image and / or of another digital image.
  • the zone Z' is or is not part of the digital image on which the relative sharpness measurement has been made.
  • the sharpness measurement in a digital camera is performed on the image displayed before shooting, and the image taken later is processed at full resolution (while the measurement performed on the image displayed before shooting is usually at a lower resolution), from the last measurement or a combination of the last measurements.
  • the zone Z' constitutes all or part of the region (on which the relative sharpness measurement has been made) of the digital image, and / or the entire digital image, and / or a zone distinct from the region of the digital image, and / or an area of another digital image, and / or another entire digital image.
  • the zone Z' is a pixel; a region of N pixels is defined, on which the relative sharpness is measured and, depending on this relative sharpness, a filter is applied that conveys the sharpness of the sharpest color to the other color, so that the sharpness of the pixel is increased.
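One common way to realize such a filter (an illustration of the principle, not necessarily the patent's exact filter) is to keep the low frequencies of the blurred color and borrow the high-frequency detail from the net color:

```python
import numpy as np

def box_blur(img, k=2):
    # small box blur built from shifted averages (edges wrap, for simplicity)
    out = np.zeros_like(img, dtype=float)
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            out += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return out / (2 * k + 1) ** 2

def transfer_sharpness(blurred_color, net_color):
    # low frequencies of the blurred color + high frequencies of the net color
    return box_blur(blurred_color) + (net_color - box_blur(net_color))
```

Because edges are highly correlated across color planes, injecting one plane's detail into another raises its sharpness without inventing false color content.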
  • the depth of field is increased.
  • the zone Z' on which the processing is carried out can constitute an entire digital image, especially when the sharpness of the entire image is increased.
  • the other digital image being, for example, the following image in a video sequence; the other image is also, for example, the digital image taken at full resolution by a camera, while the image on which the measurement is made is at low resolution.
  • the zone Z' for which a processing is controlled comprises at least one pixel of an image, and the region comprises a predetermined neighborhood of the corresponding pixel in the digital image.
  • the processed image can be the digital image.
  • the processed image may also be another image, for example an image from the same sensor and captured after the digital image.
  • the correspondence between the pixels of the two images can be done by associating the pixels of the two images located in the same place.
  • This case has the advantage of avoiding storing the digital image between the measurement and the processing, without annoying artifacts if the images are captured with a small time interval, for example 1/15 s.
  • this processing is applied to all the pixels of an image.
  • the processed image can be the digital image.
  • the processed image can be another image, for example an image from the same sensor and captured after the digital image.
  • the processing on at least the zone Z' comprises the modification of at least one characteristic of the image belonging to the group comprising: the sharpness, the contrast, the brightness, the details, the color, the type of compression, the compression ratio, the content of the image, the resolution.
  • Example of contrast modification The contrast of nearby objects is increased and the contrast of background objects is reduced, for example in the case of a video conference. Conversely, you can reduce the contrast of nearby objects and increase the contrast of background objects to reduce the effect of fog.
  • Example of brightness modification: the processing may consist in brightening nearby objects and darkening the background, for example for a videoconference.
  • in the presence of a flash, the brightness processing will consist in brightening the background and darkening the nearest objects to compensate for the effect of the flash.
  • an MPEG4 codec is provided with a near-object / far-object segmentation, in order to strongly compress the distant background while keeping a maximum quality for the main subject, which is close.
  • Example of changing the compression ratio As above in the case of a video conference, the compression ratio may be higher for the background than for the main subject.
  • the processing consists in replacing a background with a landscape or a decor.
  • the processing comprises a sharpness modification for each pixel of the zone Z' by means of a filter mixing the values attached to the pixels of a predetermined neighborhood of each pixel, the parameters of the filter being a function of the measured relative sharpness.
  • zone Z ' is determined from the measured relative sharpness.
  • zone Z' corresponds to parts of the image where the relative sharpness is within a given range, corresponding to parts of the image containing objects within a given range of distances; this allows, for example, a foreground and the background to be treated differently.
  • the zone Z ' constitutes a background of an image, in particular intended to be transmitted remotely, in particular by a video or videoconferencing system.
  • the processed image can be the digital image.
  • the processed image may also be another image, for example an image from the same sensor and captured after the digital image.
  • the processing comprises providing information that is a function of the distance between the imaged object and the capture apparatus, for all or part of the pixels of the zone Z', and a storage and/or transmission and/or use of this distance-dependent information, the storage being performed in particular in a computer file, in particular in an image file.
  • the zone Z ' may constitute a point and / or a region and / or several regions and / or a complete image and / or a main subject and / or a background.
  • the distance-dependent information may be a distance, for example with an indication of accuracy, or a range of distance values such as, for example, a distance of less than one centimeter, between 1 and 10 centimeters, between 10 centimeters and 1 meter, or beyond one meter.
  • Distance-based information can also be represented by a criterion such as "too close", "near", "far", or "macro". Information based on distance can also be translated into information on the nature of objects or subjects, such as "portrait" or "landscape".
  • Distance-based information may also include distance values of various image elements such as minimum distance, maximum distance, mean, and standard deviation.
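As an illustration, the mapping from a distance estimate to such a coarse category might be sketched as follows (a minimal sketch: the thresholds follow the example ranges above, but the labels, function name and cut-off points are illustrative, not values fixed by the text):

```python
def distance_category(distance_m):
    """Map a distance estimate (in meters) to a coarse label.

    Illustrative thresholds loosely following the example ranges
    given in the text (< 1 cm, 1-10 cm, 10 cm - 1 m, beyond 1 m).
    """
    if distance_m < 0.01:
        return "too close"   # below the minimum sharp distance
    if distance_m < 0.10:
        return "macro"       # close-up range
    if distance_m < 1.0:
        return "near"
    return "far"
```

Such a label can then be stored in an image file or used to drive a controlled action.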
  • the controlled action comprises a servo control of the capture apparatus included in the group consisting of: a focus control, an exposure control, a flash control, an image framing control, a white balance control, an image stabilization control, or a control of another apparatus or device related to the capture apparatus, such as guiding a robot.
  • the main subject or the areas of interest can be detected by the distance measurements, from the sharpness, the main subject or the area of interest then being the closest area.
  • a focusing servo made from measurements performed directly on a single digital image is particularly advantageous compared with known focusing servos, or "autofocus", for which it is necessary to perform measurements on successive images.
  • a known adjustment servo consists of pressing a trigger member halfway, then moving the frame before pressing fully, whereas with the invention the focusing can be performed fully automatically; the invention thus saves time and gives a better image.
  • Example of exposure control: as with focus control, the exposure setting is made on the main subject, which is detected automatically; the exposure can thus be correct regardless of the position of the main subject in the frame of the image. In other words, as for the focus, the user does not need to aim at the subject, press halfway and then move the frame.
  • the illumination control can be carried out according to this main subject, whereas in the state of the art the flash power is adjusted according to the focus without determining the main subject, that is to say, in particular, the closest subject.
  • less well-lit subjects can be digitally lightened.
  • Example of control of another device: when a mobile robot has to move, the regions closest to the mobile robot are determined and a trajectory free of any obstacle is derived from the objects closest to the robot.
  • the controlled action includes a signal supply such as an indication signal of the object of main interest of the digital image, and / or a focus area, and / or an alarm signal indicating a change in the digitally monitored and imaged scene, and / or a distance from at least a portion of the imaged scene to the capture apparatus.
  • in a digital camera, a frame, in particular of predetermined shape, can be displayed around the main subject to tell the photographer which main subject the camera has detected when shooting.
  • this indication signal of the main subject is used especially before the shooting itself, to tell the photographer which subject or object is the sharpest.
  • This signal can also be an indication that the nearest object or subject is too close to the camera to be able to be in focus.
  • the signal is constituted, for example, by the plaintext message "Foreground too close", or by an exaggeration of the blur of the foreground, or by a visible change in the color of the foreground.
  • the signal indicating that the foreground scene or object is too close may take into account the final use of the image to be taken, including the resolution chosen for that purpose.
  • a subject that is blurred on a television or computer receiver screen may be sharp on a small screen of the type of that of a camera.
  • a subject that is blurred for a 24 cm × 30 cm paper print is not necessarily blurred for a 10 cm × 15 cm print.
  • the blur indication signal can also take the subject itself into account.
  • for example, the detection of a bar code is more tolerant of blur than a natural image.
  • In a video surveillance system for an object, the camera is set to monitor two regions: the first is the one where the object is located, and the second is the entire field of the camera. If an object in the shooting field approaches the object to be monitored, an alarm is triggered.
  • the controlled action is made to depend on at least one characteristic of the capture apparatus during the shooting, in particular the focal length, the aperture, the focusing distance, the exposure parameters, white balance settings, resolution, compression, or a user-made setting.
  • the controlled action is a function of the measured relative sharpness, and the relative sharpness between at least two colors depends on the settings of the camera, including the focal length, the aperture and the focusing distance.
  • the digital image constitutes a raw image derived from the sensor of the capture apparatus.
  • This arrangement facilitates the measurement of relative sharpness because, if a raw ("raw" format) image is used, the measurement is not affected by processing such as demosaicing, sharpening filters, color space changes or the tone curve.
  • the raw image from the sensor may, however, have undergone processing such as denoising, digital gain, black level compensation.
  • the relative sharpness measurement and / or the controlled action can be performed in the capture apparatus.
  • the relative sharpness measurement can be performed outside the capture apparatus, for example on a computer after transfer of the digital image, and/or can control an action that is performed outside the capture apparatus.
  • the command comprises a command for detecting and/or recognizing a portion of the image, such as face detection and/or recognition.
  • a face has a specific size.
  • the method according to the invention makes it possible to determine the distance between objects or subjects and the capture apparatus; moreover, from this distance information, the focal length and the size of the object in the image, it is possible to deduce the presence of a face (which has a size within a given range).
  • the criterion of size of the object can be supplemented by other criteria such as, for example, colors.
  • object detection, such as face detection, can be used in particular to automatically apply a strong compression to the background during a teleconference. This method can also be used for the detection, and then correction, of the red-eye defect, or for face recognition (biometric applications).
  • the controlled action includes measuring the position and / or movement of the capture apparatus.
  • one or more objects intended to remain stationary in a scene of a captured image are stored in memory and motion or position detection is performed by determining the variation of the relative sharpness over time. This arrangement can, for example, be used to make a visual computer interface of the "mouse" type in three dimensions.
  • the controlled action includes determining the position in the image of the main subject (s).
  • the criterion for determining the main subject in a digital image will be the smallest distance to the capture device. However, this criterion can be combined with other factors. For example, objects that are close to the capture apparatus may be eliminated by automatic processing. As previously described, it is also possible to take into account a criterion of size of the object, this size being a function of the focal length and the distance between the capture apparatus and the object.
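The selection of the main subject described above might be sketched as follows (a minimal sketch: the region representation, the field names and the edge-of-field exclusion rule are assumptions for illustration, not details given in the text):

```python
def select_main_subject(regions):
    """Pick the main-subject region among candidate regions.

    Each region is a dict with a 'distance' estimate (as derived from
    the relative-sharpness measurement) and an optional 'at_edge' flag;
    these field names are illustrative. Per the text, the main subject
    is the region closest to the capture apparatus, with close objects
    at the edge of the frame optionally eliminated first.
    """
    candidates = [r for r in regions if not r.get("at_edge", False)]
    if not candidates:          # everything sits at the edge: fall back
        candidates = regions
    return min(candidates, key=lambda r: r["distance"])
```

A size criterion (a function of focal length and distance) could be added as a further filter on `candidates`.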
  • the controlled action further comprises automatic framing, including centering, or cropping the digital image and / or another image on the main subject of the digital image.
  • the cropped image can be the digital image.
  • the cropped image may also be another image, for example an image from the same sensor and captured after the digital image.
  • the controlled action comprises applying a processing that is a function, on the one hand, of the relative sharpness and, on the other hand, of a criterion selected by the user.
  • the selected criterion is the following: focus on the parts of the image that are closest to the capture device.
  • the command can be to increase the sharpness of these parts of the image and reduce the sharpness of the rest of the image, to create a shallower depth of field than actually obtained.
  • the controlled action includes modifying the contrast and / or brightness and / or color and / or sharpness of an image according to the relative sharpness variation in the image.
  • a scene is lit by one or more natural or artificial sources as well as possibly by one (or more) flash (es) controlled by the camera.
  • an image capture apparatus performs an exposure control (exposure time, sensor gain and, if appropriate, aperture), a white balance control (gain of each color over the entire image) and possibly a flash control (duration and power of the flash) according to measurements on a digital image of the scene (e.g. saturated area analysis, histogram analysis, average color analysis) and/or measurements made with a complementary device (infrared rangefinder, flash pre-flash, etc.), as well as a focus servo to find the focus producing the sharpest image by comparing the sharpness of several images taken with different focus settings.
  • These controls change the contrast and / or brightness and / or color of the image but do not use a measure of the relative sharpness between at least two colors on at least one region R of the image.
  • the controlled action includes providing, to an exposure control system and / or white balance and / or focusing, the position of at least one area of interest to be taken. in account, this area of interest being determined by comparing at least two relative sharpness measurements.
  • the exposure control can be performed on the part closest to the capture apparatus, possibly combined with another criterion such as the elimination of close objects at the edge of the image (edge of field).
  • the white balance servo can be performed, for example, on a subject of significant size at the center of the image, possibly at the expense of a differently lit background.
  • the method includes determining a close portion and a remote portion in the image, and the white balance control performs separate measurements on these two regions to determine the presence or absence of multiple illuminants, and makes separate corrections for each of these regions. If the position of the area of interest is provided to the focus servo, the focusing action will be faster and the main subject (area of interest) can be tracked, even if it is in motion.
  • the controlled action includes providing a signal to the user indicating that the image is too close to be sharp.
  • the controlled action includes altering resolution of an image based on the measured relative sharpness.
  • the image can be the digital image.
  • the image may also be another image, for example an image from the same sensor and captured after the digital image.
  • the resolution is reduced when the image is taken at a distance from the capture apparatus that is too small to obtain a sharp image at full resolution, the final resolution being chosen to obtain a sharp image.
  • the controlled action includes providing information, or signal, used for automatic indexing of the digital image.
  • the indexing may consist of the provision of a signal indicating that it is a portrait or group of people.
  • the distinction between these two situations is made according to whether the imaged scene includes one or more close objects or subjects. If the distance of the objects or subjects is greater than a predetermined limit, then the image can be considered to represent a landscape.
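This indexing rule might be sketched as follows (a minimal sketch: the threshold value, labels and function name are assumptions for illustration):

```python
def index_scene(subject_distances, landscape_limit=3.0):
    """Crude automatic-indexing signal from per-subject distance estimates.

    landscape_limit (in meters) is an assumed threshold, not a value from
    the text: if every detected subject lies beyond it, the image is tagged
    "landscape"; one close subject suggests a "portrait", several a "group".
    """
    close = [d for d in subject_distances if d < landscape_limit]
    if not close:
        return "landscape"
    return "portrait" if len(close) == 1 else "group"
```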
  • the controlled action includes providing, to a sound capture device, distance and/or direction information, relative to the capture apparatus, of a subject or object in the digital image.
  • in a camcorder or a cameraphone, one can determine the main subject(s), determine the distances and/or directions of these main subjects, and focus the sound capture on the main subject(s), thus eliminating the background noise.
  • the directivity control of the sound capture can be performed using two microphones and a phase shift between the signals of these microphones.
  • a particular application of this latter provision is, in a videoconference, the use of a wide-angle image capture apparatus and automatic tracking of the subject who is speaking.
  • the controlled action includes setting a high compression for the background and a lower compression for the main subject(s), the main subject(s) being determined as the area of the image satisfying criteria based on the measured relative sharpness.
  • the capture apparatus comprises a sensor having pixels provided with color filters of at least two kinds, these filters being chosen so that their spectral responses have little overlap.
  • the capture apparatus includes a sensor having pixels primarily for producing the image and other pixels primarily for measuring relative sharpness.
  • the pixels primarily for measuring relative sharpness have a spectral response in a spectral band that has little overlap with the spectral band of the pixels primarily serving to produce the image.
  • the pixels primarily for producing the image have a spectral response primarily in the range visible to the human eye and the other pixels have a spectral response primarily outside the range visible to the human eye.
  • the invention also relates to a sensor thus defined, independently of a capture apparatus and method, according to the invention, defined above.
  • the invention also relates to a capture apparatus comprising such a sensor, this capture apparatus may also be used independently of the method defined above.
  • the invention also relates, according to a provision that can be used in combination with (or independently of) the provisions defined above, to a digital image capture apparatus comprising a sensor having, on the one hand, pixels whose spectral response is mainly in the domain visible to the human eye and, on the other hand, additional pixels having a spectral response mainly outside the visible spectrum, this sensor being such that the part of the image from the additional pixels has a sharpness, in at least one range of distances between the capture apparatus and the imaged scene, greater than the sharpness of the part of the image from the pixels whose spectral response is mainly in the visible range.
  • the additional pixels may be sensitive to infrared and / or ultraviolet radiation.
  • Pixels sensitive to ultraviolet radiation can be used to improve sharpness for short distances, while pixels sensitive to infrared radiation can be used to improve sharpness at great distances.
  • by infrared and/or ultraviolet is meant any part of the spectrum beyond or below the visible spectrum, including the near infrared, such as 700 to 800 nm or 700 to 900 nm, or the near ultraviolet, near 400 nm.
  • the capture apparatus is provided with a fixed optics, that is to say devoid of mechanical elements for focusing.
  • the capture apparatus is provided with a zoom lens without moving or deformable focusing element, the relative sharpness between at least two colors on at least one region R of the image being variable according to the focal length and / or the position of the object imaged with respect to the apparatus.
  • variable-focus optics comprises, for example, a single mobile or deformable optical group.
  • a zoom is made with at least two mobile groups, for example one or two for the focal length and the other for focusing.
  • the focal length and the focus are independent, i.e. when the focal length changes, it is not necessary to change the focus. This eliminates the time needed for focusing.
  • this is unlike less expensive varifocal lenses, in which the focus must be changed when the focal length varies,
  • and unlike zooms in which two mobile optical groups, linked in a complex way, are used to vary the focal length, the focus being achieved by a third group.
  • the digital image is derived from at least two sensors.
  • each sensor is dedicated to a specific color.
  • the controlled action includes adding an object to an image and / or replacing a portion of an image based on the relative sharpness measured on the digital image.
  • for example, the method adds a character next to the main subject. It is also possible, by way of example, to add an object at a given position in the image; the object will have the correct size in the image if the distance of the scene imaged at this position is taken into account.
  • the method includes capturing a sequence of images, the digital image being part of the sequence and the controlled action being performed on at least one other image of the sequence.
  • the estimate of the relative sharpness can be performed on preview images, at lower resolution, before shooting, while the correction can be performed on a permanently stored image, for example by means of a choice of filters resulting from a measurement made on the preview images.
  • the controlled action includes modifying a setting of the capture apparatus, including focal length, aperture, focus distance.
  • the camera may include an automatic adjustment program such that the aperture is as large as possible; if the subject is in front of a background, the background can then be blurred.
  • the adjustment program can also automatically adjust the aperture to the distances of the subjects in a group, so that the depth of field is sufficient for all subjects in the group to be sharp. Note that in the latter case a function is obtained automatically that is performed manually in the state of the art.
  • the controlled action comprises producing a modified raw image.
  • the digital image is preferably the raw image from the sensor ("raw" format), before any demosaicing operation.
  • the digital image may also have been processed, for example, white balance.
  • the digital image has not undergone subsampling.
  • an optical system, a sensor and image processing means can thus produce a raw image having a better quality or particular characteristics, for example an extended depth of field, while maintaining the characteristics of a raw image directly from the sensor, and in particular compatibility with the known functional blocks or components performing the raw-image-to-visible-image conversion ("image pipe" or "image signal processor").
  • the raw image has undergone demosaicing.
  • the optics of the capture apparatus exhibit strong longitudinal chromatic aberration, for example such that, for a given focus, aperture and focal length, there is at least one color for which the object distance of best sharpness is less than k·f²/(O·P), k being a coefficient less than 0.7, preferably less than 0.5, f being the focal length, O the aperture and P the smallest (among all the colors of the image) diameter of the blur spot of an object point lying at infinity.
  • the measurement of relative sharpness between two colors is obtained by comparing the result of a first measurement M applied to the first color and the result of the same measurement applied to the second color, each measurement M providing a value that is a function of, on the one hand, the sharpness of the color and, on the other hand, the content of the digital image, so that the comparison can be freed from the content of the digital image.
  • the comparison of the sharpnesses is carried out by using a measurement M on pixels of the digital image.
  • the measurement M at a given pixel P for a given color channel C corresponds to the gradient of the variation of C in a neighborhood of P. It is obtained by the following calculation, where:
  • V(P) is a neighborhood of the pixel P,
  • GM is the mean of the amplitudes of the gradients over the neighborhood V(P),
  • SM is the mean of the amplitudes of the differences between GM and the gradients over the neighborhood V(P).
  • a gradient is calculated by the magnitude of the difference in values of two pixels of the same color.
  • Gradients in the neighborhood V (P) correspond to gradients involving a predetermined number of pairs of pixels in the neighborhood V (P).
  • the measure M at the pixel P having a color C can be defined by the ratio between SM and GM. This gives a value M (P, C).
  • this measurement does not allow, in itself, the sharpness of the color C to be characterized precisely and completely. Indeed, it depends on the content of the image (type of scene imaged: textures, gradients, etc.) in the neighborhood V(P) of the pixel P.
  • a sharp transition in the imaged scene will, for the same color sharpness, generate a higher measurement M than a smooth transition in the imaged scene.
  • a transition will be present in the same way in each color, thus affecting the measurement M in the same way between the colors. In other words, when a clear transition appears on a color C, the same type of transition appears on the other colors.
  • the comparison of the measurements M makes it possible to establish the relative sharpness between a color C1 and a color C2.
  • the relative sharpness between two colors C1 and C2, measured at a pixel P, can be defined, for example, as a comparison between the two measurements M(P, C1) and M(P, C2).
  • M(P, C1) > M(P, C2) implies that C1 is sharper than C2.
  • the relative sharpness in a region R of the image can be defined by using the measurement M on all the pixels P of the region R.
  • the relative sharpness in a region R of the image may be the set or a subset of the relative sharpnesses measured for the pixels P of the region R. It may also be defined as a single value, such as the sum S of the measurements on all the pixels P of the region R for each of the colors.
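The measurement M and the comparison between two color channels can be sketched as follows (a minimal sketch: the text leaves the choice of pixel pairs open, so horizontal and vertical neighbor differences are assumed here, and function names are illustrative):

```python
import numpy as np

def measure_M(channel, y, x, radius=2):
    """Sharpness measure M at pixel (y, x) for one color channel.

    Following the text: gradients are magnitudes of differences between
    pairs of same-color pixels in the neighborhood V(P); GM is their mean,
    SM the mean absolute deviation of the gradients from GM, and
    M = SM / GM. Horizontal/vertical neighbor pairs are an illustrative
    choice of the "predetermined pairs of pixels".
    """
    v = channel[y - radius:y + radius + 1,
                x - radius:x + radius + 1].astype(float)
    grads = np.concatenate([np.abs(np.diff(v, axis=0)).ravel(),
                            np.abs(np.diff(v, axis=1)).ravel()])
    GM = grads.mean()
    SM = np.abs(grads - GM).mean()
    return SM / GM if GM > 0 else 0.0   # flat neighborhood: no gradient

def sharper_channel(c1, c2, y, x):
    """Return 1 if channel c1 is sharper than c2 at (y, x), else 2
    (per the text, M(P, C1) > M(P, C2) implies C1 is sharper)."""
    return 1 if measure_M(c1, y, x) > measure_M(c2, y, x) else 2
```

A step edge (sparse, large gradients) yields a higher M than a smooth ramp (uniform gradients), which is what makes the ratio usable as a sharpness indicator largely independent of image content.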
  • when the controlled action is to determine the position of the main subject in the image, the controlled action may further comprise automatic framing, including centering the image on the main subject.
  • the method may be implemented in an image capture or image processing apparatus or device.
  • These apparatuses or devices are part of the group comprising: an electronic component, integrating a sensor or not, an electronic subassembly integrating an optic, a sensor and possibly an image processing module ("camera module"), or any other form as defined above.
  • FIGS. 1a and 1b are representative diagrams of the longitudinal chromatic aberration of a convergent lens
  • FIG. 2 already described, is the color spectral diagram of an image
  • FIGS. 3a and 3b are diagrams representing the improvement of the sharpness of a color by means of the same sharp color, according to the invention
  • FIG. 4 is a diagram showing the improvement of the sharpness of a color by means of different distinct colors associated with distinct regions of an image according to the invention
  • FIGS. 5, 6 and 7 are diagrams representing improvement of the sharpness of a color by means of different distinct colors associated with the whole of an image according to the invention
  • FIG. 8 is a diagram showing the servo-control of an apparatus according to a sharpness difference between the sharp color and the color to be improved, according to the invention
  • FIG. 9 is a diagram representing the choice of a sharp color from a distance measured between an object and a device capturing the image of this object
  • FIG. 10 is a diagram showing the reduction of the sharpness of at least one color in at least one region of the image
  • FIG. 11 is a diagram of an apparatus obtained by the method according to the invention.
  • FIG. 12 is a diagram showing steps of the method according to the invention.
  • FIG. 13 shows a mode of adjustment in accordance with the invention
  • FIGS. 14a and 14b form a set of diagrams showing settings used in the context of the invention
  • FIGS. 15, 15a, and 15b illustrate a property of an image capture apparatus according to the invention and of a conventional apparatus
  • FIGS. 16a to 16d are diagrams showing the properties of an optical system of an apparatus according to the invention and of a conventional apparatus
  • FIGS. 17a and 17b are diagrams showing an example of optical system selection for an apparatus according to the invention.
  • FIG. 18 is a diagram illustrating characteristics of a camera according to the invention
  • FIGS. 18.1 and 18.2 represent means for implementing the method according to the invention
  • Figures 19.1, 19.2 and 19.3 represent steps of the method according to the invention according to several embodiments
  • Figures 20.1 and 20.2 show other embodiments of the invention.
  • the method described below improves the sharpness of at least one color of a digital image by choosing from the colors of the image at least one color called the "sharp color" and by carrying over the sharpness of the sharp color to at least one other color to be improved, as shown below with the help of Figures 3a and 3b.
  • More precisely, FIG. 3a shows the sharpness (ordinate axis 7.2) of two colors 13.1 and 13.2 as a function of the distance of the objects they represent in the image considered from the apparatus having captured the image.
  • the sharpness of these two colors varies differently depending on this distance but overall, in this example, the first color 13.2 has a better sharpness than that of a second color 13.1 of the same image.
  • CA, CO and CN are respectively values representative of the improved color, the original color (to be improved) and the sharp color.
  • the sharp color is the first color.
  • the original color and the improved color correspond to the second color before and after treatment.
  • the carrying over of the sharpness to the second color is performed using a filter F, according to a formula of the type:
  • CA = CN + F(CO − CN)
  • the filter F will have the particularity of removing the details of the image to which it is applied.
  • a linear low-pass filter or averager
  • one of the numerous known nonlinear filters having the particularity of removing details such as, for example, a median filter.
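The formula CA = CN + F(CO − CN) can be illustrated with an averaging filter as the detail-removing filter F (a minimal sketch: the filter size and the function names are illustrative choices, not values from the text):

```python
import numpy as np

def box_filter(img, size=5):
    """Simple averaging (low-pass) filter: one illustrative choice
    for the detail-removing filter F."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (size * size)

def transfer_sharpness(CN, CO, size=5):
    """CA = CN + F(CO - CN): carry the details (high frequencies) of the
    sharp color CN over to the color to be improved CO, while keeping
    the low-frequency content of CO."""
    return CN + box_filter(CO - CN, size)
```

Because F removes details, the high-frequency content of CA comes from CN (the sharp color), while the smooth, low-frequency content still comes from CO.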
  • the human retina has a particularly high sensitivity, with respect to the details of an image, to the color green, so that the adjustment of optical systems generally aims to obtain a high sharpness for this color over a certain focusing range (see, for example, pages 30 to 33 of the book "Color Appearance Models" by Mark D. Fairchild, published by Addison-Wesley).
  • an optical device delivering images whose sharpness is not satisfactory for the human eye may nevertheless have a satisfactory sharpness for one of its colors, such as blue or red, for which the eye is less sensitive to details.
  • the sharpness of a distant object is generally favored using the green color, while the sharpness of a near object is improved by considering the blue color.
  • the sharp color used to improve the sharpness of a color may vary according to the regions of the image; such a method is described below with the help of Figure 4, which shows an image 10 comprising two regions 11.1 and 11.2.
  • the improvement of a color in the region 11.2 is done by considering the color 8.3 as the sharp color, while the improvement of a color in the region 11.1 is done by considering the color 8.2.
  • the regions of an image may be predetermined or not.
  • a region may be a spatial area delimited by one or more pixels.
  • the color 8.2 is selected as the sharp color in the region 11.1, while the color 8.3 is the sharp color in the region 11.2.
  • FIG. 5 represents the sharpness (axis of ordinates 7.2) of two colors 8.2 and 8.3 as a function of the distance (7.1) between at least one object of the scene captured to obtain said image and the apparatus of capture.
  • a method according to the invention can consider the color 8.3 as the sharp color, used to correct the sharpness of a color, over the range of distances 9.1, while the color 8.2 is considered the sharp color to improve a color for an object of the captured scene located at a distance from the capture apparatus in the range 9.2.
  • the sharpness of the colors on the image can be improved towards a profile as shown in the diagram 6, namely the juxtaposition of the sharpest colors on the image.
  • the sharpest color in a region of the image may vary depending on the geometric position of the region and/or other image capture parameters such as focal length, aperture, focus, etc. To determine the sharpest color in the sense of the invention, it is not necessary to know the parameters indicated above.
  • the choice of the sharp color can also be determined by the software activation of at least one image capture mode, such as a macro mode as described later. In such a context, the image can be considered as a single region.
  • a threshold 8.1 indicates the required level of sharpness, beyond which the image is considered blurred.
  • such a threshold 8.1 defines the depth of field, that is to say the range 9.2 of distances between at least one object of the scene captured to obtain said image and the capture apparatus, such that the image of the object is sharp.
  • a consequence of the invention is therefore to allow an extension of the depth of field of an optical system as detailed below with the help of Figure 9.
  • the depth of field of a capture device initially limited by the sharpness of the color 8.2 and the sharpness threshold 8.1, is increased by using a second color 8.3 having a satisfactory sharpness (below the threshold 8.1) over a new range of distances between at least one object of the scene captured to obtain said image and the capture apparatus.
  • such an application is implemented in fixed focus cameras, such as cameraphones.
  • the optical design of these devices provides a range of sharpness for great distances up to a few tens of centimeters at best on the basis of a green color, similar to the color 8.2 of Figure 5.
• the blue color does not focus in the same way: it can present sharpness at distances smaller than the green color, similar to the color 8.3.
• the invention makes it possible to increase the sharpness of an image captured at a short distance from a cameraphone by attributing to the green color, and to the other colors, the sharpness of the blue color, thereby increasing the depth of field of the device as a corollary.
• the method determines a servo control setpoint for the capture apparatus considered, from the sharpness of at least two colors of the captured image, so that focusing takes fewer steps and is therefore faster.
• a distance 17.1 between at least one object of the imaged scene and the optical system 1 capturing the image can be determined using the different levels of sharpness (ordinate axis 7.2) of the colors 8.2 and 8.3 measured in the region 11.3 relating to the image of the object.
  • a macro function is provided to allow imaging of objects near the capture apparatus within a predetermined range of distances, referred to as the macro range of distances 9.1, to the apparatus.
  • a capture apparatus makes it possible to move all or part of the optics to perform the macro function.
  • the method or system that is the subject of the invention makes it possible to dispense with such a displacement.
• the sharpest color is predetermined for the macro range 9.1, for example by measuring the sharpness 8.2 and 8.3 of each color of the digital images obtained by the capture apparatus, by producing digital images of objects located at different distances from the capture apparatus.
• the sharpest color (Figure 5) is the one corresponding to 8.3. This predetermination can be carried out definitively, for example at the time of the design of the apparatus, or when using the device.
• when the macro function is activated, the sharpness of the sharp color so predetermined is then reflected onto the other colors, as described above.
  • the sharpness of the digital image can be calculated by a conventional method or by using the method according to the invention applied to the range of distances 9.2.
• the macro mode can thus be activated in a software manner on the device or on any other device processing the image.
• this software activation can be done conventionally before the capture of the image, but also after this capture, and on the capture device itself or on a local or remote device.
  • the activation of the macro mode can be done automatically, for example by determining the sharpest image between the image generated in normal mode and the image generated in macro mode.
• the macro function performed according to the invention also benefits an apparatus comprising parameters that are variable at the time of capturing the digital image and that influence the sharpness of the colors, in particular a capture apparatus with a zoom, and/or an optic with variable focus and/or variable aperture.
  • the sharpness curves 8.2 and 8.3 corresponding to the value of the variable parameters according to the digital image are then used.
• the addition of the macro function allows the capture of bar codes, business cards, or manuscripts containing text and/or diagrams by an image capture device, including a phone or a camera.
• the depth of field is the range of distances between the objects of the scene and the image capture apparatus for which a sharp digital image is obtained.
• a capture apparatus has a limited depth of field, which is all the smaller as the aperture of the optics is large.
• the digital image is decomposed into regions 11.1 and 11.2, for example square regions corresponding to 9 neighboring sensitive elements of the sensor or, more generally, regions corresponding to X by Y sensitive elements, or regions of predetermined shape or of shape calculated according to the digital image.
• the sharpest color is chosen, for example, as the color corresponding to the lowest value among the values obtained by calculating a gradient for each color from the gray levels corresponding to the color in the region considered.
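A minimal sketch of the per-region selection is given below; it is not the patented implementation. Note that the comparison direction depends on the measure used: with a blur measure (as the lowest-value criterion in the text suggests) the sharpest plane gives the smallest value, whereas with the gradient-energy measure used here it gives the largest. All names and values are illustrative assumptions.

```python
# Hypothetical sketch: estimate, per region, which color plane is sharpest
# by comparing a simple gradient-energy measure of each plane's gray levels.
# With this measure, the sharpest plane yields the LARGEST value.

def gradient_energy(plane):
    """Sum of absolute horizontal and vertical differences in a 2-D region."""
    h = sum(abs(row[x + 1] - row[x]) for row in plane for x in range(len(row) - 1))
    v = sum(abs(plane[y + 1][x] - plane[y][x])
            for y in range(len(plane) - 1) for x in range(len(plane[0])))
    return h + v

def sharpest_color(region_by_color):
    """region_by_color: dict mapping a color name to a 2-D list of gray levels."""
    return max(region_by_color, key=lambda c: gradient_energy(region_by_color[c]))

# Illustrative 3x3 region: the blue plane has a crisp edge, green is nearly flat.
region = {
    "green": [[100, 102, 101], [101, 100, 102], [100, 101, 100]],
    "blue":  [[20, 20, 220], [20, 20, 220], [20, 20, 220]],
}
print(sharpest_color(region))  # -> blue
```

In practice the regions would tile the whole image, as described for regions 11.1 and 11.2 above.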
  • the color corresponding to curve 8.3 is sharper for region 11.2 while the color corresponding to curve 8.2 is sharper for region 11.1.
• the digital image of near objects – distances to the capture apparatus within the range 9.1 – is sharp for the color corresponding to curve 8.3 of FIG. 5 (for example, blue), whereas it is less so for the color corresponding to curve 8.2 (for example, green).
• the digital image of distant objects – distances to the capture apparatus in the range 9.2 – is sharp for the color corresponding to curve 8.2, whereas it is less so for the color corresponding to curve 8.3.
• since the eye is much more sensitive to sharpness in the green than in the blue, it will perceive a sharpness corresponding to the curve 8.5 of Figure 7.
• Figure 6 represents, by the curve 8.4, the sharpness obtained in each color after use of the method according to the invention: the blue made it possible to obtain a sharpness better than the threshold 8.1 for close objects, located in the range of distances 9.1, whereas the green made it possible to obtain a sharpness better than the threshold 8.1 for distant objects, located in the range of distances 9.2.
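The shape of curve 8.4 can be sketched numerically: once the sharpest color's sharpness is reflected onto the others, the effective blur at each distance is the minimum of the per-color blur curves, so the depth of field becomes the union of each color's sharp range. The curve samples below are illustrative assumptions, not values from the patent's figures.

```python
# Hypothetical sketch of curve 8.4: the pointwise minimum over the per-color
# blur curves, so the depth of field is the union of each color's sharp range.

def combined_blur(blur_by_color):
    """Pointwise minimum over the color blur curves (the analogue of curve 8.4)."""
    return [min(values) for values in zip(*blur_by_color)]

distances = [10, 20, 40, 80, 160, 320]       # cm
blur_blue = [2.0, 1.2, 2.4, 4.0, 6.0, 8.0]   # sharp close up (range 9.1)
blur_green = [8.0, 5.0, 2.4, 1.3, 1.0, 1.6]  # sharp far away (range 9.2)

curve_8_4 = combined_blur([blur_blue, blur_green])
print(curve_8_4)   # -> [2.0, 1.2, 2.4, 1.3, 1.0, 1.6]

threshold = 2.5
print([d for d, b in zip(distances, curve_8_4) if b <= threshold])
# -> [10, 20, 40, 80, 160, 320]: every sampled distance is now sharp
```

Neither color alone stays below the threshold over the whole range; their juxtaposition does.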
• the depth of field is increased without increasing the cost, complexity or bulk of the optics, and without the need to change the exposure, i.e., without reducing the aperture, increasing the noise level, or increasing motion blur.
• the increase in depth of field allows the capture of barcodes, business cards, or manuscripts containing text and/or diagrams, as well as portraits or landscapes, by an image capture apparatus, including a phone or a camera. This is possible without using expensive autofocus or macro devices. Moreover, compared with a manual mechanical macro device, this function operates fully automatically, without any intervention by the user.
• the increase in depth of field produced according to the invention also benefits an apparatus having parameters that are variable at the time of capturing the digital image and that influence the sharpness of the colors, in particular a capture apparatus with a zoom, and/or an optic with variable focus and/or variable aperture.
  • the sharpness curves 8.2 and 8.3 corresponding to the value of the variable parameters according to the digital image are then used.
• the method and device according to the invention then make it possible to choose or design, as described later using FIGS. 11 to 17b, during the design of the capture apparatus, an optic with a more limited number of focus positions, which has the advantage of reducing the design constraints on the optics and thus of reducing costs. This also enables faster and less expensive development by decreasing the necessary accuracy of the servo mechanism. For example, to obtain an optic with a great depth of field, one can choose or design an optic characterized in that the union of the sharp distance ranges of each color is as large as possible.
• to obtain an optic with a large aperture, it is possible to choose or design an optic having a single sharp color in each range of distances, such that the union of the sharp distance ranges of all the colors corresponds to the desired depth of field.
  • one can also optimize both the aperture of the camera and the depth of field of the image.
  • a method and a device are also provided for reducing longitudinal chromatic aberrations of a digital image.
• a capture device may use a hardware device to measure the distance of objects in a scene, based on a laser, an infrared device, a pre-flash of the flash, etc.
• the digital image is decomposed into regions 11.3, for example square regions corresponding to 9 neighboring sensitive elements of the sensor or, more generally, regions corresponding to X by Y sensitive elements, or regions of predetermined shape or of shape calculated according to the digital image.
• the sharpness of at least two colors is then measured for each region 11.3; the measured values, or the measured relative values 16.1 and 16.2, are then located on the corresponding sharpness curves 8.2 and 8.3 of the capture apparatus.
  • a distance 17.2 corresponding to an estimate of the distance 17.1 between the part of the object 4 represented on the region 11.3 and the capture apparatus is then obtained.
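The distance estimation just described can be sketched as a lookup on the calibrated curves: the measured relative sharpness of the two colors in a region is matched against the ratio of curves 8.3 and 8.2 to recover a distance. The helper name `estimate_distance` and all curve samples are illustrative assumptions; a real apparatus would interpolate calibrated curves.

```python
# Hypothetical sketch: estimate the distance of an object from the relative
# sharpness of two colors measured in a region, by finding the distance at
# which the sharpness curves show the same blur ratio.

def estimate_distance(distances, blur_a, blur_b, measured_ratio):
    """Return the sampled distance whose blur ratio blur_a / blur_b is
    closest to the ratio measured on the image region."""
    ratios = [a / b for a, b in zip(blur_a, blur_b)]
    best = min(range(len(distances)), key=lambda i: abs(ratios[i] - measured_ratio))
    return distances[best]

distances = [10, 20, 40, 80, 160]        # cm
blur_blue = [1.0, 1.5, 2.5, 4.0, 6.0]    # analogue of curve 8.3
blur_green = [6.0, 4.0, 2.5, 1.5, 1.0]   # analogue of curve 8.2

# In the region, blue was measured twice as blurred as green:
print(estimate_distance(distances, blur_blue, blur_green, 2.0))
# -> 80 (cm)
```

Because only the ratio of the two blurs is used, the estimate is insensitive to scene content that blurs both colors equally.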
  • the distance measurement performed according to the invention benefits especially fixed optics, including telephones.
• the distance measurement produced according to the invention also benefits an apparatus comprising parameters that are variable at the moment of capturing the digital image and that influence the sharpness of the colors, including a capture apparatus with a zoom, and/or an optic with variable focus and/or variable aperture.
  • the sharpness curves 8.2 and 8.3 corresponding to the value of the variable parameters according to the digital image are then used.
• the method then makes it possible to obtain a distance estimate for the objects present in each region of the digital image. This allows, for example:
• the distance to be displayed in real time on the image;
• the distance information to be used to guide a robot.
• with reference to FIGS. 4, 5, 6 and 7, an embodiment of the method and system according to the invention will now be described, more particularly adapted to control of the depth of field, without the need for a particular mechanical device in a known image capture apparatus.
• the method then makes it possible to obtain a sharp image for objects located at a distance from the capture apparatus corresponding to a range of sharpness, and a blurred image for the other objects.
• a capture apparatus has a limited depth of field, all the more so as the aperture of the optics is large; depth of field and exposure are thus linked, so that in low light a choice must be made between depth of field, noise, and motion blur.
• the digital image is decomposed into regions 11.1 and 11.2, for example square regions corresponding to 9 neighboring sensitive elements of the sensor or, more generally, regions corresponding to X by Y sensitive elements, or regions of predetermined shape or of shape calculated according to the digital image.
• the sharpest color is chosen, for example, as the color corresponding to the lowest value among the values obtained by calculating a gradient for each color from the gray levels corresponding to the color in the region considered.
• the color corresponding to curve 8.2 is sharper for region 11.2, while the color corresponding to curve 8.3 is sharper for region 11.1.
• the depth of field is controlled without the need to change the exposure, i.e., without changing the aperture, increasing the noise level, or increasing motion blur.
• the depth of field control performed according to the invention benefits in particular fixed optics, especially telephones. Depth of field control allows the capture of barcodes, business cards, or manuscripts containing text and/or diagrams, as well as portraits or landscapes, by an image capture apparatus, including a phone or camera. This is possible without the use of expensive large-aperture optics. Moreover, this control can be realized fully automatically, without any intervention by the user.
  • the depth of field control performed according to the invention also benefits an apparatus comprising a mobile optics, in particular a zoom.
• a skilled amateur can thus control depth of field and exposure directly or indirectly, and independently of each other.
  • Fig. 11 is a diagram illustrating the architecture of an image capture or rendering apparatus.
• Such an apparatus, for example an image capture apparatus, comprises an optical system 122, in particular one having one or more optical elements such as lenses, intended to form an image on a sensor 124.
• this sensor may be of another type, for example a photographic film in the case of a so-called film ("silver halide") camera.
• Such an apparatus also comprises a servo control system 126 acting on the optical system 122 and/or on the sensor 124, in order to focus so that the image plane lies on the sensor 124, and/or so that the amount of light received by the sensor is optimal, by setting the exposure time and/or aperture, and/or so that the colors obtained are correct, by controlling the white balance.
  • the apparatus comprises digital image processing means 128.
• these digital image processing means may be separate from the apparatus 120. It is also possible to provide a portion of the image processing means in the apparatus 120 and a portion outside it. The digital image processing is performed after the image has been recorded by the sensor 124.
• An image rendering apparatus has a structure analogous to that of an image capture apparatus. Instead of a sensor 124, there is provided an image generator 124' receiving images from digital image processing means 128' and supplying them to an optical system 122', such as a projection optical system.
• the invention consists, according to one of its aspects, which can be used independently of the previously described aspects, in taking advantage of the capabilities of the digital image processing means 128, 128' to determine or select the parameters of the optical system 122, 122' and/or the sensor or image generator 124, 124' and/or the servo system 126.
  • FIG. 12 shows the level of performance that can be achieved with each of the components of the apparatus when associated with digital image processing means. These levels are represented by the broken line 130 for the optical system, the broken line 132 for the sensor, the broken line 134 for the servo, and the broken line 136 for the device.
  • the level of performance of the optical system can be established at level 130 '
• the performance levels of the sensor and of the servo system can be established at levels 132' and 134', respectively.
• the level of performance of the apparatus would be at the lowest level, for example the level 136' corresponding to the lowest level 130' of the optical system.
  • the digital image processing means are preferably those described in the following documents: Patent Application EP 02751241.7 entitled:
  • Patent Application EP 02747504.5 for: "Method and system for reducing the frequency of updates of image processing means".
• Patent Application EP 02748934.3 for: "Method and system for correcting the chromatic aberrations of a color image produced by means of an optical system".
• Patent Application EP 02743348.1 for: "Method and system for producing formatted information related to geometric distortions".
  • Patent Application EP 02748933.5 for: "Method and system for providing, in a standard format, formatted information to image processing means".
  • Patent Application EP 02747506.0 for: "Method and system for producing formatted information related to the defects of at least one apparatus of a chain, in particular with respect to blurring".
• Patent Application EP 02745485.9 for: "Method and system for modifying a digital image by taking into account its noise".
  • Patent Application PCT / FR 2004/050455 for: "Method and system for modifying a digital image in a differentiated and quasi-regular pixelwise manner".
• an optical system can distort images in such a way that a rectangle is deformed into a pincushion shape, with each side convex, or into a barrel shape, with each side concave.
• - The chromatic aberrations of the optical system: if an object point is represented by three colored spots having precise positions with respect to one another, chromatic aberration results in a variation of the position of these spots with respect to one another, the aberrations generally being all the more important as one moves away from the center of the image.
• - Parallax: when an adjustment is made by deformation or displacement of an optical element of the optical system, the image obtained in the image plane can move.
  • the setting is, for example, a focal length adjustment, or a focus adjustment.
• This defect is illustrated in FIG. 13, which shows an optical system 140 with three lenses, in which the center of the image has position 142 when the lens 144 is in the position shown in solid lines.
• when the lens 144 is displaced, the center of the image takes the position 142'.
• Depth of field: when the optical system is focused on a specific object plane, the images of this plane remain sharp, as do the images of objects close to this plane.
  • depth of field is the distance between the nearest object plane and the farthest object plane for which the images remain sharp.
• Vignetting: in general, the brightness of the image is maximum at the center and decreases as one moves away from the center. Vignetting is measured as the difference, in percent, between the brightness at a point and the maximum brightness.
• the noise of the image is generally defined by its standard deviation, its shape, the size of the noise spot, and its coloration.
• the moiré phenomenon is a distortion of the image that occurs when high spatial frequencies are present.
• Moiré is corrected by adjusting the anti-aliasing filters.
  • the contrast is the ratio between the highest and the lowest brightness values of the image for which details of the image are still visible.
• the contrast (FIG. 14a) of an image can be improved, that is to say, the range of luminosities over which details can be distinguished can be extended (FIG. 14b). This extension is performed using a particular algorithm for contrast and noise correction.
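The patent does not disclose its particular contrast/noise algorithm; the sketch below is only a generic linear contrast stretch illustrating the idea of FIG. 14a/14b, i.e., extending the range of luminosities over which details remain distinguishable. The function name and the sample values are assumptions.

```python
# Generic illustration (not the patented algorithm): map the useful gray
# range [lo, hi] onto the full [0, 255] range so that details compressed
# into a narrow band of luminosities become distinguishable.

def stretch_contrast(pixels, lo, hi):
    """Linearly expand gray levels so that [lo, hi] maps to [0, 255]."""
    scale = 255.0 / (hi - lo)
    return [max(0, min(255, round((p - lo) * scale))) for p in pixels]

print(stretch_contrast([60, 80, 100, 120], lo=60, hi=120))
# -> [0, 85, 170, 255]
```

A production algorithm would also account for noise amplification, which a naive stretch ignores.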
  • the image surface of an object plane does not constitute a perfect plane but has a curvature, called field curvature.
• This curvature varies according to various parameters, including focal length and focus.
  • the position of the image plane 150 depends on the area on which the focus is made.
  • the plane 150 corresponds to a focus in the center 152 of the image.
  • the image plane 156 is closer to the optical system 122 than the image plane 150.
• the image plane is placed at a position 158, intermediate between position 154 (corresponding to a focus on an area near the edge of the image) and position 150.
• the combination of the digital image processing means 128 with the focus servo control 126 makes it possible to limit the displacement of the image plane 158 during focusing, which reduces the energy consumption of the servo system and makes it possible to reduce the volume of its components.
  • the diagram of FIG. 15a shows the blur properties with a conventional focus servo system in which the maximum sharpness is obtained in the center of the image.
• the image field is plotted on the abscissa and the blur value, expressed in BxU, on the ordinate.
• the blur is 1.3 in the center and 6.6 at the edge of the image.
• FIG. 15b is a diagram similar to that of FIG. 15a, showing the properties of a servo control of an apparatus produced according to the invention, on the assumption that the digital image processing means make it possible to correct blur up to a BxU value of 14.
• this value is the limit up to which blur is correctable by the digital processing means.
• the digital image processing means comprise sharpness improvement means such that focus servo control can be dispensed with.
  • FIGS. 16a, 16b, 16c and 16d show the characteristics of an apparatus obtained according to the conventional technique and those of an apparatus obtained with the method according to the invention.
  • the conventional device is a digital photography device integrated into a mobile phone having a VGA sensor, ie a 640 x 480 resolution without a focus system.
• the conventional apparatus has an aperture of 2.8, whereas the apparatus obtained with the method according to the invention has an aperture of 1.4.
• FIG. 16a, which corresponds to the conventional apparatus, is a diagram in which the percentage of the image field is represented on the abscissa, the origin corresponding to the center of the image. The ordinate represents the vignetting V.
  • Figure 16b is a similar diagram for an apparatus obtained according to the invention.
• in FIG. 16a the vignetting reaches the value 0.7 at the edge of the image, while the diagram of FIG. 16b shows that the optical system of the apparatus according to the invention presents a significantly larger vignetting, of the order of 0.3.
• the correction limit of the algorithm used is 0.25. In other words, thanks to the correction algorithm, an optic with substantially larger vignetting can be used.
  • FIG. 16c is a diagram representing, on the ordinate, the blur, expressed in BXU, as a function of the field of the image (in abscissas) for a conventional apparatus.
• the blur is 1.5 in the center and 4 at the edge of the image.
  • the diagram of FIG. 16d also represents the blur for the optics of the apparatus obtained with the method according to the invention.
• the image field is also represented on the abscissa and the blur, expressed in BxU, on the ordinate.
• the blur in the center of the image is of the order of 2.2. It is therefore greater than the blur in the diagram of FIG. 16c.
• a blur of about 3 was chosen, taking into account the limit of the correction algorithm.
• an optic degraded with regard to sharpness in the center was thus chosen, while the same results as with the conventional apparatus are obtained, with, in addition, a larger aperture.
• the optics of the apparatus according to the invention present a quality analogous to that of conventional optics, this result being obtained thanks to the accepted degradation of the vignetting with respect to the classic optics.
• FIGS. 17a and 17b show characteristics of different optical systems between which a choice must be made in order to make a capture apparatus using the method according to the invention.
• in FIG. 17a, the optical system provides a small image spot 1100.
• This system has a modulation transfer function (MTF) represented by a diagram in which the spatial frequencies are plotted on the abscissa.
  • the value of the cutoff frequency is fc.
• the MTF comprises a plateau 1110 in the vicinity of zero frequency and a part decreasing rapidly toward the value fc.
• the optic represented by the diagram of FIG. 17b has an image spot 1114 of dimensions substantially greater than the image spot 1100, and its MTF has the same cutoff frequency fc as in the case of FIG. 17a.
• the variation of this MTF as a function of spatial frequency is different, however: it decreases relatively regularly from the origin toward the cutoff frequency.
• the optic shown in FIG. 17b will provide more detail than the optic shown in FIG. 17a, despite the fact that its image spot is larger than in the case of FIG. 17a. The optic corresponding to FIG. 17b will therefore be chosen.
  • CMOS or CCD sensors are often sensors formed from a so-called Bayer pixel mosaic.
• the Bayer mosaic consists of a succession of 2x2 pixel blocks, formed of 2 green pixels (that is to say, photosites sensitive to light in a spectral range around 550nm), a red pixel (spectral range around 600nm) and a blue pixel.
  • the spectral bands of green, red and blue differ and have a greater or lesser coverage.
• a strong overlap between these three bands has the effect of reducing the sensitivity of the sensor to colors.
  • a strong overlap between the three bands also reduces the differences in sharpness between the colors, thus reducing in particular the range of distances for which at least one of the three colors is clear.
• This adaptation can be conducted jointly with the design of the optics, depending on the constraints on digital image processing.
Description of a sensor optimizing the method according to the invention
• the sensor and/or the optical system are more particularly adapted to applications providing precise indications of the distances of the imaged objects.
  • a mosaic of Bayer pixels is used.
• the accuracy of distance measurements according to the method depends in particular on the variation of the relative sharpness as a function of distance. This variation depends on the amount of chromatic aberration that can be achieved with the capture system (sensor and optics). However, the spectral range of visible light, and therefore of the light useful for a photograph, is relatively small: of the order of 400nm to 700nm. Consequently, the variation of relative sharpness as a function of distance is limited with a conventional Bayer sensor.
• a simple way to increase this variation is to use, in addition to the three classic colors (red, green and blue), a different spectral band, for example 800nm-900nm, or any other band above and/or below the visible spectrum.
• the pixels sensitive to this fourth spectral band will not necessarily be useful for reconstructing the visible image but will be used primarily to estimate the distance of the objects, by comparing the relative sharpness of this fourth spectral band with that of one, or several, of the three classic colors.
  • a photographic apparatus is thus obtained which makes it possible to provide more precise indications of the distance of the imaged objects from all the NxM pixels of the image.
• in FIG. 20.2, one starts from a conventional Bayer mosaic in which three R, G, B pixels are kept and a U pixel, corresponding to a portion of a UV or infrared spectral band, is provided.
• by infrared and/or ultraviolet is meant any part of the spectrum above or below the visible spectrum, including the near infrared, such as 700 to 800nm or 700 to 900nm, or the near ultraviolet, near 400nm.
• This U pixel is used to enhance the sharpness of the visible colors, as shown in the diagram of Figure 20.1.
• the distances "d" of the imaged objects to the capture apparatus are plotted on the abscissa, and the diameter "D" of the blur spot on the ordinate.
  • the curves, 20.3, 20.4, 20.5, and 20.6 represent the variation of the diameter "D” as a function of the distance "d” for, respectively, the red “R”, the green “G”, the blue “B” and the ultraviolet “U”.
  • the line 20.7 represents the sharpness threshold defining the depth of field.
• the distance "d1" represents the depth of field limit for a capture apparatus comprising "RGB" pixels but not the U pixels, while using the sharpness enhancement method according to the invention.
  • the distance “d2” represents the depth of field limit obtained with a capture apparatus comprising the sensor shown in Figure 20.2 and using the sharpness enhancement method according to the invention.
• the "U" pixels only serve to reflect the sharpness of the "U" band onto the "RGB" colors for objects located between the distances "d1" and "d2". Thus the final image will only comprise the three colors "RGB" (or any other known visible color distribution).
  • pixels that are sensitive to the near infrared are added to improve the sharpness at greater distances.
• the optics can be designed so that, over a wide range of distances, the smallest of the spot diagram diameters (among the three colors) is below a first predetermined threshold, and the largest of the spot diagram diameters among the three colors is below a second predetermined threshold.
• the two thresholds are determined as a function, for example, of the capabilities and constraints of the digital image processing, on the one hand (such as the size of the filter "F" described below), and of the characteristics of the image sensor, on the other hand.
• Figure 18 shows an example of BxU measurements (ordinate axis) for the three RGB color planes as a function of distance (abscissa) for an optic designed in this way.
• the values shown are those at the center of the image field. At each point of the image field, different but similar curves could be measured.
• S1 and S2 are the two thresholds described above. The range of distances satisfying the two criteria described above is then, for this optic, approximately 12cm to infinity (d1 → infinity in FIG. 18), which means that it will be possible to reconstruct a sharp image for scenes imaged in this range of distances.
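The two design criteria read off Figure 18 can be sketched numerically: at each candidate distance, the smallest per-color blur must stay below S1 (some color can serve as the sharp reference) and the largest below S2 (every color remains correctable). The sample values below are illustrative assumptions, not measurements from the patent.

```python
# Hedged sketch of the two optic-design criteria: over the target range of
# distances, min(per-color blur) <= S1 and max(per-color blur) <= S2.

def meets_criteria(blur_by_color, s1, s2):
    """True at a given distance when the sharpest color is below s1
    and even the blurriest color is still below s2 (i.e., correctable)."""
    return min(blur_by_color) <= s1 and max(blur_by_color) <= s2

S1, S2 = 1.5, 6.0
# (distance in cm, [BxU of red, green, blue]) -- illustrative samples.
samples = [(8,   [7.0, 5.0, 1.2]),
           (12,  [5.0, 3.0, 1.0]),
           (50,  [2.0, 1.0, 3.0]),
           (500, [1.2, 2.0, 5.5])]

print([d for d, blurs in samples if meets_criteria(blurs, S1, S2)])
# -> [12, 50, 500]: sharp reconstruction possible from about 12 cm onward
```

At 8 cm the red blur exceeds S2, so that distance falls outside the usable range, matching the ~12cm lower limit cited above.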
• Optics having longitudinal chromatic aberrations can also be used such that, for a given focus, aperture and focal length, there is at least one color for which the object distance of best sharpness is lower.
• the image is blurred, that is, the spot diagram occupies a spot of more than X pixels in diameter, X being a predetermined parameter defining the depth of field limit.
• a subsampling of the image will reduce the size of the blur spot by a factor depending on the type of subsampling used, but typically of the order of magnitude of the subsampling factor considered.
• a sharp but lower-resolution image can then be generated from the digital image by selecting the subsampling factor such that, once the image has been subsampled, the blur spot is smaller than a given threshold.
• the subsampling described above is performed first, before the sharpness increase according to the invention is carried out.
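The factor selection just described can be sketched as follows. The factor search and the naive decimation scheme are illustrative assumptions; the patent only requires that the subsampled blur spot fall below the threshold, not this particular procedure.

```python
# Hedged sketch: pick the smallest integer subsampling factor that brings
# the blur spot below a diameter threshold, then subsample by decimation.

def choose_factor(blur_diameter_px, threshold_px):
    """Smallest integer k with blur_diameter_px / k <= threshold_px."""
    k = 1
    while blur_diameter_px / k > threshold_px:
        k += 1
    return k

def decimate(plane, k):
    """Keep every k-th pixel in both directions (naive subsampling)."""
    return [row[::k] for row in plane[::k]]

k = choose_factor(blur_diameter_px=9.0, threshold_px=2.0)
print(k)  # -> 5: a 9-pixel blur spot shrinks to 1.8 pixels after /5

image = [[x + 10 * y for x in range(8)] for y in range(8)]
print(len(decimate(image, k)))  # -> 2 rows remain of the 8x8 image
```

A real pipeline would low-pass filter before decimating to avoid aliasing; decimation alone is only the simplest illustration.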
• the processing comprises a modification of sharpness for each pixel of the zone Z' by means of a filter mixing the pixel values over a predetermined neighborhood of each pixel, the parameters of the filter being a function of the measured relative sharpness.
• an image capture device provided with an optic may have different sharpness depending on the color planes and on the distance of the imaged objects.
  • the dependence of the sharpness (or blur) on the distance of the imaged objects makes it impossible to increase the sharpness by means of a predetermined treatment such as a predetermined sharpness filtering.
  • An alternative embodiment of the invention consists in choosing or adapting the sharpness filters to the measured relative sharpnesses.
• the filter M can modify the value of the pixel P as a function of the values of the pixels in a neighborhood of the pixel P over the set of three colors.
• for example, the filter M can be chosen as an operator performing the following operations:
• GA = GN + c_GG * M_GG(GN) + c_GR * M_GR(RN) + c_GB * M_GB(BN)
• RA = RN + c_RG * M_RG(GN) + c_RR * M_RR(RN) + c_RB * M_RB(BN)
• BA = BN + c_BG * M_BG(GN) + c_BR * M_BR(RN) + c_BB * M_BB(BN)
• the M_{R,G,B}{R,G,B} represent filters, which can be chosen as linear zero-sum filters, such as high-pass frequency filters.
• the c_{R,G,B}{R,G,B} represent weights weighting the impact of each filter M_{R,G,B}{R,G,B}.
• This filtering example can also reflect the sharpness of the sharpest color onto the others.
• the high-pass filters M_{R,G,B}{R,G,B} will give values close to 0 when applied to the green and red colors, which are blurred in this example.
  • GA will therefore be equal to GN plus c_GB * M_GB (BN), that is to say GN plus the high frequencies of blue.
• the green color thus inherits the sharpness of the sharp color (blue). The same applies to the red color.
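A hedged 1-D sketch of this cross-color transfer follows: the improved green channel is the original plus a weighted high-pass extract of the blue channel, i.e., the GA = GN + c_GB * M_GB(BN) term above. The [-1, 2, -1] kernel and the weight 0.5 are illustrative assumptions; the description only requires zero-sum filters and weights.

```python
# Hedged 1-D sketch of the operator M: the green channel inherits the high
# frequencies (the sharpness) of the blue channel via a zero-sum filter.

def high_pass(signal):
    """Zero-sum [-1, 2, -1] high-pass filter; returns 0 at the borders."""
    out = [0.0] * len(signal)
    for i in range(1, len(signal) - 1):
        out[i] = -signal[i - 1] + 2.0 * signal[i] - signal[i + 1]
    return out

def improve_green(green, blue, c_gb=0.5):
    """GA = GN + c_GB * M_GB(BN): add blue's high frequencies to green."""
    hp_blue = high_pass(blue)
    return [g + c_gb * h for g, h in zip(green, hp_blue)]

# Blue carries a crisp edge; green carries the same edge, but blurred.
blue = [0, 0, 0, 100, 100, 100]
green = [10, 10, 40, 70, 100, 100]
print(improve_green(green, blue))
# -> [10.0, 10.0, -10.0, 120.0, 100.0, 100.0]
```

The blurred green edge is steepened around the transition, with the over/undershoot typical of high-frequency transfer; applied to a flat region the correction is zero, since the filter is zero-sum.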
• the filters M_{R,G,B}{R,G,B} and the coefficients c_{R,G,B}{R,G,B} can be adapted to the different possible values of the colors.
  • An embodiment of such an adaptation, in the context of RGB images from a given capture device, is as follows:
  • the association table between the relative sharpnesses considered and the set of filters may include other inputs, such as, for example, the position of the zone Z' in the image field, or shooting parameters such as the focal length, aperture, focus distance, etc., of the optical system at the time of shooting. Indeed, the sharpness characteristics of a digital image usually also depend on these factors.
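As a sketch of such an association table (all keys, bin edges and coefficient values below are invented for illustration; a real table would be calibrated for the capture apparatus):

```python
import bisect

# Hypothetical association table: a sharpening coefficient indexed by a
# quantized relative sharpness, the position of zone Z' and the focal
# length at the time of shooting.
FILTER_TABLE = {
    # (sharpness_bin, zone_position, focal_length_mm): coefficient
    (0, 'center', 35): 0.2,
    (1, 'center', 35): 0.6,
    (1, 'corner', 35): 0.8,
}

SHARPNESS_BIN_EDGES = [1.5]  # one edge -> two bins (assumed quantization)

def lookup_coefficient(rel_sharpness, zone_position, focal_length_mm):
    """Quantize the measured relative sharpness, then index the table;
    unknown combinations fall back to 0 (no correction)."""
    b = bisect.bisect(SHARPNESS_BIN_EDGES, rel_sharpness)
    return FILTER_TABLE.get((b, zone_position, focal_length_mm), 0.0)
```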
  • for the sharpness correction of a digital image, the image field will first be divided into several zones Z', and the method will be applied to each of the zones.
  • the division will preferably be performed according to the sharpness characteristics of the colors, so that within each zone the sharpness of the colors is relatively homogeneous.
  • an automatic adaptation of the sharpness filter applied to the digital image to the distance between the imaged scene and the capture apparatus is thus obtained. It should be noted that, thanks to the use of the relative sharpness, this automatic adaptation to the distance can be done without explicit knowledge of that distance.
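A minimal sketch of a relative sharpness measurement between two color planes, using a ratio of mean squared gradients; this is one possible measure among many, not one prescribed by the text:

```python
import numpy as np

def gradient_energy(plane):
    """Mean squared gradient over a plane: a simple sharpness measure."""
    gy, gx = np.gradient(plane.astype(float))
    return float(np.mean(gx ** 2 + gy ** 2))

def relative_sharpness(plane_a, plane_b, region):
    """Relative sharpness of color A versus color B over region R,
    expressed as a ratio of gradient energies (> 1 means A is sharper).
    `region` is a tuple of slices selecting R in both planes."""
    sub_a, sub_b = plane_a[region], plane_b[region]
    return gradient_energy(sub_a) / max(gradient_energy(sub_b), 1e-12)
```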
  • this embodiment of the method also allows the automatic adaptation of processing aimed, for example, at the correction of optical and/or sensor defects whose effects on the image depend on the distance between the imaged scene and the capture apparatus.
  • blur, or loss of sharpness, is one example of such a defect; other optical and/or sensor defects, such as geometric distortion or vignetting, are further examples.
  • in FIG. 19.1 there is shown an image 10 having a region R and two colors 195 and 196, a relative sharpness measurement 190 between the two colors 195 and 196 on the region R of the image 10, and an action 191 controlled according to the measured relative sharpness.
  • the controlled action also depends on a mode 193 corresponding, for example, to a choice of the user of the apparatus, and/or a characteristic of the capture apparatus at the time of shooting.
  • in Fig. 19.2 there is shown an image 10 having a region R and two colors 195 and 196, a relative sharpness measurement 190 between the two colors 195 and 196 on the region R of the image 10, and an action 191 commanded as a function of the measured relative sharpness, comprising a processing of the image 10 and producing a processed image 192.
  • the commanded action also depends on a mode 193 corresponding, for example, to a choice of the user of the apparatus, and/or a characteristic of the capture apparatus at the time of shooting.
  • in Fig. 19.3 there is shown an image 10 having a region R and two colors 195 and 196, a relative sharpness measurement 190 between the two colors 195 and 196 on the region R of the image 10, and an action 191 commanded according to the measured relative sharpness, comprising a processing of another image 194 and producing a processed image 198.
  • the commanded action also depends on a mode 193 corresponding, for example, to a choice of the user of the apparatus, and/or a characteristic of the capture apparatus at the time of shooting.
  • the commanded action consists of modifying the contrast and/or the brightness and/or the color of the image as a function of the relative sharpness between at least two colors on at least one region R of the image.
  • the use of the relative sharpness between at least two colors on at least one region R of the image makes it possible, for example, to simulate the addition of localized lighting, for example a flash positioned anywhere in the scene, and/or, conversely, to reduce the effect of a flash or of lighting of various colors in the scene.
  • the digital image is divided into regions according to the relative sharpness between at least two colors, so that each image region corresponds to a part of the scene lying within a given range of distances and oriented in a given direction.
  • An indication of the direction can be obtained from the local variation of the relative sharpness in the image.
  • An indication of the distance can be obtained from the relative sharpness as previously described.
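A sketch of how such a distance indication can be derived from the relative sharpness, assuming a hypothetical calibration curve for the capture apparatus (the distance and sharpness values below are invented; a real curve depends on the optics):

```python
import numpy as np

# Hypothetical calibration: relative sharpness (e.g. blue versus red)
# measured at known distances for a given capture apparatus.
CAL_DISTANCES_M   = np.array([0.2, 0.5, 1.0, 2.0, 5.0])
CAL_REL_SHARPNESS = np.array([2.6, 1.8, 1.2, 0.9, 0.7])

def estimate_distance(measured):
    """Invert the (monotonically decreasing) calibration curve by linear
    interpolation; np.interp requires increasing x, hence the flips."""
    return float(np.interp(measured,
                           CAL_REL_SHARPNESS[::-1],
                           CAL_DISTANCES_M[::-1]))
```

This illustrates the remark above: the processing adapts to distance through the relative sharpness alone, the explicit distance value being needed only if one wants to report it.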
  • the three-dimensional geometry of the scene is reconstructed by measuring the distance at a large number of points of the image. We then use a known technique in image synthesis to add lighting to the scene (ray tracing or other).
  • lighting adapted to each main subject is added so as to produce a "fill-in" effect simulating one or more flashes positioned opposite or to the side of each subject.
  • this operation can be performed automatically and independently for each subject. With known techniques, such per-subject lighting is possible only with studio lighting.
  • the flash power can be determined according to the nearest subject so as to illuminate it properly, and the lighting of the other subjects can be complemented by adding simulated lighting.
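A sketch of this combination of physical flash power and simulated complement, under an assumed inverse-square falloff model (a simplification; nothing in the text prescribes this model):

```python
def simulated_lighting_gains(subject_distances_m, target_illumination=1.0):
    """The physical flash power is set so that the nearest subject reaches
    the target illumination (inverse-square falloff assumed); farther
    subjects are then topped up with a per-region digital gain.
    Returns a map from subject distance to the extra gain needed."""
    d_near = min(subject_distances_m)
    power = target_illumination * d_near ** 2   # flash sized for nearest
    return {d: target_illumination / (power / d ** 2)
            for d in subject_distances_m}
```

For example, with subjects at 1 m and 2 m, the nearer subject needs no extra gain while the farther one needs a gain of 4 under this model.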
  • the color of the illumination can also be determined for each region by a known method of estimating the white balance and then making the color of the scene lighting uniform.
  • in known techniques, the white balance is estimated globally, for lack of information on the three-dimensional geometry of the scene.
  • in Figure 18.1 there is shown a sensor 2 producing a raw image 180 that undergoes a preprocessing 181, for example a white balance, and/or a black-level compensation, and/or a noise reduction, to produce a preprocessed image 182.
  • also shown is a relative sharpness measurement 190 controlling an action 191 corresponding to a processing implementing the preprocessed image 182 and the relative sharpness measurement 190, to produce a processed image 192.
  • a downstream processing of the processed image 192 is also shown, corresponding, for example, to a demosaicing or other processing necessary to convert a raw image into a visible image.
  • FIG. 18.2 shows a sensor 2 producing a raw image 180.
  • also shown is a relative sharpness measurement 190 controlling an action 191 corresponding to a processing implementing the raw image 180 and the relative sharpness measurement 190, to produce a processed image 192.
  • a downstream processing of the processed image 192 is also shown, corresponding, for example, to a demosaicing or other processing necessary to convert a raw image into a visible image.
  • the action implements a processing on a visible image.
  • the invention applies to an apparatus having parameters that are variable at the time of capturing the digital image and that influence the sharpness of the colors, such as a capture apparatus with a zoom, and/or variable-focus optics, and/or a variable aperture.
  • the sharpness curves 8.2 and 8.3 corresponding to the value of the variable parameters according to the digital image are then used.
  • the invention makes it possible to restore the focus digitally without a moving group and instantaneously, which thus makes it possible to reduce the complexity of a zoom by removing at least one moving part.
  • the relative sharpness between two colors may be variable, whereas this is not acceptable in known optics.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Color Television Image Signal Generators (AREA)
  • Studio Devices (AREA)
  • Facsimile Image Signal Circuits (AREA)
  • Paper (AREA)
EP06726221A 2005-03-07 2006-03-06 Procédé pour commander une action, notamment une modification de netteté, à partir d'une image numérique en couleurs Withdrawn EP1856907A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR0550601A FR2880958B1 (fr) 2005-01-19 2005-03-07 Procede d'amelioration de la nettete d'au moins une couleur d'une image numerique
PCT/FR2006/050197 WO2006095110A2 (fr) 2005-03-07 2006-03-06 Procédé pour commander une action, notamment une modification de netteté, à partir d'une image numérique en couleurs

Publications (1)

Publication Number Publication Date
EP1856907A2 true EP1856907A2 (fr) 2007-11-21

Family

ID=36632499

Family Applications (1)

Application Number Title Priority Date Filing Date
EP06726221A Withdrawn EP1856907A2 (fr) 2005-03-07 2006-03-06 Procédé pour commander une action, notamment une modification de netteté, à partir d'une image numérique en couleurs

Country Status (7)

Country Link
US (3) US7920172B2 (ja)
EP (1) EP1856907A2 (ja)
JP (3) JP5535476B2 (ja)
KR (1) KR101265358B1 (ja)
CN (2) CN102984448B (ja)
CA (4) CA2834963C (ja)
WO (1) WO2006095110A2 (ja)

Families Citing this family (126)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4969122B2 (ja) * 2006-03-24 2012-07-04 パナソニック株式会社 撮像装置
KR100776805B1 (ko) * 2006-09-29 2007-11-19 한국전자통신연구원 스테레오 비전 처리를 통해 지능형 서비스 로봇 시스템에서효율적인 영상 정보의 전송을 위한 장치 및 그 방법
KR100834577B1 (ko) * 2006-12-07 2008-06-02 한국전자통신연구원 스테레오 비전 처리를 통해 목표물 검색 및 추종 방법, 및이를 적용한 가정용 지능형 서비스 로봇 장치
US20090102924A1 (en) * 2007-05-21 2009-04-23 Masten Jr James W Rapidly Deployable, Remotely Observable Video Monitoring System
KR100976284B1 (ko) * 2007-06-07 2010-08-16 가부시끼가이샤 도시바 촬상 장치
DE102007031230B3 (de) * 2007-07-04 2008-10-30 Bundesdruckerei Gmbh Dokumentenerfassungssystem und Dokumentenerfassungsverfahren
JP5032911B2 (ja) * 2007-07-31 2012-09-26 キヤノン株式会社 画像処理装置及び画像処理方法
JP5298507B2 (ja) * 2007-11-12 2013-09-25 セイコーエプソン株式会社 画像表示装置及び画像表示方法
US8643748B2 (en) 2007-11-20 2014-02-04 Motorola Mobility Llc Compact stationary lens optical zoom image capture system
US8379115B2 (en) 2007-11-20 2013-02-19 Motorola Mobility Llc Image capture device with electronic focus
KR101412752B1 (ko) * 2007-11-26 2014-07-01 삼성전기주식회사 디지털 자동 초점 영상 생성 장치 및 방법
JP5171361B2 (ja) * 2008-04-07 2013-03-27 株式会社日立製作所 撮像装置
JP5132401B2 (ja) * 2008-04-16 2013-01-30 キヤノン株式会社 画像処理装置及び画像処理方法
US8160355B1 (en) * 2008-05-18 2012-04-17 Pixim Israel Ltd. Method, device and computer program product for performing white balancing of a digital image
EP2312858B1 (en) * 2008-06-18 2012-09-26 Panasonic Corporation Image processing apparatus, imaging apparatus, image processing method, and program
GB2463480A (en) * 2008-09-12 2010-03-17 Sharp Kk Camera Having Large Depth of Field
JP5075795B2 (ja) 2008-11-14 2012-11-21 株式会社東芝 固体撮像装置
JP5158713B2 (ja) * 2008-11-26 2013-03-06 京セラ株式会社 撮像装置および車載カメラシステム
JP5300133B2 (ja) * 2008-12-18 2013-09-25 株式会社ザクティ 画像表示装置及び撮像装置
JP5213688B2 (ja) * 2008-12-19 2013-06-19 三洋電機株式会社 撮像装置
US8908995B2 (en) 2009-01-12 2014-12-09 Intermec Ip Corp. Semi-automatic dimensioning with imager on a portable device
US8379321B2 (en) * 2009-03-05 2013-02-19 Raytheon Canada Limited Method and apparatus for accurate imaging with an extended depth of field
CN105681633B (zh) 2009-03-19 2019-01-18 数字光学公司 双传感器照相机及其方法
JP2010257037A (ja) 2009-04-22 2010-11-11 Sony Corp 情報処理装置および方法、並びにプログラム
US8553106B2 (en) 2009-05-04 2013-10-08 Digitaloptics Corporation Dual lens digital zoom
JP2010288150A (ja) * 2009-06-12 2010-12-24 Toshiba Corp 固体撮像装置
CN101938535B (zh) * 2009-06-29 2014-01-15 鸿富锦精密工业(深圳)有限公司 电子设备
FR2949003B1 (fr) * 2009-08-10 2017-09-08 Dxo Labs Systeme et procede de capture d'images avec deux modes de fonctionnement
TWI451357B (zh) * 2009-09-09 2014-09-01 Himax Tech Ltd 字型反鋸齒方法
CN102257822B (zh) 2009-10-27 2014-01-08 松下电器产业株式会社 摄像装置和使用它的测距装置
WO2011058236A1 (fr) 2009-11-16 2011-05-19 Dxo Labs Systeme optique et procede de conception associe
US20110149021A1 (en) * 2009-12-17 2011-06-23 Samir Hulyalkar Method and system for sharpness processing for 3d video
US20110188116A1 (en) * 2010-02-02 2011-08-04 Nikolay Ledentsov Ledentsov Device for generation of three-demensional images
JP5528173B2 (ja) 2010-03-31 2014-06-25 キヤノン株式会社 画像処理装置、撮像装置および画像処理プログラム
TWI495335B (zh) * 2010-04-21 2015-08-01 Hon Hai Prec Ind Co Ltd 取像模組及其運作方法
JP2011229603A (ja) * 2010-04-26 2011-11-17 Fujifilm Corp 内視鏡装置
JP2011229625A (ja) * 2010-04-26 2011-11-17 Fujifilm Corp 内視鏡装置
JP5811635B2 (ja) * 2011-03-07 2015-11-11 株式会社ニコン 画像処理装置、撮像装置および画像処理プログラム
JP5630105B2 (ja) * 2010-07-05 2014-11-26 株式会社ニコン 画像処理装置、撮像装置および画像処理プログラム
US8736722B2 (en) * 2010-07-15 2014-05-27 Apple Inc. Enhanced image capture sharpening
JP5576739B2 (ja) 2010-08-04 2014-08-20 オリンパス株式会社 画像処理装置、画像処理方法、撮像装置及びプログラム
JP5582935B2 (ja) 2010-09-22 2014-09-03 富士フイルム株式会社 撮像モジュール
US9225766B2 (en) * 2010-10-29 2015-12-29 Sears Brands, L.L.C. Systems and methods for providing smart appliances
US9697588B2 (en) 2010-11-15 2017-07-04 Intuitive Surgical Operations, Inc. System and method for multi-resolution sharpness transport across color channels
JP5976676B2 (ja) * 2011-01-14 2016-08-24 ソニー株式会社 レンズ部の縦の色収差を利用したイメージングシステム及びその操作方法
CN102158648B (zh) * 2011-01-27 2014-09-10 明基电通有限公司 影像截取装置及影像处理方法
US8988590B2 (en) 2011-03-28 2015-03-24 Intermec Ip Corp. Two-dimensional imager with solid-state auto-focus
JP2012237693A (ja) * 2011-05-13 2012-12-06 Sony Corp 画像処理装置、画像処理方法及び画像処理プログラム
JP5806504B2 (ja) * 2011-05-17 2015-11-10 オリンパス株式会社 撮像装置およびこれを備える顕微鏡システム
US8711275B2 (en) * 2011-05-31 2014-04-29 Apple Inc. Estimating optical characteristics of a camera component using sharpness sweep data
US8749892B2 (en) 2011-06-17 2014-06-10 DigitalOptics Corporation Europe Limited Auto-focus actuator for field curvature correction of zoom lenses
EP2725802A4 (en) 2011-06-23 2014-07-02 Panasonic Corp IMAGING DEVICE
US8953058B2 (en) * 2011-06-29 2015-02-10 Fotonation Limited Axial chromatic aberration correction
JP5847471B2 (ja) * 2011-07-20 2016-01-20 キヤノン株式会社 画像処理装置、撮像装置、画像処理方法および画像処理プログラム
CN103313642B (zh) 2011-09-29 2016-01-20 奥林巴斯株式会社 内窥镜装置
TWI528833B (zh) * 2011-11-09 2016-04-01 鴻海精密工業股份有限公司 立體攝像裝置
FR2982678B1 (fr) * 2011-11-14 2014-01-03 Dxo Labs Procede et systeme de capture de sequence d'images avec compensation des variations de grandissement
JP5898481B2 (ja) * 2011-12-13 2016-04-06 キヤノン株式会社 撮像装置及び焦点検出方法
US9779546B2 (en) 2012-05-04 2017-10-03 Intermec Ip Corp. Volume dimensioning systems and methods
US9007368B2 (en) 2012-05-07 2015-04-14 Intermec Ip Corp. Dimensioning system calibration systems and methods
US10007858B2 (en) 2012-05-15 2018-06-26 Honeywell International Inc. Terminals and methods for dimensioning objects
EP2677363A1 (en) * 2012-06-20 2013-12-25 bioMérieux An optical device including a camera, a diaphragm and illumination means
EP2872966A1 (en) * 2012-07-12 2015-05-20 Dual Aperture International Co. Ltd. Gesture-based user interface
US10321127B2 (en) 2012-08-20 2019-06-11 Intermec Ip Corp. Volume dimensioning system calibration systems and methods
TWI451344B (zh) * 2012-08-27 2014-09-01 Pixart Imaging Inc 手勢辨識系統及手勢辨識方法
US9939259B2 (en) 2012-10-04 2018-04-10 Hand Held Products, Inc. Measuring object dimensions using mobile computer
US20140104413A1 (en) 2012-10-16 2014-04-17 Hand Held Products, Inc. Integrated dimensioning and weighing system
WO2014076836A1 (ja) * 2012-11-19 2014-05-22 富士機械製造株式会社 部品実装機および実装検査機
JP5738904B2 (ja) * 2013-01-28 2015-06-24 オリンパス株式会社 画像処理装置、撮像装置、画像処理方法及びプログラム
US9080856B2 (en) 2013-03-13 2015-07-14 Intermec Ip Corp. Systems and methods for enhancing dimensioning, for example volume dimensioning
US10228452B2 (en) 2013-06-07 2019-03-12 Hand Held Products, Inc. Method of error correction for 3D imaging device
JP6086829B2 (ja) * 2013-06-26 2017-03-01 オリンパス株式会社 画像処理装置及び画像処理方法
US9239950B2 (en) 2013-07-01 2016-01-19 Hand Held Products, Inc. Dimensioning system
US9464885B2 (en) 2013-08-30 2016-10-11 Hand Held Products, Inc. System and method for package dimensioning
WO2015059346A1 (en) * 2013-10-25 2015-04-30 Nokia Technologies Oy An apparatus and a method for producing a depth-map
JP6256132B2 (ja) * 2014-03-14 2018-01-10 株式会社リコー 撮像システム
EP3119264B1 (en) * 2014-03-18 2020-04-22 Steris Instrument Management Services, Inc. Optically adaptive endoscope
KR101591172B1 (ko) * 2014-04-23 2016-02-03 주식회사 듀얼어퍼처인터네셔널 이미지 센서와 피사체 사이의 거리를 결정하는 방법 및 장치
EP2942940A1 (en) * 2014-05-06 2015-11-11 Nokia Technologies OY Method and apparatus for defining the visible content of an image
US9232132B1 (en) * 2014-06-10 2016-01-05 Gregory S. Tseytin Light field image processing
US9823059B2 (en) 2014-08-06 2017-11-21 Hand Held Products, Inc. Dimensioning system with guided alignment
US10810715B2 (en) 2014-10-10 2020-10-20 Hand Held Products, Inc System and method for picking validation
US10775165B2 (en) 2014-10-10 2020-09-15 Hand Held Products, Inc. Methods for improving the accuracy of dimensioning-system measurements
US9779276B2 (en) 2014-10-10 2017-10-03 Hand Held Products, Inc. Depth sensor based auto-focus system for an indicia scanner
US9762793B2 (en) 2014-10-21 2017-09-12 Hand Held Products, Inc. System and method for dimensioning
US9752864B2 (en) 2014-10-21 2017-09-05 Hand Held Products, Inc. Handheld dimensioning system with feedback
US9897434B2 (en) 2014-10-21 2018-02-20 Hand Held Products, Inc. Handheld dimensioning system with measurement-conformance feedback
US9557166B2 (en) 2014-10-21 2017-01-31 Hand Held Products, Inc. Dimensioning system with multipath interference mitigation
US10060729B2 (en) 2014-10-21 2018-08-28 Hand Held Products, Inc. Handheld dimensioner with data-quality indication
US20160225150A1 (en) * 2015-02-02 2016-08-04 Capso Vision, Inc. Method and Apparatus for Object Distance and Size Estimation based on Calibration Data of Lens Focus
US10475361B2 (en) 2015-02-02 2019-11-12 Apple Inc. Adjustable display illumination
US9786101B2 (en) 2015-05-19 2017-10-10 Hand Held Products, Inc. Evaluating image values
US10066982B2 (en) 2015-06-16 2018-09-04 Hand Held Products, Inc. Calibrating a volume dimensioner
US20160377414A1 (en) 2015-06-23 2016-12-29 Hand Held Products, Inc. Optical pattern projector
US9857167B2 (en) 2015-06-23 2018-01-02 Hand Held Products, Inc. Dual-projector three-dimensional scanner
US9835486B2 (en) 2015-07-07 2017-12-05 Hand Held Products, Inc. Mobile dimensioner apparatus for use in commerce
EP3118576B1 (en) 2015-07-15 2018-09-12 Hand Held Products, Inc. Mobile dimensioning device with dynamic accuracy compatible with nist standard
US20170017301A1 (en) 2015-07-16 2017-01-19 Hand Held Products, Inc. Adjusting dimensioning results using augmented reality
US10094650B2 (en) 2015-07-16 2018-10-09 Hand Held Products, Inc. Dimensioning and imaging items
US11354783B2 (en) 2015-10-16 2022-06-07 Capsovision Inc. Method and apparatus of sharpening of gastrointestinal images based on depth information
US10624533B2 (en) 2015-10-16 2020-04-21 Capsovision Inc Endoscope with images optimized based on depth map derived from structured light images
US10943333B2 (en) * 2015-10-16 2021-03-09 Capsovision Inc. Method and apparatus of sharpening of gastrointestinal images based on depth information
US10249030B2 (en) 2015-10-30 2019-04-02 Hand Held Products, Inc. Image transformation for indicia reading
US10225544B2 (en) 2015-11-19 2019-03-05 Hand Held Products, Inc. High resolution dot pattern
US9715721B2 (en) 2015-12-18 2017-07-25 Sony Corporation Focus detection
US10025314B2 (en) 2016-01-27 2018-07-17 Hand Held Products, Inc. Vehicle positioning and object avoidance
TWI588508B (zh) * 2016-05-10 2017-06-21 國立中興大學 立體深度量測裝置
US10339352B2 (en) 2016-06-03 2019-07-02 Hand Held Products, Inc. Wearable metrological apparatus
US9940721B2 (en) 2016-06-10 2018-04-10 Hand Held Products, Inc. Scene change detection in a dimensioner
US10163216B2 (en) 2016-06-15 2018-12-25 Hand Held Products, Inc. Automatic mode switching in a volume dimensioner
US10277829B1 (en) * 2016-08-12 2019-04-30 Apple Inc. Video capture in low-light conditions
US10909708B2 (en) 2016-12-09 2021-02-02 Hand Held Products, Inc. Calibrating a dimensioner using ratios of measurable parameters of optically-perceptible geometric elements
JP6801434B2 (ja) * 2016-12-20 2020-12-16 富士通株式会社 生体画像処理装置、生体画像処理方法および生体画像処理プログラム
MX2019008835A (es) 2017-01-25 2019-10-24 Kornit Digital Ltd Impresion por chorro de tinta en telas sinteticas te?idas.
US11047672B2 (en) 2017-03-28 2021-06-29 Hand Held Products, Inc. System for optically dimensioning
US10733748B2 (en) 2017-07-24 2020-08-04 Hand Held Products, Inc. Dual-pattern optical 3D dimensioning
WO2019048492A1 (en) * 2017-09-08 2019-03-14 Sony Corporation IMAGING DEVICE, METHOD AND PROGRAM FOR PRODUCING IMAGES OF A SCENE
CN107613284B (zh) * 2017-10-31 2019-10-08 努比亚技术有限公司 一种图像处理方法、终端和计算机可读存储介质
US10584962B2 (en) 2018-05-01 2020-03-10 Hand Held Products, Inc System and method for validating physical-item security
CN108650462B (zh) * 2018-05-14 2020-06-09 Oppo广东移动通信有限公司 拍摄预览显示方法、装置、终端及存储介质
US10679024B2 (en) 2018-07-24 2020-06-09 Cognex Corporation System and method for auto-focusing a vision system camera on barcodes
US11639846B2 (en) 2019-09-27 2023-05-02 Honeywell International Inc. Dual-pattern optical 3D dimensioning
US11336840B2 (en) 2020-09-02 2022-05-17 Cisco Technology, Inc. Matching foreground and virtual background during a video communication session
CN114339187B (zh) * 2020-09-30 2024-06-14 北京小米移动软件有限公司 图像处理方法、图像处理装置及存储介质
US11893668B2 (en) 2021-03-31 2024-02-06 Leica Camera Ag Imaging system and method for generating a final digital image via applying a profile to image information
CN114724000B (zh) * 2022-06-09 2022-08-30 深圳精智达技术股份有限公司 一种屏拍图摩尔纹的处理方法、处理装置及处理设备

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USH101H (en) * 1984-10-01 1986-08-05 The United States Of America As Represented By The Secretary Of The Army Ultraviolet and infrared focal place array
JPS63247680A (ja) 1987-04-02 1988-10-14 Mitsubishi Electric Corp 画像追尾装置
JPH01212981A (ja) * 1988-02-20 1989-08-25 Sanyo Electric Co Ltd オートフォーカス装置
JPH01276884A (ja) * 1988-04-27 1989-11-07 Nec Corp ビデオカメラの焦合装置
US5161107A (en) 1990-10-25 1992-11-03 Mestech Creation Corporation Traffic surveillance system
JPH06138362A (ja) * 1991-02-06 1994-05-20 Sony Corp オートフォーカス装置
GB9125954D0 (en) * 1991-12-06 1992-02-05 Vlsi Vision Ltd Electronic camera
US6292212B1 (en) * 1994-12-23 2001-09-18 Eastman Kodak Company Electronic color infrared camera
JP3960647B2 (ja) * 1997-01-09 2007-08-15 オリンパス株式会社 自動合焦装置
EP0878970A3 (en) 1997-05-16 1999-08-18 Matsushita Electric Industrial Co., Ltd. Imager registration error and chromatic aberration measurement system for a video camera
US5973846A (en) * 1998-11-30 1999-10-26 Hewlett-Packard Company Offset spectra lens system for a two spectra automatic focusing system
JP2000299874A (ja) 1999-04-12 2000-10-24 Sony Corp 信号処理装置及び方法並びに撮像装置及び方法
JP2000338385A (ja) * 1999-05-28 2000-12-08 Ricoh Co Ltd 自動合焦装置およびその合焦方法
US6859229B1 (en) 1999-06-30 2005-02-22 Canon Kabushiki Kaisha Image pickup apparatus
JP2001103358A (ja) * 1999-09-30 2001-04-13 Mitsubishi Electric Corp 色収差補正装置
JP4696407B2 (ja) * 2001-06-20 2011-06-08 株式会社ニコン 商品推奨システムおよび商品推奨方法
JP2003018407A (ja) * 2001-07-02 2003-01-17 Konica Corp 画像処理方法及び画像処理装置
US20030063185A1 (en) * 2001-09-28 2003-04-03 Bell Cynthia S. Three-dimensional imaging with complementary color filter arrays
JP4126938B2 (ja) 2002-03-22 2008-07-30 セイコーエプソン株式会社 画像処理装置および画像出力装置
JP2004120487A (ja) * 2002-09-27 2004-04-15 Fuji Photo Film Co Ltd 撮像装置
JP2004228662A (ja) * 2003-01-20 2004-08-12 Minolta Co Ltd 撮像装置
JP4010254B2 (ja) * 2003-02-06 2007-11-21 ソニー株式会社 画像記録再生装置、画像撮影装置及び色収差補正方法
US20040165090A1 (en) * 2003-02-13 2004-08-26 Alex Ning Auto-focus (AF) lens and process
US20040169748A1 (en) * 2003-02-28 2004-09-02 Tinku Acharya Sub-sampled infrared sensor for use in a digital image capture device
US20040174446A1 (en) * 2003-02-28 2004-09-09 Tinku Acharya Four-color mosaic pattern for depth and image capture
JP4378994B2 (ja) 2003-04-30 2009-12-09 ソニー株式会社 画像処理装置、画像処理方法ならびに撮像装置
FR2860089B1 (fr) 2003-09-23 2005-11-11 Do Labs Procede et systeme pour modifier une image numerique de maniere differenciee et quasi reguliere par pixel
JP4665422B2 (ja) * 2004-04-02 2011-04-06 ソニー株式会社 撮像装置
JP4815807B2 (ja) 2004-05-31 2011-11-16 株式会社ニコン Rawデータから倍率色収差を検出する画像処理装置、画像処理プログラム、および電子カメラ
US20060093234A1 (en) * 2004-11-04 2006-05-04 Silverstein D A Reduction of blur in multi-channel images

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
US20110109749A1 (en) 2011-05-12
WO2006095110A2 (fr) 2006-09-14
CA2835047A1 (fr) 2006-09-14
CN101204083A (zh) 2008-06-18
KR101265358B1 (ko) 2013-05-21
CA2834963C (fr) 2017-04-18
CN102984448A (zh) 2013-03-20
JP2008532449A (ja) 2008-08-14
JP6076300B2 (ja) 2017-02-08
CA2835047C (fr) 2017-04-18
JP2015019378A (ja) 2015-01-29
CA2600185C (fr) 2016-04-26
CA2834963A1 (fr) 2006-09-14
CA2600185A1 (fr) 2006-09-14
CA2834883C (fr) 2018-01-23
WO2006095110A3 (fr) 2006-11-02
US8212889B2 (en) 2012-07-03
JP5633891B2 (ja) 2014-12-03
US20110019065A1 (en) 2011-01-27
JP2013214986A (ja) 2013-10-17
US20080158377A1 (en) 2008-07-03
JP5535476B2 (ja) 2014-07-02
CN102984448B (zh) 2016-05-25
CA2834883A1 (fr) 2006-09-14
KR20070121717A (ko) 2007-12-27
US7920172B2 (en) 2011-04-05

Similar Documents

Publication Publication Date Title
CA2834963C (fr) Procede pour commander une action, notamment une modification de nettete, a partir d'une image numerique en couleurs
JP6935587B2 (ja) 画像処理のための方法および装置
EP1523730B1 (fr) Procede et systeme pour calculer une image transformee a partir d'une image numerique
EP2174289B1 (fr) Procede de traitement d'objet numerique et systeme associe.
EP3657784B1 (fr) Procédé d'estimation d'un défaut d'un système de capture d'images et systèmes associés
JP5237978B2 (ja) 撮像装置および撮像方法、ならびに前記撮像装置のための画像処理方法
JP6838994B2 (ja) 撮像装置、撮像装置の制御方法およびプログラム
WO2003007242A2 (fr) Procede et systeme pour produire des informations formatees liees aux defauts
FR2842976A1 (fr) Dispositif et procede pour fournir un zoom numerique de resolution amelioree dans un dispositif imageur electronique portatif
WO2005125242A2 (fr) Procede d'amelioration de services relatifs a des donnees multimedia en telephonie mobile
KR20190068618A (ko) 단말기를 위한 촬영 방법 및 단말기
FR2996034A1 (fr) Procede pour creer des images a gamme dynamique etendue en imagerie fixe et video, et dispositif d'imagerie implementant le procede.
FR2851393A1 (fr) Dispositif de traitement d'image, procede de traitement d'image et programme-produit de traitement d'image
FR2880958A1 (fr) Procede d'amelioration de la nettete d'au moins une couleur d'une image numerique
WO2023218072A1 (fr) Procédé de correction globale d' une image, et système associe
FR2887346A1 (fr) Procede et dispositif d'amelioration d'une image numerique

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20070906

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

17Q First examination report despatched

Effective date: 20080228

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20190108