WO2007101887A2 - Procedé de formation d'une image et dispositif de prise de vues à fonctions d'imagerie ameliorees - Google Patents

Procedé de formation d'une image et dispositif de prise de vues à fonctions d'imagerie ameliorees Download PDF

Info

Publication number
WO2007101887A2
WO2007101887A2 (application PCT/EP2007/055338)
Authority
WO
WIPO (PCT)
Prior art keywords
image
image taking
imaging parameter
scene
obtaining
Prior art date
Application number
PCT/EP2007/055338
Other languages
English (en)
Other versions
WO2007101887A3 (fr)
Inventor
Alain Wacker
Original Assignee
Sinar Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sinar Ag filed Critical Sinar Ag
Priority to PCT/EP2007/055338 priority Critical patent/WO2007101887A2/fr
Publication of WO2007101887A2 publication Critical patent/WO2007101887A2/fr
Publication of WO2007101887A3 publication Critical patent/WO2007101887A3/fr

Links

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 - Constructional details
    • H04N23/55 - Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/67 - Focus control based on electronic image sensor signals
    • H04N23/70 - Circuitry for compensating brightness variation in the scene
    • H04N23/743 - Bracketing, i.e. taking a series of images with varying exposure conditions
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 - Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28 - Systems for automatic generation of focusing signals
    • G02B7/34 - Systems for automatic generation of focusing signals using different areas in a pupil plane

Definitions

  • the invention relates to the field of photography, and in particular to an image taking apparatus and a control unit for an image taking apparatus and to a method of manufacturing a photograph and to a use of said method.
  • the usual way to obtain a picture of a scene by means of a camera is to select appropriate imaging parameters like the focus setting, the zoom setting, the aperture opening (aperture number) and the exposure time and then expose an image taking element accordingly.
  • Said image taking element can be a photographic film or, nowadays, rather a photoelectric chip (typically a CCD or a CMOS chip) , which has a large number (today typically at least several millions) of pixels (picture elements). Since it is sometimes difficult to precisely determine the appropriate imaging parameters in advance, some cameras provide for a so-called “bracketing” or “exposure bracketing”, which allows to obtain a number of (typically three) images, each taken at a different aperture opening and/or exposure time.
  • Upon pressing the release button once, e.g., firstly one picture at a presumed optimal exposure time is taken, then one picture at twice said presumed optimal exposure time is taken, and finally one picture at half said presumed optimal exposure time is taken. Afterwards, the best picture can be selected from these three.
  • an optical detection system which allows to minimize undesired effects of aberration in image taking by using a spatial light modulator embodied as a deformable mirror comprised of a multitude of deformable micro-mirrors.
  • an image pickup apparatus specifically designed for the purpose of picking up an entirety of a flat surface of a white board in-focus is disclosed, i.e. an image pickup apparatus for a particular kind of oblique photography.
  • Such an image pickup apparatus is installed with an inclination angle θ between a sensing surface of the image pickup apparatus and the surface of the white board. Said angle θ is adjustable, and dependent on said angle θ, a mirror is rotated, a taking lens is moved and a CCD line image sensor is moved, so as to obtain line-by-line a full image of said white board surface, which is fully in-focus, wherein light from said white board surface is reflected by said mirror, then runs through said taking lens and is then detected by said line sensor.
  • an object of the invention is to create an image taking apparatus, a control module for an image taking apparatus and a method of manufacturing a picture and a use of said method, which are new and provide for useful degrees of freedom in creating a picture of a scene. Another object of the invention is to provide for enhanced imaging capabilities. Another object of the invention is to provide a photographer with enhanced creative possibilities for creating photographs.
  • Another object of the invention is to provide for enhanced ways of creating high-quality images in general-purpose photography.
  • These objects are achieved by an image taking apparatus, a control module for an image taking apparatus, a control unit for an image taking apparatus, a method of manufacturing a picture of a scene and a use of said method according to the patent claims.
  • the use according to the invention is a use of the method according to the invention for intentionally causing said picture to comprise portions which are deliberately out-of-focus.
  • the control module for an image taking apparatus, which apparatus comprises an image taking element and an image-forming optical system comprising at least one lens, is adapted to enabling said apparatus to carry out the method according to the invention.
  • the control unit for an image taking apparatus is comprised in and/or connectable to said image taking apparatus and comprises a control module according to the invention.
  • the image taking apparatus comprises a control module according to the invention.
  • By means of the invention it is possible to obtain pictures, in particular still images, throughout which at least one imaging parameter is varied. I.e., different parts of the picture are obtained from exposures with (at least partially) different imaging parameters such as focus settings, soft focus settings and zoom settings. Pictures being partially in-focus and partially deliberately out-of-focus can be obtained in a well-defined way, which provides for great creative possibilities in creating photographs.
  • the method can allow pictures with pre-selectable distributions of sharpness and blur across the picture to be obtained. It is possible to achieve the effect of a view camera with tilted lenses without tilting lenses, and still further creative effects.
  • An advantage of the invention can be that it can be realized with photographic cameras which are, from an optical point of view, identical with known cameras. However, the way the exposure and said at least one imaging parameter are controlled is different from what is known from conventional photographic cameras.
  • In step a1), usually the image taking element will be used for successively obtaining said image components.
  • Said relative movement put forward in step b1) causes, at least in part, said pre-defined variation of said at least one imaging parameter put forward in step b).
  • said at least one imaging parameter is, at least in part, different for different image components.
  • said pre-defined variation of said at least one imaging parameter is carried out such as to achieve that, at least for a portion of said image components of said set, different image components of said portion are obtained using different settings of said at least one imaging parameter.
  • said variation of said at least one imaging parameter is a pre-defined variation. Therefore, for each image component, the setting of said at least one imaging parameter to be used during obtaining the respective image component is prescribed (typically by the photographer) before the respective image component is obtained, in particular, even before a first image component of said set is obtained.
  • said at least one imaging parameter is set to a pre-defined initial setting while the first image component of said set of image components is obtained, and said at least one imaging parameter is set to a pre-defined final setting while the last image component of said set of image components is obtained.
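  • As an illustration (not part of the patent text), a minimal sketch of how such a pre-defined variation could be represented: a schedule, fixed before the first exposure, that assigns one imaging-parameter setting to every image component of the set. All names are hypothetical.

```python
import numpy as np

def exposure_program(n_components, initial_setting, final_setting):
    """Pre-defined variation: one imaging-parameter setting per image component.

    The settings are fixed before the first image component is obtained; a
    simple linear progression from the initial to the final setting is assumed
    here, but any other pre-defined curve would do.
    """
    settings = np.linspace(initial_setting, final_setting, n_components)
    return list(enumerate(settings))  # [(component_index, setting), ...]

# Example: 500 stripe-shaped image components, focus distance from 1.2 m to 4.0 m
program = exposure_program(500, 1.2, 4.0)
```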
  • Said pre-defined variation is thus clearly distinguished from variations of said at least one imaging parameter which are carried out irrespective of which image component is currently obtained, as is done, e.g., for camera-shake compensation.
  • a camera-shake compensation is accomplished by detecting a change of the position of the optical detection system and compensating for a shift in focal point caused by the detected position change.
  • said method comprises the step of e) defining said pre-defined variation of said at least one imaging parameter.
  • Step e) is typically carried out before carrying out steps a) and b) .
  • step e) is typically carried out by the photographer.
  • Said fractions of said scene to be captured in said image components mentioned in step a2) are usually pre-defined, too:
  • the fraction of said scene to be captured in the respective image component is prescribed before the respective image component is obtained, in particular, before a first image component of said set is obtained.
  • the method comprises the step of adjusting at least one control parameter which influences shape and/or arrangement of all those portions of said image taking element, each of which is exposed for obtaining one respective image component.
  • said fractions of said scene captured in each of said image components substantially do not overlap, or do not overlap at all.
  • said method comprises the step of switching from a normal mode of said image taking apparatus, in which no pre-defined variation of said at least one imaging parameter and/or no movement of said at least one lens can be carried out during exposure, to another mode of said image taking apparatus, in which steps a) and b) can be carried out.
  • Said normal mode can be the mode, in which standard photo cameras work, i.e. in which imaging parameters (and lenses) remain fixed during exposing an image taking element.
  • Said other mode is the mode according to the invention.
  • said picture is composed of a multitude of picture constituents derived from said set of image components and said method comprises the steps of deriving said picture constituents from said set of image components and deriving said picture from said picture constituents.
  • said picture is composed of picture constituents which are substantially identical with said image components. This is usually preferred in case of photochemical converters as image taking elements.
  • both a focus setting and a magnification and/or zoom setting are varied, wherein the variation of said magnification and/or zoom setting is chosen such that it compensates for changes in magnification caused by said focus setting variation.
  • Said deriving of said picture constituents and/or of said picture may in full or in part take place in the image taking apparatus and in full or in part in a computer connectable to the image taking apparatus or in a computer separate therefrom.
  • Said image components can also be referred to as "raw image components” in the sense that they are usually subject to some further processing in order to obtain said picture constituents.
  • data representing said image components can be what generally is referred to as “raw data” in digital photography, yet it is also possible to have processed, e.g., compressed, data representing said image components.
  • said method comprises the step of d) deliberately creating out-of-focus portions of said scene in said picture by carrying out step b) .
  • said picture comprises in-focus portions and deliberately out-of-focus portions because of carrying out step b) .
  • There are varying degrees of sharpness (and blur) across said picture because of carrying out step b).
  • step a) comprises the step of a3) exposing said image taking element with light from said scene in full by successively exposing different portions of said image taking element with light from different parts of said scene.
  • said image taking element, which in this case typically is a two-dimensional converter such as a sheet of photographic film or a two-dimensional CCD or CMOS chip, is subject to exposure on a portion-by-portion basis, until light from said scene in full is captured, which typically is the case when the whole image taking element has been exposed.
  • different portions of said image taking element are exposed at different times.
  • said method comprises the step of a31) using a sensing area definition element arranged within said image taking apparatus between said scene and said image taking element for defining said different portions of said image taking element to be exposed.
  • said sensing area definition element is arranged close to said image taking element, and it usually comprises an opaque portion for blocking light from travelling to said image taking element. It usually comprises, in addition, a transparent portion, which allows light to travel to said image taking element, so as to define said different portions of said image taking element to be exposed. Typically, on its way to said image taking element, the light travels through said transparent portion substantially without changing its direction and/or substantially unperturbed.
  • said sensing area definition element can comprise a transparent portion, which is confined, fully or in part, by said opaque portion.
  • the sensing area definition element can, e.g., comprise a liquid crystal element such as a liquid crystal material between glass substrates having electrodes to selectively create transparent and light-blocking areas of the liquid crystal element. If many electrodes are provided, a great flexibility for defining said image components can be provided this way. Nevertheless, said transparent portion is preferably an opening, since this provides a great optical quality, in particular compared to providing glass substrates in the light path used for exposing said image taking element. Accordingly, in one embodiment, said sensing area definition element is capable of forming an opening, and said step a31) comprises the step of moving said opening (with respect to said image taking element) of said sensing area definition element for defining said different portions of said image taking element to be exposed.
  • a slit-shaped opening can be moved (continuously or quasi-continuously or step-wise) in the light path; e.g., shutter curtains as usually used for controlling the exposure in photographic cameras can be used as sensing area definition elements. It is also possible to use a pin diaphragm.
  • step a3) can be accomplished by successively bringing different groups of photosensitive members of said multitude of photosensitive members into a suitable photosensitive state and back into a photo-insensitive state.
  • the image taking element may be constantly illuminated, and the portion of the image taking element to be exposed for obtaining a single image component can easily be chosen by a control signal switching the photo-sensitivity of said groups of photosensitive members. And even the times at which and the time during which an exposure for a single image component takes place may readily be defined by said control signal.
  • an electronic shutter may be realized in the image taking element itself. This can render a sensing area definition element superfluous.
  • step a) may comprise the step of a4) successively obtaining the image components of said set by multiply exposing said image taking element, each time with light from a different part of said scene.
  • step a4) comprises the step of a41) moving said image taking element for accomplishing said exposing of said image taking element with light from said different parts of said scene.
  • Said moving said image taking element can be a continuous or quasi-continuous or step-wise moving.
  • a drive functionally connected to said image taking element can be used for accomplishing said movement.
  • said method comprises the step of c1) adjusting at least one control parameter on which said pre-defined variation of said at least one imaging parameter depends and which does not define and/or does not describe and/or is unrelated to an angle between an axis defined by said image taking element and an axis defined by an object to be imaged.
  • said method comprises the step of c2) adjusting at least one control parameter on which said pre-defined variation of said at least one imaging parameter depends and which does not describe and/or is unrelated to an alignment of said image taking apparatus relative to an object to be imaged.
  • said method comprises the step of c3) adjusting at least one control parameter, on which said pre-defined variation of said at least one imaging parameter depends, in a way which is not predetermined by an alignment of an axis defined by said image taking element relative to an axis defined by an object to be imaged.
  • said method comprises the step of c4) adjusting at least one control parameter, on which said pre-defined variation of said at least one imaging parameter depends, in a way which is not predetermined by an alignment of said image taking apparatus relative to said scene.
  • said method comprises the step of c5) adjusting at least one control parameter on which said pre-defined variation of said at least one imaging parameter depends and which does not define and/or does not describe and/or is unrelated to a distance between said image taking element and an object to be imaged.
  • Steps c1), c2), c3), c4) and c5), respectively, are typically carried out by the photographer, and typically before carrying out steps a) and b).
  • It is possible that the control parameter mentioned in one of steps c1), c2), c3), c4) and c5) is identical with the control parameter mentioned in another one or more of these steps.
  • a control parameter, on which said predefined variation of said at least one imaging parameter depends, is adjusted for creatively designing said picture of said scene.
  • said at least one imaging parameter, which is varied in a pre-defined way in step b), depends on said relative movement mentioned in step b1).
  • said pre-defined variation of said at least one imaging parameter leaves the signal strength generated by said image taking element substantially unchanged.
  • said at least one imaging parameter is a parameter which leaves the amount of light to which said image taking element is exposed during obtaining said image components substantially unchanged. In one embodiment, said at least one imaging parameter is a parameter which influences light paths relevant for said obtaining said image components.
  • said at least one lens is comprised in a focussing section of said image taking apparatus, and said at least one imaging parameter is a parameter of said focussing section.
  • said at least one imaging parameter comprises a focus setting.
  • Variation of the focus setting allows the location of maximum sharpness to be placed at different distances for different places in the picture (different parts of the scene).
  • the plane of maximum sharpness may be angled arbitrarily, and in principle, even an arbitrarily shaped surface of maximum sharpness may be realized.
  • said method comprises the step of f) creating a bent focal surface by carrying out step b) .
  • the focal surface is the surface constituted by points which are imaged in-focus.
  • a bent focal surface is a non-flat focal surface.
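  • As a sketch of how a tilted or bent focal surface could be planned (assuming a thin-lens model, which a real lens would replace with its own calibration; all names are illustrative), the lens-to-sensor distance needed per stripe follows from the desired in-focus object distance of that stripe:

```python
import numpy as np

def lens_positions_for_focal_surface(focal_length_mm, object_distances_mm):
    """For each image component (e.g. one stripe), compute the lens-to-sensor
    distance that places the desired object distance in focus, using the
    thin-lens equation 1/f = 1/u + 1/v, i.e. v = u*f / (u - f).

    A tilted or arbitrarily bent focal surface is obtained simply by choosing
    a different object distance per stripe.
    """
    return [u * focal_length_mm / (u - focal_length_mm) for u in object_distances_mm]

# Tilted plane of maximum sharpness: near for the bottom stripes, far for the top
distances = np.linspace(1000.0, 5000.0, 10)                # 1 m ... 5 m, 10 stripes
print(lens_positions_for_focal_surface(50.0, distances))   # ~52.6 mm ... ~50.5 mm
```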
  • the light used for obtaining the respective image component travels within said image taking apparatus along a light path to said image taking element without being reflected by a mirror.
  • light paths are used, which are, within said image taking apparatus, free from mirrors.
  • mirror-free embodiments have the advantage that they allow for highest-quality imaging, since mirrors are optical elements which may have imperfections, may require adjusting and maintenance and may generate failures.
  • step b1) comprises the step of b11) moving said at least one lens during step a).
  • This moving of said at least one lens typically takes place along the optical axis of the image taking apparatus.
  • said image taking apparatus comprises a drive for moving said at least one lens, and step b11) is accomplished using said drive.
  • This drive may be identical with a drive used for autofocus operations in said image taking apparatus.
  • the method comprises the step of g) moving said image taking element during step a) .
  • This moving can be a moving contributing to or even embodying said relative movement mentioned in step b1), in which case it typically takes place along the optical axis of the image taking apparatus or at least comprises a movement component along the optical axis of the image taking apparatus.
  • the moving of said image taking element mentioned in step g) comprises a tilting movement, wherein said tilting may be a tilting about an arbitrary axis, e.g., a horizontally aligned axis or a vertically aligned axis.
  • the image taking element is aligned such that the line (in case of a line scanner) or the surface (in case of a two-dimensional photosensitive element such as photographic film or an imaging photoelectric detector) defined by it lies in a plane perpendicular to the optical axis of the image taking apparatus, typically in the image plane.
  • said tilting movement typically comprises a movement component along the optical axis of the image taking apparatus, but is not a movement solely along the optical axis of the image taking apparatus. If a movement according to step g) is carried out, in particular a tilting movement, while also a sensing area definition element is used for defining said different portions of said image taking element to be exposed (cf. step a31), it can be advantageous to move said sensing area definition element simultaneously with and in the same manner as said image taking element, e.g., for keeping a distance between said sensing area definition element and said image taking element constant and/or for ensuring that, for all image components, the length of the light path relevant for obtaining the respective image component from said sensing area definition element to said image taking element is of the same magnitude.
  • said image taking apparatus is a general-purpose photographic camera.
  • a general-purpose photographic camera is intended for use for various types of photography, not limited to only one type of photography such as taking pictures of white boards only.
  • said image taking apparatus is a camera for hand-held use and/or for use mounted on a camera stand or tripod.
  • said set of image components is obtained automatically.
  • said settings of said at least one imaging parameter are varied automatically.
  • said pre-defined variation of said at least one imaging parameter mentioned in step b) is carried out by varying settings of said at least one imaging parameter in a continuous or quasi-continuous or step-wise manner.
  • Continuous variations of said at least one imaging parameter, of said relative movement mentioned in step b1), of said moving of said image taking element and/or of said lens and/or of said transparent portion of said sensing area definition element are possible, e.g., by using an analogue control and an analogue motor for continuously varying, e.g., a focus setting, a lens position, an image taking element position or an aperture position.
  • Quasi-continuous variations are possible with a drive and a digital control (having a reasonable resolution) .
  • Step-wise variations are possible by various means .
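  • The three kinds of variation can be pictured with a small sketch (hypothetical helper, not from the patent) that returns the imaging-parameter value at time t for a continuous, quasi-continuous or step-wise schedule:

```python
import numpy as np

def parameter_schedule(t, p0, pf, t_total, mode="continuous", n_steps=8):
    """Value of the imaging parameter at time t in [0, t_total].

    'continuous'       : analogue, smooth ramp from p0 to pf
    'quasi-continuous' : digital control with fine resolution (many small steps)
    'step-wise'        : a small number of discrete settings
    (Sketch only; the concrete curve is left to the photographer.)
    """
    frac = np.clip(t / t_total, 0.0, 1.0)
    if mode == "quasi-continuous":
        frac = np.round(frac * 256) / 256          # fine quantisation
    elif mode == "step-wise":
        frac = np.floor(frac * n_steps) / n_steps  # few coarse steps
    return p0 + (pf - p0) * frac
```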
  • said image taking element is a photoelectric converter, in particular a CMOS chip or a CCD chip or a line scanner.
  • said image taking element is an imaging photochemical converter, in particular photographic film.
  • said method comprises the step of storing data representative of said set of image components in a storage unit, in particular in at least one of the group comprising
  • a data carrier, in particular a magnetic or optical or electrical data carrier, in particular a removable data carrier.
  • the "data" of the image components are "stored” in the image taking element, typically a photographic film.
  • said method comprises the step of obtaining line-shaped image components with a small width, wherein the line may be curved or straight, continuous or discontinuous.
  • the width of the line would, for high resolution, be only one pixel, or maybe two or three pixels; in case of a color-sensitive chip, one pixel would preferably be considered to comprise a couple of photosensitive members, accounting for the different colors, e.g., one for red, two for green, one for blue.
  • the number of image components in said set will in many cases be several hundreds to several thousands.
  • image components with a larger width can be used for faster image-taking, usually at the expense of resolution; if, however, the variation of said imaging parameter comprises only a relatively small number of steps, there may be no loss of resolution.
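  • A small worked example (hypothetical numbers) of this resolution/speed trade-off, which also shows the overall capture time during which the scene has to stay still:

```python
# Hypothetical numbers illustrating the resolution / capture-time trade-off
sensor_rows       = 4000      # rows of the image taking element
exposure_per_comp = 0.002     # seconds of exposure per image component

for stripe_width in (1, 4, 40):   # pixels per stripe-shaped image component
    n_components = sensor_rows // stripe_width
    total_time   = n_components * exposure_per_comp
    print(f"width {stripe_width:>2} px -> {n_components:>4} components, "
          f"scene must stay still for ~{total_time:.1f} s")
# width  1 px -> 4000 components, ~8.0 s
# width  4 px -> 1000 components, ~2.0 s
# width 40 px ->  100 components, ~0.2 s
```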
  • the image taking apparatus comprises at least one of: an image-forming optical system, in particular a detachable image-forming optical system; an exposure time definition unit; an image taking module comprising an image taking element, in particular a detachable image taking module.
  • It may be a camera system, in particular a photographic camera, a general-purpose photographic camera, a still image camera, more particularly a single-lens reflex camera .
  • said image taking apparatus is free from mirrors, which are arranged, for at least one of said image components, along a light path along which light used for obtaining said at least one image component travels within said image taking apparatus to said image taking element during obtaining said at least one image component.
  • said image taking apparatus comprises at least one of the group comprising
  • a first storage unit for storing data representative of said set of image components
  • a second storage unit which may be identical with or different from said first storage unit, for storing data representative of said pre-defined variation of said at least one imaging parameter
  • a third storage unit which may be identical with or different from said first and/or second storage units, for storing data relating each of said image components of said set to said settings of said at least one imaging parameter used during obtaining the respective image component.
  • Said data can be digital data.
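  • As an illustration of the kind of record such storage units might hold (field names are purely hypothetical), each stored image component can be linked to the settings used while it was obtained:

```python
from dataclasses import dataclass, field

@dataclass
class ImageComponentRecord:
    """One entry relating a stored image component to the imaging-parameter
    settings used while it was obtained (illustrative field names)."""
    component_index: int
    sensor_region: tuple                           # e.g. (row_start, row_end) exposed
    timestamp_s: float                             # time at which the exposure took place
    settings: dict = field(default_factory=dict)   # e.g. {"focus": 1.2, "zoom": 50.0}

record = ImageComponentRecord(0, (0, 3), 0.000, {"focus": 1.2})
```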
  • said image taking apparatus comprises, as image taking element, a line scanner and a drive functionally connected to said line scanner, wherein said control module is adapted to controlling said drive such that said line scanner is moved during obtaining said set of image components, so as to define for each of said image components the corresponding fractions of said scene.
  • said image taking apparatus comprises an exposure time definition unit.
  • By means of an exposure time definition unit it is defined when and for how long an exposure of the image taking element takes place, or, more precisely, when and for how long photons are collected by the image taking element.
  • exposure time definition units can be, e.g., shutters (shutter curtains) and/or apertures. It is possible that an image taking element itself implements an exposure time definition unit, e.g., if the photo-sensitivity of the image taking element (or portions thereof) can be switched on and off. It can be useful to provide an exposure time definition unit in addition to a sensing area definition element.
  • the methods according to the invention may also be considered methods of operating an image taking apparatus, in particular, methods of operating an image taking apparatus for obtaining one final image, which one final image is composed of a multitude of picture constituents derived from a set of image components.
  • Said picture (or final image) of said scene can be considered a mosaic-like composition of the picture constituents.
  • said set of image components may be interpreted as forming a mosaic-like pattern from all its image components, wherein each image component corresponds to one mosaic-piece-like fraction of said scene.
  • the image components may be considered partial images or mosaic image fractions .
  • the mosaic-like arrangement of picture components represented by the final image is usually identical with the mosaic-like arrangement that can meaningfully be formed from the set of image components or at least obtainable therefrom by relatively simple geometric transformations, which usually do not alter the neighboring-relationships of the mosaic parts.
  • the scene and its illumination shall remain unchanged during a time span Δt, during which all image components of the set are derived, and also the image taking apparatus should not be moved within that time.
  • the advantages of the methods correspond to the advantages of corresponding apparatuses.
  • Fig. 1 an illustration of a simple embodiment with focus variation
  • Fig. 2 an illustration of a simple embodiment with focus variation
  • Fig. 3 an image taking apparatus with focus variation
  • Fig. 4 an illustration of a simple embodiment with "electronic shutter"
  • Fig. 5 an illustration of a camera system with computer and storage unit
  • Fig. 6 an illustration of a simple embodiment with sensing area definition element
  • Fig. 7 an illustration of a simple embodiment with a line scanner
  • Fig. 8 an illustration of image components and corresponding imaging parameter settings over time
  • Fig. 9 an illustration of image components, picture constituents and final image
  • Fig. 10 an illustration of a mosaic pattern of picture constituents
  • Fig. 11 an illustration of a mosaic pattern of picture constituents
  • Fig. 12 an illustration of exposures and imaging parameter settings over time
  • Fig. 13 an illustration of exposures and imaging parameter settings over time
  • Fig. 14 an illustration of exposures and imaging parameter settings over time
  • Fig. 15 an illustration of exposures and imaging parameter settings over time
  • Fig. 16 an image taking apparatus with focus and aperture opening variation
  • Fig. 17 an illustration of a simple embodiment with focus variation and image taking element tilting, in a side view
  • Fig. 18 an illustration of the simple embodiment with focus variation and image taking element tilting of Fig. 17, in a top view;
  • Fig. 19 an illustration of a simple embodiment with focus variation, image taking element tilting and sensing area definition element tilting, in a side view;
  • Fig. 20 an illustration of the simple embodiment with focus variation, image taking element tilting and sensing area definition element tilting of Fig. 19, in a top view.
  • Fig. 1 is an illustration of a simple embodiment in which an imaging parameter, namely the focus setting, is varied.
  • a scene 99 (or an object 99) is imaged onto an image taking element 60 by means of an image-forming optical system 20, which is drawn as a single lens, which at the same time is a focussing section 29.
  • a control module 4 can control a drive 28 (dash-dotted lines indicate functional connections) such that the focus setting is changed, illustrated by the dotted lenses connected by the dotted arrows.
  • As the focussing section 29 is moved step-wise from an initial position at time t0 to a final position at tf, via several intermediate steps, one of which is drawn at tn, the location of maximum sharpness of the image will also move, as illustrated by the small image sections labelled with the corresponding point in time (t0, tn, tf).
  • Said control module 4 controls said image taking element 60 such that at different times ti different portions of the image taking element 60 capture a partial image representing only a fraction of said scene 99.
  • Such a partial image is referred to as an image component Ri (indices of image components corresponding to indices of points in time).
  • Fig. 2 is an illustration of the same simple embodiment as in Fig. 1. This time, the object 99 is tilted.
  • By choosing the focus settings properly and controlling the capturing of the image components Ri accordingly, it is possible to obtain a final image (picture) of the (tilted) surface of the object 99, which shows the surface in maximum sharpness throughout the whole picture.
  • the maximum sharpness of the image of the surface is achieved in the image plane 86 of the image taking element 60.
  • control module 4 will usually be programmed by the photographer before time t0, i.e., before starting to capture said image components R0...Rf, so that the described focus variation is a pre-defined variation of at least one imaging parameter.
  • Fig. 3 shows schematically an image taking apparatus 1 with focus variation, similar to the embodiment of Figs. 1 and 2.
  • the image taking apparatus, e.g., a camera, comprises an image taking module 6, e.g., a digital back, which comprises the image taking element 60, e.g., a CCD chip or a CMOS chip.
  • Fig. 3 further illustrates that it is possible to define that area of image taking element 60, which is to be exposed for obtaining a certain image component, by means of a sensing area definition element 58, which is separate from image taking element 60.
  • a sensing area definition element 58 is usually located close to image taking element 60.
  • Sensing area definition element 58 comprises a transparent portion 580 which lets light pass and an opaque portion which blocks light from travelling further.
  • the sensing area definition element 58 of the embodiment of Fig. 3 can, e.g., be a liquid crystal element with several or better with a multitude of electrode pairs, which pairs are arranged, e.g., in a regular array.
  • the shape of an image component Rn to be captured (and, therewith, of a fraction of scene 99 to be captured in said image component) can then easily be defined by selecting one or more of said electrode pairs and not applying a voltage to the selected electrode pair(s) while applying a voltage to the other electrode pairs.
  • the application of a voltage will provoke an ordered arrangement of liquid crystals between the corresponding electrode pairs, which causes intransparency.
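  • A minimal sketch of how such an electrode array could be driven (geometry and names are assumptions, not taken from the patent): voltage applied means ordered liquid crystal and hence an opaque cell; no voltage means a transparent cell through which the sensor portion behind it is exposed.

```python
import numpy as np

def electrode_voltages(n_rows, n_cols, transparent_rows):
    """Voltage pattern for a liquid-crystal sensing area definition element
    with one electrode pair per cell (sketch; geometry is hypothetical).

    True  = voltage applied -> liquid crystal ordered -> cell opaque
    False = no voltage      -> cell transparent       -> sensor exposed there
    """
    apply_voltage = np.ones((n_rows, n_cols), dtype=bool)
    apply_voltage[transparent_rows, :] = False    # open a stripe-shaped window
    return apply_voltage

# Expose only rows 10..13 of the element for the current image component Rn
mask = electrode_voltages(64, 64, slice(10, 14))
```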
  • the data generated by image taking element 60 are stored in a storage unit 10, which may be separate from the camera or comprised therein.
  • the control module 4 ensures that exposure takes place at appropriate times ti and at appropriate portions of image taking element 60.
  • Figs. 4, 6 and 7 illustrate some such ways, wherein a moving lens is drawn at an intermediate time tn and, dashed, at an initial time t0 and at a final time tf, for illustrating the imaging parameter variation (e.g., focus setting, zoom setting, soft focus setting).
  • Fig. 4 illustrates a simple embodiment with an "electronic shutter".
  • A CMOS chip 60c (as image taking element) is controlled by control module 4 such that, as a function of time, different groups of pixels (photosensitive members 65) are in action, i.e., are in a photosensitive state, while the rest of the photosensitive members 65 are photo-insensitive, i.e., they do not register photons impinging on them.
  • Each row of pixels is switched active in turn, from top to bottom of the CMOS chip 60c, which altogether takes a time Δt.
  • Meanwhile, the imaging parameter is varied.
  • the resulting image components R0, ...Rn, ...Rf are stored in a storage unit 10, e.g., after all rows have been active (all data of the image components are collected), or image component by image component, i.e., data are read out into storage unit 10 each time after one row has been switched back to the inactive state.
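  • The row-by-row operation sketched for Fig. 4 could look roughly as follows; the chip and lens-drive calls are a hypothetical driver API used only to illustrate the sequencing, not an actual interface of any camera:

```python
import time

def capture_with_electronic_shutter(chip, lens_drive, program, row_exposure_s):
    """Row-by-row capture with an 'electronic shutter' (hypothetical driver API).

    For each image component, one row (or group of rows) of the CMOS chip is
    switched photosensitive while the imaging parameter (here: focus) is set to
    the pre-defined value for that component; all other rows stay inactive.
    """
    components = []
    for row_index, focus_setting in program:           # pre-defined variation
        lens_drive.set_focus(focus_setting)             # assumed driver call
        chip.set_sensitive_rows([row_index])            # only this row collects photons
        time.sleep(row_exposure_s)                      # exposure of this component
        components.append(chip.read_rows([row_index]))  # image component Ri
    chip.set_sensitive_rows([])                         # all rows inactive again
    return components
```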
  • Fig. 6 illustrates a simple embodiment with a sensing area definition element 58, e.g., embodied as a mechanical shutter.
  • A photochemical converter 60b, like a photographic film 60b, may be used as image taking element; a two-dimensional photoelectric converter could be used, too.
  • the control module 4 controls (via drive 59) sensing area definition element 58, e.g., shutter curtain 58, which defines, as a function of time, the locations on the photographic film 60b which shall be exposed to light.
  • the movement of the shutter opening may be step-wise or continuous.
  • the set of image components may be considered to contain an infinite number of image components, or a finite number of image components with the delimitation of one image component to a neighboring image component being choosable rather freely. It is possible to use the finally fully exposed photographic film 60b as the final image, or to subject it to further processing, before obtaining the final image. It is, e.g., possible to scan the finally fully exposed photographic film 60b and continue with the so-obtained data. Comparing the embodiment of Fig. 4 with the embodiment of Fig. 6, it is to be noted that the amount of controlling that has to be carried out for obtaining the image components can be considerably smaller in an embodiment of Fig. 6.
  • Fig. 7 is an illustration of a simple embodiment with a line scanner 60a.
  • the line scanner 60a (as image taking element) is moved, so as to be exposed to different parts (or fractions) of the scene to be imaged.
  • a drive for driving the line scanner 60a has not been drawn in Fig. 7.
  • a line scanner 60a is a (usually) linear arrangement of a multitude of photosensitive members (several hundreds or several thousands or more) . It may have more than one row in order to be able to obtain color information.
  • the movement may be continuous or (rather) step-wise, wherein data are read out after a certain exposure time. It is possible to provide a shutter as an exposure time definition element in the light path, which prevents light from impinging on the line scanner 60a when no new data shall be obtained, e.g., when data shall be read out.
  • a sensing area definition element such as a mechanical shutter may be used in conjunction with a CCD chip 60d or a CMOS chip 60c.
  • It is possible to provide an exposure time definition unit for generally allowing or prohibiting that an image taking element 60 is exposed to light.
  • a shutter can be used, which is opened before the first image component RO is taken and is closed after the last image component Rf has been taken.
  • At least some embodiments of the invention can be used for obtaining pictures of scenes with moving objects, which are substantially free from distortions caused by said moving of said objects.
  • In Fig. 8, the example shows stripe-shaped image components Ri, which are arranged in parallel. This can readily be realized in any of the embodiments of Figs. 4, 6 and 7.
  • the imaging parameter in the example of Fig. 8 is varied continuously, in a nonlinear fashion.
  • Fig. 9 illustrates the relation of image components Ri, picture constituents Ci and final image P. If no processing of the image components Ri is necessary, the image components Ri may be identical with the picture constituents Ci.
  • Fig. 9 shows a case, in which the
  • (lateral) magnification varies with the variation of the imaging parameter.
  • the dotted lines within the image components Ri indicate which part of each image component will be used in the corresponding picture constituent Ci; the outer part of the image components Ri can be discarded, and only image component Rf is used in full.
  • In this way, the picture constituents Ci are obtained.
  • the full set of picture constituents Ci corresponds to the picture P, which can be represented as data or as a print on (photo) paper.
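  • A sketch of such a derivation for stripe-shaped components (nearest-neighbour resampling and the relative-magnification convention are simplifying assumptions made only for illustration):

```python
import numpy as np

def derive_picture(components, rel_magnifications, out_width):
    """Derive picture constituents Ci from stripe-shaped image components Ri
    and assemble the final picture P.

    rel_magnifications are given relative to the reference component that is
    used in full (values <= 1), so for the other components only a central part
    is kept and their outer part is discarded, as in Fig. 9.
    """
    constituents = []
    for stripe, m in zip(components, rel_magnifications):
        n = stripe.shape[1]
        xs = (np.arange(out_width) - out_width / 2) * m + n / 2
        xs = np.clip(np.round(xs).astype(int), 0, n - 1)
        constituents.append(stripe[:, xs])   # picture constituent Ci
    return np.vstack(constituents)           # mosaic of stripes = picture P
```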
  • Figs. 10 and 11 are illustrations of example mosaic patterns of picture constituents Ci. This shall illustrate that, in fact, arbitrary mosaic patterns are possible. Only a small number of picture constituents Ci has been drawn in the Figures, whereas there will typically be of the order of 500 to of the order of 5000 or even more picture constituents Ci. For naturally-looking pictures, it is advisable to have a large number of picture constituents Ci, i.e., to have a maximum resolution, and to have smooth or no changes from one image component to a neighboring image component.
  • Fig. 10 illustrates that stripes do not need to be parallel to the image frame (they do not even have to be parallel to each other) , and that they do not need to have the same width.
  • Fig. 11 shall illustrate that any other mosaic (or puzzle-like) pattern is possible, e.g., a hillock-like one.
  • neighboring image components Ri (which will lead to neighboring picture constituents Ci) will be obtained in succession.
  • Figs. 12 to 15 illustrate some examples of how exposures and imaging parameter settings pi may change over time.
  • Fig. 12 illustrates discrete exposures with the same exposure time τi for each image component.
  • the imaging parameter is varied quasi-continuously or step-wise. It is possible to make imaging parameter steps only between two exposures, but it is also possible to make the imaging parameter setting steps independent of the timing of the discrete exposures, as shown in Fig. 12.
  • Fig. 13 illustrates, like Fig. 12, discrete exposures, but two imaging parameters, namely the focus setting (ai) and a zoom setting (si) of the image taking apparatus, are varied. This may be done for changing the focus and at the same time compensating for the change in magnification due to said change in focus, which could render a processing of so-obtained image components for deriving picture constituents therefrom superfluous.
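  • Under a thin-lens assumption (a real zoom lens would need its own calibration data), the compensating zoom setting can be sketched as follows; this is only meant to illustrate why coupling the two parameters can keep the magnification constant:

```python
def compensating_focal_length(object_distance_mm, target_magnification):
    """Focal length (zoom setting) that keeps the lateral magnification constant
    while the in-focus distance is varied, using the thin-lens relation
    m = f / (u - f), i.e. f = m*u / (1 + m)."""
    m, u = target_magnification, object_distance_mm
    return m * u / (1.0 + m)

# Keep m = 0.02 while the in-focus distance moves from 1 m to 3 m
for u in (1000.0, 2000.0, 3000.0):
    print(u, round(compensating_focal_length(u, 0.02), 1))   # 19.6, 39.2, 58.8 mm
```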
  • Each imaging parameter can, e.g., be varied continuously or quasi-continuously.
  • Fig. 14 illustrates discrete exposures with varying exposure times τi for different image components.
  • the imaging parameter settings (p0...pf) partially vary continuously, but also show a step and a constant region.
  • Fig. 15 illustrates a continuous exposure, as it may be realized, e.g., in an embodiment with a continuously moving opening of a sensing area definition element such as an opening of a shutter (cf. Fig. 6).
  • An imaging parameter (p0...pf), e.g., a soft focus setting, is varied.
  • Fig. 16 shows schematically an image taking apparatus 1.
  • This exemplary apparatus 1 is a modular single-lens reflex camera 1. It has the following parts, which are all (optionally) detachable: a lens module 2, a focussed-state detection module 3, a control unit 40, an optional adapter plate 6', an image taking module 6 and a focussed-state detection module 7.
  • the lens module 2 corresponds to an image-forming optical system 20 comprising a number of lenses 21 and an aperture 22 and possibly a shutter (not shown) .
  • a part of the lenses 21 forms a part of a focussing section 29, which also comprises a drive 28 (for focussing) .
  • the drive 28 does not have to be arranged at or within the lens barrel 2.
  • a drive 23 may be provided, which allows to open and close the aperture 22 and adjust its opening.
  • the lens barrel 2 is attached to a focussed-state detection module 3, which in the camera of Fig. 16 is at the same time a mirror module containing a mirror arrangement comprising a main mirror 35 and an auxiliary mirror 36.
  • the focussed-state detection module 7 is embodied as a view finder module 7 attached to the mirror module 3.
  • the focussed-state detection module 7 may, in general, present images for example optically or electro- optically.
  • the thick wavy line represents the image of the object 99 in the image plane 87 of focussing screen 70.
  • the camera 1 comprises an autofocus sensor 30.
  • Light from the object 99 reaches the autofocus sensor 30 on a light path 9a ' through the main mirror 35 and via reflection at the auxiliary mirror 36.
  • an image is formed in an image plane 83 of the autofocus sensor 30.
  • the optical path length from object 99 to the image plane 83 of the autofocus sensor 30 is the same as the optical path length from object 99 to the image plane 87 of the focussing screen 70.
  • the mirror arrangement (main mirror 35 and auxiliary mirror 36) is moved as indicated by the small arrow.
  • This lets the light pass along a light path 9b through the control unit 40, which contains a sensing area definition element 58, e.g., a shutter 58, and a control module 4 (or control circuit) embodied in a microprocessor μP.
  • the shutter 58 and the control module 4 do not necessarily have to be arranged within the control unit 40.
  • the control module 4 may control the drive 28, the aperture 22 (via the drive 23), a mechanism for moving the mirror arrangement (not shown) and the shutter 58 and other functions of the image taking apparatus. It may receive input from the autofocus sensor
  • the functional connections of the control module 4 to the various units and elements are not shown in Fig. 16.
  • the light will pass the adapter plate 6' and impinge on an image taking element 60 of the image taking module 6, which is embodied as a digital back 6 with a CCD or CMOS chip 60 and comprises a storage unit 10 (memory) .
  • the image plane of the image taking element 60 is labelled 86.
  • At least the following imaging parameters can be varied while obtaining the image components: parameters of the shutter 58 (movement, slit widths) , focus settings (via drive 28) and opening of aperture 22 (via drive 23) .
  • other imaging parameters and parts or elements of the camera 1 could be controlled by the control module 4 in order to vary these during obtaining the image components.
  • the photographer could select, manually or automatically (autofocus or the like) a setting of the at least one imaging parameter to be varied, e.g., a focus setting and/or a zoom setting. Then, the user marks, on a screen, a point or an area in the scene, at which this setting should be used, e.g., in the middle of the top of the screen. Then the same is done for a second setting and a second point or area in the scene, e.g. in the middle of the bottom of the screen. It is possible to input further settings and points / areas.
  • a suitable fitting mode could be selected, which defines the algorithm (fitting procedure for interpolation / extrapolation) to be used for obtaining the mosaic patterns and the imaging parameter variation from the input.
  • one mode could be: finest-resolution and horizontally-oriented stripe- shaped image components from top to bottom, and polynomial interpolation between the imaging parameter settings.
  • the "exposure program” is defined, and upon a start signal, e.g., pressing the release button of the camera, the image components are obtained according to the "exposure program” and stored in a storage unit. This way of defining the "exposure program” (and providing the input for that) could even be realized with the camera alone.
  • the data from the image taking element 60 could be output on a screen of the camera, and points could be set, e.g., by means of a tracking ball or cursor keys.
  • a software for defining the instructions can be implemented in the camera itself and/or can be run on a (separate) computer connected (at least temporarily) to the camera.
  • Fig. 5 is an illustration of a camera system with computer 100 and storage unit 10.
  • the storage unit 10 may be part of the computer 100 and contain image components and/or picture constituents and/or final images (pictures).
  • a software for defining exposure programs and/or software for obtaining final images (from image components and/or picture constituents) and even software embodying the functions of the control module 4 (for remote-controlling the camera) may be installed and run.
  • Fig. 17 schematically illustrates, in a side view, a simple embodiment with a variation of an imaging parameter and with image taking element tilting.
  • Said imaging parameter can be a focus setting, a zoom setting, a soft focus setting or another imaging parameter. It is symbolized by a moving lens 21 and will usually also comprise a movement of at least one lens 21.
  • a sensing area definition element 58 having a transparent portion 580, and an image taking element 60 are shown at two points in time tl,t2.
  • the arrangement at time tl is drawn in solid lines, the arrangement at time t2 is drawn in thick dotted lines.
  • sensing area definition element 58 can, e.g., be embodied as a shutter curtain.
  • image taking element 60 may be tilted (inclined) by a certain unchanged angle during obtaining the set of image components; e.g., image taking element 60 is brought into such a tilted position shortly before the set of image components is obtained.
  • image taking element 60 is not kept in a fixed position (tilted or not tilted) during obtaining the set of image components, but the tilting is varied (in angle and/or direction) during obtaining said set of image components.
  • the effect of varying the tilting angle during obtaining said set of image components is readily understood: Assuming that a flat surface aligned perpendicularly to optical axis A is to be imaged, the slice-shaped image component R1 obtained at time t1 has, as indicated in the right part of Fig. 17 by dots having the same size, the same degree of sharpness or blur across itself. But the slice-shaped image component R2 obtained at time t2 (with image taking element 60 tilted) has, as indicated in the right part of Fig. 17 by dots of varying size, a degree of sharpness or blur that varies across itself.
  • Fig. 18 shows the embodiment of Fig. 17 in a top view.
  • the axis of rotation of image taking element 60 can, as shown in Figs. 17 and 18, run centrally through image taking element 60, but it could also be a different axis, e.g., one running along an edge of image taking element 60.
  • the movements of the image taking element can be accomplished using a drive, e.g., a rotatory or a linear drive, e.g., piezo driven.
  • Figs. 19 and 20 show, in side view and in top view, respectively, an embodiment which is similar to the one shown in Figs. 17 and 18, but along with the image taking element, also the sensing area definition element 58 is tilted.
  • This can be advantageous in particular if - as often will be the case - the sensing area definition element 58 is arranged close to image taking element 60.
  • the sensing area definition element 58 can be separate from image taking element 60 or can be fixed to image taking element 60. In the latter case, the size of sensing area definition element 58 with respect to the size of image taking element 60 has to be chosen sufficiently large, so as to allow for proper exposure of image taking element 60 when tilted, also in the peripheral parts of image taking element 60.
  • the tilting of image taking element 60 can be accomplished and used in the same fashion as discussed in conjunction with Figs. 17 and 18.
  • a tilting of image taking element 60 can also be accomplished when the image taking element is a line sensor (cf., e.g., Fig. 7) .
  • a sensing area definition element can be dispensed with or can be used, e.g., with a transparent portion being moved along the extension of the line sensor.
  • 2 lens module, objective module, lens barrel
  • 4 control module, control circuit
  • 10 storage unit
  • 58 sensing area definition element, aperture element, aperture, slit-shaped aperture, means for defining a portion of the image taking element to be exposed, shutter
  • 580 transparent portion of sensing area definition element, opening of sensing area definition element
  • 83 image plane of focussed-state detection arrangement, image plane of autofocus sensor
  • 86 image plane of image taking element
  • Ci picture constituent
  • P picture, final image
  • pi imaging parameter setting

Abstract

The present invention relates to a method of forming a picture (P) of a scene (99) using an image taking apparatus (1) comprising an image taking element (60) and an image-forming optical system (20) comprising at least one lens (21). The method comprises: a) obtaining a set of image components (Ri), said set of image components (Ri) containing information about the scene (99) in its entirety; b) carrying out a pre-defined variation of at least one imaging parameter during step a), wherein step a) comprises a1) successively obtaining the image components (Ri) of the set and a2) capturing a substantially different fraction of the scene (99) in each of the image components (Ri), and step b) comprises b1) carrying out a relative movement of the lens (21) with respect to the image taking element (60) during step a), the imaging parameter being, for each image component (Ri), substantially constant while the respective image component (Ri) is obtained. The imaging parameter may be a focus setting, in which case the method makes it possible to obtain pictures with pre-determined distributions of sharpness and blur.
PCT/EP2007/055338 2007-05-31 2007-05-31 Procedé de formation d'une image et dispositif de prise de vues à fonctions d'imagerie ameliorees WO2007101887A2 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2007/055338 WO2007101887A2 (fr) 2007-05-31 2007-05-31 Procedé de formation d'une image et dispositif de prise de vues à fonctions d'imagerie ameliorees

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2007/055338 WO2007101887A2 (fr) 2007-05-31 2007-05-31 Procedé de formation d'une image et dispositif de prise de vues à fonctions d'imagerie ameliorees

Publications (2)

Publication Number Publication Date
WO2007101887A2 true WO2007101887A2 (fr) 2007-09-13
WO2007101887A3 WO2007101887A3 (fr) 2008-04-03

Family

ID=38475219

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2007/055338 WO2007101887A2 (fr) 2007-05-31 2007-05-31 Procedé de formation d'une image et dispositif de prise de vues à fonctions d'imagerie ameliorees

Country Status (1)

Country Link
WO (1) WO2007101887A2 (fr)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5907353A (en) * 1995-03-28 1999-05-25 Canon Kabushiki Kaisha Determining a dividing number of areas into which an object image is to be divided based on information associated with the object
US6535250B1 (en) * 1997-06-12 2003-03-18 Minolta Co., Ltd. Image pickup apparatus
US20020054217A1 (en) * 2000-11-07 2002-05-09 Minolta Co., Ltd. Method for connecting split images and image shooting apparatus
EP1553521A1 (fr) * 2002-10-15 2005-07-13 Seiko Epson Corporation Traitement de synthese de panorama d'une pluralite de donnees d'images
JP2005039680A (ja) * 2003-07-18 2005-02-10 Casio Comput Co Ltd カメラ装置、及び撮影方法、プログラム
US20050178950A1 (en) * 2004-02-18 2005-08-18 Fujinon Corporation Electronic imaging system

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2059024A3 (fr) * 2007-10-29 2009-05-20 Ricoh Company, Ltd. Dispositif de traitement d'images, procédé de traitement d'images et produit de programme informatique
CN101426093B (zh) * 2007-10-29 2011-11-16 株式会社理光 图像处理设备和图像处理方法
CN114793269A (zh) * 2022-03-25 2022-07-26 岚图汽车科技有限公司 摄像头的控制方法及相关设备

Also Published As

Publication number Publication date
WO2007101887A3 (fr) 2008-04-03

Similar Documents

Publication Publication Date Title
US8432481B2 (en) Image sensing apparatus that controls start timing of charge accumulation and control method thereof
US9591246B2 (en) Image pickup apparatus with blur correcting function
JP5901246B2 (ja) 撮像装置
JP5645846B2 (ja) 焦点調節装置及び焦点調節方法
JP5254904B2 (ja) 撮像装置及び方法
US8063944B2 (en) Imaging apparatus
KR101280248B1 (ko) 카메라 및 카메라 시스템
JP5168798B2 (ja) 焦点調節装置および撮像装置
US8854533B2 (en) Image capture apparatus and control method therefor
JP4948266B2 (ja) 撮像装置及びその制御方法
EP2035891A1 (fr) Procédé et système pour la stabilisation d'image
JP2011081271A (ja) 撮像装置
JP2001042207A (ja) 電子カメラ
JP6997295B2 (ja) 撮像装置、撮像方法、及びプログラム
WO2007101887A2 (fr) Procedé de formation d'une image et dispositif de prise de vues à fonctions d'imagerie ameliorees
JP2014130231A (ja) 撮像装置、その制御方法、および制御プログラム
CN101505370B (zh) 摄像设备
JP4309716B2 (ja) カメラ
JPH11223761A (ja) 焦点検出装置付きカメラ
JP4135202B2 (ja) 焦点検出装置及びカメラ
JP6477275B2 (ja) 撮像装置
JP2016057402A (ja) 撮像装置及びその制御方法
JP2018042145A (ja) 撮像装置及びその制御方法、プログラム、記憶媒体
JP2011176457A (ja) 電子カメラ
JP4106725B2 (ja) 焦点検出装置及び焦点検出装置付きカメラ

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07729740

Country of ref document: EP

Kind code of ref document: A2