WO2015158953A1 - Transforms for image stabilization and refocusing - Google Patents

Transforms for image stabilization and refocusing

Info

Publication number
WO2015158953A1
WO2015158953A1 PCT/FI2015/050197
Authority
WO
WIPO (PCT)
Prior art keywords
image sensor
optical devices
image
relative positioning
camera
Prior art date
Application number
PCT/FI2015/050197
Other languages
English (en)
Inventor
Johan Windmark
Anders MÅRTENSSON
Original Assignee
Nokia Technologies Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Technologies Oy filed Critical Nokia Technologies Oy
Publication of WO2015158953A1

Links

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N 23/682 Vibration or motion blur correction
    • H04N 23/683 Vibration or motion blur correction performed by a processor, e.g. controlling the readout of an image memory
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/64 Imaging systems using optical elements for stabilisation of the lateral and angular position of the image
    • G02B 27/646 Imaging systems using optical elements for stabilisation of the lateral and angular position of the image compensating for small deviations, e.g. due to vibration or shake
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N 23/681 Motion detection
    • H04N 23/6812 Motion detection based on additional sensors, e.g. acceleration sensors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N 23/682 Vibration or motion blur correction
    • H04N 23/685 Vibration or motion blur correction performed by mechanical compensation

Definitions

  • Embodiments of the present invention relate to imaging. In particular, they relate to adjusting image frames by performing a transform.

BACKGROUND
  • one or more optical devices are configured to convey light to an image sensor. At least one optical device may be movable relative to the image sensor to allow for refocusing. The image sensor and/or at least one of the optical devices may be movable relative to one another in order to provide optical image stabilization.
  • a method comprising: causing an adjustment to the relative positioning of an image sensor of a camera and one or more optical devices of the camera which convey light to the image sensor; causing at least part of an image frame to be captured, using the image sensor, while the image sensor and the one or more optical devices are in the adjusted relative positioning; determining the adjusted relative positioning of the image sensor and the one or more optical devices; and adjusting the image frame by performing a transform on the image frame using the determined adjusted relative positioning of the image sensor and the one or more optical devices.
  • an apparatus comprising: means for causing an adjustment to the relative positioning of an image sensor of a camera and one or more optical devices of the camera which convey light to the image sensor; means for causing at least part of an image frame to be captured, using the image sensor, while the image sensor and the one or more optical devices are in the adjusted relative positioning; means for determining the adjusted relative positioning of the image sensor and the one or more optical devices; and means for adjusting the image frame by performing a transform on the image frame using the determined adjusted relative positioning of the image sensor and the one or more optical devices.
  • an apparatus comprising: at least one processor; and at least one memory storing computer program code configured, working with the at least one processor, to cause: causing an adjustment to the relative positioning of an image sensor of a camera and one or more optical devices of the camera which convey light to the image sensor; causing at least part of an image frame to be captured, using the image sensor, while the image sensor and the one or more optical devices are in the adjusted relative positioning; determining the adjusted relative positioning of the image sensor and the one or more optical devices; and adjusting the image frame by performing a transform on the image frame using the determined adjusted relative positioning of the image sensor and the one or more optical devices.
  • a non-transitory computer readable medium storing computer program instructions that, when performed by at least one processor, cause: causing an adjustment to the relative positioning of an image sensor of a camera and one or more optical devices of the camera which convey light to the image sensor; causing at least part of an image frame to be captured, using the image sensor, while the image sensor and the one or more optical devices are in the adjusted relative positioning; determining the adjusted relative positioning of the image sensor and the one or more optical devices; and adjusting the image frame by performing a transform on the image frame using the determined adjusted relative positioning of the image sensor and the one or more optical devices.
  • a method comprising: responding to rotational movement of a camera by performing a transform on an image frame, in order to at least partially compensate for a change in perspective of the camera caused by the rotational movement.
  • an apparatus comprising: means for responding to rotational movement of a camera by performing a transform on an image frame, in order to at least partially compensate for a change in perspective of the camera caused by the rotational movement.
  • an apparatus comprising: at least one processor; and at least one memory storing computer program code configured, working with the at least one processor, to cause: responding to rotational movement of a camera by performing a transform on an image frame, in order to at least partially compensate for a change in perspective of the camera caused by the rotational movement.
  • a non-transitory computer readable medium storing computer program instructions that, when performed by at least one processor, cause: responding to rotational movement of a camera by performing a transform on an image frame, in order to at least partially compensate for a change in perspective of the camera caused by the rotational movement.
  • Fig. 1 illustrates an apparatus
  • Fig. 2 illustrates a further apparatus
  • Fig. 3 illustrates an optical arrangement comprising an image sensor and an optical device
  • Fig. 4 illustrates rotational movement of a camera
  • Fig. 5 illustrates a scene being imaged using optical image stabilization
  • Fig. 6A illustrates a schematic showing an image sensor of the further apparatus imaging an object in a scene
  • Fig. 6B illustrates the image sensor of the further apparatus shown in figure 6A undergoing translational movement for optical image stabilization, which is performed to compensate for rotational movement of the further apparatus;
  • Fig. 6C illustrates a virtual plane indicating how the image sensor shown in figure 6A would have had to have moved in order to fully compensate for the rotational movement of the further apparatus;
  • Fig. 6D is a schematic illustrating the image sensor illustrated in figure 6B, a first plane at the scene representing the scene imaged by the image sensor, the virtual plane illustrated in figure 6C and a second plane at the scene representing how the scene would have been imaged if the image sensor were positioned at the virtual plane;
  • Figures 7A and 7B illustrate an image plane of the further apparatus after optical image stabilization has been performed to compensate for rotational movement of the further apparatus, and a projection plane which represents where the image plane would have to be positioned in order to fully compensate for the rotational movement of the further apparatus;
  • Figure 8 illustrates a first flow chart of a method
  • Figure 9 illustrates a second flow chart of a method.
  • Embodiments of the invention relate to adjusting an image frame by performing a mathematical transform on the image frame.
  • a transform is performed in order to at least partially compensate for rotational movement of a camera which may, for example, occur due to user handshake.
  • a transform is performed in response to a change in the field of view of the camera which occurs during refocusing of the camera.
  • Figure 1 illustrates an apparatus 10 that may be a chip or a chipset. The apparatus 10 may form part of an electronic device such as that illustrated in figure 2.
  • the apparatus 10 comprises at least one processor 12 and at least one memory 14.
  • the at least one processor 12 may, for example, include a central processing unit (CPU), a graphics processing unit (GPU) and/or an optical image stabilization controller.
  • the processor 12 is configured to read from and write to the memory 14.
  • the processor 12 may comprise an output interface via which data and/or commands are output by the processor 12 and an input interface via which data and/or commands are input to the processor 12.
  • the memory 14 is illustrated as storing a computer program 17 which comprises computer program instructions/code 18 that control the operation of the apparatus 10 when loaded into the processor 12.
  • the processor 12, by reading the memory 14, is able to load and execute the computer program code 18.
  • the computer program code 18 provides the logic and routines that enable the apparatus 10 to perform the methods illustrated in figures 8 and 9 and described below.
  • the processor 12 and the computer program code 18 provide means for performing the methods illustrated in figures 8 and 9 and described below.
  • Although the memory 14 is illustrated as a single component in figure 1, it may be implemented as one or more separate components, some or all of which may be integrated/removable and/or may provide permanent/semi-permanent/dynamic/cached storage.
  • the computer program code 18 may arrive at the apparatus 10 via any suitable delivery mechanism 28.
  • the delivery mechanism 28 may be, for example, a non-transitory computer-readable storage medium such as an optical disc or a memory card.
  • the delivery mechanism 28 may be a signal configured to reliably transfer the computer program code 18.
  • the apparatus 10 may cause the propagation or transmission of the computer program code 18 as a computer data signal.
  • Figure 2 illustrates an apparatus 20 in the form of an electronic device.
  • the apparatus 20 has camera functionality and may be referred to hereinafter as a camera.
  • the electronic device 20 may have other functionality in addition to camera functionality.
  • the apparatus 20 may, for example, function as a mobile telephone, a tablet computer, a games console or a personal music player. In some cases, the camera function may not be the primary function of the apparatus 20.
  • the example of the apparatus 20 illustrated in figure 2 includes one or more optical devices 24, an image sensor 26, position detection circuitry 22, position adjustment circuitry 23 and the apparatus 10 illustrated in figure 1 co-located in a housing 25.
  • the apparatus 20 optionally includes one or more motion detectors 27 and might, for example, comprise other elements such as a display and/or a radio frequency transceiver.
  • the elements 12, 14, 22, 23, 24, 26 and 27 are operationally coupled and any number or combination of intervening elements can exist between them (including no intervening elements).
  • the one or more optical devices 24 are configured to convey light to the image sensor 26.
  • the one or more optical devices 24 may, for example, comprise one or more lenses and/or one or more mirrors. In some embodiments, the one or more optical devices 24 consist merely of a single lens.
  • Figure 2 schematically illustrates light 6 entering the housing 25 of the apparatus 20 via an aperture 21 and being conveyed to the image sensor 26.
  • the image sensor 26 is configured to convert incident light into digital image data.
  • the processor 12 is configured to retrieve image data captured by the image sensor 26 and process it. In this regard, the image sensor 26 is configured to provide an input to the processor 12.
  • the image sensor 26 may be any type of image sensor. It may, for example, be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor.
  • the position adjustment circuitry 23 is configured to adjust the relative positioning of the image sensor 26 and at least one of the optical devices 24. In this regard, the position adjustment circuitry 23 may adjust the position of the image sensor 26 and/or at least one of the optical devices 24.
  • the position adjustment circuitry 23 may, for example, comprise at least one electric motor that is configured to move the image sensor 26 and/or at least one of the optical devices 24.
  • the image sensor 26 and/or at least one of the optical devices 24 may be moved for the purpose of performing optical image stabilization. Alternatively or additionally, the image sensor 26 and/or at least one of the optical devices 24 may be moved to perform focusing and refocusing.
  • the position detection circuitry 22 is configured to detect the position of the image sensor 26 and/or at least one of the optical devices 24.
  • the position detection circuitry 22 might, for example, comprise a Hall effect sensor.
  • the position detection circuitry 22 might be configured to determine the position of the image sensor 26 and/or at least one of the optical devices 24 periodically.
  • the position detection circuitry 22 is also configured to provide a signal indicating the position of the image sensor 26 and/or at least one of the optical devices 24 to the processor 12.
  • the one or more motion sensors 27 may, for example, comprise one or more accelerometers and/or one or more gyroscopes.
  • the one or more motion sensors 27 may provide inputs to the processor 12 that are indicative of the motion of the apparatus 20 (for example, during still image or video image capture).
  • Figure 3 illustrates an example of an optical arrangement in which the one or more optical devices 24 consist of a single lens.
  • the optical axis of the optical arrangement is illustrated by a dotted line 28 in figure 3.
  • Cartesian coordinate axes 70 have also been illustrated in figure 3.
  • the image sensor 26 illustrated in figure 3 is substantially planar in nature and extends in the x and y dimensions.
  • the position adjustment circuitry 23 may be configured to move the image sensor 26 and/or the optical device 24 to perform optical image stabilization.
  • the position adjustment circuitry 23 may be configured to move the image sensor 26 laterally relative to the optical device 24 and relative to the direction of propagation of light from the optical device 24 (that is, in the +x, -x, +y and -y directions).
  • the position adjustment circuitry 23 might be configured to move the optical device 24 laterally relative to the image sensor 26 and relative to the direction of propagation of light from the optical device 24 (that is, in the +x, -x, +y and -y directions).
  • the position adjustment circuitry 23 might be configured such that it cannot or does not rotate the image sensor 26 or the optical device(s) 24.
  • the position adjustment circuitry 23 might be configured in this way in order to save space inside the housing 25.
  • the position adjustment circuitry 23 may be configured to move the image sensor 26 and/or the optical device 24 to perform focusing or refocusing.
  • the position adjustment circuitry 23 might be configured to move the image sensor 26 toward and away from the optical device 24 (that is, in the +z and -z directions).
  • Alternatively, the position adjustment circuitry 23 might be configured to move the optical device 24 toward and away from the image sensor 26 (that is, in the -z and +z directions).
  • Figure 4 illustrates the further apparatus/camera 20 being directed towards a scene 50.
  • the user is capturing a video of the scene 50 using the camera 20.
  • the triangle labeled with the reference numeral 33 represents a cone of light which enters the aperture 21 of the camera 20 when the camera 20 is in its initial position. Due to user handshake, the camera 20 rotates in a direction indicated by the arrow labeled with the reference numeral 34 while video is being captured.
  • the cone of light entering the camera 20 after the rotation has occurred is labeled with the reference numeral 35 in figure 4.
  • Figure 5 illustrates how optical image stabilization compensates for the rotational motion illustrated in figure 4.
  • a plane 36 is illustrated which is representative of a region of the scene 50 that is imaged by the camera 20 prior to the rotation of the camera.
  • the processor 12 responds by causing the position adjustment circuitry 23 to make an adjustment to the relative positioning of the image sensor 26 and at least one optical device 24 to try to stabilize the video being captured.
  • the relative positioning of the image sensor 26 and at least one optical device 24 is adjusted by causing the image sensor 26 and/or at least one optical device 24 to undergo a translational movement (and not a rotational movement) as explained above in relation to figure 3. This means that adjustment of the relative positioning of the image sensor 26 and at least one optical device 24 cannot fully compensate for the rotation of the camera 20 using optical image stabilization.
  • the plane labeled with the reference numeral 37 indicates the imaged region of the scene 50 following rotation of the camera 20 and following adjustment of the relative positioning of the image sensor 26 and the optical device(s) 24.
  • the misalignment of the planes 36, 37 indicates that the image sensor 26 is effectively viewing the scene 50 from a slightly different perspective in each instance, showing that the optical image stabilization has not fully compensated for the rotation of the camera 20. This can manifest itself as “jitter” or "wobble” during video capture. This "jitter” is particularly prevalent at the periphery of the captured video.
  • this is remedied, at least partially, by adjusting captured image frames by performing a transform on the image frames.
  • the transform which is performed warps the image, such that the image seems as if it was captured from the same perspective as that "seen" by the image sensor 26 prior to the rotation of the camera 20.
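The warp described above can be illustrated as a projective (homography) transform. The following is a minimal sketch, not the patent's implementation: the focal length value, the single-axis rotation, and the pure-Python matrix helpers are all assumptions for illustration.

```python
import math

def rotation_y(theta):
    """3x3 rotation about the y axis by angle theta (radians)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def matmul(a, b):
    """Multiply two 3x3 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def warp_point(h, x, y):
    """Apply homography h to pixel (x, y); returns the warped pixel."""
    u = h[0][0] * x + h[0][1] * y + h[0][2]
    v = h[1][0] * x + h[1][1] * y + h[1][2]
    w = h[2][0] * x + h[2][1] * y + h[2][2]
    return u / w, v / w

f = 1000.0  # focal length in pixels (assumed value)
K = [[f, 0.0, 0.0], [0.0, f, 0.0], [0.0, 0.0, 1.0]]
K_inv = [[1.0 / f, 0.0, 0.0], [0.0, 1.0 / f, 0.0], [0.0, 0.0, 1.0]]

theta = math.radians(1.0)  # small handshake rotation
# H = K * R * K^-1 warps pixels as if the camera had not rotated.
H = matmul(matmul(K, rotation_y(theta)), K_inv)

# The principal point shifts by f*tan(theta), as expected for a pure rotation.
print(warp_point(H, 0.0, 0.0))
```

Mapping every pixel through such a homography "undoes" the perspective change that the translational optical image stabilization could not remove.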
  • FIG. 6A illustrates an image sensor 26 positioned in a housing 25 of a camera 20 that is imaging an object 51 in a scene 50.
  • the image sensor 26 is co-incident with the image plane of the one or more optical devices 24 in the camera 20.
  • Figure 6B illustrates a rotation of the camera 20 being simulated by moving the image sensor 26 in the direction indicated by the arrow 31 .
  • Figure 6C illustrates a virtual plane 39 which represents where the image plane/image sensor 26 would have to be positioned in order to fully compensate for the rotation that is simulated in figure 6B.
  • the virtual plane 39 is a rotation of the image plane/image sensor 26 in the direction illustrated by the arrow labeled with the reference numeral 32.
  • Figure 6D is a schematic illustrating the misalignment between the image plane/image sensor 26 shown in figure 6B and the virtual plane 39 shown in figure 6C, in two dimensions.
  • Figure 6D also illustrates a plane 63 which represents a region of the scene 50 that would be captured by an image sensor positioned at the virtual plane 39 and a plane 64 representing a region of the scene 50 that would be captured when the image sensor 26 is in the adjusted position illustrated in figure 6B.
  • the camera projection x_p of a point x onto an image plane/image sensor 26/P can be written as x_p = K(Rx + T), where K is a projection matrix, R is a rotation matrix and T is a translation.
  • the translation of the image sensor 26 is introduced into the projection matrix K1 as Δx and Δy, along with the focal length f of the optical device(s) 24:

        K1 = [ f   0   Δx ]
             [ 0   f   Δy ]
             [ 0   0   1  ]
  • x_p1 is a point on the image plane/image sensor 26/P1 which corresponds with a point x_p0 on the projection/virtual plane 39/P0, as illustrated in figure 7B.
  • From equations (8) and (9) we have a set of equations that we can use to map each pixel on the image plane/image sensor 26/P1 to the virtual/projection plane 39/P0.
  • the only unknowns are the angles θ and φ. If we apply the constraint that the center of the captured image is projected to the center of the projection, we obtain expressions for θ and φ in terms of the translations Δx, Δy and the focal length f (equation (9)).
  • Equation (8) can then be used to project the pixels on the image plane/image sensor 26/P1 (in a captured image) onto the virtual/projection plane 39/P0 in order to digitally adjust the image to compensate for the rotation of the camera 20.
  • embodiments of the invention provide a method of digitally compensating for the rotation of a camera 20 using a relatively straightforward computational technique.
  • the only variable inputs to the above process for projecting a captured image onto the virtual/projection plane 39/P0 are Δx and Δy, which relate to the translational movement the image sensor 26 and/or the optical device 24 undergoes during optical image stabilization.
  • the processor 12 of the camera 20 can therefore perform the transform using a simple input from the position detection circuitry 22.
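The angle recovery described above might be sketched as follows. The small-angle relation θ = arctan(Δx/f), φ = arctan(Δy/f) is an assumption derived from the center-projection constraint, not an equation quoted from the patent, and the K1 layout follows the description of the translations entering the projection matrix.

```python
import math

def angles_from_shift(dx, dy, f):
    """Recover tilt angles theta, phi from OIS sensor shifts dx, dy
    (assumed small-angle relation: optical center shifts by f*tan(angle))."""
    return math.atan2(dx, f), math.atan2(dy, f)

def k1(dx, dy, f):
    """Projection matrix K1 with the sensor translation folded in
    (assumed form: shifts enter the last column alongside f)."""
    return [[f, 0.0, dx],
            [0.0, f, dy],
            [0.0, 0.0, 1.0]]

# A 17.455-pixel shift with f = 1000 px corresponds to roughly 1 degree.
theta, phi = angles_from_shift(17.455, 0.0, 1000.0)
print(round(math.degrees(theta), 3))  # → 1.0
```

This is why the transform needs only the position detection circuitry's readout as its variable input: everything else (f, the matrix structure) is fixed by the optical design.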
  • FIG. 8. A first example of a method according to embodiments of the invention will now be described in relation to figure 8. In this first method, a transform is performed in accordance with that described above in relation to figures 4 to 7B.
  • the user provides user input to cause the camera 20 to begin capturing video.
  • the expression "capturing video” is intended to encompass both transient capture of video when the camera 20 is in a "viewfinder mode” and more permanent capture of video.
  • the processor 12 responds to the user input by commencing a video capture process in which it begins to read image data from the image sensor 26.
  • the processor 12 begins the video capture process, the image sensor 26 and the one or more optical devices 24 are in their initial positions.
  • the position detection circuitry 22 may be configured to detect the relative positioning of the image sensor 26 and the one or more optical devices 24 at least once per captured image frame and in some cases it may detect the relative positioning of the image sensor 26 and the one or more optical devices 24 multiple times per captured image frame.
  • the processor 12 responds to the rotation of the camera 20 by initiating optical image stabilization.
  • the processor 12 causes an adjustment to the relative positioning of the image sensor 26 of the camera 20 and the one or more optical devices 24 of the camera 20 while an image frame is being captured.
  • the processor 12 may, for example, cause a translational movement of the image sensor 26 and/or the one or more optical devices 24, as described above. This is done by controlling the position adjustment circuitry 23 to adjust the position of the image sensor 26 and/or the one or more optical devices 24.
  • the processor 12 causes the image sensor 26 to capture at least part of an image frame, while the image sensor 26 and the one or more optical devices 24 are in the adjusted relative positioning.
  • the processor 12 determines the adjusted relative positioning of the image sensor 26 and the one or more optical devices 24. For example, the processor 12 may determine the distances Δx, Δy that the image sensor 26 and/or the one or more optical devices 24 have moved from their respective initial positions.
  • At least part of the image frame may be formed as the relative positioning of the image sensor 26 and the one or more optical devices 24 is being adjusted. If so, and the position detection circuitry 22 is detecting the relative positioning of the image sensor 26 and the one or more optical devices 24 multiple times per captured image frame, the "adjusted relative positioning" can be considered to be the average adjusted relative positioning of the image sensor 26 and the one or more optical devices 24 while the image frame was being captured.
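The averaging of multiple per-frame position samples described above could look like the following sketch (the sample values and the (dx, dy) tuple format are hypothetical stand-ins for Hall-sensor readings):

```python
def average_positioning(readings):
    """Average (dx, dy) OIS position samples taken while one image frame
    was being captured; the mean is used as the 'adjusted relative
    positioning' input to the transform."""
    n = len(readings)
    dx = sum(r[0] for r in readings) / n
    dy = sum(r[1] for r in readings) / n
    return dx, dy

# Hypothetical samples from the position detection circuitry during one frame:
samples = [(1.0, 0.5), (1.2, 0.7), (1.4, 0.9)]
print(average_positioning(samples))
```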
  • the processor 12 adjusts the image frame captured in block 802 by performing a transform on the image frame using the adjusted relative positioning of the image sensor 26 and the one or more optical devices 24 determined in block 803.
  • the processor 12 may, for example, warp the image frame captured in block 802 by projecting the image frame onto a virtual/projection plane as described above.
  • the transform that is performed on the image frame at least partially compensates for the change in the perspective of the camera 20 that results from the rotation of the camera 20.
  • FIG. 8 A second example of a method according to embodiments of the invention will now be described in conjunction with figure 8.
  • the second example is different from the first example in that there is not necessarily any rotation of the camera 20 while capturing video and the transform that is performed does not compensate for any such rotation.
  • a transform is performed on image frames to digitally stabilize the image frames during refocusing of the camera 20.
  • the user provides user input to cause the camera 20 to begin capturing video.
  • the processor 12 may cause an adjustment to the relative positioning of the image sensor 26 and the one or more optical devices 24 in order to focus on the scene being captured.
  • the processor 12 may alter the distance between the image sensor 26 and at least one optical device 24 by moving the image sensor 26 and/or an optical device 24 in the z-dimension illustrated in figure 3.
  • Changes to the scene being imaged then cause the processor 12 to attempt to refocus on the scene by causing a further adjustment to the relative positioning of the image sensor 26 and the optical device 24 in block 801 of figure 8.
  • the processor 12 causes the image sensor 26 to capture at least part of an image frame, while the image sensor 26 and the one or more optical devices 24 are in the adjusted relative positioning.
  • the processor 12 determines the adjusted relative positioning of the image sensor 26 and the one or more optical devices 24. For example, the processor 12 may determine the distance Δz that the image sensor 26 and/or the one or more optical devices 24 have moved from their respective initial positions.
  • At least part of the image frame may be formed as the relative positioning of the image sensor 26 and the one or more optical devices 24 is being adjusted. If so, and the position detection circuitry 22 is detecting the relative positioning of the image sensor 26 and the one or more optical devices 24 multiple times per captured image frame, the "adjusted relative positioning" can be considered to be the average adjusted relative positioning of the image sensor 26 and the one or more optical devices 24 while the image frame was being captured.
  • the processor 12 adjusts the image frame captured in block 802 by performing a transform on the image frame using the adjusted relative positioning of the image sensor 26 and the one or more optical devices 24 determined in block 803.
  • the transform that is performed on the image is a resizing/scaling of the image rather than a projection.
  • this stabilizes the video being captured by mitigating the zooming in and out that occurs when the camera 20 is refocusing.
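The resizing/scaling described above could be sketched as a scale about the image center. The ratio-of-distances scale factor is an assumption for illustration; the patent does not spell out the exact relation here.

```python
def refocus_scale(d_initial, d_adjusted):
    """Scale factor that counteracts the field-of-view change when the
    lens-to-sensor distance changes during refocusing (assumed to vary
    inversely with that distance)."""
    return d_initial / d_adjusted

def scale_point(x, y, cx, cy, s):
    """Scale pixel (x, y) about the image center (cx, cy) by factor s."""
    return cx + s * (x - cx), cy + s * (y - cy)

# Hypothetical numbers: sensor moved 0.1 mm further from the lens,
# so the image grew slightly and must be scaled down to match.
s = refocus_scale(4.00, 4.10)
print(scale_point(1920.0, 1080.0, 960.0, 540.0, s))
```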
  • FIG. 9 illustrates a third example of a method according to embodiments of the invention.
  • the third example is similar to the first example in that a transform is performed on an image frame which at least partially compensates for rotational movement of the camera 20.
  • the third example may incorporate any of the features of the first example described above in relation to figure 8. However, the third example differs from the first example in that there need not be any adjustment to the relative positioning of the image sensor 26 and the one or more optical devices 24.
  • the camera 20 need not have an optical image stabilization system.
  • the processor 12 may, for instance, receive inputs from the one or more motion detectors 27, rather than from the position detection circuitry 22, for use in the projection of the captured image onto the projection plane.
  • rotational movement of the camera 20 is detected, for example, while video of a scene is being captured by the camera 20.
  • the rotational movement of the camera 20 causes a change in the perspective from which the scene is being imaged.
  • Optical image stabilization may or may not be performed, as described in the first example.
  • the processor 12 responds to rotational movement of the camera 20 by performing a transform on a captured image frame, in order to at least partially compensate for a change in perspective of the camera caused by the rotational movement.
  • the transform that is performed may be the same as that described above in relation to figures 4 to 7B. If no optical image stabilization is performed, the angles θ and φ referred to in equation (9) above may be determined from inputs provided by the one or more motion detectors 27. In this regard, the processor 12 may determine how the position of the camera 20 is changing over a period of time by receiving multiple readings from the motion detector(s) 27 per image frame. If so, averages of those readings may be used as inputs to the transform.
  • references to 'non-transitory computer-readable storage medium', 'computer', 'processor' etc. should be understood to encompass not only computers having different architectures such as single/multi-processor architectures and sequential (Von Neumann)/parallel architectures but also specialized circuits such as field-programmable gate arrays (FPGA), application specific circuits (ASIC), signal processing devices and other processing circuitry.
  • References to computer program, instructions, code etc. should be understood to encompass software for a programmable processor or firmware such as, for example, the programmable content of a hardware device whether instructions for a processor, or configuration settings for a fixed-function device, gate array or programmable logic device etc.
  • the term 'circuitry' refers to all of the following:
  • (a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry); (b) combinations of circuits and software (and/or firmware), such as (as applicable): (i) a combination of processor(s) or (ii) portions of processor(s)/software (including digital signal processor(s)), software and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions; and (c) circuits, such as a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation, even if the software or firmware is not physically present.
  • This definition of 'circuitry' applies to all uses of this term in this application, including in any claims.
  • the term “circuitry” would also cover an implementation of merely a processor (or multiple processors) or portion of a processor and its (or their) accompanying software and/or firmware.
  • the term “circuitry” would also cover, for example and if applicable to the particular claim element, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, or other network device.
  • the blocks illustrated in figures 8 and 9 may represent steps in a method and/or sections of code in the computer program 17.
  • the illustration of a particular order to the blocks does not necessarily imply that there is a required or preferred order for the blocks, and the order and arrangement of the blocks may be varied. Furthermore, it may be possible for some blocks to be omitted.
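The perspective-compensation and averaging steps described in the bullets above can be sketched in code. This is a minimal illustration under stated assumptions, not the transform of equation (9), which is not reproduced in this excerpt (its angle symbols are garbled here and are rendered below as θ, pitch, and φ, yaw): the pinhole camera model and all function names are assumptions.

```python
import numpy as np

def average_readings(readings):
    """Average several motion-detector readings taken during one frame."""
    return tuple(np.mean(readings, axis=0))

def perspective_homography(theta, phi, f, cx, cy):
    """Homography compensating a camera rotation of theta (pitch, about x)
    and phi (yaw, about y) for a pinhole camera with focal length f in
    pixels and principal point (cx, cy):  H = K @ Rx @ Ry @ inv(K)."""
    K = np.array([[f, 0.0, cx],
                  [0.0, f, cy],
                  [0.0, 0.0, 1.0]])
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(theta), -np.sin(theta)],
                   [0.0, np.sin(theta), np.cos(theta)]])
    Ry = np.array([[np.cos(phi), 0.0, np.sin(phi)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(phi), 0.0, np.cos(phi)]])
    return K @ Rx @ Ry @ np.linalg.inv(K)

def warp_point(H, x, y):
    """Map pixel (x, y) through homography H (homogeneous coordinates)."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w
```

With zero rotation the homography reduces to the identity, so pixels map to themselves; a full implementation would resample every pixel of the captured frame through the homography rather than mapping single points.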

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)

Abstract

Apparatuses, methods and computer programs are provided. A method comprises: causing an adjustment of the relative positioning of an image sensor of a camera and one or more optical devices of the camera that convey light to the image sensor; causing at least part of an image frame to be captured by the image sensor while the image sensor and the one or more optical devices are in the adjusted relative positioning; determining the adjusted relative positioning of the image sensor and the one or more optical devices; and adjusting the image frame by performing a transform on the image frame using the determined adjusted relative positioning of the image sensor and the one or more optical devices. One embodiment relates to correcting image distortions that remain after optical image stabilization. Another embodiment relates to scaling the image during refocusing.
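The first embodiment mentioned in the abstract, correcting distortions that remain after optical image stabilization, can be illustrated with a toy sketch. This is not the claimed transform: it assumes the residual error is a pure pixel translation (the sensor shift requested of the OIS actuator minus the shift the position detector actually reports), and the function name and interfaces are hypothetical.

```python
import numpy as np

def correct_residual_shift(frame, requested, achieved):
    """Compensate, in software, the difference between the sensor shift
    requested from the OIS actuator and the shift actually achieved, as
    reported by position-detection circuitry.

    frame     -- 2-D array of pixels
    requested -- (dx, dy) shift asked of the actuator, in pixels
    achieved  -- (dx, dy) shift actually measured, in pixels
    """
    dx = int(round(requested[0] - achieved[0]))
    dy = int(round(requested[1] - achieved[1]))
    # np.roll translates the image; a real pipeline would interpolate
    # sub-pixel shifts and crop instead of wrapping around the edges.
    return np.roll(np.roll(frame, dy, axis=0), dx, axis=1)
```

If the actuator achieved exactly what was requested, the residual is zero and the frame is returned unchanged.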
PCT/FI2015/050197 2014-04-14 2015-03-24 Transformations for image stabilization and refocus WO2015158953A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1406616.1 2014-04-14
GB1406616.1A GB2525175A (en) 2014-04-14 2014-04-14 Imaging

Publications (1)

Publication Number Publication Date
WO2015158953A1 true WO2015158953A1 (fr) 2015-10-22

Family

ID=50844921

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2015/050197 WO2015158953A1 (fr) 2014-04-14 2015-03-24 Transformations for image stabilization and refocus

Country Status (2)

Country Link
GB (1) GB2525175A (fr)
WO (1) WO2015158953A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11328155B2 (en) 2015-11-13 2022-05-10 FLIR Belgium BVBA Augmented reality labels systems and methods
GB2561746B (en) * 2015-11-13 2022-02-09 Flir Systems Video sensor fusion and model based virtual and augmented reality systems and methods

Citations (5)

Publication number Priority date Publication date Assignee Title
US6396961B1 (en) * 1997-11-12 2002-05-28 Sarnoff Corporation Method and apparatus for fixating a camera on a target point using image alignment
US20070132856A1 (en) * 2005-12-14 2007-06-14 Mitsuhiro Saito Image processing apparatus, image-pickup apparatus, and image processing method
US20130044229A1 (en) * 2011-08-18 2013-02-21 Canon Kabushiki Kaisha Image capturing apparatus and control method therefor
US20130321650A1 (en) * 2012-06-04 2013-12-05 Sony Corporation Imaging device, control method of the same and program
US20140111661A1 (en) * 2012-10-22 2014-04-24 Canon Kabushiki Kaisha Image capture apparatus

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
DE3671698D1 * 1985-03-25 1990-07-12 Jean Yves Dr Chauve Device for stimulating acupuncture points
US8760513B2 (en) * 2011-09-30 2014-06-24 Siemens Industry, Inc. Methods and system for stabilizing live video in the presence of long-term image drift
US8810666B2 (en) * 2012-01-16 2014-08-19 Google Inc. Methods and systems for processing a video for stabilization using dynamic crop

Also Published As

Publication number Publication date
GB2525175A (en) 2015-10-21
GB201406616D0 (en) 2014-05-28

Similar Documents

Publication Publication Date Title
CN107770433B (zh) Image acquisition device and smooth image zooming method therefor
CN109194876B (zh) Image processing method and apparatus, electronic device, and computer-readable storage medium
CN111246089B (zh) Shake compensation method and apparatus, electronic device, and computer-readable storage medium
US10306165B2 Image generating method and dual-lens device
CN108881703B (zh) Anti-shake control method and apparatus
JP6135848B2 (ja) Imaging device, image processing device, and image processing method
JP2019106656A (ja) Semiconductor device and electronic apparatus
KR101991754B1 (ko) Image processing method and apparatus, and electronic device
CN107615744B (zh) Method for determining image capture parameters, and imaging device
US10142541B2 Image processing apparatus, imaging apparatus, and control method of image processing apparatus
JP2012195668A (ja) Image processing device, image processing method, and program
JP6098873B2 (ja) Imaging device and image processing device
JP6128458B2 (ja) Imaging device and image processing method
CN113875219A (zh) Image processing method and apparatus, electronic device, and computer-readable storage medium
US9204047B2 Imaging
WO2015158953A1 (fr) Transformations for image stabilization and refocus
CN113519152B (zh) Roll compensation and blur reduction in tightly synchronized optical image stabilization (OIS)
US10965877B2 Image generating method and electronic apparatus
JP2021033015A5 (fr)
JP2014160998A (ja) Image processing system, image processing method, image processing program, and recording medium
JPWO2011129036A1 (ja) Imaging device and integrated circuit
Sindelar et al. Space-variant image deblurring on smartphones using inertial sensors
JP2015095670A (ja) Imaging device, control method therefor, and control program
JP6237421B2 (ja) Imaging device and imaging processing program
CN109963085B (zh) Method and apparatus for adjusting shutter speed, and robot

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 15779535
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 15779535
    Country of ref document: EP
    Kind code of ref document: A1