WO2023113991A1 - Systems and methods for capturing stabilized images - Google Patents

Systems and methods for capturing stabilized images

Info

Publication number
WO2023113991A1
Authority
WO
WIPO (PCT)
Prior art keywords
motion
image
sensor
lens
mobile device
Prior art date
Application number
PCT/US2022/051110
Other languages
English (en)
Inventor
Sebastien Riccardi
Pierre Grenet
Jerome LACHAUX
Original Assignee
Invensense, Inc.
Priority date
Filing date
Publication date
Priority claimed from US17/992,801 (US20230199326A1)
Application filed by Invensense, Inc.
Publication of WO2023113991A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N 23/682 Vibration or motion blur correction
    • H04N 23/685 Vibration or motion blur correction performed by mechanical compensation
    • H04N 23/687 Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N 23/681 Motion detection
    • H04N 23/6812 Motion detection based on additional sensors, e.g. acceleration sensors

Definitions

  • This disclosure generally relates to techniques for capturing images with a moveable device and more specifically to accommodating greater motion during stabilization.
  • It is increasingly common for a portable device to have digital imaging functions. Examples include dedicated imaging devices such as cameras or video recorders, as well as portable devices having a greater range of usage, including smartphones, tablets, laptops and wearable devices. Further, other moveable devices may also incorporate imaging functionality, such as piloted vehicles, vessels and cycles, or autonomous devices such as drones and other robotic appliances. However, implementations in a mobile device may be particularly susceptible to degradation in quality caused by motion while the images are being captured. In particular, a camera incorporated into a portable device is often hand held during use and, despite efforts to be still during image recording, shaking may occur.
  • Moreover, mobile devices may undergo intentional motion during usage, as in the case of vehicles, as well as unintentional motion such as vibration resulting from engines, motors and other powered systems, or perturbations resulting from the medium in which the device is travelling, including roadway surfaces, air currents or water conditions.
  • One technique that may be applied to a sequence of images, known as Electronic Image Stabilization (EIS) or Digital Image Stabilization (DIS), is to compare the position of one or more objects in a plurality of images to each other and selectively displace a portion of the image to correct for any movements or vibrations of the device.
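As an illustration only, the following minimal sketch (not from the disclosure; it assumes grayscale frames held as NumPy arrays) estimates the global inter-frame shift by phase correlation and displaces the frame to cancel it:

```python
import numpy as np

def estimate_shift(prev_frame, curr_frame):
    """Estimate the global (dy, dx) translation between two grayscale
    frames using phase correlation."""
    cross = np.fft.fft2(prev_frame) * np.conj(np.fft.fft2(curr_frame))
    cross /= np.abs(cross) + 1e-12           # keep phase, discard magnitude
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peak coordinates to signed shifts.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

def displace(frame, dy, dx):
    """Shift the frame to realign it with the previous one; the border
    exposed by the shift is why EIS conventionally requires cropping."""
    return np.roll(np.roll(frame, dy, axis=0), dx, axis=1)

# Usage: displace(curr, *estimate_shift(prev, curr)) realigns curr to prev.
```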
  • Optical Image Stabilization is a technique where the relative position of a lens or other optical element in a device is adjusted with respect to the image sensor to compensate for the sensed motion of the device.
  • the adjustment may involve moving either or both the lens and the image sensor and the adjustment may include either or both rotational and translational movement.
  • EIS techniques conventionally require some degree of cropping to achieve the corrective displacement of an image and cannot provide compensation during capture, preventing any reduction in blur that may result from motion during the exposure time.
  • OIS often provides improved performance.
  • the intended implementation of OIS involves the capability to produce a relative change in position between the lens and sensor of sufficient magnitude to compensate for the unintended motion experienced by the device.
  • Particularly in situations that involve increased motion, such as running or walking with a portable device, conventional approaches may not provide satisfactory compensation. Attempts to mitigate this effect by increasing the stroke length of the actuators driving the components have the undesirable result of increasing the size of the camera module.
  • determining translational motion may be accomplished with accelerometer information but, as a result of the double integration necessary to convert from acceleration to distance, this approach suffers from accumulating drift and other errors that render it impracticable after a relatively short period of time.
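The scale of this problem is easy to demonstrate numerically. In the sketch below (illustrative only; the sampling rate and bias value are assumptions), a small constant accelerometer bias is double-integrated and the resulting position error grows quadratically with time:

```python
import numpy as np

dt = 0.001                    # 1 kHz accelerometer sampling (assumed)
bias = 0.02                   # 0.02 m/s^2 residual bias (assumed)
t = np.arange(dt, 1.0 + dt, dt)

accel = np.full_like(t, bias)        # true acceleration is zero; only bias remains
velocity = np.cumsum(accel) * dt     # first integration
position = np.cumsum(velocity) * dt  # second integration

# The error grows as 0.5 * bias * t^2: about 1 cm after only one second.
print(f"position error after 1 s: {position[-1] * 100:.2f} cm")
```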
  • this disclosure includes a method for capturing stabilized images.
  • the method may involve obtaining motion sensor data for the mobile device. Motion of the mobile device may be determined from the obtained sensor data. A relative position of an image sensor and a lens may be adjusted based at least in part on the determined motion. A first image may be captured with the image sensor. A synchronization signal may be sent after the capture of the first image. The relative position of the lens and image sensor may be reset in response to the synchronization signal. Motion of the mobile device may again be determined and the relative position of the lens and the image sensor may be adjusted from the reset relative position. A second image may then be captured.
  • This disclosure also includes a moveable device for capturing a plurality of stabilized images.
  • the device may have an image sensor, a lens, a motion sensor and at least one processor configured to receive data from the motion sensor.
  • the at least one processor may be configured to determine a motion of the mobile device from the motion sensor data and adjust a relative position of the lens and the image sensor based on the determined motion.
  • the at least one processor may also be configured to capture a first image with the image sensor. The relative position of the lens and the image sensor may be reset in response to a synchronization signal sent after capturing the first image.
  • the at least one processor may determine further motion of the mobile device from the motion sensor data and adjust the relative position of the lens and the image sensor from the reset relative position based on the determined further motion so that a second image may be captured with the image sensor.
  • FIG. 1 is a schematic diagram of a device configured to capture a plurality of stabilized captured images according to an embodiment.
  • FIG. 2 is a schematic diagram showing communication between components of a mobile device during stabilization according to an embodiment.
  • FIG. 3 is a schematic diagram showing conventional optical image stabilization during unintended motion.
  • FIG. 4 is a schematic diagram showing optical image stabilization to compensate for unintended motion by periodically resetting relative lens and image sensor position according to an embodiment.
  • FIG. 5 is a schematic diagram showing conventional optical image stabilization during relatively large motion.
  • FIG. 6 is a schematic diagram showing optical image stabilization during relatively large motion by periodically resetting relative lens and image sensor position according to an embodiment.
  • FIG. 7 schematically depicts an exemplary routine for capturing a plurality of stabilized images by periodically resetting relative lens and image sensor position according to an embodiment.
  • Embodiments described herein may be discussed in the general context of processor-executable instructions residing on some form of non-transitory processor-readable medium, such as program modules, executed by one or more computers or other devices.
  • program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types.
  • the functionality of the program modules may be combined or distributed as desired in various embodiments.
  • a single block may be described as performing a function or functions; however, in actual practice, the function or functions performed by that block may be performed in a single component or across multiple components, and/or may be performed using hardware, using software, or using a combination of hardware and software.
  • various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
  • the exemplary wireless communications devices may include components other than those shown, including well-known components such as a processor, memory and the like.
  • the techniques described herein may be implemented in hardware, software, firmware, or any combination thereof, unless specifically described as being implemented in a specific manner. Any features described as modules or components may also be implemented together in an integrated logic device or separately as discrete but interoperable logic devices. If implemented in software, the techniques may be realized at least in part by a non-transitory processor-readable storage medium comprising instructions that, when executed, performs one or more of the methods described above.
  • the non-transitory processor-readable data storage medium may form part of a computer program product, which may include packaging materials.
  • the non-transitory processor-readable storage medium may comprise random access memory (RAM) such as synchronous dynamic random access memory (SDRAM), read only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, other known storage media, and the like.
  • the techniques additionally, or alternatively, may be realized at least in part by a processor-readable communication medium that carries or communicates code in the form of instructions or data structures and that can be accessed, read, and/or executed by a computer or other processor.
  • a carrier wave may be employed to carry computer-readable electronic data such as those used in transmitting and receiving electronic mail or in accessing a network such as the Internet or a local area network (LAN).
  • processors such as one or more motion processing units (MPUs), digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), application specific instruction set processors (ASIPs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
  • a general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices, e.g., a combination of an MPU and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with an MPU core, or any other such configuration.
  • The techniques of this disclosure may be implemented in a mobile electronic device with one or more digital cameras.
  • Sensor data generated by the mobile device may be employed to process images captured by such digital cameras.
  • a mobile device may employ motion sensors as part of the user interface, such as for determining orientation of the device to adjust the display of information accordingly as well as for receiving user input for controlling an application, for navigational purposes, or for a wide variety of other applications. Data from such a sensor or plurality of sensors may be used to determine motion of the mobile device.
  • device 100 may be implemented as a device or apparatus, such as a handheld device that can be moved in space by a user so that its motion and/or orientation in space may be sensed, such as a multipurpose device like a smartphone or a dedicated handheld camera/video recorder. As noted above, however, device 100 may instead be integrated into a vehicle, vessel or other mobile device used for transportation, whether autonomous or piloted.
  • Device 100 includes a camera unit 102 configured for capturing images.
  • the camera unit 102 includes at least an optical element, such as, for example, a lens 104, which projects the image onto an image sensor 106.
  • the camera unit 102 may optionally be configured to perform optical image stabilization (OIS).
  • OIS systems include processing to determine compensatory motion of the lens and/or image sensor in response to sensed motion of the device or part of the device, such as, e.g., the camera (body); actuators to provide the compensatory motion in the image sensor or lens; and position sensors to determine whether the actuators have produced the desired movement.
  • the camera unit 102 may include dedicated motion sensors 107 to determine the motion, or may obtain the motion from another module in the device, such as, e.g., the sensor processing unit (SPU) 122 discussed below.
  • the camera unit includes an actuator 108 for imparting relative movement between lens 104 and image sensor 106 along at least two orthogonal axes. Additionally, a position sensor 110 may be included for determining the position of lens 104 in relation to image sensor 106. Motion sensing may be performed by a general purpose sensor assembly as described below according to techniques disclosed in commonly-owned U.S. Patent No. 9,628,713, which is hereby incorporated by reference in its entirety.
  • actuator 108 may be implemented using voice coil motors (VCM) and position sensor 110 may be implemented with Hall sensors, although other suitable alternatives may be employed, such as shape memory alloy (SMA) or ball bearing actuators.
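By way of a rough illustration of such a closed loop (a sketch only; the PID structure, gains and stroke limit are assumptions rather than details of the disclosure), a per-axis controller might drive the VCM toward a target shift while reading the Hall sensor back:

```python
class OisAxisController:
    """Closed-loop control of one lens-shift axis: the target comes from
    the motion processing, the feedback from a Hall position sensor."""

    def __init__(self, kp=0.8, ki=20.0, kd=0.002, stroke_limit_um=100.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.stroke_limit_um = stroke_limit_um
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, target_um, hall_um, dt):
        """Return a VCM drive command moving the lens toward target_um."""
        # Demands beyond the actuator stroke saturate: past this point no
        # further compensatory motion is possible, which is the limitation
        # that motivates the periodic resets described below.
        target_um = max(-self.stroke_limit_um,
                        min(self.stroke_limit_um, target_um))
        error = target_um - hall_um
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```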
  • Device 100 may also include a host processor 112, memory 114, interface device 116 and display 118.
  • Host processor 112 can be one or more microprocessors, central processing units (CPUs), or other processors which run software programs, which may be stored in memory 114, associated with the functions of device 100.
  • Interface devices 116 can be any of a variety of different devices providing input and/or output to a user, such as audio speakers, buttons, touch screen, joystick, slider, knob, printer, scanner, computer network I/O device, other connected peripherals and the like.
  • Display 118 may be configured to output images viewable by the user and may function as a viewfinder for camera unit 102.
  • In this embodiment, device 100 also includes an image processor 120 for receiving output from image sensor 106 as well as controlling the OIS system, although in other embodiments, any distribution of these functionalities may be provided between host processor 112 and other processing resources of device 100.
  • camera unit 102 may include a processor to analyze the motion sensor input and control the actuators.
  • Image processor 120 or other processing resources may also apply stabilization and/or compression algorithms to the captured images as described below.
  • multiple layers of software can be provided in memory 114, which may be any combination of computer readable medium such as electronic memory or other storage medium such as hard disk, optical disk, etc., for use with the host processor 112.
  • an operating system layer can be provided for device 100 to control and manage system resources in real time, enable functions of application software and other layers, and interface application programs with other software and functions of device 100.
  • different software application programs such as menu navigation software, games, camera function control, image processing or adjusting, navigation software, communications software, such as telephony or wireless local area network (WLAN) software, or any of a wide variety of other software and functional interfaces can be provided.
  • multiple different applications can be provided on a single device 100, and in some of those embodiments, multiple applications can run simultaneously.
  • Device 100 also includes a general purpose sensor assembly in the form of integrated sensor processing unit SPU 122 featuring sensor processor 124, memory 126 and motion sensor 128.
  • Memory 126 may store algorithms, routines or other instructions for processing data output by motion sensor 128 and/or other sensors as described below using logic or controllers of sensor processor 124, as well as storing raw data and/or motion data output by motion sensor 128 or other sensors.
  • Motion sensor 128 may be one or more sensors for measuring motion of device 100 in space.
  • SPU 122 measures one or more axes of rotation and/or one or more axes of acceleration of the device.
  • at least some of the motion sensors are inertial sensors, such as rotational motion sensors or linear motion sensors.
  • the rotational motion sensors may be gyroscopes to measure angular velocity along one or more orthogonal axes and the linear motion sensors may be accelerometers to measure linear acceleration along one or more orthogonal axes.
  • the gyroscopes and accelerometers may each have 3 orthogonal axes, such as to measure the motion of the device with 6 degrees of freedom.
  • the signals from the sensors may be combined in a sensor fusion operation performed by sensor processor 124 or other processing resources of device 100 to provide a six-axis determination of motion.
  • the sensor information may be converted, for example, into an orientation, a change of orientation, a speed of motion, or a change in the speed of motion.
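As one common example of such a conversion (a minimal sketch, not necessarily the fusion used here; the filter coefficient is an assumption), a complementary filter can blend the integrated gyroscope rate with the accelerometer's gravity reference to estimate pitch:

```python
import math

ALPHA = 0.98  # weight on short-term gyro integration vs. accelerometer (assumed)

def fuse_pitch(pitch_prev, gyro_rate_y, ax, ay, az, dt):
    """One complementary-filter step: the gyro term tracks fast motion,
    while the accelerometer term cancels long-term gyro drift."""
    pitch_gyro = pitch_prev + gyro_rate_y * dt         # integrate angular rate
    pitch_accel = math.atan2(-ax, math.hypot(ay, az))  # gravity-derived pitch
    return ALPHA * pitch_gyro + (1.0 - ALPHA) * pitch_accel
```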
  • motion sensor 128 may be implemented using MEMS to be integrated with SPU 122 in a single package.
  • Exemplary details regarding suitable configurations of host processor 112 and SPU 122 may be found in commonly owned U.S. Patents Nos. 8,250,921 and 8,952,832, which are hereby incorporated by reference in their entirety.
  • SPU 122 may be configured as a sensor hub by aggregating sensor data from additional processing layers as described in commonly owned U.S. Patent Publication No. 2014/14480364, which is also hereby incorporated by reference in its entirety.
  • SPU 122 may be configured to provide motion data for purposes independent of camera unit 102, such as to host processor 112 for user interface functions, as well as enabling OIS functionality. Any or all parts of the motion processing unit (MPU™) may be combined with image processor 120 into a single chip or single package, and may be integrated into camera unit 102. Any processing or processor needed for actuator 108 control or position sensor 110 control may also be included in the same chip or package.
  • Device 100 may also include other sensors as desired.
  • analog sensor 130 may provide output to analog to digital converter (ADC) 132, for example within SPU 122.
  • data output by digital sensor 134 may be communicated over bus 136 to sensor processor 124 or other processing resources in device 100.
  • Analog sensor 130 and digital sensor 134 may provide additional sensor data about the environment surrounding device 100.
  • non-inertial sensors such as one or more pressure sensors, magnetometers, temperature sensors, infrared sensors, ultrasonic sensors, radio frequency sensors, position sensors such as GPS, or other types of sensors can be provided.
  • data from a magnetometer measuring along three orthogonal axes may be fused with gyroscope and accelerometer data to provide a nine axis determination of motion.
  • a pressure sensor may be used as an indication of altitude for device 100, such that a sensor fusion operation may provide a ten axis determination of motion.
  • device 100 may have access to other types of motion sensing in some embodiments, such as odometry and/or distance measurements that may be produced by a vehicle or other mobile device.
  • camera unit 102, SPU 122, host processor 112, memory 114 and other components of device 100 may be coupled through bus 136, which may be any suitable bus or interface, such as a peripheral component interconnect express (PCIe) bus, a universal serial bus (USB), a universal asynchronous receiver/transmitter (UART) serial bus, a suitable advanced microcontroller bus architecture (AMBA) interface, an Inter-Integrated Circuit (I2C) bus, a serial digital input output (SDIO) bus, a serial peripheral interface (SPI) or other equivalent.
  • additional buses may be used to couple the various components of device 100, such as by using a dedicated bus between host processor 112 and memory 114.
  • a motion algorithm layer can provide motion algorithms that perform lower-level processing of raw sensor data from the motion sensors and other sensors.
  • a sensor device driver layer may provide a software interface to the hardware sensors of device 100.
  • a suitable application program interface (API) may be provided to facilitate communication between host processor 112 and SPU 122, for example, to transmit desired sensor processing tasks.
  • Other embodiments may feature any desired division of processing between SPU 122 and host processor 112 as appropriate for the applications and/or hardware being employed.
  • lower level software layers may be provided in SPU 122 and an API layer implemented by host processor 112 may allow communication of the states of application programs as well as sensor commands.
  • a chip is defined to include at least one substrate typically formed from a semiconductor material.
  • a single chip may be formed from multiple substrates, where the substrates are mechanically bonded to preserve the functionality.
  • a multiple chip includes at least two substrates, wherein the two substrates are electrically connected, but do not require mechanical bonding.
  • a package provides electrical connection between the bond pads on the chip to a metal lead that can be soldered to a PCB.
  • a package typically comprises a substrate and a cover.
  • Integrated Circuit (IC) substrate may refer to a silicon substrate with electrical circuits, typically CMOS circuits.
  • MEMS cap provides mechanical support for the MEMS structure. The MEMS structural layer is attached to the MEMS cap. The MEMS cap is also referred to as handle substrate or handle wafer.
  • an MPU may incorporate the sensor.
  • the sensor or sensors may be formed on a first substrate.
  • Other embodiments may include solid-state sensors or any other type of sensors.
  • the electronic circuits in the MPU receive measurement outputs from the one or more sensors.
  • the electronic circuits process the sensor data.
  • the electronic circuits may be implemented on a second silicon substrate.
  • the first substrate may be vertically stacked, attached and electrically connected to the second substrate in a single semiconductor chip, while in other embodiments the first substrate may be disposed laterally and electrically connected to the second substrate in a single semiconductor package.
  • the first substrate may be attached to the second substrate through wafer bonding, as described in commonly owned U.S. Patent No. 7,104,129, which is incorporated herein by reference in its entirety, to simultaneously provide electrical connections and hermetically seal the MEMS devices.
  • This fabrication technique advantageously enables technology that allows for the design and manufacture of high performance, multi-axis, inertial sensors in a very small and economical package. Integration at the wafer-level minimizes parasitic capacitances, allowing for improved signal-to-noise relative to a discrete solution. Such integration at the wafer-level also enables the incorporation of a rich feature set which minimizes the need for external amplification.
  • raw data refers to measurement outputs from the sensors which are not yet processed.
  • motion data may refer to processed raw data, which may involve applying a sensor fusion algorithm or applying any other algorithm.
  • data from one or more sensors may be combined to provide an orientation or orientation change of the device.
  • an MPU may include processors, memory, control logic and sensors among structures.
  • the term “captured image” refers to the pixels recorded by the image sensor of a digital camera, such as image sensor 106, without further stabilization adjustments. Therefore, to the extent OIS techniques are applied, the captured image may have been stabilized by any compensating changes in the relative positioning of lens 104 and image sensor 106, but no other processing of the recorded pixels has been performed to further stabilize the image.
  • Synchronization signals may be available during the row and frame readout of the sensors to help determine the relative positioning of lens 104 and image sensor 106 at the different stages of image recording.
  • FIG. 2 schematically depicts the exchange of signals between image processor 120 and camera unit 102.
  • image processor 120 may receive motion sensor data representing movement of device 100, such as from SPU 122 as shown.
  • any suitable source of sensor data may be employed, including a dedicated OIS motion sensor assembly.
  • Image processor 120 may employ the motion sensor data to determine an appropriate change in relative position between lens 104 and image sensor 106 to compensate for the detected movement.
  • image processor 120 sends actuator signals 202 configured to cause actuator 108 to produce the desired relative change in position.
  • position sensor 110 produces an output position signal 204 reflecting the relationship between lens 104 and image sensor 106 as feedback to verify that operation of actuator 108 has resulted in the desired change in relative position.
  • the HSync and FSync signals 206 correspond to row and frame information from image sensor 106, respectively.
  • the image readout signals 208 transfer the data from the image sensor 106 to the image processor 120. Any one or combination of signals 202, 204, 206 and 208 may be used to estimate compensatory stabilization adjustments to the relative position between lens 104 and image sensor 106.
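One way such signals might be combined (a sketch under assumed timestamps and values) is to interpolate gyroscope samples at each row's readout time, yielding a per-row motion estimate from which a compensation can be derived:

```python
import numpy as np

def rate_at_rows(row_times, gyro_times, gyro_rates):
    """Interpolate angular rate at each row readout time so that a per-row
    compensation or alignment estimate can be computed."""
    return np.interp(row_times, gyro_times, gyro_rates)

# Example with assumed values: a 1 kHz gyro stream and 480 rows read out
# 10 us apart, starting 10 ms after the frame synchronization signal.
gyro_t = np.arange(0.0, 0.034, 0.001)
gyro_w = 0.2 * np.sin(2 * np.pi * 5.0 * gyro_t)  # 5 Hz shake, rad/s
row_t = 0.010 + np.arange(480) * 10e-6
per_row_rate = rate_at_rows(row_t, gyro_t, gyro_w)
```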
  • OIS techniques may be employed during the capture of images to compensate for unintended movements of device 100 by generating a compensating relative movement between image sensor 106 and lens 104 in response to detected movement.
  • OIS techniques are limited by the displacement limitations of actuators 108.
  • FIG. 3 schematically depicts a compensatory lens shift as trace 300 that may be applied to accommodate a 5 Hz sinusoidal motion. Typical hand vibrations are in the range of 1-10 Hz, so this provides a good representation of the hand-held context. If actuator 108 cannot provide the required amount of displacement, at each peak and trough its limit may be met or exceeded, preventing further compensatory motion so that stabilization cannot be performed.
  • FIG. 4 schematically depicts the resetting of lens position in response to the FSYNC signal at 30 Hz, although it should be appreciated that any suitable synchronization signal may be used, including those used for other image processing tasks as well as signals generated specifically for this purpose.
  • lens shift is represented as trace 400 and FSYNC as trace 402.
  • a 5 Hz sinusoidal motion has a period of 200 ms, which requires the relative lens and image sensor position to transition from center to each maximum and minimum in one quarter of that time, i.e. 50 ms.
  • lens shift 400 is reset to a center position every 33 ms, which is sufficient to avoid reaching the motion limit of actuator 108, so compensatory motion is always possible.
  • at a higher synchronization rate, such as 60 Hz, the lens and image sensor position may be reset every 17 ms.
  • actuator 108 may be configured so that the reset motion is faster than the standard compensation motion.
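These timing relationships can be checked directly. In the sketch below, the 5 Hz motion and the 30 Hz and 60 Hz reset rates come from the text, while the demanded amplitude is an arbitrary assumption; it reports the worst-case lens travel required within one reset window:

```python
import math

f_motion = 5.0   # hand-vibration sinusoid, Hz (from the text)
amp_um = 120.0   # peak compensatory shift demanded (assumed)
print(f"motion period: {1000.0 / f_motion:.0f} ms "
      f"(center to peak in {250.0 / f_motion:.0f} ms)")

for fsync_hz in (30.0, 60.0):
    window_s = 1.0 / fsync_hz  # 33 ms at 30 Hz, 17 ms at 60 Hz
    # Excursion of the sinusoid within one window, starting from a zero
    # crossing (its steepest stretch); more frequent resets bound it tighter.
    travel = amp_um * math.sin(min(math.pi / 2,
                                   2 * math.pi * f_motion * window_s))
    print(f"reset at {fsync_hz:.0f} Hz: every {1000 * window_s:.0f} ms, "
          f"worst travel between resets ~{travel:.0f} um of {amp_um:.0f} um")
```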
  • the output from image sensor 106 may be coupled to SPU 122 through bus 136.
  • the FSYNC signal may also be directly coupled to an OIS controller if the OIS functionality is implemented within camera unit 102.
  • SPU 122 may already receive FSYNC as an input for the purpose of applying EIS techniques.
  • FIG. 5 schematically depicts a conventional OIS technique during a large motion.
  • the lens shift on the x-axis is represented by trace 500 and the lens shift on the y-axis is represented by trace 502.
  • the camera was rotated by 90° around the vertical axis between the 18 s and 20.5 s time stamps, with the x-axis lens shift 500 exhibiting a 5 Hz sinusoidal motion to replicate hand vibration.
  • OIS is interrupted during the rotation when the y-axis lens shift exceeds the motion limits.
  • FIG. 6 indicates the result of resetting the lens and image sensor position at the same 30 Hz rate shown in the previous example, with the lens shift on the x-axis represented by trace 600 and the lens shift on the y-axis represented by trace 602. As shown, resetting the lens shift provides the capability of continually providing OIS throughout the rotation, accommodating both the vibratory motion indicated on the x-axis lens shift 600 and the rotational motion indicated on the y-axis lens shift 602.
  • the periodic resetting of lens and image sensor position has the additional benefit of allowing recalibration of the sensors at each reset given that the reset position is known.
  • drift and other sensor errors are significantly mitigated.
  • accelerometer measurements are particularly susceptible to drift due to the double integration performed to estimate translation.
  • Resetting the relative position of the lens and image sensor periodically reduces the chance that this drift will cause the motion limits of the actuators to be exceeded.
  • The periodic resetting of lens shift also significantly reduces the amount of time during which the accelerometer readings can drift, resulting in more accurate estimations of translation.
  • the reset position may be chosen based at least in part on a filtering operation configured to compensate for the drift.
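Continuing the earlier drift illustration, the sketch below (same assumed bias; re-zeroing the velocity estimate at each reset is likewise an assumption of the sketch) shows how a 30 Hz reset bounds the accumulated translation error to the span of a single frame:

```python
import numpy as np

dt, bias = 0.001, 0.02     # 1 kHz samples, 0.02 m/s^2 bias (assumed)
frame_period = 1.0 / 30.0  # reset at each 30 Hz synchronization signal

worst = v = p = t_since_reset = 0.0
for _ in np.arange(0.0, 1.0, dt):
    t_since_reset += dt
    v += bias * dt   # first integration
    p += v * dt      # second integration
    worst = max(worst, abs(p))
    if t_since_reset >= frame_period:
        v = p = t_since_reset = 0.0  # known reset position: re-zero the integrators

# Bounded near 0.5 * bias * (1/30)^2, roughly 11 um, instead of ~1 cm over 1 s.
print(f"worst-case accumulated position error: {worst * 1e6:.1f} um")
```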
  • the techniques of this disclosure may also be applied to generating a composite still image from a plurality of captured images.
  • one type of composite image may be a stitched together panorama that represents a greater field of view than any one of the captured images.
  • the relative position of the lens and image sensor can be reset to a desired position, such that the expected center of motion during capture of the next image matches the center position.
  • In this manner, the advantages discussed above can be realized both for the compensation of smaller motions, such as hand vibration, and for the compensation of larger motions, such as may result from running or other types of high activity.
  • a potential complication associated with the techniques of this disclosure relates to the electronic view finder function in which an image preview is displayed to the user continuously.
  • the resetting of the lens shift may cause the center of each image preview to shift undesirably.
  • this artifact can be mitigated using electronic image stabilization (EIS) techniques.
  • at least two image previews may be compared to determine whether one or more pixels have been translated by the resetting of the lens and image sensor position, and/or the pixel translation may be calculated based on the known motion of the reset.
  • a subsequent image preview may be adjusted to minimize the amount of pixel shift to provide the user with a more stable image preview function without affecting the OIS compensation of images actually captured.
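A minimal sketch of that preview-path compensation follows, assuming the reset's pixel offset can be derived from the commanded lens motion; the scale factor and function name are illustrative assumptions:

```python
import numpy as np

UM_PER_PIXEL = 1.0  # lens-shift-to-pixel scale, device-specific (assumed)

def steady_preview(preview, reset_dx_um, reset_dy_um):
    """Counter-shift the preview frame by the known reset displacement so
    the viewfinder stays steady; the captured images themselves are left
    untouched, preserving the OIS compensation."""
    dx = int(round(reset_dx_um / UM_PER_PIXEL))
    dy = int(round(reset_dy_um / UM_PER_PIXEL))
    return np.roll(np.roll(preview, -dy, axis=0), -dx, axis=1)
```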
  • an exemplary routine for capturing a plurality of images with a mobile device is schematically depicted in the flowchart of FIG. 7.
  • motion sensor data is obtained for the mobile device.
  • a motion of the mobile device is determined from the obtained sensor data.
  • a relative position of an image sensor and a lens is adjusted in 704 based at least in part on the determined motion, and a first image is captured with the image sensor in 706.
  • a synchronization signal is sent after the capture of the first image and the relative position of the lens and image sensor is reset in response to the synchronization signal in 710.
  • Further motion of the mobile device is then determined in 712, based at least in part on obtained sensor data.
  • the relative position of the lens and the image sensor may be adjusted from the reset relative position in 714, and a second image is captured in 716.
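Read as a loop over a sequence of frames, the routine of FIG. 7 might be sketched as follows; every function here is a hypothetical stub standing in for the hardware interactions described above, not an API from the disclosure:

```python
# Hypothetical stubs so the sketch runs; real implementations would talk
# to the motion sensors, actuator 108 and image sensor 106.
def determine_motion(): return (0.0, 0.0)
def adjust_relative_position(motion): pass
def capture_image(): return "frame"
def send_synchronization_signal(): pass
def reset_relative_position(): pass

def capture_stabilized_sequence(num_frames):
    """Sketch of FIG. 7: adjust (704/714), capture (706/716), then reset
    in response to the synchronization signal (710) so the next frame's
    compensation starts from a known, recentered position."""
    images = []
    for _ in range(num_frames):
        adjust_relative_position(determine_motion())  # 704 / 714
        images.append(capture_image())                # 706 / 716
        send_synchronization_signal()                 # sent after each capture
        reset_relative_position()                     # 710: reset in response
    return images

images = capture_stabilized_sequence(3)
```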
  • the first and second images may be part of a sequence of images captured at a sampling rate.
  • the synchronization signal may be sent periodically between each image capture of the sequence of images.
  • Electronic image stabilization may be applied to align sequential captured images.
  • the electronic image stabilization may be based at least in part on the sensor data.
  • adjusting the relative position of the lens and the image sensor may involve a recentering.
  • adjusting the relative position of the lens and the image sensor may involve selecting a new relative position of the lens and the image sensor based at least in part on previous motion of the mobile device.
  • the first and second images may be incorporated into a panoramic image.
  • adjusting the relative position of the lens and the image sensor may include a translational motion correction.
  • determining further motion of the mobile device from the obtained motion sensor data may include obtaining motion sensor data at an epoch corresponding to the synchronization signal.
  • the obtained sensor data may be data from an inertial sensor assembly associated with the mobile device.
  • the obtained sensor data may be non-inertial motion sensor data.
  • this disclosure may also include a mobile device with an image sensor, a lens, a motion sensor and at least one processor configured to receive data from the motion sensor.
  • the at least one processor may be configured to determine a motion of the mobile device from the motion sensor data and adjust a relative position of the lens and the image sensor based on the determined motion.
  • the at least one processor may also be configured to capture a first image with the image sensor.
  • the relative position of the lens and the image sensor may be reset in response to a synchronization signal sent after capturing the first image.
  • the at least one processor may determine further motion of the mobile device from the motion sensor data and adjust the relative position of the lens and the image sensor from the reset relative position based on the determined further motion so that a second image may be captured with the image sensor.
  • the first and second images may be part of a sequence of images captured at a sampling rate.
  • the synchronization signal may be sent periodically between each image capture of the sequence of images.
  • At least one processor may be configured to apply electronic image stabilization to align sequential captured images.
  • the electronic image stabilization may be based at least in part on the sensor data.
  • at least one processor is configured to reset the relative position of the lens and the image sensor by recentering.
  • At least one processor may be configured to reset the relative position of the lens and the image sensor by selecting a new relative position of the lens and the image sensor based at least in part on previous motion of the mobile device.
  • the motion sensor may be an inertial sensor assembly.
  • the inertial sensor assembly may have an accelerometer and the at least one processor may be configured to adjust the relative position of the lens and the image sensor using a translational motion correction.
  • the motion sensor may be a non-inertial motion sensor.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

Disclosed are systems and methods for capturing stabilized images. Motion of the mobile device is determined so that the relative position of the lens and the image sensor can be adjusted to compensate for unintentional motion. The relative position of the lens and the image sensor may be reset periodically in response to a synchronization signal between image captures.
PCT/US2022/051110 2021-12-16 2022-11-28 Systems and methods for capturing stabilized images WO2023113991A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202163290520P 2021-12-16 2021-12-16
US63/290,520 2021-12-16
US17/992,801 2022-11-22
US17/992,801 US20230199326A1 (en) 2021-12-16 2022-11-22 Systems and methods for capturing stabilized images

Publications (1)

Publication Number Publication Date
WO2023113991A1 (fr)

Family

ID=84980878

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/051110 WO2023113991A1 (fr) 2021-12-16 2022-11-28 Systems and methods for capturing stabilized images

Country Status (1)

Country Link
WO (1) WO2023113991A1 (fr)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7104129B2 (en) 2004-02-02 2006-09-12 Invensense Inc. Vertically integrated MEMS structure with electronics in a hermetically sealed cavity
US8250921B2 (en) 2007-07-06 2012-08-28 Invensense, Inc. Integrated motion processing unit (MPU) with MEMS inertial sensing and embedded digital electronics
US8952832B2 (en) 2008-01-18 2015-02-10 Invensense, Inc. Interfacing application programs and motion sensors of a device
US20150319365A1 (en) * 2014-03-17 2015-11-05 Invensense Inc. Systems and methods for optical image stabilization
US9628713B2 (en) 2014-03-17 2017-04-18 Invensense, Inc. Systems and methods for optical image stabilization using a digital interface
US20170289454A1 (en) * 2016-04-04 2017-10-05 Microsoft Technology Licensing, Llc Method and apparatus for video content stabilization
US20200137308A1 (en) * 2018-10-30 2020-04-30 Qualcomm Incorporated Optical image stabilization techniques

Similar Documents

Publication Publication Date Title
US9628713B2 (en) Systems and methods for optical image stabilization using a digital interface
US11412142B2 (en) Translation correction for optical image stabilization
US20170085740A1 (en) Systems and methods for storing images and sensor data
US10958838B2 (en) Method and device for electronic image stabilization of a captured image
US10506163B2 (en) Systems and methods for synchronizing sensor data
US20190226848A1 (en) Integrated motion processing unit (mpu) with mems inertial sensing and embedded digital electronics
US9013585B2 (en) Image capture device
US10458812B2 (en) Sensor output configuration
US20170041545A1 (en) Systems and methods for stabilizing images
JP6098874B2 (ja) Imaging device and image processing device
US20170241799A1 (en) Systems and methods to compensate for gyroscope offset
CN109951631B (zh) Systems and methods for image stabilization for image capture
KR101856947B1 (ko) Photographing apparatus, motion estimation apparatus, image correction method, motion estimation method, and computer-readable recording medium
US11042984B2 (en) Systems and methods for providing image depth information
US20190132516A1 (en) Systems and methods for digital video stabalization
US20150286279A1 (en) Systems and methods for guiding a user during calibration of a sensor
JP5977611B2 (ja) Shake amount detection device, imaging device, and shake amount detection method
US20230199326A1 (en) Systems and methods for capturing stabilized images
CN112154480B (zh) Positioning method and apparatus for a movable platform, movable platform, and storage medium
WO2023113991A1 (fr) Systems and methods for capturing stabilized images
EP3859498B1 (fr) Electronic pointing device with fast start-up recovery and corresponding method
US9921335B1 (en) Systems and methods for determining linear acceleration
WO2023007789A1 (fr) Inertial measurement unit, method for operating inertial measurement unit, imaging device, display device, and program
US20230417553A1 (en) Orientation calculation apparatus, orientation calculation method, imaging apparatus including orientation calculation apparatus, and method for controlling same
CN104704804A (zh) Imaging device and detection device

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22843915

Country of ref document: EP

Kind code of ref document: A1