WO2019178872A1 - Video image stabilization method and terminal - Google Patents

Video image stabilization method and terminal

Info

Publication number
WO2019178872A1
WO2019178872A1 (PCT/CN2018/080357; CN2018080357W)
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
axis
distance
image
jitter
Prior art date
Application number
PCT/CN2018/080357
Other languages
English (en)
French (fr)
Inventor
李远友
罗巍
刘桓宇
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Priority to EP18910358.3A (EP3745705A4)
Priority to US16/976,820 (US11539887B2)
Priority to RU2020133144A (RU2758460C1)
Priority to PCT/CN2018/080357 (WO2019178872A1)
Priority to CN201880037958.4A (CN110731077B)
Publication of WO2019178872A1

Classifications

    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/6812: Motion detection based on additional sensors, e.g. acceleration sensors
    • G03B 5/00: Adjustment of optical system relative to image or object surface other than for focusing
    • G06T 5/80
    • G06T 7/70: Determining position or orientation of objects or cameras
    • H04N 23/54: Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N 23/671: Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects
    • H04N 23/6811: Motion detection based on the image signal
    • H04N 23/683: Vibration or motion blur correction performed by a processor, e.g. controlling the readout of an image memory
    • H04N 23/69: Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • G06T 2207/10016: Video; Image sequence
    • G06T 2207/10028: Range image; Depth image; 3D point clouds
    • G06T 2207/20132: Image cropping

Definitions

  • the present application relates to the field of image processing, and in particular, to a video image stabilization method and a terminal.
  • To solve the image-jitter problem, optical image stabilization (OIS) technology and electronic image stabilization (EIS) technology have emerged.
  • EIS technology includes two approaches. One is anti-shake processing based on image content: the motion of the image is recognized from the content of preceding and following image frames, the images are registered and aligned, and then appropriately cropped, stretched, deformed, and so on; the disadvantages of this approach are a large amount of computation, slow speed, and high power consumption.
  • The other performs anti-shake processing based on motion-sensor data: the motion between frames is calculated from the motion-sensor data collected during the exposure of each frame, and the images are registered, aligned, and then appropriately cropped, stretched, deformed, and so on. This approach is fast and consumes little power.
  • At present, the second EIS technique can compensate jitter in at most five axes: the three rotational motions of pitch, yaw, and roll, plus translational jitter on the X and Y axes. Compensation for translational jitter on the Z axis has not yet been achieved.
  • Here, the Z axis is the optical axis of the camera, the X axis is the axis perpendicular to the Z axis in the horizontal plane, and the Y axis is the axis perpendicular to the Z axis in the vertical plane; roll means rotation about the Z axis, pitch means rotation about the X axis, and yaw means rotation about the Y axis.
  • the embodiment of the present application provides a video image anti-shake method and a terminal device, which are used to implement compensation for translational jitter in the Z direction.
  • According to a first aspect, a video image stabilization method includes: the terminal turns on the camera, and a video image is captured through the camera; the terminal detects jitter on the X, Y, and Z axes during shooting, where the Z axis is the optical axis of the camera, the X axis is the axis perpendicular to the Z axis in the horizontal plane, and the Y axis is the axis perpendicular to the Z axis in the vertical plane; and the terminal performs anti-shake processing on the video image according to the jitter on the X, Y, and Z axes.
  • The video image stabilization method provided by the present application not only implements compensation for the translational jitter of the terminal on the Z axis, but also achieves a more accurate X/Y translational anti-shake effect through object-distance detection; in addition, by detecting the position of the lens, it calculates the amount of image scaling caused by focusing and then scales in the reverse direction so that the picture stays stable during focusing.
  • In a possible implementation, the method further includes: the terminal detects the object distance, where the object distance is the distance to the object or person in focus.
  • The terminal detecting jitter on the X, Y, and Z axes includes: if the object distance is greater than or equal to an object-distance threshold, the terminal detects rotational jitter on the X, Y, and Z axes; if the object distance is less than the object-distance threshold, the terminal detects rotational jitter on the X, Y, and Z axes and also detects translational jitter on the X, Y, and Z axes.
  • In this way, if the object distance is greater than or equal to the threshold, the terminal can detect only the rotational jitter of the terminal on the X, Y, and Z axes (three-axis video anti-shake), which reduces the amount of data, increases data-processing speed, and lowers power consumption; if the object distance is less than the threshold, the terminal can detect both the rotational and translational jitter of the terminal on the X, Y, and Z axes (six-axis video anti-shake), which achieves a better anti-shake effect.
  • In a possible implementation, the terminal detecting the object distance includes: the terminal detects the object distance by means of a depth sensor, where the depth sensor includes at least one of the following: a laser sensor, a time-of-flight (TOF) sensor, or a structured-light sensor. This implementation provides sensors for detecting the object distance.
  • In a possible implementation, the terminal detecting rotational jitter on the X, Y, and Z axes includes: the terminal detects the rotational jitter on the X, Y, and Z axes by means of an angle sensor, where the angle sensor includes a gyroscope.
  • This embodiment provides a sensor that detects rotational jitter on the X, Y, and Z axes.
  • the terminal detects the translational jitter on the X, Y, and Z axes, including: the terminal detects the translational jitter on the X, Y, and Z axes through the displacement sensor, and the displacement sensor includes an accelerometer.
  • This embodiment provides a sensor that detects translational jitter on the X, Y, and Z axes.
  • the method further includes: detecting, by the terminal, an image distance.
  • the terminal performs anti-shake processing on the video image according to the jitter on the X, Y, and Z axes, including: the terminal performs anti-shake processing on the video image according to the object distance, the image distance, and the jitter on the X, Y, and Z axes.
  • This implementation can further refine the six-axis anti-shake compensation by using the object distance and the image distance.
  • In a possible implementation, the terminal detecting the image distance includes: the terminal detects the image distance by means of a position sensor, where the position sensor includes at least one of the following: a Hall sensor, an anisotropic magnetoresistance (AMR) sensor, a giant magnetoresistance (GMR) sensor, or a tunneling magnetoresistance (TMR) sensor. This implementation provides sensors for detecting the image distance.
  • In a possible implementation, the terminal performing anti-shake processing on the video image according to the object distance, the image distance, and the jitter on the X, Y, and Z axes includes: for image jitter caused by rotation of the terminal about the Z axis, the terminal applies to the video image a rotation compensation opposite in direction and equal in angle; for image jitter caused by rotational jitter of the terminal about the X or Y axis, the terminal compensates the video image by the same translation distance d = v*tan(θ) in the direction opposite to the image translation, and restores the trapezoidal distortion to a rectangle along the rotation-axis direction based on a keystone-correction algorithm, where d is the image displacement distance, v is the image distance, and θ is the rotation angle about the X or Y axis; for image jitter caused by translational jitter of the terminal on the X or Y axis, the terminal compensates the video image by the same translation distance d = (v+u)*Δ/v in the direction opposite to the image translation and crops the excess portion of the video image, where u is the object distance and Δ is the translation distance of the terminal; for image jitter caused by translational jitter of the terminal on the Z axis, the terminal scales the video image based on the scaling ratio (u+Δ)/u, where u is the object distance and Δ is the distance by which the terminal translates away from the object along the Z axis. This implementation provides a specific way of further refining the six-axis anti-shake compensation by using the object distance and the image distance.
  • In a possible implementation, the method further includes: for image jitter caused by focusing of the terminal, the terminal scales the video image based on the scaling ratio [(u-Δ)v]/[(v+Δ)u], where u is the object distance, v is the image distance, and Δ is the distance the lens moves toward the object.
  • This embodiment can compensate for image jitter caused by focusing.
  • According to a second aspect, a terminal is provided, including: a shooting unit configured to turn on a camera and capture a video image through the camera; a detection unit configured to detect jitter on the X, Y, and Z axes during shooting, where the Z axis is the optical axis of the camera, the X axis is the axis perpendicular to the Z axis in the horizontal plane, and the Y axis is the axis perpendicular to the Z axis in the vertical plane; and an anti-shake unit configured to perform anti-shake processing on the video image according to the jitter on the X, Y, and Z axes.
  • Based on the same inventive concept, for the problem-solving principles and beneficial effects of this terminal, reference may be made to the first aspect and its possible method implementations; the implementation of the terminal may likewise refer to the implementations of the first aspect and its possible methods, and repeated content is not described again.
  • According to a third aspect, an embodiment of the present application provides a terminal, including: a processor, a memory, and a communication interface; the memory is configured to store computer-executable instructions, and the processor is coupled to the memory; when the terminal runs, the processor executes the computer-executable instructions stored in the memory so that the terminal performs the method of the first aspect and each of its possible implementations.
  • According to a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing instructions that, when run on any of the foregoing terminals, cause the terminal to perform the method of the first aspect and each of its possible implementations.
  • According to a fifth aspect, an embodiment of the present application provides a computer program product containing instructions that, when run on any of the foregoing terminals, causes the terminal to perform the method of the first aspect and each of its possible implementations.
  • In the embodiments of the present application, the names of the components in the terminal do not limit the devices themselves; in actual implementations, these components may appear under other names. As long as the functions of the components are similar to those of the embodiments of the present application, they fall within the scope of the claims of the present application and their equivalents.
  • FIG. 1 is a schematic diagram of the various kinds of jitter of a video image according to an embodiment of the present application;
  • FIG. 2 is a schematic front view of a terminal according to an embodiment of the present application;
  • FIG. 3 is a first schematic structural diagram of a terminal according to an embodiment of the present application;
  • FIG. 4 is a schematic diagram of five-axis anti-shake for a video image according to an embodiment of the present application;
  • FIG. 5 is a schematic diagram of how the data of each sensor is processed according to an embodiment of the present application;
  • FIG. 6 is a schematic flowchart of a video image stabilization method according to an embodiment of the present application;
  • FIG. 7 is a schematic diagram of an angle sensor according to an embodiment of the present application;
  • FIG. 8 is a schematic diagram of a displacement sensor according to an embodiment of the present application;
  • FIG. 9 is a schematic diagram of determining three-axis or six-axis anti-shake according to the object distance according to an embodiment of the present application;
  • FIG. 10 is a schematic diagram of rotational (pitch) jitter of a terminal about the X axis or rotational (yaw) jitter about the Y axis according to an embodiment of the present application;
  • FIG. 11 is a schematic diagram of translational jitter of a terminal on the X axis or the Y axis according to an embodiment of the present application;
  • FIG. 12 is a schematic diagram of translational jitter of a terminal on the Z axis according to an embodiment of the present application;
  • FIG. 13 is a schematic diagram of image jitter caused by focusing of a terminal according to an embodiment of the present application;
  • FIG. 14 is a schematic diagram of TOF according to an embodiment of the present application;
  • FIG. 15 is a schematic diagram of the mounting positions of a Hall magnet and a Hall sensor according to an embodiment of the present application;
  • FIG. 16 is a schematic diagram of the basic principle of six-axis anti-shake for a video image according to an embodiment of the present application;
  • FIG. 17 is a second schematic structural diagram of a terminal according to an embodiment of the present application;
  • FIG. 18 is a third schematic structural diagram of a terminal according to an embodiment of the present application.
  • The terminal in the embodiments of the present application may be any of various electronic devices having a photographing function, for example, a wearable electronic device (such as a smart watch), a tablet computer, a desktop computer, a virtual reality device, an augmented reality device, a camera, or a video camera; it may also be the mobile phone 200 shown in FIG. 2 or FIG. 3, and the specific form of the terminal is not limited in the embodiments of the present application.
  • the terminal in the embodiment of the present application may be the mobile phone 200.
  • 2 is a front view of the mobile phone 200.
  • The mobile phone 200 may have, for example, a full screen as shown in (a) of FIG. 2, or a notch full screen (also called a special-shaped full screen) as shown in (b) of FIG. 2.
  • FIG. 3 is a schematic diagram of the hardware structure of the mobile phone 200. It should be understood that the illustrated mobile phone 200 is merely one example of a terminal; it may have more or fewer components than shown in the figures, may combine two or more components, or may have a different arrangement of components.
  • The mobile phone 200 may include components such as a radio frequency (RF) circuit 210, a memory 220, an input unit 230, a display unit 240, a sensor 250, an audio circuit 260, a wireless fidelity (Wi-Fi) module 270, a processor 280, a Bluetooth module 281, and a power supply 290.
  • The RF circuit 210 can be used to receive and send signals when receiving and sending information or during a call; it can receive downlink data from a base station and hand it to the processor 280 for processing, and can send uplink data to the base station.
  • RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the memory 220 can be used to store software programs and data.
  • The processor 280 performs the various functions of the mobile phone 200 and processes data by running software programs or data stored in the memory 220.
  • The memory 220 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • The memory 220 stores an operating system that enables the mobile phone 200 to run, such as the operating system developed by Apple, the open-source operating system developed by Google, or the operating system developed by Microsoft.
  • The memory 220 in the present application may store an operating system and various application software, and may also store code for executing the method described in the embodiments of the present application.
  • The input unit 230, such as a touch screen, can be used to receive input numeric or character information and produce signal inputs related to user settings and function control of the mobile phone 200.
  • The input unit 230 may include a touch panel 231 disposed on the front surface of the mobile phone 200 as shown in FIG. 2, which can collect the user's touch operations on or near it.
  • the input unit 230 in the present application can collect the touch operation of the user.
  • the display unit 240 (ie, the display screen) can be used to display information input by the user or information provided to the user and a graphical user interface (GUI) of various menus of the mobile phone 200.
  • the display unit 240 may include a display panel 241 disposed on the front side of the mobile phone 200.
  • the display panel 241 can be configured in the form of a liquid crystal display, a light emitting diode, or the like.
  • Display unit 240 can be used to display the various graphical user interfaces described in this application.
  • the touch panel 231 can be overlaid on the display panel 241.
  • the touch panel 231 can be integrated with the display panel 241 to implement the input and output functions of the mobile phone 200.
  • the display unit 240 in the present application can display a captured video or the like.
  • the handset 200 can also include at least one sensor 250.
  • the angle sensor is used to detect the rotational motion of the device in the X, Y, and Z directions, and may be a gyroscope or other motion sensor.
  • the displacement sensor is used to detect the translational jitter of the device in the X, Y, and Z directions, and may be an accelerometer or other motion sensor.
  • The depth sensor is used to detect the object distance of the shooting scene, and may be a depth-detecting device such as a laser sensor, a time-of-flight (TOF) sensor, or a structured-light sensor.
  • The position sensor is used to detect the position of the lens (that is, the image distance), and may be a device capable of detecting the lens position, such as a Hall sensor, an anisotropic magnetoresistance (AMR) sensor, a giant magnetoresistance (GMR) sensor, or a tunneling magnetoresistance (TMR) sensor.
  • The image sensor is used to sense light and generate an image, and may be an optical image sensor such as a complementary metal oxide semiconductor (CMOS) sensor or a charge-coupled device (CCD).
  • the mobile phone 200 can also be equipped with other sensors such as a barometer, a hygrometer, a thermometer, an infrared sensor, and the like.
  • the audio circuit 260, the speaker 261, and the microphone 262 can provide an audio interface between the user and the handset 200.
  • The audio circuit 260 can convert received audio data into an electrical signal and transmit it to the speaker 261, which converts it into a sound signal for output.
  • Conversely, the microphone 262 converts a collected sound signal into an electrical signal, which the audio circuit 260 receives and converts into audio data; the audio data is then output to the RF circuit 210 to be sent to, for example, another mobile phone, or output to the memory 220 for further processing.
  • the microphone 262 of the present application can capture audio synchronized with the video.
  • Wi-Fi is a short-range wireless transmission technology
  • the mobile phone 200 can help users to send and receive emails, browse web pages, and access streaming media through the Wi-Fi module 270, which provides users with wireless broadband Internet access.
  • The processor 280 is the control center of the mobile phone 200; it connects the various parts of the entire phone by using various interfaces and lines, and performs the various functions of the mobile phone 200 and processes data by running or executing software programs stored in the memory 220 and invoking data stored in the memory 220.
  • The processor 280 can include one or more processing units; the processor 280 can also integrate an application processor and a baseband processor, where the application processor primarily handles the operating system, user interfaces, applications, and so on, and the baseband processor primarily handles wireless communication. It can be understood that the baseband processor may alternatively not be integrated into the processor 280.
  • the processor 280 in this application may include an image signal processor (ISP).
  • the Bluetooth module 281 is configured to perform information interaction with other Bluetooth devices having a Bluetooth module through a Bluetooth protocol.
  • the mobile phone 200 can establish a Bluetooth connection through a Bluetooth module 281 and a wearable electronic device (such as a smart watch) that also has a Bluetooth module, thereby performing data interaction.
  • the handset 200 also includes a power source 290 (such as a battery) that supplies power to the various components.
  • the power supply can be logically coupled to the processor 280 through a power management system to manage functions such as charging, discharging, and power consumption through the power management system.
  • While the terminal is shooting video, image jitter between frames may occur because the user is holding the terminal by hand or is in motion.
  • The anti-shake function of the terminal can compensate a video image moving in a certain direction by moving it in the opposite direction, thereby eliminating, as far as possible, the image jitter between frames.
  • Referring to FIG. 4, the prior art can compensate video jitter in at most five axes: the rotational jitter of pitch, yaw, and roll is detected by an angular-velocity sensor, the X-axis and Y-axis translational jitter is detected by an acceleration sensor, and the compensation (anti-shake) result is finally synthesized. For the Z-axis translational jitter, that is, jitter back and forth along the optical axis, there is currently no good way to handle it. Moreover, the current compensation of the X-axis and Y-axis translational jitter does not take the object distance into account, so the compensation effect is poor.
  • In addition, when the terminal focuses, the change of the lens position also makes the image in the video alternately larger and smaller, which adversely affects the stability of the video image. Focusing as described in the present application means adjusting the image distance, that is, adjusting the distance of the lens relative to the optical image sensor.
  • Referring to FIG. 5, the terminal of the present application performs rotational-jitter detection with an angle sensor, translational-jitter detection with a displacement sensor, object-distance detection with a depth sensor, and lens-position detection with a position sensor; the processor combines the above detection results to perform video-compensation calculation and obtains the rotation compensation amounts of the X/Y/Z axes and/or the translation compensation amounts of the X/Y/Z axes.
  • The terminal acquires the source video stream during detection through the optical image sensor; after N frames of the source video stream are buffered, an image signal processor (ISP) or graphics processing unit (GPU) performs video image processing by combining the above compensation amounts with the source video stream to obtain a stable video stream after anti-shake processing, which can then be directly displayed on the display screen or used for video encoding.
  • the video image stabilization function of the present application can be turned on or off by a switch control.
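For illustration only, the per-frame pipeline just described (sample the sensors, buffer the source stream, have the ISP/GPU apply the compensation amounts) could be organized as in the minimal Python sketch below. Every name in it (camera, sensors, compute_compensation, warp_frame, the buffer depth N) is a hypothetical placeholder, not the patent's implementation.

```python
# Hedged sketch of the stabilization pipeline described above; the
# compensation and warping callables are supplied by the caller because
# the patent does not fix their implementation.
from collections import deque

N = 5  # number of source frames to buffer; an arbitrary example value

def stabilize_stream(camera, sensors, display, compute_compensation, warp_frame):
    buffer = deque(maxlen=N)               # cache N frames of the source stream
    for frame in camera:                   # source video stream from the image sensor
        meta = sensors.sample()            # angle/displacement/depth/position readings
        buffer.append((frame, meta))
        if len(buffer) == N:
            oldest_frame, oldest_meta = buffer[0]
            comp = compute_compensation(oldest_meta)  # rotation/translation/scale amounts
            stable = warp_frame(oldest_frame, comp)   # crop, rotate, scale, keystone-correct
            display.show(stable)           # preview on screen, or feed the video encoder
```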
  • FIG. 6 is a schematic flowchart of a video image stabilization method according to an embodiment of the present application; the method specifically includes:
  • S101: The terminal turns on the camera and captures a video image through the camera.
  • The terminal can open an application that can capture video through the camera, such as the camera app, WeChat, and the like.
  • the app can control the camera and capture video through optical image sensors such as CMOS or CCD in the camera.
  • the camera can be a front camera or a rear camera.
  • While the terminal shakes, the video images acquired by the optical image sensor differ considerably between consecutive frames, which appears on the display screen as image jitter.
  • the image data acquired by the optical image sensor may be matrix pixel data, and the resolution of the finally processed video stream may be smaller than the resolution of the image data acquired by the image sensor, and the design may provide redundant data for image processing.
  • S102: The terminal detects jitter on the X, Y, and Z axes during shooting.
  • Step S102 and step S101 may be performed simultaneously, so that the jitter of the terminal corresponds to the video images being captured.
  • The terminal can detect the three-axis jitter of the terminal, that is, the rotational jitter on the X, Y, and Z axes, including pitch, yaw, and roll.
  • Alternatively, the terminal can detect the six-axis jitter of the terminal, that is, the rotational and translational jitter on the X, Y, and Z axes, including pitch, yaw, roll, X-axis translational jitter, Y-axis translational jitter, and Z-axis translational jitter.
  • Here, the Z axis is the optical axis of the camera, the X axis is the axis perpendicular to the Z axis in the horizontal plane, and the Y axis is the axis perpendicular to the Z axis in the vertical plane; roll means rotation about the Z axis, pitch means rotation about the X axis, and yaw means rotation about the Y axis.
  • the rotation jitter of the terminal on the X, Y, and Z axes can be detected by an angle sensor.
  • FIG. 7 is a schematic diagram of an angle sensor, which may be a gyroscope or another motion sensor.
  • The angle sensor can be mounted on the terminal body or in the camera module. If the angle sensor is a gyroscope, the gyroscope output signal is the angular velocity of the terminal's motion; integrating the gyroscope output signal once yields the angles through which the terminal has rotated, including the pitch angle Ω_R, the yaw angle Ω_Y, and the roll angle Ω_P.
  • In modern electronic products, micro-electro-mechanical system (MEMS) gyroscopes are generally used to measure angular velocity. A MEMS gyroscope uses the Coriolis force, the tangential force experienced by a rotating object when it moves radially, to estimate angular velocity. MEMS gyroscopes typically have movable capacitor plates in two directions, which measure the capacitance change caused by the Coriolis motion. Since the Coriolis force is proportional to the angular velocity, the angular velocity can be calculated from the change in capacitance.
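As a toy illustration of the single integration step just described (not taken from the patent), the snippet below accumulates gyroscope angular-velocity samples into rotation angles. The rectangular (cumulative-sum) integration and the constant sampling step are assumptions; a real implementation would also handle bias and drift, which the patent does not detail.

```python
import numpy as np

def angles_from_gyro(omega, dt):
    """Integrate gyroscope angular velocities once to obtain the angles
    through which the terminal has rotated about the X, Y and Z axes
    (pitch, yaw, roll).

    omega: (n, 3) array of angular-velocity samples in rad/s
    dt:    sampling step in seconds (assumed constant)
    """
    # One integration of the gyroscope output yields the rotation angles.
    return np.cumsum(omega, axis=0) * dt
```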
  • The translational jitter of the terminal on the X, Y, and Z axes can be detected by a displacement sensor.
  • FIG. 8 is a schematic diagram of a displacement sensor, which may be an accelerometer or another motion sensor.
  • The displacement sensor can be mounted on the device body or in the camera module. If the sensor is an acceleration sensor, its output signal is the acceleration of the terminal's motion; integrating the output once gives the linear velocity of the terminal's motion, and integrating the linear velocity once more gives the distance moved by the terminal, including the translation distances of the terminal on the X, Y, and Z axes.
  • MEMS accelerometers in modern electronic products include piezoelectric MEMS accelerometers, capacitive MEMS accelerometers, and the like.
  • A piezoelectric MEMS accelerometer uses the piezoelectric effect to estimate acceleration: inside it, a proof mass is supported by a rigid body; under motion the mass exerts pressure, the rigid body strains, and the acceleration is converted into an electrical signal output. A capacitive MEMS accelerometer also contains a proof mass, arranged as a standard parallel-plate capacitor; changes in acceleration move the mass, changing the spacing and overlapping area of the capacitor plates, and the acceleration is calculated by measuring the capacitance change.
  • the above gyroscopes and accelerometers can be designed in the same electronic component or separately as two independent electronic components.
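Similarly, a toy sketch of the double integration described above, under the same assumptions (rectangular integration, constant sampling step, no bias removal):

```python
import numpy as np

def translation_from_accel(accel, dt):
    """Integrate accelerometer output twice to obtain the translation
    distances of the terminal on the X, Y and Z axes.

    accel: (n, 3) array of acceleration samples in m/s^2
    dt:    sampling step in seconds (assumed constant)
    """
    velocity = np.cumsum(accel, axis=0) * dt      # first integration: linear velocity
    distance = np.cumsum(velocity, axis=0) * dt   # second integration: distance moved
    return distance
```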
  • Performing anti-shake processing on the image based on the above sensor data alone achieves only a basic anti-shake effect, and many other factors may affect imaging quality. The present application therefore makes improvements in the following aspects:
  • In daily use it can be observed that when the object distance is large, the translational jitter of the terminal on the X, Y, and Z axes has little effect on the image; when the object distance is small, the translational jitter of the terminal on the X, Y, and Z axes has a large effect on the image.
  • The object distance described in the present application is the distance to the object or person in focus. Therefore, referring to FIG. 9, the terminal can detect the object distance.
  • If the object distance is greater than or equal to the object-distance threshold, the terminal can detect only the rotational jitter of the terminal on the X, Y, and Z axes (three-axis video anti-shake), which reduces the amount of data, increases data-processing speed, and lowers power consumption; if the object distance is less than the threshold, the terminal can detect both the rotational and translational jitter of the terminal on the X, Y, and Z axes (six-axis video anti-shake), which achieves a better anti-shake effect.
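The threshold logic above might look like the following sketch; the 2 m threshold is an invented example value, since the patent does not state a number.

```python
def select_antishake_mode(object_distance, threshold=2.0):
    """Choose three-axis or six-axis video anti-shake from the detected
    object distance (in metres); the threshold value is hypothetical."""
    if object_distance >= threshold:
        # Distant subject: translational jitter barely affects the image,
        # so detect only rotational jitter (pitch, yaw, roll).
        return "three-axis"
    # Near subject: detect rotational and translational jitter on X, Y, Z.
    return "six-axis"
```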
  • In addition, referring to FIG. 10, according to the lens-imaging principle, the image jitter caused by rotational jitter of the terminal about the X or Y axis satisfies Formula 1: d = v*tan(θ), where d is the image displacement distance, v is the image distance, and θ is the rotation angle about the X or Y axis. This image jitter is independent of the object distance u and depends only on the image distance v and the rotation angle θ, so the corresponding compensation amount also depends only on v and θ.
  • Referring to FIG. 11, the image jitter caused by translational jitter of the terminal on the X or Y axis satisfies Formula 2: d = (v+u)*Δ/v, where Δ is the translation distance of the terminal; it is related not only to the translation distance Δ but also to the image distance v and the object distance u, so the corresponding compensation amount is likewise related to Δ, v, and u.
  • Referring to FIG. 12, the image scaling caused by translational jitter of the terminal on the Z axis satisfies Formula 3: r = u/(u+Δ), where Δ is the distance the terminal translates away from the object along the Z axis: the image becomes smaller as the object distance increases and larger as it decreases. This image jitter is related not only to the translation distance Δ of the terminal but also to the object distance u, so the corresponding compensation amount is likewise related to Δ and u.
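Formulas 1 to 3 can be transcribed directly; the sketch below only evaluates the image displacement or scaling that the jitter produces (the compensation applies the opposite shift, or the reciprocal scale):

```python
import math

def shift_from_rotation(v, theta):
    """Formula 1: image displacement d = v*tan(theta) caused by rotation
    about the X or Y axis; independent of the object distance u."""
    return v * math.tan(theta)

def shift_from_translation(u, v, delta):
    """Formula 2: image displacement d = (v+u)*delta/v caused by
    translational jitter of the terminal on the X or Y axis."""
    return (v + u) * delta / v

def scale_from_z_translation(u, delta):
    """Formula 3: image scaling r = u/(u+delta) when the terminal moves a
    distance delta away from the object along the Z axis."""
    return u / (u + delta)
```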
  • In summary, since the image distance is the distance of the lens from the optical image sensor, the terminal must detect not only the rotational and translational jitter on the X, Y, and Z axes but also the changes in the object distance and the lens position.
  • the terminal can detect the object distance by using a depth sensor.
  • the depth sensor may include at least one of the following sensors: a laser sensor, a time of flight (TOF) sensor, a structured light sensor, and the like.
  • the depth sensor can be mounted on the body of the device or installed in the camera module.
  • TOF obtains the object distance by continuously emitting light pulses toward the photographed object, receiving with a sensor the reflected light pulses returned from the object, and detecting the (round-trip) flight time of the emitted and received light pulses.
  • Referring to FIG. 14, the object distance can be calculated from the phase difference ΔΦ between the emitted and reflected light, the speed of light c, and the frequency f of the emitted light pulses.
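A small numeric sketch of the phase-based relation just mentioned; the modulation frequency and phase value are assumed examples.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_object_distance(delta_phi, f):
    """Object distance from an indirect TOF measurement: the round trip
    takes delta_phi / (2*pi*f) seconds, so u = c*delta_phi / (4*pi*f)."""
    return C * delta_phi / (4 * math.pi * f)

# Example with assumed values: a phase difference of pi/2 at a 20 MHz
# pulse frequency corresponds to an object distance of about 1.87 m.
print(tof_object_distance(math.pi / 2, 20e6))
```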
  • the terminal can detect the position (image distance) of the lens through the position sensor to detect the focusing motion of the lens, that is, the change of the image distance.
  • The position sensor may include at least one of the following: a Hall sensor, an anisotropic magnetoresistance (AMR) sensor, a giant magnetoresistance (GMR) sensor, a tunneling magnetoresistance (TMR) sensor, or another device capable of detecting the lens position.
  • The Hall sensor is a position sensor based on the Hall effect.
  • When carriers in the Hall semiconductor material move in an applied magnetic field, their trajectories deflect under the Lorentz force, and charge accumulates on the two sides of the material, forming an electric field perpendicular to the current direction.
  • When the Lorentz force on the carriers balances the electric-field force, a stable potential difference, that is, the Hall voltage, is established across the two sides of the Hall semiconductor material.
  • the magnitude of the magnetic field strength can be estimated by measuring the Hall voltage, and the position of the Hall magnet can be estimated by the magnitude of the magnetic field strength.
  • the Hall magnet is usually mounted on the lens barrel to move as the lens moves.
  • the Hall sensor is usually mounted in a fixed position, such as on a substrate.
  • the position change of the lens drives the Hall magnet to move, thereby changing the magnetic field strength induced by the Hall sensor, causing a change in the Hall voltage.
  • the amount of displacement of the lens movement can be calculated by measuring the amount of change in the Hall voltage.
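Turning the measured change in Hall voltage into a lens displacement could look like the sketch below; the locally linear voltage-to-travel calibration is an assumption, and a real module would use a calibrated curve.

```python
def lens_displacement(hall_voltage, v_ref, volts_per_mm):
    """Estimate how far the lens (and the Hall magnet on its barrel) has
    moved, from the change in Hall voltage.

    v_ref:        Hall voltage at a calibrated reference lens position
    volts_per_mm: assumed locally linear sensitivity of the magnet/sensor pair
    """
    return (hall_voltage - v_ref) / volts_per_mm  # lens travel in mm
```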
  • S103: The terminal performs anti-shake processing (compensation) on the video image according to the jitter of the terminal on the X, Y, and Z axes.
  • Anti-shake based only on the sensor data of the displacement sensor or the angle sensor achieves only a basic anti-shake effect, so the present application further improves the method: the terminal can perform anti-shake processing on the video image according to the object distance, the image distance, and the jitter of the terminal on the X, Y, and Z axes.
  • For image jitter caused by rotation of the terminal about the Z axis, the terminal can apply to the video image a rotation compensation opposite in direction and equal in angle. For example, taking the roll shown in the figure as an example, if the terminal rotates clockwise about the Z axis by an angle θ and the captured video image therefore rotates counterclockwise by the angle θ, the terminal rotates the video image clockwise by the angle θ to achieve compensation.
  • For image jitter caused by rotational jitter of the terminal about the X or Y axis, the terminal can compensate the video image by the same translation distance d, according to Formula 1, in the direction opposite to the image translation, and restore the trapezoidal distortion of the jittered video image to a rectangle in the rotation-axis direction based on a keystone-correction algorithm.
  • The specific keystone-correction algorithm may include image-space transformation, interpolation operations, and the like, and is not described in detail here.
  • For example, referring to the content shown in FIG. 10 and Formula 1, the terminal can compensate the video image by the distance d in the vertical or horizontal direction and restore the trapezoidal distortion to a rectangle by the keystone-correction algorithm.
  • For image jitter caused by translational jitter of the terminal on the X or Y axis, the terminal can compensate the video image by the same translation distance d, according to Formula 2, in the direction opposite to the image translation, and crop the excess portion of the video image.
  • For example, referring to the content shown in FIG. 11 and Formula 2, the terminal can compensate the video image by the distance d along the Y axis and crop the excess upper portion of the video image.
  • For image jitter caused by translational jitter of the terminal on the Z axis, the terminal can scale the video image based on the scaling ratio (u+Δ)/u (the reciprocal of Formula 3), where u is the object distance and Δ is the distance the terminal translates away from the object along the Z axis; Δ can be positive or negative. If the image becomes larger as the terminal approaches the photographed object, the terminal can shrink the image according to the formula; if the image becomes smaller as the terminal moves away from the object, the terminal can enlarge the image according to the formula.
  • For image jitter caused by focusing of the terminal, the terminal can scale the video image based on the scaling ratio [(u-Δ)v]/[(v+Δ)u] (the reciprocal of Formula 4), where u is the object distance, v is the image distance, and Δ is the distance the lens moves toward the object. The compensation steps above are gathered in the sketch that follows.
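Here is a hedged sketch of how the compensation amounts for one frame might be computed from the detected jitter, the object distance u, and the image distance v. The dictionary fields and sign conventions are illustrative assumptions, not the patent's data model.

```python
import math

def compensation_amounts(jitter, u, v):
    """Compute per-frame compensation amounts from detected jitter.

    jitter: dict with rotation angles 'pitch', 'yaw', 'roll' (rad),
            terminal translations 'dx', 'dy', 'dz', and 'focus_delta'
            (lens travel toward the object); hypothetical field names.
    u, v:   object distance and image distance.
    """
    comp = {}
    # Roll about Z: counter-rotate the image by the same angle.
    comp["rotate"] = -jitter["roll"]
    # Pitch/yaw about X/Y (Formula 1): shift by d = v*tan(theta) opposite
    # to the image translation, then keystone-correct along the axis.
    comp["shift_from_pitch"] = -v * math.tan(jitter["pitch"])
    comp["shift_from_yaw"] = -v * math.tan(jitter["yaw"])
    # X/Y translation (Formula 2): shift by d = (v+u)*delta/v, then crop.
    comp["shift_from_dx"] = -(v + u) * jitter["dx"] / v
    comp["shift_from_dy"] = -(v + u) * jitter["dy"] / v
    # Z translation (reciprocal of Formula 3): scale by (u+delta)/u.
    comp["scale_from_dz"] = (u + jitter["dz"]) / u
    # Focusing (reciprocal of Formula 4): lens moved focus_delta toward
    # the object, so scale by ((u-delta)*v) / ((v+delta)*u).
    fd = jitter["focus_delta"]
    comp["scale_from_focus"] = ((u - fd) * v) / ((v + fd) * u)
    return comp
```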
  • the video image after the anti-shake processing of the present application can be directly displayed on the display screen, or used for video coding, etc., which is not limited.
  • The video image stabilization method provided by the present application not only implements compensation for the translational jitter of the terminal on the Z axis, but also achieves a more accurate X/Y translational anti-shake effect through object-distance detection; in addition, by detecting the position of the lens, it calculates the amount of image scaling caused by focusing and then scales in the reverse direction so that the picture stays stable during focusing.
  • The above terminal and the like include corresponding hardware structures and/or software modules for performing each function.
  • In combination with the units and algorithm steps of the examples described in the embodiments disclosed herein, the embodiments of the present application can be implemented by hardware or by a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of the embodiments of the present application.
  • the embodiment of the present application may perform the division of the function modules on the terminal or the like according to the foregoing method example.
  • each function module may be divided according to each function, or two or more functions may be integrated into one processing module.
  • the above integrated modules can be implemented in the form of hardware or in the form of software functional modules. It should be noted that the division of the module in the embodiment of the present application is schematic, and is only a logical function division, and the actual implementation may have another division manner.
  • FIG. 17 is a schematic diagram showing a possible structure of a terminal involved in the foregoing embodiment.
  • The terminal 200 includes: a shooting unit 2011, a detecting unit 2012, and an anti-shake unit 2013.
  • the photographing unit 2011 is for supporting the terminal 200 to execute the process S101 in FIG. 6; the detecting unit 2012 is for supporting the terminal 200 to execute the process S102 in FIG. 6; the anti-shake unit 2013 is for supporting the terminal 200 to execute the process S103 in FIG. 6. All the related content of the steps involved in the foregoing method embodiments may be referred to the functional descriptions of the corresponding functional modules, and details are not described herein again.
  • The above-described shooting unit 2011 can be integrated as a shooting module, the detecting unit 2012 as a detecting module, and the anti-shake unit 2013 as a processing module.
  • the terminal may further include a storage module, a communication module, an input and output module, and the like.
  • the processing module 2021 is configured to control and manage the action of the terminal.
  • the communication module 2022 is configured to support communication between the terminal and other network entities such as a cloud server, other terminals, and the like.
  • the input/output module 2023 is for receiving information input by the user or outputting information provided to the user and various menus of the terminal.
  • the storage module 2024 is configured to save program codes and data of the terminal.
  • the shooting module 2025 is for taking a video image.
  • the detecting module 2026 is configured to detect the jitter of the terminal.
  • the processing module 2021 may be a processor or a controller, for example, may be a central processing unit (CPU), a GPU, a general-purpose processor, a digital signal processor (DSP), an application specific integrated circuit. (application-specific integrated circuit, ASIC), field programmable gate array (FPGA) or other programmable logic device, transistor logic device, hardware component, or any combination thereof. It is possible to implement or carry out the various illustrative logical blocks, modules and circuits described in connection with the present disclosure.
  • the processor may also be a combination of computing functions, for example, including one or more microprocessor combinations, a combination of a DSP and a microprocessor, and the like.
  • the communication module 2022 may be a transceiver, a transceiver circuit, an input/output device, a communication interface, or the like.
  • the communication module 2022 may specifically be a Bluetooth device, a Wi-Fi device, a peripheral interface, or the like.
  • The storage module 2024 may be a memory, which may include a high-speed random access memory (RAM), and may also include a non-volatile memory, such as a magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the input/output module 2023 can be an input/output device such as a touch screen, a keyboard, a microphone, and a display.
  • the display may specifically be configured in the form of a liquid crystal display, an organic light emitting diode or the like.
  • a touch panel can be integrated on the display for collecting touch events on or near the display, and transmitting the collected touch information to other devices (such as a processor, etc.).
  • the shooting module 2025 can be an optical image sensor.
  • the detection module 2026 can include an angle sensor, a displacement sensor, a depth sensor, a position sensor, and the like.
  • When the storage module is a memory, the input/output module is a display, the processing module is a processor, and the communication module is a communication interface, the memory is configured to store computer-executable instructions, and the processor is coupled to the memory; when the terminal runs, the processor executes the computer-executable instructions stored in the memory to cause the terminal to perform the method described in FIG. 6.
  • Embodiments of the present invention also provide a computer storage medium storing one or more programs, the one or more programs including instructions that, when executed by the terminal, cause the terminal to perform the method described in FIG. 6.
  • Embodiments of the present invention also provide a computer program product containing instructions that, when run on a terminal, cause the terminal to perform the method described in FIG. 6.
  • the terminal, the computer storage medium or the computer program product provided by the embodiments of the present invention are all used to execute the corresponding method provided above. Therefore, the beneficial effects that can be achieved can be referred to the corresponding method provided above. The beneficial effects will not be described here.
  • It should be understood that the sequence numbers of the foregoing processes do not imply an order of execution; the order of execution of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation processes of the embodiments of the present application.
  • the disclosed systems, devices, and methods may be implemented in other manners.
  • the device embodiments described above are merely illustrative.
  • The division of units is only a logical function division; in actual implementation there may be other division manners. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be electrical, mechanical or otherwise.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • All or part of the above embodiments may be implemented by software, hardware, firmware, or any combination thereof. When a software program is used, they may be implemented wholly or partly in the form of a computer program product.
  • the computer program product includes one or more computer instructions.
  • When the computer program instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part.
  • the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable device.
  • The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (for example, infrared, radio, or microwave).
  • The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a server or data center, that integrates one or more available media.
  • The available medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid state disk (SSD)), or the like.

Abstract

The present application discloses a video image stabilization method and terminal, relating to the field of image processing and used to implement compensation for translational jitter in the Z direction. A video image stabilization method includes: a terminal turns on a camera and captures a video image through the camera; the terminal detects jitter on the X, Y, and Z axes during shooting, where the Z axis is the optical axis of the camera, the X axis is the axis perpendicular to the Z axis in the horizontal plane, and the Y axis is the axis perpendicular to the Z axis in the vertical plane; and the terminal performs anti-shake processing on the video image according to the jitter on the X, Y, and Z axes. The embodiments of the present application are applied to video image stabilization.

Description

Video image stabilization method and terminal

TECHNICAL FIELD

The present application relates to the field of image processing, and in particular, to a video image stabilization method and a terminal.

BACKGROUND

Shooting while the user is in motion, or hand shake during shooting, may cause the captured image or video to suffer from jitter and blur. To solve the image-jitter problem, optical image stabilization (OIS) technology and electronic image stabilization (EIS) technology have emerged. In OIS, a motion sensor detects the jitter data of the shooting device during shooting, and an OIS controller drives the OIS motor according to the jitter data to move the lens or the image sensor. However, because the shooting device may keep shaking or moving, obvious jitter and misalignment still exist between several consecutive frames even when OIS is used.

EIS technology includes two approaches. One is anti-shake processing based on image content: the motion of the image is recognized from the content of preceding and following image frames, the images are registered and aligned, and then appropriately cropped, stretched, deformed, and so on; the disadvantages of this approach are a large amount of computation, slow speed, and high power consumption. The other performs anti-shake processing based on motion-sensor data: the motion between frames is calculated from the motion-sensor data collected during the exposure of each frame, the images are registered and aligned, and then appropriately cropped, stretched, deformed, and so on; this approach is fast and consumes little power.

Referring to FIG. 1, the second EIS technique can currently compensate jitter in at most five axes: the three rotational motions of pitch, yaw, and roll, plus translational jitter on the X and Y axes. Compensation for translational jitter on the Z axis has not yet been achieved. Here, the Z axis is the optical axis of the camera, the X axis is the axis perpendicular to the Z axis in the horizontal plane, and the Y axis is the axis perpendicular to the Z axis in the vertical plane; roll means rotation about the Z axis, pitch means rotation about the X axis, and yaw means rotation about the Y axis.
SUMMARY

The embodiments of the present application provide a video image stabilization method and a terminal device, which are used to implement compensation for translational jitter in the Z direction.

To achieve the above objective, the embodiments of the present application adopt the following technical solutions:

According to a first aspect, a video image stabilization method is provided, including: a terminal turns on a camera and captures a video image through the camera; the terminal detects jitter on the X, Y, and Z axes during shooting, where the Z axis is the optical axis of the camera, the X axis is the axis perpendicular to the Z axis in the horizontal plane, and the Y axis is the axis perpendicular to the Z axis in the vertical plane; and the terminal performs anti-shake processing on the video image according to the jitter on the X, Y, and Z axes. The video image stabilization method provided by the present application not only implements compensation for the translational jitter of the terminal on the Z axis, but also achieves a more accurate X/Y translational anti-shake effect through object-distance detection; in addition, by detecting the position of the lens, it calculates the amount of image scaling caused by focusing and then scales in the reverse direction so that the picture stays stable during focusing.

In a possible implementation, the method further includes: the terminal detects the object distance, where the object distance is the distance to the object or person in focus. The terminal detecting jitter on the X, Y, and Z axes includes: if the object distance is greater than or equal to an object-distance threshold, the terminal detects rotational jitter on the X, Y, and Z axes; if the object distance is less than the object-distance threshold, the terminal detects rotational jitter on the X, Y, and Z axes and also detects translational jitter on the X, Y, and Z axes. With this implementation, if the object distance is greater than or equal to the threshold, the terminal can detect only the rotational jitter of the terminal on the X, Y, and Z axes (three-axis video anti-shake), which reduces the amount of data, increases data-processing speed, and lowers power consumption; if the object distance is less than the threshold, the terminal can detect both the rotational and translational jitter of the terminal on the X, Y, and Z axes (six-axis video anti-shake), which achieves a better anti-shake effect.

In a possible implementation, the terminal detecting the object distance includes: the terminal detects the object distance by means of a depth sensor, where the depth sensor includes at least one of the following: a laser sensor, a time-of-flight (TOF) sensor, or a structured-light sensor. This implementation provides sensors for detecting the object distance.

In a possible implementation, the terminal detecting rotational jitter on the X, Y, and Z axes includes: the terminal detects the rotational jitter on the X, Y, and Z axes by means of an angle sensor, where the angle sensor includes a gyroscope. This implementation provides a sensor for detecting rotational jitter on the X, Y, and Z axes.

In a possible implementation, the terminal detecting translational jitter on the X, Y, and Z axes includes: the terminal detects the translational jitter on the X, Y, and Z axes by means of a displacement sensor, where the displacement sensor includes an accelerometer. This implementation provides a sensor for detecting translational jitter on the X, Y, and Z axes.

In a possible implementation, the method further includes: the terminal detects the image distance. The terminal performing anti-shake processing on the video image according to the jitter on the X, Y, and Z axes includes: the terminal performs anti-shake processing on the video image according to the object distance, the image distance, and the jitter on the X, Y, and Z axes. This implementation can further refine the six-axis anti-shake compensation by using the object distance and the image distance.

In a possible implementation, the terminal detecting the image distance includes: the terminal detects the image distance by means of a position sensor, where the position sensor includes at least one of the following: a Hall sensor, an anisotropic magnetoresistance (AMR) sensor, a giant magnetoresistance (GMR) sensor, or a tunneling magnetoresistance (TMR) sensor. This implementation provides sensors for detecting the image distance.

In a possible implementation, the terminal performing anti-shake processing on the video image according to the object distance, the image distance, and the jitter on the X, Y, and Z axes includes: for image jitter caused by rotation of the terminal about the Z axis, the terminal applies to the video image a rotation compensation opposite in direction and equal in angle; for image jitter caused by rotational jitter of the terminal about the X or Y axis, the terminal compensates the video image by the same translation distance d, based on the formula d = v*tan(θ), in the direction opposite to the image translation, and restores the trapezoidal distortion of the video image to a rectangle along the rotation-axis direction based on a keystone-correction algorithm, where d is the image displacement distance, v is the image distance, and θ is the rotation angle about the X or Y axis; for image jitter caused by translational jitter of the terminal on the X or Y axis, the terminal compensates the video image by the same translation distance d, based on the formula d = (v+u)*Δ/v, in the direction opposite to the image translation, and crops the excess portion of the video image, where u is the object distance and Δ is the translation distance of the terminal; for image jitter caused by translational jitter of the terminal on the Z axis, the terminal scales the video image based on the scaling ratio (u+Δ)/u, where u is the object distance and Δ is the distance by which the terminal translates away from the object along the Z axis. This implementation provides a specific way of further refining the six-axis anti-shake compensation by using the object distance and the image distance.

In a possible implementation, the method further includes: for image jitter caused by focusing of the terminal, the terminal scales the video image based on the scaling ratio [(u-Δ)v]/[(v+Δ)u], where u is the object distance, v is the image distance, and Δ is the distance the lens moves toward the object. This implementation can compensate for image jitter caused by focusing.

According to a second aspect, a terminal is provided, including: a shooting unit configured to turn on a camera and capture a video image through the camera; a detection unit configured to detect jitter on the X, Y, and Z axes during shooting, where the Z axis is the optical axis of the camera, the X axis is the axis perpendicular to the Z axis in the horizontal plane, and the Y axis is the axis perpendicular to the Z axis in the vertical plane; and an anti-shake unit configured to perform anti-shake processing on the video image according to the jitter on the X, Y, and Z axes. Based on the same inventive concept, for the problem-solving principles and beneficial effects of this terminal, reference may be made to the first aspect, its possible method implementations, and the beneficial effects they bring; the implementation of the terminal may likewise refer to the implementations of the first aspect and its possible methods, and repeated content is not described again.

According to a third aspect, an embodiment of the present application provides a terminal, including: a processor, a memory, and a communication interface; the memory is configured to store computer-executable instructions, and the processor is coupled to the memory; when the terminal runs, the processor executes the computer-executable instructions stored in the memory so that the terminal performs the method of the first aspect and each of its possible implementations.

According to a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing instructions that, when run on any of the foregoing terminals, cause the terminal to perform the method of the first aspect and each of its possible implementations.

According to a fifth aspect, an embodiment of the present application provides a computer program product containing instructions that, when run on any of the foregoing terminals, causes the terminal to perform the method of the first aspect and each of its possible implementations.

In the embodiments of the present application, the names of the components in the terminal do not limit the devices themselves; in actual implementations, these components may appear under other names. As long as the functions of the components are similar to those of the embodiments of the present application, they fall within the scope of the claims of the present application and their equivalents.

In addition, for the technical effects brought by any of the designs of the third to fifth aspects, reference may be made to the technical effects brought by the different designs of the first aspect; details are not described here again.
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a schematic diagram of the various kinds of jitter of a video image according to an embodiment of the present application;
FIG. 2 is a schematic front view of a terminal according to an embodiment of the present application;
FIG. 3 is a first schematic structural diagram of a terminal according to an embodiment of the present application;
FIG. 4 is a schematic diagram of five-axis anti-shake for a video image according to an embodiment of the present application;
FIG. 5 is a schematic diagram of how the data of each sensor is processed according to an embodiment of the present application;
FIG. 6 is a schematic flowchart of a video image stabilization method according to an embodiment of the present application;
FIG. 7 is a schematic diagram of an angle sensor according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a displacement sensor according to an embodiment of the present application;
FIG. 9 is a schematic diagram of determining three-axis or six-axis anti-shake according to the object distance according to an embodiment of the present application;
FIG. 10 is a schematic diagram of rotational (pitch) jitter of a terminal about the X axis or rotational (yaw) jitter about the Y axis according to an embodiment of the present application;
FIG. 11 is a schematic diagram of translational jitter of a terminal on the X axis or the Y axis according to an embodiment of the present application;
FIG. 12 is a schematic diagram of translational jitter of a terminal on the Z axis according to an embodiment of the present application;
FIG. 13 is a schematic diagram of image jitter caused by focusing of a terminal according to an embodiment of the present application;
FIG. 14 is a schematic diagram of TOF according to an embodiment of the present application;
FIG. 15 is a schematic diagram of the mounting positions of a Hall magnet and a Hall sensor according to an embodiment of the present application;
FIG. 16 is a schematic diagram of the basic principle of six-axis anti-shake for a video image according to an embodiment of the present application;
FIG. 17 is a second schematic structural diagram of a terminal according to an embodiment of the present application;
FIG. 18 is a third schematic structural diagram of a terminal according to an embodiment of the present application.
DETAILED DESCRIPTION

The terminal in the embodiments of the present application may be any of various electronic devices having a photographing function, for example, a wearable electronic device (such as a smart watch), a tablet computer, a desktop computer, a virtual reality device, an augmented reality device, a camera, or a video camera; it may also be the mobile phone 200 shown in FIG. 2 or FIG. 3. The embodiments of the present application do not limit the specific form of the terminal.

The following embodiments use a mobile phone as an example to describe how a terminal implements the specific technical solutions of the embodiments. As shown in FIG. 2 or FIG. 3, the terminal in the embodiments of the present application may be the mobile phone 200. FIG. 2 is a schematic front view of the mobile phone 200. The mobile phone 200 may have, for example, a full screen as shown in (a) of FIG. 2, or a notch full screen (also called a special-shaped full screen) as shown in (b) of FIG. 2. FIG. 3 is a schematic diagram of the hardware structure of the mobile phone 200. It should be understood that the illustrated mobile phone 200 is merely one example of a terminal; it may have more or fewer components than shown in the figure, may combine two or more components, or may have a different arrangement of components.

As shown in FIG. 3, the mobile phone 200 may include components such as a radio frequency (RF) circuit 210, a memory 220, an input unit 230, a display unit 240, a sensor 250, an audio circuit 260, a wireless fidelity (Wi-Fi) module 270, a processor 280, a Bluetooth module 281, and a power supply 290.

The RF circuit 210 may be used to receive and send signals when receiving and sending information or during a call; it may receive downlink data from a base station and hand it to the processor 280 for processing, and may send uplink data to the base station. Typically, the RF circuit includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, and a duplexer.
The memory 220 may be used to store software programs and data. The processor 280 performs the various functions of the mobile phone 200 and processes data by running the software programs or data stored in the memory 220. The memory 220 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. The memory 220 stores an operating system that enables the mobile phone 200 to run, such as the operating system developed by Apple, the open-source operating system developed by Google, or the operating system developed by Microsoft. In the present application, the memory 220 may store the operating system and various application software, and may also store code for executing the method described in the embodiments of the present application.
The input unit 230 (for example, a touchscreen) may be used to receive input numeric or character information and generate signal input related to user settings and function control of the mobile phone 200. Specifically, the input unit 230 may include a touch panel 231 disposed on the front of the mobile phone 200 as shown in FIG. 2, which can collect the user's touch operations on or near it. In the present application, the input unit 230 may collect the user's touch operations.

The display unit 240 (that is, the display screen) may be used to display information entered by the user or information provided to the user, as well as the graphical user interfaces (GUI) of the various menus of the mobile phone 200. The display unit 240 may include a display panel 241 disposed on the front of the mobile phone 200. The display panel 241 may be configured in the form of a liquid crystal display, a light-emitting diode, or the like. The display unit 240 may be used to display the various graphical user interfaces described in the present application. The touch panel 231 may cover the display panel 241, or the touch panel 231 and the display panel 241 may be integrated to implement the input and output functions of the mobile phone 200; after integration, they may be referred to simply as a touch display screen. In the present application, the display unit 240 may display captured video and the like.

The mobile phone 200 may further include at least one sensor 250. For example, an angle sensor detects the rotational motion of the device in the X, Y, and Z directions and may be a gyroscope or another motion sensor. A displacement sensor detects the translational jitter of the device in the X, Y, and Z directions and may be an accelerometer or another motion sensor. A depth sensor detects the object distance of the shooting scene and may be a depth-detecting device such as a laser sensor, a time-of-flight (TOF) sensor, or a structured-light sensor. A position sensor detects the position of the lens (that is, the image distance) and may be a device capable of detecting the lens position, such as a Hall sensor, an anisotropic magnetoresistance (AMR) sensor, a giant magnetoresistance (GMR) sensor, or a tunneling magnetoresistance (TMR) sensor. An image sensor senses light and generates an image and may be an optical image sensor such as a complementary metal oxide semiconductor (CMOS) sensor or a charge-coupled device (CCD). The mobile phone 200 may also be equipped with other sensors such as a barometer, a hygrometer, a thermometer, and an infrared sensor.
The audio circuit 260, the speaker 261, and the microphone 262 may provide an audio interface between the user and the mobile phone 200. The audio circuit 260 may convert received audio data into an electrical signal and transmit it to the speaker 261, which converts it into a sound signal for output; conversely, the microphone 262 converts a collected sound signal into an electrical signal, which the audio circuit 260 receives and converts into audio data, and then outputs the audio data to the RF circuit 210 to be sent to, for example, another mobile phone, or outputs the audio data to the memory 220 for further processing. In the present application, the microphone 262 may capture audio synchronized with the video.

Wi-Fi is a short-range wireless transmission technology. Through the Wi-Fi module 270, the mobile phone 200 can help the user send and receive e-mail, browse web pages, access streaming media, and so on; it provides the user with wireless broadband Internet access.

The processor 280 is the control center of the mobile phone 200. It connects the various parts of the entire phone by using various interfaces and lines, and performs the various functions of the mobile phone 200 and processes data by running or executing software programs stored in the memory 220 and invoking data stored in the memory 220. In some embodiments, the processor 280 may include one or more processing units; the processor 280 may also integrate an application processor and a baseband processor, where the application processor mainly handles the operating system, user interfaces, applications, and so on, and the baseband processor mainly handles wireless communication. It can be understood that the baseband processor may alternatively not be integrated into the processor 280. In the present application, the processor 280 may include an image signal processor (ISP).

The Bluetooth module 281 is configured to exchange information with other Bluetooth devices that have a Bluetooth module, by using the Bluetooth protocol. For example, the mobile phone 200 may establish a Bluetooth connection through the Bluetooth module 281 with a wearable electronic device (such as a smart watch) that also has a Bluetooth module, so as to exchange data.

The mobile phone 200 further includes a power supply 290 (such as a battery) that supplies power to the components. The power supply may be logically connected to the processor 280 through a power management system, so that functions such as charging, discharging, and power-consumption management are implemented through the power management system.
终端在拍摄视频过程中,可能由于用户手持或用户处于运动中而出现帧与帧之间图像抖动,通过终端的防抖功能可以对按照某一方向运动的视频图像进行相反方向的补偿,从而尽量消除帧与帧之间图像抖动。
参照图4中所示,现有技术中最多只能补偿视频在五轴方向上的抖动,通过角速度传感器对俯仰(pitch)、摇摆(yaw)、滚转(roll)这种旋转抖动的检测,以及通过加速度传感器对X轴的平移抖动、Y轴的平移抖动的检测,最后综合得到补偿(防抖)结果。对于Z轴的平移抖动,即沿光轴前后方向抖动,目前并未有好的处理方式。并且目前对X轴的平移抖动和Y轴的平移抖动的补偿,并没有考虑物距信息,补偿效果不好。另外,在终端进行调焦时,由于镜头位置的变化,也会使得视频中图像忽大忽小,该因素也会对视频图像的稳定产生不利影响,需要说明的是,本申请所述的调焦指调节像距,即调节镜头相对于光学图像传感器的距离。
参照图5中所示,本申请的终端通过角度传感器进行旋转抖动检测,通过位移传 感器进行平移抖动检测,通过深度传感器进行物距检测,通过位置传感器进行镜头位置检测,由处理器综合上述检测结果进行视频补偿计算,得到X/Y/Z轴的旋转补偿量和/或X/Y/Z轴的平移补偿量。终端通过光学图像传感器获取检测期间的源视频流,对源视频流的N帧图像进行缓存后,图像信号处理器(image signal processor,ISP)或图形处理单元(graphic processing unit,GPU)结合上述补偿量和源视频流进行视频图像处理,得到防抖处理后的稳定视频流,然后可以直接在显示屏上预览显示,或者用于视频编码等。本申请的视频图像防抖功能可以通过开关控制开启或关闭。
The following specifically describes a video image anti-shake method provided in an embodiment of this application with reference to the accompanying drawings. FIG. 6 is a schematic flowchart of a video image anti-shake method provided in an embodiment of this application; the method specifically includes the following steps.
S101: A terminal turns on a camera and shoots a video image through the camera.
The terminal may open an application that can shoot video through the camera, for example, Camera or WeChat. The application controls the camera, and the video is shot through an optical image sensor, such as a CMOS or CCD, in the camera. The camera may be a front-facing camera or a rear-facing camera.
While the terminal shakes, the video images captured by the optical image sensor in successive frames differ considerably, which appears on the display screen as image jitter. The image data collected by the optical image sensor may be matrix-form pixel data, and the resolution of the finally processed video stream may be lower than that of the image data collected by the image sensor; this design provides redundant data for image processing.
S102: The terminal detects jitter on the X, Y, and Z axes during shooting.
Step S102 and step S101 may be performed simultaneously, so that the jitter of the terminal corresponds to the video image being shot.
The terminal may detect three-axis jitter of the terminal, that is, rotational jitter on the X, Y, and Z axes, including pitch, yaw, and roll. Alternatively, the terminal may detect six-axis jitter of the terminal, that is, rotational jitter and translational jitter on the X, Y, and Z axes, including pitch, yaw, roll, translational jitter on the X axis, translational jitter on the Y axis, and translational jitter on the Z axis.
The Z axis is the optical axis of the camera, the X axis is the axis perpendicular to the Z axis in the horizontal plane, and the Y axis is the axis perpendicular to the Z axis in the vertical plane. Roll refers to rotation around the Z axis, pitch refers to rotation around the X axis, and yaw refers to rotation around the Y axis.
Specifically, the rotational jitter of the terminal on the X, Y, and Z axes may be detected by an angle sensor. FIG. 7 is a schematic diagram of an angle sensor; the angle sensor may be a gyroscope or another motion sensor. The angle sensor may be mounted on the terminal body or in the camera module. If the angle sensor is a gyroscope, the gyroscope output signal is the angular velocity of the terminal's motion; integrating the gyroscope output signal once yields the angle through which the terminal has rotated, including a pitch angle Ω_P, a yaw angle Ω_Y, and a roll angle Ω_R.
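The following is a minimal sketch of this single-integration step, assuming discrete gyroscope samples at a fixed rate; the function name, sampling rate, and example values are illustrative and not part of the patent:

```python
# A minimal sketch of recovering rotation angles from gyroscope samples
# by integrating the angular velocity once, as described above.
import numpy as np

def integrate_gyro(omega_samples, dt):
    """omega_samples: (N, 3) angular velocities (rad/s) around X, Y, Z;
    dt: sampling interval (s). Returns (N, 3) accumulated angles
    (pitch, yaw, roll) in radians."""
    return np.cumsum(omega_samples * dt, axis=0)

# Example: a 200 Hz gyroscope reporting a slow pitch drift around the X axis.
dt = 1.0 / 200.0
omega = np.zeros((200, 3))
omega[:, 0] = 0.05               # rad/s around X (pitch)
angles = integrate_gyro(omega, dt)
print(angles[-1])                # ~[0.05, 0, 0] rad after 1 s
```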
Modern electronic products generally use a micro electro mechanical system (MEMS) gyroscope to measure angular velocity. A MEMS gyroscope estimates angular velocity from the Coriolis force, the tangential force experienced by a rotating object that also has radial motion. A MEMS gyroscope usually has movable capacitor plates in two directions, and these plates measure the capacitance change caused by the Coriolis motion. Because the Coriolis force is proportional to the angular velocity, the angular velocity can be calculated from the change in capacitance.
Specifically, the translational jitter of the terminal on the X, Y, and Z axes may be detected by a displacement sensor. FIG. 8 is a schematic diagram of a displacement sensor; the displacement sensor may be an accelerometer or another motion sensor. The displacement sensor may be mounted on the device body or in the camera module. If the displacement sensor is an acceleration sensor, the acceleration sensor output signal is the acceleration of the terminal's motion; integrating the acceleration sensor output signal once yields the linear velocity of the terminal's motion, and integrating the linear velocity once more yields the distance moved by the terminal, including the translation distances of the terminal on the X, Y, and Z axes.
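A minimal sketch of this double integration follows, assuming discrete accelerometer samples at a fixed rate; the names and example values are illustrative only:

```python
# A minimal sketch of recovering translation distance from accelerometer
# samples by integrating twice: acceleration -> velocity -> displacement.
import numpy as np

def integrate_accel(accel_samples, dt):
    """accel_samples: (N, 3) accelerations (m/s^2) along X, Y, Z;
    dt: sampling interval (s). Returns (N, 3) displacements in meters."""
    velocity = np.cumsum(accel_samples * dt, axis=0)   # first integration
    return np.cumsum(velocity * dt, axis=0)            # second integration

dt = 1.0 / 400.0
accel = np.zeros((400, 3))
accel[:, 1] = 0.2              # constant 0.2 m/s^2 along Y for 1 s
disp = integrate_accel(accel, dt)
print(disp[-1, 1])             # ~0.1 m, matching s = a*t^2/2
```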
MEMS accelerometers in modern electronic products include piezoelectric MEMS accelerometers, capacitive MEMS accelerometers, and the like. A piezoelectric MEMS accelerometer estimates acceleration from the piezoelectric effect: inside it there is a proof mass supported by a rigid body; under motion, the proof mass exerts pressure and the rigid body is strained, converting the acceleration into an electrical signal output. A capacitive MEMS accelerometer also contains a proof mass, but as part of a standard parallel-plate capacitor. A change in acceleration moves the proof mass, changing the gap and the facing area between the two plates of the capacitor, and the acceleration is calculated by measuring the change in capacitance.
In an actual product, the gyroscope and the accelerometer may be designed into the same electronic component, or may be designed separately as two independent electronic components.
Performing anti-shake processing on images based only on the foregoing sensor data achieves only a basic anti-shake effect; many other factors may affect imaging quality. Therefore, this application also makes improvements in the following aspects.
In daily use it can be observed that when the object distance (the distance to the object being shot) is large, translational jitter of the terminal on the X, Y, and Z axes has little effect on the image, and when the object distance is small, such translational jitter has a large effect on the image. The object distance in this application refers to the distance of the in-focus object or person. Therefore, as shown in FIG. 9, the terminal may detect the object distance. If the object distance is greater than or equal to an object distance threshold, the terminal may detect only rotational jitter on the X, Y, and Z axes (three-axis video stabilization), which reduces the amount of data, increases the data processing speed, and lowers power consumption; if the object distance is less than the object distance threshold, the terminal may detect both rotational jitter and translational jitter on the X, Y, and Z axes (six-axis video stabilization), which achieves a better anti-shake effect.
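A minimal sketch of this selection logic follows; the threshold value and names are hypothetical, since the patent does not specify a particular threshold:

```python
# A minimal sketch of choosing between 3-axis and 6-axis stabilization
# based on object distance. The 3.0 m threshold is illustrative only.
OBJECT_DISTANCE_THRESHOLD_M = 3.0   # hypothetical threshold

def select_stabilization_mode(object_distance_m: float) -> str:
    """Far scenes: rotation-only (3-axis). Near scenes: rotation plus
    translation (6-axis), where translational jitter matters most."""
    if object_distance_m >= OBJECT_DISTANCE_THRESHOLD_M:
        return "3-axis"    # detect pitch/yaw/roll only
    return "6-axis"        # also detect X/Y/Z translational jitter

print(select_stabilization_mode(10.0))  # "3-axis"
print(select_stabilization_mode(0.5))   # "6-axis"
```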
In addition, as shown in FIG. 10, according to the lens imaging principle, the image jitter caused by rotational (pitch) jitter of the terminal around the X axis or rotational (yaw) jitter around the Y axis satisfies Formula 1: d = v·tan(θ), where d is the image displacement distance, v is the image distance, and θ is the rotation angle around the X axis or the Y axis. It can be seen that the image jitter produced by rotation of the terminal around the X axis or the Y axis is independent of the object distance u and depends only on the image distance v and the terminal rotation angle θ; therefore, the corresponding compensation amount also depends only on the image distance v and the rotation angle θ.
As shown in FIG. 11, according to the lens imaging principle, the image jitter caused by translational jitter of the terminal on the X axis or the Y axis satisfies Formula 2: d = (v+u)·Δ/v, where d is the image displacement distance, v is the image distance, u is the object distance, and Δ is the translation distance of the terminal on the X axis or the Y axis. It can be seen that this image jitter depends not only on the terminal translation distance Δ but also on the image distance v and the object distance u; therefore, the corresponding compensation amount also depends on the terminal translation distance Δ, the image distance v, and the object distance u.
As shown in FIG. 12, for the image jitter caused by translational jitter of the terminal on the Z axis, suppose the object height is A, the image height before the terminal translates along the Z axis is A', the object distance is u, and the image distance is v. According to the lens imaging principle, A/u = A'/v, so A' = vA/u. If the terminal translates along the Z axis away from the object by a distance Δ, the object distance changes by Δ while the image distance v remains unchanged, and the image height after translation is A'' = vA/(u+Δ). Therefore, the image scaling ratio after the terminal translates along the Z axis satisfies Formula 3: r = A''/A' = u/(u+Δ), where Δ may be positive or negative. It can be seen that when the object distance increases the image becomes smaller, and when the object distance decreases the image becomes larger; the image jitter caused by translational jitter of the terminal on the Z axis depends not only on the terminal translation distance Δ but also on the object distance u, so the corresponding compensation amount also depends on the terminal translation distance Δ and the object distance u.
As shown in FIG. 13, for the image jitter caused by focusing of the terminal, suppose the distance by which the lens moves toward the object (that is, the change in image distance) is Δ. The image distance then becomes v+Δ and the object distance becomes u-Δ, so the new image height is A'' = (v+Δ)A/(u-Δ). Therefore, the image scaling ratio after the terminal focuses satisfies Formula 4: r = A''/A' = [(v+Δ)u]/[(u-Δ)v], where Δ may be positive or negative. It can be seen that the image jitter caused by focusing of the terminal depends on the lens translation distance Δ as well as on the object distance u and the image distance v, so the corresponding compensation amount also depends on the lens translation distance Δ, the object distance u, and the image distance v.
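The following is a minimal numeric sketch of Formulas 1 to 4 above; the function names and the example values of u and v are illustrative only:

```python
# A minimal sketch evaluating Formulas 1-4 from the preceding paragraphs.
import math

def rot_displacement(v, theta):        # Formula 1: d = v*tan(theta)
    return v * math.tan(theta)

def trans_displacement(v, u, delta):   # Formula 2: d = (v+u)*delta/v
    return (v + u) * delta / v

def z_scale(u, delta):                 # Formula 3: r = u/(u+delta)
    return u / (u + delta)

def focus_scale(u, v, delta):          # Formula 4: r = (v+delta)*u / ((u-delta)*v)
    return ((v + delta) * u) / ((u - delta) * v)

# Example with an image distance of 4 mm and an object distance of 500 mm:
v, u = 4.0, 500.0
print(rot_displacement(v, math.radians(1)))  # image shift for a 1-degree pitch/yaw
print(trans_displacement(v, u, 2.0))         # image shift for a 2 mm X/Y translation
print(z_scale(u, 50.0))                      # shrink when moving 50 mm away along Z
print(focus_scale(u, v, 0.1))                # enlargement from a 0.1 mm focus move
```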
In summary, because the image distance is the distance of the lens relative to the optical image sensor, the terminal needs to detect not only the rotational jitter and translational jitter on the X, Y, and Z axes, but also the changes in the object distance and in the lens position.
Specifically, the terminal may detect the object distance by using a depth sensor. The depth sensor may include at least one of the following depth detection devices: a laser sensor, a time of flight (TOF) sensor, or a structured light sensor. The depth sensor may be mounted on the device body or in the camera module. A TOF sensor continuously emits light pulses toward the object being shot and then receives the light pulses reflected back from the object; the object distance is obtained by measuring the flight (round-trip) time of the emitted and received light pulses. As shown in FIG. 14, the object distance can be calculated from the phase difference ΔΦ between the emitted light and the reflected light, the speed of light c, and the frequency f of the emitted light pulses:
u = (c/2) · ΔΦ/(2πf) = c·ΔΦ/(4πf)
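A minimal sketch of this calculation follows, using the standard phase-based TOF relation; the modulation frequency and phase value are example inputs, not values from the patent:

```python
# A minimal sketch of computing object distance from the phase difference
# of a modulated TOF light pulse: the phase converts to a round-trip time,
# and half of that time multiplied by the speed of light is the distance.
import math

C = 299_792_458.0  # speed of light, m/s

def tof_object_distance(delta_phi_rad: float, pulse_freq_hz: float) -> float:
    round_trip_time = delta_phi_rad / (2 * math.pi * pulse_freq_hz)
    return C * round_trip_time / 2

# Example: a 20 MHz modulated pulse with a 90-degree phase shift.
print(tof_object_distance(math.pi / 2, 20e6))  # ~1.87 m
```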
具体的,终端可以通过位置传感器来检测镜头的位置(像距),以检测镜头的调焦运动也就是像距的变化情况。位置传感器可以包括以下传感器的至少一种:霍尔传感器、各向异性磁电阻(anisotropic magneto resistance,AMR)传感器、巨磁电阻(giant magneto resistance,GMR)传感器、隧道磁电阻(tunneling magneto resistance,TMR)传感器等可以检测镜头位置的器件。
其中,霍尔传感器是利用霍尔效应制作的一种位置传感器。霍尔半导体材料中的电子在外加磁场中运动时,因为受到洛仑兹力的作用而使运动轨迹发生偏移,并在霍尔半导体材料两侧产生电荷积累,形成垂直于电流方向的电场,最终使载流子受到的洛仑兹力与电场斥力相平衡,从而在霍尔半导体材料两侧建立起稳定的电势差即霍尔电压。可通过测量霍尔电压推算出磁场强度的大小,通过磁场强度的大小可以推算霍尔磁铁的位置。实际产品中,参照图15中所示,霍尔磁铁通常被安装于镜筒上随着镜头的移动而移动。而霍尔传感器通常被安装在一个位置固定不变的地方,例如被安装在基板上。当镜头调焦从而使得像距变化时,镜头的位置变化带动霍尔磁铁移动,进而改变霍尔传感器感应的磁场强度,引起霍尔电压的变化。通过测量霍尔电压变化量可以计算镜头移动的位移量。
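A minimal sketch of this voltage-to-displacement step follows, assuming an approximately linear sensitivity over a small lens stroke; the calibration constant is hypothetical and would come from per-module calibration in practice:

```python
# A minimal sketch of mapping a Hall-voltage change to lens displacement,
# under the assumption of linear sensitivity over a small travel range.
def lens_displacement_um(delta_hall_voltage_mv: float,
                         sensitivity_mv_per_um: float = 2.5) -> float:
    """Over a small stroke the Hall voltage is roughly linear in magnet
    position, so displacement = voltage change / calibrated sensitivity."""
    return delta_hall_voltage_mv / sensitivity_mv_per_um

print(lens_displacement_um(50.0))  # a 50 mV change -> 20 um of lens travel
```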
S103: The terminal performs anti-shake processing (compensation) on the video image based on the jitter of the terminal on the X, Y, and Z axes.
As described above, anti-shake processing based only on the sensor data detected by the displacement sensor or the angle sensor achieves only a basic anti-shake effect; this application improves on it further. Specifically, the terminal may perform anti-shake processing on the video image based on the object distance, the image distance, and the jitter of the terminal on the X, Y, and Z axes.
FIG. 16 shows the basic principle of six-axis video image stabilization in this application.
For image jitter caused by rotation (roll) of the terminal around the Z axis, the terminal may apply to the video image a rotation compensation that is opposite in direction and equal in angle to the terminal's rotation. For example, taking the roll shown in the figure as an example, suppose the terminal rotates clockwise around the Z axis by an angle α, so the shot video image rotates counterclockwise by the angle α; the terminal then rotates the video image clockwise by the angle α to implement compensation.
For image jitter caused by rotational (pitch) jitter of the terminal around the X axis or rotational (yaw) jitter around the Y axis, the terminal may, based on Formula 1, compensate the video image by the same translation distance d in the direction opposite to the image's translation, and apply a keystone correction algorithm in the direction of the rotation axis to restore the keystone distortion of the jittered video image to a rectangle. The keystone correction algorithm may include image space transformation, interpolation, and the like; details are not described here. For example, taking the pitch shown in the figure as an example, suppose the terminal rotates downward around the X axis by an angle θ, so that the video image translates upward by d and shows keystone distortion in the horizontal direction. Then, in the vertical direction, the terminal may compensate the video image downward by the distance d with reference to FIG. 10 and Formula 1, and in the horizontal direction, the terminal may restore the keystone distortion to a rectangle by using the keystone correction algorithm.
For image jitter caused by translational jitter of the terminal on the X axis or the Y axis, the terminal may, based on Formula 2, compensate the video image by the same translation distance d in the direction opposite to the image's translation, and crop the redundant part of the video image. For example, taking the Y-axis translational jitter shown in the figure as an example, suppose the terminal translates downward along the Y axis by a distance d, so that the video image translates upward along the Y axis by the distance d; the terminal may compensate the video image downward along the Y axis by the distance d with reference to FIG. 11 and Formula 2, and crop off the redundant upper part of the video image.
For image jitter caused by translational jitter of the terminal on the Z axis, the terminal may scale the video image based on the scaling ratio (u+Δ)/u (the reciprocal of Formula 3), where u is the object distance and Δ is the distance by which the terminal translates away from the object along the Z axis; Δ may be positive or negative. If the terminal moves closer to the object being shot, the image of the object becomes larger, and the terminal may shrink the image according to this formula; if the terminal moves away from the object, the image of the object becomes smaller, and the terminal may enlarge the image according to this formula.
For image jitter caused by focusing of the terminal, the terminal may scale the video image based on the scaling ratio [(u-Δ)v]/[(v+Δ)u] (the reciprocal of Formula 4), where u is the object distance, v is the image distance, and Δ is the distance by which the lens moves toward the object.
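The following is a minimal sketch, not the patent's implementation, of applying the roll, translation, and scaling compensations to one frame with OpenCV; the function name and example amounts are illustrative:

```python
# A minimal sketch combining the rotation, translation, and scaling
# compensations described above into a single affine warp per frame.
import cv2
import numpy as np

def compensate_frame(frame, roll_deg, dx_px, dy_px, scale):
    """roll_deg, dx_px, dy_px: measured image rotation (counterclockwise
    positive) and image shift in pixels; scale: compensation ratio such
    as (u+delta)/u. The warp applies the opposite rotation and shift."""
    h, w = frame.shape[:2]
    center = (w / 2, h / 2)
    # getRotationMatrix2D rotates counterclockwise and scales about center,
    # so a negative angle undoes the measured counterclockwise rotation.
    m = cv2.getRotationMatrix2D(center, -roll_deg, scale)
    m[0, 2] -= dx_px   # oppose the measured horizontal image shift
    m[1, 2] -= dy_px   # oppose the measured vertical image shift
    return cv2.warpAffine(frame, m, (w, h))

frame = np.zeros((720, 1280, 3), dtype=np.uint8)
stable = compensate_frame(frame, roll_deg=1.5, dx_px=0.0, dy_px=12.0,
                          scale=(500.0 + 50.0) / 500.0)  # (u+Δ)/u, 50 mm away
```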
It should be noted that the foregoing describes the compensation for each type of jitter separately. In actual use, multiple types of jitter may occur at the same time; in that case, only the most prominent type of jitter may be compensated, or every type of jitter may be compensated, which is not limited in this application.
The video image after anti-shake processing in this application may be directly previewed on the display screen, or used for video encoding and the like, which is not specifically limited.
The video image anti-shake method provided in this application not only implements compensation for translational jitter of the terminal on the Z axis, but also achieves a more accurate X/Y translational anti-shake effect through object distance detection. In addition, by detecting the lens position, the amount of image scaling caused by focusing is calculated, and the image is then scaled in the reverse direction so that the picture remains stable during focusing.
It can be understood that, to implement the foregoing functions, the terminal and the like include corresponding hardware structures and/or software modules for performing each function. A person skilled in the art should be readily aware that, with reference to the units and algorithm steps of the examples described in the embodiments disclosed herein, the embodiments of this application can be implemented in a form of hardware or a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementation should not be considered as going beyond the scope of the embodiments of this application.
In the embodiments of this application, the terminal and the like may be divided into functional modules according to the foregoing method examples. For example, each functional module may be obtained through division corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in a form of hardware, or may be implemented in a form of a software functional module. It should be noted that the division into modules in the embodiments of this application is an example and is merely logical function division; there may be other division manners in actual implementation.
When each functional module is obtained through division corresponding to each function, FIG. 17 shows a possible schematic structural diagram of the terminal involved in the foregoing embodiments. The terminal 200 includes a photographing unit 2011, a detection unit 2012, and an anti-shake unit 2013.
The photographing unit 2011 is configured to support the terminal 200 in performing process S101 in FIG. 6; the detection unit 2012 is configured to support the terminal 200 in performing process S102 in FIG. 6; and the anti-shake unit 2013 is configured to support the terminal 200 in performing process S103 in FIG. 6. All related content of the steps in the foregoing method embodiments may be cited in the function descriptions of the corresponding functional modules, and details are not described here again.
When an integrated unit is used, the photographing unit 2011 may be integrated into a photographing module, the detection unit 2012 may be integrated into a detection module, and the anti-shake unit 2013 may be integrated into a processing module. Certainly, the terminal may further include a storage module, a communication module, an input/output module, and the like.
In this case, FIG. 18 shows a possible schematic structural diagram of the terminal involved in the foregoing embodiments. The processing module 2021 is configured to control and manage the actions of the terminal. The communication module 2022 is configured to support communication between the terminal and other network entities, such as a cloud server or another terminal. The input/output module 2023 is configured to receive information entered by the user, or to output information provided to the user and the various menus of the terminal. The storage module 2024 is configured to store program code and data of the terminal. The photographing module 2025 is configured to shoot a video image. The detection module 2026 is configured to detect jitter of the terminal.
For example, the processing module 2021 may be a processor or a controller, for example, a central processing unit (CPU), a GPU, a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or another programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It can implement or execute the various example logical blocks, modules, and circuits described with reference to the disclosure of this application. Alternatively, the processor may be a combination implementing a computing function, for example, a combination of one or more microprocessors or a combination of a DSP and a microprocessor.
The communication module 2022 may be a transceiver, a transceiver circuit, an input/output device, a communication interface, or the like. For example, the communication module 2022 may specifically be a Bluetooth apparatus, a Wi-Fi apparatus, a peripheral interface, or the like.
The storage module 2024 may be a memory, and the memory may include a high-speed random access memory (RAM), and may further include a non-volatile memory, such as a magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The input/output module 2023 may be an input/output device such as a touchscreen, a keyboard, a microphone, or a display. The display may specifically be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. In addition, a touch panel may be integrated on the display, configured to collect touch events on or near it and to send the collected touch information to another component (for example, the processor).
The photographing module 2025 may be an optical image sensor.
The detection module 2026 may include an angle sensor, a displacement sensor, a depth sensor, a position sensor, and the like.
When the storage module is a memory, the input/output module is a display, the processing module is a processor, and the communication module is a communication interface, the memory is configured to store computer-executable instructions, and the processor is coupled to the memory. When the terminal runs, the processor executes the computer-executable instructions stored in the memory, so that the terminal performs the method shown in FIG. 6.
An embodiment of the present invention further provides a computer storage medium storing one or more programs, where the one or more programs include instructions, and when the instructions are executed by a terminal, the terminal is enabled to perform the method shown in FIG. 6.
An embodiment of the present invention further provides a computer program product including instructions. When the computer program product runs on a terminal, the terminal is enabled to perform the method shown in FIG. 6.
The terminal, the computer storage medium, and the computer program product provided in the embodiments of the present invention are all configured to perform the corresponding methods provided above. Therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding methods provided above; details are not described here again.
It should be understood that, in the various embodiments of this application, the sequence numbers of the foregoing processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation processes of the embodiments of this application.
A person of ordinary skill in the art may be aware that the units and algorithm steps of the examples described with reference to the embodiments disclosed herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether the functions are performed by hardware or software depends on the particular application and design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementation should not be considered as going beyond the scope of this application.
A person skilled in the art may clearly understand that, for convenience and brevity of description, for the specific working processes of the systems, apparatuses, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments; details are not described here again.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the described device embodiments are merely examples. For example, the division into units is merely logical function division and may be other division in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the displayed or discussed mutual couplings, direct couplings, or communication connections may be implemented through some interfaces, and the indirect couplings or communication connections between devices or units may be in electrical, mechanical, or other forms.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of this application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
In the foregoing embodiments, implementation may be entirely or partially by software, hardware, firmware, or any combination thereof. When a software program is used for implementation, implementation may be entirely or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of this application are entirely or partially generated. The computer may be a general-purpose computer, a dedicated computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium, or may be transmitted from one computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or in a wireless manner (for example, infrared, radio, or microwave). The computer-readable storage medium may be any usable medium accessible by a computer, or a data storage device, such as a server or a data center, integrating one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid state disk (SSD)), or the like.
The foregoing descriptions are merely specific implementations of this application, but the protection scope of this application is not limited thereto. Any variation or replacement readily conceivable by a person skilled in the art within the technical scope disclosed in this application shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (20)

  1. A video image anti-shake method, comprising:
    turning on, by a terminal, a camera, and shooting a video image through the camera;
    detecting, by the terminal, jitter on X, Y, and Z axes during shooting, wherein the Z axis is an optical axis of the camera, the X axis is an axis perpendicular to the Z axis in a horizontal plane, and the Y axis is an axis perpendicular to the Z axis in a vertical plane; and
    performing, by the terminal, anti-shake processing on the video image based on the jitter on the X, Y, and Z axes.
  2. The method according to claim 1, wherein
    the method further comprises: detecting, by the terminal, an object distance, wherein the object distance is a distance of an in-focus object or person; and
    the detecting, by the terminal, of jitter on the X, Y, and Z axes comprises:
    if the object distance is greater than or equal to an object distance threshold, detecting, by the terminal, rotational jitter on the X, Y, and Z axes; and if the object distance is less than the object distance threshold, detecting, by the terminal, rotational jitter on the X, Y, and Z axes, and detecting, by the terminal, translational jitter on the X, Y, and Z axes.
  3. The method according to claim 2, wherein the detecting, by the terminal, of the object distance comprises:
    detecting, by the terminal, the object distance by using a depth sensor, wherein the depth sensor comprises at least one of the following sensors: a laser sensor, a time of flight (TOF) sensor, or a structured light sensor.
  4. The method according to claim 2, wherein the detecting, by the terminal, of rotational jitter on the X, Y, and Z axes comprises:
    detecting, by the terminal, the rotational jitter on the X, Y, and Z axes by using an angle sensor, wherein the angle sensor comprises a gyroscope.
  5. The method according to claim 2, wherein the detecting, by the terminal, of translational jitter on the X, Y, and Z axes comprises:
    detecting, by the terminal, the translational jitter on the X, Y, and Z axes by using a displacement sensor, wherein the displacement sensor comprises an accelerometer.
  6. The method according to any one of claims 2 to 5, wherein
    the method further comprises: detecting, by the terminal, an image distance; and
    the performing, by the terminal, of anti-shake processing on the video image based on the jitter on the X, Y, and Z axes comprises:
    performing, by the terminal, anti-shake processing on the video image based on the object distance, the image distance, and the jitter on the X, Y, and Z axes.
  7. The method according to claim 6, wherein the detecting, by the terminal, of the image distance comprises:
    detecting, by the terminal, the image distance by using a position sensor, wherein the position sensor comprises at least one of the following sensors: a Hall sensor, an anisotropic magneto resistance (AMR) sensor, a giant magneto resistance (GMR) sensor, or a tunneling magneto resistance (TMR) sensor.
  8. The method according to claim 6 or 7, wherein the performing, by the terminal, of anti-shake processing on the video image based on the object distance, the image distance, and the jitter on the X, Y, and Z axes comprises:
    for image jitter caused by rotation of the terminal around the Z axis, applying, by the terminal, to the video image a rotation compensation opposite in direction and equal in angle to the rotation;
    for image jitter caused by rotational jitter of the terminal around the X axis or rotational jitter around the Y axis, compensating, by the terminal, the video image by the same translation distance d in a direction opposite to the translation of the video image based on the formula d = v·tan(θ), and restoring, in the direction of the rotation axis, keystone distortion of the video image to a rectangle based on a keystone correction algorithm, wherein d is an image displacement distance, v is the image distance, and θ is a rotation angle around the X axis or the Y axis;
    for image jitter caused by translational jitter of the terminal on the X axis or the Y axis, compensating, by the terminal, the video image by the same translation distance d in a direction opposite to the translation of the video image based on the formula d = (v+u)·Δ/v, and cropping a redundant part of the video image, wherein d is an image displacement distance, v is the image distance, u is the object distance, and Δ is a translation distance of the terminal; and
    for image jitter caused by translational jitter of the terminal on the Z axis, scaling, by the terminal, the video image based on a scaling ratio (u+Δ)/u, wherein u is the object distance, and Δ is a distance by which the terminal translates away from an object along the Z axis.
  9. The method according to any one of claims 6 to 8, wherein the method further comprises:
    for image jitter caused by focusing of the terminal, scaling, by the terminal, the video image based on a scaling ratio [(u-Δ)v]/[(v+Δ)u], wherein u is the object distance, v is the image distance, and Δ is a distance by which a lens moves toward an object.
  10. A terminal, comprising:
    a photographing unit, configured to turn on a camera and shoot a video image through the camera;
    a detection unit, configured to detect jitter on X, Y, and Z axes during shooting, wherein the Z axis is an optical axis of the camera, the X axis is an axis perpendicular to the Z axis in a horizontal plane, and the Y axis is an axis perpendicular to the Z axis in a vertical plane; and
    an anti-shake unit, configured to perform anti-shake processing on the video image based on the jitter on the X, Y, and Z axes.
  11. The terminal according to claim 10, wherein
    the detection unit is further configured to detect an object distance, wherein the object distance is a distance of an in-focus object or person; and
    the detection unit is specifically configured to:
    if the object distance is greater than or equal to an object distance threshold, detect rotational jitter on the X, Y, and Z axes; and if the object distance is less than the object distance threshold, detect rotational jitter on the X, Y, and Z axes, and detect translational jitter on the X, Y, and Z axes.
  12. The terminal according to claim 11, wherein the detection unit is specifically configured to:
    detect the object distance by using a depth sensor, wherein the depth sensor comprises at least one of the following sensors: a laser sensor, a time of flight (TOF) sensor, or a structured light sensor.
  13. The terminal according to claim 11, wherein the detection unit is specifically configured to:
    detect the rotational jitter on the X, Y, and Z axes by using an angle sensor, wherein the angle sensor comprises a gyroscope.
  14. The terminal according to claim 11, wherein the detection unit is specifically configured to:
    detect the translational jitter on the X, Y, and Z axes by using a displacement sensor, wherein the displacement sensor comprises an accelerometer.
  15. The terminal according to any one of claims 11 to 14, wherein
    the detection unit is further configured to detect an image distance; and
    the anti-shake unit is specifically configured to perform anti-shake processing on the video image based on the object distance, the image distance, and the jitter on the X, Y, and Z axes.
  16. The terminal according to claim 15, wherein the detection unit is specifically configured to:
    detect the image distance by using a position sensor, wherein the position sensor comprises at least one of the following sensors: a Hall sensor, an anisotropic magneto resistance (AMR) sensor, a giant magneto resistance (GMR) sensor, or a tunneling magneto resistance (TMR) sensor.
  17. The terminal according to claim 15 or 16, wherein the anti-shake unit is specifically configured to:
    for image jitter caused by rotation of the terminal around the Z axis, apply to the video image a rotation compensation opposite in direction and equal in angle to the rotation;
    for image jitter caused by rotational jitter of the terminal around the X axis or rotational jitter around the Y axis, compensate the video image by the same translation distance d in a direction opposite to the translation of the video image based on the formula d = v·tan(θ), and restore, in the direction of the rotation axis, keystone distortion of the video image to a rectangle based on a keystone correction algorithm, wherein d is an image displacement distance, v is the image distance, and θ is a rotation angle around the X axis or the Y axis;
    for image jitter caused by translational jitter of the terminal on the X axis or the Y axis, compensate the video image by the same translation distance d in a direction opposite to the translation of the video image based on the formula d = (v+u)·Δ/v, and crop a redundant part of the video image, wherein d is an image displacement distance, v is the image distance, u is the object distance, and Δ is a translation distance of the terminal; and
    for image jitter caused by translational jitter of the terminal on the Z axis, scale the video image based on a scaling ratio (u+Δ)/u, wherein u is the object distance, and Δ is a distance by which the terminal translates away from an object along the Z axis.
  18. The terminal according to any one of claims 15 to 17, wherein the anti-shake unit is further configured to:
    for image jitter caused by focusing of the terminal, scale the video image based on a scaling ratio [(u-Δ)v]/[(v+Δ)u], wherein u is the object distance, v is the image distance, and Δ is a distance by which a lens moves toward an object.
  19. A terminal, comprising: a processor, a display, a memory, and a communication interface, wherein
    the memory is configured to store computer-executable instructions, and the processor is coupled to the memory; and when the terminal runs, the processor executes the computer-executable instructions stored in the memory, so that the terminal performs the method according to any one of claims 1 to 9.
  20. A computer-readable storage medium storing instructions, wherein when the instructions run on a terminal, the terminal is enabled to perform the method according to any one of claims 1 to 9.