CN114449173A - Optical anti-shake control method, device, storage medium and electronic equipment - Google Patents
- Publication number
- CN114449173A (application number CN202210181901.XA)
- Authority
- CN
- China
- Prior art keywords
- object distance
- camera
- shake
- determining
- optical anti
- Prior art date
- Legal status: Granted (status as listed by Google Patents; not a legal conclusion)
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6812—Motion detection based on additional sensors, e.g. acceleration sensors
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
- H04N23/685—Vibration or motion blur correction performed by mechanical compensation
- H04N23/687—Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
Abstract
The disclosure provides an optical anti-shake control method, an optical anti-shake control apparatus, a computer-readable storage medium, and an electronic device, relating to the field of image technology. The optical anti-shake control method includes the following steps: acquiring the object distance between a photographed object and a camera; determining a target class among the data classes of an inertial sensor according to the object distance; determining pose information of the camera based on the inertial sensing data of the target class; and obtaining optical anti-shake control parameters by analyzing the pose information of the camera. The scheme can realize optical anti-shake at different object distances, broadening the application range of optical anti-shake and improving its response speed and efficiency.
Description
Technical Field
The present disclosure relates to the field of image technologies, and in particular, to an optical anti-shake control method, an optical anti-shake control apparatus, a computer-readable storage medium, and an electronic device.
Background
With the development of imaging technology, people increasingly use electronic devices with cameras to capture images or videos. During shooting, hand shake of the photographer, vibration in the environment, and the like can cause the captured picture to shake, blurring the image.
In the related art, an Optical Image Stabilization (OIS) system is used to compensate for lens shift during shake, so as to achieve an anti-shake effect. However, current optical anti-shake systems are applicable to only a limited range of scenes.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those skilled in the art.
Disclosure of Invention
The present disclosure provides an optical anti-shake control method, an optical anti-shake control apparatus, a computer-readable storage medium, and an electronic device, so as to widen, at least to some extent, the application range of optical anti-shake.
According to a first aspect of the present disclosure, there is provided an optical anti-shake control method, including: acquiring the object distance between a photographed object and a camera; determining a target class among the data classes of an inertial sensor according to the object distance; determining pose information of the camera based on the inertial sensing data of the target class; and obtaining optical anti-shake control parameters by analyzing the pose information of the camera.
According to a second aspect of the present disclosure, there is provided an optical anti-shake control apparatus, comprising: an object distance acquisition module configured to acquire the object distance between a photographed object and a camera; a target class determination module configured to determine a target class among the data classes of an inertial sensor according to the object distance; a pose information determination module configured to determine pose information of the camera based on the inertial sensing data of the target class; and a control parameter determination module configured to obtain optical anti-shake control parameters by analyzing the pose information of the camera.
According to a third aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the optical anti-shake control method of the first aspect described above and possible implementations thereof.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: a processor; a memory for storing executable instructions of the processor; a camera including an optical anti-shake system. Wherein the processor is configured to execute the optical anti-shake control method of the first aspect and possible implementations thereof via execution of the executable instructions.
The technical scheme of the disclosure has the following beneficial effects:
A target class is determined among the inertial sensing data according to the object distance between the photographed object and the camera; the pose information of the camera is determined using the inertial sensing data of the target class; and optical anti-shake control parameters are then obtained by analyzing the pose information, enabling accurate optical anti-shake control. On the one hand, the scheme realizes optical anti-shake at different object distances, breaking through the object-distance limitation in the related art and widening the application range of optical anti-shake. On the other hand, because only the inertial sensing data of the target class adapted to the current object distance is used for the pose calculation, the amount of inertial sensing data to be processed is reduced to a certain extent, which improves the response speed and efficiency of optical anti-shake and alleviates potential camera heating.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure. The drawings in the following description are only some embodiments of the present disclosure; other drawings can be derived from them by those skilled in the art without inventive effort.
Fig. 1 shows a schematic configuration diagram of an electronic apparatus in the present exemplary embodiment;
fig. 2 shows a schematic view of the principle of optical anti-shake in the present exemplary embodiment;
fig. 3 shows a flowchart of an optical anti-shake control method in the present exemplary embodiment;
fig. 4 is a diagram illustrating a screen shift caused by rotation and translation of a lens in the present exemplary embodiment;
FIG. 5 illustrates a flow chart for determining a first object distance threshold in the exemplary embodiment;
FIG. 6 is a schematic diagram illustrating the determination of the first, second, and third object distance thresholds in the exemplary embodiment;
fig. 7 shows a flowchart for determining position adjustment information in the present exemplary embodiment;
fig. 8 shows a schematic diagram in which the lens is moved in the X direction in the present exemplary embodiment;
fig. 9 shows a schematic diagram of an optical anti-shake flow in the present exemplary embodiment;
fig. 10 shows a schematic structural diagram of an optical anti-shake control apparatus in the present exemplary embodiment.
Detailed Description
Exemplary embodiments of the present disclosure will now be described with reference to the accompanying drawings. The exemplary embodiments, however, may be embodied in many different forms and should not be construed as limited to the examples set forth herein. These embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of exemplary embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
In the related art, optical anti-shake has a certain limitation on the object distance, and a good anti-shake effect can be achieved only within a specific object distance range, so that applicable scenes are limited.
In view of the above, exemplary embodiments of the present disclosure provide an optical anti-shake control method, and an electronic apparatus for performing the same. The electronic device will be explained first.
The electronic device may include a processor, a memory, and a camera. The camera includes an optical anti-shake system. The memory stores executable instructions of the processor, for example program code. The processor executes the executable instructions to perform the optical anti-shake control method in the present exemplary embodiment. The electronic device may be a mobile phone, a tablet computer, a digital camera, a drone, a smart wearable device, or the like.
The structure of the electronic device is exemplarily described below by taking the mobile terminal 100 in fig. 1 as an example. It will be appreciated by those skilled in the art that, apart from components intended specifically for mobile use, the configuration of fig. 1 can also be applied to fixed devices.
Referring to fig. 1, the mobile terminal 100 may specifically include: processor 110, memory 120, communication module 130, bus 140, display module 150, power module 160, camera 170, and sensor module 180.
The memory 120 may be used to store computer-executable program code, which includes instructions. The processor 110 performs the various functional applications and data processing of the mobile terminal 100 by executing the instructions stored in the memory 120. For example, the memory 120 may store the program code of the optical anti-shake control method in the present exemplary embodiment, and the processor 110 implements the method by executing this code. The memory 120 may also store application data, such as image and video files.
The communication function of the mobile terminal 100 may be implemented by the communication module 130 with an antenna, a modem processor, a baseband processor, and the like. Antennas are used to transmit and receive electromagnetic wave signals, such as radio frequency signals. The communication module 130 may provide a mobile communication solution such as 3G, 4G, 5G, etc. applied to the mobile terminal 100, or a wireless communication solution such as wireless local area network, bluetooth, near field communication, etc.
The display module 150 is used to provide display functions of the mobile terminal 100, such as displaying a graphical user interface. The power module 160 is used to implement power management functions, such as charging a battery, powering a device, monitoring a battery status, and the like.
The camera 170 is used for capturing images or videos, and may include a lens 171, an image sensor 172, an optical anti-shake system 173, and other components, such as a cover plate and a filter, which are not shown in the figures. The lens 171 may be a lens for controlling the optical path. The image sensor 172 is used for receiving the optical signal and converting it into an electrical signal for further conversion into a digital signal or the like. The optical anti-shake system 173 is used to provide an optical anti-shake function of the camera 170 to ensure that the camera 170 can capture a clear image or video in case of shake.
In the present exemplary embodiment, the optical anti-shake control method may be executed by the processor 110 to control the optical anti-shake system 173. For example, an ISP may be provided in cooperation with the camera 170, and the ISP may perform an optical anti-shake control method and transmit a generated control signal to the optical anti-shake system 173 to implement corresponding control. In addition, the ISP can also be used to implement auto focus, auto exposure, auto white balance, flicker detection, black level compensation, etc.
The sensor module 180 may include one or more sensors for implementing the corresponding sensing functions. In this exemplary embodiment, the sensor module 180 may include an inertial sensor 181, such as an accelerometer, a gyroscope, and a magnetometer, configured to collect acceleration data, angular velocity data (or angular acceleration data, which can be obtained by differentiating the angular velocity data and is treated equivalently below), and magnetometer data, respectively. The inertial sensor 181 senses the pose of the mobile terminal 100, from which the pose of the camera 170 is obtained.
In one embodiment, the inertial sensor 181 may be configured with the camera 170, for example, the inertial sensor 181 is disposed in the camera 170, which can accurately sense the pose of the camera 170.
In addition, the sensor module 180 may also include other sensors, such as a depth sensor, for sensing depth information. In one embodiment, a depth sensor may be provided in conjunction with camera 170 to form a depth camera. Also, the mobile terminal 100 may further include other components not shown in the drawings, such as an audio module, a touch input module, and the like.
The number of the above components is not limited in the present disclosure, for example, the number of the cameras 170 is not limited, and two or three cameras may be provided according to actual requirements.
Based on the mobile terminal 100, the optical anti-shake flow is as follows. During shooting, the inertial sensor 181 senses the shake of the camera 170 and transmits the shake-related data to the ISP. The ISP executes the optical anti-shake control method in the present exemplary embodiment to obtain the optical anti-shake control parameters and transmits them to a driver circuit (Driver IC). The driver circuit then controls the optical anti-shake system to make the corresponding adjustment; for example, a motor in the optical anti-shake system may adjust the position of the lens 171 or the image sensor 172 in the camera 170 to compensate for the shake, reducing the image blur it produces.
Fig. 2 shows the principle of optical anti-shake. Without shake, the optical axis of the lens 220 is aligned with the photographed object 230, and stable imaging on the imaging plane 210 yields a clear image. When shake occurs and optical anti-shake is not enabled, the optical axis of the lens 220 shifts relative to the photographed object 230, so the whole picture shifts; the movement of the lens 220 makes imaging unstable and the image blurred. When shake occurs and optical anti-shake is enabled, adjusting the lens 220 compensates for the shake so that the optical axis stays aligned with the photographed object 230, keeping the imaging stable and the image clear.
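The compensation principle can be made concrete with the small-angle model commonly used for OIS (a textbook approximation, not taken from this disclosure): a tilt of θ shifts the image on the sensor by roughly f·tan θ, which the anti-shake system cancels by an opposite lens or sensor movement.

```python
import math

def image_shift_from_tilt(focal_length_mm: float, tilt_deg: float) -> float:
    """Approximate on-sensor image shift caused by a camera tilt.

    Uses the common small-angle OIS model: shift ~= f * tan(theta).
    Illustrative only; real modules calibrate this relationship per lens.
    """
    return focal_length_mm * math.tan(math.radians(tilt_deg))

# A 0.5-degree shake with a 4 mm lens shifts the image by ~0.035 mm,
# which the anti-shake system cancels by an opposite movement.
shift = image_shift_from_tilt(4.0, 0.5)
compensation = -shift
```

Real modules calibrate this relationship per lens rather than relying on the ideal formula.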
The optical anti-shake control method in the present exemplary embodiment is explained below with reference to fig. 3. Fig. 3 shows an exemplary flow of the optical anti-shake control method, which may include:
step S310, acquiring the object distance between the photographed object and the camera;
step S320, determining a target class among the data classes of the inertial sensor according to the object distance;
step S330, determining pose information of the camera based on the inertial sensing data of the target class;
and step S340, obtaining optical anti-shake control parameters by analyzing the pose information of the camera.
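Steps S310 to S340 can be sketched as a minimal control loop (a hypothetical sketch: the threshold value, function names, and the toy pose estimate are illustrative, not from the disclosure):

```python
FIRST_OBJECT_DISTANCE_THRESHOLD_M = 0.5  # hypothetical calibrated value

def select_target_class(object_distance_m: float) -> str:
    # Step S320: a small object distance makes translation dominant, so use
    # acceleration data; a large one makes rotation dominant, so use angular velocity.
    if object_distance_m < FIRST_OBJECT_DISTANCE_THRESHOLD_M:
        return "acceleration"
    return "angular_velocity"

def ois_control(object_distance_m: float, imu_samples: dict) -> dict:
    """Steps S310-S340: pick the target class, estimate pose, derive control."""
    target_class = select_target_class(object_distance_m)   # S320
    pose = sum(imu_samples[target_class])                   # S330 (toy pose estimate)
    return {"class": target_class, "compensation": -pose}   # S340

params = ois_control(0.3, {"acceleration": [0.01, -0.02], "angular_velocity": [0.001]})
```

A real implementation would replace the toy summation with filtering and pose estimation on the selected data stream.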
In the optical anti-shake control method, the target class of inertial sensing data is determined according to the object distance between the photographed object and the camera; the pose information of the camera is determined using the inertial sensing data of the target class; and the optical anti-shake control parameters are then obtained by analyzing the pose information, enabling accurate optical anti-shake control. On the one hand, the scheme realizes optical anti-shake at different object distances, breaking through the object-distance limitation in the related art and widening the application range of optical anti-shake. On the other hand, because only the inertial sensing data of the target class adapted to the current object distance is used for the pose calculation, the amount of inertial sensing data to be processed is reduced to a certain extent, which improves the response speed and efficiency of optical anti-shake and alleviates potential camera heating.
Each step in fig. 3 is explained in detail below.
In step S310, an object distance between the subject and the camera is acquired.
Object distance generally refers to the distance between the photographed object and the optical center of the lens. In the present exemplary embodiment, the object distance measures the distance between the photographed object and the camera, and may be the distance between the photographed object and any component of the camera (not limited to the lens), such as its image sensor. The photographed object may be a foreground object in the picture, an object at the center of the picture, an object in focus, or the like. In step S310, the object distance at the time one or more previous frames were captured may be used for the optical anti-shake control of the current frame, or the object distance may be obtained from a preview image.
The present disclosure is not limited to a particular manner of determining object distance, and several examples are provided below.
In one embodiment, the acquiring an object distance between the object to be photographed and the camera may include:
acquiring focusing parameters when a camera aligns to a shot object;
and determining the object distance according to the focusing parameters.
The focusing parameters may include the focusing distance, an auto-focus control parameter, and the like. The focusing distance may be determined by auto-focusing or manual focusing. The auto-focus control parameter is a parameter that adjusts the lens position during auto-focusing, for example an AF (Auto Focus) code. The AF code may take a value in the range of 0 to 1023 indicating the position of the lens along the optical axis; for example, a larger AF code value indicates a lens position farther from the image sensor. The AF code is strongly correlated with the focusing distance and can therefore also be used to calculate the object distance.
The focusing parameter may be the one used to capture the current frame. When capturing successive frames, the focusing parameters from one or more previous frames may also be used, since focusing parameters usually change little between consecutive frames.
In one embodiment, with the focal length obtained, the object distance may be calculated from the relationship among focal length, object distance, and image distance in lens imaging.
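One such relationship is the thin-lens equation 1/f = 1/u + 1/v, which yields the object distance u once the focal length f and image distance v are known (a textbook sketch; real camera modules deviate from the ideal thin lens):

```python
def object_distance_from_thin_lens(focal_length_mm: float, image_distance_mm: float) -> float:
    """Solve 1/f = 1/u + 1/v for the object distance u (same units as the inputs)."""
    # Rearranged: u = f * v / (v - f); valid only when v exceeds f.
    if image_distance_mm <= focal_length_mm:
        raise ValueError("image distance must exceed focal length for a real object")
    return focal_length_mm * image_distance_mm / (image_distance_mm - focal_length_mm)

# f = 4 mm, v = 4.1 mm -> u = 164 mm
u = object_distance_from_thin_lens(4.0, 4.1)
```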
In one embodiment, the above determining the object distance according to the focusing parameter may include the following steps:
and determining the object distance corresponding to the automatic focusing control parameter when the camera is aligned with the shot object based on the calibration relation between the automatic focusing control parameter and the object distance.
Generally, different object distances lead to different focal distances during auto-focusing, and thus to different auto-focus control parameters for adjusting the lens position; that is, the auto-focus control parameter corresponds to the object distance. This correspondence can be calibrated in advance, for example by determining the auto-focus control parameter at each of several known object distances, yielding a calibration relation between the auto-focus control parameter and the object distance. When the current frame is captured, the current auto-focus control parameter is obtained and the corresponding object distance is looked up in the calibration relation.
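A minimal sketch of such a calibration lookup, with invented (AF code, object distance) pairs and linear interpolation between calibrated points:

```python
import bisect

# Hypothetical calibration pairs: AF code (0-1023) -> object distance in mm,
# measured in advance at known object distances.
CALIBRATION = [(100, 5000.0), (400, 1000.0), (700, 300.0), (1000, 100.0)]

def object_distance_from_af_code(af_code: int) -> float:
    """Look up the object distance for an AF code, interpolating between points."""
    codes = [c for c, _ in CALIBRATION]
    if af_code <= codes[0]:
        return CALIBRATION[0][1]
    if af_code >= codes[-1]:
        return CALIBRATION[-1][1]
    i = bisect.bisect_right(codes, af_code)
    (c0, d0), (c1, d1) = CALIBRATION[i - 1], CALIBRATION[i]
    t = (af_code - c0) / (c1 - c0)
    return d0 + t * (d1 - d0)  # linear interpolation between calibrated points
```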
In one embodiment, if the camera is a depth camera, depth information of the photographed object may be acquired and the object distance determined from it. For example, the depth value of the photographed object may be used directly as the object distance; when the depth value is not unique, e.g. when different parts of the photographed object have different depth values, the depth values may be fused (e.g., averaged or weighted) to obtain the object distance.
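The fusion of multiple depth values mentioned above can be as simple as a plain or weighted average (a minimal sketch; the sample values are invented):

```python
def fuse_depth_values(depths_mm, weights=None) -> float:
    """Fuse per-region depth values of the photographed object into one object distance.

    Plain average by default; optional weights (e.g. favouring the focused region).
    """
    if weights is None:
        return sum(depths_mm) / len(depths_mm)
    total = sum(weights)
    return sum(d * w for d, w in zip(depths_mm, weights)) / total

fused = fuse_depth_values([480.0, 500.0, 520.0])              # plain average
weighted = fuse_depth_values([480.0, 520.0], weights=[3, 1])  # weighted fusion
```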
In one embodiment, the object distance may be determined from two images of the photographed object taken from different viewing angles. For example, the electronic device may include two cameras (a binocular camera) that capture two images of the photographed object; or a moving camera may capture two images at different times, e.g. the two frames preceding the current frame, or two preview images captured at different moments. Given the two images, the three-dimensional information of the photographed object can be reconstructed by the triangulation principle, from which the object distance is determined.
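For the binocular case, the standard rectified-stereo triangulation relation Z = f·B/d (focal length f in pixels, baseline B, disparity d) recovers the object distance of a matched point (a textbook sketch under idealized assumptions, not the disclosure's specific method):

```python
def object_distance_from_stereo(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Depth of a matched point from two rectified views: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite object distance")
    return focal_px * baseline_mm / disparity_px

# f = 800 px, baseline = 12 mm, disparity = 16 px -> Z = 600 mm
z = object_distance_from_stereo(800.0, 12.0, 16.0)
```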
With continued reference to fig. 3, in step S320, a target class is determined among the data classes of the inertial sensors based on the object distance.
The data classes of the inertial sensor may be divided by sensor type, or in other ways, such as by the form of the data itself. Exemplary data classes of the inertial sensor may include acceleration data and angular velocity data; magnetometer data, gravity data, and the like may also be included.
Camera shake usually includes two motion components, rotation and translation. Taking lens shake as an example (shake of the image sensor is equivalent), the effect of rotation and translation on picture offset is described below with reference to fig. 4, which shows the picture offset caused by lens rotation and by lens translation at different object distances. As fig. 4 shows, when the lens 220 rotates, the angle of the optical axis changes: at a small object distance, the distance between the optical axis (or focal point) and the photographed object 230 is small, i.e., the picture offset is small; as the object distance increases, that distance grows, i.e., the picture offset grows. When the lens 220 translates, the angle of the optical axis does not change and the whole lens shifts relative to the object 230, so the distance between the optical axis (or focal point) and the object 230, i.e., the picture offset, does not change with object distance. Therefore, at a small object distance the picture offset is insensitive to lens rotation but sensitive to lens translation, so translation needs to be compensated; at a large object distance the picture offset is insensitive to translation but sensitive to rotation, so rotation needs to be compensated.
It should be noted that fig. 4 is only illustrative and takes the straight-line distance between the photographed object 230 and the actual optical axis as the picture offset. The picture offset may also be determined from the arc distance, from the displacement of feature points in the image, or the like. Whichever way the picture offset is calculated, the principle is similar and the relationship shown in fig. 4 still holds.
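The relationship shown in fig. 4 can be checked numerically with a toy model (invented numbers): the straight-line offset caused by a rotation grows in proportion to the object distance, while the offset caused by a translation does not depend on it.

```python
import math

def offset_from_rotation(object_distance_mm: float, tilt_deg: float) -> float:
    # Straight-line distance between the rotated optical axis and the object:
    # grows linearly with object distance for a fixed tilt.
    return object_distance_mm * math.tan(math.radians(tilt_deg))

def offset_from_translation(translation_mm: float) -> float:
    # A pure translation shifts the axis by the same amount at every object distance.
    return translation_mm

near = offset_from_rotation(100.0, 0.5)    # small object distance
far = offset_from_rotation(2000.0, 0.5)    # large object distance: ~20x larger offset
```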
Therefore, in the present exemplary embodiment, whether the translation class or the rotation class of inertial sensing data is adopted as the target class for optical shake compensation can be decided according to the object distance. The inertial sensing data of the translation class may be acceleration data, and the inertial sensing data of the rotation class may be angular velocity data.
In one embodiment, the above determining the target class from the data classes of the inertial sensor according to the object distance may include the following steps:
determining the acceleration data as a target class in response to the object distance being less than a first object distance threshold;
determining the angular velocity data as the target class in response to the object distance being greater than the first object distance threshold.
The first object distance threshold may be the object distance at which lens rotation and lens translation cause the same picture offset; it may be obtained empirically or through a preliminary test. If the object distance between the photographed object and the camera is less than the first object distance threshold, i.e., the object distance is small, translation has the larger influence on the picture offset, so the acceleration data is determined as the target class. If the object distance is greater than the first object distance threshold, i.e., the object distance is large, rotation has the larger influence on the picture offset, so the angular velocity data is determined as the target class.
In one embodiment, the first object distance threshold may be determined by preliminary testing. Specifically, referring to fig. 5, the optical anti-shake control method may further include the following steps S510 to S530:
step S510, obtaining a translational test quantity and a corresponding rotational test quantity based on a relationship between the translational compensation and the rotational compensation.
Here, the relationship between translation compensation and rotation compensation may be the general correspondence between the amount of compensation for translation and the amount of compensation for rotation. For example, an optical anti-shake system generally compensates shake by translating rather than rotating the lens; even when the lens rotates, the influence of the rotation can be reduced or even eliminated by translation compensation. Empirically, when the lens rotates by 1 degree, compensation can be performed by a translation of 0.2mm, so the relationship between translation compensation and rotation compensation may include a correspondence between 0.2mm and 1 degree. The relationship may be linear or non-linear, which is not limited by the present disclosure.
Based on the relationship between the translation compensation and the rotation compensation, at least one set of a translation test quantity and a corresponding rotation test quantity may be obtained, which may be expressed in the form (translation amount A, rotation angle B). For example, the above-mentioned 0.2mm can be used as a translation test quantity and 1 degree as a rotation test quantity, the two forming one set of test data.
In step S520, a first relationship between the object distance and the screen offset is obtained when the translation test quantity is used to apply translation to the camera, and a second relationship between the object distance and the screen offset is obtained when the rotation test quantity is used to apply rotation to the camera.
In the present exemplary embodiment, a translation test amount may be used to apply a translation to the camera, for example, to translate the lens by 0.2mm, in which case the image offset amount at different object distances is tested, and a first relationship between the object distance and the image offset amount is obtained. The second relationship between the object distance and the screen offset can be obtained by applying a rotation to the camera with a rotation test quantity, for example, rotating the lens by 1 degree, in which case the screen offset at different object distances is tested.
Step S530, determine a first object distance threshold according to the first relationship and the second relationship.
From the first relation and the second relation, the object distance at which the picture shift caused by the translation test quantity equals the picture shift caused by the rotation test quantity can be obtained; that object distance is the first object distance threshold.
Illustratively, referring to fig. 6, the first relation and the second relation may each be plotted as a curve, where the abscissa is the object distance and the ordinate is the picture shift amount. In general, the first relation curve may be a horizontal straight line (constant ordinate), and the second relation curve may be a linear or quadratic curve. The object distance corresponding to the intersection of the first relation curve and the second relation curve is the first object distance threshold.
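Under the simplifying assumptions that a fixed translation shifts the picture by a constant amount regardless of object distance, while a fixed rotation B shifts it by roughly d·tan(B), the intersection of the two curves has a closed form. This small-angle model and the (0.2 mm, 1 degree) pair are illustrative assumptions, not the calibration procedure of the disclosure:

```python
import math

def first_object_distance_threshold(translation_mm, rotation_deg):
    """Object distance (in mm) at which the picture shift caused by the
    rotation test quantity, modeled as d * tan(B), equals the constant
    picture shift caused by the translation test quantity."""
    return translation_mm / math.tan(math.radians(rotation_deg))

# With the illustrative test pair (0.2 mm, 1 degree):
threshold_mm = first_object_distance_threshold(0.2, 1.0)
```

A larger translation test quantity moves the crossover point farther out, matching the intuition that translation matters over a wider range of object distances.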
In one embodiment, the above determining the target class from the data classes of the inertial sensor according to the object distance may include the following steps:
determining the acceleration data as the target category in response to the object distance being less than a second object distance threshold;
determining both the acceleration data and the angular velocity data as target categories in response to the object distance being greater than the second object distance threshold and less than a third object distance threshold;
determining the angular velocity data as the target category in response to the object distance being greater than the third object distance threshold.
When the object distance is smaller than the second object distance threshold, the picture shift caused by rotation is small because the object distance is small; the influence of rotation can be ignored and only translation considered, so only the acceleration data is determined as the target category. When the object distance is larger than the third object distance threshold, the picture shift caused by translation is small because the object distance is large; the influence of translation can be ignored and only rotation considered, so only the angular velocity data is determined as the target category. When the object distance lies between the second and third object distance thresholds, both the translational and rotational effects need to be considered, and both data categories are therefore determined as target categories.
The second and third object distance thresholds may be obtained empirically or through a preliminary test. For example, a preset ratio (e.g., 90%, or another empirical value) may be determined, and the effect of translation or rotation may be ignored when the ratio of the picture shift it causes to the equivalent total picture shift is lower than the preset ratio. Referring to fig. 6, the picture shifts caused by the translation amount A and by the rotation angle B may be calculated at different object distances, along with the ratio each contributes. The object distance at which the ratio contributed by the rotation angle B equals the preset ratio is determined as the second object distance threshold; when the object distance is smaller than the second object distance threshold, the rotation share is below the preset ratio. The object distance at which the ratio contributed by the translation amount A equals the preset ratio is determined as the third object distance threshold; when the object distance is larger than the third object distance threshold, the translation share is below the preset ratio.
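The two-threshold selection above can be sketched as a small function; the threshold values and category names are illustrative placeholders, not values from the disclosure:

```python
def select_target_classes(object_distance, second_threshold, third_threshold):
    """Return the inertial-sensing data classes to use at a given object distance.

    Below the second threshold rotation is negligible -> acceleration only.
    Above the third threshold translation is negligible -> angular velocity only.
    In between, both classes are used."""
    if object_distance < second_threshold:
        return {"acceleration"}
    if object_distance > third_threshold:
        return {"angular_velocity"}
    return {"acceleration", "angular_velocity"}

# e.g. with illustrative thresholds of 0.05 m and 2.0 m:
near = select_target_classes(0.02, 0.05, 2.0)
far = select_target_classes(10.0, 0.05, 2.0)
mid = select_target_classes(1.0, 0.05, 2.0)
```

A benefit noted later in the text is that sensors outside the returned set can be powered down to save energy.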
According to the first object distance threshold value or the second object distance threshold value and the third object distance threshold value, the appropriate target class can be determined under the condition of different object distances. And then only the inertial sensing data of the target class is used for optical anti-shake control, and other classes of inertial sensing data are not needed.
In one embodiment, in the case of a target class determination, the inertial sensors of the non-target class may also be turned off. For example, if the acceleration data is determined to be of the target category and the angular velocity data does not need to be used, the gyroscope used to sense the angular velocity data may be turned off, thereby further reducing power consumption.
With continued reference to fig. 3, in step S330, pose information of the camera is determined based on the inertial sensing data of the target class.
The pose information of the camera may include at least one of position data and attitude data of the camera. The position data is used to indicate the absolute position or relative position of the camera, and usually includes X, Y, Z coordinates in three axial directions, but this disclosure is not limited thereto, and the position data may also be expressed in the form of spherical coordinates or the like. The attitude data is used to represent the orientation state of the camera. The form and specific content of the attitude data are not limited in the present disclosure, and may be, for example, an absolute attitude in a certain coordinate system, or a relative attitude with respect to a certain reference attitude. Illustratively, the attitude data may include any one or more of an attitude quaternion, an euler angle, and a rotation matrix.
In one embodiment, the target class may include acceleration data. After the acceleration data is obtained, the acceleration data is corrected and integrated in a time domain, so that triaxial displacement data, namely translation data, of the camera can be obtained.
In one embodiment, the target category may include angular velocity data. After the angular velocity data is obtained, the three-axis angle change data of the camera, namely the rotation data, can be obtained by correcting the angular velocity data and integrating the angular velocity data in a time domain.
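The correction-and-integration described in the two embodiments above can be sketched as follows; the trapezoidal rule and a constant-bias correction are simplifying assumptions (a real implementation must also handle gravity removal, drift, and noise):

```python
def integrate(samples, dt, bias=0.0):
    """Trapezoidal time-domain integration of one axis of sensor data.

    samples: raw readings (m/s^2 for an accelerometer axis, rad/s for a
    gyroscope axis); bias: a previously calibrated constant offset.
    Returns the running integral (velocity or angle per step)."""
    corrected = [s - bias for s in samples]
    total = 0.0
    out = []
    for prev, cur in zip(corrected, corrected[1:]):
        total += 0.5 * (prev + cur) * dt
        out.append(total)
    return out

# Angular velocity -> angle: integrate once.
angles = integrate([0.1, 0.1, 0.1], dt=0.02)
# Acceleration -> displacement: integrate twice.
velocity = integrate([1.0, 1.0, 1.0], dt=0.02)
displacement = integrate([0.0] + velocity, dt=0.02)
```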
In one embodiment, pose information of the camera can be determined in a machine learning mode. For example, a pose prediction model, such as an LSTM (Long Short-Term Memory) network, may be trained in advance. In step S330, inertial sensing data of the object type is input to the pose prediction model, and pose information of the camera is output after model processing.
In one embodiment, the pose information of the camera may be represented as a vector Q_i (where i denotes the current time), which may include position data in the three axial directions X, Y, Z and attitude data.
In one embodiment, pose information for the cameras may be determined periodically. For example, the sensing frequency of the inertial sensor is 50Hz, that is, the period is 20ms, which means that the inertial sensor collects inertial sensing data every 20ms, and then the pose information of the camera can be determined every 20 ms.
In one embodiment, since the pose information of the camera is used to determine the shake condition of the camera, relative pose information can be used for the calculation, i.e., the relative pose information of the camera can be determined in step S330. For example, the relative pose information of the camera at the current time with respect to a preceding time may be determined based on the inertial sensing data of the target class.
In one embodiment, the target class may include at least two classes of data for the inertial sensor. If the sensing frequencies of the at least two inertial sensors are different, the frequencies of the output data of the different kinds of inertial sensors can be made the same by up-sampling or down-sampling the data of at least one of the inertial sensors. For example, if the target category includes acceleration data and angular velocity data, and the frequency at which the accelerometer collects the acceleration data is different from the frequency at which the gyroscope collects the angular velocity data, the acceleration data or the angular velocity data may be up-sampled or down-sampled, so that the two data are time-synchronized.
Time errors can exist between different types of inertial sensors. For example, although an accelerometer outputs acceleration data at time t and a gyroscope outputs angular velocity data at time t, the sensing moments actually corresponding to the two readings may differ because of sensor response delay, data transmission delay, and other factors. In one embodiment, the time error between different types of inertial sensors may be corrected in advance, for example by calibrating the time error between the accelerometer and the gyroscope in combination with other or external sensors, and compensating so as to minimize it.
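The resampling and time alignment described above can be sketched with linear interpolation; the constant time offset and the sample values are illustrative assumptions:

```python
def resample(timestamps, values, target_times, time_offset=0.0):
    """Linearly interpolate a sensor stream onto target timestamps.

    timestamps/values: the stream to resample (e.g. gyroscope samples);
    target_times: the reference timestamps (e.g. accelerometer samples);
    time_offset: a pre-calibrated constant offset between the two sensors."""
    shifted = [t + time_offset for t in timestamps]
    out = []
    for t in target_times:
        # Clamp outside the covered interval; interpolate inside it.
        if t <= shifted[0]:
            out.append(values[0])
        elif t >= shifted[-1]:
            out.append(values[-1])
        else:
            i = next(k for k in range(1, len(shifted)) if shifted[k] >= t)
            w = (t - shifted[i - 1]) / (shifted[i] - shifted[i - 1])
            out.append((1 - w) * values[i - 1] + w * values[i])
    return out

# Gyro at 100 Hz, downsampled onto 50 Hz accelerometer timestamps:
gyro_t = [0.00, 0.01, 0.02, 0.03, 0.04]
gyro_v = [0.0, 1.0, 2.0, 3.0, 4.0]
aligned = resample(gyro_t, gyro_v, [0.00, 0.02, 0.04])
```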
With continued reference to fig. 3, in step S340, the optical anti-shake control parameters are obtained by analyzing the pose information of the camera.
The optical anti-shake control parameters are used for controlling one or more components in the optical anti-shake system, such as a lens or an image sensor, to perform position adjustment, that is, the optical anti-shake control parameters may include position adjustment information for the lens or the image sensor in the camera.
By analyzing the pose information of the camera, the shake condition of the camera can be obtained. In the present exemplary embodiment, anti-shake compensation information may be used to characterize the shake of the camera. Camera shake may refer to the amount of extra motion of the camera relative to a smooth motion state. The smooth motion state may include: a stationary state, a motion state in which the velocity (translational velocity or rotational angular velocity) is kept constant, a motion state in which the acceleration (linear acceleration or angular acceleration) is kept constant, a motion state in which the jerk is kept constant, and the like. The anti-shake compensation information may be the above-mentioned additional motion amount or its inverse (used to compensate for the additional motion amount).
In an embodiment, as shown in fig. 7, the obtaining of the optical anti-shake control parameter by analyzing the pose information of the camera may include the following steps S710 and S720:
Step S710, determining anti-shake compensation information according to the pose information of the camera at the current time and the pose information of at least one preceding time of the current time.
The preceding time of the current time may be any time before the current time. For example, if the current time is denoted as time i, the preceding time may be time i-1, i.e., the previous inertial-sensing-data collection time, or time i-2, the one before that, and so on.
From the pose information of the preceding time(s), smooth pose information of the current time can be obtained, i.e., the pose information the camera would be expected to have if it kept moving smoothly. For example, the pose information of the camera at a plurality of preceding times may be smoothed to obtain the smooth pose information of the camera at the current time; the smoothing may include fitting a smooth curve to the pose information of the plurality of preceding times and reading the smooth pose information of the current time off the fitted curve. The pose information of the camera at the current time obtained in step S330 may be used as the actual pose information, and the anti-shake compensation information may be calculated from the difference between the actual pose information and the smooth pose information.
In one embodiment, the pose information of the camera at the current time and at the preceding time(s) can be filtered to obtain filtered pose information of the camera at the current time, which serves as the smooth pose information; the anti-shake compensation information is then determined from the actual pose information and the filtered pose information at the current time. During filtering, the pose information of the preceding time may be either the filtered pose information or the actual pose information of that time. Illustratively, after the actual pose information at time i is obtained, it is fused (for example, by weighted fusion) with the filtered pose information at time i-1 to obtain the filtered pose information at time i.
In one embodiment, the filtered pose information of the current time can be determined based on the anti-shake intensity, the pose information of the current time, and the pose information of the preceding time; the anti-shake compensation information is then determined based on the actual pose information and the filtered pose information at the current time.
Here, the anti-shake intensity indicates the degree of optical anti-shake applied: the higher the anti-shake intensity, the stronger the applied optical anti-shake and the better the picture stabilization. The anti-shake intensity can be set by the user or automatically by the system; for example, the system may set different anti-shake intensities for different shooting modes, or pre-configure anti-shake intensities for different motion states (e.g., the higher the motion speed, the higher the anti-shake intensity) and determine the current anti-shake intensity from the current motion state (which may be derived from the pose information). The anti-shake intensity may also determine the filtering intensity: for example, when filtering the pose information at the current time, the anti-shake intensity may serve as the weight of the pose information at the preceding time, so the greater the anti-shake intensity, the higher the weight of the preceding pose information and the stronger the smoothing.
The specific filtering manner is not limited in the present disclosure; for example, the weighted-fusion filtering above may be adopted, and other manners such as Kalman filtering may also be adopted.
Illustratively, the filtered pose Qfilter_i at the current time may be determined from the anti-shake intensity alpha, the actual pose Q_i at the current time, and the filtered pose Qfilter_{i-1} at the previous time, as follows:
Qfilter_i = f(Q_i, Qfilter_{i-1}, alpha)    (1)
In the filtering process, Qfilter_{i-1} has a weight of alpha and Q_i has a weight of 1-alpha.
Further, the filtered pose Qfilter_i may be subtracted from the actual pose Q_i at the current time to obtain the anti-shake compensation amount ΔQ, i.e., the anti-shake compensation information:
ΔQ = Q_i - Qfilter_i    (2)
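One possible instantiation of formulas (1) and (2), treating f as a per-component weighted fusion (an exponential-moving-average choice; the disclosure does not fix f to this form):

```python
def filter_pose(q_actual, q_filter_prev, alpha):
    """One weighted-fusion filtering step of formula (1):
    Qfilter_i = alpha * Qfilter_{i-1} + (1 - alpha) * Q_i,
    applied per component. Higher alpha = stronger smoothing."""
    return [alpha * p + (1 - alpha) * a for a, p in zip(q_actual, q_filter_prev)]

def compensation(q_actual, q_filter):
    """Formula (2): anti-shake compensation amount dQ = Q_i - Qfilter_i."""
    return [a - f for a, f in zip(q_actual, q_filter)]

# Euler angles (deg) on three axes at time i, with the previous filtered pose:
q_i = [1.0, -0.5, 0.2]
q_filter_prev = [0.0, 0.0, 0.0]
q_filter_i = filter_pose(q_i, q_filter_prev, alpha=0.9)
delta_q = compensation(q_i, q_filter_i)
```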
The anti-shake compensation amount ΔQ is a vector, and may be an angle compensation amount, a translation compensation amount, or a combination of the two.
When the target class includes only angular velocity data, the actual pose Q_i includes only rotation data, such as euler angles (X, Y, Z), and the calculated anti-shake compensation amount ΔQ may be an angle compensation amount including the rotation-angle compensation amounts ΔX, ΔY, ΔZ on the X, Y, and Z axes.
When the target class includes only acceleration data, the actual pose Q_i includes only translation data, such as displacement amounts (shift_x, shift_y, shift_z), and the calculated anti-shake compensation amount ΔQ may be a translation compensation amount including the translation compensation amounts shift_x, shift_y, shift_z on the X, Y, and Z axes.
When the target class includes both angular velocity data and acceleration data, the actual pose Q_i may include both rotation data and translation data, for example in the form (X, Y, Z, shift_x, shift_y, shift_z), where the first three components are the euler angles of the three axes and the last three are the displacement amounts of the three axes; the calculated anti-shake compensation amount ΔQ may then be a fusion of an angle compensation amount and a translation compensation amount. For example, the angle compensation amount ΔQ_1 may be calculated from the rotation data (X, Y, Z) according to the above formula, and the translation compensation amount ΔQ_2 may be calculated from the translation data (shift_x, shift_y, shift_z) according to the above formula. The angle compensation amount ΔQ_1 and the translation compensation amount ΔQ_2 are fused in the corresponding directions to obtain the anti-shake compensation amount ΔQ; specifically, ΔQ_1 and ΔQ_2 may be weighted and summed with their corresponding weights along each axis to obtain ΔQ.
And step S720, determining position adjusting information according to the anti-shake compensation information.
The optical anti-shake system can compensate for the shake of the camera by moving the position of the lens or the image sensor, so as to realize optical anti-shake. The position adjustment information may be used to adjust the position of the lens or the image sensor.
In one embodiment, adjusting the position of the lens is taken as an example. The lens may be moved in one or more directions. For example, in current optical anti-shake systems, the lens can usually be moved in a plane (such as the XY plane), i.e., in the X direction and the Y direction. The disclosure is not limited to this; for example, the lens may also be moved in three-dimensional space, i.e., in the X, Y, and Z directions.
Fig. 8 shows a schematic diagram in which the lens is moved in the X direction. The range of the lens's moving stroke is limited by the hardware of the optical anti-shake system (such as the frame, the motor, etc.). The full stroke includes a linear stroke and a non-linear stroke. When the lens is located at the center of the full stroke, its leftward linear stroke occupies half of the whole linear stroke and its rightward linear stroke occupies the other half, so the compensation capability to the left and to the right is the same. Since the shake direction of the camera is generally random, the lens can be positioned at the stroke center position by default so that the leftward and rightward compensation capabilities are equal. The position adjustment information may include the stroke by which the lens is moved, expressed in code values, for example in the range -1023 to 1023: when the lens is at the stroke center position the code value is 0; moving left corresponds to negative values, with a code value of -1023 indicating the left stroke boundary; moving right corresponds to positive values, with a code value of 1023 indicating the right stroke boundary.
In one embodiment, the lens may be moved in the X direction and the Y direction, and the position adjustment information may include a target stroke position in the X direction and a target stroke position in the Y direction, i.e., a stroke position after the movement. After obtaining the anti-shake compensation amount Δ Q, the anti-shake compensation amount in the X direction and the anti-shake compensation amount in the Y direction can be obtained, and then the stroke in the X direction and the stroke in the Y direction can be calculated correspondingly.
In one embodiment, a preset calibration value may be obtained, and the position adjustment information may be determined based on the anti-shake compensation information and the preset calibration value. The preset calibration value can be used to represent the relationship between the anti-shake compensation information and the stroke position; it may be a preset empirical value, or a value set by the user or automatically by the system.
For example, the product of the anti-shake compensation amount and the preset calibration value, plus the stroke center position, can be used to obtain the corresponding position adjustment information, as in the following formulas:
Δcode_x = (Δx · gain_x) + center_code_x
Δcode_y = (Δy · gain_y) + center_code_y    (3)
target_Hall_x = f(Δcode_x, Hall_x)
target_Hall_y = f(Δcode_y, Hall_y)    (4)
wherein gain_x and gain_y are the preset calibration values in the X and Y directions; center_code_x and center_code_y are the center positions of the full stroke of the lens; Hall_x and Hall_y represent the current stroke position of the lens; Δcode_x and Δcode_y are the calculated compensation stroke values; f denotes the fusion of the current stroke values with the compensation stroke values Δcode_x and Δcode_y, and the fusion may be filtering (such as Kalman filtering) or another fusion algorithm; target_Hall_x and target_Hall_y denote the target stroke positions in the X and Y directions. The position adjustment information is thereby obtained.
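Formulas (3) and (4) can be sketched numerically for one axis; the gain value, the ±1023 code range, and the simple weighted blend standing in for the fusion f are illustrative assumptions:

```python
def target_hall(delta, gain, center_code, hall_now, blend=0.5, limit=1023):
    """Compute a target stroke position from an anti-shake compensation amount.

    Formula (3): delta_code = delta * gain + center_code.
    Formula (4): fuse delta_code with the current stroke position; here a
    simple weighted blend stands in for the fusion f of the disclosure.
    The result is clamped to the hardware stroke range in code units."""
    delta_code = delta * gain + center_code
    fused = blend * delta_code + (1 - blend) * hall_now
    return max(-limit, min(limit, fused))

# X axis: compensation of 0.3 deg, gain 1000 code/deg, centered stroke:
tx = target_hall(delta=0.3, gain=1000.0, center_code=0.0, hall_now=100.0)
```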
It should be appreciated that the present exemplary embodiment may also achieve optical anti-shake by moving the image sensor, or moving the lens and the image sensor simultaneously. The principle and manner of calculating the position adjustment information of the image sensor are the same as those of the lens, and thus are not described again.
Fig. 9 shows a schematic diagram of an optical anti-shake procedure. The target class is determined according to the object distance, and inertial sensing data of the target class is collected. Real-time position values of the camera lens in the X and Y directions at the current time are obtained, namely the current stroke position Hall_x in the X direction and the current stroke position Hall_y in the Y direction. The inertial sensing data and the current stroke position serve as inputs. The real-time pose of a target object is calculated from the inertial sensing data of the target class, where the target object may be the inertial sensor, the camera, or the lens or image sensor of the camera; the filtered pose and the corresponding anti-shake compensation amount are calculated; the target stroke position is calculated by combining the preset calibration value, the current stroke position, and the anti-shake compensation amount; and the calculated target stroke position is issued to the driving circuit to drive the motor to move the lens to the target stroke positions target_Hall_x and target_Hall_y, thereby completing the optical anti-shake compensation.
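The loop of fig. 9 can be tied together in a single-axis sketch; all numeric constants and the per-axis simplification are illustrative assumptions, not parameters from the disclosure:

```python
def anti_shake_step(q_actual, state, alpha=0.9, gain=1000.0,
                    center_code=0.0, blend=0.5, limit=1023):
    """One control iteration for a single axis: filter the real-time pose,
    derive the compensation amount, and convert it into a target stroke
    position to send to the driver circuit."""
    q_filter = alpha * state["q_filter"] + (1 - alpha) * q_actual
    delta_q = q_actual - q_filter
    delta_code = delta_q * gain + center_code
    target = blend * delta_code + (1 - blend) * state["hall"]
    target = max(-limit, min(limit, target))
    state.update(q_filter=q_filter, hall=target)
    return target

state = {"q_filter": 0.0, "hall": 0.0}
targets = [anti_shake_step(q, state) for q in [0.2, 0.25, 0.1]]
```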
Exemplary embodiments of the present disclosure also provide an optical anti-shake control apparatus. Referring to fig. 10, the optical anti-shake control apparatus 1000 may include:
an object distance acquisition module 1010 configured to acquire an object distance between a photographed object and a camera;
a target class determination module 1020 configured to determine a target class among data classes of the inertial sensors according to the object distance;
a pose information determination module 1030 configured to determine pose information for the cameras based on the inertial sensing data for the object class;
and the control parameter determination module 1040 is configured to obtain the optical anti-shake control parameters by analyzing the pose information of the camera.
In one embodiment, the determining the target class from the data classes of the inertial sensor according to the object distance includes:
in response to the object distance being less than a first object distance threshold, determining the acceleration data as a target class;
the angular velocity data is determined to be the target class in response to the object distance being greater than the first object distance threshold.
In one embodiment, the object distance obtaining module 1010 is further configured to:
acquiring a translation test quantity and a corresponding rotation test quantity based on the relation between the translation compensation and the rotation compensation;
under the condition that translation is applied to the camera by using the translation test quantity, a first relation between the object distance and the image offset is obtained, and under the condition that rotation is applied to the camera by using the rotation test quantity, a second relation between the object distance and the image offset is obtained;
and determining a first object distance threshold according to the first relation and the second relation.
In one embodiment, the determining the target class from the data classes of the inertial sensor according to the object distance includes:
in response to the object distance being less than a second object distance threshold, determining the acceleration data as the target category;
determining both the acceleration data and the angular velocity data as target categories in response to the object distance being greater than the second object distance threshold and less than a third object distance threshold;
in response to the object distance being greater than the third object distance threshold, determining the angular velocity data as the target category.
In one embodiment, the acquiring an object distance between the object to be photographed and the camera includes:
acquiring focusing parameters when a camera aligns to a shot object;
and determining the object distance according to the focusing parameters.
In one embodiment, the focus parameters may include auto focus control parameters. The above object distance determination according to the focusing parameters includes:
and determining the object distance corresponding to the automatic focusing control parameter when the camera is aligned with the shot object based on the calibration relation between the automatic focusing control parameter and the object distance.
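The calibration relationship between the auto-focus control parameter and the object distance can be represented as a lookup table with interpolation; the AF code values and distances below are hypothetical placeholders, not calibration data from the disclosure:

```python
def object_distance_from_af(af_code, calibration):
    """Map an auto-focus control parameter to an object distance by linear
    interpolation over a pre-calibrated (af_code, distance) table,
    clamping outside the calibrated range."""
    table = sorted(calibration)
    if af_code <= table[0][0]:
        return table[0][1]
    if af_code >= table[-1][0]:
        return table[-1][1]
    for (c0, d0), (c1, d1) in zip(table, table[1:]):
        if c0 <= af_code <= c1:
            w = (af_code - c0) / (c1 - c0)
            return (1 - w) * d0 + w * d1

# Hypothetical calibration: AF code 100 -> 0.1 m (macro), 900 -> 5.0 m (far).
cal = [(100, 0.1), (500, 1.0), (900, 5.0)]
d = object_distance_from_af(300, cal)
```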
In one embodiment, the optical anti-shake control parameters include position adjustment information for a lens or an image sensor in the camera; obtaining the optical anti-shake control parameters by analyzing the pose information of the camera includes:
determining anti-shake compensation information according to the pose information of the camera at the current moment and the pose information of at least one preceding moment of the current moment;
and determining position adjusting information according to the anti-shake compensation information.
The specific details of each part in the above device have been described in detail in the method part embodiments, and details that are not disclosed may be referred to in the method part embodiments, and thus are not described again.
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium, which may be implemented in the form of a program product, including program code for causing an electronic device to perform the steps according to various exemplary embodiments of the present disclosure described in the above-mentioned "exemplary method" section of this specification, when the program product is run on the electronic device. In an alternative embodiment, the program product may be embodied as a portable compact disc read only memory (CD-ROM) and include program code, and may be run on an electronic device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, C++, and the like, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
It should be noted that although the above detailed description mentions several modules or units of the device for action execution, such a division is not mandatory. Indeed, according to exemplary embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided and embodied by a plurality of modules or units.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method, or program product. Accordingly, various aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, all of which may generally be referred to herein as a "circuit," "module," or "system." Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure that follow the general principles of the disclosure, including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.
Claims (10)
1. An optical anti-shake control method, comprising:
acquiring an object distance between a photographed subject and a camera;
determining a target class from the data classes of the inertial sensors according to the object distance;
determining pose information of the camera based on the inertial sensing data of the target class;
and obtaining optical anti-shake control parameters by analyzing the pose information of the camera.
2. The method of claim 1, wherein determining the target class from the data classes of the inertial sensors according to the object distance comprises:
determining acceleration data as the target class in response to the object distance being less than a first object distance threshold;
in response to the object distance being greater than the first object distance threshold, determining angular velocity data as the target class.
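As an informal illustration of claims 1-2 (not the patented implementation; the function name and the default threshold value are hypothetical), the class of inertial-sensor data driving the anti-shake loop can be switched on object distance: for close subjects, translational shake dominates the image offset, so accelerometer data is used; for distant subjects, rotational shake dominates, so gyroscope data is used.

```python
def select_target_class(object_distance_m, first_threshold_m=0.5):
    """Pick which inertial-sensor data class drives OIS (claims 1-2 sketch).

    object_distance_m: distance between the photographed subject and the
    camera, in meters.
    first_threshold_m: the "first object distance threshold" (illustrative).
    """
    if object_distance_m < first_threshold_m:
        # Close subjects: translational shake produces large image offsets,
        # so the camera pose is estimated from accelerometer data.
        return "acceleration"
    # Distant subjects: rotational shake dominates the image offset,
    # so gyroscope (angular velocity) data is used instead.
    return "angular_velocity"
```

With the illustrative 0.5 m threshold, a subject at 0.2 m would select accelerometer data, while one at 2 m would select gyroscope data.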
3. The method of claim 2, further comprising:
acquiring a translation test quantity and a corresponding rotation test quantity based on the relation between the translation compensation and the rotation compensation;
acquiring a first relation between the object distance and the image offset while the translation test quantity is applied to translate the camera, and acquiring a second relation between the object distance and the image offset while the rotation test quantity is applied to rotate the camera;
determining the first object distance threshold according to the first relation and the second relation.
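Claim 3's threshold determination can be sketched as follows (a simplified model under thin-lens assumptions; the names and the offset formulas are illustrative, not taken from the patent): the translation-induced image offset falls off roughly as f·t/d with object distance d, while the rotation-induced offset, roughly f·θ, is nearly distance-independent, so the first object distance threshold can be taken where the two relations intersect.

```python
def first_threshold_from_tests(f_mm, trans_mm, rot_rad, distances_mm):
    """Estimate the first object-distance threshold (claim 3 sketch).

    f_mm: focal length; trans_mm: translation test quantity;
    rot_rad: corresponding rotation test quantity;
    distances_mm: candidate object distances to evaluate.
    """
    best_d, best_gap = None, float("inf")
    for d in distances_mm:
        offset_translation = f_mm * trans_mm / d  # first relation: ~ f*t/d
        offset_rotation = f_mm * rot_rad          # second relation: ~ f*theta
        gap = abs(offset_translation - offset_rotation)
        if gap < best_gap:
            # Keep the distance where the two offset curves are closest.
            best_d, best_gap = d, gap
    return best_d
```

Under this model the crossover is at d = t/θ; e.g. a 1 mm translation paired with a 2 mrad rotation crosses over at 500 mm.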
4. The method of claim 1, wherein determining the target class from the data classes of the inertial sensors according to the object distance comprises:
determining acceleration data as the target class in response to the object distance being less than a second object distance threshold;
determining both acceleration data and angular velocity data as the target class in response to the object distance being greater than the second object distance threshold and less than a third object distance threshold;
in response to the object distance being greater than the third object distance threshold, determining angular velocity data as the target class.
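Claim 4 refines the selection into three object-distance bands; a minimal sketch (hypothetical names and threshold values), where the middle band returns both classes so the pose can be estimated by fusing accelerometer and gyroscope data:

```python
def select_target_classes(d, second_threshold, third_threshold):
    """Claim 4 sketch: three object-distance bands (illustrative only)."""
    if d < second_threshold:
        # Near band: translational shake dominates -> accelerometer only.
        return ("acceleration",)
    if d < third_threshold:
        # Transition band: translational and rotational shake both matter,
        # so both data classes are used (e.g. fused) for the camera pose.
        return ("acceleration", "angular_velocity")
    # Far band: rotational shake dominates -> gyroscope only.
    return ("angular_velocity",)
```

With illustrative thresholds of 0.5 m and 3 m, a subject at 1 m would use both data classes.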
5. The method according to claim 1, wherein acquiring the object distance between the photographed subject and the camera comprises:
acquiring focusing parameters when the camera is aimed at the photographed subject;
and determining the object distance according to the focusing parameters.
6. The method of claim 5, wherein the focusing parameters comprise an auto-focus control parameter, and determining the object distance according to the focusing parameters comprises:
and determining, based on a calibrated relationship between the auto-focus control parameter and the object distance, the object distance corresponding to the auto-focus control parameter obtained when the camera is aimed at the photographed subject.
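Claims 5-6 can be sketched as a calibrated lookup (the function name, the table values, and the interpolation scheme are invented for illustration): an auto-focus control parameter captured at focus lock, e.g. a voice-coil-motor drive code, is mapped to an object distance by interpolating a per-module calibration table.

```python
def object_distance_from_af(af_code, calibration):
    """Claims 5-6 sketch: AF control parameter -> object distance.

    calibration: list of (af_code, object_distance_m) pairs sorted by code,
    e.g. measured once per camera module during factory calibration.
    """
    codes = [c for c, _ in calibration]
    dists = [d for _, d in calibration]
    # Clamp outside the calibrated range.
    if af_code <= codes[0]:
        return dists[0]
    if af_code >= codes[-1]:
        return dists[-1]
    # Linear interpolation between the two surrounding calibration points.
    for (c0, d0), (c1, d1) in zip(calibration, calibration[1:]):
        if c0 <= af_code <= c1:
            t = (af_code - c0) / (c1 - c0)
            return d0 + t * (d1 - d0)
```

For example, with calibration points (100, 5.0 m), (200, 1.0 m), and (300, 0.1 m), an AF code of 150 would map to 3.0 m.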
7. The method according to claim 1, wherein the optical anti-shake control parameters comprise position adjustment information for a lens or an image sensor in the camera, and obtaining the optical anti-shake control parameters by analyzing the pose information of the camera comprises:
determining anti-shake compensation information according to the pose information of the camera at the current moment and at least one preceding moment;
and determining the position adjusting information according to the anti-shake compensation information.
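A minimal sketch of claim 7 (the gain value, the single-axis pose model, and all names are assumptions, not the disclosed method): the anti-shake compensation is taken as the current pose's deviation from a reference built from the preceding poses, and is then converted into lens position adjustment information via an actuator gain.

```python
def lens_adjustment(pose_history, gain_um_per_rad=2000.0):
    """Claim 7 sketch: derive an OIS lens shift (in micrometers) from the
    camera pose at the current moment and at preceding moments.

    pose_history: single-axis camera angles (rad) at successive instants,
    newest last; at least two entries.
    """
    *previous, current = pose_history
    # Mean of the preceding poses as a simple low-pass "intended" pose.
    reference = sum(previous) / len(previous)
    jitter = current - reference  # anti-shake compensation information
    # Shift the lens opposite to the jitter, scaled by the actuator gain,
    # yielding the position adjustment information.
    return -gain_um_per_rad * jitter
```

For example, a 1 mrad deviation from a steady reference would command a 2 µm shift in the opposite direction under the assumed gain.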
8. An optical anti-shake control apparatus, comprising:
an object distance acquisition module configured to acquire an object distance between a photographed subject and a camera;
a target class determination module configured to determine a target class among data classes of inertial sensors according to the object distance;
a pose information determination module configured to determine pose information of the camera based on the inertial sensing data of the target class;
and a control parameter determination module configured to obtain the optical anti-shake control parameters by analyzing the pose information of the camera.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method of any one of claims 1 to 7.
10. An electronic device, comprising:
a processor;
a memory for storing executable instructions of the processor;
a camera including an optical anti-shake system;
wherein the processor is configured to perform the method of any of claims 1 to 7 via execution of the executable instructions.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210181901.XA CN114449173B (en) | 2022-02-25 | 2022-02-25 | Optical anti-shake control method and device, storage medium and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114449173A true CN114449173A (en) | 2022-05-06 |
CN114449173B CN114449173B (en) | 2024-07-02 |
Family
ID=81373640
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210181901.XA Active CN114449173B (en) | 2022-02-25 | 2022-02-25 | Optical anti-shake control method and device, storage medium and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114449173B (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115086553A (en) * | 2022-06-07 | 2022-09-20 | Oppo广东移动通信有限公司 | Anti-shake control method, anti-shake control device, electronic apparatus, and storage medium |
CN115103108A (en) * | 2022-06-06 | 2022-09-23 | Oppo广东移动通信有限公司 | Anti-shake processing method, anti-shake processing device, electronic equipment and computer-readable storage medium |
CN115134525A (en) * | 2022-06-27 | 2022-09-30 | 维沃移动通信有限公司 | Data transmission method, inertia measurement unit and optical anti-shake unit |
CN116095489A (en) * | 2023-04-11 | 2023-05-09 | 北京城建智控科技股份有限公司 | Collaborative anti-shake method based on camera device and storage medium |
CN116300294A (en) * | 2022-10-25 | 2023-06-23 | 荣耀终端有限公司 | Method and device for simulating human body shake |
CN117119303A (en) * | 2023-04-07 | 2023-11-24 | 荣耀终端有限公司 | Control method of camera module |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103685950A (en) * | 2013-12-06 | 2014-03-26 | 华为技术有限公司 | Method and device for preventing shaking of video image |
CN105100614A (en) * | 2015-07-24 | 2015-11-25 | 小米科技有限责任公司 | Optical anti-vibration realization method, apparatus and electronic equipment |
US20180255245A1 (en) * | 2017-03-02 | 2018-09-06 | Canon Kabushiki Kaisha | Image blur correction apparatus, control method, imaging apparatus, and lens apparatus |
CN109639893A (en) * | 2018-12-14 | 2019-04-16 | Oppo广东移动通信有限公司 | Play parameter method of adjustment, device, electronic equipment and storage medium |
CN111355888A (en) * | 2020-03-06 | 2020-06-30 | Oppo广东移动通信有限公司 | Video shooting method and device, storage medium and terminal |
CN112637489A (en) * | 2020-12-18 | 2021-04-09 | 努比亚技术有限公司 | Image shooting method, terminal and storage medium |
CN113452914A (en) * | 2021-06-28 | 2021-09-28 | 上海艾为电子技术股份有限公司 | Optical anti-shake control device, optical anti-shake control method thereof and mobile terminal |
WO2021258321A1 (en) * | 2020-06-24 | 2021-12-30 | 华为技术有限公司 | Image acquisition method and apparatus |
Also Published As
Publication number | Publication date |
---|---|
CN114449173B (en) | 2024-07-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114449173B (en) | Optical anti-shake control method and device, storage medium and electronic equipment | |
US9979889B2 (en) | Combined optical and electronic image stabilization | |
KR101528860B1 (en) | Method and apparatus for correcting a shakiness in digital photographing apparatus | |
CN110035228B (en) | Camera anti-shake system, camera anti-shake method, electronic device, and computer-readable storage medium | |
WO2020037959A1 (en) | Image processing method, image processing apparatus, electronic device and storage medium | |
CN106911889B (en) | Image blur correction apparatus and tilt correction apparatus, and control methods thereof | |
US20170134649A1 (en) | Imaging device and imaging method | |
WO2013108434A1 (en) | Shaking amount detection device, imaging device, and shaking amount detection method | |
KR20180101466A (en) | Depth information acquisition method and apparatus, and image acquisition device | |
CN109218627A (en) | Image processing method, device, electronic equipment and storage medium | |
EP2974271B1 (en) | Anti-shake correction system for curved optical sensor | |
JP7182020B2 (en) | Information processing method, device, electronic device, storage medium and program | |
CN114338994B (en) | Optical anti-shake method, apparatus, electronic device, and computer-readable storage medium | |
CN114531546B (en) | Lens adjusting method and device, storage medium and electronic equipment | |
CN107615744A (en) | A kind of image taking determination method for parameter and camera device | |
WO2015033810A1 (en) | Imaging device, method and program | |
EP3267675B1 (en) | Terminal device and photographing method | |
US9407811B2 (en) | Focus control unit in imaging apparatus, method of controlling the focus control unit and medium for controlling the focus control unit | |
US10248859B2 (en) | View finder apparatus and method of operating the same | |
EP4013030A1 (en) | Image processing method and apparatus, and electronic device and computer-readable storage medium | |
CN114567727B (en) | Shooting control system, shooting control method and device, storage medium and electronic equipment | |
JP2020136774A (en) | Image processing apparatus for detecting motion vector, control method of the same, and program | |
CN115022540B (en) | Anti-shake control method, device and system and electronic equipment | |
JP2017038243A (en) | Imaging apparatus | |
CN118413739A (en) | Electronic image stabilization method, electronic image stabilization device, medium and equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant |