CN117934325A - Image processing method and device, calibration method and device and electronic equipment - Google Patents
- Publication number: CN117934325A (application number CN202211239981.6A)
- Authority
- CN
- China
- Prior art keywords
- gyroscope
- image
- target
- camera module
- electronic equipment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01P—MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
- G01P15/00—Measuring acceleration; Measuring deceleration; Measuring shock, i.e. sudden change of acceleration
- G01P15/14—Measuring acceleration; Measuring deceleration; Measuring shock, i.e. sudden change of acceleration by making use of gyroscopes
Abstract
The present application relates to an image processing method and apparatus, an electronic device, a storage medium, and a computer program product. The method includes: acquiring a first image through a camera module and acquiring target gyroscope data through a target gyroscope; determining motion information of the target gyroscope based on the target gyroscope data; converting the motion information of the target gyroscope into motion information of the camera module according to a target conversion relation between the target gyroscope and the camera module, where the target conversion relation is obtained by the electronic device during calibration from the gyroscope data of a first gyroscope located at the target gyroscope position in the electronic device and a second gyroscope located at the camera module position; generating a motion blur kernel based on the motion information of the camera module; and performing motion blur removal on the first image using the motion blur kernel to obtain a target image. The method can reduce the hardware cost of image processing.
Description
Technical Field
The present application relates to the field of image technology, and in particular, to an image processing method and apparatus, a calibration method and apparatus, an electronic device, and a computer readable storage medium.
Background
In everyday photography, especially in low light, the duration of a single exposure may increase significantly. The camera displacement caused by shake of the electronic device can then exceed the imaging pixel size, blurring the captured image, so the image needs to be deblurred. Image deblurring techniques recover a blurred image using information related to the blur together with necessary prior knowledge about the image.
Conventional image processing approaches enlarge the image sensor and the optical image stabilization motor, increase the light energy collected by the camera module to improve the signal-to-noise ratio, and pair a better lens to obtain higher resolution, thereby reducing image blur caused by shake of the electronic device.
However, this conventional image processing method has a problem of high hardware cost.
Disclosure of Invention
The embodiment of the application provides an image processing method and device, a calibration method and device, electronic equipment and a computer readable storage medium, which can reduce the hardware cost of image processing.
In a first aspect, the present application provides an image processing method. The method comprises the following steps:
Acquiring a first image through a camera module of the electronic equipment, and acquiring target gyroscope data through a target gyroscope of the electronic equipment in the acquisition process of the first image;
Determining motion information of the target gyroscope based on the target gyroscope data;
Converting the motion information of the target gyroscope into the motion information of the camera module according to the target conversion relation between the target gyroscope and the camera module; the target conversion relation is obtained according to gyroscope data of a first gyroscope and gyroscope data of a second gyroscope in the process that the electronic equipment is calibrated, wherein the first gyroscope is positioned at a target gyroscope position in the electronic equipment, and the second gyroscope is positioned at a camera module position in the electronic equipment;
generating a motion blur kernel based on the motion information of the camera module;
And performing motion blur removal on the first image by using the motion blur kernel to obtain a target image.
In a second aspect, the present application also provides an image processing apparatus. The device comprises:
the data acquisition module is used for acquiring a first image through a camera module of the electronic equipment and acquiring target gyroscope data in the acquisition process of the first image through a target gyroscope of the electronic equipment;
The motion information determining module is used for determining motion information of the target gyroscope based on the target gyroscope data;
The conversion module is used for converting the motion information of the target gyroscope into the motion information of the camera module according to the target conversion relation between the target gyroscope and the camera module; the target conversion relation is obtained according to gyroscope data of a first gyroscope and gyroscope data of a second gyroscope in the process that the electronic equipment is calibrated, wherein the first gyroscope is positioned at a target gyroscope position in the electronic equipment, and the second gyroscope is positioned at a camera module position in the electronic equipment;
the blur kernel generation module is used for generating a motion blur kernel based on the motion information of the camera module;
And the deblurring module is used for performing motion blur removal on the first image by using the motion blur kernel to obtain a target image.
In a third aspect, the application further provides electronic equipment. The electronic device comprises a memory and a processor, the memory stores a computer program, and the processor executes the computer program to realize the following steps:
Acquiring a first image through a camera module of the electronic equipment, and acquiring target gyroscope data through a target gyroscope of the electronic equipment in the acquisition process of the first image;
Determining motion information of the target gyroscope based on the target gyroscope data;
Converting the motion information of the target gyroscope into the motion information of the camera module according to the target conversion relation between the target gyroscope and the camera module; the target conversion relation is obtained according to gyroscope data of a first gyroscope and gyroscope data of a second gyroscope in the process that the electronic equipment is calibrated, wherein the first gyroscope is positioned at a target gyroscope position in the electronic equipment, and the second gyroscope is positioned at a camera module position in the electronic equipment;
generating a motion blur kernel based on the motion information of the camera module;
And performing motion blur removal on the first image by using the motion blur kernel to obtain a target image.
In a fourth aspect, the present application also provides a computer-readable storage medium. The computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of:
Acquiring a first image through a camera module of the electronic equipment, and acquiring target gyroscope data through a target gyroscope of the electronic equipment in the acquisition process of the first image;
Determining motion information of the target gyroscope based on the target gyroscope data;
Converting the motion information of the target gyroscope into the motion information of the camera module according to the target conversion relation between the target gyroscope and the camera module; the target conversion relation is obtained according to gyroscope data of a first gyroscope and gyroscope data of a second gyroscope in the process that the electronic equipment is calibrated, wherein the first gyroscope is positioned at a target gyroscope position in the electronic equipment, and the second gyroscope is positioned at a camera module position in the electronic equipment;
generating a motion blur kernel based on the motion information of the camera module;
And performing motion blur removal on the first image by using the motion blur kernel to obtain a target image.
In a fifth aspect, the present application also provides a computer program product. The computer program product comprises a computer program which, when executed by a processor, implements the steps of:
Acquiring a first image through a camera module of the electronic equipment, and acquiring target gyroscope data through a target gyroscope of the electronic equipment in the acquisition process of the first image;
Determining motion information of the target gyroscope based on the target gyroscope data;
Converting the motion information of the target gyroscope into the motion information of the camera module according to the target conversion relation between the target gyroscope and the camera module; the target conversion relation is obtained according to gyroscope data of a first gyroscope and gyroscope data of a second gyroscope in the process that the electronic equipment is calibrated, wherein the first gyroscope is positioned at a target gyroscope position in the electronic equipment, and the second gyroscope is positioned at a camera module position in the electronic equipment;
generating a motion blur kernel based on the motion information of the camera module;
And performing motion blur removal on the first image by using the motion blur kernel to obtain a target image.
With the above image processing method and apparatus, electronic device, computer-readable storage medium, and computer program product, a first image is acquired through the camera module of the electronic device, and target gyroscope data is acquired by the target gyroscope of the electronic device during acquisition of the first image; motion information of the target gyroscope is determined based on the target gyroscope data; that motion information is then converted into motion information of the camera module according to the target conversion relation between the target gyroscope and the camera module, a motion blur kernel is generated based on the motion information of the camera module, and motion blur is removed from the first image. This avoids adding optical image stabilization hardware or enlarging the image sensor, reducing the hardware cost of image deblurring. Meanwhile, because the first gyroscope is located at the target gyroscope position in the electronic device and the second gyroscope at the camera module position during calibration, the conversion relation between the two gyroscopes, that is, the target conversion relation between the target gyroscope and the camera module, can be obtained accurately from their gyroscope data, so that motion blur removal of the first image based on the target conversion relation is more accurate and a sharper target image is obtained.
In a sixth aspect, the present application provides a calibration method, which is applied to an electronic device, where a first gyroscope is disposed at a target gyroscope position in the electronic device and a second gyroscope is disposed at a camera module position in the electronic device during the process of calibrating the electronic device. The method comprises the following steps:
Acquiring first gyroscope data acquired by the first gyroscope and second gyroscope data acquired by the second gyroscope when the electronic equipment moves;
Based on the first gyroscope data and the second gyroscope data, obtaining a conversion relationship between the first gyroscope and the second gyroscope; the conversion relation is used for converting the motion information of the target gyroscope into the motion information of the camera module in the using process of the electronic equipment.
In a seventh aspect, the application further provides a calibration device, which is applied to electronic equipment, wherein in the process that the electronic equipment is calibrated, a first gyroscope is arranged at a target gyroscope position in the electronic equipment, and a second gyroscope is arranged at a camera module position in the electronic equipment. The device comprises:
The data acquisition module is used for acquiring first gyroscope data acquired by the first gyroscope and second gyroscope data acquired by the second gyroscope when the electronic equipment moves;
The calibration module is used for calibrating and obtaining the conversion relation between the first gyroscope and the second gyroscope based on the first gyroscope data and the second gyroscope data; the conversion relation is used for converting the motion information of the target gyroscope into the motion information of the camera module in the using process of the electronic equipment.
In an eighth aspect, the application further provides an electronic device, wherein in the process of calibrating the electronic device, a first gyroscope is arranged at the position of a target gyroscope in the electronic device, and a second gyroscope is arranged at the position of a camera module in the electronic device. The electronic device comprises a memory and a processor, the memory stores a computer program, and the processor executes the computer program to realize the following steps:
Acquiring first gyroscope data acquired by the first gyroscope and second gyroscope data acquired by the second gyroscope when the electronic equipment moves;
Based on the first gyroscope data and the second gyroscope data, obtaining a conversion relationship between the first gyroscope and the second gyroscope; the conversion relation is used for converting the motion information of the target gyroscope into the motion information of the camera module in the using process of the electronic equipment.
In a ninth aspect, the present application further provides a computer readable storage medium, which is applied to an electronic device, where a first gyroscope is disposed at a target gyroscope position in the electronic device and a second gyroscope is disposed at a camera module position in the electronic device in a process of calibrating the electronic device. The computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of:
Acquiring first gyroscope data acquired by the first gyroscope and second gyroscope data acquired by the second gyroscope when the electronic equipment moves;
Based on the first gyroscope data and the second gyroscope data, obtaining a conversion relationship between the first gyroscope and the second gyroscope; the conversion relation is used for converting the motion information of the target gyroscope into the motion information of the camera module in the using process of the electronic equipment.
In a tenth aspect, the present application further provides a computer program product, applied to an electronic device, where a first gyroscope is provided at a target gyroscope position in the electronic device and a second gyroscope is provided at a camera module position in the electronic device during a calibration process of the electronic device. The computer program product comprises a computer program which, when executed by a processor, implements the steps of:
Acquiring first gyroscope data acquired by the first gyroscope and second gyroscope data acquired by the second gyroscope when the electronic equipment moves;
Based on the first gyroscope data and the second gyroscope data, obtaining a conversion relationship between the first gyroscope and the second gyroscope; the conversion relation is used for converting the motion information of the target gyroscope into the motion information of the camera module in the using process of the electronic equipment.
With the above calibration method and apparatus, electronic device, computer-readable storage medium, and computer program product, the first gyroscope is located at the target gyroscope position in the electronic device and the second gyroscope at the camera module position during calibration. The conversion relation between the first gyroscope and the second gyroscope, that is, the target conversion relation between the target gyroscope and the camera module, can therefore be obtained accurately from the gyroscope data collected by the two gyroscopes while the electronic device moves, so that motion blur removal of the first image based on the target conversion relation is more accurate and a sharper target image is obtained.
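The patent does not spell out how the conversion relation is computed from the two gyroscope streams. One standard approach, sketched below under that assumption, is a least-squares fit of a rotation between paired motion vectors from the two gyroscopes using an SVD-based (Kabsch) solution; all function names and the synthetic data are illustrative:

```python
import numpy as np

def calibrate_rotation(gyro1, gyro2):
    """Least-squares rotation R such that gyro2 ≈ R @ gyro1 for paired
    Nx3 motion samples recorded while the device is moved.
    SVD-based Kabsch solution."""
    A = np.asarray(gyro1, dtype=float)
    B = np.asarray(gyro2, dtype=float)
    H = A.T @ B                       # 3x3 cross-covariance of the pairs
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])        # guard against a reflection
    return Vt.T @ D @ U.T

# Synthetic check: gyro 2 sees gyro 1's motion rotated 90 deg about Z.
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
rng = np.random.default_rng(1)
g1 = rng.normal(size=(50, 3))
g2 = g1 @ R_true.T                    # apply R_true to each sample
R_est = calibrate_rotation(g1, g2)
```

With real data the two streams would first be time-aligned, and sensor noise makes the recovered matrix approximate rather than exact.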
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of an image processing method in one embodiment;
FIG. 2 is a schematic diagram of an electronic device during use in one embodiment;
FIG. 3 is a flowchart of an image processing method in another embodiment;
- FIG. 4 is a schematic diagram of a uniform blocking approach in one embodiment;
- FIG. 5 is a schematic diagram of a centrally non-uniform blocking approach in one embodiment;
- FIG. 6 is a schematic diagram of a globally non-uniform blocking approach in one embodiment;
FIG. 7 is a schematic view of an image sensor of an image capturing module rotated about a Y-axis according to one embodiment;
- FIG. 8 is a schematic diagram of generating a sub-motion blur kernel for a preset region in one embodiment;
FIG. 9 is a schematic diagram of generating a motion blur kernel in one embodiment;
- FIG. 10 is a flow chart of a calibration method in one embodiment;
FIG. 11 is a schematic diagram of an electronic device being calibrated in one embodiment;
FIG. 12 is a block diagram showing the structure of an image processing apparatus in one embodiment;
- FIG. 13 is a block diagram of the structure of a calibration apparatus in one embodiment;
Fig. 14 is an internal structural diagram of an electronic device in one embodiment.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
In one embodiment, as shown in fig. 1, an image processing method is provided. This embodiment is illustrated by applying the method to an electronic device, which may be a terminal or a server; it will be appreciated that the method is also applicable to a system comprising a terminal and a server, implemented through interaction between the two. The terminal may be, but is not limited to, a personal computer, notebook computer, smartphone, tablet computer, Internet-of-Things device, or portable wearable device; the Internet-of-Things device may be a smart speaker, smart television, smart air conditioner, smart vehicle-mounted device, smart automobile, and the like. The portable wearable device may be a smart watch, smart bracelet, headset, or the like. The server may be implemented as a stand-alone server or as a server cluster composed of multiple servers.
In this embodiment, the image processing method includes the following steps 102 to 110:
step 102, acquiring a first image through a camera module of the electronic equipment, and acquiring target gyroscope data through a target gyroscope of the electronic equipment in the acquisition process of the first image.
The camera module is a module including a plurality of imaging components for capturing, processing, and transmitting images. The imaging components may include a lens for imaging, an AF (Auto Focus) motor that achieves focusing by moving the lens, and an image sensor, and may further include a filter, a circuit board, and the like. The image sensor converts the optical signal entering the electronic device into an electrical signal and generates an image, and the gyroscope acquires gyroscope data including angular acceleration and linear acceleration. Optionally, the camera module may be a telephoto camera module, or another type of camera module such as a wide-angle camera module, but is not limited thereto.
The target gyroscope data includes angular acceleration and linear acceleration, and may also include translational acceleration.
As shown in fig. 2, in the three-dimensional coordinate system XYZ, the electronic device includes a target gyroscope and an imaging module, which are movable in the X-direction, the Y-direction, and the Z-direction.
Step 104, determining the motion information of the target gyroscope based on the target gyroscope data.
The motion information of the target gyroscope includes a rotation angle and a movement amount of the gyroscope.
Optionally, the electronic device acquires multiple sets of target gyroscope data within the shooting period of the first image, each set including the angular acceleration and translational acceleration of the target gyroscope. Integrating the angular acceleration twice yields the rotation angle of the target gyroscope, and integrating the translational acceleration twice yields its movement amount; from the rotation angle and movement amount at each instant, the coordinate position of the target gyroscope at that instant can be determined.
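The double integration above can be sketched with cumulative trapezoidal integration; the patent does not fix an integration scheme, and the sample rate used here is an assumption:

```python
import numpy as np

def integrate_twice(acc, dt):
    """Numerically integrate acceleration samples twice.

    The first integration yields velocity (angular or translational),
    the second yields displacement (rotation angle or movement amount).
    Uses cumulative trapezoidal integration at a fixed step dt.
    """
    acc = np.asarray(acc, dtype=float)
    # velocity[i] = integral of acceleration up to sample i
    vel = np.concatenate(([0.0], np.cumsum((acc[1:] + acc[:-1]) / 2.0 * dt)))
    # displacement[i] = integral of velocity up to sample i
    disp = np.concatenate(([0.0], np.cumsum((vel[1:] + vel[:-1]) / 2.0 * dt)))
    return vel, disp

# Constant angular acceleration a = 2 rad/s^2 sampled at 100 Hz for 1 s:
# the analytic rotation angle is 0.5 * a * t^2 = 1.0 rad at t = 1 s.
dt = 0.01
acc = np.full(101, 2.0)
vel, ang = integrate_twice(acc, dt)
```

Trapezoidal accumulation is exact for this constant-acceleration check, so `ang[-1]` comes out at 1.0 rad.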
Optionally, determining the motion information of the target gyroscope based on the target gyroscope data includes: filtering the target gyroscope data, and obtaining the motion information of the target gyroscope based on the filtered target gyroscope data; wherein the filtering process includes at least one of a noise reduction process and a bias adjustment process.
It will be appreciated that the target gyroscope data output by the target gyroscope may differ from the data format used by the algorithm. For example, the acceleration output by the target gyroscope may be positive or negative depending on direction, while the algorithm may expect non-negative input, so bias processing is applied to the target gyroscope data, for example adding a bias so that negative values become positive, or taking absolute values. The filtering performed by the electronic device is thus a preprocessing step on the raw data.
The electronic equipment performs filtering processing on the target gyroscope data, so that more accurate target gyroscope data after the filtering processing can be obtained, and more accurate motion information of the target gyroscope can be obtained.
Because the sampling rate of the target gyroscope is high, the raw gyroscope data output by the target gyroscope can be temporally downsampled to further improve the efficiency of the image processing algorithm: the raw data output at several consecutive instants is averaged into one target gyroscope sample, which is then filtered.
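A minimal sketch of the temporal downsampling just described; the window size and bias handling are illustrative assumptions, not values from the patent:

```python
import numpy as np

def preprocess_gyro(raw, factor=4, bias=0.0):
    """Temporal downsampling plus simple bias adjustment of raw
    gyroscope samples: every `factor` consecutive raw samples are
    averaged into one target sample, and an optional bias is added so
    the data matches a non-negative input format."""
    x = np.asarray(raw, dtype=float)
    n = (len(x) // factor) * factor      # drop a partial trailing window
    ds = x[:n].reshape(-1, factor).mean(axis=1)
    return ds + bias

raw = [1.0, 3.0, 2.0, 4.0, 5.0, 7.0, 9.0]
out = preprocess_gyro(raw, factor=2)     # averages pairs, drops the odd tail
```

Here the pairs (1, 3), (2, 4), and (5, 7) average to 2.0, 3.0, and 6.0, and the trailing sample 9.0 is discarded.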
Step 106, converting the motion information of the target gyroscope into the motion information of the camera module according to the target conversion relation between the target gyroscope and the camera module; the target conversion relation is obtained by the electronic equipment according to the gyroscope data of the first gyroscope and the gyroscope data of the second gyroscope in the calibrating process, wherein the first gyroscope is positioned at the position of the target gyroscope in the electronic equipment, and the second gyroscope is positioned at the position of the camera module in the electronic equipment.
It can be understood that the electronic device includes a target gyroscope and a camera module, and the target gyroscope and the camera module are respectively located at different positions, so that a conversion relationship of motion information exists between the target gyroscope and the camera module.
Optionally, the target conversion relationship may be a conversion matrix, the electronic device obtains a conversion matrix between the target gyroscope and the camera module, which is calibrated in advance, and multiplies the motion information of the target gyroscope by the conversion matrix to obtain the motion information of the camera module.
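Applying the pre-calibrated conversion matrix is a single matrix multiplication; the matrix values below are purely illustrative, not a calibrated result:

```python
import numpy as np

# Hypothetical pre-calibrated conversion matrix: a 90-degree rotation
# about the Z axis, for illustration only.
conversion = np.array([[0.0, -1.0, 0.0],
                       [1.0,  0.0, 0.0],
                       [0.0,  0.0, 1.0]])

def gyro_to_camera(conversion_matrix, gyro_motion):
    """Convert target-gyroscope motion information (here a 3-vector)
    into camera-module motion information by matrix multiplication."""
    return conversion_matrix @ np.asarray(gyro_motion, dtype=float)

cam_motion = gyro_to_camera(conversion, [1.0, 0.0, 0.0])
```

A motion purely about the gyroscope's X axis maps, under this illustrative matrix, to a motion about the camera module's Y axis.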
It should be noted that the first gyroscope and the target gyroscope may be different gyroscopes, or may be the same gyroscope, that is, the electronic device replaces the camera module with the second gyroscope and uses the target gyroscope as the first gyroscope in the calibration process.
And step 108, generating a motion blur kernel based on the motion information of the camera module.
A blur kernel is in fact a matrix: convolving a sharp image with it makes the image blurred, hence the name. The blur kernel is one kind of convolution kernel, and image convolution is in essence matrix convolution. That is, image blur can be regarded as the result of convolving a sharp image with a blur kernel to obtain a blurred image.
The point spread function (PSF) is the spatial-domain response of the optical system to a point impulse during imaging; it describes the imaging effect and can serve as the blur kernel.
Optionally, the electronic device determines a motion trail of the camera module based on the motion information of the camera module; and taking the motion trail of the camera shooting module as a motion blur kernel.
It can be understood that a certain motion exists in the process of shooting the first image by the camera module, so that the first image is blurred, namely motion blur, and the motion blur in the first image can be represented by a motion blur kernel.
Optionally, the electronic device obtains the motion information of the camera module at each preset interval and determines the position of the camera module at each interval from that motion information; the motion between adjacent instants is treated as single-direction motion, and connecting all positions in time order yields the motion trajectory of the camera module.
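Turning the connected trajectory into a blur kernel can be sketched as rasterizing the line segments into a small grid and normalizing; the kernel size and centering are assumptions for illustration:

```python
import numpy as np

def trajectory_to_blur_kernel(positions, size=15):
    """Rasterize a 2-D motion trajectory into a normalized blur kernel.

    Consecutive positions are connected by straight segments (motion
    between adjacent instants is treated as single-direction motion);
    visited kernel cells are accumulated and the kernel is normalized
    so its entries sum to 1.
    """
    kernel = np.zeros((size, size))
    pts = np.asarray(positions, dtype=float)
    # shift the trajectory so it is centred in the kernel
    pts = pts - pts.mean(axis=0) + (size - 1) / 2.0
    for (x0, y0), (x1, y1) in zip(pts[:-1], pts[1:]):
        steps = max(int(max(abs(x1 - x0), abs(y1 - y0))) * 2, 1)
        for t in np.linspace(0.0, 1.0, steps + 1):
            x = int(round(x0 + t * (x1 - x0)))
            y = int(round(y0 + t * (y1 - y0)))
            if 0 <= x < size and 0 <= y < size:
                kernel[y, x] += 1.0
    return kernel / kernel.sum()

# A purely horizontal shake spanning 5 pixels:
k = trajectory_to_blur_kernel([(0, 0), (4, 0)])
```

For this horizontal trajectory every nonzero entry lies in a single kernel row, matching the streak a horizontal shake leaves in an image.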
In step 110, the motion blur is removed from the first image by using the motion blur kernel to obtain a target image.
Optionally, the electronic device deconvolves the first image with a motion blur kernel using a Patch-wise deconvolution algorithm to obtain the target image. The sharpness of the target image is higher than the sharpness of the first image.
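The patch-wise algorithm itself is not detailed in the patent; as a hedged stand-in for the deconvolution step, a frequency-domain Wiener filter shows the idea, with the noise-to-signal ratio `k_snr` an assumed parameter:

```python
import numpy as np

def wiener_deconvolve(blurred, kernel, k_snr=0.01):
    """Frequency-domain Wiener deconvolution of an image with a known
    blur kernel: F = conj(H) / (|H|^2 + k) * G."""
    H = np.fft.fft2(kernel, s=blurred.shape)   # zero-padded kernel spectrum
    G = np.fft.fft2(blurred)
    F = np.conj(H) / (np.abs(H) ** 2 + k_snr) * G
    return np.real(np.fft.ifft2(F))

# Blur a synthetic image with a small horizontal kernel, then restore it.
rng = np.random.default_rng(0)
sharp = rng.random((32, 32))
kernel = np.ones((1, 3)) / 3.0                 # 3-pixel horizontal motion blur
blurred = np.real(np.fft.ifft2(
    np.fft.fft2(sharp) * np.fft.fft2(kernel, s=sharp.shape)))
restored = wiener_deconvolve(blurred, kernel, k_snr=1e-6)
```

With a nearly noise-free synthetic image and a tiny `k_snr`, the restored image is much closer to the sharp original than the blurred input; on real photographs `k_snr` trades ringing against residual blur.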
The electronic device performs motion blur removal on the first image, which may be deconvolution, or other processing methods, which are not limited herein.
According to this image processing method, a first image is acquired through the camera module of the electronic device, and target gyroscope data is acquired by the target gyroscope of the electronic device during acquisition of the first image; motion information of the target gyroscope is determined based on the target gyroscope data; that motion information is then converted into motion information of the camera module according to the target conversion relation between the target gyroscope and the camera module, a motion blur kernel is generated based on the motion information of the camera module, and motion blur is removed from the first image. This avoids optical image stabilization hardware and an enlarged image sensor, reducing the hardware cost of image deblurring. Meanwhile, because the first gyroscope is located at the target gyroscope position in the electronic device and the second gyroscope at the camera module position during calibration, the conversion relation between the two gyroscopes, that is, the target conversion relation between the target gyroscope and the camera module, can be obtained accurately from their gyroscope data, so that motion blur removal of the first image based on the target conversion relation is more accurate and a sharper target image is obtained.
In one embodiment, as shown in fig. 3, a preview screen is displayed in a user interface of a screen of an electronic device. After a user gives a shooting instruction through the user interface to generate a shutter signal, the camera module performs an exposure process based on the shutter signal, converts the optical signal collected by the lens into an electrical signal, and outputs it as RAW image data (RAW file); after image processing by an image processing chip, the RAW image data are output as a first image containing motion blur. Meanwhile, the target gyroscope acquires original gyroscope data based on the shutter signal; the original gyroscope data are filtered by a filter to obtain the target gyroscope data, which are transmitted to a blur kernel generator to generate the motion blur kernel. If the output frequency of the target gyroscope is high, the collected data may be downsampled in time. The electronic device performs motion blur removal on the first image using the motion blur kernel to obtain a target image, and the target image is displayed in the user interface.
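The temporal downsampling of a high-rate gyroscope stream mentioned above can be sketched as follows. This is a minimal Python illustration with a hypothetical `downsample` helper; simple block averaging is assumed here, not the actual filter used by the device.

```python
import numpy as np

def downsample(samples, factor):
    """Reduce a high-rate gyroscope stream in time by averaging each
    group of `factor` consecutive samples (simple anti-alias averaging)."""
    n = len(samples) // factor * factor           # drop any trailing remainder
    return np.asarray(samples[:n], dtype=float).reshape(-1, factor).mean(axis=1)

# e.g. an 8 kHz stream reduced 4x to 2 kHz
out = downsample(np.arange(8, dtype=float), 4)
assert np.allclose(out, [1.5, 5.5])
```

In practice the averaging would follow the filter stage shown in fig. 3, so that aliasing of high-frequency shake is suppressed before the rate reduction.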
In one embodiment, the imaging surface of the image sensor in the camera module is divided into at least two preset areas; generating the motion blur kernel based on the motion information of the camera module includes: for each preset area, determining a motion trajectory of a preset imaging point in the preset area based on the motion information of the camera module, and generating a sub-motion blur kernel of the preset area based on the motion trajectory of the preset imaging point; and generating the motion blur kernel based on the sub-motion blur kernels of the respective preset areas.
It can be understood that, because the image sensor of the camera module is large, describing and processing the motion blur of the whole imaging surface with a single motion blur kernel easily causes artifacts in the processed image, so the imaging surface of the image sensor can be processed in blocks. For example, if the pixel resolution of the imaging surface is 4000×3000, the imaging surface is divided into 40×30 preset areas, and the motion blur kernel of one preset imaging point (such as the center point) in each 100×100-pixel preset area is used as the motion blur kernel of that area. Correspondingly, the 4000×3000 image generated by the image sensor is divided into 40×30 blocks, and the convolution kernel used to deblur each 100×100-pixel block is the motion blur kernel of a feature point (such as the center point) of that block. The density of motion blur kernels is therefore lower than the pixel resolution of the image, which saves time in the motion blur kernel computation.
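The block division in the example above can be sketched as follows. This is an illustrative Python fragment; the helper names and the uniform 100×100 grid follow the numbers in this paragraph, not any specific implementation.

```python
import numpy as np

def block_index(x, y, block_w=100, block_h=100):
    """Map a pixel coordinate to the (row, col) index of its preset area."""
    return y // block_h, x // block_w

def block_centers(width=4000, height=3000, block_w=100, block_h=100):
    """Center pixel of every preset area; the motion blur kernel computed
    at this point stands in for the whole block."""
    cx = np.arange(block_w // 2, width, block_w)    # x-centers of columns
    cy = np.arange(block_h // 2, height, block_h)   # y-centers of rows
    return cx, cy

cx, cy = block_centers()
assert len(cx) == 40 and len(cy) == 30              # 40x30 preset areas
assert block_index(3999, 2999) == (29, 39)          # bottom-right pixel
```

Each `(row, col)` block then stores one sub-motion blur kernel instead of one kernel per pixel, which is the density reduction the paragraph describes.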
The division manner of the image sensor may be set as needed, and is not limited herein.
In an alternative embodiment, as shown in fig. 4, in a uniform block manner, the electronic device may uniformly divide the imaging surface of the image sensor to obtain at least two preset areas.
In an alternative embodiment, the electronic device may divide the region of interest in the imaging plane of the image sensor by a first density, and divide the non-region of interest in the imaging plane of the image sensor by a second density, so as to obtain at least two preset regions; the first density is higher than the second density.
Both the region of interest and the non-region of interest may be set as desired. In one embodiment, as shown in fig. 5, the region of interest is the central region of the imaging surface of the image sensor and the non-region of interest is the edge region. It can be understood that people often place the photographed object of interest at the center of the picture, so the central region of the imaging surface is the region of interest and is divided at the higher first density, and each preset area obtained by dividing the central region yields a sub-motion blur kernel of higher precision.
In another embodiment, as shown in fig. 6, the area of interest is the area where the subject is located, and the area of non-interest is other area.
In other embodiments, the region of interest may be the region of the imaging surface of the image sensor corresponding to where the user performs a touch operation on the screen, and the non-region of interest the region of the imaging surface corresponding to where the user does not perform a touch operation.
It can be understood that the electronic device divides the region of interest at the higher first density, obtaining sub-motion blur kernels of higher precision for its preset areas and thereby reducing artifacts in the corresponding image region, and divides the non-region of interest at the lower second density to improve the efficiency of the algorithm.
Optionally, for each preset area, the electronic device determines a motion track of at least one preset imaging point in the preset area based on the motion information of the camera module, and generates a sub-motion blur kernel of the preset area based on the motion track of the at least one preset imaging point; and splicing the sub-motion blur kernels of each preset area to obtain the motion blur kernel of the first image. The position of the preset imaging point in the preset area can be set according to requirements. The preset imaging point may be, for example, a center point in the preset area or an edge point, and is not limited thereto.
If the preset area comprises one preset imaging point, the motion trajectory of that preset imaging point is taken as the sub-motion blur kernel of the preset area; if the preset area comprises at least two preset imaging points, the sub-motion blur kernel of the preset area is generated based on the motion trajectories of the at least two preset imaging points. Optionally, the electronic device may take the motion trajectory of any one of the at least two preset imaging points as the sub-motion blur kernel of the preset area, average the motion trajectories of the at least two preset imaging points and take the averaged trajectory as the sub-motion blur kernel, or determine the sub-motion blur kernel in other manners, which is not limited herein.
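The trajectory-averaging option can be sketched as follows. This is a hypothetical Python helper; rasterizing the mean trajectory onto a normalized kernel grid is one plausible way to turn a trajectory into a sub-motion blur kernel and is assumed here.

```python
import numpy as np

def sub_kernel_from_trajectories(trajs, size=15):
    """Average several imaging-point trajectories (lists of (dx, dy) pixel
    displacements) and rasterize the mean path into a normalized
    sub-motion-blur kernel."""
    mean_traj = np.mean(np.asarray(trajs, dtype=float), axis=0)
    kernel = np.zeros((size, size))
    c = size // 2                                   # kernel center
    for dx, dy in mean_traj:
        ix, iy = c + int(round(dx)), c + int(round(dy))
        if 0 <= ix < size and 0 <= iy < size:
            kernel[iy, ix] += 1.0
    return kernel / kernel.sum()                    # energy-preserving

traj_a = [(0, 0), (1, 0), (2, 1)]
traj_b = [(0, 0), (1, 0), (2, 1)]
k = sub_kernel_from_trajectories([traj_a, traj_b])
assert abs(k.sum() - 1.0) < 1e-9                    # normalized kernel
```

The normalization keeps the total image energy unchanged when the kernel is later used in convolution or deconvolution.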
Performing motion blur removal on the first image using the motion blur kernel to obtain the target image includes: performing motion blur removal on each corresponding image area in the first image using its sub-motion blur kernel to obtain motion-blur-removed image areas; and splicing the motion-blur-removed image areas to obtain the target image.
In this embodiment, for each preset area, the electronic device determines, based on the motion information of the camera module, a motion track of a preset imaging point in the preset area, and generates a sub-motion blur kernel of the preset area based on the motion track of the preset imaging point, where the sub-motion blur kernel of each preset area can more accurately represent the motion blur of the image area obtained in the preset area, so as to obtain a more accurate motion blur kernel.
In one embodiment, the motion information of the camera module is motion information of a reference imaging point of the imaging surface of the image sensor in the camera module; for each preset area, determining the motion trajectory of the preset imaging point in the preset area based on the motion information of the camera module includes: determining the positional relationship between the preset imaging point in the preset area and the reference imaging point; determining the position of the preset imaging point at intervals of a preset duration based on the motion information of the reference imaging point and the positional relationship; and generating the motion trajectory of the preset imaging point from these positions.
The reference imaging point may be set as desired. The reference imaging point may be, for example, a center point of the imaging surface of the image sensor, an edge point of the imaging surface of the image sensor, or other set points, which are not limited herein.
Optionally, for each preset area, the motion information of the preset imaging point at intervals of the preset duration is determined based on the motion information of the reference imaging point and the positional relationship; the position at the next moment is determined from the current position of the preset imaging point and the motion information over the preset interval; and the positions at successive intervals are connected to obtain the motion trajectory of the preset imaging point.
It will be appreciated that, while the electronic device moves, the motion information of a preset imaging point can be mapped from the motion information of the reference imaging point and the positional relationship between the reference imaging point and the preset imaging point.
As shown in figs. 7 and 8, take as an example the image sensor rotating by an angle θ around the Y axis of the electronic device within a preset duration Δt, with the reference imaging point being the center point of the imaging surface, which lies on the Y axis; the motion information of the center point is then the rotation angle θ around the Y axis. The positional relationship between preset imaging point 1 of preset area 1 and the center point is a distance Δx1 in the X direction and Δy1 in the Y direction; the positional relationship between preset imaging point 2 of preset area 2 and the center point is a distance Δx2 in the X direction and Δy2 in the Y direction. The displacement of preset imaging point 1 is then Δl1 = Δx1 − Δx1·cosθ = Δx1(1 − cosθ); the position at the next moment is obtained by moving preset imaging point 1 by Δl1 from its current position, the two positions are connected to obtain the motion trajectory of preset imaging point 1, and this trajectory is taken as sub-motion blur kernel 1 of the preset area containing preset imaging point 1. Similarly, the displacement of preset imaging point 2 is Δl2 = Δx2 − Δx2·cosθ = Δx2(1 − cosθ); the position at the next moment is obtained by moving preset imaging point 2 by Δl2 from its current position, the two positions are connected to obtain the motion trajectory of preset imaging point 2, and this trajectory is taken as sub-motion blur kernel 2 of the preset area containing preset imaging point 2.
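The per-interval displacement of an off-center imaging point under rotation about the Y axis might be accumulated as follows. This is an illustrative sketch; the projected-offset model dl = Δx·(1 − cos θ) is one plausible reading of the geometry above, and the exact expression depends on the lens projection model.

```python
import math

def trajectory_from_rotation(dx, thetas):
    """Positions of an imaging point at lateral offset dx (pixels), sampled
    once per preset interval, as the sensor rotates about the Y axis by the
    incremental angles in `thetas` (radians).  Uses dl = dx * (1 - cos(theta))
    on the cumulative angle; illustrative only."""
    positions, cum = [0.0], 0.0
    for theta in thetas:
        cum += theta                                  # accumulated rotation
        positions.append(dx * (1.0 - math.cos(cum)))  # projected displacement
    return positions

# a point 1000 px from the center, three intervals of 0.01 rad each
traj = trajectory_from_rotation(1000.0, [0.01, 0.01, 0.01])
assert traj[0] == 0.0 and traj[1] < traj[2] < traj[3]  # monotone drift
```

Connecting consecutive entries of `traj` gives the motion trajectory used as the sub-motion blur kernel of that preset area.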
It should be noted that the preset duration Δt is smaller than the shutter time, i.e., the exposure duration for acquiring the first image, so as to record a sufficiently fine motion blur kernel and deblur the first image more accurately.
In the present embodiment, for each preset area, the positional relationship between the preset imaging point and the reference imaging point is determined; based on the motion information of the reference imaging point and the positional relationship, the position of the preset imaging point at each preset interval can be determined; and the motion trajectory of the preset imaging point is generated from these positions, so that the motion trajectory of each preset imaging point on the imaging surface of the image sensor, i.e., the sub-motion blur kernel of each preset area, can be accurately determined.
In one embodiment, generating a motion blur kernel based on motion information of the camera module includes: generating a motion blur kernel component in the X direction based on translation information, rotation information around the Y direction and rotation information around the Z direction of an image sensor in the camera module; the plane where the X direction and the Y direction are located is parallel to the imaging surface of the image sensor, and the Z direction is perpendicular to the imaging surface of the image sensor; generating a motion blur kernel component in the Y direction based on translation information, rotation information around the X direction and rotation information around the Z direction of an image sensor in the image pickup module; the motion blur kernel is generated from the motion blur kernel component in the X direction and the motion blur kernel component in the Y direction.
The motion blur kernel components in the X direction and the Y direction describe the motion blur generated when the image sensor moves laterally; the plane containing the X and Y directions is parallel to the imaging surface of the image sensor in the camera module, and the Z direction is perpendicular to the imaging surface.
It will be appreciated that rotation about the Y axis (vertical) causes an image point to move in the orthogonal horizontal direction (X direction), rotation about the X axis causes movement in the Y direction, and rotation about the Z axis produces movement in both the X and Y directions. Therefore, in the three-dimensional coordinate system XYZ, based on the translation information in the X direction, the rotation information around the Y direction and the rotation information around the Z direction of the image sensor, the position of the camera module in the X direction can be determined at intervals of the preset duration, and connecting these positions yields the motion blur kernel component in the X direction.
Similarly, translation of the image sensor in the Y direction, rotation around the X direction and rotation around the Z direction all affect the motion blur kernel component in the Y direction; based on the translation information in the Y direction, the rotation information around the X direction and the rotation information around the Z direction of the image sensor in the camera module, the position of the camera module in the Y direction can be determined at intervals of the preset duration, and connecting these positions yields the motion blur kernel component in the Y direction.
The electronic device vector-adds the motion blur kernel component in the X direction and the motion blur kernel component in the Y direction to generate the motion blur kernel.
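The vector addition of the X-direction and Y-direction components can be sketched as follows. Hypothetical Python helper; representing each component as per-interval displacements and rasterizing the combined 2-D path into a kernel is an assumption for illustration.

```python
import numpy as np

def combine_components(kx, ky):
    """Combine per-interval displacements along X and along Y into one 2-D
    motion trajectory by vector addition, then rasterize the path into a
    normalized motion blur kernel."""
    path = np.stack([np.asarray(kx, float), np.asarray(ky, float)], axis=1)
    size = 2 * int(np.ceil(np.abs(path).max())) + 3   # grid big enough for path
    kernel = np.zeros((size, size))
    c = size // 2
    for dx, dy in path:
        kernel[c + int(round(dy)), c + int(round(dx))] += 1.0
    return kernel / kernel.sum()

# three sampling instants: drift mostly along X, slightly along Y
k = combine_components([0, 1, 2], [0, 0, 1])
assert abs(k.sum() - 1.0) < 1e-9
```

Each `(kx[i], ky[i])` pair is the sensor position at one preset interval, so the rasterized path is exactly the connected-positions trajectory described above.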
It can be understood that the defocus blur of the image itself, or the bokeh produced by focusing, is usually image blur that should be kept. The blur in the Z direction consists mainly of defocus blur, and because the depth of field of the camera module is large, rotation and translation of the image sensor during capture rarely cause significant defocus blur. The electronic device can therefore determine only the motion blur kernel components in the X direction and the Y direction, generate the motion blur kernel from them, and use it to remove the motion blur of the first image; the resulting target image retains the defocus blur that the user wishes to keep, preserving the bokeh of the picture and improving the accuracy of image processing.
In another embodiment, generating a motion blur kernel based on motion information of the camera module includes: generating a motion blur kernel component in the X direction based on translation information, rotation information around the Y direction and rotation information around the Z direction of an image sensor in the camera module; the plane where the X direction and the Y direction are located is parallel to the imaging surface of the image sensor, and the Z direction is perpendicular to the imaging surface of the image sensor; generating a motion blur kernel component in the Y direction based on translation information, rotation information around the X direction and rotation information around the Z direction of an image sensor in the image pickup module; generating a motion blur kernel component in the Z direction based on translation information, rotation information around the X direction and rotation information around the Y direction of an image sensor in the camera module, and scene depth variation or focusing position variation; the motion blur kernel is generated from the motion blur kernel component in the X direction, the motion blur kernel component in the Y direction, and the motion blur kernel component in the Z direction.
It will be appreciated that translation of the image sensor in the Z direction, rotation around the X direction and rotation around the Y direction can all affect the motion blur kernel component in the Z direction, as can the scene depth variation or the focus position variation. The scene depth variation refers to the change of distance between the photographed object and the camera module, and the focus position variation refers to the change of the focus position of the camera module in the Z direction.
Therefore, based on the translation information in the Z direction, the rotation information around the X direction and the rotation information around the Y direction of the image sensor in the camera module, together with the scene depth variation or focus position variation, the position of the camera module in the Z direction can be determined at intervals of the preset duration, and connecting these positions yields the motion blur kernel component in the Z direction.
The electronic device vector-adds the motion blur kernel component in the X direction, the motion blur kernel component in the Y direction and the motion blur kernel component in the Z direction to generate the motion blur kernel.
In one embodiment, as shown in FIG. 9, the electronic device obtains target gyroscope data through the target gyroscope, the data including angular acceleration and translational acceleration. Integrating the angular acceleration once gives the angular velocity, and integrating the angular velocity once gives the rotation amounts around the X, Y and Z axes; integrating the translational acceleration once gives the velocity, and integrating the velocity once gives the translation amounts along the X, Y and Z axes. The electronic device converts the motion information of the target gyroscope into the motion information of the camera module according to the target conversion relationship between the target gyroscope and the camera module, i.e., performs the coordinate-system conversion, obtaining the rotation amounts of the image sensor in the camera module around the X, Y and Z axes and its translation amounts along the X, Y and Z axes. Based on the translation information in the Z direction, the rotation information around the X direction and the rotation information around the Y direction of the image sensor, together with the scene depth variation or focus position variation, the position of the camera module in the Z direction can be determined at intervals of the preset duration, and connecting these positions yields the motion blur kernel component in the Z direction; the motion blur kernel can then be generated by vector addition of the motion blur kernel components in the X direction, the Y direction and the Z direction.
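The double integration from angular or translational acceleration to rotation or translation amounts can be sketched numerically as follows. This is a minimal rectangle-rule sketch in Python; a real pipeline would first filter bias and noise out of the raw samples, as the filter stage in fig. 3 suggests.

```python
import numpy as np

def double_integrate(samples, dt):
    """Twice cumulatively integrate a uniformly sampled signal:
    angular acceleration -> rotation angle,
    translational acceleration -> displacement."""
    velocity = np.cumsum(np.asarray(samples, dtype=float)) * dt
    return np.cumsum(velocity) * dt

# constant angular acceleration a = 2 rad/s^2 sampled at 1 kHz for 1 s;
# the closed form theta(t) = a*t^2/2 gives 1 rad at t = 1 s
angle = double_integrate(np.full(1000, 2.0), dt=1e-3)
assert abs(angle[-1] - 1.0) < 0.01
```

The same routine applied to each axis of the angular and translational acceleration yields the six motion amounts that the coordinate-system conversion then maps to the camera module.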
In one embodiment, performing motion blur removal on the first image using the motion blur kernel to obtain the target image includes: processing the first image in a preset processing manner to obtain a second image; convolving the second image with the motion blur kernel to obtain a third image; and determining the difference between the third image and the first image. If the difference is greater than a preset threshold, the second image is adjusted to obtain a new second image and the convolution step is executed iteratively until the difference is smaller than or equal to the preset threshold, whereupon the latest second image is taken as the target image.
The preset processing manner may be set as needed and is not limited herein. The preset threshold may also be set as needed, for example, 0, and is not limited herein.
Optionally, the electronic device superimposes a random signal on the first image to obtain the second image. It will be appreciated that superimposing the random signal on the first image causes the second image to be different from the first image, enabling the third image after the first iteration to be significantly different from the original blurred image, thereby bringing some regions closer to the desired target image than the first image.
Alternatively, the electronic device may generate the second image by another preset processing method, for example, processing the first image with a simple neural network to generate a deblurred second image.
The electronic device determines the difference between the third image and the first image; if the difference is greater than the preset threshold, the second image is adjusted to obtain a new second image, and the step of convolving the second image with the motion blur kernel is executed iteratively until the difference is smaller than or equal to the preset threshold, whereupon the latest second image is taken as the target image.
The iterative process is the process of minimizing the difference: each iteration adjusts the second image obtained in the previous iteration. For example, if increasing the pixel values in a certain block of the second image makes the difference smaller, the pixel values in that block are increased further in the next iteration.
In one embodiment, without taking noise into account, the image degradation model used by the electronic device to deblur the first image is:
I_blurred = I_ideal * PSF_motion
where I_ideal is the ideal, motion-blur-free image, i.e., the target image desired after deblurring; I_blurred is the first image, i.e., the image containing motion blur; PSF_motion is the point spread function describing the motion blur, i.e., the motion blur kernel generated from the target gyroscope data; and * denotes convolution.
In actual shooting, the first image acquired by the camera module is the result of convolving the ideal image with the motion blur kernel; the inverse of this convolution operation is therefore solved by an optimization method.
The electronic device convolves the second image with the motion blur kernel to obtain the third image, and determines the difference between the third image and the first image:
ΔI = I_generated * PSF_motion − I_blurred
where I_generated is the second image, I_generated * PSF_motion is the third image, I_blurred is the first image, and ΔI is the difference between the third image and the first image.
By minimizing the difference until it is smaller than or equal to the preset threshold, the latest second image becomes the target image, and the motion blur in the first image is eliminated.
In this embodiment, the electronic device processes the first image in a preset processing manner to obtain a second image, convolves the second image with the motion blur kernel to obtain a third image, and, if the difference between the third image and the first image is greater than the preset threshold, adjusts the second image to obtain a new second image and iterates the convolution step until the difference is smaller than or equal to the preset threshold. The latest second image, i.e., the target image, is thus obtained accurately with the motion blur eliminated, while the defocus blur and bokeh of the picture are retained.
In one embodiment, there is also provided an image processing method including the steps of:
Step A1: acquiring a first image through a camera module of the electronic device, and acquiring target gyroscope data through a target gyroscope of the electronic device while the first image is being captured.
Step A2: determining motion information of the target gyroscope based on the target gyroscope data.
Step A3: converting the motion information of the target gyroscope into motion information of a reference imaging point of the imaging surface of the image sensor in the camera module according to the target conversion relationship between the target gyroscope and the camera module; the target conversion relationship is obtained by the electronic device from the gyroscope data of a first gyroscope and a second gyroscope during calibration, where the first gyroscope is located at the position of the target gyroscope in the electronic device and the second gyroscope at the position of the camera module.
Step A4: dividing the imaging surface of the image sensor in the camera module into at least two preset areas; determining the positional relationship between the preset imaging point in each preset area and the reference imaging point; generating the position of the preset imaging point in the X direction at intervals of a preset duration based on the positional relationship and the translation information in the X direction, the rotation information around the Y direction and the rotation information around the Z direction of the preset imaging point; generating the motion blur kernel component in the X direction for each preset interval from these positions; likewise generating the position of the preset imaging point in the Y direction at intervals of the preset duration based on the positional relationship and the translation information in the Y direction, the rotation information around the X direction and the rotation information around the Z direction, and generating the motion blur kernel component in the Y direction for each preset interval from those positions; generating a motion blur kernel of the preset imaging point for each preset interval from the X-direction and Y-direction components; integrating the per-interval motion blur kernels of the preset imaging point over time to generate the sub-motion blur kernel of the preset area; and generating the motion blur kernel based on the sub-motion blur kernels of the preset areas. The plane containing the X and Y directions is parallel to the imaging surface of the image sensor, and the Z direction is perpendicular to the imaging surface.
Step A5: processing the first image in a preset processing manner to obtain a second image; convolving the second image with the motion blur kernel to obtain a third image; and determining the difference between the third image and the first image. If the difference is greater than a preset threshold, the second image is adjusted to obtain a new second image and the convolution step is executed iteratively until the difference is smaller than or equal to the preset threshold, whereupon the latest second image is taken as the target image.
In one embodiment, as shown in fig. 10, a calibration method is provided, and this embodiment is illustrated by applying the method to an electronic device, which may be a terminal or a server; it will be appreciated that the invention is also applicable to systems comprising a terminal and a server, and is implemented by interaction of the terminal and the server.
In this embodiment, during calibration of the electronic device, a first gyroscope is arranged at the target gyroscope position in the electronic device and a second gyroscope at the camera module position; the calibration method includes the following steps 1002 to 1004:
step 1002, acquiring first gyroscope data acquired by a first gyroscope and second gyroscope data acquired by a second gyroscope when an electronic device moves.
Step 1004, obtaining a conversion relationship between the first gyroscope and the second gyroscope based on the first gyroscope data and the second gyroscope data; the conversion relation is used for converting the motion information of the target gyroscope into the motion information of the camera module in the using process of the electronic equipment.
As shown in fig. 11, during calibration, the electronic device is fixed on a motion platform capable of three-axis rotation and three-axis translation, with the first gyroscope at the target gyroscope position in the electronic device and the second gyroscope at the camera module position. The first gyroscope data and the second gyroscope data are measured separately; each set comprises the translation data of the respective gyroscope along the X, Y and Z axes and its rotation data around the X, Y and Z axes. Based on the first gyroscope data and the second gyroscope data, 36 linearly independent equations are established, from which the conversion matrix R comprising 36 elements can be solved; this is the conversion relationship between the first gyroscope and the second gyroscope, and it is used as the target conversion relationship between the target gyroscope and the camera module.
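Solving for the 36 elements of R from paired 6-DOF measurements can be sketched as a least-squares problem. This is an illustrative NumPy sketch with synthetic data; the ground-truth matrix and measurement vectors are hypothetical, and real measurements would be noisy.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ground-truth 6x6 conversion between the two gyroscopes'
# 6-DOF frames (3 rotation + 3 translation components each).
R_true = np.eye(6) + 0.05 * rng.standard_normal((6, 6))

# At least six linearly independent motion excitations of the platform;
# each column of V1 is one 6-DOF measurement from the first gyroscope.
V1 = rng.standard_normal((6, 8))
V2 = R_true @ V1                       # corresponding second-gyroscope data

# Solve V2 = R @ V1 for the 36 unknowns of R in least squares:
# rows of V1.T are observations, and the unknown is R transposed.
R_est = np.linalg.lstsq(V1.T, V2.T, rcond=None)[0].T

assert np.allclose(R_est, R_true, atol=1e-8)      # exact in the noise-free case
```

With more than six excitations the system is overdetermined, which is how noisy real measurements would be averaged out.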
The electronic device can multiply the motion information of the target gyroscope by the conversion matrix R to obtain the motion information of the camera module:

$$m_{\text{cam}} = R \, m_{\text{gyro}}$$

wherein $R$ is the conversion matrix, $m_{\text{gyro}}$ is the motion information of the target gyroscope, and $m_{\text{cam}}$ is the motion information of the camera module.
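As an illustrative sketch of the calibration computation above (the function names and the `(n_samples, 6)` data layout are assumptions, not part of the disclosure), each platform pose contributes six linear equations relating the two gyroscopes' six-degree-of-freedom readings, so six linearly independent poses determine the 36 elements of R, and any extra poses are absorbed by least squares:

```python
import numpy as np

def estimate_conversion_matrix(first_gyro, second_gyro):
    """Solve for the 6x6 conversion matrix R with second ~= R @ first.

    first_gyro, second_gyro: arrays of shape (n_samples, 6); each row is
    (tx, ty, tz, rx, ry, rz) measured for one pose of the motion platform.
    Six linearly independent samples yield 36 independent equations and
    determine R exactly; additional samples are handled by least squares.
    """
    first = np.asarray(first_gyro, dtype=float)
    second = np.asarray(second_gyro, dtype=float)
    # R @ first_i = second_i for every sample i  <=>  first @ R.T = second
    R_T, *_ = np.linalg.lstsq(first, second, rcond=None)
    return R_T.T

def convert_motion(R, gyro_motion):
    """Map a target-gyroscope motion vector to camera-module motion."""
    return R @ np.asarray(gyro_motion, dtype=float)
```

During use of the electronic device, `convert_motion` corresponds to the multiplication by R described above.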
Optionally, the rotation axis passes through the first gyroscope during calibration of the electronic device.
It can be appreciated that if the rotation axis passes through the first gyroscope, the input matrix formed by the first gyroscope data and the second gyroscope data has fewer non-zero terms, so that the solution of the matrix can be simplified, and the conversion matrix can be determined more quickly.
According to the calibration method, in the process of calibrating the electronic device, the first gyroscope is located at the target gyroscope position in the electronic device and the second gyroscope is located at the camera module position in the electronic device. Therefore, from the gyroscope data collected by the first gyroscope and by the second gyroscope while the electronic device moves, the conversion relationship between the first gyroscope and the second gyroscope (that is, the target conversion relationship between the target gyroscope and the camera module) can be accurately obtained, and motion blur can be removed from the first image based on the target conversion relationship, so that a clearer target image is obtained.
It should be understood that, although the steps in the flowcharts of the above embodiments are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the order of execution of these steps is not strictly limited, and the steps may be performed in other orders. Moreover, at least some of the steps in these flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and which are not necessarily performed sequentially but may be performed in turn or alternately with at least a part of the other steps or of the sub-steps or stages of other steps.
Based on the same inventive concept, the embodiment of the application also provides an image processing device for realizing the above-mentioned image processing method. The implementation of the solution provided by the apparatus is similar to the implementation described in the above method, so the specific limitation of one or more embodiments of the image processing apparatus provided below may refer to the limitation of the image processing method hereinabove, and will not be repeated herein.
In one embodiment, as shown in fig. 12, there is provided an image processing apparatus including: a data acquisition module 1202, a motion information determination module 1204, a conversion module 1206, a blur kernel generation module 1208, and a deblurring module 1210, wherein:
the data acquisition module 1202 is configured to acquire a first image through a camera module of the electronic device, and acquire target gyroscope data through a target gyroscope of the electronic device in a process of acquiring the first image.
The motion information determining module 1204 is configured to determine motion information of the target gyroscope based on the target gyroscope data.
The conversion module 1206 is configured to convert the motion information of the target gyroscope into the motion information of the camera module according to a target conversion relationship between the target gyroscope and the camera module; the target conversion relation is obtained by the electronic equipment according to the gyroscope data of the first gyroscope and the gyroscope data of the second gyroscope in the calibrating process, wherein the first gyroscope is positioned at the position of the target gyroscope in the electronic equipment, and the second gyroscope is positioned at the position of the camera module in the electronic equipment.
The blur kernel generation module 1208 is configured to generate a motion blur kernel based on motion information of the camera module.
The deblurring module 1210 is configured to deblur the first image using the motion blur kernel to obtain a target image.
The image processing device acquires a first image through the camera module of the electronic device, and acquires target gyroscope data through the target gyroscope of the electronic device during the acquisition of the first image; determines motion information of the target gyroscope based on the target gyroscope data; converts the motion information of the target gyroscope into motion information of the camera module according to the target conversion relationship between the target gyroscope and the camera module; and then generates a motion blur kernel based on the motion information of the camera module and removes the motion blur of the first image. Hardware such as an optical image stabilization motor can thus be dispensed with, an increase in the size of the image sensor is avoided, and the hardware cost of image motion deblurring is reduced. Meanwhile, because the first gyroscope is located at the target gyroscope position and the second gyroscope is located at the camera module position while the electronic device is calibrated, the conversion relationship between the first gyroscope and the second gyroscope (that is, the target conversion relationship between the target gyroscope and the camera module) can be accurately obtained from the gyroscope data of the two gyroscopes, and motion blur can be removed from the first image more accurately based on the target conversion relationship, so that a clearer target image is obtained.
In one embodiment, an imaging surface of an image sensor in the camera module is divided into at least two preset areas; the blur kernel generation module 1208 is further configured to determine, for each preset area, a motion track of a preset imaging point in the preset area based on motion information of the camera module, and generate a sub-motion blur kernel of the preset area based on the motion track of the preset imaging point; and generating a motion blur kernel based on the sub-motion blur kernels of the respective preset areas.
In one embodiment, the motion information of the camera module is motion information of a reference imaging point on an imaging surface of an image sensor in the camera module; the blur kernel generating module 1208 is further configured to determine, for each preset area, a positional relationship between a preset imaging point in the preset area and the reference imaging point; determine the position of the preset imaging point at intervals of a preset duration based on the motion information of the reference imaging point and the positional relationship; and generate a motion track of the preset imaging point based on the determined positions.
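A minimal sketch of the step above, assuming (as an illustrative simplification, not the embodiment's actual representation) that the reference imaging point's motion is given as in-plane positions plus an in-plane rotation angle sampled once per preset interval:

```python
import numpy as np

def imaging_point_trajectory(ref_positions, ref_angles, offset):
    """Positions of a preset imaging point, sampled at a fixed interval.

    ref_positions: (n, 2) XY positions of the reference imaging point at
        each sampling instant.
    ref_angles: (n,) in-plane rotation of the sensor (around Z) at the
        same instants, in radians.
    offset: (2,) position of the preset imaging point relative to the
        reference imaging point (the positional relationship).
    Returns an (n, 2) trajectory: the offset rotated by the current
    in-plane angle, added to the reference position.
    """
    traj = []
    for p, a in zip(np.asarray(ref_positions, float),
                    np.asarray(ref_angles, float)):
        rot = np.array([[np.cos(a), -np.sin(a)],
                        [np.sin(a),  np.cos(a)]])
        traj.append(p + rot @ np.asarray(offset, float))
    return np.array(traj)
```

With zero rotation the preset point simply follows the reference point at a constant offset; rotation sweeps the offset around the reference point.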
In one embodiment, the blur kernel generating module 1208 is further configured to generate a motion blur kernel component in the X direction based on translation information, rotation information around the Y direction and rotation information around the Z direction of the image sensor in the camera module, wherein the plane defined by the X direction and the Y direction is parallel to the imaging surface of the image sensor, and the Z direction is perpendicular to the imaging surface of the image sensor; generate a motion blur kernel component in the Y direction based on translation information, rotation information around the X direction and rotation information around the Z direction of the image sensor in the camera module; and generate the motion blur kernel from the motion blur kernel component in the X direction and the motion blur kernel component in the Y direction.
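The X-direction and Y-direction displacement components described above can be combined into a kernel by accumulating the sampled motion track into a grid and normalising it. The following is a hedged sketch (the `size` parameter and the pixel-unit displacements are assumptions for illustration):

```python
import numpy as np

def blur_kernel_from_track(track, size):
    """Rasterise a 2-D motion track into a normalised blur kernel.

    track: iterable of (dx, dy) pixel displacements of an imaging point
        sampled during the exposure.
    size: odd side length of the square kernel; displacements are
        measured from the kernel centre.
    """
    kernel = np.zeros((size, size))
    c = size // 2
    for dx, dy in track:
        x, y = int(round(c + dx)), int(round(c + dy))
        if 0 <= x < size and 0 <= y < size:
            kernel[y, x] += 1.0          # equal dwell time per sample
    s = kernel.sum()
    return kernel / s if s > 0 else kernel
```

Normalising by the total dwell time keeps the kernel energy-preserving, so convolving a sharp image with it does not change the overall brightness.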
In one embodiment, the deblurring module 1210 is further configured to process the first image in a preset processing manner to obtain a second image; perform convolution processing on the second image with the motion blur kernel to obtain a third image; and determine a difference value between the third image and the first image. If the difference value is greater than a preset threshold, the second image is adjusted to obtain a new second image and the convolution step is iteratively executed on the new second image, until the difference value is less than or equal to the preset threshold, at which point the latest second image is taken as the target image.
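The iterative procedure above can be sketched as a Landweber-style loop. The adjustment rule used here (back-projecting the residual through the flipped kernel) is an assumption for illustration, since the embodiment does not fix how the second image is adjusted:

```python
import numpy as np

def conv2_same(img, kernel):
    """True 2-D convolution with 'same' output size (small kernels)."""
    k = kernel[::-1, ::-1]                       # flip for convolution
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros(img.shape, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += k[i, j] * padded[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def deblur(first, kernel, threshold=1e-4, step=1.0, max_iter=500):
    """Iterate: re-blur the estimate (third image), compare it with the
    observed first image, and adjust the estimate (second image) until
    the difference drops to the threshold."""
    second = first.astype(float)                 # initial second image
    for _ in range(max_iter):
        third = conv2_same(second, kernel)       # re-blurred estimate
        residual = first - third
        if np.abs(residual).mean() <= threshold: # difference small enough
            break
        # adjust: back-project the residual through the flipped kernel
        second = second + step * conv2_same(residual, kernel[::-1, ::-1])
    return second
```

With a normalised kernel the step size of 1.0 keeps the iteration stable; each pass shrinks the mismatch between the re-blurred estimate and the observed image.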
Based on the same inventive concept, the embodiment of the application also provides a calibration device for realizing the above related calibration method. The implementation of the solution provided by the device is similar to that described in the above method, so the specific limitation in the embodiments of the calibration device or calibration devices provided below may be referred to above for the limitation of the calibration method, which is not repeated here.
In one embodiment, as shown in fig. 13, a calibration device is provided, which is applied to an electronic device, wherein a first gyroscope is arranged at a target gyroscope position in the electronic device in the process of calibrating the electronic device, and a second gyroscope is arranged at a camera module position in the electronic device; the calibration device comprises: a data acquisition module 1302 and a calibration module 1304, wherein:
The data acquisition module 1302 is configured to acquire first gyroscope data acquired by the first gyroscope and second gyroscope data acquired by the second gyroscope when the electronic device moves.
The calibration module 1304 is configured to calibrate a conversion relationship between the first gyroscope and the second gyroscope based on the first gyroscope data and the second gyroscope data; the conversion relation is used for converting the motion information of the target gyroscope into the motion information of the camera module in the using process of the electronic equipment.
According to the calibration device, in the process of calibrating the electronic device, the first gyroscope is located at the target gyroscope position in the electronic device and the second gyroscope is located at the camera module position in the electronic device. Therefore, from the gyroscope data collected by the first gyroscope and by the second gyroscope while the electronic device moves, the conversion relationship between the first gyroscope and the second gyroscope (that is, the target conversion relationship between the target gyroscope and the camera module) can be accurately obtained, and motion blur can be removed from the first image based on the target conversion relationship, so that a clearer target image is obtained.
In one embodiment, the rotation axis passes through the first gyroscope during calibration of the electronic device.
The respective modules in the image processing apparatus and the calibration apparatus described above may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in hardware or independent of a processor in the electronic device, or may be stored in software in a memory in the electronic device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, an electronic device is provided, which may be a terminal, and an internal structure diagram thereof may be as shown in fig. 14. The electronic device includes a processor, a memory, an input/output interface, a communication interface, a display unit, and an input device. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface, the display unit and the input device are connected to the system bus through the input/output interface. Wherein the processor of the electronic device is configured to provide computing and control capabilities. The memory of the electronic device includes a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The input/output interface of the electronic device is used to exchange information between the processor and the external device. The communication interface of the electronic device is used for conducting wired or wireless communication with an external terminal, and the wireless communication can be realized through WIFI, a mobile cellular network, NFC (near field communication) or other technologies. The computer program is executed by a processor to implement an image processing method or a calibration method. The display unit of the electronic device is used for forming a visual picture, and can be a display screen, a projection device or a virtual reality imaging device. 
The display screen can be a liquid crystal display screen or an electronic ink display screen, and the input device of the electronic equipment can be a touch layer covered on the display screen, can also be keys, a track ball or a touch pad arranged on the shell of the electronic equipment, and can also be an external keyboard, a touch pad or a mouse and the like.
It will be appreciated by those skilled in the art that the structure shown in fig. 14 is merely a block diagram of a portion of the structure associated with the present inventive arrangements and is not limiting of the electronic device to which the present inventive arrangements are applied, and that a particular electronic device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform steps of an image processing method, or steps of a calibration method.
Embodiments of the present application also provide a computer program product comprising instructions which, when run on a computer, cause the computer to perform an image processing method, or a calibration method.
It should be noted that, the user information (including but not limited to user equipment information, user personal information, etc.) and the data (including but not limited to data for analysis, stored data, presented data, etc.) related to the present application are information and data authorized by the user or sufficiently authorized by each party, and the collection, use and processing of the related data need to comply with the related laws and regulations and standards of the related country and region.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which, when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The nonvolatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded nonvolatile memory, Resistive Random Access Memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. Volatile memory can include Random Access Memory (RAM) or external cache memory, and the like. By way of illustration, and not limitation, RAM can take various forms such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases referred to in the embodiments provided herein may include at least one of a relational database and a non-relational database. The non-relational database may include, but is not limited to, a blockchain-based distributed database, and the like. The processor referred to in the embodiments provided in the present application may be a general-purpose processor, a central processing unit, a graphics processor, a digital signal processor, a programmable logic unit, a data processing logic unit based on quantum computing, or the like, but is not limited thereto.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The foregoing embodiments represent only a few implementations of the application, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the application. It should be noted that several variations and improvements can be made by those of ordinary skill in the art without departing from the concept of the application, and these all fall within the protection scope of the application. Accordingly, the protection scope of the application shall be subject to the appended claims.
Claims (11)
1. An image processing method, applied to an electronic device, comprising:
Acquiring a first image through a camera module of the electronic equipment, and acquiring target gyroscope data through a target gyroscope of the electronic equipment in the acquisition process of the first image;
Determining motion information of the target gyroscope based on the target gyroscope data;
Converting the motion information of the target gyroscope into the motion information of the camera module according to the target conversion relation between the target gyroscope and the camera module; the target conversion relation is obtained according to gyroscope data of a first gyroscope and gyroscope data of a second gyroscope in the process that the electronic equipment is calibrated, wherein the first gyroscope is positioned at a target gyroscope position in the electronic equipment, and the second gyroscope is positioned at a camera module position in the electronic equipment;
generating a motion blur kernel based on the motion information of the camera module;
And removing motion blur from the first image by using the motion blur kernel to obtain a target image.
2. The method of claim 1, wherein the imaging surface of the image sensor in the camera module is divided into at least two preset areas; the generating a motion blur kernel based on the motion information of the camera module includes:
determining a motion track of a preset imaging point in each preset area based on the motion information of the camera module, and generating a sub-motion blur kernel of the preset area based on the motion track of the preset imaging point;
And generating a motion blur kernel based on the sub-motion blur kernels of the respective preset areas.
3. The method of claim 2, wherein the motion information of the camera module is motion information of a reference imaging point of an imaging surface of an image sensor in the camera module;
Determining, for each preset area, a motion trajectory of a preset imaging point in the preset area based on motion information of the camera module, including:
Determining a position relationship between a preset imaging point in each preset area and the reference imaging point;
determining the position of the preset imaging point at intervals of preset time length based on the motion information of the reference imaging point and the position relation;
And generating a motion track of the preset imaging point based on each position.
4. The method of claim 1, wherein generating a motion blur kernel based on motion information of the camera module comprises:
Generating a motion blur kernel component in the X direction based on translation information of an image sensor in the camera module in the X direction, rotation information around the Y direction and rotation information around the Z direction; the plane defined by the X direction and the Y direction is parallel to the imaging plane of the image sensor, and the Z direction is perpendicular to the imaging plane of the image sensor;
generating a motion blur kernel component in the Y direction based on translation information of an image sensor in the Y direction, rotation information around the X direction and rotation information around the Z direction in the camera module;
And generating a motion blur kernel according to the motion blur kernel component in the X direction and the motion blur kernel component in the Y direction.
5. The method according to any one of claims 1 to 4, wherein said deblurring said first image with said motion blur kernel to obtain a target image, comprising:
processing the first image by adopting a preset processing mode to obtain a second image;
performing convolution processing on the second image by adopting the motion blur kernel to obtain a third image;
And determining a difference value between the third image and the first image; if the difference value is greater than a preset threshold, adjusting the second image to obtain a new second image, and iteratively executing the step of performing convolution processing on the new second image with the motion blur kernel, until the difference value is less than or equal to the preset threshold, and taking the latest second image as the target image.
6. A calibration method, applied to an electronic device, wherein in the process of calibrating the electronic device, a first gyroscope is arranged at a target gyroscope position in the electronic device and a second gyroscope is arranged at a camera module position in the electronic device; the method comprising:
Acquiring first gyroscope data acquired by the first gyroscope and second gyroscope data acquired by the second gyroscope when the electronic equipment moves;
Based on the first gyroscope data and the second gyroscope data, obtaining a conversion relationship between the first gyroscope and the second gyroscope; the conversion relation is used for converting the motion information of the target gyroscope into the motion information of the camera module in the using process of the electronic equipment.
7. The method of claim 6, wherein a rotation axis passes through the first gyroscope during calibration of the electronic device.
8. An image processing apparatus, characterized by being applied to an electronic device, comprising:
the data acquisition module is used for acquiring a first image through a camera module of the electronic equipment and acquiring target gyroscope data in the acquisition process of the first image through a target gyroscope of the electronic equipment;
The motion information determining module is used for determining motion information of the target gyroscope based on the target gyroscope data;
The conversion module is used for converting the motion information of the target gyroscope into the motion information of the camera module according to the target conversion relation between the target gyroscope and the camera module; the target conversion relation is obtained according to gyroscope data of a first gyroscope and gyroscope data of a second gyroscope in the process that the electronic equipment is calibrated, wherein the first gyroscope is positioned at a target gyroscope position in the electronic equipment, and the second gyroscope is positioned at a camera module position in the electronic equipment;
the fuzzy core generation module is used for generating a motion fuzzy core based on the motion information of the camera module;
And the deblurring module is used for removing motion blur from the first image by using the motion blur kernel to obtain a target image.
9. A calibration device, applied to an electronic device, wherein in the process of calibrating the electronic device, a first gyroscope is arranged at a target gyroscope position in the electronic device and a second gyroscope is arranged at a camera module position in the electronic device; the device comprising:
The data acquisition module is used for acquiring first gyroscope data acquired by the first gyroscope and second gyroscope data acquired by the second gyroscope when the electronic equipment moves;
The calibration module is used for calibrating and obtaining the conversion relation between the first gyroscope and the second gyroscope based on the first gyroscope data and the second gyroscope data; the conversion relation is used for converting the motion information of the target gyroscope into the motion information of the camera module in the using process of the electronic equipment.
10. An electronic device comprising a memory and a processor, the memory having stored therein a computer program, characterized in that the computer program, when executed by the processor, causes the processor to perform the steps of the image processing method according to any one of claims 1 to 5 or the steps of the calibration method according to any one of claims 6 to 7.
11. A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the image processing method according to any one of claims 1 to 5, or the steps of the calibration method according to any one of claims 6 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211239981.6A CN117934325A (en) | 2022-10-11 | 2022-10-11 | Image processing method and device, calibration method and device and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117934325A true CN117934325A (en) | 2024-04-26 |
Family
ID=90749379
Legal Events

Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||