CN110049239B - Image processing method and device, electronic equipment and computer readable storage medium - Google Patents


Info

Publication number
CN110049239B
CN110049239B, CN201910233766.7A
Authority
CN
China
Prior art keywords
image
angular velocity
data
velocity data
processor
Prior art date
Legal status
Active
Application number
CN201910233766.7A
Other languages
Chinese (zh)
Other versions
CN110049239A (en)
Inventor
陈嘉伟 (Chen Jiawei)
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910233766.7A
Publication of CN110049239A
Application granted
Publication of CN110049239B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/64 Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • H04N 23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N 23/682 Vibration or motion blur correction

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Studio Devices (AREA)
  • Adjustment Of Camera Lenses (AREA)

Abstract

The application relates to an image processing method, an image processing device, an electronic device and a computer readable storage medium, wherein the method comprises the following steps: controlling the image sensor to acquire a first image and receiving first angular velocity data acquired by the gyroscope; binding the first image with the corresponding first angular velocity data to obtain combined data, and sending the combined data to the processor; and analyzing the received combined data through the processor to obtain a second image and corresponding second angular velocity data. The image processing method, the image processing device, the electronic device and the computer readable storage medium can improve image processing efficiency.

Description

Image processing method and device, electronic equipment and computer readable storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to an image processing method and apparatus, an electronic device, and a computer-readable storage medium.
Background
When a camera shoots an image, a certain amount of time is needed for imaging. If the camera shakes during imaging, the generated image may exhibit tearing, ghosting and similar artifacts, seriously distorting the captured image. To correct the image errors caused by shake, the shake of the camera can be detected by a gyroscope, and the shake error of the image can then be compensated according to the shake detected by the gyroscope.
Disclosure of Invention
The embodiment of the application provides an image processing method and device, electronic equipment and a computer readable storage medium, which can improve the image processing efficiency.
An image processing method applied to an electronic device, wherein the electronic device comprises a gyroscope, an image sensor and a processor, the gyroscope is connected with the image sensor, and the image sensor is connected with the processor, and the method comprises the following steps:
controlling the image sensor to acquire a first image and receiving first angular velocity data acquired by the gyroscope, wherein the first angular velocity data is used for representing the shaking degree of a lens corresponding to the image sensor when the first image is acquired;
binding the first image with the corresponding first angular velocity data to obtain combined data, and sending the combined data to the processor;
and analyzing the received combined data through the processor to obtain a second image and corresponding second angular velocity data.
An image processing apparatus applied to an electronic device, the electronic device including a gyroscope, an image sensor and a processor, the gyroscope being connected to the image sensor, the image sensor being connected to the processor, the apparatus comprising:
the data acquisition module is used for controlling the image sensor to acquire a first image and receiving first angular velocity data acquired by the gyroscope, wherein the first angular velocity data is used for representing the shaking degree of a lens corresponding to the image sensor when the first image is acquired;
the data binding module is used for binding the first image with the corresponding first angular velocity data to obtain combined data and sending the combined data to the processor;
and the data analysis module is used for analyzing the received combined data through the processor to obtain a second image and corresponding second angular velocity data.
An electronic device comprising a memory, a processor, a gyroscope and an image sensor, the gyroscope being connected to the image sensor, the image sensor being connected to the processor, the memory having stored therein a computer program which, when executed by the processor, causes the processor to carry out the steps of:
controlling the image sensor to acquire a first image and receiving first angular velocity data acquired by the gyroscope, wherein the first angular velocity data is used for representing the shaking degree of a lens corresponding to the image sensor when the first image is acquired;
binding the first image with the corresponding first angular velocity data to obtain combined data, and sending the combined data to the processor;
and analyzing the received combined data through the processor to obtain a second image and corresponding second angular velocity data.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
controlling the image sensor to acquire a first image and receiving first angular velocity data acquired by the gyroscope, wherein the first angular velocity data is used for representing the shaking degree of a lens corresponding to the image sensor when the first image is acquired;
binding the first image with the corresponding first angular velocity data to obtain combined data, and sending the combined data to the processor;
and analyzing the received combined data through the processor to obtain a second image and corresponding second angular velocity data.
The image processing method, the image processing device, the electronic device and the computer-readable storage medium provided by the above embodiments can control the image sensor to acquire the first image and receive the first angular velocity data acquired by the gyroscope. The image sensor may bind the first image and the first angular velocity data into combined data and then send the combined data to the processor. The processor analyzes the received combined data to obtain second angular velocity data and a second image. Because the first image and the first angular velocity data are bound before transmission, the second image and the second angular velocity data obtained after analysis already correspond to each other, so the data registration that separate transmission would require is avoided and the image processing efficiency is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 is a diagram of an exemplary embodiment of an application of an image processing method;
FIG. 2 is a flow diagram illustrating a method for image processing according to one embodiment;
FIG. 3 is a flow chart illustrating an image processing method according to another embodiment;
FIG. 4 is a diagram of hardware to implement an image processing method in one embodiment;
FIG. 5 is a flowchart illustrating an image processing method according to another embodiment;
FIG. 6 is a diagram showing a configuration of an image processing apparatus according to an embodiment;
FIG. 7 is a schematic diagram of an image processing circuit in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that the terms "first," "second," and the like used herein may describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first camera may be referred to as a second camera, and similarly, a second camera may be referred to as a first camera, without departing from the scope of the present application. The first camera and the second camera are both cameras, but they are not the same camera.
Fig. 1 is a diagram illustrating an application scenario of an image processing method according to an embodiment. As shown in fig. 1, the application scenario includes an electronic device 10, and a camera 102 is installed on the electronic device 10. The camera 102 includes a lens that collects light from a scene being photographed and an image sensor that generates an image. A gyroscope and a processor may also be mounted in the electronic device 10, the gyroscope being connected to the image sensor, and the image sensor being connected to the processor. Specifically, the image sensor may be controlled to collect the first image 12, and receive first angular velocity data collected by the gyroscope; the image sensor can bind the first image 12 with the corresponding first angular velocity data to obtain combined data, and send the combined data to the processor; and analyzing the received combined data through a processor to obtain second angular velocity data and a second image. The electronic device 10 may be, but is not limited to, a mobile phone, a computer, a tablet, a wearable device, a personal digital assistant, and the like.
FIG. 2 is a flowchart illustrating an image processing method according to an embodiment. As shown in fig. 2, the image processing method is applied to an electronic device including a gyroscope, an image sensor, and a processor, the gyroscope is connected to the image sensor, and the image sensor is connected to the processor. The image processing method includes steps 202 to 206. Wherein:
step 202, controlling the image sensor to acquire a first image, and receiving first angular velocity data acquired by the gyroscope, wherein the first angular velocity data is used for representing the shake degree of a lens corresponding to the image sensor when the first image is acquired.
In one embodiment, when the electronic device needs to capture an image, the image sensor may be powered on, and the powered-on image sensor may convert light collected by the lens into an electrical signal, thereby generating the image. The image sensor in the present embodiment may be, but is not limited to, a CCD (Charge-Coupled Device), a CMOS (Complementary Metal-Oxide-Semiconductor) sensor, and the like.
It is understood that a lens and an image sensor may be included in the camera, and the lens may collect light rays in a photographed scene and convert the collected light rays into electrical signals through the image sensor, thereby generating an image. If the lens is shaken, the collected light will also change, i.e. the acquired image will also change. The gyroscope may detect an angular velocity generated when the lens shakes, that is, the first angular velocity data may be used to indicate a shake degree of the lens corresponding to the image sensor when the first image is captured. The larger the first angular velocity data is, the larger the degree of shake of the lens corresponding to the image sensor at the time of capturing the first image is, and the larger the error of the captured first image is.
And step 204, binding the first image and the corresponding first angular velocity data to obtain combined data, and sending the combined data to the processor.
The gyroscope can send the collected first angular velocity data to the image sensor, and the image sensor binds the first image and the corresponding first angular velocity data to obtain combined data. It is understood that the image sensor may capture one frame of the first image, or may capture multiple frames of the first image. When multiple frames of first images are collected, each frame of first image is bound with the corresponding first angular velocity data respectively, and combined data corresponding to each frame of first image is obtained.
Specifically, when the first image is bound with the corresponding first angular velocity data, the first angular velocity data may be placed at the head or the tail of the first image, or the first angular velocity data may be inserted into the first image; this is not limited herein. The purpose is to establish a one-to-one correspondence between the first image and the first angular velocity data so that they can be packed and transmitted together.
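As an illustration only (the patent does not prescribe an on-wire format), one simple binding layout appends the angular velocity sample as a fixed-size trailer after the raw frame bytes; the field layout and the names GYRO_TRAILER and bind_frame below are assumptions made for this sketch.

```python
import struct

# Assumed trailer layout for this sketch: a timestamp in nanoseconds followed by
# the three angular velocity components; the real combined-data format may differ.
GYRO_TRAILER = struct.Struct("<qfff")

def bind_frame(frame_bytes: bytes, timestamp_ns: int, gyro_xyz: tuple) -> bytes:
    """Append the first angular velocity data to the first image, giving combined data."""
    trailer = GYRO_TRAILER.pack(timestamp_ns, *gyro_xyz)
    return frame_bytes + trailer
```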
After the combined data is obtained, the combined data is sent to a processor for processing. The processor in the present embodiment may be a CPU (Central Processing Unit), an ISP (Image Signal Processor), an MCU (Microcontroller Unit), or the like, but is not limited thereto.
In one embodiment, the image sensor may send the image data to the processor through a MIPI (Mobile Industry Processor Interface) interface, or may send the image data to the processor through a wireless network interface, but is not limited thereto.
And step 206, analyzing the received combined data through the processor to obtain a second image and corresponding second angular velocity data.
The analysis algorithm used to analyze the combined data corresponds to the binding algorithm used to obtain the combined data, so after the processor receives the combined data, it can analyze the combined data according to the analysis algorithm to obtain second angular velocity data and a second image. Binding the first image and the first angular velocity data into combined data means packing them together for transmission; analyzing the combined data means splitting it back into its components to obtain the second image and the second angular velocity data.
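Continuing the assumed trailer layout from the binding sketch above, the processor's analysis step is simply the inverse split:

```python
def parse_frame(combined: bytes):
    """Split combined data back into the second image bytes and the second angular
    velocity data (using the GYRO_TRAILER layout assumed in the binding sketch)."""
    body, trailer = combined[:-GYRO_TRAILER.size], combined[-GYRO_TRAILER.size:]
    timestamp_ns, gx, gy, gz = GYRO_TRAILER.unpack(trailer)
    return body, (timestamp_ns, (gx, gy, gz))
```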
After the processor obtains the combined data, it analyzes the combined data to obtain a second image and second angular velocity data. The second image obtained by analyzing the combined data may be the same as or different from the first image; the second angular velocity data may be the same as or different from the first angular velocity data, and this is not limited herein. The second image and the second angular velocity data obtained by analysis correspond to each other, and no registration is needed.
The resulting second angular velocity data may represent the degree of shake of the lens when the second image is captured, and then the second image may be shake-compensated based on the second angular velocity data. The specific method of compensation may be to translate, crop, rotate, zoom, etc. the second image, but is not limited thereto.
The image processing method provided by the above embodiment may control the image sensor to acquire the first image and receive the first angular velocity data acquired by the gyroscope. The image sensor may bind the first image and the corresponding first angular velocity data into combined data, and then send the combined data to the processor. The processor analyzes the received combined data to obtain second angular velocity data and a second image. Because the first image and the first angular velocity data are bound before transmission, the second image and the second angular velocity data obtained after analysis already correspond to each other, so the data registration that separate transmission would require is avoided and the image processing efficiency is improved.
Fig. 3 is a flowchart illustrating an image processing method according to another embodiment. As shown in fig. 3, the image processing method includes steps 302 to 310. Wherein:
step 302, controlling the image sensor to acquire a first image at a first frequency, and receiving first angular velocity data acquired by the gyroscope at a second frequency, wherein the first frequency is less than or equal to the second frequency.
In the embodiments provided herein, the image sensor may be controlled to output a first image at a first frequency, and the gyroscope may output first angular velocity data at a second frequency, the first frequency being less than or equal to the second frequency. Thus, while the image sensor outputs one frame of the first image, the gyroscope can output multiple frames of first angular velocity data. For example, if the image sensor outputs first images at a frequency of 30 Hz (hertz) and the gyroscope outputs first angular velocity data at a frequency of 3 kHz (kilohertz), then each time the image sensor outputs one first image, the gyroscope outputs 100 pieces of first angular velocity data.
Specifically, the image sensor and the gyroscope may be connected through an SPI (Serial Peripheral Interface), and may also be connected through other interfaces in other embodiments, which is not limited herein. After the gyroscope collects the first angular velocity data, the first angular velocity data is sent to the image sensor through the SPI.
And step 304, acquiring intermediate angular velocity data corresponding to each first image from the received first angular velocity data.
The first frequency of the image sensor for acquiring the first image is less than or equal to the second frequency of the gyroscope for acquiring the first angular velocity data, so that when the image sensor acquires a frame of the first image, the gyroscope may output one or more first angular velocity data. It can be understood that the output first image and the corresponding first angular velocity data have a certain corresponding relationship, and the intermediate angular velocity data to be bound with the first image can be obtained from the first angular velocity data corresponding to the first image.
In one embodiment, the image sensor may sequentially insert the first image and the first angular velocity data into a queue such that the first angular velocity data is divided into different data segments by the first image, and then determine intermediate angular velocity data corresponding to the first image based on the divided data segments of the first angular velocity data.
For example, the queue that sorts the first image and the first angular velocity data is: pic_01 → Gyro_01 → Gyro_02 → Gyro_03 → Gyro_04 → Gyro_05 → pic_02 → Gyro_06 → Gyro_07 → Gyro_08 → Gyro_09 → Gyro_10, where "pic_01" and "pic_02" are the identifiers of the first images, and "Gyro_01" to "Gyro_10" are the identifiers of the first angular velocity data. The first image "pic_01" can be considered to correspond to the first angular velocity data "Gyro_01" to "Gyro_05", and the intermediate angular velocity data corresponding to "pic_01" is then selected from "Gyro_01" to "Gyro_05". The first image "pic_02" corresponds to the first angular velocity data "Gyro_06" to "Gyro_10", and the intermediate angular velocity data corresponding to "pic_02" is then selected from "Gyro_06" to "Gyro_10".
Specifically, from the data segment obtained by the division, the first piece of first angular velocity data may be taken as the intermediate angular velocity data of the first image, the last piece may be taken as the intermediate angular velocity data, any one piece may be taken as the intermediate angular velocity data, or the average value of the first angular velocity data in the segment may be used as the intermediate angular velocity data; this is not limited herein.
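A minimal sketch of this segmentation and selection, assuming the interleaved queue is represented as ("pic", id) and ("gyro", sample) records in acquisition order; taking the last sample of each segment is just one of the options listed above.

```python
def intermediate_per_frame(queue):
    """queue: time-ordered entries such as ("pic", "pic_01"), ("gyro", g1), ...
    Returns {frame_id: intermediate_sample}, using the last sample of each segment
    (the first sample, any sample, or the segment mean would work equally well)."""
    result, current_frame, segment = {}, None, []
    for kind, payload in queue:
        if kind == "pic":
            # A new frame marker closes the previous frame's segment.
            if current_frame is not None and segment:
                result[current_frame] = segment[-1]
            current_frame, segment = payload, []
        else:
            segment.append(payload)
    if current_frame is not None and segment:
        result[current_frame] = segment[-1]
    return result
```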
In one embodiment, the step of acquiring the intermediate angular velocity data may specifically include: sequencing the first image and the first angular velocity data according to the acquisition time through the image sensor; and respectively acquiring adjacent first angular velocity data of each first image as corresponding intermediate angular velocity data through the image sensor.
It can be understood that the closer the acquisition time of the first angular velocity data is to the acquisition time of the first image, the more accurate the acquired first angular velocity data is. Therefore, the first image and the first angular velocity data are sorted according to the acquisition time, and the previous first angular velocity data adjacent to the first image may be used as the corresponding intermediate angular velocity data, and the subsequent first angular velocity data adjacent to the first image may also be used as the corresponding intermediate angular velocity data, which is not limited to this.
It should be noted that the image sensor sorting the first image and the first angular velocity data according to the acquisition time may be understood in two ways. It may mean that all the acquired first images and first angular velocity data are sorted as a whole in order of acquisition time to form a sorted queue; it may also simply refer to the order in which the image sensor acquires the first images or receives the first angular velocity data, in which case it is not necessary to sort all the received first images and first angular velocity data as a whole and form a sorted queue. This is not limited herein.
And step 306, binding each first image with the corresponding intermediate angular velocity data to obtain combined data, and sending the combined data to the processor.
There may be only one piece, or a plurality of pieces, of intermediate angular velocity data corresponding to each first image; this is not limited herein. Specifically, the image sensor and the processor may be connected through a buffer: the image sensor stores the combined data obtained by binding the first image with the corresponding intermediate angular velocity data in the buffer, and the processor may read the combined data from the buffer. The step of sending the combined data may specifically include: binding the first image with the corresponding first angular velocity data to obtain combined data, and storing the combined data into the buffer; and reading the combined data in the buffer by the processor.
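As a rough producer/consumer sketch of this handoff, with a simple in-memory queue standing in for the shared buffer and reusing the bind_frame/parse_frame helpers assumed in the earlier sketches:

```python
from queue import Queue

combined_buffer = Queue(maxsize=8)   # stands in for the shared buffer

def sensor_side(frame_bytes, timestamp_ns, gyro_xyz):
    # Image sensor side: bind the image with its angular velocity data and store it.
    combined_buffer.put(bind_frame(frame_bytes, timestamp_ns, gyro_xyz))

def processor_side():
    # Processor side: read the combined data from the buffer and analyze (parse) it.
    return parse_frame(combined_buffer.get())
```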
And 308, analyzing the received combined data through the processor to obtain a second image and corresponding second angular velocity data.
It is to be understood that the second image and the second angular velocity data are, respectively, the image data and the angular velocity data obtained by analyzing the combined data. The second image may be the same as or different from the first image, and the second angular velocity data may be the same as or different from the first angular velocity data; this is not limited herein.
And 310, performing shake compensation on the second image according to the second angular velocity data to obtain a target image.
When an image is collected, if the lens shakes, light received by the lens changes, and the generated image also changes correspondingly. The direction and magnitude of the image shift, etc. may be determined from the angular velocity data, and then the image may be compensated according to the direction and magnitude of the image shift, etc.
Specifically, step 310 includes: acquiring image offset data of a second image according to the second angular velocity data; and carrying out shake compensation on the second image according to the image offset data to obtain a target image.
The image offset data refers to data such as the direction and magnitude of the image offset, and the image offset data of the second image can be determined from the second angular velocity data. The lens may be calibrated before images are acquired, and a correspondence between angular velocity data and image offset data may be determined.
For example, the lens is controlled to move according to a plurality of pieces of calibration angular velocity data, a calibration image is collected after the lens moves according to each piece of calibration angular velocity data, and each collected calibration image is compared with a standard image to determine the calibration image offset data of that calibration image. Finally, a correspondence is established between the plurality of calibration angular velocity data and the calibration image offset data.
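A sketch of building and querying such a correspondence; the calibration values below are invented for illustration, and linear interpolation between calibration points is just one simple way to realize the mapping:

```python
import numpy as np

# Hypothetical calibration table: angular velocity magnitude (rad/s) about one axis
# versus the measured image offset (pixels) along the matching image direction.
calib_angular_velocity = np.array([0.00, 0.05, 0.10, 0.20, 0.40])
calib_offset_px = np.array([0.0, 12.0, 25.0, 52.0, 110.0])

def offset_from_angular_velocity(w: float) -> float:
    """Look up image offset data for a measured angular velocity by interpolation."""
    sign = 1.0 if w >= 0 else -1.0
    return sign * float(np.interp(abs(w), calib_angular_velocity, calib_offset_px))
```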
After the image offset data is obtained, the second image may be subjected to shake compensation based on the image offset data. For example, if the image offset data is "100 pixels shifted in direction a and 50 pixels shifted in direction b", the second image may be shifted by 100 pixels in the direction opposite to a and by 50 pixels in the direction opposite to b. Translating the image can be understood as cropping along the translation direction, i.e. the above compensation can be understood as cropping 100 pixels from the second image in direction a and 50 pixels in direction b.
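A minimal crop-based sketch of this compensation, assuming the second image is held as a NumPy array and the offset is given in whole pixels along the two image axes; which side is cropped for a given sign is a convention chosen for this sketch.

```python
import numpy as np

def crop_compensate(image: np.ndarray, dx: int, dy: int) -> np.ndarray:
    """Compensate an offset of (dx, dy) pixels by cropping, e.g. dx=100, dy=50
    removes 100 columns and 50 rows, matching the example in the text."""
    h, w = image.shape[:2]
    x0, x1 = (dx, w) if dx >= 0 else (0, w + dx)
    y0, y1 = (dy, h) if dy >= 0 else (0, h + dy)
    return image[y0:y1, x0:x1]
```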
In one embodiment, when consecutive images are to be acquired, it is necessary to ensure that the specifications of the acquired consecutive images are uniform, i.e. the consecutively acquired images generally need to have the same size. A uniform compensation amount can therefore be defined, and all second images are compensated using this uniform compensation amount.
Specifically, the step of compensating the second image according to the image offset data may include: acquiring a reference compensation amount; and calculating target compensation amounts corresponding to the second image in each compensation direction according to the reference compensation amount and the image offset data, and performing shake compensation on the second image in each compensation direction according to the target compensation amounts to obtain a target image.
For example, with the second image defined in an xoy coordinate system, the reference compensation amount may be "100 pixels along the x-axis and 100 pixels along the y-axis". Assume the image offset data is "120 pixels in the positive x-axis direction and 80 pixels in the positive y-axis direction". Cropping the second image by 120 pixels in the positive x-axis direction would exceed the reference compensation amount, so the second image is cropped by only 100 pixels in the x-axis direction. Similarly, cropping 80 pixels in the positive y-axis direction falls short of the reference compensation amount, so the second image may additionally be cropped by 20 pixels in the negative y-axis direction. In this way, all the second images have the same specification after compensation.
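A sketch of clamping a per-axis offset to the reference compensation amount so every compensated frame ends up the same size; splitting any shortfall onto the opposite direction follows the worked example above, and the helper name is an assumption.

```python
def target_compensation(offset, reference):
    """Return (crop_positive, crop_negative) pixels along one axis so that the
    total cropped amount always equals the reference compensation amount."""
    crop_pos = min(max(offset, 0), reference)   # never exceed the reference
    crop_neg = reference - crop_pos             # make up any shortfall on the other side
    return crop_pos, crop_neg

# The worked example: reference = 100 px per axis, offsets 120 px (+x) and 80 px (+y).
print(target_compensation(120, 100))   # (100, 0) -> crop 100 px on +x only
print(target_compensation(80, 100))    # (80, 20) -> crop 80 px on +y and 20 px on -y
```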
The reference compensation amount may be set in advance, or calculated from the obtained image offset data of all the second images. Specifically, the step of obtaining the reference compensation amount may include: taking the average value of the image offset data of all the acquired second images as the reference compensation amount.
Fig. 4 is a hardware diagram for implementing the image processing method in one embodiment. As shown in fig. 4, the lens 404 may collect light reflected from the object 402 and generate a first image through the image sensor 408, while the gyroscope 406 collects first angular velocity data and transmits the first angular velocity data to the image sensor 408. The image sensor 408 may bind the first image and the first angular velocity data into combined data and send the combined data to the processor 410. The processor 410 then parses the combined data to obtain a second image and second angular velocity data.
The image processing method provided by the above embodiment may determine the intermediate angular velocity data corresponding to the first image from the first angular velocity data, and transmit the first image and the intermediate angular velocity data after binding them, so that the second image and the second angular velocity data obtained after analysis correspond to each other. The process of data registration after separate transmission can thus be avoided, improving the efficiency of image processing.
In one embodiment, each of the second images corresponds to at least two second angular velocity data, and the at least two second angular velocity data correspond to at least two different image areas in the second image respectively. The step of acquiring the target image may specifically include: and respectively carrying out shake compensation on at least two different image areas in the second image according to at least two pieces of second angular velocity data to obtain corresponding target images.
The image sensor binds each first image with the corresponding intermediate angular velocity data to obtain combined data, and the processor unbinds the combined data to obtain a second image and corresponding second angular velocity data. In this embodiment, each of the first images corresponds to at least two intermediate angular velocity data, and after the analysis of the combined data, each of the second images corresponds to at least two second angular velocity data.
Specifically, when the image sensor generates the first image, exposure is performed pixel by pixel, that is, the pixel values of the pixels in the first image are obtained sequentially, one by one. Therefore, the lens may shake to different degrees while the pixel values of different pixels in the first image are being collected, that is, the angular velocity data corresponding to different pixels in the first image may be different.
Each first image may be divided into at least two different image areas and the intermediate angular velocity data corresponding to each image area may be determined separately. Step 304 may specifically include: and acquiring at least two pieces of intermediate angular velocity data corresponding to each first image from the received first angular velocity data.
And binding each first image and the corresponding at least two pieces of intermediate angular velocity data to obtain combined data, and analyzing the combined data to obtain each second image and the corresponding at least two pieces of second angular velocity data. The at least two second angular velocity data correspond to at least two different image areas in the second image, respectively, so that the at least two different image areas in the second image can be subjected to shake compensation according to the at least two angular velocity data, respectively.
Generally, the first image is divided into different image areas according to the order in which the pixel values of the pixels are acquired. For example, a pixel in the first image may be denoted "pix_x_y", where x and y represent the abscissa and ordinate of the pixel in the first image, respectively. Arranged in pixel-value acquisition order, the pixels of the first image are: pix_1_1 … pix_1_n … pix_m_n, where m is 600 and n is 300. The first image can then be divided into 3 image regions, whose pixels are "pix_1_1 … pix_200_n", "pix_201_1 … pix_400_n" and "pix_401_1 … pix_600_n", respectively.
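A sketch of this banding along the readout order, assuming the frame is a NumPy array whose first axis follows the acquisition order of the 600 positions in the example:

```python
import numpy as np

def split_into_regions(image: np.ndarray, num_regions: int = 3):
    """Split the image into bands along the readout (first) axis."""
    return np.array_split(image, num_regions, axis=0)

frame = np.zeros((600, 300), dtype=np.uint16)   # 600 positions read out in order
regions = split_into_regions(frame, 3)          # indices 0-199, 200-399, 400-599
print([r.shape for r in regions])               # [(200, 300), (200, 300), (200, 300)]
```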
In an embodiment, after at least two pieces of second angular velocity data corresponding to each second image are obtained through analysis, different image areas in the second image may be subjected to shake compensation through the at least two pieces of second angular velocity data, so as to obtain the target image.
In the above embodiment, each second image corresponds to at least two pieces of second angular velocity data, and the accuracy of compensating the second image is improved by performing shake compensation on different image areas in the second image according to the at least two pieces of second angular velocity data.
In an embodiment, as shown in fig. 5, the step of acquiring the target image may further include:
step 502, obtaining an angular velocity difference value of any two second angular velocity data of the at least two second angular velocity data.
When shake compensation is performed on different areas of the second image, it can first be judged whether the second angular velocity data corresponding to the different image areas of the same second image are similar. For the same second image, if the at least two pieces of second angular velocity data are relatively close, the degrees of shake when the different image areas were collected are also relatively close, and the whole second image can be compensated uniformly; if the at least two pieces of second angular velocity data differ greatly, the degrees of shake when the different image areas were collected differ, and the second image can be compensated area by area.
Specifically, an angular velocity difference between any two of the at least two second angular velocity data may be acquired, and the similarity between any two second angular velocity data may be determined according to the acquired angular velocity difference.
Step 504, when the angular velocity difference is greater than the difference threshold, shake compensation is performed on at least two different image areas in the second image according to at least two second angular velocity data, so as to obtain corresponding target images.
When the acquired angular velocity difference is greater than the difference threshold, the at least two pieces of second angular velocity data are considered to differ greatly, i.e. the lens shook to different degrees while the different image areas of the second image were collected. In this case, the different image areas of the second image may be compensated separately, i.e. each image area may be compensated based on its corresponding one of the at least two pieces of second angular velocity data.
Step 506, when the angular velocity difference is smaller than or equal to the difference threshold, determining target angular velocity data of the second image from at least two pieces of second angular velocity data, and performing shake compensation on the second image according to the target angular velocity data to obtain a corresponding target image.
When the obtained angular velocity difference is smaller than or equal to the difference threshold, any two of the at least two second angular velocity data are considered to be relatively close to each other, and then the second image can be subjected to uniform shake compensation. Specifically, the target angular velocity data of the second image may be determined from at least two pieces of second angular velocity data, for example, an average value of the at least two pieces of second angular velocity data may be taken as the target angular velocity data, without being limited thereto. And then carrying out shake compensation on the second image according to the target angular velocity data to obtain a corresponding target image.
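A sketch of this per-frame decision, assuming scalar angular velocity magnitudes per region and callbacks for the actual compensation routines (e.g. the crop-based helpers sketched earlier); the function names are illustrative only.

```python
def compensate_frame(image, w_regions, diff_threshold, compensate_region, compensate_whole):
    """w_regions: angular velocity magnitude for each image region (at least two values)."""
    diff = max(w_regions) - min(w_regions)      # largest difference between any two values
    if diff > diff_threshold:
        # The regions shook differently: compensate each image area separately.
        for i, w in enumerate(w_regions):
            image = compensate_region(image, i, w)
        return image
    # The regions shook similarly: pick one target angular velocity (here the mean)
    # and compensate the whole second image uniformly.
    target_w = sum(w_regions) / len(w_regions)
    return compensate_whole(image, target_w)
```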
It should be understood that, although the steps in the above flowcharts are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the execution of these steps is not strictly limited to the order shown, and they may be performed in other orders. Moreover, at least some of the steps in the above flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments; their order of execution is not necessarily sequential, and they may be performed in turn or alternately with other steps, or with at least some of the sub-steps or stages of other steps.
In one embodiment, the image processing method may further include:
(1) controlling an image sensor to acquire a first image at a first frequency and receiving first angular velocity data acquired by a gyroscope at a second frequency, wherein the first frequency is less than or equal to the second frequency;
(2) sequencing the first image and the first angular velocity data according to the acquisition time;
(3) respectively acquiring adjacent first angular velocity data of each first image as corresponding intermediate angular velocity data;
(4) binding the first image with the corresponding intermediate angular velocity data to obtain combined data, and storing the combined data into a buffer;
(5) reading, by the processor, the combined data in the buffer;
(6) analyzing the received combined data through a processor to obtain second images and corresponding second angular velocity data, wherein each second image corresponds to at least two second angular velocity data, and the at least two second angular velocity data correspond to at least two different image areas in the second images respectively;
(7) acquiring an angular velocity difference value of any two second angular velocity data in the at least two second angular velocity data;
(8) when the angular velocity difference is greater than the difference threshold, respectively acquiring image offset data of at least two different image areas in the second image according to the at least two second angular velocity data, and respectively performing shake compensation on the at least two different image areas in the second image according to the image offset data to obtain a target image;
(9) when the angular velocity difference is smaller than or equal to the difference threshold, determining target angular velocity data of the second image from the at least two pieces of second angular velocity data, acquiring image offset data of the second image according to the target angular velocity data, and performing shake compensation on the second image according to the image offset data to obtain the target image.
Fig. 6 is a schematic structural diagram of an image processing apparatus according to an embodiment. As shown in fig. 6, the image processing apparatus 600 includes a data acquisition module 602, a data binding module 604, and a data parsing module 606. Wherein:
the data acquisition module 602 is configured to control the image sensor to acquire a first image and receive first angular velocity data acquired by the gyroscope, where the first angular velocity data is used to indicate a shake degree of a lens corresponding to the image sensor when the first image is acquired;
the data binding module 604 is configured to bind the first image and the corresponding first angular velocity data to obtain combined data, and send the combined data to the processor;
and a data analysis module 606, configured to analyze the received combined data through the processor to obtain a second image and corresponding second angular velocity data.
The image processing apparatus provided by the above embodiment may control the image sensor to acquire the first image, and receive the first angular velocity data acquired by the gyroscope. The image sensor may bind the first image and the first angular velocity data into combined data and then send the combined data to the processor. The processor analyzes the received combined data to obtain second angular velocity data and a second image. The first image and the first angular velocity data are transmitted after being bound, and the second image and the second angular velocity data obtained after analysis are corresponding, so that the process of data registration after separate transmission can be avoided, and the image processing efficiency is improved.
In one embodiment, the data acquisition module 602 is further configured to control the image sensor to acquire a first image at a first frequency and receive first angular velocity data acquired by the gyroscope at a second frequency, wherein the first frequency is less than or equal to the second frequency; the data binding module 604 is further configured to obtain intermediate angular velocity data corresponding to each first image from the received first angular velocity data; and binding each first image with the corresponding intermediate angular velocity data to obtain combined data, and sending the combined data to the processor.
In one embodiment, the data binding module 604 is further configured to sort the first image and the first angular velocity data by acquisition time; and respectively acquiring adjacent first angular velocity data of each first image as corresponding intermediate angular velocity data.
In one embodiment, the image sensor and the processor are connected by a buffer; the data binding module 604 is further configured to bind the first image and the corresponding first angular velocity data to obtain combined data, and store the combined data in a buffer; the combined data in the buffer is read by the processor.
In an embodiment, the image processing apparatus further includes an image compensation module, where the image compensation module is configured to perform shake compensation on the second image according to the second angular velocity data to obtain the target image.
In one embodiment, the image compensation module is further configured to obtain image offset data of a second image according to the second angular velocity data; and carrying out shake compensation on the second image according to the image offset data to obtain a target image.
In one embodiment, each second image corresponds to at least two second angular velocity data, the at least two second angular velocity data corresponding to at least two different image areas in the second image, respectively; the image compensation module is further configured to perform shake compensation on at least two different image areas in the second image according to the at least two second angular velocity data, so as to obtain corresponding target images.
In one embodiment, the image compensation module is further configured to obtain an angular velocity difference between any two second angular velocity data of the at least two second angular velocity data; when the angular velocity difference is larger than the difference threshold, respectively performing shake compensation on at least two different image areas in the second image according to at least two pieces of second angular velocity data to obtain corresponding target images; and when the angular velocity difference is smaller than or equal to the difference threshold, determining target angular velocity data of the second image from at least two second angular velocity data, and performing shake compensation on the second image according to the target angular velocity data to obtain a corresponding target image.
The division of the modules in the image processing apparatus is only for illustration, and in other embodiments, the image processing apparatus may be divided into different modules as needed to complete all or part of the functions of the image processing apparatus.
The implementation of each module in the image processing apparatus provided in the embodiments of the present application may be in the form of a computer program. The computer program may be run on a terminal or a server. The program modules constituted by the computer program may be stored in the memory of the terminal or the server. When the computer program is executed by a processor, the steps of the method described in the embodiments of the present application are performed.
The embodiment of the application also provides the electronic equipment. The electronic device includes therein an Image Processing circuit, which may be implemented using hardware and/or software components, and may include various Processing units defining an ISP (Image Signal Processing) pipeline. FIG. 7 is a schematic diagram of an image processing circuit in one embodiment. As shown in fig. 7, for convenience of explanation, only aspects of the image processing technology related to the embodiments of the present application are shown.
As shown in fig. 7, the image processing circuit includes an ISP processor 740 and control logic 750. The image data captured by the imaging device 710 is first processed by the ISP processor 740, and the ISP processor 740 analyzes the image data to capture image statistics that may be used to determine and/or control one or more parameters of the imaging device 710. The imaging device 710 may include a camera having one or more lenses 712 and an image sensor 714. The image sensor 714 may include an array of color filters (e.g., Bayer filters), and the image sensor 714 may acquire light intensity and wavelength information captured with each imaging pixel of the image sensor 714 and provide a set of raw image data that may be processed by the ISP processor 740. The sensor 720 (e.g., a gyroscope) may provide parameters of the acquired image processing (e.g., anti-shake parameters) to the ISP processor 740 based on the type of sensor 720 interface. The sensor 720 interface may utilize a SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the above.
In addition, image sensor 714 may also send raw image data to sensor 720, sensor 720 may provide raw image data to ISP processor 740 based on the type of sensor 720 interface, or sensor 720 may store raw image data in image memory 730.
ISP processor 740 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and ISP processor 740 may perform one or more image processing operations on the raw image data, collecting statistical information about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision.
ISP processor 740 may also receive image data from image memory 730. For example, sensor 720 interface sends raw image data to image memory 730, and the raw image data in image memory 730 is then provided to ISP processor 740 for processing. The image Memory 730 may be a portion of a Memory device, a storage device, or a separate dedicated Memory within an electronic device, and may include a DMA (Direct Memory Access) feature.
ISP processor 740 may perform one or more image processing operations, such as temporal filtering, upon receiving raw image data from image sensor 714 interface or from sensor 720 interface or from image memory 730. The processed image data may be sent to image memory 730 for additional processing before being displayed. ISP processor 740 receives processed data from image memory 730 and performs image data processing on the processed data in the raw domain and in the RGB and YCbCr color spaces. The image data processed by ISP processor 740 may be output to display 770 for viewing by a user and/or further processed by a Graphics Processing Unit (GPU). Further, the output of ISP processor 740 may also be sent to image memory 730 and display 770 may read image data from image memory 730. In one embodiment, image memory 730 may be configured to implement one or more frame buffers. In addition, the output of the ISP processor 740 may be transmitted to the encoder/decoder 760 for encoding/decoding image data. The encoded image data may be saved and decompressed before being displayed on the display 770 device. The encoder/decoder 760 may be implemented by a CPU or GPU or coprocessor.
The statistical data determined by ISP processor 740 may be sent to control logic 750 unit. For example, the statistical data may include image sensor 714 statistics such as auto-exposure, auto-white balance, auto-focus, flicker detection, black level compensation, lens 712 shading correction, and the like. Control logic 750 may include a processor and/or microcontroller that executes one or more routines (e.g., firmware) that may determine control parameters of imaging device 710 and control parameters of ISP processor 740 based on the received statistical data. For example, the control parameters of imaging device 710 may include sensor 720 control parameters (e.g., gain, integration time for exposure control, anti-shake parameters, etc.), camera flash control parameters, lens 712 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), as well as lens 712 shading correction parameters.
In one embodiment, the steps of the image processing method provided in the above-mentioned embodiment can be implemented by using the image processing technology in fig. 7.
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the image processing methods provided by the above-described embodiments.
A computer program product comprising instructions which, when run on a computer, cause the computer to perform the image processing method provided by the above embodiments.
Any reference to memory, storage, a database, or other media used by the embodiments of the present application may include non-volatile and/or volatile memory. Suitable non-volatile memory can include Read-Only Memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM).
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the application. It should be noted that those skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (11)

1. An image processing method applied to an electronic device, the electronic device comprising a gyroscope, an image sensor and a processor, the gyroscope being connected with the image sensor, the image sensor being connected with the processor, the method comprising:
controlling the image sensor to acquire a first image, and receiving first angular velocity data acquired by the gyroscope, wherein the first angular velocity data is used for representing the shaking degree of a lens corresponding to the image sensor when the first image is acquired, and at least two pieces of intermediate angular velocity data corresponding to each first image are acquired from the first angular velocity data;
binding the first image with the corresponding at least two pieces of intermediate angular velocity data to obtain combined data, and sending the combined data to the processor;
analyzing the received combined data through the processor to obtain second images and corresponding second angular velocity data, wherein each second image corresponds to at least two second angular velocity data, and the at least two second angular velocity data respectively correspond to at least two different image areas in the second images;
and performing shake compensation on at least two different image areas in the second image according to the at least two pieces of second angular velocity data to obtain corresponding target images.
2. The method of claim 1, wherein controlling the image sensor to capture a first image and receiving first angular velocity data captured by the gyroscope comprises:
controlling the image sensor to acquire a first image at a first frequency and receiving first angular velocity data acquired by the gyroscope at a second frequency, wherein the first frequency is less than or equal to the second frequency;
the binding the first image and the corresponding first angular velocity data to obtain combined data, and sending the combined data to the processor includes:
acquiring intermediate angular velocity data corresponding to each first image from the received first angular velocity data;
and binding each first image with corresponding intermediate angular velocity data to obtain combined data, and sending the combined data to the processor.
3. The method according to claim 2, wherein said obtaining intermediate angular velocity data corresponding to each of the first images from the received first angular velocity data comprises:
sequencing the first image and the first angular velocity data according to the acquisition time;
and respectively acquiring adjacent first angular velocity data of each first image as corresponding intermediate angular velocity data.
4. The method of claim 1, wherein the image sensor and the processor are connected by a buffer;
the binding the first image and the corresponding first angular velocity data to obtain combined data, and sending the combined data to the processor includes:
binding the first image with the corresponding first angular velocity data to obtain combined data, and storing the combined data into the buffer;
reading, by the processor, the combined data in the buffer.
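
Implementation note (not part of the claims): claim 4 places a buffer between the image sensor and the processor. A minimal producer/consumer sketch, assuming the buffer behaves like a bounded FIFO queue (the queue size, sentinel handling and thread wiring are illustrative choices, not requirements of the claim):

    import queue
    import threading

    combined_buffer = queue.Queue(maxsize=8)   # stand-in for the buffer of claim 4


    def sensor_side(frames):
        # Image-sensor side: store each bound image + angular velocity packet in the buffer.
        for frame in frames:
            combined_buffer.put(frame)
        combined_buffer.put(None)              # sentinel so the sketch can terminate


    def processor_side(handle):
        # Processor side: read combined data out of the buffer as it becomes available.
        while True:
            frame = combined_buffer.get()
            if frame is None:
                break
            handle(frame)


    # Hypothetical wiring: run the reader on its own thread while frames are produced.
    reader = threading.Thread(target=processor_side, args=(print,))
    reader.start()
    sensor_side(["frame-0", "frame-1"])
    reader.join()
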
5. The method according to any one of claims 1-4, wherein, after the received combined data is analyzed by the processor to obtain the second image and the corresponding second angular velocity data, the method further comprises:
and carrying out shake compensation on the second image according to the second angular velocity data to obtain a target image.
6. The method according to claim 5, wherein the performing shake compensation on the second image according to the second angular velocity data to obtain the target image comprises:
acquiring image offset data of the second image according to the second angular velocity data;
and carrying out shake compensation on the second image according to the image offset data to obtain a target image.
7. The method of claim 6, wherein the performing shake compensation on the second image according to the image offset data to obtain the target image comprises:
acquiring a reference compensation quantity;
and calculating target compensation amounts corresponding to the second image in each compensation direction according to the reference compensation amount and the image offset data, and performing shake compensation on the second image in each compensation direction according to the target compensation amounts to obtain a target image.
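
Implementation note (not part of the claims): claims 6 and 7 turn second angular velocity data into image offset data and then into per-direction target compensation amounts bounded by a reference compensation amount. A hedged sketch under a small-angle model (the focal-length mapping and the clamping rule are assumptions; the claims do not prescribe either):

    import numpy as np


    def image_offset(angular_velocity, exposure_time, focal_length_px):
        # Small-angle approximation: rotation accumulated during the exposure maps to a
        # pixel shift. angular_velocity = (wx, wy, ...) in rad/s; returns (dx, dy) in pixels.
        wx, wy = angular_velocity[0], angular_velocity[1]
        return (focal_length_px * wy * exposure_time,    # yaw   -> horizontal shift
                focal_length_px * wx * exposure_time)    # pitch -> vertical shift


    def target_compensation(offset, reference_amount):
        # Per-direction target compensation: follow the measured offset in each
        # compensation direction, but never exceed the reference compensation amount.
        return tuple(float(np.clip(o, -reference_amount, reference_amount)) for o in offset)

For example, target_compensation(image_offset((0.2, 0.5, 0.0), 0.01, 1500), reference_amount=6.0) yields a 6-pixel horizontal and 3-pixel vertical compensation in this toy model, because the 7.5-pixel raw horizontal offset is clipped to the reference amount.
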
8. The method according to claim 1, wherein the performing shake compensation on at least two different image areas in the second image according to the at least two pieces of second angular velocity data to obtain the corresponding target images comprises:
acquiring an angular velocity difference value between any two pieces of second angular velocity data among the at least two pieces of second angular velocity data;
when the angular velocity difference value is greater than a difference threshold, respectively performing shake compensation on the at least two different image areas in the second image according to the at least two pieces of second angular velocity data to obtain the corresponding target images;
and when the angular velocity difference value is less than or equal to the difference threshold, determining target angular velocity data of the second image from the at least two pieces of second angular velocity data, and performing shake compensation on the second image according to the target angular velocity data to obtain a corresponding target image.
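
Implementation note (not part of the claims): claim 8 switches between region-wise and whole-image compensation depending on how much the second angular velocity samples disagree. A sketch of that decision rule, assuming the difference value is the largest pairwise vector difference and that compensate_regions / compensate_whole are supplied by the caller (both names hypothetical):

    import itertools

    import numpy as np


    def choose_compensation(image, velocities, diff_threshold,
                            compensate_regions, compensate_whole):
        # velocities: the at least two second angular velocity samples of one second image.
        # Largest pairwise difference between any two samples.
        max_diff = max(np.linalg.norm(np.asarray(a) - np.asarray(b))
                       for a, b in itertools.combinations(velocities, 2))
        if max_diff > diff_threshold:
            # Samples disagree: compensate each image area with its own sample.
            return compensate_regions(image, velocities)
        # Samples agree: derive one target sample for the whole image. Using the mean is
        # an assumption; the claim only requires determining target data from the samples.
        target = np.mean(np.asarray(velocities), axis=0)
        return compensate_whole(image, target)
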
9. An image processing apparatus applied to an electronic device, the electronic device comprising a gyroscope, an image sensor and a processor, the gyroscope being connected to the image sensor, the image sensor being connected to the processor, the apparatus comprising:
the data acquisition module is used for controlling the image sensor to acquire a first image and receiving first angular velocity data acquired by the gyroscope, wherein the first angular velocity data is used for representing the shaking degree of a lens corresponding to the image sensor when the first image is acquired, and at least two pieces of intermediate angular velocity data corresponding to each first image are acquired from the first angular velocity data;
the data binding module is used for binding the first image with the corresponding at least two pieces of intermediate angular velocity data to obtain combined data and sending the combined data to the processor;
the data analysis module is used for analyzing the received combined data through the processor to obtain second images and corresponding second angular velocity data, wherein each second image corresponds to at least two pieces of second angular velocity data, and the at least two pieces of second angular velocity data respectively correspond to at least two different image areas in the second images; and performing shake compensation on at least two different image areas in the second image according to the at least two pieces of second angular velocity data to obtain corresponding target images.
10. An electronic device comprising a memory, a processor, a gyroscope and an image sensor, the gyroscope being connected to the image sensor, the image sensor being connected to the processor, the memory having stored therein a computer program which, when executed by the processor, causes the processor to carry out the steps of the method according to any one of claims 1 to 8.
11. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 8.
CN201910233766.7A 2019-03-26 2019-03-26 Image processing method and device, electronic equipment and computer readable storage medium Active CN110049239B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910233766.7A CN110049239B (en) 2019-03-26 2019-03-26 Image processing method and device, electronic equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910233766.7A CN110049239B (en) 2019-03-26 2019-03-26 Image processing method and device, electronic equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN110049239A (en) 2019-07-23
CN110049239B (en) 2021-03-23

Family

ID=67275281

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910233766.7A Active CN110049239B (en) 2019-03-26 2019-03-26 Image processing method and device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN110049239B (en)


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6222514B2 (en) * 2012-01-11 2017-11-01 パナソニックIpマネジメント株式会社 Image processing apparatus, imaging apparatus, and computer program
CN107852462B (en) * 2015-07-22 2020-12-18 索尼公司 Camera module, solid-state imaging element, electronic apparatus, and imaging method
JP2018033100A (en) * 2016-08-26 2018-03-01 キヤノン株式会社 Image processing apparatus and method, and imaging apparatus
CN108737734B (en) * 2018-06-15 2020-12-01 Oppo广东移动通信有限公司 Image compensation method and apparatus, computer-readable storage medium, and electronic device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104469251A (en) * 2013-09-23 2015-03-25 联想(北京)有限公司 Image acquisition method and electronic equipment

Also Published As

Publication number Publication date
CN110049239A (en) 2019-07-23

Similar Documents

Publication Publication Date Title
CN109842753B (en) Camera anti-shake system, camera anti-shake method, electronic device and storage medium
CN110166695B (en) Camera anti-shake method and device, electronic equipment and computer readable storage medium
CN110290323B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN110012224B (en) Camera anti-shake system, camera anti-shake method, electronic device, and computer-readable storage medium
CN110536057B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN110636223B (en) Anti-shake processing method and apparatus, electronic device, and computer-readable storage medium
CN110035228B (en) Camera anti-shake system, camera anti-shake method, electronic device, and computer-readable storage medium
CN112087580B (en) Image acquisition method and device, electronic equipment and computer readable storage medium
CN110475067B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN109951638B (en) Camera anti-shake system, camera anti-shake method, electronic device, and computer-readable storage medium
CN108717530B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN110035206B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN110636216B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN109963080B (en) Image acquisition method and device, electronic equipment and computer storage medium
CN110049237B (en) Camera anti-shake method and device, electronic equipment and computer storage medium
CN109951641B (en) Image shooting method and device, electronic equipment and computer readable storage medium
CN112004029B (en) Exposure processing method, exposure processing device, electronic apparatus, and computer-readable storage medium
CN111432118B (en) Image anti-shake processing method and device, electronic equipment and storage medium
CN109685853B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN107959841B (en) Image processing method, image processing apparatus, storage medium, and electronic device
CN108401110B (en) Image acquisition method and device, storage medium and electronic equipment
CN109559352B (en) Camera calibration method, device, electronic equipment and computer-readable storage medium
CN110177212B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN111246100B (en) Anti-shake parameter calibration method and device and electronic equipment
CN112087571A (en) Image acquisition method and device, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant