CN114007009A - Electronic device and image processing method
- Publication number: CN114007009A (application CN202010682841.0A)
- Authority: CN (China)
- Legal status: Granted
Classifications
- H04N23/80—Camera processing pipelines; Components thereof
- G06F18/253—Fusion techniques of extracted features
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
- H04N23/665—Control of cameras or camera modules involving internal camera communication with the image sensor, e.g. synchronising or multiplexing SSIS control signals
Abstract
The embodiments of the application provide an electronic device and an image processing method in which an image signal pre-processor is added. The image signal pre-processor performs state-information statistics on the raw image data acquired by the image sensor to obtain first state information, and pre-processes the raw image data to obtain pre-processed image data. The image signal processor then performs state-information statistics on the pre-processed image data to obtain second state information. Finally, the application processor fuses the first state information produced by the image signal pre-processor with the second state information produced by the image signal processor to obtain target state information, and uses the target state information to update the image acquisition parameters of the image sensor. In this way, the accuracy of the automatic configuration of the image acquisition parameters of the electronic device can be improved.
Description
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an electronic device and an image processing method.
Background
At present, the quality of the shooting function has become a key measure of the performance of an electronic device (such as a smart phone or tablet computer). To simplify user operation, electronic devices typically configure the image acquisition parameters (e.g., exposure parameters, focus parameters, etc.) automatically. However, in the related art, the accuracy with which electronic devices automatically configure these image acquisition parameters is poor.
Disclosure of Invention
The application provides an electronic device and an image processing method, which can improve the accuracy of automatic configuration of image acquisition parameters of the electronic device.
The electronic device disclosed in the application includes:
the image sensor is used for acquiring image data according to the configured image acquisition parameters;
the image signal preprocessor is used for counting state information used for updating image acquisition parameters of the image sensor in the image data to obtain first state information, and preprocessing the image data to obtain preprocessed image data;
the image signal processor is used for counting state information used for updating the image acquisition parameters of the image sensor in the pre-processing image data to obtain second state information;
and the application processor is used for fusing the first state information and the second state information to obtain target state information and updating the image acquisition parameters of the image sensor according to the target state information.
The embodiments of the application also disclose an image processing method applicable to an electronic device, where the electronic device includes an image sensor, an image signal pre-processor, an image signal processor and an application processor. The image processing method includes the following steps:
acquiring image data through the image sensor according to the configured image acquisition parameters;
counting state information used for updating image acquisition parameters of the image sensor in the image data through the image signal preprocessor to obtain first state information, and preprocessing the image data to obtain preprocessed image data;
counting state information used for updating image acquisition parameters of the image sensor in the pre-processing image data through the image signal processor to obtain second state information;
and fusing the first state information and the second state information through the application processor to obtain target state information, and updating the image acquisition parameters of the image sensor according to the target state information.
In the embodiments of the application, an image signal pre-processor is added. The image signal pre-processor performs state-information statistics on the raw image data acquired by the image sensor to obtain first state information, and pre-processes the raw image data to obtain pre-processed image data. The image signal processor then performs state-information statistics on the pre-processed image data to obtain second state information. Finally, the application processor fuses the first state information counted by the image signal pre-processor with the second state information counted by the image signal processor to obtain target state information, and updates the image acquisition parameters of the image sensor with the target state information. In this way, the accuracy of the automatic configuration of the image acquisition parameters of the electronic device can be improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the description of the embodiments will be briefly introduced below.
Fig. 1 is a first structural schematic diagram of an electronic device according to an embodiment of the present application.
Fig. 2 is a schematic structural diagram of the image signal preprocessor in fig. 1.
Fig. 3 is a schematic diagram of the data flow involved in the embodiment of the present application.
Fig. 4 is a second structural schematic diagram of an electronic device according to an embodiment of the present application.
Fig. 5 is a schematic diagram showing a connection between the image signal pre-processor and the application processor in fig. 4.
Fig. 6 is a schematic flowchart of an image processing method according to an embodiment of the present application.
Detailed Description
The technical solution provided by the embodiment of the present application can be applied to various scenarios requiring data communication, and the embodiment of the present application is not limited thereto.
Referring to fig. 1, fig. 1 is a first structural schematic diagram of an electronic device 100 according to an embodiment of the present disclosure. The electronic device 100 may be a mobile electronic device such as a smart phone, tablet computer, palmtop computer or notebook computer, or a non-mobile electronic device such as a desktop computer or server, and includes an image sensor 110, an image signal pre-processor 120, an image signal processor 130, and an application processor 140.
The image sensor 110, also referred to as a light-sensing element, is a device that converts an optical signal into an electrical signal. Unlike "point" light-sensing elements such as photodiodes or phototransistors, the image sensor 110 divides the sensed optical image into many small units and converts each into a usable electrical signal, thereby obtaining raw image data. It should be noted that the embodiment of the present application does not limit the type of the image sensor 110; it may be a Complementary Metal-Oxide-Semiconductor (CMOS) image sensor, a Charge Coupled Device (CCD) image sensor, or the like.
The image signal processor 130 can process the image data collected by the image sensor 110 to improve its quality. For example, the image signal processor 130 can apply optimization processing such as white balance correction, strong light suppression, backlight compensation, color enhancement and lens shading correction to the image data, and may also apply optimization processing not listed in this application.
Compared with the image signal processor 130, the image signal pre-processor 120 performs differentiated processing before the image signal processor 130 processes the image data, which can be regarded as pre-processing ahead of the image signal processor 130, such as dead pixel correction, temporal noise reduction, 3D noise reduction, linearization and black level correction, and may also include optimization processing not listed in this application.
The application processor 140 is a general-purpose processor, such as a processor designed on the ARM architecture.
In the embodiment of the present application, the image sensor 110 is connected to the image signal pre-processor 120, and is configured to acquire image data according to the configured image acquisition parameters, and transmit the acquired image data to the image signal pre-processor 120. It should be noted that the image data acquired by the image sensor 110 is image data in RAW format, and the acquired image data in RAW format is transmitted to the image signal pre-processor 120.
The image capturing parameters include, but are not limited to, exposure parameters, focusing parameters, white balance parameters, and the like. For example, before starting the image data acquisition, the application processor 140 configures initial exposure parameters to the image sensor 110, so that the image sensor 110 acquires image data according to the initial exposure parameters, and transmits the acquired image data to the image signal pre-processor 120.
It should be noted that, in the embodiment of the present application, a connection manner of the image signal pre-Processor 120 and the image sensor 110 is not particularly limited, for example, the image signal pre-Processor 120 and the image sensor 110 are connected by an MIPI (Mobile Industry Processor Interface).
The image sensor 110 packages the image data into a plurality of image data packets when transmitting the image data to the image signal pre-processor 120, and transmits the image data packets to the image signal pre-processor. Illustratively, an image data packet includes a header field, an end-of-packet field and a data field, wherein the header field and the end-of-packet field are used for filling some necessary control information, such as synchronization information, address information, error control information, etc., and the data field is used for filling the actual image content.
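As an illustration only, this packet layout might be modeled as follows (a minimal Python sketch; the field names, sizes and helper function are assumptions for illustration and are not specified by this application):

```python
from dataclasses import dataclass

@dataclass
class ImageDataPacket:
    """Hypothetical model of one image data packet sent from the image sensor
    to the image signal pre-processor (field layout assumed for illustration)."""
    header: bytes   # control information: synchronization, address, error control
    payload: bytes  # data field: the actual image content (RAW pixel data)
    footer: bytes   # end-of-packet field: further control/error-check information

def split_into_packets(raw_frame: bytes, payload_size: int = 4096) -> list:
    """Package one RAW frame into a sequence of packets; sizes are illustrative."""
    packets = []
    for offset in range(0, len(raw_frame), payload_size):
        chunk = raw_frame[offset:offset + payload_size]
        packets.append(ImageDataPacket(header=b"\x00" * 8, payload=chunk, footer=b"\xff" * 4))
    return packets
```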
On the other hand, the image signal pre-processor 120 receives image data from the image sensor 110. In addition, the image signal pre-processor 120 is further connected to the image signal processor 130, wherein in the embodiment of the present application, the connection manner between the image signal processor 130 and the image signal pre-processor 120 is not particularly limited, for example, the image signal processor 130 and the image signal pre-processor 120 may also be connected by MIPI.
After receiving the image data transmitted from the image sensor 110, the image signal pre-processor 120 uses the image data to count state information, including but not limited to brightness information, sharpness information, contrast information, etc., required by the application processor 140 to update the image capturing parameters of the image sensor 110, and uses the state information counted by the image data as the first state information. In addition, the image signal preprocessor 120 preprocesses the image data from the image sensor 110 according to the configured preprocessing policy, so as to improve the image quality of the image data and obtain preprocessed image data accordingly. It should be noted that the preprocessing of the image data by the image signal preprocessor 120 does not change the format of the image data, i.e., the preprocessed image data obtained by the preprocessing is still in the RAW format.
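By way of illustration, the kind of state information mentioned above (brightness, sharpness, contrast) could be computed from a frame as in the following NumPy sketch; the particular estimators are assumptions chosen for readability, not the statistics actually implemented by the pre-processor hardware:

```python
import numpy as np

def collect_state_info(raw: np.ndarray) -> dict:
    """Compute illustrative state information (brightness, sharpness, contrast)
    from one RAW frame; real statistics blocks work differently."""
    frame = raw.astype(np.float64)
    brightness = float(frame.mean())          # average pixel level
    contrast = float(frame.std())             # spread of pixel levels
    gy, gx = np.gradient(frame)               # finite differences as a sharpness proxy
    sharpness = float(np.mean(np.abs(gx)) + np.mean(np.abs(gy)))
    return {"brightness": brightness, "sharpness": sharpness, "contrast": contrast}
```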
After completing the pre-processing of the image data and obtaining pre-processed image data, the image signal pre-processor 120 transmits the pre-processed image data to the image signal processor 130.
After receiving the pre-processed image data from the image signal pre-processor 120, the image signal processor 130 uses the pre-processed image data to count state information, including but not limited to brightness information, sharpness information, and contrast information, required by the application processor 140 to update the image capturing parameters of the image sensor 110, and uses the state information counted by the pre-processed image data as second state information.
Also, for the first state information, the image signal pre-processor 120 may transmit the first state information to the application processor 140 directly or to the application processor 140 via the image signal processor 130.
After the image signal pre-processor 120 obtains the first state information through statistics, and the image signal processor 130 obtains the second state information through statistics, the application processor 140 further obtains a new state information by correspondingly fusing the first state information and the second state information according to the configured fusion policy, and records the new state information as the target state information. Therefore, the target state information carries the state information before and after the image data preprocessing to a certain extent, the shooting state of the actual shooting scene can be more accurately reflected, and the image acquisition parameters calculated by utilizing the target state information are more accurate. Accordingly, the application processor 140 calculates a new image acquisition parameter by using the target state information obtained by fusion according to the configured image acquisition parameter calculation algorithm (such as an automatic exposure algorithm, an automatic white balance algorithm, an automatic focusing algorithm, and the like).
In addition, the application processor 140 is also connected to the image sensor 110, and controls the image sensor 110 to start capturing image data and end capturing image data. After calculating the new image acquisition parameters, the application processor updates the image acquisition parameters of the image sensor 110 to the newly calculated image acquisition parameters. The above-mentioned steps are repeated, and the image acquisition parameters of the image sensor 110 are continuously updated until the image acquisition parameters of the image sensor 110 are converged, so as to obtain the best image acquisition effect.
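To make this closed loop concrete, a minimal Python sketch of the capture/statistics/fusion/update cycle is given below; the object interfaces, the iteration limit and the convergence test are assumptions made for illustration and are not prescribed by the application:

```python
def run_acquisition_loop(sensor, pre_processor, processor, app_processor,
                         max_iterations=50, tolerance=1e-3):
    """Iterate the capture -> statistics -> fusion -> parameter-update cycle
    until the image acquisition parameters converge (assumed interfaces)."""
    params = app_processor.initial_parameters()
    sensor.update_parameters(params)
    for _ in range(max_iterations):
        raw = sensor.capture()                            # RAW image data
        first_state = pre_processor.collect_stats(raw)    # first state information
        pre = pre_processor.preprocess(raw)               # pre-processed image data
        second_state = processor.collect_stats(pre)       # second state information
        target_state = app_processor.fuse(first_state, second_state)
        new_params = app_processor.compute_parameters(target_state)
        if app_processor.distance(new_params, params) < tolerance:
            break                                         # parameters have converged
        params = new_params
        sensor.update_parameters(params)                  # apply the updated parameters
    return params
```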
It should be noted that, in the embodiment of the present application, the connection manner between the application processor 140 and the image sensor 110 is not particularly limited, and may be configured by a person of ordinary skill in the art according to practical situations, for example, in the embodiment of the present application, the application processor 140 and the image sensor 110 are connected through an I2C bus.
Compared with the related art, an image signal pre-processor 120 is added. The image signal pre-processor 120 performs state-information statistics on the raw image data acquired by the image sensor 110 to obtain first state information, and pre-processes the raw image data to obtain pre-processed image data; the image signal processor 130 then performs state-information statistics on the pre-processed image data to obtain second state information; finally, the application processor 140 fuses the first state information counted by the image signal pre-processor 120 with the second state information counted by the image signal processor 130 to obtain target state information, and updates the image acquisition parameters of the image sensor with the target state information. In this way, the accuracy of the automatic configuration of the image acquisition parameters of the electronic device can be improved.
Alternatively, referring to fig. 2, the image signal pre-processor 120 includes:
an image signal processing unit 1201, configured to count the first state information of the image data and to carry out first preprocessing on the image data; and
a neural network processing unit 1202, configured to perform second preprocessing on the image data after the first preprocessing to obtain the preprocessed image data.
The image signal processing unit 1201 is connected to the image sensor 110 and is configured to perform statistics on the state information of the image data from the image sensor 110 to obtain the first state information. The image signal processing unit 1201 is further configured to perform the first preprocessing on the image data according to the configured preprocessing policy. It should be noted that the embodiment of the present application does not particularly limit the first preprocessing performed by the image signal processing unit 1201, which includes but is not limited to optimization processing such as dead pixel correction, temporal noise reduction, 3D noise reduction, linearization and black level correction, and may of course also include optimization processing not listed in this application.
The neural network processing unit 1202 is configured to perform the second preprocessing on the image data after the first preprocessing to obtain the preprocessed image data. A plurality of neural network algorithms are built into the neural network processing unit 1202 (for example, a neural-network-based video night scene algorithm, a video HDR algorithm, a video blurring algorithm, a video noise reduction algorithm, a video super-resolution algorithm, and the like). After the image signal processing unit 1201 finishes the first preprocessing of the image data, the neural network processing unit 1202 invokes the corresponding neural network algorithm according to the configured preprocessing policy to perform the second preprocessing on the image data after the first preprocessing, obtaining the preprocessed image data.
Put simply, the image signal processing unit 1201 performs a preliminary, non-AI optimization of the image data, and the neural network processing unit 1202 then further optimizes the preliminarily optimized image data using AI image optimization.
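A minimal sketch of this two-stage division of labour is given below; the specific corrections and the neural-network call are placeholders chosen for illustration, not the algorithms actually fixed in the units:

```python
import numpy as np

def first_preprocessing(raw: np.ndarray, black_level: int = 64) -> np.ndarray:
    """Non-AI optimization by the image signal processing unit 1201
    (illustrative black level subtraction; dead pixel correction etc. omitted)."""
    corrected = np.clip(raw.astype(np.int32) - black_level, 0, None)
    return corrected.astype(raw.dtype)

def second_preprocessing(data: np.ndarray, nn_algorithm=None) -> np.ndarray:
    """AI optimization by the neural network processing unit 1202; nn_algorithm
    stands in for one of the built-in neural network algorithms (night scene,
    HDR, noise reduction, ...). None falls back to the identity mapping."""
    return data if nn_algorithm is None else nn_algorithm(data)

def preprocess(raw: np.ndarray) -> np.ndarray:
    """Preliminary non-AI optimization followed by AI optimization; the RAW
    format of the image data is preserved throughout."""
    return second_preprocessing(first_preprocessing(raw))
```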
Optionally, in an embodiment, the application processor 140 is configured to:
acquiring a first weight value corresponding to the first state information and acquiring a second weight value corresponding to the second state information;
and carrying out weighted summation according to the first state information, the first weight value, the second state information and the second weight value to obtain target state information.
The application further provides an optional fusion strategy for the first state information and the second state information. The application processor 140 fuses the first state information and the second state information by weighted summation, and takes the weighted result as the fused target state information, which can be expressed as:
S' = a * S1 + b * S2
where S' denotes the fused target state information, S1 denotes the first state information, S2 denotes the second state information, a denotes the weight assigned to the first state information, and b denotes the weight assigned to the second state information.
It should be noted that the embodiment of the present application does not specifically limit how the weights of the first state information and the second state information are configured; they may be configured by a person skilled in the art according to actual needs. For example, static weights may be configured for the second state information and the first state information, that is, fixed weights are assigned to them in advance; or dynamic weights may be configured, that is, the weights of the second state information and the first state information change dynamically according to a specified condition.
For example, the weight of the first state information may be set to 0.9, and the weight of the second state information may be set to 0.1;
for another example, the weights of the first state information and the second state information may be dynamically assigned according to the preprocessing performed by the image signal preprocessor, wherein the greater the difference between the images of the image data before and after preprocessing, the greater the weight of the first state information is, and the smaller the weight of the second state information is; the smaller the difference between the images of the image data before and after the pre-processing is, the smaller the weight of the first status information is configured, and the larger the weight of the second status information is configured.
It should be noted that the first status information and the second status information include status information of a plurality of different dimensions, and the application processor 140 performs fusion of the status information in each dimension according to the above fusion policy provided in the present application, for example, fusion of brightness in the first status information and brightness in the second status information, fusion of sharpness in the first status information and sharpness in the second status information, and the like.
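A minimal sketch of this fusion policy is shown below: each dimension of the state information is fused by weighted summation, and the dynamic-weight rule based on the image difference before and after preprocessing is one possible, assumed instantiation of the behaviour described above:

```python
import numpy as np

def dynamic_weights(before: np.ndarray, after: np.ndarray):
    """Assign a larger weight to the first state information when preprocessing
    changes the image more (illustrative mapping, assumed here)."""
    diff = float(np.mean(np.abs(after.astype(np.float64) - before.astype(np.float64))))
    a = min(0.9, 0.5 + diff / 255.0)   # weight a of the first state information
    return a, 1.0 - a                  # weight b of the second state information

def fuse_state(first: dict, second: dict, a: float = 0.9, b: float = 0.1) -> dict:
    """Fuse the two sets of state information dimension by dimension:
    S' = a * S1 + b * S2 for brightness, sharpness, contrast, ..."""
    return {key: a * first[key] + b * second[key] for key in first}

# Example with the static weights 0.9 / 0.1 mentioned above.
target = fuse_state({"brightness": 120.0, "sharpness": 0.42},
                    {"brightness": 132.0, "sharpness": 0.47})
```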
Optionally, in an embodiment, the image signal pre-processor 120 is further configured to transmit the first state information and the pre-processed image data to the image signal processor 130;
the image signal processor 130 is further configured to perform post-processing on the pre-processed image data to obtain post-processed image data; and
sending the post-processed image data, the first state information, the second state information, and the first indication information to the application processor 140.
The application processor 140 is further configured to fuse the first status information and the second status information according to the first indication information to obtain target status information.
Referring to fig. 3, in the embodiment of the present application, the image signal preprocessor 120 statistically obtains the first state information of the image data from the image sensor 110, preprocesses the image data to obtain preprocessed image data, and then transmits the first state information and the preprocessed image data to the image signal processor 130. For example, the image signal pre-processor 120 transmits the first state information obtained by statistics and the pre-processed image data obtained by pre-processing to the image signal processor 130 based on the MIPI interface between the image signal pre-processor and the image signal processor 130.
After receiving the first state information and the pre-processed image data from the image signal pre-processor 120, the image signal processor 130 performs state statistics on the pre-processed image data to obtain second state information, and performs post-processing on the pre-processed image data to obtain post-processed image data.
In the embodiment of the present application, the post-processing performed by the image signal processor 130 is not specifically limited, subject only to the constraint that it differ from the pre-processing performed by the image signal pre-processor 120 (that is, an optimization already performed by the image signal pre-processor 120 is not performed again by the image signal processor 130); within that constraint it can be configured by those skilled in the art according to actual needs. For example, the post-processing performed by the image signal processor 130 may include optimization processing such as strong light suppression, backlight compensation, color enhancement and lens shading correction, and may also include optimization processing not listed in this application.
After finishing the post-processing of the pre-processed image data, the image signal processor 130 correspondingly obtains post-processed image data, counts the second state information, generates first indication information, and sends the post-processed image data, the first state information, the second state information, and the first indication information to the application processor 140.
Accordingly, the application processor 140 fuses the first state information and the second state information according to the first indication information to obtain the target state information.
Optionally, in an embodiment, the application processor 140 is further configured to:
previewing the post-processing image data and/or carrying out video coding according to the post-processing image data when the post-processing image data is dynamic image data; or,
and when the post-processing image data is a static image, carrying out image coding according to the post-processing image data.
It should be noted that the image type does not change as the image data is processed; that is, if the original image data is a still image, the pre-processed/post-processed image data obtained from it is also a still image, and if the original image data is a dynamic image, the pre-processed/post-processed image data obtained from it is also a dynamic image. A still image is, for example, a single frame captured in real time; a dynamic image is, for example, one frame of an image sequence acquired during preview, or one frame of an image sequence acquired during video recording.
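As a small illustration, the dispatch on image type might look like the following sketch, where the previewer and encoder interfaces are assumed:

```python
def handle_post_processed(image, is_dynamic: bool, previewer, video_encoder, image_encoder):
    """Route post-processed image data: dynamic frames go to preview and/or
    video encoding, still images go to image encoding (assumed interfaces)."""
    if is_dynamic:
        previewer.show(image)
        video_encoder.encode(image)
    else:
        image_encoder.encode(image)
```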
Optionally, in an embodiment, the image signal pre-processor 120 is further configured to transmit the first state information and the pre-processed image data to the image signal processor 130 when the current transmission mode is configured as the first transmission mode.
It should be noted that in the embodiment of the present application, two selectable transmission modes, namely a first transmission mode and a second transmission mode, are provided.
When the current transmission mode is configured as the first transmission mode (also called the synchronous transmission mode), the image signal pre-processor 120 transmits the first state information and the pre-processed image data to the image signal processor 130, so that the first state information is passed on to the application processor 140 through the image signal processor 130. In addition, the image signal processor 130 further transmits the post-processed image data obtained by post-processing the pre-processed image data, the second state information obtained by performing statistics on the pre-processed image data, and the generated first indication information to the application processor 140 along with the first state information.
Optionally, in an embodiment, the image signal pre-processor 120 is further configured to send the first state information and the second indication information to the application processor 140 when the current transmission mode is configured as the second transmission mode;
the application processor 140 is further configured to fuse the first state information and the second state information according to the second indication information to obtain target state information.
Referring to fig. 4, in the embodiment of the present application, the image signal pre-processor 120 is further connected to the application processor 140, and the image signal pre-processor 120 directly transmits the first state information to the application processor 140 when the current transmission mode is configured as the second transmission mode.
On the other hand, the image signal pre-processor 120 also transmits pre-processed image data obtained by pre-processing the image data to the image signal processor 130. Accordingly, the image signal processor 130 performs post-processing on the pre-processed image data to obtain post-processed image data, in addition to performing state statistics on the pre-processed image data to obtain second state information.
In the embodiment of the present application, the post-processing performed by the image signal processor 130 is not specifically limited, subject only to the constraint that it differ from the pre-processing performed by the image signal pre-processor 120 (that is, an optimization already performed by the image signal pre-processor 120 is not performed again by the image signal processor 130); within that constraint it can be configured by those skilled in the art according to actual needs. For example, the post-processing performed by the image signal processor 130 may include optimization processing such as strong light suppression, backlight compensation, color enhancement and lens shading correction, and may also include optimization processing not listed in this application.
After finishing the post-processing of the pre-processed image data, the image signal processor 130 correspondingly obtains post-processed image data, and counts to obtain second state information, and then sends the post-processed image data and the second state information to the application processor 140.
The application processor 140 is further configured to fuse the first state information and the second state information according to the second indication information to obtain target state information.
Alternatively, referring to fig. 5, a first connection and a second connection are established between the image signal pre-processor 120 and the application processor 140, and the image signal pre-processor 120 is further configured to send the first state information to the application processor 140 through the first connection and send the second indication information to the application processor 140 through the second connection.
For example, the image signal pre-processor 120 and the application processor 140 establish the first connection through a Serial Peripheral Interface (SPI) and establish the second connection through a General-Purpose Input/Output (GPIO) interface.
Optionally, in an embodiment, the image signal pre-processor 120 is further configured to:
analyzing according to the historical image data to obtain an analysis result;
and configuring the transmission mode as a first mode or a second mode according to the analysis result.
The historical image data is image data acquired before the image data. The image signal pre-processor 120 analyzes the historical image data acquired before the image data according to the configured analysis strategy to obtain an analysis result, and then configures the transmission mode as the first mode or the second mode according to the analysis result.
Illustratively, the historical image data is image data acquired by the image sensor 110 before the image data, and because the historical image data and the image data are acquired in succession, their image contents are substantially the same. The image signal pre-processor 120 obtains the preprocessing durations of a preset number of items of historical image data, predicts the preprocessing duration of the image data from these preprocessing durations, and determines whether the predicted preprocessing duration reaches a preset duration; if so, the transmission mode is configured as the second mode, and if not, the transmission mode is configured as the first mode. The preset duration can be set empirically by a person of ordinary skill in the art according to actual needs.
For example, when predicting the preprocessing time length of the image data from a preset number of preprocessing time lengths, the image signal preprocessor 120 may obtain an average processing time length of the preset number of preprocessing time lengths as the preprocessing time length of the image data.
For another example, when the preprocessing time length of the image data is predicted according to the preset number of preprocessing time lengths, the image signal preprocessor 120 may perform weighted summation on the preset number of preprocessing time lengths, and use the weighted summation as the preprocessing time length of the image data, where for the preset number of preprocessing time lengths, the weight is smaller if the corresponding historical image data is acquired earlier than the image data.
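Both prediction rules and the resulting mode selection can be expressed compactly as below; the recency weights and the preset duration threshold are illustrative assumptions:

```python
def predict_duration_average(history_ms):
    """Average of the preset number of historical preprocessing durations."""
    return sum(history_ms) / len(history_ms)

def predict_duration_weighted(history_ms):
    """Weighted sum in which older history gets a smaller weight; history_ms is
    ordered from oldest to newest and linear weights are assumed."""
    weights = [i + 1 for i in range(len(history_ms))]
    return sum(w * d for w, d in zip(weights, history_ms)) / sum(weights)

def choose_transmission_mode(history_ms, preset_ms: float = 33.0) -> str:
    """Configure the second mode when the predicted preprocessing duration
    reaches the preset duration, otherwise the first mode."""
    return "second" if predict_duration_weighted(history_ms) >= preset_ms else "first"
```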
Optionally, in an embodiment, there are a plurality of image sensors 110, and the application processor 140 is further configured to update the image acquisition parameters of each image sensor 110 synchronously.
In this embodiment, for each image sensor 110, a new image acquisition parameter is obtained correspondingly according to the calculation method of the image acquisition parameter provided in any one of the above embodiments, and then the image acquisition parameter of each image sensor is updated synchronously according to the new image acquisition parameter of each image sensor 110.
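A minimal sketch of the synchronous update, assuming a stage/commit style sensor interface that is not specified by the application:

```python
def update_sensors_synchronously(sensors, new_params_by_id):
    """Apply the newly calculated acquisition parameters to every image sensor
    in two passes so that all sensors switch together (assumed interface)."""
    for sensor in sensors:
        sensor.stage(new_params_by_id[sensor.id])   # buffer the new parameters
    for sensor in sensors:
        sensor.commit()                             # all sensors take effect together
```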
The present application further provides an image processing method applied to the electronic device provided in the present application, please refer to fig. 6 and fig. 1 in combination, and a flow of the image processing method may be as follows:
in 201, acquiring image data by the image sensor 110 according to the configured image acquisition parameters;
in 202, the image signal preprocessor 120 counts state information used for updating image acquisition parameters of the image sensor 110 in the image data to obtain first state information, and preprocesses the image data to obtain preprocessed image data;
in 203, counting state information used for updating image acquisition parameters of the image sensor 110 in the pre-processing image data by the image signal processor 130 to obtain second state information;
in 204, the application processor 140 fuses the first state information and the second state information to obtain target state information, and updates the image capturing parameters of the image sensor 110 according to the target state information.
In the embodiment of the present application, the image sensor 110 is connected to the image signal pre-processor 120, and is configured to acquire image data according to the configured image acquisition parameters, and transmit the acquired image data to the image signal pre-processor 120. It should be noted that the image data acquired by the image sensor 110 is image data in RAW format, and the acquired image data in RAW format is transmitted to the image signal pre-processor 120.
The image capturing parameters include, but are not limited to, exposure parameters, focusing parameters, white balance parameters, and the like. For example, before starting the image data acquisition, the application processor 140 configures initial exposure parameters to the image sensor 110, so that the image sensor 110 acquires image data according to the initial exposure parameters, and transmits the acquired image data to the image signal pre-processor 120.
It should be noted that, in the embodiment of the present application, a connection manner of the image signal pre-Processor 120 and the image sensor 110 is not particularly limited, for example, the image signal pre-Processor 120 and the image sensor 110 are connected by an MIPI (Mobile Industry Processor Interface).
The image sensor 110 packages the image data into a plurality of image data packets when transmitting the image data to the image signal pre-processor 120, and transmits the image data packets to the image signal pre-processor. Illustratively, an image data packet includes a header field, an end-of-packet field and a data field, wherein the header field and the end-of-packet field are used for filling some necessary control information, such as synchronization information, address information, error control information, etc., and the data field is used for filling the actual image content.
On the other hand, the image signal pre-processor 120 receives image data from the image sensor 110. In addition, the image signal pre-processor 120 is further connected to the image signal processor 130, wherein in the embodiment of the present application, the connection manner between the image signal processor 130 and the image signal pre-processor 120 is not particularly limited, for example, the image signal processor 130 and the image signal pre-processor 120 may also be connected by MIPI.
After receiving the image data transmitted from the image sensor 110, the image signal pre-processor 120 uses the image data to count state information, including but not limited to brightness information, sharpness information, contrast information, etc., required by the application processor 140 to update the image capturing parameters of the image sensor 110, and uses the state information counted by the image data as the first state information. In addition, the image signal preprocessor 120 preprocesses the image data from the image sensor 110 according to the configured preprocessing policy, so as to improve the image quality of the image data and obtain preprocessed image data accordingly. It should be noted that the preprocessing of the image data by the image signal preprocessor 120 does not change the format of the image data, i.e., the preprocessed image data obtained by the preprocessing is still in the RAW format.
After completing the pre-processing of the image data and obtaining pre-processed image data, the image signal pre-processor 120 transmits the pre-processed image data to the image signal processor 130.
After receiving the pre-processed image data from the image signal pre-processor 120, the image signal processor 130 uses the pre-processed image data to count state information, including but not limited to brightness information, sharpness information, and contrast information, required by the application processor 140 to update the image capturing parameters of the image sensor 110, and uses the state information counted by the pre-processed image data as second state information.
Also, for the first state information, the image signal pre-processor 120 may transmit the first state information to the application processor 140 directly or to the application processor 140 via the image signal processor 130.
After the image signal pre-processor 120 obtains the first state information through statistics, and the image signal processor 130 obtains the second state information through statistics, the application processor 140 further obtains a new state information by correspondingly fusing the first state information and the second state information according to the configured fusion policy, and records the new state information as the target state information. Therefore, the target state information carries the state information before and after the image data preprocessing to a certain extent, the shooting state of the actual shooting scene can be more accurately reflected, and the image acquisition parameters calculated by utilizing the target state information are more accurate. Accordingly, the application processor 140 calculates a new image acquisition parameter by using the target state information obtained by fusion according to the configured image acquisition parameter calculation algorithm (such as an automatic exposure algorithm, an automatic white balance algorithm, an automatic focusing algorithm, and the like).
In addition, the application processor 140 is also connected to the image sensor 110, and controls the image sensor 110 to start capturing image data and end capturing image data. After calculating the new image acquisition parameters, the application processor updates the image acquisition parameters of the image sensor 110 to the newly calculated image acquisition parameters. The above-mentioned steps are repeated, and the image acquisition parameters of the image sensor 110 are continuously updated until the image acquisition parameters of the image sensor 110 are converged, so as to obtain the best image acquisition effect.
It should be noted that, in the embodiment of the present application, the connection manner between the application processor 140 and the image sensor 110 is not particularly limited, and may be configured by a person of ordinary skill in the art according to practical situations, for example, in the embodiment of the present application, the application processor 140 and the image sensor 110 are connected through an I2C bus.
Optionally, in an embodiment, fusing, by the application processor 140, the first state information and the second state information to obtain the target state information includes:
acquiring, by the application processor 140, a first weight value corresponding to the first status information and a second weight value corresponding to the second status information;
the target state information is obtained by performing weighted summation by the application processor 140 according to the first state information, the first weight value, the second state information, and the second weight value.
The application further provides an optional fusion strategy for the first state information and the second state information. The application processor 140 fuses the first state information and the second state information by weighted summation, and takes the weighted result as the fused target state information, which can be expressed as:
S' = a * S1 + b * S2
where S' denotes the fused target state information, S1 denotes the first state information, S2 denotes the second state information, a denotes the weight assigned to the first state information, and b denotes the weight assigned to the second state information.
It should be noted that the embodiment of the present application does not specifically limit how the weights of the first state information and the second state information are configured; they may be configured by a person skilled in the art according to actual needs. For example, static weights may be configured for the second state information and the first state information, that is, fixed weights are assigned to them in advance; or dynamic weights may be configured, that is, the weights of the second state information and the first state information change dynamically according to a specified condition.
For example, the weight of the first state information may be set to 0.9, and the weight of the second state information may be set to 0.1;
for another example, the weights of the first state information and the second state information may be dynamically assigned according to the preprocessing performed by the image signal preprocessor, wherein the greater the difference between the images of the image data before and after preprocessing, the greater the weight of the first state information is, and the smaller the weight of the second state information is; the smaller the difference between the images of the image data before and after the pre-processing is, the smaller the weight of the first status information is configured, and the larger the weight of the second status information is configured.
It should be noted that the first status information and the second status information include status information of a plurality of different dimensions, and the application processor 140 performs fusion of the status information in each dimension according to the above fusion policy provided in the present application, for example, fusion of brightness in the first status information and brightness in the second status information, fusion of sharpness in the first status information and sharpness in the second status information, and the like.
Optionally, in an embodiment, the image processing method provided by the present application further includes:
transmitting the first state information and the pre-processed image data to the image signal processor 130 through the image signal pre-processor 120;
post-processing the pre-processed image data by the image signal processor 130 to obtain post-processed image data;
transmitting the post-processing image data, the first state information, the second state information, and the first indication information to the application processor 140 through the image signal processor 130;
and fusing the first state information and the second state information according to the first indication information by the application processor 140 to obtain target state information.
Referring to fig. 3, in the embodiment of the present application, the image signal preprocessor 120 statistically obtains the first state information of the image data from the image sensor 110, preprocesses the image data to obtain preprocessed image data, and then transmits the first state information and the preprocessed image data to the image signal processor 130. For example, the image signal pre-processor 120 transmits the first state information obtained by statistics and the pre-processed image data obtained by pre-processing to the image signal processor 130 based on the MIPI interface between the image signal pre-processor and the image signal processor 130.
After receiving the first state information and the pre-processed image data from the image signal pre-processor 120, the image signal processor 130 performs state statistics on the pre-processed image data to obtain second state information, and performs post-processing on the pre-processed image data to obtain post-processed image data.
In the embodiment of the present application, the post-processing performed by the image signal processor 130 is not specifically limited, subject only to the constraint that it differ from the pre-processing performed by the image signal pre-processor 120 (that is, an optimization already performed by the image signal pre-processor 120 is not performed again by the image signal processor 130); within that constraint it can be configured by those skilled in the art according to actual needs. For example, the post-processing performed by the image signal processor 130 may include optimization processing such as strong light suppression, backlight compensation, color enhancement and lens shading correction, and may also include optimization processing not listed in this application.
After finishing the post-processing of the pre-processed image data, the image signal processor 130 correspondingly obtains post-processed image data, counts the second state information, generates first indication information, and sends the post-processed image data, the first state information, the second state information, and the first indication information to the application processor 140.
Accordingly, the application processor 140 fuses the first state information and the second state information according to the first indication information to obtain the target state information.
Optionally, in an embodiment, the image processing method provided by the present application further includes:
previewing the post-processed image data and/or performing video encoding according to the post-processed image data by the application processor 140 when the post-processed image data is dynamic image data; or,
when the post-processing image data is a still image, image encoding is performed by the application processor 140 based on the post-processing image data.
It should be noted that the image type does not change as the image data is processed; that is, if the original image data is a still image, the pre-processed/post-processed image data obtained from it is also a still image, and if the original image data is a dynamic image, the pre-processed/post-processed image data obtained from it is also a dynamic image. A still image is, for example, a single frame captured in real time; a dynamic image is, for example, one frame of an image sequence acquired during preview, or one frame of an image sequence acquired during video recording.
Optionally, in an embodiment, the transmitting the first state information and the pre-processed image data to the image signal processor 130 through the image signal pre-processor 120 includes:
when the current transmission mode is configured as the first transmission mode, the first state information and the pre-processed image data are transmitted to the image signal processor 130 through the image signal pre-processor 120.
It should be noted that in the embodiment of the present application, two selectable transmission modes, namely a first transmission mode and a second transmission mode, are provided.
When the current transmission mode is configured as the first transmission mode (also called the synchronous transmission mode), the image signal pre-processor 120 transmits the first state information and the pre-processed image data to the image signal processor 130, so that the first state information is passed on to the application processor 140 through the image signal processor 130. In addition, the image signal processor 130 further transmits the post-processed image data obtained by post-processing the pre-processed image data, the second state information obtained by performing statistics on the pre-processed image data, and the generated first indication information to the application processor 140 along with the first state information.
Optionally, in an embodiment, the image processing method provided by the present application further includes:
transmitting the first state information and the second indication information to the application processor 140 through the image signal pre-processor 120 when the current transmission mode is configured as the second transmission mode;
and fusing the first state information and the second state information according to the second indication information by the application processor 140 to obtain target state information.
Referring to fig. 4, in the embodiment of the present application, the image signal pre-processor 120 is further connected to the application processor 140, and the image signal pre-processor 120 directly transmits the first state information to the application processor 140 when the current transmission mode is configured as the second transmission mode.
On the other hand, the image signal pre-processor 120 also transmits pre-processed image data obtained by pre-processing the image data to the image signal processor 130. Accordingly, the image signal processor 130 performs post-processing on the pre-processed image data to obtain post-processed image data, in addition to performing state statistics on the pre-processed image data to obtain second state information.
In the embodiment of the present application, the post-processing performed by the image signal processor 130 is not specifically limited, subject only to the constraint that it differ from the pre-processing performed by the image signal pre-processor 120 (that is, an optimization already performed by the image signal pre-processor 120 is not performed again by the image signal processor 130); within that constraint it can be configured by those skilled in the art according to actual needs. For example, the post-processing performed by the image signal processor 130 may include optimization processing such as strong light suppression, backlight compensation, color enhancement and lens shading correction, and may also include optimization processing not listed in this application.
After finishing the post-processing of the pre-processed image data, the image signal processor 130 correspondingly obtains post-processed image data, and counts to obtain second state information, and then sends the post-processed image data and the second state information to the application processor 140.
The application processor 140 is further configured to fuse the first state information and the second state information according to the second indication information to obtain target state information.
Optionally, in an embodiment, the image processing method provided by the present application further includes:
analyzing according to the historical image data to obtain an analysis result;
and configuring the transmission mode as a first mode or a second mode according to the analysis result.
The historical image data is image data acquired before the image data. The image signal pre-processor 120 analyzes the historical image data according to the configured analysis strategy to obtain an analysis result, and then configures the transmission mode as the first mode or the second mode according to the analysis result.
Illustratively, the historical image data is image data acquired by the image sensor 110 before the image data is acquired, and because the historical image data and the image data are acquired continuously, their image contents are substantially the same. The image signal pre-processor 120 obtains the pre-processing durations of a preset number of frames of historical image data, predicts the pre-processing duration of the acquired image data from these durations, and determines whether the predicted pre-processing duration reaches a preset duration; if so, the transmission mode is configured as the second mode, and if not, the transmission mode is configured as the first mode. The preset duration can be set empirically by those skilled in the art according to actual needs.
For example, when predicting the pre-processing duration of the image data from the preset number of pre-processing durations, the image signal pre-processor 120 may take the average of these durations as the predicted pre-processing duration of the image data.
For another example, when predicting the pre-processing duration of the image data from the preset number of pre-processing durations, the image signal pre-processor 120 may perform a weighted summation of these durations and use the result as the predicted pre-processing duration of the image data, where a smaller weight is assigned to a duration whose corresponding historical image data was acquired earlier relative to the image data.
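The two prediction strategies above (a simple average, and a weighted sum in which older historical frames receive smaller weights) and the resulting mode selection can be summarized in the following sketch; the decay factor, the normalization of the weights and the 8 ms threshold in the example are illustrative assumptions, not values from the disclosure.

```python
def predict_duration_average(history_ms):
    """Predict the next pre-processing duration as the mean of recent ones."""
    return sum(history_ms) / len(history_ms)

def predict_duration_weighted(history_ms, decay=0.5):
    """Weighted prediction: the older the historical frame, the smaller its weight.

    history_ms is ordered from oldest to newest; the exponential decay factor
    and the normalization by the weight total are assumptions.
    """
    weights = [decay ** (len(history_ms) - 1 - i) for i in range(len(history_ms))]
    total = sum(weights)
    return sum(w * d for w, d in zip(weights, history_ms)) / total

def choose_transmission_mode(history_ms, preset_duration_ms, weighted=True):
    """Configure the second mode if the predicted duration reaches the threshold."""
    predicted = (predict_duration_weighted(history_ms) if weighted
                 else predict_duration_average(history_ms))
    return "second" if predicted >= preset_duration_ms else "first"

# Example with three historical pre-processing durations (ms) and an assumed 8 ms threshold.
print(choose_transmission_mode([6.0, 7.5, 9.0], preset_duration_ms=8.0))
```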
Optionally, in an embodiment, there are a plurality of image sensors 110, and the image processing method provided by the present application further includes:
the image acquisition parameters of each image sensor 110 are updated synchronously by the application processor 140.
In this embodiment, for each image sensor 110, a new image acquisition parameter is obtained according to the calculation manner of the image acquisition parameter provided in any one of the above embodiments, and then the image acquisition parameters of the image sensors 110 are updated synchronously according to the new image acquisition parameter of each image sensor 110.
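To illustrate the synchronous update across a plurality of image sensors 110, the following sketch first computes a new parameter set for every sensor from its target state information and then applies all of them in a single pass; the two-phase structure, the parameter names and the luminance-to-exposure mapping are assumptions made for illustration.

```python
def compute_new_parameters(target_state):
    """Derive new acquisition parameters from target state information.

    The mapping below is a placeholder; the actual exposure/gain algorithms
    are not specified here.
    """
    mean_luma = target_state["mean_luma"]
    return {
        "exposure_us": 10000 if mean_luma < 100 else 5000,
        "analog_gain": 4.0 if mean_luma < 100 else 2.0,
    }

def update_sensors_synchronously(sensors, target_states):
    """Prepare every sensor's new parameters first, then apply them together."""
    prepared = {sensor_id: compute_new_parameters(state)
                for sensor_id, state in target_states.items()}
    # Apply in one pass so all sensors switch parameters for the same frame.
    for sensor_id, params in prepared.items():
        sensors[sensor_id].update(params)

class _StubSensor:
    def __init__(self, name):
        self.name = name
    def update(self, params):
        print(self.name, "now uses", params)

sensors = {"wide": _StubSensor("wide"), "tele": _StubSensor("tele")}
states = {"wide": {"mean_luma": 90.0}, "tele": {"mean_luma": 140.0}}
update_sensors_synchronously(sensors, states)
```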
The electronic device and the image processing method provided by the embodiments of the present application have been described in detail above. The principles and implementations of the present application are explained herein using specific examples, which are presented only to aid in understanding the present application. For those skilled in the art, variations may be made to the specific embodiments and to the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.
Claims (10)
1. An electronic device, comprising:
the image sensor is used for acquiring image data according to the configured image acquisition parameters;
the image signal pre-processor is used for counting state information used for updating image acquisition parameters of the image sensor in the image data to obtain first state information, and pre-processing the image data to obtain pre-processed image data;
the image signal processor is used for counting state information used for updating the image acquisition parameters of the image sensor in the pre-processed image data to obtain second state information;
and the application processor is used for fusing the first state information and the second state information to obtain target state information and updating the image acquisition parameters of the image sensor according to the target state information.
2. The electronic device of claim 1, wherein the application processor is configured to:
acquiring a first weight value corresponding to the first state information and acquiring a second weight value corresponding to the second state information;
and carrying out weighted summation according to the first state information, the first weight value, the second state information and the second weight value to obtain the target state information.
3. The electronic device of claim 1, wherein the image signal pre-processor is further configured to transmit the first state information and the pre-processed image data to the image signal processor;
the image signal processor is also used for carrying out post-processing on the pre-processed image data to obtain post-processed image data; and
sending the post-processed image data, the first state information, the second state information and first indication information to the application processor;
the application processor is further configured to obtain the target state information by fusing the first state information and the second state information according to the first indication information.
4. The electronic device of claim 3, wherein the image signal pre-processor is further configured to transmit the first status information and the pre-processed image data to the image signal processor when a current transmission mode is configured as a first transmission mode.
5. The electronic device of claim 4, wherein the image signal pre-processor is further configured to send the first status information and second indication information to the application processor when a current transmission mode is configured as a second transmission mode;
the application processor is further configured to obtain the target state information by fusing the first state information and the second state information according to the second indication information.
6. The electronic device of claim 4 or 5, wherein the image signal pre-processor is further configured to:
analyzing according to the historical image data to obtain an analysis result;
and configuring the transmission mode to be a first mode or a second mode according to the analysis result.
7. The electronic device of any of claims 1-5, wherein there are a plurality of the image sensors, and wherein the application processor is further configured to synchronously update the image acquisition parameters of each image sensor.
8. An image processing method applied to an electronic device, wherein the electronic device comprises an image sensor, an image signal pre-processor, an image signal processor and an application processor, the image processing method comprising:
acquiring image data through the image sensor according to the configured image acquisition parameters;
counting state information used for updating image acquisition parameters of the image sensor in the image data through the image signal pre-processor to obtain first state information, and pre-processing the image data to obtain pre-processed image data;
counting state information used for updating image acquisition parameters of the image sensor in the pre-processed image data through the image signal processor to obtain second state information;
and fusing the first state information and the second state information through the application processor to obtain target state information, and updating the image acquisition parameters of the image sensor according to the target state information.
9. The image processing method according to claim 8, further comprising:
transmitting the first state information and the pre-processed image data to the image signal processor through the image signal pre-processor when a current transmission mode is configured as a first transmission mode;
post-processing the pre-processed image data through the image signal processor to obtain post-processed image data; and sending the post-processed image data, the first state information, the second state information and first indication information to the application processor;
and fusing the first state information and the second state information according to the first indication information through the application processor to obtain the target state information.
10. The image processing method according to claim 8, further comprising:
transmitting the first state information and second indication information to the application processor through the image signal pre-processor when the current transmission mode is configured as a second transmission mode;
and fusing the first state information and the second state information according to the second indication information through the application processor to obtain the target state information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010682841.0A (granted as CN114007009B) | 2020-07-15 | 2020-07-15 | Electronic device and image processing method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114007009A | 2022-02-01 |
CN114007009B | 2023-08-18 |
Family
ID=79920183
Family Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010682841.0A (granted as CN114007009B, active) | 2020-07-15 | 2020-07-15 | Electronic device and image processing method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114007009B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140307117A1 (en) * | 2013-04-15 | 2014-10-16 | HTC Corporation | Automatic exposure control for sequential images |
CN107147837A (en) * | 2017-06-30 | 2017-09-08 | Vivo Mobile Communication Co., Ltd. | Shooting parameter setting method and mobile terminal |
CN108370414A (en) * | 2016-10-29 | 2018-08-03 | Huawei Technologies Co., Ltd. | Image capturing method and terminal |
WO2019056242A1 (en) * | 2017-09-21 | 2019-03-28 | Shenzhen Transsion Communication Co., Ltd. | Camera photographing parameter setting method for smart terminal, setting device, and smart terminal |
CN110022420A (en) * | 2019-03-13 | 2019-07-16 | Huazhong University of Science and Technology | CIS-based image scanning system, method and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN114007009B (en) | 2023-08-18 |
Similar Documents
Publication | Title |
---|---|
KR102310430B1 (en) | Filming method, apparatus and device | |
US11532076B2 (en) | Image processing method, electronic device and storage medium | |
CN111107276B (en) | Information processing apparatus, control method thereof, storage medium, and imaging system | |
JP7197981B2 (en) | Camera, terminal device, camera control method, terminal device control method, and program | |
EP4033750B1 (en) | Method and device for processing image, and storage medium | |
US20140092263A1 (en) | System and method for remotely performing image processing operations with a network server device | |
CN112822371A (en) | Image processing chip, application processing chip, data statistical system and method | |
JP4499908B2 (en) | Electronic camera system, electronic camera, server computer, and photographing condition correction method | |
CN105472263A (en) | Image capture method and image capture device with use of method | |
CN104811601B (en) | A kind of method and apparatus for showing preview image | |
CN114007009B (en) | Electronic device and image processing method | |
JP2021136461A (en) | Imaging device, control method, program, and storage medium | |
US10944899B2 (en) | Image processing device and image processing method | |
CN114143471B (en) | Image processing method, system, mobile terminal and computer readable storage medium | |
EP3043547B1 (en) | Imaging apparatus, video data transmitting apparatus, video data transmitting and receiving system, image processing method, and program | |
CN109309784B (en) | Mobile terminal | |
CN113747145B (en) | Image processing circuit, electronic apparatus, and image processing method | |
CN113766142B (en) | Image processing apparatus, image signal preprocessing module, device, and processing method | |
CN114677719A (en) | Method, apparatus and computer-readable storage medium for image signal processing | |
CN113873142A (en) | Multimedia processing chip, electronic device and dynamic image processing method | |
JP6514577B2 (en) | IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND IMAGING APPARATUS | |
CN108810416B (en) | Image processing method and terminal equipment | |
CN113792708A (en) | ARM-based remote target clear imaging system and method | |
JP2007081549A (en) | Imaging system | |
CN113763255A (en) | Image processing method, image processing device, storage medium and electronic equipment |
Legal Events
Code | Title |
---|---|
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
GR01 | Patent grant |