CN118247194A - Image processing method and device, computer readable medium and electronic equipment - Google Patents


Info

Publication number
CN118247194A
Authority
CN
China
Prior art keywords
image
brightness enhancement
target
original
processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211650451.0A
Other languages
Chinese (zh)
Inventor
杨周
蓝和
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202211650451.0A
Publication of CN118247194A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20172 Image enhancement details
    • G06T 2207/20208 High dynamic range [HDR] image processing

Landscapes

  • Image Processing (AREA)

Abstract

The disclosure provides an image processing method and device, a computer readable medium and electronic equipment, and relates to the technical field of image processing. The method comprises the following steps: acquiring original image data, and determining an original brightness enhancement guide map from the original image data; performing an image deformation process on the original image data to obtain target image data; performing synchronous alignment processing on the original brightness enhancement guide map based on the image deformation process to obtain a target brightness enhancement guide map; and generating an image to be displayed from the target image data and the target brightness enhancement guide map. Because the original brightness enhancement guide map undergoes the same alignment operations as the image deformation process applied to the original image data, the target brightness enhancement guide map and the target image data are aligned in spatial position, which effectively improves the accuracy of the target brightness enhancement guide map.

Description

Image processing method and device, computer readable medium and electronic equipment
Technical Field
The present disclosure relates to the field of image processing technology, and in particular, to an image processing method, an image processing apparatus, a computer readable medium, and an electronic device.
Background
Dynamic range refers to the range from darkest to brightest in an image, and is generally expressed in terms of the ratio of brightest to darkest. The larger the dynamic range, the closer the display effect of the image is to the visual effect of the real scene.
Standard dynamic range (Standard Dynamic Range, SDR) is a general video image standard that limits the display light intensity based on the brightness, contrast, and color characteristics of cathode ray tube displays. Currently, the maximum display brightness specified by the SDR standard is 100 nit, while the peak brightness of most display devices on the market exceeds 100 nit, typically 500-1000 nit. Therefore, in order to better adapt original SDR content to display devices with higher brightness, a scheme for converting the SDR content is required.
The conversion of SDR content can be realized by using a brightness enhancement guide map. However, directly using compressed RAW image data (on which only operations such as bit compression and RGB-to-grayscale conversion are performed) as the brightness enhancement guide map may result in low accuracy of the SDR conversion result and a poor restoration of the real light ratio.
Disclosure of Invention
The disclosure aims to provide an image processing method, an image processing device, a computer readable medium and electronic equipment, so as to improve the accuracy of a brightness enhancement guide map, thereby improving the accuracy of the conversion result of SDR content and the real light ratio restoration effect of an image.
According to a first aspect of the present disclosure, there is provided an image processing method including:
Acquiring original image data, and determining an original brightness enhancement guide map according to the original image data;
Performing an image deformation process on the original image data to obtain target image data;
Performing synchronous alignment processing on the original brightness enhancement guide map based on the image deformation flow to obtain a target brightness enhancement guide map;
And generating an image to be displayed according to the target image data and the target brightness enhancement guide map.
According to a second aspect of the present disclosure, there is provided an image processing apparatus including:
The image data acquisition module is used for acquiring original image data and determining an original brightness enhancement guide image according to the original image data;
The image data deformation module is used for carrying out an image deformation process on the original image data to obtain target image data;
The guide image alignment module is used for carrying out synchronous alignment processing on the original brightness enhancement guide image based on the image deformation flow to obtain a target brightness enhancement guide image;
And the image to be displayed generating module is used for generating an image to be displayed according to the target image data and the target brightness enhancement guide map.
According to a third aspect of the present disclosure, there is provided a computer readable medium having stored thereon a computer program which, when executed by a processor, implements the method described above.
According to a fourth aspect of the present disclosure, there is provided an electronic apparatus, comprising:
A processor; and
a memory for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the method described above.
According to the image processing method provided by the embodiments of the disclosure, an original brightness enhancement guide map can be determined from original image data, and an image deformation process can be performed on the original image data to obtain target image data; meanwhile, synchronous alignment processing is performed on the original brightness enhancement guide map based on the image deformation process to obtain a target brightness enhancement guide map, and an image to be displayed is generated from the target image data and the target brightness enhancement guide map. On the one hand, because the original brightness enhancement guide map undergoes the same deformation processing as the image deformation process of the original image data, the pixels of the resulting target brightness enhancement guide map correspond one-to-one in spatial position to the pixels of the target image data, which guarantees the accuracy of the brightness information in the target brightness enhancement guide map; consequently, when a display instruction for the image to be displayed is received, the target image data can be brightened by means of the target brightness enhancement guide map, which effectively improves the restoration effect and restoration accuracy of the real light ratio of the image and improves the display effect of the original image data after extended display. On the other hand, because the original brightness enhancement guide map is determined from the original image data before local nonlinear processing damages the light ratio, the brightness information of the image is preserved, improving the accuracy of the brightness information in the target brightness enhancement guide map.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort. In the drawings:
FIG. 1 illustrates a schematic diagram of an exemplary system architecture to which embodiments of the present disclosure may be applied;
Fig. 2 schematically illustrates a flowchart of an image processing method in an exemplary embodiment of the present disclosure;
fig. 3 schematically illustrates a flowchart of performing extended display on an image to be displayed in an exemplary embodiment of the present disclosure;
fig. 4 schematically illustrates a schematic diagram of implementing generation of an image to be displayed based on a mobile terminal in an exemplary embodiment of the present disclosure;
Fig. 5 schematically illustrates a composition diagram of an image processing apparatus in an exemplary embodiment of the present disclosure;
Fig. 6 shows a schematic diagram of an electronic device to which embodiments of the present disclosure may be applied.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 is a schematic diagram illustrating the system architecture of an exemplary application environment to which an image processing method and apparatus of an embodiment of the present disclosure may be applied.
Referring to fig. 1, a system architecture 100 may include an acquisition end 110, a processing end 120, and a display end 130. The capturing end 110 is a device, apparatus or module for capturing images, such as a stand-alone camera or a camera module built into an electronic device. The processing end 120 is a device, apparatus or module for processing the image acquired by the acquisition end 110, and may be an independent processing device (such as a server, a computer, a smart phone), or may be a processor in an electronic device, such as a main processor, an application processor, a graphics processor, an image signal processor, or the like. The display end 130 is a device, apparatus or module for displaying an image, and "displaying" herein refers to visual presentation of an image, not limited to displaying by means of a screen, such as projection, printing, etc., and thus, the display end 130 may be a display, a projector, a printer, or a display module built in an electronic device (such as a display screen of a smart phone). The acquisition end 110, the processing end 120 and the display end 130 can be connected through a wired or wireless communication link, and can also be connected with a specific hardware interface through a bus so as to perform data interaction.
In an alternative embodiment, the acquisition end 110 acquires raw image data and transmits it to the processing end 120. The processing end 120 obtains the original image data supplied by the acquisition end 110, determines an original brightness enhancement guide map from the original image data, and generates an image to be displayed by executing the image processing method in the present exemplary embodiment. Optionally, the image to be displayed may be sent to the display end 130, and the display end 130 performs extended display of the target image data based on the target brightness enhancement guide map in the image to be displayed, so that the dynamic range and contrast of the original image data are extended and the display effect of the original image data on the display end 130 is improved.
It should be understood that, although the system architecture 100 is divided into the acquisition end 110, the processing end 120, and the display end 130 in fig. 1, in essence, the system architecture 100 may be a single electronic device, and the acquisition end 110, the processing end 120, and the display end 130 may be different functional modules of the electronic device, for example, the system architecture may be a mobile terminal, such as a smart phone, a wearable device, and so on; then, the acquisition end 110 may be a front-end camera module or a rear-end camera module of the mobile terminal, the processing end 120 may be a main processor, an application processor, a graphics processor, an image signal processor, etc. of the mobile terminal, and the display end 130 may be a display screen of the mobile terminal, which is not limited in this exemplary embodiment.
As can be seen from the above, the main execution body of the image processing method may be the processing end 120 or the system architecture 100 itself, and for convenience of explanation, the present disclosure will be described taking the system architecture 100 as an example of executing the method by a mobile terminal.
Currently, the maximum display brightness specified by the SDR standard is 100 nit, while the peak brightness of most display devices on the market exceeds 100 nit, typically 500-1000 nit. Therefore, in order to better adapt original SDR content to display devices with higher brightness, a scheme for converting the SDR content is required.
In one technical solution, an HDR (High Dynamic Range) rendering and display system may support rendering and displaying both standard dynamic range (SDR) and HDR content to an HDR-enabled display as well as a standard display. The HDR rendering and display system uses display processing techniques that can preserve at least some HDR content even on standard displays, rendering digital image content into HDR space and mapping the rendered HDR content into the display space of the HDR or standard display. However, this scheme mainly covers the remapping and display of the HDR image format; for extended display of the SDR image format, highlights must be separated by a complex image recognition and processing mechanism, and the problem of inaccurate light ratio restoration remains.
In another technical solution, at least two to-be-processed images of a target scene are obtained, and the brightness of a first to-be-processed image among them is processed to obtain a second to-be-processed image; a first fusion process is performed on the first and second to-be-processed images to obtain a to-be-processed fusion image; and a second fusion process is performed on the to-be-processed fusion image and the remaining images, other than the first, of the at least two to-be-processed images, so as to obtain an enhanced fusion image. However, the SDR-to-HDR mapping provided by this scheme depends mainly on image analysis; its effect is affected by the accuracy of image recognition and segmentation, and many of its parameters are empirical values, so it cannot provide an accurate mapping scheme.
In another technical solution, the highlight threshold of a single-lens image is predicted by a machine learning method, and the image is divided into a diffuse reflection area and a specular highlight area by this threshold; since the brightness of the dark area is relatively stable across different lenses, the brightness threshold of the dark area is set empirically; different brightness mapping functions are then established for the dark area, the diffuse reflection area, the highlight area, and the transition areas between the brightness areas; and in the cells near the dark threshold and centered on the highlight threshold, a cubic spline interpolation function is used as the brightness mapping function, so that brightness transitions smoothly between areas. However, this solution completes HDR synthesis and dynamic range compression by exposing multiple times with multiple cameras and analyzing the histograms of the multiple exposures to determine proper exposure information; the final imaging container is still SDR, which can cause discomfort when viewed in an environment beyond SDR. In particular, during extended SDR display, the original dynamic information of a human face is lost, and after nonlinear compression the pixel values cannot show the best effect.
In another alternative, compressed RAW data (on which only operations such as bit compression and RGB-to-grayscale conversion are performed) may be directly used as the brightness enhancement guide map to convert the SDR content. However, since the image processing flow of the original image data includes deformation operations such as rotation and cropping, the pixels of the original image data no longer correspond in spatial position to the pixels of the brightness enhancement guide map, resulting in a poor display effect of the finally rendered image.
Based on one or more of the problems in the related art, the present disclosure provides an image processing method and an image processing apparatus that generate a target brightness enhancement guide map based on original image data and embed it into the image to be displayed. In this way, when the image to be displayed is rendered and displayed, its brightness can be enhanced according to the target brightness enhancement guide map stored within it, the dynamic range and contrast of the image to be displayed are extended in display, the restoration effect and restoration accuracy of the real light ratio of the image are effectively improved, and the display effect of the image is improved. An image processing method and an image processing apparatus according to exemplary embodiments of the present disclosure are specifically described below.
Fig. 2 shows a flowchart of an image processing method in the present exemplary embodiment, which may include the following steps S210 to S240:
In step S210, original image data is acquired, and an original brightness enhancement guidance map is determined from the original image data.
In an exemplary embodiment, the original image data refers to an image whose dynamic range can be expanded; for example, the original image data may be an SDR image or an LDR (Low Dynamic Range) image. Of course, the original image data may also be an HDR image whose dynamic range can be further extended: for example, if the acquisition end 110 supports acquiring HDR images with a maximum brightness of 400 nit and the display end 130 supports displaying HDR images with a maximum brightness of 1000 nit, the dynamic range of the HDR image acquired by the acquisition end 110 can be further extended at the display end 130, and such an image may serve as the original image data. The type of the original image data is not limited in any way in this example embodiment.
Specifically, the original image data may be Bayer RAW image data generated by collecting optical signals at the acquisition end during photographing, or a YUV image generated from a Bayer RAW image without local nonlinear processing; of course, the original image data may also be image data at any stage of the image signal processing (Image Signal Processing, ISP) flow that has not undergone local nonlinear processing, which is not limited in this exemplary embodiment.
Before the original image data undergoes local nonlinear processing in the image signal processing flow, the original brightness enhancement guide map can be constructed from the original image data. For example, multiple frames of 10/12/14-bit Bayer RAW data acquired by the acquisition end may be obtained, and frame selection, alignment and fusion may be performed on the Bayer RAW data to expand it into RAW image data with 16-bit depth serving as the original image data. In the original flow, black level alignment, linearization, edge vignetting correction, color matrix, gamma and other processing would then be performed on the 16-bit RAW image data to generate a 10-bit YUV image, which is passed to an IPE (Image Process Engine) for further processing. In this embodiment, a separate processing thread for the guide map may be set up, in which the 16-bit RAW image data, after black level alignment, linearization and edge vignetting correction, is converted into a 16-bit grayscale image containing only brightness information, which serves as the original brightness enhancement guide map. Of course, this is merely an illustrative example and should not be construed as limiting the exemplary embodiments in any way.
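The guide-map construction described above can be sketched in Python/NumPy. This is an illustrative sketch only: the function name, the black-level value, and the use of a plain mean for the grayscale step are assumptions, not details taken from the patent.

```python
import numpy as np

def build_guide_map(raw16, black_level=1024, vignette_gain=None):
    """Build a 16-bit grayscale brightness-enhancement guide map from
    fused 16-bit RAW data (sketch; parameter values are assumptions)."""
    # Black level alignment: subtract the sensor's black offset.
    img = np.clip(raw16.astype(np.float32) - black_level, 0, None)
    # Linearization is assumed already done by the multi-frame fusion.
    # Edge vignetting correction: multiply by a per-pixel gain map.
    if vignette_gain is not None:
        img = img * vignette_gain
    # Collapse to a single luminance channel (a real pipeline would
    # average over the Bayer quad; a plain mean is shown here).
    guide = img.mean(axis=-1) if img.ndim == 3 else img
    # Re-quantize to 16-bit grayscale, keeping only brightness information.
    return np.clip(guide, 0, 65535).astype(np.uint16)
```

The key point matching the text is that no gamma or other local nonlinear step is applied before the grayscale conversion, so the light ratio is preserved.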
In step S220, an image deformation process is performed on the original image data, so as to obtain target image data.
In an exemplary embodiment, the image deformation process refers to the series of image adjustment operations performed when processing the original image data into an output image. For example, the image deformation process may include at least one of: size cropping, optical distortion correction, edge portrait distortion correction, face-beautification and face-slimming correction, image rotation, and mirroring of the original image data.
The target image data refers to the image data obtained after the original image data is processed by an IPE (Image Process Engine); for example, if the original image data is RAW image data with 16-bit depth, the target image data may be the YUV image with 10-bit depth obtained after the 16-bit RAW image data is processed by the IPE.
In step S230, the original brightness enhancement guidance map is synchronously aligned based on the image deformation process, so as to obtain a target brightness enhancement guidance map.
In an exemplary embodiment, synchronous alignment processing refers to applying to the original brightness enhancement guide map the same image deformation operations that are applied to the original image data, in the same order and with the same number of invocations. By performing synchronous alignment processing on the original brightness enhancement guide map based on the image deformation process, the pixels of the target brightness enhancement guide map can be effectively aligned in spatial position with the pixels of the original image data, a one-to-one correspondence of pixels is ensured, and the accuracy of the light ratio information stored in the target brightness enhancement guide map is effectively improved.
Optionally, two independent image processing threads may be set up in the IPE: one image processing thread performs the normal image deformation process and nonlinear image processing on the original image data, while the other obtains the deformation operation parameters and the number of operations of the original image data in the image deformation process, so as to perform synchronous alignment processing on the original brightness enhancement guide map. Alternatively, instead of setting up two separate threads, both images may be fed into the original image processing thread: after each image deformation operation is applied to the original image data, the same operation is applied to the original brightness enhancement guide map, i.e., each deformation is performed twice within one algorithm.
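The record-and-replay idea behind this design can be sketched as follows; the operation names, logging format, and the concrete crop/rotation steps are illustrative assumptions, not the patent's actual IPE interface.

```python
import numpy as np

# Each deformation applied to the main image is logged (operation name
# plus parameters) and later replayed, in the same order and with the
# same parameters, on the guide map.
def crop(img, top, left, h, w):
    return img[top:top + h, left:left + w]

def rotate90(img, times=1):
    return np.rot90(img, times)

OPS = {"crop": crop, "rotate90": rotate90}

def process_main_image(img, log):
    """Apply an example deformation flow and record every operation."""
    img = crop(img, 2, 2, 4, 4); log.append(("crop", (2, 2, 4, 4)))
    img = rotate90(img, 1);      log.append(("rotate90", (1,)))
    return img

def replay_on_guide(guide, log):
    """Replay the recorded operations on the guide map unchanged."""
    for name, args in log:
        guide = OPS[name](guide, *args)
    return guide
```

Because the replay uses the recorded parameters rather than re-analyzing the (grayscale) guide map, both outputs stay aligned pixel-for-pixel, which is the point of passing algorithm A's parameters to algorithm B instead of calling A twice.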
In step S240, an image to be displayed is generated according to the target image data and the target brightness enhancement guidance map.
In an exemplary embodiment, after the target image data and the target brightness enhancement guide map are generated, the target brightness enhancement guide map may be embedded into the target image data as image metadata of the target image data to generate the image to be displayed. For example, the target brightness enhancement guide map may be embedded as Exif (Exchangeable Image File Format) metadata specific to the target image data and encoded into a JPEG or HEIF image to be displayed, which can still be displayed normally as SDR content.
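The embedding step can be sketched with a simplified container. The `GMAP` blob layout below is an invented stand-in for a real Exif/APP-segment embed, shown only to illustrate round-tripping the guide map together with the encoded image bytes; an SDR-only viewer that ignores trailing data would still decode the image normally.

```python
import struct
import numpy as np

def embed_guide_map(jpeg_bytes, guide):
    """Append a 16-bit guide map to encoded image bytes as a trailing
    metadata blob (illustrative container, not a real standard)."""
    h, w = guide.shape
    payload = guide.astype("<u2").tobytes()
    # magic, height, width, payload length (little-endian)
    header = b"GMAP" + struct.pack("<III", h, w, len(payload))
    return jpeg_bytes + header + payload

def extract_guide_map(blob):
    """Recover the original image bytes and the guide map."""
    idx = blob.rindex(b"GMAP")
    h, w, n = struct.unpack("<III", blob[idx + 4: idx + 16])
    guide = np.frombuffer(blob[idx + 16: idx + 16 + n], dtype="<u2")
    return blob[:idx], guide.reshape(h, w)
```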
The original brightness enhancement guide map undergoes the same deformation processing as the image deformation process of the original image data, so the pixels of the resulting target brightness enhancement guide map correspond one-to-one in spatial position to the pixels of the target image data, which guarantees the accuracy of the brightness information in the target brightness enhancement guide map. Consequently, when a display instruction for the image to be displayed is received, the target image data can be brightened by means of the target brightness enhancement guide map, which effectively improves the restoration effect and restoration accuracy of the real light ratio of the image and improves the display effect of the original image data after extended display. Meanwhile, because the original brightness enhancement guide map is determined from the original image data before local nonlinear processing damages the light ratio, the brightness information of the image is preserved, improving the accuracy of the brightness information in the target brightness enhancement guide map.
Next, step S210 to step S240 will be described in detail.
In an exemplary embodiment, after generating the image to be displayed based on the target image data and the target brightness enhancement guidance chart, if a display instruction of the image to be displayed is received, for example, the image to be displayed is requested to be displayed in an enlarged manner in an album of the smart phone, the display of the image to be displayed may be achieved through the steps shown in fig. 3, and referring to fig. 3, the method may specifically include:
Step S310, responding to an instruction for displaying the image to be displayed on a target display unit, and acquiring the target brightness enhancement guide graph from the image to be displayed;
Step S320, performing local brightening processing on the target image data based on the target brightness enhancement guidance map, and rendering and displaying the target image data after the local brightening processing to the target display unit, so as to realize expansion display of the image to be displayed on the target display unit.
The target display unit generally refers to a display device that does not support the HDR standard. For example, the target display unit may be a display device with a peak brightness above 100 nit but below 1000 nit; a display device with a peak brightness above 1000 nit that can only display SDR images with 8-bit depth; or a display device with a peak brightness above 1000 nit that supports images with 10-bit depth or more but cannot parse the HDR standard and only supports gamma mapping or other non-HDR electro-optical transfer functions. Of course, the target display unit is not limited to these examples.
The instruction for displaying the image to be displayed refers to a request to display the image to be displayed enlarged across the display device. For example, it may be an operation instruction of tapping a small preview of the image in an album to display the image full-screen; of course, it may also be an operation instruction for projecting or printing the image to be displayed, which is not particularly limited in this example embodiment.
When an instruction for displaying the image to be displayed on the target display unit is detected, the target brightness enhancement guide map can be obtained from the image to be displayed, and local brightening processing, i.e., local brightness enhancement, can then be performed on the target image data based on the target brightness enhancement guide map, where the target brightness enhancement guide map may include the brightness enhancement ratio of each pixel in the image to be displayed.
Through the synchronous alignment processing of the original brightness enhancement guide map, the target brightness enhancement guide map stores the real light ratio information, i.e., the brightness enhancement ratio, corresponding to each pixel of the image to be displayed. Therefore, the real brightness information corresponding to each pixel in the image to be displayed can be determined through the target brightness enhancement guide map, and the brightness of the target display unit can be adjusted according to this real brightness information, so that the dynamic range and contrast of the image to be displayed are extended on the target display unit, the restoration of the real light ratio of the image is ensured, and the display effect of the image to be displayed is improved.
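A minimal sketch of the display-time local brightening follows. It assumes the guide map stores a per-pixel enhancement ratio in [0, 1] and that the available headroom is the ratio of display peak brightness to SDR reference white; the gamma value and the linear blending rule are assumptions, since the patent only states that the guide map carries a per-pixel brightness enhancement ratio.

```python
import numpy as np

def brighten_for_display(sdr_rgb, gain_map, display_peak_nit=800.0,
                         sdr_white_nit=100.0):
    """Per-pixel local brightening at display time (illustrative)."""
    headroom = display_peak_nit / sdr_white_nit      # e.g. 8x at 800 nit
    # gain_map in [0, 1]: 0 keeps SDR brightness, 1 uses full headroom.
    scale = 1.0 + gain_map[..., None] * (headroom - 1.0)
    linear = (sdr_rgb.astype(np.float32) / 255.0) ** 2.2   # undo gamma
    boosted = np.clip(linear * scale, 0.0, headroom)
    return boosted   # linear-light values in units of SDR reference white
```

Dark regions (gain near 0) keep their SDR brightness while highlights (gain near 1) are pushed toward the display peak, which is how the stored light ratio expands the displayed dynamic range and contrast.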
In an exemplary embodiment, take as an example the case where two independent image processing threads are set up in the IPE to process the original image data and the original brightness enhancement guide map respectively. One image processing thread performs the normal image deformation flow, image nonlinear processing and the like on the original image data; its image processing algorithm is denoted A.
The other image processing thread runs an image processing algorithm denoted B. Unlike algorithm A, algorithm B needs to take the image deformation processing applied to the original image data in the IPE and apply the same operations, in the same order, to the original brightness enhancement guide map, so algorithm B needs to include at least analysis and processing stages.
Compared with the original image data (a YUV image), the original brightness enhancement guide map is a gray-scale map with a different image format, color range and tone, so some analysis modules may derive different deformation processing instructions from it. It is therefore required that image processing algorithm A can output the deformation parameters used for its own deformation, and that image processing algorithm B can be called with these parameters, rather than simply calling image processing algorithm A twice.
Specifically, deformation parameters in the image deformation process can be extracted, and the original brightness enhancement guide map is subjected to synchronous alignment processing based on the deformation parameters, so that the target brightness enhancement guide map is obtained.
The deformation parameters refer to parameters adopted by each image deformation operation when performing an image deformation process on the original image data, for example, if the image deformation operation is clipping processing, the corresponding deformation parameters may be a start point and an end point of two-dimensional image clipping, and if the image deformation operation is optical distortion correction processing, the corresponding deformation parameters may be roundness and intensity data of anti-distortion, and the example embodiment is not limited in this regard.
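The extraction and replay of deformation parameters described above might be sketched as follows; the operation names, parameter keys and handler set are illustrative assumptions, not the patent's actual interface:

```python
# Hypothetical sketch of "synchronous alignment": the main image flow
# emits an ordered list of (operation, parameters) pairs, and each
# operation is replayed unchanged on the brightness enhancement guide map.

def align_guide_map(mask, deform_ops):
    """mask: 2-D guide map (list of rows of enhancement ratios).
    deform_ops: ordered (name, params) pairs from the main deformation flow."""
    handlers = {
        "crop":   lambda m, p: [r[p["x0"]:p["x1"]] for r in m[p["y0"]:p["y1"]]],
        "mirror": lambda m, p: [r[::-1] for r in m] if p["flip"] else m,
    }
    for name, params in deform_ops:
        mask = handlers[name](mask, params)
    return mask

mask = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]
ops = [("crop", {"x0": 0, "x1": 2, "y0": 0, "y1": 2}),
       ("mirror", {"flip": 1})]
print(align_guide_map(mask, ops))  # → [[2.0, 1.0], [5.0, 4.0]]
```

Replaying the identical parameter list guarantees the one-to-one spatial correspondence between the guide map and the target image data that the later display step relies on.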
Alternatively, the image deformation processing includes clipping processing, for example scaling (zoom) of the original image data, or clipping the image captured at the size and resolution of the image Sensor (Sensor) down to the required size and resolution of the output image; other types of clipping in the image signal processing flow are of course possible, which is not limited in this example embodiment.
The deformation parameters transferred by the cropping process may be an image cropping start point and an image cropping end point, or may be a scaling ratio of the image, which is not limited in this example embodiment. Specifically, the image cropping start point and end point corresponding to the cropping process can be extracted, and synchronous cropping can then be performed on the original brightness enhancement guide map according to those points to obtain the target brightness enhancement guide map, ensuring that the target brightness enhancement guide map and the target image data remain consistent in image size.
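A minimal sketch of the synchronous cropping, assuming row-major 2-D lists and an (x, y) start/end convention (both illustrative choices): applying the same start and end points to the image and the guide map keeps their sizes consistent:

```python
def crop(image, start, end):
    """Crop a 2-D image (list of rows) using the start/end points
    extracted from the main flow's cropping operation."""
    (x0, y0), (x1, y1) = start, end
    return [row[x0:x1] for row in image[y0:y1]]

img  = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12]]
mask = [[1.0, 1.0, 2.0, 2.0], [1.0, 1.5, 2.0, 2.5], [1.0, 1.0, 1.0, 1.0]]
start, end = (1, 0), (3, 2)          # identical parameters for both
target_img  = crop(img, start, end)   # → [[2, 3], [6, 7]]
target_mask = crop(mask, start, end)
assert len(target_img) == len(target_mask)          # same height
assert len(target_img[0]) == len(target_mask[0])    # same width
```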
Alternatively, the image deformation processing may include optical distortion correction processing. Optical distortion correction corrects the imaging distortion introduced by the characteristics of the optical lens during imaging, such as a magnified image center, which is common in images captured by wide-angle cameras.
The deformation parameters transmitted by the optical distortion correction processing can be distortion correction roundness and distortion correction intensity, specifically, the distortion correction roundness and distortion correction intensity corresponding to the optical distortion correction processing can be extracted, and then the original brightness enhancement guide image can be subjected to synchronous optical distortion correction processing according to the distortion correction roundness and the distortion correction intensity to obtain a target brightness enhancement guide image, so that the consistency of the target brightness enhancement guide image and the pixel position of the target image data can be ensured.
Optionally, the image deformation processing may include portrait distortion correction processing, which may include edge portrait distortion correction and face-beautifying face-thinning correction. Edge portrait distortion correction handles the obvious, unnatural distortion that a face may exhibit when the face region lies at the edge of the image field of view; face-beautifying face-thinning correction is the corresponding face deformation applied when the user enables the face-beautifying face-thinning function.
The deformation parameter transferred by the portrait distortion correction processing may be a distortion correction grid; for example, the edge portrait distortion correction processing may use a portrait anti-distortion grid (FDC Grid Mesh), and the face-beautifying face-thinning correction processing may use a face-beautifying face-thinning grid (FB Morph Face Meshes). The type and representation of the distortion correction grid are not particularly limited in this example embodiment. Specifically, the distortion correction grid corresponding to the portrait distortion correction processing can be extracted; synchronous portrait distortion correction processing can then be performed on the original brightness enhancement guide map according to the distortion correction grid to obtain the target brightness enhancement guide map, ensuring the consistency of pixel positions between the target brightness enhancement guide map and the target image data.
Optionally, the image deformation processing may include rotation processing, for example, the user may take a horizontal shot, a vertical shot, a reverse shot, etc., and the rotation processing in the image processing procedure may correct the direction of the image.
The parameters transmitted by the rotation process can be rotation angles, and the rotation angles corresponding to the rotation process can be extracted; and then the original brightness enhancement guide image can be synchronously rotated according to the rotation angle to obtain the target brightness enhancement guide image, so that the consistency of the positions of the pixels of the target brightness enhancement guide image and the target image data can be ensured.
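The synchronous rotation can be sketched as follows for 90° multiples; the representation (list of rows, clockwise steps) is an assumption for illustration:

```python
def rotate90(image, times=1):
    """Rotate a 2-D image (list of rows) clockwise by 90° `times` times,
    replaying the rotation angle extracted from the main image flow."""
    for _ in range(times % 4):
        image = [list(row) for row in zip(*image[::-1])]
    return image

img  = [[1, 2], [3, 4]]
mask = [[1.0, 2.0], [1.5, 2.5]]
# Rotating both by the same angle keeps pixel positions aligned.
print(rotate90(img))   # → [[3, 1], [4, 2]]
print(rotate90(mask))  # → [[1.5, 1.0], [2.5, 2.0]]
```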
Optionally, the image deformation processing includes mirror processing; in some image processing algorithms mirroring may be performed on the image, for example in a front-camera shooting scene.
The deformation parameter transferred by the mirror processing may be a mirror flip parameter; for example, a mirror flip parameter of 0 may represent that no mirroring is applied, and 1 may represent that mirroring is applied. Specifically, the mirror parameter corresponding to the mirror processing can be extracted, and synchronous mirror processing can then be performed on the original brightness enhancement guide map according to the mirror parameter to obtain the target brightness enhancement guide map, ensuring the consistency of pixel positions between the target brightness enhancement guide map and the target image data.
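A sketch of the synchronous mirroring using the 0/1 flip parameter described above (horizontal flip assumed for illustration):

```python
def mirror(image, flag):
    """flag = 0: no mirroring; flag = 1: horizontal mirror flip.
    The same flag extracted from the main flow is replayed on the mask."""
    return [row[::-1] for row in image] if flag == 1 else image

mask = [[1.0, 2.0, 3.0]]
print(mirror(mask, 0))  # → [[1.0, 2.0, 3.0]]
print(mirror(mask, 1))  # → [[3.0, 2.0, 1.0]]
```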
In an exemplary embodiment, image debugging operations may be performed on the original brightness enhancement guidance map after the synchronization alignment process to obtain a target brightness enhancement guidance map; the image debugging operations may include an image gain operation and a gamma correction operation, among others.
For example, the original brightness enhancement guide map after the synchronous alignment processing may be subjected to GOG (Gain, Offset, Gamma) processing:
Because the original brightness enhancement guide map is derived from multi-frame-synthesized 16-bit RAW image data, the exposure control of the camera generally concentrates the linear gray levels of most pixels in the low range (10-12 bits), leaving considerable headroom at the high end. In this case, the data capacity can be used as fully as possible and more detail recorded, so that even images with a medium or low dynamic range have a good brightness-expansion gradient. Gain processing can therefore be performed on the original brightness enhancement guide map after the synchronous alignment processing so that the values occupy the range uniformly, and the Gain processing can be implemented according to the following relation:
Mask_out = Mask_in * Gain
Here, Mask_in may represent the original brightness enhancement guide map before the Gain processing, Mask_out may represent the original brightness enhancement guide map after the Gain processing, and Gain may represent an optimized gain, which may in practice be defined according to the image parameters of the original image data and the hardware parameters of the target display unit; this is not limited here.
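A hedged sketch of the Gain relation above, assuming 16-bit guide-map values with clamping at the 16-bit ceiling (the clamping choice is an assumption; the patent leaves the gain definition open):

```python
MAX_16BIT = 65535

def apply_gain(mask, gain):
    """Mask_out = Mask_in * Gain, clamped to the 16-bit range so the
    low-concentrated gray levels spread across the full data capacity."""
    return [min(round(v * gain), MAX_16BIT) for v in mask]

# Values clustered at the low end are stretched upward; values that
# would overflow saturate at the ceiling.
low_values = [1000, 2000, 3000, 60000]
print(apply_gain(low_values, 16.0))  # → [16000, 32000, 48000, 65535]
```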
Since the multi-frame fusion of the original image data has already undergone black level and linearization processing, the original brightness enhancement guide map obtained from it does not need Offset processing.
The Gamma processing needs to separate the highlight and shadow parts of the image data, distribute the gray information as uniformly as possible, and match the effect to the display rendering; one optional Gamma processing can be expressed as the following relation:
Mask_out = f_g(Mask_in, Gamma)
Here, Gamma may represent an effect parameter, which may be determined by the analysis algorithm of image processing algorithm B, and f_g() may represent an optimization function, which may be customized according to the actual situation; this is not limited here.
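One possible choice of f_g — a normalized power-law curve, which is an illustrative assumption rather than the patent's actual function — can be sketched as:

```python
MAX_16BIT = 65535

def apply_gamma(mask, gamma):
    """One candidate f_g: normalize to [0, 1], apply a power-law curve,
    and rescale; gamma < 1 lifts mid/low tones toward the highlights,
    spreading gray information more evenly."""
    return [round(((v / MAX_16BIT) ** gamma) * MAX_16BIT) for v in mask]

print(apply_gamma([0, 16384, 65535], 0.5))  # → [0, 32768, 65535]
```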
Fig. 4 schematically illustrates the generation of the image to be displayed based on a mobile terminal in an exemplary embodiment of the disclosure.
Referring to fig. 4, taking a mobile terminal performing the image processing method as an example, the mobile terminal may include at least an image Sensor (Sensor) 401, a preprocessing module (preview definition) 402, a RAW image fusion module (Bayer Processing Segment, BPS) 403, an Image Processing Engine (IPE) 404, a brightness enhancement guide map generation flow module 405, and a JPEG encoding module 406.
Specifically, the image sensor 401 may collect multiple frames of 10/12/14-bit Bayer RAW image data through an optical lens and a CMOS (Complementary Metal Oxide Semiconductor) sensor. During the image collection process of the image sensor 401, the preprocessing module 402 may generate image metadata; for example, the image metadata may include, but is not limited to, automatic exposure information (AE Info), HDR resolution information, face information, and automatic scene detection results. The image sensor 401 then transmits the collected multi-frame 10/12/14-bit Bayer RAW image data to the RAW image fusion module 403, which can perform frame picking, alignment and fusion on the data to obtain a RAW image expanded to 16-bit depth. This RAW image can be used as the original image data, and the 16-bit-depth original image data is sent to the brightness enhancement guide map generation flow module 405 to be processed into a gray-scale map with 16-bit depth as the original brightness enhancement guide map. Meanwhile, the RAW image fusion module 403 generates a 10-bit YUV image based on the multi-frame 10/12/14-bit Bayer RAW image data and transmits the 10-bit YUV image to the image processing engine 404 for processing.
The image processing engine 404 may perform an image deformation flow on the 10-bit YUV image, which may include cropping, optical distortion correction, edge portrait distortion correction, face-beautifying face-thinning correction, rotation processing, mirror processing, and the like. While the image processing engine 404 performs the image deformation flow on the 10-bit YUV image, the brightness enhancement guide map generation flow module 405 obtains the deformation parameters transferred by the image processing engine 404 and, based on those parameters, performs synchronous alignment processing on the original brightness enhancement guide map, that is, the gray-scale map with 16-bit depth. The original image data is processed by the image processing engine 404 to obtain the target image data, and the original brightness enhancement guide map is processed by the brightness enhancement guide map generation flow module 405 to obtain the target brightness enhancement guide map. Because the same image deformation processing is performed on both, the pixels in the target brightness enhancement guide map correspond one-to-one in spatial position to the pixels in the target image data; and because the original brightness enhancement guide map is obtained before the original image data undergoes local nonlinear processing, the brightness information of the image is fully preserved, ensuring that the target brightness enhancement guide map can restore the real light ratio information of the target image data in the target display unit.
After the image metadata, the target image data and the target brightness enhancement guide map are obtained, the JPEG encoding module 406 may embed the image metadata and the target brightness enhancement guide map into the target image data to obtain the image to be displayed in JPEG or HEIC format. In this way, after a display request for the image to be displayed is received, the target brightness enhancement guide map and the image metadata can be decoded from the image to be displayed, and the image to be displayed can then be displayed with an expanded range in the target display unit 408 according to the target brightness enhancement guide map and the image metadata.
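A simplified, hypothetical sketch of embedding and recovering the guide map and metadata; a real implementation would place them in a JPEG APPn segment or a HEIC metadata box, whereas this toy version appends a tagged payload to the encoded byte stream:

```python
import json
import struct

MARKER = b"HDRM"  # hypothetical application-segment tag

def embed(image_bytes, metadata, mask):
    """Append the metadata and guide map as a length-prefixed payload."""
    payload = json.dumps({"meta": metadata, "mask": mask}).encode()
    return image_bytes + MARKER + struct.pack(">I", len(payload)) + payload

def extract(blob):
    """Decode the metadata and guide map back out of the byte stream."""
    idx = blob.rindex(MARKER)
    (length,) = struct.unpack(">I", blob[idx + 4: idx + 8])
    payload = json.loads(blob[idx + 8: idx + 8 + length])
    return payload["meta"], payload["mask"]

# Round trip: what is embedded at encode time is recovered at display time.
blob = embed(b"\xff\xd8", {"ae": 1.0}, [[1.0, 2.0]])
meta, mask = extract(blob)
assert meta == {"ae": 1.0} and mask == [[1.0, 2.0]]
```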
In summary, in the present exemplary embodiment, the original brightness enhancement guide map may be determined according to the original image data, and the image deformation flow may be performed on the original image data to obtain the target image data; at the same time, synchronous alignment processing may be performed on the original brightness enhancement guide map based on the image deformation flow to obtain the target brightness enhancement guide map, and the image to be displayed may be generated from the target image data and the target brightness enhancement guide map. On the one hand, the original brightness enhancement guide map undergoes the same deformation processing as the image deformation flow of the original image data, so that the pixels of the resulting target brightness enhancement guide map correspond one-to-one in spatial position to the pixels of the target image data, ensuring the accuracy of the brightness information of the target brightness enhancement guide map; when a display instruction for the image to be displayed is received, the target image data can then be brightened through the target brightness enhancement guide map, which can effectively improve the restoration of the real light ratio of the image and the display effect of the original image data after expanded display. On the other hand, the original brightness enhancement guide map is determined from the original image data before the original image data undergoes the local nonlinear processing that would damage the light ratio, so the brightness information of the image is preserved and the accuracy of the brightness information in the target brightness enhancement guide map is improved.
It is noted that the above-described figures are merely schematic illustrations of processes involved in a method according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Further, referring to fig. 5, in the embodiment of the present example, there is further provided an image processing apparatus 500, including an image data acquisition module 510, an image data morphing module 520, a guidance map alignment module 530, and an image generation module 540 to be displayed. Wherein:
The image data acquisition module 510 is configured to acquire original image data, and determine an original brightness enhancement guide map according to the original image data;
The image data deformation module 520 is configured to perform an image deformation process on the original image data to obtain target image data;
The guide map alignment module 530 is configured to perform synchronous alignment processing on the original brightness enhancement guide map based on the image deformation process, so as to obtain a target brightness enhancement guide map;
The to-be-displayed image generating module 540 is configured to generate an to-be-displayed image according to the target image data and the target brightness enhancement guidance map.
In an exemplary embodiment, the image processing apparatus 500 further includes:
The target brightness enhancement guide image acquisition unit can be used for responding to the instruction of displaying the image to be displayed in the target display unit and acquiring the target brightness enhancement guide image from the image to be displayed;
And the expansion display unit can be used for carrying out local brightening processing on the target image data based on the target brightness enhancement guide graph and rendering and displaying the target image data subjected to the local brightening processing to the target display unit so as to realize expansion display of the image to be displayed on the target display unit.
In an exemplary embodiment, the boot graph alignment module 530 may be configured to:
extracting deformation parameters in the image deformation process;
And carrying out synchronous alignment processing on the original brightness enhancement guide map based on the deformation parameters to obtain a target brightness enhancement guide map.
In an exemplary embodiment, the image morphing process may include a cropping process, and the guide map alignment module 530 may be used to:
extracting an image cutting start point and an image cutting end point corresponding to the cutting processing;
And carrying out synchronous cutting processing on the original brightness enhancement guide graph according to the image cutting starting point and the image cutting ending point to obtain a target brightness enhancement guide graph.
In an exemplary embodiment, the image morphing process may include an optical distortion correction process, and the guide map alignment module 530 may be configured to:
extracting distortion correction roundness and distortion correction intensity corresponding to the optical distortion correction processing;
and carrying out synchronous optical distortion correction processing on the original brightness enhancement guide graph according to the distortion correction roundness and the distortion correction intensity to obtain a target brightness enhancement guide graph.
In an exemplary embodiment, the image morphing process may include a portrait distortion correction process, and the guide map alignment module 530 may be used to:
extracting a distortion correction grid corresponding to the portrait distortion correction processing;
and carrying out synchronous portrait distortion correction processing on the original brightness enhancement guide map according to the distortion correction grid to obtain a target brightness enhancement guide map.
In an exemplary embodiment, the image morphing process may include a rotation process, and the guide map alignment module 530 may be used to:
Extracting a rotation angle corresponding to the rotation treatment;
And carrying out synchronous rotation processing on the original brightness enhancement guide graph according to the rotation angle to obtain a target brightness enhancement guide graph.
In an exemplary embodiment, the image morphing process may include a mirroring process, and the boot graph alignment module 530 may be configured to:
extracting mirror image parameters corresponding to the mirror image processing;
And carrying out synchronous mirror image processing on the original brightness enhancement guide image according to the mirror image parameters to obtain a target brightness enhancement guide image.
In an exemplary embodiment, the image processing apparatus 500 may also be used to:
Performing image debugging operation on the original brightness enhancement guide image subjected to synchronous alignment treatment to obtain a target brightness enhancement guide image;
Wherein the image debugging operations include an image gain operation and a gamma correction operation.
The specific details of each module in the above apparatus are already described in the method section, and the details that are not disclosed can be referred to the embodiment of the method section, so that they will not be described in detail.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, various aspects of the disclosure may be embodied in the following forms, namely: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module" or "system."
Exemplary embodiments of the present disclosure also provide an electronic device that may include a processor and a memory for storing executable instructions of the processor, the processor configured to perform the above-described image processing method via execution of the executable instructions. The electronic device may be provided with a system architecture 100, the construction of which is exemplified below by a mobile terminal 600 in fig. 6. It will be appreciated by those skilled in the art that, apart from components specific to mobile purposes, the configuration of fig. 6 can also be applied to stationary devices.
As shown in fig. 6, the mobile terminal 600 may specifically include: processor 601, memory 602, bus 603, mobile communication module 604, antenna 1, wireless communication module 605, antenna 2, display 606, camera module 607, audio module 608, power module 609, and sensor module 610.
The processor 601 may include one or more processing units, such as: an AP (Application Processor), a modem processor, a GPU (Graphics Processing Unit), an ISP (Image Signal Processor), a controller, an encoder, a decoder, a DSP (Digital Signal Processor), a baseband processor and/or an NPU (Neural-Network Processing Unit), and the like.
An encoder may encode (i.e., compress) an image or video to reduce the data size for storage or transmission, and a decoder may decode (i.e., decompress) the encoded data to recover the image or video. The mobile terminal 600 may support one or more encoders and decoders, for example for image formats such as JPEG (Joint Photographic Experts Group), PNG (Portable Network Graphics) and BMP (Bitmap), and video formats such as MPEG (Moving Picture Experts Group) 1, MPEG2, H.263, H.264 and HEVC (High Efficiency Video Coding).
The processor 601 may form a connection with the memory 602 or other components through a bus 603.
Memory 602 may be used to store computer-executable program code that includes instructions. The processor 601 performs various functional applications of the mobile terminal 600 and data processing by executing instructions stored in the memory 602. The memory 602 may also store application data, such as files storing images, videos, and the like.
The communication functions of the mobile terminal 600 may be implemented by the mobile communication module 604, the antenna 1, the wireless communication module 605, the antenna 2, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. The mobile communication module 604 may provide a mobile communication solution of 3G, 4G, 5G, etc. applied on the mobile terminal 600. The wireless communication module 605 may provide wireless communication solutions for wireless local area networks, bluetooth, near field communications, etc., as applied on the mobile terminal 600.
The display 606 is used to implement display functions such as displaying user interfaces, images, video, and the like. The camera module 607 is used for implementing shooting functions, such as shooting images, videos, etc. The audio module 608 is used to implement audio functions, such as playing audio, capturing speech, etc. The power module 609 is used to implement power management functions such as charging a battery, powering a device, monitoring battery status, etc.
The sensor module 610 may include one or more sensors for implementing corresponding sensing functions. For example, the sensor module 610 may include an exposure sensor for detecting the ambient brightness of the mobile terminal 600 at the time of photographing and outputting automatic exposure parameters.
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the method described above in the present specification. In some possible implementations, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the disclosure as described in the "exemplary methods" section of this specification, when the program product is run on the terminal device.
It should be noted that the computer readable medium shown in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Furthermore, the program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (12)

1. An image processing method, comprising:
Acquiring original image data, and determining an original brightness enhancement guide map according to the original image data;
Performing an image deformation process on the original image data to obtain target image data;
Performing synchronous alignment processing on the original brightness enhancement guide map based on the image deformation flow to obtain a target brightness enhancement guide map;
And generating an image to be displayed according to the target image data and the target brightness enhancement guide map.
2. The method according to claim 1, wherein the method further comprises:
Responding to an instruction for displaying the image to be displayed on a target display unit, and acquiring the target brightness enhancement guide graph from the image to be displayed;
and carrying out local brightening treatment on the target image data based on the target brightness enhancement guide map, and rendering and displaying the target image data subjected to the local brightening treatment to the target display unit so as to realize the expanded display of the image to be displayed on the target display unit.
3. The method of claim 1, wherein the performing synchronous alignment processing on the original brightness enhancement guidance map based on the image deformation process to obtain a target brightness enhancement guidance map includes:
extracting deformation parameters in the image deformation process;
and performing synchronous alignment processing on the original brightness enhancement guide map based on the deformation parameters to obtain the target brightness enhancement guide map.
4. The method according to claim 3, wherein the image deformation process comprises a cropping process, and the extracting deformation parameters in the image deformation process comprises:
extracting an image cropping start point and an image cropping end point corresponding to the cropping process;
and the performing synchronous alignment processing on the original brightness enhancement guide map based on the deformation parameters to obtain the target brightness enhancement guide map comprises:
performing synchronous cropping processing on the original brightness enhancement guide map according to the image cropping start point and the image cropping end point to obtain the target brightness enhancement guide map.
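Synchronous cropping can be sketched as applying the same crop rectangle to the guide map. The `scale` parameter is an assumption covering the common case where the guide map is stored at a lower resolution than the full image:

```python
import numpy as np

def sync_crop(guide, start, end, scale):
    # start/end are (row, col) crop corners in full-image coordinates;
    # scale is the guide-map resolution relative to the image (e.g. 0.5).
    r0, c0 = int(start[0] * scale), int(start[1] * scale)
    r1, c1 = int(end[0] * scale), int(end[1] * scale)
    return guide[r0:r1, c0:c1]

# 16x16 image cropped to rows 4..12, cols 2..10; guide map is 8x8 (scale 0.5)
guide = np.arange(64, dtype=np.float32).reshape(8, 8)
cropped = sync_crop(guide, start=(4, 2), end=(12, 10), scale=0.5)
assert cropped.shape == (4, 4)
assert cropped[0, 0] == guide[2, 1]
```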
5. The method according to claim 3, wherein the image deformation process comprises an optical distortion correction process, and the extracting deformation parameters in the image deformation process comprises:
extracting a distortion correction roundness and a distortion correction intensity corresponding to the optical distortion correction process;
and the performing synchronous alignment processing on the original brightness enhancement guide map based on the deformation parameters to obtain the target brightness enhancement guide map comprises:
performing synchronous optical distortion correction processing on the original brightness enhancement guide map according to the distortion correction roundness and the distortion correction intensity to obtain the target brightness enhancement guide map.
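A sketch of a radial remap driven by the two claimed parameters. The interpretation of "roundness" (blend between circular and elliptical radius) and "intensity" (strength of the radial scaling), and the nearest-neighbour sampling, are all illustrative assumptions:

```python
import numpy as np

def radial_correct(guide, strength, roundness):
    # Inverse radial remap with nearest-neighbour sampling.
    h, w = guide.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ny, nx = (ys - cy) / cy, (xs - cx) / cx          # normalized coords in [-1, 1]
    # roundness blends a circular radius with a box-like radius (assumption)
    r2 = roundness * (nx**2 + ny**2) + (1 - roundness) * np.maximum(nx**2, ny**2)
    factor = 1.0 + strength * r2
    src_y = np.clip((ny * factor * cy + cy).round().astype(int), 0, h - 1)
    src_x = np.clip((nx * factor * cx + cx).round().astype(int), 0, w - 1)
    return guide[src_y, src_x]

guide = np.random.rand(9, 9).astype(np.float32)
out = radial_correct(guide, strength=0.1, roundness=1.0)
assert out.shape == guide.shape
assert out[4, 4] == guide[4, 4]  # centre pixel is a fixed point of the remap
```

In production this step would typically share the exact remap table already computed for the image, so the guide map cannot drift out of alignment.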
6. The method according to claim 3, wherein the image deformation process comprises a portrait distortion correction process, and the extracting deformation parameters in the image deformation process comprises:
extracting a distortion correction grid corresponding to the portrait distortion correction process;
and the performing synchronous alignment processing on the original brightness enhancement guide map based on the deformation parameters to obtain the target brightness enhancement guide map comprises:
performing synchronous portrait distortion correction processing on the original brightness enhancement guide map according to the distortion correction grid to obtain the target brightness enhancement guide map.
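Grid-driven correction amounts to resampling the guide map through the same per-pixel source-coordinate grid produced by the portrait-correction solver. A minimal nearest-neighbour version (the dense-grid representation is an assumption; real pipelines often store a sparse mesh and interpolate it):

```python
import numpy as np

def grid_warp(guide, grid_y, grid_x):
    # grid_y/grid_x give, for every output pixel, the source coordinates
    # produced by the distortion-correction solver.
    h, w = guide.shape
    sy = np.clip(grid_y.round().astype(int), 0, h - 1)
    sx = np.clip(grid_x.round().astype(int), 0, w - 1)
    return guide[sy, sx]

guide = np.arange(16, dtype=np.float32).reshape(4, 4)
ys, xs = np.mgrid[0:4, 0:4].astype(np.float32)
# identity grid: the warp reproduces the input unchanged
assert np.array_equal(grid_warp(guide, ys, xs), guide)
```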
7. The method according to claim 3, wherein the image deformation process comprises a rotation process, and the extracting deformation parameters in the image deformation process comprises:
extracting a rotation angle corresponding to the rotation process;
and the performing synchronous alignment processing on the original brightness enhancement guide map based on the deformation parameters to obtain the target brightness enhancement guide map comprises:
performing synchronous rotation processing on the original brightness enhancement guide map according to the rotation angle to obtain the target brightness enhancement guide map.
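Synchronous rotation, restricted here to right-angle rotations for brevity (arbitrary angles would need a resampling warp applied identically to both buffers):

```python
import numpy as np

def sync_rotate(image, guide, angle_deg):
    # Rotate image and guide map by the same multiple of 90 degrees.
    k = (angle_deg // 90) % 4
    return np.rot90(image, k), np.rot90(guide, k)

img = np.random.randint(0, 256, (3, 5, 3), dtype=np.uint8)
guide = np.random.rand(3, 5).astype(np.float32)
r_img, r_guide = sync_rotate(img, guide, 90)
assert r_img.shape[:2] == r_guide.shape == (5, 3)
```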
8. The method according to claim 3, wherein the image deformation process comprises a mirroring process, and the extracting deformation parameters in the image deformation process comprises:
extracting mirroring parameters corresponding to the mirroring process;
and the performing synchronous alignment processing on the original brightness enhancement guide map based on the deformation parameters to obtain the target brightness enhancement guide map comprises:
performing synchronous mirroring processing on the original brightness enhancement guide map according to the mirroring parameters to obtain the target brightness enhancement guide map.
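Synchronous mirroring is a flip along the same axis for both buffers (the horizontal/vertical flag stands in for the claimed mirroring parameters):

```python
import numpy as np

def sync_mirror(image, guide, horizontal=True):
    # Flip columns for a horizontal mirror, rows for a vertical mirror.
    axis = 1 if horizontal else 0
    return np.flip(image, axis=axis), np.flip(guide, axis=axis)

img = np.arange(12, dtype=np.uint8).reshape(2, 2, 3)
guide = np.array([[0.1, 0.9], [0.2, 0.8]])
m_img, m_guide = sync_mirror(img, guide)
assert m_guide[0, 0] == 0.9  # columns swapped in the guide map as well
```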
9. The method according to claim 1, wherein the method further comprises:
performing an image debugging operation on the original brightness enhancement guide map subjected to the synchronous alignment processing to obtain the target brightness enhancement guide map;
wherein the image debugging operation comprises an image gain operation and a gamma correction operation.
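The gain and gamma debugging operations on a normalized guide map can be sketched as below; the order (gain before gamma) and the default values are assumptions for illustration:

```python
import numpy as np

def debug_guide(guide, gain=1.2, gamma=0.8):
    # Scale the guide map, clip to [0, 1], then apply gamma correction.
    return np.clip(guide * gain, 0.0, 1.0) ** gamma

guide = np.array([[0.0, 0.5, 1.0]])
out = debug_guide(guide)
assert out[0, 0] == 0.0 and out[0, 2] == 1.0
assert 0.5 < out[0, 1] < 1.0  # mid-tones lifted by gain and gamma < 1
```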
10. An image processing apparatus, comprising:
an image data acquisition module, configured to acquire original image data and determine an original brightness enhancement guide map according to the original image data;
an image data deformation module, configured to perform an image deformation process on the original image data to obtain target image data;
a guide map alignment module, configured to perform synchronous alignment processing on the original brightness enhancement guide map based on the image deformation process to obtain a target brightness enhancement guide map;
and an image-to-be-displayed generation module, configured to generate an image to be displayed according to the target image data and the target brightness enhancement guide map.
11. A computer-readable medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1 to 9.
12. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any one of claims 1 to 9 via execution of the executable instructions.
CN202211650451.0A 2022-12-21 2022-12-21 Image processing method and device, computer readable medium and electronic equipment Pending CN118247194A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211650451.0A CN118247194A (en) 2022-12-21 2022-12-21 Image processing method and device, computer readable medium and electronic equipment


Publications (1)

Publication Number Publication Date
CN118247194A 2024-06-25

Family

ID=91563204


Country Status (1)

Country Link
CN (1) CN118247194A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination