CN118214815A - Image display method and device, computer readable medium and electronic equipment - Google Patents


Info

Publication number
CN118214815A
CN118214815A
Authority
CN
China
Prior art keywords
image
image data
displayed
target
brightness enhancement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211615691.7A
Other languages
Chinese (zh)
Inventor
杨周
苗守飞
杨新勤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202211615691.7A
Publication of CN118214815A
Legal status: Pending


Landscapes

  • Image Processing (AREA)

Abstract

The disclosure provides an image display method and device, a computer readable medium, and electronic equipment, relating to the technical field of image processing. The method comprises the following steps: acquiring an image to be displayed; in response to an instruction to display the image on a target display unit, extracting a brightness enhancement guide map and target image data from the image; performing local brightness enhancement processing on the target image data based on the guide map; and rendering the enhanced target image data to the target display unit, thereby realizing an expanded display of the image on the target display unit. By expanding the display of the image through the brightness enhancement guide map, the image can restore the real light ratio of the scene on the display device, improving the display effect while effectively reducing the amount of computation.

Description

Image display method and device, computer readable medium and electronic equipment
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image display method, an image display device, a computer readable medium, and an electronic apparatus.
Background
Dynamic range refers to the span from the darkest to the brightest parts of an image, and is generally expressed as the ratio of the brightest value to the darkest. The larger the dynamic range, the closer the displayed image comes to the visual effect of the real scene.
Standard dynamic range (SDR) is a generic video image standard that limits display light intensity based on the brightness, contrast, and color characteristics of cathode ray tube displays. The maximum display brightness specified by the SDR standard is 100 nit, while the peak brightness of most display devices on the market exceeds 100 nit, typically 500-1000 nit. Therefore, to better adapt original SDR content to such higher-brightness display devices, a scheme for expanding the dynamic range of displayed SDR images is required.
Disclosure of Invention
The present disclosure aims to provide an image display method, an image display device, a computer-readable medium, and an electronic apparatus, thereby realizing an expanded-dynamic-range display of an SDR image on a display unit that does not support the HDR standard.
According to a first aspect of the present disclosure, there is provided an image display method including:
acquiring an image to be displayed;
extracting a brightness enhancement guide map and target image data from the image to be displayed in response to an instruction to display the image to be displayed on a target display unit;
performing local brightness enhancement processing on the target image data based on the brightness enhancement guide map;
and rendering and displaying the target image data subjected to the local brightness enhancement processing to the target display unit, so as to realize the expanded display of the image to be displayed on the target display unit.
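The four claimed steps can be sketched end to end as follows; the dictionary container and the helper names are hypothetical illustrations for clarity, not the patent's actual data format:

```python
import numpy as np

def extract_guide_and_target(image_to_display):
    # Step 2: split the container into target image data and the embedded guide map.
    return image_to_display["guide_map"], image_to_display["target"]

def local_brightness_enhance(target, guide_map):
    # Step 3: each pixel gets its own gain from the guide map.
    return target * guide_map

def display_on_target_unit(image_to_display):
    # Steps 2-4 in order; step 1 (acquiring the image) produced the container.
    guide_map, target = extract_guide_and_target(image_to_display)
    return local_brightness_enhance(target, guide_map)

# A toy "image to be displayed": a 2x2 luma plane plus a per-pixel gain map.
container = {"target": np.array([[0.2, 0.5], [0.8, 1.0]]),
             "guide_map": np.array([[1.0, 1.0], [2.0, 3.0]])}
out = display_on_target_unit(container)
# bright regions are boosted (0.8 -> 1.6, 1.0 -> 3.0); dark pixels keep their value
```

The key point of the design is that the gain is stored per pixel rather than derived from the pixel value itself, so the enhancement is local rather than global.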
According to a second aspect of the present disclosure, there is provided an image display apparatus including:
the image acquisition module is used for acquiring an image to be displayed;
the guide image extraction module is used for extracting a brightness enhancement guide map and target image data from the image to be displayed in response to an instruction to display the image to be displayed on a target display unit;
the local brightness enhancement module is used for performing local brightness enhancement processing on the target image data based on the brightness enhancement guide map;
and the image expansion display module is used for rendering and displaying the target image data subjected to the local brightness enhancement processing to the target display unit, so as to realize the expanded display of the image to be displayed on the target display unit.
According to a third aspect of the present disclosure, there is provided a computer readable medium having stored thereon a computer program which, when executed by a processor, implements the method described above.
According to a fourth aspect of the present disclosure, there is provided an electronic apparatus, comprising:
A processor; and
a memory for storing one or more programs that, when executed by the processor, cause the processor to implement the method described above.
According to the image display method provided by the embodiments of the disclosure, when an instruction to display an image on a target display unit is received, a brightness enhancement guide map and target image data are extracted from the image to be displayed; local brightness enhancement processing is performed on the target image data based on the guide map; and the enhanced target image data is rendered and displayed to the target display unit, so that the image is displayed with an expanded dynamic range. Because the guide map embedded in advance in the image is decoded and used to drive local brightness enhancement of the target image data, the dynamic range and contrast of the image can be expanded even on a target display unit that does not support the HDR standard, the real light ratio information of the image is accurately restored, and the display effect of the image is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort. In the drawings:
FIG. 1 illustrates a schematic diagram of an exemplary system architecture to which embodiments of the present disclosure may be applied;
fig. 2 schematically illustrates a flowchart of an image display method in an exemplary embodiment of the present disclosure;
FIG. 3 schematically illustrates a flow diagram for generating an image to be displayed in an exemplary embodiment of the present disclosure;
Fig. 4 schematically illustrates a schematic diagram of a mobile terminal-based implementation of generating and displaying an image to be displayed in an exemplary embodiment of the present disclosure;
FIG. 5 schematically illustrates a simulation effect diagram for simulating different brightness enhancement modes in an exemplary embodiment of the disclosure;
Fig. 6 schematically illustrates a composition diagram of an image display apparatus in an exemplary embodiment of the present disclosure;
fig. 7 shows a schematic diagram of an electronic device to which embodiments of the present disclosure may be applied.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 is a schematic diagram illustrating a system architecture of an exemplary application environment to which an image display method and apparatus of an embodiment of the present disclosure may be applied.
Referring to fig. 1, a system architecture 100 may include an acquisition end 110, a processing end 120, and a display end 130. The acquisition end 110 is a device, apparatus, or module for capturing images, such as a stand-alone camera or a camera module built into an electronic device. The processing end 120 is a device, apparatus, or module for processing the images captured by the acquisition end 110; it may be an independent processing device (such as a server, a computer, or a smart phone), or a processor in an electronic device, such as a main processor, an application processor, a graphics processor, or an image signal processor. The display end 130 is a device, apparatus, or module for displaying an image. "Displaying" here refers to any visual presentation of an image, not only display on a screen but also projection, printing, and the like; thus, the display end 130 may be a display, a projector, a printer, or a display module built into an electronic device (such as the display screen of a smart phone). The acquisition end 110, the processing end 120, and the display end 130 may be connected through wired or wireless communication links, or may be connected through a bus or a dedicated hardware interface, so as to perform data interaction.
In an alternative embodiment, the raw image data may be acquired by the acquisition end 110 in advance and transmitted to the processing end 120. The processing end 120 acquires the original image data provided by the acquisition end 110, and determines a brightness enhancement guide map according to the original image data, so as to generate an image to be displayed. In this way, when an instruction of displaying an image to be displayed by the target display unit is received, the image to be displayed may be sent to the display end 130 by the image display method in this embodiment, and the display end 130 expands and displays the target image data based on the brightness enhancement guidance chart in the image to be displayed, so as to realize expanding and displaying the dynamic range and contrast of the original image data, and improve the display effect of the image to be displayed on the display end 130.
It should be understood that, although fig. 1 divides the system architecture 100 into the acquisition end 110, the processing end 120, and the display end 130, the system architecture 100 may in fact be a single electronic device in which the three ends are different functional modules. For example, the system architecture may be a mobile terminal such as a smart phone or a wearable device; the acquisition end 110 may then be a front or rear camera module of the mobile terminal, the processing end 120 may be the main processor, application processor, graphics processor, or image signal processor of the mobile terminal, and the display end 130 may be the display screen of the mobile terminal, which is not limited in this exemplary embodiment.
As can be seen from the above, the main execution body of the image display method may be the processing end 120 or the system architecture 100 itself, and for convenience of explanation, the present disclosure will be described taking the system architecture 100 as an example of executing the method by a mobile terminal.
Currently, the maximum display brightness specified by the SDR standard is 100 nit, while the peak brightness of most display devices on the market exceeds 100 nit, typically 500-1000 nit. Therefore, to better adapt original SDR content to such higher-brightness display devices, a scheme for converting the SDR content is required.
In one technical scheme, an HDR (high dynamic range) rendering and display system can render and display both SDR and HDR content to HDR-enabled displays and standard displays. Such a system uses display processing techniques that preserve at least some HDR content even for standard displays, rendering digital image content into HDR space and then mapping the rendered HDR content into the display space of an HDR or standard display. However, this scheme mainly concerns remapping and display of HDR image formats; for expanded display of SDR image formats, highlights must be separated by complex image recognition and processing mechanisms, and the restored light ratio remains inaccurate.
In another technical scheme, at least two to-be-processed images of a target scene are obtained; the brightness of a first to-be-processed image is processed to obtain a second to-be-processed image; a first fusion of the first and second to-be-processed images yields a to-be-processed fused image; and a second fusion of the fused image with the remaining to-be-processed images (other than the first) yields an enhanced fused image. However, the SDR-to-HDR mapping provided by this scheme depends mainly on image analysis, its effect is limited by the accuracy of image recognition and segmentation, and many of its parameters are empirical values, so it cannot provide an accurate mapping scheme.
In another technical scheme, a highlight threshold of a single-lens image is predicted by machine learning, and the image is divided by this threshold into a diffuse reflection area and a specular highlight area. Since the brightness of dark areas is relatively stable across different lenses, the dark-area threshold is set empirically. Different brightness mapping functions are established for the dark area, the diffuse reflection area, the highlight area, and the transition areas between them; in the cells near the dark-area threshold and centered on the highlight threshold, a cubic spline interpolation function is adopted as the brightness mapping function so that brightness transitions smoothly between all areas. However, this solution completes HDR synthesis and dynamic compression by exposing multiple times with multiple cameras and analyzing the histograms of the exposures to determine proper exposure information, while the final imaging container is still SDR. Viewing the result in an environment beyond SDR can cause discomfort; in particular, during extended SDR display, the original dynamic information of a human face is lost, and after nonlinear compression the pixel values cannot achieve the best effect.
In an optional technical scheme, extended display of SDR image content proceeds as follows: the original RGB image is converted into a color mode with separated luminance, such as a YUV image, and a threshold y0 and an expansion intensity are set; each pixel whose luminance value Y is greater than the threshold y0 is judged to require brightening, and each pixel whose Y is smaller than y0 is judged not to require brightening. In this scheme, however, the degree of expansion brightening of a pixel depends only on its luminance value Y, making it a global processing method. Current HDR and other dynamic compression techniques apply unrecoverable nonlinear processing to the luminance relationships of pixels; that is, within an SDR container (image), points with identical luminance values Y do not necessarily correspond to the same actual brightness, and it is even common for the luminance value Y in an SDR container to be ordered oppositely to the actual illumination intensity. Although some algorithms, such as intelligent image segmentation, are now used for local compensation, they remain limited by recognition accuracy and precision and still cannot recover the light-dark relationships within an object well.
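For contrast with the local approach of the present disclosure, the global threshold scheme described above can be sketched as follows (the threshold and intensity values are illustrative, not from any cited scheme):

```python
import numpy as np

def global_expand(y, y0=0.8, strength=2.0):
    # Every pixel with luma above the threshold y0 is expanded by the same
    # rule; pixels at or below y0 pass through untouched. Purely global:
    # two pixels with identical Y always receive identical treatment.
    y = np.asarray(y, dtype=np.float64)
    expanded = y0 + (y - y0) * strength
    return np.where(y > y0, expanded, y)

luma = np.array([0.2, 0.7, 0.9, 1.0])
out = global_expand(luma)
# 0.9 -> 1.0 and 1.0 -> 1.2 are expanded; 0.2 and 0.7 are unchanged
```

Because the gain here is a function of Y alone, the scheme cannot distinguish two pixels whose SDR luma is equal but whose real scene brightness differs, which is exactly the failure mode the guide map is meant to fix.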
Based on one or more problems in the related art, the present disclosure provides an image display method and an image display apparatus, which can realize local brightness enhancement of target image data through a brightness enhancement guidance map, can realize expansion display of dynamic range and contrast of an image to be displayed in a target display unit which does not support an HDR standard, accurately restore real light ratio information of the image to be displayed, and improve display effect of the image to be displayed. An image display method and an image display apparatus according to exemplary embodiments of the present disclosure are specifically described below.
Fig. 2 shows a flowchart of an image display method in the present exemplary embodiment, which may include the following steps S210 to S240:
in step S210, an image to be displayed is acquired.
In an exemplary embodiment, the image to be displayed refers to an image stored in the mobile terminal that can be displayed enlarged to full screen; for example, it may be an image stored in an album or a network image from a third-party platform, and this exemplary embodiment is not particularly limited thereto.
The image to be displayed may be an image file in JPEG or HEIC format; of course, it may also be in any other data format into which image metadata, attribute parameters, and similar data can be embedded, which is not particularly limited in this example embodiment.
Optionally, in the image processing stage, a brightness enhancement guide map for expanding the dynamic range and contrast of the display image is generated, and the brightness enhancement guide map is embedded into the image to be displayed, so that when the image to be displayed is displayed, the image to be displayed can be expanded and displayed after the brightness enhancement guide map is decoded from the image to be displayed.
In step S220, in response to an instruction to display the image to be displayed on the target display unit, a brightness enhancement guidance map and target image data are extracted from the image to be displayed.
In an exemplary embodiment, the target display unit generally refers to a display device that does not support the HDR standard. For example, the target display unit may be a display device with a peak luminance higher than 100 nit but lower than 1000 nit; a display device with a peak luminance higher than 1000 nit that can only display SDR images of 8-bit depth; or a display device with a peak luminance higher than 1000 nit that supports images of 10-bit depth and above but cannot parse the HDR standard and supports only gamma mapping or other non-HDR electro-optical transfer functions. The target display unit is, however, not limited to these examples.
The instruction to display the image to be displayed refers to a request to display the image enlarged across the entire display device. For example, it may be an operation of tapping a small preview of the image in an album to trigger full-screen display; of course, it may also be an operation of projecting or printing the image, which is not particularly limited in this example embodiment.
The brightness enhancement guide map embedded in the image to be displayed may be obtained by performing a decoding operation on the image; for example, an image to be displayed in JPEG format may be decoded by a JPEG decoder, which is not particularly limited in this example embodiment.
In step S230, local luminance enhancement processing is performed on the target image data based on the luminance enhancement guidance map.
In an exemplary embodiment, the brightness enhancement guide map may record a brightness enhancement ratio for each pixel of the image to be displayed, so that the new display brightness of the target display unit for the corresponding pixel of the target image data can be determined from that ratio. The brightness of every pixel of the image to be displayed is thereby remapped, achieving local brightness enhancement of the image.
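A minimal sketch of this per-pixel remapping, assuming illustrative SDR white and panel peak levels (100 nit and 600 nit; neither figure comes from the patent):

```python
import numpy as np

def local_enhance(target_luma, gain_map, sdr_white_nit=100.0, panel_peak_nit=600.0):
    # Each pixel's output brightness is its SDR brightness scaled by the
    # enhancement ratio stored for that pixel in the guide map.
    out_nit = target_luma * sdr_white_nit * gain_map
    return np.minimum(out_nit, panel_peak_nit)  # never exceed the panel's peak

luma = np.array([[0.5, 1.0], [1.0, 0.1]])
gains = np.array([[1.0, 4.0], [8.0, 1.0]])
nits = local_enhance(luma, gains)
# the two pixels with identical luma 1.0 end at different brightnesses
# (400 nit vs. the 600 nit clamp) -- exactly what a global Y-based rule cannot do
```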
In step S240, the target image data after the local brightness enhancement processing is rendered and displayed to the target display unit, so as to realize the expanded display of the image to be displayed on the target display unit.
In an exemplary embodiment, the target image data after local brightness enhancement is rendered and displayed to the target display unit at its new display brightness, so that the dynamic range and contrast of the image to be displayed are expanded even on a target display unit that does not support the HDR standard, the real light ratio information of the image is accurately restored, and the display effect of the image is improved.
Next, steps S210 to S240 will be described in detail.
In an exemplary embodiment, the generation of the image to be displayed may be implemented through the steps in fig. 3, and referring to fig. 3, the method may specifically include:
step S310, obtaining original image data;
step S320, performing image signal processing on the original image data to obtain the target image data;
Step S330 of generating the brightness enhancement guidance map based on the original image data;
step S340, generating the image to be displayed according to the target image data and the brightness enhancement guidance map.
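Steps S310-S340 can be sketched as follows; the gamma-only ISP stand-in and the ratio-based guide-map construction are simplifying assumptions, not the patent's exact processing:

```python
import numpy as np

def image_signal_processing(raw):
    # Step S320 stand-in: normalize and gamma-encode (a real ISP chain also
    # performs black level alignment, vignetting correction, color matrix, etc.).
    return np.clip(raw / raw.max(), 0.0, 1.0) ** (1 / 2.2)

def build_guide_map(raw, sdr_white):
    # Step S330 stand-in: per-pixel gain needed to restore the true light
    # ratio relative to SDR reference white.
    return np.maximum(raw / sdr_white, 1.0)

def generate_image_to_display(raw, sdr_white=1000.0):
    target = image_signal_processing(raw)           # step S320
    guide = build_guide_map(raw, sdr_white)         # step S330
    return {"target": target, "guide_map": guide}   # step S340: bundle both

raw = np.array([[100.0, 1000.0], [4000.0, 250.0]])  # step S310: linear sensor data
img = generate_image_to_display(raw)
# the 4000-unit highlight gets a 4x gain in the guide map; SDR-range pixels get 1x
```

The essential property is that the guide map is derived from the linear data *before* any nonlinear processing, so the light ratio it stores survives the ISP's compression of the target image.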
The original image data refers to an image whose dynamic range can be expanded; for example, it may be an SDR image or an LDR (low dynamic range) image. It may even be an HDR image whose dynamic range can be further extended: for example, if the acquisition end 110 supports acquiring HDR images with a maximum brightness of 400 nit and the display end 130 supports displaying HDR images with a maximum brightness of 1000 nit, the HDR image acquired by the acquisition end 110 can have its dynamic range further extended at the display end 130 and may serve as the original image data. The type of the original image data is not limited in any way in this example embodiment.
Specifically, the original image data may be Bayer RAW image data generated from the optical signal collected by the acquisition end at the time of photographing, or a YUV image generated from a Bayer RAW image without local nonlinear processing; of course, the original image data may also be image data from any stage of the image signal processing (ISP) flow that has not undergone local nonlinear processing, which is not limited in this exemplary embodiment.
Image signal processing refers to the series of operations applied to the original image data to obtain a desired output image; it may include local nonlinear processing of the original image data or image deformation processing of the original image data, which is not particularly limited in this example embodiment.
Optionally, before the original image data undergoes local nonlinear processing in the image signal processing flow, a brightness enhancement guide map can be constructed from the original image data. For example, multiple frames of 10/12/14-bit Bayer RAW data acquired by the acquisition end can be obtained, and frame selection, alignment, and fusion performed on them to expand them into RAW image data of 16-bit depth. In the original flow, this 16-bit original image data then continues through black level alignment, linearization, edge vignetting correction, color matrix, gamma, and similar processing to generate a 10-bit YUV image, which is transmitted to an IPE (Image Process Engine) for further processing. In this embodiment, a separate processing thread for the guide map may be set up, in which the 16-bit original image data, after black level alignment, linearization, and edge vignetting correction, is converted into a 16-bit gray scale image containing only brightness information, which serves as the brightness enhancement guide map. Of course, this is merely a schematic example and should not be construed as imposing any particular limitation on the exemplary embodiments herein.
Optionally, synchronous alignment processing may be performed on the brightness enhancement guide map based on the image deformation processing applied to the original image data, so as to obtain a new brightness enhancement guide map whose pixel spatial positions correspond one-to-one with those of the target image data.
Synchronous alignment processing refers to applying to the original brightness enhancement guide map the same image deformation operations applied to the original image data, in the same order and the same number of times. Performing synchronous alignment on the guide map along the image deformation flow effectively ensures that each pixel of the target brightness enhancement guide map is aligned in spatial position with the corresponding pixel of the original image data, guaranteeing the one-to-one correspondence of pixels and effectively improving the accuracy of the light ratio information stored in the target guide map.
Optionally, two independent image processing threads may be set up in the IPE: one thread normally performs the image deformation flow and nonlinear image processing on the original image data, while the other obtains the deformation operation parameters and the number of operations applied to the original image data in the deformation flow, so as to apply the synchronous alignment processing to the brightness enhancement guide map. Of course, instead of setting up two separate threads, the guide map may be fed directly into the original image processing thread: after the original image data is deformed, the same processing is applied to the guide map, i.e., both deformations are performed within one algorithm.
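One way to sketch the synchronous alignment idea: record the deformation flow as an operation list and replay it verbatim on the guide map (the two operations shown, crop and rotation, are illustrative stand-ins for a real deformation flow):

```python
import numpy as np

def apply_deformations(img, ops):
    # Replay a recorded deformation flow: (operation name, parameters) pairs,
    # applied in order. Running the identical list on the guide map keeps it
    # pixel-aligned with the target image.
    for name, arg in ops:
        if name == "crop":
            (r0, r1), (c0, c1) = arg
            img = img[r0:r1, c0:c1]
        elif name == "rot90":
            img = np.rot90(img, arg)
    return img

target = np.arange(16.0).reshape(4, 4)
guide = target * 10.0                        # toy guide map with the same geometry
ops = [("crop", ((1, 3), (0, 2))), ("rot90", 1)]
t2 = apply_deformations(target, ops)
g2 = apply_deformations(guide, ops)
# every pixel of the warped guide map still sits over its matching target pixel
aligned = np.array_equal(g2, t2 * 10.0)
```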
After the target image data and the brightness enhancement guide map are generated, the guide map may be embedded into the target image data as image metadata to generate the image to be displayed. For example, the guide map may be embedded as dedicated Exif (Exchangeable Image File Format) data of the target image data and encoded into a JPEG- or HEIF-format image to be displayed that can still be decoded and displayed normally as SDR content.
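A toy illustration of embedding the guide map in the image container and recovering it at display time. The patent only says the map is embedded as Exif-style metadata; the append-after-EOI layout below is an assumption made so the sketch stays self-contained:

```python
import struct
import numpy as np

EOI = b"\xff\xd9"  # JPEG end-of-image marker

def embed_guide_map(jpeg_bytes, guide_map):
    # Hypothetical layout: primary JPEG stream, then height/width, then the
    # raw big-endian 16-bit guide-map samples.
    h, w = guide_map.shape
    return jpeg_bytes + struct.pack(">II", h, w) + guide_map.astype(">u2").tobytes()

def extract_guide_map(container):
    # Split at the first EOI marker; everything after it is the guide payload.
    cut = container.index(EOI) + len(EOI)
    jpeg_bytes, payload = container[:cut], container[cut:]
    h, w = struct.unpack(">II", payload[:8])
    guide = np.frombuffer(payload[8:], dtype=">u2").reshape(h, w)
    return jpeg_bytes, guide

fake_jpeg = b"\xff\xd8<entropy data>\xff\xd9"  # stand-in SOI...EOI stream
guide = np.array([[100, 65535], [0, 4096]], dtype=np.uint16)
container = embed_guide_map(fake_jpeg, guide)
recovered_jpeg, recovered = extract_guide_map(container)
```

A legacy SDR decoder that stops at the EOI marker would simply ignore the appended payload, which mirrors the backward-compatibility property the patent relies on.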
In an exemplary embodiment, the original image data may be subjected to fusion processing to obtain fused image data; preprocessing the fusion image data, converting the preprocessed fusion image data into a gray level image, and generating a brightness enhancement guide image based on the gray level image.
The fusion of the original image data means that the image sensor collects multiple frames of 10/12/14-bit Bayer RAW image data through an optical lens and a CMOS (Complementary Metal Oxide Semiconductor) sensor as the original image data, and then performs frame selection, alignment, and fusion on them to obtain a RAW image expanded to 16-bit depth, which can serve as the fused image data.
After the fused image data is obtained, the original flow continues with black level alignment, linearization, edge vignetting correction, color matrix, gamma, and similar processing to generate a 10-bit YUV image, which is transmitted to the image process engine (IPE). In this embodiment, the 16-bit RAW image additionally undergoes preprocessing such as black level alignment, linearization, edge vignetting correction, and conversion to obtain a 16-bit gray scale map containing only brightness information, which serves as the brightness enhancement guide map.
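The guide-map branch of this preprocessing can be sketched as follows; the black level value and the vignetting gains are illustrative constants, not values from the patent:

```python
import numpy as np

def raw_to_guide_map(bayer16, black_level=1024, vignette_gain=None):
    # Black level alignment: subtract the sensor's pedestal value.
    x = bayer16.astype(np.float64) - black_level
    # Linearization stand-in: clamp away negative values.
    x = np.maximum(x, 0.0)
    # Edge vignetting correction: per-pixel lens-shading gains, if provided.
    if vignette_gain is not None:
        x = x * vignette_gain
    # Keep only brightness information in a 16-bit grayscale map.
    return np.clip(x, 0, 65535).astype(np.uint16)

raw = np.array([[1024, 2048], [40000, 70000]], dtype=np.int64)
gain = np.array([[1.2, 1.0], [1.0, 1.0]])
guide = raw_to_guide_map(raw, vignette_gain=gain)
# pedestal pixels drop to 0; the over-range 70000 clips to the 16-bit ceiling
```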
In an exemplary embodiment, image metadata may be acquired when the original image data is acquired, and the image metadata and the brightness enhancement guidance map may be encoded as embedded information into the target image data to obtain an image to be displayed.
The image metadata refers to parameters of the original image data during imaging, together with basic image information. For example, the image metadata may be automatic exposure information, such as ambient brightness, contrast, exposure time and module sensitivity; HDR algorithm information, such as the number of frames required for HDR fusion, the exposure value (EV) of each frame and fusion information; face-related information, such as the number of faces and face brightness; or scene identification information, such as indoor, outdoor, night scene, sky and green plants. This information influences the hardware behavior of the camera module so as to achieve the expected HDR synthesis effect, and can also be embedded into the target image data in the form of image metadata as auxiliary information for on-screen rendering.
Using the image metadata as auxiliary information to assist the rendering and display of the target image data can further compensate for the limitations of the luminance enhancement guide map, improving the accuracy of luminance restoration of the image to be displayed and thereby its display effect.
Optionally, automatic exposure information, high dynamic range algorithm information, face information and scene recognition information can be obtained when the original image data is acquired, and used together as the image metadata.
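A sketch of what such a metadata record might look like before being embedded — every field name and value here is illustrative only; the patent does not define an encoding:

```python
import json

# Hypothetical structure grouping the four metadata categories named above.
image_metadata = {
    "auto_exposure": {"ambient_brightness": 320.0, "contrast": 1.2,
                      "exposure_time_ms": 8.3, "iso": 100},
    "hdr": {"frame_count": 4, "ev_per_frame": [-2, 0, 0, 2]},
    "faces": {"count": 1, "brightness": [180]},
    "scene": "outdoor",
}

# Serialized so it can ride along as embedded information in the image file.
metadata_blob = json.dumps(image_metadata).encode("utf-8")
```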
Fig. 4 schematically illustrates a schematic diagram of a mobile terminal-based implementation of generating and displaying an image to be displayed in an exemplary embodiment of the present disclosure.
Referring to fig. 4, taking a mobile terminal performing the image display method as an example, the mobile terminal may include at least an image sensor (Sensor) 401, a preprocessing module 402, a RAW image fusion module (Bayer Processing Segment, BPS) 403, an image processing engine (IPE) 404, a JPEG encoding module 405 and a JPEG decoding module 406.
Specifically, the image sensor 401 may collect multi-frame 10/12/14-bit Bayer RAW image data through an optical lens and a CMOS (Complementary Metal Oxide Semiconductor) sensor. During acquisition, the preprocessing module 402 may generate image metadata based on the image collection process of the image sensor 401; for example, the image metadata may include, but is not limited to, automatic exposure information (AE Info), HDR algorithm information, face information and automatic scene detection results. The image sensor 401 then transmits the acquired Bayer RAW image data to the RAW image fusion module 403, which may perform frame selection, alignment and fusion through its RAW image data fusion unit to obtain a RAW image extended to 16-bit depth as the fused image data, and send the 16-bit-depth data to the RAW image gray-scale conversion unit to be processed into a gray scale map of 16-bit depth. Meanwhile, the RAW image fusion module 403 generates a 10-bit YUV image based on the multi-frame Bayer RAW image data and transmits it to the image processing engine 404 for processing.
The image processing engine 404 may perform a YUV image processing procedure on the 10-bit YUV image to obtain the target image data. At the same time, the image processing engine 404 may perform a luminance enhancement guide map generation procedure on the 16-bit gray scale map to obtain the luminance enhancement guide map, and send the target image data and the luminance enhancement guide map to the JPEG encoding module 405.
The JPEG encoding module 405 may obtain the image metadata transferred by the preprocessing module 402, together with the target image data and the luminance enhancement guide map transferred by the image processing engine 404, and encode the image metadata and the luminance enhancement guide map as embedded information into the target image data in the JPEG or HEIC encoding format, so as to obtain the image 406 to be displayed in JPEG/HEIC format.
Upon receiving a request to display the image 406 to be displayed on the target display unit, the image 406 to be displayed may be decoded by the JPEG decoding module 406 to obtain the image metadata, the luminance enhancement guide map and the target image data. Local luminance enhancement of the target image data is then achieved through the image metadata and the luminance enhancement guide map, and the locally enhanced target image data is rendered and displayed on the target display unit 408 with the new display luminance, so as to realize the expansion display of the image 406 to be displayed on the target display unit 408.
Fig. 5 schematically illustrates a schematic diagram of a simulation effect of simulating different brightness enhancement modes in an exemplary embodiment of the present disclosure.
Referring to fig. 5, three cases may be simulated for the original image data: the original image data itself, global brightness enhancement of the original image data by judging the relationship between brightness values and a threshold, and local brightness enhancement of the original image data by the image display method of the present disclosure. The simulation result of the original image data is the simulated image 501, the simulation result of global brightness enhancement is the simulated image 502, and the simulation result of local brightness enhancement by the method of the present disclosure is the simulated image 503.
Compared with the simulated image 501 corresponding to the original image data, it can be seen that in the simulated image 502, although the light-source luminance of the local area 504 is restored, the reflected light of the local area 505 is also globally enhanced, even though this portion should not undergo luminance enhancement. In the simulated image 503, the brightness of the reflected light (i.e., the low-light portion) of the local area 505 remains unchanged while the light-source brightness of the local area 504 is enhanced, so a larger dynamic range is realized on the display unit using SDR content, the real light-ratio information of the image to be displayed is accurately restored, and the display effect of the image to be displayed is improved.
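The difference between the two modes can be sketched as applying per-pixel gains from the guide map: light-source pixels carry a gain above 1, while reflected-light pixels carry a gain of 1 and are left untouched, unlike a global threshold scheme that brightens both. The headroom parameter and the gain representation below are illustrative assumptions, not the patent's actual encoding:

```python
def local_enhance(sdr_pixels, guide_gains, headroom=4.0):
    """Apply per-pixel gains from the guide map (local enhancement).

    sdr_pixels:  SDR pixel values in 0..255.
    guide_gains: per-pixel brightness enhancement ratios from the guide map.
    headroom:    how far above SDR white the target panel can drive brightness.
    """
    out = []
    for p, g in zip(sdr_pixels, guide_gains):
        gain = min(g, headroom)                  # cap at the panel's HDR headroom
        out.append(min(p * gain, 255.0 * headroom))
    return out
```

A light-source pixel (gain 4.0) is driven well above SDR white, while a low-light reflected pixel (gain 1.0) keeps its original brightness, mirroring the behavior of the simulated image 503.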
In summary, in the present exemplary embodiment, when an instruction for displaying an image to be displayed by a target display unit is received, a luminance enhancement guide map and target image data may be extracted from the image to be displayed, local luminance enhancement processing may be performed on the target image data based on the luminance enhancement guide map, and the target image data after the local luminance enhancement processing may be rendered and displayed to the target display unit, so as to implement expansion display of the image to be displayed on the target display unit. The brightness enhancement guide map which is embedded in the image to be displayed in advance is decoded, and local brightness enhancement of target image data is realized through the brightness enhancement guide map, so that the dynamic range and contrast of the image to be displayed can be expanded and displayed in a target display unit which does not support the HDR standard, the real light ratio information of the image to be displayed is accurately restored, and the display effect of the image to be displayed is improved.
It is noted that the above-described figures are merely schematic illustrations of processes involved in a method according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Further, referring to fig. 6, in the embodiment of the present example, there is further provided an image display apparatus 600, including an image acquisition module 610, a guidance map extraction module 620, a local brightness enhancement module 630, and an image expansion display module 640. Wherein:
the image acquisition module 610 is configured to acquire an image to be displayed;
the guide image extraction module 620 is configured to extract a brightness enhancement guide image and target image data from the image to be displayed in response to an instruction to display the image to be displayed on the target display unit;
the local brightness enhancement module 630 is configured to perform local brightness enhancement processing on the target image data based on the brightness enhancement guidance map;
the image expansion display module 640 is configured to render and display the target image data after the local brightness enhancement processing to the target display unit, so as to realize expansion display of the image to be displayed on the target display unit.
In an exemplary embodiment, the image display apparatus 600 may include an image generation unit to be displayed, which may be used to:
Acquiring original image data;
performing image signal processing on the original image data to obtain the target image data;
Generating the brightness enhancement guidance map based on the original image data;
and generating the image to be displayed according to the target image data and the brightness enhancement guide map.
In an exemplary embodiment, the image generation unit to be displayed may be configured to:
carrying out fusion processing on the original image data to obtain fused image data;
preprocessing the fusion image data, and converting the preprocessed fusion image data into a gray scale map;
The brightness enhancement guide map is generated based on the gray map.
In an exemplary embodiment, the pixel spatial positions of the luminance enhancement guide map may correspond one-to-one with the pixel spatial positions of the target image data.
In an exemplary embodiment, the image generation unit to be displayed may be configured to:
acquiring image metadata when acquiring the original image data;
and encoding the image metadata and the brightness enhancement guide map into the target image data as embedded information to obtain the image to be displayed.
In an exemplary embodiment, the image generation unit to be displayed may be configured to:
Acquiring automatic exposure information, high dynamic range algorithm information, face information and scene identification information when acquiring the original image data;
And taking the automatic exposure information, the high dynamic range algorithm information, the face information and the scene identification information as the image metadata.
In an exemplary embodiment, the luminance enhancement guide map may include a luminance enhancement ratio of each pixel.
The specific details of each module in the above apparatus are already described in the method section, and the details that are not disclosed can be referred to the embodiment of the method section, so that they will not be described in detail.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, various aspects of the disclosure may be embodied in the following forms, namely: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module" or "system."
Exemplary embodiments of the present disclosure also provide an electronic device. The electronic device may comprise a processor and a memory for storing executable instructions of the processor, the processor being configured to perform the above-described image display method via execution of the executable instructions. The electronic device may be provided with the system architecture 100, the construction of which is exemplified below by the mobile terminal 700 in fig. 7. It will be appreciated by those skilled in the art that, apart from components specifically intended for mobile use, the configuration of fig. 7 can also be applied to stationary devices.
As shown in fig. 7, the mobile terminal 700 may specifically include: processor 701, memory 702, bus 703, mobile communication module 704, antenna 1, wireless communication module 705, antenna 2, display 706, camera module 707, audio module 708, power module 709, and sensor module 710.
The processor 701 may include one or more processing units, such as: an AP (Application Processor), a modem processor, a GPU (Graphics Processing Unit), an ISP (Image Signal Processor), a controller, an encoder, a decoder, a DSP (Digital Signal Processor), a baseband processor and/or an NPU (Neural-network Processing Unit), and the like.
An encoder may encode (i.e., compress) an image or video to reduce the data size for storage or transmission. The decoder may decode (i.e., decompress) the encoded data of the image or video to recover the image or video data. The mobile terminal 700 may support one or more encoders and decoders, for example: image formats such as JPEG (Joint Photographic Experts Group), PNG (Portable Network Graphics) and BMP (Bitmap), and video formats such as MPEG (Moving Picture Experts Group) 1/2/4, H.263, H.264 and HEVC (High Efficiency Video Coding).
The processor 701 may form a connection with a memory 702 or other components through a bus 703.
Memory 702 may be used to store computer-executable program code that includes instructions. The processor 701 performs various functional applications and data processing of the mobile terminal 700 by executing instructions stored in the memory 702. The memory 702 may also store application data, such as files storing images, videos, and the like.
The communication functions of the mobile terminal 700 may be implemented by the mobile communication module 704, the antenna 1, the wireless communication module 705, the antenna 2, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. The mobile communication module 704 may provide a mobile communication solution of 3G, 4G, 5G, etc. applied on the mobile terminal 700. The wireless communication module 705 may provide wireless communication solutions for wireless local area networks, bluetooth, near field communications, etc. that are applied on the mobile terminal 700.
The display 706 is used to implement display functions such as displaying user interfaces, images, video, and the like. The image capturing module 707 is used to implement a capturing function such as capturing an image, video, or the like. The audio module 708 is used to implement audio functions such as playing audio, capturing speech, etc. The power module 709 is used to implement power management functions such as charging a battery, powering a device, monitoring battery status, etc.
The sensor module 710 may include one or more sensors for implementing corresponding sensing functions. For example, the sensor module 710 may include an exposure sensor for detecting ambient brightness data of the mobile terminal 700 at the time of photographing, and outputting an automatic exposure parameter.
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the method described above in the present specification. In some possible implementations, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the disclosure as described in the "exemplary methods" section of this specification, when the program product is run on the terminal device.
It should be noted that the computer readable medium shown in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Furthermore, the program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any adaptations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. An image display method, comprising:
Acquiring an image to be displayed;
extracting a brightness enhancement guide map and target image data from the image to be displayed in response to an instruction to display the image to be displayed on a target display unit;
performing local brightness enhancement processing on the target image data based on the brightness enhancement guidance map;
and rendering and displaying the target image data subjected to the local brightness enhancement processing to the target display unit so as to realize the expansion display of the image to be displayed on the target display unit.
2. The method according to claim 1, wherein the method further comprises:
Acquiring original image data;
performing image signal processing on the original image data to obtain the target image data;
Generating the brightness enhancement guidance map based on the original image data;
and generating the image to be displayed according to the target image data and the brightness enhancement guide map.
3. The method of claim 2, wherein the generating the brightness enhancement guidance map based on the original image data comprises:
carrying out fusion processing on the original image data to obtain fused image data;
preprocessing the fusion image data, and converting the preprocessed fusion image data into a gray scale map;
The brightness enhancement guide map is generated based on the gray map.
4. The method of claim 2, wherein the pixel spatial locations of the luminance enhancement guidance map correspond one-to-one with the pixel spatial locations of the target image data.
5. The method of claim 2, wherein generating the image to be displayed from the target image data and the brightness enhancement guidance map comprises:
acquiring image metadata when acquiring the original image data;
and encoding the image metadata and the brightness enhancement guide map into the target image data as embedded information to obtain the image to be displayed.
6. The method of claim 5, wherein the acquiring image metadata at the time of acquiring the raw image data comprises:
Acquiring automatic exposure information, high dynamic range algorithm information, face information and scene identification information when acquiring the original image data;
And taking the automatic exposure information, the high dynamic range algorithm information, the face information and the scene identification information as the image metadata.
7. The method of claim 1, wherein the luminance enhancement guidance map includes a luminance enhancement ratio of each pixel.
8. An image display device, comprising:
the image acquisition module is used for acquiring an image to be displayed;
the guide image extraction module is used for responding to the instruction of displaying the image to be displayed on the target display unit and extracting a brightness enhancement guide image and target image data from the image to be displayed;
The local brightness enhancement module is used for carrying out local brightness enhancement processing on the target image data based on the brightness enhancement guide graph;
and the image expansion display module is used for rendering and displaying the target image data subjected to the local brightness enhancement processing to the target display unit so as to realize the expansion display of the image to be displayed on the target display unit.
9. A computer readable medium, on which a computer program is stored, which computer program, when being executed by a processor, implements the method according to any one of claims 1 to 7.
10. An electronic device, comprising:
A processor; and
A memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any one of claims 1 to 7 via execution of the executable instructions.
CN202211615691.7A 2022-12-15 2022-12-15 Image display method and device, computer readable medium and electronic equipment Pending CN118214815A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211615691.7A CN118214815A (en) 2022-12-15 2022-12-15 Image display method and device, computer readable medium and electronic equipment


Publications (1)

Publication Number Publication Date
CN118214815A true CN118214815A (en) 2024-06-18



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination