CN116567352A - Image processing method, apparatus, device, storage medium, and program product - Google Patents

Image processing method, apparatus, device, storage medium, and program product

Info

Publication number
CN116567352A
Authority
CN
China
Prior art keywords
video frame
data
target video
local
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310560903.4A
Other languages
Chinese (zh)
Inventor
沈珈立
罗小伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Spreadtrum Communications Shanghai Co Ltd
Original Assignee
Spreadtrum Communications Shanghai Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Spreadtrum Communications Shanghai Co Ltd filed Critical Spreadtrum Communications Shanghai Co Ltd
Priority to CN202310560903.4A
Publication of CN116567352A
Legal status: Pending

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440245Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display the reformatting operation being performed only on part of the stream, e.g. a region of the image or a time segment
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44008Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application provides an image processing method, apparatus, device, storage medium, and program product. The method includes: obtaining a local refresh indication, where the local refresh indication indicates that local refresh is to be performed starting from a target video frame; obtaining first local data of the target video frame and static data of a first static area of a first video frame, the first video frame being the video frame preceding the target video frame; determining first global data of the target video frame according to the first local data and the static data of the first static area; and performing image enhancement processing on the target video frame according to the first global data to obtain an enhanced video frame of the target video frame. Because the static data of the static area is not refreshed, it can be reused to determine the global data of the target video frame, which reduces the impact of local refresh on PQ enhancement processing that depends on global data and improves the picture display effect.

Description

Image processing method, apparatus, device, storage medium, and program product
Technical Field
The present invention relates to the field of display technologies, and in particular, to an image processing method, apparatus, device, storage medium, and program product.
Background
On many electronic devices with a screen, the display is typically refreshed by drawing a number of image frames per second (for example, 30 or 60 frames), which results in considerable power consumption.
In the related art, power consumption is commonly reduced by partially refreshing only part of the screen. For example, the area playing a video is refreshed normally, while the static areas showing other content are not refreshed and keep their previous content. Such partial refresh reduces the power consumption of the display chip and the panel by reducing the amount of data that is refreshed.
However, when the electronic device performs global picture quality (PQ) enhancement processing during local refresh, for example global contrast improvement, the above local update approach means that the data of the whole frame image cannot be obtained. This degrades PQ enhancement processing that depends on whole-image statistics and therefore worsens the image display effect.
Disclosure of Invention
The application provides an image processing method, apparatus, device, storage medium, and program product, which are used to solve the problem that the local update approach cannot obtain the data of an entire frame image, degrading PQ enhancement processing that depends on whole-image statistics and resulting in a poor image display effect.
In a first aspect, the present application provides an image processing method, including:
obtaining a local refresh indication, wherein the local refresh indication is used for indicating to execute local refresh from a target video frame;
acquiring first local data of the target video frame;
acquiring static data of a first static area of a first video frame, wherein the first video frame is a previous video frame of the target video frame;
determining first global data of the target video frame according to the first local data and the static data;
and carrying out image enhancement processing on the target video frame according to the first global data to obtain an enhanced video frame of the target video frame.
In one possible implementation manner, the acquiring the still data of the first still region of the first video frame includes:
acquiring second global data of the first video frame;
and determining the static data of the first static area according to the second global data and the first local data.
In one possible implementation manner, the acquiring the first local data of the target video frame includes:
acquiring activity data of at least one activity area of the target video frame;
acquiring preset data from the activity data of the at least one activity area;
determining transition data of the at least one transition region according to the preset data;
wherein the first local data comprises transition data of at least one transition region and activity data of at least one activity region.
In one possible implementation manner, acquiring preset data in the activity data of at least one activity area includes:
determining an upper boundary and/or a lower boundary of an active region for any one of the active regions;
and determining the preset data according to the data of the preset range adjacent to the upper boundary of each active area and/or the data of the preset range adjacent to the lower boundary of each active area.
In one possible implementation, the method further includes:
acquiring active data of an active region and transition data of a transition region of a second video frame, wherein the second video frame is a video frame before receiving a global refreshing instruction or a region changing instruction;
determining third global data of the second video frame according to the static data of the first static area, the active data of the active area of the second video frame and the transition data of the transition area of the second video frame;
and carrying out image enhancement processing on the second video frame according to the third global data to obtain an enhanced video frame of the second video frame.
In one possible implementation, the method further includes:
refreshing a display interface of the display device based on the enhanced video frame of the target video frame;
and displaying the updated display interface.
In a second aspect, the present application provides an image processing apparatus comprising:
the first acquisition module is used for acquiring a local refresh instruction, wherein the local refresh instruction is used for instructing to execute local refresh from a target video frame;
the second acquisition module is used for acquiring the first local data of the target video frame;
a third obtaining module, configured to obtain still data of a first still region of a first video frame, where the first video frame is the video frame preceding the target video frame;
the determining module is used for determining first global data of the target video frame according to the first local data and the static data;
and the processing module is used for carrying out image enhancement processing on the target video frame according to the first global data to obtain an enhanced video frame of the target video frame.
In one possible implementation manner, the third obtaining module is specifically configured to:
acquiring second global data of the first video frame;
and determining the static data of the first static area according to the second global data and the first local data.
In one possible implementation manner, the second obtaining module is specifically configured to:
acquiring activity data of at least one activity area of the target video frame;
acquiring preset data from the activity data of the at least one activity area;
determining transition data of the at least one transition region according to the preset data;
wherein the first local data comprises transition data of at least one transition region and activity data of at least one activity region.
In one possible implementation manner, the second obtaining module is specifically configured to:
determining an upper boundary and/or a lower boundary of an active region for any one of the active regions;
and determining the preset data according to the data of the preset range adjacent to the upper boundary of each active area and/or the data of the preset range adjacent to the lower boundary of each active area.
In one possible implementation, the apparatus further includes: and an enhancement module.
The enhancement module is specifically used for:
acquiring active data of an active region and transition data of a transition region of a second video frame, wherein the second video frame is a video frame before receiving a global refreshing instruction or a region changing instruction;
determining third global data of the second video frame according to the static data of the first static area, the active data of the active area of the second video frame and the transition data of the transition area of the second video frame;
and carrying out image enhancement processing on the second video frame according to the third global data to obtain an enhanced video frame of the second video frame.
In one possible implementation, the apparatus further includes: and a display module.
The display module is specifically used for:
refreshing a display interface of the display device based on the enhanced video frame of the target video frame;
and displaying the updated display interface.
In a third aspect, the present application provides an electronic device, comprising: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored in the memory to implement the image processing method as described in the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium having stored therein computer-executable instructions which, when executed by a computer, are adapted to carry out the image processing method according to the first aspect.
In a fifth aspect, the present application provides a computer program product comprising a computer program for implementing the image processing method of the first aspect when the computer program is executed by a computer.
In a sixth aspect, embodiments of the present application provide a chip on which a computer program is stored, which when executed by the chip causes the image processing method of the first aspect to be performed.
In one possible embodiment, the chip is a chip in a chip module.
In a seventh aspect, embodiments of the present application provide a module apparatus, where the module apparatus includes a power module, a storage module, and a chip module;
the power supply module is used for providing electric energy for the module equipment;
the storage module is used for storing data and instructions;
the chip module is used for executing the image processing method described in the first aspect.
The image processing method, apparatus, device, storage medium, and program product provided by the application obtain a local refresh indication indicating that local refresh is to be performed starting from a target video frame; obtain first local data of the target video frame and static data of a first static area of a first video frame, the first video frame being the video frame preceding the target video frame; determine first global data of the target video frame according to the first local data and the static data of the first static area; and perform image enhancement processing on the target video frame according to the first global data to obtain an enhanced video frame of the target video frame. Because the static data of the static area is not refreshed, it can be reused to determine the global data of the target video frame, which reduces the impact on PQ enhancement processing that depends on global data and improves the picture display effect.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic diagram of a display interface according to an example of the present application;
Fig. 2 is a schematic structural diagram of a display device according to an example of the present application;
Fig. 3 is a flowchart of an image processing method according to the first embodiment of the present application;
Fig. 4 is a schematic diagram of determining first global data of a target video frame from first local data and static data of a first static area according to an example of the present application;
Fig. 5 is a flowchart of another image processing method according to the second embodiment of the present application;
Fig. 6 is a schematic diagram of the picture areas of a display interface according to an example of the present application;
Fig. 7 is a schematic diagram of static data of a first static area of a target video frame according to an example of the present application;
Fig. 8 is a schematic structural diagram of an image processing apparatus according to the third embodiment of the present application;
Fig. 9 is a schematic structural diagram of an electronic device according to the fourth embodiment of the present application.
Specific embodiments thereof have been shown by way of example in the drawings and will herein be described in more detail. These drawings and the written description are not intended to limit the scope of the inventive concepts in any way, but to illustrate the concepts of the present application to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present application as detailed in the appended claims.
On many electronic devices with screens, display power consumption is a non-negligible issue. This is especially true for power-sensitive mobile products such as mobile phones and tablets, and for commercial displays, where reducing display power consumption is very important. Typically, the screen is refreshed by drawing a number of image frames per second (for example, 30 or 60 frames), which results in considerable power consumption.
In the related art, consider a specific display interface such as the one in fig. 1, which is playing a video; the hatched area is the video playing area, i.e., the refresh area. For such a display interface, the terminal device can reduce power consumption by refreshing only part of the screen, while the content in the other areas is not refreshed and the previous static content is kept. This partial refresh scheme reduces the power consumption of the display chip and panel by reducing the amount of data refreshed.
However, in an actual scene, global refresh and local refresh often alternate. When the electronic device performs global picture quality (PQ) enhancement processing, for example global contrast improvement, the above local update approach means that the data of the whole frame image cannot be obtained, which degrades PQ enhancement processing that depends on whole-image statistics and results in a poor picture display effect.
Therefore, with the image processing method of the application, the display device can obtain a local refresh indication for the target video frame, then obtain the local data of the target video frame and the static data of the static area of the video frame preceding the target video frame, and determine the global data of the target video frame from the obtained local data and static data. Global image enhancement processing of the target video frame thus remains possible, which reduces the impact on PQ enhancement processing that depends on global data and improves the picture display effect.
To facilitate understanding, the structure of a display device to which the image processing method of the embodiments of the present application can be applied is described below with reference to the example of fig. 2.
Fig. 2 is a schematic structural diagram of a display device according to an example of the present application. Referring to fig. 2, the display device includes: a preset memory 201, an image enhancement component 202, and a control driving module 203 for receiving upper-layer commands.
The control driving module 203 may receive an upper layer command, where the upper layer command may include a local refresh indication, where the local refresh indication is used to indicate that a local refresh is performed from the target video frame.
The control driving module 203 may then determine the still data of the still region of the previous video frame of the target video frame and acquire the local data of the target video frame to determine the global data of the target video frame, and send the global data of the target video frame to the image enhancement component 202, so that the image enhancement component 202 performs global image enhancement processing on the target video frame according to the global data.
The preset memory 201 is used for storing static data, where the static data is the data that is not refreshed when the display device is in the local refresh state, that is, the data of the static area. The static data may be the static data of the static area of the video frame preceding the target video frame, and it may be used to determine the global data of frames following the target video frame when those frames are also locally refreshed and the local refresh area is unchanged.
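As an informal illustration only, the Python sketch below models how the three components of fig. 2 might cooperate: the preset memory caches the static data, the control driving module assembles global data when a local refresh indication arrives, and the image enhancement component consumes that global data. All class and method names, the use of NumPy arrays, and the single-channel (grayscale) frame assumption are hypothetical and are not specified by this application.

```python
import numpy as np


class PresetMemory:
    """Caches the static-region data that is not refreshed during local refresh."""

    def __init__(self):
        self._static_frame = None

    def store(self, frame: np.ndarray) -> None:
        self._static_frame = frame.copy()

    def load(self) -> np.ndarray:
        return self._static_frame


class ImageEnhancementComponent:
    """Applies global PQ enhancement using full-frame (global) data."""

    def enhance(self, global_frame: np.ndarray) -> np.ndarray:
        # A concrete contrast-stretch example is sketched later in the description.
        return global_frame


class ControlDriver:
    """Receives upper-layer commands and assembles global data for the enhancer."""

    def __init__(self, memory: PresetMemory, enhancer: ImageEnhancementComponent):
        self.memory = memory
        self.enhancer = enhancer

    def on_global_refresh(self, full_frame: np.ndarray) -> np.ndarray:
        # The whole frame is available; cache it so later local refreshes can reuse it.
        self.memory.store(full_frame)
        return self.enhancer.enhance(full_frame)

    def on_local_refresh(self, local_frame: np.ndarray, active_mask: np.ndarray) -> np.ndarray:
        # local_frame holds valid pixels only where active_mask is True; the rest of the
        # global data is taken from the cached static regions of the previous frame.
        static_frame = self.memory.load()
        global_frame = np.where(active_mask, local_frame, static_frame)
        return self.enhancer.enhance(global_frame)
```

The enhancer is deliberately a pass-through here; the purpose of the sketch is only the data flow between the components, and a concrete enhancement example is sketched later under the first embodiment.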
In the embodiment of the present application, the display device may be any device having a display screen. The device includes, but is not limited to: user equipment (UE), mobile devices, user terminals, wireless communication devices, handheld devices, vehicle-mounted devices, etc. For example, some examples of terminals may be: a mobile phone, a tablet (Pad), a computer with wireless transceiving function (such as a notebook computer or a palmtop computer), a Mobile Internet Device (MID), a Virtual Reality (VR) device, an Augmented Reality (AR) device, an extended Reality (XR) device, a wireless terminal in industrial control, a wireless terminal in self driving, a wireless terminal in remote medical (Remote Medical), a wireless terminal in smart grid (Smart Grid), a wireless terminal in transportation safety (Transportation Safety), a wireless terminal in smart city (Smart City), a wireless terminal in smart home (Smart Home), a cellular phone, a cordless phone, a Session Initiation Protocol (SIP) phone, a Wireless Local Loop (WLL) station, a Personal Digital Assistant (PDA), a handheld device with wireless communication function, a computing device or another device connected to a wireless modem, a wearable device, etc.
The following describes the technical solutions of the present application and how the technical solutions of the present application solve the above technical problems in detail with specific embodiments. The following specific embodiments may exist alone or in combination with one another, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 3 is a flowchart of an image processing method according to the first embodiment of the present application. The method may be performed by a display device, or by an image processing apparatus disposed in the display device; the apparatus may be a chip, a chip module, or an integrated development environment (IDE). Referring to fig. 3, the method includes the following steps:
s301, acquiring a local refresh instruction, wherein the local refresh instruction is used for instructing to execute local refresh from a target video frame.
The display device may obtain a local refresh indication indicating that a local refresh is to be performed starting from the target video frame.
It will be appreciated that the display device performs a global refresh for the video frame preceding the target video frame. This preceding video frame is referred to herein as the first video frame, and for the first video frame the refresh state of the display device is global refresh.
By way of example, the local refresh state refers to a state in which, in the display interface of the display device, only the content of the active region is refreshed and the content of the other static regions is not refreshed.
The global refresh state refers to a state in which the content of the entire display interface of the display device is refreshed.
S302, acquiring first local data of a target video frame.
Since the refresh state for the target video frame is the local refresh state, the display device may obtain the first local data of the refreshed region of the target video frame so that the global data of the target video frame can be determined.
For example, the first local data may include transition data of at least one transition region and activity data of at least one activity region.
S303, acquiring static data of a first static area of a first video frame, wherein the first video frame is a previous video frame of a target video frame.
After acquiring the first local data of the target video frame, the display device may acquire still data of a first still region of the first video frame.
S304, determining first global data of the target video frame according to the first local data and the static data.
After the display device obtains the first local data of the target video frame and the still data of the first still region of the first video frame, the first global data of the target video frame may be determined according to the first local data and the still data of the first still region.
Fig. 4 is a schematic diagram of the display device determining the first global data of the target video frame from the first local data and the static data of the first static area. As can be seen from fig. 4, the first static area of the first video frame includes two areas, so the static data of the first static area is composed of the data of those two areas; the first local data together with the static data of the first static area make up the global data of the target video frame.
S305, performing image enhancement processing on the target video frame according to the first global data to obtain an enhanced video frame of the target video frame.
After the first global data of the target video frame is determined, the display device can perform image enhancement processing on the target video frame according to the first global data to obtain an enhanced video frame of the target video frame. In this way, the global data of the target video frame is available even though the target video frame is in the local refresh state, and global image enhancement processing of the target video frame is achieved.
For example, when performing image enhancement processing on the target video frame according to the first global data, the display device may reduce the backlight, enhance the global contrast, and so on.
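As a hedged illustration of what such processing could look like, the sketch below applies a global contrast stretch whose statistics come from the composed first global data, plus a toy backlight-scaling heuristic. The percentile thresholds and the backlight formula are assumptions made for this example; the application does not prescribe a particular enhancement algorithm.

```python
import numpy as np


def enhance_with_global_data(target_frame: np.ndarray, global_data: np.ndarray) -> np.ndarray:
    """Contrast-stretch the target frame using statistics of the full composed frame."""
    # Global statistics: robust minimum and maximum over the whole (local + static) frame.
    lo, hi = np.percentile(global_data, (1.0, 99.0))
    scale = 255.0 / max(float(hi - lo), 1.0)
    stretched = (target_frame.astype(np.float32) - lo) * scale
    return np.clip(stretched, 0.0, 255.0).astype(np.uint8)


def backlight_scale(global_data: np.ndarray) -> float:
    """Toy heuristic: allow a dimmer backlight when the composed frame is mostly dark."""
    mean_luma = float(global_data.mean())
    return min(1.0, 0.5 + mean_luma / 255.0)
```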
In this embodiment, the display device obtains a local refresh indication indicating that local refresh is to be performed starting from a target video frame; obtains first local data of the target video frame and static data of a first static area of the first video frame, the first video frame being the video frame preceding the target video frame; determines first global data of the target video frame according to the first local data and the static data of the first static area; and performs image enhancement processing on the target video frame according to the first global data to obtain an enhanced video frame of the target video frame. Because the static data of the static area is not refreshed, it can be reused to determine the global data of the target video frame, which reduces the impact on PQ enhancement processing that depends on global data and improves the picture display effect.
Next, the image processing method provided in the present application is further described through a second embodiment.
Fig. 5 is a flowchart of another image processing method provided in the second embodiment of the present application, where the method may be performed by a display device, or may be performed by an image processing apparatus provided in the display device, and the apparatus may be a chip, or may be a chip module, or may be an IDE, etc., and referring to fig. 5, the method includes the following steps:
S501, acquiring a local refresh instruction, wherein the local refresh instruction is used for instructing to execute local refresh from a target video frame.
S502, acquiring first local data of a target video frame.
In order to obtain the global data of the target video frame, the display device first acquires local data of the target video frame; specifically, the display device acquires the first local data of the target video frame.
In the embodiment of the present application, the video frame corresponding to the display interface may be divided into three kinds of areas: static areas, active areas, and transition areas. For example, fig. 6 is a schematic diagram of the picture areas of a display interface illustrated in the application: the transition area is located between the static area and the active area, and the active area is superimposed on the static area. The transition data of the transition area is used by the image enhancement component when enhancing the image, but is not displayed on the page.
In one possible implementation, the first local data includes transition data of at least one transition region and activity data of at least one activity region, and the display device may obtain the first local data according to the following manner:
the display device may acquire the activity data of at least one activity area of the target video frame, acquire preset data from the activity data of the at least one activity area, and then determine transition data of the at least one transition area according to the preset data.
In one possible implementation manner, the display device may obtain the preset data from the activity data of at least one activity area by:
for any one of the active areas, the display device may determine an upper boundary and/or a lower boundary of the active area, and then determine preset data according to data of a preset range adjacent to the upper boundary of each of the active areas and/or data of a preset range adjacent to the lower boundary of each of the active areas.
Illustratively, in the example of fig. 6, the upper and lower boundaries of the active area are boundary line 1 and boundary line 2, respectively, and the preset range may be one line interval. The data of the 6th line is then the data of the preset range adjacent to the upper boundary, and the data of the 12th line is the data of the preset range adjacent to the lower boundary; that is, the preset data may include the data of the 6th line and the data of the 12th line.
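A minimal sketch of this boundary-based selection, assuming a single-channel frame stored as a 2-D array indexed by row (0-based), an active region described by its first and last row indices, and a preset range expressed as a number of rows; the helper name and signature are hypothetical.

```python
import numpy as np


def preset_data_for_active_region(frame: np.ndarray,
                                  active_top: int,
                                  active_bottom: int,
                                  preset_range: int = 1) -> dict:
    """Collect preset data from the rows adjacent to the boundaries of one active region.

    active_top    -- index of the first row of the active region
    active_bottom -- index of the last row of the active region
    preset_range  -- how many rows next to each boundary to take (one row interval by default)
    """
    upper_rows = frame[active_top:active_top + preset_range]                  # next to the upper boundary
    lower_rows = frame[active_bottom - preset_range + 1:active_bottom + 1]    # next to the lower boundary
    # The collected rows serve as the transition data of the transition regions
    # sitting just outside the active region.
    return {"upper": upper_rows, "lower": lower_rows}
```

With the fig. 6 layout, the rows bounded by boundary line 1 and boundary line 2 would be passed as active_top and active_bottom.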
In another possible implementation, the display device may obtain still data for a second still region of the target video frame, determine the second still region based on the still data for the second still region, and determine the at least one active region based on the active data for the at least one active region.
For any one of the active areas, the display device may determine a boundary between the second stationary area and the active area, and obtain preset data from the active data of at least one of the active areas according to the boundary.
For example, the still data of the second still region may be composed of a plurality of pixel data, each of which may include a pixel value and a pixel position. The display device may delimit the second still region in the target video frame according to the pixel positions of the plurality of pixels, and may delimit at least one active region in the target video frame according to the activity data of the at least one active region.
For convenience of description, the display interface in fig. 6 is divided into a plurality of rows with equal row spacing. As can be seen from fig. 6, boundary line 1 lies between static area 1 and the active area. The display device may acquire the data of the 7th row, which is adjacent to boundary line 1 within the active area, and take the data of the 7th row as one part of the preset data; this part of the preset data may serve as the data of transition area 1, and, as shown in the figure, transition area 1 overlaps the area corresponding to the 6th row of static area 1.
Similarly, as shown in the figure, boundary line 2 lies between static area 2 and the active area. The display device may acquire the data of the 12th row, which is adjacent to boundary line 2 within the active area, and take it as another part of the preset data; this part may serve as the data of transition area 2, which, as shown in the figure, overlaps the area corresponding to the 13th row of static area 2. That is, in this example, the preset data consists of two parts: the data of the 7th row and the data of the 12th row.
When the image enhancement component performs image enhancement processing on the display interface and processes the active data of the active area, it can, while processing the first row of the active area (for example, the 7th row in fig. 6), acquire the data of transition area 1 and apply data enhancement processing to the 7th row, for example pixel value averaging. Compared with using data taken from the static area or padding with zero-valued pixels, enhancing against the transition data of the transition area improves the accuracy of the data enhancement.
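As a small example of the averaging mentioned above, the first row of the active region could be blended with the transition data before further processing. The equal 50/50 weighting is an assumption; the application only names pixel value averaging as one possible operation.

```python
import numpy as np


def blend_first_active_row(active_rows: np.ndarray, transition_row: np.ndarray) -> np.ndarray:
    """Average the first row of the active region with the transition data above it."""
    blended = active_rows.astype(np.float32)          # astype copies, so the input is untouched
    blended[0] = (blended[0] + transition_row.astype(np.float32)) / 2.0
    return np.clip(blended, 0.0, 255.0).astype(np.uint8)
```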
S503, obtaining second global data of the first video frame.
The refresh state of the display device for the first video frame is global refresh and the display device may obtain second global data for the first video frame.
S504, determining the static data of the first static area according to the second global data and the first local data.
Since the display device is in the local refresh state starting from the target video frame, i.e., only a local area of the target video frame is refreshed, the non-refreshed area (i.e., the static area) is displayed using the data of the static area of the first video frame.
Then, the still data of the first still region of the target video frame may be derived from the second global data of the first video frame and the first local data of the target video frame.
For example, as shown in fig. 7, the display device may determine the active area in which the first local data is located in the target video frame, locate the data of that same area in the first video frame, and then take the data in the second global data other than the data of that area as the static data of the first static area of the target video frame.
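A sketch of this derivation under the same grayscale-array assumption: the static data of the first static area is simply the previous frame's global data with the active-region pixels masked out. Representing the result as a NumPy masked array is an implementation choice made for the example, not something required by this application.

```python
import numpy as np


def still_data_of_first_still_region(second_global_data: np.ndarray,
                                     active_mask: np.ndarray) -> np.ma.MaskedArray:
    """Keep the previous frame's pixels everywhere except the active (refreshed) region.

    second_global_data -- full-frame data of the first video frame (globally refreshed)
    active_mask        -- True where the target video frame is locally refreshed
    """
    # Valid (unmasked) entries are exactly the static data of the first static area.
    return np.ma.masked_array(second_global_data, mask=active_mask)
```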
S505, determining first global data of the target video frame according to the first local data and the static data of the first static area.
S506, performing image enhancement processing on the target video frame according to the first global data to obtain an enhanced video frame of the target video frame.
For the description of S505 and S506, reference may be made to the above embodiments, and the description thereof will not be repeated here.
It will be appreciated that, for a video frame subsequent to the target video frame, referred to herein as a second video frame, where the second video frame is any video frame before a global refresh indication or a region change indication is received, the global data of the second video frame may be determined based on the static data of the first static area of the first video frame and the local data of the second video frame.
Specifically, the display device may obtain the active data of the active region and the transition data of the transition region of the second video frame, determine third global data of the second video frame according to the still data of the first still region, the active data of the active region of the second video frame, and the transition data of the transition region of the second video frame, and perform image enhancement processing on the second video frame according to the third global data to obtain an enhanced video frame of the second video frame.
The transition data of the transition area of the second video frame may be determined according to the activity data of the active area of the second video frame: the display device may take the data of a preset range adjacent to the upper boundary and/or the lower boundary of the active area as the transition data of the transition area. Taking fig. 6 as an example, the upper boundary may be boundary line 1 and the lower boundary may be boundary line 2, the data of the preset range are the 7th row and the 12th row, and the preset range in this example may be one row interval. Alternatively, the preset range may be two row intervals, in which case the data of the preset range adjacent to the upper boundary may be the data of the 5th and 6th rows.
As for the refresh processing of the video frames: illustratively, for the refresh of the target video frame, since the display device performs image enhancement processing on the target video frame before the frame is displayed, the display device may refresh its display interface based on the enhanced video frame of the target video frame.
In one possible implementation, the display device may update the data of the active area in the display interface to the activity data of the active area of the enhanced video frame of the target video frame, while the data of the static area is not refreshed. The display interface with the updated data is then displayed based on the activity data of the active area of the enhanced video frame and the static data of the static area of the enhanced video frame.
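One way to picture this final refresh step, under the assumption that the display interface is backed by a frame-buffer array of the same shape as the enhanced frame; the function name is hypothetical.

```python
import numpy as np


def refresh_display_buffer(frame_buffer: np.ndarray,
                           enhanced_frame: np.ndarray,
                           active_mask: np.ndarray) -> None:
    """Write only the enhanced active-region pixels; the static regions keep their content."""
    frame_buffer[active_mask] = enhanced_frame[active_mask]
```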
In this embodiment, the display device obtains a local refresh indication indicating that local refresh is to be performed starting from the target video frame; then acquires first local data of the target video frame, acquires second global data of the first video frame, and determines the static data of the first static area according to the second global data and the first local data, where the first video frame is the video frame preceding the target video frame; then determines the first global data of the target video frame according to the first local data and the static data of the first static area; and performs image enhancement processing on the target video frame according to the first global data to obtain an enhanced video frame of the target video frame. Because the static data of the static area is not refreshed, it can be reused to determine the global data of the target video frame, which reduces the impact on PQ enhancement processing that depends on global data and improves the picture display effect.
Fig. 8 is a schematic structural diagram of an image processing apparatus according to a third embodiment of the present application. Referring to fig. 8, the apparatus 80 includes: a first acquisition module 801, a second acquisition module 802, a third acquisition module 803, a determination module 804, and a processing module 805.
A first obtaining module 801, configured to obtain a local refresh indication, where the local refresh indication is used to instruct to perform a local refresh from a target video frame.
A second acquisition module 802, configured to acquire first local data of the target video frame.
The third obtaining module 803 is configured to obtain still data of a first still region of a first video frame, where the first video frame is a video frame previous to the target video frame.
A determining module 804 is configured to determine first global data of the target video frame according to the first local data and the still data.
And the processing module 805 is configured to perform image enhancement processing on the target video frame according to the first global data, so as to obtain an enhanced video frame of the target video frame.
In one possible implementation, the third obtaining module 803 is specifically configured to:
second global data of the first video frame is acquired.
Stationary data of the first stationary region is determined from the second global data and the first local data.
In one possible implementation, the second obtaining module 802 is specifically configured to:
activity data of at least one active region of a target video frame is acquired.
And acquiring preset data in the activity data of at least one activity area.
And determining transition data of at least one transition region according to the preset data.
Wherein the first local data comprises transition data of at least one transition region and activity data of at least one activity region.
In one possible implementation, the second obtaining module 802 is specifically configured to:
for any one of the active areas, an upper boundary and/or a lower boundary of the active area is determined.
And determining preset data according to the data of the preset range adjacent to the upper boundary of each active area and/or the data of the preset range adjacent to the lower boundary of each active area.
In one possible implementation, the apparatus further includes: and an enhancement module.
The enhancement module is specifically used for:
and acquiring the activity data of the active region and the transition data of the transition region of a second video frame, wherein the second video frame is the video frame before receiving the global refreshing instruction or the region change instruction.
And determining third global data of the second video frame according to the static data of the first static area, the active data of the active area of the second video frame and the transition data of the transition area of the second video frame.
And carrying out image enhancement processing on the second video frame according to the third global data to obtain an enhanced video frame of the second video frame.
In one possible implementation, the apparatus further includes: and a display module.
The display module is specifically used for:
and refreshing a display interface of the display device based on the enhanced video frame of the target video frame.
And displaying the updated display interface.
The device of the present embodiment may be used to execute the technical solutions of the foregoing method embodiments, and the specific implementation manner and the technical effects are similar, and are not repeated herein.
Fig. 9 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present application, and as shown in fig. 9, an electronic device 90 may include: at least one processor 901 and a memory 902.
A memory 902 for storing programs. In particular, the program may include program code including computer-executable instructions.
The Memory 902 may include random access Memory (Random Access Memory, RAM) and may also include Non-volatile Memory (Non-volatile Memory), such as at least one disk Memory.
The processor 901 is configured to execute computer-executable instructions stored in the memory 902 to implement the methods described in the foregoing method embodiments. The processor 901 may be a central processing unit (Central Processing Unit, CPU), or an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), or one or more integrated circuits configured to implement embodiments of the present application.
Optionally, the electronic device 90 may further include: a communication interface 903. In a specific implementation, if the communication interface 903, the memory 902, and the processor 901 are implemented independently, the communication interface 903, the memory 902, and the processor 901 may be connected to each other through a bus and communicate with each other. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, or an Extended Industry Standard Architecture (EISA) bus, among others. The bus may be divided into an address bus, a data bus, a control bus, and so on, but this does not mean that there is only one bus or one type of bus.
Alternatively, in a specific implementation, if the communication interface 903, the memory 902, and the processor 901 are integrated on a chip, the communication interface 903, the memory 902, and the processor 901 may complete communication through internal interfaces.
The electronic device 90 may be a chip, a chip module, an IDE, or a display device such as an intelligent home device, an intelligent wearable device, or an intelligent vehicle.
The electronic device of the present embodiment may be used to execute the technical solutions of the foregoing method embodiments, and the specific implementation manner and the technical effects are similar, and are not repeated herein.
A fifth embodiment of the present application provides a computer-readable storage medium, which may be any medium capable of storing computer-executable instructions, such as a USB flash disk, a removable hard disk, a read-only memory (ROM), a RAM, a magnetic disk, or an optical disc. The computer-executable instructions are stored in the computer-readable storage medium, and when they are executed by a computer, the technical solution shown in the foregoing method embodiments is carried out; the specific implementation and technical effects are similar and are not repeated here.
The sixth embodiment of the present application provides a computer program product, which includes a computer program, and when the computer program is executed by a computer, the technical solution shown in the foregoing method embodiment is executed, and the specific implementation manner and the technical effect are similar, and are not repeated herein.
The seventh embodiment of the present application provides a chip, on which a computer program is stored, where the computer program, when executed by the chip, causes the technical solution shown in the foregoing method embodiment to be executed.
In one possible implementation, the chip may also be a chip module.
The chip of this embodiment may be used to execute the technical solutions shown in the foregoing method embodiments; the specific implementation and technical effects are similar and are not repeated here.
An eighth embodiment of the present application provides a module apparatus, which includes a power module, a memory module, and a chip module.
The power supply module is used for providing electric energy for the module equipment.
The storage module is used for storing data and instructions.
The chip module is used to execute the technical solution shown in the foregoing method embodiments; the specific implementation and technical effects are similar and are not repeated here.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It is to be understood that the present application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.
In this application, "and/or" is merely an association relationship describing an association object, and indicates that three relationships may exist, for example, a and/or B may indicate: a exists alone, A and B exist together, and B exists alone. In this context, the character "/" indicates that the front and rear associated objects are an "or" relationship.
"at least one (item) below" or the like, refers to any combination of these items, including any combination of single item(s) or plural items(s). For example, at least one (one) of a, b or c may represent: a, b, c, a and b, a and c, b and c, or a, b and c, wherein each of a, b, c may itself be an element, or may be a collection comprising one or more elements.
The term "at least one" in this application means one or more. "plurality" means two or more. The first, second, etc. descriptions in the embodiments of the present application are only used for illustrating and distinguishing the description objects, and no order division is used, nor does it indicate that the number of the devices in the embodiments of the present application is particularly limited, and no limitation on the embodiments of the present application should be construed. For example, the first threshold and the second threshold are merely for distinguishing between different thresholds, and are not intended to represent differences in the size, priority, importance, or the like of the two thresholds.
In this application, "exemplary," "in some embodiments," "in other embodiments," etc. are used to indicate an example, instance, or illustration. Any embodiment or design described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, the term use of an example is intended to present concepts in a concrete fashion.
"of", corresponding "and" associated "in this application may be sometimes used in combination, and it should be noted that the meaning of the expression is consistent when the distinction is not emphasized. Communication, transmission may sometimes be mixed in embodiments of the present application, it should be noted that the meaning expressed is consistent with the de-emphasis. For example, a transmission may include sending and/or receiving, either nouns or verbs.
In this application, "equal to" may be used in conjunction with "less than" or "greater than" but not in conjunction with "less than" and "greater than" at the same time. When the combination of the 'equal' and the 'less' is adopted, the method is applicable to the technical scheme adopted by the 'less'. When being used with 'equal to' and 'greater than', the method is applicable to the technical scheme adopted by 'greater than'.

Claims (10)

1. An image processing method, the method comprising:
obtaining a local refresh indication, wherein the local refresh indication is used for indicating to execute local refresh from a target video frame;
acquiring first local data of the target video frame;
acquiring static data of a first static area of a first video frame, wherein the first video frame is a previous video frame of the target video frame;
determining first global data of the target video frame according to the first local data and the static data of the first static area;
and carrying out image enhancement processing on the target video frame according to the first global data to obtain an enhanced video frame of the target video frame.
2. The method of claim 1, wherein the acquiring the still data of the first still region of the first video frame comprises:
acquiring second global data of the first video frame;
and determining the static data of the first static area according to the second global data and the first local data.
3. The method according to claim 1 or 2, wherein said acquiring the first local data of the target video frame comprises:
acquiring activity data of at least one activity area of the target video frame;
acquiring preset data from the activity data of the at least one activity area;
determining transition data of the at least one transition region according to the preset data;
wherein the first local data comprises transition data of at least one transition region and activity data of at least one activity region.
4. A method according to claim 3, wherein acquiring preset data in the activity data of at least one activity area comprises:
determining an upper boundary and/or a lower boundary of an active region for any one of the active regions;
and determining the preset data according to the data of the preset range adjacent to the upper boundary of each active area and/or the data of the preset range adjacent to the lower boundary of each active area.
5. The method according to any one of claims 1-4, further comprising:
acquiring active data of an active region and transition data of a transition region of a second video frame, wherein the second video frame is a video frame before receiving a global refreshing instruction or a region changing instruction;
determining third global data of the second video frame according to the static data of the first static area, the active data of the active area of the second video frame and the transition data of the transition area of the second video frame;
and carrying out image enhancement processing on the second video frame according to the third global data to obtain an enhanced video frame of the second video frame.
6. The method according to any one of claims 1-4, further comprising:
refreshing a display interface of the display device based on the enhanced video frame of the target video frame;
and displaying the updated display interface.
7. An image processing apparatus, comprising:
the first acquisition module is used for acquiring a local refresh instruction, wherein the local refresh instruction is used for instructing to execute local refresh from a target video frame;
the second acquisition module is used for acquiring the first local data of the target video frame;
a third obtaining module, configured to obtain still data of a first still region of a first video frame, where the first video frame is the video frame preceding the target video frame;
the determining module is used for determining first global data of the target video frame according to the first local data and the static data;
and the processing module is used for carrying out image enhancement processing on the target video frame according to the first global data to obtain an enhanced video frame of the target video frame.
8. An electronic device, comprising: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored in the memory to implement the image processing method of any one of claims 1-6.
9. A computer-readable storage medium, in which computer-executable instructions are stored, which when executed by a processor are adapted to carry out the image processing method according to any one of claims 1-6.
10. A computer program product comprising a computer program which, when executed by a processor, implements the image processing method of any of claims 1-6.
CN202310560903.4A 2023-05-17 2023-05-17 Image processing method, apparatus, device, storage medium, and program product Pending CN116567352A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310560903.4A CN116567352A (en) 2023-05-17 2023-05-17 Image processing method, apparatus, device, storage medium, and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310560903.4A CN116567352A (en) 2023-05-17 2023-05-17 Image processing method, apparatus, device, storage medium, and program product

Publications (1)

Publication Number Publication Date
CN116567352A true CN116567352A (en) 2023-08-08

Family

ID=87497921

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310560903.4A Pending CN116567352A (en) 2023-05-17 2023-05-17 Image processing method, apparatus, device, storage medium, and program product

Country Status (1)

Country Link
CN (1) CN116567352A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination