CN115472140A - Display method, display device, electronic apparatus, and readable storage medium - Google Patents
- Publication number: CN115472140A (application CN202211108429.3A)
- Authority
- CN
- China
- Prior art keywords
- image
- resolution
- frame data
- image frame
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G09G5/391: Resolution modifying circuits, e.g. variable screen formats (control of the bit-mapped memory)
- G09G3/2022: Display of intermediate tones by time modulation using two or more time intervals, using sub-frames
- G09G3/3208: Matrix display control using electroluminescent panels, semiconductive, organic, e.g. using organic light-emitting diodes [OLED]
- G09G2320/02: Control of display operating conditions; improving the quality of display appearance
Abstract
The application discloses a display method, a display device, an electronic device and a readable storage medium, belonging to the technical field of display. The method is applied to an electronic device that includes a system-on-chip and an independent display chip connected to each other, and comprises the following steps: the system-on-chip, in response to a first instruction for displaying the current display picture in an enlarged manner, performs image cropping on the current display picture to obtain image frame data at a first image resolution; the image frame data at the first image resolution and a super-resolution indication are sent to the independent display chip; the independent display chip performs super-resolution processing on the image frame data at the first image resolution to obtain image frame data at a second image resolution; and the image frame data at the second image resolution is sent to the display screen.
Description
Technical Field
The application belongs to the technical field of display processing, and particularly relates to a display method, a display device, an electronic device and a readable storage medium.
Background
With the development of science and technology, electronic devices are applied to ever more aspects of daily life and bring great convenience. Correspondingly, however, higher demands are placed on them. For example, when a user views album photos on an electronic device, or during photographing, video recording or scene preview before shooting, the user often zooms in on the display picture to see its details more clearly. However, enlarging the display picture usually reduces the resolution of the corresponding image, resulting in poor definition. As another example, even without an enlarging operation, the user sometimes wants certain detail information in the image, such as a two-dimensional code or business-card information, to be displayed more clearly.
Therefore, there is a need to provide a new display method to solve the above technical problems in the prior art.
Disclosure of Invention
An embodiment of the present application provides a display method, a display apparatus, an electronic device, and a readable storage medium, which can solve the problem that a picture is not displayed clearly when the user enlarges it.
In a first aspect, an embodiment of the present application provides a display processing method applied to an electronic device, where the electronic device includes a system-on-chip and an independent display chip connected to each other. The method includes: the system-on-chip, in response to a first instruction for displaying the current display picture in an enlarged manner, performs image cropping on the current display picture to obtain image frame data at a first image resolution; sends the image frame data at the first image resolution and a super-resolution indication to the independent display chip; the independent display chip performs super-resolution processing on the image frame data at the first image resolution to obtain image frame data at a second image resolution; and the image frame data at the second image resolution is displayed.
In a second aspect, an embodiment of the present application provides a display device including a system-on-chip and an independent display chip connected to each other, where the system-on-chip is configured to perform image cropping on the current display picture in response to a first instruction for displaying it in an enlarged manner, obtain image frame data at a first image resolution, and send the image frame data at the first image resolution and a super-resolution indication to the independent display chip; and the independent display chip is configured to perform super-resolution processing on the image frame data at the first image resolution to obtain image frame data at a second image resolution, and output the image frame data at the second image resolution for display.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory, where the memory stores a program or instructions executable on the processor, and the program or instructions, when executed by the processor, implement the steps of the method according to any one of the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, the present application provides a computer program product, which is stored in a storage medium and executed by at least one processor to implement the method according to the first aspect.
In a sixth aspect, an embodiment of the present application provides a system-on-chip, where the system-on-chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the steps performed by the system-on-chip in the display method according to the first aspect.
In a seventh aspect, an embodiment of the present application provides an independent display chip, where the independent display chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the steps performed by the independent display chip in the display method according to the first aspect.
According to the display method provided by the application, when the image frame data of the display picture at the first resolution is obtained by enlarging and cropping a shooting preview picture or a picture stored in the local album, performing super-resolution processing on that image frame data improves the definition of the enlarged preview picture or album picture, and solves the problem of poor definition caused by the reduction of image resolution when the picture is enlarged.
Drawings
FIG. 1 is a flowchart of a display processing method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of the display picture processing procedure of a display processing method according to an embodiment of the present application;
FIG. 3 is a flowchart of super-resolution processing in a display processing method according to an embodiment of the present application;
FIG. 4 is a flowchart of precise recognition in a display processing method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of the display processing procedure in the precise recognition mode of a display processing method according to an embodiment of the present application;
FIG. 6 is a block diagram of an independent display chip according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of a display device according to an embodiment of the present application;
FIG. 8 is a flowchart of a display processing apparatus according to an embodiment of the present application when processing a locally stored picture;
FIG. 9 is a flowchart of a display processing apparatus according to an embodiment of the present application when processing a photographing preview picture;
FIG. 10 is a block diagram of an electronic device according to an embodiment of the present application;
FIG. 11 is a hardware structure diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application. It is obvious that the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments that can be derived by a person of ordinary skill in the art from the embodiments given herein fall within the scope of protection of the present application.
The terms "first", "second" and the like in the specification and claims of the present application are used to distinguish between similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that the data so used may be interchanged under appropriate circumstances so that the embodiments of the application can be practiced in orders other than those illustrated or described herein. Objects distinguished by "first", "second" and the like are usually of one class, and their number is not limited; for example, the first object may be one or more than one. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates that the preceding and succeeding objects are in an "or" relationship.
It should be noted that all actions of acquiring signals, information or data in the present application are performed under the premise of complying with the corresponding data protection regulation policy of the country of the location and obtaining the authorization given by the corresponding device/account owner.
The display processing method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
As shown in fig. 1, an embodiment of the present application provides a display method applied to an electronic device. The electronic device may be a mobile phone, a tablet (PAD), or another portable mobile processing device with an electronic display screen. The electronic device includes a display screen, a system-on-chip (SOC) and an independent display chip, the system-on-chip is connected to the independent display chip, and the independent display chip is connected to the display screen. The method comprises steps 101-104:
in this embodiment, the display picture may be a scene preview picture during photographing or video recording, an existing picture obtained from a memory of the electronic device (such as a picture in a gallery), or a picture received through communication with another device; the specific source of the display picture is not limited here. Because different electronic devices have different screen sizes and screen resolutions, those skilled in the art will appreciate that even the same display picture may be presented at different resolutions on different electronic device screens, or on different portions of a screen. It should be understood that, in the present invention, whatever the screen resolution of the electronic device, whether the picture is displayed partially or completely on the display screen, and whether it is displayed at different time points, what the electronic device presents on its display screen may be referred to as the display picture.
Step 101: when the current display picture needs to be displayed in an enlarged manner, the user performs a touch zoom-in operation on the current display picture, or the electronic device automatically detects a scene that needs enlarged display, so as to trigger a first instruction for enlarged display; the system-on-chip performs image cropping on the current display picture corresponding to the first instruction, thereby obtaining image frame data at a first image resolution.
The first resolution may be a resolution that matches the screen resolution of the electronic device. For example, the first resolution may be a resolution smaller than the screen resolution, such as 540P or 720P, so that the subsequent processing step can perform super-resolution processing on the image frame data at the first resolution and obtain image frame data at the screen resolution for display. The first resolution should therefore be a resolution from which the screen resolution can be obtained by the subsequent processing step.
Alternatively, when the subsequent processing step is the fixed-magnification super-resolution processing described in the following embodiments, the first resolution is generally the screen resolution divided by one of the preset magnifications, so that image frame data at the screen resolution is obtained after the subsequent processing step. For example, when the screen resolution is 1080P, the first resolution may be any one of 270P/360P/432P/540P/720P/1080P, so that super-resolution processing at a magnification of 4X/3X/2.5X/2X/1.5X/1X can be correspondingly performed in the subsequent processing step, yielding image frame data at the 1080P screen resolution.
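By way of illustration only, and not as part of the claimed method, the following Python sketch shows how a fixed set of super-resolution gears could determine the candidate first resolutions for a given screen; the gear list and the rule of rounding a crop up to the next gear resolution are assumptions drawn from the 1080P example above, and the function names are illustrative.

```python
SCREEN_HEIGHT = 1080                      # assumed screen resolution (1080P)
GEARS = [4.0, 3.0, 2.5, 2.0, 1.5, 1.0]    # fixed AISR magnification gears

def candidate_first_resolutions(screen_h=SCREEN_HEIGHT, gears=GEARS):
    """First resolutions that the fixed gears can map back to the screen."""
    return sorted({round(screen_h / g) for g in gears})   # 270, 360, 432, 540, 720, 1080

def select_first_resolution(crop_h, screen_h=SCREEN_HEIGHT, gears=GEARS):
    """Return (first_resolution, magnification) for a cropped picture height.

    If the crop already equals a gear resolution it is used directly; otherwise
    it must first be pre-interpolated up to the next gear resolution (e.g. a
    480P crop is interpolated to 540P and then super-resolved 2x).
    """
    for cand in candidate_first_resolutions(screen_h, gears):
        if cand >= crop_h:
            return cand, screen_h / cand
    return screen_h, 1.0                  # crop not smaller than the screen: no super-resolution needed

if __name__ == "__main__":
    print(select_first_resolution(540))   # (540, 2.0)
    print(select_first_resolution(480))   # (540, 2.0), after pre-interpolating 480P -> 540P
```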
The image frame data of the first resolution may be obtained in the following manner in a specific application scenario of the present invention.
Application scenario 1: shooting scene preview
In this application scenario, the user turns on the camera, and the camera sensor acquires a shooting scene preview picture that is displayed on the screen, for example a scene preview captured by a 16-megapixel camera sensor displayed on a screen with a screen resolution of 1080P. When the preview picture displayed on the screen still contains details that cannot be presented clearly, the camera APP may automatically detect this and invoke a picture enlargement operation, or the user may perform the enlargement operation manually, thereby triggering the first instruction; the picture on which the enlargement operation is performed is the display picture. The system-on-chip of the electronic device crops the display picture according to the first instruction. If the size of the cropped picture and its corresponding resolution happen to match the screen resolution, that is, the screen resolution is a fixed multiple of the cropped resolution, the cropped resolution and the corresponding image frame data are used as the image frame data of the display picture at the first resolution. For example, when the screen resolution is 1080P and the resolution of the cropped picture is exactly 720P, i.e. the resolution for 1.5X (1080P/720P) super-resolution processing, the cropped resolution and the corresponding image frame data are used as the image frame data of the display picture at the first resolution. In another case, because the area and scale of the zoom-in operation are not fixed, the size and resolution of the cropped picture may well fail to match the screen resolution, i.e. the screen resolution is not a fixed multiple of the cropped resolution. The system-on-chip then performs pre-interpolation on the cropped picture to obtain image frame data at a resolution that matches the screen resolution and is suitable for the subsequent processing steps, and the matched image frame data is used as the image frame data of the display picture at the first resolution. For example, when the screen resolution is 1080P and the resolution of the cropped picture is 480P, i.e. between 432P and 540P, the system-on-chip may pre-interpolate the cropped picture to obtain image frame data at 540P, which is then used as the image frame data of the display picture at the first resolution.
Application scenario 2: browsing a gallery
In this application scenario, the user enters the local album, browses pictures and selects one; that picture is the display picture. If the initial resolution of the display picture is higher than the screen resolution, the system-on-chip runs a resolution-reduction algorithm to compress the initial resolution down to the screen resolution before subsequent display processing. For example, if the initial resolution of the display picture is 2160P (2160 × 3840) and the screen resolution is 1080P (1080 × 1920), the initial resolution is greater than the screen resolution, so the system-on-chip performs resolution reduction on the image frame data at the initial resolution to obtain image frame data of the display picture at 1080P. If the initial resolution of the display picture is less than or equal to the screen resolution, the system-on-chip performs subsequent display processing on the image frame data directly. If details of the picture displayed on the screen still cannot be presented clearly, the user enlarges the display picture in order to see those details, thereby triggering the first instruction, and the system-on-chip crops the display picture accordingly. If the size and resolution of the cropped picture are still larger than the screen resolution, the system-on-chip only performs resolution reduction, compresses the cropped picture to the screen resolution and displays it. If the size and resolution of the cropped picture are less than or equal to the screen resolution and happen to match the screen resolution, i.e. the screen resolution is a fixed multiple of the cropped resolution, the cropped resolution and the corresponding image frame data are used as the image frame data of the display picture at the first resolution. In another case, if the size and resolution of the cropped picture are less than or equal to the screen resolution but, because the area and scale of the zoom-in operation are not fixed, do not match the screen resolution (the screen resolution is not a fixed multiple of the cropped resolution), the system-on-chip performs pre-interpolation on the cropped picture to obtain image frame data at a resolution that matches the screen resolution, and the matched image frame data is used as the image frame data of the display picture at the first resolution.
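As an illustrative sketch of the gallery decision flow described above (not the patent's implementation), the following Python function shows how a system-on-chip might choose among resolution reduction, direct display, direct use of a crop, and crop plus pre-interpolation; the class and function names, and the 1080P gear table, are assumptions.

```python
from dataclasses import dataclass
from typing import Optional

SCREEN_H = 1080
GEAR_RESOLUTIONS = [270, 360, 432, 540, 720, 1080]   # 4x/3x/2.5x/2x/1.5x/1x for a 1080P screen

@dataclass
class Plan:
    action: str                               # "downscale", "direct", "crop_direct", "crop_pre_interpolate"
    first_resolution: Optional[int] = None
    magnification: Optional[float] = None

def plan_for_gallery_frame(initial_h: int, crop_h: Optional[int] = None) -> Plan:
    """Decide how a gallery picture (and an optional zoom crop) is prepared by the SoC."""
    if crop_h is None:                                    # no zoom-in yet
        return Plan("downscale") if initial_h > SCREEN_H else Plan("direct")
    if crop_h > SCREEN_H:                                 # crop still larger than the screen
        return Plan("downscale")
    target = next(r for r in GEAR_RESOLUTIONS if r >= crop_h)
    action = "crop_direct" if target == crop_h else "crop_pre_interpolate"
    return Plan(action, first_resolution=target, magnification=SCREEN_H / target)

if __name__ == "__main__":
    print(plan_for_gallery_frame(2160))                   # downscale 2160P to 1080P
    print(plan_for_gallery_frame(1080, crop_h=480))       # pre-interpolate 480P -> 540P, then 2x SR
```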
Step 102: sending the image frame data at the first image resolution and the super-resolution indication to the independent display chip.
Step 103: the independent display chip performs super-resolution processing on the image frame data at the first image resolution to obtain image frame data at a second image resolution.
the super-resolution processing in the present invention may be based on a deep learning artificial intelligence neural network algorithm, and some existing algorithms may be utilized in the present invention, such as the AISR algorithm. In order to improve the efficiency of super-resolution processing, it is generally necessary to cure the AISR processing algorithm to hardware — an AISR processing circuit cured in an independent display chip (independent chip). After the hardware specification is determined, the superminute multiplying power of the AISR processing algorithm is also fixed, a plurality of superminute gears can be designed for the AISR processing algorithm, the superminute gears are usually integers, for example, the gears of 4X/3X/2.5X/2X/1.5X/1X are adjustable, and by adjusting different gears, 4 times of superminute/3 times of superminute/2.5 times of superminute/2 times of superminute/1.5 times of superminute/1 times of superminute can be carried out on images, so that image frame data under different resolutions can be processed.
In one embodiment, the super-resolution processing described in step 103 may be performed as follows: acquiring the associated image frame data at the second image resolution that was most recently stored in the frame buffer memory;
and performing super-resolution processing on the image frame data at the first image resolution using the associated image frame data at the second image resolution, to obtain the image frame data at the second image resolution.
Specifically, the associated image frame data at the second image resolution is used to perform super-resolution processing at the target magnification on the image frame data at the first image resolution, so as to obtain the image frame data at the second image resolution.
In this embodiment, the independent display chip includes a frame buffer memory. When performing super-resolution processing on the image frame data of the display picture at the first resolution, the associated image frame data at the second image resolution that is related to the display picture to be processed and is stored in the frame buffer memory may be used; that is, the super-resolution processing is assisted by the previous frame image stored in the frame buffer memory, so as to obtain the image frame data at the second image resolution. The image frame data at the second image resolution is then stored into the frame buffer memory as associated image frame data at the second image resolution.
The specific steps include the following:
Step 302: the (N+1)th frame data and the Nth frame data in the frame buffer memory are jointly used as the input of the super-resolution processing, the Nth frame data serves as the associated image frame data of the super-resolution processing, and the (N+1)th frame data is super-resolution processed; the processed (N+1)th frame data is then output.
When the Nth frame data in the frame buffer memory serves as the associated image frame data for super-resolution processing of the (N+1)th frame data, the resolution of the Nth frame data may be the same as the resolution reached by the (N+1)th frame data after super-resolution processing, i.e. the same as the second image resolution, so that the (N+1)th frame data at the second image resolution is obtained after the (N+1)th frame data is super-resolution processed.
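A minimal sketch of this frame-buffer-assisted flow is given below, purely for illustration: the "SR kernel" is replaced by a nearest-neighbour upscale plus a blend with the stored frame N, and the blend weights and helper names are assumptions, not the chip's actual circuit.

```python
import numpy as np

class FrameBuffer:
    def __init__(self):
        self.reference = None                 # frame N, already at the second image resolution

def nn_upscale(frame: np.ndarray, scale: int) -> np.ndarray:
    # nearest-neighbour upscale used here as a stand-in for the real SR kernel
    return np.repeat(np.repeat(frame, scale, axis=0), scale, axis=1)

def super_resolve_next_frame(frame_lr: np.ndarray, scale: int, fb: FrameBuffer) -> np.ndarray:
    """Super-resolve frame N+1 (first resolution), assisted by frame N in the buffer."""
    out = nn_upscale(frame_lr, scale).astype(np.float32)
    if fb.reference is not None and fb.reference.shape == out.shape:
        # blend in detail from the temporally adjacent associated frame
        out = 0.7 * out + 0.3 * fb.reference.astype(np.float32)
    out = np.clip(out, 0, 255).astype(np.uint8)
    fb.reference = out                        # stored back as the associated frame for frame N+2
    return out

fb = FrameBuffer()
frame = np.random.randint(0, 256, (540, 960), dtype=np.uint8)
print(super_resolve_next_frame(frame, 2, fb).shape)       # (1080, 1920)
```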
When performing super-resolution processing, the associated image frame data may be chosen differently according to the source of the display picture. In the shooting scene preview example, the associated image may be an image that meets the second image resolution and is related to the current preview picture, for example an image among the already captured frames that is temporally adjacent and whose scene parameters are highly similar. In the gallery browsing example, the associated image may be an image in the gallery whose shooting time is adjacent to that of the picture to be processed, or an image in the gallery whose display content and image parameters are highly similar to those of the picture to be processed.
It should be noted that the method of the present invention generally performs super-resolution processing on the portion of the display picture whose definition needs to be emphasized, so as to meet the user's need to view that portion closely. The associated image is therefore usually related to the content of the display picture; it may, however, have little relation to that content, and as long as its resolution satisfies the second image resolution it can still serve as the associated image for super-resolution processing of the image frame data of the display picture at the first resolution, improving the definition of the display picture. Meanwhile, the image obtained after super-resolution processing can be stored in the frame buffer memory for further super-resolution processing, so that the definition of the picture is improved step by step.
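For illustration only, a possible way to pick such an associated image (temporal adjacency first, then similarity of shooting parameters, restricted to frames that already have the second image resolution) is sketched below; the Candidate fields and the ranking rule are assumptions, not the patent's algorithm.

```python
from dataclasses import dataclass
from typing import Optional, Sequence

@dataclass
class Candidate:
    timestamp: float      # shooting time in seconds
    exposure: float       # example scene/shooting parameter
    height: int           # vertical resolution of the stored frame

def pick_associated(current_ts: float, current_exposure: float,
                    second_h: int, candidates: Sequence[Candidate]) -> Optional[Candidate]:
    # only frames that already have the second image resolution are eligible
    eligible = [c for c in candidates if c.height == second_h]
    if not eligible:
        return None
    # prefer the temporally closest frame, then the one with the most similar parameters
    return min(eligible, key=lambda c: (abs(c.timestamp - current_ts),
                                        abs(c.exposure - current_exposure)))
```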
Optionally, the second image resolution refers to a resolution of the image after the super-resolution processing. Since the image frame data of the second image resolution is generally used for direct output, the second image resolution is usually the current screen resolution, although the second image resolution may be higher than the screen resolution. In this case, if the image frame data of the second image resolution is used for screen display, it is also necessary to perform resolution reduction processing.
The target magnification corresponding to the super-resolution processing refers to the magnification during the super-resolution processing. Generally, the determination manner of the target magnification may be adjusted as needed, for example, the system-on-chip may determine the target magnification directly according to the first resolution obtained after the cropping, and send the target magnification together with the image frame data at the first resolution to the independent display chip.
Of course, the target magnification may also be autonomously determined by the independent display chip according to the received image frame data at the first resolution sent by the system-on-chip. In this embodiment, the target magnification is a ratio between a second image resolution and a first image resolution, and the second image resolution may be determined according to a screen resolution or a resolution of an associated image stored in a frame buffer memory.
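As a small illustrative helper (an assumption, not the chip's interface), the target magnification derived by the independent display chip can be thought of as the ratio of the second image resolution to the first, snapped to the nearest available gear:

```python
GEARS = [1.0, 1.5, 2.0, 2.5, 3.0, 4.0]    # assumed gear set from the earlier 1080P example

def target_magnification(first_h: int, second_h: int, gears=GEARS) -> float:
    """Snap the ratio of the second to the first image resolution to an available gear."""
    ratio = second_h / first_h
    return min(gears, key=lambda g: abs(g - ratio))

print(target_magnification(540, 1080))   # 2.0
print(target_magnification(720, 1080))   # 1.5
```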
Step 104: sending the image frame data at the second image resolution to the display screen.
In this embodiment, the image frame data of the display at the second image resolution is already an enhanced image after super resolution processing, which may be provided for output to a display screen.
The display processing method of the present invention has been described above according to the embodiment of the present invention, and has the following advantageous effects:
first, users often enlarge the display picture when viewing album photos, taking pictures or recording videos. Because the system-on-chip produces the enlarged picture by cropping and interpolating the original image (the album picture or the image input by the camera sensor), the resolution of the original image is reduced whether an album picture or a photographing/recording preview is enlarged, resulting in poor definition. In the present invention, super-resolution processing is performed on the image frame data of the display picture at the first resolution produced by the system-on-chip, so the definition of the enlarged shooting preview picture or album picture can be improved, solving the problem of poor definition caused by the reduction of image resolution during enlargement.
Secondly, the associated image can be used to perform super-resolution processing on the part of the display picture whose definition needs to be emphasized, meeting the user's need to view that part closely.
Thirdly, the method remarkably alleviates the prior-art problem that an electronic device can only perform local enlargement by pre-interpolation in the system-on-chip and therefore still cannot display the picture clearly.
In the above embodiment, since the image frame data at the first resolution is the object to be finally subjected to the enhanced display, and the image frame data is acquired in different manners in different application scenes, the actual value of the first resolution may be different. Therefore, in order to facilitate super-resolution processing of a display screen with a predetermined magnification, the system on chip, in response to a first instruction for displaying a current display screen in an enlarged manner, performs image cropping processing on the current display screen to obtain image frame data at a first image resolution, and includes:
the system-on-chip performs image cropping on the current display picture according to the first instruction to obtain image frame data at a third image resolution; when the third image resolution is equal to the first image resolution, the cropped image frame data is used as the image frame data at the first image resolution;
and when the third image resolution is not equal to the first image resolution, performing pre-interpolation processing on the image frame data under the third image resolution to obtain the image frame data under the first resolution.
This step provides a processing scheme for the case where the user, while browsing the display picture, wants local details of the picture to be shown clearly. That is: the user triggers the first instruction for displaying the display picture in an enlarged manner; when the first instruction is triggered, the image frame data of the enlargement area corresponding to the first instruction is cropped from the display picture at a third image resolution; the image frame data at the third image resolution is then processed to obtain the image frame data at the first resolution.
The following description is provided in connection with specific application scenarios.
Scenario 1: shooting scene preview
In this application scenario, the user opens the camera; image data of the shooting scene preview picture is acquired by the camera sensor, the system-on-chip performs noise reduction, dead-pixel correction, color-gamut adjustment, demosaicing and other processing on the image data, and the processed image is displayed on the screen. When the preview picture displayed on the screen still contains details that cannot be presented clearly, the camera APP may automatically detect this and trigger the first instruction for enlarged display, and the system-on-chip of the electronic device receives and responds to the first instruction and performs the enlargement operation on the picture; alternatively, the user performs the enlargement operation manually to trigger the first instruction. However the picture is enlarged, it must then be cropped by the system-on-chip; the size of the cropped picture and the corresponding image frame data are taken as the image frame data at the third image resolution. If the resolution of the image frame data in the enlargement area, i.e. the third image resolution, happens to match the screen resolution (for example, the screen resolution is a fixed multiple of it), the image frame data at the third image resolution is used as the image frame data of the display picture at the first resolution; that is, the third image resolution is the first image resolution. In the other case, where the resolution of the image frame data in the enlargement area does not match the screen resolution and the third image resolution is not equal to the first image resolution, the system-on-chip further performs pre-interpolation on the image frame data at the third image resolution to match the screen resolution, and the matched image frame data is used as the image frame data of the display picture at the first resolution.
Scenario 2: browsing a gallery
In this scenario, the user enters the local album, browses pictures and selects one; that picture is the display picture. If the initial resolution of the display picture is higher than the screen resolution, the system-on-chip runs a resolution-reduction algorithm to compress the initial resolution to the screen resolution before subsequent display processing. If the initial resolution is less than or equal to the screen resolution, the system-on-chip performs subsequent display processing on the image frame data directly. If details of the picture displayed on the screen still cannot be presented clearly, the user enlarges the display picture in order to see those details, thereby triggering the first instruction for enlarged display. The system-on-chip of the electronic device responds to the first instruction and crops the display picture accordingly; if the size and resolution of the cropped picture are still larger than the screen resolution of the electronic device, the resolution of the cropped image frame data is reduced to the screen resolution and the resolution-reduced image frame data to be displayed is sent to the independent display chip for display. Otherwise, if the resolution of the image frame data of the enlargement area, i.e. the third image resolution, happens to match the screen resolution (for example, the screen resolution is a fixed multiple of it), the image frame data at the third image resolution is used as the image frame data of the display picture at the first resolution, i.e. the third image resolution is the first image resolution. If the resolution of the image frame data of the enlargement area does not match the screen resolution and the third image resolution is not equal to the first image resolution, the system-on-chip further performs pre-interpolation on the image frame data at the third image resolution to match the screen resolution, and the matched image frame data is used as the image frame data of the display picture at the first resolution.
The display picture and the effect of performing steps 101-104 are explained below with reference to fig. 2. As shown in fig. 2 (a), the original resolution of the display picture is 1080P, the screen resolution is 1080P, and the display picture contains a target area 202 that the user needs to focus on. Because the target area 202 is small and cannot be viewed clearly in the display picture, the local area 201 containing it needs to be enlarged. The user performs the zoom-in operation manually to trigger the first instruction, or the APP automatically detects the need and triggers the first instruction for enlarging the picture. The system-on-chip crops and pre-interpolates the local area, and the resolution of the processed image frame data of the local area 201 is 540P, as shown in fig. 2 (b). However, the definition of the local area 201 at this resolution is not improved at all; the cropped picture must be enlarged 2× to 1080P by the super-resolution processing algorithm, so that a clearer 1080P picture of the target area 202 is obtained, as shown in fig. 2 (c).
In one embodiment, as shown in fig. 3, the super-resolution processing described in the above embodiments can be based on the manner described in steps 301-304:
Step 301: in this step, the image frame data of the display picture at the first resolution is super-resolution reconstructed with low precision using the associated image frame data, where the associated image frame data is the associated image frame data at the second image resolution; this preliminary reconstruction improves the definition of the image frame data at the first resolution. For example, if the resolution of the associated image is 1080P and the first resolution is 540P, the current 540P frame data can be super-resolution reconstructed using the 1080P associated image frame data, improving the definition of the current 540P frame. In other words, this image preprocessing step does not change the resolution of the display picture; by preprocessing the image frame data at the first resolution, the definition of the display picture at the first resolution is preliminarily improved.
Step 302: in this step, the image frame data of the display picture at the first resolution obtained in step 301 is divided into a plurality of lower-resolution sub-blocks, giving a number of image frame data sub-blocks at a fourth resolution as the input of the next step, for example sub-blocks of 10 × 10 pixels.
Step 303: performing image encoding, feature extraction and image decoding on the image frame data sub-blocks at the fourth resolution to obtain a plurality of image frame data sub-blocks at a fifth resolution, where the fifth resolution is greater than the fourth resolution.
In this step, each fourth-resolution sub-block is used as the input of the super-resolution reconstruction step and, after processing such as image encoding, feature extraction and image decoding, a fifth-resolution sub-block is output; for example, a 10 × 10 sub-block becomes a 20 × 20 sub-block after processing.
It should be noted that the super-resolution reconstruction step is a deep-learning model trained on massive data; it usually contains two or more convolutional neural network layers as intermediate computation layers and provides low-resolution image encoding and high-resolution image decoding functions.
Step 304: combining the plurality of image frame data sub-blocks at the fifth resolution to obtain the image frame data at the second image resolution.
In this step, the plurality of image frame data sub-blocks with the fifth resolution are combined to obtain image frame data at the second image resolution, and the image frame data at the second image resolution is output. Alternatively, the image frame data at the second image resolution may be saved to the frame buffer memory as associated image data for the next image to be processed.
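A minimal numpy sketch of the split / per-block reconstruct / merge flow of steps 302-304 is given below for illustration; the per-block encode, feature-extraction and decode network is replaced here by a nearest-neighbour upscale stub, and the block size and scale follow the 10 × 10 to 20 × 20 example in the text.

```python
import numpy as np

def split_into_blocks(frame: np.ndarray, block: int = 10):
    """Step 302: divide the first-resolution frame into small sub-blocks."""
    h, w = frame.shape[:2]
    return [frame[i:i + block, j:j + block]
            for i in range(0, h, block) for j in range(0, w, block)]

def reconstruct_block(block: np.ndarray, scale: int = 2) -> np.ndarray:
    """Step 303 stand-in: the real model would encode, extract features and decode."""
    return np.repeat(np.repeat(block, scale, axis=0), scale, axis=1)

def merge_blocks(blocks, out_h: int, out_w: int, block_out: int = 20) -> np.ndarray:
    """Step 304: reassemble the upscaled sub-blocks into the second-resolution frame."""
    out = np.zeros((out_h, out_w), dtype=blocks[0].dtype)
    cols = out_w // block_out
    for idx, b in enumerate(blocks):
        i, j = divmod(idx, cols)
        out[i * block_out:(i + 1) * block_out, j * block_out:(j + 1) * block_out] = b
    return out

if __name__ == "__main__":
    lr = np.random.randint(0, 256, (540, 960), dtype=np.uint8)      # first-resolution frame
    blocks = [reconstruct_block(b) for b in split_into_blocks(lr)]  # 10x10 -> 20x20 blocks
    hr = merge_blocks(blocks, 1080, 1920)                           # second image resolution
    print(hr.shape)                                                 # (1080, 1920)
```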
The super-resolution processing in the present invention may be based on a deep learning artificial intelligence neural network algorithm, and some existing algorithms may be utilized in the present invention, such as the AISR algorithm. In order to improve the efficiency of super-resolution processing, it is generally necessary to solidify the AISR processing algorithm into hardware.
Optionally, the AISR algorithm is solidified into the independent display chip, which has an embedded neural-network processing unit (NPU), and the AISR algorithm is run by the NPU.
According to the method, processing the image frame data of the display picture at the first resolution in the image preprocessing step preliminarily improves the definition of the display picture, and processing the image frame data with the AISR algorithm improves it further. Running the AISR algorithm on the embedded neural-network processor of the independent display chip greatly improves the processing efficiency and the image processing speed.
In another alternative embodiment, the display picture output at the second image resolution after super-resolution processing may still not be sufficiently clear. To address this, the application further provides a precise recognition method. The precise recognition method according to the embodiment of the present application is shown in fig. 4 and, after step 103, further includes the following steps 401 to 405.
In this embodiment, it can be understood that S401 described above is executed after S103, and therefore, the current display screen in S401 is the display screen corresponding to the image frame data at the second image resolution in S103.
In this embodiment, after the image frame data of the display picture at the second image resolution is output, the picture, or the key information in it, may still be unclear; the precise recognition mode is then started. In the precise recognition mode, the user can locally select the area of the display picture that needs to be viewed with emphasis, thereby triggering a second instruction. The local selection can be made by the user box-selecting the area to be viewed, or by the user cropping that area; the area obtained in either way serves as the user-selected area. After the user-selected area is obtained, the image frame data of the selected area of the display picture at the second image resolution is extracted and stored directly in the frame buffer memory, without pre-interpolation, as new associated image frame data of the display picture for later use.
In this embodiment, super-resolution processing is performed again on the image frame data at the first resolution using the new associated image data in the frame buffer memory. In this pass, the part of the image frame data of the display picture at the first resolution that corresponds to the new associated image may be super-resolution processed on the basis of the new associated image data; that is, super-resolution processing may be performed only on the image frame data of the selected area of the display picture at the first resolution, improving the definition of the selected area and yielding image frame data of the selected area at the second image resolution.
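For illustration only, the following sketch shows one way such a region-restricted second pass could look; the nearest-neighbour upscale and the blend with the stored selection are stand-ins for the real AISR pass, the box is assumed to be aligned to the magnification, and all names are hypothetical.

```python
import numpy as np

def nn_upscale(frame: np.ndarray, scale: int) -> np.ndarray:
    return np.repeat(np.repeat(frame, scale, axis=0), scale, axis=1)

def precise_recognition_pass(frame_lr: np.ndarray, frame_hr: np.ndarray,
                             box_hr: tuple, scale: int) -> np.ndarray:
    """Second super-resolution pass restricted to the user-selected box.

    frame_lr : display picture at the first image resolution
    frame_hr : output of the first pass (second image resolution)
    box_hr   : (top, left, height, width) selected on frame_hr, aligned to `scale`
    """
    t, l, h, w = box_hr
    # the selected region is stored as the new associated data without pre-interpolation
    associated = frame_hr[t:t + h, l:l + w].astype(np.float32)
    # map the box back onto the first-resolution frame and refine only that region
    lr_region = frame_lr[t // scale:(t + h) // scale, l // scale:(l + w) // scale]
    refined = 0.5 * nn_upscale(lr_region, scale).astype(np.float32) + 0.5 * associated
    return np.clip(refined, 0, 255).astype(np.uint8)

lr = np.random.randint(0, 256, (270, 480), dtype=np.uint8)               # fig. 5 (b): 270P crop
hr = nn_upscale(lr, 4)                                                   # fig. 5 (c): 1080P first pass
print(precise_recognition_pass(lr, hr, (400, 600, 200, 200), 4).shape)   # (200, 200)
```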
For example, fig. 5 shows the display processing method in the precise recognition mode according to the present invention. The screen resolution of the display screen is 1080P, and fig. 5 (a) shows the display picture at 1080P. The picture may come from the local album or from a photographing preview picture, and 1080P is its initial resolution. Because the two-dimensional code in the picture is too small, in order to view it clearly as the target area 502, a zoom-in operation is performed on the local area 501 in fig. 5 (a) to trigger the first instruction; the instruction can be triggered by the user manually performing the enlargement operation, or by the APP automatically detecting the need for enlargement. The system-on-chip crops and pre-interpolates the enlarged picture to obtain fig. 5 (b), which shows the enlarged picture at 270P. After the image frame data of the local area 501 in fig. 5 (b) undergoes 4× super-resolution processing by the independent display chip, image frame data of the picture at 1080P is obtained, as shown in fig. 5 (c); that is, fig. 5 (c) is the display picture at the second image resolution after the first super-resolution processing. Because the two-dimensional code displayed in fig. 5 (c) is still unclear, the user activates the precise recognition function. Corresponding to the above steps, the user manually box-selects the local area 503 that needs to be viewed with emphasis to trigger the second instruction, i.e. the area in the dashed box in fig. 5 (d) containing the two-dimensional code as the target area 502. In response to the second instruction, the system-on-chip determines the user-selected area 503 of the display picture, i.e. the image frame data within the dashed box. The independent display chip stores the image frame data within the dashed box directly in the frame buffer memory as new associated image frame data, without pre-interpolation, and then performs a second super-resolution processing on the corresponding image frame data in fig. 5 (b) according to the new associated image frame data. That is, the image frame data of the display picture at the first resolution is super-resolution processed a second time to obtain the result shown in fig. 5 (e); it can be seen that the definition of the two-dimensional code in fig. 5 (e) is significantly higher than in fig. 5 (d).
According to the embodiment of the application, performing super-resolution processing on the selected area of the display picture reduces the amount of data stored in the frame buffer memory, significantly reduces the computation required for the repeated super-resolution processing, and further improves the definition of the specific selected area. Moreover, because the selected area is determined by the user's box selection, the user's needs are met, the definition of the selected area is improved, and the user experience is enhanced.
The display processing method according to the present invention is illustrated below with three typical scenarios to which the invention is applied.
Example 1: photographing scene preview
The user obtains the display picture from the photographing preview picture, the SOC crops the image, and the AISR super-resolution function of the independent display chip performs super-resolution processing on it. For convenience of description, it is assumed in this example that the screen resolution of the display screen is 1080P and the preset magnification of the AISR module is adjustable in five steps: 1X/1.5X/2X/2.5X/4X.
The specific implementation steps are as follows:
Step 1: the user opens the camera, and the camera sensor (for example a 16-megapixel sensor) acquires a shooting scene preview picture; after image processing by the system-on-chip such as noise reduction, dead-pixel correction, color-gamut adjustment and demosaicing, image frame data of the preview picture at the initial resolution is obtained and shown on the display screen.
Step 2: when the preview picture displayed on the screen still contains details that cannot be presented clearly, the camera APP automatically detects this and invokes a picture enlargement operation, or the user manually performs the enlargement operation, triggering the first instruction; the picture on which the enlargement operation is performed is the display picture, and the system-on-chip of the electronic device crops the display picture accordingly. If the size of the cropped picture and the resolution of the corresponding image frame data match the screen resolution, the cropped image frame data can be used as the image frame data of the display picture at the first resolution and step 4 can be executed. For example, the initial resolution is 1080P, part of the 1080P picture is unclear, the user (or the APP automatically) performs a zoom-in operation, and the SOC crops the 1080P image to 540P accordingly; the 540P image can be super-resolved 2× to 1080P, so the resolution of the cropped image frame data matches the screen resolution and the 540P image frame data is used as the image frame data of the display picture at the first resolution. If the size of the cropped picture and the resolution of the corresponding image frame data do not match the screen resolution, step 4 cannot be executed directly and step 3 must be executed first; for example, when the resolution after cropping is 480P, i.e. between 432P and 540P, it does not match the screen resolution and must first be matched to it.
Step 3: The system-on-chip performs pre-interpolation processing on the cropped picture and its image frame data so that their resolution matches the screen resolution, and uses the matched image frame data as the image frame data of the display picture at the first resolution. For example, if the cropped image has a resolution of 480P, i.e., between 432P and 540P, the SOC pre-interpolates the cropped image to 540P and sends it to the independent display chip, whose 2X super-resolution function is turned on to enlarge the image to 1080P.
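As a concrete illustration of steps 2 and 3, the following sketch picks an AISR magnification step and a pre-interpolation target so that the target size times the magnification equals the screen size. This is only a hypothetical Python sketch under the assumptions of this example (a 1080P screen, the five magnification steps above, and a rule of interpolating up to the nearest supported input size); the selection rule and the use of cv2.resize are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical sketch of the resolution-matching rule in steps 2-3 (not the
# patent's implementation). cv2 is used only as a stand-in for the SOC's
# interpolation hardware.
import cv2

AISR_STEPS = [1.0, 1.5, 2.0, 2.5, 4.0]   # 5-step adjustable magnification
SCREEN_H, SCREEN_W = 1080, 1920          # 1080P screen assumed in Example 1

def plan_pre_interpolation(crop_h, crop_w):
    """Return (target_h, target_w, magnification) for a cropped frame."""
    # Walk from the largest magnification (smallest input) upward and take the
    # first AISR input size that is not smaller than the crop, e.g. 480P -> 540P.
    for m in sorted(AISR_STEPS, reverse=True):
        target_h, target_w = int(SCREEN_H / m), int(SCREEN_W / m)
        if target_h >= crop_h and target_w >= crop_w:
            return target_h, target_w, m
    return SCREEN_H, SCREEN_W, 1.0       # crop already at (or above) screen size

def pre_interpolate(crop):
    h, w = crop.shape[:2]
    target_h, target_w, m = plan_pre_interpolation(h, w)
    if (h, w) != (target_h, target_w):   # step 3 only runs when the sizes differ
        crop = cv2.resize(crop, (target_w, target_h), interpolation=cv2.INTER_LINEAR)
    return crop, m                       # first-resolution frame plus AISR factor
```

With these assumptions, a 540P crop is passed through unchanged with a 2X factor, while a 480P crop is first interpolated up to 540P, matching the example above.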
Step 4: Send the image frame data of the display picture at the first resolution to the AISR module.
Step 5: The independent display chip receives the instruction and enters the 2X AISR super-resolution mode. The AISR module simultaneously processes the 540P data transmitted by the SOC and the 1080P data of the associated image stored in the frame buffer memory, and the 2X super-resolution function is started to super-resolve the image frame data of the display picture at the first resolution from 540P to 1080P, obtaining the image frame data of the display picture at the second image resolution.
Step 6: The AISR module outputs the processed image of the display picture at 1080P to the display screen, and at the same time stores the image frame data of the display picture at the second image resolution into the frame buffer memory of the independent display chip, where it serves as the image frame data of a new associated image and is referenced in the next super-resolution processing.
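Steps 4 to 6 form a loop in which each super-resolved output is written back as the associated image for the next pass. A minimal sketch of that loop, assuming a placeholder callable `aisr_2x(frame, associated)` rather than the real AISR circuit:

```python
# Minimal sketch of steps 4-6; aisr_2x is an assumed placeholder for the
# 2X AISR super-resolution operation, not the patent's hardware.
class FrameBuffer:
    """Stands in for the independent display chip's frame buffer memory."""
    def __init__(self):
        self.associated = None            # 1080P frame from the previous pass

    def load(self):
        return self.associated

    def store(self, frame):
        self.associated = frame

def super_resolve_preview(frame_540p, frame_buffer, aisr_2x):
    associated_1080p = frame_buffer.load()               # None for the very first frame
    frame_1080p = aisr_2x(frame_540p, associated_1080p)  # 540P -> 1080P (second resolution)
    frame_buffer.store(frame_1080p)                      # becomes the new associated image
    return frame_1080p                                   # sent on to the display screen
```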
[Example 2: Browsing the gallery]
The user views an original picture stored on the mobile phone and performs an image enlargement operation; the super-resolution function of the AISR module is used to improve the definition of the enlarged image.
Step 1: The user enters the local album and views images; after a certain picture is selected, that picture is the display picture, and the SOC reads the initial resolution of its image frame data. If the initial resolution is higher than the screen resolution, the system-on-chip executes a resolution reduction algorithm to compress the initial resolution of the display picture to the screen resolution, and uses the compressed image frame data and resolution as the image frame data of the display picture at the first resolution. For example: the initial resolution of the display picture is 2160P (2160 × 3840) and the screen resolution is 1080P (1080 × 1920); since the initial resolution is greater than the screen resolution, the system-on-chip performs resolution reduction processing on the image frame data at the initial resolution to obtain the image frame data of the display picture at 1080P, and step 8 is executed. If the initial resolution of the display picture is less than or equal to the screen resolution and the picture displayed on the screen still contains details that cannot be presented clearly, execute step 2.
Step 2: The user enlarges the display picture, and the system-on-chip of the electronic device crops the display picture accordingly. If the cropped picture size and the corresponding resolution are larger than the screen resolution, perform resolution reduction processing, convert the image resolution to the screen resolution, and display the obtained image frame data as the display picture; otherwise, perform step 3.
Step 3: Determine whether the cropped image resolution matches the screen resolution; if so, take the image frame data as the image frame data of the display picture at the first resolution and execute step 5; if not, execute step 4.
Step 4: Pre-interpolate the resolution of the cropped image frame data so that it matches the screen resolution (for example, so that the screen resolution is a predetermined magnification of it), and use the matched image frame data as the image frame data of the display picture at the first resolution.
Step 5: Send the image frame data of the display picture at the first resolution to the AISR module.
Step 6: Suppose the independent display chip receives the instruction and enters the 2X AISR super-resolution mode. The AISR module simultaneously processes the 540P data transmitted by the SOC and the 1080P data of the associated image stored in the frame buffer memory, and the 2X super-resolution function is started to super-resolve the image frame data of the display picture at the first resolution from 540P to 1080P, obtaining the image frame data of the display picture at the second image resolution.
Step 7: The AISR module outputs the processed image of the display picture at 1080P to the display screen, and at the same time stores the image frame data of the display picture at the second image resolution into the frame buffer memory of the independent display chip, where it serves as the image frame data of a new associated image and is referenced in the next super-resolution processing.
Step 8: Perform display output.
[Example 3: Accurate recognition scene]
Even when the highest-power telephoto camera (10X relative to the original) is used together with the high-power AISR (e.g., 4X) described in Example 1 above, the image definition may still be poor, or the picture may still be unclear when the high-power AISR described in Example 2 above is used to enlarge an album picture. In that case, the user starts the accurate identification function and manually frames the target area to achieve accurate identification.
Taking a photographing scene as an example, the specific implementation steps are as follows:
Step 1: The user taps the camera APP to enter the photographing preview interface. The camera sensor acquires a preview picture of the shot scene, i.e., the display picture, and displays it on the screen. Because the area the user mainly needs to check occupies too small a proportion of the whole preview picture, that area is not displayed clearly, so the user triggers a first instruction to enlarge to the maximum magnification, and the picture is cropped according to the enlargement area specified by the first instruction. In this example, the cropped and pre-interpolated image is used as an example: the cropped image frame data is matched with the 1080P screen resolution of the display screen and can be used as the image frame data of the display picture at the first resolution.
Step 2: the SOC transmits the image frame data cut by the picture to the independent display chip, and simultaneously transmits an instruction for starting the AISR function to the independent display chip.
Step 3: The independent display chip receives the instruction and enters the "4X AISR super-resolution" mode. It receives the 270P data transmitted by the SOC and the 1080P data of the associated image stored in its frame buffer memory, starts the 4X super-resolution function, and super-resolves the image frame data of the display picture at the first resolution from 270P to 1080P, obtaining 1080P image frame data that serves as the image frame data of the display picture at the second image resolution and is output to the display screen.
Step 4: Because the image frame data of the picture at the second image resolution still has an unclear part when displayed, the user starts the accurate identification function in order to view and clarify that part, and manually frames the area requiring accurate identification in the picture, thereby triggering a second instruction. After the user frames the area requiring accurate identification, the SOC determines the image area selected by the user and sends it to the independent display chip; the content sent includes information for starting the accurate identification function and the area information selected by the user.
Step 5: The independent display chip acquires the image frame data of the area selected by the user according to the area information framed by the user, and stores the image frame data of the selected area in its frame buffer memory as the frame data of a new associated image.
Step 6: The AISR module simultaneously receives the image frame data (270P) of the display picture at the first resolution transmitted by the SOC and the frame data of the new associated image in the frame buffer memory, performs a second super-resolution of the image frame data of the display picture at the first resolution to 1080P, obtains new image frame data of the display picture at the second image resolution, and outputs it through the display screen.
Because the associated image has changed, only the image frame data of the area selected by the user needs to be processed in this second super-resolution processing, so the amount of data to be processed is smaller. That is, the AISR module computes only for the area selected by the user, which yields high computational efficiency, a small amount of computation, and the distinct benefit of improving the definition of the selected area in a targeted way.
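A sketch of steps 4 to 6 of this example, assuming a placeholder callable `aisr_4x(frame, associated)` and an (x, y, w, h) rectangle format for the framed area; both are illustrative assumptions rather than the patent's interfaces:

```python
# Hedged sketch of the precise-recognition pass: the selected region replaces
# the associated frame data, so the second 4X pass only has to reference the
# user-framed area instead of a full 1080P frame.
def precise_recognition_pass(frame_270p, region, frame_buffer, aisr_4x):
    x, y, w, h = region                          # first-resolution coordinates (assumed format)
    selected = frame_270p[y:y + h, x:x + w]      # image frame data of the selected area
    frame_buffer.store(selected)                 # replaces the old associated image
    # Second super-resolution of the first-resolution frame to 1080P, now
    # referencing only the selected area held in the frame buffer.
    return aisr_4x(frame_270p, frame_buffer.load())
```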
According to the embodiment of the application, image frame data of a display screen at a first resolution can be acquired, super-resolution processing is performed on the image frame data of the display screen at the first resolution, the image frame data of the display screen at a second image resolution is acquired, wherein the second image resolution is a predetermined magnification of the first resolution, the predetermined magnification is greater than or equal to 1, and the image frame data of the display screen at the second image resolution is output. According to the embodiment of the application, after the image frame data of the display picture under the first resolution is subjected to super-resolution processing, the picture definition of the display picture can be improved.
In the display processing method provided in the embodiment of the present application, the execution bodies are the system-on-chip and the independent display chip in the electronic device.
As shown in fig. 6, an embodiment of the present application provides a stand-alone display chip 700, including: a first input interface 701, a super-resolution processing circuit 702, and a first output interface 703.
The first input interface 701 is configured to acquire image frame data of a display screen at a first resolution.
A super-resolution processing circuit 702, configured to perform super-resolution processing on the image frame data of the display screen at the first resolution to obtain image frame data of the display screen at a second image resolution, where the second image resolution is a predetermined magnification of the first resolution.
A first output interface 703 for outputting image frame data at the second image resolution.
Having described the stand-alone display chip according to the embodiment of the present invention, its advantageous effects are as follows:
First, the communication connection between the system-on-chip and the independent display chip is established through the first input interface, and information interaction between the system-on-chip and the independent display chip can be realized.
Second, by disposing the super-resolution processing circuit on the independent display chip and completing the super-resolution processing by the super-resolution processing circuit, the power consumption of the system-on-chip can be reduced and the speed of the display processing can be increased compared to the case where the system-on-chip performs the display processing on the image.
Third, the first output interface is arranged on the independent display chip, and the connection between the independent display chip and other equipment is established through the first output interface, so that the image frame data on the independent display chip can be output to other equipment.
In one embodiment, the stand-alone display chip 700 further includes:
a frame buffer memory 704, configured to store in advance associated image frame data that is associated with the display picture and is at the second image resolution.
The super-resolution processing circuit 702 is further configured to perform super-resolution processing on the image frame data of the display screen at the first resolution by using the associated image frame data, which is stored in the frame buffer memory in advance, at the second image resolution and is associated with the display screen.
According to the embodiment of the application, the super-resolution processing can be performed on the part, needing to be focused and increased in definition, of the display picture by using the associated image, and the requirement of a user for focused viewing of the part can be met.
In one embodiment, the super-resolution processing performed by the super-resolution processing circuit 702 includes:
Performing super-resolution reconstruction on the image frame data of the display picture at the first resolution by using the associated image frame data, so as to preliminarily improve the definition of the image frame data at the first resolution.
Dividing the image frame data at the first resolution into sub-blocks to form a plurality of image frame data sub-blocks having a fourth resolution, where the fourth resolution is lower than the first resolution.
Performing image encoding, feature extraction, and image decoding on the image frame data sub-blocks at the fourth resolution to obtain a plurality of image frame data sub-blocks at a fifth resolution, where the fifth resolution is higher than the fourth resolution.
Combining the plurality of image frame data sub-blocks at the fifth resolution to obtain the image frame data at the second image resolution.
According to the embodiment of the application, the image pre-processing executed by the independent display chip pre-processes the image frame data of the display picture at the first resolution, so the definition of the display picture can be preliminarily improved. Processing the image frame data through the AISR algorithm can further improve the definition of the picture. Because the AISR algorithm runs on the embedded neural network processor of the independent display chip, the processing efficiency and the image processing speed can be greatly improved.
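As a rough illustration of the four stages above (reconstruction, sub-block division, per-block encode/feature-extraction/decode, and merging), the following sketch uses numpy and a placeholder per-block model `sr_block_model`; the naive reconstruction step and the block size are assumptions for illustration only, not the patent's AISR network.

```python
# Illustrative sketch of the sub-block super-resolution flow; sr_block_model
# stands in for the encode / feature-extraction / decode network.
import numpy as np

def tiled_super_resolution(frame_lr, associated, sr_block_model, block=64, scale=2):
    # 1) Preliminary reconstruction using the associated image (naively averaged here).
    recon = frame_lr.astype(np.float32)
    if associated is not None:
        recon = (recon + associated[::scale, ::scale].astype(np.float32)) / 2.0

    h, w = recon.shape[:2]
    out = np.zeros((h * scale, w * scale) + recon.shape[2:], dtype=np.float32)

    # 2) Divide into fourth-resolution sub-blocks, 3) run each block through the
    #    model to obtain a fifth-resolution sub-block, 4) merge the results.
    for y in range(0, h, block):
        for x in range(0, w, block):
            sub = recon[y:y + block, x:x + block]
            sub_hr = sr_block_model(sub)                       # up-scaled by `scale`
            out[y * scale:y * scale + sub_hr.shape[0],
                x * scale:x * scale + sub_hr.shape[1]] = sub_hr
    return out                                                 # second-image-resolution frame
```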
The stand-alone display chip in the embodiments of the present application may be a component in an electronic device, such as an integrated circuit. The electronic device may be a terminal, or may be a device other than a terminal. The electronic Device may be, for example, a Mobile phone, a tablet Computer, a notebook Computer, a palm top Computer, a vehicle-mounted electronic Device, a Mobile Internet Device (MID), an Augmented Reality (AR)/Virtual Reality (VR) Device, a robot, a wearable Device, an Ultra-Mobile Personal Computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and may also be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (Television, TV), a teller machine, a self-service machine, and the like, and the embodiments of the present application are not particularly limited. The electronic device in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system (Android), an iOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
The independent display chip provided in the embodiment of the present application can implement each process implemented by the display processing method embodiment, and is not described here again to avoid repetition.
The system-on-chip includes: an instruction input interface for receiving a first instruction for displaying the display picture in an enlarged manner;
a data interception circuit for intercepting, in response to the first instruction, the image frame data of the enlarged area corresponding to the first instruction in the display picture; and
a pre-interpolation processing circuit for performing pre-interpolation processing on the image frame data to obtain the image frame data, at the first resolution, of the display picture corresponding to the enlarged area, so as to provide it to the first input interface of the independent display chip.
Optionally, an embodiment of the present application further provides a display device 800, as shown in fig. 7, including a stand-alone display chip 70 according to any one of the second aspects of the present application and a system-on-chip 80 connected thereto. The system-on-chip 80 is configured to, in response to a first instruction for performing an enlarged display of the current display picture, perform image cropping processing on the current display picture to obtain image frame data at a first image resolution, and send the image frame data at the first image resolution and a super-resolution indicator to the independent display chip;
the independent display chip 70 is configured to perform super-resolution processing on the image frame data at the first image resolution, obtain image frame data at a second image resolution, and send the image frame data at the second image resolution to the display screen.
Optionally, the independent display chip 70 further includes a frame buffer memory 710, where the frame buffer memory 710 is used for storing image-related frame data; the independent display chip 70 is further configured to obtain image associated frame data at a second image resolution, which is stored in the frame buffer memory last time, perform super-resolution processing on the image frame data at the first image resolution by using the image associated frame data at the second image resolution, and obtain image frame data at the second image resolution; and storing the image frame data at the second image resolution in the frame buffer memory 710 as image-related frame data at the second image resolution.
Optionally, the system on chip 80 is further configured to determine, in response to a second instruction for locally selecting a current display screen, a selection area corresponding to the second instruction from the current display screen, and send the determined selection area to the independent display chip;
the independent display chip 70 is further configured to replace image-related frame data stored in the frame buffer memory 710 at a second image resolution with image frame data of the selected area, where the image frame data of the selected area is image frame data located in the selected area in the image frame data at the first image resolution, perform super-resolution processing on the image frame data at the first image resolution again by using the replaced image-related frame data in the frame buffer memory 710, obtain image frame data at the second image resolution again, and send the obtained image frame data at the second image resolution to the display screen.
Optionally, the system-on-chip 80 is configured to read image frame data to be displayed, reduce the resolution of the image frame data to be displayed to the screen resolution when the resolution of the image frame data to be displayed is greater than the resolution of the screen of the electronic device, and send the image frame data to be displayed with the reduced resolution to the independent display chip 70;
the independent display chip 70 is configured to send the image frame data to be displayed with reduced resolution to the display screen.
Optionally, the image frame data to be displayed is image frame data of a camera shooting preview;
the system-on-chip 80 is specifically configured to perform image cropping processing on the current display screen according to the first instruction to obtain image frame data at a third image resolution, take the cropped image frame data as the image frame data at the first image resolution when the third image resolution is equal to the first image resolution, and perform pre-interpolation processing on the image frame data at the third image resolution when the third image resolution is not equal to the first image resolution to obtain the image frame data at the first resolution.
Optionally, the image frame data to be displayed is image frame data in a gallery;
the system-on-chip 80 is specifically configured to perform image clipping processing on the current display frame according to the first instruction, and obtain image frame data at a third image resolution;
when the third image resolution is larger than the screen resolution of the electronic equipment, reducing the image resolution corresponding to the cut image frame data to the screen resolution, and sending the image frame data to be displayed with the reduced resolution to the independent display chip for displaying;
when the third image resolution is equal to the first image resolution, taking the clipped image frame data as image frame data under the first image resolution;
and when the third image resolution is smaller than the first image resolution, performing pre-interpolation processing on the image frame data under the third image resolution to obtain the image frame data under the first resolution.
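The branching described for the two sources (camera preview and gallery) can be summarized as the sketch below; the helper names and the use of a single scalar resolution as a stand-in for the full frame size are assumptions made only for illustration.

```python
# Sketch of the SOC-side routing decision for a cropped frame (third image
# resolution) relative to the screen resolution and the first image resolution.
def route_cropped_frame(third_res, first_res, screen_res):
    if third_res > screen_res:
        # Gallery case: down-scale to the screen resolution and display directly.
        return "downscale_to_screen_and_display"
    if third_res == first_res:
        # Already matches the AISR input size: send to the independent display chip as is.
        return "send_to_aisr"
    # Otherwise pre-interpolate (e.g. 480P -> 540P) before sending to the AISR module.
    return "pre_interpolate_then_send_to_aisr"
```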
The following describes in detail the display processing procedure of the display processing device according to the embodiment of the present application, taking as examples a display picture that is a locally stored picture and one that is a shooting preview picture, as shown in fig. 8 and fig. 9, respectively. For convenience of description, the following abbreviations are used:
DSI: display Serial Interface, display screen Serial Interface;
CSI: camera Serial Interface, camera Serial Interface;
ISP: image Signal Processing, image Signal Processing;
SDR: standard Dynamic Range, standard Dynamic Range;
HDR: high Dynamic Range, high Dynamic Range;
AINR: artificial intellectual Noise Reduction, artificial Intelligence Noise Reduction;
AISR: artificial Intelligence Super Resolution;
SOC: system On Chip, system On Chip.
RX: an interface is received.
TX: and sending the interface.
As shown in fig. 8, which is a flowchart of the display processing device when processing a locally stored picture, the specific flow is:
(1) The SOC reads image data from a local storage and sends the image data to the independent display chip through a DSI TX0 interface of the SOC.
(2) The independent display chip receives the image data through the DSI RX0 interface and processes it through the AINR processing circuit, the AISR processing circuit, and the SDR-to-HDR processing circuit.
(3) The independent display chip transmits the processed image data to the display screen through its DSI TX0 interface.
As shown in fig. 9, which is a flowchart of the display processing device when processing the photographing preview picture, the specific flow is:
(1) The camera sensor sends the image data obtained by photographing to the SOC through the CSI RX0 interface.
(2) The SOC receives the image data, converts its data format (RAW to RGB and then to YUV), and then sends it to the DSI RX1 interface of the independent display chip through the DSI TX1 interface.
(3) The independent display chip receives the image data through the DSI RX1 interface, processes it through the AINR processing module, the AISR processing module, and other ISP data processing modules, and then transmits it back to the CSI RX1 interface of the SOC through the CSI TX1 interface.
(4) The SOC performs operations such as encoding and storing on the processed image data, and transmits the image data to the DSI RX0 interface of the independent display chip through the DSI TX0 interface.
(5) The image data is processed by other image enhancement modules (such as SDR to HDR) of the independent display chip and then output to a display screen through a DSI TX0 interface.
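The two flows of figs. 8 and 9 can be summarized as ordered interface hops, as in the listing below; this is only a description aid using the abbreviations above ("IDC" is shorthand assumed here for the independent display chip), not firmware.

```python
# Module/interface hops for the two processing flows shown in figs. 8 and 9.
LOCAL_PICTURE_FLOW = [
    ("SOC", "read image data from local storage"),
    ("SOC", "DSI TX0 -> IDC DSI RX0"),
    ("IDC", "AINR -> AISR -> SDR-to-HDR processing"),
    ("IDC", "DSI TX0 -> display screen"),
]

PHOTO_PREVIEW_FLOW = [
    ("camera sensor", "-> SOC CSI RX0"),
    ("SOC", "format conversion RAW -> RGB -> YUV, DSI TX1 -> IDC DSI RX1"),
    ("IDC", "AINR / AISR / other ISP processing, CSI TX1 -> SOC CSI RX1"),
    ("SOC", "encode and store, DSI TX0 -> IDC DSI RX0"),
    ("IDC", "SDR-to-HDR and other enhancement, DSI TX0 -> display screen"),
]
```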
According to the embodiment of the application, when the target image in the display picture is small, the target image can be amplified in a targeted manner through the first instruction and the second instruction, the image frame data of the amplified area is captured, the image frame data of the amplified area is subjected to super-resolution processing, the high-resolution target image is obtained, the resolution of the target image can be improved while the target image is amplified, and the problem that the image definition is poor due to the amplification of the image can be avoided. By carrying out pre-interpolation processing on the image frame data of the amplification area, the image frame data of the amplification area can be ensured to be super-resolution processed, the definition of a target image can be improved, and the problem that the image display is not clear after the display image is amplified can be solved.
Optionally, referring to fig. 10, an electronic device 900 is further provided in an embodiment of the present application, and includes a processor 901 and a memory 902, where the memory 902 stores a program or an instruction that can be executed on the processor 901, and when the program or the instruction is executed by the processor 901, the steps of the foregoing display processing method embodiment are implemented, and the same technical effects can be achieved, and are not described again to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 11 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, and a processor 1010.
Those skilled in the art will appreciate that the electronic device 1000 may further comprise a power source (e.g., a battery) for supplying power to various components, and the power source may be logically connected to the processor 1010 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 11 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is not repeated here.
The input unit 1004 may include a Graphics Processing Unit (GPU) 10041 and a microphone 10042; the graphics processor 10041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 1006 may include a display panel 10061, and the display panel 10061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 1007 includes at least one of a touch panel 10071 and other input devices 10072. The touch panel 10071 is also referred to as a touch screen. The touch panel 10071 may include two parts: a touch detection device and a touch controller. Other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail here.
The memory 1009 may be used to store software programs as well as various data. The memory 1009 may mainly include a first storage area storing a program or an instruction and a second storage area storing data, wherein the first storage area may store an operating system, an application program or an instruction (such as a sound playing function, an image playing function, and the like) required for at least one function, and the like. Further, the memory 1009 may include volatile memory or nonvolatile memory, or the memory 1009 may include both volatile and nonvolatile memory. The non-volatile Memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash Memory. The volatile Memory may be a Random Access Memory (RAM), a Static Random Access Memory (Static RAM, SRAM), a Dynamic Random Access Memory (Dynamic RAM, DRAM), a Synchronous Dynamic Random Access Memory (Synchronous DRAM, SDRAM), a Double Data Rate Synchronous Dynamic Random Access Memory (Double Data Rate SDRAM, ddr SDRAM), an Enhanced Synchronous SDRAM (ESDRAM), a Synchronous Link DRAM (SLDRAM), and a Direct Memory bus RAM (DRRAM). The memory 1009 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the display processing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a computer read only memory ROM, a random access memory RAM, a magnetic or optical disk, and the like.
An embodiment of the present application provides a system-on-chip, where the system-on-chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the steps performed by the system-on-chip in the display method according to the first aspect.
An embodiment of the present application provides an independent display chip, where the independent display chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the steps performed by the independent display chip in the display method according to the first aspect.
Embodiments of the present application provide a computer program product, where the program product is stored in a storage medium, and the program product is executed by at least one processor to implement the processes of the foregoing display processing method embodiments, and can achieve the same technical effects, and in order to avoid repetition, details are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a … …" does not exclude the presence of another identical element in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.
Claims (14)
1. A display method is applied to electronic equipment, the electronic equipment comprises a system-on-chip and an independent display chip, the system-on-chip is connected with the independent display chip, and the method comprises the following steps:
the system-level chip responds to a first instruction for amplifying and displaying a current display picture, and cuts the current display picture to obtain image frame data under a first image resolution;
sending the image frame data and the super-resolution indication under the first image resolution to the independent display chip;
the independent display chip carries out super-resolution processing on the image frame data under the first image resolution to obtain image frame data under a second image resolution;
and sending the image frame data under the second image resolution to a display screen.
2. The method of claim 1, wherein the super-resolution processing of the image frame data at the first image resolution to obtain image frame data at a second image resolution comprises:
acquiring image associated frame data under the second image resolution which is stored in the frame buffer memory last time;
performing super-resolution processing on the image frame data under the first image resolution by using the image-associated frame data under the second image resolution to obtain image frame data under the second image resolution;
the method further comprises, after the obtaining image frame data at the second image resolution:
and storing the image frame data under the second image resolution in the frame buffer memory as image related frame data under the second image resolution.
3. The method of claim 2, wherein after said displaying image frame data at the second image resolution, the method further comprises:
the system-level chip responds to a second instruction for locally selecting a current display picture, and determines a selection area corresponding to the second instruction from the current display picture;
sending the determined selection area to the independent display chip;
the independent display chip replaces image-related frame data under a second image resolution stored in the frame buffer memory with the image frame data of the selected area, wherein the image frame data of the selected area is the image frame data in the selected area in the image frame data under the second image resolution;
performing super-resolution processing on the image frame data under the first image resolution again by using the replaced image-associated frame data in the frame buffer memory, and obtaining the image frame data under the second image resolution again;
and sending the image frame data at the second image resolution to the display screen.
4. The method according to claim 1, wherein prior to the first instruction to display a current display screen in an enlarged manner, the method further comprises:
the system-level chip reads image frame data to be displayed, reduces the resolution of the image frame data to be displayed to the screen resolution under the condition that the resolution of the image frame data to be displayed is larger than the resolution of a screen of the electronic equipment, and sends the image frame data to be displayed with the reduced resolution to the independent display chip;
and sending the image frame data to be displayed with the reduced resolution to the display screen.
5. The method according to claim 4, wherein the image frame data to be displayed is image frame data of a camera shooting preview image, and the system on chip performs image cropping processing on a current display screen in response to a first instruction for performing enlarged display on the current display screen to obtain image frame data at a first image resolution, and comprises:
the system-level chip performs image cutting processing on the current display picture according to the first instruction to obtain image frame data under a third image resolution, and when the third image resolution is equal to the first image resolution, the cut image frame data is used as the image frame data under the first image resolution;
and when the third image resolution is not equal to the first image resolution, performing pre-interpolation processing on the image frame data under the third image resolution to obtain the image frame data under the first resolution.
6. The method according to claim 4, wherein the image frame data to be displayed is image frame data in a gallery, and the system on chip performs image cropping processing on a current display screen in response to a first instruction for performing enlarged display on the current display screen to obtain image frame data at a first image resolution, including:
the system-level chip performs image cutting processing on the current display picture according to the first instruction to obtain image frame data under a third image resolution;
when the third image resolution is greater than the screen resolution of the electronic device, reducing the image resolution corresponding to the cut image frame data to the screen resolution, and sending the image frame data to be displayed with the reduced resolution to the independent display chip for displaying;
when the third image resolution is equal to the first image resolution, taking the clipped image frame data as image frame data under the first image resolution;
and under the condition that the third image resolution is smaller than the first image resolution, performing pre-interpolation processing on the image frame data under the third image resolution to obtain the image frame data under the first resolution.
7. A display device comprises a system-on-chip and an independent display chip, wherein the system-on-chip is connected with the independent display chip,
the system-on-chip is configured to, in response to a first instruction for performing an enlarged display of the current display picture, perform image cropping processing on the current display picture to obtain image frame data at a first image resolution, and send the image frame data at the first image resolution and the super-resolution indication to the independent display chip;
the independent display chip is used for performing super-resolution processing on the image frame data under the first image resolution to obtain image frame data under a second image resolution, and sending the image frame data under the second image resolution to a display screen.
8. The display device of claim 7, wherein the stand-alone display chip further comprises a frame buffer memory,
the frame buffer memory is used for storing image related frame data;
the independent display chip is also used for acquiring image associated frame data under a second image resolution which is stored in a frame buffer memory last time, and performing super-resolution processing on the image frame data under the first image resolution by using the image associated frame data under the second image resolution to acquire the image frame data under the second image resolution; and storing the image frame data at the second image resolution in the frame buffer memory as image-related frame data at the second image resolution.
9. The display device according to claim 8,
the system-on-chip is also used for responding to a second instruction for locally selecting the current display picture, determining a selection area corresponding to the second instruction from the current display picture, and sending the determined selection area to the independent display chip;
the independent display chip is further configured to replace image-related frame data stored in the frame buffer memory at a second image resolution with image frame data of the selected area, where the image frame data of the selected area is image frame data located in the selected area in the image frame data at the first image resolution, perform super-resolution processing on the image frame data at the first image resolution again by using the replaced image-related frame data in the frame buffer memory, obtain image frame data at the second image resolution again, and send the obtained image frame data at the second image resolution to the display screen.
10. The display device according to claim 7,
the system-level chip is used for reading the frame data of the image to be displayed, reducing the resolution of the frame data of the image to be displayed to the resolution of the screen under the condition that the resolution of the frame data of the image to be displayed is larger than the resolution of the screen of the electronic equipment, and sending the frame data of the image to be displayed with the reduced resolution to the independent display chip;
the independent display chip is used for sending the image frame data to be displayed with reduced resolution to the display screen.
11. The display device according to claim 10, wherein the image frame data to be displayed is camera shooting preview image frame data;
the system-level chip is specifically configured to perform image cropping processing on the current display picture according to the first instruction to obtain image frame data at a third image resolution, use the cropped image frame data as the image frame data at the first image resolution when the third image resolution is equal to the first image resolution, and perform pre-interpolation processing on the image frame data at the third image resolution when the third image resolution is not equal to the first image resolution to obtain the image frame data at the first resolution.
12. The display device according to claim 10, wherein the image frame data to be displayed is image frame data in a gallery;
the system-level chip is specifically configured to perform image clipping processing on the current display frame according to the first instruction, and obtain image frame data at a third image resolution;
when the third image resolution is larger than the screen resolution of the electronic equipment, reducing the image resolution corresponding to the cut image frame data to the screen resolution, and sending the image frame data to be displayed with the reduced resolution to the independent display chip for displaying;
when the third image resolution is equal to the first image resolution, taking the clipped image frame data as image frame data under the first image resolution;
and under the condition that the third image resolution is smaller than the first image resolution, performing pre-interpolation processing on the image frame data under the third image resolution to obtain the image frame data under the first resolution.
13. An electronic device, comprising a processor and a memory, the memory storing a program or instructions executable on the processor, the program or instructions, when executed by the processor, implementing the steps of the display method of any one of claims 1-6.
14. A readable storage medium, characterized in that it stores thereon a program or instructions which, when executed by a processor, implement the steps of the display method according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211108429.3A CN115472140B (en) | 2022-09-09 | 2022-09-09 | Display method, display device, electronic apparatus, and readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115472140A true CN115472140A (en) | 2022-12-13 |
CN115472140B CN115472140B (en) | 2024-08-23 |
Family
ID=84332929
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211108429.3A Active CN115472140B (en) | 2022-09-09 | 2022-09-09 | Display method, display device, electronic apparatus, and readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115472140B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118485575A (en) * | 2024-05-07 | 2024-08-13 | 广东省欧度利方科技有限公司 | Image enhancement method of tablet personal computer |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050091571A1 (en) * | 2003-10-23 | 2005-04-28 | Ivan Leichtling | Synchronized graphics and region data for graphics remoting systems |
US20080068359A1 (en) * | 2006-09-15 | 2008-03-20 | Semiconductor Energy Laboratory Co., Ltd. | Display device and method of driving the same |
CN104038685A (en) * | 2012-11-12 | 2014-09-10 | 索尼公司 | Imaging Device And Image Processing Method |
US9158974B1 (en) * | 2014-07-07 | 2015-10-13 | Google Inc. | Method and system for motion vector-based video monitoring and event categorization |
CN105487773A (en) * | 2015-11-27 | 2016-04-13 | 小米科技有限责任公司 | Screen capturing method and device |
CN107481189A (en) * | 2017-06-28 | 2017-12-15 | 西安邮电大学 | A kind of super-resolution image reconstruction method of the rarefaction representation based on study |
CN108322672A (en) * | 2018-02-02 | 2018-07-24 | 青岛海信电器股份有限公司 | Image processing method and device |
CN109191377A (en) * | 2018-07-25 | 2019-01-11 | 西安电子科技大学 | A kind of image magnification method based on interpolation |
CN111340711A (en) * | 2020-05-21 | 2020-06-26 | 腾讯科技(深圳)有限公司 | Super-resolution reconstruction method, device, equipment and storage medium |
CN111932453A (en) * | 2020-07-20 | 2020-11-13 | 合肥富煌君达高科信息技术有限公司 | High-resolution image generation method and high-speed camera integrated with same |
CN112449140A (en) * | 2019-08-29 | 2021-03-05 | 华为技术有限公司 | Video super-resolution processing method and device |
CN113837937A (en) * | 2020-06-24 | 2021-12-24 | Oppo广东移动通信有限公司 | Multimedia processing chip, electronic equipment image fusion method and image cutting method |
CN113850721A (en) * | 2021-09-26 | 2021-12-28 | 陕西师范大学 | Single image super-resolution reconstruction method, device and equipment and readable storage medium |
Non-Patent Citations (1)
Title |
---|
LI Ning; NIU Shilin: "High-precision water area segmentation method for SAR images based on local super-resolution reconstruction", Journal of Radars (雷达学报), no. 01, 15 January 2020 (2020-01-15), pages 174-184 *
Also Published As
Publication number | Publication date |
---|---|
CN115472140B (en) | 2024-08-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11758265B2 (en) | Image processing method and mobile terminal | |
WO2019104705A1 (en) | Image processing method and device | |
JP4250543B2 (en) | Imaging apparatus, information processing apparatus, and control method thereof | |
CN112995500B (en) | Shooting method, shooting device, electronic equipment and medium | |
US8045799B2 (en) | System and method for generating a photograph with variable image quality | |
US8897602B2 (en) | Imaging system with multiframe scaler | |
US20070222858A1 (en) | Monitoring system, monitoring method and program therefor | |
KR20090109594A (en) | System and method for checking framing and sharpness of a digital image | |
US20100289922A1 (en) | Method and system for processing data sets of image sensors, a corresponding computer program, and a corresponding computer-readable storage medium | |
US20230362476A1 (en) | Photographing method and apparatus and electronic device | |
WO2022161260A1 (en) | Focusing method and apparatus, electronic device, and medium | |
US9197813B2 (en) | Method and apparatus for obtaining a digital image | |
CN112422798A (en) | Photographing method and device, electronic equipment and storage medium | |
CN115472140B (en) | Display method, display device, electronic apparatus, and readable storage medium | |
CN111432121A (en) | Generation method, electronic device, and storage medium | |
CN113891018A (en) | Shooting method and device and electronic equipment | |
CN114298889A (en) | Image processing circuit and image processing method | |
JP2006014221A (en) | Imaging apparatus and imaging method | |
CN117768774A (en) | Image processor, image processing method, photographing device and electronic device | |
CN108765281A (en) | A kind of method, apparatus and computer storage media generating thumbnail | |
JP4609315B2 (en) | Imaging device, method of displaying angle frame at zoom, and program | |
CN112887515A (en) | Video generation method and device | |
CN110049247B (en) | Image optimization method and device, electronic equipment and readable storage medium | |
US20070085925A1 (en) | Digital camera apparatus | |
CN114093005A (en) | Image processing method and device, electronic equipment and readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |