US12008285B2 - Image processing method and display control method - Google Patents
- Publication number
- US12008285B2 (application No. US18/016,430; publication US202118016430A)
- Authority
- US
- United States
- Prior art keywords
- image
- sub
- border
- pixels
- pixel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/4401—Bootstrapping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/4401—Bootstrapping
- G06F9/4406—Loading of operating system
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/56—Extraction of image or video features relating to colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/37—Details of the operation on graphic patterns
- G09G5/373—Details of the operation on graphic patterns for modifying the size of the graphic pattern
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2330/00—Aspects of power supply; Aspects of display protection and defect management
- G09G2330/02—Details of power systems and of start or stop of display operation
- G09G2330/026—Arrangements or methods related to booting a display
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0442—Handling or displaying different aspect ratios, or changing the aspect ratio
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/045—Zooming at least part of an image, i.e. enlarging it or shrinking it
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Definitions
- the present disclosure relates to the field of display technologies, and in particular, to an image processing method, a display control method, and non-transitory computer-readable storage media.
- When a display apparatus (e.g., a displayer) is started up, it needs a certain amount of preparation time before it can display normally.
- The display apparatus may display a startup picture during this startup preparation period, i.e., after it is started up and before it can be used normally. In this way, the user knows that the display apparatus has been turned on and is being booted, which relieves the anxiety and tedium of waiting.
- the startup picture may generally include information such as a company logo and product model.
- an image processing method includes: using image data of a first image as image data of a base region; and generating, based on the image data of the first image, image data of an extended region according to extension policies, so as to obtain image data of a second image, the image data of the second image including the image data of the base region and the image data of the extended region.
- the first image has a first resolution
- the second image has a second resolution
- the first resolution is less than the second resolution.
- generating, based on the image data of the first image, the image data of the extended region according to the extension policies includes: generating, based on a pixel value of at least one pixel in a border of the first image, the image data of the extended region according to the extension policies.
- the border of the first image includes a solid-colored border.
- Generating, based on the pixel value of the at least one pixel in the border of the first image, the image data of the extended region according to the extension policies includes: using a pixel value of a single pixel in the border of the first image as a pixel value of each pixel in the extended region, so as to obtain the image data of the extended region.
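The solid-border policy above can be sketched in a few lines. This is a minimal illustration rather than the patented implementation: it assumes a single-channel image stored as a NumPy array, a base region centered in the second image, and a hypothetical function name.

```python
import numpy as np

def extend_solid_border(first_image: np.ndarray, second_h: int, second_w: int) -> np.ndarray:
    """Center the first image in a larger canvas and fill the extended
    region with the value of a single border pixel (here the top-left
    corner), which is valid when the border is solid-colored."""
    h, w = first_image.shape[:2]
    border_value = first_image[0, 0]          # any border pixel works for a solid border
    second = np.empty((second_h, second_w) + first_image.shape[2:],
                      dtype=first_image.dtype)
    second[...] = border_value                # extended region takes the border color
    top = (second_h - h) // 2
    left = (second_w - w) // 2
    second[top:top + h, left:left + w] = first_image   # base region keeps the original data
    return second
```

Because the border is solid, which border pixel is sampled makes no visible difference; the extended region blends seamlessly with the edge of the base region.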
- the border of the first image includes a non-solid-colored border.
- Generating, based on the pixel value of the at least one pixel in the border of the first image, the image data of the extended region according to the extension policies includes: generating, based on pixel values of a plurality of pixels in the border of the first image, the image data of the extended region according to the extension policies.
- the non-solid-colored border of the first image includes a border, a color of which gradually changes in a column direction and does not change in a row direction.
- the extended region includes first sub-regions and second sub-regions except the first sub-regions, and the first sub-regions are flush with the base region in the row direction.
- Generating, based on the pixel values of the plurality of pixels in the border of the first image, the image data of the extended region according to the extension policies includes: generating, according to a pixel value of at least one pixel, located at the border, in each row of pixels of the first image, a pixel value of each pixel in a corresponding row of pixels in the first sub-regions; and obtaining, according to a change trend of the pixel values of the plurality of pixels in the border of the first image in the column direction, pixel values of a plurality of rows of pixels that change in the column direction in the second sub-regions.
- Each row of pixels has the same pixel value, and each pixel value is within the valid range of pixel values.
- generating, according to the pixel value of the at least one pixel, located at the border, in each row of pixels of the first image, the pixel value of each pixel in the corresponding row of pixels in the first sub-regions includes: using a pixel value of a pixel, located at the border, in each row of pixels of the first image as the pixel value of each pixel in the corresponding row of pixels of the first sub-regions.
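The column-gradient policy above (replicate each row sideways, continue the vertical trend up and down, clamped to the valid range) can be sketched as follows. This is an illustrative reading of the claims, assuming a single-channel NumPy image, a centered base region, and a trend estimated from two adjacent border rows; the row-direction case of the following paragraphs is the symmetric (transposed) variant.

```python
import numpy as np

def extend_column_gradient(first_image, second_h, second_w):
    """Extend an image whose border color changes down the columns but is
    constant along each row. Base rows are replicated sideways (first
    sub-regions); the vertical border trend is continued above and below
    (second sub-regions), clamped to the 8-bit range [0, 255]."""
    h, w = first_image.shape[:2]
    top, left = (second_h - h) // 2, (second_w - w) // 2
    img = first_image.astype(np.int32)
    second = np.zeros((second_h, second_w), dtype=np.int32)
    # First sub-regions: each base row extends left and right with the
    # value of that row's border pixels.
    for r in range(h):
        second[top + r, :left] = img[r, 0]
        second[top + r, left + w:] = img[r, -1]
        second[top + r, left:left + w] = img[r, :]   # base region
    # Second sub-regions: continue the column-direction change trend of
    # the border, estimated from the two adjacent border rows.
    step_up = img[0, 0] - img[1, 0]
    step_down = img[h - 1, 0] - img[h - 2, 0]
    for i, r in enumerate(range(top - 1, -1, -1), start=1):
        second[r, :] = np.clip(img[0, 0] + i * step_up, 0, 255)
    for i, r in enumerate(range(top + h, second_h), start=1):
        second[r, :] = np.clip(img[h - 1, 0] + i * step_down, 0, 255)
    return second.astype(first_image.dtype)
```

Every row of the second sub-regions receives a single value, matching the requirement that each extrapolated row is constant and stays within the valid pixel-value range.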
- the non-solid-colored border of the first image includes a border, a color of which gradually changes in a row direction and does not change in a column direction.
- the extended region includes third sub-regions and fourth sub-regions except the third sub-regions, and the third sub-regions are flush with the base region in the column direction.
- Generating, based on the pixel values of the plurality of pixels in the border of the first image, the image data of the extended region according to the extension policies includes: generating, according to a pixel value of at least one pixel, located at the border, in each column of pixels of the first image, a pixel value of each pixel in a corresponding column of pixels in the third sub-regions; and obtaining, according to a change trend of the pixel values of the plurality of pixels in the border of the first image in the row direction, pixel values of a plurality of columns of pixels that change in the row direction in the fourth sub-regions.
- Each column of pixels has the same pixel value, and each pixel value is within the valid range of pixel values.
- generating, according to the pixel value of the at least one pixel, located at the border, in each column of pixels of the first image, the pixel value of each pixel in the corresponding column of pixels in the third sub-regions includes: using a pixel value of a pixel, located at the border, in each column of pixels of the first image as the pixel value of each pixel in the corresponding column of pixels of the third sub-regions.
- the non-solid-colored border of the first image includes a border, a color of which gradually changes both in a column direction and in a row direction.
- the extended region includes fifth sub-regions, sixth sub-regions, and seventh sub-regions except the fifth sub-regions and the sixth sub-regions, and the fifth sub-regions are flush with the base region in the row direction, and the sixth sub-regions are flush with the base region in the column direction.
- obtaining the image data of the seventh sub-regions based on the image data of the fifth sub-regions and/or the image data of the sixth sub-regions includes: for each pixel in a seventh sub-region, averaging the pixel values of its two adjacent pixels in the row direction and in the column direction to obtain the pixel value of that pixel.
- obtaining the image data of the seventh sub-regions based on the image data of the fifth sub-regions and/or the image data of the sixth sub-regions includes one of the following: obtaining, according to a change trend, in the row direction, of pixel values of a row of pixels in a border of a sixth sub-region adjacent to a seventh sub-region, pixel values of a corresponding row of pixels in the seventh sub-region; or obtaining, according to a change trend, in the column direction, of pixel values of a column of pixels in a border of a fifth sub-region adjacent to the seventh sub-region, pixel values of a corresponding column of pixels in the seventh sub-region. In either case, each pixel value is within the valid range of pixel values.
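The neighbour-averaging variant for a corner ("seventh") sub-region can be sketched as a sweep that starts next to the already-filled strips. A hedged illustration for the top-left corner only, assuming the adjacent column of the top (sixth) sub-region and the adjacent row of the left (fifth) sub-region are already filled; the function name and in-place convention are ours.

```python
import numpy as np

def fill_top_left_corner(second, ch, cw):
    """Fill the top-left corner region second[:ch, :cw] in place.
    The pixels just right of it (column cw, in a sixth sub-region) and
    just below it (row ch, in a fifth sub-region) must already be known.
    Each corner pixel becomes the average of its neighbours in the row
    and column directions, sweeping outward from the base region so the
    neighbours are always computed before they are read."""
    for r in range(ch - 1, -1, -1):          # bottom row of the corner first
        for c in range(cw - 1, -1, -1):      # rightmost column first
            right = int(second[r, c + 1])    # neighbour in the row direction
            below = int(second[r + 1, c])    # neighbour in the column direction
            second[r, c] = (right + below) // 2
    return second
```

The other three corners are handled symmetrically by mirroring the sweep direction.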
- generating, based on the image data of the first image, the image data of the extended region according to the extension policies includes: generating the image data of the extended region according to pixel values of all pixels in the first image.
- the first image includes a border irregular in color.
- Generating the image data of the extended region according to the pixel values of all the pixels in the first image includes: averaging the pixel values of all the pixels in the first image to obtain a pixel value of each pixel in the extended region.
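The fallback policy for an irregular border is the simplest of the five: one global average. A minimal sketch under the same assumptions as above (single-channel NumPy image, centered base region, hypothetical name):

```python
import numpy as np

def extend_irregular(first_image, second_h, second_w):
    """Fallback for a border irregular in color: every pixel of the
    extended region takes the average of all pixels in the first image,
    with the base region centered in the second image."""
    h, w = first_image.shape[:2]
    mean_value = first_image.mean(axis=(0, 1)).round().astype(first_image.dtype)
    second = np.empty((second_h, second_w) + first_image.shape[2:],
                      dtype=first_image.dtype)
    second[...] = mean_value                 # uniform average fill
    top, left = (second_h - h) // 2, (second_w - w) // 2
    second[top:top + h, left:left + w] = first_image
    return second
```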
- the image processing method further includes: identifying a type of the first image according to pixel values of a plurality of pixels in a border of the first image.
- Generating, based on the image data of the first image, the image data of the extended region according to the extension policies includes: generating, based on the image data of the first image, the image data of the extended region according to an extension policy corresponding to the type of the first image.
- the border of the first image includes two first sub-borders parallel to a row direction and two second sub-borders parallel to a column direction. Identifying the type of the first image according to the pixel values of the plurality of pixels in the border of the first image, includes: determining the type of the first image according to a change trend of pixel values of pixels in each first sub-border in the row direction and a change trend of pixel values of pixels in each second sub-border in the column direction.
- the type of the first image is a first type, a second type, a third type, a fourth type or a fifth type.
- the first type is configured to represent that the border of the first image includes a solid-colored border.
- the second type is configured to represent that the border of the first image includes a border, a color of which gradually changes in the column direction and does not change in the row direction.
- the third type is configured to represent that the border of the first image includes a border, a color of which gradually changes in the row direction and does not change in the column direction.
- the fourth type is configured to represent that the border of the first image includes a border, a color of which gradually changes both in the column direction and in the row direction.
- the fifth type is configured to represent that the border of the first image includes a border irregular in color.
- determining the type of the first image according to the change trend of the pixel values of the pixels in each first sub-border in the row direction and the change trend of the pixel values of the pixels in each second sub-border in the column direction includes: determining a first determination result of each first sub-border, wherein the first determination result includes equality if the pixel values of all pixels in each row of pixels in the first sub-border are approximately equal, and inequality otherwise; and the first determination result includes gradual change if the pixel values of all the pixels in each row of pixels in the first sub-border gradually change, and no gradual change otherwise; and determining a second determination result of each second sub-border in the same manner, wherein the second determination result includes equality if the pixel values of all pixels in each column of pixels in the second sub-border are approximately equal, and inequality otherwise; and the second determination result includes gradual change if the pixel values of all the pixels in each column of pixels in the second sub-border gradually change, and no gradual change otherwise.
- If the first determination result of each first sub-border and the second determination result of each second sub-border both include equality, the first image is of the first type. If the first determination result of each first sub-border includes equality and the second determination result of each second sub-border includes gradual change, the first image is of the second type. If the first determination result of each first sub-border includes gradual change and the second determination result of each second sub-border includes equality, the first image is of the third type. If the first determination result of each first sub-border and the second determination result of each second sub-border both include gradual change, the first image is of the fourth type. If at least one of the first determination result of each first sub-border and the second determination result of each second sub-border includes both inequality and no gradual change, the first image is of the fifth type.
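The five-way decision above can be sketched as a small classifier. This is an illustrative reading of the claims, not the patented implementation: the image is assumed to be a single-channel NumPy array, the sub-borders are taken a fixed number of pixels thick, and the tolerance, helper names, and numeric type labels (1 through 5, matching the first through fifth types) are all our assumptions.

```python
import numpy as np

# Hypothetical labels for the five border types described above.
SOLID, COL_GRADIENT, ROW_GRADIENT, BOTH_GRADIENT, IRREGULAR = 1, 2, 3, 4, 5

def _trend(line, tol=2):
    """'equal' if the 1-D line is approximately constant, 'gradual' if its
    successive differences all have the same sign, else 'none'."""
    diffs = np.diff(line.astype(np.int32))
    if np.all(np.abs(diffs) <= tol) and abs(int(line[-1]) - int(line[0])) <= tol:
        return "equal"
    if np.all(diffs >= 0) or np.all(diffs <= 0):
        return "gradual"
    return "none"

def classify_border(first_image, border=2):
    """Decide the image type from the change trends of the two
    row-parallel sub-borders (checked along the row direction) and the
    two column-parallel sub-borders (checked along the column direction)."""
    rows = [first_image[r, :] for r in (*range(border), *range(-border, 0))]
    cols = [first_image[:, c] for c in (*range(border), *range(-border, 0))]
    row_res = {_trend(r) for r in rows}      # first determination results
    col_res = {_trend(c) for c in cols}      # second determination results
    if row_res == {"equal"} and col_res == {"equal"}:
        return SOLID
    if row_res == {"equal"} and col_res <= {"gradual", "equal"}:
        return COL_GRADIENT
    if col_res == {"equal"} and row_res <= {"gradual", "equal"}:
        return ROW_GRADIENT
    if row_res <= {"gradual", "equal"} and col_res <= {"gradual", "equal"}:
        return BOTH_GRADIENT
    return IRREGULAR                          # inequality and no gradual change
```

The result selects which of the extension policies above is applied to generate the extended region.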
- a display control method is provided, which is applied to a display control apparatus.
- the display control method includes: reading image data of a startup picture; performing the image processing method provided by any of the above embodiments to obtain the image data of the second image, wherein the first image in the image processing method is the startup picture; and outputting the image data of the second image to control a display panel for display according to the image data of the second image.
- outputting the image data of the second image includes: outputting the image data of the second image in response to a case where the display control apparatus is in a startup initialization state.
- the display control method further includes: outputting image data of a working picture in response to an end of the startup initialization state of the display control apparatus.
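The display control flow of the last three paragraphs can be sketched with stand-in callables. Every name here is a hypothetical stub for hardware-specific logic, not an API from the patent: the stored low-resolution startup picture is read, extended to the panel resolution, and output for as long as the controller is in its startup initialization state, after which the working picture takes over.

```python
def run_display_control(read_startup_picture, extend, output,
                        in_startup_init, read_working_picture):
    """Sketch of the claimed control flow with injected callables."""
    first_image = read_startup_picture()    # low-resolution picture from memory
    second_image = extend(first_image)      # image processing method above
    while in_startup_init():                # startup initialization state
        output(second_image)                # startup picture is displayed
    output(read_working_picture())          # working picture after initialization
```

Injecting the callables keeps the sketch testable; a real display control apparatus would wire these to its memory, scaler, and output interface.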
- a non-transitory computer-readable storage medium has stored therein computer program instructions that, when run on a computer (e.g., a display apparatus), cause the computer to perform the image processing method provided by any of the above embodiments, or the display control method provided by any of the above embodiments.
- FIG. 1 is a schematic diagram of a startup picture displayed on a displayer when a user performs a startup operation on the displayer;
- FIG. 2 is a structural diagram of a display apparatus, in accordance with some embodiments.
- FIG. 3 A is a structural diagram of a display control apparatus, in accordance with some embodiments.
- FIG. 3 B is a structural diagram of another display control apparatus, in accordance with some embodiments.
- FIG. 4 is a schematic diagram of a first image and a second image, in accordance with some embodiments.
- FIG. 5 is a flowchart of a display control method, in accordance with some embodiments.
- FIG. 6 A is a schematic diagram of a first image, in accordance with some embodiments.
- FIG. 6 B is a schematic diagram of a second image, in accordance with some embodiments.
- FIG. 7 is a flowchart of an image processing method, in accordance with some embodiments.
- FIG. 8 is a schematic diagram of another first image, in accordance with some embodiments.
- FIG. 9 A is a schematic diagram of another first image and another second image, in accordance with some embodiments.
- FIG. 9 B is a schematic diagram of yet another first image and yet another second image, in accordance with some embodiments.
- FIG. 9 C is a schematic diagram of yet another first image and yet another second image, in accordance with some embodiments.
- FIG. 9 D is a schematic diagram of yet another first image and yet another second image, in accordance with some embodiments.
- FIG. 9 E is a schematic diagram of yet another first image and yet another second image, in accordance with some embodiments.
- FIG. 10 is a structural diagram of yet another display control apparatus, in accordance with some embodiments.
- FIG. 11 is a structural diagram of an image processing apparatus, in accordance with some embodiments.
- FIG. 12 is a structural diagram of yet another display control apparatus, in accordance with some embodiments.
- the term “comprise” and other forms thereof, such as the third-person singular form “comprises” and the present participle form “comprising”, are construed in an open and inclusive sense, i.e., as “including, but not limited to”.
- the terms such as “one embodiment”, “some embodiments”, “exemplary embodiments”, “example”, “specific example” or “some examples” are intended to indicate that specific features, structures, materials or characteristics related to the embodiment(s) or example(s) are included in at least one embodiment or example of the present disclosure. Schematic representations of the above terms do not necessarily refer to the same embodiment(s) or example(s).
- specific features, structures, materials or characteristics may be included in any one or more embodiments or examples in any suitable manner.
- The terms “first” and “second” are used for descriptive purposes only, and are not to be construed as indicating or implying relative importance or implicitly indicating the number of indicated technical features.
- features defined with “first” or “second” may explicitly or implicitly include one or more of the features.
- the term “a plurality of” or “the plurality of” means two or more unless otherwise specified.
- the terms “coupled” and “connected” and derivatives thereof may be used.
- the term “connected” may be used in the description of some embodiments to indicate that two or more components are in direct physical or electrical contact with each other.
- the term “coupled” may be used in the description of some embodiments to indicate that two or more components are in direct physical or electrical contact.
- the term “coupled” or “communicatively coupled” may also mean that two or more components are not in direct contact with each other, but still cooperate or interact with each other.
- the embodiments disclosed herein are not necessarily limited to the contents herein.
- The phrase “at least one of A, B and C” has the same meaning as the phrase “at least one of A, B or C”, and they both include the following combinations of A, B and C: only A, only B, only C, a combination of A and B, a combination of A and C, a combination of B and C, and a combination of A, B and C.
- “A and/or B” includes the following three combinations: only A, only B, and a combination of A and B.
- the term “if” is optionally construed as “when” or “in a case where” or “in response to determining that” or “in response to detecting”, depending on the context.
- the phrase “if it is determined that” or “if [a stated condition or event] is detected” is optionally construed as “in a case where it is determined that” or “in response to determining that” or “in a case where [the stated condition or event] is detected” or “in response to detecting [the stated condition or event]”, depending on the context.
- a startup process of a display apparatus may include the following steps. First, the displayer receives a startup instruction. In some examples, when the displayer is in a power-off state, a user performs a startup operation on the displayer. For example, referring to FIG. 1 , the user may press a power switch of the displayer. For another example, the user may also perform the startup operation on the displayer through a startup gesture, an infrared remote control, etc. In some other examples, when the displayer is in a standby (or hibernation) state, a startup instruction may be sent to the displayer through a host connected to the displayer. Subsequently, the displayer displays the startup picture of the displayer shown in FIG. 1 .
- the startup picture may be stored in a memory of the displayer by the manufacturer of the displayer in advance (e.g., before the product is shipped from a factory), and can be retrieved from the memory of the displayer for display when it needs to be displayed. Thereafter, the displayer may display the booting picture of an operating system in the host.
- a picture displayed by a high-resolution displayer is an image with a correspondingly high resolution. Accordingly, the startup picture of such a displayer is also an image with a correspondingly high resolution.
- Such a high-resolution startup picture occupies a large storage space in the memory of the displayer, which not only consumes considerable storage resources, but also makes the startup picture take a long time to load before it can be displayed.
- the display apparatus is a product with an image display function.
- the display apparatus may be a displayer, a television, a billboard, a digital photo frame, a laser printer with a display function, a telephone, a mobile phone, a personal digital assistant (PDA), a digital camera, a camcorder, a viewfinder, a navigator, a vehicle, a large-area wall, a household appliance, an information inquiry device (e.g., a business inquiry device of an electronic government, a bank, a hospital, an electric power department, and other departments), a monitor, or the like.
- the display apparatus may include a display module 200 and a display control apparatus 100 coupled to the display module.
- the display module 200 is configured to display an image (picture)
- the display control apparatus 100 is configured to perform a display control method to output image data to the display module 200 , so as to control the display module 200 to display images corresponding to the image data.
- the display module 200 may include a timing controller (TCON), a data driver circuit (i.e., a source driver circuit), a scanning driver circuit and a display panel (DP, also referred to as a display screen).
- the display panel may be an organic light-emitting diode (OLED) panel, a quantum dot light-emitting diode (QLED) panel, a liquid crystal display (LCD) panel, or a tiny LED (including a Mini LED or a Micro LED) panel.
- the display panel may include a plurality of sub-pixels.
- the number and distribution of the plurality of sub-pixels included in the display panel determine a resolution of the display panel, i.e., a resolution of the display module 200 or a resolution of the display apparatus.
- the display panel includes M by N (i.e., M × N) physical pixels, and each physical pixel includes a red sub-pixel (an R sub-pixel), a green sub-pixel (a G sub-pixel) and a blue sub-pixel (a B sub-pixel).
- the resolution of the display panel is (M × N).
- the display panel includes ((M × N)/2) R sub-pixels, (M × N) G sub-pixels and ((M × N)/2) B sub-pixels, and these sub-pixels form (M × N) virtual pixels, which can display an image with a resolution of (M × N); the R sub-pixel and the B sub-pixel may be shared by different virtual pixels.
- the resolution of the display panel is also (M × N).
- the resolution is generally expressed in a multiplicative form.
- the resolution of the display panel may be (1920 × 1080), (4096 × 2160) or (8192 × 4320), which indicates that the display panel includes (1920 × 1080), (4096 × 2160) or (8192 × 4320) physical or virtual pixels, respectively. The higher the resolution, the larger the number of pixels.
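The two sub-pixel arrangements above can be compared with a little arithmetic: a real-pixel RGB panel needs 3 sub-pixels per pixel, while the shared-sub-pixel (virtual pixel) arrangement needs only 2 per pixel on average. A small sketch (the function name is ours, not the patent's):

```python
def subpixel_counts(m, n, real_pixels=True):
    """Sub-pixel counts for an (M x N) resolution: one R, G and B
    sub-pixel per physical pixel, or (M*N)/2 R, M*N G and (M*N)/2 B
    sub-pixels in the shared (virtual pixel) arrangement."""
    if real_pixels:
        return {"R": m * n, "G": m * n, "B": m * n}
    return {"R": m * n // 2, "G": m * n, "B": m * n // 2}
```

Either way the displayed resolution is (M × N); only the total sub-pixel count (3MN versus 2MN) differs.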
- the TCON is used to convert received data signals (e.g., image data output from the display control apparatus 100 ) and received control signals respectively into data signals and control signals that are suitable for the data driver circuit and the scanning driver circuit, so as to realize image display of the display panel.
- Input interfaces of the TCON may include at least one of a transistor-transistor logic (TTL) interface, a low voltage differential signaling (LVDS) interface, an embedded display port (eDP) interface and a V-by-One interface.
- output interfaces of the display control apparatus 100 may include at least one of a TTL interface, an LVDS interface, an eDP interface and a V-by-One interface.
- the TCON may be integrated into the display control apparatus 100 .
- the data driver circuit may be a source driver chip, for example, a driver integrated circuit (IC).
- the data driver circuit is configured to provide, in response to the data signals (i.e., digital signals) and the control signals that are sent by the TCON, a driving signal (also referred to as a data driving signal, which may include a voltage or current corresponding to the digital signal) for each sub-pixel in the display panel.
- the data driver circuit may be integrated into the display control apparatus 100 .
- the scanning driver circuit may be a scanning driver chip, for example, a driver IC.
- the scanning driver circuit may be bonded to the display panel.
- the scanning driver circuit may be provided in the display panel, and in this case, it may be referred to as a gate driver on array (GOA, i.e., a scanning driver circuit disposed on an array substrate).
- the scanning driver circuit is configured to provide, in response to the control signals sent by the TCON, a scanning signal to each row of sub-pixels in the display panel.
- the display control apparatus 100 may be a chip system, which may include at least one chip and is configured to perform a display control method.
- the chip may be a programmable logic device.
- the chip may be a field programmable gate array (FPGA) or a complex programmable logic device (CPLD).
- the chip may also be a system-on-a-chip (SoC) chip.
- the display control apparatus 100 is a chip system including a SoC chip and an FPGA chip, and is configured to perform the display control method.
- the chip system may include a SoC board card including the SoC chip and an FPGA board card including the FPGA chip.
- the chip system may include a board card including the SoC chip and the FPGA chip.
- the display control apparatus 100 is a chip system including an FPGA chip, and is configured to perform the display control method.
- the chip system may be an FPGA board card including the FPGA chip.
- the display control apparatus 100 may include at least one processor 101 and at least one memory 102 .
- the at least one memory 102 has stored computer program(s) therein, and the at least one processor 101 is configured to execute the computer program(s) stored in the at least one memory 102 , so that the display control apparatus 100 performs the display control method.
- the memory 102 may include a high-speed random access memory, or may include a non-volatile memory such as a magnetic disk storage device or a flash memory device.
- the memory 102 may be a read-only memory (ROM) or a static storage device of any other types that may store static information and instructions, a random access memory (RAM) or a dynamic storage device of any other types that may store information and instructions.
- the memory 102 may be a one-time programmable (OTP) memory, an electrically erasable programmable read-only memory (EEPROM), a magnetic disk storage medium, a flash memory or any other magnetic storage device, or any other medium capable of carrying or storing program codes in the form of instructions or data structures and capable of being accessed by a computer, but the type of the memory is not limited thereto.
- the memory 102 may exist independently, and be connected to the processor 101 through a communication line. Alternatively, the memory 102 may be integrated with the processor 101 .
- the processor 101 is used to implement image processing, and may be one or more general-purpose central processing units (CPUs), microcontroller units (MCUs), logic devices, application-specific integrated circuits (ASICs), graphics processing units (GPUs), or integrated circuits (ICs) for controlling execution of programs in some embodiments of the present disclosure.
- the CPU may be a single-CPU or a multi-CPU.
- a processor here may refer to one or more devices, circuits or processing cores for processing data (e.g., computer program instructions).
- the embodiments of the present disclosure provide a display control method.
- the display control method provided by the embodiments of the present disclosure may utilize image data of a startup picture with the low resolution (e.g., a first image 510 ) stored in the display apparatus to enable the display apparatus to display a startup picture with the high resolution (e.g., a second image 520 ).
- the image data of the startup picture with the low resolution may be stored in the display apparatus, which may solve problems of a large storage space occupied by the image data of the startup picture with the high resolution and a long loading time of the image data of the startup picture with the high resolution.
- FIG. 5 is a flowchart of the display control method provided by the embodiments of the present disclosure.
- the display control method includes S 101 to S 103 .
- the image data of the startup picture are read in response to a case where the display control apparatus is powered on.
- the display control apparatus in the display apparatus is powered on to perform the step of reading.
- the image data of the startup picture are read in response to a received read instruction.
- the read instruction may be sent to the display control apparatus by the host in the display apparatus.
- the startup picture includes at least one frame of image. In a case where the startup picture includes one frame of image, it is static. In a case where the startup picture includes a plurality of frames of images, it is dynamic. Each frame of image may include a display content and a background.
- the display content of the startup picture may include a pattern (e.g., a logo), words (e.g., copyright information), and the like, and is generally concentrated in the center or any other position of the startup picture.
- the background of the startup picture may be solid-colored. Alternatively, the background of the startup picture may be non-solid-colored, for example, gradually changed in color or irregular in color distribution.
- the startup picture may include the first image 510 having a first resolution, and includes the display content “XXX” and the background. For example, the background is a solid black background.
- FIG. 6 A shows a schematic diagram of the first image 510 .
- the first image 510 includes n rows by m columns of pixels, i.e., (m ⁇ n) pixels.
- a pixel in an x-th row and a y-th column may be denoted as a pixel xy, where a value of x is within a range from 1 to n, inclusive, and a value of y is within a range from 1 to m, inclusive.
- the first image 510 may include pixels 11 , 12 , 13 , . . . , 1 m , 21 , . . . , 2 m , . . . , n 1 , . . . , nm.
- the image data may include RGB image data, or may include YUV image data.
- the RGB image data may include a pixel value of at least one pixel, and the pixel value may include pixel data (e.g., grayscale data) of all sub-pixels in the pixel.
- two pixels having the same pixel value may display the same color.
- the pixel value has a certain range.
- each pixel value obtained or generated is within the range of the pixel value.
- the pixel value takes the upper boundary value of the range in a case where the result is greater than the range, and takes the lower boundary value of the range in a case where the result is less than the range.
- the range of the pixel value is from 0 to 255, inclusive; in a case where the result obtained or generated is greater than 255, the pixel value is 255; and in a case where the result obtained or generated is less than 0, the pixel value is 0.
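The clamping rule above can be expressed as a one-line helper (a sketch; the function name is illustrative):

```python
def clamp_pixel(result, lo=0, hi=255):
    """Clamp a computed pixel value into the valid range [lo, hi]:
    values above hi become hi, values below lo become lo."""
    return max(lo, min(hi, result))
```

For example, a computed result of 300 is stored as 255, and a result of −7 as 0.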
- an image processing method is performed to obtain image data of a second image.
- the image data of the second image 520 may be obtained based on the image data of the first image 510 .
- the second image 520 has a second resolution, and the second resolution is greater than the first resolution of the first image 510 .
- the second image 520 has a base region B and an extended region E.
- the display control apparatus needs the image data corresponding to the second image 520 , and the image data include pixel values of all pixels in the second image 520 .
- the image data of the second image 520 may include image data of the base region B and image data of the extended region E.
- the image data of the base region B may include pixel values of all pixels in the base region B of the second image 520
- the image data of the extended region E may include pixel values of all pixels in the extended region E of the second image 520 .
- FIG. 6 B shows a structural diagram of the second image 520 .
- the second image 520 has a second resolution, which is denoted as (p ⁇ q) (p is greater than or equal to 1 (p ⁇ 1), and q is greater than or equal to 1 (q ⁇ 1)).
- the second image 520 includes (p ⁇ q) pixels.
- a pixel in an x-th row and a y-th column may be denoted as a pixel xy, where a value of x is within a range from 1 to q, inclusive, and a value of y is within a range from 1 to p, inclusive.
- the second image 520 may include pixels 11 , 12 , 13 , . . .
- the extended region E is an annular region, and the extended region E surrounds the base region B.
- the base region B may be in the middle of the second image 520 .
- the base region B and the second image 520 are both in a shape of a rectangle, and a center point of the base region B coincides with a center point of the second image 520 .
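Making the center point of the base region coincide with that of the second image reduces to an offset computation. A sketch, assuming the notation above, i.e. an (m × n) first image (m columns, n rows) centred in a (p × q) second image (p columns, q rows), with sub-pixel-free integer coordinates rounded down:

```python
def base_region_offset(m, n, p, q):
    """Return the (row, column) offset of the top-left pixel of an
    m-column by n-row base region centred in a p-column by q-row
    second image, so that the two center points coincide."""
    if m > p or n > q:
        raise ValueError("the base region must fit inside the second image")
    return ((q - n) // 2, (p - m) // 2)
```

For a (1920 × 1080) first image inside a (3840 × 2160) second image, the base region starts 540 rows down and 960 columns in.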
- the image data of the low-resolution startup picture may be used to fill an image corresponding to the low-resolution startup picture in a portion of a picture displayed on the high-resolution display apparatus.
- a corresponding background is filled in the other portion of the picture displayed on the high-resolution display apparatus, so that the displayed picture has a uniform color change on the whole.
- the base region B of the second image 520 may be filled with an image corresponding to the low-resolution first image 510 , and the extended region E of the second image 520 may be filled with a corresponding solid black background.
- the high-resolution startup picture displayed on the high-resolution display apparatus includes all information of the low-resolution startup picture (e.g., the display content “XXX”), and its color change is uniform, which may provide the user with a good visual effect.
- the image data of the second image are output in response to a case where the display control apparatus is in a startup initialization state. For example, once the display control apparatus is powered on or when the display control apparatus receives a startup instruction signal, the image data of the second image are output to the TCON to control the display panel for display according to the image data of the second image (i.e., control the display panel to display the second image).
- the display control method further includes the following step.
- image data of a working picture are output in response to an end of the startup initialization state of the display control apparatus.
- the display control apparatus when receiving a startup state ending signal, the display control apparatus outputs the image data of the working picture to the TCON to control the display panel to display the working picture.
- FIG. 7 is a flowchart illustrating steps of the image processing method provided by the embodiments of the present disclosure. Referring to FIG. 7 , the image processing method includes S 201 and S 203 .
- the image data of the first image are used as the image data of the base region.
- the image data of the first image 510 are used as the image data of the base region B of the second image 520 .
- a portion of the second image 520 located at the base region B includes (m ⁇ n) pixels, and pixel values of the (m ⁇ n) pixels are respectively equal to the pixel values of the (m ⁇ n) pixels of the first image 510 , so that the portion of the second image 520 located at the base region B is the first image 510 .
- the image processing method further includes the following step.
- the type of the first image is identified according to pixel values of a plurality of pixels in a border of the first image.
- the extended region of the second image may be filled with a background similar to the background of the first image.
- the color change of the second image may be uniform.
- the first image may be classified as one of various types according to the background of the first image, and a corresponding policy of various extension policies is used to generate the image data of the extended region of the second image, so that the extended region of the second image has a corresponding background, and the color change of the second image may be uniform.
- the first image may be classified as one of various types according to the border of the first image, and then the corresponding extension policy is selected. As a result, the uniform color change may be achieved in a case where the extended region of the second image is stitched together with the base region.
- the border F of the first image 510 includes two first sub-borders parallel to a row direction X, for example, a first sub-border Fx 1 (i.e., a portion O 1 O 2 X 3 X 1 ) and a first sub-border Fx 2 (i.e., a portion O 3 O 4 X 4 X 2 ).
- the border F of the first image 510 also includes two second sub-borders parallel to a column direction Y, for example, a second sub-border Fy 1 (i.e., a portion O 1 O 3 Y 3 Y 1 ) and a second sub-border Fy 2 (i.e., a portion O 2 O 4 Y 4 Y 2 ).
- the type of the first image can be determined according to a change trend of pixel values of pixels in each first sub-border in the row direction and a change trend of pixel values of pixels in each second sub-border in the column direction.
- a determination result of the first sub-border is determined according to the change trend of the pixel values of the pixels in each first sub-border in the row direction.
- the determination result of the first sub-border is denoted as a first determination result.
- a determination result of the first sub-border Fx 1 is denoted as a first determination result 1
- a determination result of the first sub-border Fx 2 is denoted as a first determination result 2.
- if the pixel values of all the pixels in each row of pixels in the first sub-border are approximately equal, the first determination result includes equality; otherwise, the first determination result includes inequality.
- if a difference between any two of R x1 to R xn is less than a set value (e.g., 1 or 2), a difference between any two of G x1 to G xn is less than the set value, and a difference between any two of B x1 to B xn is less than the set value, the pixel values of the pixels in the x-th row are approximately equal. If the pixel values of the pixels in each row in the first sub-border Fx 1 are approximately equal, the first determination result 1 includes equality; otherwise, the first determination result 1 includes inequality.
- if the pixel values of all the pixels in each row of pixels in the first sub-border gradually change, the first determination result includes gradual change; otherwise, the first determination result includes no gradual change.
- the pixel values of all the pixels in an x-th row in the first sub-border Fx 1 include (R x1 , R x2 , R x3 . . . R xn ), (G x1 , G x2 , G x3 . . . G xn ) and (B x1 , B x2 , B x3 . . . B xn ), where
- ΔR xy = R xy − R x(y-1) ,
- ΔG xy = G xy − G x(y-1) ,
- ΔB xy = B xy − B x(y-1) .
- For example, ΔR x2 = R x2 − R x1 , ΔR x3 = R x3 − R x2 , . . . , ΔR xn = R xn − R x(n-1) ;
- ΔG x2 = G x2 − G x1 , ΔG x3 = G x3 − G x2 , . . . , ΔG xn = G xn − G x(n-1) ;
- ΔB x2 = B x2 − B x1 , ΔB x3 = B x3 − B x2 , . . . , ΔB xn = B xn − B x(n-1) .
- in a case where the first determination result includes inequality, if a difference between any two of ΔR x2 to ΔR xn is less than a set value (e.g., 1 or 2), a difference between any two of ΔG x2 to ΔG xn is less than the set value, and a difference between any two of ΔB x2 to ΔB xn is less than the set value, it is also indicated that the pixel values of the pixels in the x-th row gradually change.
- in a case where the first determination result includes inequality, if ΔR x2 to ΔR xn gradually increase, ΔG x2 to ΔG xn gradually increase, and ΔB x2 to ΔB xn gradually increase, it is indicated that the pixel values of the pixels in the x-th row gradually change.
- in a case where the first determination result includes inequality, if ΔR x2 to ΔR xn gradually decrease, ΔG x2 to ΔG xn gradually decrease, and ΔB x2 to ΔB xn gradually decrease, it is indicated that the pixel values of the pixels in the x-th row gradually change.
- in a case where the first determination result includes inequality, if ΔR x2 to ΔR xn gradually increase or gradually decrease, ΔG x2 to ΔG xn are approximately equal, and ΔB x2 to ΔB xn are approximately equal, it is indicated that the pixel values of the pixels in the x-th row gradually change.
- the first determination result 1 includes gradual change; otherwise, the first determination result 1 includes no gradual change.
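The per-row determination described above can be sketched per color channel as follows. This is an illustrative simplification: it covers the approximately-equal case and the uniform gradual-change case with a set value of 2, not the monotonically-increasing or -decreasing difference cases also described:

```python
def row_trend(values, set_value=2):
    """Classify one channel of one border row.

    'equal'   : any two values differ by less than set_value.
    'gradual' : successive differences all have one sign and any two
                of them differ by less than set_value (uniform change).
    'none'    : neither, i.e. inequality and no gradual change.
    """
    if max(values) - min(values) < set_value:
        return "equal"
    diffs = [b - a for a, b in zip(values, values[1:])]
    same_sign = all(d >= 0 for d in diffs) or all(d <= 0 for d in diffs)
    if same_sign and max(diffs) - min(diffs) < set_value:
        return "gradual"
    return "none"
```

A sub-border's result would then combine the per-channel results of each of its rows (equality only if every row and channel is 'equal', and so on).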
- a determination result of the second sub-border is determined according to the change trend of the pixel values of the pixels in each second sub-border in the column direction.
- the determination result of the second sub-border is denoted as a second determination result.
- a determination result of the second sub-border Fy 1 is denoted as a second determination result 1
- a determination result of the second sub-border Fy 2 is denoted as a second determination result 2.
- the second determination result includes equality; otherwise, the second determination result includes inequality. If the pixel values of all the pixels in each column of pixels in the second sub-border gradually change, the second determination result includes gradual change; otherwise, the second determination result includes no gradual change.
- the first image is of the first type, and the first type may be configured to indicate that the first image includes a solid-colored border.
- in a case where the first determination result 1 of the first sub-border Fx 1 , the first determination result 2 of the first sub-border Fx 2 , the second determination result 1 of the second sub-border Fy 1 , and the second determination result 2 of the second sub-border Fy 2 all include equality, the first image 510 is of the first type, and the first image 510 may include the solid-colored border.
- the first image is of the second type, and the second type may be configured to indicate that the first image includes a border, a color of which gradually changes in the column direction and does not change in the row direction.
- the first determination result 1 of the first sub-border Fx 1 and the first determination result 2 of the first sub-border Fx 2 both include equality
- the second determination result 1 of the second sub-border Fy 1 and the second determination result 2 of the second sub-border Fy 2 both include gradual change
- the first image 510 is of the second type, and the first image 510 may include the border, a color of which gradually changes in the column direction and does not change in the row direction.
- the first image is of the third type, and the third type may be configured to indicate that the first image includes a border, a color of which gradually changes in the row direction and does not change in the column direction.
- the first determination result 1 of the first sub-border Fx 1 and the first determination result 2 of the first sub-border Fx 2 both include gradual change
- the second determination result 1 of the second sub-border Fy 1 and the second determination result 2 of the second sub-border Fy 2 both include equality
- the first image 510 is of the third type, and the first image 510 may include the border, a color of which gradually changes in the row direction and does not change in the column direction.
- the first image is of the fourth type, and the fourth type may be configured to indicate that the first image includes a border, a color of which gradually changes both in the row direction and in the column direction.
- the first image is of the fifth type, and the fifth type may be configured to indicate that the first image includes a border that is irregular in color.
- in a case where the first determination result 1 of the first sub-border Fx 1 , the first determination result 2 of the first sub-border Fx 2 , the second determination result 1 of the second sub-border Fy 1 and the second determination result 2 of the second sub-border Fy 2 include inequality and no gradual change,
- the first image 510 is of the fifth type, and the first image 510 may include the border that is irregular in color.
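Collapsing the four sub-border determination results into the five types can be sketched like this (the string labels are illustrative stand-ins for the equality / gradual-change / inequality-and-no-gradual-change results):

```python
def classify_first_image(fx1, fx2, fy1, fy2):
    """Return the image type (1..5) from the determination results of
    the two first sub-borders (fx1, fx2, judged in the row direction)
    and the two second sub-borders (fy1, fy2, judged in the column
    direction); each argument is 'equal', 'gradual' or 'none'."""
    rows, cols = {fx1, fx2}, {fy1, fy2}
    if rows == {"equal"} and cols == {"equal"}:
        return 1  # solid-colored border
    if rows == {"equal"} and cols == {"gradual"}:
        return 2  # gradual change in the column direction only
    if rows == {"gradual"} and cols == {"equal"}:
        return 3  # gradual change in the row direction only
    if rows == {"gradual"} and cols == {"gradual"}:
        return 4  # gradual change in both directions
    return 5      # border irregular in color
```

The fifth type acts as the fallback for every other combination of results.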
- the image data of the extended region are generated according to the extension policies.
- the first image may be classified as one of five types according to the border of the first image.
- different extension policies may be used for the first image, so as to use the image data of the first image to generate the image data of the extended region of the second image.
- the image data of the extended region E may be generated according to the extension policies based on a pixel value of at least one pixel in the border F of the first image 510 .
- the first image 510 has the border F.
- the border F includes 4 sub-borders, for example, the sub-borders Fx 1 , Fx 2 , Fy 1 and Fy 2 .
- a width of each sub-border is at least that of a single pixel, and the widths of the sub-borders may or may not be equal.
- widths of all sub-borders are equal, and each sub-border includes a width of n pixels.
- the sub-border Fx 1 and the sub-border Fx 2 both include n rows of pixels
- the sub-border Fy 1 and the sub-border Fy 2 both include n columns of pixels.
- n is greater than or equal to 1 and less than or equal to 10 (1 ⁇ n ⁇ 10).
- the first image 510 includes a solid-colored border F 1 .
- the solid-colored border F 1 means that a color of the border F 1 is a single color. In this case, all pixels in the border F 1 of the first image 510 may have the same pixel value.
- a pixel value of a pixel in the border F 1 of the first image 510 may be used as a pixel value of each pixel in the extended region E.
- the border F 1 of the first image 510 is solid-colored, and the pixel values of all the pixels in the border F 1 are equal. Therefore, by using a pixel value of any pixel in the solid-colored border F 1 as the pixel value of each pixel in the extended region E, it may be realized that a color of the extended region E is the same as the color of the border F 1 of the first image 510 . Moreover, since the base region B of the second image 520 may be filled with the first image 510 after step S 201 , the base region B of the second image 520 also has a solid-colored border. In this case, the color of the extended region E is the same as the color of the border of the base region B.
- the second image 520 has a solid-colored background, and a uniform color change.
- a pixel value of a pixel 11 in the border F 1 of the first image 510 is used as the pixel value of each pixel in the extended region E.
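For the first type, the extension policy amounts to padding with a single border value. A minimal sketch, representing an image as a list of rows of pixel values and centring the base region as described earlier (the function name is illustrative):

```python
def extend_solid_border(first_image, p, q):
    """Build a q-row by p-column second image: the first image fills a
    centred base region, and every pixel of the extended region takes
    the value of pixel 11 (the top-left border pixel)."""
    n, m = len(first_image), len(first_image[0])
    fill = first_image[0][0]                 # pixel 11 of the border F1
    r0, c0 = (q - n) // 2, (p - m) // 2      # centre the base region
    second = [[fill] * p for _ in range(q)]
    for r in range(n):
        second[r0 + r][c0:c0 + m] = first_image[r]
    return second
```

Because all border pixels of a solid-colored border share one value, any border pixel could serve as the fill value; pixel 11 is simply the first one read.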
- the first image includes a non-solid-colored border.
- the non-solid-colored border means that the border has a plurality of colors.
- each pixel in the border of the first image may have a different pixel value.
- the non-solid-colored border may include a border, a color of which gradually changes in the row direction and/or in the column direction.
- a gradual change in color means that a color gradually changes in a certain direction, for example, the color gradually becomes darker or lighter in the certain direction.
- the change may be uniform or non-uniform.
- a color displayed by all pixels in the image changes gradually, and correspondingly, pixel values of all the pixels in the image also change gradually.
- the image data of the extended region may be generated, based on pixel values of a plurality of (e.g., z, z is greater than or equal to 2 (z ⁇ 2)) pixels in the border of the first image, according to the extension policies.
- the first image 510 includes a border F 2 , a color of which gradually changes in the column direction and does not change in the row direction.
- the color of the border F 2 gradually becomes darker in the column direction, and accordingly, pixel values of all pixels in the border F 2 show a trend of gradually increasing in the column direction.
- Change of the pixel values of all the pixels in the border F 2 in the column direction may be a uniform change, for example, these pixel values are in an arithmetic progression.
- the change of the pixel values of all the pixels in the border F 2 in the column direction may be a non-uniform change.
- the color of the border F 2 does not change in the row direction, and accordingly, pixel values of all pixels in the border F 2 are approximately equal in the row direction.
- the extended region E of the second image 520 shown in FIG. 9 B includes a first sub-region D 11 (i.e., a portion V 1 V 2 S 8 S 5 ) and a first sub-region D 12 (i.e., a portion V 3 V 4 S 6 S 7 ), and also includes, outside the first sub-regions, a second sub-region D 21 (i.e., a portion S 4 S 3 V 3 V 2 ) and a second sub-region D 22 (i.e., a portion S 1 S 2 V 4 V 1 ).
- the first sub-region D 11 and the first sub-region D 12 are both flush with the base region B in the row direction. In this case, each row of pixels in the first sub-region D 11 and the first sub-region D 12 may be flush with a corresponding row of pixels in the base region B.
- a pixel value of each pixel in a corresponding row of pixels in the first sub-regions may be generated; and according to a change trend of pixel values of a plurality of pixels in the border F 2 of the first image 510 in the column direction, pixel values of a plurality of rows of pixels that change in the column direction in the second sub-regions are obtained.
- all pixels in each row have the same pixel value, and each pixel value is within the range of the pixel value.
- the first sub-region D 11 is considered as an example for illustration.
- the image data of the first sub-region D 12 may be generated by using a method similar to a method for generating the image data of the first sub-region D 11 , and details will not be provided herein.
- a pixel value of each pixel in a row of pixels in the first sub-region D 11 may be generated according to a pixel value of at least one pixel located at the border F 2 in a corresponding row of pixels of the first image 510 .
- pixels located at the border F 2 in the corresponding row of pixels of the first image 510 may be a pixel 11 , a pixel 12 , . . . , and a pixel 1 m in the first row of the first image 510 , and according to a pixel value of at least one pixel in these pixels, a pixel value of each pixel in the first row of pixels in the first sub-region D 11 may be generated.
- a pixel value of a pixel, located at the border F 2 , in each row of pixels of the first image 510 is used as the pixel value of each pixel in the corresponding row of pixels in the first sub-region D 11 .
- a pixel value of a pixel located at the border F 2 (i.e., the pixel 11 ) is used as the pixel value of each pixel in the first row of pixels in the first sub-region D 11 .
- pixel values of all pixels in each row of the border F 2 of the first image 510 are approximately equal.
- a pixel value of any pixel located at the border F 2 in each row of pixels of the first image 510 is used as the pixel value of each pixel in the corresponding row of pixels in the first sub-region D 11 , so that each row of pixels in the first sub-region D 11 displays substantially the same color as a corresponding row of pixels in the border F 2 of the first image 510 .
- a color change in the second image 520 from the first image 510 filled in the base region B to the corresponding background filled in the first sub-region D 11 may be uniform.
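For this second-type border, each row of a first sub-region simply repeats the border value of the corresponding row of the first image. A sketch, assuming (as stated above) that the pixel values within each border row are approximately equal, so pixel x1 can represent its row:

```python
def fill_first_subregion(first_image, width):
    """Return the rows of a first sub-region flush with the base
    region in the row direction: each row is `width` copies of the
    border pixel value of the corresponding row of the first image."""
    return [[row[0]] * width for row in first_image]
```

The right-hand first sub-region D 12 is built the same way, e.g. using pixel xm of each row instead of pixel x1.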
- the second sub-region D 21 is considered as an example for illustration.
- the second sub-region D 21 includes (i+1) rows of pixels, where i is greater than or equal to 0 (i ⁇ 0).
- a respective one of these rows of pixels may be denoted as an (f+x)-th row of pixels, and a value of x is within a range from 0 to i, such as an f-th row of pixels, an (f+1)-th row of pixels, . . . , an (f+i)-th row of pixels shown in FIG. 9 B .
- Image data of the second sub-region D 21 may be generated according to the change trend of the pixel values of the plurality of pixels in the border F 2 in the column direction, so that a change trend of pixel values of all pixels in the second sub-region D 21 in the column direction is the same as the change trend of the pixel values of the plurality of pixels in the border F 2 of the first image 510 in the column direction.
- pixel values of a plurality of rows of pixels that change in the column direction in the second sub-region D 21 may be obtained based on a pixel value of any pixel (e.g., the pixel value of the pixel 11 ) in the first row of pixels of the first image 510 .
- the pixel values of all pixels in each row of pixels may be equal.
- pixel values of a column of pixels may form an arithmetic progression, so that a color of the second sub-region D 21 changes uniformly in the column direction, which may provide the user with a good viewing feeling.
- the pixel values of all pixels in the column of pixels are different from one another, and all the pixel values form the arithmetic progression.
- for example, the pixel values R fy to R (f+i)y are 10 , 15 , 20 , . . . , forming an arithmetic progression with a common difference of 5.
- pixel values of adjacent pixels in the column of pixels may be equal, and non-repeated values in pixel values of all pixels in the column may form the arithmetic progression.
- a method for obtaining the image data of the second sub-region D 21 is as follows.
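A minimal sketch of one plausible policy for the rows above the base region, assuming a uniform (arithmetic-progression) change continued upward from the value of a pixel in the first row of the first image, with each result clamped to the 0..255 range described earlier (names and the single-channel simplification are illustrative):

```python
def extend_column_upward(border_value, step, rows):
    """Return values (top-to-bottom) for `rows` rows of the second
    sub-region above the base region. `border_value` is the value of
    a pixel in the first row of the first image; `step` is the common
    difference per row of the border's column-direction progression."""
    return [max(0, min(255, border_value - step * k))
            for k in range(rows, 0, -1)]
```

The second sub-region D 22 below the base region is analogous: start from a value in the n-th row of the first image and add `step` per row moving downward. Every pixel in a given output row takes that row's value.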
- the second sub-region D 22 may include (k+1) rows of pixels, where k is greater than or equal to 0 (k ⁇ 0).
- a respective one of these rows of pixels may be denoted as a (g+x)-th row of pixels, and a value of x is within a range from 0 to k, such as a g-th row of pixels, a (g+1)-th row of pixels, . . . , a (g+k)-th row of pixels shown in FIG. 9 B .
- a method for obtaining image data of the second sub-region D 22 is similar to the method for obtaining the image data of the second sub-region D 21 .
- pixel values of a plurality of rows of pixels that change in the column direction in the second sub-region D 22 are obtained based on a pixel value of any pixel (e.g., a pixel value (including R n1 , G n1 and B n1 ) of a pixel n 1 ) in an n-th row of pixels of the first image 510 .
- the method for obtaining the image data of the second sub-region D 22 is as follows.
- the first image 510 includes a border F 3 , a color of which gradually changes in the row direction and does not change in the column direction.
- the color of the border F 3 gradually becomes darker in the row direction, and accordingly, pixel values of all pixels in the border F 3 may show a trend of gradually increasing in the row direction.
- Change of the pixel values of all the pixels in the border F 3 in the row direction may be a uniform change, for example, these pixel values are in an arithmetic progression.
- the change of the pixel values of all the pixels in the border F 3 in the row direction may be a non-uniform change.
- the extended region E of the second image 520 shown in FIG. 9 C includes third sub-regions and fourth sub-regions except the third sub-regions.
- the third sub-regions are flush with the base region B in the column direction.
- the second image 520 includes a third sub-region D 31 (i.e., a portion H 4 H 3 S 7 S 8 ), a third sub-region D 32 (i.e., a portion H 1 H 2 S 6 S 5 ), a fourth sub-region D 41 (i.e., a portion S 4 S 1 H 1 H 4 ) and a fourth sub-region D 42 (i.e., a portion S 3 S 2 H 2 H 3 ).
- the third sub-region D 31 and the third sub-region D 32 are both flush with the base region B in the column direction.
- each column of pixels in the third sub-region D 31 and the third sub-region D 32 may be flush with a corresponding column of pixels in the base region B.
- a pixel value of each pixel in a corresponding column of pixels in the third sub-regions may be generated, and according to a change trend of pixel values of a plurality of pixels in the border F 3 of the first image 510 in the row direction, pixel values of a plurality of columns of pixels that change in the row direction in the fourth sub-regions are obtained.
- Each column of pixels has the same pixel value, and each pixel value is within the valid range of pixel values.
- the third sub-region D 31 is considered as an example for illustration.
- the image data of the third sub-region D 32 may be generated by using a method similar to a method for generating the image data of the third sub-region D 31 , and details will not be provided herein.
- a pixel value of each pixel in the column of pixels in the third sub-region D 31 may be generated according to a pixel value of at least one pixel located at the border F 3 in a corresponding column of pixels of the first image 510 .
- pixels located at the border F 3 in a corresponding column of pixels of the first image 510 may be the first column of pixels of the first image 510 , including a pixel 11 , a pixel 21 , . . . , a pixel n 1 , and according to a pixel value of at least one pixel in these pixels, a pixel value of each pixel in the first column of pixels in the third sub-region D 31 may be generated.
- a pixel value of a pixel, located at the border F 3 , in each column of pixels of the first image 510 is used as the pixel value of each pixel in the corresponding column of pixels in the third sub-region D 31 .
- a pixel value of a pixel (i.e., the pixel 11 ) located at the border F 3 is used as the pixel value of each pixel in the first column of pixels in the third sub-region D 31 .
- pixel values of all pixels in each column of the border F 3 of the first image 510 are approximately equal.
- a pixel value of any pixel located at the border F 3 in each column of pixels of the first image 510 is used as the pixel value of each pixel in the corresponding column of pixels in the third sub-region D 31 , so that each column of pixels in the third sub-region D 31 displays substantially the same color as a corresponding column of pixels in the border F 3 of the first image 510 .
- a color change from the first image 510 filled in the base region B to the third sub-region D 31 of the second image 520 may be uniform.
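The replication policy for the third sub-regions can be sketched as follows (a hedged sketch assuming NumPy and 8-bit RGB data; the function name is hypothetical). Because pixel values within each column of the border are approximately equal, every added pixel simply copies the border pixel of the corresponding column:

```python
import numpy as np

def extend_flush_columns(first_image, rows_above, rows_below):
    """Fill the regions above and below an image whose border color varies
    only in the row direction: each added pixel copies the border pixel of
    the corresponding column, so every extended column displays the same
    color as the border and the transition stays seamless.
    """
    top = np.repeat(first_image[:1], rows_above, axis=0)      # repeat row 1 upward
    bottom = np.repeat(first_image[-1:], rows_below, axis=0)  # repeat row n downward
    return np.vstack([top, first_image, bottom])
```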
- the fourth sub-region D 41 includes (i+1) columns of pixels, where i is greater than or equal to 0 (i≥0).
- a respective one of these columns of pixels may be denoted as an (f+x)-th column of pixels, and a value of x is within a range from 0 to i, such as an f-th column of pixels, an (f+1)-th column of pixels, . . . , and an (f+i)-th column of pixels shown in FIG. 9 C .
- Image data of the fourth sub-region D 41 may be generated according to the change trend of the pixel values of the plurality of pixels in the border F 3 in the row direction, so that a change trend of pixel values of all pixels in the fourth sub-region D 41 in the row direction is the same as the change trend of the pixel values of the plurality of pixels in the border F 3 of the first image 510 in the row direction.
- pixel values of a plurality of columns of pixels that change in the row direction in the fourth sub-region D 41 may be obtained based on a pixel value of any pixel (e.g., the pixel value of the pixel 11 ) in the first column of pixels of the first image 510 .
- the pixel values of all pixels in each column of pixels may be equal.
- pixel values of a row of pixels may form an arithmetic progression, so that a color of the fourth sub-region D 41 changes uniformly in the row direction, which may provide the user with a good viewing feeling.
- the pixel values of all pixels in the row of pixels are different from one another, and all the pixel values form the arithmetic progression.
- pixel values of adjacent pixels in the row of pixels may be equal, and non-repeated values in pixel values of all pixels in the row may form the arithmetic progression.
- a method for obtaining the image data of the fourth sub-region D 41 is as follows.
- the fourth sub-region D 42 may include (k+1) columns of pixels, where k is greater than or equal to 0 (k≥0).
- a respective one of these columns of pixels may be denoted as a (g+x)-th column of pixels, and a value of x is within a range from 0 to k, such as a g-th column of pixels, a (g+1)-th column of pixels, . . . , a (g+k)-th column of pixels shown in FIG. 9 C .
- a method for obtaining the image data of the fourth sub-region D 42 is similar to the method for obtaining the image data of the fourth sub-region D 41 .
- pixel values of a plurality of columns of pixels that change in the row direction in the fourth sub-region D 42 are obtained based on a pixel value of any pixel (e.g., a pixel value (including R 1m , G 1m and B 1m ) of a pixel 1 m ) in an m-th column of pixels of the first image 510 .
- the method for obtaining the image data of the fourth sub-region D 42 is as follows.
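The row-direction extension for the fourth sub-regions can be sketched analogously (again an assumed illustration using NumPy, not the patented implementation; the helper name and 0-to-255 clipping are my own). Each new column continues the arithmetic progression between the two outermost columns, in the spirit of R (f+x) = R 11 + x × (R 11 − R 12):

```python
import numpy as np

def extend_rows(first_image, cols_left, cols_right):
    """Continue a row-direction gradient to the left and right of the image.

    Each added column extends the arithmetic progression between the two
    outermost columns; results are clipped to the 0-255 range.
    """
    img = first_image.astype(np.int64)
    step_l = img[:, :1] - img[:, 1:2]      # column 1 minus column 2
    step_r = img[:, -1:] - img[:, -2:-1]   # column m minus column m-1
    left = [img[:, :1] + (x + 1) * step_l for x in range(cols_left)]
    right = [img[:, -1:] + (x + 1) * step_r for x in range(cols_right)]
    out = np.hstack(left[::-1] + [img] + right)
    return np.clip(out, 0, 255).astype(np.uint8)
```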
- the first image 510 includes a border F 4 , a color of which gradually changes both in the column direction and in the row direction.
- the color of the border F 4 gradually becomes darker in the row direction and/or the column direction, and accordingly, pixel values of all pixels in the border F 4 may show a trend of gradually increasing in the row direction and/or the column direction.
- change of the pixel values of all the pixels in the border F 4 in the row direction and/or the column direction may be a uniform change, for example, these pixel values are in an arithmetic progression.
- the change of the pixel values of all the pixels in the border F 4 in the row direction and/or the column direction may be a non-uniform change.
- the extended region E of the second image 520 shown in FIG. 9 D includes fifth sub-regions, sixth sub-regions, and seventh sub-regions except the fifth sub-regions and the sixth sub-regions.
- the fifth sub-regions are flush with the base region B in the row direction, and the sixth sub-regions are flush with the base region B in the column direction.
- the second image 520 includes a fifth sub-region D 51 , a fifth sub-region D 52 , a sixth sub-region D 61 , a sixth sub-region D 62 , and a seventh sub-region D 71 , a seventh sub-region D 72 , a seventh sub-region D 73 and a seventh sub-region D 74 .
- the fifth sub-region D 51 and the fifth sub-region D 52 are both flush with the base region B in the row direction, and in this case, each row of pixels in the fifth sub-region D 51 and the fifth sub-region D 52 may be flush with a corresponding row of pixels in the base region B.
- the sixth sub-region D 61 and the sixth sub-region D 62 are both flush with the base region B in the column direction, and in this case, each column of pixels in the sixth sub-region D 61 and the sixth sub-region D 62 may be flush with a corresponding column of pixels in the base region B.
- according to a change trend, in the column direction, of pixel values of a plurality of pixels in the border F 4 of the first image 510 , pixel values of a corresponding column of pixels in the sixth sub-regions are obtained.
- Each pixel value is within the valid range of pixel values.
- Image data of the seventh sub-regions are obtained based on the image data of the fifth sub-regions and/or the image data of the sixth sub-regions.
- the fifth sub-region D 51 includes n rows and (f+i) columns of pixels, and the fifth sub-region D 51 may include a pixel xy which represents a pixel in an x-th row and a y-th column, where a value of x is within a range from 1 to n, and a value of y is within a range from f to (f+i).
- the color of the border F 4 of the first image 510 gradually changes both in the row direction and in the column direction, pixel values of a plurality of pixels located in a row in the border F 4 of the first image 510 have a certain change trend in the row direction, and pixel values of a corresponding row of pixels in the fifth sub-region D 51 may be obtained according to this change trend.
- the change trend of the pixel values of all pixels in the corresponding row of pixels in the fifth sub-region D 51 in the row direction is the same as a change trend of pixel values of the row of pixels in the border F 4 of the first image 510 in the row direction.
- pixel values of the x-th row of pixels in the fifth sub-region D 51 may be obtained.
- Pixel values of pixels from a first row to an n-th row in the fifth sub-region D 51 are obtained according to the above method, and thus the image data of the fifth sub-region D 51 can be obtained.
- pixel values of a row of pixels may form an arithmetic progression, so that a color of the fifth sub-region D 51 changes uniformly in the row direction, which may provide the user with a good viewing feeling.
- a method for obtaining the image data of the fifth sub-region D 51 is as follows.
- the fifth sub-region D 52 includes n rows and (g+k) columns of pixels, and the fifth sub-region D 52 may include a pixel xy which represents a pixel in an x-th row and a y-th column, where a value of x is within a range from 1 to n, and a value of y is within a range from g to (g+k).
- pixel values of the x-th row of pixels in the fifth sub-region D 52 may be obtained.
- Pixel values of pixels from a first row to an n-th row in the fifth sub-region D 52 are obtained according to the above method, and thus the image data of the fifth sub-region D 52 can be obtained.
- a method for obtaining the image data of the fifth sub-region D 52 is as follows.
- the sixth sub-region D 61 includes (h+a) rows and m columns of pixels, and the sixth sub-region D 61 may include a pixel xy which represents a pixel in an x-th row and a y-th column, where a value of x is within a range from h to (h+a), and a value of y is within a range from 1 to m.
- the color of the border F 4 of the first image 510 gradually changes both in the row direction and in the column direction, pixel values of a plurality of pixels located in a column in the border F 4 of the first image 510 have a certain change trend in the column direction, and pixel values of a corresponding column of pixels in the sixth sub-region D 61 may be obtained according to this change trend.
- the change trend of the pixel values of all pixels in the corresponding column of pixels in the sixth sub-region D 61 in the column direction is the same as a change trend of pixel values of the column of pixels in the border F 4 of the first image 510 in the column direction.
- pixel values of the y-th column of pixels in the sixth sub-region D 61 may be obtained. Pixel values of pixels from a first column to an m-th column in the sixth sub-region D 61 are obtained according to the above method, and thus the image data of the sixth sub-region D 61 can be obtained.
- pixel values of a column of pixels may form an arithmetic progression, so that a color of the sixth sub-region D 61 changes uniformly in the column direction, which may provide the user with a good viewing feeling.
- a method for obtaining the image data of the sixth sub-region D 61 is as follows.
- the sixth sub-region D 62 includes (j+z) rows and m columns of pixels, and the sixth sub-region D 62 may include a pixel xy which represents a pixel in an x-th row and a y-th column, where a value of x is within a range from j to (j+z), and a value of y is within a range from 1 to m.
- pixel values of the y-th column of pixels in the sixth sub-region D 62 may be obtained.
- Pixel values of pixels from a first column to an m-th column in the sixth sub-region D 62 are obtained according to the above method, and thus the image data of the sixth sub-region D 62 can be obtained.
- a method for obtaining the image data of the sixth sub-region D 62 is as follows.
- obtaining the image data of the seventh sub-regions based on the image data of the fifth sub-regions and/or the image data of the sixth sub-regions may include:
- the method for generating the image data of the seventh sub-region will be described below by considering the seventh sub-region D 71 as an example.
- Methods for generating image data of the seventh sub-regions D 72 , D 73 and D 74 are similar to the method for generating the image data of the seventh sub-region D 71 , and details will not be provided herein.
- the seventh sub-region D 71 may include a pixel xy which represents a pixel in an x-th row and a y-th column, where a value of x is within a range from h to (h+a), and a value of y is within a range from f to (f+i).
- the method for generating the image data of the seventh sub-region D 71 is as follows.
- a pixel value of an unknown pixel may be calculated based on pixel values of two adjacent pixels respectively in the row direction and the column direction by extrapolating from a pixel in the first row and the first column of the first image 510 filled in the base region B in the second image 520 to a pixel in an (h+a)-th row and an (f+i)-th column of the seventh sub-region D 71 .
- a pixel value of a pixel in an h-th row and an f-th column in the seventh sub-region D 71 is calculated based on a pixel value of a pixel in a first row and an f-th column in the fifth sub-region D 51 and a pixel value of a pixel in an h-th row and a first column in the sixth sub-region D 61 .
- the calculation method may be averaging pixel values.
- the method for generating the image data of the seventh sub-region D 71 is, for another example, as follows. Based on the image data of the fifth sub-region D 51 adjacent to the seventh sub-region D 71 , pixel values of a corresponding column of pixels in the seventh sub-region D 71 are obtained using a method similar to the method for generating the image data of the sixth sub-region D 61 based on the image data of the first image 510 , that is, according to a change trend of pixel values of pixels located in a column in the border of the fifth sub-region D 51 in the column direction, so as to obtain the image data of the seventh sub-region D 71 .
- the method for generating the image data of the seventh sub-region D 71 is, for yet another example, as follows. Based on the image data of the sixth sub-region D 61 adjacent to the seventh sub-region D 71 , pixel values of a corresponding row of pixels in the seventh sub-region D 71 are obtained using a method similar to the method for generating the image data of the fifth sub-region D 51 based on the image data of the first image 510 , that is, according to a change trend of pixel values of pixels located in a row in the border of the sixth sub-region D 61 in the row direction, so as to obtain the image data of the seventh sub-region D 71 .
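The first corner-filling example above, averaging the two already-known neighbours while sweeping outward from the base image (cf. R xy = (R x(y-1) + R (x-1)y )/2), can be sketched as follows (a hedged sketch; NumPy, the function name, and the strip-based interface are assumptions of mine):

```python
import numpy as np

def fill_corner(row_neighbours, col_neighbours):
    """Fill a corner (seventh) sub-region by sweeping away from the base
    image: each unknown pixel is the average of its two known neighbours,
    one in the row direction and one in the column direction.

    `row_neighbours` is the known strip of w RGB pixels bordering the corner
    in the row direction; `col_neighbours` is the strip of h RGB pixels
    bordering it in the column direction.
    """
    h, w = len(col_neighbours), len(row_neighbours)
    grid = np.zeros((h + 1, w + 1, 3), dtype=np.float64)
    grid[0, 1:] = np.asarray(row_neighbours, dtype=np.float64)  # known edge pixels
    grid[1:, 0] = np.asarray(col_neighbours, dtype=np.float64)
    for x in range(1, h + 1):
        for y in range(1, w + 1):
            grid[x, y] = (grid[x, y - 1] + grid[x - 1, y]) / 2
    return np.clip(np.rint(grid[1:, 1:]), 0, 255).astype(np.uint8)
```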
- the first image 510 includes a border F 5 that is irregular in color.
- the pixel values of all the pixels in the first image 510 may be averaged to obtain a pixel value of each pixel in the extended region.
- R xy=(R 11 +R 12 + . . . +R 1m +R 21 +R 22 + . . . +R 2m + . . . +R n1 +R n2 + . . . +R nm)/(n×m);
- G xy=(G 11 +G 12 + . . . +G 1m +G 21 +G 22 + . . . +G 2m + . . . +G n1 +G n2 + . . . +G nm)/(n×m); and
- B xy=(B 11 +B 12 + . . . +B 1m +B 21 +B 22 + . . . +B 2m + . . . +B n1 +B n2 + . . . +B nm)/(n×m).
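The averaging policy for an irregular border reduces to a per-channel mean over all n×m pixels, which can be sketched as follows (an assumed illustration using NumPy; the function name is hypothetical):

```python
import numpy as np

def irregular_fill(first_image):
    """Uniform fill color for the extended region when the border color is
    irregular: the per-channel average of all n*m pixels of the first image,
    as in R xy = (R 11 + R 12 + ... + R nm) / (n*m), and likewise for G, B.
    """
    flat = first_image.reshape(-1, first_image.shape[-1]).astype(np.float64)
    return np.rint(flat.mean(axis=0)).astype(np.uint8)  # one value per channel
```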
- steps S 201 and S 203 may be performed simultaneously; alternatively, steps S 201 and S 203 may not be performed simultaneously, and there is no sequential order therebetween.
- the image processing method includes the step S 202 , and in this case, the step S 202 may be performed before the step S 203 .
- FIG. 10 shows a block diagram of the display control apparatus 300 provided by the embodiments of the present disclosure.
- the display control apparatus 300 includes a reading module 310 , an image processing apparatus 320 and an output module 330 .
- the reading module 310 is configured to read image data of a startup picture in response to a case where the display control apparatus 300 is powered on. In some embodiments, the reading module 310 may perform the step S 101 in the display control method provided by any of the above embodiments.
- the image processing apparatus 320 is configured to perform the image processing method provided by any of the above embodiments to obtain the image data of the second image.
- the first image in the image processing method may be the startup picture read by the reading module.
- the output module 330 is configured to output the image data of the second image, so as to control a display panel to perform display according to the image data of the second image, that is, to display the second image with a high resolution.
- the output module 330 may perform the step S 103 and/or the step S 104 in the display control method described in any of the above embodiments.
- Some embodiments of the present disclosure further provide an image processing apparatus.
- this image processing apparatus may be used as the image processing apparatus 320 in the display control apparatus shown in FIG. 10 .
- the image processing apparatus 320 may include a first processing module 321 and a second processing module 322 .
- the first processing module 321 is configured to use the image data of the first image as the image data of the base region.
- the first processing module 321 may perform the step S 201 in the image processing method provided by any of the above embodiments.
- the second processing module 322 is configured to generate, based on the image data of the first image, the image data of the extended region according to the extension policies, so as to obtain the image data of the second image including the image data of the base region and the image data of the extended region.
- the second processing module 322 may perform the step S 202 and/or the step S 203 in the image processing method described in any of the above embodiments.
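The cooperation of the modules above can be sketched as follows (a hypothetical illustration only; the class name, the callable-based interface, and the placeholder behaviour are mine, not part of the disclosure):

```python
from typing import Any, Callable

class DisplayControlSketch:
    """Hypothetical composition of the reading module (step S101), the image
    processing apparatus (steps S201/S203) and the output module (S103/S104)."""

    def __init__(self,
                 read_startup: Callable[[], Any],
                 process: Callable[[Any], Any],
                 output: Callable[[Any], None]):
        self.read_startup = read_startup  # reading module 310
        self.process = process            # image processing apparatus 320
        self.output = output              # output module 330

    def on_power_on(self):
        first_image = self.read_startup()        # read the startup picture
        second_image = self.process(first_image) # extend to the second image
        self.output(second_image)                # drive the display panel
        return second_image
```

A usage example with stub callables: reading returns a placeholder picture, processing appends a marker, and output collects the result.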
- Embodiments of the image processing apparatus described in FIG. 11 and the display control apparatus described in FIG. 10 are merely exemplary.
- the division of the above modules is only a logical functional division. In actual implementation, there may be other division manners.
- a plurality of modules or components may be combined or integrated into another system, or some features may be omitted or not performed. All functional modules in the embodiments of the present disclosure may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into a single module.
- the above modules may be implemented in a form of hardware or in a form of software functional units.
- the above modules each may be implemented by a software functional module generated after at least one processor 101 in FIG. 3 B reads program codes stored in the memory 102 .
- the above modules may be implemented by different hardware in the display apparatus.
- the above functional modules may also be implemented by combining software and hardware.
- the reading module 310 and the image processing apparatus 320 in FIG. 10 may be implemented in a manner of program codes executed by the processor 101
- the output module 330 may be an output interface, e.g., a high-definition (HD) display protocol interface such as the eDP interface.
- the display control apparatus may be a chip system, including an SoC board card and an FPGA board card.
- the SoC board card is configured to store and/or load a startup picture, and includes a startup picture storage module 601 , a storage controller 602 , a sending module 603 and a processor 604 .
- the startup picture storage module 601 is configured to store the startup picture.
- the startup picture storage module 601 may, for example, be a memory, for which reference may, for example, be made to the description of the memory 102 in FIG. 3 B .
- the storage controller 602 is configured to read out the startup picture from the startup picture storage module 601 and transmit it to the sending module 603 in response to a case where the display control apparatus 300 is powered on.
- the storage controller 602 may be a direct memory access (DMA) controller.
- the sending module 603 is configured to transmit the startup picture to the FPGA chip.
- the sending module 603 includes a sending interface such as an LVDS (low-voltage differential signaling) interface, which is configured to transmit the startup picture to the FPGA chip through the LVDS protocol.
- the processor 604 is configured to control the storage controller 602 and the sending module 603 to implement their respective functions.
- for the processor 604 , reference may, for example, be made to the description of the processor 101 in FIG. 3 B .
- the FPGA board card is configured to identify the type of the startup picture and/or generate image data of a high-resolution image (e.g., the second image) based on the image data of the startup picture (e.g., the first image).
- the FPGA board card includes a receiving module 605 , a storage module 606 , a pixel sampling module 607 , a type determination module 608 , an image extension module 609 , a selector 610 and a display output module 611 .
- the pixel sampling module 607 , the type determination module 608 , the image extension module 609 and the selector 610 may be included in the FPGA chip.
- the receiving module 605 is configured to receive the image data of the startup picture sent by the SoC board.
- the receiving module 605 includes a receiving interface such as an LVDS interface, which is configured to receive the image data of the startup picture sent by the SoC board through the LVDS protocol.
- the storage module 606 is configured to buffer the received image data of the startup picture by frame, so as to achieve synchronization with a subsequent system.
- the storage module 606 may be a double data rate synchronous dynamic random-access memory (DDR SDRAM).
- the pixel sampling module 607 may be configured to perform a part of the step of identifying the type of the first image in the image processing method provided by any of the above embodiments, for example, perform the step S 202 in FIG. 7 .
- the type determination module 608 may be configured to perform the other part of the step of identifying the type of the first image in the image processing method provided by any of the above embodiments, for example, perform the step S 202 in FIG. 7 .
- the image extension module 609 may be configured to perform the step of generating the image data of the extended region according to the extension policies based on the image data of the first image in the image processing method provided by any of the above embodiments, for example, perform the step S 203 in FIG. 7 .
- the selector 610 is configured to select to output data of the startup picture or data of a normal working picture.
- the selector 610 may be configured to perform the step S 103 and/or the step S 104 in the display control method shown in FIG. 5 .
- an initial state of the selector 610 may be set to display the startup picture. That is, in response to the case where the display control apparatus is powered on, the display apparatus is in the startup initialization state, and the selector 610 selects to display the startup picture, so as to output the image data of the second image in response to a case where the display control apparatus is in the startup initialization state.
- the SoC chip may transmit a signal to the selector 610 , so that the selector 610 selects a working picture (i.e., a normal image to be displayed after the startup picture).
- the display output module 611 is configured to output the selected picture to a display screen at a rear end.
- the display output module 611 may be an output interface, for example, at least one of the eDP interface and the V-by-One interface.
- the FPGA board card may further include a front-end processing system 612 for processing the received image data of the working picture and outputting the processed image data to the selector.
- the processing may include, for example, hue adjustment, brightness adjustment, contrast adjustment, chromaticity calibration, and the like.
- the functions implemented by the display apparatus, the display control apparatus and the image processing apparatus are similar to the functions implemented by the steps in the display control method and the image processing method.
- the display apparatus, the display control apparatus and the image processing apparatus provided by the embodiments of the present disclosure may realize an effect of using the image data of a low-resolution image to enable the display apparatus to display a high-resolution image.
- the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
- the software program may be implemented in a form of a computer program product in whole or in part.
- the computer program product includes computer program instructions that, when executed by a computer (e.g. the display apparatus), cause the computer to perform the image processing method provided by any of the above embodiments or the display control method provided by any of the above embodiments.
- the computer program instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium.
- the computer program instructions may be transmitted from one website, computer, server or data center to another website, computer, server or data center via a wired (e.g., a coaxial cable, an optical fiber, a digital subscriber line (DSL)) manner or a wireless (e.g., infrared, wireless, microwave, etc.) manner.
- the embodiments of the present disclosure further provide a computer program.
- the computer program When the computer program is executed by a computer (e.g., the display apparatus), the computer program causes the computer to perform the image processing method provided by any of the above embodiments or the display control method provided by any of the above embodiments.
- the embodiments of the present disclosure further provide a non-transitory computer-readable storage medium.
- the computer-readable storage medium has stored therein computer program instructions that, when run on a computer (e.g., the display apparatus), cause the computer to perform the image processing method provided by any of the above embodiments or the display control method provided by any of the above embodiments.
- the computer-readable storage medium may be any available medium that can be accessed by a computer, or a server integrated by one or more available media, a data center integrated by one or more available media, and other data storage devices.
- the available medium may be a magnetic medium (e.g., a floppy disk, a magnetic disk or a magnetic tape), an optical medium (e.g., a digital versatile disk (DVD)), or a semiconductor medium (e.g., a solid state drive (SSD)), etc.
Description
R (f+x) =R 11 +x×(R 11 −R 21);
G (f+x) =G 11 +x×(G 11 −G 21); and
B (f+x) =B 11 +x×(B 11 −B 21).
R (g+x) =R n1 −x×(R 11 −R 21);
G (g+x) =G n1 −x×(G 11 −G 21); and
B (g+x) =B n1 −x×(B 11 −B 21).
R (f+x) =R 11 +x×(R 11 −R 12);
G (f+x) =G 11 +x×(G 11 −G 12); and
B (f+x) =B 11 +x×(B 11 −B 12).
R (g+x) =R 1m −x×(R 11 −R 12);
G (g+x) =G 1m −x×(G 11 −G 12); and
B (g+x) =B 1m −x×(B 11 −B 12).
R xy =R x1+(y−f+1)×(R x1 −R x2);
G xy =G x1+(y−f+1)×(G x1 −G x2); and
B xy =B x1+(y−f+1)×(B x1 −B x2).
R xy =R xm+(y−g+1)×(R xm −R x(m-1));
G xy =G xm+(y−g+1)×(G xm −G x(m-1)); and
B xy =B xm+(y−g+1)×(B xm −B x(m-1)).
R xy =R 1y+(x−h+1)×(R 1y −R 2y);
G xy =G 1y+(x−h+1)×(G 1y −G 2y); and
B xy =B 1y+(x−h+1)×(B 1y −B 2y).
R xy =R ny+(x−j+1)×(R ny −R (n-1)y);
G xy =G ny+(x−j+1)×(G ny −G (n-1)y); and
B xy =B ny+(x−j+1)×(B ny −B (n-1)y).
R xy=(R x(y-1) +R (x-1)y)/2;
G xy=(G x(y-1) +G (x-1)y)/2; and
B xy=(B x(y-1) +B (x-1)y)/2.
R xy=(R 11 +R 12 + . . . +R 1m +R 21 +R 22 + . . . +R 2m + . . . +R n1 +R n2 + . . . +R nm)/(n×m);
G xy=(G 11 +G 12 + . . . +G 1m +G 21 +G 22 + . . . +G 2m + . . . +G n1 +G n2 + . . . +G nm)/(n×m); and
B xy=(B 11 +B 12 + . . . +B 1m +B 21 +B 22 + . . . +B 2m + . . . +B n1 +B n2 + . . . +B nm)/(n×m).
Claims (19)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110349875.2A CN113050999B (en) | 2021-03-31 | 2021-03-31 | Image processing method and device, display control method and device, and display device |
CN202110349875.2 | 2021-03-31 | ||
PCT/CN2021/129899 WO2022205924A1 (en) | 2021-03-31 | 2021-11-10 | Image processing method and apparatus, display control method and apparatus, and display apparatus |
Publications (2)
Publication Number | Publication Date |
---|---|
US20230273760A1 US20230273760A1 (en) | 2023-08-31 |
US12008285B2 true US12008285B2 (en) | 2024-06-11 |
Family
ID=76516731
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/016,430 Active US12008285B2 (en) | 2021-03-31 | 2021-11-10 | Image processing method and display control method |
Country Status (3)
Country | Link |
---|---|
US (1) | US12008285B2 (en) |
CN (1) | CN113050999B (en) |
WO (1) | WO2022205924A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113050999B (en) | 2021-03-31 | 2024-04-23 | 京东方科技集团股份有限公司 | Image processing method and device, display control method and device, and display device |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150254802A1 (en) * | 2014-03-10 | 2015-09-10 | Sony Corporation | Method and device for simulating a wide field of view |
US20170039680A1 (en) | 2015-08-05 | 2017-02-09 | Toshiba Tec Kabushiki Kaisha | Display control device and method for displaying ui screen on display device |
CN106847228A (en) | 2017-04-25 | 2017-06-13 | 京东方科技集团股份有限公司 | The driving method of display panel, the drive device of display panel and display panel |
CN108153570A (en) * | 2017-12-22 | 2018-06-12 | 联想(北京)有限公司 | A kind of start-up picture display control method and device |
CN111459431A (en) | 2020-03-19 | 2020-07-28 | 深圳市优博讯科技股份有限公司 | Startup LOGO display method and system |
CN113050999A (en) | 2021-03-31 | 2021-06-29 | 京东方科技集团股份有限公司 | Image processing method and device, display control method and device, and display device |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100521342B1 (en) * | 1998-10-16 | 2005-12-21 | 삼성전자주식회사 | Logo image display methode of computer system |
US8842938B2 (en) * | 2012-06-27 | 2014-09-23 | Xerox Corporation | Method and system for performing resolution expansion with high quality edge enhancement |
CN111381749A (en) * | 2018-12-28 | 2020-07-07 | 广州市百果园信息技术有限公司 | Image display and processing method, device, equipment and storage medium |
CN110618803B (en) * | 2019-09-10 | 2023-07-14 | 北京金山安全软件有限公司 | Image display method and device |
CN110796997B (en) * | 2019-11-14 | 2021-12-21 | 京东方科技集团股份有限公司 | Method and device for realizing non-uniform resolution display |
2021
- 2021-03-31: CN application CN202110349875.2A, granted as CN113050999B (active)
- 2021-11-10: US application US18/016,430, granted as US12008285B2 (active)
- 2021-11-10: WO application PCT/CN2021/129899, published as WO2022205924A1 (application filing)
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150254802A1 (en) * | 2014-03-10 | 2015-09-10 | Sony Corporation | Method and device for simulating a wide field of view |
US20170039680A1 (en) | 2015-08-05 | 2017-02-09 | Toshiba Tec Kabushiki Kaisha | Display control device and method for displaying ui screen on display device |
CN106847228A (en) | 2017-04-25 | 2017-06-13 | 京东方科技集团股份有限公司 | The driving method of display panel, the drive device of display panel and display panel |
US20190073972A1 (en) * | 2017-04-25 | 2019-03-07 | Boe Technology Group Co., Ltd. | Driving method for display panel, driving device for display panel and display panel |
CN108153570A (en) * | 2017-12-22 | 2018-06-12 | 联想(北京)有限公司 | A kind of start-up picture display control method and device |
CN111459431A (en) | 2020-03-19 | 2020-07-28 | 深圳市优博讯科技股份有限公司 | Startup LOGO display method and system |
CN113050999A (en) | 2021-03-31 | 2021-06-29 | 京东方科技集团股份有限公司 | Image processing method and device, display control method and device, and display device |
Also Published As
Publication number | Publication date |
---|---|
WO2022205924A1 (en) | 2022-10-06 |
US20230273760A1 (en) | 2023-08-31 |
CN113050999B (en) | 2024-04-23 |
CN113050999A (en) | 2021-06-29 |
Similar Documents
Publication | Title |
---|---|
CN107633824B (en) | Display device and control method thereof | |
CN105955687B (en) | Image processing method, device and system | |
CN110232890B (en) | Method of performing image adaptive tone mapping and display apparatus employing the same | |
US20090135211A1 (en) | Image displaying system and method for eliminating mura defect | |
CN109935203B (en) | Display apparatus performing low gray-scale monochrome image compensation and method of operating the same | |
WO2022033110A1 (en) | Liquid crystal display and driving compensation method therefor and driving compensation apparatus thereof | |
CN113495709B (en) | Color correction method, AP chip, terminal and storage medium | |
US20140204007A1 (en) | Method and system for liquid crystal display color optimization with sub-pixel openings | |
US20210343239A1 (en) | Display driving method of integrated circuit, integrated circuit, display screen and display apparatus | |
US20160180558A1 (en) | Display apparatus and controlling method | |
JP4442438B2 (en) | Image display device, driving method thereof, and electronic apparatus | |
US12008285B2 (en) | Image processing method and display control method | |
US8259120B2 (en) | Seamless switching between graphics controllers | |
CN114495812B (en) | Display panel brightness compensation method and device, electronic equipment and readable storage medium | |
EP3012830B1 (en) | Image up-scale unit and method | |
US10714034B2 (en) | Display device, control circuit of display panel, and display method | |
JP2016031468A (en) | Display control device, display device, and display system | |
US20160232827A1 (en) | Pixel driving method and associated display device | |
US11636830B2 (en) | Driving method and apparatus of display panel | |
TWI718913B (en) | Display method | |
WO2022077859A1 (en) | Display effect enhancement method, apparatus, and device | |
CN116524871B (en) | Driving method, driving device, display device and electronic equipment | |
US10157583B2 (en) | Display apparatus and display control method thereof | |
CN114119778A (en) | Deep color mode generation method of user interface, electronic equipment and storage medium | |
CN111681555B (en) | Display method |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: BOE TECHNOLOGY GROUP CO., LTD., CHINA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: GENG, LIHUA; MA, XITONG; DUAN, RAN; AND OTHERS. REEL/FRAME: 062385/0171. Effective date: 20220802 |
FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |