CN116095261A - Display method and display device - Google Patents

Display method and display device

Info

Publication number
CN116095261A
CN116095261A (application CN202211713330.6A / CN202211713330A)
Authority
CN
China
Prior art keywords
resolution
video data
video
edid
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211713330.6A
Other languages
Chinese (zh)
Inventor
卢平光
何营昊
于新磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202211713330.6A priority Critical patent/CN116095261A/en
Publication of CN116095261A publication Critical patent/CN116095261A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0125Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level one of the standards being a high definition standard
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/64Constructional details of receivers, e.g. cabinets or dust covers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0117Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving conversion of the spatial resolution of the incoming video signal
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0127Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0135Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving interpolation processes

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Graphics (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

The application discloses a display method, a display device, and a computer-readable storage medium, relating to the technical field of display, and usable for displaying images. The method comprises the following steps: acquiring the extended display identification data (EDID) processing capability of a video source device; receiving first video data from the video source device when the EDID processing capability of the video source device indicates support for parsing the extended EDID; adjusting the resolution of the first video data from a first resolution to a second resolution to obtain second video data; adjusting the resolution of the second video data from the second resolution to a third resolution to obtain third video data, the third resolution being the resolution of a display of the display device; and displaying the third video data in a manner that scans two rows of pixels simultaneously.

Description

Display method and display device
Technical Field
The present application relates to the field of smart devices, and in particular to a display method and a display device.
Background
Currently, HDMI (high-definition multimedia interface) is widely used in display devices such as televisions. HDMI is a digital video/audio interface technology that can transmit audio and video signals together. An HDMI device (i.e., a video source device) can input high-frame-rate video data, for example 240 Hz video data, to a display device such as a television. However, since the data receiving interface (e.g., the V-by-One cable or V-by-One interface) of the screen driving board (timing controller, TCON) in most display devices can at most receive 120 Hz video data meeting the panel's own resolution requirement, 240 Hz video data exceeds the transmission bandwidth of the V-by-One cable. Therefore, in the prior art, bandwidth is typically reserved by reducing the resolution of the video data so that it can be smoothly transmitted to the TCON, which then drives the display screen. Doing so, however, greatly reduces the definition of the video compared with the video sent by the video source device. Further, since the refresh rate supported by the television's own display hardware is 120 Hz, a soft high-refresh technology (for example, hardware super resolution (HSR) or dual line gate (DLG)) is required so that the display screen can achieve a 240 Hz refresh rate and display 240 Hz video. Soft high-refresh technologies, however, reduce the number of pixels of the video data in the vertical direction, again reducing the definition of the finally rendered video.
In addition, if the video data input by the video source device is 120 Hz, a motion estimation/motion compensation (MEMC) technique is also required to double the 120 Hz video frame rate to 240 Hz in order to display at 240 Hz. However, this technique produces a large number of repeated video frames in the finally displayed video, so the result does not look smooth enough.
Disclosure of Invention
The embodiments of the present application provide a display method and a display device, which can clearly and smoothly display video data with a higher refresh rate even when the refresh rate supported by the display device is lower.
In order to achieve the above purpose, the embodiments of the present application adopt the following technical solutions:
In a first aspect, a display device is provided, which may include: a display, where the refresh rate of the display is a first refresh rate and the resolution of the display is a third resolution; a processor configured to obtain the extended display identification data (EDID) processing capability of a video source device; and a communicator configured to receive first video data from the video source device when the EDID processing capability of the video source device supports parsing the extended EDID. The extended EDID indicates that video data with a first resolution and a first frame rate can be received; the resolution of the first video data is the first resolution, the frame rate of the first video data is the first frame rate, and the first frame rate is twice the first refresh rate. The processor is further configured to adjust the resolution of the first video data from the first resolution to a second resolution to obtain second video data, where the second horizontal pixel value of the second resolution is the same as the third horizontal pixel value of the third resolution, the second vertical pixel value of the second resolution is half the third vertical pixel value of the third resolution, and the bandwidth requirement of the second video data is less than or equal to the maximum bandwidth supported by the data receiving interface of the screen driving board (TCON) of the display device. The processor is further configured to adjust the resolution of the second video data from the second resolution to the third resolution to obtain third video data, and to control the display to display the third video data in a manner that scans two rows of pixels simultaneously.
In a possible implementation manner of the first aspect, the processor is specifically configured to: if the category of the video source device is determined to be the first type device, the EDID processing capability of the video source device is determined to support analysis and expansion EDID.
In a possible implementation manner of the first aspect, the processor is specifically configured to: if the category of the video source equipment is determined not to be the first equipment, acquiring characteristic parameters of the video source equipment; and under the condition that the characteristic parameters of the video source equipment can be resolved, determining the EDID processing capability of the video source equipment according to the characteristic parameters of the video source equipment.
In a possible implementation manner of the first aspect, the processor is specifically configured to: the control communicator sends a query request to the server; the query request carries the characteristic parameters of the video source equipment, and is used for requesting the EDID processing capacity of the video source equipment; the control communicator receives the inquiry response from the server; if the query response indicates that the EDID processing capability of the video source device exists and indicates that the EDID processing capability of the video source device is in support of analysis and extension of EDID, determining that the EDID processing capability of the video source device is in support of analysis and extension of EDID; if the query response indicates that the EDID processing capability of the video source equipment does not exist or indicates that the EDID processing capability of the video source equipment is not supported for analysis and expansion, acquiring a DDC communication statistic value of a display data channel of the video source equipment; the DDC communication statistic value is used for indicating the completion degree of the extended EDID of the video source equipment reading display equipment; and if the DDC communication statistic value indicates that the completion degree of the extended EDID of the video source equipment read display equipment is completely completed, determining the EDID processing capacity of the video source equipment as supporting analysis of the extended EDID.
In a possible implementation manner of the first aspect, the processor is specifically configured to: when the characteristic parameters of the video source device cannot be resolved, acquire the display data channel (DDC) communication statistic value of the video source device, where the DDC communication statistic value indicates the degree of completion with which the video source device has read the extended EDID of the display device; and, if the DDC communication statistic value indicates that the video source device has completely read the extended EDID of the display device, determine that the EDID processing capability of the video source device supports parsing the extended EDID.
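The decision flow described in the implementations above (first-type check, then a server query on the characteristic parameters, then the DDC completion statistic as a fallback) can be sketched as follows. All names here (`supports_extended_edid`, `query_server`, `ddc_read_complete`) are illustrative stand-ins, not APIs from the patent.

```python
# Hypothetical sketch of the EDID-capability decision flow; names are illustrative.

def supports_extended_edid(device_type, feature_params, query_server, ddc_read_complete):
    """Return True if the video source device is judged able to parse the extended EDID."""
    # Step 1: first-type devices are taken to support the extended EDID outright.
    if device_type == "first_type":
        return True
    # Step 2: if the device's characteristic parameters can be resolved, ask the server.
    if feature_params is not None:
        answer = query_server(feature_params)  # True / False / None (no record)
        if answer is True:
            return True
        # No record or a negative answer: fall through to the DDC statistics check.
    # Step 3: check whether the source fully read the display's extended EDID over DDC.
    return ddc_read_complete()

# Example: an unknown device with no server record, whose DDC statistics show
# it read the display's extended EDID completely.
result = supports_extended_edid(
    device_type="unknown",
    feature_params={"vendor": "X"},   # hypothetical characteristic parameters
    query_server=lambda p: None,      # server has no record
    ddc_read_complete=lambda: True,   # DDC read fully completed
)
```

Note that the DDC statistic is only a heuristic: a source that fully read the extended EDID is assumed able to parse it.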
In a possible implementation manner of the first aspect, the processor is specifically configured to: copying each row of pixels of each video frame in the first video data so as to adjust the resolution of the first video data from the first resolution to a fourth resolution and obtain fourth video data; the fourth vertical pixel value of the fourth resolution is twice the first vertical pixel value of the first resolution; and copying each column of pixels of each video frame in the fourth video data, and de-duplicating repeated rows of pixels in each video frame in the fourth video data so as to adjust the resolution of the fourth video data from the fourth resolution to the second resolution, thereby obtaining the second video data.
In a possible implementation manner of the first aspect, the processor is specifically configured to: and copying each row of pixels in each video frame of the second video data so as to adjust the resolution of the second video data from the second resolution to the third resolution, thereby obtaining third video data.
In a possible implementation manner of the first aspect, the first refresh rate is 120 Hz, the first resolution is 1920×1080 progressive scan (1080p), the first frame rate is 240 Hz, and the third resolution is 3840×2160p.
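Using the concrete values of this implementation (first resolution 1920×1080, fourth 1920×2160, second 3840×1080, third 3840×2160), the row/column duplication steps of the preceding implementations can be sketched on a tiny stand-in frame. The helper names are illustrative; a real device would apply the same transformations to full frames in hardware.

```python
# Sketch of the resolution adjustments on a tiny frame (a list of pixel rows).

def double_rows(frame):
    """Copy each row once: vertical resolution doubles (first -> fourth, second -> third)."""
    return [row[:] for row in frame for _ in range(2)]

def double_cols_dedup_rows(frame):
    """Copy each pixel horizontally, then drop the duplicated rows (fourth -> second)."""
    widened = [[px for px in row for _ in range(2)] for row in frame]
    return widened[::2]  # keep one row of each duplicated pair

# Stand-in "first resolution" frame: 3 pixels wide, 2 rows tall
# (playing the role of 1920x1080).
first = [[1, 2, 3],
         [4, 5, 6]]

fourth = double_rows(first)              # 3 wide x 4 tall  (cf. 1920x2160)
second = double_cols_dedup_rows(fourth)  # 6 wide x 2 tall  (cf. 3840x1080)
third = double_rows(second)              # 6 wide x 4 tall  (cf. 3840x2160)
```

The second frame carries half the pixels of the third resolution, which is what lets it pass through the TCON interface at double the frame rate; the final `double_rows` restores the panel's native resolution for dual-line scanning.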
In a second aspect, a display method is provided for use in a display device. The method may include: acquiring the extended display identification data (EDID) processing capability of a video source device; receiving first video data from the video source device when the EDID processing capability of the video source device supports parsing the extended EDID, where the extended EDID indicates that video data with a first resolution and a first frame rate can be received, the resolution of the first video data is the first resolution, the frame rate of the first video data is the first frame rate, and the first frame rate is twice the first refresh rate; adjusting the resolution of the first video data from the first resolution to a second resolution to obtain second video data, where the second horizontal pixel value of the second resolution is the same as the third horizontal pixel value of a third resolution, the second vertical pixel value of the second resolution is half the third vertical pixel value of the third resolution, the third resolution is the resolution of a display of the display device, and the bandwidth requirement of the second video data is less than or equal to the maximum bandwidth supported by the data receiving interface of the screen driving board (TCON) of the display device; adjusting the resolution of the second video data from the second resolution to the third resolution to obtain third video data; and displaying the third video data in a manner that scans two rows of pixels simultaneously.
In a third aspect, a display device is provided, which has the function of implementing the method described in the second aspect. The function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the function described above.
In a fourth aspect, a display device is provided, including an acquisition module, a processing module, and a display module. The acquisition module is configured to acquire the extended display identification data (EDID) processing capability of a video source device, and is further configured to receive first video data from the video source device when the EDID processing capability of the video source device supports parsing the extended EDID; the extended EDID indicates that video data with a first resolution and a first frame rate can be received, the resolution of the first video data is the first resolution, the frame rate of the first video data is the first frame rate, and the first frame rate is twice the first refresh rate. The processing module is configured to adjust the resolution of the first video data received by the acquisition module from the first resolution to a second resolution to obtain second video data; the second horizontal pixel value of the second resolution is the same as the third horizontal pixel value of a third resolution, the second vertical pixel value of the second resolution is half the third vertical pixel value of the third resolution, the third resolution is the resolution of a display of the display device, and the bandwidth requirement of the second video data is less than or equal to the maximum bandwidth supported by the data receiving interface of the screen driving board (TCON) of the display device. The processing module is further configured to copy each row of pixels in each video frame of the second video data so that the resolution of the second video data is adjusted from the second resolution to the third resolution to obtain third video data. The display module is configured to display the third video data obtained by the processing module in a manner that scans two rows of pixels simultaneously.
In a fifth aspect, there is provided a display device including a processor and a memory; the memory is configured to store computer-executable instructions that, when executed by the processor, cause the display device to perform the display method of any one of the second aspects.
In a sixth aspect, there is provided a computer readable storage medium having instructions stored therein which, when run on a computer, cause the computer to perform the display method of any one of the second aspects above.
In a seventh aspect, there is provided a computer program product comprising instructions which, when run on a display device, cause the display device to perform the display method of any of the second aspects above.
In an eighth aspect, there is provided an apparatus (e.g., the apparatus may be a chip system) comprising a processor configured to support a display device in implementing the functions referred to in the second aspect above. In one possible design, the apparatus further includes a memory for storing the program instructions and data necessary for the display device. When the apparatus is a chip system, it may consist of a chip alone, or may include the chip together with other discrete devices.
According to the technical solution provided by the embodiments of the present application, when the display screen of the display device supports the third resolution and the first refresh rate, the highest bandwidth supported by the data receiving interface of the screen driving board can be regarded as the bandwidth corresponding to video data with the third resolution and a frame rate equal to the first refresh rate. Based on this, in order for the display screen of the display device to display video data of a higher frame rate (for example, twice the first refresh rate) smoothly and without losing pixel values, the total pixel count of the resolution of the video data received by the screen driving board (i.e., the second resolution) must be half the total pixel count of the third resolution, while the frame rate of that video data is twice the first refresh rate. In addition, when a display screen with the first refresh rate displays video data with the first frame rate, a soft high-refresh technology (such as DLG or HSR) is adopted; this technology mainly applies special processing to the scanning of the rows of pixels in each video frame (two rows are scanned simultaneously) to double the scanning speed, so that a display screen with the first refresh rate can display video data with the first frame rate. Therefore, in the embodiments of the present application, the second horizontal pixel value of the second resolution may be the same as the third horizontal pixel value of the third resolution, while the second vertical pixel value of the second resolution is half the third vertical pixel value of the third resolution.
Based on the foregoing, in the technical solution of the embodiments of the present application, in order for the screen driving board to receive video data with the second resolution and the first frame rate, the frame rate of the first video data acquired by the display device from the video source device must be the first frame rate. Meanwhile, due to the practical specification limits on video resolutions, the resolution of the first video data provided to the display device by the video source device (i.e., the first resolution) differs from the second resolution. Therefore, after the display device obtains the first video data, it needs to adjust the resolution of the first video data to obtain the second video data and transmit it to the screen driving board, so that the screen driving board controls the display screen to display according to the second video data.
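The bandwidth equivalence argued above can be checked with the concrete values of the embodiments (1080p/240 Hz source, 3840×2160/120 Hz panel): the second resolution (3840×1080) at the first frame rate (240 Hz) needs exactly the pixel throughput of the third resolution (3840×2160) at the first refresh rate (120 Hz).

```python
# Pixel-throughput check for the concrete values in the embodiments.
third_w, third_h, first_refresh = 3840, 2160, 120
second_w, second_h, first_frame_rate = 3840, 1080, 240

tcon_max_pixels_per_s = third_w * third_h * first_refresh     # TCON interface ceiling
second_pixels_per_s = second_w * second_h * first_frame_rate  # adjusted video stream

equal = second_pixels_per_s == tcon_max_pixels_per_s  # halved pixels x doubled rate
```

Halving the vertical pixel count while doubling the frame rate leaves the pixel rate unchanged, which is why the second video data fits the TCON's data receiving interface.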
Further, in practice, when the video source device transmits the first video data to the display device, it determines what parameters (frame rate and resolution) of video data to transmit by reading and parsing the EDID in the display device. However, an existing EDID can at most define a lower frame rate (e.g., the first refresh rate, 120 Hz). Therefore, in order for the video source device to transmit video data at the first frame rate to the display device, an extended EDID must be configured in the display device in advance; the extended EDID can then indicate that video data with the first resolution and the first frame rate can be received. After that, provided the video source device can parse the extended EDID, the display device can receive the first video data and execute the subsequent display procedure.
In summary, with the technical solution provided in the embodiments of the present application, when the first video data with the first frame rate is finally displayed, it can be displayed in a dual-line simultaneous-scanning manner without losing any pixel. Because no pixels of the first video data are discarded at any point in the whole display flow, the final display frame rate is guaranteed, the effect of smoothly and clearly displaying video data with the first frame rate on a display screen with a lower refresh rate (i.e., the first refresh rate) is achieved, and the user experience is improved.
Drawings
Fig. 1 is a schematic scanning diagram of a DLG technology according to an embodiment of the present application;
fig. 2 is a schematic scanning diagram of an HSR technology according to an embodiment of the present application;
fig. 3 is a schematic diagram of different frame rate effects according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of a display method provided in the prior art;
fig. 5 is a schematic diagram of OSD image and video data fusion according to an embodiment of the present application;
fig. 6 is a schematic diagram of improving video resolution by using HSR technology according to an embodiment of the present application;
fig. 7 is an example schematic diagram of a display method according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a display system according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a control device according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a display device according to an embodiment of the present application;
fig. 11 is a schematic software architecture diagram of a display device according to an embodiment of the present application;
fig. 12 is a schematic flow chart of a display method according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of EDID and extended EDID provided in an embodiment of the present application;
fig. 14 is a second schematic flowchart of a display method according to an embodiment of the present application;
fig. 15 is a third schematic flowchart of a display method according to an embodiment of the present application;
fig. 16 is a fourth schematic flowchart of a display method according to an embodiment of the present application;
fig. 17 is a fifth schematic flowchart of a display method according to an embodiment of the present application;
fig. 18 is a sixth schematic flowchart of a display method according to an embodiment of the present application;
fig. 19 is a seventh schematic flowchart of a display method according to an embodiment of the present application;
fig. 20 is an example schematic diagram of another display method according to an embodiment of the present application;
fig. 21 is a schematic structural diagram of third video data according to an embodiment of the present application;
fig. 22 is an eighth schematic flowchart of a display method according to an embodiment of the present application;
fig. 23 is a ninth schematic flowchart of a display method according to an embodiment of the present application;
fig. 24 is a tenth schematic flowchart of a display method according to an embodiment of the present application;
fig. 25 is a schematic structural diagram of another display device according to an embodiment of the present application;
fig. 26 is a schematic structural diagram of still another display device according to an embodiment of the present application.
Detailed Description
For clarity and ease of implementation of the present application, the following description sets out exemplary implementations of the present application with reference to the accompanying drawings in which those implementations are illustrated. It is apparent that the described exemplary implementations are only some, and not all, of the examples of the present application.
It should be noted that the brief description of the terms in the present application is only for convenience in understanding the embodiments described below, and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
The terms "first," "second," "third," and the like in the description, in the claims, and in the above-described figures are used for distinguishing between similar objects or entities and are not necessarily intended to limit a particular order or sequence, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements explicitly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "and/or" in this application merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may indicate: A exists alone, A and B exist together, or B exists alone. In addition, the character "/" in this application generally indicates that the associated objects before and after it are in an "or" relationship.
Based on the exemplary embodiments described herein, all other embodiments that may be obtained by one of ordinary skill in the art without inventive effort are within the scope of the claims appended hereto. Furthermore, while the disclosure is presented in the context of an exemplary embodiment or embodiments, it should be appreciated that the various aspects of the disclosure may separately constitute a complete embodiment.
First, terms related to the present application are explained as follows:
Frame rate: the frequency (rate) at which bitmap images, in units called frames, appear continuously on the display screen. The frame rate may also be referred to as the frame frequency and is expressed in hertz (Hz).
Refresh rate: the rate at which the screen is refreshed. The refresh rate generally referred to is the vertical refresh rate, which indicates how many times per second the image on the screen is redrawn, i.e., the number of screen refreshes per second, in hertz (Hz). The higher the refresh rate, the more stable the image, the more natural and clear the display, and the less the strain on the eyes. The lower the refresh rate, the more the image flickers and judders, and the faster the eyes tire.
HDMI: high-definition multimedia interface. HDMI is a fully digital video and audio transmission interface capable of transmitting uncompressed audio and video signals. It can be used in set-top boxes, DVD players, personal computers, televisions, game consoles, AV amplifiers, digital audio equipment, and other devices. Because HDMI sends audio and video signals simultaneously over the same cable, it greatly simplifies the installation of system wiring.
In the embodiments of the present application, video data is transmitted between the video source device and the display device through HDMI. In order for the display device to receive a 1920×1080@240Hz HDMI signal (an HDMI signal carrying video data with a resolution of 1920×1080p and a frame rate of 240 Hz), the bandwidth required of the HDMI link is 18 Gb/s according to the HDMI communication protocol. Based on this, the HDMI interface specification mentioned in the present application is at least version 2.0.
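The 18 Gb/s figure can be sanity-checked with a back-of-envelope calculation. The blanking figures below (2200×1125 total timing, borrowed from the standard 1080p CEA timing) and 8-bit RGB with TMDS 8b/10b overhead are assumptions for illustration, not values stated in the patent.

```python
# Back-of-envelope check that 1920x1080@240 Hz needs roughly HDMI 2.0's 18 Gb/s.
h_total, v_total = 2200, 1125   # active 1920x1080 plus assumed blanking
frame_rate = 240                # Hz
bits_per_pixel = 24             # 8-bit RGB, an assumption
tmds_overhead = 10 / 8          # TMDS 8b/10b encoding

pixel_clock = h_total * v_total * frame_rate  # pixels per second
link_rate_gbps = pixel_clock * bits_per_pixel * tmds_overhead / 1e9
# ~17.8 Gb/s: fits within HDMI 2.0's 18 Gb/s, but exceeds HDMI 1.4's 10.2 Gb/s,
# which is why the interface must be at least version 2.0.
```

This also explains the TCON-side problem described in the background: the same pixel rate must then be carried onward over the V-by-One link to the screen driving board.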
EDID: extended display identification data. EDID is a standard for display identification data established by VESA (Video Electronics Standards Association) when it defined the DDC (display data channel) communication protocol. The EDID is stored in the DDC memory of the display. When a host device (i.e., the device that provides display data to the display; if the display data is video data, the host device is a video source device) is connected to the display, the host device reads the EDID stored in the DDC memory through the DDC channel to determine what parameters of display data to transmit to the display.
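Per the VESA EDID standard, the data a host reads over DDC comes in 128-byte blocks, each of which must sum to 0 modulo 256, and the base block begins with a fixed 8-byte header. A minimal integrity check a host device could apply after the DDC read can be sketched as follows (the function name is illustrative):

```python
# Minimal EDID block integrity check per the VESA EDID standard:
# every 128-byte block sums to 0 mod 256; the base block has a fixed header.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_block_valid(block: bytes, is_base_block: bool = False) -> bool:
    if len(block) != 128:
        return False
    if is_base_block and block[:8] != EDID_HEADER:
        return False
    return sum(block) % 256 == 0  # last byte is the checksum

# Toy base block: header + zero padding, with the final byte chosen as checksum.
body = EDID_HEADER + bytes(119)
block = body + bytes([(256 - sum(body) % 256) % 256])
```

The same checksum rule applies to extension blocks, which is where an extended EDID of the kind used in this application would advertise the 240 Hz timing.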
DDC: display Data Channel, the data channels are displayed. DDC is a data channel used by a host device to access display memory to obtain EDID formatted data in EEPROM (electrically erasable programmable read only memory, charged erasable programmable read-only memory) in the display to determine display attribute (e.g., resolution, refresh rate, aspect ratio, etc.) information of the display.
TCON: i.e., the logic board, also called the screen driving board, central control board, or TCON board. The TCON processes the video data sent by the image processing unit of the display device, converts it into an electrical signal capable of driving the display screen, and sends that signal directly to the display screen for display. The TCON receives video data from the image processing unit through a data receiving interface (e.g., a v-by-one interface) whose bandwidth varies with its specification. In general, the bandwidth of the data receiving interface supports at most the bandwidth corresponding to the highest resolution and refresh rate supported by the display screen of the display device to which it belongs. In the embodiment of the present application, in order to enable a display screen with a low refresh rate to display video data with a high frame rate, the TCON needs to support a soft high-refresh technology such as DLG or HSR, i.e., to achieve high-frame-rate video display by adjusting the gate scanning mode (that is, adjusting the number of line pixels scanned per video frame).
DLG: short for dual line gate, i.e., two rows of gate lines driven together. The DLG technique halves the rendering precision of the vertical (column) pixels: only the odd lines (or only the even lines), e.g., lines 1, 3, 5, 7, and 9, are rendered, and the data of each odd line is copied to the even line below it (2, 4, 6, 8, and 10) for display. The number of lines that must be scanned independently is thereby halved, so the scanning time per frame is halved compared with the original, and the refresh rate of the display screen can be doubled.
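A minimal sketch of this line-duplication idea, treating a frame as a list of pixel rows (toy data, not an implementation of any particular TCON):

```python
# DLG sketch: render only the odd lines (1, 3, 5, ... in 1-based terms)
# and copy each onto the even line below it.
def dlg_frame(rendered_odd_rows):
    frame = []
    for row in rendered_odd_rows:
        frame.append(row)        # odd line: real rendered content
        frame.append(list(row))  # even line: duplicate of the line above
    return frame

half = [[10, 20], [30, 40]]      # two rendered lines
print(dlg_frame(half))           # [[10, 20], [10, 20], [30, 40], [30, 40]]
```

Only half the lines carry real content, which is exactly the loss of picture detail the text below attributes to DLG.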
For example, in the DLG technique, after the odd-line data is copied, as shown in fig. 1, the TCON controls every two adjacent rows of gates in the control circuit of the display screen, for example G1 and G2, to be scanned simultaneously. The content scanned by G1 and G2 is identical. The principle of the DLG technique is thus similar to that of interlaced scanning, except that two identical lines are scanned at a time, which saves bandwidth while avoiding the coarse appearance caused by interlacing. However, the repeatedly scanned lines should, strictly speaking, carry other picture information: in improving the refresh rate through DLG, the display device essentially discards part of the picture content, so the actual appearance of the displayed picture is considerably degraded.
HSR: hardware super resolution. The basic principle of the HSR technique is similar to that of DLG: the vertical pixels are compressed and only the odd lines (or even lines) are rendered, but each even line is obtained by fusing the information of its two adjacent lines for display.
In the HSR technique, after the odd-line data is generated, the TCON likewise controls every two adjacent rows of gates in the control circuit of the display screen, for example G1 and G2, to be scanned simultaneously, as shown in fig. 2. However, with HSR, the line pixels scanned by G2 are actually obtained by combining G1 and G3. For example, the pixel value of each pixel in G2 is a weighted average of the pixel value of the corresponding pixel (i.e., the pixel in the same column) in G1 and the pixel value of the corresponding pixel in G3. The weights assigned to G1 and G3 may be the same or different, as determined by actual requirements. Deriving the G2 line pixels from the G1 and G3 line pixels in this way may be referred to as interpolation processing.
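The weighted-average interpolation can be sketched as follows. Equal weights are assumed here (the text notes the weights may differ), and simply repeating the last line, which has no odd line below it, is a simplification of this sketch:

```python
# HSR sketch: even lines are a weighted average of the two neighbouring
# odd lines; the final even line just repeats its predecessor.
def hsr_frame(odd_rows, w=0.5):
    frame = []
    for i, row in enumerate(odd_rows):
        frame.append(row)                        # odd line: rendered content
        if i + 1 < len(odd_rows):
            nxt = odd_rows[i + 1]
            frame.append([w * a + (1 - w) * b    # even line: interpolated
                          for a, b in zip(row, nxt)])
        else:
            frame.append(list(row))              # boundary: no line below
    return frame

print(hsr_frame([[0, 100], [100, 0]]))
# [[0, 100], [50.0, 50.0], [100, 0], [100, 0]]
```

Compared with the plain duplication of DLG, the interpolated even lines smooth the transitions between neighbouring odd lines.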
On this basis, compared with DLG, a picture processed with HSR has its even-line pixels obtained by fusion, so the color and line transitions of the whole picture are more natural and the definition is correspondingly improved. Nevertheless, compared with the video data actually input by the video source device, the picture is still distorted, resulting in reduced video definition.
Based on this, an HDMI device (i.e., a video source device) can input high-frame-rate video data, for example 240Hz high-refresh-rate video data, to a display device such as a television.
By way of example, referring to fig. 3, within a fixed period of time a 240Hz video may display 9 frames, a 120Hz video may display 5 frames, and a 60Hz video may display only 3 frames. The higher the frame rate, the more detail the video can present when displayed normally, and the smoother the moving picture. Users therefore increasingly expect display devices to display video at high frame rates (e.g., 240Hz). However, since the data receiving interface (e.g., the v-by-one cable or v-by-one interface) of the screen driving board (timing controller, TCON) in most display devices can receive, at its native resolution, video data of at most 120Hz, 240Hz video data exceeds the transmission bandwidth of the v-by-one cable. Therefore, in the prior art, bandwidth is typically reserved by reducing the resolution of the video data so that it can be transmitted smoothly to the TCON, which then controls the display screen to display it.
For example, suppose the resolution supported by the display device is 3840×2160p, the refresh rate supported by the display device is 120Hz, and the parameters of the video data input by the video source device through the HDMI interface are 3840×2160@120Hz (i.e., a resolution of 3840×2160p and a frame rate of 120Hz). Referring to fig. 4, the prior-art flow by which a display device processes and displays the video data sent by a video source device includes S1-S6:
S1, the display device receives video data from the video source device.
Specifically, the display device may receive the video data from the video source device through its HDMI receiving device (referred to herein as the HDMI receiver). Since the parameters of the video data indicated in the EDID predefined by the display device are the same as the parameters of the video supported by the display of the display device, the video source device inputs video data with the parameters the display of the display device can display, that is, video data with the parameters 3840×2160@120Hz.
S2, the HDMI receiving device of the display device sends the video data to a prescaler, and the prescaler reduces the resolution of the video data to 3840×1080p.
Specifically, the prescaler may be a standalone device in the display device or may be integrated into the SoC (system on chip) of the display device. The bandwidth that the data receiving interface of the screen driving board (TCON) of a typical display can receive is tied to the video parameters supported by the display, i.e., here it supports the bandwidth corresponding to 3840×2160@120Hz. But since the display needs to display video data at 240Hz, the data receiving interface of the TCON ultimately needs to receive video data with the parameters 3840×1080@240Hz. Based on this, the prescaler performs a line-halving operation on each video frame in the video data to obtain 3840×1080@120Hz video data, which facilitates obtaining 3840×1080@240Hz video data after the subsequent frequency doubling. The line-halving operation may delete the odd lines of pixels or delete the even lines of pixels. A line of pixels refers to a row of pixels in the horizontal direction of a video frame.
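The line-halving step can be illustrated as follows (this toy version drops the even lines, in 1-based terms; as noted above, either parity may be deleted):

```python
# Prescaler sketch: keep every other row so a 2160-line frame becomes a
# 1080-line frame. frame[::2] keeps rows 0, 2, 4, ... (the odd lines,
# counting from 1), i.e. the even lines are deleted.
def halve_rows(frame):
    return frame[::2]

frame_2160 = [[r] * 4 for r in range(2160)]   # toy 2160-row frame
print(len(halve_rows(frame_2160)))            # 1080
```

After this step the per-frame data volume is halved, which is what frees enough v-by-one bandwidth for the later frequency doubling to 240Hz.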
S3, the display device generates an OSD picture.
Specifically, the OSD picture may be generated in sequence by a UI component, a GPU (graphics processing unit), and a frame buffer module of the display device. The GPU may specifically be a Mali GPU.
An OSD (on-screen display) picture is a picture presented by the on-screen menu adjustment mode. OSD is applied on a display to generate special fonts or graphics on the display screen so that the user obtains certain information. When a user operates the television to change channels or adjust the volume, image quality, and so on, the television screen displays the current state so that the user is informed. To implement OSD, it is necessary to add or change the colors of certain pixels in the image in real time, in synchronization with the image displayed on the display, so that they combine into data a human can recognize within the image.
In the prior art, for convenience of fusion, the parameters of the OSD picture generated by the display device are kept consistent with the parameters of the reduced-resolution video data, i.e., 3840×1080@120Hz.
Of course, in practice the frame rate of the OSD picture and the frame rate of the video data may differ, in which case they are fused in proportion. For example, taking video data at 240Hz and an OSD picture at 120Hz or 60Hz as an example, the fusion ratio may be as shown in fig. 5.
If the OSD picture is 120Hz, then when it is fused with the video data, each OSD frame is fused with two video frames. If the OSD picture is 60Hz, each OSD frame is fused with four video frames. Of course, other fusion methods may exist in practice, depending on the two frame rates.
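The proportional fusion just described can be sketched as a simple index mapping, assuming the OSD rate divides the video rate evenly (the helper name is illustrative, not a term from the patent):

```python
# Fusion-ratio sketch: each OSD frame is reused for
# (video_rate // osd_rate) consecutive video frames.
def osd_index_for(video_frame_idx, video_rate, osd_rate):
    reuse = video_rate // osd_rate      # assumes osd_rate divides video_rate
    return video_frame_idx // reuse

# 240 Hz video, 120 Hz OSD: video frames 0,1 share OSD frame 0; 2,3 share 1.
print([osd_index_for(i, 240, 120) for i in range(6)])   # [0, 0, 1, 1, 2, 2]
# 240 Hz video, 60 Hz OSD: four video frames per OSD frame.
print([osd_index_for(i, 240, 60) for i in range(8)])    # [0, 0, 0, 0, 1, 1, 1, 1]
```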
S4, the display device fuses the OSD picture with the video data input by the video source device, and doubles the frame rate of the fused video data using the MEMC (motion estimation/motion compensation) technique to obtain a pending video.
The parameters of the pending video may be 3840×1080@240Hz. MEMC frequency doubling improves the fluency of the fused video data only to a limited extent: the added video frames are merely copied frames, so the overall improvement in fluency is not ideal.
Specifically, step S4 may be performed by a send-display unit in the display device.
S5, the display device generates a target video from the pending video, and, using a soft high-refresh technology, scans and displays two rows of pixels at a time for each video frame in the target video.
Since the resolution supported by the display device is 3840×2160p, the parameters of the target video may be 3840×2160@240Hz.
Specifically, the target video may be obtained by treating each row of pixels in a video frame of the pending video as an odd row (or an even row) using the HSR technique (a soft high-refresh technology) and then applying the corresponding interpolation processing. Illustratively, referring to fig. 6, the even-row pixels in each video frame of the target video are derived from the row pixels of the two adjacent odd rows. Specifically, the pixel value of each pixel in an even row is the weighted average of the pixel values of the corresponding pixels in the two adjacent odd rows. The odd-row pixels in each video frame of the target video are the corresponding row pixels in the video frame of the pending video. That is, each video frame of the target video includes all rows of pixels of the corresponding video frame of the pending video.
Based on the foregoing steps, the video data displayed by the display screen is nominally 3840×2160@240Hz. However, compared with the video data input to the display device by the video source device, only 3840×1080@240Hz (also written 4K1K@240Hz) is effectively displayed and half of the pixels are missing or wrong, so the finally displayed video is distorted and its definition is not good enough. Meanwhile, since half of the video frames in the finally presented video data are obtained by frequency doubling, the fluency is also insufficient.
In summary, in the prior art, since the data receiving interface (e.g., the v-by-one cable or v-by-one interface) of the screen driving board (timing controller, TCON) in most display devices can receive, at its native resolution, video data of at most 120Hz, 240Hz video data exceeds the transmission bandwidth of the v-by-one cable. Therefore, bandwidth is typically reserved by reducing the resolution of the video data so that it can be transmitted smoothly to the TCON, which then controls the display screen to display it. But doing so greatly reduces the definition of the video compared with the video sent by the video source device. Further, since the refresh rate supported by the television's display screen hardware is 120Hz, a soft high-refresh technology (for example, hardware super resolution (HSR) or dual line gate (DLG)) is required so that the display screen can achieve a 240Hz refresh rate and display 240Hz video. However, soft high-refresh technologies reduce the number of pixels of the video data in the vertical direction, lowering the definition of the finally presented video. In addition, if the video data input by the video source device is 120Hz, a motion estimation/motion compensation (MEMC) technique is also required to double the 120Hz video to 240Hz for display. This technique, however, leaves a large number of repeated video frames in the finally displayed video, so the perceived fluency is not good enough.
In view of the above problems, the present application provides a display method applied to a display device. In this technical scheme, it is considered that the highest bandwidth supported by the data receiving interface of the screen driving board of the display is the bandwidth corresponding to video data whose resolution is the third resolution and whose frame rate is the first refresh rate. Based on this, in order for the display screen of the display device to smoothly display video data of a higher frame rate (for example, a frame rate twice the first refresh rate) without losing pixel values, the total number of pixels corresponding to the resolution of the video data received by the screen driving board (i.e., the second resolution) must be half the total number of pixels of the third resolution, and the frame rate of the video data received by the screen driving board must be twice the first refresh rate. In addition, when a display screen with the first refresh rate displays video data with the first frame rate, a soft high-refresh technology (such as DLG or HSR) is adopted; this technology performs special processing on the scanning of the rows of pixels in each video frame (two rows are scanned simultaneously) so as to double the scanning speed, enabling a display screen with the first refresh rate to display video data with the first frame rate. In this embodiment, the second horizontal pixel count of the second resolution may be the same as the third horizontal pixel count of the third resolution, and the second vertical pixel count of the second resolution may be half of the third vertical pixel count of the third resolution.
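A quick check of the bandwidth argument above: halving the vertical pixel count while doubling the frame rate leaves the pixel throughput at the screen driving board's interface unchanged. The concrete numbers (3840×2160@120Hz versus 3840×1080@240Hz) are the example used later in this description.

```python
# Pixel throughput at the TCON's data receiving interface.
def throughput(width, height, fps):
    return width * height * fps      # pixels per second

third = throughput(3840, 2160, 120)   # third resolution at the first refresh rate
second = throughput(3840, 1080, 240)  # second resolution at twice that rate
assert second == third                # same interface bandwidth
print(second)                         # -> 995328000 pixels/s for both
```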
In addition, in the technical solution provided in the present application, in order for the screen driving board to receive video data whose resolution is the second resolution and whose frame rate is the first frame rate, the frame rate of the first video data acquired by the display device from the video source device must be the first frame rate. Meanwhile, owing to the specification limits on video resolution in practice, the resolution of the first video data provided to the display device by the video source device (i.e., the first resolution) differs from the second resolution. Based on this, when the display device obtains the first video data, it needs to adjust the resolution of the first video data to obtain second video data, which is transmitted to the screen driving board so that the screen driving board controls the display screen to display according to the second video data.
Further, in practice, when the video source device transmits the first video data to the display device, it determines what parameters (frame rate and resolution) of video data to transmit by parsing the EDID read from the display device. However, an existing EDID can at most define parameters at a lower frame rate (e.g., the third resolution at 120Hz). Therefore, in order for the video source device to transmit video data at the first frame rate to the display device, an extended EDID must be configured in the display device in advance; this extended EDID can then indicate that video data of the first resolution and the first frame rate can be received. Thereafter, provided the video source device can parse the extended EDID, the display device can receive the first video data and carry out the subsequent display procedure.
For example, referring to fig. 7, taking a display device whose supported resolution is 3840×2160p and whose supported refresh rate is 120Hz as an example, in contrast to the corresponding prior art shown in fig. 4, the display device may declare in advance, in the extended EDID, that it can receive 1920×1080@240Hz video data. The video source device may then send 1920×1080@240Hz first video data to the HDMI receiver of the display device. Then, based on the resolution requirement of the display screen (3840×2160p) and the bandwidth limit of the TCON (at most the bandwidth corresponding to 3840×1080@240Hz), the image processing unit in the display device may increase the resolution of the first video data to 3840×1080p, obtaining second video data with the parameters 3840×1080@240Hz. Then, the display device (specifically, a display unit of the display device) may fuse the OSD picture with the second video data to obtain fused second video data. The OSD picture is generated in the same way as in the prior art shown in fig. 4.
Then, the display device (specifically, a send-display unit of the display device) may transmit the fused second video data to the TCON. Based on the resolution requirement of the display screen, the TCON adjusts the resolution of the fused second video data to 3840×2160p, obtaining third video data with the parameters 3840×2160@240Hz. Finally, using part of the functionality of the soft high-refresh technology (HSR or DLG), the two rows of pixels of each video frame can be scanned simultaneously and the display screen controlled to display.
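The resolution and frame-rate hand-offs of the flow of fig. 7 can be traced with a small sketch. The stage names are illustrative labels, not terms from the patent:

```python
# Hedged trace of the parameters at each stage of the proposed pipeline.
def pipeline(src=(1920, 1080, 240)):
    w, h, fps = src                               # first video data, per extended EDID
    stages = [("HDMI in", (w, h, fps))]
    w2 = w * 2                                    # image processing unit doubles width
    stages.append(("to TCON", (w2, h, fps)))      # 3840x1080@240, within v-by-one budget
    h2 = h * 2                                    # TCON interpolates lines for the panel
    stages.append(("on panel", (w2, h2, fps)))    # 3840x2160@240 via dual-line scan
    return stages

for name, (w, h, fps) in pipeline():
    print(f"{name:8s}: {w}x{h}@{fps}Hz")
```

Note that, unlike the prior-art flow of fig. 4, no stage in this trace discards pixels of the incoming video; the vertical pixel count only ever grows.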
It can be seen that, in the technical solution provided in the present application, when the first video data at the first frame rate is finally displayed, no pixels of the first video data are lost, and display is performed with dual-line simultaneous scanning. Because no pixels of the first video data are discarded at any point in the whole display flow, and the final display frame rate is guaranteed, video data at the first frame rate is displayed smoothly and clearly on a display screen with a lower refresh rate (i.e., the first refresh rate), improving the user experience.
The display method provided in the embodiments of the present application is described in detail below with reference to the accompanying drawings.
Fig. 8 is a schematic diagram showing the composition of a display system to which a display method according to an exemplary embodiment is applied. Referring to fig. 8, the display system includes a display device 01 and a video source device 02.
The user can control the display device 01 through the mobile terminal 100 and the control apparatus 200. The control apparatus 200 may be a remote control, and communication between the remote control and the display device 01 includes infrared protocol communication, Bluetooth protocol communication, and other wireless or wired modes of controlling the display device 01. The user can control the display device 01 by inputting user instructions through keys on the remote control, voice input, control panel input, and so on. In addition, the display device 01 may also directly receive the user's voice input or voice instructions through a built-in module for acquiring voice instructions (e.g., a MIC). In some embodiments, tablet computers, notebook computers, and other smart devices may also be used to control the display device 01.
In some embodiments, the mobile terminal 100 and the display device 01 may have the same or matched software applications installed thereon, so as to implement connection communication through a network protocol, thereby implementing the purpose of one-to-one control operation and data communication. In this case, the audio/video contents displayed on the mobile terminal 100 may also be transmitted to the display device 01, so as to realize a synchronous display function.
The display device 01 and the server 02 can exchange data by wired or wireless communication means. The server 02 may provide various contents and interactions to the display device 01. For example, the server 02 may store a preset voice recognition model and preset character error correction rules required in the display method provided in the embodiment of the present application, so that the server 02 can provide voice recognition capability to the display device 01. Alternatively, the server 02 may cooperate with the display device 01 to implement a voice recognition scheme.
For example, in the embodiment of the present application, the display device may have various implementation forms, and may be, for example, a television, a smart television, a laser projection device, a display (monitor), an electronic whiteboard (electronic bulletin board), an electronic desktop (electronic table), or a display device capable of performing voice input. The embodiment of the present application does not limit the specific form of the display device herein. In the embodiment of the application, a display device is taken as a television set as an example for schematic description.
For example, in the embodiment of the present application, the video source device 02 may be a device capable of providing video data and equipped with an HDMI interface, such as a set top box or a PC (personal computer). In the embodiment of the present application, the video source device 02 may determine the specific parameters, such as resolution and frame rate, of the video data it provides to the display device after reading the EDID in the display device 01.
Fig. 9 illustrates a block diagram of one possible configuration of the control apparatus 200. As shown in fig. 9, the control apparatus 200 includes a controller 210, a communication interface 230, a user input/output interface 240, a memory, and a power supply. The control apparatus 200 may receive an input operation instruction (e.g., a voice instruction) from the user, and convert the operation instruction into an instruction the display device 01 can recognize and respond to, enabling interaction between the user and the display device 01.
By way of example, taking a display device as a television set as an example, fig. 10 shows a schematic structural diagram of a display device 01 according to an embodiment of the present application.
As shown in fig. 10, the display apparatus 01 includes at least one of a modem 110, a communicator 120, a detector 130, an external device interface 140, a controller 150 (or referred to as a processor 150), a display 160, an audio output interface 170, a memory, a power supply, and a user interface.
In some embodiments, the controller includes a processor, a video processor, an audio processor, a graphics processor, RAM, ROM, and first through nth input/output interfaces.
The display 160 includes a display screen component for presenting pictures and a driving component for driving image display; it receives image signals output by the controller and displays video content, image content, menu manipulation interfaces, and user manipulation user interfaces (UIs).
The display 160 may be a liquid crystal display, an OLED display, a projection device, or a projection screen.
The communicator 120 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, or other network communication protocol chip or a near field communication protocol chip, and an infrared receiver. The display device 01 can establish transmission and reception of control signals and data signals with the external control device 200 or the video source device 02 through the communicator 120.
A user interface is operable to receive control signals from the control apparatus 200, such as an infrared remote control.
The detector 130 is used to collect signals of the external environment or interaction with the outside. For example, the detector 130 includes a light receiver, a sensor for collecting the intensity of ambient light; alternatively, the detector 130 includes an image collector such as a camera, which may be used to collect external environmental scenes, attributes of a user, or user interaction gestures, or alternatively, the detector 130 includes a sound collector such as a microphone, etc. for receiving external sounds.
The external device interface 140 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, or the like. The input/output interface may be a composite input/output interface formed by a plurality of interfaces.
The modem 110 receives broadcast television signals in a wired or wireless reception manner, and demodulates audio/video signals and EPG data signals from among a plurality of wireless or wired broadcast television signals.
In some embodiments, the controller 150 and the modem 110 may be located in separate devices, i.e., the modem 110 may also be located in an external device to the host device in which the controller 150 is located, such as an external set-top box or the like.
The controller 150 controls the operation of the display device and responds to the user's operations through various software control programs stored on the memory. The controller 150 controls the overall operation of the display apparatus 01. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 160, the controller 150 may perform an operation related to the object selected by the user command.
In some embodiments, the controller includes at least one of a central processing unit (Central Processing Unit, CPU), a video processor, an audio processor, a graphics processing unit (Graphics Processing Unit, GPU), RAM (Random Access Memory, RAM), ROM (Read-Only Memory, ROM), first through nth input/output interfaces, a communication bus (Bus), and the like.
The user may input a user command through a Graphical User Interface (GUI) displayed on the display 160, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
A "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user, which enables conversion between an internal form of information and a user-acceptable form. A commonly used presentation form of the user interface is a graphical user interface (Graphic User Interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the display device, where the control may include at least one of a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
It will be appreciated that in general, implementation of display device functions requires software in addition to the hardware support described above.
In some embodiments, taking the Android system as an example of the operating system used by the display device 01, referring to fig. 11, the system of the display device 01 may be divided into four layers, namely, the application layer (Application layer), the application framework layer (Application Framework layer), the Android runtime (Android run time) and system library layer (system runtime layer), and the kernel layer.
In some embodiments, at least one application program is running in the application program layer, and these application programs may be a Window (Window) program of an operating system, a system setting program, a clock program, or the like; or may be an application developed by a third party developer. In the embodiment of the application, the application program layer may include a voice recognition application, and the application is specifically used for calling a communication interface of the display device 01 to send voice data received by the display device 01 to the server 02 for recognition. In particular implementations, the application packages in the application layer are not limited to the above examples.
The framework layer provides an application programming interface (application programming interface, API) and a programming framework for the applications. The application framework layer includes a number of predefined functions or services. The application framework layer acts as a processing center that decides how the applications in the application layer act. Through the API, an application can access system resources and obtain system services during execution.
As shown in fig. 11, the application framework layer in the embodiment of the present application includes a manager (Manager), a content provider (Content Provider), a view system (View system), and the like, where the manager includes at least one of the following modules: an activity manager (Activity Manager), used to interact with all activities running in the system; a location manager (Location Manager), used to provide system services or applications with access to the system location service; a package manager (Package Manager), used to retrieve various information about the application packages currently installed on the device; a notification manager (Notification Manager), used to control the display and clearing of notification messages; and a window manager (Window Manager), used to manage icons, windows, toolbars, wallpaper, and desktop components on the user interface.
In some embodiments, the activity manager is used to manage the lifecycle of individual applications as well as common navigation rollback functions, such as controlling the exit, opening, and fallback of applications. The window manager is used to manage all window programs, for example obtaining the size of the display screen, judging whether a status bar exists, locking the screen, taking screenshots, and controlling changes of the display window (for example, shrinking the display window, dithering the display, distorting the display, etc.).
In some embodiments, the system runtime layer provides support for the upper layer, i.e., the framework layer. When the framework layer is in use, the Android operating system runs the C/C++ libraries contained in the system runtime layer to implement the functions required by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. The kernel layer contains at least one of the following drivers: an audio driver, a display driver, a Bluetooth driver, a camera driver, a WIFI driver, a USB driver, an HDMI driver, sensor drivers (e.g., fingerprint sensor, temperature sensor, pressure sensor, etc.), a MIC driver, a power driver, etc.
The video data referred to in the present application may be data authorized by the user or fully authorized by all parties.
The method in the following embodiments may be implemented in a display device having the above-described hardware structure and software structure. In the following embodiments, a display device is taken as an example of a television, and a display method provided in the embodiments of the present application is described.
Referring to fig. 12, an embodiment of the present application provides a display method, which is applied to a display device, where a refresh rate of a display of the display device is a first refresh rate. The method may include S121-S125:
S121, the television acquires the extended display identification data (EDID) processing capability of the video source device.
When the television responds to a playing operation of the user and needs to play a video, the television requests the video source device to send video data to the television. In this embodiment of the present application, the video source device is an HDMI device having an HDMI interface.
Before a video source device sends video data to the television, it must determine exactly which parameters (resolution and frame rate) of video data the display of the television supports. Specifically, the video source device performs DDC interaction with the television through the HDMI interface to parse the EDID stored in the television, thereby obtaining the resolution and frame rate of video data supported by the television. The EDID may be stored in a DDC memory area in the television.
In the display method provided by the embodiment of the application, the main purpose is to display video data with a higher frame rate, for example 240 Hz video data, on a television whose display refresh rate is the first refresh rate. However, referring to fig. 13 (a), the conventional EDID is 256 bytes and contains two extension units of 128 bytes each: one extension unit is the base data block (Base Block), and the other is the consumer electronics association block (CTA Block).
The conventional 256-byte EDID can only declare that the display supports video data with frame rates up to the first refresh rate (e.g., 120 Hz). Video data of the first resolution (e.g., 1920×1080p) and the first frame rate (e.g., 240 Hz) must instead be declared through the DisplayID protocol, since this combination is not a native HDMI resolution and frame rate defined by the HDMI association. This, in turn, requires ensuring that the HDMI interface supports parsing an extended EDID, i.e., the television stores the extended EDID in the DDC memory area in advance. The extended EDID is used to indicate that video data of the first resolution and the first frame rate can be received; the resolution of the first video data is the first resolution, the frame rate of the first video data is the first frame rate, and the first frame rate is twice the first refresh rate. Illustratively, the first resolution may be 1920×1080p and the first frame rate may be 240 Hz. Of course, in practice, the first resolution and the first frame rate may be determined according to the third resolution and the first refresh rate supported by the display screen of the display device and the actual requirement. For example, if the third resolution is 1920×1080p, the first resolution may be 960×540p; if the first refresh rate is 60 Hz, the first frame rate may be 120 Hz. This is not particularly limited in this application.
As shown in fig. 13 (b), the extended EDID has 512 bytes and four extension units. In addition to the two extension units included in the conventional EDID, the other two extension units are the extension block map (EXT-Block Map) and the DisplayID extension block. These latter two extension units can then be used to declare that the display supports video data of the first resolution and the first frame rate.
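The block layout above can be illustrated with a short sketch. This is not from the patent: it assumes the standard EDID conventions that byte 126 of the base block stores the number of extension blocks and that each 128-byte block ends with a checksum making the block sum to 0 modulo 256; the block contents here are dummy placeholders, not real Base/CTA/DisplayID data.

```python
# Sketch of the conventional (256-byte, 2-block) vs. extended (512-byte,
# 4-block) EDID layouts described above. Standard EDID conventions assumed:
# byte 126 of the base block = number of extension blocks; every 128-byte
# block ends with a checksum byte so the block sums to 0 mod 256.

BLOCK_SIZE = 128

def make_block(tag: int) -> bytes:
    """Build a dummy 128-byte block with a valid checksum; `tag` stands in
    for real block content (Base / CTA / EXT-Block Map / DisplayID)."""
    body = bytes([tag]) + bytes(BLOCK_SIZE - 2)
    checksum = (-sum(body)) % 256
    return body + bytes([checksum])

def make_edid(extension_count: int) -> bytes:
    """Assemble an EDID: a base block declaring `extension_count` extension
    blocks, followed by that many extension blocks."""
    base = bytearray(make_block(0x00))
    base[126] = extension_count           # extension flag byte
    base[127] = (-sum(base[:127])) % 256  # fix checksum after the edit
    blocks = [bytes(base)] + [make_block(i + 1) for i in range(extension_count)]
    return b"".join(blocks)

def block_count(edid: bytes) -> int:
    """Total number of 128-byte blocks declared by the base block."""
    return 1 + edid[126]

def checksums_ok(edid: bytes) -> bool:
    return all(sum(edid[i:i + BLOCK_SIZE]) % 256 == 0
               for i in range(0, len(edid), BLOCK_SIZE))

conventional = make_edid(1)  # Base Block + CTA Block        -> 256 bytes
extended = make_edid(3)      # + EXT-Block Map + DisplayID   -> 512 bytes
```

A video source device that only understands the conventional layout reads `block_count` as at most 2 and never fetches the DisplayID block, which is exactly the compatibility problem the following steps address.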
However, many HDMI devices currently on the market may not support parsing and reading the extended EDID. To avoid problems caused by this, the display device needs to obtain the EDID processing capability of a video source device after that device is connected, and thereby determine whether the subsequent display method can be carried out. The EDID processing capability is used to indicate whether the video source device supports parsing the extended EDID or only the conventional EDID.
In practice, for an old video source device that has previously been connected to the television, the two have in all likelihood already exchanged video data, so the EDID data in the television must already be parseable and readable by that device. For such old video source devices, video data continues to be transmitted exactly as before, without any change. Therefore, when the video source device is an old video source device, it is not necessary to determine whether it supports parsing and reading the extended EDID; only the EDID processing capability of a newly accessed video source device needs to be determined. Based on this, referring to fig. 14 in conjunction with fig. 12, before S121, the method further includes S120:
S120, the television judges whether the video source device is a newly accessed device.
If the television determines that the video source device is a newly accessed device, S121 is executed. If the television determines that the video source device is not a newly accessed device, the television may again judge whether a subsequently connected video source device is a newly accessed device, that is, execute S120; alternatively, the flow ends.
S122, the television receives the first video data from the video source device in the case that the EDID processing capability of the video source device is supporting parsing the extended EDID.
The resolution of the first video data is the first resolution, the frame rate of the first video data is the first frame rate, and the first frame rate is twice the first refresh rate.
In the case that the EDID processing capability of the video source device is supporting parsing the extended EDID, the video source device may send the first video data to the television after parsing the extended EDID stored in the television. Meanwhile, the television maintains the structure and content of the extended EDID while waiting for the first video data from the video source device. Specifically, the HDMI Video module in the television may receive the first video data. In practice, the video source device sends a timing signal (or HDMI signal) carrying the first video data to the television through the HDMI interface.
In some embodiments, in the case that the EDID processing capability of the video source device is not supporting parsing the extended EDID, the EDID processing capability of the video source device is considered to be parsing only the conventional EDID. In this case, in order for the video source device to smoothly send video data to the display device, the extended EDID needs to be modified into the conventional EDID. Based on this, referring to fig. 15 in conjunction with fig. 14, the method further includes S1501-S1503:
S1501, in the case that the EDID processing capability of the video source device is not supporting parsing the extended EDID, the television modifies the extended EDID into the conventional EDID.
Specifically, the television may modify the extended EDID stored in itself, as shown in (b) of fig. 13, into the conventional EDID as shown in (a) of fig. 13. The conventional EDID may declare that video data with a preset resolution and a preset frame rate can be received. Illustratively, the preset resolution is any feasible resolution, and the preset frame rate is a frame rate less than or equal to 120 Hz.
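A minimal sketch of this fallback, under the same assumed EDID conventions as before (byte 126 of the base block holds the extension count, each 128-byte block carries a checksum); the function name and layout details are illustrative, not from the patent:

```python
# Illustrative sketch of S1501: fall back from the 512-byte extended EDID
# to a 256-byte conventional EDID by keeping only the Base Block and CTA
# Block, setting the extension count in byte 126 back to 1, and re-computing
# the base-block checksum so the result still parses as a valid EDID.

BLOCK_SIZE = 128

def to_conventional_edid(extended_edid: bytes) -> bytes:
    if len(extended_edid) < 2 * BLOCK_SIZE:
        raise ValueError("expected at least Base Block + CTA Block")
    base = bytearray(extended_edid[:BLOCK_SIZE])
    cta = extended_edid[BLOCK_SIZE:2 * BLOCK_SIZE]
    base[126] = 1                          # now only one extension block (CTA)
    base[127] = (-sum(base[:127])) % 256   # restore the block checksum
    return bytes(base) + cta

# Example on a dummy 512-byte extended EDID declaring 3 extension blocks
_base = bytearray(BLOCK_SIZE)
_base[126] = 3
_base[127] = (-sum(_base[:127])) % 256
dummy_extended = bytes(_base) + bytes(3 * BLOCK_SIZE)
dummy_conventional = to_conventional_edid(dummy_extended)
```

After this rewrite, the DDC memory area presents only the two blocks that any conventional HDMI source can read, which is why S1502 then re-triggers DDC interaction.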
S1502, the television triggers the video source device to initiate DDC interaction again.
After the extended EDID is modified into the conventional EDID, in order for the video source device to send video data to the television, the video source device may initiate DDC interaction again to parse and read the conventional EDID, thereby determining with what parameters (including resolution and frame rate) video data should be sent to the television.
S1503, the television receives video data from the video source device.
Based on the technical scheme corresponding to S1501-S1503, even when the video source device cannot parse the extended EDID, the television can modify the extended EDID in time once the request for the first video data cannot be completed. Therefore, the video source device can still smoothly send video data to the television for playing, the user's video watching requirement is not left unsatisfied, and the use experience of the user is improved.
In some embodiments, it can be taken as certain that some large classes of video source devices, such as PC devices, support parsing the extended EDID. For video source devices outside such classes, the EDID processing capability needs to be determined by other means. Based on this, referring to fig. 16 in conjunction with fig. 15, S122 may specifically include S1601-S1607:
S1601, the television judges whether the type of the video source device is a first type device.
The first type of device is specifically a device having an EDID processing capability supporting parsing and expanding EDID.
If the television determines that the type of the video source device is the first type device, it may directly determine that the EDID processing capability of the video source device is supporting parsing the extended EDID, and receive the first video data from the video source device while maintaining the structure and content of the extended EDID, i.e., execute S1602.
If the television determines that the type of the video source device is not the first type device, the EDID processing capability of the video source device may be determined after the characteristic parameters of the video source device are acquired, that is, S1603 and S1604 are executed.
Of course, the above step S1601 may not actually exist, and the television set may perform S1602 if it is determined that the type of the video source device is the first type device, and may perform S1603 if it is determined that the type of the video source device is not the first type device.
S1602, the television determines that the EDID processing capability of the video source device is supporting parsing the extended EDID, and receives the first video data from the video source device.
S123 is executed after S1602.
S1603, the television acquires characteristic parameters of the video source equipment.
By way of example, the characteristic parameters may include a detailed type and model number. The detailed type may be a manufacturer of the video source device, a device name, etc.
Because the characteristic parameters can be identified and parsed by the television only if they are qualified or standard, before determining the EDID processing capability of the video source device according to the characteristic parameters, it is also necessary to judge whether the characteristic parameters are resolvable, i.e., execute S1604.
S1604, the television judges whether the characteristic parameters of the video source equipment can be analyzed.
If the television determines that the feature parameters of the video source device are resolvable, S1605 is performed.
If the television determines that the characteristic parameters of the video source device cannot be resolved, the television can determine whether the video source device can normally access a register storing extended EDID and resolve the extended EDID through the DDC communication statistic value of the video source device and the television, so as to determine whether the EDID processing capability of the video source device supports resolving the extended EDID. I.e. S1606-S1607 are performed.
Of course, the above step S1604 may not actually exist, and the television set may execute step S1605 when it is determined that the characteristic parameters of the video source device are resolvable, and execute steps S1606 and S1607 when it is determined that the characteristic parameters of the video source device are not resolvable.
S1605, under the condition that the characteristic parameters of the video source equipment can be analyzed, the television determines the EDID processing capability of the video source equipment according to the characteristic parameters of the video source equipment.
S1602 is executed after S1605.
The characteristic parameter resolvable means that the characteristic parameter obtained by the television accords with a preset standard, so that the television can successfully resolve the characteristic parameter, and further the EDID processing capability of the video source equipment is determined according to the characteristic parameter of the video source equipment.
In one possible implementation, the manufacturer to which the television belongs may store EDID processing capabilities of all video source devices to which the television has access in its own server. Based on this, referring to fig. 17 in conjunction with fig. 16, S1605 may specifically include S1701-S1706:
S1701, the television sends a query request to the server in the case that the characteristic parameters of the video source device are resolvable.
The query request carries the characteristic parameters of the video source device and is used to request the EDID processing capability of the video source device. The server is owned by the manufacturer of the television and may store the EDID processing capabilities, collected by the manufacturer, of all optional video source devices that televisions have accessed.
S1702, the television receives a query response of the server.
After receiving the query request, the server can query the EDID processing capability of the corresponding video source device in its own memory, generate a query response according to the query result, and send the query response to the television. After receiving the query response, the television can determine whether the EDID processing capability of the video source device exists on the server and whether that capability is supporting parsing the extended EDID.
S1703, the television judges whether the query response indicates that the EDID processing capability of the video source device exists and that this capability is supporting parsing the extended EDID.
If the television determines that the query response indicates that the EDID processing capability of the video source device exists and is supporting parsing the extended EDID, the television determines that the EDID processing capability of the video source device is supporting parsing the extended EDID. After that, the first video data from the video source device is received while the structure and content of the extended EDID are maintained, i.e., S1704 is executed.
If the television determines that the query response indicates that the EDID processing capability of the video source device does not exist, or indicates that it does not support parsing the extended EDID, it may be preliminarily determined that the video source device may not support parsing the extended EDID. However, because of the server's own capability or insufficient collected data, the EDID processing capabilities of all the optional video source devices stored on the server may not accurately cover this video source device. The television may therefore judge, through the DDC communication statistic value between the video source device and the television, whether the video source device can normally access the register storing the extended EDID and parse the extended EDID, thereby determining whether the EDID processing capability of the video source device supports parsing the extended EDID, i.e., execute S1705 and S1706.
Of course, the above step S1703 may not actually exist, and the television may execute S1704 if it determines that the query response indicates that the EDID processing capability of the video source device exists and is supporting parsing the extended EDID, and execute S1705 and S1706 if it determines that the query response indicates that the EDID processing capability of the video source device does not exist or does not support parsing the extended EDID.
S1704, the television determines that the EDID processing capability of the video source device is supporting parsing the extended EDID, and receives the first video data from the video source device.
S123 is executed after S1704.
S1705, the television acquires a DDC communication statistic value between the video source equipment and the television.
The DDC communication statistic value is used to indicate the degree to which the video source device has completed reading the extended EDID of the display device. In some embodiments, the DDC communication statistic value may also be referred to as the DDC communication training status.
S1706, the television judges whether the DDC communication statistic value indicates that the video source device has completely finished reading the extended EDID.
If the television determines that the DDC communication statistic value indicates that the video source device has completely finished reading the extended EDID, the television may determine that the video source device has the capability of parsing the extended EDID, that is, the EDID processing capability of the video source device is supporting parsing the extended EDID, and may then receive the first video data from the video source device while maintaining the structure and content of the extended EDID, i.e., execute S1704.
If the television determines that the DDC communication statistic value indicates that the video source device has not completely finished reading the extended EDID, the television may determine that the video source device does not have, or does not fully have, the capability of parsing the extended EDID, that is, the EDID processing capability of the video source device is not supporting parsing the extended EDID. The television may then modify the extended EDID into the conventional EDID and instruct the video source device to initiate DDC interaction again before sending video data to the television, i.e., execute S1501-S1503.
Of course, the step S1706 may not actually exist, and the television may perform S1704 if it is determined that the DDC communication statistics value indicates that the video source device has completely completed reading the extended EDID, and may perform S1501 to S1503 if it is determined that the DDC communication statistics value indicates that the video source device has not completely completed reading the extended EDID.
Based on the technical schemes corresponding to S1701-S1706, the television can accurately determine the EDID processing capability of the video source device according to the characteristic parameters of the video source device, and provide an execution basis for the subsequent procedure of display.
S1606, the television acquires the DDC communication statistic value between the video source equipment and the television.
The DDC communication statistic value is used to indicate the degree to which the video source device has completed reading the extended EDID of the display device. In some embodiments, the DDC communication statistic value may also be referred to as the DDC communication training status.
S1607, the television judges whether the DDC communication statistic value indicates that the video source device has completely finished reading the extended EDID.
If the television determines that the DDC communication statistic value indicates that the video source device has completely finished reading the extended EDID, the television may determine that the video source device has the capability of parsing the extended EDID, that is, the EDID processing capability of the video source device is supporting parsing the extended EDID, and may then receive the first video data from the video source device while maintaining the structure and content of the extended EDID, i.e., execute S1602.
If the television determines that the DDC communication statistic value indicates that the video source device has not completely finished reading the extended EDID, the television may determine that the video source device does not have, or does not fully have, the capability of parsing the extended EDID, that is, the EDID processing capability of the video source device is not supporting parsing the extended EDID. The television may then modify the extended EDID into the conventional EDID and instruct the video source device to initiate DDC interaction again before sending video data to the television, i.e., execute S1501-S1503.
Based on the technical scheme corresponding to the S1601-S1607, the television can accurately determine the EDID processing capability of the video source device by combining various factors, and provide an execution basis for the subsequent procedure of display.
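The branching of S1601-S1607 (together with the server query of S1701-S1706) can be consolidated into one decision function. All inputs and names below are hypothetical illustrations; the patent describes the decision logic, not an API:

```python
# Sketch of the S1601-S1607 decision flow. Returns True when the television
# should treat the source as supporting the extended EDID and keep it in
# place, False when it should fall back to the conventional EDID (S1501-S1503).

def supports_extended_edid(
    is_first_type_device: bool,   # S1601: known-good class, e.g. PC devices
    characteristic_params,        # S1603: dict of manufacturer/model, or None
                                  #        when the parameters are unresolvable
    server_lookup: dict,          # S1701-S1702: capability table on the server
    ddc_read_complete: bool,      # S1606/S1705: DDC statistic value says the
                                  #        full extended EDID was read
) -> bool:
    if is_first_type_device:                              # S1601 -> S1602
        return True
    if characteristic_params is not None:                 # S1604 -> S1605
        key = (characteristic_params.get("manufacturer"),
               characteristic_params.get("model"))
        capability = server_lookup.get(key)               # S1701-S1703
        if capability == "extended":                      # -> S1704
            return True
        # unknown or "conventional": double-check via DDC stats (S1705-S1706)
        return ddc_read_complete
    # characteristic parameters unresolvable: DDC stats alone (S1606-S1607)
    return ddc_read_complete
```

The design point the flow illustrates: the cheap, certain checks (device class, then a server lookup keyed by characteristic parameters) come first, and the DDC statistic value serves as the ground-truth fallback whenever those checks are inconclusive.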
S123, the television adjusts the resolution of the first video data from the first resolution to the second resolution, so as to obtain second video data.
Wherein the second horizontal pixel value of the second resolution is the same as the third horizontal pixel value of the third resolution, and the second vertical pixel value of the second resolution is half of the third vertical pixel value of the third resolution; the third resolution is a resolution of a display of the display device; the bandwidth requirement value of the second video data is less than or equal to the maximum bandwidth supported by the data receiving interface of the screen driving board TCON of the display apparatus.
For example, the third resolution may be 3840×2160p, and the second resolution may be 3840×1080p. Of course, in practice, the second resolution may be determined according to the third resolution supported by the display screen of the display device and the actual requirement.
In the case that the display of the television supports 3840×2160@120Hz video playing, the data receiving interface of the TCON may be V-by-One. In the embodiment of the application, the maximum bandwidth borne by the data receiving interface of the TCON is the bandwidth corresponding to the video data supported by the display. Since the total pixel value of the second resolution of the second video data is half the total pixel value of the third resolution, and the first frame rate is twice the first refresh rate, the second video data can be input into the data receiving interface of the TCON and processed by the TCON for display.
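The bandwidth claim is simple arithmetic on the example figures in the text: a TCON interface sized for 3840×2160@120Hz carries exactly the same pixel rate as 3840×1080@240Hz, because halving the pixel count cancels doubling the frame rate. (This counts active pixels only; blanking overhead is ignored for the comparison.)

```python
# Pixel-rate arithmetic behind the TCON bandwidth claim, using the
# example figures from the text (active pixels only, blanking ignored).

def pixel_rate(width: int, height: int, fps: int) -> int:
    """Pixels per second for a given active resolution and frame rate."""
    return width * height * fps

tcon_max = pixel_rate(3840, 2160, 120)      # third resolution @ first refresh rate
second_video = pixel_rate(3840, 1080, 240)  # second resolution @ first frame rate
first_video = pixel_rate(1920, 1080, 240)   # what the video source device sends
```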
In the present embodiment, S123 may be executed by the image processing unit FRC (frame rate conversion) in the television set.
In one possible implementation, the first resolution and the second resolution may be the same, in which case S123 may specifically be: the television determines the first video data as the second video data.
In another possible implementation manner, if the first vertical pixel value of the first resolution is half of the third vertical pixel value of the third resolution, and the first horizontal pixel value of the first resolution is half of the third horizontal pixel value of the third resolution, referring to fig. 12, S123 may specifically include S123A:
S123A, the television copies each column of pixels of each video frame in the first video data, so as to adjust the resolution of the first video data from the first resolution to the second resolution.
In this way, the first video data can be quickly converted into the second video data, which facilitates the subsequent display method flow and improves efficiency.
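S123A can be sketched on a toy frame; pure-Python lists of rows stand in for real pixel buffers here, and the tiny dimensions stand in for the 1920×1080 → 3840×1080 example:

```python
# Toy sketch of S123A: duplicating every column doubles the horizontal
# pixel count, turning a first-resolution frame (e.g. 1920x1080) into a
# second-resolution frame (e.g. 3840x1080) without discarding any pixel.

def duplicate_columns(frame):
    """Emit each pixel in a row twice, side by side."""
    return [[p for pixel in row for p in (pixel, pixel)] for row in frame]

frame = [[1, 2],
         [3, 4]]                  # 2x2 stand-in for a 1920x1080 frame
wide = duplicate_columns(frame)   # 4 columns x 2 rows: horizontal doubled
```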
In one possible implementation, the first vertical pixel value of the first resolution is half of the third vertical pixel value of the third resolution, and the first horizontal pixel value of the first resolution is half of the third horizontal pixel value of the third resolution. In connection with the existing process flow shown in fig. 4, for the purpose of reducing the flow change and reducing the development difficulty, referring to fig. 12, S123 may specifically include S1231 and S1232:
S1231, the television copies each row of pixels of each video frame in the first video data, so as to adjust the resolution of the first video data from the first resolution to a fourth resolution, obtaining fourth video data.
Wherein the fourth vertical pixel value of the fourth resolution is twice the first vertical pixel value of the first resolution. For example, if the first resolution is 1920 x 1080p, the fourth resolution is 1920 x 2160p.
Illustratively, S1231 may be performed by the composite (Remix) unit in the image processing unit FRC in the television.
S1232, the television copies each column of pixels of each video frame in the fourth video data, and de-duplicates repeated rows of pixels in each video frame of the fourth video data, so as to adjust the resolution of the fourth video data from the fourth resolution to the second resolution, obtaining the second video data.
The deduplication of repeated rows of pixels in each video frame may specifically be deleting one row from every two adjacent identical rows of pixels in each video frame.
Illustratively, S1232 may be performed by the prescaler (Prescaler) in the image processing unit FRC in the television.
For example, in combination with the example shown in fig. 7 and the technical solution corresponding to S1231-S1232, referring to fig. 20, after the HDMI Video module of the display device obtains the 1920×1080@240Hz first video data, the Remix unit copies the row pixels of the first video data to obtain 1920×2160@240Hz fourth video data. The Prescaler then copies the column pixels of the video frames in the fourth video data and halves the row pixels, thereby obtaining 3840×1080@240Hz second video data.
Based on the technical schemes corresponding to S1231 and S1232, and starting from the existing flow of the display scheme that displays high-frame-rate video on a low-refresh-rate display, the row pixels are first duplicated to twice the original before the halving operation is performed on the row pixels of the video data. Thus, after the halving operation (i.e., after the row pixels are de-duplicated), the second video data still contains all the pixels of the first video data input by the video source device. This guarantees the definition of the second video data, allows the second video data to be transmitted into the data receiving interface of the TCON, and provides the precondition for the subsequent clear and smooth display of the third video data.
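The S1231-S1232 chain can be traced end to end on a toy frame. The unit names follow the Remix/Prescaler description above, but the code itself is an illustration (list-of-rows frames, tiny dimensions standing in for 1920×1080 → 1920×2160 → 3840×1080):

```python
# Toy end-to-end sketch of S1231-S1232: Remix duplicates every row
# (1920x1080 -> 1920x2160), then the Prescaler duplicates every column and
# drops one row of each identical adjacent pair (1920x2160 -> 3840x1080).
# The result keeps every pixel of the input frame.

def remix(frame):
    """S1231: copy each row of pixels, doubling the vertical pixel count."""
    return [row[:] for row in frame for _ in range(2)]

def prescaler(frame):
    """S1232: copy each column, then de-duplicate adjacent identical rows.
    Assumes rows arrive in duplicated pairs, as produced by remix()."""
    widened = [[p for pixel in row for p in (pixel, pixel)] for row in frame]
    return widened[::2]  # keep one row from each duplicated pair

first = [[1, 2],
         [3, 4]]             # stand-in for a 1920x1080@240Hz frame
fourth = remix(first)        # 2 cols x 4 rows: "1920x2160"
second = prescaler(fourth)   # 4 cols x 2 rows: "3840x1080"
```

Composing the two steps is equivalent to duplicating columns directly, which is the point of the text: the detour exists only to reuse the existing halving stage without losing pixels.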
S124, the television adjusts the resolution of the second video data from the second resolution to the third resolution, obtaining third video data.
In one implementation manner, S124 may specifically include: and copying each row of pixels in each video frame of the second video data so as to adjust the resolution of the second video data from the second resolution to the third resolution, thereby obtaining third video data.
Illustratively, referring to fig. 21, the even-numbered rows of pixels in each video frame of the third video data are derived from the odd-numbered row of pixels directly above: the pixel value of each pixel in an even row is the pixel value of the corresponding pixel in the odd row above it. The odd-numbered rows of pixels in each video frame of the third video data are the corresponding rows of pixels in the second video data. That is, each video frame of the third video data includes all the pixels of the corresponding video frame of the second video data, that is, all the pixels of the first video data. In this way, when the third video data is displayed later, the sharpness is not lowered compared with the first video data.
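A toy sketch of S124 as fig. 21 describes it (illustrative code, tiny dimensions standing in for 3840×1080 → 3840×2160):

```python
# Toy sketch of S124: each row of the second-resolution frame becomes an
# odd row of the third-resolution frame, and the even row below it is a
# copy, so the vertical pixel count doubles while every pixel of the
# second video data survives.

def duplicate_rows(frame):
    """Emit each row twice: original as the odd row, copy as the even row."""
    return [row[:] for row in frame for _ in range(2)]

second = [[1, 1, 2, 2],
          [3, 3, 4, 4]]           # stand-in for a 3840x1080 frame
third = duplicate_rows(second)    # "3840x2160": odd rows original, even rows copies
```

Reading back the odd rows (`third[0::2]`) recovers the second video data exactly, which is the no-pixel-loss property the text relies on.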
In the embodiment of the present application, S124 may be performed by the TCON in the television.
S125, the television displays the third video data in a mode of simultaneously scanning two rows of pixels.
S125 may be performed by the TCON in the television. Specifically, the TCON can use the dual-line simultaneous scanning function of the HSR technology or the DLG technology to scan the third video data and convert it into electrical signals that can be displayed on the display screen of the television.
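The effect of dual-line scanning can be checked with back-of-the-envelope arithmetic. The figures reuse the text's example, and the model is a deliberate simplification (a fixed line-scan rate, blanking ignored): if the panel's line rate supports 2160 lines at 120 frames per second, scanning two lines at once halves the steps per frame, so 240 frames per second fit at the same line rate.

```python
# Simplified timing model for DLG/HSR dual-line scanning: the panel's
# line-scan rate is fixed; scanning two lines per step halves the steps
# needed per frame, doubling the achievable frame rate.

SCAN_STEPS_PER_SECOND = 2160 * 120  # native line rate: 2160 lines @ 120 Hz

def max_fps(lines_per_frame: int, lines_scanned_at_once: int) -> int:
    """Frames per second achievable at the panel's fixed line-scan rate."""
    steps_per_frame = lines_per_frame // lines_scanned_at_once
    return SCAN_STEPS_PER_SECOND // steps_per_frame
```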
According to the technical scheme provided by the embodiment of the application, in the case that the display screen of the television supports display at the third resolution with the first refresh rate, the maximum bandwidth supported by the data receiving interface of the screen driving board, which controls what the display shows, is the bandwidth corresponding to video data whose resolution is the third resolution and whose frame rate is the first refresh rate. Based on this, in order to enable the display screen of the television to display video data of a higher frame rate (for example, a frame rate twice the first refresh rate) smoothly and without losing pixel values, it is necessary to make the total pixel value corresponding to the resolution of the video data received by the screen driving board (i.e., the second resolution) half the total pixel value of the third resolution, and to make the frame rate of the video data received by the screen driving board twice the first refresh rate. In addition, when a display screen with the first refresh rate displays video data of the first frame rate, a soft high-refresh technology (such as DLG or HSR) is adopted. This technology specially processes the scanning of the rows of pixels in each video frame (two rows are scanned simultaneously) so as to double the scanning speed, thereby enabling a display screen with the first refresh rate to display video data of the first frame rate. Therefore, in the embodiment of the present application, the second horizontal pixel value of the second resolution may be the same as the third horizontal pixel value of the third resolution, while the second vertical pixel value of the second resolution is half of the third vertical pixel value of the third resolution.
Based on the foregoing, in the technical solution of the embodiments of the present application, in order for the screen driving board to receive video data whose resolution is the second resolution and whose frame rate is the first frame rate, the frame rate of the first video data acquired by the television from the video source device must be the first frame rate. Meanwhile, due to the specification limitations of video resolutions in practice, there is a difference between the resolution of the first video data provided to the television by the video source device (e.g., the first resolution) and the second resolution. Therefore, after the television obtains the first video data, the resolution of the first video data needs to be adjusted to obtain the second video data, which is then transmitted to the screen driving board so that the screen driving board controls the display screen to display according to the second video data.
Further, in practice, when the video source device transmits the first video data to the television, it needs to parse and read the EDID in the television to determine the parameters (frame rate and resolution) of the video data to be transmitted to the television. However, an existing EDID can only define parameters up to a lower frame rate (i.e., the first refresh rate, e.g., 120 Hz). Therefore, in order for the video source device to transmit video data at the first frame rate to the television, an extended EDID needs to be configured in the television in advance; the extended EDID indicates that video data with the first resolution and the first frame rate can be received. Thereafter, in the case where the video source device can parse the extended EDID, the television may receive the first video data and execute the subsequent display procedure.
In summary, with the technical solution provided in the embodiments of the present application, when the first video data with the first frame rate is finally displayed, it may be displayed in a dual-line simultaneous scanning manner without losing any pixel. Because no pixel in the first video data is discarded at any point in the whole display flow, the final display frame rate is guaranteed, the effect of smoothly and clearly displaying video data with the first frame rate on a display screen with a lower refresh rate (i.e., the first refresh rate) is realized, and the user experience is improved.
In some embodiments, the video source device may be unable to provide the first video data to the television for various possible reasons; in that case, if the television determines that the parameters (resolution and frame rate) of the video data transmitted by the video source device are not the parameters of the first video data, it needs to display according to the existing display procedure. Based on this, referring to fig. 22 in conjunction with fig. 17, the step of determining that the EDID processing capability of the video source device supports parsing the extended EDID and receiving the first video data from the video source device (i.e., S1602 and S1704) may specifically include S2201-S2204:
S2201, the television determines that the EDID processing capability of the video source device supports parsing the extended EDID, and receives a timing signal from the video source device.
Wherein the timing signal carries video data.
S2202, the television starts a timing signal detection thread.
The timing signal detection thread can be specifically used for detecting whether the resolution and the frame rate of video data carried by the timing signal are the first resolution and the first frame rate.
S2203, the television judges whether the resolution and the frame rate of the video data carried by the timing signal are the first resolution and the first frame rate respectively.
Specifically, determining whether the resolution and the frame rate mentioned here are the first resolution and the first frame rate, respectively, means determining whether the resolution is the first resolution and whether the frame rate is the first frame rate.
If the television determines that the resolution and the frame rate of the video data carried by the timing signal are the first resolution and the first frame rate respectively, it may be determined that the video source device is sending the first video data to the television, and S123 is then executed; if the television determines that the resolution of the video data carried by the timing signal is not the first resolution, or the frame rate of the video data carried by the timing signal is not the first frame rate, it may be determined that the video data sent by the video source device to the television is not the first video data, and S2204 is executed.
S2204, the television processes and displays the video data carried by the timing signal according to the normal HSR processing flow.
The normal HSR process flow may be a display process flow as shown in fig. 4, which is not described herein.
Based on the technical scheme of S2201-S2204, the technical scheme provided by the present application is executed only when the video source device actually sends first video data meeting the requirements to the television. Therefore, the technical scheme provided by the present application can adopt different high-refresh display modes for different video data, making reasonable use of the processing resources of the television.
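The branch taken in S2203 can be sketched as a small predicate. The function name, return strings, and the concrete default parameters below are illustrative assumptions (they use the example values 1920×1080 and 240 Hz given later in this application):

```python
def select_display_path(resolution, frame_rate,
                        first_resolution=(1920, 1080), first_frame_rate=240):
    """Decide, as in S2203, whether the timing signal carries the first video
    data (both resolution AND frame rate must match) or should fall back to
    the normal HSR processing flow (S2204)."""
    if resolution == first_resolution and frame_rate == first_frame_rate:
        return "first-video-data path"   # proceed with S123
    return "normal HSR flow"             # S2204

# Only a 1080p/240 Hz signal takes the new path; anything else falls back.
assert select_display_path((1920, 1080), 240) == "first-video-data path"
assert select_display_path((1920, 1080), 120) == "normal HSR flow"
assert select_display_path((3840, 2160), 240) == "normal HSR flow"
```

Note that a mismatch in either parameter alone is enough to select the fallback path, matching the "or" condition in the paragraph above.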
In some embodiments, since the technical solution provided in the present application does not require the frequency doubling function of the television, that is, the MEMC (motion estimation and motion compensation) function, referring to fig. 22 and as shown in fig. 23, the display method may further include S2301 between S2203 and S123:
S2301, the television turns off the MEMC function.
Of course, S2301 may be performed at any possible timing in any of the embodiments provided herein, which is not particularly limited herein.
Based on the scheme, the power consumption of the television can be reduced, and energy sources are saved.
In some embodiments, before the third video data is displayed, the second video data is fused with the OSD picture generated by the television, so that necessary controls or symbols are presented in the display interface. Based on this, referring to fig. 24 in conjunction with fig. 23, the display method further includes S2401 between S123 and S124:
S2401, the television fuses the OSD picture and the second video data to update the second video data.
The specific implementation of the OSD frame generation and the fusion with the second video data may refer to the related description in the foregoing embodiment, which is not described herein.
In the embodiment of the present application, S2401 may be executed by a send Display unit in a television set.
It should be noted that the steps in all the foregoing embodiments may be freely combined according to actual needs, and the examples specifically shown in the drawings of this application are not meant to be specific limitations on the display method provided in this application; other possible examples should also fall within the scope of the technical solutions disclosed in the present application.
The foregoing description of the solution provided in the embodiments of the present application has been mainly presented in terms of a method. To achieve the above functions, it includes corresponding hardware structures and/or software modules that perform the respective functions. Those of skill in the art will readily appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application may divide the functional modules of the display device according to the above method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
Referring to fig. 25, an embodiment of the present application provides a display device that may include an obtaining module 251, a processing module 252, and a display module 253.
Specifically, the obtaining module 251 is configured to obtain the extended display identification data (EDID) processing capability of the video source device; the obtaining module 251 is further configured to receive the first video data from the video source device if the EDID processing capability of the video source device supports parsing the extended EDID; the extended EDID is used to indicate that video data with the first resolution and the first frame rate can be received; the resolution of the first video data is the first resolution, and the frame rate of the first video data is the first frame rate; the first frame rate is twice the first refresh rate; the processing module 252 is configured to adjust the resolution of the first video data received by the obtaining module 251 from the first resolution to the second resolution to obtain the second video data; the second horizontal pixel value of the second resolution is the same as the third horizontal pixel value of the third resolution, and the second vertical pixel value of the second resolution is half of the third vertical pixel value of the third resolution; the third resolution is the resolution of the display of the display device; the bandwidth requirement value of the second video data is smaller than or equal to the maximum bandwidth supported by the data receiving interface of the screen driving board TCON of the display device; the processing module 252 is further configured to copy each row of pixels in each video frame of the second video data, so that the resolution of the second video data is adjusted from the second resolution to the third resolution, to obtain the third video data; and the display module 253 is configured to display the third video data obtained by the processing module 252 in a manner of scanning two rows of pixels simultaneously.
In one possible implementation, the obtaining module 251 is specifically configured to: if the category of the video source device is determined to be the first type device, determine that the EDID processing capability of the video source device supports parsing the extended EDID.
In one possible implementation, the obtaining module 251 is further specifically configured to: if the category of the video source device is determined not to be the first type device, acquire characteristic parameters of the video source device; and in the case that the characteristic parameters of the video source device can be parsed, determine the EDID processing capability of the video source device according to the characteristic parameters of the video source device.
In one possible implementation, the obtaining module 251 is further specifically configured to: send a query request to a server, where the query request carries the characteristic parameters of the video source device and is used to request the EDID processing capability of the video source device; receive a query response from the server; if the query response indicates that the EDID processing capability of the video source device exists and that the EDID processing capability of the video source device supports parsing the extended EDID, determine that the EDID processing capability of the video source device supports parsing the extended EDID; if the query response indicates that the EDID processing capability of the video source device does not exist, or indicates that the EDID processing capability of the video source device does not support parsing the extended EDID, acquire a display data channel DDC communication statistic value of the video source device, where the DDC communication statistic value is used to indicate the degree of completion with which the video source device reads the extended EDID of the display device; and if the DDC communication statistic value indicates that the video source device has completely read the extended EDID of the display device, determine that the EDID processing capability of the video source device supports parsing the extended EDID.
In one possible implementation, the obtaining module 251 is further specifically configured to: under the condition that the characteristic parameters of the video source equipment cannot be resolved, acquiring a DDC communication statistic value of a display data channel of the video source equipment; the DDC communication statistic value is used for indicating the completion degree of the extended EDID of the video source equipment reading display equipment; if the DDC communication statistic value indicates that the completion degree of the extended EDID of the video source device reading the display device is complete, determining the EDID processing capability of the video source device as supporting analysis of the extended EDID.
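The capability-determination order described in the implementations above (trusted device type first, then server query, then the DDC statistic as fallback) can be summarized in one function. All names here, and the assumed shape of the server response, are illustrative only:

```python
def edid_capability_supported(is_first_type_device, feature_params,
                              query_server, ddc_read_complete):
    """Sketch of the EDID capability determination: first-type devices are
    trusted outright; otherwise the server is queried with the device's
    characteristic parameters, and the DDC communication statistic (did the
    source completely read our extended EDID?) serves as the fallback signal.

    query_server(params) is assumed to return "supported", "unsupported",
    or None (capability unknown to the server)."""
    if is_first_type_device:
        return True
    if feature_params is None:          # characteristic parameters unobtainable
        return ddc_read_complete
    response = query_server(feature_params)
    if response == "supported":
        return True
    # Server has no record, or reports unsupported: fall back to DDC statistics.
    return ddc_read_complete

assert edid_capability_supported(True, None, lambda p: None, False)
assert edid_capability_supported(False, {"id": "x"}, lambda p: "supported", False)
assert not edid_capability_supported(False, {"id": "x"}, lambda p: None, False)
assert edid_capability_supported(False, None, lambda p: None, True)
```

The last two assertions show the role of the DDC statistic: when the server cannot settle the question, a source that has completely read the extended EDID is treated as able to parse it.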
In one possible implementation, the processing module 252 is specifically configured to: copying each row of pixels of each video frame in the first video data received by the acquisition module 251 so as to adjust the resolution of the first video data from the first resolution to a fourth resolution, thereby obtaining fourth video data; the fourth vertical pixel value of the fourth resolution is twice the first vertical pixel value of the first resolution; and copying each column of pixels of each video frame in the fourth video data, and de-duplicating repeated rows of pixels in each video frame in the fourth video data so as to adjust the resolution of the fourth video data from the fourth resolution to the second resolution, thereby obtaining the second video data.
In one possible implementation, the processing module 252 is specifically configured to: and copying each row of pixels in each video frame of the second video data so as to adjust the resolution of the second video data from the second resolution to a third resolution, thereby obtaining third video data.
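Using the example values given below (first resolution 1920×1080, second resolution 3840×1080, third resolution 3840×2160), the two resolution adjustments just described can be sketched with numpy. This is a hypothetical illustration; in a real television these operations run in the picture-quality hardware, and the function names are assumed:

```python
import numpy as np

def first_to_second(frame):
    """1920x1080 -> 3840x1080: duplicate every row (first -> fourth resolution,
    1920x2160), then duplicate every column while dropping the duplicated rows
    (fourth -> second resolution)."""
    fourth = np.repeat(frame, 2, axis=0)        # each row copied: height doubles
    second = np.repeat(fourth[::2], 2, axis=1)  # de-dup rows, copy each column
    return second

def second_to_third(frame):
    """3840x1080 -> 3840x2160: duplicate every row again (second -> third
    resolution), ready for dual-line simultaneous scanning."""
    return np.repeat(frame, 2, axis=0)

first = np.arange(1080 * 1920, dtype=np.uint8).reshape(1080, 1920)
second = first_to_second(first)
third = second_to_third(second)
assert second.shape == (1080, 3840)
assert third.shape == (2160, 3840)
# Adjacent rows of the third video data are identical, so scanning two rows
# at once (HSR/DLG) loses no information.
assert np.array_equal(third[0], third[1])
```

The final assertion makes the key property explicit: because every row of the third video data appears twice, the dual-line scan of S125 displays each original pixel exactly once per frame.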
In one possible implementation, the first refresh rate is 120 Hz, the first resolution is 1920×1080p (progressive scan), the first frame rate is 240 Hz, and the third resolution is 3840×2160p.
With respect to the display device in the above-described embodiment, a specific manner in which each module performs an operation has been described in detail in the foregoing embodiment of the display method, and will not be described in detail herein.
It should be understood that the division of units or modules (hereinafter referred to as units) in the above apparatus is merely a division of logical functions; in practice they may be fully or partially integrated into one physical entity or physically separated. All the units in the apparatus may be implemented in the form of software called by a processing element, or all may be implemented in hardware, or some units may be implemented in the form of software called by a processing element and the rest in hardware.
For example, each unit may be a processing element that is set up separately, may be implemented as integrated in a certain chip of the apparatus, or may be stored in a memory in the form of a program, and the functions of the unit may be called and executed by a certain processing element of the apparatus. Furthermore, all or part of these units may be integrated together or may be implemented independently. The processing element described herein, which may also be referred to as a processor, may be an integrated circuit with signal processing capabilities. In implementation, each step of the above method or each unit above may be implemented by an integrated logic circuit of hardware in a processor element or in the form of software called by a processing element.
In one example, the units in the above apparatus may be one or more integrated circuits configured to implement the above method, for example: one or more ASICs, or one or more DSPs, or one or more FPGAs, or a combination of at least two of these integrated circuit forms.
For another example, when the units in the apparatus may be implemented in the form of a scheduler of processing elements, the processing elements may be general-purpose processors, such as CPUs or other processors that may invoke programs. For another example, the units may be integrated together and implemented in the form of a system on chip SOC.
In one implementation, the above means for implementing each corresponding step in the above method may be implemented in the form of a processing element scheduler. For example, the apparatus may include a processing element and a storage element, the processing element invoking a program stored in the storage element to perform the display method described in the above method embodiments. The memory element may be a memory element on the same chip as the processing element, i.e. an on-chip memory element.
In another implementation, the program for performing the above method may be on a memory element on a different chip than the processing element, i.e. an off-chip memory element. At this time, the processing element calls or loads a program from the off-chip storage element on the on-chip storage element to call and execute the display method described in the above method embodiment.
Referring to fig. 26, the embodiment of the present application further provides a display device including a display 261, a processor 262, and a communicator 263; the refresh rate of the display 261 is the first refresh rate, and the resolution of the display 261 is the third resolution; the processor 262 is configured to obtain the extended display identification data (EDID) processing capability of the video source device; the communicator 263 is configured to receive the first video data from the video source device in the case where the EDID processing capability of the video source device supports parsing the extended EDID; the extended EDID is used to indicate that video data with the first resolution and the first frame rate can be received; the resolution of the first video data is the first resolution, and the frame rate of the first video data is the first frame rate; the first frame rate is twice the first refresh rate; the processor 262 is further configured to adjust the resolution of the first video data from the first resolution to the second resolution to obtain the second video data; the second horizontal pixel value of the second resolution is the same as the third horizontal pixel value of the third resolution, and the second vertical pixel value of the second resolution is half of the third vertical pixel value of the third resolution; the bandwidth requirement value of the second video data is smaller than or equal to the maximum bandwidth supported by the data receiving interface of the screen driving board TCON of the display device; the processor 262 is further configured to adjust the resolution of the second video data from the second resolution to the third resolution to obtain the third video data; and the processor 262 is further configured to control the display 261 to display the third video data in a manner of scanning two rows of pixels simultaneously.
In one possible implementation, the processor 262 is specifically configured to: if the category of the video source device is determined to be the first type device, the EDID processing capability of the video source device is determined to support analysis and expansion EDID.
In one possible implementation, the processor 262 is specifically configured to: if the category of the video source equipment is determined not to be the first equipment, acquiring characteristic parameters of the video source equipment; and under the condition that the characteristic parameters of the video source equipment can be resolved, determining the EDID processing capability of the video source equipment according to the characteristic parameters of the video source equipment.
In one possible implementation, the processor 262 is specifically configured to: the control communicator 263 transmits a query request to the server; the query request carries the characteristic parameters of the video source equipment, and is used for requesting the EDID processing capacity of the video source equipment; control communicator 263 receives the query response from the server; if the query response indicates that the EDID processing capability of the video source device exists and indicates that the EDID processing capability of the video source device is in support of analysis and extension of EDID, determining that the EDID processing capability of the video source device is in support of analysis and extension of EDID; if the query response indicates that the EDID processing capability of the video source equipment does not exist or indicates that the EDID processing capability of the video source equipment is not supported for analysis and expansion, acquiring a DDC communication statistic value of a display data channel of the video source equipment; the DDC communication statistic value is used for indicating the completion degree of the extended EDID of the video source equipment reading display equipment; and if the DDC communication statistic value indicates that the completion degree of the extended EDID of the video source equipment read display equipment is completely completed, determining the EDID processing capacity of the video source equipment as supporting analysis of the extended EDID.
In one possible implementation, the processor 262 is specifically configured to: in the case that the characteristic parameters of the video source device cannot be parsed, acquire a display data channel DDC communication statistic value of the video source device, where the DDC communication statistic value is used to indicate the degree of completion with which the video source device reads the extended EDID of the display device; and if the DDC communication statistic value indicates that the video source device has completely read the extended EDID of the display device, determine that the EDID processing capability of the video source device supports parsing the extended EDID.
In one possible implementation, the processor 262 is specifically configured to: copying each row of pixels of each video frame in the first video data so as to adjust the resolution of the first video data from the first resolution to a fourth resolution and obtain fourth video data; the fourth vertical pixel value of the fourth resolution is twice the first vertical pixel value of the first resolution; and copying each column of pixels of each video frame in the fourth video data, and de-duplicating repeated rows of pixels in each video frame in the fourth video data so as to adjust the resolution of the fourth video data from the fourth resolution to the second resolution, thereby obtaining the second video data.
In one possible implementation, the processor 262 is specifically configured to: and copying each row of pixels in each video frame of the second video data so as to adjust the resolution of the second video data from the second resolution to the third resolution, thereby obtaining third video data.
In one possible implementation, the first refresh rate is 120 Hz, the first resolution is 1920×1080p (progressive scan), the first frame rate is 240 Hz, and the third resolution is 3840×2160p.
The embodiment of the application also provides a display device, which may include: a display screen, a memory, and one or more processors. The display, memory, and processor are coupled. The memory is for storing computer program code, the computer program code comprising computer instructions. When the processor executes the computer instructions, the display device may perform the various functions or steps performed by the display device (e.g., television) in the method embodiments described above.
For example, the embodiment of the application also provides a chip, and the chip can be applied to the display device or the server. The chip includes one or more interface circuits and one or more processors; the interface circuit and the processor are interconnected through a circuit; the processor receives and executes computer instructions from the memory of the display device through the interface circuit to implement the methods described in the method embodiments above.
Embodiments of the present application also provide a computer-readable storage medium having stored thereon computer program instructions (or referred to as instructions). The computer program instructions, when executed by a display device, enable the display device to implement a display method as described above.
Embodiments of the present application also provide a computer program product comprising computer instructions which, when run on a display device as described above, cause the display device to implement the display method as described above.
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of functional modules is illustrated, and in practical application, the above-described functional allocation may be implemented by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to implement all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product, such as a program. The software product is stored in a program product, such as a computer-readable storage medium, and includes instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the various embodiments of the application. The aforementioned storage medium includes: a USB disk, a removable hard disk, a ROM, a RAM, a magnetic disk, an optical disk, or the like.
The foregoing is merely a specific embodiment of the present application, but the protection scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered in the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A display device, the display device comprising:
a display; the refresh rate of the display is a first refresh rate, and the resolution of the display is a third resolution;
a processor configured to obtain extended display identification data EDID processing capabilities of the video source device;
a communicator configured to receive first video data from the video source device in a case where EDID processing capability of the video source device is supported for parsing extended EDID; the extended EDID is used for indicating to receive video data with the resolution of a first resolution and the frame rate of the first frame rate; the resolution of the first video data is a first resolution, and the frame rate of the first video data is a first frame rate; the first frame rate is twice the first refresh rate;
the processor is further configured to adjust a resolution of the first video data from the first resolution to a second resolution to obtain second video data; the second horizontal pixel value of the second resolution is the same as the third horizontal pixel value of the third resolution, and the second vertical pixel value of the second resolution is half of the third vertical pixel value of the third resolution; the bandwidth requirement value of the second video data is smaller than or equal to the maximum bandwidth supported by the data receiving interface of the screen driving board TCON of the display device;
The processor is further configured to adjust the resolution of the second video data from the second resolution to the third resolution, resulting in third video data;
the processor is further configured to control the display to display the third video data in a manner that scans two rows of pixels simultaneously.
2. The display device of claim 1, wherein the processor is specifically configured to:
and if the category of the video source device is determined to be the first type device, determining that the EDID processing capability of the video source device supports parsing the extended EDID.
3. The display device of claim 2, wherein the processor is specifically configured to:
if the category of the video source equipment is determined not to be the first equipment, acquiring characteristic parameters of the video source equipment;
and under the condition that the characteristic parameters of the video source equipment can be resolved, determining the EDID processing capability of the video source equipment according to the characteristic parameters of the video source equipment.
4. The display device of claim 3, wherein the processor is specifically configured to:
control the communicator to send a query request to a server, wherein the query request carries the characteristic parameters of the video source device and requests the EDID processing capability of the video source device;
control the communicator to receive a query response from the server;
if the query response indicates that the EDID processing capability of the video source device exists and is support for parsing extended EDID, determine that the EDID processing capability of the video source device is support for parsing extended EDID;
if the query response indicates that the EDID processing capability of the video source device does not exist, or indicates that it is not support for parsing extended EDID, acquire a display data channel (DDC) communication statistic of the video source device, wherein the DDC communication statistic indicates the degree to which the video source device has completed reading the extended EDID of the display device; and
if the DDC communication statistic indicates that the video source device has fully completed reading the extended EDID of the display device, determine that the EDID processing capability of the video source device is support for parsing extended EDID.
5. The display device of claim 3, wherein the processor is specifically configured to:
in a case where the characteristic parameters of the video source device cannot be parsed, acquire a display data channel (DDC) communication statistic of the video source device, wherein the DDC communication statistic indicates the degree to which the video source device has completed reading the extended EDID of the display device; and
if the DDC communication statistic indicates that the video source device has fully completed reading the extended EDID of the display device, determine that the EDID processing capability of the video source device is support for parsing extended EDID.
6. The display device of claim 1, wherein the processor is specifically configured to:
copy each row of pixels of each video frame in the first video data, so as to adjust the resolution of the first video data from the first resolution to a fourth resolution and obtain fourth video data, wherein a fourth vertical pixel value of the fourth resolution is twice the first vertical pixel value of the first resolution; and
copy each column of pixels of each video frame in the fourth video data and remove the duplicated rows of pixels in each video frame of the fourth video data, so as to adjust the resolution of the fourth video data from the fourth resolution to the second resolution and obtain the second video data.
7. The display device of claim 1, wherein the processor is specifically configured to:
copy each row of pixels in each video frame of the second video data, so as to adjust the resolution of the second video data from the second resolution to the third resolution and obtain the third video data.
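Claims 6 and 7 describe the resolution changes purely by pixel duplication: rows are doubled (first → fourth resolution), then columns are doubled while the duplicated rows are dropped again (fourth → second), and finally rows are doubled once more (second → third). The net effect of claim 6 is therefore horizontal duplication only. A sketch of those steps with NumPy; nearest-neighbour duplication is the only operation the claims name, so no filtering or interpolation is assumed:

```python
import numpy as np

def rows_doubled(frame):
    # Claim 6, step 1: copy each row of pixels (vertical count x2).
    return np.repeat(frame, 2, axis=0)

def cols_doubled_rows_deduped(frame):
    # Claim 6, step 2: copy each column (horizontal x2), then remove the
    # duplicated rows introduced in step 1 (vertical count back to /2).
    return np.repeat(frame, 2, axis=1)[::2]

def second_from_first(frame):
    # First resolution -> fourth -> second (claim 6).
    return cols_doubled_rows_deduped(rows_doubled(frame))

def third_from_second(frame):
    # Claim 7: second -> third by copying each row once more.
    return np.repeat(frame, 2, axis=0)
```

On a 1080-row by 1920-column frame this yields a 1080×3840 intermediate frame and a 2160×3840 output frame, matching the second and third resolutions of claim 8; each pixel of the intermediate frame is an exact copy of a source pixel.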
8. The display device of claim 1, wherein the first refresh rate is 120 Hz, the first resolution is 1920×1080 progressive (1080p), the first frame rate is 240 Hz, and the third resolution is 3840×2160 (2160p).
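With the values of claim 8, the second resolution works out to 3840×1080: its horizontal pixel count matches the 3840×2160 panel and its vertical count is half of it. At 240 frames per second this intermediate stream carries exactly the same raw pixel rate as ordinary 3840×2160 at the panel's 120 Hz refresh rate, which is why it fits the TCON's bandwidth limit. A quick check of that arithmetic (active pixels only; blanking intervals and bit depth are ignored):

```python
# Raw active-pixel rates implied by claim 8 (blanking/bit depth ignored).
first = (1920, 1080, 240)   # width, height, frame rate of the source signal
second = (3840, 1080, 240)  # intermediate resolution sent through the TCON
third = (3840, 2160, 120)   # panel resolution at its 120 Hz refresh rate

def rate(mode):
    width, height, fps = mode
    return width * height * fps  # pixels per second

# The intermediate stream needs no more throughput than plain 4K at 120 Hz:
assert rate(second) == rate(third)
# ...yet it delivers twice as many frames as the 120 Hz panel refresh,
# which the dual-row scan then folds back onto the 3840x2160 panel.
assert rate(second) == 2 * rate(first)
```

Both sides of the first assertion equal 995,328,000 pixels per second, so halving the vertical resolution at double the frame rate is bandwidth-neutral for the TCON interface.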
9. A display method applied to a display device, wherein the refresh rate of a display of the display device is a first refresh rate, the method comprising:
acquiring the extended display identification data (EDID) processing capability of a video source device;
receiving first video data from the video source device in a case where the EDID processing capability of the video source device is support for parsing extended EDID; the extended EDID indicates that video data with a resolution of a first resolution and a frame rate of a first frame rate is to be received; the resolution of the first video data is the first resolution, the frame rate of the first video data is the first frame rate, and the first frame rate is twice the first refresh rate;
adjusting the resolution of the first video data from the first resolution to a second resolution to obtain second video data; a second horizontal pixel value of the second resolution is the same as a third horizontal pixel value of a third resolution, and a second vertical pixel value of the second resolution is half of a third vertical pixel value of the third resolution; the third resolution is the resolution of the display of the display device; a bandwidth requirement of the second video data is less than or equal to the maximum bandwidth supported by a data receiving interface of a timing controller (TCON) board of the display device;
adjusting the resolution of the second video data from the second resolution to the third resolution to obtain third video data; and
displaying the third video data by scanning two rows of pixels simultaneously.
10. A display device, comprising:
an acquisition module configured to acquire the extended display identification data (EDID) processing capability of a video source device;
the acquisition module is further configured to receive first video data from the video source device in a case where the EDID processing capability of the video source device is support for parsing extended EDID; the extended EDID indicates that video data with a resolution of a first resolution and a frame rate of a first frame rate is to be received; the resolution of the first video data is the first resolution, the frame rate of the first video data is the first frame rate, and the first frame rate is twice a first refresh rate;
a processing module configured to adjust the resolution of the first video data received by the acquisition module from the first resolution to a second resolution to obtain second video data; a second horizontal pixel value of the second resolution is the same as a third horizontal pixel value of a third resolution, and a second vertical pixel value of the second resolution is half of a third vertical pixel value of the third resolution; the third resolution is the resolution of a display of the display device; a bandwidth requirement of the second video data is less than or equal to the maximum bandwidth supported by a data receiving interface of a timing controller (TCON) board of the display device;
the processing module is further configured to copy each row of pixels in each video frame of the second video data, so as to adjust the resolution of the second video data from the second resolution to the third resolution and obtain third video data; and
a display module configured to display the third video data obtained by the processing module by scanning two rows of pixels simultaneously.
CN202211713330.6A 2022-12-29 2022-12-29 Display method and display device Pending CN116095261A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211713330.6A CN116095261A (en) 2022-12-29 2022-12-29 Display method and display device


Publications (1)

Publication Number Publication Date
CN116095261A true CN116095261A (en) 2023-05-09

Family

ID=86200353


Country Status (1)

Country Link
CN (1) CN116095261A (en)

Similar Documents

Publication Publication Date Title
CN114286138B (en) Display device, external device and multi-view angle proportional display method
CN111899175A (en) Image conversion method and display device
CN112289271B (en) Display device and dimming mode switching method
CN111954043B (en) Information bar display method and display equipment
CN113849143A (en) Display method, display device, and storage medium
CN111064982B (en) Display control method, storage medium and display device
CN117612499A (en) Display device and screen brightness adjusting method
CN116095261A (en) Display method and display device
CN112235621B (en) Display method and display equipment for visual area
CN110753194B (en) Dual-screen different display method, storage medium and electronic equipment
CN114710633A (en) Display apparatus and video signal display method
CN115547265A (en) Display apparatus and display method
CN114296664A (en) Auxiliary screen brightness adjusting method and display device
CN112367550A (en) Method for realizing multi-title dynamic display of media asset list and display equipment
CN112752045A (en) Display device and display method
CN112218156A (en) Method for adjusting video dynamic contrast and display equipment
CN115396717B (en) Display device and display image quality adjusting method
CN113973221B (en) Video signal display method and display equipment
CN117612466A (en) Display method and display device
CN115119035B (en) Display device, image processing method and device
CN116092445A (en) Display method and display device
CN113436564B (en) EPOS display method and display equipment
WO2020248815A1 (en) Drive control system, control method and computer-readable storage medium
CN117316079A (en) Display equipment, dimming partition control method and device
CN116647627A (en) Display method and display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination