CN113923514A - Display device and MEMC (Motion Estimation and Motion Compensation) repeated frame discarding method - Google Patents

Display device and MEMC (Motion Estimation and Motion Compensation) repeated frame discarding method

Info

Publication number
CN113923514A
CN113923514A (application CN202111116545.5A; granted publication CN113923514B)
Authority
CN
China
Prior art keywords
video frame, similarity threshold, buffer, current, stored
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111116545.5A
Other languages
Chinese (zh)
Other versions
CN113923514B (en)
Inventor
徐赛杰
晏飞
李锋
余横
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Xinxin Microelectronics Technology Co Ltd
Original Assignee
Qingdao Xinxin Microelectronics Technology Co Ltd
Application filed by Qingdao Xinxin Microelectronics Technology Co Ltd
Priority claimed from application CN202111116545.5A
Publication of CN113923514A
Application granted
Publication of CN113923514B
Current legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402 Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440281 Processing of video elementary streams involving reformatting operations by altering the temporal resolution, e.g. by frame skipping
    • H04N5/00 Details of television systems
    • H04N5/14 Picture signal circuitry for video frequency region
    • H04N5/144 Movement detection
    • H04N5/145 Movement estimation
    • H04N7/00 Television systems
    • H04N7/01 Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0127 Conversion of standards by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present application provides a display device and an MEMC repeated frame discarding method. The method calculates the difference value between the video frame stored in the current Buffer and the video frame stored in the previous Buffer, and calculates a similarity threshold for the current video frame, which is used to judge whether the current video frame is a repeat of the video frame stored in the previous Buffer. When the difference value is not greater than the similarity threshold, the current video frame is treated as a repeated frame and the video frame at the next moment is stored in the current Buffer. In this way, repeated frames whose picture content is identical to a previously stored video frame are discarded, and the use efficiency of the DDR in the display device is improved.

Description

Display device and MEMC (Motion Estimation and Motion Compensation) repeated frame discarding method
Technical Field
The present application relates to the field of image processing technologies, and in particular to a display device and an MEMC repeated frame discarding method.
Background
A display device has abundant input signal interfaces; for example, video data from signal source devices such as smart network players and DVD players can be acquired through an HDMI interface. Currently, the frame rate of mainstream movies is 24 fps, that of television programs is 25 fps, and some film sources are even as low as 10 fps. When playing film sources with different frame rates, a signal source device such as a smart network player usually outputs at a fixed frame rate of 60 fps, which requires duplicating video frames within the original source to raise the frame rate to a 60 fps output.
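As a concrete illustration of this duplication step, the per-frame repeat counts needed to raise a given source frame rate to a 60 fps output can be computed as follows. This is a sketch under the assumption of an even cadence; real players may distribute the repeats differently, and the function name is illustrative.

```python
from fractions import Fraction

def repeat_pattern(src_fps: int, out_fps: int = 60):
    """Per-frame repeat counts a player could use to raise src_fps content
    to out_fps purely by duplicating frames (no interpolation)."""
    step = Fraction(out_fps, src_fps)   # output frames per source frame
    counts, acc = [], Fraction(0)
    for _ in range(src_fps):            # one second of source frames
        acc += step
        counts.append(int(acc))         # emit the whole frames accumulated so far
        acc -= int(acc)
    return counts

# 30 fps to 60 fps: every frame is shown twice, so half the output is repeats.
assert repeat_pattern(30) == [2] * 30
# 24 fps to 60 fps: a 2:3 cadence, alternating 2 and 3 repeats per frame.
assert repeat_pattern(24)[:4] == [2, 3, 2, 3]
assert sum(repeat_pattern(24)) == 60
```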
Motion Estimation and Motion Compensation (MEMC) is a technology widely used in frame rate conversion. By estimating the motion trajectory of an object across consecutive moving images and then combining the image data with the obtained motion vectors, intermediate images are interpolated, which raises the video frame rate and alleviates judder and smearing during video playback.
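A toy illustration of the two MEMC steps on one-dimensional "frames": motion estimation by exhaustive matching (mean absolute difference), followed by midpoint interpolation along the found motion vector. This is a didactic sketch with illustrative names, not the algorithm claimed by this application.

```python
def best_motion(prev, curr, search=2):
    """Integer shift (motion vector) that best aligns prev to curr,
    scored by mean absolute difference over the overlapping samples."""
    def score(shift):
        pairs = [(prev[i], curr[i + shift])
                 for i in range(len(prev)) if 0 <= i + shift < len(curr)]
        return sum(abs(a - b) for a, b in pairs) / len(pairs)
    return min(range(-search, search + 1), key=score)

def interpolate_midframe(prev, curr, mv):
    """Synthesize the intermediate frame by fetching each sample halfway
    along the motion vector from both neighbouring frames and averaging."""
    half = mv // 2
    mid = []
    for i in range(len(prev)):
        a = prev[i - half] if 0 <= i - half < len(prev) else 0
        b = curr[i + mv - half] if 0 <= i + mv - half < len(curr) else 0
        mid.append((a + b) // 2)
    return mid

prev = [0, 0, 9, 0, 0]          # bright object at index 2
curr = [0, 0, 0, 0, 9]          # the object has moved to index 4
mv = best_motion(prev, curr)
assert mv == 2
# The interpolated frame shows the object halfway along its trajectory.
assert interpolate_midframe(prev, curr, mv) == [0, 0, 0, 9, 0]
```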
When the MEMC performs motion estimation, multiple original video frames need to be read from the DDR of the display device. However, if all video frames output by a signal source device such as a smart network player are stored in the DDR indiscriminately, the DDR holds many repeated video frames and is used inefficiently.
Disclosure of Invention
To solve the problems in the prior art, the present application provides a display device and an MEMC repeated frame discarding method. The method first calculates the difference value between the video frame stored in the current Buffer and the video frame stored in the previous Buffer, then calculates the similarity threshold corresponding to the video frame stored in the current Buffer, and determines whether the video frame at the next moment is stored in the current Buffer or the next Buffer by comparing the calculated difference value with the similarity threshold. This solves the prior-art problem that all video frames output by a signal source device such as a smart network player are stored in the DDR of the display device indiscriminately, so that the DDR is used inefficiently.
In a first aspect, an embodiment of the present application provides a display device, including a display and a controller, where the display is configured to display video frames, and the controller is coupled with the display and configured to:
calculating a difference value between a video frame stored in a current Buffer and a video frame stored in a previous Buffer, wherein the video frame stored in the current Buffer is the current video frame;
calculating a similarity threshold corresponding to the current video frame, wherein the similarity threshold is used for judging whether the current video frame is a repeated frame of the video frame stored in the previous Buffer;
and determining the position at which the video frame at the next moment is stored in a Buffer according to the relationship between the difference value and the similarity threshold corresponding to the current video frame.
In a possible implementation, determining the position at which the video frame at the next moment is stored in a Buffer according to the relationship between the difference value and the similarity threshold corresponding to the current video frame is specifically as follows:
if the difference value is not greater than the similarity threshold, determining that the current video frame is a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the current Buffer;
and if the difference value is greater than the similarity threshold, determining that the current video frame is not a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the next Buffer.
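A minimal sketch of this placement rule, assuming the Buffers are used in ring order (the wrap-around from the last Buffer back to the first is suggested by FIGS. 8-4 and 8-5 but is an assumption here); the function and parameter names are illustrative.

```python
def next_buffer_index(diff: float, threshold: float,
                      current: int, num_buffers: int) -> int:
    """Index of the Buffer that the video frame at the next moment is
    written to. A difference at or below the threshold marks the current
    frame as a repeat, so the next frame overwrites its slot; otherwise
    the current frame is kept and the write position advances."""
    if diff <= threshold:
        return current                      # repeat frame: overwrite in place
    return (current + 1) % num_buffers      # new content: advance the ring

# Repeat frame: the next frame stays in Buffer1.
assert next_buffer_index(diff=3.0, threshold=5.0, current=1, num_buffers=6) == 1
# New content in the last Buffer: wrap around to Buffer0.
assert next_buffer_index(diff=9.0, threshold=5.0, current=5, num_buffers=6) == 0
```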
In a possible implementation, the similarity threshold corresponding to the current video frame is calculated as follows:
calculating the average of the difference values between each of the current video frame and the previous N video frames and its respective preceding video frame, where the current video frame and the previous N video frames are consecutive in time;
and multiplying the average by a threshold coefficient, the resulting product being the similarity threshold corresponding to the current video frame.
In a possible implementation, the similarity threshold corresponding to a video frame has a maximum limit and a minimum limit: if the similarity threshold is smaller than the minimum limit, it is adjusted to the minimum limit; and if the similarity threshold is greater than the maximum limit, it is adjusted to the maximum limit.
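The adaptive threshold with its limits can be sketched as follows; the function and parameter names are illustrative, and the choice of difference metric is left to the caller.

```python
def similarity_threshold(recent_diffs, coeff, t_min, t_max):
    """Adaptive similarity threshold for the current frame.

    recent_diffs holds the difference values of the current frame and each
    of the previous N frames, each measured against its preceding frame
    (N + 1 values in total). The threshold is their mean scaled by a
    coefficient, then clamped to the interval [t_min, t_max].
    """
    mean = sum(recent_diffs) / len(recent_diffs)
    return min(max(mean * coeff, t_min), t_max)

# Mean of [4, 6, 8, 2] is 5; coefficient 0.5 gives 2.5, which the minimum
# limit of 3.0 raises to 3.0.
assert similarity_threshold([4, 6, 8, 2], coeff=0.5, t_min=3.0, t_max=20.0) == 3.0
# A large scaled mean is capped by the maximum limit.
assert similarity_threshold([10, 10], coeff=3.0, t_min=1.0, t_max=20.0) == 20.0
```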
The display device provided by the embodiments of the present application calculates the difference value between the video frame stored in the current Buffer and the video frame stored in the previous Buffer, and calculates a similarity threshold for the current video frame, which is used to judge whether the current video frame is a repeat of the video frame stored in the previous Buffer. When the difference value is not greater than the similarity threshold, the current video frame is treated as a repeated frame and the video frame at the next moment is stored in the current Buffer. In this way, repeated frames whose picture content is identical to a previously stored video frame are discarded, and the use efficiency of the DDR in the display device is improved.
In a second aspect, an embodiment of the present application provides an MEMC repeated frame discarding method, including:
calculating a difference value between a video frame stored in a current Buffer and a video frame stored in a previous Buffer, wherein the video frame stored in the current Buffer is the current video frame;
calculating a similarity threshold corresponding to the current video frame, wherein the similarity threshold is used for judging whether the current video frame is a repeated frame of the video frame stored in the previous Buffer;
and determining the position at which the video frame at the next moment is stored in a Buffer according to the relationship between the difference value and the similarity threshold corresponding to the current video frame.
In a possible implementation, determining the position at which the video frame at the next moment is stored in a Buffer according to the relationship between the difference value and the similarity threshold corresponding to the current video frame is specifically as follows:
if the difference value is not greater than the similarity threshold, determining that the current video frame is a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the current Buffer;
and if the difference value is greater than the similarity threshold, determining that the current video frame is not a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the next Buffer.
In a possible implementation, the similarity threshold corresponding to the current video frame is calculated as follows:
calculating the average of the difference values between each of the current video frame and the previous N video frames and its respective preceding video frame, where the current video frame and the previous N video frames are consecutive in time;
and multiplying the average by a threshold coefficient, the resulting product being the similarity threshold corresponding to the current video frame.
The similarity threshold corresponding to a video frame has a maximum limit and a minimum limit: if the similarity threshold is smaller than the minimum limit, it is adjusted to the minimum limit; and if the similarity threshold is greater than the maximum limit, it is adjusted to the maximum limit.
In a third aspect, an embodiment of the present application provides an image processing apparatus, including:
the first calculation module is used for calculating a difference value between a video frame stored in a current Buffer and a video frame stored in a previous Buffer, wherein the video frame stored in the current Buffer is the current video frame;
a second calculating module, configured to calculate a similarity threshold corresponding to the current video frame, where the similarity threshold is used to determine whether the current video frame is a repeated frame of the video frame stored in the previous Buffer;
and a determining module, configured to determine the position at which the video frame at the next moment is stored in a Buffer according to the relationship between the difference value and the similarity threshold corresponding to the current video frame.
In a possible implementation, the determining module determines the position at which the video frame at the next moment is stored in a Buffer according to the relationship between the difference value and the similarity threshold corresponding to the current video frame, specifically as follows:
if the difference value is not greater than the similarity threshold, determining that the current video frame is a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the current Buffer;
and if the difference value is greater than the similarity threshold, determining that the current video frame is not a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the next Buffer.
In a possible implementation, the second calculating module calculates the similarity threshold corresponding to the current video frame, specifically as follows:
calculating the average of the difference values between each of the current video frame and the previous N video frames and its respective preceding video frame, where the current video frame and the previous N video frames are consecutive in time;
and multiplying the average by a threshold coefficient, the resulting product being the similarity threshold corresponding to the current video frame.
The similarity threshold corresponding to a video frame has a maximum limit and a minimum limit: if the similarity threshold is smaller than the minimum limit, it is adjusted to the minimum limit; and if the similarity threshold is greater than the maximum limit, it is adjusted to the maximum limit.
In a fourth aspect, an embodiment of the present application further provides a chip, including a processor configured to call and run a computer program from a memory, so that a display device on which the chip is installed performs the method of any implementation of the second aspect.
In a fifth aspect, the present application further provides a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, implements the method of any one of the above second aspects.
In a sixth aspect, the present application further provides a computer program product, which includes a computer program, and when executed by a processor, the computer program implements the method of any one of the second aspects.
For the technical effects of any implementation of the second to sixth aspects, reference may be made to the technical effects of the corresponding implementations of the first aspect, and details are not repeated here.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or of the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic view of an application scenario of a display device in an embodiment of the present application;
fig. 2 is a block diagram showing a configuration of the control device 100 according to the embodiment of the present application;
fig. 3-1 is a block diagram of a hardware configuration of a display device 200 in an embodiment of the present application;
fig. 3-2 is a block diagram of a hardware configuration of the video processor 260-1 in the embodiment of the present application;
fig. 3-3 are block diagrams of hardware configurations of the RAM213 in the embodiment of the present application;
fig. 4 is a schematic diagram illustrating a functional configuration of the display device 200 according to an embodiment of the present application;
fig. 5 is a schematic flowchart of a MEMC repeat frame dropping method in an embodiment of the present application;
FIG. 6 is a schematic diagram illustrating pre-allocated cache space in a DDR in an embodiment of the present application;
FIG. 7 is a diagram of a video frame set output by a player in an embodiment of the present application;
FIG. 8-1 is an exemplary diagram in which the current Buffer is Buffer1 in an embodiment of the present application;
FIG. 8-2 is a first exemplary diagram of the storage location of the video frame at the next moment following FIG. 8-1 in an embodiment of the present application;
FIG. 8-3 is a second exemplary diagram of the storage location of the video frame at the next moment following FIG. 8-1 in an embodiment of the present application;
FIG. 8-4 is an exemplary diagram in which the current Buffer is Buffer5 in an embodiment of the present application;
FIG. 8-5 is an exemplary diagram of the storage location of the video frame at the next moment following FIG. 8-4 in an embodiment of the present application;
FIG. 9 is a schematic flowchart of an MEMC repeated frame discarding method in an embodiment of the present application;
FIG. 10-1 is a schematic diagram of the storage location of video frame A0 in an embodiment of the present application;
FIG. 10-2 is a schematic diagram of the storage location of video frame A1 in an embodiment of the present application;
FIG. 10-3 is a schematic diagram of the storage location of video frame B0 in an embodiment of the present application;
FIG. 10-4 is a first schematic diagram of the storage location of video frame B1 in an embodiment of the present application;
FIG. 10-5 is a second schematic diagram of the storage location of video frame B1 in an embodiment of the present application;
FIG. 11 is a schematic structural diagram of an image processing module in an embodiment of the present application.
Detailed Description
To make the purpose and embodiments of the present application clearer, the exemplary embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described exemplary embodiments are only some, not all, of the embodiments of the present application.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above drawings are used to distinguish similar objects or entities, and do not necessarily describe a particular sequence or chronological order unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
At present, many display devices on the market are equipped with the MEMC function, which is used to reduce the "smearing" or "blurring" caused by moving people in sports events such as ball games, or by shots with intense motion. When the MEMC performs motion estimation, it needs to obtain multiple original video frames from the DDR of the display device, calculates the motion direction of the picture from the relationship between successive pictures of these original video frames, and then interpolates an intermediate image by combining the image data with the obtained motion vectors, so that the whole motion appears smoother.
As described in the background above, a signal source device such as a smart network player outputs video frames to the display device at a fixed frame rate of 60 fps. To accommodate film sources with various frame rates, for example mainstream movies at 24 fps and television programs at 25 fps, the signal source device generally needs to copy and repeat the video frames of the original film source to achieve the 60 fps output.
If all video frames output to the display device by a signal source device such as a smart network player are stored in the DDR indiscriminately, the DDR holds many repeated frames and is used inefficiently. For example, suppose MEMC needs 3 original video frames for motion estimation. If the video frames written into the DDR contain no repeated frames, a buffer space the size of 3 video frames suffices to store the 3 original frames. If the incoming video frames do contain repeats, say a 30 fps source whose frames are each copied once to reach 60 fps, then half of the frames are repeats, and writing them all into the DDR would require a buffer space the size of 6 video frames to cover the same 3 original frames. It can be seen that in the prior art, if all video frames output by the player are stored in the DDR of the display device indiscriminately, the DDR is used inefficiently.
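The buffer arithmetic in this example can be stated as a small helper; the function and its names are purely illustrative.

```python
def buffers_needed(frames_for_memc: int, repeat_factor: int,
                   drop_repeats: bool) -> int:
    """Buffer slots the DDR must provide so that MEMC can read
    frames_for_memc distinct original frames, when the source repeats
    each frame repeat_factor times (e.g. 30 fps copied to 60 fps gives
    a factor of 2)."""
    if drop_repeats:
        return frames_for_memc                  # only originals are stored
    return frames_for_memc * repeat_factor      # repeats are stored as well

# The example above: 3 original frames, 30 fps doubled to 60 fps.
assert buffers_needed(3, repeat_factor=2, drop_repeats=False) == 6
assert buffers_needed(3, repeat_factor=2, drop_repeats=True) == 3
```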
To solve the problems in the prior art, the embodiments of the present application provide a display device and an MEMC repeated frame discarding method. The difference value between the video frame stored in the current Buffer and the video frame stored in the previous Buffer is calculated, together with a similarity threshold for the current video frame, which is used to judge whether the current video frame is a repeat of the video frame stored in the previous Buffer. When the difference value is determined to be not greater than the similarity threshold, the current video frame is treated as a repeated frame and the video frame at the next moment is stored in the current Buffer, discarding the repeated content. In this way, the DDR in the display device always stores the original video frame in which each picture content first appears, and the use efficiency of the DDR is improved.
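Putting the pieces together, a rough end-to-end sketch of the discarding loop, using an assumed sum-of-absolute-differences metric and illustrative parameter values; the application summarized here does not fix these specifics.

```python
def frame_diff(a, b):
    """Sum of absolute sample differences (a simple stand-in metric)."""
    return sum(abs(x - y) for x, y in zip(a, b))

def dedup_stream(frames, coeff=1.0, t_min=1.0, t_max=1e9, n_hist=3):
    """Frames that would remain in DDR: each incoming frame is compared
    with the last stored frame, an adaptive threshold is built from the
    recent difference history, and frames at or below the threshold are
    treated as repeats and dropped (their slot is reused)."""
    kept, history = [], []
    for frame in frames:
        if not kept:
            kept.append(frame)                  # first frame is always stored
            continue
        diff = frame_diff(frame, kept[-1])
        recent = (history + [diff])[-(n_hist + 1):]
        thr = min(max(sum(recent) / len(recent) * coeff, t_min), t_max)
        history.append(diff)
        if diff > thr:
            kept.append(frame)                  # new content: keep it
        # else: repeat frame, discarded by letting the next frame overwrite it
    return kept

# A 30 fps to 60 fps style stream where every frame arrives twice:
# the exact duplicates are dropped, the originals survive.
stream = [[0, 0], [0, 0], [10, 10], [10, 10], [20, 20], [20, 20]]
assert dedup_stream(stream) == [[0, 0], [10, 10], [20, 20]]
```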
Fig. 1 is a schematic view of an application scenario of a display device in an embodiment of the present application. As shown in fig. 1, a user may operate the display apparatus 200 through the mobile terminal 300 and the control device 100.
The control device 100 may control the display device 200 wirelessly or in other wired manners, for example as a remote controller using infrared protocol communication, Bluetooth protocol communication, or other short-range communication. The user may input user commands through keys on the remote controller, voice input, control panel input, and the like to control the display device 200. For example, the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, and power key on the remote controller to control the display device 200.
In some embodiments, mobile terminals, tablets, computers, laptops, and other smart devices may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device. The application, through configuration, may provide the user with various controls in an intuitive User Interface (UI) on a screen associated with the smart device.
For example, the mobile terminal 300 may install a software application with the display device 200, implement connection communication through a network communication protocol, and implement the purpose of one-to-one control operation and data communication. Such as: the mobile terminal 300 and the display device 200 can establish a control instruction protocol, synchronize a remote control keyboard to the mobile terminal 300, and control the display device 200 by controlling a user interface on the mobile terminal 300. The audio and video content displayed on the mobile terminal 300 can also be transmitted to the display device 200, so as to realize the synchronous display function.
As also shown in fig. 1, the display device 200 also performs data communication with the server 400 through various communication means. The display device 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. The server 400 may provide various contents and interactions to the display device 200. Illustratively, the display device 200 receives software program updates or accesses a remotely stored digital media library by sending and receiving information and through Electronic Program Guide (EPG) interactions. The server 400 may be one group or multiple groups of servers, and of one or more types. The server 400 also provides other web service content such as video on demand and advertisement services.
The display device 200 may be a smart television, a smart phone, or the like. The specific smart product type, device model, etc. are not limited, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
The display device 200 may additionally provide an intelligent network TV function that offers computer support functions in addition to the broadcast receiving TV function, for example a web TV, a smart TV, an Internet Protocol TV (IPTV), and the like.
Fig. 2 is a block diagram of a configuration of the control device 100 in the embodiment of the present application. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory 190, and a power supply 180. The control apparatus 100 may receive an input operation instruction from a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200.
Fig. 3-1 is a block diagram of a hardware configuration of a display device 200 in the embodiment of the present application. As shown in fig. 3-1, the display device 200 includes a controller 210, a modem 220, a communication interface 230, a detector 240, an input/output interface 250, a video processor 260-1, an audio processor 260-2, a display 280, an audio output 270, a memory 290, a power supply, and a user input interface.
A display 280 for receiving the image signal from the video processor 260-1 and displaying video content, images, and components of the menu manipulation interface. The display 280 includes a display screen assembly for presenting a picture, and a driving assembly for driving the display of an image. The displayed video content may come from broadcast television content, or from broadcast signals received via a wired or wireless communication protocol. Alternatively, various image content received from a network server via a network communication protocol can be displayed.
Meanwhile, the display 280 also displays a user manipulation UI interface that is generated in the display apparatus 200 and used to control the display apparatus 200.
The driving component drives the display according to the type of the display 280. Alternatively, in case the display 280 is a projection display, it may also comprise a projection device and a projection screen.
The communication interface 230 is a component for communicating with an external device or an external server according to various communication protocol types. For example: the communication interface 230 may be a Wifi module 231, a bluetooth communication protocol module 232, a wired ethernet communication protocol module 233, or other network communication protocol modules or near field communication protocol modules, and an infrared receiver (not shown).
The detector 240 is a component used by the display device 200 to collect signals of the external environment or for interaction with the outside. The detector 240 includes a light receiver 242, a sensor for collecting the intensity of ambient light, so that display parameters can be adapted to changes in the ambient light. Alternatively, the detector 240 includes an image collector 241, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures; or the detector 240 includes a sound collector, such as a microphone, used to receive external sounds.
The input/output interface 250, under control of the controller 210, manages data transmission between the display device 200 and other external devices, such as receiving video and audio signals or command instructions from an external device.
The tuning demodulator 220 receives broadcast television signals in a wired or wireless manner, may perform modulation and demodulation processing such as amplification, frequency mixing, and resonance, and demodulates, from a plurality of wireless or wired broadcast television signals, the television audio/video signals and the EPG data signals carried in the television channel frequency selected by the user.

The tuner demodulator 220, as selected by the user and controlled by the controller 210, responds to the television signal frequency selected by the user and the television signal carried by that frequency.
The video processor 260-1 is configured to receive an external video signal, and perform video processing such as decompression, decoding, scaling, noise reduction, motion estimation and motion compensation, resolution conversion, and image synthesis according to a standard codec protocol of the input signal, so as to obtain a signal that can be directly displayed or played on the display device 200.
As shown in the hardware configuration block diagram of fig. 3-2, the video processor 260-1 includes a demultiplexing module 260-11, a video decoding module 260-12, an image composition module 260-13, a MEMC module 260-14, a display formatting module 260-15, and the like.
The audio processor 260-2 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, amplification processing, and the like to obtain an audio signal that can be played in the speaker.
An audio output 270, under the control of the controller 210, receives the sound signal output by the audio processor 260-2. It includes the speaker 272 carried by the display device 200 itself, and an external sound output terminal 274 that can output to a sound-producing device of an external device, such as an external sound interface or an earphone interface.
The power supply, under the control of the controller 210, provides power support for the display device 200 from the power input of an external power source. The power supply may include a built-in power supply circuit installed inside the display device 200, or a power supply interface installed outside the display device 200 that supplies external power to the display device 200.
A user input interface for receiving an input signal of a user and then transmitting the received user input signal to the controller 210. The user input signal may be a remote controller signal received through an infrared receiver, and various user control signals may be received through the network communication module.
Illustratively, a user inputs a user command through the remote control device 100 or the mobile terminal 300; the user input interface passes the input to the controller 210, and the display apparatus 200 responds to the user input accordingly.
In some embodiments, a user may enter a user command on a Graphical User Interface (GUI) displayed on the display 280, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
The controller 210 controls the operation of the display apparatus 200 and responds to the user's operation through various software control programs stored in the memory 290.
As shown in fig. 3-1, the controller 210 includes a RAM213 and a ROM214, as well as a graphics processor 216, a CPU processor 212, and a communication interface 218, such as: a first interface 218-1 through an nth interface 218-n, and a communication bus. The RAM213 and the ROM214, the graphic processor 216, the CPU processor 212, and the communication interface 218 are connected via a bus.
The ROM214 is used to store instructions for various system boots. When the display apparatus 200 receives the power-on signal and starts up, the CPU processor 212 executes the system boot instructions in the ROM, copies the operating system stored in the memory 290 to the RAM213, and starts running the operating system. After the operating system has started, the CPU processor 212 copies the various application programs in the memory 290 to the RAM213 and then starts running the various application programs.
As shown in the hardware configuration block diagram of FIG. 3-3, the RAM213 includes SRAM213-1, DRAM213-2, SDRAM213-3, DDR SDRAM213-4, etc.
SRAM (Static Random Access Memory) is memory with a static access function: it can retain the data stored in it without a refresh circuit.
DRAM (Dynamic Random Access Memory) is the most common system Memory, and can only hold data for a short time.
The SDRAM (Synchronous Dynamic Random Access Memory) is developed on the basis of DRAM, and is a kind of DRAM, where synchronization refers to that a Memory works with a Synchronous clock, and the sending of internal commands and the transmission of data are based on the clock; dynamic means that the memory array needs to be refreshed continuously to ensure that data is not lost; random means that data is not stored linearly in sequence, but read and write from a specified address.
DDR SDRAM, commonly called DDR, is developed on the basis of SDRAM and is basically the same as SDRAM, except that data can be read and written twice within one clock cycle, so the data transmission speed is doubled. It is currently the memory most widely used in computers and has a cost advantage.
A graphics processor 216 for generating various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an arithmetic unit that performs operations on the various interactive instructions input by the user and displays the various objects according to their display attributes, and a renderer that generates the various objects based on the arithmetic unit's results and displays the rendered result on the display 280.
A CPU processor 212 for executing operating system and application program instructions stored in memory 290. And executing various application programs, data and contents according to various interactive instructions received from the outside so as to finally display and play various audio and video contents.
In some exemplary embodiments, the CPU processor 212 may include a plurality of processors, which may include one main processor and one or more sub-processors: a main processor for performing some operations of the display apparatus 200 in a pre-power-up mode and/or operations of displaying a screen in a normal mode, and one or more sub-processors for operations in a standby mode or the like.
The controller 210 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 210 may perform an operation related to the object selected by the user command.
Wherein the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to the icon. The user command for selecting the UI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch pad, etc.) connected to the display apparatus 200 or a voice command corresponding to a voice spoken by the user.
The memory 290 includes a memory for storing various software modules for driving the display device 200. Such as: various software modules stored in memory 290, including: the system comprises a basic module, a detection module, a communication module, a display control module, a browser module, various service modules and the like.
Wherein the basic module is a bottom-layer software module for signal communication among the various hardware components in the display device 200 and for sending processing and control signals to the upper-layer modules. The detection module is used for collecting various information from various sensors or user input interfaces and performing digital-to-analog conversion and analysis management.
For example: the voice recognition module comprises a voice analysis module and a voice instruction database module. The display control module is a module for controlling the display 280 to display image content, and may be used to play multimedia image content, UI interfaces, and other information. The communication module is used for control and data communication with external devices. The browser module is used for performing data communication with browsing servers. The service modules are used for providing various services, including various application programs.
Meanwhile, the memory 290 is also used to store received external data and user data, images of the respective items in various user interfaces, visual-effect maps, focus objects, and the like.
Fig. 4 is a schematic diagram of a functional configuration of a display device 200 in the embodiment of the present application. As shown in fig. 4, the memory 290 is used to store an operating system, an application program, contents, user data, and the like, and performs system operations for driving the display device 200 and various operations in response to a user under the control of the controller 210. The memory 290 may include volatile and/or nonvolatile memory.
The memory 290 is specifically configured to store an operating program for driving the controller 210 in the display device 200, and to store various application programs installed in the display device 200, various application programs downloaded by a user from an external device, various graphical user interfaces related to the applications, various objects related to the graphical user interfaces, user data information, and internal data of various supported applications. The memory 290 is used to store system software such as an OS kernel, middleware, and applications, and to store input video data and audio data, and other user data.
The memory 290 is specifically used for storing drivers and related data such as the audio/video processors 260-1 and 260-2, the display 280, the communication interface 230, the tuning demodulator 220, the input/output interface of the detector 240, and the like.
In some embodiments, memory 290 may store software and/or programs, software programs for representing an Operating System (OS) including, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. For example, the kernel may control or manage system resources, or functions implemented by other programs (e.g., the middleware, APIs, or applications), and the kernel may provide interfaces to allow the middleware and APIs, or applications, to access the controller to implement controlling or managing system resources.
The memory 290, for example, includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external instruction recognition module 2907, a communication control module 2908, a light receiving module 2909, a power control module 2910, an operating system 2911, and other applications 2912, a browser module, and the like. The controller 210 performs functions such as: a broadcast television signal reception demodulation function, a television channel selection control function, a volume selection control function, an image control function, a display control function, an audio control function, an external instruction recognition function, a communication control function, an optical signal reception function, an electric power control function, a software control platform supporting various functions, a browser function, and the like.
Fig. 5 is a schematic flowchart of a MEMC repeated frame discarding method according to an embodiment of the present disclosure, where the method in the embodiment of the present disclosure may be applied to a display device with a video playing function, such as a smart television, a smart phone, a notebook computer, a desktop computer, and the like, and for example, the process may be executed by the display device 200 in fig. 1. As shown in fig. 5, the process includes:
S501: calculating the difference value between the video frame stored in the current Buffer and the video frame stored in the previous Buffer, wherein the video frame stored in the current Buffer is the current video frame.

S502: calculating the similarity threshold corresponding to the current video frame, wherein the similarity threshold is used for judging whether the current video frame is a repeated frame of the video frame stored in the previous Buffer.

S503: determining the position where the video frame at the next moment is stored in the Buffer according to the relation between the difference value and the similarity threshold corresponding to the current video frame.
Before step S501 is performed, the display device allocates, in the DDR in advance, a buffer space for storing a preset number of video frames. In actual use, the size of each video frame can be determined in advance according to the size of the input video source; once the size of a single video frame is determined, a buffer space capable of storing a preset number of video frames can be allocated in advance.
For example, fig. 6 is a schematic diagram of a pre-allocated Buffer space in a DDR provided by the present application, as shown in fig. 6, the entire Buffer space is further divided into 6 buffers, which are respectively marked by Buffer0, Buffer1, Buffer2, Buffer3, Buffer4, and Buffer5, and each Buffer can store one video frame, that is, the Buffer space shown in fig. 6 can store 6 video frames in total.
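The pre-allocation described above can be sketched as follows. This is a hypothetical Python illustration, not the patent's implementation; the frame dimensions and the 6-slot layout are assumptions taken from the example in fig. 6.

```python
# Hypothetical sketch: pre-allocate a fixed pool of frame buffers,
# mirroring the Buffer0..Buffer5 layout described above.
NUM_BUFFERS = 6

def allocate_buffer_pool(frame_width, frame_height, bytes_per_pixel=3):
    """Allocate NUM_BUFFERS slots, each large enough for one video frame."""
    frame_size = frame_width * frame_height * bytes_per_pixel
    return [bytearray(frame_size) for _ in range(NUM_BUFFERS)]

pool = allocate_buffer_pool(1920, 1080)
# pool has 6 slots, each able to hold one 1920x1080 RGB frame
```

Each slot is sized once, up front, from the input video source, so no further allocation is needed while frames stream in.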
In step S501, a difference value between a video frame stored in a current Buffer and a video frame stored in a previous Buffer is calculated, where the difference value may be a pixel difference value at a corresponding position of two frames of images, or may be other indexes used for evaluating a difference between image frames in the prior art, and this is not limited herein.
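As one concrete (hypothetical) choice of the difference value mentioned above, the sum of absolute differences between co-located pixels can be computed as:

```python
# Hypothetical sketch: one common "difference value" is the sum of
# absolute differences (SAD) between pixels at corresponding positions.
def frame_difference(frame_a, frame_b):
    """Sum of absolute pixel differences at corresponding positions."""
    return sum(abs(a - b) for a, b in zip(frame_a, frame_b))

identical = frame_difference(b"\x10\x20\x30", b"\x10\x20\x30")  # 0
changed = frame_difference(b"\x10\x20\x30", b"\x10\x22\x30")    # 2
```

A repeated frame yields a difference value of 0 (or near 0 after compression noise), while genuinely new content yields a large value.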
In step S502, the similarity threshold corresponding to the current video frame is calculated as follows:

calculating the average value of the sum of the difference values, where each of the current video frame and the previous N video frames contributes the difference value between itself and the video frame at its previous moment, and the current video frame and the previous N video frames are consecutive in time;

multiplying the average value by a threshold coefficient; the resulting product is the similarity threshold corresponding to the current video frame.
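The two steps above can be sketched as follows; the function name, the history layout, and the coefficient value are illustrative assumptions, not from the patent:

```python
# Hypothetical sketch of the threshold rule: average the difference values
# of the current frame and its N predecessors (each versus the frame before
# it), then scale the average by a threshold coefficient.
def similarity_threshold(diff_history, n, coefficient):
    """diff_history[-1] is the current frame's difference value;
    the n entries before it belong to the n preceding frames."""
    window = diff_history[-(n + 1):]   # current frame + N predecessors
    average = sum(window) / len(window)
    return average * coefficient

# e.g. N = 3; the coefficient 0.5 is an assumed, empirically chosen value
thr = similarity_threshold([40, 60, 50, 10], n=3, coefficient=0.5)
# average = (40 + 60 + 50 + 10) / 4 = 40, so thr == 20.0
```

Because the window slides with the input, the threshold tracks the recent motion level of the content rather than being a fixed constant.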
Fig. 7 shows a group of temporally consecutive video frames. Assuming that the video frame at the current time t stored in the current Buffer is video frame M5, how to calculate the similarity threshold corresponding to video frame M5 is described in detail below.
In this example, suppose N is 3. This means that, in addition to the difference value between the current video frame M5 and the video frame at its previous moment, the difference value between each of the 3 video frames before video frame M5 and the video frame at its respective previous moment is also required, where the 3 video frames before video frame M5, namely video frame M4, video frame M3, and video frame M2, are consecutive in time with video frame M5.
In the present application, for any received video frame, the difference value between the received video frame and the previous video frame is calculated, and the difference value is stored in an array. Similar to calculating the difference value between the video frames stored in the two buffers, the difference value between two video frames in time succession may be the pixel difference value of the corresponding position of the image, or may be other indexes used for evaluating the difference of the image frames in the prior art.
In the present application, in order to ensure that the similarity threshold corresponding to the current video frame can be accurately calculated, the number of values the array can store is larger than the number of difference values required for calculating the similarity threshold. For example, if calculating the similarity threshold requires using 7 difference values, the array may store 32 values.
As shown in FIG. 7, the difference value between the current video frame M5 and the video frame M4 is recorded as d5; by analogy, the difference value between video frame M4 and video frame M3 is recorded as d4, the difference value between video frame M3 and video frame M2 is recorded as d3, and the difference value between video frame M2 and video frame M1 is recorded as d2.

After obtaining the difference values between the current video frame M5 and each of the previous 3 video frames and their respective preceding frames, the average value avg of the sum of the difference values is calculated as follows:

avg = (d5 + d4 + d3 + d2) / 4

After obtaining the average value of the sum of the difference values, the average value is multiplied by a threshold coefficient k to obtain the product value, which is the similarity threshold T5 corresponding to the current video frame M5, as follows:

T5 = k × avg

where T5 is the similarity threshold corresponding to the current video frame M5 and k is the threshold coefficient, which may be preset in the display device according to experience.
Therefore, in the present application, the similarity threshold corresponding to a video frame is not a preset fixed value, but is determined according to the difference values between the current video frame and each of the previous N video frames and their respective preceding frames. That is, the similarity threshold changes dynamically with the input video frames, giving it a certain flexibility and adaptability.
Further, a protection mechanism is added, namely a maximum limit value and a minimum limit value are set for a similarity threshold value corresponding to a video frame, and if the similarity threshold value is smaller than the minimum limit value, the similarity threshold value is adjusted to be the minimum limit value; and if the similarity threshold is larger than the maximum limit value, adjusting the similarity threshold to be the maximum limit value.
Wherein the maximum limit value and the minimum limit value may be preset in the display device according to practical experience. By setting the maximum limit value, the video frames which should be preserved originally can be further prevented from being discarded due to the overlarge similarity threshold value; by setting the minimum limit value, it is further avoided that the video frames which should be discarded originally are retained because the similarity threshold value is too small.
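The protection mechanism can be sketched as a simple clamp; the function name and the limit values used here are illustrative assumptions:

```python
# Hypothetical sketch of the protection mechanism: clamp the computed
# similarity threshold between preset minimum and maximum limit values.
def clamp_threshold(threshold, min_limit, max_limit):
    if threshold < min_limit:
        return min_limit   # avoid keeping frames that should be discarded
    if threshold > max_limit:
        return max_limit   # avoid discarding frames that should be kept
    return threshold

assert clamp_threshold(5, 10, 100) == 10     # too small -> raised to minimum
assert clamp_threshold(250, 10, 100) == 100  # too large -> lowered to maximum
assert clamp_threshold(42, 10, 100) == 42    # in range  -> unchanged
```

The clamp matters most at stream start-up, when few difference values exist and the raw average can be degenerately small.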
In step S503, determining a position where the video frame at the next moment is stored in the Buffer according to a relationship between the difference value and the similarity threshold corresponding to the current video frame, specifically including:
if the difference value is not greater than the similarity threshold value, determining that the current video frame is a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the current Buffer;
and if the difference value is larger than the similarity threshold value, determining that the current video frame is not a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the next Buffer.
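A minimal sketch of this decision rule, assuming a 6-slot circular buffer as in fig. 6 (the function and parameter names are illustrative):

```python
# Hypothetical sketch of step S503: if the difference value does not exceed
# the similarity threshold, the current frame is a repeated frame and the
# next frame overwrites it; otherwise the write position advances, wrapping
# from the tail of the buffer space back to its head.
def next_write_index(current_index, diff_value, threshold, num_buffers=6):
    if diff_value <= threshold:
        return current_index                   # repeat: overwrite current Buffer
    return (current_index + 1) % num_buffers   # new content: advance (wrapping)

assert next_write_index(1, diff_value=3, threshold=10) == 1   # duplicate dropped
assert next_write_index(1, diff_value=30, threshold=10) == 2  # frame kept
assert next_write_index(5, diff_value=30, threshold=10) == 0  # wrap to head
```

The modulo makes Buffer5 → Buffer0 wrap-around fall out of the same expression as the ordinary advance.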
For example, as shown in fig. 8-1, when the current video frame exists in the Buffer1, a difference value between the video frame stored in the Buffer1 and the video frame stored in the Buffer0 is calculated, and the difference value may be a pixel difference value at a corresponding position of the two video frames. Please refer to the foregoing description for a method for calculating a similarity threshold corresponding to a current video frame, which is not described herein again.
If the difference value is not greater than the similarity threshold value, it is determined that the current video frame stored in the Buffer1 is a repeated frame of the video frame stored in the Buffer0, and as shown in fig. 8-2, the video frame at the next moment is stored in the current Buffer, i.e., Buffer1, so as to cover the current video frame stored in the Buffer1 and retain the original video frame stored in the Buffer 0.
If the difference value is greater than the similarity threshold value, it is determined that the current video frame stored in the Buffer1 is not a duplicate of the video frame stored in the Buffer0, and as shown in fig. 8-3, the video frame at the next moment is stored in the next Buffer, i.e., Buffer2, and at this time, both the current video frame stored in the Buffer1 and the video frame stored in the Buffer0 are retained.
In the present application, video frames are written sequentially starting from the head end of the buffer space. When a video frame has been written into the tail end of the buffer space, the next video frame is written into the head end of the buffer space again, overwriting the data originally stored there, and the cycle repeats.

For example, as shown in fig. 6, video frames are written sequentially starting from Buffer0 at the head end of the buffer space. After a video frame has been written into Buffer5 at the tail end of the buffer space, the following video frame is written into Buffer0 at the head end again, overwriting the video data originally stored there; subsequent frames are then written into the following Buffers in sequence, and so on.
For example, as shown in fig. 8-4 and 8-5, the current video frame exists in Buffer5. The difference value between the video frame stored in Buffer5 and the video frame stored in Buffer4 is calculated. When the difference value is greater than the similarity threshold corresponding to the current video frame, the current video frame needs to be retained in Buffer5 and the video frame at the next moment needs to be stored in the next Buffer; since Buffer5 is located at the tail end of the buffer space, the video frame at the next moment is stored at the head end of the buffer space, i.e., Buffer0.
It can be seen from the above embodiments that, when the contents of two video frames are determined to be the same, the video frame at the next moment is written into the current Buffer to overwrite the video frame stored there. The DDR thus always stores the original video frame in which the picture content first appears, and discards later repeated frames with the same content, thereby improving the use efficiency of the DDR in the display device.
Fig. 9 is a schematic flowchart of a method for discarding MEMC repeated frames in a display device according to an embodiment of the present application, which specifically includes the following steps:
S901: allocate, in the DDR in advance, a buffer space for storing a preset number of video frames;

S902: calculate the difference value between the video frame stored in the current Buffer and the video frame stored in the previous Buffer, where the video frame stored in the current Buffer is the current video frame;

S903: calculate the similarity threshold corresponding to the current video frame, where the similarity threshold is used for judging whether the current video frame is a repeated frame of the video frame stored in the previous Buffer;

S904: judge whether the difference value is not greater than the similarity threshold corresponding to the current video frame; if so, execute step S905; if not, execute step S906;

S905: store the video frame at the next moment in the current Buffer;

S906: store the video frame at the next moment in the next Buffer.
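The flow of steps S901-S906 can be sketched end to end as follows. This hypothetical Python illustration keeps only the surviving frames rather than modeling the DDR buffer slots, and all parameter values (N, coefficient, limits) are assumptions:

```python
# Hypothetical end-to-end sketch of the repeated-frame discarding flow:
# compute a frame-to-frame difference, derive a dynamic threshold from the
# recent difference history, clamp it, and drop frames judged as repeats.
def ingest(frames, n=3, coefficient=0.5, min_limit=1.0, max_limit=1000.0):
    """Return the frames that survive repeated-frame discarding."""
    diffs = []   # history of frame-to-frame difference values
    kept = []
    prev = None
    for frame in frames:
        if prev is None:
            kept.append(frame)            # the first frame is always kept
        else:
            # difference value: sum of absolute pixel differences
            diff = sum(abs(a - b) for a, b in zip(frame, prev))
            diffs.append(diff)
            window = diffs[-(n + 1):]     # current frame + up to N predecessors
            threshold = sum(window) / len(window) * coefficient
            threshold = min(max(threshold, min_limit), max_limit)  # protection
            if diff > threshold:
                kept.append(frame)        # new content: keep it
            # else: repeated frame; the next frame overwrites it
        prev = frame
    return kept

a, b, c = b"\x00\x00", b"\x40\x40", b"\x80\x80"
kept = ingest([a, a, b, b, c, c])  # a 30 fps stream duplicated to 60 fps
# kept == [a, b, c]: the duplicated copies are discarded
```

Feeding it a duplicated stream, as a 30-to-60 fps up-converted source would produce, leaves only the original frames, which is the behavior the worked example below traces buffer by buffer.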
The following further illustrates a complete process of the MEMC repeat frame dropping method in the display device shown in fig. 9, with reference to the schematic diagram of the DDR allocated buffer space shown in fig. 6.
Assuming that the frame rate of a video source is 30 fps, with video frames A, B, C, D, E, F …, and the frame rate output by a signal source device such as an intelligent network player is 60 fps, the video frames in the video source need to be duplicated to up-convert the frame rate from 30 fps to 60 fps.
Taking video frame A as an example of this duplication, video frame A is copied into two identical video frames, marked A0 and A1 respectively; by analogy, the video frames output by the player are A0, A1, B0, B1, C0, C1, D0, D1, E0, E1, F0, F1 ……
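The duplication can be illustrated as follows; the A → A0, A1 labeling follows the example above, while the function itself is a hypothetical sketch (the labels name the copies, which are identical in content):

```python
# Hypothetical sketch of the 30 fps -> 60 fps up-conversion: each source
# frame is emitted twice, labeled with suffixes 0 and 1 as in the example.
def duplicate_frames(source):
    out = []
    for frame in source:
        out.extend([frame + "0", frame + "1"])  # e.g. "A" -> "A0", "A1"
    return out

stream = duplicate_frames(["A", "B", "C"])
# stream == ["A0", "A1", "B0", "B1", "C0", "C1"]
```

This duplicated stream is exactly what makes repeated-frame discarding worthwhile: half the frames carry no new picture content.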
Assuming that the buffer space has been allocated in the DDR as shown in fig. 6, when the video frames A0, A1, B0, B1, C0, C1, D0, D1, E0, E1, F0, F1 …… are sequentially input to the DDR, the repeated frame discarding proceeds based on the MEMC flowchart shown in fig. 9, as follows:
as shown in fig. 10-1, the current Buffer is Buffer0, and Buffer0 stores the current video frame as video frame a 0. In the embodiment of the application, the Buffer0 is located at the head end of the Buffer space, the previous Buffer is the Buffer5 located at the tail end of the Buffer space, and at this time, the Buffer5 is empty and has no video frame stored yet, so that the difference between the video frame a0 stored in the Buffer0 and the video frame stored in the Buffer5 is calculated to be larger. Meanwhile, because there is no video frame in the front, the similarity threshold corresponding to the video frame a0 calculated according to the method described above is also small. Assuming that the similarity threshold is smaller than the minimum threshold, and the similarity threshold is adjusted to the minimum threshold, the difference value is still larger than the adjusted similarity threshold, and it is determined that the video frame a1 at the next time is stored into the next Buffer, i.e., Buffer 1.
As shown in fig. 10-2, the current Buffer is Buffer1, and Buffer1 stores the current video frame, video frame A1. In the embodiment of the present application, video frame A0 and video frame A1 are two identical video frames, both copied from video frame A, so the difference value between them is small. Meanwhile, because only video frame A0 precedes it, the similarity threshold corresponding to video frame A1 calculated according to the foregoing method is also small. Assuming that this similarity threshold is smaller than the minimum limit value and is adjusted to the minimum limit value, the difference value is still not greater than the adjusted similarity threshold, so video frame A1 is considered a repeated frame of video frame A0 and needs to be discarded. It is therefore determined that the video frame B0 at the next moment is stored in the current Buffer, i.e., Buffer1, so that the video frame A1 originally stored in Buffer1 is overwritten, achieving the effect of discarding the repeated frame.
As shown in fig. 10-3, the current Buffer is Buffer1, Buffer1 stores the current video frame, video frame B0, and the previous Buffer, Buffer0, stores video frame A0. In the embodiment of the present application, video frame A0 and video frame B0 are obtained from video frame A and video frame B, respectively. Although video frame A and video frame B belong to different video frames, whether the video frame B0 stored in Buffer1 needs to be retained is determined by comparing the difference value between video frame B0 and video frame A0 with the similarity threshold corresponding to video frame B0, and there will be the following two situations:
case 1:
the difference value between video frame A0 and video frame B0 is not greater than the similarity threshold corresponding to video frame B0 (or the adjusted similarity threshold); in this case, video frame B0 is considered a repeated frame of video frame A0 and needs to be discarded. Thus, the video frame B1 at the next moment is written into Buffer1, where video frame B0 is stored, and video frame B0 is completely overwritten by video frame B1. As shown in fig. 10-4, at the next moment, the current Buffer is Buffer1, the video frame stored in Buffer1 is video frame B1, and the video frame stored in the previous Buffer, Buffer0, is video frame A0.
Case 2:
The difference value between video frame A0 and video frame B0 is greater than the similarity threshold corresponding to video frame B0 (or the adjusted similarity threshold). In this case, video frame B0 is considered a new, i.e., original, video frame and needs to be retained. Therefore, video frame B1 at the next moment is written into the next Buffer, Buffer2, while video frame B0 remains stored in Buffer1. As shown in fig. 10-5, at the next moment the current Buffer is Buffer2, the video frame stored in Buffer2 is video frame B1, and the video frame stored in the previous Buffer, Buffer1, is video frame B0.
Whether the other video frames in this example, such as video frames C0 and C1, need to be retained is determined in the same way as described above and is not repeated here.
The MEMC repeated frame discarding method provided by the embodiment of the present application calculates the difference value between the video frame stored in the current Buffer and the video frame stored in the previous Buffer, and calculates the similarity threshold corresponding to the current video frame, the similarity threshold being used to judge whether the current video frame is a repeated frame of the video frame stored in the previous Buffer. When the difference value is not greater than the similarity threshold, the current video frame is considered a repeated frame, and the video frame at the next moment is stored in the current Buffer. In this way, repeated frames whose picture content is identical to that of a previously stored video frame can be discarded, improving the usage efficiency of the DDR.
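The buffer-placement decision summarized above can be sketched in Python. This is a minimal illustration under stated assumptions: the function name, the ring-buffer indexing, and the scalar `diff` metric are illustrative choices, not the patent's actual implementation.

```python
def decide_next_buffer(diff, similarity_threshold, current_index, num_buffers):
    """Return the Buffer index where the video frame at the next moment is stored.

    diff: difference value between the current frame and the frame in the
          previous Buffer (any scalar per-frame difference metric).
    similarity_threshold: threshold for the current frame (already clamped).
    current_index: index of the current Buffer.
    num_buffers: total number of Buffers, treated here as a ring.
    """
    if diff <= similarity_threshold:
        # Repeated frame: reuse the current Buffer so the next frame
        # overwrites the duplicate, achieving the drop.
        return current_index
    # Original frame: keep it and advance to the next Buffer.
    return (current_index + 1) % num_buffers
```

For example, with a threshold of 5.0 and the current Buffer at index 1, a difference of 3.0 keeps the write position at Buffer1 (the duplicate is overwritten), while a difference of 9.0 advances it to Buffer2.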
Based on the same inventive concept, an embodiment of the present application further provides an image processing apparatus 1100, as shown in fig. 11, including:
a first calculating module 1101, configured to calculate a difference value between a video frame stored in a current Buffer and a video frame stored in a previous Buffer, where the video frame stored in the current Buffer is the current video frame;
a second calculating module 1102, configured to calculate a similarity threshold corresponding to the current video frame, where the similarity threshold is used to determine whether the current video frame is a repeated frame of the video frame stored in the previous Buffer;
a determining module 1103, configured to determine, according to a relationship between the difference value and the similarity threshold corresponding to the current video frame, a position where the video frame at the next time is stored in the Buffer.
The determining module 1103 is configured to determine the position where the video frame at the next moment is stored in the Buffer according to the relationship between the difference value and the similarity threshold corresponding to the current video frame, specifically:
if the difference value is not greater than the similarity threshold value, determining that the current video frame is a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the current Buffer;
and if the difference value is larger than the similarity threshold value, determining that the current video frame is not a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the next Buffer.
The second calculating module 1102 is configured to calculate the similarity threshold corresponding to the current video frame as follows:
calculating the average of the difference values of the current video frame and each of the previous N video frames, each difference value being measured against the video frame at its previous moment, wherein the current video frame and the previous N video frames are temporally consecutive;
multiplying the average value by a threshold coefficient, the resulting product being the similarity threshold corresponding to the current video frame;
the similarity threshold corresponding to a video frame has a maximum limit value and a minimum limit value: if the similarity threshold is smaller than the minimum limit value, it is adjusted to the minimum limit value; if the similarity threshold is larger than the maximum limit value, it is adjusted to the maximum limit value.
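The threshold computation and clamping just described can be sketched as follows. The function and parameter names are illustrative assumptions; `recent_diffs` stands for the difference values of the current frame and its previous N frames, each measured against the frame at its previous moment.

```python
def similarity_threshold(recent_diffs, coeff, min_limit, max_limit):
    """Compute the similarity threshold for the current video frame.

    recent_diffs: N+1 temporally consecutive difference values (current
                  frame plus its previous N frames, each vs. the frame
                  at its previous moment).
    coeff: threshold coefficient applied to the average.
    min_limit, max_limit: limit values the threshold is clamped to.
    """
    avg = sum(recent_diffs) / len(recent_diffs)
    threshold = avg * coeff
    # Clamp to [min_limit, max_limit] so a run of near-identical frames
    # cannot drive the threshold to zero, and a scene cut cannot inflate it.
    return max(min_limit, min(threshold, max_limit))
```

With difference values [4.0, 6.0, 8.0] and a coefficient of 0.5, the average is 6.0 and the threshold is 3.0; very small or very large averages are pinned to the configured limits.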
Based on the same inventive concept, an embodiment of the present application further provides a chip, where the chip includes a processor configured to call and run a computer program from a memory, so that a display device in which the chip is installed performs the steps of any of the MEMC repeated frame discarding methods described above.
Based on the same inventive concept, an embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the steps of any of the MEMC repeated frame discarding methods.
Based on the same inventive concept, the present application further provides a computer program product comprising a computer program which, when executed by a processor, implements the steps of any of the MEMC repetitive frame dropping methods described above.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

1. A display device, comprising:
a display for displaying a video frame;
a controller coupled with the display and configured to:
calculating a difference value between a video frame stored in a current Buffer and a video frame stored in a previous Buffer, wherein the video frame stored in the current Buffer is the current video frame;
calculating a similarity threshold corresponding to the current video frame, wherein the similarity threshold is used for judging whether the current video frame is a repeated frame of the video frame stored in the previous Buffer;
and determining the position of the video frame stored in the Buffer at the next moment according to the relation between the difference value and the similarity threshold value corresponding to the current video frame.
2. The display device according to claim 1, wherein determining a position where a video frame at a next moment is stored in a Buffer according to a relationship between the difference value and a similarity threshold corresponding to the current video frame specifically includes:
if the difference value is not greater than the similarity threshold value, determining that the current video frame is a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the current Buffer;
and if the difference value is larger than the similarity threshold value, determining that the current video frame is not a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the next Buffer.
3. The display device according to claim 1, wherein the similarity threshold corresponding to the current video frame is calculated as follows:
calculating the average of the difference values of the current video frame and each of the previous N video frames, each difference value being measured against the video frame at its previous moment, wherein the current video frame and the previous N video frames are temporally consecutive;
and multiplying the average value by a threshold coefficient to obtain a product value which is the similarity threshold corresponding to the current video frame.
4. The display device according to claim 1, wherein the similarity threshold corresponding to the video frame has a maximum limit value and a minimum limit value, and if the similarity threshold is smaller than the minimum limit value, the similarity threshold is adjusted to the minimum limit value; if the similarity threshold is larger than a maximum limit value, the similarity threshold is adjusted to be the maximum limit value.
5. An MEMC duplicate frame dropping method, comprising:
calculating a difference value between a video frame stored in a current Buffer and a video frame stored in a previous Buffer, wherein the video frame stored in the current Buffer is the current video frame;
calculating a similarity threshold corresponding to the current video frame, wherein the similarity threshold is used for judging whether the current video frame is a repeated frame of the video frame stored in the previous Buffer;
and determining the position of the video frame stored in the Buffer at the next moment according to the relation between the difference value and the similarity threshold value corresponding to the current video frame.
6. The method according to claim 5, wherein determining a position where the video frame at the next time is stored in the Buffer according to a relationship between the difference value and the similarity threshold corresponding to the current video frame specifically comprises:
if the difference value is not greater than the similarity threshold value, determining that the current video frame is a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the current Buffer;
and if the difference value is larger than the similarity threshold value, determining that the current video frame is not a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the next Buffer.
7. The method according to claim 5, wherein the similarity threshold corresponding to the current video frame is calculated as follows:
calculating the average of the difference values of the current video frame and each of the previous N video frames, each difference value being measured against the video frame at its previous moment, wherein the current video frame and the previous N video frames are temporally consecutive;
multiplying the average value by a threshold coefficient, the resulting product being the similarity threshold corresponding to the current video frame;
wherein the similarity threshold corresponding to a video frame has a maximum limit value and a minimum limit value: if the similarity threshold is smaller than the minimum limit value, it is adjusted to the minimum limit value; if the similarity threshold is larger than the maximum limit value, it is adjusted to the maximum limit value.
8. An image processing apparatus characterized by comprising:
the first calculation module is used for calculating a difference value between a video frame stored in a current Buffer and a video frame stored in a previous Buffer, wherein the video frame stored in the current Buffer is the current video frame;
a second calculating module, configured to calculate a similarity threshold corresponding to the current video frame, where the similarity threshold is used to determine whether the current video frame is a repeated frame of the video frame stored in the previous Buffer;
and the determining module is used for determining the position of the video frame stored in the Buffer at the next moment according to the relation between the difference value and the similarity threshold corresponding to the current video frame.
9. A chip, comprising: a processor for calling and running a computer program from a memory so that a display device on which the chip is installed performs the method of any one of claims 5-7.
10. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium, which computer program, when being executed by a processor, carries out the method steps of any one of claims 5-7.
CN202111116545.5A 2021-09-23 2021-09-23 Display device and MEMC repeated frame discarding method Active CN113923514B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111116545.5A CN113923514B (en) 2021-09-23 2021-09-23 Display device and MEMC repeated frame discarding method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111116545.5A CN113923514B (en) 2021-09-23 2021-09-23 Display device and MEMC repeated frame discarding method

Publications (2)

Publication Number Publication Date
CN113923514A true CN113923514A (en) 2022-01-11
CN113923514B CN113923514B (en) 2024-03-01

Family

ID=79235916

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111116545.5A Active CN113923514B (en) 2021-09-23 2021-09-23 Display device and MEMC repeated frame discarding method

Country Status (1)

Country Link
CN (1) CN113923514B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116320536A (en) * 2023-05-16 2023-06-23 瀚博半导体(上海)有限公司 Video processing method, device, computer equipment and computer readable storage medium

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010154254A (en) * 2008-12-25 2010-07-08 Kyocera Corp Composite image creating apparatus
US9491398B1 (en) * 2010-12-21 2016-11-08 Pixelworks, Inc. System and method for processing assorted video signals
CN106683086A (en) * 2016-12-23 2017-05-17 深圳市大唐盛世智能科技有限公司 Background modeling method and device for intelligent video monitoring
CN106951346A (en) * 2016-01-06 2017-07-14 阿里巴巴集团控股有限公司 The method of testing and device of a kind of response time
CN108540822A (en) * 2018-04-04 2018-09-14 南京信安融慧网络技术有限公司 A kind of key frame of video extraction acceleration system and its extracting method based on OpenCL
US10116989B1 (en) * 2016-09-12 2018-10-30 Twitch Interactive, Inc. Buffer reduction using frame dropping
CN110312095A (en) * 2018-03-20 2019-10-08 瑞昱半导体股份有限公司 Image processor and image treatment method
CN111447488A (en) * 2020-04-01 2020-07-24 青岛海信传媒网络技术有限公司 MEMC control method and display device
CN111738321A (en) * 2020-06-12 2020-10-02 腾讯音乐娱乐科技(深圳)有限公司 Data processing method, device, terminal equipment and storage medium
CN112085097A (en) * 2020-09-09 2020-12-15 北京市商汤科技开发有限公司 Image processing method and device, electronic equipment and storage medium
CN112866801A (en) * 2021-03-11 2021-05-28 北京小米移动软件有限公司 Video cover determining method and device, electronic equipment and storage medium
CN113032295A (en) * 2021-02-25 2021-06-25 西安电子科技大学 Data packet second-level caching method, system and application
US20210256713A1 (en) * 2020-02-13 2021-08-19 Canon Kabushiki Kaisha Image processing apparatus and image processing method


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116320536A (en) * 2023-05-16 2023-06-23 瀚博半导体(上海)有限公司 Video processing method, device, computer equipment and computer readable storage medium
CN116320536B (en) * 2023-05-16 2023-08-18 瀚博半导体(上海)有限公司 Video processing method, device, computer equipment and computer readable storage medium

Also Published As

Publication number Publication date
CN113923514B (en) 2024-03-01

Similar Documents

Publication Publication Date Title
US9569159B2 (en) Apparatus, systems and methods for presenting displayed image information of a mobile media device on a large display and control of the mobile media device therefrom
US8811797B2 (en) Switching between time order and popularity order sending of video segments
WO2021179359A1 (en) Display device and display picture rotation adaptation method
WO2023104102A1 (en) Live broadcasting comment presentation method and apparatus, and device, program product and medium
CN112565868B (en) Video playing method and device and electronic equipment
WO2021212463A1 (en) Display device and screen projection method
CN111897478A (en) Page display method and display equipment
CN111836104B (en) Display apparatus and display method
CN112073798B (en) Data transmission method and equipment
CN114157889A (en) Display device and touch-control assistance interaction method
CN113923514B (en) Display device and MEMC repeated frame discarding method
CN113630569B (en) Display apparatus and control method of display apparatus
CN112506859B (en) Method for maintaining hard disk data and display device
WO2021179361A1 (en) Display apparatus
CN112269668A (en) Application resource sharing and display equipment
CN115250357B (en) Terminal device, video processing method and electronic device
WO2022193475A1 (en) Display device, method for receiving screen projection content, and screen projection method
CN115145482A (en) Parameter configuration system, method, reference monitor and medium
WO2021180223A1 (en) Display method and display device
WO2021008137A1 (en) Display device and video picture scaling method
CN115185392A (en) Display device, image processing method and device
CN112367550A (en) Method for realizing multi-title dynamic display of media asset list and display equipment
CN115643454A (en) Display device, video playing method and device thereof
CN112218156A (en) Method for adjusting video dynamic contrast and display equipment
CN111726555B (en) Display device, motion estimation method and video processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant