CN113923514B - Display device and MEMC repeated frame discarding method - Google Patents

Display device and MEMC repeated frame discarding method

Info

Publication number
CN113923514B
CN113923514B (grant publication of application CN202111116545.5A)
Authority
CN
China
Prior art keywords
video frame
similarity threshold
buffer
current
stored
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111116545.5A
Other languages
Chinese (zh)
Other versions
CN113923514A (en)
Inventor
徐赛杰
晏飞
李锋
余横
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Xinxin Microelectronics Technology Co Ltd
Original Assignee
Qingdao Xinxin Microelectronics Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Xinxin Microelectronics Technology Co Ltd filed Critical Qingdao Xinxin Microelectronics Technology Co Ltd
Priority to CN202111116545.5A priority Critical patent/CN113923514B/en
Publication of CN113923514A publication Critical patent/CN113923514A/en
Application granted granted Critical
Publication of CN113923514B publication Critical patent/CN113923514B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440281Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/14Picture signal circuitry for video frequency region
    • H04N5/144Movement detection
    • H04N5/145Movement estimation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0127Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter

Abstract

The application provides a display device and a MEMC repeated frame discarding method. A difference value between the video frame stored in the current Buffer and the video frame stored in the previous Buffer is calculated, along with a similarity threshold corresponding to the current video frame; the similarity threshold is used to determine whether the current video frame is a repeat of the video frame stored in the previous Buffer. When the difference value is not greater than the similarity threshold, the current video frame is regarded as a repeated frame, and the video frame at the next moment is stored in the current Buffer, overwriting it. In this way, repeated frames whose picture content is identical to a previously stored video frame are discarded, and the use efficiency of the DDR in the display device is improved.

Description

Display device and MEMC repeated frame discarding method
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a display device and a method for discarding a MEMC repeated frame.
Background
The display device offers a rich set of input signal interfaces; for example, video data from a signal source device such as an intelligent network player or a DVD player is acquired through an HDMI interface. Currently, mainstream movies have a frame rate of 24fps, television programs 25fps, and some film sources run at only 10fps. When playing film sources with different frame rates, signal source devices such as intelligent network players usually output at a fixed frame rate of 60fps. This requires copying and repeating the video frames of the original film source to raise the frame rate to 60fps for output.
Motion estimation and motion compensation (MEMC) is a technique widely used in frame rate conversion. By estimating the motion trajectory of objects across consecutive images and combining the image data with the obtained motion vectors, intermediate images are interpolated, raising the video frame rate and alleviating problems such as judder and smearing during video playback.
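As a rough illustration of the interpolation step described above, the sketch below builds the pixel row halfway in time between two frames from a single global motion vector. This is a toy example under stated assumptions: a real MEMC pipeline estimates per-block motion vectors and handles occlusion, which this one-dimensional, single-vector sketch does not attempt.

```python
def interpolate_midframe(prev_row, next_row, motion_dx):
    """Build the pixel row halfway in time between prev_row and next_row:
    shift prev_row forward by half the motion, shift next_row backward by
    the remaining half, then average the two shifted rows."""
    n = len(prev_row)
    half = motion_dx // 2
    rest = motion_dx - half
    fwd = [prev_row[(i - half) % n] for i in range(n)]   # prev moved forward
    bwd = [next_row[(i + rest) % n] for i in range(n)]   # next moved backward
    return [(a + b) // 2 for a, b in zip(fwd, bwd)]
```

For an object that moves two pixels between frames, the interpolated frame places it one pixel along, which is the smoother intermediate image MEMC inserts.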
When the MEMC performs motion estimation, multiple original video frames need to be fetched from the DDR of the display device. However, if the video frames output by signal source devices such as intelligent network players are stored in the DDR without distinction, the DDR will hold many repeated video frames, so the DDR is not utilized efficiently.
Disclosure of Invention
To solve the problems in the prior art, the application provides a display device and a MEMC repeated frame discarding method, which first calculate a difference value between the video frame stored in the current Buffer and the video frame stored in the previous Buffer, then calculate a similarity threshold corresponding to the video frame stored in the current Buffer, and determine, by comparing the calculated difference value with the similarity threshold, whether the video frame at the next moment is stored in the current Buffer or in the next Buffer. This solves the prior-art problem that video frames output by signal source devices such as intelligent network players are all stored in the DDR of the display device without distinction, resulting in inefficient use of the DDR.
In a first aspect, embodiments of the present application provide a display device, including a display and a controller, wherein the display is configured to display a video frame; a controller is coupled with the display and configured to:
calculating a difference value between a video frame stored in a current Buffer and a video frame stored in a previous Buffer, wherein the video frame stored in the current Buffer is a current video frame;
calculating a similarity threshold corresponding to the current video frame, wherein the similarity threshold is used for judging whether the current video frame is a repeated frame of the video frame stored in the previous Buffer;
and determining in which Buffer the video frame at the next moment is stored according to the relationship between the difference value and the similarity threshold corresponding to the current video frame.
In a possible manner, determining in which Buffer the video frame at the next moment is stored according to the relationship between the difference value and the similarity threshold corresponding to the current video frame specifically includes:
if the difference value is not greater than the similarity threshold, determining that the current video frame is a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the current Buffer;
and if the difference value is larger than the similarity threshold value, determining that the current video frame is not a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the next Buffer.
In a possible manner, the similarity threshold corresponding to the current video frame is calculated as follows:
calculating the average of the difference values between each of the current video frame and its previous N video frames and the video frame stored at the immediately preceding moment, wherein the current video frame and its previous N video frames are consecutive in time;
and multiplying the average value by a threshold coefficient, the resulting product being the similarity threshold corresponding to the current video frame.
In a feasible manner, a maximum limit value and a minimum limit value are set for the similarity threshold corresponding to the video frame: if the similarity threshold is smaller than the minimum limit value, the similarity threshold is adjusted to the minimum limit value; and if the similarity threshold is larger than the maximum limit value, the similarity threshold is adjusted to the maximum limit value.
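The decision procedure described in the first aspect can be sketched as follows. This is a minimal illustration, not the patented implementation: the frame representation (flat pixel lists), the sum-of-absolute-differences metric, the window size N, the threshold coefficient, and the limit values are all assumptions chosen for the example.

```python
def similarity_threshold(recent_diffs, coeff, min_thr, max_thr):
    """Average the recent frame-difference values, scale by the threshold
    coefficient, then clamp to [min_thr, max_thr]."""
    avg = sum(recent_diffs) / len(recent_diffs)
    return min(max(avg * coeff, min_thr), max_thr)


class RepeatFrameFilter:
    """Decides whether the video frame at the next moment overwrites the
    current Buffer (current frame is a repeat) or goes to the next Buffer."""

    def __init__(self, num_buffers, coeff, min_thr, max_thr, window):
        self.buffers = [None] * num_buffers
        self.cur = 0                      # index of the current Buffer
        self.coeff = coeff
        self.min_thr = min_thr
        self.max_thr = max_thr
        self.window = window              # N: how many past diffs to average
        self.recent_diffs = []

    def store(self, frame):
        """Store `frame` (a flat list of pixel values) in the current Buffer.
        Returns True if the frame was kept as new content, False if it was
        flagged as a repeat (the next frame will overwrite it)."""
        self.buffers[self.cur] = frame
        prev = self.buffers[(self.cur - 1) % len(self.buffers)]
        if prev is None:                  # first frame: nothing to compare with
            self.cur = (self.cur + 1) % len(self.buffers)
            return True
        # Difference value: sum of absolute pixel differences (an assumption;
        # the patent does not fix the metric).
        diff = sum(abs(a - b) for a, b in zip(frame, prev))
        self.recent_diffs = (self.recent_diffs + [diff])[-self.window:]
        thr = similarity_threshold(self.recent_diffs, self.coeff,
                                   self.min_thr, self.max_thr)
        if diff <= thr:
            return False                  # repeat: cur kept, so it is overwritten
        self.cur = (self.cur + 1) % len(self.buffers)
        return True
```

Feeding in two identical frames followed by a changed frame shows the behaviour: the duplicate is flagged as a repeat and its Buffer slot is reused, while the changed frame advances to the next Buffer.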
According to the display device provided by the embodiment of the application, the difference value between the video frame stored in the current Buffer and the video frame stored in the previous Buffer is calculated, along with a similarity threshold corresponding to the current video frame; the similarity threshold is used to determine whether the current video frame is a repeat of the video frame stored in the previous Buffer. When the difference value is not greater than the similarity threshold, the current video frame is regarded as a repeated frame, and the video frame at the next moment is stored in the current Buffer. In this way, repeated frames whose picture content is identical to a previously stored video frame are discarded, and the use efficiency of the DDR in the display device is improved.
In a second aspect, an embodiment of the present application provides a method for discarding a MEMC repeated frame, including:
calculating a difference value between a video frame stored in a current Buffer and a video frame stored in a previous Buffer, wherein the video frame stored in the current Buffer is a current video frame;
calculating a similarity threshold corresponding to the current video frame, wherein the similarity threshold is used for judging whether the current video frame is a repeated frame of the video frame stored in the previous Buffer;
and determining in which Buffer the video frame at the next moment is stored according to the relationship between the difference value and the similarity threshold corresponding to the current video frame.
In a possible manner, determining in which Buffer the video frame at the next moment is stored according to the relationship between the difference value and the similarity threshold corresponding to the current video frame specifically includes:
if the difference value is not greater than the similarity threshold, determining that the current video frame is a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the current Buffer;
and if the difference value is larger than the similarity threshold value, determining that the current video frame is not a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the next Buffer.
In a possible manner, the similarity threshold corresponding to the current video frame is calculated as follows:
calculating the average of the difference values between each of the current video frame and its previous N video frames and the video frame stored at the immediately preceding moment, wherein the current video frame and its previous N video frames are consecutive in time;
multiplying the average value by a threshold coefficient, the resulting product being the similarity threshold corresponding to the current video frame;
wherein a maximum limit value and a minimum limit value are set for the similarity threshold corresponding to the video frame: if the similarity threshold is smaller than the minimum limit value, the similarity threshold is adjusted to the minimum limit value; and if the similarity threshold is larger than the maximum limit value, the similarity threshold is adjusted to the maximum limit value.
In a third aspect, an embodiment of the present application provides an image processing apparatus, including:
a first calculation module, configured to calculate the difference value between the video frame stored in the current Buffer and the video frame stored in the previous Buffer, wherein the video frame stored in the current Buffer is the current video frame;
a second calculation module, configured to calculate a similarity threshold corresponding to the current video frame, wherein the similarity threshold is used for judging whether the current video frame is a repeated frame of the video frame stored in the previous Buffer;
and a determining module, configured to determine in which Buffer the video frame at the next moment is stored according to the relationship between the difference value and the similarity threshold corresponding to the current video frame.
In a possible manner, the determining module determines in which Buffer the video frame at the next moment is stored according to the relationship between the difference value and the similarity threshold corresponding to the current video frame, specifically as follows:
if the difference value is not greater than the similarity threshold, determining that the current video frame is a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the current Buffer;
and if the difference value is larger than the similarity threshold value, determining that the current video frame is not a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the next Buffer.
In a possible manner, the second calculation module calculates the similarity threshold corresponding to the current video frame as follows:
calculating the average of the difference values between each of the current video frame and its previous N video frames and the video frame stored at the immediately preceding moment, wherein the current video frame and its previous N video frames are consecutive in time;
multiplying the average value by a threshold coefficient, the resulting product being the similarity threshold corresponding to the current video frame;
and setting a maximum limit value and a minimum limit value for the similarity threshold corresponding to the video frame: if the similarity threshold is smaller than the minimum limit value, the similarity threshold is adjusted to the minimum limit value; and if the similarity threshold is larger than the maximum limit value, the similarity threshold is adjusted to the maximum limit value.
In a fourth aspect, embodiments of the present application further provide a chip, including: a processor for calling and running a computer program from a memory, causing a display device on which the chip is mounted to perform the method of any of the second aspects above.
In a fifth aspect, embodiments of the present application also provide a computer readable storage medium having a computer program stored therein, which when executed by a processor implements the method of any of the second aspects above.
In a sixth aspect, embodiments of the present application also provide a computer program product comprising a computer program which, when executed by a processor, implements the method of any of the second aspects described above.
For the technical effects of any implementation of the second through sixth aspects provided in the present application, reference may be made to the technical effects of the corresponding implementations of the first aspect, which are not repeated here.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application. The objectives and other advantages of the application will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the following description will briefly explain the embodiments or the prior art, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained according to these drawings without inventive effort to a person skilled in the art.
Fig. 1 is a schematic view of an application scenario of a display device in an embodiment of the present application;
fig. 2 is a block diagram illustrating a configuration of a control apparatus 100 according to an embodiment of the present application;
fig. 3-1 is a hardware configuration block diagram of a display device 200 in the embodiment of the present application;
FIG. 3-2 is a block diagram of the hardware configuration of video processor 260-1 in an embodiment of the present application;
FIGS. 3-3 are block diagrams of hardware configuration of the RAM213 in an embodiment of the present application;
fig. 4 is a schematic functional configuration diagram of a display device 200 according to an embodiment of the present application;
fig. 5 is a flow chart of a method for discarding a MEMC repeated frame in an embodiment of the present application;
FIG. 6 is a schematic diagram illustrating pre-allocation of buffer space in DDR according to an embodiment of the present application;
fig. 7 is a schematic diagram of a video frame group output by a player according to an embodiment of the present application;
FIG. 8-1 is a diagram illustrating a current Buffer being Buffer1 according to an embodiment of the present application;
FIG. 8-2 is one example diagram of a video frame memory location at a next time of FIG. 8-1 in an embodiment of the present application;
FIG. 8-3 is a second exemplary diagram of a video frame memory location at a next time of FIG. 8-1 in an embodiment of the present application;
FIGS. 8-4 are diagrams illustrating a current Buffer being Buffer5 in an embodiment of the present application;
FIGS. 8-5 are exemplary diagrams of video frame memory locations at a next time of FIGS. 8-4 in accordance with embodiments of the present application;
Fig. 9 is a flow chart of a method for discarding a MEMC repeated frame in the embodiment of the application;
FIG. 10-1 is a schematic diagram illustrating a storage location of a video frame A0 according to an embodiment of the present application;
FIG. 10-2 is a schematic diagram illustrating a storage location of a video frame A1 according to an embodiment of the present application;
fig. 10-3 are schematic diagrams illustrating storage locations of a video frame B0 according to an embodiment of the present application;
FIGS. 10-4 are schematic diagrams illustrating storage locations of video frame B1 according to embodiments of the present application;
FIGS. 10-5 are diagrams illustrating the second storage location of the video frame B1 according to the embodiments of the present application;
fig. 11 is a schematic structural diagram of an image processing module according to an embodiment of the present application.
Detailed Description
For clarity and completeness, the following describes exemplary implementations of the present application with reference to the accompanying drawings, in which exemplary implementations of the application are illustrated. It is apparent that the described exemplary implementations are only some, but not all, of the embodiments of the present application.
It should be noted that the brief description of the terms in the present application is only for convenience in understanding the embodiments described below, and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
The terms "first," "second," "third," and the like in the description, in the claims, and in the above-described figures are used to distinguish between similar objects or entities and do not necessarily imply a particular order or sequence, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements explicitly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware or/and software code that is capable of performing the function associated with that element.
Many display devices on the market are equipped with a MEMC function to reduce the "smear" or "blur" of moving subjects in sports events or fast-motion footage such as ball games. When the MEMC performs motion estimation, it acquires multiple original video frames from the DDR of the display device, calculates the motion direction of the picture content from the relationship between preceding and following original frames, and then combines the image data with the obtained motion vectors to interpolate an intermediate image, making the whole motion appear smoother.
The video frames stored in the DDR are output to the display device by a signal source device such as an intelligent network player. As described in the background, such devices typically output to the display device at a fixed frame rate of 60fps in order to accommodate film sources of various frame rates. For example, current mainstream movies run at 24fps and television programs at 25fps, so the signal source device usually needs to copy and repeat the video frames of the original film source to achieve 60fps output.
If the video frames output to the display device by a signal source device such as an intelligent network player are stored in the DDR without distinction, the DDR will contain many repeated frames and its use efficiency will be low. For example, suppose the MEMC needs 3 original video frames for motion estimation. If the video frames written into the DDR contain no repeated frames, a buffer space of 3 video frames suffices to store the 3 original frames. If the input contains repeated frames, assume a 30fps film source copied up to 60fps, so that half of the video frames are repeats; if all frames are written into the DDR indiscriminately, the DDR must allocate buffer space for 6 video frames to hold 3 distinct original frames. It can be seen that, in the prior art, storing all video frames output by the player into the DDR of the display device without distinction leads to inefficient DDR use.
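The buffer arithmetic in the example above (3 original frames needing 6 Buffer slots when half the input frames are repeats) can be stated as a small helper. This is illustrative only; the integer repeat factor is an assumption that holds for the 30fps-to-60fps case but not for sources like 24fps, where the repetition pattern is uneven.

```python
def buffers_needed(memc_frames, source_fps, output_fps, drop_repeats):
    """Buffer slots the DDR must allocate so that `memc_frames` distinct
    original frames are available for motion estimation."""
    # Each original frame appears repeat_factor times in the output stream.
    repeat_factor = output_fps // source_fps      # e.g. 60 // 30 == 2
    if drop_repeats:
        return memc_frames                        # repeats are discarded
    return memc_frames * repeat_factor            # repeats stored as well
```

With repeat discarding, the DDR requirement stays at the number of frames the MEMC actually consumes, regardless of how much the source was duplicated.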
To address the problems in the prior art, the embodiments of the application provide a display device and a MEMC repeated frame discarding method. The difference value between the video frame stored in the current Buffer and the video frame stored in the previous Buffer is calculated, along with a similarity threshold corresponding to the current video frame, which is used to determine whether the current video frame is a repeat of the video frame stored in the previous Buffer. When the difference value is not greater than the similarity threshold, the current video frame is regarded as a repeated frame, and the video frame at the next moment is stored in the current Buffer so as to discard the repeated content. As a result, the DDR in the display device always stores the first occurrence of each picture content as an original video frame, improving the use efficiency of the DDR.
Fig. 1 is a schematic diagram of an application scenario of a display device in an embodiment of the present application. As shown in fig. 1, a user may operate the display apparatus 200 through the mobile terminal 300 and the control device 100.
The control device 100 may control the display apparatus 200 wirelessly or by other wired means, using a remote controller with infrared protocol communication, Bluetooth protocol communication, or other short-distance communication. The user may control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, etc. For example, the user can input corresponding control instructions through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power key, etc. on the remote controller to control the functions of the display device 200.
In some embodiments, mobile terminals, tablet computers, notebook computers, and other smart devices may also be used to control the display device 200. For example, the display device 200 may be controlled using an application running on a smart device. Through configuration, the application can provide the user with various controls in an intuitive user interface (UI) on the screen associated with the smart device.
By way of example, the mobile terminal 300 and the display device 200 may each install a software application, implementing connection and communication between them through a network communication protocol, for the purpose of one-to-one control operation and data communication. For example, a control command protocol may be established between the mobile terminal 300 and the display device 200, the remote-control keyboard may be synchronized to the mobile terminal 300, and the display device 200 may be controlled by operating the user interface on the mobile terminal 300. The audio and video content displayed on the mobile terminal 300 may also be transmitted to the display device 200 to implement a synchronous display function.
As also shown in fig. 1, the display device 200 is in data communication with the server 400 via a variety of communication means. The display device 200 may be allowed to make communication connections via a local area network (LAN), a wireless local area network (WLAN), and other networks. The server 400 may provide various contents and interactions to the display device 200. By way of example, the display device 200 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information and through electronic program guide (EPG) interactions. The server 400 may be one group or multiple groups, and may be one or more types of servers. Other web service content such as video on demand and advertising services is provided through the server 400.
The display device 200 may be a smart television, a smart phone, etc. The particular smart product type, device model, etc. are not limited, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
In addition to the broadcast receiving television function, the display device 200 may provide an intelligent network television function with computer support, including web TV, smart TV, Internet Protocol TV (IPTV), and the like.
Fig. 2 is a block diagram illustrating a configuration of the control apparatus 100 in the embodiment of the present application. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory 190, and a power supply 180. The control apparatus 100 may receive an input operation instruction of a user and convert the operation instruction into an instruction recognizable and responsive to the display device 200, and function as an interaction between the user and the display device 200.
Fig. 3-1 is a block diagram of a hardware configuration of a display device 200 in an embodiment of the present application. As shown in fig. 3-1, the display device 200 includes a controller 210, a modem 220, a communication interface 230, a detector 240, an input/output interface 250, a video processor 260-1, an audio processor 260-2, a display 280, an audio output 270, a memory 290, a power supply, and a user input interface.
The display 280 is configured to receive image signals from the video processor 260-1 and to display video content and images as well as components of the menu manipulation interface. The display 280 includes a display screen assembly for presenting pictures and a drive assembly for driving image display. The displayed video content may come from broadcast television content, or from various broadcast signals received via wired or wireless communication protocols; alternatively, various image contents sent from a network server via a network communication protocol may be displayed.
Meanwhile, the display 280 also displays a user manipulation UI interface that is generated in the display device 200 and used to control the display device 200.
Depending on the type of the display 280, a corresponding drive assembly for driving the display is included. If the display 280 is a projection display, a projection device and a projection screen may also be included.
The communication interface 230 is a component for communicating with an external device or an external server according to various communication protocol types. For example: the communication interface 230 may be a Wifi module 231, a bluetooth communication protocol module 232, a wired ethernet communication protocol module 233, or other network communication protocol modules or near field communication protocol modules, and an infrared receiver (not shown in the figure).
The detector 240 is a component used by the display device 200 to collect signals from the external environment or to interact with the outside. The detector 240 includes a light receiver 242, a sensor for collecting the intensity of ambient light, so that display parameters can adapt to changes in ambient light; alternatively, the detector 240 includes an image collector 241, such as a camera, which may be used to collect external environment scenes, user attributes, or user interaction gestures; alternatively, the detector 240 includes a sound collector, such as a microphone, for receiving external sounds.
The input/output interface 250 is used by the controller 210 for data transmission between the display device 200 and other external devices, for example receiving video signals, audio signals, or command instructions from an external device.
The modem 220 receives broadcast television signals in a wired or wireless manner, performs modulation and demodulation processing such as amplification, mixing, and resonance, and demodulates, from among a plurality of wireless or wired broadcast television signals, the television audio/video signals and the EPG data signals carried on the television channel frequency selected by the user.
The modem 220 responds, under the control of the controller 210, to the television signal frequency selected by the user and the television signal carried on that frequency.
The video processor 260-1 is configured to receive an external video signal, perform video processing such as decompression, decoding, scaling, noise reduction, motion estimation and motion compensation, resolution conversion, and image composition according to a standard codec protocol of an input signal, and obtain a signal that can be displayed or played on the display device 200.
Fig. 3-2 is a hardware configuration block diagram of the video processor 260-1. As shown in fig. 3-2, the video processor 260-1 includes a demultiplexing module 260-11, a video decoding module 260-12, an image synthesizing module 260-13, a MEMC module 260-14, a display formatting module 260-15, and the like.
The audio processor 260-2 is configured to receive an external audio signal, decompress and decode the external audio signal according to a standard codec protocol of an input signal, and perform noise reduction, digital-to-analog conversion, amplification processing, and the like, to obtain a sound signal that can be played in a speaker.
The audio output 270 receives, under the control of the controller 210, the sound signal output by the audio processor 260-2. It includes the speaker 272 carried by the display device 200 itself, as well as an external sound output terminal 274 that can output to a sound-producing device of an external device, such as an external sound interface or an earphone interface.
The power supply provides power support for the display device 200 with power input from an external power source, under the control of the controller 210. The power supply may include a built-in power circuit installed inside the display device 200, or an external power source connected to the display device 200 through a power interface.
A user input interface for receiving an input signal of a user and then transmitting the received user input signal to the controller 210. The user input signal may be a remote control signal received through an infrared receiver, and various user control signals may be received through a network communication module.
By way of example, a user inputs a user command through the remote control device 100 or the mobile terminal 300, and the user input interface responds to the user input through the controller 210.
In some embodiments, a user may input a user command through a Graphical User Interface (GUI) displayed on the display 280, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
The controller 210 controls the operation of the display device 200 and responds to the user's operations through various software control programs stored on the memory 290.
As shown in fig. 3-1, the controller 210 includes RAM213 and ROM214, and a graphics processor 216, CPU processor 212, communication interface 218, such as: first interface 218-1 through nth interfaces 218-n, and a communication bus. The RAM213 and the ROM214 are connected to the graphics processor 216, the CPU processor 212, and the communication interface 218 via buses.
The ROM 214 stores various system boot instructions. When a power-on signal is received and the display device 200 starts up, the CPU processor 212 executes the system boot instructions in the ROM and copies the operating system stored in the memory 290 into the RAM 213 to start running the operating system. After the operating system has started, the CPU processor 212 copies the various applications in the memory 290 into the RAM 213 and then starts running them.
Fig. 3-3 is a hardware configuration block diagram of the RAM 213. As shown in fig. 3-3, the RAM 213 includes SRAM 213-1, DRAM 213-2, SDRAM 213-3, DDR SDRAM 213-4, and the like.
SRAM (Static Random Access Memory) is memory with static access: it retains the data stored in it without requiring refresh circuitry.
DRAM (Dynamic Random Access Memory) is the most common system memory; DRAM can only hold data for a short period of time and must therefore be refreshed periodically.
SDRAM (Synchronous Dynamic Random Access Memory) was developed on the basis of DRAM and is one type of DRAM. "Synchronous" means that memory operation requires a synchronous clock, with the transmission of internal commands and data both based on that clock; "dynamic" means that the memory array must be refreshed continuously to ensure that data is not lost; "random" means that data is not stored in linear sequence but is read and written at specified addresses.
DDR SDRAM was developed on the basis of SDRAM and is commonly known as DDR. It is basically the same as SDRAM, except that it can transfer data twice in one clock cycle, thus doubling the data transfer rate. It is the most widely used memory in computers today and has a cost advantage.
The graphics processor 216 generates various graphical objects, such as icons, operation menus, and graphics displayed in response to user-input instructions. It includes an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays various objects according to their display attributes, and a renderer, which generates the various objects based on the results of the arithmetic unit and displays the rendered results on the display 280.
CPU processor 212 is operative to execute operating system and application program instructions stored in memory 290. And executing various application programs, data and contents according to various interactive instructions received from the outside, so as to finally display and play various audio and video contents.
In some exemplary embodiments, the CPU processor 212 may include multiple processors. The multiple processors may include one main processor and one or more sub-processors. The main processor performs some operations of the display device 200 in the pre-power-up mode and/or displays pictures in the normal mode. The one or more sub-processors perform operations in the standby mode and the like.
The controller 210 may control the overall operation of the display device 200. For example: in response to receiving a user command to select a UI object displayed on the display 280, the controller 210 may perform an operation related to the object selected by the user command.
Wherein the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: displaying an operation of connecting to a hyperlink page, a document, an image, or the like, or executing an operation of a program corresponding to the icon. The user command for selecting the UI object may be an input command through various input means (e.g., mouse, keyboard, touch pad, etc.) connected to the display device 200 or a voice command corresponding to a voice uttered by the user.
Memory 290 includes storage for various software modules for driving display device 200. Such as: various software modules stored in memory 290, including: a basic module, a detection module, a communication module, a display control module, a browser module, various service modules and the like.
The base module is a bottom-layer software module for signal communication between the various hardware components in the display device 200 and for sending processing and control signals to the upper-layer modules. The detection module is used to collect various information from the various sensors or the user input interface, and to perform digital-to-analog conversion and analysis management.
For example: the voice recognition module comprises a voice analysis module and a voice instruction database module. The display control module is used for controlling the display 280 to display the image content, and can be used for playing the multimedia image content, the UI interface and other information. And the communication module is used for carrying out control and data communication with external equipment. And the browser module is used for executing data communication between the browsing servers. And the service module is used for providing various services and various application programs.
Meanwhile, the memory 290 also stores received external data and user data, images of various items in various user interfaces, visual effect maps of focus objects, and the like.
Fig. 4 is a schematic functional configuration diagram of a display device 200 according to an embodiment of the present application. As shown in fig. 4, the memory 290 is used to store an operating system, application programs, contents, user data, and the like, and performs system operations for driving the display device 200 and various operations in response to a user under the control of the controller 210. Memory 290 may include volatile and/or nonvolatile memory.
The memory 290 is specifically used for storing an operation program for driving the controller 210 in the display device 200, and storing various application programs built in the display device 200, various application programs downloaded by a user from an external device, various graphical user interfaces related to the application, various objects related to the graphical user interfaces, user data information, and various internal data supporting the application. The memory 290 is used to store system software such as OS kernel, middleware and applications, and to store input video data and audio data, and other user data.
Memory 290 is specifically used to store drivers and related data for audio and video processors 260-1 and 260-2, display 280, communication interface 230, modem 220, detector 240 input/output interface, and the like.
In some embodiments, memory 290 may store software and/or programs, the software programs used to represent an Operating System (OS) including, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. For example, the kernel may control or manage system resources, or functions implemented by other programs (such as the middleware, APIs, or application programs), and the kernel may provide interfaces to allow the middleware and APIs, or applications to access the controller to implement control or management of system resources.
By way of example, the memory 290 includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external instruction recognition module 2907, a communication control module 2908, a light receiving module 2909, a power control module 2910, an operating system 2911, and other applications 2912, a browser module, and the like. The controller 210 executes various software programs in the memory 290 such as: broadcast television signal receiving and demodulating functions, television channel selection control functions, volume selection control functions, image control functions, display control functions, audio control functions, external instruction recognition functions, communication control functions, optical signal receiving functions, power control functions, software control platforms supporting various functions, browser functions and other applications.
Fig. 5 is a schematic flow chart of a method for discarding a MEMC repeated frame according to an embodiment of the present application, where the method according to the embodiment of the present application may be applied to a display device having a video playing function, such as a smart tv, a smart phone, a notebook computer, a desktop computer, etc., and the flow may be executed by the display device 200 in fig. 1, for example. As shown in fig. 5, the process includes:
s501: and calculating a difference value between a video frame stored in the current Buffer and a video frame stored in the previous Buffer, wherein the video frame stored in the current Buffer is the current video frame.
S502: and calculating a similarity threshold corresponding to the current video frame, wherein the similarity threshold is used for judging whether the current video frame is a repeated frame of the video frame stored in the previous Buffer.
S503: and determining the position of the video frame stored in the Buffer at the next moment according to the relation between the difference value and the similarity threshold corresponding to the current video frame.
Before performing step 501, the display device allocates in advance a buffer space in the DDR for storing a preset number of video frames. When in actual use, the size of each video frame can be determined in advance according to the size of an input video source, and after the size of a single video frame is determined, the buffer space capable of storing a preset number of video frames can be allocated in advance.
For example, fig. 6 is a schematic diagram of pre-allocating a buffer space in the DDR provided in the present application. As shown in fig. 6, the whole buffer space is divided into 6 buffers, marked Buffer0, Buffer1, Buffer2, Buffer3, Buffer4, and Buffer5. Each Buffer can store one video frame; that is, the buffer space shown in fig. 6 can store 6 video frames in total.
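As an illustration, the pre-allocation of a fixed number of equally sized frame buffers can be sketched as below. This is a minimal Python sketch: the 1920×1080 RGB frame dimensions and the names `buffer_pool` and `frame_size` are assumptions for illustration, not taken from the patent.

```python
# Pre-allocate a buffer space able to hold a preset number of video frames,
# mirroring the 6-Buffer layout of fig. 6 (frame dimensions are assumed).
FRAME_W, FRAME_H = 1920, 1080   # assumed source resolution
BYTES_PER_PIXEL = 3             # assumed RGB888 storage
NUM_BUFFERS = 6                 # Buffer0 .. Buffer5

frame_size = FRAME_W * FRAME_H * BYTES_PER_PIXEL
buffer_pool = [bytearray(frame_size) for _ in range(NUM_BUFFERS)]
```

Once the size of a single video frame is known from the input source, the total buffer space is simply that size multiplied by the preset number of frames.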
In step S501, a difference value between the video frame stored in the current Buffer and the video frame stored in the previous Buffer is calculated, where the difference value may be a pixel difference value of a corresponding position of two frames of images, or may be other indexes for evaluating the difference of the frames of images in the prior art, which is not limited herein.
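For instance, the per-pixel difference mentioned above can be sketched as a sum of absolute differences (SAD) over corresponding positions; this is one common choice of metric, not necessarily the exact index used in the patent.

```python
def frame_difference(frame_a, frame_b):
    """Difference value between two frames: sum of absolute pixel
    differences at corresponding positions (one common choice)."""
    assert len(frame_a) == len(frame_b)
    return sum(abs(a - b) for a, b in zip(frame_a, frame_b))
```

Identical frames give a difference value of 0, and increasingly different frames give larger values, which is all the comparison against the similarity threshold requires.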
In step S502, a similarity threshold corresponding to the current video frame is calculated, which is specifically as follows:
calculating an average value of the sum of difference values between each of the current video frame and the previous N video frames thereof and the previous video frame at the previous moment, wherein the current video frame and the previous N video frames thereof are continuous in time;
and multiplying the average value by a threshold coefficient to obtain a product value which is a similarity threshold corresponding to the current video frame.
Fig. 7 shows a set of video frames that are continuous in time. Assuming that the video frame at the current time t stored in the current Buffer is video frame M5, how to calculate the similarity threshold corresponding to video frame M5 is described in detail below.
In this example, let N be 3. This means that, in addition to the difference value between the current video frame M5 and the video frame at the previous time, the difference value between each of the 3 video frames preceding M5 and its own previous video frame is also required; the 3 video frames preceding M5, which are consecutive in time with M5, are video frame M4, video frame M3, and video frame M2.
In this application, for any received video frame, a difference value between the received video frame and the video frame at the previous time is calculated, and the difference value is stored in an array. Similar to calculating the difference value between two video frames stored in the Buffer, the difference value between two video frames in continuous time can be the pixel difference value of the corresponding position of the image, and can also be other indexes for evaluating the difference of the image frames in the prior art.
In the application, to ensure that the similarity threshold corresponding to the current video frame can be calculated accurately, the number of values the array can hold is larger than the number of difference values needed to calculate the similarity threshold. For example, if calculating the similarity threshold requires 7 difference values, the array may hold 32 values.
As shown in fig. 7, the difference between the current video frame M5 and video frame M4 is recorded as D5; similarly, the difference between video frame M4 and video frame M3 is recorded as D4, the difference between video frame M3 and video frame M2 is recorded as D3, and the difference between video frame M2 and video frame M1 is recorded as D2.
After the difference values between the current video frame M5 and each of its previous 3 video frames and their respective previous frames are obtained, the average value of the sum of the difference values is calculated as follows:
D_avg = (D5 + D4 + D3 + D2) / 4
After the average value of the sum of the difference values is obtained, the average value is multiplied by a threshold coefficient, and the resulting product is the similarity threshold corresponding to the current video frame M5:
Thr_M5 = a × D_avg
where Thr_M5 is the similarity threshold corresponding to the current video frame M5 and a is a threshold coefficient, which may be preset in the display device based on experience.
Therefore, in the application, the similarity threshold corresponding to a video frame is not a preset fixed value, but is determined from the difference values between the current video frame and each of its previous N video frames and their respective previous frames. That is, the similarity threshold changes dynamically with the input video frames, which gives the method flexibility and adaptability.
Furthermore, a protection mechanism is added, namely a maximum limit value and a minimum limit value are set for the similarity threshold value corresponding to the video frame, and if the similarity threshold value is smaller than the minimum limit value, the similarity threshold value is adjusted to be the minimum limit value; and if the similarity threshold is larger than the maximum limit value, adjusting the similarity threshold to the maximum limit value.
Wherein the maximum limit value and the minimum limit value may be preset in the display device according to practical experience. By setting the maximum limit value, the video frames which should be reserved originally can be further prevented from being discarded due to the overlarge similarity threshold value; by setting the minimum limit value, it is further avoided that video frames that should have been discarded are preserved due to the similarity threshold being too small.
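Combining the averaging step with the protection mechanism, the threshold computation can be sketched as follows. The function and parameter names are illustrative; the threshold coefficient and the two limit values would be preset in the display device based on experience.

```python
def similarity_threshold(recent_diffs, coeff, thr_min, thr_max):
    """Dynamic similarity threshold: average the recent frame-to-frame
    difference values, scale by the threshold coefficient, then clamp
    to [thr_min, thr_max] as the protection mechanism."""
    avg = sum(recent_diffs) / len(recent_diffs)
    return max(thr_min, min(coeff * avg, thr_max))
```

The clamp keeps a run of near-identical frames (tiny average difference) from driving the threshold so low that genuine repeats are kept, and a run of fast motion from driving it so high that distinct frames are dropped.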
In step S503, determining a location of the video frame stored in the Buffer at the next time according to the relationship between the difference value and the similarity threshold corresponding to the current video frame specifically includes:
if the difference value is not greater than the similarity threshold, determining that the current video frame is a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the current Buffer;
and if the difference value is larger than the similarity threshold value, determining that the current video frame is not a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the next Buffer.
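The placement rule of step S503, together with the cyclic wrap-around from the tail of the buffer space back to Buffer0 used in this application, might be sketched as below (names are illustrative):

```python
def next_write_index(diff, threshold, current_index, num_buffers=6):
    """Decide where the video frame at the next time is stored.
    Repeated frame -> overwrite the current Buffer; otherwise advance
    to the next Buffer, wrapping from the tail back to Buffer0."""
    if diff <= threshold:
        return current_index              # drop the repeat by overwriting it
    return (current_index + 1) % num_buffers
```

The modulo handles the case where the current Buffer is the last one (Buffer5): a non-repeated frame then wraps around to Buffer0 at the head end.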
For example, as shown in fig. 8-1, the current video frame exists in Buffer1, and a difference value between the video frame stored in Buffer1 and the video frame stored in Buffer0 is calculated, where the difference value may be a pixel difference value of corresponding positions of the two video frames. The calculation method of the similarity threshold corresponding to the current video frame is referred to the foregoing description, and will not be repeated here.
If the difference value is not greater than the similarity threshold, determining that the current video frame stored in Buffer1 is a repeated frame of the video frame stored in Buffer0, and storing the video frame at the next moment in the current Buffer, i.e. Buffer1, so as to cover the current video frame stored in Buffer1 and reserve the original video frame stored in Buffer0, as shown in fig. 8-2.
If the difference value is greater than the similarity threshold, it is determined that the current video frame stored in Buffer1 is not a repeated frame of the video frame stored in Buffer0, as shown in fig. 8-3, and the video frame at the next time is stored in the next Buffer, i.e. Buffer2, where both the current video frame stored in Buffer1 and the video frame stored in Buffer0 are retained.
In the application, video frames are written sequentially from the head end of the buffer space. When a video frame has been written at the tail end of the buffer space, the next video frame is written back to the head end, overwriting the data originally stored there, and the cycle repeats.
For example, as shown in fig. 6, video frames are written from the Buffer0 at the head end of the Buffer, and when the video frames are written into the Buffer5 at the tail end of the Buffer, the following video frames are rewritten into the Buffer0 at the head end of the Buffer, and the video data originally stored in the Buffer0 at the head end of the Buffer is covered, and then written into the following buffers again in sequence, so as to cycle.
For example, as shown in fig. 8-4 and 8-5, the current video frame is stored in Buffer5, and the difference value between the video frame stored in Buffer5 and the video frame stored in Buffer4 is calculated. When the difference value is greater than the similarity threshold corresponding to the current video frame, the current video frame is retained in Buffer5, and the video frame at the next time must be stored in the next Buffer. Since Buffer5 is located at the tail end of the buffer space, the video frame at the next time is stored at the head end of the buffer space, i.e., Buffer0.
According to this embodiment, when two video frames are judged to have the same picture content, the video frame at the next time is written into the current Buffer so as to overwrite the video frame stored there. In this way, the original video frame in which a picture content first appears is always kept in the DDR, while later repeated frames with the same content are discarded, improving the usage efficiency of the DDR in the display device.
Fig. 9 is a flowchart of a method for discarding a MEMC repeated frame in a display device according to an embodiment of the present application, which specifically includes the following steps:
s901: a buffer space for storing a preset number of video frames is allocated in DDR in advance;
s902: calculating a difference value between a video frame stored in a current Buffer and a video frame stored in a previous Buffer, wherein the video frame stored in the current Buffer is a current video frame;
S903: calculating a similarity threshold corresponding to the current video frame, wherein the similarity threshold is used for judging whether the current video frame is a repeated frame of a video frame stored in a previous Buffer;
s904: judging whether the difference value is not greater than a similarity threshold corresponding to the current video frame, if so, executing a step S905, otherwise, executing a step S906;
s905: storing the video frame at the next moment in the current Buffer;
s906: the next time video frame is stored in the next Buffer.
The following, in connection with the schematic diagram of the DDR-allocated buffer space given in fig. 6, further illustrates the complete process of the MEMC repeated frame discarding method in the display device shown in fig. 9.
Assume that the frame rate of a certain video source is 30 fps and its video frames are A, B, C, D, E, F …, while the frame rate of the video output by a signal source device such as a smart network player is 60 fps. The video frames in the video source must therefore be copied and repeated to raise the frame rate from 30 fps to 60 fps.
Taking video frame A as an example, copying video frame A produces two identical video frames, marked A0 and A1 respectively; by analogy, the video frames output by the player are A0, A1, B0, B1, C0, C1, D0, D1, E0, E1, F0, F1 ……
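This copy-and-repeat step, which doubles 30 fps to 60 fps, is a one-line sketch in Python:

```python
source_frames = ["A", "B", "C", "D", "E", "F"]
# Each source frame is copied into two identical output frames,
# doubling the frame rate from 30 fps to 60 fps.
doubled = [frame for frame in source_frames for _ in range(2)]
# doubled corresponds to A0, A1, B0, B1, C0, C1, ...
```

It is exactly these duplicated frames that the discarding method later detects and overwrites in the DDR.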
Assuming that the buffer space has been allocated in the DDR in the manner shown in fig. 6, when the video frames A0, A1, B0, B1, C0, C1, D0, D1, E0, E1, F0, F1 …… are sequentially input to the DDR, the processing according to the MEMC repeated frame discarding method shown in fig. 9 is as follows:
as shown in fig. 10-1, the current Buffer, i.e., buffer0, stores the current video frame as video frame A0. In this embodiment, buffer0 is located at the head end of the Buffer space, and the previous Buffer is Buffer5 located at the tail end of the Buffer space, where the Buffer5 is empty and no video frame is yet stored, so that the difference value between the video frame A0 stored in Buffer0 and the video frame stored in Buffer5 is calculated to be larger. Meanwhile, the similarity threshold corresponding to the video frame A0 calculated by the method is very small because no video frame exists in the front. Assuming that the similarity threshold is smaller than the minimum limit value, and the similarity threshold is adjusted to be the minimum limit value, the difference value is still larger than the similarity threshold after adjustment, and it is determined that the video frame A1 at the next time is stored in the next Buffer, i.e., buffer 1.
As shown in fig. 10-2, the current Buffer is Buffer1, and the current video frame stored in it is video frame A1. In this embodiment, video frame A0 and video frame A1 are two identical video frames, both derived from video frame A, so the difference value between them is very small. Meanwhile, since only video frame A0 precedes it, the similarity threshold corresponding to video frame A1 calculated by the above method is also very small. If that similarity threshold is smaller than the minimum limit, it is adjusted to the minimum limit. The difference value is then not greater than the adjusted similarity threshold, so video frame A1 is considered a repeated frame of video frame A0 and must be discarded. It is therefore determined that the video frame B0 at the next time is stored in the current Buffer, overwriting the video frame A1 originally stored in Buffer1 and achieving the effect of discarding the repeated frame.
As shown in fig. 10-3, the current Buffer is Buffer1, the current video frame stored in Buffer1 is video frame B0, and the video frame stored in the previous Buffer, Buffer0, is video frame A0. In this embodiment, video frame A0 and video frame B0 are obtained from video frame A and video frame B respectively, which are different video frames. Whether the video frame B0 stored in Buffer1 needs to be retained is determined by comparing the difference value between video frame B0 and video frame A0 with the similarity threshold corresponding to video frame B0; the following two cases exist:
case 1:
The difference value between video frame A0 and video frame B0 is not greater than the similarity threshold corresponding to video frame B0 (or the adjusted similarity threshold). In this case video frame B0 is considered a repeated frame of video frame A0 and needs to be discarded. Therefore, the video frame B1 at the next time is written into Buffer1, where video frame B0 is stored, and video frame B0 is completely overwritten by video frame B1. As shown in fig. 10-4, at the next time the current Buffer is Buffer1, the video frame stored in Buffer1 is video frame B1, and the video frame stored in the previous Buffer, Buffer0, is video frame A0.
Case 2:
The difference value between video frame A0 and video frame B0 is greater than the similarity threshold corresponding to video frame B0 (or the adjusted similarity threshold). In this case video frame B0 is considered a new, original video frame and needs to be retained. Therefore, the video frame B1 at the next time is written into the next Buffer, i.e., Buffer2, and video frame B0 remains saved in Buffer1. As shown in fig. 10-5, at the next time the current Buffer is Buffer2, the video frame stored in Buffer2 is video frame B1, and the video frame stored in the previous Buffer, Buffer1, is video frame B0.
In this example, the determination of whether or not other video frames, such as video frames C0 and C1, need to be reserved is similar to the previous determination of video frames, and will not be illustrated.
According to the MEMC repeated frame discarding method described above, the difference value between the video frame stored in the current Buffer and the video frame stored in the previous Buffer is calculated, and the similarity threshold corresponding to the current video frame is calculated, where the similarity threshold is used to judge whether the current video frame is a repeated frame of the video frame stored in the previous Buffer. When the difference value is not greater than the similarity threshold, the current video frame is considered a repeated frame, and the video frame at the next time is stored in the current Buffer. In this way, repeated frames whose picture content is the same as that of a previously stored video frame can be discarded, improving the usage efficiency of the DDR.
Based on the same inventive concept, an embodiment of the present application further provides an image processing apparatus 1100, as shown in fig. 11, including:
a first calculation module 1101, configured to calculate a difference value between a video frame stored in a current Buffer and a video frame stored in a previous Buffer, where the video frame stored in the current Buffer is a current video frame;
A second calculating module 1102, configured to calculate a similarity threshold corresponding to the current video frame, where the similarity threshold is used to determine whether the current video frame is a repeated frame of the video frame stored in the previous Buffer;
a determining module 1103, configured to determine a location of the video frame stored in the Buffer at the next moment according to the relationship between the difference value and the similarity threshold corresponding to the current video frame.
The determining module 1103 determines the location of the video frame stored in the Buffer at the next moment according to the relationship between the difference value and the similarity threshold corresponding to the current video frame, specifically as follows:
if the difference value is not greater than the similarity threshold, determining that the current video frame is a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the current Buffer;
and if the difference value is larger than the similarity threshold value, determining that the current video frame is not a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the next Buffer.
The second calculating module 1102 is configured to calculate a similarity threshold corresponding to the current video frame, which is specifically as follows:
calculating the average of the difference values between each of the current video frame and its previous N video frames and the respective video frame at the immediately preceding moment, where the current video frame and its previous N video frames are consecutive in time;
multiplying the average by a threshold coefficient; the resulting product is the similarity threshold corresponding to the current video frame;
a maximum limit value and a minimum limit value are set for the similarity threshold corresponding to a video frame: if the similarity threshold is smaller than the minimum limit value, it is adjusted to the minimum limit value; if the similarity threshold is larger than the maximum limit value, it is adjusted to the maximum limit value.
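The threshold computation with its limit values can be sketched as below. The coefficient 0.5 and the limits 0.5/8.0 are placeholder values; the patent does not disclose concrete numbers.

```python
def similarity_threshold(recent_diffs, coeff=0.5, min_thr=0.5, max_thr=8.0):
    """Similarity threshold for the current frame: the average of the
    frame-to-frame difference values of the current frame and its previous
    N frames, scaled by a threshold coefficient and clamped to the
    [min_thr, max_thr] limit range."""
    avg = sum(recent_diffs) / len(recent_diffs)
    return min(max(avg * coeff, min_thr), max_thr)
```

Clamping keeps the threshold from collapsing to zero on static scenes, which would mark every frame as a repeat, and from ballooning during fast motion, which would discard genuinely distinct frames.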
Based on the same inventive concept, an embodiment of the present application further provides a chip. The chip includes a processor, configured to call and run a computer program from a memory, so that a display device on which the chip is mounted implements the steps of any of the above MEMC repeated frame discarding methods.
Based on the same inventive concept, the embodiments of the present application further provide a computer readable storage medium, in which a computer program is stored, which when executed by a processor, implements the steps of any of the above-mentioned MEMC repeated frame dropping methods.
Based on the same inventive concept, the present application also provides a computer program product comprising a computer program which, when executed by a processor, implements the steps of any of the above-mentioned MEMC repeated frame dropping methods.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present application without departing from the spirit or scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims and the equivalents thereof, the present application is intended to cover such modifications and variations.

Claims (9)

1. A display device, characterized by comprising: a display for displaying video frames; a controller, coupled to the display, configured to: calculating a difference value between a video frame stored in a current Buffer and a video frame stored in a previous Buffer, wherein the video frame stored in the current Buffer is a current video frame; calculating a similarity threshold corresponding to the current video frame, wherein the similarity threshold is used for judging whether the current video frame is a repeated frame of the video frame stored in the previous Buffer; determining the position of the video frame stored in the Buffer at the next moment according to the relation between the difference value and the similarity threshold corresponding to the current video frame;
the calculating the similarity threshold corresponding to the current video frame specifically includes: calculating an average value of difference values between each of a current video frame and the previous N video frames thereof and each of the previous video frames at the previous time, wherein the current video frame and the previous N video frames thereof are continuous in time; and multiplying the average value by a threshold coefficient to obtain a product value which is a similarity threshold corresponding to the current video frame.
2. The display device according to claim 1, wherein determining a location of a video frame stored in a Buffer at a next time according to a relationship between the difference value and a similarity threshold corresponding to the current video frame, specifically comprises: if the difference value is not greater than the similarity threshold, determining that the current video frame is a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the current Buffer; and if the difference value is larger than the similarity threshold value, determining that the current video frame is not a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the next Buffer.
3. The display device according to claim 1, wherein a maximum limit value and a minimum limit value are set for a similarity threshold value corresponding to a video frame, and if the similarity threshold value is smaller than the minimum limit value, the similarity threshold value is adjusted to the minimum limit value; and if the similarity threshold is larger than the maximum limit value, adjusting the similarity threshold to the maximum limit value.
4. A method for MEMC repeated frame dropping, comprising: calculating a difference value between a video frame stored in a current Buffer and a video frame stored in a previous Buffer, wherein the video frame stored in the current Buffer is a current video frame; calculating a similarity threshold corresponding to the current video frame, wherein the similarity threshold is used for judging whether the current video frame is a repeated frame of the video frame stored in the previous Buffer; determining the position of the video frame stored in the Buffer at the next moment according to the relation between the difference value and the similarity threshold corresponding to the current video frame;
the calculating the similarity threshold corresponding to the current video frame specifically includes: calculating an average value of difference values between each of a current video frame and the previous N video frames thereof and each of the previous video frames at the previous time, wherein the current video frame and the previous N video frames thereof are continuous in time; and multiplying the average value by a threshold coefficient to obtain a product value which is a similarity threshold corresponding to the current video frame.
5. The method according to claim 4, wherein determining the location of the video frame stored in the Buffer at the next time according to the relationship between the difference value and the similarity threshold corresponding to the current video frame specifically comprises: if the difference value is not greater than the similarity threshold, determining that the current video frame is a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the current Buffer; and if the difference value is larger than the similarity threshold value, determining that the current video frame is not a repeated frame of the video frame stored in the previous Buffer, and storing the video frame at the next moment in the next Buffer.
6. The method according to claim 4, wherein a maximum limit value and a minimum limit value are set for the similarity threshold value corresponding to the video frame, and if the similarity threshold value is smaller than the minimum limit value, the similarity threshold value is adjusted to the minimum limit value; and if the similarity threshold is larger than the maximum limit value, adjusting the similarity threshold to the maximum limit value.
7. An image processing apparatus for performing the method of any of claims 4-6, comprising: the first calculation module is used for calculating the difference value between the video frame stored in the current Buffer and the video frame stored in the previous Buffer, wherein the video frame stored in the current Buffer is the current video frame; the second calculation module is used for calculating a similarity threshold corresponding to the current video frame, wherein the similarity threshold is used for judging whether the current video frame is a repeated frame of the video frame stored in the previous Buffer; and the determining module is used for determining the position of the video frame stored in the Buffer at the next moment according to the relation between the difference value and the similarity threshold corresponding to the current video frame.
8. A chip, comprising: a processor for calling and running a computer program from a memory, causing a display device on which the chip is mounted to perform the method of any of claims 4-6.
9. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program which, when executed by a processor, implements the steps of the method of any of claims 4-6.
CN202111116545.5A 2021-09-23 2021-09-23 Display device and MEMC repeated frame discarding method Active CN113923514B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111116545.5A CN113923514B (en) 2021-09-23 2021-09-23 Display device and MEMC repeated frame discarding method


Publications (2)

Publication Number Publication Date
CN113923514A CN113923514A (en) 2022-01-11
CN113923514B true CN113923514B (en) 2024-03-01

Family

ID=79235916







Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant