WO2019080847A1 - Video data processing method and video data processing apparatus - Google Patents

Video data processing method and video data processing apparatus

Info

Publication number
WO2019080847A1
Authority
WO
WIPO (PCT)
Prior art keywords
video data
frame
module
data processing
receiving end
Prior art date
Application number
PCT/CN2018/111520
Other languages
English (en)
French (fr)
Inventor
钟光华
郑自浩
Original Assignee
南昌黑鲨科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 南昌黑鲨科技有限公司 filed Critical 南昌黑鲨科技有限公司
Priority to JP2020520518A priority Critical patent/JP2021500786A/ja
Priority to EP18869522.5A priority patent/EP3644613A4/en
Priority to KR1020207006697A priority patent/KR20200077507A/ko
Publication of WO2019080847A1 publication Critical patent/WO2019080847A1/zh
Priority to US16/854,819 priority patent/US20200252581A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0127Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter
    • H04N7/013Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter the incoming video signal comprising different parts having originally different frame rate, e.g. video and graphics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440281Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/12Synchronisation between the display unit and other units, e.g. other display units, video-disc players
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/30Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability
    • H04N19/31Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability in the temporal domain
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4305Synchronising client clock from received content stream, e.g. locking decoder clock with encoder clock, extraction of the PCR packets
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0105Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level using a storage device with different write and read speed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0127Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter
    • H04N7/0132Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter the field or frame frequency of the incoming video signal being multiplied by a positive integer, e.g. for flicker reduction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0135Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving interpolation processes
    • H04N7/014Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving interpolation processes involving the use of motion vectors
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/10Special adaptations of display systems for operation with variable images
    • G09G2320/106Determination of movement vectors or equivalent parameters within the image
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/02Handling of images in compressed format, e.g. JPEG, MPEG
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/0407Resolution change, inclusive of the use of different resolutions for different screen areas
    • G09G2340/0435Change or adaptation of the frame rate of the video stream
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00Aspects of the architecture of display systems
    • G09G2360/12Frame memory handling
    • G09G2360/121Frame memory handling using a cache memory
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/04Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/12Use of DVI or HDMI protocol in interfaces along the display data pipeline
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/587Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal sub-sampling or interpolation, e.g. decimation or subsequent interpolation of pictures in a video sequence
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/85Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression

Definitions

  • the present invention relates to the field of video processing, and in particular, to a video data processing method and a video data processing apparatus.
  • the video card decodes the stored video data and transmits it to the display screen at a certain frame rate.
  • Video transmission interfaces such as MIPI, HDMI and DisplayPort are used to connect the decoding device and the display screen.
  • Existing video frame rates are 15 fps, 24 fps and 30 fps. For playback to look smooth to the human eye, the frame rate needs to stay above 60 fps, which is why display refresh rates are 60 Hz or higher.
  • When the frame rate of the video data differs from the refresh rate of the display screen, stutter or judder may appear when the video data is shown on the display screen.
  • To solve the stutter problem, the prior art uses a video enhancement algorithm called MEMC (motion interpolation and motion compensation), which interpolates frames according to the motion vectors of objects so that the frame count of the video data matches the frame count required by the display refresh.
  • However, because the motion vectors can only be calculated once at least two frames of data are available, the display of the video data is delayed: the interpolation has to wait at least until the second video data frame involved in the interpolation operation has been received, so the display delay includes the time spent waiting to receive the first and second video data frames plus the interpolation calculation time, where the interpolation calculation time is far smaller than the transmission time of one video data frame.
  • For example, at a video frame rate of 30 fps, waiting for two frames takes 66.6 ms, i.e. the display delay is at least 66.6 ms. If the video data involves user interaction, such as a game operation interface, the display delay makes the interaction fall out of sync and degrades the user's interactive experience.
  • An object of the present invention is therefore to provide a video data processing method and a video data processing device that reduce the video processing delay by increasing the video data transmission speed and advancing the timing of the frame interpolation operation.
  • the present invention discloses a video data processing method for processing video data transmitted from a transmitting end operating at a first frame rate to a receiving end operating at a second frame rate, including the following steps:
  • the sending end converts the video data into at least one video data frame according to the first frame rate.
  • the transmitting end sends the video data frame generated in the previous frame duration to the receiving end once within each frame duration corresponding to the first frame rate, where the ratio of the data transmission time within the transmission period of each video data frame to the transmission period is less than or equal to one half;
  • the receiving end respectively receives a video data frame within a duration of two adjacent frames corresponding to the first frame rate.
  • step S104 performing frame interpolation on the two video data frames received in step S103, to obtain at least one video data insertion frame;
  • S105: Insert the video data interpolation frame between the two video data frames to form a group of video data frames to be played.
  • the sending end sends a video data frame to the receiving end through a physical interface.
  • the step S102 performs the following steps within each frame duration corresponding to the first frame rate:
  • the ratio of the frequency of the line sync signal to the reference frequency is at least 2; when the step S103 is performed, the received line sync signal is divided according to the ratio.
  • the video data processing method further includes the following steps:
  • S106 The receiving end displays the video data frame to be played according to the second frame rate.
  • the present invention also discloses a video data processing apparatus, including a transmitting end operating at a first frame rate and a receiving end operating at a second frame rate, wherein the video data processing apparatus comprises:
  • a conversion module configured to be at the sending end, converting the video data into at least one video data frame according to the first frame rate
  • a sending module, configured at the sending end and connected to the conversion module, which sends the video data frame generated in the previous frame duration to the receiving end once within each frame duration corresponding to the first frame rate, where the ratio of the data transmission time within the transmission period of each video data frame to the transmission period is less than or equal to one half;
  • a receiving module configured to receive, by the receiving end, a video data frame in a duration of two adjacent frames corresponding to the first frame rate
  • the frame interpolation module is disposed at the receiving end and connected to the receiving module, and performs a frame interpolation operation on the two video data frames received by the receiving module to obtain at least one video data interpolation frame;
  • the framing module is disposed at the receiving end and connected to the frame interpolation module, and inserts the video data interpolation frame between the two video data frames to form a group of video data frames to be played.
  • the sending module sends a video data frame to the receiving end through a physical interface.
  • the sending module includes:
  • a buffer unit configured to be at the sending end, writing a video data frame within a frame duration corresponding to the first frame rate
  • a signal transmission unit disposed at the transmitting end, transmitting a control signal and an auxiliary signal to the physical interface
  • a data transmission unit configured to be connected to the buffer unit, and configured to transmit the video data frame to the physical interface within a preset time threshold
  • the period compensation unit waits for the end of the current transmission period.
  • the ratio of the frequency of the line synchronization signal to the reference frequency is at least 2; and the receiving module divides the received line synchronization signal according to the ratio.
  • the video processing device further includes:
  • the playing module displays the video data frame to be played according to the second frame rate.
  • FIG. 1 is a schematic flow chart of a method for processing video data according to a preferred embodiment of the present invention
  • step S102 is a schematic flow chart of step S102 in accordance with a preferred embodiment of the present invention.
  • FIG. 3 is a block diagram showing the structure of the video data processing apparatus in accordance with a preferred embodiment of the present invention.
  • FIG. 4 is a block diagram showing the structure of the transmitting module in accordance with a preferred embodiment of the present invention.
  • 10 - video data processing device, 11 - transmitting end, 111 - conversion module, 112 - sending module, 1121 - buffer unit, 1122 - signal transmission unit, 1123 - data transmission unit, 1124 - period compensation unit, 12 - receiving end, 121 - receiving module, 122 - frame interpolation module, 123 - framing module, 124 - playback module.
  • Although the terms first, second, third, etc. may be used in the present disclosure to describe various information, such information should not be limited to these terms; the terms are only used to distinguish information of the same type from one another.
  • For example, without departing from the scope of the present disclosure, first information may also be referred to as second information and, similarly, second information may also be referred to as first information.
  • Depending on the context, the word "if" as used herein may be interpreted as "at the time of", "when" or "in response to a determination".
  • the video data processing method includes:
  • the transmitting end 11 converts the video data into at least one video data frame according to the first frame rate.
  • The transmitting end 11 may be a device capable of decoding, such as a player or a graphics card, and decodes a video file in a digital format into a playable video signal, where the video signal is composed of multiple frames of video data.
  • The transmitting end 11 generates each frame of video data according to the first frame rate, which may be 15 fps, 24 fps or 30 fps, where fps refers to the number of frames transmitted per second: the more frames per second, the smoother the displayed motion. In general, the minimum value for avoiding jerky motion is 30 fps, and some computer video formats can only provide 15 frames per second.
  • the video data may be in a data format such as wmv, rmvb, 3gp, or mp4, and is often stored in a storage device in the form of a video file.
  • The video data is converted into at least one video data frame; a video data frame, i.e. the video data content played in each frame, usually takes the form of a pixel picture and can be regarded as one image. When the first frame rate is 15 fps, there are 15 video data frames per second.
  • the number of converted video data frames is different depending on the playback duration of the video data and the first frame rate.
  • the video data frame is the basis of a subsequent playback operation, and the playback device plays the video data frame in a frame by frame manner to implement a dynamic video effect.
  • The transmitting end 11 sends the video data frame generated in the previous frame duration to the receiving end 12 once within each frame duration corresponding to the first frame rate, where the ratio of the data transmission time within the transmission period of each video data frame to the transmission period is less than or equal to one half.
  • the receiving end 12, that is, the device for playing video may be a display screen, a television, and the like, and operates at a second frame rate, which may be 60 fps or even higher.
  • the second frame rate may be twice or even more than the first frame rate for smooth playback.
  • the device that plays the video raises the second frame rate to achieve a fast forward effect, and the first frame rate will also be synchronously boosted during fast forward.
  • In the prior art, the duration over which the transmitting end 11 sends a video data frame is substantially equal to the duration of playing one video data frame, that is, one frame is transmitted while one frame is played; and the video data frame currently being transmitted is the one converted from the video file during the previous frame duration.
  • For example, when the first frame rate is 30 fps, the corresponding frame duration is 33.3 milliseconds and step S101 converts one video data frame within each frame duration: the transmitting end 11 converts the first video data frame within 0-33.3 milliseconds, then transmits it to the receiving end 12 once within the second frame duration of 33.3-66.6 milliseconds, while continuing to convert the second video data frame.
  • The transmission period of each video data frame is substantially equal to one frame duration and includes a data transmission time and an auxiliary transmission time, where the data transmission time is the time actually used to transmit the video data frame and the auxiliary transmission time is the time used for control signal transmission, audio signal transmission or other auxiliary information transmission; the auxiliary transmission time is also referred to as the blanking period and is denoted by "blanking" in software development.
  • In the prior art, the ratio of the data transmission time to the transmission period is generally above 80%, that is, data transmission occupies well over half of the transmission period.
  • This step improves on the above prior art: the data transmission time is shortened by increasing the transmission speed of the video data frame, thereby reducing the ratio of the data transmission time to the transmission period, the ratio being less than or equal to one half.
  • the auxiliary transmission time in the transmission period can be lengthened accordingly such that the transmission period remains unchanged and is still substantially equal to one frame duration.
  • The main point of this step is that the generation speed of the video data frame is decoupled from the transmission speed, breaking the prior-art route in which the two are basically coordinated and synchronized, and shortening the video data transmission time by increasing the transmission speed of the video data frame.
  • Increasing the transmission speed of the video data frame can be achieved by raising the utilization of the transmission interface.
  • For example, the maximum data transmission speed of the HDMI interface is 48 Gbps, while one 1080p video stream plus one 8-channel audio signal requires less than 0.5 GB/s; the transmission interface therefore still has a great deal of headroom, and the transmission speed of the video data frame can be increased several times or even tens of times.
  • the first frame rate is 30 fps
  • the corresponding frame duration is 33.3 milliseconds
  • For example, when the ratio of the data transmission time to the transmission period is one third, step S101 converts one video data frame within each frame duration: the transmitting end 11 converts the first video data frame within 0-33.3 milliseconds and transmits it to the receiving end 12 within the second frame duration, between 33.3 and 44.4 milliseconds.
  • The transmitting end 11 converts the second video data frame within 33.3-66.6 milliseconds; within 66.6-77.7 milliseconds it sends the second video data frame to the receiving end 12.
  • In the prior art the second video data frame finishes being sent at 99.9 milliseconds, whereas in this step it finishes at 77.7 milliseconds, 22.2 milliseconds earlier.
  • The above calculation does not consider the transmission time of the auxiliary signal within the transmission period.
  • In fact, the auxiliary signal transmission for the second video data frame can be completed within the previous frame duration without occupying time in the current frame duration, so that transmission of the video data frame can start as early as possible within a frame duration.
  • the frame duration for implementing this step is calculated at the first frame rate, that is, the play duration of each frame at the first frame rate.
  • the receiving end 12 receives a video data frame in the adjacent two frame durations corresponding to the first frame rate.
  • This step illustrates how the receiving end 12 receives. Since step S102 transmits the same video data frame only once within one frame duration corresponding to the first frame rate, the receiving end 12 receives at most one video data frame in that time. Because step S102 compresses the transmission time of the video data frame within each transmission period, the receiving end 12 does not have to wait until the end of the current frame duration to finish receiving the video data frame and can start the subsequent frame interpolation operation earlier. In this step, the receiving end 12 receives one video data frame in each of two adjacent frame durations corresponding to the first frame rate, which serves as the basis for the frame interpolation operation performed in the subsequent steps.
  • the first frame rate is 30 fps
  • the corresponding frame duration is 33.3 milliseconds
  • Within the second frame duration of 33.3-66.6 milliseconds the receiving end 12 receives the first video data frame once, finishing the reception at 44.4 milliseconds; within the third frame duration of 66.6-99.9 milliseconds the receiving end 12 receives the second video data frame once, finishing the reception at 77.7 milliseconds.
  • Likewise, the receiving end 12 receives one video data frame per frame duration, and so on, until all the video data has been received, one video data frame being received in every two adjacent frame durations. Note that the frame duration used in this step is calculated at the first frame rate.
  • In the prior art, the frame interpolation operation can only be performed on the basis of two video data frames, so the receiving end 12 only has both the first and second video data frames at the end of the third frame duration, and only then can the subsequent interpolation operation proceed; the actual interpolated playback therefore has to wait at least until the end of the third frame duration, i.e. after 99.9 milliseconds.
  • This step compresses the reception time at the receiving end 12: in the above example the receiving end 12 finishes receiving the second video data frame at 77.7 milliseconds and can immediately perform the frame interpolation operation. Since the interpolation time is very short relative to one frame duration, for example about 3 milliseconds, playback of the interpolated frame can start at about the 80 millisecond mark, which reduces the interpolation delay compared with the prior art and shortens the overall video playback delay. If in step S102 the transmitting end 11 sends the same video data frame within one frame duration even faster, the receiving end 12 receives a video data frame within an even shorter part of the frame duration, the frame interpolation operation is advanced further, and the video processing delay becomes even shorter.
  • step S104 Perform frame interpolation on the two video data frames received in step S103 to obtain at least one video data interpolation frame.
  • This step performs an interpolation frame operation, and the object participating in the operation is the adjacent two video data frames received in the step S103.
  • A common interpolation algorithm is MEMC, whose full name is Motion Estimate and Motion Compensation, i.e. motion estimation and motion compensation, a motion picture quality compensation technique often used in LCD televisions.
  • Its principle is to use a dynamic mapping system to insert one motion-compensated frame between two conventional frames, raising the 50/60 Hz refresh rate of an ordinary flat-panel television to 100/120 Hz. In this way the moving picture becomes clearer and smoother than the normal response, removing the residual image of the previous frame, improving dynamic definition and reducing motion smearing to a level the human eye can hardly perceive.
  • For example, the frame order of an original moving picture is 1, 2, 3, 4, 5, 6. After analysing the motion trend of the image in the horizontal and vertical directions block by block, MEMC inserts an intermediate frame, i.e. a video data interpolation frame, between the original frames, and the frame sequence after interpolation becomes 1, 1C, 2, 2C, 3, 3C, 4, 4C, 5, 5C, 6. The original field frequency is then no longer sufficient to show all the frames, so the field frequency has to be doubled, i.e. from 50/60 Hz to 100/120 Hz; it can be seen that MEMC and frequency doubling are inseparable.
  • the receiving end 12 will play the final video data frame at the second frame rate.
  • the number of frames inserted between the two video data frames is also different.
  • the first frame rate is 30 fps
  • the second frame rate is 60 fps, so that one video data interpolation frame is inserted between every two video data frames.
  • If the second frame rate is 120 fps, three video data interpolation frames must be inserted between every two video data frames.
  • S105: Insert the video data interpolation frame between the two video data frames to form a group of video data frames to be played.
  • This step performs the framing operation: the video data interpolation frame obtained in step S104 is inserted between the received video data frames to form a group of video data frames to be played, so that the playback device can play the video data frames one by one at the operating frequency of its hardware, i.e. the second frame rate, without stutter.
  • When step S102 is performed, the transmitting end 11 sends the video data frame to the receiving end 12 through a physical interface.
  • This improved embodiment defines the connection between the transmitting end 11 and the receiving end 12, i.e. a physical-interface connection, the physical interface being a video transmission interface such as MIPI, HDMI or DisplayPort.
  • Within the auxiliary transmission time of the transmission period, after the control signal, audio signal or other auxiliary information has been sent, the physical interface may be in a high-impedance or other state and in a low-power mode.
  • the video data processing method further includes the following steps:
  • the receiving end 12 displays the video data frame to be played according to the second frame rate.
  • This step performs the playback operation, playing the assembled group of video data frames, including the video data interpolation frames, on the display device.
  • Each video data frame already records the pixel information needed for the playback picture, and the hardware device can display this pixel information to play back the video data. Since steps S103, S104 and S105 are executed continuously and keep producing video data frames to be played, this step does not have to wait until many video data frames have been received; playback at the second frame rate can start as soon as framing in step S105 is complete.
  • step S102 performs the following steps within each frame duration corresponding to the first frame rate:
  • the sending end 11 generates the video data frame and stores the video data frame in the buffer unit 1121 in the sending end 11.
  • the buffer unit is also called a Frame Buffer. This step is completed at the code layer, that is, one write command is executed to the cache unit 1121 within one frame time.
  • S102-2 transmitting a control signal and an auxiliary signal to the physical interface.
  • This step performs control signal and auxiliary signal transmission operations, and may also include transmission of audio signals.
  • the transmission time occupied by the transmission of the above signal and the transmission of the video data frame is not compressed.
  • the types of control signals and interface protocols are different, and there are different control signal timings.
  • Some control signals may act cooperatively during the transmission of the video data frames and are not necessarily executed as separate, independent steps.
  • the video data frame in the buffer unit 1121 is sent to the physical interface, and is implemented by the driver layer, and the software data is converted into an electrical signal and sent through the physical interface.
  • This step must satisfy the protocol of the physical interface.
  • The time for transmitting the video data frame in this step should be within a preset time threshold that is less than one half of the transmission period, which in effect limits the transmission time of the video data frame in order to achieve the technical effect of shortening the video processing delay.
  • This step makes up the remaining duration of the current transmission period to keep the transmission rhythm of the video data frames. Since the transmitting end 11 generates video data frames at the first frame rate, even if the transmission speed of the video data frame becomes faster, the transmitting end 11 still needs to wait for the next video data frame to be generated before it can be transmitted; this step is therefore implemented to keep the transmission period stable.
  • the ratio of the frequency of the line synchronization signal to the reference frequency is at least 2; when the step S103 is performed, the received line synchronization signal is divided according to the ratio. frequency.
  • the improved embodiment further preferably implements a technical means for shortening the transmission time of the video data frame, that is, by adjusting the frequency of the line synchronization signal when the video data frame is transmitted.
  • the reference frequency is a frequency of a line synchronization signal when the ratio of the data transmission time to the transmission period is greater than 80% in the prior art.
  • Line synchronization, also known as horizontal synchronization, controls the process in which the electron beam of a display returns from the right-hand side to the starting point (the left-hand edge of the screen), also called horizontal retrace.
  • When digital video data is transmitted, the line synchronization signal is also called HSYNC; while HSYNC is valid, the received signals belong to the same line.
  • Correspondingly, there is also a field synchronization signal called VSYNC; while VSYNC is valid, the received signals belong to the same field. For example, to display a picture of A x B pixels (A pixels per line, B lines), the frequencies are roughly related, ignoring blanking intervals, by f_HSYNC = B x f_VSYNC and f_PCLK = A x f_HSYNC, where PCLK is the pixel synchronization clock signal and each PCLK cycle corresponds to one pixel.
  • The reference frequency is the frequency of HSYNC in the prior art. According to the above relationship, if the frequency of HSYNC is raised to 3 times the reference frequency while the period of VSYNC is unchanged, the video data frame, i.e. the pixel data, can be transmitted within one third of the transmission period, with an auxiliary transmission time, i.e. blanking time, of two thirds of the transmission period still inserted. In this improved embodiment, the ratio of the frequency of the line synchronization signal to the reference frequency is at least 2, i.e. the transmission time of the video data frame is at least halved.
  • Since the frequency of the line synchronization signal is at least doubled, the receiving end 12 needs to frequency-divide the received line synchronization signal to restore the actual line synchronization timing, which can be implemented by adding a frequency divider at the receiving end 12.
  • The division factor of the frequency divider is the ratio of the frequency of the line synchronization signal to the reference frequency.
  • The receiving end 12 also needs to buffer the video data frame first and then process it together with the frequency-divided line synchronization signal to achieve display synchronization.
  • The video data processing apparatus 10 includes a transmitting end 11 operating at a first frame rate and a receiving end 12 operating at a second frame rate.
  • The video data processing apparatus 10 further includes:
  • the conversion module 111, which is disposed at the transmitting end 11 and converts the video data into at least one video data frame according to the first frame rate.
  • the conversion module 111 may be a device having a decoding capability, such as a player or a graphics card, converting the video data of different data formats into multiple video data frames, and satisfying the first frame rate when converting video data frames.
  • The sending module 112 is disposed at the transmitting end 11 and connected to the conversion module 111; within each frame duration corresponding to the first frame rate it sends the video data frame generated in the previous frame duration to the receiving end 12 once, where the ratio of the data transmission time within the transmission period of each video data frame to the transmission period is less than or equal to one half.
  • The sending module 112 receives the converted video data frames from the conversion module 111 and transmits them to the receiving end 12.
  • The sending module 112 makes full use of the capacity of the video data transmission channel, increases the transmission rate and compresses the time taken to transmit one video data frame.
  • the receiving module 121 is disposed at the receiving end 12, and receives a video data frame respectively in the adjacent two frame durations corresponding to the first frame rate.
  • The receiving module 121 receives the video data frames sent by the sending module 112. Because the sending module 112 compresses the transmission time of the video data frame, the receiving module 121 can finish receiving the video data frame partway through the transmission period.
  • the receiving module 121 can respectively receive two video data frames in the adjacent two frame durations, which provides a basis for subsequent interpolation frame operations. Since the receiving module 121 uses at most one-half of the time for receiving the video data frame within one frame duration, the remaining time in the frame duration can be used for the interpolation frame operation, which is advanced relative to the prior art. The moment when the interpolation operation starts.
  • the frame duration of the working reference of the receiving module 121 is calculated at the first frame rate, that is, the playing duration of each frame at the first frame rate.
  • the frame insertion module 122 is disposed at the receiving end 12, and is connected to the receiving module 121, and performs frame interpolation on two video data frames received by the receiving module 121 to obtain at least one video data interpolation frame.
  • the frame insertion operation module 122 acquires two adjacent video data frames from the receiving module 121, and performs frame interpolation operations based on the two video data frames.
  • the frame interpolation operation module 122 embeds a frame operation algorithm, such as MEMC, that is, motion estimation and motion compensation algorithms.
  • the framing module 123 is disposed at the receiving end 12, and is connected to the interpolation frame operation module 122, and inserts the video data into a frame between the two video data frames to form a group of videos to be played. Data Frame.
  • The framing module 123 obtains the video data interpolation frame from the frame interpolation module 122, then obtains the received video data frames from the receiving module 121, and inserts the video data interpolation frame between the two video data frames on which its calculation was based, so that together they form a group of video data frames to be played.
  • the transmitting module 112 transmits a video data frame to the receiving end 12 through a physical interface.
  • the improved embodiment is limited to the connection mode of the transmitting end 11 and the receiving end 12, that is, the physical interface is a video transmission interface such as MIPI, HDMI, and DisplayPort.
  • the sending module 112 is connected to the receiving module 121 through the physical interface to implement transmission of a video data frame.
  • the video processing device 10 further includes:
  • the playing module 124 is disposed at the receiving end 12, and is connected to the framing module 123, and displays the video data frame to be played according to the second frame rate.
  • the playing module 124 acquires the video data frame to be played from the framing module 123, and plays according to the second frame rate.
  • the playing module 124 may be a display screen and a display circuit thereof, and the display circuit is configured to convert the video data frame into an electrical signal representing a physical pixel, and the display screen displays the physical pixel.
  • the sending module 112 includes:
  • the buffer unit 1121 is disposed at the transmitting end 11 and writes a video data frame within a frame duration corresponding to the first frame rate.
  • the cache unit 1121 may be a physical storage medium such as a memory, a hard disk, or the like, and is capable of storing data.
  • the signal transmission unit 1122 is disposed at the transmitting end 11 and transmits a control signal and an auxiliary signal to the physical interface.
  • the signal transmission unit 1122 performs a control signal and an auxiliary signal transmission operation, and may also include a transmission operation of the audio signal.
  • the types of control signals and interface protocols are different, and there are also different control signal timings, and some control signals will cooperate in the transmission process of the video data frames.
  • the data transmission unit 1123 is disposed at the sending end 11 and is connected to the buffer unit 1121 to transmit the video data frame to the physical interface within a preset time threshold.
  • the data transfer unit 1123 implements the conversion of software data into electrical signals through the physical layer through the drive layer.
  • When transmitting the video data frame, the data transmission unit 1123 must satisfy the protocol of the physical interface, and the time for transmitting the video data frame should be within a preset time threshold that is less than one half of the transmission period, which in effect limits the transmission time of the video data frame in order to achieve the technical effect of shortening the video processing delay.
  • According to the interface protocols of some physical interfaces, when the data transmission unit 1123 transmits the video data frame, the signal transmission unit 1122 needs to cooperate synchronously to control the states of the control signals; for example, on the HDMI interface the high and low levels of VSYNC and HSYNC need to be controlled for auxiliary confirmation of the data transmission.
  • the period compensation unit 1124 waits for the end of the current transmission period.
  • the function of the period compensation unit 1124 is to supplement the duration of the current transmission period to ensure the transmission tempo of the video data frame. Since the speed at which the conversion module 111 generates the video data frame is performed according to the first frame rate, even if the transmission speed of the video data frame becomes faster, the conversion module 111 needs to wait for the next video to be generated. Data frames can be transmitted. Therefore, after the data transmission unit 1123 transmits the video data frame, the period compensation unit 1124 keeps the transmission period stable.
  • When the data transmission unit 1123 transmits the video data frame, the ratio of the frequency of the line synchronization signal to the reference frequency is adjusted to at least 2, and the receiving module 121 frequency-divides the received line synchronization signal according to that ratio.
  • This improved embodiment further specifies a preferred technical means of shortening the transmission time of the video data frame, namely adjusting the frequency of the line synchronization signal used when the video data frame is transmitted.
  • The data transmission unit 1123 at least doubles the frequency of the line synchronization signal, so the transmission time of the video data frame is at least halved.
  • The receiving module 121 is further provided with a frequency divider that divides the received line synchronization signal to restore the actual line synchronization timing.
  • The division factor of the frequency divider is the ratio of the frequency of the line synchronization signal to the reference frequency.
  • The receiving module 121 also needs to buffer the video data frame first and then process it together with the frequency-divided line synchronization signal to achieve display synchronization.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Television Systems (AREA)

Abstract

The present invention provides a video data processing method and a video data processing apparatus. The video data processing method comprises the following steps: the transmitting end converts the video data into at least one video data frame according to a first frame rate; within each frame duration corresponding to the first frame rate, the transmitting end sends the video data frame generated in the previous frame duration to the receiving end once, wherein the ratio of the data transmission time within the transmission period of each video data frame to the transmission period is less than or equal to one half; the receiving end receives one video data frame in each of two adjacent frame durations; a frame interpolation operation is performed on the two video data frames received in the preceding step to obtain at least one video data interpolation frame; and the video data interpolation frame is inserted between the two video data frames to form a group of video data frames to be played. After the above technical solution is implemented, the delay of the video data during processing is effectively reduced and the user experience is improved.

Description

Video data processing method and video data processing apparatus

Technical Field
The present invention relates to the field of video processing, and in particular to a video data processing method and a video data processing apparatus.
Background Art
When people play video with a video playback device, the video data needs to be decoded and then transmitted to the playback device. For example, a graphics card decodes stored video data and transmits it to the display screen at a certain frame rate, and the decoding device and the display screen are connected by a video transmission interface such as MIPI, HDMI or DisplayPort. At present, existing video frame rates are 15 fps, 24 fps and 30 fps. For playback to look smooth to the human eye, the frame rate needs to stay above 60 fps, which is also why display refresh rates are 60 Hz or higher. However, because the frame rate of the video data differs from the refresh rate of the display screen, stutter or judder appears when the video data is shown on the display screen.
To solve the stutter problem, the prior art uses a video enhancement algorithm called MEMC (motion interpolation and motion compensation), which interpolates frames into the video data according to the motion vectors of objects so that the number of frames of the video data equals the number of frames required to refresh the display screen. Since the number of frames after interpolation equals the number of frames of the display screen, the display screen only needs to process the frames one by one, so no stutter or judder is produced on the display screen.
However, when the MEMC video enhancement algorithm is used to solve video stutter and judder, the display of the video data is delayed, because the calculation of motion vectors requires at least two frames of data before the interpolated content can be computed. That is, the interpolation can only be computed after the second video data frame participating in the interpolation operation has been completely received, so the delay in displaying the video data on the display screen includes the time spent waiting to receive the first and second video data frames plus the interpolation calculation time, where the interpolation calculation time is far smaller than the transmission time of the first video data frame. For example, if the frame rate of the video data is 30 fps, the time to wait for two frames is 66.6 ms, i.e. the display delay is at least 66.6 ms. If the video data involves interaction with the user, for example a game operation interface, the display delay causes the interaction to fall out of sync and degrades the user's interactive experience.
Summary of the Invention
To overcome the defects of the prior art, an object of the present invention is to provide a video data processing method and a video data processing apparatus that reduce the video processing delay by increasing the video data transmission speed and advancing the timing of the frame interpolation operation.
The present invention discloses a video data processing method for processing video data sent by a transmitting end operating at a first frame rate to a receiving end operating at a second frame rate, comprising the following steps:
S101: the transmitting end converts the video data into at least one video data frame according to the first frame rate;
S102: within each frame duration corresponding to the first frame rate, the transmitting end sends the video data frame generated in the previous frame duration to the receiving end once, wherein the ratio of the data transmission time within the transmission period of each video data frame to the transmission period is less than or equal to one half;
S103: the receiving end receives one video data frame in each of two adjacent frame durations corresponding to the first frame rate;
S104: performing a frame interpolation operation on the two video data frames received in step S103 to obtain at least one video data interpolation frame;
S105: inserting the video data interpolation frame between the two video data frames to form a group of video data frames to be played.
Preferably, when step S102 is performed, the transmitting end sends the video data frame to the receiving end through a physical interface.
Preferably, step S102 performs the following steps within each frame duration corresponding to the first frame rate:
S102-1: writing one video data frame into a buffer unit in the transmitting end;
S102-2: transmitting a control signal and an auxiliary signal to the physical interface;
S102-3: transmitting the video data frame to the physical interface within a preset time threshold;
S102-4: waiting for the current transmission period to end.
Preferably, when step S102-3 is performed, the ratio of the frequency of the line synchronization signal to a reference frequency is adjusted to at least 2; when step S103 is performed, the received line synchronization signal is frequency-divided according to the ratio.
Preferably, after step S105 the video data processing method further comprises the following step:
S106: the receiving end displays the video data frames to be played according to the second frame rate.
The present invention also discloses a video data processing apparatus, comprising a transmitting end operating at a first frame rate and a receiving end operating at a second frame rate, wherein the video data processing apparatus comprises:
a conversion module, provided at the transmitting end, which converts the video data into at least one video data frame according to the first frame rate;
a sending module, provided at the transmitting end and connected to the conversion module, which within each frame duration corresponding to the first frame rate sends the video data frame generated in the previous frame duration to the receiving end once, wherein the ratio of the data transmission time within the transmission period of each video data frame to the transmission period is less than or equal to one half;
a receiving module, provided at the receiving end, which receives one video data frame in each of two adjacent frame durations corresponding to the first frame rate;
a frame interpolation module, provided at the receiving end and connected to the receiving module, which performs a frame interpolation operation on the two video data frames received by the receiving module to obtain at least one video data interpolation frame;
a framing module, provided at the receiving end and connected to the frame interpolation module, which inserts the video data interpolation frame between the two video data frames to form a group of video data frames to be played.
Preferably, the sending module sends the video data frame to the receiving end through a physical interface.
Preferably, the sending module comprises:
a buffer unit, provided at the transmitting end, into which one video data frame is written within one frame duration corresponding to the first frame rate;
a signal transmission unit, provided at the transmitting end, which transmits a control signal and an auxiliary signal to the physical interface;
a data transmission unit, provided at the transmitting end and connected to the buffer unit, which transmits the video data frame to the physical interface within a preset time threshold;
a period compensation unit, which waits for the current transmission period to end.
Preferably, when the data transmission unit transmits the video data frame, the ratio of the frequency of the line synchronization signal to a reference frequency is adjusted to at least 2; the receiving module frequency-divides the received line synchronization signal according to the ratio.
Preferably, the video processing apparatus further comprises:
a playing module, which displays the video data frames to be played according to the second frame rate.
After the above technical solution is adopted, compared with the prior art it has the following beneficial effects:
1. The delay of video data during processing is effectively reduced, the real-time performance of interactive operations is improved, and the user experience is enhanced;
2. No changes to the hardware devices are required, so the cost is low.
Brief Description of the Drawings
FIG. 1 is a schematic flowchart of the video data processing method according to a preferred embodiment of the present invention;
FIG. 2 is a schematic flowchart of step S102 according to a preferred embodiment of the present invention;
FIG. 3 is a structural block diagram of the video data processing apparatus according to a preferred embodiment of the present invention;
FIG. 4 is a structural block diagram of the sending module according to a preferred embodiment of the present invention.
Reference numerals:
10 - video data processing apparatus, 11 - transmitting end, 111 - conversion module, 112 - sending module, 1121 - buffer unit, 1122 - signal transmission unit, 1123 - data transmission unit, 1124 - period compensation unit, 12 - receiving end, 121 - receiving module, 122 - frame interpolation module, 123 - framing module, 124 - playing module.
Detailed Description of the Invention
The advantages of the present invention are further described below with reference to the accompanying drawings and specific embodiments.
Exemplary embodiments will be described in detail here, examples of which are shown in the accompanying drawings. Where the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless indicated otherwise. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to limit the present disclosure. The singular forms "a", "the" and "said" used in the present disclosure and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in the present disclosure to describe various information, the information should not be limited to these terms. These terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of the present disclosure, first information may also be referred to as second information and, similarly, second information may also be referred to as first information. Depending on the context, the word "if" as used herein may be interpreted as "at the time of", "when" or "in response to determining".
In the description of the present invention, it should be understood that orientation or positional relationships indicated by terms such as "longitudinal", "transverse", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner" and "outer" are based on the orientations or positional relationships shown in the drawings, are only for the convenience of describing the present invention and simplifying the description, and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation; they therefore cannot be understood as limiting the present invention.
In the description of the present invention, unless otherwise specified and limited, it should be noted that the terms "mounted", "connected" and "coupled" should be understood broadly; for example, the connection may be a mechanical connection or an electrical connection, an internal communication between two elements, a direct connection, or an indirect connection through an intermediate medium. For those of ordinary skill in the art, the specific meanings of the above terms can be understood according to the specific situation.
In the following description, suffixes such as "module", "component" or "unit" used to denote elements are only used to facilitate the description of the present invention and have no specific meaning in themselves. Therefore, "module" and "component" may be used interchangeably.
Referring to FIG. 1, which is a schematic flowchart of the video data processing method according to a preferred embodiment of the present invention, the video data processing method includes:
S101: the transmitting end 11 converts the video data into at least one video data frame according to the first frame rate.
The transmitting end 11 may be a device with decoding capability, such as a player or a graphics card, which decodes a video file in digital format into a playable video signal composed of multiple frames of video data. The transmitting end 11 generates each frame of video data according to the first frame rate, which may be 15 fps, 24 fps or 30 fps, where fps denotes the number of frames transmitted per second: the more frames per second, the smoother the displayed motion. In general, the minimum value for avoiding jerky motion is 30 fps, and some computer video formats can only provide 15 frames per second. The video data may be in a data format such as wmv, rmvb, 3gp or mp4 and is usually stored in a storage device in the form of a video file. This step converts the video data into at least one video data frame; a video data frame is the video data content played in each frame, usually in the form of a pixel picture, and can be regarded as one image. When the first frame rate is 15 fps, there are 15 video data frames per second. The number of converted video data frames differs depending on the playback duration of the video data and the first frame rate. The video data frames are the basis of the subsequent playback operation: the playback device plays them frame by frame to achieve the dynamic video effect.
S102: within each frame duration corresponding to the first frame rate, the transmitting end 11 sends the video data frame generated in the previous frame duration to the receiving end 12 once, wherein the ratio of the data transmission time within the transmission period of each video data frame to the transmission period is less than or equal to one half.
The receiving end 12 is the device that plays the video, such as a display screen or a television, and operates at the second frame rate, which may be 60 fps or even higher. The second frame rate may be twice the first frame rate or even more, to achieve smooth playback. Likewise, when the user selects fast-forward, the playback device raises the second frame rate to achieve the fast-forward effect, and the first frame rate is raised synchronously during fast-forward.
In the prior art, the duration over which the transmitting end 11 sends a video data frame is roughly equal to the duration of playing and displaying one video data frame, i.e. one frame is transmitted while one frame is played; and the video data frame currently being transmitted is the one converted from the video file during the previous frame duration. For example, when the first frame rate is 30 fps, the corresponding frame duration is 33.3 ms and step S101 converts one video data frame within each frame duration: the transmitting end 11 converts the first video data frame within 0-33.3 ms, then transmits it to the receiving end 12 once within the second frame duration of 33.3-66.6 ms, while continuing to convert the second video data frame; within 66.6-99.9 ms the transmitting end 11 sends the second video data frame to the receiving end 12, and so on, transmitting the video data frames one by one until the video file has been transmitted. The transmission period of each video data frame is roughly equal to one frame duration and includes a data transmission time and an auxiliary transmission time, where the data transmission time is the time actually used to transmit the video data frame, and the auxiliary transmission time is the time used for control signal transmission, audio signal transmission or transmission of other auxiliary information; the auxiliary transmission time is also called the blanking interval and is denoted by "blanking" in software development. The proportion of the data transmission time in the transmission period is generally above 80%, i.e. data transmission occupies well over half of the transmission period.
This step improves on the above prior art: the data transmission time is shortened by increasing the transmission speed of the video data frame, thereby reducing the ratio of the data transmission time to the transmission period, the ratio being less than or equal to one half. The auxiliary transmission time in the transmission period can be lengthened correspondingly so that the transmission period remains unchanged and is still roughly equal to one frame duration. The key point of this step is that the generation speed of the video data frame is decoupled from the transmission speed, breaking the prior-art route in which the two are essentially coordinated and synchronized, and shortening the video data transmission time by increasing the transmission speed of the video data frame. Increasing the transmission speed of the video data frame can be achieved by raising the utilization of the transmission interface: for example, the maximum data transmission speed of the HDMI interface is 48 Gbps, while one 1080p video stream plus one 8-channel audio signal requires less than 0.5 GB/s, so the transmission interface still has a great deal of headroom and the transmission speed of the video data frame can be increased several times or even tens of times.
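As a rough back-of-the-envelope check of the headroom claim above (not part of the patent text), assuming uncompressed 24-bit RGB 1080p video at 60 fps and ignoring audio and blanking overhead:

```python
# Rough bandwidth check for the headroom claim above.
# Assumptions: uncompressed 24-bit RGB 1080p at 60 fps, audio and blanking ignored.
width, height, bytes_per_pixel, fps = 1920, 1080, 3, 60

video_bytes_per_s = width * height * bytes_per_pixel * fps   # ~0.37 GB/s payload
video_gbps = video_bytes_per_s * 8 / 1e9                     # ~2.99 Gbit/s
hdmi_max_gbps = 48.0                                          # HDMI 2.1 maximum link rate

print(f"video payload  : {video_bytes_per_s / 1e9:.2f} GB/s ({video_gbps:.2f} Gbit/s)")
print(f"interface limit: {hdmi_max_gbps:.1f} Gbit/s -> ~{hdmi_max_gbps / video_gbps:.0f}x headroom")
```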
For example, when the ratio of the data transmission time to the transmission period is one third, the first frame rate is 30 fps and the corresponding frame duration is 33.3 ms, and step S101 converts one video data frame within each frame duration. The transmitting end 11 converts the first video data frame within 0-33.3 ms and transmits it to the receiving end 12 within 33.3-44.4 ms of the second frame duration; within 33.3-66.6 ms the transmitting end 11 converts the second video data frame; and within 66.6-77.7 ms the transmitting end 11 sends the second video data frame to the receiving end 12. In the prior art the second video data frame finishes being sent at 99.9 ms, while in this step it finishes at 77.7 ms, 22.2 ms earlier. The above calculation does not consider the transmission time of the auxiliary signals in the transmission period; in fact the auxiliary signal transmission for the second video data frame can be completed within the previous frame duration without occupying time in the current frame duration, so that transmission of the video data frame can start as early as possible within a frame duration. Note that the frame duration used in this step is calculated at the first frame rate, i.e. the playback duration of each frame at the first frame rate.
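A minimal arithmetic sketch of the timing comparison in this example (illustrative only; the 33.3 ms figure follows the patent's rounding of 1/30 second, and the helper name is ours):

```python
# Timing of the example above: 30 fps source, data transmission occupying
# one third of each transmission period. All values in milliseconds.
frame_duration_ms = 33.3          # the patent's rounded figure for 1/30 s
data_ratio = 1 / 3                # data transmission time / transmission period

def completion_time(frame_index, ratio):
    """Time at which frame `frame_index` (1-based) finishes arriving: it is
    generated during slot `frame_index` and transmitted during the next slot,
    finishing once `ratio` of that slot has elapsed."""
    return frame_index * frame_duration_ms + ratio * frame_duration_ms

prior_art_done = completion_time(2, 1.0)         # 99.9 ms (full-period transfer)
proposed_done = completion_time(2, data_ratio)   # 77.7 ms
print(f"2nd frame received (prior art): {prior_art_done:.1f} ms")
print(f"2nd frame received (proposed) : {proposed_done:.1f} ms "
      f"({prior_art_done - proposed_done:.1f} ms earlier)")
```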
S103: the receiving end 12 receives one video data frame in each of two adjacent frame durations corresponding to the first frame rate.
This step describes how the receiving end 12 receives. Since step S102 still transmits the same video data frame only once within one frame duration corresponding to the first frame rate, the receiving end 12 receives at most one video data frame within the same time. Because step S102 compresses the transmission time of the video data frame within each transmission period, the receiving end 12 does not actually have to wait until the end of the current frame duration to finish receiving the video data frame, and can start the subsequent frame interpolation operation earlier. In this step the receiving end 12 receives one video data frame in each of two adjacent frame durations corresponding to the first frame rate, which serves as the basis for the frame interpolation operation performed in the subsequent steps.
For example, when the ratio of the data transmission time to the transmission period is one third, the first frame rate is 30 fps and the corresponding frame duration is 33.3 ms: within the second frame duration of 33.3-66.6 ms the receiving end 12 receives the first video data frame once, finishing the reception at 44.4 ms; within the third frame duration of 66.6-99.9 ms the receiving end 12 receives the second video data frame once, finishing the reception at 77.7 ms. Likewise, the receiving end 12 receives one video data frame per frame duration, and so on until all the video data has been received, one video data frame being received in every two adjacent frame durations. Note that the frame duration used in this step is calculated at the first frame rate.
In the prior art, the frame interpolation operation can only be performed on the basis of two video data frames, so the receiving end 12 only has both the first and second video data frames at the end of the third frame duration, and only then can the subsequent interpolation operation proceed; the actual interpolated playback therefore has to wait at least until the end of the third frame duration, i.e. after 99.9 ms.
This step compresses the reception time at the receiving end 12. In the above example the receiving end 12 finishes receiving the second video data frame at 77.7 ms and can immediately perform the frame interpolation operation; since the interpolation time is very short relative to one frame duration, for example about 3 ms, playback of the interpolated frame can start at about 80 ms, which reduces the interpolation delay compared with the prior art and shortens the overall video playback delay. If in step S102 the transmitting end 11 sends the same video data frame within one frame duration even faster, the time within a frame duration at which the receiving end 12 receives a video data frame becomes even shorter, the frame interpolation operation is advanced further, and the video processing delay becomes even shorter.
S104: performing a frame interpolation operation on the two video data frames received in step S103 to obtain at least one video data interpolation frame.
This step performs the frame interpolation operation; the objects participating in the operation are the two adjacent video data frames received in step S103. A common interpolation algorithm is MEMC, whose full name is Motion Estimate and Motion Compensation, i.e. motion estimation and motion compensation, a motion picture quality compensation technique often used in LCD televisions. Its principle is to use a dynamic mapping system to insert one motion-compensated frame between two conventional frames, raising the 50/60 Hz refresh rate of an ordinary flat-panel television to 100/120 Hz. In this way the moving picture becomes clearer and smoother than the normal response, thereby removing the residual image of the previous frame, improving dynamic definition and reducing motion smearing to a level that the human eye can hardly perceive. For example, the frame order of an original moving picture is 1, 2, 3, 4, 5, 6; after analysing the motion trend of the image in both the horizontal and vertical directions block by block, MEMC inserts an intermediate frame, i.e. a video data interpolation frame, between the original frames, and the frame sequence after interpolation becomes 1, 1C, 2, 2C, 3, 3C, 4, 4C, 5, 5C, 6. The original field frequency is then no longer sufficient to show all the frames, so the field frequency has to be doubled, i.e. from 50/60 Hz to 100/120 Hz; it can be seen that MEMC and frequency doubling are inseparable.
After the interpolation operation, the receiving end 12 will play the final video data frames at the second frame rate. Depending on the multiple relationship between the first frame rate and the second frame rate, the number of frames inserted between two video data frames also differs. For example, if the first frame rate is 30 fps and the second frame rate is 60 fps, one video data interpolation frame is inserted between every two video data frames; if the second frame rate is 120 fps, three video data interpolation frames must be inserted between every two video data frames.
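The rate relationship described above can be sketched as follows. This is only an illustrative stand-in: true MEMC derives each inserted frame from block-based motion estimation and compensation, whereas this sketch simply blends the two received frames at the corresponding temporal positions; the function name and the integer-multiple assumption are ours, not the patent's.

```python
import numpy as np

def interpolation_frames(prev_frame, next_frame, first_rate, second_rate):
    """Return the frames to insert between two received video data frames.

    The number of inserted frames follows the rate relationship above
    (30 fps -> 60 fps inserts 1 frame, 30 fps -> 120 fps inserts 3).
    A temporal blend stands in here for the MEMC motion-compensated frame.
    """
    assert second_rate % first_rate == 0, "second rate assumed an integer multiple"
    n_insert = second_rate // first_rate - 1
    frames = []
    for k in range(1, n_insert + 1):
        w = k / (n_insert + 1)                    # temporal position of inserted frame k
        blend = (1.0 - w) * prev_frame.astype(np.float32) \
                + w * next_frame.astype(np.float32)
        frames.append(blend.astype(prev_frame.dtype))
    return frames
```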
S105: inserting the video data interpolation frame between the two video data frames to form a group of video data frames to be played.
This step performs the framing operation: the video data interpolation frame obtained in step S104 is inserted into the received video data frames to form a group of video data frames to be played, so that the playback device can play the video data frames one by one at the operating frequency of its hardware, i.e. the second frame rate, without stutter.
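As an illustration of the framing operation of step S105, a short sketch (hypothetical helper names, assuming at least two received frames) that interleaves the received frames with the interpolation frames computed for each adjacent pair:

```python
def assemble_playback_sequence(received_frames, interpolate):
    """Framing (S105) sketch: interleave each pair of received frames with the
    interpolation frames computed for that pair, yielding the ordered group of
    frames to be played. `interpolate(prev, nxt)` returns the list of frames to
    insert between `prev` and `nxt` (e.g. the function sketched after S104)."""
    playback = []
    for prev, nxt in zip(received_frames, received_frames[1:]):
        playback.append(prev)
        playback.extend(interpolate(prev, nxt))
    playback.append(received_frames[-1])   # the last received frame closes the group
    return playback
```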
As a further improvement of the video data processing method, when step S102 is performed the transmitting end 11 sends the video data frame to the receiving end 12 through a physical interface. This improved embodiment defines the connection between the transmitting end 11 and the receiving end 12, i.e. a physical-interface connection, the physical interface being a video transmission interface such as MIPI, HDMI or DisplayPort. Within the auxiliary transmission time of the transmission period, after the control signal, audio signal or other auxiliary information has been sent, the physical interface may be in a high-impedance or other state and in a low-power mode.
As a further improvement of the video data processing method, after step S105 the video data processing method further includes the following step:
S106: the receiving end 12 displays the video data frames to be played according to the second frame rate.
This step performs the playback operation: the assembled group of video data frames, including the video data interpolation frames, is played on the display device. Each video data frame already records the pixel information needed for the playback picture, and the hardware device can display this pixel information to play back the video data. Since steps S103, S104 and S105 are executed continuously and keep producing video data frames to be played, this step does not have to wait until many video data frames have been received; playback at the second frame rate can start as soon as framing in step S105 is complete.
Referring to FIG. 2, which is a schematic flowchart of step S102 according to a preferred embodiment of the present invention, step S102 performs the following steps within each frame duration corresponding to the first frame rate:
S102-1: writing one video data frame into the buffer unit 1121 in the transmitting end 11.
After the transmitting end 11 generates the video data frame, it stores the video data frame in the buffer unit 1121 in the transmitting end 11; in some application environments the buffer unit is also called a frame buffer. This step is completed at the code layer, i.e. one write instruction is executed to the buffer unit 1121 within one frame duration.
S102-2: transmitting a control signal and an auxiliary signal to the physical interface.
This step performs the control signal and auxiliary signal transmission operations and may also include the transmission of an audio signal. The transmission time occupied by transmission of the above signals and of the video data frame is not compressed. Different physical interfaces have different control signal types and interface protocols and therefore different control signal timings; some control signals may act cooperatively during transmission of the video data frame and are not necessarily executed as a separate step.
S102-3: transmitting the video data frame to the physical interface within a preset time threshold.
This step sends the video data frame in the buffer unit 1121 to the physical interface; it is implemented by the driver layer, which converts the software data into electrical signals sent out through the physical interface. This step must satisfy the protocol of the physical interface. The time for transmitting the video data frame in this step should be within a preset time threshold that is less than one half of the transmission period, which in effect limits the transmission time of the video data frame in order to achieve the technical effect of shortening the video processing delay.
S102-4: waiting for the current transmission period to end.
The purpose of this step is to make up the remaining duration of the current transmission period so as to keep the transmission rhythm of the video data frames. Since the transmitting end 11 generates video data frames at the first frame rate, even if the transmission speed of the video data frame becomes faster, it is still necessary to wait for the transmitting end 11 to generate the next video data frame before it can be transmitted; this step is therefore implemented to keep the transmission period stable.
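A minimal sketch of how steps S102-1 to S102-4 could be sequenced within one frame duration; the `interface` object and its methods are hypothetical stand-ins for the physical-interface driver, not an API defined by the patent:

```python
import time

def send_loop(frames, frame_duration_s, time_threshold_s, interface):
    """Sketch of S102-1 to S102-4 per frame duration: buffer the frame, send
    control/auxiliary signals, push the frame within the preset time threshold,
    then wait out the remainder of the transmission period."""
    for frame in frames:
        period_start = time.monotonic()

        frame_buffer = frame                  # S102-1: write into the buffer unit
        interface.send_control_and_aux()      # S102-2: control + auxiliary signals
        interface.send_frame(frame_buffer)    # S102-3: must finish within the threshold
        if time.monotonic() - period_start > time_threshold_s:
            raise RuntimeError("frame transmission exceeded the preset time threshold")

        # S102-4: period compensation - wait until the current transmission period
        # (one frame duration at the first frame rate) has elapsed.
        remaining = frame_duration_s - (time.monotonic() - period_start)
        if remaining > 0:
            time.sleep(remaining)
```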
As a further improvement of the video data processing method, when step S102-3 is performed, the ratio of the frequency of the line synchronization signal to a reference frequency is adjusted to at least 2; when step S103 is performed, the received line synchronization signal is frequency-divided according to that ratio. This improved embodiment further specifies a preferred technical means of shortening the transmission time of the video data frame, namely adjusting the frequency of the line synchronization signal used when the video data frame is transmitted. The reference frequency is the frequency of the line synchronization signal in the prior art when the proportion of the data transmission time in the transmission period is above 80%. Line synchronization, also called horizontal synchronization, controls the process in which the electron beam in a display returns from the right-hand side to the starting point (the left-hand edge of the screen), also called horizontal retrace. With the development of digital display technology, when digital video data is transmitted the line synchronization signal is also called HSYNC; while HSYNC is valid, the received signals belong to the same line. Correspondingly, there is also a field synchronization signal called VSYNC; while VSYNC is valid, the received signals belong to the same field. For example, to display a picture of A x B pixels (A pixels per line, B lines), the frequencies are roughly related, ignoring blanking intervals, by f_HSYNC = B x f_VSYNC and f_PCLK = A x f_HSYNC, where PCLK is the pixel synchronization clock signal and each PCLK cycle corresponds to one pixel. The reference frequency is the frequency of HSYNC in the prior art. According to the above relationship, if the frequency of HSYNC is raised to 3 times the reference frequency while the period of VSYNC is unchanged, the video data frame, i.e. the pixel data, can be transmitted within one third of the transmission period, and an auxiliary transmission time, i.e. blanking time, of two thirds of the transmission period still needs to be inserted. In this improved embodiment the ratio of the frequency of the line synchronization signal to the reference frequency is at least 2, i.e. the transmission time of the video data frame is at least halved. Since the frequency of the line synchronization signal is at least doubled, the receiving end 12 needs to frequency-divide the received line synchronization signal to restore the actual line synchronization timing; this can be implemented by adding a frequency divider at the receiving end 12, the division factor of which is the ratio of the frequency of the line synchronization signal to the reference frequency. The receiving end 12 also needs to buffer the video data frame first and then process it together with the frequency-divided line synchronization signal to achieve display synchronization.
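A small numeric illustration of the line-synchronization relationship described above (blanking intervals ignored, figures illustrative only):

```python
# Line/pixel clock relationship for an A x B picture (A pixels per line, B lines),
# blanking intervals ignored.
A, B = 1920, 1080
f_vsync = 30.0                      # field/frame rate (Hz), the reference case
f_hsync_ref = B * f_vsync           # reference line frequency  ~32.4 kHz
f_pclk_ref = A * f_hsync_ref        # reference pixel clock     ~62.2 MHz

boost = 3                           # raise HSYNC to 3x the reference frequency
f_hsync = boost * f_hsync_ref
active_fraction = 1 / boost         # pixel data now fits in 1/3 of the period
print(f"HSYNC: {f_hsync_ref/1e3:.1f} kHz -> {f_hsync/1e3:.1f} kHz, "
      f"active transfer occupies {active_fraction:.0%} of each transmission period")
```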
参阅图3,为符合本发明一优选实施例中所述视频数据处理装置10的结构框图,所述视频数据处理装置10包括工作于第一帧速率的发送端11及工作于第二帧速率的接收端12,所述视频数据处理装置10还包括:
-转换模块111
所述转换模块111设于所述发送端12,按照所述第一帧速率将所述视频数据转换为至少一个视频数据帧。所述转换模块111可以是播放机、显卡等具备解码能力的设备,将不同数据格式的所述视频数据转换为多个视频数据帧,转换视频数据帧时满足所述第一帧速率。
-发送模块112
所述发送模块112设于所述发送端11,与所述转换模112连接,所述发送端11在所述第一帧速率对应的每一帧时长内向所述接收端12发送前一帧时长内产生的视频数据帧1次,其中每一视频数据帧的传输周期内的数据传输时间与所述传输周期的比值小于或等于二分之一。所述发送模块112从所述转换模块111接收转换好的视频数据帧并向所述接收端12发送。所述发送模块112充分利用了视频数据传输通道的容量,提升了传输速率,压缩了发送一个视频数据帧的时间。
-接收模块121
所述接收模块121设于所述接收端12，于所述第一帧速率对应的相邻两个帧时长内分别接收一视频数据帧。所述接收模块121接收所述发送模块112发来的视频数据帧，由于所述发送模块112压缩了所述视频数据帧的传输时间，因此所述接收模块121在所述传输周期的中途即可完成所述视频数据帧的接收。所述接收模块121在相邻两个帧时长内可分别接收到两个视频数据帧，为后续的插帧运算提供基础。由于所述接收模块121在一个帧时长内最多有二分之一的时间用于接收所述视频数据帧，因此所述帧时长内的剩余时间可用于插帧运算，相对于现有技术提前了插帧运算开始的时刻。所述接收模块121工作参照的帧时长是以第一帧速率计算的，即在第一帧速率下每一帧的播放时长。
-插帧运算模块122
所述插帧运算模块122设于所述接收端12,与所述接收模块121连接,对所述接收模块121接收的两个视频数据帧进行插帧运算,得到至少一个视频数据插帧。所述插帧运算模块122从所述接收模块121获取相邻的两个视频数据帧,以所述两个视频数据帧为基础进行插帧运算。所述插帧运算模块122内嵌插帧运算算法,例如MEMC,即运动估计和运动补偿算法。
-组帧模块123
所述组帧模块123设于所述接收端12，与所述插帧运算模块122连接，将所述视频数据插帧放入所述两个视频数据帧之间，形成一组待播放的视频数据帧。所述组帧模块123从所述插帧运算模块122获取视频数据插帧，再从所述接收模块121获取接收的视频数据帧，将所述视频数据插帧插入作为其计算基础的两个视频数据帧之间，共同组成一组待播放的视频数据帧。
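接收模块121、插帧运算模块122与组帧模块123的配合可用下面的示意性Python草图表达（类名、方法名均为示例假设，插帧运算仍以线性混合代替MEMC，仅示意数据流向）：

```python
class ReceiverPipeline:
    """接收端流水线示意：接收相邻两帧 -> 插帧运算 -> 组帧输出待播放帧序列。"""

    def __init__(self, frames_per_gap: int = 1):
        self.frames_per_gap = frames_per_gap   # 每两帧之间插入的插帧数量
        self.pending = []                      # 已接收的帧（接收模块121）

    def interpolate(self, a, b, t):
        # 插帧运算模块122的示意：以线性混合代替MEMC运动估计与补偿
        return [(1 - t) * x + t * y for x, y in zip(a, b)]

    def on_frame_received(self, frame):
        """每收到一帧调用一次；凑满相邻两帧即产出一段待播放序列（组帧模块123）。"""
        self.pending.append(frame)
        if len(self.pending) < 2:
            return []
        a, b = self.pending[-2], self.pending[-1]
        inserted = [self.interpolate(a, b, (i + 1) / (self.frames_per_gap + 1))
                    for i in range(self.frames_per_gap)]
        return [a] + inserted          # 输出：原始帧 + 其后的插帧

pipeline = ReceiverPipeline(frames_per_gap=1)
print(pipeline.on_frame_received([0.0, 0.0]))     # 第一帧，暂无输出
print(pipeline.on_frame_received([10.0, 10.0]))   # 输出 [第一帧, 插帧]
```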
作为所述视频数据处理装置的进一步改进,所述发送模块112通过一物理接口向所述接收端12发送视频数据帧。本改进实施例对所述发送端11和接收端12的连接方式作了限定,即通过物理接口连接,所述物理接口为MIPI、HDMI、DisplayPort等视频传输接口。所述发送模块112通过所述物理接口与所述接收模块121连接,实现视频数据帧的传输。
作为所述视频数据处理装置10的进一步改进，所述视频数据处理装置10还包括：
-播放模块124
所述播放模块124设于所述接收端12,与所述组帧模块123连接,按照所述第二帧速率显示所述待播放的视频数据帧。所述播放模块124从所述组帧模块123获取所述待播放的视频数据帧,并按照所述第二帧速率进行播放。所述播放模块124可以是显示屏及其显示电路,显示电路用于将所述视频数据帧转换为显现物理像素的电信号,所述显示屏显示所述物理像素。
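播放模块124按第二帧速率逐帧显示的节拍控制，可用下面的简单Python草图示意（display_frame为示例假设的显示回调，代表显示电路与显示屏的实际工作）：

```python
import time

def play_at_second_rate(frames, second_rate_fps: float, display_frame):
    """按第二帧速率逐帧显示待播放的视频数据帧（示意）。"""
    interval = 1.0 / second_rate_fps
    next_deadline = time.monotonic()
    for frame in frames:
        display_frame(frame)                 # 实际显示由显示电路和显示屏完成
        next_deadline += interval
        delay = next_deadline - time.monotonic()
        if delay > 0:
            time.sleep(delay)

# 用打印代替真实显示的用法示例
play_at_second_rate(["1", "1C", "2"], 60.0, lambda f: print("显示帧", f))
```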
参阅图4,为符合本发明一优选实施例中所述发送模块112的结构框图,所述发送模块112包括:
-缓存单元1121
所述缓存单元1121设于所述发送端11,在所述第一帧速率对应的一帧时长内写入一视频数据帧。所述缓存单元1121可以是物理存储介质,例如内存、硬盘等,能够存储数据。
-信号传输单元1122
所述信号传输单元1122,设于所述发送端11,向所述物理接口传输控制信号及辅助信号。所述信号传输单元1122执行控制信号及辅助信号传输操作,也可以包括音频信号的传输操作。根据不同的物理接口,其控制信号的类型和接口协议均不同,也会有不同的控制信号时序,部分控制信号会在所述视频数据帧的传输过程中协同作用。
-数据传输单元1123
所述数据传输单元1123设于所述发送端11，与所述缓存单元1121连接，于一预设时间阈值内向所述物理接口传输所述视频数据帧。所述数据传输单元1123通过驱动层实现将软件数据转换为电信号通过所述物理接口发出。所述数据传输单元1123传输所述视频数据帧时须满足所述物理接口的协议，且传输所述视频数据帧的时间应在一预设的时间阈值内，所述时间阈值小于所述传输周期的二分之一，实际上就是限定了所述视频数据帧传输时间，以实现缩短视频处理延时的技术效果。根据某些物理接口的接口协议，所述数据传输单元1123传输所述视频数据帧时，需要所述信号传输单元1122同步配合进行控制信号状态控制，例如在HDMI接口中，需要对VSYNC和HSYNC的高低电平进行控制，以进行数据传输的辅助确认。
-周期补偿单元1124
所述周期补偿单元1124等待当前传输周期结束。所述周期补偿单元1124的作用是补足当前传输周期的时长，以保证所述视频数据帧的传输节奏。由于所述转换模块111产生所述视频数据帧的速度按照所述第一帧速率进行，因此即便所述视频数据帧的传输速度变快了，仍需等待所述转换模块111产生下一视频数据帧才能进行传输。故在所述数据传输单元1123传输所述视频数据帧完毕后，所述周期补偿单元1124保持所述传输周期稳定不变。
作为上述视频数据处理装置10的进一步改进，所述数据传输单元1123传输所述视频数据帧时，调整行同步信号的频率与基准频率的比值至少为2；所述接收模块121对接收的行同步信号按照所述比值进行分频。本改进实施例进一步优选了缩短所述视频数据帧的传输时间的技术手段，即通过调整视频数据帧传输时的行同步信号的频率来实现。所述数据传输单元1123将行同步信号的频率提升至少1倍，使所述视频数据帧的传输时间至少缩短一半。所述接收模块121还设有分频器，对接收到的行同步信号进行分频，以还原实际的行同步时序。所述分频器的分频倍数为所述行同步信号的频率与所述基准频率的比值。所述接收模块121还需对所述视频数据帧先行缓存，而后配合分频后的行同步信号进行处理，从而实现显示同步。
应当注意的是,本发明的实施例有较佳的实施性,且并非对本发明作任何形式的限制,任何熟悉该领域的技术人员可能利用上述揭示的技术内容变更或修饰为等同的有效实施例,但凡未脱离本发明技术方案的内容,依据本发明的技术实质对以上实施例所作的任何修改或等同变化及修饰,均仍属于本发明技术方案的范围内。

Claims (10)

  1. 一种视频数据处理方法,用于处理由工作于第一帧速率的发送端发送至工作于第二帧速率的接收端的视频数据,其特征在于,包括以下步骤:
    S101:所述发送端按照所述第一帧速率将所述视频数据转换为至少一个视频数据帧;
    S102:所述发送端在所述第一帧速率对应的每一帧时长内向所述接收端发送前一帧时长内产生的视频数据帧1次,其中每一视频数据帧的传输周期内的数据传输时间与所述传输周期的比值小于或等于二分之一;
    S103:所述接收端于所述第一帧速率对应的相邻两个帧时长内分别接收一视频数据帧;
    S104:对步骤S103中接收的两个视频数据帧进行插帧运算,得到至少一个视频数据插帧;
    S105:将所述视频数据插帧放入所述两个视频数据帧之间,形成一组待播放的视频数据帧。
  2. 如权利要求1所述的视频数据处理方法,其特征在于,
    步骤S102执行时,所述发送端通过一物理接口向所述接收端发送视频数据帧。
  3. 如权利要求2所述的视频数据处理方法,其特征在于,
    所述步骤S102在所述第一帧速率对应的每一帧时长内执行以下步骤:
    S102-1:向所述发送端内的缓存单元写入一视频数据帧;
    S102-2:向所述物理接口传输控制信号及辅助信号;
    S102-3:于一预设时间阈值内向所述物理接口传输所述视频数据帧;
    S102-4:等待当前传输周期结束。
  4. 如权利要求3所述的视频数据处理方法,其特征在于,
    步骤S102-3执行时,调整行同步信号的频率与基准频率的比值至少为2;
    步骤S103执行时,对接收的行同步信号按照所述比值进行分频。
  5. 如权利要求1-4任一项所述的视频数据处理方法,其特征在于,
    步骤S105之后,所述视频数据处理方法还包括以下步骤:
    S106:所述接收端按照所述第二帧速率显示所述待播放的视频数据帧。
  6. 一种视频数据处理装置,包括工作于第一帧速率的发送端及工作于第二帧速率的接收端,其特征在于,所述视频数据处理装置包括:
    转换模块,设于所述发送端,按照所述第一帧速率将所述视频数据转换为至少一个视频数据帧;
    发送模块,设于所述发送端,与所述转换模块连接,在所述第一帧速率对应的每一帧时长内向所述接收端发送前一帧时长内产生的视频数据帧1次,其中每一视频数据帧的传输周期内的数据传输时间与所述传输周期的比值小于或等于二分之一;
    接收模块,设于所述接收端,于所述第一帧速率对应的相邻两个帧时长内分别接收一视频数据帧;
    插帧运算模块,设于所述接收端,与所述接收模块连接,对所述接收模块接收的两个视频数据帧进行插帧运算,得到至少一个视频数据插帧;
    组帧模块,设于所述接收端,与所述插帧运算模块连接,将所述视频数据插帧放入所述两个视频数据帧之间,形成一组待播放的视频数据帧。
  7. 如权利要求6所述的视频数据处理装置,其特征在于,
    所述发送模块通过一物理接口向所述接收端发送视频数据帧。
  8. 如权利要求7所述的视频数据处理装置,其特征在于,
    所述发送模块包括:
    缓存单元,设于所述发送端,在所述第一帧速率对应的一帧时长内写入一视频数据帧;
    信号传输单元,设于所述发送端,向所述物理接口传输控制信号及辅助信号;
    数据传输单元,设于所述发送端,与所述缓存单元连接,于一预设时间阈值内向所述物理接口传输所述视频数据帧;
    周期补偿单元,等待当前传输周期结束。
  9. 如权利要求8所述的视频数据处理装置,其特征在于,
    所述数据传输单元传输所述视频数据帧时,调整行同步信号的频率与基准频率的比值至少为2;
    所述接收模块对接收的行同步信号按照所述比值进行分频。
  10. 如权利要求6-9任一项所述的视频数据处理装置,其特征在于,
    所述视频处理装置还包括:
    播放模块,按照所述第二帧速率显示所述待播放的视频数据帧。
PCT/CN2018/111520 2017-10-24 2018-10-23 一种视频数据处理方法及视频数据处理装置 WO2019080847A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2020520518A JP2021500786A (ja) 2017-10-24 2018-10-23 ビデオデータ処理方法及びビデオデータ処理装置
EP18869522.5A EP3644613A4 (en) 2017-10-24 2018-10-23 VIDEO DATA PROCESSING PROCESS AND VIDEO DATA PROCESSING DEVICE
KR1020207006697A KR20200077507A (ko) 2017-10-24 2018-10-23 비디오 데이터 처리 방법 및 비디오 데이터 처리 장치
US16/854,819 US20200252581A1 (en) 2017-10-24 2020-04-21 Video data processing method and video data processing device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201711001979.4 2017-10-24
CN201711001979.4A CN107707860B (zh) 2017-10-24 2017-10-24 一种视频数据处理方法、处理装置及计算机可读存储介质

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/854,819 Continuation US20200252581A1 (en) 2017-10-24 2020-04-21 Video data processing method and video data processing device

Publications (1)

Publication Number Publication Date
WO2019080847A1 true WO2019080847A1 (zh) 2019-05-02

Family

ID=61182224

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/111520 WO2019080847A1 (zh) 2017-10-24 2018-10-23 一种视频数据处理方法及视频数据处理装置

Country Status (6)

Country Link
US (1) US20200252581A1 (zh)
EP (1) EP3644613A4 (zh)
JP (1) JP2021500786A (zh)
KR (1) KR20200077507A (zh)
CN (1) CN107707860B (zh)
WO (1) WO2019080847A1 (zh)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107707860B (zh) * 2017-10-24 2020-04-10 南昌黑鲨科技有限公司 一种视频数据处理方法、处理装置及计算机可读存储介质
CN107707934A (zh) * 2017-10-24 2018-02-16 南昌黑鲨科技有限公司 一种视频数据处理方法、处理装置及计算机可读存储介质
CN111586319B (zh) * 2020-05-27 2024-04-09 北京百度网讯科技有限公司 视频的处理方法和装置
KR20220006680A (ko) * 2020-07-08 2022-01-18 삼성디스플레이 주식회사 표시 장치 및 이를 이용한 표시 패널의 구동 방법
EP4280595A4 (en) * 2021-01-29 2024-03-06 Huawei Technologies Co., Ltd. DATA TRANSMISSION METHOD AND APPARATUS

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8542747B2 (en) * 2006-12-26 2013-09-24 Broadcom Corporation Low latency cadence detection for frame rate conversion
JP5183231B2 (ja) * 2008-02-05 2013-04-17 キヤノン株式会社 映像再生装置及び制御方法
CN105306866A (zh) * 2015-10-27 2016-02-03 青岛海信电器股份有限公司 帧率转换方法及装置

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101031042A (zh) * 2006-02-28 2007-09-05 三星电子株式会社 具有帧率转换的图像显示设备及其方法
CN101529890A (zh) * 2006-10-24 2009-09-09 索尼株式会社 图像摄取设备和再现控制设备
CN101669361A (zh) * 2007-02-16 2010-03-10 马维尔国际贸易有限公司 用于改善低分辨率和低帧速率视频的方法和系统
US20090135910A1 (en) * 2007-11-27 2009-05-28 Samsung Electronics, Co., Ltd Video apparatus to combine graphical user interface (gui) with frame rate conversion (frc) video and method of providing a gui thereof
CN105828183A (zh) * 2015-01-04 2016-08-03 华为技术有限公司 处理视频帧的方法、视频处理芯片以及运动估计和运动补偿memc芯片
CN106713855A (zh) * 2016-12-13 2017-05-24 深圳英飞拓科技股份有限公司 一种视频播放方法及装置
CN107707934A (zh) * 2017-10-24 2018-02-16 南昌黑鲨科技有限公司 一种视频数据处理方法、处理装置及计算机可读存储介质
CN107707860A (zh) * 2017-10-24 2018-02-16 南昌黑鲨科技有限公司 一种视频数据处理方法、处理装置及计算机可读存储介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3644613A4 *

Also Published As

Publication number Publication date
CN107707860A (zh) 2018-02-16
JP2021500786A (ja) 2021-01-07
CN107707860B (zh) 2020-04-10
US20200252581A1 (en) 2020-08-06
KR20200077507A (ko) 2020-06-30
EP3644613A1 (en) 2020-04-29
EP3644613A4 (en) 2020-12-09

Similar Documents

Publication Publication Date Title
WO2019080847A1 (zh) 一种视频数据处理方法及视频数据处理装置
WO2019080846A1 (zh) 一种视频数据处理方法及视频数据处理装置
US8300087B2 (en) Method and system for response time compensation for 3D video processing
CN104917990B (zh) 通过调整垂直消隐进行视频帧速率补偿
JP5695211B2 (ja) ベースバンド映像データの送信装置および受信装置ならびに送受信システム
CN111479154B (zh) 音画同步的实现设备、方法及计算机可读存储介质
JP2007072130A (ja) 画像表示システム、画像表示装置、画像データ出力装置、画像処理プログラム、及びこの画像処理プログラムを記録した記録媒体
US8610763B2 (en) Display controller, display control method, program, output device, and transmitter
WO2013182011A1 (zh) 在线视频实时变速播放方法及系统
JP5183231B2 (ja) 映像再生装置及び制御方法
US8593575B2 (en) Video display apparatus for shortened-delay processing of a video signal and video processing method
CN114613306A (zh) 显示控制芯片、显示面板及相关设备、方法和装置
US20110018979A1 (en) Display controller, display control method, program, output device, and transmitter
CN112188182A (zh) 立体显示控制系统、立体显示系统及立体显示控制器
CN111757034A (zh) 一种基于fpga的视频同步显示方法、装置和存储介质
CN202285412U (zh) 低帧率传输或运动图像闪烁消除系统
WO2017032115A1 (zh) 音视频播放设备和音视频播放方法
WO2015132957A1 (ja) 映像機器及び映像処理方法
TW202002604A (zh) 影像處理方法及電子設備
WO2013076778A1 (ja) 映像送信装置、映像受信装置、映像送信方法及び映像受信方法
WO2023000484A1 (zh) 一种帧率稳定输出方法、系统及智能终端
CN118175247A (zh) 一种低帧频设备实现高帧率vrr的方法及系统
CN115762428A (zh) 一种基于电子墨水屏的显示装置及系统
JP2013066020A (ja) 動画再生装置及び動画再生方法
JP2010166337A (ja) 通信装置、通信装置の制御方法、制御プログラム、および、記録媒体

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18869522

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2018869522

Country of ref document: EP

Effective date: 20200125

ENP Entry into the national phase

Ref document number: 2020520518

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE