WO2016154816A1 - Data processing method and apparatus - Google Patents

Data processing method and apparatus

Info

Publication number
WO2016154816A1
Authority
WO
WIPO (PCT)
Prior art keywords
image frame
current image
encoding
display
encoded
Prior art date
Application number
PCT/CN2015/075290
Other languages
English (en)
French (fr)
Inventor
孙增才
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to RU2017135314A (patent RU2662648C1)
Priority to CN201580065215.4A (patent CN107004018B)
Priority to PCT/CN2015/075290 (WO2016154816A1)
Priority to JP2017550612A (patent JP6483850B2)
Priority to EP15886803.4A (patent EP3264284B1)
Publication of WO2016154816A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454: Digital output to display device involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/124: Quantisation
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/136: Incoming video signal characteristics or properties
    • H04N19/137: Motion inside a coding unit, e.g. average field, frame or block difference
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/154: Measured or subjectively estimated visual quality after decoding, e.g. measurement of distortion
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/42: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
    • H04N19/423: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, characterised by memory arrangements
    • H04N19/426: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, characterised by memory arrangements using memory downsizing methods
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00: Control of display operating conditions
    • G09G2320/10: Special adaptations of display systems for operation with variable images
    • G09G2320/103: Detection of image changes, e.g. determination of an index representative of the image change
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00: Aspects of display data processing
    • G09G2340/02: Handling of images in compressed format, e.g. JPEG, MPEG
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2350/00: Solving problems of bandwidth in display systems

Definitions

  • the present invention relates to the field of data processing, and in particular, to a data processing method and apparatus.
  • Multi-screen shared display refers to a technique in which an image frame in the display buffer of a first device is simultaneously displayed on a second device.
  • the image frame needs to be encoded before the first device shares the image frame.
  • The first device predetermines an encoding quality. The encoding quality indicates the restoration level to which the encoded image frame can be restored relative to the pre-encoding image frame, and is positively correlated with that level: the higher the encoding quality, the larger the amount of encoded data and the higher the restoration level.
  • The first device encodes the image frame into I frame, P frame, or B frame data according to the encoding quality and the importance of the image frame, and shares the encoded image frame with the second device for display.
  • I frame data is obtained by encoding a key frame among the image frames.
  • P frame data is obtained by encoding the difference between the P frame and the I frame or P frame that precedes it.
  • B frame data is obtained by encoding the difference between the B frame and both the I frame or P frame that precedes it and the P frame that follows it.
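The I/P/B relationship above can be illustrated with a minimal difference-encoding sketch. The flat lists standing in for frames and the helper names are illustrative, not taken from the patent:

```python
def p_frame_payload(reference, current):
    """A P frame stores only the difference from the preceding I/P frame."""
    return [c - r for c, r in zip(current, reference)]

def reconstruct(reference, payload):
    """Decoding adds the stored difference back onto the reference frame."""
    return [r + d for r, d in zip(reference, payload)]

ref = [128] * 16              # a 4x4 grayscale key (I) frame, flattened
cur = list(ref)
cur[0] = 130                  # one pixel changed in the next frame
diff = p_frame_payload(ref, cur)
assert diff.count(0) == 15    # mostly zeros, so it compresses far better
assert reconstruct(ref, diff) == cur
```

A B frame would additionally reference the following P frame; the principle of storing differences rather than full frames is the same.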
  • If the image frame is generated in a dynamic display scene, the differences between consecutive image frames are large. If the encoding quality determined at this time is too high, the amount of encoded data is large and transmitting the encoded image frame takes more time, which reduces the real-time performance of the shared display.
  • the embodiment of the present invention provides a data processing method and apparatus.
  • the technical solution is as follows:
  • a data processing method comprising:
  • the first device detects whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or in a static display scene, where the dynamic display scene is a display scene in which the image frame changes dynamically within a predetermined time period, and the static display scene is a display scene in which the image frame remains unchanged during the predetermined time period;
  • if the current image frame is an image frame generated in a dynamic display scene, the current image frame is encoded according to a first encoding quality, where the first encoding quality indicates the first restoration level to which the encoded image frame can be restored relative to the pre-encoding image frame;
  • if the current image frame is an image frame generated in a static display scene, the current image frame is encoded according to a second encoding quality, where the second encoding quality indicates the second restoration level to which the encoded image frame can be restored relative to the pre-encoding image frame, and the second restoration level is higher than the first restoration level;
  • the encoded current image frame is shared to a second device for display.
  • the method further includes:
  • the static display condition is that the current image frame displayed remains unchanged
  • the current image frame that is encoded again is shared to the second device for display.
  • detecting, by the first device, whether the display of the current image frame meets a static display condition includes:
  • the pause instruction is for instructing the first device to stop updating the current image frame in the display buffer to a next image frame.
  • the first device detecting whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or in a static display scene includes:
  • determining the application that generates the current image frame, and detecting whether the application belongs to a dynamic program list or a static program list, where the dynamic program list includes applications that provide a dynamic display scene and the static program list includes applications that provide a static display scene; or,
  • determining, according to the encoding interface invoked by the application that provides the current image frame, whether the current image frame is an image frame generated in a dynamic display scene or in a static display scene, where the encoding interface includes a first encoding interface and a second encoding interface, the first encoding interface being configured to instruct that an image frame be encoded according to the first encoding quality and the second encoding interface being configured to instruct that an image frame be encoded according to the second encoding quality; or,
  • In a fourth possibility of the first aspect, before the first device detects whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or in a static display scene, the method further includes: acquiring the resolution of the second device; and if the resolution of the second device is higher than the resolution of the first device, performing super-resolution processing on the current image frame.
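The super-resolution step above could be gated as in this sketch. The nearest-neighbour upscale is an illustrative stand-in for a real super-resolution algorithm, and all names are assumptions:

```python
def prepare_frame(frame, src_res, dst_res):
    """Upscale only when the receiving (second) device has more pixels
    than the sending (first) device; otherwise pass the frame through."""
    (src_w, src_h), (dst_w, dst_h) = src_res, dst_res
    if dst_w * dst_h <= src_w * src_h:
        return frame                      # no super-resolution needed
    # Nearest-neighbour stand-in for a real super-resolution algorithm.
    return [[frame[y * src_h // dst_h][x * src_w // dst_w]
             for x in range(dst_w)]
            for y in range(dst_h)]

small = [[1, 2], [3, 4]]                  # 2x2 source frame
up = prepare_frame(small, (2, 2), (4, 4))
assert len(up) == 4 and len(up[0]) == 4   # upscaled to the 4x4 display
assert prepare_frame(small, (2, 2), (2, 2)) is small  # same size: untouched
```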
  • the method further includes:
  • adjusting, according to the quality of the network, the encoding parameter corresponding to the first encoding quality, or adjusting, according to the quality of the network, the encoding parameter corresponding to the second encoding quality, where the quality of the network is positively correlated with the data amount of the encoded current image frame, and the current image frame is encoded according to the adjusted encoding parameter.
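One way to picture this adjustment is to map a measured network quality (here, bandwidth) onto an encoder quantisation parameter, so that a better network yields more encoded data. The thresholds, offsets, and QP range below are illustrative assumptions, not values from the patent:

```python
def adjust_qp(base_qp, bandwidth_mbps, min_qp=10, max_qp=51):
    """Lower QP means higher quality and more data, so higher bandwidth
    maps to a lower QP (network quality positively correlated with the
    encoded data volume)."""
    if bandwidth_mbps >= 50:
        qp = base_qp - 8      # fast network: spend more bits
    elif bandwidth_mbps >= 10:
        qp = base_qp
    else:
        qp = base_qp + 8      # slow network: shrink the frames
    return max(min_qp, min(max_qp, qp))

assert adjust_qp(30, 100) == 22
assert adjust_qp(30, 2) == 38
assert adjust_qp(48, 2) == 51   # clamped to an H.264-style QP range
```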
  • the second encoding quality corresponds to a lossless compression mode
  • the sharing the encoded current image frame to the second device for display includes:
  • the lossy compressed data channel is used to transmit image frames encoded with a lossy encoding protocol, that is, by lossy compression.
  • a data processing apparatus comprising:
  • an image frame detection module, configured to detect whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or in a static display scene, where the dynamic display scene is a display scene in which the image frame changes dynamically within a predetermined time period, and the static display scene is a display scene in which the image frame remains unchanged during the predetermined time period;
  • a first encoding module, configured to encode the current image frame according to a first encoding quality when the image frame detection module detects that the current image frame is an image frame generated in a dynamic display scene, where the first encoding quality indicates the first restoration level to which the encoded image frame can be restored relative to the pre-encoding image frame;
  • a second encoding module, configured to encode the current image frame according to a second encoding quality when the image frame detection module detects that the current image frame is an image frame generated in a static display scene, where the second encoding quality indicates the second restoration level to which the encoded image frame can be restored relative to the pre-encoding image frame, the second restoration level being higher than the first restoration level;
  • a first sharing module configured to share the current image frame encoded by the first encoding module or the second encoding module to a second device for display.
  • the device further includes:
  • a display detection module, configured to: after the first sharing module shares the encoded current image frame with the second device for display, if the encoded current image frame was obtained by encoding the current image frame according to the first encoding quality, detect whether the display of the current image frame by the first device meets a static display condition, the static display condition being that the displayed current image frame remains unchanged;
  • a third encoding module, configured to re-encode the current image frame according to the second encoding quality when the display detection module detects that the display of the current image frame by the first device meets the static display condition;
  • a second sharing module configured to share the current image frame that is encoded by the third encoding module to the second device for display.
  • the display detection module is specifically configured to:
  • the pause instruction is for instructing the first device to stop updating the current image frame in the display buffer to a next image frame.
  • the image frame detection module is specifically configured to:
  • determine the application that generates the current image frame, and detect whether the application belongs to a dynamic program list or a static program list, the dynamic program list including applications that provide a dynamic display scene and the static program list including applications that provide a static display scene; or,
  • determine, according to the encoding interface invoked by the application that provides the current image frame, whether the current image frame is an image frame generated in a dynamic display scene or in a static display scene, where the encoding interface includes a first encoding interface and a second encoding interface, the first encoding interface being configured to instruct that an image frame be encoded according to the first encoding quality and the second encoding interface being configured to instruct that an image frame be encoded according to the second encoding quality; or,
  • the device further includes:
  • an image frame processing module, configured to: before the image frame detection module detects whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or in a static display scene, acquire the resolution of the second device, and perform super-resolution processing on the current image frame if the resolution of the second device is higher than the resolution of the first device.
  • the device further includes:
  • a network quality obtaining module, configured to acquire the quality of the network over which the encoded current image frame is to be transmitted;
  • a parameter adjustment module, configured to adjust the encoding parameter corresponding to the first encoding quality, or the encoding parameter corresponding to the second encoding quality, according to the quality of the network acquired by the network quality obtaining module, where the quality of the network is positively correlated with the data amount of the encoded current image frame, and the current image frame is encoded according to the adjusted encoding parameter.
  • the second encoding quality corresponds to a lossless compression mode
  • the first sharing module is specifically configured to:
  • the lossy compressed data channel is used to transmit an image frame encoded by lossy compression
  • the frame sharing is performed with the second device, where the encoding protocol includes an encoding protocol corresponding to the lossless compression mode, and the lossy compressed data channel is used to transmit image frames obtained via the lossy encoding protocol.
  • a data processing apparatus comprising: a bus, and a processor, a memory, a transmitter, and a receiver coupled to the bus.
  • the memory is for storing a plurality of instructions, the instructions being configured to be executed by the processor;
  • the processor is configured to detect whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or in a static display scene, where the dynamic display scene is a display scene in which the image frame changes dynamically within a predetermined time period, and the static display scene is a display scene in which the image frame remains unchanged during the predetermined time period;
  • the processor is further configured to: when the current image frame is an image frame generated in a dynamic display scene, encode the current image frame according to a first encoding quality, where the first encoding quality indicates the first restoration level to which the encoded image frame can be restored relative to the pre-encoding image frame;
  • the processor is further configured to: when the current image frame is an image frame generated in a static display scene, encode the current image frame according to a second encoding quality, where the second encoding quality indicates the second restoration level to which the encoded image frame can be restored relative to the pre-encoding image frame, the second restoration level being higher than the first restoration level;
  • the transmitter is configured to share the current image frame encoded by the processor to a second device for display.
  • the processor is further configured to: after the transmitter shares the encoded current image frame with the second device for display, if the encoded current image frame was obtained by encoding the current image frame according to the first encoding quality, detect whether the display of the current image frame by the first device satisfies a static display condition, the static display condition being that the displayed current image frame remains unchanged;
  • the processor is further configured to: when the display of the current image frame by the first device meets the static display condition, re-encode the current image frame according to the second encoding quality;
  • the transmitter is further configured to share the current image frame that is encoded by the processor to the second device for display.
  • the processor is specifically configured to:
  • the pause instruction is for instructing the first device to stop updating the current image frame in the display buffer to a next image frame.
  • the processor is specifically configured to:
  • determine the application that generates the current image frame, and detect whether the application belongs to a dynamic program list or a static program list, the dynamic program list including applications that provide a dynamic display scene and the static program list including applications that provide a static display scene; or,
  • determine, according to the encoding interface invoked by the application that provides the current image frame, whether the current image frame is an image frame generated in a dynamic display scene or in a static display scene, where the encoding interface includes a first encoding interface and a second encoding interface, the first encoding interface being configured to instruct that an image frame be encoded according to the first encoding quality and the second encoding interface being configured to instruct that an image frame be encoded according to the second encoding quality; or,
  • the processor is further configured to: before detecting whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or in a static display scene, obtain the resolution of the second device; and if the resolution of the second device is higher than the resolution of the first device, perform super-resolution processing on the current image frame.
  • the processor is further configured to acquire the quality of the network over which the encoded current image frame is to be transmitted
  • the processor is further configured to: adjust the encoding parameter corresponding to the first encoding quality, or the encoding parameter corresponding to the second encoding quality, according to the quality of the network, where the quality of the network is positively correlated with the data amount of the encoded current image frame, and the current image frame is encoded according to the adjusted encoding parameter.
  • the second coding quality corresponds to a lossless compression mode
  • the transmitter is specifically configured to:
  • the lossy compressed data channel is used to transmit image frames encoded with a lossy encoding protocol, that is, by lossy compression.
  • In summary, the first device detects whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or in a static display scene; if the current image frame is an image frame generated in a dynamic display scene, the current image frame is encoded according to the first encoding quality, which indicates the first restoration level to which the encoded image frame can be restored relative to the pre-encoding image frame; if the current image frame is an image frame generated in a static display scene, the current image frame is encoded according to the second encoding quality, which indicates the second restoration level, higher than the first.
  • Because the first restoration level indicated by the first encoding quality is lower than the second restoration level indicated by the second encoding quality, a current image frame generated in a dynamic display scene can be encoded with the first encoding quality, yielding a small amount of encoded data. This solves the problem that the image frames generated in a dynamic display scene differ greatly and a too-high restoration level reduces the real-time performance of the shared display, thereby improving that real-time performance.
  • FIG. 1 is a flowchart of a data processing method according to an embodiment of the present invention
  • FIG. 3 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present invention.
  • FIG. 4 is a schematic structural diagram of still another data processing apparatus according to an embodiment of the present invention.
  • FIG. 5 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present invention.
  • the data processing method may include:
  • Step 101: The first device detects whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or in a static display scene, where the dynamic display scene is a display scene in which the image frame changes dynamically within a predetermined time period, and the static display scene is a display scene in which the image frame remains unchanged during the predetermined time period.
  • Step 102: If the current image frame is an image frame generated in a dynamic display scene, the current image frame is encoded according to the first encoding quality, where the first encoding quality indicates the first restoration level to which the encoded image frame can be restored relative to the pre-encoding image frame.
  • Step 103: If the current image frame is an image frame generated in a static display scene, the current image frame is encoded according to the second encoding quality, where the second encoding quality indicates the second restoration level to which the encoded image frame can be restored relative to the pre-encoding image frame; the second restoration level is higher than the first restoration level.
  • Step 104: Share the encoded current image frame with the second device for display.
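Steps 101 to 104 can be condensed into a short control-flow sketch. The encoder class, the quality values, and the boolean scene flag are hypothetical simplifications of the detection described in step 101:

```python
from dataclasses import dataclass

@dataclass
class Encoder:
    quality: int  # higher = restored closer to the pre-encoding frame

    def encode(self, frame):
        return {"quality": self.quality, "frame": frame}

DYNAMIC_ENCODER = Encoder(quality=40)   # first (lower) encoding quality
STATIC_ENCODER = Encoder(quality=95)    # second (higher) encoding quality

def process_current_frame(frame, is_dynamic_scene, send):
    """Steps 102/103 pick the quality; step 104 shares the result."""
    encoder = DYNAMIC_ENCODER if is_dynamic_scene else STATIC_ENCODER
    send(encoder.encode(frame))

sent = []
process_current_frame("frame-0", True, sent.append)
process_current_frame("frame-1", False, sent.append)
assert sent[0]["quality"] < sent[1]["quality"]
```

The design choice mirrors the claim: the scene type alone selects between the two predetermined qualities, so dynamic-scene frames always carry less data than static-scene frames.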
  • In summary, the data processing method detects whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or in a static display scene; if it is an image frame generated in a dynamic display scene, the current image frame is encoded according to the first encoding quality, which indicates the first restoration level to which the encoded image frame can be restored relative to the pre-encoding image frame; if it is an image frame generated in a static display scene, the current image frame is encoded according to the second encoding quality, which indicates the second restoration level. Because the first restoration level is lower than the second restoration level, encoding a dynamic-scene frame with the first encoding quality keeps the amount of encoded data small.
  • the data processing method may include:
  • Step 201: The first device detects whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or in a static display scene, where the dynamic display scene is a display scene in which the image frame changes dynamically within a predetermined time period, and the static display scene is a display scene in which the image frame remains unchanged during the predetermined time period.
  • the display buffer is a buffer for storing image frames that the first device is displaying.
  • the current image frame may be a screenshot of a file being displayed by the first device.
  • the file can be text, images, web pages, slides, desktops, videos, games, and the like.
  • the display buffer can buffer 3 image frames, and the buffer duration of the image frames in the display buffer is related to the type of the display buffer and the type of the file.
  • The type of the display buffer includes a fixed type and a refresh type. The difference is that, when the first device continuously displays the same picture, the image frame buffered in a fixed-type display buffer does not change, that is, the same image frame that the first device displays stays cached in the display buffer; in a refresh-type display buffer, the buffered image frame is periodically refreshed to the same image frame, that is, the image frame displayed by the first device is periodically refreshed to the next identical image frame.
  • Take as an example a video that displays 25 image frames per second. During playback, the buffer time of each image frame in the display buffer is 1/25 second; when the first device pauses the video, the image frame displayed at the pause has a cache duration in a fixed-type display buffer equal to the pause duration, while each identical image frame displayed during the pause has a cache duration of 1/25 second in a refresh-type display buffer.
  • The dynamic display scene refers to a display scene in which the image frame changes dynamically within a predetermined time period, for example, turning a page of text, turning a slide, refreshing a web page, playing a video, refreshing a game screen, refreshing a desktop icon, or refreshing a desktop button within the predetermined time period.
  • the static display scene refers to a display scene in which the image frame remains unchanged for a predetermined period of time, for example, displaying a picture within a predetermined time period, displaying a web page within a predetermined time period, and the like.
  • the predetermined time period may be an empirical value set by the user, or may be other values, which is not limited in this embodiment.
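The patent leaves the comparison mechanism open; one plausible realisation of "remains unchanged during the predetermined time period" is to digest the frames sampled within that window, as in this sketch (the function name and hashing approach are assumptions):

```python
import hashlib

def scene_kind(frames_in_window):
    """Static if every sampled frame is byte-identical, dynamic otherwise.
    Hashing avoids keeping full frames around for the comparison."""
    digests = {hashlib.sha256(frame).hexdigest() for frame in frames_in_window}
    return "static" if len(digests) == 1 else "dynamic"

assert scene_kind([b"same", b"same", b"same"]) == "static"
assert scene_kind([b"page-1", b"page-2"]) == "dynamic"
```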
  • the first device detects whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or an image frame generated in a static display scene, including:
  • the dynamic interface list includes the control interfaces that are called in a dynamic display scenario, and the static interface list includes the control interfaces that are called in a static display scenario; or,
  • determining, according to the encoding interface invoked by the application that provides the current image frame, whether the current image frame is an image frame generated in a dynamic display scene or in a static display scene, where the encoding interface includes a first encoding interface and a second encoding interface, the first encoding interface being configured to instruct that an image frame be encoded according to the first encoding quality and the second encoding interface being configured to instruct that an image frame be encoded according to the second encoding quality; or
  • a frame is an image frame produced in a static display scene.
  • the user can pre-set the dynamic program list and the static program list, and then add the application to the corresponding list according to the functions implemented by the application.
  • the dynamic program list includes an application that provides a dynamic display scenario
  • the static program list includes an application that provides a static display scenario.
  • For example, because a player is used to play a video and video playback is a dynamic display scene, the player can be added to the dynamic program list. Similarly, a game is a dynamic display scene, so a game program can be added to the dynamic program list.
  • In practice, the dynamic program list and the static program list may be manually classified by the user; alternatively, the program types corresponding to the dynamic program list and the static program list may be preset, and an application belonging to a certain type is automatically added to the corresponding list.
  • the program type can be set by the developer, or it can be set by the app store, or it can be set manually by the user.
  • Program types can include text, image, and video.
  • Applications such as readers and text editors can be of the text type; applications such as image viewers and image editors can be of the image type; applications such as players and games can be of the video type. Assuming that the video type corresponds to the dynamic program list, applications such as players and games can be automatically added to the dynamic program list.
  • In implementation, the first device may determine the application that generates the current image frame. If the application belongs to the dynamic program list, the first device determines that the current image frame is an image frame generated in a dynamic display scene; if the application belongs to the static program list, the first device determines that the current image frame is an image frame generated in a static display scene.
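The program-list lookup described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the list contents and function name are assumptions chosen for the example.

```python
# Illustrative sketch of program-list based scene detection.
# List contents are assumed examples, not part of the patent.
DYNAMIC_PROGRAM_LIST = {"player", "game"}
STATIC_PROGRAM_LIST = {"reader", "image_viewer"}

def classify_frame_by_app(app_name):
    """Return 'dynamic', 'static', or None when the app is in neither list."""
    if app_name in DYNAMIC_PROGRAM_LIST:
        return "dynamic"
    if app_name in STATIC_PROGRAM_LIST:
        return "static"
    return None
```

An application in neither list would fall back to one of the other detection methods described in this embodiment.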
  • The first device may detect, for each current image frame, whether it is an image frame generated in a dynamic display scene or an image frame generated in a static display scene. Because detecting every current image frame consumes considerable processing resources, the first device may instead detect only the first image frame of an image frame sequence generated by an application: if that image frame is an image frame generated in a dynamic display scene, each image frame in the sequence is determined to be an image frame generated in the dynamic display scene; if that image frame is an image frame generated in a static display scene, each image frame in the sequence is determined to be an image frame generated in the static display scene.
  • The user may preset a dynamic interface list and a static interface list, where the dynamic interface list includes control interfaces that are invoked in a dynamic display scene, and the static interface list includes control interfaces that are invoked in a static display scene.
  • For example, a control interface that is invoked when the system of the first device plays an animation and the animation is in the foreground display state can be added to the dynamic interface list.
  • the system can monitor the creation and end of the animation.
  • In implementation, the first device can detect whether the control interface belongs to the dynamic interface list or the static interface list. If the control interface belongs to the dynamic interface list, the first device determines that the current image frame is an image frame generated in a dynamic display scene; if the control interface belongs to the static interface list, the first device determines that the current image frame is an image frame generated in a static display scene.
  • Alternatively, an encoding interface may be newly added in the system, where the encoding interface includes a first encoding interface for indicating that the image frame is encoded according to the first encoding quality, and a second encoding interface for indicating that the image frame is encoded according to the second encoding quality.
  • When the application invokes the encoding interface to encode the current image frame, whether the current image frame is an image frame generated in a dynamic display scene or an image frame generated in a static display scene may be determined according to the encoding interface invoked by the application. Specifically, when the application invokes the first encoding interface to encode the current image frame, it is determined that the current image frame is an image frame generated in a dynamic display scene; when the application invokes the second encoding interface to encode the current image frame, it is determined that the current image frame is an image frame generated in a static display scene.
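The two encoding interfaces can be sketched as a pair of entry points that tag each frame with the scene type and quality implied by the interface that was called. The function names and the record layout are illustrative assumptions, not interfaces defined by the patent.

```python
# Sketch of the two newly added encoding interfaces described above.
# Names and the tagged-record structure are illustrative assumptions.
encoded_frames = []

def encode_with_first_interface(frame):
    """Calling this interface implies a dynamic-scene frame encoded at the
    first (lower) encoding quality."""
    encoded_frames.append({"frame": frame, "scene": "dynamic", "quality": "first"})

def encode_with_second_interface(frame):
    """Calling this interface implies a static-scene frame encoded at the
    second (higher) encoding quality."""
    encoded_frames.append({"frame": frame, "scene": "static", "quality": "second"})
```

The detection step then reduces to observing which interface the application called, with no inspection of the pixel data.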
  • Alternatively, all the image data included in the current image frame and in the image frame adjacent to the current image frame in the display buffer may be acquired, and whether all the image data are the same may be detected. If the image data differ, it is determined that the current image frame is an image frame generated in a dynamic display scene; if all the image data are the same, it is determined that the current image frame is an image frame generated in a static display scene.
  • The image data is a set of numerical values expressing the grayscale value of each pixel in the image frame.
  • In this way, the first device compares the current image frame with the image frame adjacent to it. Because the image data contains a large amount of data, comparing all the image data consumes considerable processing resources; therefore, feature extraction may instead be performed on the image frames, and whether the current image frame is an image frame generated in a dynamic display scene or an image frame generated in a static display scene is detected according to the obtained image features. Because the data amount of the image features is small, processing resources can be saved.
  • the image feature may be part of the image data in the image frame, or may be summary information generated by using a specific algorithm for the image frame, or may be other information. The image feature is not limited in this embodiment.
  • Specifically, an algorithm for feature extraction may be determined in advance, and the image frame adjacent to the current image frame in the display buffer is acquired; feature extraction is then performed on the current image frame and the adjacent image frame according to the algorithm to obtain two image features of the same kind, and whether the two image features are the same is detected. If the two image features differ, it is determined that the current image frame is an image frame generated in a dynamic display scene; if the two image features are the same, it is determined that the current image frame is an image frame generated in a static display scene.
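The feature-comparison detection above can be sketched with a digest as the extracted feature. The patent leaves the feature-extraction algorithm open (it may be partial image data, summary information, or other information); the MD5 digest here is only one assumed instance of "summary information generated by a specific algorithm".

```python
import hashlib

def frame_feature(pixels):
    """Extract a compact feature: an MD5 digest of the raw pixel bytes.
    This stands in for whatever feature-extraction algorithm is chosen."""
    return hashlib.md5(bytes(pixels)).hexdigest()

def classify_by_feature(current_pixels, adjacent_pixels):
    """Dynamic if the two features differ, static if they are the same."""
    if frame_feature(current_pixels) != frame_feature(adjacent_pixels):
        return "dynamic"
    return "static"
```

Comparing two short digests instead of two full pixel buffers is what saves the processing resources mentioned above.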
  • Certainly, whether the current image frame is an image frame generated in a dynamic display scene or an image frame generated in a static display scene may also be detected by other methods, which is not limited in this embodiment.
  • the first device may further provide an option for the user to select a dynamic display scene and a static display scene.
  • If the user selects the option for the dynamic display scene, it is determined that the current image frame is an image frame generated in the dynamic display scene; if the user selects the option for the static display scene, it is determined that the current image frame is an image frame generated in the static display scene.
  • Optionally, before the first device detects whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or an image frame generated in a static display scene, the method further includes: acquiring the resolution of the second device; and if the resolution of the second device is higher than the resolution of the first device, performing super-resolution processing on the current image frame.
  • The resolution of the second device may be sent to the first device after the second device establishes a connection with the first device, or may be set by the user in the first device; this embodiment does not limit the manner in which the resolution of the second device is acquired.
  • The super-resolution processing may be performed by nonlinear dynamic stretching or another super-resolution algorithm; such processing is mature and is not described in this embodiment.
  • It is assumed that the processing capability of the first device is stronger than that of the second device, so having the first device, rather than the second device, perform the super-resolution processing on the current image frame can improve the processing efficiency of the current image frame.
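The resolution check and upscaling step can be sketched as follows. Nearest-neighbour replication here merely stands in for a real super-resolution algorithm such as the nonlinear dynamic stretching mentioned above; the function name and integer scale factors are assumptions for the example.

```python
def maybe_super_resolve(frame, first_res, second_res):
    """Upscale the frame only when the second device's resolution exceeds
    the first device's. `frame` is a list of pixel rows; resolutions are
    (width, height) tuples with integer scale ratios (an assumption)."""
    w1, h1 = first_res
    w2, h2 = second_res
    if w2 * h2 <= w1 * h1:
        return frame  # second device is not higher-resolution: no processing
    sx, sy = w2 // w1, h2 // h1
    # Nearest-neighbour expansion: repeat each pixel sx times horizontally
    # and each row sy times vertically.
    return [[row[x // sx] for x in range(len(row) * sx)]
            for row in frame for _ in range(sy)]
```

Performing this step on the first device before encoding matches the assumption that the first device has the stronger processing capability.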
  • Step 202: If the current image frame is an image frame generated in a dynamic display scene, the current image frame is encoded according to the first encoding quality, where the first encoding quality is used to indicate that the encoded image frame is restored to a first restoration level of the image frame before encoding.
  • The first device may encode the current image frame at the lower first encoding quality; the first encoding quality reduces the data amount of the current image frame, thereby improving the real-time performance of displaying the current image frame.
  • the first encoding quality is used to indicate that the encoded image frame is restored to a first restoration level of the image frame before encoding.
  • The restoration level usually refers to the similarity between the restored image frame and the image frame before encoding.
  • This embodiment further provides a third encoding quality, where the third encoding quality is used to indicate that the encoded image frame is restored to a third restoration level of the image frame before encoding; the second restoration level is higher than the third restoration level, and the third restoration level is higher than the first restoration level.
  • When the current image frame belongs to an interactive dynamic display scene such as a slide show, a text page turn, or a web page update, the current image frame may be encoded according to the first encoding quality; when the current image frame belongs to a dynamic display scene such as a video or an animation, the current image frame may be encoded according to the third encoding quality.
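The three-level quality selection above can be sketched as a lookup from scene type to encoding quality. The scene labels and numeric quality values are illustrative assumptions (the patent defines only the ordering: second > third > first restoration level).

```python
# Assumed numeric stand-ins preserving the ordering second > third > first.
FIRST_QUALITY, THIRD_QUALITY, SECOND_QUALITY = 30, 60, 95

def select_quality(scene):
    """Pick an encoding quality for a frame based on its display scene."""
    if scene == "static":
        return SECOND_QUALITY   # static scene: highest restoration level
    if scene in ("video", "animation"):
        return THIRD_QUALITY    # continuous dynamic scene: middle level
    return FIRST_QUALITY        # interactive dynamic scene (slide, page turn)
```

The point of the middle level is that video and animation tolerate less degradation than interactive scenes, while still needing less fidelity than a static picture.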
  • Step 203: If the current image frame is an image frame generated in a static display scene, the current image frame is encoded according to the second encoding quality, where the second encoding quality is used to indicate that the encoded image frame is restored to a second restoration level of the image frame before encoding; the second restoration level is higher than the first restoration level.
  • The first device may encode the current image frame at the higher second encoding quality; although the second encoding quality increases the data amount of the current image frame, it improves the display effect of the current image frame.
  • Step 204: Share the encoded current image frame to the second device for display.
  • The first device shares the current image frame encoded in step 202 or step 203 to the second device, and the second device decodes and displays the encoded current image frame.
  • When the second encoding quality corresponds to the lossless compression mode, sharing the encoded current image frame to the second device for display requires transmitting losslessly compressed data. In the existing scenario, only a lossy compressed data channel exists, which is used to transmit image frames encoded by a lossy coding protocol; this embodiment therefore provides two methods for transmitting the losslessly compressed data, as follows:
  • First, a lossless compressed data channel may be added in the existing scenario in which only the lossy compressed data channel exists; that is, mutually independent lossy and lossless compressed data channels are set, where the lossy compressed data channel is used to transmit image frames encoded by the lossy coding protocol, and the lossless compressed data channel is used to transmit losslessly compressed data encoded by a lossless coding protocol.
  • Second, an encoding protocol corresponding to the lossless compression mode may be set, the encoding protocol being used to instruct the first device to encode by lossless compression, and this encoding protocol is then added to the lossy coding protocol. That is, in addition to the I frames, P frames, and B frames obtained by lossy compression, the lossy coding protocol supports lossless frames, where a lossless frame is an image frame obtained by encoding an image frame by lossless compression. In this case, the lossless frame can be transmitted through the lossy compressed data channel corresponding to the lossy coding protocol.
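The choice between the two transmission methods above can be sketched as a small dispatch function. The channel names and the boolean capability flag are illustrative assumptions, not identifiers from the patent.

```python
def choose_channel(encoding_mode, lossy_protocol_supports_lossless_frames):
    """Pick the transmission path for an encoded frame, per the two methods
    described above. `encoding_mode` is 'lossless' or 'lossy' (assumed)."""
    if encoding_mode != "lossy" and encoding_mode != "lossless":
        raise ValueError("unknown encoding mode")
    if encoding_mode == "lossy":
        return "lossy_channel"
    # Method 2: a lossless-frame type added to the lossy coding protocol
    # lets the existing lossy channel carry losslessly compressed frames.
    if lossy_protocol_supports_lossless_frames:
        return "lossy_channel"
    # Method 1: otherwise use a separate, independent lossless channel.
    return "lossless_channel"
```

Method 2 avoids creating a second channel at the cost of extending the lossy coding protocol; method 1 leaves the protocol untouched but adds a channel.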
  • Step 205: If the encoded current image frame is obtained by encoding the image frame according to the first encoding quality, detect whether the display of the current image frame by the first device satisfies a static display condition, where the static display condition is that the displayed current image frame remains unchanged.
  • When the encoded current image frame is obtained by encoding the image frame according to the first encoding quality, the data amount of the encoded current image frame is small and the display effect is relatively poor. If the user is interested in the encoded current image frame, the display effect of the encoded current image frame needs to be improved.
  • whether the user is interested in the encoded current image frame can be embodied by detecting whether the display of the current image frame by the first device satisfies a static display condition, and the static display condition is that the current image frame displayed remains unchanged.
  • Detecting whether the display of the current image frame by the first device satisfies the static display condition includes: detecting whether the duration of displaying the current image frame is greater than a preset threshold; or detecting whether a pause instruction is received, where the pause instruction is used to instruct the first device to stop updating the current image frame in the display buffer to the next image frame.
  • the user can control the length of time during which the current image frame is displayed.
  • Because the user tends to keep an image frame displayed longer when interested in it, the first device may detect whether the duration of displaying the current image frame is greater than a preset threshold. If the duration is greater than the preset threshold, it is determined that the user is interested in the current image frame, that is, the display of the current image frame by the first device satisfies the static display condition; if the duration is less than the preset threshold, it is determined that the user is not interested in the current image frame, that is, the display of the current image frame by the first device does not satisfy the static display condition.
  • For example, the image frame is a slide, and the user controls the first device to switch the currently displayed slide to the next slide by triggering a page-turning instruction. The first device can receive the page-turning instruction, switch the previous slide to the current slide, and start timing. If the timing reaches the preset threshold before a page-turning instruction is received, the display of the slide by the first device satisfies the static display condition; if a page-turning instruction is received before the timing reaches the preset threshold, the display of the slide by the first device does not satisfy the static display condition.
  • In another case, the first device switches the current image frame to the next image frame at every predetermined time interval. Because the user can pause the switching to keep an image frame displayed when interested in it, the first device can detect whether a pause instruction is received. If the first device receives the pause instruction, it is determined that the user is interested in the current image frame, that is, the display of the current image frame by the first device satisfies the static display condition; if the first device does not receive the pause instruction, it is determined that the user is not interested in the current image frame, that is, the display of the current image frame by the first device does not satisfy the static display condition.
  • For example, the image frame is a slide, and the currently displayed slide is switched to the next slide at every predetermined time interval.
  • the first device may start timing after displaying the slideshow, and detect whether a pause command is received during the time interval. If the pause command is received within the time interval, the display of the slide by the first device satisfies the static display. Condition; if the pause command is not received within the time interval, the display of the slide by the first device does not satisfy the static display condition.
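The two cases of the static display condition above can be sketched in one predicate: either the frame has been displayed longer than the preset threshold, or a pause instruction arrived before the automatic switch. The parameter names are illustrative assumptions.

```python
def satisfies_static_display(display_duration=None, threshold=None,
                             pause_received=None):
    """Static display condition per the two cases above.

    Case 1 (user-driven switching): compare the display duration with the
    preset threshold. Case 2 (automatic switching at fixed intervals):
    check whether a pause instruction was received in the interval.
    """
    if display_duration is not None and threshold is not None:
        return display_duration > threshold
    return bool(pause_received)
```

When the predicate holds, the frame is re-encoded at the second encoding quality (step 206 below) and shared again.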
  • Step 206: If the display of the current image frame by the first device satisfies the static display condition, re-encode the current image frame according to the second encoding quality.
  • Step 207: Share the re-encoded current image frame to the second device for display.
  • the first device may further encode the local content in the current image frame according to the second encoding quality, and then share the encoded partial content to the second device for display.
  • the local content may be an animation embedded in a webpage, a change of an interface button, or the like.
  • Because the first device only updates the local content in the current image frame instead of the entire current image frame, the time consumed by encoding can be reduced and the time consumed for transmitting the encoded data can be saved, improving the real-time performance of the shared display.
  • the process of sharing the re-encoded current image frame to the second device may refer to the description in step 204, and details are not described herein.
  • Optionally, the data processing method provided in this embodiment further includes: acquiring the quality of the network over which the encoded current image frame is to be transmitted; and adjusting, according to the quality of the network, the encoding parameter corresponding to the first encoding quality or the encoding parameter corresponding to the second encoding quality, where the quality of the network is positively correlated with the data amount of the encoded current image frame, and the encoded current image frame is obtained by encoding according to the encoding parameters.
  • That is, the first device may adjust the encoding parameter corresponding to the first encoding quality according to the quality of the network, or adjust the encoding parameter corresponding to the second encoding quality according to the quality of the network, so as to adjust the data amount of the current image frame on the premise that the encoding quality remains unchanged. The technology by which the first device acquires the quality of the network is mature and is not described here.
  • For example, when the quality of the network is good, the highest-quality lossy compression can be adjusted to lossless compression to improve the display effect; when the quality of the network is poor, the number of key frames can be reduced to decrease the data amount and improve the real-time performance of the shared display.
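The network-adaptive adjustment above can be sketched as follows. The quality scale (0.0–1.0), the thresholds, and the parameter names are illustrative assumptions; the patent only requires that the data amount be positively correlated with network quality.

```python
def adjust_encoding_params(network_quality, params):
    """Adjust encoding parameters so that better networks carry more data.

    `network_quality` is an assumed 0.0-1.0 score; `params` is an assumed
    dict with a compression mode and a key-frame interval.
    """
    params = dict(params)  # leave the caller's parameters untouched
    if network_quality > 0.8:
        # Good network: promote the highest-quality lossy compression
        # to lossless compression to improve the display effect.
        params["mode"] = "lossless"
    elif network_quality < 0.3:
        # Poor network: fewer key frames, i.e. a longer interval between
        # them, to reduce the data amount and keep the display real-time.
        params["keyframe_interval"] *= 2
    return params
```

In between the two thresholds, the parameters are left unchanged.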
  • In summary, the data processing method provided in this embodiment detects whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or an image frame generated in a static display scene; if the current image frame is an image frame generated in a dynamic display scene, encodes the current image frame according to the first encoding quality, where the first encoding quality is used to indicate that the encoded image frame is restored to a first restoration level of the image frame before encoding; if the current image frame is an image frame generated in a static display scene, encodes the current image frame according to the second encoding quality, where the second encoding quality is used to indicate that the encoded image frame is restored to a second restoration level of the image frame before encoding, the second restoration level being higher than the first restoration level; and shares the encoded current image frame to the second device for display. Because the current image frame can be encoded at the first encoding quality when it is an image frame generated in a dynamic display scene, and the first restoration level indicated by the first encoding quality is lower than the second restoration level indicated by the second encoding quality, the data amount of the encoded current image frame is small, which improves the real-time performance of the shared display. In addition, if the display of the current image frame by the first device satisfies the static display condition, the current image frame is re-encoded according to the second encoding quality and the re-encoded current image frame is shared to the second device for display, so that the second encoding quality is used to improve the display effect of the second device on the current image frame.
  • FIG. 3 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present invention.
  • the data processing device may include:
  • The image frame detection module 301 is configured to detect, by the first device, whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or an image frame generated in a static display scene, where the dynamic display scene is a display scene in which the image frame dynamically changes within a predetermined time period, and the static display scene is a display scene in which the image frame remains unchanged within a predetermined time period;
  • the first encoding module 302 is configured to: when the image frame detecting module 301 detects that the current image frame is an image frame generated in a dynamic display scene, encode the current image frame according to the first encoding quality, where the first encoding quality is used to indicate The encoded image frame is restored to a first reduction level of the image frame before encoding;
  • the second encoding module 303 is configured to: when the image frame detecting module 301 detects that the current image frame is an image frame generated in a static display scene, encode the current image frame according to the second encoding quality, where the second encoding quality is used to indicate The encoded image frame is restored to a second reduction level of the image frame before encoding; the second reduction level is higher than the first reduction level;
  • the first sharing module 304 is configured to share the current image frame encoded by the first encoding module 302 or the second encoding module 303 to the second device for display.
  • In summary, the data processing apparatus provided in this embodiment detects whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or an image frame generated in a static display scene; if the current image frame is an image frame generated in a dynamic display scene, encodes the current image frame according to the first encoding quality, where the first encoding quality is used to indicate that the encoded image frame is restored to a first restoration level of the image frame before encoding; if the current image frame is an image frame generated in a static display scene, encodes the current image frame according to the second encoding quality, where the second encoding quality is used to indicate that the encoded image frame is restored to a second restoration level of the image frame before encoding, the second restoration level being higher than the first restoration level; and shares the encoded current image frame to the second device for display. Because the current image frame can be encoded at the first encoding quality when it is an image frame generated in a dynamic display scene, and the first restoration level indicated by the first encoding quality is lower than the second restoration level indicated by the second encoding quality, the data amount of the encoded current image frame is small. This solves the problem that image frames generated in a dynamic display scene differ greatly from one another while the restoration level indicated by the encoding quality is too high, which reduces the real-time performance of the shared display, and thereby achieves the effect of improving the real-time performance of the shared display.
  • FIG. 4 is a schematic structural diagram of still another data processing apparatus according to an embodiment of the present invention.
  • the data processing device may include:
  • The image frame detection module 401 is configured to detect, by the first device, whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or an image frame generated in a static display scene, where the dynamic display scene is a display scene in which the image frame dynamically changes within a predetermined time period, and the static display scene is a display scene in which the image frame remains unchanged within a predetermined time period;
  • the first encoding module 402 is configured to: when the image frame detecting module 401 detects that the current image frame is an image frame generated in a dynamic display scene, encode the current image frame according to the first encoding quality, where the first encoding quality is used to indicate The encoded image frame is restored to a first reduction level of the image frame before encoding;
  • the second encoding module 403 is configured to: when the image frame detecting module 401 detects that the current image frame is an image frame generated in a static display scene, encode the current image frame according to the second encoding quality, where the second encoding quality is used to indicate The encoded image frame is restored to a second reduction level of the image frame before encoding; the second reduction level is higher than the first reduction level;
  • the first sharing module 404 is configured to share the current image frame encoded by the first encoding module 402 or the second encoding module 403 to the second device for display.
  • the device further includes:
  • the display detection module 405 is configured to: after the first sharing module 404 shares the encoded current image frame to the second device for display, if the encoded current image frame is encoded according to the first encoding quality, the current image frame is obtained. , detecting whether the display of the current image frame by the first device satisfies the static display condition, and the static display condition is that the current image frame displayed remains unchanged;
  • the third encoding module 406 is configured to re-code the current image frame according to the second encoding quality when the display detecting module 405 detects that the display of the current image frame by the first device meets the static display condition;
  • the second sharing module 407 is configured to share the current image frame that is encoded by the third encoding module 406 to the second device for display.
  • The display detection module 405 is specifically configured to: detect whether the duration of displaying the current image frame is greater than a preset threshold; or detect whether a pause instruction is received, where the pause instruction is used to instruct the first device to stop updating the current image frame in the display buffer to the next image frame.
  • the image frame detection module 401 is specifically configured to:
  • determine the application that generates the current image frame, and detect whether the application belongs to a dynamic program list or a static program list, where the dynamic program list includes applications that provide a dynamic display scene, and the static program list includes applications that provide a static display scene; or,
  • determine the control interface invoked when the current image frame is generated, and detect whether the control interface belongs to a dynamic interface list or a static interface list, where the dynamic interface list includes control interfaces invoked in a dynamic display scene, and the static interface list includes control interfaces invoked in a static display scene; or,
  • determine, according to the encoding interface invoked by the application that provides the current image frame, whether the current image frame is an image frame generated in a dynamic display scene or an image frame generated in a static display scene, where the encoding interface includes a first encoding interface and a second encoding interface, the first encoding interface being used to indicate that the image frame is encoded according to the first encoding quality, and the second encoding interface being used to indicate that the image frame is encoded according to the second encoding quality;
  • the device further includes:
  • The image frame processing module 408 is configured to: after the image frame detection module 401 detects whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or an image frame generated in a static display scene, acquire the resolution of the second device; and if the resolution of the second device is higher than the resolution of the first device, perform super-resolution processing on the current image frame.
  • the device further includes:
  • a network quality obtaining module 409 configured to acquire a quality of a network that is to transmit the encoded current image frame
  • The parameter adjustment module 410 is configured to adjust, according to the quality of the network acquired by the network quality acquisition module 409, the encoding parameter corresponding to the first encoding quality, or adjust, according to the quality of the network, the encoding parameter corresponding to the second encoding quality, where the quality of the network is positively correlated with the data amount of the encoded current image frame, and the encoded current image frame is obtained by encoding according to the encoding parameters.
  • the second encoding quality corresponds to the lossless compression mode
  • the first sharing module 404 is specifically configured to:
  • create a lossless compressed data channel, and share the encoded current image frame to the second device for display through the lossless compressed data channel, where the lossless compressed data channel and the lossy compressed data channel are independent of each other, and the lossy compressed data channel is used to transmit image frames encoded by a lossy coding protocol; or, share the encoded current image frame to the second device for display through the lossy compressed data channel corresponding to the lossy coding protocol, where the lossy coding protocol includes a newly added encoding protocol corresponding to the lossless compression mode.
  • In summary, the data processing apparatus provided in this embodiment detects whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or an image frame generated in a static display scene; if the current image frame is an image frame generated in a dynamic display scene, encodes the current image frame according to the first encoding quality, where the first encoding quality is used to indicate that the encoded image frame is restored to a first restoration level of the image frame before encoding; if the current image frame is an image frame generated in a static display scene, encodes the current image frame according to the second encoding quality, where the second encoding quality is used to indicate that the encoded image frame is restored to a second restoration level of the image frame before encoding, the second restoration level being higher than the first restoration level; and shares the encoded current image frame to the second device for display. Because the current image frame can be encoded at the first encoding quality when it is an image frame generated in a dynamic display scene, and the first restoration level indicated by the first encoding quality is lower than the second restoration level indicated by the second encoding quality, the data amount of the encoded current image frame is small, which improves the real-time performance of the shared display. In addition, if the display of the current image frame by the first device satisfies the static display condition, the current image frame is re-encoded according to the second encoding quality and the re-encoded current image frame is shared to the second device for display, so that the second encoding quality is used to improve the display effect of the second device on the current image frame.
  • FIG. 5 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present invention.
  • The data processing apparatus may include a bus 501, and a processor 502, a memory 503, a transmitter 504, and a receiver 505 connected to the bus 501.
  • The memory 503 is configured to store a number of instructions, which are configured to be executed by the processor 502.
  • The processor 502 is configured to detect whether the current image frame in the display buffer is an image frame generated in a dynamic display scene or in a static display scene, where a dynamic display scene is one in which image frames change dynamically within a predetermined time period and a static display scene is one in which the image frame remains unchanged within that period.
  • The processor 502 is further configured to encode the current image frame at a first encoding quality when the frame was generated in a dynamic display scene, the first encoding quality indicating a first restoration level at which the encoded frame can be restored to the pre-encoding frame.
  • The processor 502 is further configured to encode the current image frame at a second encoding quality when the frame was generated in a static display scene, the second encoding quality indicating a second restoration level at which the encoded frame can be restored to the pre-encoding frame; the second restoration level is higher than the first.
  • The transmitter 504 is configured to share the current image frame encoded by the processor 502 with the second device for display.
  • The processor 502 is further configured to: after the transmitter 504 shares the encoded current image frame with the second device for display, if the encoded frame was obtained by encoding at the first encoding quality, detect whether the first device's display of the current frame satisfies the static display condition, namely that the displayed current frame remains unchanged;
  • the processor 502 is further configured to re-encode the current image frame at the second encoding quality when the first device's display of the current frame satisfies the static display condition;
  • the transmitter 504 is further configured to share the current image frame re-encoded by the processor 502 with the second device for display.
  • The processor 502 is specifically configured to:
  • obtain the duration for which the first device has displayed the current image frame and detect whether the duration exceeds a preset threshold; or, detect whether a pause instruction is received, where the pause instruction instructs the first device to stop updating the image frame in the display buffer to the next image frame.
  • the processor 502 is specifically configured to:
  • determine the application that generated the current image frame and detect whether it belongs to a dynamic program list or a static program list, where the dynamic program list includes applications that provide dynamic display scenes and the static program list includes applications that provide static display scenes; or,
  • determine the control interface invoked when displaying the current image frame and detect whether it belongs to a dynamic interface list or a static interface list, where the dynamic interface list includes control interfaces invoked in dynamic display scenes and the static interface list includes control interfaces invoked in static display scenes; or,
  • detect whether the current image frame was generated in a dynamic or static display scene through the encoding interface invoked by the application providing the frame, where the encoding interfaces include a first encoding interface that instructs encoding at the first encoding quality and a second encoding interface that instructs encoding at the second encoding quality; or,
  • obtain the image frame adjacent to the current frame in the display buffer and detect whether the image data of the two frames are identical: if the data differ, the current frame was generated in a dynamic display scene; if they are identical, in a static display scene; or,
  • extract the same kind of image feature from the current frame and the adjacent frame and detect whether the resulting features are identical: if the features differ, the current frame was generated in a dynamic display scene; if they are identical, in a static display scene.
  • The processor 502 is further configured to: before detecting whether the current image frame in the display buffer was generated in a dynamic or static display scene, obtain the resolution of the second device; if the resolution of the second device is higher than that of the first device, perform super-resolution processing on the current image frame.
  • The processor 502 is further configured to obtain the quality of the network over which the encoded current image frame is to be transmitted.
  • The processor 502 is further configured to adjust the encoding parameters corresponding to the first encoding quality, or to the second encoding quality, according to the network quality, where the network quality is positively correlated with the data volume of the encoded current image frame, and the encoded frame is obtained by encoding according to the adjusted parameters.
  • When the second encoding quality corresponds to a lossless compression mode, the transmitter 504 is specifically configured to:
  • create a new lossless compressed data channel and share the encoded current image frame with the second device for display through it, where the lossless and lossy compressed data channels are independent of each other and the lossy channel is used to transmit image frames encoded in a lossy compression mode; or,
  • share the encoded current image frame with the second device for display through the lossy compressed data channel corresponding to a lossy coding protocol, where the lossy coding protocol includes a newly added coding protocol corresponding to the lossless compression mode, and the lossy compressed data channel is used to transmit image frames encoded according to the lossy coding protocol.
  • In summary, the data processing apparatus detects whether the current image frame in the display buffer was generated in a dynamic or static display scene and encodes it at the first or second encoding quality accordingly, the second restoration level being higher than the first. Because a frame encoded at the first encoding quality carries less data, this solves the problem of reduced real-time performance of shared display when frames generated in a dynamic scene differ greatly and the encoding quality is too high, and achieves the effect of improving the real-time performance of shared display. In addition, re-encoding the current frame at the second encoding quality when the static display condition is satisfied, and sharing the re-encoded frame, improves the display effect of the second device on the current frame.
  • It should be noted that when the data processing apparatus provided by the foregoing embodiments performs its functions, the division into the functional modules described above is merely an example; in practical applications, the functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the data processing apparatus provided by the foregoing embodiments and the data processing method embodiments belong to the same concept; for the specific implementation process, refer to the method embodiments, and details are not described herein again.
  • It may be clearly understood by a person skilled in the art that the disclosed systems, devices, and methods may be implemented in other manners. The device embodiments described above are merely illustrative. The division of units is merely a logical function division; in actual implementation there may be other divisions: multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the mutual couplings or direct couplings or communication connections shown or discussed may be implemented as indirect couplings or communication connections through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
  • The units described as separate components may or may not be physically separate, and components displayed as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • Each functional unit in the embodiments of the present invention may be integrated into one processing unit, each unit may exist physically alone, or two or more units may be integrated into one unit.
  • If the functions are implemented in the form of a software functional unit and sold or used as an independent product, they may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the present invention essentially, or the part contributing to the prior art, or a part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present invention.
  • The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)
  • Compression Of Band Width Or Redundancy In Fax (AREA)

Abstract

The present invention discloses a data processing method and apparatus, relating to the field of data processing. The method includes: a first device detects whether the current image frame in a display buffer is an image frame generated in a dynamic display scene or an image frame generated in a static display scene; if the current image frame was generated in a dynamic display scene, the current image frame is encoded at a first encoding quality, the first encoding quality indicating a first restoration level at which the encoded image frame can be restored to the pre-encoding image frame; if the current image frame was generated in a static display scene, it is encoded at a second encoding quality, indicating a second restoration level; the second restoration level is higher than the first restoration level. The present invention solves the problem of poor real-time performance of shared display in dynamic display scenes and achieves the effect of improving the real-time performance of shared display.

Description

Data Processing Method and Apparatus

Technical Field

The present invention relates to the field of data processing, and in particular to a data processing method and apparatus.
Background

Multi-screen shared display is a technique in which the image frames in the display buffer of a first device are simultaneously displayed on a second device. Before the first device shares an image frame, the image frame must be encoded.

The first device predetermines an encoding quality, which indicates the restoration level at which an encoded image frame can be restored to the pre-encoding image frame; encoding quality is positively correlated with restoration level, that is, the higher the encoding quality, the larger the amount of encoded data and the higher the restoration level. Based on this encoding quality and the importance of the image frame, the first device processes the frame into I-frame, P-frame, or B-frame data and shares the encoded frame with the second device for display. I-frame data is obtained by encoding a key frame; P-frame data is obtained by encoding the difference between the P-frame and a preceding I-frame or P-frame; B-frame data is obtained by encoding the difference between the B-frame and both a preceding I-frame or P-frame and a following P-frame.
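For illustration only, the I/P/B grouping described above can be sketched as a toy group-of-pictures (GOP) pattern; the function name and the 12-frame GOP size are assumptions for illustration, not part of the invention:

```python
def classify_frame(index: int, gop: int = 12) -> str:
    """Toy GOP pattern: an I frame (key frame) starts each group of `gop`
    frames; every third frame after it is a P frame, and the rest are B frames."""
    pos = index % gop
    if pos == 0:
        return "I"   # key frame, encoded on its own
    if pos % 3 == 0:
        return "P"   # encoded as a difference from a preceding I or P frame
    return "B"       # encoded as a difference from frames on both sides
```

Because P and B frames carry only differences, a run of similar frames compresses well, while rapidly changing frames do not.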
When an image frame is generated in a dynamic display scene, the differences between successive frames are large. If the predetermined encoding quality is too high, the encoded frames carry a large amount of data and take longer to transmit, which reduces the real-time performance of the shared display.
Summary

To solve the problem that the real-time performance of shared display is reduced when the image frames generated in a dynamic display scene differ greatly and the encoding quality is too high, embodiments of the present invention provide a data processing method and apparatus. The technical solutions are as follows:

According to a first aspect, a data processing method is provided, the method including:

a first device detects whether the current image frame in a display buffer is an image frame generated in a dynamic display scene or an image frame generated in a static display scene, where a dynamic display scene is a display scene in which image frames change dynamically within a predetermined time period, and a static display scene is a display scene in which the image frame remains unchanged within the predetermined time period;

if the current image frame was generated in a dynamic display scene, the current image frame is encoded at a first encoding quality, the first encoding quality indicating a first restoration level at which the encoded image frame can be restored to the pre-encoding image frame;

if the current image frame was generated in a static display scene, the current image frame is encoded at a second encoding quality, the second encoding quality indicating a second restoration level; the second restoration level is higher than the first restoration level;

the encoded current image frame is shared with a second device for display.
In a first possible implementation of the first aspect, after the encoded current image frame is shared with the second device for display, the method further includes:

if the encoded current image frame was obtained by encoding at the first encoding quality, detecting whether the first device's display of the current image frame satisfies a static display condition, the static display condition being that the displayed current image frame remains unchanged;

if the first device's display of the current image frame satisfies the static display condition, re-encoding the current image frame at the second encoding quality;

sharing the re-encoded current image frame with the second device for display.

According to the first possible implementation of the first aspect, in a second possible implementation, detecting whether the first device's display of the current image frame satisfies the static display condition includes:

obtaining the duration for which the first device has displayed the current image frame and detecting whether the duration exceeds a preset threshold; or,

detecting whether a pause instruction is received, the pause instruction instructing the first device to stop updating the current image frame in the display buffer to the next image frame.
According to the first aspect or either of the foregoing implementations, in a third possible implementation, the first device detecting whether the current image frame in the display buffer was generated in a dynamic or static display scene includes:

determining the application that generated the current image frame and detecting whether the application belongs to a dynamic program list or a static program list, the dynamic program list including applications that provide dynamic display scenes and the static program list including applications that provide static display scenes; or,

determining the control interface invoked when displaying the current image frame and detecting whether it belongs to a dynamic interface list or a static interface list, the dynamic interface list including control interfaces invoked in dynamic display scenes and the static interface list including control interfaces invoked in static display scenes; or,

detecting whether the current image frame was generated in a dynamic or static display scene through the encoding interface invoked by the application providing the frame, the encoding interfaces including a first encoding interface that instructs encoding at the first encoding quality and a second encoding interface that instructs encoding at the second encoding quality; or,

obtaining the image frame adjacent to the current image frame in the display buffer and detecting whether the image data of the two frames are identical: if the image data differ, the current frame was generated in a dynamic display scene; if they are identical, in a static display scene; or,

obtaining the adjacent image frame, extracting the same kind of image feature from the current frame and the adjacent frame, and detecting whether the resulting features are identical: if the features differ, the current frame was generated in a dynamic display scene; if they are identical, in a static display scene.
According to the first aspect or any of the foregoing implementations, in a fourth possible implementation, before the first device detects whether the current image frame in the display buffer was generated in a dynamic or static display scene, the method further includes:

obtaining the resolution of the second device;

if the resolution of the second device is higher than that of the first device, performing super-resolution processing on the current image frame.

According to the first aspect or any of the foregoing implementations, in a fifth possible implementation, the method further includes:

obtaining the quality of the network over which the encoded current image frame is to be transmitted;

adjusting the encoding parameters corresponding to the first encoding quality, or the encoding parameters corresponding to the second encoding quality, according to the network quality, the network quality being positively correlated with the data volume of the encoded current image frame, the encoded frame being obtained by encoding according to the adjusted parameters.

In a sixth possible implementation of the first aspect, the second encoding quality corresponds to a lossless compression mode, and sharing the encoded current image frame with the second device for display includes:

creating a new lossless compressed data channel and sharing the encoded current image frame with the second device for display through it, the lossless compressed data channel being independent of the lossy compressed data channel, which is used to transmit image frames encoded in a lossy compression mode; or,

sharing the encoded current image frame with the second device for display through the lossy compressed data channel corresponding to a lossy coding protocol, the lossy coding protocol including a newly added coding protocol corresponding to the lossless compression mode, the lossy compressed data channel being used to transmit image frames encoded according to the lossy coding protocol.
According to a second aspect, a data processing apparatus is provided, the apparatus including:

an image frame detection module, configured to detect, on a first device, whether the current image frame in a display buffer is an image frame generated in a dynamic display scene or in a static display scene, where a dynamic display scene is one in which image frames change dynamically within a predetermined time period and a static display scene is one in which the image frame remains unchanged within that period;

a first encoding module, configured to encode the current image frame at a first encoding quality when the image frame detection module detects that the frame was generated in a dynamic display scene, the first encoding quality indicating a first restoration level at which the encoded frame can be restored to the pre-encoding frame;

a second encoding module, configured to encode the current image frame at a second encoding quality when the image frame detection module detects that the frame was generated in a static display scene, the second encoding quality indicating a second restoration level; the second restoration level is higher than the first;

a first sharing module, configured to share the current image frame encoded by the first encoding module or the second encoding module with a second device for display.
In a first possible implementation of the second aspect, the apparatus further includes:

a display detection module, configured to: after the first sharing module shares the encoded current image frame with the second device for display, if the encoded frame was obtained by encoding at the first encoding quality, detect whether the first device's display of the current frame satisfies the static display condition, namely that the displayed frame remains unchanged;

a third encoding module, configured to re-encode the current image frame at the second encoding quality when the display detection module detects that the first device's display of the frame satisfies the static display condition;

a second sharing module, configured to share the current image frame re-encoded by the third encoding module with the second device for display.

According to the first possible implementation of the second aspect, in a second possible implementation, the display detection module is specifically configured to:

obtain the duration for which the first device has displayed the current image frame and detect whether the duration exceeds a preset threshold; or,

detect whether a pause instruction is received, the pause instruction instructing the first device to stop updating the current image frame in the display buffer to the next image frame.
According to the second aspect or either of the foregoing implementations, in a third possible implementation, the image frame detection module is specifically configured to:

determine the application that generated the current image frame and detect whether it belongs to a dynamic program list or a static program list, the dynamic program list including applications that provide dynamic display scenes and the static program list including applications that provide static display scenes; or,

determine the control interface invoked when displaying the current image frame and detect whether it belongs to a dynamic interface list or a static interface list, the dynamic interface list including control interfaces invoked in dynamic display scenes and the static interface list including control interfaces invoked in static display scenes; or,

detect whether the current image frame was generated in a dynamic or static display scene through the encoding interface invoked by the application providing the frame, the encoding interfaces including a first encoding interface that instructs encoding at the first encoding quality and a second encoding interface that instructs encoding at the second encoding quality; or,

obtain the image frame adjacent to the current frame in the display buffer and detect whether the image data of the two frames are identical: if they differ, the current frame was generated in a dynamic display scene; if identical, in a static display scene; or,

extract the same kind of image feature from the current frame and the adjacent frame and detect whether the features are identical: if they differ, the current frame was generated in a dynamic display scene; if identical, in a static display scene.
According to the second aspect or any of the foregoing implementations, in a fourth possible implementation, the apparatus further includes:

an image frame processing module, configured to: before the image frame detection module detects whether the current image frame was generated in a dynamic or static display scene, obtain the resolution of the second device, and if it is higher than that of the first device, perform super-resolution processing on the current image frame.

According to the second aspect or any of the foregoing implementations, in a fifth possible implementation, the apparatus further includes:

a network quality obtaining module, configured to obtain the quality of the network over which the encoded current image frame is to be transmitted;

a parameter adjustment module, configured to adjust the encoding parameters corresponding to the first encoding quality, or to the second encoding quality, according to the network quality obtained by the network quality obtaining module, the network quality being positively correlated with the data volume of the encoded frame, which is obtained by encoding according to the adjusted parameters.

In a sixth possible implementation of the second aspect, the second encoding quality corresponds to a lossless compression mode, and the first sharing module is specifically configured to:

create a new lossless compressed data channel and share the encoded current image frame with the second device for display through it, the lossless compressed data channel being independent of the lossy compressed data channel, which is used to transmit image frames encoded in a lossy compression mode; or,

share the encoded current image frame with the second device for display through the lossy compressed data channel corresponding to a lossy coding protocol, the lossy coding protocol including a newly added coding protocol corresponding to the lossless compression mode, the lossy compressed data channel being used to transmit image frames encoded according to that protocol.
According to a third aspect, a data processing apparatus is provided, the apparatus including a bus, and a processor, a memory, a transmitter, and a receiver connected to the bus, where the memory is configured to store a number of instructions configured to be executed by the processor;

the processor is configured to detect whether the current image frame in a display buffer is an image frame generated in a dynamic display scene or in a static display scene, where a dynamic display scene is one in which image frames change dynamically within a predetermined time period and a static display scene is one in which the image frame remains unchanged within that period;

the processor is further configured to encode the current image frame at a first encoding quality when the frame was generated in a dynamic display scene, the first encoding quality indicating a first restoration level at which the encoded frame can be restored to the pre-encoding frame;

the processor is further configured to encode the current image frame at a second encoding quality when the frame was generated in a static display scene, the second encoding quality indicating a second restoration level; the second restoration level is higher than the first;

the transmitter is configured to share the current image frame encoded by the processor with a second device for display.
In a first possible implementation of the third aspect,

the processor is further configured to: after the transmitter shares the encoded current image frame with the second device for display, if the encoded frame was obtained by encoding at the first encoding quality, detect whether the first device's display of the current frame satisfies the static display condition, namely that the displayed frame remains unchanged;

the processor is further configured to re-encode the current image frame at the second encoding quality when the first device's display of the frame satisfies the static display condition;

the transmitter is further configured to share the current image frame re-encoded by the processor with the second device for display.

According to the first possible implementation of the third aspect, in a second possible implementation, the processor is specifically configured to:

obtain the duration for which the first device has displayed the current image frame and detect whether the duration exceeds a preset threshold; or,

detect whether a pause instruction is received, the pause instruction instructing the first device to stop updating the current image frame in the display buffer to the next image frame.
According to the third aspect or either of the foregoing implementations, in a third possible implementation, the processor is specifically configured to perform the detection in any of the five ways described for the first aspect: by the generating application's membership in a dynamic or static program list; by the invoked control interface's membership in a dynamic or static interface list; through the first or second encoding interface invoked by the application providing the frame; by comparing the image data of the current frame and the adjacent frame in the display buffer; or by extracting and comparing the same kind of image feature from the current frame and the adjacent frame, where a differing result indicates a dynamic display scene and an identical result indicates a static display scene.
According to the third aspect or any of the foregoing implementations, in a fourth possible implementation, the processor is further configured to: before the detection, obtain the resolution of the second device, and if it is higher than that of the first device, perform super-resolution processing on the current image frame.

In a fifth possible implementation, the processor is further configured to obtain the quality of the network over which the encoded current image frame is to be transmitted, and to adjust the encoding parameters corresponding to the first or second encoding quality according to that quality, the network quality being positively correlated with the data volume of the encoded frame, which is obtained by encoding according to the adjusted parameters.

In a sixth possible implementation of the third aspect, the second encoding quality corresponds to a lossless compression mode, and the transmitter is specifically configured to share the encoded current image frame with the second device for display either through a newly created lossless compressed data channel, independent of the lossy compressed data channel used for frames encoded in a lossy compression mode, or through the lossy compressed data channel corresponding to a lossy coding protocol that includes a newly added coding protocol corresponding to the lossless compression mode.
The beneficial effects of the technical solutions provided by the embodiments of the present invention are as follows:

By detecting whether the current image frame in the display buffer was generated in a dynamic or static display scene, encoding it at the first encoding quality (indicating a first restoration level) in the dynamic case and at the second encoding quality (indicating a higher, second restoration level) in the static case, the current image frame can be encoded at the first encoding quality when it was generated in a dynamic display scene. Because the first restoration level indicated by the first encoding quality is lower than the second restoration level indicated by the second encoding quality, the encoded current image frame carries less data. This solves the problem of reduced real-time performance of shared display when frames generated in a dynamic display scene differ greatly and the restoration level indicated by the encoding quality is too high, and achieves the effect of improving the real-time performance of shared display.
Brief Description of the Drawings

To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. The drawings show only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.

FIG. 1 is a flowchart of a data processing method according to an embodiment of the present invention;

FIG. 2 is a flowchart of another data processing method according to an embodiment of the present invention;

FIG. 3 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present invention;

FIG. 4 is a schematic structural diagram of another data processing apparatus according to an embodiment of the present invention;

FIG. 5 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present invention.
Detailed Description

To make the objectives, technical solutions, and advantages of the present invention clearer, the implementation manners of the present invention are further described in detail below with reference to the accompanying drawings.

Referring to FIG. 1, which shows a flowchart of a data processing method according to an embodiment of the present invention, the data processing method may include:

Step 101: A first device detects whether the current image frame in a display buffer is an image frame generated in a dynamic display scene or in a static display scene, where a dynamic display scene is a display scene in which image frames change dynamically within a predetermined time period and a static display scene is one in which the image frame remains unchanged within that period.

Step 102: If the current image frame was generated in a dynamic display scene, encode it at a first encoding quality, the first encoding quality indicating a first restoration level at which the encoded frame can be restored to the pre-encoding frame.

Step 103: If the current image frame was generated in a static display scene, encode it at a second encoding quality, the second encoding quality indicating a second restoration level; the second restoration level is higher than the first.

Step 104: Share the encoded current image frame with a second device for display.

In summary, with the data processing method provided by this embodiment of the present invention, the current image frame can be encoded at the first encoding quality when it was generated in a dynamic display scene. Because the first restoration level indicated by the first encoding quality is lower than the second restoration level indicated by the second encoding quality, the encoded frame carries less data, which solves the problem of reduced real-time performance of shared display when frames generated in a dynamic scene differ greatly and the restoration level indicated by the encoding quality is too high, and achieves the effect of improving the real-time performance of shared display.
Referring to FIG. 2, which shows a flowchart of another data processing method according to an embodiment of the present invention, the data processing method may include:

Step 201: A first device detects whether the current image frame in a display buffer is an image frame generated in a dynamic display scene or in a static display scene, the dynamic and static display scenes being defined as in step 101.

The display buffer stores the image frames the first device is currently displaying. The current image frame may be a screenshot of the file the first device is displaying; for example, the file may be a text document, a picture, a web page, a slide show, the desktop, a video, or a game.

Typically, the display buffer can cache three image frames, and how long an image frame stays cached depends on the type of the display buffer and the type of the file. Display buffers are of a fixed type or a refresh type. The difference is that while the first device keeps displaying one picture, a fixed-type buffer keeps the same cached frame (the displayed frame stays cached throughout), whereas a refresh-type buffer periodically refreshes the cached frame with an identical copy (the displayed frame is periodically replaced by the next, identical frame).

For ease of understanding, this embodiment takes as an example a video file displayed at 25 frames per second. When the first device plays the video normally, each image frame is cached in the display buffer for 1/25 of a second. When playback is paused, the frame displayed at the moment of the pause stays cached in a fixed-type buffer for the whole pause duration, while in a refresh-type buffer each identical copy of that frame is cached for 1/25 of a second.

A dynamic display scene is a display scene in which image frames change dynamically within a predetermined time period: for example, turning the pages of a text document, advancing slides, refreshing a web page, playing a video, refreshing a game picture, refreshing desktop icons, or refreshing desktop buttons within the predetermined time period. A static display scene is one in which the image frame remains unchanged within the predetermined time period: for example, displaying one picture or one web page throughout. The predetermined time period may be an empirical value set by the user or another value; this embodiment sets no limit.
In this embodiment, the first device detecting whether the current image frame in the display buffer was generated in a dynamic or static display scene includes:

1) determining the application that generated the current image frame and detecting whether it belongs to a dynamic program list or a static program list, the dynamic program list including applications that provide dynamic display scenes and the static program list including applications that provide static display scenes; or,

2) determining the control interface invoked when displaying the current image frame and detecting whether it belongs to a dynamic interface list or a static interface list, the dynamic interface list including control interfaces invoked in dynamic display scenes and the static interface list including control interfaces invoked in static display scenes; or,

3) detecting the scene through the encoding interface invoked by the application providing the current frame, the encoding interfaces including a first encoding interface that instructs encoding at the first encoding quality and a second encoding interface that instructs encoding at the second encoding quality; or,

4) obtaining the image frame adjacent to the current frame in the display buffer and detecting whether the image data of the two frames are identical: if the data differ, the current frame was generated in a dynamic display scene; if identical, in a static display scene; or,

5) obtaining the adjacent frame, extracting the same kind of image feature from both frames, and detecting whether the features are identical: if they differ, the current frame was generated in a dynamic display scene; if identical, in a static display scene.
For the first detection method, the user may preset the dynamic program list and the static program list, and then add each application to the corresponding list according to the functions it implements. For example, a player is used to play videos, and playing a video is a dynamic display scene, so the player can be added to the dynamic program list; likewise, a game is a dynamic display scene, so the game program can be added to the dynamic program list.

In this embodiment, the dynamic and static program lists may be obtained by the user manually classifying applications, or a program type may be preset for each list so that applications of a given type are added to the corresponding list automatically. The program type may be set by the developer, by the application store, or manually by the user.

Program types may include text, image, and video types: readers and text editors may be text-type applications; picture viewers and picture editors may be image-type; players and games may be video-type. Assuming the video type corresponds to the dynamic program list, players and games can be added to the dynamic program list automatically.

Because the current image frame is generated by an application, the first device can determine the application that generated it: if that application belongs to the dynamic program list, the current frame was generated in a dynamic display scene; if it belongs to the static program list, in a static display scene.

Optionally, the first device may run this detection for every current image frame. Because detecting every frame consumes considerable processing resources, the first device may instead detect only one frame of the sequence generated by an application: if that frame was generated in a dynamic display scene, every frame of the sequence is treated as generated in a dynamic display scene; if in a static display scene, every frame of the sequence is treated as generated in a static display scene.
For the second detection method, the user may preset the dynamic interface list and the static interface list, the dynamic interface list including control interfaces invoked in dynamic display scenes and the static interface list including control interfaces invoked in static display scenes. For example, the control interface invoked when the system of the first device creates an animation that is displayed in the foreground can be added to the dynamic interface list; the system can monitor the creation and end of the animation.

Because a control interface must be invoked to display the current image frame, the first device can detect whether that control interface belongs to the dynamic interface list or the static interface list: if it belongs to the dynamic interface list, the current frame was generated in a dynamic display scene; if to the static interface list, in a static display scene.

For the third detection method, encoding interfaces can be added to the system: a first encoding interface that instructs encoding at the first encoding quality, and a second encoding interface that instructs encoding at the second encoding quality.

When an application invokes an encoding interface to encode the current image frame, the scene can be determined from which interface was invoked: invoking the first encoding interface means the current frame was generated in a dynamic display scene; invoking the second encoding interface means it was generated in a static display scene.
For the fourth detection method, all the image data of the current image frame and of the frame adjacent to it in the display buffer can be obtained and compared: if any of the image data differ, the current frame was generated in a dynamic display scene; if all of the image data are identical, in a static display scene. Image data here is the set of numeric values representing the grayscale value of each pixel of the frame.

Comparing the current frame and the adjacent frame in their entirety consumes considerable processing resources because the image data are large. Features can therefore be extracted from the frames instead, and the scene detected from the resulting image features; because the features carry far less data, processing resources are saved. An image feature may be part of the frame's image data, digest information generated from the frame by a specific algorithm, or other information; this embodiment does not limit the image feature.

For the fifth detection method, a feature-extraction algorithm can be predetermined; the frame adjacent to the current frame in the display buffer is obtained, the algorithm is applied to each of the two frames, yielding two features of the same kind, and the two features are compared: if they differ, the current frame was generated in a dynamic display scene; if they are identical, in a static display scene.
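A minimal sketch of the fourth and fifth detection methods above, comparing a digest extracted from the raw image data of adjacent frames (the use of SHA-256 as the digest algorithm and the function names are assumptions for illustration):

```python
import hashlib

def frame_digest(frame_bytes: bytes) -> str:
    # Extract a compact "image feature": here, a SHA-256 digest of the pixel data.
    return hashlib.sha256(frame_bytes).hexdigest()

def generated_in_dynamic_scene(current: bytes, adjacent: bytes) -> bool:
    """The current frame is treated as generated in a dynamic display scene
    when it differs from the adjacent frame in the display buffer, and in a
    static display scene when the two frames are identical."""
    return frame_digest(current) != frame_digest(adjacent)
```

Comparing fixed-size digests rather than full image data saves processing resources, at the negligible risk of a digest collision.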
Of course, besides the five detection methods above, other methods may be used to detect whether the current image frame was generated in a dynamic or static display scene; this embodiment sets no limit.

It should be noted that the first device may also offer the user options to select a dynamic or static display scene: when the user selects the dynamic option, the current frame is treated as generated in a dynamic display scene; when the user selects the static option, as generated in a static display scene.
In this embodiment, before the first device detects whether the current image frame in the display buffer was generated in a dynamic or static display scene, the method further includes: obtaining the resolution of the second device; if the resolution of the second device is higher than that of the first device, performing super-resolution processing on the current image frame.

Specifically, the second device's resolution may be sent to the first device over the connection after the connection between the two devices is established, or it may be set by the user on the first device; this embodiment does not limit how the resolution is obtained.

The super-resolution processing may be performed by nonlinear dynamic stretching or another super-resolution algorithm; such processing is well established and is not described here.

Assuming the first device's processing capability is stronger than the second device's, having the first device rather than the second perform the super-resolution processing improves the processing efficiency for the current image frame.
Step 202: If the current image frame was generated in a dynamic display scene, encode it at a first encoding quality, the first encoding quality indicating a first restoration level at which the encoded frame can be restored to the pre-encoding frame.

In a dynamic display scene, the displayed image frames are updated quickly, so real-time display must be guaranteed. That is, the first device can encode the current image frame at the lower first encoding quality, using it to reduce the frame's data volume and thereby improve the real-time performance of displaying the frame. The restoration level generally refers to the similarity between the restored image frame and the frame before restoration.

It should be noted that this embodiment also provides a third encoding quality, indicating a third restoration level at which the encoded frame can be restored to the pre-encoding frame; the second restoration level is higher than the third, and the third is higher than the first. For interactive dynamic display scenes such as slide playback, text page turning, or web page updates, the current frame can be encoded at the first encoding quality; for dynamic display scenes such as video or animation, it can be encoded at the third encoding quality.

Step 203: If the current image frame was generated in a static display scene, encode it at a second encoding quality, the second encoding quality indicating a second restoration level; the second restoration level is higher than the first.

In a static display scene, the displayed frame does not change, so display quality must be guaranteed. That is, the first device can encode the current frame at the higher second encoding quality, using it to increase the frame's data volume and thereby improve the frame's display quality.
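The choice between the two encoding qualities in steps 202 and 203 can be sketched as follows; the concrete parameter names and values are assumptions for illustration, not prescribed by the method:

```python
# Lower restoration level: coarse quantization, sparse key frames -> less data.
FIRST_ENCODING_QUALITY = {"quantizer": 40, "keyframe_interval": 250}
# Higher restoration level: near-lossless quantization, every frame a key frame.
SECOND_ENCODING_QUALITY = {"quantizer": 0, "keyframe_interval": 1}

def pick_encoding_quality(dynamic_scene: bool) -> dict:
    # Dynamic scenes favor real-time sharing; static scenes favor fidelity.
    return FIRST_ENCODING_QUALITY if dynamic_scene else SECOND_ENCODING_QUALITY
```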
Step 204: Share the encoded current image frame with a second device for display.

The first device shares the current image frame encoded in step 202 or step 203 with the second device, which decodes and displays it.

In this embodiment, the second encoding quality corresponds to a lossless compression mode, and sharing the encoded current image frame with the second device for display includes:

1) creating a new lossless compressed data channel and sharing the encoded frame with the second device for display through it, the lossless compressed data channel being independent of the lossy compressed data channel, which is used to transmit frames encoded in a lossy compression mode; or,

2) sharing the encoded frame with the second device for display through the lossy compressed data channel corresponding to a lossy coding protocol, the lossy coding protocol including a newly added coding protocol corresponding to the lossless compression mode, the lossy channel being used to transmit frames encoded according to the lossy coding protocol.

Because some regions cannot use dedicated transmission techniques to carry lossless compressed data, which limits where the lossless compression mode can be used, this embodiment provides two ways to transmit lossless compressed data:

In the first transmission mode, a lossless compressed data channel is added to the existing scenario in which only a lossy compressed data channel exists; that is, mutually independent lossy and lossless channels are set up, the lossy channel transmitting frames encoded according to a lossy coding protocol and the lossless channel transmitting lossless compressed data encoded according to a lossless coding protocol.

In the second transmission mode, a coding protocol corresponding to the lossless compression mode is defined, instructing the first device to encode in the lossless compression mode, and this protocol is added to the lossy coding protocol. That is, besides the I, P, and B frames obtained by lossy compression, the lossy coding protocol also supports lossless frames, obtained by encoding image frames in the lossless compression mode. Such lossless frames can then be transmitted through the lossy compressed data channel corresponding to the lossy coding protocol.
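The second transmission mode, in which a lossless frame type is added alongside the I, P, and B frames so it can travel over the existing lossy channel, can be sketched as a one-byte frame-type header; the tag values and names are assumptions for illustration:

```python
# Frame-type tags: I/P/B from the lossy protocol, plus the newly added lossless type.
FRAME_TAGS = {"I": 0, "P": 1, "B": 2, "LOSSLESS": 3}

def wrap_frame(payload: bytes, frame_type: str) -> bytes:
    """Prefix the encoded payload with a one-byte tag so the receiver
    can route lossless frames to a lossless decoder."""
    if frame_type not in FRAME_TAGS:
        raise ValueError(f"unknown frame type: {frame_type}")
    return bytes([FRAME_TAGS[frame_type]]) + payload
```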
Step 205: If the encoded current image frame was obtained by encoding at the first encoding quality, detect whether the first device's display of the current frame satisfies a static display condition, the static display condition being that the displayed current frame remains unchanged.

If the encoded current image frame was obtained at the first encoding quality, its data volume is small and its display quality is poor. If the user is interested in this frame, its display quality needs to be improved. Whether the user is interested in the encoded frame can be reflected by whether the first device's display of the frame satisfies the static display condition.

Specifically, detecting whether the first device's display of the current frame satisfies the static display condition includes:

1) obtaining the duration for which the first device has displayed the current frame and detecting whether it exceeds a preset threshold; or,

2) detecting whether a pause instruction is received, the pause instruction instructing the first device to stop updating the current frame in the display buffer to the next frame.

In the first detection method, the user controls how long the current frame is displayed. Because a user interested in a frame will keep it displayed for a long time, the first device can detect whether the display duration exceeds the preset threshold: if it does, the user is interested in the frame and the static display condition is satisfied; if the duration is below the threshold, the user is not interested and the condition is not satisfied.

For example, suppose the frames are slides and the user triggers a page-turn instruction to switch the currently displayed slide to the next one. The first device switches to the current slide upon receiving a page-turn instruction and starts timing: if the timer reaches the preset threshold without another page-turn instruction, the display of the slide satisfies the static display condition; if a page-turn instruction arrives before the threshold is reached, it does not.

In the second detection method, the first device switches to the next frame at a predetermined interval. Because a user interested in a frame can keep it displayed by pausing, the first device can detect whether a pause instruction is received: if so, the user is interested and the static display condition is satisfied; if not, the condition is not satisfied.

For example, suppose the frames are slides that advance at a predetermined interval. The first device starts timing after displaying a slide and detects whether a pause instruction arrives within the interval: if it does, the display of the slide satisfies the static display condition; if not, it does not.
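Both static-display checks in step 205 (display duration exceeding a threshold, or receipt of a pause instruction) can be sketched together; the class and method names are assumptions for illustration:

```python
import time

class StaticDisplayDetector:
    """Tracks how long the current frame has been displayed and whether a
    pause instruction has arrived; either condition marks the display static."""

    def __init__(self, threshold_s: float):
        self.threshold_s = threshold_s
        self.shown_at = time.monotonic()
        self.paused = False

    def on_new_frame(self):
        # The buffer was updated to the next frame: restart the timer.
        self.shown_at = time.monotonic()
        self.paused = False

    def on_pause_instruction(self):
        self.paused = True

    def condition_satisfied(self, now=None) -> bool:
        if now is None:
            now = time.monotonic()
        return self.paused or (now - self.shown_at) > self.threshold_s
```

When the condition becomes satisfied, the frame is re-encoded at the second encoding quality (step 206).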
Step 206: If the first device's display of the current frame satisfies the static display condition, re-encode the current image frame at the second encoding quality.

Step 207: Share the re-encoded current image frame with the second device for display.

It should be noted that the first device may also encode only part of the content of the current frame at the second encoding quality and share the encoded partial content with the second device for display; for example, the partial content may be an animation embedded in a web page or a change in an interface button.

Because the first device updates only part of the current frame rather than the whole frame, both the time consumed by encoding and the time consumed by transmitting the encoded data are reduced, improving the real-time performance of the shared display.

When the second encoding quality corresponds to the lossless compression mode, the process of sharing the re-encoded frame with the second device is as described in step 204 and is not repeated here.
Optionally, the data processing method provided in this embodiment further includes:

1) obtaining the quality of the network over which the encoded current image frame is to be transmitted;

2) adjusting the encoding parameters corresponding to the first encoding quality, or to the second encoding quality, according to the network quality, the network quality being positively correlated with the data volume of the encoded frame, which is obtained by encoding according to the adjusted parameters.

In this embodiment, the first device can thus adjust the data volume of the current image frame according to network quality while keeping the encoding quality unchanged. Techniques for obtaining network quality are well established and are not described here.

For example, when network quality is good, the highest-quality lossy compression can be upgraded to lossless compression to improve display quality; when network quality is poor, the number of key frames can be reduced to cut the data volume and improve the real-time performance of the shared display.
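The positive correlation between network quality and encoded data volume can be sketched as a simple interpolation of one encoding parameter; the score range and interval values are assumptions for illustration:

```python
def keyframe_interval_for(network_score: float) -> int:
    """Map a network-quality score in [0, 1] to a key-frame interval:
    a better network allows more key frames (more data, better quality);
    a worse network gets fewer key frames (less data, better real-time)."""
    sparse, dense = 500, 50          # intervals for the worst and best networks
    score = max(0.0, min(1.0, network_score))
    return int(sparse - (sparse - dense) * score)
```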
In summary, with the data processing method provided by this embodiment of the present invention, the current image frame can be encoded at the first encoding quality when it was generated in a dynamic display scene. Because the first restoration level indicated by the first encoding quality is lower than the second restoration level indicated by the second encoding quality, the encoded frame carries less data, which solves the problem of reduced real-time performance of shared display when frames generated in a dynamic scene differ greatly and the restoration level indicated by the encoding quality is too high, and achieves the effect of improving the real-time performance of shared display.

In addition, by re-encoding the current image frame at the second encoding quality when the first device's display of it satisfies the static display condition, and sharing the re-encoded frame with the second device for display, the frame can be re-encoded and shared at the second encoding quality when the user is interested in it, improving the display effect of the second device on the current frame.
Referring to FIG. 3, which shows a schematic structural diagram of a data processing apparatus according to an embodiment of the present invention, the data processing apparatus may include:

an image frame detection module 301, configured to detect, on a first device, whether the current image frame in a display buffer is an image frame generated in a dynamic display scene or in a static display scene, where a dynamic display scene is one in which image frames change dynamically within a predetermined time period and a static display scene is one in which the image frame remains unchanged within that period;

a first encoding module 302, configured to encode the current image frame at a first encoding quality when the image frame detection module 301 detects that the frame was generated in a dynamic display scene, the first encoding quality indicating a first restoration level at which the encoded frame can be restored to the pre-encoding frame;

a second encoding module 303, configured to encode the current image frame at a second encoding quality when the image frame detection module 301 detects that the frame was generated in a static display scene, the second encoding quality indicating a second restoration level; the second restoration level is higher than the first;

a first sharing module 304, configured to share the current image frame encoded by the first encoding module 302 or the second encoding module 303 with a second device for display.

In summary, with the data processing apparatus provided by this embodiment of the present invention, the current image frame can be encoded at the first encoding quality when it was generated in a dynamic display scene. Because the first restoration level indicated by the first encoding quality is lower than the second restoration level indicated by the second encoding quality, the encoded frame carries less data, which solves the problem of reduced real-time performance of shared display when frames generated in a dynamic scene differ greatly and the restoration level indicated by the encoding quality is too high, and achieves the effect of improving the real-time performance of shared display.
请参考图4,其示出了本发明实施例提供的又一种数据处理装置的结构示意图。该数据处理装置,可以包括:
图像帧检测模块401,用于第一设备检测显示缓冲区中的当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧,动态显示场景是在预定时间段内图像帧动态变化的显示场景,静态显示场景是在预定时间段内图像帧保持不变的显示场景;
第一编码模块402,用于在图像帧检测模块401检测出当前图像帧是动态显示场景下所产生的图像帧时,根据第一编码质量对当前图像帧进行编码,第一编码质量用于指示编码后的图像帧还原为编码前的图像帧的第一还原水平;
第二编码模块403,用于在图像帧检测模块401检测出当前图像帧是静态显示场景下所产生的图像帧时,根据第二编码质量对当前图像帧进行编码,第二编码质量用于指示编码后的图像帧还原为编码前的图像帧的第二还原水平;第二还原水平高于第一还原水平;
第一共享模块404,用于将第一编码模块402或第二编码模块403编码后的当前图像帧共享给第二设备进行显示。
可选的,该装置还包括:
显示检测模块405,用于在第一共享模块404将编码后的当前图像帧共享给第二设备进行显示之后,若编码后的当前图像帧是根据第一编码质量对当前图像帧进行编码得到的,则检测第一设备对当前图像帧的显示是否满足静态显示条件,静态显示条件为显示的当前图像帧保持不变;
第三编码模块406,用于在显示检测模块405检测出第一设备对当前图像帧的显示满足静态显示条件时,根据第二编码质量重新对当前图像帧进行编码;
第二共享模块407,用于将第三编码模块406再次编码后的当前图像帧共享给第二设备进行显示。
可选的,显示检测模块405,具体用于:
获取第一设备显示当前图像帧的时长,检测时长是否大于预设阈值;或者,
检测是否接收到暂停指令,暂停指令用于指示第一设备停止将显示缓冲区中的当前图像帧更新为下一个图像帧。
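上述两种静态显示条件的判断方式(显示时长超过预设阈值,或接收到暂停指令),可以合并为一个简单的判断函数,示意如下。其中阈值 2000 毫秒为假设值,实际预设阈值由具体实现确定:

```python
def static_display_satisfied(display_ms, pause_received, threshold_ms=2000):
    # 任一条件满足即视为满足静态显示条件:
    # 1) 第一设备显示当前图像帧的时长大于预设阈值;
    # 2) 接收到暂停指令(指示停止将显示缓冲区更新为下一个图像帧)
    return display_ms > threshold_ms or pause_received
```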
可选的,图像帧检测模块401,具体用于:
确定生成当前图像帧的应用程序,检测应用程序属于动态程序列表还是静态程序列表,动态程序列表包括提供动态显示场景的应用程序,静态程序列表包括提供静态显示场景的应用程序;或者,
确定显示当前图像帧时调用的控件接口,检测控件接口属于动态接口列表还是静态接口列表,动态接口列表包括在动态显示场景下被调用的控件接口,静态接口列表包括在静态显示场景下被调用的控件接口;或者,
通过提供当前图像帧的应用程序所调用的编码接口检测当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧,编码接口包括第一编码接口和第二编码接口,第一编码接口用于指示根据第一编码质量对图像帧进行编码,第二编码接口用于指示根据第二编码质量对图像帧进行编码;或者,
获取显示缓冲区中与当前图像帧相邻的图像帧,检测当前图像帧和相邻的图像帧各自所包括的图像数据是否相同,当图像数据不同时,确定当前图像帧是动态显示场景下所产生的图像帧,当图像数据相同时,确定当前图像帧是静态显示场景下所产生的图像帧;或者,
获取显示缓冲区中与当前图像帧相邻的图像帧,对当前图像帧和相邻的图像帧中的同一种图像特征进行提取,检测各自得到的图像特征是否相同,当图像特征不同时,确定当前图像帧是动态显示场景下所产生的图像帧,当图像特征相同时,确定当前图像帧是静态显示场景下所产生的图像帧。
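上述后两种检测方式(比较相邻帧的图像数据或图像特征)可以用如下 Python 片段示意。这里以帧数据的摘要值代替真实的图像特征提取,仅作说明用的假设:

```python
import hashlib

def frame_feature(frame_bytes):
    # 以摘要值作为"图像特征"的一种示意性提取方式(假设)
    return hashlib.sha256(frame_bytes).hexdigest()

def detect_scene(cur, prev):
    """比较当前帧与显示缓冲区中相邻的图像帧:特征不同则判为动态显示场景。"""
    if prev is None or frame_feature(cur) != frame_feature(prev):
        return "dynamic"   # 图像数据/特征不同:动态显示场景下所产生的图像帧
    return "static"        # 图像数据/特征相同:静态显示场景下所产生的图像帧
```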
可选的,该装置还包括:
图像帧处理模块408,用于图像帧检测模块401检测显示缓冲区中的当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧之前,获取第二设备的分辨率;若第二设备的分辨率高于第一设备的分辨率,则对当前图像帧进行超分辨率处理。
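关于“第二设备分辨率高于第一设备时先对当前图像帧做超分辨率处理”这一步,下面给出一个以最近邻放大代替真实超分辨率算法的最小示意,仅说明该处理所处的位置,并非本实施例所指的具体超分辨率技术:

```python
def upscale_nearest(frame, factor):
    """最近邻放大的极简示意(并非真正的超分辨率算法),frame 为二维像素列表。"""
    out = []
    for row in frame:
        # 每个像素在水平方向重复 factor 次
        new_row = [px for px in row for _ in range(factor)]
        # 每行在垂直方向重复 factor 次(逐行复制,避免共享同一列表对象)
        out.extend(list(new_row) for _ in range(factor))
    return out
```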
可选的,该装置还包括:
网络质量获取模块409,用于获取将要传输编码后的当前图像帧的网络的质量;
参数调整模块410,用于根据网络质量获取模块409获取到的网络的质量调整第一编码质量所对应的编码参数,或者,根据网络的质量调整第二编码质量所对应的编码参数,网络的质量与编码后的当前图像帧的数据量呈正相关关系,编码后的当前图像帧是根据编码参数进行编码得到的。
可选的,第二编码质量对应于无损压缩方式,第一共享模块404,具体用于:
新建无损压缩数据通道,通过无损压缩数据通道将编码后的当前图像帧共享给第二设备进行显示,无损压缩数据通道与有损压缩数据通道相互独立,有损压缩数据通道用于传输通过有损压缩方式编码得到的图像帧;或,
通过与有损编码协议对应的有损压缩数据通道,将编码后的当前图像帧共享给第二设备进行显示,有损编码协议包括新增的且与无损压缩方式对应的编码协议,有损压缩数据通道用于传输通过有损编码协议编码得到的图像帧。
综上所述,本发明实施例提供的数据处理装置,通过检测显示缓冲区中的当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧;若当前图像帧是动态显示场景下所产生的图像帧,则根据第一编码质量对当前图像帧进行编码,第一编码质量用于指示编码后的图像帧还原为编码前的图像帧的第一还原水平;若当前图像帧是静态显示场景下所产生的图像帧,则根据第二编码质量对当前图像帧进行编码,第二编码质量用于指示编码后的图像帧还原为编码前的图像帧的第二还原水平;第二还原水平高于第一还原水平;可以在当前图像帧是动态显示场景下所产生的图像帧时,采用第一编码质量对该当前图像帧进行编码,由于第一编码质量所指示的第一还原水平低于第二编码质量所指示的第二还原水平,因此,编码后的当前图像帧的数据量较少,解决了动态显示场景下所产生的图像帧差别较大且编码质量所指示的还原水平过高时,降低共享显示的实时性的问题,达到了提高共享显示的实时性的效果。
另外,通过在第一设备对当前图像帧的显示满足静态显示条件时,根据第二编码质量重新对当前图像帧进行编码;将再次编码后的当前图像帧共享给第二设备进行显示,可以在用户对显示的当前图像帧感兴趣时,采用第二编码质量重新编码和共享该当前图像帧,以提高第二设备对该当前图像帧的显示效果。
请参考图5,其示出了本发明实施例提供的一种数据处理装置的结构示意图。该数据处理装置,可以包括:总线501,以及连接到总线501的处理器502、存储器503、发送器504和接收器505。其中,存储器503用于存储若干个指令,指令被配置成由处理器502执行;
处理器502,用于检测显示缓冲区中的当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧,动态显示场景是在预定时间段内图像帧动态变化的显示场景,静态显示场景是在预定时间段内图像帧保持不变的显示场景;
处理器502,还用于在当前图像帧是动态显示场景下所产生的图像帧时,根据第一编码质量对当前图像帧进行编码,第一编码质量用于指示编码后的图像帧还原为编码前的图像帧的第一还原水平;
处理器502,还用于在当前图像帧是静态显示场景下所产生的图像帧时,根据第二编码质量对当前图像帧进行编码,第二编码质量用于指示编码后的图像帧还原为编码前的图像帧的第二还原水平;第二还原水平高于第一还原水平;
发送器504,用于将处理器502编码后的当前图像帧共享给第二设备进行显示。
综上所述,本发明实施例提供的数据处理装置,通过检测显示缓冲区中的当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧;若当前图像帧是动态显示场景下所产生的图像帧,则根据第一编码质量对当前图像帧进行编码,第一编码质量用于指示编码后的图像帧还原为编码前的图像帧的第一还原水平;若当前图像帧是静态显示场景下所产生的图像帧,则根据第二编码质量对当前图像帧进行编码,第二编码质量用于指示编码后的图像帧还原为编码前的图像帧的第二还原水平;第二还原水平高于第一还原水平;可以在当前图像帧是动态显示场景下所产生的图像帧时,采用第一编码质量对该当前图像帧进行编码,由于第一编码质量所指示的第一还原水平低于第二编码质量所指示的第二还原水平,因此,编码后的当前图像帧的数据量较少,解决了动态显示场景下所产生的图像帧差别较大且编码质量所指示的还原水平过高时,降低共享显示的实时性的问题,达到了提高共享显示的实时性的效果。
请参考图5,本发明实施例提供了又一种数据处理装置,该数据处理装置,可以包括:总线501,以及连接到总线501的处理器502、存储器503、发送器504和接收器505。其中,存储器503用于存储若干个指令,指令被配置成由处理器502执行;
处理器502,用于检测显示缓冲区中的当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧,动态显示场景是在预定时间段内图像帧动态变化的显示场景,静态显示场景是在预定时间段内图像帧保持不变的显示场景;
处理器502,还用于在当前图像帧是动态显示场景下所产生的图像帧时,根据第一编码质量对当前图像帧进行编码,第一编码质量用于指示编码后的图像帧还原为编码前的图像帧的第一还原水平;
处理器502,还用于在当前图像帧是静态显示场景下所产生的图像帧时,根据第二编码质量对当前图像帧进行编码,第二编码质量用于指示编码后的图像帧还原为编码前的图像帧的第二还原水平;第二还原水平高于第一还原水平;
发送器504,用于将处理器502编码后的当前图像帧共享给第二设备进行显示。
可选的,处理器502,还用于发送器504将编码后的当前图像帧共享给第二设备进行显示之后,若编码后的当前图像帧是根据第一编码质量对图像帧进行编码得到的,则检测第一设备对当前图像帧的显示是否满足静态显示条件,静态显示条件为显示的当前图像帧保持不变;
处理器502,还用于在第一设备对当前图像帧的显示满足静态显示条件时,根据第二编码质量重新对当前图像帧进行编码;
发送器504,还用于将处理器502再次编码后的当前图像帧共享给第二设备进行显示。
可选的,处理器502,具体用于:
获取第一设备显示当前图像帧的时长,检测时长是否大于预设阈值;或者,
检测是否接收到暂停指令,暂停指令用于指示第一设备停止将显示缓冲区中的图像帧更新为下一个图像帧。
可选的,处理器502,具体用于:
确定生成当前图像帧的应用程序,检测应用程序属于动态程序列表还是静态程序列表,动态程序列表包括提供动态显示场景的应用程序,静态程序列表包括提供静态显示场景的应用程序;或者,
确定显示当前图像帧时调用的控件接口,检测控件接口属于动态接口列表还是静态接口列表,动态接口列表包括在动态显示场景下被调用的控件接口,静态接口列表包括在静态显示场景下被调用的控件接口;或者,
通过提供当前图像帧的应用程序所调用的编码接口检测当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧,编码接口包括第一编码接口和第二编码接口,第一编码接口用于指示根据第一编码质量对图像帧进行编码,第二编码接口用于指示根据第二编码质量对图像帧进行编码;或者,
获取显示缓冲区中与当前图像帧相邻的图像帧,检测当前图像帧和相邻的图像帧各自所包括的图像数据是否相同,当图像数据不同时,确定当前图像帧是动态显示场景下所产生的图像帧,当图像数据相同时,确定当前图像帧是静态显示场景下所产生的图像帧;或者,
获取显示缓冲区中与当前图像帧相邻的图像帧,对当前图像帧和相邻的图像帧中的同一种图像特征进行提取,检测各自得到的图像特征是否相同,当图像特征不同时,确定当前图像帧是动态显示场景下所产生的图像帧,当图像特征相同时,确定当前图像帧是静态显示场景下所产生的图像帧。
可选的,处理器502,还用于检测显示缓冲区中的当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧之前,获取第二设备的分辨率;若第二设备的分辨率高于第一设备的分辨率,则对当前图像帧进行超分辨率处理。
可选的,处理器502,还用于获取将要传输编码后的当前图像帧的网络的质量;
处理器502,还用于根据网络的质量调整第一编码质量所对应的编码参数,或者,根据网络的质量调整第二编码质量所对应的编码参数,网络的质量与编码后的当前图像帧的数据量呈正相关关系,编码后的当前图像帧是根据编码参数进行编码得到的。
可选的,第二编码质量对应于无损压缩方式,发送器504,具体用于:
新建无损压缩数据通道,通过无损压缩数据通道将编码后的当前图像帧共享给第二设备进行显示,无损压缩数据通道与有损压缩数据通道相互独立,有损压缩数据通道用于传输通过有损压缩方式编码得到的图像帧;或,
通过与有损编码协议对应的有损压缩数据通道,将编码后的当前图像帧共享给第二设备进行显示,有损编码协议包括新增的且与无损压缩方式对应的编码协议,有损压缩数据通道用于传输通过有损编码协议编码得到的图像帧。
综上所述,本发明实施例提供的数据处理装置,通过检测显示缓冲区中的当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧;若当前图像帧是动态显示场景下所产生的图像帧,则根据第一编码质量对当前图像帧进行编码,第一编码质量用于指示编码后的图像帧还原为编码前的图像帧的第一还原水平;若当前图像帧是静态显示场景下所产生的图像帧,则根据第二编码质量对当前图像帧进行编码,第二编码质量用于指示编码后的图像帧还原为编码前的图像帧的第二还原水平;第二还原水平高于第一还原水平;可以在当前图像帧是动态显示场景下所产生的图像帧时,采用第一编码质量对该当前图像帧进行编码,由于第一编码质量所指示的第一还原水平低于第二编码质量所指示的第二还原水平,因此,编码后的当前图像帧的数据量较少,解决了动态显示场景下所产生的图像帧差别较大且编码质量所指示的还原水平过高时,降低共享显示的实时性的问题,达到了提高共享显示的实时性的效果。
另外,通过在第一设备对当前图像帧的显示满足静态显示条件时,根据第二编码质量重新对当前图像帧进行编码;将再次编码后的当前图像帧共享给第二设备进行显示,可以在用户对显示的当前图像帧感兴趣时,采用第二编码质量重新编码和共享该当前图像帧,以提高第二设备对该当前图像帧的显示效果。
需要说明的是:上述实施例提供的数据处理装置在进行数据处理时,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将数据处理装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。另外,上述实施例提供的数据处理装置与数据处理方法实施例属于同一构思,其具体实现过程详见方法实施例,这里不再赘述。
上述本发明实施例序号仅仅为了描述,不代表实施例的优劣。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本发明的范围。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,可以仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本发明各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本发明的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本发明各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本发明的具体实施方式,但本发明的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本发明揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本发明的保护范围之内。因此,本发明的保护范围应以所述权利要求的保护范围为准。

Claims (21)

  1. 一种数据处理方法,其特征在于,所述方法包括:
    第一设备检测显示缓冲区中的当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧,所述动态显示场景是在预定时间段内图像帧动态变化的显示场景,所述静态显示场景是在所述预定时间段内图像帧保持不变的显示场景;
    若所述当前图像帧是动态显示场景下所产生的图像帧,则根据第一编码质量对所述当前图像帧进行编码,所述第一编码质量用于指示编码后的图像帧还原为编码前的图像帧的第一还原水平;
    若所述当前图像帧是静态显示场景下所产生的图像帧,则根据第二编码质量对所述当前图像帧进行编码,所述第二编码质量用于指示编码后的图像帧还原为编码前的图像帧的第二还原水平;所述第二还原水平高于所述第一还原水平;
    将编码后的所述当前图像帧共享给第二设备进行显示。
  2. 根据权利要求1所述的方法,其特征在于,所述将编码后的所述当前图像帧共享给所述第二设备进行显示之后,还包括:
    若编码后的所述当前图像帧是根据所述第一编码质量对所述当前图像帧进行编码得到的,则检测所述第一设备对所述当前图像帧的显示是否满足静态显示条件,所述静态显示条件为显示的所述当前图像帧保持不变;
    若所述第一设备对所述当前图像帧的显示满足所述静态显示条件,则根据所述第二编码质量重新对所述当前图像帧进行编码;
    将再次编码后的所述当前图像帧共享给所述第二设备进行显示。
  3. 根据权利要求2所述的方法,其特征在于,所述检测所述第一设备对所述当前图像帧的显示是否满足静态显示条件,包括:
    获取所述第一设备显示所述当前图像帧的时长,检测所述时长是否大于预设阈值;或者,
    检测是否接收到暂停指令,所述暂停指令用于指示所述第一设备停止将所述显示缓冲区中的所述当前图像帧更新为下一个图像帧。
  4. 根据权利要求1至3任一项所述的方法,其特征在于,所述第一设备检测显示缓冲区中的当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧,包括:
    确定生成所述当前图像帧的应用程序,检测所述应用程序属于动态程序列表还是静态程序列表,所述动态程序列表包括提供动态显示场景的应用程序,所述静态程序列表包括提供静态显示场景的应用程序;或者,
    确定显示所述当前图像帧时调用的控件接口,检测所述控件接口属于动态接口列表还是静态接口列表,所述动态接口列表包括在动态显示场景下被调用的控件接口,所述静态接口列表包括在静态显示场景下被调用的控件接口;或者,
    通过提供所述当前图像帧的应用程序所调用的编码接口检测所述当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧,所述编码接口包括第一编码接口和第二编码接口,所述第一编码接口用于指示根据所述第一编码质量对图像帧进行编码,所述第二编码接口用于指示根据所述第二编码质量对图像帧进行编码;或者,
    获取所述显示缓冲区中与所述当前图像帧相邻的图像帧,检测所述当前图像帧和所述相邻的图像帧各自所包括的图像数据是否相同,当所述图像数据不同时,确定所述当前图像帧是动态显示场景下所产生的图像帧,当所述图像数据相同时,确定所述当前图像帧是静态显示场景下所产生的图像帧;或者,
    获取所述显示缓冲区中与所述当前图像帧相邻的图像帧,对所述当前图像帧和所述相邻的图像帧中的同一种图像特征进行提取,检测各自得到的所述图像特征是否相同,当所述图像特征不同时,确定所述当前图像帧是动态显示场景下所产生的图像帧,当所述图像特征相同时,确定所述当前图像帧是静态显示场景下所产生的图像帧。
  5. 根据权利要求1至4任一项所述的方法,其特征在于,所述第一设备检测显示缓冲区中的当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧之前,还包括:
    获取所述第二设备的分辨率;
    若所述第二设备的分辨率高于所述第一设备的分辨率,则对所述当前图像帧进行超分辨率处理。
  6. 根据权利要求1至5任一项所述的方法,其特征在于,所述方法,还包括:
    获取将要传输编码后的所述当前图像帧的网络的质量;
    根据所述网络的质量调整所述第一编码质量所对应的编码参数,或者,根据所述网络的质量调整所述第二编码质量所对应的编码参数,所述网络的质量与编码后的所述当前图像帧的数据量呈正相关关系,编码后的所述当前图像帧是根据所述编码参数进行编码得到的。
  7. 根据权利要求1所述的方法,其特征在于,所述第二编码质量对应于无损压缩方式,所述将编码后的所述当前图像帧共享给第二设备进行显示,包括:
    新建无损压缩数据通道,通过所述无损压缩数据通道将编码后的所述当前图像帧共享给所述第二设备进行显示,所述无损压缩数据通道与有损压缩数据通道相互独立,所述有损压缩数据通道用于传输通过有损压缩方式编码得到的图像帧;或,
    通过与有损编码协议对应的有损压缩数据通道,将编码后的所述当前图像帧共享给所述第二设备进行显示,所述有损编码协议包括新增的且与所述无损压缩方式对应的编码协议,所述有损压缩数据通道用于传输通过所述有损编码协议编码得到的图像帧。
  8. 一种数据处理装置,其特征在于,所述装置包括:
    图像帧检测模块,用于第一设备检测显示缓冲区中的当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧,所述动态显示场景是在预定时间段内图像帧动态变化的显示场景,所述静态显示场景是在所述预定时间段内图像帧保持不变的显示场景;
    第一编码模块,用于在所述图像帧检测模块检测出所述当前图像帧是动态显示场景下所产生的图像帧时,根据第一编码质量对所述当前图像帧进行编码,所述第一编码质量用于指示编码后的图像帧还原为编码前的图像帧的第一还原水平;
    第二编码模块,用于在所述图像帧检测模块检测出所述当前图像帧是静态显示场景下所产生的图像帧时,根据第二编码质量对所述当前图像帧进行编码,所述第二编码质量用于指示编码后的图像帧还原为编码前的图像帧的第二还原水平;所述第二还原水平高于所述第一还原水平;
    第一共享模块,用于将所述第一编码模块或所述第二编码模块编码后的所述当前图像帧共享给第二设备进行显示。
  9. 根据权利要求8所述的装置,其特征在于,所述装置,还包括:
    显示检测模块,用于在所述第一共享模块将编码后的所述当前图像帧共享给所述第二设备进行显示之后,若编码后的所述当前图像帧是根据所述第一编码质量对所述当前图像帧进行编码得到的,则检测所述第一设备对所述当前图像帧的显示是否满足静态显示条件,所述静态显示条件为显示的所述当前图像帧保持不变;
    第三编码模块,用于在所述显示检测模块检测出所述第一设备对所述当前图像帧的显示满足所述静态显示条件时,根据所述第二编码质量重新对所述当前图像帧进行编码;
    第二共享模块,用于将所述第三编码模块再次编码后的所述当前图像帧共享给所述第二设备进行显示。
  10. 根据权利要求9所述的装置,其特征在于,所述显示检测模块,具体用于:
    获取所述第一设备显示所述当前图像帧的时长,检测所述时长是否大于预设阈值;或者,
    检测是否接收到暂停指令,所述暂停指令用于指示所述第一设备停止将所述显示缓冲区中的所述当前图像帧更新为下一个图像帧。
  11. 根据权利要求8至10任一项所述的装置,其特征在于,所述图像帧检测模块,具体用于:
    确定生成所述当前图像帧的应用程序,检测所述应用程序属于动态程序列表还是静态程序列表,所述动态程序列表包括提供动态显示场景的应用程序,所述静态程序列表包括提供静态显示场景的应用程序;或者,
    确定显示所述当前图像帧时调用的控件接口,检测所述控件接口属于动态接口列表还是静态接口列表,所述动态接口列表包括在动态显示场景下被调用的控件接口,所述静态接口列表包括在静态显示场景下被调用的控件接口;或者,
    通过提供所述当前图像帧的应用程序所调用的编码接口检测所述当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧,所述编码接口包括第一编码接口和第二编码接口,所述第一编码接口用于指示根据所述第一编码质量对图像帧进行编码,所述第二编码接口用于指示根据所述第二编码质量对图像帧进行编码;或者,
    获取所述显示缓冲区中与所述当前图像帧相邻的图像帧,检测所述当前图像帧和所述相邻的图像帧各自所包括的图像数据是否相同,当所述图像数据不同时,确定所述当前图像帧是动态显示场景下所产生的图像帧,当所述图像数据相同时,确定所述当前图像帧是静态显示场景下所产生的图像帧;或者,
    获取所述显示缓冲区中与所述当前图像帧相邻的图像帧,对所述当前图像帧和所述相邻的图像帧中的同一种图像特征进行提取,检测各自得到的所述图像特征是否相同,当所述图像特征不同时,确定所述当前图像帧是动态显示场景下所产生的图像帧,当所述图像特征相同时,确定所述当前图像帧是静态显示场景下所产生的图像帧。
  12. 根据权利要求8至11任一项所述的装置,其特征在于,所述装置,还包括:
    图像帧处理模块,用于所述图像帧检测模块检测显示缓冲区中的当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧之前,获取所述第二设备的分辨率;若所述第二设备的分辨率高于所述第一设备的分辨率,则对所述当前图像帧进行超分辨率处理。
  13. 根据权利要求8至12任一项所述的装置,其特征在于,所述装置,还包括:
    网络质量获取模块,用于获取将要传输编码后的所述当前图像帧的网络的质量;
    参数调整模块,用于根据所述网络质量获取模块获取到的所述网络的质量调整所述第一编码质量所对应的编码参数,或者,根据所述网络的质量调整所述第二编码质量所对应的编码参数,所述网络的质量与编码后的所述当前图像帧的数据量呈正相关关系,编码后的所述当前图像帧是根据所述编码参数进行编码得到的。
  14. 根据权利要求8所述的装置,其特征在于,所述第二编码质量对应于无损压缩方式,所述第一共享模块,具体用于:
    新建无损压缩数据通道,通过所述无损压缩数据通道将编码后的所述当前图像帧共享给所述第二设备进行显示,所述无损压缩数据通道与有损压缩数据通道相互独立,所述有损压缩数据通道用于传输通过有损压缩方式编码得到的图像帧;或,
    通过与有损编码协议对应的有损压缩数据通道,将编码后的所述当前图像帧共享给所述第二设备进行显示,所述有损编码协议包括新增的且与所述无损压缩方式对应的编码协议,所述有损压缩数据通道用于传输通过所述有损编码协议编码得到的图像帧。
  15. 一种数据处理装置,其特征在于,所述装置包括:总线,以及连接到所述总线的处理器、存储器、发送器和接收器。其中,所述存储器用于存储若干个指令,所述指令被配置成由所述处理器执行;
    所述处理器,用于检测显示缓冲区中的当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧,所述动态显示场景是在预定时间段内图像帧动态变化的显示场景,所述静态显示场景是在所述预定时间段内图像帧保持不变的显示场景;
    所述处理器,还用于在所述当前图像帧是动态显示场景下所产生的图像帧时,根据第一编码质量对所述当前图像帧进行编码,所述第一编码质量用于指示编码后的图像帧还原为编码前的图像帧的第一还原水平;
    所述处理器,还用于在所述当前图像帧是静态显示场景下所产生的图像帧时,根据第二编码质量对所述当前图像帧进行编码,所述第二编码质量用于指示编码后的图像帧还原为编码前的图像帧的第二还原水平;所述第二还原水平高于所述第一还原水平;
    所述发送器,用于将所述处理器编码后的所述当前图像帧共享给第二设备进行显示。
  16. 根据权利要求15所述的装置,其特征在于,
    所述处理器,还用于所述发送器将编码后的所述当前图像帧共享给所述第二设备进行显示之后,若编码后的所述当前图像帧是根据所述第一编码质量对所述当前图像帧进行编码得到的,则检测所述第一设备对所述当前图像帧的显示是否满足静态显示条件,所述静态显示条件为显示的所述当前图像帧保持不变;
    所述处理器,还用于在所述第一设备对所述当前图像帧的显示满足所述静态显示条件时,根据所述第二编码质量重新对所述当前图像帧进行编码;
    所述发送器,还用于将所述处理器再次编码后的所述当前图像帧共享给所述第二设备进行显示。
  17. 根据权利要求16所述的装置,其特征在于,所述处理器,具体用于:
    获取所述第一设备显示所述当前图像帧的时长,检测所述时长是否大于预设阈值;或者,
    检测是否接收到暂停指令,所述暂停指令用于指示所述第一设备停止将所述显示缓冲区中的所述当前图像帧更新为下一个图像帧。
  18. 根据权利要求15至17任一项所述的装置,其特征在于,所述处理器,具体用于:
    确定生成所述当前图像帧的应用程序,检测所述应用程序属于动态程序列表还是静态程序列表,所述动态程序列表包括提供动态显示场景的应用程序,所述静态程序列表包括提供静态显示场景的应用程序;或者,
    确定显示所述当前图像帧时调用的控件接口,检测所述控件接口属于动态接口列表还是静态接口列表,所述动态接口列表包括在动态显示场景下被调用的控件接口,所述静态接口列表包括在静态显示场景下被调用的控件接口;或者,
    通过提供所述当前图像帧的应用程序所调用的编码接口检测所述当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧,所述编码接口包括第一编码接口和第二编码接口,所述第一编码接口用于指示根据所述第一编码质量对图像帧进行编码,所述第二编码接口用于指示根据所述第二编码质量对图像帧进行编码;或者,
    获取所述显示缓冲区中与所述当前图像帧相邻的图像帧,检测所述当前图像帧和所述相邻的图像帧各自所包括的图像数据是否相同,当所述图像数据不同时,确定所述当前图像帧是动态显示场景下所产生的图像帧,当所述图像数据相同时,确定所述当前图像帧是静态显示场景下所产生的图像帧;或者,
    获取所述显示缓冲区中与所述当前图像帧相邻的图像帧,对所述当前图像帧和所述相邻的图像帧中的同一种图像特征进行提取,检测各自得到的所述图像特征是否相同,当所述图像特征不同时,确定所述当前图像帧是动态显示场景下所产生的图像帧,当所述图像特征相同时,确定所述当前图像帧是静态显示场景下所产生的图像帧。
  19. 根据权利要求15至18任一项所述的装置,其特征在于,
    所述处理器,还用于检测显示缓冲区中的当前图像帧是动态显示场景下所产生的图像帧还是静态显示场景下所产生的图像帧之前,获取所述第二设备的分辨率;若所述第二设备的分辨率高于所述第一设备的分辨率,则对所述当前图像帧进行超分辨率处理。
  20. 根据权利要求15至19任一项所述的装置,其特征在于,
    所述处理器,还用于获取将要传输编码后的所述当前图像帧的网络的质量;
    所述处理器,还用于根据所述网络的质量调整所述第一编码质量所对应的编码参数,或者,根据所述网络的质量调整所述第二编码质量所对应的编码参数,所述网络的质量与编码后的所述当前图像帧的数据量呈正相关关系,编码后的所述当前图像帧是根据所述编码参数进行编码得到的。
  21. 根据权利要求15所述的装置,其特征在于,所述第二编码质量对应于无损压缩方式,所述发送器,具体用于:
    新建无损压缩数据通道,通过所述无损压缩数据通道将编码后的所述当前图像帧共享给所述第二设备进行显示,所述无损压缩数据通道与有损压缩数据通道相互独立,所述有损压缩数据通道用于传输通过有损压缩方式编码得到的图像帧;或,
    通过与有损编码协议对应的有损压缩数据通道,将编码后的所述当前图像帧共享给所述第二设备进行显示,所述有损编码协议包括新增的且与所述无损压缩方式对应的编码协议,所述有损压缩数据通道用于传输通过所述有损编码协议编码得到的图像帧。