CN116996639A - Screen-projection frame rate acquisition method and device, computer equipment and storage medium - Google Patents

Screen-projection frame rate acquisition method and device, computer equipment and storage medium

Info

Publication number
CN116996639A
CN116996639A (application number CN202310139841.XA)
Authority
CN
China
Prior art keywords
image frame
sequence
screen
image
frame rate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310139841.XA
Other languages
Chinese (zh)
Inventor
权博博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen TCL New Technology Co Ltd
Original Assignee
Shenzhen TCL New Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen TCL New Technology Co Ltd filed Critical Shenzhen TCL New Technology Co Ltd
Priority to CN202310139841.XA
Publication of CN116996639A
Legal status: Pending

Links

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/01 Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0127 Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W52/00 Power management, e.g. TPC [Transmission Power Control], power saving or power classes
    • H04W52/02 Power saving arrangements
    • H04W52/0209 Power saving arrangements in terminal devices
    • H04W52/0261 Power saving arrangements in terminal devices managing power supply demand, e.g. depending on battery level

Abstract

The application provides a screen-projection frame rate acquisition method and device, computer equipment and a storage medium. The method includes: collecting an image frame sequence to be projected; acquiring brightness difference information between every two adjacent image frames in the image frame sequence to obtain a brightness difference sequence of the image frame sequence; determining a picture state type of the image frame sequence and a screen-projection frame rate corresponding to the picture state type based on the brightness difference sequence; and transmitting the image frames in the image frame sequence to a second terminal at the screen-projection frame rate. The picture state type of the screen picture is detected from the brightness difference information of the image frame sequence corresponding to the screen picture of the first terminal, so that the screen-projection frame rate is dynamically adjusted based on the picture state type, reducing the power consumption of the terminal while ensuring picture smoothness during screen projection.

Description

Screen-projection frame rate acquisition method and device, computer equipment and storage medium
Technical Field
The application relates to the technical field of screen projection of terminals, in particular to a screen projection frame rate acquisition method, a screen projection frame rate acquisition device, computer equipment and a storage medium.
Background
With the development of technology, terminal devices have become increasingly diverse, and screen projection technology is widely applied. Screen projection refers to sending the content of one terminal to another terminal for display, for example, projecting the screen content displayed on a mobile phone to a television. During screen projection, the sending-end device continuously samples the image data of its screen picture at a certain frame rate and sends the acquired image data to the receiving-end device. However, when the screen of the sending-end device shows a static picture or a slowly changing picture, the sampled image data remains unchanged, and continuously sending the same image data to the receiving-end device wastes the power consumption of the terminal device.
Disclosure of Invention
Based on the foregoing, it is necessary to provide a screen-projection frame rate acquisition method, apparatus, computer device and storage medium, so as to reduce the power consumption of the device during screen projection.
In a first aspect, the present application provides a screen-projection frame rate acquisition method, applied to a first terminal, the method including:
collecting an image frame sequence to be projected;
acquiring brightness difference information between every two adjacent image frames in the image frame sequence to obtain a brightness difference sequence of the image frame sequence;
determining a picture state type of the image frame sequence and a screen-projection frame rate corresponding to the picture state type based on the brightness difference sequence;
and transmitting the image frames in the image frame sequence to a second terminal at the screen-projection frame rate.
In some embodiments of the present application, the step of obtaining the brightness difference information between every two adjacent image frames in the image frame sequence includes:
acquiring first brightness information based on pixel information of a preset area in a first image frame;
acquiring second brightness information based on pixel information of a preset area in the second image frame; wherein the first image frame and the second image frame are two adjacent image frames in the image frame sequence;
And calculating the difference value between the first brightness information and the second brightness information to obtain brightness difference information between the first image frame and the second image frame.
In some embodiments of the present application, the preset area includes a first area with a first reference line as a center line and a second area with a second reference line as a center line; the first reference line and the second reference line are reference lines with different directions.
In some embodiments of the present application, the preset area includes a first rectangular area with a first reference line as a center line and a second rectangular area with a second reference line as a center line; wherein the first reference line is parallel to the horizontal axis of the image frame and the second reference line is parallel to the vertical axis of the image frame.
In some embodiments of the present application, before the step of obtaining the luminance difference information between every two adjacent image frames in the image frame sequence, the method further includes:
determining a first reference line and a second reference line based on a center point of an image frame in the image frame sequence;
a preset region in the first image frame and a preset region in the second image frame are determined based on the first reference line and the second reference line.
In some embodiments of the present application, before the step of obtaining the luminance difference information between every two adjacent image frames in the image frame sequence, the method further includes:
acquiring differential pixel points between the first image frame and the second image frame;
determining a first reference line and a second reference line in the second image frame and the first image frame based on the differential pixel points;
and acquiring a preset area in the first image frame and a preset area in the second image frame based on the first reference line and the second reference line.
In some embodiments of the present application, the picture status types include a dynamic picture type and a static picture type;
the step of determining the picture state type of the image frame sequence and the screen projection frame rate corresponding to the picture state type based on the brightness difference sequence comprises the following steps:
if a continuous preset number of brightness difference values in the brightness difference sequence are greater than a first brightness threshold, determining that the image frame sequence is of a dynamic picture type, and determining that the screen-projection frame rate is a first frame rate;
if a continuous preset number of brightness difference values in the brightness difference sequence are less than or equal to the first brightness threshold, determining that the image frame sequence is of a static picture type, and determining that the screen-projection frame rate is a second frame rate;
wherein the first frame rate is greater than the second frame rate.
In some embodiments of the present application, before the step of obtaining the luminance difference information between each two adjacent image frames in the image frame sequence to obtain the luminance difference sequence of the image frame sequence, the method further includes:
acquiring a transmission bandwidth between the first terminal and the second terminal;
correspondingly, the obtaining the brightness difference information between every two adjacent image frames in the image frame sequence to obtain the brightness difference sequence of the image frame sequence comprises the following steps:
and if the transmission bandwidth is greater than a preset bandwidth threshold, acquiring brightness difference information between every two adjacent image frames in the image frame sequence to obtain a brightness difference sequence of the image frame sequence.
In some embodiments of the present application, after the transmission bandwidth between the first terminal and the second terminal is acquired, if the transmission bandwidth is less than or equal to a preset bandwidth threshold, the image frames in the image frame sequence are sent to the second terminal at a fixed frame rate.
In some embodiments of the present application, the step of transmitting the image frames in the image frame sequence to the second terminal at the screen-projection frame rate includes:
determining a screen-projection interval time based on the screen-projection frame rate;
acquiring a target image frame acquired latest in an image frame sequence and the actual interval time between the target image frame and the image frame transmitted last time;
if the actual interval time is less than the screen-projection interval time, discarding the target image frame;
and if the actual interval time is greater than or equal to the screen-projection interval time, sending the target image frame to the second terminal.
In some embodiments of the present application, after the step of determining the screen frame rate corresponding to the picture status type, the method further includes:
and sending the screen-projection frame rate to the second terminal, so that the second terminal plays the received image frames according to the screen-projection frame rate.
In a second aspect, the present application provides a screen-projection frame rate acquisition device, where the device includes:
the image acquisition module is used for acquiring an image frame sequence to be projected;
the brightness difference acquisition module is used for acquiring brightness difference information between every two adjacent image frames in the image frame sequence to obtain a brightness difference sequence of the image frame sequence;
the frame rate determining module is used for determining the picture state type of the image frame sequence and the screen throwing frame rate corresponding to the picture state type based on the brightness difference sequence;
and the screen projection module is used for transmitting the image frames in the image frame sequence to the second terminal at the screen projection frame rate.
In a third aspect, the present application also provides a computer device comprising:
one or more processors;
a memory; and
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the processor to implement the above screen-projection frame rate acquisition method.
In a fourth aspect, the present application also provides a computer-readable storage medium having stored thereon a computer program to be loaded by a processor for performing the steps of the screen-projection frame rate acquisition method.
The screen-projection frame rate acquisition method, device, computer equipment and storage medium collect an image frame sequence to be projected; acquire brightness difference information between every two adjacent image frames in the image frame sequence to obtain a brightness difference sequence of the image frame sequence; determine a picture state type of the image frame sequence and a screen-projection frame rate corresponding to the picture state type based on the brightness difference sequence; and transmit the image frames in the image frame sequence to the second terminal at the screen-projection frame rate. The picture state type of the screen picture is detected from the brightness difference information of the image frame sequence corresponding to the screen picture of the first terminal, so that the screen-projection frame rate is dynamically adjusted based on the picture state type, reducing the power consumption of the terminal while ensuring picture smoothness during screen projection.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic view of a scene of a method for acquiring a screen frame rate in an embodiment of the application;
FIG. 2 is a flowchart of a screen-projection frame rate acquisition method according to an embodiment of the present application;
FIG. 3 is a flowchart illustrating a step of obtaining luminance difference information between image frames according to an embodiment of the present application;
FIG. 4A is a schematic diagram of a predetermined region in an image frame according to an embodiment of the present application;
FIG. 4B is a schematic diagram of a predetermined region in another image frame according to an embodiment of the present application;
FIG. 5A is a schematic diagram of a predetermined region in a still further image frame according to an embodiment of the present application;
FIG. 5B is a schematic diagram of a predetermined region in yet another image frame according to an embodiment of the present application;
FIG. 6 is a flowchart of another screen-projection frame rate acquisition method according to an embodiment of the present application;
FIG. 7 is a flowchart of yet another screen-projection frame rate acquisition method according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a screen frame rate obtaining apparatus according to an embodiment of the present application;
fig. 9 is a schematic diagram of a computer device in an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
In the description of the present application, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more of the described features. In the description of the present application, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
In the description of the present application, the word "for example" is used to mean "serving as an example, instance, or illustration. Any embodiment described as "for example" in this disclosure is not necessarily to be construed as preferred or advantageous over other embodiments. The following description is presented to enable any person skilled in the art to make and use the application. In the following description, details are set forth for purposes of explanation. It will be apparent to one of ordinary skill in the art that the present application may be practiced without these specific details. In other instances, well-known structures and processes have not been described in detail so as not to obscure the description of the application with unnecessary detail. Thus, the present application is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
The screen-projection frame rate acquisition method provided by the embodiments of the present application can be applied to the screen-projection frame rate acquisition system shown in fig. 1. The system includes a first terminal 101 and a second terminal 102, where the first terminal 101 and the second terminal 102 may be cellular or other communication devices with displays, and may specifically be desktop terminals or mobile terminals, such as mobile phones, tablet computers, notebook computers, televisions, and the like. Specifically, the first terminal 101 collects an image frame sequence to be projected; acquires brightness difference information between every two adjacent image frames in the image frame sequence to obtain a brightness difference sequence of the image frame sequence; determines a picture state type of the image frame sequence and a screen-projection frame rate corresponding to the picture state type based on the brightness difference sequence; and transmits the image frames in the image frame sequence to the second terminal 102 at the screen-projection frame rate.
It will be understood by those skilled in the art that the application environment shown in fig. 1 is merely an application scenario of the present application, and is not limited to the application scenario of the present application, and other application environments may further include more or fewer computer devices than those shown in fig. 1, for example, only 1 second terminal 102 is shown in fig. 1, and it may be understood that in the screen frame rate obtaining system, the first terminal 101 may also perform screen projection to one or more other terminals, which is not limited herein. In addition, as shown in fig. 1, the screen frame rate acquisition system may further include a memory for storing data, such as a specific value of the screen frame rate, an image frame to be screened, and the like.
It should be further noted that, the schematic view of the scene of the screen frame rate acquisition system shown in fig. 1 is only an example, and the screen frame rate acquisition system and the scene described in the embodiments of the present application are for more clearly describing the technical solution of the embodiments of the present application, and do not constitute a limitation on the technical solution provided by the embodiments of the present application, and as one of ordinary skill in the art can know, along with the evolution of the screen frame rate acquisition system and the appearance of a new service scene, the technical solution provided by the embodiments of the present application is equally applicable to similar technical problems.
Referring to fig. 2, an embodiment of the present application provides a method for acquiring a frame rate of a screen, which is mainly applied to the first terminal 101 in fig. 1 and illustrated by the following steps S210 to S240, which are specifically as follows:
step S210, a sequence of image frames to be projected is acquired.
Wherein the image frame sequence comprises a plurality of image frames; specifically, the first terminal may sample the content displayed on the display screen to obtain a sequence of image frames.
Step S220, obtaining the brightness difference information between every two adjacent image frames in the image frame sequence to obtain the brightness difference sequence of the image frame sequence.
The brightness difference information may refer to brightness differences between different image frames; it can be understood that human eyes are more sensitive to the brightness in the image frames, and the accuracy of the picture state type can be improved by acquiring the brightness difference information between the image frames for the subsequent judgment of the picture state type. Specifically, the brightness difference information between the image frames of two adjacent frames can be obtained by calculation according to the pixel information of the image frames of two adjacent frames; for example, YUV color space data or RGB color space data of pixel points in an image frame may be converted into luminance information of the image frame, and luminance difference information between image frames of two adjacent frames may be calculated based on the luminance information of the two adjacent frames; for another example, the Y component value in the YUV color space data corresponding to the pixel point may be directly determined as the brightness information of the image frame, and further, based on the brightness information of the image frames of the two adjacent frames, the brightness difference information between the image frames of the two adjacent frames may be calculated.
Further, the brightness difference information between the image frames of the two adjacent frames can be obtained by calculation according to the pixel information of the image frames of the two adjacent frames on all the pixel points. For example, luminance information of each pixel point in the image frame may be determined according to YUV color space data or RGB color space data corresponding to each pixel point in the image frame, further, luminance difference values between luminance information on the same pixel point in adjacent image frames are calculated based on the luminance information of each pixel point, and finally, luminance difference information between image frames of two adjacent frames is calculated according to the luminance difference values of the two adjacent image frames on all the pixel points.
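By way of illustration only, the following Python sketch shows one way the luminance comparison described above could be computed, assuming each image frame is available as an H x W x 3 RGB NumPy array; the function names, the BT.601 weighting and the use of a summed absolute difference are assumptions of this sketch, not details taken from this application.

```python
import numpy as np

def luminance(frame_rgb: np.ndarray) -> np.ndarray:
    """Approximate per-pixel luminance of an H x W x 3 RGB frame.

    BT.601 weights are one common choice; if the captured frames are already
    in a YUV format, the Y plane can be used directly instead of converting.
    """
    r, g, b = frame_rgb[..., 0], frame_rgb[..., 1], frame_rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b

def frame_luminance_difference(prev_rgb: np.ndarray, curr_rgb: np.ndarray) -> float:
    """Brightness difference between two adjacent frames over all pixel points,
    taken here as the sum of absolute per-pixel luminance differences."""
    return float(np.abs(luminance(curr_rgb) - luminance(prev_rgb)).sum())
```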
In order to reduce the amount of calculation of the brightness difference information between image frames, considering the large number of pixel points in an image frame, the brightness difference information may be determined from the pixel information of the pixel points in a preset area of the two adjacent image frames. Specifically, in one embodiment, as shown in fig. 3, the step of acquiring the brightness difference information between every two adjacent image frames in the image frame sequence includes:
step S310, acquiring first brightness information based on pixel information of a preset area in a first image frame;
step S320, obtaining second brightness information based on pixel information of a preset area in the second image frame; wherein the first image frame and the second image frame are two adjacent image frames in the image frame sequence;
Step S330, calculating the difference between the first brightness information and the second brightness information to obtain the brightness difference information between the first image frame and the second image frame.
The number of the preset areas may be one or more, and is not limited herein.
Specifically, the preset area may include a plurality of rectangular areas uniformly distributed in the image frame (including the first image frame and the second image frame), and for example, as shown in fig. 4A, the preset area may include rectangular areas 410, 420, 430, 440, and 450 uniformly distributed in the image frame.
The preset area may also include a first area taking a first reference line as a central line and a second area taking a second reference line as a central line in the image frame, wherein the first reference line and the second reference line are two reference lines with different directions. It is understood that the number of the first reference lines (or the first areas) and the second reference lines (or the second areas) may be one or more, and the number of the first reference lines and the second reference lines may be equal or unequal; the first region and the second region may be rectangular regions or irregular regions, and are not limited thereto. For example, as shown in fig. 4B, the first reference line may include a reference line 450, a first region centered on the first reference line may be a region 460, the second reference line may include a reference line 470, and a second region centered on the second reference line may be a region 480.
Further, in one embodiment, the preset area includes a first rectangular area with the first reference line as a center line and a second rectangular area with the second reference line as a center line; wherein the first reference line is parallel to the horizontal axis of the image frame and the second reference line is parallel to the vertical axis of the image frame.
Specifically, the first reference line and the second reference line may be reference lines with a center point of the image frame as a reference point. In one embodiment, before the step of obtaining the luminance difference information between every two adjacent image frames in the image frame sequence, the method further includes: determining a first reference line and a second reference line based on a center point of an image frame in the image frame sequence; a preset region in the first image frame and a preset region in the second image frame are determined based on the first reference line and the second reference line.
For example, as shown in fig. 5A, the first reference line includes a reference line 510, the first rectangular region of the first reference line being the center line includes a region 520, the second reference line includes a reference line 530, the second rectangular region of the second reference line being the center line includes a region 540, and the reference line 510 and the reference line 530 cross the center point of the image frame.
Determining the first rectangular area and the second rectangular area from the first reference line and the second reference line that take the center point as the reference point ensures effective sampling of brightness information in image frames of different sizes, reducing the amount of calculation of the brightness difference information while preserving the accuracy of the subsequent picture state type.
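A non-authoritative sketch of the center-point variant described above is given below; the band thickness parameter and the (top, bottom, left, right) region representation are assumptions of this sketch.

```python
def center_preset_regions(height: int, width: int, band_ratio: float = 0.1):
    """Derive two rectangular sampling regions from reference lines passing
    through the center point of an image frame.

    band_ratio is an assumed parameter controlling the thickness of each band
    relative to the frame size; regions are (top, bottom, left, right) tuples.
    """
    cy, cx = height // 2, width // 2
    half_h = max(1, int(height * band_ratio) // 2)
    half_w = max(1, int(width * band_ratio) // 2)
    # First rectangular area: its center line is parallel to the horizontal axis.
    first_region = (cy - half_h, cy + half_h, 0, width)
    # Second rectangular area: its center line is parallel to the vertical axis.
    second_region = (0, height, cx - half_w, cx + half_w)
    return first_region, second_region
```

For a 1920 x 1080 frame, for example, center_preset_regions(1080, 1920) yields one horizontal band and one vertical band crossing at the frame center.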
Specifically, the first reference line and the second reference line may be reference lines using differential pixel points of the image frame as reference points. In one embodiment, before the step of obtaining the luminance difference information between every two adjacent image frames in the image frame sequence, the method further includes: acquiring differential pixel points between the first frame image and the second frame image; determining a first reference line and a second reference line in the second image frame and the first image frame based on the differential pixel points; and acquiring a preset area in the first image frame and a preset area in the second image frame based on the first reference line and the second reference line.
For example, as shown in fig. 5B, the differential pixel points include a point 550A and a point 550B. Through point 550A, a first reference line 560A and a second reference line 580A can be determined, and through point 550B, a first reference line 560B and a second reference line 580B can be determined. The first rectangular areas with the first reference lines as center lines include regions 570A and 570B, and the second rectangular areas with the second reference lines as center lines include regions 590A and 590B.
Determining the first rectangular area and the second rectangular area from the first reference line and the second reference line that take the differential pixel points as reference points ensures effective sampling of brightness information in the image frames, reducing the amount of calculation of the brightness difference information while preserving the accuracy of the subsequent picture state type.
After the preset area is determined, the pixel information of the first image frame in the preset area can be obtained, the first brightness information is determined based on the pixel information of the first image frame in the preset area, the pixel information of the second image frame in the preset area is obtained, and the second brightness information is determined based on the pixel information of the second image frame in the preset area. Specifically, for the first image frame, YUV color space data of each pixel point in the preset area of the first image frame may be obtained, and a luminance value of each pixel point in the preset area may be determined based on a luminance component in the YUV color space data corresponding to each pixel point, so as to obtain the first luminance information. For the second image frame, YUV color space data of each pixel point in the preset area of the second image frame can be obtained, and the brightness value of each pixel point in the preset area is determined based on the brightness component in the YUV color space data corresponding to each pixel point, so that second brightness information is obtained.
After the first brightness information of the first image frame and the second brightness information of the second image frame are obtained, the brightness difference value of the pixel points at the same position can be calculated based on the first brightness information and the second brightness information, and then the brightness difference information between the first image frame and the second image frame is determined based on the brightness difference values corresponding to all the pixel points in the preset area.
For example, the first brightness information includes the brightness values at each pixel point in the preset areas of the n-th image frame, and the second brightness information includes the brightness values at each pixel point in the preset areas of the (n-1)-th image frame. The brightness difference information between the first image frame and the second image frame can then be calculated by the following formula:
f(n) = Σ S_i
where S_i denotes the sum of the differences between the brightness values at each pixel point of the i-th preset area in the n-th image frame and the (n-1)-th image frame, and f(n) denotes the brightness difference information between the n-th image frame and the (n-1)-th image frame.
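Assuming the luminance planes of two adjacent frames and preset regions such as those sketched earlier, f(n) = Σ S_i could be evaluated as follows; treating each per-pixel difference as an absolute value is an assumption of this sketch.

```python
import numpy as np

def region_difference(prev_y: np.ndarray, curr_y: np.ndarray, region) -> float:
    """S_i: sum of the brightness differences at each pixel point of one preset
    region between the n-th and (n-1)-th image frames.

    prev_y and curr_y are the luminance planes of two adjacent frames;
    region is a (top, bottom, left, right) tuple.
    """
    top, bottom, left, right = region
    diff = curr_y[top:bottom, left:right].astype(np.int32) \
         - prev_y[top:bottom, left:right].astype(np.int32)
    return float(np.abs(diff).sum())

def luminance_difference(prev_y: np.ndarray, curr_y: np.ndarray, regions) -> float:
    """f(n): sum of S_i over all preset regions."""
    return sum(region_difference(prev_y, curr_y, r) for r in regions)
```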
The brightness difference sequence includes the brightness difference information of at least one pair of adjacent image frames. Specifically, based on the acquired pixel information of every two adjacent image frames, the brightness difference information between every two adjacent image frames in the image frame sequence can be determined, and the brightness difference sequence of the image frame sequence is then constructed from the obtained brightness difference information.
It can be understood that the brightness difference sequence of the image frame sequence is not generated all at once. Instead, during screen projection from the first terminal to the second terminal, each time the first terminal collects an image frame, the brightness difference information between that image frame and the previous image frame can be calculated based on their pixel information and stored in a buffer queue, thereby obtaining the brightness difference sequence corresponding to the image frame sequence.
Step S230, determining a picture state type of the image frame sequence and a screen-projection frame rate corresponding to the picture state type based on the brightness difference sequence.
The picture state type is used for identifying the changing speed of the picture in the image frame sequence acquired by the first terminal. Specifically, the picture status type of the image frame sequence may be determined based on how fast the image picture changes, for example, the picture status type may be classified into a still picture type, a dynamic picture type; for another example, the picture status type may be classified into a game picture type, a video picture type, a PPT picture type.
The screen-projection frame rate refers to the frequency at which the first terminal sends image frames to the second terminal. Different picture state types correspond to different screen-projection frame rates: the faster the picture corresponding to a picture state type changes, the higher the screen-projection frame rate, and the slower it changes, the lower the frame rate. For example, when projecting the screen while playing video or games, the user is more sensitive to picture synchronization and smoothness, and the higher the frequency at which the image frame sequence is projected, the better the experience. Conversely, when only static pages are viewed or switched, the user barely perceives picture synchronization and smoothness, and the projection appears smooth as long as the frame interval does not exceed the persistence-of-vision threshold of the human eye of about 1/24 second. Taking picture state types including a static picture type and a dynamic picture type as an example, the screen-projection frame rate corresponding to the static picture type is lower than that corresponding to the dynamic picture type; taking picture state types including a game picture type, a video picture type and a PPT picture type as another example, the screen-projection frame rates corresponding to the game picture type, the video picture type and the PPT picture type decrease in that order.
When judging the picture state type of the image frame sequence acquired in real time, in order to avoid frequent adjustment of the screen-projection frame rate caused by switching of the picture state type, a fuzzy interval may be set to achieve lazy switching of the picture state type. Taking picture state types including a static picture type and a dynamic picture type as an example, in one embodiment the step of determining the picture state type of the image frame sequence and the screen-projection frame rate corresponding to the picture state type based on the brightness difference sequence includes: if a continuous preset number of brightness difference values in the brightness difference sequence are greater than a first brightness threshold, determining that the image frame sequence is of the dynamic picture type and determining that the screen-projection frame rate is a first frame rate; if a continuous preset number of brightness difference values in the brightness difference sequence are less than or equal to the first brightness threshold, determining that the image frame sequence is of the static picture type and determining that the screen-projection frame rate is a second frame rate; wherein the first frame rate is greater than the second frame rate.
Specifically, each time the first terminal collects an image frame, the brightness difference information between that image frame and the previous image frame can be calculated based on their pixel information and stored in a buffer queue to obtain the brightness difference sequence. The number of brightness difference values that are consecutively greater than the first brightness threshold, or consecutively less than or equal to it, can then be counted in real time: if N (N greater than 1) consecutive brightness difference values in the sequence are greater than the first brightness threshold, the image frame sequence is determined to be of the dynamic picture type; if N consecutive brightness difference values are less than or equal to the first brightness threshold, the image frame sequence is determined to be of the static picture type.
For example, taking the value of N as 30: if 30 consecutive values of f(n) are less than or equal to the first brightness threshold, the image frame sequence is considered to be of the static picture type; if 30 consecutive values of f(n) are greater than the first brightness threshold, the image frame sequence is considered to be of the dynamic picture type.
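A minimal sketch of this consecutive-N decision follows; the window length, the first brightness threshold and the two frame rate values are placeholders, not values taken from this application.

```python
from collections import deque

class PictureStateDetector:
    """Lazy switching between picture state types based on the last N values of f(n)."""

    def __init__(self, n: int = 30, first_threshold: float = 1000.0,
                 first_frame_rate: int = 60, second_frame_rate: int = 10):
        self.window = deque(maxlen=n)            # buffer queue of brightness differences
        self.first_threshold = first_threshold   # first brightness threshold (assumed value)
        self.first_frame_rate = first_frame_rate
        self.second_frame_rate = second_frame_rate
        self.frame_rate = first_frame_rate       # keep the current rate until N samples agree

    def update(self, f_n: float) -> int:
        """Record the newest brightness difference and return the projection frame rate."""
        self.window.append(f_n)
        if len(self.window) == self.window.maxlen:
            if all(v > self.first_threshold for v in self.window):
                self.frame_rate = self.first_frame_rate    # dynamic picture type
            elif all(v <= self.first_threshold for v in self.window):
                self.frame_rate = self.second_frame_rate   # static picture type
        return self.frame_rate
```

Because the rate only changes when all N buffered values fall on the same side of the threshold, isolated fluctuations do not cause the frame rate to be switched back and forth.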
More specifically, during screen projection from the first terminal to the second terminal, the most recently collected target image frame in the image frame sequence can be determined, and the target brightness difference information between the target image frame and the previous image frame can be acquired. If the target brightness difference information and the (N-1) brightness difference values immediately preceding it in the brightness difference sequence are all greater than the first brightness threshold, the image frame sequence is determined to be of the dynamic picture type and the screen-projection frame rate is determined to be the first frame rate; if the target brightness difference information and the (N-1) brightness difference values immediately preceding it are all less than or equal to the first brightness threshold, the image frame sequence is determined to be of the static picture type and the screen-projection frame rate is determined to be the second frame rate.
Similarly, when there are three or more picture state types, a corresponding brightness threshold may be set for each picture state type, and the brightness difference values in the brightness difference sequence are compared with each brightness threshold to determine the picture state type corresponding to the image frame sequence. Taking picture state types including a game picture type, a video picture type and a PPT picture type as an example, determining the picture state type of the image frame sequence and the screen-projection frame rate corresponding to the picture state type based on the brightness difference sequence includes: if a continuous preset number of brightness difference values in the brightness difference sequence are greater than a second brightness threshold, determining that the image frame sequence is of the game picture type and determining that the screen-projection frame rate is a third frame rate; if a continuous preset number of brightness difference values are less than or equal to the second brightness threshold and greater than a third brightness threshold, determining that the image frame sequence is of the video picture type and determining that the screen-projection frame rate is a fourth frame rate; if a continuous preset number of brightness difference values are less than or equal to the third brightness threshold, determining that the image frame sequence is of the PPT picture type and determining that the screen-projection frame rate is a fifth frame rate; wherein the third frame rate is greater than the fourth frame rate, and the fourth frame rate is greater than the fifth frame rate.
For example, taking the value of N as 30: if 30 consecutive values of f(n) are less than or equal to the third brightness threshold, the image frame sequence is considered to be of the PPT picture type; if 30 consecutive values of f(n) are greater than the third brightness threshold and less than or equal to the second brightness threshold, the image frame sequence is considered to be of the video picture type; if 30 consecutive values of f(n) are greater than the second brightness threshold, the image frame sequence is considered to be of the game picture type.
Step S240, transmitting the image frames in the image frame sequence to the second terminal at the screen-projection frame rate.
After determining the screen-projection frame rate, the image frames in the image frame sequence may be sent to the second terminal at that frame rate. It will be appreciated that the image frames may be transmitted to the second terminal at the screen-projection frame rate based on a mirrored screen-casting protocol (the Miracast protocol).
Specifically, in one embodiment, the step of transmitting the image frames in the image frame sequence to the second terminal at the screen-projection frame rate includes: determining a screen-projection interval time based on the screen-projection frame rate; acquiring the most recently collected target image frame in the image frame sequence and the actual interval time between the target image frame and the image frame transmitted last time; if the actual interval time is less than the screen-projection interval time, discarding the target image frame; and if the actual interval time is greater than or equal to the screen-projection interval time, sending the target image frame to the second terminal.
The screen-projection interval time refers to the interval between two successive image frames sent during screen projection. Specifically, during screen projection from the first terminal to the second terminal, after the most recently collected target image frame in the image frame sequence is acquired, the actual interval time between the target image frame and the image frame transmitted last time can be calculated. If the actual interval time is less than the screen-projection interval time, the target image frame is discarded to reduce the pressure of image frame encoding and transmission; if the actual interval time is greater than or equal to the screen-projection interval time, the target image frame is packaged and transmitted normally, ensuring the smoothness of the projected picture.
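The gating logic just described could look like the following sketch; the class and method names are assumptions, and time.monotonic is used merely as one way to measure the actual interval between transmissions.

```python
import time
from typing import Optional

class FrameIntervalGate:
    """Decide whether the most recently collected target image frame is sent or dropped."""

    def __init__(self, frame_rate: int):
        self.set_frame_rate(frame_rate)
        self.last_sent_at = 0.0

    def set_frame_rate(self, frame_rate: int) -> None:
        # Screen-projection interval time derived from the screen-projection frame rate.
        self.interval = 1.0 / frame_rate

    def should_send(self, now: Optional[float] = None) -> bool:
        now = time.monotonic() if now is None else now
        if now - self.last_sent_at < self.interval:
            return False      # actual interval shorter than the projection interval: drop
        self.last_sent_at = now
        return True           # encode and send the target frame to the second terminal
```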
Meanwhile, in one embodiment, after the step of determining the screen-projection frame rate corresponding to the picture state type, the method further includes: sending the screen-projection frame rate to the second terminal, so that the second terminal plays the received image frames according to the screen-projection frame rate. Transmitting the screen-projection frame rate to the second terminal allows the second terminal to update its playback frame rate accordingly, ensuring picture synchronization between the first terminal and the second terminal.
In the screen-projection frame rate acquisition method, an image frame sequence to be projected is collected; brightness difference information between every two adjacent image frames in the image frame sequence is acquired to obtain a brightness difference sequence of the image frame sequence; a picture state type of the image frame sequence and a screen-projection frame rate corresponding to the picture state type are determined based on the brightness difference sequence; and the image frames in the image frame sequence are transmitted to the second terminal at the screen-projection frame rate. The picture state type of the screen picture is detected from the brightness difference information of the image frame sequence corresponding to the screen picture of the first terminal, so that the screen-projection frame rate is dynamically adjusted based on the picture state type, reducing the power consumption of the terminal while ensuring picture smoothness during screen projection.
Considering the influence of the communication bandwidth between the first terminal and the second terminal on the transmission of the image frame sequence, in one embodiment, before the step of acquiring the brightness difference information between every two adjacent image frames in the image frame sequence to obtain the brightness difference sequence of the image frame sequence, the method further includes: acquiring a transmission bandwidth between the first terminal and the second terminal; and if the transmission bandwidth is greater than a preset bandwidth threshold, executing the step of acquiring the brightness difference information between every two adjacent image frames in the image frame sequence to obtain the brightness difference sequence of the image frame sequence.
Further, if the transmission bandwidth information is smaller than or equal to a preset bandwidth threshold, sending the image frames in the image frame sequence to the second terminal at a fixed frame rate.
The preset bandwidth threshold can be set according to the actual situation; in one embodiment, the preset bandwidth threshold may be set to 50 Mbps. Specifically, the first terminal may send information to the second terminal and calculate the transmission bandwidth between the first terminal and the second terminal based on the data amount of the sent information, the time at which it was sent, and the time at which the response from the second terminal is received.
After the transmission bandwidth is determined, it can be compared with the preset bandwidth threshold. When the transmission bandwidth is greater than the preset bandwidth threshold, the step of acquiring the brightness difference information between two successive image frames based on their pixel information is executed, so that the first terminal can judge the picture state type based on the brightness difference information and adjust the screen-projection frame rate based on the picture state type. When the transmission bandwidth is less than or equal to the preset bandwidth threshold, the screen-projection frame rate is set to a fixed frame rate, so that the first terminal sends the image frame sequence at the fixed frame rate and stops judging the picture state type.
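A small sketch of this bandwidth gate is given below; only the 50 Mbps threshold is named in the embodiment above, and the fixed frame rate value is an assumption.

```python
PRESET_BANDWIDTH_THRESHOLD_MBPS = 50   # example value given in this embodiment
FIXED_FRAME_RATE = 30                  # fixed frame rate (an assumed value)

def choose_projection_frame_rate(bandwidth_mbps: float, adaptive_frame_rate: int) -> int:
    """Fall back to a fixed screen-projection frame rate when the measured transmission
    bandwidth is at or below the preset bandwidth threshold; otherwise use the rate
    determined from the picture state type."""
    if bandwidth_mbps <= PRESET_BANDWIDTH_THRESHOLD_MBPS:
        return FIXED_FRAME_RATE
    return adaptive_frame_rate
```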
The screen frame rate obtaining method provided by the embodiment of the application is further described below with reference to the application scenario of fig. 1. In one embodiment, as shown in fig. 6 and 7, the screen frame rate acquisition method includes:
s610, the first terminal collects an image frame sequence to be projected.
Referring to fig. 7, the first terminal, as the source end, may record and sample the content displayed on its screen in real time from the media source through a media streaming module to obtain the image frame sequence.
S620, for each adjacent first image frame and second image frame in the image frame sequence, the first terminal obtains first brightness information based on pixel information of a preset area in the first image frame, and obtains second brightness information based on pixel information of a preset area in the second image frame.
Since the number of pixel points in one image frame is huge, in order to reduce the computational complexity of the brightness difference information and ensure effective sampling of brightness information whether the first terminal is in landscape or portrait mode, referring to fig. 5A, the preset area may include a first rectangular area with the first reference line as its center line and a second rectangular area with the second reference line as its center line; the first reference line is parallel to the horizontal axis of the image frame and the second reference line is parallel to the vertical axis of the image frame.
S630, the first terminal calculates the difference value between the first brightness information and the second brightness information to obtain brightness difference information between the first image frame and the second image frame.
And S640, the first terminal acquires a brightness difference sequence based on brightness difference information between every two adjacent first image frames and second image frames.
S650, if a continuous preset number of brightness difference values in the brightness difference sequence are greater than the first brightness threshold, the first terminal determines that the image frame sequence is of the dynamic picture type and determines that the screen-projection frame rate is the first frame rate.
S660, if a continuous preset number of brightness difference values in the brightness difference sequence are less than or equal to the first brightness threshold, the first terminal determines that the image frame sequence is of the static picture type and determines that the screen-projection frame rate is the second frame rate; wherein the first frame rate is greater than the second frame rate.
Referring to fig. 7, each time the first terminal obtains an image frame through the media streaming module, the image frame is sent to a frame detection module (frame detector), which detects in real time the target brightness difference information between the current image frame and the previous image frame. If the target brightness difference information and the (N-1) historical brightness difference values immediately preceding it in the brightness difference sequence are all greater than the first brightness threshold, the image frame sequence is determined to be of the dynamic picture type and the screen-projection frame rate is determined to be the first frame rate; if the target brightness difference information and the (N-1) historical brightness difference values immediately preceding it are all less than or equal to the first brightness threshold, the image frame sequence is determined to be of the static picture type and the screen-projection frame rate is determined to be the second frame rate.
S670, the first terminal sends the image frames in the image frame sequence to the second terminal at the screen-projection frame rate.
Referring to fig. 7, the frame detection module sends the obtained screen-projection frame rate to a frame rate change judging module, which judges whether the newly received screen-projection frame rate is the same as the current one. If not, the screen-projection frame rate has changed, and the newly received frame rate is sent to a frame interval threshold module (frame interval threshold) and a network session module (network session). After the frame interval threshold module receives the new screen-projection frame rate, it adjusts the screen-projection interval time, and the network session module notifies the second terminal of the new screen-projection frame rate through the RTSP (Real Time Streaming Protocol) protocol, so that the second terminal updates its playback frame rate accordingly.
Each time the first terminal acquires an image frame through the media streaming module, the image frame is also sent to the frame interval threshold module. After obtaining the most recently collected image frame, the frame interval threshold module calculates the actual interval time between this image frame and the image frame sent last time. When the actual interval time is greater than or equal to the screen-projection interval time, the image frame is sent to an encoding and sending module, which encodes the image frame and sends it to the second terminal to realize screen projection; if the actual interval time is less than the screen-projection interval time, the image frame is discarded to reduce the pressure of image frame encoding and transmission in the encoding and sending module.
In order to better implement the screen-projection frame rate acquisition method of the embodiments of the present application, an embodiment of the present application further provides a screen-projection frame rate acquisition device. As shown in fig. 8, the screen-projection frame rate acquisition device 800 includes:
the image acquisition module 810 is used for acquiring an image frame sequence to be projected;
a brightness difference acquisition module 820, configured to acquire brightness difference information between every two adjacent image frames in the image frame sequence, so as to obtain a brightness difference sequence of the image frame sequence;
a frame rate determining module 830, configured to determine a frame status type of the image frame sequence and a screen frame rate corresponding to the frame status type based on the luminance difference sequence;
and the screen projection module 840 is configured to send the image frames in the image frame sequence to the second terminal at the screen projection frame rate.
In some embodiments of the present application, the luminance difference obtaining module 820 is specifically configured to obtain the first luminance information based on the pixel information of the preset area in the first image frame; acquiring second brightness information based on pixel information of a preset area in the second image frame; wherein the first image frame and the second image frame are two adjacent image frames in the image frame sequence; and calculating the difference value between the first brightness information and the second brightness information to obtain brightness difference information between the first image frame and the second image frame.
In some embodiments of the present application, the preset area includes a first area with a first reference line as a center line and a second area with a second reference line as a center line; the first reference line and the second reference line are reference lines with different directions.
In some embodiments of the present application, the preset area includes a first rectangular area with a first reference line as a center line and a second rectangular area with a second reference line as a center line; wherein the first reference line is parallel to the horizontal axis of the image frame and the second reference line is parallel to the vertical axis of the image frame.
In some embodiments of the present application, the luminance difference obtaining module 820 is specifically further configured to: determine a first reference line and a second reference line based on a center point of an image frame in the image frame sequence; and determine a preset region in the first image frame and a preset region in the second image frame based on the first reference line and the second reference line.
In some embodiments of the present application, the luminance difference obtaining module 820 is specifically further configured to: acquire differential pixel points between the first image frame and the second image frame; determine a first reference line and a second reference line in the first image frame and the second image frame based on the differential pixel points; and acquire a preset area in the first image frame and a preset area in the second image frame based on the first reference line and the second reference line.
In some embodiments of the present application, the picture state types include a dynamic picture type and a static picture type; the frame rate determining module 830 is specifically configured to: when a continuous preset number of luminance difference values in the luminance difference sequence are greater than a first luminance threshold, determine that the image frame sequence is of the dynamic picture type and that the screen-throwing frame rate is a first frame rate; and when a continuous preset number of luminance difference values in the luminance difference sequence are less than or equal to the first luminance threshold, determine that the image frame sequence is of the static picture type and that the screen-throwing frame rate is a second frame rate; wherein the first frame rate is greater than the second frame rate.
In some embodiments of the present application, the screen-projection frame rate obtaining device further includes a bandwidth obtaining module, configured to obtain a transmission bandwidth between the first terminal and the second terminal; when the transmission bandwidth is greater than a preset bandwidth threshold, the brightness difference acquisition module 820 performs the step of acquiring brightness difference information between every two adjacent image frames in the image frame sequence, so as to obtain the brightness difference sequence of the image frame sequence.
In some embodiments of the present application, the brightness difference acquisition module 820 is configured to send the image frames in the image frame sequence to the second terminal at a fixed frame rate when the transmission bandwidth information is less than or equal to a preset bandwidth threshold.
In some embodiments of the present application, the screen projection module 840 is configured to: determine a screen-projection interval time based on the screen-projection frame rate; acquire the most recently collected target image frame in the image frame sequence and the actual interval time between the target image frame and the image frame that was sent last; discard the target image frame when the actual interval time is less than the screen-projection interval time; and send the target image frame to the second terminal when the actual interval time is greater than or equal to the screen-projection interval time.
In some embodiments of the present application, the screen frame rate obtaining device further includes a frame rate sending module, configured to send the screen frame rate to the second terminal, so that the second terminal plays the received image frame according to the screen frame rate.
For a specific limitation of the screen-projection frame rate acquisition device, reference may be made to the limitation of the screen-projection frame rate acquisition method above, which is not repeated here. Each of the above modules in the screen-projection frame rate acquisition device may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in or independent of a processor in the computer device in the form of hardware, or may be stored in a memory in the computer device in the form of software, so that the processor can call and execute the operations corresponding to the above modules.
In some embodiments of the present application, the screen-projection frame rate acquisition apparatus 800 may be implemented in the form of a computer program that is executable on a computer device as shown in fig. 9. The memory of the computer device may store the program modules constituting the screen-projection frame rate acquisition apparatus 800, such as the image acquisition module 810, the luminance difference acquisition module 820, the frame rate determination module 830, and the screen projection module 840 shown in fig. 8. The computer program constituted by these program modules causes the processor to execute the steps of the screen-projection frame rate acquisition method of the embodiments of the present application described in this specification.
For example, the computer device shown in fig. 9 may perform step S210 through the image acquisition module 810 in the screen-projection frame rate acquisition apparatus 800 shown in fig. 8. The computer device may perform step S220 through the luminance difference acquisition module 820, step S230 through the frame rate determination module 830, and step S240 through the screen projection module 840. The computer device includes a processor, a memory, and a network interface connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The network interface of the computer device is used to communicate with an external computer device through a network connection. The computer program, when executed by the processor, implements the screen-projection frame rate acquisition method.
It will be appreciated by persons skilled in the art that the architecture shown in fig. 9 is merely a block diagram of some of the architecture relevant to the present inventive arrangements and is not limiting as to the computer device to which the present inventive arrangements are applicable, and that a particular computer device may include more or fewer components than shown, or may combine some of the components, or have a different arrangement of components.
In some embodiments of the application, a computer device is provided that includes one or more processors; a memory; and one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the processor to perform the steps of the above-described screen frame rate acquisition method. The step of the screen frame rate acquisition method herein may be a step in the screen frame rate acquisition method of each of the above embodiments.
In some embodiments of the present application, a computer readable storage medium is provided, in which a computer program is stored, where the computer program is loaded by a processor, so that the processor performs the steps of the above-mentioned screen frame rate acquisition method. The step of the screen frame rate acquisition method herein may be a step in the screen frame rate acquisition method of each of the above embodiments.
Those skilled in the art will appreciate that implementing all or part of the above-described embodiment methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in embodiments provided herein can include at least one of non-volatile and volatile memory. The nonvolatile Memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash Memory, optical Memory, or the like. Volatile memory can include random access memory (Random Access Memory, RAM) or external cache memory. By way of illustration, and not limitation, RAM can take many forms, such as static random access memory (Static Random Access Memory, SRAM) or dynamic random access memory (Dynamic Random Access Memory, DRAM), among others.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The above describes in detail a method, an apparatus, a computer device and a storage medium for acquiring a screen frame rate provided by the embodiments of the present application, and specific examples are applied to illustrate the principles and embodiments of the present application, where the above description of the embodiments is only for helping to understand the method and core ideas of the present application; meanwhile, as those skilled in the art will have variations in the specific embodiments and application scope in light of the ideas of the present application, the present description should not be construed as limiting the present application.

Claims (14)

1. A screen-throwing frame rate acquisition method, characterized in that the method is applied to a first terminal and comprises the following steps:
collecting an image frame sequence to be projected;
acquiring brightness difference information between every two adjacent image frames in the image frame sequence to obtain a brightness difference sequence of the image frame sequence;
determining a picture state type of the image frame sequence and a screen throwing frame rate corresponding to the picture state type based on the brightness difference sequence;
and sending the image frames in the image frame sequence to a second terminal at the screen-throwing frame rate.
2. The method of claim 1, wherein the step of obtaining luminance difference information between each two adjacent image frames in the sequence of image frames comprises:
Acquiring first brightness information based on pixel information of a preset area in a first image frame;
acquiring second brightness information based on pixel information of a preset area in the second image frame; wherein the first image frame and the second image frame are two adjacent image frames in the image frame sequence;
and calculating the difference value between the first brightness information and the second brightness information to obtain brightness difference information between the first image frame and the second image frame.
3. The method of claim 2, wherein the predetermined area comprises a first area centered on a first reference line and a second area centered on a second reference line; wherein the first reference line and the second reference line are reference lines with different directions.
4. The method of claim 2, wherein the predetermined area comprises a first rectangular area centered on a first reference line and a second rectangular area centered on a second reference line; wherein the first reference line is parallel to a horizontal axis of the image frame and the second reference line is parallel to a vertical axis of the image frame.
5. The method according to claim 3 or 4, further comprising, prior to the step of obtaining luminance difference information between every two adjacent image frames in the image frame sequence:
Determining the first reference line and the second reference line based on a center point of an image frame in the image frame sequence;
a preset region in the first image frame and a preset region in the second image frame are determined based on the first reference line and the second reference line.
6. The method according to claim 3 or 4, further comprising, prior to the step of obtaining luminance difference information between every two adjacent image frames in the image frame sequence:
acquiring differential pixel points between the first image frame and the second image frame;
determining the first reference line and the second reference line in the second image frame and the first image frame based on the differential pixel points;
and acquiring a preset area in the first image frame and a preset area in the second image frame based on the first reference line and the second reference line.
7. The method of claim 1, wherein the picture status types include a dynamic picture type and a static picture type;
the step of determining the picture state type of the image frame sequence and the screen projection frame rate corresponding to the picture state type based on the brightness difference sequence comprises the following steps:
If the continuous preset number of brightness difference information in the brightness difference sequence is larger than a first brightness threshold value, determining that the image frame sequence is of a dynamic picture type, and determining that the screen-throwing frame rate is a first frame rate;
if the continuous preset number of brightness difference information in the brightness difference sequence is smaller than or equal to the first brightness threshold, determining that the image frame sequence is of a static picture type, and determining that the screen-throwing frame rate is a second frame rate;
wherein the first frame rate is greater than the second frame rate.
8. The method according to claim 1, wherein before the step of acquiring the brightness difference information between every two adjacent image frames in the image frame sequence to obtain the brightness difference sequence of the image frame sequence, the method further comprises:
acquiring a transmission bandwidth between the first terminal and the second terminal;
correspondingly, the obtaining the brightness difference information between every two adjacent image frames in the image frame sequence to obtain the brightness difference sequence of the image frame sequence comprises the following steps:
and if the transmission bandwidth is greater than a preset bandwidth threshold, acquiring brightness difference information between every two adjacent image frames in the image frame sequence to obtain a brightness difference sequence of the image frame sequence.
9. The method of claim 8, further comprising, after obtaining the transmission bandwidth between the first terminal and the second terminal:
and if the transmission bandwidth information is smaller than or equal to the preset bandwidth threshold value, sending the image frames in the image frame sequence to the second terminal at a fixed frame rate.
10. The method of claim 1, wherein the step of transmitting the image frames of the sequence of image frames to the second terminal at the screen-drop frame rate comprises:
determining screen-throwing interval time based on the screen-throwing frame rate;
acquiring a target image frame acquired latest in the image frame sequence and the actual interval time between the target image frame and the image frame transmitted last time;
discarding the target image frame if the actual interval time is smaller than the screen-throwing interval time;
and if the actual interval time is greater than or equal to the screen-throwing interval time, the target image frame is sent to the second terminal.
11. The method according to any one of claims 1 to 5, further comprising, after the step of determining the screen-throwing frame rate corresponding to the picture state type:
And sending the screen-throwing frame rate to the second terminal, so that the second terminal plays the received image frames according to the screen-throwing frame rate.
12. A screen frame rate acquisition apparatus, the apparatus comprising:
the image acquisition module is used for acquiring an image frame sequence to be projected;
the brightness difference acquisition module is used for acquiring brightness difference information between every two adjacent image frames in the image frame sequence to obtain a brightness difference sequence of the image frame sequence;
the frame rate determining module is used for determining a picture state type of the image frame sequence and a screen throwing frame rate corresponding to the picture state type based on the brightness difference sequence;
and the screen projection module is used for transmitting the image frames in the image frame sequence to the second terminal at the screen projection frame rate.
13. A computer device, the computer device comprising:
one or more processors;
a memory; and
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the processor to implement the screen frame rate acquisition method of any one of claims 1 to 11.
14. A computer-readable storage medium, having stored thereon a computer program, the computer program being loaded by a processor to perform the steps of the screen frame rate acquisition method of any one of claims 1 to 11.
CN202310139841.XA 2023-02-13 2023-02-13 Screen-projection frame rate acquisition method and device, computer equipment and storage medium Pending CN116996639A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310139841.XA CN116996639A (en) 2023-02-13 2023-02-13 Screen-projection frame rate acquisition method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116996639A (en) 2023-11-03

Family

ID=88522019

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310139841.XA Pending CN116996639A (en) 2023-02-13 2023-02-13 Screen-projection frame rate acquisition method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116996639A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0951182A1 (en) * 1998-04-14 1999-10-20 THOMSON multimedia S.A. Method for detecting static areas in a sequence of video pictures
KR101698314B1 (en) * 2015-12-09 2017-01-20 경북대학교 산학협력단 Aparatus and method for deviding of static scene based on statistics of images
US20170034542A1 (en) * 2014-07-17 2017-02-02 Panasonic Intellectual Property Management Co., Ltd. Recognition data generation device, image recognition device, and recognition data generation method
CN107277607A (en) * 2017-06-09 2017-10-20 努比亚技术有限公司 A kind of screen picture method for recording, terminal and computer-readable recording medium
CN108184165A (en) * 2017-12-28 2018-06-19 广东欧珀移动通信有限公司 Video broadcasting method, electronic device and computer readable storage medium
CN113438501A (en) * 2020-03-23 2021-09-24 腾讯科技(深圳)有限公司 Video compression method, device, computer equipment and storage medium
CN114245196A (en) * 2021-12-08 2022-03-25 卓米私人有限公司 Screen recording stream pushing method and device, electronic equipment and storage medium
CN114307136A (en) * 2021-12-27 2022-04-12 努比亚技术有限公司 Game screen-throwing power consumption control method and device and computer readable storage medium

Similar Documents

Publication Publication Date Title
CN114584849B (en) Video quality evaluation method, device, electronic equipment and computer storage medium
CN110267098B (en) Video processing method and terminal
WO2020048429A1 (en) Method and apparatus for obtaining media resource
CN112262427B (en) Smear evaluation method, smear improvement method, and electronic device
US10446089B2 (en) Method, system and computer readable storage medium for driving liquid crystal displays
CN112995776B (en) Method, device, equipment and storage medium for determining screen capture frame rate of shared screen content
CN112584234A (en) Video image frame complementing method and related device
KR20200011000A (en) Device and method for augmented reality preview and positional tracking
CN111709891A (en) Training method of image denoising model, image denoising method, device and medium
CN110971833B (en) Image processing method and device, electronic equipment and storage medium
CN111013131A (en) Delayed data acquisition method, electronic device, and storage medium
CN112788337A (en) Video automatic motion compensation method, device, equipment and storage medium
CN112565909A (en) Video playing method and device, electronic equipment and readable storage medium
CN116996639A (en) Screen-projection frame rate acquisition method and device, computer equipment and storage medium
CN114647468B (en) Screen projection image display method and device, electronic equipment and storage medium
CN110809166B (en) Video data processing method and device and electronic equipment
US8212796B2 (en) Image display apparatus and method, program and recording media
CN112804469A (en) Video call processing method, device, equipment and storage medium
CN113066068B (en) Image evaluation method and device
CN114466228B (en) Method, equipment and storage medium for improving smoothness of screen projection display
CN114554126B (en) Baseboard management control chip, video data transmission method and server
CN113489745B (en) Video data transmission method, device, equipment and storage medium
US11546675B2 (en) Methods, systems, and media for streaming video content using adaptive buffers
US20220070384A1 (en) Imaging apparatus and method for controlling imaging apparatus
CN112399096B (en) Video processing method, device and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination