CN111683273A - Method and device for determining video stalling information

Method and device for determining video stalling information

Info

Publication number
CN111683273A
Authority
CN
China
Prior art keywords
video
target video
target
information
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010490059.9A
Other languages
Chinese (zh)
Inventor
李露
高谦
冯毅
李福昌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China United Network Communications Group Co Ltd
Original Assignee
China United Network Communications Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China United Network Communications Group Co Ltd filed Critical China United Network Communications Group Co Ltd
Priority to CN202010490059.9A priority Critical patent/CN111683273A/en
Publication of CN111683273A publication Critical patent/CN111683273A/en
Pending legal-status Critical Current

Classifications

    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/24 Monitoring of processes or resources, e.g. monitoring of server load, available bandwidth, upstream requests
    • H04N21/23418 Processing of video elementary streams involving operations for analysing video streams, e.g. detecting features or characteristics
    • H04N21/2407 Monitoring of transmitted content, e.g. distribution time, number of downloads
    • H04N21/43637 Adapting the video or multiplex stream to a specific local network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
    • H04N21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/64723 Monitoring of network processes or resources, e.g. monitoring of network load

Abstract

The application discloses a method and a device for determining video stalling information, relates to the technical field of data processing, and is used to determine whether stalling occurs while a terminal device plays a video. The method includes: receiving first indication information from the terminal device, where the first indication information is used to acquire a target video and includes performance parameters of the terminal device; acquiring performance parameters of a target network, where the target network is the network that sends the target video to the terminal device; and inputting the performance parameters of the terminal device, the performance parameters of the target network, and the data of the target video into a preset video stall information determination model to obtain stalling information of the target video, i.e., the stalling information, determined by the model, of the target video played by the terminal device. The embodiment of the application is applied to the process in which the terminal device plays the video.

Description

Method and device for determining video stalling information
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a method and an apparatus for determining video stalling information.
Background
With the development of mobile technology, users can watch videos through video applications installed on terminal devices. However, during playback the video may stall because of network conditions and other factors, which degrades the viewing experience.
When a terminal device plays a video, how to accurately determine whether the played video stalls has become an urgent technical problem to be solved.
Disclosure of Invention
The application provides a method and a device for determining video stalling information, which are used to accurately determine the stalling condition of a video played by a terminal device.
To this end, the following technical solutions are adopted in the application:
In a first aspect, a method for determining video stalling information is provided. The method includes:
A video stalling determination device (referred to below simply as the determining device) receives, from the terminal device, first indication information used to indicate acquisition of the target video, where the first indication information includes feature values corresponding to performance parameters of the terminal device. The determining device acquires feature values corresponding to performance parameters of the target network that transmits the target video, together with the data of the target video. The determining device then inputs the terminal-device feature values, the network feature values, and the data of the target video into a preset video stall information determination model to obtain stalling information of the target video, i.e., the stalling information, determined by the model, of the terminal device playing the target video.
In this application, the determining device feeds data of multiple dimensions, namely the performance parameters of the terminal device, the performance parameters of the network transmitting the target video, and the data of the target video, into the preset video stall information determination model, so that stalling information for the playback of the target video can be obtained comprehensively and accurately.
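As an illustrative sketch only (not the actual model of this application), the first-aspect flow can be pictured as assembling the three groups of feature values into one vector and querying a pre-trained model. All names, the feature keys, and the toy decision rule below are hypothetical:

```python
def build_feature_vector(terminal, network, video):
    """Flatten the three feature dictionaries into one ordered vector."""
    keys_t = ["ram_usage", "cpu_usage"]
    keys_n = ["sinr_db", "rsrp_dbm", "delay_ms", "bandwidth_mbps"]
    keys_v = ["bitrate_kbps", "frame_rate", "duration_s"]
    return ([terminal[k] for k in keys_t] +
            [network[k] for k in keys_n] +
            [video[k] for k in keys_v])


class StallModel:
    """Stand-in for the preset video stall information determination model."""

    def predict(self, x):
        # Toy rule for illustration: if the network bandwidth cannot keep
        # up with the video bit rate, assume some stalls occur.
        bitrate_mbps = x[6] / 1000.0   # bitrate_kbps -> Mbit/s
        bandwidth_mbps = x[5]
        stall_count = 0 if bandwidth_mbps >= bitrate_mbps else 3
        return {"stall_count": stall_count}
```

In practice the model would be trained offline on historical playback records; the point of the sketch is only the shape of the interface: multi-dimensional features in, stalling information out.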
In a second aspect, a device for determining video stalling information is provided. The device may be a server or a chip applied in a server, and may include:
a communication unit, configured to receive, from the terminal device, first indication information used to indicate acquisition of the target video, where the first indication information includes feature values corresponding to the performance parameters of the terminal device;
the communication unit being further configured to acquire feature values corresponding to the performance parameters of the target network that transmits the target video, together with the data of the target video; and
a processing unit, configured to input the terminal-device feature values, the network feature values, and the data of the target video into the preset video stall information determination model to obtain stalling information of the target video, i.e., the stalling information, determined by the model, of the terminal device playing the target video.
In a third aspect, a computer-readable storage medium is provided, having stored therein instructions that, when executed, implement the method of the first aspect.
In a fourth aspect, there is provided a computer program product comprising at least one instruction which, when run on a computer, causes the computer to perform the method of the first aspect.
In a fifth aspect, a chip is provided, the chip comprising at least one processor and a communication interface, the communication interface being coupled to the at least one processor, the at least one processor being configured to execute computer programs or instructions to implement the method of the first aspect.
The devices, computer-readable storage media, computer program products, and chips provided above are used to execute the corresponding methods provided above; for their beneficial effects, reference may be made to the beneficial effects of the corresponding solutions in those methods, which are not repeated here.
Drawings
Fig. 1 is a schematic structural diagram of a communication system according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a determining apparatus 200 according to an embodiment of the present application;
fig. 3 is a schematic flowchart of a method for determining video stalling information according to an embodiment of the present application;
fig. 4 is a schematic flowchart of another method for determining video stalling information according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of another determining apparatus 50 according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a communication system according to an embodiment of the present application.
Detailed Description
With the development of mobile technology, users can watch videos through video applications installed on terminal devices. However, during playback the video may stall because of network conditions and other factors, which degrades the viewing experience.
To better determine the cause of video stalling, whether stalling occurs can be determined from, for example, the amount of video data within a preset time period or the buffer status. The cause of video stalling may be determined, for example, in the following ways.
In a first mode, the video frame sequence of a source video and the video frame sequence of a target video are acquired. At least one group of adjacent first and second test frames is obtained from the video frames of the source video. A first matching frame and a second matching frame are matched in the video frames of the target video according to the first and second test frames. Whether the target video stalls is determined according to the index values of at least one group of the first test frame, the second test frame, the first matching frame, and the second matching frame.
In this technical solution, a group of adjacent frames of the source video is matched against the video frames of the target video, and whether the video stalls is judged from the matching result.
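A minimal sketch of this frame-matching idea (hypothetical: frame identifiers stand in for real image matching): adjacent source frames are located in the target recording, and extra repeated frames between the two matches indicate a stall.

```python
def detect_stall_by_matching(source_frames, target_frames):
    """For each adjacent pair of source frames, find the same pair in the
    target recording; a gap larger than 1 between the matched indices
    means frames were repeated in between, which suggests a stall."""
    for i in range(len(source_frames) - 1):
        a, b = source_frames[i], source_frames[i + 1]
        try:
            ia = target_frames.index(a)
            ib = target_frames.index(b, ia + 1)
        except ValueError:
            continue  # this pair was not found in the target recording
        if ib - ia > 1:
            return True
    return False
```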
In a second mode, during video playing, it is detected whether the data volume of the to-be-played video data in the buffer within a preset time length meets a preset data-volume condition; if so, it is determined that the video playing stalls.
In this solution, because the buffer is the position closest to picture display during video playing, and the data volume in the buffer directly determines whether the video picture is updated with a new frame, the accuracy of stall detection during video playing can be improved.
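The buffer-based check of the second mode can be sketched as follows; the threshold logic and the two-second window are assumptions for illustration:

```python
def is_stalled_by_buffer(buffered_bytes, bitrate_bps, window_s=2.0):
    """Flag a stall when the buffered data cannot cover the next
    window_s seconds of playback at the video's bit rate."""
    bytes_needed = bitrate_bps / 8 * window_s
    return buffered_bytes < bytes_needed
```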
In a third mode, the video data is separated from the audio-and-video data; image information of a pre-stored reference image is encapsulated into the video data to obtain encapsulated video data; image data is collected while the encapsulated video data is displayed in a designated area, and the first collection time of the image data is recorded; and whether the played video stalls is determined according to the image data and its first collection time.
In this solution, the image information of the reference image is encapsulated into the video data; while the video data is displayed, image data is collected and its collection time recorded, and whether playback stalls is determined from the image data and the collection time.
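The third mode amounts to sampling what is displayed over time; in the sketch below the sampling format (time, frame identifier) and the one-second threshold are illustrative assumptions:

```python
def detect_stall_by_samples(samples, threshold_s=1.0):
    """samples: list of (capture_time_s, frame_id) collected while the
    video plays. If the displayed frame does not change for longer than
    threshold_s, report a stall."""
    last_frame, last_change = None, None
    for t, frame in samples:
        if frame != last_frame:
            last_frame, last_change = frame, t
        elif t - last_change > threshold_s:
            return True
    return False
```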
In a fourth mode, a timer is set in the video player; when the video player starts decoding the video file, the timer is started at the same time; the playing progress time of the video player and the time counted by the timer are monitored in real time; and whether the video player stalls is judged according to the playing progress time and the timer time.
In this solution, because the time the video player spends decoding the video file is correlated with the video playing time, whether playback stalls can be judged by comparing the two.
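The fourth mode reduces to comparing elapsed wall-clock time against playback progress; a one-line sketch (the tolerance is an assumed parameter):

```python
def is_stalled_by_timer(play_progress_s, timer_s, tolerance_s=0.5):
    """If the timer (started together with decoding) has run noticeably
    ahead of the reported playing progress, playback stalled in between."""
    return (timer_s - play_progress_s) > tolerance_s
```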
In the technical solutions of the four modes above, whether playback stalls is judged only from video information, so the stalling information cannot be determined comprehensively.
In view of this, an embodiment of the present application provides a method for determining video stalling information, which is used to determine the stalling information comprehensively and accurately. The method includes: the determining device receives, from the terminal device, first indication information used for acquiring the target video, where the first indication information includes feature values corresponding to the performance parameters of the terminal device. The determining device acquires feature values corresponding to performance parameters of the network that transmits the data of the target video, together with the data of the target video. The determining device inputs the terminal-device feature values, the network feature values, and the data of the target video into a preset video stall information determination model to obtain the stalling information of the target video. In this way, the stalling information of the terminal device playing the target video can be obtained.
It should be noted that, in this embodiment of the present application, the performance parameters of the terminal device may also be referred to as performance indicators of the terminal device. They may be used to indicate the performance of the hardware or software of the terminal device. For example, the performance parameters of the terminal device may include one or more of: the model of the terminal device, the random access memory (RAM) usage rate of the terminal device, the central processing unit (CPU) usage rate of the terminal device, the internal version parameter of the terminal device, and the kernel version parameter of the terminal device.
For example, the model of the terminal device may include: OPPO R9 Plusm A, OPPO R7sm, Redmi Note 3, OPPO A300m, vivo X7, OPPO R9 Plus, R/Plusm, HUAWEI RIO-AL00, m2 Note, vivo X7 Plus, vivo Y51A, vivo V3Max A, vivo X6SA, Coolpad 8675-A, and the like.
The performance parameters of the network may also be referred to as performance indicators of the network. They may be used to indicate the quality of the video data transmitted over the network and/or basic information about the network. For example, the performance parameters of the network may include one or more of: the signal to interference plus noise ratio (SINR) of the network, the reference signal received power (RSRP) of the network, the transmission delay of the network, the reference signal received quality (RSRQ) of the network, the bandwidth of the network, and the network type. The performance parameters of the network may further include the access point name (APN)/service set identifier (SSID), the location area code (LAC), the cell identifier (CI), the physical cell identifier (PCI), and the tracking area code (TAC) of the network.
The data of the video may also be referred to as parameter feature values of the video. For example, the data of the video may include one or more of: the video website name, the address information of the video (e.g., the URL, the intranet IP, or the extranet IP), the frame rate and bit rate of the video, the video start time, the average rate of the video, the peak rate of the video, the duration of the video, the size and aggregate traffic of the video, the definition of the video, and the resolution of the video.
The video stalling information may be used to represent the stalling condition when the terminal device plays the video. For example, the stalling information may include the number of stalls and the stall duration within a preset time when the terminal device plays the video. The stall duration may be the duration of each stall or the total stall duration of the playback, where the total stall duration is the sum of the individual stall durations. Alternatively, the stall duration may be the average of the individual stall durations.
For example, if the terminal device stalls 3 times in total while playing video 1, with stall durations t1, t2, and t3, then the total stall duration is t1 + t2 + t3, and the average stall duration is (t1 + t2 + t3)/3.
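These statistics are simple arithmetic over the recorded stall durations; a small helper (names hypothetical):

```python
def stall_stats(durations_s):
    """Count, total, and average stall duration for one playback session."""
    total = sum(durations_s)
    count = len(durations_s)
    return {"count": count,
            "total_s": total,
            "average_s": total / count if count else 0.0}
```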
It should be noted that the performance parameters of the terminal device, the performance parameters of the network, and the video data are merely exemplary, and do not limit the embodiments of the present application.
In the method provided in the embodiment of the application, the determining device inputs data of multiple dimensions, namely the feature values corresponding to the performance parameters of the terminal device, the feature values corresponding to the performance parameters of the network transmitting the target video, and the data of the target video, into the preset video stall information determination model, so that the stalling information for the playback of the target video can be obtained comprehensively and accurately.
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
The method for determining video stalling information provided in the embodiments of the present application may be applied to any communication system supporting communication. The communication system may be a 3rd Generation Partnership Project (3GPP) communication system, for example a 5G mobile communication system, a New Radio (NR) system, an NR vehicle-to-everything (V2X) system, or another next-generation communication system, and may also be a non-3GPP communication system, without limitation. The method is described below with reference to fig. 1 as an example.
Fig. 1 is a schematic architecture diagram of a communication system to which an embodiment of the present application is applied. As shown in fig. 1, the communication system includes a determining apparatus 110 and at least one terminal device (e.g., terminal device 120 and terminal device 130 in fig. 1). The terminal device may be connected to the determining apparatus through a wireless network or a wired network. For example, the terminal device may be communicatively connected to the determining apparatus via a wireless network such as a 4G or 5G network, or via a coaxial cable, a twisted pair cable, an optical fiber, or the like.
The terminal device may be fixed or mobile. Fig. 1 is a schematic diagram; the communication system may also include other devices, such as a base station and a video server, which are not shown in fig. 1. The embodiment of the present application does not limit the number of terminal devices included in the communication system.
The determining device in fig. 1 may be used to determine the stalling information of the video played by the terminal device. For example, the determining device may be a network device or a network element in a network device, or a device connecting the network device and the terminal device, such as a router or a mobile edge computing (MEC) server, without limitation.
The terminal device in fig. 1 may also be referred to as a terminal, user equipment (UE), a mobile station (MS), a mobile terminal (MT), or the like. The terminal device may be a mobile phone, a tablet computer (Pad), a computer with wireless transceiver functions, a virtual reality (VR) terminal device, an augmented reality (AR) terminal device, a wireless terminal in industrial control, a wireless terminal in self driving, a wireless terminal in remote medical, a wireless terminal in smart grid, a wireless terminal in transportation safety, a wireless terminal in smart city, a wireless terminal in smart home, or the like. The embodiment of the present application does not limit the specific technology or the specific device form adopted by the terminal device.
It should be noted that fig. 1 is an exemplary drawing, and the number of devices shown in fig. 1 is not limited. Besides the devices shown in fig. 1, the communication system may include other devices, without limitation.
In particular, each apparatus in fig. 1 may adopt the structure shown in fig. 2 or include the components shown in fig. 2. Fig. 2 is a schematic composition diagram of a determining apparatus 200 according to an embodiment of the present application. The determining apparatus 200 may be a terminal device, or a chip or a system on chip in the terminal device. Alternatively, the determining apparatus 200 may be a chip or a system on chip in the first access network device or below the access network device. As shown in fig. 2, the determining apparatus 200 includes a processor 201, a communication interface 202, and a communication line 203.
Further, the determining apparatus 200 may further include a memory 204. The processor 201, the memory 204 and the communication interface 202 may be connected via a communication line 203.
The processor 201 may be a CPU, a network processor (NP), a digital signal processor (DSP), a microprocessor, a microcontroller, a programmable logic device (PLD), or any combination thereof. The processor 201 may also be another device with processing functions, such as a circuit, a device, or a software module, without limitation.
The communication interface 202 is configured to communicate with other devices or other communication networks. The other communication network may be an Ethernet, a radio access network (RAN), a wireless local area network (WLAN), or the like. The communication interface 202 may be a module, a circuit, an interface, or any apparatus capable of enabling communication.
A communication line 203 for transmitting information between the respective components included in the determination apparatus 200.
A memory 204 for storing instructions. Wherein the instructions may be a computer program.
The memory 204 may be a read-only memory (ROM) or other types of static storage devices that can store static information and/or instructions, a Random Access Memory (RAM) or other types of dynamic storage devices that can store information and/or instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage, optical disc storage (including a compact disc, a laser disc, an optical disc, a digital versatile disc, a blu-ray disc, etc.), a magnetic disc storage medium or other magnetic storage devices, and the like, without limitation.
It is noted that the memory 204 may exist independently of the processor 201 or may be integrated with the processor 201. The memory 204 may be used to store instructions, program code, some data, or the like, and may be located inside or outside the determining apparatus 200, without limitation. The processor 201 is configured to execute the instructions stored in the memory 204 to implement the method for determining video stalling information provided in the following embodiments of the present application.
In one example, processor 201 may include one or more CPUs, such as CPU0 and CPU1 in fig. 2.
As an alternative implementation, the determining apparatus 200 includes a plurality of processors, for example, the processor 207 may be included in addition to the processor 201 in fig. 2.
As an alternative implementation, the determining apparatus 200 further includes an output device 205 and an input device 206. Illustratively, the input device 206 is a keyboard, a mouse, a microphone, a joystick, or a similar device, and the output device 205 is a display screen, a speaker, or a similar device.
It is noted that the determining apparatus 200 may be a desktop computer, a portable computer, a network server, a mobile phone, a tablet computer, a wireless terminal, an embedded device, a chip system, or a device with a structure similar to that in fig. 2. Further, the structure shown in fig. 2 does not constitute a limitation: besides the components shown in fig. 2, the apparatus may include more or fewer components, combine some components, or arrange the components differently.
In the embodiment of the present application, the chip system may be composed of a chip, and may also include a chip and other discrete devices.
In addition, acts, terms, and the like referred to between the embodiments of the present application may be mutually referenced and are not limited. In the embodiment of the present application, the name of the message or the name of the parameter in the message exchanged between the devices is only an example, and other names may also be used in the specific implementation, which is not limited.
In the embodiments of the present application, terms such as "first" and "second" are used to distinguish the same or similar items having substantially the same function and action. For example, the first terminal and the second terminal are only used for distinguishing different terminals, and the sequence order of the terminals is not limited. Those skilled in the art will appreciate that the terms "first," "second," etc. do not denote any order or quantity, nor do the terms "first," "second," etc. denote any order or importance.
It is noted that, in the present application, words such as "exemplary" or "for example" are used to indicate examples, illustrations or descriptions. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or multiple.
The following describes a method for determining video pause information provided in an embodiment of the present application with reference to the communication system shown in fig. 1. The determining device and the terminal device described in the following embodiments may have the components shown in fig. 2, which are not described again. In this application, the actions, terms, and the like referred to in the embodiments may be mutually referenced and are not limited. In the embodiments of the present application, the name of a message exchanged between devices or the name of a parameter in a message is only an example, and other names may be used in specific implementations, without limitation. For example, the term "comprising" in the embodiments of the present application may also be replaced by "carrying" or the like.
Fig. 3 provides a method for determining video pause information according to an embodiment of the present application. As shown in fig. 3, the method includes:
step 301, the determining device receives first indication information from the terminal equipment.
Wherein the determining means may be the determining means 110 in fig. 1. The terminal device may be any one of the terminal devices in fig. 1, for example, the terminal device may be the terminal device 120 or the terminal device 130 in fig. 1, without limitation.
The first indication information may be used to obtain the target video, and/or the first indication information may be used to indicate a characteristic value corresponding to a performance parameter of the terminal device. The performance parameters of the terminal device may refer to the above description, and are not described in detail. The first indication information is used for indicating the performance parameters of the terminal device, and may also be described as the performance parameters of the terminal device carried by the first indication information.
In a possible implementation manner, if the first indication information is used to obtain the target video, the first indication information may be address information of the target video. For example, the address information of the target video may be a Uniform Resource Locator (URL) or an IP address of the target video.
In another possible implementation manner, if the first indication information is also used to indicate the characteristic values corresponding to the performance parameters of the terminal device, the first indication information may include a plurality of bits, each of which corresponds to one performance parameter of the terminal device. For example, the plurality of bits includes bit 1, bit 2, and bit 3, where bit 1 corresponds to performance parameter 1 of the terminal device, bit 2 corresponds to performance parameter 2 of the terminal device, and bit 3 corresponds to performance parameter 3 of the terminal device.
For example, when bit 1 takes the value T1, the characteristic value corresponding to performance parameter 1 of the terminal device is T1; when bit 2 takes the value T2, the characteristic value corresponding to performance parameter 2 of the terminal device is T2; and when bit 3 takes the value T3, the characteristic value corresponding to performance parameter 3 of the terminal device is T3.
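The per-bit-field scheme above can be sketched in code. This is an illustration only, not part of the claimed embodiment: the 8-bit field width and the function names `pack_indication` / `unpack_indication` are assumptions made for the example.

```python
# Hypothetical sketch: pack three performance-parameter feature values
# (e.g. CPU usage, RAM usage, signal level) into one integer, one
# fixed-width field per parameter. The 8-bit width is an assumption.
FIELD_BITS = 8

def pack_indication(values):
    """Pack a list of feature values (0-255 each) into one integer."""
    word = 0
    for v in values:
        word = (word << FIELD_BITS) | (v & 0xFF)
    return word

def unpack_indication(word, count):
    """Recover `count` feature values from a packed integer."""
    values = []
    for _ in range(count):
        values.append(word & 0xFF)
        word >>= FIELD_BITS
    return list(reversed(values))

packed = pack_indication([37, 120, 211])
print(unpack_indication(packed, 3))  # -> [37, 120, 211]
```

A real indication message would of course fix field widths and ordering in its signalling specification; the sketch only shows that each field independently carries one parameter's characteristic value.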
In another possible implementation manner, if the first indication information is not used to indicate the characteristic value corresponding to the performance parameter of the terminal device, the determining device may further obtain the characteristic value corresponding to the performance parameter of the terminal device through interaction with the terminal device.
For example, the determining apparatus may send second indication information to the terminal device, where the second indication information is used to obtain the performance parameters of the terminal device. Correspondingly, the terminal device receives the second indication information from the determining apparatus. Receiving the second indication information may trigger the terminal device to send the characteristic values corresponding to its performance parameters to the determining apparatus.
The description of the second indication information may refer to the description of the first indication information, and is not repeated.
Step 302, the determining device obtains a characteristic value corresponding to the performance parameter of the target network and data of the target video.
The target network refers to a network for transmitting data of the target video. For example, when the terminal device and the determining apparatus are communicatively connected via a 5G network, the target network may be the 5G network.
In one example, the determining means may have characteristic values corresponding to performance parameters of a plurality of networks, wherein the target network may be any one of the plurality of networks. For example, when the determining means is a network device, the determining means may store in advance a characteristic value corresponding to a performance parameter of the target network.
The specific description of the characteristic value corresponding to the performance parameter of the target network may refer to the description of the characteristic value corresponding to the performance parameter of the terminal device in step 301, which is not repeated here.
In another example, the determining apparatus may obtain the characteristic value corresponding to the performance parameter of the target network through interaction with a network device corresponding to the target network. For example, when the determining device is a router or an MEC server, receiving the first indication information from the terminal device may trigger the determining device to interact with the network device to obtain the characteristic value corresponding to the performance parameter of the target network. Alternatively, the determining device may actively acquire the characteristic values corresponding to the performance parameters of a plurality of networks including the target network, for example periodically, without limitation.
In another example, the determining device determines the target network according to the location information of the terminal device, for example, the location information may be coordinate information, such as Global Positioning System (GPS) coordinates or beidou coordinates. The location information of the terminal device may be carried in the first indication information, or the determining apparatus may obtain the location information of the terminal device through interaction with the terminal device, which is not limited.
For example, the terminal device is a mobile phone, and the target network is a wireless network. When the terminal device is located in the coverage area of the wireless network, the terminal device sends first indication information to the determining device through the wireless network, and the determining device can determine the wireless network (namely, a target network) carrying the first indication information according to the location information of the terminal device in the first indication information.
Wherein the determination means may acquire the data of the target video through interaction with a network or a server that provides the data of the target video.
For example, the determining device may determine the server corresponding to the target video according to the address information of the target video. The determining device may send third indication information to that server, where the third indication information is used to acquire the data of the target video. The specific description of the third indication information may refer to the second indication information, which is not repeated. Receiving the third indication information may trigger the server corresponding to the target video to send the data of the target video to the determining device.
Step 303, the determining device inputs the characteristic value corresponding to the performance parameter of the terminal device, the characteristic value corresponding to the performance parameter of the target network, and the data of the target video into a preset video pause information determining model to obtain pause information of the target video.
The preset video pause information determining model is used for determining pause information when the terminal equipment plays the target video.
In a possible implementation manner, the preset video pause information determination model is obtained by training on a plurality of sample data. Each of the plurality of sample data may include characteristic values corresponding to performance parameters of a plurality of terminal devices, characteristic values corresponding to performance parameters of a plurality of networks, data of a plurality of videos, and pause information of the plurality of terminal devices when playing the plurality of videos. The training of the preset video pause information determination model is described below.
Based on the technical scheme shown in fig. 3, the determining device inputs the performance parameters of the terminal device, the performance parameters of the network for transmitting the target video, the data of the target video and other data with multiple dimensions into the preset video pause information determining model, so that pause information of the terminal device in the process of playing the target video can be obtained, and the video pause information is comprehensive and accurate.
In a possible implementation manner of the method shown in fig. 3, before step 301, the method provided in the embodiment of the present application may further include: and training and determining a preset video pause information determination model according to a plurality of sample data.
The preset video stuck information determination model may be configured in advance for the determination device, may be obtained by the determination device from another device, or may be obtained by training the determination device according to a plurality of sample data, which is not limited. The following describes a mode in which the determination device is trained from a plurality of sample data:
s1, the determining device obtains characteristic values corresponding to the performance parameters of the plurality of terminal devices, characteristic values corresponding to the performance parameters of the plurality of networks, data of the plurality of videos and pause information when the plurality of terminal devices play the plurality of videos.
The plurality of terminal devices are terminal devices that play at least one of the plurality of videos. The data of each of the plurality of videos may include the IP address of the video, the video size, the video duration, and the pause information of the video when played by each terminal device. The pause information may refer to the description of the method shown in fig. 3 and is not repeated.
Wherein the plurality of networks may be networks that transmit video to the terminal device. For example, the plurality of networks may include network 1, network 2, and network 3. The plurality of terminal devices may include terminal device 1 and terminal device 2. The terminal device 1 and/or the terminal device 2 may acquire data of at least one of the plurality of videos through any one of the network 1, the network 2, and the network 3.
S2, the determining device processes the performance parameters of the plurality of terminal devices, the performance parameters of the plurality of networks and the data of the plurality of videos to obtain data meeting preset conditions.
The preset condition may be one or more conditions that the sample data must satisfy for constructing the preset video pause information determination model.
For example, the preset conditions may include one or more of that the data missing degree of the sample data is less than a first preset value, that there is no single value data, that each data has the same encoding manner, that the proportion of the type of each data is greater than or equal to a second preset value, and that the importance of each data is greater than a third preset value. The above preset conditions are explained below:
1. the data loss degree of each sample data is smaller than a first preset value.
The first preset value is set as needed. The data missing degree refers to the ratio between the number of missing entries for a given field in the plurality of sample data and the total number of sample data. For example, if there are 100 sample data and 60 of them lack the IP address of the video, the data missing degree of the video IP address in the 100 sample data is 60/100 = 0.6.
If the data missing degree of a certain field in the plurality of sample data is greater than or equal to the first preset value, that field can be deleted. Taking a first preset value of 0.5 and combining the above example, the data missing degree 0.6 of the video IP address in the 100 sample data is greater than 0.5, so the determining device may delete the IP address field from each of the 100 sample data.
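The missing-degree filter can be sketched as follows. This is a minimal illustration using plain dicts as sample records; the field names and the 0.5 threshold mirror the example above but are otherwise assumptions.

```python
# Sketch of the data-missing-degree check: drop any field whose
# missing ratio across the sample set reaches the first preset value.
FIRST_PRESET = 0.5  # threshold from the worked example

def drop_sparse_fields(samples):
    """Keep only fields whose missing ratio is below FIRST_PRESET."""
    fields = {k for s in samples for k in s}
    kept = []
    for f in fields:
        missing = sum(1 for s in samples if s.get(f) is None)
        if missing / len(samples) < FIRST_PRESET:
            kept.append(f)
    return [{f: s.get(f) for f in kept} for s in samples]

# 60 of 100 records lack the video IP -> ratio 0.6 >= 0.5, field dropped
samples = [{"ip": None if i < 60 else "1.2.3.4", "size": i} for i in range(100)]
cleaned = drop_sparse_fields(samples)
print("ip" in cleaned[0])  # -> False
```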
2. There is no single value data.
Single-value data means that a certain field takes the same value in every one of the plurality of sample data. For example, taking 100 sample data, the network standard of each of the 100 sample data is the same, such as an LTE network. That is, the network standard is single-value data in the 100 sample data.
If single-value data exists in the plurality of sample data, the corresponding field may be deleted from each sample. For example, with reference to the above example, the network standard of each of the 100 sample data is the same, so the determining device may delete the network standard from each of the 100 sample data.
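The single-value check described above can be sketched in the same style. Field names ("standard", "rsrp") are illustrative only:

```python
# Sketch of the single-value check: a field whose value is identical
# across all samples (e.g. network standard always "LTE") carries no
# information for the model and is removed.
def drop_constant_fields(samples):
    fields = list(samples[0])
    varying = [f for f in fields if len({s[f] for s in samples}) > 1]
    return [{f: s[f] for f in varying} for s in samples]

samples = [{"standard": "LTE", "rsrp": -90 - i} for i in range(5)]
print(drop_constant_fields(samples)[0])  # -> {'rsrp': -90}
```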
3. Each sample data has the same encoding mode.
The encoding method may process non-numeric sample data into numeric data. For example, the encoding method may be a one-hot (one-hot) encoding method.
Wherein each sample data may comprise a plurality of types of data. The plurality of types may include numeric, categorical, temporal, textual, statistical, and address types.
For example, numerical data may include the RAM usage of the terminal device, the CPU usage of the terminal device, the longitude and latitude of the terminal device, the LAC, CI, SINR, PCI, TAC, and RSRP of the network, the peak rate of the video, and the total traffic of the video. Categorical data may include the location mode, network type, and APN/SSID. Temporal data may include the video start time. Textual data may include the model of the terminal device, the specific address of the terminal device (e.g., street, province, city), and the URL of the video. Statistical data may include the average rate of the video. Address data may include the intranet IP of the video, the extranet IP, and the IP address of the video.
In one example, time-based data can be partitioned and extracted by year, month, day, week, hour, minute, second, or day of the year. For example, the start time of video 1 is 12:22:15 on August 23, 2018, the start time of video 2 is 12:30:11 on August 23, 2018, and the start time of video 3 is 12:35:17 on August 23, 2018. In videos 1 to 3, the one-hot coding result for the year (2018) is 0100, for the month (8) is 1000, for the day (23) is 10111, and for the hour (12) is 01100; the one-hot coding result for the minute is 011010 (22) in video 1, 011110 (30) in video 2, and 100101 (35) in video 3; the one-hot coding result for the second is 001111 (15) in video 1, 001011 (11) in video 2, and 010000 (17) in video 3.
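The time-splitting step can be sketched as follows. Note this sketch encodes a field as a true one-hot vector (one position per category); the compact fixed-width codes in the example above follow the embodiment's own convention, and the category sets below are assumptions for illustration.

```python
from datetime import datetime

# Sketch: split a video start time into calendar fields, then encode
# one field (month) as a one-hot vector over its 12 categories.
def one_hot(value, categories):
    return [1 if value == c else 0 for c in categories]

start = datetime(2018, 8, 23, 12, 22, 15)
fields = (start.year, start.month, start.day, start.hour,
          start.minute, start.second)
print(fields)                              # -> (2018, 8, 23, 12, 22, 15)
print(one_hot(start.month, range(1, 13)))  # month 8 of 12 categories
```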
In another example, for address-type data, such as an IP address, the determining device may divide the address into a plurality of segments according to a subnet mask of the address, and perform one-hot encoding according to a segment of each address information. The specific encoding result may refer to the above time-based data encoding method, which is not described in detail.
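The segmentation of address-type data can be sketched minimally. Here an IPv4 address is split along its octet boundaries (rather than by an explicit subnet mask, which is an assumption for this example) so each segment can then be encoded like the time fields above:

```python
# Sketch: split an IPv4 address into four numeric segments, each of
# which could then be one-hot encoded per the scheme described above.
def ip_segments(ip):
    return [int(part) for part in ip.split(".")]

print(ip_segments("192.168.10.25"))  # -> [192, 168, 10, 25]
```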
In yet another example, for text-type data such as a video website name, the result of the one-hot encoding performed by the determining device may be as shown in table 1.
TABLE 1
It should be noted that the names of the video websites in table 1 are merely exemplary, and other video websites, for example, mango TV, B station, and the like, may also be included in the embodiment of the present application, without limitation. The coding of each video website is also exemplary, and each video website may correspond to other coding, without limitation.
Further, to preserve the ordering of numerical sample data, the numerical sample data may be encoded by rank. For example, the determining device may sort the values in descending or ascending order of magnitude and use the resulting sequence numbers of the plurality of sample data as the encoding result.
For example, the numerical type encoding result can be shown in table 2.
TABLE 2
Model of terminal device | Number of pauses | Encoding
OPPO R9 Plusm A          | 8170             | 516
OPPO R7sm                | 3737             | 515
Redmi Note 3             | 2685             | 514
OPPO A300m               | 2424             | 513
vivo X7                  | 2370             | 512
OPPO R9 Plus             | 2365             | 511
R/Plusm                  | 2250             | 510
HUAWEI RIO-AL00          | 2156             | 509
m2 note                  | 1835             | 508
vivo X7 Plus             | 1644             | 507
vivo Y51A                | 1541             | 506
vivo V3Max A             | 1322             | 505
vivo X6SA                | 1128             | 504
Coolpad 8675-A           | 1093             | 503
It should be noted that the terminal devices and pause counts in table 2 are only exemplary, and the embodiment of the present application may also include other terminal devices and pause counts, without limitation. The encoding of each pause count is likewise exemplary, and each may correspond to other codes, without limitation.
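The rank-based encoding behind table 2 can be sketched as follows. The starting code 516 and the pause counts are taken from the table's first rows; everything else is an illustrative assumption.

```python
# Sketch of the ordinal encoding in Table 2: sort models by pause
# count (descending) and assign decreasing codes from 516.
counts = {"OPPO R9 Plusm A": 8170, "OPPO R7sm": 3737, "Redmi Note 3": 2685}
ordered = sorted(counts, key=counts.get, reverse=True)
codes = {model: 516 - i for i, model in enumerate(ordered)}
print(codes["OPPO R7sm"])  # -> 515
```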
4. The proportion of the type of each data is greater than or equal to a second preset value.
Wherein, the second preset value can be set according to the requirement.
For example, among the plurality of sample data, the number of sample data whose pause count is greater than a first threshold is A, and the number whose pause count is less than or equal to the first threshold is B, where A/B = t1. If t1 exceeds the second preset value, the determining device may increase the number of sample data whose pause count is less than or equal to the first threshold, for example by copying one or more of those sample data, i.e., increasing a given sample datum from one copy to two or more. Alternatively, the determining device may reduce the number of sample data whose pause count is greater than the first threshold, for example by deleting one or more of them. In this way, the number of sample data with pause count greater than the first threshold and the number with pause count less than or equal to the first threshold can be balanced, so that the trained preset video pause information determination model is more accurate.
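The balancing step can be sketched as simple oversampling. The ratio threshold of 1.5 and the use of random duplication are assumptions; the embodiment only requires that the two classes end up balanced.

```python
import random

# Sketch of class balancing: if high-pause samples outnumber
# low-pause samples beyond max_ratio, duplicate minority samples.
def balance(high, low, max_ratio=1.5, seed=0):
    rng = random.Random(seed)
    low = list(low)
    while len(high) / len(low) > max_ratio:
        low.append(rng.choice(low))  # copy one minority sample
    return high, low

high, low = balance([1] * 9, [0] * 3)
print(len(high) / len(low) <= 1.5)  # -> True
```

The symmetric alternative (deleting majority samples) would simply pop from `high` instead of appending to `low`.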
5. The importance of each data is greater than a third preset value.
The importance of a datum refers to the Gini index or information entropy of its characteristic value. The determining device may calculate the importance of the characteristic value of each datum according to a preset algorithm. For example, the preset algorithm may be a random forest algorithm, a gradient boosting decision tree (GBDT) algorithm, an extreme gradient boosting (XGBoost) algorithm, or the like, without limitation. The random forest, GBDT, and XGBoost algorithms may refer to the prior art and are not described in detail.
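The impurity measure named above can be shown in isolation. This is only the Gini index itself, not a full tree-ensemble importance computation (which random forest / GBDT / XGBoost would provide); the feature values are illustrative.

```python
from collections import Counter

# Sketch of the Gini index of a feature's values: 0 for a pure
# (single-value) feature, approaching 1 as values spread out.
def gini(values):
    n = len(values)
    return 1.0 - sum((c / n) ** 2 for c in Counter(values).values())

print(gini(["LTE"] * 10))        # single value -> 0.0 (pure)
print(gini(["LTE", "NR"] * 5))   # even split   -> 0.5
```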
S3, the determining device trains on the data meeting the preset conditions according to a preset training algorithm to obtain the preset video pause information determination model.
For example, the preset training algorithm may be a neural network algorithm, a decision tree algorithm, or the like, without limitation. The neural network algorithm and the decision tree algorithm can refer to the prior art and are not described in detail. The determining device may also use other algorithms to train the data meeting the preset condition, without limitation.
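As a stand-in for the decision tree or neural network mentioned above, the training step can be illustrated with a one-split "decision stump" predicting pause count from a single feature. All numbers are made up; a real model would use many features and a full algorithm.

```python
# Minimal sketch of training: fit a decision stump (one threshold on
# one feature, e.g. network rate) minimizing squared error.
def fit_stump(xs, ys):
    best = None
    for t in xs:
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        err = (sum((y - ml) ** 2 for y in left)
               + sum((y - mr) ** 2 for y in right))
        if best is None or err < best[0]:
            best = (err, t, ml, mr)
    _, t, ml, mr = best
    return lambda x: ml if x <= t else mr

# toy data: low network rate -> many pauses, high rate -> few
rates  = [1, 2, 3, 10, 12, 14]
pauses = [9, 8, 9,  1,  0,  1]
model = fit_stump(rates, pauses)
print(round(model(2)), round(model(11)))  # -> 9 1
```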
In a possible implementation manner of the method shown in fig. 3, the method for determining video pause information provided in an embodiment of the present application may further include: when the pause information of the target video exceeds a preset range, the determining device adjusts the performance parameters of the target video so that the pause information of the target video played by the terminal device meets a preset threshold.
The preset range is set as needed. For example, the pause information includes a pause frequency and/or a pause time of the terminal device playing the target video, and the determining device may adjust the performance parameters of the target video when the pause frequency of the target video exceeds a fourth preset value and/or the pause time of the target video exceeds a fifth preset value.
If the characteristic value corresponding to the performance parameter of the target network is a fixed value, the determining device may reduce the characteristic values corresponding to the data of the target video. For example, adjusting the performance parameters of the target video may include: the determining device reduces the definition of the target video, reduces the size of the target video, reduces the code rate of the target video, and the like, without limitation. Alternatively, if the performance parameter of the target network corresponds to a plurality of characteristic values, the determining device may transmit the data of the target video with the optimal performance parameter of the target network. For example, the target network may support multiple different bandwidths, and the determining device may transmit the data of the target video at the maximum bandwidth.
Based on the possible implementation manner, the determining device may reduce the transmission delay of the target video by adjusting the performance parameter of the target video or adjusting the performance parameter of the target network.
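The adjustment loop described above can be sketched as follows. The predictor here is a toy stand-in for the trained pause information determination model, and all thresholds and rates are assumptions.

```python
# Sketch of the adjustment step: lower the video bit rate until the
# predicted pause frequency falls under the preset threshold.
def predicted_pause_freq(bitrate_mbps, bandwidth_mbps=8.0):
    # toy stand-in for the model: pauses grow once bitrate > bandwidth
    return max(0.0, bitrate_mbps - bandwidth_mbps)

def adjust_bitrate(bitrate, threshold=0.5, step=1.0, floor=1.0):
    while predicted_pause_freq(bitrate) > threshold and bitrate > floor:
        bitrate -= step  # e.g. drop to a lower-definition profile
    return bitrate

print(adjust_bitrate(12.0))  # -> 8.0
```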
In a possible implementation manner of the method shown in fig. 3, the method for determining video pause information provided in an embodiment of the present application may further include: the determining device sends the first video data to the terminal device.
And the first video data is the data of the adjusted target video.
Based on the possible implementation mode, after the terminal device receives the adjusted target video, the target video can be smoothly played, and user experience is improved.
The method of fig. 3 will be described in conjunction with the communication system of fig. 1.
Fig. 4 shows another method for determining video pause information provided in an embodiment of the present application. The method may include:
step 401, the determining device determines a preset video stuck information determining model.
The detailed description of step 401 may refer to the first possible implementation manner of fig. 3, and is not repeated.
Step 402, the terminal device sends first indication information to the determining device. Accordingly, the determination means receives the first instruction information from the terminal device.
Step 403, the determining device obtains the characteristic value corresponding to the performance parameter of the target network and the data of the target video.
Step 404, the determining device inputs the characteristic value corresponding to the performance parameter of the terminal device, the characteristic value corresponding to the performance parameter of the target network, and the data of the target video into a preset video pause information determining model to obtain pause information of the target video.
The detailed description of steps 402 to 404 may refer to the method shown in fig. 3, and is not repeated.
Step 405, the determining device determines whether the pause information of the target video exceeds a preset range.
If the pause information of the target video does not exceed the preset range, the determining device may execute step 406; if the pause information of the target video exceeds the preset range, the determining device may execute steps 407 to 408.
Step 406, the determining device sends the data of the target video to the terminal equipment. Accordingly, the terminal device receives data of the target video.
Step 407, the determining device adjusts the data of the target video so that the pause information of the target video meets a preset threshold.
Step 408, the determining means sends the data of the first video to the terminal device. Accordingly, the terminal device receives data of the first video.
The detailed description of steps 405 to 408 may refer to the second implementation manner and the third implementation manner described in fig. 3.
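Steps 405 to 408 can be sketched end to end. The predictor and adjuster below are placeholders standing in for the trained model and the parameter-adjustment step; the preset range value is an assumption.

```python
# Sketch of steps 405-408: compare predicted pause info against the
# preset range, adjust only when needed, then send the result.
def serve_video(video, predict, adjust, preset_range=1.0):
    if predict(video) <= preset_range:  # step 405: within range?
        return video                    # step 406: send unmodified
    adjusted = adjust(video)            # step 407: adjust target video
    return adjusted                     # step 408: send first video

video = {"bitrate": 12.0}
predict = lambda v: max(0.0, v["bitrate"] - 8.0)  # toy model
adjust = lambda v: {**v, "bitrate": 8.0}          # toy adjustment
print(serve_video(video, predict, adjust))  # -> {'bitrate': 8.0}
```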
Based on the method shown in fig. 4, the determining device inputs the characteristic value corresponding to the performance parameter of the terminal device, the characteristic value corresponding to the performance parameter of the network for transmitting the target video, the data of the target video and other data of multiple dimensions into the preset video pause information determining model, so that pause information of the terminal device in the process of playing the target video can be obtained, and the method is comprehensive and accurate.
All the schemes in the above embodiments of the present application can be combined without contradiction.
In the embodiment of the present application, the apparatus for determining video pause information may be divided into functional modules or functional units according to the above method examples; for example, each functional module or unit may correspond to one function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in hardware or as a software functional module or unit. The division of modules or units in the embodiment of the present application is schematic and is only a logical function division; other divisions are possible in actual implementation.
In the case of dividing each functional module according to each function, fig. 5 shows a schematic structural diagram of a determining device 50, where the determining device 50 may be a server or a chip applied to the server, and the determining device may be configured to perform the functions of the determining device in the above-mentioned embodiments. The determination means 50 shown in fig. 5 may include: a communication unit 502 and a processing unit 501.
The communication unit 502 is configured to receive first indication information from a terminal device, where the first indication information is used to obtain a target video, and the first indication information includes a feature value corresponding to a performance parameter of the terminal device.
The communication unit 502 is further configured to obtain a feature value corresponding to a performance parameter of a target network and data of a target video, where the target network is a network for transmitting the data of the target video.
The processing unit 501 is configured to input a feature value corresponding to a performance parameter of the terminal device, a feature value corresponding to a performance parameter of the target network, and data of the target video into a preset video pause information determination model to obtain pause information of the target video, where the pause information of the target video is pause information of the target video played by the terminal device determined by the preset video pause information determination model.
For the specific implementation of the determining apparatus 50, refer to the behavior and functions of the determining apparatus in the method for determining video pause information shown in fig. 3 or fig. 4.
In one possible design, the determining apparatus 50 shown in fig. 5 may further include a storage unit 503. The memory unit 503 is used for storing program codes and instructions.
In one possible design, the performance parameters of the terminal device include one or more of a model of the terminal device, a Random Access Memory (RAM) utilization rate of the terminal device, a Central Processing Unit (CPU) utilization rate of the terminal device, an Operating System (OS) version parameter of the terminal device, a baseband version parameter of the terminal device, a kernel version parameter of the terminal device, and an internal version parameter of the terminal device; the performance parameters of the target network include: one or more of a signal to interference plus noise ratio, SINR, of the target network, a reference signal received power, RSRP, of the target network, a transmission rate of the target network; the data of the target video includes: the network protocol IP address of the target video, the frame rate and the code rate of the target video, the time length of the target video and the size of the target video.
In one possible design, the pause information of the target video includes the pause frequency of the terminal device playing the target video and/or the pause time of the terminal device playing the target video. The processing unit 501 is further configured to adjust data of the target video when the pause frequency of the target video exceeds a first preset value and/or the pause time of the target video exceeds a second preset value, so that the pause information of the target video meets a preset threshold.
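The threshold check performed by the processing unit 501 can be sketched as follows. The concrete threshold values and the 25% bit-rate reduction step are invented for illustration; the application only states that the data of the target video is adjusted until the pause information meets a preset threshold.

```python
# Sketch of the adjustment rule above: when the predicted pause frequency or
# pause time exceeds its preset value, the video data is adjusted (here, by
# lowering the bit rate). Thresholds and the reduction factor are assumptions.


def maybe_adjust_video(pause_freq, pause_time_s, video,
                       freq_threshold=3, time_threshold_s=5.0):
    """Return adjusted video data if stalling exceeds either preset value."""
    if pause_freq > freq_threshold or pause_time_s > time_threshold_s:
        adjusted = dict(video)
        adjusted["bitrate_kbps"] = int(video["bitrate_kbps"] * 0.75)
        return adjusted
    return video


original = {"bitrate_kbps": 2500, "frame_rate": 30}
print(maybe_adjust_video(5, 2.0, original)["bitrate_kbps"])  # 1875
print(maybe_adjust_video(1, 2.0, original)["bitrate_kbps"])  # 2500
```

A real implementation would re-run the pause prediction on the adjusted data and repeat until the preset threshold is met, then send the adjusted (first) video to the terminal device as described next.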
In a possible design, the communication unit 502 is further configured to send data of a first video to the terminal device, where the first video is an adjusted target video.
In yet another possible implementation, the processing unit 501 in fig. 5 may be replaced by a processor that integrates the functions of the processing unit 501, and the communication unit 502 in fig. 5 may be replaced by a transceiver or a transceiver unit that integrates the functions of the communication unit 502.
Further, when the processing unit 501 is replaced by a processor and the communication unit 502 is replaced by a transceiver or a transceiver unit, the determining apparatus 50 in the embodiment of the present application may be the communication device shown in fig. 3.
Fig. 6 is a block diagram of a communication system according to an embodiment of the present application. As shown in fig. 6, the system may include a terminal device 601, a determining apparatus 602, and the like.
The terminal device 601 may be configured to perform the steps performed by the terminal device in fig. 4; for example, the terminal device 601 may perform step 402. The determining apparatus 602 has the functions of the determining apparatus 50 shown in fig. 5.
Specifically, for the implementation of the terminal device 601, refer to the execution process of the terminal device in the method embodiment of fig. 4, and for the implementation of the determining apparatus 602, refer to the execution process of the network device in the method embodiments of fig. 3 and fig. 4.
Based on the system shown in fig. 6, the determining apparatus inputs data of multiple dimensions, including the feature value corresponding to the performance parameter of the terminal device, the feature value corresponding to the performance parameter of the network transmitting the target video, and the data of the target video, into the preset video pause information determination model to obtain the pause information of the terminal device playing the target video, so that the obtained video pause information is comprehensive and accurate.
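The end-to-end flow summarized above can be sketched in a few lines: the three feature groups are concatenated and passed to a pre-trained model that returns the pause information. The application treats the preset video pause information determination model as already trained; the fixed heuristic below is only an invented stand-in so the flow is runnable, and its weights carry no meaning.

```python
# Sketch of the system flow in fig. 6: features in, pause information out.
# `toy_model` is a placeholder for the preset, pre-trained model.


def predict_pause_info(model, terminal_feats, network_feats, video_feats):
    """Feed concatenated features to the model; return pause information."""
    x = terminal_feats + network_feats + video_feats
    pause_freq, pause_time_s = model(x)
    return {"pause_frequency": pause_freq, "pause_time_s": pause_time_s}


def toy_model(x):
    # fixed heuristic standing in for the trained determination model
    score = sum(x)
    freq = 3 if score > 100 else 0
    return freq, 1.5 * freq


info = predict_pause_info(toy_model, [0.6, 0.4], [18.0, 42.0], [30, 60])
print(info)  # {'pause_frequency': 3, 'pause_time_s': 4.5}
```

The returned pause frequency and pause time are then compared against the first and second preset values described earlier to decide whether the video data needs adjustment.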
The embodiment of the application also provides a computer-readable storage medium. All or part of the processes in the above method embodiments may be performed by relevant hardware instructed by a computer program; the computer program may be stored in the computer-readable storage medium and, when executed, may include the processes in the above method embodiments. The computer-readable storage medium may be an internal storage unit of the determining apparatus (including the data sending end and/or the data receiving end) in any of the foregoing embodiments, for example, a hard disk or a memory of the determining apparatus. The computer-readable storage medium may also be an external storage device of the terminal device, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card provided on the terminal device. Further, the computer-readable storage medium may include both an internal storage unit and an external storage device of the determining apparatus. The computer-readable storage medium is used to store the computer program and other programs and data required by the determining apparatus, and may also be used to temporarily store data that has been output or is to be output.
It should be noted that the terms "first" and "second" and the like in the description, claims and drawings of the present application are used for distinguishing different objects and not for describing a particular order. Furthermore, the terms "comprising" and "having," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
It should be understood that, in the present application, "at least one" means one or more, "a plurality" means two or more, and "at least two" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: only A, only B, or both A and B, where A and B may be singular or plural. The character "/" generally indicates that the contextual objects are in an "or" relationship. "At least one of the following" or similar expressions refers to any combination of the listed items, including any combination of a single item or plural items. For example, "at least one of a, b, or c" may represent: a; b; c; "a and b"; "a and c"; "b and c"; or "a and b and c", where a, b, and c may each be singular or plural.
Through the description of the above embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above functions may be distributed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical functional division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the part of the technical solutions of the embodiments of the present application that in essence contributes to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A method for determining video pause information, comprising:
receiving first indication information from terminal equipment, wherein the first indication information is used for acquiring a target video and comprises a characteristic value corresponding to a performance parameter of the terminal equipment;
acquiring a characteristic value corresponding to a performance parameter of a target network and data of the target video, wherein the target network is a network for transmitting the data of the target video;
inputting the characteristic value corresponding to the performance parameter of the terminal equipment, the characteristic value corresponding to the performance parameter of the target network and the data of the target video into a preset video pause information determination model to obtain pause information of the target video, wherein the pause information of the target video is the pause information of the target video played by the terminal equipment determined by the preset video pause information determination model.
2. The method for determining video pause information according to claim 1, wherein:
the performance parameters of the terminal equipment comprise one or more of the model of the terminal equipment, the utilization rate of a Random Access Memory (RAM) of the terminal equipment, the utilization rate of a Central Processing Unit (CPU) of the terminal equipment, the version parameter of an Operating System (OS) of the terminal equipment, the baseband version parameter of the terminal equipment, the kernel version parameter of the terminal equipment and the internal version parameter of the terminal equipment;
the performance parameters of the target network include: one or more of a signal to interference plus noise ratio, SINR, of the target network, a reference signal received power, RSRP, of the target network, a transmission rate of the target network;
the data of the target video comprises: the Internet Protocol (IP) address of the target video, the frame rate and the code rate of the target video, the duration of the target video, and the size of the target video.
3. The method for determining video pause information according to claim 1 or 2, wherein the pause information of the target video includes a pause frequency of the terminal device playing the target video and/or a pause time of the terminal device playing the target video, and the method further comprises:
when the pause frequency of the target video exceeds a first preset value and/or the pause time of the target video exceeds a second preset value, adjusting the data of the target video so that the pause information of the target video meets a preset threshold.
4. The method for determining video pause information according to claim 3, wherein the method further comprises:
and sending data of a first video to the terminal equipment, wherein the first video is the adjusted target video.
5. An apparatus for determining video pause information, the apparatus comprising: a communication unit and a processing unit;
the communication unit is used for receiving first indication information from terminal equipment, wherein the first indication information is used for acquiring a target video and comprises a characteristic value corresponding to a performance parameter of the terminal equipment;
the communication unit is further configured to acquire a feature value corresponding to a performance parameter of a target network and data of the target video, where the target network is a network for transmitting the data of the target video;
the processing unit is configured to input a characteristic value corresponding to a performance parameter of the terminal device, a characteristic value corresponding to a performance parameter of the target network, and data of the target video into a preset video pause information determination model to obtain pause information of the target video, where the pause information of the target video is pause information of the target video played by the terminal device determined by the preset video pause information determination model.
6. The apparatus for determining video pause information according to claim 5, wherein:
the performance parameters of the terminal equipment comprise one or more of the model of the terminal equipment, the RAM utilization rate of the terminal equipment, the CPU utilization rate of the terminal equipment, the OS version parameter of the terminal equipment, the baseband version parameter of the terminal equipment, the kernel version parameter of the terminal equipment and the internal version parameter of the terminal equipment;
the performance parameters of the target network include: one or more of a SINR of the target network, a RSRP of the target network, a transmission rate of the target network;
the data of the target video comprises: the IP address of the target video, the frame rate and the code rate of the target video, the time length of the target video and the size of the target video.
7. The apparatus for determining video pause information according to claim 5, wherein the pause information of the target video comprises a pause frequency of the terminal device playing the target video and/or a pause time of the terminal device playing the target video;
the processing unit is further configured to adjust data of the target video when the pause frequency of the target video exceeds a first preset value and/or the pause time of the target video exceeds a second preset value, so that the pause information of the target video meets a preset threshold.
8. The apparatus according to claim 7, wherein the communication unit is further configured to send data of a first video to the terminal device, where the first video is the adjusted target video.
9. A computer-readable storage medium having stored therein instructions which, when executed, implement the method of any one of claims 1 to 4.
10. A chip comprising at least one processor and a communication interface, the communication interface being coupled to the at least one processor, the at least one processor being configured to execute a computer program or instructions to implement the method of any one of claims 1 to 4.
CN202010490059.9A 2020-06-02 2020-06-02 Method and device for determining video blockage information Pending CN111683273A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010490059.9A CN111683273A (en) 2020-06-02 2020-06-02 Method and device for determining video blockage information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010490059.9A CN111683273A (en) 2020-06-02 2020-06-02 Method and device for determining video blockage information

Publications (1)

Publication Number Publication Date
CN111683273A true CN111683273A (en) 2020-09-18

Family

ID=72453720

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010490059.9A Pending CN111683273A (en) 2020-06-02 2020-06-02 Method and device for determining video blockage information

Country Status (1)

Country Link
CN (1) CN111683273A (en)

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2334049A2 (en) * 2009-12-14 2011-06-15 QNX Software Systems GmbH & Co. KG Synchronization of video presentation by video cadence modification
CN103517143A (en) * 2013-08-29 2014-01-15 小米科技有限责任公司 Method, device and terminal device for playing network video
CN108319974A (en) * 2018-01-22 2018-07-24 腾讯科技(深圳)有限公司 Data processing method, device, storage medium and electronic device
CN108509457A (en) * 2017-02-28 2018-09-07 阿里巴巴集团控股有限公司 A kind of recommendation method and apparatus of video data
CN108650684A (en) * 2018-02-12 2018-10-12 中国联合网络通信集团有限公司 A kind of correlation rule determines method and device
CN108897834A (en) * 2018-06-22 2018-11-27 招商信诺人寿保险有限公司 Data processing and method for digging
CN109040801A (en) * 2018-07-19 2018-12-18 北京达佳互联信息技术有限公司 Media code rate by utilizing adaptive approach, device, computer equipment and storage medium
CN109451300A (en) * 2018-11-12 2019-03-08 中国联合网络通信集团有限公司 The determination method and apparatus of video quality score
CN109493886A (en) * 2018-12-13 2019-03-19 西安电子科技大学 Speech-emotion recognition method based on feature selecting and optimization
CN109598534A (en) * 2018-10-25 2019-04-09 北京三快在线科技有限公司 Information determines method and device, electronic equipment and storage medium
CN109858970A (en) * 2019-02-02 2019-06-07 中国银行股份有限公司 A kind of user's behavior prediction method, apparatus and storage medium
CN109921941A (en) * 2019-03-18 2019-06-21 腾讯科技(深圳)有限公司 Network servicequality evaluates and optimizes method, apparatus, medium and electronic equipment
CN110225417A * 2019-05-09 2019-09-10 网宿科技股份有限公司 Data processing method and server, and method and server for detecting stutter
CN110335058A (en) * 2019-04-30 2019-10-15 中国联合网络通信集团有限公司 A kind of sample generating method and device of user satisfaction prediction model
CN110363346A (en) * 2019-07-12 2019-10-22 腾讯科技(北京)有限公司 Clicking rate prediction technique, the training method of prediction model, device and equipment
CN110458685A (en) * 2019-06-27 2019-11-15 上海淇馥信息技术有限公司 Based on the pseudo- risk-taking method, apparatus of machine learning Rating Model identification, electronic equipment
CN110765923A (en) * 2019-10-18 2020-02-07 腾讯科技(深圳)有限公司 Face living body detection method, device, equipment and storage medium
CN111158546A (en) * 2019-12-27 2020-05-15 北京奇艺世纪科技有限公司 Media information display method and device, storage medium and electronic device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LI Yanbing: "Research on VoLTE Video Perception Optimization Method", Telecommunications Technology (《电信技术》) *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022052566A1 (en) * 2020-09-08 2022-03-17 北京金山云网络技术有限公司 Video bitrate adjustment method and apparatus, and electronic device and machine-readable storage medium
CN112383791A (en) * 2020-11-12 2021-02-19 咪咕视讯科技有限公司 Media data processing method and device, electronic equipment and storage medium
CN113262467A (en) * 2021-05-19 2021-08-17 北京小米移动软件有限公司 Application control method, device and storage medium
CN113395512A (en) * 2021-05-27 2021-09-14 北京达佳互联信息技术有限公司 Stuck detection method and device, stuck detection server and storage medium
CN113395512B (en) * 2021-05-27 2023-02-28 北京达佳互联信息技术有限公司 Stuck detection method and device, stuck detection server and storage medium
CN114025211A (en) * 2021-10-27 2022-02-08 福建野小兽健康科技有限公司 Video issuing method and system adaptive to user equipment
CN114189700A (en) * 2021-11-23 2022-03-15 广州博冠信息科技有限公司 Live broadcast card pause prompting method and device, computer equipment and storage medium
CN114666662A (en) * 2022-03-23 2022-06-24 Oppo广东移动通信有限公司 Video blockage optimization method and device, terminal equipment and storage medium
CN116347488A (en) * 2023-02-21 2023-06-27 荣耀终端有限公司 Network blocking processing method, device and storage medium
CN116347488B (en) * 2023-02-21 2023-10-20 荣耀终端有限公司 Network blocking processing method, device and storage medium

Similar Documents

Publication Publication Date Title
CN111683273A (en) Method and device for determining video blockage information
JP6179907B2 (en) Method and apparatus for monitoring media presentation
CN106331739B (en) Live broadcasting method, equipment, server, system and live state monitoring method
US10305613B2 (en) Method and system for detecting image delay
US9002979B2 (en) Sports timing system (STS) event and participant announcement communication system (EPACS) and method
CN104618195B (en) Bandwidth estimation method and apparatus
CN104935955B (en) A kind of methods, devices and systems transmitting live video stream
CN108040258B (en) Encoding and decoding method, device and system
CN109286813B (en) Video communication quality detection method and device
US11540028B2 (en) Information presenting method, terminal device, server and system
US10652209B2 (en) Router address type identification method and apparatus
US20220311692A1 (en) Methods and apparatus to monitor media in a direct media network
CN107832142B (en) Resource allocation method and equipment for application program
WO2019091191A1 (en) Data processing method and apparatus
CN112738553A (en) Self-adaptive cloud rendering system and method based on network communication quality
CN112738538B (en) Live broadcasting room on-hook behavior detection method and device, electronic equipment and computer readable storage medium
EP3310048A1 (en) Video bit rate identification method and device
WO2023045434A1 (en) Access detection method, system, and apparatus
CN109698932B (en) Data transmission method, camera and electronic equipment
CN112235592B (en) Live broadcast method, live broadcast processing method, device and computer equipment
JP7431514B2 (en) Method and system for measuring quality of video call service in real time
CN113014619A (en) Construction site data monitoring method and device, electronic device and storage medium
JP6838451B2 (en) Congestion information transmission terminal device, congestion status evaluation system, congestion status evaluation method, and congestion information transmission program
CN111526381A (en) Method and device for optimizing live broadcast resources and electronic equipment
JP6657834B2 (en) Communication system, communication analysis method, and storage server

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200918)