CN114338922A - Video playing method and video playing device - Google Patents

Publication number: CN114338922A (application CN202210048463.XA; granted as CN114338922B)
Authority: CN (China)
Prior art keywords: video, video data, network, terminal device
Inventor: 姚东强 (Yao Dongqiang)
Current assignee: Honor Device Co Ltd (application filed by Honor Device Co Ltd)
Other languages: Chinese (zh)
Legal status: Granted; Active (the legal status is an assumption and is not a legal conclusion)
Classification (Landscapes): Telephone Function

Abstract

The application provides a video playing method and a video playing apparatus, which help provide continuous video service during the ringing stage of a call (for either the calling or the called party) when a single radio voice call continuity (SRVCC) handover occurs, improving the user experience. The method includes the following steps: a terminal device plays, based on video data from a network device, a video generated from the video data online on a call interface using a first network system, where the video includes a video color vibration or a video color ring; when the first network system needs to be switched to a second network system, the terminal device acquires candidate video data, where the candidate video data includes the video data cached in the terminal device or video data pre-stored in a memory of the terminal device, and the second network system is a network system that does not support online video playing; and the terminal device plays a candidate video generated from the candidate video data on the call interface.

Description

Video playing method and video playing device
Technical Field
The present application relates to the field of communications technologies, and in particular, to a video playing method and a video playing apparatus.
Background
Currently, operators offer online video color vibration, video color ring, and audio color ring services. The video color vibration service requires that, when the terminal device of the called user (the called terminal) receives an incoming call from the terminal device of the calling user (the calling terminal) while in an idle state, the called terminal support displaying and playing the video color vibration during the ringing stage of audio and video calls according to the video color vibration media negotiation requirements, and support restoring the call between the calling and called users as required. The video color ring service requires that, under the same idle-state condition, the calling terminal support displaying and playing the video color ring during the ringing stage according to the video color ring media negotiation requirements, and support restoring the call as required.
Generally, after the called user subscribes to the video color vibration service or the video color ring service, if a single radio voice call continuity (SRVCC) handover occurs at either terminal (the calling terminal or the called terminal) during the ringing stage, that is, if the terminal device needs to switch from a fourth generation mobile communication technology (4G) or fifth generation mobile communication technology (5G) network system to a second generation (2G) or third generation (3G) network system, the video can no longer be played: the calling terminal stops playing the video color ring and plays the audio color ring instead, and the called terminal stops playing the video color vibration and plays the incoming call ringtone instead.
Therefore, in the case of an SRVCC handover, the terminal device cannot provide continuous video service for the user, which may degrade the user experience.
Disclosure of Invention
The embodiments of the present application provide a video playing method and a video playing apparatus, which help provide continuous video service during the ringing stage of a call (for either the calling or the called party) in the case of an SRVCC handover, improving the user experience.
In a first aspect, a video playing method is provided, where the method includes: a terminal device plays, based on video data from a network device, a video generated from the video data online on a call interface using a first network system, where the video includes a video color vibration or a video color ring; when it is determined that the first network system needs to be switched to a second network system, the terminal device acquires candidate video data, where the candidate video data includes the video data cached in the terminal device or video data pre-stored in a memory of the terminal device, and the second network system is a network system that does not support online video playing; and the terminal device plays a candidate video generated from the candidate video data on the call interface.
In this application, if the terminal device is a calling terminal, during the ringing stage it may play the video color ring on the call interface using the first network system based on the video data; if the terminal device is a called terminal, during the ringing stage it may play the video color vibration on the incoming call interface using the first network system based on the video data.
When the terminal device determines that the first network system needs to be switched to the second network system, it may no longer be able to play the video (video color vibration or video color ring) online normally and would otherwise fall back to playing audio. In this case, with the video playing method provided in this application, the terminal device can play a video generated from the video data cached in the terminal device, or from video data pre-stored in its memory. This avoids the disruption caused by the call interface switching from video to audio, provides the user with continuous video service, and improves the user experience.
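The fallback flow described above can be sketched as follows. This is an illustrative model only: the class and method names are assumptions, not taken from the patent. The player caches video data while the first network system (e.g. 4G/5G) streams it, then falls back to the cache, or to a pre-stored local video, when an SRVCC handover occurs.

```python
from typing import Optional

class VideoRingtonePlayer:
    """Illustrative sketch of the claimed fallback flow; all names here
    are assumptions for the sake of example."""

    def __init__(self, prestored_video: Optional[bytes] = None):
        self.buffer = bytearray()               # video data cached while streaming
        self.prestored_video = prestored_video  # video pre-stored in local memory

    def play_online_chunk(self, chunk: bytes) -> bytes:
        # While the first network system (e.g. 4G/5G) is available, render
        # the chunk on the call interface and cache it for possible fallback.
        self.buffer.extend(chunk)
        return chunk

    def on_srvcc_handover(self) -> Optional[bytes]:
        # The second network system (2G/3G) cannot stream video online, so
        # return candidate video data instead: the cached data if any,
        # otherwise a pre-stored local video (None if neither exists).
        if self.buffer:
            return bytes(self.buffer)
        return self.prestored_video
```

Usage follows the order of the claim: play online chunks during ringing, then switch to the candidate video on handover.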
With reference to the first aspect, in certain implementations of the first aspect, the method further includes: the terminal equipment determines that the first network system needs to be switched to the second network system.
With reference to the first aspect, in some implementations of the first aspect, the determining, by the terminal device, that switching from the first network system to the second network system is required includes: when the current network of the terminal device does not support the first network system, the terminal device determines that switching from the first network system to the second network system is required. This helps improve the continuity of the terminal device's voice call.
With reference to the first aspect, in some implementations of the first aspect, the determining, by the terminal device, that switching from the first network system to the second network system is required includes: when the network quality of the terminal device under the first network system is lower than a preset threshold, the terminal device determines that switching from the first network system to the second network system is required. This helps improve the network service quality of the terminal device.
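A minimal sketch of the two handover triggers described in these implementations: the current network no longer supports the first network system, or its quality falls below a preset threshold. The function name and the normalized quality scale are assumptions for illustration, not from the patent.

```python
def needs_srvcc_handover(first_system_supported: bool,
                         network_quality: float,
                         quality_threshold: float = 0.3) -> bool:
    # Switch to the second network system (2G/3G) if the first system
    # (4G/5G) is unavailable, or if its measured quality (assumed here to
    # be normalized to 0..1) drops below the preset threshold.
    return (not first_system_supported) or (network_quality < quality_threshold)
```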
With reference to the first aspect, in certain implementations of the first aspect, the candidate video data includes the video data buffered in the terminal device. Before the terminal device acquires the candidate video data, the method further includes: the terminal device buffers the video data into a buffer. Acquiring the candidate video data includes: the terminal device acquires the candidate video data from the buffer.
In this application, the terminal device can cache the video data of the video (video color vibration or video color ring) being played online on the call interface, so that usable video data is available when the network system needs to be switched.
With reference to the first aspect, in certain implementations of the first aspect, the candidate video data includes video data pre-stored in a memory of the terminal device. Acquiring the candidate video data includes: the terminal device acquires the candidate video data from the memory.
In this application, the terminal device can acquire pre-stored local video data from the memory, which helps provide usable video data for the terminal device when the network system needs to be switched.
With reference to the first aspect, in some implementations of the first aspect, the first network system is a 4G or 5G network system, and the second network system is a 2G or 3G network system.
In this application, the first network system includes network systems that support the terminal device playing the video color vibration or video color ring online. The second network system includes network systems that do not support the terminal device playing the video color vibration or video color ring online.
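As a small illustration of this capability split (the set and function names are assumptions, not from the patent), the distinction between the two network systems might be expressed as:

```python
# First network system candidates: support online video play.
ONLINE_VIDEO_CAPABLE = {"4G", "5G"}

def supports_online_video(network_system: str) -> bool:
    # 2G/3G (the second network system) cannot play video color vibration
    # or video color ring online; 4G/5G (the first network system) can.
    return network_system in ONLINE_VIDEO_CAPABLE
```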
In a second aspect, a video playing apparatus is provided, including: a processing module, configured to play, based on video data from a network device, a video generated from the video data online on a call interface using a first network system, where the video includes a video color vibration or a video color ring; and an acquisition module, configured to acquire candidate video data when the first network system needs to be switched to a second network system, where the candidate video data includes the cached video data or video data pre-stored in a memory, and the second network system is a network system that does not support online video playing. The processing module is further configured to play a candidate video generated from the candidate video data on the call interface.
With reference to the second aspect, in some implementations of the second aspect, the processing module is configured to determine that the first network system needs to be switched to the second network system.
With reference to the second aspect, in some implementations of the second aspect, the processing module is configured to determine, when the current network does not support the first network system, that the first network system needs to be switched to the second network system.
With reference to the second aspect, in some implementations of the second aspect, the processing module is configured to determine, when the network quality under the first network system is lower than a preset threshold, that the first network system needs to be switched to the second network system.
With reference to the second aspect, in some implementations of the second aspect, the candidate video data includes the video data buffered in the terminal device. The processing module is configured to buffer the video data into a buffer, and the acquisition module is configured to acquire the candidate video data from the buffer.
With reference to the second aspect, in some implementations of the second aspect, the candidate video data includes video data pre-stored in a memory of the terminal device. The acquisition module is configured to acquire the candidate video data from the memory.
With reference to the second aspect, in some implementations of the second aspect, the first network system is a 4G or 5G network system, and the second network system is a 2G or 3G network system.
In a third aspect, there is provided another video playing apparatus, including a processor, coupled to a memory, and configured to execute instructions in the memory to implement the method in any one of the possible implementations of the first aspect. Optionally, the apparatus further comprises a memory. Optionally, the apparatus further comprises a communication interface, the processor being coupled to the communication interface.
In one implementation, the video playing apparatus is a terminal device. When the video playback apparatus is a terminal device, the communication interface may be a transceiver, or an input/output interface.
In another implementation manner, the video playing apparatus is a chip configured in the terminal device. When the video playing apparatus is a chip configured in a terminal device, the communication interface may be an input/output interface.
In a fourth aspect, a processor is provided, comprising: input circuit, output circuit and processing circuit. The processing circuit is configured to receive a signal via the input circuit and transmit a signal via the output circuit, so that the processor performs the method of any one of the possible implementations of the first aspect.
In a specific implementation process, the processor may be a chip, the input circuit may be an input pin, the output circuit may be an output pin, and the processing circuit may be a transistor, a gate circuit, a flip-flop, various logic circuits, or the like. The input signal received by the input circuit may be received and input by, for example and without limitation, a receiver; the signal output by the output circuit may be output to and transmitted by, for example and without limitation, a transmitter; and the input circuit and the output circuit may be the same circuit, serving as the input circuit and the output circuit at different times. The embodiments of the present application do not limit the specific implementation of the processor and the various circuits.
In a fifth aspect, a processing apparatus is provided that includes a processor and a memory. The processor is configured to read instructions stored in the memory, and may receive signals via the receiver and transmit signals via the transmitter to perform the method of any one of the possible implementations of the first aspect.
Optionally, there are one or more processors and one or more memories.
Alternatively, the memory may be integrated with the processor, or provided separately from the processor.
In a specific implementation process, the memory may be a non-transitory (non-transitory) memory, such as a Read Only Memory (ROM), which may be integrated on the same chip as the processor, or may be separately disposed on different chips, and the type of the memory and the arrangement manner of the memory and the processor are not limited in this application.
It will be appreciated that a related data interaction process, for example sending indication information, may be a process of the processor outputting the indication information, and receiving capability information may be a process of the processor receiving the input capability information. Specifically, the data output by the processor may be output to a transmitter, and the input data received by the processor may come from a receiver. The transmitter and receiver may be collectively referred to as a transceiver.
The processing device in the fifth aspect may be a chip, the processor may be implemented by hardware or software, and when implemented by hardware, the processor may be a logic circuit, an integrated circuit, or the like; when implemented in software, the processor may be a general-purpose processor implemented by reading software code stored in a memory, which may be integrated with the processor, located external to the processor, or stand-alone.
In a sixth aspect, there is provided a computer program product comprising: computer program code which, when executed, causes a computer to perform the method of any of the possible implementations of the first aspect described above.
In a seventh aspect, a computer-readable storage medium is provided, which stores a computer program that, when executed, causes a computer to perform the method of any one of the possible implementations of the first aspect.
Drawings
FIG. 1 is a schematic diagram of a scenario provided by an embodiment of the present application;
fig. 2 is a schematic structural diagram of a terminal device to which the embodiment of the present application is applicable;
fig. 3 is a block diagram of a software structure of a terminal device to which the embodiment of the present application is applicable;
fig. 4 is a schematic flow chart of a video playing method provided in an embodiment of the present application;
fig. 5 is a schematic flow chart of another video playing method provided by an embodiment of the present application;
fig. 6 is a schematic block diagram of a video playing apparatus provided in an embodiment of the present application;
fig. 7 is a schematic block diagram of another video playing apparatus provided in the embodiment of the present application.
Detailed Description
The technical solution in the present application will be described below with reference to the accompanying drawings.
Before describing the video playing method and the video playing apparatus provided in the embodiments of the present application, the following description is made.
First, in the embodiments shown below, terms and English abbreviations such as video color vibration and video color ring are examples given for convenience of description and should not limit the present application in any way. This application does not exclude the possibility that other terms may be defined in existing or future protocols to carry out the same or similar functions.
Second, the terms "first" and "second" and the various numerals in the embodiments shown below are merely for convenience of description and are not intended to limit the scope of the embodiments of the present application; for example, they distinguish different videos or different request messages.
Third, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone, where A and B may each be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following" or similar expressions refer to any combination of the listed items, including any combination of single items or plural items. For example, "at least one of a, b, and c" may represent: a; b; c; a and b; a and c; b and c; or a, b and c; where each of a, b and c may be singular or plural.
The terminal device in the embodiments of the present application may be a handheld device, a vehicle-mounted device, or another device with a wireless connection function, and may also be referred to as a terminal, user equipment (UE), a mobile station (MS), a mobile terminal (MT), and so on. Currently, some examples of terminals are: a mobile phone, a tablet computer (Pad), a smart television, a notebook computer, a handheld computer, a mobile internet device (MID), a virtual reality (VR) device, an augmented reality (AR) device, a wireless terminal in industrial control, a wireless terminal in self driving, a wireless terminal in remote surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, a cellular phone, a cordless phone, a session initiation protocol (SIP) phone, a wireless local loop (WLL) station, a personal digital assistant (PDA), and the like. The embodiments of the present application do not limit the specific technology or specific device form adopted by the terminal device.
By way of example and not limitation, in the embodiments of the present application, the terminal device may also be a wearable device. A wearable device, also called a wearable smart device, is a general term for devices that can be worn daily, such as glasses, gloves, watches, clothing and shoes, developed by applying wearable technology to intelligent design. A wearable device is a portable device worn directly on the body or integrated into the user's clothing or accessories. It is not only a hardware device, but also realizes powerful functions through software support, data interaction and cloud interaction. In a broad sense, wearable smart devices include full-featured, large-sized devices that can implement all or part of their functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on only one type of application function and need to be used together with other devices such as smartphones, for example, smart bracelets for vital-sign monitoring and smart jewelry.
It should be understood that in the embodiment of the present application, the terminal device may be an apparatus for implementing a function of the terminal device, or may be an apparatus capable of supporting the terminal device to implement the function, such as a chip system, and the apparatus may be installed in the terminal. In the embodiment of the present application, the chip system may be composed of a chip, and may also include a chip and other discrete devices.
The terminal device in the embodiment of the present application may also be referred to as: user Equipment (UE), Mobile Station (MS), Mobile Terminal (MT), access terminal, subscriber unit, subscriber station, mobile station, remote terminal, mobile device, user terminal, wireless communication device, user agent, or user device, etc.
The network device in the embodiment of the present application may be any device having a wireless transceiving function, and the network device is deployed with an internet protocol multimedia subsystem (IMS).
The IMS network uses the session initiation protocol (SIP) as its sole session control protocol and serves as the multimedia control/call control platform in the packet-switched (PS) domain; IMS enables the PS domain to take over part of the functions of the circuit-switched (CS) domain and to support both conversational and non-conversational multimedia services. Illustratively, the IMS network may include at least one of an access session border controller (A-SBC), a proxy call session control function (P-CSCF), an interrogating call session control function (I-CSCF), a serving call session control function (S-CSCF), or a multimedia telephony application server (MMTEL AS).
Optionally, the network device of the embodiment of the present application may further include a core network device and an access network device, where the core network device and the access network device may be different physical devices independent from the network device, or the network device of the embodiment of the present application integrates a part of functions of the core network device and a part of functions of the access network device.
It should be understood that in the embodiment of the present application, the network device may be an apparatus for implementing a function of the network device, and may also be an apparatus capable of supporting the network device to implement the function, for example, a system on chip, and the apparatus may be installed in the network device.
It should also be understood that the network device and the terminal device in the embodiments of the present application may be deployed on land, including indoors or outdoors, hand-held or vehicle-mounted; or deployed on the surface; or on aerial airplanes, balloons, and satellites. The application scenarios of the network device and the terminal device are not limited in the embodiments of the present application.
For ease of understanding, the following briefly introduces terms referred to in the embodiments of the present application.
1. Video color vibration: a service that the called user subscribes to with the operator, in which the called user's terminal ringtone is replaced by a segment of video. The video color vibration function requires that the called terminal be in an idle state when receiving an incoming call from the calling terminal, and that, according to the video color vibration media negotiation requirements, the called terminal support displaying and playing the video color vibration during the ringing stage of audio and video calls and support restoring the call between the calling and called parties as required.
2. Video color ring back tone: a service that the called user subscribes to with the operator, in which the ring back tone heard by the calling user is replaced by a segment of video. The video color ring function requires that, when the called terminal is in an idle state upon receiving the incoming call, the calling terminal support displaying and playing the video color ring during the ringing stage of audio and video calls according to the video color ring media negotiation requirements, and support restoring the call between the calling and called parties as required.
3. Audio color ring back tone: a service that the called user subscribes to with the operator, in which the ring back tone heard by the calling user is replaced by a segment of audio. The audio color ring function requires that, when the called terminal is in an idle state upon receiving the call from the calling terminal, the calling terminal support playing the audio color ring during the ringing stage of audio and video calls according to the audio color ring media negotiation requirements, and support restoring the call between the calling and called parties as required.
4. SRVCC handover: SRVCC mainly solves the problem of how to maintain voice service continuity when a single-radio terminal device is handed over from a 4G or 5G network system to a 2G or 3G network system, that is, how the single-radio terminal device is seamlessly switched between IMS-controlled VoIP voice and CS voice.
It should be understood that, in the embodiments of the present application, the terminal device of the calling user is referred to as the calling terminal, and the terminal device of the called user is referred to as the called terminal. These names are merely for convenience and should not limit the functions of the terminal devices themselves. Calling and called are relative: a calling terminal may also act as the called terminal of other users, and a called terminal may also act as the calling terminal of other users. This is not limited in the present application.
Taking the video color ring as an example, at present, an operator network allows a user to upload at most 10 videos per month as video color rings through a music application (APP), and to view and set play rules through the music APP. After the user subscribes to the video color ring service, at least one video color ring can be set in the operator's music APP to replace the terminal's ring back tone. The video color vibration function may follow similar setting rules.
Illustratively, if the called user has set one video color vibration for the terminal, the terminal plays that video color vibration when there is an incoming call.
For example, if the called user has set a plurality of video color vibrations for the terminal, the terminal may randomly select one of them to play when there is an incoming call.
Illustratively, if the called user has not set any video color vibration for the terminal, the terminal may play one of the default video color vibrations when there is an incoming call; the played default is a featured video changed periodically by the operator.
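The three play rules above can be sketched as a small selection function. The function and parameter names are illustrative assumptions, not from the patent.

```python
import random
from typing import List, Optional

def pick_video_color_vibration(user_set: List[str],
                               operator_defaults: List[str],
                               rng: Optional[random.Random] = None) -> str:
    # Play-rule sketch: a single user-set video is played as-is; among
    # several user-set videos one is chosen at random; with none set,
    # an operator-curated default is chosen at random.
    rng = rng or random.Random()
    if len(user_set) == 1:
        return user_set[0]
    if user_set:
        return rng.choice(user_set)
    return rng.choice(operator_defaults)
```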
Generally, after the called user subscribes to the video color vibration service or the video color ring service, if an SRVCC handover occurs at either terminal (the calling terminal or the called terminal) during the ringing stage, the terminal device cannot continue playing the video online. For example, the calling terminal cannot continue playing the video color ring and plays the audio color ring instead, so the video picture disappears, which may confuse the user. Likewise, the called terminal cannot continue playing the video color vibration and plays the ordinary incoming call ringtone instead, so that for the same incoming call the prompt is first a video color vibration and then an ordinary ringtone; the user may conclude that the subscribed video color vibration service has failed, which may also cause confusion.
Therefore, in the case of the SRVCC handover, the terminal device cannot provide a continuous video service for the user, which may reduce the user experience.
In view of this, embodiments of the present application provide a video playing method and a video playing apparatus, in which the terminal device locally caches received video data or pre-stores video data, so that the terminal device can provide continuous video service for the user when an SRVCC handover occurs, improving the user experience.
Fig. 1 is a schematic diagram of a scenario provided in an embodiment of the present application. As shown in fig. 1, the scenario shows a terminal device 11 and a network device 12.
The terminal device 11 may be a called terminal, and receives a call request sent by a calling terminal through the network device 12. Illustratively, after the user subscribes to the video color vibration service with the operator, the terminal device 11 may play the video color vibration according to a preset playing rule when receiving an incoming call message.
The terminal device 11 may also act as a calling terminal, sending a call request to a called terminal through the network device 12. For example, after the user subscribes to the video color ring service, the terminal device 11 may play the video color ring according to a preset playing rule after sending the call request.
The network device 12 may be connected to the terminal device 11 in a wireless manner, and perform audio and video media negotiation of video color vibration or video color ring to provide audio and video media data for the terminal device 11.
The terminal device in the above scenario may be fixed in position or may be mobile. The embodiments of the present application do not limit the number of network devices and terminal devices included in the scenario.
Fig. 2 is a schematic structural diagram of a terminal device to which the embodiment of the present application is applied. As shown in fig. 2, the terminal device 200 may include: a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. It is to be understood that the illustrated structure of the present embodiment does not constitute a specific limitation on the terminal device 200. In other embodiments of the present application, the terminal device 200 may include more or fewer components than shown, combine some components, split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, a Display Processing Unit (DPU), and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. In some embodiments, terminal device 200 may also include one or more processors 110. The processor may be, among other things, a neural center and a command center of the terminal device 200. The processor can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution. A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. This avoids repeated accesses, reduces the latency of the processor 110 and thus increases the efficiency of the terminal device 200.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a USB interface, etc. The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the terminal device 200, and may also be used to transmit data between the terminal device 200 and a peripheral device. And the earphone can also be used for connecting an earphone and playing audio through the earphone.
It should be understood that the interface connection relationship between the modules illustrated in the embodiment of the present application is an illustrative description, and does not limit the structure of the terminal device 200. In other embodiments of the present application, the terminal device 200 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The wireless communication function of the terminal device 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in terminal device 200 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the terminal device 200. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier, etc. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the terminal device 200, including Wireless Local Area Networks (WLAN), bluetooth, Global Navigation Satellite System (GNSS), Frequency Modulation (FM), NFC, Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of terminal device 200 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that terminal device 200 can communicate with networks and other devices via wireless communication techniques. The wireless communication technologies may include GSM, GPRS, CDMA, WCDMA, TD-SCDMA, LTE, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The terminal device 200 can implement a display function by the GPU, the display screen 194, the application processor, and the like. The application processor may include an NPU and/or a DPU. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute instructions to generate or change display information. The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can implement applications such as intelligent recognition of the terminal device 200, for example: image recognition, face recognition, speech recognition, text understanding, and the like. The DPU is also called a display sub-system (DSS), and is configured to adjust the color of the display screen 194, and the DPU may adjust the color of the display screen through a three-dimensional look-up table (3D LUT). The DPU may also perform scaling, noise reduction, contrast enhancement, backlight brightness management, HDR processing, display parameter Gamma adjustment, and the like on the picture.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, or a quantum dot light-emitting diode (QLED). In some embodiments, the terminal device 200 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The terminal device 200 may implement a photographing function through the ISP, one or more cameras 193, a video codec, a GPU, one or more display screens 194, and an application processor, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the terminal device 200. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, data files such as music, photos, videos, and the like are saved in the external memory card.
Internal memory 121 may be used to store one or more computer programs, including instructions. The processor 110 may cause the terminal device 200 to execute various functional applications, data processing, and the like by executing the above-described instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. Wherein, the storage program area can store an operating system; the storage area may also store one or more application programs (e.g., gallery, contacts, etc.), etc. The storage data area may store data (such as photos, contacts, etc.) created during use of the terminal device 200, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. In some embodiments, the processor 110 may cause the terminal device 200 to execute various functional applications and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor 110.
The internal memory 121 is used to store the LUT set, the preset mapping relationship, and the preset LUT color difference information in the embodiment of the present application. The LUT set includes all LUT elements that can be supported by the terminal device 200, and the LUT elements may also be referred to as an LUT template. The preset mapping relationship is used to represent the corresponding relationship between the plurality of picture attributes and the plurality of LUT elements, and can be shown in table two below. The preset LUT color difference information includes a color difference between each two LUT elements, and may be embodied in the form of a color difference table, for example.
The terminal device 200 may implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor, such as music playing, recording, and the like. The audio module 170 is configured to convert digital audio information into an analog audio signal for output, and also configured to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110. The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The terminal device 200 can listen to music through the speaker 170A, or listen to a handsfree call. The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the terminal device 200 receives a call or voice information, it is possible to receive the voice by bringing the receiver 170B close to the human ear. The microphone 170C, also referred to as a "mike" or a "mic", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking with the mouth close to the microphone 170C. The terminal device 200 may be provided with at least one microphone 170C. In other embodiments, the terminal device 200 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the terminal device 200 may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement directional recording functions, and so on.
The headphone interface 170D is used to connect a wired headphone. The earphone interface 170D may be the USB interface 130, may be a 3.5mm open mobile terminal platform (OMTP) standard interface, or may be a cellular telecommunications industry association (CTIA) standard interface.
The sensors 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The software system of the terminal device 200 may adopt a hierarchical architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. In the embodiment of the present application, a software structure of the terminal device 200 is exemplarily described by taking an Android (Android) system with a layered architecture as an example.
Fig. 3 is a block diagram of a software structure of a terminal device to which the embodiment of the present application is applied. The layered architecture divides the software system of the terminal device 200 into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system may be divided into an application layer (APP), an application framework layer (application framework), an Android runtime (Android runtime), and a system library and kernel layer (kernel).
The application layer may include a series of application packages, and the application layer runs the application by calling an Application Programming Interface (API) provided by the application framework layer. As shown in fig. 3, the application packages may include camera, calendar, map, phone, music, WLAN, bluetooth, video, social, gallery, navigation, short message, etc. applications.
The application framework layer provides an API and programming framework for the applications of the application layer. The application framework layer includes a number of predefined functions. As shown in FIG. 3, the application framework layer may include a window manager, a content provider, a resource manager, a notification manager, a view system, a phone manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video images, audio, calls made and received, browsing history and bookmarks, phone books, etc. The view system includes visual controls such as controls to display text, controls to display pictures, and the like.
The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide a communication function of the terminal apparatus 200. Such as management of call status (including on, off, etc.).
The resource manager provides various resources, such as localized strings, icons, pictures, layout files, video files, etc., to the application.
The notification manager enables the application to display notification information in the status bar, can be used to convey notification-type messages, can disappear automatically after a short dwell, and does not require user interaction. Such as a notification manager used to inform download completion, message alerts, etc. The notification manager may also be a notification that appears in the form of a chart or scroll bar text at the top status bar of the system, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, a text message is prompted in the status bar, a prompt tone is given, the terminal device 200 vibrates, an indicator lamp flashes, and the like.
The android runtime includes a core library and a virtual machine. The android runtime is responsible for scheduling and managing the android system. The core library comprises two parts: one part is the functions that the java language needs to call, and the other part is the core library of android. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection. The system library may include a plurality of functional modules. For example: surface managers (surface managers), media libraries (media libraries), three-dimensional graphics processing libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications. The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc. The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, composition, layer processing and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer is used for driving hardware so that the hardware works. The kernel layer at least comprises a display driver, an audio driver, a bluetooth driver, a Wi-Fi driver, and the like, which is not limited in the embodiments of the present application.
Illustratively, in the embodiment of the present application, the kernel layer employs a display driver and an audio driver to drive the display screen 194 and the speaker 170A in the terminal device 200 to implement video playing.
Fig. 4 is a schematic flow chart of a video playing method 400 provided in an embodiment of the present application. The method 400 may be applied to the scenario shown in fig. 1, and the steps of the method 400 may be performed by a terminal device 11 in the scenario, and the terminal device 11 may have the architecture shown in fig. 2 and/or fig. 3, which is not limited in this embodiment. The method 400 includes the steps of:
S401, based on video data from network equipment, playing a video generated by the video data online on a call interface by using a first network system, wherein the video comprises video color vibration or video color ring;
S402, under the condition that the first network system needs to be switched to a second network system, acquiring candidate video data;
S403, playing the candidate video generated by the candidate video data on the call interface.
In the embodiment of the application, the call interface may include an outgoing call interface and an incoming call interface. If the terminal equipment is a calling terminal, the terminal equipment can play a video color ring on the outgoing call interface by using the first network system based on the video data in the call ringing stage; if the terminal equipment is a called terminal, the terminal equipment can play a video color vibration on the incoming call interface by using the first network system based on the video data in the call ringing stage.
The candidate video data includes video data cached in the terminal device or video data prestored in a memory of the terminal device. The second network system is a network system that does not support online playing of videos (including video color vibration or video color ring).
In this embodiment, the switching from the first network system to the second network system may be SRVCC handover. The SRVCC handover may succeed or fail, which is not limited in the embodiment of the present application.
In this embodiment of the present application, the terminal device may play a candidate video when it determines that it is necessary to switch from the first network system to the second network system. The candidate video may be the video color vibration or video color ring played in the first network system during the ringing stage of the current call, or a video that was downloaded by the terminal device before the current call and prestored in the local memory. Therefore, in the case of SRVCC handover, the user is spared the confusion of the terminal device switching from playing a video to playing an audio color ring or an incoming call ring, and a continuous video playing experience can be provided for the user, improving user experience.
Optionally, the first network system is 4G or 5G, and the second network system is 2G or 3G.
It should be understood that the first network system may also be a network system that is evolved in the future and can support video color ring or video color ring online playing, which is not limited in this application embodiment.
As an alternative embodiment, the method 400 further comprises: the terminal equipment determines that the first network system needs to be switched to the second network system.
In a possible implementation manner, the terminal device may determine that it is necessary to switch from the first network system to the second network system when the current network does not support the first network system.
Illustratively, the first network system is a 4G network system, the second network system is a 3G network system, and when a user is in a voice call process, if a current network of the terminal device cannot support the 4G network system, the terminal device needs to be switched from the 4G network system to the 3G network system.
In another possible implementation manner, the terminal device may determine that the first network system needs to be switched to the second network system when the network quality in the first network system is lower than a preset threshold.
Illustratively, the first network system is a 4G network system, the second network system is a 3G network system, and when a user moves to an area with weak 4G network signals or leaves a coverage area of the 4G network during a voice call using the 4G network system, the terminal device needs to be switched from the 4G network system to the 3G network system in order to ensure continuity of the voice call.
Optionally, the parameter for determining the network quality may include at least one of a signal to interference plus noise ratio (SINR), a Reference Signal Receiving Power (RSRP), a Reference Signal Receiving Quality (RSRQ), or a Received Signal Strength Indication (RSSI). Different parameters may correspond to different preset thresholds.
Illustratively, the first network system is a 4G network system, the second network system is a 3G network system, and the preset threshold of RSRP is -105 dBm. When the RSRP of the terminal device in the 4G network system is -115 dBm, which is lower than the RSRP threshold, the terminal device may determine that the 4G network signal is weak and that it needs to switch from the 4G network system to the 3G network system.
Illustratively, the first network system is a 4G network system, the second network system is a 3G network system, the preset threshold of SINR is 5dB, and when the SINR of the terminal device in the 4G network system is 3dB, the terminal device may determine that the 4G network signal is weak and needs to be switched from the 4G network system to the 3G network system.
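The two examples above amount to a per-parameter threshold comparison. The sketch below uses the RSRP and SINR thresholds given in the examples; the function and dictionary names are hypothetical, and a real modem would obtain these measurements through its own internal interfaces:

```python
# Illustrative threshold check: the parameter names and threshold values come
# from the examples above (-105 dBm RSRP, 5 dB SINR); the API is an assumption.
THRESHOLDS = {"RSRP": -105.0, "SINR": 5.0}

def needs_handover(measurements: dict) -> bool:
    """Return True if any measured quality parameter falls below its preset threshold."""
    return any(
        measurements[name] < limit
        for name, limit in THRESHOLDS.items()
        if name in measurements
    )

assert needs_handover({"RSRP": -115.0})   # weak 4G signal -> switch to 3G
assert not needs_handover({"SINR": 7.0})  # quality above threshold -> stay on 4G
```

Different parameters (RSRQ, RSSI, etc.) could be added to the table with their own thresholds, matching the statement that different parameters may correspond to different preset thresholds.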
As an alternative embodiment, the candidate video data includes video data buffered in the terminal device. Before the terminal device acquires the candidate video data, the method 400 further includes: and the terminal equipment buffers the video data into the buffer. The terminal equipment acquires candidate video data and comprises the following steps: and the terminal equipment acquires the candidate video data from the buffer.
In this embodiment of the present application, the terminal device may buffer the video color vibration data or the video color ring data being played in the process of playing the video color vibration or the video color ring in the first network system; that is, the candidate video data includes the video color vibration data or the video color ring data. Then, under the condition that the terminal device determines that the first network system needs to be switched to the second network system, the terminal device can acquire the cached video color vibration data or video color ring data from the buffer.
Optionally, the terminal device may circularly play, on the call interface, the video color vibration generated from the cached video color vibration data, or the video color ring generated from the cached video color ring data.
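Circular playback of the cached data can be sketched with a standard cycling iterator; the function name is hypothetical and the "frames" are placeholders for whatever units the cached video data is decoded into:

```python
# Minimal sketch of circular playback: cycle over the cached frames so that
# playback continues even when the cache is shorter than the ringing stage.
import itertools

def loop_cached_frames(frames, n):
    """Yield n frames, repeating the cached list from the start when exhausted."""
    return list(itertools.islice(itertools.cycle(frames), n))

assert loop_cached_frames(["f1", "f2"], 5) == ["f1", "f2", "f1", "f2", "f1"]
```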
As an optional embodiment, the candidate video data includes video data prestored in a memory of the terminal device. The terminal equipment acquires candidate video data and comprises the following steps: and the terminal equipment acquires the candidate video data from the memory.
In this embodiment of the application, when the terminal device determines that it is necessary to switch from the first network system to the second network system, the terminal device may obtain the pre-stored local video data from the memory; that is, the candidate video data includes the local video data.
Optionally, the terminal device may play a local video generated from the local video data on the call interface.
It should be understood that, in the above embodiments, regardless of whether the terminal device is successfully switched from the first network system to the second network system, once the terminal device triggers the handover such that the video color vibration or the video color ring cannot be played online normally, the terminal device may play the candidate video cached in the buffer or prestored in the memory. Therefore, even if the handover fails, no switch-back processing of the video color vibration or video color ring is performed, which is beneficial to improving user experience.
In the following, referring to fig. 5, a video playing method according to the present application is described by taking a terminal device as a called terminal to receive an incoming call message as an example.
Fig. 5 is a schematic flow chart of another video playing method 500 provided in the embodiment of the present application. The method 500 may be applied to the scenario shown in fig. 1, and S501 to S509 in the method 500 may be performed before the steps of the method 400, but the embodiment of the present application is not limited thereto. The method 500 includes the steps of:
S501, the network equipment sends an incoming call message to the terminal equipment. Accordingly, the terminal device receives the incoming call message.
Illustratively, the incoming call message is a request (invite) message, which is used to initiate a voice call to the called terminal. Optionally, the invite message includes an "Alert-Info: <urn:alert:service:crs>" field.
S502, the terminal device sends a confirmation message to the network device. Accordingly, the network device receives the acknowledgement message.
Illustratively, the confirmation message is a 180 (ringing) message in response to the invite message, which is used to confirm that the invite message has been received and the call is being processed.
S503, the network device sends a first negotiation request message to the terminal device, where the first negotiation request message is used to request the terminal device to perform audio and video media resource negotiation, so as to determine the audio and video media resources for subsequently playing the video color vibration. Accordingly, the terminal device receives the first negotiation request message.
Illustratively, the first negotiation request message is an update message, and the update message may include an early-media field. If the field is "sendrecv", it indicates that the network device can transmit video data normally. If the field is "inactive", it indicates that the network device may be abnormal and cannot transmit video data normally.
Illustratively, the network device may include a video color vibration (CRS) platform, and the Session Description Protocol (SDP) body of the update message may include an audio-related field, such as an "a=content:g.3gpp.crs" attribute under the audio media description, and a video-related field, such as an "a=content:g.3gpp.crs" attribute under the video media description. These fields may indicate to the terminal device that the SDP body is sent by the CRS platform and that the audio and video media resource negotiation for the video color vibration can be performed.
Optionally, audio media parameters, such as Payload Type (PT) value, port number, coding format, coding rate, etc., are also included in the SDP message. Also included in the SDP message are video media parameters such as PT value, port number, encoding format, encoding rate, etc.
It should be understood that the audio media parameters and the video media parameters in this step are media parameters that the network device can support. For example, audio PT values of 0 and 8 indicate that the payload types the network device can support are μ-law pulse code modulation (PCM) audio and A-law PCM audio.
Optionally, the SDP message further includes a quality of service (QoS) current status and a QoS desired status of the network device. In this step, the QoS current status may be "none", which indicates that the bearer is not established and video data cannot be transmitted, and the QoS desired status may be "sendrecv", which indicates that video data is expected to be transmitted normally.
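To make the fields above concrete, an illustrative SDP body of the kind such an update message might carry is sketched below, combining the "a=content:g.3gpp.crs" attributes with RFC 3312-style QoS precondition lines. All addresses, port numbers, and payload types here are invented for the example and are not taken from the patent.

```text
v=0
o=- 0 0 IN IP4 192.0.2.1
s=-
c=IN IP4 192.0.2.1
t=0 0
m=audio 49170 RTP/AVP 0 8
a=content:g.3gpp.crs
a=rtpmap:0 PCMU/8000
a=rtpmap:8 PCMA/8000
a=curr:qos local none
a=des:qos mandatory local sendrecv
m=video 49172 RTP/AVP 96
a=content:g.3gpp.crs
a=rtpmap:96 H264/90000
a=curr:qos local none
a=des:qos mandatory local sendrecv
```

Here the "a=curr:qos ... none" lines correspond to the QoS current status "none" (bearer not yet established), and the "a=des:qos ... sendrecv" lines correspond to the QoS desired status "sendrecv" described above.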
S504, the terminal device sends a first negotiation response message to the network device, and the first negotiation response message is used for confirming the negotiation of the audio and video media resources with the network device. Accordingly, the network device receives the first negotiation response message.
Illustratively, the first negotiation response message is a 200 acknowledgement (OK) response to the update message, which includes an audio-related field, e.g., an "a=content:g.3gpp.crs" attribute under the audio media description, and a video-related field, e.g., an "a=content:g.3gpp.crs" attribute under the video media description.
Optionally, audio media parameters, such as Payload Type (PT) value, port number, coding format, coding rate, etc., are also included in the SDP message. Also included in the SDP message are video media parameters such as PT value, port number, encoding format, encoding rate, etc.
It should be understood that the audio media parameters and the video media parameters in this step are the audio and video media parameters that the terminal device can support. For example, audio PT values of 0 and 3 indicate that the payload types the terminal device can support are μ-law PCM audio and global system for mobile communications (GSM) audio.
Illustratively, if the network device supports μ-law PCM audio and A-law PCM audio, the network device and the terminal device may negotiate the payload type to be the commonly supported μ-law PCM audio.
Optionally, the SDP message further includes a quality of service (QoS) current status and a QoS desired status of the network device. In this step, the QoS current state may be "none", which indicates that a bearer is not established and video data cannot be transmitted, and the QoS expected state may be "sendrecv", which indicates that the network device expects to normally transmit video data.
It should be understood that the first negotiation request message and the first negotiation response message may each include at least one PT value, at least one port number, at least one coding format, at least one coding rate, and so on, from which the network device and the terminal device may finally negotiate one PT value, one port number, one coding format, one coding rate, and so on, for playing the video color vibration.
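As a concrete illustration of this negotiation, picking one commonly supported value from the offered list can be sketched as follows (shown here for the PT value only; the same pattern applies to the port number, coding format, and coding rate). The function name and the first-match preference order are illustrative assumptions, not taken from the patent.

```python
def negotiate_payload_type(offered, supported):
    """Pick the first payload type in the offer that the answerer also
    supports, in the offerer's preference order; None means no common codec."""
    for pt in offered:
        if pt in supported:
            return pt
    return None

# From the examples above: the network offers PT 0 (mu-law PCM) and 8 (A-law
# PCM); the terminal supports PT 0 (mu-law PCM) and 3 (GSM).
network_pts = [0, 8]
terminal_pts = [0, 3]
print(negotiate_payload_type(network_pts, terminal_pts))  # 0 -> mu-law PCM
```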
S505, the network device sends a link establishment request message to the terminal device, where the link establishment request message is used to request establishment of a link that can transmit video data. Accordingly, the terminal device receives the link establishment request message.
S506, the terminal device sends a link establishment response message to the network device, where the link establishment response message is used to confirm establishment of a link that can transmit video data. Accordingly, the network device receives the link establishment response message.
Illustratively, the link that can transmit the video data is a link with a QoS class identifier (QCI) of 2 (QCI 2), and the QCI 2 link is used for carrying video traffic.
Based on the above S505 and S506, a link may be established between the network device and the terminal device to transmit the video data of the video color vibration.
S507, the terminal device sends a second negotiation request message to the network device. Accordingly, the network device receives the second negotiation request message.
Illustratively, the second negotiation request message is an update message, and an SDP message of the update message includes audio media parameters supported by the called terminal, such as PT value, port number, coding format, coding rate, and the like. The SDP message also includes video media parameters supported by the terminal device, such as PT value, port number, encoding format, encoding rate, etc.
Optionally, the SDP message further includes a QoS current status and a QoS desired status. In this step, the current QoS status may be "sendrecv," which indicates that a bearer is established and the terminal device may normally transmit video data. The QoS expected state may be "sendrecv" indicating that the terminal device expects to transmit video data normally. It should be appreciated that the bearer may be the QCI 2 link described above.
It should be understood that the second negotiation request message includes the audio and video media parameters, such as a PT value, a port number, a coding format, a coding rate, etc., that have been negotiated for playing the video color vibration.
S508, the network device sends a second negotiation response message to the terminal device. Accordingly, the terminal device receives the second negotiation response message.
Alternatively, the network device may update the QoS current status and the QoS desired status of the network device and the terminal device.
Illustratively, the second negotiation response message is a 200 OK response to the update message, and the SDP body of the 200 OK response includes the updated QoS current status and QoS desired status of the network device. In this step, the QoS current status may be "sendrecv", which indicates that the bearer is established and the network device can transmit video data normally. The QoS desired status may be "sendrecv", indicating that the network device expects to transmit video data normally.
It should be understood that the second negotiation response message includes the audio and video media parameters, such as a PT value, a port number, a coding format, a coding rate, etc., that have been negotiated for the video color vibration.
S509, the network device sends video data to the terminal device, and the video data is used for generating video color vibration. Accordingly, the terminal device receives the video data.
Illustratively, the network device may send the video data over the QCI 2 link described above.
S510, the terminal device plays the video color vibration generated from the video data on the incoming call interface.
S511, the terminal device caches the video data of the video color vibration being played.
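A minimal sketch of the caching in S511, and of its later use in S513 and deletion in S515, might look like the following; the class and method names are invented for illustration and are not part of the patent.

```python
class VideoCrsCache:
    """Accumulate the video data chunks played online (S511) so they can be
    replayed after a handover (S513) and deleted once the call is answered
    (S515). Illustrative sketch only."""

    def __init__(self):
        self._chunks = []

    def on_video_data(self, chunk: bytes):
        # Called as each piece of video data arrives and is rendered online.
        self._chunks.append(chunk)

    def cached_video(self) -> bytes:
        # The data the terminal can loop after switching network systems.
        return b"".join(self._chunks)

    def clear(self):
        # S515: delete the cached video data after the call is answered.
        self._chunks.clear()

cache = VideoCrsCache()
cache.on_video_data(b"frame1")
cache.on_video_data(b"frame2")
print(cache.cached_video())  # b'frame1frame2'
cache.clear()
print(cache.cached_video())  # b''
```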
S512, the terminal device determines whether switching from the first network system to the second network system is triggered.
Illustratively, the first network system is a 4G or 5G network system, and the second network system is a 2G or 3G network system.
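The decision in S512 can be sketched as a simple predicate combining the two trigger conditions this application describes elsewhere (the current network no longer supports the first network system, or its quality falls below a preset threshold). The names and the quality scale here are illustrative assumptions.

```python
def needs_fallback_handover(first_system_supported: bool,
                            signal_quality: float,
                            threshold: float) -> bool:
    """Sketch of S512: switch from the first network system (4G/5G) to the
    second (2G/3G) when the first system is no longer supported by the
    current network, or its quality drops below the preset threshold."""
    return (not first_system_supported) or (signal_quality < threshold)

print(needs_fallback_handover(True, 0.8, 0.5))   # False: stay on the first system
print(needs_fallback_handover(True, 0.3, 0.5))   # True: quality below threshold
print(needs_fallback_handover(False, 0.9, 0.5))  # True: first system unavailable
```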
S513, in a case that switching from the first network system to the second network system is triggered, the terminal device plays, in a loop, the video color vibration generated from the cached video data on the incoming call interface, or plays an existing local video of the terminal device.
The existing local video of the terminal device is a video prestored in the memory of the terminal device before the incoming call rings.
It should be understood that when the terminal device triggers the switching of the network system, the switching may succeed or fail, which is not limited in this embodiment of the present application.
S514, the terminal device answers the incoming call and stops playing the video.
In this step, the played video includes the video color vibration generated from the cached video data, or an existing local video of the terminal device.
S515, the terminal device deletes the cached video data.
In the embodiment of the application, the network device and the terminal device may negotiate the audio and video media resources for playing the video color vibration, establish a link for transmitting video data, and play the video color vibration online on the incoming call interface based on the acquired video data. When the terminal device triggers switching from the first network system to the second network system, the terminal device plays the video color vibration generated from the cached video data on the incoming call interface, or plays an existing local video of the terminal device. This avoids the disruption of video playback falling back to audio-only playback because of the network-system switching, and helps bring the user a continuous video playing experience, thereby improving the user experience.
It should be understood that fig. 5 above is described by taking, as an example, the terminal device serving as a called terminal that receives an incoming call. Similarly, when the terminal device serves as a calling terminal, the terminal device may send a call request through the network device, and perform the audio and video media resource negotiation for the video color ring with the network device to obtain video data. After the negotiation, the terminal device may play a video color ring generated from the video data on the call interface. Likewise, when the terminal device triggers switching from the first network system to the second network system, the terminal device plays the video color ring generated from the cached video data on the call interface, or plays an existing local video of the terminal device. This brings a continuous video playing experience to the user, thereby improving the user experience.
The specific negotiation process of the audio and video media resources of the video color ring is similar to the negotiation process of the audio and video media resources of the video color vibration described above, and details are not repeated here.
The sequence numbers of the above processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
The video playing method according to the embodiment of the present application is described in detail above with reference to fig. 1 to 5, and the video playing apparatus according to the embodiment of the present application is described in detail below with reference to fig. 6 and 7.
Fig. 6 shows a schematic block diagram of a video playing apparatus 600 provided in an embodiment of the present application, where the apparatus 600 includes a processing module 610 and an obtaining module 620.
Wherein the processing module 610 is configured to: based on video data from network equipment, a first network system is used for playing videos generated by the video data on line on a call interface, wherein the videos comprise video color vibration or video color ring. The obtaining module 620 is configured to: and under the condition that the first network system needs to be switched to a second network system, acquiring candidate video data, wherein the candidate video data comprises cached video data or video data prestored in a memory, and the second network system is a network system which does not support the online video playing. The processing module 610 is further configured to: and playing the candidate video generated by the candidate video data on the call interface.
Optionally, the processing module 610 is configured to: and determining that the first network system needs to be switched to the second network system.
Optionally, the processing module 610 is configured to: and under the condition that the current network does not support the first network system, determining that the first network system needs to be switched to the second network system.
Optionally, the processing module 610 is configured to: and under the condition that the network quality under the first network system is lower than a preset threshold, determining that the first network system needs to be switched to the second network system.
Optionally, the candidate video data comprises video data buffered in the terminal device. The processing module 610 is configured to: the video data is buffered in a buffer. The obtaining module 620 is configured to: and acquiring the candidate video data from the buffer.
Optionally, the candidate video data includes video data prestored in a memory of the terminal device. The obtaining module 620 is configured to: and acquiring the candidate video data from the memory.
Optionally, the first network system is a 4G network system or a 5G network system, and the second network system is a 2G network system or a 3G network system.
In an alternative example, as will be understood by those skilled in the art, the apparatus 600 may be embodied as the terminal device in the above-described embodiment, or the functions of the terminal device in the above-described embodiment may be integrated in the apparatus 600. The above functions may be implemented by hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above. The apparatus 600 may be configured to perform various processes and/or steps corresponding to the terminal device in the foregoing method embodiment.
It should be appreciated that the apparatus 600 herein is embodied in the form of functional modules. The term module herein may refer to an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (e.g., a shared, dedicated, or group processor) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that support the described functionality. In an embodiment of the present application, the apparatus 600 in fig. 6 may also be a chip or a chip system, for example: system on chip (SoC).
Fig. 7 shows a schematic block diagram of another video playing apparatus 700 provided in the embodiment of the present application. The apparatus 700 includes a processor 710, a transceiver 720, and a memory 730. The processor 710, the transceiver 720 and the memory 730 are in communication with each other through an internal connection path, the memory 730 is used for storing instructions, and the processor 710 is used for executing the instructions stored in the memory 730 to control the transceiver 720 to transmit and/or receive signals.
It should be understood that the apparatus 700 may be embodied as the terminal device in the foregoing embodiment, or the functions of the terminal device in the foregoing embodiment may be integrated in the apparatus 700, and the apparatus 700 may be configured to perform each step and/or flow corresponding to the terminal device in the foregoing method embodiment. Alternatively, the memory 730 may include both read-only memory and random access memory, and provides instructions and data to the processor. The portion of memory may also include non-volatile random access memory. For example, the memory may also store device type information. The processor 710 may be configured to execute the instructions stored in the memory, and when the processor executes the instructions, the processor may perform the steps and/or processes corresponding to the terminal device in the foregoing method embodiments.
It should be understood that, in the embodiment of the present application, the processor 710 may be a Central Processing Unit (CPU), and the processor may also be other general processors, Digital Signal Processors (DSP), Application Specific Integrated Circuits (ASIC), Field Programmable Gate Arrays (FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, and so on. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the system, the apparatus and the module described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is merely a logical division, and in actual implementation, there may be other divisions, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be in an electrical, mechanical or other form.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present application may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific implementation of the present application, but the scope of the embodiments of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the embodiments of the present application, and all the changes or substitutions should be covered by the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (11)

1. A video playback method, comprising:
the terminal device plays, using a first network system, a video generated from video data online on a call interface based on the video data from a network device, wherein the video comprises a video color vibration or a video color ring;
in a case that the first network system needs to be switched to a second network system, the terminal device acquires candidate video data, wherein the candidate video data comprises video data cached in the terminal device or video data prestored in a memory of the terminal device, and the second network system is a network system that does not support online video playing; and
the terminal device plays a candidate video generated from the candidate video data on the call interface.
2. The method of claim 1, further comprising:
the terminal device determines that the first network system needs to be switched to the second network system.
3. The method of claim 2, wherein the determining, by the terminal device, that switching from the first network system to the second network system is required comprises:
in a case that a current network of the terminal device does not support the first network system, determining, by the terminal device, that the first network system needs to be switched to the second network system.
4. The method of claim 2, wherein the determining, by the terminal device, that switching from the first network system to the second network system is required comprises:
in a case that network quality of the terminal device under the first network system is lower than a preset threshold, determining, by the terminal device, that the first network system needs to be switched to the second network system.
5. The method according to any one of claims 1-4, wherein the candidate video data comprises the video data cached in the terminal device;
before the terminal device acquires the candidate video data, the method further comprises:
caching, by the terminal device, the video data in a buffer; and
the acquiring, by the terminal device, of the candidate video data comprises:
acquiring, by the terminal device, the candidate video data from the buffer.
6. The method according to any one of claims 1-4, wherein the candidate video data comprises the video data prestored in the memory of the terminal device;
the acquiring, by the terminal device, of the candidate video data comprises:
acquiring, by the terminal device, the candidate video data from the memory.
7. The method according to any one of claims 1-6, wherein the first network system is a fourth-generation mobile communication technology (4G) network system or a fifth-generation mobile communication technology (5G) network system, and the second network system is a second-generation mobile communication technology (2G) network system or a third-generation mobile communication technology (3G) network system.
8. A video playback device comprising means for performing the method of any one of claims 1-7.
9. A video playback apparatus, comprising: a processor coupled with a memory for storing a computer program that, when invoked by the processor, causes the apparatus to perform the method of any of claims 1-7.
10. A computer-readable storage medium for storing a computer program which, when run on a computer, causes the computer to perform the method of any one of claims 1-7.
11. A computer program product, characterized in that computer program code is included in the computer program product, which, when run on a computer, causes the computer to implement the method according to any one of claims 1-7.
CN202210048463.XA 2022-01-17 2022-01-17 Video playing method and video playing device Active CN114338922B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210048463.XA CN114338922B (en) 2022-01-17 2022-01-17 Video playing method and video playing device

Publications (2)

Publication Number Publication Date
CN114338922A true CN114338922A (en) 2022-04-12
CN114338922B CN114338922B (en) 2023-01-24

Family

ID=81029140

Country Status (1)

Country Link
CN (1) CN114338922B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117041465A (en) * 2023-07-18 2023-11-10 荣耀终端有限公司 Video call optimization method, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106603466A (en) * 2015-10-15 2017-04-26 中国移动通信集团公司 Media capability consultation method and device when network switching
CN107707969A (en) * 2017-09-04 2018-02-16 深圳市屯奇尔科技有限公司 Video broadcasting method, device and terminal device
CN113259526A (en) * 2020-02-13 2021-08-13 中国移动通信集团广东有限公司 Video content playing method and device and electronic equipment
CN113726958A (en) * 2020-05-26 2021-11-30 中国移动通信有限公司研究院 Video color ring back tone playing method, color ring back tone platform and terminal

Also Published As

Publication number Publication date
CN114338922B (en) 2023-01-24

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant