CN117955950A - Method and equipment for joining multimedia activities


Info

Publication number
CN117955950A
Authority
CN
China
Prior art keywords
activity
terminal device
card
user
time
Prior art date
Legal status
Pending
Application number
CN202211344827.5A
Other languages
Chinese (zh)
Inventor
曹文砚
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202211344827.5A
Publication of CN117955950A


Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a method and a device for joining a multimedia activity, which relate to the field of terminal technologies and enable a user to join a multimedia activity shared by another user at any time, improving the user experience. In the application, when activity information from a calling terminal is received, the called terminal can display a corresponding time-limited link entry, store the activity information locally, and persistently display an activity card corresponding to the first activity according to the stored activity information. On this basis, the user of the called terminal can join the multimedia activity initiated by the calling terminal not only through the link entry, but also through the activity card at any time, even when the link entry can no longer be used because it has timed out. The method avoids the problem that a multimedia activity shared by another user can no longer be joined once the time-limited link entry has expired because the user did not join through it in time, supports the user in joining shared multimedia activities at any time, and improves the user experience.

Description

Method and equipment for joining multimedia activities
Technical Field
The embodiment of the application relates to the technical field of terminals, in particular to a method and equipment for joining a multimedia activity.
Background
With the development of communication technology and terminal technology, various multimedia resources such as video resources, music resources, and live broadcast resources greatly enrich people's daily lives. In the process of using a terminal device for multimedia activities such as watching videos, playing music, or watching live broadcasts, users often want to share the multimedia activity with friends so as to experience it synchronously with them. However, the prior art lacks a specific solution to meet this need.
Disclosure of Invention
The application provides a method and equipment for joining a multimedia activity, which can support a user to join the multimedia activity shared by other users at any time and improve the user experience.
In order to achieve the above purpose, the embodiment of the application adopts the following technical scheme:
In a first aspect, there is provided a method of joining a multimedia activity, the method comprising: the first terminal device receives activity information of a first activity from the second terminal device; the first terminal device stores the activity information, where the activity information is used for linking, on the first terminal device, to an activity page of the ongoing first activity; the first terminal device displays an activity card of the first activity; after receiving an operation of the user joining the first activity through the activity card, the first terminal device performs the first activity synchronously with the second terminal device through the activity card.
As an example, the activity information of the first activity from the second terminal device is used to launch an application capable of executing the first activity, or to pull it from the background to the foreground, and to jump to the specific first activity page. Illustratively, the first activity is a video activity and the application capable of executing the first activity is a video application; or the first activity is an audio activity and the application capable of executing the first activity is a music player.
According to the scheme provided by the first aspect, when receiving the activity information from the calling terminal, the called terminal can store the activity information locally and persistently display the activity card corresponding to the first activity according to the stored activity information. Since the activity information is stored locally, the activity card corresponding to the activity provides a permanent entry page. Based on this, the user of the called terminal can join the multimedia activity initiated by the calling terminal through the activity card at any time. The method can support the user in joining multimedia activities shared by other users at any time, and improves the user experience.
For example, compared with an existing called terminal that does not store the activity information and only displays a time-limited link entry, the scheme provided by the first aspect avoids the problem that a multimedia activity shared by another user can no longer be joined after the time-limited link entry times out because the user did not join through it in time.
As a possible implementation manner, the first terminal device displaying an activity card of the first activity includes: the first terminal device displays the activity card of the first activity at a preset position according to the setting of the user. The scheme provided by the application is suitable for different activity card display schemes; for example, the specific display position of the activity card is not limited and can be customized by the user, so the scheme has strong applicability and high flexibility.
As one possible implementation, the preset position includes one or more of the following: the desktop, the negative one screen, and the notification bar. The scheme provided by the application is suitable for different activity card display schemes; for example, the display position of the activity card may include one or more places such as the desktop, the negative one screen, and the notification bar, so the scheme has strong applicability and high flexibility.
As a possible implementation manner, the activity information is sent to the first terminal device by the second terminal device through a first application; the method further comprises: after the first terminal device receives the activity information, displaying a time-limited link entry of the first activity on an application interface of the first application; the time-limited link entry supports the user in joining the first activity through the time-limited link entry within a preset duration after the first terminal device receives the activity information, and does not support the user in joining the first activity through the time-limited link entry after that preset duration has elapsed. In this scheme, the called terminal can display the corresponding time-limited link entry when receiving the activity information, so that the user can directly join the first activity through the entry on the current page within the preset duration, which improves the convenience of accessing the activity.
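For illustration only, the following Kotlin sketch captures the time-limited behaviour described above; the class name, the default duration value, and the clock source are assumptions made for this example and are not defined by the embodiment.

```kotlin
// Minimal sketch of a time-limited link entry: joining through the entry is
// only allowed within a preset duration after the activity information is
// received. The 60-second default is an assumed example value.
class TimeLimitedLinkEntry(
    private val receivedAtMillis: Long,
    private val presetDurationMillis: Long = 60_000L
) {
    // Returns true while the entry is still valid and supports joining.
    fun canJoin(nowMillis: Long = System.currentTimeMillis()): Boolean =
        nowMillis - receivedAtMillis <= presetDurationMillis
}
```

In such a sketch, the activity card described below would remain usable regardless of what canJoin returns, since it is backed by the locally stored activity information.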
As a possible implementation manner, the page on which the time-limited link entry is located includes an option to join the first activity later, and the first terminal device displaying an activity card of the first activity includes: upon receiving an operation of the user clicking the option to join the first activity later, the first terminal device displays the activity card of the first activity. The application does not limit the specific time at which the called terminal displays the activity card; for example, the activity card can be displayed when the user declines to join the activity through the time-limited link entry, so as to support the user in joining the multimedia activity shared by the other user through the activity card at any later time. Therefore, the scheme has strong applicability and high flexibility.
As a possible implementation manner, the first terminal device displaying an activity card of the first activity includes: after the preset duration has elapsed since the first terminal device received the activity information, the first terminal device displays the activity card of the first activity. The application does not limit the specific time at which the called terminal displays the activity card; for example, the activity card can be displayed when the time-limited link entry times out, so as to support the user in joining the multimedia activity shared by the other user through the activity card at any later time. Therefore, the scheme has strong applicability and high flexibility.
As a possible implementation manner, the first terminal device displaying an activity card of the first activity includes: when the time-limited link entry message disappears, the first terminal device displays the activity card of the first activity. The application does not limit the specific time at which the called terminal displays the activity card; for example, the activity card can be displayed when the time-limited link entry disappears, so as to support the user in joining the multimedia activity shared by the other user through the activity card at any later time. Therefore, the scheme has strong applicability and high flexibility.
As a possible implementation manner, the user joins the first activity through the activity card after the preset duration has elapsed since the first terminal device received the activity information. Based on this, after the time-limited link entry has timed out, the user can still join the multimedia activity shared by the other user at any time through the activity card, which avoids the problem that the activity can no longer be joined once the time-limited link entry has timed out, and improves the user experience.
As a possible implementation manner, in the process in which the first terminal device performs the first activity synchronously with the second terminal device through the activity card, the method further comprises: the first terminal device receives adjusted activity parameters of the first activity from the second terminal device, wherein the activity parameters comprise one or more of: play progress, play speed, brightness, volume, play status, and interface layout; the first terminal device updates the activity parameters of the first activity accordingly. Based on this, the synchronization and/or consistency of the playing effect when the second terminal device and the first terminal device perform the first activity can be ensured.
As a possible implementation manner, in the process in which the first terminal device performs the first activity synchronously with the second terminal device through the activity card, the method further comprises: in response to the user's adjustment of an activity parameter of the first activity, the first terminal device transmits the adjusted activity parameters of the first activity to the second terminal device, the activity parameters comprising one or more of: play progress, play speed, brightness, volume, play status, and interface layout. Based on this, the synchronization and/or consistency of the playing effect when the second terminal device and the first terminal device perform the first activity can be ensured.
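As a minimal Kotlin sketch of the two-way parameter synchronization described in the two implementations above, the activity parameters and a session object could look as follows; all names, and the ParamSender transport abstraction, are assumptions made for this illustration rather than an API defined by the embodiment.

```kotlin
// Illustrative set of the activity parameters listed above (names are assumptions).
data class ActivityParams(
    val playProgressMs: Long,   // play progress
    val playSpeed: Float,       // e.g. 1.0f, 2.0f, 3.0f
    val brightness: Int,
    val volume: Int,
    val playState: String,      // e.g. "playing" or "paused"
    val interfaceLayout: String
)

// Assumed transport abstraction standing in for however parameters are sent
// between the first and second terminal devices.
interface ParamSender {
    fun send(params: ActivityParams)
}

class ActivitySession(private val sender: ParamSender, var params: ActivityParams) {
    // Adjusted parameters arrive from the peer device: update local playback to match.
    fun onRemoteAdjustment(remote: ActivityParams) {
        params = remote
    }

    // The local user adjusts a parameter: apply it and report the change to the peer.
    fun onLocalAdjustment(adjusted: ActivityParams) {
        params = adjusted
        sender.send(adjusted)
    }
}
```

The same sketch applies symmetrically on both devices, which is what keeps the playing effect consistent during the shared activity.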
As a possible implementation, the first activity includes an audio activity or a video activity. The application does not limit the specific form of the first activity; therefore, the scheme is suitable for sharing scenarios of multiple kinds of activities. Illustratively, an audio activity is, for example, a music playing activity; a video activity is, for example, a movie playing activity or a live video viewing activity.
In a second aspect, there is provided a terminal device comprising: a communication unit for receiving activity information of a first activity from a second terminal device; a storage unit for storing the activity information, the activity information being used for linking, on the first terminal device, to an activity page of the ongoing first activity; a display unit for displaying the activity card of the first activity; and a processing unit for performing the first activity synchronously with the second terminal device through the activity card after receiving an operation of the user joining the first activity through the activity card.
According to the scheme provided by the second aspect, when receiving the activity information from the calling terminal, the called terminal can store the activity information locally and persistently display the activity card corresponding to the first activity according to the stored activity information. Since the activity information is stored locally, the activity card corresponding to the activity provides a permanent entry page. Based on this, the user of the called terminal can join the multimedia activity initiated by the calling terminal through the activity card at any time. The method can support the user in joining multimedia activities shared by other users at any time, and improves the user experience.
For example, compared with an existing called terminal that does not store the activity information and only displays a time-limited link entry, the scheme provided by the second aspect avoids the problem that a multimedia activity shared by another user can no longer be joined after the time-limited link entry times out because the user did not join through it in time.
As a possible implementation manner, the display unit displaying an activity card of the first activity includes: the display unit displays the activity card of the first activity at a preset position according to the setting of the user. The scheme provided by the application is suitable for different activity card display schemes; for example, the specific display position of the activity card is not limited and can be customized by the user, so the scheme has strong applicability and high flexibility.
As one possible implementation, the preset position includes one or more of the following: the desktop, the negative one screen, and the notification bar. The scheme provided by the application is suitable for different activity card display schemes; for example, the display position of the activity card may include one or more places such as the desktop, the negative one screen, and the notification bar, so the scheme has strong applicability and high flexibility.
As a possible implementation manner, the activity information is sent to the first terminal device by the second terminal device through a first application; the display unit is further configured to: display a time-limited link entry of the first activity on an application interface of the first application after the communication unit receives the activity information; the time-limited link entry supports the user in joining the first activity through the time-limited link entry within a preset duration after the first terminal device receives the activity information, and does not support the user in joining the first activity through the time-limited link entry after that preset duration has elapsed. In this scheme, the called terminal can display the corresponding time-limited link entry when receiving the activity information, so that the user can directly join the first activity through the entry on the current page within the preset duration, which improves the convenience of accessing the activity.
As a possible implementation manner, the page on which the time-limited link entry is located includes an option to join the first activity later, and the display unit displaying an activity card of the first activity includes: when the terminal device receives an operation of the user clicking the option to join the first activity later, the display unit displays the activity card of the first activity. The application does not limit the specific time at which the called terminal displays the activity card; for example, the activity card can be displayed when the user declines to join the activity through the time-limited link entry, so as to support the user in joining the multimedia activity shared by the other user through the activity card at any later time. Therefore, the scheme has strong applicability and high flexibility.
As a possible implementation manner, the display unit displaying an activity card of the first activity includes: after the preset duration has elapsed since the terminal device received the activity information, the display unit displays the activity card of the first activity. The application does not limit the specific time at which the called terminal displays the activity card; for example, the activity card can be displayed when the time-limited link entry times out, so as to support the user in joining the multimedia activity shared by the other user through the activity card at any later time. Therefore, the scheme has strong applicability and high flexibility.
As a possible implementation manner, the display unit displaying an activity card of the first activity includes: when the time-limited link entry message disappears, the display unit displays the activity card of the first activity. The application does not limit the specific time at which the called terminal displays the activity card; for example, the activity card can be displayed when the time-limited link entry disappears, so as to support the user in joining the multimedia activity shared by the other user through the activity card at any later time. Therefore, the scheme has strong applicability and high flexibility.
As a possible implementation manner, the user joins the first activity through the activity card after the preset duration has elapsed since the first terminal device received the activity information. Based on this, after the time-limited link entry has timed out, the user can still join the multimedia activity shared by the other user at any time through the activity card, which avoids the problem that the activity can no longer be joined once the time-limited link entry has timed out, and improves the user experience.
As a possible implementation manner, in the process in which the processing unit performs the first activity synchronously with the second terminal device through the activity card, the communication unit is further configured to: receive adjusted activity parameters of the first activity from the second terminal device, wherein the activity parameters comprise one or more of: play progress, play speed, brightness, volume, play status, and interface layout; the processing unit is further configured to update the activity parameters of the first activity according to the adjusted activity parameters of the first activity. Based on this, the synchronization and/or consistency of the playing effect when the second terminal device and the first terminal device perform the first activity can be ensured.
As a possible implementation manner, in the process in which the processing unit performs the first activity synchronously with the second terminal device through the activity card, the communication unit is further configured to: in response to the user's adjustment of an activity parameter of the first activity, transmit the adjusted activity parameters of the first activity to the second terminal device, the activity parameters comprising one or more of: play progress, play speed, brightness, volume, play status, and interface layout. Based on this, the synchronization and/or consistency of the playing effect when the second terminal device and the first terminal device perform the first activity can be ensured.
As a possible implementation, the first activity includes an audio activity or a video activity. The application does not limit the specific form of the first activity; therefore, the scheme is suitable for sharing scenarios of multiple kinds of activities. Illustratively, an audio activity is, for example, a music playing activity; a video activity is, for example, a movie playing activity or a live video viewing activity.
In a third aspect, there is provided a terminal device comprising: a memory for storing computer program instructions; the communication interface is used for transmitting and receiving signals; the display is used for displaying an interface; a processor configured to execute the instructions, so that the terminal device implements the method as in any of the possible implementations of the first aspect.
In a fourth aspect, an activity sharing system is provided, the system comprising a second terminal device and a first terminal device, wherein the second terminal device is configured to send activity information of a first activity performed by the second terminal device to the first terminal device, and the first terminal device is configured to implement, according to the activity information of the first activity, the method as in any possible implementation manner of the first aspect.
In a fifth aspect, there is provided a computer readable storage medium having stored thereon computer readable instructions which, when executed by a processor, implement a method as in any one of the possible implementations of the first aspect.
In a sixth aspect, a chip system is provided, the chip system including a processor and a memory, the memory having instructions stored therein; the instructions, when executed by the processor, implement the method as in any one of the possible implementations of the first aspect. The chip system may be formed of a chip or may include a chip and other discrete devices.
In a seventh aspect, a computer program product is provided comprising computer readable instructions which, when run on a computer, cause the method as in any one of the possible implementations of the first aspect to be implemented.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present application;
Fig. 2 is a block diagram of a software architecture of an electronic device according to an embodiment of the present application;
Fig. 3A is a first flowchart of a method for joining a multimedia activity according to an embodiment of the present application;
Fig. 3B is a second flowchart of a method for joining a multimedia activity according to an embodiment of the present application;
Fig. 3C is a third flowchart of a method for joining a multimedia activity according to an embodiment of the present application;
Fig. 4 is a schematic diagram of a video playing interface in a video activity process of a second terminal device according to an embodiment of the present application;
Fig. 5 is a schematic diagram of an audio playing interface in an audio activity process of a second terminal device according to an embodiment of the present application;
Fig. 6 is a first schematic diagram of a scenario in which a calling terminal initiates multimedia activity sharing according to an embodiment of the present application;
Fig. 7A is a second schematic diagram of a scenario in which a calling terminal initiates multimedia activity sharing according to an embodiment of the present application;
Fig. 7B is a third schematic diagram of a scenario in which a calling terminal initiates multimedia activity sharing according to an embodiment of the present application;
Fig. 8 is a first schematic diagram of a scenario in which a called terminal receives multimedia activity sharing according to an embodiment of the present application;
Fig. 9 is a second schematic diagram of a scenario in which a called terminal receives multimedia activity sharing according to an embodiment of the present application;
Fig. 10 is a third schematic diagram of a scenario in which a called terminal receives multimedia activity sharing according to an embodiment of the present application;
Fig. 11 is a fourth schematic diagram of a scenario in which a called terminal receives multimedia activity sharing according to an embodiment of the present application;
Fig. 12 is a schematic diagram of a called terminal joining a multimedia activity through an activity card according to an embodiment of the present application;
Fig. 13 is a schematic diagram of two other called terminals joining a multimedia activity through an activity card according to an embodiment of the present application;
Fig. 14 is a flowchart of a method for joining a multimedia activity according to an embodiment of the present application;
Fig. 15 is a schematic diagram of a first activity interface in a multimedia activity sharing scenario according to an embodiment of the present application;
Fig. 16 is a schematic diagram of a multimedia activity sharing network architecture according to an embodiment of the present application;
Fig. 17 is an interaction diagram of a method for joining a multimedia activity according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings in the embodiments of the present application. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist together, or B exists alone. In addition, in the description of the embodiments of the present application, "plurality" means two or more.
The terms "first" and "second" are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present embodiment, unless otherwise specified, the meaning of "plurality" is two or more.
In order to meet the requirement of real-time sharing and synchronous multimedia activities among multiple devices, as a possible implementation method, a first terminal device can access a certain multimedia activity shared by a second terminal device through a time-limited link entry corresponding to the activity, and then synchronously perform the multimedia activities with the second terminal device, such as synchronous video playing, synchronous music playing and the like.
However, the above manner of displaying the link entry is real-time only: the first terminal device does not store the specific activity information locally (for example, the first terminal device does not support storing the specific activity information locally), and the time-limited link entry is cleared once the specified time limit is exceeded. Therefore, after the first terminal device displays the time-limited link entry, if the user of the first terminal device does not agree to join in time, the first terminal device cannot subsequently join the activity through the time-limited link entry.
In order to solve the above technical problem, support users in joining activities shared by other users at any time, and improve the user experience, in another method for joining a multimedia activity provided in an embodiment of the present application, after a second terminal device shares activity information corresponding to a multimedia activity with a first terminal device, the first terminal device may store the activity information locally. Even if the user does not agree to join in time, the user can still subsequently join the activity shared by the second terminal device according to the activity information stored by the first terminal device. The activity information includes one or more of an activity ID (e.g., a room ID) and activity parameters (e.g., a resource title, a resource address, a play progress, a play speed, a brightness, a volume, a play status, an interface layout, etc.).
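Purely as an illustration, the activity information described above could be modelled as follows in Kotlin, reusing the ActivityParams sketch introduced earlier; the field names and the grouping of fields are assumptions made for this example, not a structure defined by the embodiment.

```kotlin
// Hypothetical shape of the activity information shared by the second terminal
// device. The resource title/address and the sending time are kept at the top
// level here purely as an illustrative choice.
data class ActivityInfo(
    val activityId: String,       // activity (room) ID
    val resourceTitle: String,
    val resourceAddress: String,
    val sharedAtMillis: Long,     // time the information was sent to the first terminal device
    val params: ActivityParams    // play progress, speed, brightness, volume, state, layout
)
```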
As a possible implementation manner, in order to facilitate the user operation, after receiving the activity information shared by the second terminal device, the first terminal device may further display a corresponding permanent entry page according to the stored activity information, where the permanent entry page is associated with the activity information stored locally. When the user joins the activity shared by the second terminal device from the permanent entry page, the first terminal device may join the activity shared by the second terminal device according to the activity information associated with the permanent entry page.
As an example, after receiving the activity information shared by the second terminal device, the first terminal device may display an activity card, that is, a permanent entry page corresponding to the activity. For example, depending on the operating system of the first terminal device, the first terminal device may provide the activity card service through a meta-service capability (feature ability, FA).
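As a minimal sketch, and without describing the FA card service itself (whose interfaces are not detailed here), the called terminal could persist the received activity information locally so that the permanent entry can resolve it at any later time; the file-based storage and naming below are assumptions made only for this illustration.

```kotlin
import java.io.File

// Minimal, assumed persistence layer for the received activity information;
// a real implementation would use the platform's storage and serialization.
class ActivityInfoStore(private val dir: File) {

    // Persist the information so the activity card can be resolved later.
    fun save(info: ActivityInfo) {
        File(dir, "${info.activityId}.record").writeText(
            listOf(
                info.activityId,
                info.resourceTitle,
                info.resourceAddress,
                info.sharedAtMillis.toString()
            ).joinToString("|")
        )
    }

    // Returns the stored record for the given activity ID, or null if absent.
    fun load(activityId: String): String? =
        File(dir, "$activityId.record").takeIf { it.exists() }?.readText()
}
```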
In the multimedia activity sharing scenario, the second terminal device is an activity sharing initiator, and the first terminal device is an activity sharing receiver.
As an example, the method for joining a multimedia activity provided by the embodiment of the present application is applicable to a process in which a first terminal device (i.e., a called terminal) joins an activity shared by a second terminal device (i.e., a calling terminal). The first terminal device/second terminal device may include, but is not limited to, a smart phone, a netbook, a tablet computer, a personal computer (personal computer, PC), a palm computer, a vehicle-mounted device, a wearable device (such as a smart watch, a smart bracelet, or smart glasses), a camera (such as a single-lens reflex camera or a compact camera), a smart television, a personal digital assistant (personal digital assistant, PDA), a portable multimedia player (portable multimedia player, PMP), a projection device, a smart screen device, an augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR) device, a mixed reality (mixed reality, MR) device, a television, or a motion sensing game machine in a human-computer interaction scenario, etc. The application does not limit the specific function and structure of the first terminal device/second terminal device.
As an example, please refer to fig. 1, fig. 1 shows a schematic hardware structure of a terminal device according to an embodiment of the present application. The terminal device may be a first terminal device or a second terminal device.
As shown in fig. 1, the terminal device may include a processor 110, a memory (including an external memory interface 120 and an internal memory 121), a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and the like. The sensor module 180 may include a touch sensor 180A, among others. Optionally, the sensor module 180 may further include a pressure sensor, a gyroscope sensor, a barometric sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity sensor, a fingerprint sensor, a temperature sensor, an ambient light sensor, a bone conduction sensor, and the like.
It will be appreciated that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the terminal device. In other embodiments of the application, the terminal device may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. For example: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a flight controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural-network processor (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (serial data line, SDA) and a serial clock line (serial clock line, SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180A, a microphone, the camera 193, etc., respectively, through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180A through an I2C interface, such that the processor 110 communicates with the touch sensor 180A through an I2C bus interface to implement the touch function of the terminal device.
In the embodiment of the present application, the processor 110 may acquire, through the I2C bus interface, a touch operation such as a click operation or a drag operation performed on the interface by the user detected by the touch sensor 180A, thereby determining a specific intention corresponding to the touch operation, and further respond to the touch operation, for example, initiate an activity sharing, join an activity, and so on.
The charge management module 140 is configured to receive a charge input from a charger. The power management module 141 is used for connecting the battery 142, the charge management module 140, and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the terminal device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the terminal device may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G or the like applied on a terminal device. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through audio devices (not limited to speaker 170A, receiver 170B, etc.), or displays images or video through display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., a Wi-Fi network), Bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (NFC), infrared (IR), etc. applied on the terminal device. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it into electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 of the terminal device is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the terminal device can communicate with the network and other devices through wireless communication technologies. The wireless communication technologies may include the global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include a global positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a BeiDou navigation satellite system (beidou navigation satellite system, BDS), a quasi-zenith satellite system (quasi-zenith satellite system, QZSS), and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
In some embodiments of the present application, if the terminal device is the second terminal device, the terminal device may send information, such as shared activity information or activity parameters, to the first terminal device through the wireless communication module 160 based on the wireless communication technology.
In other embodiments of the present application, if the terminal device is the first terminal device, the terminal device may receive information, such as activity information and activity parameters, from the second terminal device through the wireless communication module 160 based on the wireless communication technology.
The terminal device implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
In the embodiment of the application, the terminal equipment can draw and render the layers on interfaces such as a communication application interface, an activity sharing interface, a multimedia activity interface and the like through the GPU. The Layer drawing is mainly used for drawing an interface Layer (Layer) to be displayed. The essence of layer drawing is the filling of pixels. The layer rendering is mainly used for adjusting brightness, contrast, saturation and the like of the drawn layer, and meanwhile, the state of the original layer is not changed.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diode, QLED), or the like. In some embodiments, the terminal device may include 1 or N display screens 194, N being a positive integer greater than 1.
In the embodiment of the present application, the terminal device may display an application interface, an activity sharing interface, a multimedia activity interface, etc. through the display screen 194.
The terminal device may implement photographing functions through an ISP, a camera assembly 193, a video codec, a GPU, a display screen 194, an application processor, and the like. In an embodiment of the application, ISP, camera assembly 193, video codec, GPU and display 194 may support video calls between terminal devices and other terminal devices.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to realize expansion of the memory capability of the terminal device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as audio, video, activity information, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code. By way of example, computer programs may include operating system programs and application programs. The operating system may include, but is not limited to, an operating system such as Apple iOS and the like. The executable program code includes instructions. The processor 110 executes various functional applications of the terminal device and performs data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an operating system, an application program required for at least one function, and the like. The storage data area may store data created during use of the terminal device (such as interface data, activity information, etc.). In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the terminal device and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The terminal device may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an application processor, and the like. Such as music playing, recording, etc. Regarding the specific operation and function of the audio module 170, the speaker 170A, the receiver 170B, and the microphone 170C, reference may be made to the description in the conventional art.
The touch sensor 180A is also referred to as a "touch panel". The touch sensor 180A may be disposed on the display 194, and the touch sensor 180A and the display 194 form a touch screen, which is also referred to as a "touch screen". The touch sensor 180A is used to detect a touch operation acting on or near it. The touch sensor may communicate the detected touch operation (including information about the touch location, touch force, contact area, and touch duration) to the processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180A may also be disposed on the surface of the terminal device at a different location than the display 194.
In the embodiment of the present application, the touch operation detected by the touch sensor 180A may be an operation performed on or near the touch screen by a finger, or an operation performed on or near the touch screen by a user using a touch-control auxiliary tool such as a stylus, a touch-control pen, a touch-control ball, or the like.
In addition, for the description of hardware such as the keys 190, the motor 191, and the indicator 192, reference may be made to the conventional technology, and details are not repeated in the embodiments of the present application.
It will be appreciated that the structure illustrated in fig. 1 of the present application does not constitute a specific limitation on the terminal device. In other embodiments of the application, the terminal device may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Fig. 2 takes an Android system as an example and shows a software architecture block diagram of an electronic device according to an embodiment of the present application. Illustratively, based on a layered architecture, the software may be divided into several layers, each layer having a distinct role and division of labor. The layers communicate with each other through a software interface. In some embodiments, as shown in fig. 2, the Android system based on the layered architecture is divided, from top to bottom, into an application layer, an application framework layer (Framework), an Android runtime (Android Runtime) and system libraries, and a kernel layer.
The application layer may include a series of applications. The series of applications may include third-party applications such as map, navigation, audio applications, video applications, and instant messaging applications as shown in fig. 2, and may also include native functions or applications integrated in the operating system of the terminal device, such as camera, gallery, calendar, call, short message, and FA as shown in fig. 2. The instant messaging application has an activity sharing function; the FA may provide the activity card service.
An application Framework layer (Framework) is used to provide an application programming interface (application programming interface, API) and programming Framework for the application of the application layer. The application framework layer includes a number of predefined functions. As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include activity information, video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build an application interface. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is arranged to provide communication functions for the terminal device. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows the application to display notification information in the status bar, can be used to convey notification-type messages, and can automatically disappear after a short stay without user interaction. For example, the notification manager is used to notify that a download is complete, to give message alerts, and the like. The notification manager may also present notifications in the form of a chart or scroll bar text in the status bar at the top of the system, such as a notification of an application running in the background, or present notifications on the screen in the form of a dialog window. For example, a text message is prompted in the status bar, a prompt tone is emitted, the device vibrates, or an indicator light blinks.
The Android runtime includes a core library and virtual machines. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that need to be called by the Java language, and the other part is the core library of Android.
The system library may include a plurality of functional modules, for example: a surface manager (surface manager), a media library (media library), a three-dimensional graphics processing library (e.g., OpenGL ES), a 2D graphics engine (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between the hardware layer and the software layer. The kernel layer may include the display driver shown in fig. 2, input/output device drivers (e.g., keyboard driver, touch screen driver, earphone driver, speaker driver, microphone driver, etc.), camera driver, audio driver, sensor driver, etc. The user performs input operation through an input device (such as a touch screen, a keyboard, a microphone and the like), and the kernel layer can generate corresponding input events according to the input operation received by the input device.
Illustratively, in some embodiments of the present application, the input operation received by the input device may include a click (e.g., single click, double click, etc.) operation/long press operation/slide operation/hover gesture operation, etc., of the touch screen by a user's finger. In other embodiments of the present application, the input operation received by the input device may include a frame selection operation/a long press operation/a click (e.g., a single click, a double click, etc.) operation/a sliding operation/a drag operation, etc. on the touch screen by the user using a stylus, a touch pen, a touch ball, etc. touch assistance tool.
A method for adding a multimedia activity according to an embodiment of the present application will be specifically described below with reference to the accompanying drawings.
As shown in fig. 3A, a method for joining a multimedia activity according to an embodiment of the present application may include the following steps 1 to 5:
step 1: the first terminal device receives activity information of a first activity from the second terminal device.
Wherein the activity information of the first activity is used for linking, on the first terminal device, to an activity page of the ongoing first activity.
For example, the activity information of the first activity from the second terminal device is used to launch/pull from the background to the foreground an application capable of executing the first activity and jump to a specific first activity page.
Step 2: the first terminal device stores activity information of the first activity.
Step 3: the first terminal device displays an activity card of the first activity.
Step 4: the first terminal device receives an operation of joining a first activity by a user through the activity card.
Step 5: the first terminal device performs the first activity synchronously with the second terminal device through the activity card.
As an example, for a case where an application capable of executing a first activity is not running in the first terminal device, the first terminal device may start the application capable of executing the first activity according to the activity card and then jump to a specific first activity page.
As another example, for the case where an application capable of executing a first activity is run in the background in the first terminal device, the first terminal device may pull the application capable of executing the first activity from the background to the foreground according to the activity card, and then jump to a specific first activity page.
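Putting steps 3 to 5 together, the following Kotlin sketch illustrates how the first terminal device could react when the user joins through the activity card; AppLauncher is an assumed abstraction standing in for whatever mechanism the platform provides for starting an application or pulling it from the background to the foreground, and the names are not defined by the embodiment.

```kotlin
// Assumed abstraction over application launching on the first terminal device.
interface AppLauncher {
    fun isRunningInBackground(appId: String): Boolean
    fun launch(appId: String)
    fun bringToForeground(appId: String)
    fun openActivityPage(appId: String, info: ActivityInfo)
}

// Sketch of joining the first activity from the activity card (steps 4 and 5):
// start the application if it is not running, otherwise bring it to the
// foreground, then jump to the specific first activity page.
fun joinFromCard(launcher: AppLauncher, appId: String, info: ActivityInfo) {
    if (launcher.isRunningInBackground(appId)) {
        launcher.bringToForeground(appId)
    } else {
        launcher.launch(appId)
    }
    launcher.openActivityPage(appId, info)
}
```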
A method for adding a multimedia activity according to an embodiment of the present application will be specifically described below with reference to specific embodiments.
As an example, please refer to fig. 3B, fig. 3B shows a flowchart of a method for joining a multimedia activity according to an embodiment of the present application. As shown in fig. 3B, a method for joining a multimedia activity according to an embodiment of the present application may include S301-S302 and S303A-S304A:
S301: the second terminal device performs the first activity.
Wherein the first activity is a multimedia activity. The first activity may be an audio activity or a video activity, for example. Audio activities such as music playing activities, etc. Video activities such as movie video playing activities, live video viewing activities, etc.
For the case that the first activity is a video activity, the second terminal device performing the first activity may specifically include: the second terminal device plays the first video. The second terminal device playing the first video specifically means that the second terminal device displays a first video playing interface. As an example, please refer to fig. 4, fig. 4 shows a schematic diagram of a video playing interface in a video activity process of a second terminal device according to an embodiment of the present application. The video playback interface is shown as interface 401 in fig. 4.
For the case where the first activity is an audio activity, the second terminal device performing the first activity may specifically include: the second terminal device plays the first audio. Optionally, the second terminal device performing the first activity may further include: the second terminal device displays the first audio playing interface. As an example, please refer to fig. 5, fig. 5 shows a schematic diagram of an audio playing interface in the process of performing audio activities by the second terminal device according to an embodiment of the present application. The audio playback interface is shown as interface 501 in fig. 5.
S302: in response to the second user sharing the first activity to the first terminal device, the second terminal device transmits activity information of the first activity to the first terminal device.
Wherein, the activity information of the first activity may include the first activity ID, the activity parameters of the first activity, and the like. The first activity ID is an activity room ID corresponding to the first activity. The activity parameters are one or more of time information of the second terminal device sending the activity information to the first terminal device, a resource title, a resource address of the first activity, and a playing progress, a playing speed, brightness, volume, a playing state, an interface layout, and the like of the first activity. The playing progress of the first activity is, for example, the XX-th minute of the X-th episode. The playing speed is, for example, normal speed, 2×, 3×, etc. The playing state is, for example, playing or paused.
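As a purely illustrative sketch (the field names below are hypothetical; the application does not prescribe a concrete encoding of the activity information), the activity information described above might be modeled in Java as:

    // Hypothetical sketch of the activity information of the first activity.
    public record ActivityInfo(
            String activityId,        // activity room ID corresponding to the first activity
            long sentTimeMillis,      // time the second terminal device sent the activity information
            String resourceTitle,     // resource title
            String resourceAddress,   // resource address of the first activity
            long playProgressMillis,  // playing progress
            double playSpeed,         // playing speed, e.g. 1.0, 2.0, 3.0
            int brightness,           // interface brightness
            int volume,               // playing volume
            boolean playing,          // playing state: playing or paused
            String interfaceLayout    // interface layout descriptor
    ) {}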
As an example, the operation of the second user to share the first activity to the first terminal device may be initiated by the second user clicking on an activity sharing functionality control on the first activity interface. Illustratively, the activity sharing functionality controls are such as "listen together" functionality buttons, or "see together" functionality buttons.
Taking the scenario shown in fig. 4 as an example, in response to the second user clicking the activity sharing functionality control 402 on the video playing interface 401, the second terminal device displays the interface 403 shown in fig. 4. The interface 403 includes a plurality of sharing path options, including an instant messaging application option 404. Further, in response to the operation of the second user selecting the instant messaging application option 404 on the interface 403, the second terminal device displays an interface 405. The interface 405 includes a plurality of sharing target options, including the first user. In response to the operation of the second user selecting the first user on the interface 405, the second terminal device transmits the activity information to the first terminal device. The activity information is used to indicate an activity ID and activity parameters corresponding to the video playing interface 401.
In the scenario shown in fig. 4, the operation of the second user sharing the first activity to the first terminal device is a combination of the following operations: clicking the activity sharing functionality control 402 on the interface 401 → selecting the instant messaging application option 404 on the interface 403 → selecting the first user on the interface 405. Alternatively, it may be the single operation of the second user selecting the first user on the interface 405 in the scenario illustrated in fig. 4.
Taking the scenario shown in fig. 5 as an example, in response to the second user clicking the activity sharing functionality control 502 on the audio playing interface 501, the second terminal device displays the interface 503 shown in fig. 5. The interface 503 includes a plurality of sharing path options, including an instant messaging application option 504. Further, in response to the operation of the second user selecting the instant messaging application option 504 on the interface 503, the second terminal device displays an interface 505. The interface 505 includes a plurality of sharing target options, including the first user. In response to the operation of the second user selecting the first user on the interface 505, the second terminal device transmits the activity information to the first terminal device. The activity information is used to indicate an activity ID and activity parameters corresponding to the audio playing interface 501.
In the scenario shown in fig. 5, the operation of the second user sharing the first activity to the first terminal device is a combination of the following operations: clicking the activity sharing functionality control 502 on the interface 501 → selecting the instant messaging application option 504 on the interface 503 → selecting the first user on the interface 505. Alternatively, it may be the single operation of the second user selecting the first user on the interface 505 in the scenario illustrated in fig. 5.
As another example, the operation of the second user to share the first activity to the first terminal device may be an operation of the user dragging the first activity interface to a chat interface of the instant messaging application.
Taking the scenario shown in fig. 6 as an example, when the second terminal device simultaneously displays the video playing interface 601 and the instant messaging interface 602 for chatting with the first user, in response to an operation of dragging the video playing interface 601 to the instant messaging interface 602 by the second user, the second terminal device transmits the activity information to the first terminal device and displays the instant messaging interface 603 shown in fig. 6. The activity information is used to indicate an activity ID and an activity parameter corresponding to the video playing interface 601.
In the scenario shown in fig. 6, the operation of the second user sharing the first activity to the first terminal device is an operation of the second user dragging the video playing interface 601 to the instant messaging interface 602 for chatting with the first user.
As another example, the operation of the second user to share the first activity to the first terminal device may be an operation of the second user on the instant messaging interface.
Taking the scenario shown in fig. 7A as an example, when the second terminal device displays the video call interface 701 with the first user, the second terminal device displays the interface 703 shown in fig. 7A in response to the operation of the second user clicking the activity sharing functionality control 702 on the video call interface 701. The interface 703 includes a plurality of audio and video activities ongoing on the second terminal device, including video activity a. Further, in response to the operation of the second user selecting video activity a on the interface 703, the second terminal device transmits the activity information to the first terminal device. The activity information is used to indicate the activity ID and activity parameters of video activity a.
In the scenario shown in fig. 7A, the operation of the second user sharing the first activity to the first terminal device is a combination of the following operations: clicking the activity sharing functionality control 702 on the interface 701 → selecting video activity a on the interface 703. Alternatively, it may be the single operation of the second user selecting video activity a on the interface 703 in the scenario shown in fig. 7A.
It should be noted that the foregoing embodiments describe only several example operations by which the second user shares the first activity, and the present application does not specifically limit the specific operation mode and scenario for triggering the activity sharing. For example, the operation of the second user sharing the first activity may be initiated by the second user clicking an activity sharing functionality control on the multi-person video communication interface displayed on the second terminal device shown in fig. 7B. For another example, the operation of the second user sharing the first activity may be initiated by the second user clicking an activity sharing functionality control on a group chat interface displayed on the second terminal device. For these scenarios, the second terminal device sends the activity information of the first activity to each user in the group, i.e., the second terminal device sends the activity information of the first activity to a plurality of first terminal devices.
S303A: the first terminal device displays an activity card of the first activity.
In the embodiment of the application, after the activity information from the second terminal device is received, the activity card displayed by the first terminal device is the permanent entry page of the first activity. The activity card carries activity information of the first activity. The activity information of the first activity may include a first activity ID, activity parameters of the first activity, and the like. The activity parameter is one or more of time information of sending activity information to the first terminal device by the second terminal device, a resource title, a resource address, a playing progress, a playing speed, brightness, volume, playing state, interface layout and the like of the first activity.
By way of example, the first terminal device may provide the activity card service through an FA of its operating system, i.e., the activity card may be an FA card.
As one possible implementation, after receiving the activity information from the second terminal device, the first terminal device may save the activity information of the first activity, including the first activity ID and the activity parameters of the first activity. Further, the first terminal device may display the activity card based on the saved activity information.
The embodiment of the present application does not limit the specific form in which the first terminal device displays the activity card.
For example, in some examples, the first terminal device may display the activity card in the form of a floating card over the interface currently displayed by the first terminal device. Referring to fig. 12, fig. 12 shows an example of an activity card 1200 displayed in the form of a floating card.
As another example, in other examples, the first terminal device may display the activity card at a preset position such as the notification bar, the desktop, or the negative one screen. Referring to fig. 13, fig. 13 (a) shows an example of displaying an activity card 1300-1 in the notification bar; fig. 13 (b) shows an example of displaying an activity card 1300-2 on the desktop.
It should be noted that the activity cards in the scenarios shown in fig. 12-13 are merely examples, and the present application does not limit the specific content included in the activity card. For example, activity information of the first activity, such as a title, an episode number, etc., may also be included on the activity card.
In some embodiments, after receiving the activity information from the second terminal device, the first terminal device may also display a time-limited link entry of the first activity, as shown in S303B in fig. 3B. The first terminal device may display the time-limited link entry on the instant communication interface, or may display the time-limited link entry in the form of a notification message, which is not limited in the embodiment of the present application.
As an example, assume that the first terminal device is currently displaying a first interface, where the first interface is the instant messaging interface between the first user and the second user, and the second terminal device initiates the activity sharing on the instant messaging interface between the first user and the second user. In this case, the first terminal device may display the time-limited link entry on the first interface (i.e., the instant messaging interface between the first user and the second user).
Illustratively, assuming that the second terminal device triggers the activity sharing of the first terminal device in the manner shown in fig. 4, 5 or 6, and the first terminal device displays the instant communication interface 801 shown in fig. 8 for chatting with the second user, after receiving the activity information from the second terminal device, the first terminal device displays the instant communication interface 802 shown in fig. 8, wherein the time-limited link entry page 803 is displayed on the instant communication interface 802. The time-limited link entry page 803 is used for the first terminal device to link to the activity page. Illustratively, as shown in fig. 8, the time-limited link entry page 803 includes an immediate join option and a later join option, for the first user to select to join the activity immediately or later.
Or, for example, assuming that the second terminal device triggers the activity sharing of the first terminal device in the manner shown in fig. 7A, and the first terminal device displays the instant communication interface 901 for video communication with the second user shown in fig. 9, after receiving the activity information from the second terminal device, the first terminal device displays the instant communication interface 902 shown in fig. 9, wherein the time-limited link entry page 903 is displayed on the instant communication interface 902. The time-limited link entry page 903 is used for the first terminal device to link to the activity page. Illustratively, as shown in fig. 9, the time-limited link entry page 903 includes an immediate join option and a later join option, for the first user to select to join the activity immediately or later.
As another example, assume that the first terminal device is currently displaying a second interface, where the second interface is different from the first interface. In this case, the first terminal device may display the time-limited link entry in the form of a notification message.
Illustratively, assuming that the second terminal device triggers the activity sharing of the first terminal device in the manner shown in fig. 4, 5 or 6, and the first terminal device displays the news interface 1001 shown in fig. 10, after receiving the activity information from the second terminal device, the first terminal device displays the interface 1002 shown in fig. 10, wherein a notification message 1003 including the time-limited link entry is displayed on the interface 1002. The notification message 1003 is used to notify the first user that the activity information is received and to present the time-limited link entry to the user. Illustratively, as shown in fig. 10, the notification message 1003 includes an immediate join option and a later join option, for the first user to select to join the activity immediately or later.
Or, for example, assuming that the second terminal device triggers the activity sharing of the first terminal device in the manner shown in fig. 7A, and the first terminal device displays the video call interface 1101 shown in fig. 11, after receiving the activity information from the second terminal device, the first terminal device displays the interface 1102 shown in fig. 11, wherein a notification message 1103 including the time-limited link entry is displayed on the interface 1102. The notification message 1103 is used to notify the first user that the activity information is received and to present the time-limited link entry to the user. Illustratively, as shown in fig. 11, the notification message 1103 includes an immediate join option and a later join option, for the first user to select to join the activity immediately or later.
It should be noted that the embodiment of the present application does not limit the specific time at which the first terminal device displays the activity card.
For example, in some examples, the first terminal device may save the activity information and display the activity card directly upon receiving the activity information from the second terminal device.
As another example, in other examples, the first terminal device may save the received activity information and display the activity card upon receiving an operation that the first user refuses to join the activity through the time-limited link entry, such as when the first user clicks a "later join" option in the time-limited link entry.
As another example, in other examples, the first terminal device may save the activity information and display the time-limited link entry upon receiving the activity information from the second terminal device, and display the activity card upon receiving an operation by the first user to refuse to join the activity through the time-limited link entry, such as the first user clicking on a "later join" option in the time-limited link entry.
As another example, in other examples, the first terminal device may save the activity information and display the time-limited link entry when receiving the activity information from the second terminal device, and display the activity card after a preset duration has elapsed since the activity information was received from the second terminal device.
As another example, in other examples, the first terminal device may save the received activity information and display the activity card after a preset duration has elapsed since the activity information was received from the second terminal device.
As another example, in other examples, the first terminal device may save the activity information and display the activity card a preset duration after receiving the first user's click on the "later join" option in the time-limited link entry.
As another example, in other examples, the first terminal device may save the activity information and display the time-limited link entry when the activity information is received from the second terminal device, and display the activity card a preset duration after receiving the first user's click on the "later join" option in the time-limited link entry. A schematic sketch of these card-display timing alternatives is given below.
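For illustration only, the timing alternatives listed above could be expressed as a configurable policy rather than fixed behavior; in the Java sketch below, CardDisplayPolicy, CardScheduler, and the callback names are hypothetical:

    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    // Hypothetical sketch of the card-display timing alternatives described above.
    final class CardScheduler {

        // Each constant corresponds to one of the examples above; names are illustrative.
        enum CardDisplayPolicy {
            ON_RECEIVE,          // show the card as soon as the activity information arrives
            ON_REFUSE,           // show the card when the user clicks "later join"
            DELAY_AFTER_RECEIVE, // show the card a preset duration after the information arrives
            DELAY_AFTER_REFUSE   // show the card a preset duration after the user clicks "later join"
        }

        private final ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();
        private final CardDisplayPolicy policy;
        private final long presetDelayMillis;
        private final Runnable showCard; // callback that actually renders the activity card

        CardScheduler(CardDisplayPolicy policy, long presetDelayMillis, Runnable showCard) {
            this.policy = policy;
            this.presetDelayMillis = presetDelayMillis;
            this.showCard = showCard;
        }

        void onActivityInfoReceived() {
            if (policy == CardDisplayPolicy.ON_RECEIVE) {
                showCard.run();
            } else if (policy == CardDisplayPolicy.DELAY_AFTER_RECEIVE) {
                timer.schedule(showCard, presetDelayMillis, TimeUnit.MILLISECONDS);
            }
        }

        void onLaterJoinClicked() {
            if (policy == CardDisplayPolicy.ON_REFUSE) {
                showCard.run();
            } else if (policy == CardDisplayPolicy.DELAY_AFTER_REFUSE) {
                timer.schedule(showCard, presetDelayMillis, TimeUnit.MILLISECONDS);
            }
        }
    }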
It will be appreciated that the first user may join the activity shared by the second terminal device through a time-limited link entry displayed by the first terminal device. For example, the first user may join the activity shared by the second terminal device by clicking on the immediate join option in the time-limited link entry page 803 shown in fig. 8. As another example, the first user may immediately join the activity shared by the second terminal device by clicking on the time-limited link entry page 903 shown in fig. 9. As another example, the first user may join the activity shared by the second terminal device by clicking on the immediate join option in notification message 1003 shown in fig. 10. As another example, the first user may join the activity shared by the second terminal device by clicking on the immediate join option in notification message 1103 shown in fig. 11.
Or the first user may join the activity shared by the second terminal device based on the activity card displayed by the first terminal device. For example, the first user may join the activity shared by the second terminal device at any time by clicking on the join activity option in the activity card 1200 shown in fig. 12. As another example, the first user may join the activity shared by the second terminal device at any time by clicking the join activity option in the activity card 1300-1 shown in fig. 13 (a). As another example, the first user may join the activity shared by the second terminal device at any time by clicking the join activity option in the activity card 1300-2 shown in fig. 13 (b).
For a specific description of the first terminal device joining the activity shared by the second terminal device through the time-limited link entry or the activity card, reference may be made to S304A or S304B.
S304A: In response to the operation of the first user joining the first activity through the time-limited link entry, the first terminal device opens the first activity page according to the time-limited link entry and performs the first activity synchronously with the second terminal device.
Illustratively, the first user joins the first activity through the time-limited link portal such as the first user clicking on the time-limited link portal page or the first user clicking on a first preset option on the time-limited link portal page.
For example, the first preset option is shown as an "immediate join" option on the time-limited link entry page 803 shown in fig. 8, an "immediate join" option on the time-limited link entry page 903 shown in fig. 9, an "immediate join" option on the notification message 1003 shown in fig. 10, or an "immediate join" option on the notification message 1103 shown in fig. 11. The operation of the first user to join the first activity through the time-limited link entry may be an operation of the first user clicking on the "join immediately" option shown in fig. 8, 9, 10 or 11.
Wherein, the activity information of the first activity indicated by the time-limited link entry is such as the first activity ID and the activity parameter of the first activity. The activity parameter is one or more of time information of sending activity information to the first terminal device by the second terminal device, a resource title, a resource address of the first activity, a playing progress, a playing speed, brightness, volume, playing state, interface layout and the like of the first activity. Based on the above, the first terminal device performs the first activity synchronously with the second terminal device according to the activity information indicated by the time-limited link entry, which specifically may include: the first terminal equipment enters an activity sharing group according to the activity ID indicated by the time-limited link entry, links to an activity page of the first activity according to the resource address indicated by the time-limited link entry, and synchronizes one or more of the playing progress, the playing speed, the brightness, the volume, the playing state, the interface layout and the like of the first activity with the second terminal equipment according to the activity parameter indicated by the time-limited link entry.
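For illustration only, the join-and-synchronize step just described might be sketched as follows (in Java; SharingGroupClient, Player, and their method names are hypothetical abstractions over the messaging service and the media player, not interfaces defined by this application):

    // Hypothetical sketch: join the activity sharing group, open the shared resource,
    // and synchronize the remaining activity parameters with the calling end.
    public final class ActivityJoiner {

        interface SharingGroupClient { void joinRoom(String activityId); }
        interface Player {
            void open(String resourceAddress);
            void seekTo(long progressMillis);
            void setSpeed(double speed);
            void setVolume(int volume);
            void setBrightness(int brightness);
            void setPlaying(boolean playing);
        }

        private final SharingGroupClient group;
        private final Player player;

        public ActivityJoiner(SharingGroupClient group, Player player) {
            this.group = group;
            this.player = player;
        }

        public void join(String activityId, String resourceAddress, long progressMillis,
                         double speed, int volume, int brightness, boolean playing) {
            group.joinRoom(activityId);       // enter the activity sharing group by activity ID
            player.open(resourceAddress);     // link to the activity page via the resource address
            player.seekTo(progressMillis);    // then synchronize the remaining parameters
            player.setSpeed(speed);
            player.setVolume(volume);
            player.setBrightness(brightness);
            player.setPlaying(playing);
        }
    }

The same logic applies to the card-based path in S304B, with the parameters read from the activity information stored with the activity card instead of from the time-limited link entry.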
S304B: and responding to the operation of the first user for joining the first activity through the activity card, and synchronously carrying out the first activity with the second terminal equipment by the first terminal equipment according to the activity information corresponding to the activity card.
The first user may join the first activity through the activity card, for example, the first user clicking on the activity card, or the first user clicking on a second preset option on the activity card.
For example, the second preset option is the "join activity" option on the activity card 1200 shown in fig. 12 or the "join activity" option on the activity card 1300 shown in fig. 13. The operation of the first user to join the first activity through the activity card may be an operation of the first user clicking on the "join activity" option shown in fig. 12 or fig. 13.
Wherein, the activity information of the first activity may include the first activity ID and the activity parameters of the first activity, etc. The activity parameter is one or more of time information of sending activity information to the first terminal device by the second terminal device, a resource title, a resource address of the first activity, a playing progress, a playing speed, brightness, volume, playing state, interface layout and the like of the first activity. Based on the above, the first terminal device performs the first activity synchronously with the second terminal device according to the activity information corresponding to the activity card, which specifically may include: the first terminal equipment enters an activity sharing group according to the activity ID corresponding to the activity card, links to an activity page of the first activity according to the resource address corresponding to the activity card, and synchronizes one or more of the playing progress, the playing speed, the brightness, the volume, the playing state, the interface layout and the like of the first activity with the second terminal equipment according to other activity parameters corresponding to the activity card.
It should be noted that, as one possible implementation of the scenario shown in fig. 3B, such as in the scenarios shown in fig. 4, fig. 5, or fig. 6, the second terminal device may, during the process of performing the first activity, send the activity information of the first activity to the first terminal device in response to an operation, initiated by the user on the first activity interface, of sharing the first activity to the first terminal device.
As another possible implementation scenario, as in the scenario shown in fig. 7A or fig. 7B, the second terminal device may send the activity information of the first activity to the first terminal device in response to an operation initiated by the user on the instant communication application interface to share the first activity to the first terminal device.
For the scenario shown in fig. 7A or fig. 7B, as shown in fig. 3C, when detecting that the user initiates an operation of activity sharing on the instant communication application interface, the second terminal device acquires an ongoing audio/video activity. Wherein the ongoing audio-visual activity comprises an ongoing audio activity and/or an ongoing video activity. The ongoing audio-visual activity includes a first activity. Further, as shown in fig. 3C, upon receiving an operation of sharing the first activity by the second user to the first terminal device, the second terminal device performs S302. And the first terminal device performs S303B, and performs S304A upon receiving an operation of the first user joining the first activity through the time-limited link entry, or performs S304B upon receiving an operation of the first user joining the first activity through the activity card.
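As an illustrative sketch only (OngoingActivity, MediaRegistry, and ActivityPicker are hypothetical types, not components defined by this application), the step in fig. 3C in which the calling end gathers its ongoing audio/video activities for the user to choose from could look like this in Java:

    import java.util.List;

    // Hypothetical sketch: when the user initiates sharing on the instant communication
    // interface, collect the ongoing audio/video activities and show them for selection.
    public final class ShareInitiationHandler {

        record OngoingActivity(String activityId, String title, boolean isVideo) {}
        interface MediaRegistry { List<OngoingActivity> listOngoingAudioVideoActivities(); }
        interface ActivityPicker { void show(List<OngoingActivity> candidates); }

        private final MediaRegistry registry;
        private final ActivityPicker picker;

        public ShareInitiationHandler(MediaRegistry registry, ActivityPicker picker) {
            this.registry = registry;
            this.picker = picker;
        }

        /** Called when the user taps the activity sharing control on the call or chat interface. */
        public void onShareInitiated() {
            List<OngoingActivity> ongoing = registry.listOngoingAudioVideoActivities();
            picker.show(ongoing); // e.g. the list on interface 703 in fig. 7A, including video activity a
        }
    }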
Optionally, if the activity sharing of the first activity has ended and the first user initiates an operation of joining the activity through the activity card, the first terminal device may prompt the first user that the activity sharing has ended, that the activity card has expired, or the like.
Optionally, in the process in which the first terminal device and the second terminal device synchronously perform the first activity, the second terminal device may further send the activity information of the first activity to other terminal devices according to an operation of the second user, so as to invite other users to perform the first activity together synchronously.
Further optionally, when the number of terminal devices joining the first activity exceeds a preset number, the second terminal device may further display a reminder to remind the user that the number of members in the activity sharing group has reached the upper limit.
Further, in the process of synchronizing the first activity between the first terminal device and the second terminal device, if the second user adjusts one or more of the playing progress, the playing speed, the brightness, the volume, the playing state, the interface layout and the like for the first activity on the second terminal device, the second terminal device may synchronize the activity parameters with the first terminal device, so as to ensure the synchronization and/or the consistency of the playing effect when the second terminal device and the first terminal device perform the first activity.
As shown in fig. 14, a method for joining a multimedia activity provided in an embodiment of the present application may further include S1401-S1403:
S1401: The second terminal device receives an adjustment of an activity parameter of the first activity by the second user.
Wherein the activity parameters of the first activity are one or more of the resource address, playing progress, playing speed, brightness, volume, playing state, interface layout, etc. of the first activity. The second user may adjust the activity parameters of the first activity by, for example, one or more of changing the resource, changing the episode, adjusting the resource playing progress, adjusting the resource playing speed, adjusting the interface brightness, adjusting the playing volume, pausing/playing, adjusting the interface layout, etc.
S1402: the second terminal device sends the adjusted activity parameters of the first activity and the current moment to the first terminal device.
The current time here refers to the time at which the second terminal device sends the adjusted activity parameters of the first activity to the first terminal device, and is referred to as the first time.
S1403: and the first terminal equipment performs play calibration of the first activity according to the adjusted activity parameters of the first activity.
Taking as an example the case where the first terminal device performs calibration of the playing progress of the first activity according to the adjusted playing progress, the playing speed, and the current time information sent by the second terminal device, the first terminal device may determine the calibrated playing progress of the first activity by using the following formula: calibrated playing progress of the first activity = adjusted playing progress of the first activity + (second time − first time) × playing speed. The second time is the time at which the first terminal device performs the playing calibration of the first activity.
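For illustration only, a minimal Java sketch of this calibration (class and parameter names are hypothetical; times are assumed to be in milliseconds and the playing speed is a multiplier such as 1.0 or 2.0) is:

    // Hypothetical sketch of the play-progress calibration formula described above.
    public final class PlayProgressCalibrator {

        /**
         * calibratedProgress = adjustedProgress + (secondTime - firstTime) * playSpeed
         *
         * @param adjustedProgressMillis playing progress sent by the second terminal device
         * @param firstTimeMillis        time at which the second terminal device sent the parameters
         * @param secondTimeMillis       time at which the first terminal device performs the calibration
         * @param playSpeed              current playing speed
         */
        public static long calibrate(long adjustedProgressMillis, long firstTimeMillis,
                                     long secondTimeMillis, double playSpeed) {
            long elapsed = secondTimeMillis - firstTimeMillis; // transmission and processing delay
            return adjustedProgressMillis + Math.round(elapsed * playSpeed);
        }

        public static void main(String[] args) {
            // Example: the calling end reports progress 10 min at speed 2x; 3 s elapse before calibration.
            long calibrated = calibrate(600_000, 1_000, 4_000, 2.0);
            System.out.println(calibrated); // 606000, i.e. 10 min 6 s
        }
    }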
In some embodiments, the authority to make activity parameter adjustments and synchronize to other terminal devices may be controlled by the calling end, as shown in fig. 14.
In other embodiments, all terminal devices joining the first activity (including the first terminal device) have the right to make activity parameter adjustments and synchronize to other terminal devices.
Further optionally, in the process in which the first terminal device and the second terminal device synchronously perform the first activity, if the second user posts a comment (such as a bullet screen comment) on the second terminal device for the first activity, the second terminal device may also send the comment to the other terminal devices (including the first terminal device) that have joined the first activity; or, if the first user posts a comment (such as a bullet screen comment) on the first terminal device for the first activity, the first terminal device may also send the comment to the other terminal devices (including the second terminal device) that have joined the first activity. Further, as shown in fig. 15, each terminal device may display, on the first activity interface, the bullet screen comments of the users of the plurality of terminal devices (e.g., the first terminal device, the second terminal device, the third terminal device, and the fourth terminal device) that have joined the first activity. Alternatively, the comments made about the first activity by the users of the plurality of terminal devices that have joined the first activity may also be displayed on the first activity interface in other forms, which is not limited by the embodiment of the present application.
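As a schematic sketch only (PeerChannel and the method names are hypothetical abstractions over the instant communication channel), forwarding such a comment to every other device that has joined the activity might look like this:

    import java.util.List;

    // Hypothetical sketch: broadcast a comment (e.g. a bullet screen comment) posted on one
    // terminal device to every other terminal device that has joined the first activity.
    public final class CommentBroadcaster {

        interface PeerChannel {
            String deviceId();
            void sendComment(String author, String text);
        }

        private final List<PeerChannel> joinedDevices;

        public CommentBroadcaster(List<PeerChannel> joinedDevices) {
            this.joinedDevices = joinedDevices;
        }

        /** Sends the comment to every joined device except the one that posted it. */
        public void broadcast(String fromDeviceId, String author, String text) {
            for (PeerChannel peer : joinedDevices) {
                if (!peer.deviceId().equals(fromDeviceId)) {
                    peer.sendComment(author, text);
                }
            }
        }
    }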
Taking as an example the case in which the terminal device provides the activity card service through an FA, the method for joining a multimedia activity provided by the embodiment of the present application may be applied to the network architecture shown in fig. 16. As shown in fig. 16, an instant communication application, an audio application, and a video application are installed in each of the first terminal device and the second terminal device. The first terminal device also supports the FA card service.
The instant messaging application has an activity sharing function. Illustratively, the instant messaging application may be any instant messaging or calling application that supports the activity sharing function.
As shown in fig. 16, the instant communication applications installed in the first terminal device and the second terminal device may each include a call module and a real-time clock (RTC) module. The call module is used for carrying out instant communication with other devices. The RTC module is used for providing an accurate real-time clock so as to ensure low latency of instant messaging.
As shown in fig. 16, an activity sharing software development kit (software development kit, SDK) is also integrated into the instant messaging application installed in the first terminal device and the second terminal device. The activity sharing SDK is used for supporting an activity sharing function of the instant messaging application. For example, for the second terminal device, the activity sharing SDK integrated in the instant messaging application is configured to be responsible for acquiring the multimedia activity information being executed in the second terminal device in response to the operation of initiating the activity sharing by the user, and acquiring the activity parameters of the multimedia activity set/modified by the user. For another example, for the first terminal device, the activity sharing SDK integrated in the instant messaging application is configured to be responsible for allocating an activity resource playing task to a corresponding audio application or video application in response to an operation of a user to join in an activity; or sending the activity parameters to the audio application or the video application to synchronize one or more parameters of playing progress, playing speed, brightness, volume, playing state, interface layout, etc. For another example, for the first terminal device, the activity sharing SDK integrated in the instant messaging application is configured to be responsible for temporarily storing the corresponding activity information in the first terminal device when the activity information is received, for example, in the form of an FA card.
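For illustration only, the division of responsibilities described above might be summarized by an interface of the following shape (a hypothetical Java sketch, not the API of an actual activity sharing SDK):

    import java.util.Map;

    // Hypothetical sketch of the responsibilities an activity sharing SDK might expose.
    public interface ActivitySharingSdk {

        /** Calling side: collect the information of the ongoing multimedia activity (title, address, progress, speed, ...). */
        Map<String, Object> captureOngoingActivity();

        /** Called side: assign the resource playback task to the matching audio or video application. */
        void dispatchToPlayer(Map<String, Object> activityInfo);

        /** Called side: forward adjusted parameters (progress, speed, brightness, volume, ...) to the player. */
        void syncParameters(Map<String, Object> activityParameters);

        /** Called side: temporarily store the received activity information, for example as an FA card. */
        void storeAsCard(Map<String, Object> activityInfo);
    }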
The instant messaging application installed in the first terminal device and the instant messaging application installed in the second terminal device may perform instant messaging based on a wireless communication technology. For example, as shown in fig. 16, the instant messaging application installed in the first terminal device and the instant messaging application installed in the second terminal device may perform instant messaging, based on a wireless communication technology, through a server corresponding to the instant messaging application.
The FA card service is used for temporarily storing the activity information received by the instant messaging application in the form of an FA card.
The audio application is responsible for audio playback. Illustratively, the audio application is, for example, a music player or a streaming music application.
As shown in fig. 16, the audio applications installed in the first terminal device and the second terminal device include an audio playing module, and an activity sharing SDK is also integrated into the audio application. The audio playing module is responsible for audio playing; the activity sharing SDK is used to support the activity sharing function of the audio application. For example, for the second terminal device, the activity sharing SDK integrated in the audio application is used to support the audio application in sending the activity information of the ongoing audio activity, such as the title, address, playing progress, playing speed, brightness, volume, playing state, interface layout, etc. of the audio activity, to the instant messaging application. For another example, for the first terminal device, the activity sharing SDK integrated in the audio application is used to support the audio application in opening, in response to the operation of the user joining the activity through the time-limited link entry, the audio playing interface corresponding to the time-limited link entry; or in opening, in response to the operation of the user joining the activity through the activity card (such as the FA card), the audio playing interface corresponding to the activity card.
The video application is responsible for video playback, such as TV series playback, movie playback, and live video playback. Illustratively, the video application may be any video playing or live streaming application.
As shown in fig. 16, the video applications installed in the first terminal device and the second terminal device include a video playing module, and an activity sharing SDK is also integrated into the video application. The video playing module is used for playing video; the activity sharing SDK is used to support the activity sharing functionality of the video application. For example, for the second terminal device, the activity sharing SDK integrated in the video application is used to support the video application in sending the activity information of the ongoing video activity, such as the title, address, playing progress, playing speed, brightness, volume, playing state, interface layout, etc. of the video activity, to the instant messaging application. For another example, for the first terminal device, the activity sharing SDK integrated in the video application is used to support the video application in opening, in response to the operation of the user joining the activity through the time-limited link entry, the video playing interface corresponding to the time-limited link entry; or in opening, in response to the operation of the user joining the activity through the activity card (such as the FA card), the video playing interface corresponding to the activity card.
Taking the network architecture shown in fig. 16 as an example, please refer to fig. 17. Taking the first activity as a video activity as an example, fig. 17 shows an interaction diagram of a method for joining a multimedia activity according to an embodiment of the present application. As shown in fig. 17, the method for joining a multimedia activity according to the embodiment of the present application may include S1701-S1705 and S1706-1 to S1708-1:
S1701: The video application in the second terminal device performs the first activity.
S1702: the video application in the second terminal device receives an operation that the second user shares the first activity to the first terminal device through the instant messaging application.
S1703: an instant messaging application in the second terminal device obtains activity information of the first activity.
S1704: the instant messaging application in the second terminal device sends the activity information of the first activity to the instant messaging application of the first terminal device through the server.
S1705: the instant messaging application in the first terminal device displays a time-limited link entry.
S1706-1: an instant messaging application in a first terminal device receives an operation of a first user joining a first activity through a time-limited link entry.
S1707-1: the instant messaging application in the first terminal device invokes the video application in the first terminal device.
S1708-1: the video application in the first terminal device performs the first activity in synchronization with the video application in the second terminal device.
Or as shown in fig. 17, a method for joining a multimedia event according to an embodiment of the present application may include the steps S1701-S1705 described above, and the following S1706-2 to S1711-2:
s1706-2: an instant messaging application in a first terminal device receives an operation that a first user refuses to join a first activity through a time-limited link entry.
S1707-2: the FA card service in the first terminal device obtains activity information of a first activity from the instant communication application.
S1708-2: the FA card service in the first terminal device generates a FA card (i.e., an active card).
S1709-2: the FA card service in the first terminal equipment receives the operation that the first user joins the first activity through the FA card.
S1710-2: the FA card service in the first terminal device invokes the video application in the first terminal device.
S1711-2: the video application in the first terminal device performs the first activity in synchronization with the video application in the second terminal device.
It should be noted that fig. 17 only takes as an example the case where the first terminal device displays the FA card after receiving an operation in which the first user refuses to join the first activity through the time-limited link entry. In some embodiments, the first terminal device may also display the FA card directly upon receiving the activity information from the second terminal device. Or, in other embodiments, the first terminal device may display the FA card a preset duration after receiving the activity information from the second terminal device. The embodiment of the present application does not limit the specific time at which the first terminal device displays the activity card. A schematic sketch of the card-based branch in fig. 17 is given below.
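For illustration only (MessagingApp, FaCardService, and VideoApp below are hypothetical stand-ins for the components in fig. 16, not real APIs), the card-based branch of fig. 17 could be sketched in Java as:

    import java.util.Map;

    // Hypothetical sketch of the card-based branch in fig. 17 (S1706-2 to S1711-2).
    public final class FaCardFlow {

        interface MessagingApp { Map<String, Object> getActivityInfo(); }
        interface FaCardService {
            void generateCard(Map<String, Object> activityInfo);
            Map<String, Object> readCard();
        }
        interface VideoApp { void joinAndSync(Map<String, Object> activityInfo); }

        private final MessagingApp messagingApp;
        private final FaCardService cardService;
        private final VideoApp videoApp;

        public FaCardFlow(MessagingApp messagingApp, FaCardService cardService, VideoApp videoApp) {
            this.messagingApp = messagingApp;
            this.cardService = cardService;
            this.videoApp = videoApp;
        }

        /** The user refuses to join through the time-limited link entry: fetch the info and generate a card. */
        public void onRefusedViaLinkEntry() {
            cardService.generateCard(messagingApp.getActivityInfo()); // S1707-2, S1708-2
        }

        /** The user later joins through the FA card: invoke the video application and synchronize. */
        public void onJoinViaCard() {
            videoApp.joinAndSync(cardService.readCard()); // S1709-2 to S1711-2
        }
    }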
It can be understood that, according to the method for joining a multimedia activity provided by the embodiment of the present application, after the first terminal device (i.e., the called end) receives the activity information about the first activity shared by the second terminal device (i.e., the calling end), the first terminal device may not only display the corresponding time-limited link entry, but also store the activity information locally and always display the activity card corresponding to the first activity according to the stored activity information. Based on this, the user of the first terminal device (such as the first user) can not only join the ongoing activity of the second terminal device by clicking the time-limited link entry, but also join the ongoing activity of the second terminal device through the activity card at any time when the user can no longer join through the time-limited link entry because it has timed out. The method can avoid the problem that the first user cannot join an activity shared by another user because the first user did not join through the time-limited link entry in time and the entry has timed out; it supports the first user in joining the activity at any time and improves user experience.
It is to be understood that the various aspects of the embodiments of the application may be used in any reasonable combination, and that the explanation or illustration of the various terms presented in the embodiments may be referred to or explained in the various embodiments without limitation.
It should also be understood that, in various embodiments of the present application, the sequence numbers of the foregoing processes do not mean the order of execution, and the order of execution of the processes should be determined by the functions and internal logic thereof, and should not constitute any limitation on the implementation process of the embodiments of the present application.
It will be appreciated that, in order to implement the functions of any of the above embodiments, the terminal device includes corresponding hardware structures and/or software modules that perform the respective functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application can divide the functional modules of the terminal equipment, for example, each functional module can be divided corresponding to each function, and two or more functions can be integrated in one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
It should also be understood that each module in the terminal device may be implemented in software and/or hardware, which is not particularly limited. In other words, the terminal device is presented in the form of a functional module. A "module" herein may refer to an application specific integrated circuit ASIC, an electronic circuit, a processor and memory that execute one or more software or firmware programs, an integrated logic circuit, and/or other devices that can provide the described functionality.
In an alternative, when data transmission is implemented using software, it may be implemented wholly or partly in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are fully or partially implemented. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or a wireless manner (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by the computer, or a data storage device such as a server or a data center that integrates one or more available media. The available media may be magnetic media (e.g., a floppy disk, a hard disk, or a magnetic tape), optical media (e.g., a digital versatile disc (DVD)), or semiconductor media (e.g., a solid state disk (SSD)), etc.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware or in software instructions executed by a processor. The software instructions may consist of corresponding software modules that may be stored in RAM, flash memory, ROM, EPROM, EEPROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. In addition, the ASIC may reside in an electronic device. The processor and the storage medium may also reside as discrete components in a terminal device.
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of functional modules is illustrated, and in practical application, the above-described functional allocation may be implemented by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to implement all or part of the functions described above.

Claims (25)

1. A method of joining a multimedia campaign, the method comprising:
the first terminal device receives activity information of a first activity from the second terminal device;
the first terminal device stores the activity information, wherein the activity information is used for being linked to an activity page of the first activity which is ongoing on the first terminal device;
the first terminal device displays the activity card of the first activity;
After receiving the operation of joining the first activity by the user through the activity card, the first terminal equipment synchronously performs the first activity with the second terminal equipment through the activity card.
2. The method of claim 1, wherein the first terminal device displaying the activity card of the first activity comprises:
And the first terminal equipment displays the activity card of the first activity at a preset position according to the setting of the user.
3. The method of claim 2, wherein the preset locations comprise one or more of: desktop, negative one screen, notification bar.
4. A method according to any of claims 1-3, characterized in that the activity information is sent by the second terminal device to the first terminal device via a first application; the method further comprises the steps of:
after receiving the activity information, the first terminal device displays a time-limited link entry of the first activity on an application interface of the first application;
The time-limited link entry supports the user in joining the first activity through the time-limited link entry within a preset time period after the first terminal device receives the activity information, and does not support the user in joining the first activity through the time-limited link entry after the preset time period has elapsed since the first terminal device received the activity information.
5. The method of claim 4, wherein the page on which the time-limited link entry is located includes an option to later join the first activity, and the first terminal device displaying the activity card of the first activity comprises:
and when receiving an operation of the user clicking the option to later join the first activity, the first terminal device displays the activity card of the first activity.
6. The method of claim 4, wherein the first terminal device displaying the activity card of the first activity comprises:
and after a preset duration has elapsed since the first terminal device received the activity information, the first terminal device displays the activity card of the first activity.
7. The method of claim 4, wherein the first terminal device displaying the activity card of the first activity comprises:
and when the time-limited link entry message disappears, the first terminal device displays the activity card of the first activity.
8. The method according to any one of claims 4 to 7, wherein,
And the user joins the first activity through the activity card after a preset duration has elapsed since the first terminal device received the activity information.
9. The method according to any of claims 1-8, wherein during a process in which the first terminal device performs the first activity synchronously with the second terminal device through the activity card, the method further comprises:
The first terminal device receives an activity parameter of the adjusted first activity from the second terminal device, the activity parameter comprising one or more of: play progress, play speed, brightness, volume, play status, interface layout;
The first terminal device updates an activity parameter of the first activity.
10. The method according to any of claims 1-8, wherein during a process in which the first terminal device performs the first activity synchronously with the second terminal device through the activity card, the method further comprises:
in response to a user adjustment of an activity parameter of the first activity, the first terminal device sends the adjusted activity parameter of the first activity to the second terminal device, the activity parameter comprising one or more of: play progress, play speed, brightness, volume, play status, interface layout.
11. The method of any of claims 1-10, wherein the first activity comprises an audio activity or a video activity.
12. A terminal device, characterized in that the terminal device comprises:
A communication unit for receiving activity information of a first activity from a second terminal device;
a storage unit for storing the activity information for linking to an activity page of the first activity ongoing on the first terminal device;
A display unit for displaying the activity card of the first activity;
and the processing unit is used for performing the first activity synchronously with the second terminal device through the activity card after the terminal device receives the operation of the user joining the first activity through the activity card.
13. The terminal device of claim 12, wherein the display unit displays an activity card of the first activity, comprising:
the display unit displays the activity card of the first activity at a preset position according to the setting of the user.
14. The terminal device of claim 13, wherein the preset locations include one or more of: desktop, negative one screen, notification bar.
15. The terminal device according to any of claims 12-14, wherein the activity information is sent by the second terminal device to the first terminal device via a first application; the display unit is further configured to:
Displaying a time-limited link entry of the first activity on an application interface of the first application after the communication unit receives the activity information;
The time-limited link entry supports the user in joining the first activity through the time-limited link entry within a preset time period after the first terminal device receives the activity information, and does not support the user in joining the first activity through the time-limited link entry after the preset time period has elapsed since the first terminal device received the activity information.
16. The terminal device according to claim 15, wherein the page on which the time-limited link entry is located includes an option to later join the first activity, and the display unit displaying the activity card of the first activity comprises:
And when the terminal equipment receives an operation of clicking an option for later joining the first activity by a user, the display unit displays an activity card of the first activity.
17. The terminal device of claim 15, wherein the display unit displays an activity card of the first activity, comprising:
and after a preset duration has elapsed since the first terminal device received the activity information, the display unit displays the activity card of the first activity.
18. The terminal device of claim 15, wherein the display unit displays an activity card of the first activity, comprising:
When the time-limited link entry message disappears, the display unit displays an activity card of the first activity.
19. Terminal device according to any of the claims 15-18, characterized in that,
And the user joins the first activity through the activity card after a preset duration has elapsed since the first terminal device received the activity information.
20. The terminal device according to any one of claims 15 to 18, wherein, in the process in which the processing unit performs the first activity in synchronization with the second terminal device through the activity card, the communication unit is further configured to: receive adjusted activity parameters of the first activity from the second terminal device, the activity parameters comprising one or more of the following: playback progress, playback speed, brightness, volume, playback status, and interface layout;
and the processing unit is further configured to: update the activity parameters of the first activity according to the adjusted activity parameters of the first activity.
21. The terminal device according to any one of claims 12 to 19, wherein, in the process in which the processing unit performs the first activity in synchronization with the second terminal device through the activity card, the communication unit is further configured to:
in response to an adjustment by the user to an activity parameter of the first activity, send the adjusted activity parameter of the first activity to the second terminal device, the activity parameter comprising one or more of the following: playback progress, playback speed, brightness, volume, playback status, and interface layout.
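The two preceding claims describe parameter synchronization in both directions during a joint activity. A rough Kotlin sketch of that exchange follows; the parameter fields and method names are assumptions chosen to match the listed parameters.

// Parameters of the ongoing first activity; field names are assumptions.
data class ActivityParams(
    val progressMs: Long,      // playback progress
    val speed: Float,          // playback speed
    val brightness: Int,
    val volume: Int,
    val playing: Boolean,      // playback status
    val layout: String         // interface layout
)

class ActivitySession(private var params: ActivityParams) {
    // Apply parameters adjusted on the sharing (second) terminal device.
    fun onRemoteAdjustment(adjusted: ActivityParams) {
        params = adjusted
    }

    // The local user adjusts a parameter; apply it and forward the result to the sharing device.
    fun onLocalAdjustment(adjust: (ActivityParams) -> ActivityParams, send: (ActivityParams) -> Unit) {
        params = adjust(params)
        send(params)   // e.g. forwarded through the communication unit
    }

    fun current(): ActivityParams = params
}

onRemoteAdjustment corresponds to applying parameters adjusted on the sharing device, while onLocalAdjustment applies a local change and sends the adjusted parameters back.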
22. The terminal device according to any one of claims 12 to 21, wherein the first activity comprises an audio activity or a video activity.
23. A terminal device, characterized in that the terminal device comprises:
a communication interface, configured to send and receive signals;
a memory, configured to store computer program instructions;
a display, configured to display an interface; and
a processor, configured to execute the instructions to cause the terminal device to perform the method according to any one of claims 1 to 11.
24. A computer-readable storage medium having computer-executable instructions stored thereon, wherein the computer-executable instructions, when executed by a processing circuit, implement the method according to any one of claims 1 to 11.
25. A chip system, characterized in that the chip system comprises a processing circuit and a storage medium, wherein the storage medium stores instructions; and when the instructions are executed by the processing circuit, the method according to any one of claims 1 to 11 is implemented.
CN202211344827.5A 2022-10-31 2022-10-31 Method and equipment for joining multimedia activities Pending CN117955950A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211344827.5A CN117955950A (en) 2022-10-31 2022-10-31 Method and equipment for joining multimedia activities

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211344827.5A CN117955950A (en) 2022-10-31 2022-10-31 Method and equipment for joining multimedia activities

Publications (1)

Publication Number Publication Date
CN117955950A true CN117955950A (en) 2024-04-30

Family

ID=90800600

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211344827.5A Pending CN117955950A (en) 2022-10-31 2022-10-31 Method and equipment for joining multimedia activities

Country Status (1)

Country Link
CN (1) CN117955950A (en)

Similar Documents

Publication Publication Date Title
CN113553014B (en) Application interface display method under multi-window screen projection scene and electronic equipment
WO2020221039A1 (en) Screen projection method, electronic device and screen projection system
CN110572431A (en) Card sharing method, device and system
CN113556598A (en) Multi-window screen projection method and electronic equipment
CN114040242B (en) Screen projection method, electronic equipment and storage medium
EP4060475A1 (en) Multi-screen cooperation method and system, and electronic device
CN112527174B (en) Information processing method and electronic equipment
CN116360725B (en) Display interaction system, display method and device
WO2020238759A1 (en) Interface display method and electronic device
CN112527222A (en) Information processing method and electronic equipment
WO2022042769A2 (en) Multi-screen interaction system and method, apparatus, and medium
JP2023547414A (en) Display methods and electronic devices
WO2022135163A1 (en) Screen projection display method and electronic device
CN115756268A (en) Cross-device interaction method and device, screen projection system and terminal
CN115426521A (en) Method, electronic device, medium, and program product for screen capture
EP4254927A1 (en) Photographing method and electronic device
WO2022127670A1 (en) Call method and system, and related device
WO2021042881A1 (en) Message notification method and electronic device
CN117955950A (en) Method and equipment for joining multimedia activities
CN114079691A (en) Equipment identification method and related device
US20240086035A1 (en) Display Method and Electronic Device
WO2024067169A1 (en) Information processing method and electronic device
WO2022268009A1 (en) Screen sharing method and related device
WO2022267786A1 (en) Shortcut icon display method and terminal device
WO2024140757A1 (en) Cross-device screen splitting method and related apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination