CN109640000B - Rich media communication method, terminal equipment and computer readable storage medium

Info

Publication number
CN109640000B
Authority
CN
China
Prior art keywords
pictures
target file
terminal device
module
synthesizing
Prior art date
Legal status
Active
Application number
CN201811458046.2A
Other languages
Chinese (zh)
Other versions
CN109640000A (en)
Inventor
颜铁芳
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201811458046.2A
Publication of CN109640000A
Application granted
Publication of CN109640000B
Legal status: Active

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 - Details of television systems
    • H04N 5/222 - Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262 - Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N 5/265 - Mixing
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 51/00 - User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L 51/04 - Real-time or near real-time messaging, e.g. instant messaging [IM]
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 - Protocols
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 - Protocols
    • H04L 67/06 - Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L 67/14 - Session management
    • H04L 67/141 - Setup of application sessions

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The embodiment of the invention discloses a rich media communication method and a terminal device, relates to the field of communication technology, and aims to solve the problem that establishing a separate RCS (rich media communication) session for each picture to be uploaded to a server increases the traffic consumption of the terminal device and the network burden of the communication system. The method includes: determining N pictures to be sent, where N is an integer greater than or equal to 2; synthesizing the N pictures into a target file, where the target file is either a moving picture or a video; and sending a rich media communication message, which includes the target file, to a target device. The method can be applied to a scenario in which multiple pictures are uploaded to a server based on an RCS session.

Description

Rich media communication method, terminal equipment and computer readable storage medium
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to a rich media communication method and terminal equipment.
Background
Rich media communication (RCS) is a converged communication service that builds iteratively on existing standards and integrates multiple service functions such as file transfer, instant messaging, content sharing, address book, message storage, location services, and presence.
Currently, according to the RCS standard protocol, each RCS session that is established can be used to upload one picture to the server. For example, if a user wants to upload a picture to the server, the user may trigger the terminal device to establish an RCS session with the server, and the terminal device may then upload the picture to the server based on that RCS session.
However, with this method, if the user wants to upload multiple pictures to the server, an RCS session may need to be established for each of the multiple pictures, so that the terminal device uploads the multiple pictures based on multiple RCS sessions, which may increase the traffic consumption of the terminal device and the network load of the communication system.
Disclosure of Invention
The embodiment of the invention provides a rich media communication method and a terminal device, and aims to solve the problem that establishing a separate RCS (rich media communication) session for each picture to be uploaded to a server increases the traffic consumption of the terminal device and the network burden of the communication system.
In order to solve the above technical problem, the embodiment of the present invention is implemented as follows:
in a first aspect, an embodiment of the present invention provides a rich media communication method, where the method includes: determining N pictures to be sent, wherein N is an integer greater than or equal to 2; synthesizing the N pictures into a target file, wherein the target file comprises any one of the following items: a moving picture or video; sending a rich media communication message to the target device, the rich media communication message including the target file.
In a second aspect, an embodiment of the present invention provides a terminal device, where the terminal device includes a determining module, a synthesizing module, and a sending module. The determining module is configured to determine N pictures to be sent, where N is an integer greater than or equal to 2; the synthesizing module is configured to synthesize the N pictures determined by the determining module into a target file, where the target file includes any one of the following: a moving picture or a video; and the sending module is configured to send a rich media communication message to a target device, where the rich media communication message includes the target file synthesized by the synthesizing module.
In a third aspect, an embodiment of the present invention provides a terminal device, which includes a processor, a memory, and a computer program stored on the memory and operable on the processor, where the computer program, when executed by the processor, implements the steps of the rich media communication method provided in the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the rich media communication method provided in the first aspect.
In the embodiment of the present invention, N pictures to be sent may be determined (N being an integer greater than or equal to 2); the N pictures may be synthesized into a target file (the target file being either a dynamic picture or a video); and a rich media communication message including the target file may be sent to a target device. With this scheme, the terminal device can synthesize the N pictures to be sent into one target file, so that the terminal device only needs to establish one RCS session with the server and upload the synthesized target file based on that RCS session, instead of establishing a separate RCS session for each of the N pictures, thereby reducing the traffic consumption of the terminal device and the network burden of the communication system.
Drawings
Fig. 1 is a schematic architecture diagram of a communication system according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating a rich media communication method according to an embodiment of the present invention;
FIG. 3 is a second schematic diagram of a rich media communication method according to an embodiment of the invention;
FIG. 4 is a third schematic diagram of a rich media communication method according to an embodiment of the present invention;
FIG. 5 is a fourth schematic diagram illustrating a rich media communication method according to an embodiment of the invention;
FIG. 6 is a fifth exemplary diagram illustrating a rich media communication method according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 8 is a second schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 9 is a third schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 10 is a fourth schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 11 is a hardware schematic diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The term "and/or" herein is an association relationship describing an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone.
The terms "first" and "second," and the like, in the description and in the claims of the present invention are used for distinguishing between different objects and not for describing a particular order of the objects. For example, a first control and a second control, etc. are used to distinguish between different controls, rather than to describe a particular order of controls.
In the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate an example, illustration, or description. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention should not be construed as being preferred or advantageous over other embodiments or designs. Rather, use of the words "exemplary" or "for example" is intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present invention, unless otherwise specified, "a plurality" means two or more, for example, a plurality of elements means two or more elements, and the like.
The embodiment of the invention provides a rich media communication method and a terminal device, which can determine N (N being an integer greater than or equal to 2) pictures to be sent; synthesize the N pictures into a target file (the target file being either a dynamic picture or a video); and send a rich media communication message, including the target file, to a target device. With this scheme, the terminal device can synthesize the N pictures to be sent into one target file, so that the terminal device only needs to establish one RCS session with the server and upload the synthesized target file based on that RCS session, instead of establishing a separate RCS session for each of the N pictures, thereby reducing the traffic consumption of the terminal device and the network burden of the communication system.
The rich media communication method and the terminal equipment provided by the embodiment of the invention can be applied to an RCS system. The method and the device can be particularly applied to a scene of uploading a plurality of pictures to a server based on the RCS session.
Fig. 1 illustrates an architecture diagram of a communication system according to an embodiment of the present invention. As shown in fig. 1, the communication system may include a UE 01, at least one UE 02, and a server 03. The UE 01 and the server 03 can establish connection and communication therebetween, and at least one UE 02 and the server 03 can establish connection and communication therebetween.
In the embodiment of the present invention, the connections between the UE 01 and the server 03, and between the at least one UE 02 and the server 03, shown in fig. 1 are illustrated by taking wired connections as an example, which does not limit the embodiment of the present invention. It is understood that, in actual implementation, the UE 01 and the server 03, and the at least one UE 02 and the server 03, may also be connected wirelessly, which may be determined according to actual usage requirements.
In this embodiment of the present invention, a possible implementation manner is that the UE 01 may send the rich media message to the server 03, and the server 03 may store the rich media message. Another possible implementation manner is that the UE 01 is a sender UE and the at least one UE 02 is a receiver UE, for example, when the UE 01 sends the rich media message to the at least one UE 02, the UE 01 may first send the rich media message to the server 03, and then the server 03 may forward the received rich media message to the at least one UE 02. The method can be determined according to actual use requirements, and the embodiment of the invention is not limited.
A UE is a device that provides voice and/or data connectivity to a user, a handheld device with a wired/wireless connection function, or another processing device connected to a wireless modem. A UE may communicate with one or more core network devices via a Radio Access Network (RAN). The UE may be a mobile terminal, such as a mobile phone (or "cellular" phone) or a computer with a mobile terminal, for example a portable, pocket-sized, hand-held, computer-built-in, or vehicle-mounted mobile device, that exchanges speech and/or data with the RAN, such as a Personal Communication Service (PCS) phone, a cordless phone, a Session Initiation Protocol (SIP) phone, a Wireless Local Loop (WLL) station, or a Personal Digital Assistant (PDA). A UE may also be referred to as a User Agent or a terminal device.
The terminal device in the embodiment of the invention can be a mobile terminal device and can also be a non-mobile terminal device. For example, the mobile terminal device may be a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile terminal device may be a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiment of the present invention is not particularly limited.
The server in the embodiment of the invention can manage resources and provide services for the terminal device. Illustratively, the server may be any of various application servers, such as an RCS application server or an MMTel (Multimedia Telephony) application server, and the application server may provide various instant messaging service functions (e.g., point-to-point chat, group chat, file transfer, content sharing, and the like) for the user. The server may include a processor, a hard disk, a memory, a bus, and the like. For example, the server may store installation packages of a plurality of applications, and may, in response to a download request of the terminal device for a target installation package, verify the security of the target installation package and send a verification result to the terminal device.
Based on the communication system shown in fig. 1, as shown in fig. 2, an embodiment of the present invention provides a rich media communication method. The method can be applied to terminal equipment. The method may include steps 100-102 described below.
Step 100, the terminal device determines N pictures to be sent.
Wherein N is an integer greater than or equal to 2.
It should be noted that, in the embodiment of the present invention, the terminal device shown in fig. 2 may be the UE 01 shown in fig. 1.
In the embodiment of the present invention, the terminal device may determine N pictures to be sent in the following scene one or scene two:
the method comprises the following steps that firstly, terminal equipment receives input of N pictures to be sent selected by a user; and in response to the input, determining the N pictures as pictures to be sent.
For example, in the process of chatting between the user and the friend of the user, if the user wants to send several self-portrait pictures to the friend of the user, the user may trigger the terminal device to continuously take N pictures, so that the terminal device may determine the N pictures to be sent.
Scene two: the terminal device automatically determines N pictures to be sent.
For example, in the process of using the terminal device by the user, if the terminal device receives N pictures sent by the first device within a preset time period, the terminal device may automatically determine the N pictures as pictures to be sent to the second device.
Optionally, in an embodiment of the present invention, the N pictures are still pictures.
Optionally, in this embodiment of the present invention, the N pictures may be pictures stored in the terminal device (for example, pictures in an album application), may also be pictures shot by the terminal device, and may also be pictures received by the terminal device from a network side. The method can be determined according to actual use requirements, and the embodiment of the invention is not limited.
Optionally, in the embodiment of the present invention, the N pictures may include the same target object (for example, the N pictures may be continuous pictures of the same user), or may not include the same target object.
It should be noted that, in the embodiment of the present invention, when the number of pictures to be sent is N (N being an integer greater than or equal to 2), the terminal device may perform step 101 and step 102 in the following embodiments of the present invention. When there is only one picture to be sent, the terminal device may directly send that picture to the target device without performing steps 100 to 102 of the embodiment of the present invention. For the implementation manner in which the terminal device directly sends a picture to the target device, reference may be made to the prior art, and details are not described in the embodiments of the present invention.
Step 101, the terminal device synthesizes the N pictures into a target file.
Wherein, the target file may include any one of the following: moving pictures or video.
In the embodiment of the invention, a dynamic picture is a picture formed by combining a plurality of static pictures, and its frames can be switched at a preset frequency to achieve a dynamic effect.
Optionally, in the embodiment of the present invention, when the target file is a dynamic picture, the format of the dynamic picture may be a gif format, an avi format, a bmx format, a swf format, or the like.
Optionally, in this embodiment of the present invention, when the target file is a video, the format of the video may be an rm format, an rmvb format, an avi format, an amv format, a dmv format, a flv format, or the like.
Optionally, in the embodiment of the present invention, a possible implementation manner is that the terminal device directly synthesizes the N pictures into the target file. Another possible implementation manner is that the terminal device may detect a type of a network to which the terminal device is connected, and synthesize the N pictures into the target file when the type of the network to which the terminal device is connected is a preset network type. Another possible implementation manner is that the terminal device receives a first input from the user, and synthesizes the N pictures into the target file in response to the first input. Specific implementation manners of the latter two possible implementation manners will be described in step 103 and step 101A, and step 104 and step 101B, which are described below, and are not described herein again.
Optionally, in the embodiment of the present invention, one possible implementation manner is that the terminal device directly synthesizes the N pictures into a dynamic picture (or a video) according to a preset setting. Another possible implementation manner is that the terminal device receives an input of the user to the first control and, in response to the input to the first control, synthesizes the N pictures into a dynamic picture; or the terminal device receives an input of the user to the second control and, in response to the input to the second control, synthesizes the N pictures into a video. A specific implementation of the latter possible implementation will be described in steps 101C1 and 101C2, and steps 101D1 and 101D2, below, and is not described here.
Optionally, in the embodiment of the present invention, the terminal device may synthesize the N pictures into the target file according to any one of the following two manners:
in the first mode, the terminal equipment synthesizes the N pictures into a target file according to the selection sequence (or the shooting sequence or the receiving sequence) of the N pictures.
For example, assuming that a user selects picture 1, picture 2, picture 3, picture 4, picture 5, and picture 6 in sequence from an album of the terminal device, the terminal device may determine the N pictures as pictures to be sent, and sequentially synthesize picture 1, picture 2, picture 3, picture 4, picture 5, and picture 6 into a target file according to the selection order.
For example, assuming that a user triggers the terminal device to continuously shoot N pictures, namely, picture 1, picture 2, picture 3, picture 4, picture 5 and picture 6, the terminal device may determine the N pictures as pictures to be sent, and sequentially synthesize the picture 1, the picture 2, the picture 3, the picture 4, the picture 5 and the picture 6 into a target file according to a shooting sequence.
For example, assuming that N pictures received by the terminal device from the first device in sequence are picture 1, picture 2, picture 3, picture 4, picture 5, and picture 6, respectively, the terminal device may determine the N pictures as pictures to be sent, and sequentially synthesize the picture 1, the picture 2, the picture 3, the picture 4, the picture 5, and the picture 6 into a target file according to a receiving sequence.
And secondly, the terminal equipment can synthesize the N pictures into a target file according to the similarity of the N pictures.
For example, assume that a user selects picture 1, picture 2, picture 3, picture 4, picture 5, and picture 6 in sequence from an album of the terminal device, and that the similarity between picture 1 and picture 2 is 10%, between picture 1 and picture 3 is 65%, between picture 1 and picture 4 is 50%, between picture 1 and picture 5 is 30%, and between picture 1 and picture 6 is 15%. The terminal device may then synthesize picture 1, picture 3, picture 4, picture 5, picture 6, and picture 2 into a target file in that sequence, i.e., in descending order of similarity to picture 1.
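As an illustration of step 101, the following sketch shows how N still pictures could be combined into a dynamic picture (an animated GIF) in either of the two manners above. It is only a minimal sketch, not the patented implementation: it assumes the Pillow imaging library is available on the terminal device, and the file paths, frame duration, 64x64 downscaling, and mean-pixel-difference similarity metric are illustrative choices rather than details taken from this disclosure.

```python
# Sketch of step 101: synthesizing N still pictures into a moving picture (animated GIF).
# Assumes the Pillow library; paths, frame duration and the similarity metric are illustrative.
from PIL import Image, ImageChops, ImageStat

def synthesize_gif(picture_paths, target_path="target.gif", frame_ms=200):
    """Manner one: combine the pictures, in the order given, into one animated GIF."""
    frames = [Image.open(p).convert("RGB") for p in picture_paths]
    frames[0].save(
        target_path,
        save_all=True,              # keep every frame, not just the first
        append_images=frames[1:],   # the remaining pictures become subsequent frames
        duration=frame_ms,          # preset switching frequency of the dynamic picture
        loop=0,                     # loop forever
    )
    return target_path

def order_by_similarity(picture_paths):
    """Manner two: keep picture 1 first, then order the rest by similarity to it (high to low)."""
    reference = Image.open(picture_paths[0]).convert("RGB").resize((64, 64))

    def difference(path):
        other = Image.open(path).convert("RGB").resize((64, 64))
        # mean per-channel pixel difference; smaller means more similar
        return sum(ImageStat.Stat(ImageChops.difference(reference, other)).mean)

    return [picture_paths[0]] + sorted(picture_paths[1:], key=difference)
```

For manner one, the pictures are passed in their selection, shooting, or receiving order, e.g. synthesize_gif(paths, "target.gif"); for manner two they are reordered first, e.g. synthesize_gif(order_by_similarity(paths), "target.gif").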
Step 102, the terminal device sends a rich media communication message to the target device, wherein the rich media communication message includes the target file.
Optionally, in this embodiment of the present invention, before the terminal device sends the rich-media communication message to the target device, the terminal device may establish an RCS session with the target device, so that the terminal device may send the target file to the target device based on the RCS session. It should be noted that, for the process of establishing the RCS session between the terminal device and the target device, reference may be made to the prior art, and details are not described here.
Optionally, in the embodiment of the present invention, the target device may be a server, may also be a receiving-end UE, and may also be other possible devices. The method can be determined according to actual use requirements, and the embodiment of the invention is not limited.
For example, if the target device is a server, the terminal device may upload the target file to the server based on an RCS session established between the terminal device and the server; if the target device is a receiving UE, the terminal device may upload the target file to the server based on an RCS session established between the terminal device and the server, and then the server may transmit the target file to the receiving UE.
Optionally, in this embodiment of the present invention, the rich media communication message may further include a target message. The target message may be used to indicate the way in which the N pictures in the target file are synthesized.
Optionally, in the embodiment of the present invention, after the target device receives the rich media communication message sent by the terminal device, the target device may split the target file into N pictures.
Illustratively, after the target device receives the rich media communication message sent by the terminal device, the rich media communication message may be decompressed, and the target file and the target message may be obtained. Thus, the target device can split the target file according to the target message and acquire N pictures.
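As a rough illustration of how the target device might recover the N pictures, the sketch below splits an animated-GIF target file back into its frames using Pillow. This is an assumption-laden example: the embodiment does not specify the container of the rich media communication message or the layout of the target message, so only the frame extraction itself is shown.

```python
# Sketch of the receiving side: split an animated-GIF target file back into N still pictures.
# Assumes Pillow; the layout of the accompanying target message is not specified here.
from PIL import Image, ImageSequence

def split_target_file(target_path, output_prefix="picture"):
    """Save every frame of the dynamic picture as a separate still picture and return the paths."""
    paths = []
    with Image.open(target_path) as target:
        for index, frame in enumerate(ImageSequence.Iterator(target), start=1):
            path = f"{output_prefix}_{index}.png"
            frame.convert("RGB").save(path)
            paths.append(path)
    return paths
```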
The embodiment of the invention provides a rich media communication method. Because the terminal device can synthesize the N pictures to be sent into one target file, the terminal device can establish a single RCS session with the server and upload the synthesized target file to the server based on that RCS session, instead of establishing a separate RCS session for each of the N pictures, thereby reducing the traffic consumption of the terminal device and reducing the network burden of the communication system.
Optionally, in this embodiment of the present invention, the step 101 may be specifically implemented by any one of the following two optional implementation manners:
first alternative implementation,
Referring to fig. 2, as shown in fig. 3, before the step 101, the rich media communication method provided by the embodiment of the present invention may further include the following step 103. Accordingly, the step 101 can be specifically realized by the step 101A described below.
Step 103, the terminal device detects the type of the network to which the terminal device is connected.
Step 101A, in the case that the type of the network to which the terminal device is connected is a preset network type, the terminal device synthesizes the N pictures into a target file.
Optionally, in this embodiment of the present invention, the preset network type may be any one of a mobile data network (e.g., a code division multiple access (CDMA) network, a general packet radio service (GPRS) network, or a cellular digital packet data (CDPD) network), a wireless local area network (e.g., a wireless fidelity (Wi-Fi) network), and the like. This may be determined according to actual usage requirements and is not limited in the embodiment of the present invention.
For example, assuming that the preset network type is a mobile data network, after the terminal device determines the N pictures to be sent, the terminal device may detect whether the type of the network it is connected to is a mobile data network. If the type of the network connected to the terminal device is a mobile data network, the terminal device may synthesize the N pictures into a target file; if the type of the network connected to the terminal device is another network type (for example, Wi-Fi), the terminal device may send the N pictures directly to the target device without synthesizing them into a target file.
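The gate described in steps 103 and 101A could look roughly like the sketch below. get_network_type() is a hypothetical placeholder, since the actual connectivity query depends on the terminal device's operating system; the preset network type value is illustrative, and synthesize_gif refers to the earlier sketch.

```python
# Sketch of steps 103/101A: synthesize only when the connected network is of the preset type.
# get_network_type() is a hypothetical placeholder for the platform-specific connectivity API;
# PRESET_NETWORK_TYPE and the Wi-Fi branch are illustrative.
PRESET_NETWORK_TYPE = "mobile_data"

def prepare_payload(picture_paths):
    if get_network_type() == PRESET_NETWORK_TYPE:
        # metered network: one RCS session carrying one synthesized target file
        return [synthesize_gif(picture_paths, "target.gif")]
    # other network types (e.g. Wi-Fi): send the N pictures directly
    return list(picture_paths)
```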
According to the rich media communication method provided by the embodiment of the invention, the N pictures can be synthesized into the target file when the type of the network connected to the terminal device is the preset network type, and the synthesized target file can be uploaded to the server based on the RCS session established between the terminal device and the target device, so that the traffic consumption of the terminal device can be reduced when the terminal device is connected to a network of the preset type.
Second alternative implementation:
Referring to fig. 2, as shown in fig. 4, before the step 101, the rich media communication method provided by the embodiment of the present invention may further include the following step 104. Accordingly, the step 101 can be specifically realized by the step 101B described below.
Step 104, the terminal device receives a first input of a user.
Step 101B, the terminal device, in response to the first input, synthesizes the N pictures into a target file.
Wherein the first input may be used to determine to synthesize the N pictures into the target file.
Optionally, in this embodiment of the present invention, the first input may be a touch input, a voice input, or a gesture input. For example, in a case that the first input is a touch input, the first input may specifically be a click input, a slide input, and the like of the preset control by the user. The method can be determined according to actual use requirements, and the embodiment of the invention is not limited.
For example, after the terminal device determines the N pictures to be sent, the terminal device may display a menu list on the screen, where the menu list may include a first option and a second option; the first option may be used to determine to synthesize the N pictures into a target file, and the second option may be used to determine not to synthesize the N pictures. If the terminal device receives an input (i.e., the first input) of the user to the first option, the terminal device may synthesize the N pictures into a target file in response to the first input; if the terminal device receives an input of the user to the second option, the terminal device may, in response to that input, send the N pictures directly to the target device without synthesizing them.
According to the rich media communication method provided by the embodiment of the invention, whether the N pictures are synthesized into the target file or not can be determined according to the input of the user, so that the flexibility of a file synthesizing mode can be improved, and the experience of the user can be improved.
Optionally, with reference to fig. 2, as shown in fig. 5, before the step 101, the rich media communication method provided in the embodiment of the present invention may further include a step 105 described below. Accordingly, the step 101 may be specifically realized by the steps 101C1 and 101C2 described below, or may be realized by the steps 101D1 and 101D2 described below.
Step 105, the terminal device displays the first control and the second control on a screen of the terminal device.
The first control can be used for determining that the N pictures are synthesized into the dynamic pictures, and the second control can be used for determining that the N pictures are synthesized into the video.
Step 101C1, the terminal device receives a second input of the user to the first control.
Step 101C2, the terminal device, in response to the second input, synthesizes the N pictures into a dynamic picture.
Step 101D1, the terminal device receives a third input of the user to the second control.
Step 101D2, the terminal device, in response to the third input, synthesizes the N pictures into a video.
It should be noted that fig. 5 in the embodiment of the present invention is described as an example with reference to fig. 2, and does not limit the embodiment of the present invention in any way. It is understood that, in practical implementation, fig. 5 may also be implemented in combination with fig. 3 or fig. 4, and may be determined according to actual use requirements.
The description is given by way of example in conjunction with fig. 3. The terminal device may first detect the type of network to which the terminal device is connected. Under the condition that the type of the network connected with the terminal device is the preset network type, the terminal device can display the first control and the second control on a screen of the terminal device. If the terminal equipment receives second input of the user to the first control, the terminal equipment responds to the second input and synthesizes the N pictures into the dynamic picture; if the terminal device receives a third input to the second control from the user, the terminal device may synthesize the N pictures into a video in response to the third input.
The description is given by way of example in conjunction with fig. 4. The terminal device may receive a first input from a user and display a first control and a second control on a screen of the terminal device in response to the first input. If the terminal equipment receives second input of the user to the first control, the terminal equipment responds to the second input and synthesizes the N pictures into the dynamic picture; if the terminal device receives a third input to the second control from the user, the terminal device may synthesize the N pictures into a video in response to the third input.
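To make the two branches of steps 101C/101D concrete, the following sketch synthesizes either a dynamic picture or a video depending on which control the user selected. It assumes OpenCV (cv2) for the video branch and reuses the synthesize_gif sketch above; the codec, frame rate, and control identifiers are illustrative and not prescribed by this embodiment.

```python
# Sketch of steps 101C/101D: the first control yields a dynamic picture, the second a video.
# Assumes OpenCV (cv2) for the video branch and the synthesize_gif sketch above;
# codec, frame rate and the control identifiers are illustrative.
import cv2

def synthesize_video(picture_paths, target_path="target.mp4", fps=5):
    """Write the pictures, in order, as frames of a short video (the target file)."""
    first = cv2.imread(picture_paths[0])
    height, width = first.shape[:2]
    writer = cv2.VideoWriter(target_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height))
    for path in picture_paths:
        frame = cv2.imread(path)
        writer.write(cv2.resize(frame, (width, height)))  # every frame must share one size
    writer.release()
    return target_path

def on_control_selected(control, picture_paths):
    if control == "first":    # second input: synthesize the N pictures into a dynamic picture
        return synthesize_gif(picture_paths, "target.gif")
    if control == "second":   # third input: synthesize the N pictures into a video
        return synthesize_video(picture_paths)
```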
According to the rich media communication method provided by the embodiment of the invention, the first control and the second control can be displayed on the screen of the terminal equipment, so that a user can select to synthesize the N pictures into the dynamic picture or synthesize the N pictures into the video according to actual requirements, thereby improving the diversity of user selection and improving the experience of the user.
Optionally, with reference to fig. 2, as shown in fig. 6, after the step 102, the rich media communication method provided in the embodiment of the present invention may further include a step 106 described below.
Step 106, the terminal device displays the first frame image of the target file on a screen of the terminal device, or plays the target file.
In the embodiment of the present invention, in the case where the target file is a moving image or video, the moving image or video may include a plurality of frames of images. Each of the plurality of frames of images may be one of the N pictures.
Illustratively, the target file is taken as a dynamic image for example. After the terminal device transmits the rich-media communication message including the target file to the target device, the terminal device may display a first frame image of the dynamic image on a screen of the terminal device or may play the dynamic image on the screen of the terminal device. Further, if the terminal device displays the first frame image of the dynamic image on the screen, after the terminal device receives a touch input of the user, the terminal device may sequentially display all images of the dynamic image in response to the touch input.
Illustratively, the target file is taken as a video for example. After the terminal device transmits the rich-media communication message including the video to the target device, the terminal device may display a first frame image of the video on a screen of the terminal device or may play the video on the screen of the terminal device. Further, if the terminal device displays the first frame image of the video on the screen, after the terminal device receives a touch input of the user, the terminal device may sequentially display all images of the video in response to the touch input.
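For the compact preview described in step 106, only the first frame of the target file is needed. A minimal sketch with Pillow follows, shown for a dynamic-picture target file; how the frame is then rendered depends on the terminal device's UI toolkit and is not covered here.

```python
# Sketch of step 106: extract only the first frame of the target file for a compact preview.
# Assumes Pillow and a dynamic-picture (animated GIF) target file.
from PIL import Image

def first_frame(target_path):
    with Image.open(target_path) as target:
        target.seek(0)                       # position at the first frame of the dynamic picture
        return target.convert("RGB").copy()  # detach the frame before the file is closed

# e.g. first_frame("target.gif").show() previews one frame instead of laying out all N pictures
```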
Compared with displaying the N pictures individually on the screen of the terminal device, the rich media communication method provided by the embodiment of the invention, by displaying the first frame image of the target file on the screen of the terminal device or playing the target file, can reduce the screen area occupied by the N pictures and lets the user see the display form of the target file, thereby improving the user experience.
As shown in fig. 7, an embodiment of the present invention provides a terminal device 700. The terminal device may comprise a determining module 701, a combining module 702 and a sending module 703. The determining module 701 is configured to determine N pictures to be sent, where N is an integer greater than or equal to 2; a synthesizing module 702, configured to synthesize the N pictures determined by the determining module 701 into a target file, where the target file includes any one of: a moving picture or video; a sending module 703, configured to send a rich media communication message to the target device, where the rich media communication message includes the target file synthesized by the synthesizing module 702.
Optionally, with reference to fig. 7, as shown in fig. 8, the terminal device provided in the embodiment of the present invention may further include a detection module 704. A detection module 704, configured to detect a type of a network to which the terminal device is connected before the synthesis module 702 synthesizes the N pictures into the target file; the synthesizing module 702 is specifically configured to synthesize the N pictures into the target file when the detecting module 704 detects that the type of the network connected to the terminal device is the preset network type.
Optionally, with reference to fig. 7, as shown in fig. 9, the terminal device provided in the embodiment of the present invention may further include a receiving module 705. The receiving module 705 is configured to receive a first input of a user before the synthesizing module 702 synthesizes the N pictures into the target file; the synthesizing module 702 is specifically configured to synthesize the N pictures into the target file in response to the first input received by the receiving module 705.
Optionally, with reference to fig. 7, as shown in fig. 10, the terminal device provided in the embodiment of the present invention may further include a display module 706 and a receiving module 705. The display module 706 is configured to display a first control and a second control on a screen of the terminal device before the synthesizing module 702 synthesizes the N pictures into the target file, where the first control is used to determine to synthesize the N pictures into a dynamic picture, and the second control is used to determine to synthesize the N pictures into a video. The receiving module 705 is configured to receive a second input of the user to the first control, or to receive a third input of the user to the second control. The synthesizing module 702 is specifically configured to synthesize the N pictures into a dynamic picture in response to the second input received by the receiving module 705, or to synthesize the N pictures into a video in response to the third input received by the receiving module 705.
Optionally, as shown in fig. 10, the terminal device provided in the embodiment of the present invention may further include a display module 706. A display module 706, configured to display a first frame image of the target file on a screen of the terminal device or play the target file after the sending module 703 sends the rich media communication message to the target device.
The terminal device provided by the embodiment of the present invention can implement each process implemented by the terminal device in the above method embodiments, and is not described here again to avoid repetition.
The embodiment of the invention provides the terminal device. Because the terminal device can synthesize the N pictures to be sent into one target file, the terminal device can establish a single RCS session with the server and upload the synthesized target file to the server based on that RCS session, instead of establishing a separate RCS session for each of the N pictures, thereby reducing the traffic consumption of the terminal device and reducing the network burden of the communication system.
Fig. 11 is a schematic diagram of a hardware structure of a terminal device for implementing various embodiments of the present invention. As shown in fig. 11, the terminal device 200 includes, but is not limited to: radio frequency unit 201, network module 202, audio output unit 203, input unit 204, sensor 205, display unit 206, user input unit 207, interface unit 208, memory 209, processor 210, and power supply 211. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 11 does not constitute a limitation of the terminal device, and that the terminal device may include more or fewer components than shown, or combine certain components, or a different arrangement of components. In the embodiment of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal device, a wearable device, a pedometer, and the like.
The processor 210 is configured to determine N pictures to be sent, where N is an integer greater than or equal to 2; and synthesizing the N pictures into a target file, wherein the target file comprises any one of the following items: moving pictures or video. The radio frequency unit 201 is configured to send a rich media communication message to the target device, where the rich media communication message includes the target file synthesized by the processor 210.
The embodiment of the invention provides the terminal device. Because the terminal device can synthesize the N pictures to be sent into one target file, the terminal device can establish a single RCS session with the server and upload the synthesized target file to the server based on that RCS session, instead of establishing a separate RCS session for each of the N pictures, thereby reducing the traffic consumption of the terminal device and reducing the network burden of the communication system.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 201 may be used for receiving and sending signals during a message transmission/reception process or a call process; specifically, it receives downlink data from a base station and sends the received downlink data to the processor 210 for processing, and it also sends uplink data to the base station. In general, the radio frequency unit 201 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 201 can also communicate with a network and other devices through a wireless communication system.
The terminal device provides the user with wireless broadband internet access through the network module 202, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 203 may convert audio data received by the radio frequency unit 201 or the network module 202 or stored in the memory 209 into an audio signal and output as sound. Also, the audio output unit 203 may also provide audio output related to a specific function performed by the terminal apparatus 200 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 203 includes a speaker, a buzzer, a receiver, and the like.
The input unit 204 is used to receive an audio or video signal. The input unit 204 may include a Graphics Processing Unit (GPU) 2041 and a microphone 2042, and the graphics processor 2041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 206. The image frames processed by the graphics processor 2041 may be stored in the memory 209 (or another storage medium) or transmitted via the radio frequency unit 201 or the network module 202. The microphone 2042 may receive sound and may be capable of processing such sound into audio data. In the case of a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 201 and output.
The terminal device 200 further comprises at least one sensor 205, such as light sensors, motion sensors and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the brightness of the display panel 2061 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 2061 and/or the backlight when the terminal apparatus 200 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal device posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 205 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 206 is used to display information input by the user or information provided to the user. The Display unit 206 may include a Display panel 2061, and the Display panel 2061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 207 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 207 includes a touch panel 2071 and other input devices 2072. The touch panel 2071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations performed by the user on or near the touch panel 2071 using a finger, a stylus, or any other suitable object or attachment). The touch panel 2071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 210, and receives and executes commands sent by the processor 210. In addition, the touch panel 2071 may be implemented as a resistive, capacitive, infrared, or surface-acoustic-wave type. Besides the touch panel 2071, the user input unit 207 may include other input devices 2072. In particular, the other input devices 2072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, a switch key, etc.), a trackball, a mouse, and a joystick, which are not further described here.
Further, the touch panel 2071 may be overlaid on the display panel 2061. When the touch panel 2071 detects a touch operation on or near it, the touch operation is transmitted to the processor 210 to determine the type of the touch event, and the processor 210 then provides a corresponding visual output on the display panel 2061 according to the type of the touch event. Although the touch panel 2071 and the display panel 2061 are shown as two separate components in fig. 11 to implement the input and output functions of the terminal device, in some embodiments the touch panel 2071 and the display panel 2061 may be integrated to implement the input and output functions of the terminal device, which is not limited here.
The interface unit 208 is an interface for connecting an external device to the terminal apparatus 200. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 208 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal apparatus 200 or may be used to transmit data between the terminal apparatus 200 and the external device.
The memory 209 may be used to store software programs as well as various data. The memory 209 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 209 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 210 is a control center of the terminal device, connects various parts of the entire terminal device by using various interfaces and lines, and performs various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 209 and calling data stored in the memory 209, thereby performing overall monitoring of the terminal device. Processor 210 may include one or more processing units; optionally, the processor 210 may integrate an application processor and a modem processor, wherein the application processor mainly handles operating systems, user interfaces, application programs, and the like, and the modem processor mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 210.
Terminal device 200 may also include a power source 211 (e.g., a battery) for providing power to various components, and optionally, power source 211 may be logically connected to processor 210 via a power management system, so as to implement functions of managing charging, discharging, and power consumption via the power management system.
In addition, the terminal device 200 includes some functional modules that are not shown, and are not described in detail here.
Optionally, an embodiment of the present invention further provides a terminal device, which includes the processor 210 shown in fig. 11, the memory 209, and a computer program stored in the memory 209 and capable of running on the processor 210, where the computer program, when executed by the processor 210, implements the processes of the foregoing method embodiment, and can achieve the same technical effect, and details are not described here to avoid repetition.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the processes of the method embodiments, and can achieve the same technical effects, and in order to avoid repetition, the details are not repeated here. Examples of the computer-readable storage medium include a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, and an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method described in the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. A rich media communication method is applied to a terminal device, and the method comprises the following steps:
determining N pictures to be sent, wherein N is an integer greater than or equal to 2;
synthesizing the N pictures into a target file, wherein the target file comprises any one of the following items: a moving picture or video;
sending a rich media communication message to a target device, wherein the rich media communication message comprises the target file;
before the synthesizing the N pictures into the target file, the method further includes:
detecting the type of a network connected with the terminal equipment;
the synthesizing the N pictures into a target file comprises the following steps:
and under the condition that the type of the network connected with the terminal equipment is a preset network type, synthesizing the N pictures into the target file.
2. The method according to claim 1, wherein before the synthesizing of the N pictures into the target file, the method further comprises:
receiving a first input of a user;
and the synthesizing of the N pictures into a target file comprises:
synthesizing the N pictures into the target file in response to the first input.
3. The method according to claim 1 or 2, wherein before the synthesizing of the N pictures into the target file, the method further comprises:
displaying a first control and a second control on a screen of the terminal device, wherein the first control is used for determining that the N pictures are synthesized into a dynamic picture, and the second control is used for determining that the N pictures are synthesized into a video;
and the synthesizing of the N pictures into the target file comprises:
receiving a second input of the first control by a user; synthesizing the N pictures into a dynamic picture in response to the second input; or,
receiving a third input of the second control by the user; synthesizing the N pictures into a video in response to the third input.
4. The method according to claim 1 or 2, wherein after the sending of the rich media communication message to the target device, the method further comprises:
displaying the first frame image of the target file on a screen of the terminal device, or playing the target file.
5. A terminal device, comprising a determining module, a synthesizing module, a detecting module, and a sending module, wherein:
the determining module is configured to determine N pictures to be sent, wherein N is an integer greater than or equal to 2;
the synthesizing module is configured to synthesize the N pictures determined by the determining module into a target file, wherein the target file comprises any one of the following: a dynamic picture or a video;
the sending module is configured to send a rich media communication message to a target device, wherein the rich media communication message comprises the target file synthesized by the synthesizing module;
the detecting module is configured to detect a type of a network connected to the terminal device before the synthesizing module synthesizes the N pictures into the target file; and
the synthesizing module is specifically configured to synthesize the N pictures into the target file when the detecting module detects that the type of the network connected to the terminal device is a preset network type.
6. The terminal device according to claim 5, further comprising a receiving module, wherein:
the receiving module is configured to receive a first input of a user before the synthesizing module synthesizes the N pictures into the target file; and
the synthesizing module is specifically configured to synthesize the N pictures into the target file in response to the first input received by the receiving module.
7. The terminal device according to claim 5 or 6, further comprising a display module and a receiving module, wherein:
the display module is configured to display a first control and a second control on a screen of the terminal device before the synthesizing module synthesizes the N pictures into the target file, wherein the first control is used for determining to synthesize the N pictures into a dynamic picture, and the second control is used for determining to synthesize the N pictures into a video;
the receiving module is configured to receive a second input of the user on the first control, or receive a third input of the user on the second control; and
the synthesizing module is specifically configured to synthesize the N pictures into a dynamic picture in response to the second input received by the receiving module, or synthesize the N pictures into a video in response to the third input received by the receiving module.
8. The terminal device according to claim 5, further comprising a display module, wherein:
the display module is configured to display a first frame image of the target file on a screen of the terminal device or play the target file after the sending module sends the rich media communication message to the target device.
9. A terminal device comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the rich media communication method of any one of claims 1 to 4.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the rich media communication method according to any one of claims 1 to 4.
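For illustration only, and forming no part of the claims, the following Kotlin sketch shows one way the module structure recited in claims 5 to 8 could be composed in code; every interface and name below is a hypothetical stand-in rather than an API defined by this document.

// A minimal sketch of the module composition, assuming hypothetical interfaces whose
// names mirror the claim wording; none of them are defined by this document.

interface DeterminingModule { fun determinePicturesToSend(): List<ByteArray> }   // N >= 2 pictures
interface DetectingModule { fun connectedNetworkType(): String }                 // e.g. "WIFI"
interface SynthesizingModule { fun synthesize(pictures: List<ByteArray>, asVideo: Boolean): ByteArray }
interface SendingModule { fun sendRichMediaMessage(targetDevice: String, targetFile: ByteArray) }
interface DisplayModule { fun showFirstFrameOrPlay(targetFile: ByteArray) }

// A terminal device wiring the modules together in the manner recited in claims 5 and 8.
class TerminalDeviceSketch(
    private val determining: DeterminingModule,
    private val detecting: DetectingModule,
    private val synthesizing: SynthesizingModule,
    private val sending: SendingModule,
    private val display: DisplayModule,
    private val presetNetworkType: String = "WIFI"   // assumed preset network type
) {
    // asVideo = false corresponds to the first control (dynamic picture),
    // asVideo = true to the second control (video).
    fun sendAsSingleFile(targetDevice: String, asVideo: Boolean): Boolean {
        val pictures = determining.determinePicturesToSend()
        require(pictures.size >= 2) { "N must be an integer greater than or equal to 2" }
        // Synthesis only happens when the detected network matches the preset type.
        if (detecting.connectedNetworkType() != presetNetworkType) return false
        val targetFile = synthesizing.synthesize(pictures, asVideo)
        sending.sendRichMediaMessage(targetDevice, targetFile)
        // After sending, the first frame is shown or the file is played on the screen.
        display.showFirstFrameOrPlay(targetFile)
        return true
    }
}

Splitting the behaviour across determining, detecting, synthesizing, sending, and display modules mirrors the claim wording and keeps each step independently replaceable.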
CN201811458046.2A 2018-11-30 2018-11-30 Rich media communication method, terminal equipment and computer readable storage medium Active CN109640000B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811458046.2A CN109640000B (en) 2018-11-30 2018-11-30 Rich media communication method, terminal equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN109640000A CN109640000A (en) 2019-04-16
CN109640000B true CN109640000B (en) 2021-08-10

Family

ID=66070531

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811458046.2A Active CN109640000B (en) 2018-11-30 2018-11-30 Rich media communication method, terminal equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN109640000B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114827063A (en) * 2021-01-28 2022-07-29 华为技术有限公司 Message forwarding method and system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102171994A (en) * 2011-04-14 2011-08-31 华为技术有限公司 Method and device for transmitting and receiving notification message for presence service
CN102487455A (en) * 2009-10-29 2012-06-06 中国电信股份有限公司 Video play system of rich media content and method thereof
EP2600619A1 (en) * 2011-07-15 2013-06-05 Sony Corporation Transmitter, transmission method and receiver
CN104079468A (en) * 2013-03-25 2014-10-01 腾讯科技(深圳)有限公司 Animation transmission method and system
CN108307173A (en) * 2016-08-31 2018-07-20 北京康得新创科技股份有限公司 The processing method of picture receives terminal, sends terminal

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107528766B (en) * 2016-07-14 2020-09-01 腾讯科技(深圳)有限公司 Information pushing method, device and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant