CN113010706A - Multimedia information processing method, device, equipment and medium - Google Patents

Multimedia information processing method, device, equipment and medium

Info

Publication number
CN113010706A
CN113010706A (application CN202110281431.XA)
Authority
CN
China
Prior art keywords
information
multimedia
multimedia information
terminal
piece
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110281431.XA
Other languages
Chinese (zh)
Inventor
王娟娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Kingsoft Cloud Network Technology Co Ltd
Original Assignee
Beijing Kingsoft Cloud Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Kingsoft Cloud Network Technology Co Ltd
Priority to CN202110281431.XA
Publication of CN113010706A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/40: Information retrieval of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F 16/48: Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/44: Browsing; Visualisation therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The present disclosure relates to a multimedia information processing method, apparatus, device, and medium. The method includes sending one or more pieces of multimedia information to a server, receiving from the server address information and type information respectively corresponding to each piece of multimedia information in the one or more pieces of multimedia information, and generating tag information corresponding to each piece of multimedia information according to the address information and the type information corresponding to that piece. The multimedia information is then displayed on a user interface according to the tag information corresponding to each piece of multimedia information. Because the multimedia information can be displayed back on the terminal, the user can edit or adjust the displayed multimedia information until it achieves the effect the user expects. Therefore, the multimedia information sent by the terminal to other users through the server achieves the effect expected by the user, which improves the user experience.

Description

Multimedia information processing method, device, equipment and medium
Technical Field
The present disclosure relates to the field of information technology, and in particular, to a method, an apparatus, a device, and a medium for processing multimedia information.
Background
With the continuous development of information technology, terminals have become indispensable communication devices in people's daily life. For example, a user may employ a terminal to send multimedia information to other users.
However, the multimedia information sent by the terminal may not achieve the effect expected by the user, thereby degrading the user experience.
Disclosure of Invention
In order to solve, or at least partially solve, the above technical problem, the present disclosure provides a multimedia information processing method, apparatus, device, and medium, so that the multimedia information sent by a terminal to other users through a server can achieve the effect expected by the user, thereby improving the user experience.
In a first aspect, an embodiment of the present disclosure provides a multimedia information processing method, including:
sending one or more pieces of multimedia information to a server;
receiving address information and type information respectively corresponding to each piece of multimedia information in the one or more pieces of multimedia information from the server;
generating tag information corresponding to each piece of multimedia information according to the address information and the type information corresponding to each piece of multimedia information;
and displaying the multimedia information on a user interface according to the tag information corresponding to each piece of multimedia information.
In a second aspect, an embodiment of the present disclosure provides a multimedia information processing method, where the method includes:
receiving one or more pieces of multimedia information sent by a first terminal;
generating address information and type information respectively corresponding to each piece of multimedia information in the one or more pieces of multimedia information;
and sending address information and type information corresponding to each piece of multimedia information in the one or more pieces of multimedia information to the first terminal, wherein the first terminal is used for generating tag information corresponding to each piece of multimedia information according to the address information and the type information corresponding to each piece of multimedia information, and displaying the multimedia information on a user interface according to the tag information corresponding to each piece of multimedia information.
In a third aspect, an embodiment of the present disclosure provides a multimedia information processing apparatus, including:
the sending module is used for sending one or more pieces of multimedia information to the server;
a receiving module, configured to receive, from the server, address information and type information corresponding to each piece of multimedia information in the one or more pieces of multimedia information;
the generating module is used for generating tag information corresponding to each piece of multimedia information according to the address information and the type information corresponding to each piece of multimedia information;
and the display module is used for displaying the multimedia information on a user interface according to the tag information corresponding to each piece of multimedia information.
In a fourth aspect, an embodiment of the present disclosure provides a multimedia information processing apparatus, including:
the receiving module is used for receiving one or more pieces of multimedia information sent by a first terminal;
the generating module is used for generating address information and type information which correspond to each piece of multimedia information in the one or more pieces of multimedia information respectively;
and the sending module is used for sending the address information and the type information which respectively correspond to each piece of multimedia information in the one or more pieces of multimedia information to the first terminal, and the first terminal is used for generating tag information which respectively corresponds to each piece of multimedia information according to the address information and the type information which respectively correspond to each piece of multimedia information and displaying the multimedia information on a user interface according to the tag information which respectively corresponds to each piece of multimedia information.
In a fifth aspect, an embodiment of the present disclosure provides a multimedia information processing apparatus, including:
a memory;
a processor; and
a computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor to implement the method of the first or second aspect.
In a sixth aspect, the disclosed embodiments provide a computer-readable storage medium having a computer program stored thereon, the computer program being executed by a processor to implement the method of the first or second aspect.
According to the multimedia information processing method, apparatus, device, and medium provided by the embodiments of the present disclosure, one or more pieces of multimedia information are sent to the server, address information and type information respectively corresponding to each piece of multimedia information in the one or more pieces of multimedia information are received from the server, and tag information respectively corresponding to each piece of multimedia information is generated according to the address information and the type information respectively corresponding to each piece of multimedia information. The multimedia information is then displayed on a user interface according to the tag information corresponding to each piece of multimedia information. Because the multimedia information can be displayed back on the terminal, the user can edit or adjust the displayed multimedia information until it achieves the effect expected by the user. Therefore, the multimedia information sent by the terminal to other users through the server achieves the effect expected by the user, which improves the user experience.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
In order to more clearly illustrate the technical solutions in the embodiments of the present disclosure or in the prior art, the drawings required for describing the embodiments or the prior art are briefly introduced below. It will be apparent to those skilled in the art that other drawings can be derived from these drawings without inventive effort.
Fig. 1 is a flowchart of a multimedia information processing method provided in an embodiment of the present disclosure;
Fig. 2 is a schematic diagram of an application scenario provided by an embodiment of the present disclosure;
Fig. 3 is a flowchart of a multimedia information processing method according to another embodiment of the present disclosure;
Fig. 4 is a schematic diagram of a rich text editor provided by an embodiment of the present disclosure;
Fig. 5 is a schematic diagram of a user interface provided by an embodiment of the present disclosure;
Fig. 6 is a flowchart of a multimedia information processing method according to another embodiment of the present disclosure;
Fig. 7 is a flowchart of a multimedia information processing method according to another embodiment of the present disclosure;
Fig. 8 is a flowchart of a multimedia information processing method according to another embodiment of the present disclosure;
Fig. 9 is a flowchart of a multimedia information processing method according to another embodiment of the present disclosure;
Fig. 10 is a schematic diagram of a user interface provided by another embodiment of the present disclosure;
Fig. 11 is a schematic diagram of another application scenario provided by an embodiment of the present disclosure;
Fig. 12 is a schematic diagram illustrating a server sending a multimedia message to a client according to an embodiment of the present disclosure;
Fig. 13 is a schematic diagram of an editing process of multimedia information and a playback process of multimedia information by a terminal according to an embodiment of the present disclosure;
Fig. 14 is a schematic structural diagram of a multimedia information processing apparatus according to an embodiment of the present disclosure;
Fig. 15 is a schematic structural diagram of another multimedia information processing apparatus according to an embodiment of the present disclosure;
Fig. 16 is a schematic structural diagram of a multimedia information processing apparatus according to an embodiment of the present disclosure.
Detailed Description
In order that the above objects, features and advantages of the present disclosure may be more clearly understood, aspects of the present disclosure will be further described below. It should be noted that the embodiments and features of the embodiments of the present disclosure may be combined with each other without conflict.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure, but the present disclosure may be practiced in other ways than those described herein; it is to be understood that the embodiments disclosed in the specification are only a few embodiments of the present disclosure, and not all embodiments.
Generally, the multimedia information sent by the terminal may not achieve the effect expected by the user, thereby reducing the user experience. To solve this problem, embodiments of the present disclosure provide a multimedia information processing method, which is described below with reference to specific embodiments.
Fig. 1 is a flowchart of a multimedia information processing method according to an embodiment of the disclosure. The multimedia information processing method can be applied to the application scenario as shown in fig. 2. Specifically, the application scenario may include the terminal 21, the server 22, and the terminal 23. It is to be understood that other devices may be included in the application scenario, which is only schematically illustrated here. Specifically, the terminal 21 or the terminal 23 includes, but is not limited to, a smart phone, a palm computer, a tablet computer, a wearable device with a display screen, a desktop computer, a notebook computer, an all-in-one machine, a smart home device, and the like. For example, in the present embodiment, the terminal 21 may be a desktop computer, and the terminal 23 may be a smartphone. In addition, the server 22 may be a cloud server, a server cluster, a certain server in the server cluster, or the like. As shown in fig. 1, the method comprises the following specific steps:
S101, one or more pieces of multimedia information are sent to a server.
For example, the terminal 21 may send one or more pieces of local multimedia information to the server 22. Each piece of multimedia information may be video information, audio information (i.e., voice information), text information, or image information, or a combination of at least two of video information, audio information, text information, and image information. For example, the terminal 21 sends local video information and audio information to the server 22. Specifically, the terminal 21 may upload the multimedia information or multimedia files to the server through a form, as sketched below.
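As a concrete illustration of the form-based upload mentioned above, the following is a minimal browser-side sketch in TypeScript; the endpoint path and the form field name are assumptions for illustration and are not defined by the disclosure.

```typescript
// Minimal sketch of uploading local multimedia files through a form.
// The endpoint "/api/media/upload" and the field name "media" are assumptions.
async function uploadMedia(files: File[]): Promise<Response> {
  const form = new FormData();
  for (const file of files) {
    // Each local video/audio/image/text file becomes one part of the multipart form.
    form.append("media", file, file.name);
  }
  // Corresponds to S101: the terminal sends the multimedia information to the server.
  return fetch("/api/media/upload", { method: "POST", body: form });
}
```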
S102, receiving address information and type information corresponding to each piece of multimedia information in the one or more pieces of multimedia information from the server.
For example, in the case where the server 22 receives the video information and the audio information from the terminal 21, the server 22 may store the video information and the audio information in a storage system associated with the server 22; the storage system may be integrated in the server 22, or may exist separately from the server 22 and be communicatively connected to it. Further, the server 22 may generate address information and type information corresponding to the video information and address information and type information corresponding to the audio information. The address information corresponding to the video information may be the storage address of the video information in the storage system, and the address information corresponding to the audio information may be the storage address of the audio information in the storage system. In some embodiments, the video information may be referred to as a video file, the address information corresponding to the video information may be the file path or file address of the video file, and the type information corresponding to the video information may be the file type of the video file. Similarly, the audio information may be referred to as an audio file, the address information corresponding to the audio information may be the file path or file address of the audio file, and the type information corresponding to the audio information may be the file type of the audio file.
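The per-piece information returned by the server can be thought of as a small descriptor. The following TypeScript shape is only an illustrative assumption of what such a response might look like, not a format defined by the disclosure.

```typescript
// Assumed shape of the information returned for each uploaded piece of multimedia
// information: address information (storage address / file path) plus type information.
interface MediaDescriptor {
  address: string; // storage address or file path of the piece of multimedia information
  type: string;    // type information, e.g. "video" or "audio"
}

// S102: the terminal receives one descriptor per uploaded piece of multimedia information.
type UploadResponse = MediaDescriptor[];
```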
Further, the server 22 may transmit address information and type information corresponding to the video information and address information and type information corresponding to the audio information to the terminal 21.
S103, generating tag information corresponding to each piece of multimedia information according to the address information and the type information corresponding to each piece of multimedia information.
When receiving the address information and the type information corresponding to the video information and to the audio information, the terminal 21 may generate the tag information corresponding to the video information according to the address information and the type information corresponding to the video information, and generate the tag information corresponding to the audio information according to the address information and the type information corresponding to the audio information. The tag information corresponding to the video information and the tag information corresponding to the audio information may each be a HyperText Markup Language (HTML) tag. For example, the tag information corresponding to the video information may be a video tag, and the tag information corresponding to the audio information may be an audio tag.
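As a sketch of how S103 might look in a browser, the snippet below builds a video or audio element from the assumed MediaDescriptor shape introduced above; it is illustrative only and covers just the video and audio cases discussed here.

```typescript
// Sketch of S103: generate tag information (an HTML video or audio tag) from the
// address information and type information returned by the server.
function buildTag(item: MediaDescriptor): HTMLMediaElement {
  const element = document.createElement(
    item.type === "video" ? "video" : "audio"
  ) as HTMLMediaElement;
  element.src = item.address; // the storage address becomes the media source
  element.controls = true;    // let the user play the media back for checking
  return element;
}
```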
And S104, displaying the multimedia information on a user interface according to the tag information corresponding to each piece of multimedia information.
For example, when the terminal 21 generates a video tag and an audio tag, the video information may be displayed on the user interface of the terminal 21 according to the video tag, the text information corresponding to the audio information may be displayed on the user interface of the terminal 21 according to the audio tag, or the audio information may be played in the user interface according to the audio tag.
Further, the user of the terminal 21 can browse the video information and the audio information in the user interface. In the case where the video information and the audio information satisfy the effect expected by the user, the terminal 21 may transmit a confirmation instruction to the server 22, and further, the server 22 may transmit the video information and the audio information to other terminals, for example, the terminal 23. In the case where the video information and the audio information do not satisfy the user's intended effect, the user of the terminal 21 may edit, adjust, or modify the video information and the audio information in the user interface until the video information and the audio information satisfy the user's intended effect.
In the embodiment of the present disclosure, one or more pieces of multimedia information are sent to the server, address information and type information respectively corresponding to each piece of multimedia information in the one or more pieces of multimedia information are received from the server, and tag information respectively corresponding to each piece of multimedia information is generated according to the address information and the type information respectively corresponding to each piece of multimedia information. The multimedia information is then displayed on the user interface according to the tag information corresponding to each piece of multimedia information. Because the multimedia information can be displayed on the terminal, the user can edit or adjust the displayed multimedia information until it achieves the effect expected by the user. Therefore, the multimedia information sent by the terminal to other users through the server achieves the effect expected by the user, which improves the user experience.
Fig. 3 is a flowchart of a multimedia information processing method according to another embodiment of the disclosure. The method comprises the following specific steps:
S301, sending one or more pieces of local multimedia information to the server through a rich text editor.
As shown in Fig. 2, the terminal 21 may be equipped with a browser, and a display component of the terminal 21, such as the display 211, may display a user interface of the browser, which may be the user interface described above. Specifically, a text editor, such as a Rich Text Editor (RTE), may be embedded in the browser. The terminal 21 may upload one or more pieces of multimedia information local to the terminal 21 to the server 22 through the rich text editor. For example, the terminal 21 may upload video information and audio information local to the terminal 21 to the server 22 through the rich text editor.
Fig. 4 is a schematic diagram of a rich text editor, in which each of the "1st frame", "2nd frame", and "3rd frame" may correspond to one piece of multimedia information. For example, in the case where the terminal 21 uploads video information, the user of the terminal 21 may click on the "1st frame" and then click on the icon 42, thereby uploading local video information. In addition, the user of the terminal 21 may also enter text information related to the video information, for example, a title or description information of the video information, in the text input box 45 of the rich text editor.
Similarly, in the case where the terminal 21 uploads audio information, the user of the terminal 21 may click on the "2nd frame" and then click on the icon 41, thereby uploading local audio information. In addition, the user of the terminal 21 may also enter text information related to the audio information, for example, a title or description information of the audio information, in the text input box 45 of the rich text editor.
In some embodiments, the number of frames ("1st frame", "2nd frame", "3rd frame") in the rich text editor may be adjusted according to the number of pieces of multimedia information to be uploaded. For example, in the case where the user of the terminal 21 plans to upload one piece of multimedia information to the server 22 through the terminal 21, the terminal 21 may retain the "1st frame" according to the user setting. In the case where the user plans to upload two pieces of multimedia information, the terminal 21 may retain the "1st frame" and the "2nd frame" according to the user setting. However, at least one of the "1st frame", "2nd frame", and "3rd frame" is retained. In addition, the icon 43 shown in Fig. 4 is used to upload image information. The icon 44 is used to go back to the previous step; for example, if the user of the terminal 21 uploads an image via the icon 43, the upload can be undone by clicking the icon 44.
S302, receiving address information and type information corresponding to each piece of multimedia information in the one or more pieces of multimedia information from the server.
Specifically, the implementation process and specific principle of S302 are consistent with those of S102, and are not described herein again.
S303, generating tag information corresponding to each piece of multimedia information according to the address information and the type information corresponding to each piece of multimedia information.
Specifically, the implementation process and specific principle of S303 are consistent with those of S103, and are not described herein again.
S304, displaying the multimedia information on a user interface according to the tag information corresponding to each piece of multimedia information.
Optionally, the user interface includes a first area and a second area; the first area is an editing interface corresponding to the rich text editor; the second area is a virtual terminal display area.
Fig. 5 is a schematic diagram of a user interface of the browser, such as the user interface 50, which includes a first area and a second area. The first area is the editing interface corresponding to the rich text editor, i.e., the editing interface shown in Fig. 4. The second area is a virtual terminal display area, such as the area 51 shown in Fig. 5, which may be a display area of a simulated or virtual mobile phone. That is, the virtual terminal display area is not a real mobile phone interface, but content displayed by the terminal 21 in the user interface of the browser.
Optionally, displaying the multimedia information on the user interface according to the tag information corresponding to each piece of multimedia information includes: synchronously displaying the multimedia information in the first area and the second area of the user interface according to the tag information corresponding to each piece of multimedia information.
For example, in the case where the terminal 21 generates a video tag and an audio tag, the video tag and the audio tag may be inserted into the rich text editor, and the browser may render the video information on the user interface of the browser according to the video tag, so that the first area and the second area of the user interface synchronously display the video information. In addition, the browser may also render the text information corresponding to the audio information on the user interface of the browser according to the audio tag, so that the text information corresponding to the audio information is synchronously displayed in the first area and the second area of the user interface.
For example, in the case where the "1st frame" is selected, the terminal 21 synchronously displays the video information in the first area and the second area of the user interface according to the video tag. In the case where the "2nd frame" is selected, the terminal 21 synchronously displays the text information corresponding to the audio information in the first area and the second area of the user interface according to the audio tag. For example, the text input box 45 and the virtual terminal display area 51 synchronously display the text information "32131".
Further, the video information and the audio information may each correspond to a switch button: the video information is played when the switch button of the video information is turned on, and the audio information is played when the switch button of the audio information is turned on. For example, in the case where the "1st frame" is selected, identification information of the video information and the switch button of the video information may be displayed in the user interface of the browser. When the switch button of the video information is turned on, the terminal 21 synchronously displays the video information in the first area and the second area of the user interface according to the video tag.
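A minimal sketch of the synchronous display described above is given below, reusing the hypothetical buildTag helper from the earlier sketch; the element ids for the two areas are assumptions for illustration.

```typescript
// Sketch of S304: insert the generated tag into the rich text editor (first area)
// and mirror the editor content into the virtual terminal display area (second area).
function showInBothAreas(item: MediaDescriptor): void {
  const editorArea = document.getElementById("rich-text-editor");  // first area (assumed id)
  const previewArea = document.getElementById("virtual-terminal"); // second area (assumed id)
  if (!editorArea || !previewArea) return;

  editorArea.appendChild(buildTag(item));        // render in the editing interface
  previewArea.innerHTML = editorArea.innerHTML;  // keep the virtual terminal in sync
}
```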
It can be understood that, in this embodiment, the rich text editor can be used not only to edit the multimedia information but also to render it. The multimedia information may also be referred to as a multimedia message. The virtual terminal display area can display the multimedia information back.
In the embodiment of the present disclosure, the rich text editor sends one or more pieces of multimedia information local to the terminal to the server, so the rich text editor has the function of uploading multimedia information: the terminal can edit the content of a multimedia message through the rich text editor, and local video files, voice files, text input, and the like can be uploaded flexibly. In addition, the rich text editor and the virtual terminal display area synchronously display the multimedia information back, so that the user can check the display effect of the multimedia information on the virtual terminal and can edit, adjust, or modify the displayed multimedia information again through the rich text editor. In this way, the multimedia information sent by the terminal to other users through the server can achieve the effect expected by the user, which improves the user experience.
On the basis of the above embodiment, after the multimedia information is displayed on the user interface according to the tag information corresponding to each piece of multimedia information, the method further includes the following steps, as shown in Fig. 6:
S601, responding to a multimedia information modification instruction, and modifying at least one piece of multimedia information in the one or more pieces of multimedia information to obtain modified multimedia information.
In the case where the first area and the second area synchronously display the multimedia information, if the user of the terminal 21, after viewing the display effect of the multimedia information on the virtual terminal, determines that it does not meet the expected effect, the user may edit, adjust, or modify the displayed multimedia information through the rich text editor. For example, as shown in Fig. 5, the user of the terminal 21 may send a multimedia information modification instruction to the terminal 21 by clicking "delete current frame"; this instruction may be used to delete the currently selected multimedia information, that is, the terminal 21 may delete the currently selected multimedia information according to the instruction. Alternatively, the user of the terminal 21 may send a multimedia information modification instruction to the terminal 21 by clicking "add frame"; this instruction may be used to add new multimedia information, that is, the terminal 21 may load new multimedia information according to the instruction. Further, "material library addition" shown in Fig. 5 may be used to add a material library, which may be a collection of multimedia information.
S602, sending the modified multimedia information to the server.
For example, the terminal 21 may upload the modified multimedia information to the server 22 through the rich text editor. Similarly, the server 22 may generate a file path and a file type of the modified multimedia information and feed them back to the terminal 21. The terminal 21 may then generate a corresponding HTML tag according to the file path and the file type of the modified multimedia information and display the modified multimedia information back according to the HTML tag, until the display effect of the modified multimedia information on the virtual terminal meets the expected effect.
In the embodiment of the present disclosure, the rich text editor and the virtual terminal display area synchronously display the multimedia information back, so that the user can check the display effect of the multimedia information on the virtual terminal and can edit, adjust, or modify the displayed multimedia information again through the rich text editor. In this way, the multimedia information sent by the terminal to other users through the server can achieve the effect expected by the user, which improves the user experience.
Fig. 7 is a flowchart of a multimedia information processing method according to another embodiment of the disclosure. The method comprises the following specific steps:
S701, receiving one or more pieces of multimedia information sent by a first terminal.
For example, the terminal 21 is regarded as a first terminal, and the server 22 receives video information and audio information transmitted from the terminal 21.
S702, generating address information and type information respectively corresponding to each piece of multimedia information in the one or more pieces of multimedia information.
For example, the server 22 may store the video information and the audio information in a storage system associated with the server 22; the storage system may be integrated within the server 22, or may exist separately from the server 22 and be communicatively coupled to it. Further, the server 22 may generate address information and type information corresponding to the video information and address information and type information corresponding to the audio information.
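A minimal server-side sketch of S701 and S702 (together with the response of S703 below) is shown here, assuming an Express + multer stack; the route, field name, and storage directory are illustrative assumptions rather than part of the disclosure.

```typescript
// Sketch of the server side: receive uploaded multimedia files, store them, and
// generate address information (file path) and type information (file type) for each.
import express from "express";
import multer from "multer";

const app = express();
const upload = multer({ dest: "uploads/" }); // store each uploaded file on disk

app.post("/api/media/upload", upload.array("media"), (req, res) => {
  const files = (req.files ?? []) as Express.Multer.File[];
  // S702: one descriptor per piece of multimedia information.
  const descriptors = files.map((file) => ({
    address: file.path,  // address information: storage path of the file
    type: file.mimetype, // type information, e.g. "video/mp4" or "audio/mpeg"
  }));
  res.json(descriptors); // S703: return the information to the first terminal
});

app.listen(3000);
```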
And S703, sending address information and type information corresponding to each piece of multimedia information in the one or more pieces of multimedia information to the first terminal, where the first terminal is configured to generate tag information corresponding to each piece of multimedia information according to the address information and the type information corresponding to each piece of multimedia information, and display the multimedia information on a user interface according to the tag information corresponding to each piece of multimedia information.
For example, the server 22 may send the address information and the type information corresponding to the video information and the address information and the type information corresponding to the audio information to the terminal 21. When receiving the address information and the type information corresponding to the video information and to the audio information, the terminal 21 may generate the tag information corresponding to the video information according to the address information and the type information corresponding to the video information, and generate the tag information corresponding to the audio information according to the address information and the type information corresponding to the audio information. The tag information corresponding to the video information and the tag information corresponding to the audio information may each be a HyperText Markup Language (HTML) tag. For example, the tag information corresponding to the video information may be a video tag, and the tag information corresponding to the audio information may be an audio tag.
Further, the terminal 21 may display the video information on the user interface of the terminal 21 according to the video tag, display text information corresponding to the audio information on the user interface of the terminal 21 according to the audio tag, or play the audio information in the user interface according to the audio tag.
In the embodiment of the present disclosure, one or more pieces of multimedia information sent by the first terminal are received, address information and type information respectively corresponding to each piece of multimedia information in the one or more pieces of multimedia information are generated, and the address information and the type information respectively corresponding to each piece of multimedia information are sent to the first terminal. The multimedia information can thus be displayed back on the first terminal, and the user can edit or adjust the displayed multimedia information until it achieves the expected effect. Therefore, the multimedia information sent by the first terminal to other users through the server can achieve the effect expected by the user, which improves the user experience.
On the basis of the above embodiment, after the address information and the type information respectively corresponding to each piece of multimedia information in the one or more pieces of multimedia information are sent to the first terminal, the method further includes the following steps, as shown in Fig. 8:
S801, receiving the modified multimedia information in the one or more pieces of multimedia information sent by the first terminal.
In the case where the first area and the second area synchronously display the multimedia information, if the user of the terminal 21, after viewing the display effect of the multimedia information on the virtual terminal, determines that it does not meet the expected effect, the user may edit, adjust, or modify the displayed multimedia information through the rich text editor. Further, the terminal 21 may upload the modified multimedia information to the server 22 through the rich text editor, and the server 22 receives the modified multimedia information sent by the terminal 21.
S802, generating address information and type information corresponding to the modified multimedia information.
For example, the server 22 may generate a file path and a file type for the modified multimedia information.
And S803, sending the address information and the type information corresponding to the modified multimedia information to the first terminal.
For example, the server 22 may feed back the file path and the file type of the modified multimedia information to the terminal 21, and the terminal 21 may generate a corresponding HTML tag according to the file path and the file type of the modified multimedia information, and display the modified multimedia information back according to the HTML tag until the display effect of the modified multimedia information on the virtual terminal meets the expected effect.
In the embodiment of the present disclosure, the rich text editor and the virtual terminal display area synchronously display the multimedia information back, so that the user can check the display effect of the multimedia information on the virtual terminal and can edit, adjust, or modify the displayed multimedia information again through the rich text editor. In this way, the multimedia information sent by the terminal to other users through the server can achieve the effect expected by the user, which improves the user experience.
On the basis of the above embodiment, after the address information and the type information respectively corresponding to each piece of multimedia information in the one or more pieces of multimedia information are sent to the first terminal, the method further includes the following steps, as shown in Fig. 9:
And S901, receiving an audit submission instruction sent by the first terminal.
As shown in Fig. 10, in the case where the multimedia information displayed back on the virtual terminal meets the expected effect, the user of the terminal 21 may click a "submit audit" button in the user interface. The terminal 21 may generate an audit submission instruction according to the user's click on the "submit audit" button and send the audit submission instruction to the server 22, and the server 22 receives the audit submission instruction sent by the terminal 21. It can be understood that the user's operation on the "submit audit" button is not limited to a click; it may also be, for example, a double click or a long press.
And S902, according to the audit submission instruction, sending the address information and the type information corresponding to each piece of multimedia information in the one or more pieces of multimedia information to a second terminal, wherein the second terminal is used for auditing the one or more pieces of multimedia information.
For example, as shown in Fig. 11, in the case where the server 22 receives the audit submission instruction from the terminal 21, the server 22 may send the address information and the type information respectively corresponding to each piece of multimedia information in the one or more pieces of multimedia information uploaded by the terminal 21 to a second terminal, for example, the terminal 24. The terminal 24 may be the terminal of an auditor. Similarly, the terminal 24 may generate an HTML tag according to the address information and the type information corresponding to the multimedia information and display the multimedia information according to the HTML tag; the specific process may refer to the contents of the above embodiments and is not described herein again. Further, the auditor can check whether the multimedia information displayed back on the terminal 24 meets the requirements.
And S903, sending the one or more pieces of multimedia information to a third terminal in the case where the one or more pieces of multimedia information are approved.
For example, in the case where the auditor confirms that the multimedia information displayed back on the terminal 24 meets the requirements, the terminal 24 may send confirmation information to the server 22. Upon receiving the confirmation information, the server 22 may send a multimedia message to a third terminal, for example, the terminal 23, where the multimedia message may include the one or more pieces of multimedia information uploaded by the terminal 21.
In another possible implementation manner, after the address information and the type information respectively corresponding to each piece of the one or more pieces of multimedia information are sent to the first terminal, the method further includes: receiving a confirmation instruction sent by the first terminal; and sending the one or more pieces of multimedia information to a third terminal according to the confirmation instruction.
For example, in some embodiments, the one or more pieces of multimedia information uploaded by the terminal 21 to the server 22 do not need to be audited by an auditor. In this case, if the multimedia information displayed back by the terminal 21 on the virtual terminal meets the expected effect, the terminal 21 may send a confirmation instruction to the server 22. Upon receiving the confirmation instruction, the server 22 may send a multimedia message to a third terminal, for example, the terminal 23, where the multimedia message may include the one or more pieces of multimedia information uploaded by the terminal 21.
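The two dispatch paths described above (audit first, or direct confirmation) can be summarized by the following illustrative sketch; the MultimediaMessage shape, the requiresAudit flag, and the callback names are assumptions for illustration.

```typescript
// Sketch of the server-side dispatch decision: either forward the multimedia message
// to the second (auditor) terminal first, or send it directly to the third terminal
// once the first terminal has confirmed it.
interface MultimediaMessage {
  items: MediaDescriptor[]; // pieces of multimedia information uploaded by the first terminal
  requiresAudit: boolean;   // whether an auditor must approve the message first
}

function dispatch(
  message: MultimediaMessage,
  sendToAuditor: (m: MultimediaMessage) => void, // S902: the second terminal audits
  sendToClient: (m: MultimediaMessage) => void   // S903 / confirmation path: the third terminal
): void {
  if (message.requiresAudit) {
    sendToAuditor(message); // forwarded to the third terminal only after approval
  } else {
    sendToClient(message);  // confirmation instruction received: send directly
  }
}
```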
As shown in fig. 10, during the process of editing multimedia information by the user of the terminal 21 through the rich text editor, if the editing is stopped halfway, the user may click a "save" button in the user interface. When the user re-edits the stored multimedia information, the terminal 21 and the server 22 may continue processing according to the procedure described in the above embodiment.
In addition, the process in which the server sends the multimedia message to a client, for example, the user of the terminal 23, may refer to the process shown in Fig. 12; its implementation manner and specific principle may refer to the contents of the above embodiments and are not described herein again. It can be understood that, in some embodiments, there may be a plurality of terminals 23, that is, the server may send the multimedia message to a plurality of clients in a unified manner.
In addition, the process in which the user of the terminal 21 edits the multimedia information and the process of displaying the multimedia information back may refer to the process shown in Fig. 13; their implementation manner and specific principle may refer to the contents of the above embodiments and are not described herein again.
In the embodiment of the present disclosure, the server receives the audit submission instruction sent by the first terminal and, according to the audit submission instruction, sends the address information and the type information corresponding to each piece of multimedia information in the one or more pieces of multimedia information to the second terminal, so that a user of the second terminal can audit the one or more pieces of multimedia information uploaded to the server by the first terminal. The server sends the one or more pieces of multimedia information to the third terminal only when the audit is passed, thereby ensuring the reliability and security of multimedia information transmission.
Fig. 14 is a schematic structural diagram of a multimedia information processing apparatus according to an embodiment of the disclosure. The multimedia information processing apparatus may be the terminal 21 or a component in the terminal 21 as described above. The multimedia information processing apparatus provided in the embodiment of the present disclosure may execute the processing flow provided in the embodiment of the multimedia information processing method, as shown in fig. 14, the multimedia information processing apparatus 140 includes:
a sending module 141, configured to send one or more pieces of multimedia information to a server;
a receiving module 142, configured to receive, from the server, address information and type information corresponding to each piece of the one or more pieces of multimedia information, respectively;
a generating module 143, configured to generate, according to the address information and the type information respectively corresponding to each piece of multimedia information, tag information respectively corresponding to each piece of multimedia information;
the display module 144 is configured to display the multimedia information on the user interface according to the tag information corresponding to each piece of multimedia information.
Optionally, the sending module 141 is specifically configured to:
send one or more pieces of multimedia information local to the terminal to the server through a rich text editor.
Optionally, the user interface includes a first area and a second area;
the first area is an editing interface corresponding to the rich text editor;
the second area is a virtual terminal display area.
Optionally, the display module 144 is specifically configured to: synchronously display the multimedia information in the first area and the second area of the user interface according to the tag information corresponding to each piece of multimedia information.
Optionally, the apparatus 140 further includes: a modifying module 145, configured to modify, in response to a multimedia information modification instruction, at least one piece of multimedia information in the one or more pieces of multimedia information after the display module 144 displays the multimedia information on the user interface according to the tag information corresponding to each piece of multimedia information, so as to obtain modified multimedia information; the sending module 141 is further configured to send the modified multimedia information to the server.
The multimedia information processing apparatus in the embodiment shown in fig. 14 can be used to implement the technical solutions of the above method embodiments, and the implementation principles and technical effects are similar, and are not described herein again.
Fig. 15 is a schematic structural diagram of a multimedia information processing apparatus according to an embodiment of the disclosure. The multimedia information processing apparatus may be the server 22 or a component in the server 22 as described above. The multimedia information processing apparatus provided by the embodiment of the present disclosure may execute the processing flow provided by the embodiment of the multimedia information processing method, as shown in fig. 15, the multimedia information processing apparatus 150 includes:
a receiving module 151, configured to receive one or more pieces of multimedia information sent by a first terminal;
a generating module 152, configured to generate address information and type information corresponding to each piece of the one or more pieces of multimedia information;
a sending module 153, configured to send address information and type information corresponding to each piece of multimedia information in the one or more pieces of multimedia information to the first terminal, where the first terminal is configured to generate tag information corresponding to each piece of multimedia information according to the address information and the type information corresponding to each piece of multimedia information, and display the multimedia information on a user interface according to the tag information corresponding to each piece of multimedia information.
Optionally, the receiving module 151 is further configured to: after the sending module 153 sends the address information and the type information corresponding to each piece of multimedia information in the one or more pieces of multimedia information to the first terminal, receive the modified multimedia information in the one or more pieces of multimedia information sent by the first terminal; the generating module 152 is further configured to: generate address information and type information corresponding to the modified multimedia information; and the sending module 153 is further configured to: send the address information and the type information corresponding to the modified multimedia information to the first terminal.
Optionally, the receiving module 151 is further configured to: after the sending module 153 sends the address information and the type information corresponding to each piece of multimedia information in the one or more pieces of multimedia information to the first terminal, receive an audit submission instruction sent by the first terminal; the sending module 153 is further configured to: send, according to the audit submission instruction, the address information and the type information corresponding to each piece of multimedia information in the one or more pieces of multimedia information to a second terminal, where the second terminal is used for auditing the one or more pieces of multimedia information; and send the one or more pieces of multimedia information to a third terminal in the case where the one or more pieces of multimedia information are approved.
Optionally, the receiving module 151 is further configured to: after the sending module 153 sends the address information and the type information corresponding to each piece of multimedia information in the one or more pieces of multimedia information to the first terminal, receive a confirmation instruction sent by the first terminal; the sending module 153 is further configured to: send the one or more pieces of multimedia information to a third terminal according to the confirmation instruction.
The multimedia information processing apparatus in the embodiment shown in fig. 15 can be used to implement the technical solutions of the above method embodiments, and the implementation principles and technical effects are similar, and are not described herein again.
Fig. 16 is a schematic structural diagram of a multimedia information processing apparatus according to an embodiment of the present disclosure. The multimedia information processing apparatus may be the terminal 21 or the server 22 as described above. The multimedia information processing apparatus provided by the embodiment of the present disclosure may execute the processing flow provided by the embodiment of the multimedia information processing method. As shown in Fig. 16, the multimedia information processing apparatus 160 includes: a memory 161, a processor 162, a computer program, and a communication interface 163, wherein the computer program is stored in the memory 161 and is configured to be executed by the processor 162 to implement the multimedia information processing method described above.
In addition, the embodiment of the present disclosure also provides a computer-readable storage medium, on which a computer program is stored, the computer program being executed by a processor to implement the multimedia information processing method described in the above embodiment.
It is noted that, in this document, relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The foregoing are merely exemplary embodiments of the present disclosure, which enable those skilled in the art to understand or practice the present disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (13)

1. A method for processing multimedia information, the method comprising:
sending one or more pieces of multimedia information to a server;
receiving address information and type information respectively corresponding to each piece of multimedia information in the one or more pieces of multimedia information from the server;
generating tag information corresponding to each piece of multimedia information according to the address information and the type information corresponding to each piece of multimedia information;
and displaying the multimedia information on a user interface according to the tag information corresponding to each piece of multimedia information.
2. The method of claim 1, wherein sending one or more pieces of multimedia information to a server comprises:
sending one or more pieces of local multimedia information to the server through a rich text editor.
3. The method of claim 2, wherein the user interface comprises a first area and a second area;
the first area is an editing interface corresponding to the rich text editor;
the second area is a virtual terminal display area.
4. The method of claim 3, wherein displaying the multimedia information on the user interface according to the tag information respectively corresponding to each piece of multimedia information comprises:
and synchronously displaying the multimedia information in the first area and the second area of the user interface according to the tag information corresponding to each piece of multimedia information.
5. The method according to claim 1, wherein after the multimedia information is displayed on the user interface according to the tag information corresponding to each piece of multimedia information, the method further comprises:
responding to a multimedia information modification instruction, and modifying at least one piece of multimedia information in the one or more pieces of multimedia information to obtain modified multimedia information;
and sending the modified multimedia information to the server.
6. A method for processing multimedia information, the method comprising:
receiving one or more pieces of multimedia information sent by a first terminal;
generating address information and type information respectively corresponding to each piece of multimedia information in the one or more pieces of multimedia information;
and sending address information and type information corresponding to each piece of multimedia information in the one or more pieces of multimedia information to the first terminal, wherein the first terminal is used for generating tag information corresponding to each piece of multimedia information according to the address information and the type information corresponding to each piece of multimedia information, and displaying the multimedia information on a user interface according to the tag information corresponding to each piece of multimedia information.
7. The method of claim 6, wherein after sending the address information and the type information corresponding to each piece of multimedia information in the one or more pieces of multimedia information to the first terminal, the method further comprises:
receiving, from the first terminal, modified multimedia information among the one or more pieces of multimedia information;
generating address information and type information corresponding to the modified multimedia information;
and sending the address information and the type information corresponding to the modified multimedia information to the first terminal.
8. The method of claim 6, wherein after sending the address information and the type information corresponding to each piece of multimedia information in the one or more pieces of multimedia information to the first terminal, the method further comprises:
receiving an audit submission instruction sent by the first terminal;
according to the audit submission instruction, sending address information and type information corresponding to each piece of multimedia information in the one or more pieces of multimedia information to a second terminal, wherein the second terminal is used for auditing the one or more pieces of multimedia information;
and in a case that the one or more pieces of multimedia information are approved, sending the one or more pieces of multimedia information to a third terminal.
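For the audit path of claim 8, one possible (purely illustrative) server-side routine is sketched below; the send and awaitApproval callbacks stand in for whatever transport and review mechanism the server actually uses and are assumptions of this sketch.

```typescript
// Hypothetical routing for an audit submission instruction: forward the
// descriptors to the auditing (second) terminal, and deliver the content to the
// receiving (third) terminal only if the audit succeeds.
interface MediaDescriptor {
  address: string;
  type: string;
}

async function submitForAudit(
  descriptors: MediaDescriptor[],
  send: (terminal: "second" | "third", payload: MediaDescriptor[]) => Promise<void>,
  awaitApproval: () => Promise<boolean>
): Promise<void> {
  await send("second", descriptors);      // forward to the second terminal for auditing
  if (await awaitApproval()) {            // the one or more pieces were approved
    await send("third", descriptors);     // deliver to the third terminal
  }
}
```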
9. The method of claim 6, wherein after sending the address information and the type information corresponding to each piece of multimedia information in the one or more pieces of multimedia information to the first terminal, the method further comprises:
receiving a confirmation instruction sent by the first terminal;
and sending the one or more pieces of multimedia information to a third terminal according to the confirmation instruction.
10. A multimedia information processing apparatus, characterized by comprising:
a sending module, used for sending one or more pieces of multimedia information to a server;
a receiving module, configured to receive, from the server, address information and type information corresponding to each piece of multimedia information in the one or more pieces of multimedia information;
a generating module, used for generating label information corresponding to each piece of multimedia information according to the address information and the type information corresponding to each piece of multimedia information;
and a display module, used for displaying the multimedia information on a user interface according to the label information corresponding to each piece of multimedia information.
11. A multimedia information processing apparatus, characterized by comprising:
a receiving module, used for receiving one or more pieces of multimedia information sent by a first terminal;
a generating module, used for generating address information and type information respectively corresponding to each piece of multimedia information in the one or more pieces of multimedia information;
and a sending module, used for sending the address information and the type information respectively corresponding to each piece of multimedia information in the one or more pieces of multimedia information to the first terminal, wherein the first terminal is used for generating label information respectively corresponding to each piece of multimedia information according to the address information and the type information respectively corresponding to each piece of multimedia information, and displaying the multimedia information on a user interface according to the label information respectively corresponding to each piece of multimedia information.
12. A multimedia information processing device, characterized by comprising:
a memory;
a processor; and
a computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor to implement the method of any one of claims 1-9.
13. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1-9.
CN202110281431.XA 2021-03-16 2021-03-16 Multimedia information processing method, device, equipment and medium Pending CN113010706A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110281431.XA CN113010706A (en) 2021-03-16 2021-03-16 Multimedia information processing method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN113010706A true CN113010706A (en) 2021-06-22

Family

ID=76408397

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110281431.XA Pending CN113010706A (en) 2021-03-16 2021-03-16 Multimedia information processing method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN113010706A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080006443A (en) * 2006-07-12 2008-01-16 (주)에스비에스아이 Multimedia edit/play system and method for operating the same
US20120197970A1 (en) * 2011-01-28 2012-08-02 Marie Clarke Arturi Method and system for multi-media object creation and social publishing of same
CN103533382A (en) * 2013-09-24 2014-01-22 四川汇源吉迅数码科技有限公司 Mobile video new media production, uploading and publishing system
CN104090979A (en) * 2014-07-23 2014-10-08 上海天脉聚源文化传媒有限公司 Method and device for editing webpage
CN104796767A (en) * 2015-03-31 2015-07-22 北京奇艺世纪科技有限公司 Method and system for editing cloud video
CN109543119A (en) * 2018-10-08 2019-03-29 深圳市梦网科技发展有限公司 Page preview system and method
CN109963166A (en) * 2017-12-22 2019-07-02 上海全土豆文化传播有限公司 Online Video edit methods and device
CN110674624A (en) * 2019-06-18 2020-01-10 北京无限光场科技有限公司 Method and system for editing image and text
CN111831937A (en) * 2019-04-23 2020-10-27 腾讯科技(深圳)有限公司 Data processing method and device and computer storage medium
CN111832274A (en) * 2019-04-15 2020-10-27 北京智启蓝墨信息技术有限公司 Rich media editing method, rich media generating method, rich media editing device, rich media generating device and rich media generating system in intelligent teaching materials

Similar Documents

Publication Publication Date Title
CN109815200B (en) File sharing method and device and storage medium
CN110597774B (en) File sharing method, system, device, computing equipment and terminal equipment
CN111541930B (en) Live broadcast picture display method and device, terminal and storage medium
CN104995596A (en) Managing audio at the tab level for user notification and control
CN106610826B (en) Method and device for manufacturing online scene application
US20110228763A1 (en) Method and apparatus for accessing services of a device
US20230328026A1 (en) Mail processing method and apparatus, and electronic device, and computer readable medium
CN113325979A (en) Video generation method and device, storage medium and electronic equipment
CN112329403A (en) Live broadcast document processing method and device
CN104615432B (en) Splash screen information processing method and client
CN116126447A (en) Information processing method, device, electronic equipment and computer readable medium
JP6337449B2 (en) CONFERENCE SERVER DEVICE, PROGRAM, INFORMATION PROCESSING METHOD, AND CONFERENCE SYSTEM
CN116263914A (en) Document processing method and related product
CN114422468A (en) Message processing method, device, terminal and storage medium
CN111404977B (en) Document remote demonstration and viewing method and terminal equipment
WO2017165253A1 (en) Modular communications
KR101835884B1 (en) Method and apparatus for accessing one or more services of a device
EP2387179A2 (en) Method and devices for a computer conference
CN113010706A (en) Multimedia information processing method, device, equipment and medium
WO2016000638A1 (en) Networking cooperation method and machine using such method
JP2023547427A (en) Previously viewed content processing method and device, electronic device and computer program
WO2016127888A1 (en) Method and device for downloading multimedia file
JP2015049789A (en) Display control device, operation method of display control device, and computer program
US20170149578A1 (en) Networking cooperation method and machine using such method
CN111414495A (en) Multimedia data acquisition method, device and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination