CN114268833A - Live broadcast content acquisition and processing method and device, electronic equipment and medium - Google Patents

Live broadcast content acquisition and processing method and device, electronic equipment and medium

Info

Publication number
CN114268833A
Authority
CN
China
Prior art keywords
processing
media information
live content
editing tool
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210204006.5A
Other languages
Chinese (zh)
Inventor
李鲲
李永海
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Taide Wangju Beijing Technology Co ltd
Original Assignee
Taide Wangju Beijing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Taide Wangju Beijing Technology Co ltd
Priority to CN202210204006.5A
Publication of CN114268833A
Legal status: Pending

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present disclosure disclose a method and an apparatus for acquiring and processing live content, an electronic device, and a medium. One embodiment of the method comprises: collecting media information and displaying the media information to an operation processing area of a media information editing interface; detecting an editing tool area; in response to detecting a user click operation on an editing tool icon in the editing tool area, determining the clicked editing tool icon as a target editing tool icon; determining the processing mode corresponding to the target editing tool icon as a target processing mode; processing the media information based on the target processing mode and the user processing operation in the operation processing area; and in response to determining that the processing is completed, determining the processed media information as live content information and displaying the live content information to a preview interface. This embodiment provides a convenient and effective way to process and produce live content.

Description

Live broadcast content acquisition and processing method and device, electronic equipment and medium
Technical Field
Embodiments of the present disclosure relate to the technical field of live broadcasting, and in particular to a method and an apparatus for acquiring and processing live broadcast content, an electronic device, and a medium.
Background
Online live broadcasting uses the Internet and streaming media technology and integrates images, text, sound, and other elements; it is vivid and expressive and has gradually become a mainstream form of online communication. An anchor user can create a live broadcast room through an anchor client to broadcast. To reduce errors during live broadcasting, some anchors broadcast by playing pre-produced live content, and audience users can enter the live broadcast room through an audience client to watch the broadcast. However, producing polished, high-definition live content usually places strict requirements on the format, coding format, and equipment of the media information used to produce it, which makes live content production very difficult. A convenient and effective method for processing and producing live content is therefore urgently needed.
Disclosure of Invention
In view of this, embodiments of the present disclosure provide a method, an apparatus, an electronic device, and a medium for acquiring and processing live content, so as to solve the problem in the prior art of how to conveniently and effectively process media information to produce live content.
In a first aspect of the embodiments of the present disclosure, a method for acquiring and processing live content is provided, including: collecting media information and displaying the media information to an operation processing area of a media information editing interface, wherein the media information comprises at least one of the following items: audio, picture, video, and the media information editing interface further comprises an editing tool area; detecting the editing tool area, wherein the editing tool area comprises at least one editing tool icon; in response to detecting a user click operation on an editing tool icon in the editing tool area, determining the clicked editing tool icon as a target editing tool icon; determining the processing mode corresponding to the target editing tool icon as a target processing mode; processing the media information based on the target processing mode and the user processing operation in the operation processing area; and in response to determining that the processing is completed, determining the processed media information as live content information and displaying the live content information to a preview interface.
In a second aspect of the embodiments of the present disclosure, a device for acquiring and processing live content is provided, the device including: an acquisition unit configured to collect media information and display the media information to an operation processing area of a media information editing interface, wherein the media information comprises at least one of the following items: audio, picture, video, and the media information editing interface further comprises an editing tool area; a detection unit configured to detect the editing tool area, wherein the editing tool area comprises at least one editing tool icon; a first determination unit configured to determine, in response to detecting a user click operation on an editing tool icon in the editing tool area, the clicked editing tool icon as a target editing tool icon; a second determination unit configured to determine the processing mode corresponding to the target editing tool icon as a target processing mode; a processing unit configured to process the media information based on the target processing mode and the user processing operation in the operation processing area; and a display unit configured to determine, in response to determining that the processing is completed, the processed media information as live content information, and to display the live content information to a preview interface.
In a third aspect of the embodiments of the present disclosure, an electronic device is provided, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor implements the steps of the above method when executing the computer program.
In a fourth aspect of the embodiments of the present disclosure, a computer-readable storage medium is provided, which stores a computer program, which when executed by a processor, implements the steps of the above-mentioned method.
One of the above embodiments of the present disclosure has the following beneficial effects. First, the collected media information is displayed to a media information editing interface so that the user can process it; next, the editing tool area of the media information editing interface is detected to determine the target editing tool icon selected by the user; the processing mode corresponding to the target editing tool icon is then determined as the target processing mode; the media information is then processed according to the target processing mode and the user processing operation detected in the operation processing area; finally, the processed media information is determined as live content information and displayed to a preview interface. The method provided by the present disclosure implements the entire flow of acquiring media information, processing it, and obtaining live content information: it detects the user's operations on the media information editing interface in real time and applies the corresponding style processing to the media information, ultimately producing live content information that meets the user's needs.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements are not necessarily drawn to scale.
Fig. 1 is a schematic diagram of one application scenario of a method of live content acquisition and processing, according to some embodiments of the present disclosure;
fig. 2 is a flow diagram of some embodiments of a method of live content acquisition and processing according to the present disclosure;
fig. 3 is a schematic block diagram of some embodiments of a live content capture and processing apparatus according to the present disclosure;
FIG. 4 is a schematic block diagram of an electronic device suitable for use in implementing some embodiments of the present disclosure.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the disclosed embodiments. However, it will be apparent to one skilled in the art that the present disclosure may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present disclosure with unnecessary detail.
A method, an apparatus, an electronic device, and a medium for acquiring and processing live content according to embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of an application scenario of a method for capturing and processing live content according to some embodiments of the present disclosure.
In the application scenario of fig. 1, the computing device 101 may first collect media information 102 and display the media information 102 to an operation processing area 104 of a media information editing interface 103, where the media information includes at least one of the following items: audio, picture, video; the media information editing interface 103 further includes an editing tool area 105. The computing device 101 may then detect the editing tool area 105, where the editing tool area includes at least one editing tool icon. Then, in response to detecting a user click operation on an editing tool icon in the editing tool area 105, the clicked editing tool icon is determined as the target editing tool icon 106. The computing device 101 may then determine the processing mode corresponding to the target editing tool icon 106 as the target processing mode 107, and process the media information 102 based on the target processing mode 107 and the user processing operation 108 in the operation processing area 104. Finally, in response to determining that the processing is complete, the computing device 101 may determine the processed media information as live content information 109 and display the live content information 109 to a preview interface.
The computing device 101 may be hardware or software. When the computing device is hardware, it may be implemented as a distributed cluster composed of multiple servers or terminal devices, or as a single server or a single terminal device. When the computing device is embodied as software, it may be installed in the hardware devices enumerated above and may be implemented, for example, as multiple pieces of software or software modules providing distributed services, or as a single piece of software or software module. This is not specifically limited herein.
It should be understood that the number of computing devices in FIG. 1 is merely illustrative. There may be any number of computing devices, as implementation needs dictate.
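For orientation, the entities in the scenario of fig. 1 can be modeled roughly as follows. This is a minimal TypeScript sketch: every type and field name is an assumption introduced for illustration rather than something prescribed by the disclosure.

```typescript
// Illustrative TypeScript model of the entities in fig. 1. Every type and
// field name here is hypothetical; the disclosure does not prescribe any API.

type MediaKind = "audio" | "picture" | "video";

interface MediaInfo {
  kind: MediaKind;
  source: Blob | string;      // raw data or a URL handed over by a terminal device
}

interface EditingToolIcon {
  id: string;                 // e.g. "bold", "italic" (the examples given in the description)
  label: string;
}

type ProcessingMode =
  | "text"                    // text processing
  | "picture-position-style"  // picture position and style processing
  | "multi-track-picture"     // multi-track picture insertion and display
  | "video";                  // video processing

interface MediaEditingInterface {
  operationProcessingArea: MediaInfo[];  // area 104: the media being edited
  editingToolArea: EditingToolIcon[];    // area 105: the available tool icons
}

interface LiveContentInfo {
  media: MediaInfo;           // the processed media information (109)
  createdAt: Date;
}
```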
Fig. 2 is a schematic flow chart of a method for acquiring and processing live content according to an embodiment of the present disclosure. The method of capturing and processing live content of fig. 2 may be performed by the computing device 101 of fig. 1. As shown in fig. 2, the method for acquiring and processing live content includes the following steps:
step S201, collecting media information, and displaying the media information to an operation processing area of a media information editing interface, where the media information editing interface further includes an editing tool area.
In some embodiments, the execution body of the live content acquisition and processing method (e.g., the computing device 101 shown in fig. 1) may connect to a terminal device through a wireless connection and use the media information transmitted by the terminal device as the media information. The execution body may then display the media information to the operation processing area of the media information editing interface. Here, the media information includes, but is not limited to, at least one of the following: audio, picture, video.
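A minimal sketch of step S201 follows, assuming the terminal device exposes an HTTP endpoint that returns the collected media and that the operation processing area is a DOM element; the endpoint, the element id, and the function name are hypothetical.

```typescript
// A minimal sketch of step S201, assuming the terminal device exposes an HTTP
// endpoint that returns the collected media. The endpoint, the element id
// "operation-processing-area", and the function name are hypothetical.

async function collectAndDisplayMedia(terminalUrl: string): Promise<void> {
  const response = await fetch(terminalUrl);   // media information transmitted by the terminal device
  const data = await response.blob();

  const operationArea = document.getElementById("operation-processing-area");
  if (!operationArea) return;

  // Display the collected media information to the operation processing area.
  if (data.type.startsWith("image/")) {
    const img = document.createElement("img");
    img.src = URL.createObjectURL(data);
    operationArea.appendChild(img);
  } else if (data.type.startsWith("video/") || data.type.startsWith("audio/")) {
    const player = document.createElement(data.type.startsWith("video/") ? "video" : "audio");
    player.src = URL.createObjectURL(data);
    player.controls = true;
    operationArea.appendChild(player);
  }
}
```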
It should be noted that the wireless connection manner may include, but is not limited to, a 3G/4G connection, a WiFi connection, a Bluetooth connection, a WiMAX connection, a ZigBee connection, a UWB (Ultra Wideband) connection, and other wireless connection manners now known or developed in the future.
Step S202, detecting the editing tool area, where the editing tool area includes at least one editing tool icon.
In some embodiments, the execution body may detect the editing tool area. Here, the detection may be performed to determine whether there is a click operation or other selection operation on the editing tool area.
Step S203, in response to detecting a user click operation on an editing tool icon in the editing tool area, determining the clicked editing tool icon as a target editing tool icon.
In some embodiments, in response to detecting a user click operation on an editing tool icon in the editing tool area, the execution body may determine the clicked editing tool icon as the target editing tool icon. As an example, in response to detecting a user click operation on the "bold" icon in the editing tool area, the execution body may determine the "bold" icon as the target editing tool icon.
Step S204, determining the processing mode corresponding to the target editing tool icon as the target processing mode.
In some embodiments, the execution body may determine the processing mode corresponding to the target editing tool icon as the target processing mode. Here, the target processing mode includes, but is not limited to, one or more of the following: text processing, picture position and style processing, multi-track picture insertion and display, and video processing; the user processing operation has a corresponding operation position. For example, the target editing tool icon may be the "bold" icon, the corresponding processing mode may be "set the font style of the selected text to bold", and the execution body may determine this processing mode as the target processing mode. As another example, the target editing tool icon may be the "italic" icon, the corresponding processing mode may be "change the font style of the selected text to italic", and the execution body may determine this processing mode as the target processing mode.
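Steps S203 and S204 can be illustrated with the sketch below, which listens for clicks in the editing tool area and looks up the corresponding processing mode. The data attribute, the lookup table, and the mode strings are assumptions; the description only names "bold" and "italic" as example icons.

```typescript
// A sketch of steps S203-S204, assuming each tool icon carries a data-tool-id
// attribute and that a lookup table maps icon ids to processing modes. The
// attribute name, table entries, and mode strings are illustrative only.

type ProcessingMode = "text" | "picture-position-style" | "multi-track-picture" | "video";

const ICON_TO_MODE: Record<string, ProcessingMode> = {
  bold: "text",
  italic: "text",
  "insert-picture": "multi-track-picture",
  rotate: "video",
};

let targetToolIcon: string | null = null;
let targetProcessingMode: ProcessingMode | null = null;

function watchEditingToolArea(toolArea: HTMLElement): void {
  toolArea.addEventListener("click", (event) => {
    const clicked = event.target instanceof HTMLElement ? event.target : null;
    const icon = clicked?.closest("[data-tool-id]");
    if (!icon) return;                                   // the click was not on a tool icon
    targetToolIcon = icon.getAttribute("data-tool-id");  // step S203: target editing tool icon
    targetProcessingMode =
      targetToolIcon !== null ? ICON_TO_MODE[targetToolIcon] ?? null : null;  // step S204
  });
}
```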
Step S205, processing the media information based on the target processing mode and the user processing operation in the operation processing area.
In some embodiments, based on the target processing mode and the user processing operation in the operation processing area, the execution body may process the media information as follows. First, in response to determining that the target processing mode is text processing and that the user processing operation includes a text insertion operation, the execution body may receive the inserted text information. Second, the execution body may display the text information at the operation position of the user processing operation on the media information. Third, in response to detecting that the user processing operation further includes a text style adjustment operation, the execution body may apply the corresponding style adjustment to the text information to complete the processing of the media information. Here, the text style adjustment includes, but is not limited to: font style adjustment and adjustment of font size, color, and transparency.
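The text-processing branch can be sketched as follows; the TextOverlay structure and the helper functions are assumptions made only to illustrate the behaviour described above.

```typescript
// A sketch of the text-processing branch of step S205. The TextOverlay
// structure and the insertText/adjustStyle helpers are assumptions made for
// illustration; the disclosure only describes the behaviour (insert text at
// the operation position, then adjust font style, size, color and transparency).

interface TextStyle {
  fontFamily?: string;
  fontWeight?: "normal" | "bold";
  fontStyle?: "normal" | "italic";
  fontSizePx?: number;
  color?: string;
  opacity?: number;   // transparency adjustment
}

interface TextOverlay {
  content: string;    // the inserted text information
  x: number;          // operation position of the user processing operation
  y: number;
  style: TextStyle;
}

function insertText(content: string, x: number, y: number): TextOverlay {
  return { content, x, y, style: {} };
}

function adjustStyle(overlay: TextOverlay, adjustment: TextStyle): TextOverlay {
  // Apply the text style adjustment operation detected in the operation processing area.
  return { ...overlay, style: { ...overlay.style, ...adjustment } };
}

// Usage: insert a caption at the clicked position, then make it bold and semi-transparent.
const caption = adjustStyle(insertText("Sale starts now", 120, 48), {
  fontWeight: "bold",
  opacity: 0.8,
});
```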
In some optional implementations of some embodiments, the execution body may also process the media information as follows: in response to determining that the media information is a video and the target processing mode is video processing, the execution body may apply the corresponding special effect/clipping and splicing processing to the media information based on the user processing operation. Here, the special effect/clipping and splicing processing of the media information is frame-accurate, and the special effect processing includes, but is not limited to, picture rotation processing, picture flip processing, and logo addition/removal processing.
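One way to picture this frame-accurate processing is as an edit list, as in the illustrative sketch below; the Clip and VideoEffect structures are assumptions rather than structures defined by the disclosure.

```typescript
// A sketch of the video-processing branch, expressed as a frame-accurate edit
// list. The Clip and VideoEffect structures are assumptions; the disclosure
// only states that clipping/splicing and effects such as rotation, flipping
// and logo addition/removal are applied with frame-level precision.

type VideoEffect =
  | { kind: "rotate"; degrees: 90 | 180 | 270 }
  | { kind: "flip"; axis: "horizontal" | "vertical" }
  | { kind: "logo"; action: "add" | "remove"; imageUrl?: string };

interface Clip {
  sourceUrl: string;
  startFrame: number;   // frame-accurate in point
  endFrame: number;     // frame-accurate out point
  effects: VideoEffect[];
}

// Splicing is modeled here as the ordered concatenation of clips.
function splice(...clips: Clip[]): Clip[] {
  return clips;
}

const edited = splice(
  { sourceUrl: "intro.mp4", startFrame: 0, endFrame: 299, effects: [] },
  {
    sourceUrl: "demo.mp4",
    startFrame: 120,
    endFrame: 1080,
    effects: [
      { kind: "rotate", degrees: 90 },
      { kind: "logo", action: "add", imageUrl: "logo.png" },
    ],
  },
);
```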
In some optional implementations of some embodiments, the execution body may also process the media information as follows. First, in response to determining that the media information is a picture and the target processing mode is picture position and style processing, the execution body may apply the corresponding position/style adjustment to the media information based on the user processing operation. Second, in response to determining that the target processing mode is multi-track picture insertion and display processing together with picture position and style processing, the execution body may display at least one target insertion picture in the operation processing area, where the target insertion picture is selected in advance. Third, based on the user processing operation, the execution body may apply the corresponding position/style adjustment to the media information. Here, the position/style adjustment includes, but is not limited to: picture position adjustment, picture size adjustment, picture rotation adjustment, and picture watermark adjustment.
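The multi-track picture handling can be sketched as follows; the track and layer fields are assumptions chosen to illustrate the position, size, rotation, and watermark adjustments mentioned above.

```typescript
// A sketch of multi-track picture insertion and position/style adjustment.
// Track indices and field names are assumptions; the description notes only
// that position, size, rotation and watermark can be adjusted, and elsewhere
// that there is no limit on the insertion track.

interface PictureLayer {
  track: number;        // any track index may be used
  imageUrl: string;
  x: number;
  y: number;
  widthPx: number;
  heightPx: number;
  rotationDeg: number;
  watermark?: string;
}

const layers: PictureLayer[] = [];

function insertPicture(track: number, imageUrl: string): PictureLayer {
  const layer: PictureLayer = {
    track,
    imageUrl,
    x: 0,
    y: 0,
    widthPx: 320,
    heightPx: 180,
    rotationDeg: 0,
  };
  layers.push(layer);
  return layer;
}

function adjustLayer(layer: PictureLayer, changes: Partial<PictureLayer>): void {
  Object.assign(layer, changes);   // position/size/rotation/watermark adjustment
}

// Usage: place a preselected picture on track 3 and nudge it into position.
adjustLayer(insertPicture(3, "banner.png"), { x: 40, y: 600, rotationDeg: 15 });
```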
Step S206, in response to determining that the processing is completed, determining the processed media information as live content information, and displaying the live content information to a preview interface.
In some embodiments, in response to determining that the processing is complete, the execution body may determine the processed media information as live content information. The execution body may then display the live content information to a preview interface. Here, the preview interface may be a preset presentation window used to view the result of the processing (the live content information).
In some optional implementations of some embodiments, the method further comprises:
in the first step, the execution main body can control at least one target terminal to play the live content information.
And secondly, the execution main body can acquire the sequence position of the live content information in the play list of each target terminal in the at least one target terminal in a wireless connection mode. As an example, the ordinal position may be an order number of the live information in a playlist of the target terminal.
And thirdly, the execution main body can monitor the progress of the live content information in the playing process. As an example, the progress monitoring may be monitoring of the remaining playing time of the live content information and monitoring of whether the playing of the live content information is finished.
And fourthly, in response to the fact that the live content information is determined to be played, the execution main body can play other live content information based on the sequence position. As an example, in response to determining that the playing of the live content information is finished, the execution main body may play the next live content information according to the sequence position.
In some embodiments, the progress monitoring of the live content information and the playing of other live content information avoid live neutral, and a seamless live broadcast process is realized.
And fifthly, the execution main body can store the historical live content information of the playing result in the playlist.
And sixthly, responding to the detected user management operation aiming at the playlist, and enabling the execution main body to correspondingly adjust the live content information in the playlist. Here, the user management operation may be an operation of deleting/adding live content information in the playlist by the user, or an operation of adjusting a playback order of live content information in the playlist.
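A compact sketch of this playlist handling is given below; the LivePlaylist class and its method names are assumptions made for illustration, not an API defined by the disclosure.

```typescript
// A compact sketch of the playlist handling described in these steps. The
// LivePlaylist class and its method names are assumptions made for
// illustration, not an API defined by the disclosure.

interface LiveContent {
  id: string;
  url: string;
  durationSec: number;
}

class LivePlaylist {
  private queue: LiveContent[] = [];
  private history: LiveContent[] = [];
  private current: LiveContent | null = null;

  // Sequence position of a piece of live content in this terminal's playlist.
  positionOf(id: string): number {
    return this.queue.findIndex((item) => item.id === id);
  }

  add(content: LiveContent): void {
    this.queue.push(content);
  }

  // User management operation: delete a piece of live content from the playlist.
  remove(id: string): void {
    this.queue = this.queue.filter((item) => item.id !== id);
  }

  // Called when progress monitoring reports that the current item has finished:
  // the finished item is stored as history and the next item starts at once,
  // avoiding gaps in the live stream.
  onPlaybackEnded(startPlayback: (content: LiveContent) => void): void {
    if (this.current) this.history.push(this.current);
    this.current = this.queue.shift() ?? null;
    if (this.current) startPlayback(this.current);
  }
}
```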
In some optional implementations of some embodiments, the live content acquisition and processing method performed by the execution body may adopt a B/S (Browser/Server) architecture, which is a variant and improvement of the C/S (Client/Server) architecture. Under the B/S architecture, the client is unified as a browser and the core system functionality is centralized on the server, which simplifies the development, maintenance, and use of the system.
In some optional implementations of some embodiments, the execution body supports processing video media information in the H264 video coding format, audio media information in the AAC/MP3 audio coding formats, and picture media information in the JPEG, PNG, and BMP formats, including the PNG format with an alpha channel, and supports transmitting media information through an open interface to implement the method for acquiring and processing live content.
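As a rough illustration of this compatibility, the sketch below checks incoming media against the supported formats and forwards it through a hypothetical open-interface endpoint; the MIME strings, the SUPPORTED table, and the submitMedia function are assumptions, not part of the disclosure.

```typescript
// A rough illustration of the compatibility described above: incoming media
// is checked against the supported formats and forwarded through an open
// interface. The MIME strings, the SUPPORTED table, and submitMedia are
// assumptions made for illustration, not part of the disclosure.

const SUPPORTED = {
  video: ["video/mp4"],                                // H264 video (assumed MP4 container)
  audio: ["audio/aac", "audio/mpeg"],                  // AAC / MP3 audio
  picture: ["image/jpeg", "image/png", "image/bmp"],   // PNG may carry an alpha channel
};

function isSupported(kind: keyof typeof SUPPORTED, mimeType: string): boolean {
  return SUPPORTED[kind].some((accepted) => mimeType.startsWith(accepted));
}

// Hypothetical open interface: an external system posts media to the editor.
async function submitMedia(openInterfaceUrl: string, file: Blob): Promise<boolean> {
  const kind = file.type.startsWith("video/")
    ? "video"
    : file.type.startsWith("audio/")
      ? "audio"
      : "picture";
  if (!isSupported(kind, file.type)) return false;
  const response = await fetch(openInterfaceUrl, { method: "POST", body: file });
  return response.ok;
}
```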
One of the above embodiments of the present disclosure has the following beneficial effects. First, the collected media information is displayed to a media information editing interface so that the user can process it; next, the editing tool area of the media information editing interface is detected to determine the target editing tool icon selected by the user; the processing mode corresponding to the target editing tool icon is then determined as the target processing mode; the media information is then processed according to the target processing mode and the user processing operation detected in the operation processing area; finally, the processed media information is determined as live content information and displayed to a preview interface. The method provided by the present disclosure implements the entire flow of acquiring media information, processing it, and obtaining live content information: it detects the user's operations on the media information editing interface in real time and applies the corresponding style processing to the media information, ultimately producing live content information that meets the user's needs. In addition, the method provided by the present disclosure places no limitation on the insertion track of picture media information during processing; it supports multiple formats and coding formats of media information, giving it very strong compatibility; and the adoption of an open interface reduces the difficulty of integrating media resources. By adopting the B/S architecture, the method greatly reduces the computing load on the client, lowers the cost and workload of system maintenance and upgrades, reduces the user's overall cost, is not limited by time or place, saves resources for the user, and improves the user experience.
All the above optional technical solutions may be combined arbitrarily to form optional embodiments of the present application, and are not described herein again.
The following are embodiments of the disclosed apparatus that may be used to perform embodiments of the disclosed methods. For details not disclosed in the embodiments of the apparatus of the present disclosure, refer to the embodiments of the method of the present disclosure.
Fig. 3 is a schematic diagram of a device for acquiring and processing live content according to an embodiment of the present disclosure. As shown in fig. 3, the apparatus for acquiring and processing live content includes: an acquisition unit 301, a detection unit 302, a first determination unit 303, a second determination unit 304, a processing unit 305, and a display unit 306. The acquisition unit 301 is configured to acquire media information and display the media information to an operation processing area of a media information editing interface, where the media information includes at least one of the following items: audio, picture, video, and the media information editing interface further includes an editing tool area. The detection unit 302 is configured to detect the editing tool area, where the editing tool area includes at least one editing tool icon. The first determination unit 303 is configured to determine, in response to detecting a user click operation on an editing tool icon in the editing tool area, the clicked editing tool icon as a target editing tool icon. The second determination unit 304 is configured to determine the processing mode corresponding to the target editing tool icon as a target processing mode. The processing unit 305 is configured to process the media information based on the target processing mode and the user processing operation in the operation processing area. The display unit 306 is configured to determine, in response to determining that the processing is completed, the processed media information as live content information, and to display the live content information to a preview interface.
In some optional implementations of some embodiments, the target processing mode includes one or more of the following: text processing, picture position and style processing, multi-track picture insertion and display, and video processing; the user processing operation has a corresponding operation position.
In some optional implementations of some embodiments, the processing unit 305 of the live content acquisition and processing device is further configured to: receive the inserted text information in response to determining that the target processing mode is text processing and that the user processing operation includes a text insertion operation; display the text information at the operation position of the user processing operation on the media information; and, in response to detecting that the user processing operation further includes a text style adjustment operation, apply the corresponding style adjustment to the text information to complete the processing of the media information.
In some optional implementations of some embodiments, the processing unit 305 of the live content acquisition and processing device is further configured to: in response to determining that the media information is a video and the target processing mode is video processing, apply the corresponding special effect/clipping and splicing processing to the media information based on the user processing operation; in response to determining that the media information is a picture and the target processing mode is picture position and style processing, apply the corresponding position/style adjustment to the media information based on the user processing operation; in response to determining that the target processing mode is multi-track picture insertion and display processing together with picture position and style processing, display at least one target insertion picture in the operation processing area, where the target insertion picture is selected in advance; and apply the corresponding position/style adjustment to the media information based on the user processing operation.
In some optional implementations of some embodiments, the live content acquisition and processing method adopts a B/S architecture and supports processing video media information in the H264 video coding format, audio media information in the AAC/MP3 audio coding formats, and picture media information in the JPEG, PNG, and BMP formats, including the PNG format with an alpha channel, and supports transmitting media information through an open interface.
In some optional implementations of some embodiments, the live content acquisition and processing device is further configured to: control at least one target terminal to play the live content information; obtain the sequence position of the live content information in the playlist of each target terminal of the at least one target terminal; monitor the progress of the live content information during playback; and, in response to determining that playback of the live content information has ended, play other live content information based on the sequence position.
In some optional implementations of some embodiments, the live content acquisition and processing device is further configured to: store, in the playlist, the historical live content information that has been played; and, in response to detecting a user management operation on the playlist, adjust the live content information in the playlist accordingly.
It will be understood that the units described in the apparatus 300 correspond to the various steps in the method described with reference to fig. 2. Thus, the operations, features and resulting advantages described above with respect to the method are also applicable to the apparatus 300 and the units included therein, and are not described herein again.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present disclosure.
Fig. 4 is a schematic diagram of a computer device 4 provided by the disclosed embodiment. As shown in fig. 4, the computer device 4 of this embodiment includes: a processor 401, a memory 402 and a computer program 403 stored in the memory 402 and executable on the processor 401. The steps in the various method embodiments described above are implemented when the processor 401 executes the computer program 403. Alternatively, the processor 401 implements the functions of the respective modules/units in the above-described respective apparatus embodiments when executing the computer program 403.
Illustratively, the computer program 403 may be partitioned into one or more modules/units, which are stored in the memory 402 and executed by the processor 401 to accomplish the present disclosure. One or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 403 in the computer device 4.
The computer device 4 may be a desktop computer, a notebook, a palm computer, a cloud server, or another computing device. The computer device 4 may include, but is not limited to, the processor 401 and the memory 402. Those skilled in the art will appreciate that fig. 4 is merely an example of the computer device 4 and does not limit the computer device 4, which may include more or fewer components than shown, combine certain components, or have different components; for example, the computer device may also include input/output devices, network access devices, buses, and the like.
The Processor 401 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The storage 402 may be an internal storage unit of the computer device 4, for example, a hard disk or a memory of the computer device 4. The memory 402 may also be an external storage device of the computer device 4, such as a plug-in hard disk provided on the computer device 4, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. Further, memory 402 may also include both internal storage units of computer device 4 and external storage devices. The memory 402 is used for storing computer programs and other programs and data required by the computer device. The memory 402 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules, so as to perform all or part of the functions described above. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
In the embodiments provided in the present disclosure, it should be understood that the disclosed apparatus/computer device and method may be implemented in other ways. For example, the apparatus/computer device embodiments described above are merely illustrative; for instance, the division of modules or units is merely a division of logical functions, and there may be other divisions in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, the present disclosure may implement all or part of the flow of the methods in the above embodiments by means of a computer program instructing related hardware; the computer program may be stored in a computer-readable storage medium, and when executed by a processor, may implement the steps of the above method embodiments. The computer program may comprise computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in the jurisdiction; for example, in some jurisdictions, computer-readable media may not include electrical carrier signals or telecommunications signals in accordance with legislation and patent practice.
The above examples are only intended to illustrate the technical solutions of the present disclosure, not to limit them; although the present disclosure has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present disclosure, and are intended to be included within the scope of the present disclosure.

Claims (10)

1. A method for acquiring and processing live content is characterized by comprising the following steps:
collecting media information and displaying the media information to an operation processing area of a media information editing interface, wherein the media information comprises at least one of the following items: audio, picture, video; and the media information editing interface further comprises an editing tool area;
detecting the editing tool area, wherein the editing tool area comprises at least one editing tool icon;
in response to detecting a user click operation on an editing tool icon in the editing tool area, determining the clicked editing tool icon as a target editing tool icon;
determining a processing mode corresponding to the target editing tool icon as a target processing mode;
processing the media information based on the target processing mode and the user processing operation in the operation processing area;
and in response to determining that the processing is completed, determining the processed media information as live content information, and displaying the live content information to a preview interface.
2. The method for acquiring and processing live content according to claim 1, wherein the target processing mode includes one or more of the following: text processing, picture position and style processing, multi-track picture insertion and display, and video processing; and the user processing operation has a corresponding operation position.
3. The method for acquiring and processing live content according to claim 2, wherein the processing the media information based on the target processing manner and the user processing operation in the operation processing area comprises:
in response to determining that the target processing mode is text processing and that the user processing operation comprises a text insertion operation, receiving the inserted text information;
displaying the text information to an operation position of the user processing operation for the media information;
and responding to the detection that the user processing operation further comprises a text style adjusting operation, and performing corresponding style adjustment on the text information to finish the processing of the media information.
4. The method for acquiring and processing live content according to claim 2, wherein the processing the media information based on the target processing manner and the user processing operation in the operation processing area comprises:
in response to determining that the media information is a video and the target processing mode is video processing, performing the corresponding special effect/clipping and splicing processing on the media information based on the user processing operation;
in response to determining that the media information is a picture and the target processing mode is picture position and style processing, performing the corresponding position/style adjustment on the media information based on the user processing operation;
in response to determining that the target processing mode is multi-track picture insertion and display processing and picture position and style processing, displaying at least one target insertion picture in the operation processing area, wherein the target insertion picture is preselected;
and performing the corresponding position/style adjustment on the media information based on the user processing operation.
5. The method for acquiring and processing live content according to claim 1, wherein the method adopts a B/S architecture and supports video media information processing in the H264 video coding format, audio media information processing in the AAC/MP3 audio coding formats, picture media information processing in the JPEG, PNG and BMP formats, picture media information processing in the PNG format with an alpha channel, and media information transmission through an open interface.
6. The method for acquiring and processing live content according to claim 1, wherein in response to determining that the processing is completed, the media information obtained by the processing is determined as live content information, and after the live content information is displayed on a preview interface, the method further comprises:
controlling at least one target terminal to play the live content information;
acquiring the sequence position of the live broadcast content information in a play list of each target terminal in the at least one target terminal;
monitoring the progress of the live content information in the playing process;
and in response to determining that playback of the live content information has ended, playing other live content information based on the sequence position.
7. The method for acquiring and processing live content according to claim 6, wherein the method further comprises:
storing, in the playlist, the historical live content information that has been played;
and in response to detecting a user management operation on the playlist, adjusting the live content information in the playlist accordingly.
8. A device for acquiring and processing live content, comprising:
an acquisition unit configured to collect media information and display the media information to an operation processing area of a media information editing interface, wherein the media information comprises at least one of the following items: audio, picture, video; and the media information editing interface further comprises an editing tool area;
a detection unit configured to detect the editing tool area, wherein the editing tool area comprises at least one editing tool icon;
a first determination unit configured to determine, in response to detecting a user click operation on an editing tool icon in the editing tool area, the clicked editing tool icon as a target editing tool icon;
a second determination unit configured to determine the processing mode corresponding to the target editing tool icon as a target processing mode;
a processing unit configured to process the media information based on the target processing mode and the user processing operation in the operation processing area;
and a display unit configured to determine, in response to determining that the processing is completed, the processed media information as live content information, and to display the live content information to a preview interface.
9. An electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
CN202210204006.5A 2022-03-03 2022-03-03 Live broadcast content acquisition and processing method and device, electronic equipment and medium Pending CN114268833A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210204006.5A CN114268833A (en) 2022-03-03 2022-03-03 Live broadcast content acquisition and processing method and device, electronic equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210204006.5A CN114268833A (en) 2022-03-03 2022-03-03 Live broadcast content acquisition and processing method and device, electronic equipment and medium

Publications (1)

Publication Number Publication Date
CN114268833A true CN114268833A (en) 2022-04-01

Family

ID=80834017

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210204006.5A Pending CN114268833A (en) 2022-03-03 2022-03-03 Live broadcast content acquisition and processing method and device, electronic equipment and medium

Country Status (1)

Country Link
CN (1) CN114268833A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024022179A1 (en) * 2022-07-26 2024-02-01 北京字跳网络技术有限公司 Media content display method and apparatus, electronic device and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102054510A (en) * 2010-11-08 2011-05-11 武汉大学 Video preprocessing and playing method and system
US20130302005A1 (en) * 2012-05-09 2013-11-14 Youtoo Technologies, LLC Recording and publishing content on social media websites
US20160188155A1 (en) * 2013-02-11 2016-06-30 Inkling Systems, Inc. Creating and editing digital content works
CN108737903A (en) * 2017-04-25 2018-11-02 腾讯科技(深圳)有限公司 A kind of multimedia processing system and multi-media processing method
CN111556329A (en) * 2020-04-26 2020-08-18 北京字节跳动网络技术有限公司 Method and device for inserting media content in live broadcast
CN112565858A (en) * 2019-09-26 2021-03-26 西安诺瓦星云科技股份有限公司 Program editing method and device and program publishing method and device


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220401