CN110662082A - Data processing method, device, system, mobile terminal and storage medium - Google Patents

Data processing method, device, system, mobile terminal and storage medium

Info

Publication number
CN110662082A
CN110662082A (application CN201910944073.9A)
Authority
CN
China
Prior art keywords
data
live
target
resolution
live broadcast
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910944073.9A
Other languages
Chinese (zh)
Inventor
谢纨楠
范威
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd filed Critical Beijing Dajia Internet Information Technology Co Ltd
Priority to CN201910944073.9A priority Critical patent/CN110662082A/en
Publication of CN110662082A publication Critical patent/CN110662082A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44016Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving splicing one content stream with another content stream, e.g. for substituting a video clip

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Databases & Information Systems (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The present disclosure relates to a data processing method, apparatus, system, mobile terminal, and storage medium, applied to a mobile terminal. The method comprises: determining a target file to be played in a live broadcast; collecting first live broadcast data; displaying the first live broadcast data in the live broadcast page at a preset first resolution, and displaying the target file at a preset second resolution; generating second live broadcast data according to the displayed target file; and superimposing the first live broadcast data and the second live broadcast data according to the first resolution and the second resolution to generate target live broadcast data. According to this embodiment, different live contents can be displayed on the live page at the same time, enriching the live broadcast content.

Description

Data processing method, device, system, mobile terminal and storage medium
Technical Field
The present disclosure relates to the field of data processing technologies, and in particular, to a data processing method, apparatus, system, mobile terminal, and storage medium.
Background
With the continued maturation of streaming media technology and the continued improvement of network environments, live broadcasting is applied more and more widely. Compared with ordinary video on demand, live broadcasting offers stronger interactivity. A common form of live broadcasting is an anchor streaming to an audience from his or her own live room.
In the related art, during video live broadcasting, the anchor terminal of a live room uploads the collected live video to a live server in real time, and the audience terminals of the live room acquire the live video from the live server and play it on their display interfaces. As a result, the live content depends on what the camera captures in real time, and the content is monotonous.
Disclosure of Invention
The disclosure provides a data processing method, apparatus, system, mobile terminal, and storage medium, which at least solve the problem in the related art that live broadcast content depends on the content collected by a camera in real time and is therefore monotonous. The technical scheme of the disclosure is as follows:
according to a first aspect of the embodiments of the present disclosure, there is provided a data processing method, which is applied to a mobile terminal, the data processing method including:
determining a target file to be played in live broadcasting;
collecting first live broadcast data;
displaying the first live broadcast data in the live broadcast page at a preset first resolution, and displaying the target file at a preset second resolution;
generating second live broadcast data according to the displayed target file;
and superimposing the first live broadcast data and the second live broadcast data according to the first resolution and the second resolution to generate target live broadcast data.
According to a second aspect of the embodiments of the present disclosure, there is provided a data processing apparatus, which is applied to a mobile terminal, the data processing apparatus including:
the target file determining module is configured to determine a target file to be played in live broadcast;
a first live data acquisition module configured to acquire first live data;
a display module configured to display the first live data in a preset first resolution in the live page and display the target file in a preset second resolution;
the second live data generation module is configured to generate second live data according to the displayed target file;
and the target live broadcast data generation module is configured to superimpose the first live broadcast data and the second live broadcast data according to the first resolution and the second resolution to generate target live broadcast data.
According to a third aspect of embodiments of the present disclosure, there is provided a data processing system comprising a first client, a server, and a second client;
the first client is configured to:
determining a target file to be played in live broadcasting;
collecting first live broadcast data;
displaying the first live broadcast data in the live broadcast page at a preset first resolution, and displaying the target file at a preset second resolution;
generating second live broadcast data according to the displayed target file;
superimposing the first live data and the second live data according to the first resolution and the second resolution to generate target live data;
sending the target live broadcast data to a server;
the server is configured to store the target live broadcast data, obtain a storage address of the target live broadcast data, generate a download link according to the storage address and an IP address of the server, and send the download link to the second client;
the second client is configured to download corresponding target live broadcast data from the server according to the download link, and locally play the target live broadcast data.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a mobile terminal including:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the method described above.
According to a fifth aspect of embodiments of the present disclosure, there is provided a storage medium having instructions that, when executed by a processor of the mobile terminal, enable the mobile terminal to perform the above-described method.
The technical scheme provided by the embodiment of the disclosure at least brings the following beneficial effects:
in this embodiment, after the target file to be played in the live broadcast is determined, the first live broadcast data collected by the terminal and the target file can be displayed simultaneously at different resolutions in the live broadcast page, so that the live broadcast page can display different live contents at the same time, enriching the live content. In addition, the anchor end can push multiple contents simultaneously, enriching live broadcast scenes.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure and are not to be construed as limiting the disclosure.
FIG. 1 is a flow chart illustrating one embodiment of a data processing method according to an exemplary embodiment.
Fig. 2 is a schematic diagram of a live page containing more options, shown in accordance with an example embodiment.
FIG. 3 is a schematic diagram illustrating a live page containing a file sharing interface in accordance with an exemplary embodiment.
FIG. 4 is a diagram illustrating a live page containing a list of album selections in accordance with an exemplary embodiment.
FIG. 5 is a diagram illustrating a live page containing a list of photos in accordance with an exemplary embodiment.
Fig. 6 is a schematic diagram illustrating a live page simultaneously displaying a target file and first live data according to an example embodiment.
FIG. 7 is a block diagram illustrating a data processing apparatus according to an example embodiment.
FIG. 8 is a block diagram illustrating a data processing system in accordance with an exemplary embodiment.
Fig. 9 is a block diagram illustrating an apparatus for performing the above-described method embodiments according to an example embodiment.
Detailed Description
In order to make the technical solutions of the present disclosure better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are capable of operation in sequences other than those illustrated or otherwise described herein. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Fig. 1 is a flowchart illustrating an embodiment of a data processing method according to an exemplary embodiment, where the embodiment may be applied to a mobile terminal, and in particular, may be applied to an anchor client, and may include the following steps.
In step S11, a target file to be played in the live broadcast is determined.
In an embodiment, a file sharing interface may be displayed in a live page of the anchor client. The file sharing interface provides an entry through which the anchor user can select a file on the terminal to share during the live broadcast.
In a possible implementation manner of this embodiment, step S11 may include step S111-step S113.
And step S111, displaying a live broadcast page containing the file sharing interface.
In this step, when the anchor user performs a live broadcast through the mobile terminal, the live broadcast page may include a file sharing interface. For example, as shown in the live page diagrams of figs. 2-3, when the anchor user triggers the "more" function option in part A of the live page in fig. 2, the function option page shown in fig. 3 is entered; the "file sharing" function in part B of that page is the file sharing interface.
Step S112, when the file sharing interface is detected to be triggered, a file list of the terminal is obtained, and the file list is displayed.
In this step, after the anchor user triggers the file sharing interface in the live page, the mobile terminal can call the file interface to acquire a file list formed by all files in the mobile terminal, and display the file list in the live page.
For example, when the anchor user clicks the file sharing interface in fig. 3, a list of all file types on the terminal, such as "album", "audio", "document", and "installation package", may pop up. Assuming the anchor user selects "album", an album selection list as shown in FIG. 4 may be entered, which may include "recent photos" and "all photos" options, from which the anchor user can select the images or videos to share. If the desired images or videos are not in "recent photos", the user can click the "all photos" option to enter the all-photos list shown in FIG. 5 and select the images or videos to share from there.
In step S113, the target file selected from the file list is acquired.
In this step, after the file selected by the user is detected, the file may be taken as a target file.
As one example, the target file may be a multimedia file or a non-multimedia file. A multimedia file may contain image data, audio data, or video data. Non-multimedia files include files on the terminal other than multimedia files, such as Word, PPT, and Excel files, which allows the live broadcast scene to be extended to more fields, such as education, training, or lectures.
In step S12, first live data is acquired.
As an example, the first live data may include first image data collected by a camera of the mobile terminal and/or first audio data collected by a microphone of the mobile terminal.
In step S13, the first live data is displayed in the live page at a first resolution, and the target file is displayed at a second resolution.
In one embodiment, a View control in a UI (User Interface) API (Application Programming Interface) provided by a system of the terminal may be called to display the first live data and the target file in the anchor client. In one example, first live data may be displayed in a first View control at a first resolution; the target file is displayed in the second View control at a second resolution. Of course, the contents displayed in the first View control and the second View control can be switched with each other according to the switching operation of the user.
In one embodiment, the first resolution and the second resolution may be delivered to the anchor client by the live server, and the first resolution and the second resolution stored in the live server may be configured in advance by a developer or an operation and maintenance person.
As an example, the first resolution may be a resolution of a preset widget in the current live page; the second resolution may be a full screen resolution required for the current live broadcast.
For example, as shown in the live page diagram of fig. 6, after the anchor user selects the target file, the acquisition resolution of the anchor end may be adjusted to the size of the small window; that is, the currently captured live view (i.e., the first live data) may be shrunk into the small window at the lower right corner for display. The target file may be displayed full screen. For example, if the target file is a picture, the picture may be scaled to the second resolution required for the live broadcast using a scaling library such as libyuv (a Google open-source library for conversion, rotation, and scaling between YUV and RGB); if the target file is a video, the video may be rendered through OpenGL (Open Graphics Library), with a GPU (Graphics Processing Unit) in the rendering pipeline scaling the video frames to the second resolution required for the live broadcast.
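As a rough illustration of the scaling step, the following pure-Python nearest-neighbor sketch stands in for the optimized libyuv/GPU paths the patent mentions; the function name and frame representation (a list of rows of pixel values, e.g. one luma plane) are assumptions for this sketch.

```python
def scale_nearest(frame, dst_w, dst_h):
    """Scale a 2-D frame (list of rows of pixel values) to dst_w x dst_h
    using nearest-neighbor sampling, as a stand-in for libyuv/GPU scaling."""
    src_h, src_w = len(frame), len(frame[0])
    return [
        [frame[y * src_h // dst_h][x * src_w // dst_w] for x in range(dst_w)]
        for y in range(dst_h)
    ]

# Upscale a 2x2 luma plane to a 4x4 "second resolution".
small = [[10, 20],
         [30, 40]]
big = scale_nearest(small, 4, 4)
```

In practice libyuv's scalers or a GPU shader would do this far more efficiently and with proper filtering; the sketch only shows the resolution change itself.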
In one example, the anchor user may also drag the widget into place or stretch the size of the widget.
In step S14, second live data is generated according to the displayed target file.
As one example, the second live data may include second image data and/or second audio data.
In one example, if the target file is image data, that image data may be used as the second image data. Specifically, the YUV data ("Y" denotes luminance (luma); "U" and "V" denote chrominance (chroma)) obtained by decoding the image data may be used as the second image data. Since the target file is image data, there is no audio data; that is, the second image data alone serves as the second live data.
In another example, if the target file is audio data, the audio data may be used as the second audio data. Specifically, PCM (Pulse Code Modulation) data obtained by decoding the audio data may be used as the second audio data, and since the target file is audio data, there is no image data, that is, the second audio data is used as the second live data.
In another example, if the target file is video data, the video data may be decoded to obtain corresponding second image data and second audio data as second live data. Specifically, the video data may be decoded, and the decoded YUV data may be used as the second image data, and the decoded PCM data may be used as the second audio data.
In another example, if the target file is a non-multimedia file, a screenshot may be performed on the displayed non-multimedia file to obtain second image data, and since the target file is non-multimedia data, there is no audio data, that is, the second image data obtained by the screenshot is used as second live data. In implementation, for a non-multimedia file, during display, an API for a system of the terminal to provide a preview of the file may be called to display on a View, and then the content of the View may be changed into a bitmap image through Graphics Contexts (Graphics Contexts), so as to obtain second image data.
In step S15, the first live data and the second live data are superimposed according to the first resolution and the second resolution, so as to generate target live data.
In this embodiment, the anchor client may merge the first live data and the second live data before pushing them out, reducing the processing required on the viewer side.
In one possible embodiment, the target live data includes target image data and/or target audio data; step S15 may further include steps S151-S152.
And step S151, according to the first resolution and the second resolution, overlapping the first image data acquired by the camera at the current moment with the second image data to generate target image data at the current moment.
In this step, the image data in the merged live broadcast data may be a superposition of the captured first image data and the decoded second image data.
In a possible implementation manner of this embodiment, step S151 may include steps S1511 to S1515.
Step S1511, an intersection region of the first resolution and the second resolution is obtained.
Step S1512 determines first pixel data located in the intersection region in the first live broadcast data, and second pixel data located in the intersection region in the second live broadcast data.
Step S1513, if the first live data is displayed above the second live data, displaying the first pixel data in the intersection region.
Step S1514, if the second live data is displayed above the first live data, displaying the second pixel data in the intersection region.
And step S1515, combining the intersection area and information displayed in other areas except the intersection area in the live broadcast page to generate target image data.
For example, as shown in fig. 6, if the resolution of the widget is the first resolution, the widget is used to display first live data (i.e., a main broadcast picture) collected by a camera of the terminal. The resolution of the live broadcast page is a second resolution, and is used for displaying the content of the target file, and then the intersection area of the first resolution and the second resolution is the area of the small window. Since the widget displays the first live broadcast data in fig. 6, the pixel data of the pixel point located in the intersection area in the second live broadcast data corresponding to the target file may be replaced with the pixel data of the pixel point corresponding to the first live broadcast data, so as to obtain the target image data.
Similarly, if the widget in fig. 6 displays the content of the target file and the live broadcast page displays the first live broadcast data in full screen, the pixel data in the widget in the first live broadcast data may be replaced with the second live broadcast data corresponding to the target file.
In another embodiment, if the regions at the first resolution and the second resolution only partially overlap (unlike the small window shown in fig. 6, which lies entirely within the live page), the pixel data in the overlapping area may be set to the content at the corresponding position of the live data displayed at the uppermost layer, and the target image data may be generated by combining this with the content displayed in the other areas of the live page.
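Steps S1511-S1515 amount to copying the top layer's pixels into the intersection region and keeping the bottom layer elsewhere. A minimal sketch, with the frame layout (2-D lists of pixel values) and function name as assumptions:

```python
def superimpose(base, overlay, x0, y0):
    """Superimpose `overlay` (the layer displayed on top, e.g. the small
    window at the first resolution) onto `base` (the full-screen layer at
    the second resolution), with the overlay's top-left corner at (x0, y0).
    Pixels inside the intersection region are replaced by the overlay's."""
    out = [row[:] for row in base]          # keep pixels outside the region
    for dy, row in enumerate(overlay):
        for dx, px in enumerate(row):
            out[y0 + dy][x0 + dx] = px      # top layer wins (steps S1513/S1514)
    return out

page = [[0] * 4 for _ in range(4)]          # 4x4 full-screen target file
widget = [[7, 7], [7, 7]]                   # 2x2 anchor picture
merged = superimpose(page, widget, 2, 2)    # widget in the lower-right corner
```

Swapping the roles of `base` and `overlay` gives the opposite layering described in fig. 6's alternative, where the widget shows the target file and the page shows the anchor picture.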
In step S152, the target audio data is determined.
In one example, the target audio data in the merged target live broadcast data is determined as follows: if both the first audio data and the second audio data exist, the target audio data is the audio data obtained by mixing the first audio data and the second audio data. In this embodiment, if the microphone of the terminal collects first audio data and second audio data can be decoded from the target file (e.g., the target file is a video file), the target audio data is the mix of the two. Any general mixing method may be used; this embodiment does not limit the mixing method.
In another example, if one of the first audio data and the second audio data exists, the target audio data may be determined to be the existing first audio data or second audio data. In this embodiment, if the microphone of the terminal acquires the first audio data but cannot decode the second audio data from the target file (for example, the target file is a picture, a word, a PPT, or the like), the target audio data may be determined to be the first audio data. If the first audio data is not collected by the microphone of the terminal but the second audio data can be decoded from the target file (for example, the target file is a video file or an audio file), the target audio data can be determined as the second audio data.
In another example, if neither the first audio data nor the second audio data exists, it may be determined that the target audio data is empty. In this embodiment, if the microphone of the terminal cannot acquire the first audio data and cannot decode the second audio data from the target file, the target audio data is null, that is, there is no audio data at the current time.
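The three cases above (mix when both streams exist, pass through whichever exists, empty otherwise) can be sketched on 16-bit PCM samples. The add-and-clip mixer used here is one common choice, not an algorithm specified by the patent, and the function name is an assumption.

```python
def mix_pcm(first, second):
    """Determine the target audio per the three cases above. `first` and
    `second` are lists of signed 16-bit PCM samples, or None if absent."""
    if first is None and second is None:
        return None                          # no audio data at this moment
    if first is None:
        return list(second)                  # only the decoded file audio exists
    if second is None:
        return list(first)                   # only the microphone audio exists
    # Both exist: add samples and clip to the 16-bit range (one simple mixer).
    return [max(-32768, min(32767, a + b)) for a, b in zip(first, second)]

mixed = mix_pcm([1000, 30000], [500, 10000])  # second sample clips at 32767
```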
After the target image data and the target audio data are obtained, they can be combined into the target live broadcast data to be pushed.
After obtaining the target live data, in one embodiment, the target live data may be sent to a second user, so that the second user plays the target live data.
In one implementation, the target live broadcast data may be sent through the live broadcast server to edge servers for storage; the IP address of one of the edge servers storing the target live broadcast data is then used to generate a download address, which is sent to the second user, where the second user may be an audience user.
After receiving the download address, the second user can request to download the target live broadcast data from the corresponding edge server, and locally play the target live broadcast data.
In one embodiment, the frame rate in the live broadcast process may be determined as follows: and if the target file is a non-video file, transmitting the target live broadcast data according to a preset frame rate. And if the target file is a video file, transmitting the target live broadcast data according to the frame rate of the decoded video file.
For example, if the target file is a picture, the target live broadcast data may be sent periodically at a preset frame rate, where if the anchor user selects one picture, the second image data in the target live broadcast data sent each time is the same picture, and if the anchor user selects multiple pictures, the second image data in the target live broadcast data sent each time is the multiple pictures selected in turn. If the target file is a video file, the target live broadcast data can be sent according to the frame rate decoded by the video file.
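A minimal pacing loop for the non-video case (sending the selected picture(s) at a preset frame rate) might look like the following; the sender callback, frame rate value, and function name are assumptions for this sketch, and real pacing would sleep between sends.

```python
import time

def push_at_frame_rate(frames, fps, send, duration_frames):
    """Push frames at a fixed frame rate. For a picture target file,
    `frames` cycles over the selected picture(s); for a video target file,
    fps would instead follow the decoded video's own frame rate."""
    interval = 1.0 / fps                     # seconds between frames
    sent = []
    for i in range(duration_frames):
        frame = frames[i % len(frames)]      # cycle through selected pictures
        send(frame)
        sent.append(frame)
        # time.sleep(interval)  # real pacing; skipped here for illustration
    return sent

out = push_at_frame_rate(["picA", "picB"], 15, lambda f: None, 5)
```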
In this embodiment, the anchor user can play the selected target file by triggering the file sharing interface in the live broadcasting process of the mobile terminal, so that the target file and the anchor picture can be simultaneously displayed in a live broadcasting page, and the anchor terminal can simultaneously push a plurality of contents, thereby enriching the live broadcasting scene and the live broadcasting contents.
Furthermore, since the anchor side in this embodiment merges the first live data and the second live data before publishing, the live data received by the viewer side is already merged. The viewer side therefore cannot move the first live data or the second live data within the playing page (for example, it cannot reposition the small window), which ensures consistency between the viewer picture and the anchor picture and reduces the processing required on the viewer side.
FIG. 7 is a block diagram illustrating a data processing apparatus according to an example embodiment. Referring to fig. 7, the apparatus may be applied to a mobile terminal, and includes a target file determining module 701, a first live data collecting module 702, a display module 703, a second live data generating module 704, and a target live data generating module 705.
A target file determining module 701 configured to determine a target file to be played in a live broadcast;
a first live data acquisition module 702 configured to acquire first live data;
a display module 703 configured to display the first live data in the live page at a preset first resolution, and display the target file at a preset second resolution;
a second live data generating module 704 configured to generate second live data according to the displayed target file;
a target live broadcast data generating module 705, configured to superimpose the first live broadcast data and the second live broadcast data according to the first resolution and the second resolution, so as to generate target live broadcast data.
In a possible implementation manner of this embodiment, the first live data includes first image data collected by a camera of the mobile terminal and/or first audio data collected by a microphone of the mobile terminal; the second live data comprises second image data and/or second audio data; the target live broadcast data comprises target image data and/or target audio data;
the target live data generation module 705 includes:
the target image data generation submodule is configured to superimpose first image data acquired by the camera at the current moment and second image data according to the first resolution and the second resolution to generate target image data at the current moment;
a target audio data generation submodule configured to determine target audio data, wherein the target audio data comprises one of: if the first audio data and the second audio data exist, the target audio data is audio data obtained after the first audio data and the second audio data are subjected to sound mixing; if any one of the first audio data and the second audio data exists, the target audio data is the existing first audio data or second audio data; and if the first audio data and the second audio data do not exist, the target audio data is empty.
In a possible implementation manner of this embodiment, the target image data generation sub-module is specifically configured to:
acquiring an intersection region of the first resolution and the second resolution;
determining first pixel data located in the intersection region in the first live broadcast data and second pixel data located in the intersection region in the second live broadcast data;
if the first live broadcast data is displayed above the second live broadcast data, displaying the first pixel data in the intersection region;
if the second live broadcast data is displayed above the first live broadcast data, displaying the second pixel data in the intersection region;
and combining the intersection area and information displayed in other areas except the intersection area in the live broadcast page to generate target image data.
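The intersection-region compositing steps above can be modeled as a small sketch. The `{(x, y): pixel}` dict representation is an assumption for illustration (each dict's key set stands in for that layer's resolution rectangle); real frames would be contiguous pixel buffers.

```python
def overlay_frames(first_pixels, second_pixels, first_on_top):
    """Combine two layers: inside the intersection of their resolution
    rectangles, the layer displayed on top supplies the pixels; outside
    the intersection, each layer keeps whatever it displays there."""
    intersection = first_pixels.keys() & second_pixels.keys()
    target = dict(second_pixels)
    target.update(first_pixels)  # first layer's pixels win inside the intersection...
    if not first_on_top:
        for xy in intersection:
            target[xy] = second_pixels[xy]  # ...unless the second layer is on top
    return target
```

The final merge with "other areas except the intersection area" falls out of the dict union: keys present in only one layer are copied through unchanged.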
In a possible implementation manner of this embodiment, the target file includes a multimedia file or a non-multimedia file; the second live data generation module 704 is specifically configured to:
if the target file is image data, taking the image data as second image data;
if the target file is audio data, the audio data is used as second audio data;
if the target file is video data, decoding the video data to obtain corresponding second image data and second audio data;
and if the target file is a non-multimedia file, capture a screenshot of the displayed non-multimedia file to obtain second image data.
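The dispatch performed by the second live data generation module 704 can be sketched as below. `decode_video` and `capture_screenshot` are hypothetical placeholders standing in for a real decoder and a real view-rasterization step, neither of which the patent specifies.

```python
def decode_video(data):
    # Placeholder decoder: a real one would demux and decode the container
    # into image frames and PCM audio.
    return data["frames"], data["audio"]

def capture_screenshot(name):
    # Placeholder for rasterizing the rendered non-multimedia document view.
    return "screenshot:" + name

def generate_second_live_data(target_file):
    """Dispatch on the target file's kind and return the pair
    (second_image_data, second_audio_data)."""
    kind = target_file["kind"]
    if kind == "image":
        return target_file["data"], None
    if kind == "audio":
        return None, target_file["data"]
    if kind == "video":
        return decode_video(target_file["data"])
    # Non-multimedia file (e.g. a document): screenshot the displayed view.
    return capture_screenshot(target_file["name"]), None
```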
In a possible implementation manner of this embodiment, the display module 703 includes:
a first display sub-module configured to display the first live data in a first View control at the first resolution;
a second display sub-module configured to display the target file in a second View control at the second resolution.
In a possible implementation manner of this embodiment, the first resolution is a resolution of a preset small window in the current live broadcast page; the second resolution is a full-screen resolution required by the current live broadcast.
In a possible implementation manner of this embodiment, the target file determining module 701 includes:
the live broadcast page display sub-module is configured to display a live broadcast page containing a file sharing interface;
the file list display sub-module is configured to acquire a file list of the terminal and display the file list when it is detected that the file sharing interface is triggered;
and the target file determining sub-module is configured to acquire the selected target file from the file list.
In a possible implementation manner of this embodiment, if the target file is a non-video file, the target live data is sent at a preset frame rate; and if the target file is a video file, transmitting target live broadcast data according to the frame rate of the decoded video file.
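The frame-rate rule just stated can be written as a one-branch helper. The concrete preset value of 30 fps is an assumption for illustration; the text only says "a preset frame rate".

```python
PRESET_FRAME_RATE = 30  # assumed value; the patent only says "preset frame rate"

def sending_frame_rate(target_is_video, decoded_video_fps=None):
    """Non-video target files send target live data at the preset rate;
    video files send at the decoded video file's own frame rate."""
    if target_is_video:
        return decoded_video_fps
    return PRESET_FRAME_RATE
```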
FIG. 8 is a block diagram illustrating a data processing system in accordance with an exemplary embodiment. Referring to fig. 8, the system may include a first client 10, a server 20, and a second client 30.
The first client 10 is configured to:
determining a target file to be played in live broadcasting;
collecting first live broadcast data;
displaying the first live broadcast data in the live broadcast page at a preset first resolution, and displaying the target file at a preset second resolution;
generating second live broadcast data according to the displayed target file;
according to the first resolution and the second resolution, overlapping the first live data and the second live data to generate target live data;
sending the target live broadcast data to a server;
the server 20 is configured to store the target live broadcast data, obtain a storage address of the target live broadcast data, generate a download link according to the storage address and an IP address of the server, and send the download link to the second client;
the second client 30 is configured to download the corresponding target live data from the server according to the download link, and locally play the target live data.
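The server's link-generation step can be sketched as below. The URL layout is hypothetical: the text only says the download link is generated from the server's IP address and the storage address of the target live broadcast data.

```python
def make_download_link(server_ip, storage_address):
    # Hypothetical URL scheme combining the server IP and the storage
    # address; real deployments would likely also carry a port, scheme
    # choice (https), and access token.
    return "http://{}/{}".format(server_ip, storage_address.lstrip("/"))
```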
With regard to the apparatus and system in the above embodiments, the specific manner in which each module performs operations has been described in detail in the method embodiments and is not repeated here.
Fig. 9 is a block diagram illustrating an apparatus for performing the above-described method embodiments according to an example embodiment.
In an exemplary embodiment, there is also provided a storage medium comprising instructions, such as a memory comprising instructions, executable by a processor of an apparatus to perform the method embodiment of fig. 1 described above. Alternatively, the storage medium may be a non-transitory computer readable storage medium, which may be, for example, a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
An embodiment of the present disclosure further provides a mobile terminal, including: a processor; a memory for storing the processor-executable instructions; wherein the processor is configured to execute the instructions to implement the method of the embodiment of fig. 1 described above.
The embodiment of the present disclosure also provides a storage medium, and when instructions in the storage medium are executed by a processor of the mobile terminal, the mobile terminal is enabled to execute the method in the embodiment of fig. 1.
The disclosed embodiments also provide a computer program product comprising executable program code, wherein the program code, when executed by the above-described apparatus, implements the method according to the embodiment of fig. 1.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. A data processing method, applied to a mobile terminal, the method comprising the following steps:
determining a target file to be played in live broadcasting;
collecting first live broadcast data;
displaying the first live broadcast data in the live broadcast page at a preset first resolution, and displaying the target file at a preset second resolution;
generating second live broadcast data according to the displayed target file;
and overlapping the first live broadcast data and the second live broadcast data according to the first resolution and the second resolution to generate target live broadcast data.
2. The data processing method according to claim 1, wherein the first live data comprises first image data collected by a camera of the mobile terminal and/or first audio data collected by a microphone of the mobile terminal; the second live data comprises second image data and/or second audio data; the target live broadcast data comprises target image data and/or target audio data;
the step of superimposing the first live data and the second live data according to the first resolution and the second resolution to generate target live data includes:
superimposing, according to the first resolution and the second resolution, the first image data acquired by the camera at the current moment with the second image data to generate target image data at the current moment;
determining target audio data, wherein the target audio data comprises one of: if the first audio data and the second audio data exist, the target audio data is audio data obtained after the first audio data and the second audio data are subjected to sound mixing; if any one of the first audio data and the second audio data exists, the target audio data is the existing first audio data or second audio data; and if the first audio data and the second audio data do not exist, the target audio data is empty.
3. The data processing method according to claim 2, wherein the step of superimposing, according to the first resolution and the second resolution, the first image data acquired by the camera at the current time with the second image data, and generating the target image data at the current time comprises:
acquiring an intersection region of the first resolution and the second resolution;
determining first pixel data located in the intersection region in the first live broadcast data and second pixel data located in the intersection region in the second live broadcast data;
if the first live broadcast data is displayed above the second live broadcast data, displaying the first pixel data in the intersection region;
if the second live broadcast data is displayed above the first live broadcast data, displaying the second pixel data in the intersection region;
and combining the intersection area and information displayed in other areas except the intersection area in the live broadcast page to generate target image data.
4. A data processing method according to claim 2 or 3, wherein the target file comprises a multimedia file or a non-multimedia file; the multimedia file comprises image data and/or video data and/or audio data;
the step of generating second live data according to the displayed target file comprises the following steps:
if the target file is image data, taking the image data as second image data;
if the target file is audio data, the audio data is used as second audio data;
if the target file is video data, decoding the video data to obtain corresponding second image data and second audio data;
and if the target file is a non-multimedia file, capturing a screenshot of the displayed non-multimedia file to obtain second image data.
5. A data processing method according to any of claims 1-3, wherein the step of displaying the first live data in the live page at the first resolution and displaying the target file at the second resolution comprises:
displaying the first live data in a first View control at the first resolution;
displaying the target file in a second View control at the second resolution.
6. A data processing method according to any of claims 1 to 3, wherein the first resolution is a resolution of a preset small window in a current live page; the second resolution is a full screen resolution required by the current live broadcast.
7. A data processing apparatus, wherein the apparatus is applied to a mobile terminal, the data processing apparatus comprising:
the target file determining module is configured to determine a target file to be played in live broadcast;
a first live data acquisition module configured to acquire first live data;
a display module configured to display the first live data in a preset first resolution in the live page and display the target file in a preset second resolution;
the second live data generation module is configured to generate second live data according to the displayed target file;
and the target live broadcast data generation module is configured to superimpose the first live broadcast data and the second live broadcast data according to the first resolution and the second resolution to generate target live broadcast data.
8. A data processing system, comprising a first client, a server, and a second client;
the first client is configured to:
determining a target file to be played in live broadcasting;
collecting first live broadcast data;
displaying the first live broadcast data in the live broadcast page at a preset first resolution, and displaying the target file at a preset second resolution;
generating second live broadcast data according to the displayed target file;
according to the first resolution and the second resolution, overlapping the first live data and the second live data to generate target live data;
sending the target live broadcast data to a server;
the server is configured to store the target live broadcast data, obtain a storage address of the target live broadcast data, generate a download link according to the storage address and an IP address of the server, and send the download link to the second client;
the second client is configured to download corresponding target live broadcast data from the server according to the download link, and locally play the target live broadcast data.
9. A mobile terminal, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the method of any one of claims 1-6.
10. A storage medium, wherein instructions in the storage medium, when executed by a processor of a mobile terminal, enable the mobile terminal to perform the method of any one of claims 1-6.
CN201910944073.9A 2019-09-30 2019-09-30 Data processing method, device, system, mobile terminal and storage medium Pending CN110662082A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910944073.9A CN110662082A (en) 2019-09-30 2019-09-30 Data processing method, device, system, mobile terminal and storage medium

Publications (1)

Publication Number Publication Date
CN110662082A true CN110662082A (en) 2020-01-07

Family

ID=69038420

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910944073.9A Pending CN110662082A (en) 2019-09-30 2019-09-30 Data processing method, device, system, mobile terminal and storage medium

Country Status (1)

Country Link
CN (1) CN110662082A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023272652A1 (en) * 2021-06-30 2023-01-05 东莞市小精灵教育软件有限公司 Image preprocessing method and apparatus, computer device, and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106162221A (en) * 2015-03-23 2016-11-23 阿里巴巴集团控股有限公司 The synthetic method of live video, Apparatus and system
CN106658215A (en) * 2016-12-15 2017-05-10 北京小米移动软件有限公司 Method and device for pushing live file
CN107396166A (en) * 2017-08-07 2017-11-24 北京小米移动软件有限公司 The method and device of live middle display video
CN107566878A (en) * 2017-08-07 2018-01-09 北京小米移动软件有限公司 The method and device of live middle display picture
CN108259989A (en) * 2018-01-19 2018-07-06 广州华多网络科技有限公司 Method, computer readable storage medium and the terminal device of net cast
CN109327741A (en) * 2018-11-16 2019-02-12 网易(杭州)网络有限公司 Game live broadcasting method, device and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200107