CN111083511A - Video transmission method and device - Google Patents
Video transmission method and device
- Publication number
- CN111083511A (application number CN201911343357.9A)
- Authority
- CN
- China
- Prior art keywords
- video data
- video
- mec server
- terminal
- base station
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/2187—Live feed
- H04N21/21805—Source of audio or video content, e.g. local disk arrays enabling multiple viewpoints, e.g. using a plurality of cameras
- H04N21/2747—Remote storage of video programs received via the downstream path, e.g. from the server
- H04N21/6587—Control parameters, e.g. trick play commands, viewpoint selection
- H04N7/22—Adaptations for optical transmission
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
The application provides a video transmission method and a video transmission apparatus, relates to the field of communications, and can reduce the delay of video data transmission. The method includes the following steps: an MEC server receives, through a base station, first video data captured by a camera; the MEC server receives a video request message sent by a live broadcast management platform, where the video request message is used to indicate that a first terminal requests second video data, and the second video data is part or all of the first video data; and the MEC server sends, according to the video request message, the second video data to the first terminal through the base station. The embodiments of the application are used in the process of transmitting video data.
Description
Technical Field
The present application relates to the field of mobile communications, and in particular, to a video transmission method and apparatus.
Background
With the continuous development of network technology, Internet video transmission is widely used. The real-time nature of Internet video makes it possible to transmit real-time video streams, for example in live webcasts. Real-time Internet video transmission involves a large amount of video data and places high requirements on video delay and picture quality.
At present, when a user watches a live video broadcast on a terminal, the video data captured by the camera needs to be transmitted to a core network through a base station, the core network transmits the video data to a cloud server, and finally the cloud server sends the video data to the terminal through the base station. However, because the distance between the base station and the core network is long, the transmission of video data between them takes a long time, which leads to a high video transmission delay and reduces the smoothness of real-time video transmission.
Disclosure of Invention
The embodiments of the present invention provide a video transmission method and apparatus, which can reduce the time required to transmit video data.
To achieve the above objective, the present invention provides the following technical solutions:
In a first aspect, the present application provides a video transmission method, including: an MEC server receives, through a base station, first video data captured by a camera; the MEC server receives a video request message sent by a live broadcast management platform, where the video request message is used to indicate that a first terminal requests second video data, and the second video data is part or all of the first video data; and the MEC server sends, according to the video request message, the second video data to the first terminal through the base station.
Based on this technical solution, the MEC server is deployed beside the base station that receives the video data captured by the camera, and the base station and the MEC server are connected by a cable (for example, an optical fiber). When watching a live video, a user therefore only needs to receive the video data that the MEC server sends to the terminal through the base station, and does not need to receive video data that the core network first sends to the cloud server and the cloud server then sends to the terminal through the base station. Because the distance between the MEC server and the base station is smaller than the distance between the core network and the base station, the time needed to transmit the video data from the event organizer to the user terminal is reduced, which improves the user's experience of watching the live broadcast.
In one possible design, if the first video data includes video data from multiple views, the second video data is video data from a first view, and the first view is one of the multiple views.
In one possible design, the method further includes: the MEC server receives a video switching message sent by the live broadcast management platform, where the video switching message is used to indicate that the first terminal requests video data of a second view angle in the first video data, the second view angle is different from the first view angle, and the second view angle is one of the multiple view angles; and the MEC server sends, according to the video switching message, the video data of the second view angle to the first terminal through the base station.
In one possible design, the method further includes: the MEC server generates third video data according to the first video data, where the resolution of the third video data is lower than the resolution of the first video data; and the MEC server sends the third video data to the cloud server.
In a second aspect, the present application provides an MEC server, including: a receiving module, configured to receive, through a base station, first video data captured by a camera, and to receive a video request message sent by a live broadcast management platform, where the video request message is used to indicate that a first terminal requests second video data, and the second video data is part or all of the first video data; and a sending module, configured to send, according to the video request message, the second video data to the first terminal through the base station.
In one possible design, if the first video data includes video data from multiple views, the second video data is video data from a first view, and the first view is one of the multiple views.
In one possible design, the receiving module is further configured to receive a video switching message sent by the live broadcast management platform, where the video switching message is used to instruct the first terminal to request video data of a second view angle in the first video data, the second view angle is different from the first view angle, and the second view angle is one view angle in multiple view angles; and the sending module is further used for sending the video data of the second visual angle to the first terminal through the base station according to the video switching message.
In one possible design, the MEC server further includes: the processing module is used for generating third video data according to the first video data, and the resolution of the third video data is smaller than that of the first video data; the sending module is further configured to send the third video data to the cloud server.
In a third aspect, the present invention provides an MEC server, including: a processor, a memory, and a communication interface; where the memory is configured to store one or more programs, the one or more programs include computer-executable instructions, and when the processor executes the computer-executable instructions, the MEC server implements the video transmission method described in the first aspect and any possible implementation of the first aspect.
In a fourth aspect, the present invention provides a computer-readable storage medium having stored therein instructions that, when executed on a terminal, cause the terminal to perform a video transmission method as described in the first aspect and any one of its possible implementations.
In a fifth aspect, an embodiment of the present invention provides a computer program product containing instructions that, when run on a server, cause the server to perform the video transmission method as described in the first aspect and any one of the possible implementations of the first aspect.
In a sixth aspect, an embodiment of the present invention provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a computer program or instructions to implement the video transmission method as described in the first aspect and any possible implementation manner of the first aspect.
Specifically, the chip provided in the embodiment of the present invention further includes a memory for storing a computer program or instructions.
Drawings
Fig. 1 is a system architecture diagram of a communication system according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a video transmission system according to an embodiment of the present invention;
fig. 3 is a flowchart of a video transmission method according to an embodiment of the present invention;
fig. 4 is a view of a video shooting scene according to an embodiment of the present invention;
fig. 5 is a flowchart of another video transmission method according to an embodiment of the present invention;
fig. 6 is a flowchart of another video transmission method according to an embodiment of the present invention;
fig. 7 is a flowchart of another video transmission method according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of an MEC server according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of another MEC server according to an embodiment of the present invention.
Detailed Description
The character "/" herein generally indicates that the former and latter associated objects are in an "or" relationship. For example, A/B may be understood as A or B.
In the description of the present invention, the meaning of "a plurality" means two or more unless otherwise specified.
Furthermore, the terms "comprising" and "having" and any variations thereof as used in the description of the invention are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements, but may also include other steps or elements not expressly listed or inherent to such a process, method, system, article, or apparatus.
In addition, in the embodiments of the present invention, words such as "exemplary" or "for example" are used to mean serving as examples, illustrations or explanations. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "e.g.," is intended to present concepts in a concrete fashion.
In order to facilitate understanding of the technical solutions of the present application, some technical terms are described below.
1. MEC
Mobile Edge Computing (MEC) is a technology, based on the 5G evolution architecture, that deeply integrates the mobile access network with Internet services. Using the radio access network, MEC provides telecom users nearby with the IT services and cloud computing capabilities they need, creating a carrier-class service environment with high performance, low delay and high bandwidth, and accelerating the delivery of content, services and applications in the network, so that consumers enjoy an uninterrupted, high-quality network experience. On one hand, MEC improves user experience and saves bandwidth resources; on the other hand, by sinking computing capability to the mobile edge node, it supports third-party application integration and opens up broad possibilities for service innovation at the mobile edge.
2. Encoding
Encoding is the process of converting information from one form or format to another; in the context of a computer programming language it is also referred to simply as coding. Characters, numbers or other objects are converted into digital codes by a predetermined method, or information and data are converted into predetermined electrical pulse signals. Codes are widely used in electronic computers, television, remote control, communications, and other fields. Decoding is the inverse process of encoding.
3. Decoding
Decoding is the process of restoring digital codes to the content they represent, or of converting electrical pulse signals, optical signals, radio waves, and the like back into the information or data they represent by a specific method. In other words, decoding is the process by which the receiver restores received symbols or codes to information, and it corresponds to the encoding process.
4. Frame extraction
Frame extraction refers to extracting a number of frames from a video at a certain interval, which simulates the process of taking a picture at fixed time intervals and combining those pictures into a video.
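For illustration only (not part of the claimed method), the following minimal Python sketch shows one common way to perform frame extraction with OpenCV; the file name and sampling interval are assumptions.

```python
import cv2

def extract_frames(video_path, interval=10):
    """Keep every `interval`-th frame of a video: a simple frame-extraction pass."""
    capture = cv2.VideoCapture(video_path)
    kept = []
    index = 0
    while True:
        ok, frame = capture.read()
        if not ok:          # end of stream or read error
            break
        if index % interval == 0:
            kept.append(frame)
        index += 1
    capture.release()
    return kept

# Hypothetical usage: thin a 30 fps clip down to about 3 fps
frames = extract_frames("first_video.mp4", interval=10)
```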
5. Stream pulling
Stream pulling refers to the process in which, when the server already holds live content, the client establishes a connection with the server according to the protocol type (such as RTMP, RTP, RTSP or HTTP) and pulls the data.
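As a hedged illustration of stream pulling (not the patent's implementation), the sketch below uses OpenCV's FFmpeg backend to connect to a streaming server and pull frames; the RTSP URL is a placeholder assumption, and any RTMP/RTSP/HTTP address supported by the backend would work the same way.

```python
import cv2

STREAM_URL = "rtsp://example.invalid/live/channel1"  # placeholder source

capture = cv2.VideoCapture(STREAM_URL)   # the client connects and begins pulling data
while capture.isOpened():
    ok, frame = capture.read()           # each read() yields one decoded frame
    if not ok:
        break
    cv2.imshow("pulled stream", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
capture.release()
cv2.destroyAllWindows()
```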
In the present application, the video transmission method may be applied to a plurality of application scenarios, for example, live broadcasting, remote conference, and the like. The following specifically describes embodiments in the present application with reference to a live application scenario.
Fig. 1 shows an architecture diagram of a communication system provided in an embodiment of the present application. The communication system may include: a camera 10, a base station 20, an MEC server 30, a terminal 40, a core network 50, a cloud server 60, a base station 70 and a terminal 80.
For example, as shown in fig. 1, the camera 10 transmits captured video data to the MEC server 30 through the base station 20, and the MEC server 30 transmits the video data to the terminal 40 through the base station 20.
In another embodiment, as shown in fig. 1, the camera 10 transmits the captured video data to the MEC server 30 through the base station 20. The MEC server 30 transmits the video data to the core network 50. After that, the core network 50 transmits the video data to the cloud server 60, and the cloud server 60 transmits the video data to the terminal 80 through the base station 70.
The camera is used to capture video data. The cameras include non-panoramic cameras and/or panoramic cameras. The terminals include terminals that support playing panoramic video data and terminals that do not support playing panoramic video data. For example, a terminal that supports playing panoramic video data may be a pair of VR glasses, and a terminal that does not support playing panoramic video data may be a smartphone.
It should be noted that, in the embodiment of the present invention, the number of cameras, base stations, and terminals in fig. 1 is only illustrative, and any number of cameras, base stations, and terminals may be provided according to an actual application scenario.
As shown in fig. 2, a video transmission system provided in an embodiment of the present application includes: the system comprises a camera, a live broadcast management platform, an MEC server, a cloud server and a terminal.
The live broadcast management platform is used to receive video management messages sent by the terminal.
The MEC server includes: a live broadcast module, a local streaming media module, a video storage module, a platform public information module and a service database module. It should be noted that the live broadcast module is used to encode, transcode and play video data, among other operations. The local streaming media module is used to perform stream-pulling operations. The video storage module is used to store the video data captured by the camera. The platform public information module is used to store information about the MEC server. The service database module is used to store request messages, users' account information, and the like.
The cloud server includes: a live broadcast management module, a video storage module and an account management module. It should be noted that the live broadcast management module is used to store video control messages. The video storage module is used to store video data. The account management module is used to manage users' account information.
As shown in fig. 3, a video transmission method provided in an embodiment of the present application includes the following steps.
S101, the MEC server receives, through the base station, first video data captured by the camera.
The MEC server is connected to the base station through a cable. Illustratively, the cable includes an optical cable or an electrical cable.
When the camera is a non-panoramic camera, the first video data is non-panoramic video data.
When the camera is a panoramic camera, the first video data includes panoramic video data and/or non-panoramic video data. The non-panoramic video data refers to the data of one view angle captured by one lens of the panoramic camera. The panoramic video data is 360-degree video data formed from the video data of different view angles captured by the multiple lenses of the panoramic camera.
It is understood that, in practical applications, the first video data shot by the panoramic camera includes video data of a plurality of view angles.
Optionally, after receiving the first video data, the MEC server stores the first video data.
S102, the MEC server receives a video request message sent by the live broadcast management platform.
The video request message is used for indicating the first terminal to request the second video data. The second video data is part or all of the first video data.
Illustratively, in a case where the first video data is panoramic video data, the second video data is panoramic video data. In a case where the first video data is non-panoramic video data, the second video data is non-panoramic video data.
Optionally, the video request message includes: a terminal identifier and a video data identifier. The terminal identifier is either the identifier of a terminal that supports playing panoramic video data or the identifier of a terminal that does not support playing panoramic video data. The video data identifier includes: a panoramic video data identifier or a non-panoramic video data identifier.
Optionally, the video request message further includes: video period information. When the video request message includes video period information, the second video data may also be video clip data.
Illustratively, when a user watches a live sports event on the terminal and misses the video from 10 min 01 s to 15 min 30 s after the start of the game, the video period is 10 min 01 s to 15 min 30 s, and the second video data is the clip of the first video data from 10 min 01 s to 15 min 30 s.
It can be understood that the MEC server can perform a video playback function according to the video period. Therefore, when the user misses part of the video or wants to watch a certain video segment again, the user can select a video period to watch.
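Purely as an illustration of the request fields described above, the sketch below models the video request message as a small data structure; all field names and the second-based period encoding are assumptions, not the patent's message format.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class VideoRequestMessage:
    """Illustrative shape of the video request message sent by the live broadcast platform."""
    terminal_id: str                 # identifier of the requesting first terminal
    supports_panoramic: bool         # whether that terminal can play panoramic video data
    video_data_id: str               # panoramic or non-panoramic video data identifier
    period: Optional[Tuple[int, int]] = None  # optional playback period in seconds, e.g. (601, 930)

# Hypothetical request for the 10 min 01 s - 15 min 30 s clip of a panoramic stream
request = VideoRequestMessage(
    terminal_id="terminal-40",
    supports_panoramic=True,
    video_data_id="panoramic-stream-1",
    period=(601, 930),
)
```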
S103, the MEC server sends second video data to the first terminal through the base station according to the video request message.
In case one, if the video request message includes the identifier of a terminal that supports playing panoramic video data and a panoramic video data identifier, the MEC server sends, according to the video request message, the panoramic video data to that terminal through the base station.
In case two, if the video request message includes the identifier of a terminal that supports playing panoramic video data and a non-panoramic video data identifier, the MEC server sends, according to the video request message, the non-panoramic video data to that terminal through the base station.
In case three, if the video request message includes the identifier of a terminal that does not support playing panoramic video data and a non-panoramic video data identifier, the MEC server sends, according to the video request message, the non-panoramic video data to that terminal through the base station.
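As a sketch only (the patent does not mandate any particular implementation), the three cases above can be summarized as a small dispatch function; the parameter names and byte-string payloads are assumptions.

```python
def select_second_video_data(supports_panoramic: bool,
                             wants_panoramic: bool,
                             panoramic_data: bytes,
                             non_panoramic_data: bytes) -> bytes:
    """Return the payload the MEC server would forward for cases one to three above."""
    if supports_panoramic and wants_panoramic:
        return panoramic_data        # case one
    if supports_panoramic:
        return non_panoramic_data    # case two: panoramic-capable terminal asked for non-panoramic data
    return non_panoramic_data        # case three: terminal cannot play panoramic data

# Hypothetical usage: a VR headset requesting the panoramic stream
payload = select_second_video_data(True, True, b"<panoramic>", b"<non-panoramic>")
```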
Based on this technical solution, the MEC server is deployed beside the base station that receives the video data captured by the camera, and the base station and the MEC server are connected by a cable (for example, an optical fiber). When watching a live video, a user therefore only needs to receive the video data that the MEC server sends to the terminal through the base station, and does not need to receive video data that the core network first sends to the cloud server and the cloud server then sends to the terminal through the base station. Because the distance between the MEC server and the base station is smaller than the distance between the core network and the base station, the time needed to transmit the video data from the event organizer to the user terminal is reduced, which improves the user's experience of watching the live broadcast.
Optionally, if the first video data includes video data of multiple views, the second video data is video data of a first view, and the first view is one of the multiple views.
When the first video data includes video data of multiple view angles, the user can select one view angle to better watch the live video. This is described below with reference to the specific application scenario shown in fig. 4.
In a live event scenario, the event organizer may place non-panoramic cameras at multiple locations of the event venue (for example, in the four directions east, west, south and north) to capture video data of different view angles. As shown in fig. 4, the non-panoramic camera 21 on the west side of the venue may capture the scene in area a, which is the video data of the first view angle in the first video data; the non-panoramic camera 22 on the north side of the venue may capture the scene in area b, which is the video data of the second view angle in the first video data; the non-panoramic camera 23 on the east side of the venue may capture the scene in area c, which is the video data of the third view angle in the first video data; and the non-panoramic camera 24 on the south side of the venue may capture the scene in area d, which is the video data of the fourth view angle in the first video data.
It should be noted that, in this embodiment of the present application, the event organizer may instead place a panoramic camera at the center of the event venue to capture and generate video data of different view angles. Because the first video data is then captured by the different lenses of the panoramic camera, there is no need to use multiple non-panoramic cameras to shoot the videos, which saves cost.
If, while watching the video data of the first view angle, the user wants to watch the video data of the second view angle, the user can choose to switch so as to watch the video data of the second view angle.
Specifically, an embodiment of the present application provides a video transmission method that allows a terminal to switch between video data of different view angles. As shown in fig. 5, the method includes the following steps.
S201, the MEC server receives a video switching message sent by the live broadcast management platform.
The video switching message is used to indicate that the first terminal requests video data of a second view angle in the first video data, where the second view angle is different from the first view angle and is one of the multiple view angles.
S202, the MEC server sends, according to the video switching message, the video data of the second view angle to the first terminal through the base station.
Illustratively, when the user is watching the video data of the first view angle on the terminal, the MEC server sends, according to the video switching message, the video data of the second view angle to the terminal through the base station. The user can then watch the video data of the second view angle on the terminal.
Based on this technical solution, when the user wants to switch between videos of different view angles, the MEC server receives the video switching message sent by the live broadcast management platform and, according to the video switching message, sends the video data of the second view angle to the first terminal through the base station. The user can therefore freely switch between video data of different view angles on the terminal, which improves the user's viewing experience.
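For illustration only, a minimal sketch of how an MEC server might react to such a video switching message, keeping one stream per view angle; the view names and URLs are placeholder assumptions.

```python
# Placeholder per-view streams held by the MEC server, keyed by view angle.
streams = {
    "west": "rtsp://mec.example.invalid/live/view-a",
    "north": "rtsp://mec.example.invalid/live/view-b",
    "east": "rtsp://mec.example.invalid/live/view-c",
    "south": "rtsp://mec.example.invalid/live/view-d",
}

current_view = "west"  # first view angle currently being sent to the terminal

def handle_switch_message(requested_view: str) -> str:
    """On a video switching message, switch to the requested second view angle if it exists."""
    global current_view
    if requested_view in streams and requested_view != current_view:
        current_view = requested_view
    return streams[current_view]

# Hypothetical switch from the west (first) view angle to the north (second) view angle
url = handle_switch_message("north")
```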
As shown in fig. 6, a video transmission method according to an embodiment of the present application is provided. The method comprises the following steps.
S301, the MEC server generates third video data according to the first video data.
Wherein the resolution of the third video data is less than the resolution of the first video data. Illustratively, the resolution of the first video data is 4K, and the resolution of the third video data is 1080P.
In an embodiment of the present application, the first video data includes: panoramic video data and/or non-panoramic video data. For panoramic video data included in the first video data, the third video data generated by the MEC server is panoramic video data with lower resolution. For the non-panoramic video data included in the first video data, the third video data generated by the MEC server is the non-panoramic video data with lower resolution.
In one possible implementation manner, the MEC server performs frame extraction processing on the first video data to generate third video data.
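Only as a sketch of one way lower-resolution third video data could be derived (the patent names frame extraction but prescribes no implementation), the Python snippet below downscales each frame with OpenCV and can also drop frames; the 4K-to-1080p target and file names are assumptions.

```python
import cv2

def make_third_video(src_path, dst_path, width=1920, height=1080, frame_interval=1):
    """Derive lower-resolution 'third video data' from the 'first video data':
    resize every kept frame and, if frame_interval > 1, also drop frames."""
    src = cv2.VideoCapture(src_path)
    fps = (src.get(cv2.CAP_PROP_FPS) or 25.0) / frame_interval
    dst = cv2.VideoWriter(dst_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height))
    index = 0
    while True:
        ok, frame = src.read()
        if not ok:
            break
        if index % frame_interval == 0:
            dst.write(cv2.resize(frame, (width, height)))
        index += 1
    src.release()
    dst.release()

# Hypothetical usage: 4K first video data -> 1080p third video data, keeping every 2nd frame
make_third_video("first_video_4k.mp4", "third_video_1080p.mp4", frame_interval=2)
```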
S302, the MEC server sends the third video data to the cloud server.
S303, the cloud server stores the third video data.
Based on the above technical solution, in the video transmission method provided in this embodiment of the present application, the MEC server generates the third video data according to the first video data; because the resolution of the third video data is lower, its data size is smaller. The MEC server then sends the third video data to the cloud server, and the cloud server sends the third video data to a second terminal. Because the amount of third video data is small, the time the cloud server needs to send it to the second terminal is reduced, which reduces the delay of video data transmission.
Fig. 7 shows a video transmission method according to an embodiment of the present application. The method comprises the following steps.
S401, the MEC server generates third video data according to the first video data.
S402, the MEC server sends third video data to the first terminal through the base station.
Based on the above technical solution, after generating the third video data according to the first video data, the MEC server sends the third video data to the first terminal through the base station. Because the resolution of the third video data is low, the amount of data the MEC server sends to the first terminal through the base station is small, which reduces the data traffic the user consumes to watch the video. The user can therefore choose to watch the third video data and watch the video with less traffic.
In the embodiment of the present application, functional modules or functional units may be divided for the MEC server according to the above method example, for example, each functional module or functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in a form of hardware, or may be implemented in a form of a software functional module or a functional unit. The division of the modules or units in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
As shown in fig. 8, an MEC server provided in an embodiment of the present invention includes:
a receiving module 101, configured to receive, through a base station, first video data captured by a camera; and receiving a video request message sent by the live broadcast management platform, wherein the video request message is used for indicating the first terminal to request second video data, and the second video data is part or all of the first video data.
A sending module 102, configured to send, according to the video request message, the second video data to the first terminal through the base station.
Optionally, if the first video data includes video data of multiple views, the second video data is video data of a first view, and the first view is one of the multiple views.
Optionally, the receiving module 101 is further configured to receive a video switching message sent by the live broadcast management platform, where the video switching message is used to instruct the first terminal to request video data of a second view angle in the first video data, the second view angle is different from the first view angle, and the second view angle is one view angle of multiple view angles.
The sending module 102 is further configured to send, according to the video switching message, video data of the second view angle to the first terminal through the base station.
Optionally, the MEC server further includes: and the processing module 103 is configured to generate third video data according to the first video data, where a resolution of the third video data is smaller than a resolution of the first video data.
The sending module 102 is further configured to send the third video data to the cloud server.
Fig. 9 shows yet another possible structure of the MEC server involved in the above embodiment. The MEC server includes: a processor 201 and a communication interface 202. The processor 201 is used to control and manage the actions of the device, for example, to perform the various steps in the method flows shown in the above-described method embodiments, and/or to perform other processes for the techniques described herein. The communication interface 202 is used to support the MEC server's communication with other network entities. The MEC server may also include a memory 203 and a bus 204, the memory 203 being used to store program codes and data for the devices.
The processor 201 may implement or execute the various exemplary logical blocks, units and circuits described in connection with the present disclosure. The processor may be a central processing unit, a general-purpose processor, a digital signal processor, an application-specific integrated circuit, a field programmable gate array or another programmable logic device, a transistor logic device, a hardware component, or any combination thereof. The processor may also be a combination of devices implementing a computing function, for example, a combination of one or more microprocessors, or a combination of a DSP and a microprocessor.
The bus 204 may be an Extended Industry Standard Architecture (EISA) bus or the like. The bus 204 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in fig. 9, but this does not mean that there is only one bus or only one type of bus.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions. For the specific working processes of the system, the apparatus and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
The present application provides a computer program product containing instructions, which when run on a computer, causes the computer to execute the video transmission method in the above method embodiments.
The embodiment of the present application further provides a computer-readable storage medium, where instructions are stored in the computer-readable storage medium, and when the instructions are run on a computer, the computer is caused to execute the video transmission method in the method flow shown in the foregoing method embodiment.
The computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples (a non-exhaustive list) of the computer readable storage medium include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a register, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, any suitable combination of the above, or any other form of computer readable storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be integral to the processor. The processor and the storage medium may reside in an Application Specific Integrated Circuit (ASIC). In the embodiments of the present application, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
Since the MEC server, the computer-readable storage medium, and the computer program product in the embodiments of the present invention may be applied to the method described above, reference may also be made to the method embodiments for obtaining technical effects, and details of the embodiments of the present invention are not described herein again.
The above is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (10)
1. A video transmission method is characterized in that the method is applied to an MEC server, and the MEC server is connected with a base station through a cable; the method comprises the following steps:
the MEC server receives first video data shot by a camera through the base station;
the MEC server receives a video request message sent by a live broadcast management platform, wherein the video request message is used for indicating a first terminal to request second video data, and the second video data are part or all of the first video data;
and the MEC server sends the second video data to the first terminal through the base station according to the video request message.
2. The method of claim 1, wherein the second video data is video data of a first view if the first video data comprises video data of a plurality of views, the first view being one of the plurality of views.
3. The video transmission method according to claim 2, wherein the method further comprises:
the MEC server receives a video switching message sent by the live broadcast management platform, wherein the video switching message is used for indicating the first terminal to request video data of a second visual angle in the first video data, the second visual angle is different from the first visual angle, and the second visual angle is one of the visual angles;
and the MEC server sends the video data of the second visual angle to the first terminal through the base station according to the video switching message.
4. The video transmission method according to any one of claims 1 to 3, characterized in that the method further comprises:
the MEC server generates third video data according to the first video data, wherein the resolution of the third video data is smaller than that of the first video data;
the MEC server sends the third video data to a cloud server.
5. An MEC server, wherein the MEC server is connected with a base station through a cable; the MEC server includes:
the receiving module is used for receiving first video data shot by a camera through the base station; receiving a video request message sent by a live broadcast management platform, wherein the video request message is used for indicating a first terminal to request second video data, and the second video data is part or all of the first video data;
and the sending module is used for sending the second video data to the first terminal through the base station according to the video request message.
6. The MEC server of claim 5, wherein if the first video data comprises video data of a plurality of views, the second video data is video data of a first view, the first view being one of the plurality of views.
7. The MEC server of claim 6,
the receiving module is further configured to receive a video switching message sent by the live broadcast management platform, where the video switching message is used to instruct the first terminal to request video data of a second view angle in the first video data, where the second view angle is different from the first view angle, and the second view angle is one view angle in the multiple view angles;
the sending module is further configured to send, according to the video switching message, the video data of the second view angle to the first terminal through the base station.
8. The MEC server of any one of claims 5 to 7, wherein the MEC server further comprises:
the processing module is used for generating third video data according to the first video data, and the resolution of the third video data is smaller than that of the first video data;
the sending module is further configured to send the third video data to a cloud server.
9. An MEC server, comprising: a processor, a memory, and a communication interface; wherein the memory stores one or more programs comprising computer-executable instructions that, when executed by the processor, cause the MEC server to perform the video transmission method of any one of claims 1-4.
10. A computer-readable storage medium having stored therein instructions which, when executed by a computer, cause the computer to perform the video transmission method of any of claims 1-4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911343357.9A CN111083511A (en) | 2019-12-24 | 2019-12-24 | Video transmission method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111083511A true CN111083511A (en) | 2020-04-28 |
Family
ID=70317036
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911343357.9A Pending CN111083511A (en) | 2019-12-24 | 2019-12-24 | Video transmission method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111083511A (en) |
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107995498A (en) * | 2016-10-26 | 2018-05-04 | 北京视联动力国际信息技术有限公司 | A kind of live video stream switching method and apparatus of terminal |
CN110266664A (en) * | 2019-06-05 | 2019-09-20 | 中国联合网络通信有限公司广州市分公司 | A kind of Cloud VR video living transmission system based on 5G and MEC |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112261353A (en) * | 2020-09-01 | 2021-01-22 | 浙江大华技术股份有限公司 | Video monitoring and shunting method, system and computer readable storage medium |
CN112261353B (en) * | 2020-09-01 | 2022-10-28 | 浙江大华技术股份有限公司 | Video monitoring and shunting method, system and computer readable storage medium |
CN113992931A (en) * | 2021-10-29 | 2022-01-28 | 湖北邮电规划设计有限公司 | Event live broadcast double-task scheduling method based on mobile edge calculation |
CN114339308A (en) * | 2022-01-04 | 2022-04-12 | 腾讯音乐娱乐科技(深圳)有限公司 | Video stream loading method, electronic equipment and storage medium |
WO2024087938A1 (en) * | 2022-10-24 | 2024-05-02 | 华为技术有限公司 | Media live broadcast method and apparatus and electronic device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20200428 |