CN109257588A - A kind of data transmission method, terminal, server and storage medium - Google Patents
Info
- Publication number
- CN109257588A CN109257588A CN201811162929.9A CN201811162929A CN109257588A CN 109257588 A CN109257588 A CN 109257588A CN 201811162929 A CN201811162929 A CN 201811162929A CN 109257588 A CN109257588 A CN 109257588A
- Authority
- CN
- China
- Prior art keywords
- data
- dimensional video
- depth data
- depth
- frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/21—Server components or server architectures
- H04N21/222—Secondary servers, e.g. proxy server, cable television Head-end
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/816—Monomedia components thereof involving special video data, e.g. 3D video
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Mobile Radio Communication Systems (AREA)
Abstract
Embodiments of the invention disclose a data transmission method, a terminal, a server, and a storage medium. The method comprises: obtaining depth data and two-dimensional video data in three-dimensional video data to be transmitted; for the depth data of each frame, comparing the depth data with preset depth data and determining difference data between the depth data and the preset depth data; and sending the two-dimensional video data in the three-dimensional video data to be transmitted, together with the difference data.
Description
Technical Field
The present application relates to data transmission technology, and in particular, but not exclusively, to a data transmission method, a terminal, a server, and a storage medium.
Background
With the continuous development of mobile communication networks, transmission rates have improved rapidly, providing strong technical support for the emergence and growth of three-dimensional video services. Three-dimensional video data includes two-dimensional video data (e.g., RGB data) and depth data, and transmitting three-dimensional video data means transmitting the two-dimensional video data and the depth data separately. Because depth data must be transmitted for every frame of the image, the amount of data is very large. Transmitting such a large amount of data in full therefore inevitably affects the transmission speed and transmission accuracy.
Disclosure of Invention
In order to solve the above technical problem, embodiments of the present application provide a data transmission method, a terminal, a server, and a storage medium.
The technical solutions of the embodiments of the present application are implemented as follows:
an embodiment of the present application provides a data transmission method, including:
acquiring depth data and two-dimensional video data in three-dimensional video data to be transmitted;
comparing, for the depth data of each frame, the depth data with preset depth data, and determining difference data between the depth data and the preset depth data;
and sending two-dimensional video data in the three-dimensional video data to be transmitted to a Mobile Edge Computing (MEC) server, and sending the difference data.
In the above scheme, the comparing, for each frame of depth data, the depth data with preset depth data to determine difference data between the depth data and the preset depth data includes:
comparing, for the depth data of each frame, the depth data of the current frame with the depth data of the previous frame, and determining the difference data between the depth data of the current frame and the depth data of the previous frame;
correspondingly, the sending two-dimensional video data in the three-dimensional video data to be transmitted to the mobile edge computing server and sending the difference data includes:
transmitting the two-dimensional video data and the difference data for each frame to the MEC server.
In the above scheme, after the sending of the difference data, the method further includes:
acquiring depth data of a next frame corresponding to the difference data;
and if the depth data of the next frame is different from the depth data corresponding to the difference data, transmitting the depth data of the next frame.
The embodiment of the present application also provides a data transmission method applied to an MEC server, where the method includes the following steps:
receiving difference data and two-dimensional video data sent by a terminal;
restoring depth data in the three-dimensional video data to be transmitted according to the difference data;
and synthesizing the depth data and the two-dimensional video data into three-dimensional video data.
In the above solution, before the synthesizing the depth data and the two-dimensional video data into three-dimensional video data, the method further includes:
receiving difference data and two-dimensional video data;
restoring at least the depth data of the current frame according to the difference data;
and synthesizing the depth data of the current frame, the preset depth data and the two-dimensional video data into three-dimensional video data.
An embodiment of the present application further provides a terminal, where the terminal includes an acquisition unit, a first data transmission unit, and a first communication unit; wherein,
the acquisition unit is used for acquiring depth data and two-dimensional video data in three-dimensional video data to be transmitted;
the first data transmission unit is used for comparing, for the depth data of each frame, the depth data with preset depth data, and determining difference data between the depth data and the preset depth data;
the first communication unit is configured to send two-dimensional video data in three-dimensional video data to be transmitted to the mobile edge computing MEC server, and send the difference data.
In the foregoing solution, the first data transmission unit is configured to compare, for the depth data of each frame, depth data of a current frame with depth data of a previous frame of the current frame, and determine difference data between the depth data of the current frame and the depth data of the previous frame;
correspondingly, the first communication unit is configured to send the two-dimensional video data and the difference data of each frame to the MEC server.
In the foregoing solution, the obtaining unit is further configured to obtain depth data of a next frame corresponding to the difference data;
the first communication unit is further configured to transmit the depth data of the next frame if the depth data of the next frame is different from the depth data corresponding to the difference data.
The embodiment of the application also provides an MEC server, which comprises a second communication unit and a second data transmission unit; wherein,
the second communication unit is used for receiving the difference data and the two-dimensional video data sent by the terminal;
the second data transmission unit is used for recovering the depth data in the three-dimensional video data to be transmitted according to the difference data; and the depth data and the two-dimensional video data are synthesized into three-dimensional video data.
In the above scheme, the second communication unit is further configured to receive difference data and two-dimensional video data;
the second data transmission unit is further configured to recover at least depth data of the current frame according to the difference data; and the video processing device is also used for synthesizing the depth data of the current frame, preset depth data and the two-dimensional video data into three-dimensional video data.
The embodiment of the present application further provides a computer storage medium, on which computer instructions are stored, and the computer instructions, when executed by a processor, implement the steps of the data transmission method applied to the terminal according to the embodiment of the present application; or, the instructions, when executed by the processor, implement the steps of the data transmission method applied to the MEC server according to the embodiment of the present application.
The embodiment of the present application further provides a terminal, which includes a memory, a processor, and a computer program stored in the memory and capable of running on the processor, and when the processor executes the program, the steps of the data transmission method applied to the terminal according to the embodiment of the present application are implemented.
The embodiment of the present application further provides an MEC server, which includes a memory, a processor, and a computer program stored in the memory and capable of running on the processor, and when the processor executes the program, the steps of the data transmission method applied to the MEC server in the embodiment of the present application are implemented.
The embodiments of the present application provide a data transmission method, a terminal, a server, and a storage medium. First, depth data and two-dimensional video data in the three-dimensional video data to be transmitted are obtained; then, for the depth data of each frame, the depth data is compared with preset depth data and the difference data between the two is determined; finally, the two-dimensional video data in the three-dimensional video data to be transmitted is sent to a mobile edge computing (MEC) server, together with the difference data. With this technical solution, only the depth data that differs is transmitted from the terminal, which greatly reduces the amount of depth data transmitted and improves network smoothness.
Drawings
Fig. 1 is a schematic diagram of a system architecture applied to a data transmission method according to an embodiment of the present application;
fig. 2 is a schematic flow chart illustrating an implementation of a data transmission method according to an embodiment of the present application;
fig. 3 is an interactive diagram of an implementation of a data transmission method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a server according to an embodiment of the present application;
fig. 6 is a schematic diagram of a hardware component structure of the data transmission device according to the embodiment of the present application.
Detailed Description
Before describing the technical solution of the embodiments of the present application in detail, the system architecture to which the data transmission method is applied is first briefly described. The data transmission method of the embodiments of the present application applies to services related to three-dimensional video data, such as three-dimensional video sharing services and live broadcast services based on three-dimensional video data. Because the data amount of three-dimensional video data is large, transmitting the depth data and the two-dimensional video data separately demands strong technical support, so the mobile communication network must provide a high data transmission rate and a stable data transmission environment.
Fig. 1 is a schematic diagram of a system architecture applied to a data transmission method according to an embodiment of the present application; as shown in fig. 1, the system may include a terminal, a base station, an MEC server, a service processing server, a core network, the Internet (Internet), and the like; and a high-speed channel is established between the MEC server and the service processing server through a core network to realize data synchronization.
Taking an application scenario of interaction between two terminals shown in fig. 1 as an example, an MEC server a is an MEC server deployed near a terminal a (a sending end), and a core network a is a core network in an area where the terminal a is located; correspondingly, the MEC server B is an MEC server deployed near the terminal B (receiving end), and the core network B is a core network of an area where the terminal B is located; the MEC server A and the MEC server B can establish a high-speed channel with the service processing server through the core network A and the core network B respectively to realize data synchronization.
After three-dimensional video data sent by a terminal A are transmitted to an MEC server A, the MEC server A synchronizes the data to a service processing server through a core network A; and then, the MEC server B acquires the three-dimensional video data sent by the terminal A from the service processing server and sends the three-dimensional video data to the terminal B for presentation.
Here, if the terminal B and the terminal a realize transmission through the same MEC server, the terminal B and the terminal a directly realize transmission of three-dimensional video data through one MEC server at this time without participation of a service processing server, and this mode is called a local backhaul mode. Specifically, suppose that the terminal B and the terminal a realize transmission of three-dimensional video data through the MEC server a, and after the three-dimensional video data sent by the terminal a is transmitted to the MEC server a, the MEC server a sends the three-dimensional video data to the terminal B for presentation.
Here, the terminal may select an evolved Node B (eNB) accessing the 4G network or a next-generation Node B (gNB) accessing the 5G network, based on network conditions, the terminal's own configuration, or a self-configured algorithm; the eNB connects to the MEC server through a Long Term Evolution (LTE) access network, and the gNB connects to the MEC server through a next-generation radio access network (NG-RAN).
Here, the MEC server is deployed at the network edge near the terminal or the data source — close not only in logical location but also in geographical location. Unlike the existing mobile communication network, in which the main service processing servers are deployed in a few large cities, MEC servers can be deployed in many cities. For example, an office building with many users may have an MEC server deployed nearby.
The MEC server serves as an edge computing gateway with core capabilities of network convergence, computing, storage, and applications, and provides platform support covering the device domain, network domain, data domain, and application domain for edge computing. It connects various intelligent devices and sensors to provide intelligent connection and data transmission services nearby, and processes different types of applications and data within the MEC server, realizing key intelligent services such as real-time processing, data aggregation and interoperation, and security and privacy protection, thereby effectively improving the efficiency of intelligent service decisions.
The present application will be described in further detail with reference to the following drawings and specific embodiments.
The embodiment of the application provides a data transmission method, which is applied to a terminal, wherein the terminal can be a mobile terminal such as a mobile phone and a tablet personal computer, and can also be a computer-type terminal. Fig. 2 is a schematic flow chart illustrating an implementation of a data transmission method according to an embodiment of the present application; as shown in fig. 2, the method comprises the steps of:
step S201, obtaining depth data and two-dimensional video data in three-dimensional video data to be transmitted.
Here, the depth data in the three-dimensional video data to be transmitted may be collected, or the depth data sent by other devices may be received.
Step S202, for each frame of depth data, comparing the depth data with preset depth data, and determining difference data between the depth data and the preset depth data.
Here, the three-dimensional video data includes M frames of depth data, and each frame's depth data is compared, frame by frame, with the preset depth data. If a frame's depth data differs from the preset depth data, the difference data between the two is determined and then transmitted to the MEC server. If the depth data is the same as the preset depth data, the depth data is not transmitted, or only a message indicating that it is the same is transmitted, so the amount of transmitted data does not increase.
Step S203, sending two-dimensional video data in the three-dimensional video data to be transmitted to the mobile edge computing MEC server, and sending the difference data.
Here, only the two-dimensional video data and the difference data are transmitted to the MEC server. The difference data may be understood as a difference between a pixel corresponding to the depth data and a pixel corresponding to the preset depth data.
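The per-pixel comparison described in steps S202–S203 can be sketched as follows. This is a minimal illustration, assuming depth maps are NumPy arrays of per-pixel depth values; the function and variable names are our own, not from the patent.

```python
import numpy as np

def compute_difference_data(depth, preset):
    """Return only the pixels of `depth` that differ from `preset`.

    The difference data is a sparse record: one (row, col, value) tuple
    for every pixel whose depth differs from the preset depth map, so
    identical regions contribute nothing to the transmission.
    """
    rows, cols = np.nonzero(depth != preset)
    return list(zip(rows.tolist(), cols.tolist(), depth[rows, cols].tolist()))

# Example: a 4x4 depth frame differing from the preset in two pixels.
preset = np.zeros((4, 4), dtype=np.uint16)
frame = preset.copy()
frame[1, 2] = 500    # e.g. 500 mm at pixel (1, 2)
frame[3, 0] = 1200

diff = compute_difference_data(frame, preset)
# Only 2 of the 16 pixels need to be transmitted.
```

If a frame is identical to the preset depth data, `compute_difference_data` returns an empty list, matching the case above where nothing (or only a same-as-preset flag) is sent.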
That is to say, when the terminal transmits the depth data to the MEC server, the terminal does not transmit each frame of depth data to the MEC server completely, and only transmits the difference data different from the preset depth data to the MEC server, so that the MEC server can restore the corresponding depth data according to the difference data, and then synthesize the three-dimensional video data according to the depth data and the two-dimensional video data.
In the embodiment, only the difference data among the depth data is transmitted to the MEC server, so that the transmission quantity of the depth data is reduced, and the network fluency is improved.
In this embodiment, as an implementation manner, the obtaining three-dimensional video data includes: the terminal obtains three-dimensional video data from a collection assembly capable of collecting at least depth data; the acquisition component can establish a communication link with at least one terminal so that the corresponding terminal can obtain the three-dimensional video data.
In the embodiment, because the acquisition component capable of acquiring the depth data is relatively expensive, the terminal does not have the acquisition function of the three-dimensional video data, but acquires the three-dimensional video data through the acquisition component independent of the terminal, and establishes a communication link through the acquisition component and the communication component in the terminal, so that the terminal acquires the three-dimensional video data acquired by the acquisition component. The acquisition assembly can be specifically realized by at least one of the following components: the camera comprises a depth camera, a binocular camera, a 3D structured light camera module and a Time Of Flight (TOF) camera module.
Here, the acquisition component can establish a communication link with at least one terminal to transmit acquired three-dimensional video data to the at least one terminal, so that the corresponding terminal can acquire the three-dimensional video data, and thus, the three-dimensional video data acquired by one acquisition component can be shared with at least one terminal, thereby realizing the sharing of the acquisition component.
As another embodiment, the terminal itself has the function of acquiring three-dimensional video data; that is, the terminal is provided with an acquisition component capable of acquiring at least depth data, for example at least one of the following: a depth camera, a binocular camera, a 3D structured-light camera module, or a TOF camera module, so as to acquire the three-dimensional video data.
The obtained three-dimensional video data comprises two-dimensional video data and depth data; the two-dimensional video data is used for representing a planar image, and can be RGB data for example; the depth data characterizes a distance between a surface of an acquisition object for which the acquisition assembly is directed and the acquisition assembly.
An embodiment of the present application further provides a data transmission method, and fig. 3 is an implementation interaction diagram of the data transmission method according to the embodiment of the present application, and as shown in fig. 3, the method includes the following steps:
step S301, the terminal obtains depth data and two-dimensional video data in three-dimensional video data to be transmitted.
Here, in step S301, the terminal may collect depth data in the three-dimensional video data to be transmitted through the structured light, or other devices may transmit the depth data to the terminal.
Step S302, for the depth data of each frame, the terminal compares the depth data of the current frame with the depth data of the previous frame, and determines the difference data between the depth data of the current frame and the depth data of the previous frame.
Here, suppose for example that the depth data has 100 frames in total. The first frame of depth data is transmitted in full to the MEC server; the second frame is then compared with the first frame, the difference data between the two is determined, and that difference data is transmitted to the MEC server. That is, in this embodiment, when the second frame of depth data is "transmitted", what is actually transmitted is the difference data between the second frame and the first frame.
Step S303, the terminal sends the two-dimensional video data of each frame and the difference data to the MEC server.
Here, the terminal transmits two-dimensional video data of each frame to the MEC server, and transmits difference data between every two adjacent frames to the MEC server. After step S303, the embodiment of the present application further includes: acquiring depth data of a next frame corresponding to the difference data; and if the depth data of the next frame is different from the depth data corresponding to the difference data, transmitting the depth data of the next frame. That is, on the basis that the second frame depth data is different from the first frame depth data and the difference data between the second frame depth data and the first frame depth data is transmitted to the MEC server, if the third frame depth data is also different from the second frame depth data, the complete third frame depth data may be directly transmitted, or the difference data between the third frame depth data and the second frame depth data may be transmitted.
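The terminal-side encoding in steps S302–S303 — send the first frame in full, then only the changed pixels of each subsequent frame — can be sketched as below. This is an illustrative sketch assuming NumPy depth maps; the message format ('full' vs. 'diff') is our own assumption, not defined by the patent.

```python
import numpy as np

def encode_depth_stream(frames):
    """Encode a sequence of depth frames for transmission.

    Yields ('full', frame) for the first frame, then ('diff', pixels)
    for each later frame, where `pixels` lists only the (row, col, value)
    entries that changed relative to the previous frame.
    """
    prev = None
    for frame in frames:
        if prev is None:
            yield ('full', frame.copy())
        else:
            rows, cols = np.nonzero(frame != prev)
            yield ('diff', list(zip(rows.tolist(), cols.tolist(),
                                    frame[rows, cols].tolist())))
        prev = frame

# Three 2x2 frames: one pixel changes in frame 1, frame 2 is unchanged.
f0 = np.zeros((2, 2), dtype=np.uint16)
f1 = f0.copy(); f1[0, 1] = 7
f2 = f1.copy()

msgs = list(encode_depth_stream([f0, f1, f2]))
# msgs[1] carries a single changed pixel; msgs[2] is an empty diff.
```

An unchanged frame produces an empty difference list, which corresponds to the case in the text where no depth data (or only a same-as-previous indication) is sent.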
In step S304, the MEC server receives the difference data and the two-dimensional video data.
Here, the difference data is difference data between the depth data of the current frame and the depth data of the previous frame. The difference data may be understood as a difference of a pixel corresponding to the depth data of the current frame and a pixel corresponding to the depth data of the previous frame.
Step S305, the MEC server restores the depth data in the three-dimensional video data to be transmitted according to the difference data.
Step S306, the MEC server synthesizes the depth data and the two-dimensional video data into three-dimensional video data.
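The server-side restoration in steps S304–S305 is the inverse of the terminal's encoding: apply each frame's difference pixels to the previously restored frame. A minimal sketch, again assuming NumPy depth maps and the same illustrative ('full'/'diff') message format as above (both are our assumptions, not from the patent):

```python
import numpy as np

def decode_depth_stream(messages):
    """Restore full depth frames from ('full'/'diff') messages.

    A 'full' message resets the reference frame; a 'diff' message is
    applied on top of the previously restored frame, pixel by pixel.
    """
    frames, current = [], None
    for kind, payload in messages:
        if kind == 'full':
            current = payload.copy()
        else:
            current = current.copy()          # keep earlier frames intact
            for r, c, v in payload:
                current[r, c] = v
        frames.append(current)
    return frames

# Restore the three frames from the example message stream.
full = np.zeros((2, 2), dtype=np.uint16)
msgs = [('full', full), ('diff', [(0, 1, 7)]), ('diff', [])]
restored = decode_depth_stream(msgs)
```

After restoration, each depth frame can be synthesized with its two-dimensional video frame into three-dimensional video data, as in step S306.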
In the embodiment of the application, the terminal transmits the two-dimensional video data and the difference data between the depth data of the current frame and the depth data of the previous frame, instead of the complete depth data of the two frames, to the MEC server, so that the data transmission amount is greatly reduced, and the network transmission speed is increased.
When three-dimensional video data is transmitted in the related art, separately transmitting the depth data and the two-dimensional video data demands strong technical support, so the mobile communication network must provide a high data transmission rate and a stable transmission environment. However, because depth data must be transmitted for every frame of the image, the data size is very large, which results in excessive network data transmission, network congestion, and the like.
Based on this, an embodiment of the present application provides a data transmission method suitable for low-speed static modeling scenarios. Before the terminal transmits the two-dimensional video data and the depth data to the MEC server, the depth data is preprocessed: the pixels of the current frame's depth data are compared, without compression, against the pixels of the previous frame's depth data in the three-dimensional video data to obtain difference data; the difference data is transmitted to the MEC server over a high-speed transmission network; and the MEC server combines the difference data with the received two-dimensional video data into three-dimensional video data. For example, divide the three-dimensional video data into a number of frames, let B denote the complete depth data of one frame, and let I denote the difference data between the pixels of the depth data of the frame following B and the corresponding pixels of B. In this embodiment, the depth data can be transmitted in various ways (to avoid redundancy, only two are explained here):
the first way to transmit the depth data: firstly, transmitting a complete frame of depth data; secondly, analyzing the difference data between the first frame depth data and the next frame depth data, and then transmitting the difference data (namely, when the second frame is transmitted, the difference data is transmitted); thirdly, when a third frame is transmitted, transmitting the complete depth data of a next frame corresponding to the difference data; finally, when the fourth frame is transmitted, similar to the method for transmitting the second frame, the difference data between the pixels corresponding to the depth data of the third frame and the pixels corresponding to the depth data of the fourth frame is analyzed, and the difference data is transmitted (namely, B-I-B-I-B-I, wherein in the transmission process, the ratio of the depth data of the transmission complete frame to the transmission difference data is 1: 1).
The second way to transmit the depth data: first, transmit a complete frame of depth data; second, analyze the difference data between the first frame and the next frame, and transmit that difference data (i.e., when the second frame is transmitted, only the difference data is transmitted); third, analyze the difference data between the second frame and the next frame, and transmit that difference data (i.e., when the third frame is transmitted, only the difference data is transmitted); fourth, when the fourth frame is transmitted, transmit the complete depth data of the next frame corresponding to the difference data of the third frame; then, when the fifth frame is transmitted, transmit the difference data between the fourth and fifth frames. The resulting pattern is B-I1-I2-B-I1-I2-B-I1-I2, in which the ratio of complete frames to difference data frames during transmission is 1:2. Obviously, in transmitting depth data, the ratio of complete frames to difference data frames is not limited to 1:1 or 1:2 and may be any other ratio.
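The two transmission patterns above (B-I-B-I at a 1:1 ratio, B-I-I-B at a 1:2 ratio, or any other ratio) amount to choosing how often a complete keyframe is sent. A small sketch of that scheduling decision, with names of our own choosing:

```python
def frame_schedule(num_frames, keyframe_interval):
    """Return the B/I transmission pattern for a depth stream.

    A complete ('B') frame is sent every `keyframe_interval` frames,
    with difference ('I') frames in between: an interval of 2 yields
    B-I-B-I (ratio 1:1), an interval of 3 yields B-I-I-B (ratio 1:2).
    """
    return ['B' if i % keyframe_interval == 0 else 'I'
            for i in range(num_frames)]

pattern_1_1 = frame_schedule(6, 2)  # B-I-B-I-B-I
pattern_1_2 = frame_schedule(6, 3)  # B-I-I-B-I-I
```

A larger interval transmits fewer complete frames and therefore less data, at the cost of more accumulated differences to apply before the next complete frame resynchronizes the receiver.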
In the embodiment of the application, when the depth data of the three-dimensional video data is transmitted, only the depth data of the difference pixels between the current frame and the previous frame is transmitted, so that the network data transmission is greatly reduced, and the smoothness of the network transmission is effectively improved.
In order to implement the method of the terminal side in the embodiment of the application, the embodiment of the application also provides a terminal. Fig. 4 is a schematic structural diagram of a terminal according to an embodiment of the present application; as shown in fig. 4, the terminal includes: an acquisition unit 41, a first data transmission unit 42, and a first communication unit 43; wherein,
the acquiring unit 41 is configured to acquire depth data and two-dimensional video data in three-dimensional video data to be transmitted;
the first data transmission unit 42 is configured to compare depth data with preset depth data for each frame of depth data, and determine difference data between the depth data and the preset depth data;
the first communication unit 43 is configured to send two-dimensional video data in the three-dimensional video data to be transmitted to the mobile edge computing MEC server, and send the difference data.
In an embodiment, the first data transmission unit 42 is configured to compare, for the depth data of each frame, the depth data of a current frame with the depth data of a previous frame of the current frame, and determine difference data between the depth data of the current frame and the depth data of the previous frame;
correspondingly, the first communication unit 43 is configured to send the two-dimensional video data and the difference data of each frame to the MEC server.
In an embodiment, as shown in fig. 4, the obtaining unit 41 is further configured to obtain depth data of a next frame corresponding to the difference data;
the first communication unit 43 is further configured to transmit the depth data of the next frame if the depth data of the next frame is different from the depth data corresponding to the difference data.
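The terminal-side comparison performed by the first data transmission unit, sending only the pixels whose depth changed since the previous frame, can be sketched as follows. This is an illustrative assumption about how the difference data might be represented (a sparse map of changed pixels), not the patent's actual encoding:

```python
# Illustrative sketch of the terminal-side comparison: for each frame, only
# pixels whose depth differs from the previous frame are collected as
# "difference data".

def depth_difference(current, previous):
    """Return {(row, col): depth} for pixels that changed since `previous`.

    `current` and `previous` are equally sized 2-D lists of depth values.
    """
    return {
        (r, c): current[r][c]
        for r in range(len(current))
        for c in range(len(current[r]))
        if current[r][c] != previous[r][c]
    }

prev = [[10, 10], [10, 10]]
curr = [[10, 12], [10, 10]]
print(depth_difference(curr, prev))  # {(0, 1): 12}
```

When the difference map is empty or small, far less data crosses the network than a complete depth frame would require, which is the effect the embodiment relies on.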
In this embodiment, the first data transmission unit 42 in the terminal may be implemented in practical applications by a processor in the terminal, such as a Central Processing Unit (CPU), a Digital Signal Processor (DSP), a Micro Control Unit (MCU), or a Field-Programmable Gate Array (FPGA); the first communication unit 43 in the terminal may be implemented in practical applications by a communication module (including a basic communication suite, an operating system, a communication module, a standardized interface, a protocol, and the like) and a transceiver antenna; the acquiring unit 41 in the terminal may be implemented in practical applications by a stereo camera, a binocular camera, or a structured-light camera, or by a communication module (including a basic communication suite, an operating system, a communication module, a standardized interface, a protocol, and the like) and a transceiver antenna; where the terminal further includes a detection unit, it may likewise be implemented by a processor such as a CPU, DSP, MCU, or FPGA in combination with a communication module.
It should be noted that: in the terminal provided in the foregoing embodiment, the division into the program modules described above is merely an example; in practical applications, the processing may be distributed among different program modules as needed, that is, the internal structure of the terminal may be divided into different program modules to complete all or part of the processing described above. In addition, the terminal and the data transmission method provided by the above embodiments belong to the same concept; the specific implementation process thereof is described in detail in the method embodiments and is not repeated here.
Correspondingly, in order to implement the method at the server side in the embodiment of the present application, the embodiment of the present application further provides a server, specifically an MEC server. Fig. 5 is a schematic structural diagram of a server according to an embodiment of the present application; as shown in fig. 5, the server includes a second communication unit 51 and a second data transmission unit 52; wherein,
the second communication unit 51 is configured to receive the difference data and the two-dimensional video data sent by the terminal;
the second data transmission unit 52 is configured to recover depth data in the three-dimensional video data to be transmitted according to the difference data, and to synthesize the depth data and the two-dimensional video data into three-dimensional video data.
In an embodiment, the second communication unit 51 is further configured to receive difference data and two-dimensional video data;
the second data transmission unit 52 is further configured to recover at least the depth data of the current frame according to the difference data, and to synthesize the depth data of the current frame, the preset depth data, and the two-dimensional video data into three-dimensional video data.
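The server-side recovery performed by the second data transmission unit can be sketched as follows: the previous (or preset) depth frame is patched with the received difference data, and the recovered frame is then paired with the two-dimensional video frame. The representation of the difference data and the `synthesize` stand-in are illustrative assumptions, not the patent's implementation:

```python
import copy

# Hypothetical sketch of the MEC-server side: apply the difference data to
# the stored previous depth frame to recover the current depth frame.

def recover_depth(previous, difference):
    """Apply {(row, col): depth} difference data to a copy of `previous`."""
    frame = copy.deepcopy(previous)
    for (r, c), depth in difference.items():
        frame[r][c] = depth
    return frame

def synthesize(depth_frame, video_frame):
    # Stand-in for three-dimensional synthesis: pair the recovered depth
    # frame with the corresponding two-dimensional video frame.
    return {'depth': depth_frame, 'video': video_frame}

prev = [[10, 10], [10, 10]]
restored = recover_depth(prev, {(0, 1): 12})
print(restored)  # [[10, 12], [10, 10]]
```

An empty difference map simply reproduces the previous frame, which matches the case where no pixel changed between frames.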
In this embodiment of the application, the second data transmission unit 52 in the server may be implemented by a processor in the server, such as a CPU, a DSP, an MCU, or an FPGA, in practical application; the second communication unit 51 in the server can be implemented by a communication module (including a basic communication suite, an operating system, a communication module, a standardized interface, a protocol, etc.) and a transceiver antenna in practical application.
It should be noted that: in the above embodiment, when the server performs data transmission, the division into the program modules described above is merely an example; in practical applications, the processing may be distributed among different program modules as needed, that is, the internal structure of the server may be divided into different program modules to complete all or part of the processing described above. In addition, the server and the data transmission method provided by the above embodiments belong to the same concept; the specific implementation process thereof is described in detail in the method embodiments and is not repeated here.
Based on the hardware implementation of the above devices, an embodiment of the present application further provides a data transmission device, fig. 6 is a schematic diagram of a hardware structure of the data transmission device in the embodiment of the present application, and as shown in fig. 6, the data transmission device 60 includes a memory 61, a processor 62, and a computer program stored in the memory and capable of running on the processor. As a first implementation manner, when the data transmission device is a terminal, a processor located in the terminal executes the program to implement: acquiring depth data and two-dimensional video data in three-dimensional video data to be transmitted; comparing the depth data with preset depth data aiming at the depth data of each frame, and determining difference data between the depth data and the preset depth data; and sending two-dimensional video data in the three-dimensional video data to be transmitted to a Mobile Edge Computing (MEC) server, and sending the difference data.
In one embodiment, the processor at the terminal implements, when executing the program: comparing the depth data of the current frame with the depth data of the previous frame of the current frame aiming at the depth data of each frame, and determining the difference data between the depth data of the current frame and the depth data of the previous frame; transmitting the two-dimensional video data and the difference data for each frame to the MEC server.
In one embodiment, the processor at the terminal implements, when executing the program: acquiring depth data of a next frame corresponding to the difference data; and if the depth data of the next frame is different from the depth data corresponding to the difference data, transmitting the depth data of the next frame.
As a second embodiment, when the data transmission device is a server, the processor located in the server executes the program to implement: receiving difference data and two-dimensional video data sent by a terminal; restoring depth data in the three-dimensional video data to be transmitted according to the difference data; and synthesizing the depth data and the two-dimensional video data into three-dimensional video data.
In one embodiment, the program when executed by a processor located on a server implements: receiving difference data and two-dimensional video data; restoring at least the depth data of the current frame according to the difference data; and synthesizing the depth data of the current frame, the preset depth data and the two-dimensional video data into three-dimensional video data.
It will be appreciated that the data transmission device (terminal or server) also includes a communication interface 63, and that the individual components in the data transmission device are coupled together by a bus system. The bus system enables communication among these components and includes, in addition to a data bus, a power bus, a control bus, and a status signal bus.
The embodiment of the present application further provides a computer storage medium, on which computer instructions are stored, and the computer instructions, when executed by a processor, implement the steps of the data transmission method applied to the terminal according to the embodiment of the present application; or, the instructions, when executed by the processor, implement the steps of the data transmission method applied to the MEC server according to the embodiment of the present application.
The embodiment of the present application further provides a terminal, which includes a memory, a processor, and a computer program stored in the memory and capable of running on the processor, and when the processor executes the program, the steps of the data transmission method applied to the terminal according to the embodiment of the present application are implemented.
The embodiment of the present application further provides an MEC server, which includes a memory, a processor, and a computer program stored in the memory and capable of running on the processor, and when the processor executes the program, the steps of the data transmission method applied to the MEC server in the embodiment of the present application are implemented.
In the several embodiments provided in the present application, it should be understood that the disclosed method and intelligent device may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may serve separately as one unit, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that all or part of the steps of the method embodiments may be implemented by program instructions controlling the relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, performs the steps of the method embodiments. The aforementioned storage medium includes: a removable storage device, a ROM, a RAM, a magnetic or optical disk, or various other media that can store program code.
Alternatively, the integrated units described above in the present application may be stored in a computer-readable storage medium if they are implemented in the form of software functional modules and sold or used as independent products. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially implemented or portions thereof contributing to the prior art may be embodied in the form of a software product stored in a storage medium, and including instructions for causing a computer device (which may be a personal computer, a server, or a mobile phone) to execute all or part of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a removable storage device, a ROM, a RAM, a magnetic or optical disk, or various other media that can store program code.
It should be noted that: the technical solutions described in the embodiments of the present application can be arbitrarily combined without conflict.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application.
Claims (13)
1. A data transmission method, applied to a terminal, the method comprising:
acquiring depth data and two-dimensional video data in three-dimensional video data to be transmitted;
comparing the depth data with preset depth data aiming at the depth data of each frame, and determining difference data between the depth data and the preset depth data;
and sending two-dimensional video data in the three-dimensional video data to be transmitted, and sending the difference data.
2. The method of claim 1, wherein comparing the depth data with preset depth data for the depth data of each frame to determine difference data between the depth data and the preset depth data comprises:
comparing the depth data of the current frame with the depth data of the previous frame of the current frame aiming at the depth data of each frame, and determining the difference data between the depth data of the current frame and the depth data of the previous frame;
correspondingly, sending two-dimensional video data in the three-dimensional video data to be transmitted, and sending the difference data, includes:
transmitting the two-dimensional video data and the difference data for each frame.
3. The method of claim 1, wherein after said sending of said difference data, said method further comprises:
acquiring depth data of a next frame corresponding to the difference data;
and if the depth data of the next frame is different from the depth data corresponding to the difference data, transmitting the depth data of the next frame.
4. A data transmission method applied to an MEC server, the method comprising:
receiving difference data and two-dimensional video data sent by a terminal;
restoring depth data in the three-dimensional video data to be transmitted according to the difference data;
and synthesizing the depth data and the two-dimensional video data into three-dimensional video data.
5. The method of claim 4, wherein before said synthesizing said depth data and two-dimensional video data into three-dimensional video data, said method further comprises:
receiving difference data and two-dimensional video data;
restoring at least the depth data of the current frame according to the difference data;
and synthesizing the depth data of the current frame, the preset depth data and the two-dimensional video data into three-dimensional video data.
6. A terminal, characterized in that the terminal comprises: the device comprises an acquisition unit, a first data transmission unit and a first communication unit; wherein,
the acquisition unit is used for acquiring depth data and two-dimensional video data in three-dimensional video data to be transmitted;
the first data transmission unit is used for comparing depth data with preset depth data aiming at the depth data of each frame and determining difference data between the depth data and the preset depth data;
the first communication unit is used for sending two-dimensional video data in the three-dimensional video data to be transmitted and sending the difference data.
7. The terminal according to claim 6, wherein the first data transmission unit is configured to compare, for the depth data of each frame, depth data of a current frame with depth data of a previous frame of the current frame, and determine difference data between the depth data of the current frame and the depth data of the previous frame;
correspondingly, the first communication unit is used for transmitting the two-dimensional video data and the difference data of each frame.
8. The terminal according to claim 6, wherein the obtaining unit is further configured to obtain depth data of a next frame corresponding to the difference data;
the first communication unit is further configured to transmit the depth data of the next frame if the depth data of the next frame is different from the depth data corresponding to the difference data.
9. An MEC server, characterized in that the server comprises a second communication unit and a second data transmission unit; wherein,
the second communication unit is used for receiving the difference data and the two-dimensional video data sent by the terminal;
the second data transmission unit is used for recovering the depth data in the three-dimensional video data to be transmitted according to the difference data, and for synthesizing the depth data and the two-dimensional video data into three-dimensional video data.
10. The server according to claim 9, wherein the second communication unit is further configured to receive difference data and two-dimensional video data;
the second data transmission unit is further configured to recover at least the depth data of the current frame according to the difference data, and is further configured to synthesize the depth data of the current frame, preset depth data, and the two-dimensional video data into three-dimensional video data.
11. A computer storage medium having computer instructions stored thereon, wherein the instructions, when executed by a processor, perform the steps of the data transmission method of any one of claims 1 to 3; alternatively, the instructions when executed by the processor implement the steps of the data transmission method of claim 4 or 5.
12. A terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the data transmission method according to any of claims 1 to 3 are implemented when the processor executes the program.
13. An MEC server comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the data transmission method according to claim 4 or 5 when executing the program.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811162929.9A CN109257588A (en) | 2018-09-30 | 2018-09-30 | A kind of data transmission method, terminal, server and storage medium |
PCT/CN2019/100647 WO2020063171A1 (en) | 2018-09-30 | 2019-08-14 | Data transmission method, terminal, server and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811162929.9A CN109257588A (en) | 2018-09-30 | 2018-09-30 | A kind of data transmission method, terminal, server and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109257588A true CN109257588A (en) | 2019-01-22 |
Family
ID=65045349
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811162929.9A Pending CN109257588A (en) | 2018-09-30 | 2018-09-30 | A kind of data transmission method, terminal, server and storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN109257588A (en) |
WO (1) | WO2020063171A1 (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101990107A (en) * | 2009-07-31 | 2011-03-23 | 廖礼士 | Encoding system and method, decoding system and method, display system and method |
CN102868899A (en) * | 2012-09-06 | 2013-01-09 | 华映光电股份有限公司 | Method for processing three-dimensional image |
CN104221385A (en) * | 2012-04-16 | 2014-12-17 | 高通股份有限公司 | View synthesis based on asymmetric texture and depth resolutions |
CN105847777A (en) * | 2016-03-24 | 2016-08-10 | 湖南拓视觉信息技术有限公司 | Method and device for transmitting three-dimensional depth images |
CN107241563A (en) * | 2017-06-16 | 2017-10-10 | 深圳天珑无线科技有限公司 | Method, intelligent mobile terminal and the device with store function of transmission of video |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN203416351U (en) * | 2013-05-31 | 2014-01-29 | 江西省电力设计院 | A power station workshop video monitoring system |
CN108495112B (en) * | 2018-05-10 | 2020-12-22 | Oppo广东移动通信有限公司 | Data transmission method, terminal and computer storage medium |
CN109257588A (en) * | 2018-09-30 | 2019-01-22 | Oppo广东移动通信有限公司 | A kind of data transmission method, terminal, server and storage medium |
- 2018-09-30: CN application CN201811162929.9A filed (published as CN109257588A), status: pending
- 2019-08-14: PCT application PCT/CN2019/100647 filed (published as WO2020063171A1), status: application filing
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020063171A1 (en) * | 2018-09-30 | 2020-04-02 | Oppo广东移动通信有限公司 | Data transmission method, terminal, server and storage medium |
CN113993104A (en) * | 2021-10-26 | 2022-01-28 | 中汽创智科技有限公司 | Data transmission method, device, equipment and storage medium |
CN113993104B (en) * | 2021-10-26 | 2023-12-26 | 中汽创智科技有限公司 | Data transmission method, device, equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
WO2020063171A1 (en) | 2020-04-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108495112B (en) | Data transmission method, terminal and computer storage medium | |
CN107172111B (en) | Data transmission method, device and system | |
CN112106370A (en) | System and method for optimizing dynamic point clouds based on prioritized transformation | |
CN109151436B (en) | Data processing method and device, electronic equipment and storage medium | |
CN109194946B (en) | Data processing method and device, electronic equipment and storage medium | |
CN109410319B (en) | Data processing method, server and computer storage medium | |
AU2019345715B2 (en) | Methods and devices for data processing, electronic device | |
CN108667936B (en) | Data processing method, terminal, mobile edge computing server and storage medium | |
CN109871189A (en) | A kind of multiple terminals screen sharing method and device based on Network File System | |
CN109257588A (en) | A kind of data transmission method, terminal, server and storage medium | |
CN109272576B (en) | Data processing method, MEC server, terminal equipment and device | |
WO2020063170A1 (en) | Data processing method, terminal, server and storage medium | |
CN109413405B (en) | Data processing method, terminal, server and computer storage medium | |
CN109246408B (en) | Data processing method, terminal, server and computer storage medium | |
Makiyah et al. | Emulation of point cloud streaming over 5G network | |
CN109151430A (en) | A kind of data processing method, terminal, server and computer storage medium | |
CN108632376A (en) | A kind of data processing method, terminal, server and computer storage media | |
CN109147043B (en) | Data processing method, server and computer storage medium | |
CN109389674B (en) | Data processing method and device, MEC server and storage medium | |
CN109246409B (en) | Data processing method, terminal, server and computer storage medium | |
WO2020062919A1 (en) | Data processing method, mec server and terminal device | |
CN109151435B (en) | Data processing method, terminal, server and computer storage medium | |
CN109309839B (en) | Data processing method and device, electronic equipment and storage medium | |
CN108737807B (en) | Data processing method, terminal, server and computer storage medium | |
CN109299323B (en) | Data processing method, terminal, server and computer storage medium |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20190122