WO2020063171A1 - Data transmission method, terminal, server and storage medium - Google Patents


Publication number
WO2020063171A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
dimensional video
depth
depth data
frame
Prior art date
Application number
PCT/CN2019/100647
Other languages
English (en)
French (fr)
Inventor
夏炀
Original Assignee
Oppo广东移动通信有限公司
Priority date
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2020063171A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/222 Secondary servers, e.g. proxy server, cable television head-end
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 Monomedia components thereof
    • H04N21/816 Monomedia components thereof involving special video data, e.g. 3D video

Definitions

  • This application relates to data transmission technology, and in particular, though not exclusively, to a data transmission method, terminal, server, and storage medium.
  • Three-dimensional video data includes two-dimensional video data (for example, Red Green Blue (RGB) data) and depth data; transmitting three-dimensional video data therefore means transmitting the two-dimensional video data and the depth data separately.
  • the embodiments of the present application provide a data transmission method, a terminal, a server, and a storage medium.
  • An embodiment of the present application provides a data transmission method.
  • the method includes:
  • Comparing the depth data with preset depth data for each frame of depth data to determine the difference data between the depth data and the preset depth data includes:
  • for each frame of depth data, comparing the depth data of the current frame of the three-dimensional video data to be transmitted with the depth data of the previous frame of the current frame, to determine the difference data between the depth data of the current frame and the depth data of the previous frame;
  • the sending the two-dimensional video data and the difference data includes:
  • After sending the two-dimensional video data and the corresponding difference data of each frame, the method further includes:
  • obtaining the depth data of the next frame corresponding to the difference data; if the depth data of the next frame is different from the depth data corresponding to the difference data, the depth data of the next frame is transmitted.
  • An embodiment of the present application further provides a data transmission method applied to a MEC server. The method includes:
  • the depth data and the two-dimensional video data are synthesized into three-dimensional video data.
  • Before the depth data and the two-dimensional video data are synthesized into three-dimensional video data, the method further includes:
  • the depth data of the current frame, the preset depth data, and the two-dimensional video data are synthesized into three-dimensional video data.
  • An embodiment of the present application further provides a terminal, where the terminal includes: an obtaining unit, a first data transmission unit, and a first communication unit; wherein,
  • the obtaining unit is configured to obtain depth data and two-dimensional video data of each frame in the three-dimensional video data to be transmitted;
  • the first data transmission unit is configured to compare the depth data with preset depth data for each frame of depth data, and determine the difference data between the depth data and the preset depth data;
  • the first communication unit is configured to send the two-dimensional video data and the difference data.
  • the first data transmission unit is configured to, for each frame of depth data, compare the depth data of the current frame of the three-dimensional video data to be transmitted with the depth data of the previous frame of the current frame, to determine the difference data between the depth data of the current frame and the depth data of the previous frame;
  • the first communication unit is configured to send two-dimensional video data and corresponding difference data of each frame.
  • the obtaining unit is further configured to obtain the depth data of the next frame corresponding to the difference data;
  • the first communication unit is further configured to transmit the depth data of the next frame if the depth data of the next frame is different from the depth data of the current frame corresponding to the difference data.
  • An embodiment of the present application further provides a MEC server, where the server includes a second communication unit and a second data transmission unit;
  • the second communication unit is configured to receive the difference data and two-dimensional video data sent by a terminal;
  • the second data transmission unit is configured to recover the depth data in the three-dimensional video data to be transmitted according to the difference data; and is further configured to synthesize the depth data and the two-dimensional video data into three-dimensional video data.
  • the second communication unit is further configured to receive the difference data and the two-dimensional video data;
  • the second data transmission unit is further configured to recover at least the depth data of the current frame according to the difference data; and is further configured to synthesize the depth data of the current frame, the preset depth data, and the two-dimensional video data into three-dimensional video data.
  • An embodiment of the present application further provides a computer storage medium storing computer instructions which, when executed by a processor, implement the steps of the data transmission method applied to a terminal according to the embodiments of the present application; or which, when executed by the processor, implement the steps of the data transmission method applied to the MEC server according to the embodiments of the present application.
  • An embodiment of the present application further provides a MEC server including a memory, a processor, and a computer program stored on the memory and executable on the processor.
  • When the processor executes the program, the steps of the data transmission method applied to a MEC server described in the embodiments of the present application are implemented.
  • The embodiments of the present application provide a data transmission method, terminal, server, and storage medium. First, the depth data and two-dimensional video data of each frame in the three-dimensional video data to be transmitted are obtained; then, for each frame of depth data, the depth data is compared with preset depth data to determine the difference data between the depth data and the preset depth data; finally, the two-dimensional video data and the difference data are sent to the mobile edge computing (MEC) server.
  • With the technical solutions of the embodiments of the present application, only the difference depth data is transmitted from the terminal, thereby greatly reducing the transmission amount of depth data and improving network fluency.
  • FIG. 1 is a schematic diagram of a system architecture applied to a data transmission method according to an embodiment of the present application
  • FIG. 2 is a schematic flowchart of a data transmission method according to an embodiment of the present application.
  • FIG. 3 is an implementation interaction diagram of a data transmission method according to an embodiment of the present application.
  • FIG. 4 is a schematic structural diagram of a terminal according to an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of a server according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of a hardware composition and structure of a data transmission device according to an embodiment of the present application.
  • the data transmission method in the embodiment of the present application is applied to a service related to three-dimensional video data, such as a service for sharing three-dimensional video data, or a live broadcast service based on three-dimensional video data.
  • The separately transmitted depth data and two-dimensional video data require stronger technical support during the data transmission process, so the mobile communication network needs a faster data transmission rate and a more stable data transmission environment.
  • FIG. 1 is a schematic diagram of a system architecture applied to a data transmission method according to an embodiment of the present application.
  • the system may include a terminal, a base station, a MEC server, a service processing server, a core network, and the Internet.
  • High-speed channels are established between the MEC server and the service processing server through the core network to achieve data synchronization.
  • MEC server A is a MEC server deployed near terminal A (the sender), and core network A is the core network in the area where terminal A is located; MEC server B is a MEC server deployed near terminal B (the receiver), and core network B is the core network in the area where terminal B is located. MEC server A and MEC server B can establish high-speed channels with the service processing server through core network A and core network B, respectively, to achieve data synchronization.
  • After the three-dimensional video data sent by terminal A is transmitted to MEC server A, MEC server A synchronizes the data to the service processing server through core network A; MEC server B then obtains the three-dimensional video data sent by terminal A from the service processing server and sends it to terminal B for presentation.
  • If terminal B and terminal A use the same MEC server for transmission, terminal B and terminal A implement three-dimensional video data transmission directly through that one MEC server, without the participation of the service processing server; this mode is called the local loopback mode. Specifically, assuming that terminal B and terminal A implement the transmission of three-dimensional video data through MEC server A, after the three-dimensional video data sent by terminal A is transmitted to MEC server A, MEC server A sends the three-dimensional video data to terminal B for presentation.
  • the terminal may select an evolved base station (eNB) that accesses a 4G network or a next-generation evolved base station (gNB) that accesses a 5G network based on the network situation, or the configuration of the terminal itself, or an algorithm configured by itself.
  • The eNB is connected to the MEC server through a Long Term Evolution (LTE) access network, and the gNB is connected to the MEC server through the Next Generation Radio Access Network (NG-RAN).
  • the MEC server is deployed on the edge of the network near the terminal or the source of the data.
  • Being near the terminal or the source of the data means being close not only in logical location but also geographically.
  • multiple MEC servers can be deployed in one city. For example, in an office building with many users, a MEC server can be deployed near the office building.
  • The MEC server, as an edge computing gateway that converges core capabilities of networks, computing, storage, and applications, provides platform support for edge computing in the device domain, network domain, data domain, and application domain. It connects various types of smart devices and sensors, provides smart connection and data transmission services nearby, and allows different types of applications and data to be processed in the MEC server, realizing key intelligent services such as real-time business processing, business intelligence, data aggregation and interoperation, and security and privacy protection, thereby effectively improving the intelligent decision-making efficiency of the business.
  • FIG. 2 is a schematic flowchart of a data transmission method according to an embodiment of the present application. As shown in FIG. 2, the method includes the following steps:
  • Step S201: Obtain the depth data and two-dimensional video data of each frame in the three-dimensional video data to be transmitted.
  • the depth data in the three-dimensional video data to be transmitted may be collected, or the depth data sent by other devices may be received.
  • Step S202: For each frame of depth data, compare the depth data with preset depth data to determine the difference data between the depth data and the preset depth data.
  • The three-dimensional video data includes M frames of depth data, and each frame's depth data is compared with the preset depth data one by one. If they differ, the difference data between the two is determined, and the difference data is transmitted to the MEC server. If the depth data is the same as the preset depth data, the depth data is not transmitted, or only a single piece of information indicating that they are the same is transmitted, which does not increase the data transmission amount.
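The per-frame comparison above can be sketched as follows. This is a minimal Python illustration, not the patented implementation itself: representing a depth frame as a flattened list of per-pixel values and the `("same", None)` / `("diff", ...)` packet shapes are assumptions made for the sketch.

```python
def encode_depth_frame(depth, preset):
    """Compare one frame of depth data with the preset depth data.

    Returns ("same", None) when the frame matches the preset, so only a
    tiny flag needs to be sent; otherwise returns ("diff", diff) where
    diff holds the per-pixel differences to transmit instead of the frame.
    """
    if depth == preset:
        return ("same", None)                      # only a flag is sent
    diff = [d - p for d, p in zip(depth, preset)]  # per-pixel difference
    return ("diff", diff)

def decode_depth_frame(kind, payload, preset):
    """Recover the depth data on the receiving (MEC server) side."""
    if kind == "same":
        return list(preset)
    return [p + d for p, d in zip(preset, payload)]

# Example: a four-pixel depth frame differing from the preset in one pixel.
preset = [10, 10, 10, 10]
frame = [10, 12, 10, 10]
kind, diff = encode_depth_frame(frame, preset)
assert kind == "diff" and diff == [0, 2, 0, 0]
assert decode_depth_frame(kind, diff, preset) == frame
```

A frame identical to the preset thus costs only a flag rather than a full frame of depth values, which is where the reduction in transmission amount comes from.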
  • Step S203: Send the two-dimensional video data and the difference data to the mobile edge computing (MEC) server.
  • the difference data may be understood as a difference between a pixel corresponding to the depth data and a pixel corresponding to the preset depth data.
  • When the terminal transmits depth data to the MEC server, it does not transmit the complete depth data of each frame to the MEC server; it transmits only the difference data that differs from the preset depth data.
  • the MEC server can recover the corresponding depth data according to the difference data, and then synthesize the three-dimensional video data based on the depth data and the two-dimensional video data.
  • In this way, only the difference data of the depth data is transmitted to the MEC server, which reduces the transmission amount of depth data and improves network fluency.
  • Obtaining the three-dimensional video data includes: obtaining, by the terminal, the three-dimensional video data from an acquisition component capable of acquiring at least depth data; the acquisition component is capable of establishing a communication link with at least one terminal, so that the corresponding terminal obtains the three-dimensional video data.
  • In some embodiments, since an acquisition component capable of acquiring depth data is relatively expensive, the terminal itself does not have the function of acquiring three-dimensional video data; instead, the three-dimensional video data is collected by an acquisition component independent of the terminal, and a communication link is established between the acquisition component and the terminal, so that the terminal obtains the three-dimensional video data collected by the acquisition component.
  • the acquisition component may be specifically implemented by at least one of the following: a depth camera, a binocular camera, a 3D structured light camera module, and a time of flight (TOF) camera module.
  • the acquisition component can establish a communication link with at least one terminal to transmit the acquired three-dimensional video data to the at least one terminal, so that the corresponding terminal obtains the three-dimensional video data, so that the three-dimensional video data collected by one acquisition component can be shared To at least one terminal, so as to realize the sharing of the collection components.
  • the terminal itself has a function of acquiring three-dimensional video data.
  • The terminal is provided with an acquisition component capable of acquiring at least depth data, for example, at least one of the following components: a depth camera, a binocular camera, a 3D structured light camera module, and a TOF camera module, to collect three-dimensional video data.
  • the obtained three-dimensional video data includes two-dimensional video data and depth data.
  • the two-dimensional video data is used to represent a planar image, and may be RGB data, for example.
  • The depth data represents the distance between the acquisition component and the surface of the object being captured.
  • FIG. 3 is an implementation interaction diagram of the data transmission method according to the embodiment of the present application. As shown in FIG. 3, the method includes the following steps:
  • Step S301: The terminal acquires the depth data and two-dimensional video data in the three-dimensional video data to be transmitted.
  • In step S301, the terminal may collect the depth data in the three-dimensional video data to be transmitted through structured light, or another device may transmit the depth data to the terminal.
  • Step S302: For each frame of depth data, the terminal compares the depth data of the current frame of the three-dimensional video data to be transmitted with the depth data of the previous frame of the current frame to determine the difference data between the depth data of the current frame and the depth data of the previous frame.
  • In some embodiments, the depth data of the first frame is transmitted completely to the MEC server, and the depth data of the second frame is then compared with the depth data of the first frame.
  • The difference data between the two is determined, and the difference data is transmitted to the MEC server. That is, in this embodiment, when the second frame of depth data is transmitted, what is actually transmitted is the difference data between the second frame of depth data and the first frame of depth data.
  • Step S303: The terminal sends the two-dimensional video data and the difference data of each frame to the MEC server.
  • the terminal sends the two-dimensional video data of each frame to the MEC server, and sends the difference data between each adjacent two frames to the MEC server.
  • The embodiment of the present application further includes: acquiring the depth data of the next frame corresponding to the difference data; and if the depth data of the next frame is different from the depth data corresponding to the difference data, transmitting the depth data of the next frame.
  • For example, if the depth data of the third frame is different from the depth data of the second frame, the complete depth data of the third frame may be transmitted directly, or the difference data between the depth data of the third frame and the depth data of the second frame may be transmitted.
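The adjacent-frame scheme of steps S301 to S303 can be sketched as follows. This is again a hedged Python illustration: the packet tags `"full"` and `"diff"` and the flattened-list frame representation are invented for the sketch, not taken from the patent.

```python
def encode_adjacent(frames):
    """Send the first depth frame in full, then only adjacent-frame diffs.

    If a frame equals its predecessor, the diff is all zeros, so almost
    nothing of substance needs to be transmitted for a static scene.
    """
    packets = []
    prev = None
    for frame in frames:
        if prev is None:
            packets.append(("full", list(frame)))  # first frame: complete data
        else:
            diff = [c - p for c, p in zip(frame, prev)]
            packets.append(("diff", diff))         # later frames: difference only
        prev = frame
    return packets

# Static scene: frame 2 repeats frame 1, so its diff is all zeros;
# frame 3 changes a single pixel.
frames = [[10, 11, 12], [10, 11, 12], [10, 13, 12]]
packets = encode_adjacent(frames)
assert packets[0] == ("full", [10, 11, 12])
assert packets[1] == ("diff", [0, 0, 0])
assert packets[2] == ("diff", [0, 2, 0])
```

A real implementation would also need the full-frame fallback described above (resending a complete frame when the difference is large), which is omitted here for brevity.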
  • Step S304: The MEC server receives the difference data and the two-dimensional video data.
  • the difference data is the difference data between the depth data of the current frame and the depth data of the previous frame.
  • the difference data can be understood as the difference between the pixels corresponding to the depth data of the current frame and the pixels corresponding to the depth data of the previous frame.
  • Step S305: The MEC server recovers the depth data in the three-dimensional video data to be transmitted according to the difference data.
  • Step S306: The MEC server synthesizes the depth data and the two-dimensional video data into three-dimensional video data.
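On the MEC server side, steps S304 to S306 amount to accumulating the received packets back into per-frame depth data before synthesis. A minimal sketch under an assumed packet format (the tags `"full"` and `"diff"` are illustrative, not from the patent):

```python
def recover_depth(packets):
    """Rebuild per-frame depth data on the MEC server from received packets.

    A ("full", data) packet replaces the reference frame; a ("diff", data)
    packet is added pixel by pixel to the previous frame's depth data.
    """
    frames = []
    prev = None
    for kind, payload in packets:
        if kind == "full":
            frame = list(payload)
        else:
            frame = [p + d for p, d in zip(prev, payload)]
        frames.append(frame)
        prev = frame
    return frames

packets = [("full", [10, 11, 12]), ("diff", [0, 2, 0])]
assert recover_depth(packets) == [[10, 11, 12], [10, 13, 12]]
```

Each recovered depth frame would then be combined with the corresponding two-dimensional video frame to synthesize the three-dimensional video data.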
  • In the embodiment of the present application, the terminal transmits the two-dimensional video data and the difference data between the depth data of the current frame and the depth data of the previous frame, instead of two complete frames of depth data, to the MEC server, thereby greatly reducing the amount of data transmitted and increasing the speed of network transmission.
  • The separately transmitted depth data and two-dimensional video data require stronger technical support during the data transmission process, so the mobile communication network needs a faster data transmission rate and a more stable data transmission environment.
  • The resulting data volume is very large, leading to excessive network data transmission and network congestion.
  • The embodiment of the present application provides a data transmission method suitable for low-speed, static modeling scenarios. Without compression, the difference data is obtained by comparing the pixels of the depth data of the current frame with the pixels of the depth data of the previous frame in the three-dimensional video data, and the difference data is then transmitted over a high-speed transmission network.
  • The MEC server combines the difference data and the obtained two-dimensional video data into three-dimensional video data.
  • The three-dimensional video data is divided into many frames; B represents one complete frame of depth data, and I represents the difference data between the pixels of the depth data of the frame immediately following B and the corresponding pixels of B.
  • The depth data may be transmitted in multiple ways (here, to avoid redundancy, only two ways of transmitting the depth data are explained):
  • The first way of transmitting the depth data: first, a complete frame of depth data is transmitted; second, the difference data between the depth data of the first frame and the depth data of the next frame is determined, and the difference data is transmitted (that is, when the second frame is transmitted, it is actually the difference data that is transmitted); next, when the third frame is transmitted, the complete frame of depth data of the next frame corresponding to the difference data is transmitted; finally, when the fourth frame is transmitted, similarly to the second frame, the difference data between the pixels corresponding to the depth data of the third frame and the pixels corresponding to the depth data of the fourth frame is determined and transmitted. The resulting pattern is BIBIBI; during transmission, the ratio of full frames of depth data to difference data is 1:1.
  • The second way of transmitting the depth data: first, a complete frame of depth data is transmitted; second, the difference data between the depth data of the first frame and the depth data of the next frame is determined and transmitted (that is, when the second frame is transmitted, it is actually the difference data that is transmitted); next, the difference data between the depth data of the second frame and the depth data of the next frame is determined and transmitted (that is, when the third frame is transmitted, it is actually the difference data that is transmitted); then, when the fourth frame is transmitted, a complete frame of depth data is transmitted; after that, when the fifth frame is transmitted, the difference data between the depth data of the fourth frame and the depth data of the fifth frame is transmitted again. The resulting pattern is B-I1-I2-B-I1-I2-B-I1-I2; during transmission, the ratio of full frames of depth data to difference data is 1:2. Obviously, in this embodiment, the ratio of full frames of depth data to difference data during depth data transmission differs between the two ways.
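The two B/I patterns described above differ only in how many difference frames follow each full frame. A small helper can make the ratio explicit (a sketch; the `diffs_per_full` parameter name is invented for illustration):

```python
def transmission_pattern(num_frames, diffs_per_full):
    """Return the B/I packet pattern for a given full-to-diff ratio.

    diffs_per_full=1 yields the 1:1 scheme (BIBIBI...); diffs_per_full=2
    yields the 1:2 scheme (BIIBII...), matching the two ways described
    above. B marks a complete depth frame, I marks difference data.
    """
    period = diffs_per_full + 1
    return "".join("B" if i % period == 0 else "I" for i in range(num_frames))

assert transmission_pattern(6, 1) == "BIBIBI"
assert transmission_pattern(9, 2) == "BIIBIIBII"
```

A larger `diffs_per_full` transmits fewer full frames and therefore less data, at the cost of a longer chain of differences to accumulate before the next full-frame resynchronization point.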
  • FIG. 4 is a schematic structural diagram of a terminal according to an embodiment of the present application.
  • the terminal includes: an obtaining unit 41, a first data transmission unit 42, and a first communication unit 43;
  • the obtaining unit 41 is configured to obtain depth data and two-dimensional video data of each frame in the three-dimensional video data to be transmitted;
  • the first data transmission unit 42 is configured to compare the depth data with preset depth data for each frame of depth data, and determine the difference data between the depth data and the preset depth data;
  • the first communication unit 43 is configured to send the two-dimensional video data and the difference data.
  • the first data transmission unit 42 is configured to, for each frame of depth data, compare the depth data of the current frame of the three-dimensional video data to be transmitted with the depth data of the previous frame of the current frame, to determine the difference data between the depth data of the current frame and the depth data of the previous frame;
  • the first communication unit 43 is configured to send the two-dimensional video data and the difference data of each frame to the MEC server.
  • the terminal further includes: the obtaining unit 41 is further configured to obtain depth data of a next frame corresponding to the difference data;
  • the first communication unit 43 is further configured to transmit the depth data of the next frame if the depth data of the next frame is different from the depth data of the current frame corresponding to the difference data.
  • In practical applications, the first data transmission unit 42 in the terminal may be implemented by a processor in the terminal, such as a central processing unit (CPU), a digital signal processor (DSP), a microcontroller unit (MCU), or a field-programmable gate array (FPGA); the first communication unit 43 in the terminal may be implemented by a communication module.
  • In practical applications, the obtaining unit 41 in the terminal may be implemented by a stereo camera, a binocular camera, or a structured light camera, or through a communication module (including a basic communication suite, an operating system, a communication module, standardized interfaces and protocols, etc.) and a transceiver antenna;
  • the detection unit in the terminal may be implemented in practical applications by a processor such as a CPU, DSP, MCU, or FPGA, combined with a communication module.
  • It should be noted that the terminal provided in the foregoing embodiment is described using the division of the foregoing program modules only as an example; in practical applications, the above processing may be allocated to different program modules as needed, that is, the internal structure of the terminal may be divided into different program modules to complete all or part of the processing described above.
  • the terminal and the data transmission method embodiments provided in the foregoing embodiments belong to the same concept. For specific implementation processes, refer to the method embodiments, and details are not described herein again.
  • FIG. 5 is a schematic structural diagram of a server according to an embodiment of the present application; as shown in FIG. 5, the server includes a second communication unit 51 and a second data transmission unit 52;
  • the second communication unit 51 is configured to receive difference data and two-dimensional video data sent by a terminal;
  • the second data transmission unit 52 is configured to recover the depth data in the three-dimensional video data to be transmitted according to the difference data; and is further configured to synthesize the depth data and the two-dimensional video data into three-dimensional video data.
  • the second communication unit 51 is further configured to receive difference data and two-dimensional video data
  • the second data transmission unit 52 is further configured to recover at least the depth data of the current frame according to the difference data; and is further configured to convert the depth data of the current frame, the preset depth data, and the two-dimensional video data Synthesized into three-dimensional video data.
  • In practical applications, the second data transmission unit 52 in the server may be implemented by a processor in the server, such as a CPU, DSP, MCU, or FPGA; the second communication unit 51 in the server may be implemented by a communication module (including a basic communication suite, an operating system, a communication module, standardized interfaces and protocols, etc.) and a transceiver antenna.
  • It should be noted that the server provided in the above embodiment is described using the division of the above program modules only as an example; in practical applications, the above processing may be allocated to different program modules as needed, that is, the internal structure of the server may be divided into different program modules to complete all or part of the processing described above.
  • the server and the data transmission method embodiments provided in the foregoing embodiments belong to the same concept. For specific implementation processes, refer to the method embodiments, and details are not described herein again.
  • FIG. 6 is a schematic diagram of a hardware composition structure of the data transmission device according to the embodiment of the present application.
  • The data transmission device 60 includes a memory 61 and a processor.
  • When the processor located on the terminal executes the program, the following is implemented: acquiring the depth data and two-dimensional video data in the three-dimensional video data to be transmitted; for each frame of depth data, comparing the depth data with the preset depth data to determine the difference data between the depth data and the preset depth data; and sending the two-dimensional video data of the three-dimensional video data to be transmitted and the difference data to the mobile edge computing server.
  • When the processor located at the terminal executes the program, the following is implemented: for each frame of depth data, comparing the depth data of the current frame with the depth data of the previous frame of the current frame to determine the difference data between the depth data of the current frame and the depth data of the previous frame; and sending the two-dimensional video data and the difference data of each frame to the MEC server.
  • When the processor located at the terminal executes the program, the following is implemented: obtaining the depth data of the next frame corresponding to the difference data; and if the depth data of the next frame is different from the depth data corresponding to the difference data, transmitting the depth data of the next frame.
  • When the processor located on the server executes the program, the following is implemented: receiving the difference data and two-dimensional video data sent by the terminal; and recovering the depth data in the three-dimensional video data to be transmitted according to the difference data.
  • When the processor located on the server executes the program, the following is implemented: receiving the difference data and the two-dimensional video data; recovering at least the depth data of the current frame according to the difference data; and synthesizing the depth data of the current frame, the preset depth data, and the two-dimensional video data into three-dimensional video data.
  • The data transmission device further includes a communication interface 63; the various components in the data transmission device (terminal or server) are coupled together through a bus system. Understandably, the bus system is configured to enable connection and communication between these components; in addition to a data bus, the bus system also includes a power bus, a control bus, and a status signal bus.
  • An embodiment of the present application further provides a computer storage medium storing computer instructions that, when executed by a processor, implement the steps of the data transmission method applied to a terminal according to the embodiments of the present application; or, when executed by a processor, implement the steps of the data transmission method applied to the MEC server according to the embodiments of the present application.
  • An embodiment of the present application further provides a terminal including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the steps of the data transmission method applied to a terminal according to the embodiments of the present application are implemented.
  • An embodiment of the present application further provides an MEC server including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the steps of the data transmission method applied to an MEC server according to the embodiments of the present application are implemented.
  • The disclosed method and smart device may be implemented in other ways.
  • The device embodiments described above are merely illustrative.
  • The division of the units is only a logical function division.
  • In actual implementation there may be other division manners; for example, multiple units or components may be combined, may be integrated into another system, or some features may be ignored or not performed.
  • The coupling, direct coupling, or communication connection between the components shown or discussed may be through some interfaces.
  • The indirect coupling or communication connection of devices or units may be electrical, mechanical, or in other forms.
  • The units described above as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; they may be located in one place or distributed over multiple network units, and some or all of the units may be selected according to actual needs to achieve the objective of the solution of this embodiment.
  • The functional units in the embodiments of the present application may be integrated into one second processing unit, or each unit may separately serve as one unit, or two or more units may be integrated into one unit.
  • The above integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.
  • The foregoing program may be stored in a computer-readable storage medium; when the program is executed, the steps of the foregoing method embodiments are performed.
  • The foregoing storage medium includes various types of media that can store program code, such as a removable storage device, a ROM, a RAM, a magnetic disk, or an optical disc.
  • If the above integrated unit of the present application is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
  • The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a mobile phone) to execute all or part of the methods described in the embodiments of the present application.
  • The foregoing storage medium includes various types of media that can store program code, such as a removable storage device, a ROM, a RAM, a magnetic disk, or an optical disc.
  • The depth data and two-dimensional video data of each frame in the three-dimensional video data to be transmitted are acquired; then, for the depth data of each frame, the depth data is compared with preset depth data to determine difference data between the depth data and the preset depth data; finally, the two-dimensional video data and the difference data are sent.

Abstract

Embodiments of the present application disclose a data transmission method, a terminal, a server, and a storage medium. The method includes: acquiring depth data and two-dimensional video data of each frame in three-dimensional video data to be transmitted; for the depth data of each frame, comparing the depth data with preset depth data to determine difference data between the depth data and the preset depth data; and sending the two-dimensional video data and the difference data.

Description

Data transmission method, terminal, server, and storage medium
Cross-Reference to Related Applications
The embodiments of the present application are filed based on, and claim priority to, the Chinese patent application with application number 201811162929.9 filed on September 30, 2018, the entire contents of which are hereby incorporated into the embodiments of the present application by reference.
Technical Field
The present application relates to data transmission technology, and relates to, but is not limited to, a data transmission method, a terminal, a server, and a storage medium.
Background
With the continuous development of mobile communication networks, the transmission rate of mobile communication networks has increased rapidly, providing strong technical support for the emergence and development of three-dimensional video services. Three-dimensional video data includes two-dimensional video data (for example, Red Green Blue (RGB) data) and depth data (Depth data), and three-dimensional video data is transmitted by transmitting the two-dimensional video data and the depth data separately. During the transmission of three-dimensional video data, the Depth data needs to be transmitted for every frame of image, and this amount of data is very large; transmitting all of such a large amount of data therefore inevitably affects the transmission speed and the transmission accuracy.
Summary
To solve the above technical problem, embodiments of the present application provide a data transmission method, a terminal, a server, and a storage medium.
The technical solutions of the embodiments of the present application are implemented as follows:
An embodiment of the present application provides a data transmission method, the method including:
acquiring depth data and two-dimensional video data of each frame in three-dimensional video data to be transmitted;
for the depth data of each frame, comparing the depth data with preset depth data to determine difference data between the depth data and the preset depth data; and
sending the two-dimensional video data and the difference data.
In the above solution, the comparing, for the depth data of each frame, the depth data with the preset depth data to determine the difference data between the depth data and the preset depth data includes:
for the depth data of each frame, comparing the depth data of the current frame of the three-dimensional video data to be transmitted with the depth data of the frame preceding the current frame, to determine difference data between the depth data of the current frame and the depth data of the preceding frame;
correspondingly, the sending the two-dimensional video data and the difference data includes:
sending the two-dimensional video data of each frame and the corresponding difference data.
In the above solution, after the sending the two-dimensional video data of each frame and the corresponding difference data, the method further includes:
acquiring depth data of the frame following the frame to which the difference data corresponds; and
if the depth data of the next frame differs from the depth data of the current frame to which the difference data corresponds, transmitting the depth data of the next frame.
An embodiment of the present application further provides a data transmission method applied to an MEC server, the method including:
receiving difference data and two-dimensional video data sent by a terminal;
recovering, according to the difference data, depth data of three-dimensional video data to be transmitted; and
synthesizing the depth data and the two-dimensional video data into three-dimensional video data.
In the above solution, before the synthesizing the depth data and the two-dimensional video data into three-dimensional video data, the method further includes:
receiving difference data and two-dimensional video data;
recovering at least the depth data of the current frame according to the difference data; and
synthesizing the depth data of the current frame, the preset depth data, and the two-dimensional video data into three-dimensional video data.
An embodiment of the present application further provides a terminal, the terminal including an acquisition unit, a first data transmission unit, and a first communication unit, wherein:
the acquisition unit is configured to acquire depth data and two-dimensional video data of each frame in three-dimensional video data to be transmitted;
the first data transmission unit is configured to, for the depth data of each frame, compare the depth data with preset depth data and determine difference data between the depth data and the preset depth data; and
the first communication unit is configured to send the two-dimensional video data and the difference data.
In the above solution, the first data transmission unit is configured to, for the depth data of each frame, compare the depth data of the current frame of the three-dimensional video data to be transmitted with the depth data of the frame preceding the current frame, and determine difference data between the depth data of the current frame and the depth data of the preceding frame;
correspondingly, the first communication unit is configured to send the two-dimensional video data of each frame and the corresponding difference data.
In the above solution, the acquisition unit is further configured to acquire depth data of the frame following the frame to which the difference data corresponds; and
the first communication unit is further configured to transmit the depth data of the next frame if the depth data of the next frame differs from the depth data of the current frame to which the difference data corresponds.
An embodiment of the present application further provides an MEC server, the server including a second communication unit and a second data transmission unit, wherein:
the second communication unit is configured to receive difference data and two-dimensional video data sent by a terminal; and
the second data transmission unit is configured to recover, according to the difference data, depth data of three-dimensional video data to be transmitted, and is further configured to synthesize the depth data and the two-dimensional video data into three-dimensional video data.
In the above solution, the second communication unit is further configured to receive difference data and two-dimensional video data; and
the second data transmission unit is further configured to recover at least the depth data of the current frame according to the difference data, and is further configured to synthesize the depth data of the current frame, the preset depth data, and the two-dimensional video data into three-dimensional video data.
An embodiment of the present application further provides a computer storage medium on which computer instructions are stored; when the instructions are executed by a processor, the steps of the data transmission method applied to a terminal according to the embodiments of the present application are implemented; or, when the instructions are executed by a processor, the steps of the data transmission method applied to an MEC server according to the embodiments of the present application are implemented.
An embodiment of the present application further provides a terminal including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the steps of the data transmission method applied to a terminal according to the embodiments of the present application are implemented.
An embodiment of the present application further provides an MEC server including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the steps of the data transmission method applied to an MEC server according to the embodiments of the present application are implemented.
Embodiments of the present application provide a data transmission method, a terminal, a server, and a storage medium, wherein: first, the depth data and two-dimensional video data of each frame in the three-dimensional video data to be transmitted are acquired; then, for the depth data of each frame, the depth data is compared with preset depth data to determine difference data between the depth data and the preset depth data; finally, the two-dimensional video data and the difference data are sent to the mobile edge computing (MEC) server. With the technical solutions of the embodiments of the present application, by transmitting from the terminal only the depth data that differs, the amount of depth data transmitted is greatly reduced, and network fluency is improved.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of the system architecture to which the data transmission method of an embodiment of the present application is applied;
FIG. 2 is a schematic flowchart of the implementation of the data transmission method according to an embodiment of the present application;
FIG. 3 is an interaction diagram of the implementation of the data transmission method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a composition structure of the terminal according to an embodiment of the present application;
FIG. 5 is a schematic diagram of the composition structure of the server according to an embodiment of the present application;
FIG. 6 is a schematic diagram of the hardware composition structure of the data transmission device according to an embodiment of the present application.
Detailed Description
Before the technical solutions of the embodiments of the present application are described in detail, the system architecture to which the data transmission method of the embodiments of the present application is applied is first briefly described. The data transmission method of the embodiments of the present application is applied to services related to three-dimensional video data, for example a three-dimensional video data sharing service, a live-streaming service based on three-dimensional video data, and so on. In this case, because the amount of three-dimensional video data is large, the separately transmitted depth data and two-dimensional video data require considerable technical support during data transmission, so the mobile communication network needs a relatively fast data transmission rate and a relatively stable data transmission environment.
FIG. 1 is a schematic diagram of the system architecture to which the data transmission method of an embodiment of the present application is applied. As shown in FIG. 1, the system may include a terminal, a base station, an MEC server, a service processing server, a core network, the Internet, and so on; a high-speed channel is established between the MEC server and the service processing server through the core network to achieve data synchronization.
Taking the application scenario of two interacting terminals shown in FIG. 1 as an example, MEC server A is an MEC server deployed close to terminal A (the sending end), and core network A is the core network of the region where terminal A is located; correspondingly, MEC server B is an MEC server deployed close to terminal B (the receiving end), and core network B is the core network of the region where terminal B is located. MEC server A and MEC server B may establish high-speed channels with the service processing server through core network A and core network B, respectively, to achieve data synchronization.
After the three-dimensional video data sent by terminal A is transmitted to MEC server A, MEC server A synchronizes the data to the service processing server through core network A; MEC server B then obtains the three-dimensional video data sent by terminal A from the service processing server and sends it to terminal B for presentation.
Here, if terminal B and terminal A realize the transmission through the same MEC server, terminal B and terminal A realize the transmission of the three-dimensional video data directly through one MEC server without the participation of the service processing server; this manner is called the local loopback manner. Specifically, assuming that terminal B and terminal A realize the transmission of the three-dimensional video data through MEC server A, after the three-dimensional video data sent by terminal A is transmitted to MEC server A, MEC server A sends the three-dimensional video data to terminal B for presentation.
Here, the terminal may select, based on the network situation, the configuration of the terminal itself, or an algorithm configured by itself, to access an evolved NodeB (eNB) of the 4G network or a next-generation evolved NodeB (gNB) of the 5G network, so that the eNB is connected to the MEC server through the Long Term Evolution (LTE) access network, and the gNB is connected to the MEC server through the next-generation access network (NG-RAN).
Here, the MEC server is deployed at the network edge close to the terminal or the data source; being close to the terminal or the data source means being close not only in logical location but also in geographical location. Unlike existing mobile communication networks, in which the main service processing servers are deployed in a few large cities, multiple MEC servers can be deployed within one city. For example, if there are many users in an office building, an MEC server can be deployed near that office building.
As an edge computing gateway with the core capabilities of converged networking, computing, storage, and applications, the MEC server provides platform support including a device domain, a network domain, a data domain, and an application domain for edge computing. It connects various types of smart devices and sensors, provides smart connection and data transmission services nearby, and allows different types of applications and data to be processed in the MEC server, realizing key smart services such as real-time services, service intelligence, data aggregation and interoperability, and security and privacy protection, thereby effectively improving the intelligent decision-making efficiency of services.
The present application is further described in detail below with reference to the accompanying drawings and specific embodiments.
An embodiment of the present application provides a data transmission method applied to a terminal. The terminal may be a mobile terminal such as a mobile phone or a tablet computer, or a terminal of a type such as a computer. FIG. 2 is a schematic flowchart of the implementation of the data transmission method according to an embodiment of the present application; as shown in FIG. 2, the method includes the following steps:
Step S201: acquiring the depth data and the two-dimensional video data of each frame in the three-dimensional video data to be transmitted.
Here, the depth data in the three-dimensional video data to be transmitted may be captured directly, or the depth data may be received from another device.
Step S202: for the depth data of each frame, comparing the depth data with preset depth data to determine difference data between the depth data and the preset depth data.
Here, the three-dimensional video data contains M frames of depth data. The depth data of each frame is compared with the preset depth data one frame at a time to check whether they are the same; if they differ, the difference data between the two is determined and then transmitted to the MEC server. If the depth data is identical to the preset depth data, that depth data is not transmitted, or only a piece of information indicating that the two are identical is transmitted, which does not increase the amount of data transmitted.
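As a minimal sketch (not the patent's actual implementation), the per-frame comparison of step S202 can be illustrated in Python. Here a depth map is modeled as a flat list of per-pixel depth values, and the difference data is the set of pixels whose values differ from the preset depth data; the names `diff_depth`, `depth`, and `preset` are illustrative assumptions:

```python
def diff_depth(depth, preset):
    """Return difference data between a frame's depth map and preset depth data.

    Both maps are flat lists of per-pixel depth values of equal length.
    The difference data records (pixel_index, new_value) for every pixel
    that differs; an empty result means the two maps are identical, in
    which case no depth data needs to be transmitted for this frame.
    """
    if len(depth) != len(preset):
        raise ValueError("depth maps must have the same resolution")
    return [(i, d) for i, (d, p) in enumerate(zip(depth, preset)) if d != p]

# Hypothetical 2x2 depth maps (values in millimetres).
preset = [1200, 1200, 1500, 1500]
frame = [1200, 1180, 1500, 1500]
print(diff_depth(frame, preset))  # → [(1, 1180)], only pixel 1 changed
```

When the result is empty, the sender can transmit just a marker indicating "identical to preset", which is what keeps the transmitted volume small.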
Step S203: sending the two-dimensional video data and the difference data to the mobile edge computing (MEC) server.
Here, only the two-dimensional video data and the difference data are transmitted to the MEC server. The difference data can be understood as the difference between the pixels corresponding to the depth data and the pixels corresponding to the preset depth data.
In other words, when the terminal transmits depth data to the MEC server, it does not transmit every frame of depth data in full; it transmits only the difference data that differs from the preset depth data. The MEC server can then recover the corresponding depth data from the difference data and synthesize three-dimensional video data from the depth data and the two-dimensional video data.
In this embodiment, only the difference data between depth data is transmitted to the MEC server, which reduces the amount of depth data transmitted and improves network fluency.
In this embodiment, as one implementation, the obtaining of the three-dimensional video data includes: the terminal obtains the three-dimensional video data from a capture component capable of capturing at least depth data; the capture component can establish a communication link with at least one terminal so that the corresponding terminal obtains the three-dimensional video data.
In this implementation, because a capture component capable of capturing depth data is relatively expensive, the terminal itself does not have the capability of capturing three-dimensional video data; instead, a capture component independent of the terminal captures the three-dimensional video data, and a communication link is established between the capture component and a communication component in the terminal so that the terminal obtains the three-dimensional video data captured by the capture component. The capture component may specifically be implemented by at least one of the following: a depth camera, a binocular camera, a 3D structured-light camera module, or a Time Of Flight (TOF) camera module.
Here, the capture component can establish communication links with at least one terminal to transmit the captured three-dimensional video data to the at least one terminal, so that each corresponding terminal obtains the three-dimensional video data. In this way, the three-dimensional video data captured by one capture component can be shared with at least one terminal, thereby achieving sharing of the capture component.
As another implementation, the terminal itself has the capability of capturing three-dimensional video data. It can be understood that the terminal is provided with a capture component capable of capturing at least depth data, for example at least one of the following: a depth camera, a binocular camera, a 3D structured-light camera module, or a TOF camera module, to capture the three-dimensional video data.
The obtained three-dimensional video data includes two-dimensional video data and depth data. The two-dimensional video data represents a planar image and may be, for example, RGB data; the depth data represents the distance between the surface of the captured object targeted by the capture component and the capture component.
An embodiment of the present application further provides a data transmission method. FIG. 3 is an interaction diagram of the implementation of the data transmission method according to an embodiment of the present application; as shown in FIG. 3, the method includes the following steps:
Step S301: the terminal acquires the depth data and the two-dimensional video data in the three-dimensional video data to be transmitted.
Here, in step S301, the terminal may capture the depth data in the three-dimensional video data to be transmitted via structured light, or another device may transmit the depth data to the terminal.
Step S302: for the depth data of each frame, the terminal compares the depth data of the current frame of the three-dimensional video data to be transmitted with the depth data of the frame preceding the current frame, and determines difference data between the depth data of the current frame and the depth data of the preceding frame.
Here, for example, suppose there are 100 frames of depth data in total. For each of these frames, the first frame of depth data is first transmitted in full to the MEC server; then the second frame of depth data is compared with the first frame of depth data, the difference data between the two is determined, and that difference data is transmitted to the MEC server. That is, in this embodiment, when the second frame of depth data is transmitted, what is actually transmitted is the difference data between the second frame of depth data and the first frame of depth data.
Step S303: the terminal sends the two-dimensional video data of each frame and the difference data to the MEC server.
Here, the terminal sends the two-dimensional video data of every frame to the MEC server, and sends the difference data between every two adjacent frames to the MEC server. After step S303, this embodiment of the present application further includes: acquiring depth data of the frame following the frame to which the difference data corresponds; if the depth data of the next frame differs from the depth data to which the difference data corresponds, transmitting the depth data of the next frame. That is, on the basis that the second frame of depth data differs from the first frame and the difference data between the two has been transmitted to the MEC server, if the third frame of depth data also differs from the second frame, either the complete third frame of depth data may be transmitted directly, or the difference data between the third frame of depth data and the second frame of depth data may be transmitted.
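The terminal-side scheme described above (first depth frame in full, then only the difference with the preceding frame) can be sketched in Python. This is a hedged illustration under assumed data shapes, not the patent's implementation; `plan_depth_transmission` and the `("full", ...)`/`("diff", ...)` message tags are invented for the example:

```python
def plan_depth_transmission(frames):
    """Terminal-side sketch: the first depth frame is sent in full; each
    later frame is compared with its predecessor and only the difference
    data (changed pixels, as (index, value) pairs) is sent.

    Returns a list of ("full", frame) or ("diff", difference_data) messages.
    """
    messages = []
    prev = None
    for frame in frames:
        if prev is None:
            # First frame: transmit the complete depth data.
            messages.append(("full", frame))
        else:
            # Later frames: transmit only the pixels that changed.
            diff = [(i, d) for i, (d, p) in enumerate(zip(frame, prev)) if d != p]
            messages.append(("diff", diff))
        prev = frame
    return messages

frames = [[10, 10, 10], [10, 12, 10], [10, 12, 10]]
print(plan_depth_transmission(frames))
# → [('full', [10, 10, 10]), ('diff', [(1, 12)]), ('diff', [])]
```

The third message is an empty difference list because the frame is identical to its predecessor, which is where the bandwidth saving comes from in slowly changing (static-modeling) scenes.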
Step S304: the MEC server receives the difference data and the two-dimensional video data.
Here, the difference data is the difference data between the depth data of the current frame and the depth data of the preceding frame. The difference data can be understood as the difference between the pixels corresponding to the depth data of the current frame and the pixels corresponding to the depth data of the preceding frame.
Step S305: the MEC server recovers the depth data of the three-dimensional video data to be transmitted according to the difference data.
Step S306: the MEC server synthesizes the depth data and the two-dimensional video data into three-dimensional video data.
In this embodiment of the present application, the terminal transmits to the MEC server the two-dimensional video data and the difference data between the depth data of the current frame and the depth data of the preceding frame, rather than two complete frames of depth data, which greatly reduces the amount of data transmitted and improves the network transmission speed.
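The server side of steps S305 and S306 can be sketched in Python under the same assumed data shapes as the terminal-side example: the MEC server applies the received (index, value) difference data to the previously recovered depth frame, then pairs the recovered depth with the 2D video data. `recover_depth` and `synthesize` are illustrative names, and the "synthesis" here is a toy stand-in for real 3D reconstruction:

```python
def recover_depth(previous, diff):
    """MEC-server-side sketch: recover the current frame's depth map by
    applying the received difference data to the previously recovered frame."""
    depth = list(previous)  # copy, so the previous frame is left intact
    for index, value in diff:
        depth[index] = value
    return depth

def synthesize(depth, rgb):
    """Toy stand-in for synthesis: pair each 2D (RGB) pixel with its depth."""
    return list(zip(rgb, depth))

prev = [10, 10, 10]
current = recover_depth(prev, [(1, 12)])
print(current)                             # → [10, 12, 10]
print(synthesize(current, ["r0", "r1", "r2"]))
```

An empty difference list recovers a frame identical to its predecessor, matching the case where the terminal sent only an "identical" indication.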
When three-dimensional video data is transmitted in the related art, the separately transmitted depth data and two-dimensional video data require considerable technical support during data transmission, so the mobile communication network needs a relatively fast data transmission rate and a relatively stable data transmission environment. However, because the Depth data needs to be transmitted for every frame of image and this amount of data is very large, the network data transmission load becomes excessive, causing network congestion and the like.
On this basis, an embodiment of the present application provides a data transmission method suitable for relatively low-speed, static-modeling scenarios. Before the two-dimensional video data and the depth data are transmitted from the terminal to the MEC server, the depth data is preprocessed: in a non-compressed manner, the pixels of the depth data of the current frame in the three-dimensional video data are compared with the pixels of the depth data of the preceding frame to obtain the difference data; that difference data is then transmitted to the MEC server over a high-speed transmission network, and the MEC server combines the difference data and the acquired two-dimensional video data into three-dimensional video data. For example, the three-dimensional video data is divided into many frames, where B denotes a full frame of depth data and I denotes the difference data between the pixels of the depth data of the frame immediately following B and the corresponding pixels of B. In this embodiment, the depth data may be transmitted in a number of ways (here, to avoid redundancy and repetition, only two ways of transmitting the depth data are explained):
The first way of transmitting the depth data: first, a complete frame of depth data is transmitted; second, the difference data between the first frame of depth data and the next frame of depth data is analyzed, and that difference data is transmitted next (that is, when the second frame is transmitted, it is actually the difference data that is transmitted); third, when the third frame is transmitted, the complete frame of depth data of the frame following the one to which the difference data corresponds is transmitted; finally, when the fourth frame is transmitted, similarly to the transmission of the second frame, the difference data between the pixels corresponding to the depth data of the third frame and the pixels corresponding to the depth data of the fourth frame is first analyzed, and that difference data is transmitted (i.e., B-I-B-I-B-I; during transmission, the ratio of full frames of depth data to difference data transmitted is 1:1).
The second way of transmitting the depth data: first, a complete frame of depth data is transmitted; second, the difference data between the first frame of depth data and the next frame of depth data is analyzed, and that difference data is transmitted next (that is, when the second frame is transmitted, it is actually the difference data that is transmitted); third, the difference data between the second frame of depth data and the next frame of depth data is analyzed, and that difference data is transmitted next (that is, when the third frame is transmitted, it is actually the difference data that is transmitted); then, when the fourth frame is transmitted, the complete frame of depth data of the frame following the one to which the difference data of the third frame corresponds is transmitted; and then, when the fifth frame is transmitted, it is again the difference data between the depth data of the fourth frame and the depth data of the fifth frame that is transmitted, i.e., B-I1-I2-B-I1-I2-B-I1-I2 (during transmission, the ratio of full frames of depth data to difference data transmitted is 1:2). Clearly, in this embodiment, during the transmission of the depth data, the ratio of full frames of depth data to difference data transmitted is not limited to 1:1 or 1:2 and may be any other ratio.
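The two B-I transmission patterns above can be sketched as a tiny scheduler in Python. This is an illustrative sketch only; `frame_types` and its parameter names are assumptions, and real systems would choose the keyframe interval from scene dynamics rather than a fixed ratio:

```python
def frame_types(n_frames, diffs_per_keyframe):
    """Sketch of the B-I transmission patterns described above: 'B' marks a
    full depth frame, 'I' a difference frame.

    diffs_per_keyframe=1 yields the 1:1 pattern B-I-B-I...;
    diffs_per_keyframe=2 yields the 1:2 pattern B-I-I-B-I-I...;
    any other ratio is obtained by varying diffs_per_keyframe.
    """
    period = 1 + diffs_per_keyframe
    return ["B" if i % period == 0 else "I" for i in range(n_frames)]

print("".join(frame_types(6, 1)))  # → BIBIBI (1:1 ratio)
print("".join(frame_types(6, 2)))  # → BIIBII (1:2 ratio)
```

A larger `diffs_per_keyframe` saves more bandwidth but makes each recovered frame depend on a longer chain of difference frames, so a lost difference frame corrupts more of the sequence until the next full frame arrives.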
In this embodiment of the present application, when the depth data of the three-dimensional video data is transmitted, only the depth data of the pixels that differ between the current frame and the preceding frame is transmitted, which not only greatly reduces the network data transmission load but also effectively improves the fluency of network transmission.
To implement the terminal-side method of the embodiments of the present application, an embodiment of the present application further provides a terminal. FIG. 4 is a schematic diagram of a composition structure of the terminal according to an embodiment of the present application; as shown in FIG. 4, the terminal includes an acquisition unit 41, a first data transmission unit 42, and a first communication unit 43, wherein:
the acquisition unit 41 is configured to acquire the depth data and the two-dimensional video data of each frame in the three-dimensional video data to be transmitted;
the first data transmission unit 42 is configured to, for the depth data of each frame, compare the depth data with preset depth data and determine difference data between the depth data and the preset depth data; and
the first communication unit 43 is configured to send the two-dimensional video data and the difference data.
In one embodiment, the first data transmission unit 42 is configured to, for the depth data of each frame, compare the depth data of the current frame of the three-dimensional video data to be transmitted with the depth data of the frame preceding the current frame, and determine difference data between the depth data of the current frame and the depth data of the preceding frame;
correspondingly, the first communication unit 43 is configured to send the two-dimensional video data of each frame and the difference data to the MEC server.
In one embodiment, as shown in FIG. 4, the acquisition unit 41 is further configured to acquire depth data of the frame following the frame to which the difference data corresponds; and
the first communication unit 43 is further configured to transmit the depth data of the next frame if the depth data of the next frame differs from the depth data of the current frame to which the difference data corresponds.
In this embodiment of the present application, the first data transmission unit 42 in the terminal may, in practical applications, be implemented by a processor in the terminal, such as a Central Processing Unit (CPU), a Digital Signal Processor (DSP), a Microcontroller Unit (MCU), or a Field-Programmable Gate Array (FPGA); the first communication unit 43 in the terminal may, in practical applications, be implemented by a communication module (including a basic communication suite, an operating system, a communication module, standardized interfaces and protocols, and the like) and transceiver antennas; the acquisition unit 41 in the terminal may, in practical applications, be implemented by a stereo camera, a binocular camera, or a structured-light camera, or by a communication module (including a basic communication suite, an operating system, a communication module, standardized interfaces and protocols, and the like) and transceiver antennas.
It should be noted that when the terminal provided by the above embodiment performs data transmission, the division into the above program modules is merely illustrative; in practical applications, the above processing may be allocated to different program modules as needed, that is, the internal structure of the terminal may be divided into different program modules to complete all or part of the processing described above. In addition, the terminal provided by the above embodiment and the data transmission method embodiments belong to the same concept; for the specific implementation process, refer to the method embodiments, which will not be repeated here.
Correspondingly, to implement the server-side method of the embodiments of the present application, an embodiment of the present application further provides a server, specifically an MEC server. FIG. 5 is a schematic diagram of the composition structure of the server according to an embodiment of the present application; as shown in FIG. 5, the server includes a second communication unit 51 and a second data transmission unit 52, wherein:
the second communication unit 51 is configured to receive difference data and two-dimensional video data sent by a terminal; and
the second data transmission unit 52 is configured to recover, according to the difference data, the depth data of the three-dimensional video data to be transmitted, and is further configured to synthesize the depth data and the two-dimensional video data into three-dimensional video data.
In one embodiment, the second communication unit 51 is further configured to receive difference data and two-dimensional video data; and
the second data transmission unit 52 is further configured to recover at least the depth data of the current frame according to the difference data, and is further configured to synthesize the depth data of the current frame, the preset depth data, and the two-dimensional video data into three-dimensional video data.
In this embodiment of the present application, the second data transmission unit 52 in the server may, in practical applications, be implemented by a processor in the server, such as a CPU, DSP, MCU, or FPGA; the second communication unit 51 in the server may, in practical applications, be implemented by a communication module (including a basic communication suite, an operating system, a communication module, standardized interfaces and protocols, and the like) and transceiver antennas.
It should be noted that when the server provided by the above embodiment performs data transmission, the division into the above program modules is merely illustrative; in practical applications, the above transmission may be allocated to different program modules as needed, that is, the internal structure of the server may be divided into different program modules to complete all or part of the processing described above. In addition, the server provided by the above embodiment and the data transmission method embodiments belong to the same concept; for the specific implementation process, refer to the method embodiments, which will not be repeated here.
Based on the hardware implementation of the above devices, an embodiment of the present application further provides a data transmission device. FIG. 6 is a schematic diagram of the hardware composition structure of the data transmission device according to an embodiment of the present application. As shown in FIG. 6, the data transmission device 60 includes a memory 61, a processor 62, and a computer program stored in the memory and executable on the processor. As a first implementation, when the data transmission device is a terminal, the processor located on the terminal, when executing the program, implements: acquiring the depth data and two-dimensional video data in the three-dimensional video data to be transmitted; for the depth data of each frame, comparing the depth data with preset depth data to determine difference data between the depth data and the preset depth data; and sending the two-dimensional video data of the three-dimensional video data to be transmitted, together with the difference data, to the mobile edge computing server.
In one embodiment, the processor located on the terminal, when executing the program, implements: for the depth data of each frame, comparing the depth data of the current frame with the depth data of the frame preceding the current frame to determine difference data between the depth data of the current frame and the depth data of the preceding frame; and sending the two-dimensional video data of each frame and the difference data to the MEC server.
In one embodiment, the processor located on the terminal, when executing the program, implements: acquiring depth data of the frame following the frame to which the difference data corresponds; and, if the depth data of the next frame differs from the depth data to which the difference data corresponds, transmitting the depth data of the next frame.
As a second implementation, when the data transmission device is a server, the processor located on the server, when executing the program, implements: receiving difference data and two-dimensional video data sent by a terminal; recovering the depth data of the three-dimensional video data to be transmitted according to the difference data; and synthesizing the depth data and the two-dimensional video data into three-dimensional video data.
In one embodiment, the processor located on the server, when executing the program, implements: receiving difference data and two-dimensional video data; recovering at least the depth data of the current frame according to the difference data; and synthesizing the depth data of the current frame, the preset depth data, and the two-dimensional video data into three-dimensional video data.
It can be understood that the data transmission device (terminal or server) further includes a communication interface 63; the various components in the data transmission device (terminal or server) are coupled together through a bus system. It can be understood that the bus system is configured to enable connection and communication between these components. In addition to a data bus, the bus system also includes a power bus, a control bus, and a status signal bus.
An embodiment of the present application further provides a computer storage medium on which computer instructions are stored; when the instructions are executed by a processor, the steps of the data transmission method applied to a terminal according to the embodiments of the present application are implemented; or, when the instructions are executed by a processor, the steps of the data transmission method applied to an MEC server according to the embodiments of the present application are implemented.
An embodiment of the present application further provides a terminal including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the steps of the data transmission method applied to a terminal according to the embodiments of the present application are implemented.
An embodiment of the present application further provides an MEC server including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the steps of the data transmission method applied to an MEC server according to the embodiments of the present application are implemented.
In the several embodiments provided in the present application, it should be understood that the disclosed method and smart device may be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division of the units is only a logical function division, and there may be other division manners in actual implementation, e.g., multiple units or components may be combined, may be integrated into another system, or some features may be ignored or not performed. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be electrical, mechanical, or in other forms.
The units described above as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; they may be located in one place or distributed over multiple network units, and some or all of the units may be selected according to actual needs to achieve the objective of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may all be integrated into one second processing unit, or each unit may separately serve as one unit, or two or more units may be integrated into one unit; the above integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.
Those of ordinary skill in the art will understand that all or part of the steps for implementing the above method embodiments may be completed by hardware related to program instructions; the foregoing program may be stored in a computer-readable storage medium, and when the program is executed, the steps of the above method embodiments are performed; the foregoing storage medium includes various media that can store program code, such as a removable storage device, a ROM, a RAM, a magnetic disk, or an optical disc.
Alternatively, if the above integrated unit of the present application is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a mobile phone, or the like) to execute all or part of the methods described in the embodiments of the present application. The foregoing storage medium includes various media that can store program code, such as a removable storage device, a ROM, a RAM, a magnetic disk, or an optical disc.
It should be noted that the technical solutions described in the embodiments of the present application may be combined arbitrarily as long as there is no conflict.
The above are only specific implementations of the present application, but the protection scope of the present application is not limited thereto; any changes or substitutions that can easily be conceived by a person skilled in the art within the technical scope disclosed in the present application shall be covered by the protection scope of the present application.
Industrial Applicability
In the embodiments of the present application, first, the depth data and two-dimensional video data of each frame in the three-dimensional video data to be transmitted are acquired; then, for the depth data of each frame, the depth data is compared with preset depth data to determine difference data between the depth data and the preset depth data; finally, the two-dimensional video data and the difference data are sent. With the technical solutions of the embodiments of the present application, by transmitting from the terminal only the depth data that differs, the amount of depth data transmitted is greatly reduced, and network fluency is improved.

Claims (13)

  1. A data transmission method, applied to a terminal, the method comprising:
    acquiring depth data and two-dimensional video data of each frame in three-dimensional video data to be transmitted;
    for the depth data of each frame, comparing the depth data with preset depth data to determine difference data between the depth data and the preset depth data; and
    sending the two-dimensional video data and the difference data.
  2. The method according to claim 1, wherein the comparing, for the depth data of each frame, the depth data with the preset depth data to determine the difference data between the depth data and the preset depth data comprises:
    for the depth data of each frame, comparing the depth data of the current frame of the three-dimensional video data to be transmitted with the depth data of the frame preceding the current frame, to determine difference data between the depth data of the current frame and the depth data of the preceding frame;
    correspondingly, the sending the two-dimensional video data and the difference data comprises:
    sending the two-dimensional video data of each frame and the corresponding difference data.
  3. The method according to claim 1 or 2, wherein after the sending the two-dimensional video data of each frame and the corresponding difference data, the method further comprises:
    acquiring depth data of the frame following the frame to which the difference data corresponds; and
    if the depth data of the next frame differs from the depth data of the current frame to which the difference data corresponds, transmitting the depth data of the next frame.
  4. A data transmission method, applied to an MEC server, the method comprising:
    receiving difference data and two-dimensional video data sent by a terminal;
    recovering, according to the difference data, depth data of three-dimensional video data to be transmitted; and
    synthesizing the depth data and the two-dimensional video data into three-dimensional video data.
  5. The method according to claim 4, wherein before the synthesizing the depth data and the two-dimensional video data into three-dimensional video data, the method further comprises:
    receiving difference data and two-dimensional video data;
    recovering at least the depth data of the current frame according to the difference data; and
    synthesizing the depth data of the current frame, the preset depth data, and the two-dimensional video data into three-dimensional video data.
  6. A terminal, wherein the terminal comprises an acquisition unit, a first data transmission unit, and a first communication unit, wherein:
    the acquisition unit is configured to acquire depth data and two-dimensional video data of each frame in three-dimensional video data to be transmitted;
    the first data transmission unit is configured to, for the depth data of each frame, compare the depth data with preset depth data and determine difference data between the depth data and the preset depth data; and
    the first communication unit is configured to send the two-dimensional video data and the difference data.
  7. The terminal according to claim 6, wherein the first data transmission unit is configured to, for the depth data of each frame, compare the depth data of the current frame of the three-dimensional video data to be transmitted with the depth data of the frame preceding the current frame, and determine difference data between the depth data of the current frame and the depth data of the preceding frame;
    correspondingly, the first communication unit is configured to send the two-dimensional video data of each frame and the corresponding difference data.
  8. The terminal according to claim 6, wherein the acquisition unit is further configured to acquire depth data of the frame following the frame to which the difference data corresponds; and
    the first communication unit is further configured to transmit the depth data of the next frame if the depth data of the next frame differs from the depth data of the current frame to which the difference data corresponds.
  9. An MEC server, wherein the server comprises a second communication unit and a second data transmission unit, wherein:
    the second communication unit is configured to receive difference data and two-dimensional video data sent by a terminal; and
    the second data transmission unit is configured to recover, according to the difference data, depth data of three-dimensional video data to be transmitted, and is further configured to synthesize the depth data and the two-dimensional video data into three-dimensional video data.
  10. The server according to claim 9, wherein the second communication unit is further configured to receive difference data and two-dimensional video data; and
    the second data transmission unit is further configured to recover at least the depth data of the current frame according to the difference data, and is further configured to synthesize the depth data of the current frame, the preset depth data, and the two-dimensional video data into three-dimensional video data.
  11. A computer storage medium having computer instructions stored thereon, wherein, when executed by a processor, the instructions implement the steps of the data transmission method according to any one of claims 1 to 3; or, when executed by a processor, the instructions implement the steps of the data transmission method according to claim 4 or 5.
  12. A terminal comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the data transmission method according to any one of claims 1 to 3.
  13. An MEC server comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the data transmission method according to claim 4 or 5.
PCT/CN2019/100647 2018-09-30 2019-08-14 Data transmission method, terminal, server and storage medium WO2020063171A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811162929.9 2018-09-30
CN201811162929.9A CN109257588A (zh) 2018-09-30 2018-09-30 Data transmission method, terminal, server and storage medium

Publications (1)

Publication Number Publication Date
WO2020063171A1 true WO2020063171A1 (zh) 2020-04-02

Family

ID=65045349

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/100647 WO2020063171A1 (zh) 2018-09-30 2019-08-14 Data transmission method, terminal, server and storage medium

Country Status (2)

Country Link
CN (1) CN109257588A (zh)
WO (1) WO2020063171A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109257588A (zh) * 2018-09-30 2019-01-22 Oppo广东移动通信有限公司 一种数据传输方法、终端、服务器和存储介质
CN113993104B (zh) * 2021-10-26 2023-12-26 中汽创智科技有限公司 一种数据传输方法、装置、设备及存储介质

Citations (3)

Publication number Priority date Publication date Assignee Title
CN203416351U (zh) * 2013-05-31 2014-01-29 江西省电力设计院 Video monitoring system for power plant building
CN108495112A (zh) * 2018-05-10 2018-09-04 Oppo广东移动通信有限公司 Data transmission method and terminal, computer storage medium
CN109257588A (zh) * 2018-09-30 2019-01-22 Oppo广东移动通信有限公司 Data transmission method, terminal, server and storage medium

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
CN101990107A (zh) * 2009-07-31 2011-03-23 廖礼士 Encoding system and method, decoding system and method, display system and method
US20130271565A1 (en) * 2012-04-16 2013-10-17 Qualcomm Incorporated View synthesis based on asymmetric texture and depth resolutions
CN102868899A (zh) * 2012-09-06 2013-01-09 华映光电股份有限公司 Three-dimensional image processing method
CN105847777B (zh) * 2016-03-24 2018-04-17 湖南拓视觉信息技术有限公司 Method and device for transmitting three-dimensional depth image
CN107241563B (zh) * 2017-06-16 2020-01-07 深圳市玩视科技有限公司 Video transmission method, intelligent mobile terminal and device with storage function

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN203416351U (zh) * 2013-05-31 2014-01-29 江西省电力设计院 Video monitoring system for power plant building
CN108495112A (zh) * 2018-05-10 2018-09-04 Oppo广东移动通信有限公司 Data transmission method and terminal, computer storage medium
CN109257588A (zh) * 2018-09-30 2019-01-22 Oppo广东移动通信有限公司 Data transmission method, terminal, server and storage medium

Non-Patent Citations (1)

Title
TENCENT ET AL.: "Terminology Clarification", 3GPP TSG-SA WG1 MEETING #83 S 1-182738, 24 August 2018 (2018-08-24), XP051475358 *

Also Published As

Publication number Publication date
CN109257588A (zh) 2019-01-22

Similar Documents

Publication Publication Date Title
CN108495112B (zh) Data transmission method and terminal, computer storage medium
US11373319B2 (en) System and method for optimizing dynamic point clouds based on prioritized transformations
WO2017152723A1 (zh) Data transmission method, device and system
WO2020063169A1 (zh) Data processing method and apparatus, electronic device and storage medium
US20210201568A1 (en) Data Processing Method and Electronic Device
WO2018200337A1 (en) System and method for simulating light transport between virtual and real objects in mixed reality
AU2019345715B2 (en) Methods and devices for data processing, electronic device
CN109410319B (zh) Data processing method, server and computer storage medium
WO2020063171A1 (zh) Data transmission method, terminal, server and storage medium
CN108667936B (zh) Data processing method, terminal, mobile edge computing server and storage medium
WO2020063170A1 (zh) Data processing method, terminal, server and storage medium
CN109413405B (zh) Data processing method, terminal, server and computer storage medium
WO2020063168A1 (zh) Data processing method, terminal, server and computer storage medium
CN108632376B (zh) Data processing method, terminal, server and computer storage medium
WO2020062919A1 (zh) Data processing method, MEC server and terminal device
CN109147043B (zh) Data processing method, server and computer storage medium
CN116503498A (zh) Picture rendering method and related apparatus
CN109246409B (zh) Data processing method, terminal, server and computer storage medium
CN109389674B (zh) Data processing method and apparatus, MEC server and storage medium
CN109151435B (zh) Data processing method, terminal, server and computer storage medium
CN109309839B (zh) Data processing method and apparatus, electronic device and storage medium
CN108737807B (zh) Data processing method, terminal, server and computer storage medium
WO2020063172A1 (zh) Data processing method, terminal, server and storage medium
CN109299323B (zh) Data processing method, terminal, server and computer storage medium
CN109302598B (zh) Data processing method, terminal, server and computer storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19865516

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19865516

Country of ref document: EP

Kind code of ref document: A1