CN109413409B - Data processing method, MEC server and terminal equipment - Google Patents

Info

Publication number
CN109413409B
Authority
CN
China
Prior art keywords
information
terminal equipment
dimensional image
shooting angle
terminal device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811162079.2A
Other languages
Chinese (zh)
Other versions
CN109413409A (en)
Inventor
夏炀
谭正鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201811162079.2A
Publication of CN109413409A
Priority to PCT/CN2019/090328 (WO2020062919A1)
Application granted
Publication of CN109413409B
Legal status: Active

Classifications

    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • H ELECTRICITY
        • H04 ELECTRIC COMMUNICATION TECHNIQUE
            • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
                    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
                        • H04N13/194 Transmission of image signals
                    • H04N13/20 Image signal generators
                        • H04N13/204 Image signal generators using stereoscopic image cameras
                        • H04N13/261 Image signal generators with monoscopic-to-stereoscopic image conversion
                        • H04N13/275 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Telephone Function (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

Embodiments of the invention provide a data processing method, a terminal device, and an MEC server. The method includes: receiving attitude information of a terminal device; determining a shooting angle and spatial relative position information of the terminal device based on the attitude information; and performing three-dimensional image modeling based on the shooting angle and the spatial relative position information.

Description

Data processing method, MEC server and terminal equipment
Technical Field
Embodiments of the present application relate to the field of information processing, and in particular to a data processing method, a terminal device, and an MEC server.
Background
With the continuous development of mobile communication networks, transmission rates have risen rapidly, providing strong technical support for the emergence and growth of three-dimensional video services. Three-dimensional data consists of two-dimensional image data (e.g., RGB data) and depth data, which are transmitted separately. However, during three-dimensional modeling, the two-dimensional image information only conveys the up, down, left, and right of the scene, so the accuracy of the model cannot be guaranteed.
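To see why, consider the following minimal Python sketch of back-projection under an assumed pinhole camera model; the intrinsics fx, fy, cx, cy and the sample values are illustrative and not specified by the patent. A pixel plus its depth sample yields a point in the camera's own frame only; without the device attitude, such points cannot be placed in the world coordinate system used for modeling.

    import numpy as np

    def backproject(u, v, depth, fx, fy, cx, cy):
        # Lift pixel (u, v) with its depth sample to a camera-frame 3D point.
        x = (u - cx) * depth / fx  # left/right offset, from the pixel column
        y = (v - cy) * depth / fy  # up/down offset, from the pixel row
        z = depth                  # distance along the optical axis
        return np.array([x, y, z])

    # The result lives in the camera frame; placing it in the world frame
    # requires the device attitude discussed below.
    point_cam = backproject(u=320, v=240, depth=1.5,
                            fx=525.0, fy=525.0, cx=319.5, cy=239.5)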
Disclosure of Invention
To solve the above technical problem, embodiments of the present invention provide a data processing method, a terminal device, and an MEC server.
The data processing method provided by an embodiment of the present application is applied to an MEC server and includes:
receiving attitude information of a terminal device;
determining a shooting angle and spatial relative position information of the terminal device based on the attitude information; and
performing three-dimensional image modeling based on the shooting angle and the spatial relative position information of the terminal device.
The data processing method provided by an embodiment of the present application is applied to a terminal device and includes:
collecting attitude information of the terminal device; and
sending the attitude information of the terminal device to the network side, so that the network side determines the shooting angle and spatial relative position information of the terminal device based on the attitude information and performs three-dimensional image modeling based on the shooting angle and the spatial relative position information.
An embodiment of the present application provides an MEC server, including:
a first communication unit, configured to receive attitude information of a terminal device; and
a first processing unit, configured to determine a shooting angle and spatial relative position information of the terminal device based on the attitude information, and to perform three-dimensional image modeling based on the shooting angle and the spatial relative position information.
An embodiment of the present application provides a terminal device, including:
a collecting unit, configured to collect attitude information of the terminal device; and
a second communication unit, configured to send the attitude information of the terminal device to the network side.
The MEC server provided by an embodiment of the present application includes a processor and a memory. The memory stores a computer program, and the processor calls and runs the computer program stored in the memory to perform the data processing method described above.
The terminal device provided by an embodiment of the present application likewise includes a processor and a memory, where the processor calls and runs a computer program stored in the memory to perform the data processing method described above.
The chip provided by an embodiment of the present application implements the data processing method. Specifically, the chip includes a processor, configured to call and run a computer program from a memory, so that a device on which the chip is installed performs the data processing method.
A computer-readable storage medium provided by an embodiment of the present application stores a computer program that causes a computer to perform the data processing method described above.
The computer program product provided by an embodiment of the present application includes computer program instructions that cause a computer to perform the data processing method.
The computer program provided by an embodiment of the present application, when run on a computer, causes the computer to perform the data processing method described above.
Through this technical solution, the shooting angle and spatial relative position information of the terminal device can be determined from its attitude information, and three-dimensional image modeling can then be performed so that the resulting model conforms to the actual shooting scene.
Drawings
Fig. 1 is a schematic diagram of a system architecture according to an embodiment of the present application;
fig. 2 is a first schematic flowchart of a data processing method according to an embodiment of the present application;
fig. 3a is a first schematic view of a scenario provided in an embodiment of the present application;
fig. 3b is a second schematic view of a scenario provided in an embodiment of the present application;
fig. 4 is a second schematic flowchart of a data processing method according to an embodiment of the present application;
fig. 5 is a schematic structural component diagram of an MEC server provided in an embodiment of the present application;
fig. 6 is a schematic structural component diagram of a terminal device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a communication device according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of a chip of an embodiment of the present application;
fig. 9 is a schematic block diagram of a communication system according to an embodiment of the present application.
Detailed Description
Before the technical solution of the embodiments of the present invention is explained in detail, the system architecture to which the data processing method applies is briefly described. The data processing method of the embodiments of the present invention applies to services related to three-dimensional video data, such as sharing three-dimensional video data and live broadcasting based on three-dimensional video data. Because the data volume of three-dimensional video data is large, transmitting the depth data and the two-dimensional video data separately demands strong technical support; the mobile communication network must therefore provide a high data transmission rate and a stable transmission environment.
FIG. 1 is a schematic diagram of a system architecture to which the data processing method of an embodiment of the present invention applies. As shown in fig. 1, the system may include a terminal, a base station, an MEC server, a service processing server, a core network, and the Internet; high-speed channels between the MEC server and the service processing server are established through the core network to achieve data synchronization.
Taking the interaction between the two terminals shown in fig. 1 as an example, MEC server A is deployed near terminal A (the sending end), and core network A is the core network of the area where terminal A is located; correspondingly, MEC server B is deployed near terminal B (the receiving end), and core network B is the core network of the area where terminal B is located. MEC server A and MEC server B establish high-speed channels with the service processing server through core network A and core network B, respectively, to achieve data synchronization.
After the three-dimensional video data sent by terminal A reaches MEC server A, MEC server A synchronizes the data to the service processing server through core network A; MEC server B then obtains the data from the service processing server and sends it to terminal B for presentation.
If terminal B and terminal A are served by the same MEC server, the three-dimensional video data is transmitted directly through that one MEC server without involving the service processing server; this is called the local backhaul mode. For example, if terminal B and terminal A both transmit through MEC server A, then after the three-dimensional video data sent by terminal A reaches MEC server A, MEC server A sends it directly to terminal B for presentation. The sketch below illustrates this routing decision.
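A minimal Python sketch of that decision, under stated assumptions: the MecServer and Terminal classes, the in-memory service_server list, and all other names are hypothetical stand-ins for the entities in fig. 1, not an API defined by the patent.

    from dataclasses import dataclass

    @dataclass
    class MecServer:
        name: str

    @dataclass
    class Terminal:
        name: str
        mec: MecServer

    def route_frame(sender, receiver, frame, service_server):
        # Local backhaul mode: the shared MEC server forwards the frame
        # directly; the service processing server is not involved.
        if sender.mec is receiver.mec:
            return (receiver.name, frame)
        # Cross-region case: the sender's MEC server syncs the frame to the
        # service processing server over the core network; the receiver's
        # MEC server fetches it from there and delivers it.
        service_server.append((sender.name, frame))
        return (receiver.name, service_server[-1][1])

    mec_a, mec_b = MecServer("A"), MecServer("B")
    t_a, t_b = Terminal("terminal A", mec_a), Terminal("terminal B", mec_b)
    print(route_frame(t_a, t_b, b"3d-frame-0", service_server=[]))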
Here, the terminal may select an evolved NodeB (eNB) of the 4G network or a next generation NodeB (gNB) of the 5G network based on the network conditions, its own configuration, or a configured algorithm; an eNB connects to the MEC server through a Long Term Evolution (LTE) access network, and a gNB connects through a next generation radio access network (NG-RAN).
Here, the MEC server is deployed at the network edge close to the terminal or the data source, in both the logical and the geographical sense. Unlike the main service processing servers of the existing mobile communication network, which are deployed in only a few large cities, MEC servers can be deployed in many cities; for example, an office building with many users may have an MEC server deployed nearby.
The MEC server acts as an edge computing gateway with core capabilities of network convergence, computing, storage, and applications, and provides platform support spanning the device, network, data, and application domains for edge computing. It connects various intelligent devices and sensors, provides intelligent connection and data processing services nearby, allows different types of applications and data to be processed locally, realizes key intelligent services such as real-time business, data aggregation and interoperability, and security and privacy protection, and effectively improves the efficiency of intelligent business decisions.
An embodiment of the present invention provides a data processing method applied to an MEC server. As shown in fig. 2, the method includes:
step 201: receiving attitude information of the terminal device;
step 202: determining a shooting angle and spatial relative position information of the terminal device based on the attitude information of the terminal device;
step 203: performing three-dimensional image modeling based on the shooting angle of the terminal device and the spatial relative position information.
Here, the terminal device is a device provided with cameras; specifically, in this embodiment it may include a camera for collecting two-dimensional image information and a camera for collecting depth information.
In step 201, the attitude information of the terminal device may be collected by a gyroscope; the collected attitude information may be the angle information in three axial directions when the terminal device deflects and tilts. It should be understood that this three-axis angle information can be referenced to the world coordinate system.
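As a concrete illustration, the three axial angles can be assembled into a world-frame rotation matrix. The Z-Y-X (yaw-pitch-roll) Euler convention in this sketch is an assumption; the patent does not fix a convention.

    import numpy as np

    def attitude_to_rotation(yaw, pitch, roll):
        # Compose the three axial angles into a rotation matrix that maps
        # device-frame vectors into the world frame (Z-Y-X order assumed).
        cy, sy = np.cos(yaw), np.sin(yaw)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cr, sr = np.cos(roll), np.sin(roll)
        Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
        Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
        return Rz @ Ry @ Rx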
In step 202, determining the shooting angle and spatial relative position information of the terminal device based on its attitude information may include: determining the shooting angle of the terminal device for a target object according to the positional relationship between the terminal device and its acquisition component; and determining the spatial relative position information of the terminal device and the target object based on the shooting angle, the depth information corresponding to the two-dimensional image information, and the attitude information of the terminal device.
The shooting angle corresponds to the positional relationship between the terminal device and its acquisition component. Specifically, the method further includes: acquiring, for each of at least one preset shooting angle, the positional relationship between the terminal device and its acquisition component at that shooting angle.
Further, the positional relationship between the terminal device and its acquisition component can be understood as the positional relationship between the terminal device and its two-dimensional camera. The terminal device may be represented by the plane of its rear housing, and the two-dimensional camera by its central axis. For example, referring to fig. 3a, slicing the terminal device along tangent line A yields the longitudinal section shown on the right; in this section, the central axis of the acquisition component has a fixed relative position to the housing plane of the terminal device. The shooting angle of the terminal device is then determined from the preset shooting angle together with this relative relationship between the terminal device and the acquisition component.
The spatial relative position information of the terminal device and the target object is then determined based on the shooting angle, the depth information corresponding to the two-dimensional image information, and the attitude information of the terminal device. Referring to fig. 3b, this yields the relative position of the terminal device and the target object in the world coordinate system, so that the network device knows where the two stand relative to each other in that system.
Finally, three-dimensional image modeling is performed according to the shooting angle and the spatial relative position of the terminal device, combined with the two-dimensional image information and its depth information, yielding a three-dimensional image that conforms to the current shooting scene. A sketch of these two steps follows.
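A compact sketch of steps 202 and 203 under the same assumptions as the earlier snippets: pinhole intrinsics fx, fy, cx, cy; R_world_cam obtained from the attitude-derived rotation (attitude_to_rotation above) composed with the preset camera-to-housing relationship; t_world_cam from the spatial relative position. None of these symbols come from the patent itself.

    import numpy as np

    def world_points(R_world_cam, t_world_cam, depth_map, fx, fy, cx, cy):
        # Lift every depth pixel to a camera-frame point, then rotate and
        # translate it into the world frame so the model is anchored to
        # the actual shooting scene (steps 202-203, sketched).
        h, w = depth_map.shape
        v, u = np.mgrid[0:h, 0:w]
        z = depth_map
        pts_cam = np.stack([(u - cx) * z / fx, (v - cy) * z / fy, z], axis=-1)
        return pts_cam.reshape(-1, 3) @ R_world_cam.T + t_world_cam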
Thus, by adopting this scheme, the shooting angle and spatial relative position information of the terminal device can be determined from its attitude information, and three-dimensional image modeling can then be performed.
An embodiment of the present invention provides a data processing method applied to a terminal device. As shown in fig. 4, the method includes:
step 401: collecting attitude information of the terminal device;
step 402: sending the attitude information of the terminal device to the network side, so that the network side determines the shooting angle and spatial relative position information of the terminal device based on the attitude information and performs three-dimensional image modeling based on the shooting angle and the spatial relative position information.
Here, the terminal device is a device provided with cameras; specifically, in this embodiment it may include a camera for collecting two-dimensional image information and a camera for collecting depth information.
In step 401, the attitude information of the terminal device may be detected with a gyroscope. The attitude information collected by the gyroscope may be the angle information in three axial directions when the terminal device deflects and tilts; it should be understood that this three-axis angle information can be referenced to the world coordinate system.
While the foregoing scheme is executed, the method may further include:
collecting two-dimensional image information of a target object and the corresponding depth information; and
sending the two-dimensional image information and the corresponding depth information to the network side.
The collecting may be performed as follows: collecting two-dimensional images of the target object through a first camera unit to obtain N frames of two-dimensional image information, where N is an integer greater than or equal to 1; and synchronously collecting, through a second camera unit operating together with the first camera unit, the depth information corresponding to each of the N frames of two-dimensional image information.
It should be noted that the first camera unit may be a 2D camera and the second camera unit a depth camera, and that the two shoot synchronously, so the two-dimensional image information and depth information they collect correspond in the time domain. For example, if two-dimensional image 1 is collected at time 1 and depth information 1 is also collected at time 1, the two are in one-to-one correspondence; likewise, frames collected at other times can be matched one-to-one by their time information, as the sketch below shows.
It should also be noted that the terminal device may transmit the two-dimensional image information and the depth information corresponding to the body region based on the TCP protocol.
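Since TCP delivers a byte stream with no message boundaries, the sender needs some framing. The length-prefix scheme below is an illustrative choice, not a wire format specified by the patent, and the host and port in the usage sketch are placeholders.

    import socket
    import struct

    def send_frame_pair(sock, image, depth):
        # Length-prefix each payload so the MEC server can split the
        # TCP byte stream back into (image, depth) messages.
        for payload in (image, depth):
            sock.sendall(struct.pack("!I", len(payload)) + payload)

    # Usage sketch (placeholder endpoint):
    # sock = socket.create_connection(("mec.example.com", 9000))
    # send_frame_pair(sock, image_bytes, depth_bytes)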
Thus, by adopting this scheme, the attitude information of the terminal device is sent to the network side, so that the network side can determine the shooting angle and spatial relative position information of the terminal device and then perform three-dimensional image modeling.
An embodiment of the present invention provides an MEC server, as shown in fig. 5, including:
a first communication unit 51, configured to receive attitude information of a terminal device;
a first processing unit 52, configured to determine a shooting angle and spatial relative position information of the terminal device based on the attitude information, and to perform three-dimensional image modeling based on the shooting angle and the spatial relative position information.
Here, the terminal device is a device provided with cameras; specifically, in this embodiment it may include a camera for collecting two-dimensional image information and a camera for collecting depth information.
The first processing unit 52 is configured to determine the shooting angle of the terminal device for a target object according to the positional relationship between the terminal device and its acquisition component, and to determine the spatial relative position information of the terminal device and the target object based on the shooting angle, the depth information corresponding to the two-dimensional image information, and the attitude information of the terminal device.
The shooting angle corresponds to the positional relationship between the terminal device and its acquisition component; specifically, the first processing unit 52 is configured to acquire, for each of at least one preset shooting angle, the positional relationship between the terminal device and its acquisition component at that shooting angle.
Further, the positional relationship between the terminal device and its acquisition component can be understood as the positional relationship between the terminal device and its two-dimensional camera. The terminal device may be represented by the plane of its rear housing, and the two-dimensional camera by its central axis. For example, referring to fig. 3a, slicing the terminal device along tangent line A yields the longitudinal section shown on the right; in this section, the central axis of the acquisition component has a fixed relative position to the housing plane of the terminal device. The shooting angle of the terminal device is then determined from the preset shooting angle together with this relative relationship between the terminal device and the acquisition component.
The spatial relative position information of the terminal device and the target object is then determined based on the shooting angle, the depth information corresponding to the two-dimensional image information, and the attitude information of the terminal device. Referring to fig. 3b, this yields the relative position of the terminal device and the target object in the world coordinate system, so that the network device knows where the two stand relative to each other in that system.
Finally, three-dimensional image modeling is performed according to the shooting angle and the spatial relative position of the terminal device, combined with the two-dimensional image information and its depth information, yielding a three-dimensional image that conforms to the current shooting scene.
Thus, by adopting this scheme, the shooting angle and spatial relative position information of the terminal device can be determined from its attitude information, and three-dimensional image modeling can then be performed.
An embodiment of the present invention provides a terminal device, as shown in fig. 6, including:
a collecting unit 61, configured to collect attitude information of the terminal device;
a second communication unit 62, configured to send the attitude information of the terminal device to the network side.
Here, the terminal device is a device provided with cameras; specifically, in this embodiment it may include a camera for collecting two-dimensional image information and a camera for collecting depth information.
The collecting unit 61 is configured to detect the attitude information of the terminal device with a gyroscope. The attitude information collected by the gyroscope may be the angle information in three axial directions when the terminal device deflects and tilts; it should be understood that this three-axis angle information can be referenced to the world coordinate system.
While the foregoing scheme is executed:
the collecting unit 61 is further configured to collect two-dimensional image information of a target object and the corresponding depth information; and
the second communication unit 62 is further configured to send the two-dimensional image information and the corresponding depth information to the network side.
The collecting may be performed as follows: collecting two-dimensional images of the target object through a first camera unit to obtain N frames of two-dimensional image information, where N is an integer greater than or equal to 1; and synchronously collecting, through a second camera unit operating together with the first camera unit, the depth information corresponding to each of the N frames of two-dimensional image information.
It should be noted that the first camera unit may be a 2D camera and the second camera unit a depth camera, and that the two shoot synchronously, so the two-dimensional image information and depth information they collect correspond in the time domain. For example, if two-dimensional image 1 is collected at time 1 and depth information 1 is also collected at time 1, the two are in one-to-one correspondence; likewise, frames collected at other times can be matched one-to-one by their time information.
It should also be noted that the terminal device may transmit the two-dimensional image information and the depth information corresponding to the body region based on the TCP protocol.
Thus, by adopting this scheme, the attitude information of the terminal device is sent to the network side, so that the network side can determine the shooting angle and spatial relative position information of the terminal device and then perform three-dimensional image modeling.
Fig. 7 is a schematic structural diagram of a communication device 700 provided in an embodiment of the present application; the communication device shown in fig. 7 may be a terminal device or an MEC server of the embodiments. The communication device 700 includes a processor 710, which can call and run a computer program from a memory to implement the methods in the embodiments of the present application.
Optionally, as shown in fig. 7, the communication device 700 may also include a memory 720, from which the processor 710 can call and run a computer program to implement the methods in the embodiments of the present application.
The memory 720 may be a device separate from the processor 710, or may be integrated into the processor 710.
Optionally, as shown in fig. 7, the communication device 700 may further include a transceiver 730, which the processor 710 may control to communicate with other devices; specifically, it may send information or data to other devices or receive information or data sent by other devices.
The transceiver 730 may include a transmitter and a receiver, and may further include one or more antennas.
Optionally, the communication device 700 may specifically be an MEC server in the embodiment of the present application, and the communication device 700 may implement a corresponding process implemented by the MEC server in each method in the embodiment of the present application, and for brevity, no further description is given here.
Optionally, the communication device 700 may specifically be a mobile terminal device/terminal device according to this embodiment, and the communication device 700 may implement a corresponding process implemented by the mobile terminal device/terminal device in each method according to this embodiment, which is not described herein again for brevity.
Fig. 8 is a schematic structural diagram of a chip of an embodiment of the present application. The chip 800 shown in fig. 8 includes a processor 810, and the processor 810 can call and run a computer program from a memory to implement the method in the embodiment of the present application.
Optionally, as shown in fig. 8, chip 800 may further include a memory 820. From the memory 820, the processor 810 can call and run a computer program to implement the method in the embodiment of the present application.
The memory 820 may be a separate device from the processor 810 or may be integrated into the processor 810.
Optionally, the chip 800 may further include an input interface 830. The processor 810 may control the input interface 830 to communicate with other devices or chips, and specifically, may obtain information or data transmitted by other devices or chips.
Optionally, the chip 800 may further include an output interface 840. The processor 810 can control the output interface 840 to communicate with other devices or chips, and in particular, can output information or data to other devices or chips.
Optionally, the chip may be applied to the MEC server in the embodiment of the present application, and the chip may implement a corresponding process implemented by the MEC server in each method in the embodiment of the present application, and for brevity, details are not described here again.
Optionally, the chip may be applied to the mobile terminal device/terminal device in the embodiment of the present application, and the chip may implement the corresponding process implemented by the mobile terminal device/terminal device in each method in the embodiment of the present application, and for brevity, no further description is given here.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as a system-on-chip, a system-on-chip or a system-on-chip, etc.
Fig. 9 is a schematic block diagram of a communication system 900 provided in an embodiment of the present application. As shown in fig. 9, the communication system 900 includes a terminal apparatus 910 and an MEC server 920.
The terminal device 910 may be configured to implement the corresponding function implemented by the terminal device in the foregoing method, and the MEC server 920 may be configured to implement the corresponding function implemented by the MEC server in the foregoing method for brevity, which is not described herein again.
It should be understood that the processor of the embodiments of the present application may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method embodiments may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The processor may be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and the various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed accordingly. A general purpose processor may be a microprocessor or any conventional processor. The steps of the methods disclosed in the embodiments of the present application may be executed directly by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or a register. The storage medium is located in a memory, and the processor reads the information in the memory and completes the steps of the method in combination with its hardware.
It will be appreciated that the memory in the embodiments of the present application can be volatile memory or non-volatile memory, or can include both. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be Random Access Memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and Direct Rambus RAM (DR RAM).
It should be understood that these memories are exemplary rather than limiting; the memory of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
The embodiment of the application also provides a computer readable storage medium for storing the computer program.
Optionally, the computer-readable storage medium may be applied to the MEC server in the embodiment of the present application, and the computer program enables the computer to execute the corresponding process implemented by the MEC server in each method in the embodiment of the present application, which is not described herein again for brevity.
Optionally, the computer-readable storage medium may be applied to the mobile terminal device/terminal device in the embodiment of the present application, and the computer program enables the computer to execute the corresponding process implemented by the mobile terminal device/terminal device in the methods in the embodiments of the present application, which is not described herein again for brevity.
Embodiments of the present application also provide a computer program product comprising computer program instructions.
Optionally, the computer program product may be applied to the MEC server in the embodiment of the present application, and the computer program instructions enable the computer to execute the corresponding processes implemented by the MEC server in the methods in the embodiment of the present application, which are not described herein again for brevity.
Optionally, the computer program product may be applied to the mobile terminal device/terminal device in the embodiment of the present application, and the computer program instructions enable the computer to execute the corresponding processes implemented by the mobile terminal device/terminal device in the methods in the embodiments of the present application, which are not described herein again for brevity.
The embodiment of the application also provides a computer program.
Optionally, the computer program may be applied to the MEC server in the embodiment of the present application, and when the computer program runs on a computer, the computer is enabled to execute the corresponding process implemented by the MEC server in each method in the embodiment of the present application, and details are not described herein for brevity.
Optionally, the computer program may be applied to the mobile terminal device/terminal device in the embodiment of the present application, and when the computer program runs on a computer, the computer executes a corresponding process implemented by the mobile terminal device/terminal device in each method in the embodiment of the present application, which is not described herein again for brevity.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or an MEC server, etc.) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (15)

1. A data processing method, applied to an MEC server and comprising:
receiving attitude information of a terminal device, the attitude information comprising angle information in three axial directions when the terminal device deflects and tilts;
determining a shooting angle of the terminal device based on the attitude information of the terminal device;
receiving two-dimensional image information for a target object sent by the terminal device and depth information corresponding to the two-dimensional image information, the depth information being collected by the terminal device through a camera for collecting depth information;
determining spatial relative position information of the terminal device and the target object based on the shooting angle, the depth information corresponding to the two-dimensional image information, and the attitude information of the terminal device; and
performing three-dimensional image modeling based on the shooting angle of the terminal device and the spatial relative position information.
2. The method according to claim 1, wherein determining the shooting angle of the terminal device based on the attitude information of the terminal device comprises:
determining the shooting angle of the terminal device for the target object according to a positional relationship between the terminal device and an acquisition component thereof.
3. The method according to claim 2, wherein before determining the shooting angle of the terminal device for the target object according to the positional relationship between the terminal device and the acquisition component thereof, the method further comprises:
acquiring, for each of at least one preset shooting angle, the positional relationship between the terminal device and the acquisition component thereof at that shooting angle.
4. A data processing method, applied to a terminal device and comprising:
collecting attitude information of the terminal device, the attitude information comprising angle information in three axial directions when the terminal device deflects and tilts;
sending the attitude information of the terminal device to a network side;
collecting two-dimensional image information of a target object and depth information corresponding to the two-dimensional image information, the depth information being collected by the terminal device through a camera for collecting depth information; and
sending the two-dimensional image information and the corresponding depth information to the network side, wherein the network side determines a shooting angle of the terminal device based on the attitude information of the terminal device, determines spatial relative position information of the terminal device and the target object based on the shooting angle, the depth information corresponding to the two-dimensional image information, and the attitude information of the terminal device, and performs three-dimensional image modeling based on the shooting angle of the terminal device and the spatial relative position information.
5. The method of claim 4, wherein collecting the attitude information of the terminal device comprises:
detecting the attitude information of the terminal device with a gyroscope.
6. An MEC server, comprising:
a first communication unit, configured to receive attitude information of a terminal device and to receive two-dimensional image information for a target object sent by the terminal device together with depth information corresponding to the two-dimensional image information, wherein the attitude information comprises angle information in three axial directions when the terminal device deflects and tilts, and the depth information is collected by the terminal device through a camera for collecting depth information; and
a first processing unit, configured to determine a shooting angle of the terminal device based on the attitude information of the terminal device, determine spatial relative position information of the terminal device and the target object based on the shooting angle, the depth information corresponding to the two-dimensional image information, and the attitude information of the terminal device, and perform three-dimensional image modeling based on the shooting angle of the terminal device and the spatial relative position information.
7. The MEC server of claim 6, wherein the first processing unit is configured to determine the shooting angle of the terminal device for the target object according to a positional relationship between the terminal device and an acquisition component thereof.
8. The MEC server of claim 7, wherein the first processing unit is configured to acquire, for each of at least one preset shooting angle, the positional relationship between the terminal device and the acquisition component thereof at that shooting angle.
9. A terminal device, comprising:
a collecting unit, configured to collect attitude information of the terminal device and to collect two-dimensional image information of a target object together with depth information corresponding to the two-dimensional image information, wherein the attitude information comprises angle information in three axial directions when the terminal device deflects and tilts, and the collecting unit collects the depth information through a camera for collecting depth information; and
a second communication unit, configured to send the attitude information of the terminal device to a network side and to send the two-dimensional image information and the corresponding depth information to the network side, wherein the network side determines a shooting angle of the terminal device based on the attitude information of the terminal device, determines spatial relative position information of the terminal device and the target object based on the shooting angle, the depth information corresponding to the two-dimensional image information, and the attitude information of the terminal device, and performs three-dimensional image modeling based on the shooting angle of the terminal device and the spatial relative position information.
10. The terminal device according to claim 9, wherein the collecting unit is configured to detect the attitude information of the terminal device with a gyroscope.
11. An MEC server, comprising: a processor and a memory for storing a computer program, the processor being configured to invoke and execute the computer program stored in the memory to perform the method of any of claims 1 to 3.
12. A terminal device, comprising: a processor and a memory for storing a computer program, the processor being configured to invoke and execute the computer program stored in the memory to perform the method of any of claims 4 to 5.
13. A chip, comprising: a processor for calling and running a computer program from a memory so that a device on which the chip is installed performs the method of any one of claims 1 to 3.
14. A chip, comprising: a processor for calling and running a computer program from a memory so that a device on which the chip is installed performs the method of any one of claims 4 to 5.
15. A computer-readable storage medium storing a computer program for causing a computer to perform the method of any one of claims 1 to 5.
CN201811162079.2A 2018-09-30 2018-09-30 Data processing method, MEC server and terminal equipment Active CN109413409B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201811162079.2A CN109413409B (en) 2018-09-30 2018-09-30 Data processing method, MEC server and terminal equipment
PCT/CN2019/090328 WO2020062919A1 (en) 2018-09-30 2019-06-06 Data processing method, mec server and terminal device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811162079.2A CN109413409B (en) 2018-09-30 2018-09-30 Data processing method, MEC server and terminal equipment

Publications (2)

Publication Number Publication Date
CN109413409A CN109413409A (en) 2019-03-01
CN109413409B (en) 2020-12-22

Family

ID=65466690

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811162079.2A Active CN109413409B (en) 2018-09-30 2018-09-30 Data processing method, MEC server and terminal equipment

Country Status (2)

Country Link
CN (1) CN109413409B (en)
WO (1) WO2020062919A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109413409B (en) * 2018-09-30 2020-12-22 Oppo广东移动通信有限公司 Data processing method, MEC server and terminal equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103047969A (en) * 2012-12-07 2013-04-17 北京百度网讯科技有限公司 Method for generating three-dimensional image through mobile terminal and mobile terminal
CN108230437A (en) * 2017-12-15 2018-06-29 深圳市商汤科技有限公司 Scene reconstruction method and device, electronic equipment, program and medium

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104641399B (en) * 2012-02-23 2018-11-23 查尔斯·D·休斯顿 System and method for creating environment and for location-based experience in shared environment
CN103517061B (en) * 2013-09-03 2015-09-23 展讯通信(上海)有限公司 A kind of display control method of terminal equipment and device
CN106296801B (en) * 2015-06-12 2019-11-26 联想(北京)有限公司 A kind of method that establishing object three-dimensional image model and electronic equipment
US10291845B2 (en) * 2015-08-17 2019-05-14 Nokia Technologies Oy Method, apparatus, and computer program product for personalized depth of field omnidirectional video
US20180253894A1 (en) * 2015-11-04 2018-09-06 Intel Corporation Hybrid foreground-background technique for 3d model reconstruction of dynamic scenes
CN105898271A (en) * 2015-12-28 2016-08-24 乐视致新电子科技(天津)有限公司 360-degree panoramic video playing method, playing module and mobile terminal
CN105787988B (en) * 2016-03-21 2021-04-13 联想(北京)有限公司 Information processing method, server and terminal equipment
CN109076253A (en) * 2016-04-28 2018-12-21 索尼公司 Information processing unit and information processing method and three-dimensional image data transmitting method
CN106023302B (en) * 2016-05-06 2020-06-09 武汉雄楚高晶科技有限公司 Mobile communication terminal, server and method for realizing three-dimensional reconstruction
CN106204630B (en) * 2016-08-19 2019-03-12 浙江宇视科技有限公司 A kind of method and device configuring video camera
CN108053435A (en) * 2017-11-29 2018-05-18 深圳奥比中光科技有限公司 Dynamic realtime three-dimensional rebuilding method and system based on handheld mobile device
CN108182726A (en) * 2017-12-29 2018-06-19 努比亚技术有限公司 Three-dimensional rebuilding method, cloud server and computer readable storage medium
CN108495112B (en) * 2018-05-10 2020-12-22 Oppo广东移动通信有限公司 Data transmission method, terminal and computer storage medium
CN109413409B (en) * 2018-09-30 2020-12-22 Oppo广东移动通信有限公司 Data processing method, MEC server and terminal equipment
CN109272576B (en) * 2018-09-30 2023-03-24 Oppo广东移动通信有限公司 Data processing method, MEC server, terminal equipment and device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103047969A (en) * 2012-12-07 2013-04-17 北京百度网讯科技有限公司 Method for generating three-dimensional image through mobile terminal and mobile terminal
CN108230437A (en) * 2017-12-15 2018-06-29 深圳市商汤科技有限公司 Scene reconstruction method and device, electronic equipment, program and medium

Also Published As

Publication number Publication date
CN109413409A (en) 2019-03-01
WO2020062919A1 (en) 2020-04-02

Similar Documents

Publication Publication Date Title
US11475536B2 (en) Context-aware synthesis for video frame interpolation
US8810632B2 (en) Apparatus and method for generating a three-dimensional image using a collaborative photography group
CN109272576B (en) Data processing method, MEC server, terminal equipment and device
CN111327758B (en) Camera sharing method and device
US11044457B2 (en) Method for processing data, server and computer storage medium
US9052866B2 (en) Method, apparatus and computer-readable medium for image registration and display
WO2019091191A1 (en) Data processing method and apparatus
CN109495733B (en) Three-dimensional image reconstruction method, device and non-transitory computer readable storage medium thereof
CN109413409B (en) Data processing method, MEC server and terminal equipment
WO2020063170A1 (en) Data processing method, terminal, server and storage medium
JP2017229067A (en) Method and apparatus for creating pair of stereoscopic images using at least one lightfield camera
KR102319538B1 (en) Method and apparatus for transmitting image data, and method and apparatus for generating 3dimension image
CN109246408B (en) Data processing method, terminal, server and computer storage medium
CN109151430B (en) Data processing method, terminal, server and computer storage medium
CN109413405B (en) Data processing method, terminal, server and computer storage medium
CN108632376B (en) Data processing method, terminal, server and computer storage medium
CN109120912B (en) Data processing method, MEC server, terminal equipment and device
CN109147043B (en) Data processing method, server and computer storage medium
CN109345623B (en) Model verification method, server and computer storage medium
CN109246409B (en) Data processing method, terminal, server and computer storage medium
CN109151435B (en) Data processing method, terminal, server and computer storage medium
KR20170088623A (en) Method for generating multi-view image by using a plurality of mobile terminal
CN109299323B (en) Data processing method, terminal, server and computer storage medium
WO2021200226A1 (en) Information processing device, information processing method, and program
CN109325997B (en) Model checking method, server and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant