WO2020063168A1 - Data processing method, terminal, server and computer storage medium - Google Patents

Data processing method, terminal, server and computer storage medium

Info

Publication number
WO2020063168A1
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
instruction information
model
image data
data
Prior art date
Application number
PCT/CN2019/100632
Other languages
English (en)
Chinese (zh)
Inventor
夏炀
李虎
Original Assignee
Oppo广东移动通信有限公司
Priority date
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司
Publication of WO2020063168A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects

Description

  • This application relates to data processing technology, and in particular, to a data processing method, terminal, server, and computer storage medium.
  • The three-dimensional video data includes two-dimensional image data (for example, RGB data) and depth data.
  • Two-dimensional video data and depth data therefore need to be obtained separately.
  • The amount of data collected for three-dimensional video data is very large, so the amount of data to be transmitted is also very large, and high technical support is required during the data transmission process. The mobile communication network therefore needs a faster data transmission rate and a more stable data transmission environment.
  • MEC: Mobile Edge Computing.
  • embodiments of the present application provide a data processing method, a terminal, a server, and a computer storage medium.
  • An embodiment of the present application provides a data processing method applied to a MEC server.
  • the method includes: sending instruction information to a terminal, where the instruction information is used to instruct the terminal to transmit image data that meets a preset condition; receiving image data from the terminal, and selecting a first model from a preset model set based on the image data, where the preset model set includes a plurality of three-dimensional models of non-critical parts other than key parts; and establishing, according to the first model and a second model, a model including three-dimensional video data of the non-critical parts and the key parts, where the second model is established from the received three-dimensional video data of the key parts.
  • the sending instruction information to the terminal includes: sending first instruction information to the terminal according to a model in the model set; the first instruction information is used to instruct the terminal to collect image data for distinguishing non-critical parts corresponding to the model.
  • the sending the first instruction information to the terminal according to a model in the model set includes: sending the first instruction information to the terminal according to a type of the model in the model set and/or a characteristic parameter corresponding to the model.
  • the sending instruction information to the terminal includes: sending second instruction information to the terminal; the second instruction information is used to instruct the terminal to select, from the collected three-dimensional video data, image data that can distinguish non-critical parts.
  • An embodiment of the present application further provides a data processing method, which is applied to a terminal; the method includes: receiving instruction information from a MEC server, where the instruction information is used to instruct the terminal to transmit image data that meets a preset condition; and acquiring image data, and sending image data that satisfies the preset condition to the MEC server based on the instruction information.
  • the receiving the instruction information from the MEC server includes: receiving first instruction information from the MEC server; the first instruction information is used to instruct the terminal to collect image data used to distinguish non-critical parts corresponding to a model.
  • Before the sending image data to the MEC server based on the instruction information, the method further includes: obtaining image data based on the first instruction information, where the image data can distinguish the non-critical parts corresponding to the model.
  • the receiving the instruction information from the MEC server includes: receiving second instruction information from the MEC server; the second instruction information is used to instruct the terminal to select image data capable of distinguishing non-critical parts from the collected three-dimensional video data.
  • Before the sending image data to the MEC server based on the instruction information, the method further includes: obtaining three-dimensional video data, and selecting, according to the second instruction information, image data capable of distinguishing non-critical parts from the obtained three-dimensional video data.
  • the method further includes: obtaining three-dimensional video data; extracting first data corresponding to key parts from the three-dimensional video data; and sending the first data to a MEC server.
  • An embodiment of the present application further provides a MEC server.
  • the server includes: a first communication unit, a selection unit, and a modeling unit, where:
  • the first communication unit is configured to send instruction information to a terminal, where the instruction information is used to instruct the terminal to transmit image data that meets a preset condition; and is further used to receive image data from the terminal;
  • the selection unit is configured to select a first model from a preset model set based on the image data received by the first communication unit; the preset model set includes a plurality of three-dimensional models of non-critical parts other than key parts;
  • the modeling unit is configured to establish a model including the three-dimensional video data of the non-critical part and the key part according to the first model and a second model; the second model is established from the received three-dimensional video data of the key part.
  • the first communication unit is configured to send first instruction information to the terminal according to a model in the model set; the first instruction information is used to instruct the terminal to collect image data used to distinguish non-critical parts corresponding to the model.
  • the first communication unit is configured to send the first indication information to the terminal according to a type of the model in the model set and / or a characteristic parameter corresponding to the model.
  • the first communication unit is configured to send second instruction information to the terminal; the second instruction information is used to instruct the terminal to select, from the collected three-dimensional video data, image data that can distinguish non-critical parts.
  • An embodiment of the present application further provides a terminal, where the terminal includes: a second communication unit and an obtaining unit;
  • the second communication unit is configured to receive instruction information from a MEC server; the instruction information is used to instruct the terminal to transmit image data that meets a preset condition;
  • the obtaining unit is configured to obtain image data
  • the second communication unit is further configured to send image data that satisfies the preset condition to the MEC server based on the instruction information.
  • the second communication unit is configured to receive first instruction information from the MEC server; the first instruction information is used to instruct the terminal to collect image data for distinguishing non-critical parts corresponding to a model;
  • the obtaining unit is configured to obtain image data based on the first instruction information, and the image data can distinguish non-critical parts corresponding to a model.
  • the second communication unit is configured to receive second instruction information from the MEC server; the second instruction information is used to instruct the terminal to select, from the collected three-dimensional video data, image data that can distinguish non-critical parts;
  • the acquiring unit is configured to acquire image data, and select image data capable of distinguishing non-critical parts from the obtained three-dimensional video data according to the second instruction information.
  • the obtaining unit is configured to obtain three-dimensional video data
  • the second communication unit is configured to extract first data corresponding to key parts from the three-dimensional video data, and send the first data to a MEC server.
  • An embodiment of the present application further provides a computer storage medium having computer instructions stored thereon; when executed by a processor, the instructions implement the steps of the data processing method applied to the MEC server according to the embodiments of the present application, or implement the steps of the data processing method applied to a terminal according to the embodiments of the present application.
  • An embodiment of the present application further provides a MEC server including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the steps of the data processing method applied to the MEC server described in the embodiments of the present application are implemented.
  • An embodiment of the present application further provides a terminal including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the steps of the data processing method applied to the terminal described in the embodiments of the present application are implemented.
  • An embodiment of the present application further provides a chip, including: a processor, configured to call and run a computer program from a memory, so that a device installed with the chip executes the steps of the data processing method applied to the MEC server according to the embodiments of the present application, or the steps of the data processing method applied to a terminal described in the embodiments of the present application.
  • An embodiment of the present application further provides a computer program product including computer program instructions, which cause a computer to execute the steps of the data processing method applied to the MEC server according to the embodiments of the present application, or cause the computer to execute the steps of the data processing method applied to the terminal according to the embodiments of the present application.
  • An embodiment of the present application further provides a computer program, which causes a computer to execute the steps of the data processing method applied to the MEC server according to the embodiments of the present application, or causes the computer to execute the steps of the data processing method applied to a terminal.
  • In the data processing method, terminal, server, and computer storage medium provided in the embodiments of the present application:
  • the method corresponding to the server includes: sending instruction information to the terminal, where the instruction information is used to instruct the terminal to transmit image data that meets a preset condition; receiving image data from the terminal and selecting a first model from a preset model set based on the image data; and establishing, according to the first model and a second model, a model including three-dimensional video data of the non-critical parts and the key parts.
  • the method corresponding to the terminal includes: receiving instruction information from the MEC server, where the instruction information is used to instruct the terminal to transmit image data satisfying a preset condition; and acquiring image data, and sending image data that satisfies the preset condition to the MEC server based on the instruction information.
  • In this way, the server sends instruction information to the terminal to instruct the terminal to transmit image data that meets the preset condition, so that the server can select a suitable model. This greatly improves the accuracy of the selected preset model and, to a certain extent, also greatly shortens the modeling time of the complete model.
  • FIG. 1 is a schematic diagram of a system architecture applied to a data processing method according to an embodiment of the present application
  • FIG. 2 is a first schematic flowchart of a data processing method according to an embodiment of the present application
  • FIG. 3 is a second schematic flowchart of a data processing method according to an embodiment of the present application.
  • FIG. 4 is a third flowchart of a data processing method according to an embodiment of the present application.
  • FIG. 5 is a fourth schematic flowchart of a data processing method according to an embodiment of the present application.
  • FIG. 6 is a schematic structural diagram of a terminal according to an embodiment of the present application.
  • FIG. 7 is a schematic structural diagram of a server according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a hardware composition and structure of a data processing device according to an embodiment of the present application.
  • the data processing method in the embodiment of the present application is applied to a service related to three-dimensional video data.
  • the service is, for example, a service for sharing three-dimensional video data, or a live broadcast service based on three-dimensional video data.
  • the transmitted depth data and two-dimensional video data require higher technical support in the data transmission process, so the mobile communication network needs a faster data transmission rate and a more stable data transmission environment.
  • FIG. 1 is a schematic diagram of a system architecture applied to a data transmission method according to an embodiment of the present application.
  • the system may include a terminal, a base station, a MEC server, a service processing server, a core network, and the Internet.
  • high-speed channels are established between the MEC server and the service processing server through the core network to achieve data synchronization.
  • MEC server A is a MEC server deployed near terminal A (the sending end), and core network A is the core network in the area where terminal A is located; MEC server B is a MEC server deployed near terminal B (the receiving end), and core network B is the core network in the area where terminal B is located. MEC server A and MEC server B can establish high-speed channels with the service processing server through core network A and core network B, respectively, to achieve data synchronization.
  • MEC server A synchronizes the data to the service processing server through core network A, and MEC server B obtains the three-dimensional video data sent by terminal A from the service processing server and sends it to terminal B for presentation.
  • If terminal B and terminal A use the same MEC server for transmission, terminal B and terminal A directly implement three-dimensional video data transmission through one MEC server without the participation of the service processing server.
  • This method is called the local return mode. Specifically, it is assumed that terminal B and terminal A realize the transmission of three-dimensional video data through MEC server A: after the three-dimensional video data sent by terminal A is transmitted to MEC server A, the three-dimensional video data is sent by MEC server A to terminal B for presentation.
  • the terminal may select an evolved base station (eNB) that accesses a 4G network or a next-generation evolved base station (gNB) that accesses a 5G network based on the network situation, or the configuration of the terminal itself, or an algorithm configured by itself.
  • the eNB is connected to the MEC server through a Long Term Evolution (LTE) access network, and the gNB is connected to the MEC server through a next generation radio access network (NG-RAN).
  • the MEC server is deployed on the edge of the network near the terminal or the source of the data.
  • Here, "near the terminal or the source of the data" means not only logically close but also geographically close to the terminal or the source of the data.
  • multiple MEC servers can be deployed in one city. For example, in an office building with many users, a MEC server can be deployed near the office building.
  • As an edge computing gateway with converged network, computing, storage, and application capabilities at its core, the MEC server provides platform support for edge computing across the device domain, network domain, data domain, and application domain. It connects various types of smart devices and sensors, and provides smart connection and data processing services nearby, allowing different types of applications and data to be processed in the MEC server. This realizes key intelligent services such as real-time business processing, business intelligence, data aggregation and interoperation, and security and privacy protection, and effectively improves the intelligent decision-making efficiency of the business.
  • FIG. 2 is a first schematic flowchart of a data processing method according to an embodiment of the present application; as shown in FIG. 2, the method includes:
  • Step 101 Send instruction information to the terminal, where the instruction information is used to instruct the terminal to transmit image data that meets a preset condition.
  • Step 102 Receive image data from the terminal, and select a first model from a preset model set based on the image data; the preset model set includes a plurality of three-dimensional models of non-critical parts other than key parts.
  • Step 103 Establish a model including three-dimensional video data of the non-critical part and the key part according to the first model and a second model; the second model is established from the received three-dimensional video data of the key part.
  • a model set is preset in the server, and the model set includes a three-dimensional model of a non-critical part; the non-critical part is a part other than the key part.
  • a key part is predefined in the terminal, and the terminal extracts 3D video data of a predefined key part by performing image recognition processing on the 3D video data;
  • a non-key part is predefined in the server, and the server models the predefined non-key parts in advance to obtain multiple 3D models corresponding to the non-key parts and generates a model set based on the multiple 3D models.
  • the three-dimensional video data of the key part is data associated with a transmission algorithm.
  • the stability of the three-dimensional video data of the key part usually does not meet a preset stability requirement. It can be understood that the three-dimensional video data of the key part is prone to errors during transmission using this transmission algorithm; that is, the three-dimensional video data of the key part is sensitive data with poor stability.
  • the face of a person is more three-dimensional relative to the body, so the probability of an error in the face data is higher, and there is also a high probability of errors in the nose data and the eye data. Therefore, at least one of a face part, a nose part, an eye part, and a mouth part may be defined as a key part. Key parts in the three-dimensional video data are then identified through image recognition processing.
  • the two-dimensional video data and depth data in the three-dimensional video data are obtained separately; the key parts in the two-dimensional video data can be identified through image recognition processing, and the depth data corresponding to the key parts is determined based on the positions of the key parts in the two-dimensional video data.
  • As one implementation, the depth data corresponding to the key parts can be sent as the three-dimensional video data, so that the MEC server can model based on the three-dimensional video data representing the depth data; as another implementation, the two-dimensional video data and depth data corresponding to the key parts are used as the three-dimensional video data and sent, so that the MEC server can model the key parts based on the depth data and can also perform color filling based on the two-dimensional video data.
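  • The following Python sketch (not part of the published method) illustrates one way this separation could be organized; the key-part detector and the assumption that the RGB and depth frames are pixel-aligned are illustrative assumptions.

```python
import numpy as np

def detect_key_part_boxes(rgb_frame: np.ndarray):
    """Hypothetical image-recognition step: return bounding boxes (x, y, w, h)
    for the predefined key parts (e.g. face, nose, eyes, mouth) in a 2D frame."""
    raise NotImplementedError("stand-in for the terminal's recognition model")

def extract_key_part_data(rgb_frame: np.ndarray, depth_frame: np.ndarray):
    """Collect the depth (and optionally RGB) samples corresponding to the key
    parts found in the 2D image, assuming both frames are pixel-aligned."""
    key_regions = []
    for (x, y, w, h) in detect_key_part_boxes(rgb_frame):
        key_regions.append({
            "box": (x, y, w, h),
            "depth": depth_frame[y:y + h, x:x + w],  # data the MEC server models from
            "rgb": rgb_frame[y:y + h, x:x + w],      # optional, for color filling
        })
    return key_regions
```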
  • the method further includes: receiving three-dimensional video data from a terminal, where the three-dimensional video data is data corresponding to a key part; and establishing a second model corresponding to the key part based on the three-dimensional video data.
  • the MEC server may pre-configure feature parameters of models corresponding to a plurality of key parts; then, when determining whether the second model is completed, the feature parameters of each pre-configured key-part model may be compared with the second model to determine whether they match; if the matching rate between the feature parameters of a pre-configured model and the second model exceeds a preset threshold, it may indicate that the establishment of the second model is complete.
  • the feature parameters of the pre-configured key part model may specifically be contour feature points and / or bone key points. Further, after the establishment of the second model is completed, a first model corresponding to a non-critical part needs to be selected.
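  • A minimal Python sketch of this completeness check, assuming the feature parameters are sets of 3D contour/bone key points and that matching is a simple nearest-point distance test; the tolerance, threshold, and data layout are illustrative assumptions rather than part of the described method.

```python
import numpy as np

def matching_rate(reference_points: np.ndarray, model_points: np.ndarray, tol: float = 0.05) -> float:
    """Fraction of pre-configured feature points (contour and/or bone key points)
    that have a counterpart in the second model within the tolerance."""
    hits = sum(1 for p in reference_points
               if np.min(np.linalg.norm(model_points - p, axis=1)) <= tol)
    return hits / len(reference_points)

def second_model_complete(preconfigured: dict, second_model_points: np.ndarray,
                          threshold: float = 0.9) -> bool:
    """The second model is considered complete if the matching rate of any
    pre-configured key-part model with it exceeds the preset threshold."""
    return any(matching_rate(ref, second_model_points) >= threshold
               for ref in preconfigured.values())
```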
  • the server sends instruction information to the terminal, where the instruction information is used to instruct the terminal to transmit image data that meets a preset condition, so that the server selects an adapted first model based on the image data.
  • the sending the instruction information to the terminal includes: sending the first instruction information to the terminal according to a model in the model set; the first instruction information is used to instruct the terminal to collect image data for distinguishing non-critical parts corresponding to the model.
  • the server sends the first instruction information to the terminal based on the model in the model set, and the first instruction information is used to instruct the terminal to collect image data capable of distinguishing non-critical parts corresponding to the model; that is, in this embodiment the server informs the terminal of what image data to collect.
  • the sending the first indication information to the terminal according to a model in the model set includes: sending the first indication information to the terminal according to a type of the model in the model set and/or a characteristic parameter corresponding to the model.
  • the type of the model indicates the type of non-critical parts
  • the characteristic parameters corresponding to the model include parameters related to the contour points and / or bone points of the non-critical parts.
  • the server may generate the first indication information based on the type of the non-critical part and/or the characteristic parameters of the non-critical part; the first indication information may include the type of the non-critical part and/or the characteristic parameters of the non-critical part.
  • the type of the non-critical part may specifically be an arm type, a hand type, a leg type, a foot type, a torso type, etc.; of course, it may also include a background type (such as a city street type, an indoor type, a mountain type, etc.). Therefore, the terminal collects image data according to the type of the non-critical part and/or the characteristic parameters of the non-critical part in the first instruction information.
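  • As a rough illustration only, the first indication information could be organized as a small structured message such as the one below; the field names and example values are assumptions made for the sketch, not a format defined by this application.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class FirstIndication:
    """Illustrative payload: tells the terminal which non-critical part types to
    capture and, optionally, which contour/bone feature parameters the captured
    image data should make distinguishable."""
    part_types: List[str]                                    # e.g. ["arm", "torso", "city_street"]
    contour_points: Optional[List[Tuple[float, float]]] = None
    bone_points: Optional[List[Tuple[float, float]]] = None

# Example: the server asks the terminal for images that distinguish an arm and the background.
indication = FirstIndication(part_types=["arm", "city_street"])
```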
  • the image data is image data that cannot be used for modeling.
  • the image data may be two-dimensional image data, such as RGB data.
  • the sending the instruction information to the terminal includes: sending the second instruction information to the terminal; the second instruction information is used to instruct the terminal to select, from the collected three-dimensional video data, image data that can distinguish non-critical parts.
  • the server sends second instruction information to the terminal, and the second instruction information is used to instruct the terminal to select image data capable of distinguishing non-critical parts from the collected three-dimensional video data; that is, in this embodiment the server does not instruct the terminal on what image data to collect, but instructs it to select, from the already collected three-dimensional video data, image data that can distinguish non-critical parts.
  • the image data is image data that cannot be used for modeling.
  • the image data may be two-dimensional image data, such as RGB data.
  • the sending the instruction information to the terminal includes: sending the instruction information through handshake signaling.
  • a handshake process is required, which specifically includes: the terminal sends a first data packet to the server to establish a communication connection with the server; after the server receives the first data packet, it sends a first acknowledgement data packet to the terminal; after receiving the first acknowledgement data packet from the server, the terminal sends a second acknowledgement data packet to the server. At this point, the terminal can transmit data to the server.
  • the indication information described in the embodiment of the present application may be carried in the first acknowledgement data packet and sent to the terminal; as another example, the indication information may also be sent to the terminal through an independent data packet.
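  • The sketch below illustrates this three-packet handshake with the indication information piggybacked on the first acknowledgement; the packet structure and field names are assumptions made for the example and do not describe an actual signaling format.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Packet:
    kind: str                        # "FIRST", "ACK1", "ACK2", "DATA"
    payload: Optional[bytes] = None  # indication information may ride in ACK1

def server_on_first_packet(first_packet: Packet, indication: bytes) -> Packet:
    """On receiving the terminal's first data packet, reply with the first
    acknowledgement and piggyback the indication information on it."""
    assert first_packet.kind == "FIRST"
    return Packet(kind="ACK1", payload=indication)

def terminal_on_ack1(ack1: Packet) -> Tuple[Packet, Optional[bytes]]:
    """On receiving the first acknowledgement, send the second acknowledgement;
    after this the terminal may start transmitting image data."""
    assert ack1.kind == "ACK1"
    return Packet(kind="ACK2"), ack1.payload  # payload carries the indication, if any
```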
  • the server receives the image data from the terminal, and since the image data can distinguish non-critical parts, the server may select a first model from the preset model set based on the type and/or feature parameters of the non-critical parts in the image data; for example, the server selects, from the preset model set, a first model consistent with the type of the non-critical parts in the image data, and/or selects, from the preset model set, a first model whose similarity with the characteristic parameters of the non-critical parts in the image data reaches a preset threshold.
  • For example, if the non-critical part in the image data is an arm, multiple models of the arm type may first be determined in the preset model set, and the first model with the highest similarity to the contour features of the arm in the image data is then selected from among the determined models.
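  • A minimal sketch of such a selection step, assuming each preset model carries a type label and a contour-feature vector; the similarity measure and the threshold value are illustrative assumptions.

```python
import numpy as np

def contour_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Illustrative similarity between two contour-feature vectors (higher is more similar)."""
    return 1.0 / (1.0 + float(np.linalg.norm(a - b)))

def select_first_model(model_set: list, part_type: str,
                       contour_features: np.ndarray, threshold: float = 0.8):
    """Keep models whose type matches the non-critical part in the image data,
    then return the candidate most similar to the observed contour features,
    provided the similarity reaches the preset threshold."""
    candidates = [m for m in model_set if m["type"] == part_type]
    if not candidates:
        return None
    best = max(candidates, key=lambda m: contour_similarity(m["contour"], contour_features))
    return best if contour_similarity(best["contour"], contour_features) >= threshold else None
```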
  • a complete model is further established based on the first model of the non-critical part and the second model of the key part.
  • In this way, the server sends instruction information to the terminal to instruct the terminal to transmit image data that meets the preset condition, so that the server can select a suitable model. This greatly improves the accuracy of the selected preset model and, to a certain extent, also greatly shortens the modeling time of the complete model.
  • FIG. 3 is a second schematic flowchart of a data processing method according to an embodiment of the present application; as shown in FIG. 3, the method includes:
  • Step 201 Receive instruction information from a MEC server; the instruction information is used to instruct the terminal to transmit image data that meets a preset condition.
  • Step 202 Acquire image data, and send image data that satisfies the preset condition to the MEC server based on the instruction information.
  • a key part is predefined in the terminal, and the terminal extracts 3D video data of a predefined key part by performing image recognition processing on the 3D video data;
  • a non-key part is predefined in the server, and the server models the predefined non-key parts in advance to obtain multiple 3D models corresponding to the non-key parts and generates a model set based on the multiple 3D models.
  • the three-dimensional video data of the key part is data associated with a transmission algorithm.
  • the stability of the three-dimensional video data of the key part usually does not meet a preset stability requirement. It can be understood that the three-dimensional video data of the key part is prone to errors during transmission using this transmission algorithm; that is, the three-dimensional video data of the key part is sensitive data with poor stability.
  • the face of a person is more three-dimensional relative to the body, so the probability of an error in the face data is higher, and there is also a high probability of errors in the nose data and the eye data. Therefore, at least one of a face part, a nose part, an eye part, and a mouth part may be defined as a key part. Key parts in the three-dimensional video data are then identified through image recognition processing.
  • the two-dimensional video data and depth data in the three-dimensional video data are obtained separately; the key parts in the two-dimensional video data can be identified through image recognition processing, and the depth data corresponding to the key parts is determined based on the positions of the key parts in the two-dimensional video data.
  • As one implementation, the depth data corresponding to the key parts can be sent as the three-dimensional video data, so that the MEC server can model based on the three-dimensional video data representing the depth data; as another implementation, the two-dimensional video data and depth data corresponding to the key parts are used as the three-dimensional video data and sent, so that the MEC server can model the key parts based on the depth data and can also perform color filling based on the two-dimensional video data.
  • the method further includes: obtaining three-dimensional video data; extracting first data corresponding to key parts from the three-dimensional video data, and sending the first data to the MEC server.
  • This enables the server to establish a second model corresponding to the key part; further, after the establishment of the second model, a first model corresponding to the non-key part needs to be selected.
  • the terminal obtains indication information of the server, where the indication information is used to instruct the terminal to transmit image data that meets a preset condition, so that the server selects an adapted first model based on the image data.
  • Step 301 Receive first instruction information from the MEC server; the first instruction information is used to instruct the terminal to collect image data for distinguishing non-critical parts corresponding to a model.
  • Step 302 Obtain image data based on the first instruction information, and the image data can distinguish non-critical parts corresponding to the model.
  • Step 303 Send the image data to the MEC server.
  • the first indication information includes a type of the model and / or a characteristic parameter corresponding to the model.
  • the type of the model indicates the type of non-critical parts
  • the characteristic parameters corresponding to the model include related parameters of the contour points and / or bone points of the non-critical parts.
  • the type of the non-critical part may specifically be an arm type, a hand type, a leg type, a foot type, a torso type, etc.; of course, it may also include a background type (such as a city street type, an indoor type, a mountain type, etc.).
  • the terminal may collect image data according to the type of non-critical parts and / or characteristic parameters of the non-critical parts in the first instruction information, and send the obtained image data to the MEC server.
  • the image data is image data that cannot be used for modeling.
  • the image data may be two-dimensional image data, such as RGB data.
  • Step 401 Receive second instruction information from the MEC server; the second instruction information is used to instruct the terminal to select, from the collected three-dimensional video data, image data that can distinguish non-critical parts.
  • Step 402 Obtain three-dimensional video data, and select image data capable of distinguishing non-critical parts from the obtained three-dimensional video data according to the second instruction information.
  • Step 403 Send the image data to the MEC server.
  • the second indication information may also include a type of the model and / or a characteristic parameter corresponding to the model.
  • For the description of the type of the model and the characteristic parameters corresponding to the model, reference may be made to the description above, and details are not repeated here.
  • the terminal may select, from the obtained three-dimensional video data and based on the type of the model and/or the characteristic parameters corresponding to the model, image data that satisfies the type of the model and/or the characteristic parameters corresponding to the model, and send the selected image data to the MEC server.
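  • A sketch of how the terminal side of this selection could look, assuming the captured three-dimensional video is available as a sequence of 2D frames and that a recognition step can tell whether a frame clearly shows a requested part type; both assumptions are made only for illustration.

```python
import numpy as np
from typing import List

def frame_shows_part(rgb_frame: np.ndarray, part_type: str) -> bool:
    """Hypothetical classifier: does this 2D frame clearly show the requested
    non-critical part type (arm, torso, background, ...)?"""
    raise NotImplementedError("stand-in for the terminal's recognition step")

def select_frames(captured_frames: List[np.ndarray], part_types: List[str]) -> List[np.ndarray]:
    """Pick, from the already collected three-dimensional video stream, the image
    data that can distinguish the non-critical parts named in the second
    instruction information; only the selected frames are sent to the MEC server."""
    return [frame for frame in captured_frames
            if any(frame_shows_part(frame, t) for t in part_types)]
```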
  • the image data is image data that cannot be used for modeling.
  • the image data may be two-dimensional image data, such as RGB data.
  • the receiving the indication information from the MEC server includes: receiving the indication information from the MEC server through handshake signaling.
  • a handshake process is required, which specifically includes: the terminal sends a first data packet to the server to establish a communication connection with the server; after the server receives the first data packet, it sends a first acknowledgement data packet to the terminal; after receiving the first acknowledgement data packet from the server, the terminal sends a second acknowledgement data packet to the server. At this point, the terminal can transmit data to the server.
  • the indication information described in the embodiment of the present application may be carried in the first acknowledgement data packet and sent to the terminal; as another example, the indication information may also be sent to the terminal through an independent data packet.
  • In this way, the server sends instruction information to the terminal to instruct the terminal to transmit image data that meets the preset condition, so that the server can select a suitable model. This greatly improves the accuracy of the selected preset model and, to a certain extent, also greatly shortens the modeling time of the complete model.
  • FIG. 6 is a schematic structural diagram of a terminal according to an embodiment of the present application.
  • the terminal includes: a second communication unit 51 and an obtaining unit 52;
  • the second communication unit 51 is configured to receive instruction information from a MEC server; the instruction information is used to instruct the terminal to transmit image data that meets a preset condition;
  • the obtaining unit 52 is configured to obtain image data
  • the second communication unit 51 is further configured to send image data that satisfies the preset condition to the MEC server based on the instruction information.
  • the second communication unit 51 is configured to receive first instruction information from the MEC server; the first instruction information is used to instruct the acquisition unit 52 to collect image data for distinguishing non-critical parts corresponding to the model;
  • the obtaining unit 52 is configured to obtain image data based on the first instruction information, and the image data can distinguish non-critical parts corresponding to a model.
  • the second communication unit 51 is configured to receive second instruction information from the MEC server; the second instruction information is used to instruct the acquiring unit 52 to select, from the collected three-dimensional video data, image data that can distinguish non-critical parts;
  • the acquiring unit 52 is configured to acquire image data, and select image data capable of distinguishing non-critical parts from the obtained three-dimensional video data according to the second instruction information.
  • the obtaining unit 52 is configured to obtain three-dimensional video data
  • the second communication unit 51 is configured to extract first data corresponding to key parts from the three-dimensional video data, and send the first data to a MEC server.
  • the obtaining unit 52 in the terminal may be implemented by a processor in the terminal, such as a central processing unit (CPU), a digital signal processor (DSP), a microcontroller unit (MCU), or a field-programmable gate array (FPGA); the second communication unit 51 in the terminal may be implemented by a communication module (including a basic communication suite, an operating system, a communication module, standardized interfaces and protocols, etc.) and a transceiver antenna.
  • It should be noted that the terminal provided in the foregoing embodiment is illustrated only by the division of the above program modules when performing data processing; in practical applications, the above processing may be allocated to different program modules as needed, that is, the internal structure of the terminal may be divided into different program modules to complete all or part of the processing described above.
  • the terminal and the data processing method embodiments provided in the foregoing embodiments belong to the same concept. For specific implementation processes, refer to the method embodiments, and details are not described herein again.
  • FIG. 7 is a schematic structural diagram of a server according to an embodiment of the present application.
  • the server includes: a first communication unit 61, a selection unit 62, and a modeling unit 63;
  • the first communication unit 61 is configured to send instruction information to a terminal, where the instruction information is used to instruct the terminal to transmit image data that meets a preset condition; and is further configured to receive image data from the terminal;
  • the selection unit 62 is configured to select a first model from a preset model set based on the image data received by the first communication unit 61; the preset model set includes a plurality of three-dimensional models of non-critical parts other than key parts;
  • the modeling unit 63 is configured to establish a model including the three-dimensional video data of the non-critical part and the key part according to the first model and a second model; the second model is established from the received three-dimensional video data of the key part.
  • the first communication unit 61 is configured to send first instruction information to the terminal according to a model in the model set; the first instruction information is used to instruct the terminal to collect image data used to distinguish non-critical parts corresponding to the model.
  • the first communication unit 61 is configured to send the first indication information to the terminal according to a type of the model in the model set and / or a characteristic parameter corresponding to the model.
  • the first communication unit 61 is configured to send second instruction information to the terminal; the second instruction information is used to instruct the terminal to select, from the collected three-dimensional video data, image data that can distinguish non-critical parts.
  • the selection unit 62 and the modeling unit 63 in the server may be implemented by a processor in the server, such as a CPU, a DSP, an MCU, or an FPGA;
  • the first communication unit 61 may be implemented by a communication module (including a basic communication suite, an operating system, a communication module, standardized interfaces and protocols, etc.) and a transceiver antenna in practical applications.
  • It should be noted that the server provided by the foregoing embodiment is illustrated only by the division of the above program modules when performing data processing; in practical applications, the above processing may be allocated to different program modules as needed, that is, the internal structure of the server may be divided into different program modules to complete all or part of the processing described above.
  • the server and the data processing method embodiments provided in the foregoing embodiments belong to the same concept. For specific implementation processes, refer to the method embodiments, and details are not described herein again.
  • FIG. 8 is a schematic diagram of a hardware composition structure of the data processing device according to the embodiment of the present application.
  • the data processing device includes a memory, a processor, and a computer program stored in the memory and executable on the processor; as a first implementation manner, when the data processing device is a terminal, the processor located in the terminal implements, when executing the program, the steps of the data processing method applied to the terminal in the embodiments of this application.
  • As a second implementation manner, when the data processing device is a MEC server, the processor located in the server implements, when executing the program, the steps of the data processing method applied to the MEC server in the embodiments of the present application.
  • the data processing device also includes a communication interface; various components in the data processing device (terminal or server) are coupled together through a bus system. Understandably, the bus system is used to implement connection and communication between these components.
  • the bus system also includes a power bus, a control bus, and a status signal bus.
  • the memory in this embodiment may be a volatile memory or a non-volatile memory, and may also include both volatile and non-volatile memory.
  • the non-volatile memory may be a read-only memory (ROM, Read Only Memory), a programmable read-only memory (PROM, Programmable Read-Only Memory), or an erasable programmable read-only memory (EPROM, Erasable Programmable Read- Only Memory), Electrically Erasable and Programmable Read-Only Memory (EEPROM), Magnetic Random Access Memory (FRAM, ferromagnetic random access memory), Flash Memory (Flash Memory), Magnetic Surface Memory , Compact disc, or read-only compact disc (CD-ROM, Compact Disc-Read-Only Memory); the magnetic surface memory can be a disk memory or a tape memory.
  • the volatile memory may be random access memory (RAM, Random Access Memory), which is used as an external cache.
  • many forms of RAM are available, such as static random access memory (SRAM), synchronous static random access memory (SSRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), enhanced synchronous dynamic random access memory (ESDRAM), SyncLink dynamic random access memory (SLDRAM), and direct Rambus random access memory (DRRAM).
  • the methods disclosed in the embodiments of the present application may be applied to a processor, or implemented by a processor.
  • the processor may be an integrated circuit chip with signal processing capabilities. In the implementation process, each step of the above method may be completed by an integrated logic circuit of hardware in a processor or an instruction in a form of software.
  • the aforementioned processor may be a general-purpose processor, a DSP, or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, and the like.
  • the processor may implement or execute various methods, steps, and logic block diagrams disclosed in the embodiments of the present application.
  • a general-purpose processor may be a microprocessor or any conventional processor.
  • the software module may be located in a storage medium.
  • the storage medium is located in the memory.
  • the processor reads the information in the memory and completes the steps of the foregoing method in combination with its hardware.
  • An embodiment of the present application further provides a chip including a processor, and the processor may call and run a computer program from a memory to implement the method in the embodiment of the present application.
  • the chip may further include a memory.
  • the processor may call and run a computer program from the memory to implement the method in the embodiment of the present application.
  • the memory can be a separate device independent of the processor, or it can be integrated in the processor.
  • the chip may further include an input interface.
  • the processor can control the input interface to communicate with other devices or chips. Specifically, the processor can obtain information or data sent by other devices or chips.
  • the chip may further include an output interface.
  • the processor may control the output interface to communicate with other devices or chips.
  • the processor may output information or data to the other devices or chips.
  • the chip may be applied to the MEC server in the embodiments of the present application, and the chip may implement the corresponding processes implemented by the MEC server in the methods of the embodiments of the present application.
  • the chip may be applied to the terminal in the embodiments of the present application, and the chip may implement the corresponding processes implemented by the terminal in each method of the embodiments of the present application.
  • the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system or a system-on-chip.
  • An embodiment of the present application further provides a computer storage medium, specifically a computer-readable storage medium.
  • Computer instructions are stored thereon, and when the computer instructions are executed by a processor, the data processing method applied in the embodiment of the present application to a terminal or a MEC server is implemented.
  • An embodiment of the present application further provides a computer program product, including computer program instructions.
  • the computer program product can be applied to the MEC server in the embodiments of the present application, and the computer program instructions cause the computer to execute the corresponding processes implemented by the MEC server in each method of the embodiments of the present application; for brevity, details are not described herein again.
  • the computer program product can be applied to the terminal in the embodiments of the present application, and the computer program instructions cause the computer to execute the corresponding processes implemented by the terminal in each method of the embodiments of the present application; for brevity, details are not described herein again.
  • the embodiment of the present application also provides a computer program.
  • the computer program may be applied to the MEC server in the embodiment of the present application.
  • When the computer program is run on a computer, the computer is caused to execute the corresponding processes implemented by the MEC server in each method of the embodiments of the present application; for brevity, details are not described herein again.
  • the computer program may be applied to a terminal in the embodiment of the present application.
  • When the computer program is run on a computer, the computer is caused to execute the corresponding processes implemented by the terminal in each method of the embodiments of the present application; for brevity, details are not described herein again.
  • An embodiment of the present application further provides a data processing system, including a MEC server and a terminal.
  • the terminal may be used to implement the corresponding functions implemented by the terminal in the foregoing methods, and the MEC server may be used to implement the corresponding functions implemented by the MEC server in the foregoing methods; for brevity, details are not repeated here.
  • the disclosed methods and devices may be implemented in other ways.
  • the device embodiments described above are only schematic.
  • the division of the units is only a logical function division; in actual implementation, there may be other division manners, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the coupling, direct coupling, or communication connection between the displayed or discussed components may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
  • the units described above as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, which may be located in one place or distributed to multiple network units; Some or all of the units may be selected according to actual needs to achieve the objective of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may be used separately as one unit, or two or more units may be integrated into one unit;
  • the above integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.
  • the foregoing program may be stored in a computer-readable storage medium.
  • When the program is executed, the steps including those of the foregoing method embodiments are performed.
  • the foregoing storage medium includes: various types of media that can store program codes, such as a mobile storage device, a ROM, a RAM, a magnetic disk, or an optical disc.
  • If the above-mentioned integrated unit of the present application is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
  • the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform all or part of the methods described in the embodiments of the present application.
  • the foregoing storage medium includes: various types of media that can store program codes, such as a mobile storage device, a ROM, a RAM, a magnetic disk, or an optical disc.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mobile Radio Communication Systems (AREA)
  • Processing Or Creating Images (AREA)
  • Telephonic Communication Services (AREA)

Abstract

Embodiments of the present invention relate to a data processing method, a terminal, a server, and a computer storage medium. The method comprises: sending instruction information to a terminal, the instruction information being used to instruct the terminal to transmit image data that satisfies a preset condition; receiving image data from the terminal, and selecting a first model from a preconfigured model set according to the image data, the preconfigured model set comprising a plurality of three-dimensional models of non-key parts other than key parts; and establishing, according to the first model and a second model, a model of three-dimensional video data comprising the non-key parts and the key parts, the second model being established from received three-dimensional video data of the key parts.
PCT/CN2019/100632 2018-09-30 2019-08-14 Procédé de traitement de données, terminal, serveur et support d'informations d'ordinateur WO2020063168A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811161927.8A CN109151430B (zh) 2018-09-30 2018-09-30 一种数据处理方法、终端、服务器和计算机存储介质
CN201811161927.8 2018-09-30

Publications (1)

Publication Number Publication Date
WO2020063168A1 true WO2020063168A1 (fr) 2020-04-02

Family

ID=64810574

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/100632 WO2020063168A1 (fr) 2018-09-30 2019-08-14 Procédé de traitement de données, terminal, serveur et support d'informations d'ordinateur

Country Status (2)

Country Link
CN (1) CN109151430B (fr)
WO (1) WO2020063168A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109151430B (zh) * 2018-09-30 2020-07-28 Oppo广东移动通信有限公司 一种数据处理方法、终端、服务器和计算机存储介质
CN111447504B (zh) * 2020-03-27 2022-05-03 北京字节跳动网络技术有限公司 三维视频的处理方法、装置、可读存储介质和电子设备


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101610421B (zh) * 2008-06-17 2011-12-21 华为终端有限公司 视频通讯方法、装置及系统
US10672180B2 (en) * 2016-05-02 2020-06-02 Samsung Electronics Co., Ltd. Method, apparatus, and recording medium for processing image

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140270374A1 (en) * 2013-03-15 2014-09-18 Nito, Inc. Systems, Methods, and Software for Detecting an Object in an Image
US20170055147A1 (en) * 2015-08-19 2017-02-23 Alibaba Group Holding Limited Method, client terminal and server for establishing communication
CN107465885A (zh) * 2016-06-06 2017-12-12 中兴通讯股份有限公司 一种实现视频通讯的方法和装置
CN108600728A (zh) * 2018-05-10 2018-09-28 Oppo广东移动通信有限公司 一种数据传输方法及终端、计算机存储介质
CN109151430A (zh) * 2018-09-30 2019-01-04 Oppo广东移动通信有限公司 一种数据处理方法、终端、服务器和计算机存储介质

Also Published As

Publication number Publication date
CN109151430B (zh) 2020-07-28
CN109151430A (zh) 2019-01-04

Similar Documents

Publication Publication Date Title
KR102480642B1 (ko) 데이터 처리 방법, 서버 및 컴퓨터 저장 매체
CN108495112B (zh) 数据传输方法及终端、计算机存储介质
WO2020063168A1 (fr) Procédé de traitement de données, terminal, serveur et support d'informations d'ordinateur
WO2023020502A1 (fr) Procédé et appareil de traitement de données
WO2021031386A1 (fr) Procédé et dispositif de positionnement, serveur, support de stockage et terminal
WO2020063170A1 (fr) Procédé de traitement de données, terminal, serveur et support d'informations
CN109272576B (zh) 一种数据处理方法、mec服务器、终端设备及装置
CN109413405B (zh) 一种数据处理方法、终端、服务器和计算机存储介质
CN104521211B (zh) 一种会话连接建立的方法、装置和系统
WO2020063171A1 (fr) Procédé de transmission de données, terminal, serveur et support de stockage
CN109246409B (zh) 一种数据处理方法、终端、服务器和计算机存储介质
CN108632376A (zh) 一种数据处理方法、终端、服务器和计算机存储介质
CN109299323B (zh) 一种数据处理方法、终端、服务器和计算机存储介质
CN109147043B (zh) 一种数据处理方法、服务器及计算机存储介质
CN109151435B (zh) 一种数据处理方法、终端、服务器及计算机存储介质
WO2020062919A1 (fr) Procédé de traitement de données, serveur de mec et dispositif terminal
WO2022183496A1 (fr) Procédé et appareil d'apprentissage de modèle
CN109345623B (zh) 模型校验方法及服务器、计算机存储介质
CN109302598B (zh) 一种数据处理方法、终端、服务器和计算机存储介质
CN110243366B (zh) 一种视觉定位方法及装置、设备、存储介质
RU2800627C2 (ru) Способ обработки данных, сервер и компьютерный носитель данных
WO2024036453A1 (fr) Procédé d'apprentissage fédéré et dispositif associé
CN108737807B (zh) 一种数据处理方法、终端、服务器和计算机存储介质
KR102168223B1 (ko) 기초 구조물에 3차원 입체 디자인을 합성한 이미지를 포함하는 건축 정보를 제공하는 방법 및 그 시스템
CN109325997B (zh) 模型校验方法及服务器、计算机存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19865782

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19865782

Country of ref document: EP

Kind code of ref document: A1