WO2020098418A1 - Method for controlling user data and related apparatus - Google Patents

Method for controlling user data and related apparatus

Info

Publication number: WO2020098418A1
Authority: WIPO (PCT)
Prior art keywords: application, user, data, precision, user data
Application number: PCT/CN2019/110117
Other languages: English (en), French (fr)
Inventor: 符谋政
Original Assignee: 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to EP19877531.4A (published as EP3693915A4)
Priority to US16/764,716 (published as US20210224886A1)
Publication of WO2020098418A1

Classifications

    • G06Q30/0601 Electronic shopping [e-shopping]
    • G06Q30/0613 Electronic shopping: third-party assisted
    • G06Q30/0623 Electronic shopping: item investigation
    • G06Q30/0631 Electronic shopping: item recommendations
    • G06Q30/0643 Electronic shopping: graphical representation of items or shoppers
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06V20/647 Three-dimensional objects by matching two-dimensional images to three-dimensional objects

Definitions

  • The present application relates to the field of terminals, and in particular to a method for controlling user data and a related apparatus.
  • Virtual try-on mainly includes two-dimensional (2D) virtual try-on and three-dimensional (3D) virtual try-on. The following is a brief introduction.
  • 2D virtual try-on obtains a 2D image of a garment by photographing or drawing it, obtains a video stream of the user through a camera, and pastes the 2D garment image onto the user's body in the video stream through computer graphics processing. Because the garment image is 2D and lacks a sense of depth, 2D virtual try-on cannot fully reflect the effect of the user wearing the garment; for example, when the user turns sideways or turns around, the corresponding try-on effect cannot be presented.
  • 3D virtual try-on uses three-dimensional modeling technology to generate 3D models of the user's body and of the clothing, and then uses three-dimensional geometric deformation or cloth physics simulation algorithms to simulate the dressing effect of the clothing on the body model in a 3D scene. Because both the body model and the clothing are three-dimensional, 3D virtual try-on can fully demonstrate the user's try-on effect.
  • 3D virtual try-on can provide users with a realistic try-on effect and better meet users' try-on needs, and is a key direction for future development.
  • A virtual try-on application (APP) generally needs to obtain both a 3D model of the user's body and a 3D model of the clothing in order to show the dressing effect on the body model.
  • However, users are increasingly aware of personal data and privacy protection, and may refuse to provide personal body size data (such as data for sensitive body parts) to a virtual try-on application, making virtual try-on difficult to achieve.
  • Similarly, clothing manufacturers and e-commerce platforms have confidentiality requirements for product data, and may refuse to provide important product data to a virtual try-on application, which also makes virtual try-on difficult to achieve.
  • This application provides a method for controlling user data and related devices, which can reduce the risk of high-precision user data leakage and ensure the security of high-precision user data.
  • In a first aspect, the present application provides a method for controlling user data, applied to an electronic device on which a first application and a second application are installed.
  • The method includes: the second application obtains the user's high-precision user data and low-precision user data; the second application obtains size data of a first item provided by the first application, where the size data reflects the dimensions of the first item; the second application determines, based on the size data and the high-precision user data, an optimal size of the first item, the optimal size matching the user's body; the high-precision user data reflects the details of the user's body, and the low-precision user data reflects the user's body contour; the second application provides the optimal size and the low-precision user data to the first application to generate a 3D try-on image.
  • In this way, the high-precision user data is acquired and processed only by the second application, while the first application receives nothing more than the low-precision user data and the optimal size needed to generate the 3D try-on image. This reduces the risk of leaking high-precision user data and meets the user's privacy protection needs.
  • High-precision user data can reflect the detailed characteristics of almost every part of the user's body and carries a large amount of information, so users generally want it to be kept secure. Low-precision user data only reflects the general characteristics of the user's body, carries less information, and has lower security requirements. Access to high-precision user data is therefore controlled more strictly than access to low-precision user data: high-precision user data is provided only to applications trusted by the user, while low-precision user data can be provided to most applications (including applications trusted by the user).
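  • As a minimal illustration of this data flow (a sketch only; the class, method, and field names below are hypothetical and not part of this application), the second application could expose a single entry point that accepts the first item's size data and hands back nothing but the optimal size and the low-precision user data:

```java
import java.util.Map;

// Hypothetical sketch of the data flow between the two applications.
// The high-precision user data stays inside the second application; the first
// application only ever receives the optimal size and the low-precision data.
public class SecondAppTryOnService {

    /** What is handed back to the first application. */
    public static class TryOnInput {
        public final String optimalSize;               // e.g. "M"
        public final Map<String, Double> lowPrecision; // coarse body-contour data
        public TryOnInput(String optimalSize, Map<String, Double> lowPrecision) {
            this.optimalSize = optimalSize;
            this.lowPrecision = lowPrecision;
        }
    }

    private final Map<String, Double> highPrecisionUserData; // never exposed
    private final Map<String, Double> lowPrecisionUserData;

    public SecondAppTryOnService(Map<String, Double> high, Map<String, Double> low) {
        this.highPrecisionUserData = high;
        this.lowPrecisionUserData = low;
    }

    /**
     * Entry point called with the size data of the first item
     * (one entry per available size, e.g. "S" -> {chest=88.0, waist=70.0}).
     */
    public TryOnInput requestTryOn(Map<String, Map<String, Double>> sizeDataPerSize) {
        String optimal = determineOptimalSize(sizeDataPerSize);
        return new TryOnInput(optimal, lowPrecisionUserData);
    }

    // Placeholder matching step; one possible matching criterion is sketched
    // later in this document (see the SizeMatcher example).
    private String determineOptimalSize(Map<String, Map<String, Double>> sizeDataPerSize) {
        String best = null;
        double bestGap = Double.MAX_VALUE;
        for (Map.Entry<String, Map<String, Double>> size : sizeDataPerSize.entrySet()) {
            double gap = 0;
            for (Map.Entry<String, Double> dim : size.getValue().entrySet()) {
                Double body = highPrecisionUserData.get(dim.getKey());
                if (body != null) gap += Math.abs(dim.getValue() - body);
            }
            if (gap < bestGap) { bestGap = gap; best = size.getKey(); }
        }
        return best;
    }
}
```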
  • For example, the second application may be an application developed by the mobile phone manufacturer, such as an AR application, and the first application may be a third-party application, such as a shopping application like Taobao or JD.com.
  • the second application may obtain the user's high-precision user data and low-precision user data in two ways:
  • In the first way, the second application sends a first acquisition request to a second server, and receives the user's high-precision user data and low-precision user data sent by the second server.
  • The first acquisition request may carry identification information of the second application; the user's high-precision user data and low-precision user data are sent by the second server after the identification information of the second application has been verified.
  • The process in which the second server verifies the identification information of the second application, that is, verifies whether the second application is an application trusted by the user, may include: the second server checks whether its pre-stored application identifiers include the identifier of the second application; if so, it confirms that the second application is trusted by the user, that is, the verification passes. The applications corresponding to the application identifiers pre-stored by the second server are all applications trusted by the user, and this list can be configured by the second server or by the user.
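  • A minimal sketch of this check, assuming the trusted application identifiers are kept in a simple in-memory set on the second server (all names below are illustrative, not part of this application):

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical sketch: the second server verifies whether a requesting application
// is trusted by the user before returning high-precision user data.
public class TrustedAppVerifier {

    // Pre-stored application identifiers; configurable by the server or by the user.
    private final Set<String> trustedAppIds = new HashSet<>();

    public void addTrustedApp(String appId) {
        trustedAppIds.add(appId);
    }

    /** Returns true only if the identifier carried in the acquisition request is pre-stored as trusted. */
    public boolean verify(String requestingAppId) {
        return trustedAppIds.contains(requestingAppId);
    }
}

// Usage (illustrative): the server returns user data only when verify(...) passes, e.g.
// if (verifier.verify(request.getAppId())) { /* send user data */ } else { /* reject */ }
```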
  • In the second way, the second application receives the high-precision user data sent by the second server, or receives it as user input, or reads high-precision user data detected by the electronic device; the second application then calculates the user's low-precision user data from the high-precision user data.
  • The low-precision user data can be obtained by blurring (fuzzifying) the high-precision user data; for example, some data can be deleted from the high-precision user data to obtain the low-precision user data.
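  • One purely illustrative way to perform such blurring is to keep only a few coarse contour measurements and round them, as sketched below (the field names and the rounding step are assumptions, not defined by this application):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of "fuzzifying" high-precision user data into low-precision user data:
// keep only coarse contour measurements and round them, dropping detailed fields.
public class UserDataFuzzifier {

    // Coarse measurements that roughly describe the body contour (illustrative choice).
    private static final String[] CONTOUR_KEYS = {"height", "weight", "chest", "waist", "hip"};

    public static Map<String, Double> toLowPrecision(Map<String, Double> highPrecision) {
        Map<String, Double> lowPrecision = new HashMap<>();
        for (String key : CONTOUR_KEYS) {
            Double value = highPrecision.get(key);
            if (value != null) {
                // Round to the nearest 5 units so exact body-part sizes are not exposed.
                lowPrecision.put(key, Math.round(value / 5.0) * 5.0);
            }
        }
        // Detailed fields (e.g. per-toe lengths, ankle circumference) are simply not copied.
        return lowPrecision;
    }
}
```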
  • Before the second application obtains the size data of the first item provided by the first application, the method may further include: the electronic device displays an interface of the first application, the interface including the first item; and the electronic device receives a user operation selecting the first item.
  • The 3D try-on image is generated based on the optimal size, the low-precision user data, and effect data of the first item. The effect data of the first item is provided by the first application and reflects the details of the first item. The resulting 3D try-on image can therefore show both how well the first item fits the user and the details of the first item.
  • The items in this application may be commodities, such as clothes, pants, hats, glasses, and so on.
  • The data of an item is divided into size data and effect data. The same item can have multiple sets of size data, for example corresponding to different sizes, and multiple sets of effect data, for example corresponding to different patterns or colors.
  • The size data reflects the dimensions of each part of the product and is in most cases disclosed to consumers so that they can select a suitable product.
  • The effect data is what mainly distinguishes an item from other items. To protect the competitiveness of the product, the effect data has higher security requirements, so access to effect data is controlled more strictly than access to size data: effect data is provided only to applications trusted by the item provider (such as a merchant), while size data can be provided to most applications (including applications trusted by the item provider).
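  • This split between size data and effect data could be represented as in the following sketch (the types, fields, and access check are illustrative assumptions only):

```java
import java.util.Map;

// Hypothetical sketch of how an item's data might be split into size data
// (widely shareable) and effect data (restricted to trusted applications).
public class ItemData {

    /** Dimensions per size label, e.g. "M" -> {chest=96, shoulder=44, length=70}; generally disclosed. */
    public final Map<String, Map<String, Double>> sizeData;

    /** Detail data (patterns, colors, textures) that differentiates the item; access restricted. */
    public final Map<String, byte[]> effectData;

    public ItemData(Map<String, Map<String, Double>> sizeData, Map<String, byte[]> effectData) {
        this.sizeData = sizeData;
        this.effectData = effectData;
    }

    /** Size data may be handed to most applications, for example to the second application. */
    public Map<String, Map<String, Double>> shareSizeData() {
        return sizeData;
    }

    /** Effect data is returned only to applications trusted by the item provider. */
    public Map<String, byte[]> shareEffectData(boolean callerTrustedByProvider) {
        if (!callerTrustedByProvider) {
            throw new SecurityException("Effect data is restricted to trusted applications");
        }
        return effectData;
    }
}
```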
  • The first application may obtain the size data and effect data of the first item in the following manner: the first application sends a second acquisition request to a first server, the second acquisition request carrying identification information of the first application; after the identification information of the first application has been verified, the first application receives the size data and effect data of the first item sent by the first server.
  • The process in which the first server verifies the identification information of the first application, that is, verifies whether the first application is an application trusted by the item provider, may include: the first server checks whether its pre-stored application identifiers include the identifier of the first application; if so, it confirms that the first application is trusted by the item provider, that is, the verification passes. The applications corresponding to the application identifiers pre-stored by the first server are all applications trusted by the item provider, and this list can be configured by the first server or by the user.
  • the method may further include: the electronic device displaying the generated 3D try-on image on the interface of the first application.
  • the user can view the effect of trying on the first item through the 3D try-on image displayed on the electronic device, which can bring the user a good shopping experience.
  • In a second aspect, the present application provides an electronic device including a memory and a processor. The memory stores at least one program, and the at least one program includes a first application and a second application. The processor is configured to run the second application to cause the electronic device to perform the method for controlling user data described in the first aspect.
  • As in the first aspect, the second application may be an application developed by the mobile phone manufacturer, such as an AR application, and the first application may be a third-party application, such as a shopping application like Taobao or JD.com.
  • The processor may run the second application to cause the electronic device to obtain the user's high-precision and low-precision user data in either of the following two ways.
  • In the first way, the processor runs the second application to cause the electronic device to send the first acquisition request to the second server and to receive the user's high-precision user data and low-precision user data sent by the second server.
  • The first acquisition request may carry identification information of the second application; the high-precision and low-precision user data are sent by the second server after the identification information of the second application has been verified. The verification process is the same as in the first aspect: the second server checks whether its pre-stored application identifiers include the identifier of the second application and, if so, confirms that the second application is trusted by the user. The applications corresponding to the pre-stored identifiers are all trusted by the user, and the list can be configured by the second server or by the user.
  • In the second way, the processor runs the second application to cause the electronic device to receive the high-precision user data sent by the second server, or to receive it as user input, or to read high-precision user data detected by the electronic device; the second application then calculates the user's low-precision user data from the high-precision user data. The low-precision user data can be obtained by blurring the high-precision user data, for example by deleting some data from it.
  • The electronic device further includes a touch screen. Before the processor runs the second application to cause the electronic device to obtain the size data of the first item provided by the first application, the processor is further configured to run the first application to cause the electronic device to perform the following: the touch screen displays an interface of the first application, the interface including the first item; and the touch screen receives a user operation selecting the first item.
  • The 3D try-on image is generated based on the optimal size, the low-precision user data, and the effect data of the first item. The effect data of the first item is provided by the first application and reflects the details of the first item, so the 3D try-on image can show both how well the first item fits the user and the details of the first item.
  • Before the processor runs the second application to cause the electronic device to obtain the size data of the first item provided by the first application, the processor is further configured to run the first application to cause the electronic device to perform the following: send a second acquisition request to the first server, the second acquisition request carrying identification information of the first application; and, after the identification information of the first application has been verified, receive the size data and effect data of the first item sent by the first server.
  • The electronic device further includes a touch screen, and the processor is further configured to run the first application to cause the electronic device to display the generated 3D try-on image on the interface of the first application through the touch screen.
  • In a third aspect, the present application provides a method for controlling user data, applied to a second server. The method includes: the second server receives a first acquisition request sent by a second application installed in an electronic device; and the second server sends the user's high-precision user data and low-precision user data to the second application. The first acquisition request carries identification information of the second application, and the second server sends the user's high-precision user data and low-precision user data only after the identification information of the second application has been verified.
  • In a fourth aspect, the present application provides a second server, including one or more processors and one or more memories. The one or more memories are coupled to the one or more processors and are used to store computer program code, the computer program code including computer instructions. When the one or more processors execute the computer instructions, the second server performs the method for controlling user data provided in the third aspect.
  • In a fifth aspect, the present application provides a method for controlling user data, applied to a first server. The method includes: the first server receives a second acquisition request sent by a first application installed in an electronic device; and the first server sends the size data and effect data of the first item to the first application. The second acquisition request carries identification information of the first application, and the first server sends the size data and effect data of the first item to the first application after the identification information of the first application has been verified.
  • In a sixth aspect, the present application provides a first server, including one or more processors and one or more memories. The one or more memories are coupled to the one or more processors and are used to store computer program code, the computer program code including computer instructions. When the one or more processors execute the computer instructions, the first server performs the method for controlling user data provided in the fifth aspect.
  • In a seventh aspect, the present application provides a computer storage medium including computer instructions which, when run on an electronic device, cause the electronic device to perform the method for controlling user data described in the first aspect.
  • In an eighth aspect, the present application provides a computer program product containing instructions which, when run on a computer, cause the computer to perform the method for controlling user data described in the first aspect.
  • With the above method, 3D virtual try-on can be achieved without providing the user's high-precision user data to the first application. This application can ensure that, during the 3D virtual try-on process, high-precision user data is accessed only by applications that the user trusts, prevent high-precision user data from being leaked to applications that the user does not trust, and reduce the risk of high-precision user data leakage.
  • FIG. 1 is a schematic diagram of the try-on technology in the prior art
  • FIG. 3 is a block diagram of the software structure of the electronic device provided by this application.
  • FIG. 4 is a schematic diagram of a scenario of scanning to obtain user data provided by this application.
  • FIGS. 5-6 are schematic diagrams of human-computer interaction provided by this application.
  • FIG. 7 is a schematic flowchart of a method for controlling user data provided by this application.
  • FIG. 9 is a schematic diagram of human-computer interaction provided by this application.
  • The terms "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. A feature qualified by "first" or "second" may explicitly or implicitly include one or more of that feature.
  • the meaning of “plurality” is two or more.
  • the method for controlling user data of the present application is applied to electronic devices.
  • This application does not specifically limit the type of the electronic device. The electronic device may be a portable electronic device such as a mobile phone, a tablet computer, a personal digital assistant (PDA), a wearable device, or a laptop, or a non-portable electronic device such as an electronic mirror or a desktop computer. Examples of electronic devices include, but are not limited to, electronic devices running iOS, Android, Microsoft, or other operating systems.
  • FIG. 2 shows a schematic structural diagram of the electronic device 100.
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and so on.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), and so on.
  • different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100.
  • the controller can generate the operation control signal according to the instruction operation code and the timing signal to complete the control of fetching instructions and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the instructions stored in the memory are used by the electronic device 100 to execute the method for controlling user data in the embodiment of the present application.
  • the data stored in the memory may include user data including high-precision user data and low-precision user data.
  • the memory in the processor 110 is a cache memory.
  • The memory may store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instruction or data again, it can be called directly from this memory, which avoids repeated access, reduces the waiting time of the processor 110, and thus improves system efficiency.
  • the processor 110 may include one or more interfaces.
  • The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, and so on.
  • The I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may include multiple sets of I2C buses.
  • the processor 110 may respectively couple the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to realize the touch function of the electronic device 100.
  • the I2S interface can be used for audio communication.
  • the PCM interface can also be used for audio communication, sampling, quantizing and encoding analog signals.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the MIPI interface can be used to connect the processor 110 to peripheral devices such as the display screen 194 and the camera 193.
  • The MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), and so on.
  • the processor 110 and the camera 193 communicate through a CSI interface to implement the shooting function of the electronic device 100.
  • the processor 110 and the display screen 194 communicate through the DSI interface to realize the display function of the electronic device 100.
  • the GPIO interface can be configured via software.
  • the GPIO interface can be configured as a control signal or a data signal.
  • the GPIO interface may be used to connect the processor 110 to the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transfer data between the electronic device 100 and peripheral devices. It can also be used to connect headphones and play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present invention is only a schematic illustration, and does not constitute a limitation on the structure of the electronic device 100.
  • the electronic device 100 may also use different interface connection methods in the foregoing embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is used to receive charging input from the charger.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the wireless communication function of the electronic device 100 can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • the electronic device 100 can communicate with other devices using wireless communication functions.
  • the electronic device 100 can communicate with the server 200 to obtain user data stored by the server 200 or a user's high-precision 3D body model, low-precision 3D body model, and the like.
  • The electronic device 100 can communicate with the server 300 to obtain the product data stored in the server 300, or a 3D accurate model of the product, a 3D liner model of the product, and so on.
  • For the communication process between the electronic device and the server 200 and the server 300, reference may be made to the related descriptions in subsequent embodiments; details are not described here.
  • Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide a wireless communication solution including 2G / 3G / 4G / 5G and the like applied to the electronic device 100.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and so on.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1 and filter, amplify, etc. the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor and convert it to electromagnetic wave radiation through the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be transmitted into a high-frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to a speaker 170A, a receiver 170B, etc.), or displays an image or video through a display screen 194.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110, and may be set in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and so on.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency-modulates and filters electromagnetic wave signals, and transmits the processed signals to the processor 110.
  • the wireless communication module 160 may also receive the signal to be transmitted from the processor 110, frequency-modulate it, amplify it, and convert it to electromagnetic wave radiation through the antenna 2.
  • the antenna 1 of the electronic device 100 and the mobile communication module 150 are coupled, and the antenna 2 and the wireless communication module 160 are coupled so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, and so on.
  • The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS).
  • the electronic device 100 realizes a display function through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing, connecting the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations, and is used for graphics rendering.
  • the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • In this application, the GPU may be used to construct a 3D liner model of the commodity using the size data of the commodity.
  • For the construction process of the liner 3D model of the commodity, reference may be made to the related descriptions in the subsequent embodiments.
  • In this application, the GPU may be used to construct a high-precision 3D body model using the high-precision user data and a low-precision 3D body model using the low-precision user data.
  • In this application, the GPU may be used to match the user's high-precision 3D body model against the 3D liner models corresponding to different sizes of the commodity, to determine the optimal size of the commodity.
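  • As a rough sketch of one possible matching criterion (an assumption for illustration only, not the algorithm claimed by this application), each size's liner model could be compared with the high-precision body data at a few key measurement points, and the tightest size that still fits everywhere could be chosen:

```java
import java.util.List;
import java.util.Map;

// Hypothetical sketch: choose the best commodity size by comparing the liner model of
// each size with the user's high-precision body measurements at a few key points.
public class SizeMatcher {

    public static class LinerModel {
        public final String sizeLabel;                     // e.g. "S", "M", "L"
        public final Map<String, Double> innerDimensions;  // e.g. chest/waist/hip of the liner
        public LinerModel(String sizeLabel, Map<String, Double> innerDimensions) {
            this.sizeLabel = sizeLabel;
            this.innerDimensions = innerDimensions;
        }
    }

    /**
     * Returns the label of the tightest size whose liner is at least as large as the body
     * at every measured point, or null if no size fits at all.
     */
    public static String bestSize(Map<String, Double> highPrecisionBody, List<LinerModel> liners) {
        String best = null;
        double bestSlack = Double.MAX_VALUE;
        for (LinerModel liner : liners) {
            double slack = 0;
            boolean fits = true;
            for (Map.Entry<String, Double> dim : liner.innerDimensions.entrySet()) {
                Double bodyValue = highPrecisionBody.get(dim.getKey());
                if (bodyValue == null) continue;            // no measurement at this point
                double clearance = dim.getValue() - bodyValue;
                if (clearance < 0) { fits = false; break; } // liner too small at this point
                slack += clearance;
            }
            // Among the sizes that fit, keep the one with the least total slack (tightest fit).
            if (fits && slack < bestSlack) {
                bestSlack = slack;
                best = liner.sizeLabel;
            }
        }
        return best;
    }
}
```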
  • the GPU can be used to generate a 3D virtual try-on map.
  • For the process of generating the 3D virtual try-on image, reference may be made to the related descriptions in subsequent embodiments.
  • the GPU may be used to superimpose the 3D accurate model of the commodity and the user's real image to generate an effect picture.
  • For the process of superimposing the 3D accurate model of the commodity and the real image of the user to generate the effect picture, reference may be made to the relevant descriptions of the subsequent embodiments.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • The display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the display screen 194 may be used to display various interfaces of the system output of the electronic device 100.
  • For the various interfaces output by the electronic device 100, reference may be made to the related descriptions in subsequent embodiments.
  • the electronic device 100 can realize a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP processes the data fed back by the camera 193. For example, when taking a picture, the shutter is opened, the light is transmitted to the camera photosensitive element through the lens, and the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, which is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be set in the camera 193.
  • the camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (charge coupled device, CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • the camera 193 is used to obtain a real image of the user.
  • The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the energy at that frequency point.
  • Video codec is used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in various encoding formats, for example: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • NPU is a neural-network (NN) computing processor.
  • the NPU can realize applications such as intelligent recognition of the electronic device 100, such as image recognition, face recognition, voice recognition, and text understanding.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes instructions stored in the internal memory 121 to execute various functional applications and data processing of the electronic device 100.
  • the internal memory 121 may include a storage program area and a storage data area.
  • The storage program area may store an operating system and application programs required by at least one function (such as a sound playback function or an image playback function).
  • the storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100 and the like.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and so on.
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, and an application processor. For example, music playback, recording, etc.
  • the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A may be provided on the display screen 194.
  • the gyro sensor 180B may be used to determine the movement posture of the electronic device 100.
  • the air pressure sensor 180C is used to measure air pressure.
  • the magnetic sensor 180D includes a Hall sensor.
  • the acceleration sensor 180E can detect the magnitude of acceleration of the electronic device 100 in various directions (generally three axes).
  • the distance sensor 180F is used to measure the distance.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the ambient light sensor 180L is used to sense the brightness of ambient light.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the temperature sensor 180J is used to detect the temperature.
  • The touch sensor 180K is also known as a "touch panel".
  • the touch sensor 180K may be provided on the display screen 194, and the touch sensor 180K and the display screen 194 constitute a touch screen, also called a "touch screen”.
  • the touch sensor 180K is used to detect a touch operation acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation may be provided through the display screen 194.
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100, which is different from the location where the display screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the key 190 includes a power-on key, a volume key, and the like.
  • the key 190 may be a mechanical key. It can also be a touch button.
  • the motor 191 may generate a vibration prompt.
  • the indicator 192 may be an indicator light, which may be used to indicate a charging state, a power change, and may also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be inserted into or removed from the SIM card interface 195 to achieve contact and separation with the electronic device 100.
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM cards, Micro SIM cards, SIM cards, etc.
  • the same SIM card interface 195 can insert multiple cards at the same time. The types of the multiple cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 can also be compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to realize functions such as call and data communication.
  • the electronic device 100 uses eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture.
  • the embodiment of the present invention takes an Android system with a layered architecture as an example to exemplarily explain the software structure of the electronic device 100.
  • FIG. 3 is a software block diagram of the electronic device 100 according to an embodiment of the present invention.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor.
  • the layers communicate with each other through a software interface.
  • the Android system is divided into four layers, from top to bottom are the application layer, the application framework layer, the Android runtime and the system library, and the kernel layer.
  • the application layer may include a series of application packages.
  • The application package may include shopping applications (e.g. the Taobao APP, JD.com APP, or Amazon APP), applications for managing user data (e.g. an AR APP), and applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and SMS.
  • The application framework layer provides an application programming interface (API) and a programming framework for applications at the application layer.
  • the application framework layer includes some predefined functions. In this application, the shopping application and the application for managing user data can communicate through the API.
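  • The shape of that API is not prescribed here; purely as an illustrative assumption, the contract between the shopping application and the application for managing user data could look like the following interface (the transport, such as a bound service or content provider, is also left open):

```java
import java.util.Map;

// Hypothetical sketch of the API contract between the shopping application (first
// application) and the application managing user data (second application).
// The actual IPC transport is not specified by this application.
public interface UserDataTryOnApi {

    /** Container for what the second application returns; no high-precision data inside. */
    class FitResult {
        public final String optimalSize;
        public final Map<String, Double> lowPrecisionUserData;
        public FitResult(String optimalSize, Map<String, Double> lowPrecisionUserData) {
            this.optimalSize = optimalSize;
            this.lowPrecisionUserData = lowPrecisionUserData;
        }
    }

    /**
     * The shopping application passes the size data of the selected item; the second
     * application answers with the optimal size and low-precision user data, which the
     * shopping application then combines with its effect data to render the 3D try-on image.
     */
    FitResult requestFit(Map<String, Map<String, Double>> sizeDataPerSize);
}
```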
  • the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and so on.
  • the window manager is used to manage window programs.
  • The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and so on.
  • Content providers are used to store and retrieve data, and make these data accessible to applications.
  • the data may include videos, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
  • the view system includes visual controls, such as controls for displaying text and controls for displaying pictures.
  • the view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface that includes an SMS notification icon may include a view that displays text and a view that displays pictures.
  • the phone manager is used to provide the communication function of the electronic device 100. For example, the management of call status (including connection, hang up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear after a short stay without user interaction.
  • the notification manager is used to notify the completion of downloading, message reminders, etc.
  • The notification manager can also display notifications in the status bar at the top of the system in the form of a chart or scroll-bar text (for example, notifications of applications running in the background), or display notifications on the screen in the form of a dialog window. For example, a text message prompt is shown in the status bar, a prompt sound is emitted, the electronic device vibrates, or the indicator light flashes.
  • Android Runtime includes core library and virtual machine. Android runtime is responsible for the scheduling and management of the Android system.
  • The core library contains two parts: one part is the functions that the Java language needs to call, and the other part is the Android core library.
  • the application layer and the application framework layer run in the virtual machine.
  • The virtual machine executes the Java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
  • The system library may include multiple functional modules, for example a surface manager, a media library, a 3D graphics processing library (for example, OpenGL ES), and a 2D graphics engine (for example, SGL).
  • the surface manager is used to manage the display subsystem and provides a combination of 2D and 3D layers for multiple applications.
  • The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support multiple audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to realize 3D graphics drawing, image rendering, synthesis, and layer processing.
  • a three-dimensional graphics processing library is used to construct a liner 3D model using the size data of the commodity.
  • the three-dimensional graphics processing library may be used to construct high-precision 3D body models using high-precision user data and low-precision 3D body models using low-precision user data.
  • the three-dimensional graphics processing library can be used to match the user's high-precision 3D body model and the 3D liner models corresponding to different sizes of the commodity to determine the optimal size of the commodity.
  • a three-dimensional graphics processing library can be used to generate 3D virtual try-on images.
  • the three-dimensional graphics processing library can be used to superimpose the 3D accurate model of the commodity and the user's real image to generate an effect picture.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least the display driver, camera driver, audio driver, and sensor driver.
  • The electronic device in the present application is configured with a touch display screen (hereinafter referred to as a touch screen), which can be used to receive user operations and to display interface content output by the system of the electronic device.
  • The interface content may include the interface of a running application, a system-level menu, and the like, and may specifically be composed of the following interface elements: input interface elements such as buttons, text input boxes, scroll bars, and menus; and output interface elements such as windows and labels.
  • the interface content may be a 3D try-on screen displayed to the user by the electronic device.
  • the following embodiment will take the electronic device as the mobile phone 100 as an example to describe the method for controlling user data provided by the embodiment of the present application.
  • The method for controlling user data can ensure the security of user data and/or commodity data when realizing 3D virtual try-on, and reduce the risk of user data leakage and of loss to commodity data.
  • In order to better understand the method for controlling user data in the embodiments of the present application, two types of data involved in the present application are first introduced: user data and commodity data.
  • User data includes data that reflects a user's physical characteristics.
  • The data reflecting the user's body characteristics includes, but is not limited to: height, weight, chest circumference, waist circumference, hip circumference, arm width, arm length, arm circumference, thigh length, thigh circumference, calf length, calf circumference, head circumference, the size and position of facial organs (mouth, eyes, nose), ankle circumference, foot length, foot width, arch height, instep height, the length of each toe, and so on.
  • the user data may also include data reflecting the external appearance of the user.
  • the data reflecting the external image of the user includes, but is not limited to: hairstyle, hair color, face shape, skin color, etc.
  • User data may be graded according to a certain strategy.
  • the user data can be graded according to the accuracy reflecting the user's physical characteristics. For example, user data can be divided into high-precision user data and low-precision user data.
  • high-precision user data refers to data that can accurately and comprehensively reflect the details of the user's body parts.
  • For example, high-precision foot data contains details of each part of the foot, such as detailed 3D data of the arch, the position of each toe, the heel, the ankle, and the instep; high-precision whole-body data contains detailed 3D data of the user's shoulders, arms, torso, thighs, and calves.
  • high-precision user data is described and stored in the standard description format of existing 3D models, such as the OBJ file format.
  • Specifically, the high-precision user data may include information such as the set of three-dimensional spatial coordinates of all vertices used to describe the surface characteristics of the user's body and the number of patches, which is used to construct a high-precision 3D model.
  • High-precision user data can be collected in the following two ways:
  • (1) Collected by a scanning device. A scanning device refers to a device that can acquire three-dimensional data of an object by scanning.
  • the scanning device may be a mobile phone or other devices (such as an infrared scanner, a camera with depth measurement information, etc.).
  • The scanning device can collect accurate user data by scanning the user's body. Exemplarily, FIG. 4 illustrates a possible scenario in which an external device scans the user's body and collects high-precision user data.
  • (2) Measured and provided by the user. The user can obtain high-precision user data through manual measurement.
  • the user can measure body weight with a weight scale, measure the dimensions of body parts with a tape measure, and so on.
  • low-precision user data is relative to high-precision user data, which can roughly and simply reflect the user's physical characteristics, but cannot accurately and comprehensively reflect the details of the user's body parts. That is to say, the low-precision user data can reflect the general outline of the user's body, but does not include the exact size of each part of the body.
  • the amount of data contained in low-precision user data is less than the amount of data contained in high-precision user data.
  • For example, low-precision foot data includes contour data of each part of the foot, such as foot length, width, and height, but does not include detailed three-dimensional data, such as arch height, the position of each toe, and ankle shape.
  • the low-precision user data is described and stored in the standard description format of the existing 3D model, such as the OBJ file format.
  • Low-precision user data can be obtained by processing high-precision user data.
  • The low-precision user data can be obtained by blurring the high-precision user data. For example, some data can be deleted from the high-precision user data to obtain low-precision user data.
  • For example, vertices in the high-precision user data whose distance to a neighboring vertex is less than 1 cm can be deleted and the mesh smoothed, so that the hundreds of thousands of patches in the high-precision user data are reduced to fewer than 10,000 patches (a minimal sketch of such decimation is given below).
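  • As a purely illustrative sketch of this kind of decimation (the patent does not prescribe a concrete algorithm), the following code merges vertices that lie within 1 cm of an already-kept vertex and drops the faces that collapse as a result; the function names, data layout, and threshold handling are assumptions. A real implementation would also smooth the result, as described above; only the vertex-merging step is shown here.

```python
from math import dist  # Euclidean distance between two points (Python 3.8+)

def decimate(vertices, faces, min_spacing_cm=1.0):
    """vertices: list of (x, y, z) in cm; faces: list of 3-tuples of vertex indices."""
    kept = []      # indices of vertices that survive decimation
    remap = {}     # original index -> index of the kept vertex it merged into
    for i, v in enumerate(vertices):
        near = next((k for k in kept if dist(v, vertices[k]) < min_spacing_cm), None)
        if near is None:
            kept.append(i)
            remap[i] = i
        else:
            remap[i] = near
    new_index = {orig: n for n, orig in enumerate(kept)}
    new_vertices = [vertices[i] for i in kept]
    new_faces = []
    for a, b, c in faces:
        a, b, c = remap[a], remap[b], remap[c]
        if len({a, b, c}) == 3:                  # drop faces that collapsed to a line or point
            new_faces.append((new_index[a], new_index[b], new_index[c]))
    return new_vertices, new_faces
```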
  • the user data may be stored in the mobile phone 100 or the server 200, which is not limited here.
  • the user data may be stored encrypted in the mobile phone 100 (for example, stored in the encryption chip of the mobile phone 100) or the server 200.
  • Decryption or verification is required when the user data is called, so as to ensure that the user data is stored securely.
  • the user data may be stored in a storage file corresponding to the application installed in the mobile phone 100, or may be separately stored in the mobile phone 100 as a data source.
  • the mobile phone 100 or the server 200 may store user data hierarchically. See Table 1, which shows a possible form of hierarchical storage of user data.
  • User data is stored under the user account, that is, the corresponding user data can be found through the user account.
  • the user account is a character string, which can be one or any combination of the following: letters, numbers, symbols, etc.
  • the user account may be the user's name, mailbox, phone number, etc.
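  • As an illustration only, a hierarchical layout keyed by user account in the spirit of Table 1 and Table 2 might look like the sketch below; the field names and the in-memory dictionary are assumptions, since the patent leaves the concrete storage format (for example, an encrypted store on the mobile phone 100 or on the server 200) open.

```python
from dataclasses import dataclass, field

@dataclass
class UserRecord:
    user_id: str                 # e.g. name, nickname, or phone number of the user
    high_precision: bytes = b""  # detailed body data (e.g. OBJ), strictly controlled
    low_precision: bytes = b""   # blurred body data, less strictly controlled

@dataclass
class AccountStore:
    records: dict = field(default_factory=dict)   # user account -> list of UserRecord

    def add_user(self, account: str, record: UserRecord) -> None:
        self.records.setdefault(account, []).append(record)

    def users_under(self, account: str) -> list:
        return [r.user_id for r in self.records.get(account, [])]

store = AccountStore()
store.add_user("user1@example.com", UserRecord("user 1"))
store.add_user("user1@example.com", UserRecord("user 2"))
print(store.users_under("user1@example.com"))    # ['user 1', 'user 2']
```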
  • the following describes specific operations when the mobile phone 100 or the server 200 stores user data hierarchically.
  • the scanning device can transmit the collected high-precision user data to the mobile phone 100 or the server 200 for storage.
  • The user can directly input the high-precision user data into the mobile phone 100 for storage, or can transmit the high-precision user data to the server 200 for storage through a terminal device (such as the mobile phone 100 or a computer).
  • the low-precision user data may be obtained and stored after the mobile phone 100 or the server 200 processes the high-precision user data.
  • Since high-precision user data can reflect the detailed characteristics of various parts of the user's body and carries a large amount of information, users generally want to ensure the security of high-precision user data for reasons of personal data or privacy protection.
  • Since low-precision user data can only reflect the general characteristics of the user's body and carries a small amount of information, the user's security requirements for low-precision user data are also lower.
  • the permission of other devices or applications to call user data may be set, and only authorized other devices or applications can call high-precision user data or low-precision user data.
  • The conditions for calling high-precision user data are stricter than those for calling low-precision user data.
  • high-precision user data can only be called by applications trusted by the user.
  • User-trusted applications refer to applications that do not leak high-precision user data, or do not use high-precision user data to do things that harm the interests of users.
  • high-precision user data is stored in a mobile phone manufacturer (such as Huawei) or a server 200 that cooperates with a mobile phone manufacturer.
  • the server 200 can provide high-precision user data to applications trusted by users.
  • Applications trusted by the user may include applications developed by the mobile phone manufacturer, such as the manufacturer's augmented reality (AR) application, fast service smart platform, and so on.
  • Because high-precision user data is always called only by applications trusted by the user, it can be ensured that high-precision data is not leaked to other manufacturers or to third-party applications (such as third-party shopping applications like Taobao and JD.com).
  • the application trusted by the user may be set by the user independently, or may be set by the mobile phone 100 or the server 200.
  • Alternatively, the user may be asked for permission each time high-precision user data is to be called, and the high-precision user data is called only after the user's authorization is obtained.
  • low-precision user data can be called by most applications.
  • Applications that can call low-precision user data include not only the above-mentioned user-trusted applications (such as various applications developed by the mobile phone manufacturer), but also third-party applications (such as third-party shopping applications like Taobao and Jingdong).
  • the application that can call the low-precision user data may be set by the user independently, or may be set by the mobile phone 100 or the server 200.
  • user data may also be stored in other forms in this application.
  • For example, user data can also be classified by body part (such as head, upper torso, lower torso, feet, etc.), and high-precision user data and low-precision user data corresponding to each type of body part can be stored separately.
  • User data of multiple users can also be stored under one user account. For example, in addition to the user data of user 1, the user data of user 2 and user 3 may also be included under the same account; user 2 and user 3 may be family members of user 1.
  • the high-precision user data can be used to construct a user's high-precision 3D body model to achieve accurate modeling of the user's body.
  • the process of constructing the user's high-precision 3D body model based on the high-precision user data may include: constructing the high-precision user data into a high-precision 3D body model through modeling tools or algorithms. Understandably, in some embodiments, a high-precision 3D model may also be constructed for each body part separately.
  • For example, a high-precision 3D model of the head can be constructed based on high-precision user data of the head, a high-precision 3D model of the upper torso based on high-precision user data of the upper body, a high-precision 3D model of the lower body based on high-precision user data of the lower body, and a high-precision 3D model of the foot based on high-precision user data of the foot.
  • the low-precision user data can be used to construct a low-precision 3D body model of the user, so as to achieve a simple simulation of the user's body.
  • the process of constructing the user's low-precision 3D body model based on the low-precision user data may include: constructing the low-precision user data into a low-precision 3D body model through a modeling tool or algorithm. Understandably, in some embodiments, similar to the high-precision 3D model described above, a low-precision 3D model may also be constructed for each body part separately.
  • the commodity data includes data reflecting the characteristics of the commodity.
  • the goods in this application refer to items that can be worn on the user.
  • The products in this application include but are not limited to: clothing (such as tops (T-shirts, shirts, coats, etc.) and pants (such as trousers, shorts, etc.)), shoes and boots, and accessories (such as hats, glasses, watches, and ornaments (such as scarves, earrings, etc.)).
  • Commodity characteristics may include the size, material, texture, decoration, etc. of the commodity.
  • For example, the product data of a piece of clothing may include: clothing length, bust, waist, shoulder width, sleeve length, fabric material, lining material, cuff circumference, hem circumference, fabric texture, fabric decoration, fabric elasticity, thickness, cutting method (such as drop shoulder, shoulder pad, etc.), style (tight, slim, loose, etc.), and so on.
  • For example, the product data of a pair of shoes may include: shoe size, sole length, lining height, sole material, toe style (pointed toe, round toe, etc.), upper material, boot height, insole material, upper lining material, heel shape (flat heel, high heel), boot surface material, upper pattern, and so on.
  • the commodity data may be classified according to a certain strategy.
  • product data can be divided into size data and effect data.
  • the size data and the effect data together constitute the commodity data of one commodity.
  • the size data refers to data that can reflect the degree of fitting when the user wears the product.
  • the size data may include: clothing length, bust, waist, shoulder width, sleeve length, cuff circumference, hem circumference, etc.
  • the size data may include: shoe size, sole length, lining height, boot height, etc.
  • each size of a product corresponds to size data, that is, a product may have multiple size data.
  • the effect data refers to data that can reflect the effect when the user wears the product.
  • For clothing, the effect data can include: fabric material, lining material, fabric texture, decoration on the fabric, fabric elasticity, thickness, cutting method (such as drop shoulder, shoulder pad, etc.), style (tight, slim, loose, etc.), and so on.
  • For shoes, the effect data may include: sole material, toe style (pointed toe, round toe, etc.), upper material, insole material, upper lining material, boot surface material, upper pattern, and so on.
  • For the same product, there may be multiple different effect data.
  • multiple colors may be applied to the same product, and each color has corresponding effect data.
  • the same product may correspond to multiple patterns, and each pattern corresponds to effect data.
  • Commodity data can be collected in the following two ways:
  • the commodity data may be stored in the server 300.
  • the server 300 may be a server of a merchant or a shopping platform (for example, Taobao, Jingdong, etc.).
  • the server 300 may store commodity data by category. See Table 3, which shows one possible form of storing commodity data by category.
  • the commodity data is stored under the commodity identifier, that is, the corresponding commodity data can be found through the commodity identifier.
  • the product identification can be the product model number, product description, etc.
  • When the server 300 stores commodity data by category, the commodity data may not be stored in the form of Table 3, but may instead be described and stored in the standard description format of existing 3D models, such as the OBJ file format.
  • The size data reflects the size of each part of the product. In most cases, the merchant discloses the size data to consumers so that they can select a suitable product; therefore, the merchant's security requirements for the size data are lower. The effect data is what mainly distinguishes a product from other products; in order to maintain the competitiveness of their products, merchants have higher security requirements for effect data. To ensure data security, in some embodiments, after the commodity data is stored by category, the permission of other devices or applications to call the commodity data may be set, so that only authorized devices or applications can call the size data or the effect data, and the conditions for calling effect data are stricter than those for calling size data.
  • For example, effect data can only be called by applications trusted by the merchant.
  • An application trusted by a merchant refers to an application that does not disclose effect data, or does not use the effect data to do anything that harms the interests of the merchant.
  • the effect data is stored in the server 300 of the shopping application (for example, Taobao APP), and the server 300 can provide the effect data to the application (for example, Taobao APP) that has a cooperative relationship with the merchant.
  • the effect data is called by an application trusted by the merchant, which can ensure that the effect data is not leaked.
  • the application trusted by the merchant may be set by the merchant or the server 300.
  • the size data can be called by most applications.
  • the applications that can call size data include not only the above-mentioned applications trusted by merchants (such as Taobao APP), but also various applications developed by mobile phone manufacturers or developers who cooperate with mobile phone manufacturers.
  • the application that can call the size data can be set by the merchant or the server 300.
  • The server 300 may store commodity data of multiple commodities.
  • The size data can be used to construct a liner 3D model of the commodity, realizing a simulation of the commodity's inner lining.
  • a product may have multiple size data, therefore, a product can correspond to multiple liner 3D models.
  • the process of constructing the 3D liner model of the commodity according to the size data may include: constructing the size data into the 3D liner model of the commodity through a modeling tool or algorithm.
  • Commodity data can be used to build accurate 3D models of commodities to achieve accurate simulation of commodities.
  • the accurate 3D model of the product is constructed by the size data and the effect data. Any size data and any effect data can be used to form a 3D accurate model of the product. That is, a product may correspond to multiple 3D accurate models.
  • The process of constructing the accurate 3D model of the commodity according to the commodity data may include: constructing the commodity data into an accurate 3D model of the commodity through a modeling tool or an algorithm. Specifically, when the commodity data is described in the standard description format of existing 3D models, the commodity data can be parsed through modeling tools or algorithms, a 3D surface can be generated from the vertex and patch descriptions, and different lighting effects can be rendered on the 3D surface for different materials to obtain an accurate 3D model.
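  • As a hedged illustration of the parsing step just mentioned, the following sketch reads an OBJ-style description (vertex lines "v x y z" and triangular face lines "f i j k") into an in-memory surface; material and lighting handling is omitted, and this is not the patent's actual modeling pipeline.

```python
def parse_obj(text: str):
    """Parse a minimal OBJ-style description into (vertices, faces)."""
    vertices, faces = [], []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":                      # vertex: three coordinates
            vertices.append(tuple(float(x) for x in parts[1:4]))
        elif parts[0] == "f":                    # face: 1-based vertex indices
            faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:4]))
    return vertices, faces

sample = """
v 0.0 0.0 0.0
v 1.0 0.0 0.0
v 0.0 1.0 0.0
f 1 2 3
"""
print(parse_obj(sample))
# ([(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)], [(0, 1, 2)])
```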
  • To summarize, in the embodiments of the present application, user data is divided into high-precision user data and low-precision user data, and commodity data is divided into size data and effect data; high-precision user data is provided only to applications trusted by the user, and the effect data of a commodity is provided only to applications trusted by the merchant.
  • the user-trusted application uses the user's high-precision 3D body model to match the liner 3D model corresponding to multiple sizes of the product to obtain the best size.
  • the application trusted by the merchant uses the 3D accurate model corresponding to the best size and the user's low-precision 3D body model to show the user the 3D try-on effect.
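  • The division of labor just summarized can be sketched as follows; all function names, the single-girth fit score, and the string-based rendering stand-in are illustrative assumptions rather than the patent's interfaces.

```python
def fit_score(body_measure_cm: float, liner_measure_cm: float) -> float:
    # toy stand-in: absolute mismatch between one body girth and the liner girth
    return abs(liner_measure_cm - body_measure_cm)

def trusted_app_pick_best_size(high_precision_waist_cm: float,
                               liner_waist_by_size: dict) -> str:
    """User-trusted application (e.g. the AR APP): the only side that ever
    sees high-precision user data."""
    return min(liner_waist_by_size,
               key=lambda s: fit_score(high_precision_waist_cm, liner_waist_by_size[s]))

def merchant_app_render_try_on(best_size: str, low_precision_outline: str,
                               accurate_model_by_size: dict) -> str:
    """Merchant-trusted application (e.g. the shopping APP): receives only the
    best size and the low-precision body model."""
    return f"try-on of {accurate_model_by_size[best_size]} on {low_precision_outline}"

sizes = {"S": 70.0, "M": 76.0, "L": 82.0}              # liner waist girths, cm
best = trusted_app_pick_best_size(77.5, sizes)          # -> 'M'
print(merchant_app_render_try_on(best, "rough body outline",
                                 {"S": "model-S", "M": "model-M", "L": "model-L"}))
```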
  • Application scenario 1: User 1 selects a desired product through the shopping application on the mobile phone 100 and views the 3D virtual try-on effect in the shopping application.
  • user data is stored in the server 200, and the user data includes high-precision user data and low-precision user data.
  • the commodity data is stored in the server 300, and the commodity data includes size data and effect data.
  • the mobile phone 100 is installed with an application for managing user data and a shopping application.
  • the application for managing user data is an application trusted by the user, that is, the application for managing user data can call high-precision user data and low-precision user data in the server 200.
  • the shopping application is an application trusted by the merchant, that is, the shopping application can call the size data and effect data in the server 300.
  • the application for managing user data in the mobile phone 100 may be pre-installed, or may be installed after the user downloads.
  • the shopping application can be pre-installed or installed after the user downloads it.
  • the application for managing user data and the shopping application can communicate through an application programming interface (application programming interface, API).
  • The user interface 30 may include: a status bar 301, a hideable navigation bar 302, a time and weather widget 303, and icons of multiple applications.
  • the status bar 301 may include the name of the network operator (for example, China Mobile), Wi-Fi icon, signal strength, current remaining power, and so on.
  • the status bar 301 may further include a Bluetooth icon, an alarm clock icon, and the like.
  • the navigation bar 302 may include a return key icon (triangle in the figure), a home screen icon (circle in the figure), and a multitasking key icon (square in the figure).
  • After launching the shopping APP, user 1 can search for or select the products he wants to buy or learn about in the shopping APP. After selecting such a product, user 1 can click the icon or link of the product to enter the product detail interface.
  • the product selected by user 1 is a pair of pants, and the corresponding product detail interface 40 may be as shown in 6a of FIG. 6. The following description refers to the pair of pants selected by user 1 as pants 400.
  • the product detail interface 40 may include: a product display image 401, a product description 402, a return control 403, a shopping cart control 404, an immediate purchase control 405, an AR try-on control 406, and the like.
  • the product display image 401 may include photos or videos of the product or the model wearing the product.
  • the product description 402 is a simple description of the product, which may include price, keywords (such as brand, style, popular elements, etc.), sales volume, and place of delivery.
  • the mobile phone 100 displays a selection box of product parameters (such as size, color, number of pieces, etc.) on the product detail interface 40.
  • The mobile phone 100 adds the pants 400 to the shopping cart corresponding to the account currently logged into the shopping APP.
  • the mobile phone 100 displays a selection box of product parameters (such as size, color, number of pieces, etc.) on the product detail interface 40, and the user jumps to the payment interface after selecting the parameters.
  • the product detail interface 40 may include more content, for example, a service view control, a product parameter selection control, and a product parameter view control.
  • the user 1 can view a more detailed product introduction and user comments through a gesture of swiping up on the product detail interface 40 with a finger. Understandably, 6a in FIG. 6 is only an example. Different shopping apps or different products may have different elements and controls included in the corresponding product detail interface, and the arrangement of each element and control may be different, and no limitation is made here.
  • When user 1 wants to view the 3D try-on effect of the pants 400, he can click the AR try-on control 406 with a finger or a stylus.
  • the following describes a possible data processing process after the user 1 clicks on the AR try-on control 406 with reference to the drawings.
  • FIG. 7 shows a possible data interaction between the mobile phone 100 and the server 200 and the server 300 and the internal data processing process of the mobile phone 100.
  • the data interaction and processing process may include the following steps:
  • Step 1: The mobile phone 100 obtains a 3D model of the liner of the pants 400 through the shopping APP.
  • The server 300 stores the product data corresponding to the pants 400.
  • the commodity data corresponding to the pants 400 may include: size data and effect data of the pants 400.
  • the server 300 and the shopping APP can communicate through the network.
  • the shopping APP can obtain the 3D model of the inner liner of the pants 400 by the following two methods:
  • the shopping APP may obtain the size data of the pants 400 from the server 300, and use the size data of the pants 400 to construct a 3D model of the liner of the pants 400.
  • the mobile phone 100 may request the server 300 to obtain the size data of the pants 400 through the shopping APP when entering the product detail interface 40.
  • the server 300 may verify whether the shopping APP has the right to call the product size data. After verifying that the shopping APP has the right to call the product size data, the server 300 sends the size data of the pants 400 to the shopping APP.
  • The server 300 may verify whether the shopping APP has the right to call the product size data in the following manner: the server 300 may store the identifier of each application that has the right to call the product data; the request sent by the shopping APP to the server 300 carries the identifier of the shopping APP; the server 300 checks whether the stored identifiers include the identifier of the shopping APP, and if so, confirms that the shopping APP has the right to call the size data of the product.
  • the identification of the application may be the name, icon, code, etc. of the application.
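  • A minimal sketch of the identifier check described above is given below; the stored identifier sets and function names are assumptions, and a real server 300 would additionally authenticate the caller rather than trust a self-reported identifier.

```python
AUTHORIZED_FOR_SIZE_DATA = {"shopping_app", "ar_app"}   # identifiers stored on server 300
AUTHORIZED_FOR_EFFECT_DATA = {"shopping_app"}           # stricter set for effect data

def may_call(app_id: str, data_kind: str) -> bool:
    """Return True if the application identified by app_id may call this kind of data."""
    allowed = AUTHORIZED_FOR_EFFECT_DATA if data_kind == "effect" else AUTHORIZED_FOR_SIZE_DATA
    return app_id in allowed

print(may_call("shopping_app", "size"))    # True
print(may_call("ar_app", "effect"))        # False: effect data is more tightly controlled
```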
  • the pants 400 may correspond to multiple size data. That is, the shopping APP can receive the size data corresponding to different sizes of the pants 400, and construct a liner 3D model according to the multiple size data of the pants 400, respectively.
  • the shopping APP can use the computer graphics processing capability of the mobile phone 100 itself to construct a 3D liner model of the pants 400.
  • For the specific steps of the shopping APP using the size data of the pants 400 to construct the liner 3D model, refer to the relevant description about the commodity data in point (2) above; details are not repeated here.
  • the shopping APP may obtain the 3D model of the liner of the pants 400 from the server 300.
  • the server 300 may construct a liner 3D model according to the stored size data of the pants 400. That is, the server 300 may store the 3D model of the liner of the pants 400.
  • the specific steps for the server 300 to construct the 3D model of the liner can be referred to the related description of the commodity data in point (2) above, which will not be repeated here.
  • the mobile phone 100 may request the server 300 to obtain the 3D model of the liner of the pants 400 through the shopping APP when entering the product detail interface 40.
  • the server 300 may verify whether the shopping APP has the right to call the product size data. After verifying that the shopping APP has the right to call the product size data, the server 300 sends the 3D model of the liner of the pants 400 to the shopping APP.
  • For the process of the server 300 verifying whether the shopping APP has the authority to call the product size data, reference may be made to the related description in manner (1) above.
  • The shopping APP can receive the liner 3D models corresponding to the different sizes of the pants 400 sent by the server 300.
  • Step 2: The shopping APP sends the 3D model of the liner of the pants 400 to the application for managing user data, that is, the AR APP.
  • Step 3: The AR APP obtains a high-precision 3D body model and a low-precision 3D body model of the user who currently needs to virtually try on the pants 400.
  • the server 200 stores user data.
  • the user data may include user data of a user, for example, Table 1.
  • the user data may include user data of multiple users, such as Table 2.
  • User data may include: high-precision user data and low-precision user data.
  • the server 200 and the AR APP can communicate through the network.
  • the user who currently needs to virtually try on the pants 400 can be determined in the following ways:
  • the AR APP generally manages user data through a user account, and the user account currently logged in to the mobile phone manufacturer AR APP can correspond to user data of one or more users, see Table 1 and Table 2.
  • In one manner, the mobile phone 100 determines a default user as the user who currently needs to virtually try on the pants 400. For example, if the user data stored under the user account currently logged in to the AR APP is as shown in Table 1, the user who currently needs to virtually try on the pants 400 is user 1.
  • the mobile phone 100 may select one user among the multiple users as the user who currently needs to virtually try on the pants 400 according to a certain policy. For example, the mobile phone 100 may obtain the storage time of multiple user data under the user account, and use the user corresponding to the oldest user data as the user who currently needs to virtually try on the pants 400. For another example, the mobile phone 100 may use the owner (for example, user 1) as the user who currently needs to virtually try on the pants 400. The owner identity of the user 1 may be added by the user 1 when managing user data through the AR APP.
  • user 1 may set a user (for example, user 1's mother) as the default user who needs virtual try-on of pants 400, and mobile phone 100 may determine the user who needs virtual try-on of pants 400 according to the setting (for example, User 1's mother).
  • After the AR APP receives the liner 3D model of the pants 400 sent by the shopping APP, it can obtain, from the AR APP's storage file or from the server 200, the user identifiers of the one or more users under the account currently logged into the AR APP.
  • the user identification may be the user's name, nickname, mobile phone number or email address associated with the user account, or an avatar used to identify the user.
  • the AR APP can display the obtained multiple user IDs under the user account currently logged in to the AR APP, and provide the user with options or controls for selecting the user who currently needs to virtually try on the pants 400.
  • the AR APP may send one or more user IDs obtained under the user account currently logged in to the AR APP to the shopping APP, and the shopping APP provides the user with a choice to currently need to virtually try on the pants 400 User's options or controls.
  • the mobile phone 100 may display a plurality of controls 407 on the product detail interface 40 to select a user who currently needs to virtually try on the pants 400.
  • the multiple controls 407 display multiple user IDs under the user account currently logged into the AR APP.
  • The controls 407 may also be displayed in other ways.
  • the mobile phone 100 can also jump from the product details interface 40 to an interface specifically for selecting a user who currently needs to virtually try on the pants 400, and display multiple controls 407 on the interface.
  • the display interface of the mobile phone 100 can also jump to the main interface of the AR APP, and multiple controls 407 are provided on the main interface of the AR APP.
  • the user 1 can select the control corresponding to the user who currently needs to virtually try on the pants 400 among the multiple controls 407 provided by the mobile phone 100 to determine the user who currently needs to virtually try on the pants 400. As shown in 6b of FIG. 6, user 1 can click the control corresponding to user 1 to determine user 1 as the user who currently needs to virtually try on pants 400. After the user 1 selects the user who currently needs to virtually try on the pants 400, the mobile phone 100 can send the user ID of the user selected by the user 1 to the AR APP.
  • the AR APP can obtain a high-precision 3D body model of the user. Taking the user who currently needs to virtually try on the pants 400 as the user 1 as an example, the process of the AR APP obtaining the high-precision 3D body model of the user 1 will be described in detail below.
  • the AR APP can obtain the high-precision 3D body model of the user who currently needs to virtually try on the pants 400 (ie, user 1), which may include the following two types:
  • The AR APP can obtain the user data of user 1 (including high-precision user data and low-precision user data) from the server 200, use the high-precision user data of user 1 to construct a high-precision 3D body model, and use the low-precision user data of user 1 to build a low-precision 3D body model.
  • the AR APP may request the server 200 to obtain the user data of the user 1.
  • the server 200 may verify whether the AR APP has the authority to call user data, and after verifying that the AR APP has the authority to call user data, send the user data of the user 1 to the AR APP.
  • the process of the server 200 verifying whether the AR APP has the authority to call user data is similar to the process of the server 300 verifying whether the shopping APP has the authority to call product size data, and reference may be made to the related description.
  • the AR APP can utilize the computer graphics processing capability of the mobile phone 100 to construct a high-precision 3D body model and a low-precision 3D body model of user 1.
  • the specific steps of AR APP to construct user 1's high-precision 3D body model and low-precision 3D body model can be referred to the relevant description of user data in point (1) above, which will not be repeated here.
  • the AR APP may obtain the high-precision 3D body model and the low-precision 3D body model of the user 1 from the server 200.
  • the server 200 may construct a high-precision 3D body model and a low-precision 3D body model of the user 1 according to the stored user data.
  • For the process of the server 200 constructing the high-precision 3D body model and the low-precision 3D body model of the user, reference may be made to the related description about user data in point (1) above; details are not repeated here.
  • the AR APP may request the server 200 to obtain the high-precision 3D body model and the low-precision 3D body model of the user 1.
  • the high-precision 3D body model of user 1 reflects the high-precision user data of user 1
  • the low-precision 3D body model of user 1 reflects the low-precision user data of user 1
  • After the server 200 receives the request sent by the AR APP, it can verify whether the AR APP has the right to call user data. After the verification succeeds, the server 200 sends the high-precision 3D body model and the low-precision 3D body model of user 1 to the AR APP.
  • the process of the server 200 verifying whether the AR APP has the authority to call user data is similar to the process of the server 300 verifying whether the shopping APP has the authority to call product size data, and reference may be made to the related description.
  • the present application may also acquire the high-precision 3D body model and the low-precision 3D body model of the user 1 through other methods.
  • the AR APP may obtain the high-precision 3D body model of user 1 from the server 200, and obtain the low-precision 3D body model of user 1 after blurring the high-precision 3D body model of user 1.
  • Step 4: The AR APP matches the high-precision 3D body model of user 1 with the liner 3D models corresponding to the different sizes of the pants 400, and determines the best size among the multiple sizes of the pants 400.
  • The process of matching the high-precision 3D body model of user 1 with the liner 3D model of the pants 400 may specifically include: performing collision detection and pressure-map simulation on the high-precision 3D body model of user 1 and the liner 3D model of the pants 400, to determine whether the user's figure matches that size of the pants.
  • First, the two 3D models are simply superimposed, and collision detection shows whether the clothing interferes with the user's body and how much interference there is.
  • Then, physical effects such as gravity and the elasticity of the clothing material are added to simulate a heat map of the pressure on the user's body after the clothes are worn.
  • If the pressure map of a matched garment is within a specific interval, the corresponding size may be considered the most suitable clothing size for the user.
  • the specific interval can be obtained according to experiments.
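  • The collision/pressure idea can be caricatured as follows; the girth-difference "pressure" and the acceptance interval are placeholders for the pressure-map simulation and the experimentally obtained interval mentioned above, and all names are assumptions.

```python
def interference(body_girths_cm: dict, liner_girths_cm: dict) -> float:
    # total amount (cm) by which the body exceeds the liner over the measured girths
    return sum(max(0.0, body_girths_cm[k] - liner_girths_cm[k]) for k in liner_girths_cm)

def pick_best_size(body: dict, liners_by_size: dict, pressure_interval=(0.0, 2.0)):
    candidates = {}
    for size, liner in liners_by_size.items():
        pressure = interference(body, liner)      # stand-in for the pressure-map simulation
        if pressure_interval[0] <= pressure <= pressure_interval[1]:
            candidates[size] = pressure
    # among acceptable sizes, prefer the snuggest fit (highest pressure in the interval)
    return max(candidates, key=candidates.get) if candidates else None

body = {"waist": 78.0, "hip": 96.0, "thigh": 56.0}
liners = {"M": {"waist": 76.0, "hip": 95.0, "thigh": 55.0},
          "L": {"waist": 82.0, "hip": 100.0, "thigh": 58.0}}
print(pick_best_size(body, liners))   # 'L' (size M exceeds the toy pressure interval)
```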
  • the optimal size determined in step 4 is the size most suitable for the body of the user 1 among the multiple sizes of the pants 400.
  • In step 4, the high-precision 3D body model of user 1 can accurately reflect the details of user 1's body parts; therefore, the best size determined using user 1's high-precision 3D body model is the size most suitable for user 1's body.
  • Step 5: The AR APP sends the determined best size and the low-precision 3D body model of user 1 to the shopping APP.
  • Step 6: The shopping APP obtains a 3D virtual try-on image according to the 3D accurate model corresponding to the best size of the pants 400 and the low-precision 3D body model of user 1.
  • the 3D virtual try-on image may be generated by the shopping APP, or the 3D virtual try-on image may be generated by the server 300 and sent to the shopping APP, as described in detail below.
  • the shopping app uses the 3D accurate model corresponding to the best size of the pants 400 and the low-precision 3D body model of the user 1 to generate a 3D virtual try-on image.
  • The low-precision 3D body model of user 1 is sent to the shopping APP by the AR APP in step 5, and the 3D accurate model corresponding to the best size of the pants 400 is determined by the shopping APP according to the best size sent by the AR APP in step 5.
  • the process of the shopping app determining the 3D accurate model corresponding to the optimal size of the pants 400 is described in detail below.
  • The 3D accurate model of a product is determined by the size data and the effect data. Since the pants 400 may correspond to one or more effect data, there may be one or more 3D accurate models corresponding to the optimal size of the pants 400.
  • For example, the 3D accurate models corresponding to the large size of the product shown in the table include: the 3D accurate model constructed from the size data corresponding to the large size and the effect data corresponding to the large five-pointed star pattern, the 3D accurate model constructed from the size data corresponding to the large size and the effect data corresponding to the small five-pointed star pattern, and the 3D accurate model constructed from the size data corresponding to the large size and the effect data corresponding to the curve pattern.
  • the shopping APP can obtain the 3D accurate model corresponding to the optimal size of the pants 400 by the following two methods:
  • The shopping APP may obtain the size data corresponding to the optimal size of the pants 400 and one or more effect data of the pants 400 from the server 300, and construct the 3D accurate models corresponding to the optimal size of the pants 400.
  • the size data corresponding to the optimal size of the pants 400 may be obtained by the shopping APP from the server 300 in the (1) th method in step 1.
  • The one or more effect data of the pants 400 may be obtained by the shopping APP from the server 300 while obtaining the size data in manner (1) of step 1, or may be obtained from the server 300 after the shopping APP receives the best size sent by the AR APP in step 5.
  • the server 300 may verify whether the shopping APP has the authority to call the effect data of the commodity, and after verifying that the shopping APP has the authority to call the effect data of the commodity, send one or more effect data of the pants 400 to the shopping APP.
  • After the shopping APP obtains the size data corresponding to the optimal size of the pants 400 and one or more effect data of the pants 400 from the server 300, it can use the computer graphics processing capability of the mobile phone 100 to construct one or more 3D accurate models corresponding to the optimal size of the pants 400.
  • For the specific steps of the shopping APP constructing the 3D accurate model corresponding to the optimal size of the pants 400, refer to the relevant description about the commodity data in point (2) above; details are not repeated here.
  • the shopping APP may obtain the 3D accurate model corresponding to the optimal size of the pants 400 from the server 300.
  • the server 300 may generate 3D accurate models of the plurality of pants 400 according to the stored one or more effect data of the pants 400 and the size data corresponding to each size. That is, the server 300 can store 3D accurate models of a plurality of pants 400.
  • For the process of the server 300 constructing the 3D accurate model of the pants 400, reference may be made to the related description of the commodity data in point (2) above; details are not repeated here.
  • the shopping APP may request the server 300 to obtain one or more accurate 3D models corresponding to the optimal size of the pants 400.
  • the 3D accurate model of the pants 400 reflects the size and effect of the pants 400.
  • the server 300 can verify whether the shopping APP has the authority to call the size data and effect data of the pants 400. After verifying that the shopping app has the authority to call the size data and effect data of the pants 400, the server 300 sends the 3D accurate model corresponding to the optimal size of the pants 400 to the shopping app.
  • the shopping APP sends the best size of the pants 400 and the low-precision 3D body model of the user 1 to the server 300.
  • The server 300 uses the 3D accurate model corresponding to the optimal size of the pants 400 and the low-precision 3D body model of user 1 to generate a 3D virtual try-on image.
  • the server 300 sends the generated 3D virtual try-on image to the shopping APP.
  • the server 300 can generate 3D accurate models of a plurality of pants 400 according to the stored one or more effect data of the pants 400 and the size data corresponding to each size. That is, the server 300 can store 3D accurate models of a plurality of pants 400. The server 300 may determine one or more 3D accurate models corresponding to the best size according to the best size sent by the shopping APP, and send the one or more 3D accurate models to the shopping APP.
  • After the shopping APP or the server 300 obtains the 3D accurate model corresponding to the optimal size of the pants 400, the 3D accurate model and the low-precision 3D body model of user 1 are used to generate 3D virtual try-on images. Since there may be one or more 3D accurate models corresponding to the optimal size of the pants 400, one or more 3D virtual try-on images may be generated.
  • the following uses a 3D accurate model corresponding to the optimal size of the pants 400 as an example to describe a process in which the shopping APP or the server 300 uses the 3D accurate model and the low-precision 3D body model of the user 1 to generate a 3D virtual try-on image.
  • The process may specifically include: the shopping APP (or the server 300) calls the low-precision 3D body model of user 1, superimposes the 3D accurate model of the pants 400 onto it, and generates a superimposed try-on effect.
  • the try-on effect can be static or dynamic.
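  • A very rough sketch of the superimposition is given below: the garment mesh is translated so that an assumed anchor point (for example, the waistband centre) coincides with the corresponding anchor on the low-precision body model. Real try-on rendering (draping, lighting, animation) is far more involved and is not specified here; the anchor names and data are assumptions.

```python
def translate(mesh_vertices, offset):
    """Shift every vertex of a mesh by the given (dx, dy, dz) offset."""
    ox, oy, oz = offset
    return [(x + ox, y + oy, z + oz) for (x, y, z) in mesh_vertices]

def superimpose(body_anchor, garment_anchor, garment_vertices):
    """Move the garment so its anchor point coincides with the body's anchor point."""
    offset = tuple(b - g for b, g in zip(body_anchor, garment_anchor))
    return translate(garment_vertices, offset)

body_waist_anchor = (0.0, 100.0, 0.0)          # cm, in the body model's frame
garment_waist_anchor = (0.0, 0.0, 0.0)         # garment modelled around its own origin
garment = [(-20.0, 0.0, 0.0), (20.0, 0.0, 0.0), (0.0, -90.0, 0.0)]
print(superimpose(body_waist_anchor, garment_waist_anchor, garment))
```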
  • After the shopping APP obtains one or more 3D virtual try-on images through manner (1) or (2) above, the one or more 3D virtual try-on images can be displayed, so that the user can view the try-on effect.
  • Specifically, after the shopping APP obtains the 3D virtual try-on image, it can jump from the product detail interface 40 to the 3D virtual try-on interface 50, and the 3D virtual try-on interface 50 includes a 3D virtual try-on image.
  • The size corresponding to the 3D accurate model of the pants 400 in the 3D virtual try-on image is the above-mentioned optimal size, and the effect corresponding to the 3D accurate model of the pants 400 in the image may be any one of the multiple effects of the pants 400.
  • the 3D virtual try-on interface 50 may also include a control 408 for selecting the effect of the pants 400.
  • the control 408 provides the user with options for different patterns of the pants 400.
  • the user 1 can click on the control 408 to select which pattern of the pants 400 he wants to try on.
  • In response to the selection, the mobile phone 100 can determine the pattern of the pants 400 that user 1 currently wants to try on, and replace the 3D virtual try-on image in the 3D virtual try-on interface 50 with the 3D virtual try-on image that matches the pattern selected by the user.
  • The 3D virtual try-on image that matches the pattern selected by the user is generated using the 3D accurate model corresponding to both the best size of the pants 400 and the effect selected by the user (that is, the selected pattern), together with the low-precision 3D body model of user 1.
  • During the above data interaction and processing, a wait icon may be displayed on the interface of the shopping APP.
  • the waiting icon can be a buffer bar, an hourglass icon, or a circle with an arrow.
  • the wait icon can be static or dynamic.
  • the 3D accurate model of the pants 400 is obtained according to the optimal size and matches the user ’s body.
  • the 3D accurate model of the pants 400 realizes an accurate simulation of clothing, that is, the 3D virtual try-on image finally displayed by the shopping APP provides the user with a true try-on effect.
  • In other embodiments, instead of matching the high-precision 3D body model of user 1 with the liner 3D models corresponding to the different sizes of the pants 400, the high-precision user data of user 1 may be matched directly with the size data corresponding to the different sizes of the pants 400 to determine the best size of the pants 400.
  • In this case, the shopping APP in step 1 can obtain the size data corresponding to the different sizes of the pants 400, the AR APP in step 3 can obtain the high-precision user data of user 1, and the AR APP in step 4 can match the high-precision user data of user 1 with the size data corresponding to the different sizes of the pants 400 to determine the optimal size of the pants 400.
  • The process in which the AR APP matches user 1's high-precision user data with the size data corresponding to the different sizes of the pants 400 to determine the optimal size of the pants 400 may include: through a computer vision (CV) algorithm, comparing key size data in the high-precision user data (such as shoulder width, arm length, chest circumference, waist circumference, hip circumference, thigh circumference, calf circumference, and height) with the size data corresponding to the different sizes of the pants 400, so as to obtain the size that best fits the user's body (a simple sketch of such a comparison follows).
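  • The comparison of key measurements against each size's data can be sketched as follows; the distance metric (a plain sum of absolute differences) and the field names are assumptions, and the CV step that extracts the measurements from the 3D data is not shown.

```python
KEY_FIELDS = ("waist", "hip", "thigh", "height")

def mismatch(body: dict, size_chart: dict) -> float:
    # sum of absolute differences over the key measurements, in cm
    return sum(abs(body[f] - size_chart[f]) for f in KEY_FIELDS if f in size_chart)

def best_size(body: dict, charts_by_size: dict) -> str:
    return min(charts_by_size, key=lambda s: mismatch(body, charts_by_size[s]))

body = {"waist": 78, "hip": 96, "thigh": 56, "height": 172}
charts = {"M": {"waist": 74, "hip": 94, "thigh": 54},
          "L": {"waist": 80, "hip": 98, "thigh": 57}}
print(best_size(body, charts))    # 'L' in this toy example
```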
  • Application scenario 1 describes the method for controlling user data according to an embodiment of the present application by taking a user virtually trying on pants as an example. It is understandable that the method is not limited to pants; the method for controlling user data according to the embodiments of the present application may also be applied to scenarios of virtually wearing other commodities. For example, when a user wants to try on tops, shoes, glasses, hats, and other commodities, a method similar to the embodiment in FIG. 7 can also be used to provide a virtual try-on image, thereby protecting user privacy and the interests of merchants.
  • the user data may also be interacted and processed according to the body part of the user when wearing the product.
  • For example, the AR APP can obtain the high-precision 3D body model and the low-precision 3D body model of the upper torso of the user who currently needs to virtually try on clothes.
  • the AR APP can obtain the high-precision 3D body model and the low-precision 3D body model of the user's foot that currently needs to virtually try on the shoes.
  • the AR APP can acquire the high-precision 3D body model and the low-precision 3D body model of the head of the user who currently needs to virtually try on the glasses.
  • In some other embodiments, the shopping APP may not display to the user the 3D virtual try-on image generated from the 3D accurate model of the pants 400 and the low-precision body model of user 1, but may instead display an effect picture formed by superimposing the real image of user 1 and the 3D accurate model of the pants 400. This is explained in detail below.
  • the mobile phone 100 can turn on the camera (front camera or rear camera), and collect the real image of the user 1 through the camera.
  • the real image of user 1 may be a photo or a video.
  • the shopping APP can obtain the effect image formed by superimposing the real image of the user 1 and the 3D accurate model of the pants 400 in the following two ways:
  • the shopping APP may superimpose a 3D accurate model corresponding to the optimal size of the pants 400 and the real image of the user 1 to generate an effect picture.
  • For the way the shopping APP obtains the 3D accurate model corresponding to the optimal size of the pants 400, please refer to the relevant description in point (1) of step 6 above.
  • Alternatively, the mobile phone 100 can send the collected real image of user 1 to the server 300; after the server 300 superimposes the 3D accurate model corresponding to the optimal size of the pants 400 and the real image of user 1 to generate an effect picture, it sends the effect picture to the shopping APP.
  • The process in which the shopping APP or the server 300 superimposes the 3D accurate model corresponding to the optimal size of the pants 400 and the real image of user 1 to generate an effect picture may specifically include: analyzing the bone point information in the real image of user 1 and, according to the results of bone point recognition, superimposing the 3D accurate model of the pants 400, to which the bone points are bound, onto the corresponding positions of the human bone points in the video stream, thereby showing the try-on effect (a simplified sketch follows).
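  • The bone-point binding can be sketched as follows; the bone-point detector itself is assumed to exist (any pose-estimation method could supply the per-frame bone positions), and only the binding and overlay bookkeeping is shown. All names and offsets are illustrative assumptions.

```python
def overlay_on_frame(frame_bones: dict, garment_bindings: dict) -> dict:
    """frame_bones: bone name -> (x, y) pixel position detected in this frame.
    garment_bindings: bone name -> (dx, dy) offset of a garment anchor relative
    to that bone. Returns where to draw each bound garment anchor in this frame."""
    placed = {}
    for bone, (dx, dy) in garment_bindings.items():
        if bone in frame_bones:                   # bone visible in this frame
            x, y = frame_bones[bone]
            placed[bone] = (x + dx, y + dy)
    return placed

frame = {"left_hip": (210, 480), "right_hip": (310, 478)}
bindings = {"left_hip": (-15, 0), "right_hip": (15, 0)}   # e.g. waistband corners
print(overlay_on_frame(frame, bindings))   # positions at which to draw the garment
```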
  • When the 3D accurate model of the product and the user's real image are superimposed in the above manner to generate an effect picture, the 3D accurate model of the pants 400 can follow the user's body as it moves, giving the user a good virtual try-on experience.
  • The above application scenario takes the case where the shopping application and the AR application are installed in the electronic device as an example to describe the method for controlling user data in the embodiments of the present application.
  • When the shopping application and the AR application exchange data, they can also do so through a shared data interface or a shared data storage location.
  • the function of the shopping application in application scenario 1 may also be implemented by a webpage accessed by a browser installed in the electronic device, for example, a webpage of a shopping website in the browser.
  • the function of the AR application may also be a system function or service integrated on the electronic device, and may not necessarily be implemented through the application.
  • Application scenario 2: User 1 selects the commodity to be purchased through the application for managing user data on the mobile phone 100, and views the 3D virtual try-on effect in the shopping application on the mobile phone 100.
  • the user data in the application scenario 2 is stored in the server 200, and the commodity data is stored in the server 300. Refer to the related description in the application scenario 1.
  • the mobile phone 100 is installed with an application for managing user data and a shopping application.
  • the difference between the application scenario 2 and the application scenario 1 is that the user 1 does not select the commodity to be purchased in the shopping application, but selects the commodity to be purchased in the application for managing user data.
  • The following description uses the AR APP as an example of the application for managing user data.
  • the AR APP can also search the content of the applications installed on the mobile phone 100.
  • the AR APP can communicate with a fast service smart platform (for example, HUAWEI capability (HAG)) through the network, and realize content search for applications installed on the mobile phone 100 through the fast service smart platform.
  • the fast service smart platform is a service distribution platform and can also be regarded as a server.
  • the fast service smart platform can analyze user needs and distribute user needs to service partners that match user needs.
  • the service partner can return the search results based on the user's demand to the AR APP.
  • the service partner may be an application installed in the mobile phone 100, such as a shopping APP.
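  • The routing role of the fast service smart platform can be illustrated with the toy mapping below; the keyword-to-category table and app lists are made-up examples, not the platform's actual behavior.

```python
CATEGORY_OF_KEYWORD = {"clothing": "shopping",
                       "hawking": "encyclopedia",
                       "a brief history of time": "reading"}
APPS_BY_CATEGORY = {"shopping": ["shopping APP"],
                    "encyclopedia": ["Wikipedia", "Baidu Encyclopedia"],
                    "reading": ["WeChat Reading"]}

def route(keyword: str) -> list:
    """Naive 'need analysis': map the keyword to a category, then to the
    installed applications that can serve that need."""
    category = CATEGORY_OF_KEYWORD.get(keyword.lower())
    return APPS_BY_CATEGORY.get(category, [])

print(route("clothing"))    # ['shopping APP']: the keyword would be forwarded to these apps
```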
  • the data interaction and processing process may include the following steps:
  • Step 1: User 1 enters keywords in the AR APP, and the AR APP sends the received keywords to the fast service smart platform.
  • When the user wants to view the 3D try-on effect of a product, he can first open the AR APP on the mobile phone 100 and enter the keyword of the product in the AR APP.
  • the user 1 can click the icon 305 of the AR APP in the user interface 30 with a finger or a stylus, and in response to the click operation, the mobile phone 100 displays the main interface of the AR APP.
  • FIG. 9a shows a main interface 80 of a possible AR APP.
  • the main interface 80 of the AR APP may include: a search box 801.
  • the main interface 80 may also include applications recommended by AR APP, search history, and the like.
  • the user 1 can input the keyword of the commodity to be searched in the search box 801. For example, referring to 9b of FIG. 9, when the user 1 wants to purchase clothing, he can click the search box 801 and input keywords through the virtual keyboard. After user 1 enters the keyword, he can click the search control in the virtual keyboard. After detecting the click operation of the search control in the virtual keyboard by the user 1, the mobile phone 100 sends the keyword “clothing” input by the user 1 to the fast service smart platform through the AR APP.
  • Step 2: The fast service smart platform analyzes user needs according to the keywords and sends the keywords to the shopping APP that matches the needs of user 1.
  • The fast service smart platform can analyze user needs according to the keywords, determine which of the applications installed in the mobile phone 100 match the user's needs, and send the keywords to those applications.
  • For example, the fast service smart platform may determine that the user wants to see an introduction to Hawking, in which case the applications that match the user's needs can be Wikipedia, Baidu Encyclopedia, and other encyclopedia applications; or it may determine that the user wants to read Hawking's book "A Brief History of Time", in which case the application that matches the user's needs can be a reading application such as WeChat Reading.
  • In step 2, the keyword received by the fast service smart platform is "clothing", so the fast service smart platform can determine that the user wants to buy clothing, and the application that matches the user's needs can be a shopping application, such as the shopping APP.
  • Step 3: The shopping APP searches according to the keywords and sends the search results to the AR APP.
  • the shopping APP may send keywords to the server 300, obtain search results from the server 300, and send the search results to the AR APP.
  • Step 4: The AR APP displays the search results sent by the shopping APP.
  • User 1 selects a piece of clothing that he wants to buy in the search results.
  • Step 5: The AR APP sends the identification of the garment selected by user 1 to the shopping APP, and the mobile phone 100 jumps to the shopping APP and displays the product detail interface of the garment selected by user 1.
  • the clothing selected by the user is referred to as pants 400.
  • the identification of the clothing may be the style or text description of the clothing.
  • In other words, the AR APP provides multiple hyperlinks, and each hyperlink is connected to the product detail interface of a garment in the shopping APP; when the user selects a piece of clothing, this is equivalent to jumping to the corresponding product detail interface in the shopping APP through the hyperlink.
  • the search result includes images and text descriptions corresponding to multiple products found by the shopping app based on keywords.
  • the user 1 can click on the image or text introduction of the pants 400 in the search result of the interface 90.
  • the mobile phone 100 jumps to the shopping APP and displays the product details interface of the pants 400 in the shopping APP.
  • the product detail interface may be as shown in 6a of FIG. 6, and reference may be made to related descriptions.
  • the mobile phone 100 obtains the 3D liner model of the pants 400 through the shopping APP.
  • the shopping APP sends the 3D liner model of the pants 400 to the application for managing user data, that is, the AR APP.
  • the AR APP obtains the high-precision 3D body model and the low-precision 3D body model of the user who currently needs to virtually try on the pants 400.
  • the AR APP matches the high-precision 3D body model of the user 1 with the 3D liner models corresponding to the different sizes of the pants 400, and determines the best size among the multiple sizes of the pants 400.
  • the AR APP sends the determined best size and the low-precision 3D body model of the user 1 to the shopping APP.
  • the shopping APP obtains a 3D virtual try-on image according to the accurate 3D model of the pants 400 corresponding to the best size and the low-precision 3D body model of the user 1.
  • steps 6-11 are the same as steps 1-6 in the embodiment of FIG. 7, and reference may be made to the related description of the embodiment of FIG. 7, which will not be repeated here. A simplified sketch of the data split in these steps is given below.
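As a rough illustration of how the precision split in steps 6-11 can work, the Python sketch below keeps the high-precision body data inside the size-selection step and passes only the chosen size and the low-precision outline onward. Representing the models as plain measurement dictionaries and the function names choose_best_size and render_try_on are simplifying assumptions; the actual matching in this application operates on full 3D models.

```python
# Sketch of the data split: only choose_best_size() sees the high-precision body
# data; the rendering step receives just the best size and the low-precision
# outline. Models are mocked as measurement dictionaries (values in cm).

def choose_best_size(high_precision_body: dict, liners_by_size: dict) -> str:
    """Pick the smallest-slack size whose liner is not tighter than the body anywhere."""
    def fits(liner):
        return all(liner[k] >= high_precision_body[k] for k in liner)
    def slack(liner):
        return sum(liner[k] - high_precision_body[k] for k in liner)
    candidates = {size: liner for size, liner in liners_by_size.items() if fits(liner)}
    return min(candidates, key=lambda size: slack(candidates[size]))

def render_try_on(best_size: str, low_precision_body: dict, accurate_models: dict) -> str:
    """Stand-in for the shopping APP's rendering of the 3D try-on image."""
    model = accurate_models[best_size]
    return f"try-on image: {model['effect']} pants, size {best_size}, body {low_precision_body}"

body_hi = {"waist": 74.0, "hip": 96.0, "thigh": 56.0}   # visible only to the AR APP
body_lo = {"height": 170, "build": "slim"}              # shareable body outline
liners = {"S": {"waist": 72.0, "hip": 94.0, "thigh": 55.0},
          "M": {"waist": 76.0, "hip": 98.0, "thigh": 58.0},
          "L": {"waist": 82.0, "hip": 104.0, "thigh": 62.0}}
accurate = {"M": {"effect": "star print"}, "L": {"effect": "star print"}}

best = choose_best_size(body_hi, liners)                # -> "M"
print(render_try_on(best, body_lo, accurate))
```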
  • in the foregoing embodiments, the user data is stored in the server 200.
  • in some embodiments, the user data may be stored in the mobile phone 100 instead of the server 200.
  • the user data may be stored in a storage file corresponding to the application (for example, the AR APP) that is installed on the mobile phone 100 and used for managing user data, or may be separately stored in the mobile phone 100 as a data source.
  • when the user data is stored in the mobile phone 100, the application for managing the user data may obtain, based on the locally stored user data, the high-precision 3D body model and the low-precision 3D body model of the user who currently needs to virtually try on the pants 400. A minimal sketch of such tiered local storage is given below.
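A minimal sketch, assuming made-up application names and a simple in-memory store, of keeping locally stored user data in two precision tiers and releasing the high-precision tier only to a trusted application. Real devices would add encryption (for example, in a secure element) and per-call user confirmation.

```python
# Two-tier local user-data store; only trusted apps may read the "high" tier.
class LocalUserDataStore:
    TRUSTED_APPS = {"AR APP"}                       # assumed trust list

    def __init__(self):
        self._data = {}                             # user_id -> {"high": ..., "low": ...}

    def put(self, user_id: str, high_precision: dict, low_precision: dict) -> None:
        self._data[user_id] = {"high": high_precision, "low": low_precision}

    def get(self, user_id: str, requesting_app: str, precision: str) -> dict:
        if precision == "high" and requesting_app not in self.TRUSTED_APPS:
            raise PermissionError(f"{requesting_app} may not read high-precision data")
        return self._data[user_id][precision]

store = LocalUserDataStore()
store.put("user1", {"waist": 74.0, "hip": 96.0}, {"height": 170})
print(store.get("user1", "AR APP", "high"))         # allowed
print(store.get("user1", "shopping APP", "low"))    # allowed
# store.get("user1", "shopping APP", "high")        # would raise PermissionError
```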
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a dedicated computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, coaxial cable, optical fiber, or digital subscriber line) or wireless (for example, infrared, radio, or microwave) manner.
  • the computer-readable storage medium may be any available medium accessible by a computer, or a data storage device, such as a server or a data center, that integrates one or more available media.
  • the available media may be magnetic media (for example, a floppy disk, a hard disk, or a magnetic tape), optical media (for example, a DVD), or semiconductor media (for example, a solid state disk).

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method for controlling user data and a related apparatus. High-precision user data is obtained by a second application in an electronic device, and only the second application processes the high-precision user data. When a first application needs user data, the second application provides the first application with the processed data and with low-precision user data. Because only the second application obtains and processes the high-precision user data, the method prevents the high-precision user data from being disclosed to the first application during 3D virtual try-on, reducing the risk of leaking the high-precision user data.

Description

控制用户数据的方法及相关装置
本申请要求于2018年11月13日提交中国专利局、申请号为201811347521.9、发明名称为“控制用户数据的方法及相关装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及控制用户数据以及终端领域，特别涉及控制用户数据的方法及相关装置。
背景技术
随着网络技术的发展,网络购物日趋成熟,人们越来越多地在网上购买所需要的商品。在网购时,用户不能真实地看到或触摸到商品。
用户在购买衣物类商品时,通常有试穿需求,想要通过试穿来直观地了解实际的穿着效果。为了满足用户的试穿需求,提高用户的网购体验,提出了一种虚拟试穿技术。通过虚拟试穿,用户不必真实地更换衣服,就可以看到试穿新衣服的效果。
虚拟试穿主要包括二维(2D)虚拟试穿和三维(3D)虚拟试穿,下面简单介绍。
参见图1左侧附图,2D虚拟试穿通过拍摄、绘制等方式获得服装的2D图像,通过摄像头获取用户的视频流,通过计算机图形处理将2D的衣物图像贴上视频流中的用户身体。由于衣物图像是2D的,缺乏立体感,因此2D虚拟试穿不能全面反映用户穿着衣物的效果。例如,当用户侧身、转身时无法呈现对应的试穿效果。
参见图1右侧附图,3D虚拟试穿是指利用三维建模技术,生成用户人体以及服装的三维模型,再借助于三维几何变形或布料物理形变模拟等算法,在三维场景中模拟出人体模型的穿衣效果。由于模特和衣物都是三维的,3D虚拟试穿能够充分地展示用户的试穿效果。
上述可知,3D虚拟试穿能够为用户提供真实的试穿效果,可以更好地满足用户的试穿需求,是未来重点发展的方向。
从图1右侧附图可知,为了实现3D虚拟试穿,一般是虚拟试穿应用(application,APP)同时获取用户身体的3D模型和衣物的3D模型,以展示人体模型的穿衣效果。但是,用户对于个人数据或个人隐私的保护意识越来越强,可能会拒绝将个人身体尺寸数据(例如敏感部位的身体数据)提供给虚拟试穿应用,使得虚拟试穿难以实现。同样的,服装厂商或电商平台考虑到知识产权保护、产品竞争力等因素,对商品的数据也有保密要求,可能会拒绝将重要的商品数据提供给虚拟试穿应用,使得虚拟试穿难以实现。
因此,如何在3D虚拟试穿过程中保护用户数据或商品数据,是当前亟需解决的问题。
发明内容
本申请提供了控制用户数据的方法及相关装置,可以降低高精度用户数据泄露的风险,保证高精度用户数据的安全。
第一方面,本申请提供了一种控制用户数据的方法,应用于电子设备,该电子设备安装有第一应用和第二应用,该方法包括:第二应用获取用户的高精度用户数据和低精度用户数据;第二应用获取第一应用提供的第一物品的尺寸数据;其中,尺寸数据反映第一物品的尺寸;第二应用根据尺寸数据和高精度用户数据确定第一物品的最佳尺码,最佳尺码和用户的 身体相匹配;高精度用户数据反映用户的身体细节,低精度用户数据反映用户的身体轮廓;第二应用向第一应用提供最佳尺码和低精度用户数据以生成3D试穿图。
实施第一方面的方法,将用户数据分为高精度用户数据和低精度用户数据,由第二应用获取并处理高精度用户数据,第二应用仅提供低精度用户数据给第一应用,就能生成3D试穿图,降低了高精度用户数据泄露的风险,满足用户对于隐私的保护需求。
本申请中,高精度用户数据几乎能反映用户身体各个部位的细节特征,信息量较大,用户一般希望能保证高精度用户数据的安全。而低精度用户数据仅能反映用户身体的大体特征,信息量较小,用户对于低精度用户数据的安全要求也较低。因此,高精度用户数据的调用比低精度用户数据的调用更加严格。高精度用户数据仅提供给用户信任的应用,低精度用户数据可提供给大部分应用(包括用户信任的应用)。
结合第一方面,在一些实施例中,第二应用为手机厂商开发的应用,第一应用为第三方应用。例如,第一应用可以为淘宝、京东等购物应用,第二应用可以为手机厂商开发的AR应用。
结合第一方面,第二应用获取用户的高精度用户数据和低精度用户数据的方式可包括以下两种:
(1)在一些实施例中,第二应用向第二服务器发送第一获取请求,并接收第二服务器发送的用户的高精度用户数据和低精度用户数据。
可选的,在第(1)种方式中,第一获取请求中可携带第二应用的标识信息;用户的高精度用户数据和低精度用户数据是第二应用的标识信息通过验证之后,第二服务器发送的。第二服务器验证第二应用的标识信息的过程,即第二服务器验证第二应用是否为用户信任的应用的过程,该过程可包括:第二服务器查看预先存储的应用标识中是否包括第二应用的标识,若是,则确认第二应用为用户信任的应用,即通过验证。这里,第二服务器预先存储的应用标识对应的应用都为用户信任的应用,可由第二服务器或用户进行设置。
(2)在一些实施例中,第二应用接收第二服务器发送的,或接收用户录入,或读取电子设备检测的用户的高精度用户数据;第二应用根据用户的高精度用户数据计算得到用户的低精度用户数据。
可选的,低精度用户数据可以由高精度用户数据经过模糊化处理后得到。例如,可以在高精度用户数据中删除部分数据,得到低精度用户数据。
结合第一方面,在一些实施例中,第二应用获取第一应用提供的第一物品的尺寸数据之前,该方法还包括:电子设备显示第一应用的界面,该界面包括第一物品;电子设备接收用户选择第一物品的操作。
结合第一方面,在一些实施例中,该3D试穿图是根据最佳尺码、低精度用户数据和第一物品的效果数据生成的;第一物品的效果数据由第一应用提供,效果数据反映第一物品的细节。也就是说,3D试穿图是根据最佳尺码、低精度用户数据和第一物品的效果数据生成的,该3D试穿图可以反映出用户试穿第一物品时的合身程度以及第一物品的细节。
具体的,本申请中的物品可以为商品。举例来说,本申请中的物品可以为衣服、裤子、帽子、眼镜等。在本申请的一些实施例中,将物品数据分为尺寸数据和效果数据。同一款物品可以有多个尺寸数据,例如不同尺码分别对应的尺寸数据。同一款物品也可以有多个效果数据,例如不同花纹或不同颜色分别对应的效果数据。
尺寸数据反映商品各个部分的大小,大多数情况下会将其公开给消费者,让消费者挑选 合适的商品。而效果数据是一款物品和其他物品的主要区别点,为了提高产品的竞争力,对效果数据的安全要求较高。因此,效果数据的调用比尺码数据的调用更加严格。效果数据仅提供给物品提供方(例如商家)信任的应用,尺寸数据可提供给大部分应用(包括物品提供方信任的应用)。
在一些实施例中,第一应用可通过以下方式获取第一物品的尺寸数据和效果数据:第一应用向第一服务器发送第二获取请求,第二获取请求中携带第一应用的标识信息;第一应用的标识信息通过验证之后,接收第一服务器发送的第一物品的尺寸数据和效果数据。第一服务器验证第一应用的标识信息的过程,即第一服务器验证第一应用是否为物品提供方信任的应用的过程,该过程可包括:第一服务器查看预先存储的应用标识中是否包括第一应用的标识,若是,则确认第一应用为物品提供方信任的应用,即通过验证。这里,第一服务器预先存储的应用标识对应的应用都为物品提供方信任的应用,可由第一服务器或用户进行设置。
结合第一方面,在一些实施例中,该方法还可包括:电子设备在第一应用的界面显示生成的3D试穿图。用户可以通过电子设备显示的3D试穿图,查看自己试穿第一物品的效果,可以为用户带来良好的购物体验。
第二方面,本申请提供一种电子设备,包括存储器和处理器,存储器存储有至少一个程序,至少一个程序包括第一应用和第二应用,处理器用于运行第二应用以使电子设备执行:
获取用户的高精度用户数据和低精度用户数据;获取第一应用提供的第一物品的尺寸数据;其中,尺寸数据反映第一物品的尺寸;根据尺寸数据和高精度用户数据确定第一物品的最佳尺码,最佳尺码和用户的身体相匹配;高精度用户数据反映用户的身体细节,低精度用户数据反映用户的身体轮廓;向第一应用提供最佳尺码和低精度用户数据以生成3D试穿图。
结合第二方面,在一些实施例中,第二应用为手机厂商开发的应用,第一应用为第三方应用。例如,第一应用可以为淘宝、京东等购物应用,第二应用可以为手机厂商开发的AR应用。
结合第二方面,处理器运行第二应用以使电子设备获取用户的高精度用户数据和低精度用户数据的方式可包括以下两种:
(1)在一些实施例中,处理器运行第二应用以使电子设备向第二服务器发送第一获取请求,并接收第二服务器发送的用户的高精度用户数据和低精度用户数据。
可选的,在第(1)种方式中,第一获取请求中可携带第二应用的标识信息;用户的高精度用户数据和低精度用户数据是第二应用的标识信息通过验证之后,第二服务器发送的。第二服务器验证第二应用的标识信息的过程,即第二服务器验证第二应用是否为用户信任的应用的过程,该过程可包括:第二服务器查看预先存储的应用标识中是否包括第二应用的标识,若是,则确认第二应用为用户信任的应用,即通过验证。这里,第二服务器预先存储的应用标识对应的应用都为用户信任的应用,可由第二服务器或用户进行设置。
(2)在一些实施例中,处理器运行第二应用以使电子设备接收第二服务器发送的,或接收用户录入,或读取电子设备检测的用户的高精度用户数据;第二应用根据用户的高精度用户数据计算得到用户的低精度用户数据。
可选的,低精度用户数据可以由高精度用户数据经过模糊化处理后得到。例如,可以在高精度用户数据中删除部分数据,得到低精度用户数据。
结合第二方面,在一些实施例中,该电子设备还包括触摸屏,处理器运行第二应用以使电子设备在获取第一应用提供的第一物品的尺寸数据之前,处理器还用于运行第一应用以使 电子设备执行:触摸屏显示第一应用的界面,该界面包括第一物品;触摸屏接收用户选择第一物品的操作。
结合第二方面,在一些实施例中,3D试穿图是根据最佳尺码、低精度用户数据和第一物品的效果数据生成的;第一物品的效果数据由第一应用提供,该效果数据反映第一物品的细节。也就是说,3D试穿图是根据最佳尺码、低精度用户数据和第一物品的效果数据生成的,该3D试穿图可以反映出用户试穿第一物品时的合身程度以及第一物品的细节。
在一些实施例中,处理器运行第二应用以使电子设备获取第一应用提供的第一物品的尺寸数据之前,处理器还用于运行第一应用以使电子设备执行:向第一服务器发送第二获取请求,第二获取请求中携带第一应用的标识信息;第一应用的标识信息通过验证之后,接收第一服务器发送的第一物品的尺寸数据和效果数据。
结合第二方面,在一些实施例中,电子设备还包括触摸屏,处理器还用于运行第一应用以使电子设备执行:触摸屏在第一应用的界面显示生成的3D试穿图。
第三方面,本申请提供一种控制用户数据的方法,应用于第二服务器,该方法包括:第二服务器接收第二应用发送的第一获取请求,第二应用安装于电子设备中;第二服务器向第二应用发送用户的高精度用户数据和低精度用户数据。
结合第三方面,在一些实施例中,第一获取请求中携带第二应用的标识信息,第二服务器在第二应用的标识信息通过验证之后,向第二应用发送用户的高精度用户数据和低精度用户数据。
第四方面,本申请提供一种第二服务器,包括:一个或多个处理器、一个或多个存储器;一个或多个存储器与一个或多个处理器耦合,一个或多个存储器用于存储计算机程序代码,该计算机程序代码包括计算机指令,当一个或多个处理器执行计算机指令时,该电子设备执行如第三方面提供的控制用户数据的方法。
第五方面,本申请提供一种控制用户数据的方法,应用于第一服务器,该方法包括:第一服务器接收第一应用发送的第二获取请求,第一应用安装于电子设备中;第一服务器向第一应用发送第一物品的尺寸数据和效果数据。
结合第五方面,在一些实施例中,第二获取请求中携带第一应用的标识信息,第一服务器在第一应用的标识信息通过验证之后,向第一应用发送第一物品的尺寸数据和效果数据。
第六方面,本申请提供一种第一服务器,包括:一个或多个处理器、一个或多个存储器;一个或多个存储器与一个或多个处理器耦合,一个或多个存储器用于存储计算机程序代码,该计算机程序代码包括计算机指令,当一个或多个处理器执行计算机指令时,该电子设备执行如第五方面提供的控制用户数据的方法。
第七方面,本申请提供一种计算机存储介质,包括计算机指令,当该计算机指令在电子设备上运行时,使得该电子设备执行如第一方面描述的控制用户数据的方法。
第八方面,本申请提供一种包含指令的计算机程序产品,当其在计算机上运行时,使得计算机执行上述第一方面描述的控制用户数据的方法。
实施本申请,无需将高精度用户数据提供给第一应用,将高精度用户数据提供给第二应用,就可以实现3D虚拟试穿。本申请可以在3D虚拟试穿过程中,保证高精度用户数据仅被用户信任的应用调用,避免高精度用户数据被泄露给用户不信任的应用,降低了高精度用户数据泄露的风险。
附图说明
图1是现有技术中的试穿技术的示意图;
图2为本申请提供的电子设备的结构示意图;
图3为本申请提供的电子设备的软件结构框图;
图4为本申请提供的扫描获取用户数据的场景示意图;
图5-图6为本申请提供的人机交互示意图;
图7为本申请提供的控制用户数据的方法的流程示意图;
图8为本申请提供的另一种控制用户数据的方法的流程示意图;
图9为本申请提供的人机交互示意图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。
其中,在本申请实施例的描述中,除非另有说明,“/”表示或的意思,例如,A/B可以表示A或B;本文中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。另外,在本申请实施例的描述中,“多个”是指两个或多于两个。
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。在本申请实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
本申请的控制用户数据的方法应用于电子设备。本申请对提及的电子设备的类型不做具体限定,电子设备可以为手机、平板电脑、个人数字助理(personal digital assistant,PDA)、可穿戴设备、膝上型计算机(laptop)等便携式电子设备,也可以为电子穿衣镜、台式计算机等非便携式电子设备。电子设备的示例包括但不限于搭载iOS、android、microsoft或者其他操作系统的电子设备。
首先介绍本申请中电子设备的结构。参见图2,图2示出了电子设备100的结构示意图。
电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本发明实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器 (application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。存储器存储的指令用于电子设备100执行本申请实施例中的控制用户数据的方法。在本申请的一些实施例中,存储器存储的数据可包括用户数据,该用户数据包括高精度用户数据和低精度用户数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(derail clock line,SCL)。在一些实施例中,处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器180K,充电器,闪光灯,摄像头193等。例如:处理器110可以通过I2C接口耦合触摸传感器180K,使处理器110与触摸传感器180K通过I2C总线接口通信,实现电子设备100的触摸功能。
I2S接口可以用于音频通信。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。
UART接口是一种通用串行数据总线,用于异步通信。
MIPI接口可以被用于连接处理器110与显示屏194,摄像头193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和摄像头193通过CSI接口通信,实现电子设备100的拍摄功能。处理器110和显示屏194通过DSI接口通信,实现电子设备100的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头193,显示屏194,无线通信模块160,音频模块170,传感器模块180等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为电子设备100充电,也可以用于电子设备100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本发明实施例示意的各模块间的接口连接关系,只是示意性说明,并不 构成对电子设备100的结构限定。在本申请另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
在一些实施例中,电子设备100可以使用无线通信功能和其他设备通信。例如,电子设备100可以和服务器200通信,获取服务器200存储的用户数据或者用户的高精度3D身体模型、低精度3D身体模型等。又例如,电子设备100可以和服务器300通信,获取服务器300存储的商品数据或者商品的3D精确模型、3D内胆模型等。其中,电子设备和服务器200、服务器300的通信过程可参照后续实施例的相关描述,在此暂不赘述。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple  access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
在一些实施例中,GPU可用于使用商品的尺寸数据构建内胆3D模型。例如,GPU可使用商品的尺寸数据构建商品的内胆3D模型。商品的内胆3D模型的构建过程可参考后续实施例的相关描述。
在一些实施例中,GPU可用于使用高精度用户数据构建高精度3D身体模型,使用低精度用户数据构建低精度3D身体模型。高精度3D身体模型、低精度3D身体模型的构建过程可参考后续实施例的相关描述。
在一些实施例中,GPU可用于将用户的高精度3D身体模型和商品的不同尺码分别对应的3D内胆模型进行匹配,确定商品的最佳尺码。商品的最佳尺码的确定过程可参考后续实施例的相关描述。
在一些实施例中,GPU可用于生成3D虚拟试穿图。例如,3D虚拟试穿图的生成过程可参考后续实施例的相关描述。
在一些实施例中,GPU可用于将商品的3D精确模型和用户的真实图像叠加,生成效果图。商品的3D精确模型和用户的真实图像进行叠加生成效果图的过程,可参考后续实施例的相关描述。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
在一些实施例中,显示屏194可用于显示电子设备100的系统输出的各个界面。电子设备100输出的各个界面可参考后续实施例的相关描述。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体 (complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。在一些实施例中,摄像头193用于获取用户的真实图像。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行电子设备100的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。陀螺仪传感器180B可以用于确定电子设备100的运动姿态。气压传感器180C用于测量气压。
磁传感器180D包括霍尔传感器。
加速度传感器180E可检测电子设备100在各个方向上(一般为三轴)加速度的大小。
距离传感器180F,用于测量距离。
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。
环境光传感器180L用于感知环境光亮度。
指纹传感器180H用于采集指纹。
温度传感器180J用于检测温度。
触摸传感器180K,也称“触控面板”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸 传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
骨传导传感器180M可以获取振动信号。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。
马达191可以产生振动提示。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和电子设备100的接触和分离。电子设备100可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口195可以支持Nano SIM卡,Micro SIM卡,SIM卡等。同一个SIM卡接口195可以同时插入多张卡。所述多张卡的类型可以相同,也可以不同。SIM卡接口195也可以兼容不同类型的SIM卡。SIM卡接口195也可以兼容外部存储卡。电子设备100通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,电子设备100采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在电子设备100中,不能和电子设备100分离。电子设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本发明实施例以分层架构的Android系统为例,示例性说明电子设备100的软件结构。
图3是本发明实施例的电子设备100的软件结构框图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序包。
如图3所示,应用程序包可以包括购物应用(例如淘宝APP、京东APP、亚马逊APP)、用于管理用户数据的应用(例如AR APP)、相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。本申请中,购物应用和用于管理用户数据的应用之间可通过API通信。
如图3所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供电子设备100的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
在一些实施例中,三维图形处理库用于使用商品的尺寸数据构建内胆3D模型。
在一些实施例中,三维图形处理库可用于使用高精度用户数据构建高精度3D身体模型,使用低精度用户数据构建低精度3D身体模型。
在一些实施例中,三维图形处理库可用于将用户的高精度3D身体模型和商品的不同尺码分别对应的3D内胆模型进行匹配,确定商品的最佳尺码。
在一些实施例中,三维图形处理库可用于生成3D虚拟试穿图。
在一些实施例中,三维图形处理库可用于将商品的3D精确模型和用户的真实图像叠加,生成效果图。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动。
从图2可知,本申请中的电子设备配置有触控屏(以下称触摸屏),可用于接收用户操作以及显示电子设备的系统输出的界面内容。该界面内容可包括正在运行的应用程序的界面以及系统级别菜单等,具体可由下述界面元素组成:输入型界面元素,例如按键(button),文本输入框(text),滑动条(scroll Bar),菜单(menu)等;以及输出型界面元素,例如视窗(window),标签(label)等。在一些实施例中,该界面内容可以为电子设备展示给用户的3D试穿画面。
以下实施例将以电子设备为手机100为例,对本申请实施例提供的控制用户数据的方法进行说明。
本申请实施例的控制用户数据的方法可以在实现3D虚拟试穿时,保证用户数据和/或商品数据的安全,降低用户数据泄露的风险以及商品数据的流失。
为了更好地理解本申请实施例的控制用户数据的方法,首先介绍本申请涉及的两种数 据:用户数据和商品数据。
(一)用户数据。
用户数据包括反映用户体型特征的数据。举例来说,反映用户体型特征的数据包括但不限于:身高、体重、胸围、腰围、臀围、臂宽、臂长、臂围、大腿长、大腿围、小腿长、小腿围、头围、头部各个器官(嘴、眼、鼻子)的尺寸及位置、脚踝围、脚长、脚宽、足弓高度、脚背高度、各个脚趾的长度等。在一些实施例中,用户数据还可包括反映用户外在形象的数据。举例来说,反映用户外在形象的数据包括但不限于:发型、发色、脸型、肤色等。
在一些实施例中,用户数据可以按照一定的策略分级。可选的,用户数据可以按照反映用户体型特征的精度分级。例如,用户数据可以分为高精度用户数据和低精度用户数据。
其中,高精度用户数据是指能够精准、全面地反映用户身体部位细节的数据。举例来说,针对用户的脚部,高精度脚部数据包含脚部各部分的细节,如足弓,各脚趾位置,足跟,脚踝,脚背的详细三维数据;针对用户的全身,高精度用户数据会包含用户肩部,手臂,躯干,大腿,小腿等身体各部分的详细三维数据。在一些实施例中,高精度用户数据采用现有3D模型的标准描述格式进行描述和存储,如OBJ文件格式。通过3D模型的标准描述格式来描述高精度用户数据时,该高精度用户数据实际可包括用来描述用户身体表面特征的所有顶点的三维空间坐标的集合、面片数量等信息,这些信息可用于构建高精度3D模型。
高精度用户数据可以通过以下两种方式采集:
1、通过扫描设备扫描用户身体,采集高精度用户数据。这里,扫描设备是指可以通过扫描获取物体的三维数据的设备。扫描设备可以是手机或者其他设备(例如红外线扫描仪、带有深度测量信息的摄像头等)。扫描设备通过扫描用户身体,可以采集到精确的用户数据。示例性地,参见图4,其示出了一种可能的外部设备扫描用户身体、采集高精度用户数据的场景。
2、用户测量并提供的高精度用户数据。具体的,用户可以通过手动测量的方式获取到用户数据。例如,用户可以通过体重秤测量体重、通过皮尺测量身体部位的维度等。
其中,低精度用户数据是相对于高精度用户数据而言的,其能够大概、简略地反映用户的身体特征,但是不能精准、全面地反映用户身体部位的细节。也就是说,低精度用户数据可以反映用户身体的大体轮廓,但不包括身体各部位的精确尺寸。在一些实施例中,低精度用户数据包含的数据量小于高精度用户数据包含的数据量。举例来说,针对用户的脚部,低精度脚部数据包含脚部各部分的轮廓数据,如脚部长度、宽度、高度等,但不包括详细的三维数据,如足弓高度、各脚趾位置、脚踝形状等。在一些实施例中,低精度用户数据采用现有3D模型的标准描述格式进行描述和存储,如OBJ文件格式。
低精度用户数据可以由高精度用户数据经过处理后得到。可选的,低精度用户数据可以由高精度用户数据经过模糊化处理后得到。例如,可以在高精度用户数据中删除部分数据,得到低精度用户数据。又例如,在通过3D模型的标准描述格式来描述高精度用户数据时,可以将高精度用户数据中顶点距离小于1cm以下的顶点删除,进行平滑处理,将高精度用户数据中的几十万个面片降低到小于1万个面片。
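As one concrete illustration of the kind of blurring step described above, the Python sketch below snaps mesh vertices to a coarse grid and merges those that land in the same cell, so vertices closer together than the cell size collapse into one and the overall vertex and face count drops. It is a generic vertex-clustering pass written as an assumption for illustration, not the specific simplification algorithm of this application.

```python
# Generic vertex-clustering simplification: merge all vertices within one grid cell.
import numpy as np

def decimate_vertices(vertices_cm: np.ndarray, cell_cm: float = 1.0):
    """vertices_cm: (N, 3) vertex coordinates in centimetres.
    Returns the reduced vertex array and the old-to-cluster index map."""
    cells = np.floor(vertices_cm / cell_cm).astype(np.int64)
    _, inverse = np.unique(cells, axis=0, return_inverse=True)
    inverse = inverse.ravel()                       # guard against shape quirks
    n_clusters = int(inverse.max()) + 1
    sums = np.zeros((n_clusters, 3))
    np.add.at(sums, inverse, vertices_cm)           # sum the vertices per cluster
    counts = np.bincount(inverse, minlength=n_clusters).reshape(-1, 1)
    return sums / counts, inverse                   # cluster centroids, index map

verts = np.random.rand(50_000, 3) * 180.0           # fake high-precision body scan
low_res, remap = decimate_vertices(verts, cell_cm=5.0)
print(len(verts), "->", len(low_res), "vertices")
```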
下面介绍用户数据的存储方式。
用户数据可以存储在手机100中,也可以存储在服务器200中,这里不做限制。在一些实施例中,用户数据可以加密存储在手机100(例如存储在手机100的加密芯片中)或服务器200中,当需要使用这些用户数据时,需要进行解密或者验证,从而可以保证用户数据在 存储过程中的安全。当用户数据存储在手机100中时,用户数据可以存储在手机100所安装应用对应的存储文件中,也可以作为一个数据源单独存储在手机100中。
在一些实施例中,手机100或服务器200可以分级存储用户数据。参见表1,其示出了一种可能的分级存储用户数据的形式。用户数据存储在用户账号下,即通过用户账号可以找到对应的用户数据。用户账号为字符串,该字符串可以为以下一个或任何组合:字母,数字,符号等。例如,用户账户可以为用户姓名、邮箱、电话号码等。
表1
下面介绍手机100或服务器200分级存储用户数据时的具体操作。
通过上述第1种方式采集高精度用户数据时,扫描设备可以将采集到的高精度用户数据传输给手机100或服务器200进行存储。通过上述第2种方式采集高精度用户数据时,用户可以将高精度用户数据直接输入到手机100进行存储,也可以通过终端设备(例如手机100或电脑等)将高精度用户数据传输至服务器200进行存储。低精度用户数据可以是手机100或服务器200对高精度用户数据做处理后得到并存储的。
由于高精度用户数据几乎能反映用户身体各个部位的细节特征,信息量较大,出于对个人数据或者个人隐私保护的考虑,用户一般希望能保证高精度用户数据的安全。而低精度用户数据仅能反映用户身体的大体特征,信息量较小,用户对于低精度用户数据的安全要求也较低。为了保证数据安全,在一些实施例中,分级存储用户数据后,可以设置其他设备或者应用调用用户数据的权限,只有有权限的其他设备或者应用能够调用高精度用户数据或低精度用户数据。高精度用户数据的调用比低精度用户数据的调用更加严格。
在一些实施例中,高精度用户数据只能被用户信任的应用调用。用户信任的应用是指不会泄露高精度用户数据,或者,不会使用高精度用户数据做损害用户利益的事情的应用。举例来说,高精度用户数据存储在手机厂商(例如华为)或者与手机厂商有合作的服务器200中,该服务器200可以提供高精度用户数据给用户信任的应用,用户信任的应用可以包括手机厂商开发的应用,例如手机厂商的增强现实(augmented reality,AR)应用、快服务智慧平台等。在上述示例中,高精度用户数据总是被用户信任的应用调用,可以保证高精度数据不被泄露给其他厂商或第三方应用(例如淘宝、京东等第三方购物应用)。这里,用户信任的应用可以是用户自主设置的,也可以是手机100或服务器200设置的。
进一步地,为了更有力地保证高精度用户数据的安全,可以在每次调用高精度用户数据时,都询问用户是否可以调用,在得到用户的授权后,才能调用高精度用户数据。
在一些实施例中,低精度用户数据可以被大部分应用调用。举例来说,可以调用低精度用户数据的应用不仅包括上述提及的用户信任的应用(例如手机厂商开发的各类应用),还可以包括第三方应用(例如淘宝、京东等第三方购物应用)。这里,可以调用低精度用户数据的应用可以是用户自主设置的,也可以是手机100或服务器200设置的。
不限于上述实施例以及表1中的分级存储用户数据的形式,本申请中还可以通过其他形式存储用户数据。例如,还可以按照身体部位分类(如分为头部、上身躯干、下身躯干、脚部等),分别存储每一类身体部位对应的高精度用户数据和低精度用户数据。
不限于表1中一个用户账号下存储一个用户的用户数据,本申请中还可以在一个用户账 号下存储多个用户的用户数据。示例性地,参见表2,在用户账号(即用户1)下除了用户1的用户数据,还可以包括用户2和用户3的用户数据。用户2和用户3可以为用户1的家人。
表2
高精度用户数据可以用于构建用户的高精度3D身体模型,实现对用户身体的精确建模。具体的,根据高精度用户数据构建用户的高精度3D身体模型的过程可包括:通过建模工具或者算法,将高精度用户数据构建为高精度3D身体模型。可理解的,在一些实施例中,还可以针对每个身体部位分别构建高精度3D模型。例如,可以根据头部的高精度用户数据构建头部的高精度3D模型,根据上身躯干的高精度用户数据构建上身躯干的高精度3D模型,根据下身躯干的高精度用户数据构建下身躯干的高精度3D模型,根据脚部的高精度用户数据构建脚部的高精度3D模型等。
低精度用户数据可以用于构建用户的低精度3D身体模型,实现对用户身体的简略模拟。具体的,根据低精度用户数据构建用户的低精度3D身体模型的过程可包括:通过建模工具或者算法,将低精度用户数据构建为低精度3D身体模型。可理解的,在一些实施例中,和上述的高精度3D模型类似,还可以针对每个身体部位分别构建低精度3D模型。
(二)商品数据。
商品数据包括反映商品特征的数据。
本申请中的商品是指可以穿戴到用户身上的物品。本申请中的商品包括但不限于:服装(如衣服(T恤、衬衫、外套等)、裤子(如长裤、短裤等)等)、鞋靴、饰品(如帽子、眼镜、手表、配饰(如围巾、耳环等))等。商品特征可包括商品的尺寸、材质、纹路、装饰等。
举例来说,一件衣服的商品数据可包括:衣长、胸围、腰围、肩宽、袖长、面料材质、里料材质、袖口围度、下摆围度、面料纹理、面料上的装饰、面料弹力、厚度、剪裁方式(如落肩、垫肩等)、款式(紧身、修身、宽松等)等。
再举例来说,一双鞋子的商品数据可包括:鞋子尺码、鞋底长度、内里高度、鞋底材质、鞋头款式(尖头、圆头等)、鞋面材质、靴筒高度、鞋垫材质、鞋面内里材质、鞋跟形状(平跟、高跟)、靴筒面材质、鞋面图案等。
在一些实施例中,商品数据可以按照一定的策略分类。可选的,商品数据可以分为尺寸数据和效果数据。尺寸数据和效果数据共同构成了一件商品的商品数据。
其中,尺寸数据是指能够反映用户穿戴商品时的合身程度的数据。举例来说,针对一件衣服,尺寸数据可包括:衣长、胸围、腰围、肩宽、袖长、袖口围度、下摆围度等。再举例来说,针对一双鞋子,尺寸数据可包括:鞋子尺码、鞋底长度、内里高度、靴筒高度等。
可理解的,针对同一款商品(例如同一款衣服、裤子等),可能有多个不同的尺码(例如大码、中码、小码等)。一款商品的每一个尺码都对应有尺寸数据,即一款商品可能有多个尺 寸数据。
其中,效果数据是指能够反映用户穿戴商品时的效果的数据。举例来说,针对一件衣服,效果数据可包括:面料材质、里料材质、面料纹理、面料上的装饰、面料弹力、厚度、剪裁方式(如落肩、垫肩等)、版型(紧身、修身、宽松等)等。再举例来说,针对一双鞋子,效果数据可包括:鞋底材质、鞋头款式(尖头、圆头等)、鞋面材质、鞋垫材质、鞋面内里材质、靴筒面材质、鞋面图案等。
可理解的,针对同一款商品,可能有多个不同的效果数据。例如,同一款商品可能对应用多种颜色,每一个颜色都对应有效果数据。又例如,同一款商品可能对应有多种花纹,每一种花纹都对应有效果数据。
商品数据可以通过以下两种方式采集:
1、通过设备拍照或者扫描商品,采集商品数据。这里,可以通过摄像头拍摄商品的图像,通过计算机分析该图像以采集商品数据。也可以通过红外线扫描仪等设备扫描商品,采集商品数据。
2、通过商品的生产资料获取商品数据。具体的,商家在生产商品的过程中,会有相关的生产资料,生产资料上记录有商品数据。举例来说,一件衣服在制作过程中,会确定衣服的尺寸、材料、纹饰等数据,这些数据可以记录在衣服的生产资料中,通过该生产资料可以获取衣服的商品数据。
下面介绍商品数据的存储方式。
商品数据可以存储在服务器300中。该服务器300可以是商家或购物平台(例如淘宝、京东等)的服务器。
在一些实施例中,服务器300可以分类存储商品数据。参见表3,其示出了一种可能的分类存储商品数据的形式。商品数据存储在商品标识下,即通过商品标识可以找到对应的商品数据。商品标识可以为商品款号、商品描述等。
表3
在一些实施例中,服务器300分类存储商品数据时,商品数据可以不存储为表3的形式,而通过现有3D模型的标准描述格式进行描述和存储,如OBJ文件格式。
尺寸数据反映商品各个部分的大小,大多数情况下商家会将尺寸数据公开给消费者,让消费者挑选合适的商品,因此,商家对于尺寸数据的安全要求较低。效果数据是一款商品和其他商品的主要区别点,为了提高产品的竞争力,商家对效果数据的安全要求较高。为了保证数据安全,在一些实施例中,分类存储商品数据后,可以设置其他设备或者应用调用商品 数据的权限,只有有权限的其他设备或者应用能够调用尺码数据或者效果数据。效果数据的调用比尺码数据的调用更加严格。
在一些实施例中,效果数据只能被商家信任的应用调用。商家信任的应用是指不会泄露效果数据,或者,不会使用效果数据做损害商家利益的事情的应用。举例来说,效果数据存储在购物应用(例如淘宝APP)的服务器300中,该服务器300可以提供效果数据给和商家有合作关系的应用(例如淘宝APP)。在上述示例中,效果数据被商家信任的应用调用,可以保证效果数据不被泄露。这里,商家信任的应用可以由商家或服务器300设置。
在一些实施例中,尺寸数据可以被大部分应用调用。举例来说,可以调用尺寸数据的应用不仅包括上述提及的商家信任的应用(例如淘宝APP),还可以包括手机厂商或与手机厂商合作的开发者开发的各类应用。这里,可以调用尺寸数据的应用可以由商家或服务器300设置。
可理解的,不限于表3中示出的一款商品的商品数据,服务器300中可以存储多款商品的商品数据。
尺寸数据可以用于构建商品的3D内胆模型,实现对商品内胆的模拟。一款商品可能有多个尺寸数据,因此,一款商品可以对应有多个内胆3D模型。根据尺寸数据构建商品的3D内胆模型的过程可包括:通过建模工具或者算法,将尺寸数据构建为商品的3D内胆模型。
商品数据(包括尺寸数据和效果数据)可以用于构建商品的3D精确模型,实现对商品的精确模拟。商品的3D精确模型由尺寸数据和效果数据共同构建,任意一个尺寸数据和任意一个效果数据都可以用于构成商品的一个3D精确模型。即一款商品可能对应有多个3D精确模型。根据商品数据构建商品的3D精确模型的过程可包括:通过建模工具或者算法,将商品数据构建为商品的3D精确模型。具体的,当通过现有3D模型的标准描述格式描述商品数据时,可通过建模工具或者算法解析商品数据,通过顶点与面片描述,生成3D表面,再读取材质数据,将表面纹理贴到3D表面,并针对不同的材质渲染出不同的光照效果,得到3D精确模型。
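To make the "vertices and faces plus material" pipeline described above concrete, the Python sketch below parses a tiny OBJ-style snippet into vertex and face lists and pairs the geometry with effect data. It is a toy parser written as an assumption for illustration, not a full OBJ implementation or the application's actual modeling tool.

```python
# Toy combination of size data (OBJ-style geometry) and effect data (material).
def parse_obj(text: str):
    vertices, faces = [], []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":                          # vertex: v x y z
            vertices.append(tuple(float(x) for x in parts[1:4]))
        elif parts[0] == "f":                        # face: f i j k (1-based indices)
            faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:]))
    return vertices, faces

OBJ_SNIPPET = """
v 0 0 0
v 1 0 0
v 0 1 0
f 1 2 3
"""

effect_data = {"fabric": "denim", "pattern": "large star", "color": "blue"}
vertices, faces = parse_obj(OBJ_SNIPPET)
accurate_model = {"vertices": vertices, "faces": faces, "effect": effect_data}
print(len(vertices), "vertices,", len(faces), "face(s),", accurate_model["effect"]["pattern"])
```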
下面结合附图和应用场景,以用户在网购时使用3D虚拟试穿技术为例,详细介绍本申请实施例提供的控制用户数据的方法。
在本申请实施例的控制用户数据的方法中,将用户数据分为高精度用户数据和低精度用户数据,将商品数据分为尺寸数据和效果数据。高精度用户数据仅提供给用户信任的应用,商品的效果数据仅提供给商家信任的应用。用户信任的应用使用用户的高精度3D身体模型和商品的多个尺码对应的内胆3D模型匹配,得到最佳尺码。商家信任的应用使用该最佳尺码对应的3D精确模型和用户的低精度3D身体模型为用户展示3D试穿效果。在实现3D虚拟试穿时,可以保证用户数据和/或商品数据的安全,降低用户数据泄露的风险以及商品数据的流失。
应用场景1:用户1通过手机100上的购物应用选定想要购买的商品,在购物应用中查看3D虚拟试穿效果。
在应用场景1中,用户数据存储在服务器200中,该用户数据包括高精度用户数据和低精度用户数据。商品数据存储在服务器300中,该商品数据包括尺寸数据和效果数据。
在应用场景1中,手机100安装有用于管理用户数据的应用和购物应用。用于管理用户数据的应用是用户信任的应用,即用于管理用户数据的应用可以调用服务器200中的高精度 用户数据和低精度用户数据。购物应用是商家信任的应用,即购物应用可以调用服务器300中的尺寸数据和效果数据。其中,手机100中用于管理用户数据的应用可以是预装的,也可以是用户下载后安装的。购物应用可以是预装的,也可以是用户下载后安装的。
用于管理用户数据的应用和购物应用之间可以通过应用程序接口(application programming interface,API)通信。以下将以用于管理用户数据的应用为手机厂商开发的AR APP为例进行说明。
参见图5,其为手机100的触摸屏上显示的一个用户界面30,该用户界面30可以包括:状态栏301、可隐藏的导航栏302、时间和天气widget 303,还有多个应用程序的图标例如购物应用淘宝APP的图标304、AR APP的图标305等。其中,状态栏301可包括网络运营商的名称(例如中国移动)、Wi-Fi图标、信号强度、当前的剩余电量等。在一些实施例中,状态栏301还可包括蓝牙图标、闹钟图标等。导航栏302可包括返回键图标(图中的三角形)、主屏幕图标(图中的圆形)、多任务键图标(图中的正方形)。当手机100检测到用户的手指或者触控笔针对某一个应用图标的点击事件后,响应于该点击事件,手机100启动应用,显示与该应用图标对应的用户界面。例如,当手机100检测到用户的手指触摸淘宝的图标304时,手机100响应于该触摸事件启动淘宝APP,显示淘宝APP的主界面。
在启动购物APP之后,用户1可以在购物APP中搜索或选择自己想要购买或了解的商品。用户1选定想要购买或了解的商品之后,可以点击商品的图标或链接进入商品详情界面。示例性地,用户1选定的商品为一款裤子,对应的商品详情界面40可以如图6的6a所示。以下描述将用户1选定的这一款裤子称为裤子400。
如6a所示,商品详情界面40可包括:商品展示图像401、商品描述402、返回控件403、加入购物车控件404、立即购买控件405、AR试穿控件406等。其中,商品展示图像401可包括商品或者模特穿戴商品时的照片、视频。商品描述402为对商品的简单描述,可包括价格、关键词(如品牌、款式、流行元素等)、销量、发货地等。当用户1点击返回控件403时,手机100返回显示商品详情界面40的上一级界面。当用户1点击加入购物车控件404时,手机100会在商品详情界面40上显示商品参数(如尺码、颜色、件数等)的选择框,用户选定参数后,手机100会将裤子400添加到当前登录到购物APP的账号所对应的购物车中。当用户点击立即购买控件405时,手机100会在商品详情界面40上显示商品参数(如尺码、颜色、件数等)的选择框,用户选定参数后跳转到付款界面。
除了上述元素以及控件,商品详情界面40还可包括更多内容,例如还可包括服务查看控件、商品参数选择控件、商品参数查看控件等。用户1可以通过手指在商品详情界面40中往上滑动的手势,查看更加详细的商品介绍以及用户评论等。可理解的,图6的6a仅为示例,不同的购物APP或者不同商品,其对应的商品详情界面包含的元素及控件、各个元素及控件的排布方式等有可能不同,这里不做限制。
如果用户1想要查看裤子400的3D试穿效果,则可以通过手指或触控笔点击AR试穿控件406。下面结合附图描述用户1点击AR试穿控件406之后,一种可能的数据处理过程。
示例性地,参见图7,图7示出了一种可能的手机100和服务器200、服务器300之间的数据交互以及手机100的内部数据处理过程。如图7所示,该数据交互及处理过程可包括如下步骤:
1、手机100通过购物APP获取裤子400的内胆3D模型。
具体的,云服务300存储有裤子400对应的商品数据。这里,裤子400对应的商品数据 可包括:裤子400的尺寸数据和效果数据。服务器300和购物APP可以通过网络通信。
购物APP获取裤子400的内胆3D模型的方式可包括以下两种:
(1)在一些实施例中,购物APP可以从服务器300处获取裤子400的尺寸数据,并使用裤子400的尺寸数据构建裤子400的内胆3D模型。
具体的,手机100可以在检测到用户对AR试穿控件406的点击事件后,或者,可以在进入商品详情界面40时,通过购物APP向服务器300请求获取裤子400的尺寸数据。可选的,服务器300接收到购物APP发送的请求后,可以验证购物APP是否具有调用商品尺寸数据的权限。在验证购物APP具有调用商品尺寸数据的权限之后,服务器300将裤子400的尺寸数据发送给购物APP。在一些实施例中,服务器300可根据以下方式验证购物APP是否具有调用商品尺寸数据的权限:服务器300可存储具有调用商品数据权限的各个应用的标识,购物APP发送给服务器300的请求中携带有该购物APP的标识,服务器300查看存储的各个标识中是否包含该购物APP的标识,若是,则确认该购物APP具有调用商品尺寸数据的权限。其中,应用的标识可以为该应用的名称、图标、代码等。
由于裤子400对应有一个或多个尺码(例如大码、中码、小码),因此,裤子400可能对应有多个尺寸数据。即,购物APP可接收到裤子400的不同尺码分别对应的尺寸数据,并根据裤子400的多个尺寸数据分别构建内胆3D模型。购物APP可利用手机100本身的计算机图形处理能力,构建裤子400的3D内胆模型。购物APP使用裤子400的尺寸数据构建内胆3D模型的具体步骤可参照前文第(二)点中关于商品数据的相关描述,这里不再赘述。
(2)在另一些实施例中,购物APP可以从服务器300处获取裤子400的内胆3D模型。
服务器300可以根据存储的裤子400的尺寸数据构建内胆3D模型。即,服务器300可以存储裤子400的内胆3D模型。服务器300构建内胆3D模型的具体步骤可参照前文第(二)点中关于商品数据的相关描述,这里不再赘述。
手机100可以在检测到用户对AR试穿控件406的点击事件后,或者,可以在进入商品详情界面40时,通过购物APP向服务器300请求获取裤子400的内胆3D模型。可选的,由于裤子400的内胆3D模型反映了裤子400的尺寸数据,服务器300接收到购物APP发送的请求后,可以验证购物APP是否具有调用商品尺寸数据的权限。在验证购物APP具有调用商品尺寸数据的权限之后,服务器300将裤子400的内胆3D模型发送给购物APP。服务器300验证购物APP是否具有调用商品尺寸数据的权限的过程可参照上述第(1)个实施例中的相关描述。
由于裤子400对应有一个或者多个尺码(例如大码、中码、小码),因此,购物APP可接收到云服务300发送的裤子400的不同尺码分别对应的内胆3D模型。
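A minimal sketch of the identifier-based permission check described in this step: the server keeps a list of application identifiers allowed to read size data and answers only requests that carry one of those identifiers. The identifiers, the item key, and the handle_size_request function are illustrative assumptions, not an actual server interface.

```python
# Whitelist-style permission check for size-data requests; all names are made up.
AUTHORIZED_FOR_SIZE_DATA = {"com.example.shopping", "com.example.other_mall"}

SIZE_DATA = {"pants400": {"S": {"waist": 72}, "M": {"waist": 76}, "L": {"waist": 82}}}

def handle_size_request(app_id: str, item_id: str) -> dict:
    if app_id not in AUTHORIZED_FOR_SIZE_DATA:
        return {"status": 403, "error": "application not authorized for size data"}
    return {"status": 200, "sizes": SIZE_DATA[item_id]}

print(handle_size_request("com.example.shopping", "pants400")["status"])   # 200
print(handle_size_request("com.example.unknown", "pants400")["status"])    # 403
```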
2、购物APP将裤子400的内胆3D模型发送给用于管理用户数据的应用,即AR APP。
3、AR APP获取当前需要虚拟试穿裤子400的用户的高精度3D身体模型和低精度3D身体模型。
具体的,服务器200存储有用户数据。一些实施例中,该用户数据可包括一个用户的用户数据,例如表1。一些实施例中,该用户数据可包括多个用户的用户数据,例如表2。用户数据可包括:高精度用户数据和低精度用户数据。服务器200和AR APP可以通过网络通信。
当前需要虚拟试穿裤子400的用户可以通过以下方式确定:
(1)当前需要虚拟试穿裤子400的用户是手机100默认的。
具体的,AR APP一般通过用户账号管理用户数据,当前登录手机厂商AR APP的用户账号下可以对应有一个或多个用户的用户数据,可参见表1及表2。
如果当前登录AR APP的用户账号下仅有一个用户的用户数据,则手机100默认当前需要虚拟试穿裤子400的用户为该用户。示例性地,当前登录AR APP的用户账号下存储的用户数据如表1,则当前需要虚拟试穿裤子400的用户为用户1。
如果当前登录AR APP的用户账号下有多个用户的用户数据,则手机100可以根据一定的策略在该多个用户中默认选择一个用户作为当前需要虚拟试穿裤子400的用户。例如,手机100可以获取用户账号下多个用户数据的存储时间,将存储时间最早的一个用户数据所对应的用户作为当前需要虚拟试穿裤子400的用户。又例如,手机100可以将机主(例如用户1)作为当前需要虚拟试穿裤子400的用户,用户1的机主身份可以是用户1在通过AR APP管理用户数据时添加的。又例如,用户1可以预先将某个用户(例如用户1的妈妈)设置为默认的需要虚拟试穿裤子400的用户,则手机100可根据该设置确定当前需要虚拟试穿裤子400的用户(例如用户1的妈妈)。
(2)当前需要虚拟试穿裤子400的用户是用户1选择的。
一些实施例中,AR APP接收到购物APP发送的裤子400的内胆3D模型之后,可以从AR APP的存储文件或者服务器200中获取当前登录AR APP的用户账号下的一个或多个用户的用户标识。用户标识可以为与用户账号关联的用户的姓名、昵称、手机号码或邮箱等,也可以是用于标识用户的头像。
AR APP可以显示获取到的当前登录AR APP的用户账号下的多个用户标识,给用户提供选择当前需要虚拟试穿裤子400的用户的选项或控件。
在一种可能的实施方式中,AR APP可以将获取到的当前登录AR APP的用户账号下的一个或多个用户标识发送给购物APP,由购物APP为用户提供选择当前需要虚拟试穿裤子400的用户的选项或控件。示例性地,参见图6的6b,手机100可以在商品详情界面40上显示选择当前需要虚拟试穿裤子400的用户的多个控件407。该多个控件407显示当前登录AR APP的用户账号下的多个用户标识。
不限于6b所示的在商品详情界面40上显示控件407的方式,还可以通过其他方式显示控件407。例如,手机100还可以从商品详情界面40跳转到专门用于选择当前需要虚拟试穿裤子400的用户的界面,并在该界面上显示多个控件407。又例如,手机100的显示界面还可以跳转到AR APP的主界面,在AR APP的主界面上提供多个控件407。
用户1可以在手机100提供的多个控件407中,选择当前需要虚拟试穿裤子400的用户对应的控件,以确定当前需要虚拟试穿裤子400的用户。如图6的6b所示,用户1可以点击用户1对应的控件,将用户1确定为当前需要虚拟试穿裤子400的用户。用户1选择当前需要虚拟试穿裤子400的用户后,手机100可以将用户1选择的用户的用户标识发送给AR APP。
通过上述(1)或(2)任一种方式确定当前需要虚拟试穿裤子400的用户之后,AR APP可获取该用户的高精度3D身体模型。下面以当前需要虚拟试穿裤子400的用户为用户1为例,详细说明AR APP获取用户1的高精度3D身体模型的过程。
AR APP获取当前需要虚拟试穿裤子400的用户(即用户1)的高精度3D身体模型的方式可包括以下两种:
(1)在一些实施例中,AR APP可以从服务器200处获取用户1的用户数据(包括高精 度用户数据和低精度用户数据),并使用用户1的高精度用户数据构建高精度3D身体模型,使用用户1的低精度用户数据构建低精度3D身体模型。
具体的,AR APP接收到购物APP发送的裤子400的内胆3D模型之后,可以向服务器200请求获取用户1的用户数据。可选的,服务器200可以验证AR APP是否具有调用用户数据的权限,在验证AR APP具有调用用户数据的权限之后,将用户1的用户数据发送给AR APP。服务器200验证AR APP是否具有调用用户数据的权限的过程,和服务器300验证购物APP是否具有调用商品尺寸数据的权限的过程类似,可参照相关描述。
AR APP接收到用户1的用户数据后,可利用手机100的计算机图形处理能力,构建用户1的高精度3D身体模型和低精度3D身体模型。AR APP构建用户1的高精度3D身体模型和低精度3D身体模型的具体步骤可参照前文第(一)点中关于用户数据的相关描述,这里不再赘述。
(2)在另一些实施例中,AR APP可以从服务器200处获取用户1的高精度3D身体模型和低精度3D身体模型。
服务器200可以根据存储的用户数据构建用户1的高精度3D身体模型和低精度3D身体模型。服务器200构建用户1的高精度3D身体模型和低精度3D身体模型的具体步骤可参照前文第(一)点中关于用户数据的相关描述,这里不再赘述。
具体的,AR APP接收到购物APP发送的裤子400的内胆3D模型之后,可以向服务器200请求获取用户1的高精度3D身体模型和低精度3D身体模型。可选的,由于用户1的高精度3D身体模型反映用户1的高精度用户数据,用户1的低精度3D身体模型反映用户1的低精度用户数据,服务器200接收到AR APP发送的请求后,可以验证AR APP是否具有调用用户数据的权限。在验证AR APP具有调用用户数据的权限之后,服务器200将用户1的高精度3D身体模型和低精度3D身体模型发送给AR APP。服务器200验证AR APP是否具有调用用户数据的权限的过程,和服务器300验证购物APP是否具有调用商品尺寸数据的权限的过程类似,可参照相关描述。
不限于上述(1)或(2)所述的获取方式,本申请还可通过其他方式获取用户1的高精度3D身体模型和低精度3D身体模型。例如,在一些实施例中,AR APP可以从服务器200处获取用户1的高精度3D身体模型,在对用户1的高精度3D身体模型做模糊处理后得到用户1的低精度3D身体模型。
4、AR APP将用户1的高精度3D身体模型和裤子400的不同尺码分别对应的3D内胆模型进行匹配,在裤子400的多个尺码中确定最佳尺码。
具体的,将用户1的高精度3D身体模型和裤子400的3D内胆模型进行匹配的过程具体可包括:对用户1的高精度3D身体模型和裤子400的3D内胆模型做碰撞检测和压力图仿真,判断用户身材是否与裤子的尺码相匹配。其中,将2个3D模型进行简单叠加,通过碰撞检测,可以得出衣物与用户身材是否有干涉的部分,以及有多少干涉的部分。增加重力与衣服材质弹力等物理作用引擎,可以模仿出衣物穿在用户身上之后,对用户身体的压力热图。当匹配衣物的压力图处于特定区间内,可认为对应的尺码是最适合用户的衣物尺码。该特定区间可以根据实验得到。
可理解的,步骤4中确定的最佳尺码为裤子400的多个尺码中,最合适用户1身体的尺码。
在步骤4中,用户1的高精度3D身体模型能够精确地反映用户1身体部位的细节,因此,使用用户1的高精度3D身体模型确定出的最佳尺码,是最合适用户1身体的尺码。
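As a greatly simplified stand-in for the collision detection and pressure-map simulation described in step 4, the Python sketch below computes a per-region "tightness" value from measurement differences and accepts a size when the worst value falls inside an assumed comfort band. The elasticity constant, the comfort band, and the per-measurement arithmetic are assumptions for illustration; the actual matching operates on full 3D meshes with a physics engine.

```python
# Per-region tightness proxy instead of a real cloth-physics pressure map.
COMFORT_BAND = (0.0, 3.0)        # assumed acceptable tightness, in centimetres

def pressure_profile(body: dict, liner: dict, elasticity_cm: float = 2.0) -> dict:
    """Interference per region after allowing for fabric stretch."""
    return {region: (body[region] - liner[region]) - elasticity_cm
            for region in liner}

def size_is_acceptable(body: dict, liner: dict) -> bool:
    worst = max(pressure_profile(body, liner).values())
    return COMFORT_BAND[0] <= worst <= COMFORT_BAND[1]

body = {"waist": 74.0, "hip": 96.0}
liners = {"S": {"waist": 70.0, "hip": 92.0}, "M": {"waist": 73.0, "hip": 95.0}}
print({size: size_is_acceptable(body, liner) for size, liner in liners.items()})
# S: worst = max(74-70-2, 96-92-2) = 2  -> within the band, acceptable
# M: worst = max(74-73-2, 96-95-2) = -1 -> too loose for this toy band
```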
5、AR APP将确定出的最佳尺码和用户1的低精度3D身体模型发送给购物APP。
6、购物APP根据裤子400的最佳尺码所对应的3D精确模型和用户1的低精度3D身体模型,获取3D虚拟试穿图。
具体的,可以由购物APP生成3D虚拟试穿图,也可以由服务器300生成3D虚拟试穿图后发送给购物APP,下面详细描述。
(1)购物APP使用裤子400的最佳尺码所对应的3D精确模型和用户1的低精度3D身体模型,生成3D虚拟试穿图。
在第(1)种方式中,用户1的低精度3D身体模型是步骤5中AR APP发送给购物APP的,裤子400的最佳尺码所对应的3D精确模型是购物APP根据步骤5中AR APP发送的最佳尺码确定的。下面详细描述购物APP确定裤子400的最佳尺码所对应的3D精确模型的过程。
可理解的,商品的3D精确模型由尺寸数据和效果数据共同确定,由于裤子400可能对应有一个或多个效果数据,因此,裤子400的最佳尺码所对应的3D精确模型可能有一个或多个。举例来说,参见表3,表中所示商品的大码对应的3D精确模型包括:由大码对应的尺寸数据和大五角星图案对应的效果数据构建的3D精确模型、由大码对应的尺寸数据和小五角星图案对应的效果数据构建的3D精确模型、大码对应的尺寸数据和曲线图案对应的效果数据构建的3D精确模型。
和步骤1中购物APP获取裤子400的内胆3D模型时类似,购物APP获取裤子400的最佳尺码所对应的3D精确模型的方式也可包括以下两种:
(a)在一些实施例中,购物APP可以从服务器300处获取裤子400的最佳尺码所对应的尺寸数据,以及裤子400的一个或多个效果数据,构建裤子400的最佳尺码所对应的3D精确模型。
其中,裤子400的最佳尺码所对应的尺寸数据可以是购物APP可以通过步骤1中的第(1)种方式从服务器300处获取到的。
其中,裤子400的一个或多个效果数据可以是购物APP通过步骤1中的第(1)种方式从服务器300处获取尺寸数据的同时获取到的,也可以是购物APP在步骤5中接收到AR APP发送的最佳尺码后从服务器300处获取到的。可选的,服务器300可以验证购物APP是否具有调用商品的效果数据的权限,在验证购物APP具有调用商品的效果数据的权限之后,将裤子400的一个或多个效果数据发送给购物APP。
购物APP从服务器300处获取裤子400的最佳尺码所对应的尺寸数据,以及裤子400的一个或多个效果数据之后,可以利用手机100本身的计算机图形处理能力,构建裤子400的最佳尺码所对应的一个或多个3D精确模型。这里,购物APP构建裤子400的最佳尺码所对应的3D精确模型的具体步骤可参照前文第(二)点中关于商品数据的相关描述,这里不再赘述。
(b)在另一些实施例中,购物APP可以从服务器300处获取裤子400的最佳尺码所对应的3D精确模型。
服务器300可以根据存储的裤子400的一个或多个效果数据、以及各个尺码分别对应的尺寸数据,生成多个裤子400的3D精确模型。即服务器300中可以存储多个裤子400的3D精确模型。服务器300构建裤子400的3D精确模型的具体步骤可参照前文第(二)点中关于商品数据的相关描述,这里不再赘述。
购物APP可以向服务器300请求获取裤子400的最佳尺码所对应的一个或多个3D精确模型。可选的,裤子400的3D精确模型反映了裤子400的尺寸以及效果,服务器300接收到购物APP的请求后,可以验证购物APP是否具有调用裤子400的尺寸数据和效果数据的权限。服务器300验证购物APP具有调用裤子400的尺寸数据和效果数据的权限之后,将裤子400的最佳尺码所对应的3D精确模型发送给购物APP。
(2)购物APP将裤子400的最佳尺码和用户1的低精度3D身体模型发送给服务器300。云服务300使用裤子400的最佳尺码所对应的3D精确模型和用户1的低精度3D身体模型,生成3D虚拟试穿图。服务器300将生成的3D虚拟试穿图发送给购物APP。
在第(2)种方式中,服务器300可以根据存储的裤子400的一个或多个效果数据,以及各个尺码分别对应的尺寸数据,生成多个裤子400的3D精确模型。即服务器300中可以存储多个裤子400的3D精确模型。服务器300可以根据购物APP发送的最佳尺码确定该最佳尺码对应的一个或多个3D精确模型,并将该一个或多个3D精确模型发送给购物APP。
在上述(1)或(2)中,购物APP或者服务器300获取到裤子400的最佳尺码对应的3D精确模型后,可以使用裤子400的最佳尺码所对应的3D精确模型和用户1的低精度3D身体模型,生成3D虚拟试穿图。由于裤子400的最佳尺码所对应的3D精确模型可能有一个或多个,因此,生成的3D虚拟试穿图也可能有一个或多个。
下面以裤子400的最佳尺码所对应的某一个3D精确模型为例,描述购物APP或者服务器300使用该3D精确模型和用户1的低精度3D身体模型生成3D虚拟试穿图的过程。该过程具体可包括:购物APP调用用户1的低精度3D身体模型,叠加裤子400的3D精确模型,生成叠加后的试穿效果。试穿效果可以是静态,也可以是动态的。
购物APP通过上述(1)或(2)方式获取到一个或多个3D虚拟试穿图后,可以显示该一个或多个3D虚拟试穿图,使得用户查看试穿效果。示例性地,参考图6的6c,购物APP获取到3D虚拟试穿图后,可以从商品详情界面40跳转到3D虚拟试穿界面50。3D虚拟试穿界面50包括:购物APP获取到的一个3D虚拟试穿图。该3D虚拟试穿图中的裤子400的3D精确模型对应的尺码为上述最佳尺码,该3D虚拟试穿图中的裤子400的3D精确模型对应的效果可以是裤子400的多个效果中的任意一个。
在一些实施例中,3D虚拟试穿界面50还可以包括用于选择裤子400的效果的控件408。示例性地,如6c所示,控件408为用户提供了裤子400的不同花纹的选项。用户1可以点击控件408,选择想要试穿具有哪一种花纹的裤子400。手机100检测到用户1对控件408的点击事件后,可以确定用户1当前想要试穿的裤子400的花纹,并将3D虚拟试穿界面50中的3D虚拟试穿图更换为符合用户选择的花纹的3D虚拟试穿图。其中,符合用户选择的花纹的3D虚拟试穿图是使用裤子400的最佳尺码和用户选择的效果(即用户选择的花纹)所对应的3D精确模型、以及用户1的低精度3D身体模型生成的。
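The behaviour of control 408 described above can be pictured as a keyed lookup: accurate models are stored per (size, effect) pair, so switching the selected pattern only changes which pre-built model is composed with the unchanged low-precision body model. The dictionary keys, file names, and helper functions below are illustrative assumptions rather than the application's actual data layout.

```python
# Accurate models keyed by (size, effect); switching the pattern swaps the model.
ACCURATE_MODELS = {
    ("M", "large star"): {"mesh": "pants400_M.obj", "texture": "large_star.png"},
    ("M", "small star"): {"mesh": "pants400_M.obj", "texture": "small_star.png"},
    ("M", "curve"):      {"mesh": "pants400_M.obj", "texture": "curve.png"},
}

def compose_try_on(model: dict, low_precision_body: dict) -> str:
    return f"try-on image: {model['texture']} over body {low_precision_body}"

def on_effect_selected(best_size: str, chosen_effect: str, low_precision_body: dict) -> str:
    model = ACCURATE_MODELS[(best_size, chosen_effect)]
    return compose_try_on(model, low_precision_body)

print(on_effect_selected("M", "curve", {"height": 170}))
```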
在应用场景1中,从用户点击商品详情界面40中的AR试穿控件406,到购物APP显示如6c所示的3D虚拟试穿图,手机100和服务器200、服务器300之间有数据交互,手机100内部也执行了数据处理过程。在一些实施例中,为了使用户体验更好,可以在用户点击商品 详情界面40中的AR试穿控件406之后,购物APP显示3D虚拟试穿图之前,在购物APP的界面上显示等待图标。等待图标可以是缓冲条、沙漏图标,也可以是带有箭头的圆圈。等待图标可以是静态的,也可以是动态的。
在图7所示的方案中,购物APP最终显示的3D虚拟试穿图中,裤子400的3D精确模型根据最佳尺码得到,和用户的身体相匹配。此外,裤子400的3D精确模型实现了对服装的精确模拟,即购物APP最终展示的3D虚拟试穿图为用户提供了真实的试穿效果。
在应用场景1中,无需将高精度用户数据提供给购物APP,将高精度用户数据提供给AR APP就可以实现3D虚拟试穿,可以保证高精度用户数据仅被用户信任的应用调用,保证了高精度用户数据的安全,从而保护用户的隐私。同样的,在应用场景1中,无需将商品的效果数据提供给AR APP,仅将商品的效果数据提供给购物APP就可以实现3D虚拟试穿,可以保证商品的效果数据仅被商家信任的应用调用,保证了商品的效果数据的安全,从而保护商家利益。
在一些实施例中,可以不利用用户1的高精度3D身体模型和裤子400的不同尺码分别对应的3D内胆模型进行匹配,而利用用户1的高精度用户数据和裤子400的不同尺码分别对应的尺寸数据进行匹配,确定裤子400的最佳尺码。具体的,结合图7实施例,步骤1中购物APP可以获取裤子400的不同尺码分别对应的尺寸数据,步骤3中AR APP可以获取用户1的高精度用户数据,步骤4中AR APP可以利用用户1的高精度用户数据和裤子400的不同尺码分别对应的尺寸数据进行匹配,确定裤子400的最佳尺码。其中,AR APP利用用户1的高精度用户数据和裤子400的不同尺码分别对应的尺寸数据进行匹配,确定裤子400的最佳尺码的过程可包括:通过计算机视觉算法(computer vision,CV),将高精度用户数据(如肩宽、臂长,胸围,腰围,臀围,大腿围,小腿围,身高等关键尺寸数据),与不同裤子400的不同尺码分别对应的尺寸数据进行对比,可获得和用户身体匹配的最佳尺码。
应用场景1中以用户虚拟试穿裤子为例描述了本申请实施例的控制用户数据的方法,可理解的,不限于裤子,本申请实施例的控制用户数据的方法还可以应用于虚拟穿戴其他商品的场景。例如,用户想要试穿上衣、鞋子,试戴眼镜、帽子等商品时,也可以采用和图7实施例类似的方法提供虚拟试穿图,从而保护用户隐私,保护商家利益。
在一些实施例中,为了简化数据交互以及处理过程,还可以根据用户穿戴商品时的身体部位来交互及处理用户数据。例如,在应用场景1中用户想要试穿衣服时,图7中的步骤3中,AR APP可以获取当前需要虚拟试穿衣服的用户的上身躯干的高精度3D身体模型和低精度3D身体模型。又例如,用户想要试穿鞋子时,图7的步骤3中,AR APP可以获取当前需要虚拟试穿鞋子的用户的脚部的高精度3D身体模型和低精度3D身体模型。又例如,用户想要试戴眼镜时,图7的步骤3中,AR APP可以获取当前需要虚拟试戴眼镜的用户的头部的高精度3D身体模型和低精度3D身体模型。
在一些实施例中,图7的步骤6中,购物APP可以不为用户显示由裤子400的3D精确模型和用户1的低精度身体模型生成的3D虚拟试穿图,而为用户显示用户1的真实图像和裤子400的3D精确模型叠加而成的效果图。下面详细说明。
具体的,购物APP接收到AR APP发送的最佳尺码后,在步骤6中手机100可以开启摄 像头(前置摄像头或后置摄像头),通过摄像头采集用户1的真实图像。用户1的真实图像可以是照片或视频。
裤子400的最佳尺码所对应的3D精确模型可能有一个或多个,下面以其中任意一个3D精确模型为例进行说明。
在上述实施例中,购物APP获取用户1的真实图像和裤子400的3D精确模型叠加而成的效果图的方式可以有以下两种:
(1)在一种可能的实施方式中,可以由购物APP可以将裤子400的最佳尺码对应的一个3D精确模型和用户1的真实图像进行叠加生成效果图。购物APP获取裤子400的最佳尺码对应的3D精确模型的方式参照上述步骤6中第(1)点中的相关描述。
(2)在另一种可能的实施方式中,手机100可以将采集到的用户1的真实图像发送给服务器300,服务器300可以将裤子400的最佳尺码对应的一个3D精确模型和用户1的真实图像进行叠加生成效果图后,将该效果图发送给购物APP。
在上述两种可能的实施方式中，购物APP或服务器300将裤子400的最佳尺码对应的一个3D精确模型和用户1的真实图像进行叠加生成效果图的过程，具体可包括：解析用户1的真实图像中的骨骼点信息，购物APP或服务器300根据骨骼点识别的结果，将已绑定骨骼点的裤子400的3D精确模型，叠加到视频流上相应的人体骨骼点位置，呈现出试穿的效果。
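The overlay procedure above can be sketched as follows, with detect_keypoints and render_garment standing in for a real pose estimator and 3D renderer; both are placeholders rather than real APIs, and the anchoring rule (waistband centred between the hip keypoints, width scaled from the hip distance) is an assumption for illustration.

```python
# Placeholder pipeline: detect skeleton keypoints, then anchor a garment layer to them.
def detect_keypoints(frame):
    """Placeholder for a pose estimator: returns 2D pixel positions of joints."""
    return {"left_hip": (210, 300), "right_hip": (270, 300), "left_knee": (215, 420)}

def render_garment(garment_model, width_px):
    """Placeholder for the 3D renderer: returns an image-like garment layer."""
    return {"layer": f"{garment_model['effect']} pants", "width": width_px}

def overlay_try_on(frame, garment_model):
    joints = detect_keypoints(frame)
    lx, ly = joints["left_hip"]
    rx, ry = joints["right_hip"]
    layer = render_garment(garment_model, width_px=int((rx - lx) * 1.4))
    anchor = ((lx + rx) // 2, (ly + ry) // 2)        # waistband between the hips
    return {"frame": frame, "overlay": layer, "anchor": anchor}

result = overlay_try_on(frame="frame_0001", garment_model={"effect": "star print"})
print(result["overlay"], "anchored at", result["anchor"])
```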
通过上述方式将商品的3D精确模型和用户的真实图像进行叠加生成效果图,当用户可以抬手、转身、侧身时,裤子400的3D精确模型都可以随着用户身体的变化叠加到用户身体上,给用户好的虚拟试穿体验。
可理解的,应用场景1中以电子设备中安装有购物应用和AR应用为例,对本申请实施例中的控制用户数据的方法进行了说明。在一些实施例中,购物应用和AR应用在交互数据时,也可以通过共享数据接口或者共享数据存储位置的方式交互数据。在一些实施例中,应用场景1中购物应用的功能也可以由电子设备中安装的浏览器所访问的网页实现,例如浏览器中购物网站的网页。在一些实施例中,AR应用的功能也可以是集成在电子设备上的系统功能或服务,并不一定通过应用的方式实现。
应用场景2:用户1通过手机100上的用于管理用户数据的应用选定想要购买的商品,在手机100上的购物应用中查看3D虚拟试穿效果。
和应用场景1相同,应用场景2中的用户数据存储在服务器200中,商品数据存储在服务器300中,可参照应用场景1中的相关描述。
手机100安装有用于管理用户数据的应用和购物应用。应用场景2和应用场景1的不同之处在于,用户1不在购物应用中选定想要购买的商品,而在用于管理用户数据的应用中选定想要购买的商品。以下将以用于管理用户数据的应用为AR APP为例进行说明。
和应用场景1中相比,AR APP还可以实现对手机100已安装的应用的内容搜索。具体的,AR APP可以和快服务智慧平台(例如华为快服务智慧平台(HUAWEI ability gallery,HAG))通过网络通信,通过快服务智慧平台实现对手机100已安装的应用的内容搜索。快服务智慧平台是一个服务分发平台,也可以看作是一个服务器。快服务智慧平台可以解析用户需求,将用户需求分发至和用户需求相匹配的服务合作方。服务合作方接收到用户需求后,可以向AR APP返回根据用户需求搜索到的结果。这里,服务合作方可以是手机100中已安 装的应用,例如购物APP。
下面结合图8描述可能的手机100和服务器200、服务器300、快服务智慧平台之间的数据交互以及手机100的内部数据处理过程。如图8所示,该数据交互及处理过程可包括如下步骤:
1、用户1在AR APP中输入关键词,AR APP将接收到的关键词发送给快服务智慧平台。
具体的,当用户想要查看一款商品的3D试穿效果时,可以先打开手机100中的AR APP,并在AR APP中输入商品的关键词。
示例性地,可参见图5,用户1可以用手指或触控笔点击用户界面30中AR APP的图标305,响应于该点击操作,手机100显示AR APP的主界面。
示例性地,参见图9的9a,其示出了一种可能的AR APP的主界面80。AR APP的主界面80可包括:搜索框801。在一些实施例中,主界面80还可包括AR APP建议的应用,以及搜索历史等。
用户1可以在搜索框801中输入想要搜索的商品的关键词。举例来说,参见图9的9b,当用户1想要购买服装时,可以点击搜索框801,通过虚拟键盘输入关键词。用户1输入关键词后,可以点击虚拟键盘中的搜索控件。手机100检测到用户1针对虚拟键盘中的搜索控件的点击操作后,通过AR APP将用户1输入的关键词“服装”发送给快服务智慧平台。
2、快服务智慧平台根据关键词解析用户需求,将关键词发送至和用户1的需求相匹配的购物APP。
快服务智慧平台可以根据关键词分析用户需求,在手机100已安装的应用中确定和用户需求相匹配的应用,并将关键词发送给该应用。
举例来说,如果关键词为“霍金”,则快服务智慧平台可以确定用户想要了解霍金的简介,和用户需求相匹配的应用可以为维基百科、百度百科等应用。
举例来说,如果关键词为“霍金”“时间简史”,则快服务智慧平台可以确定用户想要查看霍金的书籍“时间简史”,和用户需求相匹配的应用可以为阅读类应用,如微信读书等。
步骤2中,快服务智慧平台接收到的关键词为“服装”,则快服务智慧平台可以确定用户想要购买衣物,和用户需求相匹配的应用可以为购物应用,如购物APP等。
3、购物APP根据关键词进行搜索,将搜索结果发送给AR APP。
在一些实施例中,购物APP可以将关键词发送给服务器300,从服务器300处获取到搜索结果,并将该搜索结果发送给AR APP。
4、AR APP显示购物APP发送的搜索结果,用户1在该搜索结果中选定想要购买的一款服装。
5、AR APP将用户1选定的一款服装的标识发送给购物APP,手机100跳转到购物APP并显示用户1选定服装的商品详情界面。以下将用户选定的服装称为裤子400。
这里,服装的标识可以为服装的款号或文字描述等。在步骤4和步骤5中,可以看做AR APP给出了多个超链接,每个超链接连接到购物APP中一款服装的商品详情界面。当用户选定一款服装时,相当于通过超链接跳转到购物APP中对应的商品详情界面。
示例性地,参见图9的9c,其示出了一种可能的AR APP显示搜索结果的界面90。如9c所示,该搜索结果包括购物APP根据关键词搜索到的多件商品对应的图像以及文字介绍。
用户1可以在界面90的搜索结果中点击裤子400的图像或者文字介绍,手机100响应于用户1的点击操作,跳转到购物APP,显示购物APP中裤子400的商品详情界面。示例性地, 该商品详情界面可以如图6的6a所示,可参照相关描述。
6、手机100通过购物APP获取裤子400的内胆3D模型。
7、购物APP将裤子400的内胆3D模型发送给用于管理用户数据的应用,即AR APP。
8、AR APP获取当前需要虚拟试穿裤子400的用户的高精度3D身体模型和低精度3D身体模型。
9、AR APP将用户1的高精度3D身体模型和裤子400的不同尺寸分别对应的3D内胆模型进行匹配,在裤子400的多个尺码中确定最佳尺码。
10、AR APP将确定出的最佳尺码和用户1的低精度3D身体模型发送给购物APP。
11、购物APP根据裤子400的最佳尺码所对应的3D精确模型和用户1的低精度3D身体模型,获取3D虚拟试穿图。
可理解的,步骤6-11和图7实施例中的步骤1-6相同,可以参照图7实施例的相关描述,这里不再赘述。
在上述应用场景1和应用场景2中,用户数据存储在服务器200中。在一些实施例中,用户数据可以不存储在服务器200中,而存储在手机100中。用户数据可以存储在手机100中安装的用于管理用户数据的应用(例如AR APP)对应的存储文件中,也可以作为一个数据源单独存储在手机100中。当用户数据存储在手机100中时,用于管理用户数据的应用可以根据本地存储的用户数据获取当前需要虚拟试穿裤子400的用户的高精度3D身体模型和低精度3D身体模型。
本申请的各实施方式可以任意进行组合,以实现不同的技术效果。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本申请所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线)或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如固态硬盘Solid State Disk)等。
总之,以上所述仅为本发明技术方案的实施例而已,并非用于限定本发明的保护范围。凡根据本发明的揭露,所作的任何修改、等同替换、改进等,均应包含在本发明的保护范围之内。

Claims (21)

  1. 一种控制用户数据的方法,所述方法应用于电子设备,所述电子设备安装有第一应用和第二应用,其特征在于,所述方法包括:
    所述第二应用获取用户的高精度用户数据和低精度用户数据;
    所述第二应用获取所述第一应用提供的第一物品的尺寸数据;其中,所述尺寸数据反映所述第一物品的尺寸;
    所述第二应用根据所述尺寸数据和所述高精度用户数据确定所述第一物品的最佳尺码,所述最佳尺码和所述用户的身体相匹配;所述高精度用户数据反映所述用户的身体细节,所述低精度用户数据反映所述用户的身体轮廓;
    所述第二应用向所述第一应用提供所述最佳尺码和所述低精度用户数据以生成3D试穿图。
  2. 根据权利要求1所述的方法,其特征在于,所述第二应用为手机厂商开发的应用,所述第一应用为第三方应用。
  3. 根据权利要求1或2所述的方法,其特征在于,所述第二应用获取用户的高精度用户数据和低精度用户数据,包括:
    所述第二应用向第二服务器发送第一获取请求,并接收所述第二服务器发送的所述用户的高精度用户数据和低精度用户数据。
  4. 根据权利要求3所述的方法,其特征在于,所述第一获取请求中携带所述第二应用的标识信息;所述用户的高精度用户数据和低精度用户数据是所述第二应用的标识信息通过验证之后,所述第二服务器发送的。
  5. 根据权利要求1或2所述的方法,其特征在于,所述第二应用获取用户的高精度用户数据和低精度用户数据,包括:
    所述第二应用接收第二服务器发送的,或接收用户录入,或读取所述电子设备检测的所述用户的高精度用户数据;
    所述第二应用根据所述用户的高精度用户数据计算得到所述用户的低精度用户数据。
  6. 根据权利要求1至5任一项所述的方法,其特征在于,所述第二应用获取所述第一应用提供的第一物品的尺寸数据之前,所述方法还包括:
    所述电子设备显示所述第一应用的界面,所述界面包括所述第一物品;
    所述电子设备接收用户选择所述第一物品的操作。
  7. 根据权利要求1至6任一项所述的方法,其特征在于,所述3D试穿图是根据所述最佳尺码、所述低精度用户数据和所述第一物品的效果数据生成的;所述第一物品的效果数据由所述第一应用提供,所述效果数据反映所述第一物品的细节。
  8. 根据权利要求7所述的方法,其特征在于,所述第二应用获取所述第一应用提供的第一物品的尺寸数据之前,所述方法还包括:
    所述第一应用向第一服务器发送第二获取请求,所述第二获取请求中携带所述第一应用的标识信息;所述第一应用的标识信息通过验证之后,接收所述第一服务器发送的所述第一物品的尺寸数据和所述效果数据。
  9. 根据权利要求1至8任一项所述的方法,其特征在于,所述方法还包括:
    所述电子设备在第一应用的界面显示生成的所述3D试穿图。
  10. 一种电子设备,包括存储器和处理器,所述存储器存储有至少一个程序,所述至少一个程序包括第一应用和第二应用,所述处理器用于运行所述第二应用以使所述电子设备执行:
    获取用户的高精度用户数据和低精度用户数据;
    获取所述第一应用提供的第一物品的尺寸数据;其中,所述尺寸数据反映所述第一物品的尺寸;
    根据所述尺寸数据和所述高精度用户数据确定所述第一物品的最佳尺码,所述最佳尺码和所述用户的身体相匹配;所述高精度用户数据反映所述用户的身体细节,所述低精度用户数据反映所述用户的身体轮廓;
    向所述第一应用提供所述最佳尺码和所述低精度用户数据以生成3D试穿图。
  11. 根据权利要求10所述的电子设备,其特征在于,所述第二应用为手机厂商开发的应用,所述第一应用为第三方应用。
  12. 根据权利要求10或11所述的电子设备,其特征在于,所述处理器运行所述第二应用以使所述电子设备获取用户的高精度用户数据和低精度用户数据,包括:
    所述处理器运行所述第二应用以使所述电子设备执行:
    向第二服务器发送第一获取请求,并接收所述第二服务器发送的所述用户的高精度用户数据和低精度用户数据。
  13. 根据权利要求12所述的电子设备,其特征在于,所述第一获取请求中携带所述第二应用的标识信息;所述用户的高精度用户数据和低精度用户数据是所述第二应用的标识信息通过验证之后,所述第二服务器发送的。
  14. 根据权利要求10或11所述的电子设备，其特征在于，所述处理器运行所述第二应用以使所述电子设备获取用户的高精度用户数据和低精度用户数据，包括：
    所述处理器运行所述第二应用以使所述电子设备执行:
    接收第二服务器发送的,或接收用户录入,或读取所述电子设备检测的所述用户的高精度用户数据;
    根据所述用户的高精度用户数据计算得到所述用户的低精度用户数据。
  15. 根据权利要求10至14任一项所述的电子设备,其特征在于,所述电子设备还包括触摸屏,
    所述处理器运行所述第二应用以使所述电子设备获取所述第一应用提供的第一物品的尺寸数据之前,所述处理器还用于运行所述第一应用以使所述电子设备执行:
    所述触摸屏显示所述第一应用的界面,所述界面包括所述第一物品;
    所述触摸屏接收用户选择所述第一物品的操作。
  16. 根据权利要求10至15任一项所述的电子设备，其特征在于，所述3D试穿图是根据所述最佳尺码、所述低精度用户数据和所述第一物品的效果数据生成的；所述第一物品的效果数据由所述第一应用提供，所述效果数据反映所述第一物品的细节。
  17. 根据权利要求16所述的电子设备，其特征在于，所述处理器运行所述第二应用以使所述电子设备获取所述第一应用提供的第一物品的尺寸数据之前，所述处理器还用于运行所述第一应用以使所述电子设备执行：
    向第一服务器发送第二获取请求,所述第二获取请求中携带所述第一应用的标识信息; 所述第一应用的标识信息通过验证之后,接收所述第一服务器发送的所述第一物品的尺寸数据和所述效果数据。
  18. 根据权利要求10至17任一项所述的电子设备,其特征在于,所述电子设备还包括触摸屏,
    所述处理器还用于运行所述第一应用以使所述电子设备执行:所述触摸屏在第一应用的界面显示生成的所述3D试穿图。
  19. 一种控制用户数据的方法,所述方法应用于第二服务器,其特征在于,包括:
    所述第二服务器接收第二应用发送的第一获取请求,所述第二应用安装于电子设备中;
    所述第二服务器向所述第二应用发送用户的高精度用户数据和低精度用户数据。
  20. 根据权利要求19所述的方法,其特征在于,所述第一获取请求中携带所述第二应用的标识信息,所述第二服务器在所述第二应用的标识信息通过验证之后,向所述第二应用发送所述用户的高精度用户数据和低精度用户数据。
  21. 一种计算机存储介质，其特征在于，包括计算机指令，当所述计算机指令在终端上运行时，使得所述终端执行如权利要求1-9任一项所述的控制用户数据的方法。
PCT/CN2019/110117 2018-11-13 2019-10-09 控制用户数据的方法及相关装置 WO2020098418A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP19877531.4A EP3693915A4 (en) 2018-11-13 2019-10-09 METHOD FOR CONTROLLING USER DATA AND RELATED DEVICE
US16/764,716 US20210224886A1 (en) 2018-11-13 2019-10-09 Method for Controlling User Data and Related Apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811347521.9 2018-11-13
CN201811347521.9A CN109615462B (zh) 2018-11-13 2018-11-13 控制用户数据的方法及相关装置

Publications (1)

Publication Number Publication Date
WO2020098418A1 true WO2020098418A1 (zh) 2020-05-22

Family

ID=66004291

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/110117 WO2020098418A1 (zh) 2018-11-13 2019-10-09 控制用户数据的方法及相关装置

Country Status (4)

Country Link
US (1) US20210224886A1 (zh)
EP (1) EP3693915A4 (zh)
CN (1) CN109615462B (zh)
WO (1) WO2020098418A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109615462B (zh) * 2018-11-13 2022-07-22 华为技术有限公司 控制用户数据的方法及相关装置
US20220327783A1 (en) * 2021-04-08 2022-10-13 Ostendo Technologies, Inc. Virtual Mannequin - Method and Apparatus for Online Shopping Clothes Fitting

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104637083A (zh) * 2015-01-29 2015-05-20 吴宇晖 一种虚拟试衣系统
CN106097081A (zh) * 2016-08-23 2016-11-09 宇龙计算机通信科技(深圳)有限公司 一种虚拟试衣方法及服务器
US20170039775A1 (en) * 2015-08-07 2017-02-09 Ginman Group, Inc. Virtual Apparel Fitting Systems and Methods
CN107563875A (zh) * 2017-09-22 2018-01-09 安徽网网络科技有限公司 基于人体3d建模的拍照试衣系统及其使用方法
CN109615462A (zh) * 2018-11-13 2019-04-12 华为技术有限公司 控制用户数据的方法及相关装置

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1932874A (zh) * 2006-10-10 2007-03-21 中山大学 一种运用于pda的三维试衣系统
GB201102794D0 (en) * 2011-02-17 2011-03-30 Metail Ltd Online retail system
JP6392756B2 (ja) * 2012-09-05 2018-09-19 ボディ パス リミテッド 2次元画像シーケンスから正確な身体サイズ測定値を得るためのシステム及び方法
CN103578005A (zh) * 2013-11-20 2014-02-12 李拓彬 智能试镜系统及其实现方法
KR20170094279A (ko) * 2014-12-16 2017-08-17 미테일 리미티드 3d 의복 이미지와 조합되는 사람의 3d 가상 신체 모델을 생성하기 위한 방법들, 및 관련 디바이스들, 시스템들 및 컴퓨터 프로그램 제품들
US10475113B2 (en) * 2014-12-23 2019-11-12 Ebay Inc. Method system and medium for generating virtual contexts from three dimensional models
US10248993B2 (en) * 2015-03-25 2019-04-02 Optitex Ltd. Systems and methods for generating photo-realistic images of virtual garments overlaid on visual images of photographic subjects
CN105787751A (zh) * 2016-01-06 2016-07-20 湖南拓视觉信息技术有限公司 三维人体虚拟试衣方法和系统
CN107220886A (zh) * 2017-05-10 2017-09-29 应凯 实现共享服饰的智能云试衣系统

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104637083A (zh) * 2015-01-29 2015-05-20 吴宇晖 一种虚拟试衣系统
US20170039775A1 (en) * 2015-08-07 2017-02-09 Ginman Group, Inc. Virtual Apparel Fitting Systems and Methods
CN106097081A (zh) * 2016-08-23 2016-11-09 宇龙计算机通信科技(深圳)有限公司 一种虚拟试衣方法及服务器
CN107563875A (zh) * 2017-09-22 2018-01-09 安徽网网络科技有限公司 基于人体3d建模的拍照试衣系统及其使用方法
CN109615462A (zh) * 2018-11-13 2019-04-12 华为技术有限公司 控制用户数据的方法及相关装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3693915A4

Also Published As

Publication number Publication date
CN109615462A (zh) 2019-04-12
US20210224886A1 (en) 2021-07-22
CN109615462B (zh) 2022-07-22
EP3693915A4 (en) 2020-12-02
EP3693915A1 (en) 2020-08-12

Similar Documents

Publication Publication Date Title
CN113454974B (zh) 用于确定表盘图像的方法及其电子设备
US11263469B2 (en) Electronic device for processing image and method for controlling the same
US11189070B2 (en) System and method of generating targeted user lists using customizable avatar characteristics
US10921958B2 (en) Electronic device supporting avatar recommendation and download
US11145047B2 (en) Method for synthesizing image and an electronic device using the same
US9668114B2 (en) Method for outputting notification information and electronic device thereof
KR102206060B1 (ko) 전자 장치의 효과 디스플레이 방법 및 그 전자 장치
US11900504B2 (en) Augmented reality experiences for physical products in a messaging system
WO2021052311A1 (zh) 一种根据后壳颜色显示用户界面的方法和电子设备
WO2020098418A1 (zh) 控制用户数据的方法及相关装置
US20170148225A1 (en) Virtual dressing system and virtual dressing method
CN112991494A (zh) 图像生成方法、装置、计算机设备及计算机可读存储介质
CN112288553A (zh) 物品推荐方法、装置、终端及存储介质
CN111028071B (zh) 账单处理方法、装置、电子设备及存储介质
WO2023129996A1 (en) Dynamically presenting augmented reality content generators
CN110213307B (zh) 多媒体数据推送方法、装置、存储介质及设备
US11132574B2 (en) Method for detecting marker and electronic device thereof
US11070736B2 (en) Electronic device and image processing method thereof
WO2022033432A1 (zh) 内容推荐方法、电子设备和服务器
CN110021057A (zh) 一种信息显示方法及终端
CN113869900A (zh) 资源管理方法、装置、服务器及介质
CN110543305B (zh) 替换EasyUI组件的方法及装置
CN112258385B (zh) 多媒体资源的生成方法、装置、终端及存储介质
KR20150121899A (ko) 바이너리 업데이트 방법 및 그 전자 장치
CN117950882A (zh) 图像传输控制方法及相关装置

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2019877531

Country of ref document: EP

Effective date: 20200505

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19877531

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE