CN110738548A - Virtual fitting method and device, mobile terminal and computer readable storage medium - Google Patents


Info

Publication number
CN110738548A
Authority
CN
China
Prior art keywords: user, target, data, image, dimensional
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910920340.9A
Other languages
Chinese (zh)
Other versions
CN110738548B
Inventor
熊林博 (Xiong Linbo)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201910920340.9A
Publication of CN110738548A
Application granted
Publication of CN110738548B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce
    • G06Q30/06: Buying, selling or leasing transactions
    • G06Q30/0601: Electronic shopping [e-shopping]
    • G06Q30/0641: Shopping interfaces
    • G06Q30/0643: Graphical representation of items or shoppers
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality

Abstract

The invention provides a virtual fitting method, apparatus, mobile terminal, and computer-readable storage medium. The method comprises: collecting feature data of target clothes and user data; generating a three-dimensional fitting image of the user based on the feature data and the user data; receiving an adjustment by the user of the bending angle of the electronic device; and, in response to the adjustment, displaying the three-dimensional fitting image on a display screen in a display mode corresponding to the bending angle.

Description

Virtual fitting method and device, mobile terminal and computer readable storage medium
Technical Field
The present invention relates to the field of mobile terminals, and in particular to a virtual fitting method, apparatus, mobile terminal, and computer-readable storage medium.
Background
With the rise of online shopping, many users choose to purchase clothes on the internet. A common frustration for such users is that they cannot intuitively tell whether a garment will fit: they can only judge from the images displayed by the online store, the listed size and material parameters, and their own experience and visual imagination, which often leads to a large gap between the seller's photos and the garment as received. Various virtual fitting techniques have therefore emerged.
In researching the prior art, the inventor found that one existing virtual fitting technique detects the position of the human body in a photo and then overlays the selected clothes on that body image. This method depends heavily on the angle of the photo, and its single display mode makes it difficult for the user to obtain effective fitting reference information from the image, leading to more returned goods.
Disclosure of Invention
The invention provides a virtual fitting method, apparatus, mobile terminal, and computer-readable storage medium, aiming to solve the problem that the single display mode of existing virtual fitting methods makes it difficult for users to obtain effective fitting reference information, which increases returns and exchanges.
In a first aspect, an embodiment of the invention provides a virtual fitting method applied to an electronic device, the method including:
collecting feature data of the target clothes and user data;
generating a three-dimensional fitting image of the user based on the feature data and the user data;
receiving an adjustment operation by the user on the bending angle of the electronic device;
and, in response to the adjustment operation, displaying the three-dimensional fitting image on a display screen in a display mode corresponding to the bending angle.
In a second aspect, an embodiment of the present invention provides a virtual fitting apparatus applied to an electronic device, the apparatus including:
an acquisition module, configured to acquire feature data of the target clothes and user data;
an image generation module, configured to generate a three-dimensional fitting image of the user based on the feature data and the user data;
an operation receiving module, configured to receive an adjustment operation by the user on the bending angle of the electronic device;
and a display module, configured to display, in response to the adjustment operation, the three-dimensional fitting image on a display screen in the display mode corresponding to the bending angle.
In a third aspect, an embodiment of the present invention provides a mobile terminal including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the virtual fitting method according to the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the virtual fitting method according to the first aspect.
The embodiment of the invention has the following beneficial effects:
in the embodiment of the invention, the feature data of the target clothes and the user data are collected, and a three-dimensional fitting image of the user is generated based on them; an adjustment by the user of the bending angle of the electronic device is received; and, in response, the three-dimensional fitting image is displayed on a display screen in a display mode corresponding to the bending angle. Because the three-dimensional fitting image is generated from the feature data of the target clothes together with the user data, it can show the user's fitting effect in three dimensions, presented in the display mode corresponding to the device's bending angle. The user thus obtains richer fitting reference information, can make a better purchasing decision, and returns and exchanges are reduced.
Drawings
FIG. 1 is a first flowchart of a virtual fitting method provided in an embodiment of the present invention;
FIG. 2 is a second flowchart of a virtual fitting method provided in an embodiment of the present invention;
FIG. 3 is a first schematic diagram of a display mode determined according to the bending angle provided in an embodiment of the present invention;
FIG. 4 is a second schematic diagram of a display mode determined according to the bending angle provided in an embodiment of the present invention;
FIG. 5 is a third schematic diagram of a display mode determined according to the bending angle provided in an embodiment of the present invention;
FIG. 6 is a fourth schematic diagram of a display mode determined according to the bending angle provided in an embodiment of the present invention;
FIG. 7 is a third flowchart of a virtual fitting method provided in an embodiment of the present invention;
FIG. 8 is a schematic diagram of selecting a presentation mode provided in an embodiment of the present invention;
FIG. 9 is a fourth flowchart of a virtual fitting method provided in an embodiment of the present invention;
FIG. 10 is a schematic diagram of operations for obtaining a dynamic video provided in an embodiment of the present invention;
FIG. 11 is a fifth flowchart of a virtual fitting method provided in an embodiment of the present invention;
FIG. 12 is a first structural block diagram of a virtual fitting apparatus provided in an embodiment of the present invention;
FIG. 13 is a second structural block diagram of a virtual fitting apparatus provided in an embodiment of the present invention;
FIG. 14 is a third structural block diagram of a virtual fitting apparatus provided in an embodiment of the present invention;
FIG. 15 is a fourth structural block diagram of a virtual fitting apparatus provided in an embodiment of the present invention;
FIG. 16 is a fifth structural block diagram of a virtual fitting apparatus provided in an embodiment of the present invention;
FIG. 17 is a hardware structure diagram of a mobile terminal in an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely with reference to the drawings; it is apparent that the described embodiments are some, but not all, of the embodiments of the present invention.
Referring to FIG. 1, a flowchart of a virtual fitting method provided in an embodiment of the present invention is shown. The method is applied to an electronic device. The electronic device in the embodiments of the present invention may be a mobile terminal such as a mobile phone, tablet computer, notebook computer, palmtop computer, personal digital assistant (PDA), portable media player (PMP), navigation device, wearable device, smart band, or pedometer, or a fixed terminal such as a digital TV or desktop computer.
The virtual fitting method specifically comprises the following steps:
Step 101, collecting feature data of the target clothes and user data.
The feature data of the target clothes are data characterizing the garment, such as its type, style, color, material, size, and display pictures. The electronic device may acquire this data in two ways: in the first, the device automatically collects the feature data of the clothes the user has selected on the online shopping platform, for example from the platform's display pictures and detailed description text; in the second, the user manually uploads the feature data of the target clothes.
In addition, user data such as the user's body shape, height, weight, face, skin color, and photos need to be acquired. The photos may be taken automatically by the electronic device or uploaded manually by the user, and several photos of different angles and body parts may be taken or uploaded to meet the requirements of three-dimensional modeling, so that a three-dimensional human body model can be built once the user data are acquired.
Step 102, generating a three-dimensional fitting image of the user based on the feature data and the user data.
In the embodiment of the invention, the electronic device fits the synthesized three-dimensional clothes onto the constructed three-dimensional human body model, thereby generating the three-dimensional fitting image. If there are several pieces of three-dimensional clothes, several three-dimensional fitting images of the user are generated. The three-dimensional fitting image can deform according to constraints such as the contours of the human body and the material of the clothes, and includes the three-dimensional fitting effect at every body angle of the user.
Step 103, receiving the adjustment operation of the user on the bending angle of the electronic equipment.
In the embodiment of the present invention, an internal sensor allows the electronic device to detect that the user has bent the display screen, i.e., a bending operation on the device. If the electronic device is a flexible terminal with a flexible screen, or a folding terminal without one, and the force applied by the user is sufficient to bend it, the device bends and deforms accordingly; forces of different magnitudes correspond to different bending deformations and bending angles. For example, the user may adjust the bending angle of the flexible terminal's display screen to 120 degrees, 150 degrees, and so on, or likewise adjust the bending angle of the folding terminal.
Step 104, in response to the adjustment operation, displaying the three-dimensional fitting image on a display screen in the display mode corresponding to the bending angle.
In the embodiment of the invention, the display modes of the electronic equipment, which correspond to different three-dimensional fitting images, can be set according to different bending angles. The different display modes may include displaying three-dimensional fitting images at different angles, displaying three-dimensional fitting images at different body parts, displaying three-dimensional fitting images of different target clothes, and the like.
The user can intuitively know the effect of wearing target clothes by watching the three-dimensional fitting images in different display modes on the display screen. For example, by displaying three-dimensional fitting images at different angles, fitting effects on the front, side and back of the body can be visually seen; the three-dimensional fitting images of different body parts are displayed, so that the fitting effect of the upper half body and the lower half body can be seen in detail; the three-dimensional fitting images of different target clothes are displayed, so that a user can display fitting effects of different clothes through switching in a short time, dressing effects of different target clothes are compared, and the user can make a decision quickly.
In summary, in the embodiment of the present invention, feature data of a target garment and user data are collected, and a three-dimensional fitting image of the user is generated from them; an adjustment by the user of the bending angle of the electronic device's screen is received; and, in response, the three-dimensional fitting image is displayed on the display screen in the display mode corresponding to the bending angle. Because the image is generated from the garment's feature data together with the user data, it shows the user's three-dimensional fitting effect, presented in the mode corresponding to the bending angle. The user thus obtains richer fitting reference information, can make a better purchasing decision, and returns and exchanges are reduced.
Referring to fig. 2, a second flowchart of a method for virtual fitting provided in the embodiment of the present invention is shown, where the method is applied to an electronic device, and specifically, the method may include the following steps.
Step 201, collecting feature data of the target clothes and user data.
In the embodiment of the present invention, step 201 may refer to step 101, which is not described herein again.
Step 202, generating a three-dimensional fitting image of the user based on the feature data and the user data.
In the embodiment of the present invention, step 202 may refer to step 102, which is not described herein again.
Step 203, receiving an adjustment operation of the bending angle of the electronic device by a user.
In the embodiment of the present invention, step 203 may refer to step 103, which is not described herein again.
After step 203, steps 204 to 205 may be performed, or steps 206 to 207 may be performed.
Step 204, determining a target display angle corresponding to the bending angle.
In the embodiment of the invention, display angles are the different angles from which the three-dimensional fitting image is shown. For example, the front of the human body in the three-dimensional fitting image may be set as the initial display angle, with different lateral display surfaces obtained by rotating from the front to the left or right, all the way to the back; each lateral surface corresponds to a different display angle of the image. The target display angle for each bending angle may be predefined. Specifically, the electronic device may provide a setting option through which the user sets the correspondence between bending angle and target display angle; alternatively, a developer may define the correspondence in the electronic device in advance. The embodiment of the present invention does not specifically limit this.
Step 205, displaying the three-dimensional fitting image on the display screen at the target display angle.
In the embodiment of the invention, three-dimensional fitting images at different angles can be shown on the display screen by adjusting the bending angle of the electronic device. For example, when the bending angle is adjusted to 120 degrees, the display angle shows the front of the user. As the bending angle gradually increases, the three-dimensional fitting image rotates about its own center toward the left; when the bending angle reaches 150 degrees, the image has rotated 90 degrees to show the user's side. As the bending angle increases further, the image continues rotating toward the back of the body, until at 180 degrees the display angle shows the back of the user.
Of course, the adjustment manner includes, but is not limited to, the above-mentioned manner, as long as it can be realized that the three-dimensional fitting images of the user at different angles are displayed on the display screen along with different bending angles.
Fig. 3 is a first schematic diagram of a display mode determined according to the bending angle provided in the embodiment of the present invention.
Referring to fig. 3, S1 corresponds to a first display region of the electronic device's display screen and S2 to a second display region. When the bending angle between S1 and S2 is 120 degrees, a three-dimensional fitting image of the front of the user is displayed in the display region of S1.
Fig. 4 is a second schematic diagram of a display manner determined according to a bending angle provided in the embodiment of the present invention.
Referring to fig. 4, S1 corresponds to the first display region of the electronic device's display screen and S2 to the second display region. When the bending angle between S1 and S2 is 150 degrees, a three-dimensional fitting image of the user's side is displayed in the display region of S1.
The user can intuitively see the effect of wearing the target clothes by watching the three-dimensional fitting images at different display angles as the screen bends. Since the image slowly rotates with the bending angle, the process of the user's body turning after putting on the target clothes is simulated, giving the user a dynamic wearing effect and adding visual dynamism to the target clothes.
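The angle-to-view mapping described above (a 120-degree bend shows the front, 150 degrees the side, 180 degrees the back) can be sketched as a piecewise-linear interpolation. This is only an illustrative sketch: the function name, the clamping behavior, and the assumption that the rotation is linear between the stated breakpoints are not specified by the patent.

```python
def display_rotation(bend_angle_deg: float) -> float:
    """Map the device's bending angle to a rotation of the 3D fitting
    image, following the example in the text: 120 deg -> 0 (front),
    150 deg -> 90 (side), 180 deg -> 180 (back).

    Linear on each segment; clamped outside [120, 180] degrees.
    """
    if bend_angle_deg <= 120:
        return 0.0
    if bend_angle_deg <= 150:
        # 120..150 deg of bend maps to 0..90 deg of model rotation
        return (bend_angle_deg - 120) * 90.0 / 30.0
    if bend_angle_deg <= 180:
        # 150..180 deg of bend maps to 90..180 deg of model rotation
        return 90.0 + (bend_angle_deg - 150) * 90.0 / 30.0
    return 180.0
```

A renderer would call this once per sensor update and re-pose the body model accordingly, so the garment appears to turn smoothly as the user bends the screen.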
Step 206, selecting a target fitting image from the at least two three-dimensional fitting images according to the bending angle; wherein different three-dimensional fitting images are for different target clothes.
In the embodiment of the present invention, there are at least two pieces of target clothes, and at least two three-dimensional fitting images are generated from them. Correspondingly, each three-dimensional fitting image is for a different piece of target clothes.
Target fitting images corresponding to different ranges of the bending angle may be defined in advance; for example, bending angles of 120 to 149 degrees correspond to the fitting image ranked first in the image library, and bending angles of 150 to 179 degrees to the image ranked second.
Step 207, displaying the target fitting image on the display screen.
For example, when the bending angle of the electronic device reaches 120 degrees, the first three-dimensional fitting image, corresponding to target clothes 1, is displayed on the screen; when it reaches 150 degrees, the second three-dimensional fitting image, corresponding to target clothes 2, is displayed. Specifically, within the range of 120 to 149 degrees, the front, side, and back of the user in the first image are displayed in turn; within 150 to 179 degrees, the front, side, and back of the user in the second image are displayed in turn; and when the bending angle reaches 180 degrees, the screen becomes flat and the fronts of the first and second three-dimensional fitting images are displayed side by side.
Fig. 5 is a third schematic diagram of a display mode determined according to a bending angle provided in the embodiment of the present invention.
Referring to fig. 5, when the bending angle between S1 and S2 is 120 degrees, a three-dimensional fitting image of the user wearing the skirt is displayed in the display area of S1.
Fig. 6 is a fourth schematic view of a display mode determined according to a bending angle provided in the embodiment of the present invention.
Referring to fig. 6, when the bending angle between S1 and S2 is 150 degrees, a three-dimensional fitting image of the user wearing pants is displayed in the display area of S1.
By watching the three-dimensional fitting images of different target clothes as the screen bends, the user can visually compare their fitting effects and select the garment that suits them best.
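The angle-range lookup of steps 206 and 207 amounts to indexing an ordered gallery by bending-angle band. A minimal sketch, assuming the ranges given in the text; the helper name `select_fitting_image` and the list-of-images return shape are illustrative assumptions:

```python
def select_fitting_image(bend_angle_deg: float, fitting_images: list) -> list:
    """Pick target fitting image(s) from an ordered gallery by bending
    angle: 120-149 deg -> first image, 150-179 deg -> second image.
    At exactly 180 deg the text describes showing the fronts of both
    images side by side, so the full gallery is returned.
    """
    if 120 <= bend_angle_deg < 150:
        return [fitting_images[0]]
    if 150 <= bend_angle_deg < 180:
        return [fitting_images[1]]
    if bend_angle_deg == 180:
        return list(fitting_images)  # flat screen: show all fronts
    return []  # below the bending threshold: nothing to switch to
```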
Optionally, after generating a three-dimensional fitting image of the user based on the feature data and the user data, the method further comprises:
and projecting and displaying the three-dimensional fitting image through a projection module of the electronic equipment.
In the embodiment of the invention, a projection module may be built into the electronic device, for example an application providing a projection function may be installed, or an external projection device may be connected, so that the user can project the three-dimensional fitting image onto a target area such as a wall or a curtain and see the fitting effect more intuitively.
In summary, besides the beneficial effects of the virtual fitting method shown in FIG. 1, the embodiment of the present invention also determines the target display angle corresponding to the bending angle and displays the three-dimensional fitting image at that angle. The user can intuitively see the effect of wearing the target clothes by watching the image at different display angles as the screen bends; the image slowly rotates with the bending angle, simulating the user's body turning after putting on the clothes, giving a dynamic wearing effect and adding visual dynamism. In addition, a target fitting image is selected from the at least two three-dimensional fitting images according to the bending angle and displayed on the screen, so the user can visually compare the fitting effects of different target clothes and quickly pick the most suitable one.
Referring to fig. 7, a third flowchart of a method for virtual fitting provided in the embodiment of the present invention is shown, where the method is applied to an electronic device, and specifically may include the following steps.
Step 301, collecting feature data of the target clothes and user data.
In the embodiment of the present invention, step 301 may refer to step 101, which is not described herein again.
Step 302, generating a three-dimensional fitting image of the user based on the feature data and the user data.
In the embodiment of the present invention, step 302 may refer to step 102, which is not described herein again.
Step 303, detecting the category of the target clothes.
In the embodiment of the invention, the category of the target clothes can be detected according to the characteristic data of the target clothes. For example, whether the target clothes is a hat, a jacket, pants, or shoes is detected.
Step 304, intercepting a target display area image from the three-dimensional fitting image according to the category of the target clothes.
In the embodiment of the invention, the body part of the user wearing the target clothes is determined according to the category of the target clothes, and the display area image corresponding to the body part is intercepted from the three-dimensional fitting image. For example, if the type of the target clothes is a hat, an area image above the face of the user is captured from the three-dimensional fitting image to be used as a target display area image; and if the type of the target clothes is shoes, capturing an area below the lower leg of the user from the three-dimensional fitting image as a target display area image.
Step 305, displaying the target display area image on the display screen.
In the embodiment of the invention, after the target display area image is intercepted, it is displayed directly on the display screen. For example, a three-dimensional image of the area above the user's face may be displayed so the user can examine the wearing effect of a hat in detail, or a three-dimensional image of the area below the user's calf so the user can examine the fitting effect of shoes in detail.
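The category-to-region cropping of steps 303 to 305 can be sketched as a lookup over a normalized vertical body coordinate (0.0 at the top of the head, 1.0 at the soles). The category names, the coordinate convention, and the specific cut-off values are illustrative assumptions, not values taken from the patent:

```python
# Normalized vertical body span (0.0 = top of head, 1.0 = soles)
# to crop for each garment category; the values are illustrative.
DISPLAY_REGIONS = {
    "hat":    (0.00, 0.15),  # area above/around the face
    "jacket": (0.10, 0.55),  # upper body
    "pants":  (0.50, 0.95),  # lower body
    "shoes":  (0.85, 1.00),  # area below the calf
}

def target_display_region(category: str) -> tuple:
    """Return the vertical body span to crop from the 3D fitting image
    for a garment category; fall back to the whole body if unknown."""
    return DISPLAY_REGIONS.get(category, (0.0, 1.0))
```

The renderer would then crop the fitting image to this span before display, which is what lets the user inspect a hat or a pair of shoes close up.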
Optionally, after step 303, steps A1 to A2 are further included:
step A1, receiving the selection operation of the user for the preset display mode; the preset display modes at least comprise a whole body mode, a local mode and a default mode.
In the embodiment of the invention, manual selection of the presentation mode by the user is supported; the display mode is chosen before the display screen shows the three-dimensional fitting image.
Fig. 8 is a schematic diagram of selecting a presentation mode according to an embodiment of the present invention.
Referring to fig. 8, clicking the 'local mode' button selects the local mode; clicking the 'whole body mode' button selects the whole-body mode; and clicking the 'default mode' button selects the system default mode, in which the electronic device automatically adjusts the display mode according to the clothing category.
Step A2, in response to the selection operation, displaying the three-dimensional fitting image in the display mode corresponding to the selection operation.
In the embodiment of the invention, the three-dimensional fitting image is displayed on the display screen in the display mode the user has selected. For example, if the user selects the local mode, a local image of the user's body is displayed; if the whole-body mode, an image of the user's whole body is displayed; and if the default mode, the electronic device automatically adjusts the display mode according to the clothing category.
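The mode dispatch of steps A1 and A2 might look like the following sketch, where the 'default' branch defers to the garment category as the text describes. The mode names and the rule that small items such as hats and shoes trigger the local mode are assumed examples, not behavior fixed by the patent:

```python
def resolve_display_mode(selected_mode: str, garment_category: str) -> str:
    """Resolve the user's selection to a concrete display mode.

    'whole' and 'local' are honored directly; 'default' lets the
    device decide from the garment category, e.g. small items such
    as hats or shoes are shown locally, larger garments whole-body.
    """
    if selected_mode in ("whole", "local"):
        return selected_mode
    # default mode: decide from the clothing category (assumed rule)
    return "local" if garment_category in ("hat", "shoes") else "whole"
```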
In summary, besides the beneficial effects of the virtual fitting method shown in FIG. 1, the embodiment of the present invention also detects the category of the target clothes, crops a target display area image from the three-dimensional fitting image according to that category, and displays it on the screen, so the user can examine the fitted part in detail and learn the wearing effect up close. In addition, the user's selection of a preset display mode can be received, and the three-dimensional fitting image displayed in the corresponding mode in response, so the user can choose the display mode at will, which is more user-friendly.
Referring to fig. 9, a fourth flowchart of a method for virtual fitting provided in the embodiment of the present invention is shown, where the method is applied to an electronic device, and specifically may include the following steps.
Step 401, collecting feature data of the target clothes and user data.
In the embodiment of the present invention, step 401 may refer to step 101, which is not described herein again.
Step 402, generating a three-dimensional fitting image of the user based on the feature data and the user data.
In the embodiment of the present invention, step 402 may refer to step 102, which is not described herein again.
Step 403, determining the matching clothes of the target clothes according to the characteristic data.
In the embodiment of the invention, after the feature data of the target clothes are collected, information such as the category, style and color of the target clothes can be determined. The electronic device performs big data analysis on the feature data combined with the human body data, and calculates the matching clothes and matching schemes suitable for the target clothes, so that a series of complete outfit schemes covering coats, trousers, shoes and the like can be obtained.
Step 404, generating a three-dimensional fitting image of the user based on the feature data, the matched clothes and the user data collected in advance.
In the embodiment of the invention, a three-dimensional fitting image of the user can be obtained from the feature data of the target clothes, the matching clothes and the user data, so that a fitting image of a fully matched outfit is obtained. This not only allows the user to view the fitting effect of the target clothes, but also lets the user see how the target clothes match with other clothes, helping the user decide whether the target clothes are a good match and whether the matching clothes need to be purchased.
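A minimal sketch of steps 403-404, under stated assumptions: the wardrobe catalogue, the style-based matching rule (a stand-in for the patent's "big data analysis"), and all function names are hypothetical, and the 3-D synthesis is reduced to returning the pieces to render.

```python
# Hypothetical catalogue of candidate matching clothes.
WARDROBE = [
    {"name": "jeans", "category": "trousers", "style": "casual"},
    {"name": "suit trousers", "category": "trousers", "style": "formal"},
    {"name": "sneakers", "category": "shoes", "style": "casual"},
    {"name": "leather shoes", "category": "shoes", "style": "formal"},
]

def match_clothes(target_features):
    """Step 403 (toy version): pick one item per category sharing the
    target garment's style."""
    outfit = {}
    for item in WARDROBE:
        if item["style"] == target_features["style"]:
            outfit.setdefault(item["category"], item)
    return outfit

def compose_fitting_image(target_features, outfit, user_data):
    """Step 404 (stand-in): list the user model plus all garments that the
    real system would synthesize into one whole-body 3-D fitting image."""
    return {
        "user": user_data["model"],
        "garments": [target_features["name"]] + [i["name"] for i in outfit.values()],
    }
```

The real implementation would replace both functions with learned matching and 3-D rendering; the sketch only shows the data flow from feature data to a whole-body outfit.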
Optionally, the user data is dynamic video data of the user, and step 402 includes steps A3-A4:
step A3, collecting the characteristic data of the target clothes, and obtaining the three-dimensional data of the target clothes according to the characteristic data of the target clothes.
In the embodiment of the invention, after the feature data of the target clothes are collected, the clothes can be converted into three dimensions using three-dimensional virtual reality technology, thereby obtaining the three-dimensional data of the target clothes.
Step A4, synthesizing the three-dimensional data of the target clothes and the dynamic video data of the user to obtain the dynamic three-dimensional fitting image of the user.
In the embodiment of the invention, the dynamic video data of the user can be collected in advance, and may include activities of the user such as walking, running, bending, standing and sitting. The dynamic video data of the user is synthesized with the three-dimensional data of the target clothes to obtain a dynamic three-dimensional fitting image of the user.
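Step A4 can be sketched as a per-frame composition, assuming the video has been decomposed into pose-annotated frames; `fit_garment_to_pose` and the frame format are illustrative stand-ins for the actual 3-D synthesis, not the patent's implementation:

```python
def fit_garment_to_pose(garment_3d, pose):
    # Stand-in: tag the garment data with the pose it is deformed to.
    # A real system would deform the garment mesh to the body pose here.
    return f"{garment_3d}@{pose}"

def synthesize_dynamic_fitting(garment_3d, video_frames):
    """Produce one fitted frame per input frame (walking, running, bending,
    standing, sitting, ...), yielding the dynamic 3-D fitting image."""
    return [
        {"pose": frame["pose"], "render": fit_garment_to_pose(garment_3d, frame["pose"])}
        for frame in video_frames
    ]
```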
Fig. 10 is a schematic diagram of an operation of acquiring a dynamic video according to an embodiment of the present invention.
Referring to fig. 10, dynamic video data of a user may be acquired based on a selection operation of the user. Specifically, the user may click a dynamic recording button to select real-time recording of the user video, or may click a manual uploading button to select uploading of the user's historical video data.
If real-time recording is selected, the electronic device automatically starts the video recording function, the user records a whole-body video in front of the camera, and after recording is completed the system automatically models and generates the dynamic three-dimensional fitting image. If manual uploading is selected, the user manually selects a video stored on the electronic device to upload, and the electronic device models and generates the dynamic three-dimensional fitting image after the upload.
By watching the dynamic three-dimensional fitting image, the user can visually see the effects of walking, running, bending, standing, sitting and the like after wearing the target clothes, so that the user can learn the dressing effect in various activity scenes more comprehensively, helping the user reach a purchase decision more quickly.
Step 405, receiving an operation of adjusting a bending angle of the electronic device by a user.
In the embodiment of the present invention, step 405 may refer to step 103, which is not described herein again.
And 406, responding to the adjustment operation, and displaying the three-dimensional fitting image on the display screen according to the display mode corresponding to the bending angle.
In the embodiment of the present invention, step 406 may refer to step 104, which is not described herein again.
In summary, the embodiment of the present invention has the beneficial effects of the virtual fitting method shown in fig. 1, and also determines the matching clothes of the target clothes according to the feature data of the target clothes, and generates the three-dimensional fitting image of the user based on the feature data, the matching clothes and the pre-collected user data, so that the fitting image matched with the whole body can be obtained, which not only facilitates the user to view the fitting effect of the target clothes, but also facilitates the user to know the matching effect of the target clothes and other clothes. In addition, the three-dimensional data of the target clothes and the dynamic video data of the user are synthesized to obtain the dynamic three-dimensional fitting image of the user, so that the user can visually see the effects of actions such as walking, running, bending, standing, sitting and the like after wearing the target clothes, the user can more comprehensively know the dressing effect in various activity scenes, and the user can be helped to more quickly obtain a purchasing decision.
Referring to fig. 11, a fifth flowchart of a method for virtual fitting provided in the embodiment of the present invention is shown, where the method is applied to an electronic device, and specifically may include the following steps.
Step 501, collecting physique information of a user through a sensor; the physique information at least comprises the body temperature and skin touch information of the user.
In the embodiment of the invention, the electronic device may be externally connected with a sensor through technologies such as Bluetooth and Wi-Fi (Wireless Fidelity), and the physique information of the user, for example the user's body temperature, skin touch and skin texture, is acquired through the sensor. Specifically, the user wears the sensor on a designated part of the body, and the physique information acquired by the sensor is sent to the electronic device. The designated part is the part on which the target clothes are worn. For example, if the target item of clothing is a jacket, the sensor is worn on the upper body of the user and contacts the user's skin.
Step 502, obtaining tactile information of the target clothes; the tactile information at least comprises material and thickness information of the target clothes.
In the embodiment of the invention, the tactile information of the target clothes can be obtained from the introductory text or pictures of the target clothes on the website platform, or by consulting the customer service personnel of the website platform.
Step 503, generating body sensing data of the target clothes for the user according to the physique information of the user and the touch information of the target clothes.
In the embodiment of the invention, the electronic device inputs the physique information of the user and the tactile information of the target clothes as parameters into a target algorithm to perform somatosensory modeling, thereby generating somatosensory data of the target clothes for the user. The somatosensory data may include the user's perceived temperature of the target clothes, contact touch feeling, and other data.
Step 504, controlling the sensor contacting the skin of the user to simulate the somatosensory data.
In the embodiment of the invention, the sensor simulates the somatosensory data. Because the sensor is worn on the designated part of the human body and contacts the skin, the skin of that part can feel the sensations simulated by the sensor, so the user can feel the temperature, touch, comfort and the like of the target clothes as if wearing them.
By sensing the body feel of the target clothes through the sensor, the user can further learn information such as the temperature, touch and comfort of the target clothes, which helps the user make a sound purchase decision.
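Steps 501-504 can be sketched as below. The linear temperature model, the material-to-softness table, and both function names are illustrative assumptions standing in for the patent's unspecified "target algorithm"; the sensor is modeled as a simple command log.

```python
def model_somatosensory(physique, tactile):
    """Step 503 (toy model): estimate perceived temperature and softness
    from the user's physique data and the garment's tactile data."""
    # Assumption: thicker fabric retains more body heat (linear model).
    perceived_temp = physique["body_temp"] + 0.5 * tactile["thickness_mm"]
    # Assumption: softness looked up from the fabric material.
    softness = {"cotton": 0.9, "denim": 0.4, "wool": 0.7}.get(tactile["material"], 0.5)
    return {"temperature": round(perceived_temp, 1), "softness": softness}

def simulate_on_sensor(sensor_log, somato):
    """Step 504: drive the skin-contact sensor with the modelled values
    (here the 'sensor' is just a list of commands for illustration)."""
    sensor_log.append(("heat", somato["temperature"]))
    sensor_log.append(("texture", somato["softness"]))
    return sensor_log
```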
In summary, in addition to the beneficial effects of the virtual fitting method in fig. 1, the embodiment of the present invention also acquires the physique information of the user through a sensor, acquires the tactile information of the target clothes, generates somatosensory data of the target clothes for the user according to the physique information and the tactile information, and controls the sensor contacting the user's skin to simulate the somatosensory data, so that the user can perceive the wearing feel of the target clothes before purchase.
Referring to fig. 12, a first structural block diagram of a virtual fitting apparatus according to an embodiment of the present invention is shown, where the virtual fitting apparatus 600 is applied to an electronic device, and may specifically include:
the acquisition module 601 is used for acquiring feature data and user data of target clothes;
an image generation module 602, configured to generate a three-dimensional fitting image of the user based on the feature data and the user data;
an operation receiving module 603, configured to receive an adjustment operation of a user on a bending angle of the electronic device;
and a display module 604, configured to display the three-dimensional fitting image on the display screen according to the display mode corresponding to the bending angle in response to the adjustment operation.
The virtual fitting device provided by the embodiment of the invention can realize each process realized in the method embodiment of fig. 1, and is not described again to avoid repetition.
In this way, in the embodiment of the present invention, feature data and user data of a target garment are collected, and a three-dimensional fitting image of the user is generated based on the feature data and the user data; receiving the adjustment operation of a user on the bending angle of the display screen; and responding to the adjustment operation, and displaying the three-dimensional fitting image on the display screen according to a display mode corresponding to the bending angle. In the method, the three-dimensional fitting image of the user is generated according to the feature data of the target clothes and the user data, the three-dimensional fitting image can show the three-dimensional fitting effect of the user, the fitting effect of the user can be shown according to the showing mode corresponding to the bending angle of the electronic equipment, and the user can obtain multiple fitting reference information, so that a better purchasing decision can be made, and the occurrence of goods return and change situations is reduced.
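As a rough illustration (not part of the patent disclosure), the module structure of fig. 12 can be sketched as a class whose methods stand in for the four modules; the class name, attribute layout, and placeholder bodies are all hypothetical:

```python
class VirtualFittingApparatus:
    def collect(self, garment):
        # Acquisition module 601: collect feature data and user data.
        self.feature_data = garment["features"]
        self.user_data = garment["user"]

    def generate_image(self):
        # Image generation module 602: stand-in for 3-D synthesis.
        self.fitting_image = ("3d-fit", self.feature_data, self.user_data)

    def receive_bend_angle(self, angle):
        # Operation receiving module 603: adjustment of the bending angle.
        self.bend_angle = angle

    def display(self):
        # Display module 604: show the image in the mode for this angle.
        return {"image": self.fitting_image, "angle": self.bend_angle}
```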
Referring to fig. 13, on the basis of fig. 12, a second block diagram of the virtual fitting apparatus according to the embodiment of the present invention is shown, wherein the display module 604 includes:
a display angle determining submodule 6041 configured to determine a target display angle corresponding to the bending angle;
a first display sub-module 6042, configured to display the three-dimensional fitting image on the display screen according to the target display angle.
Optionally, the display module 604 further includes:
a selecting submodule 6043 for selecting a target fitting image from the at least two three-dimensional fitting images according to the bending angle; wherein different three-dimensional fitting images are for different target clothes;
and a second display sub-module 6044 configured to display the target fitting image on the display screen.
Optionally, the apparatus further comprises:
and the projection module is used for projecting and displaying the three-dimensional fitting image.
The virtual fitting device provided by the embodiment of the invention can realize each process realized in the method embodiment of fig. 2, and is not described again to avoid repetition.
The virtual fitting device provided in the embodiment of the invention has the beneficial effects of the virtual fitting device in fig. 12, and also determines a target display angle corresponding to the bending angle, displays the three-dimensional fitting image on the display screen according to the target display angle, so that a user can visually know the effect of wearing target clothes by watching the three-dimensional fitting image at different display angles in the bending process of the display screen, and the three-dimensional fitting image slowly rotates along with different bending angles of the display screen, thereby simulating the body rotation process after the user wears the target clothes, so that the user can obtain a dynamic wearing effect, and the visual dynamic feeling of the target clothes is increased; in addition, a target fitting image is selected from the at least two three-dimensional fitting images according to the bending angle, and the target fitting image is displayed on the display screen, so that the user can visually compare the fitting effect of fitting different target clothes, and the user can quickly select the clothes most suitable for the user from the target clothes.
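The bend-angle-to-display-angle behaviour of sub-modules 6041 and 6042 can be sketched as below; the proportional mapping, the clamping range, and both function names are illustrative assumptions rather than the patent's actual scheme:

```python
def target_display_angle(bend_angle, max_bend=180.0, max_rotation=360.0):
    """Sub-module 6041 (toy mapping): map the screen's bending angle
    proportionally onto a rotation of the 3-D fitting image, so the model
    appears to turn as the screen is folded."""
    bend_angle = max(0.0, min(bend_angle, max_bend))  # clamp to valid range
    return bend_angle / max_bend * max_rotation

def render_at_angle(fitting_image, rotation):
    """Sub-module 6042 (stand-in): display the image at the target angle."""
    return {"image": fitting_image, "rotation": rotation}
```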
Referring to fig. 14, on the basis of fig. 12, a third block diagram of the virtual fitting apparatus according to the embodiment of the present invention is shown, and the apparatus 600 further includes:
a category detection module 605 for detecting a category of the target laundry;
a display area intercepting module 606, configured to intercept a target display area image from the three-dimensional fitting image according to the category of the target clothing;
a third display module 607, configured to display the target display area image on the display screen.
The virtual fitting device provided by the embodiment of the invention can realize each process realized in the method embodiment of fig. 3, and is not described again to avoid repetition.
The virtual fitting apparatus provided in the embodiment of the invention has the beneficial effects of the virtual fitting apparatus in fig. 12, and also detects the category of the target clothes, intercepts the target display area image from the three-dimensional fitting image according to the category of the target clothes, and displays the target display area image on the display screen, so that the user can view the fitted part in detail and conveniently learn the detailed wearing effect. In addition, a selection operation of the user on a preset display mode can be received, and in response to the selection operation the three-dimensional fitting image is displayed in the display mode corresponding to that operation, so that the user can select the display mode at will, which is more user-friendly.
Referring to fig. 15, on the basis of fig. 12, a fourth structural block diagram of a virtual fitting apparatus according to an embodiment of the present invention is shown, where the apparatus 600 further includes:
a collocation determining module 608 for determining collocation clothes of the target clothes according to the feature data;
the image generation module 602 includes:
an image generating sub-module 6021 for generating a three-dimensional fitting image of the user based on the feature data, the collocation clothes, and the user data acquired in advance.
Optionally, the user data is dynamic video data of the user, and the image generating module 602 includes:
the three-dimensional data acquisition sub-module is used for acquiring three-dimensional data of the target clothes according to the characteristic data of the target clothes;
and the second image generation submodule is used for synthesizing the three-dimensional data of the target clothes and the dynamic video data of the user to obtain a dynamic three-dimensional fitting image of the user.
The virtual fitting device provided by the embodiment of the invention can realize each process realized in the method embodiment of fig. 4, and is not described again to avoid repetition.
The virtual fitting device provided in the embodiment of the invention has the beneficial effects of the virtual fitting device in fig. 12, and also determines the matching clothes of the target clothes according to the feature data of the target clothes, and generates the three-dimensional fitting image of the user based on the feature data, the matching clothes and the pre-acquired user data, so that the fitting image matched with the whole body can be obtained, the fitting effect of the target clothes can be conveniently watched by the user, and the matching effect of the target clothes and other clothes can be more conveniently known by the user. In addition, the three-dimensional data of the target clothes and the dynamic video data of the user are synthesized to obtain the dynamic three-dimensional fitting image of the user, so that the user can visually see the effects of actions such as walking, running, bending, standing, sitting and the like after wearing the target clothes, the user can more comprehensively know the dressing effect in various activity scenes, and the user can be helped to more quickly obtain a purchasing decision.
Referring to fig. 16, on the basis of fig. 12, a fifth structural block diagram of a virtual fitting apparatus according to an embodiment of the present invention is shown, where the apparatus 600 further includes:
the physique information acquisition module 609 is used for acquiring physique information of the user through the sensor; the constitution information at least comprises body temperature and skin touch information of the user;
a tactile information acquisition module 610 for acquiring tactile information of the target laundry; the tactile information at least comprises material and thickness information of the target clothes;
a body feeling data generating module 611, configured to generate body feeling data of the target laundry for the user according to the body constitution information of the user and the touch information of the target laundry;
a simulation module 612 configured to control the sensor contacting the skin of the user to simulate the somatosensory data.
The virtual fitting device provided by the embodiment of the invention can realize each process realized in the method embodiment of fig. 5, and is not described again to avoid repetition.
The virtual fitting device provided in the embodiment of the present invention has the beneficial effects of the virtual fitting device in fig. 12, and also acquires the physical information of the user through a sensor, acquires the tactile information of the target clothing, generates the somatosensory data of the target clothing for the user according to the physical information of the user and the tactile information of the target clothing, and controls the sensor contacting the skin of the user to simulate the somatosensory data.
Fig. 17 is a schematic hardware configuration diagram of a mobile terminal implementing various embodiments of the invention.
The mobile terminal 700 includes, but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, a processor 710, a power supply 711, and the like. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 17 is not intended to be limiting; a mobile terminal may include more or fewer components than shown, some components may be combined, or the components may be arranged differently. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
Wherein, the processor 710 is configured to:
collecting characteristic data and user data of target clothes;
generating a three-dimensional fitting image of the user based on the feature data and the user data;
receiving the adjustment operation of a user on the bending angle of the electronic equipment;
and in response to the adjustment operation, controlling a display unit 706 to display the three-dimensional fitting image on the display screen according to the display mode corresponding to the bending angle.
In the embodiment of the invention, the characteristic data and the user data of the target clothes are collected, and the three-dimensional fitting image of the user is generated based on the characteristic data and the user data; receiving the adjustment operation of a user on the bending angle of the display screen; and responding to the adjustment operation, and displaying the three-dimensional fitting image on the display screen according to a display mode corresponding to the bending angle. In the method, the three-dimensional fitting image of the user is generated according to the feature data of the target clothes and the user data, the three-dimensional fitting image can show the three-dimensional fitting effect of the user, the fitting effect of the user can be shown according to the showing mode corresponding to the bending angle of the electronic equipment, and the user can obtain multiple fitting reference information, so that a better purchasing decision can be made, and the occurrence of goods return and change situations is reduced.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 701 may be used for receiving and transmitting signals during information transmission and reception or during a call. Specifically, it receives downlink data from a base station and sends it to the processor 710 for processing, and transmits uplink data to the base station.
The mobile terminal provides the user with wireless broadband internet access via the network module 702, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 703 may convert audio data received by the radio frequency unit 701 or the network module 702, or stored in the memory 709, into an audio signal and output it as sound. The audio output unit 703 may also provide audio output related to a specific function performed by the mobile terminal 700 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 703 includes a speaker, a buzzer, a receiver, and the like.
The input unit 704 is used to receive an audio or video signal. The input unit 704 may include a Graphics Processing Unit (GPU) 7041 and a microphone 7042. The graphics processor 7041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 706. The image frames processed by the graphics processor 7041 may be stored in the memory 709 (or other storage medium) or transmitted via the radio frequency unit 701 or the network module 702. The microphone 7042 can receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 701.
The mobile terminal 700 further includes at least one sensor 705, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 7061 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 7061 and/or the backlight when the mobile terminal 700 is moved to the ear. As a kind of motion sensor, an accelerometer can detect the magnitude of acceleration in various directions (generally three axes), detect the magnitude and direction of gravity when the mobile terminal is stationary, and can be used to identify the posture of the mobile terminal (such as horizontal/vertical screen switching, related games, and magnetometer posture calibration) and for vibration-recognition related functions (such as a pedometer or tapping). The sensors may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which will not be described herein again.
The display unit 706 is used to display information input by the user or information provided to the user. The Display unit 706 may include a Display panel 7061, and the Display panel 7061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 707 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 707 includes a touch panel 7071 and other input devices 7072. The touch panel 7071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 7071 (e.g., operations by a user on or near the touch panel 7071 using a finger, a stylus, or any other suitable object or attachment). The touch panel 7071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 710, receives a command from the processor 710, and executes the command. In addition, the touch panel 7071 can be implemented by various types such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 707 may include other input devices 7072 in addition to the touch panel 7071. In particular, the other input devices 7072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 7071 can be overlaid on the display panel 7061. When the touch panel 7071 detects a touch operation on or near it, the operation is transmitted to the processor 710 to determine the type of the touch event, and the processor 710 then provides a corresponding visual output on the display panel 7061 according to the type of the touch event.
The interface unit 708 is an interface through which an external device is connected to the mobile terminal 700. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like.
The memory 709 may mainly comprise a program storage area and a data storage area. The program storage area may store an operating system, application programs required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data, a phone book, etc.). In addition, the memory 709 may comprise a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage device.
The processor 710 is the control center of the mobile terminal. It connects various parts of the entire mobile terminal using various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 709 and calling data stored in the memory 709, thereby monitoring the mobile terminal as a whole. The processor 710 may include one or more processing units; preferably, the processor 710 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs and the like, and a modem processor, which mainly handles wireless communication.
The mobile terminal 700 may also include a power supply 711 (e.g., a battery) for powering the various components. Preferably, the power supply 711 may be logically coupled to the processor 710 via a power management system, so that functions such as managing charging, discharging, and power consumption are implemented by the power management system.
In addition, the mobile terminal 700 includes some functional modules not shown, which are not described in detail herein.
Preferably, the embodiment of the present invention further provides a mobile terminal, which includes a processor 710, a memory 709, and a computer program stored in the memory 709 and executable on the processor 710. When executed by the processor 710, the computer program implements each process of the above virtual fitting method embodiment and can achieve the same technical effect; to avoid repetition, details are not described here again.
Based on the hardware structure of the mobile terminal, the following detailed description will be made of embodiments of the present invention.
The embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored. When executed by a processor, the computer program implements each process of the above virtual fitting method embodiment and can achieve the same technical effect; to avoid repetition, details are not described here again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a series of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, magnetic disk, or optical disk) and including instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the invention is not limited to those embodiments, which are illustrative rather than restrictive. It will be apparent to those skilled in the art that various changes and modifications can be made without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (18)

1. A method for virtual fitting, applied to an electronic device, the method comprising:
collecting characteristic data and user data of target clothes;
generating a three-dimensional fitting image of the user based on the feature data and the user data;
receiving the adjustment operation of a user on the bending angle of the electronic equipment;
and responding to the adjustment operation, and displaying the three-dimensional fitting image on a display screen according to a display mode corresponding to the bending angle.
2. The method according to claim 1, wherein the displaying the three-dimensional fitting image on the display screen in the display mode corresponding to the bending angle comprises:
determining a target display angle corresponding to the bending angle;
and displaying the three-dimensional fitting image on the display screen at the target display angle.
3. The method according to claim 1, wherein there are at least two target garments, and the three-dimensional fitting image comprises at least two three-dimensional fitting images generated from the at least two garments;
the displaying the three-dimensional fitting image on a display screen in the display mode corresponding to the bending angle comprises:
selecting a target fitting image from the at least two three-dimensional fitting images according to the bending angle, wherein different three-dimensional fitting images correspond to different target garments;
and displaying the target fitting image on the display screen.
4. The method according to claim 1, wherein after the generating a three-dimensional fitting image of the user based on the feature data and the user data, the method further comprises:
detecting a category of the target garment;
cropping a target display area image from the three-dimensional fitting image according to the category of the target garment;
wherein the displaying the three-dimensional fitting image on a display screen comprises:
displaying the target display area image on the display screen.
5. The method according to claim 1, further comprising, after the collecting feature data of a target garment:
determining matching garments for the target garment according to the feature data;
wherein the generating a three-dimensional fitting image of the user based on the feature data and the user data comprises:
generating the three-dimensional fitting image of the user based on the feature data, the matching garments, and pre-collected user data.
6. The method according to claim 1, wherein the user data is dynamic video data of the user, and the generating a three-dimensional fitting image of the user based on the feature data and the user data comprises:
acquiring three-dimensional data of the target garment according to the feature data of the target garment;
and synthesizing the three-dimensional data of the target garment with the dynamic video data of the user to obtain a dynamic three-dimensional fitting image of the user.
7. The method according to claim 1, further comprising:
collecting physical information of the user through a sensor, wherein the physical information comprises at least body temperature and skin tactile information of the user;
acquiring tactile information of the target garment, wherein the tactile information comprises at least material and thickness information of the target garment;
generating somatosensory data of the target garment for the user according to the physical information of the user and the tactile information of the target garment;
and controlling the sensor in contact with the user's skin to simulate the somatosensory data.
8. The method according to claim 1, wherein after the generating a three-dimensional fitting image of the user based on the feature data and the user data, the method further comprises:
projecting and displaying the three-dimensional fitting image through a projection module of the electronic device.
9. A virtual fitting apparatus, applied to an electronic device, the apparatus comprising:
a collection module, for collecting feature data of a target garment and user data;
an image generation module, for generating a three-dimensional fitting image of the user based on the feature data and the user data;
an operation receiving module, for receiving a user's adjustment operation on a bending angle of the electronic device;
and a display module, for displaying the three-dimensional fitting image on a display screen in a display mode corresponding to the bending angle.
10. The apparatus according to claim 9, wherein the display module comprises:
a display angle determination submodule, for determining a target display angle corresponding to the bending angle;
and a first display submodule, for displaying the three-dimensional fitting image on the display screen at the target display angle.
11. The apparatus according to claim 9, wherein there are at least two target garments, and the three-dimensional fitting image comprises at least two three-dimensional fitting images generated from the at least two garments; the display module comprises:
a selection submodule, for selecting a target fitting image from the at least two three-dimensional fitting images according to the bending angle, wherein different three-dimensional fitting images correspond to different target garments;
and a second display submodule, for displaying the target fitting image on the display screen.
12. The apparatus according to claim 9, further comprising:
a category detection module, for detecting a category of the target garment;
a display area cropping module, for cropping a target display area image from the three-dimensional fitting image according to the category of the target garment;
and a third display module, for displaying the target display area image on the display screen.
13. The apparatus according to claim 9, further comprising:
a matching determination module, for determining matching garments for the target garment according to the feature data;
wherein the image generation module comprises:
an image generation submodule, for generating the three-dimensional fitting image of the user based on the feature data, the matching garments, and pre-collected user data.
14. The apparatus according to claim 9, wherein the user data is dynamic video data of the user, and the image generation module comprises:
a three-dimensional data acquisition submodule, for acquiring three-dimensional data of the target garment according to the feature data of the target garment;
and a second image generation submodule, for synthesizing the three-dimensional data of the target garment with the dynamic video data of the user to obtain a dynamic three-dimensional fitting image of the user.
15. The apparatus according to claim 9, further comprising:
a physical information collection module, for collecting physical information of the user through a sensor, wherein the physical information comprises at least body temperature and skin tactile information of the user;
a tactile information acquisition module, for acquiring tactile information of the target garment, wherein the tactile information comprises at least material and thickness information of the target garment;
a somatosensory data generation module, for generating somatosensory data of the target garment for the user according to the physical information of the user and the tactile information of the target garment;
and a simulation module, for controlling the sensor in contact with the user's skin to simulate the somatosensory data.
16. The apparatus according to claim 9, further comprising:
a projection module, for projecting and displaying the three-dimensional fitting image.
17. A mobile terminal, comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the virtual fitting method according to any one of claims 1 to 8.
18. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the virtual fitting method according to any one of claims 1 to 8.
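The patent only requires that a "display mode" correspond to the bending angle (claims 2 and 3) without fixing the mapping. As a purely illustrative sketch, not disclosed by the patent, the function names, the linear 2x scale factor, and the 30-degree banding below are all hypothetical assumptions:

```python
def bend_angle_to_display_angle(bend_deg: float) -> float:
    """Claim 2 sketch: determine a target display angle corresponding to
    the bending angle, here by linearly mapping the fold range
    (0-180 degrees) onto a full rotation (0-360 degrees) of the
    three-dimensional fitting image."""
    bend = max(0.0, min(180.0, bend_deg))  # clamp to a plausible fold range
    return bend * 2.0

def select_target_image(images: list, bend_deg: float, step_deg: float = 30.0):
    """Claim 3 sketch: with at least two candidate fitting images (one per
    target garment), pick the image whose angle band of width `step_deg`
    contains the current bending angle."""
    index = int(bend_deg // step_deg) % len(images)
    return images[index]
```

With this mapping, folding the device halfway (90 degrees) would rotate the fitting image to its back view, and each further 30 degrees of fold would cycle to the next garment's image.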
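Claim 7 likewise leaves the derivation of somatosensory data unspecified. A minimal sketch of one way user physical information could be combined with garment tactile information; the function name, the per-material factor table, and the linear warmth formula are hypothetical assumptions, not part of the disclosure:

```python
def generate_somatosensory_data(body_temp_c: float, skin_sensitivity: float,
                                material: str, thickness_mm: float) -> dict:
    """Claim 7 sketch: combine the user's physical information (body
    temperature, skin tactile sensitivity) with the garment's tactile
    information (material, thickness) into a simple haptic signal that a
    skin-contact sensor could simulate."""
    # Hypothetical per-material roughness/insulation factors; unknown
    # materials fall back to a neutral 1.0.
    material_factor = {"cotton": 1.0, "wool": 1.4, "silk": 0.8}.get(material, 1.0)
    # Warmth grows with insulation and thickness, scaled by body temperature.
    warmth = round(body_temp_c * 0.01 * material_factor * thickness_mm, 3)
    # Perceived texture scales material roughness by the user's skin sensitivity.
    texture = round(material_factor * skin_sensitivity, 3)
    return {"warmth": warmth, "texture": texture}
```

The returned dictionary stands in for the "somatosensory data" that claim 7's final step feeds back to the sensor in contact with the user's skin.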
CN201910920340.9A 2019-09-26 2019-09-26 Virtual fitting method and device, mobile terminal and computer readable storage medium Active CN110738548B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910920340.9A CN110738548B (en) 2019-09-26 2019-09-26 Virtual fitting method and device, mobile terminal and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN110738548A true CN110738548A (en) 2020-01-31
CN110738548B CN110738548B (en) 2021-11-09

Family

ID=69269569

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910920340.9A Active CN110738548B (en) 2019-09-26 2019-09-26 Virtual fitting method and device, mobile terminal and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN110738548B (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160163109A1 (en) * 2013-08-02 2016-06-09 Seiko Epson Corporation Display device, head mounted display, display system, and control method for display device
CN106651498A (en) * 2016-09-29 2017-05-10 宇龙计算机通信科技(深圳)有限公司 Information processing method and device
CN108172161A (en) * 2017-12-28 2018-06-15 努比亚技术有限公司 Display methods, mobile terminal and computer readable storage medium based on flexible screen
CN109388229A (en) * 2017-08-11 2019-02-26 哈尔滨工业大学 A kind of immersion virtual fit method and system with sense of touch experience
CN109960482A (en) * 2019-02-28 2019-07-02 努比亚技术有限公司 A kind of 3D rendering display methods, terminal and computer readable storage medium


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113584842A (en) * 2020-04-30 2021-11-02 云米互联科技(广东)有限公司 Fan control method, control terminal and computer readable storage medium
CN113584842B (en) * 2020-04-30 2023-12-12 云米互联科技(广东)有限公司 Fan control method, control terminal and computer readable storage medium
CN112685649A (en) * 2021-01-25 2021-04-20 深圳创维-Rgb电子有限公司 Clothing recommendation method and device, storage medium and terminal equipment


Similar Documents

Publication Publication Date Title
CN112162671B (en) Live broadcast data processing method and device, electronic equipment and storage medium
CN109361865B (en) Shooting method and terminal
CN108184050B (en) Photographing method and mobile terminal
JP7230055B2 (en) Application program display adaptation method and device, terminal, storage medium, and computer program
CN109600550B (en) Shooting prompting method and terminal equipment
CN110809115B (en) Shooting method and electronic equipment
CN108712603B (en) Image processing method and mobile terminal
CN107817939B (en) Image processing method and mobile terminal
CN109499061B (en) Game scene picture adjusting method and device, mobile terminal and storage medium
CN109218648B (en) Display control method and terminal equipment
CN109409244B (en) Output method of object placement scheme and mobile terminal
CN110970003A (en) Screen brightness adjusting method and device, electronic equipment and storage medium
CN108683850B (en) Shooting prompting method and mobile terminal
CN108804546B (en) Clothing matching recommendation method and terminal
CN109618218B (en) Video processing method and mobile terminal
CN109671034B (en) Image processing method and terminal equipment
CN109448069B (en) Template generation method and mobile terminal
CN109461124A (en) A kind of image processing method and terminal device
CN110650367A (en) Video processing method, electronic device, and medium
CN110738548B (en) Virtual fitting method and device, mobile terminal and computer readable storage medium
CN111314616A (en) Image acquisition method, electronic device, medium and wearable device
CN110086998B (en) Shooting method and terminal
CN108346083B (en) Information processing method and mobile terminal
CN107563353B (en) Image processing method and device and mobile terminal
CN111405361B (en) Video acquisition method, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant