CN108921933A - Real clothing fitting method and apparatus - Google Patents
Real clothing fitting method and apparatus
- Publication number
- CN108921933A (application CN201810460227.2A)
- Authority
- CN
- China
- Prior art keywords
- model
- dimensional
- clothes
- real
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Shopping interfaces
- G06Q30/0643—Graphical representation of items or shoppers
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/16—Cloth
Abstract
The invention belongs to the field of three-dimensional clothing fitting and relates to a real clothing fitting method and apparatus. A three-dimensional clothing model library, a two-dimensional garment piece model library and a three-dimensional human body model library are established. The user selects the real garment to be tried on; a three-dimensional clothing model is chosen according to the type of the real garment, the corresponding garment piece models are selected from the garment piece database, and the two-dimensional garment piece models are fitted onto the three-dimensional clothing model to obtain an initial three-dimensional model of the real garment. The initial three-dimensional model is displayed on the screen, and the garment-piece contour lines on it guide the camera in collecting texture data of the real garment. Texture mapping is then applied to the initial model to obtain the three-dimensional model of the real garment. A three-dimensional human body model of the person trying on the garment is built, and collision detection between the garment model and the body model realizes virtual fitting. The invention solves real-time modeling and fitting of real garments and improves the realism of the three-dimensional clothing model.
Description
Technical Field
The invention belongs to the field of three-dimensional clothing fitting and particularly relates to a real clothing fitting method and device.
Background
At present, as online shopping becomes part of people's daily life, virtual clothes fitting is gradually becoming a hot topic in the field of three-dimensional modeling: human face data is captured through a camera, a three-dimensional human body model is built in real time, the user then selects three-dimensional virtual clothes from a model library to try on, and the fitting effect is shown on the interface.
However, in the prior art a user can only try on virtual clothes that already exist in a database; real clothes that have not been modeled cannot be tried on, so the application range of virtual clothes fitting is narrow. Secondly, the three-dimensional modeling of real clothes is too simple: it is performed by sewing only a front garment piece and a rear garment piece together, so the resulting model cannot reflect the details of the real clothes and looks much like computer-generated clothing.
Disclosure of Invention
To address these defects, the invention provides a real clothes fitting method and device for modeling and fitting real clothes in real time and for improving the realism of the three-dimensional clothes model.
The invention relates to a real dress fitting method, which comprises the following specific steps:
step S1: establishing a clothing model library, and establishing different three-dimensional clothing models aiming at different clothing types;
step S2: establishing a garment piece database, and establishing different two-dimensional garment piece models according to the shapes of the garment's cut pieces;
step S3: establishing a human body three-dimensional model library, wherein the model can dynamically change according to body type data input by a user;
step S4: selecting real clothes to be tried on by a user, selecting a three-dimensional clothes model according to the type of the real clothes through a trying-on terminal, selecting a corresponding clothes piece model from a clothes piece database according to cutting of the real clothes, and fitting a two-dimensional clothes piece model on the three-dimensional clothes model according to the operation of the user to obtain an initial three-dimensional model of the real clothes;
step S5: when the terminal camera collects data of the real clothes, displaying an initial three-dimensional model of the real clothes on a screen, and guiding the camera to collect texture data of the real clothes according to the contour line of the garment piece on the initial three-dimensional model;
step S6: after the texture data of the real clothes are collected, performing texture mapping on the initial three-dimensional model to obtain a three-dimensional model of the real clothes;
step S7: deriving a three-dimensional human body model from a human body three-dimensional model library according to the gender of the person to be tried, and further deforming the three-dimensional human body model according to body type data and head portrait information input by a user to obtain the three-dimensional human body model of the person to be tried;
step S8: and performing collision detection on the three-dimensional model of the real clothes and the three-dimensional human body model of the person to be tried on to realize virtual fitting of the clothes.
Meanwhile, the invention also provides a real dress fitting device, which comprises the following modules:
a clothing model library establishing module: used for establishing a clothing model library and establishing different three-dimensional clothing models for different clothing types;
a garment piece model library establishing module: used for establishing a garment piece database and establishing different two-dimensional garment piece models according to the shapes of the garment's cut pieces;
a human body three-dimensional model library establishing module: establishing a human body three-dimensional model library, wherein the model can dynamically change according to body type data input by a user;
an initial three-dimensional model acquisition module: selecting real clothes to be tried on by a user, selecting a three-dimensional clothes model according to the type of the real clothes through a trying-on terminal, selecting a corresponding clothes piece model from a clothes piece database according to cutting of the real clothes, and fitting a two-dimensional clothes piece model on the three-dimensional clothes model according to the operation of the user to obtain an initial three-dimensional model of the real clothes;
a texture data acquisition module: the method is used for displaying an initial three-dimensional model of the real clothes on a screen when a terminal camera collects data of the real clothes, and guiding the camera to collect texture data of the real clothes according to the contour line of a garment piece on the initial three-dimensional model;
the clothes three-dimensional model building module: after the texture data of the real clothes are collected, texture mapping is carried out on the initial three-dimensional model to obtain a three-dimensional model of the real clothes;
a human body three-dimensional model building module: used for deriving a three-dimensional human body model from the human body three-dimensional model library according to the gender of the person to be fitted, and for further deforming the model according to the body type data and head portrait information input by the user, so as to obtain the three-dimensional human body model of the person to be fitted;
a fitting module: the method is used for detecting the collision between the three-dimensional model of the real clothes and the three-dimensional human body model of the person to be tried on, and realizing the virtual try-on of the clothes.
In the invention, a garment is divided into a plurality of garment pieces that can be customized according to the real cutting of the real garment; during texture acquisition, the texture is captured under the guidance of the garment-piece contours, and texture mapping is performed piece by piece. Moreover, fitting does not depend on garments already modeled in the system: when a user is interested in a garment but does not want to try it on physically, the user only needs to take out a mobile phone, and real-time three-dimensional modeling and fitting of the real garment can be completed through a few selections and photographs; if the virtual result looks good, the user can then try the garment on for real.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
FIG. 2 is a view showing the structure of the apparatus of the present invention.
Detailed Description
The embodiments are described in detail below with reference to the accompanying drawings.
Fig. 1 is the flow chart of the method of the invention:
the method of the invention comprises the following steps:
step S1: establishing a clothing model library, and establishing different three-dimensional clothing models aiming at different clothing types;
step S2: establishing a garment piece database, and establishing different two-dimensional garment piece models according to the shapes of the garment's cut pieces;
wherein the shapes of the cut garment pieces include: common garment piece shapes and user-defined garment piece shapes.
A user-defined garment piece shape is a shape drawn manually by the user according to the cutting of the current real garment when no suitable shape can be found among the common garment piece shapes; after the user confirms the input, the drawn shape is smoothed. The smoothing consists of fitting straight or curved lines to the contour of the user-drawn shape.
Dividing the garment into a plurality of garment pieces for modeling improves the accuracy of garment modeling.
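As an illustration only (the patent does not specify a particular fitting algorithm), the following Python sketch smooths a hand-drawn closed panel contour by Chaikin corner cutting; NumPy and the function name `chaikin_smooth` are assumptions of this sketch, not part of the disclosure.

```python
import numpy as np

def chaikin_smooth(points, iterations=2):
    """Smooth a hand-drawn closed garment-piece contour by Chaikin corner cutting."""
    pts = np.asarray(points, dtype=float)
    for _ in range(iterations):
        p0 = pts
        p1 = np.roll(pts, -1, axis=0)      # next vertex (the contour is closed)
        q = 0.75 * p0 + 0.25 * p1          # point 1/4 of the way along each edge
        r = 0.25 * p0 + 0.75 * p1          # point 3/4 of the way along each edge
        pts = np.empty((2 * len(p0), 2))
        pts[0::2], pts[1::2] = q, r        # interleave to form the refined contour
    return pts

# A jittery, hand-drawn quadrilateral becomes a rounded, smooth outline.
rough = np.array([[0.0, 0.0], [10.2, 0.4], [9.8, 15.1], [0.3, 14.7]])
smoothed = chaikin_smooth(rough, iterations=3)
```

A least-squares line or spline fit per contour segment, as literally described above, would serve the same purpose.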
Step S3: establishing a human body three-dimensional model library, wherein the model can dynamically change according to body type data input by a user;
step S4: selecting real clothes to be tried on by a user, selecting a three-dimensional clothes model according to the type of the real clothes through a trying-on terminal, selecting a corresponding clothes piece model from a clothes piece database according to cutting of the real clothes, and fitting a two-dimensional clothes piece model on the three-dimensional clothes model according to the operation of the user to obtain an initial three-dimensional model of the real clothes;
step S5: when the terminal camera collects data of the real clothes, displaying an initial three-dimensional model of the real clothes on a screen, and guiding the camera to collect texture data of the real clothes according to the contour line of the garment piece on the initial three-dimensional model;
Guiding the camera to collect texture data of the real garment according to the garment piece contour lines on the initial three-dimensional model specifically includes: making the contour lines on the initial three-dimensional model coincide with the cutting contour lines of the real garment; if the cutting contour lines of the real garment cannot be matched by the contour lines on the initial three-dimensional model, the user can further adjust the contour lines on the initial three-dimensional model, and the contour lines of the three-dimensional garment model are updated according to the adjustment; after the texture data have been collected, they are divided according to the contour lines.
The garment piece contour lines on the three-dimensional garment model guide the user in collecting garment texture data, which improves the effectiveness of texture collection.
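For illustration, a minimal Python sketch of the final division step: once the on-screen contour has been aligned with the real garment, the pixels of one garment piece are cut out of the captured photo. NumPy and Matplotlib are assumed to be available, and the function name is hypothetical.

```python
import numpy as np
from matplotlib.path import Path

def extract_piece_texture(image, contour):
    """Keep only the pixels that fall inside one garment-piece contour.

    image:   (H, W, 3) RGB array captured by the terminal camera.
    contour: (N, 2) array of (x, y) outline vertices, already aligned with
             the cutting lines of the real garment on screen.
    """
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    pixels = np.column_stack([xs.ravel(), ys.ravel()])
    mask = Path(contour).contains_points(pixels).reshape(h, w)
    piece = image.copy()
    piece[~mask] = 0          # pixels outside the piece are discarded
    return piece, mask
```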
Step S6: after the texture data of the real garment have been collected, texture mapping is performed on the initial three-dimensional model to obtain the three-dimensional model of the real garment; the texture mapping is performed with the garment pieces as units.
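The patent does not fix a UV parameterisation, so the sketch below is only one possible reading: each garment piece keeps its own texture image (for example the output of the masking sketch above) and its vertices receive a simple planar UV projection; all names are hypothetical.

```python
import numpy as np

def planar_uv(vertices_3d, axis_u=0, axis_v=1):
    """Project a piece's 3D vertices onto two axes and normalise into [0, 1] texture space."""
    uv = np.asarray(vertices_3d, dtype=float)[:, [axis_u, axis_v]]
    uv -= uv.min(axis=0)
    uv /= np.maximum(uv.max(axis=0), 1e-9)   # avoid division by zero on degenerate pieces
    return uv

def build_textured_model(pieces):
    """pieces: list of dicts, each holding a piece's 'vertices', 'faces' and the
    'texture' image extracted for that piece; mapping stays per garment piece."""
    return [{
        "faces": piece["faces"],
        "uv": planar_uv(piece["vertices"]),
        "texture": piece["texture"],
    } for piece in pieces]
```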
Step S7: deriving a three-dimensional human body model from a human body three-dimensional model library according to the gender of the person to be tried, and further deforming the three-dimensional human body model according to body type data and head portrait information input by a user to obtain the three-dimensional human body model of the person to be tried;
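A toy sketch of the deformation idea in step S7, scaling a template mesh by the ratio of the entered measurements to the template's measurements; real parametric body models use learned shape spaces, so this is an assumption-laden illustration rather than the patented procedure:

```python
import numpy as np

def deform_body(template_vertices, template_measure, user_measure):
    """Scale a template body mesh (y axis assumed vertical) by measurement ratios."""
    v = np.asarray(template_vertices, dtype=float).copy()
    v[:, 1] *= user_measure["height"] / template_measure["height"]
    girth_ratio = (user_measure["chest"] + user_measure["waist"]) / (
        template_measure["chest"] + template_measure["waist"])
    v[:, [0, 2]] *= girth_ratio     # widen or narrow the horizontal cross-sections
    return v
```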
step S8: and performing collision detection on the three-dimensional model of the real clothes and the three-dimensional human body model of the person to be tried on to realize virtual fitting of the clothes.
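For step S8, a minimal sketch of per-vertex collision detection and resolution between the garment mesh and the body mesh, assuming SciPy is available and that outward body-vertex normals have been precomputed; a production fitter would add cloth simulation on top of this:

```python
import numpy as np
from scipy.spatial import cKDTree

def resolve_penetration(garment_v, body_v, body_normals, offset=0.005):
    """Push garment vertices that sit inside the body back to a small offset above the skin."""
    tree = cKDTree(body_v)                      # nearest-neighbour structure over body vertices
    _, idx = tree.query(garment_v)              # closest body vertex for each garment vertex
    to_garment = garment_v - body_v[idx]
    depth = np.einsum("ij,ij->i", to_garment, body_normals[idx])   # signed distance along normal
    inside = depth < offset                     # collision detected
    out = np.asarray(garment_v, dtype=float).copy()
    out[inside] += (offset - depth[inside])[:, None] * body_normals[idx][inside]
    return out
```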
The structure of the apparatus of the present invention, shown in fig. 2, is as follows: it comprises a clothing model library establishing module, a garment piece model library establishing module, a human body three-dimensional model library establishing module, an initial three-dimensional model obtaining module, a texture data collecting module, a clothing three-dimensional model establishing module, a human body three-dimensional model establishing module and a fitting module.
Wherein,
a clothing model library establishing module: used for establishing a clothing model library and establishing different three-dimensional clothing models for different clothing types;
a garment piece model library establishing module: used for establishing a garment piece database and establishing different two-dimensional garment piece models according to the shapes of the garment's cut pieces; the garment piece shapes include common garment piece shapes and user-defined garment piece shapes; a user-defined garment piece shape is a shape drawn manually by the user according to the cutting of the current real garment when no suitable shape can be found among the common garment piece shapes, and after the user confirms the input the drawn shape is smoothed; the smoothing consists of fitting straight or curved lines to the contour of the user-drawn shape.
A human body three-dimensional model library establishing module: establishing a human body three-dimensional model library, wherein the model can dynamically change according to body type data input by a user;
an initial three-dimensional model acquisition module: selecting real clothes to be tried on by a user, selecting a three-dimensional clothes model according to the type of the real clothes through a trying-on terminal, selecting a corresponding clothes piece model from a clothes piece database according to cutting of the real clothes, and fitting a two-dimensional clothes piece model on the three-dimensional clothes model according to the operation of the user to obtain an initial three-dimensional model of the real clothes;
a texture data acquisition module: the method is used for displaying an initial three-dimensional model of the real clothes on a screen when a terminal camera collects data of the real clothes, and guiding the camera to collect texture data of the real clothes according to the contour line of a garment piece on the initial three-dimensional model;
Guiding the camera to collect texture data of the real garment according to the garment piece contour lines on the initial three-dimensional model specifically includes: making the contour lines on the initial three-dimensional model coincide with the cutting contour lines of the real garment; if the cutting contour lines of the real garment cannot be matched by the contour lines on the initial three-dimensional model, the user can further adjust the contour lines on the initial three-dimensional model, and the contour lines of the three-dimensional garment model are updated according to the adjustment; after the texture data have been collected, they are divided according to the contour lines.
The clothes three-dimensional model building module: after the texture data of the real garment have been collected, texture mapping is performed on the initial three-dimensional model to obtain the three-dimensional model of the real garment; the texture mapping is performed with the garment pieces as units.
A human body three-dimensional model building module: used for deriving a three-dimensional human body model from the human body three-dimensional model library according to the gender of the person to be fitted, and for further deforming the model according to the body type data and head portrait information input by the user, so as to obtain the three-dimensional human body model of the person to be fitted;
a fitting module: the method is used for detecting the collision between the three-dimensional model of the real clothes and the three-dimensional human body model of the person to be tried on, and realizing the virtual try-on of the clothes.
The above embodiments are only preferred embodiments of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are also within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (10)
1. A real dress fitting method is characterized in that: the method comprises the following steps:
step S1: establishing a clothing model library, and establishing different three-dimensional clothing models aiming at different clothing types;
step S2: establishing a garment piece database, and establishing different two-dimensional garment piece models according to the shapes of the garment's cut pieces;
step S3: establishing a human body three-dimensional model library, wherein the model can dynamically change according to body type data input by a user;
step S4: selecting real clothes to be tried on by a user, selecting a three-dimensional clothes model according to the type of the real clothes through a trying-on terminal, selecting a corresponding clothes piece model from a clothes piece database according to cutting of the real clothes, and fitting a two-dimensional clothes piece model on the three-dimensional clothes model according to the operation of the user to obtain an initial three-dimensional model of the real clothes;
step S5: when the terminal camera collects data of the real clothes, displaying an initial three-dimensional model of the real clothes on a screen, and guiding the camera to collect texture data of the real clothes according to the contour line of the garment piece on the initial three-dimensional model;
step S6: after the texture data of the real clothes are collected, performing texture mapping on the initial three-dimensional model to obtain a three-dimensional model of the real clothes;
step S7: deriving a three-dimensional human body model from a human body three-dimensional model library according to the gender of the person to be tried, and further deforming the three-dimensional human body model according to body type data and head portrait information input by a user to obtain the three-dimensional human body model of the person to be tried;
step S8: and performing collision detection on the three-dimensional model of the real clothes and the three-dimensional human body model of the person to be tried on to realize virtual fitting of the clothes.
2. The method of claim 1, wherein in step S2 the shapes of the cut garment pieces comprise: common garment piece shapes and user-defined garment piece shapes.
3. The method of claim 2, wherein a user-defined garment piece shape is a shape drawn manually by the user according to the cutting of the current real garment when no suitable corresponding shape can be found among the common garment piece shapes, and the drawn shape is smoothed after the user confirms the input.
4. The method of claim 3, wherein said smoothing comprises fitting a straight or curved line to the user-drawn shape contour.
5. The method of claim 1, wherein in step S5 guiding the camera to collect texture data of the real garment according to the garment piece contour lines on the initial three-dimensional model includes: making the contour lines on the initial three-dimensional model coincide with the cutting contour lines of the real garment; if the cutting contour lines of the real garment cannot be matched by the contour lines on the initial three-dimensional model, the user can further adjust the contour lines on the initial three-dimensional model, and the contour lines of the three-dimensional garment model are updated according to the adjustment; after the texture data have been collected, they are divided according to the contour lines.
6. The method of claim 1, wherein the texture mapping comprises: performing texture mapping with the garment pieces as units.
7. A real dress fitting device, characterized in that the device comprises the following modules:
a clothing model library establishing module: used for establishing a clothing model library and establishing different three-dimensional clothing models for different clothing types;
a garment piece model library establishing module: used for establishing a garment piece database and establishing different two-dimensional garment piece models according to the shapes of the garment's cut pieces;
a human body three-dimensional model library establishing module: establishing a human body three-dimensional model library, wherein the model can dynamically change according to body type data input by a user;
an initial three-dimensional model acquisition module: selecting real clothes to be tried on by a user, selecting a three-dimensional clothes model according to the type of the real clothes through a trying-on terminal, selecting a corresponding clothes piece model from a clothes piece database according to cutting of the real clothes, and fitting a two-dimensional clothes piece model on the three-dimensional clothes model according to the operation of the user to obtain an initial three-dimensional model of the real clothes;
a texture data acquisition module: the method is used for displaying an initial three-dimensional model of the real clothes on a screen when a terminal camera collects data of the real clothes, and guiding the camera to collect texture data of the real clothes according to the contour line of a garment piece on the initial three-dimensional model;
the clothes three-dimensional model building module: after the texture data of the real clothes are collected, texture mapping is carried out on the initial three-dimensional model to obtain a three-dimensional model of the real clothes;
a human body three-dimensional model building module: used for deriving a three-dimensional human body model from the human body three-dimensional model library according to the gender of the person to be fitted, and for further deforming the model according to the body type data and head portrait information input by the user, so as to obtain the three-dimensional human body model of the person to be fitted;
a fitting module: the method is used for detecting the collision between the three-dimensional model of the real clothes and the three-dimensional human body model of the person to be tried on, and realizing the virtual try-on of the clothes.
8. The apparatus of claim 7, wherein the garment piece shapes comprise: common garment piece shapes and user-defined garment piece shapes; guiding the camera to collect texture data of the real garment according to the garment piece contour lines on the initial three-dimensional model specifically includes: making the contour lines on the initial three-dimensional model coincide with the cutting contour lines of the real garment; if the cutting contour lines of the real garment cannot be matched by the contour lines on the initial three-dimensional model, the user can further adjust the contour lines on the initial three-dimensional model, and the contour lines of the three-dimensional garment model are updated according to the adjustment; after the texture data have been collected, they are divided according to the contour lines; the texture mapping comprises: performing texture mapping with the garment pieces as units.
9. The apparatus of claim 8, wherein a user-defined garment piece shape is a shape drawn manually by the user according to the cutting of the current real garment when no suitable corresponding shape can be found among the common garment piece shapes, and the drawn shape is smoothed after the user confirms the input.
10. The apparatus of claim 9, wherein said smoothing comprises fitting a straight or curved line to the user-drawn shape contour.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810460227.2A CN108921933A (en) | 2018-05-14 | 2018-05-14 | Real clothing fitting method and apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108921933A true CN108921933A (en) | 2018-11-30 |
Family
ID=64402631
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810460227.2A (CN108921933A, Withdrawn) | Real clothing fitting method and apparatus | 2018-05-14 | 2018-05-14 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108921933A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112785723A (en) * | 2021-01-29 | 2021-05-11 | 哈尔滨工业大学 | Automatic garment modeling method based on two-dimensional garment image and three-dimensional human body model |
CN113516581A (en) * | 2020-04-14 | 2021-10-19 | 阿里巴巴集团控股有限公司 | Data processing method, device and equipment |
CN115690181A (en) * | 2022-11-07 | 2023-02-03 | 深圳市诗恩商业智能有限公司 | Model proportion calculation method based on feature fusion and RBF (radial basis function) network |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| WW01 | Invention patent application withdrawn after publication | Application publication date: 20181130 |