CN105204630A - Method and system for garment design through motion sensing - Google Patents

Method and system for garment design through motion sensing

Info

Publication number
CN105204630A
Authority
CN
China
Prior art keywords
gesture
motion trajectory
garment
garment design
motion sensing
Prior art date
Legal status
Granted
Application number
CN201510563427.7A
Other languages
Chinese (zh)
Other versions
CN105204630B (en)
Inventor
刘威 (Liu Wei)
Current Assignee
Liu Wei
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to CN201510563427.7A
Publication of CN105204630A
Application granted
Publication of CN105204630B
Legal status: Active


Landscapes

  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a method and a system for garment design through motion sensing. Garment design is performed through motion sensing, so that a user can freely modify garment information within a scene shown on a display device and can obtain a garment pattern drawing through calculation. A motion-sensing operating system based on 3D (three-dimensional) gestures is designed: 3D gesture detection is used to detect the various motions of the user's hands and body in 3D space, and a mapping is defined between the 3D gestures and changes of the garment structure shown on the display device, so that the user performs various 3D gesture motions in real space and the 3D garment displayed in three dimensions changes accordingly. The garment design cycle can thereby be greatly shortened, and garment customization efficiency improved.

Description

Method and system for garment design through motion sensing
Technical field
The invention belongs to the field of computer-aided design technology, and specifically relates to a method and a system for garment design through motion sensing.
Background technology
At present, digital-human-model fitting has appeared in the garment industry: a user tries on clothes virtually on a human model of himself or herself and, according to the fitting effect, decides whether to buy. Because most garment design starts from a standard figure, while body types on the garment retail market differ, there is often a mismatch between the designed garment and the buyer's body type, or a clash between the garment's dominant color and the buyer's complexion. The result is that designed garments do not reach a satisfactory sales volume, while people often cannot buy clothes that fit them well. Although digital-human-model fitting can help people select clothes, it does not provide a solution for customizing garments for an individual body from a digital human model. Moreover, common garment design relies on keyboard-and-mouse interaction; the software has a long learning curve and is inconvenient to use.
Therefore, the present inventor urgently needed to design a new technique to solve these problems.
Summary of the invention
The present invention aims to provide a method and a system for garment design through motion sensing, which can greatly shorten the garment design cycle and improve garment customization efficiency.
To solve the above technical problem, the technical solution of the present invention is as follows:
A method for garment design through motion sensing comprises the following steps:
S1: recognize an operator's gesture and its motion trajectory in real space, wherein the motion trajectory of the gesture comprises the shape of the trajectory and/or the positions of selected points on the trajectory.
S2: parse the gesture category and its motion trajectory to obtain a corresponding operation command.
S3: according to the operation command, edit the clothing information in the 3D human model and/or the 3D garment and display it in three dimensions in real time.
S4: repeat steps S1 to S3 until editing ends, and save the edited clothing information.
Further, step S2 specifically comprises:
S21: determine the gesture category according to pre-agreed gesture shapes.
S22: quantize the motion trajectory of the gesture to obtain trace information.
S23: obtain the corresponding operation command according to the real-time gesture category and the trace information.
Further, the method also comprises:
S5: according to the clothing information, calculate and generate a garment pattern drawing, wherein the garment pattern drawing is the set of whole-garment and component drawings used for garment production.
Further, step S1 specifically comprises:
S11: obtain the coordinates in real space of every point in the operator's gesture.
S12: take at least one point in the gesture as a measurement reference point, and identify the motion trajectory by calculating the coordinates of this reference point at each position during the motion.
Further, the clothing information comprises garment structure lines and garment structure points; a garment structure line is the general term for the garment elements that can change the garment's shape and for the outer and inner lines that need to be sewn, and a garment structure point is an end point of a garment structure line.
A system for garment design through motion sensing comprises:
a recognition module, for recognizing an operator's gesture and its motion trajectory in real space, wherein the motion trajectory of the gesture comprises the shape of the trajectory and/or the positions of selected points on the trajectory;
a parsing module, for parsing the gesture category and its motion trajectory to obtain a corresponding operation command;
an editing module, for editing, according to the operation command, the clothing information in the 3D human model and/or the 3D garment and displaying it in three dimensions in real time;
a saving module, for saving the edited clothing information.
Further, the parsing module specifically comprises:
a gesture parsing unit, for determining the gesture category according to pre-agreed gesture shapes;
a trajectory parsing unit, for quantizing the motion trajectory of the gesture to obtain trace information;
an operation-command mapping unit, for obtaining the corresponding operation command according to the real-time gesture category and the trace information.
Further, the system also comprises:
a garment pattern drawing generation module, for calculating and generating a garment pattern drawing according to the clothing information, wherein the garment pattern drawing is the set of whole-garment and component drawings used for garment production.
Further, the recognition module specifically comprises:
a gesture recognition unit, for obtaining the coordinates in real space of every point in the operator's gesture;
a trajectory recognition unit, for taking at least one point in the gesture as a measurement reference point and identifying the motion trajectory by calculating the coordinates of this reference point at each position during the motion.
Further, the clothing information comprises garment structure lines and garment structure points; a garment structure line is the general term for the garment elements that can change the garment's shape and for the outer and inner lines that need to be sewn, and a garment structure point is an end point of a garment structure line.
By adopting the above technical solution, the present invention has at least the following beneficial effects:
The method and system for garment design through motion sensing of the present invention realize garment design through motion sensing, so that a user can freely modify clothing information within a scene on a display device and can obtain a garment pattern drawing through calculation. The invention designs a motion-sensing operating system based on three-dimensional gestures: three-dimensional gesture detection is used to detect the various motions of the hands and body in three-dimensional space, and a mapping is designed between the three-dimensional gestures and the changes of the garment structure on the display device, so that a person can perform various three-dimensional gesture motions in real space and the 3D garment displayed in three dimensions changes accordingly. This greatly shortens the garment design cycle and improves garment customization efficiency.
Brief description of the drawings
Fig. 1 is a flowchart of the method for garment design through motion sensing of the present invention;
Fig. 2 is a schematic diagram of the architecture of the system for garment design through motion sensing of the present invention.
Detailed description of the embodiments
The present invention is further described below with reference to the drawings and embodiments.
Embodiment 1
As shown in Fig. 1, a method for garment design through motion sensing according to the present embodiment comprises the following steps:
S1: recognize an operator's gesture and its motion trajectory in real space, wherein the motion trajectory of the gesture comprises the shape of the trajectory and/or the positions of selected points on the trajectory.
S2: parse the gesture category and its motion trajectory to obtain a corresponding operation command.
S3: according to the operation command, edit the clothing information in the 3D human model and/or the 3D garment and display it in three dimensions in real time.
S4: repeat steps S1 to S3 until editing ends, and save the edited clothing information. The clothing information file is saved to memory or to hard disk.
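The save step in S4 can be sketched in a few lines; the following is a minimal illustration in Python (no implementation language is specified by the patent), and the JSON serialization and the function name are assumptions — the patent only requires saving the clothing information file to memory or to hard disk.

```python
import json

def save_clothing_info(clothing_info, path=None):
    """Serialize the edited clothing information. With no path, the
    serialized form is returned and kept in memory by the caller;
    with a path, it is written to hard disk."""
    data = json.dumps(clothing_info)
    if path is None:
        return data  # kept in memory
    with open(path, "w", encoding="utf-8") as f:
        f.write(data)
    return path
```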
Specifically, step S1 comprises:
S11: obtain the coordinates in real space of every point in the operator's gesture.
S12: take at least one point in the gesture as a measurement reference point, and identify the motion trajectory by calculating the coordinates of this reference point at each position during the motion. For example, a fingertip is taken as the measurement reference point, and the motion trajectory is identified by calculating the coordinates of this reference point at each position during the motion. All of these coordinate measurements can be realized with stereoscopic vision techniques and are not described further here.
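Step S12 can be sketched as follows, under stated assumptions: Python, with the fingertip's sampled 3D coordinates supplied by some upstream stereoscopic-vision stage that is outside this sketch, and with both function names illustrative.

```python
import math

def identify_trajectory(samples):
    """Treat the sampled 3D coordinates of the measurement reference
    point (e.g. a fingertip) as the motion trajectory."""
    return [tuple(p) for p in samples]

def trajectory_length(trajectory):
    """Total path length of the trajectory — one quantity that the
    later quantization step (S22) could extract as trace information."""
    return sum(math.dist(a, b) for a, b in zip(trajectory, trajectory[1:]))
```

For instance, a fingertip sampled at (0,0,0), (1,0,0), (1,1,0) yields a trajectory of total length 2.0.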
Further, step S2 specifically comprises:
S21: determine the gesture category according to pre-agreed gesture shapes.
S22: quantize the motion trajectory of the gesture to obtain trace information.
S23: obtain the corresponding operation command according to the real-time gesture category and the trace information.
For example:
The positions of the two palm centers are measured, and the straight line connecting the two palm centers is denoted AB.
When the final value of the length of line AB minus its initial value is greater than a preset threshold, the motion trajectory is determined to be the two hands moving apart.
When the initial value of the length of line AB minus its final value is greater than a preset threshold, the motion trajectory is determined to be the two hands moving together.
When the displacement of the midpoint of line AB is greater than a preset threshold, the motion trajectory is determined to be a two-hand pan.
The tilt angle between line AB and the horizontal is measured; when the difference between the initial and final values of this tilt angle is greater than a preset threshold, the motion trajectory is determined to be a two-hand rotation.
The distance between the thumb and the index finger is measured and denoted x.
When the initial value of x is greater than a preset threshold and its final value is less than that threshold, the motion trajectory is determined to be an index-thumb pinch.
When the initial value of x is less than the threshold and its final value is greater than it, the motion trajectory is determined to be an index-thumb release.
When both the initial and final values of x are less than the threshold, the motion trajectory is determined to be an index-thumb drag.
The corresponding operation commands specifically comprise:
When the motion trajectory is the two hands moving apart, the 3D human model and garment in the display device are scaled so that the view pulls back; the field of view widens, which is suitable for observing the overall effect.
When the motion trajectory is the two hands moving together, the 3D human model and garment are scaled so that the view pulls in; the field of view narrows, which is suitable for observing a particular detail.
When the motion trajectory is a two-hand pan, the 3D human model and garment in the display device are panned according to the position of the midpoint of line AB, which is suitable for observing other parts of the scene.
When the motion trajectory is a two-hand rotation, the 3D human model and garment in the display device are rotated according to the angle through which line AB rotates, which is suitable for observing an object from any angle.
When the motion trajectory is an index-thumb pinch, clothing information is selected according to the position of the index fingertip. The clothing information includes but is not limited to garment structure lines and garment structure points; a garment structure line is the general term for the garment elements that can change the garment's shape and for the outer and inner lines that need to be sewn, and a garment structure point is an end point of a garment structure line.
Specifically, pinching a garment structure point: when the index finger and thumb pinch, the structure point nearest the index fingertip in the display device is selected and highlighted.
Dragging a garment structure point: as the hand moves in three-dimensional space, the selected structure point follows the position of the hand in the display device, and the structure lines having this point as an end point are adjusted along with it.
Releasing a garment structure point: after the index finger and thumb release, the change to the structure point is fixed and the point is no longer highlighted.
Pinching a garment structure line: when the index finger and thumb pinch, the structure line nearest the index fingertip in the display device is selected and highlighted.
Dragging a garment structure line: as the hand moves in three-dimensional space, the shape of the selected structure line follows the position of the hand and changes.
Releasing a garment structure line: after the index finger and thumb release, the change to the structure line is fixed and the line is no longer highlighted.
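The pinch / drag / release interaction on structure points can be sketched as a small state holder. This is a Python illustration under assumptions: Euclidean nearest-point selection, a dict of named points, and all class and method names invented here; the companion interaction on structure lines would follow the same pattern.

```python
import math

class StructurePointEditor:
    """Holds garment structure points (name -> 3D coordinate) and the
    currently selected (highlighted) point, if any."""

    def __init__(self, points):
        self.points = dict(points)
        self.selected = None  # name of the highlighted point

    def pinch(self, fingertip):
        # Pinch: select the structure point nearest the index fingertip.
        self.selected = min(
            self.points,
            key=lambda name: math.dist(self.points[name], fingertip))
        return self.selected

    def drag(self, fingertip):
        # Drag: the selected point follows the hand's position; in the
        # full system, structure lines ending at this point would be
        # re-adjusted here as well.
        if self.selected is not None:
            self.points[self.selected] = tuple(fingertip)

    def release(self):
        # Release: fix the change and clear the highlight.
        self.selected = None
```

Usage mirrors the text: a pinch near a point selects it, a drag moves it, and a release fixes the change and removes the highlight.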
In step S3:
When the recognized gesture command is to enlarge, the 3D human model and the 3D garment are enlarged in the display device.
When the recognized gesture command is to reduce, the 3D human model and the 3D garment are reduced in the display device.
When the recognized gesture command is to pan, the 3D human model and the 3D garment are panned in the display device according to the motion trajectory of the midpoint of line AB.
When the recognized gesture command is to rotate, the 3D human model and the 3D garment are rotated in the display device according to the tilt angle of line AB.
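The view operations of step S3 can be sketched as edits to a small view state. The state fields, step size, and command strings below are assumptions for illustration; the patent fixes only the semantics (scale, pan by AB's midpoint motion, rotate by AB's tilt change).

```python
def apply_view_command(view, command, mid_delta=(0.0, 0.0),
                       tilt_delta=0.0, zoom_step=1.25):
    """Mutate a view state dict {scale, pan, rotation} according to a
    recognized gesture command and return it."""
    if command == "enlarge":
        view["scale"] *= zoom_step
    elif command == "reduce":
        view["scale"] /= zoom_step
    elif command == "pan":
        # Pan follows the motion of line AB's midpoint.
        view["pan"] = (view["pan"][0] + mid_delta[0],
                       view["pan"][1] + mid_delta[1])
    elif command == "rotate":
        # Rotate by the change of line AB's tilt angle.
        view["rotation"] += tilt_delta
    return view
```

The renderer would then redraw the 3D human model and garment from this state each frame, which is what makes the display follow the gestures in real time.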
Further, the method also comprises:
S5: according to the clothing information, calculate and generate a garment pattern drawing, wherein the garment pattern drawing is the set of whole-garment and component drawings used for garment production.
More preferably, in the present embodiment a step S0 is also included before step S1:
S01: recognize the client's body posture and its motion trajectory in real space.
S02: according to the client's body posture, adjust the posture of the 3D human model in the display device, so as to check the wearing effect of the garment under various body postures.
The operator then makes corresponding modifications according to the wearing effect, checking the wearing effect of the modified garment in real time during modification. This makes the design more sound and reasonable, and to a certain extent reduces the designer's workload.
The present embodiment realizes garment design through motion sensing, so that a user can freely modify clothing information within a scene on a display device and can obtain a garment pattern drawing through calculation. The invention designs a motion-sensing operating system based on three-dimensional gestures: three-dimensional gesture detection is used to detect the various motions of the hands and body in three-dimensional space, and a mapping is designed between the three-dimensional gestures and the changes of the garment structure on the display device, so that a person can perform various three-dimensional gesture motions in real space and the 3D garment displayed in three dimensions changes accordingly. This greatly shortens the garment design cycle and improves garment customization efficiency.
Embodiment 2
As shown in Fig. 2, a system for garment design through motion sensing according to the present embodiment specifically comprises:
a recognition module, for recognizing an operator's gesture and its motion trajectory in real space, wherein the motion trajectory of the gesture comprises the shape of the trajectory and/or the positions of selected points on the trajectory;
a parsing module, for parsing the gesture category and its motion trajectory to obtain a corresponding operation command;
an editing module, for editing, according to the operation command, the clothing information in the 3D human model and/or the 3D garment and displaying it in three dimensions in real time;
a saving module, for saving the edited clothing information. The clothing information file is saved to memory or to hard disk.
Further, the recognition module specifically comprises:
a gesture recognition unit, for obtaining the coordinates in real space of every point in the operator's gesture;
a trajectory recognition unit, for taking at least one point in the gesture as a measurement reference point and identifying the motion trajectory by calculating the coordinates of this reference point at each position during the motion. For example, a fingertip is taken as the measurement reference point, and the motion trajectory is identified by calculating the coordinates of this reference point at each position during the motion. All of these coordinate measurements can be realized with stereoscopic vision techniques and are not described further here.
Further, the parsing module specifically comprises:
a gesture parsing unit, for determining the gesture category according to pre-agreed gesture shapes;
a trajectory parsing unit, for quantizing the motion trajectory of the gesture to obtain trace information;
an operation-command mapping unit, for obtaining the corresponding operation command according to the real-time gesture category and the trace information.
For example:
The positions of the two palm centers are measured, and the straight line connecting the two palm centers is denoted AB.
When the final value of the length of line AB minus its initial value is greater than a preset threshold, the motion trajectory is determined to be the two hands moving apart.
When the initial value of the length of line AB minus its final value is greater than a preset threshold, the motion trajectory is determined to be the two hands moving together.
When the displacement of the midpoint of line AB is greater than a preset threshold, the motion trajectory is determined to be a two-hand pan.
The tilt angle between line AB and the horizontal is measured; when the difference between the initial and final values of this tilt angle is greater than a preset threshold, the motion trajectory is determined to be a two-hand rotation.
The distance between the thumb and the index finger is measured and denoted x.
When the initial value of x is greater than a preset threshold and its final value is less than that threshold, the motion trajectory is determined to be an index-thumb pinch.
When the initial value of x is less than the threshold and its final value is greater than it, the motion trajectory is determined to be an index-thumb release.
When both the initial and final values of x are less than the threshold, the motion trajectory is determined to be an index-thumb drag.
The corresponding operation commands specifically comprise:
When the motion trajectory is the two hands moving apart, the 3D human model and garment in the display device are scaled so that the view pulls back; the field of view widens, which is suitable for observing the overall effect.
When the motion trajectory is the two hands moving together, the 3D human model and garment are scaled so that the view pulls in; the field of view narrows, which is suitable for observing a particular detail.
When the motion trajectory is a two-hand pan, the 3D human model and garment in the display device are panned according to the position of the midpoint of line AB, which is suitable for observing other parts of the scene.
When the motion trajectory is a two-hand rotation, the 3D human model and garment in the display device are rotated according to the angle through which line AB rotates, which is suitable for observing an object from any angle.
When the motion trajectory is an index-thumb pinch, clothing information is selected according to the position of the index fingertip. The clothing information includes but is not limited to garment structure lines and garment structure points; a garment structure line is the general term for the garment elements that can change the garment's shape and for the outer and inner lines that need to be sewn, and a garment structure point is an end point of a garment structure line.
Specifically, pinching a garment structure point: when the index finger and thumb pinch, the structure point nearest the index fingertip in the display device is selected and highlighted.
Dragging a garment structure point: as the hand moves in three-dimensional space, the selected structure point follows the position of the hand in the display device, and the structure lines having this point as an end point are adjusted along with it.
Releasing a garment structure point: after the index finger and thumb release, the change to the structure point is fixed and the point is no longer highlighted.
Pinching a garment structure line: when the index finger and thumb pinch, the structure line nearest the index fingertip in the display device is selected and highlighted.
Dragging a garment structure line: as the hand moves in three-dimensional space, the shape of the selected structure line follows the position of the hand and changes.
Releasing a garment structure line: after the index finger and thumb release, the change to the structure line is fixed and the line is no longer highlighted.
In the editing module:
When the recognized gesture command is to enlarge, the 3D human model and the 3D garment are enlarged in the display device.
When the recognized gesture command is to reduce, the 3D human model and the 3D garment are reduced in the display device.
When the recognized gesture command is to pan, the 3D human model and the 3D garment are panned in the display device according to the motion trajectory of the midpoint of line AB.
When the recognized gesture command is to rotate, the 3D human model and the 3D garment are rotated in the display device according to the tilt angle of line AB.
Preferably, the present embodiment also comprises:
a garment pattern drawing generation module, for calculating and generating a garment pattern drawing according to the clothing information, wherein the garment pattern drawing is the set of whole-garment and component drawings used for garment production.
More preferably, the present embodiment also comprises a client recognition module, which specifically comprises:
a posture recognition unit, for recognizing the client's body posture and its motion trajectory in real space;
an effect display unit, for adjusting, according to the client's body posture, the posture of the 3D human model in the display device, so as to check the wearing effect of the garment under various body postures.
The operator then makes corresponding modifications in the display device according to the wearing effect, checking the wearing effect of the modified garment in real time during modification. This makes the design more sound and reasonable, and to a certain extent reduces the designer's workload.
The present embodiment realizes garment design through motion sensing, so that a user can freely modify clothing information within a scene on a display device and can obtain a garment pattern drawing through calculation. The invention designs a motion-sensing operating system based on three-dimensional gestures: three-dimensional gesture detection is used to detect the various motions of the hands and body in three-dimensional space, and a mapping is designed between the three-dimensional gestures and the changes of the garment structure on the display device, so that a person can perform various three-dimensional gesture motions in real space and the 3D garment displayed in three dimensions changes accordingly. This greatly shortens the garment design cycle and improves garment customization efficiency.
Those skilled in the art should understand that embodiments of the present invention may be provided as a method, a system, or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, and so on) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks, can be implemented by computer program instructions. These computer program instructions can be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data-processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data-processing device produce a device for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data-processing device to work in a specific way, so that the instructions stored in this computer-readable memory produce an article of manufacture comprising an instruction device, which realizes the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions can also be loaded onto a computer or other programmable data-processing device, so that a series of operation steps is performed on the computer or other programmable device to produce a computer-implemented process, and the instructions executed on the computer or other programmable device thereby provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Although preferred embodiments of the present invention have been described, those skilled in the art, once they grasp the basic inventive concept, can make further changes and modifications to these embodiments. The appended claims are therefore intended to be construed as covering the preferred embodiments and all changes and modifications falling within the scope of the present invention.

Claims (10)

1. A method for garment design through motion sensing, characterized by comprising the following steps:
S1: recognizing an operator's gesture and its motion trajectory in real space, wherein the motion trajectory of the gesture comprises the shape of the trajectory and/or the positions of selected points on the trajectory;
S2: parsing the gesture category and its motion trajectory to obtain a corresponding operation command;
S3: according to the operation command, editing the clothing information in the 3D human model and/or the 3D garment and displaying it in three dimensions in real time;
S4: repeating steps S1 to S3 until editing ends, and saving the edited clothing information.
2. The method for garment design through motion sensing of claim 1, characterized in that step S2 specifically comprises:
S21: determining the gesture category according to pre-agreed gesture shapes;
S22: quantizing the motion trajectory of the gesture to obtain trace information;
S23: obtaining the corresponding operation command according to the real-time gesture category and the trace information.
3. The method for garment design through motion sensing of claim 1 or 2, characterized by also comprising:
S5: according to the clothing information, calculating and generating a garment pattern drawing, wherein the garment pattern drawing is the set of whole-garment and component drawings used for garment production.
4. The method for garment design through motion sensing of any one of claims 1-3, characterized in that step S1 specifically comprises:
S11: obtaining the coordinates in real space of every point in the operator's gesture;
S12: taking at least one point in the gesture as a measurement reference point, and identifying the motion trajectory by calculating the coordinates of this reference point at each position during the motion.
5. The method for garment design through motion sensing of any one of claims 1-4, characterized in that the clothing information comprises garment structure lines and garment structure points, wherein a garment structure line is the general term for the garment elements that can change the garment's shape and for the outer and inner lines that need to be sewn, and a garment structure point is an end point of a garment structure line.
6. A system for garment design through motion sensing, comprising:
an identification module, configured to identify an operator's gesture and its movement trajectory in real space, wherein the movement trajectory of the gesture comprises the shape of the movement trajectory and/or the positions of selected points on the movement trajectory;
a parsing module, configured to parse the gesture category and its movement trajectory to obtain the corresponding operation instruction;
an editing module, configured to edit, according to the operation instruction, the garment information of the 3D human body model and/or the 3D garment and to display it three-dimensionally in real time;
a saving module, configured to save the edited garment information.
7. The system for garment design through motion sensing according to claim 6, wherein the parsing module specifically comprises:
a gesture parsing unit, configured to determine the gesture category according to gesture shapes agreed in advance;
a trajectory parsing unit, configured to quantize the movement trajectory of the gesture to obtain trajectory information;
an operation instruction mapping unit, configured to obtain the corresponding operation instruction according to the real-time gesture category and the trajectory information.
8. The system for garment design through motion sensing according to claim 6 or 7, further comprising:
a garment pattern drawing generation module, configured to calculate and generate a garment pattern drawing according to the garment information, wherein the garment pattern drawing is the set of whole-garment and component drawings used for garment production.
9. The system for garment design through motion sensing according to any one of claims 6 to 8, wherein the identification module specifically comprises:
a gesture identification unit, configured to obtain the coordinates, in real space, of all points of the operator's gesture;
a trajectory identification unit, configured to take at least one point of the gesture as a reference point and to identify the movement trajectory by calculating the coordinates of the reference point at each position during the movement.
10. The system for garment design through motion sensing according to any one of claims 6 to 9, wherein the garment information comprises garment structure lines and garment structure points; the garment structure lines are the collective term for the garment elements that can change the garment's shape and the lines, outside and inside the garment, that need to be sewn; the garment structure points are the end points of the garment structure lines.
CN201510563427.7A 2015-09-07 2015-09-07 Method and system for garment design through motion sensing Active CN105204630B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510563427.7A CN105204630B (en) 2015-09-07 2015-09-07 Method and system for garment design through motion sensing


Publications (2)

Publication Number Publication Date
CN105204630A true CN105204630A (en) 2015-12-30
CN105204630B CN105204630B (en) 2018-11-23

Family

ID=54952364

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510563427.7A Active CN105204630B (en) Method and system for garment design through motion sensing

Country Status (1)

Country Link
CN (1) CN105204630B (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102226880A (en) * 2011-06-03 2011-10-26 北京新岸线网络技术有限公司 Somatosensory operation method and system based on virtual reality
CN102402641A (en) * 2010-09-14 2012-04-04 盛乐信息技术(上海)有限公司 Network-based three-dimensional virtual fitting system and method
CN103854303A (en) * 2014-03-06 2014-06-11 寇懿 Three-dimensional hair style design system and method based on somatosensory sensor
CN104851005A (en) * 2015-05-18 2015-08-19 包建伟 O2O electronic commerce platform for self-help clothing design and 3D display
CN104850226A (en) * 2015-04-30 2015-08-19 北京农业信息技术研究中心 Three-dimensional interactive fruit tree shape trimming method based on gesture recognition


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106723666A (en) * 2017-01-06 2017-05-31 段莉 Method and system for customizing leather shoes
CN107230134A (en) * 2017-05-27 2017-10-03 郑州云海信息技术有限公司 Method and system for customized management of virtual garments
CN112434396A (en) * 2019-08-21 2021-03-02 版腾信息科技(上海)有限公司 Layout design method, layout design device, computer equipment and storage medium
CN112434396B (en) * 2019-08-21 2023-03-10 版腾信息科技(上海)有限公司 Layout design method, layout design device, computer equipment and storage medium

Also Published As

Publication number Publication date
CN105204630B (en) 2018-11-23

Similar Documents

Publication Publication Date Title
US11073915B2 (en) Modification of three-dimensional garments using gestures
TWI579734B (en) 3d visualization
US10481689B1 (en) Motion capture glove
CN104854537B Multi-distance, multi-modal natural user interaction with computing devices
Lockwood et al. Fingerwalking: motion editing with contact-based hand performance
CN107450714A (en) Man-machine interaction support test system based on augmented reality and image recognition
CN102226880A (en) Somatosensory operation method and system based on virtual reality
WO2014200781A1 (en) Locating and orienting device in space
CN110389659A Systems and methods for providing dynamic haptic playback for an augmented or virtual reality environment
JP7490072B2 (en) Vision-based rehabilitation training system based on 3D human pose estimation using multi-view images
WO2015153673A1 (en) Providing onscreen visualizations of gesture movements
Vitali et al. A virtual environment to emulate tailor’s work
Aditya et al. Recent trends in HCI: A survey on data glove, LEAP motion and microsoft kinect
CN105204630A (en) Method and system for garment design through motion sensing
KR20200122126A (en) Method and apparatus for performing 3d sketch
Cohen et al. A 3d virtual sketching system using NURBS surfaces and leap motion controller
JP2015114762A (en) Finger operation detection device, finger operation detection method, finger operation detection program, and virtual object processing system
Cho et al. 3D volume drawing on a potter's wheel
Luong et al. Human computer interface using the recognized finger parts of hand depth silhouette via random forests
Hartmann et al. A virtual touchscreen with depth recognition
Arora Creative visual expression in immersive 3D environments
Hoang et al. Ultrasonic glove input device for distance-based interactions
Lu et al. A combined strategy of hand tracking for desktop VR
Oshita et al. Character motion synthesis by principal component analysis and motion control interface by hands
An et al. Construction of industrial Robot Virtual Assemble Training System with Unity3D

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20160206

Address after: Room 510, Building 8, Court B, Yinhai Road, Hongshan District, Wuhan, Hubei Province, 430070

Applicant after: Duan Li

Address before: Room 510, Building 8, Court B, Yinhai Road, Hongshan District, Wuhan, Hubei Province, 430070

Applicant before: Liu Wei

C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20161115

Address after: Room A515, 5th floor, Tianhui'ao office building, No. 43 Dongzong Avenue, Dongshan Community, Dongguan, Guangdong Province, 523122

Applicant after: Dongguan universal robot Co., Ltd.

Address before: Room 510, Building 8, Court B, Yinhai Road, Hongshan District, Wuhan, Hubei Province, 430070

Applicant before: Duan Li

C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20170123

Address after: Room 510, Building 8, Court B, Yinhai Road, Hongshan District, Wuhan, Hubei Province, 430070

Applicant after: Liu Wei

Address before: Room A515, 5th floor, Tianhui'ao office building, No. 43 Dongzong Avenue, Dongshan Community, Dongguan, Guangdong Province, 523122

Applicant before: Dongguan universal robot Co., Ltd.

GR01 Patent grant