CN106651498A - Information processing method and device - Google Patents
- Publication number
- CN106651498A (application CN201610865542.4A)
- Authority
- CN
- China
- Prior art keywords
- information
- clothes
- target clothes
- input
- wearable device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Shopping interfaces
- G06Q30/0643—Graphical representation of items or shoppers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/16—Cloth
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Business, Economics & Management (AREA)
- Finance (AREA)
- Accounting & Taxation (AREA)
- Human Computer Interaction (AREA)
- Development Economics (AREA)
- Economics (AREA)
- Marketing (AREA)
- Strategic Management (AREA)
- General Business, Economics & Management (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The embodiments of the invention disclose an information processing method, comprising: generating a three-dimensional human body model; when a try-on operation input on a network fitting interface is detected, obtaining target clothes information, the target clothes information including style, size, color, material composition, and tactile information; synthesizing the target clothes corresponding to the target clothes information with the three-dimensional human body model; displaying the synthesized effect image on a virtual reality interface; and transmitting the tactile information of the target clothes to a wearable device so that the wearable device outputs a tactile signal. The embodiments of the invention also disclose an information processing device. By displaying the stereoscopic effect of the clothes while outputting a tactile signal through the wearable device during try-on, the invention provides more commodity information, reduces returns and exchanges, and thereby improves the efficiency of online shopping.
Description
Technical field
The present invention relates to the field of computer technology, and more particularly to an information processing method and device.
Background technology
With the development of global information technology, the future will be an information age centered on networked computing and the knowledge economy. E-commerce has flourished in recent years, and online stores offer users a massive selection of commodities. However, for commodities such as clothing, it is difficult to fully understand a product from the pictures and text on the network alone; problems such as improper sizing easily occur and lead to returns and exchanges, reducing the efficiency of online shopping.
Summary of the invention
The technical problem to be solved by the embodiments of the present invention is to provide an information processing method and device that can display the stereoscopic effect of clothes in real time while outputting a tactile signal through a wearable device after try-on, helping to provide more commodity information, reduce returns and exchanges, and thereby improve the efficiency of online shopping.
In order to solve the above technical problem, an embodiment of the present invention provides an information processing method, the method comprising:
generating a three-dimensional humanoid model;
when a try-on operation input on a network fitting interface is detected, obtaining target clothes information, the target clothes information including style, size, color, material composition, and tactile information;
synthesizing the target clothes corresponding to the target clothes information with the three-dimensional humanoid model, displaying the synthesized effect image on a virtual reality interface, and sending the tactile information of the target clothes to a wearable device so that the wearable device outputs a tactile signal.
Correspondingly, an embodiment of the present invention further provides an information processing device, the device comprising:
a model generation module for generating a three-dimensional humanoid model;
a clothes information obtaining module for, when a try-on operation input on a network fitting interface is detected, obtaining target clothes information, the target clothes information including style, size, color, material composition, and tactile information;
an information processing module for synthesizing the target clothes corresponding to the target clothes information with the three-dimensional humanoid model, displaying the synthesized effect image on a virtual reality interface, and sending the tactile information of the target clothes to a wearable device so that the wearable device outputs a tactile signal.
In the embodiments of the present invention, the generated three-dimensional humanoid model is synthesized with the target clothes obtained when a try-on operation input on the network fitting interface is detected, the synthesized effect image is displayed on the virtual reality interface, and the tactile information of the target clothes is sent to the wearable device so that the wearable device outputs a tactile signal. By displaying the stereoscopic effect of the clothes in real time while outputting the tactile signal through the wearable device after try-on, more commodity information can be provided, returns and exchanges are reduced, and the efficiency of online shopping is improved.
Description of the drawings
In order to be illustrated more clearly that the embodiment of the present invention or technical scheme of the prior art, below will be to embodiment or existing
The accompanying drawing to be used needed for having technology description is briefly described, it should be apparent that, drawings in the following description are only this
Some embodiments of invention, for those of ordinary skill in the art, on the premise of not paying creative work, can be with
Other accompanying drawings are obtained according to these accompanying drawings.
Fig. 1 is a schematic flowchart of an information processing method in an embodiment of the present invention;
Fig. 2 is a schematic interface diagram of a three-dimensional humanoid model in an embodiment of the present invention;
Fig. 3 is a schematic interface diagram of a synthesized effect in an embodiment of the present invention;
Fig. 4 is a schematic flowchart of an information processing method in another embodiment of the present invention;
Fig. 5 is a schematic structural diagram of an information processing device in an embodiment of the present invention;
Fig. 6 is an architecture diagram of a computer system for performing the above information processing method in an embodiment of the present invention.
Detailed description of embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the invention, rather than all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present invention.
The terms "comprising" and "having" in the description, the claims, and the above drawings, and any variants of them, are intended to cover a non-exclusive inclusion. For example, a process, method, system, product, or device that contains a series of steps or units is not limited to the listed steps or units, but optionally also includes steps or units not listed, or optionally also includes other steps or units inherent to such a process, method, product, or device.
The execution of the information processing method referred to in the embodiments of the present invention depends on a computer program and can run on a von Neumann computer system. The computer program can be integrated in an application, or run as an independent tool-class application. The computer system can be a terminal device such as a PC, tablet computer, notebook computer, or smartphone.
These are described in detail below.
Fig. 1 is a schematic flowchart of an information processing method in an embodiment of the present invention. As shown in the figure, the method at least includes:
Step S101: generating a three-dimensional humanoid model.
Specifically, when a model-building trigger is detected or received (for example, when an opening operation input on the network fitting interface is detected, or when an operating gesture input through a motion-sensing instrument matches preset model-building trigger information), the information processing device is triggered to start a camera and collect the user's figure information through the camera, for example by scanning the user in a preset order. The figure information at least includes the user's height information, weight information, and bust-waist-hip measurements. A three-dimensional humanoid model is then generated from the collected figure information; the model can be constructed by modeling according to the collected figure information and a preset algorithm.
For example, as shown in Fig. 2, A is the user; when the information processing device receives a modeling trigger, it collects the user's figure information and generates the corresponding three-dimensional humanoid model B.
Optionally, a three-dimensional humanoid model matching the collected figure information can be built in equal proportion; the proportion can be 1:1 or less than 1:1 and is not specifically limited.
Further, after the three-dimensional humanoid model is generated, information such as skin can be added to the model to generate a humanoid model identical to A in Fig. 2.
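The patent leaves the modeling algorithm open ("the collected figure information and a preset algorithm"). Purely as an illustrative sketch, with invented field names and an assumed scaling scheme, the model-building step might look like:

```python
from dataclasses import dataclass

@dataclass
class HumanoidModel:
    height: float  # cm
    bust: float    # cm
    waist: float   # cm
    hip: float     # cm
    scale: float   # display proportion: 1.0 for 1:1, or smaller

def generate_model(figure: dict, scale: float = 1.0) -> HumanoidModel:
    """Build a stereoscopic humanoid model from collected figure information.

    `figure` is assumed to hold the measurements gathered by the camera
    scan; the patent allows a 1:1 proportion or any smaller one.
    """
    if not 0.0 < scale <= 1.0:
        raise ValueError("proportion must be 1:1 or smaller")
    return HumanoidModel(
        height=figure["height"] * scale,
        bust=figure["bust"] * scale,
        waist=figure["waist"] * scale,
        hip=figure["hip"] * scale,
        scale=scale,
    )

# A half-scale model from scanned measurements (illustrative values).
model = generate_model(
    {"height": 165.0, "bust": 88.0, "waist": 70.0, "hip": 94.0}, scale=0.5)
```

A real implementation would of course deform a 3D mesh rather than scale a handful of measurements; the sketch only captures the proportion rule the patent states.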
Step S102: when a try-on operation input on the network fitting interface is detected, obtaining target clothes information, the target clothes information including style, size, color, material composition, and tactile information.
Specifically, when an input operation performed by the user on the network fitting interface, either through a motion-sensing instrument or directly on a touch-enabled interface, is detected, the operation can be matched against a preset try-on operation. If input through a motion-sensing instrument, the operation can be a preset sliding gesture; if input directly on a touch interface, the operation can be a screen slide, a click, and so on. If the operation matches the preset try-on operation, the chosen object is taken as the target clothes, and multiple pieces of information such as the style, size, color, material composition, brand, and price of the target clothes are obtained, together with the tactile information of the target clothes.
Here, the input operation on the network fitting interface matching the preset try-on operation can mean that the matching degree between the input gesture and the preset gesture exceeds a preset threshold, or that the direction of the input screen slide falls within a preset slide-direction range.
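The two matching criteria above can be sketched as follows; the threshold and angle values are assumptions for illustration, not taken from the patent:

```python
def matches_tryon(gesture_score: float, slide_angle_deg: float,
                  score_threshold: float = 0.8,
                  angle_range: tuple = (-15.0, 15.0)) -> bool:
    """Return True when the input operation counts as the preset try-on
    operation: either the gesture's matching degree against the preset
    gesture reaches the threshold, or the screen-slide direction falls
    within the preset slide-direction range."""
    lo, hi = angle_range
    return gesture_score >= score_threshold or lo <= slide_angle_deg <= hi

# A confident gesture match passes regardless of slide direction.
ok = matches_tryon(0.92, 90.0)
```

Either criterion alone suffices, mirroring the "or" in the text above.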
Step S103: synthesizing the target clothes corresponding to the target clothes information with the three-dimensional humanoid model, displaying the synthesized effect image on the virtual reality interface, and sending the tactile information of the target clothes to the wearable device so that the wearable device outputs a tactile signal.
Specifically, if the target clothes is a black full-lace waisted one-piece dress, the target clothes is synthesized with the generated three-dimensional humanoid model, that is, the target clothes is put on the three-dimensional humanoid model. The synthesized effect is shown as A in Fig. 3. The synthesized effect image is displayed on the virtual reality interface, and the user can observe a three-dimensional (3D) image of the synthesized effect by wearing VR equipment such as VR glasses.
Optionally, if the three-dimensional humanoid model is as shown by A in Fig. 2, the synthesized effect image after synthesis corresponds to B in Fig. 3.
Optionally, the synthesized effect image can be shown with a preset display effect; the preset display effect can be a 180° rotational display in a plane, a 360° rotational display in space, or a periodic zoom-in and zoom-out display, and is not specifically limited here.
While the synthesized effect image is shown, the tactile information of the target clothes is sent to the wearable device, and a tactile signal is output by the wearable device. The wearable device can be virtual reality (VR) clothing equipped with sensors; after the tactile signal is sent, the sensors of the VR clothing are activated, and a user wearing the VR clothing can perceive the feel of the target clothes in real time.
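The patent does not define how tactile information is encoded for the wearable. Assuming a hypothetical VR garment with one vibro-tactile actuator per body zone, one toy encoding might map fabric properties to per-zone intensities:

```python
def encode_tactile(tactile: dict, zones=("chest", "waist", "arms")) -> dict:
    """Encode a clothes item's tactile information as per-zone actuator
    intensities in [0, 1]. The `softness`/`elasticity` keys and the
    weighting below are illustrative assumptions, not from the patent."""
    softness = tactile.get("softness", 0.5)      # 0 = rough, 1 = silky
    elasticity = tactile.get("elasticity", 0.5)  # 0 = stiff, 1 = stretchy
    # In this toy mapping, rougher and stiffer fabrics drive stronger actuation.
    intensity = round((1.0 - softness) * 0.7 + (1.0 - elasticity) * 0.3, 3)
    return {zone: intensity for zone in zones}

# A soft, moderately stretchy fabric yields a gentle signal on every zone.
frame = encode_tactile({"softness": 0.9, "elasticity": 0.6})
```

The actual sensor protocol of a VR garment would be device-specific; the point is only that tactile information is reduced to signals the wearable can output.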
In the embodiments of the present invention, the generated three-dimensional humanoid model is synthesized with the target clothes obtained when the try-on operation input on the network fitting interface is detected, the synthesized effect image is displayed on the virtual reality interface, and the tactile information of the target clothes is sent to the wearable device so that the wearable device outputs a tactile signal. By displaying the stereoscopic effect of the clothes in real time while outputting the tactile signal through the wearable device after try-on, more commodity information can be provided, returns and exchanges are reduced, and the efficiency of online shopping is improved.
Fig. 4 is a schematic flowchart of an information processing method provided by another embodiment of the present invention. As shown in the figure, the method at least includes:
Step S201: collecting the user's figure information through a camera and generating a three-dimensional humanoid model from the user's figure information, the user's figure information including height information, weight information, and bust-waist-hip measurements.
Specifically, when a model-building trigger is detected or received (for example, when an opening operation input on the network fitting interface is detected, or when an operating gesture input through a motion-sensing instrument matches preset model-building trigger information), the information processing device is triggered to start a camera and collect the user's figure information through the camera, for example by scanning the user in a preset order. The figure information at least includes the user's height information, weight information, and bust-waist-hip measurements. A three-dimensional humanoid model is then generated from the collected figure information; the model can be constructed by modeling according to the collected figure information and a preset algorithm.
For example, as shown in Fig. 2, A is the user; when the information processing device receives a modeling trigger, it collects the user's figure information and generates the corresponding three-dimensional humanoid model B.
Optionally, a three-dimensional humanoid model matching the collected figure information can be built in equal proportion; the proportion can be 1:1 or less than 1:1 and is not specifically limited.
Further, after the three-dimensional humanoid model is generated, information such as skin can be added to the model to generate a humanoid model identical to A in Fig. 2.
Step S202: when a try-on operation input on the network fitting interface is detected, obtaining target clothes information, the target clothes information including style, size, color, material composition, and tactile information.
Specifically, when an input operation performed by the user on the network fitting interface, either through a motion-sensing instrument or directly on a touch-enabled interface, is detected, the operation can be matched against a preset try-on operation. If input through a motion-sensing instrument, the operation can be a preset sliding gesture; if input directly on a touch interface, the operation can be a screen slide, a click, and so on. If the operation matches the preset try-on operation, the chosen object is taken as the target clothes, and multiple pieces of information such as the style, size, color, material composition, brand, and price of the target clothes are obtained, together with the tactile information of the target clothes.
Here, the input operation on the network fitting interface matching the preset try-on operation can mean that the matching degree between the input gesture and the preset gesture exceeds a preset threshold, or that the direction of the input screen slide falls within a preset slide-direction range.
Step S203: synthesizing the target clothes corresponding to the target clothes information with the three-dimensional humanoid model, displaying the synthesized effect image with a preset display effect on the virtual reality interface, and sending the tactile information of the target clothes to the wearable device so that the wearable device outputs a tactile signal.
Specifically, if the target clothes is a black full-lace waisted one-piece dress, the target clothes is synthesized with the generated three-dimensional humanoid model, that is, the target clothes is put on the three-dimensional humanoid model. The synthesized effect is shown as A in Fig. 3. The synthesized effect image is displayed on the virtual reality interface, and the user can observe a three-dimensional (3D) image of the synthesized effect by wearing VR equipment such as VR glasses.
Optionally, if the three-dimensional humanoid model is as shown by A in Fig. 2, the synthesized effect image after synthesis corresponds to B in Fig. 3.
Optionally, the synthesized effect image can be shown with a preset display effect; the preset display effect can be a 180° rotational display in a plane, a 360° rotational display in space, or a periodic zoom-in and zoom-out display, and is not specifically limited here.
While the synthesized effect image is shown, the tactile information of the target clothes is sent to the wearable device, and a tactile signal is output by the wearable device. The wearable device can be virtual reality (VR) clothing equipped with sensors; after the tactile signal is sent, the sensors of the VR clothing are activated, and a user wearing the VR clothing can perceive the feel of the target clothes in real time.
Step S204: obtaining input try-on body-sensing information, the try-on body-sensing information including at least one of a comfort parameter, an elasticity parameter, and a satisfaction parameter.
Specifically, after perceiving the feel of the target clothes through the wearable device, the user can input corresponding body-sensing information, such as a comfort parameter, an elasticity parameter, a satisfaction parameter, or a cost-performance ratio, on an information publishing page. Each input parameter can be a target grade selected from at least one preset parameter grade, or a score input according to a preset parameter standard, and is not specifically limited.
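The two input styles allowed above (a grade chosen from preset levels, or a score against a preset standard) could be normalized as below; the grade names and the 0 to 10 scale are illustrative assumptions:

```python
# Hypothetical preset grades mapped onto an assumed 0-10 scoring standard.
GRADES = {"poor": 2, "fair": 5, "good": 8, "excellent": 10}

def normalize_feedback(value) -> int:
    """Map either a preset grade name or a raw score to a 0-10 integer."""
    if isinstance(value, str):
        return GRADES[value]
    if not 0 <= value <= 10:
        raise ValueError("score outside the preset 0-10 standard")
    return int(value)

# Mixed input: one grade, two raw scores, all normalized uniformly.
feedback = {k: normalize_feedback(v)
            for k, v in {"comfort": "good",
                         "elasticity": 7,
                         "satisfaction": 9}.items()}
```

Normalizing both styles to one scale lets the later satisfaction-threshold check work regardless of how the user chose to answer.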
Optionally, when the satisfaction parameter is greater than or equal to a preset satisfaction parameter threshold, the target clothes is added to a preset selection list.
Specifically, the information processing device compares the satisfaction parameter input by the user with the preset satisfaction parameter threshold; if the satisfaction parameter input by the user is greater than or equal to the preset satisfaction parameter threshold, the target clothes is automatically added to the preset selection list (such as a shopping cart) for the user to purchase.
Optionally, after the target clothes is added to the preset selection list, the interface can jump automatically to a payment interface so that the user can pay for the purchase.
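A minimal sketch of this optional auto-add behavior, assuming an illustrative threshold and treating the preset selection list as a shopping cart:

```python
def maybe_add_to_cart(satisfaction: int, cart: list, item: str,
                      threshold: int = 8) -> bool:
    """Add the target clothes to the preset selection list (e.g. a shopping
    cart) when satisfaction meets the preset threshold; return whether the
    caller should jump to the payment interface next."""
    if satisfaction >= threshold:
        cart.append(item)
        return True
    return False

cart = []
jumped = maybe_add_to_cart(9, cart, "black lace one-piece dress")
```

The threshold value 8 is a placeholder; the patent only specifies a "greater than or equal" comparison against a preset threshold.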
Step S205: publishing the try-on body-sensing information and the synthesized effect image on the information publishing page, and receiving real-time comment information input for the try-on body-sensing information and the synthesized effect image.
Specifically, the information processing device publishes the synthesized effect image (such as A or B in Fig. 3) and the try-on body-sensing information input by the user on a web information publishing page; other users can comment on the published information in real time, and the information processing device receives the real-time comment information for reference.
In the embodiments of the present invention, the generated three-dimensional humanoid model is synthesized with the target clothes obtained when the try-on operation input on the network fitting interface is detected, the synthesized effect image is displayed on the virtual reality interface, and the tactile information of the target clothes is sent to the wearable device so that the wearable device outputs a tactile signal. By displaying the stereoscopic effect of the clothes in real time while outputting the tactile signal through the wearable device after try-on, more commodity information can be provided, returns and exchanges are reduced, and the efficiency of online shopping is improved.
Fig. 5 is a schematic diagram of the composition of an information processing device provided by an embodiment of the present invention. As shown in the figure, the device includes:
a model generation module 10 for generating a three-dimensional humanoid model;
Optionally, the model generation module 10 is specifically configured to:
collect the user's figure information through a camera and generate a three-dimensional humanoid model from the user's figure information, the user's figure information including height information, weight information, and bust-waist-hip measurements.
Specifically, when a model-building trigger is detected or received (for example, when an opening operation input on the network fitting interface is detected, or when an operating gesture input through a motion-sensing instrument matches preset model-building trigger information), the information processing device is triggered to start a camera and collect the user's figure information through the camera, for example by scanning the user in a preset order. The figure information at least includes the user's height information, weight information, and bust-waist-hip measurements. A three-dimensional humanoid model is then generated from the collected figure information; the model can be constructed by modeling according to the collected figure information and a preset algorithm.
For example, as shown in Fig. 2, A is the user; when the information processing device receives a modeling trigger, it collects the user's figure information and generates the corresponding three-dimensional humanoid model B.
Optionally, a three-dimensional humanoid model matching the collected figure information can be built in equal proportion; the proportion can be 1:1 or less than 1:1 and is not specifically limited.
Further, after the three-dimensional humanoid model is generated, information such as skin can be added to the model to generate a humanoid model identical to A in Fig. 2.
a clothes information obtaining module 20 for, when a try-on operation input on the network fitting interface is detected, obtaining target clothes information, the target clothes information including style, size, color, material composition, and tactile information;
Specifically, when an input operation performed by the user on the network fitting interface, either through a motion-sensing instrument or directly on a touch-enabled interface, is detected, the operation can be matched against a preset try-on operation. If input through a motion-sensing instrument, the operation can be a preset sliding gesture; if input directly on a touch interface, the operation can be a screen slide, a click, and so on. If the operation matches the preset try-on operation, the chosen object is taken as the target clothes, and multiple pieces of information such as the style, size, color, material composition, brand, and price of the target clothes are obtained, together with the tactile information of the target clothes.
Here, the input operation on the network fitting interface matching the preset try-on operation can mean that the matching degree between the input gesture and the preset gesture exceeds a preset threshold, or that the direction of the input screen slide falls within a preset slide-direction range.
an information processing module 30 for synthesizing the target clothes corresponding to the target clothes information with the three-dimensional humanoid model, displaying the synthesized effect image on the virtual reality interface, and sending the tactile information of the target clothes to the wearable device so that the wearable device outputs a tactile signal.
Specifically, if the target clothes is a black full-lace waisted one-piece dress, the target clothes is synthesized with the generated three-dimensional humanoid model, that is, the target clothes is put on the three-dimensional humanoid model. The synthesized effect is shown as A in Fig. 3. The synthesized effect image is displayed on the virtual reality interface, and the user can observe a three-dimensional (3D) image of the synthesized effect by wearing VR equipment such as VR glasses.
Optionally, if the three-dimensional humanoid model is as shown by A in Fig. 2, the synthesized effect image after synthesis corresponds to B in Fig. 3.
Optionally, the information processing module 30 is specifically configured to:
synthesize the target clothes corresponding to the target clothes information with the three-dimensional humanoid model, display the synthesized effect image with a preset display effect on the virtual reality interface, and send the tactile information of the target clothes to the wearable device so that the wearable device outputs a tactile signal.
Specifically, the preset display effect can be a 180° rotational display in a plane, a 360° rotational display in space, or a periodic zoom-in and zoom-out display, and is not specifically limited here.
While the synthesized effect image is shown, the tactile information of the target clothes is sent to the wearable device, and a tactile signal is output by the wearable device. The wearable device can be virtual reality (VR) clothing equipped with sensors; after the tactile signal is sent, the sensors of the VR clothing are activated, and a user wearing the VR clothing can perceive the feel of the target clothes in real time.
Optionally, as shown in Fig. 5, the device also includes:
a body-sensing information obtaining module 40 for obtaining input try-on body-sensing information, the try-on body-sensing information including at least one of a comfort parameter, an elasticity parameter, and a satisfaction parameter;
Specifically, after perceiving the feel of the target clothes through the wearable device, the user can input corresponding body-sensing information, such as a comfort parameter, an elasticity parameter, a satisfaction parameter, or a cost-performance ratio, on an information publishing page. Each input parameter can be a target grade selected from at least one preset parameter grade, or a score input according to a preset parameter standard, and is not specifically limited.
a comment information receiving module 50 for publishing the try-on body-sensing information and the synthesized effect image on the information publishing page and receiving real-time comment information input for the try-on body-sensing information and the synthesized effect image.
Specifically, the information processing device publishes the synthesized effect image (such as A or B in Fig. 3) and the try-on body-sensing information input by the user on a web information publishing page; other users can comment on the published information in real time, and the information processing device receives the real-time comment information for reference.
Optionally, as shown in Fig. 5, the device also includes:
a clothes adding module 60, configured to add the target clothes to a preset candidate list when the satisfaction parameter is greater than or equal to a preset satisfaction parameter threshold.
Specifically, the information processing device compares the satisfaction parameter entered by the user with the preset satisfaction parameter threshold; if the satisfaction parameter is greater than or equal to the threshold, the target clothes is automatically added to the preset candidate list (such as a shopping cart) for the user to purchase.
Optionally, after the target clothes is added to the preset candidate list, the interface can automatically jump to a payment interface so that the user can pay for the purchase.
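The threshold rule above can be sketched in a few lines. The 5-point scale and the default threshold of 4.0 are assumptions, since the disclosure leaves the threshold value open; the boolean return value is one way a caller could trigger the optional jump to the payment interface.

```python
def maybe_add_to_candidate_list(target_clothes: str, satisfaction: float,
                                candidate_list: list,
                                threshold: float = 4.0) -> bool:
    """Add the target clothes to the preset candidate list (e.g. a shopping
    cart) when the user's satisfaction parameter meets the preset threshold.

    Returns True when the item was added, so the caller can decide whether
    to jump to the payment interface.
    """
    if satisfaction >= threshold:
        candidate_list.append(target_clothes)
        return True
    return False
```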
In the embodiment of the present invention, the generated three-dimensional human model is synthesized with the target clothes obtained when a fitting operation entered at the network fitting interface is detected; the composite effect image after the synthesis processing is displayed on the virtual reality interface, and the tactile information of the target clothes is sent to the wearable device so that the wearable device outputs a tactile signal. By displaying the stereoscopic effect of the clothes in real time while the wearable device outputs the try-on tactile signal, more product information can be provided and returns and exchanges can be reduced, thereby improving the efficiency of online shopping.
Fig. 6 illustrates a computer system 10 based on the von Neumann architecture that runs the above information processing method. The computer system 10 can be a user terminal device such as a smartphone, a tablet computer, a handheld computer, a notebook computer, or a PC. Specifically, it may include an external input interface 1001, a processor 1002, a memory 1003, and an output interface 1004 connected by a system bus. The external input interface 1001 may include a touch screen 10016 and, optionally, a network interface 10018. The memory 1003 may include an external memory 10032 (such as a hard disk, an optical disc, or a floppy disk) and an internal memory 10034. The output interface 1004 may include devices such as a display screen 10042 and a speaker 10044.
In this embodiment, the method runs as a computer program whose program file is stored in the aforementioned external memory 10032 of the computer system 10 based on the von Neumann architecture, is loaded into the internal memory 10034 at run time, is compiled into machine code, and is then transferred to the processor 1002 for execution, so that a logical model generation module 10, clothes information acquisition module 20, information processing module 30, body-sensing information acquisition module 40, comment information receiving module 50, and clothes adding module 60 are formed in the computer system 10. During execution of the above information processing method, the input parameters are received through the external input interface 1001, transferred to and cached in the memory 1003, and then fed into the processor 1002 for processing; the processed result data is either cached in the memory 1003 for subsequent processing or passed to the output interface 1004 for output.
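The data path just described (input interface → memory cache → processor → memory or output interface) can be sketched as a toy pipeline. The class and attribute names are illustrative and merely keyed to the reference numerals above; they are not part of the disclosure.

```python
from collections import deque

class InfoProcessingPipeline:
    """Toy sketch of the data path: parameters arrive at an input
    interface, are cached in memory, processed, and the result is either
    cached for further processing or emitted to the output interface."""

    def __init__(self):
        self.memory = deque()   # stands in for memory 1003
        self.outputs = []       # stands in for output interface 1004

    def receive(self, params):
        """External input interface 1001: cache incoming parameters."""
        self.memory.append(params)

    def process(self, emit: bool = True):
        """Processor 1002: take the oldest cached parameters, process
        them, and either emit or re-cache the result."""
        params = self.memory.popleft()
        result = {"processed": params}
        if emit:
            self.outputs.append(result)   # send to output interface
        else:
            self.memory.append(result)    # cache for subsequent processing
        return result
```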
Those of ordinary skill in the art will appreciate that all or part of the processes in the above method embodiments can be completed by instructing the relevant hardware through a computer program; the program can be stored in a computer-readable storage medium and, when executed, may include the processes of the embodiments of the above methods. The storage medium can be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
The above disclosure is only a preferred embodiment of the present invention and certainly cannot limit the scope of the claims of the present invention; therefore, equivalent variations made according to the claims of the present invention still fall within the scope covered by the present invention.
Claims (10)
1. An information processing method, characterised in that it comprises:
generating a three-dimensional human model;
when a fitting operation entered at a network fitting interface is detected, acquiring target clothes information, the target clothes information including style, size, colour, material composition, and tactile information;
synthesizing the target clothes corresponding to the target clothes information with the three-dimensional human model, displaying the composite effect image after the synthesis processing on a virtual reality interface, and sending the tactile information of the target clothes to a wearable device so that the wearable device outputs a tactile signal.
2. The method of claim 1, characterised in that generating the three-dimensional human model comprises:
collecting user figure information through a camera, and generating the three-dimensional human model according to the user figure information, the user figure information including height information, weight information, and bust-waist-hip measurement information.
3. The method of claim 1, characterised in that displaying the composite effect image after the synthesis processing on the virtual reality interface comprises:
displaying the composite effect image after the synthesis processing on the virtual reality interface using a preset display effect.
4. The method of claim 1, characterised in that, after the tactile information of the target clothes is sent to the wearable device, the method further comprises:
acquiring the input try-on body-sensing information, the try-on body-sensing information including at least one of a comfort parameter, an elasticity parameter, and a satisfaction parameter;
publishing the try-on body-sensing information and the composite effect image on an information publishing page, and receiving real-time comment information entered for the try-on body-sensing information and the composite effect image.
5. The method of claim 4, characterised in that the method further comprises:
when the satisfaction parameter is greater than or equal to a preset satisfaction parameter threshold, adding the target clothes to a preset candidate list.
6. An information processing device, characterised in that it comprises:
a model generation module, configured to generate a three-dimensional human model;
a clothes information acquisition module, configured to acquire target clothes information when a fitting operation entered at a network fitting interface is detected, the target clothes information including style, size, colour, material composition, and tactile information;
an information processing module, configured to synthesize the target clothes corresponding to the target clothes information with the three-dimensional human model, to display the composite effect image after the synthesis processing on a virtual reality interface, and to send the tactile information of the target clothes to a wearable device so that the wearable device outputs a tactile signal.
7. The device of claim 6, characterised in that the model generation module is specifically configured to:
collect user figure information through a camera, and generate the three-dimensional human model according to the user figure information, the user figure information including height information, weight information, and bust-waist-hip measurement information.
8. The device of claim 6, characterised in that the information processing module is specifically configured to:
synthesize the target clothes corresponding to the target clothes information with the three-dimensional human model, display the composite effect image after the synthesis processing on the virtual reality interface using a preset display effect, and send the tactile information of the target clothes to the wearable device so that the wearable device outputs a tactile signal.
9. The device of claim 6, characterised in that the device further comprises:
a body-sensing information acquisition module, configured to acquire the input try-on body-sensing information, the try-on body-sensing information including at least one of a comfort parameter, an elasticity parameter, and a satisfaction parameter;
a comment information receiving module, configured to publish the try-on body-sensing information and the composite effect image on an information publishing page, and to receive real-time comment information entered for the try-on body-sensing information and the composite effect image.
10. The device of claim 9, characterised in that the device further comprises:
a clothes adding module, configured to add the target clothes to a preset candidate list when the satisfaction parameter is greater than or equal to a preset satisfaction parameter threshold.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610865542.4A CN106651498A (en) | 2016-09-29 | 2016-09-29 | Information processing method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106651498A true CN106651498A (en) | 2017-05-10 |
Family
ID=58853978
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610865542.4A Pending CN106651498A (en) | 2016-09-29 | 2016-09-29 | Information processing method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106651498A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103514350A (en) * | 2012-06-27 | 2014-01-15 | 富泰华工业(深圳)有限公司 | Electronic device with virtual fit function and virtual fit method |
CN103761670A (en) * | 2014-02-07 | 2014-04-30 | 华为技术有限公司 | Touch sensation interaction method and device in shopping |
CN104751514A (en) * | 2015-04-03 | 2015-07-01 | 厦门唯尔酷信息技术有限公司 | Wearing fitness simulating method and wearing fitness simulating system |
CN105787751A (en) * | 2016-01-06 | 2016-07-20 | 湖南拓视觉信息技术有限公司 | 3D human body virtual fitting method and system |
CN105956912A (en) * | 2016-06-06 | 2016-09-21 | 施桂萍 | Method for realizing network fitting |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109388229A (en) * | 2017-08-11 | 2019-02-26 | 哈尔滨工业大学 | A kind of immersion virtual fit method and system with sense of touch experience |
CN108090811A (en) * | 2017-11-23 | 2018-05-29 | 中国计量大学 | The textile product virtual reality net purchase system and method quantified based on subjective sensation |
CN108181987A (en) * | 2017-11-23 | 2018-06-19 | 中国计量大学 | A kind of textile cognitive method based on virtual reality |
CN108195690A (en) * | 2017-11-23 | 2018-06-22 | 中国计量大学 | A kind of textile sensory perceptual system based on virtual reality |
CN108898979A (en) * | 2018-04-28 | 2018-11-27 | 深圳市奥拓电子股份有限公司 | Advertisement machine interactive approach, interactive system for advertisement player and advertisement machine |
CN110148040A (en) * | 2019-05-22 | 2019-08-20 | 珠海随变科技有限公司 | A kind of virtual fit method, device, equipment and storage medium |
CN110738548A (en) * | 2019-09-26 | 2020-01-31 | 维沃移动通信有限公司 | Virtual fitting method and device, mobile terminal and computer readable storage medium |
CN110738548B (en) * | 2019-09-26 | 2021-11-09 | 维沃移动通信有限公司 | Virtual fitting method and device, mobile terminal and computer readable storage medium |
WO2023093679A1 (en) * | 2021-11-23 | 2023-06-01 | 北京字节跳动网络技术有限公司 | Image processing method and apparatus, electronic device, and storage medium |
CN114902266A (en) * | 2021-11-26 | 2022-08-12 | 株式会社威亚视 | Information processing apparatus, information processing method, information processing system, and program |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106651498A (en) | Information processing method and device | |
GB2564745B (en) | Methods for generating a 3D garment image, and related devices, systems and computer program products | |
US10417825B2 (en) | Interactive cubicle and method for determining a body shape | |
CN108292449A (en) | Three-dimensional garment is changed using gesture | |
CN106055710A (en) | Video-based commodity recommendation method and device | |
CN104199542A (en) | Intelligent mirror obtaining method and device and intelligent mirror | |
CN106202304A (en) | Method of Commodity Recommendation based on video and device | |
CN104966284A (en) | Method and equipment for acquiring object dimension information based on depth data | |
CN111681070A (en) | Method, device, storage device and equipment for purchasing online commodities | |
CN105374057A (en) | Virtual try-on apparatus and virtual try-on method | |
CN105374058A (en) | Virtual try-on apparatus, virtual try-on system and virtual try-on method | |
CN106960475B (en) | Method and device for processing part clicking of three-dimensional model, storage medium and processor | |
CN108196669A (en) | Modification method, device, processor and the head-mounted display apparatus of avatar model | |
CN107666435A (en) | A kind of method and device for shielding message | |
CN106779774A (en) | Virtual fitting system and virtual fit method | |
CN110119201A (en) | A kind of method and apparatus of virtual experience household appliance collocation domestic environment | |
KR20200139934A (en) | Clothes Wearing Service Apparatus and Method using Augmented Reality | |
WO2022262508A1 (en) | Augmented reality-based intelligent trying on method and system, terminal, and medium | |
KR102064653B1 (en) | Wearable glasses and method for clothes shopping based on augmented relity | |
JP2019128923A (en) | Information providing device, method and program | |
KR102278882B1 (en) | Merchandise sales service device based on dynamic scene change, Merchandise sales system based on dynamic scene change, method for selling merchandise based on dynamic scene change and computer readable medium having computer program recorded therefor | |
CN112102018A (en) | Intelligent fitting mirror implementation method and related device | |
CN103440580B (en) | A kind of method and apparatus of the medicated clothing image that virtual fitting is provided | |
KR20140015709A (en) | System for image matching and method thereof | |
CN106651500B (en) | Online shopping system based on video image recognition technology and virtual reality technology of spatial feedback characteristics |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | | Application publication date: 20170510 |