CN107967062B - Intelligent fitting method and system based on somatosensory interaction and shop window - Google Patents
- Publication number
- CN107967062B CN107967062B CN201711432380.6A CN201711432380A CN107967062B CN 107967062 B CN107967062 B CN 107967062B CN 201711432380 A CN201711432380 A CN 201711432380A CN 107967062 B CN107967062 B CN 107967062B
- Authority
- CN
- China
- Prior art keywords
- image information
- information
- somatosensory interaction
- body type
- clothes
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06F3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F18/22 — Matching criteria, e.g. proximity measures
- G06Q30/06 — Buying, selling or leasing transactions
- G06T17/00 — Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06V10/44 — Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components
- G06V40/28 — Recognition of hand or arm movements, e.g. recognition of deaf sign language
Abstract
The invention discloses an intelligent fitting method, fitting system and shop window based on somatosensory interaction. The method comprises the following steps: a distance sensor senses the distance between a person and the display window; a body type camera acquires body type image information; clothing image information is extracted from a database; the acquired body type image information is matched with the clothing image information to generate effect image information; and the effect image information is sent to a display device. The intelligent fitting system comprises a distance sensor, a somatosensory interaction device, an infrared induction camera, a memory, a display device and a processor. According to the invention, the infrared induction camera scans the human body and generates a human body model matched with the customer through a certain algorithm, and then the clothing image information and the human body model information are combined through a certain algorithm into an image of the clothes being worn, so that the displayed effect is closer to the effect of a person actually wearing the clothes.
Description
Technical Field
The invention relates to the field of somatosensory interaction, in particular to an intelligent fitting method, system and showcase based on somatosensory interaction.
Background
With the continuous development of somatosensory interaction technology, users place ever higher requirements on somatosensory interaction control. Gesture control is a subclass of somatosensory interaction that controls various applications or devices mainly by recognizing the shape and movement track of a person's hands.
In general, a physical clothing shop sets up a display window on the side near the street to display the shop's better-looking clothing and attract consumers into the shop to purchase. When a passer-by is attracted by the clothing in the window, they may not want the trouble of entering the shop for a try-on, yet still particularly want to see the clothing tried on themselves.
Disclosure of Invention
The invention aims to solve the following technical problem: to provide an intelligent fitting method based on somatosensory interaction that makes fitting convenient, and further to provide an intelligent fitting system based on somatosensory interaction and an intelligent fitting shop window based on somatosensory interaction.
In order to solve the problems, the invention relates to an intelligent fitting method based on somatosensory interaction, which comprises the following steps:
the distance sensor senses distance information between the person and the display window;
the body type camera acquires body type image information;
extracting clothing image information in a database;
matching the acquired body type image information with the garment image information to generate effect image information;
and transmitting the effect image information to a display device.
Optionally, before the step of acquiring the body type image information by the body type camera, the intelligent fitting method of the invention comprises the following steps:
the somatosensory interaction device acquires action information;
identifying the action information;
matching the identified action information with the somatosensory interaction instruction;
and sending the somatosensory interaction instruction to the body type camera, and the body type camera executes the somatosensory interaction instruction.
Optionally, before the step of acquiring the motion information by the somatosensory interaction device, the intelligent fitting method includes:
and comparing the sensed distance information with a set value; when the sensed distance information exceeds the set value, the somatosensory interaction device sends a prompt about the distance range to the display device, and the display device plays a guidance video for the distance range.
Optionally, before the step of matching the obtained body type image information with the obtained clothes image information to generate the effect image information, the intelligent fitting method further comprises the following steps:
performing key point identification on the acquired body type image information;
and acquiring key points of the clothes image information, and matching the key points of the body type image information with the key points of the clothes image information.
Optionally, before the step of sending the effect image information to the display device, the intelligent fitting method further includes:
the somatosensory interaction device acquires action information;
identifying the action information;
matching the identified action information with the somatosensory interaction instruction;
and executing the somatosensory interaction instruction, and sending the somatosensory interaction instruction and an execution result of the somatosensory interaction instruction to display equipment.
Optionally, before the step of executing the somatosensory interaction instruction, the intelligent fitting method further includes:
and when the identified action information has no matching somatosensory interaction instruction, extracting the action information corresponding to the most similar somatosensory interaction instruction, sending an unrecognized-action notice to the display device, and sending a guidance video of the action information corresponding to the similar somatosensory interaction instruction to the display device.
Optionally, after the step of sending the effect image information to the display device, the intelligent fitting method further includes:
the wireless connection device establishes connection with the mobile phone;
and after receiving the corresponding instruction information, saving the effect image information and sending the effect image information to the mobile phone.
The invention also relates to an intelligent fitting system based on somatosensory interaction, which comprises:
distance sensor: the distance sensor is used for detecting the distance between a person and the showcase;
somatosensory interaction device: the somatosensory interaction device is used for identifying the action information of the person and matching the somatosensory interaction command according to the action information of the person;
infrared induction camera: the infrared induction sensor is used for acquiring body type image information of a person;
a memory: the memory is used for storing clothes image information;
display device: the display device is used for displaying the effect image information and receiving the somatosensory interaction command of the somatosensory interaction device.
processor: the processor is used for receiving feedback information from the other elements and sending control commands to them.
Optionally, the intelligent fitting system of the present invention further includes:
wireless connection device: the wireless connection device is used for connecting a mobile phone of a user.
The invention also relates to an intelligent fitting shop window based on somatosensory interaction, which is characterized in that: the display device comprises a showcase, a distance sensor, a somatosensory interaction device, an infrared induction camera, a memory, display equipment and a processor;
the distance sensor is used for detecting the distance between a person and the showcase;
the somatosensory interaction device is used for identifying the action information of the person and matching the somatosensory interaction command according to the action information of the person;
the infrared induction camera is used for acquiring body type image information of a person;
the memory is used for storing clothes image information;
the display equipment is used for displaying the effect image information and receiving a somatosensory interaction command of the somatosensory interaction device;
the processor is used for receiving feedback information from the other elements and sending control commands to the other elements;
the showcase comprises four vertical faces, the vertical face close to the roadside is made of transparent materials, the display equipment is liquid crystal display equipment, the liquid crystal display equipment is arranged on the vertical face made of the transparent materials, the display face of the liquid crystal display equipment faces outwards, and the distance sensor is arranged on the outer side of the showcase.
The beneficial effects of the invention are as follows: the somatosensory interaction device starts the intelligent fitting system only when it senses a specific gesture from an interested pedestrian, rather than starting it for the unintentional actions of pedestrians far from the shop window; the distance sensor senses the distance between a pedestrian and the shop window, and the intelligent fitting system is not started when the pedestrian is not in the designated area, so that misoperation is avoided. After the intelligent fitting system is started, the infrared sensing camera scans the human body and generates a human body model matched with the customer through a certain algorithm; the clothing image information and the human body model information are then combined through a certain algorithm into an image of the clothes being worn, so that the displayed effect is closer to the effect of a person actually wearing the clothes.
Drawings
FIG. 1 is a schematic flow chart of a first embodiment of an intelligent fitting method of the present invention;
fig. 2 is a schematic flow chart of a second embodiment of the intelligent fitting method of the present invention;
FIG. 3 is a schematic flow chart of a third embodiment of the intelligent fitting method of the present invention;
fig. 4 is a schematic flow chart of a fourth embodiment of the intelligent fitting method of the present invention;
fig. 5 is a schematic diagram of the composition structure of the intelligent fitting system of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. The components of the embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the invention, as presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
In the description of the present invention, it should be noted that, directions or positional relationships indicated by terms such as "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., are directions or positional relationships based on those shown in the drawings, or are directions or positional relationships conventionally put in use of the inventive product, are merely for convenience of describing the present invention and simplifying the description, and are not indicative or implying that the apparatus or element to be referred to must have a specific direction, be constructed and operated in a specific direction, and thus should not be construed as limiting the present invention. Furthermore, the terms "first," "second," "third," and the like are used merely to distinguish between descriptions and should not be construed as indicating or implying relative importance.
Example 1
As shown in fig. 1, a flowchart of a first embodiment of the intelligent fitting method of the present invention, a virtual fitting method according to this embodiment includes:
101. the distance sensor senses distance information between the person and the display window;
in this embodiment, the distance sensor is mainly used to detect the distance between a roadside pedestrian and the shop window, so as to prevent pedestrians at a larger distance from operating the window unintentionally. Intelligent fitting can be performed only when the person stands within the designated range of the showcase. When the sensed distance information exceeds the set value, the somatosensory interaction device sends a prompt about the distance range to the display device, and the display device plays a guidance video for the distance range to guide interested customers into the correct position.
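The distance-gating behaviour described above can be sketched as follows; the activation range, prompt strings, and function name are illustrative assumptions, not values from the patent.

```python
# Sketch of the distance-gating step: the fitting system activates only when
# the sensed distance falls inside a configured range; otherwise a prompt is
# returned so the display device can guide the customer into position.

ACTIVATION_RANGE_M = (0.5, 2.0)  # assumed min/max distance from the window, in metres

def check_distance(distance_m, activation_range=ACTIVATION_RANGE_M):
    """Return (active, prompt): active if the person stands in range,
    otherwise a prompt describing how to move into range."""
    low, high = activation_range
    if distance_m < low:
        return False, "Please step back from the window."
    if distance_m > high:
        return False, "Please step closer to the window."
    return True, None

# A pedestrian 3 m away is not activated and is prompted to approach.
active, prompt = check_distance(3.0)
```

In a real deployment the range would be tuned to the sensor's mounting position and the width of the pavement in front of the window.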
102. The body type camera acquires body type image information;
in this embodiment, the customer stands in a specific area, and the body type camera mainly detects the external contour of the customer's body, so the customer does not need to take off the clothes they are wearing during fitting. In this embodiment the body type camera may be an infrared sensing camera: the infrared sensing camera captures the body's heat radiation, and the processor processes the thermal image captured by the camera to obtain the external contour of the body.
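A minimal sketch of the contour-extraction idea, assuming the thermal frame arrives as a grid of temperatures; the threshold value and the synthetic frame are illustrative, not the patent's algorithm.

```python
# Warm (body) pixels are separated from the cooler background by a simple
# temperature threshold, standing in for the "external contour" extraction
# the processor performs on the thermal image. The 30 C threshold is assumed.

def body_mask(frame, threshold_c=30.0):
    """Return a grid of booleans marking pixels warmer than the threshold."""
    return [[temp > threshold_c for temp in row] for row in frame]

# Synthetic 5x5 frame: a warm 3x3 "body" on a 20 C background.
frame = [[20.0] * 5 for _ in range(5)]
for r in range(1, 4):
    for c in range(1, 4):
        frame[r][c] = 34.0

mask = body_mask(frame)
warm_pixels = sum(flag for row in mask for flag in row)
```

A production system would follow the thresholding with a contour-tracing step to turn the mask into an outline.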
103. Extracting clothing image information in a database;
in this embodiment, all the clothing image information stored in the database is photographed in advance and stored in the database. Obtaining the clothing image information mainly includes the following steps:
s01, putting clothes with different sizes on model bodies corresponding to the sizes, and performing corresponding actions;
in this embodiment, after the model wears the garment, the model performs common actions such as raising the hands to a horizontal position, raising the hands overhead, separating the feet, and squatting, and a set of 360-degree photos is taken for each action. The interval angle between shots is set according to the processing capability and the desired level of detail: for example, for better visual effect one photo can be taken every 10 degrees, i.e. each group of two-dimensional image information contains 36 photos. When a customer tries on clothes, the display device can guide the customer's fitting posture according to the actions the model made when the photos were taken.
S02, shooting a group of two-dimensional image information of each action garment in the 360-degree direction;
s03, processing the group of image information into three-dimensional image information;
in this embodiment, three-dimensional synthesis is performed on each group of photographed photos by the photo processor, that is, each group of photos is synthesized into a 360-degree panoramic three-dimensional model, so that the details at the corresponding angle are shown when a customer performs fitting.
S04, model characteristic information in the three-dimensional image information is removed, and three-dimensional image information of the clothes is generated;
in this embodiment, the characteristic information of the model in the synthesized 360-degree panoramic three-dimensional model is removed by the image processor, and only the characteristic information of the clothing is retained. The following approach can be adopted: when the images are shot, synchronous thermal imaging shots of the model are also taken; a body type model of the model is generated from the thermal-imaging photos, and this body type model is then removed from the synthesized 360-degree panoramic three-dimensional model to obtain the image characteristic information of the garment.
S05, storing three-dimensional image information of the clothes under each action.
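The capture pipeline S01 to S05 can be sketched as a store keyed by size and action, with one view per interval angle; the pose names, function names, and the 10-degree interval (taken from the example in step 103) are illustrative assumptions.

```python
# Each garment is photographed in every action pose at a fixed interval angle
# (here 10 degrees, i.e. 36 views per group, as in the example above), and the
# resulting groups are stored for later matching. The angle lists stand in
# for the stored three-dimensional image information.

ACTIONS = ["arms_horizontal", "arms_raised", "feet_apart", "squat"]  # assumed pose names

def build_garment_database(sizes, actions=ACTIONS, interval_deg=10):
    """Return a dict mapping (size, action) -> list of view angles."""
    angles = list(range(0, 360, interval_deg))
    return {(size, action): angles for size in sizes for action in actions}

db = build_garment_database(sizes=["S", "M", "L"])
# 3 sizes x 4 actions = 12 groups of 36 views each
```

Keying by (size, action) mirrors the pipeline: S01 fixes the size and pose, S02 fills in the 36 views, and S03 to S05 replace each group with its processed garment-only model.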
104. Matching the acquired body type image information with the garment image information to generate effect image information;
in this embodiment, the image feature information of the clothing generated as described above is matched with the figure image information of the customer, and the customer's pose is recognized so as to match the clothing image information in the corresponding pose. When a customer turns around in place during fitting, displays can be arranged in front of and behind the customer, so that after conveniently rotating 180 degrees the customer can observe the details of the back of the garment being worn.
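Matching the customer's orientation to the stored views can be sketched as a nearest-angle lookup over the 36 photos per group; the 10-degree interval comes from the example in step 103, while the function itself is an assumption.

```python
# Given the customer's current orientation, pick the index of the closest
# stored view so the garment is rendered at the corresponding angle.

def nearest_view_index(orientation_deg, interval_deg=10, n_views=36):
    """Map an orientation in [0, 360) to the index of the closest photo."""
    return round(orientation_deg / interval_deg) % n_views

# A customer facing 87 degrees is shown view 9 (the 90-degree photo).
idx = nearest_view_index(87)
```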
105. And transmitting the effect image information to a display device.
In this embodiment, the effect image is displayed on the display device for customers to view. The embodiment also includes a wireless connection device that can connect to a customer's mobile phone; the connection can be set up over wireless Bluetooth or over a network, and the customer enters the number of the fitting window through a mobile phone app to retrieve and transfer the photos.
Example two
As shown in fig. 2, which is a schematic flow chart of a second embodiment of the intelligent fitting method of the present invention, the virtual fitting method according to the present embodiment includes:
201. the distance sensor senses distance information between the person and the display window;
202. the somatosensory interaction device acquires action information;
identifying the action information;
matching the identified action information with the somatosensory interaction instruction;
the somatosensory interaction instruction is sent to the body type camera, and the body type camera executes the somatosensory interaction instruction;
in this embodiment, to avoid passers-by unintentionally triggering the step of opening intelligent fitting as they pass the shop window, a customer who intends to try on clothing can make a specific trigger action, for example drawing a circle, within the specific range of the shop window; the somatosensory interaction device detects the customer's action and sends a somatosensory interaction instruction to the body type camera, after which the following steps proceed.
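The circle-drawing trigger can be sketched as a simple circularity test on the sampled hand trajectory; the variance-of-radius criterion and the tolerance value are illustrative assumptions, not the patent's recognition algorithm.

```python
import math

# Fire the start instruction only when the hand path approximates a circle:
# every sampled point must lie within a relative tolerance of the mean
# radius around the trajectory's centroid.

def is_circle(points, tolerance=0.15):
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    mean_r = sum(radii) / len(radii)
    return all(abs(r - mean_r) <= tolerance * mean_r for r in radii)

# A sampled circular path triggers the system; a straight swipe does not.
circle = [(math.cos(t * math.pi / 4), math.sin(t * math.pi / 4)) for t in range(8)]
swipe = [(float(t), 0.0) for t in range(1, 9)]
```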
203. The body type camera acquires body type image information;
204. extracting clothing image information in a database;
205. matching the acquired body type image information with the garment image information to generate effect image information;
206. and transmitting the effect image information to a display device.
The other steps in this embodiment are the same as in embodiment one and implement the same functions; for details, please refer to embodiment one.
Example III
As shown in fig. 3, a flowchart of the third embodiment of the intelligent fitting method of the present invention, the virtual fitting method of this embodiment includes:
301. the distance sensor senses distance information between the person and the display window;
302. the body type camera acquires body type image information;
303. extracting clothing image information in a database;
304. performing key point identification on the acquired body type image information;
305. and acquiring key points of the clothes image information, and matching the key points of the body type image information with the key points of the clothes image information.
In this embodiment, when the clothing image information is stored, several key identification points are located at the joints of the model that support the clothing. When the customer's body type image information is processed, the key points corresponding to these identification points are found; when the customer's body type image information is combined with the clothing image information, only the key points on the body type image need to be aligned with the identification points on the clothing image, which effectively improves the efficiency and accuracy of the composite picture.
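The keypoint correspondence can be sketched with a two-point alignment: scale and translate the clothing's shoulder identification points onto the customer's detected shoulder keypoints. The single-axis scale and the point names are illustrative assumptions standing in for the full joint-by-joint matching.

```python
# Map the garment's identification points onto the customer's keypoints by
# computing a uniform scale from shoulder width plus a translation.

def fit_transform(clothes_pts, body_pts):
    """Return (scale, offset) mapping clothing keypoints onto body keypoints,
    using the left/right shoulder points as anchors."""
    clx, cly = clothes_pts["l_shoulder"]
    crx, _ = clothes_pts["r_shoulder"]
    blx, bly = body_pts["l_shoulder"]
    brx, _ = body_pts["r_shoulder"]
    scale = (brx - blx) / (crx - clx)
    offset = (blx - clx * scale, bly - cly * scale)
    return scale, offset

clothes = {"l_shoulder": (10.0, 20.0), "r_shoulder": (30.0, 20.0)}
body = {"l_shoulder": (100.0, 50.0), "r_shoulder": (140.0, 50.0)}
scale, offset = fit_transform(clothes, body)
# scale 2.0 stretches the garment to the customer's shoulder width
```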
306. Matching the acquired body type image information with the garment image information to generate effect image information;
307. and transmitting the effect image information to a display device.
The other steps in this embodiment are the same as in embodiment one and implement the same functions; for details, please refer to embodiment one.
Example IV
As shown in fig. 4, which is a schematic flow chart of the fourth embodiment of the intelligent fitting method of the present invention, the virtual fitting method according to this embodiment includes:
401. the distance sensor senses distance information between the person and the display window;
402. the body type camera acquires body type image information;
403. extracting clothing image information in a database;
404. matching the acquired body type image information with the garment image information to generate effect image information;
405. the somatosensory interaction device acquires action information;
identifying the action information;
matching the identified action information with the somatosensory interaction instruction;
and executing the somatosensory interaction instruction, and sending the somatosensory interaction instruction and an execution result of the somatosensory interaction instruction to display equipment.
406. And transmitting the effect image information to a display device.
In this embodiment, the somatosensory interaction device is used to change the garment being tried on; for example, a lateral leftward swing of the customer's hand can be set to switch to the next garment, and a rightward horizontal swing to the previous one. When the identified action information has no matching somatosensory interaction instruction, the action information corresponding to the most similar somatosensory interaction instruction is extracted, an unrecognized-action notice is sent to the display device, and a guidance video of the action information corresponding to the similar somatosensory interaction instruction is sent to the display device to guide the customer to operate with the correct posture.
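The browsing gestures and the similar-gesture fallback can be sketched as a lookup table; the gesture names, command names, and similarity map are illustrative assumptions.

```python
# Left swing advances to the next garment, right swing returns to the
# previous one; an unrecognized action yields no command but points at the
# closest known gesture so the display can play its guidance video.

COMMANDS = {"swing_left": "next_garment", "swing_right": "previous_garment"}
SIMILAR = {"wave_left": "swing_left", "wave_right": "swing_right"}  # assumed similarity map

def interpret(action):
    """Return (command, guidance_gesture) for a recognized action name."""
    if action in COMMANDS:
        return COMMANDS[action], None
    return None, SIMILAR.get(action)

# A sloppy leftward wave matches no command, so swing_left is demonstrated.
cmd, hint = interpret("wave_left")
```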
The other steps in this embodiment are the same as in embodiment one and implement the same functions; for details, please refer to embodiment one.
Example five
As shown in fig. 5, this embodiment relates to an intelligent fitting system based on somatosensory interaction, the intelligent fitting system includes:
distance sensor: the distance sensor is used for detecting the distance between a person and the showcase;
somatosensory interaction device: the somatosensory interaction device is used for identifying the action information of the person and matching the somatosensory interaction command according to the action information of the person;
infrared induction camera: the infrared induction sensor is used for acquiring body type image information of a person;
a memory: the memory is used for storing clothes image information;
display device: the display device is used for displaying the effect image information and receiving the somatosensory interaction command of the somatosensory interaction device.
processor: the processor is used for receiving feedback information from the other elements and sending control commands to them.
wireless connection device: the wireless connection device is used for connecting to the customer's mobile phone.
Example six
The embodiment also relates to an intelligent fitting shop window based on somatosensory interaction, which comprises a shop window body, a distance sensor, a somatosensory interaction device, an infrared induction camera, a memory, display equipment and a processor;
the distance sensor is used for detecting the distance between a person and the showcase;
the somatosensory interaction device is used for identifying the action information of the person and matching the somatosensory interaction command according to the action information of the person;
the infrared induction camera is used for acquiring body type image information of a person;
the memory is used for storing clothes image information;
the display equipment is used for displaying the effect image information and receiving a somatosensory interaction command of the somatosensory interaction device;
the processor is used for receiving feedback information from the other elements and sending control commands to the other elements;
the showcase comprises four vertical faces, the vertical face close to the roadside is made of transparent materials, the display equipment is liquid crystal display equipment, the liquid crystal display equipment is arranged on the vertical face made of the transparent materials, the display face of the liquid crystal display equipment faces outwards, and the distance sensor is arranged on the outer side of the showcase.
The foregoing embodiments are merely examples of the present invention; the scope of the present invention includes, but is not limited to, the forms and styles of the foregoing embodiments. Any suitable changes or modifications made by those skilled in the art that are consistent with the claims of the present invention fall within the scope of the present invention.
Claims (10)
1. An intelligent fitting method based on somatosensory interaction, characterized in that the intelligent fitting method comprises the following steps:
the distance sensor senses distance information between the person and the display window;
the body type camera acquires body type image information;
extracting clothes image information from a database;
the clothes image information stored in the database is shot in advance, and is obtained by the following steps:
s01, putting clothes of each size on a model of the corresponding size and having the model perform the corresponding actions; the interval angle of each shot is set according to the processing capacity and the required level of detail; when a customer tries on clothes, the display equipment guides the customer's fitting posture according to the actions the models performed when the pictures were shot;
s02, shooting a group of two-dimensional images of the garment in each action over the full 360-degree range;
s03, processing the group of image information into three-dimensional image information;
s04, removing the model feature information from the three-dimensional image information to generate three-dimensional image information of the clothes: the image processor removes the feature information of the model from the synthesized 360-degree panoramic three-dimensional model and retains only the feature information of the clothes; the following method is adopted: while the images are being shot, the model is synchronously photographed by thermal imaging, a body type model of the model is generated from the thermal-imaging photos, and this body type model is removed from the synthesized 360-degree panoramic three-dimensional model to obtain the image feature information of the clothes;
s05, storing three-dimensional image information of the clothes under each action;
matching the acquired body type image information with the garment image information to generate effect image information;
and transmitting the effect image information to a display device.
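The overall flow of claim 1 (look up the stored garment data captured in steps s01-s05, match it with the acquired body type image, and produce effect image information for the display) can be sketched in a few lines. Everything below is an illustrative assumption: the function and field names do not come from the patent, and the stored 3-D garment data is stubbed out.

```python
# Minimal sketch of the claim-1 matching step. All identifiers
# (EffectImage, generate_effect_image, the pose labels) are hypothetical.
from dataclasses import dataclass

@dataclass
class EffectImage:
    body: str     # identifier of the captured body type image
    garment: str  # identifier of the stored 3-D garment model
    pose: str     # the model pose the customer was guided to copy

def generate_effect_image(body_image: str, garment_db: dict,
                          garment_id: str, pose: str) -> EffectImage:
    """Match the acquired body type image with a stored garment model."""
    poses = garment_db[garment_id]        # one 3-D model per captured pose
    if pose not in poses:
        raise KeyError(f"no 3-D garment data stored for pose {pose!r}")
    return EffectImage(body=body_image, garment=garment_id, pose=pose)

# Usage: the database holds one 3-D model per garment per captured action.
db = {"coat-42": {"arms-raised": object(), "standing": object()}}
effect = generate_effect_image("body-001", db, "coat-42", "standing")
print(effect.garment)  # coat-42
```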
2. The intelligent fitting method based on somatosensory interaction according to claim 1, wherein: before the step of acquiring the body type image information by the body type camera, the intelligent fitting method comprises the following steps:
the somatosensory interaction device acquires action information;
identifying the action information;
matching the identified action information with the somatosensory interaction instruction;
and sending the somatosensory interaction instruction to the body type camera, and executing the somatosensory interaction instruction by the body type camera.
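The recognise-then-match sequence of claim 2 amounts to a lookup from recognised action information to a somatosensory interaction instruction. The gesture names and instruction codes below are illustrative assumptions, not part of the claim.

```python
# Hypothetical action-to-instruction table for the somatosensory
# interaction device; the matched instruction would be forwarded to the
# body type camera (or display equipment) for execution.
from typing import Optional

GESTURE_TABLE = {
    "raise_right_hand": "CAPTURE_BODY_IMAGE",
    "swipe_left": "NEXT_GARMENT",
    "swipe_right": "PREVIOUS_GARMENT",
}

def match_instruction(action: str) -> Optional[str]:
    """Return the interaction instruction for a recognised action, if any."""
    return GESTURE_TABLE.get(action)

print(match_instruction("raise_right_hand"))  # CAPTURE_BODY_IMAGE
```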
3. The intelligent fitting method based on somatosensory interaction according to claim 2, wherein: before the step of acquiring the action information by the somatosensory interaction device, the intelligent fitting method comprises the following steps:
and comparing the sensed distance information with a set value; when the sensed distance information exceeds the set value, the somatosensory interaction device sends prompt information about the distance range to the display equipment, and the display equipment plays a guidance video for the distance range.
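The distance check of claim 3 is a simple range comparison that decides whether the display equipment should play the guidance video. The threshold values below are illustrative assumptions; the patent only speaks of "a set value".

```python
# Hypothetical working range for the distance sensor in front of the window.
MIN_DISTANCE_M = 1.0   # assumed lower bound of the sensing range
MAX_DISTANCE_M = 3.5   # assumed upper bound of the sensing range

def check_distance(distance_m: float) -> str:
    """Return the action the display equipment should take, or 'ok'."""
    if distance_m < MIN_DISTANCE_M:
        return "play_guidance_video:step_back"
    if distance_m > MAX_DISTANCE_M:
        return "play_guidance_video:step_closer"
    return "ok"

print(check_distance(5.0))  # play_guidance_video:step_closer
```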
4. The intelligent fitting method based on somatosensory interaction according to claim 1, wherein: before the step of matching the acquired body type image information with the garment image information to generate the effect image information, the intelligent fitting method further comprises the following steps:
performing key point identification on the acquired body type image information;
and acquiring key points of the clothes image information, and matching the key points of the body type image information with the key points of the clothes image information.
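One way to realise the key point matching of claim 4 is to estimate a similarity transform (scale plus translation) from two shared key points, such as the shoulders, and map the remaining garment key points onto the body. The key point names and the arithmetic below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical key point alignment: scale the garment to the shoulder width
# of the body type image, then translate it onto the body key points.
def fit_garment(body_kp: dict, garment_kp: dict) -> dict:
    """Scale and translate garment key points onto the body key points."""
    bl, br = body_kp["left_shoulder"], body_kp["right_shoulder"]
    gl, gr = garment_kp["left_shoulder"], garment_kp["right_shoulder"]
    scale = (br[0] - bl[0]) / (gr[0] - gl[0])       # shoulder-width ratio
    dx, dy = bl[0] - gl[0] * scale, bl[1] - gl[1] * scale
    return {k: (x * scale + dx, y * scale + dy)
            for k, (x, y) in garment_kp.items()}

body = {"left_shoulder": (100, 50), "right_shoulder": (200, 50)}
garment = {"left_shoulder": (10, 5), "right_shoulder": (60, 5), "hem": (35, 80)}
print(fit_garment(body, garment)["right_shoulder"])  # (200.0, 50.0)
```

A full implementation would use many more key points (hips, neck, wrists) and a least-squares or affine fit, but the matching idea is the same.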
5. The intelligent fitting method based on somatosensory interaction according to claim 1, wherein: before the step of transmitting the effect image information to the display device, the intelligent fitting method further includes:
the somatosensory interaction device acquires action information;
identifying the action information;
matching the identified action information with the somatosensory interaction instruction;
and executing the somatosensory interaction instruction, and sending the somatosensory interaction instruction and an execution result of the somatosensory interaction instruction to display equipment.
6. The intelligent fitting method based on somatosensory interaction according to claim 2 or 5, wherein: before executing the somatosensory interaction instruction, the intelligent fitting method further comprises the following steps:
and when no somatosensory interaction instruction matches the identified action information, extracting the action information corresponding to the somatosensory interaction instruction most similar to it, sending an unrecognised-action notice to the display equipment, and sending a guidance video of the action information corresponding to that similar somatosensory interaction instruction to the display equipment.
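The fallback of claim 6 can be sketched as a nearest-match lookup: if the recognised action has no exact instruction, pick the most similar known action and have the display equipment play its guidance video. The similarity measure (string similarity over action labels) is purely an illustrative assumption; a real system would compare pose features.

```python
# Hypothetical fallback for unmatched actions using difflib's fuzzy match.
import difflib

KNOWN_ACTIONS = ["raise_right_hand", "raise_left_hand",
                 "swipe_left", "swipe_right"]

def resolve_action(action: str) -> tuple:
    """Return (status, payload) for the display equipment."""
    if action in KNOWN_ACTIONS:
        return ("execute", action)
    # No exact match: find the most similar known action to guide the user.
    similar = difflib.get_close_matches(action, KNOWN_ACTIONS, n=1, cutoff=0.0)
    return ("unrecognised_show_guidance", similar[0])

print(resolve_action("raise_rigth_hand"))
# ('unrecognised_show_guidance', 'raise_right_hand')
```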
7. The intelligent fitting method based on somatosensory interaction according to claim 1, wherein: after the step of transmitting the effect image information to the display device, the intelligent fitting method further includes:
the wireless connection device establishes connection with the mobile phone;
and after receiving the corresponding instruction information, saving the effect image information and sending the effect image information to the mobile phone.
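Claim 7's save-and-send step reduces to persisting the effect image and pushing it over the established wireless link. The sketch below abstracts the transport behind a callable and keeps the "save" in memory; every name here is a hypothetical stand-in.

```python
# Hypothetical save-and-send handler for the effect image information.
import io

def handle_save_instruction(effect_image: bytes, phone_link) -> int:
    """Persist the effect image, then push it over the established link."""
    buffer = io.BytesIO()
    buffer.write(effect_image)            # 'save' step (in-memory for the sketch)
    return phone_link(buffer.getvalue())  # 'send' step; returns bytes sent

# Usage: the link is modelled as a callable that reports bytes transferred.
sent = handle_save_instruction(b"JPEGDATA", lambda payload: len(payload))
print(sent)  # 8
```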
8. An intelligent fitting system based on somatosensory interaction according to any one of claims 1-7, characterized in that: the intelligent fitting system comprises:
distance sensor: the distance sensor is used for detecting the distance between a person and the showcase;
somatosensory interaction device: the somatosensory interaction device is used for identifying the action information of the person and matching the somatosensory interaction command according to the action information of the person;
infrared induction camera: the infrared induction camera is used for acquiring body type image information of a person;
a memory: the memory is used for storing clothes image information;
display device: the display equipment is used for displaying the effect image information and receiving a somatosensory interaction command of the somatosensory interaction device;
a processor: for receiving feedback information from other elements and for sending control commands to other elements.
9. The somatosensory interaction based intelligent fitting system according to claim 8, wherein: the intelligent fitting system also comprises:
wireless connection device: the wireless connection device is used for connecting with the user's mobile phone.
10. An intelligent fitting shop window based on somatosensory interaction, applying the intelligent fitting method of any one of claims 1-7, characterized in that: the shop window comprises a showcase, a distance sensor, a somatosensory interaction device, an infrared induction camera, a memory, display equipment and a processor;
the distance sensor is used for detecting the distance between a person and the showcase;
the somatosensory interaction device is used for identifying the action information of the person and matching the somatosensory interaction command according to the action information of the person;
the infrared induction camera is used for acquiring body type image information of a person;
the memory is used for storing clothes image information;
the display equipment is used for displaying the effect image information and receiving a somatosensory interaction command of the somatosensory interaction device;
the processor is used for receiving feedback information from the other elements and sending control commands to the other elements;
the showcase comprises four vertical faces; the vertical face close to the roadside is made of transparent material; the display equipment is liquid crystal display equipment arranged on the transparent vertical face with its display face facing outwards; and the distance sensor is arranged on the outer side of the showcase.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711432380.6A CN107967062B (en) | 2017-12-26 | 2017-12-26 | Intelligent fitting method and system based on somatosensory interaction and shop window |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107967062A CN107967062A (en) | 2018-04-27 |
CN107967062B true CN107967062B (en) | 2023-11-24 |
Family
ID=61994855
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711432380.6A Active CN107967062B (en) | 2017-12-26 | 2017-12-26 | Intelligent fitting method and system based on somatosensory interaction and shop window |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107967062B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109308822A (en) * | 2018-10-25 | 2019-02-05 | 湖南城市学院 | A kind of Dancing Teaching Interactive Experience method and system |
CN109753152A (en) * | 2018-12-21 | 2019-05-14 | 北京市商汤科技开发有限公司 | Exchange method and device based on human body attitude, computer equipment |
CN114998517A (en) * | 2022-05-27 | 2022-09-02 | 广亚铝业有限公司 | Aluminum alloy door and window exhibition hall and shared exhibition method |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN201654841U (en) * | 2010-04-07 | 2010-11-24 | 杭州雄伟科技开发有限公司 | Fashionable fitting service system |
CN102004860A (en) * | 2010-12-02 | 2011-04-06 | 天津市企商科技发展有限公司 | Network real-person fitting system and control method thereof |
CN103049852A (en) * | 2012-12-19 | 2013-04-17 | 武汉世纪炎龙网络科技有限公司 | Virtual fitting system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||