CN112536802B - Dressing assisting method and system - Google Patents

Dressing assisting method and system

Info

Publication number
CN112536802B
CN112536802B (application CN202010339173.1A)
Authority
CN
China
Prior art keywords: dressing, clothes, user, dressed, arm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010339173.1A
Other languages
Chinese (zh)
Other versions
CN112536802A (en)
Inventor
顾震江
刘大志
孙其民
罗沛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Uditech Co Ltd
Original Assignee
Uditech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Uditech Co Ltd
Priority to CN202010339173.1A
Publication of CN112536802A
Application granted
Publication of CN112536802B
Legal status: Active

Classifications

    • B25J11/008 Manipulators for service tasks
    • A47G25/90 Devices for domestic use for assisting in putting-on or pulling-off clothing, e.g. stockings or trousers
    • B25J11/0005 Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
    • B25J9/161 Programme controls: hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J9/1682 Dual arm manipulator; coordination of several manipulators


Abstract

The application belongs to the technical field of robotics and provides a dressing assistance method and system. The method is applied to a dressing assistance system that includes a first dressing assistance robot and a second dressing assistance robot. The first dressing assistance robot acquires clothing information associated with a user and sends the clothing information to the second dressing assistance robot; the second dressing assistance robot acquires the garment to be worn that corresponds to the clothing information, determines a dressing strategy for the garment according to the clothing type contained in the clothing information, and puts the garment on the user according to the dressing strategy. The application can quickly put different types of clothing on the user and thereby improve the user's dressing efficiency.

Description

Dressing assisting method and system
Technical Field
The present application relates to the field of robotics, and more particularly, to a dressing assistance method, system, and computer-readable storage medium.
Background
With economic development, people's expectations for clothing have risen considerably, and they often wear a wide variety of garments when going out. However, existing dressing assistance systems have only a single function: they cannot quickly put different types of clothing on a user, which reduces the user's dressing efficiency.
Therefore, a new technical solution is needed to solve the above technical problems.
Disclosure of Invention
In view of this, the embodiments of the present application provide a dressing assistance method and system that can quickly put different types of clothing on a user, thereby improving the user's dressing efficiency.
A first aspect of the embodiments of the present application provides a dressing assistance method applied to a dressing assistance system that includes a first dressing assistance robot and a second dressing assistance robot, the method comprising:
the first dressing assistance robot acquiring clothing information associated with a user and sending the clothing information to the second dressing assistance robot;
the second dressing assistance robot acquiring the garment to be worn that corresponds to the clothing information, determining a dressing strategy for the garment according to the clothing type contained in the clothing information, and putting the garment on the user according to the dressing strategy.
In one embodiment, the first dressing assistance robot acquiring the clothing information associated with the user includes:
the first dressing assistance robot acquiring the user's body shape information and the environmental parameters of the environment where the user is located;
the first dressing assistance robot determining the clothing information according to the environmental parameters and the body shape information.
In one embodiment, the second dressing assistance robot includes a first mechanical arm and a second mechanical arm;
putting the garment on the user according to the dressing strategy includes:
when the dressing strategy is a first dressing strategy, if the user's left and right arms are detected to be at a first preset position under the action of the first mechanical arm, controlling the second mechanical arm, while it holds the garment open, to put the opened garment body onto the user's torso;
if the user's left and right arms are detected to have moved from the first preset position to a second preset position under the action of the first mechanical arm, controlling the second mechanical arm to put the sleeves of the garment onto the user's left and right arms.
In one embodiment, the second dressing assistance robot includes a first mechanical arm and a second mechanical arm;
putting the garment on the user according to the dressing strategy includes:
when the dressing strategy is a second dressing strategy, if the user's first arm is detected to have moved to a third preset position under the action of the first mechanical arm, controlling the second mechanical arm to put the first sleeve of the garment onto the first arm;
if the first sleeve is detected to be on the first arm and the user's second arm is detected to have moved to a fourth preset position under the action of the first mechanical arm, controlling the second mechanical arm to put the second sleeve of the garment onto the second arm;
when the second sleeve is detected to be on the second arm, controlling the second mechanical arm to fasten the fasteners of the garment.
In one embodiment, the method further includes:
the first dressing assistance robot detecting the position of the user's hand and sending the detection result to the second dressing assistance robot;
when the detection result indicates that the first dressing assistance robot has not detected the position of the user's hand, the second dressing assistance robot controlling the second mechanical arm to move a target sleeve along the direction of the user's arm until the position of the user's hand is detected, wherein the target sleeve is the sleeve, the first sleeve, or the second sleeve.
In one embodiment, the dressing strategy includes the dressing actions corresponding to the garment to be worn;
determining the dressing strategy for the garment according to the clothing type contained in the clothing information, and putting the garment on the user according to the dressing strategy, includes:
the second dressing assistance robot inputting the clothing type into a pre-trained dressing planning model to obtain the dressing strategy;
the second dressing assistance robot executing the dressing actions in the dressing strategy to put the garment on the user.
In one embodiment, the dressing planning model includes an overall task layer, a behavior layer, and an action layer;
when the second dressing assistance robot inputs the clothing type contained in the clothing information into the pre-trained dressing planning model to obtain the dressing strategy, the processing performed by the dressing planning model on the clothing information includes:
processing the clothing type with the overall task layer to obtain the overall task corresponding to the clothing type;
inputting the overall task into the behavior layer for processing to obtain the subtasks corresponding to the overall task;
inputting the subtasks into the action layer for processing to obtain the dressing actions corresponding to the subtasks;
obtaining the result of the second dressing assistance robot executing the dressing actions;
inputting the result of executing the dressing actions into the overall task layer to obtain an updated overall task;
inputting the updated overall task into the behavior layer to obtain updated subtasks corresponding to the updated overall task;
inputting the updated subtasks into the action layer to obtain adjusted dressing actions.
In one embodiment, the method further comprises:
and obtaining a pre-trained dressing planning model through a deep Q learning network algorithm.
The beneficial effects of the embodiments of the present application are as follows. The dressing strategy for the garment to be worn is determined according to the clothing type contained in the clothing information, and the garment is put on the user according to the determined strategy, so that different types of clothing can be put on the user quickly and the user's dressing efficiency is improved. When the dressing strategy is the first dressing strategy, the garment body is put on the user's torso first and the sleeves are then put on the user's left and right arms, so that a garment without fasteners can be put on quickly; in addition, the position of the sleeves on the left and right arms can be corrected so that the garment sits correctly on the user. When the dressing strategy is the second dressing strategy, the first sleeve is put on the user's first arm, then the second sleeve is put on the user's second arm, and finally the fasteners are closed once the position of the user's hand is detected, so that a garment with fasteners can be put on quickly. The clothing type contained in the clothing information can also be input into a pre-trained dressing planning model to obtain the dressing strategy, which helps the second dressing assistance robot executing the strategy to put the garment on the user quickly. The method therefore has strong usability and practicality.
A second aspect of the embodiments of the present application provides a dressing assistance system, including a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any one of the first aspect when executing the computer program.
A third aspect of embodiments of the present application provides a computer-readable storage medium having a computer program stored thereon, where the computer program, when executed by a processor, implements the method of any one of the first aspect.
A fourth aspect of embodiments of the present application provides a computer program product, which, when run on a dressing assistance system, causes the dressing assistance system to perform the method of any one of the first aspects described above.
It is understood that the beneficial effects of the second to fourth aspects can be seen from the description of the first aspect, and are not described herein again.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of a dressing assistance method according to an embodiment of the present disclosure;
fig. 2 is a schematic flow chart of a dressing assistance method according to a second embodiment of the present application;
fig. 3 is a schematic flow chart of a dressing assistance method according to a third embodiment of the present application;
fig. 4 is a schematic flow chart of a dressing assistance method according to a fourth embodiment of the present application;
fig. 5 is a schematic structural view of a dressing auxiliary system provided in the fifth embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
It should be understood that, the sequence numbers of the steps in this embodiment do not mean the execution sequence, and the execution sequence of each process should be determined by the function and the inherent logic of the process, and should not constitute any limitation to the implementation process of the embodiment of the present application.
It should be noted that the terms "first" and "second" in the embodiments are used to distinguish different regions, modules, and the like; they do not indicate an order, nor are "first" and "second" limited to different types. The present application only describes the case where the garment to be worn is an upper garment; if the garment to be worn is a lower garment, the steps in the embodiments may be adapted accordingly, which is not limited here.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Embodiment 1
Fig. 1 is a schematic flowchart of a dressing assistance method according to Embodiment 1 of the present application. The method may include the following steps:
S101: the first dressing assistance robot acquires clothing information associated with a user and sends the acquired clothing information to the second dressing assistance robot.
The user may be an ordinary user who can dress without help, or a user who cannot dress without help, such as an elderly or disabled person. The clothing information associated with the user includes, but is not limited to, the color, size, type, and thickness of the garment to be worn.
In one embodiment, the clothing information associated with the user may be acquired through a photographing module, such as a camera, provided on the first dressing assistance robot, specifically as follows:
A1: the first dressing assistance robot acquires the user's body shape information and the environmental parameters of the environment where the user is located.
The camera on the first dressing assistance robot may scan the body of a user in a sitting or lying posture and build a three-dimensional body model from the scan result, thereby obtaining the user's body shape information. The environmental parameters of the user's environment may be obtained by sensors on the first dressing assistance robot and/or the second dressing assistance robot, and include, but are not limited to, the temperature, humidity, wind speed, and/or illumination intensity of the environment.
A2: the first dressing assistance robot determines the clothing information associated with the user according to the acquired environmental parameters and body shape information.
To match clothing to the user as well as possible, in one embodiment the first dressing assistance robot may also take common dressing conventions for the human body into account when determining the clothing information associated with the user.
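Purely as an illustration of steps A1 and A2, the minimal sketch below maps one environmental parameter (temperature) and one body measurement to candidate clothing information. The garment catalogue, thresholds, size table, and field names are assumptions made for this example rather than details taken from the application; further parameters such as humidity, wind speed, or illumination intensity could feed the same kind of rules.

```python
from dataclasses import dataclass

@dataclass
class ClothingInfo:
    """Clothing information associated with the user (color, size, type, thickness)."""
    color: str
    size: str
    type: str        # e.g. "T-shirt", "sweater", "shirt"
    thickness: str   # e.g. "thin", "medium", "thick"

def recommend_clothing(temperature_c: float, chest_circumference_cm: float) -> ClothingInfo:
    """Map an environmental parameter and a body measurement to clothing information."""
    # Garment type and thickness from ambient temperature (illustrative thresholds).
    if temperature_c < 10:
        thickness, garment_type = "thick", "sweater"
    elif temperature_c < 22:
        thickness, garment_type = "medium", "shirt"
    else:
        thickness, garment_type = "thin", "T-shirt"

    # Size from a measurement taken off the three-dimensional body model (illustrative table).
    size = "S" if chest_circumference_cm < 90 else "M" if chest_circumference_cm < 100 else "L"
    return ClothingInfo(color="white", size=size, type=garment_type, thickness=thickness)

# Example: 8 degC and a 96 cm chest -> a medium-size, thick sweater.
print(recommend_clothing(8.0, 96.0))
```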
In one embodiment, when a display module is provided on the first dressing assistance robot, the determined clothing information may be displayed on it so that the user can select from it through a brain-computer interface, eye tracking, and/or voice interaction. In addition, after acquiring the user's body shape information, the first dressing assistance robot may derive the user's health information from the facial information contained in the body shape information, so that only clothing the user is able to wear is shown when recommendations are made.
In one embodiment, the acquired clothing information may be sent to the second dressing assistance robot over a communication connection established between the first and second dressing assistance robots, such as a Bluetooth or Wi-Fi connection.
S102: the second dressing assistance robot acquires the garment to be worn that corresponds to the clothing information, determines a dressing strategy for the garment according to the clothing type contained in the clothing information, and puts the garment on the user according to the determined dressing strategy.
The second dressing assistance robot can move around the dressing area; acquiring the garment that corresponds to the clothing information includes, but is not limited to, the second dressing assistance robot grasping the garment.
As can be seen from the above, because the dressing strategy for the garment is determined according to the clothing type contained in the clothing information and the garment is put on the user according to the determined strategy, different types of clothing can be put on the user quickly, improving the user's dressing efficiency. The method therefore has strong usability and practicality.
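As an illustration of the transmission in step S101, where the clothing information is sent over an established connection such as Bluetooth or Wi-Fi, the sketch below uses a plain TCP socket carrying JSON. The host address, port, and message fields are assumptions of this example, not part of the application.

```python
import json
import socket

def send_clothing_info(clothing_info: dict, host: str = "192.168.1.20", port: int = 9000) -> None:
    """First robot side: serialize the clothing information and send it to the second robot."""
    payload = json.dumps(clothing_info).encode("utf-8")
    with socket.create_connection((host, port)) as conn:
        conn.sendall(payload)

def receive_clothing_info(port: int = 9000) -> dict:
    """Second robot side: accept one connection and decode the clothing information."""
    with socket.create_server(("", port)) as server:
        conn, _addr = server.accept()
        with conn:
            data = conn.recv(4096)
    return json.loads(data.decode("utf-8"))

# Example message for step S101 (field names are illustrative).
# send_clothing_info({"type": "shirt", "color": "white", "size": "M", "thickness": "thin"})
```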
Embodiment 2
Fig. 2 is a schematic flowchart of a dressing assistance method according to Embodiment 2 of the present application, which further refines and supplements Embodiment 1. The method may include the following steps:
S1011: the first dressing assistance robot acquires clothing information associated with a user and sends the acquired clothing information to a second dressing assistance robot that includes a first mechanical arm and a second mechanical arm.
There may be more than one first mechanical arm and more than one second mechanical arm.
S1021: the second dressing assistance robot acquires the garment to be worn that corresponds to the clothing information and determines a dressing strategy for it according to the clothing type contained in the clothing information. If the dressing strategy is the first dressing strategy and the user's left and right arms are detected to be at the first preset position under the action of the first mechanical arm, the second mechanical arm is controlled, while it holds the garment open, to put the opened garment body onto the user's torso; when the user's left and right arms are detected to have moved from the first preset position to the second preset position under the action of the first mechanical arm, the second mechanical arm is controlled to put the sleeves of the opened garment onto the user's left and right arms.
The first dressing strategy is the strategy for putting on an upper garment without fasteners, such as a T-shirt or a sweater. The positions of the user's left and right arms can be detected by the photographing module on the first dressing assistance robot and sent to the second dressing assistance robot to instruct the second mechanical arm to open up the garment. The first preset position is a position suitable for the user's torso to pass into the garment body, for example a position in which the fingertips of both raised arms can touch the lower hem of the garment body. The second mechanical arm may open the garment from bottom to top so that the opened garment takes a preset shape, for example with the garment body and sleeves forming a Y shape. The second preset position is a position the left and right arms reach when they are swung outward from the first preset position, for example 90 degrees counterclockwise or 90 degrees clockwise from it. For controlling the second mechanical arm to put the sleeves of the opened garment onto the user's left and right arms, reference may be made to step S1021' in Embodiment 3.
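A minimal sketch of the control flow of step S1021 is given below. The perception and arm-control interfaces (detect_arm_position, guide_arms_to, stretch_garment, put_body_on_torso, put_sleeves_on_arms) are hypothetical placeholders standing in for the robots' actual APIs, which the application does not specify.

```python
import time

FIRST_PRESET = "arms_raised_at_hem"    # fingertips can touch the lower hem of the garment body
SECOND_PRESET = "arms_swung_outward"   # e.g. 90 degrees out from the first preset position

def wait_for(perception, target: str, period_s: float = 0.2) -> None:
    """Poll the first robot's camera until the user's arms reach the target preset position."""
    while perception.detect_arm_position() != target:
        time.sleep(period_s)

def run_first_dressing_strategy(perception, arm1, arm2) -> None:
    """First dressing strategy (upper garment without fasteners): garment body first, then sleeves."""
    arm1.guide_arms_to(FIRST_PRESET)          # first mechanical arm positions the user's arms
    wait_for(perception, FIRST_PRESET)
    arm2.stretch_garment(shape="Y")           # open the garment bottom-up into a Y shape
    arm2.put_body_on_torso()                  # put the opened garment body onto the torso
    arm1.guide_arms_to(SECOND_PRESET)
    wait_for(perception, SECOND_PRESET)
    arm2.put_sleeves_on_arms()                # fit the sleeves onto the left and right arms
```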
S1022: the first dressing auxiliary robot detects the position of the hand of the user and sends the detection result to the second dressing auxiliary robot, and when the detection result shows that the first dressing auxiliary robot does not detect the position of the hand of the user, the second dressing auxiliary robot controls the second mechanical arm to move the sleeves of the object to be dressed along the arm direction of the user until the position of the hand of the user is detected.
In practice, after the sleeves have been put on, they may not yet be fully in place, for example the user's hands may not have emerged from the left and right sleeves, so the positions of the sleeves need to be detected and adjusted accordingly.
The position of the user's hand can be detected with the same method used in step S1021 to detect the user's left and right arms. The second mechanical arm is provided with several sensors that assist it in moving the sleeve of the garment along the direction of the arm.
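The correction loop of step S1022 might look like the sketch below, in which the second mechanical arm keeps moving a sleeve along the arm until the first robot reports the user's hand. hand_detected and move_sleeve_along_arm are hypothetical interfaces, and the step size and timeout are illustrative.

```python
import time

def correct_sleeve_position(perception, arm2, sleeve: str,
                            step_cm: float = 2.0, timeout_s: float = 30.0) -> bool:
    """Move the given sleeve along the user's arm until the hand emerges (is detected).

    Returns True if the hand was detected before the timeout, False otherwise.
    """
    deadline = time.monotonic() + timeout_s
    while not perception.hand_detected(side=sleeve):
        if time.monotonic() > deadline:
            return False                       # give up rather than pull indefinitely
        arm2.move_sleeve_along_arm(sleeve, distance_cm=step_cm)
        time.sleep(0.2)                        # let the sensors settle between moves
    return True

# Usage in step S1022: adjust both sleeves after they have been put on.
# for sleeve in ("left", "right"):
#     correct_sleeve_position(perception, arm2, sleeve)
```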
As can be seen from the above, compared with Embodiment 1, in Embodiment 2, when the dressing strategy is the first dressing strategy, the garment body is put on the user first and the sleeves are then put on the user's left and right arms, so that a garment without fasteners can be put on quickly. In addition, the position of the sleeves on the left and right arms can be corrected so that the garment sits correctly on the user. The method therefore has strong usability and practicality.
Embodiment 3
Fig. 3 is a schematic flowchart of a dressing assistance method according to Embodiment 3 of the present application, which further refines and supplements Embodiment 1. The method may include the following steps:
S1011: the first dressing assistance robot acquires clothing information associated with a user and sends the acquired clothing information to a second dressing assistance robot that includes a first mechanical arm and a second mechanical arm.
Step S1011 is the same as step S1011 in Embodiment 2; for its specific implementation, refer to the description of that step, which is not repeated here.
S1021': the second auxiliary clothes-dressing robot obtains the clothes to be dressed related to the clothes information, determines a clothes-dressing strategy of the clothes to be dressed according to the clothes types contained in the clothes information, controls the second mechanical arm to dress the first sleeves of the clothes to be dressed on the first arm of the user if the first arm of the user is detected to move to the third preset position under the action of the first mechanical arm when the clothes-dressing strategy is the second clothes-dressing strategy, and controls the second mechanical arm to dress the second sleeves of the clothes to be dressed on the second arm of the user if the first arm of the clothes to be dressed is detected to be dressed on the first arm and the second arm of the user moves to the fourth preset position under the action of the first mechanical arm.
The second dressing strategy is the strategy for putting on an upper garment with fasteners, such as a shirt or a suit jacket. The user's first arm may be either the left or the right arm; the first sleeve of the garment corresponds to the first arm and the second sleeve corresponds to the second arm. Specifically, when the first arm is the user's left arm, the first sleeve is the left sleeve, the second arm is the right arm, and the second sleeve is the right sleeve; when the first arm is the user's right arm, the first sleeve is the right sleeve, the second arm is the left arm, and the second sleeve is the left sleeve. The third preset position may be, for example, 90 degrees counterclockwise from the first preset position of step S1021; the fourth preset position corresponds to the third preset position and may be, for example, 90 degrees clockwise from the first preset position of step S1021.
In one embodiment, the second mechanical arm may fasten the fasteners of the garment from top to bottom, from bottom to top, or in a random order.
S1022': when the second sleeve is detected to be on the user's second arm, the first dressing assistance robot detects the position of the user's hand and sends the detection result to the second dressing assistance robot; when the detection result indicates that the first dressing assistance robot has not detected the position of the user's hand, the second dressing assistance robot controls the second mechanical arm to move the sleeve of the garment along the direction of the arm until the position of the user's hand is detected, and when the position of the user's hand is detected, the second mechanical arm is controlled to fasten the fasteners of the garment.
Step S1022' is essentially the same as step S1022 in Embodiment 2; for its specific implementation, refer to the description of that step, which is not repeated here.
The fasteners include, but are not limited to, buttons, snap fasteners, and hook-and-loop tape.
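For comparison with the sketch in Embodiment 2, the sequence of steps S1021' and S1022' might look as follows. All perception and arm-control calls (arm_at, sleeve_on, hand_detected, guide_arm_to, put_sleeve_on, move_sleeve_along_arm, fasten_fasteners) are hypothetical placeholders for the robots' actual interfaces.

```python
import time

THIRD_PRESET = "first_arm_extended"    # e.g. 90 degrees counterclockwise from the first preset
FOURTH_PRESET = "second_arm_extended"  # e.g. 90 degrees clockwise from the first preset

def wait_until(predicate, period_s: float = 0.2) -> None:
    """Poll a perception predicate until it becomes true."""
    while not predicate():
        time.sleep(period_s)

def run_second_dressing_strategy(perception, arm1, arm2) -> None:
    """Second dressing strategy (upper garment with fasteners): sleeve, sleeve, then fasteners."""
    arm1.guide_arm_to("first_arm", THIRD_PRESET)
    wait_until(lambda: perception.arm_at("first_arm", THIRD_PRESET))
    arm2.put_sleeve_on("first_sleeve", "first_arm")

    arm1.guide_arm_to("second_arm", FOURTH_PRESET)
    wait_until(lambda: perception.sleeve_on("first_sleeve")
                       and perception.arm_at("second_arm", FOURTH_PRESET))
    arm2.put_sleeve_on("second_sleeve", "second_arm")

    # Step S1022': pull each sleeve along the arm until the hand is visible, then fasten.
    # A real controller would add a timeout, as in the Embodiment 2 sketch.
    wait_until(lambda: perception.sleeve_on("second_sleeve"))
    for sleeve in ("first_sleeve", "second_sleeve"):
        while not perception.hand_detected(sleeve):
            arm2.move_sleeve_along_arm(sleeve)
    arm2.fasten_fasteners(order="top_to_bottom")   # could also be bottom-to-top or random
```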
As can be seen from the above, compared with Embodiment 1, in Embodiment 3, when the dressing strategy is the second dressing strategy, the first sleeve of the garment is put on the user's first arm, then the second sleeve is put on the user's second arm, and finally the fasteners are closed once the position of the user's hand is detected, so that a garment with fasteners can be put on the user quickly. The method therefore has strong usability and practicality.
Embodiment 4
Fig. 4 is a schematic flowchart of a dressing assistance method according to Embodiment 4 of the present application, which further refines step S102 of Embodiment 1. The method may include the following steps:
S101: the first dressing assistance robot acquires clothing information associated with a user and sends the acquired clothing information to the second dressing assistance robot.
S1021': and the second dressing auxiliary robot acquires the to-be-dressed objects related to the clothes information, inputs the clothes types contained in the clothes information into a pre-trained dressing planning model to obtain a dressing strategy of the to-be-dressed objects, and executes the dressing action in the dressing strategy to dress the to-be-dressed objects on the body of the user.
To achieve end-to-end learning from perception to action, in one embodiment the pre-trained dressing planning model may be obtained through a deep Q-learning network (DQN) algorithm.
To improve the training effect, reward and penalty functions can be set when the dressing planning model is trained, according to how intact the garment remains and how quickly the second dressing assistance robot puts it on the user, so that the robot is rewarded for executing correct dressing actions and penalized for executing incorrect ones. Specifically: when the garment remains intact, a positive reward such as +1 may be given; when the garment is torn, a negative reward such as -1; when the left sleeve has not passed over the left arm, a negative reward such as -0.4; when the left sleeve is partly on the left arm, a positive reward such as +0.4; and when the left sleeve is fully on the left arm, a positive reward such as +1.
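A sketch of this reward shaping is shown below. Only the numeric values (+1, -1, -0.4, +0.4) come from the description above; the state representation, the sleeve-progress measure, and the additive way the terms are combined are assumptions of this example.

```python
def dressing_reward(garment_intact: bool, left_sleeve_progress: float) -> float:
    """Reward shaping for training the dressing planning model with a DQN.

    garment_intact: whether the garment is still undamaged after the action.
    left_sleeve_progress: 0.0 = not over the left arm, values in (0, 1) = partly on,
    1.0 = fully on (the right sleeve could be scored the same way).
    """
    reward = 1.0 if garment_intact else -1.0   # +1 intact, -1 torn
    if left_sleeve_progress <= 0.0:
        reward += -0.4                         # sleeve has not passed over the arm
    elif left_sleeve_progress < 1.0:
        reward += 0.4                          # sleeve partly on the arm
    else:
        reward += 1.0                          # sleeve fully on the arm
    return reward

# Examples: torn garment, sleeve not started -> -1.4; intact garment, sleeve fully on -> +2.0.
```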
In one embodiment, when the dressing planning model includes an overall task layer, a behavior layer, and an action layer, the processing that the model performs on the clothing information when the second dressing assistance robot inputs the clothing type contained in the clothing information into the pre-trained model to obtain the dressing strategy may include:
B1: processing the clothing type contained in the clothing information with the overall task layer to obtain the overall task corresponding to the clothing type.
The overall task layer is the top layer of the dressing planning model.
B2: inputting the overall task into the behavior layer for processing to obtain the subtasks corresponding to the overall task.
The behavior layer is the middle layer of the dressing planning model; the overall task can be input into the behavior layer in the form of a sequence of behaviors.
B3: inputting the subtasks into the action layer for processing to obtain the dressing actions corresponding to the subtasks.
The action layer is the bottom layer of the dressing planning model.
B4: obtaining the result of the second dressing assistance robot executing the dressing actions.
B5: inputting the result of executing the dressing actions into the overall task layer to obtain an updated overall task.
B6: inputting the updated overall task into the behavior layer to obtain updated subtasks corresponding to the updated overall task.
B7: inputting the updated subtasks into the action layer to obtain adjusted dressing actions.
Taking a specific application scenario as an example of steps B1-B7: when the overall task obtained from the overall task layer is to put on a shirt, the overall task is input into the behavior layer, yielding subtask 1 (the left sleeve of the shirt), subtask 2 (the right sleeve of the shirt), and subtask 3 (the buttons). Subtask 1, subtask 2, and subtask 3 are input into the action layer, yielding dressing action 1 (grasp the left sleeve of the shirt), dressing action 2 (thread the left sleeve onto the user's left arm), and dressing action 3 (pull the left sleeve up), corresponding to subtask 1; dressing action 4 (grasp the right sleeve of the shirt), dressing action 5 (thread the right sleeve onto the user's right arm), and dressing action 6 (pull the right sleeve up), corresponding to subtask 2; and dressing action 7 (fasten the buttons), corresponding to subtask 3. The results of the second dressing assistance robot executing dressing actions 1-7 are obtained; the first execution result is input into the overall task layer, and when it shows that dressing actions 1, 2, and 3 are completed, the first updated overall task is the remainder of the shirt excluding the left sleeve. This updated overall task is input into the behavior layer, giving subtask 2 and subtask 3 as the updated subtasks; inputting subtask 2 and subtask 3 into the action layer gives dressing actions 4-7 as the adjusted dressing actions. The results of executing dressing actions 4-7 are then obtained and input into the overall task layer; when they show that dressing actions 4, 5, and 6 are completed, the second updated overall task is the remainder of the shirt excluding both sleeves, which is input into the behavior layer to give subtask 3 as the updated subtask; inputting subtask 3 into the action layer gives dressing action 7 as the adjusted dressing action. Finally, the result of executing dressing action 7 is obtained, and when it shows that dressing action 7 is completed, the overall task, the subtasks, and the dressing actions are cleared and the current dressing plan is finished.
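The shirt example can be summarized in the sketch below, where dictionary lookups stand in for the learned overall task layer, behavior layer, and action layer, and robot.execute is a hypothetical interface returning whether a dressing action succeeded; the real model would be learned, for example with the DQN training described above.

```python
# Behavior layer: overall task -> ordered subtasks (only the "shirt" task of the example).
BEHAVIOR_LAYER = {
    "shirt": ["left_sleeve", "right_sleeve", "buttons"],
}

# Action layer: subtask -> ordered dressing actions.
ACTION_LAYER = {
    "left_sleeve":  ["grasp_left_sleeve", "thread_left_arm", "pull_left_sleeve_up"],
    "right_sleeve": ["grasp_right_sleeve", "thread_right_arm", "pull_right_sleeve_up"],
    "buttons":      ["fasten_buttons"],
}

def run_dressing_plan(robot, overall_task: str = "shirt", max_rounds: int = 20) -> None:
    """Walk the three layers: expand the overall task, execute actions, replan on feedback."""
    done_subtasks: set = set()
    for _ in range(max_rounds):
        # Overall task layer: the updated overall task is whatever remains to be dressed.
        remaining = [s for s in BEHAVIOR_LAYER[overall_task] if s not in done_subtasks]
        if not remaining:
            break                                    # plan finished; tasks and actions cleared
        subtask = remaining[0]                       # behavior layer: next subtask
        actions = ACTION_LAYER[subtask]              # action layer: its dressing actions
        results = [robot.execute(action) for action in actions]
        if all(results):                             # execution results fed back upward
            done_subtasks.add(subtask)               # e.g. the shirt minus the left sleeve
        # otherwise the same subtask is replanned in the next round
```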
As can be seen from the above, compared with Embodiment 1, in Embodiment 4 the clothing type contained in the clothing information can be input into a pre-trained dressing planning model to obtain the dressing strategy for the garment, which helps the second dressing assistance robot executing the strategy to put the garment on the user quickly. The method therefore has strong usability and practicality.
Embodiment 5
Fig. 5 is a schematic structural diagram of a dressing assistance system according to Embodiment 5 of the present application. As shown in Fig. 5, the dressing assistance system 5 of this embodiment includes: a processor 50, a memory 51, and a computer program 52 stored in the memory 51 and executable on the processor 50. When executing the computer program 52, the processor 50 implements the steps of the method in Embodiments 1 to 4, for example steps S101 to S102 shown in Fig. 1.
The dressing assistance system 5 may include a first dressing assistance robot and a second dressing assistance robot that includes a first mechanical arm and a second mechanical arm. The dressing assistance system may include, but is not limited to, the processor 50 and the memory 51. Those skilled in the art will appreciate that Fig. 5 is merely an example of the dressing assistance system 5 and does not limit it; the system may include more or fewer components than shown, combine certain components, or use different components. For example, the dressing assistance system may also include input and output devices, network access devices, buses, and the like.
The processor 50 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 51 may be an internal storage unit of the dressing assistance system 5, such as a hard disk or memory of the dressing assistance system 5. The memory 51 may also be an external storage device of the dressing assistance system 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card provided on the dressing assistance system 5. Further, the memory 51 may include both an internal storage unit and an external storage device of the dressing assistance system 5. The memory 51 is used to store the computer program and the other programs and data required by the dressing assistance system, and may also be used to temporarily store data that has been output or is to be output.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art would appreciate that the modules, elements, and/or method steps of the various embodiments described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the methods in the above embodiments can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in source-code form, object-code form, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so on. It should be noted that the content contained in the computer-readable medium may be adjusted according to the requirements of legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (9)

1. A dressing assistance method applied to a dressing assistance system including a first dressing assistance robot and a second dressing assistance robot, the second dressing assistance robot including a first mechanical arm and a second mechanical arm, the method comprising:
the first dressing assistance robot acquiring clothing information associated with a user and sending the clothing information to the second dressing assistance robot;
the second dressing assistance robot acquiring the garment to be worn that corresponds to the clothing information, determining a dressing strategy for the garment according to the clothing type contained in the clothing information, and putting the garment on the user according to the dressing strategy;
wherein putting the garment on the user according to the dressing strategy comprises:
when the dressing strategy is a first dressing strategy, if the user's left and right arms are detected to be at a first preset position under the action of the first mechanical arm, controlling the second mechanical arm, while it holds the garment open, to put the opened garment body onto the user's torso, wherein the first dressing strategy is a strategy for putting on an upper garment without fasteners;
if the user's left and right arms are detected to have moved from the first preset position to a second preset position under the action of the first mechanical arm, controlling the second mechanical arm to put the sleeves of the garment onto the user's left and right arms.
2. The method of claim 1, wherein the first dressing assistance robot acquiring the clothing information associated with the user comprises:
the first dressing assistance robot acquiring the user's body shape information and the environmental parameters of the environment where the user is located;
the first dressing assistance robot determining the clothing information according to the environmental parameters and the body shape information.
3. The method of claim 1, wherein the second dressing assistance robot comprises a first mechanical arm and a second mechanical arm;
putting the garment on the user according to the dressing strategy comprises:
when the dressing strategy is a second dressing strategy, if the user's first arm is detected to have moved to a third preset position under the action of the first mechanical arm, controlling the second mechanical arm to put the first sleeve of the garment onto the first arm;
if the first sleeve of the garment is detected to be on the first arm and the user's second arm is detected to have moved to a fourth preset position under the action of the first mechanical arm, controlling the second mechanical arm to put the second sleeve of the garment onto the second arm;
when the second sleeve is detected to be on the second arm, controlling the second mechanical arm to fasten the fasteners of the garment.
4. The method of claim 3, further comprising:
the first dressing assistance robot detecting the position of the user's hand and sending the detection result to the second dressing assistance robot;
when the detection result indicates that the first dressing assistance robot has not detected the position of the user's hand, the second dressing assistance robot controlling the second mechanical arm to move a target sleeve along the direction of the user's arm until the position of the user's hand is detected, wherein the target sleeve comprises the sleeve, the first sleeve, or the second sleeve.
5. The method according to claim 1, wherein the dressing strategy comprises the dressing actions corresponding to the garment to be worn;
determining the dressing strategy for the garment according to the clothing type contained in the clothing information, and putting the garment on the user according to the dressing strategy, comprises:
the second dressing assistance robot inputting the clothing type into a pre-trained dressing planning model to obtain the dressing strategy;
the second dressing assistance robot executing the dressing actions in the dressing strategy to put the garment on the user.
6. The method of claim 5, wherein the dressing planning model comprises an overall task layer, a behavior layer, and an action layer;
when the second dressing assistance robot inputs the clothing type contained in the clothing information into the pre-trained dressing planning model to obtain the dressing strategy, the processing performed by the dressing planning model on the clothing information comprises:
processing the clothing type with the overall task layer to obtain the overall task corresponding to the clothing type;
inputting the overall task into the behavior layer for processing to obtain the subtasks corresponding to the overall task;
inputting the subtasks into the action layer for processing to obtain the dressing actions corresponding to the subtasks;
obtaining the result of the second dressing assistance robot executing the dressing actions;
inputting the result of executing the dressing actions into the overall task layer to obtain an updated overall task;
inputting the updated overall task into the behavior layer to obtain updated subtasks corresponding to the updated overall task;
inputting the updated subtasks into the action layer to obtain adjusted dressing actions.
7. The method of claim 5 or 6, further comprising:
obtaining the pre-trained dressing planning model through a deep Q-learning network algorithm.
8. A dressing assistance system comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the method of any one of claims 1 to 7 when executing the computer program.
9. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
CN202010339173.1A 2020-04-26 2020-04-26 Dressing assisting method and system Active CN112536802B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010339173.1A CN112536802B (en) 2020-04-26 2020-04-26 Dressing assisting method and system


Publications (2)

Publication Number Publication Date
CN112536802A CN112536802A (en) 2021-03-23
CN112536802B (en) 2022-05-17

Family

ID=75013432

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010339173.1A Active CN112536802B (en) 2020-04-26 2020-04-26 Dressing assisting method and system

Country Status (1)

Country Link
CN (1) CN112536802B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114572635A (en) * 2022-04-06 2022-06-03 广州南洋理工职业学院 Intelligent garment sample garment image acquisition equipment based on image management

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE20119786U1 (en) * 2001-12-05 2002-05-02 Franken Wilhelm Josef Stocking donning
CN106926248A (en) * 2017-01-16 2017-07-07 深圳前海勇艺达机器人有限公司 Method with the robot of fitting function is assisted and assist fitting
CN107784685A (en) * 2016-08-24 2018-03-09 南京乐朋电子科技有限公司 Collocation clothes robot
CN207448489U (en) * 2017-10-31 2018-06-05 广东希瑞干细胞技术有限公司 A kind of automatic putting on device of clean garment
CN108343271A (en) * 2018-02-25 2018-07-31 宁波市晶杰国际物流有限公司 Formula of shooting one wearable device
CN109662475A (en) * 2018-12-26 2019-04-23 阴祖伟 A kind of automatic dressing wardrobe
CN110841094A (en) * 2019-11-26 2020-02-28 上海景峰制药有限公司 Changing method, device, changing cabinet, equipment and storage medium




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant